
Android Camera2 displays black and distorted JPEG image on TextureView?

I'm making a test app for a friend, on a Samsung S20.

The Samsung S20 has a rear-facing ToF (Time of Flight) camera.

I would like to display the ToF preview and the regular camera preview side by side, each on its own TextureView.

I'm able to open the ToF sensor and convert its raw output into a visual depth map by applying a color mask to the depth ranges (red farthest, then oranges, etc.); see the screenshot:

[screenshot: ToF depth visualization]

Below is the relevant code:

<?xml version="1.0" encoding="utf-8"?>
<androidx.coordinatorlayout.widget.CoordinatorLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">
    <com.google.android.material.appbar.AppBarLayout
        android:id="@+id/appBarLayout"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:theme="@style/AppTheme.AppBarOverlay">
        <androidx.appcompat.widget.Toolbar
            android:id="@+id/toolbar"
            android:layout_width="match_parent"
            android:layout_height="?attr/actionBarSize"
            android:background="?attr/colorPrimary"
            app:popupTheme="@style/AppTheme.PopupOverlay" />

        <androidx.constraintlayout.widget.ConstraintLayout
            android:layout_width="match_parent"
            android:layout_height="619dp"
            android:background="#FFFFFFFF">
            <TextureView
                android:id="@+id/regularBackCamera"
                android:layout_width="320dp"
                android:layout_height="240dp"
                android:layout_marginEnd="44dp"
                app:layout_constraintBottom_toBottomOf="parent"
                app:layout_constraintEnd_toEndOf="parent"
                app:layout_constraintTop_toTopOf="parent"
                app:layout_constraintVertical_bias="0.899" />
            <TextView
                android:id="@+id/textView3"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:text="Raw ToF Data"
                android:textColor="@android:color/primary_text_light"
                app:layout_constraintEnd_toEndOf="@+id/rawData"
                app:layout_constraintStart_toStartOf="@+id/rawData"
                app:layout_constraintTop_toBottomOf="@+id/rawData" />
            <TextureView
                android:id="@+id/rawData"
                android:layout_width="320dp"
                android:layout_height="240dp"
                android:layout_marginStart="44dp"
                app:layout_constraintBottom_toTopOf="@+id/regularBackCamera"
                app:layout_constraintStart_toStartOf="parent"
                app:layout_constraintTop_toTopOf="parent"
                app:layout_constraintVertical_bias="0.485" />
            <TextView
                android:id="@+id/textView5"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:layout_marginStart="120dp"
                android:text="Back Camera"
                android:textColor="@android:color/primary_text_light"
                app:layout_constraintStart_toStartOf="@+id/regularBackCamera"
                app:layout_constraintTop_toBottomOf="@+id/regularBackCamera" />
        </androidx.constraintlayout.widget.ConstraintLayout>
    </com.google.android.material.appbar.AppBarLayout>
</androidx.coordinatorlayout.widget.CoordinatorLayout>

MainActivity class:

/*  This is an example of getting and processing ToF data
 */

public class MainActivity extends AppCompatActivity implements DepthFrameVisualizer, RegularCameraFrameVisualizer {
    private static final String TAG = MainActivity.class.getSimpleName();
    public static final int CAM_PERMISSIONS_REQUEST = 0;

    private TextureView rawDataView;
    private TextureView regularImageView;
    private Matrix ToFBitmapTransform;
    private Matrix regularBackCameraBitmapTransform;
    private BackToFCamera backToFCamera;
    private RegularBackCamera regularBackCamera;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        rawDataView = findViewById(R.id.rawData);
        regularImageView = findViewById(R.id.regularBackCamera);
        checkCamPermissions();

    }

    @Override
    protected void onPause() {
        super.onPause();

        if ( backToFCamera !=null)
        {
            backToFCamera.getCamera().close();
            backToFCamera = null;

        }
        if ( regularBackCamera!= null)
        {
            regularBackCamera.getCamera().close();
            regularBackCamera = null;
        }
    }

    @Override
    protected void onResume() {
        super.onResume();

        backToFCamera = new BackToFCamera(this, this);
        String tofCameraId = backToFCamera.openCam(null);

        regularBackCamera = new RegularBackCamera(this, this);
        //pass in tofCameraId to avoid opening again since both regular cam & ToF camera are back facing
        regularBackCamera.openCam(tofCameraId);

    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
    }

    private void checkCamPermissions() {
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA}, CAM_PERMISSIONS_REQUEST);
        }
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    }

    @Override
    public void onRawDataAvailable(Bitmap bitmap) {
        renderBitmapForToFToTextureView(bitmap, rawDataView);
    }

    @Override
    public void onRegularImageAvailable(Bitmap bitmap) {
        renderBitmapToTextureView( bitmap,regularImageView);
    }

    /* We don't want a direct camera preview since we want to get the frames of data directly
        from the camera and process.

        This takes a converted bitmap and renders it onto the surface, with a basic rotation
        applied.
     */
    private void renderBitmapForToFToTextureView(Bitmap bitmap, TextureView textureView) {

        if (bitmap!=null && textureView!=null) {
            Canvas canvas = textureView.lockCanvas();
            canvas.drawBitmap(bitmap, ToFBitmapTransform(textureView), null);
            textureView.unlockCanvasAndPost(canvas);
        }
    }

    private void renderBitmapToTextureView(Bitmap bitmap, TextureView textureView) {
        if (bitmap!=null && textureView!=null)
        {
        Canvas canvas = textureView.lockCanvas();
        if (canvas!=null) {
            canvas.drawBitmap(bitmap, regularBackCamBitmapTransform(textureView), null);
            textureView.unlockCanvasAndPost(canvas);
        }
        }
    }

    private Matrix ToFBitmapTransform(TextureView view) {

        if (view!=null) {
            if (ToFBitmapTransform == null || view.getWidth() == 0 || view.getHeight() == 0) {
                int rotation = getWindowManager().getDefaultDisplay().getRotation();
                Matrix matrix = new Matrix();
                int centerX = view.getWidth() / 2;
                int centerY = view.getHeight() / 2;

                int bufferWidth = DepthFrameAvailableListener.SAMSUNG_S20_TOF_WIDTH;
                int bufferHeight = DepthFrameAvailableListener.SAMSUNG_S20_TOF_HEIGHT;

                RectF bufferRect = new RectF(0, 0, bufferWidth, bufferHeight);
                RectF viewRect = new RectF(0, 0, view.getWidth(), view.getHeight());
                matrix.setRectToRect(bufferRect, viewRect, Matrix.ScaleToFit.CENTER);

                Log.i(TAG, " rotation:" + rotation);
                if (Surface.ROTATION_90 == rotation) {

                    matrix.postRotate(270, centerX, centerY);
                } else if (Surface.ROTATION_270 == rotation) {

                    matrix.postRotate(90, centerX, centerY);
                } else if (Surface.ROTATION_180 == rotation) {

                    matrix.postRotate(180, centerX, centerY);
                } else {
                    //strange but works!
                    matrix.postRotate(90, centerX, centerY);
                }


                ToFBitmapTransform = matrix;
            }
        }
        return  ToFBitmapTransform;
    }

    private Matrix regularBackCamBitmapTransform(TextureView view) {
        if (view!=null) {
            if (regularBackCameraBitmapTransform == null || view.getWidth() == 0 || view.getHeight() == 0) {

                int rotation = getWindowManager().getDefaultDisplay().getRotation();
                Matrix matrix = new Matrix();
                RectF bufferRect = new RectF(0, 0, MAX_PREVIEW_WIDTH,MAX_PREVIEW_HEIGHT);
                RectF viewRect = new RectF(0, 0, view.getWidth(), view.getHeight());
                matrix.setRectToRect(bufferRect, viewRect, Matrix.ScaleToFit.CENTER);
                float centerX = viewRect.centerX();
                float centerY = viewRect.centerY();

                Log.i(TAG, " rotation:" + rotation);
                if (Surface.ROTATION_90 == rotation) {

                    matrix.postRotate(270, centerX, centerY);
                } else if (Surface.ROTATION_270 == rotation) {

                    matrix.postRotate(90, centerX, centerY);
                } else if (Surface.ROTATION_180 == rotation) {

                    matrix.postRotate(180, centerX, centerY);
                } else {
                    //strange but works!
                    matrix.postRotate(90, centerX, centerY);
                }

                regularBackCameraBitmapTransform = matrix;
            }
        }
        return regularBackCameraBitmapTransform;
    }
}

Listener that signals a frame is available for display; see the method publishOriginalBitmap():

import static com.example.opaltechaitestdepthmap.RegularBackCamera.MAX_PREVIEW_HEIGHT;
import static com.example.opaltechaitestdepthmap.RegularBackCamera.MAX_PREVIEW_WIDTH;

public class BackCameraFrameAvailableListener implements ImageReader.OnImageAvailableListener {
    private static final String TAG = BackCameraFrameAvailableListener.class.getSimpleName();
    private RegularCameraFrameVisualizer regularCameraFrameVisualizer;

    public BackCameraFrameAvailableListener(RegularCameraFrameVisualizer regularCameraFrameVisualizer) {
        this.regularCameraFrameVisualizer = regularCameraFrameVisualizer;
    }

    @Override
    public void onImageAvailable(ImageReader reader) {
        try {
            Image image = reader.acquireNextImage();
          if (image != null && image.getFormat() == ImageFormat.JPEG)
            {

                publishOriginalBitmap(image);
            }

        }
        catch (Exception e) {
            Log.e(TAG, "Failed to acquireNextImage: " + e.getMessage());
        }
    }

    private void publishOriginalBitmap(final Image image) {

        if (regularCameraFrameVisualizer != null) {
            new Thread() {
                public void run() {
                    Bitmap bitmap = returnBitmap(image);
                    if (bitmap != null) {
                        regularCameraFrameVisualizer.onRegularImageAvailable(bitmap);
                        bitmap.recycle();
                    }
                }
            }.start();
        }

    }

    private Bitmap returnBitmap(Image image) {
        Bitmap bitmap = null;
        // width=1920,height=1080
        int width =1920;
        int height =1080;
        if (image!=null) {

            Log.i(TAG,"returnBitmap,CONSTANT MAX width:"+MAX_PREVIEW_WIDTH +",MAX height:"+MAX_PREVIEW_HEIGHT);
            Log.i(TAG,"BEFORE returnBitmap,image.width:"+width +",height:"+height );
            if (image!=null) {
                Image.Plane[] planes = image.getPlanes();
                if (planes!=null && planes.length>0) {

                    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
                    image.close();
                    Log.i(TAG,"buffer size:"+buffer.capacity());

                        float currenBufferSize = buffer.capacity();
                        float jpegReportedArea = width * height;
                        if (currenBufferSize >=jpegReportedArea ) {
                            Log.i(TAG,"currenBufferSize >=jpegReportedArea ");

                            float quotient =  jpegReportedArea/currenBufferSize ;
                            float f_width = width * quotient;
                            width = (int) Math.ceil(f_width);
                            float f_height = height * quotient;
                            height = (int) Math.ceil(f_height);
                        }
                        else
                        {
                            Log.i(TAG,"currenBufferSize <jpegReportedArea ");
                            float quotient = currenBufferSize / jpegReportedArea;
                            float f_width = (width * quotient);
                            width = (int) Math.ceil(f_width);
                            float f_height = (height * quotient);
                            height = (int) Math.ceil(f_height);

                        }


                        Log.i(TAG,"AFTER width:"+width+",height:"+height);
                        //***here bitmap is black
                        bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.RGB_565);

                        buffer.rewind();
                        if (bitmap!=null) {
                            bitmap.copyPixelsFromBuffer(buffer);
                        }

                }

            }
        }

        return bitmap;
    }
} 

The interface used by the listener to signal that an image is ready:

package com.example.opaltechaitestdepthmap;

        import android.graphics.Bitmap;

public interface RegularCameraFrameVisualizer {
    void onRegularImageAvailable(Bitmap bitmap);

}

Handles camera states:

public class RegularBackCamera extends CameraDevice.StateCallback {

    private static final String TAG = RegularBackCamera.class.getSimpleName();
    private static int FPS_MIN = 15;
    private static int FPS_MAX = 30;
    public static final int MAX_PREVIEW_WIDTH = 1920;
    public static final int MAX_PREVIEW_HEIGHT = 1080;
    private Context context;
    private CameraManager cameraManager;
    private ImageReader RawSensorPreviewReader;
    private CaptureRequest.Builder previewBuilder;
    private BackCameraFrameAvailableListener imageAvailableListener;
    private String cameraId;
    private CameraDevice camera;

    public RegularBackCamera(Context context, RegularCameraFrameVisualizer frameVisualizer) {
        this.context = context;
        cameraManager = (CameraManager)context.getSystemService(Context.CAMERA_SERVICE);
        imageAvailableListener = new BackCameraFrameAvailableListener(frameVisualizer);

    }

    // Open the back camera and start sending frames
    public String openCam(String idToExclude) {
        this.cameraId  = getBackCameraID(idToExclude);
        Size size = openCamera(this.cameraId);

        //Tried this DID NOT WORK Size smallerPreviewSize =chooseSmallerPreviewSize();

                RawSensorPreviewReader = ImageReader.newInstance(MAX_PREVIEW_WIDTH,
                        MAX_PREVIEW_HEIGHT, ImageFormat.JPEG,2);
        Log.i(TAG,"ImageFormat.JPEG, width:"+size.getWidth()+", height:"+ size.getHeight());
        RawSensorPreviewReader.setOnImageAvailableListener(imageAvailableListener, null);

        return this.cameraId;
    }

    private String getBackCameraID(String idToExclude) {
        String cameraId = null;
        CameraManager cameraManager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        try {

            if (idToExclude!=null) {
                for (String camera : cameraManager.getCameraIdList()) {
                    //avoid getting same camera
                    if (!camera.equalsIgnoreCase(idToExclude)) {
                        //avoid return same camera twice as 1 sensor can only be accessed once
                        CameraCharacteristics chars = cameraManager.getCameraCharacteristics(camera);
                        final int[] capabilities = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
                        boolean facingBack = chars.get(CameraCharacteristics.LENS_FACING) == CameraMetadata.LENS_FACING_BACK;

                        if (facingBack) {
                            cameraId = camera;
                            // Note that the sensor size is much larger than the available capture size
                            SizeF sensorSize = chars.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);
                            Log.i(TAG, "Sensor size: " + sensorSize);

                            // Since sensor size doesn't actually match capture size and because it is
                            // reporting an extremely wide aspect ratio, this FoV is bogus
                            float[] focalLengths = chars.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS);
                            if (focalLengths.length > 0) {
                                float focalLength = focalLengths[0];
                                double fov = 2 * Math.atan(sensorSize.getWidth() / (2 * focalLength));
                                Log.i(TAG, "Calculated FoV: " + fov);
                            }

                        }

                    }//end avoid getting same camera


                }//end for
            }
            else
            {
                for (String camera : cameraManager.getCameraIdList()) {

                    //avoid return same camera twice as 1 sensor can only be accessed once
                    CameraCharacteristics chars = cameraManager.getCameraCharacteristics(camera);
                    final int[] capabilities = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
                    boolean facingBack = chars.get(CameraCharacteristics.LENS_FACING) == CameraMetadata.LENS_FACING_BACK;

                    if (facingBack) {
                        cameraId = camera;
                        // Note that the sensor size is much larger than the available capture size
                        SizeF sensorSize = chars.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);
                        Log.i(TAG, "Sensor size: " + sensorSize);

                        // Since sensor size doesn't actually match capture size and because it is
                        // reporting an extremely wide aspect ratio, this FoV is bogus
                        float[] focalLengths = chars.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS);
                        if (focalLengths.length > 0) {
                            float focalLength = focalLengths[0];
                            double fov = 2 * Math.atan(sensorSize.getWidth() / (2 * focalLength));
                            Log.i(TAG, "Calculated FoV: " + fov);
                        }

                    }
                }//end for
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        return    cameraId ;
    }

    //opens camera based on ID & returns optimal size, capped at the maximum size from the docs
    private Size openCamera(String cameraId) {
        Size size = null;
        try{
            int permission = ContextCompat.checkSelfPermission(context, Manifest.permission.CAMERA);
            if(PackageManager.PERMISSION_GRANTED == permission) {
                if ( cameraManager!=null) {
                    if (cameraId!=null) {
                        cameraManager.openCamera(cameraId, this, null);


                            CameraCharacteristics characteristics
                                    =  cameraManager.getCameraCharacteristics(cameraId);


                            StreamConfigurationMap map = characteristics.get(
                                    CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);

                        size = Collections.max(
                                    Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)),
                                    new CompareSizeByArea());
                        if (size.getWidth() > MAX_PREVIEW_WIDTH ||  size.getHeight() > MAX_PREVIEW_HEIGHT)
                        {
                            size = new Size( MAX_PREVIEW_WIDTH ,MAX_PREVIEW_HEIGHT);

                        }

                       List<Size> sizes =  Arrays.asList(map.getOutputSizes(ImageFormat.JPEG));
                       for (int i=0; i<sizes.size(); i++)
                       {
                           Log.i(RegularBackCamera.class.toString(),"JPEG sizes, width="+sizes.get(i).getWidth()+","+"height="+sizes.get(i).getHeight());
                       }

                    }

                }
            }else{
                Log.e(TAG,"Permission not available to open camera");
            }
        }catch (CameraAccessException | IllegalStateException | SecurityException e){
            Log.e(TAG,"Opening Camera has an Exception " + e);
            e.printStackTrace();
        }
        return  size;
    }


    @Override
    public void onOpened(@NonNull CameraDevice camera) {
        try {
            this.camera = camera;
            previewBuilder = camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            previewBuilder.set(CaptureRequest.JPEG_ORIENTATION, 0);
            Range<Integer> fpsRange = new Range<>(FPS_MIN, FPS_MAX);
            previewBuilder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, fpsRange);
            previewBuilder.addTarget(RawSensorPreviewReader.getSurface());

            List<Surface> targetSurfaces = Arrays.asList(RawSensorPreviewReader.getSurface());

            camera.createCaptureSession(targetSurfaces,
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession session) {
                            onCaptureSessionConfigured(session);
                        }
                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                            Log.e(TAG,"!!! Creating Capture Session failed due to internal error ");
                        }
                    }, null);

        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private void onCaptureSessionConfigured(@NonNull CameraCaptureSession session) {
        Log.i(TAG,"Capture Session created");
        previewBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
        try {
            session.setRepeatingRequest(previewBuilder.build(), null, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }


    @Override
    public void onDisconnected(@NonNull CameraDevice camera) {

        if (camera!=null)
        {
            camera.close();
            camera = null;
        }
    }

    @Override
    public void onError(@NonNull CameraDevice camera, int error) {
        if (camera!=null)
        {
            camera.close();
            Log.e(TAG,"onError,cameraID:"+camera.getId()+",error:"+error);
            camera = null;


        }
    }

    protected Size chooseSmallerPreviewSize()
    {
        CameraManager cm = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        CameraCharacteristics cc = null;
        try {
            cc = cm.getCameraCharacteristics(this.cameraId);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        StreamConfigurationMap streamConfigs = cc.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
        Size[] sizes = streamConfigs.getOutputSizes( ImageFormat.JPEG);
        Size smallerPreviewSize = chooseVideoSize( sizes);

        return smallerPreviewSize;
    }


  //References: https://stackoverflow.com/questions/46997776/camera2-api-error-failed-to-create-capture-session
    protected Size chooseVideoSize(Size[] choices) {
        List<Size> smallEnough = new ArrayList<>();

        for (Size size : choices) {
            if (size.getWidth() == size.getHeight() * 4 / 3 && size.getWidth() <= 1080) {
                smallEnough.add(size);
            }
        }
        if (smallEnough.size() > 0) {
            return Collections.max(smallEnough, new CompareSizeByArea());
        }

        return choices[choices.length - 1];
    }

    public CameraDevice getCamera() {
        return camera;
    }
}

Helper to sort preview sizes:

public class CompareSizeByArea implements Comparator<Size> {
    @Override
    public int compare(Size lhs, Size rhs) {
        return Long.signum((long) lhs.getWidth() * lhs.getHeight() -
                (long) rhs.getWidth() * rhs.getHeight());
    }

}

I included only the code for the regular camera, since that is the one that is not displaying; the code for obtaining the ToF camera and its listeners is exactly the same, apart from the ToF-specific logic.

I'm not seeing any exceptions or errors in the app logs, but the system logs show:

E/CHIUSECASE: [ERROR  ] chxusecase.cpp:967 ReturnFrameworkResult() ChiFrame: 0 App Frame: 0 - pResult contains more buffers (1) than the expected number of buffers (0) to return to the framework!
E/CamX: [ERROR][CORE   ] camxnode.cpp:4518 CSLFenceCallback() Node::FastAECRealtime_IFE0 : Type:65536 Fence 3 handler failed in node fence handler
E/CamX: [ERROR][SENSOR ] camxsensornode.cpp:9279 GetSensorMode() Sensor name: s5k2la
E/CamX: [ERROR][SENSOR ] camxsensornode.cpp:9302 GetSensorMode() W x H : 4032, 3024
E//vendor/bin/hw/vendor.samsung.hardware.camera.provider@3.0-service_64: vendor/qcom/proprietary/commonsys-intf/adsprpc/src/fastrpc_apps_user.c:750: Error 0xe08132b8: remote_handle_invoke failed
E/CamX: [ERROR][ISP    ] camxispiqmodule.h:1871 IsTuningModeDataChanged() Invalid pointer to current tuning mode parameters (0x0)
E/CamX: [ERROR][PPROC  ] camxipenode.cpp:9529 GetFaceROI() Face ROI is not published

1) How can I display the regular back-facing camera as a Bitmap on a TextureView, correctly?

2) How can I save that bitmap as a JPEG or PNG on internal storage?

Thanks a million!

If you want to actually convert an Image in JPEG format to a Bitmap, you can't just copy the bytes over, as is done here:

                    Log.i(TAG,"AFTER width:"+width+",height:"+height);
                    //***here bitmap is black
                    bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.RGB_565);

                    buffer.rewind();
                    if (bitmap!=null) {
                        bitmap.copyPixelsFromBuffer(buffer);
                    }

You need to actually decode the compressed JPEG, for example with BitmapFactory.decodeByteArray. That will produce a Bitmap from the Image contents, though you first have to create a byte[] from the plane[0] ByteBuffer.
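A minimal sketch of that decode path, assuming a JPEG-format Image as delivered by the ImageReader in the question (the method name `decodeJpegImage` is illustrative; it could replace the body of returnBitmap):

```java
private Bitmap decodeJpegImage(Image image) {
    // JPEG data arrives in a single plane as a compressed byte stream,
    // so it must be decoded, not memory-copied, into a Bitmap.
    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
    byte[] jpegBytes = new byte[buffer.remaining()];
    buffer.get(jpegBytes);
    image.close();
    // decodeByteArray parses the JPEG header itself, so width and height
    // come from the stream rather than from guessed constants.
    return BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.length);
}
```

Note that no manual width/height arithmetic is needed: the buffer size of a JPEG has no fixed relationship to its pixel dimensions, which is why the quotient-based resizing in returnBitmap produces garbage.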

However, you really don't want to capture JPEGs here; they tend to be slow and won't give you a very good frame rate. Unless you have a strong reason, just use the TextureView's SurfaceTexture as a target for the camera (by creating a Surface from the SurfaceTexture). That will pass data in an efficient, device-specific format, and you don't have to do any copying (you still have to handle the scaling, though).
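That wiring could look roughly like the following, inside onOpened(). This is a sketch: `regularImageView` currently lives in MainActivity, so it would have to be passed into RegularBackCamera, and `sessionStateCallback` stands for the same anonymous StateCallback already used in the question's onOpened():

```java
// Target the TextureView's SurfaceTexture directly instead of an ImageReader.
SurfaceTexture texture = regularImageView.getSurfaceTexture();
// Match the buffer size to the chosen preview size to avoid distortion.
texture.setDefaultBufferSize(MAX_PREVIEW_WIDTH, MAX_PREVIEW_HEIGHT);
Surface previewSurface = new Surface(texture);

previewBuilder = camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewBuilder.addTarget(previewSurface);
camera.createCaptureSession(Arrays.asList(previewSurface),
        sessionStateCallback, null);
```

With this setup the frames never pass through app code, so there is no per-frame Bitmap allocation or Canvas drawing at all.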

And if you need to modify the preview data before drawing, use the YUV_420_888 format, which is also efficient and will run at 30 fps. It takes quite a bit more effort to draw to the screen, though, since you'll have to convert it to RGB.
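In the question's code that would only change the ImageReader configuration in openCam(); a sketch:

```java
// Request uncompressed YUV frames instead of JPEG.
RawSensorPreviewReader = ImageReader.newInstance(
        MAX_PREVIEW_WIDTH, MAX_PREVIEW_HEIGHT,
        ImageFormat.YUV_420_888, 2);
// In onImageAvailable(), image.getPlanes() then returns three planes
// (Y, U, V), each with its own rowStride/pixelStride, which must be
// converted to RGB before drawing onto the TextureView's Canvas.
```

The YUV-to-RGB step is the extra effort mentioned above; the listener's format check (`image.getFormat() == ImageFormat.JPEG`) would also need to change to match.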

I don't quite understand what you're trying to achieve, but maybe I can push you in the right direction.

JPG is a compressed file format, so using it for a camera preview is a no-go. You generally want to let the camera draw directly onto the TextureView without any compression.

You did leave a comment saying that you need to do some kind of processing first, but if that processing has to happen in real time while showing a preview, have you tried a different format? Any kind of compressed image format will generally result in bad performance.

You can also show a preview directly while occasionally saving a compressed JPG/PNG to storage. You can do that with Camera2, though CameraX has a much simpler way of doing it via use cases.
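For the saving part of the question, once you have a decoded Bitmap, Bitmap.compress writes it out as JPEG or PNG. A sketch, assuming `bitmap` is a decoded frame and using the app's internal files directory (the file name is illustrative):

```java
// Save a Bitmap as a JPEG in internal storage (no permission needed
// for getFilesDir()). Use CompressFormat.PNG for lossless output.
File file = new File(context.getFilesDir(), "capture.jpg");
try (FileOutputStream out = new FileOutputStream(file)) {
    bitmap.compress(Bitmap.CompressFormat.JPEG, 90, out);
} catch (IOException e) {
    Log.e(TAG, "Failed to save bitmap", e);
}
```

Doing this occasionally (e.g. on a button press) rather than per frame keeps the preview path uncompressed and fast.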
