
Taking a Screenshot of SurfaceView with Camera Preview in it

I am trying to implement a feature that takes pictures while recording video. That's the reason I have settled on the screenshot-of-SurfaceView approach.
However, whenever I try to take a screenshot of the SurfaceView, I always get a blank image.

Here is the code I am using to take the snapshot:

View tempView = (View) MY_SURFACE_VIEW;
tempView.setDrawingCacheEnabled(true);
Bitmap tempBmp = Bitmap.createBitmap(tempView.getDrawingCache());
tempView.setDrawingCacheEnabled(false);
// Saving this Bitmap to a file...

In case you think this is a duplicate question, let me assure you that I tried the following solutions posted on SO for the same problem before asking this one:

  1. https://stackoverflow.com/questions/24134964/issue-with-camera-picture-taken-snapshot-using-surfaceview-in-android
  2. Facing issue to take a screenshot while recording a video
  3. Take camera screenshot while recording - Like in Galaxy S3?
  4. Taking screen shot of a SurfaceView in android
  5. Get screenshot of surfaceView in Android (this is the correct answer, but only partially answered; I have already asked @sajar to explain it)

Other resources on the Internet:
1. http://www.coderanch.com/t/622613/Android/Mobile/capture-screenshot-simple-animation-project
2. http://www.phonesdevelopers.com/1795894/

None of these has worked for me so far. I also know that we need to create some thread that interacts with the SurfaceHolder and gets the bitmap from it, but I am not sure how to implement that.

Any help is highly appreciated.

Here's another one: Take screenshot of SurfaceView.

SurfaceViews have a "surface" part and a "view" part; your code tries to capture the "view" part. The "surface" part is a separate layer, and there's no trivial "grab all pixels" method. The basic difficulty is that your app is on the "producer" side of the surface, rather than the "consumer" side, so reading the pixels back out is problematic. Note that the underlying buffers are in whatever format is most convenient for the data producer, so for camera preview it'll be a YUV buffer.
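Since the preview frames arrive as YUV (NV21 by default with the old Camera API), it can help to see what the YUV-to-RGB conversion actually involves. Here is a minimal pure-Java sketch of the NV21-to-ARGB math; the class name `Nv21Converter` is made up for illustration, and on a device you would normally let `YuvImage` or the GPU do this instead:

```java
public class Nv21Converter {
    // Convert an NV21 buffer (full-res Y plane followed by a 2x2-subsampled,
    // V/U-interleaved chroma plane) to packed ARGB_8888 pixels.
    public static int[] toArgb(byte[] nv21, int width, int height) {
        int[] argb = new int[width * height];
        int frameSize = width * height;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int yVal = nv21[y * width + x] & 0xFF;
                // Chroma is shared by each 2x2 block; V comes before U in NV21.
                int uvIndex = frameSize + (y / 2) * width + (x & ~1);
                int v = (nv21[uvIndex] & 0xFF) - 128;
                int u = (nv21[uvIndex + 1] & 0xFF) - 128;
                // Integer-approximated BT.601 full-range conversion.
                int r = clamp(yVal + ((1436 * v) >> 10));
                int g = clamp(yVal - ((352 * u + 731 * v) >> 10));
                int b = clamp(yVal + ((1814 * u) >> 10));
                argb[y * width + x] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }

    private static int clamp(int c) {
        return c < 0 ? 0 : (c > 255 ? 255 : c);
    }
}
```

Doing this per-frame in Java is slow, which is exactly why the answer below recommends letting a hardware module handle the conversion.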

The easiest and most efficient way to "capture" the surface pixels is to draw them twice, once for the screen and once for capture. If you do this with OpenGL ES, the YUV-to-RGB conversion will likely be done by a hardware module, which will be much faster than receiving camera frames in YUV buffers and doing your own conversion.

Grafika's "texture from camera" activity demonstrates manipulation of incoming video data with GLES. After rendering you can get the pixels with glReadPixels(). The performance of glReadPixels() can vary significantly between devices and different use cases. EglSurfaceBase#saveFrame() shows how to capture to a Bitmap and save it as a PNG.

More information about the Android graphics architecture, notably the producer-consumer nature of SurfaceView surfaces, can be found in this document.

public class AndroidSurfaceviewExample extends Activity implements SurfaceHolder.Callback  {

static Camera camera;
SurfaceView surfaceView;
SurfaceHolder surfaceHolder;
public static Bitmap mBitmap;
public static Camera.Parameters param;
public static Camera.Size mPreviewSize;
public static byte[] byteArray;

PictureCallback jpegCallback;
public static ImageView imgScreen;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    setContentView(R.layout.camera);

    surfaceView = (SurfaceView) findViewById(R.id.surfaceView);
    surfaceHolder = surfaceView.getHolder();
    Button btnTakeScreen = (Button)findViewById(R.id.btnTakeScreen);
    imgScreen = (ImageView)findViewById(R.id.imgScreen);



    btnTakeScreen.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View view) {
            Bitmap screen = Bitmap.createBitmap(getBitmap());
            imgScreen.setImageBitmap(screen);
        }
    });


    // Install a SurfaceHolder.Callback so we get notified when the
    // underlying surface is created and destroyed.
    surfaceHolder.addCallback(this);

    // deprecated setting, but required on Android versions prior to 3.0
    surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

    jpegCallback = new PictureCallback() {
        public void onPictureTaken(byte[] data, Camera camera) {
            FileOutputStream outStream = null;
            try {
                outStream = new FileOutputStream(String.format("/sdcard/%d.jpg", System.currentTimeMillis()));
                outStream.write(data);
                Log.d("Log", "onPictureTaken - wrote bytes: " + data.length);
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                if (outStream != null) {
                    try {
                        outStream.close();
                    } catch (IOException ignored) {
                    }
                }
            }
            // Toast duration must be LENGTH_SHORT or LENGTH_LONG, not a raw ms value.
            Toast.makeText(getApplicationContext(), "Picture Saved", Toast.LENGTH_SHORT).show();
            refreshCamera();
        }
    };
}

public void refreshCamera() {
    if (surfaceHolder.getSurface() == null) {
        // preview surface does not exist
        return;
    }

    // stop preview before making changes
    try {
        camera.stopPreview();
    } catch (Exception e) {
        // ignore: tried to stop a non-existent preview
    }

    // set preview size and make any resize, rotate or
    // reformatting changes here
    // start preview with new settings
    try {
        camera.setPreviewDisplay(surfaceHolder);
        camera.startPreview();
    } catch (Exception e) {
        Log.e("Log", "Failed to restart preview", e);
    }
}

public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
    // Now that the size is known, set up the camera parameters and begin
    // the preview.
    refreshCamera();
}

public void surfaceCreated(SurfaceHolder holder) {
    if (camera == null) {
        try {
            camera = Camera.open();
        } catch (RuntimeException ignored) {
        }
    }

    try {
        if (camera != null) {
            camera.setPreviewDisplay(surfaceHolder);
        }
    } catch (Exception e) {
        if (camera != null)
            camera.release();
        camera = null;
    }

    if (camera == null) {
        return;
    } else {
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] bytes, Camera camera) {
                if (param == null) {
                    return;
                }
                byteArray = bytes;
            }
        });
    }

    param = camera.getParameters();
    mPreviewSize = param.getSupportedPreviewSizes().get(0);
    // Actually apply the chosen size; otherwise the frames delivered to
    // onPreviewFrame may not match the dimensions used in getBitmap().
    param.setPreviewSize(mPreviewSize.width, mPreviewSize.height);

    param.setColorEffect(Camera.Parameters.EFFECT_NONE);

    //set antibanding to none
    if (param.getAntibanding() != null) {
        param.setAntibanding(Camera.Parameters.ANTIBANDING_OFF);
    }

    // set white ballance
    if (param.getWhiteBalance() != null) {
        param.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_CLOUDY_DAYLIGHT);
    }

    //set flash
    if (param.getFlashMode() != null) {
        param.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
    }

    //set zoom
    if (param.isZoomSupported()) {
        param.setZoom(0);
    }

    //set focus mode, only if the device supports it (setParameters throws otherwise)
    if (param.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_INFINITY)) {
        param.setFocusMode(Camera.Parameters.FOCUS_MODE_INFINITY);
    }


    // modify parameter
    camera.setParameters(param);
    try {
        // The Surface has been created, now tell the camera where to draw
        // the preview.
        camera.setPreviewDisplay(surfaceHolder);
        camera.startPreview();
    } catch (Exception e) {
        // check for exceptions
        System.err.println(e);
        return;
    }
}

public void surfaceDestroyed(SurfaceHolder holder) {
    // stop preview and release camera
    if (camera != null) {
        camera.setPreviewCallback(null);
        camera.stopPreview();
        camera.release();
        camera = null;
    }
}



public Bitmap getBitmap() {
    try {
        if (param == null)
            return null;

        if (mPreviewSize == null)
            return null;

        if (byteArray == null)
            return null;

        int format = param.getPreviewFormat();
        // YuvImage only accepts NV21 or YUY2; NV21 is the default preview format.
        YuvImage yuvImage = new YuvImage(byteArray, format, mPreviewSize.width, mPreviewSize.height, null);
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();

        Log.i("myLog", "preview frame bytes: " + byteArray.length);

        Rect rect = new Rect(0, 0, mPreviewSize.width, mPreviewSize.height);

        yuvImage.compressToJpeg(rect, 75, byteArrayOutputStream);
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inPurgeable = true;
        options.inInputShareable = true;
        mBitmap = BitmapFactory.decodeByteArray(byteArrayOutputStream.toByteArray(), 0, byteArrayOutputStream.size(), options);

        byteArrayOutputStream.flush();
        byteArrayOutputStream.close();
    } catch (IOException ioe) {
        ioe.printStackTrace();
    }

    return mBitmap;
}
}
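One sanity check worth adding before wrapping byteArray in YuvImage: NV21 packs a full-resolution Y plane followed by a half-resolution interleaved V/U plane, so a valid frame occupies width * height * 3/2 bytes. A small pure-Java helper makes that explicit (the name `nv21FrameSize` is hypothetical, not an SDK method):

```java
public class Nv21Util {
    // Expected byte length of an NV21 frame: a full Y plane plus a V/U
    // plane subsampled 2x2 in each dimension (rounded up for odd sizes).
    public static int nv21FrameSize(int width, int height) {
        return width * height + 2 * ((width + 1) / 2) * ((height + 1) / 2);
    }
}
```

In getBitmap() you could return null early when byteArray.length does not equal nv21FrameSize(mPreviewSize.width, mPreviewSize.height); that catches the common bug of pairing a stale frame with a newly chosen preview size.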
