Capture camera frames in background (Android)

My problem is this: I want a background service that obtains frames from the camera in real time so that I can analyze them. I've seen a lot of similar topics here that supposedly address this issue, but none of them has actually worked in my case.

My first attempt was to create an Activity that started a Service. Inside the service I created a SurfaceView, got a holder from it, and implemented a callback in which I prepared the camera and everything else. Then, in a PreviewCallback, I could start a new thread to analyze the data I was getting from its onPreviewFrame method.

That worked well enough while the service was in the foreground, but as soon as I opened another application (with the service still running in the background), the preview was gone, so I couldn't get frames from it anymore.

Searching the internet, I found that I could perhaps solve this with SurfaceTexture. So I created an Activity to start my service, like this:

public class SurfaceTextureActivity extends Activity {

    public static TextureView mTextureView;
    public static Vibrator mVibrator;
    public static GLSurfaceView mGLView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        mGLView = new GLSurfaceView(this);
        mTextureView = new TextureView(this);
        setContentView(mTextureView);

        try {
            Intent intent = new Intent(SurfaceTextureActivity.this, RecorderService.class);
            intent.putExtra(RecorderService.INTENT_VIDEO_PATH, "/folder-path/");
            startService(intent);
            Log.i("ABC", "Start Service " + this + " + " + mTextureView + " + " + getWindowManager());
        } catch (Exception e) {
            Log.i("ABC", "Exc SurfaceTextureActivity: " + e.getMessage());
        }
    }
}

Then I made RecorderService implement SurfaceTextureListener so that I could open the camera, do the other preparations, and then perhaps capture the frames. My RecorderService currently looks like this:

public class RecorderService extends Service implements TextureView.SurfaceTextureListener, SurfaceTexture.OnFrameAvailableListener {

    private Camera mCamera = null;
    private TextureView mTextureView;
    private SurfaceTexture mSurfaceTexture;
    private float[] mTransformMatrix;

    private static IMotionDetection detector = null;
    public static Vibrator mVibrator;

    @Override
    public void onCreate() {
        try {

            mTextureView = SurfaceTextureActivity.mTextureView;
            mTextureView.setSurfaceTextureListener(this);

            Log.i("ABC","onCreate");

//          startForeground(START_STICKY, new Notification()); - doesn't work

        } catch (Exception e) {
            Log.i("ABC","onCreate exception "+e.getMessage());
            e.printStackTrace();
        }

    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) 
    {
        //How do I obtain frames?!
//      SurfaceTextureActivity.mGLView.queueEvent(new Runnable() {
//          
//          @Override
//          public void run() {
//              mSurfaceTexture.updateTexImage();
//              
//          }
//      });
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width,
            int height) {

        mSurfaceTexture = surface;
        mSurfaceTexture.setOnFrameAvailableListener(this);
        mVibrator = (Vibrator)this.getSystemService(VIBRATOR_SERVICE);

         detector = new RgbMotionDetection();

        int cameraId = 0;
        Camera.CameraInfo info = new Camera.CameraInfo();

        for (cameraId = 0; cameraId < Camera.getNumberOfCameras(); cameraId++) {
            Camera.getCameraInfo(cameraId, info);
            if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT)
                break;
        }

        mCamera = Camera.open(cameraId);
        Matrix transform = new Matrix();

        Camera.Size previewSize = mCamera.getParameters().getPreviewSize();
        int rotation = ((WindowManager)(getSystemService(Context.WINDOW_SERVICE))).getDefaultDisplay()
                .getRotation();
        Log.i("ABC", "onSurfaceTextureAvailable(): CameraOrientation(" + cameraId + ")" + info.orientation + " " + previewSize.width + "x" + previewSize.height + " Rotation=" + rotation);

        try {
            switch (rotation) {
                case Surface.ROTATION_0:
                    mCamera.setDisplayOrientation(90);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.height, previewSize.width, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.height / 2, 0);
                    break;

                case Surface.ROTATION_90:
                    mCamera.setDisplayOrientation(0);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.width, previewSize.height, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.width / 2, 0);
                    break;

                case Surface.ROTATION_180:
                    mCamera.setDisplayOrientation(270);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.height, previewSize.width, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.height / 2, 0);
                    break;

                case Surface.ROTATION_270:
                    mCamera.setDisplayOrientation(180);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.width, previewSize.height, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.width / 2, 0);
                    break;
            }

            mCamera.setPreviewTexture(mSurfaceTexture);
            Log.i("ABC", "onSurfaceTextureAvailable(): Transform: " + transform.toString());

            mCamera.startPreview();
//          mTextureView.setVisibility(0);

            mCamera.setPreviewCallback(new PreviewCallback() {

                @Override
                public void onPreviewFrame(byte[] data, Camera camera) {
                    if (data == null) return;
                    Camera.Size size = mCamera.getParameters().getPreviewSize();
                    if (size == null) return;

                    // This is where I start my thread that analyzes images
                    DetectionThread thread = new DetectionThread(data, size.width, size.height);
                    thread.start();
                }
            });
        } catch (Exception t) {
            Log.i("ABC", "onSurfaceTextureAvailable Exception: " + t.getMessage());
        }
    }

    // onBind and the remaining SurfaceTextureListener callbacks
    // (onSurfaceTextureSizeChanged, onSurfaceTextureDestroyed, onSurfaceTextureUpdated)
    // are omitted here.
}

However, just as in the other case, my analyzing thread starts inside onSurfaceTextureAvailable, which fires only while the texture exists; when I open another application the texture goes away, so frame capturing doesn't continue.

Some posts suggest it's possible, but I just don't know how. One idea was to implement SurfaceTexture.OnFrameAvailableListener and, once a new frame is available, post a Runnable to the render thread (GLSurfaceView.queueEvent(..)) that calls SurfaceTexture.updateTexImage(). That is what I've tried (it's commented out in my code), but it doesn't work: the application crashes if I do that.
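For context on why that attempt crashes: updateTexImage() may only be called on the thread that owns the GL context the SurfaceTexture is attached to, and a GLSurfaceView only has such a context once it is attached to a window and given a Renderer. A rough sketch of the pattern under those assumptions (texture-id handling is hypothetical):

```java
// Sketch only: mGLView must be attached to a window and have a Renderer,
// otherwise queueEvent() has no GL thread/context to run against.
mGLView.setRenderer(new GLSurfaceView.Renderer() {
    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        // Create a GL_TEXTURE_EXTERNAL_OES texture here and attach the
        // camera's SurfaceTexture to this context, e.g.:
        // mSurfaceTexture.attachToGLContext(textureId);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) { }

    @Override
    public void onDrawFrame(GL10 gl) {
        // Safe: this runs on the GL thread that owns the attached context.
        mSurfaceTexture.updateTexImage();
    }
});
```

The GLSurfaceView in the code above is created but never shown or given a renderer, which would explain the crash.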

What else can I possibly do? I know this can work somehow, because I've seen it done in apps like SpyCameraOS (yes, I know it's open-source, and I've looked at the code, but I couldn't build a working solution from it). I feel like I'm missing just a small piece somewhere, but I have no idea what I'm doing wrong. I've been at this for the past three days with no success.

Help would be greatly appreciated.

Summarizing the comments: direct the Camera output to a SurfaceTexture that isn't tied to a View object. A TextureView is destroyed when the activity is paused, releasing its SurfaceTexture, but if you create a separate SurfaceTexture (or detach the one from the TextureView), it won't be affected by changes in Activity state. The texture can be rendered to an off-screen Surface, from which pixels can be read.
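A minimal sketch of that approach, assuming an offscreen EGL context has already been made current on a dedicated thread (the EGL setup itself is omitted; helper classes for it can be found in Grafika):

```java
// Create a GL texture of the external-OES type the camera requires.
int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

// A SurfaceTexture built directly from a GL texture id is not tied to any
// View, so pausing or stopping the Activity does not destroy it.
SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
surfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        // Signal the EGL thread: it must call st.updateTexImage() there,
        // draw the external texture to the offscreen Surface, and then
        // glReadPixels() the result for analysis.
    }
});

mCamera.setPreviewTexture(surfaceTexture);
mCamera.startPreview();
```

The key point is that the SurfaceTexture is owned by the service's own GL thread rather than by a TextureView, so it keeps receiving camera frames while the app is in the background.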

Various examples can be found in Grafika.
