
How to properly implement camera2 realtime frame processing using RxJava?

I'm making a reactive wrapper over camera2; my goal is to get each frame and then pass it to face recognition.

So I created a wrapper method over setOnImageAvailableListener:

    fun createOnImageAvailableFlowable(imageReader: ImageReader, handler: Handler?): Flowable<ImageReader> {
        return Flowable.create({ subscriber ->
            // Emit the ImageReader every time a new frame becomes available.
            // The handler may be null: callbacks are then delivered on the calling thread's looper.
            imageReader.setOnImageAvailableListener({
                if (!subscriber.isCancelled)
                    subscriber.onNext(it)
            }, handler)

            // Detach the listener when the subscription is cancelled.
            subscriber.setCancellable {
                imageReader.setOnImageAvailableListener(null, null)
            }
        }, BackpressureStrategy.LATEST)
    }
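For context, here is a minimal sketch of how the wrapper might be wired up (the ImageReader size and format below are illustrative assumptions, not taken from the question):

    // Hypothetical ImageReader for the preview stream; size and format are placeholders.
    val imageReader = ImageReader.newInstance(640, 480, ImageFormat.YUV_420_888, 2)

    // The handler argument controls which looper delivers onImageAvailable callbacks;
    // null means the calling thread's looper (see the answer below).
    val frames: Flowable<ImageReader> = createOnImageAvailableFlowable(imageReader, null)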

The reactive chain looks as follows:

    createOnImageAvailableFlowable(imageReader!!, null)
        .concatMap {
            it.acquireLatestImage()?.use { image ->
                val rotation = ReactiveCamera.getRotationCompensation(cameraId!!, this, applicationContext)
                val visionImage = FirebaseVisionImage.fromMediaImage(image, rotation)
                firebaseFaceDetector
                    .detectInImage(visionImage)
                    .toFlowable(BackpressureStrategy.LATEST)
                    .map { list -> Optional(list) }
            } ?: Flowable.just(Optional(null))
        }
        ...

This code works, but it causes some lag on the preview surface because all the work is done on the main thread. It needs to run on a separate thread. My naive solution was to add an observeOn operator before concatMap:

    createOnImageAvailableFlowable(imageReader!!, null)
        .observeOn(Schedulers.io()) // doesn't switch thread
        .concatMap {
            // still main thread
        }
        ...

But it has no effect; all the work still runs on the main thread. If I use concatMapEager instead of concatMap, everything runs on a separate thread as expected, but the frames arrive with a significant delay.

What am I doing wrong? How can I instruct the reactive stream to run on a separate thread in this case? And how should backpressure be handled for realtime frame processing?

Update

I provided my own thread as Kiskae suggested, but now only the first emission happens on the scheduler's thread; the rest of the emissions remain on the main thread:

    createOnImageAvailableFlowable(imageReader!!, null)
        .subscribeOn(AndroidSchedulers.from(nonMainThread.looper))
        .concatMap {
            val t = Thread.currentThread()
            val name = t.name
            Log.d(TAG, "current thread {$name}")
            ...
        }

Output:

D/MainActivity: current thread {Camera2}
D/MainActivity: current thread {main}
D/MainActivity: current thread {main}
D/MainActivity: current thread {main}
D/MainActivity: current thread {main}

Looking at the documentation of ImageReader.setOnImageAvailableListener:

Handler: The handler on which the listener should be invoked, or null if the listener should be invoked on the calling thread's looper.

Since you're subscribing on the main looper, the callback ends up being registered with the main looper. This causes all the processing before the concatMap to always occur on the main application thread.

You can solve this either by providing a handler instead of null, or by calling subscribeOn and providing a handler-based scheduler like RxAndroid's HandlerScheduler.
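A minimal sketch of both options, reusing the createOnImageAvailableFlowable wrapper and imageReader from the question (the frameThread name and logging are made up here for illustration):

    // Hypothetical background thread dedicated to frame delivery.
    val frameThread = HandlerThread("frame-processing").apply { start() }

    // Option 1: pass an explicit Handler so the ImageReader invokes the listener
    // on frameThread instead of the main looper.
    createOnImageAvailableFlowable(imageReader!!, Handler(frameThread.looper))
        .concatMap { reader ->
            Log.d(TAG, "frame on ${Thread.currentThread().name}") // "frame-processing"
            Flowable.just(reader) // the detection chain from the question goes here
        }
        .subscribe()

    // Option 2: keep the handler null, but subscribe on a looper-backed scheduler,
    // so setOnImageAvailableListener is registered from frameThread's looper and
    // the callbacks are delivered there.
    createOnImageAvailableFlowable(imageReader!!, null)
        .subscribeOn(AndroidSchedulers.from(frameThread.looper))
        .concatMap { reader ->
            Log.d(TAG, "frame on ${Thread.currentThread().name}")
            Flowable.just(reader)
        }
        .subscribe()

Once the callbacks leave the main looper, an observeOn further down the chain can move the detection work to yet another scheduler if needed.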
