
Camera2: SurfaceTexture buffer size is being overridden

I am writing an app using the Camera2 API that should show a preview from the camera and take a picture. Currently my code works as follows:

  1. When the camera Fragment is instantiated, wait for TextureView.SurfaceTextureListener.onSurfaceTextureAvailable to be called
  2. In the ViewModel, get the available and suitable picture and preview sizes from CameraCharacteristics, and emit the chosen preview size to the Fragment via LiveData
  3. The Fragment observes the preview size LiveData and calls setDefaultBufferSize with the new size on its TextureView's SurfaceTexture
  4. When the new size is set, a capture session is created and a repeating preview request is set, so the TextureView starts to show the image from the camera
  5. To avoid disrupting other camera apps, all camera-related resources are released in the Fragment's onPause, and steps 1-4 are repeated after onResume
  6. The Surface instance is shared between the Fragment and the camera logic classes: the shared variable is initialized in TextureView.SurfaceTextureListener.onSurfaceTextureAvailable and set to null when TextureView.SurfaceTextureListener.onSurfaceTextureDestroyed is called (see the sketch after this list)
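A minimal Kotlin sketch of this wiring, assuming AndroidX; previewSurface, viewModel.selectSizes(), viewModel.previewSize, and createCaptureSession() are hypothetical names standing in for the app's actual members:

    import android.graphics.SurfaceTexture
    import android.view.Surface
    import android.view.TextureView

    // Inside the camera Fragment; previewSurface is the variable shared with
    // the camera logic classes (step 6).
    private var previewSurface: Surface? = null

    private val surfaceTextureListener = object : TextureView.SurfaceTextureListener {
        override fun onSurfaceTextureAvailable(texture: SurfaceTexture, width: Int, height: Int) {
            // Step 1: the SurfaceTexture is ready; share the Surface and
            // let the ViewModel pick the sizes (step 2).
            previewSurface = Surface(texture)
            viewModel.selectSizes()
        }

        override fun onSurfaceTextureSizeChanged(texture: SurfaceTexture, width: Int, height: Int) = Unit

        override fun onSurfaceTextureDestroyed(texture: SurfaceTexture): Boolean {
            // Step 6: stop sharing the Surface when the TextureView goes away.
            previewSurface = null
            return true
        }

        override fun onSurfaceTextureUpdated(texture: SurfaceTexture) = Unit
    }

    // Step 3: observe the preview size chosen by the ViewModel, then
    // create the session and set the repeating request (step 4).
    viewModel.previewSize.observe(viewLifecycleOwner) { size ->
        textureView.surfaceTexture?.setDefaultBufferSize(size.width, size.height)
        createCaptureSession()
    }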

This works fine on devices from popular brands with modern Android versions, but the app must also work on a particular generic Chinese tablet running Android 6 (logcat shows " CameraManager: Using legacy camera HAL "), and there I face a problem.

  • When the camera is instantiated and the preview is started, the preview size is 640x480 (so the image is stretched), even though the size passed to setDefaultBufferSize is 1280x720
  • Logcat is also full of continuous Surface::setBuffersUserDimensions(this=0x7f55fb5200,w=640,h=480) messages
  • I've found on SO that on some Samsung devices with Android 5, some resolutions may not really be available for Camera2; but here, when I close the app and open it again, the preview resolution is 1280x720 as needed
  • So my guess is that I may be calling setDefaultBufferSize too early during the first camera Fragment setup, and the needed resolution is only "picked up" when the view is recreated after the app has been minimized
  • I also tried calling setDefaultBufferSize in a lambda passed to TextureView.post (see the sketch after this list), and it solved the problem except when the app has to ask for the user's permissions on the camera Fragment (i.e. when the user opens the camera for the first time), so the Fragment is paused a few times to show permission pop-ups. However, setDefaultBufferSize is called on the main thread even without TextureView.post, so I guess the delay introduced by TextureView.post was the game changer here
  • Also, in the setDefaultBufferSize docs I see: "The new default buffer size will take effect the next time the image producer requests a buffer to fill. For Canvas this will be the next time Surface.lockCanvas is called. For OpenGL ES, the EGLSurface should be destroyed (via eglDestroySurface), made not-current (via eglMakeCurrent), and then recreated (via eglCreateWindowSurface) to ensure that the new default size has taken effect." It seems to me that this may be relevant to the case
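For reference, the TextureView.post variant from the list above looked roughly like this; startPreview() is a placeholder for creating the capture session and setting the repeating request:

    textureView.post {
        // Delay setting the buffer size until after this message is
        // dequeued on the main thread.
        textureView.surfaceTexture?.setDefaultBufferSize(1280, 720)
        startPreview() // hypothetical: creates the session and starts the preview
    }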

I solved this by overriding onSurfaceTextureSizeChanged in the SurfaceTextureListener and calling surfaceTexture.setDefaultBufferSize there with the desired preview size. When the default buffer size is overridden with an incorrect size (during initialization), this method is called, and I set the size back again.
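A minimal sketch of that fix, assuming the chosen preview size is held in a desiredSize field (hypothetical name); the guard against re-setting an already-correct size is an addition to avoid redundant calls when the callback fires again:

    import android.graphics.SurfaceTexture
    import android.util.Size

    // Hypothetical field holding the preview size chosen from CameraCharacteristics.
    private val desiredSize = Size(1280, 720)

    override fun onSurfaceTextureSizeChanged(texture: SurfaceTexture, width: Int, height: Int) {
        // On this device the legacy HAL resets the default buffer size to
        // 640x480 during initialization; that triggers this callback, and we
        // simply apply the desired size again.
        if (width != desiredSize.width || height != desiredSize.height) {
            texture.setDefaultBufferSize(desiredSize.width, desiredSize.height)
        }
    }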
