How to get preview frames for live video sharing using the camera2 API

I am trying to do live video sharing using the camera2 API on Android.

I have two devices for this. On the first device I can see the camera preview in a SurfaceView, driven by setRepeatingRequest. Now I want to render the frames captured on the first device on the second device.
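To make the preview frames available for anything beyond on-screen display, a common approach is to add an ImageReader surface as a second target of the repeating request. This is a minimal sketch, assuming an already-opened CameraDevice `camera`, a display `previewSurface`, and a background `handler` (all placeholder names, not from the question):

```java
// Receive each preview frame as a YUV_420_888 Image for processing.
ImageReader reader = ImageReader.newInstance(1280, 720, ImageFormat.YUV_420_888, 2);
reader.setOnImageAvailableListener(r -> {
    Image image = r.acquireLatestImage();
    if (image == null) return;
    // image.getPlanes() exposes the Y/U/V buffers; hand them to an
    // encoder or network layer here, then release the Image promptly.
    image.close();
}, handler);

camera.createCaptureSession(
        Arrays.asList(previewSurface, reader.getSurface()),
        new CameraCaptureSession.StateCallback() {
            @Override public void onConfigured(CameraCaptureSession session) {
                try {
                    CaptureRequest.Builder b =
                            camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
                    b.addTarget(previewSurface);      // on-screen preview
                    b.addTarget(reader.getSurface()); // frames for processing
                    session.setRepeatingRequest(b.build(), null, handler);
                } catch (CameraAccessException e) {
                    // handle camera access failure
                }
            }
            @Override public void onConfigureFailed(CameraCaptureSession session) { }
        }, handler);
```

With this in place, every preview frame also arrives in the ImageReader callback, where it can be compressed and sent over the network.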

I couldn't find anything to get me started while searching Google.

You need to transmit the image buffers over the network to the second device, and then display them there. (This assumes you can't wire the devices together with a USB cable and are relying on Wi-Fi or cellular data.)

That's a lot of work: you can't send the raw image buffers, because that would take far too much network bandwidth, so you have to compress them with the hardware encoder, and then decode them on the other end with the hardware decoder. You also need to deal with network congestion, bandwidth estimation, and everything else that can go wrong over a network link.
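On Android, the hardware encoder mentioned above is reached through MediaCodec. A minimal sketch of configuring an H.264 encoder follows; the resolution and bitrate values are illustrative assumptions:

```java
// Configure Android's hardware H.264 encoder via MediaCodec.
MediaFormat format = MediaFormat.createVideoFormat(
        MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);   // ~2 Mbps
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);   // one keyframe per second

MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// With COLOR_FormatSurface, camera frames can be fed directly: add this
// Surface as another target of the repeating capture request.
Surface inputSurface = encoder.createInputSurface();
encoder.start();

// Drain loop (usually on its own thread): compressed H.264 packets come
// out of the output buffers and can be sent over the network.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int index = encoder.dequeueOutputBuffer(info, 10_000 /* µs timeout */);
if (index >= 0) {
    ByteBuffer packet = encoder.getOutputBuffer(index);
    // ... transmit info.size bytes of `packet` to the second device ...
    encoder.releaseOutputBuffer(index, false);
}
```

The decoding side mirrors this: a MediaCodec decoder configured with the same MIME type, rendering into a Surface on the second device. Packetization, transport, and congestion handling are still entirely up to you, which is why a framework is usually preferable.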

The closest thing I'm aware of for making this happen easily is the WebRTC framework.
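WebRTC's Android library wraps camera2 capture, hardware encoding, and congestion control behind a few classes. A hedged sketch of the capture side, assuming the org.webrtc library, an EGL context `eglBase`, and an Android `context` (signaling between the two devices is still your responsibility):

```java
// Initialize the WebRTC library once per process.
PeerConnectionFactory.initialize(
        PeerConnectionFactory.InitializationOptions.builder(context)
                .createInitializationOptions());
PeerConnectionFactory factory =
        PeerConnectionFactory.builder().createPeerConnectionFactory();

// Capture from camera2 and wrap the frames in a WebRTC video track.
Camera2Enumerator enumerator = new Camera2Enumerator(context);
VideoCapturer capturer =
        enumerator.createCapturer(enumerator.getDeviceNames()[0], null);
VideoSource source = factory.createVideoSource(capturer.isScreencast());
SurfaceTextureHelper helper =
        SurfaceTextureHelper.create("capture", eglBase.getEglBaseContext());
capturer.initialize(helper, context, source.getCapturerObserver());
capturer.startCapture(1280, 720, 30);
VideoTrack track = factory.createVideoTrack("video0", source);

// Add `track` to a PeerConnection; WebRTC then handles encoding,
// bandwidth estimation, and decoding on the receiving device.
```

Once the track is attached to a PeerConnection and the offer/answer exchange is done over your own signaling channel, the second device receives and renders the stream without you touching the codec layer directly.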
