
Is it possible to extract frames from a live camera feed in react-native?

I am trying to send image frames to a TensorFlow model running on a server. I got it all set up using pure React JS, but implementing it in React Native is giving me a lot of problems.

It seems impossible to get the frames from the WebRTC stream, and even without WebRTC I can't find a way to get frames from any video in React Native.

Has anyone found a good solution to this?

Have you tried a library that can do this for you? Maybe this one can help:

https://github.com/react-native-webrtc/react-native-webrtc

Unlike with "pure" ReactJS, you need a bit of configuration for both Android and iOS after installing the module.
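For context, the per-platform configuration the answer refers to usually amounts to declaring camera/microphone permissions. The entries below are a sketch of the typical setup; check the library's current installation docs, since exact requirements vary between versions:

```xml
<!-- Android: add permissions in android/app/src/main/AndroidManifest.xml -->
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.INTERNET" />

<!-- iOS: add usage descriptions in ios/<AppName>/Info.plist -->
<key>NSCameraUsageDescription</key>
<string>Camera access is required to stream video</string>
<key>NSMicrophoneUsageDescription</key>
<string>Microphone access is required for WebRTC audio</string>
```

On iOS the app will crash on first camera access if the usage-description keys are missing; on Android the permissions must also be requested at runtime on API 23+.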

