I'm facing an issue: I built a device to detect strangers at my home. It takes input from several IP cameras and uses a TensorFlow model to process the frames it gets from them.
Now I want to build a dashboard (using Flask or Django as the Python backend) to stream the processed frames, and, if possible, apply some transforms to them (such as stacking multiple frames into one), and run the server so I can watch the stream remotely. Currently I'm sending the frames one by one as independent images, but that costs a lot of bandwidth. I read about how the H.264 encoder works and was very excited by it. So the question is: how can I use H.264 (or any similar encoder) to transfer my data and reduce the bandwidth?
OpenCV has some video encoding capabilities, but cv::VideoWriter
is intended to write to a file. You might be able to access the raw stream by writing to a device, but it's a bit hacky.
If you really want to stream H.264 video, you will probably have better luck with GStreamer. You can easily feed it with the content of an OpenCV matrix, though I don't have an example at hand.