Hello everyone,
I am trying to implement low-latency video streaming with WebRTC. My code is written in C++ (WebSocket handling, etc.), and the only external piece I use is the WebRTC signalling server, which is written in Python (ref1). When I use a webcam, I have no problem streaming video to the client; however, when I try to use a FLIR camera, I run into many problems during implementation. I have a few questions I would like to clear up, and I hope you can give me some recommendations.
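For reference, a send-side pipeline sketch for getting camera frames into webrtcbin, roughly following the gstwebrtc-demos examples. The `aravissrc` element (from the Aravis project, which supports GigE Vision cameras such as many FLIR models) is an assumption here; if your FLIR camera is only reachable through the Spinnaker SDK, you would substitute an `appsrc` that you feed frames into from C++:

```shell
# Sketch only: encode camera frames to H.264 RTP for webrtcbin.
# 'aravissrc' is an assumption (Aravis GigE Vision plugin); replace it
# with your actual FLIR source, e.g. an appsrc fed from the Spinnaker SDK.
gst-launch-1.0 aravissrc ! videoconvert ! queue ! \
  x264enc tune=zerolatency speed-preset=ultrafast ! \
  rtph264pay config-interval=-1 pt=96 ! \
  "application/x-rtp,media=video,encoding-name=H264,payload=96" ! \
  webrtcbin name=sendrecv
```

Note that a webrtcbin pipeline cannot actually negotiate a session from `gst-launch-1.0` alone; in practice you would pass this description to `gst_parse_launch()` in your C++ application and drive the signalling yourself.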
ref1: https://github.com/centricular/gstwebrtc-demos/tree/master/signalling
PS:
By running gst-inspect-1.0 webrtcbin you will find that both the source and sink capabilities of this element are just application/x-rtp. Therefore, if you want to consume webrtcbin's source pads, you will need to pipe them to an RTP depayloader such as rtph264depay for video or rtpopusdepay for audio.
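Concretely, the receive-side chains you would link to webrtcbin's dynamically created source pads (typically from a "pad-added" signal handler in C++) might look like the following sketch; the decoder and sink elements are common defaults, not requirements:

```shell
# Sketch: fragments to attach to webrtcbin src pads, which emit
# application/x-rtp and therefore need a depayloader first.

# Video pad -> depayload H.264, decode, display:
... ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink

# Audio pad -> depayload Opus, decode, play:
... ! rtpopusdepay ! opusdec ! audioconvert ! autoaudiosink
```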