
GStreamer WebRTC pipeline problem for open source camera

Hello everyone,

I am trying to implement low-latency video streaming using WebRTC. My code is written in C++ (the websocket client, etc.), and the only external piece I use is the WebRTC signalling server, which is written in Python (ref1). When I use a webcam, I have no problem streaming video to the client; however, when I try to use a FLIR camera, I run into a lot of problems during implementation. There are a few questions I would like to clear up, and I hope you can give me some recommendations.

  • Is there a specific data type that I should pipe into webrtc as a source? In other words, what kind of data should I be sending into webrtc?
  • To check whether my WebRTC implementation works without the webcam, I tried to send a still image instead, but I get the error "Pipeline is empty". What can cause this problem? This is actually the main reason I want to understand exactly what data type I should pipe into webrtc.

ref1: https://github.com/centricular/gstwebrtc-demos/tree/master/signalling

PS:

  • The client and the Jetson Nano are on the same network
  • The signalling server is running on the Jetson Nano

By running gst-inspect-1.0 webrtcbin you will find that both the source and sink capabilities of this element are just application/x-rtp.
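You can also confirm this programmatically instead of reading the gst-inspect output. Below is a minimal C++ sketch, assuming GStreamer 1.x with gst-plugins-bad installed (which is what provides webrtcbin), that walks the element's static pad templates and prints their caps:

    #include <gst/gst.h>
    #include <cstdio>

    int main(int argc, char *argv[]) {
        gst_init(&argc, &argv);

        // Look up the webrtcbin factory (shipped in gst-plugins-bad).
        GstElementFactory *factory = gst_element_factory_find("webrtcbin");
        if (!factory) {
            fprintf(stderr, "webrtcbin not found -- is gst-plugins-bad installed?\n");
            return 1;
        }

        // Walk the static pad templates and print their caps; both the
        // sink_%u and src_%u templates should report application/x-rtp.
        const GList *templates = gst_element_factory_get_static_pad_templates(factory);
        for (const GList *l = templates; l != nullptr; l = l->next) {
            GstStaticPadTemplate *tmpl = static_cast<GstStaticPadTemplate *>(l->data);
            GstCaps *caps = gst_static_pad_template_get_caps(tmpl);
            gchar *caps_str = gst_caps_to_string(caps);
            printf("%s pad '%s': %s\n",
                   tmpl->direction == GST_PAD_SINK ? "sink" : "src",
                   tmpl->name_template, caps_str);
            g_free(caps_str);
            gst_caps_unref(caps);
        }

        gst_object_unref(factory);
        return 0;
    }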

Therefore, if you want to consume a webrtcbin source pad (an incoming remote stream), you will need to pipe it into some sort of RTP depayloader, such as rtph264depay for video or rtpopusdepay for audio. Conversely, to feed media into webrtcbin's sink pad (which is what your question is about), you must encode it and run it through the matching RTP payloader, such as rtph264pay or rtpopuspay, so that what reaches webrtcbin is already application/x-rtp; see the sketch below.
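Here is a minimal C++ sketch of both directions, assuming H.264 video and the common x264enc/rtph264pay elements. Signalling (the offer/answer and ICE exchange with your Python server) is deliberately left out, so this shows only the pipeline wiring, not a complete sender; videotestsrc is a stand-in for your camera source:

    #include <gst/gst.h>

    // Handle a new source pad on webrtcbin (an incoming remote stream):
    // link it straight into an RTP depayloader, as described above.
    static void on_incoming_stream(GstElement *webrtc, GstPad *pad, gpointer user_data) {
        GstElement *pipeline = GST_ELEMENT(user_data);
        if (GST_PAD_DIRECTION(pad) != GST_PAD_SRC)
            return;

        // Hypothetical decode branch for H.264 video; a real handler would
        // inspect the pad caps and pick rtph264depay or rtpopusdepay.
        GstElement *branch = gst_parse_bin_from_description(
            "rtph264depay ! avdec_h264 ! videoconvert ! autovideosink",
            TRUE, nullptr);
        gst_bin_add(GST_BIN(pipeline), branch);
        gst_element_sync_state_with_parent(branch);

        GstPad *sinkpad = gst_element_get_static_pad(branch, "sink");
        gst_pad_link(pad, sinkpad);
        gst_object_unref(sinkpad);
    }

    int main(int argc, char *argv[]) {
        gst_init(&argc, &argv);

        // Send side: raw video is encoded, RTP-payloaded, and capped to
        // application/x-rtp before it reaches webrtcbin's sink pad.
        // videotestsrc stands in for the camera; for a FLIR camera you
        // would substitute whatever element delivers its frames.
        GError *error = nullptr;
        GstElement *pipeline = gst_parse_launch(
            "videotestsrc is-live=true ! videoconvert ! queue ! "
            "x264enc tune=zerolatency ! rtph264pay ! "
            "application/x-rtp,media=video,encoding-name=H264,payload=96 ! "
            "webrtcbin name=sendrecv",
            &error);
        if (!pipeline) {
            g_printerr("Failed to build pipeline: %s\n", error->message);
            return 1;
        }

        GstElement *webrtc = gst_bin_get_by_name(GST_BIN(pipeline), "sendrecv");
        g_signal_connect(webrtc, "pad-added",
                         G_CALLBACK(on_incoming_stream), pipeline);

        // The offer/answer and ICE exchange with the Python signalling
        // server is omitted; see the gstwebrtc-demos repository (ref1).
        gst_element_set_state(pipeline, GST_STATE_PLAYING);

        GMainLoop *loop = g_main_loop_new(nullptr, FALSE);
        g_main_loop_run(loop);

        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(webrtc);
        gst_object_unref(pipeline);
        return 0;
    }

Note the explicit application/x-rtp caps between rtph264pay and webrtcbin: if you pipe raw frames or a bare encoded stream into webrtcbin without a payloader, the link simply cannot be negotiated, which is consistent with the pad capabilities shown by gst-inspect-1.0 above.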
