OpenCV + Python streaming of H264 video over network using UDP protocol
I have a Python client that receives a video stream transmitted using VLC or OBS Studio.
Client code:
import cv2

target_url = 'udp://@0.0.0.0:1235'
stream = cv2.VideoCapture(target_url)
while True:
    r, f = stream.read()
    if r:
        cv2.imshow('IP Camera stream', f)
        cv2.waitKey(1)
It is able to read and display the video stream transmitted from another machine using VLC. Now I want to create the video server app myself instead of using VLC. I tried cv2.VideoWriter, but it only takes local files, not a udpsink. After browsing the net, I found a few Stack Overflow answers suggesting pyzmq [Ref 1], which uses TCP, or manually creating and handling a socket [Ref 2], which is not going to work because the client should be able to receive from both VLC and the custom app.
Then I got to know about NetGear [Ref 3], which is a great tool, but it doesn't support UDP as it internally uses pyzmq [Ref 4].
Basically, I am looking for something like cv2.VideoWriter('udp://192.168.1.2:5000', fourcc, ..).
Question: Is there a way to convert the live camera feed into H264 with a given bitrate and fps, then transmit it over UDP so that it can be received using cv2.VideoCapture('udp://@0.0.0.0:5000')?
[Ref 1] Python Opencv and Sockets - Streaming video encoded in h264
[Ref 2] https://stackoverflow.com/a/63717263/12455023
[Ref 3] https://stackoverflow.com/a/57204835/12455023
[Ref 4] https://github.com/abhiTronix/vidgear/issues/281
I'd suggest using GStreamer for this. You may try:
#!/usr/bin/env python
import cv2
print(cv2.__version__)
# Uncommenting this would allow to check if your opencv build has GSTREAMER support
#print(cv2.getBuildInformation())
cap = cv2.VideoCapture("udpsrc port=5000 ! application/x-rtp,media=video,encoding-name=H264 ! queue ! rtpjitterbuffer latency=500 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! video/x-raw,format=BGR ! queue ! appsink drop=1", cv2.CAP_GSTREAMER)
# For NVIDIA using NVMM memory
#cap = cv2.VideoCapture("udpsrc port=5000 ! application/x-rtp,media=video,encoding-name=H264 ! queue ! rtpjitterbuffer latency=500 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! queue ! appsink drop=1", cv2.CAP_GSTREAMER)
width = cap.get(cv2.CAP_PROP_FRAME_WIDTH)
height = cap.get(cv2.CAP_PROP_FRAME_HEIGHT)
# fps = cap.get(cv2.CAP_PROP_FPS)  # doesn't work with Python in my case, so forcing below; adjust for your case
fps = 30
if not cap.isOpened():
    print('Failed to open camera')
    exit(1)

print('Source opened, framing %dx%d@%d' % (width, height, fps))
writer = cv2.VideoWriter("appsrc ! video/x-raw,format=BGR ! queue ! videoconvert ! x264enc insert-vui=1 ! h264parse ! rtph264pay ! udpsink port=5001", cv2.CAP_GSTREAMER, 0, float(fps), (int(width),int(height)))
# For NVIDIA using NVMM memory
#writer = cv2.VideoWriter("appsrc ! video/x-raw,format=BGR ! queue ! videoconvert ! video/x-raw,format=BGRx ! nvvidconv ! nvv4l2h264enc insert-sps-pps=1 insert-vui=1 ! h264parse ! rtph264pay ! udpsink port=5001", cv2.CAP_GSTREAMER, 0, float(fps), (int(width),int(height)))
if not writer.isOpened():
    print('Failed to open writer')
    cap.release()
    exit(1)
while True:
    ret_val, img = cap.read()
    if not ret_val:
        break
    writer.write(img)
    cv2.waitKey(1)
writer.release()
cap.release()
This should stream to localhost on port 5001, and you should be able to receive it on a Linux host running X (expect up to 10 seconds for setup) with:
gst-launch-1.0 udpsrc port=5001 ! application/x-rtp,media=video,encoding-name=H264 ! queue ! rtpjitterbuffer latency=500 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! xvimagesink
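Note that this pipeline sends RTP, not a plain MPEG-TS, so FFmpeg-based receivers (ffplay, or OpenCV built with the FFMPEG backend) cannot open udp://@0.0.0.0:5001 directly; they typically need an SDP file describing the RTP session. A minimal sketch (the port 5001 and payload type 96 match the pipeline above; the filename stream.sdp is my own choice):

```python
import os

# Minimal SDP describing the RTP/H264 session produced by the udpsink above.
# m=video 5001 must match the udpsink port; 96 is the default RTP payload type.
SDP = """v=0
o=- 0 0 IN IP4 127.0.0.1
s=H264 stream
c=IN IP4 127.0.0.1
t=0 0
m=video 5001 RTP/AVP 96
a=rtpmap:96 H264/90000
"""

def write_sdp(path="stream.sdp"):
    """Write the SDP description to a file and return its path."""
    with open(path, "w") as f:
        f.write(SDP)
    return path

if __name__ == "__main__":
    sdp_path = write_sdp()
    # FFmpeg refuses rtp/udp referenced from a local file unless whitelisted:
    os.environ["OPENCV_FFMPEG_CAPTURE_OPTIONS"] = "protocol_whitelist;file,rtp,udp"
    # Then, assuming an FFMPEG-enabled OpenCV build:
    # import cv2
    # cap = cv2.VideoCapture(sdp_path, cv2.CAP_FFMPEG)
```

You can also play it with `ffplay -protocol_whitelist file,udp,rtp stream.sdp`.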
If you want to stream to a given host, set the host property of udpsink while disabling auto-multicast:
writer = cv2.VideoWriter("appsrc ! video/x-raw,format=BGR ! queue ! videoconvert ! x264enc insert-vui=1 ! h264parse ! rtph264pay ! udpsink port=5001 host=<target_IP> auto-multicast=0", cv2.CAP_GSTREAMER, 0, float(fps), (int(width),int(height)))
If you want to use multicast (better avoided over WiFi):
writer = cv2.VideoWriter("appsrc ! video/x-raw,format=BGR ! queue ! videoconvert ! x264enc insert-vui=1 ! h264parse ! rtph264pay ! udpsink port=5001 host=224.1.1.1", cv2.CAP_GSTREAMER, 0, float(fps), (int(width),int(height)))
And you may receive on any Linux host on the LAN with:
gst-launch-1.0 udpsrc multicast-group=224.1.1.1 port=5001 ! application/x-rtp, media=video,encoding-name=H264 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! xvimagesink
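If your OpenCV build lacks GStreamer support, a commonly used fallback (a sketch, not part of the answer above; it assumes the ffmpeg binary is on PATH) is to pipe raw BGR frames into an ffmpeg subprocess that encodes H.264 and muxes it into MPEG-TS over UDP. MPEG-TS over UDP is what VLC's streaming output sends, so the question's original cv2.VideoCapture('udp://@0.0.0.0:1235') client can receive it unchanged:

```python
import subprocess

def build_ffmpeg_cmd(width, height, fps, host, port, bitrate="2M"):
    """Build an ffmpeg command that reads raw BGR frames from stdin,
    encodes them as H.264 at the given bitrate, and sends MPEG-TS over UDP."""
    return [
        "ffmpeg", "-y",
        "-f", "rawvideo", "-pix_fmt", "bgr24",
        "-s", f"{width}x{height}", "-r", str(fps),
        "-i", "-",                      # raw frames arrive on stdin
        "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
        "-b:v", bitrate,
        "-f", "mpegts", f"udp://{host}:{port}",
    ]

def stream_camera(host, port, fps=30):
    """Grab frames from the local camera and feed them to ffmpeg.
    Needs a camera and ffmpeg installed; not invoked here."""
    import cv2
    cap = cv2.VideoCapture(0)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    proc = subprocess.Popen(build_ffmpeg_cmd(w, h, fps, host, port),
                            stdin=subprocess.PIPE)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        proc.stdin.write(frame.tobytes())  # raw BGR bytes, one frame at a time
    proc.stdin.close()
    proc.wait()
    cap.release()
```

Call `stream_camera("192.168.1.2", 1235)` on the sender; the host/port values are examples matching the question's setup.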