
Live video stream on server (PC) from images sent by robot through UDP

Hmm. I found this, which seems promising:

http://sourceforge.net/projects/mjpg-streamer/


Ok. I will try to explain what I am trying to do clearly and in detail.

I have a small humanoid robot with a camera and a wifi stick (this is the robot). The wifi stick's average transfer rate is 1769 KB/s. The robot has a 500 MHz CPU and 256 MB RAM, so that is not enough for any serious computation (moreover, there are already a couple of modules running on the robot for motion, vision, sonar, speech etc.).

I have a PC from which I control the robot. I am trying to have the robot walk around the room while I watch a live video stream of what the robot sees on the PC.

What I already have working: the robot walks as I want it to and takes images with the camera. The images are sent over UDP to the PC, where I receive them (I have verified this by saving the incoming images to disk).

The camera returns 640 x 480 px images in the YUV422 colorspace. I am sending the images with lossy compression (JPEG) because I am trying to get the best possible FPS on the PC. I do the JPEG compression on the robot with the PIL library.

My questions:

  1. Could somebody please give me some ideas about how to convert the incoming JPEG images to a live video stream? I understand that I will need some video encoder for that. Which video encoder do you recommend? FFMPEG or something else? I am very new to video streaming, so I want to know what is best for this task. I'd prefer to use Python to write this, so I would prefer a video encoder or library that has a Python API. But I guess if the library has a good command-line API, it doesn't have to be in Python.

  2. What is the best FPS I could get out of this, given the 1769 KB/s average wifi transfer rate and the dimensions of the images? Should I use a different compression than JPEG?

  3. I will be happy to see any code examples. Links to articles explaining how to do this would be fine, too.
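On question 2, a rough upper bound follows from dividing the link rate by the compressed frame size. The ~30 KB/frame figure below is an assumption (a plausible size for a 640x480 JPEG at moderate quality), not a measurement:

```python
LINK_KBPS = 1769   # measured average wifi transfer rate, KB/s
JPEG_KB = 30.0     # assumed size of one compressed 640x480 frame, KB

def max_fps(link_kbps, frame_kb):
    """Upper bound on frames per second the link can carry."""
    return link_kbps / frame_kb

print(round(max_fps(LINK_KBPS, JPEG_KB), 1))  # -> 59.0
```

In practice, protocol overhead, JPEG encoding time on the 500 MHz CPU, and wifi variability will push the achievable rate well below this ceiling.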

Some code samples. Here is how I send JPEG images from the robot to the PC (shortened, simplified snippet). This runs on the robot:

    from socket import socket, AF_INET, SOCK_DGRAM
    import StringIO
    from PIL import Image

    # lots of code here

    UDPSock = socket(AF_INET, SOCK_DGRAM)

    while 1:
        image = camProxy.getImageLocal(nameId)
        size = (image[0], image[1])            # width, height
        data = image[6]                        # raw pixel data
        im = Image.fromstring("YCbCr", size, data)
        s = StringIO.StringIO()
        im.save(s, "JPEG")

        UDPSock.sendto(s.getvalue(), addr)

        camProxy.releaseImage(nameId)

    UDPSock.close()

    # lots of code here
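One caveat with the sender above: a 640x480 JPEG is usually larger than a single wifi MTU (~1500 bytes), so each sendto produces an IP-fragmented datagram, and losing any one fragment discards the whole frame. A sketch of splitting each frame into MTU-sized chunks with a small reassembly header (the header layout and the `chunk_frame` helper are my own assumption, not part of the original code):

```python
import struct

MAX_PAYLOAD = 1400  # stays under a typical ethernet/wifi MTU

def chunk_frame(frame_id, jpeg_bytes, max_payload=MAX_PAYLOAD):
    """Split one JPEG frame into datagram-sized chunks.

    Each chunk is prefixed with (frame_id, chunk_index, chunk_count)
    packed as three unsigned shorts, so the receiver can reassemble
    frames and discard incomplete ones.
    """
    n = (len(jpeg_bytes) + max_payload - 1) // max_payload
    for i in range(n):
        payload = jpeg_bytes[i * max_payload:(i + 1) * max_payload]
        yield struct.pack("!HHH", frame_id, i, n) + payload
```

Each yielded chunk would then go out with its own `UDPSock.sendto(chunk, addr)` call.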

Here is how I receive the images on the PC. This runs on the PC:

    from socket import socket, AF_INET, SOCK_DGRAM

    # lots of code here

    UDPSock = socket(AF_INET, SOCK_DGRAM)
    UDPSock.bind(addr)

    while 1:
        data, addr = UDPSock.recvfrom(buf)
        # here I need to create a stream from the data,
        # which contains a JPEG image

    UDPSock.close()

    # lots of code here
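Since UDP gives no delivery guarantee, it is worth checking that a received datagram actually holds a complete JPEG before handing it to a decoder. A minimal sanity check based on the JPEG SOI (FF D8) and EOI (FF D9) markers (the helper name is hypothetical):

```python
def looks_like_complete_jpeg(data):
    """Cheap sanity check: a JPEG byte string starts with the SOI
    marker (FF D8) and ends with the EOI marker (FF D9)."""
    return data[:2] == b"\xff\xd8" and data[-2:] == b"\xff\xd9"
```

Frames that pass the check can then be fed to whatever decoder or encoder pipeline you settle on, for example by writing them into an encoder's stdin pipe.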

Checking out your first question: the solution here uses a non-streaming set of pictures, but it might help. The example uses pyMedia.

Something along the lines of what you want.

If you need to edit a binary stream:

Try pyffmpeg and test each available codec for the best performance. You probably need a very lightweight codec like Smoke, or low-profile H.263 or x264, and you probably need to drop the resolution to 320x240.

There is a trade-off between the latency of video encoding/decoding and the bandwidth used. You might find that dropping down to 160x120 with raw packets works for quick scene analysis, transmitting a full frame only periodically. You could also mix a raw, low-latency, low-resolution, high-update feed with a highly compressed, high-latency, high-resolution, low-update feed.
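To put that trade-off in numbers, here is what a raw feed costs per frame, assuming 2 bytes per pixel as in a 4:2:2 YUV layout (the helper is illustrative, not from the original answer):

```python
def raw_frame_kb(width, height, bytes_per_pixel=2):
    """Size of one uncompressed frame in KB (4:2:2 YUV = 2 bytes/px)."""
    return width * height * bytes_per_pixel / 1024.0

print(raw_frame_kb(160, 120))   # 37.5 KB -> ~47 fps on a 1769 KB/s link
print(raw_frame_kb(640, 480))   # 600.0 KB -> under 3 fps uncompressed
```

This is why the raw feed only makes sense at very low resolution, with compressed full frames sent periodically on top of it.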

