Best way to stream low-latency video from a Raspberry Pi to a UWP app
For a project, I have to communicate with a Raspberry Pi Zero from a UWP app via TCP. Because both the Raspberry Pi and the computer running the interface have private IPs, I have to use a server to forward messages from one client to the other. That part already works, but now my problem is that I have to implement video streaming from the Raspberry Pi to the UWP app.
Because my partner is in charge of creating and designing the UWP app, I have made myself a little test interface with Windows Forms. I have tried several techniques, like piping the video output with Netcat over the server to the client, or direct TCP streaming with raspivid, but the best solution so far is the one I found in this project here. Instead of using the Eneter.Messaging library, however, I use my own class for communication with TcpClients.
I use Mono to run my C# script on the Raspberry Pi, and the code to stream the video looks like this:
while (true)
{
    // Wait with streaming until the interface is connected
    while (!RemoteDeviceConnected || VideoStreamPaused)
    {
        Thread.Sleep(500);
    }
    // Check if the raspivid process is already running
    if (!Array.Exists(Process.GetProcesses(), p => p.ProcessName.Contains("raspivid")))
        raspivid.Start();
    Thread.Sleep(2000);
    VideoData = new byte[VideoDataLength];
    try
    {
        int bytesRead;
        // ReadAsync returns 0 (not -1) at end of stream, and may return fewer
        // bytes than requested, so only forward what was actually read.
        while ((bytesRead = await raspivid.StandardOutput.BaseStream.ReadAsync(VideoData, 0, VideoDataLength)) > 0
               && !VideoChannelToken.IsCancellationRequested && RemoteDeviceConnected && !VideoStreamPaused)
        {
            // Send captured data to connected clients.
            VideoConnection.SendByteArray(VideoData, bytesRead);
        }
        raspivid.Kill();
        Console.WriteLine("Raspivid killed");
    }
    catch (ObjectDisposedException)
    {
    }
}
Basically, this method just reads the h264 data from the standard-output stream of the raspivid process in chunks and sends it to the server.
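For context, the post never shows how the raspivid Process object is set up; reading its StandardOutput.BaseStream only works if stdout redirection is enabled. A hypothetical configuration might look like the sketch below (the raspivid flags and values here are assumptions, not taken from the project):

```csharp
using System.Diagnostics;

// Hypothetical raspivid setup; the actual arguments used in the project
// are not shown in the post. "-o -" writes the raw h264 stream to stdout.
var raspivid = new Process
{
    StartInfo = new ProcessStartInfo
    {
        FileName = "raspivid",
        Arguments = "-t 0 -w 640 -h 480 -fps 25 -b 1000000 -o -",
        UseShellExecute = false,        // required for stream redirection
        RedirectStandardOutput = true   // lets us read StandardOutput.BaseStream
    }
};
```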
The next method runs on the server and simply forwards the byte array to the connected interface client.
while (RCVVideo[id].Connected)
{
    // Forward only the number of bytes actually read, not the full buffer.
    int bytesRead = await RCVVideo[id].stream.ReadAsync(VideoData, 0, VideoDataLength);
    if (bytesRead > 0 && IFVideo[id] != null && IFVideo[id].Connected == true)
    {
        IFVideo[id].SendByteArray(VideoData, bytesRead);
    }
}
SendByteArray() uses the NetworkStream.Write() method.
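The SendByteArray() helper itself is not shown in the post; a minimal sketch of what such a method might look like (the class shape here is an assumption, and Stream is used instead of NetworkStream so the sketch stays self-contained):

```csharp
using System.IO;

// Hypothetical sketch of the connection wrapper; the real class in the
// project is not shown. In the app, 'stream' would be a NetworkStream.
public class Connection
{
    public Stream stream;

    public Connection(Stream s) { stream = s; }

    // Write exactly 'count' bytes from the start of the buffer.
    public void SendByteArray(byte[] buffer, int count)
    {
        stream.Write(buffer, 0, count);
        stream.Flush(); // push data out right away to keep latency low
    }
}
```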
On the interface, I write the received byte[] to a named pipe, to which the VLC control connects:
while (VideoConnection.Connected)
{
    // Write only the number of bytes actually received into the pipe.
    int bytesRead = await VideoConnection.stream.ReadAsync(VideoData, 0, VideoDataLength);
    if (bytesRead > 0 && VideoPipe.IsConnected)
    {
        VideoPipe.Write(VideoData, 0, bytesRead);
    }
}
The following code initializes the pipe server:
// Open pipe that will be read by VLC.
VideoPipe = new NamedPipeServerStream(@"\raspipipe",
PipeDirection.Out, 1,
PipeTransmissionMode.Byte,
PipeOptions.WriteThrough, 0, 10000);
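For reference, a NamedPipeServerStream created with name raspipipe becomes visible to clients (such as VLC) under \\.\pipe\raspipipe, and writes only go anywhere once a reader has connected. A minimal in-process sketch of that handshake, with a plain pipe client standing in for VLC (the four bytes written are just an h264 NAL start code used as dummy data):

```csharp
using System.IO.Pipes;
using System.Threading.Tasks;

// Byte-mode pipe server like the one in the post, plus an in-process
// client standing in for VLC.
var server = new NamedPipeServerStream("raspipipe", PipeDirection.Out, 1,
    PipeTransmissionMode.Byte, PipeOptions.Asynchronous);
var client = new NamedPipeClientStream(".", "raspipipe", PipeDirection.In);

// The server must have a connected reader before data can be written.
await Task.WhenAll(server.WaitForConnectionAsync(), client.ConnectAsync());

byte[] startCode = { 0x00, 0x00, 0x00, 0x01 }; // h264 NAL start code
await server.WriteAsync(startCode, 0, startCode.Length);

byte[] received = new byte[4];
int n = await client.ReadAsync(received, 0, 4);
```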
And for VLC:
LibVLC libVLC = new LibVLC();
videoView1.MediaPlayer = new MediaPlayer(libVLC);
videoView1.MediaPlayer.Play(new Media(libVLC, @"stream/h264://\\\.\pipe\raspipipe", FromType.FromLocation));
videoView1.MediaPlayer.EnableHardwareDecoding = true;
videoView1.MediaPlayer.FileCaching = 0;
videoView1.MediaPlayer.NetworkCaching = 300;
This works fine in the Windows Forms app, and I can get the delay down to 2 or 3 seconds (it should get better in the end, but it is acceptable). But in the UWP app I can't get it to work, even after adding /LOCAL/ to the pipe name. It shows that the VLC control connects to the pipe, and I can see that data is written to the pipe, but it doesn't display video.
So my question is:
How can I get this to work with the VLC control (LibVLCSharp) in UWP? Am I missing something fundamental?
Or is there an even better way to stream the video in this case?
I have researched the UWP MediaPlayerElement a bit too, but I can't find a way to get my byte[] into it.
First of all, thank you for your quick responses and interesting ideas!
I took a look into Desktop Bridge, but it is not really what I wanted, because my colleague has already put a lot of effort into designing the UWP app, and my Windows Forms app is just a quick hack to try things out.
But the thing that really worked for me was StreamMediaInput. I have no idea how I missed this before. This way I just pass my NetworkStream directly to the MediaPlayer without using a named pipe.
LibVLC libVLC = new LibVLC();
videoView1.MediaPlayer = new MediaPlayer(libVLC);
Media streamMedia = new Media(libVLC, new StreamMediaInput(Client.Channels.VideoConnection.stream), ":demux=h264");
videoView1.MediaPlayer.EnableHardwareDecoding = true;
videoView1.MediaPlayer.FileCaching = 0;
videoView1.MediaPlayer.NetworkCaching = 500;
videoView1.MediaPlayer.Play(streamMedia);
This solution is now working for me both in UWP and in Windows Forms.