Flv stream to sockets with ffmpeg, node.js and socket.io
So I am having some problems understanding how to pull this off. I know that streaming video is a hard topic and that there is a lot to take into account, but here we go, starting to learn how to stream video.

I am using SocketIoClientDotNet as the socket.io client for the C# application, talking to a node.js server.

I am sending byte arrays of the video to node, which creates a temporary file and appends the buffer to it. I have tried to set the source of the video element to that file, but it doesn't read it as video and is all black. Since that did not work, I downloaded a copy of the file, and it turns out VLC can't play it either.

OK, on to the code:

C#
bool ffWorkerIsWorking = false;

private void btnFFMpeg_Click(object sender, RoutedEventArgs e)
{
    BackgroundWorker ffWorker = new BackgroundWorker();
    ffWorker.WorkerSupportsCancellation = true;
    ffWorker.DoWork += ((ffWorkerObj, ffWorkerEventArgs) =>
    {
        ffWorkerIsWorking = true;
        using (var FFProcess = new Process())
        {
            var processStartInfo = new ProcessStartInfo
            {
                FileName = "ffmpeg.exe",
                RedirectStandardInput = true,
                RedirectStandardOutput = true,
                UseShellExecute = false,
                CreateNoWindow = true,
                Arguments = " -loglevel panic -hide_banner -y -f gdigrab -draw_mouse 1 -i desktop -f flv -"
            };
            FFProcess.StartInfo = processStartInfo;
            FFProcess.Start();

            byte[] buffer = new byte[32768];
            using (MemoryStream ms = new MemoryStream())
            {
                while (!FFProcess.HasExited)
                {
                    int read = FFProcess.StandardOutput.BaseStream.Read(buffer, 0, buffer.Length);
                    if (read <= 0)
                        break;
                    ms.Write(buffer, 0, read);
                    clientSocket.Emit("video", ms.ToArray());
                    ms.Flush();
                    if (!ffWorkerIsWorking)
                    {
                        ffWorker.CancelAsync();
                        break;
                    }
                }
            }
        }
    });
    ffWorker.RunWorkerAsync();
}
JS (server)
var buffer = new Buffer(32768);
var isBuffering = false;
var wstream;

socket.on('video', function (data) {
    if (!isBuffering) {
        wstream = fs.createWriteStream('fooTest.flv');
        isBuffering = true;
    }
    buffer = Buffer.concat([buffer, data]);
    fs.appendFile('public/fooTest.flv', buffer, function (err) {
        if (err) throw err;
        console.log('The "data to append" was appended to file!');
    });
});
What am I doing wrong here?
With the OutputDataReceived event you capture the text output of the process stdout. That's why in the first case the server complains about the UTF-8 encoding. Your second example works because you're sending a binary stream.

You need to capture the binary base stream. See this answer on how to do it: Capturing binary output from Process.StandardOutput

I don't know how you plan to stream exactly, but if you use FLV there are already HTTP/RTMP servers you can use (e.g. Nginx with the RTMP module).