
Live Video Stream on a Node.js Server

I have been researching this a lot but am frustrated, as I feel like the solution should be simple, though I know it won't be. Ideally I'd just want to use Node to host the server, WebRTC's getUserMedia to get the live stream on the local client, and something like socket.io to send the stream to the server, which would then broadcast the stream to the remote client — as if it were a simple messaging chat app.

Thinking about this some more, it seems an approach this simple would be impossible, because live video requires continuous large amounts of data to be sent, which is not the same as sending a single message, or even a file, after an event (the send button being pressed).

Maybe I am wrong, however: can a live video stream app follow the same structure as a Node/socket.io messenger app? Would you send the media object returned from getUserMedia, the blob, or some binary data somehow? (I've tried all of these, but perhaps not correctly.)

The ideal goal would be an app that uses as little extra fluff as necessary: as few npm installs and extra JavaScript libraries as possible, and as little worrying about encoding/decoding, or whatever the hell ICE or STUN are. Is there any way this is possible, or am I asking for too much?

Ideal Client

    var socket = io();
    var local = document.getElementById("local_video");
    var remote = document.getElementById("remote_video");

    // display local video
    navigator.mediaDevices.getUserMedia({video: true, audio: true}).then(function(stream) {
      local.srcObject = stream; // createObjectURL(stream) is deprecated; assign the stream directly
      socket.emit("stream", stream); // the broken part: a MediaStream is not serializable
    }).catch(function(err){console.log(err);});

    // displays remote video
    socket.on("stream", function(stream){
      remote.srcObject = stream; // never receives a usable stream
    });

Ideal Server

var app = require("express")();
var http = require("http").Server(app);
var fs = require("fs");
var io = require("socket.io")(http);

app.get('/', onRequest);
http.listen(process.env.PORT || 3000, function() {
    console.log('server started');
})

//404 response
function send404(response) {
    response.writeHead(404, {"Content-Type" : "text/plain"});
    response.write("Error 404: Page not found");
    response.end();
}

function onRequest(request, response) {
  if(request.method == 'GET' && request.url == '/') {
    response.writeHead(200, {"Content-Type" : "text/html"});
    fs.createReadStream("./index.html").pipe(response);
  } else {
    send404(response);
  }
}

io.on('connection', function(socket) {
  console.log("a user connected");
  socket.on('stream', function(stream) {
    // relays the event, but the MediaStream object does not survive
    // serialization, so the remote client never receives usable video
    socket.broadcast.emit("stream", stream);
  });
  socket.on('disconnect', function () {
    console.log("user disconnected");
  });
});

This is the broken app in action: https://nodejs-videochat.herokuapp.com/

This is the broken code on GitHub: https://github.com/joshydotpoo/nodejs-videochat

Try to be clear and specific. First, you are not using WebRTC here. getUserMedia() is part of the navigator Web API, which you are using to get a media stream from the camera.

Using WebRTC means using ICE with STUN/TURN servers for connectivity, plus a signaling channel. You will use your host server (Node) to specify the ICE configuration, identify each user, and provide a way for them to call each other.
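For the signaling piece, the host server only has to relay opaque messages between peers; it never inspects the SDP or ICE payloads. Here is a minimal sketch, assuming the same express/socket.io setup as the question's server code (the event names "offer", "answer", and "ice-candidate" are illustrative, not a fixed API):

```javascript
// Signaling relay: forwards SDP offers/answers and ICE candidates so two
// browsers can negotiate a direct WebRTC connection. Attach this to the
// socket.io instance created in the server code above.
function attachSignaling(io) {
  io.on("connection", function (socket) {
    // A peer sends its SDP offer; the other connected peers receive it.
    socket.on("offer", function (sdp) {
      socket.broadcast.emit("offer", sdp);
    });
    // The answering peer replies with its SDP answer.
    socket.on("answer", function (sdp) {
      socket.broadcast.emit("answer", sdp);
    });
    // Both sides trickle ICE candidates as they are discovered.
    socket.on("ice-candidate", function (candidate) {
      socket.broadcast.emit("ice-candidate", candidate);
    });
  });
}
```

In a real app you would scope the broadcasts to a room (so only the intended callee receives the offer), but the relay-without-inspecting pattern stays the same.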

If you want to stream it through your host instead, you should probably stream it in chunks and set up your own signaling infrastructure. You can use the Stream API with socket.io to send data in chunks (packets). See the Stream API (socket.io).
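To make the chunked approach concrete, here is a minimal browser-side sketch: instead of emitting the MediaStream object (which cannot be serialized), the publisher records it with MediaRecorder and ships each encoded chunk over the socket, and the viewer feeds the chunks into a MediaSource buffer. This assumes a connected socket.io `socket`; the "chunk" event name is illustrative.

```javascript
// Publisher side: record the camera in small encoded chunks and send each
// chunk over the socket, rather than the MediaStream object itself.
function publishCamera(socket) {
  navigator.mediaDevices.getUserMedia({ video: true, audio: true })
    .then(function (stream) {
      // MediaRecorder produces encoded Blobs at a fixed interval.
      var recorder = new MediaRecorder(stream, { mimeType: "video/webm; codecs=vp8" });
      recorder.ondataavailable = function (event) {
        if (event.data.size > 0) {
          socket.emit("chunk", event.data); // socket.io transmits Blobs as binary
        }
      };
      recorder.start(250); // emit a chunk roughly every 250 ms
    })
    .catch(function (err) { console.error(err); });
}

// Viewer side: append received chunks to a MediaSource-backed <video>.
function playChunks(socket, videoElement) {
  var mediaSource = new MediaSource();
  videoElement.src = URL.createObjectURL(mediaSource);
  mediaSource.addEventListener("sourceopen", function () {
    var buffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
    socket.on("chunk", function (chunk) {
      // chunks arrive as ArrayBuffers on the receiving side
      if (!buffer.updating) buffer.appendBuffer(new Uint8Array(chunk));
    });
  });
}
```

Be aware of the simplifications: this naive version silently drops chunks that arrive while the buffer is busy (a real implementation queues them), and a viewer who joins mid-stream misses the WebM initialization segment and cannot decode anything — which is exactly why serious deployments reach for WebRTC or a media server instead.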

Also, you can check out a live example of WebRTC + Socket.io here: Socket.io | WebRTC Video Chat

You can find more information here: sending a media stream to a host server

I think the topic is about a Node server to support Live Streaming or Video Chat. It's much more complex than you think; let me illustrate. Both Live Streaming and Video Chat can use WebRTC, but WebRTC is not required for Live Streaming. Both need some server-side support for signaling and streaming.

If you want to publish your camera as a live stream and forward it to many (like thousands of) players, that is called Live Streaming. Latency is not very critical; generally 3–10s is OK.

If you want to talk to each other using your cameras, with streams forwarded between users, that is called Video Chat. Latency is very sensitive: it MUST be <400ms, generally ~200ms.

They are totally different, so let's discuss them separately.

Live Streaming

The keys for live streaming are cross-platform support (both H5 and mobile), fluency without buffering, and fast startup when switching between streams. The stream architecture looks like this:

Publisher ---> Server/CDN ---> Player

Let's talk about the player first. HLS (LL-HLS) is the premier delivery protocol; it's widely used and works well on H5 (both PC and mobile) and native mobile (both iOS and Android). The only problem is that the latency is about 5–10s, or even larger, because it's a file-based protocol.

For Chrome, it's also OK to use hls.js to play HLS, via MSE.
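Wiring hls.js to a `<video>` element takes only a few lines. A minimal sketch, assuming the hls.js library has been loaded (e.g. from a CDN) and the stream URL points at a playlist you actually serve:

```javascript
// Minimal hls.js playback sketch: attach an HLS stream to a <video>
// element, falling back to native HLS support where available (Safari).
function playHls(videoElement, src) {
  if (window.Hls && Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource(src);           // e.g. "/live/stream.m3u8" (illustrative path)
    hls.attachMedia(videoElement); // hls.js feeds the element through MSE
  } else if (videoElement.canPlayType("application/vnd.apple.mpegurl")) {
    videoElement.src = src;        // Safari plays HLS natively, no library needed
  }
}
```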

Another lower-latency (3–5s) protocol that is OK is HTTP-FLV; it's supported on PC-H5 by flv.js, on mobile by ijkplayer, and some CDNs also support this protocol. The only problem is that it's not friendly to mobile-H5.

As a player, WebRTC is also OK for playing the stream; it works well on PC-H5, e.g. Chrome. The problem is mobile: it's very hard to run a native WebRTC player. Besides the complexity, you also need a signaling server, used to exchange SDP.

For the publisher, it's complex because it depends on your client:

  • If an H5 publisher, only WebRTC is available, so you need a server to convert WebRTC into a protocol for the player. Recommend SRS.
  • If a native mobile publisher, recommend FFmpeg; there are lots of libraries and bindings. Any RTMP server is OK, including some Node servers.
  • If a TV device, it may use SRT, and you also need a server to convert it. Recommend SRS again.

Ultimately, the live streaming ecosystem is based on C/C++ (FFmpeg, WebRTC, and SRS are written in C/C++); however, there are some servers in Node.js, and you can find them by searching for the protocol, like "nodejs rtmp".

Video Chat

Latency is the most important feature for video chat, so you must use WebRTC on the client, for both publisher and player.
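For reference, the caller side of a WebRTC client is roughly the following sketch. It assumes a socket.io signaling channel that relays "offer", "answer", and "ice-candidate" events between peers (event names are illustrative), and uses a public Google STUN server for ICE:

```javascript
// Minimal WebRTC caller sketch: capture the camera, negotiate through a
// signaling channel, and render the remote peer's stream when it arrives.
function startCall(socket, localVideo, remoteVideo) {
  var pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }] // public STUN server
  });
  pc.onicecandidate = function (e) {
    if (e.candidate) socket.emit("ice-candidate", e.candidate);
  };
  pc.ontrack = function (e) {
    remoteVideo.srcObject = e.streams[0]; // remote media arrives here
  };
  socket.on("answer", function (sdp) { pc.setRemoteDescription(sdp); });
  socket.on("ice-candidate", function (c) { pc.addIceCandidate(c); });

  navigator.mediaDevices.getUserMedia({ video: true, audio: true })
    .then(function (stream) {
      localVideo.srcObject = stream;
      stream.getTracks().forEach(function (t) { pc.addTrack(t, stream); });
      return pc.createOffer();
    })
    .then(function (offer) {
      return pc.setLocalDescription(offer).then(function () {
        socket.emit("offer", pc.localDescription);
      });
    });
  return pc;
}
```

The answering peer runs the mirror image (setRemoteDescription with the offer, then createAnswer); error handling, renegotiation, and TURN fallback for symmetric NATs are all omitted for brevity, which is part of why production systems lean on the room/SFU servers listed below.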

There are different servers involved in video chat:

  • A room server, providing signaling to exchange SDP for clients, and to manage rooms and users — kicking off a user, muting a microphone, etc.
  • An SFU server (or MCU), to deliver media streams to all clients. There are several SFUs, like Janus, mediasoup, and SRS.
  • CDN: few CDNs support WebRTC, but QUIC is being developed as a transport for WebRTC and HTTP/3, so in the future this might improve. Right now, you could look for a WebRTC cloud service.

As I said, it's very complicated to build a WebRTC system, so think about your scenario again and again: do you really need a full WebRTC system, or do you just need to publish a live stream via WebRTC?

If you're not sure, try the live streaming solution first; it's much simpler and more stable.
