

How can I create a MediaStream track from a continuous stream of images in node.js? (for usage with WebRTC)

I want to stream a robot cam from a web media element. I have access to the camera in node.js, which is providing a live stream of images (continually producing a new frame at ~20fps).

In this same situation in the browser, one could write the image to a canvas and capture the stream.

Is there some way to construct a MediaStreamTrack object that can be directly added to the RTCPeerConnection, without having to use browser-only captureStream or getUserMedia APIs?

I've tried the npm module canvas, which is supposed to port canvas to node -- then maybe I could captureStream the canvas after writing the image to it. But that didn't work.

I'm using the wrtc node WebRTC module with the simple-peer wrapper.

Check out the video-compositing example here.

https://github.com/node-webrtc/node-webrtc-examples
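
The approach those examples take is wrtc's nonstandard RTCVideoSource: you push each frame into the source and it gives you a MediaStreamTrack you can add to the peer connection, no canvas or getUserMedia needed. A minimal sketch follows; the robotCamera emitter, the 640x480 resolution, and the use of RGBA input frames are assumptions about your camera, so adapt them to however your frames actually arrive.

const { RTCPeerConnection, nonstandard } = require('wrtc');
const { RTCVideoSource, rgbaToI420 } = nonstandard;

const width = 640;   // assumed camera resolution
const height = 480;

// Create a video source and get a MediaStreamTrack from it.
const source = new RTCVideoSource();
const track = source.createTrack();

// Add the track directly to the RTCPeerConnection.
const pc = new RTCPeerConnection();
pc.addTrack(track);

// Hypothetical camera interface: emits one RGBA Buffer per frame at ~20fps.
robotCamera.on('frame', (rgbaBuffer) => {
  // RTCVideoSource expects I420 frames, so convert RGBA -> I420 first.
  const i420Data = new Uint8ClampedArray(width * height * 1.5);
  rgbaToI420(
    { width, height, data: new Uint8ClampedArray(rgbaBuffer) },
    { width, height, data: i420Data }
  );
  source.onFrame({ width, height, data: i420Data });
});

With simple-peer you would instead hand the track over via peer.addTrack(track, stream) (or pass a MediaStream containing it as the stream option), but the frame-pushing part stays the same.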
