
How to stream a canvas element over WebRTC?

I was researching WebRTC and found this great project on GitHub:

https://github.com/mexx91/basicVideoRTC

The communication between 2 cameras works great using Node.js.

Is it possible, before streaming a getUserMedia stream, to modify it in a canvas element and then stream that canvas instead?

Thanks

It seems this is currently not possible in a cross-browser-compatible fashion.

But it may be in the future; you can take a glimpse at the HTMLCanvasElement.captureStream interface as implemented by recent Firefox browsers, see https://developer.mozilla.org/en-US/docs/Web/API/HTMLCanvasElement/captureStream .

It allows you to capture the content of a canvas into a stream that can then be sent via WebRTC to your peer.
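A minimal sketch of that approach, assuming a browser that supports canvas.captureStream() (Firefox at the time of this answer) and an already-configured RTCPeerConnection whose signaling is handled elsewhere:

```javascript
// Grab a canvas already on the page and keep drawing into it.
const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');

function draw() {
  // Any drawing here ends up in the captured stream.
  ctx.fillStyle = 'teal';
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  requestAnimationFrame(draw);
}
draw();

// Capture the canvas content as a MediaStream at ~25 fps
// and attach its tracks to the peer connection.
const stream = canvas.captureStream(25);
const pc = new RTCPeerConnection();
stream.getTracks().forEach(track => pc.addTrack(track, stream));
```

Signaling (offer/answer exchange, ICE candidates) is omitted; once the connection is established, the remote peer receives the canvas content as a normal video track.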

With the getUserMedia() method we can get audio and video streams from the microphone and webcam respectively.

After converting this stream into a URL, it can be assigned as the source of a video element to play back the complete video.

So the video we get from the getUserMedia() API behaves like any other video, e.g.:

<video width="320" height="240" controls>
  <source src="http://www.w3schools.com/tags/movie.mp4" type="video/mp4">
  <source src="http://www.w3schools.com/tags/movie.ogg" type="video/ogg">
  Your browser does not support the video tag.
</video>

http://jsfiddle.net/ez3pA/2/

So you can do various things with the video and canvas elements together. Good examples of this can be found at http://html5doctor.com/video-canvas-magic/
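To tie the two answers together, here is a sketch of piping a getUserMedia() stream through a canvas. It assumes a page with one `<video>` and one `<canvas>` element; `srcObject` is used instead of the older object-URL approach mentioned above:

```javascript
const video = document.querySelector('video');
const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');

navigator.mediaDevices.getUserMedia({ video: true })
  .then(stream => {
    // Attach the webcam stream directly to the video element.
    video.srcObject = stream;
    return video.play();
  })
  .then(() => {
    // Copy each video frame onto the canvas, where it can be
    // filtered, cropped, or composited before being streamed on.
    (function render() {
      ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
      requestAnimationFrame(render);
    })();
  })
  .catch(err => console.error('getUserMedia failed:', err));
```

From here, the modified canvas could itself be captured with captureStream() where supported, closing the loop back to the original question.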
