
Play video blob into Canvas

How do you play a video blob into a Canvas?

HTML:

<canvas id="canvas" ></canvas>

JS:

var canvas = document.querySelector("#canvas")
mediaRecorder.ondataavailable = function(e) {
    if (e.data.size > 0) {
       var blob = e.data; // webm video blob chunk
       // how to play this blob into a canvas
    }
}

Another use-case:

setInterval(function(){
   if(frames.length > 0) {
      var webmBlob = Whammy.fromImageArray(frames, frameRate); // webm video
      // render webm video blob into Canvas
   }
}, 1000);

What your MediaRecorder's ondataavailable handler outputs is not a video blob, it's only a chunk of a video file. Except for the very first one, a chunk can't be played alone; you need to concatenate it with all the chunks that have been recorded before it.

const chunks = [];
recorder.ondataavailable = (evt) => {
  chunks.push( evt.data ); // store all the chunks
  play( new Blob( chunks ) ); // concatenate all the chunks into a single Blob
};

Then, to play a video served as a Blob, you have to create a URL pointing to that Blob; this is done with the URL.createObjectURL() method.
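As a minimal sketch, this step could be wrapped in a small helper (the name `blobToVideoSrc` is made up for illustration, not part of any API). Revoking the object URL once the video's metadata has loaded lets the browser release its reference to the Blob:

```javascript
// Hypothetical helper: point a <video> element at a recorded Blob.
// URL.createObjectURL() returns a "blob:" URI that a <video>'s src
// attribute can use; URL.revokeObjectURL() frees it afterwards.
function blobToVideoSrc(blob, video) {
  const url = URL.createObjectURL(blob);
  video.addEventListener('loadedmetadata', () => URL.revokeObjectURL(url), { once: true });
  video.src = url;
  return url;
}
```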

Now, to draw it on a canvas, you must pass through a <video> element, whose src you'll set to the blob: URI we just created:

(async () => {
  const source = document.createElement( 'video' );
  source.crossOrigin = "anonymous";
  source.muted = true;
  source.src = "https://upload.wikimedia.org/wikipedia/commons/a/a4/BBH_gravitational_lensing_of_gw150914.webm";
  console.log('loading, please wait');
  await source.play();

  const player = document.createElement( 'video' );
  player.muted = true;
  const output = document.getElementById( 'output' );
  const ctx = output.getContext( '2d' );
  const stream = (source.captureStream && source.captureStream()) || source.mozCaptureStream();
  const recorder = new MediaRecorder( stream );
  const chunks = [];
  recorder.ondataavailable = (evt) => {
    console.clear();
    chunks.push( evt.data );
    play( new Blob( chunks ) );
  };
  console.clear();
  console.log('buffering, please wait 5s');
  recorder.start( 5000 ); // 5s per chunk
  source.addEventListener( 'ended', (evt) => recorder.stop() );

  function play( blob ) {
    if( player.paused ) {
      player.addEventListener( 'loadedmetadata', (evt) => {
        output.width = player.videoWidth;
        output.height = player.videoHeight;
        requestAnimationFrame( loop );
      }, { once: true } );
    }
    player.src = URL.createObjectURL( blob );
    player.play();
  }
  function loop() {
    ctx.drawImage( player, 0, 0 );
    if( !player.paused ) {
      requestAnimationFrame( loop );
    }
  }
})().catch( console.error );
 <canvas id="output"></canvas>

But note that it's really not a common thing to do...

We normally wait for the whole recording to be done before doing anything with the resulting video file:

const chunks = [];
recorder.ondataavailable = (evt) => {
  chunks.push( evt.data ); // store all the chunks
};
recorder.onstop = (evt) => { // only when the recording is entirely done
  play( new Blob( chunks ) ); // concatenate all the chunks into a single Blob
};

But if you really want to draw that video on the canvas in real-time, just do this and don't use a MediaRecorder at all:

(async () => {
  const source = document.createElement( 'video' );
  source.crossOrigin = "anonymous";
  source.muted = true;
  source.src = "https://upload.wikimedia.org/wikipedia/commons/a/a4/BBH_gravitational_lensing_of_gw150914.webm";
  console.log('loading, please wait');
  await source.play();
  console.clear();

  const player = document.createElement( 'video' );
  player.muted = true;
  const output = document.getElementById( 'output' );
  const ctx = output.getContext( '2d' );
  const stream = (source.captureStream && source.captureStream()) || source.mozCaptureStream();
  player.addEventListener( 'loadedmetadata', (evt) => {
    output.width = player.videoWidth;
    output.height = player.videoHeight;
    requestAnimationFrame( loop );
  }, { once: true } );
  player.srcObject = stream;
  player.play();

  function loop() {
    ctx.drawImage( player, 0, 0 );
    if( !source.paused ) {
      requestAnimationFrame( loop );
    }
  }
})().catch( console.error );
 <canvas id="output"></canvas>
