
Play video blob into Canvas

How can I play video blob chunks into a Canvas?

HTML:

<canvas id="canvas" ></canvas>

JS:

var canvas = document.querySelector("#canvas")
mediaRecorder.ondataavailable = function(e) {
    if (e.data.size > 0) {
       var blob = e.data; // webm video blob chunk
       // how to play this blob into a canvas
    }
}

Another use case:

setInterval(function(){
   if(frames.length > 0) {
      var webmBlob = Whammy.fromImageArray(frames, frameRate); // webm video
      // render webm video blob into Canvas
   }
}, 1000);

The output of your MediaRecorder's ondataavailable is not a video chunk, it is just a part of a video file. Except for the first one, these parts can't be played on their own; each one needs to be concatenated with all the chunks recorded before it.

const chunks = [];
recorder.ondataavailable = (evt) => {
  chunks.push( evt.data ); // store all the chunks
  play( new Blob( chunks ) ); // concatenate all the chunks into a single Blob
};

Then, to play that Blob as a video, you have to create a URL pointing to it, which is done with the URL.createObjectURL() method.
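The chunk-to-URL step can be sketched on its own; here the chunk contents are placeholder strings rather than real webm data, and the `player` element in the comments is hypothetical:

```javascript
// Stand-in chunks; in the real code these are the Blobs
// received from the recorder's ondataavailable events.
const chunks = [new Blob(['fake-webm-data'])];

// Concatenate all parts into one Blob and tag its MIME type.
const videoBlob = new Blob(chunks, { type: 'video/webm' });

// Create a blob: URL pointing at that Blob.
const url = URL.createObjectURL(videoBlob);

// In the browser you would then assign it to a <video> element:
//   player.src = url;
//   player.play();
// and call URL.revokeObjectURL(url) once it is no longer needed,
// so the Blob can be garbage-collected.
console.log(url.startsWith('blob:')); // blob: scheme URL
```

Revoking the previous object URL before creating a new one matters when this runs on every chunk, otherwise each iteration leaks a Blob reference.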

Now, to draw it on a canvas, you have to go through a <video> element, whose src is set to the blob: URI we just created:

 (async () => {
   const source = document.createElement( 'video' );
   source.crossOrigin = "anonymous";
   source.muted = true;
   source.src = "https://upload.wikimedia.org/wikipedia/commons/a/a4/BBH_gravitational_lensing_of_gw150914.webm";
   console.log('loading, please wait');
   await source.play();

   const player = document.createElement( 'video' );
   player.muted = true;
   const output = document.getElementById( 'output' );
   const ctx = output.getContext( '2d' );
   const stream = (source.captureStream && source.captureStream()) || source.mozCaptureStream();
   const recorder = new MediaRecorder( stream );
   const chunks = [];
   recorder.ondataavailable = (evt) => {
     console.clear();
     chunks.push( evt.data );
     play( new Blob( chunks ) );
   };
   console.clear();
   console.log('buffering, please wait 5s');
   recorder.start( 5000 ); // 5s per chunk
   source.addEventListener( 'ended', (evt) => recorder.stop() );

   function play( blob ) {
     if( player.paused ) {
       player.addEventListener( 'loadedmetadata', (evt) => {
         output.width = player.videoWidth;
         output.height = player.videoHeight;
         requestAnimationFrame( loop );
       }, { once: true } );
     }
     player.src = URL.createObjectURL( blob );
     player.play();
   }
   function loop() {
     ctx.drawImage( player, 0, 0 );
     if( !player.paused ) {
       requestAnimationFrame( loop );
     }
   }
 })().catch( console.error );
 <canvas id="output"></canvas>

But note that this is really not a common thing to do...

Usually, we wait for the whole recording to be finished before doing anything with the resulting video file:

const chunks = [];
recorder.ondataavailable = (evt) => {
  chunks.push( evt.data ); // store all the chunks
};
recorder.onstop = (evt) => { // only when the recording is entirely done
  play( new Blob( chunks ) ); // concatenate all the chunks into a single Blob
};
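The event flow of this "wait for onstop" pattern can be sketched with a minimal mock recorder; `MockRecorder` and its `emitChunk` method are purely illustrative stand-ins, not the real MediaRecorder API:

```javascript
// Illustrative stand-in for MediaRecorder's event callbacks.
class MockRecorder {
  start() { /* a real recorder would begin capturing here */ }
  emitChunk(data) { if (this.ondataavailable) this.ondataavailable({ data }); }
  stop() { if (this.onstop) this.onstop({}); }
}

const recorder = new MockRecorder();
const chunks = [];
let finalBlob = null;

recorder.ondataavailable = (evt) => {
  chunks.push(evt.data);          // store every partial chunk as it arrives
};
recorder.onstop = () => {
  finalBlob = new Blob(chunks);   // build the file only once recording ends
};

recorder.start();
recorder.emitChunk(new Blob(['part1']));
recorder.emitChunk(new Blob(['part2']));
recorder.stop();
console.log(finalBlob.size); // 10 (two 5-byte parts concatenated)
```

The point of the pattern is that nothing is played or uploaded mid-recording; the single Blob is assembled once, in onstop, from all the stored parts.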

However, if you really want to draw that video on a canvas in real time, just do so directly, and don't use a MediaRecorder at all:

 (async () => {
   const source = document.createElement( 'video' );
   source.crossOrigin = "anonymous";
   source.muted = true;
   source.src = "https://upload.wikimedia.org/wikipedia/commons/a/a4/BBH_gravitational_lensing_of_gw150914.webm";
   console.log('loading, please wait');
   await source.play();
   console.clear();

   const player = document.createElement( 'video' );
   player.muted = true;
   const output = document.getElementById( 'output' );
   const ctx = output.getContext( '2d' );
   const stream = (source.captureStream && source.captureStream()) || source.mozCaptureStream();
   player.addEventListener( 'loadedmetadata', (evt) => {
     output.width = player.videoWidth;
     output.height = player.videoHeight;
     requestAnimationFrame( loop );
   }, { once: true } );
   player.srcObject = stream;
   player.play();

   function loop() {
     ctx.drawImage( player, 0, 0 );
     if( !source.paused ) {
       requestAnimationFrame( loop );
     }
   }
 })().catch( console.error );
 <canvas id="output"></canvas>
