
Canvas recording using captureStream and MediaRecorder

How can I record streams from more than one canvas? That is, when I switch from one canvas to another, the recording should continue with the newly active canvas, appended to what was already recorded from the first.

I have done it like this:

var recordedBlobs = [];              // global array that collects the recorded chunks

stream = canvas.captureStream();
mediaRecorder = new MediaRecorder(stream, options);  // options, e.g. { mimeType: 'video/webm' }
mediaRecorder.ondataavailable = handleDataAvailable;
mediaRecorder.start(10);             // emit a dataavailable event every 10 ms

function handleDataAvailable(event) {
  recordedBlobs.push(event.data);
}
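For context, the snippet stops short of producing a playable file; finishing the recording usually means stopping the recorder and wrapping the collected chunks in a Blob. A minimal sketch, assuming the mediaRecorder and recordedBlobs variables above and a video/webm container (the stopAndExport name and the videoElement parameter are only for illustration):

function stopAndExport(videoElement) {
  mediaRecorder.onstop = function() {
    // wrap every chunk collected by handleDataAvailable into a single Blob
    var blob = new Blob(recordedBlobs, { type: 'video/webm' });
    // play it back; an <a download> link would work the same way for saving
    videoElement.src = URL.createObjectURL(blob);
  };
  mediaRecorder.stop();  // fires a final dataavailable event, then onstop
}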

But when I add another stream, only the first part is recorded. I'm pushing the recorded data to a global array.

In current implementations, you can't switch the recorded tracks of a MediaRecorder's stream.

When you try to do so, Firefox warns you in the console that

MediaRecorder does not support recording multiple tracks of the same type at this time.

while Chrome keeps silent and records black frames instead of the second track...

var canvases = Array.prototype.slice.call(document.querySelectorAll('canvas')),
  recordingStream,
  current = 0,
  chunks = [],
  recorder,
  switchInterval;

function startRecording() {
  // first gather both canvases streams & extract the videoTracks
  let streams = canvases.map((c) => { return c.captureStream(30) });
  let tracks = streams.map((s) => { return s.getVideoTracks()[0] });
  // create a new MediaStream with both tracks in it
  // we don't use addTrack because of https://bugzilla.mozilla.org/show_bug.cgi?id=1296531
  recordingStream = 'MediaStream' in window && new MediaStream(tracks) || new webkitMediaStream(tracks);
  // init the MediaRecorder
  recorder = new MediaRecorder(recordingStream);
  recorder.ondataavailable = saveChunks;
  recorder.onstop = exportVideo;
  recorder.onerror = (e) => { console.log(e.name) };
  recorder.start();
  stopRec.disabled = false;
  // switch the canvas to be recorded every 200ms
  switchInterval = setInterval(switchStream, 200);
}

// switch mute one of the tracks, then the other
function switchStream() {
  current = +!current;
  var tracks = recordingStream.getVideoTracks();
  tracks[current].enabled = true;
  // commented because it seems FF doesn't support canvasTrack's method yet
  // doesn't work in chrome even when there anyway
  // tracks[current].requestFrame();
  tracks[+!current].enabled = false;
}

function saveChunks(evt) {
  // store our video's chunks
  if (evt.data.size > 0) {
    chunks.push(evt.data);
  }
}

stopRec.onclick = function stopRecording() {
  if (recorder.state !== 'recording') {
    this.disabled = true;
    return;
  }
  // stop everything
  recorder.stop(); // this will trigger exportVideo
  clearInterval(switchInterval);
  stopCanvasAnim();
  a.style.display = b.style.display = 'none';
  this.parentNode.innerHTML = "";
}

function exportVideo() {
  // we've got everything
  vid.src = URL.createObjectURL(new Blob(chunks));
}

var stopCanvasAnim = (function initCanvasDrawing() {
  // some fancy drawings
  var aCtx = canvases[0].getContext('2d'),
    bCtx = canvases[1].getContext('2d');
  var objects = [],
    w = canvases[0].width,
    h = canvases[0].height;
  aCtx.fillStyle = bCtx.fillStyle = 'ivory';
  for (var i = 0; i < 100; i++) {
    objects.push({
      angle: Math.random() * 360,
      x: 100 + (Math.random() * w / 2),
      y: 100 + (Math.random() * h / 2),
      radius: 10 + (Math.random() * 40),
      speed: 1 + Math.random() * 20
    });
  }
  var stop = false;
  var draw = function() {
    aCtx.fillRect(0, 0, w, h);
    bCtx.fillRect(0, 0, w, h);
    for (var n = 0; n < 100; n++) {
      var entity = objects[n],
        velY = Math.cos(entity.angle * Math.PI / 180) * entity.speed,
        velX = Math.sin(entity.angle * Math.PI / 180) * entity.speed;
      entity.x += velX;
      entity.y -= velY;
      aCtx.drawImage(imgA, entity.x, entity.y, entity.radius, entity.radius);
      bCtx.drawImage(imgB, entity.x, entity.y, entity.radius, entity.radius);
      entity.angle++;
    }
    if (!stop) {
      requestAnimationFrame(draw);
    }
  }
  var imgA = new Image();
  var imgB = new Image();
  imgA.onload = function() {
    draw();
    startRecording();
  };
  imgA.crossOrigin = imgB.crossOrigin = 'anonymous';
  imgA.src = "https://dl.dropboxusercontent.com/s/4e90e48s5vtmfbd/aaa.png";
  imgB.src = "https://dl.dropboxusercontent.com/s/rumlhyme6s5f8pt/ABC.png";
  return function() {
    stop = true;
  };
})();
<p>
  <button id="stopRec" disabled>stop recording</button>
</p>
<canvas id="a"></canvas>
<canvas id="b"></canvas>
<video id="vid" controls></video>

Note that there is currently an open issue on the w3c GitHub project mediacapture-record about this.


But there is a simple workaround to this issue:

  • use another offscreen canvas (it previously had to be a hidden, in-document canvas; the Chrome bug is now fixed in the latest 58 Canary), used only for the recorder,
  • draw the frames of the wanted canvas onto it.

This way, no problem ;-)
The same workaround could also be used to save different videos with the same MediaRecorder (a distilled sketch of the idea follows the demo below).

var canvases = document.querySelectorAll('canvas'),
  recordingCtx,
  current = 0,
  chunks = [],
  recorder,
  switchInterval;

// draw one of our canvases on a third one
function recordingAnim() {
  recordingCtx.drawImage(canvases[current], 0, 0);
  // if recorder is stopped, stop the animation
  if (!recorder || recorder.state === 'recording') {
    requestAnimationFrame(recordingAnim);
  }
}

function startRecording() {
  var recordingCanvas = canvases[0].cloneNode();
  recordingCtx = recordingCanvas.getContext('2d');
  recordingCanvas.id = "";
  // chrome forces us to display the canvas in doc so it can be recorded,
  // This bug has been fixed in chrome 58.0.3014.0
  recordingCtx.canvas.style.height = 0;
  document.body.appendChild(recordingCtx.canvas);
  // draw one of the canvases on our recording one
  recordingAnim();
  // init the MediaRecorder
  recorder = new MediaRecorder(recordingCtx.canvas.captureStream(30));
  recorder.ondataavailable = saveChunks;
  recorder.onstop = exportVideo;
  recorder.start();
  stopRec.onclick = stopRecording;
  // switch the canvas to be recorded every 200ms
  switchInterval = setInterval(switchStream, 200);
}

function saveChunks(evt) {
  // store our final video's chunks
  if (evt.data.size > 0) {
    chunks.push(evt.data);
  }
}

function stopRecording() {
  // stop everything, this will trigger recorder.onstop
  recorder.stop();
  clearInterval(switchInterval);
  stopCanvasAnim();
  a.style.display = b.style.display = 'none';
  this.parentNode.innerHTML = "";
  recordingCtx.canvas.parentNode.removeChild(recordingCtx.canvas)
}

// when we've got everything
function exportVideo() {
  vid.src = URL.createObjectURL(new Blob(chunks));
}

// switch between 1 and 0
function switchStream() {
  current = +!current;
}

// some fancy drawings
var stopCanvasAnim = (function initCanvasDrawing() {
  var aCtx = canvases[0].getContext('2d'),
    bCtx = canvases[1].getContext('2d');
  var objects = [],
    w = canvases[0].width,
    h = canvases[0].height;
  aCtx.fillStyle = bCtx.fillStyle = 'ivory';
  // taken from http://stackoverflow.com/a/23486828/3702797
  for (var i = 0; i < 100; i++) {
    objects.push({
      angle: Math.random() * 360,
      x: 100 + (Math.random() * w / 2),
      y: 100 + (Math.random() * h / 2),
      radius: 10 + (Math.random() * 40),
      speed: 1 + Math.random() * 20
    });
  }
  var stop = false;
  var draw = function() {
    aCtx.fillRect(0, 0, w, h);
    bCtx.fillRect(0, 0, w, h);
    for (var n = 0; n < 100; n++) {
      var entity = objects[n],
        velY = Math.cos(entity.angle * Math.PI / 180) * entity.speed,
        velX = Math.sin(entity.angle * Math.PI / 180) * entity.speed;
      entity.x += velX;
      entity.y -= velY;
      aCtx.drawImage(imgA, entity.x, entity.y, entity.radius, entity.radius);
      bCtx.drawImage(imgB, entity.x, entity.y, entity.radius, entity.radius);
      entity.angle++;
    }
    if (!stop) {
      requestAnimationFrame(draw);
    }
  }
  var imgA = new Image();
  var imgB = new Image();
  imgA.onload = function() {
    draw();
    startRecording();
  };
  imgA.crossOrigin = imgB.crossOrigin = 'anonymous';
  imgA.src = "https://dl.dropboxusercontent.com/s/4e90e48s5vtmfbd/aaa.png";
  imgB.src = "https://dl.dropboxusercontent.com/s/rumlhyme6s5f8pt/ABC.png";
  return function() {
    stop = true;
  };
})();
<p>
  <button id="stopRec">stop recording</button>
</p>
<canvas id="a"></canvas>
<canvas id="b"></canvas>
<video id="vid" controls></video>
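Stripped of the demo's drawing code, the workaround boils down to a copy loop plus a recorder attached to the extra canvas. Here is a minimal sketch (the 30 fps capture rate, the video/webm type and the element names a, b and vid are assumptions carried over from the demo above); and since drawImage() accepts <video> elements as well as canvases, the same loop also covers switching between different video sources:

// minimal sketch of the offscreen-canvas workaround
// assumes two source canvases `a` and `b` (already being drawn to) and a <video id="vid">
var sources = [a, b];                // could just as well be <video> elements
var current = 0;

// the canvas that actually gets recorded; it never has to be shown
// (in Chrome before 58 it had to be appended to the document, as in the demo above)
var recCanvas = document.createElement('canvas');
recCanvas.width = a.width;
recCanvas.height = a.height;
var recCtx = recCanvas.getContext('2d');

// every frame, copy whichever source is active onto the recording canvas
(function copyLoop() {
  recCtx.drawImage(sources[current], 0, 0);
  requestAnimationFrame(copyLoop);
})();

// the recorder only ever sees the single, stable track of recCanvas
var chunks = [];
var recorder = new MediaRecorder(recCanvas.captureStream(30));
recorder.ondataavailable = function(e) { if (e.data.size > 0) chunks.push(e.data); };
recorder.onstop = function() {
  vid.src = URL.createObjectURL(new Blob(chunks, { type: 'video/webm' }));
};
recorder.start();                    // call recorder.stop() later to trigger onstop

// switching the recorded source is now just flipping an index
setInterval(function() { current = +!current; }, 200);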
