
How to pass a h264 encoded MediaRecorder stream to a MediaSource in Chrome?

Our screen recording Chrome extension allows users to record their screen using the getDisplayMedia API, which returns a stream that is fed into the MediaRecorder API.

Normally, we'd record this stream using the webm video container with the newer vp9 codec, like so:

const mediaRecorder = new MediaRecorder(mediaStream, {
    mimeType: "video/webm; codecs=vp9"
  });

However, Safari does not support the webm container, nor does it support decoding the vp9 codec. Since the MediaRecorder API in Chrome only supports recording in the webm container but does support the h264 encoding (which Safari can decode), we instead record with the h264 codec in a webm container:

const mediaRecorder = new MediaRecorder(mediaStream, {
    mimeType: "video/webm; codecs=h264"
  });

This works well for two reasons:

  1. Since our recording app is a Chrome extension, we don't mind that it can only record in Chrome.

  2. Since the video data is encoded as h264, we can now almost instantly move the video data into an mp4 container, allowing Safari viewers to view these recorded videos without having to wait for an expensive transcoding process (note that the videos can be viewed without the Chrome extension, in a regular web app).

However, because the MediaRecorder API has no method for getting the duration of the video stream recorded so far, and measuring it manually with performance.now() proved imprecise (with a 25 ms to 150 ms error), we had to change to feeding the recorder data into a MediaSource, so that we can use the sourceBuffer.buffered.end(sourceBuffer.buffered.length - 1) * 1000 API to get a 100% accurate read of the video stream duration recorded so far (in milliseconds).
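For reference, the duration read described above can be sketched as a small helper over the SourceBuffer's TimeRanges (the helper name is ours, purely for illustration):

```javascript
// Sketch: derive the duration recorded so far, in milliseconds, from a
// SourceBuffer's TimeRanges object, as in the buffered.end(...) call above.
// Returns 0 while nothing has been appended yet (buffered is empty).
function bufferedDurationMs(buffered) {
  if (buffered.length === 0) return 0;
  // end() of the last buffered range is the current stream position in seconds
  return buffered.end(buffered.length - 1) * 1000;
}

// In the browser this would be called as:
//   bufferedDurationMs(sourceBuffer.buffered)
```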

The issue is that, for some reason, the MediaSource fails to instantiate when we use our "video/webm; codecs=h264" mime type.

Doing this:

mediaSourceBuffer = mediaSource.addSourceBuffer("video/webm; codecs=h264");

Results in:

Failed to execute 'addSourceBuffer' on 'MediaSource': The type provided ('video/webm; codecs=h264') is unsupported.

Why is the mime type supported by MediaRecorder but not by MediaSource? Since they are of the same API family, shouldn't they support the same mime types? How can we record with the h264 codec while passing the data to a MediaSource using addSourceBuffer?
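Both APIs expose a static isTypeSupported method, so the mismatch can be probed directly at runtime before calling addSourceBuffer. A quick sketch (the API objects are injected as parameters only so the helper's shape is easy to exercise; in a page you would pass the real MediaRecorder and MediaSource globals):

```javascript
// Compare which mime types MediaRecorder and MediaSource each claim to
// support, via their standard static isTypeSupported(type) methods.
function probeMimeSupport(types, recorderApi, sourceApi) {
  return types.map((type) => ({
    type,
    recorder: recorderApi.isTypeSupported(type),
    source: sourceApi.isTypeSupported(type),
  }));
}

// In the browser:
//   probeMimeSupport(
//     ["video/webm; codecs=vp9", "video/webm; codecs=h264"],
//     MediaRecorder,
//     MediaSource
//   );
// Per the error above, Chrome reports the h264-in-webm string as
// recordable but not as accepted by addSourceBuffer.
```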

The only solution we can think of so far is to create two MediaRecorders: one recording in vp9, so we can read the accurate duration of the video recorded so far using the buffered.end API, and one recording in h264, so we can immediately move the video data into an mp4 container without having to transcode the codec from vp9 to h264 for Safari users. However, this would be very inefficient, as it would effectively hold twice as much data in RAM.
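The dual-recorder fallback we are considering would look roughly like this (a sketch, not production code; the recorder constructor is a parameter only so the shape is easy to exercise outside a browser, where you would pass MediaRecorder itself):

```javascript
// Sketch of the dual-recorder workaround: one vp9 recorder whose output
// could feed a MediaSource for duration tracking, and one h264 recorder
// whose chunks are kept for remuxing into mp4 for Safari viewers.
// Note both recorders buffer the same stream, roughly doubling RAM use.
function startDualRecorders(stream, RecorderCtor) {
  const vp9Chunks = [];
  const h264Chunks = [];
  const vp9Recorder = new RecorderCtor(stream, { mimeType: "video/webm; codecs=vp9" });
  const h264Recorder = new RecorderCtor(stream, { mimeType: "video/webm; codecs=h264" });
  vp9Recorder.ondataavailable = (evt) => vp9Chunks.push(evt.data);
  h264Recorder.ondataavailable = (evt) => h264Chunks.push(evt.data);
  vp9Recorder.start(1000); // emit a chunk every second
  h264Recorder.start(1000);
  return { vp9Recorder, h264Recorder, vp9Chunks, h264Chunks };
}

// In the browser: startDualRecorders(mediaStream, MediaRecorder)
```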

Reproduction cases / codesandbox examples

  1. vp9 example (both work)
  2. h264 example (media recorder works, media source does not)

Decoders and encoders are different beasts altogether. For instance, WebKit (Safari) can decode a few formats, but it can't encode anything.

Also, the MediaSource API requires that the media passed to it be fragmented, and thus can't read all the media the browser can decode. For instance, if a browser someday supported generating standard (non-fragmented) mp4 files, it would still be unable to pass them to the MediaSource API.

I can't tell for sure if they could support this particular codec (I guess yes), but you might not even need all that workaround at all.

If your extension is able to generate DOM elements, then you can simply use a <video> element to tell you the duration of your recorded video, using the trick described in this answer:

Set the currentTime of the video to a very large number, wait for the seeked event, and you'll get the correct duration.

const canvas_stream = getCanvasStream();
const rec = new MediaRecorder( canvas_stream.stream );
const chunks = [];
rec.ondataavailable = (evt) => chunks.push( evt.data );
rec.onstop = async (evt) => {
  canvas_stream.stop();
  console.log( "duration:", await measureDuration( chunks ) );
};
rec.start();
setTimeout( () => rec.stop(), 5000 );
console.log( 'Recording 5s' );

function measureDuration( chunks ) {
  const blob = new Blob( chunks, { type: "video/webm" } );
  const vid = document.createElement( 'video' );
  return new Promise( (res, rej) => {
    vid.onerror = rej;
    vid.onseeked = (evt) => res( vid.duration );
    vid.onloadedmetadata = (evt) => {
      URL.revokeObjectURL( vid.src );
      // for demo only, to show it's Infinity in Chrome
      console.log( 'before seek', vid.duration );
    };
    vid.src = URL.createObjectURL( blob );
    vid.currentTime = 1e10;
  } );
}

// just so we can have a MediaStream in StackSnippet
function getCanvasStream() {
  const canvas = document.createElement( 'canvas' );
  const ctx = canvas.getContext( '2d' );
  let stopped = false;
  function draw() {
    ctx.fillRect( 0, 0, 1, 1 );
    if ( !stopped ) {
      requestAnimationFrame( draw );
    }
  }
  draw();
  return {
    stream: canvas.captureStream(),
    stop: () => stopped = true
  };
}
