
How to pass an h264-encoded MediaRecorder stream to a MediaSource in Chrome?

Our screen recording Chrome extension allows users to record their screen using the getDisplayMedia API, which returns a stream that is fed into the MediaRecorder API.
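
For reference, that stream is obtained with something like the following (a minimal sketch; the exact constraints are up to the caller):

const mediaStream = await navigator.mediaDevices.getDisplayMedia({
  video: true
});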

Normally, we'd record this stream using the webm video container with the newer vp9 codec like so:

const mediaRecorder = new MediaRecorder(mediaStream, {
  mimeType: "video/webm; codecs=vp9"
});

However, Safari does not support the webm container, nor does it support decoding the vp9 codec. Since the MediaRecorder API in Chrome only supports recording in the webm container but does support the h264 encoding (which Safari can decode), we instead record with the h264 codec in a webm container:

const mediaRecorder = new MediaRecorder(mediaStream, {
  mimeType: "video/webm; codecs=h264"
});

This works well for two reasons:

  1. since our recording app is a Chrome extension, we don't mind that it can only record in Chrome

  2. since the video data is encoded as h264, we can now almost instantly move the video data to a .mp4 container, allowing Safari viewers to view these recorded videos without having to wait for an expensive transcoding process (note that you can view the videos without the Chrome extension, in a regular web app)

However, because the MediaRecorder API has no method for getting the duration of the video stream recorded so far, and measuring it manually with performance.now() proved to be imprecise (with a 25ms to 150ms error), we had to switch to feeding the recorded data into a MediaSource so that we can use the sourceBuffer.buffered.end(sourceBuffer.buffered.length - 1) * 1000 API to get a 100% accurate read of the video stream duration recorded so far (in milliseconds).
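
In sketch form, the setup looks like this (shown with the vp9 mime type, which is the variant that works; for brevity, appends are not queued, though a real implementation must wait for sourceBuffer.updating to become false between appends):

const mediaSource = new MediaSource();
const video = document.createElement("video");
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", () => {
  const sourceBuffer = mediaSource.addSourceBuffer("video/webm; codecs=vp9");
  const mediaRecorder = new MediaRecorder(mediaStream, {
    mimeType: "video/webm; codecs=vp9"
  });
  mediaRecorder.ondataavailable = async (evt) => {
    sourceBuffer.appendBuffer(await evt.data.arrayBuffer());
  };
  mediaRecorder.start(1000); // emit a chunk every second

  sourceBuffer.addEventListener("updateend", () => {
    if (sourceBuffer.buffered.length > 0) {
      // duration recorded so far, in milliseconds
      const durationMs =
        sourceBuffer.buffered.end(sourceBuffer.buffered.length - 1) * 1000;
      console.log("recorded so far:", durationMs, "ms");
    }
  });
});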

The issue is that for some reason the MediaSource fails to instantiate when we use our "video/webm; codecs=h264" mime type.

Doing this:

mediaSourceBuffer = mediaSource.addSourceBuffer("video/webm; codecs=h264");

Results in:

Failed to execute 'addSourceBuffer' on 'MediaSource': The type provided ('video/webm; codecs=h264') is unsupported.

Why is the mime type supported by MediaRecorder but not by MediaSource? Since they are of the same API family, shouldn't they support the same mime types? How can we record with the h264 codec while passing the data to a MediaSource using addSourceBuffer?
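
We can confirm the disagreement with the static isTypeSupported() checks that both constructors expose (the return values below are what the error above implies for Chrome):

MediaRecorder.isTypeSupported("video/webm; codecs=h264"); // true in Chrome
MediaSource.isTypeSupported("video/webm; codecs=h264");   // false in Chrome
MediaSource.isTypeSupported("video/webm; codecs=vp9");    // true in Chrome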

The only solution we can think of so far is to create two MediaRecorders: one recording in vp9 for us to read the accurate duration of the video recorded so far using the buffered.end API, and one recording in h264 for us to be able to immediately move the video data to an mp4 container without having to transcode the codec from vp9 to h264 for Safari users (sketched below). However, this would be very inefficient, as it would effectively hold twice as much video data in RAM.
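
In rough outline (the recorder names are just for illustration):

// fed into a MediaSource only so we can read buffered.end() for the duration
const durationRecorder = new MediaRecorder(mediaStream, {
  mimeType: "video/webm; codecs=vp9"
});

// kept in memory so the h264 data can be remuxed to .mp4 without transcoding
const exportRecorder = new MediaRecorder(mediaStream, {
  mimeType: "video/webm; codecs=h264"
});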

Reproduction cases / codesandbox examples

  1. vp9 example (both work)
  2. h264 example (media recorder works, media source does not)

Decoders and encoders are different beasts altogether. For instance, WebKit (Safari) can decode a few formats, but it can't encode anything.

Also, the MediaSource API requires that the media passed to it be fragmentable, so it cannot accept everything the browser can decode. For instance, if a browser someday supported generating standard (non-fragmented) mp4 files, it would still be unable to pass them to the MediaSource API.

I can't tell for sure whether they could support this particular codec (I'd guess yes), but you might not even need that workaround at all.

If your extension is able to generate DOM elements, then you can simply use a <video> element to tell you the duration of your recorded video, using the trick described in this answer:

Set the currentTime of the video to a very large number, wait for the seeked event, and you'll get the correct duration.

const canvas_stream = getCanvasStream();
const rec = new MediaRecorder( canvas_stream.stream );
const chunks = [];
rec.ondataavailable = (evt) => chunks.push( evt.data );
rec.onstop = async (evt) => {
  canvas_stream.stop();
  console.log( "duration:", await measureDuration( chunks ) );
};
rec.start();
setTimeout( () => rec.stop(), 5000 );
console.log( 'Recording 5s' );

function measureDuration( chunks ) {
  const blob = new Blob( chunks, { type: "video/webm" } );
  const vid = document.createElement( 'video' );
  return new Promise( (res, rej) => {
    vid.onerror = rej;
    vid.onseeked = (evt) => res( vid.duration );
    vid.onloadedmetadata = (evt) => {
      URL.revokeObjectURL( vid.src );
      // for demo only, to show it's Infinity in Chrome
      console.log( 'before seek', vid.duration );
    };
    vid.src = URL.createObjectURL( blob );
    vid.currentTime = 1e10; // seek far past the end to force the real duration
  } );
}

// just so we can have a MediaStream in StackSnippet
function getCanvasStream() {
  const canvas = document.createElement( 'canvas' );
  const ctx = canvas.getContext( '2d' );
  let stopped = false;
  function draw() {
    ctx.fillRect( 0, 0, 1, 1 );
    if( !stopped ) {
      requestAnimationFrame( draw );
    }
  }
  draw();
  return {
    stream: canvas.captureStream(),
    stop: () => stopped = true
  };
}
