
Change playout delay in WebRTC stream

I'm trying to cast a live MediaStream (eventually from the camera) from peerA to peerB, and I want peerB to receive the live stream in real time and then replay it with an added delay. Unfortunately it isn't possible to simply pause the stream and resume with play, since it jumps forward to the live moment.

So I have figured out that I can use MediaRecorder + SourceBuffer to rewatch the live stream: record the stream, append the buffers to MSE (SourceBuffer), and play it 5 seconds later. This works great on the local device (stream). But when I try to use MediaRecorder on the receiver's MediaStream (from pc.onaddstream), it looks like it gets some data and is able to append the buffer to the sourceBuffer; however, it does not replay. Sometimes I get just one frame.

const [pc1, pc2] = localPeerConnectionLoop()
const canvasStream = canvas.captureStream(200)

videoA.srcObject = canvasStream
videoA.play()

// Note: using two MediaRecorders at the same time seems problematic
// But this one works
// stream2mediaSorce(canvasStream, videoB)
// setTimeout(videoB.play.bind(videoB), 5000)

pc1.addTransceiver(canvasStream.getTracks()[0], {
  streams: [ canvasStream ]
})

pc2.onaddstream = (evt) => {
  videoC.srcObject = evt.stream
  videoC.play()

  // Note: using two MediaRecorders at the same time seems problematic
  // THIS DOES NOT WORK
  stream2mediaSorce(evt.stream, videoD)
  setTimeout(() => videoD.play(), 2000)
}

/**
 * Turn a MediaStream into a SourceBuffer
 * 
 * @param  {MediaStream}      stream   Live Stream to record
 * @param  {HTMLVideoElement} videoElm Video element to play the recorded video in
 * @return {undefined}
 */
function stream2mediaSorce (stream, videoElm) {
  const RECORDER_MIME_TYPE = 'video/webm;codecs=vp9'
  const recorder = new MediaRecorder(stream, { mimeType: RECORDER_MIME_TYPE })

  const mediaSource = new MediaSource()
  videoElm.src = URL.createObjectURL(mediaSource)
  mediaSource.onsourceopen = (e) => {
    // Declare the buffer; as an undeclared global this would throw in strict mode
    const sourceBuffer = mediaSource.addSourceBuffer(RECORDER_MIME_TYPE)

    const fr = new FileReader()
    fr.onerror = console.log
    fr.onload = ({ target }) => {
      console.log(target.result)
      sourceBuffer.appendBuffer(target.result)
    }
    recorder.ondataavailable = ({ data }) => {
      console.log(data)
      fr.readAsArrayBuffer(data)
    }
    setInterval(recorder.requestData.bind(recorder), 1000)
  }

  console.log('Recorder created')
  recorder.start()
}

Do you know why it won't play the video?

I have created a fiddle with all the necessary code to try it out; the JavaScript tab is the same code as above (the HTML is mostly irrelevant and does not need to be changed).

Some try to reduce the latency, but I actually want to increase it to ~10 seconds to rewatch something you did wrong in a golf swing or something, and, if possible, avoid MediaRecorder altogether.

EDIT: I found something called "playout-delay" in some RTC extension

that allows the sender to control the minimum and maximum latency from capture to render time允许发送者控制从捕获到渲染时间的最小和最大延迟

How can I use it? Will it be of any help to me?

Update: there is a new feature that will enable this, called playoutDelayHint.

We want to provide means for JavaScript applications to set their preferences on how fast they want to render audio or video data. As fast as possible might be beneficial for applications which concentrate on real-time experience. For others, additional data buffering may provide a smoother experience in case of network issues.
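A minimal sketch of using the hint on the receiving side (the helper names here are mine, not part of any API; playoutDelayHint is a non-standard, Chromium-only property on RTCRtpReceiver, expressed in seconds):

```javascript
// Hypothetical helpers: request extra receive-side buffering via the
// non-standard playoutDelayHint property on RTCRtpReceiver.

// Clamp a requested delay to a sane range; the browser enforces its own
// limits anyway, so the hint is a preference, not a guarantee.
function clampDelaySeconds (seconds, max = 10) {
  return Math.min(Math.max(seconds, 0), max)
}

// Apply the hint to every receiver on the connection (browser only).
function setReceiveDelay (pc, seconds) {
  const delay = clampDelaySeconds(seconds)
  for (const receiver of pc.getReceivers()) {
    receiver.playoutDelayHint = delay // seconds; the UA does its best
  }
  return delay
}

// Usage with the example above: setReceiveDelay(pc2, 10)
```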

Refs:
https://discourse.wicg.io/t/hint-attribute-in-webrtc-to-influence-underlying-audio-video-buffering/4038

https://bugs.chromium.org/p/webrtc/issues/detail?id=10287

Demo: https://jsfiddle.net/rvekxns5/ (though I was only able to set a max of 10s in my browser; it's up to the UA vendor to do the best it can with the resources available)

import('https://jimmy.warting.se/packages/dummycontent/canvas-clock.js')
  .then(({ AnalogClock }) => {
    const { canvas } = new AnalogClock(100)
    document.querySelector('canvas').replaceWith(canvas)

    const [pc1, pc2] = localPeerConnectionLoop()
    const canvasStream = canvas.captureStream(200)

    videoA.srcObject = canvasStream
    videoA.play()

    pc1.addTransceiver(canvasStream.getTracks()[0], {
      streams: [ canvasStream ]
    })

    pc2.onaddstream = (evt) => {
      videoC.srcObject = evt.stream
      videoC.play()
    }

    $dur.onchange = () => {
      pc2.getReceivers()[0].playoutDelayHint = $dur.valueAsNumber
    }
  })
<!-- all the irrelevant part, that you don't need to know anything about -->
<h3 style="border-bottom: 1px solid">Original canvas</h3>
<canvas id="canvas" width="100" height="100"></canvas>
<script>
function localPeerConnectionLoop (cfg = { sdpSemantics: 'unified-plan' }) {
  const setD = (d, a, b) => Promise.all([a.setLocalDescription(d), b.setRemoteDescription(d)])
  return [0, 1].map(() => new RTCPeerConnection(cfg)).map((pc, i, pcs) => Object.assign(pc, {
    onicecandidate: e => e.candidate && pcs[i ^ 1].addIceCandidate(e.candidate),
    onnegotiationneeded: async e => {
      try {
        await setD(await pc.createOffer(), pc, pcs[i ^ 1])
        await setD(await pcs[i ^ 1].createAnswer(), pcs[i ^ 1], pc)
      } catch (e) {
        console.log(e)
      }
    }
  }))
}
</script>
<h3 style="border-bottom: 1px solid">Local peer (PC1)</h3>
<video id="videoA" muted width="100" height="100"></video>
<h3 style="border-bottom: 1px solid">Remote peer (PC2)</h3>
<video id="videoC" muted width="100" height="100"></video>
<label>
  Change playoutDelayHint
  <input type="number" value="1" id="$dur">
</label>
