
Stream media file using WebRTC

Here is my use case: Alice has a cool new media track that she wants Bob to listen in to. She selects the media file in her browser and the media file starts playing instantly in Bob's browser.

I'm not even sure if this is possible to build using the WebRTC API right now. All the examples I can find use streams obtained via getUserMedia(), but this is what I have:

var context = new AudioContext();
var pc = new RTCPeerConnection(pc_config);

function handleFileSelect(event) {
    var file = event.target.files[0];

    if (file) {
        // 'audio.*' matches MIME types such as audio/mpeg or audio/ogg
        if (file.type.match('audio.*')) {
            console.log(file.name);
            var reader = new FileReader();

            reader.onload = function(readEvent) {
                // Decode the file into an AudioBuffer, route it into a
                // MediaStreamDestination, and add that stream to the peer connection
                context.decodeAudioData(readEvent.target.result, function(buffer) {
                    var source = context.createBufferSource();
                    var destination = context.createMediaStreamDestination();
                    source.buffer = buffer;
                    source.start(0);
                    source.connect(destination);
                    pc.addStream(destination.stream);
                    pc.createOffer(setLocalAndSendMessage);
                });
            };

            reader.readAsArrayBuffer(file);
        }
    }
}

On the receiving side I have the following:

function gotRemoteStream(event) {
    // Route the remote MediaStream into the Web Audio graph
    // and out to the local speakers
    var mediaStreamSource = context.createMediaStreamSource(event.stream);
    mediaStreamSource.connect(context.destination);
}

This code does not make the media (music) play on the receiving side. I do, however, receive an ended event right after the WebRTC handshake is done and the gotRemoteStream function is called. The gotRemoteStream function gets called, but the media does not start playing.

On Alice's side the magic is supposed to happen in the line that says source.connect(destination). When I replace that line with source.connect(context.destination), the media starts playing correctly through Alice's speakers.
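
As an aside, a Web Audio source node can feed more than one destination at the same time, so local playback and streaming do not have to be mutually exclusive. A minimal sketch, reusing the source, destination, and context variables from the snippet above:

    // Fan out: one connection feeds the stream sent to Bob,
    // the other feeds Alice's own speakers for local monitoring
    source.connect(destination);
    source.connect(context.destination);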

On Bob's side a media stream source is created based upon Alice's stream. However, when the local speakers are connected using mediaStreamSource.connect(context.destination), the music doesn't start playing through the speakers.

Of course I could always send the media file through a DataChannel, but where is the fun in that...
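
For reference, a minimal sketch of that DataChannel fallback, assuming dc is an already-open reliable RTCDataChannel and using a hypothetical 'EOF' marker so the receiver knows the transfer is done:

    function sendFileOverDataChannel(file, dc) {
        var CHUNK = 16 * 1024; // 16 KiB chunks keep the buffered amount small
        var reader = new FileReader();
        reader.onload = function(e) {
            var buffer = e.target.result;
            for (var offset = 0; offset < buffer.byteLength; offset += CHUNK) {
                dc.send(buffer.slice(offset, offset + CHUNK));
            }
            dc.send('EOF'); // hypothetical end-of-transfer marker
        };
        reader.readAsArrayBuffer(file);
    }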

Any clues on what is wrong with my code or some ideas on how to achieve my use case would be greatly appreciated!

I'm using the latest and greatest Chrome Canary.

Thanks.

It is possible to play the audio using the Audio element like this:

function gotRemoteStream(event) {
    var player = new Audio();
    // attachMediaStream() is the adapter.js helper that assigns
    // the stream to the element in a cross-browser way
    attachMediaStream(player, event.stream);
    player.play();
}
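
In current browsers the adapter.js helper is no longer needed; the standard srcObject property does the same job. A rough modern equivalent, assuming the stream still arrives as event.stream the way the legacy onaddstream callback delivered it:

    function gotRemoteStream(event) {
        var player = new Audio();
        player.srcObject = event.stream; // standard replacement for attachMediaStream()
        player.play();
    }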

Playing back the audio via the WebAudio API is not working (yet) for me.

Not sure about Chrome; sounds like a bug.

Try it on Firefox (Nightly, I suggest); we have WebAudio support there, though I don't know all the details about what's supported currently.

Also, on Firefox at least we have stream = media_element.captureStreamUntilEnded(); we use it in some of our tests in dom/media/tests/mochitests, I believe. This lets you take any audio or video element and capture the output as a MediaStream.
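
A sketch of how that approach could serve the original use case, assuming the vendor-prefixed mozCaptureStreamUntilEnded() name Firefox shipped at the time (the unprefixed captureStream() came later), with pc as the RTCPeerConnection from the question:

    function streamFileViaElement(file, pc) {
        var player = new Audio();
        player.src = URL.createObjectURL(file); // play the selected file in an element
        player.play();

        // Capture the element's output as a MediaStream and send it to the peer
        var stream = player.mozCaptureStreamUntilEnded();
        pc.addStream(stream); // legacy addStream(), matching the question's code
    }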

Edit: see below; both Chrome and Firefox have gaps in combining WebAudio with WebRTC PeerConnections, but in different places. Mozilla hopes to fix the last bug there very soon.

Check out the MediaStream Integration page. It illustrates WebRTC integration with the Web Audio API. In particular, this example is relevant for your question:

  1. Capture microphone input, visualize it, mix in another audio track and stream the result to a peer
<canvas id="c"></canvas>
<audio src="back.webm" id="back"></audio>
<script>
    // Proposal-era getUserMedia() syntax; 'context', 'drawAnimation', and
    // 'peerConnection' are assumed to be defined elsewhere on the page
    navigator.getUserMedia('audio', gotAudio);
    var streamRecorder;
    function gotAudio(stream) {
        var microphone = context.createMediaStreamSource(stream);
        var backgroundMusic = context.createMediaElementSource(document.getElementById("back"));
        var analyser = context.createAnalyser();
        var mixedOutput = context.createMediaStreamDestination();
        // Mix the microphone (routed through the analyser used for
        // visualization) with the background track, then stream the mix
        microphone.connect(analyser);
        analyser.connect(mixedOutput);
        backgroundMusic.connect(mixedOutput);
        requestAnimationFrame(drawAnimation);

        peerConnection.addStream(mixedOutput.stream);
    }
</script>

I fear, however, that this is currently only a proposal.
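
For comparison, a rough sketch of the same mixing idea with the APIs that eventually shipped (promise-based getUserMedia and addTrack() instead of the legacy addStream()); drawAnimation and peerConnection remain placeholders, as in the proposal example:

    var context = new AudioContext();

    navigator.mediaDevices.getUserMedia({ audio: true }).then(function(stream) {
        var microphone = context.createMediaStreamSource(stream);
        var backgroundMusic = context.createMediaElementSource(document.getElementById('back'));
        var analyser = context.createAnalyser();
        var mixedOutput = context.createMediaStreamDestination();

        microphone.connect(analyser);
        analyser.connect(mixedOutput);
        backgroundMusic.connect(mixedOutput);
        requestAnimationFrame(drawAnimation);

        // addTrack() replaced the legacy addStream() used elsewhere in this thread
        mixedOutput.stream.getTracks().forEach(function(track) {
            peerConnection.addTrack(track, mixedOutput.stream);
        });
    });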
