
How can I get native MediaStreamTrack from WebRtc MediaStreamTrackWeb object

I want to mix MediaStreamTrack objects in Dart using the package:universal_html/js.dart library.

      JsAudioContext audioContext = JsAudioContext();
      audioContext.initialize();
      var senders = await call!.peerConnection!.getSenders();
      // Collect every sender's track so each one can be compared against the others.
      var senderTracks = senders.map((s) => s.track!).toList();
      for (var sender in senders) {
        for (var track in senderTracks) {
          if (sender.track!.id != track.id) {
            audioContext.connect(track);
          }
        }
      }

But WebRtc hides the native jsTrack object inside the MediaStreamTrackWeb object. How can I access this object? Does anyone have an idea?

I found a solution using the js_bindings library. The library's MediaStream.getTracks() method throws a type error, which I worked around with js_util interop.
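In isolation, the workaround looks like this (a minimal sketch; tracksOf is a hypothetical helper name, but the js_util.callMethod call and the cast mirror what getMixedTrack() below does):

import 'dart:js_util' as js_util;
import 'package:js_bindings/js_bindings.dart' as js_bindings;
import 'package:universal_html/html.dart' as html;

// Hypothetical helper: read a js_bindings MediaStream's tracks through
// js_util.callMethod instead of the generated getTracks() binding,
// which throws a type error.
List<html.MediaStreamTrack> tracksOf(js_bindings.MediaStream stream) {
  List<dynamic> raw = js_util.callMethod(stream, 'getTracks', []);
  return raw.cast<html.MediaStreamTrack>();
}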

JsAudioContext.dart:

import 'package:flutter_webrtc/flutter_webrtc.dart' as webrtc;
import 'package:dart_webrtc/src/media_stream_track_impl.dart' as track_impl;
import 'package:js_bindings/js_bindings.dart' as js_bindings;
import 'package:universal_html/html.dart' as html;
import 'dart:js_util' as js_util;

class JsAudioContext {
  js_bindings.AudioContext? audioContext;
  js_bindings.MediaStreamAudioDestinationNode? destinationNode;

  JsAudioContext() {
    audioContext = js_bindings.AudioContext();
  }

  /// Creates the destination node that all source tracks are mixed into.
  void createMediaStreamDestination() {
    destinationNode = audioContext?.createMediaStreamDestination();
  }

  /// Unwraps the native jsTrack from the MediaStreamTrackWeb wrapper and
  /// connects it to the destination node as a new source stream.
  void connect(webrtc.MediaStreamTrack? trackWeb) {
    track_impl.MediaStreamTrackWeb mediaStreamTrackWeb =
        trackWeb as track_impl.MediaStreamTrackWeb;
    html.MediaStreamTrack htmlTrack = mediaStreamTrackWeb.jsTrack;
    var sourceStream = audioContext?.createMediaStreamSource(
        js_bindings.MediaStream([htmlTrack as js_bindings.MediaStreamTrack]));
    sourceStream?.connect(destinationNode!);
  }

  /// Returns the mixed output as a webrtc track. getTracks() is called via
  /// js_util because the generated binding throws a type error.
  webrtc.MediaStreamTrack getMixedTrack() {
    List<dynamic> outputTracks =
        js_util.callMethod(destinationNode!.stream, 'getTracks', []);

    webrtc.MediaStreamTrack rtcTrack = track_impl.MediaStreamTrackWeb(
        outputTracks[0] as html.MediaStreamTrack);
    return rtcTrack;
  }
}
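For reference, the class boils down to this call sequence (a sketch with a hypothetical mixTracks helper, not part of the original code):

// Sketch: mix any number of webrtc tracks into a single output track.
webrtc.MediaStreamTrack mixTracks(List<webrtc.MediaStreamTrack> tracks) {
  JsAudioContext ctx = JsAudioContext();
  ctx.createMediaStreamDestination(); // fresh destination node for this mix
  for (var track in tracks) {
    ctx.connect(track); // every source track feeds the same destination
  }
  return ctx.getMixedTrack(); // one track carrying the mixed audio
}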

sip_call_event_service.dart:

@override
  Future startConference(List<SipCallData> activeCallList) async {
    // Gather every received (remote) track from all active calls.
    List<webrtc.MediaStreamTrack> receivedTracks = <webrtc.MediaStreamTrack>[];

    for (var item in activeCallList) {
      Call? call = sipuaHelper!.findCall(item.id!);
      var receivers = await call!.peerConnection!.getReceivers();
      for (var receiver in receivers) {
        receivedTracks.add(receiver.track!);
      }
    }

    JsAudioContext jsAudioContext = JsAudioContext();

    for (var item in activeCallList) {
      Call? call = sipuaHelper!.findCall(item.id!);
      jsAudioContext.createMediaStreamDestination();

      // Mix in the remote tracks of every other call, skipping this call's own.
      var receivers = await call!.peerConnection!.getReceivers();
      for (var receiver in receivers) {
        for (var track in receivedTracks) {
          if (receiver.track!.id != track.id) {
            jsAudioContext.connect(track);
          }
        }
      }

      // Also mix in this call's local (sender) tracks.
      var senders = await call.peerConnection!.getSenders();
      for (var sender in senders) {
        jsAudioContext.connect(sender.track);
      }

      // Replace the outgoing track with the mixed result.
      await senders.first.replaceTrack(jsAudioContext.getMixedTrack());
    }
  }
