
CallKit and WebRTC: no audio when accepting a call from the locked screen

I'm trying to make CallKit work with WebRTC for incoming calls, but when I receive a call and accept it from the locked screen, there is no sound until I bring the app into the foreground. I've configured the audio session to send notifications to RTCAudioSession, but it doesn't work. Are there any workarounds?

    func configureAudioSession() {
        let sharedSession = AVAudioSession.sharedInstance()
        do {
            // Category, mode and options set in a single call (iOS 10+ API).
            try sharedSession.setCategory(.playAndRecord,
                                          mode: .videoChat,
                                          options: [.mixWithOthers])
//          try sharedSession.setAggregatedIOPreference(AVAudioSessionIOType.aggregated)
        } catch {
            debugPrint("Failed to configure `AVAudioSession`: \(error)")
        }
    }

    func handleIncomingCall(spaceName: String) {
        if callUUID != nil {
            oldCallUUID = callUUID
        }
        callUUID = UUID()
        print("CallManager handle uuid = \(callUUID?.description ?? "nil")")
        let update = CXCallUpdate()
        update.hasVideo = true
        update.remoteHandle = CXHandle(type: .generic, value: spaceName)
        self.configureAudioSession()
        provider?.reportNewIncomingCall(with: callUUID!, update: update, completion: { error in
            print("CallManager report new incoming call completion, error: \(String(describing: error))")
        })
    }

    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        print("CallManager didActivate")
        RTCAudioSession.sharedInstance().audioSessionDidActivate(audioSession)
        RTCAudioSession.sharedInstance().isAudioEnabled = true
        self.callDelegate?.callIsAnswered()
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        print("CallManager didDeactivate")
        RTCAudioSession.sharedInstance().audioSessionDidDeactivate(audioSession)
        RTCAudioSession.sharedInstance().isAudioEnabled = false
    }

Okay, I've found the cause of the issue. On iOS 12 there is a problem with WebRTC: when you start WebRTC from the locked screen and try to get access to the camera, the output volume breaks. The solution is to check whether the screen is active, and if it is not, do not request the camera or add the local RTCVideoTrack to your RTCStream.
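The check described above can be sketched as a small helper. This is only a sketch: `ScreenState` and `shouldAddLocalVideoTrack` are hypothetical names, with `ScreenState` standing in for whatever signal you use to detect the lock state (e.g. checking `UIApplication.shared.applicationState` on the main thread).

```swift
// Placeholder for the app/screen state at the moment the call is answered.
enum ScreenState {
    case active
    case inactive
    case locked
}

// On iOS 12, requesting the camera from the locked screen breaks the output
// volume, so only add the local RTCVideoTrack when the screen is active.
func shouldAddLocalVideoTrack(screen: ScreenState) -> Bool {
    return screen == .active
}
```

When the screen is locked, the call then starts audio-only; the video track can be added later, for example once the app becomes active again.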

What iOS version is your test iPhone running?

Please note that I'm sharing my code for reference and it is tailored to my needs; you will need to adapt it to yours.

When you receive the VoIP notification, create a new instance of your WebRTC handling class and add these two lines to the code block, because enabling the audio session from the VoIP notification fails:

RTCAudioSession.sharedInstance().useManualAudio = true
RTCAudioSession.sharedInstance().isAudioEnabled = false 
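The intended hand-over can be modeled as a tiny state machine to make the ordering explicit. This is a sketch, not the WebRTC API: in the real app `RTCAudioSession.sharedInstance()` holds these two flags, and audio only goes live once CallKit fires `provider(_:didActivate:)`.

```swift
// Models the manual-audio handover between the VoIP push and CallKit.
// With useManualAudio = true, WebRTC stops managing the AVAudioSession
// itself and stays silent until isAudioEnabled is flipped to true.
struct ManualAudioState {
    var useManualAudio = false
    var isAudioEnabled = false

    // Call from the VoIP push handler: defer audio activation to CallKit.
    mutating func onIncomingPush() {
        useManualAudio = true
        isAudioEnabled = false
    }

    // Call from provider(_:didActivate:): the system session is now live.
    mutating func onDidActivate() {
        isAudioEnabled = true
    }

    // Call from provider(_:didDeactivate:): the system reclaimed the session.
    mutating func onDidDeactivate() {
        isAudioEnabled = false
    }
}
```

The key point is the order: the push handler only arms manual audio; sound starts strictly inside `didActivate`, which is why enabling it earlier (from the notification) fails.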

The `didReceiveIncomingPushWith` method:

    func pushRegistry(_ registry: PKPushRegistry, didReceiveIncomingPushWith payload: PKPushPayload, for type: PKPushType, completion: @escaping () -> Void) {
        let state = UIApplication.shared.applicationState

        if payload.dictionaryPayload["hangup"] == nil && state != .active {
            // I pass parameters to the WebRTC handler via a Globals singleton
            // to create the answer according to the SDP sent in the payload.
            Globals.voipPayload = payload.dictionaryPayload as! [String: Any]

            RTCAudioSession.sharedInstance().useManualAudio = true
            RTCAudioSession.sharedInstance().isAudioEnabled = false

            Globals.sipGateway = SipGateway() // my WebRTC and Janus gateway handler class
            // Check the Janus gateway credentials stored in shared preferences,
            // initiate the websocket connection and create the peer connection
            // to my Janus gateway, which is the signaling server for my environment.
            Globals.sipGateway?.configureCredentials(true)

            initProvider() // Creating the CallKit provider

            self.update.remoteHandle = CXHandle(type: .generic, value: String(describing: payload.dictionaryPayload["caller_id"]!))
            Globals.callId = UUID()

            Globals.provider.reportNewIncomingCall(with: Globals.callId, update: self.update, completion: { error in
            })
        }
    }
    
        
    func initProvider() {
        let config = CXProviderConfiguration(localizedName: "ulakBEL")
        config.iconTemplateImageData = UIImage(named: "ulakbel")!.pngData()
        config.ringtoneSound = "ringtone.caf"
        // config.includesCallsInRecents = false
        config.supportsVideo = false

        Globals.provider = CXProvider(configuration: config)
        Globals.provider.setDelegate(self, queue: nil)
        update = CXCallUpdate()
        update.hasVideo = false
        update.supportsDTMF = true
    }
    

Modify your `didActivate` and `didDeactivate` delegate functions as below:

    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        print("CallManager didActivate")
        RTCAudioSession.sharedInstance().audioSessionDidActivate(audioSession)
        RTCAudioSession.sharedInstance().isAudioEnabled = true
        // self.callDelegate?.callIsAnswered()
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        print("CallManager didDeactivate")
        RTCAudioSession.sharedInstance().audioSessionDidDeactivate(audioSession)
        RTCAudioSession.sharedInstance().isAudioEnabled = false
    }

In the WebRTC handler class, configure the media senders and the audio session:

    private func createPeerConnection(webRTCCallbacks: PluginHandleWebRTCCallbacksDelegate) {
        let rtcConfig = RTCConfiguration()
        rtcConfig.iceServers = server.iceServers
        rtcConfig.bundlePolicy = RTCBundlePolicy.maxBundle
        rtcConfig.rtcpMuxPolicy = RTCRtcpMuxPolicy.require
        rtcConfig.continualGatheringPolicy = .gatherContinually
        rtcConfig.sdpSemantics = .planB

        let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
                                              optionalConstraints: ["DtlsSrtpKeyAgreement": kRTCMediaConstraintsValueTrue])

        pc = sessionFactory.peerConnection(with: rtcConfig, constraints: constraints, delegate: nil)
        self.createMediaSenders()
        self.configureAudioSession()

        if webRTCCallbacks.getJsep() != nil {
            handleRemoteJsep(webrtcCallbacks: webRTCCallbacks)
        }
    }

The media senders:

private func createMediaSenders() {
        let streamId = "stream"
        
        // Audio
        let audioTrack = self.createAudioTrack()
        self.pc.add(audioTrack, streamIds: [streamId])
        
        // Video
      /*  let videoTrack = self.createVideoTrack()
        self.localVideoTrack = videoTrack
        self.peerConnection.add(videoTrack, streamIds: [streamId])
        self.remoteVideoTrack = self.peerConnection.transceivers.first { $0.mediaType == .video }?.receiver.track as? RTCVideoTrack
        
        // Data
        if let dataChannel = createDataChannel() {
            dataChannel.delegate = self
            self.localDataChannel = dataChannel
        }*/
    }

  private func createAudioTrack() -> RTCAudioTrack {
        let audioConstrains = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
        let audioSource = sessionFactory.audioSource(with: audioConstrains)
        let audioTrack = sessionFactory.audioTrack(with: audioSource, trackId: "audio0")
        return audioTrack
    }

The audio session:

private func configureAudioSession() {
        self.rtcAudioSession.lockForConfiguration()
        do {
            try self.rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
            try self.rtcAudioSession.setMode(AVAudioSession.Mode.voiceChat.rawValue)
        } catch let error {
            debugPrint("Error changing AVAudioSession category: \(error)")
        }
        self.rtcAudioSession.unlockForConfiguration()
    }

Please bear in mind that, because I worked with callbacks and delegates, the code includes delegate and callback chunks; you can ignore them accordingly.

For reference, you can also check the example at this link.

