
AVAssetWriter async video and audio after calling broadcastPaused()

I am trying to record a video with sound from the device screen using ReplayKit and RPBroadcastSampleHandler.

When I record using just "start broadcast" and "stop", the result is great.

But if I try to pause the recording (via the red status bar), I run into problems. The resulting video and audio have different lengths (the audio is shorter, but contains everything I need). In the recording, video and audio drift out of sync from the moment the status bar is tapped (iOS 14). The audio plays fine, but the video freezes when the status bar is tapped and only continues once the modal window is closed. As a result, the end of the video has no audio.

Here is my code:

1. All the class fields I have:

class SampleHandler: RPBroadcastSampleHandler {
    
    private let videoService = VideoService()
    private let audioService = AudioService()
    private var isRecording = false
    
    private let lock = NSLock()
    private var finishCalled = false
    
    private var videoWriter: AVAssetWriter!
    private var videoWriterInput: AVAssetWriterInput!
    private var microphoneWriterInput: AVAssetWriterInput!
    private var sessionBeginAtSourceTime: CMTime!

2. Some configuration when capturing starts:

override func broadcastStarted(withSetupInfo setupInfo: [String : NSObject]?) {
    guard !isRecording else { return }
    isRecording = true
    BroadcastData.clear()
    BroadcastData.startVideoDate = Date()
    BroadcastData.status = .writing
    sessionBeginAtSourceTime = nil
                    
    configurateVideoWriter()
}

private func configurateVideoWriter() {
    let outputFileLocation = videoService.getVideoFileLocation()
    
    videoWriter = try? AVAssetWriter.init(outputURL: outputFileLocation,
                                          fileType: AVFileType.mov)
    
    configurateVideoWriterInput()
    configurateMicrophoneWriterInput()
    
    if videoWriter.canAdd(videoWriterInput) { videoWriter.add(videoWriterInput) }
    if videoWriter.canAdd(microphoneWriterInput) { videoWriter.add(microphoneWriterInput) }

    videoWriter.startWriting()
}

private func configurateVideoWriterInput() {
    let RESOLUTION_COEF: CGFloat = 16
    let naturalWidth = UIScreen.main.bounds.width
    let naturalHeight = UIScreen.main.bounds.height
    let width = naturalWidth - naturalWidth.truncatingRemainder(dividingBy: RESOLUTION_COEF)
    let height = naturalHeight - naturalHeight.truncatingRemainder(dividingBy: RESOLUTION_COEF)
    
    let videoSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
    ]
    
    videoWriterInput = AVAssetWriterInput(mediaType: .video,
                                          outputSettings: videoSettings)
    videoWriterInput.expectsMediaDataInRealTime = true
}

private func configurateMicrophoneWriterInput() {
    let audioOutputSettings: [String : Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey : 1,
        AVSampleRateKey : 44100.0,
        AVEncoderBitRateKey: 96000
    ]

    microphoneWriterInput = AVAssetWriterInput(mediaType: .audio,
                                               outputSettings: audioOutputSettings)
    microphoneWriterInput.expectsMediaDataInRealTime = true
}

3. The write process:

override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
    guard isRecording && videoWriter?.status == .writing else { return }
    
    if BroadcastData.status != .writing {
        isRecording = false
        finishBroadCast()
        return
    }
    
    if sessionBeginAtSourceTime == nil {
        sessionBeginAtSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        videoWriter.startSession(atSourceTime: sessionBeginAtSourceTime!)
    }
    
    switch sampleBufferType {
    case .video:
        if videoWriterInput.isReadyForMoreMediaData {
            videoWriterInput.append(sampleBuffer)
        }
    case .audioMic:
        if microphoneWriterInput.isReadyForMoreMediaData {
            microphoneWriterInput.append(sampleBuffer)
        }
    case .audioApp:
        break
    @unknown default:
        print("unknown")
    }
}

4. Pause and resume:

override func broadcastPaused() {
    super.broadcastPaused()
}

override func broadcastResumed() {
    super.broadcastResumed()
}

Pausing and resuming the recording creates a gap in the video presentation timestamps and a discontinuity in the audio, which I believe explains your symptoms.

What you need to do is measure how long the recording was paused, possibly using the sample buffer timestamps, and then subtract that offset from the presentation timestamps of all subsequent CMSampleBuffers that you process. CMSampleBufferCreateCopyWithNewTiming() can help with this.
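For illustration, here is one way that could be sketched inside the sample handler. This is an assumption-laden sketch, not tested code: the names `pauseOffset` and `pauseStartedAt` are hypothetical additions, and it assumes ReplayKit's sample timestamps come from the host time clock.

```swift
import CoreMedia
import ReplayKit

// Hypothetical additions to SampleHandler (names are illustrative, not from the original code).
private var pauseOffset = CMTime.zero   // total time spent paused so far
private var pauseStartedAt: CMTime?     // host time at which the current pause began

override func broadcastPaused() {
    super.broadcastPaused()
    // Note the moment the pause began, on the same clock the sample timestamps use.
    pauseStartedAt = CMClockGetTime(CMClockGetHostTimeClock())
}

override func broadcastResumed() {
    super.broadcastResumed()
    if let started = pauseStartedAt {
        // Accumulate the pause duration into the running offset.
        let now = CMClockGetTime(CMClockGetHostTimeClock())
        pauseOffset = CMTimeAdd(pauseOffset, CMTimeSubtract(now, started))
        pauseStartedAt = nil
    }
}

// Return a copy of the buffer with its timestamps shifted back by the accumulated
// pause duration, so the writer sees a continuous timeline.
private func retimed(_ sampleBuffer: CMSampleBuffer, by offset: CMTime) -> CMSampleBuffer? {
    guard offset != .zero else { return sampleBuffer }

    var count: CMItemCount = 0
    CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, entryCount: 0,
                                           arrayToFill: nil, entriesNeededOut: &count)
    var timing = [CMSampleTimingInfo](repeating: CMSampleTimingInfo(), count: count)
    CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, entryCount: count,
                                           arrayToFill: &timing, entriesNeededOut: &count)

    for i in 0..<count {
        timing[i].presentationTimeStamp = CMTimeSubtract(timing[i].presentationTimeStamp, offset)
        if timing[i].decodeTimeStamp.isValid {
            timing[i].decodeTimeStamp = CMTimeSubtract(timing[i].decodeTimeStamp, offset)
        }
    }

    var adjusted: CMSampleBuffer?
    CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault,
                                          sampleBuffer: sampleBuffer,
                                          sampleTimingEntryCount: count,
                                          sampleTimingArray: &timing,
                                          sampleBufferOut: &adjusted)
    return adjusted
}
```

In processSampleBuffer you would then append `retimed(sampleBuffer, by: pauseOffset)` instead of the raw buffer, and drop any buffers that arrive while `pauseStartedAt` is non-nil (i.e. during the pause itself).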
