AKPlayer crashes when playing from buffer on channelCount condition
I am trying hard to make the following scenario work as expected (code provided below):

Record my microphone input and store an AVAudioPCMBuffer in memory. This is done with the AVAudioPCMBuffer extension method copy(from buffer: AVAudioPCMBuffer, readOffset: AVAudioFrameCount = default, frames: AVAudioFrameCount = default). I do get the buffer at the end of the recording.

When the recording ends, pass the buffer to AKPlayer and play it. Here is a code snippet to demonstrate what I do (I know it is not the full app code; I can share it if needed):
private var player: AKPlayer = AKPlayer()
self.player.buffering = .always
// in the record complete callback:
self.player.buffer = self.bufferRecorder?.pcmBuffer
self.player.volume = 1
self.player.play()
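For reference, the copy(from:) extension mentioned above accumulates incoming tap buffers into one preallocated buffer. A minimal sketch of what such a helper might look like (this is not AudioKit's actual implementation; it assumes a Float32, non-interleaved format and a hypothetical name to avoid clashing with the real helper):

```swift
import AVFoundation

extension AVAudioPCMBuffer {
    /// Sketch of an append-style copy helper: writes `buffer`'s samples
    /// after the frames already stored in this buffer.
    /// Assumes both buffers share the same Float32, non-interleaved format.
    func copySketch(from buffer: AVAudioPCMBuffer) {
        guard format == buffer.format,
              frameLength + buffer.frameLength <= frameCapacity,
              let dst = floatChannelData,
              let src = buffer.floatChannelData else { return }
        for channel in 0..<Int(format.channelCount) {
            // Append the incoming frames at the current write position.
            memcpy(dst[channel] + Int(frameLength),
                   src[channel],
                   Int(buffer.frameLength) * MemoryLayout<Float>.size)
        }
        frameLength += buffer.frameLength
    }
}
```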
When I check and debug the app, I can see that the buffer has the correct length and that all of my input/output setup uses the same processing format (sample rate, channels, bit rate, etc.) as the recorded buffer, but my app still crashes on this line:
2018-10-28 08:40:32.625001+0200 BeatmanApp[71037:6731884] [avae] AVAEInternal.h:70:_AVAE_Check:
required condition is false: [AVAudioPlayerNode.mm:665:ScheduleBuffer: (_outputFormat.channelCount == buffer.format.channelCount)]
When I debug and step through the AudioKit code, I can see that the breaking line is in AKPlayer+Playback.swift, line 162, on the call to playerNode.scheduleBuffer.
More information that might be helpful:
Thanks!
OK, this was a super non-cool debugging session. I had to investigate AVAudioEngine and how this kind of scenario is done there, which of course was not the final result I was looking for. Still, that exercise helped me understand how to solve it with AudioKit (half of my app is implemented using AudioKit's tools, so it makes no sense to rewrite it with AVFoundation).

AVFoundation solution:
private let engine = AVAudioEngine()
private let bufferSize = 1024
private let p: AVAudioPlayerNode = AVAudioPlayerNode()

let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(.playAndRecord, mode: .default, options: .defaultToSpeaker)
} catch {
    print("Setting category to AVAudioSessionCategoryPlayback failed.")
}

let inputNode = self.engine.inputNode
engine.connect(inputNode, to: engine.mainMixerNode, format: inputNode.inputFormat(forBus: 0))

// !!! the following lines are the key to the solution.
// !!! the player has to be attached to the engine before it is actually connected
engine.attach(p)
engine.connect(p, to: engine.mainMixerNode, format: inputNode.inputFormat(forBus: 0))

do {
    try engine.start()
} catch {
    print("could not start engine \(error.localizedDescription)")
}

recordBufferAndPlay(duration: 4)
The recordBufferAndPlay function:
func recordBufferAndPlay(duration: Double) {
    let inputNode = self.engine.inputNode
    let total: Double = AVAudioSession.sharedInstance().sampleRate * duration
    let totalBufferSize: UInt32 = UInt32(total)

    let recordedBuffer: AVAudioPCMBuffer! = AVAudioPCMBuffer(pcmFormat: inputNode.inputFormat(forBus: 0), frameCapacity: totalBufferSize)

    var alreadyRecorded = 0
    inputNode.installTap(onBus: 0, bufferSize: 256, format: inputNode.inputFormat(forBus: 0)) {
        (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in
        recordedBuffer.copy(from: buffer) // this helper function is taken from AudioKit!
        alreadyRecorded = alreadyRecorded + Int(buffer.frameLength)
        print(alreadyRecorded, totalBufferSize)

        if alreadyRecorded >= totalBufferSize {
            inputNode.removeTap(onBus: 0)
            self.p.scheduleBuffer(recordedBuffer, at: nil, options: .loops, completionHandler: {
                print("completed playing")
            })
            self.p.play()
        }
    }
}
AudioKit solution:
So in the AudioKit solution, these lines should be called on your AKPlayer object. Note that this should be done before you actually start your engine.
self.player.buffering = .always
AudioKit.engine.attach(self.player.playerNode)
AudioKit.engine.connect(self.player.playerNode, to: self.mixer.inputNode, format: AudioKit.engine.inputNode.outputFormat(forBus: 0))
If the recording is done in a similar way to the AVAudioEngine version, you install a tap on a node (the microphone or another node) and record buffers of PCM samples.
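That tap-based recording might look roughly like this against AudioKit's shared engine. This is a sketch, not AudioKit's API: the accumulation buffer, its capacity, and the frame counter are assumptions for illustration, mirroring the AVAudioEngine function above.

```swift
import AVFoundation

// Sketch: record PCM sample buffers from AudioKit's shared engine input.
let input = AudioKit.engine.inputNode
let format = input.outputFormat(forBus: 0)
let capacity = AVAudioFrameCount(format.sampleRate * 4) // e.g. 4 seconds (assumption)
let recorded = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: capacity)!

var alreadyRecorded: AVAudioFrameCount = 0
input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
    recorded.copy(from: buffer) // AudioKit's helper, as used in the question
    alreadyRecorded += buffer.frameLength
    if alreadyRecorded >= capacity {
        input.removeTap(onBus: 0)
        // hand `recorded` to the AKPlayer configured above, e.g.
        // player.buffer = recorded; player.play()
    }
}
```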