Enable audio input AudioKit v5
I am trying to migrate an app from AudioKit v4 to v5, and I am having a hard time finding documentation on the migration; I can't find these settings in the Cookbook either. Previously we could set defaultToSpeaker and audioInputEnabled through AKSettings.
Now these properties are gone, and I can't find out how to replace them.
v4:
AKSettings.audioInputEnabled = true
AKSettings.defaultToSpeaker = true
Does anyone know how these parameters can be set in the new version? Any feedback is highly appreciated!
Nazarii,
In AudioKit 5, here's how I set up my audio input parameters:
import AudioKit
import AVFoundation

class Conductor {
    static let sharedInstance = Conductor()

    // Instantiate the audio engine and mic input node objects.
    let engine = AudioEngine()
    var mic: AudioEngine.InputNode!

    // Add effects for the mic input.
    var delay: Delay!
    var reverb: Reverb!
    let mixer = Mixer()

    // MARK: Initialize the audio engine settings.
    init() {
        // AVAudioSession requires the AVFoundation framework to be imported.
        do {
            Settings.bufferLength = .medium
            try AVAudioSession.sharedInstance().setPreferredIOBufferDuration(Settings.bufferLength.duration)
            try AVAudioSession.sharedInstance().setCategory(.playAndRecord,
                                                            options: [.defaultToSpeaker, .mixWithOthers, .allowBluetoothA2DP])
            try AVAudioSession.sharedInstance().setActive(true)
        } catch let err {
            print(err)
        }

        // The audio signal path will be:
        // input > mic > delay > reverb > mixer > output

        // Mic is connected to the audio engine's input...
        mic = engine.input
        // Mic goes into the delay...
        delay = Delay(mic)
        delay.time = AUValue(0.5)
        delay.feedback = AUValue(30.0)
        delay.dryWetMix = AUValue(15.0)
        // Delay output goes into the reverb...
        reverb = Reverb(delay)
        reverb.loadFactoryPreset(.largeHall2)
        reverb.dryWetMix = AUValue(0.4)
        // Reverb output goes into the mixer...
        mixer.addInput(reverb)
        // Engine output is connected to the mixer.
        engine.output = mixer

        // Uncomment the following call if you don't want to start and stop
        // the audio engine via the SceneDelegate.
        // startAudioEngine()
    }

    // MARK: Start and stop the audio engine via the SceneDelegate.
    func startAudioEngine() {
        do {
            try engine.start()
            print("Audio engine was started.")
        } catch {
            Log("AudioKit did not start! \(error)")
        }
    }

    func stopAudioEngine() {
        engine.stop()
        print("Audio engine was stopped.")
    }
}
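For completeness, here is a minimal sketch of how the conductor above could be driven from a SceneDelegate, as the comments suggest. This wiring is my own assumption and not part of the original answer; adjust the lifecycle callbacks to fit your app:

```swift
import UIKit

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    // Start the engine when the scene becomes active...
    func sceneDidBecomeActive(_ scene: UIScene) {
        Conductor.sharedInstance.startAudioEngine()
    }

    // ...and stop it when the scene resigns active, releasing the audio session.
    func sceneWillResignActive(_ scene: UIScene) {
        Conductor.sharedInstance.stopAudioEngine()
    }
}
```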
Please let me know if this works for you.
Take care, Mark