I want to create something like a hearing aid app: once I tap the "startRecording" UIButton, it should continuously record what I'm saying and simultaneously play it back to me, at the same instant, through my earphones. The goal is to help people with hearing disabilities hear the sounds of the surrounding environment better and louder through earphones.
I am trying to implement it using AVFoundation, with an AVAudioRecorder and an AVAudioPlayer working together on the same file path "filename", in a while loop.
I get this error on the line audioPlayer.delegate = self:
Thread 1: Fatal error: Unexpectedly found nil while unwrapping an Optional value
@IBOutlet weak var startRecording: UIButton!
var recordingSession: AVAudioSession!
var audioRecorder: AVAudioRecorder!
var audioPlayer: AVAudioPlayer!
var fileNameString: String = "test.m4a"

@IBAction func buttonPressed(_ sender: Any) {
    print("button pressed")
    let filename = getDirectory().appendingPathComponent("\(fileNameString)")

    if audioRecorder == nil { // DAF needs to be started
        let settings = [AVFormatIDKey: Int(kAudioFormatAppleLossless),
                        AVEncoderAudioQualityKey: AVAudioQuality.max.rawValue,
                        AVEncoderBitRateKey: 320000,
                        AVNumberOfChannelsKey: 1,
                        AVSampleRateKey: 12000.0] as [String: Any]
        do {
            audioRecorder = try AVAudioRecorder(url: filename, settings: settings)
            audioRecorder.delegate = self
            //audioRecorder.record()
            do {
                audioPlayer = try AVAudioPlayer(contentsOf: filename, fileTypeHint: nil)
            } catch let error {
                print("\(error)")
            }
            audioPlayer.delegate = self // <-- crashes here
            audioPlayer.prepareToPlay()
            while true {
                audioRecorder.record()
                sleep(1)
                audioPlayer.play()
            }
            //startRecording.setTitle("Stop ", for: .normal)
        } catch {
            print("failed")
        }
    } else { // DAF started, needs to stop
        audioRecorder.stop()
        audioRecorder = nil
        startRecording.setTitle("Start", for: .normal)
        playRecording()
    }
}
The crash itself happens because AVAudioPlayer(contentsOf:) throws when the file does not exist yet (nothing has been recorded at that point); your inner catch block only prints the error, so audioPlayer is still nil when you then access its delegate on an implicitly unwrapped optional.
More fundamentally, recording to a file with AVAudioRecorder and reading that same file back with AVAudioPlayer will result in far too much latency for real-time audio, because those APIs write and read files in fairly large blocks or buffers of samples.
A better iOS API to use for your purpose is the Audio Unit API with the RemoteIO Audio Unit. Using the RemoteIO Audio Unit can result in very low latencies from microphone to speaker (or headset). It is a C callback API, however, as Apple currently does not recommend using Swift inside a real-time audio context.
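If you want to stay in Swift for a first version, AVAudioEngine (which sits on top of the same Audio Unit machinery) can also route the microphone to the output with no file I/O at all, at latencies much lower than the record-then-play approach. Below is a minimal sketch under a few stated assumptions: the class name AudioMonitor is made up for illustration, the 5 ms buffer duration is only a request the system may not honor, and headphones must be connected or the mic-to-speaker loop will feed back:

```swift
import AVFoundation

// Hypothetical helper: routes mic input straight to the output via
// AVAudioEngine. Assumes headphones are plugged in to avoid feedback.
class AudioMonitor {
    private let engine = AVAudioEngine()

    func start() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, options: [.allowBluetooth])
        // Request a small I/O buffer for lower latency; the OS may round this.
        try session.setPreferredIOBufferDuration(0.005)
        try session.setActive(true)

        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)

        // Connect the mic directly to the main mixer -- no files involved.
        engine.connect(input, to: engine.mainMixerNode, format: format)
        engine.mainMixerNode.outputVolume = 1.0 // raise to amplify

        try engine.start()
    }

    func stop() {
        engine.stop()
    }
}
```

This replaces your whole record/sleep/play loop: the engine pulls from the mic and pushes to the output continuously on its own render thread, so the button handler only needs to call start() and stop(). For per-sample processing (e.g. frequency shaping for a real hearing aid) you would still want the RemoteIO render callback described above.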