
How to use AVFAudio's SDK to Record, Play and save audio

I've been trying to implement AVFoundation's AVFAudio framework in order to record audio, play audio, and change the audio data per the user's selected presets. I've also been trying to find out how to save files locally to the user's device; however, upon reading Apple's documentation on AVFAudio, I can hardly make sense of which steps to take when creating these files. I've been following along with https://www.raywenderlich.com/21868250-audio-with-avfoundation/lessons/1 and managed to set up some functions here.

Here I have set up saving the audio, but as you can see, this only saves the audio to a temporary directory. I am wondering how I can save the audio file locally to the user's device.

// MARK: Saving audio
    var urlForVocals: URL {
        let fileManager = FileManager.default
        let tempDirectory = fileManager.temporaryDirectory
        let filePath = "TempVocalRecording.caf"
        return tempDirectory.appendingPathComponent(filePath)
    }
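For persistence, the URL can point at the app's Documents directory instead of the temporary directory, which the system is free to purge. A minimal sketch of that change, reusing the same file name:

```swift
import Foundation

// A persistent location for the same recording file.
// Files in the Documents directory survive app restarts and
// are backed up, unlike the temporary directory.
var persistentVocalURL: URL {
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    return documents.appendingPathComponent("TempVocalRecording.caf")
}
```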
    

I am generally confused about the AVFoundation framework when using AVFAudio, and the documentation at https://developer.apple.com/documentation/avfaudio does not go into specifics of how to implement each method. For example, the docs state that to create an audio player we need init(contentsOf: url), but they don't explain what the url is or why we are using it. Can anyone help me understand what steps to take next? I feel like I'm running around in circles trying to understand this framework and the Apple documentation.
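To the specific question: the `url` in `init(contentsOf:)` is simply the location of the audio file you want to play, typically the same URL you recorded to. A hedged sketch (the function name is mine, not from any API):

```swift
import AVFoundation

// Creates a player for the audio file at `url`.
// The URL identifies which file on disk the player should load —
// usually the URL you previously handed to AVAudioRecorder.
func makePlayer(for url: URL) -> AVAudioPlayer? {
    do {
        // init(contentsOf:) reads the file at `url` into the player
        let player = try AVAudioPlayer(contentsOf: url)
        player.prepareToPlay() // preload buffers so play() starts promptly
        return player
    } catch {
        // Thrown if the file is missing or not a readable audio format
        print("Could not create player: \(error)")
        return nil
    }
}
```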

Here's a relatively bare-bones version. See inline comments for what is happening.

import SwiftUI
import AVFoundation

class AudioManager : ObservableObject {
    @Published var canRecord = false
    @Published var isRecording = false
    @Published var audioFileURL : URL?
    private var audioPlayer : AVAudioPlayer?
    private var audioRecorder : AVAudioRecorder?
    
    init() {
        //ask for record permission. IMPORTANT: Make sure you've set `NSMicrophoneUsageDescription` in your Info.plist
        AVAudioSession.sharedInstance().requestRecordPermission() { [unowned self] allowed in
            DispatchQueue.main.async {
                if allowed {
                    self.canRecord = true
                } else {
                    self.canRecord = false
                }
            }
        }
    }

    //the URL where the recording file will be stored
    private var recordingURL : URL {
        getDocumentsDirectory().appendingPathComponent("recording.caf")
    }

    private func getDocumentsDirectory() -> URL {
        let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
        return paths[0]
    }
    
    func recordFile() {
        do {
            //set the audio session so we can record
            try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default)
            try AVAudioSession.sharedInstance().setActive(true)
            
        } catch {
            print(error)
            self.canRecord = false
            fatalError()
        }
        //this describes the format the that the file will be recorded in
        let settings = [
            AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
            AVSampleRateKey: 12000,
            AVNumberOfChannelsKey: 1,
            AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
        ]
        do {
            //create the recorder, pointing towards the URL from above
            audioRecorder = try AVAudioRecorder(url: recordingURL,
                                                settings: settings)
            audioRecorder?.record() //start the recording
            isRecording = true
        } catch {
            print(error)
            isRecording = false
        }
    }
    
    func stopRecording() {
        audioRecorder?.stop()
        isRecording = false
        audioFileURL = recordingURL
    }
    
    func playRecordedFile() {
        guard let audioFileURL = audioFileURL else {
            return
        }
        do {
            //create a player, again pointing towards the same URL
            self.audioPlayer = try AVAudioPlayer(contentsOf: audioFileURL)
            self.audioPlayer?.play()
        } catch {
            print(error)
        }
    }
}

struct ContentView: View {
    
    @StateObject private var audioManager = AudioManager()
    
    var body: some View
    {
        VStack {
            if !audioManager.isRecording && audioManager.canRecord {
                Button("Record") {
                    audioManager.recordFile()
                }
            } else {
                Button("Stop") {
                    audioManager.stopRecording()
                }
            }
            
            if audioManager.audioFileURL != nil && !audioManager.isRecording {
                Button("Play") {
                    audioManager.playRecordedFile()
                }
            }
        }
    }
}
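While debugging save locations, it can help to confirm the recording was actually written to disk before trying to play it. A small sketch (the helper name is mine; `url` is assumed to be the recorder's output URL from above):

```swift
import Foundation

// Returns true if a file exists at the recording URL —
// handy for verifying that stopRecording() left a file behind.
func recordingExists(at url: URL) -> Bool {
    FileManager.default.fileExists(atPath: url.path)
}
```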
