
How to use AudioKit to call a function based on frequency

I've seen that it's possible to use the frequency tracker in a function, but I can't pull it off myself. There's an example file that can display frequency, but I haven't found anything that calls a function when the tracker hits a certain frequency.

I'm trying to call the AKAudioPlayer.pause() function when the frequency drops below 40 Hz. I edited a test .mp3 file so that it contains a section of 32 Hz audio. I can even see on the frequency tracker label that the tracker is reading the 32 Hz, but I have no success in getting it to pause, even when I write the function to depend on that value.

This is a modified version of the microphone tracker example, adapted to track an audio file instead.

I receive the error "Unexpectedly found nil while unwrapping an Optional value" both when building the function directly on the tracker.frequency value and when using a variable converted to a string from tracker.frequency.

This leads me to believe that the tracker frequency is the problematic nil optional, and that this is what brings the whole thing down in flames. But I can't figure out what to do about it. I know there is a function that successfully displays the frequency, but I can't recreate its success.

In the code below (mostly an AudioKit example file) I've tried to use tracker.frequency to trigger the pause function:

import AudioKit
import AudioKitUI
import UIKit


class ViewController: UIViewController {

    @IBOutlet private var frequencyLabel: UILabel!
    @IBOutlet private var amplitudeLabel: UILabel!
    @IBOutlet private var noteNameWithSharpsLabel: UILabel!
    @IBOutlet private var noteNameWithFlatsLabel: UILabel!
    @IBOutlet private var audioInputPlot: EZAudioPlot!


    @IBAction func Play(_ sender: Any) {
        if input.isStarted == false {
            input.play()
        }
    }

    @IBAction func Pause(_ sender: Any) {
        if input.isStarted {
            input.pause()
        }
    }



    var input: AKAudioPlayer!
    var song = try! AKAudioFile(readFileName: "TEST.mp3") // ... the error is here

    var tracker: AKFrequencyTracker!
    var silence: AKBooster!

    let noteFrequencies = [16.35, 17.32, 18.35, 19.45, 20.6, 21.83, 23.12, 24.5, 25.96, 27.5, 29.14, 30.87]
    let noteNamesWithSharps = ["C", "C♯", "D", "D♯", "E", "F", "F♯", "G", "G♯", "A", "A♯", "B"]
    let noteNamesWithFlats = ["C", "D♭", "D", "E♭", "E", "F", "G♭", "G", "A♭", "A", "B♭", "B"]

    func setupPlot() {
        let plot = AKNodeOutputPlot(input, frame: audioInputPlot.bounds)
        plot.plotType = .rolling
        plot.shouldFill = true
        plot.shouldMirror = true
        plot.color = UIColor.blue
        audioInputPlot.addSubview(plot)
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        pauseOnQueue()

        AKSettings.audioInputEnabled = true
        do {
            input = try AKAudioPlayer(file: song)
        } catch {
            print("ERROR")
        }
        tracker = AKFrequencyTracker(input)
        silence = AKBooster(tracker, gain: 1)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        AudioKit.output = silence
        AudioKit.start()
        setupPlot()


        Timer.scheduledTimer(timeInterval: 0.1,
                             target: self,
                             selector: #selector(ViewController.updateUI),
                             userInfo: nil,
                             repeats: true)
    }




    @objc func updateUI() {
        if tracker.amplitude > 0.1 {
            frequencyLabel.text = String(format: "%0.1f", tracker.frequency)

            var frequency = Float(tracker.frequency)
            while frequency > Float(noteFrequencies[noteFrequencies.count - 1]) {
                frequency /= 2.0
            }
            while frequency < Float(noteFrequencies[0]) {
                frequency *= 2.0
            }

            var minDistance: Float = 10_000.0
            var index = 0

            for i in 0..<noteFrequencies.count {
                let distance = fabsf(Float(noteFrequencies[i]) - frequency)
                if distance < minDistance {
                    index = i
                    minDistance = distance
                }
            }
            let octave = Int(log2f(Float(tracker.frequency) / frequency))
            noteNameWithSharpsLabel.text = "\(noteNamesWithSharps[index])\(octave)"
            noteNameWithFlatsLabel.text = "\(noteNamesWithFlats[index])\(octave)"
        }
        amplitudeLabel.text = String(format: "%0.2f", tracker.amplitude)
    }

    func pauseOnQueue() {

        frequencyLabel.text = String(format: "%0.1f", tracker.frequency)

        let frequency = Float(tracker.frequency)

        if frequency < 50 && frequency > 20 && input.isStarted == true {
            input.pause()
        }
    }

}

I've answered a similar question here: AudioKit (iOS) - Add observer for frequency / amplitude change

There's a decision you have to make about how you want to detect this frequency: is it something that happens fast and needs to be handled in the DSP code, or can you just poll for the frequency at semi-regular intervals from Swift?
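If polling is good enough, a minimal sketch of that approach could look like the following. It assumes the same AudioKit 4.x nodes used in the question; the FrequencyWatcher name, the 0.1 s interval, and the 40 Hz threshold are only illustrative, not an official AudioKit API:

import AudioKit
import Foundation

// Sketch only: poll the tracker from Swift and pause the player when the
// detected frequency drops below a threshold. Assumes AudioKit 4.x.
class FrequencyWatcher {

    let player: AKAudioPlayer
    let tracker: AKFrequencyTracker
    private var timer: Timer?

    init(player: AKAudioPlayer, tracker: AKFrequencyTracker) {
        self.player = player
        self.tracker = tracker
    }

    // Check roughly ten times per second; adjust the interval as needed.
    func start() {
        timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] _ in
            self?.checkFrequency()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }

    private func checkFrequency() {
        // Only trust the reading when there is enough signal; the tracker
        // reports junk values during silence.
        guard tracker.amplitude > 0.1 else { return }
        if tracker.frequency < 40 && player.isStarted {
            player.pause()
        }
    }
}

The view controller would create one of these only after the nodes exist (for example at the end of viewDidLoad) and then call start(), so nothing ever reads tracker.frequency before the tracker is set up.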

I don't think you should add !s to the AudioKit node definitions, because the nodes might not exist at the time you're expecting them to, even though it seems like they should from the view controller life cycle. In general, don't rely on view controller code to control your audio. Perhaps put all the audio-related stuff in a singleton and let it manage itself?
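A rough sketch of that idea, again assuming the AudioKit 4.x API from the question (the Conductor name is arbitrary, and the force-tries are only for brevity):

import AudioKit

// Sketch only: a singleton that owns the whole signal chain, so every node
// exists before anything reads tracker.frequency, and none of the nodes
// needs to be an implicitly unwrapped optional.
class Conductor {

    static let shared = Conductor()

    let player: AKAudioPlayer
    let tracker: AKFrequencyTracker
    let silence: AKBooster

    private init() {
        // Build the chain up front; handle these errors properly in real code.
        let file = try! AKAudioFile(readFileName: "TEST.mp3")
        player = try! AKAudioPlayer(file: file)
        tracker = AKFrequencyTracker(player)
        silence = AKBooster(tracker, gain: 1)

        AudioKit.output = silence
        AudioKit.start()
    }

    // Pause the player if the tracked frequency falls below the threshold.
    func pauseIfBelow(_ threshold: Double) {
        if tracker.frequency < threshold && player.isStarted {
            player.pause()
        }
    }
}

The view controller then only reads from the singleton, for example calling Conductor.shared.pauseIfBelow(40) from a timer, and the audio graph no longer depends on the view controller life cycle.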
