
How can I monitor the device's audio output to be able to tell if sound is coming out of the speakers/headphone jack?


I need to kick off an action as soon as the device starts outputting audio. I am using AVPlayer and streaming the audio file from Parse. Approaches such as waiting for (AVPlayer.currentTime() != nil) and (AVPlayer.rate > 0) are not accurate enough; I need to know exactly when audio is coming out of the device. I tried using AVAudioEngine and then attaching an AVAudioNode with an AVAudioNodeBus, but I couldn't get it to work. Any suggestions or tips would be great, thanks!

Here is my AudioEngine code. I instantiate the AudioEngine at the instance level. When creating standardFormat, I don't know what to use for standardFormatWithSampleRate or channels. When I try installTapOnBus, I don't know what to pass for the block, so I set it to nil, but that also throws an error. Any help would be greatly appreciated; I'm new to iOS development, I've read Apple's documentation several times, and I'm simply stuck and can't find an up-to-date example online. (A minimal sketch of a typical installTapOnBus call is shown after the code below.)

class TableViewController: UITableViewController, AVAudioPlayerDelegate {

var iDArray = [String]()
var NameArray = [String]()


var durationInSeconds = Double()

var currentSong = String()




override func viewDidLoad() {
    super.viewDidLoad()



    let ObjectIDQuery = PFQuery(className: "Songs")
    ObjectIDQuery.findObjectsInBackgroundWithBlock {
        (objectsArray: [PFObject]?, error: NSError?) -> Void in

        //objectsArray!.count != 0
        var objectIDs = objectsArray!

        for i in 0..<objectIDs.count {
            self.iDArray.append(objectIDs[i].valueForKey("objectId") as! String)
            self.NameArray.append(objectIDs[i].valueForKey("SongName") as! String)

            self.tableView.reloadData()
        }

    }

    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
        print("AVAudioSession Category Playback OK")
        do {
            try AVAudioSession.sharedInstance().setActive(true)
            print("AVAudioSession is Active")
        } catch let error as NSError {
            print(error.localizedDescription)
        }
    } catch let error as NSError {
        print(error.localizedDescription)
    }



}

func grabSong () {


    let songQuery = PFQuery(className: "Songs")
    songQuery.getObjectInBackgroundWithId(iDArray[SelectedSongNumber], block: {
        (object: PFObject?, error : NSError?) -> Void in


        if let audioFile = object?["SongFile"] as? PFFile {
            let audioFileUrlString: String = audioFile.url!
            let audioFileUrl = NSURL(string: audioFileUrlString)!


            AudioPlayer = AVPlayer(URL: audioFileUrl)
            AudioPlayer.play()
        }

    })

}

func audioFunction() {

    var audioPlayerNode = AVAudioNode()
    var audioBus = AVAudioNodeBus()


    var standardFormat = AVAudioFormat(standardFormatWithSampleRate: <#T##Double#>, channels: <#T##AVAudioChannelCount#>)


    AudioEngine.attachNode(audioPlayerNode)

    audioPlayerNode.outputFormatForBus(audioBus)

    audioPlayerNode.installTapOnBus(audioBus, bufferSize: 100, format: standardFormat, block: nil)

    if AudioEngine.running == true {
        print("the audio engine is running")
    } else {
        print("the audio engine is NOTTT running")
    }

}


func attachNode(audioNode : AVAudioNode) {
    AudioEngine.attachNode(audioNode)

    AudioEngine.outputNode
    print(AudioEngine.outputNode.description)

    if AudioEngine.running == true {
        print("the audio engine is running")
    } else {
        print("the audio engine is NOTTT running")
    }
}

override func tableView(tableView: UITableView, numberOfRowsInSection section: Int) -> Int {

    return iDArray.count
}


override func tableView(tableView: UITableView, cellForRowAtIndexPath indexPath: NSIndexPath) -> UITableViewCell {
    let cell = tableView.dequeueReusableCellWithIdentifier("Cell")
    cell?.textLabel!.text = NameArray[indexPath.row]

    return cell!
}



override func tableView(tableView: UITableView, didSelectRowAtIndexPath indexPath: NSIndexPath) {

    SelectedSongNumber = indexPath.row
    grabSong()
}

}
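
For reference, a minimal sketch of a typical installTapOnBus call, assuming an AVAudioEngine with an attached AVAudioPlayerNode (the engine, node, and function names here are illustrative, not taken from the code above). The format is usually obtained from the node itself via outputFormatForBus rather than built by hand, and the tap block must be non-nil.

import AVFoundation

func installTapSketch() {
    // Illustrative setup: attach a player node and connect it to the mixer.
    let engine = AVAudioEngine()
    let playerNode = AVAudioPlayerNode()
    engine.attachNode(playerNode)
    engine.connect(playerNode, to: engine.mainMixerNode, format: nil)

    // Use the node's own output format instead of guessing sample rate/channels.
    let format = engine.mainMixerNode.outputFormatForBus(0)

    // The tap block receives each rendered buffer once the engine is running.
    engine.mainMixerNode.installTapOnBus(0, bufferSize: 1024, format: format) { (buffer, time) in
        print("tap received \(buffer.frameLength) frames at sample time \(time.sampleTime)")
    }

    do {
        try engine.start()
    } catch {
        print("could not start the audio engine: \(error)")
    }
}

With this shape the block fires for each rendered buffer while the engine is running, which is what passing nil for the block prevents.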

Should I be using AVAudioSession instead? Or AVCaptureSession?

I would use an audio tap on the AVPlayer to know when audio is actually playing / about to play. Basically, you get an audio tap callback right before the audio goes out of the speakers/headphone jack.

One complication: I'm not sure how to get the AVAsset tracks for some stream types (pls, icecast), but remote (and local) mp3 files work fine.

var player: AVPlayer?

func doit() {
    let url = NSURL(string: "URL TO YOUR POSSIBLY REMOTE AUDIO FILE")!
    let asset = AVURLAsset(URL:url)
    let playerItem = AVPlayerItem(asset: asset)

    let tapProcess: @convention(c) (MTAudioProcessingTap, CMItemCount, MTAudioProcessingTapFlags, UnsafeMutablePointer<AudioBufferList>, UnsafeMutablePointer<CMItemCount>, UnsafeMutablePointer<MTAudioProcessingTapFlags>) -> Void = {
        (tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut) -> Void in

        // Audio coming out!
        let status = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut)
        print("get audio: \(status)\n")
    }

    var callbacks = MTAudioProcessingTapCallbacks(
        version: kMTAudioProcessingTapCallbacksVersion_0,
        clientInfo: UnsafeMutablePointer(Unmanaged.passUnretained(self).toOpaque()),
        `init`: nil,
        finalize: nil,
        prepare: nil,
        unprepare: nil,
        process: tapProcess)

    var tap: Unmanaged<MTAudioProcessingTap>?
    let err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks, kMTAudioProcessingTapCreationFlag_PostEffects, &tap)

    if err != noErr {
        // TODO: something
    }

    let audioTrack = playerItem.asset.tracksWithMediaType(AVMediaTypeAudio).first!
    let inputParams = AVMutableAudioMixInputParameters(track: audioTrack)
    inputParams.audioTapProcessor = tap?.takeUnretainedValue()

    let audioMix = AVMutableAudioMix()
    audioMix.inputParameters = [inputParams]

    playerItem.audioMix = audioMix

    player = AVPlayer(playerItem: playerItem)
    player?.play()
}
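
Since the original goal was to kick off an action the moment audio starts, here is a follow-up sketch of the process callback under the same setup as above (the flag and callback names are illustrative, not from the answer). A @convention(c) closure cannot capture self, so a file-level flag is one simple way to signal the first delivered buffer back to the main queue.

import AVFoundation
import MediaToolbox

// Illustrative one-shot flag; a @convention(c) closure cannot capture context.
// (Not thread-safe -- good enough for a sketch.)
var didDeliverFirstBuffer = false

let firstBufferTapProcess: @convention(c) (MTAudioProcessingTap, CMItemCount, MTAudioProcessingTapFlags, UnsafeMutablePointer<AudioBufferList>, UnsafeMutablePointer<CMItemCount>, UnsafeMutablePointer<MTAudioProcessingTapFlags>) -> Void = {
    (tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut) -> Void in

    // Pull the source audio through, exactly as in the answer above.
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut)

    // The first time this fires, audio is about to reach the output route.
    if !didDeliverFirstBuffer {
        didDeliverFirstBuffer = true
        dispatch_async(dispatch_get_main_queue()) {
            print("audio is coming out of the device -- start the action here")
        }
    }
}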
