
How to play audio from AVAudioPCMBuffer converted from NSData

I am receiving 16-bit mono PCM audio data from UDP packets like this:

- (void)udpSocket:(GCDAsyncUdpSocket *)sock didReceiveData:(NSData *)data
                                               fromAddress:(NSData *)address
                                         withFilterContext:(id)filterContext
{
...
}

I convert this data into a PCM buffer by calling the following Swift function:

func toPCMBuffer(data: NSData) -> AVAudioPCMBuffer {
    let audioFormat = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)  // given NSData audio format
    var PCMBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity:1024*10)
    PCMBuffer.frameLength = PCMBuffer.frameCapacity

    let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: Int(PCMBuffer.format.channelCount))

    data.getBytes(UnsafeMutablePointer<Void>(channels[0]) , length: data.length)

    return PCMBuffer
}
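One likely reason the buffer plays silence is visible in this conversion: the incoming UDP payload is 16-bit integer PCM, but the buffer's format is declared as Float32, so `getBytes` copies raw Int16 bytes into memory the engine then interprets as floats. A minimal sketch of the sample conversion that would be needed (pure Swift; the helper name is hypothetical, and it assumes little-endian samples normalized into [-1, 1]):

```swift
import Foundation

// Hypothetical helper: convert raw little-endian 16-bit PCM bytes into
// normalized Float32 samples, as expected by a Float32 PCM buffer.
func int16ToFloat32(_ data: Data) -> [Float] {
    let sampleCount = data.count / MemoryLayout<Int16>.size
    var samples = [Float](repeating: 0, count: sampleCount)
    data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
        for i in 0..<sampleCount {
            // Assemble each little-endian Int16 from its two bytes,
            // then scale into the Float32 range [-1.0, 1.0).
            let lo = UInt16(raw[2 * i])
            let hi = UInt16(raw[2 * i + 1])
            let sample = Int16(bitPattern: (hi << 8) | lo)
            samples[i] = Float(sample) / 32768.0
        }
    }
    return samples
}
```

The resulting `[Float]` could then be copied into `floatChannelData[0]` instead of the raw bytes.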

The data is converted to a PCM buffer and I can see its length in the logs, but when I try to play the buffer I hear nothing. Here is the code for receiving and playing:

func toPCMBuffer(data: NSData) -> AVAudioPCMBuffer {
        let audioFormat = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)  // given NSData audio format
        var PCMBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity:1024*10)
        PCMBuffer.frameLength = PCMBuffer.frameCapacity

        let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: Int(PCMBuffer.format.channelCount))

        data.getBytes(UnsafeMutablePointer<Void>(channels[0]) , length: data.length)
        var mainMixer = audioEngine.mainMixerNode
        audioEngine.attachNode(audioFilePlayer)
        audioEngine.connect(audioFilePlayer, to:mainMixer, format: PCMBuffer.format)
        audioEngine.startAndReturnError(nil)

        audioFilePlayer.play()
        audioFilePlayer.scheduleBuffer(PCMBuffer, atTime: nil, options: nil, completionHandler: nil)
        return PCMBuffer
    }
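Separately, both versions set `frameLength` to the full `frameCapacity` (10240 frames) regardless of how many bytes actually arrived, so the engine renders mostly uninitialized frames. The frame count should instead be derived from the payload size; a minimal sketch of that arithmetic (hypothetical helper, assuming 16-bit mono PCM):

```swift
// Frames = bytes / (bytesPerSample * channelCount).
// For 16-bit mono PCM, e.g., a 2048-byte UDP payload holds 1024 frames.
func frameCount(forByteCount byteCount: Int,
                bytesPerSample: Int = 2,
                channelCount: Int = 1) -> Int {
    return byteCount / (bytesPerSample * channelCount)
}
```

The result would be assigned to `PCMBuffer.frameLength` after copying the samples in.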

I ended up using an Objective-C function; the data is getting converted fine:

-(AudioBufferList *) getBufferListFromData: (NSData *) data
{
    if (data.length > 0)
    {
        NSUInteger len = [data length];
        //I guess you can use Byte*, void* or Float32*. I am not sure if that makes any difference.
        Byte *byteData = (Byte *)malloc(len);
        if (byteData)
        {
            memcpy(byteData, [data bytes], len);
            AudioBufferList *theDataBuffer = (AudioBufferList *)malloc(sizeof(AudioBufferList));
            theDataBuffer->mNumberBuffers = 1;
            theDataBuffer->mBuffers[0].mDataByteSize = (UInt32)len;
            theDataBuffer->mBuffers[0].mNumberChannels = 1;
            theDataBuffer->mBuffers[0].mData = byteData;
            // The caller owns both allocations and must free them.
            return theDataBuffer;
        }
    }
    return NULL;
}
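The function above hands back two malloc'd blocks (the `AudioBufferList` and the byte buffer) that the caller must remember to free. A hedged alternative in Swift is to keep the payload inside a `Data` value and expose a pointer only for the duration of a closure, so ownership never leaves the caller (the helper name is hypothetical):

```swift
import Foundation

// Hand out a temporary raw pointer to the PCM bytes; the Data value
// retains ownership, so nothing has to be free'd manually afterwards.
func withPCMBytes<T>(_ data: Data, _ body: (UnsafeRawBufferPointer) -> T) -> T {
    return data.withUnsafeBytes { body($0) }
}
```

The pointer is only valid inside the closure, which is exactly the lifetime needed to fill an audio buffer from a single UDP packet.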
