I am using ExtAudioFileCreateWithURL and consistently get a kAudioFileUnsupportedDataFormatError at runtime when creating a stereo LPCM Float32 WAVE file. Note that the exact same procedure works fine with a mono (single-channel) file. Any hints?
Here's the code snippet:
let audioType: AudioFileTypeID = kAudioFileWAVEType
var recordingFormatStream = CAStreamBasicDescription(sampleRate: sampleRate, numChannels: 2, pcmf: .Float32, isInterleaved: false)!
err = ExtAudioFileCreateWithURL(audioFileRecordingURL,
                                audioType,
                                &recordingFormatStream,
                                nil,
                                AudioFileFlags.EraseFile.rawValue,
                                &audioRecordingAudioFile)
where audioFileRecordingURL and audioRecordingAudioFile are correctly typed and initialized.
For the record, recordingFormatStream contains:

mSampleRate = 44100.0
mFormatID = kAudioFormatLinearPCM
mFormatFlags = kAudioFormatFlagsNativeEndian | kAudioFormatFlagIsPacked | kAudioFormatFlagIsFloat | kAudioFormatFlagIsNonInterleaved
mBytesPerPacket = 4, mFramesPerPacket = 1, mBytesPerFrame = 4, mChannelsPerFrame = 2, mBitsPerChannel = 32, mReserved = 0
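For comparison, this is a sketch of what the equivalent *interleaved* stereo Float32 description would look like, built with the plain CoreAudio AudioStreamBasicDescription struct instead of the CAStreamBasicDescription helper (field values follow from the definitions: 2 channels × 4 bytes per sample = 8 bytes per frame, and kAudioFormatFlagIsNonInterleaved is dropped):

```swift
import CoreAudio

// Interleaved stereo Float32 LPCM: one frame holds both channels,
// so mBytesPerFrame and mBytesPerPacket are 8, not 4.
// kAudioFormatFlagsNativeFloatPacked =
//   kAudioFormatFlagIsFloat | kAudioFormatFlagsNativeEndian | kAudioFormatFlagIsPacked
var interleavedStereo = AudioStreamBasicDescription(
    mSampleRate: 44100.0,
    mFormatID: kAudioFormatLinearPCM,
    mFormatFlags: kAudioFormatFlagsNativeFloatPacked,
    mBytesPerPacket: 8,
    mFramesPerPacket: 1,
    mBytesPerFrame: 8,
    mChannelsPerFrame: 2,
    mBitsPerChannel: 32,
    mReserved: 0)
```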
Again: if I change numChannels to 1, everything works fine. I am using the iOS 9.3 SDK.
After much struggle: the ExtAudioFile functions do not accept a non-interleaved stream description as the *file* data format — the data written to a WAVE file must be interleaved. (Only the client data format, set via kExtAudioFileProperty_ClientDataFormat, may be non-interleaved.) With one channel the distinction is moot, which is why the mono case worked.
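A minimal sketch of the fix, assuming the same CAStreamBasicDescription helper, sampleRate, audioFileRecordingURL, and audioRecordingAudioFile variables as in the question: create the file with an interleaved format, and if the processing chain still produces non-interleaved buffers, declare that on the client side and let ExtAudioFile interleave during writes.

```swift
// File data format must be interleaved for a WAVE file.
var fileFormat = CAStreamBasicDescription(sampleRate: sampleRate,
                                          numChannels: 2,
                                          pcmf: .Float32,
                                          isInterleaved: true)!

var err = ExtAudioFileCreateWithURL(audioFileRecordingURL,
                                    kAudioFileWAVEType,
                                    &fileFormat,
                                    nil,
                                    AudioFileFlags.EraseFile.rawValue,
                                    &audioRecordingAudioFile)

// Optional: if your buffers are non-interleaved, set that as the
// client data format; ExtAudioFile converts on write.
var clientFormat = CAStreamBasicDescription(sampleRate: sampleRate,
                                            numChannels: 2,
                                            pcmf: .Float32,
                                            isInterleaved: false)!
err = ExtAudioFileSetProperty(audioRecordingAudioFile,
                              kExtAudioFileProperty_ClientDataFormat,
                              UInt32(sizeofValue(clientFormat)),
                              &clientFormat)
```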
Thanks to this post: Using ExtAudioFileWriteAsync() in callback function. Can't get to run