
iOS: EXC_BAD_ACCESS when trying to get audio frame value

I am having a little trouble getting Michael's code to work. Below you will see my implementation. I am trying to get it running on iOS 6.1, but I am getting an EXC_BAD_ACCESS at:

AudioBuffer buffer = bufferList.mBuffers[i];

I am running the code below from the app delegate file when the application finishes loading:

_rec = [[AU alloc] init];

[_rec initializeAudio];

[_rec start];

Thanks in advance.. :)

#define kOutputBus 0
#define kInputBus 1


@implementation AU

AudioComponentInstance audioUnit;
AudioStreamBasicDescription audioFormat;
float *convertedSampleBuffer;
-(id)init{
    self = [super init];
    if (self){

    }
    return self;
}
-(OSStatus)start{
     NSLog(@"start");
    OSStatus status = AudioOutputUnitStart(audioUnit);
    return status;
}

-(OSStatus)stop{
    OSStatus status = AudioOutputUnitStop(audioUnit);
    return status;
}

-(void)cleanUp{
    AudioUnitUninitialize(audioUnit);
}
static OSStatus recordingCallback(void *inRefCon,
                                  AudioUnitRenderActionFlags *actionFlags,
                                  const AudioTimeStamp *audioTimeStamp,
                                  UInt32 inBusNumber,
                                  UInt32 numFrames,
                                  AudioBufferList *buffers) {
    AudioBufferList bufferList;     


    for (int i = 0; i < bufferList.mNumberBuffers; i++)
    {
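        // EXC_BAD_ACCESS is reported on the next line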
        AudioBuffer buffer = bufferList.mBuffers[i];
        SInt16 *audioFrame = (SInt16*)buffer.mData;
    }
    return noErr;
}
-(void)initializeAudio{
    NSLog(@"init");
    OSStatus status;


    AudioComponentDescription desc;
    desc.componentType = kAudioUnitType_Output;
    desc.componentSubType = kAudioUnitSubType_RemoteIO;
    desc.componentFlags = 0;
    desc.componentFlagsMask = 0;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;

    // Get component
    AudioComponent inputComponent = AudioComponentFindNext(NULL, &desc);

    // Get audio units
    status = AudioComponentInstanceNew(inputComponent, &audioUnit);

    // Enable IO for recording
    UInt32 flag = 1;
    status = AudioUnitSetProperty(audioUnit,
                                  kAudioOutputUnitProperty_EnableIO,
                                  kAudioUnitScope_Input,
                                  kInputBus,
                                  &flag,
                                  sizeof(flag));

    // Describe format
    audioFormat.mSampleRate         = 44100.00;
    audioFormat.mFormatID           = kAudioFormatLinearPCM;
    audioFormat.mFormatFlags        = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    audioFormat.mFramesPerPacket    = 1;
    audioFormat.mChannelsPerFrame   = 1;
    audioFormat.mBitsPerChannel     = 16;
    audioFormat.mBytesPerPacket     = 2;
    audioFormat.mBytesPerFrame      = 2;

    // Apply format
    status = AudioUnitSetProperty(audioUnit,
                                  kAudioUnitProperty_StreamFormat,
                                  kAudioUnitScope_Output,
                                  kInputBus,
                                  &audioFormat,
                                  sizeof(audioFormat));

    // Set input callback
    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc = recordingCallback;
    callbackStruct.inputProcRefCon = (__bridge void *)(self);
    status = AudioUnitSetProperty(audioUnit,
                                  kAudioOutputUnitProperty_SetInputCallback,
                                  kAudioUnitScope_Global,
                                  kInputBus,
                                  &callbackStruct,
                                  sizeof(callbackStruct));

    status = AudioUnitInitialize(audioUnit);

}
@end

You need to call AudioUnitRender to get the audio data first (code taken from the page you linked to)

AudioUnitRender([audioInterface audioUnit], 
                             ioActionFlags, 
                             inTimeStamp, 
                             inBusNumber, 
                             inNumberFrames, 
                             bufferList);

Beware that in some scenarios you may also need to malloc your own bufferList to use.
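For reference, here is a minimal sketch of how the callback from the question could be reworked along those lines. It relies on the file-scope audioUnit variable and the mono 16-bit stream format set up in initializeAudio; the single stack-allocated sample buffer is an illustrative choice, not a drop-in replacement.

static OSStatus recordingCallback(void *inRefCon,
                                  AudioUnitRenderActionFlags *actionFlags,
                                  const AudioTimeStamp *audioTimeStamp,
                                  UInt32 inBusNumber,
                                  UInt32 numFrames,
                                  AudioBufferList *buffers) {
    // One mono 16-bit buffer, sized for this render cycle. This matches the
    // stream format configured in initializeAudio (illustrative only).
    SInt16 samples[numFrames];
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mNumberChannels = 1;
    bufferList.mBuffers[0].mDataByteSize = numFrames * sizeof(SInt16);
    bufferList.mBuffers[0].mData = samples;

    // Ask the remote IO unit to fill the buffer with the recorded audio.
    OSStatus status = AudioUnitRender(audioUnit,
                                      actionFlags,
                                      audioTimeStamp,
                                      inBusNumber,
                                      numFrames,
                                      &bufferList);
    if (status != noErr) {
        return status;
    }

    // Only now does bufferList point at valid audio data.
    for (UInt32 i = 0; i < bufferList.mNumberBuffers; i++) {
        AudioBuffer buffer = bufferList.mBuffers[i];
        SInt16 *audioFrame = (SInt16 *)buffer.mData;
        // ... process numFrames samples from audioFrame here ...
    }
    return noErr;
}

If you prefer the malloc approach mentioned above, you could instead allocate the AudioBufferList (and its mData) once when the audio unit is set up and free it in cleanUp, which avoids creating a buffer on every render cycle.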
