
Record and play audio with AVAssetWriter

I have cut this question down considerably, in the hope that it helps.

Basically, this class has two methods: one to start recording audio (-recordMode) and one to play it back (-playMode). I currently have the class in a project with a single view controller that has two buttons, each calling the corresponding method (record, play). There are no other variables; the class is self-contained.
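For context, here is a minimal sketch of the kind of view controller wiring described above. It is only an illustration: the controller name, the property name, and the action names are hypothetical and not part of the original project.

#import <UIKit/UIKit.h>
#import "MicCommunicator.h"

@interface RecorderViewController : UIViewController
@property (nonatomic, retain) MicCommunicator *mic; // hypothetical property
@end

@implementation RecorderViewController
@synthesize mic = _mic;

- (void)viewDidLoad {
    [super viewDidLoad];
    self.mic = [[[MicCommunicator alloc] init] autorelease]; // MRC, matching the question's code
}

// hypothetical actions wired to the two buttons
- (IBAction)recordTapped:(id)sender { [self.mic recordMode]; }
- (IBAction)playTapped:(id)sender   { [self.mic playMode]; }

- (void)dealloc {
    [_mic release];
    [super dealloc];
}
@end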

However, it does not record or play back anything, and I cannot figure out why. When I try to play the file, I get a file size of 0 and an error, since of course you cannot initialize an AVAudioPlayer with a nil reference. But I do not understand why the file is empty, or why self.outputPath is nil.

.h file:

#import <AVFoundation/AVFoundation.h>

@interface MicCommunicator : NSObject<AVCaptureAudioDataOutputSampleBufferDelegate>

@property(nonatomic,retain) NSURL *outputPath;
@property(nonatomic,retain) AVCaptureSession * captureSession;
@property(nonatomic,retain) AVCaptureAudioDataOutput * output;

-(void)beginStreaming;
-(void)playMode;
-(void)recordMode;

@end

.m file:

#import "MicCommunicator.h"

@implementation MicCommunicator {
    AVAssetWriter *assetWriter;
    AVAssetWriterInput *assetWriterInput;
}

@synthesize captureSession = _captureSession;
@synthesize output = _output;
@synthesize outputPath = _outputPath;

-(id)init {
    if ((self = [super init])) {
        NSArray *searchPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        self.outputPath = [NSURL fileURLWithPath:[[searchPaths objectAtIndex:0] stringByAppendingPathComponent:@"micOutput.output"]];

        AudioChannelLayout acl;
        bzero(&acl, sizeof(acl));
        acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono; //kAudioChannelLayoutTag_Stereo;
        NSDictionary *audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                             [NSNumber numberWithInt: kAudioFormatULaw],AVFormatIDKey,        
                                             [NSNumber numberWithFloat:8000.0],AVSampleRateKey,//was 44100.0
                                             [NSData dataWithBytes: &acl length: sizeof( AudioChannelLayout ) ], AVChannelLayoutKey,
                                             [NSNumber numberWithInt:1],AVNumberOfChannelsKey,
                                             [NSNumber numberWithInt:8000],AVEncoderBitRateKey,
                                             nil];

        assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings];
        [assetWriterInput setExpectsMediaDataInRealTime:YES];

        assetWriter = [[AVAssetWriter assetWriterWithURL:_outputPath fileType:AVFileTypeWAVE error:nil] retain];
        [assetWriter addInput:assetWriterInput];
    }
    return self;
}

-(void)dealloc {
    [assetWriter release];
    [super dealloc];
}

//conveniance methods

-(void)playMode
{
    [self stopRecording];

    NSError *error;
    AVAudioPlayer * audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:self.outputPath error:&error];
    audioPlayer.numberOfLoops = -1;

    if (audioPlayer == nil){
        NSLog(@"error: %@",[error description]);        
    }else{ 
        NSLog(@"playing");  
        [audioPlayer play];
    }
}

-(void)recordMode
{
        [self beginStreaming];    
}

-(void)stopRecording
{
    [self.captureSession stopRunning];
    [assetWriterInput markAsFinished];
    [assetWriter finishWriting];

    // use the URL's filesystem path here, not the URL's description
    NSDictionary *outputFileAttributes = [[NSFileManager defaultManager] attributesOfItemAtPath:[self.outputPath path] error:nil];
    NSLog (@"done. file size is %llu", [outputFileAttributes fileSize]);
}

//starts audio recording
-(void)beginStreaming {
    self.captureSession = [[AVCaptureSession alloc] init];
    AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    NSError *error = nil;
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
    if (audioInput)
        [self.captureSession addInput:audioInput];
    else {
        NSLog(@"No audio input found.");
        return;
    }

    AVCaptureAudioDataOutput *output = [[AVCaptureAudioDataOutput alloc] init];

    dispatch_queue_t outputQueue = dispatch_queue_create("micOutputDispatchQueue", NULL);
    [output setSampleBufferDelegate:self queue:outputQueue];
    dispatch_release(outputQueue);

    [self.captureSession addOutput:output];
    [assetWriter startWriting];
    [self.captureSession startRunning];
}

//callback
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    AudioBufferList audioBufferList;
    NSMutableData *data= [[NSMutableData alloc] init];
    CMBlockBufferRef blockBuffer;
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, NULL, &audioBufferList, sizeof(audioBufferList), NULL, NULL, 0, &blockBuffer);

    //for (int y = 0; y < audioBufferList.mNumberBuffers; y++) {
    //  AudioBuffer audioBuffer = audioBufferList.mBuffers[y];
    //  Float32 *frame = (Float32*)audioBuffer.mData;
    //          
    //  [data appendBytes:frame length:audioBuffer.mDataByteSize];
    //}

    // append [data bytes] to your NSOutputStream 


    // These two lines write to disk, you may not need this, just providing an example
    [assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    [assetWriterInput appendSampleBuffer:sampleBuffer];

    CFRelease(blockBuffer);
    blockBuffer=NULL;
    [data release];
}

@end
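One way to narrow down why the file ends up empty is to check the return values and the writer's error state, which the code above ignores. A debugging sketch (assuming the same ivars as in the class above; this is not part of the original code):

// in -stopRecording, after marking the input as finished
if (![assetWriter finishWriting]) {
    // AVAssetWriterStatusFailed here usually means an earlier append already failed
    NSLog(@"finishWriting failed, status: %ld, error: %@",
          (long)assetWriter.status, assetWriter.error);
}

// in the capture callback, instead of ignoring the result of the append
if (![assetWriterInput appendSampleBuffer:sampleBuffer]) {
    NSLog(@"appendSampleBuffer failed, writer error: %@", assetWriter.error);
}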

Per Apple support:

So this is a bug: the file is created, a number of samples are written successfully, and then the append fails for some unknown reason.

It seems that AVAssetWriter fails only with these settings.

AudioQueue is the thing to use for ulaw audio.
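If AVAssetWriter fails only with the μ-law settings, one workaround (my suggestion, not from Apple's reply) is to keep the AVAssetWriter pipeline but write uncompressed Linear PCM into the WAVE file, converting to μ-law afterwards if it is needed on the wire. A sketch of such output settings, assuming 16-bit mono PCM at 8 kHz:

AudioChannelLayout acl;
bzero(&acl, sizeof(acl));
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;

NSDictionary *pcmSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
    [NSNumber numberWithFloat:8000.0], AVSampleRateKey,
    [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
    [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
    [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,       // 16-bit samples
    [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,        // integer, not float
    [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,    // little-endian, as WAVE expects
    [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,  // interleaved
    nil];

assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                      outputSettings:pcmSettings];

Note that there is no AVEncoderBitRateKey here; for uncompressed PCM the bit rate follows from the sample rate, channel count, and bit depth.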
