
This code to write video+audio through AVAssetWriter and AVAssetWriterInputs is not working. Why?

I've been trying to write video+audio using AVAssetWriter and AVAssetWriterInputs.

I have read multiple posts on this forum from people saying they were able to accomplish this, but it is not working for me. If I write only video, the code does its job very well. When I add audio, the output file is corrupted and cannot be played back.

Here is part of my code:

Setting up AVCaptureVideoDataOutput and AVCaptureAudioDataOutput:

NSError *error = nil;

// Setup the video input
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType: AVMediaTypeVideo];
// Create a device input with the device and add it to the session.
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
// Setup the video output
_videoOutput = [[AVCaptureVideoDataOutput alloc] init];
_videoOutput.alwaysDiscardsLateVideoFrames = NO;
_videoOutput.videoSettings =
    [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];

// Setup the audio input
AVCaptureDevice *audioDevice     = [AVCaptureDevice defaultDeviceWithMediaType: AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error ];     
// Setup the audio output
_audioOutput = [[AVCaptureAudioDataOutput alloc] init];

// Create the session
_capSession = [[AVCaptureSession alloc] init];
[_capSession addInput:videoInput];
[_capSession addInput:audioInput];
[_capSession addOutput:_videoOutput];
[_capSession addOutput:_audioOutput];

_capSession.sessionPreset = AVCaptureSessionPresetLow;     

// Setup the queue
dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
[_videoOutput setSampleBufferDelegate:self queue:queue];
[_audioOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

Setting up the AVAssetWriter and attaching both the audio and video AVAssetWriterInputs to it:

- (BOOL)setupWriter {
    NSError *error = nil;
    _videoWriter = [[AVAssetWriter alloc] initWithURL:videoURL 
                                             fileType:AVFileTypeQuickTimeMovie
                                                error:&error];
    NSParameterAssert(_videoWriter);


    // Add video input
    NSDictionary *videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
                                           [NSNumber numberWithDouble:128.0 * 1024.0], AVVideoAverageBitRateKey,
                                           nil];

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:192], AVVideoWidthKey,
                                   [NSNumber numberWithInt:144], AVVideoHeightKey,
                                   videoCompressionProps, AVVideoCompressionPropertiesKey,
                                   nil];

    _videoWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                            outputSettings:videoSettings] retain];


    NSParameterAssert(_videoWriterInput);
    _videoWriterInput.expectsMediaDataInRealTime = YES;


    // Add the audio input
    AudioChannelLayout acl;
    bzero( &acl, sizeof(acl));
    acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;


    NSDictionary *audioOutputSettings = nil;
    // Both kinds of audio settings cause the output video file to be corrupted.
    if (NO) {
        // AAC: should work from the iPhone 3GS and the 3rd-generation iPod touch onwards
        audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                               [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                               [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                               [NSNumber numberWithInt:64000], AVEncoderBitRateKey,
                               [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
                               nil];
    } else {
        // Apple Lossless: should work on any device, but requires more space
        audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               [NSNumber numberWithInt:kAudioFormatAppleLossless], AVFormatIDKey,
                               [NSNumber numberWithInt:16], AVEncoderBitDepthHintKey,
                               [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                               [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                               [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
                               nil];
    }

    _audioWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                            outputSettings:audioOutputSettings] retain];

    _audioWriterInput.expectsMediaDataInRealTime = YES;

    // add input
    [_videoWriter addInput:_videoWriterInput];
    [_videoWriter addInput:_audioWriterInput];

    return YES;
}

Here are the functions to start/stop video recording:

- (void)startVideoRecording
{
    if (!_isRecording) {
        NSLog(@"start video recording...");
        if (![self setupWriter]) {
             return;
        }
        _isRecording = YES;
    }
}

- (void)stopVideoRecording
{
    if (_isRecording) {
        _isRecording = NO;

        [_videoWriterInput markAsFinished];
        [_videoWriter endSessionAtSourceTime:lastSampleTime];

        [_videoWriter finishWriting];

        NSLog(@"video recording stopped");
    }
}

And finally, the captureOutput delegate code:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (!CMSampleBufferDataIsReady(sampleBuffer)) {
        NSLog( @"sample buffer is not ready. Skipping sample" );
        return;
    }


    if (_isRecording == YES) {
        lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        if (_videoWriter.status != AVAssetWriterStatusWriting ) {
            [_videoWriter startWriting];
            [_videoWriter startSessionAtSourceTime:lastSampleTime];
        }

        if (captureOutput == _videoOutput) {
            [self newVideoSample:sampleBuffer];
        }

        /*
        // If I add audio to the video, then the output file gets corrupted and it cannot be reproduced
        } else {
            [self newAudioSample:sampleBuffer];
        }
    */
    }
}

- (void)newVideoSample:(CMSampleBufferRef)sampleBuffer
{     
    if (_isRecording) {
        if (_videoWriter.status > AVAssetWriterStatusWriting) {
             NSLog(@"Warning: writer status is %d", _videoWriter.status);
             if (_videoWriter.status == AVAssetWriterStatusFailed)
                  NSLog(@"Error: %@", _videoWriter.error);
             return;
        }

        if (![_videoWriterInput appendSampleBuffer:sampleBuffer]) {
             NSLog(@"Unable to write to video input");
        }
    }
}



- (void)newAudioSample:(CMSampleBufferRef)sampleBuffer
{     
    if (_isRecording) {
        if (_videoWriter.status > AVAssetWriterStatusWriting) {
             NSLog(@"Warning: writer status is %d", _videoWriter.status);
             if (_videoWriter.status == AVAssetWriterStatusFailed)
                  NSLog(@"Error: %@", _videoWriter.error);
             return;
        }

        if (![_audioWriterInput appendSampleBuffer:sampleBuffer]) {
             NSLog(@"Unable to write to audio input");
        }
    }
}

I would be very glad if someone could find what the problem is in this code.

In startVideoRecording I call (and I assume you are calling this at some point):

[_capSession startRunning] ;

In stopVideoRecording I do not call:

[_videoWriterInput markAsFinished];
[_videoWriter endSessionAtSourceTime:lastSampleTime];

markAsFinished is more for use with the block-style pull method. See requestMediaDataWhenReadyOnQueue:usingBlock: in AVAssetWriterInput for an explanation. The library should calculate the proper timing for interleaving the buffers.
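As a rough illustration of that pull model (a minimal sketch, not code from either post; writerQueue and nextSampleBufferToWrite are hypothetical names for a dispatch queue and a helper that returns the next buffer to append, or NULL when there is nothing left):

// Pull model: the AVAssetWriterInput asks for data whenever it can accept more.
[_videoWriterInput requestMediaDataWhenReadyOnQueue:writerQueue usingBlock:^{
    while ([_videoWriterInput isReadyForMoreMediaData]) {
        CMSampleBufferRef buffer = [self nextSampleBufferToWrite]; // hypothetical source of buffers
        if (buffer == NULL) {
            // No more data: this is where markAsFinished belongs in the pull model
            [_videoWriterInput markAsFinished];
            break;
        }
        [_videoWriterInput appendSampleBuffer:buffer];
        CFRelease(buffer);
    }
}];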

You do not need to call endSessionAtSourceTime. The last timestamp in the sample data will be used after the call to:

[_videoWriter finishWriting];

I also explicitly check the type of capture output:

else if( captureOutput == _audioOutput) {
    [self newAudioSample:sampleBuffer]; 
}

Here is what I have. The audio and video come through for me. It is possible I changed something; if this does not work for you, then I will post everything I have.

- (void)startVideoRecording
{
    if (!_isRecording) {
        NSLog(@"start video recording...");
        if (![self setupWriter]) {
            NSLog(@"Setup Writer Failed");
            return;
        }

        [_capSession startRunning];
        _isRecording = YES;
    }
}

- (void)stopVideoRecording
{
    if (_isRecording) {
        _isRecording = NO;

        [_capSession stopRunning];

        if (![_videoWriter finishWriting]) {
            NSLog(@"finishWriting returned NO");
        }
        //[_videoWriter endSessionAtSourceTime:lastSampleTime];
        //[_videoWriterInput markAsFinished];
        //[_audioWriterInput markAsFinished];

        NSLog(@"video recording stopped");
    }
}

First, do not use [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], as it is not the native format of the camera. Use [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] instead.
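Applied to the capture setup in the question, that change would look something like this (same _videoOutput variable as above):

// Request the camera's native biplanar YUV format instead of BGRA
_videoOutput.videoSettings =
    [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];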

Also, you should always check before calling startWriting that the writer isn't already running. You do not need to set the session end time, as finishWriting will take care of that.
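One way to express that guard in the captureOutput:didOutputSampleBuffer:fromConnection: callback (a sketch based on the question's code; AVAssetWriterStatusUnknown is the writer's status before startWriting has been called):

if (_videoWriter.status == AVAssetWriterStatusUnknown) {
    // startWriting has not been called yet for this writer
    [_videoWriter startWriting];
    [_videoWriter startSessionAtSourceTime:lastSampleTime];
}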
