
How to save recorded video using AVAssetWriter?

I have tried many other blogs and Stack Overflow answers, but I haven't found a solution. I am able to create a custom camera with a preview. I need the video with a custom frame size, which is why I am using AVAssetWriter. But I am unable to save the recorded video into the documents directory. I tried like this:

-(void)initilizeCameraConfigurations {

    if(!captureSession) {
        captureSession = [[AVCaptureSession alloc] init];
        [captureSession beginConfiguration];
        captureSession.sessionPreset = AVCaptureSessionPresetHigh;
        self.view.backgroundColor = UIColor.blackColor;
        CGRect bounds = self.view.bounds;
        captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
        captureVideoPreviewLayer.backgroundColor = [UIColor clearColor].CGColor;
        captureVideoPreviewLayer.bounds = self.view.frame;
        captureVideoPreviewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
        captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        captureVideoPreviewLayer.position = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
        [self.view.layer addSublayer:captureVideoPreviewLayer];
        [self.view bringSubviewToFront:self.controlsBgView];
    }

    // Add input to session
    NSError *err;
    videoCaptureDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&err];
    if([captureSession canAddInput:videoCaptureDeviceInput]) {
        [captureSession addInput:videoCaptureDeviceInput];
    }

    docPathUrl = [[NSURL alloc] initFileURLWithPath:[self getDocumentsUrl]];

    assetWriter = [AVAssetWriter assetWriterWithURL:docPathUrl fileType:AVFileTypeQuickTimeMovie error:&err];
    NSParameterAssert(assetWriter);
    //assetWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, 1000);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:300], AVVideoWidthKey,
                                   [NSNumber numberWithInt:300], AVVideoHeightKey,
                                   nil];

    writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    writerInput.expectsMediaDataInRealTime = YES;
    writerInput.transform = CGAffineTransformMakeRotation(M_PI);

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                           [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                                           [NSNumber numberWithInt:300], kCVPixelBufferWidthKey,
                                                           [NSNumber numberWithInt:300], kCVPixelBufferHeightKey,
                                                           nil];

    assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];

    if([assetWriter canAddInput:writerInput]) {
        [assetWriter addInput:writerInput];
    }

    // Set video stabilization mode to preview layer
    AVCaptureVideoStabilizationMode stablilizationMode = AVCaptureVideoStabilizationModeCinematic;
    if([videoCaptureDevice.activeFormat isVideoStabilizationModeSupported:stablilizationMode]) {
        [captureVideoPreviewLayer.connection setPreferredVideoStabilizationMode:stablilizationMode];
    }

    // image output
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [captureSession addOutput:stillImageOutput];

    [captureSession commitConfiguration];
    if(![captureVideoPreviewLayer.connection isEnabled]) {
        [captureVideoPreviewLayer.connection setEnabled:YES];
    }
    [captureSession startRunning];
}

-(IBAction)startStopVideoRecording:(id)sender {
    if(captureSession) {
        if(isVideoRecording) {
            [writerInput markAsFinished];
            [assetWriter finishWritingWithCompletionHandler:^{
                NSLog(@"Finished writing...checking completion status...");
                if(assetWriter.status != AVAssetWriterStatusFailed && assetWriter.status == AVAssetWriterStatusCompleted) {
                    // Video saved
                } else {
                    NSLog(@"#123 Video writing failed: %@", assetWriter.error);
                }
            }];
        } else {
            [assetWriter startWriting];
            [assetWriter startSessionAtSourceTime:kCMTimeZero];
            isVideoRecording = YES;
        }
    }
}

-(NSString *)getDocumentsUrl {
    NSString *docPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
    docPath = [[docPath stringByAppendingPathComponent:@"Movie"] stringByAppendingString:@".mov"];
    if([[NSFileManager defaultManager] fileExistsAtPath:docPath]) {
        NSError *err;
        [[NSFileManager defaultManager] removeItemAtPath:docPath error:&err];
    }
    NSLog(@"Movie path : %@", docPath);
    return docPath;
}

@end

Please correct me if I'm doing anything wrong. Thank you in advance.

You don't say what actually goes wrong, but two things stand out in your code:

docPath = [[docPath stringByAppendingPathComponent:@"Movie"] stringByAppendingString:@".mov"];

This does produce @"/path/Movie.mov", but chaining a path-component append with a plain string append is fragile and easy to misread; the clearer form is:

docPath = [docPath stringByAppendingPathComponent:@"Movie.mov"];

More importantly, your timeline is wrong. Your asset writer starts its session at time zero, but the sample buffers delivered by the capture session carry presentation timestamps (CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) well above zero, so instead do this:

-(void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if(firstSampleBuffer) { // BOOL ivar, set to YES before each recording
        firstSampleBuffer = NO;
        [assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }

    [writerInput appendSampleBuffer:sampleBuffer];
}

Conceptually, you have two main functional areas: one that generates video frames – the AVCaptureSession and everything attached to it – and one that writes those frames to a file – in your case the AVAssetWriter with its attached inputs.
The problem with your code is that there is no connection between the two: no video frames / images coming out of the capture session are ever passed to the asset writer inputs.

Furthermore, the AVCaptureStillImageOutput method -captureStillImageAsynchronouslyFromConnection:completionHandler: is never called, so the capture session never actually produces any frames.

So, as a minimum, implement something like this:

-(IBAction)captureStillImageAndAppend:(id)sender
{
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageOutput.connections.firstObject completionHandler:
        ^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
        {
            // check error, omitted here
            if (CMTIME_IS_INVALID(startTime)) // startTime is an ivar
                [assetWriter startSessionAtSourceTime:(startTime = CMSampleBufferGetPresentationTimeStamp(imageDataSampleBuffer))];
            [writerInput appendSampleBuffer:imageDataSampleBuffer];
        }];
}
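
Note that, as in your startStopVideoRecording: action, -startWriting still has to be called on the asset writer before the first buffer is appended; appending to a writer that hasn't been started will fail.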

Remove the AVAssetWriterInputPixelBufferAdaptor; it's not used.

But there are issues with AVCaptureStillImageOutput:

  • it's only intended to produce still images, not videos

  • it must be configured to produce uncompressed sample buffers if the asset writer input is configured to compress the appended sample buffers:

    stillImageOutput.outputSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };

  • it's deprecated under iOS (as of iOS 10, in favor of AVCapturePhotoOutput)

If you actually want to produce a video, as opposed to a sequence of still images, then instead of the AVCaptureStillImageOutput add an AVCaptureVideoDataOutput to the capture session. It needs a delegate and a serial dispatch queue on which to output the sample buffers. The delegate has to implement something like this:

-(void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (CMTIME_IS_INVALID(startTime)) // startTime is an ivar
        [assetWriter startSessionAtSourceTime:(startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer))];
    [writerInput appendSampleBuffer:sampleBuffer];
}
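
For completeness, the wiring between the capture session and the data output might look like the sketch below; the videoDataOutput ivar and the queue label are assumptions, not names from your code:

videoDataOutput = [[AVCaptureVideoDataOutput alloc] init]; // assumed ivar
// Ask for a pixel format the H.264 encoder accepts.
videoDataOutput.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };
// Sample buffers are delivered serially on this queue.
dispatch_queue_t sampleBufferQueue = dispatch_queue_create("sample.buffer.queue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:sampleBufferQueue];
if ([captureSession canAddOutput:videoDataOutput]) {
    [captureSession addOutput:videoDataOutput];
}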

Note that

  • you will want to make sure that the AVCaptureVideoDataOutput only outputs frames while you're actually recording; add/remove it from the capture session, or enable/disable its connection, in the startStopVideoRecording action (see the sketch after this list)

  • you have to reset startTime to kCMTimeInvalid before starting another recording
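
Putting both notes together, the recording toggle could look roughly like the sketch below, assuming the videoDataOutput, startTime and isVideoRecording ivars; error handling is omitted:

-(IBAction)startStopVideoRecording:(id)sender
{
    if (isVideoRecording) {
        videoDataOutput.connections.firstObject.enabled = NO; // stop delivering frames
        [writerInput markAsFinished];
        [assetWriter finishWritingWithCompletionHandler:^{
            // inspect assetWriter.status / assetWriter.error here
        }];
        isVideoRecording = NO;
    } else {
        startTime = kCMTimeInvalid;  // the delegate restarts the session clock at the next buffer
        [assetWriter startWriting];  // note: an AVAssetWriter can only write once; create a fresh
                                     // writer (and input) per recording if you record repeatedly
        videoDataOutput.connections.firstObject.enabled = YES;
        isVideoRecording = YES;
    }
}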
