How to save recorded video using AVAssetWriter?
I have tried many other blog posts and Stack Overflow answers, but found no solution. I can create a custom camera with a preview. I need the video with a custom frame size, which is why I am using AVAssetWriter, but I am not able to save the recorded video to the Documents directory. I tried it like this:
-(void) initilizeCameraConfigurations {
    if(!captureSession) {
        captureSession = [[AVCaptureSession alloc] init];
        [captureSession beginConfiguration];
        captureSession.sessionPreset = AVCaptureSessionPresetHigh;
        self.view.backgroundColor = UIColor.blackColor;
        CGRect bounds = self.view.bounds;
        captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
        captureVideoPreviewLayer.backgroundColor = [UIColor clearColor].CGColor;
        captureVideoPreviewLayer.bounds = self.view.frame;
        captureVideoPreviewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
        captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        captureVideoPreviewLayer.position = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
        [self.view.layer addSublayer:captureVideoPreviewLayer];
        [self.view bringSubviewToFront:self.controlsBgView];
    }

    // Add input to session
    NSError *err;
    videoCaptureDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&err];
    if([captureSession canAddInput:videoCaptureDeviceInput]) {
        [captureSession addInput:videoCaptureDeviceInput];
    }

    docPathUrl = [[NSURL alloc] initFileURLWithPath:[self getDocumentsUrl]];
    assetWriter = [AVAssetWriter assetWriterWithURL:docPathUrl fileType:AVFileTypeQuickTimeMovie error:&err];
    NSParameterAssert(assetWriter);
    //assetWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, 1000);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:300], AVVideoWidthKey,
                                   [NSNumber numberWithInt:300], AVVideoHeightKey,
                                   nil];

    writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    writerInput.expectsMediaDataInRealTime = YES;
    writerInput.transform = CGAffineTransformMakeRotation(M_PI);

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                   [NSNumber numberWithInt:300], kCVPixelBufferWidthKey,
                                   [NSNumber numberWithInt:300], kCVPixelBufferHeightKey,
                                   nil];

    assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];

    if([assetWriter canAddInput:writerInput]) {
        [assetWriter addInput:writerInput];
    }

    // Set video stabilization mode to preview layer
    AVCaptureVideoStabilizationMode stablilizationMode = AVCaptureVideoStabilizationModeCinematic;
    if([videoCaptureDevice.activeFormat isVideoStabilizationModeSupported:stablilizationMode]) {
        [captureVideoPreviewLayer.connection setPreferredVideoStabilizationMode:stablilizationMode];
    }

    // image output
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [captureSession addOutput:stillImageOutput];

    [captureSession commitConfiguration];

    if (![captureVideoPreviewLayer.connection isEnabled]) {
        [captureVideoPreviewLayer.connection setEnabled:YES];
    }
    [captureSession startRunning];
}
-(IBAction)startStopVideoRecording:(id)sender {
    if(captureSession) {
        if(isVideoRecording) {
            [writerInput markAsFinished];
            [assetWriter finishWritingWithCompletionHandler:^{
                NSLog(@"Finished writing...checking completion status...");
                if (assetWriter.status != AVAssetWriterStatusFailed && assetWriter.status == AVAssetWriterStatusCompleted) {
                    // Video saved
                } else {
                    NSLog(@"#123 Video writing failed: %@", assetWriter.error);
                }
            }];
        } else {
            [assetWriter startWriting];
            [assetWriter startSessionAtSourceTime:kCMTimeZero];
            isVideoRecording = YES;
        }
    }
}
-(NSString *) getDocumentsUrl {
    NSString *docPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
    docPath = [[docPath stringByAppendingPathComponent:@"Movie"] stringByAppendingString:@".mov"];
    if([[NSFileManager defaultManager] fileExistsAtPath:docPath]) {
        NSError *err;
        [[NSFileManager defaultManager] removeItemAtPath:docPath error:&err];
    }
    NSLog(@"Movie path : %@", docPath);
    return docPath;
}
@end
Please correct me if anything is wrong. Thanks in advance.
You don't say what actually goes wrong, but two things in your code look incorrect:
docPath = [[docPath stringByAppendingPathComponent:@"Movie"] stringByAppendingString:@".mov"];

looks like it will create an unwanted path such as @"/path/Movie/.mov". You want:

docPath = [docPath stringByAppendingPathComponent:@"Movie.mov"];
And your timing is wrong. Your asset writer starts its session at time 0, but the sample buffers start at CMSampleBufferGetPresentationTimeStamp(sampleBuffer) > 0, so instead do this:

-(void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if(firstSampleBuffer) {
        [assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
        firstSampleBuffer = NO; // only start the session once
    }
    [writerInput appendSampleBuffer:sampleBuffer];
}
Conceptually, you have two main functional areas involved: one that produces video frames (the AVCaptureSession and everything attached to it), and one that writes them to a file (in your case the AVAssetWriter with its attached inputs).
The problem with your code is: there is no connection between the two. No video frames / images coming out of the capture session are ever passed to the asset writer inputs.
Furthermore, the AVCaptureStillImageOutput method -captureStillImageAsynchronouslyFromConnection:completionHandler: is never called, so the capture session doesn't actually produce any frames.
So, as a minimum, implement something like this:
-(IBAction)captureStillImageAndAppend:(id)sender
{
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageOutput.connections.firstObject completionHandler:
        ^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
        {
            // check error, omitted here
            if (CMTIME_IS_INVALID( startTime)) // startTime is an ivar
                [assetWriter startSessionAtSourceTime:(startTime = CMSampleBufferGetPresentationTimeStamp( imageDataSampleBuffer))];
            [writerInput appendSampleBuffer:imageDataSampleBuffer];
        }];
}
And remove the unused AVAssetWriterInputPixelBufferAdaptor.
But there are problems with AVCaptureStillImageOutput:
it's only meant to produce still images, not videos
it must be configured to produce uncompressed sample buffers if the asset writer input is configured to compress the appended sample buffers (stillImageOutput.outputSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };)
it's deprecated under iOS
If you actually want to produce a video, as opposed to a sequence of still images, instead of the AVCaptureStillImageOutput add an AVCaptureVideoDataOutput to the capture session. It needs a delegate and a serial dispatch queue on which to output the sample buffers. The delegate has to implement something like this:
-(void)captureOutput:(AVCaptureOutput*)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection
{
    if (CMTIME_IS_INVALID( startTime)) // startTime is an ivar
        [assetWriter startSessionAtSourceTime:(startTime = CMSampleBufferGetPresentationTimeStamp( sampleBuffer))];
    [writerInput appendSampleBuffer:sampleBuffer];
}
Note that
you will want to make sure the AVCaptureVideoDataOutput only outputs frames while actually recording; add/remove it from the capture session or enable/disable its connection in the startStopVideoRecording action
reset startTime to kCMTimeInvalid before starting another recording
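Putting those two notes together, the toggle action could be sketched roughly like this (an illustration only, assuming videoDataOutput is an ivar and isVideoRecording/startTime are as above; note also that an AVAssetWriter can only write one file, so a fresh writer and input would need to be created for each new recording):

```objc
-(IBAction)startStopVideoRecording:(id)sender {
    if (isVideoRecording) {
        videoDataOutput.connections.firstObject.enabled = NO; // stop delivering frames
        isVideoRecording = NO;
        [writerInput markAsFinished];
        [assetWriter finishWritingWithCompletionHandler:^{
            // check assetWriter.status / assetWriter.error here
        }];
    } else {
        startTime = kCMTimeInvalid; // first delivered buffer will start the session
        [assetWriter startWriting];
        videoDataOutput.connections.firstObject.enabled = YES;
        isVideoRecording = YES;
    }
}
```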