AVAssetExportSession rotation error when video is from camera roll
I am trying to convert a .mov video to .mp4 while also correcting the orientation. The code below works fine when the video is recorded with UIImagePickerController, but if the video is picked from the camera roll I get this error, and I don't understand why:

Export failed: Operation Stopped : Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x1815ca50 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}

I have tried saving the video to another file first, but it makes no difference.

This is the code I use to convert the video:
- (void)convertVideoToLowQuailtyAndFixRotationWithInputURL:(NSURL *)inputURL handler:(void (^)(NSURL *outURL))handler
{
    if ([[inputURL pathExtension] isEqualToString:@"MOV"])
    {
        NSURL *outputURL = [inputURL URLByDeletingPathExtension];
        outputURL = [outputURL URLByAppendingPathExtension:@"mp4"];

        AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
        AVAssetTrack *sourceVideoTrack = [[avAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVAssetTrack *sourceAudioTrack = [[avAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

        AVMutableComposition *composition = [AVMutableComposition composition];
        AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                                       ofTrack:sourceVideoTrack
                                        atTime:kCMTimeZero error:nil];
        [compositionVideoTrack setPreferredTransform:sourceVideoTrack.preferredTransform];

        AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                                       ofTrack:sourceAudioTrack
                                        atTime:kCMTimeZero error:nil];

        AVMutableVideoComposition *videoComposition = [self getVideoComposition:avAsset];

        NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:avAsset];
        if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality])
        {
            AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
            exportSession.outputURL = outputURL;
            exportSession.outputFileType = AVFileTypeMPEG4;
            exportSession.shouldOptimizeForNetworkUse = YES;
            exportSession.videoComposition = videoComposition;
            [exportSession exportAsynchronouslyWithCompletionHandler:^{
                switch ([exportSession status])
                {
                    case AVAssetExportSessionStatusFailed:
                        NSLog(@"Export failed: %@ : %@", [[exportSession error] localizedDescription], [exportSession error]);
                        handler(nil);
                        break;
                    case AVAssetExportSessionStatusCancelled:
                        NSLog(@"Export canceled");
                        handler(nil);
                        break;
                    default:
                        handler(outputURL);
                        break;
                }
            }];
        }
    } else {
        handler(inputURL);
    }
}
- (AVMutableVideoComposition *)getVideoComposition:(AVAsset *)asset
{
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];

    CGSize videoSize = videoTrack.naturalSize;
    BOOL isPortrait_ = [self isVideoPortrait:asset];
    if (isPortrait_) {
        // NSLog(@"video is portrait ");
        videoSize = CGSizeMake(videoSize.height, videoSize.width);
    }
    composition.naturalSize = videoSize;
    videoComposition.renderSize = videoSize;
    videoComposition.frameDuration = CMTimeMakeWithSeconds(1 / videoTrack.nominalFrameRate, 600);

    AVMutableCompositionTrack *compositionVideoTrack;
    compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil];

    AVMutableVideoCompositionLayerInstruction *layerInst;
    layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    [layerInst setTransform:videoTrack.preferredTransform atTime:kCMTimeZero];

    AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
    inst.layerInstructions = [NSArray arrayWithObject:layerInst];
    videoComposition.instructions = [NSArray arrayWithObject:inst];
    return videoComposition;
}
AVFoundation error constant -11841 means your video composition is invalid. If you want to learn more about the error constants, see this link: https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVFoundation_ErrorConstants/Reference/reference.html

While I don't see an immediately glaring error, I can suggest the following ways to narrow down the source of the problem.

First, instead of passing nil as the error parameter in these calls:
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                               ofTrack:sourceVideoTrack
                                atTime:kCMTimeZero error:nil];
create an NSError object and pass a reference to it, like so:
NSError *error = nil;
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                               ofTrack:sourceVideoTrack
                                atTime:kCMTimeZero error:&error];
Check the error to make sure your video and audio tracks were inserted into the composition tracks correctly. The error should be nil if all went well:
if (error)
    NSLog(@"Insertion error: %@", error);
You may also want to check the AVAsset's composable, exportable, and hasProtectedContent properties. If these are not YES, YES, and NO respectively, you may have a problem creating your new video file.
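A quick guard along those lines might look like the following sketch (composable, exportable, and hasProtectedContent are documented AVAsset properties; inputURL and handler are assumed to be the same as in the question's method):

// Sketch: bail out early if the asset is not suitable for composition/export.
AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
if (!avAsset.composable || !avAsset.exportable || avAsset.hasProtectedContent) {
    NSLog(@"Asset cannot be exported: composable=%d exportable=%d protected=%d",
          avAsset.composable, avAsset.exportable, avAsset.hasProtectedContent);
    handler(nil);
    return;
}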
Occasionally I have seen an issue where creating the time range for the audio track doesn't like the 600 timescale when used in a composition with a video track. You may want to create a new CMTime for the duration (avAsset.duration) in CMTimeRangeMake(kCMTimeZero, avAsset.duration), just for inserting the audio track. In the new CMTime, use a timescale of 44100 (or the sample rate of the audio track). The same goes for your videoComposition.frameDuration: depending on the nominalFrameRate of your video track, your time may not be represented correctly with a 600 timescale.
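A sketch of that rescaling, assuming the sourceAudioTrack, compositionAudioTrack, and videoTrack variables from the question's code (CMTimeConvertScale and naturalTimeScale are standard Core Media / AVAssetTrack API):

// Convert the asset duration to the audio track's own timescale
// (its sample rate, e.g. 44100) before inserting the audio.
NSError *error = nil;
CMTime audioDuration = CMTimeConvertScale(avAsset.duration,
                                          sourceAudioTrack.naturalTimeScale,
                                          kCMTimeRoundingMethod_Default);
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioDuration)
                               ofTrack:sourceAudioTrack
                                atTime:kCMTimeZero error:&error];

// Likewise, derive frameDuration directly from the nominal frame rate
// instead of forcing a 600 timescale:
videoComposition.frameDuration = CMTimeMake(1, (int32_t)round(videoTrack.nominalFrameRate));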
Finally, Apple provides a helpful tool for debugging video compositions:
https://developer.apple.com/library/mac/samplecode/AVCompositionDebugViewer/Introduction/Intro.html
It gives a visual representation of your composition, so you can see where things don't look the way they should.
Try commenting out the following line and running your project:
exportSession.videoComposition = videoComposition;
You should definitely use AVVideoComposition's isValidForAsset:timeRange:validationDelegate: method; it will diagnose any problems with your video composition. I had the same problem, and my solution was to create the layerInstruction with the AVMutableCompositionTrack instead of the original track:
layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
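The validation call mentioned above could be sketched like this, using the composition and videoComposition variables from the question's code (passing nil for the delegate just yields the overall BOOL verdict; supply an object conforming to AVVideoCompositionValidationHandling for per-instruction detail):

// Validate the composition before export; -11841 usually means this fails.
BOOL valid = [videoComposition isValidForAsset:composition
                                     timeRange:CMTimeRangeMake(kCMTimeZero, composition.duration)
                            validationDelegate:nil];
if (!valid) {
    NSLog(@"videoComposition failed validation");
}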