Making an AVFileTypeMPEG4 video file with AVAssetExportSession and AVMutableComposition
I am using the "OWVideoProcessor" library to cut parts of a live recording video. The video plays fine on any Apple device, but when I play it in a browser (via Dropbox), it has some seconds added at the front, and the audio is missing from those added seconds. You can see an example of such a video here: https://www.dropbox.com/s/2vyhqlfgfh6gzlk/file32167%281%29.mp4?dl=0 If you download the video on an Apple device, it is 20 seconds long; if you play it in the browser, it is 29 seconds.
This is the code for stitching the video:
- (void)stitchVideoWithDestinationPath:(NSString *)destinationPath completion:(void (^)(NSError *error))completion {
    [self.exportSession cancelExport];
    NSLog(@"export started to path: %@", destinationPath);

    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];

    CMTime startTime = kCMTimeZero;
    int lastIndex = self.segmentStart + self.segmentCount - 1;
    NSLog(@"Stitching segments in interval: [%d - %d]", self.segmentStart, lastIndex);

    for (int i = self.segmentCount - 5; i < lastIndex; i++) {
        CMTimeShow(startTime);
        NSURL *url = [OWUtilities urlForRecordingSegmentCount:i basePath:self.basePath];
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url
                                                options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @(YES)}];
        NSAssert(asset, @"Invalid asset at: %@", url);

        BOOL hasAllTracks = [[asset tracks] count] >= 2;
        if (hasAllTracks) {
            CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
            AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
            [videoTrack insertTimeRange:timeRange ofTrack:track atTime:startTime error:nil];
            track = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
            [audioTrack insertTimeRange:timeRange ofTrack:track atTime:startTime error:nil];
            startTime = CMTimeAdd(startTime, asset.duration);
        }
    }

    NSTimeInterval segmentsDuration = CMTimeGetSeconds(startTime);
    NSLog(@"Total segments duration: %.2f", segmentsDuration);

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetPassthrough];

    if (![[NSFileManager defaultManager] fileExistsAtPath:destinationPath]) {
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        documentsDirectory = [documentsDirectory stringByAppendingString:@"/uploads/"];
        documentsDirectory = [documentsDirectory stringByAppendingString:[destinationPath lastPathComponent]];
        if ([[NSFileManager defaultManager] fileExistsAtPath:documentsDirectory]) {
            destinationPath = documentsDirectory;
        }
    }

    exporter.outputURL = [NSURL fileURLWithPath:destinationPath];
    exporter.outputFileType = AVFileTypeMPEG4;

    BOOL trimRange = (segmentsDuration > self.outputSegmentDuration);
    if (trimRange) {
        CMTime duration = CMTimeMakeWithSeconds(self.outputSegmentDuration, startTime.timescale);
        NSTimeInterval startInterval = segmentsDuration - self.outputSegmentDuration;
        CMTime start = CMTimeMakeWithSeconds(startInterval, startTime.timescale);
        exporter.timeRange = CMTimeRangeMake(start, duration);
        NSLog(@"Exporting segment:");
        CMTimeRangeShow(exporter.timeRange);
        NSLog(@"Trimmed export duration: %.2f", CMTimeGetSeconds(duration));
    }

    @weakify(self, exporter);
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        @strongify(self, exporter);
        NSLog(@"error: %@", exporter.error);
        if (completion) {
            // Report the error unless the export was deliberately cancelled.
            completion(exporter.status == AVAssetExportSessionStatusCancelled ? nil : exporter.error);
        }
        if (self.exportSession == exporter) {
            self.exportSession = nil;
        }
    }];
    self.exportSession = exporter;
}
The problem was not in the code above. The problem was here:
NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                          AVVideoCodecH264, AVVideoCodecKey,
                                          [NSNumber numberWithInteger:width], AVVideoWidthKey,
                                          [NSNumber numberWithInteger:height], AVVideoHeightKey,
                                          [NSDictionary dictionaryWithObjectsAndKeys:
                                           [NSNumber numberWithInteger:bps], AVVideoAverageBitRateKey,
                                           [NSNumber numberWithInteger:300], AVVideoMaxKeyFrameIntervalKey,
                                           nil], AVVideoCompressionPropertiesKey,
                                          nil];
This code sets up the video compression settings. AVVideoAverageBitRateKey was set too low (around 600 kbit/s) and AVVideoMaxKeyFrameIntervalKey was set too high. I changed AVVideoMaxKeyFrameIntervalKey to 1 and increased AVVideoAverageBitRateKey to 5000 kbit/s. This solved my issue.
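With those changes applied, the settings dictionary looks like this (a sketch using modern Objective-C literal syntax; `width` and `height` come from the surrounding OWVideoProcessor code):

```objc
// Corrected compression settings: higher average bit rate and a key frame on
// every frame, so playback can start cleanly at any stitched segment boundary.
NSDictionary *videoCompressionSettings = @{
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: @(width),
    AVVideoHeightKey: @(height),
    AVVideoCompressionPropertiesKey: @{
        AVVideoAverageBitRateKey: @(5000 * 1000), // 5000 kbit/s instead of ~600 kbit/s
        AVVideoMaxKeyFrameIntervalKey: @1,        // key frame every frame instead of every 300 frames
    },
};
```

Note the trade-off: a key frame interval of 1 and a higher bit rate both increase file size, which works against the original goal of keeping the recordings small.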
This code was written to decrease the video size. You can change it in the OWVideoProcessor library.