
AVAssetExportSession exporting too slow

I am trying to export an AVMutableComposition using AVAssetExportSession.

AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.videoComposition = mainCompositionInst;
exporter.shouldOptimizeForNetworkUse = YES;

[exporter exportAsynchronouslyWithCompletionHandler:^
 {
     switch (exporter.status)
     {
         case AVAssetExportSessionStatusCompleted:
             NSLog(@"Video merge successful");
             break;
         case AVAssetExportSessionStatusFailed:
             NSLog(@"Failed: %@", exporter.error.description);
             break;
         case AVAssetExportSessionStatusCancelled:
             NSLog(@"Cancelled: %@", exporter.error);
             break;
         case AVAssetExportSessionStatusExporting:
             NSLog(@"Exporting!");
             break;
         case AVAssetExportSessionStatusWaiting:
             NSLog(@"Waiting");
             break;
         default:
             break;
     }
 }];

But exporting even a 1-minute video takes around 30 seconds, which is too much considering that the iPad's built-in Camera app takes less than 2 seconds.

Also, if I remove the videoComposition from the exporter, the time drops to 7 seconds, which is still bad considering the video is only 1 minute long. So, how can I reduce the export time to a minimum?

Also, I want to know: does AVAssetExportSession generally take this long, or is it just my case?

Update: Merge code:

AVMutableComposition *mutableComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];

AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];

// One layer instruction on the merged video track; its transform is keyed by time below.
AVMutableVideoCompositionLayerInstruction *videoTrackLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];


NSMutableArray *instructions = [NSMutableArray new];
CGSize size = CGSizeZero;

CMTime time = kCMTimeZero; // insertion cursor: where the next segment starts
for (AVURLAsset *asset in assets)
{
    AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *audioAssetTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;

    NSError *error;
    [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration )
                                       ofTrack:assetTrack
                                        atTime:time
                                         error:&error];

    // Carry the source clip's orientation into the output.
    [videoTrackLayerInstruction setTransform:assetTrack.preferredTransform atTime:time];

    if (error) {
        NSLog(@"asset url :: %@",assetTrack.asset);
        NSLog(@"Error1 - %@", error.debugDescription);
    }

    if (audioAssetTrack) { // some clips may have no audio track
        [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAssetTrack.timeRange.duration)
                                       ofTrack:audioAssetTrack
                                        atTime:time
                                         error:&error];
        if (error) {
            NSLog(@"Error2 - %@", error.debugDescription);
        }
    }

    time = CMTimeAdd(time, assetTrack.timeRange.duration);

    if (CGSizeEqualToSize(size, CGSizeZero)) {
        size = assetTrack.naturalSize;
    }
}

AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, time); // span the whole merged timeline
mainInstruction.layerInstructions = [NSArray arrayWithObject:videoTrackLayerInstruction];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30); // forces a 30 fps render
mainCompositionInst.renderSize = size;

I built an app that merges different video fragments together, and I can safely say it is just your case. My video files are ~10 MB, so maybe they are a little smaller, but it takes less than a second to merge them all together, even if there are 10 or 20 segments.

Now, as for why it is actually happening: I have checked my configuration against yours, and the differences are the following:

  • I use export.outputFileType = AVFileTypeMPEG4
  • I have network optimization disabled; if you are not planning to stream the video from your device, you should disable it too (see the sketch after this list)
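
In code, those two settings look roughly like this minimal Swift sketch; `composition` and `outputURL` are placeholders for your own composition and destination file, not part of the original question:

import AVFoundation

// Minimal sketch of the configuration described above.
func configureExporter(for composition: AVAsset, outputURL: URL) -> AVAssetExportSession? {
    let exporter = AVAssetExportSession(asset: composition,
                                        presetName: AVAssetExportPresetHighestQuality)
    exporter?.outputURL = outputURL
    exporter?.outputFileType = .mp4  // AVFileTypeMPEG4
    // Network optimization restructures the file for progressive download,
    // which costs extra work; leave it off for local-only playback.
    exporter?.shouldOptimizeForNetworkUse = false
    return exporter
}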

Other than that, it should be the same. I can't really compare further, as you would have to provide the code for how you actually create the composition. There are some things to check, though:

  • If you are using AVURLAssetPreferPreciseDurationAndTimingKey when creating the AVURLAsset and you don't have enough keyframes, it can actually take quite some time to seek through and find them, which slows everything down (see the sketch after this list)
  • Consider whether you really need the highest quality for the video
  • Consider the resolution of the video and possibly lower it
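
A minimal Swift sketch of those suggestions; the file URL and the preset choice are illustrative, not a definitive recommendation:

import AVFoundation

// Illustrative source URL.
let sourceURL = URL(fileURLWithPath: "/path/to/clip.mov")

// Skip the precise duration/timing scan unless you need sample-accurate edits.
let asset = AVURLAsset(url: sourceURL,
                       options: [AVURLAssetPreferPreciseDurationAndTimingKey: false])

// A lower preset trades quality and resolution for export speed.
let exporter = AVAssetExportSession(asset: asset,
                                    presetName: AVAssetExportPresetMediumQuality)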

I should be able to help you more if you provide more information, but maybe some of this stuff will work. Give it a try and then report back.

Hope it helps a little!

Edit 1: I forgot to mention that if you run out of options, you should try the FFmpeg library, as it is very high-performance, though due to licensing it might not be suitable for you.

I think the issue here is AVAssetExportPresetHighestQuality: it will cause conversion or upsampling and will slow things WAY down. Try using AVAssetExportPresetPassthrough instead; it took my export times from ~35+ seconds down to less than a second (see the sketch below).

I also disabled optimizing for network use, as all of our videos are only used within the app and never streamed or passed over a network.
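
For reference, a minimal Swift sketch of that setup; `asset` and `outputURL` are placeholders. One caveat: passthrough copies the samples without re-encoding, so a videoComposition (transforms, overlays) will not be rendered into the output:

import AVFoundation

func makePassthroughExporter(for asset: AVAsset, outputURL: URL) -> AVAssetExportSession? {
    // Passthrough remuxes the existing samples instead of re-encoding them,
    // which is why it is so much faster.
    let exporter = AVAssetExportSession(asset: asset,
                                        presetName: AVAssetExportPresetPassthrough)
    exporter?.outputURL = outputURL
    exporter?.outputFileType = .mov  // keep the source codecs in a QuickTime container
    exporter?.shouldOptimizeForNetworkUse = false
    return exporter
}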

Keep in mind that the asset you're trying to export may not be stored locally; in that case, Photos first downloads the content and then exports your asset.

If you don't want to download any content:

let videoRequestOptions: PHVideoRequestOptions = PHVideoRequestOptions()
videoRequestOptions.isNetworkAccessAllowed = false // don't fetch the asset from iCloud

You are also going to receive, within the requestExportSession completion handler, an info dictionary with a couple of useful values (see the sketch below): https://developer.apple.com/documentation/photokit/phimagemanager/image_result_info_keys
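
For example, something like this sketch, which uses the PHImageResultIsInCloudKey entry to tell whether the video only exists in iCloud; `asset` stands in for your PHAsset:

import Photos
import AVFoundation

func checkWhetherLocal(_ asset: PHAsset) {
    let options = PHVideoRequestOptions()
    options.isNetworkAccessAllowed = false  // never download; just inspect

    PHImageManager.default().requestExportSession(forVideo: asset,
                                                  options: options,
                                                  exportPreset: AVAssetExportPresetPassthrough) { session, info in
        if let inCloud = info?[PHImageResultIsInCloudKey] as? NSNumber, inCloud.boolValue {
            print("Video is in iCloud only; exporting would require a download first")
        }
    }
}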

Otherwise, if you do want to download your asset from iCloud and make it as fast as possible, you can play with the following parameters:

let videoRequestOptions: PHVideoRequestOptions = PHVideoRequestOptions()
// delivery modes: .automatic, .highQualityFormat, .mediumQualityFormat, .fastFormat
videoRequestOptions.deliveryMode = .fastFormat
videoRequestOptions.isNetworkAccessAllowed = true

Another important property is the export preset; there is a whole set of available presets:

let lowQualityPreset1 = AVAssetExportPresetLowQuality
let lowQualityPreset2 = AVAssetExportPreset640x480
let lowQualityPreset3 = AVAssetExportPreset960x540
let lowQualityPreset4 = AVAssetExportPreset1280x720

let manager = PHImageManager()
manager.requestExportSession(forVideo: asset,
                             options: videoRequestOptions,
                             exportPreset: lowQualityPreset1) { (session, info) in
    session?.outputURL = outputUrl
    session?.outputFileType = .mp4
    session?.shouldOptimizeForNetworkUse = true

    session?.exportAsynchronously {
        // inspect session?.status and session?.error here
    }
}
