
How to write a movie with video AND audio using AVAssetWriter?

I want to export a movie with AVAssetWriter and can't figure out how to include video and audio tracks in sync. Exporting only video works fine, but when I add audio the resulting movie looks like this:

First I see the video (without audio), then the video freezes (showing the last image frame until the end), and after some seconds I hear the audio.

I tried some things with CMSampleBufferSetOutputPresentationTimeStamp (subtracting the first CMSampleBufferGetPresentationTimeStamp from the current one) for the audio, but none of it worked, and I don't think it is the right direction anyway, since video and audio in the source movie should already be in sync...

My setup in short: I create an AVAssetReader and two AVAssetReaderTrackOutputs (one for video, one for audio) and add them to the AVAssetReader; then I create an AVAssetWriter and two AVAssetWriterInputs (video and audio) and add them to the AVAssetWriter. I start it all up with:

[assetReader startReading];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
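
For reference, here is a minimal sketch of the setup described above. It assumes pass-through (nil) output settings, so the compressed samples are copied without re-encoding; sourceURL and outputURL are placeholders, not names from the original code.

#import <AVFoundation/AVFoundation.h>

AVURLAsset *sourceAsset = [AVURLAsset URLAssetWithURL:sourceURL options:nil];
NSError *error = nil;

// Reader with one track output per media type (nil settings = pass-through)
AVAssetReader *assetReader = [[AVAssetReader alloc] initWithAsset:sourceAsset error:&error];
AVAssetTrack *videoTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *audioTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
AVAssetReaderTrackOutput *assetReaderVideoOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:nil];
AVAssetReaderTrackOutput *assetReaderAudioOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];
[assetReader addOutput:assetReaderVideoOutput];
[assetReader addOutput:assetReaderAudioOutput];

// Writer with one input per media type (nil settings = pass-through)
AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL fileType:AVFileTypeQuickTimeMovie error:&error];
AVAssetWriterInput *assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:nil];
AVAssetWriterInput *assetWriterAudioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:nil];
assetWriterVideoInput.expectsMediaDataInRealTime = NO;
assetWriterAudioInput.expectsMediaDataInRealTime = NO;
[assetWriter addInput:assetWriterVideoInput];
[assetWriter addInput:assetWriterAudioInput];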

Then I run two queues to handle the sample buffers:

dispatch_queue_t queueVideo=dispatch_queue_create("assetVideoWriterQueue", NULL);
[assetWriterVideoInput requestMediaDataWhenReadyOnQueue:queueVideo usingBlock:^
{
     while([assetWriterVideoInput isReadyForMoreMediaData])
     {
         CMSampleBufferRef sampleBuffer=[assetReaderVideoOutput copyNextSampleBuffer];
         if(sampleBuffer)
         {
             [assetWriterVideoInput appendSampleBuffer:sampleBuffer];
             CFRelease(sampleBuffer);
         } else
         {
             [assetWriterVideoInput markAsFinished];
             dispatch_release(queueVideo);
             videoFinished=YES;
             break;
         }
     }
}];

dispatch_queue_t queueAudio=dispatch_queue_create("assetAudioWriterQueue", NULL);
[assetWriterAudioInput requestMediaDataWhenReadyOnQueue:queueAudio usingBlock:^
{
    while([assetWriterAudioInput isReadyForMoreMediaData])
    {
        CMSampleBufferRef sampleBuffer=[assetReaderAudioOutput copyNextSampleBuffer];
        if(sampleBuffer)
        {
            [assetWriterAudioInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        } else
        {
            [assetWriterAudioInput markAsFinished];
            dispatch_release(queueAudio);
            audioFinished=YES;
            break;
        }
    }
}];

In the main loop I wait for both queues until they finish:

while(!videoFinished && !audioFinished)
{
    sleep(1);
}
[assetWriter finishWriting];
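
(As an aside, a dispatch group would avoid polling the two flags from the main loop. A minimal sketch, assuming dispatch_group_enter is called once per track before the requestMediaDataWhenReadyOnQueue: calls, and that each block calls dispatch_group_leave right after its markAsFinished:)

dispatch_group_t writingGroup = dispatch_group_create();
dispatch_group_enter(writingGroup);   // for the video input
dispatch_group_enter(writingGroup);   // for the audio input

// ...start reading/writing and request media data as above; each block calls
// dispatch_group_leave(writingGroup); right after [... markAsFinished];

dispatch_group_wait(writingGroup, DISPATCH_TIME_FOREVER);   // blocks the calling thread
[assetWriter finishWriting];
dispatch_release(writingGroup);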

Furthermore, I try to save the resulting file to the library with the following code...

NSURL *url=[[NSURL alloc] initFileURLWithPath:path];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if([library videoAtPathIsCompatibleWithSavedPhotosAlbum:url])
{
    [library writeVideoAtPathToSavedPhotosAlbum:url completionBlock:^(NSURL *assetURL, NSError *error)
     {
         if(error)
             NSLog(@"error=%@",error.localizedDescription);
         else
             NSLog(@"completed...");
     }];
} else
    NSLog(@"error, video not saved...");

[library release];
[url release];

...but I get the error:

Video /Users/cb/Library/Application Support/iPhone Simulator/4.2/Applications/E9865BF9-D190-4912-9248-66768B1AB635/Documents/export.mp4 cannot be saved to the saved photos album: Error Domain=NSOSStatusErrorDomain Code=-12950 "Movie could not be played." UserInfo=0x5e4fb90 {NSLocalizedDescription=Movie could not be played.}

The code works without problems in another program. So something is wrong with the movie...?

-(void)mergeAudioVideo
{
    // Input/output paths: the previously written video, the recorded audio,
    // and the merged result.
    NSString *videoOutputPath = [_documentsDirectory stringByAppendingPathComponent:@"dummy_video.mp4"];
    NSString *outputFilePath  = [_documentsDirectory stringByAppendingPathComponent:@"final_video.mp4"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];

    NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
    NSString *filePath = [_documentsDirectory stringByAppendingPathComponent:@"newFile.m4a"];

    // Build a composition with one video track and one audio track.
    AVMutableComposition *mixComposition = [AVMutableComposition composition];

    NSURL *audio_inputFileUrl = [NSURL fileURLWithPath:filePath];
    NSURL *video_inputFileUrl = [NSURL fileURLWithPath:videoOutputPath];

    CMTime nextClipStartTime = kCMTimeZero;

    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:video_inputFileUrl options:nil];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);

    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audio_inputFileUrl options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    // Export the composition; both tracks are inserted at kCMTimeZero, so they stay in sync.
    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie; // same value as @"com.apple.quicktime-movie"
    _assetExport.outputURL = outputFileUrl;

    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
         if (_assetExport.status == AVAssetExportSessionStatusCompleted) {
             //Write Code Here to Continue
         }
         else {
             //Write Fail Code here
         }
     }];
}

You can use the code above to merge the audio and video.
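
The two placeholder branches in the completion handler could simply report the outcome. A small sketch (dispatching back to the main queue is an assumption, not part of the original answer):

[_assetExport exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (_assetExport.status == AVAssetExportSessionStatusCompleted)
            NSLog(@"export completed: %@", outputFileUrl);
        else
            NSLog(@"export failed (status %ld): %@", (long)_assetExport.status, _assetExport.error);
    });
}];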

It seems that assetWriterAudioInput ignores the sample buffer timestamps when writing audio. Do it this way:

1) Write the video track.

2) When done, mark it as finished, i.e. [videoWriterInput markAsFinished];

3) Do [assetWriter startSessionAtSourceTime:timeRangeStart];

4) Instantiate the audio reader and start writing the audio.
