
How to control video frame rate with AVAssetReader and AVAssetWriter?

We are trying to understand how to control/specify the frame rate for videos that we are encoding with AVAssetReader and AVAssetWriter. Specifically, we are using AVAssetReader and AVAssetWriter to transcode/encode/compress a video that we have accessed from the photo/video gallery. We are able to control things like bit rate, aspect ratio changes, etc., but cannot figure out how to control the frame rate. To be specific, we'd like to be able to take as input a 30 FPS video that's 5 minutes long and emit a 5-minute video at 15 FPS.

Our current loop that processes sample buffers is:

[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];
[videoReader startReading];

[videoWriterInput requestMediaDataWhenReadyOnQueue:videoEncoderQueue usingBlock:
 ^{         
    while ([videoWriterInput isReadyForMoreMediaData]) {
        CMSampleBufferRef sampleBuffer;

        if ([videoReader status] == AVAssetReaderStatusReading
            && (sampleBuffer = [videoReaderTrackOutput copyNextSampleBuffer])) {
            // sampleBuffer is non-NULL here, guaranteed by the && above
            BOOL result = [videoWriterInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);

            if (!result) {
                [videoReader cancelReading];
                break;
            }
        } else {
            // deal with status other than AVAssetReaderStatusReading
            [videoWriterInput markAsFinished];
            // [...]
            break;
        }
    }
 }];

How do we augment or change this so that we can control the frame rate of the created video? We cannot find a sample on SO or anywhere else that clearly explains how to do this. I think we're supposed to use CMTime, and probably methods other than the ones in the code sample above, but the details aren't clear.

Depending on how you're compositing the frames, you may just need to set the movieTimeScale.

Alternatively, you need to use CMTime to set the time of each frame as you add it to the writer.

CMTime time = CMTimeMake(0, 30); // CMTimeMake(value, timescale)

This would create the time for the first frame at a frame rate of 30 frames per second. Set the second parameter (the timescale) to your desired frame rate and don't change it. Increment the first parameter by one for each frame you add to the writer.

Edit:

There are many different ways in which you can process the incoming and outgoing data, and hence many options for how the timing can or needs to be specified. Generally, the above is suitable when using an AVAssetWriterInputPixelBufferAdaptor (e.g. if you were editing the video frames).

Based on your updated code, since you're doing a simpler pass-through, you probably need to use CMSampleBufferCreateCopyWithNewTiming to generate a copy of the sampleBuffer you receive from the reader. Strangely, I think, this makes the timing more complex. Depending on what you're trying to achieve with the edits, you may want to create a single new CMSampleTimingInfo that can be used for all frames, or get the existing timing info from the sample buffer with CMSampleBufferGetSampleTimingInfoArray and then create an edited version of it. Something along the lines of:

CMItemCount count;
CMTime newTimeStamp = CMTimeMake(...);
CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, 0, nil, &count);
CMSampleTimingInfo *timingInfo = malloc(sizeof(CMSampleTimingInfo) * count);
CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, count, timingInfo, &count);

for (CMItemCount i = 0; i < count; i++)
{
    timingInfo[i].decodeTimeStamp = kCMTimeInvalid;
    timingInfo[i].presentationTimeStamp = newTimeStamp;
}

CMSampleBufferRef completedSampleBuffer;
CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sampleBuffer, count, timingInfo, &completedSampleBuffer);
free(timingInfo);

How you choose your newTimeStamp dictates what results you'll get.

Previously, I used dispatch_block_wait to run a block after a delta time and call the whole function again. Once I realised that would someday become buggy, I switched to a dispatch_source_t timer to control the FPS instead.

Create a block of what you want to do:

var block = dispatch_block_create(...)
var queue = dispatch_queue_create(...)
var source = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, queue) 
dispatch_source_set_timer(source, STARTTIME, INTERVAL, 0)
dispatch_source_set_event_handler(source,block)
dispatch_resume(source)

If you are looking for a real-world example of grabbing the buffers, I've made one at https://github.com/matthewlui/FSVideoView . *Added: the time interval to pass in is counted in nanoseconds (1 ns = 1/1,000,000,000 s). Multiply that by your desired delta to the next frame.

A better way is to set the timebase property of the AVSampleBufferDisplayLayer accordingly:

CMTimebaseRef timebase;
OSStatus timebaseResult;
timebaseResult = CMTimebaseCreateWithMasterClock(NULL, CMClockGetHostTimeClock(), &timebase);
if (timebaseResult != 0)
{
    NSLog(@"ERROR: could not create timebase");
} else {
    CMTimebaseSetTime(timebase, CMTimeMake(1, 3000));
    CMTimebaseSetRate(timebase, 1.0f);
}

[(AVSampleBufferDisplayLayer *)self.layer setControlTimebase:timebase];
CFRelease(timebase);

It should be obvious why this is the preferred means over all others: the layer then paces frame display from each buffer's timestamps against its control timebase, instead of you pacing the frames manually.

