
AVAssetWriter / AVAssetWriterInputPixelBufferAdaptor - black frames and frame rate

I'm capturing the camera feed and writing it to a movie. The problem I'm having is that after the export, the movie has a couple of black seconds at the start (relative to the actual recording start time).

I think this is related to [self.assetWriter startSessionAtSourceTime:kCMTimeZero];. I had a half-working solution using a frameStart variable that just counted upwards in the sample buffer delegate method:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    frameStart++;
    if (self.startRecording == YES) {

        static int64_t frameNumber = 0;
        if(self.assetWriterInput.readyForMoreMediaData) {
            [self.pixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:CMTimeMake(frameNumber, 25)];
        }
        frameNumber++;
    }
}

and then calling this method when the user presses a button:

[self.assetWriter startSessionAtSourceTime:CMTimeMake(frameStart,25)];

This works, but only once... if I want to record a second movie, the black frames come back.

Also, when I look at the exported movie, the frame rate is 25 fps like I want it to be, but the video looks as if it's sped up, as if there is not enough time between the frames. The movie plays back about twice as fast.

NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:640], AVVideoWidthKey, [NSNumber numberWithInt:480], AVVideoHeightKey, AVVideoCodecH264, AVVideoCodecKey, nil];

self.assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
self.assetWriterInput.expectsMediaDataInRealTime = YES;

You don't need to count frame timestamps on your own. You can get the timestamp of the current sample with:

CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
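This same timestamp can also be used to start the writer session, which is a common way to avoid the leading black frames. A minimal sketch (not from the original answer; `sessionStarted` is a hypothetical BOOL property used to start the session exactly once):

    // Inside the sample buffer delegate method.
    CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (!self.sessionStarted) {
        // Start at the first frame's native timestamp instead of kCMTimeZero,
        // so the movie does not begin with black frames.
        [self.assetWriter startSessionAtSourceTime:timestamp];
        self.sessionStarted = YES;
    }
    [self.pixelBufferAdaptor appendPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)
                          withPresentationTime:timestamp];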

However, it seems to me you are just passing the pixel buffer of the frame to the adaptor without modification. Wouldn't it be easier to pass the sample buffer itself directly to the assetWriterInput, like the following?

[self.assetWriterInput appendSampleBuffer:sampleBuffer];
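Put together, the delegate method from the question could then be reduced to something like this. This is a sketch under the assumptions above, not the asker's original code; `sessionStarted` is a hypothetical property:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {

        if (!self.startRecording) {
            return;
        }
        // Start the session lazily at the first frame's native timestamp.
        if (!self.sessionStarted) {
            [self.assetWriter startSessionAtSourceTime:
                CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
            self.sessionStarted = YES;
        }
        if (self.assetWriterInput.readyForMoreMediaData) {
            // The sample buffer already carries its presentation time,
            // so no manual frame counting is needed.
            [self.assetWriterInput appendSampleBuffer:sampleBuffer];
        }
    }

Since the buffers keep their capture timestamps, the output plays at the real capture rate instead of the assumed 25 fps.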

First of all, why are you incrementing frameNumber twice for every frame? Increment it once and remove the other. This should fix the playback speed.

Second, are you resetting frameNumber to 0 when you finish recording? If not, then this is your problem. If you are, then I need more explanation about what is going on here.
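One way to reset cleanly between recordings could look like the following. This is a sketch, not code from the answer: `stopRecording` is a hypothetical method name, and it assumes frameNumber/frameStart have been moved from `static` locals to instance variables (a `static` local in the delegate method persists between recordings and cannot be reset from outside it):

    - (void)stopRecording {
        self.startRecording = NO;
        frameNumber = 0;   // reset the counters for the next recording
        frameStart = 0;
        [self.assetWriterInput markAsFinished];
        [self.assetWriter finishWritingWithCompletionHandler:^{
            // A finished AVAssetWriter cannot be reused; create a new one
            // before starting the next recording.
        }];
    }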

Regards
