
AVAssetWriter startSessionAtSourceTime not accepting CMTime value

My app is designed to record video and analyze the generated frames under iOS 11.4, using Xcode 10.0 as the IDE. I succeeded in recording video using AVCaptureMovieFileOutput, but I need to analyze the frames, so I transitioned to AVAssetWriter and modeled my code after RosyWriter [ https://github.com/WildDylan/appleSample/tree/master/RosyWriter ]. The code is written in ObjC.

I am stuck on a problem inside the captureOutput: didOutputSampleBuffer: fromConnection: delegate. After capturing the first frame, the AVAssetWriter is configured along with its inputs (video and audio), using settings extracted from that first frame. Once the user selects record, the captured sampleBuffer is analyzed and written. I tried to use AVAssetWriter startSessionAtSourceTime:, but there is clearly something wrong with the way CMSampleBufferGetPresentationTimeStamp is returning the CMTime from the sample buffer. The sampleBuffer log seems to show a CMTime with valid values.
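
For context, here is a minimal sketch of how the AVAssetWriter and its video input can be configured from the first frame's format description. The method name, the outputURL parameter, and the choice of nil outputSettings with a sourceFormatHint are illustrative assumptions, not necessarily what my setupVideoRecorder does:

- (void)setupVideoRecorderWithFormatDescription:(CMFormatDescriptionRef)formatDescription
                                      outputURL:(NSURL *)outputURL
{
    NSError *error = nil;
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                     fileType:AVFileTypeQuickTimeMovie
                                                        error:&error];
    if (error) { NSLog(@"writer error: %@", error); return; }

    // nil outputSettings plus a sourceFormatHint appends the samples without re-encoding
    AVAssetWriterInput *videoInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:nil
                                         sourceFormatHint:formatDescription];
    videoInput.expectsMediaDataInRealTime = YES;

    if ([writer canAddInput:videoInput]) {
        [writer addInput:videoInput];
    }
    [writer startWriting];   // the session itself is started later, at the first frame's PTS

    self->assetWriter  = writer;
    self->videoAWInput = videoInput;
}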

If I implement: CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer); [self->assetWriter startSessionAtSourceTime:sampleTime]; the error generated is '*** -[AVAssetWriter startSessionAtSourceTime:] invalid parameter not satisfying: CMTIME_IS_NUMERIC(startTime)'.
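
A sketch of guarding the call so the assertion at least turns into a log line (this does not fix the underlying problem, it only shows where the timestamp stops being numeric):

CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
if (CMTIME_IS_NUMERIC(sampleTime))
{
    [self->assetWriter startSessionAtSourceTime:sampleTime];
    self->wroteFirstFrame = YES;
}
else
{
    // the buffer's timestamp is already invalid by the time this runs
    NSLog(@"PTS not numeric: value=%lld timescale=%d flags=%u",
          sampleTime.value, sampleTime.timescale, (unsigned)sampleTime.flags);
}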

If I use [self->assetWriter startSessionAtSourceTime:kCMTimeZero], the message "warning: could not execute support code to read Objective-C class data in the process. This may reduce the quality of type information available." is generated.

When I log sampleTime I read value=0, timescale=0, epoch=0 & flags=0. I also log the sampleBuffer and show it below, followed by the relevant code:

SampleBuffer Content = 

2018-10-17 12:07:04.540816+0300 MyApp[10664:2111852] -[CameraCaptureManager captureOutput:didOutputSampleBuffer:fromConnection:] : sampleBuffer - CMSampleBuffer 0x100e388c0 retainCount: 1 allocator: 0x1c03a95e0
invalid = NO
dataReady = YES
makeDataReadyCallback = 0x0
makeDataReadyRefcon = 0x0
buffer-level attachments:
    Orientation(P) = 1
    {Exif}    (P) = <CFBasicHash 0x28161ce80 [0x1c03a95e0]>{type = mutable dict, count = 24,
entries => .....A LOT OF CAMERA DATA HERE.....
}

    DPIWidth  (P) = 72
    {TIFF}    (P) = <CFBasicHash 0x28161c540 [0x1c03a95e0]>{type =    mutable dict, count = 7,
entries => .....MORE CAMERA DATA HERE.....
}

    DPIHeight (P) = 72
    {MakerApple}(P) = {
1 = 3;
10 = 0;
14 = 0;
3 =     {
    epoch = 0;
    flags = 1;
    timescale = 1000000000;
    value = 390750488472916;
};
4 = 0;
5 = 221;
6 = 211;
7 = 1;
8 =     (
    "-0.04894018",
    "-0.6889497",
    "-0.7034443"
);
9 = 0;
}
formatDescription = <CMVideoFormatDescription 0x280ddc780 [0x1c03a95e0]> {
mediaType:'vide' 
mediaSubType:'BGRA' 
mediaSpecific: {
    codecType: 'BGRA'       dimensions: 720 x 1280 
} 
extensions: {<CFBasicHash 0x28161f880 [0x1c03a95e0]>{type = immutable dict, count = 5,
entries =>
0 : <CFString 0x1c0917068 [0x1c03a95e0]>{contents = "CVImageBufferYCbCrMatrix"} = <CFString 0x1c09170a8 [0x1c03a95e0]>{contents = "ITU_R_601_4"}
1 : <CFString 0x1c09171c8 [0x1c03a95e0]>{contents = "CVImageBufferTransferFunction"} = <CFString 0x1c0917088 [0x1c03a95e0]>{contents = "ITU_R_709_2"}
2 : <CFString 0x1c093f348 [0x1c03a95e0]>{contents = "CVBytesPerRow"} = <CFNumber 0x81092876519e5903 [0x1c03a95e0]>{value = +2880, type = kCFNumberSInt32Type}
3 : <CFString 0x1c093f3c8 [0x1c03a95e0]>{contents = "Version"} = <CFNumber 0x81092876519eed23 [0x1c03a95e0]>{value = +2, type = kCFNumberSInt32Type}
5 : <CFString 0x1c0917148 [0x1c03a95e0]>{contents = "CVImageBufferColorPrimaries"} = <CFString 0x1c0917088 [0x1c03a95e0]>{contents = "ITU_R_709_2"}
}
}
}
sbufToTrackReadiness = 0x0
numSamples = 1
sampleTimingArray[1] = {
    {PTS = {390750488483992/1000000000 = 390750.488}, DTS = {INVALID}, duration = {INVALID}},
}
imageBuffer = 0x2832ad2c0

======================================================================

//AVCaptureVideoDataOutput Delegates
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (connection == videoConnection)
    {
        if (self.outputVideoFormatDescription == NULL)
        {
            self.outputVideoFormatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
            [self setupVideoRecorder];
        }
        else if (self.status == RecorderRecording)
        {
            NSLog(@"%s : self.outputVideoFormatDescription - %@",__FUNCTION__,self.outputVideoFormatDescription);

            [self.cmDelegate manager:self capturedFrameBuffer:sampleBuffer];
            NSLog(@"%s : sampleBuffer - %@",__FUNCTION__,sampleBuffer);

            dispatch_async(vidWriteQueue, ^
            {
                if (!self->wroteFirstFrame)
                {
                    CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
                    NSLog(@"%s : sampleTime value - %lld, timescale - %i, epoch - %lli, flags - %u",__FUNCTION__,sampleTime.value, sampleTime.timescale, sampleTime.epoch, sampleTime.flags);

                    [self->assetWriter startSessionAtSourceTime:sampleTime];
                    self->wroteFirstFrame = YES;
                }
                //else if (self->videoAWInput.readyForMoreMediaData)
                if (self->videoAWInput.readyForMoreMediaData)
                {
                    BOOL appendSuccess = [self->videoAWInput appendSampleBuffer:sampleBuffer];
                    NSLog(@"%s : appendSuccess - %i",__FUNCTION__,appendSuccess);

                    if (!appendSuccess) NSLog(@"%s : failed to append video buffer - %@",__FUNCTION__,self->assetWriter.error.localizedDescription);
                }
            });
        }
    }
    else if (connection == audioConnection)
    {
    }
}

My bad... my problem was that I was spawning off the frame processing with dispatch_async onto the queue that was already passed to AVCaptureVideoDataOutput's setSampleBufferDelegate:queue:, i.e. recursively putting work onto the very thread it was already running on. Posting the answer in case another idiot, like me, makes the same stupid mistake...
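
For anyone hitting the same thing, here is a minimal sketch of the shape of the fix, assuming vidWriteQueue is the queue already passed to -setSampleBufferDelegate:queue: (so the delegate callback is already running on it and no extra dispatch_async is needed). It is simplified to the video path only:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (connection != videoConnection || self.status != RecorderRecording) {
        return;
    }

    if (!self->wroteFirstFrame) {
        // still on the delegate queue, so the timestamp is still numeric here
        CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        [self->assetWriter startSessionAtSourceTime:sampleTime];
        self->wroteFirstFrame = YES;
    }

    if (self->videoAWInput.readyForMoreMediaData) {
        BOOL appendSuccess = [self->videoAWInput appendSampleBuffer:sampleBuffer];
        if (!appendSuccess) {
            NSLog(@"failed to append video buffer - %@",
                  self->assetWriter.error.localizedDescription);
        }
    }
}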
