
AVAssetWriter startSessionAtSourceTime not accepting CMTime value

My app is meant to record video and analyze the frames it produces, running under iOS 11.4 with Xcode 10.0 as the IDE. Recording video with AVCaptureMovieFileOutput works fine, but the frames need to be analyzed, so I switched to AVAssetWriter and modeled the code on RosyWriter [ https://github.com/WildDylan/appleSample/tree/master/RosyWriter ]. The code is written in Objective-C.

I'm stuck on a problem inside the captureOutput:didOutputSampleBuffer:fromConnection: delegate. Once the first frame is captured, the AVAssetWriter and its inputs (video and audio) are configured with settings extracted from that first frame. Once the user chooses to record, each captured sampleBuffer is analyzed and written out. I tried using AVAssetWriter startSessionAtSourceTime:, but something is clearly wrong with the CMTime that CMSampleBufferGetPresentationTimeStamp returns from the sample buffer, even though the sampleBuffer log appears to show a CMTime with valid values.
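
For context, here is a minimal sketch of the kind of setup setupVideoRecorder performs, deriving the video dimensions from the first frame's format description. The ivar names (assetWriter, videoAWInput, outputVideoFormatDescription) match the code further down; the output URL, file type and compression settings are placeholders, not the exact values from the project:

- (void)setupVideoRecorder
{
    NSError *error = nil;
    // Placeholder output location; the real project uses its own URL.
    NSURL *outputURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"capture.mov"]];
    assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                            fileType:AVFileTypeQuickTimeMovie
                                               error:&error];

    // Derive the dimensions from the format description captured on the first frame.
    CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(self.outputVideoFormatDescription);
    NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecTypeH264,
                                     AVVideoWidthKey  : @(dims.width),
                                     AVVideoHeightKey : @(dims.height) };

    videoAWInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                      outputSettings:videoSettings];
    videoAWInput.expectsMediaDataInRealTime = YES;
    if ([assetWriter canAddInput:videoAWInput])
        [assetWriter addInput:videoAWInput];

    // Writing starts now; the session itself is started on the first recorded frame.
    [assetWriter startWriting];
}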

If I implement CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer); [self->assetWriter startSessionAtSourceTime:sampleTime]; the error generated is '*** -[AVAssetWriter startSessionAtSourceTime:] Invalid parameter not satisfying: CMTIME_IS_NUMERIC(startTime)'.
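
As a sanity check (not part of the original code), the timestamp can be validated with the same macro the assertion uses before starting the session:

CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
if (CMTIME_IS_NUMERIC(sampleTime))
{
    [self->assetWriter startSessionAtSourceTime:sampleTime];
}
else
{
    // flags == 0 means the CMTime is not valid, which matches the log output described below.
    NSLog(@"Non-numeric PTS: value = %lld, timescale = %d, flags = %u",
          sampleTime.value, sampleTime.timescale, (unsigned)sampleTime.flags);
}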

If I use [self->assetWriter startSessionAtSourceTime:kCMTimeZero] instead, the error "warning: could not execute support code to read Objective-C class data in the process. This may reduce the quality of type information available." is generated.

When I log sampleTime I read value = 0, timescale = 0, epoch = 0 and flags = 0. I also logged the sampleBuffer, shown below, followed by the relevant code:

SampleBuffer Content = 

2018-10-17 12:07:04.540816+0300 MyApp[10664:2111852] -[CameraCaptureManager captureOutput:didOutputSampleBuffer:fromConnection:] : sampleBuffer - CMSampleBuffer 0x100e388c0 retainCount: 1 allocator: 0x1c03a95e0
invalid = NO
dataReady = YES
makeDataReadyCallback = 0x0
makeDataReadyRefcon = 0x0
buffer-level attachments:
    Orientation(P) = 1
    {Exif}    (P) = <CFBasicHash 0x28161ce80 [0x1c03a95e0]>{type = mutable dict, count = 24,
entries => .....A LOT OF CAMERA DATA HERE.....
}

    DPIWidth  (P) = 72
    {TIFF}    (P) = <CFBasicHash 0x28161c540 [0x1c03a95e0]>{type =    mutable dict, count = 7,
entries => .....MORE CAMERA DATA HERE.....
}

    DPIHeight (P) = 72
    {MakerApple}(P) = {
1 = 3;
10 = 0;
14 = 0;
3 =     {
    epoch = 0;
    flags = 1;
    timescale = 1000000000;
    value = 390750488472916;
};
4 = 0;
5 = 221;
6 = 211;
7 = 1;
8 =     (
    "-0.04894018",
    "-0.6889497",
    "-0.7034443"
);
9 = 0;
}
formatDescription = <CMVideoFormatDescription 0x280ddc780 [0x1c03a95e0]> {
mediaType:'vide' 
mediaSubType:'BGRA' 
mediaSpecific: {
    codecType: 'BGRA'       dimensions: 720 x 1280 
} 
extensions: {<CFBasicHash 0x28161f880 [0x1c03a95e0]>{type = immutable dict, count = 5,
entries =>
0 : <CFString 0x1c0917068 [0x1c03a95e0]>{contents = "CVImageBufferYCbCrMatrix"} = <CFString 0x1c09170a8 [0x1c03a95e0]>{contents = "ITU_R_601_4"}
1 : <CFString 0x1c09171c8 [0x1c03a95e0]>{contents = "CVImageBufferTransferFunction"} = <CFString 0x1c0917088 [0x1c03a95e0]>{contents = "ITU_R_709_2"}
2 : <CFString 0x1c093f348 [0x1c03a95e0]>{contents = "CVBytesPerRow"} = <CFNumber 0x81092876519e5903 [0x1c03a95e0]>{value = +2880, type = kCFNumberSInt32Type}
3 : <CFString 0x1c093f3c8 [0x1c03a95e0]>{contents = "Version"} = <CFNumber 0x81092876519eed23 [0x1c03a95e0]>{value = +2, type = kCFNumberSInt32Type}
5 : <CFString 0x1c0917148 [0x1c03a95e0]>{contents = "CVImageBufferColorPrimaries"} = <CFString 0x1c0917088 [0x1c03a95e0]>{contents = "ITU_R_709_2"}
}
}
}
sbufToTrackReadiness = 0x0
numSamples = 1
sampleTimingArray[1] = {
    {PTS = {390750488483992/1000000000 = 390750.488}, DTS = {INVALID}, duration = {INVALID}},
}
imageBuffer = 0x2832ad2c0

====================================================

//AVCaptureVideoDataOutput delegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (connection == videoConnection)
    {
        if (self.outputVideoFormatDescription == NULL)
        {
            // First frame: remember its format description and configure the writer.
            self.outputVideoFormatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
            [self setupVideoRecorder];
        }
        else if (self.status == RecorderRecording)
        {
            NSLog(@"%s : self.outputVideoFormatDescription - %@", __FUNCTION__, self.outputVideoFormatDescription);

            [self.cmDelegate manager:self capturedFrameBuffer:sampleBuffer];
            NSLog(@"%s : sampleBuffer - %@", __FUNCTION__, sampleBuffer);

            dispatch_async(vidWriteQueue, ^
            {
                if (!self->wroteFirstFrame)
                {
                    CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
                    NSLog(@"%s : sampleTime value - %lld, timescale - %i, epoch - %lli, flags - %u", __FUNCTION__, sampleTime.value, sampleTime.timescale, sampleTime.epoch, sampleTime.flags);

                    [self->assetWriter startSessionAtSourceTime:sampleTime];
                    self->wroteFirstFrame = YES;
                }
                if (self->videoAWInput.readyForMoreMediaData)
                {
                    BOOL appendSuccess = [self->videoAWInput appendSampleBuffer:sampleBuffer];
                    NSLog(@"%s : appendSuccess - %i", __FUNCTION__, appendSuccess);

                    if (!appendSuccess) NSLog(@"%s : failed to append video buffer - %@", __FUNCTION__, self->assetWriter.error.localizedDescription);
                }
            });
        }
        else if (connection == audioConnection)
        {
            // Audio samples are not handled yet.
        }
    }
}

My bad... The problem was that I was spawning the frame-capture work onto the queue already declared in AVCaptureVideoDataOutput's setSampleBufferDelegate:queue:, recursively putting the work back onto the very thread it was already running on. Posting the answer in case another idiot like me makes the same silly mistake.
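
In other words, the delegate is already being invoked on the queue given to setSampleBufferDelegate:queue:, so the extra dispatch_async onto vidWriteQueue is unnecessary; and because a CMSampleBufferRef captured by an async block is not retained by ARC, the buffer can already be invalid by the time the block runs, which would explain the all-zero CMTime. One way to apply the fix (a sketch, reusing the names from the code above) is to use the buffer synchronously:

else if (self.status == RecorderRecording)
{
    // Already on vidWriteQueue (the queue passed to setSampleBufferDelegate:queue:),
    // so use the buffer directly while it is still valid.
    if (!self->wroteFirstFrame)
    {
        CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        [self->assetWriter startSessionAtSourceTime:sampleTime];
        self->wroteFirstFrame = YES;
    }
    if (self->videoAWInput.readyForMoreMediaData)
    {
        BOOL appendSuccess = [self->videoAWInput appendSampleBuffer:sampleBuffer];
        if (!appendSuccess)
            NSLog(@"%s : failed to append video buffer - %@", __FUNCTION__,
                  self->assetWriter.error.localizedDescription);
    }
}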
