Putting an H.264 I frame to AVSampleBufferDisplayLayer but no video image is displayed

After a detailed review of WWDC 2014 Session 513, I am trying to write an app on iOS 8.0 to decode and display a live H.264 stream. First of all, I construct an H.264 parameter set successfully. When I get an I frame with a 4-byte start code, such as "0x00 0x00 0x00 0x01 0x65 ...", I put it into a CMBlockBuffer. Then I construct a CMSampleBuffer using the previous CMBlockBuffer. After that, I enqueue the CMSampleBuffer to an AVSampleBufferDisplayLayer. Everything is OK (I checked every returned value), except that the AVSampleBufferDisplayLayer does not show any video image. Since these APIs are fairly new to everyone, I couldn't find anybody who could resolve this problem.

I'll give the key code below, and I would really appreciate it if you could help figure out why the video image can't be displayed. Thanks a lot.

(1) The AVSampleBufferDisplayLayer is initialized. dspLayer is a property of my main view controller.

    @property(nonatomic, strong) AVSampleBufferDisplayLayer *dspLayer;

    if (!_dspLayer)
    {
        _dspLayer = [[AVSampleBufferDisplayLayer alloc] init];
        [_dspLayer setFrame:CGRectMake(90, 551, 557, 389)];
        _dspLayer.videoGravity = AVLayerVideoGravityResizeAspect;
        _dspLayer.backgroundColor = [UIColor grayColor].CGColor;

        CMTimebaseRef tmBase = nil;
        CMTimebaseCreateWithMasterClock(NULL, CMClockGetHostTimeClock(), &tmBase);
        _dspLayer.controlTimebase = tmBase;
        CMTimebaseSetTime(_dspLayer.controlTimebase, kCMTimeZero);
        CMTimebaseSetRate(_dspLayer.controlTimebase, 1.0);

        [self.view.layer addSublayer:_dspLayer];
    }

(2) In another thread, I get one H.264 I frame. The H.264 parameter set is constructed OK:

    CMVideoFormatDescriptionRef formatDesc;
    OSStatus formatCreateResult =
        CMVideoFormatDescriptionCreateFromH264ParameterSets(NULL, ppsNum+1, props, sizes, 4, &formatDesc);
    NSLog(@"construct h264 param set:%d", (int)formatCreateResult);

Construct the CMBlockBuffer. dataBuf points to the H.264 data, which starts with "0x00 0x00 0x00 0x01 0x65 ...":

    CMBlockBufferRef blockBufferOut = nil;
    CMBlockBufferCreateEmpty(NULL, 0, kCMBlockBufferAlwaysCopyDataFlag, &blockBufferOut);
    CMBlockBufferAppendMemoryBlock(blockBufferOut,
                                   dataBuf,
                                   dataLen,
                                   NULL,
                                   NULL,
                                   0,
                                   dataLen,
                                   kCMBlockBufferAlwaysCopyDataFlag);

Construct the CMSampleBuffer, OK:

    size_t sampleSizeArray[1] = {0};
    sampleSizeArray[0] = CMBlockBufferGetDataLength(blockBufferOut);
    CMSampleTimingInfo tmInfos[1] = {
        {CMTimeMake(5, 1), CMTimeMake(5, 1), CMTimeMake(5, 1)}
    };
    CMSampleBufferRef sampBuf = nil;
    formatCreateResult = CMSampleBufferCreate(kCFAllocatorDefault,
                                              blockBufferOut,
                                              YES,
                                              NULL,
                                              NULL,
                                              formatDesc,
                                              1,
                                              1,
                                              tmInfos,
                                              1,
                                              sampleSizeArray,
                                              &sampBuf);

Enqueue it to the AVSampleBufferDisplayLayer, just one frame. But I can't see any video frame in my view:

    if ([self.dspLayer isReadyForMoreMediaData])
    {
        [self.dspLayer enqueueSampleBuffer:sampBuf];
    }
    [self.dspLayer setNeedsDisplay];

Your NAL unit start codes 0x00 0x00 0x01 or 0x00 0x00 0x00 0x01 need to be replaced by a length header.

It was clearly stated in the WWDC session you are referring to that the Annex B start code needs to be replaced by an AVCC-conformant length header. You are basically remuxing from Annex B stream format to MP4 file format on the fly here (a simplified description, of course).

Your call when creating the parameter set passes "4" for this (the nalUnitHeaderLength argument), so you need to prefix your VCL NAL units with a 4-byte length. You have to specify it because in AVCC format the length header can also be shorter (1 or 2 bytes).
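A minimal sketch of that conversion in plain C, assuming one NAL unit per buffer and the 4-byte length size matching the "4" passed when creating the format description (annexb_to_avcc is a hypothetical helper, not part of the question's code):

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

// Hypothetical helper: rewrite a buffer holding one Annex B NAL unit
// ("00 00 00 01" start code + NALU) into AVCC format in place by
// replacing the 4-byte start code with a 4-byte big-endian length.
// Returns 0 on success, -1 if the buffer doesn't begin with the code.
static int annexb_to_avcc(uint8_t *buf, size_t len) {
    static const uint8_t startCode[4] = {0x00, 0x00, 0x00, 0x01};
    if (len < 5 || memcmp(buf, startCode, sizeof startCode) != 0)
        return -1;                          // not a 4-byte start code
    uint32_t nalLen = (uint32_t)(len - 4);  // bytes after the start code
    buf[0] = (uint8_t)(nalLen >> 24);       // big-endian length header
    buf[1] = (uint8_t)(nalLen >> 16);
    buf[2] = (uint8_t)(nalLen >> 8);
    buf[3] = (uint8_t)(nalLen);
    return 0;
}
```

Running this on dataBuf before appending it to the CMBlockBuffer would give the decoder a properly framed AVCC sample.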

Whatever you put inside the CMSampleBuffer will be accepted; there is no sanity check on whether the contents can be decoded, only that you met the required parameters for combining arbitrary data with timing information and a parameter set.

Basically, with the data you put in, you told the decoder that the VCL NAL unit is 1 byte long (the start code 0x00 0x00 0x00 0x01, read as a 4-byte big-endian length, is 1). The decoder doesn't get the full NAL unit and bails out with an error.

Also make sure that when you create the parameter set, the SPS/PPS do not have a length header added and that their Annex B start codes are stripped as well.
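One way to get the raw SPS/PPS payloads is to scan the Annex B stream for start codes and take the bytes in between; those payloads (no start code, no length header) are what CMVideoFormatDescriptionCreateFromH264ParameterSets expects. A sketch in plain C (next_nalu is a hypothetical helper, not from the question):

```c
#include <stddef.h>
#include <stdint.h>

// Hypothetical helper: return a pointer to the payload of the first NAL
// unit in an Annex B buffer (the byte after its 3- or 4-byte start code),
// or NULL if no start code is found. *payloadLen receives the number of
// bytes up to the next start code or the end of the buffer.
static const uint8_t *next_nalu(const uint8_t *buf, size_t len, size_t *payloadLen) {
    size_t start = 0;
    int found = 0;
    for (size_t i = 0; i + 3 <= len; i++) {        // find "00 00 01"
        if (buf[i] == 0x00 && buf[i+1] == 0x00 && buf[i+2] == 0x01) {
            start = i + 3;
            found = 1;
            break;
        }
    }
    if (!found) return NULL;
    size_t end = len;                              // find the next start code
    for (size_t j = start; j + 3 <= len; j++) {
        if (buf[j] == 0x00 && buf[j+1] == 0x00 && buf[j+2] == 0x01) {
            // back up one byte if the next start code is the 4-byte form
            end = (j > start && buf[j-1] == 0x00) ? j - 1 : j;
            break;
        }
    }
    *payloadLen = end - start;
    return buf + start;
}
```

The SPS and PPS payloads found this way would go into the props/sizes arrays of the question's CMVideoFormatDescriptionCreateFromH264ParameterSets call.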

Also, I recommend not using AVSampleBufferDisplayLayer but going through a VTDecompressionSession, so you can do things like color correction or other processing that is needed inside a pixel shader.

It might be an idea to use VTDecompressionSessionDecodeFrame initially, as this will give you some feedback on the success of the decoding. If there is an issue with the decoding, the AVSampleBufferDisplayLayer doesn't tell you; it just doesn't display anything. I can give you some code to help with this if required; let me know how you get on, as I am attempting the same thing :)
