
How to use AVSampleBufferDisplayLayer in iOS 8 for RTP H264 Streams with GStreamer?

After learning that the hardware H264 decoder is available to programmers in iOS 8, I want to use it now. There is a nice introduction to 'Direct Access to Video Encoding and Decoding' from WWDC 2014; you can take a look here.

Based on Case 1 there, I started to develop an application that should be able to receive an H264 RTP/UDP stream with GStreamer, sink it into an 'appsink' element to get direct access to the NAL units, and do the conversion to create CMSampleBuffers, which my AVSampleBufferDisplayLayer can then display.

The interesting piece of code doing all that is the following:

//
//  GStreamerBackend.m
// 

#import "GStreamerBackend.h"

NSString * const naluTypesStrings[] = {
    @"Unspecified (non-VCL)",
    @"Coded slice of a non-IDR picture (VCL)",
    @"Coded slice data partition A (VCL)",
    @"Coded slice data partition B (VCL)",
    @"Coded slice data partition C (VCL)",
    @"Coded slice of an IDR picture (VCL)",
    @"Supplemental enhancement information (SEI) (non-VCL)",
    @"Sequence parameter set (non-VCL)",
    @"Picture parameter set (non-VCL)",
    @"Access unit delimiter (non-VCL)",
    @"End of sequence (non-VCL)",
    @"End of stream (non-VCL)",
    @"Filler data (non-VCL)",
    @"Sequence parameter set extension (non-VCL)",
    @"Prefix NAL unit (non-VCL)",
    @"Subset sequence parameter set (non-VCL)",
    @"Reserved (non-VCL)",
    @"Reserved (non-VCL)",
    @"Reserved (non-VCL)",
    @"Coded slice of an auxiliary coded picture without partitioning (non-VCL)",
    @"Coded slice extension (non-VCL)",
    @"Coded slice extension for depth view components (non-VCL)",
    @"Reserved (non-VCL)",
    @"Reserved (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
};


/* appsink callback: pulls each sample from the appsink, maps its memory and
   converts the contained NAL unit into a CMSampleBuffer for the display layer. */
static GstFlowReturn new_sample(GstAppSink *sink, gpointer user_data)
{
    GStreamerBackend *backend = (__bridge GStreamerBackend *)(user_data);
    GstSample *sample = gst_app_sink_pull_sample(sink);
    GstBuffer *buffer = gst_sample_get_buffer(sample);
    GstMemory *memory = gst_buffer_get_all_memory(buffer);

    GstMapInfo info;
    gst_memory_map (memory, &info, GST_MAP_READ);

    // Locate the Annex B start code (00 00 00 01 or 00 00 01) at the front of the buffer.
    int startCodeIndex = 0;
    for (int i = 0; i < 5; i++) {
        if (info.data[i] == 0x01) {
            startCodeIndex = i;
            break;
        }
    }
    // The NAL unit type is the lower 5 bits of the byte that follows the start code.
    int nalu_type = ((uint8_t)info.data[startCodeIndex + 1] & 0x1F);
    NSLog(@"NALU with Type \"%@\" received.", naluTypesStrings[nalu_type]);
    // Collect SPS (type 7) and PPS (type 8) once to build the CMVideoFormatDescription.
    if(backend.searchForSPSAndPPS) {
        if (nalu_type == 7)
            backend.spsData = [NSData dataWithBytes:&(info.data[startCodeIndex + 1]) length: info.size - 4];

        if (nalu_type == 8)
            backend.ppsData = [NSData dataWithBytes:&(info.data[startCodeIndex + 1]) length: info.size - 4];

        if (backend.spsData != nil && backend.ppsData != nil) {
            const uint8_t* const parameterSetPointers[2] = { (const uint8_t*)[backend.spsData bytes], (const uint8_t*)[backend.ppsData bytes] };
            const size_t parameterSetSizes[2] = { [backend.spsData length], [backend.ppsData length] };

            CMVideoFormatDescriptionRef videoFormatDescr;
            OSStatus status = CMVideoFormatDescriptionCreateFromH264ParameterSets(kCFAllocatorDefault, 2, parameterSetPointers, parameterSetSizes, 4, &videoFormatDescr);
            [backend setVideoFormatDescr:videoFormatDescr];
            [backend setSearchForSPSAndPPS:false];
            NSLog(@"Found all data for CMVideoFormatDescription. Creation: %@.", (status == noErr) ? @"successfully." : @"failed.");
        }
    }
    // Wrap VCL NAL units (non-IDR slice = 1, IDR slice = 5) in a CMBlockBuffer and
    // overwrite the start code with a 4-byte big-endian length prefix (AVCC format).
    if (nalu_type == 1 || nalu_type == 5) {
        CMBlockBufferRef videoBlock = NULL;
        OSStatus status = CMBlockBufferCreateWithMemoryBlock(NULL, info.data, info.size, kCFAllocatorNull, NULL, 0, info.size, 0, &videoBlock);
        NSLog(@"BlockBufferCreation: %@", (status == kCMBlockBufferNoErr) ? @"successfully." : @"failed.");
        const uint8_t sourceBytes[] = {(uint8_t)(info.size >> 24), (uint8_t)(info.size >> 16), (uint8_t)(info.size >> 8), (uint8_t)info.size};
        status = CMBlockBufferReplaceDataBytes(sourceBytes, videoBlock, 0, 4);
        NSLog(@"BlockBufferReplace: %@", (status == kCMBlockBufferNoErr) ? @"successfully." : @"failed.");

        CMSampleBufferRef sbRef = NULL;
        const size_t sampleSizeArray[] = {info.size};

        status = CMSampleBufferCreate(kCFAllocatorDefault, videoBlock, true, NULL, NULL, backend.videoFormatDescr, 1, 0, NULL, 1, sampleSizeArray, &sbRef);
        NSLog(@"SampleBufferCreate: %@", (status == noErr) ? @"successfully." : @"failed.");

        // No timing info was attached, so mark the sample to be displayed immediately.
        CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sbRef, YES);
        CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
        CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);

        NSLog(@"Error: %@, Status:%@", backend.displayLayer.error, (backend.displayLayer.status == AVQueuedSampleBufferRenderingStatusUnknown)?@"unknown":((backend.displayLayer.status == AVQueuedSampleBufferRenderingStatusRendering)?@"rendering":@"failed"));
        dispatch_async(dispatch_get_main_queue(),^{
            [backend.displayLayer enqueueSampleBuffer:sbRef];
            [backend.displayLayer setNeedsDisplay];
        });

    }

    gst_memory_unmap(memory, &info);
    gst_memory_unref(memory);
    gst_buffer_unref(buffer);

    return GST_FLOW_OK;
}

@implementation GStreamerBackend

- (instancetype)init
{
    if (self = [super init]) {
        self.searchForSPSAndPPS = true;
        self.ppsData = nil;
        self.spsData = nil;
        self.displayLayer = [[AVSampleBufferDisplayLayer alloc] init];
        self.displayLayer.bounds = CGRectMake(0, 0, 300, 300);
        self.displayLayer.backgroundColor = [UIColor blackColor].CGColor;
        self.displayLayer.position = CGPointMake(500, 500);
        self.queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
        dispatch_async(self.queue, ^{
            [self app_function];
        });
    }
    return self;
}

- (void)start
{
    if(gst_element_set_state(self.pipeline, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
        NSLog(@"Failed to set pipeline to playing");
    }
}

- (void)app_function
{
    GstElement *udpsrc, *rtphdepay, *capsfilter;
    GMainContext *context; /* GLib context used to run the main loop */
    GMainLoop *main_loop;  /* GLib main loop */


    context = g_main_context_new ();
    g_main_context_push_thread_default(context);

    g_set_application_name ("appsink");

    self.pipeline = gst_pipeline_new ("testpipe");

    /* Build the pipeline: udpsrc ! rtph264depay ! capsfilter ! appsink */
    udpsrc = gst_element_factory_make ("udpsrc", "udpsrc");
    GstCaps *caps = gst_caps_new_simple("application/x-rtp", "media", G_TYPE_STRING, "video", "clock-rate", G_TYPE_INT, 90000, "encoding-name", G_TYPE_STRING, "H264", NULL);
    g_object_set(udpsrc, "caps", caps, "port", 5000, NULL);
    gst_caps_unref(caps);
    rtphdepay = gst_element_factory_make("rtph264depay", "rtph264depay");
    capsfilter = gst_element_factory_make("capsfilter", "capsfilter");
    caps = gst_caps_new_simple("video/x-h264", "streamformat", G_TYPE_STRING, "byte-stream", "alignment", G_TYPE_STRING, "nal", NULL);
    g_object_set(capsfilter, "caps", caps, NULL);
    self.appsink = gst_element_factory_make ("appsink", "appsink");

    gst_bin_add_many (GST_BIN (self.pipeline), udpsrc, rtphdepay, capsfilter, self.appsink, NULL);

    if(!gst_element_link_many (udpsrc, rtphdepay, capsfilter, self.appsink, NULL)) {
        NSLog(@"Cannot link gstreamer elements");
        exit (1);
    }

    if(gst_element_set_state(self.pipeline, GST_STATE_READY) != GST_STATE_CHANGE_SUCCESS)
        NSLog(@"could not change to ready");

    GstAppSinkCallbacks callbacks = { NULL, NULL, new_sample,
        NULL, NULL};
    gst_app_sink_set_callbacks (GST_APP_SINK(self.appsink), &callbacks, (__bridge gpointer)(self), NULL);

    main_loop = g_main_loop_new (context, FALSE);
    g_main_loop_run (main_loop);


    /* Free resources */
    g_main_loop_unref (main_loop);
    main_loop = NULL;
    g_main_context_pop_thread_default(context);
    g_main_context_unref (context);
    gst_element_set_state (GST_ELEMENT (self.pipeline), GST_STATE_NULL);
    gst_object_unref (GST_OBJECT (self.pipeline));
}

@end

This is what I get when running the app and starting to stream to the iOS device:

NALU with Type "Sequence parameter set (non-VCL)" received.
NALU with Type "Picture parameter set   (non-VCL)" received.

Found all data for CMVideoFormatDescription. Creation: successfully..

NALU with Type "Coded slice of an IDR picture (VCL)" received.
BlockBufferCreation: successfully.
BlockBufferReplace: successfully.
SampleBufferCreate: successfully.
Error: (null), Status:unknown

NALU with Type "Coded slice of a non-IDR picture (VCL)" received.
BlockBufferCreation: successfully.
BlockBufferReplace: successfully.
SampleBufferCreate: successfully.
Error: (null), Status:rendering
[...] (repetition of the last 5 lines)

So it seems to decode as it should, but my problem is that I cannot see anything in my AVSampleBufferDisplayLayer. It might be a problem with kCMSampleAttachmentKey_DisplayImmediately, but I have set it like I was told to here (see the 'important' note).

Any ideas are welcome ;)

Got it working now. The length field of each NALU does not include the 4-byte length header itself, so I have to subtract 4 from my info.size before using it for sourceBytes.
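For reference, here is a minimal sketch of the corrected length-prefix computation (assuming, as in my stream, a 4-byte 00 00 00 01 start code at the front of the mapped buffer):

// Sketch of the fix: the AVCC length prefix must describe the NALU payload only,
// i.e. the mapped buffer size minus the 4 start-code bytes it replaces.
size_t naluLength = info.size - 4;
const uint8_t sourceBytes[] = {(uint8_t)(naluLength >> 24), (uint8_t)(naluLength >> 16), (uint8_t)(naluLength >> 8), (uint8_t)naluLength};
status = CMBlockBufferReplaceDataBytes(sourceBytes, videoBlock, 0, 4);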

Guided by your code, I wrote a program to decode and display a live H.264 stream using AVSampleBufferDisplayLayer. I use live555 rather than GStreamer to receive the H.264 NAL units.

Unfortunately, my app only displays a few frames and then no image can be shown any more. Has your app ever run into the same problem?
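Not something from the code above, but one thing that may be worth checking in that situation: AVSampleBufferDisplayLayer stops accepting buffers once its status becomes AVQueuedSampleBufferRenderingStatusFailed, and it needs to be flushed before it will render again. A rough sketch, reusing the names from the code above:

// Hedged sketch: if the layer has entered the failed state, log its error and
// flush it so that it will accept new sample buffers again.
if (backend.displayLayer.status == AVQueuedSampleBufferRenderingStatusFailed) {
    NSLog(@"Display layer failed: %@", backend.displayLayer.error);
    [backend.displayLayer flush];
}
[backend.displayLayer enqueueSampleBuffer:sbRef];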
