
IP camera capture

I am trying to capture the streams of two IP cameras connected directly to a mini PCIe dual gigabit expansion card in an nVidia Jetson TK1.

I have managed to capture the streams of both cameras using gstreamer with the following command:

gst-launch-0.10 rtspsrc location=rtsp://admin:123456@192.168.0.123:554/mpeg4cif latency=0 ! decodebin ! ffmpegcolorspace ! autovideosink rtspsrc location=rtsp://admin:123456@192.168.2.254:554/mpeg4cif latency=0 ! decodebin ! ffmpegcolorspace ! autovideosink

It shows one window per camera, but gives this output once the capture starts:

    WARNING: from element /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink1/GstXvImageSink:autovideosink1-actual-sink-xvimage: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2875): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink1/GstXvImageSink:autovideosink1-actual-sink-xvimage:
There may be a timestamping problem, or this computer is too slow.
---> TVMR: Video-conferencing detected !!!!!!!!!

The streams play well, with "good" synchronization between the cameras too, but after a while one of the cameras suddenly stops, and usually a few seconds later the other one stops as well. Using an interface sniffer like Wireshark I can check that the RTSP packets are still being sent from the cameras.

My intention is to use these cameras as a stereo camera with openCV. I can capture the streams with OpenCV using the following calls:

camera[0].open("rtsp://admin:123456@192.168.2.254:554/mpeg4cif");//right
camera[1].open("rtsp://admin:123456@192.168.0.123:554/mpeg4cif");//left
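
For reference, a minimal self-contained version of that capture loop could look like the sketch below (assuming OpenCV 2.x built with FFmpeg support; grabbing both frames back to back keeps the stereo pair as close in time as possible):

#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>

using namespace cv;

int main()
{
    VideoCapture camera[2];
    camera[0].open("rtsp://admin:123456@192.168.2.254:554/mpeg4cif");//right
    camera[1].open("rtsp://admin:123456@192.168.0.123:554/mpeg4cif");//left

    Mat right, left;
    while (camera[0].isOpened() && camera[1].isOpened()) {
        // grab both frames first, then decode and display them
        if (!camera[0].grab() || !camera[1].grab()) break;
        camera[0].retrieve(right);
        camera[1].retrieve(left);
        imshow("right", right);
        imshow("left", left);
        if ((char)waitKey(1) == 27) break; // ESC quits
    }
    return 0;
}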

The capture randomly starts well or badly, synchronized or not, with or without delay, but after a while it becomes impossible to use the captured images, as you can see in the picture:

[screenshot of the corrupted captured frames]

The output while the openCV program is running is usually like this (I copied the most complete one):

[h264 @ 0x1b9580] slice type too large (2) at 0 23
[h264 @ 0x1b9580] decode_slice_header error

[h264 @ 0x1b1160] left block unavailable for requested intra mode at 0 6
[h264 @ 0x1b1160] error while decoding MB 0 6, bytestream (-1)

[h264 @ 0x1b1160] mmco: unref short failure

[h264 @ 0x1b9580] too many reference frames

[h264 @ 0x1b1160] pps_id (-1) out of range

The cameras used are two SIP-1080J modules.

Does anyone know how to achieve a good capture using openCV? First of all, getting rid of those h264 messages and obtaining stable images while the program runs.

If not, how can I improve the pipelines and buffers using gstreamer to get a good capture without the streams suddenly stopping? Although I have never captured through openCV using gstreamer, maybe some day I will know how to do it and that will solve this problem.

Thank you very much.

After several days of deep searching and some attempts, I went for using the gstreamer-0.10 API directly. First I learned how to use it with the tutorials at http://docs.gstreamer.com/pages/viewpage.action?pageId=327735

For most of the tutorials you only need to install libgstreamer0.10-dev and a few more packages. I installed them all:

sudo apt-get install libgstreamer0*

Then copy the code of the example you want to try into a .c file and, from a terminal in the folder where the .c file is, type (in some of the examples you have to add more libraries to pkg-config):

gcc basic-tutorial-1.c $(pkg-config --cflags --libs gstreamer-0.10) -o basic-tutorial-1

After that I no longer felt lost, and I started to try mixing some C and C++ code. You can compile it with the proper g++ command, with a CMakeLists.txt, or however you prefer... As I am developing on the nVidia Jetson TK1, I use Nsight Eclipse Edition, and I had to configure the project properties correctly to be able to use both the gstreamer-0.10 libs and the openCV libs.
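
For a plain command-line build, something along these lines should work, assuming the gstreamer-app-0.10 and opencv pkg-config files are installed (the file names here are just examples):

g++ capture.cpp $(pkg-config --cflags --libs gstreamer-0.10 gstreamer-app-0.10 opencv) -o capture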

Mixing some pieces of code, I finally managed to capture the streams of my two IP cameras in real time, with no appreciable delay, no bad decoding of any frame, and both streams synchronized. The only thing I have not solved yet is getting the frames in color instead of grayscale (I get a "Segmentation fault" when I try other CV_ values):

v = Mat(Size(640, 360),CV_8U, (char*)GST_BUFFER_DATA(gstImageBuffer));
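
A likely explanation (not verified on the Jetson) is that ffdec_h264 outputs I420, so the CV_8U Mat above only wraps the 640x360 luma plane, which gives a grayscale image, while larger pixel types read past the end of the buffer, which would cause the segmentation fault. A sketch of a color conversion under that assumption, with OpenCV 2.4 or later:

// Untested sketch: assumes the decoded buffer is a full 640x360 I420 frame
// (one Y plane followed by two quarter-size chroma planes, 1.5*height rows in total).
Mat yuv(Size(640, 360 * 3 / 2), CV_8UC1, (char*)GST_BUFFER_DATA(gstImageBuffer));
Mat bgr;
cvtColor(yuv, bgr, CV_YUV2BGR_I420); // then imshow("view", bgr);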

The complete code is below: I capture with gstreamer, turn the capture into an openCV Mat object and then show it. The code is for capturing one IP camera only, but you can duplicate the objects and methods to capture several cameras at the same time.

#include <opencv2/core/core.hpp>
#include <opencv2/contrib/contrib.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/imgproc/imgproc.hpp>
#include <opencv2/video/video.hpp>

#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <gst/app/gstappbuffer.h>
#include <glib.h>

#include <stdio.h>   /* printf */
#include <stdlib.h>  /* setenv */
#include <string.h>  /* strcpy, memset */

#define DEFAULT_LATENCY_MS  1

using namespace cv;

typedef struct _vc_cfg_data {
    char server_ip_addr[100];
} vc_cfg_data;

typedef struct _vc_gst_data {
    GMainLoop *loop;
    GMainContext *context;
    GstElement *pipeline;
    GstElement *rtspsrc,*depayloader, *decoder, *converter, *sink;
    GstPad *recv_rtp_src_pad;
} vc_gst_data;

typedef struct _vc_data {
    vc_gst_data gst_data;
    vc_cfg_data cfg;
} vc_data;

/* Global data */
vc_data app_data;

static void vc_pad_added_handler (GstElement *src, GstPad *new_pad, vc_data *data);


#define VC_CHECK_ELEMENT_ERROR(e, name) \
if (!e) { \
g_printerr ("Element %s could not be created. Exiting.\n", name); \
return -1; \
}

/*******************************************************************************
Gstreamer pipeline creation and init
*******************************************************************************/
int vc_gst_pipeline_init(vc_data *data)
{
    GstStateChangeReturn ret;

    // Template
    GstPadTemplate* rtspsrc_pad_template;

    // Create a new GMainLoop
    data->gst_data.loop = g_main_loop_new (NULL, FALSE);
    data->gst_data.context = g_main_loop_get_context(data->gst_data.loop);

    // Create gstreamer elements
    data->gst_data.pipeline = gst_pipeline_new ("videoclient");
    VC_CHECK_ELEMENT_ERROR(data->gst_data.pipeline, "pipeline");

    //RTP UDP Source - for received RTP messages
    data->gst_data.rtspsrc = gst_element_factory_make ("rtspsrc", "rtspsrc");
    VC_CHECK_ELEMENT_ERROR(data->gst_data.rtspsrc,"rtspsrc");

    printf("URL: %s\n",data->cfg.server_ip_addr);
    g_print ("Setting RTSP source properties: \n");
    g_object_set (G_OBJECT (data->gst_data.rtspsrc), "location", data->cfg.server_ip_addr, "latency", DEFAULT_LATENCY_MS, NULL);

    //RTP H.264 Depayloader
    data->gst_data.depayloader = gst_element_factory_make ("rtph264depay","depayloader");
    VC_CHECK_ELEMENT_ERROR(data->gst_data.depayloader,"rtph264depay");

    //ffmpeg decoder
    data->gst_data.decoder = gst_element_factory_make ("ffdec_h264", "decoder");
    VC_CHECK_ELEMENT_ERROR(data->gst_data.decoder,"ffdec_h264");

    data->gst_data.converter = gst_element_factory_make ("ffmpegcolorspace", "converter");
    VC_CHECK_ELEMENT_ERROR(data->gst_data.converter,"ffmpegcolorspace");

    // App sink - hands decoded buffers over to the application
    data->gst_data.sink = gst_element_factory_make ("appsink", "sink");
    VC_CHECK_ELEMENT_ERROR(data->gst_data.sink,"appsink");
    gst_app_sink_set_max_buffers((GstAppSink*)data->gst_data.sink, 1);
    gst_app_sink_set_drop ((GstAppSink*)data->gst_data.sink, TRUE);
    g_object_set (G_OBJECT (data->gst_data.sink),"sync", FALSE, NULL);

    //Request pads from rtspsrc, starting with the RTP receive src pad.
    //This pad delivers the RTP data received from the network.
    rtspsrc_pad_template = gst_element_class_get_pad_template (GST_ELEMENT_GET_CLASS (data->gst_data.rtspsrc),"recv_rtp_src_0");

    // Use the template to request the pad
    data->gst_data.recv_rtp_src_pad = gst_element_request_pad (data->gst_data.rtspsrc, rtspsrc_pad_template,
    "recv_rtp_src_0", NULL);

    // Print the name for confirmation (the request may fail, so check first)
    if (data->gst_data.recv_rtp_src_pad)
        g_print ("A new pad %s was created\n",
        GST_PAD_NAME (data->gst_data.recv_rtp_src_pad));

    // Add elements into the pipeline
    g_print(" Adding elements to pipeline...\n");
    gst_bin_add_many (GST_BIN (data->gst_data.pipeline),
            data->gst_data.rtspsrc,
            data->gst_data.depayloader,
            data->gst_data.decoder,
            data->gst_data.converter,
            data->gst_data.sink,
        NULL);

    // Link some of the elements together
    g_print(" Linking some elements ...\n");
    if(!gst_element_link_many (data->gst_data.depayloader, data->gst_data.decoder, data->gst_data.converter, data->gst_data.sink, NULL))
        g_print("Error: could not link all elements\n");

    // Connect to the pad-added signal of rtspsrc. This allows us to link
    //the dynamic RTP source pad to the depayloader when it is created.
    if(!g_signal_connect (data->gst_data.rtspsrc, "pad-added",
    G_CALLBACK (vc_pad_added_handler), data))
        g_print("Error: could not add signal handler\n");

    // Set the pipeline to "playing" state
    g_print ("Now playing A\n");
    ret = gst_element_set_state (data->gst_data.pipeline, GST_STATE_PLAYING);
    if (ret == GST_STATE_CHANGE_FAILURE) {
        g_printerr ("Unable to set the pipeline A to the playing state.\n");
        gst_object_unref (data->gst_data.pipeline);
        return -1;
    }

    return 0;
}

static void vc_pad_added_handler (GstElement *src, GstPad *new_pad, vc_data *data) {
    GstPad *sink_pad = gst_element_get_static_pad (data->gst_data.depayloader, "sink");
    GstPadLinkReturn ret;
    GstCaps *new_pad_caps = NULL;
    GstStructure *new_pad_struct = NULL;
    const gchar *new_pad_type = NULL;
    g_print ("Received new pad '%s' from '%s':\n", GST_PAD_NAME (new_pad), GST_ELEMENT_NAME (src));

    /* Check the new pad's name */
    if (!g_str_has_prefix (GST_PAD_NAME (new_pad), "recv_rtp_src_")) {
        g_print (" It is not the right pad. Need recv_rtp_src_. Ignoring.\n");
        goto exit;
    }

    /* If our depayloader's sink pad is already linked, we have nothing to do here */
    if (gst_pad_is_linked (sink_pad)) {
        g_print (" Sink pad from %s already linked. Ignoring.\n", GST_ELEMENT_NAME (src));
        goto exit;
    }

    /* Check the new pad's type */
    new_pad_caps = gst_pad_get_caps (new_pad);
    new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
    new_pad_type = gst_structure_get_name (new_pad_struct);

    /* Attempt the link */
    ret = gst_pad_link (new_pad, sink_pad);
    if (GST_PAD_LINK_FAILED (ret)) {
        g_print (" Type is '%s' but link failed.\n", new_pad_type);
    } else {
        g_print (" Link succeeded (type '%s').\n", new_pad_type);
    }

    exit:
    /* Unreference the new pad's caps, if we got them */
    if (new_pad_caps != NULL)
        gst_caps_unref (new_pad_caps);
    /* Unreference the sink pad */
    gst_object_unref (sink_pad);
}



int vc_gst_pipeline_clean(vc_data *data) {
    GstStateChangeReturn ret;
    GstStateChangeReturn ret2;

    /* Cleanup Gstreamer */
    if(!data->gst_data.pipeline)
        return 0;

    /* Send the main loop a quit signal */
    g_main_loop_quit(data->gst_data.loop);
    g_main_loop_unref(data->gst_data.loop);
    ret = gst_element_set_state (data->gst_data.pipeline, GST_STATE_NULL);
    if (ret == GST_STATE_CHANGE_FAILURE) {
        g_printerr ("Unable to set the pipeline A to the NULL state.\n");
        gst_object_unref (data->gst_data.pipeline);
        return -1;
    }

    g_print ("Deleting pipeline\n");
    gst_object_unref (GST_OBJECT (data->gst_data.pipeline));
    /* Zero out the structure */
    memset(&data->gst_data, 0, sizeof(vc_gst_data));
    return 0;
}


void handleKey(char key)
{
    switch (key)
    {
    case 27:
        /* ESC pressed - placeholder, no action implemented yet */
        break;
    }
}


int vc_mainloop(vc_data* data)
{

    GstBuffer *gstImageBuffer;

    Mat v;

    namedWindow("view",WINDOW_NORMAL);

    while (1) {

        /* Blocks until the appsink has a decoded buffer available */
        gstImageBuffer = gst_app_sink_pull_buffer((GstAppSink*)data->gst_data.sink);

        if (gstImageBuffer != NULL )
        {
                v = Mat(Size(640, 360),CV_8U, (char*)GST_BUFFER_DATA(gstImageBuffer));

                imshow("view", v);

                handleKey((char)waitKey(3));

                gst_buffer_unref(gstImageBuffer);
        }else{
            g_print("gsink buffer didn't return buffer.");
        }
    }
    return 0;
}


int main (int argc, char *argv[])
{
    setenv("DISPLAY", ":0", 0);

    strcpy(app_data.cfg.server_ip_addr, "rtsp://admin:123456@192.168.0.123:554/mpeg4cif");

    gst_init (&argc, &argv);

    if(vc_gst_pipeline_init(&app_data) == -1) {
        printf("Gstreamer pipeline creation and init failed\n");
        goto cleanup;
    }

    vc_mainloop(&app_data);

    printf ("Returned, stopping playback\n");
    cleanup:
    return vc_gst_pipeline_clean(&app_data);
}

I hope this helps!! ;)
