
Gstreamer appsink to rtsp server with appsrc as source large latency

I have a pipeline with an appsink that pushes samples to an appsrc, which in turn acts as the source of a pipeline created by an RTSP server. It works: I can connect to the RTSP server and watch the streamed video. The problem is latency. For some reason a lot of buffers get queued in the appsrc, and the viewed stream has a latency of more than two seconds.

I tried to find the source of the latency, and it looks like data only starts being read from the appsrc source pad some time after the pipeline is started. The delay between the point the pipeline starts and the point data starts being read out of the appsrc source pad then turns into a constant latency.
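One way to confirm where the delay actually accumulates (a suggestion on my side, not something from the question) is GStreamer's built-in latency tracer; `./test-app` below is a placeholder name for the compiled test application:

```shell
# Enable the latency tracer (available since GStreamer 1.8) and capture its
# log output; "./test-app" is a placeholder for the compiled application.
GST_TRACERS="latency" GST_DEBUG="GST_TRACER:7" ./test-app 2> trace.log

# Afterwards, inspect the reported latency values.
grep "latency" trace.log | tail
```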

I found this by reading out how many bytes are queued in the appsrc each time I push a buffer to it. This value keeps rising for some time. Once data starts being read out, the number of bytes stored in the appsrc queue stays approximately the same for the rest of the time I stream the video.
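If the appsrc queue really fills up before the RTSP pipeline starts pulling, one mitigation worth trying is to let appsrc drop data once `max-bytes` is reached instead of growing the queue. This is a sketch, assuming GStreamer >= 1.20, which added the `leaky-type` property on appsrc:

```shell
# Hypothetical variant of the RTSP factory launch line from the question.
# leaky-type makes appsrc drop buffers once max-bytes is exceeded (check the
# GstAppLeakyType docs for which end of the queue is dropped); it requires
# GStreamer >= 1.20.
"( appsrc name=appsrc format=time stream-type=0 is-live=true \
   max-bytes=200000 leaky-type=downstream ! \
   queue max-size-buffers=3 leaky=downstream ! \
   rtph264pay config-interval=1 name=pay0 )"
```

On an older GStreamer, a similar effect can be approximated in the push callback by checking `gst_app_src_get_current_level_bytes()` and skipping the push while no client is consuming.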

Here is the test application I'm using to verify the correct functionality of this design.

#include <stdio.h>
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <gst/app/gstappsink.h>
#include <time.h>

#include <gst/rtsp-server/rtsp-server.h>

GMainLoop *loop;
GstElement *appsink;
GstElement *appsrc;
GstElement *appsink_pipeline;

/* Functions below print the Capabilities in a human-friendly format */
static gboolean print_field (GQuark field, const GValue * value, gpointer pfx) {
  gchar *str = gst_value_serialize (value);

  g_print ("%s  %15s: %s\n", (gchar *) pfx, g_quark_to_string (field), str);
  g_free (str);
  return TRUE;
}

static void print_caps (const GstCaps * caps, const gchar * pfx) {
  guint i;

  g_return_if_fail (caps != NULL);

  if (gst_caps_is_any (caps)) {
    g_print ("%sANY\n", pfx);
    return;
  }
  if (gst_caps_is_empty (caps)) {
    g_print ("%sEMPTY\n", pfx);
    return;
  }

  for (i = 0; i < gst_caps_get_size (caps); i++) {
    GstStructure *structure = gst_caps_get_structure (caps, i);

    g_print ("%s%s\n", pfx, gst_structure_get_name (structure));
    gst_structure_foreach (structure, print_field, (gpointer) pfx);
  }
}

/* called when the appsink notifies us that there is a new buffer ready for
 * processing */
static GstFlowReturn
on_new_sample_from_sink (GstElement * elt, void * data)
{
  GstSample *sample;
  GstFlowReturn ret = GST_FLOW_OK;
  guint64 bytes;
 

  /* get the sample from appsink */
  sample = gst_app_sink_pull_sample (GST_APP_SINK (elt));
  if(appsrc)
  {

    bytes = gst_app_src_get_current_level_bytes(GST_APP_SRC(appsrc));
    g_print("buffered bytes before push %" G_GUINT64_FORMAT "\n", bytes);
    
    ret = gst_app_src_push_sample(GST_APP_SRC (appsrc), sample);
    // bytes = gst_app_src_get_current_level_bytes(GST_APP_SRC(appsrc));
    // if(ret == GST_FLOW_OK)
      // g_print("pushed ok - buffered bytes after push %lu\n", bytes);

  }
  
  gst_sample_unref (sample);
 
  return ret;
}
 
/* called when we get a GstMessage from the source pipeline; on EOS we
 * notify the appsrc of it. */
static gboolean
on_source_message (GstBus * bus, GstMessage * message, void * data)
{
  gint percent;
  g_print ("%s\n", __func__);
 
  switch (GST_MESSAGE_TYPE (message)) {
    case GST_MESSAGE_EOS:
      g_print ("The source got dry\n");
      gst_app_src_end_of_stream (GST_APP_SRC (appsrc));
      break;
    case GST_MESSAGE_ERROR:
      g_print ("Received error\n");
      g_main_loop_quit (loop);
      break;
    case GST_MESSAGE_BUFFERING:
      gst_message_parse_buffering (message, &percent);
      g_print ("Buffering = %d\n", percent);
      break;
    default:
      break;
  }
  return TRUE;
}
 
/* the "need-data" signal callback returns void; it only tells us that
 * downstream wants more data */
static void need_data (GstElement * appsrc_loc,
                    guint length,
                    gpointer udata)
{
  g_print("Need data\n");
}

/* this timeout is periodically run to clean up the expired sessions from the
 * pool. This needs to be run explicitly currently but might be done
 * automatically as part of the mainloop. */
static gboolean
timeout (GstRTSPServer * server)
{
  GstRTSPSessionPool *pool;

  pool = gst_rtsp_server_get_session_pool (server);
  gst_rtsp_session_pool_cleanup (pool);
  g_object_unref (pool);

  return TRUE;
}

void clientConnected(GstRTSPServer* server, GstRTSPClient* client, gpointer user)
{
  g_print("%s\n", __func__);
  
}

static void media_state_cb(GstRTSPMedia *media, gint state, gpointer user_data)
{
  g_print("media state = %d\n", state);
}


static void
media_construct (GstRTSPMediaFactory * factory, GstRTSPMedia * media,
    gpointer user_data)
{
  GstElement *element;

  g_print("%s\n", __func__);
  /* get the element used for providing the streams of the media */
  element = gst_rtsp_media_get_element (media);

  /* get our appsrc, we named it 'appsrc' with the name property */
  appsrc = gst_bin_get_by_name_recurse_up (GST_BIN (element), "appsrc");
  g_signal_connect (appsrc, "need-data",
      G_CALLBACK (need_data), NULL);

  g_signal_connect (media, "new-state",
      G_CALLBACK (media_state_cb), NULL);

  gst_object_unref (element);
}

static void
media_configure (GstRTSPMediaFactory * factory, GstRTSPMedia * media,
    gpointer user_data)
{
  GstPad *pad;
  GstCaps *caps;
  gchar *caps_str;
  GstElement *element;

  g_print("%s\n", __func__);

  /* get the element used for providing the streams of the media */
  element = gst_rtsp_media_get_element (media);

  /* get our appsrc, we named it 'mysrc' with the name property */
  appsrc = gst_bin_get_by_name_recurse_up (GST_BIN (element), "appsrc");

  pad = gst_element_get_static_pad (appsink, "sink");
  if(pad)
  {
    g_print("Got pad\n");
    caps = gst_pad_get_current_caps (pad);
    if(caps)
    {
      caps_str = gst_caps_to_string  (caps);
      g_print("Got caps %s\n", caps_str);
      g_object_set (G_OBJECT (appsrc), "caps",  caps, NULL);

      gst_caps_unref(caps);
    }
  }

  /* this instructs appsrc that we will be dealing with timed buffers */
  gst_util_set_object_arg (G_OBJECT (appsrc), "format", "time");

  gst_object_unref (element);
}

int main (int argc, char *argv[]){
  GstBus *bus;
  GstRTSPServer *server;
  GstRTSPMountPoints *mounts;
  GstRTSPMediaFactory *factory;

  gchar src[] = "nvv4l2camerasrc device=/dev/video0  ! video/x-raw(memory:NVMM), width=1920, height=1080, format=UYVY, framerate=60/1 ! " 
        " queue max-size-buffers=3 leaky=downstream ! "
        " nvvidconv name=conv ! video/x-raw(memory:NVMM), width=1280, height=720, format=NV12, framerate=60/1 ! "
        " nvv4l2h264enc control-rate=1  bitrate=8000000 preset-level=1 profile=0 disable-cabac=1 maxperf-enable=1 name=encoder insert-sps-pps=1 insert-vui=1  idrinterval=30 ! "
        " appsink name=appsink sync=false max-buffers=3";


  gchar sink[] = "( appsrc name=appsrc format=3 stream-type=0 is-live=true blocksize=2097152  max-bytes=200000 ! "
                " queue max-size-buffers=3 leaky=no ! "
                " rtph264pay config-interval=1 name=pay0 )";
                  
        
  gst_init (&argc, &argv);

  loop = g_main_loop_new (NULL, FALSE);

  /* Create pipeline with appsink */
  g_print("Creating pipeline with appsink\n");
  appsink_pipeline = gst_parse_launch (src, NULL);
 
  if (appsink_pipeline == NULL) {
    g_print ("Bad source\n");
    g_main_loop_unref (loop);
    return -1;
  }

  /* to be notified of messages from this pipeline, mostly EOS */
  bus = gst_element_get_bus (appsink_pipeline);
  gst_bus_add_watch (bus, (GstBusFunc) on_source_message, appsink_pipeline);
  gst_object_unref (bus);

  /* Create push_buffer callback for appsink */
  g_print("Creating push buffer callback\n");
  appsink = gst_bin_get_by_name (GST_BIN (appsink_pipeline), "appsink");
  g_object_set (G_OBJECT (appsink), "emit-signals", TRUE, "sync", FALSE, NULL);
  g_signal_connect (appsink, "new-sample",
      G_CALLBACK (on_new_sample_from_sink), NULL);

  /* Create rtsp server with pipeline starting with appsrc */
  g_print("Creating rtsp server\n");
  /* create a server instance */
  server = gst_rtsp_server_new ();

  /* get the mount points for this server, every server has a default object
   * that can be used to map uri mount points to media factories */
  mounts = gst_rtsp_server_get_mount_points (server);

  /* make a media factory for a test stream. The default media factory can use
   * gst-launch syntax to create pipelines.
   * any launch line works as long as it contains elements named pay%d. Each
   * element with pay%d names will be a stream */
  factory = gst_rtsp_media_factory_new ();

  gst_rtsp_media_factory_set_launch (factory, sink);
  gst_rtsp_media_factory_set_shared(factory, TRUE);

  /* attach the test factory to the /test url */
  gst_rtsp_mount_points_add_factory (mounts, "/test", factory);

  /* don't need the ref to the mapper anymore */
  g_object_unref (mounts);

  /* attach the server to the default maincontext */
  if (gst_rtsp_server_attach (server, NULL) == 0)
    goto failed;

  /* add a timeout for the session cleanup */
  g_timeout_add_seconds (2, (GSourceFunc) timeout, server);

  g_signal_connect (server, "client-connected",
      G_CALLBACK (clientConnected), NULL);

  /* Create media-constructed callback to get appsrc reference */
  g_print("Creating media-constructed callback\n");

  g_signal_connect (factory, "media-constructed", (GCallback) media_construct,
      NULL);

  g_signal_connect (factory, "media-configure", (GCallback) media_configure,
      NULL);
  /* Push buffers from appsink to appsrc */


  /* start serving, this never stops */ 

  g_print("Running main loop\n");
  
  gst_element_set_state (appsink_pipeline, GST_STATE_PLAYING);
  g_main_loop_run (loop);
  gst_element_set_state (appsink_pipeline, GST_STATE_NULL);

  return 0;

  /* ERRORS */
failed:
  {
    g_print ("failed to attach the server\n");
    return -1;
  }
}

I will appreciate every idea about what can cause this behavior and how to solve it.

Thanks a lot!

This latency problem may have many causes, but most of the time it is because frames are not in sync, and a lot of data piles up in the queue.

To narrow down the real problem, test these cases:

  1. Check the behavior with videotestsrc instead of the camera source.
  2. Are you sure the queue after nvv4l2camerasrc is needed? What is the output if you skip the queue element?
  3. You can also check with a lower-resolution input to learn something from it.
  4. What happens if you use v4l2src instead of nvv4l2camerasrc, if your camera source is V4L2-compliant?

Thanks
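Points 1 and 4 above can be tried quickly from the command line before changing the application. These gst-launch-1.0 lines are sketches: x264enc stands in for the NVIDIA encoder and may need different properties on your board.

```shell
# 1. videotestsrc instead of the camera, with a software encoder stand-in:
gst-launch-1.0 videotestsrc is-live=true \
  ! video/x-raw,width=1280,height=720,framerate=60/1 \
  ! x264enc tune=zerolatency ! fakesink

# 4. plain v4l2src, if the camera is V4L2-compliant:
gst-launch-1.0 v4l2src device=/dev/video0 ! fakesink
```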
