GStreamer: Sending string to another pipeline via UDP
I have a GStreamer pipeline in C, meant to send a file to a receiving pipeline via UDP.
My sending pipeline is similar to this one:
filesrc location=X.mp4 ! decodebin ! videoconvert ! x264enc ! rtph264pay ! udpsink host=X port=5000
My receiving pipeline is similar to this:
udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink
My problem is that I need to send, with each frame, a simple string to the receiving pipeline. I need to be able to change the string on the fly (via a callback), and my receiving pipeline needs to be able to parse this string (also via a callback).
I understand I can't use textoverlay because the text becomes part of the video pixels, and the obvious solution seems to be subtitles, but I can't figure out how to create a subtitle stream dynamically.
Just to emphasize: I can't use a subtitle file, because I need to be able to send the subtitles "on the fly".
Any help would be GREATLY appreciated.
Eventually I managed to do it.
In case it helps anyone in the future:
In my sender pipeline, I attach a probe to the src pad of the rtph264pay element:
rtph264pay_src_pad = gst_element_get_static_pad (rtph264pay_HD, "src");
gst_pad_add_probe (rtph264pay_src_pad, GST_PAD_PROBE_TYPE_BUFFER,
    (GstPadProbeCallback) set_x_to_rtp_header, NULL, NULL);
gst_object_unref (rtph264pay_src_pad);
This is the callback "set_x_to_rtp_header":
static GstPadProbeReturn
set_x_to_rtp_header (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstBuffer *buffer;
  GstRTPBuffer rtpBuffer = GST_RTP_BUFFER_INIT;

  g_print ("%s\n", __func__);

  buffer = GST_PAD_PROBE_INFO_BUFFER (info);
  buffer = gst_buffer_make_writable (buffer);

  /* Making a buffer writable can fail (for example if it
   * cannot be copied and is used more than once)
   */
  if (buffer == NULL)
    return GST_PAD_PROBE_OK;

  if (gst_rtp_buffer_map (buffer, GST_MAP_WRITE, &rtpBuffer)) {
    pthread_mutex_lock (&mutex);
    if (gst_rtp_buffer_set_extension_data (&rtpBuffer,
            setup_data.left_videocrop,
            sizeof (setup_data.left_videocrop)) != TRUE) {
      g_printerr ("cannot add extension to rtp header\n");
    }
    pthread_mutex_unlock (&mutex);
    gst_rtp_buffer_unmap (&rtpBuffer);
  }

  /* gst_buffer_make_writable() may have returned a copy, so make sure
   * the (possibly new) buffer is the one pushed downstream. */
  GST_PAD_PROBE_INFO_DATA (info) = buffer;

  return GST_PAD_PROBE_OK;
}
In my receiver pipeline, I attach a probe to the sink pad of the rtph264depay element:
rtph264depay_sink_pad = gst_element_get_static_pad (rtph264depay_HD, "sink");
gst_pad_add_probe (rtph264depay_sink_pad, GST_PAD_PROBE_TYPE_BUFFER,
    (GstPadProbeCallback) get_x_from_rtp_header, videomixer, NULL);
gst_object_unref (rtph264depay_sink_pad);
And this is the callback "get_x_from_rtp_header":
static GstPadProbeReturn
get_x_from_rtp_header (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstBuffer *buffer;
  GstRTPBuffer rtpBuffer = GST_RTP_BUFFER_INIT;
  guint16 ret_x;
  gpointer data;
  guint wordlen;
  GstElement *videomixer = (GstElement *) user_data;
  GstPad *videomixer_HD_sink_pad;

  g_print ("%s\n", __func__);

  buffer = GST_PAD_PROBE_INFO_BUFFER (info);
  buffer = gst_buffer_make_writable (buffer);

  /* Making a buffer writable can fail (for example if it
   * cannot be copied and is used more than once)
   */
  if (buffer == NULL)
    return GST_PAD_PROBE_OK;

  if (gst_rtp_buffer_map (buffer, GST_MAP_READ, &rtpBuffer)) {
    /* get x from the rtp header into the ret_x variable */
    if (gst_rtp_buffer_get_extension_data (&rtpBuffer, &ret_x, &data,
            &wordlen) != TRUE) {
      gst_rtp_buffer_unmap (&rtpBuffer);  /* don't leak the mapping */
      return GST_PAD_PROBE_OK;
    }
    gst_rtp_buffer_unmap (&rtpBuffer);
  }

  return GST_PAD_PROBE_OK;
}