
UDP video streaming in GStreamer with Python

All of this is done on the same computer. I would like to receive a video stream from GStreamer using Python. First I will introduce the GStreamer pipeline:

gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480 ! timeoverlay ! tee name="local" ! queue ! autovideosink local. ! queue ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000

Then I want to receive the video stream from GStreamer with a Python script and display it with OpenCV:

import sys

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst
import cv2
import numpy

Gst.init(None)


image_arr = None

def gst_to_opencv(sample):
    buf = sample.get_buffer()
    caps = sample.get_caps()
    print(caps.get_structure(0).get_value('format'))
    print(caps.get_structure(0).get_value('height'))
    print(caps.get_structure(0).get_value('width'))

    print(buf.get_size())

    arr = numpy.ndarray(
        (caps.get_structure(0).get_value('height'),
         caps.get_structure(0).get_value('width'),
         3),
        buffer=buf.extract_dup(0, buf.get_size()),
        dtype=numpy.uint8)
    return arr

def new_buffer(sink, data):
    global image_arr
    sample = sink.emit("pull-sample")
    buf = sample.get_buffer()
    print("Timestamp: ", buf.pts)
    arr = gst_to_opencv(sample)
    image_arr = arr
    return Gst.FlowReturn.OK

# Create the elements
source = Gst.ElementFactory.make("udpsrc", None)
depay = Gst.ElementFactory.make("rtpjpegdepay", None)
decoder = Gst.ElementFactory.make("jpegdec", None)
sink = Gst.ElementFactory.make("appsink", None)



# Create the empty pipeline
pipeline = Gst.Pipeline.new("test-pipeline")

if not pipeline or not source or not depay or not decoder or not sink:
    print("Not all elements could be created.")
    exit(-1)


sink.set_property("emit-signals", True)
#sink.set_property("max-buffers", 2)
# # sink.set_property("drop", True)
# # sink.set_property("sync", False)

#caps = Gst.caps_from_string("application/x-rtp, encoding name=JPEG, payload=26, width=640, height=480; video/x-bayer,format=(string){rggb,bggr,grbg,gbrg}")
caps = Gst.caps_from_string(" application/x-rtp, encoding-name=JPEG, payload=26, format=(string)[]{rggb,bggr,grbg,gbrg}")
sink.set_property("caps", caps)


sink.connect("new-sample", new_buffer, sink)

# Build the pipeline
pipeline.add(source)
pipeline.add(depay)
pipeline.add(decoder)
pipeline.add(sink)
if not Gst.Element.link(source, depay):
    print("Elements could not be linked.")
    exit(-1)
if not Gst.Element.link(depay, decoder):
    print("Elements could not be linked.")
    exit(-1)
if not Gst.Element.link(decoder, sink):
    print("Elements could not be linked.")
    exit(-1)

# Modify the source's properties
source.set_property("port", 5000)

# Start playing
ret = pipeline.set_state(Gst.State.PLAYING)
if ret == Gst.StateChangeReturn.FAILURE:
    print("Unable to set the pipeline to the playing state.")
    exit(-1)

# Wait until error or EOS
bus = pipeline.get_bus()


# Parse message
while True:
    message = bus.timed_pop_filtered(10000, Gst.MessageType.ANY)
    # print("image_arr: ", image_arr)
    if image_arr is not None:   
        cv2.imshow("appsink image arr", image_arr)
        cv2.waitKey(1)
    if message:
        if message.type == Gst.MessageType.ERROR:
            err, debug = message.parse_error()
            print("Error received from element %s: %s" % (
                message.src.get_name(), err))
            print("Debugging information: %s" % debug)
            break
        elif message.type == Gst.MessageType.EOS:
            print("End-Of-Stream reached.")
            break
        elif message.type == Gst.MessageType.STATE_CHANGED:
            if isinstance(message.src, Gst.Pipeline):
                old_state, new_state, pending_state = message.parse_state_changed()
                print("Pipeline state changed from %s to %s." %
                       (old_state.value_nick, new_state.value_nick))
        else:
            print("Unexpected message received.")

# Free resources
pipeline.set_state(Gst.State.NULL)

I see this error:

Pipeline state changed from null to ready.
Unexpected message received.
Pipeline state changed from ready to paused.
Unexpected message received.
Unexpected message received.
Unexpected message received.
Error received from element rtpjpegdepay0: No RTP format was negotiated.
Debugging information: gstrtpbasedepayload.c(373): gst_rtp_base_depayload_chain (): /GstPipeline:test-pipeline/GstRtpJPEGDepay:rtpjpegdepay0:
Input buffers need to have RTP caps set on them. This is usually achieved by setting the 'caps' property of the upstream source element (often udpsrc or appsrc), or by putting a capsfilter element before the depayloader and setting the 'caps' property on that. Also see http://cgit.freedesktop.org/gstreamer/gst-plugins-good/tree/gst/rtp/README

How can I fix it?

Receiver GStreamer pipeline:

gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, encoding-name=JPEG, payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink

You have to negotiate caps on udpsrc, for example:

gst-launch-1.0 udpsrc port=9090 caps=application/x-rtp,media=video,encoding-name=MP2T 

These caps will vary depending on how you are sending the data. You have to negotiate the RTP format, because you used an RTP payloader element before udpsink.
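
Applied to the Python script in the question, that means setting the RTP caps on the udpsrc element (matching what the sender's rtpjpegpay produces) instead of on the appsink. Below is a minimal, untested sketch of the receiving side, assuming the same port 5000 and JPEG payload type 26 as in the question; a videoconvert element is added here so the appsink can request BGR frames for OpenCV.

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)

# Receiving chain: udpsrc ! rtpjpegdepay ! jpegdec ! videoconvert ! appsink
source = Gst.ElementFactory.make("udpsrc", None)
depay = Gst.ElementFactory.make("rtpjpegdepay", None)
decoder = Gst.ElementFactory.make("jpegdec", None)
convert = Gst.ElementFactory.make("videoconvert", None)
sink = Gst.ElementFactory.make("appsink", None)

# Negotiate the RTP caps on udpsrc -- this is what rtpjpegdepay needs.
source.set_property("port", 5000)
source.set_property(
    "caps",
    Gst.caps_from_string("application/x-rtp, media=video, encoding-name=JPEG, payload=26"))

# The appsink caps describe the decoded frames handed to Python,
# not the RTP stream; BGR matches OpenCV's channel order.
sink.set_property("emit-signals", True)
sink.set_property("caps", Gst.caps_from_string("video/x-raw, format=BGR"))

pipeline = Gst.Pipeline.new("recv-pipeline")
for element in (source, depay, decoder, convert, sink):
    pipeline.add(element)
source.link(depay)
depay.link(decoder)
decoder.link(convert)
convert.link(sink)

With the RTP caps on udpsrc, the new-sample callback from the question can stay as it is; after videoconvert the buffers handed to Python are raw BGR, so the numpy reshape to (height, width, 3) should match the buffer size.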
