
Python and GStreamer: trying to play video (and later add textoverlay)

I'm trying to write a Python application that gets GStreamer to play a video file I have recorded (and, later on, to put subtitles on the video with textoverlay).

But it looks like I still have some basic issues understanding how pads work... I can't seem to get the links set up properly.

The basic example I am building on top of is a simple application that shows video from a webcam, so I know the code works and it's only my pipeline that is messing things up.

Also, if I execute the following pipeline in a terminal, it works:

gst-launch-0.10 filesrc location=GOPR0042.MP4 ! decodebin2 ! ffmpegcolorspace ! videoflip method=2 ! xvimagesink

Now, I am trying to recreate this pipeline in a Python app, like so:

#!/usr/bin/env python

import sys, os
import pygtk, gtk, gobject
import pygst
pygst.require("0.10")
import gst

class GTK_Main:

    def __init__(self):
        window = gtk.Window(gtk.WINDOW_TOPLEVEL)
        window.set_title("Webcam-Viewer")
        window.set_default_size(500, 400)
        window.connect("destroy", gtk.main_quit, "WM destroy")
        vbox = gtk.VBox()
        window.add(vbox)
        self.movie_window = gtk.DrawingArea()
        vbox.add(self.movie_window)
        hbox = gtk.HBox()
        vbox.pack_start(hbox, False)
        hbox.set_border_width(10)
        hbox.pack_start(gtk.Label())
        self.button = gtk.Button("Start")
        self.button.connect("clicked", self.start_stop)
        hbox.pack_start(self.button, False)
        self.button2 = gtk.Button("Quit")
        self.button2.connect("clicked", self.exit)
        hbox.pack_start(self.button2, False)
        hbox.add(gtk.Label())
        window.show_all()

        # Set up the gstreamer pipeline
        self.pipeline = gst.Pipeline("player")

        self.filesource = gst.element_factory_make("filesrc", "filesource")
        self.filesource.set_property("location", "/home/jlumme/video/GOPR0042.MP4")
        self.pipeline.add(self.filesource)

        self.decoder = gst.element_factory_make("decodebin2", "decoder")
        self.pipeline.add(self.decoder)

        self.colorspace = gst.element_factory_make("ffmpegcolorspace", "colorspace")
        self.pipeline.add(self.colorspace)

        self.videosink = gst.element_factory_make("xvimagesink", "videosink")
        self.pipeline.add(self.videosink)

        self.filesource.link(self.decoder)
        self.decoder.link(self.colorspace)  # This fails
        self.colorspace.link(self.videosink)

        # Watch the pipeline bus for errors, EOS, and the xwindow-id handshake
        bus = self.pipeline.get_bus()
        bus.add_signal_watch()
        bus.enable_sync_message_emission()
        bus.connect("message", self.on_message)
        bus.connect("sync-message::element", self.on_sync_message)

    def start_stop(self, w):
        if self.button.get_label() == "Start":
            self.button.set_label("Stop")
            self.pipeline.set_state(gst.STATE_PLAYING)
        else:
            self.pipeline.set_state(gst.STATE_NULL)
            self.button.set_label("Start")

    def exit(self, widget, data=None):
        gtk.main_quit()

    def on_message(self, bus, message):
        t = message.type
        if t == gst.MESSAGE_EOS:
            self.pipeline.set_state(gst.STATE_NULL)
            self.button.set_label("Start")
        elif t == gst.MESSAGE_ERROR:
            err, debug = message.parse_error()
            print "Error: %s" % err, debug
            self.pipeline.set_state(gst.STATE_NULL)
            self.button.set_label("Start")

    def on_sync_message(self, bus, message):
        if message.structure is None:
            return
        message_name = message.structure.get_name()
        if message_name == "prepare-xwindow-id":
            # Assign the viewport
            imagesink = message.src
            imagesink.set_property("force-aspect-ratio", True)
            imagesink.set_xwindow_id(self.movie_window.window.xid)

GTK_Main()
gtk.gdk.threads_init()
gtk.main()

Now, I have seen people use a dynamic pad to link decodebin to some audio stuff, but I don't really understand how it works... So, I guess I can't connect decodebin2 and ffmpegcolorspace directly? Could someone explain to me why?

Also, do you foresee problems in my next step, where I would like to add a textoverlay element to the pipeline to show subtitles?

In my recent habit of answering my own questions, I will do that here as well :)

So, after a bit more reading and hacking, I realize that I indeed wasn't really getting dynamic pads, and how they need to be connected only once there is actually stuff coming in.

So basically I solved the above problem with two queues, one for audio and one for video. The queues are then connected to decoders; they need to be placed after the demuxer and linked dynamically. The decoder and sink also seem to need their pads connected dynamically.
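
For reference, here is a minimal sketch of the dynamic-linking idea, trimmed down to just the video branch. It keeps decodebin2 from my original code rather than a separate demuxer, and assumes the same file path as above; the point is that the link out of decodebin2 happens in a pad-added callback, once the pad actually exists:

#!/usr/bin/env python
import gobject
import pygst
pygst.require("0.10")
import gst

pipeline = gst.Pipeline("player")

filesource = gst.element_factory_make("filesrc", "filesource")
filesource.set_property("location", "/home/jlumme/video/GOPR0042.MP4")
decoder = gst.element_factory_make("decodebin2", "decoder")
videoqueue = gst.element_factory_make("queue", "videoqueue")
colorspace = gst.element_factory_make("ffmpegcolorspace", "colorspace")
videosink = gst.element_factory_make("xvimagesink", "videosink")
pipeline.add(filesource, decoder, videoqueue, colorspace, videosink)

# These pads always exist, so static linking is fine here
filesource.link(decoder)
gst.element_link_many(videoqueue, colorspace, videosink)

# decodebin2 only creates its source pads after it has inspected the
# stream, so the link into the rest of the pipeline has to happen in a
# callback rather than at construction time
def on_pad_added(element, pad):
    name = pad.get_caps()[0].get_name()
    if name.startswith("video/"):
        pad.link(videoqueue.get_pad("sink"))
    # an audio branch (queue ! audioconvert ! autoaudiosink) would be
    # linked here the same way, for "audio/..." caps

decoder.connect("pad-added", on_pad_added)

pipeline.set_state(gst.STATE_PLAYING)
loop = gobject.MainLoop()
loop.run()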

A question on this forum that explains this process very clearly is this one: gstreamer code for playing avi file is hanging
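
As for the textoverlay step: its pads are static, so it should not need any dynamic linking; it can just slot in between the video branch and the sink. A minimal sketch along those lines (the text and alignment values are just placeholder examples):

# textoverlay has static pads, so ordinary linking works; this replaces
# the colorspace -> videosink link from the sketch above
textoverlay = gst.element_factory_make("textoverlay", "overlay")
textoverlay.set_property("text", "my subtitle line")  # placeholder text
textoverlay.set_property("valignment", "bottom")      # render at the bottom edge
pipeline.add(textoverlay)
gst.element_link_many(videoqueue, colorspace, textoverlay, videosink)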
