
How to play the video stream from an OGG file using gstreamer in Linux

I am trying to set up a pipeline to play just the video stream from an OGG file in Linux using gstreamer-0.10. I need to do this from the command line using the gst-launch utility. I am able to play both the audio and video streams successfully using the following command:

$ gst-launch-0.10 playbin uri=file:///projects/demo.ogv

I am also able to set up a pipeline to play a video test pattern using the following command:

$ gst-launch-0.10 videotestsrc ! autovideosink

But I cannot seem to piece together the proper pipeline to play the video stream from the OGG demuxer.

According to the gstreamer documentation (Fig 3 - http://docs.gstreamer.com/display/GstSDK/Basic+tutorial+3%3A+Dynamic+pipelines ), the OGG demuxer's video source pad should be src_02. This appears to be supported by the gst-inspect command:

$ gst-inspect oggdemux
...
Pad Templates:
SRC template: 'src_%d'
    Availability: Sometimes
    Capabilities:
    ANY

SINK template: 'sink'
    Availability: Always
    Capabilities:
      application/ogg
      application/x-annodex
...

And according to this tutorial on specifying pads ( http://docs.gstreamer.com/display/GstSDK/Basic+tutorial+10%3A+GStreamer+tools ), I believe that my command to play the video stream from my file would look like this:

$ gst-launch-0.10 filesrc location=demo.ogv ! oggdemux name=d d.src_02 ! theoradec ! autovideosink

But here are the results of my run. Everything appears to hang at "PREROLLING" and I have to interrupt with Ctrl+C to get back to the command line:

$ gst-launch-0.10 filesrc location=demo.ogv ! oggdemux name=d d.src_02 ! theoradec ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
^C
Caught interrupt -- handling interrupt.
Interrupt: Stopping pipeline ...
(gst-launch-0.10:7625): GLib-CRITICAL **: Source ID 1 was not found when attempting to remove it
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...

Any ideas?

Also potentially insightful:

$ gst-typefind-0.10 demo.ogv 
demo.ogv - application/x-annodex

$ gst-discoverer-0.10 demo.ogv 
Analyzing file:///projects/keypr/demo.ogv
Done discovering file:///projects/keypr/demo.ogv

Topology:
  container: Ogg
    audio: Vorbis
    video: Theora

Properties:
  Duration: 0:00:05.546666666
  Seekable: yes
  Tags: 
      container format: Ogg
      application name: ffmpeg2theora-0.26
      extended comment: SOURCE_OSHASH=d1af78a82e61d18f
      encoder: Xiph.Org libtheora 1.1 20090822 (Thusnelda)
      encoder version: 0
      nominal bitrate: 110000
      bitrate: 110000
      video codec: Theora
      audio codec: Vorbis

UPDATE: I was able to play just the audio stream using the following command:

$ gst-launch-0.10 uridecodebin uri=file:///path/to/demo.ogv ! audioconvert ! autoaudiosink

Note that this does not work when using filesrc location=demo.ogv, only when I use uridecodebin. And I am still unable to isolate the video stream.

UPDATE 2: I stumbled upon a pipeline that isolates and plays the video stream, but I do not understand it:

$ gst-launch-0.10 uridecodebin uri=file:///path/to/demo.ogv ! theoraenc ! oggmux ! oggdemux ! theoradec ! ffmpegcolorspace ! videoscale ! ximagesink

I found it while surfing ( http://wiki.laptop.org/go/GStreamer/Developers ) and saw a demo execution of videotestsrc:

$ gst-launch-0.10 videotestsrc ! theoraenc ! oggmux ! oggdemux ! theoradec ! ffmpegcolorspace ! videoscale ! ximagesink

Can anyone explain why this works? This would appear to encode the file, mux it, demux it, decode it, and then filter/scale it into the sink. How does this make sense?

If uridecodebin is known to be giving you a good video pipeline, and you just want to copy it, you can try the following.

1) Set the environment variable GST_DEBUG_DUMP_DOT_DIR:

export GST_DEBUG_DUMP_DOT_DIR=/tmp

2) Run your gst-launch command.

3) In /tmp you should see files like the following:

  • 0.00.00.010839464-gst-launch.NULL_READY.dot
  • 0.00.00.100795940-gst-launch.READY_PAUSED.dot
  • 0.00.00.104255451-gst-launch.PAUSED_PLAYING.dot
  • 0.00.00.988712046-gst-launch.PLAYING_READY.dot

4) Install graphviz if you don't already have it.

5) Run the "dot" program to create a PNG file of the exact pipeline GStreamer used. Base it off the "PAUSED_PLAYING" file.

dot -Tpng 0.00.00.104255451-gst-launch.PAUSED_PLAYING.dot  -o /tmp/out.png
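
If you want to render all of the dumped graphs in one go, a small shell loop like the following should work (this is just a convenience sketch, assuming GST_DEBUG_DUMP_DOT_DIR is set as above):

for f in "$GST_DEBUG_DUMP_DOT_DIR"/*.dot; do dot -Tpng "$f" -o "${f%.dot}.png"; done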

This actually doesn't make sense, and is totally wrong :)

You will want to use:

gst-launch-0.10 uridecodebin uri=file:///path/to/demo.ogv ! ffmpegcolorspace ! autovideosink

to play only the video part. Using filesrc of course won't work, because you would be sending the content of the file (something still muxed and encoded) to audioconvert, which can only deal with raw audio. If you want to construct the entire pipeline by hand, you can do:

gst-launch-0.10 filesrc location=demo.ogv ! oggdemux ! theoradec ! ffmpegcolorspace ! autovideosink
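
If you also want audio from the same hand-built pipeline, a rough sketch (my addition, not from the original answer) gives the demuxer a name and branches both streams through queues:

gst-launch-0.10 filesrc location=demo.ogv ! oggdemux name=d d. ! queue ! theoradec ! ffmpegcolorspace ! autovideosink d. ! queue ! vorbisdec ! audioconvert ! autoaudiosink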

As a side note, you should use GStreamer 1.0 unless you have a very good reason not to.
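
For reference, a rough GStreamer 1.0 equivalent of the hand-built video pipeline would look like the following (in 1.0, ffmpegcolorspace is replaced by videoconvert):

gst-launch-1.0 filesrc location=demo.ogv ! oggdemux ! theoradec ! videoconvert ! autovideosink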

Cheers :)
