
Render RTSP H.264 video stream using live555

I would appreciate an example of using the Live555 library to render live streams to screen. Apparently it's simple, but it would really help to see it done rather than simply read that "it's simple"! The gmane site states:

"To update (a copy of) the "testRTSPClient" code so that it renders video data is fairly straightforward: You simply have to change the "DummySink" class, so that it does the rendering (or calls a decoder library to do the rendering). In particular, you would change the implementation of the "afterGettingFrame()" function - at line 479. That's it! “更新(testRTSPClient)代码的副本,使其呈现视频数据非常简单:您只需更改”DummySink“类,以便进行渲染(或调用解码器库进行渲染)特别是,您将在第479行更改“afterGettingFrame()”函数的实现。就是这样!

(Actually, for H.264 video, there is one more thing that you'll probably need to do. H.264 streams have out-of-band configuration information (SPS and PPS NAL units) that you may need to feed to the decoder to initialize it. To get this information, call "MediaSubsession::fmtp_spropparametersets()" (on the video 'subsession' object). This will give you an (ASCII) character string. You can then pass this to "parseSPropParameterSets()", to generate binary NAL units for your decoder.)"
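Put together, a minimal sketch of those two changes might look like the following. Here `decoder` and its `decodeNALUnit()` method are invented placeholder names for whatever H.264 decoder you plug in; `DummySink`, `fReceiveBuffer`, `MediaSubsession::fmtp_spropparametersets()`, `parseSPropParameterSets()` and `SPropRecord` are the live555 pieces mentioned above:

```cpp
// Sketch only: "decoder" / decodeNALUnit() stand in for whatever H.264
// decoder library you actually use.

// 1) Before playback starts, hand the out-of-band SPS/PPS NAL units
//    to the decoder:
void initDecoderFromSubsession(MediaSubsession& subsession) {
  unsigned numSPropRecords;
  SPropRecord* sPropRecords =
      parseSPropParameterSets(subsession.fmtp_spropparametersets(), numSPropRecords);
  for (unsigned i = 0; i < numSPropRecords; ++i) {
    // Each record is one raw NAL unit (an SPS or a PPS).
    decoder->decodeNALUnit(sPropRecords[i].sPropBytes, sPropRecords[i].sPropLength);
  }
  delete[] sPropRecords;
}

// 2) In testRTSPClient.cpp, replace the body of DummySink::afterGettingFrame():
void DummySink::afterGettingFrame(unsigned frameSize, unsigned numTruncatedBytes,
                                  struct timeval presentationTime,
                                  unsigned /*durationInMicroseconds*/) {
  // fReceiveBuffer holds one NAL unit, delivered without the 0x00000001
  // start code; many decoders expect Annex-B input, so prepend it if needed.
  decoder->decodeNALUnit(fReceiveBuffer, frameSize);

  // Keep requesting frames, exactly as the original testRTSPClient code does:
  continuePlaying();
}
```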

PS: I'm using Visual Studio and Windows.

I did something similar in a previous job. The trick was to use DirectShow for rendering. Basically, live555 does not decode or render anything by itself; you need some kind of H.264 decoder as well as a video surface. Fortunately for you, DirectShow provides both, but it takes quite a lot of programming to get working.

live555 only provides the streaming mechanism, meaning that it will take NAL packets (H.264 packets, if you will) from the network or a file source and push them to the "DummySink"; you still need to decode (transform the NAL units to bitmaps) and render (draw the bitmaps to the screen). This is not something live555 will do for you, but other libraries can, such as ffmpeg; I didn't manage to get it to work, though, so we moved to the DirectShow solution. Namely, the "MS DTV-DVD Decoder" was very useful and could (automagically) use some hardware acceleration provided by the chipset we were using. Another useful feature of live555 is that it handles control protocols such as RTSP, including PLAY/STOP/PAUSE requests for the stream.
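To make the "decode NAL units to bitmaps" step concrete, here is a rough sketch using FFmpeg's avcodec API (FFmpeg being one of the libraries mentioned above as an option). It assumes each NAL unit from live555 has had a 00 00 00 01 start code prepended before being fed in, and drawing the resulting YUV frames is still up to your renderer:

```cpp
extern "C" {
#include <libavcodec/avcodec.h>
}

// Rough sketch: decode Annex-B H.264 NAL units into raw YUV frames.
struct H264Decoder {
  AVCodecContext* ctx = nullptr;
  AVFrame* frame = nullptr;
  AVPacket* pkt = nullptr;

  bool open() {
    const AVCodec* codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    if (!codec) return false;
    ctx = avcodec_alloc_context3(codec);
    frame = av_frame_alloc();
    pkt = av_packet_alloc();
    return ctx && frame && pkt && avcodec_open2(ctx, codec, nullptr) == 0;
  }

  // data/size: one NAL unit with a 00 00 00 01 start code prepended.
  void decodeNALUnit(const uint8_t* data, int size) {
    pkt->data = const_cast<uint8_t*>(data);
    pkt->size = size;
    if (avcodec_send_packet(ctx, pkt) < 0) return;
    while (avcodec_receive_frame(ctx, frame) == 0) {
      // frame->data[0..2] now hold the Y/U/V planes of a decoded picture
      // (frame->width x frame->height); this is the "bitmap" that still
      // has to be drawn to the screen by your renderer.
    }
  }
};
```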

EDIT: since you are searching for code, I've found an open-source DirectShow filter that uses live555 for you. You should be able to run it with something like GraphStudio or GraphEdit. Eventually, you could learn how to create DirectShow graphs in C++; this is really not very complicated, as tons of examples already exist. The most difficult part is creating the filters, but the decoder and the renderer are already there for you, and I've provided you with the source filter.
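For completeness, a minimal sketch of building and running such a graph in C++ follows. It assumes the live555-based source filter linked above is registered on the system and associated with the rtsp: protocol, so that RenderFile() can pick it up and auto-insert a decoder and video renderer; the URL is a placeholder:

```cpp
#include <dshow.h>
#pragma comment(lib, "strmiids.lib")

// Minimal sketch: build and run a DirectShow graph for an RTSP URL.
int main() {
  CoInitialize(nullptr);

  IGraphBuilder* graph = nullptr;
  IMediaControl* control = nullptr;
  IMediaEvent* events = nullptr;

  HRESULT hr = CoCreateInstance(CLSID_FilterGraph, nullptr, CLSCTX_INPROC_SERVER,
                                IID_IGraphBuilder, (void**)&graph);
  if (FAILED(hr)) return 1;
  graph->QueryInterface(IID_IMediaControl, (void**)&control);
  graph->QueryInterface(IID_IMediaEvent, (void**)&events);

  // "rtsp://example.com/stream" is a placeholder URL; this only works if a
  // source filter handling the rtsp: protocol is registered on the machine.
  hr = graph->RenderFile(L"rtsp://example.com/stream", nullptr);
  if (SUCCEEDED(hr)) {
    control->Run();
    long evCode = 0;
    events->WaitForCompletion(INFINITE, &evCode);  // block until the stream ends
  }

  if (events) events->Release();
  if (control) control->Release();
  if (graph) graph->Release();
  CoUninitialize();
  return 0;
}
```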

Another example of code using live555 on Windows is available in the video processing project. Like Eric's suggestion, it uses DirectShow (which was the standard way on Windows before Media Foundation). The code using live555 can be found here.

If you're using Windows 7, there is a built-in H.264 decoder which will be inserted into the media pipeline when you render the graph. On earlier versions of Windows, you'll have to install your own H.264 decoder filter.

Disclaimer: I am one of the authors of the video processing project.
