
Developing H264 hardware decoder Android - Stagefright or OpenMax IL?

I am developing an H/W accelerated H264 video decoder for Android. So far, I have come across several libraries: MediaCodec, Stagefright, OpenMax IL, OpenMax AL, and FFmpeg. After a bit of research, I have found that:

  1. I found a great resource on using Stagefright with FFmpeg, but I cannot use FFmpeg because of its license; it is quite restrictive for distributed software. (Or is it possible to drop FFmpeg from that approach?)

  2. I cannot use MediaCodec because it is a Java API and I would have to call it via JNI from the C++ layer, which is relatively slow and not allowed in my case.

  3. I cannot use OpenMax AL, as it only supports decoding an MPEG-2 transport stream via a buffer queue. This rules out passing raw H264 NALUs, or other media formats for that matter.

  4. That leaves Stagefright and OpenMax IL. I have learned that Stagefright uses the OpenMax (OMX) interface underneath. So should I go with Stagefright or OpenMax IL? Which will be more promising?

Also, I have learned that the Android H/W accelerated decoder is vendor specific, and every vendor has its own OMX interfacing APIs. Is that true? If so, do I need to write an H/W vendor-specific implementation in the case of OpenMax IL? What about Stagefright - is it hardware-agnostic or hardware-dependent? If there is no way to do a hardware-independent implementation using Stagefright or OpenMax IL, I need to support at least Qualcomm's Snapdragon, Samsung's Exynos, and Tegra 4.

Note that I need to decode an H264 Annex B stream and expect decoded data after decoding, which I will send to my video rendering pipeline. So basically, I only need the decoder module.

I am really confused. Please help me get pointed in the right direction. Thanks in advance!

EDIT

My software is for commercial purposes and the source code is private as well. I am also restricted by the client from using FFmpeg. :)

You really should go for MediaCodec. Calling Java methods via JNI does have some overhead, but you should keep in mind what order of magnitude that overhead is. If you called a function per pixel, the overhead of JNI calls might be problematic. But when using MediaCodec, you only make a few function calls per frame, and the overhead there is negligible.

See e.g. http://git.videolan.org/?p=vlc.git;a=blob;f=modules/codec/omxil/mediacodec_jni.c;h=57df9889c97706436823a4960206e323565e221c;hb=b31df501269b56c65327be181cdca3df48946fb1 as an example of using MediaCodec from C code via JNI. As others have gone this way as well, I can assure you that JNI overhead is not a reason to consider other APIs instead of MediaCodec.

Using Stagefright or OMX directly is problematic; the ABI differs between platform versions (so you can either target only one version, or compile multiple times targeting different versions and package it all up in one package), and you would have to deal with a lot of device-specific quirks, while MediaCodec should (and on modern versions does) work the same across all devices.
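As a side note beyond the original answer: if your minimum API level permits, Android 5.0 (API 21) and later expose MediaCodec directly to native code through the NDK's AMediaCodec API, which avoids JNI entirely. A minimal decoder-only sketch for an H264 Annex B stream might look like the following; `read_nalu` and `render_frame` are hypothetical placeholders for your own demuxing and rendering pipeline, and error handling is omitted for brevity.

```cpp
// Sketch: H/W H264 decoding via the NDK AMediaCodec API (API 21+).
// Build with the Android NDK and link against libmediandk.
// read_nalu()/render_frame() are hypothetical app-side helpers.
#include <media/NdkMediaCodec.h>
#include <media/NdkMediaFormat.h>
#include <cstring>

extern size_t read_nalu(uint8_t* dst, size_t cap, int64_t* pts_us); // placeholder
extern void render_frame(const uint8_t* data, size_t size);         // placeholder

void decode_h264(int32_t width, int32_t height) {
    // Create a decoder for H264 ("video/avc") and describe the stream.
    AMediaCodec* codec = AMediaCodec_createDecoderByType("video/avc");
    AMediaFormat* fmt = AMediaFormat_new();
    AMediaFormat_setString(fmt, AMEDIAFORMAT_KEY_MIME, "video/avc");
    AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_WIDTH, width);
    AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_HEIGHT, height);

    // No output surface: decoded frames come back as byte buffers,
    // which suits a custom rendering pipeline.
    AMediaCodec_configure(codec, fmt, nullptr, nullptr, 0);
    AMediaCodec_start(codec);

    for (;;) {
        // Feed one NALU into an input buffer.
        ssize_t in_idx = AMediaCodec_dequeueInputBuffer(codec, 10000 /*us*/);
        if (in_idx >= 0) {
            size_t cap = 0;
            uint8_t* buf = AMediaCodec_getInputBuffer(codec, in_idx, &cap);
            int64_t pts_us = 0;
            size_t n = read_nalu(buf, cap, &pts_us);
            if (n == 0) break; // end of stream
            AMediaCodec_queueInputBuffer(codec, in_idx, 0, n, pts_us, 0);
        }

        // Drain any decoded frame and hand it to the renderer.
        AMediaCodecBufferInfo info;
        ssize_t out_idx = AMediaCodec_dequeueOutputBuffer(codec, &info, 0);
        if (out_idx >= 0) {
            size_t cap = 0;
            uint8_t* out = AMediaCodec_getOutputBuffer(codec, out_idx, &cap);
            render_frame(out + info.offset, info.size);
            AMediaCodec_releaseOutputBuffer(codec, out_idx, false /*render*/);
        }
    }

    AMediaFormat_delete(fmt);
    AMediaCodec_stop(codec);
    AMediaCodec_delete(codec);
}
```

The trade-off versus JNI is purely about convenience, not performance: both paths end up in the same platform decoder, so this is mainly useful if you want to keep the whole pipeline in C++.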

I found a great resource of using stagefright with FFmpeg, but I can not use FFmpeg as for its license, it is quite restrictive for distributed software. (Or possible to discard FFmpeg from this approach?)

That's not true. FFmpeg is LGPL, so you can use it in your commercially redistributable application.

However, you might be using GPL-licensed modules of FFmpeg, e.g. libx264. In that case, your program must be GPL-compliant.

But even that is not bad for distributing software -- it just means that you need to give your customers (who should be kings, anyway) access to the source code of the application they are paying for, and you are not allowed to restrict their freedoms. Not a bad deal, IMHO.

Also, I came to know that Android H/W accelerated decoder is vendor specific and every vendors has their own OMX interfacing APIs. Is it true?

Obviously, yes. If you need hardware acceleration, someone has to write a program that makes your specific hardware accelerate something.
