
How do I actually use ffmpeg on Android?

I have a very basic question regarding Android and ffmpeg. I obtained ffmpeg from http://bambuser.com/opensource and was able to compile it for ARM.

The results are the binaries (ffmpeg) as well as several libsomething.so files.

My question is: Is this enough to decode videos? How do I actually use ffmpeg then?

To load the library I have:

static {
    System.load("/data/data/com.package/lib/libavcodec.so");
}

It loads fine. But what then?

More explanation: I saw other projects where people had their ffmpeg source in a JNI directory in the project. They also created some Android.mk files and some C code along with it. Would I need this as well? Why would I create the .so files first and then copy the ffmpeg source code again?

I know the NDK and how it should work, but I've never seen an example of how one would actually call ffmpeg functions using it, because people seem to be hiding their implementations (which is sort of understandable) without even giving useful pointers or examples.

Let's just say I wanted to decode a video file. Which kind of native methods would I need to implement? How do I run the project? Which data types need to be passed? etc. There are certainly a few people here who have at least done that; I know this from searching for hours and hours.

For your first question:

Just building is not enough to use the ffmpeg libraries properly. You should also wrap those .so files in the right order, because these .so files NEED other libraries at link time. You can display the header information of a .so file, including which libraries it needs, by using:

objdump -x libavcodec.so | grep NEEDED

So you need to wrap these .so files through Android.mk. You may check this link.
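For example, at runtime you also have to load the dependent libraries before the ones that need them, since older Android versions do not resolve dependencies between application-private .so files on their own. A minimal sketch, assuming a typical ffmpeg build (take the exact set and order of libraries from the objdump output above for your own build; the wrapper library name here is hypothetical):

static {
    // Dependencies first; the names and order come from "objdump ... | grep NEEDED".
    System.loadLibrary("avutil");
    System.loadLibrary("swscale");
    System.loadLibrary("avcodec");
    System.loadLibrary("avformat");
    // Then the JNI wrapper library that actually exposes your native methods.
    System.loadLibrary("ffmpegwrapper");
}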

For the second one:

You only need the header files from the ffmpeg project; the implementation will be linked in from the .so libraries. People keep the full source tree in their projects perhaps because the developers didn't bother to filter out just the header files.
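To make that concrete, the usual pattern is: declare native methods on the Java side, implement them in a small C file built with the NDK that includes the ffmpeg headers, and let Android.mk link that file against the prebuilt .so libraries. A rough sketch of the Java side only; the class name, method names, and signatures below are made up for illustration, and the real ones depend on how much work you push into native code:

public class FfmpegDecoder {
    static {
        System.loadLibrary("ffmpegwrapper"); // hypothetical JNI wrapper .so
    }

    // Implemented in C via JNI; internally it would call ffmpeg functions
    // such as avformat_open_input() and avcodec_find_decoder().
    public native long nativeOpen(String path);

    // Decode the next frame into the supplied buffer; return the number of
    // bytes written, or a negative value on error / end of stream.
    public native int nativeDecodeFrame(long handle, byte[] outFrame);

    // Free the native contexts behind the handle.
    public native void nativeClose(long handle);
}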

And for the last one:

Your thoughts seem right for the time being: most developers are struggling to use ffmpeg, but they lack documentation and sample code.
