
FFmpeg on Android

I have FFmpeg compiled (libffmpeg.so) on Android. Now I have to either build an application like RockPlayer or use the existing Android multimedia framework to invoke FFmpeg.

  1. Do you have steps / procedures / code / examples on integrating FFmpeg on Android / StageFright?

  2. Can you please guide me on how I can use this library for multimedia playback?

  3. I have a requirement where I already have audio and video transport streams, which I need to feed to FFmpeg and get decoded / rendered. How can I do this on Android, since the IOMX APIs are OMX based and FFmpeg cannot be plugged in here?

  4. Also, I could not find documentation on the FFmpeg APIs that need to be used for playback.

Here are the steps I went through in getting ffmpeg to work on Android:

  1. Build static libraries of ffmpeg for Android. This was achieved by building olvaffe's ffmpeg Android port (libffmpeg) using the Android Build System. Simply place the sources under /external and make away. You'll need to extract bionic (libc) and zlib (libz) from the Android build as well, as the ffmpeg libraries depend on them.
  2. Create a dynamic library wrapping ffmpeg functionality using the Android NDK. There's a lot of documentation out there on how to work with the NDK. Basically you'll need to write some C/C++ code to export the functionality you need out of ffmpeg into a library Java can interact with through JNI. The NDK allows you to easily link against the static libraries you've generated in step 1; just add a line similar to this to Android.mk: LOCAL_STATIC_LIBRARIES := libavcodec libavformat libavutil libc libz

  3. Use the ffmpeg-wrapping dynamic library from your Java sources. There's enough documentation on JNI out there; you should be fine.
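To sketch step 2's build glue: a minimal Android.mk for the wrapper library might look like the following. The module and source file names here are placeholders, not part of the original answer; adjust them to your project:

```make
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)

# Name of the wrapper library; Java would load it via System.loadLibrary("ffmpeg-jni")
LOCAL_MODULE    := ffmpeg-jni
LOCAL_SRC_FILES := ffmpeg-jni.c

# Link against the static ffmpeg libraries built in step 1
LOCAL_STATIC_LIBRARIES := libavcodec libavformat libavutil libc libz

include $(BUILD_SHARED_LIBRARY)
```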

Regarding using ffmpeg for playback, there are many examples (the ffmpeg binary itself is a good example); here's a basic tutorial. The best documentation can be found in the headers.

Good luck :)

For various reasons, multimedia was and is never easy in terms of achieving the task without compromising on efficiency. ffmpeg is an effort to improve it day by day. It supports different formats of codecs and containers.

Now, to answer the question of how to use this library, I would say that it is not so simple to write it all here. But I can guide you in the following ways.

1) Inside the ffmpeg directory of the source code, you have output_example.c or api_example.c. There, you can see the code where encoding/decoding is done. You will get an idea as to which APIs inside ffmpeg you should call. This would be your first step.

2) Dolphin Player is an open source project for Android. Currently it has bugs, but the developers are working on it continuously. In that project you have the whole setup ready, which you can use to continue your investigation. Here is a link to the project from code.google.com, or run the command "git clone https://code.google.com/p/dolphin-player/" in a terminal. You can see two projects named P and P86. You can use either of them.

An extra tip I would like to offer: when you are building the ffmpeg code, inside build.sh you need to enable the muxers/demuxers/encoders/decoders of the formats you want to use, otherwise the corresponding code will not be included in the libraries. It took me a lot of time to realize this, so I thought of sharing it with you.

A few basics: when we say a video file, e.g. avi, it is a combination of both audio and video.

Video file = Video + Audio


Video = Codec + Muxer + Demuxer

Codec = Encoder + Decoder

=> Video = Encoder + Decoder + Muxer + Demuxer (Mpeg4 + Mpeg4 + avi + avi - example for the avi container)


Audio = Codec + Muxer + Demuxer

Codec = Encoder + Decoder

=> Audio = Encoder + Decoder + Muxer + Demuxer (mp2 + mp2 + avi + avi - example for the avi container)


A codec (the name is derived from a combination of en*co*der / *dec*oder) is just the part of a format that defines the algorithms used to encode/decode a frame. AVI is not a codec; it is a container, which uses the Mpeg4 video codec and the mp2 audio codec.

The muxer/demuxer is used to combine/separate the frames of a file while encoding/decoding.

So if you want to use the avi format, you need to enable the video components + the audio components.

For example, for avi, you need to enable the following: mpeg4 encoder, mpeg4 decoder, mp2 encoder, mp2 decoder, avi muxer, avi demuxer.

phewwwwwww...

Programmatically, build.sh should contain the following flags:

--enable-muxer=avi --enable-demuxer=avi (generic for both audio/video; generally specific to a container)
--enable-encoder=mpeg4 --enable-decoder=mpeg4 (for video support)
--enable-encoder=mp2 --enable-decoder=mp2 (for audio support)
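Put together, a configure invocation enabling only avi support could look like this. This is a sketch only: the cross-compilation flags needed for an Android toolchain are omitted and will vary with your setup:

```
./configure \
    --disable-everything \
    --enable-muxer=avi --enable-demuxer=avi \
    --enable-encoder=mpeg4 --enable-decoder=mpeg4 \
    --enable-encoder=mp2 --enable-decoder=mp2
```

Starting from --disable-everything keeps the resulting libraries small, since only the explicitly enabled components are compiled in.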

I hope I did not confuse you more after all this...

Thanks. If any assistance is needed, please let me know.

After a lot of research, right now this is the most up-to-date compiled library for Android that I found:

https://github.com/bravobit/FFmpeg-Android

The easiest-to-build, easiest-to-use implementation I have found is made by the Guardian Project team: https://github.com/guardianproject/android-ffmpeg

I've done a little project to configure and build X264 and FFMPEG using the Android NDK. The main thing that's missing is a decent JNI interface to make it accessible via Java, but that is the easy part (relatively). When I get round to making the JNI interface good for my own uses, I'll push that in.

The benefit over olvaffe's build system is that it doesn't require Android.mk files to build the libraries; it just uses the regular makefiles and the toolchain. This makes it much less likely to stop working when you pull new changes from FFMPEG or X264.

https://github.com/halfninja/android-ffmpeg-x264

To make my FFMPEG application I used this project (https://github.com/hiteshsondhi88/ffmpeg-android-java), so I don't have to compile anything. I think it's the easy way to use FFMPEG in our Android applications.

More info on http://hiteshsondhi88.github.io/ffmpeg-android-java/

Inspired by many other FFmpeg on Android implementations out there (mainly the guardianproject), I found a solution (with LAME support also).

(LAME and FFmpeg: https://github.com/intervigilium/liblame and http://bambuser.com/opensource)

To call FFmpeg:

new Thread(new Runnable() {

    @Override
    public void run() {

        Looper.prepare();

        FfmpegController ffmpeg = null;

        try {
            ffmpeg = new FfmpegController(context);
        } catch (IOException ioe) {
            Log.e(DEBUG_TAG, "Error loading ffmpeg. " + ioe.getMessage());
        }

        ShellDummy shell = new ShellDummy();
        String mp3BitRate = "192";

        try {
            ffmpeg.extractAudio(in, out, audio, mp3BitRate, shell);
        } catch (IOException e) {
            Log.e(DEBUG_TAG, "IOException running ffmpeg: " + e.getMessage());
        } catch (InterruptedException e) {
            Log.e(DEBUG_TAG, "InterruptedException running ffmpeg: " + e.getMessage());
        }

        Looper.loop();

    }

}).start();

And to handle the console output:

private class ShellDummy implements ShellCallback {

    @Override
    public void shellOut(String shellLine) {
        if (someCondition) {
            doSomething(shellLine);
        }
        Utils.logger("d", shellLine, DEBUG_TAG);
    }

    @Override
    public void processComplete(int exitValue) {
        if (exitValue == 0) {
            // Audio job OK, do your stuff:
            // e.g. write id3 tags,
            // call the media scanner,
            // etc.
        }
    }

    @Override
    public void processNotStartedCheck(boolean started) {
        if (!started) {
            // Audio job error, as above.
        }
    }
}

Strange that this project hasn't been mentioned: AndroidFFmpeg from Appunite

It has quite detailed step-by-step instructions to copy/paste to the command line, for lazy people like me ))

I had the same issue, and I found most of the answers here outdated. I ended up writing a wrapper on FFMPEG to access it from Android with a single line of code.

https://github.com/madhavanmalolan/ffmpegandroidlibrary

First, add the dependency on the FFmpeg library:

implementation 'com.writingminds:FFmpegAndroid:0.3.2'

Then load it in the activity:

FFmpeg ffmpeg;
    private void trimVideo(ProgressDialog progressDialog) {

    outputAudioMux = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_MOVIES).getAbsolutePath()
            + "/VidEffectsFilter" + "/" + new SimpleDateFormat("ddMMyyyy_HHmmss").format(new Date())
            + "filter_apply.mp4";

    if (startTrim.equals("")) {
        startTrim = "00:00:00";
    }

    if (endTrim.equals("")) {
        endTrim = timeTrim(player.getDuration());
    }

    String[] cmd = new String[]{"-ss", startTrim + ".00", "-t", endTrim + ".00", "-noaccurate_seek", "-i", videoPath, "-codec", "copy", "-avoid_negative_ts", "1", outputAudioMux};


    execFFmpegBinary1(cmd, progressDialog);
    }



    private void execFFmpegBinary1(final String[] command, ProgressDialog prpg) {

    ProgressDialog progressDialog = prpg;

    try {
        ffmpeg.execute(command, new ExecuteBinaryResponseHandler() {
            @Override
            public void onFailure(String s) {
                progressDialog.dismiss();
                Toast.makeText(PlayerTestActivity.this, "Failed to generate video", Toast.LENGTH_SHORT).show();
                Log.d(TAG, "FAILED with output : " + s);
            }

            @Override
            public void onSuccess(String s) {
                Log.d(TAG, "SUCCESS with output : " + s);

//                    pathVideo = outputAudioMux;
                String finalPath = outputAudioMux;
                videoPath = outputAudioMux;
                Toast.makeText(PlayerTestActivity.this, "Storage Path =" + finalPath, Toast.LENGTH_SHORT).show();

                Intent intent = new Intent(PlayerTestActivity.this, ShareVideoActivity.class);
                intent.putExtra("pathGPU", finalPath);
                startActivity(intent);
                finish();
                MediaScannerConnection.scanFile(PlayerTestActivity.this, new String[]{finalPath}, new String[]{"mp4"}, null);

            }

            @Override
            public void onProgress(String s) {
                Log.d(TAG, "progress : " + s);
                progressDialog.setMessage("Please wait, video trimming...");
            }

            @Override
            public void onStart() {
                Log.d(TAG, "Started command : ffmpeg " + command);

            }

            @Override
            public void onFinish() {
                Log.d(TAG, "Finished command : ffmpeg " + command);
                progressDialog.dismiss();
            }
        });
    } catch (FFmpegCommandAlreadyRunningException e) {
        // do nothing for now
    }
}

  private void loadFFMpegBinary() {
    try {
        if (ffmpeg == null) {
            ffmpeg = FFmpeg.getInstance(this);
        }
        ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
            @Override
            public void onFailure() {
                showUnsupportedExceptionDialog();
            }

            @Override
            public void onSuccess() {
                Log.d("dd", "ffmpeg : correctly loaded");
            }
        });
    } catch (FFmpegNotSupportedException e) {
        showUnsupportedExceptionDialog();
    } catch (Exception e) {
        Log.d("dd", "Unhandled exception: " + e);
    }
}

private void showUnsupportedExceptionDialog() {
    new AlertDialog.Builder(this)
            .setIcon(android.R.drawable.ic_dialog_alert)
            .setTitle("Not Supported")
            .setMessage("Device Not Supported")
            .setCancelable(false)
            .setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
                @Override
                public void onClick(DialogInterface dialog, int which) {
                    finish();
                }
            })
            .create()
            .show();

}
    public String timeTrim(long milliseconds) {
        String finalTimerString = "";
        String minutString = "";
        String secondsString = "";

        // Convert total duration into time
        int hours = (int) (milliseconds / (1000 * 60 * 60));
        int minutes = (int) (milliseconds % (1000 * 60 * 60)) / (1000 * 60);
        int seconds = (int) ((milliseconds % (1000 * 60 * 60)) % (1000 * 60) / 1000);
        // Prepend 0 to hours if it is one digit

        if (hours < 10) {
            finalTimerString = "0" + hours + ":";
        } else {
            finalTimerString = hours + ":";
        }


        if (minutes < 10) {
            minutString = "0" + minutes;
        } else {
            minutString = "" + minutes;
        }

        // Prepending 0 to seconds if it is one digit
        if (seconds < 10) {
            secondsString = "0" + seconds;
        } else {
            secondsString = "" + seconds;
        }

        finalTimerString = finalTimerString + minutString + ":" + secondsString;

        // return timer string
        return finalTimerString;
    }
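As a side note on the helper above: the same zero-padded HH:MM:SS string can be produced more compactly with String.format, which handles the padding in one step. This is an equivalent alternative, not the code the answer's app uses:

```java
public class TimeFormat {
    // Same output as timeTrim above: zero-padded HH:MM:SS from milliseconds
    public static String timeTrim(long milliseconds) {
        long hours = milliseconds / (1000 * 60 * 60);
        long minutes = (milliseconds / (1000 * 60)) % 60;
        long seconds = (milliseconds / 1000) % 60;
        return String.format("%02d:%02d:%02d", hours, minutes, seconds);
    }

    public static void main(String[] args) {
        System.out.println(timeTrim(3661000)); // prints 01:01:01
    }
}
```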

You can also use other FFmpeg features:

===> Merge audio into video
String[] cmd = new String[]{"-i", yourRealPath, "-i", arrayList.get(posmusic).getPath(), "-map", "1:a", "-map", "0:v", "-codec", "copy", "-shortest", outputcrop};


===> Flip vertically:
String[] cm = new String[]{"-i", yourRealPath, "-vf", "vflip", "-codec:v", "libx264", "-preset", "ultrafast", "-codec:a", "copy", outputcrop1};


===> Flip horizontally :  
String[] cm = new String[]{"-i", yourRealPath, "-vf", "hflip", "-codec:v", "libx264", "-preset", "ultrafast", "-codec:a", "copy", outputcrop1};


===> Rotate 90 degrees clockwise:
String[] cm=new String[]{"-i", yourRealPath, "-c", "copy", "-metadata:s:v:0", "rotate=90", outputcrop1};


===> Compress Video
String[] complexCommand = {"-y", "-i", yourRealPath, "-strict", "experimental", "-vcodec", "libx264", "-preset", "ultrafast", "-crf", "24", "-acodec", "aac", "-ar", "22050", "-ac", "2", "-b", "360k", "-s", "1280x720", outputcrop1};


===> Speed up / slow down video
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=2.0*PTS[v];[0:a]atempo=0.5[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=1.0*PTS[v];[0:a]atempo=1.0[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=0.75*PTS[v];[0:a]atempo=1.5[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=0.5*PTS[v];[0:a]atempo=2.0[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
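The four variants above differ only in the setpts/atempo factors (setpts scales video timestamps by 1/speed, atempo scales audio tempo by speed). A small helper can build the command for any speed factor; this class and its names are hypothetical, not part of the library, and it restricts speed to [0.5, 2.0] because the atempo filter only accepts factors in that range:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

public class SpeedCommand {
    // Build an ffmpeg argument list that plays the input `speed` times faster.
    public static String[] build(String inputPath, String outputPath, double speed) {
        if (speed < 0.5 || speed > 2.0) {
            throw new IllegalArgumentException("atempo only supports factors in [0.5, 2.0]");
        }
        // Video timestamps are scaled by 1/speed, audio tempo by speed
        String filter = String.format(Locale.US,
                "[0:v]setpts=%.2f*PTS[v];[0:a]atempo=%.2f[a]", 1.0 / speed, speed);
        List<String> cmd = new ArrayList<>();
        cmd.add("-y");
        cmd.add("-i"); cmd.add(inputPath);
        cmd.add("-filter_complex"); cmd.add(filter);
        cmd.add("-map"); cmd.add("[v]");
        cmd.add("-map"); cmd.add("[a]");
        cmd.add("-b:v"); cmd.add("2097k");
        cmd.add("-r"); cmd.add("60");
        cmd.add("-vcodec"); cmd.add("mpeg4");
        cmd.add(outputPath);
        return cmd.toArray(new String[0]);
    }
}
```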



===> Add two mp3 files 

StringBuilder sb = new StringBuilder();
sb.append("-i ");
sb.append(textSngname);
sb.append(" -i ");
sb.append(mAudioFilename);
sb.append(" -filter_complex [0:0][1:0]concat=n=2:v=0:a=1[out] -map [out] ");
sb.append(finalfile);
---> ffmpeg.execute(sb.toString().split(" "), new ExecuteBinaryResponseHandler()




===> Add three mp3 files

StringBuilder sb = new StringBuilder();
sb.append("-i ");
sb.append(firstSngname);
sb.append(" -i ");
sb.append(textSngname);
sb.append(" -i ");
sb.append(mAudioFilename);
sb.append(" -filter_complex [0:0][1:0][2:0]concat=n=3:v=0:a=1[out] -map [out] ");
sb.append(finalfile);
---> ffmpeg.execute(sb.toString().split(" "), new ExecuteBinaryResponseHandler()
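The two- and three-file commands above generalize: for n inputs the filter is [0:0][1:0]...concat=n=&lt;n&gt;:v=0:a=1[out]. Building the arguments as a list also avoids the fragile split(" ") step, which breaks on paths containing spaces. A sketch of such a helper (class and method names are hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

public class ConcatCommand {
    // Build ffmpeg arguments that concatenate the given audio files into outputPath.
    public static String[] build(List<String> inputs, String outputPath) {
        List<String> cmd = new ArrayList<>();
        StringBuilder filter = new StringBuilder();
        for (int i = 0; i < inputs.size(); i++) {
            cmd.add("-i");
            cmd.add(inputs.get(i));
            filter.append("[").append(i).append(":0]"); // one audio pad per input
        }
        filter.append("concat=n=").append(inputs.size()).append(":v=0:a=1[out]");
        cmd.add("-filter_complex"); cmd.add(filter.toString());
        cmd.add("-map"); cmd.add("[out]");
        cmd.add(outputPath);
        return cmd.toArray(new String[0]);
    }
}
```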
