
Java - Write Video and Audio at the same time to FFmpeg

Hello fellow developers,

I'm currently developing a tool that renders videos by launching FFmpeg as a Java Process and feeding video frames to it.

I currently use the following FFmpeg command: ffmpeg -y -f rawvideo -pix_fmt rgb24 -s %WIDTH%x%HEIGHT% -r %FPS% -i - -an -c:v libx264 -preset ultrafast -pix_fmt yuv420p "%FILENAME%.mp4", where the placeholders are replaced with real values.
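For reference, the placeholder substitution can be expressed directly as an argument list; a minimal sketch of the command above (class and method names are my own, not from the question):

```java
import java.util.Arrays;
import java.util.List;

public class FfmpegArgs {
    // Builds the argument list for the command shown above,
    // substituting the width/height/fps/filename placeholders.
    static List<String> videoArgs(int width, int height, int fps, String filename) {
        return Arrays.asList(
                "-y",
                "-f", "rawvideo",
                "-pix_fmt", "rgb24",
                "-s", width + "x" + height,
                "-r", String.valueOf(fps),
                "-i", "-",          // raw frames arrive on stdin
                "-an",              // no audio (yet)
                "-c:v", "libx264",
                "-preset", "ultrafast",
                "-pix_fmt", "yuv420p",
                filename + ".mp4");
    }
}
```

Passing the arguments as a list avoids any shell-quoting issues with the output filename.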

The code I'm using to initialize FFmpeg:

    //commandArgs is a list of command line arguments for FFmpeg
    List<String> command = new ArrayList<String>();
    command.add("ffmpeg");
    command.addAll(commandArgs);

    process = new ProcessBuilder(command).directory(outputFolder).start();

    // drain FFmpeg's stdout/stderr into a log so the process can't block on full pipe buffers
    OutputStream exportLogOut = new FileOutputStream("export.log");
    new StreamPipe(process.getInputStream(), exportLogOut).start();
    new StreamPipe(process.getErrorStream(), exportLogOut).start();

    outputStream = process.getOutputStream();
    channel = Channels.newChannel(outputStream);
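The StreamPipe class isn't shown in the question; presumably it is a small thread that copies one stream into another (draining stdout/stderr this way matters, because FFmpeg can stall if its pipe buffers fill up). A minimal sketch of such a helper, assuming that behavior:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Hypothetical helper: copies everything from 'in' to 'out' on its own thread.
public class StreamPipe extends Thread {
    private final InputStream in;
    private final OutputStream out;

    public StreamPipe(InputStream in, OutputStream out) {
        this.in = in;
        this.out = out;
        setDaemon(true); // don't keep the JVM alive just for logging
    }

    @Override
    public void run() {
        byte[] buf = new byte[8192];
        int n;
        try {
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            out.flush();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```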

Then, I have the following method to write a ByteBuffer containing a video frame to FFmpeg:

public void consume(ByteBuffer buf) {
    try {
        // writes the whole buffer; the channel returned by
        // Channels.newChannel loops internally until all bytes are consumed
        channel.write(buf);
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        // return the buffer to the pool even if the write failed
        ByteBufferPool.release(buf);
    }
}

Now, my question is: how would I go ahead and write synchronized audio data to the output file? I assume I need to use multiple pipes, and of course I'll have to modify my command line arguments, but I need help:

1) what kind of Audio Data do I need to feed FFmpeg with?
2) how do I feed Audio and Video in one go?
3) how do I keep Audio and Video synchronized?

Thanks in advance for any help!

Greetings, CrushedPixel

This is what muxing formats are for; ideally you want to use a muxing format to feed data to FFmpeg. An example of how FFmpeg does this internally is the interaction between ffmpeg.exe and ffserver.exe, which communicate through a custom/internal streaming file format called FFM. Complete details about the implementation can be found here. You can obviously also use other muxing formats, as simple as AVI. Synchronization is automatic since the file provides timestamps.
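If you go this route, only the input side of the command changes: FFmpeg reads a single, already-muxed stream from stdin and takes timestamps from the container. A sketch of such an argument list (using NUT as the container; writing the muxed stream itself from Java is not shown here, and the codec choices are my own assumptions):

```java
import java.util.Arrays;
import java.util.List;

public class MuxedArgs {
    // Sketch: read a pre-muxed NUT stream (video + audio) from stdin and encode it.
    // The container carries its own timestamps, so A/V sync is handled by FFmpeg.
    static List<String> nutInputArgs(String outFile) {
        return Arrays.asList(
                "-y",
                "-f", "nut",
                "-i", "-",
                "-c:v", "libx264",
                "-preset", "ultrafast",
                "-pix_fmt", "yuv420p",
                "-c:a", "aac",
                outFile);
    }
}
```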

As for the type of audio data: it can really be anything, but most people will use raw, interleaved PCM audio (either float or int16).
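With raw PCM fed over a separate pipe, keeping the streams aligned comes down to writing a fixed amount of audio per video frame. A sketch of the arithmetic, assuming interleaved s16le PCM (2 bytes per sample):

```java
public class AudioSync {
    // Bytes of interleaved s16le PCM that accompany one video frame.
    // sampleRate in Hz, channels (2 = stereo), 2 bytes per int16 sample.
    static int bytesPerFrame(int sampleRate, int channels, int fps) {
        // Note: only exact when sampleRate is divisible by fps; otherwise
        // the remainder samples must be spread across frames to avoid drift.
        return (sampleRate / fps) * channels * 2;
    }
}
```

For example, 44.1 kHz stereo at 60 fps works out to 735 samples, i.e. 2940 bytes of audio per frame.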

Take a look at https://github.com/artclarke/humble-video, which is a wrapper around FFmpeg in Java. You can add video/audio streams dynamically to an encoder.

