
Java - Write Video and Audio at the same time to FFmpeg

Hello fellow developers,

I'm currently developing a tool which can render videos by executing FFmpeg using a Java Process and feeding video frames to it.

I currently use the following FFmpeg command: ffmpeg -y -f rawvideo -pix_fmt rgb24 -s %WIDTH%x%HEIGHT% -r %FPS% -i - -an -c:v libx264 -preset ultrafast -pix_fmt yuv420p "%FILENAME%.mp4", where the placeholders are obviously replaced with real values.

The code I'm using to initialize FFmpeg:

    // commandArgs is a list of command-line arguments for FFmpeg
    List<String> command = new ArrayList<String>();
    command.add("ffmpeg");
    command.addAll(commandArgs);

    process = new ProcessBuilder(command).directory(outputFolder).start();

    // pipe FFmpeg's stdout and stderr into a log file
    OutputStream exportLogOut = new FileOutputStream("export.log");
    new StreamPipe(process.getInputStream(), exportLogOut).start();
    new StreamPipe(process.getErrorStream(), exportLogOut).start();

    // FFmpeg reads the raw video frames from its stdin
    outputStream = process.getOutputStream();
    channel = Channels.newChannel(outputStream);

Then, I have the following method to write a ByteBuffer containing a video frame to FFmpeg:

    public void consume(ByteBuffer buf) {
        try {
            channel.write(buf);
            ByteBufferPool.release(buf);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

Now, my question is: how would I go ahead and write synchronized audio data to the output file? I assume that I need to use multiple pipes, and of course I will have to modify my command-line arguments, but I need help with the following:

1) What kind of audio data do I need to feed FFmpeg with?
2) How do I feed audio and video in one go?
3) How do I keep audio and video synchronized?

Thanks in advance for any help!

Greetings, CrushedPixel

This is what muxing formats are for: ideally you want to use a muxing format to feed data to FFmpeg. An example of how FFmpeg does this internally is the interaction between ffmpeg.exe and ffserver.exe, which happens through a custom/internal streaming file format called FFM. Complete details about the implementation can be found here. You can obviously also use other muxing formats, as simple as AVI. Synchronization is automatic since the file format provides timestamps.
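An alternative to muxing on the Java side is to give FFmpeg two separate inputs and let it do the muxing itself, e.g. via named pipes on Linux/macOS (created beforehand with mkfifo). Below is a minimal, hedged sketch of how such a command could be assembled; the FIFO paths, resolution, frame rate, sample rate and the aac audio codec are illustrative assumptions, not something from the question:

```java
import java.util.ArrayList;
import java.util.List;

public class FfmpegCommandSketch {
    // Builds an FFmpeg invocation that reads raw RGB video from one named
    // pipe and raw interleaved s16le PCM audio from another, then encodes
    // and muxes both into a single MP4. All concrete values (size, rates,
    // codecs) are placeholders for illustration.
    static List<String> buildCommand(String videoFifo, String audioFifo, String outFile) {
        List<String> cmd = new ArrayList<>();
        cmd.add("ffmpeg");
        cmd.add("-y");
        // video input: raw RGB frames, as in the original command
        cmd.addAll(List.of("-f", "rawvideo", "-pix_fmt", "rgb24",
                "-s", "1280x720", "-r", "30", "-i", videoFifo));
        // audio input: raw interleaved 16-bit little-endian PCM
        cmd.addAll(List.of("-f", "s16le", "-ar", "44100", "-ac", "2", "-i", audioFifo));
        // encode video and audio, mux both streams into one file
        cmd.addAll(List.of("-c:v", "libx264", "-preset", "ultrafast",
                "-pix_fmt", "yuv420p", "-c:a", "aac", outFile));
        return cmd;
    }

    public static void main(String[] args) {
        System.out.println(String.join(" ",
                buildCommand("/tmp/video.fifo", "/tmp/audio.fifo", "out.mp4")));
    }
}
```

Note that with two pipes you must write video and audio from two separate threads, since FFmpeg consumes both inputs concurrently and a single writer thread can deadlock when one pipe's buffer fills up.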

As for the type of audio data, this can really be anything; most people will use raw, interleaved PCM audio (either float or int16).
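For illustration, here is a small sketch of what raw interleaved int16 PCM looks like in Java: one second of a 440 Hz sine tone in stereo, little-endian, matching what an FFmpeg input declared as -f s16le -ar 44100 -ac 2 would expect. The frequency and amplitude are arbitrary example values:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class PcmSketch {
    // One second of a sine tone as interleaved stereo int16 PCM,
    // little-endian: for each sample, a left short then a right short.
    static ByteBuffer sineTone(int sampleRate, double freqHz) {
        // 2 channels * 2 bytes per sample
        ByteBuffer buf = ByteBuffer.allocate(sampleRate * 2 * 2)
                .order(ByteOrder.LITTLE_ENDIAN);
        for (int i = 0; i < sampleRate; i++) {
            short s = (short) (Math.sin(2 * Math.PI * freqHz * i / sampleRate)
                    * Short.MAX_VALUE * 0.5); // half amplitude to avoid clipping
            buf.putShort(s); // left channel
            buf.putShort(s); // right channel
        }
        buf.flip();
        return buf;
    }
}
```

A buffer like this could be written to the audio pipe the same way the question's consume() method writes video frames.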

Take a look at https://github.com/artclarke/humble-video which is a wrapper around FFmpeg in Java. You can add video/audio streams dynamically to an encoder.

