
FFMPEG Create internal pipeline for adding raw frames to AVI file (no input file)

I have an application that reads in a raw video file, does some image processing on each frame, then feeds the resulting BGRA-format byte[] frames to FFmpeg to eventually create an AVI file. Since this process works differently from every FFmpeg example I've seen, in that there is no existing input file, I'm wondering if anyone knows how to do this.

I initialize the FFMPEG container:

ProcessBuilder pBuilder = new ProcessBuilder(raid.getLocation()
        + "\\ffmpeg\\bin\\ffmpeg.exe", "-r", "30", "-vcodec",
        "rawvideo", "-f", "rawvideo", "-pix_fmt", "bgra", "-s",
        size, "-i", "pipe:0", "-r", "30", "-y", "-c:v", "libx264",
        "C:\\export\\2015-02-03\\1500\\EXPORT6.avi");

try
{
    process = pBuilder.start();
}
catch (IOException e)
{
    e.printStackTrace();
}

ffmpegInput = process.getOutputStream();

For each incoming byte[] array frame, I add the frame to the container ("src" is a BufferedImage that I'm converting to a byte array):

try
{
     ByteArrayOutputStream baos = new ByteArrayOutputStream();
     ImageIO.write(src, ".png", baos);
     ffmpegInput.write(baos.toByteArray());
}
catch (IOException e)
{
     e.printStackTrace();
}

And once all frames have been written, I close the container:

try
{
     ffmpegInput.flush();
     ffmpegInput.close();
}
catch (IOException e)
{
     e.printStackTrace();
}

The AVI file is created but it displays an error when opening. The FFMPEG logger displays this as the error:

ffmpeg version N-71102-g1f5d1ee Copyright (c) 2000-2015 the FFmpeg developers built with gcc 4.9.2 (GCC)
  configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-lzma --enable-decklink --enable-zlib
  libavutil      54. 20.101 / 54. 20.101
  libavcodec     56. 30.100 / 56. 30.100
  libavformat    56. 26.101 / 56. 26.101
  libavdevice    56.  4.100 / 56.  4.100
  libavfilter     5. 13.101 /  5. 13.101
  libswscale      3.  1.101 /  3.  1.101
  libswresample   1.  1.100 /  1.  1.100
  libpostproc    53.  3.100 / 53.  3.100
Input #0, rawvideo, from 'pipe:0':
  Duration: N/A, bitrate: 294912 kb/s
    Stream #0:0: Video: rawvideo (BGRA / 0x41524742), bgra, 640x480, 294912 kb/s, 30 tbr, 30 tbn, 30 tbc
No pixel format specified, yuv444p for H.264 encoding chosen.
Use -pix_fmt yuv420p for compatibility with outdated media players.
[libx264 @ 00000000003bcbe0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2
[libx264 @ 00000000003bcbe0] profile High 4:4:4 Predictive, level 3.0, 4:4:4 8-bit
Output #0, avi, to 'C:\export\2015-02-03\1500\EXPORT6.avi':
  Metadata:
    ISFT            : Lavf56.26.101
    Stream #0:0: Video: h264 (libx264) (H264 / 0x34363248), yuv444p, 640x480, q=-1--1, 30 fps, 30 tbn, 30 tbc
    Metadata:
      encoder         : Lavc56.30.100 libx264
Stream mapping:
  Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
frame=    0 fps=0.0 q=0.0 Lsize=       6kB time=00:00:00.00 bitrate=N/A    
video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
Output file is empty, nothing was encoded (check -ss / -t / -frames parameters if used)

Any insight or ideas would be greatly appreciated!

Replace the ProcessBuilder arguments with this:

new ProcessBuilder(PATHOFYOURFFMPEG, "-r", "30", "-s", SIZEOFYOURFRAME,
        "-vcodec", "rawvideo", "-f", "rawvideo", "-pix_fmt", "bgra", "-i",
        "pipe:0", "-r", "30", PATHANDFILEEXTENSION);

Basically, this says the input frames come in at 30 fps, their codec and format are rawvideo, and the pixel format is bgra (spelling matters here; I misspelled it and it took me a full work day to figure out). The input is pipe:0 (0 for stdin, 1 for stdout, 2 for stderr). The confusing part is on the Java side: you get the process's OutputStream and write your frame data (byte[]) to it, which ffmpeg then reads as its stdin. Confusing, right?
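To make the stdin/stdout relationship concrete, here is a minimal sketch of that wiring. The method and variable names (buildCommand, writeFrame, the "ffmpeg" path) are placeholders, not part of the original code; the argument order mirrors the answer's ProcessBuilder call, and writing to the process's OutputStream is what feeds ffmpeg's pipe:0.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.Arrays;
import java.util.List;

public class FfmpegPipe {
    // Build the ffmpeg argument list. Options placed before -i describe
    // the raw input stream; options placed after -i describe the output.
    static List<String> buildCommand(String ffmpegPath, String size, String outFile) {
        return Arrays.asList(
                ffmpegPath,
                "-r", "30",           // input frame rate
                "-s", size,           // input frame size, e.g. "640x480"
                "-vcodec", "rawvideo",
                "-f", "rawvideo",     // no container: bare frames
                "-pix_fmt", "bgra",   // input pixel layout
                "-i", "pipe:0",       // read frames from stdin
                "-r", "30",           // output frame rate
                outFile);
    }

    // Write one raw frame to ffmpeg's stdin, i.e. the OutputStream
    // obtained from Process.getOutputStream().
    static void writeFrame(OutputStream ffmpegStdin, byte[] frame) throws IOException {
        ffmpegStdin.write(frame);
    }
}
```

In real use you would call `new ProcessBuilder(FfmpegPipe.buildCommand(...)).start()`, pass `process.getOutputStream()` to writeFrame for each frame, then flush and close the stream so ffmpeg sees end-of-input and finalizes the file.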

Finally, I specified -r 30 again, and this is not a mistake: options before -i apply to the input, and options after -i apply to the output. So here I'm saying the output video will be 30 fps as well. Hope this helps!

PS: I tried passing the byte[] frames through a queue and an ArrayList as a cache. For some reason Java kept reusing the same single copy over and over, and it took me half a day to figure that out. So check your frame data before passing it to ffmpeg.
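On checking the frame data: the question's loop sends PNG-encoded bytes (via ImageIO.write) down the pipe, but with -f rawvideo -pix_fmt bgra ffmpeg expects exactly width × height × 4 packed bytes per frame in B, G, R, A order. A sketch of that conversion, assuming the frames come from a BufferedImage as in the question (the class and method names here are illustrative):

```java
import java.awt.image.BufferedImage;

public class BgraFrames {
    // Convert a BufferedImage into the packed byte[] that a
    // rawvideo/bgra input expects: 4 bytes per pixel, row-major,
    // in B, G, R, A order. ImageIO.write(src, "png", ...) would
    // instead emit a compressed PNG container, which ffmpeg cannot
    // interpret as raw frames.
    static byte[] toBgra(BufferedImage src) {
        int w = src.getWidth(), h = src.getHeight();
        byte[] out = new byte[w * h * 4];
        int i = 0;
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int argb = src.getRGB(x, y);     // packed as 0xAARRGGBB
                out[i++] = (byte) (argb);        // B
                out[i++] = (byte) (argb >> 8);   // G
                out[i++] = (byte) (argb >> 16);  // R
                out[i++] = (byte) (argb >>> 24); // A
            }
        }
        return out;
    }
}
```

Writing the array returned by toBgra for each frame, instead of PNG data, gives ffmpeg input that matches the -s and -pix_fmt options it was told to expect.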

I believe you can add [ ffmpeg_folder ]/bin to %PATH% and then call

ffmpeg .
