How to encode images into a video file in Java through programming?

I am trying to encode some images of the same resolution into a video file. For this I have tried:

JCodec

  • jcodec .. example explained

    But it is very time-consuming, it is not a suitable tool for encoding a large number of images, and the output it produces has a QuickTime extension.

FFMPEG

  • FFMPEG .. example explained

    But ffmpeg can only create a video from image files; the images need to exist physically on the file system.

I have heard that Xuggler can be used via its API in a Java program to create video files, but its website seems to be broken, so I have not been able to try it.

Does anyone know how to encode images into a video file in Java? Please help!

Thanks in advance!

Xuggler is deprecated; use Humble-Video instead. It already ships with some demo projects, including one that takes screenshots and turns them into a video file: RecordAndEncodeVideo.java

/*******************************************************************************
 * Copyright (c) 2014, Art Clarke.  All rights reserved.
 * <p>
 * This file is part of Humble-Video.
 * <p>
 * Humble-Video is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Affero General Public License as published by
 * the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 * <p>
 * Humble-Video is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU Affero General Public License for more details.
 * <p>
 * You should have received a copy of the GNU Affero General Public License
 * along with Humble-Video.  If not, see <http://www.gnu.org/licenses/>.
 *******************************************************************************/
package io.humble.video.demos;

import io.humble.video.*;
import io.humble.video.awt.MediaPictureConverter;
import io.humble.video.awt.MediaPictureConverterFactory;
import org.apache.commons.cli.*;

import java.awt.*;
import java.awt.image.BufferedImage;
import java.io.IOException;

/**
 * Records the contents of your computer screen to a media file for the passed in duration.
 * This is meant as a demonstration program to teach the use of the Humble API.
 * <p>
 * Concepts introduced:
 * </p>
 * <ul>
 * <li>Muxer: A {@link Muxer} object is a container you can write media data to.</li>
 * <li>Encoders: An {@link Encoder} object lets you convert {@link MediaAudio} or {@link MediaPicture} objects into {@link MediaPacket} objects
 * so they can be written to {@link Muxer} objects.</li>
 * </ul>
 *
 * <p>
 * To run from maven, do:
 * </p>
 * <pre>
 * mvn install exec:java -Dexec.mainClass="io.humble.video.demos.RecordAndEncodeVideo" -Dexec.args="filename.mp4"
 * </pre>
 *
 * @author aclarke
 *
 */
public class RecordAndEncodeVideo
{
    /**
     * Records the screen
     */
    private static void recordScreen (String filename, String formatname, String codecname, int duration, int snapsPerSecond) throws AWTException, InterruptedException, IOException
    {
        /**
         * Set up the AWT infrastructure to take screenshots of the desktop.
         */
        final Robot robot = new Robot();
        final Toolkit toolkit = Toolkit.getDefaultToolkit();
        final Rectangle screenbounds = new Rectangle(toolkit.getScreenSize());

        final Rational framerate = Rational.make(1, snapsPerSecond);

        /** First we create a muxer using the passed in filename and formatname if given. */
        final Muxer muxer = Muxer.make(filename, null, formatname);

        /** Now, we need to decide what type of codec to use to encode video. Muxers
         * have limited sets of codecs they can use. We're going to pick the first one that
         * works, or if the user supplied a codec name, we're going to force-fit that
         * in instead.
         */
        final MuxerFormat format = muxer.getFormat();
        final Codec codec;
        if (codecname != null)
        {
            codec = Codec.findEncodingCodecByName(codecname);
        }
        else
        {
            codec = Codec.findEncodingCodec(format.getDefaultVideoCodecId());
        }

        /**
         * Now that we know what codec, we need to create an encoder
         */
        Encoder encoder = Encoder.make(codec);

        /**
         * Video encoders need to know at a minimum:
         *   width
         *   height
         *   pixel format
         * Some also need to know frame-rate (older codecs that had a fixed rate at which video files could
         * be written needed this). There are many other options you can set on an encoder, but we're
         * going to keep it simpler here.
         */
        encoder.setWidth(screenbounds.width);
        encoder.setHeight(screenbounds.height);
        // We are going to use 420P as the format because that's what most video formats these days use
        final PixelFormat.Type pixelformat = PixelFormat.Type.PIX_FMT_YUV420P;
        encoder.setPixelFormat(pixelformat);
        encoder.setTimeBase(framerate);

        /** An annoyance of some formats is that they need global (rather than per-stream) headers,
         * and in that case you have to tell the encoder. And since Encoders are decoupled from
         * Muxers, there is no easy way to know this beyond checking the muxer format's flags,
         * as we do here.
         */
        if (format.getFlag(MuxerFormat.Flag.GLOBAL_HEADER))
        {
            encoder.setFlag(Encoder.Flag.FLAG_GLOBAL_HEADER, true);
        }

        /** Open the encoder. */
        encoder.open(null, null);


        /** Add this stream to the muxer. */
        muxer.addNewStream(encoder);

        /** And open the muxer for business. */
        muxer.open(null, null);

        /** Next, we need to make sure we have the right MediaPicture format objects
         * to encode data with. Java (and most on-screen graphics programs) use some
         * variant of Red-Green-Blue image encoding (a.k.a. RGB or BGR). Most video
         * codecs use some variant of YCrCb formatting. So we're going to have to
         * convert. To do that, we'll introduce a MediaPictureConverter object later.
         */
        MediaPictureConverter converter = null;
        final MediaPicture picture = MediaPicture.make(encoder.getWidth(), encoder.getHeight(), pixelformat);
        picture.setTimeBase(framerate);

        /** Now begin our main loop of taking screen snaps.
         * We're going to encode and then write out any resulting packets. */
        final MediaPacket packet = MediaPacket.make();
        for (int i = 0; i < duration / framerate.getDouble(); i++)
        {
            /** Make the screen capture && convert image to TYPE_3BYTE_BGR */
            final BufferedImage screen = convertToType(robot.createScreenCapture(screenbounds), BufferedImage.TYPE_3BYTE_BGR);

            /** This is LIKELY not in YUV420P format, so we're going to convert it using some handy utilities. */
            if (converter == null)
            {
                converter = MediaPictureConverterFactory.createConverter(screen, picture);
            }
            converter.toPicture(picture, screen, i);

            do
            {
                encoder.encode(packet, picture);
                if (packet.isComplete())
                {
                    muxer.write(packet, false);
                }
            } while (packet.isComplete());

            /** now we'll sleep until it's time to take the next snapshot. */
            Thread.sleep((long) (1000 * framerate.getDouble()));
        }

        /** Encoders, like decoders, sometimes cache pictures so it can do the right key-frame optimizations.
         * So, they need to be flushed as well. As with the decoders, the convention is to pass in a null
         * input until the output is not complete.
         */
        do
        {
            encoder.encode(packet, null);
            if (packet.isComplete())
            {
                muxer.write(packet, false);
            }
        } while (packet.isComplete());

        /** Finally, let's clean up after ourselves. */
        muxer.close();
    }

    @SuppressWarnings("static-access")
    public static void main (String[] args) throws InterruptedException, IOException, AWTException
    {
        final Options options = new Options();
        options.addOption("h", "help", false, "displays help");
        options.addOption("v", "version", false, "version of this library");
        options.addOption(OptionBuilder.withArgName("format").withLongOpt("format").hasArg().
                withDescription("muxer format to use. If unspecified, we will guess from filename").create("f"));
        options.addOption(OptionBuilder.withArgName("codec")
                .withLongOpt("codec")
                .hasArg()
                .withDescription("codec to use when encoding video; If unspecified, we will guess from format")
                .create("c"));
        options.addOption(OptionBuilder.withArgName("duration")
                .withLongOpt("duration")
                .hasArg()
                .withDescription("number of seconds of screenshot to record; defaults to 10.")
                .create("d"));
        options.addOption(OptionBuilder.withArgName("snaps per second")
                .withLongOpt("snaps")
                .hasArg()
                .withDescription("number of pictures to take per second (i.e. the frame rate); defaults to 5")
                .create("s"));

        final CommandLineParser parser = new org.apache.commons.cli.BasicParser();
        try
        {
            final CommandLine cmd = parser.parse(options, args);
            final String[] parsedArgs = cmd.getArgs();
            if (cmd.hasOption("version"))
            {
                // let's find what version of the library we're running
                final String version = io.humble.video_native.Version.getVersionInfo();
                System.out.println("Humble Version: " + version);
            }
            else if (cmd.hasOption("help") || parsedArgs.length != 1)
            {
                final HelpFormatter formatter = new HelpFormatter();
                formatter.printHelp(RecordAndEncodeVideo.class.getCanonicalName() + " <filename>", options);
            }
            else
            {
                /**
                 * Read in some option values and their defaults.
                 */
                final int duration = Integer.parseInt(cmd.getOptionValue("duration", "10"));
                if (duration <= 0)
                {
                    throw new IllegalArgumentException("duration must be > 0");
                }
                final int snaps = Integer.parseInt(cmd.getOptionValue("snaps", "5"));
                if (snaps <= 0)
                {
                    throw new IllegalArgumentException("snaps must be > 0");
                }
                final String codecname = cmd.getOptionValue("codec");
                final String formatname = cmd.getOptionValue("format");
                final String filename = cmd.getArgs()[0];

                recordScreen(filename, formatname, codecname, duration, snaps);
            }
        } catch (ParseException e)
        {
            System.err.println("Exception parsing command line: " + e.getLocalizedMessage());
        }
    }

    /**
     * Convert a {@link BufferedImage} of any type, to {@link BufferedImage} of a
     * specified type. If the source image is the same type as the target type,
     * then original image is returned, otherwise new image of the correct type is
     * created and the content of the source image is copied into the new image.
     *
     * @param sourceImage
     *          the image to be converted
     * @param targetType
     *          the desired BufferedImage type
     *
     * @return a BufferedImage of the specifed target type.
     *
     * @see BufferedImage
     */
    public static BufferedImage convertToType (BufferedImage sourceImage, int targetType)
    {
        BufferedImage image;

        // if the source image is already the target type, return the source image

        if (sourceImage.getType() == targetType)
        {
            image = sourceImage;
        }

        // otherwise create a new image of the target type and draw the new
        // image

        else
        {
            image = new BufferedImage(sourceImage.getWidth(), sourceImage.getHeight(), targetType);
            image.getGraphics().drawImage(sourceImage, 0, 0, null);
        }

        return image;
    }
}

Check out the other demos: Humble Video demos
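
If, as in the question, the frames are already BufferedImage objects in memory rather than screen captures, the same Humble-Video calls used in RecordAndEncodeVideo above can be reused. Below is a minimal sketch, not one of the official demos: the class name EncodeImagesToVideo, the encodeImages method, and the caller-supplied frame list and frame rate are illustrative assumptions.

import io.humble.video.*;
import io.humble.video.awt.MediaPictureConverter;
import io.humble.video.awt.MediaPictureConverterFactory;

import java.awt.image.BufferedImage;
import java.io.IOException;
import java.util.List;

public class EncodeImagesToVideo {

    /**
     * Encodes a non-empty list of same-sized BufferedImages into a video file
     * (for example "out.mp4") at the given frame rate.
     */
    public static void encodeImages(List<BufferedImage> frames, String filename, int fps)
            throws InterruptedException, IOException {
        final Rational framerate = Rational.make(1, fps);

        /** Create the muxer; the format is guessed from the filename extension. */
        final Muxer muxer = Muxer.make(filename, null, null);
        final MuxerFormat format = muxer.getFormat();
        final Codec codec = Codec.findEncodingCodec(format.getDefaultVideoCodecId());

        /** Configure the encoder exactly as the screen-recording demo does. */
        final Encoder encoder = Encoder.make(codec);
        encoder.setWidth(frames.get(0).getWidth());
        encoder.setHeight(frames.get(0).getHeight());
        final PixelFormat.Type pixelformat = PixelFormat.Type.PIX_FMT_YUV420P;
        encoder.setPixelFormat(pixelformat);
        encoder.setTimeBase(framerate);
        if (format.getFlag(MuxerFormat.Flag.GLOBAL_HEADER))
            encoder.setFlag(Encoder.Flag.FLAG_GLOBAL_HEADER, true);

        encoder.open(null, null);
        muxer.addNewStream(encoder);
        muxer.open(null, null);

        final MediaPicture picture = MediaPicture.make(encoder.getWidth(), encoder.getHeight(), pixelformat);
        picture.setTimeBase(framerate);
        final MediaPacket packet = MediaPacket.make();
        MediaPictureConverter converter = null;

        for (int i = 0; i < frames.size(); i++) {
            /** The converter expects TYPE_3BYTE_BGR input, as in the demo above. */
            BufferedImage frame = frames.get(i);
            if (frame.getType() != BufferedImage.TYPE_3BYTE_BGR) {
                final BufferedImage bgr = new BufferedImage(frame.getWidth(), frame.getHeight(),
                        BufferedImage.TYPE_3BYTE_BGR);
                bgr.getGraphics().drawImage(frame, 0, 0, null);
                frame = bgr;
            }
            if (converter == null)
                converter = MediaPictureConverterFactory.createConverter(frame, picture);
            converter.toPicture(picture, frame, i);

            do {
                encoder.encode(packet, picture);
                if (packet.isComplete())
                    muxer.write(packet, false);
            } while (packet.isComplete());
        }

        /** Flush the encoder, then close the container. */
        do {
            encoder.encode(packet, null);
            if (packet.isComplete())
                muxer.write(packet, false);
        } while (packet.isComplete());
        muxer.close();
    }
}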

I use this in real time in a webapp.

If you want to stream this in real time, you will need an RTSP server. You can use a full-blown framework such as Red5 Server or Wowza Streaming Engine, or build your own server with Netty, which has shipped a built-in RTSP codec since version 3.2.

From the command line there are several ways to convert images into a video, and you can run those commands from Java. You can get the commands from the following links:

  1. Convert a set of images into a video using ffmpeg
  2. Create a video slideshow from images

I am sharing a few code snippets that address this:

Code to save a PNG image posted from an HTML5 canvas

// Decode the Base64-encoded PNG posted from the HTML5 canvas
// (Base64 is org.apache.commons.codec.binary.Base64 from Apache Commons Codec).
byte[] pic = Base64.decodeBase64(request.getParameter("pic"));
String frameCount = request.getParameter("frame");
InputStream in = new ByteArrayInputStream(pic);
BufferedImage bImageFromConvert = ImageIO.read(in);
String outdir = "output\\" + frameCount;
File file = new File(outdir);
// Delete any stale frame with the same name, then write the new one.
if(file.isFile()){
    file.delete();
}
ImageIO.write(bImageFromConvert, "png", file);

Code to create images from a video

String filePath = "D:\\temp\\some.mpg";
String outdir = "output";
File file = new File(outdir);
file.mkdirs();

/*
 * Alternative using ProcessBuilder with one argument per token:
 * String command[] = {"D:\\ffmpeg-win32-static\\bin\\ffmpeg", "-i", filePath,
 *         "-r", "30", "-f", "image2", outdir + "\\user%03d.jpg"};
 * ProcessBuilder pb = new ProcessBuilder(command); pb.start();
 */
String commands = "D:\\ffmpeg-win32-static\\bin\\ffmpeg -i " + filePath
        + " -r 30 -f image2 " + outdir + "\\image%5d.png";
Process p = Runtime.getRuntime().exec(commands);

Code to create a video from images

String filePath = "output";
File fileP = new File(filePath);
String commands = "D:\\ffmpeg-win32-static\\bin\\ffmpeg -f image2 -i "
        + fileP + "\\image%5d.png " + fileP + "\\video.mp4";
System.out.println(commands);
Runtime.getRuntime().exec(commands);
System.out.println(fileP.getAbsolutePath());

Credit goes to @yashprit
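
The snippets above start ffmpeg with Runtime.exec(String), which splits the command on whitespace and returns immediately without waiting for ffmpeg to finish. If you drive ffmpeg from Java, a ProcessBuilder with one argument per token is usually more robust. The sketch below is an illustrative assumption rather than part of the original answer: it reuses the ffmpeg path and file names from the snippets above, waits for the conversion to complete, and shows ffmpeg's console output.

import java.io.IOException;

public class RunFfmpeg {
    public static void main(String[] args) throws IOException, InterruptedException {
        // One argument per token avoids quoting and whitespace surprises.
        ProcessBuilder pb = new ProcessBuilder(
                "D:\\ffmpeg-win32-static\\bin\\ffmpeg",
                "-f", "image2",
                "-i", "output\\image%5d.png",
                "output\\video.mp4");
        pb.redirectErrorStream(true);                       // merge ffmpeg's stderr into stdout
        pb.redirectOutput(ProcessBuilder.Redirect.INHERIT); // print ffmpeg's progress to this console
        Process p = pb.start();
        int exitCode = p.waitFor();                         // block until ffmpeg finishes
        System.out.println("ffmpeg exited with code " + exitCode);
    }
}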


Another approach for Android developers:

  1. Create a temporary folder inside your Android app.
  2. Copy your images into the new folder.
  3. Rename the images so that they follow a numerical sequence, for example img1.jpg, img2.jpg, img3.jpg, ...
  4. Run ffmpeg -f image2 -i img%d.jpg /tmp/a.mpg programmatically,

using the following code:

void convertImg_to_vid()
{
    Process chperm;
    try {
        // Requires a rooted device with an ffmpeg binary available on the device:
        // open a root shell and pipe the ffmpeg command into it.
        chperm = Runtime.getRuntime().exec("su");
        DataOutputStream os =
                new DataOutputStream(chperm.getOutputStream());

        os.writeBytes("ffmpeg -f image2 -i img%d.jpg /tmp/a.mpg\n");
        os.writeBytes("exit\n"); // close the shell so waitFor() can return
        os.flush();

        chperm.waitFor();

    } catch (IOException e) {
        e.printStackTrace();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}

Resource link:

  1. Create a video file from images using ffmpeg

The Java Media Framework (JMF) includes a utility that can create a video from a list of JPEG images.

Here is the source code:

JpegImagesToMovie.java

/*
 * @(#)JpegImagesToMovie.java   1.3 01/03/13
 * Copyright (c) 1999-2001 Sun Microsystems, Inc. All Rights Reserved.
 * Sun grants you ("Licensee") a non-exclusive, royalty free, license to use,
 * modify and redistribute this software in source and binary code form,
 * provided that i) this copyright notice and license appear on all copies of
 * the software; and ii) Licensee does not utilize the software in a manner
 * which is disparaging to Sun.
 * This software is provided "AS IS," without a warranty of any kind. ALL
 * EXPRESS OR IMPLIED CONDITIONS, REPRESENTATIONS AND WARRANTIES, INCLUDING ANY
 * IMPLIED WARRANTY OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE OR
 * NON-INFRINGEMENT, ARE HEREBY EXCLUDED. SUN AND ITS LICENSORS SHALL NOT BE
 * LIABLE FOR ANY DAMAGES SUFFERED BY LICENSEE AS A RESULT OF USING, MODIFYING
 * OR DISTRIBUTING THE SOFTWARE OR ITS DERIVATIVES. IN NO EVENT WILL SUN OR ITS
 * LICENSORS BE LIABLE FOR ANY LOST REVENUE, PROFIT OR DATA, OR FOR DIRECT,
 * INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL OR PUNITIVE DAMAGES, HOWEVER
 * CAUSED AND REGARDLESS OF THE THEORY OF LIABILITY, ARISING OUT OF THE USE OF
 * OR INABILITY TO USE SOFTWARE, EVEN IF SUN HAS BEEN ADVISED OF THE
 * POSSIBILITY OF SUCH DAMAGES.
 *
 * This software is not designed or intended for use in on-line control of
 * aircraft, air traffic, aircraft navigation or aircraft communications; or in
 * the design, construction, operation or maintenance of any nuclear
 * facility. Licensee represents and warrants that it will not use or
 * redistribute the Software for such purposes.
 */

 package imagetovideo;

 import java.awt.Dimension;
 import java.io.File;
 import java.io.IOException;
 import java.io.RandomAccessFile;
 import java.net.MalformedURLException;
 import java.util.Vector;
 import javax.media.Buffer;
 import javax.media.ConfigureCompleteEvent;
 import javax.media.ControllerEvent;
 import javax.media.ControllerListener;
 import javax.media.DataSink;
 import javax.media.EndOfMediaEvent; 
 import javax.media.Format;
 import javax.media.Manager; 
 import javax.media.MediaLocator;
 import javax.media.PrefetchCompleteEvent;
 import javax.media.Processor;
 import javax.media.RealizeCompleteEvent;
 import javax.media.ResourceUnavailableEvent;
 import javax.media.Time;
 import javax.media.control.TrackControl;
 import javax.media.datasink.DataSinkErrorEvent;
 import javax.media.datasink.DataSinkEvent;
 import javax.media.datasink.DataSinkListener;
 import javax.media.datasink.EndOfStreamEvent;
 import javax.media.format.VideoFormat;
 import javax.media.protocol.ContentDescriptor;
 import javax.media.protocol.DataSource;
 import javax.media.protocol.FileTypeDescriptor;
 import javax.media.protocol.PullBufferDataSource;
 import javax.media.protocol.PullBufferStream;

 /**
  * This program takes a list of JPEG image files and converts them into a
  * QuickTime movie.
 */
 public class JpegImagesToMovie implements ControllerListener, DataSinkListener {

public boolean doIt(int width, int height, int frameRate, Vector inFiles,
        MediaLocator outML) throws MalformedURLException {
    ImageDataSource ids = new ImageDataSource(width, height, frameRate,
            inFiles);

    Processor p;

    try {
        //System.err
        //      .println("- create processor for the image datasource ...");
        p = Manager.createProcessor(ids);
    } catch (Exception e) {
        System.err
                .println("Yikes!  Cannot create a processor from the data source.");
        return false;
    }

    p.addControllerListener(this);

    // Put the Processor into configured state so we can set
    // some processing options on the processor.
    p.configure();
    if (!waitForState(p, p.Configured)) {
        System.err.println("Failed to configure the processor.");
        return false;
    }

    // Set the output content descriptor to QuickTime.
    p.setContentDescriptor(new ContentDescriptor(
            FileTypeDescriptor.QUICKTIME));

    // Query for the processor for supported formats.
    // Then set it on the processor.
    TrackControl tcs[] = p.getTrackControls();
    Format f[] = tcs[0].getSupportedFormats();
    if (f == null || f.length <= 0) {
        System.err.println("The mux does not support the input format: "
                + tcs[0].getFormat());
        return false;
    }

    tcs[0].setFormat(f[0]);

    //System.err.println("Setting the track format to: " + f[0]);

    // We are done with programming the processor. Let's just
    // realize it.
    p.realize();
    if (!waitForState(p, p.Realized)) {
        System.err.println("Failed to realize the processor.");
        return false;
    }

    // Now, we'll need to create a DataSink.
    DataSink dsink;
    if ((dsink = createDataSink(p, outML)) == null) {
        System.err
                .println("Failed to create a DataSink for the given output MediaLocator: "
                        + outML);
        return false;
    }

    dsink.addDataSinkListener(this);
    fileDone = false;

    System.out.println("Generating the video : "+outML.getURL().toString());

    // OK, we can now start the actual transcoding.
    try {
        p.start();
        dsink.start();
    } catch (IOException e) {
        System.err.println("IO error during processing");
        return false;
    }

    // Wait for EndOfStream event.
    waitForFileDone();

    // Cleanup.
    try {
        dsink.close();
    } catch (Exception e) {
    }
    p.removeControllerListener(this);

    System.out.println("Video creation completed!!!!!");
    return true;
}

/**
 * Create the DataSink.
 */
DataSink createDataSink(Processor p, MediaLocator outML) {

    DataSource ds;

    if ((ds = p.getDataOutput()) == null) {
        System.err
                .println("Something is really wrong: the processor does not have an output DataSource");
        return null;
    }

    DataSink dsink;

    try {
        //System.err.println("- create DataSink for: " + outML);
        dsink = Manager.createDataSink(ds, outML);
        dsink.open();
    } catch (Exception e) {
        System.err.println("Cannot create the DataSink: " + e);
        return null;
    }

    return dsink;
}

Object waitSync = new Object();
boolean stateTransitionOK = true;

/**
 * Block until the processor has transitioned to the given state. Return
 * false if the transition failed.
 */
boolean waitForState(Processor p, int state) {
    synchronized (waitSync) {
        try {
            while (p.getState() < state && stateTransitionOK)
                waitSync.wait();
        } catch (Exception e) {
        }
    }
    return stateTransitionOK;
}

/**
 * Controller Listener.
 */
public void controllerUpdate(ControllerEvent evt) {

    if (evt instanceof ConfigureCompleteEvent
            || evt instanceof RealizeCompleteEvent
            || evt instanceof PrefetchCompleteEvent) {
        synchronized (waitSync) {
            stateTransitionOK = true;
            waitSync.notifyAll();
        }
    } else if (evt instanceof ResourceUnavailableEvent) {
        synchronized (waitSync) {
            stateTransitionOK = false;
            waitSync.notifyAll();
        }
    } else if (evt instanceof EndOfMediaEvent) {
        evt.getSourceController().stop();
        evt.getSourceController().close();
    }
}

Object waitFileSync = new Object();
boolean fileDone = false;
boolean fileSuccess = true;

/**
 * Block until file writing is done.
 */
boolean waitForFileDone() {
    synchronized (waitFileSync) {
        try {
            while (!fileDone)
                waitFileSync.wait();
        } catch (Exception e) {
        }
    }
    return fileSuccess;
}

/**
 * Event handler for the file writer.
 */
public void dataSinkUpdate(DataSinkEvent evt) {

    if (evt instanceof EndOfStreamEvent) {
        synchronized (waitFileSync) {
            fileDone = true;
            waitFileSync.notifyAll();
        }
    } else if (evt instanceof DataSinkErrorEvent) {
        synchronized (waitFileSync) {
            fileDone = true;
            fileSuccess = false;
            waitFileSync.notifyAll();
        }
    }
}

/*public static void main(String args[]) {

    if (args.length == 0)
        prUsage();

    // Parse the arguments.
    int i = 0;
    int width = -1, height = -1, frameRate = 1;
    Vector inputFiles = new Vector();
    String outputURL = null;

    while (i < args.length) {

        if (args[i].equals("-w")) {
            i++;
            if (i >= args.length)
                prUsage();
            width = new Integer(args[i]).intValue();
        } else if (args[i].equals("-h")) {
            i++;
            if (i >= args.length)
                prUsage();
            height = new Integer(args[i]).intValue();
        } else if (args[i].equals("-f")) {
            i++;
            if (i >= args.length)
                prUsage();
            frameRate = new Integer(args[i]).intValue();
        } else if (args[i].equals("-o")) {
            i++;
            if (i >= args.length)
                prUsage();
            outputURL = args[i];
        } else {
            inputFiles.addElement(args[i]);
        }
        i++;
    }

    if (outputURL == null || inputFiles.size() == 0)
        prUsage();

    // Check for output file extension.
    if (!outputURL.endsWith(".mov") && !outputURL.endsWith(".MOV")) {
        System.err
                .println("The output file extension should end with a .mov extension");
        prUsage();
    }

    if (width < 0 || height < 0) {
        System.err.println("Please specify the correct image size.");
        prUsage();
    }

    // Check the frame rate.
    if (frameRate < 1)
        frameRate = 1;

    // Generate the output media locators.
    MediaLocator oml;

    if ((oml = createMediaLocator(outputURL)) == null) {
        System.err.println("Cannot build media locator from: " + outputURL);
        System.exit(0);
    }

    JpegImagesToMovie imageToMovie = new JpegImagesToMovie();
    imageToMovie.doIt(width, height, frameRate, inputFiles, oml);

    System.exit(0);
}*/

static void prUsage() {
    System.err
            .println("Usage: java JpegImagesToMovie -w <width> -h <height> -f <frame rate> -o <output URL> <input JPEG file 1> <input JPEG file 2> ...");
    System.exit(-1);
}

/**
 * Create a media locator from the given string.
 */
static MediaLocator createMediaLocator(String url) {

    MediaLocator ml;

    if (url.indexOf(":") > 0 && (ml = new MediaLocator(url)) != null)
        return ml;

    if (url.startsWith(File.separator)) {
        if ((ml = new MediaLocator("file:" + url)) != null)
            return ml;
    } else {
        String file = "file:" + System.getProperty("user.dir")
                + File.separator + url;
        if ((ml = new MediaLocator(file)) != null)
            return ml;
    }

    return null;
}

// /////////////////////////////////////////////
//
// Inner classes.
// /////////////////////////////////////////////

/**
 * A DataSource to read from a list of JPEG image files and turn that into a
 * stream of JMF buffers. The DataSource is not seekable or positionable.
 */
class ImageDataSource extends PullBufferDataSource {

    ImageSourceStream streams[];

    ImageDataSource(int width, int height, int frameRate, Vector images) {
        streams = new ImageSourceStream[1];
        streams[0] = new ImageSourceStream(width, height, frameRate, images);
    }

    public void setLocator(MediaLocator source) {
    }

    public MediaLocator getLocator() {
        return null;
    }

    /**
     * Content type is of RAW since we are sending buffers of video frames
     * without a container format.
     */
    public String getContentType() {
        return ContentDescriptor.RAW;
    }

    public void connect() {
    }

    public void disconnect() {
    }

    public void start() {
    }

    public void stop() {
    }

    /**
     * Return the ImageSourceStreams.
     */
    public PullBufferStream[] getStreams() {
        return streams;
    }

    /**
     * We could have derived the duration from the number of frames and
     * frame rate. But for the purpose of this program, it's not necessary.
     */
    public Time getDuration() {
        return DURATION_UNKNOWN;
    }

    public Object[] getControls() {
        return new Object[0];
    }

    public Object getControl(String type) {
        return null;
    }
}

/**
 * The source stream to go along with ImageDataSource.
 */
class ImageSourceStream implements PullBufferStream {

    Vector images;
    int width, height;
    VideoFormat format;

    int nextImage = 0; // index of the next image to be read.
    boolean ended = false;

    public ImageSourceStream(int width, int height, int frameRate,
            Vector images) {
        this.width = width;
        this.height = height;
        this.images = images;

        format = new VideoFormat(VideoFormat.JPEG, new Dimension(width,
                height), Format.NOT_SPECIFIED, Format.byteArray,
                (float) frameRate);
    }

    /**
     * We should never need to block assuming data are read from files.
     */
    public boolean willReadBlock() {
        return false;
    }

    /**
     * This is called from the Processor to read a frame worth of video
     * data.
     */
    public void read(Buffer buf) throws IOException {

        // Check if we've finished all the frames.
        if (nextImage >= images.size()) {
            // We are done. Set EndOfMedia.
            //System.err.println("Done reading all images.");
            buf.setEOM(true);
            buf.setOffset(0);
            buf.setLength(0);
            ended = true;
            return;
        }

        String imageFile = (String) images.elementAt(nextImage);
        nextImage++;

        //System.err.println("  - reading image file: " + imageFile);

        // Open a random access file for the next image.
        RandomAccessFile raFile;
        raFile = new RandomAccessFile(imageFile, "r");

        byte data[] = null;

        // Check the input buffer type & size.

        if (buf.getData() instanceof byte[])
            data = (byte[]) buf.getData();

        // Check to see the given buffer is big enough for the frame.
        if (data == null || data.length < raFile.length()) {
            data = new byte[(int) raFile.length()];
            buf.setData(data);
        }

        // Read the entire JPEG image from the file.
        raFile.readFully(data, 0, (int) raFile.length());

        //System.err.println("    read " + raFile.length() + " bytes.");

        buf.setOffset(0);
        buf.setLength((int) raFile.length());
        buf.setFormat(format);
        buf.setFlags(buf.getFlags() | buf.FLAG_KEY_FRAME);

        // Close the random access file.
        raFile.close();
    }

    /**
     * Return the format of each video frame. That will be JPEG.
     */
    public Format getFormat() {
        return format;
    }

    public ContentDescriptor getContentDescriptor() {
        return new ContentDescriptor(ContentDescriptor.RAW);
    }

    public long getContentLength() {
        return 0;
    }

    public boolean endOfStream() {
        return ended;
    }

    public Object[] getControls() {
        return new Object[0];
    }

    public Object getControl(String type) {
        return null;
    }
  }
}

Its doIt method can be called from another class that has a main method:

CreateVideo.java

/*
 * To change this template, choose Tools | Templates
 * and open the template in the editor.
*/
package imagetovideo;

import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.FilenameFilter;
import java.io.IOException;
import java.net.MalformedURLException;
import java.util.Vector;
import javax.media.MediaLocator;


public class CreateVideo{

  public static final File dir = new File("D:\\imagesFolder\\");
  public static final String[] extensions = new String[]{"jpg", "png"};
  public static final FilenameFilter imageFilter = new FilenameFilter() {
    @Override
    public boolean accept(final File dir, String name) {
        for (final String ext : extensions) {
            if (name.endsWith("." + ext)) {
                return (true);
            }
        }
        return (false);
    }
};

// Main function 
public static void main(String[] args) throws IOException {
    File file = new File("D:\\a.mp4");
    if (!file.exists()) {
        file.createNewFile();
    }
    Vector<String> imgLst = new Vector<>();
    if (dir.isDirectory()) {
        // Note: listFiles() does not guarantee any particular order;
        // sort the files here if the frame order matters.
        for (final File f : dir.listFiles(imageFilter)) {
            imgLst.add(f.getAbsolutePath());
        }
    }
    makeVideo("file:\\" + file.getAbsolutePath(), imgLst);
}

 public static void makeVideo(String fileName, Vector imgLst) throws MalformedURLException {
    JpegImagesToMovie imageToMovie = new JpegImagesToMovie();
    MediaLocator oml;
    if ((oml = imageToMovie.createMediaLocator(fileName)) == null) {
        System.err.println("Cannot build media locator from: " + fileName);
        System.exit(0);
    }
    int interval = 40;
    imageToMovie.doIt(720, 360, (1000 / interval), imgLst, oml);
 }  
}
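
A note on the frame rate used above: doIt receives (1000 / interval) frames per second, so interval = 40 gives 1000 / 40 = 25 fps, i.e. each image is shown for 40 ms. Increase interval for a slower slideshow or decrease it for a faster one.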
