
Create video from bitmap files using MediaCodec

I have a list of bitmap files on my SD card. Now I want to create a video from them using MediaCodec. I have checked the MediaCodec documentation but could not find a way to create a video. I don't want to use FFmpeg. I have tried the code below. Any help would be appreciated!

protected void MergeVideo() throws IOException {
        MediaCodec mMediaCodec = MediaCodec.createEncoderByType("video/avc");
        MediaFormat mMediaFormat = MediaFormat.createVideoFormat("video/avc", 320, 240);
        mMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 125000);
        mMediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
        mMediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
        mMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
        mMediaCodec.configure(mMediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mMediaCodec.start();
        ByteBuffer[] mInputBuffers = mMediaCodec.getInputBuffers();

        //for (int i = 0; i < 50; i++) {
        int i = 0;
        File imagesFile = new File(Environment.getExternalStorageDirectory()
                + "/VIDEOFRAME/", "frame-" + i + ".png");

        Bitmap bitmap = BitmapFactory.decodeFile(imagesFile.getAbsolutePath());
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
        bitmap.compress(Bitmap.CompressFormat.PNG, 100, byteArrayOutputStream); // image is the bitmap
        byte[] input = byteArrayOutputStream.toByteArray();

        int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
        if (inputBufferIndex >= 0) {
            ByteBuffer inputBuffer = mInputBuffers[inputBufferIndex];
            inputBuffer.clear();
            inputBuffer.put(input);
            mMediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, 0, 0);
        }
        //}
}

You're missing a few pieces. The answer to this question has some of the information you need, but it was written for someone specifically wanting support in API 16. If you're willing to target API 18 and later, your life will be easier.

The biggest problem with what you have is that MediaCodec input from a ByteBuffer is always in uncompressed YUV format, but you seem to be passing compressed PNG images in. You will need to convert the bitmap to YUV. The exact layout and best method for doing this varies between devices (some use planar, some use semi-planar), but you can find code for doing so. Or just look at the way frames are generated in the buffer-to-buffer parts of EncodeDecodeTest.
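As a rough illustration of that conversion, here is a minimal pure-Java sketch producing the planar (I420) layout from ARGB pixels, using the common BT.601 video-range coefficients. The class name is mine, and for simplicity chroma is subsampled by taking the top-left pixel of each 2x2 block; real devices may expect a semi-planar (NV12/NV21) layout instead, so check the encoder's supported color formats:

```java
// Sketch: convert ARGB pixels (as returned by Bitmap.getPixels) to planar
// YUV 4:2:0 (I420) using BT.601 coefficients. Class/method names illustrative.
public class ArgbToI420 {
    public static byte[] convert(int[] argb, int width, int height) {
        byte[] yuv = new byte[width * height * 3 / 2];
        int yIndex = 0;
        int uIndex = width * height;                    // U plane follows Y
        int vIndex = uIndex + width * height / 4;       // V plane follows U
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                int p = argb[row * width + col];
                int r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;
                int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
                yuv[yIndex++] = (byte) Math.max(16, Math.min(235, y));
                // One chroma sample per 2x2 block (top-left pixel, for brevity)
                if (row % 2 == 0 && col % 2 == 0) {
                    int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                    int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                    yuv[uIndex++] = (byte) Math.max(16, Math.min(240, u));
                    yuv[vIndex++] = (byte) Math.max(16, Math.min(240, v));
                }
            }
        }
        return yuv;
    }
}
```

You would fill the `int[]` with `bitmap.getPixels(...)` and queue the returned array into the codec's input buffer instead of the PNG bytes.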

Alternatively, use Surface input to the MediaCodec. Attach a Canvas to the input surface and draw the bitmap on it. The EncodeAndMuxTest does essentially this, but with OpenGL ES.
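A hedged sketch of the Surface/Canvas path (Android framework code, so it only runs on-device; assumes API 23+ for lockHardwareCanvas, since software Canvas rendering to an encoder input surface may not be supported on earlier releases):

```java
// createInputSurface() must be called after configure() and before start().
Surface inputSurface = mMediaCodec.createInputSurface();
mMediaCodec.start();

// Draw one bitmap frame onto the encoder's input surface.
Canvas canvas = inputSurface.lockHardwareCanvas();
canvas.drawBitmap(bitmap, 0, 0, null);
inputSurface.unlockCanvasAndPost(canvas);

// ...drain encoded output from the codec as usual, then when done:
mMediaCodec.signalEndOfInputStream();
```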

One potential issue is that you're passing in 0 for the frame timestamps. You should pass a real (generated) timestamp in, so that the value gets forwarded to MediaMuxer along with the encoded frame.
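Generating the timestamp is simple arithmetic on the frame index, for example (helper name is mine):

```java
// Sketch: derive each frame's presentation timestamp in microseconds from its
// index and the frame rate, instead of passing 0 to queueInputBuffer().
public class FrameTimestamps {
    public static long presentationTimeUs(long frameIndex, int frameRate) {
        return frameIndex * 1_000_000L / frameRate;
    }
}
```

So at 15 fps the call becomes `mMediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, FrameTimestamps.presentationTimeUs(i, 15), 0);`.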

On very recent devices (API 21+), MediaRecorder can accept Surface input. This may be easier to work with than MediaCodec.
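A minimal sketch of that setup (Android framework code, runs only on-device; the output path variable is illustrative):

```java
// API 21+: MediaRecorder with Surface input handles encoding and muxing.
MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setVideoSize(320, 240);
recorder.setVideoFrameRate(15);
recorder.setVideoEncodingBitRate(125000);
recorder.setOutputFile(outputPath);  // e.g. a file under getExternalFilesDir()
recorder.prepare();

// Valid after prepare() when using VideoSource.SURFACE:
Surface surface = recorder.getSurface();
recorder.start();
// ...draw each bitmap frame onto the surface, then:
recorder.stop();
recorder.release();
```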
