How to use ByteBuffer in the MediaCodec context in Android

So far I have been able to set up a MediaCodec to encode a video stream. The aim is to save user-generated artwork into a video file.

I use Android Bitmap objects of the user artwork to push frames into the stream.

See the code snippet at the bottom of this post (it is the full code; nothing is trimmed):

MediaCodec uses ByteBuffer to deal with video/audio streams.

Bitmaps are backed by an int[], which, if converted to a byte[], requires 4x the size of the int[].
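The 4x factor can be illustrated with plain java.nio, no Android classes needed (the 4x2 frame size here is just an illustrative choice):

```java
import java.nio.ByteBuffer;

public class PixelBufferDemo {
    public static void main(String[] args) {
        int width = 4, height = 2;
        int[] pixels = new int[width * height];                     // one ARGB int per pixel
        ByteBuffer bytes = ByteBuffer.allocate(pixels.length * 4);  // 4 bytes per int
        bytes.asIntBuffer().put(pixels);                            // int view writes 4 bytes per element
        System.out.println(pixels.length + " ints -> " + bytes.capacity() + " bytes");
    }
}
```

Running this prints `8 ints -> 32 bytes`, i.e. the byte view is exactly four times the pixel count.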

I did some research to figure out what contracts are in place for the ByteBuffer when dealing with video/audio streams in MediaCodec, but found almost no information.

So, what are the ByteBuffer usage contracts in MediaCodec?

Does specifying the frame dimensions in the MediaFormat automatically mean that the ByteBuffers have a capacity of width * height * 4 bytes?

(I use one bitmap object per frame)

Thanks for any help.

(edited, code added)

    import java.io.ByteArrayOutputStream;
    import java.io.DataOutputStream;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.nio.ByteBuffer;

    import android.graphics.Rect;
    import android.graphics.Bitmap.CompressFormat;
    import android.media.MediaCodec;
    import android.media.MediaCodec.BufferInfo;
    import android.media.CamcorderProfile;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.util.Log;
    import android.view.View;

    public class VideoCaptureManager {

        private boolean running;

        private long presentationTime;

        public void start(View rootView, String saveFilePath){
            Log.e("OUT", saveFilePath);
            this.running = true;
            this.presentationTime = 0;
            this.capture(rootView, saveFilePath);
        }

        private void capture(final View rootView, String saveFilePath){
            if(rootView != null){
                rootView.setDrawingCacheEnabled(true);

                final Rect drawingRect = new Rect();
                rootView.getDrawingRect(drawingRect);

                try{
                    final File file = new File(saveFilePath);
                    if(file.exists()){
                        // File exists return
                        return;
                    } else {
                        File parent = file.getParentFile();
                        if(!parent.exists()){
                            parent.mkdirs();
                        }
                    }

            new Thread(){
                public void run(){
                    try{
                        DataOutputStream dos = new DataOutputStream(new FileOutputStream(file));

                        MediaCodec codec = MediaCodec.createEncoderByType("video/mp4v-es");
                        MediaFormat mediaFormat = null;
                        if(CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)){
                            mediaFormat = MediaFormat.createVideoFormat("video/mp4v-es", 720, 1280);
                        } else {
                            mediaFormat = MediaFormat.createVideoFormat("video/mp4v-es", 480, 720);
                        }
                        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 700000);
                        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 10);
                        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
                        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
                        codec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

                        codec.start();

                        ByteBuffer[] inputBuffers = codec.getInputBuffers();
                        ByteBuffer[] outputBuffers = codec.getOutputBuffers();

                        while(VideoCaptureManager.this.running){
                            try{
                                int inputBufferIndex = codec.dequeueInputBuffer(-2);
                                if(inputBufferIndex >= 0){
                                    // Fill in the bitmap bytes
                                    // inputBuffers[inputBufferIndex].
                                    ByteArrayOutputStream baos = new ByteArrayOutputStream();
                                    rootView.getDrawingCache().compress(CompressFormat.JPEG, 80, baos);
                                    inputBuffers[inputBufferIndex].put(baos.toByteArray());

                                    codec.queueInputBuffer(inputBufferIndex, 0, inputBuffers[inputBufferIndex].capacity(), presentationTime, MediaCodec.BUFFER_FLAG_CODEC_CONFIG);
                                    presentationTime += 100;
                                }

                                BufferInfo info = new BufferInfo();
                                int outputBufferIndex = codec.dequeueOutputBuffer(info, -2);
                                if(outputBufferIndex >= 0){
                                    // Write the bytes to file
                                    byte[] array = outputBuffers[outputBufferIndex].array(); // THIS THROWS AN EXCEPTION. WHAT IS THE CONTRACT TO DEAL WITH ByteBuffer IN THIS CODE?
                                    if(array != null){
                                        dos.write(array);
                                    }

                                    codec.releaseOutputBuffer(outputBufferIndex, false);
                                } else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED){
                                    outputBuffers = codec.getOutputBuffers();
                                } else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED){
                                    // codec format is changed
                                    MediaFormat format = codec.getOutputFormat();
                                }

                                Thread.sleep(100);
                            }catch(Throwable th){
                                Log.e("OUT", th.getMessage(), th);
                            }
                        }

                        codec.stop();
                        codec.release();
                        codec = null;

                        dos.flush();
                        dos.close();
                    }catch(Throwable th){
                        Log.e("OUT", th.getMessage(), th);
                    }
                }
            }.start();

                }catch(Throwable th){
                    Log.e("OUT", th.getMessage(), th);
                }
            }
        }

        public void stop(){
            this.running = false;
        }
    }

The exact layout of the ByteBuffer is determined by the codec for the input format you've chosen. Not all devices support all possible input formats (e.g. some AVC encoders require planar 4:2:0 YUV, others require semi-planar). Older versions of Android (API 17 and below) didn't really provide a portable way to software-generate video frames for MediaCodec.
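As a side note on the exception flagged in the question's code: MediaCodec hands back direct ByteBuffers, and the java.nio contract says a direct buffer has no accessible backing array, so array() throws. The portable way to extract the bytes is to set position and limit (from BufferInfo.offset and BufferInfo.size) and copy with get(). A minimal plain-Java sketch of that contract (readOutput is a hypothetical helper name):

```java
import java.nio.ByteBuffer;

public class DirectBufferDemo {
    // Copy `size` bytes starting at `offset` out of a (possibly direct) buffer.
    static byte[] readOutput(ByteBuffer buffer, int offset, int size) {
        byte[] chunk = new byte[size];
        buffer.position(offset);
        buffer.limit(offset + size);
        buffer.get(chunk);          // works for both direct and heap buffers
        return chunk;
    }

    public static void main(String[] args) {
        ByteBuffer direct = ByteBuffer.allocateDirect(16);
        direct.put(new byte[]{1, 2, 3, 4});
        direct.flip();

        boolean threw = false;
        try {
            direct.array();         // direct buffers have no backing array
        } catch (UnsupportedOperationException e) {
            threw = true;
        }
        System.out.println("array() threw: " + threw);  // prints: array() threw: true

        byte[] data = readOutput(direct, 0, 4);
        System.out.println(data[0] + "," + data[3]);    // prints: 1,4
    }
}
```

In the question's loop this means replacing the `array()` call with a `get()` copy bounded by the `BufferInfo` fields, then writing that copy to the stream.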

In Android 4.3 (API 18), you have two options. First, MediaCodec now accepts input from a Surface, which means anything you can draw with OpenGL ES can be recorded as a movie. See, for example, the EncodeAndMuxTest sample.

Second, you still have the option of using software-generated YUV 4:2:0 buffers, but now they're more likely to work because there are CTS tests that exercise them. You still have to do runtime detection of planar vs. semi-planar, but there are really only two layouts. See the buffer-to-buffer variants of the EncodeDecodeTest for an example.
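The planar vs. semi-planar distinction only changes how the chroma samples are ordered after the Y plane. A rough sketch, assuming the widely used BT.601 integer coefficients (illustrative only; a real encoder path must match the color format the codec actually reports):

```java
public class YuvLayoutDemo {
    // Convert ARGB pixels to YUV 4:2:0 using common BT.601 integer coefficients.
    // semiPlanar=true  -> NV12-style: Y plane, then interleaved U/V pairs.
    // semiPlanar=false -> I420-style: Y plane, then U plane, then V plane.
    static byte[] argbToYuv420(int[] argb, int width, int height, boolean semiPlanar) {
        int ySize = width * height;
        byte[] out = new byte[ySize * 3 / 2];   // 1.5 bytes per pixel for 4:2:0
        int uvIndex = ySize;
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {
                int c = argb[j * width + i];
                int r = (c >> 16) & 0xFF, g = (c >> 8) & 0xFF, b = c & 0xFF;
                int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
                out[j * width + i] = (byte) Math.max(0, Math.min(255, y));
                if (j % 2 == 0 && i % 2 == 0) {  // one U,V sample per 2x2 block
                    int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                    int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                    if (semiPlanar) {            // interleaved chroma: U V U V ...
                        out[uvIndex++] = (byte) u;
                        out[uvIndex++] = (byte) v;
                    } else {                     // separate chroma planes: U...U V...V
                        int block = (j / 2) * (width / 2) + (i / 2);
                        out[ySize + block] = (byte) u;
                        out[ySize + ySize / 4 + block] = (byte) v;
                    }
                }
            }
        }
        return out;
    }

    public static void main(String[] args) {
        int[] red = new int[8];                  // 4x2 all-red frame
        java.util.Arrays.fill(red, 0xFFFF0000);
        byte[] planar = argbToYuv420(red, 4, 2, false);
        byte[] semi   = argbToYuv420(red, 4, 2, true);
        StringBuilder p = new StringBuilder(), s = new StringBuilder();
        for (int k = 8; k < 12; k++) {           // chroma bytes follow the 8-byte Y plane
            p.append(planar[k] & 0xFF).append(' ');
            s.append(semi[k] & 0xFF).append(' ');
        }
        System.out.println("planar chroma: " + p.toString().trim()); // 90 90 240 240
        System.out.println("semi   chroma: " + s.toString().trim()); // 90 240 90 240
    }
}
```

For an all-red frame the chroma bytes are U=90, V=240 with these coefficients, so the planar buffer ends U U V V while the semi-planar one interleaves to U V U V; the Y plane is identical in both. Detecting which layout a device wants still requires checking the encoder's supported color formats at runtime.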
