MediaCodec.dequeueOutputBuffer taking very long when encoding h264 on Android

I'm trying to encode h264 video on Android for real-time video streaming using MediaCodec, but dequeueOutputBuffer keeps taking a very long time (actually it's very fast sometimes, but very slow at other times; see the log output below). I've seen it take even up to 200 ms for the output buffer to be ready. Is there something I'm doing wrong in my code, or do you think this is an issue with the OMX.Nvidia.h264.encoder?

Maybe I need to downsample the image from 1280x720 to something smaller? Or maybe I need to dequeue and queue more input buffers while I'm waiting for the output buffer? (There are 6 input and 6 output buffers available.) I'm using Android API 19, so I can't use the asynchronous MediaCodec processing method. I'm actually streaming an image from a Google Project Tango tablet, so my other suspicion is that the Tango's background operations are taking too long and causing the encoder to be slow. Any thoughts on what might be slowing this down so much?

01-20 23:36:30.728 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 0.400666ms.
01-20 23:36:30.855 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 94.290667ms.
01-20 23:36:30.880 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 0.57ms.
01-20 23:36:30.929 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 4.878417ms.
01-20 23:36:31.042 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 77.495417ms.
01-20 23:36:31.064 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 0.3225ms.
01-20 23:36:31.182 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 74.777583ms.
01-20 23:36:31.195 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 0.23ms.
01-20 23:36:31.246 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 17.243583ms.
01-20 23:36:31.350 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 80.14725ms.
01-20 23:36:31.373 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 2.493834ms.
01-20 23:36:31.421 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 13.273ms.
01-20 23:36:31.546 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 93.543667ms.
01-20 23:36:31.576 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 5.309334ms.
01-20 23:36:31.619 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 13.402583ms.
01-20 23:36:31.686 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 22.5485ms.
01-20 23:36:31.809 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 91.392083ms.

My relevant code is as follows:

public class StreamingThread extends Thread {
    ...

    // encoding
    private MediaCodec mVideoEncoder = null;
    private ByteBuffer[] mEncoderInputBuffers = null;
    private ByteBuffer[] mEncoderOutputBuffers = null;
    private NV21Convertor mNV21Converter = null;

    public static native VideoFrame getNewFrame();

    public StreamingThread()
    {
        this.setPriority(MAX_PRIORITY);
    }

    @Override
    public void run()
    {
        Looper.prepare();
        init();
        Looper.loop();
    }

    private void init()
    {
        mHandler = new Handler() {
            public void handleMessage(Message msg) {
                // process incoming messages here
                switch(msg.what)
                {
                    case HAVE_NEW_FRAME: // new frame has arrived (signaled from main thread)
                        processBufferedFrames();
                        break;

                    case CLOSE_THREAD:
                        close();
                        break;

                    default:
                        Log.e(LOGTAG, "received unknown message!");
                }
            }
        };

        try {
            ...
            // set up video encoding
            final String mime = "video/avc"; // H.264/AVC
            listAvailableEncoders(mime); // (this creates some debug output only)
            String codec = "OMX.Nvidia.h264.encoder"; // instead, hard-code the codec we want to use for now

            mVideoEncoder = MediaCodec.createByCodecName(codec);
            if(mVideoEncoder == null)
                Log.e(LOGTAG, "Media codec " + codec + " is not available!");

            // TODO: change, based on what we're streaming...
            int FRAME_WIDTH = 1280;
            int FRAME_HEIGHT = 720;

            // https://github.com/fyhertz/libstreaming/blob/ac44416d88ed3112869ef0f7eab151a184bbb78d/src/net/majorkernelpanic/streaming/hw/EncoderDebugger.java
            mNV21Converter = new NV21Convertor();
            mNV21Converter.setSize(FRAME_WIDTH, FRAME_HEIGHT);
            mNV21Converter.setEncoderColorFormat(MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
            mNV21Converter.setColorPanesReversed(true);
            mNV21Converter.setYPadding(0);

            MediaFormat format = MediaFormat.createVideoFormat(mime, FRAME_WIDTH, FRAME_HEIGHT);
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
            // TODO: optimize bit rate
            format.setInteger(MediaFormat.KEY_BIT_RATE, 250000); // 250 kbit/s ≈ 31 kB/s

            mVideoEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            mVideoEncoder.start();
            mEncoderInputBuffers  = mVideoEncoder.getInputBuffers();
            mEncoderOutputBuffers = mVideoEncoder.getOutputBuffers();

            Log.d(LOGTAG, "Number of input buffers " + mEncoderInputBuffers.length);
            Log.d(LOGTAG, "Number of output buffers " + mEncoderOutputBuffers.length);

            initialized = true;

        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void close()
    {
        Looper.myLooper().quit();
        mVideoEncoder.stop();
        mVideoEncoder.release();
        mVideoEncoder = null;
    }

    private void processBufferedFrames()
    {
        if (!initialized)
            return;
        VideoFrame frame = getNewFrame();

        try {
            sendTCPFrame(frame);

        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    private void sendTCPFrame(VideoFrame frame)
    {
        long start = System.nanoTime();

        long start2 = System.nanoTime();
        int inputBufferIndex = -1;
        while((inputBufferIndex = mVideoEncoder.dequeueInputBuffer(-1)) < 0 ) { // -1: wait indefinitely for the buffer
            switch(inputBufferIndex) {
                default:
                    Log.e(LOGTAG, "dequeueInputBuffer returned unknown value: " + inputBufferIndex);
            }
        }
        // fill in input (raw) data:
        mEncoderInputBuffers[inputBufferIndex].clear();

        long stop2 = System.nanoTime();
        Log.d(LOGTAG, "dequeueInputBuffer took " + (stop2 - start2) / 1e6 + "ms.");

        start2 = System.nanoTime();
        byte[] pixels = mNV21Converter.convert(frame.pixels);
        stop2 = System.nanoTime();
        Log.d(LOGTAG, "mNV21Converter.convert took " + (stop2-start2)/1e6 + "ms.");

        start2 = System.nanoTime();
        mEncoderInputBuffers[inputBufferIndex].put(pixels);
        stop2 = System.nanoTime();
        Log.d(LOGTAG, "mEncoderInputBuffers[inputBufferIndex].put(pixels) took " + (stop2 - start2) / 1e6 + "ms.");

        start2 = System.nanoTime();
        //mVideoEncoder.queueInputBuffer(inputBufferIndex, 0, pixels.length, 0, 0);
        //mVideoEncoder.queueInputBuffer(inputBufferIndex, 0, pixels.length, System.nanoTime() / 1000, 0);
        mVideoEncoder.queueInputBuffer(inputBufferIndex, 0, pixels.length, System.nanoTime(), 0);
        stop2 = System.nanoTime();
        Log.d(LOGTAG, "queueInputBuffer took " + (stop2 - start2) / 1e6 + "ms.");

        start2 = System.nanoTime();
        // wait for encoded data to become available:
        int outputBufferIndex = -1;
        MediaCodec.BufferInfo bufInfo = new MediaCodec.BufferInfo();
        long timeoutUs = -1;//10000; // microseconds
        while((outputBufferIndex = mVideoEncoder.dequeueOutputBuffer(bufInfo, timeoutUs)) < 0 ) { // -1: wait indefinitely for the buffer
            Log.i(LOGTAG, "dequeueOutputBuffer returned value: " + outputBufferIndex);
            switch(outputBufferIndex) {
                case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                    // output buffers have changed, move reference
                    mEncoderOutputBuffers = mVideoEncoder.getOutputBuffers();
                    break;
                case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                    // Subsequent data will conform to new format.
                    //MediaFormat format = codec.getOutputFormat();
                    Log.e(LOGTAG, "dequeueOutputBuffer returned INFO_OUTPUT_FORMAT_CHANGED ?!");
                    break;
                case MediaCodec.INFO_TRY_AGAIN_LATER:
                    Log.w(LOGTAG, "dequeueOutputBuffer return INFO_TRY_AGAIN_LATER");
                    break;
                default:
                    Log.e(LOGTAG, "dequeueOutputBuffer returned unknown value: " + outputBufferIndex);
            }
        }
        stop2 = System.nanoTime();
        Log.d(LOGTAG, "dequeueOutputBuffer took " + (stop2 - start2) / 1e6 + "ms.");

        // output (encoded) data available!
        Log.d(LOGTAG, "encoded buffer info: size = " + bufInfo.size + ", offset = " + bufInfo.offset + ", presentationTimeUs = " + bufInfo.presentationTimeUs + ", flags = " + bufInfo.flags);
        ByteBuffer encodedData = mEncoderOutputBuffers[outputBufferIndex];
        final int sizeOfImageData = bufInfo.size;

        long stop = System.nanoTime();
        Log.d(LOGTAG, "Encoding image took " + (stop-start)/1e6 + "ms.");

        start = System.nanoTime();
        // assemble header:
    ... 

        encodedData.rewind();
        // copy (!) raw image data to "direct" (array-backed) buffer:
        ByteBuffer imageBuffer = ByteBuffer.allocateDirect(encodedData.remaining());
        imageBuffer.put(encodedData); // TODO: can this copy be avoided?

        stop = System.nanoTime();
        Log.d(LOGTAG, "Preparing content for streaming took " + (stop - start) / 1e6 + "ms.");
        // do streaming via TCP
        ...
        mVideoEncoder.releaseOutputBuffer(outputBufferIndex, false);
    }

    // see http://developer.android.com/reference/android/media/MediaCodecInfo.html
    private void listAvailableEncoders(String mimeType)
    {
        Log.d(LOGTAG, "Available encoders for mime type " + mimeType + ":");
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo codec = MediaCodecList.getCodecInfoAt(i);

            if (!codec.isEncoder())
                continue;

            String[] types = codec.getSupportedTypes();
            for (int j = 0; j < types.length; j++) {
                //if (types[j].equalsIgnoreCase(mimeType)) {
                String msg = "- name: " + codec.getName() + ", supported color formats for " + mimeType + ":";
                MediaCodecInfo.CodecCapabilities cap = codec.getCapabilitiesForType(mimeType);
                for(int k = 0; k < cap.colorFormats.length; ++k) msg = msg + " " + cap.colorFormats[k];
                Log.d(LOGTAG, msg);
                //  break;
                //}
            }
        }
    }

Yes, there is something wrong with your code - you are waiting synchronously for the current frame to be output from the encoder before proceeding with the next frame. Most hardware codecs have a bit more latency than you would expect, and in order to get the throughput that the encoder is capable of, you need to use it asynchronously.

That is, after sending one input buffer for encoding, you should not wait for the encoded output buffer, but only check if there is output. You should then go on and queue the next input buffer, and again check for any available output. Only once you don't get an input buffer immediately should you start waiting for output. This way, there's always more than one input buffer available for the encoder to start working on, keeping it busy enough to actually achieve the frame rate it is capable of, as in the sketch below.
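In terms of the code in the question, the feed/drain logic could look roughly like the following (a minimal sketch only, reusing the question's mVideoEncoder, mEncoderInputBuffers and mEncoderOutputBuffers fields; encodeFrame and drainEncoder are made-up names, and the TCP sending is elided):

private void encodeFrame(byte[] pixels, long ptsUs) {
    int inIndex;
    // Try to get a free input buffer without waiting; only if none is
    // available yet, block briefly on the output side so the encoder
    // can drain and free one up.
    while ((inIndex = mVideoEncoder.dequeueInputBuffer(0)) < 0) {
        drainEncoder(10000); // wait up to 10 ms for encoded output
    }
    mEncoderInputBuffers[inIndex].clear();
    mEncoderInputBuffers[inIndex].put(pixels);
    mVideoEncoder.queueInputBuffer(inIndex, 0, pixels.length, ptsUs, 0);

    // Opportunistically collect any output that is already finished,
    // but don't wait for the frame that was just queued.
    drainEncoder(0);
}

private void drainEncoder(long timeoutUs) {
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    while (true) {
        int outIndex = mVideoEncoder.dequeueOutputBuffer(info, timeoutUs);
        if (outIndex >= 0) {
            ByteBuffer encoded = mEncoderOutputBuffers[outIndex];
            // ... send info.size bytes starting at info.offset over TCP ...
            mVideoEncoder.releaseOutputBuffer(outIndex, false);
            timeoutUs = 0; // got a frame; don't wait for further ones
        } else if (outIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            mEncoderOutputBuffers = mVideoEncoder.getOutputBuffers();
        } else if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            Log.d(LOGTAG, "output format: " + mVideoEncoder.getOutputFormat());
        } else { // INFO_TRY_AGAIN_LATER: nothing ready within timeoutUs
            break;
        }
    }
}

The important difference to the code in the question is that dequeueInputBuffer is called with a timeout of 0, and dequeueOutputBuffer is only allowed to block while the encoder has no free input buffer to accept the next frame.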

(If you are OK with requiring Android 5.0, you could take a look at MediaCodec.setCallback, which makes it easier to work with asynchronously.)
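For reference, a rough sketch of what the callback-based setup looks like (Android 5.0+ only; the callback bodies here are illustrative, not a drop-in replacement for the code above):

mVideoEncoder.setCallback(new MediaCodec.Callback() {
    @Override
    public void onInputBufferAvailable(MediaCodec codec, int index) {
        ByteBuffer in = codec.getInputBuffer(index);
        // fill 'in' with the next raw frame, then e.g.:
        // codec.queueInputBuffer(index, 0, length, ptsUs, 0);
    }

    @Override
    public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
        ByteBuffer encoded = codec.getOutputBuffer(index);
        // ... stream the encoded data ...
        codec.releaseOutputBuffer(index, false);
    }

    @Override
    public void onOutputFormatChanged(MediaCodec codec, MediaFormat format) { }

    @Override
    public void onError(MediaCodec codec, MediaCodec.CodecException e) { }
});
// setCallback must be called before configure()/start():
mVideoEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mVideoEncoder.start();

This way the pipelining is handled for you: the codec tells you whenever it can accept input or has output ready, so there is no blocking dequeue call at all.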

There are even some codecs (mainly decoders though, if my memory serves me correctly) that won't even output the first buffer until you have passed them more than a few input buffers.

