MediaCodec H264 Encoder not working on Snapdragon 800 devices

I have written an H264 stream encoder using the MediaCodec API of Android. I tested it on about ten different devices with different processors and it worked on all of them, except on Snapdragon 800 powered ones (Google Nexus 5 and Sony Xperia Z1). On those devices I get the SPS and PPS and the first keyframe, but after that mEncoder.dequeueOutputBuffer(mBufferInfo, 0) only returns MediaCodec.INFO_TRY_AGAIN_LATER. I have already experimented with different timeouts, bitrates, resolutions and other configuration options, to no avail. The result is always the same.

I use the following code to initialise the encoder:

        mBufferInfo = new MediaCodec.BufferInfo();
        mEncoder = MediaCodec.createEncoderByType("video/avc");
        MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 640, 480);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 768000);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, mEncoderColorFormat);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
        mEncoder.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

where the selected color format is:

        MediaCodecInfo.CodecCapabilities capabilities = mCodecInfo.getCapabilitiesForType(MIME_TYPE);
        for (int i = 0; i < capabilities.colorFormats.length && selectedColorFormat == 0; i++)
        {
            int format = capabilities.colorFormats[i];
            switch (format) {
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_QCOM_FormatYUV420SemiPlanar:
                    selectedColorFormat = format;
                    break;
                default:
                    LogHandler.e(LOG_TAG, "Unsupported color format " + format);
                    break;
            }
        }

And I get the data by doing:

        ByteBuffer[] inputBuffers = mEncoder.getInputBuffers();
        ByteBuffer[] outputBuffers = mEncoder.getOutputBuffers();

        int inputBufferIndex = mEncoder.dequeueInputBuffer(-1);
        if (inputBufferIndex >= 0)
        {
            // fill inputBuffers[inputBufferIndex] with valid data
            ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
            inputBuffer.clear();
            inputBuffer.put(rawFrame);
            mEncoder.queueInputBuffer(inputBufferIndex, 0, rawFrame.length, 0, 0);
            LogHandler.e(LOG_TAG, "Queue Buffer in " + inputBufferIndex);
        }

        while(true)
        {
            int outputBufferIndex = mEncoder.dequeueOutputBuffer(mBufferInfo, 0);
            if (outputBufferIndex >= 0)
            {
                Log.d(LOG_TAG, "Queue Buffer out " + outputBufferIndex);
                ByteBuffer buffer = outputBuffers[outputBufferIndex];
                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0)
                {
                    // Config Bytes means SPS and PPS
                    Log.d(LOG_TAG, "Got config bytes");
                }

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) != 0)
                {
                    // Marks a Keyframe
                    Log.d(LOG_TAG, "Got Sync Frame");
                }

                if (mBufferInfo.size != 0)
                {
                    // adjust the ByteBuffer values to match BufferInfo (not needed?)
                    buffer.position(mBufferInfo.offset);
                    buffer.limit(mBufferInfo.offset + mBufferInfo.size);

                    int nalUnitLength = 0;
                    while((nalUnitLength = parseNextNalUnit(buffer)) != 0)
                    {
                        switch(mVideoData[0] & 0x1f) // NAL unit type is the lower 5 bits
                        {
                            // SPS
                            case 0x07:
                            {
                                Log.d(LOG_TAG, "Got SPS");
                                break;
                            }

                            // PPS
                            case 0x08:
                            {
                                Log.d(LOG_TAG, "Got PPS");
                                break;
                            }

                            // Key Frame
                            case 0x05:
                            {
                                Log.d(LOG_TAG, "Got Keyframe");
                            }

                            //$FALL-THROUGH$
                            default:
                            {
                                // Process Data
                                break;
                            }
                        }
                    }
                }

                mEncoder.releaseOutputBuffer(outputBufferIndex, false);

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0)
                {
                    // Stream is marked as done,
                    // break out of while
                    Log.d(LOG_TAG, "Marked EOS");
                    break;
                }
            }
            else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED)
            {
                outputBuffers = mEncoder.getOutputBuffers();
                Log.d(LOG_TAG, "Output Buffer changed " + outputBuffers);
            }
            else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED)
            {
                MediaFormat newFormat = mEncoder.getOutputFormat();
                Log.d(LOG_TAG, "Media Format Changed " + newFormat);
            }
            else if(outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER)
            {
                // No Data, break out
                break;
            }
            else
            {
                // Unexpected State, ignore it
                Log.d(LOG_TAG, "Unexpected State " + outputBufferIndex);
            }
        }

Thanks for your help!

You need to set the presentationTimeUs parameter in your call to queueInputBuffer. Most encoders ignore this and you can encode for streaming without issues. The encoder used for Snapdragon 800 devices doesn't.

This parameter represents the recording time of your frame and therefore needs to increase by the number of microseconds between the frame that you want to encode and the previous frame.

If the parameter is set to the same value as in the previous frame, the encoder drops it. If the parameter is set to a value that is too small (e.g. 100000 on a 30 FPS recording), the quality of the encoded frames drops.

encodeCodec.queueInputBuffer(inputBufferIndex, 0, input.length, (System.currentTimeMillis() - startMs) * 1000, 0);
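
If the capture pipeline delivers frames at a fixed rate, the timestamp can also be derived from a frame counter instead of wall-clock time, which guarantees strictly increasing values. Below is a minimal sketch under that assumption; FRAME_RATE, mFrameIndex and nextPresentationTimeUs() are hypothetical names, while mEncoder, inputBufferIndex and rawFrame come from the question code above:

    // Hypothetical helper: derive a strictly increasing presentation timestamp
    // (in microseconds) from a frame counter at a fixed frame rate.
    private static final int FRAME_RATE = 30; // matches KEY_FRAME_RATE above
    private long mFrameIndex = 0;

    private long nextPresentationTimeUs() {
        return mFrameIndex++ * 1000000L / FRAME_RATE;
    }

    // Then queue the raw frame with that timestamp instead of 0:
    mEncoder.queueInputBuffer(inputBufferIndex, 0, rawFrame.length, nextPresentationTimeUs(), 0);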
