
Android, use Mediacodec with libstreaming

I have a problem with this library:

https://github.com/fyhertz/libstreaming

It allows streaming the camera over wireless, and it uses three methods: two with MediaCodec and one with MediaRecorder. I would like to modify it so that it uses only MediaCodec; however, first of all I tried the code of example 2 of the library, but I always get the same error: the log tells me that the device can use MediaCodec, it sets up the encoder, and when it tests the decoder it fails and the buffer is filled with -1.

This is the method in the EncoderDebugger class where the exception occurs. Can some kind soul help me, please?

private long decode(boolean withPrefix) {
    int n =3, i = 0, j = 0;
    long elapsed = 0, now = timestamp();
    int decInputIndex = 0, decOutputIndex = 0;
    ByteBuffer[] decInputBuffers = mDecoder.getInputBuffers();
    ByteBuffer[] decOutputBuffers = mDecoder.getOutputBuffers();
    BufferInfo info = new BufferInfo();

    while (elapsed<3000000) {

        // Feeds the decoder with a NAL unit
        if (i<NB_ENCODED) {

            decInputIndex = mDecoder.dequeueInputBuffer(1000000/FRAMERATE);
            if (decInputIndex>=0) {
                int l1 = decInputBuffers[decInputIndex].capacity();
                int l2 = mVideo[i].length;
                decInputBuffers[decInputIndex].clear();

                if ((withPrefix && hasPrefix(mVideo[i])) || (!withPrefix && !hasPrefix(mVideo[i]))) {

                    check(l1>=l2, "The decoder input buffer is not big enough (nal="+l2+", capacity="+l1+").");
                    decInputBuffers[decInputIndex].put(mVideo[i],0,mVideo[i].length);
                } else if (withPrefix && !hasPrefix(mVideo[i])) {

                    check(l1>=l2+4, "The decoder input buffer is not big enough (nal="+(l2+4)+", capacity="+l1+").");
                    decInputBuffers[decInputIndex].put(new byte[] {0,0,0,1});
                    decInputBuffers[decInputIndex].put(mVideo[i],0,mVideo[i].length);
                } else if (!withPrefix && hasPrefix(mVideo[i])) {

                    check(l1>=l2-4, "The decoder input buffer is not big enough (nal="+(l2-4)+", capacity="+l1+").");
                    decInputBuffers[decInputIndex].put(mVideo[i],4,mVideo[i].length-4);
                }

                mDecoder.queueInputBuffer(decInputIndex, 0, l2, timestamp(), 0);
                i++;
            } else {
                if (VERBOSE) Log.d(TAG,"No buffer available !7");
            }
        }

        // Tries to get a decoded image

        decOutputIndex = mDecoder.dequeueOutputBuffer(info, 1000000/FRAMERATE);
        if (decOutputIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            decOutputBuffers = mDecoder.getOutputBuffers();
        } else if (decOutputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            mDecOutputFormat = mDecoder.getOutputFormat();
        } else if (decOutputIndex>=0) {
            if (n>2) {
                // We have successfully encoded and decoded an image !
                int length = info.size;
                mDecodedVideo[j] = new byte[length];
                decOutputBuffers[decOutputIndex].clear();
                decOutputBuffers[decOutputIndex].get(mDecodedVideo[j], 0, length);
                // Converts the decoded frame to NV21
                convertToNV21(j);
                if (j>=NB_DECODED-1) {

                    flushMediaCodec(mDecoder);
                    if (VERBOSE) Log.v(TAG, "Decoding "+n+" frames took "+elapsed/1000+" ms");
                    return elapsed;
                }
                j++;
            }
            mDecoder.releaseOutputBuffer(decOutputIndex, false);
            n++;
        }   
        elapsed = timestamp() - now;
    }

    throw new RuntimeException("The decoder did not decode anything.");

}

Here are my suggestions:

(1) Check the settings of the encoder and decoder, and make sure that they match. For example, the resolution and color format must be the same.
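A minimal sketch of what matching settings might look like; the class and the resolution, bitrate, and framerate values below are placeholders of mine, not the ones libstreaming actually negotiates:

```java
import android.media.MediaCodecInfo;
import android.media.MediaFormat;

class FormatSetup {
    // Placeholder values; whatever you pick, use the same ones on both sides.
    static final int WIDTH = 640, HEIGHT = 480;
    static final int COLOR =
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar;

    static MediaFormat encoderFormat() {
        MediaFormat f = MediaFormat.createVideoFormat("video/avc", WIDTH, HEIGHT);
        f.setInteger(MediaFormat.KEY_COLOR_FORMAT, COLOR);
        f.setInteger(MediaFormat.KEY_BIT_RATE, 500_000);
        f.setInteger(MediaFormat.KEY_FRAME_RATE, 20);
        f.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        return f;
    }

    // The decoder must see the same MIME type and dimensions.
    static MediaFormat decoderFormat() {
        return MediaFormat.createVideoFormat("video/avc", WIDTH, HEIGHT);
    }
}
```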

(2) Make sure the very first packet generated by the encoder has been sent and pushed into the decoder. This packet defines the basic settings of the video stream.
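One way to guarantee this (the helper below is mine, not part of libstreaming): capture the SPS and PPS NAL units from the encoder output buffer that MediaCodec flags with BUFFER_FLAG_CODEC_CONFIG, and hand them to the decoder's MediaFormat as "csd-0"/"csd-1" before calling start():

```java
import android.media.MediaFormat;
import java.nio.ByteBuffer;

class CodecConfig {
    // sps and pps are the NAL units (including their 0x00000001 start codes)
    // read from the encoder output buffer flagged BUFFER_FLAG_CODEC_CONFIG.
    static MediaFormat withCodecConfig(MediaFormat format, byte[] sps, byte[] pps) {
        format.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
        format.setByteBuffer("csd-1", ByteBuffer.wrap(pps));
        return format;
    }
}
```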

(3) The decoder usually buffers 5-10 frames, so the data in the buffer is invalid for a few hundred ms.
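Because of that buffering, getting INFO_TRY_AGAIN_LATER (-1) from the first few dequeueOutputBuffer calls is normal rather than a failure. A sketch of tolerating it, assuming `decoder` is an already-started MediaCodec (the 50 ms timeout is an arbitrary choice of mine):

```java
import android.media.MediaCodec;

class DrainLoop {
    // Keep feeding input and polling output; the first decoded frame may only
    // appear after several input frames have been queued.
    static int pollOutput(MediaCodec decoder, MediaCodec.BufferInfo info) {
        int index = decoder.dequeueOutputBuffer(info, 50_000); // 50 ms
        if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
            // Not an error: the decoder is still filling its pipeline.
        }
        return index;
    }
}
```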

(4) While initializing the decoder, set the surface to null. Otherwise the output buffer will be read by the surface and probably released automatically.
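Concretely, that means passing null as the Surface argument when configuring; here `decFormat` stands for whatever format you built for the decoder:

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import java.io.IOException;

class DecoderSetup {
    static MediaCodec startByteBufferDecoder(MediaFormat decFormat) throws IOException {
        MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
        // A null Surface keeps decoded frames in the ByteBuffer outputs so
        // they can be read back (as EncoderDebugger does); with a Surface
        // attached they would be rendered and released instead.
        decoder.configure(decFormat, /* surface */ null, /* crypto */ null, 0);
        decoder.start();
        return decoder;
    }
}
```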
