
Decoding raw h264 with MediaCodec stream results in black surface

Hello Stack Overflow,

I'm currently writing a framework to achieve a VR experience with a smartphone. The graphical content is rendered on a server (stereoscopic), encoded and sent to a smartphone. The device I use is the Nexus 5X from LG. The app I'm writing originally consisted of two texture views and the logic to decode and display the frames. However, Android's MediaCodec class crashed on every attempt, so I tried to create a minimal working example with only one surface, based on working code I had written before. But although MediaCodec no longer throws a CodecException, the surface still remains black.

import android.app.Activity;
import android.content.pm.ActivityInfo;
import android.media.MediaCodec;
import android.media.MediaCodec.BufferInfo;
import android.media.MediaFormat;
import android.os.Bundle;
import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Arrays;

public class MainActivity extends Activity implements SurfaceHolder.Callback
{
private DisplayThread displayThread = null;

@Override
protected void onCreate(Bundle savedInstanceState)
{
    super.onCreate(savedInstanceState);
    SurfaceView sv = new SurfaceView(this);
    sv.getHolder().addCallback(this);
    setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
    setContentView(sv);
}

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height)
{
    if (displayThread == null)
    {
        displayThread = new DisplayThread(holder.getSurface());
        displayThread.start();
    }
}

private class DisplayThread extends Thread
{
    private MediaCodec codec;
    private Surface surface;
    private UdpReceiver m_renderSock; // own helper class (not part of Android); receive() returns one datagram as a byte[]


    public DisplayThread(Surface surface)
    {
        this.surface = surface;
    }

    @Override
    public void run()
    {
        m_renderSock = new UdpReceiver(9091);

        //Configuring Media Decoder
        try {
            codec = MediaCodec.createDecoderByType("video/avc");
        } catch (IOException e) {
            throw new RuntimeException(e.getMessage());
        }

        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280,720);

        codec.configure(format, surface, null, 0);
        codec.start();


        while(!Thread.interrupted())
        {
            int frameSize = 0;
            byte[] frameData = m_renderSock.receive();

            if(frameData.length == 1) // Just for the moment, to cope with the first packets getting lost because of missing ARP, see http://stackoverflow.com/questions/11812731/first-udp-message-to-a-specific-remote-ip-gets-lost
                continue;

            /*Edit: This part may be left out*/
            int NAL_START = 1;
            //103, 104 -> SPS, PPS  | 101 -> Data
            int id = 0;
            int dataOffset = 0;

            //Later on this will be done server-side, but for now...
            //Separate the SPSPPS from the Data
            for(int i = 0; i < frameData.length - 4; i++)
            {
                id = frameData[i] << 24 |frameData[i+1] << 16 | frameData[i+2] << 8
                        | frameData[i+3];

                if(id == NAL_START) {
                    if(frameData[i+4] == 101)
                    {
                        dataOffset = i;
                    }
                }
            }


            byte[] SPSPPS = Arrays.copyOfRange(frameData, 0, dataOffset);
            byte[] data = Arrays.copyOfRange(frameData, dataOffset, frameData.length);

            if(SPSPPS.length != 0) {
                int inIndex = codec.dequeueInputBuffer(100000);

                if(inIndex >= 0)
                {
                    ByteBuffer input = codec.getInputBuffer(inIndex);
                    input.clear();
                    input.put(SPSPPS);
                    codec.queueInputBuffer(inIndex, 0, SPSPPS.length, 16, MediaCodec.BUFFER_FLAG_CODEC_CONFIG);
                }
            }
            /*Edit end*/

            int inIndex = codec.dequeueInputBuffer(10000);
            if(inIndex >= 0)
            {
                ByteBuffer inputBuffer = codec.getInputBuffer(inIndex);
                inputBuffer.clear();
                //inputBuffer.put(data);
                inputBuffer.put(frameData);
                //codec.queueInputBuffer(inIndex, 0, data.length, 16, 0);
                codec.queueInputBuffer(inIndex, 0, frameData.length, 16, 0);
            }

            BufferInfo buffInfo = new MediaCodec.BufferInfo();
            int outIndex = codec.dequeueOutputBuffer(buffInfo, 10000);

            switch(outIndex)
            {
                case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                    break;
                case MediaCodec.INFO_TRY_AGAIN_LATER:
                    break;
                case -3: // MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED (deprecated) -- handling this solves it
                    break;
                default:
                    // With a surface-backed decoder the buffer contents aren't needed here;
                    // releaseOutputBuffer(..., true) renders the frame to the surface.
                    ByteBuffer buffer = codec.getOutputBuffer(outIndex);
                    codec.releaseOutputBuffer(outIndex, true);
            }


        }
    }
}

@Override
public void surfaceCreated(SurfaceHolder holder) {} // required by SurfaceHolder.Callback, nothing needed here

@Override
public void surfaceDestroyed(SurfaceHolder holder) {} // required by SurfaceHolder.Callback, nothing needed here
}

So basically this code had worked in the past. But at that time the MediaCodec API had ByteBuffer[] arrays for the input and output buffers. Also, there was no need to separate the SPS/PPS data from the frame data (at least I didn't do it and it worked, possibly also because Nvcuvenc separated every NALU).
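For comparison, a minimal sketch of the two input-buffer styles; codec, frameData and the timeout/timestamp values are placeholders, not taken from the project above:

    // Old style (deprecated since API 21): the buffer arrays are fetched once up front.
    ByteBuffer[] inputBuffers = codec.getInputBuffers();
    int oldIndex = codec.dequeueInputBuffer(10000);
    if (oldIndex >= 0) {
        ByteBuffer input = inputBuffers[oldIndex];
        input.clear();
        input.put(frameData);
        codec.queueInputBuffer(oldIndex, 0, frameData.length, 0, 0);
    }

    // New style (API 21+), as used above: fetch the buffer for each dequeued index.
    int newIndex = codec.dequeueInputBuffer(10000);
    if (newIndex >= 0) {
        ByteBuffer input = codec.getInputBuffer(newIndex);
        input.clear();
        input.put(frameData);
        codec.queueInputBuffer(newIndex, 0, frameData.length, 0, 0);
    }

The corresponding difference on the output side is that the old getOutputBuffers() array had to be re-fetched after INFO_OUTPUT_BUFFERS_CHANGED, whereas with getOutputBuffer(int) that return value can simply be skipped.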

I inspected the contents of the two buffers and this is the result:

SPSPPS: 
0 0 0 1 103 100 0 32 -84 43 64 40 2 -35 -128 -120 0 0 31 64 0 14 -90 4 120 -31 -107 
0 0 1 104 -18 60 -80

Data:
0 0 0 1 101 -72 4 95 ...

To me, this looks correct. The h264 stream is created with Nvidia's NVenc API and, if saved to disk, is playable with VLC without any problem.
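As a side note on the values 103, 104 and 101 seen above: that byte is the NAL header following the Annex-B start code, and the NAL unit type sits in its lower five bits. A quick check, just for illustration:

    // NAL unit type = header byte & 0x1F (the byte right after the start code;
    // note the PPS above uses a 3-byte start code, the others a 4-byte one).
    int spsType = 103 & 0x1F; // == 7 -> sequence parameter set (SPS)
    int ppsType = 104 & 0x1F; // == 8 -> picture parameter set (PPS)
    int idrType = 101 & 0x1F; // == 5 -> coded slice of an IDR picture (the key frame data)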

I'm sorry for the large code-lump. Thanks for your help!

So the only problem was that dequeueOutputBuffer may still return -3, aka MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED, which is marked as deprecated. Very nice. By not handling this return value, or to be more specific, by using that constant's value as input for getOutputBuffer(), the codec throws an error -> black screen.
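For anyone hitting the same thing, here is a sketch of an output loop that copes with all the INFO_* return values and only ever hands a non-negative index back to the codec; the timeout is arbitrary:

    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int outIndex = codec.dequeueOutputBuffer(info, 10000);

    if (outIndex >= 0) {
        // A real buffer index: render == true pushes the frame to the configured surface.
        codec.releaseOutputBuffer(outIndex, true);
    } else if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        MediaFormat newFormat = codec.getOutputFormat();
        // react to the new output format if necessary
    } else if (outIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        // -3, deprecated: harmless when getOutputBuffer(int) is used,
        // but it must never be treated as a buffer index.
    } else if (outIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
        // no output available within the timeout
    }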

Edit: Oh, and apparently the whole NAL handling isn't needed either, even though the API states that the SPS and PPS NALUs have to be provided before start. I marked the part that can be left out in my question.
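For completeness, the documented way to provide the codec-specific data up front is either a buffer queued with BUFFER_FLAG_CODEC_CONFIG (as in the marked part) or "csd-0"/"csd-1" entries in the MediaFormat. A minimal sketch of the latter, assuming sps and pps already hold the two Annex-B NAL units from the dump, start codes included:

    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
    format.setByteBuffer("csd-0", ByteBuffer.wrap(sps)); // SPS
    format.setByteBuffer("csd-1", ByteBuffer.wrap(pps)); // PPS
    codec.configure(format, surface, null, 0);
    codec.start();

As observed above, with an Annex-B stream that carries SPS/PPS in-band this wasn't necessary on this device.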

I am seeing similar behaviour on new Samsung devices, and suspect the codecs may have the same issue. Will try your fix, thanks.

Also, the SPS/PPS handling is only necessary for container formats like mp4; with a raw stream the player gets them in-band.
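To illustrate the container case: when the source is an mp4, MediaExtractor already delivers the SPS/PPS as "csd-0"/"csd-1" in the track format, so nothing has to be cut out of the byte stream by hand. A sketch with a hypothetical file path and the assumption that track 0 is the video track:

    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource("/sdcard/sample.mp4");         // hypothetical path; throws IOException
    MediaFormat trackFormat = extractor.getTrackFormat(0); // assuming track 0 is the video track
    ByteBuffer sps = trackFormat.getByteBuffer("csd-0");
    ByteBuffer pps = trackFormat.getByteBuffer("csd-1");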
