
How to play raw NAL units in Android MediaCodec

I initially tried How to play raw NAL units in Android exoplayer?, but I noticed I'm going to have to do things at a low level.

I've found this simple MediaCodec example. As you can see, it's a thread that plays a file on a surface passed to it.

Notice the lines

mExtractor = new MediaExtractor();
mExtractor.setDataSource(filePath);

It looks like I should create my own MediaExtractor which, instead of extracting the video units from a file, will use the H.264 NAL units from a buffer I'll provide.

I can then call mExtractor.setDataSource(MediaDataSource dataSource); see MediaDataSource.

It has readAt(long position, byte[] buffer, int offset, int size).

This is where it reads the NAL units. However, how should I pass them? I have no information on the structure of the buffer that needs to be read.

Should I pass a byte[] buffer with the NAL units in it, and if so, in which format? What is the offset for? If it's a buffer, shouldn't I just erase the data that was read and thus have no offset or size?
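As far as I can tell, the readAt contract is just: copy up to size bytes of the logical stream, starting at position, into buffer beginning at offset, and return how many bytes were written (or -1 at end of stream). Here is a plain-Java sketch of that contract over an in-memory byte array (the class name InMemoryDataSource is my own; on Android this logic would live in a MediaDataSource subclass):

```java
// Sketch of the MediaDataSource.readAt contract over an in-memory stream.
// On Android this would extend android.media.MediaDataSource; it is kept
// plain Java here so the copy logic can be shown on its own.
class InMemoryDataSource {
    private final byte[] stream;

    InMemoryDataSource(byte[] stream) {
        this.stream = stream;
    }

    // Copy up to `size` bytes starting at `position` in the stream into
    // `buffer`, beginning at `offset`. Return the number of bytes copied,
    // or -1 once `position` is past the end of the stream.
    public int readAt(long position, byte[] buffer, int offset, int size) {
        if (position >= stream.length) {
            return -1; // end of stream
        }
        int available = (int) Math.min(size, stream.length - position);
        System.arraycopy(stream, (int) position, buffer, offset, available);
        return available;
    }

    public long getSize() {
        return stream.length;
    }
}
```

So offset and size describe where the caller wants the bytes placed in its own buffer; the data source never erases anything, it just serves random-access reads — which is exactly why a live stream is an awkward fit for MediaExtractor.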

By the way, the H.264 NAL units are streamed ones; they come from RTP packets, not files. I'm going to receive them through C++, store them in a buffer, and try to pass them to the MediaExtractor.
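Regarding the buffer format: as I understand it, H.264 decoders typically expect Annex B framing, i.e. each NAL unit prefixed with a 0x000001 (or 0x00000001) start code. A hedged plain-Java sketch of splitting such a byte stream back into individual NAL units (class and method names are my own):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Split an H.264 Annex B byte stream into NAL units by scanning for
// 0x000001 / 0x00000001 start codes. Each returned array is one NAL
// unit payload without its start code.
class AnnexBSplitter {
    static List<byte[]> split(byte[] stream) {
        // Index of the first payload byte after each start code.
        List<Integer> starts = new ArrayList<>();
        for (int i = 0; i + 2 < stream.length; i++) {
            if (stream[i] == 0 && stream[i + 1] == 0 && stream[i + 2] == 1) {
                starts.add(i + 3);
                i += 2; // skip past this start code
            }
        }
        List<byte[]> nals = new ArrayList<>();
        for (int s = 0; s < starts.size(); s++) {
            int begin = starts.get(s);
            int end = (s + 1 < starts.size()) ? starts.get(s + 1) - 3 : stream.length;
            // A 4-byte start code (00 00 00 01) leaves one trailing zero byte
            // attached to the previous unit; trim it off.
            while (end > begin && stream[end - 1] == 0) {
                end--;
            }
            nals.add(Arrays.copyOfRange(stream, begin, end));
        }
        return nals;
    }
}
```

With RTP input the units would already be delimited by the packetization instead, but the decoder-facing buffer still needs the start-code prefix put back in front of each unit.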

UPDATE:

I've been reading a lot about MediaCodec and I think I understand it better. According to https://developer.android.com/reference/android/media/MediaCodec , everything relies on something of this type:

 MediaCodec codec = MediaCodec.createByCodecName(name);
 MediaFormat mOutputFormat; // member variable
 codec.setCallback(new MediaCodec.Callback() {
   @Override
   void onInputBufferAvailable(MediaCodec mc, int inputBufferId) {
     ByteBuffer inputBuffer = codec.getInputBuffer(inputBufferId);
     // fill inputBuffer with valid data
     …
     codec.queueInputBuffer(inputBufferId, …);
   }

   @Override
   void onOutputBufferAvailable(MediaCodec mc, int outputBufferId, …) {
     ByteBuffer outputBuffer = codec.getOutputBuffer(outputBufferId);
     MediaFormat bufferFormat = codec.getOutputFormat(outputBufferId); // option A
     // bufferFormat is equivalent to mOutputFormat
     // outputBuffer is ready to be processed or rendered.
     …
     codec.releaseOutputBuffer(outputBufferId, …);
   }

   @Override
   void onOutputFormatChanged(MediaCodec mc, MediaFormat format) {
     // Subsequent data will conform to new format.
     // Can ignore if using getOutputFormat(outputBufferId)
     mOutputFormat = format; // option B
   }

   @Override
   void onError(…) {
     …
   }
 });
 codec.configure(format, …);
 mOutputFormat = codec.getOutputFormat(); // option B
 codec.start();
 // wait for processing to complete
 codec.stop();
 codec.release();

As you can see, I can pass input buffers and get decoded output buffers back. The exact byte formats are still a mystery, but I think that's how it works. Also according to the same page, using ByteBuffers is slow, and Surfaces are preferred; they consume the output buffers automatically. Although there's no tutorial on how to do it, there's a section in the article that says it's almost identical, so I guess I just need to add the additional lines

 codec.setInputSurface(Surface inputSurface) 
 codec.setOutputSurface(Surface outputSurface) 

where inputSurface and outputSurface are Surfaces which I pass to a MediaPlayer which I use (how?) to display the video in an activity. And the output buffers will simply not arrive in onOutputBufferAvailable (because the surface consumes them first), nor will input buffers arrive in onInputBufferAvailable.

So the questions now are: how exactly do I construct a Surface that contains the video buffer, and how do I display a MediaPlayer in an activity?

For output I can simply create a Surface and pass it to a MediaPlayer and MediaCodec, but what about input? Do I need a ByteBuffer for the input anyway, and is a Surface just for using other outputs as inputs?

You first need to strip the NAL units out of their packets and feed the raw H.264 bytes into this method. However, in your case you're reading from a file, so there is no need to remove anything since you're not using packets; just feed the data bytes to this method:

    private void initDecoder() {
        try {
            writeHeader = true;
            if (mDecodeMediaCodec != null) {
                try {
                    mDecodeMediaCodec.stop();
                } catch (Exception e) {}
                try {
                    mDecodeMediaCodec.release();
                } catch (Exception e) {}
            }
            // MIME_TYPE = "video/avc" in your case
            mDecodeMediaCodec = MediaCodec.createDecoderByType(MIME_TYPE);
            mDecodeMediaCodec.configure(format,
                    mSurfaceView.getHolder().getSurface(),
                    null,
                    0);
            mDecodeMediaCodec.start();
            mDecodeInputBuffers = mDecodeMediaCodec.getInputBuffers();
        } catch (IOException e) {
            e.printStackTrace();
            mLatch.trigger();
        }
    }


    private void decode(byte[] data) {
        try {
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int inputBufferIndex = mDecodeMediaCodec.dequeueInputBuffer(1000);
            if (inputBufferIndex >= 0) {
                ByteBuffer buffer = mDecodeInputBuffers[inputBufferIndex];
                buffer.clear();
                buffer.put(data);
                mDecodeMediaCodec.queueInputBuffer(inputBufferIndex,
                        0,
                        data.length,
                        packet.sequence / 1000, // presentation time in microseconds
                        0);
                data = null;
            }

            int outputBufferIndex = mDecodeMediaCodec.dequeueOutputBuffer(info, 1000);
            do {
                if (outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                    // no output available yet
                } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                    // output buffers changed; re-fetch them on API < 21
                    //encodeOutputBuffers = mDecodeMediaCodec.getOutputBuffers();
                } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                    // media format changed
                    format = mDecodeMediaCodec.getOutputFormat();
                } else if (outputBufferIndex < 0) {
                    // unexpected result from dequeueOutputBuffer
                } else {
                    // render the buffer to the surface, then try the next one
                    mDecodeMediaCodec.releaseOutputBuffer(outputBufferIndex, true);
                    outputBufferIndex = mDecodeMediaCodec.dequeueOutputBuffer(info, 0);
                }
            } while (outputBufferIndex >= 0); // index 0 is a valid buffer
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

Please don't forget that the I-frame (the first frame's bytes) contains sensitive data and MUST be fed to the decoder first.
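Concretely, the data that must go in first is the codec configuration: the SPS and PPS NAL units, followed by the first IDR (key) frame. The NAL unit type sits in the low 5 bits of the first header byte, so one can check what a unit is before feeding it. A small sketch (type values are from the H.264 spec; the helper names are my own):

```java
// Identify an H.264 NAL unit by the nal_unit_type field: the low 5 bits
// of the first byte after the start code.
class NalTypes {
    static final int IDR = 5; // coded slice of an IDR picture (key frame)
    static final int SPS = 7; // sequence parameter set
    static final int PPS = 8; // picture parameter set

    static int typeOf(byte firstHeaderByte) {
        return firstHeaderByte & 0x1F;
    }

    // True for the configuration/key-frame units that must reach the
    // decoder before any other slices.
    static boolean mustFeedFirst(byte firstHeaderByte) {
        int t = typeOf(firstHeaderByte);
        return t == SPS || t == PPS || t == IDR;
    }
}
```

For example, 0x67 and 0x68 are the header bytes typically seen on SPS and PPS units, and 0x65 on an IDR slice; non-key slices (type 1) can be dropped until the first key frame has been decoded.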
