
Use ffmpeg to parse presentation time info in an H.264 stream encoded by MediaCodec

I have seen the example below for encoding/decoding using the MediaCodec API: https://android.googlesource.com/platform/cts/+/jb-mr2-release/tests/tests/media/src/android/media/cts/EncodeDecodeTest.java

In it, there is a comparison of the computed presentation time and the presentation time received in the decoded buffer info:

assertEquals("Wrong time stamp", computePresentationTime(checkIndex),
    info.presentationTimeUs);

Because the decoder just decodes the data in the encoded buffer, I assumed there must be some timestamp info that could be parsed from this encoder's output H.264 stream.

I am writing an Android application that muxes an H.264 stream (.h264) encoded by MediaCodec into an MP4 container using ffmpeg (libavformat). I don't want to use MediaMuxer because it requires Android 4.3, which is too high a requirement.

However, ffmpeg does not seem to recognize the presentation timestamp in a packet encoded by MediaCodec, so I always get the NO_PTS value (AV_NOPTS_VALUE) when trying to read a frame from the stream.
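A minimal sketch of that read path (illustrative only; the file path and helper name are placeholders, not my actual code), using the old libavformat API. Every packet comes back with pkt.pts equal to AV_NOPTS_VALUE, presumably because a raw Annex-B elementary stream has no container field to carry a PTS:

#include <stdio.h>
#include <libavformat/avformat.h>

/* Illustrative only: dump the pts of every packet in a raw .h264 file */
static int dump_pts(const char *path /* e.g. "/sdcard/test.h264" */) {
    AVFormatContext *fmt = NULL;
    AVPacket pkt;

    av_register_all();                          /* old libavformat API */
    if (avformat_open_input(&fmt, path, NULL, NULL) < 0)
        return -1;

    av_init_packet(&pkt);
    while (av_read_frame(fmt, &pkt) >= 0) {
        if (pkt.pts == AV_NOPTS_VALUE)
            printf("no pts in this packet\n");  /* this is what I always see */
        else
            printf("pts = %lld\n", (long long)pkt.pts);
        av_free_packet(&pkt);                   /* av_packet_unref() on newer ffmpeg */
    }
    avformat_close_input(&fmt);
    return 0;
}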

Anyone know how to get the correct presentation timestamp in this situation?

To send timestamps from the MediaCodec encoder to ffmpeg, you need to convert them like this:

jint Java_com_classclass_WriteVideoFrame(JNIEnv *env, jobject thiz,
                                         jbyteArray data, jint datasize, jlong timestamp) {

    ....

    /* rawjBytes is the frame data pulled out of `data` in the elided part
       (e.g. via GetByteArrayElements) */
    AVPacket pkt;
    av_init_packet(&pkt);

    AVCodecContext *c = m_pVideoStream->codec;

    /* timestamp is assumed to be in milliseconds; convert it to time_base ticks */
    pkt.pts          = (int64_t)((double)timestamp * (double)c->time_base.den / 1000.0);
    pkt.stream_index = m_pVideoStream->index;
    pkt.data         = (uint8_t *)rawjBytes;
    pkt.size         = datasize;

where time_base depends on the frame rate.
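For context, a minimal sketch of the setup this conversion assumes (old libavformat API with the deprecated stream->codec field, matching the snippet above; m_pFormatCtx, the resolution, and the 30 fps value are illustrative):

AVStream *m_pVideoStream = avformat_new_stream(m_pFormatCtx, NULL);
AVCodecContext *c = m_pVideoStream->codec;

c->codec_type = AVMEDIA_TYPE_VIDEO;
c->codec_id   = AV_CODEC_ID_H264;
c->width      = 1280;                  /* illustrative values */
c->height     = 720;
c->time_base  = (AVRational){1, 30};   /* 1/fps, here assuming a fixed 30 fps */

/* ... avio_open() + avformat_write_header(), then per encoded frame the
 * conversion shown above, followed by: */
av_interleaved_write_frame(m_pFormatCtx, &pkt);

With time_base = 1/fps and the incoming timestamp in milliseconds, the multiplication above turns milliseconds into time_base ticks.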

Update, regarding how timestamps flow through the pipeline: neither the decoder nor the encoder knows the timestamps on its own. Timestamps are passed to these components via

decoder.queueInputBuffer(inputBufIndex, 0, info.size, info.presentationTimeUs, info.flags);

or

encoder.queueInputBuffer(inputBufIndex, 0, 0, ptsUsec, info.flags);

These timestamps can be taken from an extractor, from the camera, or generated by the app, but the decoder/encoder just passes them through without changing them. As a result, timestamps travel unchanged from source to sink (the muxer).
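For the "generated by the app" case, a tiny illustrative helper (assuming a fixed frame rate) could be:

/* Illustrative only: derive a millisecond timestamp from a frame counter,
 * which can then be passed to the JNI conversion shown above. */
static long long app_timestamp_ms(long long frame_index, int fps) {
    return frame_index * 1000LL / fps;
}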

Of course, there are some exceptions: if the frame frequency is changed (frame-rate conversion, for example), if the encoder uses B-frames and reordering happens, or if the encoder adds timestamps to the encoded frame headers (which is optional, not mandated by the standard). I think none of this applies to the current Android version, codecs, or your usage scenario.
