
The H.264/AVC video encoded by MediaCodec on Android cannot be played

BACKGROUND:

I have been working for two days on implementing a Vine-like video recorder. First I tried MediaRecorder, but the video I need may be composed of several small clips, and that class cannot be used to record a short video clip. Then I found MediaCodec, FFmpeg and JavaCV. FFmpeg and JavaCV could solve the problem, but I would have to compile my project with many library files, which would produce a very large APK. So I prefer to implement it with MediaCodec, even though this class is only available from Android 4.1 on; that still covers about 90% of users.

RESULT:

I finally got the encoded file, but it cannot be played. I checked it with ffprobe, and the result looks like this:

Input #0, h264, from 'test.mp4':
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264 (Baseline), yuv420p, 640x480, 25 fps, 25 tbr, 1200k tbn, 50 tbc

I do not know much about the internals of H.264 encoding.

CODE:

Modified from this link

import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

import android.graphics.ImageFormat;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.util.Log;

public class AvcEncoder {

private static String TAG = AvcEncoder.class.getSimpleName();

private MediaCodec mediaCodec;
private BufferedOutputStream outputStream;
private int mWidth, mHeight;
private byte[] mDestData;

public AvcEncoder(int w, int h) {

    mWidth = w;
    mHeight = h;
    Log.d(TAG, "Thread Id: " + Thread.currentThread().getId());

    File f = new File("/sdcard/videos/test.mp4");

    try {
        outputStream = new BufferedOutputStream(new FileOutputStream(f));
        Log.i("AvcEncoder", "outputStream initialized");
    } catch (Exception e) {
        e.printStackTrace();
    }

    try {
        mediaCodec = MediaCodec.createEncoderByType("video/avc");
    } catch (IOException e) {
        e.printStackTrace();
    }
    MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", w,
            h);
    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);

    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

    // Holds one frame after conversion to the encoder's planar layout.
    mDestData = new byte[w * h
            * ImageFormat.getBitsPerPixel(ImageFormat.YV12) / 8];
    mediaCodec.configure(mediaFormat, null, null,
            MediaCodec.CONFIGURE_FLAG_ENCODE);
    mediaCodec.start();
}

public void close() {
    try {
        mediaCodec.stop();
        mediaCodec.release();
        mediaCodec = null;

        // outputStream.flush();
        outputStream.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}

public void offerEncoder(byte[] input) {
    try {
        CameraUtils.transYV12toYUV420Planar(input, mDestData, mWidth,
                mHeight);
        ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
        ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
        int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);

        if (inputBufferIndex >= 0) {
            ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
            inputBuffer.clear();
            inputBuffer.put(mDestData);
            mediaCodec.queueInputBuffer(inputBufferIndex, 0,
                    mDestData.length, 0 /* presentationTimeUs */, 0);
        }

        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo,
                0);

        while (outputBufferIndex >= 0) {
            ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
            byte[] outData = new byte[bufferInfo.size];
            outputBuffer.get(outData);
            try {
                outputStream.write(outData, 0, outData.length);

            } catch (Exception e) {
                Log.d("AvcEncoder", "Outputstream write failed");
                e.printStackTrace();
            }
            // Log.i("AvcEncoder", outData.length + " bytes written");

            mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
            outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo,
                    0);

        }
    } catch (Throwable t) {
        t.printStackTrace();
    }
}
}

I invoke this class from the Camera's startPreview:

private void startPreview() {
    if (mCamera == null) {
        return;
    }
    try {
        mCamera.setPreviewDisplay(mSurfaceView.getHolder());
        Parameters p = mCamera.getParameters();
        Size s = p.getPreviewSize();
        int len = s.width * s.height
                * ImageFormat.getBitsPerPixel(p.getPreviewFormat()) / 8;
        mAvcEncoder = new AvcEncoder(s.width, s.height);
        mCamera.addCallbackBuffer(new byte[len]);
        mCamera.setPreviewCallbackWithBuffer(new PreviewCallback() {

            @Override
            public void onPreviewFrame(byte[] data, Camera camera) {
                mAvcEncoder.offerEncoder(data);
                mCamera.addCallbackBuffer(data);
            }
        });
        mCamera.startPreview();
    } catch (IOException e) {
        e.printStackTrace();
    }
}

And close it when releasing the Camera:

private void releaseCamera() {
    if (mCamera != null) {
        mCamera.stopPreview();
        mCamera.release();
        mCamera = null;
    }
    if (mAvcEncoder != null) {
        mAvcEncoder.close();
    }
}

You're saving a raw H.264 elementary stream. You need to wrap it in a container to get a playable .mp4 file. The easiest way to do this is with the MediaMuxer class (API 18+).

You can find a simple example on bigflake and more complete examples in Grafika.

You will need to provide presentation time stamps for each frame. You can either generate them according to your desired frame rate (like the bigflake example) or acquire them from the source (like the camera-input examples in Grafika). A sketch of the muxer-based drain loop follows.
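Here is a minimal sketch of what offerEncoder's drain loop could look like when writing to a MediaMuxer instead of a raw FileOutputStream. It assumes API 18+; the names mMuxer, mTrackIndex, mMuxerStarted and mFrameCount, and the fixed 15 fps timestamp generation, are illustrative rather than taken from your code:

// requires: import android.media.MediaMuxer;
private MediaMuxer mMuxer;  // new MediaMuxer(path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4)
private int mTrackIndex = -1;
private boolean mMuxerStarted = false;
private long mFrameCount = 0;

private void drainToMuxer(MediaCodec codec) {
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    while (true) {
        int index = codec.dequeueOutputBuffer(info, 0);
        if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
            break;  // no encoded output available right now
        } else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            // The output format carries the SPS/PPS ("csd-0"/"csd-1");
            // the muxer must receive it via addTrack() before start().
            mTrackIndex = mMuxer.addTrack(codec.getOutputFormat());
            mMuxer.start();
            mMuxerStarted = true;
        } else if (index >= 0) {
            ByteBuffer encoded = codec.getOutputBuffers()[index];
            if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                info.size = 0;  // config data was already passed via addTrack()
            }
            if (info.size > 0 && mMuxerStarted) {
                encoded.position(info.offset);
                encoded.limit(info.offset + info.size);
                // Generate timestamps from a frame counter at 15 fps.
                info.presentationTimeUs = mFrameCount++ * 1000000L / 15;
                mMuxer.writeSampleData(mTrackIndex, encoded, info);
            }
            codec.releaseOutputBuffer(index, false);
        }
    }
}

In close(), call mMuxer.stop() and mMuxer.release() instead of closing the output stream.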

Edit: For pre-API-18 devices (Android 4.1/4.2), MediaCodec is much more difficult to work with. You can't use Surface input or MediaMuxer, and the lack of platform tests led to some unfortunate incompatibilities. This answer has an overview.

In your specific case, I will note that your sample code is attempting to specify the input format, but that has no effect -- the AVC codec defines what input formats it accepts, and your app must query for it. You will likely find that the colors in your encoded video are currently wrong, as the Camera and MediaCodec don't have any color formats in common (see that answer for color-swap code).
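For reference, the Camera's YV12 output and the encoder's COLOR_FormatYUV420Planar (I420) input differ only in the order of the chroma planes, so a helper like your CameraUtils.transYV12toYUV420Planar presumably amounts to swapping the U and V planes. A minimal sketch, ignoring the row-stride alignment that Camera YV12 buffers may require:

static void yv12ToYuv420Planar(byte[] yv12, byte[] i420, int width, int height) {
    // YV12 layout: Y plane, then V plane, then U plane.
    // I420 layout: Y plane, then U plane, then V plane.
    int ySize = width * height;
    int cSize = ySize / 4;
    System.arraycopy(yv12, 0, i420, 0, ySize);                  // Y
    System.arraycopy(yv12, ySize + cSize, i420, ySize, cSize);  // U (after V in YV12)
    System.arraycopy(yv12, ySize, i420, ySize + cSize, cSize);  // V
}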

I believe the data you are saving is raw H.264 data. Even though you are naming the file with a .mp4 extension, the data is not in a video container such as MP4, which is why most media players will not be able to play the file.

You may have some success with VLC if you give the raw data file a .h264 or .264 extension. You should try this to verify that the data you are getting is raw H.264 data and that it is valid.

Also, it may help to read this old thread:

Decoding Raw H264 stream in android?

That discussion mentions the SPS (Sequence Parameter Set). When processing live video streamed from a camera, I have previously had to insert the SPS dynamically at the start of the raw H.264 data in order to process it. Take a look at the raw H.264 binary data you are saving to see whether it has an SPS record at the start. A quick check is sketched below.
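In an Annex-B byte stream, each NAL unit begins with a 00 00 00 01 (or 00 00 01) start code, and the low five bits of the byte that follows give the NAL unit type, where type 7 is an SPS. A minimal sketch of such a check (the method name is illustrative):

static boolean startsWithSps(byte[] data) {
    // Locate the Annex-B start code at the very beginning of the stream.
    int offset;
    if (data.length >= 5 && data[0] == 0 && data[1] == 0 && data[2] == 0 && data[3] == 1) {
        offset = 4;
    } else if (data.length >= 4 && data[0] == 0 && data[1] == 0 && data[2] == 1) {
        offset = 3;
    } else {
        return false;  // no start code: not an Annex-B H.264 stream
    }
    int nalType = data[offset] & 0x1f;  // low 5 bits of the NAL header byte
    return nalType == 7;                // 7 = Sequence Parameter Set
}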
