
Convert YUV_420_888 to byte array

I am testing out the new Camera2 API, and I'm able to capture the camera preview in YUV_420_888 format. What I need to do next is to feed this data to an image processing library, which accepts a byte[] parameter.

I've found examples of converting YUV_420_888 to RGB and such, but I still need to convert the resulting Bitmap to byte[] through ByteArrayOutputStream, which, after experimenting, slows down the app tremendously.

My question is, how do I convert YUV_420_888 to byte[] efficiently?

What is the actual format of the byte[] array the image processing library wants? Is it RGB? YUV planar? YUV semiplanar?

Assuming it's RGB: since you mention you already have an example that converts YUV_420_888 to RGB, you can modify that example so it never creates a Bitmap from the allocation at all - just call Allocation.copyTo with a byte[] instead of a Bitmap.
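
For example, if the RGB example you found is the usual RenderScript-based one, the last step might be trimmed down to something like this (a rough sketch, not your exact code; rs/context, yuvBytes, width and height are assumed to come from your existing setup, and the input is assumed to already be an NV21-style byte array built from the Image planes):

// classes below are from android.renderscript
RenderScript rs = RenderScript.create(context);
ScriptIntrinsicYuvToRGB yuvToRgb = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));

Type.Builder yuvType = new Type.Builder(rs, Element.U8(rs)).setX(yuvBytes.length);
Allocation in = Allocation.createTyped(rs, yuvType.create(), Allocation.USAGE_SCRIPT);

Type.Builder rgbaType = new Type.Builder(rs, Element.RGBA_8888(rs)).setX(width).setY(height);
Allocation out = Allocation.createTyped(rs, rgbaType.create(), Allocation.USAGE_SCRIPT);

in.copyFrom(yuvBytes);     // NV21-style bytes assembled from the Image planes
yuvToRgb.setInput(in);
yuvToRgb.forEach(out);

byte[] rgba = new byte[width * height * 4];
out.copyTo(rgba);          // copy straight into a byte[] - no Bitmap, no ByteArrayOutputStream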

I spent a lot of time looking for a solution and finally found one, based on another answer here on Stack Overflow. I want to share my customized version, which has been optimized to reduce the number of loop iterations. It works for me with a YUV_420_888 Image from the camera2 API :D

import android.graphics.ImageFormat;
import android.media.Image;
import java.nio.ByteBuffer;

// Copies a YUV_420_888 Image into a single I420 (planar) byte array:
// all of the Y bytes first, then the U plane, then the V plane.
public static byte[] imageToMat(Image image) {

    Image.Plane[] planes = image.getPlanes();

    ByteBuffer buffer0 = planes[0].getBuffer(); // Y
    ByteBuffer buffer1 = planes[1].getBuffer(); // U
    ByteBuffer buffer2 = planes[2].getBuffer(); // V

    int width = image.getWidth();
    int height = image.getHeight();

    // YUV_420_888 is 12 bits per pixel, so the packed output is width * height * 3 / 2 bytes.
    byte[] data = new byte[width * height * ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8];
    byte[] rowData1 = new byte[planes[1].getRowStride()];
    byte[] rowData2 = new byte[planes[2].getRowStride()];

    int sizeY = width * height;        // Y plane: one byte per pixel
    int sizeUV = width * height / 4;   // U and V planes: subsampled 2x2

    int offsetY = 0;

    for (int row = 0; row < height; row++) {

        // Copy one row of the Y plane, skipping any row-stride padding.
        {
            int length = width;
            buffer0.get(data, offsetY, length);

            // Skip the padding bytes, except on the last row where the buffer may end early.
            if (height - row != 1)
                buffer0.position(buffer0.position() + planes[0].getRowStride() - length);

            offsetY += length;
        }

        // The U/V planes only have height / 2 rows.
        if (row >= height / 2)
            continue;

        {
            int uvLength = planes[1].getRowStride();

            // On the last chroma row only (width / 2 - 1) * pixelStride + 1 bytes
            // are guaranteed to remain in the buffer.
            if (height / 2 - row == 1) {
                uvLength = (width / 2 - 1) * planes[1].getPixelStride() + 1;
            }

            buffer1.get(rowData1, 0, uvLength);
            buffer2.get(rowData2, 0, uvLength);

            // De-interleave the U and V rows into the planar output,
            // honoring the pixel stride of each plane.
            for (int col = 0; col < width / 2; ++col) {
                // U plane
                data[sizeY + (row * width) / 2 + col] = rowData1[col * planes[1].getPixelStride()];

                // V plane
                data[sizeY + sizeUV + (row * width) / 2 + col] = rowData2[col * planes[2].getPixelStride()];
            }
        }
    }

    return data;
}
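
For reference, calling it from an ImageReader callback might look something like this (a sketch; width, height and backgroundHandler are assumed to come from your own camera setup):

ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.YUV_420_888, 2);
reader.setOnImageAvailableListener(r -> {
    Image image = r.acquireLatestImage();
    if (image == null) return;
    try {
        byte[] yuv = imageToMat(image);   // planar I420: all Y, then U, then V
        // hand `yuv` to the image processing library here
    } finally {
        image.close();                    // always release the Image back to the reader
    }
}, backgroundHandler);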
