
Convert Pixels Buffer type from 1555 to 5551 (C++, OpenGL ES)

I'm having a problem while converting an OpenGL video plugin to support GLES 3.0. So far everything has gone well, except for glTexSubImage2D: the original code uses GL_UNSIGNED_SHORT_1_5_5_5_REV as the pixel type, which is not supported in GLES 3.0.

The type that worked is GL_UNSIGNED_SHORT_5_5_5_1, but the colors and pixels are broken:

[screenshot: rendered frame with broken colors]

So I thought converting the pixel buffer would fix it, but due to my limited understanding of GL and C++ I haven't managed to do that.

Pixel processing:

  • The pixels are converted internally to 16-bit ABGR, as described in the shader comments (a C sketch of this bit layout follows this section):
// Take a normalized color and convert it into a 16bit 1555 ABGR
// integer in the format used internally by the Playstation GPU.
uint rebuild_psx_color(vec4 color) {
  uint a = uint(floor(color.a + 0.5));
  uint r = uint(floor(color.r * 31. + 0.5));
  uint g = uint(floor(color.g * 31. + 0.5));
  uint b = uint(floor(color.b * 31. + 0.5));
  
  return (a << 15) | (b << 10) | (g << 5) | r;
}
  • It is then received by this method after processing by the vGPU:
static void Texture_set_sub_image_window(struct Texture *tex, uint16_t top_left[2], uint16_t resolution[2], size_t row_len, uint16_t* data)
{
    uint16_t x = top_left[0];
    uint16_t y = top_left[1];

    /* TODO - Am I indexing data out of bounds? */
    size_t index       = ((size_t) y) * row_len + ((size_t) x);
    uint16_t* sub_data = &( data[index] );

    glPixelStorei(GL_UNPACK_ROW_LENGTH, (GLint) row_len);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glBindTexture(GL_TEXTURE_2D, tex->id);

    glTexSubImage2D(GL_TEXTURE_2D, 0,
                    (GLint) top_left[0], (GLint) top_left[1],
                    (GLsizei) resolution[0], (GLsizei) resolution[1],
                    GL_RGBA, GL_UNSIGNED_SHORT_1_5_5_5_REV /* Not supported in GLES */,
                    (void*) sub_data);

    glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);
}

As for row_len, it gets its value from #define VRAM_WIDTH_PIXELS 1024.
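
For reference, here is a C mirror of the shader's packing. This is just a sketch (the function name pack_psx_1555 is mine, not from the plugin), but it makes explicit the layout that any conversion routine has to undo: with GL_RGBA plus GL_UNSIGNED_SHORT_1_5_5_5_REV, red sits in bits 0-4, green in bits 5-9, blue in bits 10-14, and alpha in bit 15.

#include <stdint.h>

/* C equivalent of the shader's rebuild_psx_color (hypothetical name):
 * pack 5-bit r/g/b channels and a 1-bit alpha into the PSX 1555 ABGR word. */
static uint16_t pack_psx_1555(uint16_t r, uint16_t g, uint16_t b, uint16_t a)
{
    /* r, g, b must be in 0..31; a must be 0 or 1. */
    return (uint16_t)((a << 15) | (b << 10) | (g << 5) | r);
}

GL_UNSIGNED_SHORT_5_5_5_1 with GL_RGBA expects the opposite end of the word: red in bits 11-15, green in bits 6-10, blue in bits 1-5, and alpha in bit 0, which is why uploading the raw buffer renders with scrambled channels.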

What I tried to do:

  • First, I replaced the type with the supported one:
glTexSubImage2D(GL_TEXTURE_2D, 0,
                (GLint) top_left[0], (GLint) top_left[1],
                (GLsizei) resolution[0], (GLsizei) resolution[1],
                GL_RGBA, GL_UNSIGNED_SHORT_5_5_5_1 /* <- Here new type */,
                (void*) sub_data);
  • Second, I converted sub_data using this method:
uint16_t* ABGRConvertion(const uint16_t* pixels, int row_len, int x, int y, int width, int height) {
    /* NOTE: static + malloc means the buffer is allocated once, on the
     * first call; it must hold row_len * height 16-bit pixels. */
    static uint16_t* frameBuffer = NULL;
    if (frameBuffer == NULL)
        frameBuffer = (uint16_t*)malloc((size_t)row_len * height * sizeof(uint16_t));

    for (int j = 0; j < height; j++)
    {
        for (int i = 0; i < width; i++)
        {
            int offset = j * row_len + i;
            uint16_t pixel      = pixels[offset];
            frameBuffer[offset] = Convert1555To5551(pixel); // <- stuck here
        }
    }
    return frameBuffer;
}
  • But I have no idea what Convert1555To5551 should look like.

Note: Sorry if some of the descriptions are wrong; I don't have a full understanding of the whole process.

Performance is not a major problem; I just need to know how to deal with the current pixel buffer.

Side note: I had to replace glFramebufferTexture with glFramebufferTexture2D, so I hope that's not involved in the issue.
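
For what it's worth, that swap looks like this in isolation (a sketch; the attachment point and texture id are placeholders, not taken from the plugin):

/* Desktop GL: glFramebufferTexture(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, tex->id, 0);
 * GLES 3.0 equivalent for a plain 2D texture: */
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, tex->id, 0);

For a non-layered GL_TEXTURE_2D color attachment the two calls behave the same, so this change alone shouldn't break the colors.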

Thanks.

This should be what you're looking for.

uint16_t Convert1555To5551(uint16_t pixel)
{
    // extract rgba from 1555 (1 bit alpha, 5 bits blue, 5 bits green, 5 bits red) 
    uint16_t a = pixel >> 15;
    uint16_t b = (pixel >> 10) & 0x1f; // mask lowest five bits
    uint16_t g = (pixel >> 5) & 0x1f;
    uint16_t r = pixel & 0x1f;
    
    // compress rgba into 5551 (5 bits red, 5 bits green, 5 bits blue, 1 bit alpha)
    return (r << 11) | (g << 6) | (b << 1) | a;
}
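
To tie it together, here is one way the conversion could be wired into the upload path. This is only a sketch, assuming the Texture struct and parameters from the question; the _5551-suffixed wrapper and the staging buffer are mine, not part of the original plugin. Converting the sub-rectangle into a tightly packed temporary buffer also lets GL_UNPACK_ROW_LENGTH stay at its default:

#include <stdlib.h> /* malloc, free */

static void Texture_set_sub_image_window_5551(struct Texture *tex,
                                              uint16_t top_left[2],
                                              uint16_t resolution[2],
                                              size_t row_len,
                                              uint16_t *data)
{
    uint16_t x = top_left[0];
    uint16_t y = top_left[1];
    uint16_t w = resolution[0];
    uint16_t h = resolution[1];

    /* Tightly packed w*h staging buffer for the converted pixels. */
    uint16_t *staging = (uint16_t *)malloc((size_t)w * (size_t)h * sizeof(uint16_t));
    if (staging == NULL)
        return;

    /* Walk the sub-rectangle inside the row_len-stride VRAM buffer and
     * repack each 1555 ABGR word as 5551 RGBA. */
    for (size_t j = 0; j < h; j++) {
        const uint16_t *src = &data[(y + j) * row_len + x];
        uint16_t       *dst = &staging[j * w];
        for (size_t i = 0; i < w; i++)
            dst[i] = Convert1555To5551(src[i]);
    }

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glBindTexture(GL_TEXTURE_2D, tex->id);

    glTexSubImage2D(GL_TEXTURE_2D, 0,
                    (GLint) x, (GLint) y,
                    (GLsizei) w, (GLsizei) h,
                    GL_RGBA, GL_UNSIGNED_SHORT_5_5_5_1,
                    (void *) staging);

    free(staging);
}

Since you said performance isn't critical, the extra per-upload copy and allocation should be acceptable.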


 