
Convert 32-Bit Color Space to 16-Bit

I have a color in a 32-bit (sRGB) color space and I want to convert it into a 16-bit color space with the bit order ABBBBBGGGGGRRRRR.

I've read that you can use a matrix to convert between spaces, but I am unsure how it works.

Can somebody please point me toward a Java example of doing this, or else help me out with the math? It would be greatly appreciated.

My idea was to convert the 32-bit RGB values (each channel 0-255) to CMY and then convert the CMY back to the 16-bit RGB color space (each channel 0-31).

Is there a way to do this and make the necessary corrections so I end up with a decent image?

There's no need to go to CMY and back again.
Just a little bit-shifting will get you there.

First split up the 32-bit value.
It's actually 24 bits for RGB plus 8 bits for the alpha channel.

Then we shift out the least significant bits.

Let's go.

char convert32Colorto16(Color color) {
  // Keep the top 5 bits of each 8-bit color channel and 1 bit of alpha.
  int r = color.getRed()   >> 3;
  int g = color.getGreen() >> 3;
  int b = color.getBlue()  >> 3;
  int a = color.getAlpha() >> 7;
  // Pack as ABBBBBGGGGGRRRRR: alpha in the top bit, red in the low 5 bits.
  return (char) ((a << 15) | (b << 10) | (g << 5) | r);
}

Note that the char doubles as an unsigned 16-bit value here, since Java has no unsigned short.
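
For a quick sanity check (assuming the Color above is java.awt.Color), you can print the packed value in hex:

Color c = new Color(200, 100, 50, 255);        // an opaque test color
char packed = convert32Colorto16(c);
System.out.printf("0x%04X%n", (int) packed);   // prints the 16-bit value in hex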

This is the fast and easy route.
If instead you take a palette of colors and use the 32768 most common ones (assuming 1 bit for the alpha channel), you'll have to do a whole lot of lookup and proximity selection for the pixels that are not in your lookup table, but you'll get a nicer-looking picture.
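
The "proximity selection" can be as simple as a nearest-color search over the palette. This is only a sketch, assuming the palette is an array of packed 0xAARRGGBB ints; a real implementation would cache the result per input color:

// Find the palette entry closest to a pixel that has no exact match,
// using squared Euclidean distance in RGB space.
static int nearestPaletteIndex(int argb, int[] palette) {
  int r = (argb >> 16) & 0xFF, g = (argb >> 8) & 0xFF, b = argb & 0xFF;
  int best = 0;
  long bestDist = Long.MAX_VALUE;
  for (int i = 0; i < palette.length; i++) {
    int pr = (palette[i] >> 16) & 0xFF;
    int pg = (palette[i] >> 8) & 0xFF;
    int pb = palette[i] & 0xFF;
    long dist = (long) (r - pr) * (r - pr)
              + (long) (g - pg) * (g - pg)
              + (long) (b - pb) * (b - pb);
    if (dist < bestDist) { bestDist = dist; best = i; }
  }
  return best;
}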

Another option for better quality is to forget about the alpha channel and devote 6 bits to the green channel (the human eye is most sensitive to green).
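
A sketch of that 5-6-5 variant, in the same style as the method above (the channel order here just mirrors the question's layout with blue in the high bits; swap the shifts if your target expects standard RGB565):

char convert32ColorTo565(Color color) {
  int r = color.getRed()   >> 3;   // 5 bits
  int g = color.getGreen() >> 2;   // 6 bits for green
  int b = color.getBlue()  >> 3;   // 5 bits
  // No alpha bit: BBBBBGGGGGGRRRRR
  return (char) ((b << 11) | (g << 5) | r);
}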

Are you converting for a custom application or from one standard format to another standard format?

You can "brute force" do the conversion by truncating the values in the 32-bit channels to their 16-bit equivalent. For example, if you have 8 bits in the 32-bit Red channel, you can chop it down to 5 bits in the 16-bit Red channel by taking the upper 5 bits from the 8 bit colors. This will be relatively easy to code, but probably not give good results.

If the 16 bits can be used to represent a palette of colors, then you can scan the colors in the original image, create a histogram of colors, pick the 65536 most common ones, and use them.
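
A sketch of the histogram step, assuming the image's pixels are already available as packed ARGB ints; mapping the remaining colors onto the chosen palette is left out:

import java.util.*;
import java.util.stream.Collectors;

// Count how often each 32-bit pixel value occurs and return the most
// frequent ones, e.g. paletteSize = 65536.
static List<Integer> mostCommonColors(int[] pixels, int paletteSize) {
  Map<Integer, Integer> counts = new HashMap<>();
  for (int p : pixels) {
    counts.merge(p, 1, Integer::sum);
  }
  return counts.entrySet().stream()
      .sorted(Map.Entry.<Integer, Integer>comparingByValue().reversed())
      .limit(paletteSize)
      .map(Map.Entry::getKey)
      .collect(Collectors.toList());
}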

There are lots of algorithms for converting from one color depth to another. My first suggestion is to look at the open-source code for GIMP. Its Posterize function reduces the number of levels per channel, which is effectively what you are doing. It is not the be-all and end-all answer, but it may be a good place to start and give you some pointers.
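
Posterize essentially snaps each channel to a fixed number of evenly spaced levels; a tiny illustration of the idea (not GIMP's actual code):

// Snap an 8-bit channel value to one of `levels` evenly spaced values in 0..255.
// With levels = 32 this roughly corresponds to dropping a channel to 5 bits.
static int posterizeChannel(int value, int levels) {
  int level = Math.round(value * (levels - 1) / 255f);
  return level * 255 / (levels - 1);
}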
