Convert 24-bit bmp to 16-bit?

I know that the .NET Framework comes with an image conversion class (the System.Drawing.Image.Save method).

But I need to convert a 24-bit (R8G8B8) bitmap image to a 16-bit (X1R5G5B5) one, and I have no idea how to do this kind of conversion. Simply changing 24 to 16 in the bmp header won't work, since the entire image data has to be converted as well.

I would also like to know whether I can control the image dithering, etc.

Ideas? Any kind of help would be appreciated.

The Format16bppRgb1555 pixel format is declared, but GDI+ doesn't actually support it. No mainstream video driver or image codec ever used that pixel format. It is something the GDI+ designers guessed might happen, but their time machine wasn't accurate enough; or it was a pretty sloppy copy/paste by the programmer who worked on System.Drawing.

Rgb555 is the closest match for available hardware and codecs:

public static Bitmap ConvertTo16bpp(Image img) {
    // Create a 16bpp (5-5-5) destination bitmap and let GDI+ convert the
    // pixel data when the source image is drawn onto it.
    var bmp = new Bitmap(img.Width, img.Height,
                  System.Drawing.Imaging.PixelFormat.Format16bppRgb555);
    using (var gr = Graphics.FromImage(bmp))
        gr.DrawImage(img, new Rectangle(0, 0, img.Width, img.Height));
    return bmp;
}
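
For illustration, a minimal usage sketch of the helper above (the file names and the decision to save as BMP are assumptions, and GDI+ chooses the on-disk bit depth itself):

// Hypothetical usage of ConvertTo16bpp: load a 24bpp file, convert, save.
using (var src = Image.FromFile("input.bmp"))
using (var bmp16 = ConvertTo16bpp(src))
{
    // The in-memory bitmap is now Format16bppRgb555; the BMP encoder may
    // still pick a different bit depth for the file it writes.
    bmp16.Save("output.bmp", System.Drawing.Imaging.ImageFormat.Bmp);
}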

You need to save the bitmap with an EncoderParameter that specifies the color depth.

    Encoder myEncoder = Encoder.ColorDepth;
    EncoderParameters myEncoderParameters = new EncoderParameters(1);

    // Save the image with a color depth of 16 bits per pixel.
    EncoderParameter myEncoderParameter = new EncoderParameter(myEncoder, 16L);
    myEncoderParameters.Param[0] = myEncoderParameter;

    myBitmap.Save("MyBitmap.bmp", myImageCodecInfo, myEncoderParameters);
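
The snippet leaves myImageCodecInfo undefined; one way it could be obtained is sketched below (GetBmpEncoder is an illustrative name, assuming a BMP codec is registered on the machine):

    using System.Drawing.Imaging;
    using System.Linq;

    // Illustrative helper: find the registered GDI+ encoder for BMP files.
    static ImageCodecInfo GetBmpEncoder()
    {
        return ImageCodecInfo.GetImageEncoders()
            .First(codec => codec.MimeType == "image/bmp");
    }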

A really straightforward way to do this is to loop over the old bitmap data and convert every r8-g8-b8 triple to an x1-r5-g5-b5 value, with something akin to the following:

unsigned char condense(unsigned char i)
{
  // scale an 8-bit channel (0-255) down to 5 bits (0-31)
  return (unsigned char)(i * 31.0f / 255.0f);
}

short transform(long input) // the top 8 bits of the 32-bit input are ignored
{
  // BMP stores 24-bit pixels as B, G, R in memory, so the low byte is blue,
  // which lands in bits 0-4 of the X1R5G5B5 result, and so on.
  return condense(input & 0xff)
      | (condense((input & 0xff00) >> 8) << 5)
      | (condense((input & 0xff0000) >> 16) << 10);
}

// and somewhere in your program
{
  int len;                 // the length of your data in pixels
  unsigned char *data;     // your 24-bit source data
  unsigned char *newdata;  // the 16-bit output; make sure len * 2 bytes are allocated
  unsigned char *out = newdata;

  // note: reading a long pulls in 4 bytes per pixel, so keep at least one
  // byte of slack after the last source pixel
  for (unsigned char *p = data; p < data + len * 3; p += 3, out += 2)
    *(short *)out = transform(*(long *)p);
}
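
The same per-pixel packing can be done on the .NET side with Bitmap.LockBits. Below is a minimal sketch, not the answer's original code: it assumes a 24bpp source Bitmap, simply truncates each channel to 5 bits (no dithering), and the name To16bppManual is illustrative only.

using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;

static Bitmap To16bppManual(Bitmap src)
{
    var dst = new Bitmap(src.Width, src.Height, PixelFormat.Format16bppRgb555);
    var rect = new Rectangle(0, 0, src.Width, src.Height);

    // Lock both bitmaps so the raw pixel buffers can be addressed directly.
    BitmapData sd = src.LockBits(rect, ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb);
    BitmapData dd = dst.LockBits(rect, ImageLockMode.WriteOnly, PixelFormat.Format16bppRgb555);
    try
    {
        for (int y = 0; y < src.Height; y++)
        {
            for (int x = 0; x < src.Width; x++)
            {
                // 24bpp rows are stored B, G, R per pixel.
                IntPtr sp = sd.Scan0 + y * sd.Stride + x * 3;
                byte b = Marshal.ReadByte(sp, 0);
                byte g = Marshal.ReadByte(sp, 1);
                byte r = Marshal.ReadByte(sp, 2);

                // Drop the low 3 bits of each channel and pack into X1R5G5B5.
                short pixel = (short)(((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3));
                Marshal.WriteInt16(dd.Scan0 + y * dd.Stride + x * 2, pixel);
            }
        }
    }
    finally
    {
        src.UnlockBits(sd);
        dst.UnlockBits(dd);
    }
    return dst;
}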
