Java RenderedImage implementation with custom raw DataBuffer
I receive 16-bit grayscale images from a device. The images are delivered in an uncompressed raw format; here is an 8-byte example of how a 2x2 image looks in this format (MSB first):
21 27 33 F6 28 F3 27 F2
----- ----- ----- -----
pixel 0,0(x,y) pixel 1,0 pixel 0,1 pixel 1,1
I need to compress the images using the Kakadu JPEG2000 library, which exposes a Java ImageWriter implementation. The ImageWriter.write method expects a RenderedImage as input. I'm using the following code to create a BufferedImage from the raw image data:
int[] rasterData = new int[width * height];
int rawBufferOffset = 0;
for (int i = 0; i < rasterData.length; i++) {
    // Mask both bytes to avoid sign extension of negative byte values
    rasterData[i] = ((rawBuffer[rawBufferOffset + 1] & 0xFF) << 8) | (rawBuffer[rawBufferOffset] & 0xFF);
    rawBufferOffset += 2;
}
BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_USHORT_GRAY);
image.getRaster().setPixels(0, 0, width, height, rasterData);
The code works, but it's obviously not the best method for this conversion. I was thinking about creating a RenderedImage implementation that uses rawBuffer as the image raster data source. Can anyone suggest how to do so, or suggest any other method for this conversion?
The most straightforward way is probably to use a ByteBuffer to swap the byte order, and create a new short array to hold the pixel data. Then wrap the (short) pixel data in a DataBufferUShort, create a matching WritableRaster and ColorModel, and finally create a BufferedImage from this. This image should be identical to the image created by your code above (BufferedImage.TYPE_USHORT_GRAY), but slightly faster to create, as you only copy the pixels once (as opposed to twice in your code).
int w = 2;
int h = 2;
int stride = 1;
byte[] rawBytes = {0x21, 0x27, 0x33, (byte) 0xF6, 0x28, (byte) 0xF3, (byte) 0x27, (byte) 0xF2};
short[] rawShorts = new short[rawBytes.length / 2];
ByteBuffer.wrap(rawBytes)
.order(ByteOrder.LITTLE_ENDIAN)
.asShortBuffer()
.get(rawShorts);
DataBuffer dataBuffer = new DataBufferUShort(rawShorts, rawShorts.length);
WritableRaster raster = Raster.createInterleavedRaster(dataBuffer, w, h, w * stride, stride, new int[]{0}, null);
ColorModel colorModel = new ComponentColorModel(ColorSpace.getInstance(ColorSpace.CS_GRAY), false, false, Transparency.OPAQUE, DataBuffer.TYPE_USHORT);
BufferedImage image = new BufferedImage(colorModel, raster, colorModel.isAlphaPremultiplied(), null);
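As a quick sanity check, a standalone sketch using the same sample bytes as above: after the byte swap, the DataBufferUShort should report the combined little-endian pixel values (DataBufferUShort.getElem masks the stored shorts to unsigned 16-bit values):

```java
import java.awt.image.DataBufferUShort;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class SwapCheck {
    public static void main(String[] args) {
        byte[] rawBytes = {0x21, 0x27, 0x33, (byte) 0xF6, 0x28, (byte) 0xF3, (byte) 0x27, (byte) 0xF2};

        // Swap byte order exactly as in the answer above
        short[] rawShorts = new short[rawBytes.length / 2];
        ByteBuffer.wrap(rawBytes)
                  .order(ByteOrder.LITTLE_ENDIAN)
                  .asShortBuffer()
                  .get(rawShorts);

        DataBufferUShort dataBuffer = new DataBufferUShort(rawShorts, rawShorts.length);
        System.out.println(Integer.toHexString(dataBuffer.getElem(0))); // 2721 (pixel 0,0)
        System.out.println(Integer.toHexString(dataBuffer.getElem(1))); // f633 (pixel 1,0)
    }
}
```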
Another, slightly more convoluted, but probably faster way (as you don't copy the backing pixel array at all), is to create a custom SampleModel that works with the little-endian byte data but exposes it as TYPE_USHORT. This will create a TYPE_CUSTOM image.
int w = 2, h = 2, stride = 2;
byte[] rawBytes = {0x21, 0x27, 0x33, (byte) 0xF6, 0x28, (byte) 0xF3, (byte) 0x27, (byte) 0xF2};
DataBuffer dataBuffer = new DataBufferByte(rawBytes, rawBytes.length);
SampleModel sampleModel = new ComponentSampleModel(DataBuffer.TYPE_USHORT, w, h, stride, w * stride, new int[] {0}) {
    @Override
    public Object getDataElements(int x, int y, Object obj, DataBuffer data) {
        if ((x < 0) || (y < 0) || (x >= width) || (y >= height)) {
            throw new ArrayIndexOutOfBoundsException("Coordinate out of bounds!");
        }

        // Simplified, as we only support TYPE_USHORT
        int numDataElems = getNumDataElements();
        int pixelOffset = y * scanlineStride + x * pixelStride;

        short[] sdata;
        if (obj == null) {
            sdata = new short[numDataElems];
        }
        else {
            sdata = (short[]) obj;
        }

        for (int i = 0; i < numDataElems; i++) {
            // Combine two little-endian bytes into one unsigned 16-bit sample
            sdata[i] = (short) (data.getElem(bankIndices[i], pixelOffset + bandOffsets[i] + 1) << 8
                              | data.getElem(bankIndices[i], pixelOffset + bandOffsets[i]));
        }

        return sdata;
    }
};
ColorModel colorModel = new ComponentColorModel(ColorSpace.getInstance(ColorSpace.CS_GRAY), false, false, Transparency.OPAQUE, DataBuffer.TYPE_USHORT);
WritableRaster raster = Raster.createWritableRaster(sampleModel, dataBuffer, null);
BufferedImage image = new BufferedImage(colorModel, raster, colorModel.isAlphaPremultiplied(), null);
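Note that the sketch above only overrides getDataElements, which is what BufferedImage.getRGB and most copy operations go through; per-sample accessors such as Raster.getSample are inherited from ComponentSampleModel and still see individual raw bytes, and setDataElements is not overridden either, so treat the image as read-only. A minimal check, using the same sample data as above:

```java
import java.awt.image.ComponentSampleModel;
import java.awt.image.DataBuffer;
import java.awt.image.DataBufferByte;
import java.awt.image.Raster;
import java.awt.image.SampleModel;
import java.awt.image.WritableRaster;

public class CustomSampleCheck {
    public static void main(String[] args) {
        int w = 2, h = 2, stride = 2;
        byte[] rawBytes = {0x21, 0x27, 0x33, (byte) 0xF6, 0x28, (byte) 0xF3, (byte) 0x27, (byte) 0xF2};
        DataBuffer dataBuffer = new DataBufferByte(rawBytes, rawBytes.length);

        // Same idea as the anonymous SampleModel in the answer above, condensed
        SampleModel sampleModel = new ComponentSampleModel(DataBuffer.TYPE_USHORT, w, h, stride, w * stride, new int[] {0}) {
            @Override
            public Object getDataElements(int x, int y, Object obj, DataBuffer data) {
                short[] sdata = obj != null ? (short[]) obj : new short[getNumDataElements()];
                int pixelOffset = y * scanlineStride + x * pixelStride;
                for (int i = 0; i < sdata.length; i++) {
                    // Combine two little-endian bytes into one unsigned 16-bit sample
                    sdata[i] = (short) (data.getElem(bankIndices[i], pixelOffset + bandOffsets[i] + 1) << 8
                                      | data.getElem(bankIndices[i], pixelOffset + bandOffsets[i]));
                }
                return sdata;
            }
        };
        WritableRaster raster = Raster.createWritableRaster(sampleModel, dataBuffer, null);

        // Overridden path: combined 16-bit sample
        short[] pixel = (short[]) raster.getDataElements(0, 0, null);
        System.out.println(Integer.toHexString(pixel[0] & 0xFFFF)); // 2721

        // Inherited path: getSample is NOT overridden, so it returns a single raw byte
        System.out.println(raster.getSample(0, 0, 0)); // 33 (0x21)
    }
}
```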
I don't really see a reason for creating a RenderedImage subclass for this.
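Either image can then be handed to the JPEG 2000 ImageWriter through the usual javax.imageio plumbing. A hedged sketch: the format name "jpeg2000" and the presence of the Kakadu writer on the classpath are assumptions, so this falls back to the built-in PNG writer just to stay runnable anywhere (PNG also supports 16-bit grayscale):

```java
import javax.imageio.ImageIO;
import javax.imageio.ImageWriter;
import javax.imageio.stream.ImageOutputStream;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.util.Iterator;

public class WriteDemo {
    public static void main(String[] args) throws Exception {
        BufferedImage image = new BufferedImage(2, 2, BufferedImage.TYPE_USHORT_GRAY);

        // "jpeg2000" is an assumed format name for the Kakadu plugin;
        // fall back to PNG when the plugin is not on the classpath
        Iterator<ImageWriter> writers = ImageIO.getImageWritersByFormatName("jpeg2000");
        ImageWriter writer = writers.hasNext() ? writers.next()
                : ImageIO.getImageWritersByFormatName("png").next();

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (ImageOutputStream ios = ImageIO.createImageOutputStream(out)) {
            writer.setOutput(ios);
            writer.write(image); // accepts any RenderedImage
        } finally {
            writer.dispose();
        }
        System.out.println(out.size() > 0); // true: some bytes were written
    }
}
```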