Android NDK gl using Java nio bytebuffer for texture image
I am using a Java-side direct nio buffer that holds pixel data from a bitmap, and I use it on the NDK side as a GL texture. Basically, I am not able to read the Java nio buffer's pixels correctly in the C/C++ GL draw call.
It appears that the java.nio.ByteBuffer that is filled with pixels on the Java side is not directly compatible with the NDK-side GL, which requires unsigned bytes (a Java byte is apparently 32-bit).
So a single white pixel on the Java side:
int size = 1;
ByteBuffer vv = ByteBuffer.allocateDirect(size * 4); // was size_t*4, which does not compile in Java
vv.order(ByteOrder.nativeOrder());
vv.put((byte)255); // R
vv.put((byte)255); // G
vv.put((byte)255); // B
vv.put((byte)255); // A
vv.position(0);
... //code to send the buffer address to JNI/NDK gl side
... //
will be drawn on the NDK side as a black pixel.
I realize that the actual signed byte value in this buffer may be negative - how do I correct for this? Also, once I solve this issue, I will need to convert a byte[] containing image data derived from a Bitmap on the Java side for use in the NDK-side texture.
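For what it's worth, the negative value is only Java's signed *view* of the byte; the underlying bit pattern of (byte) 255 is 0xFF, which is exactly what GL reads as an unsigned byte. A minimal, self-contained sketch of the two views (pure Java, no Android needed):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class UnsignedByteDemo {
    public static void main(String[] args) {
        ByteBuffer vv = ByteBuffer.allocateDirect(4);
        vv.order(ByteOrder.nativeOrder());
        vv.put((byte) 255); // stored as -1 in Java's signed view; bit pattern is 0xFF
        vv.position(0);
        byte b = vv.get();
        System.out.println(b);        // -1  (signed Java view)
        System.out.println(b & 0xFF); // 255 (unsigned view, what GL sees)
    }
}
```

So no sign correction is needed before handing the buffer to native code; `& 0xFF` is only needed when you want to inspect the value as an unsigned int on the Java side.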
Thanks in advance!
First, "byte" in Java is 8 bits (signed), not 32 bits. I don't use nio.ByteBuffer (though one can); a byte[] array works much better and is simpler.
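To illustrate the byte[] approach, here is a small pure-Java sketch that fills an array with opaque white RGBA pixels, ready to hand to a native method such as the `updateBitmap` below (the 2x2 size is just for illustration):

```java
import java.util.Arrays;

public class PixelArrayDemo {
    public static void main(String[] args) {
        int w = 2, h = 2, bpp = 4; // RGBA_8888: 4 bytes per pixel
        byte[] data = new byte[w * h * bpp];
        // (byte) 255 is -1 in Java's signed view, but the bit pattern 0xFF
        // is exactly the unsigned byte that GL and the bitmap expect.
        Arrays.fill(data, (byte) 255);
        System.out.println(data.length); // 16
    }
}
```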
In AndroidBitmap.java:
public class AndroidBitmap {
    public static native void updateBitmap(android.graphics.Bitmap bitmap, byte[] data, int w, int h, int bpp);
}
In AndroidBitmap.c:
jboolean Java_jni_AndroidBitmap_updateBitmap(JNIEnv* env, jobject that, jobject bitmap, jbyteArray data, jint w, jint h, jint bpp) {
    jbyte* a = (*env)->GetByteArrayElements(env, data, NULL);
    jsize bytes = (*env)->GetArrayLength(env, data); // length of data, usable to bound the copy
    AndroidBitmapInfo info = {0};
    int r = AndroidBitmap_getInfo(env, bitmap, &info);
    if (r != 0) {
        // … "AndroidBitmap_getInfo() failed ! error=%d", r
        (*env)->ReleaseByteArrayElements(env, data, a, JNI_ABORT); // don't leak the array on early return
        return JNI_FALSE;
    }
    int width = info.width;
    int height = info.height;
    if (info.format != ANDROID_BITMAP_FORMAT_RGBA_8888 && info.format != ANDROID_BITMAP_FORMAT_A_8) {
        // "Bitmap format is not RGBA_8888 or A_8"
        (*env)->ReleaseByteArrayElements(env, data, a, JNI_ABORT);
        return JNI_FALSE;
    }
    int bytesPerPixel = info.format == ANDROID_BITMAP_FORMAT_RGBA_8888 ? 4 : 1;
    void* pixels = NULL;
    r = AndroidBitmap_lockPixels(env, bitmap, &pixels);
    if (r != 0) {
        // … "AndroidBitmap_lockPixels() failed ! error=%d", r
        (*env)->ReleaseByteArrayElements(env, data, a, JNI_ABORT);
        return JNI_FALSE;
    }
    if (w == width && h == height && bytesPerPixel == bpp) {
        memcpy(pixels, a, width * height * bytesPerPixel);
    } else if (bytesPerPixel == 4 && bpp == 1) {
        grayscaleToRGBA(pixels, &info, a, w, h); // pass the locked element pointer, not the jbyteArray handle
    } else {
        assertion(bytesPerPixel == 4 && bpp == 1, "only grayscale -> RGBA is supported bytesPerPixel=%d bpp=%d", bytesPerPixel, bpp);
    }
    AndroidBitmap_unlockPixels(env, bitmap);
    (*env)->ReleaseByteArrayElements(env, data, a, 0);
    return JNI_TRUE;
}
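The grayscaleToRGBA helper is not shown in the answer. Assuming it expands one gray byte per pixel into RGBA8888 with R = G = B = gray and a fully opaque alpha, a hypothetical sketch of the same logic, written in Java for clarity (the actual helper is C), would look like this:

```java
public class GrayToRgbaDemo {
    // Hypothetical Java equivalent of the answer's (unshown) C helper:
    // expand one gray byte per pixel into 4-byte RGBA8888.
    static byte[] grayscaleToRgba(byte[] gray, int w, int h) {
        byte[] rgba = new byte[w * h * 4];
        for (int i = 0; i < w * h; i++) {
            byte g = gray[i];
            rgba[i * 4]     = g;          // R
            rgba[i * 4 + 1] = g;          // G
            rgba[i * 4 + 2] = g;          // B
            rgba[i * 4 + 3] = (byte) 255; // A: fully opaque
        }
        return rgba;
    }

    public static void main(String[] args) {
        byte[] gray = { (byte) 128, (byte) 255 };
        byte[] rgba = grayscaleToRgba(gray, 2, 1);
        System.out.println(rgba.length); // 8
    }
}
```

Note that the C version would also have to honor `info.stride` when the bitmap's row stride is wider than `width * 4`; the sketch above assumes tightly packed rows.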
Hope this helps.