rgba arrays to OpenGL texture

For the GUI of my game, I have a custom texture object that stores the RGBA data for a texture. Each GUI element registered by my game adds to the final GUI texture, and then that texture is overlaid onto the framebuffer after post-processing.

I'm having trouble converting my Texture object to an OpenGL texture.

First I create a 1D int array that goes rgbargbargba... etc.

public int[] toIntArray(){
    int[] colors = new int[(width*height)*4];

    int i = 0;
    for(int y = 0; y < height; ++y){
        for(int x = 0; x < width; ++x){

            colors[i] = r[x][y];
            colors[i+1] = g[x][y];
            colors[i+2] = b[x][y];
            colors[i+3] = a[x][y];

            i += 4;
        }
    }
    return colors;
}

Where r, g, b, and a are jagged int arrays holding values from 0 to 255. Next I create the int buffer and the texture.

id = glGenTextures();

glBindTexture(GL_TEXTURE_2D, id);

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

IntBuffer iBuffer = BufferUtils.createIntBuffer(((width * height)*4));

int[] data = toIntArray();
iBuffer.put(data);
iBuffer.rewind();

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_INT, iBuffer);

glBindTexture(GL_TEXTURE_2D, 0);

After that I add a 50x50 red square to the upper left of the texture, bind the texture in the framebuffer shader, and render the fullscreen quad that displays my framebuffer.

frameBuffer.unbind(window.getWidth(), window.getHeight());

postShaderProgram.bind();

glEnable(GL_TEXTURE_2D);
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, guiManager.texture()); // this gets the texture id that was created

postShaderProgram.setUniform("gui_texture", 1);

mesh.render();

postShaderProgram.unbind();

And then in my fragment shader, I try displaying the GUI:

#version 330

in  vec2 Texcoord;
out vec4 outColor;

uniform sampler2D texFramebuffer;
uniform sampler2D gui_texture;

void main()
{
    outColor = texture(gui_texture, Texcoord);
}

But all it outputs is a black window!

I added a red 50x50 rectangle in the upper-left corner and verified that it exists, but for some reason it isn't showing up in the final output.

(screenshot: the all-black window)

That gives me reason to believe that I'm not converting my texture into an OpenGL texture with glTexImage2D correctly.

Can you see anything I'm doing wrong?

Update 1:

Elsewhere I saw a similar thing done with a float array, so I tried converting my 0-255 values to a 0-1 float array and passing that as the image data like so:

public float[] toFloatArray(){
    float[] colors = new float[(width*height)*4];

    int i = 0;
    for(int y = 0; y < height; ++y){
        for(int x = 0; x < width; ++x){

            colors[i] = (( r[x][y] * 1.0f) / 255);
            colors[i+1] = (( g[x][y] * 1.0f) / 255);
            colors[i+2] = (( b[x][y] * 1.0f) / 255);
            colors[i+3] = (( a[x][y] * 1.0f) / 255);

            i += 4;
        }
    }
    return colors;
}

...

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_FLOAT, toFloatArray());

And it works!

(screenshot: the red square now renders)

I'm going to leave the question open, however, as I want to learn why the int buffer wasn't working :)

When you specified GL_UNSIGNED_INT as the type of the "host" data, OpenGL expected 32 bits for each color channel. Since OpenGL normalizes the output colors in the default framebuffer to the range [0.0f, 1.0f], it takes your input channel values (in the range [0, 255]) and divides each of them by the maximum value of an unsigned 32-bit integer (about 4.3 billion) to get the final color displayed on screen. As an exercise, using your original code, set the "clear" color of the screen to white and verify that a black rectangle is in fact being drawn on screen.
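A quick back-of-the-envelope check (plain Java, no OpenGL context needed) shows the scale of that mismatch: even a "full brightness" channel value of 255 normalizes to effectively zero.

```java
public class UnsignedIntNormalization {
    public static void main(String[] args) {
        // GL_UNSIGNED_INT data is normalized against the max unsigned 32-bit value.
        long maxUnsignedInt = 0xFFFFFFFFL; // 4294967295

        // What a 0-255 channel value becomes after that normalization.
        double normalized = 255.0 / maxUnsignedInt;

        System.out.println(normalized); // on the order of 6e-8: indistinguishable from black
    }
}
```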

You have two options. The first is to scale your color values up to the range implied by GL_UNSIGNED_INT, which means multiplying each one by 2^24 (i.e. a left shift by 24) and trusting that the resulting integer overflow behaves correctly, since Java doesn't have unsigned integer types.
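A sketch of what that scaling looks like, assuming the channel values stay in Java ints (the helper name `scaleChannel` is made up for illustration). The shift reinterprets 255 as the unsigned value 4278190080, which normalizes to roughly 1.0:

```java
public class ScaleToUnsignedInt {
    // Hypothetical helper: spread one 0-255 channel value across the full
    // unsigned 32-bit range that GL_UNSIGNED_INT normalization divides by.
    static int scaleChannel(int value) {
        return value << 24; // 255 -> 0xFF000000, i.e. unsigned 4278190080
    }

    public static void main(String[] args) {
        int scaled = scaleChannel(255);
        // Java prints the two's-complement view; the unsigned value is what GL sees.
        System.out.println(Integer.toUnsignedLong(scaled)); // 4278190080
        System.out.println(Integer.toUnsignedLong(scaled) / (double) 0xFFFFFFFFL); // ~0.996
    }
}
```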

The other, far safer option is to store each 0-255 value in a byte[] (do not use char: a char is 1 byte in C/C++/OpenGL, but 2 bytes in Java) and specify the element type as GL_UNSIGNED_BYTE.
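A minimal sketch of that conversion, assuming the same r/g/b/a jagged arrays as in the question (the method name `toByteArray` and class name are mine, and the glTexImage2D call is left as a comment since it needs a live GL context):

```java
import java.nio.ByteBuffer;

public class TextureBytes {
    // Pack 0-255 channel values into a byte[] in rgbargba... order.
    // The (byte) cast makes values 128-255 negative as Java bytes, but the
    // underlying bit pattern is exactly what GL_UNSIGNED_BYTE expects.
    static byte[] toByteArray(int[][] r, int[][] g, int[][] b, int[][] a,
                              int width, int height) {
        byte[] colors = new byte[width * height * 4];
        int i = 0;
        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                colors[i]     = (byte) r[x][y];
                colors[i + 1] = (byte) g[x][y];
                colors[i + 2] = (byte) b[x][y];
                colors[i + 3] = (byte) a[x][y];
                i += 4;
            }
        }
        return colors;
    }

    public static void main(String[] args) {
        int[][] ch = {{255}}; // 1x1 texture with every channel at 255
        byte[] data = toByteArray(ch, ch, ch, ch, 1, 1);

        ByteBuffer buffer = ByteBuffer.allocateDirect(data.length).put(data);
        buffer.rewind();

        System.out.println(Byte.toUnsignedInt(data[0])); // reads back as 255 unsigned
        // With a GL context bound, the upload would then be:
        // glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
        //              GL_RGBA, GL_UNSIGNED_BYTE, buffer);
    }
}
```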
