
(LWJGL3) OpenGL 2D Texture Array stays empty after uploading image data with glTexSubImage3D

So I'm currently trying to replace my old texture atlas stitcher with a 2D texture array to make life simpler with anisotropic filtering and greedy meshing later on.

I'm loading the png files with stb, and I know the buffers are filled properly because if I export every single layer of the soon-to-be atlas right before uploading it, it comes out as the correct png file.
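A minimal sketch of such a debug dump using LWJGL's org.lwjgl.stb.STBImageWrite binding (the helper name and the tightly packed RGBA layout are assumptions, not my actual code):

import java.nio.ByteBuffer;
import org.lwjgl.stb.STBImageWrite;

// Debug helper (sketch): writes one layer's RGBA pixel buffer back out as a png
// so it can be inspected right before the OpenGL upload. Stride is width * 4 bytes.
private static void dumpLayer(int layer, int width, int height, ByteBuffer pixels) {
    STBImageWrite.stbi_write_png("debug_layer_" + layer + ".png", width, height, 4, pixels, width * 4);
}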

My setup works like this:

I'm loading every single texture in my jar file with stb and creating an object for each that stores the width, height, layer and pixelData.

When every texture is loaded, I look for the biggest texture and scale every smaller texture up to that size, because 2D texture arrays only work if every single layer has the same dimensions.
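A minimal sketch of that upscaling step with LWJGL's stb_image_resize binding (org.lwjgl.stb.STBImageResize); the helper name and the RGBA layout are assumptions:

import java.nio.ByteBuffer;
import org.lwjgl.BufferUtils;
import org.lwjgl.stb.STBImageResize;

// Sketch: scales an RGBA pixel buffer up to the target size so that every layer
// of the array ends up with identical dimensions. A stride of 0 means the rows
// are tightly packed (width * 4 bytes).
private static ByteBuffer scaleToSize(ByteBuffer src, int srcW, int srcH, int targetSize) {
    ByteBuffer dst = BufferUtils.createByteBuffer(targetSize * targetSize * 4);
    STBImageResize.stbir_resize_uint8(src, srcW, srcH, 0, dst, targetSize, targetSize, 0, 4);
    return dst;
}

Note that stbir_resize_uint8 does a filtered resize; for crisp pixel-art textures a plain nearest-neighbour copy may be the better choice.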

Then I initialize the 2d texture array like this:

public void init(int layerCount, boolean supportsAlpha, int textureSize) {
    this.textureId = glGenTextures();
    this.maxLayer = layerCount;

    int internalFormat = supportsAlpha ? GL_RGBA8 : GL_RGB8;
    this.format = supportsAlpha ? GL_RGBA : GL_RGB;

    glBindTexture(GL_TEXTURE_2D_ARRAY, this.textureId);
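    // Allocate storage for every layer in one call; the trailing 0 is a NULL pointer,
    // so no pixel data is uploaded here - the layers are filled later with glTexSubImage3D.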
    glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, internalFormat, textureSize, textureSize, layerCount, 0, this.format, GL_UNSIGNED_BYTE, 0);
}
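As a side note (this is not part of the code above), the requested size can be checked against the driver limits right before that allocation; a small sketch:

// Optional sanity check (sketch): make sure the requested array fits the driver limits.
int maxLayers = glGetInteger(GL_MAX_ARRAY_TEXTURE_LAYERS);
int maxSize = glGetInteger(GL_MAX_TEXTURE_SIZE);
if (layerCount > maxLayers || textureSize > maxSize) {
    LOGGER.error("Requested {}x{} array with {} layers exceeds driver limits ({} / {}).",
            textureSize, textureSize, layerCount, maxSize, maxLayers);
}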

After that I go through my map of textureLayer objects and upload every single one of them like this:

public void upload(ITextureLayer textureLayer) {
    if (textureLayer.getLayer() >= this.maxLayer) {
        LOGGER.error("Tried uploading a texture with a too big layer.");
        return;
    } else if (this.textureId == 0) {
        LOGGER.error("Tried uploading texture layer to uninitialized texture array.");
        return;
    }

    glBindTexture(GL_TEXTURE_2D_ARRAY, this.textureId);

    // Tell openGL how to unpack the RGBA bytes
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);


    // Tell openGL to not blur the texture when it is stretched
    glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    // Upload the texture data
    glTexSubImage3D(GL_TEXTURE_2D_ARRAY, 0, 0, 0, textureLayer.getLayer(), textureLayer.getWidth(), textureLayer.getHeight(), 0, this.format, GL_UNSIGNED_BYTE, textureLayer.getPixels());

    int errorCode = glGetError();
    if (errorCode != 0) LOGGER.error("Error while uploading texture layer {} to graphics card. {}", textureLayer.getLayer(), GLHelper.errorToString(errorCode));
}
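For completeness, the surrounding loop is essentially just this (a sketch; textureLayers and textureArray are placeholder names):

// Upload every loaded layer into its slot of the array texture.
for (ITextureLayer layer : textureLayers.values()) {
    textureArray.upload(layer);
}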

The error code for every single one of my layers is 0, so I assume everything went well. But when I debug the game with RenderDoc, I can see that every bit on every single layer is 0, so it's just a transparent texture with the correct width and height.

I can't figure out what I'm doing wrong, since OpenGL tells me everything went well. It is important to me to stay on OpenGL 3.3 and lower, since I want the game to be playable on older PCs as well, so pre-allocating memory with glTexStorage3D is not an option.
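In case it helps narrow things down, the uploaded data can also be read back on the CPU right after the uploads, in addition to the RenderDoc capture (a debugging sketch; buffer sizing assumes RGBA):

// Debug readback (sketch): fetch mip level 0 of the whole array (all layers) from
// the GPU and check whether any non-zero bytes actually arrived.
ByteBuffer readback = BufferUtils.createByteBuffer(textureSize * textureSize * 4 * layerCount);
glBindTexture(GL_TEXTURE_2D_ARRAY, textureId);
glGetTexImage(GL_TEXTURE_2D_ARRAY, 0, GL_RGBA, GL_UNSIGNED_BYTE, readback);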

The 8th parameter (depth) of glTexSubImage3D should be 1.
Note that the size of the layer is textureLayer.getWidth(), textureLayer.getHeight(), 1:

glTexSubImage3D(
    GL_TEXTURE_2D_ARRAY, 0, 0, 0, textureLayer.getLayer(),
    textureLayer.getWidth(), textureLayer.getHeight(), 1,    // depth is 1
    this.format, GL_UNSIGNED_BYTE, textureLayer.getPixels());

It is not an error to pass a width, height, or depth of 0 to glTexSubImage3D, but it won't have any effect on the texture object's data store.
