GL_INVALID_VALUE on glTextureStorage3D with cubemap arrays

I'm getting a GL_INVALID_VALUE error when calling glTextureStorage3D on a cubemap array texture. My code is quite abstracted, but dumping the raw GL calls from texture creation up to the error boils down to this:

[SafeGL DUMP] glCreateTextures(glTargets[uint32_t(type)], 1, &id_) // glTargets[uint32_t(type)] = GL_TEXTURE_CUBE_MAP_ARRAY
[SafeGL DUMP] glTextureParameteri(id_, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR)
[SafeGL DUMP] glTextureParameteri(id_, GL_TEXTURE_MAG_FILTER, GL_LINEAR)
[SafeGL DUMP] glTextureStorage3D(id_, mipLevels, glFormats[uint32_t(format_)], width, height, depth) // mipLevels = 1, glFormats[uint32_t(format_)] = GL_RG16F, width = 1024, height = 1024, depth = 4
/!\ GL ERROR 501: GL_INVALID_VALUE -- Breaking to debugger.

The GL spec says that "An INVALID_VALUE error is generated if width, height, or depth is negative." Except in my case none of them are...

My only guess is that I'm either hitting an undocumented, vendor-specific error (GL 4.4 context on the NVIDIA 375.70 driver), or I'm doing something completely wrong, in which case I'm at a loss as to what.

Any ideas?

EDIT: Just updated the driver to 378.66, to no avail.

The problem is that depth is not a multiple of 6. With cubemap arrays you don't deal in layers but in "layer-faces": the depth argument counts individual faces (6 per cubemap layer), so it must be the number of cubemaps times 6, and GL raises GL_INVALID_VALUE otherwise.
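A minimal sketch of the corrected allocation, assuming the intent of "depth = 4" was to allocate 4 cubemaps (numCubemaps is an illustrative name, not from the original code):

GLuint id;
glCreateTextures(GL_TEXTURE_CUBE_MAP_ARRAY, 1, &id);
glTextureParameteri(id, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTextureParameteri(id, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

const GLsizei numCubemaps = 4; // assumption: "depth = 4" meant 4 cubemaps
// depth counts layer-faces: 4 cubemaps * 6 faces = 24, a multiple of six
glTextureStorage3D(id, 1, GL_RG16F, 1024, 1024, numCubemaps * 6);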

