
OpenGL ES 2.0 wrong depth buffer bits

I downloaded Apple's GLEssentials sample code. I want to perform some experiments with the depth buffer, so first I decided to check the depth buffer bits.

I added the following code to OpenGLRenderer.m in the -initWithDefaultFBO method:

// code from sample
NSLog(@"%s %s\n", glGetString(GL_RENDERER), glGetString(GL_VERSION));

// buffer bits check
GLint depthBits;
glGetIntegerv(GL_DEPTH_BITS, &depthBits);
printf("depthBits: %d\n", depthBits);

I got the following output:

 GLEssentials[3630:112826] Apple Software Renderer OpenGL ES 2.0 APPLE-12.4.2 
 depthBits: 24

but in ES2Renderer.m I see the following line:

// using 16-bit depth buffer
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, backingWidth, backingHeight);

Why does this happen? Is it a bug?

PS: I only tested in the iOS Simulator, because I don't have an iOS device.

The spec says:

An OpenGL ES implementation may vary its allocation of internal component resolution based on any RenderbufferStorage parameter (except target), but the allocation and chosen internal format must not be a function of any other state and cannot be changed once they are established. The actual resolution in bits of each component of the allocated image can be queried with GetRenderbufferParameteriv.

So basically, an OpenGL ES implementation is allowed to choose a different bit depth from the one that was requested.

I suspect that on a real device, an actual 16-bit depth buffer would be used.
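
As a quick check, you could query the renderbuffer directly, as the spec suggests. Here is a minimal sketch, assuming the depth renderbuffer created in ES2Renderer.m is still bound to GL_RENDERBUFFER at the time of the query:

// Query the depth resolution actually allocated for the currently bound
// renderbuffer, rather than the context-wide GL_DEPTH_BITS value.
GLint actualDepthSize = 0;
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_DEPTH_SIZE, &actualDepthSize);
printf("renderbuffer depth size: %d\n", actualDepthSize);

On the simulator's software renderer this would likely report 24 if the requested GL_DEPTH_COMPONENT16 storage was promoted, while a device may report 16.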
