
iOS OpenGL ES occasionally rendering wrong vertices

I'm trying to render some contents of an app I have using OpenGL to increase the performance. Currently, I'm trying to draw simple horizontal and vertical lines on a rectangle.

This works most of the time, and this is how the proper render looks.

But occasionally, and seemingly at random, it will draw vertices starting from the origin of the screen. This is the same view controller; I just reopened it, and now the render looks wrong.

I'm using GLKView and setting up the context myself, context setup looks like this:

// [CustomView initWithFrame:] calls [OpenGLContext init]

<OpenGLContext.m>

self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
[EAGLContext setCurrentContext:self.context];
<compile shaders, etc.>

<[CustomView initWithFrame:]>
self.context = self.openGlContext.context;
self.drawableDepthFormat = GLKViewDrawableDepthFormat24;
self.drawableColorFormat = GLKViewDrawableColorFormatRGBA8888;
self.drawableStencilFormat = GLKViewDrawableStencilFormatNone;
self.drawableMultisample = GLKViewDrawableMultisample4X;
self.layer.opaque = YES;
self.opaque = NO;

That all works fine, so I proceed to generate the VBOs. Here's what my vertex object looks like:

typedef struct {
    vector_float3 position;
    packed_float4 color;
} VertexObject;
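One layout detail worth double-checking here (this is my own sanity check, not from the original post): on Apple platforms, simd's vector_float3 is 16-byte aligned and occupies 16 bytes, not 12, so the color field starts at offset 16 and the stride is 32. A minimal sketch in plain C, using stand-in types under that layout assumption:

```c
#include <stdalign.h>
#include <stddef.h>

/* Stand-ins for Apple's simd types (an assumption about their layout):
 * vector_float3 is 16-byte aligned, so it occupies 16 bytes, not 12;
 * packed_float4 is four tightly packed floats. */
typedef struct { alignas(16) float v[3]; } my_vector_float3;
typedef struct { float v[4]; } my_packed_float4;

typedef struct {
    my_vector_float3 position;
    my_packed_float4 color;
} VertexObject;

/* Because of the 16-byte alignment, color starts at offset 16 --
 * which is exactly sizeof(vector_float3), so an attribute offset of
 * (GLvoid*)sizeof(vector_float3) and a stride of sizeof(VertexObject)
 * stay consistent with each other. */
_Static_assert(sizeof(my_vector_float3) == 16, "float3 padded to 16");
_Static_assert(offsetof(VertexObject, color) == 16, "color at offset 16");
_Static_assert(sizeof(VertexObject) == 32, "stride is 32");
```

If the shader-side offsets were computed assuming a 12-byte float3, every color read would be shifted, so a check like this rules out one class of corruption.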

To manage the vertex/index data I use malloc/realloc and raw pointers, encapsulated in a custom VertexArrayBuffer Objective-C class. Before I start rendering, I create a VAO like this:

- (void)createVAO:(VertexArrayBuffer*)vao // vao has the pointers and stuff
{
    // Re-generate
    glGenVertexArraysOES(1, &vao->vao);
    glBindVertexArrayOES(vao->vao);

    glGenBuffers(1, &vao->vertexBuffer); // VertexObject *vertexBuffer
    glBindBuffer(GL_ARRAY_BUFFER, vao->vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, vao.vertexCount * sizeof(VertexObject), vao.vertices, GL_STATIC_DRAW);

    glEnableVertexAttribArray(positionSlot);
    glVertexAttribPointer(positionSlot, 3, GL_FLOAT, GL_FALSE, sizeof(VertexObject), NULL);

    glEnableVertexAttribArray(colorSlot);
    glVertexAttribPointer(colorSlot, 4, GL_FLOAT, GL_FALSE, sizeof(VertexObject), (GLvoid*)sizeof(vector_float3));

    glGenBuffers(1, &vao->indexBuffer); // GLuint *indexBuffer
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vao->indexBuffer);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, vao.indexCount * sizeof(GLuint), vao.indices, GL_STATIC_DRAW);

    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindVertexArrayOES(0);
}

Before calling glDrawElements I update the VBOs:

- (void)updateVAO:(VertexArrayBuffer*)vao
{
    glBindVertexArrayOES(vao->vao);

    glBindBuffer(GL_ARRAY_BUFFER, vao->vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, vao.vertexCount * sizeof(VertexObject), vao.vertices, GL_STATIC_DRAW);

    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vao->indexBuffer);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, vao.indexCount * sizeof(GLuint), vao.indices, GL_STATIC_DRAW);

    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindVertexArrayOES(0);
}

And finally, I draw them:

glDrawElements(GL_LINES, (GLsizei)vao.indexCount, GL_UNSIGNED_INT, NULL);

I have two of these running at once in the view controller, one for each Gantt view that needs the grid effect. The glitch can hit one or both at the same time, and sometimes the grid simply doesn't appear at all!

I've checked my struct alignments and searched for anyone with a similar issue, but haven't found anything. When I print each vertex and index with po vao.vertices[0].position, they appear to be set up correctly.
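Since out-of-range indices in the element buffer are a classic cause of lines snapping to the screen origin, one quick check before uploading is to validate the index data against the vertex count. A minimal sketch in plain C (the helper name is mine, not from the project):

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Returns true if every index refers to a valid vertex. An index >=
 * vertexCount makes the GPU read past the vertex buffer and fetch
 * garbage positions, which often renders as lines to the origin. */
static bool indicesInRange(const uint32_t *indices, size_t indexCount,
                           size_t vertexCount) {
    for (size_t i = 0; i < indexCount; i++) {
        if (indices[i] >= vertexCount)
            return false;
    }
    return true;
}
```

Calling this right before the glBufferData upload (and asserting in debug builds) cheaply rules out bad index data as the source of the glitch.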

Maybe anyone has had that issue before, or can point me into the right direction?

Can share more sources, if needed, too.

Thanks!

Welp! I found out what was wrong, and it had nothing to do with any of the snippets I posted. For the sake of my sanity, and in a therapeutic fashion, I'll explain the answer in detail.

In my shaders, I bind a 4x4 matrix that applies the transformations mapping to screen coordinates. To make things easier, I allocate it as a heap array and fill it from another 3x3 matrix holding a set of 2D transformations, then fill the rest of the diagonal with 1's and everything else with 0's.

Welp, at least I intended to fill it with 0's. Here's the culprit:

// GLfloat *renderMatrix
renderMatrix = malloc(sizeof(GLfloat) * 4 * 4); // 4x4 matrix

Down the pipeline, during rendering, I plop it into the shader:

GLint matrixSlot = self.openGlContext->transformMatrixSlot;
glUniformMatrix4fv(matrixSlot, 1, 0, renderMatrix);

You might notice something is missing: I never initialize that memory before using it. The fix:

size_t matrixL = sizeof(GLfloat) * 4 * 4;
renderMatrix = malloc(matrixL);
memset(renderMatrix, 0, matrixL); // this is needed.

This fixes all issues, and now the grids work consistently!
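For reference, here is roughly what the fixed mapping looks like in plain C (the function name and 3x3 layout are my assumptions, not the project's code): zero the 16 floats first, copy the 2D transform from the column-major 3x3, and set the remaining diagonal entries to 1. calloc works just as well as malloc + memset here.

```c
#include <stdlib.h>

/* Embeds a column-major 3x3 2D transform into a column-major 4x4
 * matrix suitable for glUniformMatrix4fv. The buffer is zero-
 * initialized first -- the step that was missing. */
static float *makeRenderMatrix(const float m3[9]) {
    float *m4 = calloc(16, sizeof(float)); /* zeroed, unlike malloc */
    if (!m4) return NULL;
    m4[0]  = m3[0]; m4[1]  = m3[1];   /* first column (x axis)  */
    m4[4]  = m3[3]; m4[5]  = m3[4];   /* second column (y axis) */
    m4[12] = m3[6]; m4[13] = m3[7];   /* translation column     */
    m4[10] = 1.0f;                    /* z passes through       */
    m4[15] = 1.0f;                    /* homogeneous w          */
    return m4;
}
```

With malloc alone, the eight untouched entries held whatever garbage was in the heap, so the matrix was only occasionally close enough to correct for the grid to render.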

Lesson: zero your memory before using it, folks.
