
Problems with drawing OpenGL objects from VBOs using textures, normals & an index list

I hope this doesn't seem like too much of a code dump, but I really have no clue as to why this doesn't work. I've tried getting errors from glGetError() and it always seems to return 0. I've tried to include only the code I think is relevant to the problem, as the rest has worked fine in most other situations I've used it in.

Anyways, code first:

This is my main render loop:

float rotate = 0.0f;
void Render(SDL_Window *window)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glLoadIdentity();

    gluLookAt(-2,-2,-10,   0,0,0,   0,1,0);

    glRotatef(rotate, 0, 1, 0);
    // Start drawing
    glUseProgramObjectARB( *shader->GetShaderProgram() );
    texture->EnableTexture( *shader->GetShaderProgram(),"tex" );
    cube->Render();
    SDL_GL_SwapWindow(window);
    rotate = rotate+1;
    glUseProgramObjectARB(0);
}

This is my object's render method (cube->Render()):

void Object3DVBO::Render()
{

    //Bind the buffers and tell OpenGL to use the Vertex Buffer Objects (VBO's), which we already prepared earlier
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, vboID); 
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, texcoordsID);
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, normalID);
    glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, indiceID);

    //Enable states, and render (as if using vertex arrays directly)
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glEnableClientState(GL_NORMAL_ARRAY);

    glTexCoordPointer(2, GL_FLOAT, 0, 0);
    glNormalPointer(GL_FLOAT, 0, 0);
    glVertexPointer(3, GL_FLOAT, 0,  0);

    if(textureid > 0) {
        glEnable(GL_TEXTURE_2D);      // Turn on Texturing
        //glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
        glBindTexture(GL_TEXTURE_2D, textureid);
    }

    //Draw the thing!
    glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_BYTE, 0);
    //restore the GL state back
    glDisableClientState(GL_NORMAL_ARRAY);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
    if(textureid > 0) {
        glDisable(GL_TEXTURE_2D);      // Turn off Texturing
        glBindTexture(GL_TEXTURE_2D, textureid);
    }

    glBindBufferARB(GL_ARRAY_BUFFER_ARB, 0); //Restore non VBO mode
    glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, 0);
};

This is the code for initializing the VBOs:

void Object3DVBO::SetVBO()
{
    // Vertices:
    glGenBuffersARB(1, &vboID);
    glBindBufferARB( GL_ARRAY_BUFFER_ARB, vboID);
    glBufferDataARB( GL_ARRAY_BUFFER_ARB, sizeof(GLfloat)*vertices.size(), &vertices, GL_STATIC_DRAW_ARB );

    // Indices:
    glGenBuffersARB(1, &indiceID);
    glBindBufferARB( GL_ELEMENT_ARRAY_BUFFER_ARB, indiceID);
    glBufferDataARB( GL_ELEMENT_ARRAY_BUFFER_ARB, sizeof(GLubyte)*indices.size(), &indices, GL_STATIC_DRAW_ARB );

    // Normals:
    glGenBuffersARB(1, &normalID);
    glBindBufferARB( GL_ARRAY_BUFFER_ARB, normalID);
    glBufferDataARB( GL_ARRAY_BUFFER_ARB, sizeof(GLfloat)*normals.size(), &normals, GL_STATIC_DRAW_ARB );

    // Texture coordinates:
    glGenBuffersARB(1, &texcoordsID);
    glBindBufferARB( GL_ARRAY_BUFFER_ARB, texcoordsID);
    glBufferDataARB( GL_ARRAY_BUFFER_ARB, sizeof(GLfloat)*texCoords.size(), &texCoords, GL_STATIC_DRAW_ARB );

    vertices.clear();
    normals.clear();
    texCoords.clear();
    indices.clear();
}

and then finally the simplest model I've made, in a text file:

mesh:
-0.5 -0.5 0.0
0.5 -0.5 0.0
0.5 0.5 0.0
-0.5 0.5 0.0
normals:
0 0 -1
0 0 -1
0 0 -1
0 0 -1
texcoords:
0.0 0
1.0 0
1.0 1.0
0.0 1.0
indices:
0 0 0 1 1 1 2 2 2 0 0 0 2 2 2 3 3 3
end:

What's outside this is the code that reads the text file and stores its contents into vectors. Here is the header file for the object:

class Object3DVBO
{
public:
    Object3DVBO(const char* objectfilename, GLuint textureid);
    ~Object3DVBO();
    void Render();
private:
    GLuint  vboID, 
            texcoordsID, 
            normalID, 
            textureid,
            indiceID;
    int     vertCount,
            indexCount;
    std::vector<GLfloat> vertices;
    std::vector<GLfloat> normals;
    std::vector<GLfloat> texCoords;
    std::vector<GLubyte> indices;

    void ConvertToReadable( std::vector<GLfloat> v, std::vector<GLfloat> n, std::vector<GLfloat> tc, std::vector<GLint> i);
    void ReadObjectData( const char* objectfilename);
    void SetVBO();
};

The other parts basically do the same for a shader pair & a texture, and these seem to work fine (the shader is only version 120 for now, so it uses ftransform() etc.).

The problem I have is that when I use this code and draw with glDrawElements using indices, I get a black screen (no errors anywhere), but if I switch to glDrawArrays it works (or at least I'm seeing SOMETHING). I've read a lot of tutorials, examples and other SO posts trying to find what I'm doing wrong, but none of the solutions have made any difference so far.

I'm doing this for educational purposes, so I really need to use glDrawElements & indices. Any help would be greatly appreciated!

P.S. If anyone is wondering about the SDL version, it's SDL2.

Here is the crux of your problem:

//Bind the buffers and tell OpenGL to use the Vertex Buffer Objects (VBO's), which we already prepared earlier
glBindBufferARB(GL_ARRAY_BUFFER_ARB, vboID);          // 1
glBindBufferARB(GL_ARRAY_BUFFER_ARB, texcoordsID);    // 2
glBindBufferARB(GL_ARRAY_BUFFER_ARB, normalID);       // 3
glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, indiceID);

//Enable states, and render (as if using vertex arrays directly)
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);

glTexCoordPointer(2, GL_FLOAT, 0, 0);                 // 2
glNormalPointer(GL_FLOAT, 0, 0);                      // 3
glVertexPointer(3, GL_FLOAT, 0,  0);                  // 1

The lines labeled 1, 2 and 3 need to come in pairs. That is to say, since you can only have one VBO bound at a time and it provides the context for a call to glTexCoordPointer (...) for instance, you need to set the pointers while the appropriate VBO is bound.

You can fix it like this:

//Bind the buffers and tell OpenGL to use the Vertex Buffer Objects (VBO's), which we already prepared earlier
glBindBufferARB(GL_ARRAY_BUFFER_ARB, vboID);          // 1
glVertexPointer(3, GL_FLOAT, 0,  0);                  // 1

glBindBufferARB(GL_ARRAY_BUFFER_ARB, texcoordsID);    // 2
glTexCoordPointer(2, GL_FLOAT, 0, 0);                 // 2

glBindBufferARB(GL_ARRAY_BUFFER_ARB, normalID);       // 3
glNormalPointer(GL_FLOAT, 0, 0);                      // 3


glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, indiceID);

//Enable states, and render (as if using vertex arrays directly)
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);

You have one additional issue, which is not really an error so much as a performance concern. GL_UNSIGNED_BYTE is not a hardware supported vertex index type. The driver must convert your index array to a 16-bit type for the hardware to use it, so there is no actual benefit to using 8-bit indices when you are going to store them in a VBO. Most OpenGL profilers and drivers with debug output enabled will generate a performance warning if you try to do this.
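
If you do switch to 16-bit indices, the change is small. Below is a minimal sketch that reuses the field names from the header posted above (indices, indiceID, indexCount); treat it as illustrative under those assumptions rather than a drop-in replacement:

// Sketch only: store indices as 16-bit values instead of 8-bit ones.
std::vector<GLushort> indices;                 // was std::vector<GLubyte>

// Upload with the matching element size and a pointer to the actual data:
glGenBuffersARB(1, &indiceID);
glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, indiceID);
glBufferDataARB(GL_ELEMENT_ARRAY_BUFFER_ARB,
                sizeof(GLushort) * indices.size(),
                indices.data(),
                GL_STATIC_DRAW_ARB);

// ...and draw with the matching index type:
glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_SHORT, 0);

The only invariant is that the index type passed to glDrawElements matches the type you actually stored in the element array buffer.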
