
Texture rendering and VBOs [OpenGL/SDL/C++]

So, I've been working on a little game project for a bit and I've hit a snag that's annoying me to no end. I load an obj file, which then gets rendered after being put into a VBO. This part works fine, no problemo. However, I've been trying to get it to render the accompanying texture with the supplied UVs, with no success. Currently, I just get a matte green colouration on my model. Upon investigating it in GDE, I've seen that the texture gets loaded fine and occupies the GL_TEXTURE0 unit, so that's not the issue. I believe it may be my binding, but I have no idea why this would fail...

void Model_Man::render_models()
{
    for(int x=0; x<models.size(); x++)
    {
        if(models.at(x).visible==true)
        {
            glBindBuffer(GL_ARRAY_BUFFER,models.at(x).t_buff);
            glBindBuffer(GL_ELEMENT_ARRAY_BUFFER,models.at(x).i_buff);

            glEnableClientState(GL_VERTEX_ARRAY);
            glVertexPointer(3, GL_FLOAT,0,0);

            glClientActiveTexture(GL_TEXTURE0);

            glTexCoordPointer(2,GL_FLOAT,0,&models.at(x).uvs[0]);
            glEnableClientState(GL_TEXTURE_COORD_ARRAY);

            glActiveTexture(GL_TEXTURE0);
            int tex_loc = glGetUniformLocation(models.at(x).shaderid,"color_texture");
            glUniform1i(tex_loc,GL_TEXTURE0);
            glEnable(GL_TEXTURE_2D);
            glBindTexture(GL_TEXTURE_2D, models.at(x).mats.at(0).texid);

            c_render.use_program(models.at(x).shaderid);
            glDrawElements(GL_TRIANGLES,models.at(x).f_index.size()*3,GL_UNSIGNED_INT,0);
            c_render.use_program();
            glBindBuffer(GL_ARRAY_BUFFER, 0);
            glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);

            glDisableClientState(GL_VERTEX_ARRAY);
            glDisableClientState(GL_TEXTURE_COORD_ARRAY);
            glDisable(GL_TEXTURE_2D);
        }
    }
}

And my shader files...

Shader.frag

uniform sampler2D color_texture;
void main() {
    // Set the output color of our current pixel
    gl_FragColor = texture2D(color_texture, gl_TexCoord[0].st);
}

Shader.vert

void main() {           
    gl_TexCoord[0] = gl_MultiTexCoord0;

    // Set the position of the current vertex 
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

And yes, I know I'm currently being horribly inefficient with my render loop :P but I'm already planning on refactoring it; I'm just attempting to get this single model to draw correctly with everything I'm aiming to do. I have no clue why it wouldn't be rendering with the texture correctly applied - unless it's because I need to interleave my arrays, but I'm still supplying it with UV data, so I don't see why it fails.

The call that sets the sampler uniform should not pass GL_TEXTURE0, but the texture unit index, which is actually 0.

Indeed:

glUniform1i(location, 0)

To set up a sampler uniform, do:

glUseProgram(progId);
// ...
glActiveTexture(GL_TEXTURE0 + texUnit);   // select the texture unit
glBindTexture(GL_TEXTURE_2D, texId);      // bind the texture to that unit
glUniform1i(location, texUnit);           // tell the sampler which unit to sample

The main concept is that uniform variables are part of the shader program's state (a uniform value is maintained until you re-link the program or set the uniform again). Without binding a program, glUniform1i will fail, since there is no shader program on which it could set the uniform value!
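For example (a minimal sketch reusing the names from the question; it is not part of the original code), the difference shows up directly in glGetError:

GLint tex_loc = glGetUniformLocation(models.at(x).shaderid, "color_texture");

glUseProgram(0);
glUniform1i(tex_loc, 0);                  // no program bound
// glGetError() now returns GL_INVALID_OPERATION

glUseProgram(models.at(x).shaderid);
glUniform1i(tex_loc, 0);                  // program bound: the sampler now reads unit 0
// glGetError() now returns GL_NO_ERROR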


As general advice, call glGetError after each OpenGL call to detect these conditions. Most of those calls can be removed by the preprocessor in release builds.
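For instance, a debug-only error-checking macro is one way to do that (a sketch; the GL_CHECK name and layout are my own, not from the original code):

#include <cstdio>

#ifndef NDEBUG
#define GL_CHECK(call) \
    do { \
        call; \
        GLenum gl_check_err = glGetError(); \
        if (gl_check_err != GL_NO_ERROR) \
            std::fprintf(stderr, "GL error 0x%04X at %s:%d (%s)\n", \
                         gl_check_err, __FILE__, __LINE__, #call); \
    } while (0)
#else
#define GL_CHECK(call) call   // compiles away to the bare call in release builds
#endif

// Usage:
// GL_CHECK(glUniform1i(tex_loc, 0));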

Well, I found out that the big issue was that while I was binding a texture, I wasn't actually setting it up in a way that made it understood as being used. Calling glClientActiveTexture(GL_TEXTURE0 + texUnit) in combination with glActiveTexture() ended up being the final solution.
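Putting that together with the answer above, the texture-related part of the loop ends up looking roughly like this (a sketch only; texUnit and tex_loc are illustrative names, and the rest of the buffer/state setup stays as in the question):

const int texUnit = 0;                                        // unit the sampler will read from

glClientActiveTexture(GL_TEXTURE0 + texUnit);                 // unit for the texcoord array
glTexCoordPointer(2, GL_FLOAT, 0, &models.at(x).uvs[0]);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);

glActiveTexture(GL_TEXTURE0 + texUnit);                       // unit for the texture object
glBindTexture(GL_TEXTURE_2D, models.at(x).mats.at(0).texid);

c_render.use_program(models.at(x).shaderid);                  // bind the program before setting the uniform
GLint tex_loc = glGetUniformLocation(models.at(x).shaderid, "color_texture");
glUniform1i(tex_loc, texUnit);                                // pass the unit index, not GL_TEXTURE0

glDrawElements(GL_TRIANGLES, models.at(x).f_index.size()*3, GL_UNSIGNED_INT, 0);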
