
OpenGL Textures are all black in 3.3 - but work in 3.1

I'm currently working on a simple 3D scene in OpenGL 3.3, but when I try to texture the objects, all of them come out entirely black. However, if I change the context version to 3.1, the textures render correctly over the models.

I'm not sure if this suggests I'm using deprecated functionality/methods, but I'm struggling to see where the problem could be.

Setting up the texture

(load texture from file)
...
glGenTextures(1, &TexID);               // Create the texture object
glBindTexture(GL_TEXTURE_2D, TexID);

glTexImage2D(GL_TEXTURE_2D, 0, texture_bpp / 8, texture_width, texture_height, 0, texture_type, GL_UNSIGNED_BYTE, texture_imageData);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
...

Binding the Texture to Render

// mLocation is the layout location in the shader, from glGetUniformLocation
// mTextureUnit is the specified texture unit to load into. Currently using 0.
// mTextureID is the ID of the loaded texture, as generated above.
glActiveTexture( GL_TEXTURE0 + mData.mTextureUnit );
glBindTexture( GL_TEXTURE_2D, mData.mTextureID );
glUniform1i( mLocation, mData.mTextureUnit );
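
For completeness, the surrounding per-draw calls look roughly like this (mProgram, mVAO and mIndexCount are placeholders for parts not shown above; the sampler uniform has to be set while the program is in use):

glUseProgram( mProgram );                              // bind the shader program first
glActiveTexture( GL_TEXTURE0 + mData.mTextureUnit );   // select texture unit 0
glBindTexture( GL_TEXTURE_2D, mData.mTextureID );      // attach the loaded texture to that unit
glUniform1i( mLocation, mData.mTextureUnit );          // tell the sampler2D which unit to read
glBindVertexArray( mVAO );                             // mesh state for the model being drawn
glDrawElements( GL_TRIANGLES, mIndexCount, GL_UNSIGNED_INT, 0 );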

Fragment Shader

uniform sampler2D diffusemap;
in vec2 passUV;
out vec3 outColour;
...
outColour = texture( diffusemap, passUV ).rgb;

All textures being used are square, with power-of-two dimensions.


Images showing the problem.

GL3.1: http://i.imgur.com/NUgj6vA.png

GL3.3: http://i.imgur.com/oOc0jcd.png


Vertex Shader

#version 330 core

uniform mat4 p;
uniform mat4 v;
uniform mat4 m;

in vec3 vertex;
in vec3 normal;
in vec2 uv;

out vec3 passVertex;
out vec3 passNormal;
out vec2 passUV;

void main( void )
{
    gl_Position = p * v * m * vec4( vertex, 1.0 );

    passVertex = vec3( m * vec4( vertex, 1.0 ) );
    passNormal = vec3( m * vec4( normal, 1.0 ) );
    passUV = uv;
}

In the line:

glTexImage2D(GL_TEXTURE_2D, 0, texture_bpp / 8, texture_width, texture_height, 0, texture_type, GL_UNSIGNED_BYTE, texture_imageData);

The assumption that (texture_bpp / 8) yields a valid internal format is incorrect. The third parameter should be one of the GLenum values that specify the internal format, such as GL_RGBA.

Correcting it to GL_RGB (or whichever format matches the data of the texture file) fixes the issue entirely, and the code then works on both GL 3.3 and GL 3.1:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texture_width, texture_height, 0, texture_type, GL_UNSIGNED_BYTE, texture_imageData);

For the sake of completeness: the internal format of a texture should be an enumerator, and one of the sized enumerators at that, not one of the unsized ones. Please stop using GL_RGB when you can use GL_RGB8.
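
Put together, a sketch of the format selection might look like this (the 3-vs-4-channel mapping is an assumption about how texture_bpp and texture_imageData were loaded; adjust it to the actual image loader):

// Derive a sized internal format and a matching pixel-transfer format from the
// bytes per pixel of the loaded image (assumed: 3 bytes = RGB, 4 bytes = RGBA).
GLenum internalFormat = (texture_bpp / 8 == 4) ? GL_RGBA8 : GL_RGB8;
GLenum pixelFormat    = (texture_bpp / 8 == 4) ? GL_RGBA  : GL_RGB;

glTexImage2D(GL_TEXTURE_2D, 0, internalFormat,
             texture_width, texture_height, 0,
             pixelFormat, GL_UNSIGNED_BYTE, texture_imageData);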

The answer correctly identifies the issue, but it would be helpful to have it explained why the previous assumption works on 3.1 and not on 3.3.

The ability to use a number in the range [1, 4] as the internal format was deprecated in OpenGL 3.0 and removed in OpenGL 3.1. However, at that time there wasn't a way to say "give me the actual core profile of OpenGL version 3.1"; the WGL/GLX_CONTEXT_CORE_PROFILE_BIT_ARB bits didn't exist. Therefore, when you got a 3.1 context, it was perfectly legal for an implementation to export the ARB_compatibility extension, which still allowed all of the removed functionality.

In 3.2, the ability to explicitly select a profile was added to OpenGL; from that point on, an implementation will not expose ARB_compatibility in a core profile context. That's why your code works when you ask for 3.1 (the implementation is free to give you 3.1 with the compatibility functionality) but not when you ask for a 3.3 core profile.
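
To make the difference concrete, this is roughly where the two code paths diverge at context creation, assuming a GLFW-style setup (the question doesn't say which windowing library is used, so the calls here are illustrative only):

// Asking for 3.1: no profile can be requested, so the driver is free to
// expose ARB_compatibility and keep accepting internalformat = 3.
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 1);

// Asking for 3.3 core: the removed functionality is gone, so the
// glTexImage2D call with internalformat = 3 fails, level 0 is never
// defined, and sampling the incomplete texture returns black.
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);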
