
gl_FragDepth breaks color

It seems that I can't use gl_FragDepth on my computer. The program otherwise works: there are no GLSL errors and glGetError returns 0, but I can't write to the depth buffer from my fragment shader.

Besides, writing to gl_FragDepth also changes the red component of the pixel color.

Here is a simplified version of my program. I pruned all the unnecessary stuff (I guess?), and it doesn't behave any better:

#include <stdio.h>
#include <GL/glew.h>
#include <GLFW/glfw3.h>

int        main(void)
{
//  These are custom structures for handling shader programs.
    t_glprog                prog;
    t_glprog                prog2;

    GLuint                  vbo;
    GLFWwindow              *window;
    static const GLfloat    vertab[] =
    {
        -1.0, -1.0, 1.0,
        1.0, -1.0, 1.0,
        1.0, 1.0, 1.0,
        -1.0, 1.0, 1.0
    };

    char const *vert =
        "attribute vec3 Coord;"
        "void main() {\
        gl_Position = vec4(Coord, 1.0);\
        }";

    char const *frag1 =
        "void main () {\
        gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);\
        gl_FragDepth = sin(gl_FragCoord.x * 0.1);\
        }";

    char const *frag2 =
        "void main () {\
        gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\
        gl_FragDepth = cos(gl_FragCoord.x * 0.1);\
        }";

    if (!glfwInit())
    {
        fprintf(stderr, "GLFW failed to init.\n");
        return (-1);
    }

    glfwWindowHint(GLFW_DEPTH_BITS, 64);
    window = glfwCreateWindow(640, 480, "TEST", NULL, NULL);
    if (window == NULL)
    {
        fprintf( stderr, "Failed to open GLFW window.\n" );
        glfwTerminate();
        return (-1);
    }
    glfwMakeContextCurrent(window);

//  For Windows.
    if (glewInit() != GLEW_OK)
    {
        fprintf(stderr, "Failed to initialize GLEW\n");
        return (-1);
    }

    glfwSetInputMode(window, GLFW_STICKY_KEYS, GL_TRUE);

    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LESS);
    glClearDepth(1.0);
    glViewport(0, 0, 640, 480);

    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertab), vertab, GL_STATIC_DRAW);

    create_shaders_prog(&prog, vert, frag1);
    create_shaders_prog(&prog2, vert, frag2);

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);

    glUseProgram(prog.prog);
    glDrawArrays(GL_QUADS, 0, 4);

    glUseProgram(prog2.prog);
    glDrawArrays(GL_QUADS, 0, 4);

    glFlush();
    glfwSwapBuffers(window);

    while (glfwGetKey(window, GLFW_KEY_ESCAPE) != GLFW_PRESS &&
            glfwWindowShouldClose(window) == 0)
    {
        glfwPollEvents();
    }
    return (0);
}

It's supposed to draw red and green stripes, but instead I get blurred red lines. If I remove the second draw call, the result is the same. That's on Windows; on OSX it works as expected.

Here are some specs from glGetString:

    GL_VENDOR : Intel
    GL_RENDERER : Mobile Intel(R) 4 Series Express Chipset Family
    GL_VERSION : 2.1.0 - Build 8.15.10.1892
    GL_SHADING_LANGUAGE_VERSION : 1.20 - Intel Build 8.15.10.1892
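
For what it's worth, this is roughly how I verify that there are no GLSL errors (a minimal sketch; `check_shader` and `check_program` are illustrative helpers, not part of the program above):

```c
#include <stdio.h>
#include <GL/glew.h>

/* Print the info log if compilation failed.
   Returns GL_TRUE on success, GL_FALSE otherwise. */
static GLint    check_shader(GLuint shader)
{
    GLint   ok;
    char    log[1024];

    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (ok != GL_TRUE)
    {
        glGetShaderInfoLog(shader, sizeof(log), NULL, log);
        fprintf(stderr, "shader compile error:\n%s\n", log);
    }
    return (ok);
}

/* Same idea for program linking. */
static GLint    check_program(GLuint prog)
{
    GLint   ok;
    char    log[1024];

    glGetProgramiv(prog, GL_LINK_STATUS, &ok);
    if (ok != GL_TRUE)
    {
        glGetProgramInfoLog(prog, sizeof(log), NULL, log);
        fprintf(stderr, "program link error:\n%s\n", log);
    }
    return (ok);
}
```

Both checks pass for the two programs above, which is why I believe the shaders themselves are fine.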

Is it possible that your integrated graphics card driver is choking on this line?

glfwWindowHint(GLFW_DEPTH_BITS, 64);

64 bits is an awful lot for a depth buffer; 24 bits is a much more typical value. Note that GLFW treats framebuffer hints like GLFW_DEPTH_BITS as soft constraints, so context creation may silently succeed with a different depth buffer than the one you asked for, and I've seen strange behavior from some OpenGL drivers when the depth buffer isn't set up the way the application expects.
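
A more conservative setup would request 24 depth bits and then check what the driver actually granted. A sketch of the two relevant spots in your program (on a 2.1 context, GL_DEPTH_BITS is still a legal glGetIntegerv query):

```c
/* Before glfwCreateWindow: ask for a typical 24-bit depth buffer. */
glfwWindowHint(GLFW_DEPTH_BITS, 24);

/* After glfwMakeContextCurrent and glewInit: see what we really got. */
GLint   depth_bits = 0;
glGetIntegerv(GL_DEPTH_BITS, &depth_bits);
fprintf(stderr, "depth bits granted: %d\n", depth_bits);
```

If that prints 0, the context has no depth buffer at all, which would explain both the broken depth test and the driver doing something odd with gl_FragDepth.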
