
Does OpenGL eliminate a vertex shader with no bound vertex buffer?

The OpenGL SuperBible, 6th Edition (covering OpenGL 4.3), has samples that use vertex shaders with no vertex attribute inputs; the vertices are hard-coded instead, e.g.

#version 420 core
void main(void)
{
    const vec4 vertices[] = vec4[](vec4( 0.4, -0.4, 0.5, 1.0),
                                   vec4(-0.4, -0.4, 0.5, 1.0),
                                   vec4( 0.4,  0.4, 0.5, 1.0));
    gl_Position = vertices[gl_VertexID];
}

When I run the samples, the window is cleared but nothing else happens.

By experimenting I discovered that when an empty vertex buffer is bound to the context, the program runs as expected. Below is the sample program:

#include <stdio.h>
#include <stdlib.h>
#include <GL/glew.h>
#include <GLFW/glfw3.h> 
#define GLM_MESSAGES
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <glm/gtc/type_ptr.hpp>
#include <glm/gtx/vector_angle.hpp>

int main(int argc, char* argv[]) {
    GLuint program;
    GLuint vao;
    GLuint vbo;

    glfwInit();
    GLFWwindow* window = glfwCreateWindow(640, 480, "gl_InstanceID", NULL, NULL);
    glfwMakeContextCurrent(window);

    glewInit();
    program = glCreateProgram();
    // load() is a helper (not shown) that reads, compiles and returns a shader
    GLuint vs = load("vertex.glsl", GL_VERTEX_SHADER, true);
    GLuint fs = load("frag.glsl", GL_FRAGMENT_SHADER, true);
    glAttachShader(program, vs);
    glAttachShader(program, fs);
    glLinkProgram(program); 
    glDeleteShader(vs);
    glDeleteShader(fs);

    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);

//**************************************
// no triangle is drawn when the lines below are commented out
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, 4, NULL, GL_DYNAMIC_COPY);
    glVertexAttribPointer(0, 4, GL_FLOAT, GL_FALSE, 0, NULL);
    glEnableVertexAttribArray(0);
//***************************************

    do {
        static const GLfloat green[] = { 0.0f, 0.2f, 0.0f, 1.0f };
        glClearBufferfv(GL_COLOR, 0, green);
        glUseProgram(program);
        glDrawArrays(GL_TRIANGLES, 0, 3); //OR glDrawArraysInstanced(GL_TRIANGLES, 0, 3,1);
        glfwSwapBuffers(window);

        glfwPollEvents();
        if (GLFW_PRESS == glfwGetKey(window, GLFW_KEY_ESCAPE)) {
            glfwSetWindowShouldClose(window, 1);
        }
    } while (!glfwWindowShouldClose(window));
    glfwTerminate();
    return 0;
}
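The load() calls above use a shader-loading helper that is not shown. A minimal sketch of the file-reading half of such a helper follows (readTextFile is a hypothetical name; the GL compile step is sketched only in comments because it needs a live context):

```cpp
#include <fstream>
#include <sstream>
#include <stdexcept>
#include <string>

// Read an entire text file into a string; a shader-loading helper like
// load() would pass the result to glShaderSource()/glCompileShader().
// (Hypothetical helper -- the question's actual load() is not shown.)
std::string readTextFile(const std::string& path)
{
    std::ifstream in(path);
    if (!in)
        throw std::runtime_error("cannot open " + path);
    std::ostringstream ss;
    ss << in.rdbuf();           // slurp the whole file
    return ss.str();
}

// The GL half of load() would then look roughly like:
//   GLuint shader = glCreateShader(type);        // e.g. GL_VERTEX_SHADER
//   const char* src = source.c_str();
//   glShaderSource(shader, 1, &src, NULL);
//   glCompileShader(shader);
//   // check GL_COMPILE_STATUS / glGetShaderInfoLog before returning
```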

With the code between the stars, the triangle is drawn; without the buffer, no triangle is drawn.

The vertex array object (VAO), on the other hand, makes no difference: with the (empty) VBO set up, the triangle is drawn whether or not a VAO is used; without the VBO, no triangle is drawn either way. Only the buffer seems to matter.

Also note the vertex shader has no inputs.

What is happening?

  1. Does the compiler / graphics card think the vertex shader does nothing and remove it? I have a feeling it's the graphics card (driver).
  2. Is this expected OpenGL behaviour, or is it specific to the graphics card?
  3. Is my understanding of buffers, VAOs and shaders missing something?

Other information that might be useful:

  • GLEW 1.10.0
  • GLFW 3.0.3

Fragment shader:

#version 420 core
out vec4 color;
void main(void)
{
    color = vec4(1.0);
}

OS details:

  • Linux: 3.2.0-4-amd64
  • Distro: Debian 7.3 Wheezy
  • uname -m: x86_64

Compiler details:

  • g++ --version: g++ (Debian 4.7.2-5) 4.7.2

OpenGL details:

  • OpenGL Provider: Advanced Micro Devices (from AMD Catalyst Control Center)
  • OpenGL Renderer: AMD Radeon HD 7600M Series (from AMD Catalyst Control Center)
  • OpenGL Version: 4.2.11762 Compatibility Profile Context (from AMD Catalyst Control Center)
  • From glxinfo:
  • server glx version string: 1.4
  • client glx version string: 1.4
  • GLX version: 1.4
  • OpenGL version string: 4.2.11762 Compatibility Profile Context
  • OpenGL shading language version string: 4.20
  • server glx vendor string: ATI
  • client glx vendor string: ATI

The problem also occurs on Windows using the same graphics card as on Linux, i.e.

  • Windows 8
  • 64 bit
  • Running on a AMD Radeon HD 7670M from ATI Technologies Inc.
  • OpenGL version 4.2.11762 Compatibility Profile Context is supported

AMD has a new version, 13.12, of their Linux drivers (I am using 13.4). I'll give it a try when I have some time, because the install wasn't easy last time.

I also logged issue 984 on the Unofficial AMD Bugzilla. But perhaps it's fixed in 13.12.


However, the program works (both with and without the empty buffer) when running on

  • Windows 7 Pro
  • 64 bit
  • NVIDIA GeForce GT 520M (Driver Date: 2013/10/15, Driver Version: 9.18.13.3158)

and on

  • Windows 8
  • 64 bit
  • Running on a Intel(R) HD Graphics 4000 from Intel,
  • OpenGL version 4.0.0 - Build 9.17.10.2849

No — believe it or not, glDrawArrays(...) has nothing to do with which vertex buffer is bound. The vertex buffer binding only matters when you set up vertex pointers, since it defines the address space the pointer you pass is relative to. From that point on, the bound VBO is irrelevant.

By the way, if you run a fragment shader with no #version directive on a strict GLSL implementation, it will warn or refuse to work: the implementation is supposed to assume the shader is written against the GLSL 1.10 spec, which does not support out stage variables (you need gl_FragData[0] instead).
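The point about binding time can be sketched as follows (a fragment, not a full program; it assumes a current GL context, a linked program and a buffer vbo, as in the question's code):

```cpp
// The GL_ARRAY_BUFFER binding is read at glVertexAttribPointer() time and
// captured in the vertex array state; the binding at draw time is irrelevant.
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glVertexAttribPointer(0, 4, GL_FLOAT, GL_FALSE, 0, NULL); // captures vbo here
glEnableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, 0);  // unbind: attribute 0 still sources vbo

glDrawArrays(GL_TRIANGLES, 0, 3);  // draws fine with no VBO currently bound
```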

That, or simply add #version 420 core as you have in the vertex shader. The version directive is more important than you might think, especially since the behaviour in its absence varies wildly between vendors. Doing this probably will not fix your problem, but it is nevertheless a problem you should address.

As for the issue with removing the VAO, that is to be expected: in a core OpenGL 3.2+ context you must have a non-zero VAO bound for glDrawArrays(...) to work. VAOs effectively become the context for vertex draw commands; without one bound there is nothing for those commands to operate on.
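A minimal sketch of that requirement (a fragment; it assumes a current 3.2+ core-profile context and a bound program):

```cpp
// In a core profile, drawing with VAO 0 bound generates GL_INVALID_OPERATION,
// even for attribute-less rendering, so bind a VAO before any draw call.
GLuint vao;
glGenVertexArrays(1, &vao);
glBindVertexArray(vao);            // without this, the draw below would fail

glDrawArrays(GL_TRIANGLES, 0, 3);
```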

Keep in mind that the vertex shader only gets to manipulate existing vertices. It can't create them. Even though the position is hardcoded in the shader you still have to feed it something to transform.

No — it looks like this was a bug in the AMD Radeon HD 7600M Series driver (Catalyst 13.4).

I installed the latest AMD driver, and it solved the problem on both Windows and Linux.

Linux: installed amd-catalyst-13.12-linux-x86.x86_64.zip

Windows: installed amd_catalyst_13.11_mobility_betav9.5.exe


To re-install the AMD drivers on Linux, I first uninstalled the existing AMD installation (amd-catalyst-previous-version.run --uninstall), removed all packages whose names contain fglrx (using aptitude), adjusted the symbolic link to /usr/lib64 (see below), and then ran the new amd-catalyst-13.12-linux-x86.x86_64.run.

I came across the page AMD 13.1 64 bit drivers and the libGL.so.1 error, which explains where the AMD installer puts the libGL.so file:

The installer puts lib files in /usr/lib64. However, on Ubuntu the 64-bit libraries go in /usr/lib. I did the following to fix my problem.

  1. Uninstall the driver: sudo ./amd-driver-installer-catalyst-13.1-legacy-linux-x86.x86_64.run --uninstall
  2. Remove the /usr/lib64 folder: sudo rm -Rf /usr/lib64
  3. Create a symbolic link /usr/lib64 pointing to /usr/lib: sudo ln -s /usr/lib /usr/lib64
  4. Install the driver again: sudo ./amd-driver-installer-catalyst-13.1-legacy-linux-x86.x86_64.run --force
  5. Reboot: sudo reboot
