
OpenGL VBO Drawing

I seem to be having some trouble drawing objects in OpenGL using VBOs. I've attempted to copy example number 2 from http://www.opengl.org/wiki/VBO_-_just_examples, but I can't get a plane to appear on screen.

Vertex.h:

#include <GL/freeglut.h>

struct Vertex {
    GLfloat position[3];
    GLfloat normal[3];
    GLfloat *uvs[2];
    unsigned short uvCount;
};

Triangles.h:

#include <GL/glew.h>
#include "Vertex.h"

class Triangles {
public: 
    Triangles(GLuint program, Vertex *vertices, unsigned int vertexCount, unsigned int *indices[3], unsigned int indiceCount);
    ~Triangles();
    void Draw();

private:
    GLuint program;
    GLuint VertexVBOID;
    GLuint IndexVBOID;
    GLuint VaoID;

    unsigned int *indices[3];
    unsigned int indiceCount;
};

Triangles.cpp:

#include "Triangles.h"
#include <stdio.h>
#include <stddef.h>
#include <string.h>

Triangles::Triangles(GLuint program, Vertex *vertices, unsigned int vertexCount, unsigned int *indices[3], unsigned int indiceCount) {
    memcpy(this->indices, indices, sizeof(int) * indiceCount * 3);
    this->indiceCount = indiceCount;
    this->program = program;

    glGenVertexArrays(1, &VaoID);
    glBindVertexArray(VaoID);

    glGenBuffers(1, &VertexVBOID);
    glBindBuffer(GL_ARRAY_BUFFER, VertexVBOID);
    glBufferData(GL_ARRAY_BUFFER, sizeof(Vertex) * vertexCount, vertices, GL_STATIC_DRAW);

    GLuint attributeLocation = glGetAttribLocation(program, "position");
    glEnableVertexAttribArray(attributeLocation);
    glVertexAttribPointer(attributeLocation, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid *)(offsetof(Vertex, position)));

    attributeLocation = glGetAttribLocation(program, "normal");
    glEnableVertexAttribArray(attributeLocation);
    glVertexAttribPointer(attributeLocation, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid *)(offsetof(Vertex, normal)));

    glGenBuffers(1, &IndexVBOID);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IndexVBOID);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(unsigned int) * 3 * indiceCount, indices, GL_STATIC_DRAW);
}

Triangles::~Triangles() {
    glDisableVertexAttribArray(glGetAttribLocation(program, "position"));
    glDisableVertexAttribArray(glGetAttribLocation(program, "normal"));

    glDeleteBuffers(1, &VertexVBOID);
    glDeleteBuffers(1, &IndexVBOID);
    glDeleteVertexArrays(1, &VaoID);
}

void Triangles::Draw() {
    glBindVertexArray(VaoID);
    glDrawElements(GL_TRIANGLES, indiceCount, GL_UNSIGNED_INT, 0);
}

Excerpt from main.cpp (creating the triangle object):

Vertex vertices[4];
vertices[0].position[0] = -1;
vertices[0].position[1] = 1;
vertices[0].position[2] = 0;
vertices[0].normal[0] = 0;
vertices[0].normal[1] = 0;
vertices[0].normal[2] = 1;
vertices[0].uvCount = 0;

vertices[1].position[0] = 1;
vertices[1].position[1] = 1;
vertices[1].position[2] = 0;
vertices[1].normal[0] = 0;
vertices[1].normal[1] = 0;
vertices[1].normal[2] = 1;
vertices[1].uvCount = 0;

vertices[2].position[0] = 1;
vertices[2].position[1] = -1;
vertices[2].position[2] = 0;
vertices[2].normal[0] = 0;
vertices[2].normal[1] = 0;
vertices[2].normal[2] = 1;
vertices[2].uvCount = 0;

vertices[3].position[0] = -1;
vertices[3].position[1] = -1;
vertices[3].position[2] = 0;
vertices[3].normal[0] = 0;
vertices[3].normal[1] = 0;
vertices[3].normal[2] = 1;
vertices[3].uvCount = 0;

unsigned int **indices;
indices = new unsigned int*[2];
indices[0] = new unsigned int[3];
indices[0][0] = 0;
indices[0][1] = 1;
indices[0][2] = 2;
indices[1] = new unsigned int[3];
indices[1][0] = 2;
indices[1][1] = 3;
indices[1][2] = 0;

Triangles *t = new Triangles(program, vertices, 4, indices, 2);

createShader(GLenum, char *):

GLuint createShader(GLenum type, char *file) {
    GLuint shader = glCreateShader(type);
    const char *fileData = textFileRead(file);
    glShaderSource(shader, 1, &fileData, NULL);

    glCompileShader(shader);
    return shader;
}

Shader loading:

    GLuint v = createShader(GL_VERTEX_SHADER, "vertexShader.vert");
    GLuint f = createShader(GL_FRAGMENT_SHADER, "fragmentShader.frag");

    program = glCreateProgram();

    glAttachShader(program, v);
    glAttachShader(program, f);
    glLinkProgram(program);

    glUseProgram(program);

vertexShader.vert:

in vec3 position;
in vec3 normal;

out vec3 a_normal;

void main() {
    gl_Position = vec4(position, 1.0);
}

fragmentShader.frag:

in vec3 a_normal;

out vec4 out_color;

void main() {
    out_color = vec4(1.0, 1.0, 1.0, 1.0);
}

Please let me know if more code is needed. As a side note, everything compiles just fine; I just don't see the plane that I have constructed on screen (maybe because I didn't use colors?).

My OpenGL information is as follows:

  • Vendor: ATI Technologies Inc.
  • Renderer: ATI Radeon HD 5700 Series
  • Version: 3.2.9756 Compatibility Profile Context
  • Extensions: GL_AMDX_name_gen_delete GL_AMDX_random_access_target GL_AMDX_vertex_shader_tessellator GL_AMD_conservative_depth GL_AMD_draw_buffers_blend GL_AMD_performance_monitor GL_AMD_seamless_cubemap_per_texture GL_AMD_shader_stencil_export GL_AMD_texture …

In response to your comments:

Unfortunately, I do not do error checking.

You should always add some OpenGL error checking; it will save you from so many problems. It should look something like the following:

GLenum err = glGetError();
if (err != GL_NO_ERROR) {
    // throw an exception, log a message, or die
}
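
Since glGetError returns only one error per call and several can queue up between checks, a small helper that drains the whole queue is handy. Here is a minimal sketch (the helper name checkGLError and the fprintf reporting are illustrative, not part of the original code):

#include <GL/glew.h>
#include <stdio.h>

// Drain and report every pending OpenGL error for a given call site.
// glGetError pops one error per call, so loop until GL_NO_ERROR.
void checkGLError(const char *where) {
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR) {
        fprintf(stderr, "GL error 0x%04X after %s\n", err, where);
    }
}

Calling something like checkGLError("glBufferData") after each group of GL calls quickly narrows down which call is failing.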

I used matrix functions because I didn't realize the vertex shader would affect that. I assumed the matrix was set to the one at the top of the stack (the one I pushed before drawing).

This is an incorrect assumption. The only variable that references the matrix stack is the special (though deprecated) built-in gl_ModelViewProjectionMatrix. What you currently have there is just an unused, uninitialized matrix, which totally ignores your matrix stack.
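
If you want your own transform applied, declare a uniform in the shader instead, e.g. uniform mat4 mvp; and gl_Position = mvp * vec4(position, 1.0); in vertexShader.vert, then upload the matrix from the application. A minimal sketch of the upload side (the uniform name mvp and the identity matrix are assumptions for illustration):

// Query the uniform location; only meaningful after glLinkProgram.
GLint mvpLoc = glGetUniformLocation(program, "mvp");

// Column-major 4x4 matrix; identity here for illustration. Replace it with
// your real model-view-projection, built with your own math code or a library.
GLfloat mvpMatrix[16] = {
    1, 0, 0, 0,
    0, 1, 0, 0,
    0, 0, 1, 0,
    0, 0, 0, 1
};
glUniformMatrix4fv(mvpLoc, 1, GL_FALSE, mvpMatrix);

Note that glUniformMatrix4fv affects the currently bound program, so call it after glUseProgram(program).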

As for indices, I'm not exactly sure what you mean. I just drew the vertices on paper and decided the indices based on that.

I'm not referring to the indices of the triangles in your index buffer, but rather to the first parameter of your glVertexAttrib* calls. I suppose 'attribute location' is a more accurate term than 'index'.

glEnableVertexAttribArray(0);
glVertexAttribPointer(0, ...   //attrib location 0

glEnableVertexAttribArray(1);  
glVertexAttribPointer(1, ...   //attrib location 1

You seem to just be assuming that "0" and "1" map to "position" and "normal". This is not a safe assumption to make. You should be querying the attribute locations for "position" and "normal" with glGetAttribLocation, and then passing those values to glEnableVertexAttribArray and glVertexAttribPointer.
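
For example, something along these lines (a sketch reusing the attribute names from your shader):

// Query the locations the linker actually assigned; only meaningful after glLinkProgram.
GLint posLoc  = glGetAttribLocation(program, "position");
GLint normLoc = glGetAttribLocation(program, "normal");

// -1 means the attribute is inactive: misspelled, or optimized out by the compiler.
if (posLoc == -1 || normLoc == -1) {
    // report the error
}

glEnableVertexAttribArray(posLoc);
glVertexAttribPointer(posLoc, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                      (GLvoid *)offsetof(Vertex, position));

glEnableVertexAttribArray(normLoc);
glVertexAttribPointer(normLoc, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                      (GLvoid *)offsetof(Vertex, normal));

Be aware that an attribute the GLSL compiler considers unused (like normal in your current shaders) can legitimately come back as -1, and passing -1 on to glEnableVertexAttribArray is itself an error.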
