
OpenGL Optimization or…?

I'm making an OpenGL game, and I have a problem with optimization. When I start it, it does not respond. If in Update() I just put a for loop and _time += 0.1f, I get a blank screen.

void Update(){
    for(; ;){
        _time += 0.1f;
        Render();
    }
}

void Render() {
    glClearDepth(1.0);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    _colorProgram.use();

    GLuint timeLocation = _colorProgram.getUniformLocation("time");
    glUniform1f(timeLocation, _time);

    _sprite.Render();

    _colorProgram.unuse();

    glutSwapBuffers();
}

int main(int argc, char** argv) {
    std::printf("OpenGL version is %s",glGetString(GL_VERSION));
    // Window
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowPosition(520, 200);
    glutInitWindowSize(800, 600);
    glutInitDisplayMode(GLUT_DOUBLE);
    glutCreateWindow("OpenGL [ Shader #1 error pt 3 ]");
    // Setup GLEW
    if( GLEW_OK != glewInit()){
        return 1;
    } while( GL_NO_ERROR != glGetError() );

    // After creating window
    Init();
    glutDisplayFunc(Render);
    Update();

   glutMainLoop();
}

The infinite loop in Update() never lets GLUT pump the event queue.

Use glutTimerFunc() or glutIdleFunc() to call Update() instead. That way execution flow periodically returns to GLUT, and GLUT can do what it needs to keep the OS happy.

The proper way to run an animation with GLUT is to use a timer function. This way, GLUT can get back to its main loop and call your display function. You're not supposed to call the display function directly from your own code.

For example, register a timer function during initialization (the first argument is a time in milliseconds):

glutTimerFunc(10, timerFunc, 0);

Then in the timer function:

void timerFunc(int value) {
    _time += 0.1f;
    glutPostRedisplay();
    glutTimerFunc(10, timerFunc, 0);
}

There are two critical pieces in the code fragment above:

  • You do not call your Render() function directly. Instead, you call glutPostRedisplay() to tell GLUT that a redisplay is needed. It will then call your Render() function because you registered it as the display function with glutDisplayFunc().

  • You have to register the timer again. glutTimerFunc() fires the timer only once, not periodically. So you have to re-register it every time it fires.

There is one other problem in your code. You have these calls in your main():

glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
...
glutInitDisplayMode(GLUT_DOUBLE);

The flags passed in the second call will override the ones from the first call. While GLUT_RGBA is the default anyway, you will not get a depth buffer because GLUT_DEPTH is missing in the second call. You can simply remove the second call, since the first one is most likely what you want.
