
Is there something wrong with my cleanup code? (OpenGL + SDL)

I think I have a bug in my program. I use SDL and OpenGL to render an animation, and the program also measures its average FPS. Typically, when I run the program, it runs at around 550 FPS.

However, if I start a second instance of the program, the FPS of both drops to roughly half (around 220 FPS). The strange thing is that if I close the first instance, the second one still runs at only 220 FPS. This leads me to believe that my cleanup code is somehow flawed.

Sometimes, even if I run a single instance, it runs at only 220 FPS, probably because a previous instance failed to clean up properly. Is there something wrong with my approach below?

I use a screen class with the following constructor and destructor:

namespace gfx
{
    screen::screen(const settings& vs) : dbl_buf_(false), sdl_surface_(0)
    {
        if (SDL_Init(SDL_INIT_VIDEO) < 0)
            throw util::exception(::std::string("Unable to initialize SDL video: ") + SDL_GetError());
        if (!set(vs))
        {
            SDL_Quit();
            throw util::exception("Unable to setup initial video mode.");
        }
        glewInit();
    }

    screen::~screen()
    {
        SDL_Quit();
    }

    bool screen::set(const settings& vs)
    {
        SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

        Uint32 flags = SDL_HWSURFACE | SDL_OPENGL;
        if (vs.full_screen) flags |= SDL_FULLSCREEN;
        sdl_surface_ = SDL_SetVideoMode(vs.size_x, vs.size_y, vs.bpp, flags);
        if (!sdl_surface_) return false;

        settings_ = vs;

        int db_flag = 0;
        SDL_GL_GetAttribute(SDL_GL_DOUBLEBUFFER, &db_flag);
        dbl_buf_ = (db_flag == 1);
        return true;
    }

    // ...
}

Also:

int main()
{
    try
    {
        gfx::settings vs = {800, 600, 32, false};
        gfx::screen scr(vs);
            // main app loop, render animation using OpenGL calls
            // loop runs while running_ variable is true (see below)
    }
    // catch, etc.
    return 0;
}

If it makes any difference, I use Linux and an ATI card.

Update: Event handling code:

SDL_Event event;
while (SDL_PollEvent(&event))
{
    switch (event.type)
    {
        case SDL_KEYDOWN:
            if (event.key.keysym.sym == SDLK_ESCAPE)
                running_ = false;
            break;
        case SDL_QUIT:
            running_ = false;
            break;
        default:
            world_.process_event(event);
            break;
    }
}

When a process terminates, all the resources it used are freed automatically, and that includes the OpenGL context. What may actually be happening is that your process does not terminate: clicking the close button only hides the window while the process keeps running.
