
OpenGL: Is it supported to render into an FBO in one context, then use the FBO's texture in another?

I have the following OpenGL setup for troubleshooting frame buffer issues:

  1. I render a cube into a frame buffer.
  2. I use the target texture from this frame buffer to draw a textured quad, which displays the cube in my viewport.

This works fine when both stages of the process are done in the same context, but breaks if stage 1 is done in a different context from stage 2 (note that the two contexts are shared, i.e. in the same share group, and both on the same thread). In that case, I only ever see the cube when I resize my viewport (which recreates my frame buffer). The cube is sometimes corrupted or fragmented, which leads me to believe that all I'm seeing is whatever memory the texture held before it was resized, and that nothing is ever displayed properly.
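For reference, a minimal sketch of the working single-context version of the two stages above (drawCube(), drawTexturedQuad(), width and height are placeholder names, not code from my application; it assumes a current OpenGL 3+ context with the function pointers already available, e.g. through QOpenGLFunctions):

    // One-time setup: a colour texture plus a depth/stencil renderbuffer attached to an FBO.
    GLuint fbo = 0, sceneTexture = 0, depthRbo = 0;

    glGenTextures(1, &sceneTexture);
    glBindTexture(GL_TEXTURE_2D, sceneTexture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glGenRenderbuffers(1, &depthRbo);
    glBindRenderbuffer(GL_RENDERBUFFER, depthRbo);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH24_STENCIL8, width, height);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, sceneTexture, 0);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT,
                              GL_RENDERBUFFER, depthRbo);
    // glCheckFramebufferStatus(GL_FRAMEBUFFER) reports GL_FRAMEBUFFER_COMPLETE here.

    // Stage 1: render the cube into the FBO-backed texture.
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glViewport(0, 0, width, height);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawCube();

    // Stage 2: switch back to the default framebuffer and sample the texture on a quad.
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glBindTexture(GL_TEXTURE_2D, sceneTexture);
    drawTexturedQuad();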

The reason I have to have this setup is that in my actual application I'm using Qt OpenGL widgets, each of which is forced to use its own individual context, so I have to render my scene in its own dedicated context and then copy it to the relevant viewports using shareable OpenGL resources. If I don't do this, I get errors caused by VAOs being bound or used in contexts other than the one that created them.
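(For context: in core OpenGL, container objects such as VAOs and FBOs are never shared between contexts, even within a share group; only data objects such as textures and buffers are. The sketch below, with purely illustrative names, shows the per-context nature of VAOs versus the shared texture and buffer names:)

    // Textures and buffers live in the share group, so their names are valid in
    // every shared context; VAOs (and FBOs) are context-local and must be
    // created separately in each context that draws with them.
    GLuint sharedVbo = 0;        // created once, usable from any shared context
    GLuint sharedTexture = 0;    // likewise

    GLuint createVaoForCurrentContext()
    {
        GLuint vao = 0;
        glGenVertexArrays(1, &vao);                // valid only in the current context
        glBindVertexArray(vao);
        glBindBuffer(GL_ARRAY_BUFFER, sharedVbo);  // referencing the shared buffer is fine
        glEnableVertexAttribArray(0);
        glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, nullptr);
        glBindVertexArray(0);
        return vao;
    }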

I've tried the following combinations without success (where the primary context is the one in which I use the texture to draw the quad, and the secondary context is the one in which the "offscreen" rendering of the cube into the frame buffer takes place); the first combination is sketched in code after the list:

  • Creating the frame buffer, its render buffer and its texture all in the secondary context.
  • Creating the frame buffer and the render buffer in the secondary context, creating the texture in the primary context, and then attaching the texture to the frame buffer in the secondary context.
  • Creating the frame buffer, its render buffer and two separate textures in the secondary context. One of these textures is initially attached to the frame buffer for rendering. Once the rendering to the frame buffer is complete, the first texture is detached and the second one attached. The previously attached texture containing the content of the rendering is used with the quad in the primary context.
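To make the first combination concrete, here is roughly what it looks like (secondaryContext, primaryContext, offscreenSurface, widgetSurface and the draw helpers are placeholder names; both QOpenGLContext objects are created with the same share context):

    // Secondary context: create the FBO, renderbuffer and texture, then render the cube.
    secondaryContext->makeCurrent(offscreenSurface);
    GLuint fbo = 0, depthRbo = 0, sceneTexture = 0;
    // ... same FBO/renderbuffer/texture creation as in the single-context sketch above ...
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    drawCube();
    glBindFramebuffer(GL_FRAMEBUFFER, 0);

    // Primary context: the texture name is shared, so it can be bound here directly.
    primaryContext->makeCurrent(widgetSurface);
    glBindTexture(GL_TEXTURE_2D, sceneTexture);
    drawTexturedQuad();    // this is where I only ever see stale/corrupted content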

In addition, I can't use glBlitFramebuffer(), as I don't have access to the frame buffer the QOpenGLWidget uses in the application (as far as I've tried, QOpenGLWidget::defaultFramebufferObject() returns 0, which causes glBlitFramebuffer() to give me errors).
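For completeness, this is roughly the blit I would want to do (member names such as m_readFbo, m_sharedSceneTexture, m_fboWidth and m_fboHeight are hypothetical). Since FBO names are not shared between contexts, the read FBO wrapping the shared texture would itself have to be created in the widget's context, and defaultFramebufferObject() is presumably only meaningful while the widget's own context is current, e.g. inside paintGL():

    void MyGLWidget::paintGL()
    {
        // A small read-only FBO, created lazily in this widget's context,
        // wrapping the texture that the secondary context renders into.
        if (m_readFbo == 0) {
            glGenFramebuffers(1, &m_readFbo);
            glBindFramebuffer(GL_READ_FRAMEBUFFER, m_readFbo);
            glFramebufferTexture2D(GL_READ_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                   GL_TEXTURE_2D, m_sharedSceneTexture, 0);
        } else {
            glBindFramebuffer(GL_READ_FRAMEBUFFER, m_readFbo);
        }

        // Blit into the FBO that this QOpenGLWidget actually renders to.
        glBindFramebuffer(GL_DRAW_FRAMEBUFFER, defaultFramebufferObject());
        glBlitFramebuffer(0, 0, m_fboWidth, m_fboHeight,
                          0, 0, width(), height(),
                          GL_COLOR_BUFFER_BIT, GL_LINEAR);
    }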

The only way I have managed to get the rendering to work is to use a QOpenGLFramebufferObject and call takeTexture() when I want to use the texture with the quad. However, doing it this way means that the QOpenGLFramebufferObject creates a new texture for itself and I have to destroy the old one once I've used it, which seems very inefficient.
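The takeTexture() approach I have working looks roughly like this (the context, surface and draw-helper names are placeholders; fbo is a QOpenGLFramebufferObject):

    // Secondary context: render the cube into the Qt-managed FBO.
    secondaryContext->makeCurrent(offscreenSurface);
    fbo->bind();
    drawCube();
    fbo->release();

    // Take ownership of the texture; the FBO creates and attaches a new
    // texture for itself the next time it is bound.
    GLuint snapshot = fbo->takeTexture();

    // Primary context: draw the quad with the taken texture, then delete it,
    // since ownership was transferred to us.
    primaryContext->makeCurrent(widgetSurface);
    glBindTexture(GL_TEXTURE_2D, snapshot);
    drawTexturedQuad();
    glDeleteTextures(1, &snapshot);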

Is there anything I can do to solve this problem?

I've got a project that uses a texture like that. You need to call glFinish() after drawing and before using the texture from QOpenGLFramebufferObject::texture(). That was the problem for us on some operating systems.
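Roughly, in terms of the setup described in the question (context and draw-helper names are placeholders), the fix looks like this:

    // Secondary (offscreen) context: render into the QOpenGLFramebufferObject.
    secondaryContext->makeCurrent(offscreenSurface);
    fbo->bind();
    drawCube();
    fbo->release();
    glFinish();   // block until the GPU has finished writing the texture

    // Primary (widget) context: only now is it safe to sample the texture.
    primaryContext->makeCurrent(widgetSurface);
    glBindTexture(GL_TEXTURE_2D, fbo->texture());
    drawTexturedQuad();

If glFinish() turns out to be too heavy, a sync object (glFenceSync() in the producing context, glWaitSync() or glClientWaitSync() in the consuming one) should in principle work as a lighter-weight alternative, but plain glFinish() is what fixed it for us.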
