SDL2 OpenGL 4.2 context with 0 color bits and 0 depth bits
I've been trying to start a new SDL + GLEW + OpenGL project, and the setup has been difficult (MinGW-w64-32 on Windows 8 64-bit with an Intel i7-4700MQ CPU and NVidia GTX 745M GPU).
If I set the GL attributes to be used for context creation to use OpenGL version 4.2, the color and depth bit sizes get set to 0. However, if I request a 2.1 context (which is also the default), I can get the requested bit depths (8 bits for each color, 24 bits for depth). In either case, however, glClearColor has no effect (just a black background).
In both cases, the result of a few glGetString calls is the same: a 4.2 context, suggesting SDL's output is far from correct.
The entirety of the code can be found here; it's mostly boilerplate for a larger project for now. The relevant sections would most likely be
if(SDL_Init(SDL_INIT_EVERYTHING) != 0) {
    std::cerr << "Error initializing SDL.\n";
    exit(1);
}

SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 2);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);
//SDL_GL_SetAttribute(SDL_GL_CONTEXT_FLAGS, SDL_GL_CONTEXT_FORWARD_COMPATIBLE_FLAG);
SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

SDL_Window* window = SDL_CreateWindow("RenderSystem", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, SDL_WINDOW_OPENGL|SDL_WINDOW_RESIZABLE);
if(!window) {
    std::cerr << "Window creation failed.\n";
    SDL_Quit();
    exit(2);
}

SDL_GLContext context = SDL_GL_CreateContext(window);
if(!context) {
    std::cerr << "OpenGL Context creation failed.\n";
    SDL_DestroyWindow(window);
    SDL_Quit();
    exit(3);
}

SDL_GL_MakeCurrent(window, context);
and
glClearColor(1.0f, 1.0f, 1.0f, 1.0f);

SDL_Event evt;
bool run = true;
while(run) {
    SDL_PollEvent(&evt);
    switch(evt.type) {
        case SDL_KEYDOWN:
            if(evt.key.keysym.sym == SDLK_ESCAPE) {
                run = false;
            }
            break;
        case SDL_QUIT:
            run = false;
            break;
    }
    SDL_GL_SwapWindow(window);
}
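As an aside, glClearColor only sets state; nothing is written to the framebuffer until glClear is issued, so a loop like the one above will display whatever happens to be in the back buffer regardless of the clear color. A minimal sketch of the same loop with the clear actually performed (and the event queue fully drained each frame) might look like this:

```cpp
// Sketch: same loop, but draining all pending events and actually
// clearing the back buffer before each swap. Without glClear, the
// color set by glClearColor is never applied.
glClearColor(1.0f, 1.0f, 1.0f, 1.0f);

SDL_Event evt;
bool run = true;
while(run) {
    while(SDL_PollEvent(&evt)) {   // drain every pending event
        if(evt.type == SDL_QUIT ||
           (evt.type == SDL_KEYDOWN && evt.key.keysym.sym == SDLK_ESCAPE)) {
            run = false;
        }
    }
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);  // applies the clear color
    SDL_GL_SwapWindow(window);
}
```

Draining the queue also avoids acting on a stale `evt` when SDL_PollEvent returns 0 and leaves the struct unmodified.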
I'm using SDL's built-in function SDL_GL_GetAttribute, which returns the values that SDL uses to create the context, not the actual context attributes (as I understand it).
This is incorrect; I took a look at the SDL implementation of SDL_GL_GetAttribute (...) (see src/video/SDL_video.c), and it does what I described. You cannot query the values on a core profile context because they are not defined for the default framebuffer.
int
SDL_GL_GetAttribute(SDL_GLattr attr, int *value)
{
    // ...
    switch (attr) {
    case SDL_GL_RED_SIZE:
        attrib = GL_RED_BITS;
        break;
    case SDL_GL_BLUE_SIZE:
        attrib = GL_BLUE_BITS;
        break;
    case SDL_GL_GREEN_SIZE:
        attrib = GL_GREEN_BITS;
        break;
    case SDL_GL_ALPHA_SIZE:
        attrib = GL_ALPHA_BITS;
        break;
    }
    // ...
    glGetIntegervFunc(attrib, (GLint *) value);
    error = glGetErrorFunc();
}
That code actually generates a GL_INVALID_ENUM error on a core profile, and the return value of SDL_GL_GetAttribute (...) should be non-zero as a result.
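On a core profile, the portable way to get these sizes yourself is to query the default framebuffer directly with glGetFramebufferAttachmentParameteriv instead of the deprecated GL_*_BITS enums. A sketch, assuming a current GL 3.0+ context with framebuffer 0 bound:

```cpp
// Core-profile-safe query of the default framebuffer's bit depths.
// Requires GL 3.0+ and the default framebuffer (0) bound to GL_FRAMEBUFFER.
GLint redBits = 0, depthBits = 0;
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_BACK_LEFT,
    GL_FRAMEBUFFER_ATTACHMENT_RED_SIZE, &redBits);
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_DEPTH,
    GL_FRAMEBUFFER_ATTACHMENT_DEPTH_SIZE, &depthBits);
```

Note that for the default framebuffer the attachment names are GL_BACK_LEFT, GL_DEPTH, etc., not the GL_COLOR_ATTACHMENT0-style names used for FBOs.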
If you must get meaningful values from SDL_GL_GetAttribute (...) for bit depths, then that means you must use a compatibility profile. SDL2 does not extract this information from the pixel format it selected (smarter frameworks like GLFW do this); it naively tries to query it from GL.
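With SDL2 that means explicitly requesting a compatibility profile before creating the window; a sketch, to be combined with the version and size attributes from the question:

```cpp
// Ask for a compatibility profile so the deprecated GL_*_BITS queries
// that SDL_GL_GetAttribute() performs internally remain legal.
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK,
                    SDL_GL_CONTEXT_PROFILE_COMPATIBILITY);
```

This must be set before SDL_CreateWindow / SDL_GL_CreateContext, like the other SDL_GL_SetAttribute calls.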
If memory serves, SDL_GL_DEPTH_SIZE has to be the sum of all the color channel sizes.

Using four color channels:

SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 32);

If you were using 3 color channels:

SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);

I've already had some problems with this, so it might be the cause. Sorry for my English.