Black square renders instead of an OpenCV image
I'm trying to render two images of size 256x256 with ushort data type. One must be in greyscale and the other in RGB. However, both render as black squares. I believe the fault lies somewhere in my OpenGL texture definition, but I'm not sure.
Here's a minimal version of my code:
    #include "imgui.h"
    #include "imgui_impl_glfw.h"
    #include "imgui_impl_opengl3.h"
    #include <glad/glad.h>
    #include <GLFW/glfw3.h>
    #include <opencv2/opencv.hpp>

    using namespace cv;

    int main()
    {
        // init glfw, window, glad, imgui
        glfwInit();
        const char* glsl_version = "#version 330 core";
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
        GLFWwindow* window = glfwCreateWindow(600, 400, "test", NULL, NULL);
        glfwMakeContextCurrent(window);
        gladLoadGL();
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
        ImGui::CreateContext();
        ImGui::StyleColorsDark();
        ImGui_ImplGlfw_InitForOpenGL(window, true);
        ImGui_ImplOpenGL3_Init(glsl_version);

        // define image data
        ushort value;
        Mat_<ushort> grey = Mat_<ushort>(256, 256);
        Mat_<Vec3w> rgb = Mat_<Vec3w>(256, 256);
        for (int i = 0; i < grey.rows; i++)
            for (int j = 0; j < grey.cols; j++)
            {
                value = (i + j) / 256.0 * USHRT_MAX;
                grey.at<ushort>(i, j) = value;
                rgb.at<Vec3w>(i, j) = Vec3w(value, value, value);
            }

        // create textures
        GLuint greyID;
        GLuint rgbID;
        glGenTextures(1, &greyID);
        glBindTexture(GL_TEXTURE_2D, greyID);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_R16, 256, 256, 0, GL_RED, GL_UNSIGNED_SHORT, grey.data);
        glGenTextures(1, &rgbID);
        glBindTexture(GL_TEXTURE_2D, rgbID);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16UI, 256, 256, 0, GL_RGB, GL_UNSIGNED_SHORT, rgb.data);

        while (!(glfwGetKey(window, GLFW_KEY_ESCAPE) == GLFW_PRESS))
        {
            glfwPollEvents();
            ImGui_ImplOpenGL3_NewFrame();
            ImGui_ImplGlfw_NewFrame();
            ImGui::NewFrame();
            ImGui::Begin("Images");
            ImGui::Image((void*)(intptr_t)greyID, ImVec2(256, 256));
            ImGui::SameLine();
            ImGui::Image((void*)(intptr_t)rgbID, ImVec2(256, 256));
            ImGui::End();
            ImGui::Render();
            glClearColor(0.2f, 0.2f, 0.2f, 1.0f);
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
            ImGui_ImplOpenGL3_RenderDrawData(ImGui::GetDrawData());
            glfwSwapBuffers(window);
        }

        ImGui::DestroyContext();
        glfwDestroyWindow(window);
        glfwTerminate();
        return 1;
    }
Here's the result: (screenshot shows the two images rendered as black squares)
Your code has two problems.

First, as was discussed in the comments, in your case you probably want to use GL_RGB16 instead of GL_RGB16UI. That takes care of the texture error.

The second problem is that you need to add

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

after glBindTexture. The reason is that the default minifying filter is GL_NEAREST_MIPMAP_LINEAR, but you have only provided the first mip-map level, so the texture is incomplete. Alternatively, you could reduce the max level with glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 0);. Take a look at the wiki for more info.

After fixing both of these issues, your program works.

You may also want to clamp your color calculation so it cannot overflow:

    value = min((i + j) / 256.0, 1.0) * USHRT_MAX;
To add to the answer from LHLaurini, I also made a mistake in my second glTexImage2D call. There, the internal format GL_RGB16UI is incorrect: it is a non-normalized unsigned-integer format, which would also require the GL_RGB_INTEGER pixel format (and an integer sampler in the shader). Since my data is normalized ushort values, the internal format should be GL_RGB16:

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16, 256, 256, 0, GL_RGB, GL_UNSIGNED_SHORT, rgb.data);