
How to load ImGui image from byte array with OpenGL?

I am trying to render an image in my C++ ImGui menu; I believe the final code would be something like ImGui::Image(ImTextureID, ImVec2(X, Y));. I already have a byte array that contains the image I want to render, but I don't know how to load it into the ImTextureID that gets passed in. I have found how to do this in DirectX using D3DXCreateTextureFromFileInMemoryEx, but I need to know the OpenGL equivalent.

The 'ImTextureID' in ImGui::Image is an opaque handle (a void* in ImGui's default configuration), whose value corresponds to a texture that has been created in your graphics environment (DirectX or OpenGL).

A way to do so in OpenGL is as follows (I'm not familiar with DirectX, but I'd bet that 'D3DXCreateTextureFromFileInMemoryEx' does pretty much the same):

  1. Generate the texture name (this 'name' is just an integer, and it is the integer that ImGui uses as the ImTextureID) using glGenTextures()
  2. Bind the texture to the currently active texture unit using glBindTexture()
  3. Set filtering and wrapping parameters for the newly generated texture using glTexParameteri()
  4. Upload the pixel data to the GPU using glTexImage2D()

Typically that would look something like this:

GLuint textureID;
glGenTextures(1, &textureID);
glBindTexture(GL_TEXTURE_2D, textureID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_BORDER);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_BORDER);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, [width of your texture], [height of your texture], 0, GL_RGBA, GL_UNSIGNED_BYTE, [pointer to first element in array of texture pixel values]);

It's been a while since I did this in C++, so I might be wrong on some details. But the documentation is pretty good, and it's best to read it anyway to figure out how to make the texture compatible with the type of texture data you intend to input: https://www.khronos.org/registry/OpenGL-Refpages/gl4/html/glTexImage2D.xhtml
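Note that the snippet above assumes the byte array already contains raw RGBA pixels. If it instead holds an encoded image file (PNG, JPEG, etc.), which is what D3DXCreateTextureFromFileInMemoryEx accepts, the bytes have to be decoded first. A minimal sketch using the single-header stb_image library (the `create_texture_from_memory` helper is hypothetical, not part of the original answer, and assumes stb_image.h is available):

```cpp
// Sketch: decode an in-memory image file and upload it as an OpenGL texture.
// Requires stb_image.h (github.com/nothings/stb); not part of OpenGL itself.
#define STB_IMAGE_IMPLEMENTATION
#include "stb_image.h"

// Hypothetical helper; returns 0 on failure. Assumes a current GL context.
GLuint create_texture_from_memory(const unsigned char* bytes, int length)
{
    int width = 0, height = 0, channels = 0;
    // Force 4 channels so the data matches the GL_RGBA format used below.
    unsigned char* pixels =
        stbi_load_from_memory(bytes, length, &width, &height, &channels, 4);
    if (!pixels)
        return 0;

    GLuint textureID;
    glGenTextures(1, &textureID);
    glBindTexture(GL_TEXTURE_2D, textureID);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // rows are tightly packed
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    stbi_image_free(pixels); // the driver has its own copy now
    return textureID;
}
```

The decoded pixels can be freed immediately after glTexImage2D returns, since the upload copies them.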

Once the texture is set up like that, you use the value of textureID in the ImGui Image call.
