
DevIL image not rendering correctly

I am using OpenGL. I can load TGA files properly, but for some reason, when I render JPG files they do not display correctly.

This is what the image is supposed to look like:

[image: the expected rendering]

And this is what it actually looks like. Why is it stretched? Is it because of the coordinates?

[image: the stretched result]

Here is the code I am using for drawing:

void Renderer::DrawJpg(GLuint tex, int xi, int yq, int width, int height) const
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glBegin(GL_QUADS);
        // note: yq (not xi) is the vertical offset
        glTexCoord2i(0, 0); glVertex2i(xi,         yq);
        glTexCoord2i(0, 1); glVertex2i(xi,         yq + height);
        glTexCoord2i(1, 1); glVertex2i(xi + width, yq + height);
        glTexCoord2i(1, 0); glVertex2i(xi + width, yq);
    glEnd();
}
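For context, a quad drawn in pixel units like this only lands where expected if texturing is enabled and an orthographic projection maps OpenGL coordinates to screen pixels. A minimal sketch of such a setup (the 800x600 window size is a placeholder, not taken from the question):

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, 800.0, 600.0, 0.0, -1.0, 1.0); // y grows downward, like screen coordinates
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glEnable(GL_TEXTURE_2D);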

This is how I am loading the image:

imagename = s;
ILboolean success;
ilInit();
ilGenImages(1, &id);
ilBindImage(id);
success = ilLoadImage((const ILstring)imagename.c_str());
if (success)
{
    /* Convert every colour component to unsigned byte. If your image
       has an alpha channel, replace IL_RGB with IL_RGBA. */
    success = ilConvertImage(IL_RGB, IL_UNSIGNED_BYTE);
    if (!success)
    {
        printf("image conversion failed.");
    }
    glGenTextures(1, &id);              // note: this overwrites the DevIL image id with the GL texture id
    glBindTexture(GL_TEXTURE_2D, id);

    width = ilGetInteger(IL_IMAGE_WIDTH);
    height = ilGetInteger(IL_IMAGE_HEIGHT);

    // IL_IMAGE_BPP is bytes per pixel (3 here), which works as a legacy internal format
    glTexImage2D(GL_TEXTURE_2D, 0, ilGetInteger(IL_IMAGE_BPP), ilGetInteger(IL_IMAGE_WIDTH),
                 ilGetInteger(IL_IMAGE_HEIGHT), 0, ilGetInteger(IL_IMAGE_FORMAT), GL_UNSIGNED_BYTE,
                 ilGetData());
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);   // linear filtering
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
}

I should probably mention that some images do render properly. I thought the broken ones were those where width != height, but that is not the case: some images with width != height also load fine, while others still show this problem.

Always try to make the image's width and height powers of two, because some GPUs only support power-of-two (POT) texture resolutions (for example 128x128 or 512x512, but not 123x533 or 128x532).

And I think that here, instead of GL_REPEAT, you should use GL_CLAMP_TO_EDGE. GL_REPEAT is meant for texture coordinates greater than 1.0f; GL_CLAMP_TO_EDGE handles those too, but guarantees the image fills the polygon without unwanted lines at the edges (GL_REPEAT lets linear filtering bleed in texels from the opposite edge).
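A minimal sketch of that change, assuming tex is the texture created in the question's loader:

glBindTexture(GL_TEXTURE_2D, tex);
// clamp instead of wrapping, so linear filtering never samples the opposite edge
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);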

Also remember to try the version of the code where floats are used (the sample from the comment); a sketch follows below. :)
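That comment is not reproduced here, so this is only a guess at what the float variant might look like (same quad layout as DrawJpg, with x, y, width, and height as floats):

glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f(x,         y);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(x,         y + height);
    glTexCoord2f(1.0f, 1.0f); glVertex2f(x + width, y + height);
    glTexCoord2f(1.0f, 0.0f); glVertex2f(x + width, y);
glEnd();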

Here is a good explanation: http://open.gl/textures :)

You probably need to call

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

before uploading the texture data with glTexImage2D.

From the reference pages:

GL_UNPACK_ALIGNMENT

Specifies the alignment requirements for the start of each pixel row in memory. The allowable values are 1 (byte-alignment), 2 (rows aligned to even-numbered bytes), 4 (word-alignment), and 8 (rows start on double-word boundaries).

The default value for the alignment is 4 and your image loading library probably returns pixel data with byte-aligned rows, which explains why some of your images look OK (when the width is a multiple of four).
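Concretely, the fix is one extra call before the upload. A sketch mirroring the question's loader, with the row-size arithmetic spelled out:

// An IL_RGB image that is, say, 127 pixels wide has rows of 127 * 3 = 381 bytes.
// 381 is not a multiple of 4, so with the default GL_UNPACK_ALIGNMENT of 4,
// OpenGL reads each row as if it were padded to 384 bytes, skewing every row
// and producing the stretched look shown above.
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  // DevIL returns tightly packed rows

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB,
             ilGetInteger(IL_IMAGE_WIDTH), ilGetInteger(IL_IMAGE_HEIGHT),
             0, ilGetInteger(IL_IMAGE_FORMAT), GL_UNSIGNED_BYTE,
             ilGetData());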
