
iPhone OpenGL ES Texture Pipeline for JPEG

I am having trouble getting my texture pipeline to work with JPEGs. Everything works correctly for PNGs, but converting over has been an issue.

I am loading my image data via UIImage and CGBitmapContextCreate:

UIImage* tI = [UIImage imageWithContentsOfFile:fileName];
Image = tI.CGImage;

mWidth = CGImageGetWidth(Image);
mHeight = CGImageGetHeight(Image);

mData = new uint8[mWidth * mHeight * 4];

Context = CGBitmapContextCreate(mData, mWidth, mHeight, CGImageGetBitsPerComponent(Image), CGImageGetBytesPerRow(Image), 
                                CGImageGetColorSpace(Image), 
                                CGImageGetBitmapInfo(Image));

CGContextDrawImage(Context, CGRectMake(0.0, 0.0, float(mWidth), float(mHeight)), Image);
CGContextRelease(Context);

Then I set up my GL texture with the call...

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texture->Width(), texture->Height(), 0, GL_RGB, GL_UNSIGNED_BYTE, texture->Data());

I suspect this glTexImage2D call is the issue. I have been trying different combinations of formats to get things to work. The bitmap info reports the alpha setting as kCGImageAlphaNoneSkipLast, so I am not sure whether I should be using RGBA and GL_UNSIGNED_SHORT_5_5_5_1, but no combination I have tried has worked so far. The closest I got was...

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texture->Width(), texture->Height(), 0, GL_RGBA, GL_UNSIGNED_BYTE, texture->Data());

which gave me a very blown-out texture (color edges were discernible, but everything was too bright).

Any help would be great. I am using JPEGs to try to save space over PNGs.

Your mData buffer is too big. JPEG images don't support alpha, so there are only three components per pixel (RGB), whereas you'd usually use four for PNGs (RGBA).
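
A minimal sketch of one way to avoid that mismatch (not the poster's original pipeline): rather than reusing the JPEG's own bytes-per-row and bitmap info, create the bitmap context with an explicit RGBA8888 layout and let CGContextDrawImage do the conversion. Variable names such as fileName mirror the question; the explicit context parameters are assumptions for illustration.

// Decode into a buffer whose layout we control: 4 bytes per pixel, RGBA.
UIImage* tI = [UIImage imageWithContentsOfFile:fileName];
CGImageRef image = tI.CGImage;

size_t width  = CGImageGetWidth(image);
size_t height = CGImageGetHeight(image);

uint8_t* data = (uint8_t*)calloc(width * height * 4, sizeof(uint8_t));

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(data, width, height,
                                             8,          // bits per component
                                             width * 4,  // bytes per row for RGBA8888
                                             colorSpace,
                                             kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(colorSpace);

// Core Graphics converts the source (JPEG or PNG) into the RGBA layout above.
CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, width, height), image);
CGContextRelease(context);

// The buffer now always matches an RGBA, unsigned-byte upload.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, data);

free(data);

Because the buffer layout no longer depends on the source file's encoding, the same loader works for JPEGs and PNGs, and GL_RGBA with GL_UNSIGNED_BYTE is always the matching glTexImage2D combination.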
