OpenGL ES: screenshot differs between device and simulator
I've tried to create a screenshot using UIActivityViewController and save it to Photos on an iPhone/iPad device. In the simulator everything shows correctly, but on a real device only part of the image appears. Here is the screenshot:
I merged three different UIImages into one image so that I can take a screenshot. First I merge the background image (the bridge UIImage) with the star image:
-(UIImage *)mergeUIImageView:(UIImage *)bkgound
                    FrontPic:(UIImage *)fnt
                   FrontPicX:(CGFloat)xPos
                   FrontPicY:(CGFloat)yPos
               FrontPicWidth:(CGFloat)picWidth
              FrontPicHeight:(CGFloat)picHeight
                   FinalSize:(CGSize)finalSize
{
    // note: finalSize is never used; the (landscape) screen bounds are used instead
    UIGraphicsBeginImageContext(CGSizeMake([UIScreen mainScreen].bounds.size.height,
                                           [UIScreen mainScreen].bounds.size.width));
    // bkgound - the bridge image
    [bkgound drawInRect:CGRectMake(0, 0,
                                   [UIScreen mainScreen].bounds.size.height,
                                   [UIScreen mainScreen].bounds.size.width)];
    // fnt - the star image
    [fnt drawInRect:CGRectMake(xPos, yPos, picWidth, picHeight)];
    // merged image
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
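A call site might look like this (a sketch; the image variables, position, and size are placeholders, not from the original project):

```objc
// bridgeImage and starImage are assumed to be loaded elsewhere
UIImage *merged = [self mergeUIImageView:bridgeImage
                                FrontPic:starImage
                               FrontPicX:100 FrontPicY:50
                           FrontPicWidth:60 FrontPicHeight:60
                               FinalSize:[UIScreen mainScreen].bounds.size];
```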
Then I merge this picture with the OpenGL-rendered picture, which is the green line.
a) First I convert the OpenGL image to a UIImage with this function:
// release callback so the data provider frees the pixel buffer when the image is done with it
static void freeImageData(void *info, const void *data, size_t size)
{
    free((void *)data);
}

-(UIImage *)glToUIImage
{
    float scaleFactor = [[UIScreen mainScreen] scale];
    CGRect screen = [[UIScreen mainScreen] bounds];
    // landscape: width and height are swapped; sizes are in pixels, not points
    CGFloat image_height = screen.size.width * scaleFactor;
    CGFloat image_width = screen.size.height * scaleFactor;
    NSInteger myDataLength = image_width * image_height * 4;

    // allocate array and read pixels into it
    GLubyte *buffer = (GLubyte *)malloc(myDataLength);
    glReadPixels(0, 0, image_width, image_height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    // GL renders "upside down", so copy the rows bottom-to-top into a new array
    GLubyte *buffer2 = (GLubyte *)malloc(myDataLength);
    NSInteger bytesPerRow = 4 * image_width;
    for (int y = 0; y < image_height; y++) {
        memcpy(buffer2 + (NSInteger)(image_height - 1 - y) * bytesPerRow,
               buffer + y * bytesPerRow,
               bytesPerRow);
    }
    free(buffer); // the unflipped copy is no longer needed

    // make data provider with data; it owns buffer2 and frees it via the callback
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, freeImageData);

    // prep the ingredients
    int bitsPerComponent = 8;
    int bitsPerPixel = 32;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    // make the CGImage, then wrap it in a UIImage
    CGImageRef imageRef = CGImageCreate(image_width, image_height, bitsPerComponent, bitsPerPixel,
                                        bytesPerRow, colorSpaceRef, bitmapInfo, provider,
                                        NULL, NO, renderingIntent);
    UIImage *myImage = [UIImage imageWithCGImage:imageRef];

    // release what we own; the UIImage keeps the CGImage (and its data) alive
    CGImageRelease(imageRef);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpaceRef);
    return myImage;
}
b) Then I merge this OpenGL image with the image above (bridge + star), using the same approach:
-(UIImage *)screenshot
{
    // get the OpenGL image from the function above
    UIImage *image = [self glToUIImage];
    CGRect pos = CGRectMake(0, 0, image.size.width, image.size.height);
    UIGraphicsBeginImageContext(image.size);
    [image drawInRect:pos];
    [self.background.image drawInRect:pos];
    UIImage *final = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return final;
}
It works great in the simulator (version 6.0) for iPhone, iPad, Retina iPhone, and Retina iPad. However, on real devices (iPhone 4/4s/5, iPad 2/mini/Retina) it only shows the star image. Xcode is 4.6.3, the base SDK is the latest iOS (6.1), and the iOS deployment target is 5.0. Could you tell me how to fix it? Thanks.
The problem is that iOS 6.0 does not keep the framebuffer contents after presenting; it erases them. The screenshot reads its pixel data from that buffer, which is why I kept getting a black background. Adding a category that makes the layer retain its backing image solves the problem:
@interface CAEAGLLayer (Retained)
@end

@implementation CAEAGLLayer (Retained)
- (NSDictionary *)drawableProperties
{
    return @{kEAGLDrawablePropertyRetainedBacking : @(YES)};
}
@end
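A narrower alternative (a sketch, assuming your GL view subclass exposes its layer in its setup code) is to set the retained-backing drawable property only on that one layer, rather than overriding drawableProperties for every CAEAGLLayer in the app:

```objc
// in the EAGL view's initialization, where the layer is configured
CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
eaglLayer.drawableProperties = @{
    kEAGLDrawablePropertyRetainedBacking : @(YES),
    kEAGLDrawablePropertyColorFormat     : kEAGLColorFormatRGBA8
};
```

Note that retained backing asks the system to preserve the buffer between frames, which can cost performance; enabling it only for the view you actually screenshot keeps that cost contained.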