
From OpenGL ES to UIImage: memory issue

I'm trying to take a screenshot of an OpenGL ES (1.1) view and turn it into a UIImage using the following code.

This works, but every time I call this method the used memory increases (a leak?): my app starts at about 19 MB, jumps to 31 MB after the first call, then 43 MB, and so on. Once it reaches about 80 MB it stays there!

Do you know what causes this behavior?

void releaseScreenshotData(void *info, const void *data, size_t size)
{
    free((void *)data);
}

- (UIImage *)fromOpenGLToUIImage
{
    [self draw];

    NSInteger myDataLength = backingWidth * backingHeight * 4;

    // allocate array and read pixels into it.
    GLuint *buffer = (GLuint *) malloc(myDataLength);
    glReadPixels(0, 0, backingWidth, backingHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer); // x and y are GLint, not float

    // OpenGL renders "upside down", so swap rows top-to-bottom in place.
    for(int y = 0; y < backingHeight / 2; y++) {
        for(int x = 0; x < backingWidth; x++) {
            //Swap top and bottom bytes
            GLuint top = buffer[y * backingWidth + x];
            GLuint bottom = buffer[(backingHeight - 1 - y) * backingWidth + x];
            buffer[(backingHeight - 1 - y) * backingWidth + x] = top;
            buffer[y * backingWidth + x] = bottom;
        }
    }

    // make data provider with data.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer, myDataLength, releaseScreenshotData);

    // prep the ingredients
    const int bitsPerComponent = 8;
    const int bitsPerPixel = 4 * bitsPerComponent;
    const int bytesPerRow = 4 * backingWidth;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    // CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault; // For opacity
    CGBitmapInfo bitmapInfo = (CGBitmapInfo)kCGImageAlphaPremultipliedLast; // For transparency
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    // make the cgimage
    CGFloat larghezza = 0.0f;
    CGFloat altezza = 0.0f;

    if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPhone) {
        if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)] && ([[UIScreen mainScreen] scale] == 2.00) && ([UIScreen mainScreen].bounds.size.height == 480.0f)) {
            // iPhone Retina 3.5
            larghezza = 640.0f;
            altezza = 960.0f;
        } else if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)] && ([[UIScreen mainScreen] scale] == 2.00) && ([UIScreen mainScreen].bounds.size.height == 568.0f)) {
            // iPhone Retina 4
            larghezza = 640.0f;
            altezza = 1136.0f;
        } else {
            // iPhone
            larghezza = 320.0f;
            altezza = 480.0f;
        }
    } else if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad) {
        if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)] && ([[UIScreen mainScreen] scale] == 2.00) && ([UIScreen mainScreen].bounds.size.height == 1024.0f)) {
            // iPad Retina
            larghezza = 1536.0f;
            altezza = 2048.0f;
        } else {
            // iPad
            larghezza = 768.0f;
            altezza = 1024.0f;
        }
    }

    CGImageRef imageRef = CGImageCreate(larghezza, altezza, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
    CGColorSpaceRelease(colorSpaceRef);
    CGDataProviderRelease(provider);

    // then make the UIImage from that
    UIImage *image = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);

    return image;
}

From the way you describe your situation, I'm going to assume you are reading the used memory from the debug gauge in Xcode. This "used memory" value is the same as what the "Activity Monitor" instrument reports. It is very different from the "Allocations" instrument.

The Allocations instrument will give you the actual memory currently in use by your application (the objects and primitives it has created and retained).

The Activity Monitor instrument will give you the amount of memory the OS has currently allotted to your application. Whenever you create a new object (temporary or not), your application requests memory from the OS. The OS then decides whether your app would benefit from extra memory or whether its current allotment is sufficient. If your app requests memory over and over, the OS may give it more than it strictly needs, so the app can run faster instead of constantly releasing memory and reallocating from the freed pool.

Just think of the debugger's "Used Memory" gauge as your sandbox: it will increase and decrease based on the OS's idea of your app's needs (and those of apps currently in the background). If you are looking for leaked memory, that view will not give you any useful information at all. Use the Allocations instrument for that purpose.
