
CALayer causing memory leak with ARC

I have been struggling for some time to understand a big memory leak in my code, so after simplifying the code, what is left is this:

@interface TestLayer: CALayer
@end
@implementation TestLayer
-(void)dealloc
{
    NSLog(@"dealloc called");
}
@end

@implementation AppDelegate
#define ENABLE_LEAK 1
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    for (int i=0; i<10; i++) {
        @autoreleasepool {
            TestLayer* test = [TestLayer layer];
#if ENABLE_LEAK
            CALayer* l = [CALayer layer];
            [test addSublayer:l];
            [l removeFromSuperlayer];
            l = nil;
#endif
            test = nil;
        }
    }
    return YES;
}
.....

If ENABLE_LEAK is set to 0, the dealloc in TestLayer is correctly called 10 times. However, if it is set to 1, the dealloc in TestLayer is not called before application:didFinishLaunchingWithOptions: returns. In fact, just calling [test setNeedsLayout]; without adding any sublayers causes TestLayer to leak.

I am using similar code to generate some offline content; it will not be used in the final application, which will just use the pregenerated offline content.

Does anyone have any idea what is referencing my TestLayer, and how I can convince it to release it?

As Cristi suggests in one of the comments, all my issues with memory leaks and CALayer were resolved just by using CATransaction.flush().

It works like a charm 🤩
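
For reference, here is a minimal sketch of where the flush could go in the original loop (the exact placement, at the end of each autorelease pool, is my assumption):

#import <QuartzCore/QuartzCore.h>

for (int i = 0; i < 10; i++) {
    @autoreleasepool {
        TestLayer* test = [TestLayer layer];
        CALayer* l = [CALayer layer];
        [test addSublayer:l];
        [l removeFromSuperlayer];
        l = nil;
        test = nil;
        // Flush the implicit transaction so Core Animation releases its
        // internal references to the layers before the pool drains.
        [CATransaction flush];
    }
}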

From the documentation for CATransaction, you'll see that an implicit transaction is created if you modify layers without explicitly opening one. This transaction gets flushed automatically at the next run loop iteration. Since you are iterating inside a single main-thread method call, the run loop never gets a chance to turn; as you say, the deallocs DO get called after you return. Presumably the transaction stores all the operations (both the add and the remove) in a stack of some kind that retains the layers, and it is only processed when the transaction concludes.

So the behavior you see is expected. If you really need the references released within a single run loop iteration, you might want to handle the transactions explicitly yourself rather than flushing the implicit one, since flushing the implicit transaction can lead to poor performance or unexpected side effects.
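
In other words, something along these lines (a sketch only; it assumes no enclosing implicit transaction is already open on the main thread when the explicit one begins):

#import <QuartzCore/QuartzCore.h>

for (int i = 0; i < 10; i++) {
    @autoreleasepool {
        [CATransaction begin];
        TestLayer* test = [TestLayer layer];
        CALayer* l = [CALayer layer];
        [test addSublayer:l];
        [l removeFromSuperlayer];
        // Committing the outermost explicit transaction processes the queued
        // layer operations now instead of at the next run loop iteration.
        [CATransaction commit];
        l = nil;
        test = nil;
    }
}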
