
CALayer Draggable Mask

I have two images of the same scene and the same size, each filling the entire screen. One is blurry and one is in focus. The desired effect is that the user initially sees the blurry image, and as they drag a finger across the screen from left to right, the part of the image to the left of their finger comes into focus (for example, if they drag only halfway across, the left half of the scene shows the focused image while the right half is still blurry).

I'm doing this by subclassing UIImageView with the focused image as its image, then adding a CALayer containing the blurry image with a mask applied, and moving the mask in touchesBegan/touchesMoved. The problem is that performance is very sluggish with the approach below, so I'm wondering what I'm doing wrong.

@interface DragMaskImageView : UIImageView
@end    

@implementation DragMaskImageView{
    BOOL userIsTouchingMask;
    CALayer *maskingLayer;
    CALayer *topLayer;
    float horzDistanceOfTouchFromCenter;
}

- (void)awakeFromNib {
    [super awakeFromNib];

    topLayer = [CALayer layer];
    topLayer.contents = (id)[UIImage imageNamed:@"blurryImage.jpg"].CGImage;
    topLayer.frame = CGRectMake(0, 0, 480, 300);
    [self.layer addSublayer:topLayer];

    maskingLayer = [CALayer layer];
    maskingLayer.contents = (id)[UIImage imageNamed:@"maskImage.png"].CGImage;
    maskingLayer.anchorPoint = CGPointMake(0.0, 0.0);
    maskingLayer.bounds = CGRectMake(0, 0, 480, 300);
    topLayer.mask = maskingLayer;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint touchPoint = [[touches anyObject] locationInView:self];

    if (touchPoint.x < maskingLayer.frame.origin.x) {
        NSLog(@"user is touching to the left of mask - disregard");
        userIsTouchingMask = NO;
    } else {
        NSLog(@"user is touching");
        horzDistanceOfTouchFromCenter = touchPoint.x - maskingLayer.frame.origin.x;
        userIsTouchingMask = YES;
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (userIsTouchingMask) {
        CGPoint touchPoint = [[touches anyObject] locationInView:self];
        float newMaskX = touchPoint.x - horzDistanceOfTouchFromCenter;

        if (newMaskX < 0) {
            newMaskX = 0;
        }
        if (newMaskX > 480) {
            newMaskX = 480;
        }
        maskingLayer.frame = CGRectMake(newMaskX, 0, 480, 300);
    }
}

I checked the related thread "core animation calayer mask animation performance", but setting shouldRasterize to YES on any of the layers doesn't seem to help the performance problem.

Perhaps the problem is that the layer's implicit animation is making it appear sluggish. Turning off the implicit animation should fix the problem:

[CATransaction begin];
[CATransaction setDisableActions:YES];
maskingLayer.frame = CGRectMake(newMaskX, 0, 480, 300);
[CATransaction commit];

This is why @Rizwan's work-around works. It's a way of bypassing the implicit animation.
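If you'd rather not wrap every frame change in a transaction, an equivalent way to suppress the implicit animation is to install null actions on the mask layer once, when it is created. This is a sketch, not from the original answer; note that a frame change decomposes into "position" and "bounds" actions, which is why those keys are listed:

```objectivec
// Disable implicit animations for the properties touchesMoved changes.
// A frame change animates via the "position" and "bounds" action keys.
maskingLayer.actions = @{ @"position" : [NSNull null],
                          @"bounds"   : [NSNull null] };
```

With the null actions in place, assigning maskingLayer.frame takes effect immediately without a CATransaction on every touch event.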

I ran into the same problem: updating the frame of the mask always lagged behind the touch. I found that re-creating the mask with the new frame ensures the mask is always updated instantly.
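A minimal sketch of that approach, reusing the geometry, asset name, and ivars from the question's code (so maskImage.png and the 480×300 frame are assumptions carried over from there):

```objectivec
// Instead of mutating maskingLayer.frame, build a fresh mask on each move.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!userIsTouchingMask) return;

    CGPoint touchPoint = [[touches anyObject] locationInView:self];
    float newMaskX = touchPoint.x - horzDistanceOfTouchFromCenter;
    newMaskX = MAX(0.0f, MIN(newMaskX, 480.0f));

    // A brand-new layer has no previous presentation state to animate
    // from, so it shows up at the new position without lagging.
    CALayer *freshMask = [CALayer layer];
    freshMask.contents = (id)[UIImage imageNamed:@"maskImage.png"].CGImage;
    freshMask.anchorPoint = CGPointMake(0.0, 0.0);
    freshMask.frame = CGRectMake(newMaskX, 0, 480, 300);

    maskingLayer = freshMask;
    topLayer.mask = freshMask;
}
```

Allocating a layer per touch event is heavier than moving one, so the CATransaction approach above is usually preferable; this variant is a fallback if the frame update still lags.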

The technical posts on this site follow the CC BY-SA 4.0 license. © 2020-2024 STACKOOM.COM