
CIFilter + UIImage + alpha mask

I'm masking my UIImage with an alpha drawing using the approach from this answer: masking an UIImage

The problem is that later I want to apply some CIFilters. However, when I change a value on the filter, the alpha is lost from the UIImage. Do I have to re-apply the alpha channel to the output image each time after modifying the CIFilter? That would surely make the process much slower.
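For context, here is a minimal sketch of the kind of imageWithMask category the linked answer describes, assuming it is based on CGImageCreateWithMask (the method name and details are illustrative, not necessarily the exact code from the link):

// Illustrative sketch of a typical imageWithMask category; the linked
// answer may differ in detail.
- (UIImage *)imageWithMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    // A CGImage mask must be a grayscale image, so rebuild it from the
    // mask's data provider.
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef),
                                        NULL,
                                        false);
    CGImageRef maskedCGImage = CGImageCreateWithMask(self.CGImage, mask);
    UIImage *maskedImage = [UIImage imageWithCGImage:maskedCGImage];
    CGImageRelease(mask);
    CGImageRelease(maskedCGImage);
    return maskedImage;
}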

Code samples (each block is in a different method):

// set the image
_image = [incomeImage imageWithMask:_mask]; // imageWithMask is the category method from the linked answer
[_myView.imageView setImage:_image];

// create the CIImage from the masked image
_inputCIImage = [[CIImage alloc] initWithCGImage:_image.CGImage options:nil];
_myView.imageView.image = _image;
_currentCIImage = _inputCIImage;

// change value in filter
[filter setValue:@(0.2f) forKey:@"someKey"];
[filter setValue:_inputCIImage forKey:kCIInputImageKey];
_currentCIImage = [filter outputImage];
CGImageRef img = [_context createCGImage:_currentCIImage fromRect:[_currentCIImage extent]];
UIImage *newImage = [UIImage imageWithCGImage:img];
CGImageRelease(img); // createCGImage returns a +1 reference, so release it

You could do this using only CIFilters. Instead of using imageWithMask, you can use the CIBlendWithMask CIFilter. See the Apple CIFilter Reference.
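A hedged sketch of how that might look, assuming a _maskCIImage built from the mask drawing and reusing _context and _currentCIImage from the question (these names are either taken from the question or illustrative):

CIFilter *blend = [CIFilter filterWithName:@"CIBlendWithMask"];
[blend setValue:_currentCIImage forKey:kCIInputImageKey];    // filtered image
[blend setValue:_maskCIImage forKey:kCIInputMaskImageKey];   // grayscale mask
// A transparent background cropped to the image extent keeps the
// masked-out area transparent instead of black.
CIImage *clearBackground =
    [[CIImage imageWithColor:[CIColor colorWithRed:0 green:0 blue:0 alpha:0]]
        imageByCroppingToRect:[_currentCIImage extent]];
[blend setValue:clearBackground forKey:kCIInputBackgroundImageKey];
CIImage *maskedOutput = [blend outputImage];

// Render once, after the whole filter chain.
CGImageRef cg = [_context createCGImage:maskedOutput fromRect:[maskedOutput extent]];
UIImage *result = [UIImage imageWithCGImage:cg];
CGImageRelease(cg);

Because the mask is applied inside the Core Image pipeline, changing a filter parameter only means re-running the filter chain and rendering once; there is no need to re-mask the UIImage after every change.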
