
createCGImage(_:from:) not using correct rect

I am trying to blur a part of an image:

extension UIImage {
    func blur(rect: CGRect, radius: CGFloat) -> UIImage? {
        let context = CIContext(options: nil)
        let inputImage = CIImage(cgImage: self.cgImage!)
        let filter = CIFilter(name: "CIGaussianBlur")
        filter?.setValue(inputImage, forKey: kCIInputImageKey)
        filter?.setValue(radius, forKey: kCIInputRadiusKey)
        let outputImage = filter?.outputImage
        if let outputImage = outputImage {
            let cgImage = context.createCGImage(outputImage, from: rect)
            if let cgImage = cgImage {
                return UIImage(cgImage: cgImage)
            }
        }
        return nil
    }
}

Usage:

guard let blurredImage = self.imageView.image?.blur(rect: self.blur.frame, radius: 2.0) else { fatalError("could not blur image") }
let img = UIImageView(image: blurredImage)
img.frame = self.blur.frame
self.view.addSubview(img)

However, when the app launches, the wrong part of the picture is blurred. Below is the image; the rectangle at the bottom should blur the image behind it, but it blurs a different part instead.

(screenshot: the overlay rectangle at the bottom shows a blurred, magnified crop from the wrong part of the photo)

Did you notice that your inset image looks not just blurred, but magnified? That should be a hint...

UIKit views, frames, and image sizes are in "points", a display-resolution-agnostic unit for doing layout. Your iPhone X display has the same width in points as an iPhone 6/7/8 display, even though the iPhone X has more pixels. (That is, they differ in scale factor: iPhone 6/7/8 are 2x scale, iPhone X is 3x.)

CGImage and CIImage measure pixels, not points, so you need to account for the screen's scale factor when using view dimensions from UIKit. Since you're creating the image-to-be-blurred from the whole view, and then rendering only a section of the blurred image, the place to deal with scale factor would be in your createCGImage(_:from:) call: just multiply the origin and size of your rect by the whole view's contentScaleFactor.
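The point-to-pixel conversion described above can be sketched as a small helper. The function name `pixelRect(for:scale:)` is illustrative, not part of the question's code:

```swift
import Foundation

// Hypothetical helper: convert a rect given in points (UIKit layout units)
// to pixels (what CGImage/CIImage measure) by multiplying every component
// by the view's scale factor.
func pixelRect(for pointRect: CGRect, scale: CGFloat) -> CGRect {
    return CGRect(x: pointRect.origin.x * scale,
                  y: pointRect.origin.y * scale,
                  width: pointRect.size.width * scale,
                  height: pointRect.size.height * scale)
}

// On a 3x device (e.g. iPhone X), a 100x50-point rect at (10, 20)
// becomes a 300x150-pixel rect at (30, 60).
```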

Also, note you don't necessarily need to go through CGImage to get from UIImage to CIImage and back. You can pass the outer image view's image to CIImage(image:), and the result of the filter to UIImage(ciImage:). In that case, UIKit and Core Image manage the rendering context for you and can probably ensure better performance (say, by keeping the whole pipeline on the GPU). But since you were using the render call to specify a crop rect, you'd need to do that elsewhere, for example with the CIImage cropped(to:) call.
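The CGImage-free route described above might look like the sketch below. The function name `blurredSection(of:rect:radius:)` is illustrative; the crop rect is assumed to already be in pixels with a bottom-left origin, per the rest of this answer:

```swift
import UIKit
import CoreImage

// Sketch of the UIImage -> CIImage -> UIImage pipeline without an
// explicit CIContext render. Cropping happens in Core Image via
// cropped(to:) instead of at createCGImage(_:from:) time.
func blurredSection(of image: UIImage, rect: CGRect, radius: CGFloat) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }
    let filter = CIFilter(name: "CIGaussianBlur")
    filter?.setValue(input, forKey: kCIInputImageKey)
    filter?.setValue(radius, forKey: kCIInputRadiusKey)
    // The rect must be in pixels, measured from the bottom-left corner.
    guard let output = filter?.outputImage?.cropped(to: rect) else { return nil }
    // UIKit renders the CIImage lazily when the UIImage is displayed.
    return UIImage(ciImage: output)
}
```

One caveat with this route: a UIImage backed by a CIImage is rendered on demand, so some APIs that expect a cgImage-backed UIImage (e.g. pngData()) may behave differently.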

It looks like you are on the right track. A couple of things need to be worked through:

The scale can be adjusted by passing the screen scale into your method and multiplying your rect's origin and size by it.

The blurred section was mispositioned because of the difference between the Core Graphics and UIKit coordinate systems: Core Graphics places the origin at the bottom-left, UIKit at the top-left. To fix this, subtract the rect's y-position and height from the image's height (taking scale into account).

Please check out the code below:

extension UIImage {
    func blur(rect: CGRect, radius: CGFloat, scale: CGFloat) -> UIImage? {

        if let cgImage = self.cgImage {
            let inputImage = CIImage(cgImage: cgImage)
            let context = CIContext(options: nil)
            let filter = CIFilter(name: "CIGaussianBlur")
            filter?.setValue(inputImage, forKey: kCIInputImageKey)
            filter?.setValue(radius, forKey: kCIInputRadiusKey)

            // Rect bounds need to be adjusted for based on Image's Scale and Container's Scale
            // Y Position Needs to be Inverted because of UIKit <-> CoreGraphics Coordinate Systems

            let newRect = CGRect(
                x: rect.minX * scale,
                y: (CGFloat(cgImage.height) / scale - rect.minY - rect.height) * scale,
                width: rect.width * scale,
                height: rect.height * scale
            )

            if let outputImage = filter?.outputImage,
                let cgImage = context.createCGImage(outputImage, from: newRect) {
                return UIImage(
                    cgImage: cgImage,
                    scale: scale,
                    orientation: imageOrientation
                )
            }
        }
        return nil
    }
}

let blurredImage = self.imageView.image?.blur(rect: self.blur.frame, radius: 2.0, scale: UIScreen.main.scale)

The problem is that adding any overlay in viewDidLoad will not give you the correct frame size; add it inside viewDidAppear instead.
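As a sketch of that timing point: frames are only final once layout has run, so the overlay is built in viewDidAppear. The class name `PhotoViewController` is hypothetical; `imageView` and `blur` are assumed to be the outlets from the question, and `blur(rect:radius:scale:)` is the extension from the answer above:

```swift
import UIKit

// Hypothetical view controller illustrating where to add the overlay.
class PhotoViewController: UIViewController {
    @IBOutlet var imageView: UIImageView!
    @IBOutlet var blur: UIView!

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // By viewDidAppear, blur.frame reflects the actual on-screen layout,
        // which it may not yet do in viewDidLoad.
        guard let blurred = imageView.image?.blur(rect: blur.frame,
                                                  radius: 2.0,
                                                  scale: UIScreen.main.scale) else { return }
        let overlay = UIImageView(image: blurred)
        overlay.frame = blur.frame
        view.addSubview(overlay)
    }
}
```

Note that viewDidAppear can fire more than once (e.g. after returning from a pushed screen), so in a real app you would guard against adding the overlay repeatedly.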

