Applying CIFilter to UIImage results in resized and repositioned image

After applying a CIFilter to a photo captured with the camera, the resulting image shrinks and repositions itself.

I was thinking that if I could get the original image's size and orientation, the result would scale accordingly and I could pin the image view to the corners of the screen. However, nothing changes with this approach, and I'm not aware of a way to get the image to scale to the full size of the screen.

func applyBloom() -> UIImage {
    let ciImage = CIImage(image: image) // image is from UIImageView

    let filteredImage = ciImage?.applyingFilter("CIBloom",
                                                withInputParameters: [kCIInputRadiusKey: 8,
                                                                      kCIInputIntensityKey: 1.00])
    let originalScale = image.scale
    let originalOrientation = image.imageOrientation

    if let image = filteredImage {
        return UIImage(ciImage: image, scale: originalScale, orientation: originalOrientation)
    }

    return self.image
}

Picture description: the captured photo and a screenshot of the filtered image; the empty spacing is the result of the image shrinking.


Try something like this. Replace your function with:

func applyBloom() -> UIImage {
    // image is from UIImageView
    guard let ciInputImage = CIImage(image: image) else { return image }

    let ciOutputImage = ciInputImage.applyingFilter("CIBloom",
                                                    withInputParameters: [kCIInputRadiusKey: 8,
                                                                          kCIInputIntensityKey: 1.00])
    let context = CIContext()
    // Render the filtered image into a CGImage the size of the *input* image.
    guard let cgOutputImage = context.createCGImage(ciOutputImage, from: ciInputImage.extent) else {
        return image
    }
    return UIImage(cgImage: cgOutputImage)
}
  • I renamed various variables to help explain what's happening.
  • Obviously, depending on your code, some tweaking to optionals and unwrapping may be needed.

What's happening is this: take the filtered/output CIImage and, using a CIContext, create a CGImage the size of the input CIImage.

  • Be aware that a CIContext is expensive. If you already have one created, you should probably use it.
  • Pretty much, a UIImage's size is the same as a CIImage's extent. (I say "pretty much" because some generated CIImages can have infinite extents.)
  • Depending on your specific needs (and your UIImageView), you may want to use the output CIImage extent instead. Usually though, they are the same.
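Since a CIContext is expensive to create, one common pattern is to create it once and reuse it for every render. The following is a minimal sketch (the `FilterRenderer` helper name is hypothetical, not part of the answer):

```swift
import CoreImage
import UIKit

// Hypothetical helper: the CIContext is created once and reused for
// every render, instead of being rebuilt on each filter application.
enum FilterRenderer {
    static let sharedContext = CIContext()

    static func render(_ ciImage: CIImage) -> UIImage? {
        guard let cgImage = sharedContext.createCGImage(ciImage, from: ciImage.extent) else {
            return nil
        }
        return UIImage(cgImage: cgImage)
    }
}
```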

Last, a suggestion. If you are trying to use a CIFilter to show "near real-time" changes to an image (like a photo editor), consider the major performance improvement you'll get by using CIImages and a GLKView over UIImages and a UIImageView. The former uses the device's GPU instead of the CPU.
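As a rough sketch of that GPU path (assumed setup, matching the Swift 3-era APIs used above; GLKView has since been deprecated in favor of MTKView, but the idea is the same):

```swift
import GLKit
import CoreImage

// Sketch: render filtered CIImages straight into a GLKView so the GPU
// does the drawing, instead of converting to UIImage for a UIImageView.
class FilterViewController: GLKViewController {
    var eaglContext: EAGLContext!
    var ciContext: CIContext!
    var currentImage: CIImage?   // updated as the user edits

    override func viewDidLoad() {
        super.viewDidLoad()
        eaglContext = EAGLContext(api: .openGLES2)
        let glkView = view as! GLKView
        glkView.context = eaglContext
        // GPU-backed context, created once and reused for every frame.
        ciContext = CIContext(eaglContext: eaglContext)
    }

    override func glkView(_ view: GLKView, drawIn rect: CGRect) {
        guard let image = currentImage else { return }
        // drawableWidth/Height are in pixels, not points.
        let dest = CGRect(x: 0, y: 0,
                          width: view.drawableWidth,
                          height: view.drawableHeight)
        ciContext.draw(image, in: dest, from: image.extent)
    }
}
```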

This can also happen if a CIFilter outputs an image with dimensions different from the input image (e.g. with CIPixellate).

In which case, simply tell the CIContext to render the image in a smaller rectangle:

let cgOutputImage = context.createCGImage(ciOutputImage, from: ciInputImage.extent.insetBy(dx: 20, dy: 20))
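Alternatively (a sketch, not from the answer above), you can crop the filtered CIImage itself back to the input's extent before rendering, which avoids hard-coding an inset:

```swift
import CoreImage

// Sketch: some filters, such as CIPixellate, can pad the output extent.
// Cropping the output back to the input's extent restores the original size.
// (cropped(to:) is the newer name; older SDKs use cropping(to:).)
func pixellated(_ input: CIImage) -> CIImage {
    let output = input.applyingFilter("CIPixellate",
                                      withInputParameters: [kCIInputScaleKey: 12])
    return output.cropped(to: input.extent)
}
```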
