
App crash when blurring an image with `CIFilter`

Since using this function to blur images, I have been getting frequent crash reports involving CoreImage:

// Code exactly as in app
extension UserImage {

    func blurImage(_ radius: CGFloat) -> UIImage? {

        guard let ciImage = CIImage(image: self) else {
            return nil
        }

        let clampedImage = ciImage.clampedToExtent()

        let blurFilter = CIFilter(name: "CIGaussianBlur", parameters: [
            kCIInputImageKey: clampedImage,
            kCIInputRadiusKey: radius])

        var filterImage = blurFilter?.outputImage

        filterImage = filterImage?.cropped(to: ciImage.extent)

        guard let finalImage = filterImage else {
            return nil
        }

        return UIImage(ciImage: finalImage)
    }
}

// Code stripped down, contains more in app
class MyImage {

    var blurredImage: UIImage?

    func setBlurredImage() {
        DispatchQueue.global(qos: DispatchQoS.QoSClass.userInitiated).async {

            let blurredImage = self.getImage().blurImage(100)

            DispatchQueue.main.async {

                guard let blurredImage = blurredImage else { return }

                self.blurredImage = blurredImage
            }
        }
    }
}

According to Crashlytics:

  • the crash happens only for a small percentage of sessions
  • the crash happens on various iOS versions from 11.x to 12.x
  • 0% of the devices were in background state when the crash happened

I was not able to reproduce the crash. The process is:

  1. The MyImageView object (a subclass of UIImageView) receives a Notification
  2. Sometimes (depending on other logic) a blurred version of a UIImage is created on a background thread via DispatchQueue.global(qos: DispatchQoS.QoSClass.userInitiated).async
  3. On the main thread the object sets the UIImage with self.image = ...

The app seems to crash after step 3 according to the crash log (UIImageView setImage). On the other hand, the CoreImage frames in the crash log indicate that the problem originates in step 2, where CIFilter is used to create the blurred version of the image. Both observations fit together: a UIImage created with UIImage(ciImage:) holds an unrendered CIImage, and UIImageView renders it with Core Image on the main thread when the image is set (see the _updateLayerContentsForCIImageBackedImage frame below). Note: MyImageView is sometimes used in a UICollectionViewCell.

Crash log:

EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000000

Crashed: com.apple.main-thread
0  CoreImage                      0x1c18128c0 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 2388
1  CoreImage                      0x1c18128c0 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 2388
2  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
3  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
4  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
5  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
6  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
7  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
8  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
9  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
10 CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
11 CoreImage                      0x1c1812f04 CI::Context::render(CI::ProgramNode*, CGRect const&) + 116
12 CoreImage                      0x1c182ca3c invocation function for block in CI::image_render_to_surface(CI::Context*, CI::Image*, CGRect, CGColorSpace*, __IOSurface*, CGPoint, CI::PixelFormat, CI::RenderDestination const*) + 40
13 CoreImage                      0x1c18300bc CI::recursive_tile(CI::RenderTask*, CI::Context*, CI::RenderDestination const*, char const*, CI::Node*, CGRect const&, CI::PixelFormat, CI::swizzle_info const&, CI::TileTask* (CI::ProgramNode*, CGRect) block_pointer) + 608
14 CoreImage                      0x1c182b740 CI::tile_node_graph(CI::Context*, CI::RenderDestination const*, char const*, CI::Node*, CGRect const&, CI::PixelFormat, CI::swizzle_info const&, CI::TileTask* (CI::ProgramNode*, CGRect) block_pointer) + 396
15 CoreImage                      0x1c182c308 CI::image_render_to_surface(CI::Context*, CI::Image*, CGRect, CGColorSpace*, __IOSurface*, CGPoint, CI::PixelFormat, CI::RenderDestination const*) + 1340
16 CoreImage                      0x1c18781c0 -[CIContext(CIRenderDestination) _startTaskToRender:toDestination:forPrepareRender:error:] + 2488
17 CoreImage                      0x1c18777ec -[CIContext(CIRenderDestination) startTaskToRender:fromRect:toDestination:atPoint:error:] + 140
18 CoreImage                      0x1c17c9e4c -[CIContext render:toIOSurface:bounds:colorSpace:] + 268
19 UIKitCore                      0x1e8f41244 -[UIImageView _updateLayerContentsForCIImageBackedImage:] + 880
20 UIKitCore                      0x1e8f38968 -[UIImageView _setImageViewContents:] + 872
21 UIKitCore                      0x1e8f39fd8 -[UIImageView _updateState] + 664
22 UIKitCore                      0x1e8f79650 +[UIView(Animation) performWithoutAnimation:] + 104
23 UIKitCore                      0x1e8f3ff28 -[UIImageView _updateImageViewForOldImage:newImage:] + 504
24 UIKitCore                      0x1e8f3b0ac -[UIImageView setImage:] + 340
25 App                         0x100482434 MyImageView.updateImageView() (<compiler-generated>)
26 App                         0x10048343c closure #1 in MyImageView.handleNotification(_:) + 281 (MyImageView.swift:281)
27 App                         0x1004f1870 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
28 libdispatch.dylib              0x1bbbf4a38 _dispatch_call_block_and_release + 24
29 libdispatch.dylib              0x1bbbf57d4 _dispatch_client_callout + 16
30 libdispatch.dylib              0x1bbbd59e4 _dispatch_main_queue_callback_4CF$VARIANT$armv81 + 1008
31 CoreFoundation                 0x1bc146c1c __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 12
32 CoreFoundation                 0x1bc141b54 __CFRunLoopRun + 1924
33 CoreFoundation                 0x1bc1410b0 CFRunLoopRunSpecific + 436
34 GraphicsServices               0x1be34179c GSEventRunModal + 104
35 UIKitCore                      0x1e8aef978 UIApplicationMain + 212
36 App                         0x1002a3544 main + 18 (AppDelegate.swift:18)
37 libdyld.dylib                  0x1bbc068e0 start + 4

What could be the reason for the crash?


Update

Maybe related to CIImage memory leak. When profiling, I see a lot of CIImage memory leaks with the same stack trace as in the crash log:

[Screenshot: Instruments showing many leaked CIImage objects with the same stack trace as the crash log]

Maybe related to Core Image and memory leak, swift 3.0. I just found that the images were stored in an in-memory array and that onReceiveMemoryWarning was not handled properly and did not clear that array. So the app would crash on memory pressure in certain cases. Maybe that fixes the issue; I'll give an update here.
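A minimal sketch of the fix (the ImageStore type and its property are hypothetical, illustrative stand-ins for the app's real cache): clear the in-memory array when UIKit posts a memory warning:

import UIKit

// Hypothetical cache, illustrative only
final class ImageStore {

    var images: [UIImage] = []

    init() {
        // Drop the cached images on memory pressure instead of letting the app be killed
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(handleMemoryWarning),
            name: UIApplication.didReceiveMemoryWarningNotification,
            object: nil)
    }

    @objc private func handleMemoryWarning() {
        images.removeAll()
    }
}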


Update 2

It seems I was able to reproduce the crash, testing on a physical device (iPhone Xs Max) with a 5 MB JPEG image.

  • When displaying the image unblurred full screen the memory usage of the app is 160MB total.
  • When displaying the image blurred in 1/4 of the screen size, the memory usage is 380MB.
  • When displaying the image blurred full screen the memory usage jumps to >1.6GB and the app then crashes most of the time with:

Message from debugger: Terminated due to memory issue

I am surprised that a 5 MB image can cause memory usage of >1.6 GB for a "simple" blur. Do I have to manually deallocate anything here (CIContext, CIImage, etc.), or is that normal and do I have to manually resize the image to ~kB before blurring?

Update 3

Adding multiple image views that display the blurred image causes the memory usage to go up by a few hundred MB each time an image view is added, until the view is removed, even though only one image is visible at a time. Maybe CIFilter is not intended to be used for displaying an image, because the unrendered result occupies far more memory than the rendered image itself would.

So I changed the blur function to render the image into a graphics context, and sure enough, the memory only increases briefly while rendering and falls back to pre-blurring levels afterwards.

Here is the updated method:

func blurImage(_ radius: CGFloat) -> UIImage? {

    guard let ciImage = CIImage(image: self) else {
        return nil
    }

    let clampedImage = ciImage.clampedToExtent()

    let blurFilter = CIFilter(name: "CIGaussianBlur", parameters: [
        kCIInputImageKey: clampedImage,
        kCIInputRadiusKey: radius])

    var filteredImage = blurFilter?.outputImage

    filteredImage = filteredImage?.cropped(to: ciImage.extent)

    guard let blurredCiImage = filteredImage else {
        return nil
    }

    let rect = CGRect(origin: CGPoint.zero, size: size)

    UIGraphicsBeginImageContext(rect.size)
    UIImage(ciImage: blurredCiImage).draw(in: rect)
    let blurredImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    return blurredImage
}

In addition, thanks to @matt and @FrankSchlegel, who suggested in the comments that the high memory consumption can be mitigated by downsampling the image before blurring, which I will also do. It is surprising that even a 300x300 px image causes a spike in memory usage of ~500 MB, considering that 2 GB is the limit at which the app is terminated. I will post an update once the app is live with these changes.

Update 4

I added this code to downsample the image to a max of 300x300px before blurring it:

func resizeImageWithAspectFit(_ boundSize: CGSize) -> UIImage {

    let ratio = self.size.width / self.size.height
    let maxRatio = boundSize.width / boundSize.height

    var scaleFactor: CGFloat

    if ratio > maxRatio {
        scaleFactor = boundSize.width / self.size.width

    } else {
        scaleFactor = boundSize.height / self.size.height
    }

    let newWidth = self.size.width * scaleFactor
    let newHeight = self.size.height * scaleFactor

    let rect = CGRect(x: 0.0, y: 0.0, width: newWidth, height: newHeight)

    UIGraphicsBeginImageContext(rect.size)
    self.draw(in: rect)
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    return newImage!
}

The crashes look different now, but I am unsure whether the crash happens during downsampling or while drawing the blurred image as described in Update 3, as both use UIGraphicsBeginImageContext:

EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000010
Crashed: com.apple.root.user-initiated-qos
0  libobjc.A.dylib                0x1ce457530 objc_msgSend + 16
1  CoreImage                      0x1d48773dc -[CIContext initWithOptions:] + 96
2  CoreImage                      0x1d4877358 +[CIContext contextWithOptions:] + 52
3  UIKitCore                      0x1fb7ea794 -[UIImage drawInRect:blendMode:alpha:] + 984
4  MyApp                          0x1005bb478 UIImage.blurImage(_:) (<compiler-generated>)
5  MyApp                          0x100449f58 closure #1 in MyImage.getBlurredImage() + 153 (UIImage+Extension.swift:153)
6  MyApp                          0x1005cda48 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
7  libdispatch.dylib              0x1ceca4a38 _dispatch_call_block_and_release + 24
8  libdispatch.dylib              0x1ceca57d4 _dispatch_client_callout + 16
9  libdispatch.dylib              0x1cec88afc _dispatch_root_queue_drain + 636
10 libdispatch.dylib              0x1cec89248 _dispatch_worker_thread2 + 116
11 libsystem_pthread.dylib        0x1cee851b4 _pthread_wqthread + 464
12 libsystem_pthread.dylib        0x1cee87cd4 start_wqthread + 4
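
One alternative I might try (a sketch, untested; requires iOS 10+) is UIGraphicsImageRenderer, which manages the bitmap context itself instead of relying on the manual UIGraphicsBeginImageContext/UIGraphicsEndImageContext pairing. As a UIImage extension method it could look like this:

extension UIImage {

    // Sketch: same aspect-fit math as above, drawn through UIGraphicsImageRenderer
    func resizeWithRenderer(_ boundSize: CGSize) -> UIImage {
        let scaleFactor = min(boundSize.width / size.width, boundSize.height / size.height)
        let newSize = CGSize(width: size.width * scaleFactor, height: size.height * scaleFactor)
        let renderer = UIGraphicsImageRenderer(size: newSize)
        return renderer.image { _ in
            self.draw(in: CGRect(origin: .zero, size: newSize))
        }
    }
}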

Here are the threads used to resize and blur the image (blurImage() is the method described in Update 3):

class MyImage {

    var originalImage: UIImage?
    var blurredImage: UIImage?

    // Called on the main thread
    func getBlurredImage() {

        DispatchQueue.global(qos: DispatchQoS.QoSClass.userInitiated).async {

            // Create resized image
            guard let originalImage = self.originalImage else { return }
            let smallImage = originalImage.resizeImageWithAspectFit(CGSize(width: 1000, height: 1000))

            // Create blurred image
            let blurredImage = smallImage.blurImage(100)

            DispatchQueue.main.async {

                self.blurredImage = blurredImage

                // Notify observers to display `blurredImage` in UIImageView on the main thread
                NotificationCenter.default.post(name: BlurredImageIsReady, object: nil, userInfo: nil)
            }
        }
    }
}

Answer

I did some benchmarking and found it feasible to blur and display very large images when rendering directly into an MTKView, even when the processing happens at the original input size. Here is the whole testing code:

import CoreImage
import MetalKit
import UIKit

class ViewController: UIViewController {

    var device: MTLDevice!
    var commandQueue: MTLCommandQueue!
    var context: CIContext!
    let filter = CIFilter(name: "CIGaussianBlur")!
    let testImage = UIImage(named: "test10")! // 10 MB, 40 MP image
    @IBOutlet weak var metalView: MTKView!

    override func viewDidLoad() {
        super.viewDidLoad()

        self.device = MTLCreateSystemDefaultDevice()
        self.commandQueue = self.device.makeCommandQueue()

        self.context = CIContext(mtlDevice: self.device)

        self.metalView.delegate = self
        self.metalView.device = self.device
        self.metalView.isPaused = true
        self.metalView.enableSetNeedsDisplay = true
        self.metalView.framebufferOnly = false
    }

}

extension ViewController: MTKViewDelegate {

    func draw(in view: MTKView) {
        guard let currentDrawable = view.currentDrawable,
              let commandBuffer = self.commandQueue.makeCommandBuffer() else { return }

        let input = CIImage(image: self.testImage)!

        self.filter.setValue(input.clampedToExtent(), forKey: kCIInputImageKey)
        self.filter.setValue(100.0, forKey: kCIInputRadiusKey)
        let output = self.filter.outputImage!.cropped(to: input.extent)

        let drawableSize = view.drawableSize

        // Scale image to aspect-fit view.
        // NOTE: This is a benchmark scenario. Usually you would scale the image to a reasonable processing size
        //       (i.e. close to your output size) _before_ applying expensive filters.
        let scaleX = drawableSize.width / output.extent.width
        let scaleY = drawableSize.height / output.extent.height
        let scale = min(scaleX, scaleY)
        let scaledOutput = output.transformed(by: CGAffineTransform(scaleX: scale, y: scale))

        let destination = CIRenderDestination(mtlTexture: currentDrawable.texture, commandBuffer: commandBuffer)
        // BONUS: You can Quick Look the `task` in Xcode to see what Core Image is actually going to do on the GPU.
        let task = try! self.context.startTask(toRender: scaledOutput, to: destination)

        commandBuffer.present(currentDrawable)
        commandBuffer.commit()

        // BONUS: No need to wait, but you can Quick Look the `info` to see what was actually done during rendering
        //        and to get performance metrics, like the actual number of pixels processed.
        DispatchQueue.global(qos: .background).async {
            let info = try! task.waitUntilCompleted()
        }
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

}

For the 10 MB test image (40 megapixels!) the memory spiked to 800 MB very briefly during rendering, which is to be expected. I even tried a 30 MB (~74 megapixels!!) image and it went through without problems, using 1.3 GB of memory at most.

When I scaled the image to the destination size before applying the filter, the memory stayed at ~60 MB the whole time. So this is really what you should be doing in any case. But note that you need to adjust the radius of the Gaussian blur accordingly to achieve the same visual result.
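
For example (a sketch reusing input, filter, and drawableSize from the benchmark code above): if the input is downscaled by scale before filtering, the radius has to shrink by the same factor:

// Sketch: scale the *input* first, then blur with a proportionally smaller radius
let scale = min(drawableSize.width / input.extent.width,
                drawableSize.height / input.extent.height)
let scaledInput = input.transformed(by: CGAffineTransform(scaleX: scale, y: scale))
self.filter.setValue(scaledInput.clampedToExtent(), forKey: kCIInputImageKey)
self.filter.setValue(100.0 * scale, forKey: kCIInputRadiusKey) // radius shrinks with the image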

If you need the rendering result for more than just displaying, I guess you could use the createCGImage API of CIContext instead of rendering into the MTKView's drawable and get the same memory usage.
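
A minimal sketch of that route (assuming the same context, filter, and input as in the benchmark code above):

// Sketch: render the filtered image eagerly into a CGImage instead of a drawable
if let output = self.filter.outputImage?.cropped(to: input.extent),
   let cgImage = self.context.createCGImage(output, from: output.extent) {
    // Fully rendered bitmap; safe to hand to a UIImageView on the main thread
    let blurred = UIImage(cgImage: cgImage)
}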

I hope this is applicable to your scenario.

Answer 2

This appears to be a simple threading issue. CIFilter is not thread-safe. You cannot form a chain of filters on one thread and then render the resulting CIImage on another thread. You should confine yourself to small images, do everything on the main thread, and render explicitly using the GPU. That's what Core Image is all about.
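
A minimal sketch of that advice (the Blurrer type and queue label are hypothetical, purely illustrative): build the filter and render eagerly on one serial queue, reuse a single CIContext (which, unlike CIFilter, is safe to share), and hand only the finished bitmap to the main thread:

import CoreImage
import UIKit

// Hypothetical helper, illustrative only: one serial queue owns both filtering and rendering
final class Blurrer {

    // CIContext is expensive to create and, unlike CIFilter, thread-safe; reuse one
    private let context = CIContext()
    private let queue = DispatchQueue(label: "blur-queue") // hypothetical label

    func blur(_ image: UIImage, radius: CGFloat, completion: @escaping (UIImage?) -> Void) {
        queue.async {
            guard let input = CIImage(image: image),
                  let filter = CIFilter(name: "CIGaussianBlur", parameters: [
                      kCIInputImageKey: input.clampedToExtent(),
                      kCIInputRadiusKey: radius]),
                  let output = filter.outputImage?.cropped(to: input.extent),
                  // Render eagerly here so UIImageView never has to invoke Core Image itself
                  let cgImage = self.context.createCGImage(output, from: output.extent) else {
                DispatchQueue.main.async { completion(nil) }
                return
            }
            DispatchQueue.main.async { completion(UIImage(cgImage: cgImage)) }
        }
    }
}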
