
Memory usage keeps rising on older devices using Metal

I use Metal and CADisplayLink to live filter a CIImage and render it into an MTKView.

// Starting display link 
displayLink = CADisplayLink(target: self, selector: #selector(applyAnimatedFilter))
displayLink.preferredFramesPerSecond = 30
displayLink.add(to: .current, forMode: .default)

@objc func applyAnimatedFilter() {
    ...
    metalView.image = filter.applyFilter(image: ciImage)
}
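
Note that CADisplayLink retains its target, so the link also has to be invalidated when the screen goes away, otherwise the controller (and everything it references) stays alive. A minimal teardown sketch, assuming displayLink is an optional stored property on the view controller:

override func viewDidDisappear(_ animated: Bool) {
    super.viewDidDisappear(animated)
    // Invalidating removes the link from all run loops and releases its target.
    displayLink?.invalidate()
    displayLink = nil
}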

According to the memory monitor in Xcode, memory usage is stable on an iPhone X and never goes above 100 MB, but on devices like the iPhone 6 or iPhone 6s the memory usage keeps growing until eventually the system kills the app.

I've checked for memory leaks using Instruments, but no leaks were reported. Running the app through Allocations also doesn't show any problems, and there the app doesn't get shut down by the system. I also find it interesting that on newer devices the memory usage is stable, but on older ones it just keeps growing and growing.

The filter's complexity doesn't matter; I've tried even the simplest filters and the issue persists. Here is an example from my Metal file:

extern "C" { namespace coreimage {

    float4 applyColorFilter(sample_t s, float red, float green, float blue) {

        float4 newPixel = s.rgba;
        newPixel[0] = newPixel[0] + red;
        newPixel[1] = newPixel[1] + green;
        newPixel[2] = newPixel[2] + blue;

        return newPixel;
    }
}

I wonder what could cause this issue on older devices and in which direction I should look.

Update 1: here are two one-minute graphs, one from Xcode and one from Allocations, both using the same filter. The Allocations graph is stable while the Xcode graph keeps growing:

[Xcode memory graph]

[Allocations graph]

Update 2: attaching a screenshot of the Allocations list sorted by size; the app was running for 16 minutes, applying the filter non-stop:

[Allocations list screenshot]

Update 3: a bit more info on what is happening in applyAnimatedFilter():

I render a filtered image into metalView, which is an MTKView. I receive the filtered image from filter.applyFilter(image: ciImage), where the following happens in the Filter class:

 func applyFilter(image: CIImage) -> CIImage {
    ...
    let colorMix = ColorMix()
    return colorMix.use(image: image, time: filterTime)
 }

where filterTime is just a Double variable. And finally, here is the whole ColorMix class:

import UIKit

class ColorMix: CIFilter {

    private let kernel: CIKernel

    @objc dynamic var inputImage: CIImage?
    @objc dynamic var inputTime: CGFloat = 0

    override init() {

        let url = Bundle.main.url(forResource: "default", withExtension: "metallib")!
        let data = try! Data(contentsOf: url)
        kernel = try! CIKernel(functionName: "colorMix", fromMetalLibraryData: data)
        super.init()
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    func outputImage() -> CIImage? {

        guard let inputImage = inputImage else {return nil}

        return kernel.apply(extent: inputImage.extent, roiCallback: {
            (index, rect) in
            return rect.insetBy(dx: -1, dy: -1)
        }, arguments: [inputImage, CIVector(x: inputImage.extent.width, y: inputImage.extent.height), inputTime])
    }

    func use(image: CIImage, time: Double) -> CIImage {

        var resultImage = image

        // 1. Apply filter
        let filter = ColorMix()
        filter.setValue(resultImage, forKey: "inputImage")
        filter.setValue(NSNumber(floatLiteral: time), forKey: "inputTime")

        resultImage = filter.outputImage()!

        return resultImage
    }

}

Here are a few observations, but I'm not sure whether any of them actually causes the memory usage you're seeing:

  • In applyFilter you are creating a new ColorMix filter on every frame. Additionally, inside the instance method use(image:time:) you are creating another one on every call. That's a lot of overhead, especially since the filter loads its kernel on every init. It would be advisable to create a single ColorMix filter during setup and only update its inputImage and inputTime on every frame (see the sketch after this list).
  • outputImage is not a func, but a var that you override from the CIFilter superclass:

    override var outputImage: CIImage? { /* your code here */ }

  • Is your colorMix kernel performing any kind of convolution? If not, it could be a CIColorKernel instead.

  • If you need the size of the input inside your kernel, you don't need to pass it as an extra argument. You can just call .size() on the input sampler.
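
Putting the first three points together, a minimal sketch of a reusable filter, assuming the kernel keeps the name colorMix and now takes only the image and the time (with the size argument dropped per the last point):

import CoreImage

class ColorMix: CIFilter {

    // Load the kernel once for all instances instead of on every init.
    private static let kernel: CIColorKernel = {
        let url = Bundle.main.url(forResource: "default", withExtension: "metallib")!
        let data = try! Data(contentsOf: url)
        return try! CIColorKernel(functionName: "colorMix", fromMetalLibraryData: data)
    }()

    @objc dynamic var inputImage: CIImage?
    @objc dynamic var inputTime: CGFloat = 0

    // Override the outputImage property instead of declaring a method.
    override var outputImage: CIImage? {
        guard let inputImage = inputImage else { return nil }
        // A color kernel needs no ROI callback; its ROI is always the pixel itself.
        return ColorMix.kernel.apply(extent: inputImage.extent,
                                     arguments: [inputImage, inputTime])
    }
}

Then create the filter once during setup and only update its inputs on every frame:

let colorMix = ColorMix()

func applyFilter(image: CIImage, time: Double) -> CIImage? {
    colorMix.inputImage = image
    colorMix.inputTime = CGFloat(time)
    return colorMix.outputImage
}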

This is a bug in Xcode's diagnostic features (Metal validation and/or GPU frame capture). If you turn those off (both can be disabled in the scheme's Run options), the memory usage should be similar to when running outside of Xcode.
