App crashes when blurring an image with CIFilter

Man*_*uel 5 core-image uiimageview ios cifilter ciimage

I keep getting the following crash report from CoreImage for this function, which I use to blur an image:

// Code exactly as in app
extension UserImage {

    func blurImage(_ radius: CGFloat) -> UIImage? {

        guard let ciImage = CIImage(image: self) else {
            return nil
        }

        let clampedImage = ciImage.clampedToExtent()

        let blurFilter = CIFilter(name: "CIGaussianBlur", parameters: [
            kCIInputImageKey: clampedImage,
            kCIInputRadiusKey: radius])

        var filterImage = blurFilter?.outputImage

        filterImage = filterImage?.cropped(to: ciImage.extent)

        guard let finalImage = filterImage else {
            return nil
        }

        return UIImage(ciImage: finalImage)
    }
}

// Code stripped down, contains more in app
class MyImage {

    var blurredImage: UIImage?

    func setBlurredImage() {
        DispatchQueue.global(qos: DispatchQoS.QoSClass.userInitiated).async {

            let blurredImage = self.getImage().blurImage(100)

            DispatchQueue.main.async {

                guard let blurredImage = blurredImage else { return }

                self.blurredImage = blurredImage
            }
        }
    }
}

According to Crashlytics:

  • The crash only occurs in a small fraction of sessions
  • The crash occurs on various iOS versions from 11.x to 12.x
  • 0% of the devices were in the background when the crash occurred

I cannot reproduce the crash. The flow is:

  1. A MyImageView object (a subclass of UIImageView) receives a Notification
  2. Sometimes (depending on other logic) a blurred version of a UIImage is created on a DispatchQueue.global(qos: DispatchQoS.QoSClass.userInitiated).async thread
  3. On the main thread, the object sets the UIImage via self.image = ...

According to the crash log (UIImageView setImage), the app seems to crash after step 3. On the other hand, the CIImage frames in the crash log suggest that the problem lies somewhere in step 2, where CIFilter is used to create the blurred version of the image. Note: MyImageView is sometimes used inside a UICollectionViewCell.
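
For reference: a UIImage created with UIImage(ciImage:) is not rasterized up front; Core Image only renders when the image view actually displays it, which matches the CoreImage frames sitting underneath -[UIImageView setImage:] in the log below. Here is a minimal sketch of rendering the blurred result eagerly on the background thread instead (the shared CIContext and the helper name are illustrative and not part of the original code; UserImage is assumed to be a UIImage, as the question's code implies):

import UIKit
import CoreImage

extension UserImage {

    // Reused across calls; creating a CIContext is expensive.
    private static let blurContext = CIContext()

    // Hypothetical variant of blurImage(_:) that renders the filter output into a
    // bitmap-backed UIImage right away instead of handing a lazy,
    // CIImage-backed UIImage to the image view.
    func eagerlyBlurredImage(_ radius: CGFloat) -> UIImage? {

        guard let ciImage = CIImage(image: self) else {
            return nil
        }

        let blurFilter = CIFilter(name: "CIGaussianBlur", parameters: [
            kCIInputImageKey: ciImage.clampedToExtent(),
            kCIInputRadiusKey: radius])

        guard let filteredImage = blurFilter?.outputImage?.cropped(to: ciImage.extent),
              // Rendering happens here, on the calling (background) thread.
              let cgImage = UserImage.blurContext.createCGImage(filteredImage, from: filteredImage.extent) else {
            return nil
        }

        return UIImage(cgImage: cgImage, scale: self.scale, orientation: self.imageOrientation)
    }
}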

Crash log:

EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000000

Crashed: com.apple.main-thread
0  CoreImage                      0x1c18128c0 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 2388
1  CoreImage                      0x1c18128c0 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 2388
2  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
3  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
4  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
5  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
6  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
7  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
8  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
9  CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
10 CoreImage                      0x1c18122e8 CI::Context::recursive_render(CI::TileTask*, CI::Node*, CGRect const&, CI::Node*, bool) + 892
11 CoreImage                      0x1c1812f04 CI::Context::render(CI::ProgramNode*, CGRect const&) + 116
12 CoreImage                      0x1c182ca3c invocation function for block in CI::image_render_to_surface(CI::Context*, CI::Image*, CGRect, CGColorSpace*, __IOSurface*, CGPoint, CI::PixelFormat, CI::RenderDestination const*) + 40
13 CoreImage                      0x1c18300bc CI::recursive_tile(CI::RenderTask*, CI::Context*, CI::RenderDestination const*, char const*, CI::Node*, CGRect const&, CI::PixelFormat, CI::swizzle_info const&, CI::TileTask* (CI::ProgramNode*, CGRect) block_pointer) + 608
14 CoreImage                      0x1c182b740 CI::tile_node_graph(CI::Context*, CI::RenderDestination const*, char const*, CI::Node*, CGRect const&, CI::PixelFormat, CI::swizzle_info const&, CI::TileTask* (CI::ProgramNode*, CGRect) block_pointer) + 396
15 CoreImage                      0x1c182c308 CI::image_render_to_surface(CI::Context*, CI::Image*, CGRect, CGColorSpace*, __IOSurface*, CGPoint, CI::PixelFormat, CI::RenderDestination const*) + 1340
16 CoreImage                      0x1c18781c0 -[CIContext(CIRenderDestination) _startTaskToRender:toDestination:forPrepareRender:error:] + 2488
17 CoreImage                      0x1c18777ec -[CIContext(CIRenderDestination) startTaskToRender:fromRect:toDestination:atPoint:error:] + 140
18 CoreImage                      0x1c17c9e4c -[CIContext render:toIOSurface:bounds:colorSpace:] + 268
19 UIKitCore                      0x1e8f41244 -[UIImageView _updateLayerContentsForCIImageBackedImage:] + 880
20 UIKitCore                      0x1e8f38968 -[UIImageView _setImageViewContents:] + 872
21 UIKitCore                      0x1e8f39fd8 -[UIImageView _updateState] + 664
22 UIKitCore                      0x1e8f79650 +[UIView(Animation) performWithoutAnimation:] + 104
23 UIKitCore                      0x1e8f3ff28 -[UIImageView _updateImageViewForOldImage:newImage:] + 504
24 UIKitCore                      0x1e8f3b0ac -[UIImageView setImage:] + 340
25 App                         0x100482434 MyImageView.updateImageView() (<compiler-generated>)
26 App                         0x10048343c closure #1 in MyImageView.handleNotification(_:) + 281 (MyImageView.swift:281)
27 App                         0x1004f1870 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
28 libdispatch.dylib              0x1bbbf4a38 _dispatch_call_block_and_release + 24
29 libdispatch.dylib              0x1bbbf57d4 _dispatch_client_callout + 16
30 libdispatch.dylib              0x1bbbd59e4 _dispatch_main_queue_callback_4CF$VARIANT$armv81 + 1008
31 CoreFoundation                 0x1bc146c1c __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 12
32 CoreFoundation                 0x1bc141b54 __CFRunLoopRun + 1924
33 CoreFoundation                 0x1bc1410b0 CFRunLoopRunSpecific + 436
34 GraphicsServices               0x1be34179c GSEventRunModal + 104
35 UIKitCore                      0x1e8aef978 UIApplicationMain + 212
36 App                         0x1002a3544 main + 18 (AppDelegate.swift:18)
37 libdyld.dylib                  0x1bbc068e0 start + 4

What could be the cause of this crash?


Update

It may be related to a CIImage memory leak. While profiling, I see a large number of leaked CIImage objects with the same stack trace as in the crash log:

[Screenshot: CIImage memory leaks shown in the profiler]

It may also be related to Core Image and memory leak, swift 3.0. I just discovered that the images are stored in an array in memory, and that onReceiveMemoryWarning was not handled properly and did not clear that array. So in some situations the app crashed due to memory issues. Maybe this fixes the problem; I will post an update here.
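
As a hedged sketch of what clearing such an array on a memory warning might look like (the ImageCache type and its properties are illustrative and not taken from the app):

import UIKit

final class ImageCache {

    // Illustrative stand-in for the array of images the app keeps in memory.
    private var images: [UIImage] = []

    private var memoryWarningObserver: NSObjectProtocol?

    init() {
        // Empty the array whenever the system posts a memory warning so that
        // cached (and especially blurred) images do not accumulate.
        memoryWarningObserver = NotificationCenter.default.addObserver(
            forName: UIApplication.didReceiveMemoryWarningNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            self?.images.removeAll()
        }
    }

    deinit {
        if let observer = memoryWarningObserver {
            NotificationCenter.default.removeObserver(observer)
        }
    }

    func add(_ image: UIImage) {
        images.append(image)
    }
}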


Update 2

It looks like I was able to reproduce the crash. Testing on a physical device, an iPhone Xs Max, with a 5 MB JPEG image:

  • When the non-blurred image is displayed full screen, the app's total memory usage is 160 MB.
  • When the blurred image is displayed at 1/4 of the screen size, memory usage is 380 MB.
  • When the blurred image is displayed full screen, memory usage jumps to >1.6 GB and the app then crashes most of the time with:

Message from debugger: Terminated due to memory issue

I am surprised that a 5 MB image can cause >1.6 GB of memory usage for a "simple" blur. Do I have to deallocate anything manually here, the CIContext, the CIImage, etc., or is this normal and do I have to manually resize the image down to ~kB before blurring?

Update 3

Adding multiple image views that display the blurred image increases memory usage by several hundred MB for each image view added, until the view is removed, even though only one image is visible at a time. Maybe CIFilter is not meant to be used for displaying the image, since it takes up far more memory than the rendered image itself would.

So I changed the blur function to render the image in a context, and sure enough, memory now only rises briefly while the image is being rendered and then drops back to the pre-blur level.

Here is the updated method:

func blurImage(_ radius: CGFloat) -> UIImage? {

    guard let ciImage = CIImage(image: self) else {
        return nil
    }

    let clampedImage = ciImage.clampedToExtent()

    let blurFilter = CIFilter(name: "CIGaussianBlur", withInputParameters: [
        kCIInputImageKey: clampedImage,
        kCIInputRadiusKey: radius])

    var filteredImage = blurFilter?.outputImage

    filteredImage = filteredImage?.cropped(to: ciImage.extent)

    guard let blurredCiImage = filteredImage else {
        return nil
    }

    let rect = CGRect(origin: CGPoint.zero, size: size)

    UIGraphicsBeginImageContext(rect.size)
    UIImage(ciImage: blurredCiImage).draw(in: rect)
    let blurredImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    return blurredImage
}

Also, thanks to @matt and @FrankSchlegel, who suggested in the comments that the high memory consumption can be mitigated by downsampling the image before blurring, which I will do as well. Surprisingly, even a 300x300 pixel image causes a memory spike of about 500 MB, which is a lot considering that 2 GB is the limit at which the app is terminated. I will post an update once the app ships with these changes.

Update 4

I added the following code to scale the image down to a maximum of 300x300 px before blurring it:

func resizeImageWithAspectFit(_ boundSize: CGSize) -> UIImage {

    let ratio = self.size.width / self.size.height
    let maxRatio = boundSize.width / boundSize.height

    var scaleFactor: CGFloat

    if ratio > maxRatio {
        scaleFactor = boundSize.width / self.size.width

    } else {
        scaleFactor = boundSize.height / self.size.height
    }

    let newWidth = self.size.width * scaleFactor
    let newHeight = self.size.height * scaleFactor

    let rect = CGRect(x: 0.0, y: 0.0, width: newWidth, height: newHeight)

    UIGraphicsBeginImageContext(rect.size)
    self.draw(in: rect)
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    return newImage!
}

Now the crash looks different, but I am not sure whether it happens during the downsampling or while drawing the blurred image as described in update #3, since both use a UIGraphicsImageContext:

EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000010
Crashed: com.apple.root.user-initiated-qos
0  libobjc.A.dylib                0x1ce457530 objc_msgSend + 16
1  CoreImage                      0x1d48773dc -[CIContext initWithOptions:] + 96
2  CoreImage                      0x1d4877358 +[CIContext contextWithOptions:] + 52
3  UIKitCore                      0x1fb7ea794 -[UIImage drawInRect:blendMode:alpha:] + 984
4  MyApp                          0x1005bb478 UIImage.blurImage(_:) (<compiler-generated>)
5  MyApp                          0x100449f58 closure #1 in MyImage.getBlurredImage() + 153 (UIImage+Extension.swift:153)
6  MyApp                          0x1005cda48 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
7  libdispatch.dylib              0x1ceca4a38 _dispatch_call_block_and_release + 24
8  libdispatch.dylib              0x1ceca57d4 _dispatch_client_callout + 16
9  libdispatch.dylib              0x1cec88afc _dispatch_root_queue_drain + 636
10 libdispatch.dylib              0x1cec89248 _dispatch_worker_thread2 + 116
11 libsystem_pthread.dylib        0x1cee851b4 _pthread_wqthread + 464
12 libsystem_pthread.dylib        0x1cee87cd4 start_wqthread + 4

Here is the thread that resizes and blurs the image (blurImage() is the method described in update #3):

class MyImage {

    var originalImage: UIImage?
    var blurredImage: UIImage?

    // Called on the main thread
    func getBlurredImage() {

        DispatchQueue.global(qos: DispatchQoS.QoSClass.userInitiated).async {

            // Create resized image
            guard let smallImage = self.originalImage?.resizeImageWithAspectFitToSizeLimit(CGSize(width: 1000, height: 1000)) else { return }

            // Create blurred image
            let blurredImage = smallImage.blurImage(100)

            DispatchQueue.main.async {

                self.blurredImage = blurredImage

                // Notify observers to display `blurredImage` in UIImageView on the main thread
                NotificationCenter.default.post(name: BlurredImageIsReady, object: nil, userInfo: nil)
            }
        }
    }
}

Fra*_*gel 2

I did some benchmarking and found that it is possible to blur and display very large images when rendering directly into an MTKView, even when the processing happens at the original input size. Here is the whole test code:

import CoreImage
import MetalKit
import UIKit

class ViewController: UIViewController {

    var device: MTLDevice!
    var commandQueue: MTLCommandQueue!
    var context: CIContext!
    let filter = CIFilter(name: "CIGaussianBlur")!
    let testImage = UIImage(named: "test10")! // 10 MB, 40 MP image
    @IBOutlet weak var metalView: MTKView!

    override func viewDidLoad() {
        super.viewDidLoad()

        self.device = MTLCreateSystemDefaultDevice()
        self.commandQueue = self.device.makeCommandQueue()

        self.context = CIContext(mtlDevice: self.device)

        self.metalView.delegate = self
        self.metalView.device = self.device
        self.metalView.isPaused = true
        self.metalView.enableSetNeedsDisplay = true
        self.metalView.framebufferOnly = false
    }

}

extension ViewController: MTKViewDelegate {

    func draw(in view: MTKView) {
        guard let currentDrawable = view.currentDrawable,
              let commandBuffer = self.commandQueue.makeCommandBuffer() else { return }

        let input = CIImage(image: self.testImage)!

        self.filter.setValue(input.clampedToExtent(), forKey: kCIInputImageKey)
        self.filter.setValue(100.0, forKey: kCIInputRadiusKey)
        let output = self.filter.outputImage!.cropped(to: input.extent)

        let drawableSize = view.drawableSize

        // Scale image to aspect-fit view.
        // NOTE: This is a benchmark scenario. Usually you would scale the image to a reasonable processing size
        //       (i.e. close to your output size) _before_ applying expensive filters.
        let scaleX = drawableSize.width / output.extent.width
        let scaleY = drawableSize.height / output.extent.height
        let scale = min(scaleX, scaleY)
        let scaledOutput = output.transformed(by: CGAffineTransform(scaleX: scale, y: scale))

        let destination = CIRenderDestination(mtlTexture: currentDrawable.texture, commandBuffer: commandBuffer)
        // BONUS: You can Quick Look the `task` in Xcode to see what Core Image is actually going to do on the GPU.
        let task = try! self.context.startTask(toRender: scaledOutput, to: destination)

        commandBuffer.present(currentDrawable)
        commandBuffer.commit()

        // BONUS: No need to wait, but you can Quick Look the `info` to see what was actually done during rendering
        //        and to get performance metrics, like the actual number of pixels processed.
        DispatchQueue.global(qos: .background).async {
            let info = try! task.waitUntilCompleted()
        }
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

}

For a 10 MB test image (40 megapixels!), memory briefly spikes to 800 MB during rendering, which is to be expected. I even tried a 30 MB (~74 megapixel!!) image, and it ran smoothly, topping out at 1.3 GB of memory.

When I scaled the image down to the destination size before applying the filter, memory stayed at about 60 MB the whole time. So that is really what you should do in any case. But note that you will then need to change the radius of the Gaussian blur to achieve the same result.
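
For reference, a hedged sketch of how the draw(in:) method above could scale the input before filtering and shrink the blur radius by the same factor (the proportional radius is an assumption about what "the same result" requires; it is not part of the benchmark code):

// Sketch: downscale the *input* before filtering, then reduce the blur radius
// by the same factor so the visual result stays roughly comparable.
// Assumes the same `view`, `filter`, and `testImage` as in the benchmark above.
let input = CIImage(image: testImage)!
let drawableSize = view.drawableSize

let scaleX = drawableSize.width / input.extent.width
let scaleY = drawableSize.height / input.extent.height
let scale = min(scaleX, scaleY)

// The expensive Gaussian blur now runs on far fewer pixels...
let scaledInput = input.transformed(by: CGAffineTransform(scaleX: scale, y: scale))

// ...and the radius shrinks proportionally (100 was the radius used on the full-size image).
let scaledRadius = 100.0 * scale

filter.setValue(scaledInput.clampedToExtent(), forKey: kCIInputImageKey)
filter.setValue(scaledRadius, forKey: kCIInputRadiusKey)
let output = filter.outputImage!.cropped(to: scaledInput.extent)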

If you need to do more with the result than just display it, I suppose you could use the createCGImage API of CIContext instead of rendering into the MTKView's drawable, and get the same memory usage.
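
A minimal sketch of that path, assuming a long-lived CIContext is reused rather than created per image (the helper name is illustrative):

import CoreImage
import UIKit

// Sketch: render the filtered CIImage into a bitmap instead of into an MTKView drawable.
// `context` should be a long-lived CIContext (for example the CIContext(mtlDevice:)
// created in viewDidLoad above), and `output` the cropped CIImage from the blur filter.
func makeBitmapImage(from output: CIImage, using context: CIContext) -> UIImage? {
    guard let cgImage = context.createCGImage(output, from: output.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}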

I hope this applies to your scenario.