iOS 10 breaks custom CIFilter

Geo*_*all 4 glsl avfoundation cifilter swift ios10

I wrote a chroma-key filter that makes the background of an MPEG movie transparent, so a movie file can be used for longer animations without a lengthy PNG sequence (the way certain kinds of iOS animations are often done).

I am using an AVPlayer, an AVVideoComposition, and a custom CIFilter to render the video over a background image. The user can change the background image dynamically by interacting with the app.

This approach worked fine until iOS 10 came along; now it is broken.

What happens now is that the video plays, but no chroma keying occurs, and Xcode repeatedly spits out the following error:

need a swizzler so that YCC420v can be written.

Here is the image the CIFilter should produce:

Result of the working custom CIFilter (pre-iOS 10)

Instead, here is what it produces (since iOS 10):

Broken result of the CIFilter (iOS 10)

Here is the part of my code that creates the EAGLContext and applies the custom CIFilter:

    let myEAGLContext = EAGLContext.init(API: EAGLRenderingAPI.OpenGLES2)
    //let cicontext = CIContext.init(EAGLContext: myEAGLContext, options: [kCIContextWorkingColorSpace: NSNull()])
    let cicontext = CIContext.init(EAGLContext: myEAGLContext)

    let filter = ChromaKeyFilter()
    filter.activeColor = CIColor.init(red: 0, green:1.0, blue: 0.0)
    filter.threshold = self.threshold

    //most of below comes from the "WWDC15 What's New In Core Image" slides
    let vidComp = AVVideoComposition(asset: videoAsset!,
                                     applyingCIFiltersWithHandler:
        {
            request in
            let input = request.sourceImage.imageByClampingToExtent()

            filter.inputImage = input

            let output = filter.outputImage!.imageByClampingToExtent()
            request.finishWithImage(output, context: cicontext)
            self.reloadInputViews()

    })

    let playerItem = AVPlayerItem(asset: videoAsset!)
    playerItem.videoComposition = vidComp
    self.player = AVPlayer(playerItem: playerItem)
    self.playerInitialized = true
    let layer = AVPlayerLayer(player: player)

    self.subviews.forEach { subview in
        subview.removeFromSuperview()
    }

    layer.frame = CGRect(x: 0.0, y: 0.0, width: self.frame.size.width, height: self.frame.size.height)
    self.layer.addSublayer(layer)

Here is the code for the custom CIFilter:

    private class ChromaKeyFilter : CIFilter {
        private var kernel: CIColorKernel!
        var inputImage: CIImage?
        var activeColor = CIColor(red: 0.0, green: 1.0, blue: 0.0)
        var threshold: Float = 0.05

        override init() {
            super.init()
            kernel = createKernel()
        }

        required init(coder aDecoder: NSCoder) {
            super.init(coder: aDecoder)!
            kernel = createKernel()
        }

        override var outputImage: CIImage? {
            if let inputImage = inputImage {
                let dod = inputImage.extent
                let args = [inputImage as AnyObject, activeColor as AnyObject, threshold as AnyObject]
                return kernel.applyWithExtent(dod, arguments: args)
            }
            return nil
        }

        private func createKernel() -> CIColorKernel {
            let kernelString =
                "kernel vec4 chromaKey( __sample s, __color c, float threshold ) { \n" +
                    //below kernel was adapted from the GPUImage custom chromakeyfilter:
                    //https://github.com/BradLarson/GPUImage/blob/master/framework/Source/GPUImageChromaKeyFilter.m#L30
                    "  float maskY = 0.2989 * c.r + 0.5866 * c.g + 0.1145 * c.b;\n" +
                    "  float maskCr = 0.7132 * (c.r - maskY);\n" +
                    "  float maskCb = 0.5647 * (c.b - maskY);\n" +
                    "  float Y = 0.2989 * s.rgb.r + 0.5866 * s.rgb.g + 0.1145 * s.rgb.b;\n" +
                    "  float Cr = 0.7132 * (s.rgb.r - Y);\n" +
                    "  float Cb = 0.5647 * (s.rgba.b - Y);\n" +
                    "  float blendValue = smoothstep(threshold, threshold + 0.5, distance(vec2(Cr, Cb), vec2(maskCr, maskCb)));\n" +
                    "  return blendValue * vec4( s.rgb, 1.0 ); \n" +
            "}"
            let kernel = CIColorKernel(string: kernelString)
            return kernel!
        }

}
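For isolating a problem like this, a minimal sketch (assuming the cicontext created above is in scope) that runs ChromaKeyFilter over a single solid-green CIImage, independent of the AVVideoComposition path, can help confirm whether the kernel itself or the video pipeline is at fault:

    // Build a small solid-green test frame; after keying, every pixel should
    // come out fully transparent (blendValue == 0 across the image).
    let green = CIImage(color: CIColor(red: 0.0, green: 1.0, blue: 0.0))
    let stillFrame = green.imageByCroppingToRect(CGRect(x: 0, y: 0, width: 64, height: 64))

    let testFilter = ChromaKeyFilter()
    testFilter.activeColor = CIColor(red: 0.0, green: 1.0, blue: 0.0)
    testFilter.threshold = 0.05
    testFilter.inputImage = stillFrame

    if let keyed = testFilter.outputImage {
        // Render through the same CIContext used for the video composition
        // and inspect the alpha channel of the result.
        let rendered = cicontext.createCGImage(keyed, fromRect: stillFrame.extent)
    }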

Does anyone have any ideas as to why this only broke now? Interestingly, it is broken only on the phone. It still works on the simulator, although much more slowly than it did before iOS 10.

Rhy*_*man 5

It looks like some part of the iOS 10 (device) pipeline (the player layer?) has switched to YUV; the YCC420v in the logged error corresponds to the biplanar YCbCr pixel format (kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange), which has no alpha channel to carry your keyed-out pixels.

Setting your AVPlayerLayer's pixelBufferAttributes to BGRA fixes the missing alpha and silences the logged error:

layer.pixelBufferAttributes = [kCVPixelBufferPixelFormatTypeKey as String: NSNumber(value: kCVPixelFormatType_32BGRA)]
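For context, a sketch of where that line would sit in the question's layer setup (it simply reuses the player, frame, and attribute line from above; nothing else needs to change):

    let layer = AVPlayerLayer(player: player)
    // Request BGRA output so the alpha produced by the CIFilter is preserved.
    layer.pixelBufferAttributes = [kCVPixelBufferPixelFormatTypeKey as String: NSNumber(value: kCVPixelFormatType_32BGRA)]
    layer.frame = CGRect(x: 0.0, y: 0.0, width: self.frame.size.width, height: self.frame.size.height)
    self.layer.addSublayer(layer)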