ARKit/SpriteKit - Setting pixelBufferAttributes on an SKVideoNode, or another way to make pixels in a video transparent (chroma-key effect)

2sh*_*shy 8 ios sprite-kit skvideonode swift arkit

My goal is to present a 2D animated character in the real-world environment using ARKit. The animated character is embedded in a video, a frame of which is shown in the following snapshot:

Snapshot from the video

Displaying the video itself works with no problem at all, using this code:

func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    guard let urlString = Bundle.main.path(forResource: "resourceName", ofType: "mp4") else { return nil }

    let url = URL(fileURLWithPath: urlString)
    let asset = AVAsset(url: url)
    let item = AVPlayerItem(asset: asset)
    let player = AVPlayer(playerItem: item)

    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.size = CGSize(width: 200.0, height: 150.0)
    videoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)

    return videoNode
}

The result of this code is shown in the following screenshot of the app, as expected:

App screenshot #1

But as you can see, the character's background is not great, so I need to make it disappear in order to create the illusion of the character actually standing on the horizontal plane. I tried to achieve this by applying a chroma-key effect to the video.

  • For those not familiar with chroma key: it is the name of the "green screen" effect sometimes seen on TV, used to make a particular color transparent.

My approach to the chroma-key effect was to create a custom filter based on the "CIColorCube" CIFilter, and then apply that filter to the video using an AVVideoComposition.

First, the code for creating the filter:

func RGBtoHSV(r : Float, g : Float, b : Float) -> (h : Float, s : Float, v : Float) {
    var h : CGFloat = 0
    var s : CGFloat = 0
    var v : CGFloat = 0
    let col = UIColor(red: CGFloat(r), green: CGFloat(g), blue: CGFloat(b), alpha: 1.0)
    col.getHue(&h, saturation: &s, brightness: &v, alpha: nil)
    return (Float(h), Float(s), Float(v))
}
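// Note: UIColor.getHue returns hue, saturation and brightness normalized to 0...1,
// so pure green maps to roughly 0.333 (i.e. 120° / 360°). This is why the hue angles
// in colorCubeFilterForChromaKey below are divided by 360. A quick (assumed) check:
// let green = RGBtoHSV(r: 0, g: 1, b: 0)   // green.h ≈ 0.333, green.s == 1.0, green.v == 1.0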

func colorCubeFilterForChromaKey(hueAngle: Float) -> CIFilter {

    let hueRange: Float = 20 // width, in degrees, of the hue "pie slice" we want to key out
    let minHueAngle: Float = (hueAngle - hueRange/2.0) / 360
    let maxHueAngle: Float = (hueAngle + hueRange/2.0) / 360

    let size = 64
    var cubeData = [Float](repeating: 0, count: size * size * size * 4)
    var rgb: [Float] = [0, 0, 0]
    var hsv: (h : Float, s : Float, v : Float)
    var offset = 0

    for z in 0 ..< size {
        rgb[2] = Float(z) / Float(size) // blue value
        for y in 0 ..< size {
            rgb[1] = Float(y) / Float(size) // green value
            for x in 0 ..< size {

                rgb[0] = Float(x) / Float(size) // red value
                hsv = RGBtoHSV(r: rgb[0], g: rgb[1], b: rgb[2])
                // TODO: Check if hsv.s > 0.5 is really necessary
                let alpha: Float = (hsv.h > minHueAngle && hsv.h < maxHueAngle && hsv.s > 0.5) ? 0 : 1.0

                cubeData[offset] = rgb[0] * alpha
                cubeData[offset + 1] = rgb[1] * alpha
                cubeData[offset + 2] = rgb[2] * alpha
                cubeData[offset + 3] = alpha
                offset += 4
            }
        }
    }
    let b = cubeData.withUnsafeBufferPointer { Data(buffer: $0) }
    let data = b as NSData

    let colorCube = CIFilter(name: "CIColorCube", withInputParameters: [
        "inputCubeDimension": size,
        "inputCubeData": data
        ])
    return colorCube!
}
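Before wiring the filter into a video composition, the cube can be sanity-checked on a single still frame by rendering the filtered CIImage through a CIContext. Here is a minimal sketch, where "testFrame" is just a placeholder name for any bundled test image:

import UIKit
import CoreImage

// Minimal sketch: apply the chroma-key cube to a still image and render it
// with an alpha channel. "testFrame" is an assumed placeholder asset name.
func chromaKeyedImage(from image: UIImage) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }

    let filter = colorCubeFilterForChromaKey(hueAngle: 38)
    filter.setValue(input, forKey: kCIInputImageKey)
    guard let output = filter.outputImage else { return nil }

    // Pixels whose hue falls inside the keyed range should come out with alpha == 0.
    let context = CIContext()
    guard let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}

// Usage:
// let keyed = chromaKeyedImage(from: UIImage(named: "testFrame")!)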

Then, the code that applies the filter to the video, by modifying the func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? function I wrote earlier:

func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    guard let urlString = Bundle.main.path(forResource: "resourceName", ofType: "mp4") else { return nil }

    let url = URL(fileURLWithPath: urlString)
    let asset = AVAsset(url: url)

    let filter = colorCubeFilterForChromaKey(hueAngle: 38)
    let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
        let source = request.sourceImage
        filter.setValue(source, forKey: kCIInputImageKey)
        let output = filter.outputImage

        request.finish(with: output!, context: nil)
    })

    let item = AVPlayerItem(asset: asset)
    item.videoComposition = composition
    let player = AVPlayer(playerItem: item)

    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.size = CGSize(width: 200.0, height: 150.0)
    videoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)

    return videoNode
}

The code should replace, in every frame of the video, all pixels whose color matches the background's hue range with alpha = 0.0. But instead of getting transparent pixels, those pixels come out black, as can be seen in the following picture:

App screenshot #2

Now, although this is not the desired effect, it did not surprise me, because I know this is how iOS displays videos with an alpha channel. But here is the real problem: when displaying a video through a normal AVPlayer, there is the option of adding an AVPlayerLayer to the view and setting its pixelBufferAttributes to let the player layer know we are using a transparent pixel buffer, like this:

let playerLayer = AVPlayerLayer(player: player)
playerLayer.bounds = view.bounds
playerLayer.position = view.center
playerLayer.pixelBufferAttributes = [(kCVPixelBufferPixelFormatTypeKey as String): kCVPixelFormatType_32BGRA]
view.layer.addSublayer(playerLayer)

This code gives us a video with a transparent background (GOOD!), but at a fixed size and position (not so good...), as you can see in this screenshot:

App screenshot #3

I would like to achieve the same effect, but with an SKVideoNode rather than an AVPlayerLayer. However, I cannot find any way to set pixelBufferAttributes on an SKVideoNode, and setting up a player layer does not achieve the desired ARKit effect, since it stays fixed in position.

Is there any solution to my problem, or is there another technique that can achieve the same desired effect?

2sh*_*shy 5

The solution turned out to be quite simple! All that needs to be done is to add the video node as a child of an SKEffectNode and apply the filter to the SKEffectNode instead of to the video itself (no AVVideoComposition needed). Here is the code I used:

func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    // Create and configure a node for the anchor added to the view's session.
    let bialikVideoNode = videoNodeWith(resourceName: "Tsina_05", ofType: "mp4")
    bialikVideoNode.size = CGSize(width: kDizengofVideoWidth, height: kDizengofVideoHeight)
    bialikVideoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)

    // Make the video background transparent using an SKEffectNode, since chroma-key doesn't work on video
    let effectNode = SKEffectNode()
    effectNode.addChild(bialikVideoNode)
    effectNode.filter = colorCubeFilterForChromaKey(hueAngle: 120)

    return effectNode
}
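The videoNodeWith(resourceName:ofType:) helper used above is not shown here; a minimal sketch of what it might look like, assuming it simply loads the bundled file into an SKVideoNode and starts playback:

import SpriteKit
import AVFoundation

// Hypothetical helper, reconstructed from the question's earlier code:
// it loads a bundled video file into an SKVideoNode.
func videoNodeWith(resourceName: String, ofType type: String) -> SKVideoNode {
    guard let path = Bundle.main.path(forResource: resourceName, ofType: type) else {
        fatalError("Missing video resource: \(resourceName).\(type)")
    }
    let player = AVPlayer(url: URL(fileURLWithPath: path))
    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.play() // start playback so the character animates as soon as the node appears
    return videoNode
}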

And here is the desired result:

Final result screenshot