Zba*_*itZ — tags: cifilter, metal, arkit
I am working with Apple's sample project that uses ARMatteGenerator to generate an MTLTexture that can be used as an occlusion matte in the people occlusion technology.

I would like to determine how to run the generated matte through a CIFilter. In my code, I "filter" the matte like so:
func updateMatteTextures(commandBuffer: MTLCommandBuffer) {
    guard let currentFrame = session.currentFrame else {
        return
    }
    var targetImage: CIImage?
    alphaTexture = matteGenerator.generateMatte(from: currentFrame, commandBuffer: commandBuffer)
    dilatedDepthTexture = matteGenerator.generateDilatedDepth(from: currentFrame, commandBuffer: commandBuffer)
    targetImage = CIImage(mtlTexture: alphaTexture!, options: nil)
    monoAlphaCIFilter?.setValue(targetImage!, forKey: kCIInputImageKey)
    monoAlphaCIFilter?.setValue(CIColor.red, forKey: kCIInputColorKey)
    targetImage = (monoAlphaCIFilter?.outputImage)!
    let drawingBounds = CGRect(origin: .zero, size: CGSize(width: alphaTexture!.width, height: alphaTexture!.height))
    context.render(targetImage!, to: alphaTexture!, commandBuffer: commandBuffer, bounds: drawingBounds, colorSpace: CGColorSpaceCreateDeviceRGB())
}
When I composite the matte texture with the background, no filtered effect is applied to the matte. This is how the textures are being composited:
func compositeImagesWithEncoder(renderEncoder: MTLRenderCommandEncoder) {
    guard let textureY = capturedImageTextureY, let textureCbCr = capturedImageTextureCbCr else {
        return
    }

    // Push a debug group allowing us to identify render commands in the GPU Frame Capture tool
    renderEncoder.pushDebugGroup("CompositePass")

    // Set render command encoder state
    renderEncoder.setCullMode(.none)
    renderEncoder.setRenderPipelineState(compositePipelineState)
    renderEncoder.setDepthStencilState(compositeDepthState)

    // Setup plane vertex buffers
    renderEncoder.setVertexBuffer(imagePlaneVertexBuffer, offset: 0, index: 0)
    renderEncoder.setVertexBuffer(scenePlaneVertexBuffer, offset: 0, index: 1)

    // Setup textures for the composite fragment shader
    renderEncoder.setFragmentBuffer(sharedUniformBuffer, offset: sharedUniformBufferOffset, index: Int(kBufferIndexSharedUniforms.rawValue))
    renderEncoder.setFragmentTexture(CVMetalTextureGetTexture(textureY), index: 0)
    renderEncoder.setFragmentTexture(CVMetalTextureGetTexture(textureCbCr), index: 1)
    renderEncoder.setFragmentTexture(sceneColorTexture, index: 2)
    renderEncoder.setFragmentTexture(sceneDepthTexture, index: 3)
    renderEncoder.setFragmentTexture(alphaTexture, index: 4)
    renderEncoder.setFragmentTexture(dilatedDepthTexture, index: 5)

    // Draw final quad to display
    renderEncoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)
    renderEncoder.popDebugGroup()
}
How can I apply a CIFilter to only the alphaTexture generated by the ARMatteGenerator?
I don't think you want to apply a CIFilter to the alphaTexture. I assume you're using Apple's Effecting People Occlusion in Custom Renderers sample code. If you watch this year's Bringing People into AR WWDC session, they talk about generating a segmentation matte using ARMatteGenerator, which is what is being done with alphaTexture = matteGenerator.generateMatte(from: currentFrame, commandBuffer: commandBuffer). alphaTexture is an MTLTexture that is essentially an alpha mask marking where humans were detected in the camera frame (i.e. completely opaque where a human is, and completely transparent where a human is not).

Adding a filter to the alpha texture won't filter the final rendered image; it will only affect the mask used in the compositing. If you're trying to achieve the effect in the video linked in your previous question, I would recommend adjusting the Metal shader where the compositing occurs. In the session, they point out that they compare dilatedDepth and renderedDepth to decide whether to draw virtual content or pixels from the camera:
fragment half4 customComposition(...) {
    half4 camera = cameraTexture.sample(s, in.uv);
    half4 rendered = renderedTexture.sample(s, in.uv);
    float renderedDepth = renderedDepthTexture.sample(s, in.uv);
    half4 scene = mix(rendered, camera, rendered.a);
    half matte = matteTexture.sample(s, in.uv);
    float dilatedDepth = dilatedDepthTexture.sample(s, in.uv);

    if (dilatedDepth < renderedDepth) { // People in front of rendered
        // mix together the virtual content and camera feed based on the alpha provided by the matte
        return mix(scene, camera, matte);
    } else {
        // People are not in front so just return the scene
        return scene;
    }
}
Unfortunately, this is set up slightly differently in the sample code, but it's still easy to modify. Open Shaders.metal and find the compositeImageFragmentShader function. Toward the end of the function you'll see half4 occluderResult = mix(sceneColor, cameraColor, alpha); This is essentially the same operation as the mix(scene, camera, matte); we saw above: we are deciding, based on the segmentation matte, whether to use a pixel from the scene or a pixel from the camera feed. We can easily replace the camera image pixel by replacing cameraColor with an arbitrary rgba value represented as a half4. For example, we could use half4(float4(0.0, 0.0, 1.0, 1.0)) to paint all of the pixels within the segmentation matte blue:
…
// Replacing camera color with blue
half4 occluderResult = mix(sceneColor, half4(float4(0.0, 0.0, 1.0, 1.0)), alpha);
half4 mattingResult = mix(sceneColor, occluderResult, showOccluder);
return mattingResult;
Of course, you can apply other effects as well. Dynamic grayscale static is pretty easy to achieve.

Above compositeImageFragmentShader, add:
float random(float offset, float2 tex_coord, float time) {
    // pick two numbers that are unlikely to repeat
    float2 non_repeating = float2(12.9898 * time, 78.233 * time);

    // multiply our texture coordinates by the non-repeating numbers, then add them together
    float sum = dot(tex_coord, non_repeating);

    // calculate the sine of our sum to get a range between -1 and 1
    float sine = sin(sum);

    // multiply the sine by a big, non-repeating number so that even a small change will result in a big color jump
    float huge_number = sine * 43758.5453 * offset;

    // get just the numbers after the decimal point
    float fraction = fract(huge_number);

    // send the result back to the caller
    return fraction;
}
(Taken from @twostraws' ShaderKit)
Then modify compositeImageFragmentShader to:
…
float randFloat = random(1.0, cameraTexCoord, rgb[0]);
half4 occluderResult = mix(sceneColor, half4(float4(randFloat, randFloat, randFloat, 1.0)), alpha);
half4 mattingResult = mix(sceneColor, occluderResult, showOccluder);
return mattingResult;
You should get:

Finally, the debugger seems to have a hard time keeping up with the app. For me, the app would freeze shortly after launch when running attached to Xcode, but was typically smooth when running on its own.
| 归档时间: |
|
| 查看次数: |
714 次 |
| 最近记录: |