Feeding a pixel buffer to an MTKView

1 · ios · macos · metal · pixel · buffer

Here is my problem: I want to display a pixel buffer that I compute into an MTKView. I have looked into MTLTexture, MTLBuffer, and other Metal objects, but I can't find any way to present a pixel buffer. Every tutorial I've seen is about rendering 3D objects with vertex and fragment shaders.

I assume the buffer has to be presented inside the drawInMTKView function (perhaps using an MTLRenderCommandEncoder), but again, I couldn't find any information about this.

I hope I'm not asking an obvious question.

Thanks

Fra*_*gel 8

Welcome!

I recommend using Core Image to render the content of the pixel buffer into the view. This requires the least amount of manual Metal setup.

Set up the MTKView and some required objects as follows (assuming you have a view controller and a storyboard set up):

import UIKit
import MetalKit
import CoreImage

class PreviewViewController: UIViewController {

    @IBOutlet weak var metalView: MTKView!

    var device: MTLDevice!
    var commandQueue: MTLCommandQueue!
    var ciContext: CIContext!

    var pixelBuffer: CVPixelBuffer?


    override func viewDidLoad() {
        super.viewDidLoad()

        self.device = MTLCreateSystemDefaultDevice()
        self.commandQueue = self.device.makeCommandQueue()

        self.metalView.delegate = self
        self.metalView.device = self.device
        // this allows us to render into the view's drawable
        self.metalView.framebufferOnly = false

        self.ciContext = CIContext(mtlDevice: self.device)
    }

}
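To have something to display, the `pixelBuffer` property above needs to be filled with an actual buffer. A minimal sketch using CVPixelBufferCreate — the 640×480 size and BGRA format here are arbitrary choices, not something the answer prescribes:

```swift
import CoreVideo

// Create an empty 32-bit BGRA pixel buffer (size and format are arbitrary
// examples; use whatever your computation produces). The Metal-compatibility
// attribute lets Core Image hand the buffer to the GPU without an extra copy.
var buffer: CVPixelBuffer?
let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                 640, 480,
                                 kCVPixelFormatType_32BGRA,
                                 [kCVPixelBufferMetalCompatibilityKey: true] as CFDictionary,
                                 &buffer)
assert(status == kCVReturnSuccess)
// Lock the buffer, write your computed pixels via its base address,
// unlock it, then assign it to self.pixelBuffer.
```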

In the delegate method, you use Core Image to transform the pixel buffer to fit the contents of the view (this is a bonus; adapt it to your use case) and render it into the view using the CIContext:

extension PreviewViewController:  MTKViewDelegate {

    func draw(in view: MTKView) {
        guard let pixelBuffer = self.pixelBuffer,
              // the drawable we will render into and present
              let currentDrawable = view.currentDrawable,
              let commandBuffer = self.commandQueue.makeCommandBuffer() else { return }
        // turn the pixel buffer into a CIImage so we can use Core Image for rendering into the view
        let image = CIImage(cvPixelBuffer: pixelBuffer)

        // bonus: transform the image to aspect-fit the view's bounds
        let drawableSize = view.drawableSize
        let scaleX = drawableSize.width / image.extent.width
        let scaleY = drawableSize.height / image.extent.height
        let scale = min(scaleX, scaleY)
        let scaledImage = image.transformed(by: CGAffineTransform(scaleX: scale, y: scale))
        // center in the view
        let originX = max(drawableSize.width - scaledImage.extent.size.width, 0) / 2
        let originY = max(drawableSize.height - scaledImage.extent.size.height, 0) / 2
        let centeredImage = scaledImage.transformed(by: CGAffineTransform(translationX: originX, y: originY))

        // Create a render destination that allows to lazily fetch the target texture
        // which allows the encoder to process all CI commands _before_ the texture is actually available.
        // This gives a nice speed boost because the CPU doesn't need to wait for the GPU to finish
        // before starting to encode the next frame.
        // Also note that we don't pass a command buffer here, because according to Apple:
        // "Rendering to a CIRenderDestination initialized with a commandBuffer requires encoding all
        // the commands to render an image into the specified buffer. This may impact system responsiveness
        // and may result in higher memory usage if the image requires many passes to render."
        let destination = CIRenderDestination(width: Int(drawableSize.width),
                                              height: Int(drawableSize.height),
                                              pixelFormat: view.colorPixelFormat,
                                              commandBuffer: nil,
                                              mtlTextureProvider: { () -> MTLTexture in
                                                  return currentDrawable.texture
                                              })
        // render into the view's drawable
        let _ = try! self.ciContext.startTask(toRender: centeredImage, to: destination)

        // present the drawable
        commandBuffer.present(currentDrawable)
        commandBuffer.commit()
    }

}

There is a slightly simpler way to render into the drawable's texture instead of using CIRenderDestination, but the approach above is recommended if you want to achieve high frame rates (see the comments in the code).
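For reference, the simpler variant mentioned above can render straight into the drawable's texture with CIContext's render method. A sketch assuming the same view controller setup as before — note that this skips the aspect-fit transform and loses the pipelining benefit of CIRenderDestination:

```swift
func draw(in view: MTKView) {
    guard let pixelBuffer = self.pixelBuffer,
          let drawable = view.currentDrawable,
          let commandBuffer = self.commandQueue.makeCommandBuffer() else { return }

    let image = CIImage(cvPixelBuffer: pixelBuffer)

    // Render directly into the drawable's texture. Because the texture is
    // needed up front, the CPU may wait on the GPU here, which is why the
    // CIRenderDestination approach above tends to sustain higher frame rates.
    self.ciContext.render(image,
                          to: drawable.texture,
                          commandBuffer: commandBuffer,
                          bounds: CGRect(origin: .zero, size: view.drawableSize),
                          colorSpace: CGColorSpaceCreateDeviceRGB())

    commandBuffer.present(drawable)
    commandBuffer.commit()
}
```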