How to render an SKNode to a UIImage

Tags: objective-c, uiimage, ios, sprite-kit, sknode

Just playing around with SpriteKit, and I'm trying to figure out how to capture a 'grab' of an SKNode into a UIImage.

With a UIView (or UIView subclass), I render the view into a graphics context using its layer property.

For example:

#import <QuartzCore/QuartzCore.h>
+ (UIImage *)imageOfView:(UIView *)view {
    UIGraphicsBeginImageContextWithOptions(view.frame.size, YES, 0.0f);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [view.layer renderInContext:context];
    UIImage *viewShot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return viewShot;
}

SKNode is not a subclass of UIView, though, so there appears to be no layer to work with.

Any ideas on how to render a given SKNode to a UIImage?

LearnCocos2D 14

This will capture the entire scene:

CGRect bounds = self.scene.view.bounds;
UIGraphicsBeginImageContextWithOptions(bounds.size, NO, [UIScreen mainScreen].scale);
// drawViewHierarchyInRect: is a UIView method, so it must be sent to the
// SKView presenting the scene, not to the scene/node itself.
[self.scene.view drawViewHierarchyInRect:bounds afterScreenUpdates:YES];
UIImage* screenshotImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

If you only want a specific branch of nodes, you can hide all the nodes you don't want captured before taking the screenshot. You can also use the node's accumulated frame, converted to UIKit coordinates, to capture only the area covered by the node and its children.
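In Swift, a minimal sketch of that cropping approach might look like this (imageOfNode is a made-up helper name, and this assumes the scene fills its SKView so the coordinate conversion maps cleanly onto the screenshot):

import SpriteKit

func imageOfNode(_ node: SKNode, in view: SKView) -> UIImage? {
    guard let scene = node.scene else { return nil }

    // Full-view screenshot, exactly as above.
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, UIScreen.main.scale)
    view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
    let screenshot = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    guard let cgImage = screenshot?.cgImage else { return nil }

    // The accumulated frame is in scene coordinates (y-up); convert its
    // top-left corner into view (UIKit, y-down) coordinates.
    let frame = node.calculateAccumulatedFrame()
    let topLeft = view.convert(CGPoint(x: frame.minX, y: frame.maxY), from: scene)

    // CGImage cropping works in pixels, so apply the screen scale.
    let scale = UIScreen.main.scale
    let cropRect = CGRect(x: topLeft.x * scale, y: topLeft.y * scale,
                          width: frame.width * scale, height: frame.height * scale)
    guard let cropped = cgImage.cropping(to: cropRect) else { return nil }
    return UIImage(cgImage: cropped, scale: scale, orientation: .up)
}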

Alternatively, you can get an SKTexture for a specific part of the node hierarchy:

SKTexture* tex = [self.scene.view textureFromNode:yourNode];

Before iOS 9 there was no way to convert the SKTexture's backing image to a UIImage. Now, however, it's trivial:

UIImage *image = [UIImage imageWithCGImage:tex.CGImage];
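Putting those two steps together, a small Swift sketch (the wrapper name imageFromNode is my own; texture(from:) and cgImage() are the Swift spellings of the calls above):

import SpriteKit

// Render a node subtree into an SKTexture, then wrap its backing
// CGImage in a UIImage (iOS 9+).
func imageFromNode(_ node: SKNode, in view: SKView) -> UIImage? {
    guard let texture = view.texture(from: node) else { return nil }
    return UIImage(cgImage: texture.cgImage())
}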


Cen*_*Guy 7

For anyone looking at this post 7 years later: I tried the methods above, but they produced results relatively slowly. I wanted to render an entire SKScene off-screen, because I was rendering the images as frames of a video. That meant I couldn't use the first method LearnCocos2D suggested, since it requires the view to be drawn on-screen, and the second method took too long to convert the SKTexture to a UIImage. Instead, you can render the whole scene to a UIImage using the SKRenderer class introduced in iOS 11.0, which uses Metal under the hood and is therefore comparatively fast. I was able to render a 1920x1080 SKScene in about 0.013 seconds!

You can use this extension:

  • Make sure you import MetalKit.
  • The ignoreScreenScale parameter specifies whether the output image should be pixel-exact. Usually, if you're going to display the image back on screen, you'll want this to be false. When it's false, the output image's size is multiplied by the device's screen scale, so that each 'point' of the scene occupies the same number of pixels in the image as it does on screen. When it's true, the output image's size in pixels equals the SKScene's size in points.

Cheers!

import SpriteKit
import MetalKit

extension SKScene {
    func toImage(ignoreScreenScale: Bool = false) -> UIImage? {
        guard let device = MTLCreateSystemDefaultDevice(),
              let commandQueue = device.makeCommandQueue(),
              let commandBuffer = commandQueue.makeCommandBuffer() else { return nil }

        let scale = ignoreScreenScale ? 1 : UIScreen.main.scale
        let size = self.size.applying(CGAffineTransform(scaleX: scale, y: scale))
        let renderer = SKRenderer(device: device)
        let renderPassDescriptor = MTLRenderPassDescriptor()

        // Clear the render target to the scene's background color.
        var r = CGFloat.zero, g = CGFloat.zero, b = CGFloat.zero, a = CGFloat.zero
        backgroundColor.getRed(&r, green: &g, blue: &b, alpha: &a)

        // Offscreen render target; .shaderRead lets Core Image read it back.
        let textureDescriptor = MTLTextureDescriptor()
        textureDescriptor.usage = [.renderTarget, .shaderRead]
        textureDescriptor.width = Int(size.width)
        textureDescriptor.height = Int(size.height)
        guard let texture = device.makeTexture(descriptor: textureDescriptor) else { return nil }

        renderPassDescriptor.colorAttachments[0].loadAction = .clear
        // Keep the rendered pixels after the pass; the default (.dontCare)
        // would leave the texture contents undefined.
        renderPassDescriptor.colorAttachments[0].storeAction = .store
        renderPassDescriptor.colorAttachments[0].texture = texture
        renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(
            red: Double(r),
            green: Double(g),
            blue: Double(b),
            alpha: Double(a)
        )

        renderer.scene = self
        renderer.render(withViewport: CGRect(origin: .zero, size: size),
                        commandBuffer: commandBuffer,
                        renderPassDescriptor: renderPassDescriptor)
        commandBuffer.commit()
        // Wait for the GPU so the texture is fully rendered before reading it.
        commandBuffer.waitUntilCompleted()

        // Metal textures have a top-left origin; flip vertically for UIKit.
        guard let image = CIImage(mtlTexture: texture, options: nil) else { return nil }
        let transformed = image.transformed(by: CGAffineTransform(scaleX: 1, y: -1)
            .translatedBy(x: 0, y: -image.extent.size.height))
        return UIImage(ciImage: transformed)
    }
}
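For example, a hypothetical call site that renders a 1920x1080 scene for use as a video frame:

let scene = SKScene(size: CGSize(width: 1920, height: 1080))
// ignoreScreenScale: true yields a 1920x1080-pixel image regardless of device scale.
if let frame = scene.toImage(ignoreScreenScale: true) {
    print(frame.size)  // approximately (1920.0, 1080.0)
}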