iPad Pro LiDAR – Exporting Geometry and Textures

Chr*_*ris 8 lidar arkit

I'd like to be able to export the mesh and its texture from the iPad Pro's LiDAR scanner.

There is an example here of how to export the mesh, but I'd also like to be able to export the environment texture:

ARKit 3.5 – How to export OBJ from the new iPad Pro with LiDAR?

ARMeshGeometry stores the mesh's vertices; do I have to "record" the textures while scanning the environment and apply them manually?

This post seems to show a way of obtaining texture coordinates, but I can't see a way to do it with ARMeshGeometry: Save ARFaceGeometry to OBJ file

Any pointers in the right direction, or things to look at, would be greatly appreciated!

Chris

Pav*_*n K 10

You need to compute the texture coordinates for each vertex, apply them to the mesh, and supply the texture as the mesh's material.

// Assumes `meshAnchor` is an ARMeshAnchor and `arFrame` is the current ARFrame.
let geom = meshAnchor.geometry
let meshVertices = geom.vertices            // ARGeometrySource (raw Metal buffer)
let camera = arFrame.camera
let size = camera.imageResolution
let modelMatrix = meshAnchor.transform      // mesh-local → world transform

// ARGeometrySource is not a Swift array, so read each vertex out of its
// buffer using the source's offset and stride.
let textureCoordinates = (0..<meshVertices.count).map { index -> vector_float2 in
    let pointer = meshVertices.buffer.contents()
        .advanced(by: meshVertices.offset + meshVertices.stride * index)
    let vertex = pointer.assumingMemoryBound(to: (Float, Float, Float).self).pointee
    let vertex4 = vector_float4(vertex.0, vertex.1, vertex.2, 1)
    let world_vertex4 = simd_mul(modelMatrix, vertex4)   // transform is non-optional
    let world_vector3 = simd_float3(x: world_vertex4.x, y: world_vertex4.y, z: world_vertex4.z)
    // Project into a portrait viewport; width/height are swapped because
    // the camera image itself is landscape.
    let pt = camera.projectPoint(world_vector3,
        orientation: .portrait,
        viewportSize: CGSize(
            width: CGFloat(size.height),
            height: CGFloat(size.width)))
    let v = 1.0 - Float(pt.x) / Float(size.height)
    let u = Float(pt.y) / Float(size.width)
    return vector_float2(u, v)
}

// Construct verticesSource, normalsSource and facesSource from the source
// geometry's buffers, wrap the computed coordinates in an SCNGeometrySource,
// and build the new geometry from them.
let uvSource = SCNGeometrySource(textureCoordinates:
    textureCoordinates.map { CGPoint(x: CGFloat($0.x), y: CGFloat($0.y)) })
let scnGeometry = SCNGeometry(sources: [verticesSource, uvSource, normalsSource],
                              elements: [facesSource])

// UIImage has no built-in CVPixelBuffer initializer, so go via CIImage.
let ciImage = CIImage(cvPixelBuffer: arFrame.capturedImage)
let texture = UIImage(ciImage: ciImage)
let imageMaterial = SCNMaterial()
imageMaterial.isDoubleSided = false
imageMaterial.diffuse.contents = texture
scnGeometry.materials = [imageMaterial]
let pcNode = SCNNode(geometry: scnGeometry)

If added to the scene, pcNode will contain the mesh with the texture applied.
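The u/v remapping at the end of the closure is easy to get wrong. As a self-contained plain-Swift sketch (function name and values hypothetical), the mapping from a point projected into the portrait viewport to a texture coordinate looks like this:

```swift
// Sketch of the portrait UV remapping used in the answer above.
// (px, py) is the point returned by projectPoint(_:orientation:viewportSize:)
// for a .portrait viewport whose width is the camera image's *height* and
// whose height is the image's *width* (the sensor image is landscape).
func portraitUV(px: Float, py: Float,
                imageWidth: Float, imageHeight: Float) -> (u: Float, v: Float) {
    let v = 1.0 - px / imageHeight  // normalize and flip along the viewport's x axis
    let u = py / imageWidth         // normalize along the viewport's y axis
    return (u, v)
}

// A point projected to the middle of a 1920x1440 camera image
// (portrait viewport 1440x1920) lands at the middle of the texture.
let (u, v) = portraitUV(px: 720, py: 960, imageWidth: 1920, imageHeight: 1440)
```

Note the axis swap: `px` is normalized by the image height and `py` by the image width, precisely because the viewport passed to `projectPoint` was built with those dimensions swapped.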

The texture coordinate computation was taken from here.
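As for where the vertex data itself comes from: `ARMeshGeometry.vertices` is an `ARGeometrySource`, i.e. a raw Metal buffer described by a count, a byte offset, and a byte stride. The per-vertex read is just pointer arithmetic over that buffer. A pure-Swift sketch of the same arithmetic, using a hypothetical tightly packed three-float layout (on device you would pass `source.buffer.contents()`, `source.offset`, and `source.stride`):

```swift
// Read `count` packed (x, y, z) Float32 vertices out of a raw buffer,
// using the same offset/stride arithmetic as an ARGeometrySource read.
func readVertices(from base: UnsafeRawPointer, count: Int,
                  offset: Int, stride: Int) -> [(x: Float, y: Float, z: Float)] {
    (0..<count).map { i in
        let p = base.advanced(by: offset + stride * i)
        return (p.load(fromByteOffset: 0, as: Float.self),
                p.load(fromByteOffset: 4, as: Float.self),
                p.load(fromByteOffset: 8, as: Float.self))
    }
}

// Two vertices, tightly packed: stride = 3 * MemoryLayout<Float>.size = 12 bytes.
let packed: [Float] = [0, 1, 2,  3, 4, 5]
let verts = packed.withUnsafeBytes {
    readVertices(from: $0.baseAddress!, count: 2, offset: 0, stride: 12)
}
```

On a real mesh the stride is not necessarily `3 * 4` bytes (the buffer may be padded), which is why the code in the answer multiplies by the source's own `stride` rather than assuming a packed layout.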

  • Any chance you could share a more complete implementation, @Pavan K? This looks promising, but I'm a bit unsure where things like `verticesSource`, `normalsSource` and `facesSource` come from. (2)