
How to debug custom geometry in SceneKit with Swift

I'm trying to learn how to create custom geometry in SceneKit. However, I've tried to make a triangle and it's not showing anything. I'm at a loss as to how to debug this. Is there a way to figure out if the triangle is valid? I just don't know where to start.

For reference, the playground code in question is below. Note that it is written against Swift 4, but the changes between Swift 3 and Swift 4 are so minor that getting it to compile in Swift 3 is trivial.

import UIKit
import SceneKit

// Three vertices in the x/y plane
let points = [
    SCNVector3Make(0, 0, 0),
    SCNVector3Make(0, 10, 0),
    SCNVector3Make(10, 0, 0),
]
// One triangle built from those vertices
// (note: this array is inferred as [Int])
let indices = [
    0, 2, 1,
]

let vertexSource = SCNGeometrySource(vertices: points)
let element = SCNGeometryElement(indices: indices, primitiveType: .triangles)
let geo = SCNGeometry(sources: [vertexSource], elements: [element])

When creating custom SCNGeometryElements, the type of the indices needs to be Int16 [1]. I don't think this is documented anywhere. But when you change the declaration of the indices to

let indices: [Int16] = [
    0, 2, 1
]

the triangle should appear.
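
For reference, a complete playground version with the fix applied could look like the following. The scene, camera, and SCNView setup here is just one way to get the triangle on screen; it isn't part of the original snippet:

import UIKit
import SceneKit
import PlaygroundSupport

let points = [
    SCNVector3Make(0, 0, 0),
    SCNVector3Make(0, 10, 0),
    SCNVector3Make(10, 0, 0),
]
// Indices declared with an explicit 16-bit type
let indices: [Int16] = [
    0, 2, 1,
]

let vertexSource = SCNGeometrySource(vertices: points)
let element = SCNGeometryElement(indices: indices, primitiveType: .triangles)
let geo = SCNGeometry(sources: [vertexSource], elements: [element])

// Minimal scene so the geometry is actually visible
let scene = SCNScene()
scene.rootNode.addChildNode(SCNNode(geometry: geo))

let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()
cameraNode.position = SCNVector3Make(5, 5, 30)
scene.rootNode.addChildNode(cameraNode)

let sceneView = SCNView(frame: CGRect(x: 0, y: 0, width: 400, height: 400))
sceneView.scene = scene
sceneView.autoenablesDefaultLighting = true
PlaygroundPage.current.liveView = sceneView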


Edit

[1]: As @mnuages has pointed out, SceneKit supports indices of at most 32 bits, so you can use Int8, Int16, or Int32.
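
For example, any of these declarations gives SceneKit an index type it can work with (a short sketch; the variable names are mine):

// UInt16 is a common choice and also matches Metal's smaller index type
let indices16: [UInt16] = [0, 2, 1]
let indices32: [Int32] = [0, 2, 1]

let element16 = SCNGeometryElement(indices: indices16, primitiveType: .triangles)
let element32 = SCNGeometryElement(indices: indices32, primitiveType: .triangles)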

The documentation for the Int type says:

On 32-bit platforms, Int is the same size as Int32, and on 64-bit platforms, Int is the same size as Int64.

So in a Swift playground on a 64-bit platform you'll end up with Int64.

That's unfortunate, because SceneKit only supports 8-, 16-, and 32-bit index types such as UInt8, UInt16, and UInt32 (Metal itself accepts only 16- and 32-bit indices). In the debugger console you should be able to see SceneKit warn about the unsupported 64-bit-wide index.
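
One way to sidestep the type-inference problem entirely is to build the element from raw data, where the index width is spelled out explicitly. A sketch using the data-based initializer:

import Foundation
import SceneKit

let rawIndices: [UInt16] = [0, 2, 1]
let indexData = Data(bytes: rawIndices,
                     count: rawIndices.count * MemoryLayout<UInt16>.size)

// bytesPerIndex must match the integer type used to encode the indices
let explicitElement = SCNGeometryElement(data: indexData,
                                         primitiveType: .triangles,
                                         primitiveCount: 1,
                                         bytesPerIndex: MemoryLayout<UInt16>.size)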
