
ARKit & Reality Composer - how to anchor a scene using image coordinates

I have written code that, when a button is pressed, initialises one of three Reality Composer scenes depending on the day of the month.

That all works fine.

The Reality Composer scenes use image detection to place the objects within the environment, but currently, as soon as the image is out of the camera view, the objects disappear.

I would like to anchor the scene with the root node at the position where the image is first detected, so that users can look around the scene and the objects persist even when the trigger image is not in the camera view.

I tried adding the renderer func below, but I get errors saying the view controller class doesn't have the .planeNode property.

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    let referenceImage = imageAnchor.referenceImage

    // Create a plane to visualize the initial position of the detected image.
    let plane = SCNPlane(width: referenceImage.physicalSize.width,
                         height: referenceImage.physicalSize.height)
    plane.materials.first?.diffuse.contents = UIColor.blue.withAlphaComponent(0.20)
    self.planeNode = SCNNode(geometry: plane)

    self.planeNode?.opacity = 1

    /*
     `SCNPlane` is vertically oriented in its local coordinate space, but
     `ARImageAnchor` assumes the image is horizontal in its local space, so
     rotate the plane to match.
     */
    self.planeNode?.eulerAngles.x = -.pi / 2

    /*
     Image anchors are not tracked after initial detection, so create an
     animation that limits the duration for which the plane visualization appears.
     */

    // Add the plane visualization to the scene.
    if let planeNode = self.planeNode {
        node.addChildNode(planeNode)
    }

    if let imageName = referenceImage.name {
        plane.materials = [SCNMaterial()]
        plane.materials[0].diffuse.contents = UIImage(named: imageName)
    }
}
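I assume that means I need a stored property on the view controller, since the callback references self.planeNode, something like this (my class currently doesn't have it):

    // Property the renderer callback above expects to find on the view controller.
    var planeNode: SCNNode?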

Here's my code:

import UIKit
import RealityKit
import ARKit
import SceneKit

class ViewController: UIViewController {

    @IBOutlet var move: ARView!
    @IBOutlet var arView: ARView!

    var ARBorealAnchor3: ARboreal.ArBoreal3!
    var ARBorealAnchor2: ARboreal.ArBoreal2!
    var ARBorealAnchor: ARboreal.ArBoreal!

    var Date1 = 1

    override func viewDidLoad() {
        super.viewDidLoad()

        func getSingle() {
            let date = Date()
            let calendar = Calendar.current
            let day = calendar.component(.day, from: date)
            Date1 = day
        }

        getSingle()

        ARBorealAnchor = try! ARboreal.loadArBoreal()
        ARBorealAnchor2 = try! ARboreal.loadArBoreal2()
        ARBorealAnchor3 = try! ARboreal.loadArBoreal3()

        if Date1 == 24 {
            arView.scene.anchors.append(ARBorealAnchor)
        }
        if Date1 == 25 {
            arView.scene.anchors.append(ARBorealAnchor2)
        }
        if Date1 == 26 {
            arView.scene.anchors.append(ARBorealAnchor3)
        }
    }
}

Any help would be greatly appreciated.

Cheers, Daniel Savage

What is happening is that when the image anchor goes out of view, the AnchorEntity becomes unanchored, and RealityKit then stops rendering it and all of its descendants.
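You can see this happen by subscribing to RealityKit's SceneEvents.AnchoredStateChanged event. A minimal sketch (assuming arView is your ARView, and noting that the subscription must be kept alive for the handler to fire):

import RealityKit
import Combine

var subscriptions = [Cancellable]()

// Logs whenever an anchor entity in the scene becomes anchored or
// unanchored, e.g. when the detected image leaves the camera view.
subscriptions.append(
  arView.scene.subscribe(to: SceneEvents.AnchoredStateChanged.self) { event in
    print("isAnchored:", event.isAnchored)
  }
)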

One way to work around this is to separate the image anchor from the content you want to render: add the image anchor manually in code, then, when the image anchor is first detected, add your content to the scene under a separate world anchor. Whenever the image anchor's transform is updated, update your world anchor to match.

That way, you can use the image anchor while it is visible to get the latest transform, but when it disappears, the rendering of the content is not tied to it. Something like the code below (you will have to create an AR Resource Group called ARTest and add an image named "test" to it for the anchor to work):

import ARKit
import SwiftUI
import RealityKit
import Combine

struct ContentView : View {
    var body: some View {
        return ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

let arDelegate = SessionDelegate()

struct ARViewContainer: UIViewRepresentable {

  func makeUIView(context: Context) -> ARView {

    let arView = ARView(frame: .zero)

    arDelegate.set(arView: arView)
    arView.session.delegate = arDelegate

    // Create an image anchor and add it to the scene. We won't add any
    // rendering content to it; it will be used only for detection.
    let imageAnchor = AnchorEntity(.image(group: "ARTest", name: "test"))
    arView.scene.anchors.append(imageAnchor)

    return arView
  }

  func updateUIView(_ uiView: ARView, context: Context) {}
}

final class SessionDelegate: NSObject, ARSessionDelegate {
  var arView: ARView!
  var rootAnchor: AnchorEntity?

  func set(arView: ARView) {
    self.arView = arView
  }

  func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {

    // If we already added the content to render, ignore
    if rootAnchor != nil {
       return
    }

    // Make sure we are adding to an image anchor. Assuming only
    // one image anchor in the scene for brevity.
    guard anchors[0] is ARImageAnchor else {
      return
    }

    // Create the entity to render; you could load your Reality Composer
    // scene here instead. Anchor it at the detected image's current
    // world transform so it starts in the right place.
    rootAnchor = AnchorEntity(world: anchors[0].transform)
    let box = ModelEntity(
      mesh: MeshResource.generateBox(size: 0.01),
      materials: [SimpleMaterial(color: .red, isMetallic: false)]
    )
    rootAnchor!.addChild(box)

    // Add another model to show how the content remains in the scene
    // even when the tracking image is out of view.
    let box2 = ModelEntity(
      mesh: MeshResource.generateBox(size: 0.10),
      materials: [SimpleMaterial(color: .orange, isMetallic: false)]
    )
    box.addChild(box2)
    box2.position = [0, 0, 1]

    arView.scene.addAnchor(rootAnchor!)
  }

  func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    guard let rootAnchor = rootAnchor else {
      return
    }

    // Code is assuming you only have one image anchor for brevity
    guard let imageAnchor = anchors[0] as? ARImageAnchor else {
      return
    }

    if !imageAnchor.isTracked {
      return
    }

    // Update our fixed world anchor to match the image anchor's transform.
    rootAnchor.transform = Transform(matrix: imageAnchor.transform)
  }

}

#if DEBUG
struct ContentView_Previews : PreviewProvider {
  static var previews: some View {
    ContentView()
  }
}
#endif

NOTE: The transform of an ARImageAnchor updates frequently as you move around, because ARKit keeps refining its estimate of the image plane (content may look like it is in the right place while, for example, the z value is still inaccurate). Make sure the image's physical dimensions in the AR Resource Group are accurate to get better tracking.
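If those frequent corrections cause visible jumps, one option (a sketch on top of the code above, not something the original requires) is to ease toward each new transform with Entity's move(to:relativeTo:duration:) instead of assigning it directly:

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
  guard let rootAnchor = rootAnchor,
        let imageAnchor = anchors[0] as? ARImageAnchor,
        imageAnchor.isTracked else { return }

  // Animate to the refined pose over a short duration rather than
  // snapping, which softens the small corrections ARKit makes while
  // refining the image plane estimate.
  rootAnchor.move(to: Transform(matrix: imageAnchor.transform),
                  relativeTo: nil,
                  duration: 0.15)
}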
