
How to get some useful sensor data in ARKit?

I learned from the documentation that ARKit is CoreMotion plus AVFoundation, but how do I get the sensor data:

  • gravity
  • acceleration
  • rotation matrix

that CoreMotion provides, from ARKit itself instead of setting up a Core Motion listener?

ARKit (and its newer companion RealityKit) contains not only classes, methods, and properties from the CoreMotion and AVFoundation frameworks, but also from:

  • UIKit
  • SceneKit
  • SpriteKit
  • Metal
  • CoreML
  • CoreLocation
  • MultipeerConnectivity
  • etc.

However, you can't get any raw data from the iPhone's sensors through ARKit (Apple doesn't allow it), but you can definitely use the data that ARKit does expose.

For instance:

1. A pixel buffer containing the image captured by the camera:

let pixelBuffer: CVPixelBuffer? = sceneView.session.currentFrame?.capturedImage
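As a minimal sketch (assuming sceneView is an ARSCNView, as in the snippet above), you could wrap that buffer in a CIImage for further processing with Core Image or Vision:

import ARKit
import CoreImage

if let pixelBuffer = sceneView.session.currentFrame?.capturedImage {
    // The captured frame arrives as a YCbCr pixel buffer;
    // CIImage handles the format conversion for you.
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    print("Captured frame extent:", ciImage.extent)
}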

2. The position and orientation of the camera in world coordinate space (simd_float4x4):

guard let cameraMatrix = sceneView.session.currentFrame?.camera.transform else { return }
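From that matrix you can read the camera's position (the fourth column) and, via ARCamera, its orientation. A minimal sketch, again assuming sceneView:

import ARKit

if let camera = sceneView.session.currentFrame?.camera {
    let transform = camera.transform               // simd_float4x4
    // Translation is stored in the fourth column of the matrix.
    let position = simd_float3(transform.columns.3.x,
                               transform.columns.3.y,
                               transform.columns.3.z)
    // Orientation as roll, pitch and yaw, in radians.
    let orientation = camera.eulerAngles
    print("position:", position, "orientation:", orientation)
}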

3. Options for how ARKit constructs a scene coordinate system based on real-world device motion:

ARConfiguration.WorldAlignment.gravityAndHeading
ARConfiguration.WorldAlignment.gravity
ARConfiguration.WorldAlignment.camera
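For example, a minimal session setup (a sketch; the configuration name is illustrative) that picks one of these alignments:

import ARKit

let configuration = ARWorldTrackingConfiguration()
// .gravity (the default): the Y axis points up, opposite to gravity.
// .gravityAndHeading: Y opposes gravity and -Z points true north (uses the compass).
// .camera: the coordinate system is locked to the camera's initial orientation.
configuration.worldAlignment = .gravity
sceneView.session.run(configuration)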

Alas, you can't get pure acceleration data from the IMU through ARKit.
