
How to get some useful sensor data in ARKit?

I learned from the documentation that ARKit is CoreMotion plus AVFoundation, but how do I get the sensor data:

  • gravity
  • acceleration
  • rotation matrix

that CoreMotion provides, directly from ARKit, instead of setting up a Core Motion listener?

ARKit (and its new satellite RealityKit) not only contains some classes, methods and properties from the CoreMotion and AVFoundation frameworks, but also some from:

  • UIKit
  • SceneKit
  • SpriteKit
  • Metal
  • CoreML
  • CoreLocation
  • MultipeerConnectivity
  • etc.

However, you can't get any raw data from the iPhone's sensors (Apple doesn't allow it), but you can definitely use the data you are allowed to access.

For instance:

1. A pixel buffer containing the image captured by the camera:

let pixelBuffer: CVPixelBuffer? = sceneView.session.currentFrame?.capturedImage
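As a minimal sketch of what you can do with that buffer (assuming a running session in an `ARSCNView` named `sceneView`), you can query its dimensions through the CoreVideo API:

```swift
import ARKit

// Sketch: reading the captured camera image from the current ARFrame.
// Assumes sceneView is an ARSCNView with a running ARSession.
func inspectCapturedImage(in sceneView: ARSCNView) {
    guard let pixelBuffer = sceneView.session.currentFrame?.capturedImage else { return }
    // CVPixelBuffer dimensions match the camera image resolution.
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    print("Captured image: \(width) x \(height)")
}
```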

2. The position and orientation of the camera in world coordinate space (simd_float4x4):

let cameraMatrix = (sceneView.session.currentFrame?.camera.transform)!
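If you need the position and orientation as separate values, a sketch like the following (again assuming a running session in an `ARSCNView` named `sceneView`) pulls them out of the 4x4 transform with simd:

```swift
import ARKit
import simd

// Sketch: decomposing the camera transform into position and orientation.
func cameraPose(in sceneView: ARSCNView) -> (position: SIMD3<Float>, orientation: simd_quatf)? {
    guard let transform = sceneView.session.currentFrame?.camera.transform else { return nil }
    // The translation lives in the fourth column of the matrix.
    let position = SIMD3<Float>(transform.columns.3.x,
                                transform.columns.3.y,
                                transform.columns.3.z)
    // The upper-left 3x3 encodes rotation; simd converts it to a quaternion.
    let orientation = simd_quatf(transform)
    return (position, orientation)
}
```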

3. Options for how ARKit constructs a scene coordinate system based on real-world device motion:

ARConfiguration.WorldAlignment.gravityAndHeading
ARConfiguration.WorldAlignment.gravity
ARConfiguration.WorldAlignment.camera
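A short sketch of how one of these alignments is applied (assuming an `ARSCNView` named `sceneView`): `.gravity` makes the y-axis parallel to gravity, `.gravityAndHeading` additionally aligns the axes with compass heading, and `.camera` locks the coordinate system to the device's initial orientation.

```swift
import ARKit

// Sketch: picking a world alignment before running the session.
let configuration = ARWorldTrackingConfiguration()
configuration.worldAlignment = .gravity  // y-axis parallel to gravity
sceneView.session.run(configuration)     // sceneView is an assumed ARSCNView
```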

Alas, you can't get pure acceleration data from the IMU through ARKit.
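That said, nothing stops you from running CoreMotion alongside an ARSession, which is the listener approach the question mentions. A minimal sketch: `CMDeviceMotion` exposes exactly the three values asked about, gravity, user acceleration, and the rotation matrix.

```swift
import CoreMotion

// Sketch: reading gravity, acceleration and the rotation matrix
// from CoreMotion while ARKit runs in parallel.
let motionManager = CMMotionManager()

func startIMUUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        let gravity = motion.gravity                    // CMAcceleration, in g units
        let acceleration = motion.userAcceleration      // total acceleration minus gravity
        let rotation = motion.attitude.rotationMatrix   // CMRotationMatrix
        print(gravity, acceleration, rotation)
    }
}
```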
