
How does Android calculate Rotation Vector?

Android documentation says here (just read the top 4-5 lines) that the Rotation Vector Sensor is software based. But most Android devices actually have three physical sensors: an accelerometer, a gyroscope, and a magnetometer. So there must be a conversion algorithm that turns data from these actual sensors into the virtual rotation vector sensor. I was unable to find any article or source code showing the algorithm used to calculate the rotation vector. If someone has experience in this area, maybe they can point me in the right direction.

I need to know whether it is necessary to get the rotation data from the rotation vector sensor at all, or whether I can compute it myself from the hardware-based sensors.

The implementation accessible through the following URL uses an Extended Kalman Filter, which is a fairly standard way of fusing accelerometer, gyroscope, and (optionally) magnetometer data:

https://android.googlesource.com/platform/frameworks/native/+/refs/heads/master/services/sensorservice/Fusion.cpp

The key methods are predict and update, which correspond to the respective stages of the Kalman Filter. The output is a quaternion (available via the getAttitude method), which is a convenient way to represent 3D orientations. The term "Rotation Vector" is just an alias used by Android for a quaternion.
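To make the predict/update split concrete, here is a minimal sketch of the same idea in Python. It is not Android's actual EKF: the predict step integrates the gyro rate into a quaternion exactly as a filter would, but the update step uses a fixed blending gain instead of a covariance-derived Kalman gain (a complementary-filter simplification), and the magnetometer correction is omitted. All function names here (quat_mul, up_in_body, predict, update) are my own, not from Fusion.cpp.

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of two quaternions stored as [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def up_in_body(q):
    """World 'up' (0, 0, 1) expressed in the body frame, for a body-to-world quaternion q."""
    w, x, y, z = q
    return np.array([2*(x*z - w*y), 2*(y*z + w*x), w*w - x*x - y*y + z*z])

def predict(q, gyro, dt):
    """Predict step: integrate the gyro rate (rad/s, body frame) into the orientation."""
    angle = np.linalg.norm(gyro) * dt
    if angle < 1e-12:
        return q
    axis = gyro / np.linalg.norm(gyro)
    dq = np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))
    return quat_mul(q, dq) / np.linalg.norm(quat_mul(q, dq))

def update(q, accel, gain=0.02):
    """Update step: nudge the estimate so its notion of 'up' matches the accelerometer.

    `gain` stands in for the Kalman gain; a real EKF computes it from the
    state and measurement covariances each step.
    """
    a = accel / np.linalg.norm(accel)
    # Small body-frame rotation that moves the predicted 'up' toward the measured one.
    err = np.cross(a, up_in_body(q))
    dq = np.concatenate(([1.0], gain * err / 2))
    q = quat_mul(q, dq)
    return q / np.linalg.norm(q)
```

In a real fusion loop you would call predict at the gyro rate and update whenever an accelerometer (and, in the full version, magnetometer) sample arrives; the accelerometer pins down tilt, while a magnetometer update of the same shape would pin down heading.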

Other alternatives for sensor fusion include Madgwick and Mahony filters. There are plenty of resources out there (and even many open implementations).
