
Sensor Fusion on iOS Devices

I'm trying to find out how I could start implementing sensor fusion on the iPhone. I've started from this talk by David Sachs:

Sensor Fusion on Android Devices

Although David's talk is very illustrative, it doesn't show any code (which makes sense). I've seen both the GLGravity example (to extract the gravity vector) and the AccelerometerGraph example, but I need some help, or at least guidance, on how to combine the accelerometer, gyroscope and compass inputs so that the result is similar to what David shows.

Thanks

UPDATE: As of May 19, 2015, there is no point in implementing sensor fusion yourself on mobile devices: both Android (SensorManager with Sensor.TYPE_ROTATION_VECTOR) and the iPhone (Core Motion's CMAttitude) offer their own.
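On the iOS side, this is a minimal sketch of reading the built-in fused orientation. CMMotionManager, CMAttitude and the xMagneticNorthZReferenceFrame reference frame are real Core Motion APIs; the surrounding setup (the motionManager variable and the startReadingAttitude function) is just illustrative scaffolding:

```swift
import CoreMotion

// Core Motion fuses gyroscope, accelerometer and (with a magnetic-north
// reference frame) magnetometer data internally and exposes the result
// as CMAttitude on each CMDeviceMotion sample.
let motionManager = CMMotionManager()

func startReadingAttitude() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0  // 60 Hz
    // .xMagneticNorthZReferenceFrame includes the compass in the fusion.
    motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZReferenceFrame,
                                           to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // Fused orientation as Euler angles (radians); quaternion and
        // rotation-matrix forms are also available on CMAttitude.
        print("roll: \(attitude.roll), pitch: \(attitude.pitch), yaw: \(attitude.yaw)")
    }
}
```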



(The original answer from May 5, 2011)

I have implemented sensor fusion for Shimmer 2 devices based on this manuscript. I highly recommend it.

Sensor fusion is often achieved with a Kalman filter.

However, there is no such thing as a "Kalman filter for programmers". The Kalman filter is difficult to understand, and you won't be able to implement and use it correctly if you do not understand it. Just use the above manuscript.
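To give a feel for the predict/update structure involved, here is an illustrative one-dimensional Kalman filter; the ScalarKalman type and its parameters are hypothetical. Real orientation fusion has a multi-dimensional state (e.g. a quaternion plus gyro bias) and is far more involved, which is exactly the point above:

```swift
// Illustrative 1-D Kalman filter (hypothetical ScalarKalman type):
// estimates a single slowly drifting value from noisy measurements,
// modeling the true value as a random walk.
struct ScalarKalman {
    var estimate: Double      // current state estimate x
    var variance: Double      // estimate variance P
    let processNoise: Double  // Q: how fast the true value may drift
    let measureNoise: Double  // R: how noisy each measurement is

    mutating func update(measurement z: Double) -> Double {
        // Predict: for a random-walk model the state is unchanged,
        // only the uncertainty grows.
        variance += processNoise
        // Update: blend prediction and measurement by the Kalman gain.
        let gain = variance / (variance + measureNoise)
        estimate += gain * (z - estimate)
        variance *= (1 - gain)
        return estimate
    }
}

// Usage: smooth a stream of noisy readings.
var filter = ScalarKalman(estimate: 0, variance: 1,
                          processNoise: 1e-4, measureNoise: 0.05)
for z in [0.9, 1.1, 1.05, 0.95, 1.0] {
    print(filter.update(measurement: z))
}
```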
