Is it feasible to create absolute 3D-space tracking using the MetaWear CPRO?

edited September 2016 in General
I am thinking about creating a 3D-space tracking system for a Mixed Reality project.

From what I found online, it seems that absolute positioning is impossible relying only on an accelerometer and gyroscope.

But since the MetaWear CPRO also has a magnetometer, I was wondering whether that helps tackle the challenge of 3D-space tracking.

Any help would be appreciated.


  • You will need to integrate the MetaWear sensor data from the API with a sensor fusion or Euler angle library.  Take a look at these example iOS projects on our GitHub page for some insights.
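If you want to see what such a fusion step looks like before pulling in a library, a complementary filter is a common minimal starting point. The sketch below is generic Python, not the MetaWear API: it blends integrated gyroscope rates with the tilt implied by the accelerometer's gravity reading to estimate roll and pitch.

```python
import math

def complementary_filter(roll, pitch, gyro, accel, dt, alpha=0.98):
    """Fuse one gyro (rad/s) and accel (m/s^2) sample into roll/pitch (rad).

    alpha weights the integrated gyro rates; (1 - alpha) pulls the estimate
    toward the accelerometer's gravity-based tilt, limiting gyro drift.
    """
    gx, gy, _gz = gyro
    ax, ay, az = accel

    # Tilt angles implied by the gravity vector in the accelerometer reading.
    accel_roll = math.atan2(ay, az)
    accel_pitch = math.atan2(-ax, math.hypot(ay, az))

    # Blend integrated gyro rates with the accelerometer tilt.
    roll = alpha * (roll + gx * dt) + (1 - alpha) * accel_roll
    pitch = alpha * (pitch + gy * dt) + (1 - alpha) * accel_pitch
    return roll, pitch
```

Fed one (gyro, accel) sample per update at the sensor's output rate, the gyro term stays responsive while the accelerometer term reins in long-term tilt drift. Yaw still needs the magnetometer, since gravity carries no heading information.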

  • Great to hear that it is possible.

    Is there a sensor fusion or Euler angle library for Android or Windows as well?
  • From what I've seen, estimating object orientation with respect to the world frame (e.g. in the form of quaternions or Euler angles) is the first step in 3D tracking.

    After this, people typically use the estimated orientation to convert acceleration, which is measured with respect to the object body frame, to the world frame. The acceleration measurements can then be integrated to give velocity and position estimates (after compensating for the effect of gravity).

    3D tracking using inertial measurements alone doesn't tend to work well because errors in these estimates accumulate over time. As a result, you'll see the estimated position drift further and further from the true position. I think including magnetometer information can only slow the rate of this drift, since it constrains heading rather than position.

    To mitigate this problem, people try to incorporate periodic updates on one (or more) of the estimated quantities. For example, you might update the position using estimates from GPS or a computer vision system, or you might update the velocity when you are confident the object isn't in motion.

    If a feasible 3D object tracker based on inertial measurements and dead reckoning does exist, I'd love to hear about it.
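To make the pipeline described above concrete, here is a rough sketch in plain Python (hypothetical data, not the MetaWear API): rotate each body-frame acceleration sample into the world frame using the fused orientation quaternion, subtract gravity, and integrate twice. Any residual accelerometer bias integrates into a linear velocity error and a quadratic position error, which is exactly the drift discussed above.

```python
def quat_rotate(q, v):
    """Rotate vector v from body frame to world frame by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # v' = q * v * q^-1, expanded for efficiency with t = 2 * cross(q.xyz, v).
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    return (
        vx + w * tx + (y * tz - z * ty),
        vy + w * ty + (z * tx - x * tz),
        vz + w * tz + (x * ty - y * tx),
    )

GRAVITY = 9.81  # m/s^2, assumed along world +z

def dead_reckon(samples, dt):
    """Integrate (quaternion, body-frame accel) samples into a world-frame path.

    Errors accumulate: accelerometer bias integrates into a linear velocity
    error and a quadratic position error, so real systems add corrections.
    """
    vel = [0.0, 0.0, 0.0]
    pos = [0.0, 0.0, 0.0]
    path = []
    for q, accel_body in samples:
        ax, ay, az = quat_rotate(q, accel_body)
        az -= GRAVITY  # remove gravity in the world frame
        for i, a in enumerate((ax, ay, az)):
            vel[i] += a * dt
            pos[i] += vel[i] * dt
        path.append(tuple(pos))
    return path
```

In this sketch, a zero-velocity update is just resetting `vel` to `[0.0, 0.0, 0.0]` whenever an external cue (e.g. the variance of recent accelerometer readings) says the object is stationary.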

  • I am doing research implementing sensor fusion and space tracking confined to a very small area. The code on GitHub works great. I would love to see a Swift version soon!

  • This sounds very interesting, and a Swift version would be great!
This discussion has been closed.