MetaWear CPRO Sensor Fusion

There was a previous post (Sensor fusion / quaternion / absolute orientation) regarding sensor fusion on the RPRO board. Since the RPRO doesn't have a magnetometer, there's no way to get an absolute quaternion orientation. However, the CPRO does have a magnetometer. Is getting an absolute orientation (quaternion or Euler angles) still not possible? I can obviously fuse the data after collection, but it would be much easier if this were something that could be obtained through the API.


  • The API does not perform sensor fusion; you will have to do the computations yourself.  Ideally, this would be done on the board, but our current SoC cannot perform these calculations.
  • Ideally what I'm looking for is something similar to Android's Rotation_Vector. Android uses the same sensors that are available on the CPRO to return a rotation axis and an angle of rotation about that axis. This can be converted to a quaternion if necessary. Having this available through the API would greatly reduce the amount of data that I need to stream back to the phone, and the amount of processing that has to be done once I get the data. This feature has been available on Android phones with the same hardware for a long time. I mistakenly assumed that the CPRO would be able to fuse the data together in a similar way.
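    The axis/angle to quaternion conversion mentioned above is straightforward. A minimal sketch (not from the MetaWear API; the function name is made up for illustration):

    ```python
    import math

    def axis_angle_to_quaternion(axis, angle_rad):
        """Convert a rotation axis (3-vector) and an angle in radians
        to a unit quaternion in (w, x, y, z) order."""
        ax, ay, az = axis
        norm = math.sqrt(ax * ax + ay * ay + az * az)
        ax, ay, az = ax / norm, ay / norm, az / norm  # normalize defensively
        half = angle_rad / 2.0
        s = math.sin(half)
        return (math.cos(half), ax * s, ay * s, az * s)

    # A 90-degree rotation about z gives (cos 45, 0, 0, sin 45).
    w, x, y, z = axis_angle_to_quaternion((0.0, 0.0, 1.0), math.pi / 2)
    ```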
  • Sorry, missed your comment when posting. Is this something that would be available with future firmware updates, or is the SoC now powerful enough to perform the necessary computations? My main concern is that for good sensor fusion, I need high-frequency data from all the sensors. I think I'll likely saturate the bandwidth of the Bluetooth connection trying to stream all that data.
  • The SoC has not changed; we are still using a Cortex-M0.  Down the line we do want to switch to a Cortex-M4, which can do the sensor fusion on board.
  • We're using the BNO055 via I2C to quickly address the sensor fusion issue. However, that's not going completely smoothly either: BNO055 via I2C. Hopefully we'll be able to figure this out internally or via the community, and it'll give you an option for quick(ish) access to quaternions.
  • I am new to sensor fusion. Do you have some example code that I can use for fusing the MetaWear CPRO data to get yaw, pitch, and roll? I don't need to do it in real time; I can do post-processing.
  • Fusing data is not a trivial matter for 9-axis sensor fusion (accelerometer, gyro, & magnetometer). There are entire third-party libraries (mostly commercial) that perform the sensor fusion. It requires using a Kalman filter with data sampled at a very high rate. Unfortunately, the CPRO board can't stream data at a fast enough rate across the BTLE link. There are simpler 6-axis algorithms (accelerometer & gyro) that can get decent results with lower data rates. Unfortunately, without the magnetometer info, they can't give you an absolute orientation in space, just an orientation relative to the starting position. They are also susceptible to gyro drift error over time, although that can be minimized with some other techniques.
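    To illustrate why the magnetometer matters: the accelerometer alone can recover roll and pitch from the gravity vector, but gravity gives no heading reference, so yaw is unobservable without the magnetometer (or a gyro starting point). A small sketch, assuming a static board and sensor axes where +z points up:

    ```python
    import math

    def tilt_from_accel(ax, ay, az):
        """Estimate roll and pitch (radians) from a static accelerometer
        reading of gravity. Yaw cannot be recovered this way: rotating
        about the gravity vector leaves the reading unchanged."""
        roll = math.atan2(ay, az)
        pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        return roll, pitch

    # Board lying flat: gravity is entirely along +z, so roll = pitch = 0.
    roll, pitch = tilt_from_accel(0.0, 0.0, 9.81)
    ```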
  • OK, thanks. From what I heard, MbientLab is going to release the sensor fusion algorithm in a couple of months. Till then I would like to play with relative orientation using just accelerometer/gyro data. Do you have any generic examples/scripts?

  • Unfortunately, I don't have any examples that I am allowed to share publicly. You could try these two. They use a complementary filter instead of a Kalman filter, so they're significantly simpler.
    The first is 6-axis (accel & gyro); the second is 9-axis.
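    The core idea of a complementary filter fits in one line: trust the gyro over short timescales (integrate its rate) and the accelerometer over long ones (blend toward its angle estimate), which cancels gyro drift. A minimal single-axis sketch with a made-up blend factor of 0.98:

    ```python
    def complementary_update(angle, gyro_rate, accel_angle, dt, alpha=0.98):
        """One complementary-filter step: integrate the gyro rate for
        short-term accuracy, then nudge the result toward the
        accelerometer-derived angle to bound long-term drift."""
        return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

    # Stationary board: gyro reads ~0 deg/s, accel says the true tilt is
    # 10 degrees. The estimate converges from 0 toward 10 over time.
    angle = 0.0
    for _ in range(500):          # 5 seconds of samples at 100 Hz
        angle = complementary_update(angle, 0.0, 10.0, dt=0.01)
    ```

    The same structure extends to 9-axis fusion by also blending a magnetometer-derived yaw, which is what gives the absolute heading discussed above.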
  • Thank you.
This discussion has been closed.