Using SensorFusion to recognize distinct movements

Hello, 

I'm making an app which is meant to recognize different movements in the air. I am using the MetaMotionR, and the orientation of the module varies from time to time. I wanted to use quaternions from the SensorFusion module to determine the relative rotation. E.g. I have two quaternions corresponding to the start and end points of the movement - Q1 and Q2 - and then I calculate the quaternion needed to rotate from Q1 to Q2. The problem is that this result differs from run to run depending on how the sensor is oriented. Any good ideas on how I could go about this?
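To make the calculation concrete, here is a minimal sketch of what I mean (plain Java, assuming unit quaternions in (w, x, y, z) order; the class and method names are mine for illustration, not MetaWear API). Note that there are two ways to form the "rotation from Q1 to Q2", depending on whether you pre- or post-multiply, and only the body-frame version is independent of how the board happened to be oriented at the start:

```java
// Hypothetical helper class; unit quaternions as double[4] = {w, x, y, z}.
public class QuatDelta {
    // Hamilton product a ⊗ b
    static double[] mul(double[] a, double[] b) {
        return new double[] {
            a[0]*b[0] - a[1]*b[1] - a[2]*b[2] - a[3]*b[3],
            a[0]*b[1] + a[1]*b[0] + a[2]*b[3] - a[3]*b[2],
            a[0]*b[2] - a[1]*b[3] + a[2]*b[0] + a[3]*b[1],
            a[0]*b[3] + a[1]*b[2] - a[2]*b[1] + a[3]*b[0]
        };
    }

    // Conjugate; equals the inverse for unit quaternions
    static double[] conj(double[] q) {
        return new double[] { q[0], -q[1], -q[2], -q[3] };
    }

    // Rotation from q1 to q2 expressed in the fixed (world) frame:
    // satisfies deltaWorld ⊗ q1 == q2. This one changes when the
    // board starts in a different orientation.
    static double[] deltaWorld(double[] q1, double[] q2) {
        return mul(q2, conj(q1));
    }

    // The same movement expressed in the sensor (body) frame:
    // satisfies q1 ⊗ deltaBody == q2. This one is the same regardless
    // of the board's start orientation, so it is the candidate for
    // movement recognition.
    static double[] deltaBody(double[] q1, double[] q2) {
        return mul(conj(q1), q2);
    }

    // Predict the end orientation when repeating a learned body-frame
    // movement from a new start orientation.
    static double[] predict(double[] q1New, double[] deltaBody) {
        return mul(q1New, deltaBody);
    }
}
```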

Also, are the quaternions from the sensor fusion module rotations from a fixed starting point, or are they directions describing the orientation of the sensor module? I'm assuming they are rotations, but what is the origin then, i.e. the quaternion from which they are rotated? (I hope the questions make sense.)


Comments

  • The sensor fusion algorithm describes device orientation.  

    What configuration values are you using and can you post quaternion data that showcases the issue you are describing?
  • The module is configured as:

        if (sensorFusionModule1 != null) {
            sensorFusionModule1.configure()
                    .setMode(SensorFusion.Mode.NDOF)
                    .setAccRange(SensorFusion.AccRange.AR_2G)
                    .setGyroRange(SensorFusion.GyroRange.GR_500DPS)
                    .commit();

            sensorFusionModule1.routeData().fromLinearAcceleration()
                    .stream("sensor_fusion_stream_from_linear_acc").commit()
                    .onComplete(onCompleteSensorFusionHandler1);
            sensorFusionModule1.routeData().fromQuaternions()
                    .stream("sensor_fusion_stream_from_quaternion").commit()
                    .onComplete(onCompleteSensorFusionHandler1);
        }
    I have two scenarios. In the learning scenario (scenario 1 in the image below) I rotate the sensor by a known angle, e.g. 90 degrees.
    From the quaternions I get at the two positions I can easily calculate the quaternion needed to rotate from orientation 1 to orientation 2.
    Now I want to make a similar rotation in the same reference system, but with the sensor board starting in a different, random orientation (as in scenario 2 in the image below).
    Here I know the start orientation, which we can call the calibration value, and I want to be able to predict (calculate) the quaternion I will get when I rotate the sensor board the same way (e.g. 90 degrees).
    So how can I do this?
    [image: scenarios 1 and 2]
    

  • NDoF mode incorporates magnetometer data into the computation; try using IMUPlus mode instead.
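A minimal sketch of that change, assuming the same API style as the configuration code above (the enum value `IMU_PLUS` is my assumption of the constant's name in this API version; check the `SensorFusion.Mode` constants available in your SDK):

```java
sensorFusionModule1.configure()
        .setMode(SensorFusion.Mode.IMU_PLUS)   // gyro + accelerometer only, no magnetometer
        .setAccRange(SensorFusion.AccRange.AR_2G)
        .setGyroRange(SensorFusion.GyroRange.GR_500DPS)
        .commit();
```

In IMUPlus mode the heading is no longer referenced to magnetic north, so the quaternion's yaw origin is arbitrary at startup, which can actually help when you only care about relative rotations.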
This discussion has been closed.