Using SensorFusion to recognize distinct movements
Hello,
I'm making an app that is meant to recognize different movements in the air. I am using the MetaMotion R, and the orientation of the module varies from one session to the next. I wanted to use the quaternions from the SensorFusion module to determine the relative rotation. E.g. I have two quaternions corresponding to the start and end points of the movement, Q1 and Q2, and I'm calculating the quaternion needed to rotate from Q1 to Q2. The problem is that this result differs from one trial to the next depending on how the sensor is oriented. Any good ideas on how I would go about this?
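For concreteness, here is a minimal sketch of the kind of relative-rotation calculation I mean (plain Python/NumPy for illustration, not MetaWear SDK code; the helper names are mine, and I'm assuming unit quaternions in (w, x, y, z) order that rotate the sensor/body frame into the Earth frame):

```python
# Illustrative sketch only: compute the rotation between two fusion quaternions
# in the sensor's own starting frame, so the result should not depend on how
# the module happens to be oriented in the world frame.
import numpy as np

def quat_conjugate(q):
    """Conjugate (equals the inverse for unit quaternions)."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_multiply(a, b):
    """Hamilton product a * b, both in (w, x, y, z) order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    ])

def relative_rotation_body(q1, q2):
    """Rotation from the pose at Q1 to the pose at Q2, expressed in the
    sensor's frame at Q1; left-multiplying by the inverse of Q1 removes the
    arbitrary world orientation (assuming q maps body -> world)."""
    return quat_multiply(quat_conjugate(q1), q2)

def to_axis_angle(q):
    """Convert a unit quaternion to (axis, angle in radians), which may be
    easier to compare across repetitions of the same movement."""
    w, xyz = q[0], q[1:]
    n = np.linalg.norm(xyz)
    if n < 1e-9:
        return np.array([1.0, 0.0, 0.0]), 0.0
    return xyz / n, 2.0 * np.arctan2(n, w)
```

Note that the resulting axis is still expressed in the sensor's starting frame, so if the movement should instead be compared in a gravity-aligned frame, some extra handling would be needed.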
And also: are the quaternions from the SensorFusion module rotations from a fixed starting point, or are they directions describing the orientation of the sensor module? I'm assuming they are rotations, but what is then the origin, i.e. the quaternion from which they are rotated? (I hope the questions make sense.)
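To make the question concrete: if they are orientations relative to some fixed Earth reference frame (with the identity quaternion meaning the sensor axes are aligned with that frame), I would expect to be able to rotate a vector measured in the sensor frame into that reference frame by conjugation, e.g. something like this (again just an illustrative sketch reusing the helpers above, not MetaWear API code):

```python
def rotate_vector(q, v):
    """Rotate a body-frame vector v into the reference frame: q * (0, v) * q'.
    Assumes q is a unit quaternion mapping the sensor frame to that frame."""
    qv = np.array([0.0, *v])
    return quat_multiply(quat_multiply(q, qv), quat_conjugate(q))[1:]
```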
Comments