# Taring quaternion attitude

Hello everyone, I have a question concerning the base (default) orientation of the sensors. I'm interested in using the sensors in pairs to find differences in orientation between two sensor mounting positions.

My question is whether the sensor reference orientation is fixed or can be changed; the reference output is the identity quaternion (W:1, X:0, Y:0, Z:0). When I align the sensors in the same physical orientation they do not output exactly the same attitude estimate, and it would benefit me greatly to be able to adjust this "tare" or reference position on each sensor so that they all share the same reference.

• How much of a difference is there between the two initial values?  What if you instead compute relative orientation changes from the initial value?
• edited March 2017
There is up to a 30-degree difference in reference orientation between some pairs of sensors. Once I find the difference in orientation between the two values, I extract the rotation about two independent axes, for example rotation about y and rotation about x. My concern is that the axes are not aligned between sensors and I will get a false rotation about an axis, even though the quaternion angle between the two sensors is valid.

I should add as well that my application is made such that a new sensor can be swapped out at any time if a battery dies or damage occurs, so adding an offset in the application for each sensor would require an extra calibration step that is not ideal. If the offset could be coded onto the board somehow that would solve my problem I believe, and from the application's point of view the references would be identical.
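A minimal sketch of the two-axis extraction described above (an illustrative `Quat` type, not the SDK's, using the standard z-y-x Tait-Bryan decomposition):

```swift
import Foundation

// Minimal quaternion type; field order (w, x, y, z) matches the sensor output in this thread.
struct Quat {
    var w, x, y, z: Double
}

// Extract rotation about x (roll) and rotation about y (pitch) from a unit
// quaternion using the standard z-y-x Tait-Bryan decomposition.
func rollPitch(_ q: Quat) -> (roll: Double, pitch: Double) {
    let roll = atan2(2 * (q.w * q.x + q.y * q.z),
                     1 - 2 * (q.x * q.x + q.y * q.y))
    // Clamp the asin argument to guard against numerical noise outside [-1, 1].
    let s = max(-1.0, min(1.0, 2 * (q.w * q.y - q.z * q.x)))
    return (roll, asin(s))
}
```

These per-axis angles only agree between two sensors if their axis frames are aligned, which is exactly the concern raised above.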
• I'm doing exactly the same thing in my project. I asked this question before, and the response I got was to soft reset the board, which isn't great because you have to reconnect afterward, but it does seem to reset them to the same orientation origin each time. So hopefully MbientLab can add an easy way to reset back to that "clean" origin.

For now, as far as I can tell, there's no way to do it without individual calibration: each sensor drifts over time, and the drift is unique to the sensor. I sync/calibrate my boards by saving their current orientation reading and applying its inverse to all further readings to get an adjusted reading.

I'm interested in hearing whether there is a better solution.
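A minimal sketch of that save-and-invert calibration (assuming unit quaternions; the `Quat` type and Hamilton product are spelled out for completeness and are illustrative, not SDK types):

```swift
import Foundation

// Minimal unit-quaternion type with just what the tare needs.
struct Quat {
    var w, x, y, z: Double

    // For a unit quaternion the inverse is simply the conjugate.
    func conjugate() -> Quat {
        return Quat(w: w, x: -x, y: -y, z: -z)
    }

    // Hamilton product a ⊗ b.
    static func * (a: Quat, b: Quat) -> Quat {
        return Quat(
            w: a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
            x: a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
            y: a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
            z: a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w)
    }
}

// Capture `tare` once, while the sensor sits in the shared reference pose:
//     let tare = referenceReading.conjugate()
// then apply it to every subsequent sample; the reference pose maps to identity.
func applyTare(_ tare: Quat, to sample: Quat) -> Quat {
    return tare * sample
}
```

Because the tare is captured per sensor at a known shared pose, a swapped-in board only needs one reading in the reference pose to be synced with the others.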
• Thanks for the response lionfish, I will have a go with the soft reset option. If I come across another solution I will post it here.
• Can you post the data that you are receiving from the two boards along with the code you are using to configure and use the sensor fusion algorithm?
• edited April 2017
To configure I use the following line for both the upper and lower node:

self.getLeftLowerSensor().sensorFusion!.mode = MBLSensorFusionMode(rawValue: 2)!;

Then I start listening for the quaternion events for each one like this:

self.getLeftLowerSensor().sensorFusion!.quaternion.startNotificationsAsync { (data, error) in

    if let data = data {

        quaternionSync(1, data, error);

    }

}

And when I run this, the two nodes are sitting in the same orientation on the desk. These are the readings of each sensor:

 w:0.98220957 x:-0.01696632 y:-0.0017892 z:0.187011687

 w:0.950367987 x:0.01873878 y:0.035578135 z:-0.308518827
This is a 15-degree difference while they are sitting in the same orientation.

• Performing a soft reset of the boards does seem to bring them to the same attitude reading when they are in the same physical orientation. So maybe my solution is some way of performing this reset without disconnecting the sensors from the app.
• Eric, I've got an updated solution to my issue that I thought I would share. It seems the sensor fusion corrects for drift while the sensors are at rest, when the Earth's field measurements can be trusted more heavily. I changed the sensor mode to 1, the NDoF mode; initially I had avoided this because I'm working in an area of high magnetic interference. With the mode set to 1, I zero the z component of the quaternions I receive, as this axis seems most affected by the compass. After renormalizing the quaternion, my readings seem very accurate, with the drift corrected every time the sensors return to rest. I lose a degree of freedom, but for my application this works.

Lionfish, this may help you; I'm not sure. I did need to change my sensor mounting orientation to avoid relying on the z axis.
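A minimal sketch of the zero-z-and-renormalize step described above (illustrative `Quat` type):

```swift
import Foundation

struct Quat {
    var w, x, y, z: Double
}

// Zero the z component and renormalize, as described above. This is a lossy
// projection: it only yields a sensible attitude when the mounting never
// rotates about z, which is why the mounting orientation had to change.
func zeroZAndRenormalize(_ q: Quat) -> Quat {
    let n = (q.w * q.w + q.x * q.x + q.y * q.y).squareRoot()
    return Quat(w: q.w / n, x: q.x / n, y: q.y / n, z: 0)
}
```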
• For the specific magnetometer on these boards (the BMM150), field strength along the x and y axes is measured with a different sensing element than the z axis (FlipCore vs. a Hall plate), and the two have different ranges. This might explain why the z axis behaves differently from the x and y axes.
• @Riley @lionfish

The way the sensor fusion module is presently implemented, explicitly disabling it resets the fusion engine's state. Once you re-enable the module, this should be equivalent to a soft reset, and it will not disconnect your device or reset other device settings.
• Thanks Matt, that is a great fix. Could this be accomplished by changing the sensor fusion mode, to 3 for example, and then back again?

• Changing the mode should also reset the internal state.  Try it out and let us know if this is the case.
• No, resetting the state has not fixed the issue. The sensors I have do not share the same axis alignment, even after a factory reset or a soft reset.

My process:

1. Put four sensors in a line on my desk, all in the same orientation on a flat surface.
2. Connect to all the sensors using the MetaWear app from the Apple App Store.
3. Reset all sensors to factory defaults.
4. Connect once again to all sensors and perform a soft reset.
5. Without moving the sensors, connect all four sensors to my application.
6. Upon connection, my application sets the quaternion mode to 2, and then to 1, for each sensor (performing the solution proposed by Matt).
7. Begin streaming using the quaternion periodic sample filter with time 30; my application syncs data between the two pairs of sensors.
I get a quaternion from each sensor and perform the following to get the difference in orientation as q3.
q3 = quatmultiply(q1, conjugate(q2))

where

func quatmultiply(_ r: Quaternion, _ q: Quaternion) -> Quaternion {
    // With these signs this computes the Hamilton product q ⊗ r (note the operand order).
    let n0 = r.w*q.w - r.x*q.x - r.y*q.y - r.z*q.z
    let n1 = r.w*q.x + r.x*q.w - r.y*q.z + r.z*q.y
    let n2 = r.w*q.y + r.x*q.z + r.y*q.w - r.z*q.x
    let n3 = r.w*q.z - r.x*q.y + r.y*q.x + r.z*q.w
    return Quaternion(w: n0, x: n1, y: n2, z: n3)
}

and

func conjugate(_ q: Quaternion) -> Quaternion {
    // Return a new value rather than mutating the input sample in place.
    return Quaternion(w: q.w, x: -q.x, y: -q.y, z: -q.z)
}

I get the magnitude change in rotation of q3 by: