Applying magnetometer data to graphics object
Hello,
I'm trying to mirror the orientation of the MetaMotion R sensor in a graphical object (SceneKit).
I've managed to tie the two together, but the mapping is off, and it's almost certainly because I don't really understand the data coming out. So what I'm looking for, first of all, is an explanation of the data, some background.
The specific issue I'm having is that the object seems to 'over-react' compared to the sensor: turning the sensor 180 degrees will, for some angles, turn the object all the way around (360 degrees), but it doesn't seem consistent across angles.
What I'm doing in the code is simply assigning the x, y, z values from the sensor data directly to the absolute rotation of the object.
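In SceneKit terms it boils down to this (a simplified sketch; `node` and `data` stand in for my actual SceneKit node and the value from the sensor callback):

    // Naive mapping: the raw magnetometer components are a magnetic field
    // vector, not a set of angles, but I assign them as Euler rotations anyway.
    // SceneKit reads eulerAngles in radians, so any component larger than
    // about 6.28 already means more than one full turn.
    node.eulerAngles = SCNVector3(data.x, data.y, data.z)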
I do have several other options, however: Euler angles, a quaternion, and a choice of absolute or relative rotation. But I need a better understanding of how to map the data onto one of these choices.
Any tips or suggestions on how to proceed from here would be greatly appreciated.
Comments
Calibrate the sensor fusion algorithm and use the quaternion output.
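In NDoF mode the fusion algorithm combines the accelerometer, gyroscope, and magnetometer into a drift-corrected absolute orientation, and the magnetometer calibrates itself as you move the board in figure-eight motions while data is streaming. Configuration is just a couple of calls (a sketch against the C API; `board` is your connected MblMwMetaWearBoard pointer):

    // Select NDoF fusion mode and push the config to the board.
    mbl_mw_sensor_fusion_set_mode(board, MBL_MW_SENSOR_FUSION_MODE_NDOF)
    mbl_mw_sensor_fusion_write_config(board)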
Great tip, thanks.
I have to say, however, that your Swift documentation is somewhat lacking.
For the benefit of anyone who agrees, here is roughly the code I ended up with (trimmed down, with placeholder names for my own classes and nodes). After connecting to the board, it goes like this:
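    // Sketch assuming the MetaWear iOS SDK's C bindings; ViewController,
    // device, and boxNode are placeholders for my own names.
    import UIKit
    import SceneKit
    import MetaWear
    import MetaWearCpp

    class ViewController: UIViewController {
        var device: MetaWear!   // connected elsewhere, before this runs
        var boxNode: SCNNode!   // the SceneKit object mirroring the sensor

        func startOrientationStream() {
            let board = device.board

            // NDoF mode fuses accelerometer + gyro + magnetometer into an
            // absolute, drift-corrected orientation.
            mbl_mw_sensor_fusion_set_mode(board, MBL_MW_SENSOR_FUSION_MODE_NDOF)
            mbl_mw_sensor_fusion_write_config(board)

            // Subscribe to the fused quaternion signal. The C callback cannot
            // capture Swift state, so `self` is passed through the context pointer.
            let signal = mbl_mw_sensor_fusion_get_data_signal(board, MBL_MW_SENSOR_FUSION_DATA_QUATERNION)!
            let context = Unmanaged.passUnretained(self).toOpaque()
            mbl_mw_datasignal_subscribe(signal, context) { context, data in
                let quat: MblMwQuaternion = data!.pointee.valueAs()
                let this = Unmanaged<ViewController>.fromOpaque(context!).takeUnretainedValue()
                DispatchQueue.main.async {
                    // SCNQuaternion is (x, y, z, w); MblMwQuaternion puts w first.
                    this.boxNode.orientation = SCNQuaternion(x: quat.x, y: quat.y, z: quat.z, w: quat.w)
                }
            }

            mbl_mw_sensor_fusion_enable_data(board, MBL_MW_SENSOR_FUSION_DATA_QUATERNION)
            mbl_mw_sensor_fusion_start(board)
        }
    }

One gotcha: the data callback arrives off the main thread, hence the dispatch back onto main before touching the scene graph.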
@Tobias,
We are here to provide you with the sensor data. Everyone does something different with it, so all we do is make sure you get as much raw data as possible.
After that it is up to you to calibrate the sensor and use the data properly as per your needs.
Learn a little about processing sensor data, filtering, and so on; it will go a long way.
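For instance, a one-line exponential moving average (a basic low-pass filter, independent of any SDK) already smooths out most jitter:

    // Exponential moving average: a basic low-pass filter. A smaller alpha
    // gives smoother output but lags further behind the sensor.
    func lowPass(previous: Float, current: Float, alpha: Float = 0.2) -> Float {
        return previous + alpha * (current - previous)
    }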