Relative to the initial orientation. This means that you need to know the position at time 0 in order to get the position at time t.
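Here is a minimal sketch of that composition in Python. I'm assuming unit quaternions in (w, x, y, z) order and that the sensor streams the orientation at time t relative to time 0; the variable names and sample value are mine, not from any particular sensor API:

```python
def quat_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw * bw - ax * bx - ay * by - az * bz,  # w
        aw * bx + ax * bw + ay * bz - az * by,  # x
        aw * by - ax * bz + ay * bw + az * bx,  # y
        aw * bz + ax * by - ay * bx + az * bw,  # z
    )

# Known orientation of the body at time 0 (identity here, i.e. the rest pose).
q0 = (1.0, 0.0, 0.0, 0.0)

# One relative sample from the sensor stream (made-up value for illustration).
q_rel = (0.924, 0.0, 0.383, 0.0)  # roughly 45 degrees about the y axis

# Absolute orientation at time t = known initial orientation composed with q_rel.
q_abs = quat_mul(q0, q_rel)
print(q_abs)
```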
So for example, I place the sensor on my right arm and move it around (maybe I am dancing).
I know where my right arm started; in this case, say it was resting along my right side, palm facing up and dangling freely. I also know the sensor is placed just above my wrist.
Then I record some sensor-fusion data as I dance and finish back in the same resting position.
Now I can use the quaternion data from the sensor to draw, in 3D, an exact replica of my arm's movement over time. In other words, I can replay my arm's dance and visualize it in 3D, because I know the start position and I have the quaternion data showing how the arm moved relative to it. I personally like to do this in Python or with 3D models.
If I did not know the initial position of my arm, I could not do this, because I would only know the relative orientation.
You need the starting position + relative orientation to calculate absolute position.
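To make that concrete, here is a rough sketch of the reconstruction, not a definitive implementation. The geometry, the sample quaternions, and the names (elbow, forearm0) are all made up for illustration, and I'm again assuming (w, x, y, z) unit quaternions relative to the pose at time 0:

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate 3-vector v by quaternion q = (w, x, y, z)."""
    q = np.asarray(q, dtype=float)
    q = q / np.linalg.norm(q)  # normalize defensively
    w, u = q[0], q[1:]
    # Standard identity: v' = v + 2 * u x (u x v + w * v)
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

# Hypothetical recording: one (w, x, y, z) sample per frame,
# each relative to the orientation at time 0 (so the first is identity).
samples = [
    (1.0, 0.0, 0.0, 0.0),      # t = 0: the known resting pose
    (0.924, 0.0, 0.383, 0.0),  # ~45 degrees about y
    (0.707, 0.0, 0.707, 0.0),  # ~90 degrees about y
]

# Known starting geometry (made-up numbers): the forearm hangs straight
# down from the elbow, and the sensor sits just above the wrist.
elbow = np.array([0.0, 1.0, 0.0])
forearm0 = np.array([0.0, -0.3, 0.0])  # elbow -> wrist vector at t = 0

# Absolute wrist position per frame: rotate the known initial forearm
# vector by each relative quaternion, then anchor it at the elbow.
for t, q in enumerate(samples):
    wrist = elbow + quat_rotate(q, forearm0)
    print(t, np.round(wrist, 3))
```

The same loop can feed a 3D plot (e.g. matplotlib) if you want to actually see the dance replayed.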
Comments
Relative to the initial orientation from what? I mean, from the sensor's calibration, from the sensor's manufacture, or from the sensor's configuration just before a data-emission session starts?
I'd like to know how this works, because I want to achieve the last one, and I keep getting the same reference frame every time I re-configure the sensor.