How does SensorFusion's IMUPlus mode really work?

edited November 2019 in Android

Hey all,

I'd like to understand how the IMUPlus mode works. More specifically, it would be great if someone could clarify when the relative reference frame is defined: at calibration time, or when a session is configured?

I use this configuration for the SensorFusion module, and I keep getting the same relative coordinate system.

sensorFusion.configure()
        .mode(Mode.IMUPlus)
        .accRange(AccRange.AR_16G)
        .gyroRange(GyroRange.GR_2000DPS)
        .commit();

Comments

  • This should help: https://mbientlab.com/androiddocs/latest/sensor_fusion.html#configuration
    You can infer the orientation once you start receiving data, as it depends on your setup.

  • edited November 2019

    The code I posted above has a mistake: I actually use IMUPlus mode. I'm sorry!

    I've already studied this section of the docs. I'll try to make my concern clearer with an example:

    Scenario 1:
    I have the sensor on a table (the sensor's z-axis perpendicular to the table) and configure the SensorFusion object using Mode.IMUPlus. I don't move the sensor at all and collect some quaternion samples (relative to the sensor's initial position). Then I stop the sensor using sensorFusion.stop();

    Scenario 2:
    Exactly the same setup and process as Scenario 1, except that I rotate the sensor by 90 degrees around the z-axis before configuring it. Then I configure it, collect some samples, and stop it.

    I'd expect to receive the same quaternion values in both scenarios, because in each case the sensor never moved from its starting position, shouldn't I?!

    Thank you in advance for your help!
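    The two scenarios can also be checked numerically. "Relative orientation" just means each sample is compared against a reference pose, q_rel = conj(q_ref) * q. If the filter really re-zeroed its reference on every configure/start, a sensor at rest would report the identity quaternion in both scenarios, no matter the 90-degree offset. A minimal sketch in plain Java (the QuatDemo class and its helpers are mine for illustration, not part of the MetaWear SDK):

```java
// Minimal quaternion helper (hypothetical, NOT a MetaWear SDK class)
// used to reason about "relative orientation": q_rel = conj(q_ref) * q.
public class QuatDemo {
    // Conjugate of a quaternion stored as {w, x, y, z}
    static double[] conj(double[] q) {
        return new double[]{q[0], -q[1], -q[2], -q[3]};
    }

    // Hamilton product a * b
    static double[] mul(double[] a, double[] b) {
        return new double[]{
            a[0]*b[0] - a[1]*b[1] - a[2]*b[2] - a[3]*b[3],
            a[0]*b[1] + a[1]*b[0] + a[2]*b[3] - a[3]*b[2],
            a[0]*b[2] - a[1]*b[3] + a[2]*b[0] + a[3]*b[1],
            a[0]*b[3] + a[1]*b[2] - a[2]*b[1] + a[3]*b[0]
        };
    }

    public static void main(String[] args) {
        // 90-degree rotation about z: w = cos(45 deg), z = sin(45 deg)
        double s = Math.sqrt(0.5);
        double[] rotated = {s, 0, 0, s};

        // A sensor's orientation relative to its own starting pose
        // is the identity, regardless of that starting pose:
        double[] rel = mul(conj(rotated), rotated);
        System.out.printf("%.3f %.3f %.3f %.3f%n",
                rel[0], rel[1], rel[2], rel[3]);
        // prints: 1.000 0.000 0.000 0.000
    }
}
```

    So if both scenarios report the same non-identity values at rest, the filter must be carrying over a reference (or internal state) from before the restart.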

  • edited November 2019

    I'd expect to receive the same quaternion values in both scenarios, because in each case the sensor never moved from its starting position, shouldn't I?!

    Yes, if you use IMU_PLUS. Can you try NDOF mode? Would you expect to get the same quaternion despite the 90-degree rotation?

    @Laura I think what he is asking has nothing to do with calibration. He uses IMU_PLUS, not NDOF, so there is no absolute orientation.

  • @nekos, thanks, you are right.

  • Hi @nekos, thank you for your help!

    The problem is that I don't get the same quaternion values for Scenarios 1 and 2.

    It seems that they use the same relative coordinate system, which is odd considering that their initial orientations were different.

    Am I making a mistake in the configuration, or did I miss something?

    With NDOF mode I see exactly the same behavior. The only difference is that the common absolute coordinate system is rotated by 90 degrees around the z-axis (a minor note, though). That part of the behavior is expected, however.

    I don't think calibration is the problem here, @Laura. The docs lack a detailed explanation of how the sensor's relative/absolute coordinate system is defined. I'd really appreciate clarification on those points. :smile:

  • @dmraptis That's very strange. I will test your scenarios with all of my sensors today.
    If you find an answer, please let me know.

  • I know, it's confusing! I'll let you know if I make any further progress! Thanks @nekos

  • Any update on this? I have the same issue with NDOF on a calibrated MetaMotionR streaming Euler angles or quaternions. The problem is on the yaw axis. To reproduce:
    1. Place your sensor on a table with its z-axis perpendicular to the table.
    2. Configure the board and start streaming data.
    3. Rotate your sensor by 90 degrees so you get yaw values different from your initial position.
    4. Stop streaming data (e.g. your initial position was 0 degrees yaw and you stopped streaming at 90).
    5. Rotate your sensor back to the initial position (without streaming).
    6. Start streaming data again: the yaw angle reads 90 degrees, which is wrong.

    Again, I am using NDOF, so I was expecting to get the absolute orientation. I tested this on multiple sensors and all of them were well calibrated.
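    Until the start behavior is clarified, a host-side workaround for the Euler case is to capture the first yaw sample after each start and subtract it from every later sample, wrapping the result back into [-180, 180). A sketch in plain Java, independent of the MetaWear SDK (the YawZeroer class is illustrative, not an SDK API):

```java
// Host-side yaw re-zeroing (illustrative helper, NOT a MetaWear SDK API):
// remember the first yaw sample after start() and subtract it afterwards.
public class YawZeroer {
    private Float offset = null;   // unset until the first sample arrives

    /** Feed raw yaw in degrees; returns yaw relative to the first sample. */
    public float rezero(float rawYawDeg) {
        if (offset == null) offset = rawYawDeg;
        float yaw = rawYawDeg - offset;
        // wrap back into [-180, 180)
        while (yaw >= 180f) yaw -= 360f;
        while (yaw < -180f) yaw += 360f;
        return yaw;
    }

    /** Call whenever streaming is (re)started. */
    public void reset() { offset = null; }

    public static void main(String[] args) {
        YawZeroer z = new YawZeroer();
        System.out.println(z.rezero(90f));  // first sample defines zero: 0.0
        System.out.println(z.rezero(135f)); // 45.0 (45 deg past start pose)
    }
}
```

    Feeding the streamed yaw values through rezero() makes every session start at 0 degrees, whatever stale heading the fusion module reports after a restart.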

  • You should look up gimbal lock and learn about the shortcomings of Euler angles versus quaternions. This might help clear things up a bit.

  • edited June 2020

    Are you sure this has anything to do with gimbal lock? You can try the same experiment with rotations of less than 90 degrees and you will get the same result. You even get the same results using quaternions. It seems that the fusion algorithm either is not using the magnetometer at all, or keeps the last state internally after you stop the algorithm. You can also see this in step 6 of my experiment, where the first values are the same as the sensor's previous known state.

    Edit: Actually, you can see the problem clearly if you use only quaternions: in steps 3 and 6 you will get the same quaternions!

  • edited June 2020

    We believe we found the problem. I was right that the magnetometer doesn't work. Here is data captured at 25 Hz with your MetaHub app after calibration; I did a full rotation around the z-axis and then a random motion.

    We have many MetaMotions, so I will test all of them. Please let me know if I can fix this somehow.

    Update: Yep, all of our sensors (12 MetaMotion R) have the same problem with magnetometer data. I also found on this forum that a lot of developers have the same issue.

  • is it a MMR+ or MMR?

  • Yep, the problem seems to occur on MMR+ sensors. We get valid magnetometer data from sensors without the vibration motor. I think you had better update your community about this issue (12 out of 12 MMR+ sensors had this problem).

    Although this fixed the magnetometer, it didn't answer my first question. I tried my experiment with correct magnetometer data and got the same results: the same quaternions in steps 3 and 6. If that is the case, you cannot recover the absolute orientation (using the same starting position + relative orientation) after the fusion algorithm terminates. In other words, you have to use a new starting position every time you stop/start sensor fusion.

  • edited June 2020

    As I suspected.

    Please read the first line on this page, in giant red letters, in the Product Details section: https://mbientlab.com/store/mmrp-metamotionrp/

    A coin vibration motor = a magnet.
    A magnetometer = measures magnetic fields.
    Therefore: motor + magnetometer = interference!

    You have two options:
    1. Remove the coin motor.
    2. Don't use the magnetometer in the sensor fusion.

  • edited June 2020

    Please read the second paragraph of my comment! The magnetometer has nothing to do with my experiment (and neither does gimbal lock).

    Although this fixed the magnetometer, it didn't answer my first question. I tried my experiment with correct magnetometer data and got the same results: the same quaternions in steps 3 and 6. If that is the case, you cannot recover the absolute orientation (using the same starting position + relative orientation) after the fusion algorithm terminates. In other words, you have to use a new starting position every time you stop/start sensor fusion.

    Update: This is certainly a bug in your Sensor Fusion module. We tested this with other sensors (not MbientLab's) and got the expected results. You can even test it with your smartphone. I am not sure why you don't care about this issue. Should I post this elsewhere (maybe a Reddit or Medium post)?

  • Unfortunately it is not our sensor fusion; it is the Bosch sensor fusion module. Maybe you can post your findings on their forum.

  • @nekos said:
    Any update on this? I have the same issue with NDOF on a calibrated MetaMotionR streaming Euler angles or quaternions. The problem is on the yaw axis. To reproduce:
    1. Place your sensor on a table with its z-axis perpendicular to the table.
    2. Configure the board and start streaming data.
    3. Rotate your sensor by 90 degrees so you get yaw values different from your initial position.
    4. Stop streaming data (e.g. your initial position was 0 degrees yaw and you stopped streaming at 90).
    5. Rotate your sensor back to the initial position (without streaming).
    6. Start streaming data again: the yaw angle reads 90 degrees, which is wrong.

    Again, I am using NDOF, so I was expecting to get the absolute orientation. I tested this on multiple sensors and all of them were well calibrated.

    I am having this exact same issue and am able to replicate it. It happens in NDOF mode on calibrated MMR boards. I get the same results in IMU Plus mode, where I would expect the initial yaw to be reset to zero since it is a relative orientation, but it uses the last known sensor position. When I reset the board and run a test, I get an initial yaw of 0 in both NDOF and IMU Plus mode; but in NDOF mode I still get zero no matter the sensor's initial orientation with respect to magnetic north.

    I am also seeing no difference in behaviour between IMU Plus mode and NDOF mode, which leads me to believe that for some reason the absolute orientation isn't actually being calculated. I also get considerable drift over time, which should not happen with sensor fusion Kalman filtering.

    I'm really looking for a solution to this, since currently the data is almost unusable.

    I'm also wondering which sensor fusion mode your Windows MetaBase application uses - I've been trying some troubleshooting tests with it.

  • @slee,
    So the metabase app works? Let me know, I can look up the config.

  • @Laura
    The MetaBase app appears to work in IMU Plus mode and reports the relative orientation, because the yaw is reset to zero after each trial. That said, the MetaBase Windows app also performs a reset on disconnect, which I have found will reset the initial position to zero in both NDOF and IMU Plus mode in my own code. If I run my own sensor fusion code, stop it without a reset, and then run the MetaBase app, the initial position ends up being the last position of the previous sensor fusion session.

    The biggest problem I am finding is that I am not actually able to get absolute orientation even in NDOF mode. (NDOF and IMU Plus appear to work in exactly the same way and give me the same orientation results - which seem to be relative orientation only.)

  • Any update?

  • edited August 2020

    I'd also like some clarification on this, and some advice on how to get a zeroed orientation when starting. The above was captured with the API test app on iOS, starting from a freshly factory-reset sensor (MetaMotion R) lying flat on a table throughout. Streaming is started (0) and shows the "zero" orientation; then the sensor is rotated 90° and the orientation changes accordingly (1). Then streaming is stopped (2).

    Now streaming is started again (3). Given that the API test app writes the sensor fusion config on streaming start, I'd expect the orientation to be the same as at point 0 (zeroed); instead it just keeps the last orientation offset, which is not zeroed. Now stop streaming, rotate the sensor back to its original orientation (4), and start streaming again (5): the orientation is still the same as when rotated, and still not zeroed.

    The only way I have found to get back to a zeroed orientation is a factory reset (6); after that, when you start streaming again the orientation is zeroed (no matter which way the sensor is rotated at start, which is the expected behaviour).

    So it only works as expected after a factory reset. We need a way to zero the orientation when writing the sensor fusion config, without doing a full hardware reset. If you want to start and stop multiple times, or compare the orientations of two sensors, it's impossible without a full hardware reset of the sensors.

  • edited August 2020

    I think you might want to direct your questions to the BOSCH forum. We don't really have any say in this: we get a black box from BOSCH and put it in our firmware. The BOSCH folks will be able to answer your questions about how their sensor fusion works.

    https://community.bosch-sensortec.com/t5/MEMS-sensors-forum/bd-p/bst_community-mems-forum

  • That's not really an answer. As stated above, resetting the sensor makes it work as expected and as described. Is there no way for the MetaWear firmware to do that to the BOSCH module without doing a full restart itself? Isn't that what writing the sensor fusion config is supposed to do?

    What would I even ask on the BOSCH forum about this issue? The MetaWear firmware controls the sensor fusion module; they would just say "ask your sensor manufacturer".

  • I don't understand what you are asking. Can you please post a single clear question?

    How can I get the sensor fusion to start with a zeroed orientation (with respect to heading), like it does after resetting the sensor? If this is not currently supported, can support be added?

  • edited August 2020

    If you calibrate it, it should restart at 0. Can you just use the calibration functions to accomplish this?

  • So calling mbl_mw_sensor_fusion_read_calibration_data() and then
    mbl_mw_sensor_fusion_write_calibration_data() with the result should do it?

  • Can you try?

  • I've modified the API test app to read and write the calibration; however, it doesn't reset the orientation.
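    In the meantime, the heading can be zeroed on the host side: cache the first quaternion after each stream start and left-multiply every later sample by its conjugate, q_zeroed = conj(q_first) * q. A sketch in plain Java (the OrientationZeroer class is illustrative, not a MetaWear SDK API):

```java
// Software re-zeroing of streamed quaternions (illustrative helper,
// NOT a MetaWear SDK API): q_zeroed = conj(q_first) * q, with q = {w, x, y, z}.
public class OrientationZeroer {
    private double[] firstConj = null;  // conj of first sample after start

    /** Returns the orientation relative to the first sample seen. */
    public double[] rezero(double[] q) {
        if (firstConj == null)
            firstConj = new double[]{q[0], -q[1], -q[2], -q[3]};
        double[] a = firstConj, b = q;
        return new double[]{   // Hamilton product a * b
            a[0]*b[0] - a[1]*b[1] - a[2]*b[2] - a[3]*b[3],
            a[0]*b[1] + a[1]*b[0] + a[2]*b[3] - a[3]*b[2],
            a[0]*b[2] - a[1]*b[3] + a[2]*b[0] + a[3]*b[1],
            a[0]*b[3] + a[1]*b[2] - a[2]*b[1] + a[3]*b[0]
        };
    }

    /** Call on every stream (re)start. */
    public void reset() { firstConj = null; }
}
```

    With this, the first re-zeroed sample is always the identity quaternion, matching the "zeroed on start" behaviour that a factory reset gives, without touching the board.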
