MetaWear Tutorial Bug

Your sensor fusion tutorial uses NDOF mode to get the absolute orientation of the sensor and display it on a 3D cube. The issue is that even when you use NDOF you are not getting absolute orientation but relative orientation instead (I tested this on an MMR without the vibrator). You can easily test this on your own, but I can also provide a video and dataset. Also, @Laura, when I say absolute orientation I mean orientation relative to MAGNETIC NORTH, not a random point. Google query for absolute and relative orientation: https://lmgtfy.com/?q=sensor+absolute+orientation
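For reference, the tutorial's setup boils down to something like the sketch below, going by the linked Android docs. This is a minimal, hypothetical version (board is assumed to be an already-connected MetaWearBoard); it configures NDoF and streams quaternions to drive the cube:

    import com.mbientlab.metawear.MetaWearBoard;
    import com.mbientlab.metawear.data.Quaternion;
    import com.mbientlab.metawear.module.SensorFusionBosch;

    class FusionDemo {
        void startQuaternionStream(MetaWearBoard board) {
            SensorFusionBosch sensorFusion = board.getModule(SensorFusionBosch.class);

            // NDoF fuses accelerometer + gyro + magnetometer
            sensorFusion.configure()
                    .mode(SensorFusionBosch.Mode.NDOF)
                    .commit();

            // In NDoF these quaternions should be referenced to gravity and
            // magnetic north, i.e. absolute orientation
            sensorFusion.quaternion().addRouteAsync(source ->
                    source.stream((data, env) -> {
                        Quaternion q = data.value(Quaternion.class);
                        // apply q to the 3D cube's rotation here
                    })
            ).continueWith(task -> {
                sensorFusion.quaternion().start();
                sensorFusion.start();
                return null;
            });
        }
    }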

Comments

  • It is relative orientation.

  • According to this https://mbientlab.com/androiddocs/latest/sensor_fusion.html#configuration, NDoF calculates absolute orientation from the accelerometer, gyro, and magnetometer.
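    To see the absolute vs. relative difference in the configuration itself, compare NDoF against the IMUPlus mode from that same docs page, which drops the magnetometer and therefore has no fixed heading reference. A sketch, assuming the v3 Android API enum names:

        // NDoF: acc + gyro + mag -> heading referenced to magnetic north (absolute)
        sensorFusion.configure()
                .mode(SensorFusionBosch.Mode.NDOF)
                .commit();

        // IMUPlus: acc + gyro only -> heading relative to the starting pose
        sensorFusion.configure()
                .mode(SensorFusionBosch.Mode.IMU_PLUS)
                .commit();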

  • You can calculate absolute orientation from a known start + relative orientation.
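    In symbols (a sketch, using the convention v_world = q · v_body · q⁻¹): if q_0 is the known starting orientation relative to magnetic north and q_rel(t) is the rotation accumulated since the start, expressed in the starting body frame, then

        q_abs(t) = q_0 ⊗ q_rel(t)

    (the order flips if q_rel is expressed in the world frame).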

  • That's relative orientation with extra steps. Here is the definition of absolute orientation: https://blog.endaq.com/quaternions-for-orientation ("The absolute measurement provides a rotation relative to magnetic north").

    @Laura You may consider adding giant red letters in the Product Details to state that NDoF is not working properly

  • The NDoF is working as intended; it is Bosch's own implementation and is used by companies like Apple.

    Did you confirm there was no magnetic interference?

  • Did you confirm there was no magnetic interference?

    Yes 100%.

  • @nekos,
    Did you check the BOSCH forums? I am curious.

  • @Laura,
    Has there been any solution for this? I can confirm that NDOF mode is giving a relative orientation (as opposed to the claimed absolute orientation). I'm working with a MetaWear R.

  • How did you confirm this?

  • I'd be happy if you could point out a mistake in my process, conclusions, or understanding.

    I'm pulling data from sensor fusion configured to NDOF, using an async route with the linearAcceleration module and reading Acceleration-type data (Acceleration.x(), .y(), and .z()) from the stream; a simplified sketch of the route is at the end of this comment.

    When I collect data while moving the device through the same motion but with the device rotated, the motion shows up on different axes (x, y, or z) depending on the device's orientation.

    If I understand correctly, device orientation shouldn't show up in the readings; they should already be corrected to absolute orientation, e.g. moving the device along the direction of gravity should always show up on the same axis.

    Am I misunderstanding something?
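    Roughly, the route looks like this (a simplified sketch; board is my already-connected MetaWearBoard):

        import android.util.Log;

        import com.mbientlab.metawear.MetaWearBoard;
        import com.mbientlab.metawear.data.Acceleration;
        import com.mbientlab.metawear.module.SensorFusionBosch;

        class LinAccStream {
            void start(MetaWearBoard board) {
                SensorFusionBosch sensorFusion = board.getModule(SensorFusionBosch.class);
                sensorFusion.configure()
                        .mode(SensorFusionBosch.Mode.NDOF)
                        .commit();
                sensorFusion.linearAcceleration().addRouteAsync(source ->
                        source.stream((data, env) -> {
                            Acceleration acc = data.value(Acceleration.class);
                            // these axes are what swap around when I rotate the device
                            Log.i("lin-acc", acc.x() + " " + acc.y() + " " + acc.z());
                        })
                ).continueWith(task -> {
                    sensorFusion.linearAcceleration().start();
                    sensorFusion.start();
                    return null;
                });
            }
        }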

  • This is hard to follow. Do you have data, videos, or pics to help out? I was thinking maybe I could do a tutorial on this subject soon. Let me know.

  • I will replicate this on video when I can, if it's still necessary after the next question.

    I can ask a simpler question:
    Should sensor fusion, using NDOF mode and streaming linear-acceleration data, provide data that is already oriented in some manner? For example, should acceleration that is parallel to the gravitational force (i.e. perpendicular to the ground) always appear on a specific axis (for example, the y-axis) regardless of the orientation of the device?

    If the answer to that is "yes", then something is not working in my current setup. If the answer is "no", then I have misunderstood NDOF mode and there probably is no problem.

    I'm aware I can fix my acceleration data's orientation using gyroscope data; I was just wishfully hoping that NDOF mode would provide acceleration data with its orientation already fixed in some manner.
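    For reference, one way to do that manual fix is to rotate each body-frame linear-acceleration sample into the world frame with the fused quaternion instead of raw gyro data. A sketch in plain Java (Hamilton convention, unit quaternion assumed; no SDK calls involved):

        // v_world = q * v_body * q^-1, expanded to avoid a full quaternion product:
        // t = 2 * cross(q.xyz, v); v' = v + q.w * t + cross(q.xyz, t)
        static float[] rotateToWorld(float qw, float qx, float qy, float qz,
                                     float vx, float vy, float vz) {
            float tx = 2f * (qy * vz - qz * vy);
            float ty = 2f * (qz * vx - qx * vz);
            float tz = 2f * (qx * vy - qy * vx);
            return new float[] {
                    vx + qw * tx + (qy * tz - qz * ty),
                    vy + qw * ty + (qz * tx - qx * tz),
                    vz + qw * tz + (qx * ty - qy * tx)
            };
        }

    If NDoF were giving me what I expected, the raw stream would already look like the output of this function.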
