Applying magnetometer data to graphics object

Hello,

I'm trying to mirror the orientation of the MetaMotion R sensor in a graphical object (SceneKit).
I've managed to tie the two together, but the translation is poor, and it's almost certainly because I don't really understand the data coming out. So what I'm looking for, first of all, is an explanation of the data, some background.

The specific issue I'm having is that the object seems to 'overreact' compared to the sensor. Turning the sensor 180 degrees will, for some angles, turn the object all the way around (360 degrees), but it doesn't seem consistent across angles.

What I'm doing in the code is simply assigning the x, y, z values from the sensor data directly to the absolute rotation of the object.
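
Roughly, it looks like the sketch below (simplified; the function and node names are placeholders, the exact rotation property may differ from what I actually use, and I'm assuming iOS's Float-based SceneKit types):

    import SceneKit

    // What I'm doing now, in essence: treat the three raw readings as
    // Euler angles and write them straight onto the node.
    func applyRawReading(x: Float, y: Float, z: Float, to node: SCNNode) {
        node.eulerAngles = SCNVector3(x: x, y: y, z: z)
    }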

I have several other options, however, in the form of Euler angles, a quaternion, and a choice of absolute or relative rotation. But I need a better understanding of how to correlate the data with one of these choices.
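
For context, these are the SceneKit properties I understand myself to be choosing between (my own summary of Apple's API, iOS types assumed):

    import SceneKit

    func illustrateOptions(node: SCNNode) {
        // Euler angles: pitch, yaw, roll in radians
        node.eulerAngles = SCNVector3(x: 0, y: .pi / 2, z: 0)
        // quaternion, relative to the parent node
        node.orientation = SCNQuaternion(x: 0, y: 0.707, z: 0, w: 0.707)
        // quaternion, absolute (world space)
        node.worldOrientation = SCNQuaternion(x: 0, y: 0.707, z: 0, w: 0.707)
        // axis-angle: rotate around the axis (x, y, z) by w radians
        node.rotation = SCNVector4(x: 0, y: 1, z: 0, w: .pi / 2)
    }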

Any tips or suggestions on how to proceed from here would be greatly appreciated.

Comments

  • Calibrate the sensor fusion algorithm and use the quaternion output.

  • Great tip, thanks.
I have to say, however, that your Swift documentation is somewhat lacking.

    To the benefit of anyone who agrees, I'll list the code I ended up with.

    After connecting to the board, it goes like this:

    func fusionTest() {
        configureFusion()
        // subscribe to the fused quaternion output
        let incoming = mbl_mw_sensor_fusion_get_data_signal(device.board, MBL_MW_SENSOR_FUSION_DATA_QUATERNION)
        mbl_mw_datasignal_subscribe(incoming!, bridge(obj: self)) { (context, data) in
            let quat = data!.pointee.valueAs() as MblMwQuaternion
            DispatchQueue.main.async {
                // recover self from the opaque context pointer and apply the
                // quaternion as the node's absolute (world) orientation
                let mySelf = Unmanaged<GraphicsViewController>.fromOpaque(context!).takeUnretainedValue()
                let myQuaternion = SCNQuaternion(x: quat.x, y: quat.y, z: quat.z, w: quat.w)
                mySelf.pyramidNode.worldOrientation = myQuaternion
            }
        }
        mbl_mw_sensor_fusion_enable_data(device.board, MBL_MW_SENSOR_FUSION_DATA_QUATERNION)
        mbl_mw_sensor_fusion_start(device.board)
    }

    func configureFusion() {
        // set fusion mode to NDoF (nine degrees of freedom: acc + gyro + mag)
        mbl_mw_sensor_fusion_set_mode(device.board, MBL_MW_SENSOR_FUSION_MODE_NDOF)
        // set accelerometer range to +/-8 g; the accelerometer is configured
        // through the fusion module here
        mbl_mw_sensor_fusion_set_acc_range(device.board, MBL_MW_SENSOR_FUSION_ACC_RANGE_8G)
        // write the changes to the board
        mbl_mw_sensor_fusion_write_config(device.board)
    }
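
    And when I'm done streaming, I tear things down roughly like this (a sketch of my shutdown path; mbl_mw_sensor_fusion_stop, mbl_mw_sensor_fusion_clear_enabled_mask, and mbl_mw_datasignal_unsubscribe are the C API counterparts of the calls above):

    func stopFusion() {
        // stop the fusion algorithm and disable its outputs
        mbl_mw_sensor_fusion_stop(device.board)
        mbl_mw_sensor_fusion_clear_enabled_mask(device.board)
        // drop our subscription to the quaternion signal
        if let signal = mbl_mw_sensor_fusion_get_data_signal(device.board, MBL_MW_SENSOR_FUSION_DATA_QUATERNION) {
            mbl_mw_datasignal_unsubscribe(signal)
        }
    }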
    
  • @Tobias,

    We are here to provide you with the sensor data. Everyone does something different with it, so all we do is make sure you get as much raw data as possible.

    After that it is up to you to calibrate the sensor and use the data appropriately for your needs.

    Learn a little bit about processing sensor data, filtering, and so on; it will go a long way.
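
    For instance, a dead-simple smoothing step on the SceneKit side is to slerp part of the way toward each new quaternion instead of snapping to it. A minimal sketch, using only standard Apple APIs (simd_slerp, simdWorldOrientation); the blend factor is something you'd tune yourself:

    import SceneKit
    import simd

    final class OrientationSmoother {
        // start from the identity orientation
        private var current = simd_quatf(ix: 0, iy: 0, iz: 0, r: 1)
        private let blend: Float = 0.2   // 0 = frozen, 1 = no smoothing

        // move partway toward each new sample, then apply it to the node
        func apply(_ sample: simd_quatf, to node: SCNNode) {
            current = simd_slerp(current, sample, blend)
            node.simdWorldOrientation = current
        }
    }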
