Which sensors should I use to simulate hand movements as a 3D model?

edited November 2017 in General
Hello all,

I'm a beginner at programming with these sensors, and I think that sharing any kind of information (even if it's something basic to those with more experience) could be a big help to me.

I have a few "MetaWear C Streaming Sensors" and I need to do the following: place them around one of my hands (one sensor will be placed on the back of my hand and used as a reference for the others), and as I move the hand, collect all the data needed to later build a 3D model of the hand movements. From my research so far, I think I will need data from the accelerometer (to detect the speed of movement) and the gyroscope (to detect rotation).

Is my understanding correct? Will this data be enough? Any kind of reference, help, or idea (on how to proceed, or whether I'm missing something) would be appreciated. I tried searching the forum for similar questions, but couldn't find one that matches this.



  • It depends on how complex the movements you are trying to capture are.  If all you want is for a 3D hand model's orientation to match your hand's orientation, you can simply stream quaternions or Euler angles to your app and use those values to rotate your model accordingly.  If you also need the hand to translate, stream linear acceleration from the sensor fusion algorithm.
  • To start, there won't be any complex moves. The body will stay in a fixed position (so the person is not moving around), and I want to be able to detect the basic movements: https://prnt.sc/h7jc6m (the blue points are the placed sensors).

    I tried to get the Euler angles and acceleration values from "Sensor fusion", but my Android application was crashing while configuring the sensor. I then did some research and found that the "MetaWear CPRO" board (which is what I actually have) does not support the "Sensor fusion" module (https://mbientlab.com/androiddocs/latest/sensor_fusion.html#).

    Can you please advise which module/class I can use to get this data (Euler angles and acceleration values) for the purpose mentioned above?

    For example, could you comment on the following statements:
    - for Euler angles, I believe I can use the Gyro sensor (https://mbientlab.com/androiddocs/latest/gyro_bmi160.html#) and then somehow calculate the Euler angles from the returned data.
    - for the acceleration values, which one do you suggest I use: Accelerometer, Bosch Accelerometer, BMA255 Accelerometer, BMI160 Accelerometer, or MMA8452Q Accelerometer?
  • You'll need both gyro and accelerometer data to determine the orientation.  You'll need to integrate your own sensor fusion code, such as Sebastian Madgwick's open source IMU algorithm.

    Use the Accelerometer interface.
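To illustrate the earlier point about using streamed quaternions to rotate a model: once a fusion algorithm produces a unit quaternion, Euler angles fall out of a standard conversion. Below is a minimal, self-contained sketch of that conversion, assuming a quaternion in (w, x, y, z) order; the class and method names are illustrative, not part of the MetaWear API.

```java
// Standard quaternion-to-Euler (Tait-Bryan) conversion, angles in radians.
// Assumes a unit quaternion in (w, x, y, z) order; names are illustrative.
public class QuaternionToEuler {
    /** Returns {roll, pitch, yaw} in radians. */
    public static double[] toEuler(double w, double x, double y, double z) {
        double roll = Math.atan2(2.0 * (w * x + y * z),
                                 1.0 - 2.0 * (x * x + y * y));
        double sinp = 2.0 * (w * y - z * x);
        // Clamp to guard against numerical error at the +/-90 degree singularity.
        double pitch = Math.asin(Math.max(-1.0, Math.min(1.0, sinp)));
        double yaw = Math.atan2(2.0 * (w * z + x * y),
                                1.0 - 2.0 * (y * y + z * z));
        return new double[] { roll, pitch, yaw };
    }
}
```

These three angles can then be applied directly as rotations to the 3D hand model.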
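Since the CPRO board has no on-board sensor fusion, the fusion has to happen in the app, as the reply above suggests with Madgwick's algorithm. As a sketch of what such code looks like, here is a much simpler complementary filter that blends integrated gyro rates with the gravity direction from the accelerometer to estimate roll and pitch. The class name and the 0.98 blend factor are my own illustrative choices; note that yaw cannot be recovered this way without a magnetometer, which is one reason to use a full algorithm like Madgwick's instead.

```java
// A simple complementary filter fusing gyro rates and accelerometer gravity
// to estimate roll and pitch. A simpler stand-in for a full IMU fusion
// algorithm such as Madgwick's; not part of the MetaWear API.
public class ComplementaryFilter {
    private static final double ALPHA = 0.98; // weight given to gyro integration
    private double roll;  // radians
    private double pitch; // radians

    /**
     * Feed one paired gyro/accelerometer sample.
     * gx, gy: angular rates about x/y in rad/s
     * ax, ay, az: accelerometer readings (any consistent unit)
     * dt: time since the previous sample, in seconds
     */
    public void update(double gx, double gy,
                       double ax, double ay, double az, double dt) {
        // Orientation implied by gravity alone: noisy, but drift-free.
        double accRoll  = Math.atan2(ay, az);
        double accPitch = Math.atan2(-ax, Math.sqrt(ay * ay + az * az));
        // Blend the integrated gyro (smooth, but drifts) with the accel estimate.
        roll  = ALPHA * (roll  + gx * dt) + (1.0 - ALPHA) * accRoll;
        pitch = ALPHA * (pitch + gy * dt) + (1.0 - ALPHA) * accPitch;
    }

    public double getRoll()  { return roll; }
    public double getPitch() { return pitch; }
}
```

In an Android app, each streamed gyro/accelerometer sample pair would be passed to `update()`, and the resulting roll/pitch applied to the corresponding segment of the hand model.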
This discussion has been closed.