Which sensors to use to simulate hand movements as a 3D model?
Hello all,
I'm a beginner at programming with these sensors, so any information you can share (even if it's basic for those with more experience) would be a big help to me.
I have a few "MetaWear C Streaming Sensors" and I need to do the following: place them around one of my hands (one of the sensors will be placed on my back and will serve as a reference for all the others) and, as I move the hand, capture all the data needed to later build a 3D model of the hand movements. From my research so far, I think I will need data from the accelerometer (which measures linear acceleration, not speed directly) and the gyroscope (which measures angular velocity, i.e. rotation).
Is my understanding correct? Will this data be enough? Any reference, help, or idea (on how to proceed, or on anything I'm missing) would be appreciated. I searched the forum for similar questions but couldn't find one that fits this.
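For what it's worth, the basic idea behind getting rotation from the gyroscope is just integrating its angular-velocity readings over time. A minimal single-axis sketch in plain Python (the sample values and the sample rate are made up for illustration; this is not the MetaWear API):

```python
def integrate_gyro(samples, dt):
    """Integrate angular-velocity samples (deg/s) about one axis
    into an accumulated rotation angle (degrees)."""
    angle = 0.0
    for omega in samples:
        angle += omega * dt  # simple rectangular integration
    return angle

# Hypothetical stream: 100 samples of a constant 90 deg/s rotation
# at 100 Hz (dt = 0.01 s) should accumulate to roughly 90 degrees.
samples = [90.0] * 100
angle = integrate_gyro(samples, 0.01)  # ~90.0 degrees
```

Note that pure integration drifts over time because small gyro errors accumulate, which is why the accelerometer (which always sees gravity) is usually used to correct the estimate.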
Regards
Comments
I tried to get the Euler angles and acceleration parameters from the "Sensor fusion" module, but my Android application kept crashing while configuring the sensor. After some research I found out that the "MetaWear CPRO" board (which is what I actually have) does not support the "Sensor fusion" module (https://mbientlab.com/androiddocs/latest/sensor_fusion.html#).
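Since the on-board Sensor fusion module isn't available on the CPRO, one common workaround is to fuse the raw accelerometer and gyroscope streams in software, e.g. with a complementary filter. A minimal single-axis sketch in plain Python (the sample values, sample rate, and `alpha` weight are assumptions for illustration; this is not the MetaWear API):

```python
import math

def accel_tilt_deg(ax, az):
    """Tilt angle (degrees) about one axis, estimated from the
    gravity components seen by the accelerometer."""
    return math.degrees(math.atan2(ax, az))

def complementary_filter(angle, gyro_rate, ax, az, dt, alpha=0.98):
    """One filter step: integrate the gyro rate (deg/s) for the
    short-term estimate, then blend in the accelerometer tilt to
    correct long-term gyro drift. alpha weights the gyro term."""
    gyro_angle = angle + gyro_rate * dt
    return alpha * gyro_angle + (1 - alpha) * accel_tilt_deg(ax, az)

# Hypothetical stream: a stationary sensor tilted 45 degrees, so the
# gyro reads ~0 and gravity splits equally between the two axes.
angle = 0.0
for _ in range(500):  # filter converges toward the accel estimate
    angle = complementary_filter(angle, 0.0, 0.707, 0.707, 0.01)
print(round(angle, 1))  # -> 45.0
```

In a real app the same per-sample update would run inside the accelerometer and gyro stream callbacks, one filter per rotation axis.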