Smoothing out sensor fusion data in real time?
I am working on an application that replicates the motion of the lower body on screen. Right now I have 3 MMC sensors on one leg and am animating that leg's motion in real time (using quaternions streaming from the sensors). It looks pretty good, but the animation is kind of shaky/jittery, for lack of a better word (see the linked video below). Is there a good way to smooth out this data so the leg follows me smoothly in real time?
If nothing else I might just average the values over some delta time and update the leg with those. But I would like to keep as much speed/precision as possible while making it look better. Right now the sampling rate and everything else are at their defaults, so maybe some adjustment there is the trick. Interested in feedback/ideas while I work on this.
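For what it's worth, here is a minimal sketch of the kind of smoothing I'm considering: an exponential low-pass filter where each new sample pulls the current state toward it via slerp (naive per-component averaging of quaternions can misbehave near the double-cover sign flip). The `alpha` value, the `(w, x, y, z)` tuple layout, and all names here are my own assumptions, not anything from the sensor SDK:

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:  # take the short path around the hypersphere
        q1 = tuple(-c for c in q1)
        dot = -dot
    if dot > 0.9995:  # nearly parallel: plain lerp, then renormalize
        out = tuple(a + t * (b - a) for a, b in zip(q0, q1))
        n = math.sqrt(sum(c * c for c in out))
        return tuple(c / n for c in out)
    theta = math.acos(dot)
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

class QuatSmoother:
    """Exponential smoothing: state = slerp(state, sample, alpha).
    Smaller alpha = smoother but laggier; alpha = 1 passes samples through."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.state = None
    def update(self, q):
        if self.state is None:
            self.state = q
        else:
            self.state = slerp(self.state, q, self.alpha)
        return self.state
```

One smoother per sensor, fed each incoming quaternion before it hits the animation; tuning `alpha` trades jitter against responsiveness, which is exactly the speed/precision trade-off I'm worried about.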