Simulating real-time accelerometer data for app testing
Hello! This is my first post here and I'm pretty new to this whole game, so apologies if this has already been answered elsewhere.
My question is: when developing and testing an app to analyse motion data from a MetaWear board, is there an accepted method to work with a known set of accelerometer (or other sensor) data and 'replay' it each time, rather than having to reproduce the real-life motion?
Some more context:
I have a MetaWear RG board and my goal is to use it to capture motion data from a horse and analyse the pattern of footfalls in real time. It's not so much which hoof hits the ground that matters as the rhythm of the footfalls, i.e. the relative timing of each footfall. I hope to use the board in conjunction with an Android app to distinguish between the various gaits of the horse (walk, trot, canter, etc.) and to provide other kinds of analysis, such as beats per minute and the regularity (or otherwise) of the pattern.
I've already used the board to capture some data and have examined it in graphical format. To my eye it is pretty clear which gait each data set represents, so the challenge is getting the app to make this judgement.
It seems to me this process will require a lot of trial and error, changes, and testing. Development would be extremely slow if, every time I wanted to try something out in my app, I had to actually exercise the horse. So I am hoping there is some way to work with a known set of accelerometer data and 'replay' it each time to see how the app responds.
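To be concrete, what I'm imagining is something like the sketch below: a small replay harness that reads a CSV log of timestamped samples and feeds them into the same callback my app would use for live data, pacing playback to the original timing. The `AccelListener` interface and the CSV layout here are just my own placeholders for illustration, not anything from the MetaWear API.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

// Hypothetical callback interface: the same one the app would use for live samples.
interface AccelListener {
    void onSample(long timestampMs, float x, float y, float z);
}

// Replays a CSV log of "timestampMs,x,y,z" rows, sleeping between samples so the
// app sees them at roughly the original real-time rate.
public class AccelReplayer {
    public static void replay(String csvPath, AccelListener listener)
            throws IOException, InterruptedException {
        try (BufferedReader reader = new BufferedReader(new FileReader(csvPath))) {
            String line;
            long prevTimestamp = -1;
            while ((line = reader.readLine()) != null) {
                String[] parts = line.split(",");
                long ts = Long.parseLong(parts[0].trim());
                float x = Float.parseFloat(parts[1].trim());
                float y = Float.parseFloat(parts[2].trim());
                float z = Float.parseFloat(parts[3].trim());
                if (prevTimestamp >= 0) {
                    // Pace playback to match the original sample spacing.
                    Thread.sleep(Math.max(0, ts - prevTimestamp));
                }
                prevTimestamp = ts;
                listener.onSample(ts, x, y, z);
            }
        }
    }
}
```

That way the gait-detection code wouldn't need to know whether its samples came from the live board or from a recording.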
I guess this is relatively simple when you are streaming raw sensor data to the app, but what about the case where you are using on-board processing first? Is it possible to start with a set of raw data, simulate the effect of the on-board processing, and pass those outputs to the Android app?
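I don't yet know exactly which on-board filters I'll end up using, but my rough idea is that if, say, the chain were a running average followed by a threshold crossing, I could re-implement an approximation of it in the app and run the recorded raw data through that instead. Something like the sketch below, where the class, the window size, and the threshold are all hypothetical stand-ins for whatever the board would actually be configured to do:

```java
// A rough offline stand-in for on-board processing: purely for illustration,
// this assumes a running average over a small window followed by an upward
// threshold crossing. The real filters and parameters would need to match
// whatever chain is actually configured on the board.
public class OnboardSimulator {
    private final float[] window;   // ring buffer of recent magnitudes
    private final float threshold;
    private int filled = 0;
    private int index = 0;
    private float sum = 0f;
    private boolean above = false;

    public OnboardSimulator(int windowSize, float threshold) {
        this.window = new float[windowSize];
        this.threshold = threshold;
    }

    // Feed one raw sample; returns true when the averaged magnitude crosses
    // the threshold upwards (a candidate "footfall" event).
    public boolean process(float x, float y, float z) {
        float magnitude = (float) Math.sqrt(x * x + y * y + z * z);
        sum -= window[index];
        window[index] = magnitude;
        sum += magnitude;
        index = (index + 1) % window.length;
        if (filled < window.length) {
            filled++;
        }
        float avg = sum / filled;
        boolean crossed = !above && avg > threshold;
        above = avg > threshold;
        return crossed;
    }
}
```

Whether that would faithfully reproduce what the board's own processing outputs is exactly the part I'm unsure about, hence the question.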
Any suggestions please? Thanks in advance.