Simulating real-time accelerometer data for app testing

Hello! This is my first post here and I'm pretty new to this whole game so apologies if this has already been answered elsewhere.

My question is: when developing and testing an app that analyses motion data from a MetaWear board, is there an accepted method for working with a known set of accelerometer (or other sensor) data and 'replaying' it each time, rather than having to actually reproduce the real-life motion?

Some more context:
I have a MetaWear RG board, and my goal is to use it to capture motion data from a horse and analyse the pattern of footfalls in real time. It's not so much which hoof hits the ground that matters as the rhythm of the footfalls, or the relative timing of each footfall. I hope to use the board in conjunction with an Android app to distinguish between the various gaits of the horse (walk, trot, canter, etc.) and to provide other kinds of analysis, such as the beats per minute and the regularity (or otherwise) of the pattern.

I've already used the board to capture some data and have examined it in graphical format. To my eye it is pretty clear which gait each data set represents, so the challenge is getting the app to make this judgement.

It seems to me this process will require a lot of trial and error. Development would be extremely slow if, every time I wanted to try something out in my app, I had to actually exercise the horse. So I am hoping there is some way to simply work with a known set of accelerometer data and 'replay' it every time to see how the app responds.
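Roughly, what I have in mind for the raw-streaming case is something like this sketch in Python (assuming the captured samples were logged to a CSV of rows of timestamp-in-milliseconds plus x/y/z; the file format and the callback are just stand-ins for however the app actually receives samples):

```python
import csv
import time

def replay(csv_path, on_sample, speed=1.0):
    """Replay logged accelerometer samples, preserving the
    original inter-sample timing (scaled up by `speed`)."""
    with open(csv_path, newline="") as f:
        rows = [(float(t), float(x), float(y), float(z))
                for t, x, y, z in csv.reader(f)]
    prev_t = None
    for t, x, y, z in rows:
        if prev_t is not None:
            # timestamps are in milliseconds; sleep the gap between samples
            time.sleep(max(0.0, (t - prev_t) / 1000.0 / speed))
        prev_t = t
        on_sample(x, y, z)
```

The idea would be to point `on_sample` at the same analysis entry point the app uses for live data, so the recorded ride exercises exactly the code path that a real ride would.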

I guess this is relatively simple when you are streaming raw sensor data to the app, but what about the case where you are using on-board processing first? Is it possible to start with a set of raw data, simulate the effect of the on-board processing, and pass those outputs to the Android app?
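For example, if the on-board chain were something like magnitude → average → threshold, I imagine the offline simulation could look roughly like this (pure Python; this is only my guess at the kind of processing involved, not the firmware's actual arithmetic):

```python
import math

def rss(samples):
    """Combine per-axis readings into one magnitude,
    in the spirit of a root-sum-square processor."""
    return [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]

def moving_average(values, n):
    """Approximate an averaging (low-pass) processor."""
    out = []
    for i in range(len(values)):
        window = values[max(0, i - n + 1):i + 1]
        out.append(sum(window) / len(window))
    return out

def threshold_crossings(values, limit):
    """Return the indices where the signal rises above `limit`,
    roughly what a threshold processor would forward to the app."""
    return [i for i in range(1, len(values))
            if values[i - 1] <= limit < values[i]]
```

The output of the last stage could then be fed to the app in place of the board's processed stream.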

Any suggestions please? Thanks in advance.

Comments

  • The best way to test data processing is to capture raw data for the different walk types and use a numerical processing tool (numpy, matlab, etc.) to figure out what data processing needs to be done to convert raw data into meaningful metrics.  Once you have the correct processing sequence, use the API to program the chain to the board.
  • Hi Eric, thanks for your reply. As I said, I'm quite new to this, but I've now got numpy up and running on my iMac, can graph the data I'm getting back, and am starting to get my head around the various data processing options.
  • MetaWear only has a small subset of those functions available, so there aren't that many options you can realistically use with the on-board processing. You can check out the available processors in the Android documentation:
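As a sketch of the offline-analysis workflow suggested above: once a magnitude signal has been extracted from a raw capture, numpy can estimate the footfall rate from the signal's dominant frequency. The function name and parameters below are invented for the example:

```python
import numpy as np

def footfall_bpm(magnitude, sample_rate_hz):
    """Estimate beats per minute from the dominant frequency
    of the (mean-removed) acceleration magnitude signal."""
    sig = np.asarray(magnitude, dtype=float)
    sig = sig - sig.mean()
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / sample_rate_hz)
    # skip the DC bin, pick the strongest remaining frequency
    peak = np.argmax(spectrum[1:]) + 1
    return freqs[peak] * 60.0
```

Experimenting with this sort of metric offline should make it clearer which (much simpler) processor chain is worth programming onto the board itself.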

This discussion has been closed.