Acceleration log time delay
What is the average time delay (and is there a min-max range) for the device to log and display the first acceleration reading from stationary? Is this the same for freefall and all other movement from stationary? We're about 30 cm out in our freefall distance calculations and are hoping that knowing this value might help.
Cheers,
Jon
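For a rough sense of scale, here's a back-of-the-envelope sketch (assuming an idealised drop of about 3 m, g = 9.81 m/s^2 and no air resistance; the delay values are purely illustrative) of how a timing offset in the logged start or end of the fall maps to a distance error under d = 1/2*g*t^2:

import math

g = 9.81                                   # m/s^2
drop = 3.0                                 # m, assumed drop height
t = math.sqrt(2 * drop / g)                # nominal fall time, ~0.78 s

for delay_ms in (5, 10, 20, 40):
    dt = delay_ms / 1000.0
    # Distance error if the measured fall time is off by dt:
    # d(t + dt) - d(t) = g*t*dt + 0.5*g*dt**2
    err = g * t * dt + 0.5 * g * dt ** 2
    print(f"{delay_ms:>3} ms offset -> ~{err * 100:.1f} cm distance error")

On those assumptions, an offset of roughly 40 ms over a ~3 m drop already accounts for about 30 cm, which is why pinning down the logging delay matters.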
Comments
Matt,
Thank you very much for such a detailed description - I'm sure a lot of people will find this useful.
We've been able to reduce the measurement error significantly already because of this. Our performance goal is to measure the distance travelled in free fall or down an incline, etc., as accurately as possible, based on the log data retrievable from the device. I guess we want to know what our realistic measurement error will be.
Here are the current accelerometer settings (we will be testing at +/- 2G as, from what I've read, that appears to be more sensitive to smaller degrees of motion? See the range/resolution sketch after the settings):
sampleFrequency = 800.0f
highPassFilter = NO
fullScaleRange = 8G
lowNoise = NO
fastReadMode = NO
activePowerScheme = HighResolution
sleepPowerScheme = HighResolution
autoSleep = NO
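For what it's worth, a quick sketch of why a smaller full-scale range should be more sensitive: at a fixed output resolution the same number of counts is spread over a smaller span. The 12-bit figure below is an assumption for illustration, not a statement about this particular sensor:

BITS = 12                                  # assumed output resolution
counts = 2 ** BITS                         # total output codes

for full_scale_g in (2, 4, 8):
    span_g = 2 * full_scale_g              # e.g. +/-2G spans 4G
    lsb_mg = span_g / counts * 1000.0      # smallest resolvable step in milli-g
    print(f"+/-{full_scale_g}G range -> ~{lsb_mg:.2f} mg per count")

Under that assumption, +/-2G resolves steps roughly four times smaller than +/-8G, at the price of clipping anything above 2G.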
Many thanks,
Jon
Many thanks again, Matt, for your reply.
Ideally I'd like to be able to measure distance dropped from no more than 3 metres with an error of 1 or 2 cm at most.
So, from what I've read, we need to optimise the settings so that:
1. The data should be read locally by the device, not the App.
2. Only when there is freefall START and END should the timestamps be sent to the device.
3. We can then correct for delays and calculate the distance from there (a rough sketch of this step follows the list).
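As a rough illustration of step 3 (all numbers below are made up, and the correction values would have to come from whatever delay figures the device actually has), correcting the logged START/END timestamps for a known fixed delay and then applying d = 1/2*g*t^2 might look like this:

G = 9.81   # m/s^2

def drop_distance(start_s, end_s, start_delay_s=0.0, end_delay_s=0.0):
    """Distance fallen between delay-corrected START and END timestamps (seconds)."""
    t_fall = (end_s - end_delay_s) - (start_s - start_delay_s)
    return 0.5 * G * t_fall ** 2

# Invented example: the raw log says the fall lasted 0.800 s, and we assume
# (purely for illustration) that the START event was logged 20 ms late.
print(f"{drop_distance(12.340, 13.140, start_delay_s=0.020):.3f} m")   # ~3.298 m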
We've been getting CSV data with very detailed acceleration measurements, sometimes more than one acceleration value per ms, so presumably we are already using the device to record the acceleration.
If you graph the acceleration (RMS method rather than eliminating the X and Y axes from the calculations), then eyeballing the data to obtain sensible timestamps for freefall start and end gives reasonable accuracy, but not what I'm after. Also, using the mean acceleration observed in test conditions, as opposed to 9.81, appears to give better results, but I'm not convinced, as the observed acceleration is presumably more susceptible to small rotations/tilts than to any significantly measurable effect from air resistance (on average my acceleration data returns about 0.9G during the fall).
Ideally I want the device to send me the two most accurate timestamps without us having to write an algorithm based on the graphed data, which doesn't appear particularly accurate. Really I'd like the inherent delays to be part of the calculation already as well, so we can simply concentrate on the kinematic equations. It seems getting these timestamp values out is quite complicated!
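For reference, here's a minimal sketch of that kind of post-processing in Python, assuming a CSV log with hypothetical columns epoch_ms, x, y, z in units of g (the column names, file name and 0.3G threshold are illustrative assumptions, not anything the device guarantees). If the log is the raw proper acceleration, the RMS magnitude should drop towards 0G during free fall, so the start and end can be taken as the first and last samples of the first low-magnitude run:

import csv, math

THRESHOLD_G = 0.3   # assumed: magnitude below this counts as free fall

def freefall_window(path):
    """Return (start_ms, end_ms) of the first contiguous low-magnitude run."""
    start = end = None
    with open(path, newline="") as f:
        for row in csv.DictReader(f):            # assumed columns: epoch_ms,x,y,z
            t = float(row["epoch_ms"])
            mag = math.sqrt(float(row["x"]) ** 2 +
                            float(row["y"]) ** 2 +
                            float(row["z"]) ** 2)
            if mag < THRESHOLD_G:
                if start is None:
                    start = t                    # first sample inside the window
                end = t                          # keep extending the window
            elif start is not None:
                break                            # first free-fall window has ended
    return start, end

start_ms, end_ms = freefall_window("accel_log.csv")   # hypothetical file name
if start_ms is not None:
    t_fall = (end_ms - start_ms) / 1000.0
    print(f"fall time ~{t_fall:.3f} s, distance ~{0.5 * 9.81 * t_fall ** 2:.2f} m")

A fixed logging latency shifts both timestamps by roughly the same amount and so largely cancels out of the fall time; it's any asymmetry between how quickly the start and end are detected (plus the sample interval itself) that limits the accuracy.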
Cheers,
Jon