# The difference between two timestamps of the logged data is not constant

Hi there,

We tried using the Metawear App to log accelerometer data at 400 Hz. We logged for a few seconds and exported the data as CSV. In theory, the difference between two adjacent timestamps should be 2.5 ms (1000 ms / 400 = 2.5 ms), right? But looking at it closely, the difference is not constant: sometimes 1 ms, sometimes 2 ms or 3 ms. We understand that the resolution of the timestamp is 1 ms, so it makes sense for the timestamp differences to be 2 ms or 3 ms due to rounding. But they shouldn't be 1 ms.

We are just checking: when we log accelerometer data at 400 Hz, is the sampling/collection interval precisely and constantly 2.5 ms (we assume there is an internal, precise 400 Hz trigger that initiates the SPI communication with the accelerometer)? If you can confirm that, we can ignore the timestamps.
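The rounding argument above can be checked with a small sketch. If samples really arrive exactly 2.5 ms apart and the logger only quantizes them to a 1 ms clock, the consecutive differences should alternate between 2 ms and 3 ms, never 1 ms. A minimal illustration in Python (the CSV-reading helper and the `accel_log.csv` path are hypothetical, assuming the timestamp is in the first column of the export):

```python
# Sketch: compare logged timestamp spacing against the expected
# 400 Hz period of 2.5 ms.
import csv
from collections import Counter

EXPECTED_MS = 1000.0 / 400  # 2.5 ms per sample at 400 Hz

def interval_histogram(path):
    """Count how often each consecutive-timestamp difference (ms) occurs
    in an exported CSV whose first column is a timestamp in seconds."""
    with open(path) as f:
        times = [float(row[0]) for row in csv.reader(f) if row]
    return Counter(round((b - a) * 1000, 1)
                   for a, b in zip(times, times[1:]))

# Illustration of the rounding argument: ideal samples exactly 2.5 ms
# apart, then quantized to a 1 ms clock before logging.
true_ms = [i * 2.5 for i in range(8)]   # 0, 2.5, 5.0, 7.5, ...
logged = [round(t) for t in true_ms]    # quantized to whole milliseconds
diffs = [b - a for a, b in zip(logged, logged[1:])]
print(diffs)  # → [2, 3, 3, 2, 2, 3, 3]: only 2 ms and 3 ms, never 1 ms
```

So pure quantization of a constant 2.5 ms interval cannot produce a 1 ms difference; seeing 1 ms suggests the timestamps themselves jitter, not just the rounding.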

• Thanks Eric for getting back to me.

We understand that there is an intrinsic 1% error in the timestamps.

We still have a question on the timestamps on the exported CSV.

We tried using the Metawear App to log and export. The data we see is:

1540775002.3315318,-0.01953125,-0.8525390625,0.52197265625
1540775002.3344615,-0.02001953125,-0.85986328125,0.5341796875
1540775002.3373911,-0.02294921875,-0.8564453125,0.53466796875
1540775002.338856,-0.0244140625,-0.85693359375,0.53564453125
1540775002.3417857,-0.01611328125,-0.861328125,0.5322265625
1540775002.3447154,-0.0166015625,-0.8544921875,0.53369140625
1540775002.3461802,-0.01904296875,-0.85888671875,0.5361328125

The unit of the timestamp (1540775002.3315318) is clearly seconds, and the displayed resolution is 0.1 microseconds.

However, when we use the latest C++ SDK in our app to do the logging and exporting, we see:

1540959674544,-0.0703125,-0.29760742,0.98706055
1540959674547,-0.06665039,-0.29858398,0.98046875
1540959674549,-0.06640625,-0.29736328,0.98779297
1540959674552,-0.067871094,-0.29638672,0.98217773
1540959674555,-0.06347656,-0.2980957,0.98095703
1540959674556,-0.06323242,-0.29663086,0.9848633
1540959674559,-0.06542969,-0.29418945,0.9863281
1540959674562,-0.064208984,-0.29833984,0.98583984
1540959674563,-0.064208984,-0.29882812,0.98999023

The unit of the timestamp (1540959674544) is clearly milliseconds, and the resolution is 1 millisecond.

The question is, why do we get a lower timestamp resolution using the latest C++ API than using the Metawear App?

How do we get a resolution that's matching the Metawear App?

Cheers

• edited November 2018

They both have the same millisecond resolution. A simple difference of consecutive timestamps in both data sets easily shows this.
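That check can be reproduced directly from the samples quoted above: differencing consecutive timestamps gives steps of a few milliseconds in both exports, regardless of how many decimal digits are printed. A quick sketch in Python, using the first few values from each post:

```python
# Consecutive-timestamp differences for the two exports quoted above.
# App export: timestamps in fractional seconds.
app_times_s = [1540775002.3315318, 1540775002.3344615, 1540775002.3373911,
               1540775002.338856, 1540775002.3417857]
# C++ SDK export: timestamps in integer milliseconds.
sdk_times_ms = [1540959674544, 1540959674547, 1540959674549,
                1540959674552, 1540959674555]

app_diffs_ms = [(b - a) * 1000 for a, b in zip(app_times_s, app_times_s[1:])]
sdk_diffs_ms = [b - a for a, b in zip(sdk_times_ms, sdk_times_ms[1:])]

print([round(d, 2) for d in app_diffs_ms])  # steps on the order of 1.5-3 ms
print(sdk_diffs_ms)                          # → [3, 2, 3, 3]
```

Both series step in millisecond-scale increments; the App's extra decimal digits come from converting the same clock into fractional seconds, not from microsecond measurement.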

Just because your Android timestamp is represented in fractional seconds and displayed with 6+ digits does not mean it has microsecond granularity.
