Data from sensors during the same task are different lengths?

Hi all,

I had a general question for which I am hoping there is a simple solution. I am currently using three sensors attached to a dummy, placed in areas that would correspond to the manubrium, the 10th thoracic vertebra (T10), and the 1st sacral vertebra (S1). The purpose is two-fold: 1) to simulate trunk flexion/extension, lateral bending and axial rotation, and 2) to compare the accuracy of the MetaMotionR sensors to our current motion capture system. I have grouped these three sensors under "Trunk", and collect data in both quaternion and Euler angle formats (100Hz, in MetaBase). Despite all the sensors running at the same time, for the same length of time, and at the same frequency, they all collect different lengths of data. So my questions are as follows:

1) What is causing this? Since all the sensors are essentially doing the same task, for the same length of time, and collecting at the same frequency, I would assume that the data would all be the same length.

2) Is there a way to remedy this that isn't pre-analysis data interpolation?

Keep in mind I am using the native MetaBase app, and do not have any experience coding an app.
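For anyone reading along who is comfortable with a little scripting: one non-interpolation workaround is to trim every recording to the time window that all sensors cover, using the epoch timestamps in the exported files. A minimal sketch in plain Python (assuming each recording has been parsed into a list of `(epoch_ms, value)` rows; MetaBase CSV column names may differ by app version):

```python
# Sketch: align recordings from multiple sensors by their epoch timestamps,
# keeping only the overlapping window instead of interpolating.
# Each recording is assumed to be a time-sorted list of (epoch_ms, value) rows.

def trim_to_overlap(recordings):
    """Keep only samples inside the window covered by every recording."""
    start = max(rec[0][0] for rec in recordings)   # latest first sample
    end = min(rec[-1][0] for rec in recordings)    # earliest last sample
    return [[row for row in rec if start <= row[0] <= end] for rec in recordings]

# Two sensors started a few milliseconds apart:
a = [(0, 1.0), (10, 1.1), (20, 1.2), (30, 1.3)]
b = [(5, 2.0), (15, 2.1), (25, 2.2)]
trimmed = trim_to_overlap([a, b])
```

The trimmed recordings may still differ by one sample in length (they were not started on the same millisecond), but they now span a common window, which makes any later comparison step much simpler.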


  • Kui


  • The sensors are not started at the exact same time.

  • Even when the sensors are grouped?

  • No, they are started around the same second, but not the same millisecond or nanosecond.

  • Do you have a solution during a post processing step to synchronize the time between two sensors ?

  • It's in our tutorials, the first rule of the forum is to always check the tutorials first.

  • Hi Kui,
    Yes it is helpful, thanks a lot.
    Do you have an idea of how precise the clock is? Because the time since January 1, 1970 is sent over BLE by the smartphone, right? And the internal clock just increments this timestamp? But I guess there is some drift, and if I don't update my sensor it will show a discrepancy after 2 or 3 weeks? (When I say drift, I am talking about 10 ms.)

  • @zozo

    If you're using the logger, the resolution of the clock is approximately 200µs. When reading out and synchronizing with the smartphone, the uncertainty in the data transmission comes into play -- this is on the order of 40 ms for iOS and 20 ms for many Android models. At the time of readout, the internal clock will be aligned with real-world time to within that 20-40 ms. The most accurate log entry will be the one taken most recently. All other entries have their real-world time calculated relative to the present time, using the device's internal clock. The internal clock has an approximate 250 ppm frequency tolerance, so entries logged in the distant past will accrue additional error from that tolerance.
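To make the drift figure above concrete, a back-of-the-envelope sketch (using the approximate numbers from this answer: 20-40 ms readout uncertainty and a 250 ppm clock tolerance; both are hedged estimates, not guaranteed specs):

```python
# Sketch: worst-case real-world timestamp error for a log entry,
# based on the approximate figures given above.

def worst_case_error_ms(age_seconds, readout_uncertainty_ms=40, tolerance_ppm=250):
    """Readout uncertainty plus accumulated clock drift for an entry
    logged `age_seconds` before the readout."""
    drift_ms = age_seconds * 1000 * tolerance_ppm / 1_000_000
    return readout_uncertainty_ms + drift_ms

# An entry logged one hour before readout:
err = worst_case_error_ms(3600)  # 40 + 3600 * 0.25 = 940 ms
```

This is why the most recent entry is the most accurate: at age zero, only the 20-40 ms readout uncertainty remains, while older entries pick up roughly 0.25 ms of possible drift per second of age.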


  • Hello @Matt

    Thank you for your answer. Correct me if I am wrong: I understand that if I log the data, I will get a resolution of 0.2 ms between the two sensor timestamps? Whereas if I use stream mode, I will have an error of 20 to 40 ms depending on my phone's OS.

    Therefore, if I want to keep the clock synchronization error below, let's say, 1 ms, I have to use log mode. Is that right?

    I don't mind having a transmission error between the phone and the MetaWear, but I want to be sure that it will be the same for the two sensors, because at the end of the day I only want the two sensors to share a common clock, and it does not have to be the phone's clock.
    When I open the two data files, I want to be sure that when I look at the epoch column there is an error below 1 ms between the two files.

  • edited September 2019


    If you log, all data from one specific MetaWear device will have 0.2 ms resolution and a base clock source accurate to approximately 250 ppm. Multiple sensors on one MetaWear device will be logged against the same time base, with that resolution.

    The phone OS cannot resolve any particular sensor's time base more accurately than half the connection interval (20-40 ms).

    Log mode will not help to synchronize between multiple MetaWear devices, because the uncertainty of each radio link is involved. If you want higher accuracy you would need to use Linux, which (I believe) can operate down to 7.5 ms connection intervals.
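Given that 1 ms cross-device synchronization isn't achievable over BLE, a practical post-processing compromise is to pair samples from the two files by nearest epoch and discard pairs that are further apart than half the sample period. A minimal sketch (pure Python; assumes each input is a sorted list of epoch timestamps in milliseconds taken from the files' epoch column):

```python
# Sketch: pair samples from two sensors by nearest epoch, rejecting pairs
# further apart than half the sample period (5 ms at 100 Hz).
import bisect

def pair_by_epoch(epochs_a, epochs_b, tol_ms=5):
    """Return (t_a, t_b) pairs where each t_a is matched to its nearest
    t_b, provided the gap is within tol_ms."""
    pairs = []
    for t in epochs_a:
        i = bisect.bisect_left(epochs_b, t)
        # Nearest neighbour is either the element at i or the one before it.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(epochs_b)]
        j = min(candidates, key=lambda j: abs(epochs_b[j] - t))
        if abs(epochs_b[j] - t) <= tol_ms:
            pairs.append((t, epochs_b[j]))
    return pairs

pairs = pair_by_epoch([0, 10, 20, 30], [3, 12, 28])
```

Note the caveat: the epoch columns of the two files each carry their own 20-40 ms readout uncertainty, so this pairing is only as good as the underlying clocks. It finds the best available correspondence; it does not recover sub-millisecond alignment.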

    Keep in mind that for a sensor operating at 100 Hz, each sample is 10 ms apart. 1 ms accuracy on such a signal is meaningless, because there are physical time constants (Newton's laws) on the objects in motion. Further, the motion sensor will have already performed averaging above that frequency to prevent/reduce aliasing in the signal.
