Effect of set_sample_delay on data rate
I am connecting two MetaWearR boards using the Python API, and it has been working OK(ish) so far.
I have a calibration routine that uses sensorfusion (corrected) acc/gyro data, while normal operation mode uses quaternion data. When entering calibration mode, I just change the sensorfusion notification callbacks: I remove the quaternion one and set the acc/gyro ones. When going back to normal mode, I do the opposite.
But that caused the gyro data to stream for about 1-2 seconds and then stop (the acc data keeps coming).
So I changed the calibration routine to use the normal (not sensorfusion) acc/gyro data, and to re-enable sensorfusion when going back to normal mode.
https://pastebin.com/KuvHKM0A here is the code used for the mode change. You can see that I commented out the set_sample_delay call, which was set to 20 ms for a data rate of 50 Hz. With that call in place, besides occasional segmentation faults when going from calibration to normal mode, the data rate was 50 Hz for the first board I connected but only 35 Hz for the second. Now, with that line commented out, both boards stream at 100 Hz. How can that be?
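To double-check what rate each board is actually delivering, one option is to timestamp the incoming callbacks and compute the average rate over a window. Below is a minimal sketch; `RateMonitor` is a hypothetical helper of my own, not part of the MetaWear API, and the simulated timestamps just stand in for real callback arrivals:

```python
import time

class RateMonitor:
    """Hypothetical helper: estimate the effective streaming rate (Hz)
    from the arrival times of data callbacks, to verify whether a
    sensor is really streaming at e.g. 50 Hz or 100 Hz."""

    def __init__(self):
        self.timestamps = []

    def on_sample(self, t=None):
        # Call this from (or instead of) your data callback.
        # An explicit t is allowed here only to make testing easy.
        self.timestamps.append(time.monotonic() if t is None else t)

    def rate_hz(self):
        # Average rate over the whole recorded window.
        if len(self.timestamps) < 2:
            return 0.0
        span = self.timestamps[-1] - self.timestamps[0]
        return (len(self.timestamps) - 1) / span if span > 0 else 0.0

# Simulated 100 Hz stream: 101 samples, 10 ms apart.
mon = RateMonitor()
for i in range(101):
    mon.on_sample(t=i * 0.01)
print(round(mon.rate_hz()))  # → 100
```

Hooking `mon.on_sample` into each board's acc/gyro callback would show whether the 35 Hz vs 50 Hz vs 100 Hz difference is in the data actually arriving or somewhere else.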