Analyzing Sensor Data with AI/ML
I am curious if anyone in the community has been working with Artificial Intelligence / Machine Learning for sensor analysis.
I am contemplating the best way to capture, label and process sensor data to identify certain types of movements.
In my case, my sensor is mounted on a remote control car and I would like to synchronize the data stream with GoPro footage so I have a visual reference for data labeling.
(I noticed that mbientlab received a research grant to make AI/ML analysis for wearables more accessible to non-technical folks but I haven't seen any mention of that here).
pete
Comments
Hey Pete,
When we did our ML/AI work, we used a lot of JavaScript and Python AI/ML libraries, and everything we did ran on an Intel NUC running Linux (Ubuntu 16/18 is nice). We also used expensive BLE dongles.
Thanks Laura. I am going to assign a new ML intern to this project and we will likely start with pre-recorded video and sensor data to develop and train models. I am curious to see how well we will be able to detect events at high rates of speed (20-80 mph).
pete
Keep us updated!
Hello there,
As I am trying to do the same type of work, I was wondering if you could point me to these expensive BLE dongles you mentioned. I keep getting disconnections even though I am using a class 1 Bluetooth 4 dongle (link), seemingly because the body absorbs most of the signal. Any help would be much appreciated.
I am not much of a dongle expert. A little google-fu might go a long way. The issue is that dongles are highly dependent on the hardware and OS. Some OSs have the drivers built in, on some you can install the drivers yourself, and some don't support certain dongles at all.
I personally have about 5 different dongles on my desk and I switch between them depending on my project.
Okay, so for a Linux platform that can connect to 3 or 4 sensors, which one have you found to be the strongest and most reliable of them all?
Yes. Right now I am using a Logitech dongle for my Raspberry Pi 4 project and it is working extremely well (CSR dongles are not supported on my Pi 4 setup).
I mostly use CSR dongles with my Pi 3.
For my NUC, I used the Sena BT-UD100.
We recommend one dongle per two sensors, and the use of a high-powered USB hub if possible.
I also like to use different dongles in the same project.
I'm basically doing a similar thing, requiring me to sync MMR sensors + phone (sensors and GPS) + camera. I apply AI-powered real-time processing of the sensor data on the phone and extended post-processing analysis on the recorded streams. I use timestamps to sync the event streams, which is good enough. For the phone sensors themselves I just sync at the beginning, i.e. calculate start offsets. But once the MMR is included that's not enough: the event streams are timewise not as accurate as I receive them on the phone, they slightly drift, and I also have to handle disconnects. For streaming from the MMR, the "tricky" part is to re-calculate the timestamps, as the timestamps included are from when the packet was received, not when it was created. So you have to rebuild them from the event id and the ODR (https://mbientlab.com/community/discussion/1934/metabase-streaming-timestamps#latest). Analyzing data captured at different speeds is yet another fun topic...
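A minimal sketch of that timestamp rebuild, assuming a fixed ODR, fixed-size packets, and a known stream start time. The function and its parameters are hypothetical placeholders, not MetaWear API calls:

```python
# Hypothetical sketch: rebuild per-sample creation timestamps for a
# streamed packet from its sequence counter and the configured ODR,
# instead of trusting the (receive-time) timestamp on the packet.
# Assumes packet_index counts from 0, each packet carries n_samples
# samples, and the stream started at host time t0 (seconds).

def rebuild_timestamps(packet_index, n_samples, odr_hz, t0=0.0):
    """Return creation-time timestamps (seconds) for all samples in
    packet number `packet_index`, given output data rate `odr_hz`."""
    period = 1.0 / odr_hz
    first_sample = packet_index * n_samples
    return [t0 + (first_sample + i) * period for i in range(n_samples)]

# Example: packet index 2 of 4-sample packets at 100 Hz covers
# samples 8..11, i.e. roughly 0.08 s .. 0.11 s after stream start.
print(rebuild_timestamps(2, 4, 100.0))
```

In practice you would also re-anchor `t0` after every reconnect and correct for clock drift between the sensor and the phone, which this sketch ignores.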
To avoid such hassle, many other solutions just use forced data acquisition and pull all their sensor data at regular intervals so the readings stay in sync, but whether that is good enough heavily depends on your purpose.
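A hedged sketch of that fixed-interval polling approach; `read_sample` is a placeholder for whatever per-sensor read call your SDK provides, not a specific API:

```python
import time

# Hypothetical sketch of forced acquisition: instead of trusting each
# sensor's own streaming clock, poll every sensor at a fixed interval
# so all readings in a row share a single host-side timestamp.

def poll_sensors(sensors, interval_s, duration_s, read_sample):
    """Poll all sensors every `interval_s` seconds for `duration_s`
    seconds. Returns a list of (timestamp, {sensor: value}) rows."""
    rows = []
    t_end = time.monotonic() + duration_s
    next_t = time.monotonic()
    while next_t < t_end:
        now = time.monotonic()
        if now < next_t:
            time.sleep(next_t - now)  # wait for the next slot
        t = time.monotonic()
        rows.append((t, {name: read_sample(name) for name in sensors}))
        next_t += interval_s  # fixed grid, so the schedule never drifts
    return rows
```

The trade-off Patrick mentions applies here: polling caps your effective sample rate at the host's loop rate, so it only works if that rate is good enough for the events you want to detect.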
Patrick