Summary of SHL Challenge 2024: Motion Sensor-based Locomotion and Transportation Mode Recognition in Missing Data Scenarios
Abstract
The paper summarizes the contributions of participants to the sixth Sussex-Huawei Locomotion-Transportation (SHL) Recognition Challenge, organized at the HASCA Workshop of UbiComp/ISWC 2024. The goal of this machine learning/data science challenge is to recognize eight locomotion and transportation activities (Still, Walk, Run, Bike, Bus, Car, Train, Subway) from the motion (accelerometer, gyroscope, magnetometer) sensor data of a smartphone in a manner that is user-independent and smartphone position-independent, and that is also robust to missing data during deployment. The training data of a “train” user is available from smartphones placed at four body positions (Hand, Torso, Bag and Hips). The testing data originates from “test” users with a smartphone placed at one of three body positions (Torso, Bag or Hips). In addition, the test data has one or more sensor modalities randomly missing from each 5-second time frame. Such a scenario may occur if a device dynamically switches sensors on and off to save power, or because of limited computational or memory capacity. We introduce the dataset used in the challenge and the protocol of the competition. We present a meta-analysis of the 7 submissions, covering their approaches, the software tools used, the computational cost and the achieved results. Overall, one submission achieved an F1 score between 70% and 80%, two between 60% and 70%, three between 50% and 60%, and one below 50%. Finally, we present a baseline implementation addressing missing sensor modalities.