
dc.contributor.author  Wang, L
dc.contributor.author  Gjoreski, H
dc.contributor.author  Ciliberto, M
dc.contributor.author  Lago, P
dc.contributor.author  Murao, K
dc.contributor.author  Okita, T
dc.contributor.author  Roggen, D
dc.contributor.author  UbiComp/ISWC '23: The 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing
dc.date.accessioned  2023-10-24T10:54:45Z
dc.date.available  2023-10-24T10:54:45Z
dc.date.issued  2023-10-08
dc.identifier.uri  https://qmro.qmul.ac.uk/xmlui/handle/123456789/91549
dc.description.abstract  In this paper we summarize the contributions of participants to the fifth Sussex-Huawei Locomotion-Transportation (SHL) Recognition Challenge organized at the HASCA Workshop of UbiComp/ISWC 2023. The goal of this machine learning/data science challenge is to recognize eight locomotion and transportation activities (Still, Walk, Run, Bike, Bus, Car, Train, Subway) from the motion (accelerometer, gyroscope, magnetometer) and GPS (GPS location, GPS reception) sensor data of a smartphone in a user-independent manner. The training data of a “train” user is available from smartphones placed at four body positions (Hand, Torso, Bag and Hips). The testing data originates from “test” users with a smartphone placed at one, but unknown, body position. We introduce the dataset used in the challenge and the protocol of the competition. We present a meta-analysis of the contributions from 15 submissions, covering their approaches, the software tools used, the computational cost and the achieved results. The challenge evaluates the recognition performance by comparing predicted and ground-truth labels every 10 milliseconds, but puts no constraints on the maximum decision window length. Overall, five submissions achieved F1 scores above 90%, three between 80% and 90%, two between 70% and 80%, three between 50% and 70%, and two below 50%. While this year's task poses the technical challenges of sensor unavailability, irregular sampling and sensor diversity, the overall performance based on GPS and motion sensors is better than in previous years (e.g. the best performances reported in SHL 2020, 2021 and 2023 are 88.5%, 75.4% and 96.0%, respectively). This is possibly due to the complementarity between the GPS and motion sensors and also the removal of constraints on the decision window length. Finally, we present a baseline implementation to help understand the contribution of each sensor modality to the recognition task.  en_US
dc.publisher  ACM  en_US
dc.title  Summary of SHL Challenge 2023: Recognizing Locomotion and Transportation Mode from GPS and Motion Sensors  en_US
dc.type  Conference Proceeding  en_US
dc.rights.holder  © 2023 ACM
dc.identifier.doi  10.1145/3594739.3610758
pubs.notes  Not known  en_US
pubs.publication-status  Published  en_US
pubs.publisher-url  http://dx.doi.org/10.1145/3594739.3610758  en_US
rioxxterms.funder  Default funder  en_US
rioxxterms.identifier.project  Default project  en_US
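
As a reading aid for the evaluation protocol described in the abstract (predicted labels compared against ground-truth labels every 10 milliseconds, summarized as an F1 score), the following is a minimal Python sketch. It assumes label sequences already aligned at 10 ms resolution and a macro-averaged F1 over the eight activity classes; the function names and the toy data are illustrative assumptions, and the challenge's official evaluation script may differ.

import numpy as np
from sklearn.metrics import f1_score

# The eight activity classes listed in the abstract.
CLASSES = ["Still", "Walk", "Run", "Bike", "Bus", "Car", "Train", "Subway"]

def sample_wise_f1(y_true, y_pred):
    """Macro-averaged F1 over label sequences sampled every 10 ms (100 Hz)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    if y_true.shape != y_pred.shape:
        raise ValueError("predictions must cover every 10 ms sample of the test data")
    return f1_score(y_true, y_pred, labels=CLASSES, average="macro")

# Toy example: 10 s of data = 1000 labels at 10 ms resolution,
# with roughly 10% of the predictions corrupted to "Still".
rng = np.random.default_rng(0)
truth = rng.choice(CLASSES, size=1000)
preds = truth.copy()
preds[rng.random(1000) < 0.1] = "Still"
print(f"F1 = {sample_wise_f1(truth, preds):.3f}")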

