dc.contributor.author | Saitis, C | en_US |
dc.contributor.author | Kalimeri, K | en_US |
dc.date.accessioned | 2019-09-24T13:41:39Z | |
dc.date.available | 2018-08-16 | en_US |
dc.date.issued | 2018-08-22 | en_US |
dc.identifier.issn | 1949-3045 | en_US |
dc.identifier.uri | https://qmro.qmul.ac.uk/xmlui/handle/123456789/59824 | |
dc.description.abstract | In this study, we aim to better understand the cognitive-emotional experience of visually impaired people when navigating unfamiliar urban environments, both outdoor and indoor. We propose a multimodal framework based on random forest classifiers, which predict the actual environment among predefined generic classes of urban settings, inferred from real-time, non-invasive, ambulatory monitoring of brain and peripheral biosignals. Model performance reached 93% for the outdoor and 87% for the indoor environments (expressed as weighted AUROC), demonstrating the potential of the approach. Estimating the density distributions of the most predictive biomarkers, we present a series of geographic and temporal visualizations depicting the environmental contexts in which the most intense affective and cognitive reactions take place. A linear mixed model analysis revealed significant differences between categories of vision impairment, but not between normal and impaired vision. Despite the limited size of our cohort, these findings pave the way to emotionally intelligent mobility-enhancing systems, capable of implicit adaptation not only to changing environments but also to shifts in the affective state of the user in relation to different environmental and situational factors. | en_US |
dc.relation.ispartof | IEEE Transactions on Affective Computing | en_US |
dc.subject | visual impairment | en_US |
dc.subject | affective state | en_US |
dc.subject | multimodal recognition | en_US |
dc.subject | data fusion | en_US |
dc.title | Multimodal Classification of Stressful Environments in Visually Impaired Mobility Using EEG and Peripheral Biosignals | en_US |
dc.type | Article | |
dc.rights.holder | © 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | |
dc.identifier.doi | 10.1109/TAFFC.2018.2866865 | en_US |
pubs.notes | Not known | en_US |
pubs.publication-status | Published | en_US |
dcterms.dateAccepted | 2018-08-16 | en_US |
rioxxterms.funder | Default funder | en_US |
rioxxterms.identifier.project | Default project | en_US |