dc.contributor.author | McArthur, A | |
dc.contributor.author | Sandler, M | |
dc.contributor.author | Stewart, R | |
dc.date.accessioned | 2019-04-25T09:31:44Z | |
dc.date.available | 2019-04-25T09:31:44Z | |
dc.date.issued | 2018-01-01 | |
dc.identifier.isbn | 9781510870390 | |
dc.identifier.uri | https://qmro.qmul.ac.uk/xmlui/handle/123456789/57031 | |
dc.description.abstract | This study examines auditory distance discrimination in cinematic virtual reality. Controlled stimuli with audio-visual distance variations are used to determine whether mismatched stimuli are detected. It asks whether visual conditions (visual events either equally or unequally distanced from the user) and environmental conditions (a reverberant space as opposed to a freer field) affect accuracy in discriminating between congruent and incongruent aural and visual cues. A design derived from the Repertory Grid Technique is used, whereby participant-specific constructs are translated into numerical ratings. Discrimination of auditory-event mismatch improved for stimuli with varied visual-event distances, though not for equidistant visual events. This may indicate that visual cues alert users to matches and mismatches. | en_US |
dc.format.extent | 24 - 33 | |
dc.publisher | Audio Engineering Society | en_US |
dc.title | Perception of mismatched auditory distance - Cinematic VR | en_US |
dc.type | Conference Proceeding | en_US |
dc.rights.holder | © 2018 Audio Engineering Society | |
pubs.notes | Not known | en_US |
pubs.publication-status | Published | en_US |
pubs.volume | 2018-August | en_US |
rioxxterms.funder | Default funder | en_US |
rioxxterms.identifier.project | Default project | en_US |