Show simple item record

dc.contributor.author: McArthur, A
dc.contributor.author: Sandler, M
dc.contributor.author: Stewart, R
dc.date.accessioned: 2019-04-25T09:31:44Z
dc.date.available: 2019-04-25T09:31:44Z
dc.date.issued: 2018-01-01
dc.identifier.isbn: 9781510870390
dc.identifier.uri: https://qmro.qmul.ac.uk/xmlui/handle/123456789/57031
dc.description.abstract: This study examines auditory distance discrimination in cinematic virtual reality. It uses controlled stimuli with audio-visual distance variations to determine whether mismatched stimuli are detected. It asks whether visual conditions (either equally or unequally distanced from the user) and environmental conditions (a reverberant space as opposed to a freer field) affect accuracy in discriminating between congruent and incongruent aural and visual cues. A design derived from the Repertory Grid Technique is used, whereby participant-specific constructs are translated into numerical ratings. Discrimination of auditory-event mismatch improved for stimuli with varied visual-event distances, though not for equidistant visual events. This may demonstrate that visual cues alert users to matches and mismatches. [en_US]
dc.format.extent: 24 - 33
dc.publisher: Audio Engineering Society [en_US]
dc.title: Perception of mismatched auditory distance - Cinematic VR [en_US]
dc.type: Conference Proceeding [en_US]
dc.rights.holder: © 2018 Audio Engineering Society
pubs.notes: Not known [en_US]
pubs.publication-status: Published [en_US]
pubs.volume: 2018-August [en_US]
rioxxterms.funder: Default funder [en_US]
rioxxterms.identifier.project: Default project [en_US]

