
dc.contributor.authorMORGAN, ELen_US
dc.contributor.editorGunes, Hen_US
dc.contributor.editorBryan-Kinns, Nen_US
dc.date.accessioned2017-01-13T09:48:54Z
dc.date.issued2016-08-03en_US
dc.date.submitted2017-01-13T09:42:18.432Z
dc.identifier.urihttp://qmro.qmul.ac.uk/xmlui/handle/123456789/18519
dc.description.abstractModern sensor technologies facilitate the measurement and interpretation of human affective and behavioural signals, and have consequently become widely used tools in the fields of affective computing, social signal processing and psychophysiology. This thesis investigates the use and development of these tools for measuring and enhancing affective and behavioural interaction during collaborative music making. Drawing upon work in the aforementioned fields, an exploratory study is designed in which self-report and continuous behavioural and physiological measures are collected from pairs of improvising percussionists. The findings lead to the selection of gaze, motion, and cardiac activity as input measures in the design of a device to enhance affective and behavioural interaction between co-present musicians. The device provides musicians with real-time visual feedback on the glances or body motions of their co-performers, whilst also recording cardiac activity as a potential measure of musical decision-making processes. Quantitative evidence is found for the effects of this device on the communicative behaviours of collaborating musicians during an experiment designed to test the device in a controlled environment. This study also reports findings on discrete and time-series relationships between cardiac activity and musical decision-making. A further, qualitative study is designed to evaluate the appropriation and impact of the device during long-term use in naturalistic settings. The results provide insights into earlier findings and contribute towards an empirical understanding of affective and behavioural interaction during collaborative music making, as well as implications for the design and deployment of sensor-based technologies to enhance such interactions. This thesis advances the dominant single-user paradigm within human-computer interaction and affective computing research towards multi-user scenarios, where the concern is human-human interaction. It achieves this by focusing on the emotionally rich and under-studied context of co-present musical collaboration, contributing new methods and findings that pave the way for further research and real-world applications.en_US
dc.description.sponsorshipThis work was funded by the Engineering and Physical Sciences Research Council (EPSRC) as part of the Centre for Doctoral Training in Media and Arts Technology at Queen Mary University of London (ref: EP/G03723X/1).en_US
dc.language.isoenen_US
dc.subjectcollaborative music makingen_US
dc.subjectaffective and behavioural interactionen_US
dc.subjectsensor-based technologiesen_US
dc.titleInstrumenting the Musician: Measuring and Enhancing Affective and Behavioural Interaction During Collaborative Music Makingen_US
dc.rights.holderThe copyright of this thesis rests with the author and no quotation from it or information derived from it may be published without the prior written consent of the author.
pubs.author-urlhttps://qmro.qmul.ac.uk/xmlui/handle/123456789/18519en_US
pubs.notesNot knownen_US


Files in this item


This item appears in the following Collection(s)

  • Theses [4209]
    Theses Awarded by Queen Mary University of London
