Show simple item record

dc.contributor.author: Mahmud Rafee, SR
dc.contributor.author: Fazekas, G
dc.contributor.author: Wiggins, G
dc.date.accessioned: 2024-01-25T14:46:55Z
dc.date.issued: 2023-01-01
dc.identifier.isbn: 9781728163277
dc.identifier.issn: 1520-6149
dc.identifier.uri: https://qmro.qmul.ac.uk/xmlui/handle/123456789/94221
dc.description.abstract: Automatic performer identification from the symbolic representation of music has been a challenging topic in Music Information Retrieval (MIR). In this study, we apply a Recurrent Neural Network (RNN) model to classify the most likely music performers from their interpretative styles. We study different expressive parameters and investigate how to quantify these parameters for the exceptionally challenging task of performer identification. We encode performer-style information using a Hierarchical Attention Network (HAN) architecture, based on the notion that traditional Western music has a hierarchical structure (note, beat, measure, phrase level, etc.). In addition, we present a large-scale dataset consisting of six virtuoso pianists performing the same set of compositions. The experimental results show that our model outperforms the baseline models with an F1-score of 0.845 and demonstrates the significance of the attention mechanism for understanding different performance styles.
dc.title: HIPI: A Hierarchical Performer Identification Model Based on Symbolic Representation of Music
dc.type: Conference Proceeding
dc.identifier.doi: 10.1109/ICASSP49357.2023.10094844
pubs.notes: Not known
pubs.publication-status: Published
pubs.volume: 2023-June
rioxxterms.funder: Default funder
rioxxterms.identifier.project: Default project


Files in this item

There are no files associated with this item.

