
dc.contributor.author: Rafee, S
dc.contributor.author: Fazekas, G
dc.contributor.author: Wiggins, G
dc.contributor.author: International Computer Music Conference
dc.date.accessioned: 2021-08-09T15:11:56Z
dc.date.available: 2021-05-14
dc.date.available: 2021-08-09T15:11:56Z
dc.identifier.uri: https://qmro.qmul.ac.uk/xmlui/handle/123456789/73526
dc.description.abstract: Music performers have their own idiosyncratic ways of interpreting a musical piece. A group of skilled performers playing the same piece of music are likely to inject their unique artistic styles into their performances. Variations in tempo, timing, dynamics, articulation, etc. from the notated music are what make performers unique in their performances. This study presents a dataset consisting of the four movements of Schubert's "Sonata in B-flat major, D.960", each performed individually by nine virtuoso pianists. We propose and extract a set of expressive features that capture the characteristics of an individual performer's style. We then present a performer identification method based on the similarity of feature distributions, given a set of piano performances. Identification is carried out using each feature individually as well as a fusion of the features. Results show that the proposed method achieves a precision of 0.903 using the fused features. Moreover, the onset-time deviation feature shows promising results when considered individually.
dc.rights: Attribution 3.0 United States
dc.rights.uri: http://creativecommons.org/licenses/by/3.0/us/
dc.title: Performer Identification From Symbolic Representation of Music Using Statistical Models
dc.type: Conference Proceeding
pubs.notes: Not known
pubs.publication-status: Accepted
dcterms.dateAccepted: 2021-05-14
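The abstract above describes identification by comparing the distributions of expressive features across performances, with onset-time deviation singled out as a strong individual feature. The sketch below is a minimal, hypothetical illustration of that general idea, not the authors' implementation: it assumes each performance is summarised by its per-note onset-time deviations, compares normalised histograms with the Bhattacharyya coefficient, and assigns a query performance to the most similar reference performer. The function names, bin settings, and choice of similarity measure are all illustrative assumptions.

import numpy as np

def feature_histogram(values, bins=20, value_range=(-0.5, 0.5)):
    # Normalised histogram of one expressive feature for a performance,
    # e.g. onset-time deviation (in beats) for every played note.
    # Bin count and range are illustrative assumptions.
    hist, _ = np.histogram(values, bins=bins, range=value_range)
    hist = hist.astype(float) + 1e-9  # smooth empty bins
    return hist / hist.sum()

def bhattacharyya(p, q):
    # Similarity between two discrete distributions (1.0 = identical).
    return float(np.sum(np.sqrt(p * q)))

def identify_performer(query_values, reference_values_by_performer):
    # Score the query performance against each reference performer's
    # feature distribution and return the best match plus all scores.
    q = feature_histogram(query_values)
    scores = {
        name: bhattacharyya(q, feature_histogram(vals))
        for name, vals in reference_values_by_performer.items()
    }
    return max(scores, key=scores.get), scores

# Toy usage with synthetic deviations for two hypothetical performers.
rng = np.random.default_rng(0)
references = {
    "pianist_A": rng.normal(0.00, 0.05, 500),
    "pianist_B": rng.normal(0.08, 0.10, 500),
}
query = rng.normal(0.07, 0.10, 300)
best_match, scores = identify_performer(query, references)
print(best_match, scores)

Fusing several features, as the abstract reports, could be done by combining per-feature similarity scores (for example, averaging them) before picking the best-scoring performer; the reported precision of 0.903 refers to the paper's own fusion method, not to this sketch.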


