dc.contributor.author | Riley, X | en_US |
dc.contributor.author | Dixon, S | en_US |
dc.contributor.author | Riley, J | en_US |
dc.contributor.author | Sound and Music Computing Conference | en_US |
dc.date.accessioned | 2023-06-22T14:05:58Z | |
dc.date.available | 2023-03-24 | en_US |
dc.date.issued | 2023-06-17 | en_US |
dc.identifier.uri | https://qmro.qmul.ac.uk/xmlui/handle/123456789/89144 | |
dc.description.abstract | Tracking the fundamental frequency (f0) of a monophonic instrumental performance is effectively a solved problem, with several solutions achieving 99% accuracy. However, the related task of automatic music transcription requires a further processing step to segment an f0 contour into discrete notes. This sub-task of note segmentation is necessary to enable a range of applications, including musicological analysis and symbolic music generation. Building on CREPE, a state-of-the-art monophonic pitch tracking solution based on a simple neural network, we propose a lightweight and effective method for post-processing CREPE’s output to achieve monophonic note segmentation. The proposed method demonstrates state-of-the-art results on two challenging datasets of monophonic instrumental music. Our approach also gives a 97% reduction in the total number of parameters used when compared with other deep-learning-based methods. | en_US |
dc.rights | Attribution 3.0 United States | * |
dc.rights.uri | http://creativecommons.org/licenses/by/3.0/us/ | * |
dc.title | CREPE Notes: A New Method for Segmenting Pitch Contours into Discrete Notes | en_US |
dc.type | Conference Proceeding | |
pubs.author-url | https://www.xavierriley.co.uk/ | en_US |
pubs.notes | Not known | en_US |
pubs.publication-status | Accepted | en_US |
dcterms.dateAccepted | 2023-03-24 | en_US |
qmul.funder | UKRI Centre for Doctoral Training in Artificial Intelligence and Music::Engineering and Physical Sciences Research Council | en_US |