Show simple item record

dc.contributor.author: Sarmento, P
dc.date.accessioned: 2024-08-15T09:28:27Z
dc.date.available: 2024-08-15T09:28:27Z
dc.identifier.uri: https://qmro.qmul.ac.uk/xmlui/handle/123456789/98872
dc.description.abstract: The burgeoning field of deep learning-based music generation has largely overlooked the potential of symbolic representations tailored to fretted instruments. Guitar tablatures offer an advantageous way to represent prescriptive information about music performance that is often missing from standard MIDI representations. This dissertation addresses a gap in symbolic music generation by developing models that predict both musical structures and expressive guitar performance techniques. We first present DadaGP, a dataset comprising over 25k songs converted from the Guitar Pro tablature format to a dedicated token format suited to sequence models such as the Transformer. To establish a benchmark, we introduce a baseline unconditional model for guitar tablature generation by training a Transformer-XL architecture on the DadaGP dataset, exploring various architecture configurations and experimenting with two different tokenisation approaches. Delving into the controllability of the generative process, we introduce methods for manipulating the output's instrumentation (inst-CTRL) and musical genre (genre-CTRL) by means of control tokens, prepended to each song before training, which act as instructions for the model. Building upon the core model, we further explore fine-tuning procedures, which allow us to develop specialised models: ShredGP, which mimics the stylistic features of specific electric guitar players, and LooperGP, which focuses on generating loopable music. We further investigate human-AI creative collaborations using the models developed in this work. ProgGP, a groundbreaking model for generating progressive metal, played a pivotal role in establishing a novel workflow for music co-composition and production. To ensure the responsible development and application of this research, we conclude by grounding our work in an ethical framework, critically examining the limitations and strengths of our approach with respect to data transparency and diversity.
We believe the dataset and models presented in this thesis hold significant value for guitar-based research in music information retrieval, not only strengthening the foundations of this sub-field but also sparking further research in automatic music generation focused on guitar tablature.
dc.language.iso: en
dc.publisher: Queen Mary University of London
dc.title: Guitar Tablature Generation with Deep Learning
dc.type: Thesis
pubs.notes: Not known
rioxxterms.funder: Default funder
rioxxterms.identifier.project: Default project
qmul.funder: UKRI Centre for Doctoral Training in Artificial Intelligence and Music::Engineering and Physical Sciences Research Council
rioxxterms.funder.project: da54ab93-5b96-400d-b819-869905386bbf


This item appears in the following Collection(s)

  • Theses [4248]
    Theses Awarded by Queen Mary University of London