Musicians and Machines: Bridging the Semantic Gap in Live Performance
Abstract
This thesis explores the automatic extraction of musical information from
live performances, with the intention of using that information to create
novel, responsive and adaptive performance tools for musicians.
We focus specifically on two forms of musical analysis: harmonic analysis
and beat tracking. We present two harmonic analysis algorithms: a novel
chroma vector analysis technique, and a chord recognition algorithm that
takes its output as input. We also present a real-time beat tracker,
based upon an extension of state-of-the-art non-causal models, that is
computationally efficient and performs strongly compared to other models.
Furthermore, through a modular study of several beat tracking algorithms,
we attempt to establish methods for improving beat tracking and apply
these lessons to our model.
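For orientation, a chroma vector folds the energy of a spectrum into
twelve pitch-class bins (C through B). The sketch below shows a textbook
chroma computation in Python; it is illustrative only, not the novel
analysis technique the thesis presents, and the frequency range and
reference pitch are assumptions.

```python
import numpy as np

def chroma_vector(frame, sr=44100, n_fft=4096, f_ref=130.81):
    """Fold the magnitude spectrum of one audio frame into a
    12-bin chroma vector (one bin per pitch class, C through B).

    A textbook illustration only, not the novel analysis technique
    presented in the thesis. f_ref is C3; range limits are assumed.
    """
    spectrum = np.abs(np.fft.rfft(frame, n_fft))
    freqs = np.fft.rfftfreq(n_fft, 1.0 / sr)
    chroma = np.zeros(12)
    # Map each spectral bin in a plausible pitch range to the
    # nearest pitch class and accumulate its magnitude there.
    for mag, f in zip(spectrum, freqs):
        if f < 100 or f > 5000:
            continue
        pitch_class = int(round(12 * np.log2(f / f_ref))) % 12
        chroma[pitch_class] += mag
    norm = np.linalg.norm(chroma)
    return chroma / norm if norm > 0 else chroma
```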
Building upon this work, we show that these analyses can be combined
to create a beat-synchronous musical representation, with harmonic information
segmented at the level of the beat. We present a number of ways
of calculating these representations and discuss their relative merits.
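As an illustration of what such a representation might look like in code,
the following sketch pools frame-level chroma vectors over each
inter-beat interval; the pooling functions shown (mean and median) stand
in for the calculation methods compared in the thesis, and all names and
parameters are assumptions.

```python
import numpy as np

def beat_synchronous_chroma(chroma_frames, frame_times, beat_times,
                            reducer=np.median):
    """Pool frame-level chroma vectors into one vector per
    inter-beat interval, yielding a beat-synchronous representation.

    chroma_frames : (n_frames, 12) array of chroma vectors
    frame_times   : (n_frames,) time of each frame, in seconds
    beat_times    : (n_beats,) beat locations, in seconds
    reducer       : how frames within a beat are combined;
                    np.mean and np.median are two plausible choices
    """
    beat_chroma = []
    for start, end in zip(beat_times[:-1], beat_times[1:]):
        mask = (frame_times >= start) & (frame_times < end)
        if mask.any():
            beat_chroma.append(reducer(chroma_frames[mask], axis=0))
        else:
            beat_chroma.append(np.zeros(12))  # no frames in this beat
    return np.array(beat_chroma)
```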
We proceed by introducing a technique, which we call Performance
Following, for recognising repeated patterns in live musical performances.
By examining the real-time beat-synchronous musical representation, this
technique predicts future harmonic content in a performance without any
prior knowledge in the form of a score.
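The abstract leaves the mechanism to the thesis body. As a minimal sketch
of the general idea, the code below predicts the next beat-synchronous
chroma vector by matching the most recent beats against earlier stretches
of the performance and returning what followed the best match; it is
illustrative only, not the Performance Following algorithm itself, and
the context length and distance measure are assumptions.

```python
import numpy as np

def predict_next(history, context=8):
    """Predict the next beat-synchronous chroma vector by matching
    the most recent `context` beats against earlier, non-overlapping
    windows of the performance and returning what followed the best
    match. Illustrative only; not the thesis's algorithm.

    history : (n_beats, 12) array of beat-synchronous chroma.
    """
    n = len(history)
    if n <= 2 * context:
        return None  # not enough material to match against yet
    recent = history[-context:]
    best_pos, best_dist = None, np.inf
    # Slide the recent context over every earlier window that does
    # not overlap it, keeping the closest match.
    for i in range(n - 2 * context):
        dist = np.linalg.norm(history[i:i + context] - recent)
        if dist < best_dist:
            best_pos, best_dist = i, dist
    return history[best_pos + context]  # what followed the match
```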
Finally, we present a number of potential applications for live
performances that incorporate the real-time musical analysis techniques
outlined above. These applications include audio effects informed by
beat tracking, a technique for synchronising video to a live performance,
the use of harmonic information to control visual displays, and an
automatic accompaniment system based upon our performance following
technique.
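As a minimal illustration of the first of these, a delay effect can be
locked to the tracked beat period so that echoes land on the beat; the
function and parameters below are assumptions, not the effects described
in the thesis. At 120 BPM the beat period is 0.5 s, so echoes would
recur every half second.

```python
import numpy as np

def beat_synced_delay(audio, sr, beat_period, feedback=0.4, mix=0.5):
    """Apply a feedback delay whose delay time equals the tracked
    beat period, so echoes fall on the beat. A minimal illustration
    of a beat-informed audio effect; all parameters are assumed.
    """
    d = int(beat_period * sr)            # delay length in samples
    out = np.copy(audio)
    for n in range(d, len(audio)):
        out[n] += feedback * out[n - d]  # feed delayed signal back
    return (1 - mix) * audio + mix * out
```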
Authors
Stark, Adam