When XR Meets AI: Integrating Interactive Machine Learning with an XR Musical Instrument
Abstract
This paper explores the integration of artificial intelligence (AI) with extended reality (XR) through the development of Netz, an XR musical instrument (XRMI) designed to enhance musical expression and instrument interaction via deep learning. Netz utilises deep learning to map physical gestures to digital controls and allows for customisable control schemes, improving the accuracy of gesture interpretation and the overall musical experience. The instrument was developed through a participatory design process involving a professional keyboard player and music producer over three phases: exploration, making, and performance & refinement. Initial challenges with traditional computational approaches to hand-pose classification were overcome by incorporating an interactive machine learning (IML) model, enabling personalised gesture control. The evaluation included user tasks and a thematic analysis of interviews, highlighting improved interaction and the potential of AI to augment musical performance in XR. The study is limited by its single-participant evaluation; future work will involve a wider range of musicians to assess the generalisability of our findings.