Abstract
These demonstrations will allow visitors to prototype gestural, interactive musical instruments in the browser. Different browser-based synthesisers can be controlled by either a Leap Motion sensor or a Myo armband. Visitors will be able to use an interactive machine learning toolkit to quickly and iteratively explore different interaction possibilities.
The demonstrations show how interactive, browser-based machine learning tools can be used to rapidly prototype gestural controllers for audio.
These demonstrations showcase RapidLib, a browser-based machine learning library developed through the RAPID-MIX project.
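The interactive machine learning workflow described above can be sketched roughly as follows: a visitor records a few example pairs of sensor input and synthesiser parameters, trains a model, then runs live sensor data through it to get interpolated parameter values. This is a minimal illustrative sketch of that workflow using inverse-distance-weighted nearest-neighbour regression; the class and method names are hypothetical and do not reflect RapidLib's actual API.

```python
from math import dist

class ExampleMapper:
    """Maps sensor input vectors to synth parameters by interpolating
    between recorded training examples (inverse-distance weighting)."""

    def __init__(self):
        self.examples = []  # list of (input_vector, output_vector) pairs

    def record(self, sensor_input, synth_params):
        # Each recorded example pairs a gesture snapshot with the
        # synth settings the visitor wants it to produce.
        self.examples.append((list(sensor_input), list(synth_params)))

    def run(self, sensor_input):
        # Weight each example by inverse distance to the live input.
        weights = []
        for inp, out in self.examples:
            d = dist(sensor_input, inp)
            if d == 0:
                return out  # exact match: return its parameters directly
            weights.append((1.0 / d, out))
        total = sum(w for w, _ in weights)
        n_params = len(self.examples[0][1])
        return [sum(w * out[i] for w, out in weights) / total
                for i in range(n_params)]

# A visitor records two gestures, each paired with synth settings
mapper = ExampleMapper()
mapper.record([0.0, 0.0], [100.0, 0.1])  # e.g. hand low  -> low pitch, quiet
mapper.record([1.0, 1.0], [800.0, 0.9])  # e.g. hand high -> high pitch, loud
print(mapper.run([0.5, 0.5]))            # midpoint blends the two examples
```

In practice the input vector would come from the Leap Motion or Myo sensor stream and the output would drive the browser synthesiser's parameters; the iterate-by-example loop (record, train, try, re-record) is what makes this style of prototyping fast.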
Licence information
Creative Commons Attribution-NonCommercial-NoDerivs 3.0 United States (CC BY-NC-ND 3.0 US)