Real-time, Interactive Machine Learning for Music Composition and Performance
Speaker: Rebecca Fiebrink, Princeton
Date: Friday, November 4 2011
Time: 1:00PM to 2:00PM
Location: Patil/Kiva Seminar Room (G449)
Host: Rob Miller, MIT CSAIL
Contact: Katrina Panovich, email@example.com
Relevant URL: http://groups.csail.mit.edu/uid/seminar.shtml
Supervised learning offers a useful set of computational tools for many problems in computer music composition and performance. Through the use of training examples, these algorithms offer composers and instrument builders a means to implicitly specify the relationship between low-level, human-generated control signals (such as the outputs of gesturally manipulated sensor interfaces, or audio captured by a microphone) and the desired computer response (such as a change in the synthesis or structural parameters of dynamically generated digital audio). However, existing software tools have not adequately enabled musicians to employ supervised learning in their work. In my recent research, I have focused on building better tools for these users by supporting more appropriate and comprehensive end-user interactions with the supervised learning process.
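To make the idea concrete, here is a minimal sketch of how a learned mapping from control signals to synthesis parameters can work. It uses simple k-nearest-neighbour regression over a hypothetical training set (the sensor vectors and parameter values below are invented for illustration, not taken from the talk):

```python
import math

def knn_map(examples, x, k=3):
    """Map a control vector x to synthesis parameters by averaging
    the outputs of the k nearest training examples (k-NN regression)."""
    # Rank training pairs by Euclidean distance to the query vector.
    ranked = sorted(examples, key=lambda ex: math.dist(ex[0], x))[:k]
    dim = len(ranked[0][1])
    # Average the parameter vectors of the k nearest neighbours.
    return [sum(out[i] for _, out in ranked) / k for i in range(dim)]

# Hypothetical training set: 3-D sensor readings -> (pitch Hz, filter cutoff Hz).
examples = [
    ([0.0, 0.0, 0.1], (220.0, 500.0)),
    ([0.1, 0.2, 0.1], (330.0, 800.0)),
    ([0.9, 0.8, 0.7], (880.0, 4000.0)),
    ([1.0, 0.9, 0.8], (990.0, 5000.0)),
]

print(knn_map(examples, [0.05, 0.1, 0.1], k=2))  # -> [275.0, 650.0]
```

The training examples implicitly specify the gesture-to-sound relationship; no explicit mapping function is ever written down by the user.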
In this talk, I will provide a brief introduction to interactive computer music and the use of supervised learning in this field. I will show a live musical demo of the software that I have created for interactively applying standard supervised learning algorithms to music and other real-time problem domains. This software, called the Wekinator, supports a hands-on approach to generating training examples by real-time demonstration, as well as interactive, real-time evaluation of the trained models.
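The interaction style described above can be sketched as a small state machine: the user records input/output pairs by demonstration, retrains on demand, and then runs the model live, looping back to record more examples if the behavior is not yet right. This is a hypothetical stand-in, not the Wekinator's actual API (the real system communicates over OSC and builds on standard learning algorithms):

```python
class InteractiveLearner:
    """Sketch of an interactive supervised-learning workflow:
    record -> train -> run, with the user free to loop back and
    add or replace examples at any time."""

    def __init__(self):
        self.examples = []  # (input vector, output vector) pairs
        self.model = None

    def record(self, control_input, desired_output):
        # During demonstration, each incoming control frame is paired
        # with the output the user currently wants to hear.
        self.examples.append((list(control_input), list(desired_output)))

    def train(self):
        # Stand-in learner: 1-nearest-neighbour lookup over the
        # recorded demonstrations.
        data = list(self.examples)
        def model(x):
            best = min(data,
                       key=lambda ex: sum((a - b) ** 2
                                          for a, b in zip(ex[0], x)))
            return best[1]
        self.model = model

    def run(self, control_input):
        # In performance mode, the trained model maps live control
        # input to output parameters in real time.
        return self.model(control_input)

learner = InteractiveLearner()
learner.record([0.0, 0.0], [440.0])   # quiet gesture -> low pitch
learner.record([1.0, 1.0], [880.0])   # broad gesture -> high pitch
learner.train()
print(learner.run([0.9, 0.95]))  # -> [880.0]
```

The point of the loop is that evaluation happens by playing the trained model immediately, rather than by inspecting accuracy metrics offline.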
In the rest of the talk, I will present my research and collaborations with composers and students applying the Wekinator to their work. This has included a participatory design process with practicing composers, pedagogical use with undergraduate students building interactive music systems, the design of a gesture recognition system for a sensor-augmented cello bow, and case studies with composers who have used the Wekinator in publicly-performed musical works. I will discuss some highlights of my findings, such as how interactions with the Wekinator supported users in accomplishing their goals, how the Wekinator "trained" its users to become better machine learning practitioners and to become more aware of their own actions, and how interactive supervised learning functioned as a tool for supporting creativity and an embodied approach to design.