Computer scientists typically think about machine learning as a set of powerful algorithms for modeling data in order to make decisions or predictions, or to better understand some phenomenon. In this talk, I’ll invite you to consider a different perspective, one in which machine learning algorithms function as live and interactive human-machine interfaces, akin to a musical instrument. These “instruments” can support a rich variety of activities, including creative, embodied, and exploratory interactions with computers and media. They can also enable a broader range of people—from software developers to children to music therapists—to create interactive digital systems. Drawing on a decade of research on these topics, I’ll discuss some of our most exciting findings about how machine learning can support human creative practices, for instance by enabling faster prototyping and exploration of new technologies (including by non-programmers), by supporting greater embodied engagement in design, and by changing the ways that creators are able to think about the design process and about themselves. I’ll discuss how these findings inform new ways of thinking about what machine learning is good for, how to make more useful and usable creative machine learning tools, how to teach creative practitioners about machine learning, and what the future of human-computer collaboration might look like.
Rebecca Fiebrink is a Reader at the Creative Computing Institute at University of the Arts London.