Machine Learning as Meta-Instrument: Human-Machine Partnerships Shaping Expressive Instrumental Creation

Fiebrink, Rebecca. 2017. Machine Learning as Meta-Instrument: Human-Machine Partnerships Shaping Expressive Instrumental Creation. In: Till Bovermann; Alberto de Campo; Hauke Egermann; Sarah-Indriyati Hardjowirogo and Stefan Weinzierl, eds. Musical Instruments in the 21st Century. Singapore: Springer, pp. 137-151. ISBN 978-981-10-2950-9 [Book Section]

Fiebrink_Instrumentalities_Preprint.pdf - Accepted Version
Available under License Creative Commons Attribution Non-commercial.


Abstract or Description

In this chapter, I discuss how machine learning algorithms can shape the design of new instruments. Machine learning algorithms can facilitate new types of design outcomes: they enable people to create new types of digital musical instruments. But, I will argue, they are also valuable in facilitating new types of design processes, allowing the instrument creation process to become a more exploratory, playful, embodied, expressive partnership between human and machine. And these qualities of the design process in turn influence the final form of the instrument that is created, as well as the instrument creator herself. My aims in this chapter are: (1) to provide readers new to these ideas with an introductory understanding of how supervised learning algorithms can be used to build new digital musical instruments; (2) to demonstrate that supervised learning algorithms are valuable as design tools, bolstering embodied, real-time, creative practices; and (3) to argue that, because the nature of any new musical instrument is intimately tied to the process through which it was designed, closer attention to the relationships between instrument builders and instrument creation tools can deepen our understanding of new instruments as well as point to opportunities to design both new instruments and new creative experiences.
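As a rough illustration of the supervised-learning workflow the abstract describes (and which the author's Wekinator software realizes in practice): the instrument builder demonstrates example pairs of controller input and desired synthesis parameters, a model is trained on those demonstrations, and new inputs are then mapped to sound parameters in real time. The sketch below is illustrative only, not code from the chapter; a tiny k-nearest-neighbour regressor stands in for the learning algorithm, and all sensor and parameter names and values are made-up assumptions.

```python
# Hypothetical sketch of demonstration-driven instrument mapping: "training"
# stores the builder's demonstrated (input, output) pairs, and prediction
# averages the outputs of the k nearest demonstrated inputs.

def train(examples):
    """Store the demonstrated (input_vector, output_vector) pairs."""
    return list(examples)

def predict(model, x, k=3):
    """Map a new input vector to synthesis parameters via k-NN averaging."""
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    nearest = sorted(model, key=lambda ex: dist(ex[0], x))[:k]
    dims = len(nearest[0][1])
    return [sum(ex[1][d] for ex in nearest) / k for d in range(dims)]

# The builder demonstrates a few gesture -> (pitch_hz, cutoff_hz) pairs,
# e.g. from a 2-axis sensor; the values here are invented for illustration.
demos = [
    ((0.0, 0.0), (220.0, 500.0)),
    ((1.0, 0.0), (440.0, 500.0)),
    ((0.0, 1.0), (220.0, 2000.0)),
    ((1.0, 1.0), (440.0, 2000.0)),
]
model = train(demos)

# At performance time, each incoming sensor frame is mapped to parameters,
# including gestures the builder never explicitly demonstrated.
pitch, cutoff = predict(model, (0.9, 0.1))
```

Because training amounts to recording a handful of demonstrations, the builder can iterate by ear in seconds — add an example, retrain, play — which is precisely the exploratory, embodied design loop the chapter argues for.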

Item Type:

Book Section

Departments, Centres and Research Units:

Computing > Embodied AudioVisual Interaction Group (EAVI)


Date:

1 January 2017 (Published)

Date Deposited:

25 Jan 2017 12:32

Last Modified:

29 Apr 2020 16:23


