Latent Spaces: A Creative Approach

Yee-King, Matthew. 2022. Latent Spaces: A Creative Approach. In: Craig Vear and Fabrizio Poltronieri, eds. The Language of Creative AI: Practices, Aesthetics and Structures. Cham, Switzerland: Springer, pp. 137-154. ISBN 9783031109591 [Book Section]

Full text (restricted):

matthewyeeking_latent_spaces_v3.pdf - Accepted Version (6MB)
Permissions: Administrator Access Only until 6 November 2024.

Abstract or Description

This chapter explores the creative possibilities offered by latent spaces. Latent spaces are machine-learnt maps representing large media datasets such as collections of images and sounds. With a latent space, an artist can rapidly search for interesting places in the dataset and then generate new artefacts around and between data points. These unique artefacts were not in the original dataset, but they relate to it. Readers will find a detailed explanation of what latent spaces are and how they fit into a series of developments in digital media processing techniques such as content-based search and feature extraction. We will encounter four examples of machine learning systems that provide latent spaces suitable for creative work. The first example is Music-VAE, which creates a latent space of millions of musical fragments represented in the symbolic MIDI format. The second example is Latent Timbre Synthesis (LTS); unlike Music-VAE, which works in a symbolic musical domain, LTS works directly with audio fragments. The third example is StyleGAN, which creates a latent space of images with specific properties that allow for style transfer. The final example is VQGAN + CLIP, a text-to-image system that uses fine-tuning techniques to iteratively generate images from a text phrase. Finally, we consider examples of artists working with each of the four systems, along with reflections on their creative processes.
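The chapter's own code is not reproduced in this record, but the core idea in the abstract — generating artefacts "between" data points — can be sketched as linear interpolation between two latent vectors. The function name, latent dimensionality, and the notion of a separate trained decoder are all illustrative assumptions, not details from the chapter:

```python
import numpy as np

def interpolate_latents(z_a, z_b, steps=5):
    """Linearly interpolate between two latent vectors.

    Returns `steps` vectors moving from z_a to z_b. Decoding each
    one with a trained generative model (e.g. a VAE decoder) would
    yield new artefacts that lie 'between' the two originals.
    """
    return [(1 - t) * z_a + t * z_b
            for t in np.linspace(0.0, 1.0, steps)]

# Two points in a toy 4-dimensional latent space.
z_a = np.array([0.0, 0.0, 0.0, 0.0])
z_b = np.array([1.0, 1.0, 1.0, 1.0])
path = interpolate_latents(z_a, z_b, steps=5)
print(path[2])  # midpoint of the path: [0.5 0.5 0.5 0.5]
```

In practice the endpoints would come from encoding two existing artefacts (two melodies, two images), and each interpolated vector would be passed through the model's decoder; the sketch shows only the latent-space arithmetic.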

Item Type:

Book Section

Keywords:

Machine learning, Music-AI, StyleGAN, Generative art, Latent space

Date:

6 November 2022 (Published)

Date Deposited:

22 Dec 2022 09:21

Last Modified:

07 Jan 2023 01:27

