Autoencoding Blade Runner: Reconstructing Films With Artificial Neural Networks

Broad, Terence and Grierson, Mick. 2017. Autoencoding Blade Runner: Reconstructing Films With Artificial Neural Networks. SIGGRAPH 17 Art Papers, ISSN 1931-4027 [Article]

Text (ACM accepted manuscript): artpaper_author_draft.pdf - Accepted Version
Available under License Creative Commons Attribution Non-commercial No Derivatives.

Text (Leonardo version © 2017 Terence Broad and Mick Grierson): leon_a_01455.pdf - Published Version

Abstract or Description

‘Blade Runner—Autoencoded’ is a film made by training an autoencoder—a type of generative neural network—to recreate frames from the film Blade Runner. The autoencoder is made to reinterpret every individual frame, reconstructing it based on its memory of the film. The result is a hazy, dreamlike version of the original film. The project explores the aesthetic qualities of the disembodied gaze of the neural network. The autoencoder is also capable of representing images from films it has not seen based on what it has learned from watching Blade Runner.
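The paper's core technique can be illustrated with a minimal sketch (this is a hypothetical toy model, not the authors' network): an autoencoder squeezes each input frame through a low-dimensional bottleneck and is trained to reconstruct the frame from that compressed code, so reconstructions reflect what the network has "memorised" rather than the exact pixels.

```python
import numpy as np

# Toy autoencoder sketch (illustrative only; all sizes and data are invented):
# a single-hidden-layer network trained to reconstruct its own input.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Stand-in "frames": 8 flattened greyscale images of 64 pixels each.
frames = rng.random((8, 64))

# Encoder and decoder weights; the 16-unit bottleneck forces a lossy code.
W_enc = rng.normal(0, 0.1, (64, 16))
W_dec = rng.normal(0, 0.1, (16, 64))

def reconstruct(x):
    return sigmoid(sigmoid(x @ W_enc) @ W_dec)

mse_before = float(np.mean((reconstruct(frames) - frames) ** 2))

lr = 0.5
for _ in range(2000):
    z = sigmoid(frames @ W_enc)      # encode: frame -> latent code
    recon = sigmoid(z @ W_dec)       # decode: latent code -> frame
    err = recon - frames             # reconstruction error
    # Backpropagate mean-squared-error gradients through both layers.
    d_recon = err * recon * (1 - recon)
    d_z = (d_recon @ W_dec.T) * z * (1 - z)
    W_dec -= lr * z.T @ d_recon / len(frames)
    W_enc -= lr * frames.T @ d_z / len(frames)

mse_after = float(np.mean((reconstruct(frames) - frames) ** 2))
```

After training, `mse_after` is lower than `mse_before`: the network reconstructs the frames it has seen from its compressed internal representation, which is also why its outputs look hazy and why it can still attempt to "describe" frames from films it has never seen.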

Item Type:

Article

Departments, Centres and Research Units:

Computing > Embodied AudioVisual Interaction Group (EAVI)

Dates:

14 March 2017: Accepted
1 August 2017: Published

Item ID:

20555

Date Deposited:

12 Jun 2017 10:47

Last Modified:

29 Apr 2020 16:27

Peer Reviewed:

Yes, this version has been peer-reviewed.

URI:

https://research.gold.ac.uk/id/eprint/20555
