Algorithmic Witnesses
Rosamond, Emily. 2016. 'Algorithmic Witnesses'. In: 8th Beyond Humanism Conference, Universidad Complutense de Madrid, Spain, 25-27 May 2016. [Conference or Workshop Item]
Text: Emily Rosamond Beyond Humanism Abstract.pdf
Abstract or Description
In the mid-twentieth century, George Orwell and Michel Foucault penned vastly influential visions of surveillance and its role in disciplining subjects. Their ideas still frequently frame discussions of surveillance after the Snowden revelations of 2013. Yet in many ways, these twentieth-century narratives are ill-equipped to describe recent developments: the increasing role of corporate and financial players in online surveillance; the move from discipline toward behavioural modulation; the increasingly predictive, speculative orientation of online surveillance apparatuses; and the prevalence of automated, algorithmic user identification. It is not only governmental bodies such as the NSA and GCHQ that watch; hosts of corporate and financial players monetize data, identify users, and predict future behaviour. Automated, algorithmic witnesses attempt to determine who we are, and to predetermine what we see online. What are the implications of these automated witnesses for artists? What are the implications of claims – made by Trevor Paglen and others – that, in order to respond to these new conditions of control, artists may need to learn to see the way their automated witnesses see?
In her ground-breaking Islamic genealogy of new media art (2010), Laura Marks writes of Islamic carpet designs as algorithmic interplays of inputs and outputs, which enable a kind of inorganic thought to play itself out in, and as, design. Algorithms constantly perform such forms of inorganic thought in the server farms that constitute the computational backdrops of daily life; but in a surveillance economy, they have largely been tasked with acts of identification and witnessing, which at least partly reinforce reified racial, class and gender biases. In light of the ubiquity of automated witnesses, Trevor Paglen calls for artists to drop visual culture – so irrelevant to machinic seeing – as a frame of reference. Yet his own works certainly do not achieve this; if anything, they uphold and extend both the visual tropes of romanticism and the psychoanalytic discourses of the scopic drive. Further, it could be argued that Paglen’s approach reflects a privileged de-privileging of the humanist frameworks of visual culture, insofar as it fails to sufficiently differentiate between its humanist investments and its posthumanist aims. In contrast, feminist works by Erica Scourti and others self-consciously stage a dialogue between human subjects and their algorithmic witnesses, thereby opening a space for navigating the differences between human perception and machine identification, and exploring the myriad ways in which algorithmic witnesses subtly reshape the tropes of visual culture.
Item Type: Conference or Workshop Item (Paper)
Keywords: Surveillance, Automated witnesses, Algorithmic witnesses, Artists
Event Location: Universidad Complutense de Madrid, Spain
Date range: 25-27 May 2016
Item ID: 25561
Date Deposited: 16 Jan 2019 16:02
Last Modified: 29 Apr 2020 17:05