What does it mean to 'solve' the problem of discrimination in hiring? Social, technical and legal perspectives from the UK on automated hiring systems

Sánchez-Monedero, Javier; Dencik, Lina and Edwards, Lilian. 2020. 'What does it mean to 'solve' the problem of discrimination in hiring? Social, technical and legal perspectives from the UK on automated hiring systems'. In: Conference on Fairness, Accountability, and Transparency (FAT* ’20). Barcelona, Spain 27 - 30 January 2020. [Conference or Workshop Item]

Abstract or Description

Discriminatory practices in recruitment and hiring are an ongoing issue that is a concern not just for workplace relations, but also for wider understandings of economic justice and inequality. The ability to get and keep a job is a key aspect of participating in society and sustaining livelihoods. Yet the way decisions are made on who is eligible for jobs, and why, is rapidly changing with the advent and growing uptake of automated hiring systems (AHSs) powered by data-driven tools. Evidence of the extent of this uptake around the globe is scarce, but a recent report estimated that 98% of Fortune 500 companies use Applicant Tracking Systems of some kind in their hiring process, a trend driven by perceived efficiency measures and cost-savings. Key concerns about such AHSs include their lack of transparency and the potential limitation of access to jobs for specific profiles. In relation to the latter, however, several of these AHSs claim to detect and mitigate discriminatory practices against protected groups and to promote diversity and inclusion at work. Yet whilst these tools have a growing user-base around the world, such claims of 'bias mitigation' are rarely scrutinised and evaluated, and when they are, the scrutiny has almost exclusively been from a US socio-legal perspective.

In this paper, we introduce a perspective outside the US by critically examining how three prominent automated hiring systems (AHSs) in regular use in the UK, HireVue, Pymetrics and Applied, understand and attempt to mitigate bias and discrimination. These systems have been chosen because they explicitly claim to address issues of discrimination in hiring and, unlike many of their competitors, provide some information about how their systems work that can inform an analysis. Using publicly available documents, we describe how their tools are designed, validated and audited for bias, highlighting assumptions and limitations, before situating these in the socio-legal context of the UK. The UK has a very different legal background to the US, not only in terms of hiring and equality law but also in terms of data protection (DP) law. We argue that this might be important for addressing concerns about transparency, and could pose a challenge to building bias mitigation into AHSs in a way that definitively meets EU legal standards. This is significant because these AHSs, especially those developed in the US, may obscure rather than reduce systemic discrimination in the workplace.

Item Type:

Conference or Workshop Item (Paper)

Identification Number (DOI):

https://doi.org/10.1145/3351095.3372849

Additional Information:

"© 2020 Copyright held by the owner/author(s). This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record is available at, https://doi.org/10.1145/3351095.3372849."

The research of Lina Dencik and Javier Sánchez-Monedero has been funded by the ERC Starting Grant DATAJUSTICE (grant no. 759903).

Keywords:

Socio-technical systems, automated hiring, algorithmic decision-making, fairness, discrimination, GDPR, social justice

Departments, Centres and Research Units:

Media, Communications and Cultural Studies

Dates:

7 January 2020 (Published)

Event Location:

Barcelona, Spain

Date range:

27 - 30 January 2020

Item ID:

37281

Date Deposited:

24 Jul 2024 09:27

Last Modified:

25 Jul 2024 10:02

URI:

https://research.gold.ac.uk/id/eprint/37281
