Algorithmic injustice: a relational ethics approach

Patterns (N Y). 2021 Feb 12;2(2):100205. doi: 10.1016/j.patter.2021.100205.

Abstract

It has become trivial to point out that algorithmic systems increasingly pervade the social sphere. Improved efficiency, the hallmark of these systems, drives their mass integration into day-to-day life. However, as a robust body of research in the area of algorithmic injustice shows, algorithmic systems, especially when used to sort and predict social outcomes, are not only inadequate but also perpetuate harm. In particular, a persistent and recurrent trend within the literature indicates that society's most vulnerable are disproportionately impacted. When algorithmic injustice and harm are brought to the fore, most of the remedies on offer (1) revolve around technical solutions and (2) do not center disproportionately impacted communities. This paper proposes a fundamental shift, from rational to relational, in thinking about personhood, data, justice, and everything in between, and positions ethics as something that goes above and beyond technical solutions. Outlining the idea of an ethics built on the foundations of relationality, this paper calls for a rethinking of justice and ethics as a set of broad, contingent, and fluid concepts and down-to-earth practices that are best viewed as a habit, not a mere methodology for data science. As such, this paper mainly offers critical examinations and reflection, not "solutions."

Keywords: Afro-feminism; artificial intelligence; complex systems; data science; embodiment; enaction; ethics; justice; machine learning; relational epistemology.

Publication types

  • Review