AI4Society Seminar: Ethics of Algorithmic Monoculture and Systemic Exclusion
Join us for the AI4Society seminar series featuring distinguished scholars exploring AI's impact on society, ethics, governance, and human-computer interaction! Our first speaker this semester is Kathleen Creel, an assistant professor of philosophy, religion, and computer science at Northeastern University.
Title: Ethics of Algorithmic Monoculture and Systemic Exclusion
Abstract:
Mistakes are inevitable, but fortunately human mistakes are typically heterogeneous. Using the same machine learning model for high-stakes decisions creates consistency while amplifying the weaknesses, biases, and idiosyncrasies of the original model. When the same person re-encounters the same model, or models trained on the same dataset, she might be wrongly rejected again and again. Thus algorithmic monoculture could lead to consistent ill-treatment of individual people by homogenizing the decision outcomes they experience. Is it unfair to allow the quirks of an algorithmic system to consistently exclude a small number of people from consequential opportunities? And if it is unfair, does its unfairness depend on correlation with bias? I will present an ethical argument for why and under what circumstances algorithmic homogenization of outcomes is unfair.
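The homogenization point in the abstract can be made concrete with a toy simulation. This sketch is not part of the talk, and its setup (the noise scales, the accept/reject threshold, and the number of firms) is purely an illustrative assumption: when every firm reuses one shared model, a qualified applicant on the wrong side of that model's quirks is rejected everywhere, whereas independent human errors rarely all line up against the same person.

```python
import numpy as np

rng = np.random.default_rng(0)

n_applicants, n_firms = 10_000, 20
quality = rng.normal(size=n_applicants)        # latent applicant quality
qualified = quality > 0                        # applicants who "should" be accepted

# Heterogeneous judges: each firm adds its own independent error.
noise = rng.normal(scale=1.0, size=(n_firms, n_applicants))
hetero_accept = (quality + noise) > 0

# Monoculture: every firm reuses one shared model, so a single
# fixed error pattern is applied to every decision.
shared_error = rng.normal(scale=1.0, size=n_applicants)
mono_accept = np.tile((quality + shared_error) > 0, (n_firms, 1))

def excluded_everywhere(accept):
    """Fraction of qualified applicants rejected by every single firm."""
    return (~accept[:, qualified]).all(axis=0).mean()

print("heterogeneous judges:", excluded_everywhere(hetero_accept))
print("shared model:        ", excluded_everywhere(mono_accept))
```

Under these assumptions, the simulation typically shows essentially no qualified applicants rejected by all firms when errors are independent, but a sizable fraction rejected everywhere when the model (and hence its error pattern) is shared, which is the systemic-exclusion pattern the abstract describes.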
Speaker Bio:
Kathleen Creel is an assistant professor at Northeastern University appointed in the Department of Philosophy and Religion and in Khoury College of Computer Sciences. Her research explores the moral, political, and epistemic implications of machine learning as it is used in automated decision making and in science.
Date: Tuesday, March 18
Time: 4-5 PM
Room: 32-141