The Societal Effects of Technical Systems: Case Studies of Privacy and Fairness

Internet Policy Research Initiative
Today’s algorithmic systems aim to engineer society, ranging from the personalization of online ads to the prediction of criminal risk for bail and parole decisions. In such socio-technical systems, it is infeasible to formally specify a complete list of desirable properties. As computer scientists designing and studying such systems, we must adapt our methods.

In this talk, I’ll describe my work on two domains without clear formal specifications: the privacy implications of web tracking and the (un)fairness of machine learning. I’ll use these to illustrate an interdisciplinary research approach that is centered on measurement, embraces ambiguity in definitions, and seeks to build tools that can enable users, developers, and regulators to effectively negotiate conflicting goals and preferences.

Arvind Narayanan is an Assistant Professor of Computer Science at Princeton. He leads the Princeton Web Transparency and Accountability Project, which uncovers how companies collect and use our personal information. Narayanan also leads a research team investigating the security, anonymity, and stability of cryptocurrencies, as well as novel applications of blockchains. He co-created a Massive Open Online Course and a textbook on Bitcoin and cryptocurrency technologies. His doctoral research demonstrated the fundamental limits of de-identification, for which he received the Privacy Enhancing Technologies Award.

Narayanan is an affiliated faculty member at the Center for Information Technology Policy at Princeton and an affiliate scholar at Stanford Law School's Center for Internet and Society.