Abstract: To a social anthropologist, cryptography is interesting because what counts as secure, what constitutes trust, and how sensitive information is perceived are not only cryptographic questions but deeply social ones. Anthropologists study societal and cultural praxes, processes and perceptions, and information security and cryptography are becoming important themes for such inquiries. In this talk, I invite you to be curious about how anthropologists explore notions of security, privacy and trust, and how this might be relevant to your work. I suggest that joining forces across disciplines could be mutually beneficial for commercial applications, offer new ways to think methodologically together, and provide a platform for much-needed societal debate and critique.
From a commercial perspective, there seems to be a disconnect between developing new cryptographic primitives and getting these brilliant schemes adopted for actual use (Anderson 1994; Whitten and Tygar 2005). This is connected to what appears to be a narrative about how tasks are distributed: we start with math and crypto, which is then passed on to engineers who develop software. From there, policies and laws are developed and hardware issues are addressed. But the final step, adoption by users, is problematic. If users do not recognize the need for information security tools, or cannot understand them well enough to be convinced that they work, all the other steps are for naught. Perhaps we should rethink this distribution of tasks. When users’ existing practices, perceptions and life conditions are embedded in designs from the beginning, the resulting tools are more likely to succeed.
The narrative above describes a movement from the general and abstract (math) to the specific and practical (users), with each step separate from the others. What if we designed cryptographic tools by thinking athwart (cf. Helmreich 2009)? Placing math and social analysis next to each other, and tacking back and forth between them, offers a different methodological construct (Mannov, Andersen, and Bruun, n.d.). It places, for example, actual social trust and cryptographic trust schemes next to each other, showing more clearly where they agree and where they contradict one another (Bruun, Mannov, and Andersen n.d.). Similar questions can be asked about “security” (von Schnitzler 2008), “risk” (cf. Appel 2012) and “privacy”.
Bringing our disciplines together in this way is new, and it represents an opportunity: the Snowden revelations and Cambridge Analytica’s recent abuses have changed the way citizens think about data security, and the time may be right for launching a nuanced public debate. Addressing the socio-mathematical issues of trust, privacy and security together positions us to offer well-informed criticism on issues such as data-citizens’ rights (Taylor 2017) and surveillance capitalism (Zuboff 2019).