Privacy, incentives, and truthfulness
Speaker: David Xiao, LRI, Université Paris-Sud and LIAFA, Université Paris
Date: January 24, 2011
Time: 10:30AM to 12:00PM
Location: 32-D463 STAR conf. room
Host: Shafi Goldwasser, CSAIL, MIT
Contact: Be Blackburn, 3-6098, firstname.lastname@example.org
Privacy has become an ever more pressing concern as we conduct more and more of our lives in public forums such as the Internet. One privacy question that has received much study is how a database curator may output "sanitized" data that does not reveal too much information about any particular individual. This criterion has been formalized as differential privacy, proposed originally by Dwork et al. (TCC '06 and ICALP '06), which captures the idea that "the presence or absence of any individual's data does not change the distribution of the sanitized data by much". This guarantee has been interpreted to mean that individuals should be comfortable revealing their information, since their participation barely changes the output.
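The guarantee quoted above is usually formalized as follows (standard notation, not quoted from the talk): a randomized mechanism M is ε-differentially private if, for any two databases D and D' differing in one individual's record and any set S of possible outputs,

```latex
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S].
```

Smaller ε means the two output distributions are closer, i.e., any single record has less influence on what is published.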
In this talk, we argue that this interpretation is incomplete: unless participation in the database somehow explicitly benefits the individuals, they will always refuse to participate truthfully, regardless of whether the sanitization mechanism is differentially private.
We therefore advocate studying differential privacy in conjunction with the notion of truthfulness from game theory, which says that a mechanism should be designed so that it is in the individuals' own interest to give their true information. We show that there exist games for which differentially private mechanisms, in particular the exponential mechanism of McSherry and Talwar (FOCS '07), do not motivate the individuals to participate truthfully. On the positive side, we show that a wide class of games do admit differentially private, truthful, and efficient mechanisms.
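The exponential mechanism mentioned above selects an output with probability exponentially weighted by a quality score, which makes it differentially private whenever the score has bounded sensitivity. A minimal sketch in Python, assuming the standard formulation; the function name and the mode-finding usage below are illustrative, not taken from the talk:

```python
import math
import random

def exponential_mechanism(data, candidates, quality, epsilon,
                          sensitivity=1.0, rng=random):
    """Pick a candidate r with probability proportional to
    exp(epsilon * quality(data, r) / (2 * sensitivity))."""
    scores = [quality(data, r) for r in candidates]
    # Subtract the max score for numerical stability; this rescales all
    # weights by the same factor and leaves the distribution unchanged.
    m = max(scores)
    weights = [math.exp(epsilon * (s - m) / (2.0 * sensitivity))
               for s in scores]
    # Sample a candidate in proportion to its weight.
    pick = rng.random() * sum(weights)
    acc = 0.0
    for r, w in zip(candidates, weights):
        acc += w
        if pick <= acc:
            return r
    return candidates[-1]
```

For example, releasing the most common element of a list with the counting score `quality(d, r) = d.count(r)` (sensitivity 1) returns the true mode with high probability when ε is large, and approaches a uniform choice as ε goes to 0.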
Finally, we explore the possibility of tradeoffs between utility and privacy. This is because individuals may be willing to give up some privacy if they receive enough utility from a game, and vice versa. We show that, under a natural measure of information cost, even the release of a differentially private histogram may reveal so much information that individuals would rather suffer the consequences of lying than have their information published.
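A differentially private histogram of the kind discussed above is typically obtained by adding Laplace noise to each bin count. A minimal sketch, assuming the standard add/remove-one-record model in which the histogram has L1 sensitivity 1 (the function name is illustrative, not from the talk):

```python
import math
import random
from collections import Counter

def dp_histogram(records, epsilon, rng=random):
    """Return per-category counts with Laplace(1/epsilon) noise added.

    Adding or removing one record changes exactly one bin by 1, so the
    histogram's L1 sensitivity is 1 and Laplace noise of scale 1/epsilon
    per bin yields epsilon-differential privacy.
    """
    counts = Counter(records)
    scale = 1.0 / epsilon

    def laplace(b):
        # Inverse-CDF sampling from the Laplace(0, b) distribution.
        u = rng.random() - 0.5
        return -b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

    return {k: c + laplace(scale) for k, c in counts.items()}
```

Larger ε means less noise and more useful counts, but a weaker privacy guarantee; the talk's point is that even the information leaked at this level may outweigh an individual's incentive to report truthfully.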
This talk is part of the CIS Seminars 2010/2011 series.