New Models For Data Privacy And Voting
Enormous amounts of data are collected by hospitals, social networking systems, government agencies, and other organizations. Analyzing this data offers huge social benefits, but we must protect the privacy of the individuals it describes. The current standard definition of data privacy is differential privacy [22, 19]. In this thesis, we introduce new definitions of data privacy that improve on differential privacy in specific respects.

We first argue that differential privacy may not be strong enough in social network settings. We then introduce a zero-knowledge-based definition of privacy called zero-knowledge privacy, which is strictly stronger than differential privacy and is particularly attractive for modeling privacy in social networks.

Both differential privacy and zero-knowledge privacy provide strong privacy guarantees. However, for certain tasks, mechanisms satisfying these definitions must add substantial "noise", lowering the utility of the released data. We therefore introduce a new definition of privacy called crowd-blending privacy, which strictly relaxes differential privacy. We demonstrate crowd-blending private mechanisms for histograms and for releasing synthetic data points, achieving strictly better utility than is possible with differentially private mechanisms.

Differential privacy guarantees the same level of privacy protection for all individuals. However, some individuals may need more privacy than others. We therefore introduce a generalization of differential privacy called tailored differential privacy, in which an individual's privacy parameter is "tailored" to the individual based on the individual's data and the data set. We focus on a natural instance of tailored differential privacy, which we call outlier privacy: an individual's privacy parameter is determined by how much of an "outlier" the individual is.
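To give a concrete sense of the "noise" that differentially private mechanisms add, the following sketch shows the standard Laplace mechanism for a count query (the canonical example in the differential privacy literature, not a mechanism from this thesis). The function names and data are illustrative assumptions.

```python
import random

def dp_count(records, predicate, epsilon):
    """Return an epsilon-differentially private count of records matching predicate.

    Adding or removing one record changes the true count by at most 1
    (sensitivity 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    # The difference of two i.i.d. Exponential(rate=epsilon) draws
    # is distributed as Laplace(0, 1/epsilon).
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise
```

Smaller values of `epsilon` mean stronger privacy but larger expected noise, which is exactly the privacy/utility tension the crowd-blending relaxation is designed to ease. A histogram over disjoint bins can reuse this idea, adding independent Laplace noise to each bin count.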
In this thesis, we also study the problem of strategy-proof voting, which is plagued by impossibility results. We take a bounded-rationality approach to this problem and consider a setting where voters have "coarse" beliefs (a notion that has gained popularity in the behavioral economics literature). In particular, we construct good voting rules that satisfy a notion of strategy-proofness with respect to coarse i.i.d. beliefs, thus circumventing the existing impossibility results.
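The notion of strategy-proofness with respect to i.i.d. beliefs can be made concrete with a brute-force check: fix a voter's utilities, model every other voter as submitting a uniformly random ranking independently, and verify that reporting the truthful ranking maximizes the voter's expected utility. The plurality rule, the three-candidate setting, and the utility values below are purely illustrative assumptions; they are not the voting rules constructed in the thesis, and plurality is not strategy-proof in general.

```python
import itertools

CANDIDATES = ("A", "B", "C")

def plurality_winner(ballots):
    """Winner by first-place votes; ties broken alphabetically."""
    counts = {c: 0 for c in CANDIDATES}
    for ballot in ballots:
        counts[ballot[0]] += 1
    # max() returns the first maximizer in tuple order, i.e. alphabetical tie-break.
    return max(CANDIDATES, key=lambda c: counts[c])

def expected_utility(my_ballot, utility, n_others):
    """Expected utility of casting my_ballot when each other voter submits a
    uniformly random ranking, independently (uniform i.i.d. beliefs)."""
    rankings = list(itertools.permutations(CANDIDATES))
    profiles = list(itertools.product(rankings, repeat=n_others))
    total = sum(utility[plurality_winner((my_ballot,) + others)]
                for others in profiles)
    return total / len(profiles)

utility = {"A": 1.0, "B": 0.5, "C": 0.0}
truthful = tuple(sorted(CANDIDATES, key=utility.get, reverse=True))
scores = {b: expected_utility(b, utility, n_others=2)
          for b in itertools.permutations(CANDIDATES)}
```

In this toy setting, `scores[truthful]` is maximal: against uniform i.i.d. opponents, voting one's true favorite first is a best response under plurality. The thesis's contribution is to construct rules for which such a guarantee holds with respect to the coarse i.i.d. beliefs studied in behavioral economics, despite the classical impossibility results for exact strategy-proofness.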
data privacy; voting
Gehrke, Johannes E.; Kozen, Dexter Campbell
Ph.D., Computer Science
Doctor of Philosophy
dissertation or thesis