This dissertation focuses on strategic behavior and database privacy. First, we look at strategic behavior as a tool for distributed computation. We blend the perspectives of game theory and mechanism design to propose distributed solutions to the classical set cover optimization problem. We endow agents with natural individual incentives, and we show that centrally broadcasting non-binding advice effectively guides the system to a near-optimal state while keeping the original incentive structure intact.
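As a point of reference for the underlying optimization problem, the following is a minimal, hypothetical Python sketch of the classical centralized greedy approximation for set cover. It illustrates only the problem that the distributed, incentive-based approach addresses; the mechanism proposed in the dissertation is not shown, and the example universe and subsets are made up.

    # Classical greedy approximation for set cover (illustrative only).
    # universe: elements to cover; subsets: dict mapping a subset's name to its elements.
    def greedy_set_cover(universe, subsets):
        uncovered = set(universe)
        chosen = []
        while uncovered:
            # Pick the subset covering the most still-uncovered elements.
            best = max(subsets, key=lambda name: len(subsets[name] & uncovered))
            if not subsets[best] & uncovered:
                raise ValueError("universe cannot be covered by the given subsets")
            chosen.append(best)
            uncovered -= subsets[best]
        return chosen

    universe = {1, 2, 3, 4, 5}
    subsets = {"A": {1, 2, 3}, "B": {2, 4}, "C": {4, 5}}
    print(greedy_set_cover(universe, subsets))  # e.g. ['A', 'C']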
We next turn to the database privacy setting, in which an analyst wishes to learn something from a database, but the individuals contributing the data want to protect their personal information. The notion of differential privacy allows us to do both by obscuring true answers to statistical queries with a small amount of noise. Whether a task can be performed under differential privacy depends on whether the amount of noise required for privacy still permits statistical accuracy.
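To illustrate the general idea of noisy query answers (not any specific mechanism from the dissertation), here is a short sketch of the standard Laplace mechanism answering a counting query. The function name private_count and the example data are hypothetical, and the noise scale assumes the counting query has sensitivity 1.

    import numpy as np

    # Laplace mechanism sketch: answer a counting query with noise
    # calibrated to the query's sensitivity and a privacy parameter epsilon.
    def private_count(data, predicate, epsilon):
        true_count = sum(1 for row in data if predicate(row))
        sensitivity = 1.0  # adding or removing one individual changes the count by at most 1
        noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
        return true_count + noise

    ages = [23, 35, 41, 29, 57, 62]
    print(private_count(ages, lambda a: a >= 40, epsilon=0.5))

Smaller epsilon means stronger privacy but larger expected noise, which is the tradeoff discussed above.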
We show that it is possible to achieve a satisfying tradeoff between privacy and accuracy for a computational problem called independent component analysis (ICA), which seeks to decompose an observed signal into its underlying independent source variables. We do this by releasing a perturbation of a compact representation of the observed data. This approach allows us to preserve individual privacy while releasing information that can be used to reconstruct the underlying relationship between the observed variables.
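For readers unfamiliar with ICA, the following non-private sketch uses scikit-learn's FastICA to recover two independent sources from their linear mixtures. It illustrates only the decomposition task, not the dissertation's privacy-preserving approach; the signals and mixing matrix are invented for the example.

    import numpy as np
    from sklearn.decomposition import FastICA

    # Non-private ICA illustration: recover independent sources from mixed observations.
    t = np.linspace(0, 8, 2000)
    sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]  # two independent signals
    mixing = np.array([[1.0, 0.5], [0.5, 2.0]])
    observed = sources @ mixing.T                            # observed mixed signals

    ica = FastICA(n_components=2, random_state=0)
    recovered = ica.fit_transform(observed)                  # estimated independent sources
    print(recovered.shape)  # (2000, 2)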
In almost all of the differential privacy literature, the privacy requirement must be specified before looking at the data, and the noise added for privacy limits the statistical utility of the sanitized data. The third part of this dissertation ties together privacy and strategic behavior to answer the question of how to determine an appropriate level of privacy when data contributors prefer more privacy but an analyst prefers more accuracy. The proposed solution views privacy as a public good and uses market design techniques to collect these preferences and then privately select and enforce a socially efficient level of privacy.
Identifier | oai:union.ndltd.org:GATECH/oai:smartech.gatech.edu:1853/53964
Date | 21 September 2015
Creators | Krehbiel, Sara
Contributors | Peikert, Chris
Publisher | Georgia Institute of Technology
Source Sets | Georgia Tech Electronic Thesis and Dissertation Archive
Language | en_US
Type | Dissertation
Format | application/pdf