Dissertation Details
The fundamental limits of statistical data privacy
Kairouz, Peter
Keywords: Privacy; Information Theory; Data Privacy; Statistics; Multi-Party Computation; Security; Local Differential Privacy; Privacy-Preserving Machine Learning Algorithms; Information Theoretic Utilities; f-Divergence; Mutual Information; Statistical Inference; Hypothesis Testing; Estimation
Others: https://www.ideals.illinois.edu/bitstream/handle/2142/92686/KAIROUZ-DISSERTATION-2016.pdf?sequence=1&isAllowed=y
United States | English
Source: The Illinois Digital Environment for Access to Learning and Scholarship
【 Abstract 】

The Internet is shaping our daily lives. On the one hand, social networks like Facebook and Twitter allow people to share their precious moments and opinions with virtually anyone around the world. On the other, services like Google, Netflix, and Amazon allow people to look up information, watch movies, and shop online anytime, anywhere. However, with this unprecedented level of connectivity comes the danger of being monitored. There is an increasing tension between the need to share data and the need to preserve the privacy of Internet users. The need for privacy appears in three main contexts: (1) the global privacy context, as when private companies and public institutions release personal information about individuals to the public; (2) the local privacy context, as when individuals disclose their personal information to potentially malicious service providers; (3) the multi-party privacy context, as when different parties cooperate to interactively compute a function defined over all the parties' data.

Differential privacy has recently surfaced as a strong measure of privacy in all three contexts. Under differential privacy, privacy is achieved by randomizing the data before releasing it. This leads to a fundamental tradeoff between privacy and utility. In this thesis, we take a concrete step towards understanding the fundamental structure of privacy mechanisms that achieve the best privacy-utility tradeoff. This tradeoff is formulated as a constrained optimization problem: maximize utility subject to differential privacy constraints. We show, perhaps surprisingly, that in all three privacy contexts, the optimal privacy mechanisms have the same combinatorial staircase structure. This deep result is a direct consequence of the geometry of the constraints imposed by differential privacy on the privatization mechanisms.
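As an illustrative sketch (not taken from the thesis itself), the "randomize before releasing" idea in the local privacy context can be seen in randomized response, the classic ε-locally-differentially-private mechanism for a single bit. Each user flips their true bit with a probability calibrated to ε, and the analyst debiases the aggregate; the gap between the true and estimated mean shows the privacy-utility tradeoff the abstract describes. Function names here are our own choices for the sketch.

```python
import math
import random

def randomized_response(bit, epsilon):
    """Report the true bit with probability e^eps / (1 + e^eps),
    else the flipped bit. Satisfies eps-local differential privacy:
    the likelihood ratio of any output under the two inputs is e^eps."""
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return bit if random.random() < p_truth else 1 - bit

def debiased_mean(reports, epsilon):
    """Unbiased estimate of the true population mean from noisy reports.
    E[report] = (2p - 1) * mu + (1 - p), with p = e^eps / (1 + e^eps)."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    raw = sum(reports) / len(reports)
    return (raw - (1.0 - p)) / (2.0 * p - 1.0)

if __name__ == "__main__":
    random.seed(0)
    epsilon = 2.0
    # Synthetic population: each user holds a private bit, ~30% ones.
    data = [1 if random.random() < 0.3 else 0 for _ in range(10_000)]
    reports = [randomized_response(b, epsilon) for b in data]
    print(f"debiased estimate: {debiased_mean(reports, epsilon):.3f}")
```

Smaller ε means the flip probability approaches 1/2, so the debiasing factor 1/(2p − 1) blows up the estimator's variance: stronger privacy, lower utility, which is exactly the tradeoff the thesis formalizes as a constrained optimization.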

【 Preview 】
Attachments
Files | Size | Format | View
The fundamental limits of statistical data privacy | 5373KB | PDF | download
Document metrics
Downloads: 8    Views: 19