Thesis Details
Differential Privacy, Property Testing, and Perturbations
McMillan, Audra ; Schotland, John Carl ;
University of Michigan
Keywords: differential privacy;    Mathematics;    Science;
Others: https://deepblue.lib.umich.edu/bitstream/handle/2027.42/143940/amcm_1.pdf?sequence=1&isAllowed=y
Switzerland | English
Source: The Illinois Digital Environment for Access to Learning and Scholarship
【 Abstract 】

Controlling the dissemination of information about ourselves has become a minefield in the modern age. We release data about ourselves every day and don't always fully understand what information is contained in this data. It is often the case that seemingly innocuous pieces of data can be combined to reveal more sensitive information about ourselves than we intended. Differential privacy has developed as a technique to prevent this type of privacy leakage. It borrows ideas from information theory to inject enough uncertainty into the data so that sensitive information is provably absent from the privatised data. Current research in differential privacy walks the fine line between removing sensitive information and allowing non-sensitive information to be released.

At its heart, this thesis is about the study of information. Many of the results can be formulated as asking a subset of the questions: does the data you have contain enough information to learn what you would like to learn? And how can I affect the data to ensure you can't discern sensitive information? We will often approach the former question from both directions: information theoretic lower bounds on recovery and algorithmic upper bounds.

We begin with an information theoretic lower bound for graphon estimation. This explores the fundamental limits of how much information about the underlying population is contained in a finite sample of data. We then move on to exploring the connection between information theoretic results and privacy in the context of linear inverse problems. We find that there is a discrepancy between how the inverse problems community and the privacy community view good recovery of information. Next, we explore black-box testing for privacy. We argue that the amount of information required to verify the privacy guarantee of an algorithm, without access to the internals of the algorithm, is lower bounded by the amount of information required to break the privacy guarantee. Finally, we explore a setting where imposing privacy is a help rather than a hindrance: online linear optimisation. We argue that private algorithms have the right kind of stability guarantee to ensure low regret for online linear optimisation.
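As background for the noise-injection idea the abstract describes (textbook material, not content from the thesis): an algorithm M is ε-differentially private if, for every pair of datasets D and D′ differing in one individual's record and every set of outputs S, Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S]. A minimal Python sketch of the standard Laplace mechanism, which achieves this guarantee for numeric queries, follows; the counting-query example and function names are illustrative assumptions.

```python
import numpy as np

def laplace_mechanism(true_answer: float, sensitivity: float, epsilon: float) -> float:
    """Return an epsilon-differentially private answer to a numeric query.

    Adds Laplace(0, sensitivity / epsilon) noise, which suffices for a
    query whose value changes by at most `sensitivity` when one
    individual's record is added or removed.
    """
    scale = sensitivity / epsilon
    return true_answer + np.random.laplace(loc=0.0, scale=scale)

# Example: privately release a count. Sensitivity is 1, since one
# person changes the count by at most 1; privacy budget epsilon = 0.5.
records = [1, 0, 1, 1, 0, 1]  # hypothetical sensitive bits
private_count = laplace_mechanism(sum(records), sensitivity=1.0, epsilon=0.5)
print(private_count)
```

Smaller ε forces a larger noise scale, which is exactly the tension the abstract points to between removing sensitive information and preserving useful information.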

【 Preview 】
Attachment List
Files | Size | Format | View
Differential Privacy, Property Testing, and Perturbations | 1065KB | PDF | download
Document Metrics
Downloads: 8    Views: 8