Thesis Details
A framework for privacy-preserving, distributed machine learning using gradient obfuscation
Phadke, Nishad Ashok; Vaidya, Nitin H.
Keywords: Distributed optimization; Privacy
Others: https://www.ideals.illinois.edu/bitstream/handle/2142/99116/PHADKE-THESIS-2017.pdf?sequence=1&isAllowed=y
United States | English
Source: The Illinois Digital Environment for Access to Learning and Scholarship
【 Abstract 】

Large-scale machine learning has recently risen to prominence in both industry and academia, driven by the newfound accessibility of data-collecting sensors and high-volume data storage devices. The advent of these capabilities in industry, however, has raised questions about the privacy implications of new, massively data-driven, subscribable services offered by corporations to individuals. Recent lines of research have developed algorithms designed to scale in distributed machine learning environments while making certain privacy guarantees to subscribers without hindering the quality of service the corporations are able to provide. In this work, we fully implement one such distributed optimization framework and rigorously test its parameterized convergence properties. We also develop a system of both disruptive and nondisruptive attacks designed to aggressively intrude upon subscribers' privacy and to glean subscribers' private data from information readily available within the framework's network. These attack techniques can be seamlessly integrated into the aforementioned distributed optimization framework and are shown to pose a risk to the privacy of the system.
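
The record gives no implementation detail, so the following is only a minimal sketch of one common reading of "gradient obfuscation" in distributed optimization: each worker masks its locally computed gradient with correlated random noise constructed to cancel in aggregate, so the coordinator recovers the exact average update without seeing any individual gradient. The zero-sum masking scheme, the least-squares objective, and all constants below are illustrative assumptions, not the specific algorithm studied in the thesis.

    # Sketch: zero-sum gradient masking in a simulated distributed setting.
    import numpy as np

    rng = np.random.default_rng(0)
    num_workers, dim, steps, lr = 5, 3, 200, 0.02

    # Private local data: each worker holds its own least-squares problem.
    A = [rng.normal(size=(20, dim)) for _ in range(num_workers)]
    true_x = np.array([1.0, -2.0, 0.5])
    b = [A[i] @ true_x + 0.01 * rng.normal(size=20) for i in range(num_workers)]

    def local_gradient(i, x):
        """Gradient of worker i's private loss 0.5 * ||A_i x - b_i||^2."""
        return A[i].T @ (A[i] @ x - b[i])

    x = np.zeros(dim)
    for _ in range(steps):
        # Random masks adjusted so that they sum to zero across workers.
        masks = rng.normal(size=(num_workers, dim))
        masks -= masks.mean(axis=0)

        # Each worker reports only its obfuscated gradient.
        reported = [local_gradient(i, x) + masks[i] for i in range(num_workers)]

        # The masks cancel, so the averaged update equals the true average gradient.
        x -= lr * np.mean(reported, axis=0)

    print("estimate:", np.round(x, 3))  # should land near the planted parameter [1, -2, 0.5]

The attacks mentioned in the abstract target exactly this kind of setup: an adversary observing the masked gradients (or participating in the protocol) tries to infer the private local data despite the obfuscation.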

【 Preview 】
Attachments
Files | Size | Format | View
A framework for privacy-preserving, distributed machine learning using gradient obfuscation | 14170 KB | PDF | download
Metrics
Downloads: 20    Views: 14