Journal Article Details
Volume: 151
Dynamics based privacy preservation in decentralized optimization
Article
Keywords: DISTRIBUTED OPTIMIZATION; CONVERGENCE
DOI: 10.1016/j.automatica.2023.110878
Source: SCIE
【 Abstract 】

With decentralized optimization finding increasing application in domains ranging from machine learning and control to robotics, its privacy is also receiving increased attention. Existing privacy solutions for decentralized optimization achieve privacy by patching on information-technology privacy mechanisms such as differential privacy or homomorphic encryption, which either sacrifice optimization accuracy or incur heavy computation/communication overhead. We propose an inherently privacy-preserving decentralized optimization algorithm that exploits the robustness of decentralized optimization dynamics. More specifically, we present a general decentralized optimization framework, based on which we show that the privacy of participating nodes' gradients can be protected by adding randomness to the optimization parameters. We further show that the added randomness has no influence on the accuracy of optimization, and prove that our inherently privacy-preserving algorithm achieves R-linear convergence when the global objective function is smooth and strongly convex. We also prove that the proposed algorithm prevents a node's gradient from being inferred by other nodes. Simulation results confirm the theoretical predictions. (c) 2023 Elsevier Ltd. All rights reserved.
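As a rough illustration of the idea of masking exchanged quantities with randomness that leaves the network-level dynamics unchanged, the sketch below runs plain decentralized gradient descent on a ring of five nodes and perturbs each outgoing message with zero-sum, decaying noise. All names and parameters are hypothetical and this is not the paper's algorithm; in particular, this toy variant only converges to a neighborhood of the optimum, whereas the paper proves exact R-linear convergence for its construction.

    import numpy as np

    # Hypothetical sketch (not the paper's exact algorithm): decentralized
    # gradient descent on a ring of n nodes, each minimizing the local
    # quadratic f_i(x) = 0.5 * (x - b_i)^2, so the global optimum is the
    # average of the b_i. Each node masks the copies of its state sent to
    # its two neighbors with zero-sum, decaying randomness, so the
    # network-average dynamics are unaffected while individual messages
    # do not reveal the node's state directly.
    rng = np.random.default_rng(0)
    n, T, alpha = 5, 500, 0.02
    b = rng.normal(size=n)          # local data defining each node's objective
    x = np.zeros(n)                 # each node's current estimate

    for t in range(T):
        noise = rng.normal(size=n) * 0.9 ** t   # decaying perturbation per node
        sent_right = x + noise                  # message to right ring neighbor
        sent_left = x - noise                   # message to left ring neighbor
        # What each node receives from its two neighbors:
        recv = 0.5 * (np.roll(sent_right, 1) + np.roll(sent_left, -1))
        # Consensus step followed by a local gradient step, grad f_i(x_i) = x_i - b_i.
        x = 0.5 * (x + recv) - alpha * (x - b)

    print("node estimates:", np.round(x, 3))    # all near the target average
    print("target average:", round(b.mean(), 3))

The perturbations added to the two outgoing copies cancel in the sum received across the network, so the average state evolves exactly as in the unperturbed dynamics; this is a simplified stand-in for the paper's approach of embedding randomness in the optimization parameters themselves.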

【 License 】

Free   
