Journal Article Details
Volume: 155
Gradient-tracking based differentially private distributed optimization with enhanced optimization accuracy
Article
Keywords: CONVERGENCE; ADMM
DOI: 10.1016/j.automatica.2023.111150
Source: SCIE
【 Abstract 】

Privacy protection has become an increasingly pressing requirement in distributed optimization. However, equipping distributed optimization with differential privacy, the state-of-the-art privacy protection mechanism, unavoidably compromises optimization accuracy. In this paper, we propose an algorithm to achieve rigorous ε-differential privacy in gradient-tracking based distributed optimization with enhanced optimization accuracy. More specifically, to suppress the influence of differential-privacy noise, we propose a new robust gradient-tracking based distributed optimization algorithm that allows both the stepsize and the variance of the injected noise to vary with time. We then establish a new analysis approach that can characterize the convergence of the gradient-tracking based algorithm under both constant and time-varying stepsizes. To our knowledge, this is the first analytical framework that treats gradient-tracking based distributed optimization under both constant and time-varying stepsizes in a unified manner. More importantly, the new analysis approach gives a much less conservative analytical bound on the stepsize than existing proof techniques for gradient-tracking based distributed optimization. We also theoretically characterize the influence of the differential-privacy design on the accuracy of distributed optimization, revealing that inter-agent interaction has a significant impact on the final optimization accuracy. Numerical simulation results confirm the theoretical predictions. © 2023 Elsevier Ltd. All rights reserved.
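The abstract describes the algorithmic idea only at a high level. The Python sketch below illustrates one plausible form of differentially private gradient tracking with a time-varying stepsize and time-varying injected noise, as outlined above. The concrete update rule, the geometric stepsize and noise schedules, and all names (dp_gradient_tracking, step_decay, noise_decay, etc.) are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

def dp_gradient_tracking(grads, W, x0, num_iters=200,
                         step0=0.1, step_decay=0.9,
                         noise0=1.0, noise_decay=0.8):
    """Minimal sketch of noisy gradient tracking over n agents.

    grads : list of per-agent gradient functions, grads[i](x) -> ndarray
    W     : (n, n) doubly stochastic mixing matrix of the network
    x0    : (n, d) initial states, one row per agent
    """
    n, d = x0.shape
    x = x0.copy()
    g = np.stack([grads[i](x[i]) for i in range(n)])  # initial tracker: y_i^0 = grad f_i(x_i^0)
    y = g.copy()
    for k in range(num_iters):
        gamma = step0 * step_decay**k    # time-varying stepsize (assumed geometric decay)
        sigma = noise0 * noise_decay**k  # time-varying Laplace noise scale (assumed)
        # Each agent shares perturbed state/tracker values to protect its local data.
        x_noisy = x + np.random.laplace(scale=sigma, size=x.shape)
        y_noisy = y + np.random.laplace(scale=sigma, size=y.shape)
        x_new = W @ x_noisy - gamma * y_noisy            # consensus + tracked-gradient step
        g_new = np.stack([grads[i](x_new[i]) for i in range(n)])
        y = W @ y_noisy + g_new - g                      # gradient-tracking update
        x, g = x_new, g_new
    return x

# Usage example: 3 agents minimizing a sum of quadratics f_i(x) = ||x - b_i||^2 / 2.
n, d = 3, 2
b = np.random.randn(n, d)
grads = [lambda x, bi=b[i]: x - bi for i in range(n)]
W = np.full((n, n), 1.0 / n)  # complete graph with uniform mixing weights
x_final = dp_gradient_tracking(grads, W, np.zeros((n, d)))
```

Decaying both the stepsize and the noise scale reflects the abstract's point that letting them vary with time helps suppress the accumulated influence of the privacy noise on the final optimization accuracy.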

【 License 】

Free   
