Concentrated Differentially Private and Utility Preserving Federated Learning - 2020

Research Area:  Machine Learning

Abstract:

Federated learning is a machine learning setting in which a set of edge devices collaboratively train a model under the orchestration of a central server without sharing their local data. At each communication round, edge devices perform multiple steps of stochastic gradient descent on their local data and then upload the results to the server for the model update. Because information is exchanged between the edge devices and the server during this process, privacy leakage becomes a challenge when the server is not fully trusted. While some existing privacy-preserving mechanisms could readily be applied to federated learning, they usually come at a high cost to the convergence of the algorithm and the utility of the learned model. In this paper, we develop a federated learning approach that addresses the privacy challenge without much degradation in model utility, through a combination of local gradient perturbation, secure aggregation, and zero-concentrated differential privacy (zCDP). We provide a tight end-to-end privacy guarantee for our approach and analyze its theoretical convergence rates. Through extensive numerical experiments on real-world datasets, we demonstrate the effectiveness of the proposed method and show that it achieves a superior trade-off between privacy and model utility.
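The local gradient perturbation step mentioned in the abstract can be illustrated with a minimal sketch: each device clips its gradient to a bounded L2 norm and adds Gaussian noise before uploading. This is a generic Gaussian-mechanism sketch, not the paper's exact construction; the function name `perturb_gradient`, the clipping threshold, and the fixed `noise_std` are illustrative assumptions, and the paper's actual noise calibration via zCDP composition and secure aggregation is not reproduced here.

```python
import numpy as np

def perturb_gradient(grad, clip_norm=1.0, noise_std=0.1, rng=None):
    """Clip a local gradient to L2 norm <= clip_norm, then add Gaussian noise.

    Illustrative sketch of Gaussian-mechanism gradient perturbation; the
    noise scale here is a placeholder, not the zCDP-calibrated value.
    """
    rng = np.random.default_rng() if rng is None else rng
    norm = np.linalg.norm(grad)
    # Scale down only if the gradient exceeds the clipping threshold.
    clipped = grad * min(1.0, clip_norm / max(norm, 1e-12))
    # Add i.i.d. Gaussian noise to each coordinate.
    noise = rng.normal(0.0, noise_std, size=grad.shape)
    return clipped + noise
```

Clipping bounds each device's contribution (its sensitivity), which is what makes the added Gaussian noise yield a differential privacy guarantee after composition across rounds.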

Keywords:  

Author(s) Name:  Rui Hu, Yuanxiong Guo, Yanmin Gong

Journal name:  Computer Science

Conference name:  

Publisher name:  arXiv (arXiv:2003.13761)

DOI:  10.48550/arXiv.2003.13761

Volume Information: