Research Area:  Machine Learning
With the rapid growth of mobile computing, massive amounts of data and computing resources now reside at the edge. Federated learning (FL) has therefore become a widely adopted distributed machine learning (ML) paradigm, which aims to harness this expanding, skewed data locally in order to develop rich and informative models. In centralized FL, a collection of devices collaboratively solves an ML task under the coordination of a central server. However, existing FL frameworks make overly simplistic assumptions about network connectivity and ignore the communication bandwidth of the different links in the network. In this paper, we present and study a novel FL algorithm in which devices mostly collaborate with other devices in a pairwise manner. Our nonparametric approach exploits network topology to reduce communication bottlenecks. We evaluate our approach on various FL benchmarks and demonstrate that our method achieves 10× better communication efficiency and around an 8% increase in accuracy compared to the centralized approach.
Keywords:  
Centralized Federated Learning
Deep Learning
Machine Learning
Author(s) Name:  Li Chou, Zichang Liu, Zhuang Wang & Anshumali Shrivastava
Journal name:  
Conference name:  Machine Learning and Knowledge Discovery in Databases
Publisher name:  Springer
DOI:  10.1007/978-3-030-86486-6_47
Volume Information:  
Paper Link:  https://link.springer.com/chapter/10.1007/978-3-030-86486-6_47