Research Area:  Machine Learning
In federated learning, a central server coordinates the training of a single model over a massively distributed network of devices. This setting extends naturally to a multi-task learning framework, which can handle the strong statistical heterogeneity among devices that real-world federated datasets typically exhibit. Although federated multi-task learning has been shown to be an effective paradigm for real-world datasets, it has so far been applied only to convex models. In this work, we introduce VIRTUAL, an algorithm for federated multi-task learning with general non-convex models. In VIRTUAL, the federated network of the server and the clients is treated as a star-shaped Bayesian network, and learning is performed on the network using approximated variational inference. We show that this method is effective on real-world federated datasets, outperforming the current state of the art for federated learning while concurrently allowing sparser gradient updates.
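The star-shaped Bayesian network described above can be illustrated with a minimal sketch (not the authors' implementation): each client contributes an approximate Gaussian factor over a shared parameter, and the server's posterior is the product of the prior and all client factors, in the spirit of expectation-propagation-style variational updates. The conjugate Gaussian model and all function names here are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of a star-shaped Bayesian network for federated inference:
# a scalar shared parameter with a Gaussian prior, where each client
# summarizes its local data as a Gaussian likelihood factor. Gaussians are
# kept in natural parameters (precision, precision * mean), so multiplying
# factors reduces to summing natural parameters.

def to_natural(mean, var):
    prec = 1.0 / var
    return prec, prec * mean

def to_moment(prec, prec_mean):
    var = 1.0 / prec
    return prec_mean * var, var

def client_factor(local_data, noise_var=1.0):
    # Each client sends a Gaussian factor over the shared parameter,
    # computed from its own (possibly heterogeneous) local data.
    n = len(local_data)
    prec = n / noise_var
    prec_mean = float(np.sum(local_data)) / noise_var
    return prec, prec_mean

def server_posterior(factors, prior_mean=0.0, prior_var=10.0):
    # The server multiplies the prior by all client factors:
    # in natural parameters this is a simple sum.
    prec, prec_mean = to_natural(prior_mean, prior_var)
    for f_prec, f_prec_mean in factors:
        prec += f_prec
        prec_mean += f_prec_mean
    return to_moment(prec, prec_mean)

rng = np.random.default_rng(0)
# Three clients with different amounts of data around a shared parameter 2.0
clients = [rng.normal(2.0, 1.0, size=n) for n in (20, 50, 5)]
factors = [client_factor(d) for d in clients]
mean, var = server_posterior(factors)
print(f"posterior mean={mean:.3f}, var={var:.4f}")
```

The actual VIRTUAL algorithm operates on non-convex models (neural networks) with per-client task-specific parameters; this conjugate scalar example only conveys how a server can aggregate client-side approximate posteriors in a star topology.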
Keywords:  
Author(s) Name:  Luca Corinzia, Ami Beuret, Joachim M. Buhmann
Journal name:  Computer Science
Conference name:  
Publisher name:  arXiv
DOI:  10.48550/arXiv.1906.06268
Volume Information:  
Paper Link:   https://arxiv.org/abs/1906.06268