Research Area:  Machine Learning
Federated learning is a distributed learning paradigm that allows model training on a large corpus of decentralized data held by different data providers, without sharing or leaking raw data. Based on the characteristics of the data distribution, it is usually classified into three categories: horizontal federated learning, vertical federated learning, and federated transfer learning. In this paper we present a parallel distributed logistic regression solution for vertical federated learning. Compared with existing works, our solution removes the role of the third-party coordinator. The system is built on the parameter server architecture and aims to speed up model training by utilizing a cluster of servers when the volume of training data is large. We also evaluate the performance of parallel distributed model training; the experimental results show the strong scalability of the system.
Author(s) Name:  Shengwen Yang, Bing Ren, Xuhui Zhou, Liping Liu
Journal name:  Computer Science
Publisher name:  arXiv:1911.09824
Paper Link:   https://arxiv.org/abs/1911.09824
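To make the vertical setting concrete: each party holds a different slice of the features for the same samples, exchanges only its partial linear score, and updates its own weights locally from the shared residuals, so no party ever sees another's raw data. The sketch below is a minimal plain-text toy of this idea, not the paper's actual protocol — it omits the cryptographic protection and the parameter-server parallelism a real deployment would use, and the names `Party`, `train`, and `predict` are hypothetical.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class Party:
    """Holds one vertical slice of the features plus the weights for that slice."""
    def __init__(self, features):
        self.X = features                      # rows = samples, cols = this party's features
        self.w = [0.0] * len(features[0])

    def partial_scores(self):
        # Each party reveals only w·x for its own slice, never the raw features.
        return [sum(wj * xj for wj, xj in zip(self.w, row)) for row in self.X]

    def update(self, residuals, lr):
        # Local gradient from the shared residuals; no third-party coordinator.
        n = len(self.X)
        for j in range(len(self.w)):
            grad = sum(r * row[j] for r, row in zip(residuals, self.X)) / n
            self.w[j] -= lr * grad

def train(parties, y, lr=0.5, epochs=300):
    for _ in range(epochs):
        # Sum the partial scores across parties to get the full linear score.
        scores = [sum(s) for s in zip(*(p.partial_scores() for p in parties))]
        residuals = [sigmoid(z) - yi for z, yi in zip(scores, y)]
        for p in parties:
            p.update(residuals, lr)

def predict(parties):
    scores = [sum(s) for s in zip(*(p.partial_scores() for p in parties))]
    return [1 if sigmoid(z) >= 0.5 else 0 for z in scores]
```

In a real vertical federated system the exchanged partial scores and residuals would be protected (e.g. encrypted or masked), and the parameter server architecture would shard the per-party computation across a cluster; this toy only shows the data-partitioning logic.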