Research Area:  Machine Learning
Federated learning is an effective way to enable artificial intelligence over massive distributed nodes with security and communication efficiency. Most prior work focuses on learning a single global model for one task across the network; because it adopts general gradient update methods in a federated environment, this approach handles multi-task scenarios with stragglers and faults poorly. Other work learns a distinct model for each node, which is expensive in both computation and communication. Hierarchical networks are emerging as a new candidate for reducing communication cost. We therefore propose a primal-dual-method-based hierarchical federated multi-task learning system, supported by the HFedMTL algorithm, which allows massive numbers of nodes from distributed areas to join the federated multi-task learning process. Empirical experiments verify the analysis and demonstrate gains in both learning performance and convergence rate.
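For intuition, here is a minimal Python sketch of the two-level (node → edge area → cloud) aggregation structure that a hierarchical federated learning system of this kind relies on. This is not the paper's HFedMTL primal-dual update, whose details are not given in the abstract; the least-squares task, the data, and the sample-size weighting are illustrative assumptions only.

```python
import numpy as np

# Hypothetical two-level hierarchy: each edge server averages the model
# parameters of its local nodes, then the cloud averages the edge-level
# models. This is generic hierarchical FedAvg-style aggregation, not the
# paper's HFedMTL primal-dual algorithm.

def local_update(w, X, y, lr=0.1, steps=5):
    """A few gradient steps on a linear least-squares task at one node."""
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def aggregate(models, weights):
    """Weighted parameter average across models."""
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    return sum(wt * m for wt, m in zip(weights, models))

rng = np.random.default_rng(0)
d = 5
w_global = np.zeros(d)

# Two edge areas, each with three nodes holding private (synthetic) data.
areas = [[(rng.normal(size=(20, d)), rng.normal(size=20)) for _ in range(3)]
         for _ in range(2)]

for rnd in range(10):
    edge_models = []
    for nodes in areas:
        node_models = [local_update(w_global.copy(), X, y) for X, y in nodes]
        # Edge-level aggregation: only one model per area travels upward,
        # which is the communication saving the hierarchy provides.
        edge_models.append(aggregate(node_models, [len(y) for _, y in nodes]))
    # Cloud-level aggregation over the edge servers.
    w_global = aggregate(edge_models,
                         [sum(len(y) for _, y in a) for a in areas])

print("global model after 10 rounds:", w_global)
```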
Keywords:  
Learning systems
Costs
Federated learning
Computational modeling
Simulation
Multitasking
Security
Author(s) Name:  Xingfu Yi; Rongpeng Li; Chenghui Peng; Jianjun Wu
Journal name:  
Conference name:  2022 IEEE 33rd Annual International Symposium on Personal, Indoor and Mobile Radio Communications
Publisher name:  IEEE
DOI:  10.1109/PIMRC54779.2022.9977670
Volume Information:  
Paper Link:  https://ieeexplore.ieee.org/abstract/document/9977670