FedDGA: Federated Multi-Task Learning based on Dynamic Guided Attention - 2024

Research Area:  Machine Learning

Abstract:

The proliferation of privacy-sensitive data has spurred the development of federated learning (FL), an important technology for state-of-the-art machine learning and responsible AI. However, most existing FL methods are constrained in their applicability and generalizability by their narrow focus on specific tasks. This paper presents a novel federated multi-task learning (FMTL) framework capable of acquiring knowledge across multiple tasks. To address the challenges posed by non-IID data and task imbalance in FMTL, the study proposes a federated fusion strategy based on dynamic guided attention (FedDGA), which adaptively fine-tunes local models for multiple tasks with personalized attention. In addition, a dynamic batch weight (DBW) scheme is designed to balance the task losses and improve convergence speed. Extensive experiments were conducted on various datasets, tasks, and settings, and the proposed method was compared with state-of-the-art methods such as FedAvg, FedProx, and SCAFFOLD. The results show that our method achieves significant performance gains, with up to an 11.1% increase in accuracy over the baselines.
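The abstract only sketches the high-level idea of FedDGA and DBW, so the snippet below is a minimal illustrative sketch rather than the authors' method: it shows how a federated multi-task client could combine per-task losses with dynamic weights (here based on each task's recent loss ratio, similar in spirit to dynamic weight averaging). All names (`MultiTaskClient`, `shared_encoder`, `task_heads`, `dbw_temperature`) and the specific weighting rule are assumptions, not taken from the paper.

```python
# Hypothetical sketch of dynamic task-loss weighting in a federated
# multi-task client. Not the paper's DBW formula; for illustration only.
import torch
import torch.nn.functional as F


class MultiTaskClient:
    def __init__(self, shared_encoder, task_heads, lr=1e-3, dbw_temperature=2.0):
        self.encoder = shared_encoder          # parameters shared across tasks
        self.heads = task_heads                # dict: task name -> output head
        self.temperature = dbw_temperature
        self.prev_losses = {t: None for t in task_heads}  # losses from previous step
        params = list(shared_encoder.parameters())
        for head in task_heads.values():
            params += list(head.parameters())
        self.optimizer = torch.optim.SGD(params, lr=lr)

    def _dynamic_weights(self, losses):
        """Give larger weight to tasks whose loss is decreasing more slowly."""
        ratios = []
        for task, loss in losses.items():
            prev = self.prev_losses[task]
            ratios.append(loss.detach() / prev if prev is not None else torch.tensor(1.0))
        weights = F.softmax(torch.stack(ratios) / self.temperature, dim=0)
        return {task: w for task, w in zip(losses, weights)}

    def local_step(self, batches):
        """One local update over a dict of per-task (inputs, targets) batches."""
        losses = {}
        for task, (x, y) in batches.items():
            features = self.encoder(x)
            losses[task] = F.cross_entropy(self.heads[task](features), y)

        weights = self._dynamic_weights(losses)
        total = sum(weights[t] * losses[t] for t in losses)

        self.optimizer.zero_grad()
        total.backward()
        self.optimizer.step()

        self.prev_losses = {t: losses[t].detach() for t in losses}
        return {t: losses[t].item() for t in losses}
```

In a full FL pipeline, each client would run several such local steps per round and then send its encoder (and, depending on the personalization strategy, its heads) to the server for aggregation, e.g. FedAvg-style parameter averaging.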

Keywords:  

Author(s) Name:  Haoyun Sun, Hongwei Zhao, Liang Xu, Weishan Zhang, Hongqing Guan, Su Yang

Journal name:  IEEE Transactions on Artificial Intelligence

Conference name:  

Publisher name:  IEEE

DOI:  10.1109/TAI.2024.3350538

Volume Information:  Volume 8, Pages 1-13 (2024)