Communication-Efficient Personalized Federated Meta-Learning In Edge Networks - 2023

Research Paper On Communication-Efficient Personalized Federated Meta-Learning In Edge Networks

Research Area:  Machine Learning

Abstract:

Due to the privacy-breach risks and data-aggregation costs of traditional centralized machine learning (ML) approaches, applications, data, and computing power are being pushed from centralized data centers to network edge nodes. Federated Learning (FL) is an emerging privacy-preserving distributed ML paradigm suited to edge network applications that addresses both of these issues. However, current FL methods cannot flexibly handle the challenges of model personalization and communication overhead in such applications. Inspired by the mixture of global and local models, we propose a Communication-Efficient Personalized Federated Meta-Learning algorithm that obtains a novel personalized model by introducing a personalization parameter. Model accuracy can be improved and convergence accelerated by adjusting the size of this parameter. Further, the local model to be uploaded is transformed into a latent space through an autoencoder, reducing the amount of communication data and hence the communication overhead. Local and task-global differential privacy are applied to protect model generation. Simulation experiments demonstrate that, compared with several other algorithms, our method obtains better personalized models at lower communication overhead for edge network applications.
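The abstract combines three mechanisms: mixing global and local weights via a personalization parameter, compressing the uploaded update into a latent space, and adding differential-privacy noise. A minimal sketch of how these pieces might fit together is shown below; the mixing coefficient `alpha`, the random linear encoder (a stand-in for the paper's learned autoencoder), and the Gaussian-mechanism noise parameters are all illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def personalize(global_w, local_w, alpha):
    """Mix global and local weights; alpha controls the degree of personalization."""
    return alpha * local_w + (1.0 - alpha) * global_w

def add_dp_noise(update, clip, sigma, rng):
    """Clip the update to bound sensitivity, then add Gaussian noise (Gaussian mechanism)."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip / max(norm, 1e-12))
    return clipped + rng.normal(0.0, sigma * clip, size=update.shape)

def encode(update, proj):
    """Project an update into a lower-dimensional latent space.

    A random linear map is used here as a stand-in for the paper's autoencoder;
    the client would upload only the k-dimensional latent vector."""
    return proj @ update

# Illustrative dimensions: d model weights compressed to k latent values.
d, k = 100, 20
global_w = rng.normal(size=d)
local_w = rng.normal(size=d)

personal_w = personalize(global_w, local_w, alpha=0.7)
proj = rng.normal(size=(k, d)) / np.sqrt(d)  # hypothetical linear encoder
noisy_update = add_dp_noise(personal_w - global_w, clip=1.0, sigma=0.1, rng=rng)
latent = encode(noisy_update, proj)
print(latent.shape)  # the client uploads k floats instead of d
```

The server side would decode the latent vector (e.g. with the decoder half of the autoencoder) before aggregation; the communication saving is the ratio k/d per round.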

Keywords:  

Author(s) Name:  Feng Yu, Hui Lin, Xiaoding Wang, Sahil Garg, Georges Kaddoum, Satinder Singh

Journal name:  IEEE Transactions On Network And Service Management

Conference name:  

Publisher name:  IEEE

DOI:  10.1109/TNSM.2023.3263831

Volume Information:  Volume 20, Pages 1558-1571 (2023)