Federated learning (FL) has become an active research topic in machine learning in recent years. FL is a distributed machine learning approach in which multiple devices or servers collaboratively learn from decentralized data samples without exchanging the raw data itself. Because training data is never transmitted to a central server, FL is also described as a privacy-preserving approach. In essence, FL enables multiple parties to jointly train a machine learning model while keeping their local data private. It has been employed in a wide range of applications, including medicine, IoT, transportation, defense, and mobile apps.
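As a rough illustration of this training pattern, the sketch below implements a minimal FedAvg-style loop: each client runs gradient steps on its private data, and the server only averages the resulting model weights. The linear least-squares model, learning rate, and simulated client data are illustrative assumptions, not a specific system from the literature.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    # Local gradient steps on one client's private data (least-squares loss).
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg(client_data, dim, rounds=20):
    # The server keeps a global model; clients train locally and send
    # only updated weights back -- raw data never leaves a client.
    w_global = np.zeros(dim)
    for _ in range(rounds):
        local_ws = [local_update(w_global.copy(), X, y) for X, y in client_data]
        sizes = np.array([len(y) for _, y in client_data], dtype=float)
        # Weighted average of client models (weights proportional to data size).
        w_global = np.average(local_ws, axis=0, weights=sizes)
    return w_global

# Simulated private datasets for three clients (illustrative only).
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    clients.append((X, y))

w = fedavg(clients, dim=2)
```

Note that only model parameters cross the network in each round, which is the property that motivates calling FL privacy-preserving.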
Because the data is never gathered on a single server, the model must instead learn from data held on many devices, which gives rise to data heterogeneity (non-identically distributed data). Personalizing the global model is therefore necessary to address this central challenge of FL. Personalization in federated learning equips each client with a local, personalized model and can reveal the underlying relationships among clients, for example via attention mechanisms, which improves the generality of the algorithms. Moreover, although FL preserves data privacy, adversaries can still participate in the training process and degrade model performance. Adversarial training is therefore essential in FL.
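To make the heterogeneity problem concrete, the sketch below simulates two clients whose data come from different underlying models (non-IID) and applies one simple personalization strategy: fine-tuning the shared global model on each client's local data. The strategy, model, and data here are illustrative assumptions; they are one common baseline, not the specific attention-based personalization mentioned above.

```python
import numpy as np

rng = np.random.default_rng(1)

def sgd(w, X, y, lr=0.1, steps=50):
    # Plain gradient descent on a least-squares objective.
    for _ in range(steps):
        w = w - lr * 2 * X.T @ (X @ w - y) / len(y)
    return w

# Two clients whose labels follow different true models (non-IID data).
w_a, w_b = np.array([1.0, 2.0]), np.array([3.0, -1.0])
Xa = rng.normal(size=(100, 2)); ya = Xa @ w_a
Xb = rng.normal(size=(100, 2)); yb = Xb @ w_b

# A single averaged global model is a compromise between the two clients ...
w_global = (sgd(np.zeros(2), Xa, ya) + sgd(np.zeros(2), Xb, yb)) / 2

# ... while personalization fine-tunes that global model on local data.
w_pers_a = sgd(w_global.copy(), Xa, ya)

# Local error on client A: the personalized model fits far better.
err_global = np.mean((Xa @ w_global - ya) ** 2)
err_pers = np.mean((Xa @ w_pers_a - ya) ** 2)
```

Under heterogeneous data, the averaged model sits between the two clients' optima and fits neither well, whereas a few local fine-tuning steps recover a model adapted to each client.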