Federated learning (FL) is a distributed, collaborative machine learning strategy. FL can enable a range of smart IoT applications by permitting machine learning training on distributed IoT devices without sharing the actual data. It trains across multiple decentralized edge clients and produces a global model, enabling on-device training while preserving the privacy of user data. The global model is updated by aggregating the clients' local model updates.
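The global update described above is commonly performed by sample-weighted averaging of the clients' local models, in the style of FedAvg. The following is a minimal sketch under that assumption; the function name, client weights, and data set sizes are illustrative, not taken from the source.

```python
import numpy as np

def fed_avg(local_weights, num_samples):
    """Aggregate local model updates into a global model by
    sample-weighted averaging (a FedAvg-style update; illustrative sketch)."""
    total = sum(num_samples)
    # Each client's contribution is weighted by its local data set size.
    return sum(w * (n / total) for w, n in zip(local_weights, num_samples))

# Hypothetical example: three clients report local weight vectors and sizes.
clients = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [10, 20, 70]
global_model = fed_avg(clients, sizes)  # → array([4.2, 5.2])
```

In a real deployment only these weight updates leave the device; the raw training data never does, which is the privacy property the text describes.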
FL is broadly classified into three types: horizontal FL (HFL), vertical FL (VFL), and federated transfer learning (FTL). In HFL, distributed clients jointly learn a global model from local data sets that share the same feature space but contain different samples. In VFL, clients train the global model with local data sets that cover the same samples but differ in feature space. FTL extends VFL to clients whose data sets differ in both samples and features, transferring knowledge into a shared representation to enable global training. The FL paradigm enhances data privacy and learning capability while minimizing communication latency.
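The HFL/VFL distinction above amounts to how one data set is partitioned across clients: by rows (samples) or by columns (features). A toy sketch, with entirely hypothetical records and feature names:

```python
# Hypothetical toy data set: each row is (sample_id, age, income, label).
data = [
    ("alice", 34, 55000, 1),
    ("bob",   29, 48000, 0),
    ("carol", 41, 72000, 1),
    ("dave",  25, 39000, 0),
]

# HFL: clients share the SAME features but hold DIFFERENT samples (row split).
hfl_client_a = data[:2]   # alice, bob
hfl_client_b = data[2:]   # carol, dave

# VFL: clients share the SAME samples but hold DIFFERENT features (column split).
vfl_client_a = [(sid, age) for sid, age, income, label in data]
vfl_client_b = [(sid, income, label) for sid, age, income, label in data]
```

FTL covers the remaining case, where neither the sample sets nor the feature spaces fully overlap between clients.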
Although FL offers several advantages, such as scalability and data privacy, it is not well suited to low-cost computing devices or devices with limited storage capacity. An FL system comprises three layers. The first is the device layer, which consists of the hardware devices that collect data. The second, middle layer is responsible for data transfer. The last is the application layer, which offers services and analyzes the data received from the middle layer. The main challenges in FL are data partitioning, client settings, implementation of communication schemes, data security and privacy, and data aggregation.