Research Area:  Machine Learning
The rise of connected vehicular networks (CVNs) holds promise for future intelligent transport systems, offering improvements in safety and road efficiency. However, CVNs rely on data-driven perception and driving models that require extensive knowledge to navigate complex scenarios. In vehicular networks, federated learning (FL) is vital for privacy-preserving machine learning (ML): it allows collaborative training of a single ML model across edge devices while keeping data local, thereby preserving privacy. However, scalability remains a challenge, especially for large ML models, and FL can yield suboptimal results when local data distributions diverge. We present a robust and efficient Fed-aided multi-task temporal clustering (FeMTC) knowledge-sharing framework tailored to the demands of highly distributed vehicular networks. Our approach quantifies the temporal similarity between pairs of client vectors, groups clients with high similarity at the edge server, and trains each cluster independently under single-task and multi-task cluster learning. Experiments show that FeMTC achieves faster convergence and up to 15% better performance than existing methods in some scenarios. It combines easily with other methods for further gains and remains robust across various non-independent and identically distributed (non-IID) scenarios.
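The abstract describes grouping clients by the temporal similarity of their update vectors before cluster-wise training. Below is a minimal illustrative sketch of that general idea, not the authors' FeMTC implementation: it assumes cosine similarity over each client's stacked per-round updates and average-linkage hierarchical clustering, both of which are placeholder choices since the paper's exact metric and grouping rule are not given here.

```python
# Sketch only: temporal-similarity-based client grouping for clustered FL.
# The similarity metric and clustering method are illustrative assumptions,
# not the paper's exact FeMTC procedure.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform


def temporal_similarity(u_i: np.ndarray, u_j: np.ndarray) -> float:
    """Cosine similarity between two clients' flattened update histories.

    u_i, u_j: arrays of shape (rounds, model_dim) holding each client's
    model updates over a window of communication rounds.
    """
    a, b = u_i.ravel(), u_j.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))


def cluster_clients(updates: list[np.ndarray], num_clusters: int = 3) -> np.ndarray:
    """Group clients whose temporal update patterns are most alike."""
    n = len(updates)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d = 1.0 - temporal_similarity(updates[i], updates[j])
            dist[i, j] = dist[j, i] = d
    # Average-linkage hierarchical clustering on the pairwise distance matrix;
    # each resulting cluster would then train its own (single- or multi-task) model.
    labels = fcluster(linkage(squareform(dist), method="average"),
                      t=num_clusters, criterion="maxclust")
    return labels  # cluster id per client


# Toy usage: 8 simulated clients, each with 5 rounds of 10-dimensional updates.
rng = np.random.default_rng(0)
client_updates = [rng.normal(size=(5, 10)) for _ in range(8)]
print(cluster_clients(client_updates, num_clusters=3))
```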
Keywords:  
Author(s) Name:  Muhammad Waqas Nawaz, Muhammad Ali Imran, Olaoluwa Popoola
Journal name:  
Conference name:  IEEE Wireless Communications and Networking Conference (WCNC)
Publisher name:  IEEE
DOI:  10.1109/WCNC57260.2024.10570723
Volume Information:  Volume 7 (2024)
Paper Link:   https://ieeexplore.ieee.org/document/10570723