In this article, we propose FedSSA, a novel approach that addresses the challenges of federated learning in heterogeneous settings. The objective is to minimize the sum of the losses of all clients' local models. To achieve this, FedSSA uses semantic similarity-based classification header parameter aggregation for local-to-global knowledge transfer, and adaptive parameter stabilization for global-to-local knowledge transfer. Together, these mechanisms produce personalized heterogeneous local models that can be used for tasks such as image classification.
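To make the local-to-global step concrete, here is a minimal server-side sketch of semantic similarity-based header aggregation: the classification header holds one weight vector per class, and the parameters for a given class are averaged only across clients whose local data contain that class. The function name, the dictionary-based inputs, and the use of a plain (unweighted) mean are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def aggregate_headers_by_class(client_headers, client_classes, num_classes, dim):
    """Hypothetical sketch of per-class header aggregation.

    client_headers: dict mapping client id -> (num_classes, dim) header weights
    client_classes: dict mapping client id -> set of classes seen locally
    """
    global_header = np.zeros((num_classes, dim))
    for c in range(num_classes):
        # Collect the class-c weight vector only from clients that hold class c,
        # i.e. clients whose headers carry the same semantics for this class.
        rows = [client_headers[k][c] for k in client_headers
                if c in client_classes[k]]
        if rows:  # leave the row at zero if no client has seen class c
            global_header[c] = np.mean(rows, axis=0)
    return global_header
```

Because only the (structurally homogeneous) header is exchanged, clients can keep heterogeneous feature extractors while still sharing class-level knowledge.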
Our approach achieves higher accuracy and lower communication cost than other federated learning algorithms across a range of scenarios, making it an effective solution when client data are non-IID (not independent and identically distributed). We also show that FedSSA can realize different trade-offs among accuracy, communication cost, and computational cost by adjusting the hyperparameter η, so its behavior can be tuned to the requirements of a specific deployment.
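The following client-side sketch illustrates one plausible form of adaptive parameter stabilization and how a hyperparameter like η could steer it: the received global header is blended with the client's own historical header, with a mixing coefficient that grows over rounds so that early, noisier global headers perturb the personalized model less. The exponential schedule and the exact role of η here are assumptions for illustration, not the paper's precise rule.

```python
import numpy as np

def stabilize_header(local_header, global_header, round_t, eta=0.5):
    """Hypothetical global-to-local fusion sketch.

    mu rises toward 1 as training progresses, shifting weight from the
    client's historical header to the aggregated global header; eta sets
    how quickly the schedule saturates (assumed schedule).
    """
    mu = 1.0 - np.exp(-eta * round_t)
    return mu * global_header + (1.0 - mu) * local_header
```

Under this reading, a larger η trusts the global header sooner (fewer rounds to converge, but less stability early on), while a smaller η keeps local personalization dominant for longer.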
In summary, FedSSA combines semantic similarity-based classification header parameter aggregation with adaptive parameter stabilization to create personalized local models tailored to each client's data distribution. It outperforms competing algorithms across the scenarios we evaluate, making it a strong choice for federated learning over heterogeneous, non-IID data.