National Preprint Platform (国家预印本平台)

Lazy But Effective: Collaborative Personalized Federated Learning with Heterogeneous Data


Source: arXiv

Abstract

In Federated Learning, heterogeneity in client data distributions often means that a single global model does not achieve the best performance for individual clients. Consider, for example, training a next-word prediction model for keyboards: user-specific language patterns due to demographics (dialect, age, etc.), language proficiency, and writing style result in highly non-IID data across clients. Other examples are medical images taken with different machines, or driving data from different vehicle types. To address this, we propose a simple yet effective personalized federated learning framework (pFedLIA) that uses a computationally efficient influence approximation, called "Lazy Influence", to cluster clients in a distributed manner before model aggregation. Within each cluster, data owners collaborate to jointly train a model that captures the specific data patterns of those clients. Our method successfully recovers the global model's performance drop due to non-IID-ness in various synthetic and real-world settings, specifically a next-word prediction task on the Nordic languages as well as several benchmark tasks. It matches the performance of a hypothetical Oracle clustering and significantly improves on existing baselines, e.g., a 17% improvement on CIFAR100.
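The pipeline described in the abstract (estimate pairwise client affinity cheaply, cluster clients, then aggregate per cluster) can be sketched as follows. This is an illustrative sketch only: `client_signature`, `influence_approx`, and the greedy threshold clustering below are placeholder choices, not the paper's actual Lazy Influence estimator or clustering rule.

```python
def client_signature(data):
    # Placeholder stand-in for a cheap per-client statistic:
    # the mean feature vector of the client's local data.
    n, dim = len(data), len(data[0])
    return [sum(x[i] for x in data) / n for i in range(dim)]

def influence_approx(sig_a, sig_b):
    # Hypothetical proxy for a "lazy" influence score: negative
    # squared distance between signatures (higher = more compatible).
    return -sum((a - b) ** 2 for a, b in zip(sig_a, sig_b))

def cluster_clients(signatures, threshold):
    # Greedy clustering: a client joins the first cluster whose
    # representative it is sufficiently compatible with; otherwise
    # it starts a new cluster. Models would then be aggregated
    # (e.g., FedAvg) separately within each cluster.
    clusters = []
    for idx, sig in enumerate(signatures):
        for cluster in clusters:
            if influence_approx(sig, signatures[cluster[0]]) >= threshold:
                cluster.append(idx)
                break
        else:
            clusters.append([idx])
    return clusters

# Two groups of clients with clearly different local statistics
# end up in two separate clusters.
sigs = [[0.0], [0.1], [5.0], [5.1]]
groups = cluster_clients(sigs, threshold=-1.0)
```

In this toy run, clients 0 and 1 form one cluster and clients 2 and 3 another, so each cluster's jointly trained model only sees compatible data distributions.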

Boi Faltings, Panayiotis Danassis, Ljubomir Rokvic

Subject categories: Computing Technology, Computer Technology

Boi Faltings, Panayiotis Danassis, Ljubomir Rokvic. Lazy But Effective: Collaborative Personalized Federated Learning with Heterogeneous Data [EB/OL]. (2025-05-05) [2025-05-18]. https://arxiv.org/abs/2505.02540
