
Personalized Federated Learning under Model Dissimilarity Constraints

Source: arXiv

Abstract

One of the defining challenges in federated learning is statistical heterogeneity among clients. We address this problem with KARULA, a regularized strategy for personalized federated learning that constrains the pairwise model dissimilarities between clients based on the difference in their distributions, as measured by a surrogate for the 1-Wasserstein distance adapted to the federated setting. This allows the strategy to adapt to highly complex interrelations between clients that clustered approaches, for example, fail to capture. We propose an inexact projected stochastic gradient algorithm to solve the constrained problem that the strategy defines, and show theoretically that, for smooth and possibly non-convex losses, it converges to a neighborhood of a stationary point at rate O(1/K). We demonstrate the effectiveness of KARULA on synthetic and real federated data sets.
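The core computational idea described above, a stochastic gradient step per client followed by an inexact projection onto pairwise dissimilarity constraints, can be sketched as follows. This is only an illustrative reconstruction, not the paper's actual KARULA algorithm: the function names, the Euclidean form of the constraint `||theta_i - theta_j|| <= d_ij`, and the use of alternating-projection sweeps as the "inexact" projection are all assumptions for the sketch.

```python
import numpy as np

def project_pair(a, b, d):
    """Closed-form projection of (a, b) onto {(a, b) : ||a - b|| <= d}:
    if too far apart, both points move symmetrically toward each other."""
    diff = a - b
    dist = np.linalg.norm(diff)
    if dist <= d:
        return a, b
    u = diff / dist
    shift = (dist - d) / 2.0
    return a - shift * u, b + shift * u

def projected_sgd_step(models, grads, D, lr=0.1, sweeps=5):
    """One (stochastic) gradient step per client model, followed by an
    inexact projection onto all pairwise constraints ||m_i - m_j|| <= D[i][j],
    approximated by a fixed number of alternating-projection sweeps."""
    models = [m - lr * g for m, g in zip(models, grads)]
    n = len(models)
    for _ in range(sweeps):  # fixed sweep count makes the projection inexact
        for i in range(n):
            for j in range(i + 1, n):
                models[i], models[j] = project_pair(models[i], models[j], D[i][j])
    return models
```

For example, two client models at distance 4 with a dissimilarity budget of 1 are pulled symmetrically to distance 1 after the projection step. In the paper, the budgets `D[i][j]` would be derived from a 1-Wasserstein surrogate between client distributions; here they are just given constants.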

Samuel Erickson, Mikael Johansson

Subject: Computing Technology; Computer Technology

Samuel Erickson, Mikael Johansson. Personalized Federated Learning under Model Dissimilarity Constraints [EB/OL]. (2025-05-12) [2025-06-26]. https://arxiv.org/abs/2505.07575.
