
GC-Fed: Gradient Centralized Federated Learning with Partial Client Participation

Source: arXiv
Abstract

Federated Learning (FL) enables privacy-preserving multi-source information fusion (MSIF) but is challenged by client drift in highly heterogeneous data settings. Many existing drift-mitigation strategies rely on reference-based techniques, such as gradient adjustments or proximal loss, that use historical snapshots (e.g., past gradients or previous global models) as reference points. When only a subset of clients participates in each training round, these historical references may not accurately capture the overall data distribution, leading to unstable training. In contrast, our proposed Gradient Centralized Federated Learning (GC-Fed) employs a hyperplane as a historically independent reference point to guide local training and enhance inter-client alignment. GC-Fed comprises two complementary components: Local GC, which centralizes gradients during local training, and Global GC, which centralizes updates during server aggregation. In our hybrid design, Local GC is applied to feature-extraction layers to harmonize client contributions, while Global GC refines classifier layers to stabilize round-wise performance. Theoretical analysis and extensive experiments on benchmark FL tasks demonstrate that GC-Fed effectively mitigates client drift and achieves up to a 20% improvement in accuracy under heterogeneous and partial participation conditions.
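For readers unfamiliar with Gradient Centralization, the following PyTorch sketch illustrates how the two components described in the abstract might be realized. It is a minimal illustration of the idea under stated assumptions, not the authors' implementation; the helper names (`centralize`, `local_gc_step`, `global_gc_aggregate`, `feature_param_names`, `classifier_param_names`) are hypothetical and introduced here for clarity.

```python
import torch


def centralize(t: torch.Tensor) -> torch.Tensor:
    # Gradient Centralization: subtract the mean taken over every axis
    # except the first (output-channel) axis, so each slice has zero mean.
    # Vectors (1-D tensors, e.g. biases) are left unchanged.
    if t.dim() > 1:
        return t - t.mean(dim=tuple(range(1, t.dim())), keepdim=True)
    return t


def local_gc_step(model, optimizer, loss_fn, x, y, feature_param_names):
    # Local GC (client side, hypothetical sketch): centralize gradients of
    # the feature-extraction parameters before each optimizer step.
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()
    for name, p in model.named_parameters():
        if p.grad is not None and name in feature_param_names:
            p.grad = centralize(p.grad)
    optimizer.step()


def global_gc_aggregate(client_updates, classifier_param_names):
    # Global GC (server side, hypothetical sketch): average the updates of
    # the participating clients (FedAvg-style), then centralize the
    # aggregated update for classifier parameters only.
    agg = {}
    for name in client_updates[0]:
        mean_update = torch.stack([u[name] for u in client_updates]).mean(dim=0)
        if name in classifier_param_names:
            mean_update = centralize(mean_update)
        agg[name] = mean_update
    return agg
```

In this sketch, Local GC touches only feature-extractor gradients during client training, while Global GC is applied once per round to the averaged classifier update, mirroring the hybrid split the abstract describes.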

Ferhat Ozgur Catak, Chunming Rong, Kibeom Hong, Minhoe Kim, Jungwon Seo

Computing Technology, Computer Technology

Ferhat Ozgur Catak, Chunming Rong, Kibeom Hong, Minhoe Kim, Jungwon Seo. GC-Fed: Gradient Centralized Federated Learning with Partial Client Participation [EB/OL]. (2025-03-17) [2025-05-22]. https://arxiv.org/abs/2503.13180
