
TinyProto: Communication-Efficient Federated Learning with Sparse Prototypes in Resource-Constrained Environments

Source: arXiv
Abstract

Communication efficiency in federated learning (FL) remains a critical challenge for resource-constrained environments. While prototype-based FL reduces communication overhead by sharing class prototypes (mean activations in the penultimate layer) instead of model parameters, its efficiency decreases with larger feature dimensions and class counts. We propose TinyProto, which addresses these limitations through Class-wise Prototype Sparsification (CPS) and adaptive prototype scaling. CPS enables structured sparsity by allocating specific dimensions to class prototypes and transmitting only non-zero elements, while adaptive scaling adjusts prototypes based on class distributions. Our experiments show TinyProto reduces communication costs by up to 4x compared to existing methods while maintaining performance. Beyond its communication efficiency, TinyProto offers crucial advantages: achieving compression without client-side computational overhead and supporting heterogeneous architectures, making it ideal for resource-constrained heterogeneous FL.
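To make the abstract's description of CPS concrete, below is a minimal, hypothetical sketch in Python/NumPy of class-wise dimension allocation, sparsification to non-zero elements, and count-based prototype scaling. The allocation rule, the scaling formula, and all function and variable names are illustrative assumptions based only on the abstract, not the authors' implementation.

```python
# Hypothetical sketch of Class-wise Prototype Sparsification (CPS) as described
# in the abstract: each class gets a disjoint block of prototype dimensions,
# only those entries are kept, and just the non-zero elements (indices plus
# values) are transmitted. Dimension allocation and scaling here are assumed.
import numpy as np

def allocate_dimensions(num_classes: int, feature_dim: int) -> dict:
    """Assign each class a fixed, disjoint slice of the feature dimensions."""
    per_class = feature_dim // num_classes
    return {c: np.arange(c * per_class, (c + 1) * per_class)
            for c in range(num_classes)}

def sparsify_prototype(prototype: np.ndarray, dims: np.ndarray):
    """Keep only the class's allocated dimensions; return (indices, values)."""
    return dims, prototype[dims]

def scale_prototype(values: np.ndarray, class_count: int, total_count: int) -> np.ndarray:
    """Illustrative adaptive scaling: weight a prototype by its class frequency."""
    return values * (class_count / total_count)

# --- tiny usage example ---------------------------------------------------
feature_dim, num_classes = 512, 10
rng = np.random.default_rng(0)
prototypes = {c: rng.normal(size=feature_dim) for c in range(num_classes)}
class_counts = {c: int(rng.integers(10, 100)) for c in range(num_classes)}
total = sum(class_counts.values())

dims_by_class = allocate_dimensions(num_classes, feature_dim)
payload = {}
for c, proto in prototypes.items():
    idx, vals = sparsify_prototype(proto, dims_by_class[c])
    payload[c] = (idx, scale_prototype(vals, class_counts[c], total))

dense_floats = num_classes * feature_dim
sent_floats = sum(len(vals) for _, vals in payload.values())
print(f"dense: {dense_floats} floats, sparse: {sent_floats} floats "
      f"({dense_floats / sent_floats:.1f}x smaller)")
```

Because each class keeps only its allocated block, the per-round payload shrinks roughly by a factor of the number of classes in this toy setup, which is consistent with the abstract's claim that compression is achieved without extra client-side computation.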

Gyuejeong Lee, Daeyoung Choi

Subject: Communication

Gyuejeong Lee, Daeyoung Choi. TinyProto: Communication-Efficient Federated Learning with Sparse Prototypes in Resource-Constrained Environments [EB/OL]. (2025-07-06) [2025-07-19]. https://arxiv.org/abs/2507.04327.