
A self-distillation-based straggler-adaptive federated learning method
(基于自蒸馏的掉队者自适应联邦学习方法)

姚文斌 ¹, 崔浩然 ¹

Author information

  • 1. School of Computer Science (National Pilot Software Engineering School), Beijing University of Posts and Telecommunications, Beijing 100876, China

Abstract

Federated Learning facilitates collaborative model training while preserving data privacy, yet it remains hampered by system heterogeneity (the "straggler problem") and data heterogeneity (non-IID distributions). This paper introduces a self-distillation-based straggler-adaptive federated learning method to mitigate these challenges. The method categorizes clients into "stragglers" and "leaders" based on their hardware and communication constraints. Stragglers are assigned lightweight classifiers, while leaders employ deep classifiers to transfer knowledge to stragglers via self-distillation. This mechanism ensures that stragglers contribute meaningfully to the global model without stalling the process. Furthermore, the proportion of leaders is incrementally increased as global accuracy improves to further optimize training potential.
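
The abstract describes the training mechanism only at a high level. Below is a minimal, hedged PyTorch sketch of one plausible reading: a shared backbone with two classifier heads, where stragglers update only a lightweight head matched to their capacity, leaders update a deep head and self-distill its softened logits into the lightweight head, and the leader share grows with global accuracy. All names and hyperparameters here (MultiHeadNet, local_loss, leader_fraction, alpha, tau) are illustrative assumptions, not the paper's actual implementation.

    # Illustrative sketch reconstructed from the abstract; not the authors' code.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MultiHeadNet(nn.Module):
        """Shared backbone with a lightweight head (for stragglers) and a
        deep head (for leaders)."""
        def __init__(self, num_classes: int = 10):
            super().__init__()
            self.backbone = nn.Sequential(
                nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU())
            self.light_head = nn.Linear(128, num_classes)
            self.deep_head = nn.Sequential(
                nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, num_classes))

        def forward(self, x):
            h = self.backbone(x)
            return self.light_head(h), self.deep_head(h)

    def local_loss(model, x, y, is_leader, alpha=0.5, tau=2.0):
        """Stragglers train only the capacity-matched lightweight head;
        leaders train both heads and self-distill the deep head's softened
        logits into the lightweight one."""
        light_logits, deep_logits = model(x)
        if not is_leader:
            return F.cross_entropy(light_logits, y)
        ce = F.cross_entropy(deep_logits, y) + F.cross_entropy(light_logits, y)
        kd = F.kl_div(F.log_softmax(light_logits / tau, dim=1),
                      F.softmax(deep_logits.detach() / tau, dim=1),
                      reduction="batchmean") * tau * tau
        return ce + alpha * kd

    def leader_fraction(global_acc, base=0.2, top=0.8):
        """Hypothetical schedule: grow the leader share with the global
        model's accuracy, as the abstract suggests."""
        return min(top, base + (top - base) * float(global_acc))

Detaching the deep head's logits in the distillation term keeps the gradient from flowing back into the teacher head, a common design choice in self-distillation setups.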

Key words

Federated learning; self-distillation; multi-classifiers; heterogeneity

Citation

姚文斌, 崔浩然. 基于自蒸馏的掉队者自适应联邦学习方法 (A self-distillation-based straggler-adaptive federated learning method) [EB/OL]. (2026-02-06) [2026-02-08]. http://www.paper.edu.cn/releasepaper/content/202602-48.

Subject classification

Computing technology; computer technology

First published: 2026-02-06