
基于双层知识蒸馏的联邦学习方法

A Federated Learning Method Based on Two-Layer Knowledge Distillation

Abstract

Federated learning is a machine learning training paradigm designed to protect user privacy: a global model is trained without requiring users to share their data. In practice, however, federated learning faces the problem of data heterogeneity, i.e., the data on clients are usually non-uniformly distributed, which substantially reduces the accuracy of the trained model and hinders real-world deployment. This paper uses knowledge distillation to optimize federated learning at two layers, the server and the clients, so that both global aggregation on the server and local training on the clients can absorb more global knowledge, reducing the knowledge forgetting caused by data heterogeneity and improving the overall performance of federated learning from the perspective of knowledge transfer. The paper also simulates data-heterogeneous scenarios experimentally and compares the proposed two-layer knowledge-distillation federated learning method with current mainstream federated learning methods. The experiments show that the model trained with the proposed method has clear advantages in accuracy and convergence speed, offering a new approach to applying federated learning under data heterogeneity.
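The abstract describes distillation applied at two layers (server-side global aggregation and client-side local training) but does not give a concrete formulation. As a rough illustration of the client-side layer only, below is a minimal PyTorch sketch, not the authors' method: it assumes a Hinton-style soft-label distillation loss in which the frozen global model serves as the teacher during local training, and the names local_distill_step, kd_weight, and temperature are hypothetical.

```python
# Minimal sketch (assumed, not the paper's implementation): client-side distillation
# in federated learning. The frozen global model acts as a teacher so that local
# training on skewed data does not forget global knowledge.
import torch
import torch.nn.functional as F

def local_distill_step(local_model, global_model, batch, optimizer,
                       kd_weight=0.5, temperature=2.0):
    """One local update: cross-entropy on the client's labels plus a
    KL-divergence term pulling the local model toward the global model."""
    inputs, labels = batch
    local_model.train()
    global_model.eval()  # frozen teacher; its parameters are not updated

    logits = local_model(inputs)
    with torch.no_grad():
        teacher_logits = global_model(inputs)

    # Supervised loss on the client's (possibly non-IID) local data.
    ce_loss = F.cross_entropy(logits, labels)

    # Soft-label distillation loss, scaled by T^2 as is conventional.
    kd_loss = F.kl_div(
        F.log_softmax(logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)

    loss = ce_loss + kd_weight * kd_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

A server-side counterpart, which the "two-layer" wording suggests, would analogously inject distilled knowledge into the aggregated global model during aggregation; the abstract does not specify how this is done, so no sketch is given for that layer.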

李丽香, 宋杰 (Li Lixiang, Song Jie)

Computing Technology, Computer Technology

Data Security and Computer Security; Privacy Protection; Federated Learning; Knowledge Distillation

李丽香, 宋杰. 基于双层知识蒸馏的联邦学习方法 [EB/OL]. (2024-03-15) [2025-08-24]. http://www.paper.edu.cn/releasepaper/content/202403-186.
