
Federated Learning on Non-iid Data via Local and Global Distillation


Source: arXiv
Abstract

Most existing federated learning algorithms are based on the vanilla FedAvg scheme. However, as data complexity and the number of model parameters grow, the communication traffic and the number of training rounds required by such algorithms increase significantly, especially in non-independent and identically distributed (non-IID) scenarios, where they fail to achieve satisfactory performance. In this work, we propose FedND: federated learning with noise distillation. The main idea is to use knowledge distillation to optimize the model training process. On the client, we propose a self-distillation method to train the local model. On the server, we generate noisy samples for each client and use them to distill the other clients. Finally, the global model is obtained by aggregating the local models. Experimental results show that the algorithm achieves the best performance and is more communication-efficient than state-of-the-art methods.
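
The abstract only outlines the algorithm at a high level; the following is a minimal, hypothetical PyTorch sketch of that flow, not the authors' implementation. It assumes a toy classifier, standard KL-based distillation, a frozen snapshot of the local model as the self-distillation teacher, Gaussian noise as the server-side noisy samples, and FedAvg-style weight averaging. Names such as SmallNet, client_update, and server_noise_distillation are illustrative.

    # Hypothetical FedND-style round: local self-distillation, server-side
    # noise distillation across clients, then FedAvg aggregation.
    import copy
    import torch
    import torch.nn.functional as F
    from torch import nn

    def distill_loss(student_logits, teacher_logits, T=2.0):
        # Standard KL-divergence knowledge-distillation loss with temperature T.
        return F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)

    class SmallNet(nn.Module):
        # Stand-in classifier; any model architecture could be used here.
        def __init__(self, dim=32, classes=10):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, classes))
        def forward(self, x):
            return self.net(x)

    def client_update(model, data, labels, epochs=1, alpha=0.5):
        # Local training with self-distillation: a frozen snapshot of the
        # client's own model serves as the teacher while it trains on local data.
        teacher = copy.deepcopy(model).eval()
        opt = torch.optim.SGD(model.parameters(), lr=0.01)
        for _ in range(epochs):
            opt.zero_grad()
            logits = model(data)
            with torch.no_grad():
                t_logits = teacher(data)
            loss = F.cross_entropy(logits, labels) + alpha * distill_loss(logits, t_logits)
            loss.backward()
            opt.step()
        return model

    def server_noise_distillation(models, dim=32, n_noise=64, steps=1):
        # Server side: draw noisy samples and use the other clients' models
        # (averaged here) as teachers to distill each local model.
        noise = torch.randn(n_noise, dim)
        for i, student in enumerate(models):
            opt = torch.optim.SGD(student.parameters(), lr=0.01)
            teachers = [m for j, m in enumerate(models) if j != i]
            for _ in range(steps):
                opt.zero_grad()
                s_logits = student(noise)
                with torch.no_grad():
                    t_logits = torch.stack([t(noise) for t in teachers]).mean(0)
                distill_loss(s_logits, t_logits).backward()
                opt.step()
        return models

    def aggregate(models):
        # FedAvg-style aggregation: average the distilled local model weights.
        global_state = copy.deepcopy(models[0].state_dict())
        for key in global_state:
            global_state[key] = torch.stack(
                [m.state_dict()[key].float() for m in models]
            ).mean(0)
        return global_state

    # One communication round with toy (randomly generated) client data.
    clients = [SmallNet() for _ in range(3)]
    for c in clients:
        x, y = torch.randn(16, 32), torch.randint(0, 10, (16,))
        client_update(c, x, y)
    clients = server_noise_distillation(clients)
    global_weights = aggregate(clients)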

Longfei Zheng, Fei Zheng, Senci Ying, Chaochao Chen, Jianwei Yin, Fengqin Dong, Xiaolin Zheng

DOI: 10.1109/ICWS60048.2023.00083

Subject: Computing Technology, Computer Technology

Longfei Zheng, Fei Zheng, Senci Ying, Chaochao Chen, Jianwei Yin, Fengqin Dong, Xiaolin Zheng. Federated Learning on Non-iid Data via Local and Global Distillation [EB/OL]. (2023-06-26) [2025-07-16]. https://arxiv.org/abs/2306.14443
