
Efficient Federated Learning with Timely Update Dissemination


Source: arXiv
Abstract

Federated Learning (FL) has emerged as a compelling methodology for the management of distributed data, marked by significant advancements in recent years. In this paper, we propose an efficient FL approach that capitalizes on additional downlink bandwidth resources to ensure timely update dissemination. Initially, we implement this strategy within an asynchronous framework, introducing the Asynchronous Staleness-aware Model Update (FedASMU), which integrates both server-side and device-side methodologies. On the server side, we present an asynchronous FL system model that employs a dynamic model aggregation technique, which harmonizes local model updates with the global model to enhance both accuracy and efficiency. Concurrently, on the device side, we propose an adaptive model adjustment mechanism that integrates the latest global model with local models during training to further elevate accuracy. Subsequently, we extend this approach to a synchronous context, referred to as FedSSMU. Theoretical analyses substantiate the convergence of our proposed methodologies. Extensive experiments, encompassing six models and five public datasets, demonstrate that FedASMU and FedSSMU significantly surpass baseline methods in terms of both accuracy (up to 145.87%) and efficiency (up to 97.59%).
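The abstract describes two complementary mechanisms: server-side staleness-aware aggregation of asynchronous local updates into the global model, and device-side adjustment that merges the latest global model (received over the extra downlink bandwidth) into the in-progress local model. A minimal sketch of both ideas, using a simple illustrative staleness-decay weight and hypothetical function names (the paper's exact formulas and hyperparameters are not given in the abstract):

```python
def staleness_weight(staleness, alpha=0.6):
    # Hypothetical decay rule: a fresh update (staleness 0) receives
    # weight alpha; older updates are discounted. This is an
    # illustrative choice, not the formula from the paper.
    return alpha / (staleness + 1)

def server_aggregate(global_w, local_w, staleness, alpha=0.6):
    # Server side: blend a (possibly stale) local model into the
    # global model, weighted by how stale the update is.
    w = staleness_weight(staleness, alpha)
    return [(1 - w) * g + w * l for g, l in zip(global_w, local_w)]

def device_adjust(local_w, fresh_global_w, beta=0.3):
    # Device side: pull the in-progress local model toward the
    # latest global model pushed over the downlink during training.
    return [(1 - beta) * l + beta * g for l, g in zip(local_w, fresh_global_w)]
```

For example, with `alpha=0.6` a fresh update is mixed at weight 0.6, while an update two rounds stale is mixed at only 0.2, so timely dissemination directly increases each device's influence on the global model.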

Juncheng Jia, Ji Liu, Chao Huo, Yihui Shen, Yang Zhou, Huaiyu Dai, Dejing Dou

Subject: Computing Technology, Computer Technology

Juncheng Jia, Ji Liu, Chao Huo, Yihui Shen, Yang Zhou, Huaiyu Dai, Dejing Dou. Efficient Federated Learning with Timely Update Dissemination [EB/OL]. (2025-07-08) [2025-07-23]. https://arxiv.org/abs/2507.06031.
