AMSFL: Adaptive Multi-Step Federated Learning via Gradient Difference-Based Error Modeling
Federated learning faces critical challenges in balancing communication efficiency and model accuracy. One key issue is approximating local update errors without incurring high computational cost. In this paper, we propose a lightweight yet effective method called Gradient Difference Approximation (GDA), which leverages first-order information to estimate local error trends without computing the full Hessian matrix. The proposed method forms a key component of the Adaptive Multi-Step Federated Learning (AMSFL) framework and provides a unified error modeling strategy for large-scale multi-step adaptive training environments.
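The core idea behind a gradient-difference approximation can be sketched as follows. For a loss with gradient g, the second-order effect of an update step dw satisfies g(w + dw) − g(w) ≈ H·dw, so curvature information is obtained from two gradient evaluations without forming the Hessian H. The sketch below is illustrative only (the quadratic loss, the function `grad`, and all variable names are assumptions, not the paper's implementation); for a quadratic loss the identity is exact, which makes it easy to verify:

```python
import numpy as np

# Illustrative sketch of the gradient-difference idea: for a loss with
# gradient g, the curvature effect of a step dw is estimated by
# g(w + dw) - g(w) ~= H @ dw, with no explicit Hessian computation.

def grad(w, A, b):
    """Gradient of the quadratic loss 0.5 * w^T A w - b^T w (so H = A)."""
    return A @ w - b

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = A @ A.T + np.eye(5)              # symmetric positive-definite Hessian
b = rng.standard_normal(5)

w = rng.standard_normal(5)           # current local model parameters
dw = 1e-3 * rng.standard_normal(5)   # a small local update step

gda_estimate = grad(w + dw, A, b) - grad(w, A, b)  # two gradient calls
exact = A @ dw                                     # true Hessian-vector product

print(np.allclose(gda_estimate, exact))  # True: exact for a quadratic loss
```

For non-quadratic losses the difference is only a first-order estimate of H·dw, but it costs one extra gradient evaluation per step, which is what makes the approach attractive for multi-step federated training.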
Ganglou Xu
Computing Technology, Computer Technology
Ganglou Xu. AMSFL: Adaptive Multi-Step Federated Learning via Gradient Difference-Based Error Modeling [EB/OL]. (2025-05-27) [2025-07-16]. https://arxiv.org/abs/2505.21695.