
Distributed Online Randomized Gradient-Free optimization with Compressed Communication

Source: arXiv
Abstract

This paper addresses two fundamental challenges in distributed online convex optimization: communication efficiency and optimization under limited feedback. We propose Online Compressed Gradient Tracking with one-point Bandit Feedback (OCGT-BF), a novel algorithm that harnesses data compression and gradient-free optimization techniques in distributed networks. Our algorithm incorporates a compression scheme with an error compensation mechanism to reduce communication overhead while maintaining convergence guarantees. Unlike traditional approaches that assume perfect communication and full gradient access, OCGT-BF operates effectively under practical constraints by combining gradient tracking with one-point feedback estimation. We provide theoretical analysis establishing dynamic regret bounds under both bandit feedback and stochastic gradient scenarios. Finally, extensive experiments validate that OCGT-BF achieves low dynamic regret while significantly reducing communication requirements.
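The two building blocks the abstract names, one-point bandit gradient estimation and compressed communication with error compensation, are standard techniques that can be sketched independently of the paper's specific algorithm. The sketch below is illustrative only: the top-k compressor, the Gaussian-direction estimator, and all parameter names are assumptions, not details taken from OCGT-BF.

```python
import numpy as np

def one_point_gradient_estimate(f, x, delta, rng):
    """One-point bandit feedback: query f at a single perturbed point
    x + delta*u and return (d/delta) * f(x + delta*u) * u, which in
    expectation approximates the gradient of a smoothed version of f.
    (Standard estimator; the paper's exact form may differ.)"""
    d = x.shape[0]
    u = rng.normal(size=d)
    u /= np.linalg.norm(u)  # uniform direction on the unit sphere
    return (d / delta) * f(x + delta * u) * u

def top_k_compress(v, k):
    """Example compressor: keep only the k largest-magnitude entries."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

class ErrorCompensatedChannel:
    """Compressed link with error feedback: the residual discarded by
    the compressor is stored locally and re-added before the next send,
    so compression error is compensated over time rather than lost."""
    def __init__(self, dim, k):
        self.residual = np.zeros(dim)
        self.k = k

    def send(self, v):
        corrected = v + self.residual      # re-inject past error
        msg = top_k_compress(corrected, self.k)
        self.residual = corrected - msg    # remember what was dropped
        return msg
```

A useful invariant of error compensation: over any number of rounds, the sum of transmitted messages plus the current residual equals the sum of the original vectors, so no information is permanently discarded.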

Longkang Zhu, Xinli Shi, Xiangping Xu, Jinde Cao

Communications

Longkang Zhu, Xinli Shi, Xiangping Xu, Jinde Cao. Distributed Online Randomized Gradient-Free optimization with Compressed Communication [EB/OL]. (2025-04-30) [2025-06-13]. https://arxiv.org/abs/2504.21693.
