
Fed-DPRoC: Communication-Efficient Differentially Private and Robust Federated Learning

Source: arXiv
English Abstract

We propose Fed-DPRoC, a novel federated learning framework that simultaneously ensures differential privacy (DP), Byzantine robustness, and communication efficiency. We introduce the concept of robust-compatible compression, which enables users to compress DP-protected updates while maintaining the robustness of the aggregation rule. We instantiate our framework as RobAJoL, combining the Johnson-Lindenstrauss (JL) transform for compression with robust averaging for robust aggregation. We theoretically prove the compatibility of JL transform with robust averaging and show that RobAJoL preserves robustness guarantees, ensures DP, and reduces communication cost. Experiments on CIFAR-10 and Fashion MNIST validate our theoretical claims and demonstrate that RobAJoL outperforms existing methods in terms of robustness and utility under different Byzantine attacks.
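The abstract describes combining a Johnson-Lindenstrauss (JL) transform for compressing user updates with a robust averaging rule for Byzantine-resilient aggregation. The paper's exact RobAJoL algorithm is not reproduced here; the following is a minimal illustrative sketch, assuming a shared Gaussian JL sketch matrix across users and a coordinate-wise trimmed mean as the robust averaging rule (function names and parameters are hypothetical).

```python
import numpy as np

def jl_compress(updates, k, seed=0):
    """Project d-dimensional user updates down to k dimensions with a
    shared Gaussian JL matrix (same seed for all users, so the server
    can aggregate in the compressed domain)."""
    d = updates.shape[1]
    rng = np.random.default_rng(seed)
    # Entries scaled by 1/sqrt(k) so squared norms are preserved in expectation.
    S = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, d))
    return updates @ S.T  # shape: (num_users, k)

def trimmed_mean(points, trim=1):
    """Coordinate-wise trimmed mean: a simple robust averaging rule that
    discards the `trim` largest and smallest values per coordinate before
    averaging, limiting the influence of Byzantine outliers."""
    sorted_pts = np.sort(points, axis=0)
    return sorted_pts[trim:points.shape[0] - trim].mean(axis=0)
```

For example, with five honest updates and one Byzantine update that is far from the rest, the trimmed mean discards the outlier per coordinate, and the same rule can be applied to the JL-compressed updates because the projection is linear.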

Yue Xia, Tayyebeh Jahani-Nezhad, Rawad Bitar

Computing technology; computer technology

Yue Xia, Tayyebeh Jahani-Nezhad, Rawad Bitar. Fed-DPRoC: Communication-Efficient Differentially Private and Robust Federated Learning [EB/OL]. (2025-08-18) [2025-09-07]. https://arxiv.org/abs/2508.12978.
