FedPhD: Federated Pruning with Hierarchical Learning of Diffusion Models
Federated Learning (FL) is a distributed learning paradigm that trains models over distributed clients' data. FL is particularly beneficial for distributed training of Diffusion Models (DMs), which are high-quality image generators that require diverse data. However, challenges such as high communication costs and data heterogeneity persist in training DMs, just as they do when training Transformers and Convolutional Neural Networks, and limited research has addressed these issues in FL environments. To address this gap, we introduce FedPhD, a novel approach designed to efficiently train DMs in FL environments. FedPhD leverages Hierarchical FL with a homogeneity-aware model aggregation and selection policy to tackle data heterogeneity while reducing communication costs. Its distributed structured pruning enhances computational efficiency and reduces model storage requirements on clients. Our experiments across multiple datasets demonstrate that FedPhD achieves strong performance in terms of Fréchet Inception Distance (FID) while reducing communication costs by up to $88\%$. FedPhD outperforms baseline methods, achieving at least a $34\%$ improvement in FID while using only $56\%$ of the total computation and communication resources.
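To make the hierarchical, homogeneity-aware aggregation described above more concrete, the following is a minimal Python sketch of one hierarchical round: edge servers first average their clients' models (FedAvg-style), and the central server then weights each edge model by data size and a homogeneity score. The function names, the total-variation-based homogeneity score, and the size-times-homogeneity weighting are illustrative assumptions, not the exact policy used in FedPhD, and structured pruning is omitted for brevity.

```python
import numpy as np

def aggregate(weights_list, sizes):
    """FedAvg-style weighted average of flattened parameter vectors."""
    sizes = np.asarray(sizes, dtype=float)
    coeffs = sizes / sizes.sum()
    return sum(c * w for c, w in zip(coeffs, weights_list))

def homogeneity_score(label_counts):
    """Score in (0, 1]; higher means the pooled label distribution is closer to uniform.
    (Illustrative choice: 1 minus total variation distance to the uniform distribution.)"""
    p = np.asarray(label_counts, dtype=float)
    p = p / p.sum()
    u = np.full_like(p, 1.0 / len(p))
    return 1.0 - 0.5 * np.abs(p - u).sum()

def hierarchical_round(client_models, client_sizes, client_label_counts, edge_groups):
    """One hierarchical FL round: edge-level FedAvg, then homogeneity-weighted global aggregation."""
    edge_models, edge_weights = [], []
    for group in edge_groups:  # each group = indices of clients attached to one edge server
        models = [client_models[i] for i in group]
        sizes = [client_sizes[i] for i in group]
        counts = np.sum([client_label_counts[i] for i in group], axis=0)
        edge_models.append(aggregate(models, sizes))
        # Hypothetical homogeneity-aware weighting: data volume scaled by homogeneity.
        edge_weights.append(sum(sizes) * homogeneity_score(counts))
    return aggregate(edge_models, edge_weights)
```

Under this sketch, an edge server whose pooled client data is closer to the uniform label distribution contributes more to the global model, which is one plausible way to realize a homogeneity-aware aggregation and selection policy.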
Qianyu Long, Qiyuan Wang, Christos Anagnostopoulos, Daning Bi
Computing Technology, Computer Technology; Communications; Wireless Communications
Qianyu Long, Qiyuan Wang, Christos Anagnostopoulos, Daning Bi. FedPhD: Federated Pruning with Hierarchical Learning of Diffusion Models [EB/OL]. (2025-07-08) [2025-07-16]. https://arxiv.org/abs/2507.06449