SPIRE: Conditional Personalization for Federated Diffusion Generative Models
Recent advances in diffusion models have revolutionized generative AI, but their sheer size makes on-device personalization, and thus effective federated learning (FL), infeasible. We propose Shared Backbone Personal Identity Representation Embeddings (SPIRE), a framework that casts per-client diffusion-based generation as conditional generation in FL. SPIRE factorizes the network into (i) a high-capacity global backbone that learns a population-level score function and (ii) lightweight, learnable client embeddings that encode local data statistics. This separation enables parameter-efficient fine-tuning that touches $\leq 0.01\%$ of the weights. We provide the first theoretical bridge between conditional diffusion training and maximum likelihood estimation in Gaussian mixture models. For a two-component mixture we prove that gradient descent on the DDPM loss with respect to the mixing weights recovers the optimal mixing weights and enjoys dimension-free error bounds. Our analysis also hints at how client embeddings act as biases that steer a shared score network toward personalized distributions. Empirically, SPIRE matches or surpasses strong baselines during collaborative pretraining and vastly outperforms them when adapting to unseen clients, reducing Kernel Inception Distance while updating only hundreds of parameters. SPIRE further mitigates catastrophic forgetting and remains robust to the choice of fine-tuning learning rate and number of epochs.
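The abstract does not include an implementation, but the backbone-plus-embedding factorization it describes can be illustrated with a short sketch. The code below is a minimal, hypothetical PyTorch rendering under stated assumptions: the class and function names (SPIREScoreNet, embed_proj, personalization_optimizer), the layer sizes, and the exact way the client embedding is injected (projected and added as a bias on the hidden features of a shared noise-prediction network) are illustrative choices, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class SPIREScoreNet(nn.Module):
    """Sketch: shared DDPM noise-prediction backbone with a learnable
    per-client embedding injected as an additive bias on hidden features."""
    def __init__(self, data_dim=2, hidden_dim=128, num_clients=10, embed_dim=16):
        super().__init__()
        self.in_proj = nn.Linear(data_dim + 1, hidden_dim)   # noisy sample x_t and timestep t
        self.embed_proj = nn.Linear(embed_dim, hidden_dim)   # maps client embedding to a bias
        self.backbone = nn.Sequential(                        # high-capacity global backbone
            nn.SiLU(), nn.Linear(hidden_dim, hidden_dim),
            nn.SiLU(), nn.Linear(hidden_dim, data_dim),
        )
        # Lightweight personal parameters: one embedding vector per client.
        self.client_embed = nn.Embedding(num_clients, embed_dim)

    def forward(self, x_t, t, client_id):
        h = self.in_proj(torch.cat([x_t, t.unsqueeze(-1)], dim=-1))
        h = h + self.embed_proj(self.client_embed(client_id))  # embedding acts as a bias steering the shared score net
        return self.backbone(h)                                 # predicted noise

def personalization_optimizer(model, lr=1e-2):
    """Adapting to an unseen client: freeze the global backbone and train
    only the embedding table (hundreds of parameters)."""
    for name, p in model.named_parameters():
        p.requires_grad_("client_embed" in name)
    return torch.optim.Adam(model.client_embed.parameters(), lr=lr)
```

In this sketch, personalization would simply run the standard DDPM noise-prediction loss on the new client's local data while only the embedding parameters receive gradients; how SPIRE actually conditions the backbone and schedules fine-tuning is detailed in the paper itself.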
Kaan Ozkara, Ruida Zhou, Suhas Diggavi
Computing Technology; Computer Technology
Kaan Ozkara, Ruida Zhou, Suhas Diggavi. SPIRE: Conditional Personalization for Federated Diffusion Generative Models [EB/OL]. (2025-06-13) [2025-06-30]. https://arxiv.org/abs/2506.12303.