
Improved Immiscible Diffusion: Accelerate Diffusion Training by Reducing Its Miscibility


Source: arXiv
Abstract

The substantial training cost of diffusion models hinders their deployment. Immiscible Diffusion recently showed that reducing the mixing of diffusion trajectories in noise space via linear assignment accelerates training by simplifying denoising. To extend immiscible diffusion beyond linear assignment, which is inefficient at large batch sizes and high dimensions, we generalize the concept to miscibility reduction at any layer and via any implementation. Specifically, we empirically demonstrate the bijective nature of the denoising process under immiscible diffusion, ensuring that generative diversity is preserved. Moreover, we provide a thorough analysis showing step by step how immiscibility eases denoising and improves efficiency. Going beyond linear assignment, we propose a family of implementations, including K-nearest-neighbor (KNN) noise selection and image scaling, that reduce miscibility and achieve up to >4x faster training across diverse models and tasks, including unconditional/conditional generation, image editing, and robotics planning. Furthermore, our analysis of immiscibility offers a novel perspective on how optimal transport (OT) enhances diffusion training. By identifying trajectory miscibility as a fundamental bottleneck, we believe this work establishes a new direction for future research on high-efficiency diffusion training. The code is available at https://github.com/yhli123/Immiscible-Diffusion.
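The KNN noise selection described in the abstract can be sketched as follows: instead of solving a batch-wide linear assignment between images and noises, each image samples several candidate Gaussian noises and keeps the nearest one, reducing trajectory mixing at negligible cost. This is a minimal NumPy illustration of the idea, not the authors' reference implementation (see their repository for that); the function name and its `k` parameter are assumptions for this sketch.

```python
import numpy as np

def knn_noise_selection(images, k=4, rng=None):
    """For each image in the batch, sample k candidate Gaussian noises
    and keep the one closest in L2 distance to the image.

    Illustrative sketch of KNN-based miscibility reduction; not the
    paper's official implementation.
    """
    rng = np.random.default_rng() if rng is None else rng
    b = images.shape[0]
    flat = images.reshape(b, -1)                                # (B, D)
    cand = rng.standard_normal((b, k) + images.shape[1:])       # k noises per image
    cand_flat = cand.reshape(b, k, -1)                          # (B, k, D)
    dists = np.linalg.norm(cand_flat - flat[:, None, :], axis=-1)  # (B, k)
    idx = dists.argmin(axis=1)                                  # nearest candidate
    return cand[np.arange(b), idx]
```

In training, the selected noise would simply replace the usual i.i.d. draw before the forward diffusion step (e.g. `noise = knn_noise_selection(x0)` instead of `noise = randn_like(x0)`); unlike linear assignment, the per-image argmin scales trivially to large batches and high dimensions.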

Yiheng Li, Feng Liang, Dan Kondratyuk, Masayoshi Tomizuka, Kurt Keutzer, Chenfeng Xu

Subjects: Computing technology; computer technology

Yiheng Li, Feng Liang, Dan Kondratyuk, Masayoshi Tomizuka, Kurt Keutzer, Chenfeng Xu. Improved Immiscible Diffusion: Accelerate Diffusion Training by Reducing Its Miscibility [EB/OL]. (2025-05-24) [2025-06-12]. https://arxiv.org/abs/2505.18521.
