
Warm Starts Accelerate Generative Modelling

Source: arXiv

Abstract

Iterative generative models, like diffusion and flow-matching, create high-fidelity samples by progressively refining a noise vector into data. However, this process is notoriously slow, often requiring hundreds of function evaluations. We introduce the warm-start model, a simple, deterministic model that dramatically accelerates conditional generation by providing a better starting point. Instead of starting generation from an uninformed N(0, I) prior, our warm-start model predicts an informed prior N(mu, sigma), whose moments are conditioned on the input context. This "warm start" substantially reduces the distance the generative process must traverse, particularly when the conditioning information is strongly informative. On tasks like image inpainting, our method achieves results competitive with a 1000-step DDPM baseline using only 11 total function evaluations (1 for the warm start, 10 for generation). A simple conditional normalization trick makes our method compatible with any standard generative model and sampler without modification, allowing it to be combined with other efficient sampling techniques for further acceleration. Our implementation is available at https://github.com/jonas-scholz123/warm-start-model.
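The abstract's recipe can be made concrete with a small sketch. The idea: a cheap deterministic network predicts the moments (mu, sigma) of an informed prior from the conditioning context, a sample is drawn from N(mu, sigma^2) instead of N(0, I), and the usual iterative sampler then only has to cover the remaining distance. The sketch below is a toy illustration under assumptions, not the paper's implementation: `warm_start` is a hypothetical stand-in for the learned warm-start network, the sampler is a plain Euler integration of a generic flow-matching-style velocity field, and the "conditional normalization trick" is rendered as running the sampler in standardized coordinates z = (x - mu) / sigma, where the informed prior again looks like N(0, I) to an off-the-shelf sampler.

```python
import numpy as np

def warm_start(context):
    """Hypothetical stand-in for the learned warm-start model.

    Predicts per-dimension mean and std of an informed prior
    conditioned on the input context (e.g. the observed pixels
    in an inpainting task).
    """
    mu = context.copy()                      # start near the observed values
    sigma = np.full_like(context, 0.1)       # low uncertainty: informative context
    return mu, sigma

def sample(context, velocity_field, n_steps=10, rng=None):
    """Warm-started generation: 1 warm-start eval + n_steps sampler evals."""
    rng = np.random.default_rng(rng)
    mu, sigma = warm_start(context)          # 1 function evaluation
    # Draw from the informed prior N(mu, sigma^2) instead of N(0, I).
    x = mu + sigma * rng.standard_normal(mu.shape)
    # Conditional normalization trick: in z = (x - mu) / sigma coordinates
    # the informed prior is again N(0, I), so any standard sampler can be
    # used unmodified; we map back to x-space at the end.
    z = (x - mu) / sigma
    dt = 1.0 / n_steps
    for i in range(n_steps):                 # n_steps function evaluations
        t = i * dt
        z = z + dt * velocity_field(z, t, context)
    return mu + sigma * z
```

With an informative context, sigma is small, so the sampled starting point already lies close to the data manifold and very few integration steps (10 in the paper's inpainting experiments) suffice.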

Jonas Scholz, Richard E. Turner

Subject: Computing Technology, Computer Technology

Jonas Scholz, Richard E. Turner. Warm Starts Accelerate Generative Modelling [EB/OL]. (2025-07-12) [2025-07-25]. https://arxiv.org/abs/2507.09212.
