A Sharp KL-Convergence Analysis for Diffusion Models under Minimal Assumptions

Source: arXiv

Abstract

Diffusion-based generative models have emerged as highly effective methods for synthesizing high-quality samples. Recent works have focused on analyzing the convergence of their generation process with minimal assumptions, either through reverse SDEs or Probability Flow ODEs. The best known guarantees for the KL divergence without any smoothness assumptions so far achieve a linear dependence on the data dimension $d$ and an inverse quadratic dependence on $\varepsilon$. In this work, we present a refined analysis that improves the dependence on $\varepsilon$. We model the generation process as a composition of two steps: a reverse ODE step, followed by a smaller noising step along the forward process. This design leverages the fact that the ODE step enables control of a Wasserstein-type error, which can then be converted into a KL divergence bound via noise addition, leading to a better dependence on the discretization step size. We further provide a novel analysis that achieves the linear $d$-dependence for the error due to discretizing this Probability Flow ODE in the absence of any smoothness assumptions. We show that $\tilde{O}\left(\tfrac{d\log^{3/2}(\frac{1}{\delta})}{\varepsilon}\right)$ steps suffice to approximate the target distribution corrupted with Gaussian noise of variance $\delta$ to within $O(\varepsilon^2)$ in KL divergence, improving upon the previous best result, which requires $\tilde{O}\left(\tfrac{d\log^{2}(\frac{1}{\delta})}{\varepsilon^2}\right)$ steps.
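A minimal, hedged sketch of the two-step generation update described above, under assumed notation not taken from the paper: writing $x$ for the current iterate, $s_\theta$ for the estimated score, $h$ for the discretization step, and taking an Ornstein-Uhlenbeck forward process $dX_t = -X_t\,dt + \sqrt{2}\,dW_t$, one backward iteration could combine a deterministic Probability Flow ODE update with a small forward noising step:

% illustrative forms only; the exact drift, step size, and noise scale are assumptions
\begin{align*}
  \hat{x} &= x + h\,\bigl(x + s_\theta(x, t)\bigr) && \text{(Euler step of the Probability Flow ODE)} \\
  x' &= \hat{x} + \sqrt{2h}\,\xi, \qquad \xi \sim \mathcal{N}(0, I_d) && \text{(small noising step along the forward process)}
\end{align*}

The deterministic ODE step is the part controlled in a Wasserstein-type metric; the subsequent Gaussian noising is what allows that control to be converted into a KL divergence bound.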

Nishant Jain, Tong Zhang

Computing Technology, Computer Technology; Mathematics

Nishant Jain, Tong Zhang. A Sharp KL-Convergence Analysis for Diffusion Models under Minimal Assumptions [EB/OL]. (2025-08-22) [2025-09-06]. https://arxiv.org/abs/2508.16306.
