
Convergence guarantee for consistency models


Source: arXiv
Abstract

We provide the first convergence guarantees for Consistency Models (CMs), a newly emerging type of one-step generative model that can produce samples comparable to those generated by Diffusion Models. Our main result is that, under basic assumptions on score-matching errors, consistency errors, and smoothness of the data distribution, CMs can efficiently sample from any realistic data distribution in one step with small $W_2$ error. Our results (1) hold under an $L^2$-accurate score and consistency assumption (rather than $L^\infty$-accurate); (2) do not require strong assumptions on the data distribution such as a log-Sobolev inequality; (3) scale polynomially in all parameters; and (4) match the state-of-the-art convergence guarantees for score-based generative models (SGMs). We also show that the Multistep Consistency Sampling procedure can further reduce the error compared to one-step sampling, supporting the original claim of "Consistency Models" (Yang Song et al., 2023). Our results further imply a TV error guarantee when applying certain Langevin-based modifications to the output distributions.
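To make the Multistep Consistency Sampling procedure mentioned above concrete, here is a minimal sketch following the alternating denoise/re-noise scheme of Song et al. (2023). The consistency function `f`, the noise schedule `sigmas`, and the floor noise level `eps` are placeholders, not artifacts of this paper: `f(x, sigma)` stands for a trained consistency model mapping a noisy point at noise level `sigma` to an estimate of a clean sample.

```python
import numpy as np

def multistep_consistency_sampling(f, sigmas, dim, eps=0.002, rng=None):
    """Sketch of multistep consistency sampling.

    f      : callable f(x, sigma) -> denoised sample (placeholder for a
             trained consistency model).
    sigmas : decreasing noise levels, sigmas[0] = sigma_max > ... > eps.
    dim    : dimensionality of a sample.
    eps    : smallest noise level of the consistency parametrization.
    """
    rng = rng or np.random.default_rng(0)
    # One-step sampling: denoise a draw from the terminal noise distribution.
    x = f(sigmas[0] * rng.standard_normal(dim), sigmas[0])
    # Each extra step re-noises to a lower level, then denoises again.
    for sigma in sigmas[1:]:
        z = rng.standard_normal(dim)
        x_noisy = x + np.sqrt(sigma**2 - eps**2) * z
        x = f(x_noisy, sigma)
    return x
```

With `sigmas` of length one this reduces to the one-step sampler analyzed in the paper; the extra iterations are exactly the modification the abstract says further reduces the $W_2$ error.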

Zhitang Chen, Junlong Lyu, Shoubo Feng

Subject: Computing and Computer Technology

Zhitang Chen, Junlong Lyu, Shoubo Feng. Convergence guarantee for consistency models [EB/OL]. (2023-08-22) [2025-08-02]. https://arxiv.org/abs/2308.11449.