
Deep generative models as the probability transformation functions

Source: arXiv
English Abstract

This paper introduces a unified theoretical perspective that views deep generative models as probability transformation functions. Despite the apparent differences in architecture and training methodologies among various types of generative models - autoencoders, autoregressive models, generative adversarial networks, normalizing flows, diffusion models, and flow matching - we demonstrate that they all fundamentally operate by transforming simple predefined distributions into complex target data distributions. This unifying perspective facilitates the transfer of methodological improvements between model architectures and provides a foundation for developing universal theoretical approaches, potentially leading to more efficient and effective generative modeling techniques.
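As a minimal illustration of this transformation view (a sketch not taken from the paper), the example below uses the inverse-CDF (probability integral transform): a fixed deterministic function maps samples from a simple base distribution onto a chosen target distribution, which is the role deep generative models play when such a map must be learned rather than written in closed form. The base and target distributions here are illustrative assumptions.

```python
# Sketch: a deterministic "probability transformation function" that turns
# samples from a simple base distribution into samples from a target
# distribution. Deep generative models learn such maps when no closed-form
# transformation exists. The distribution choices below are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Base distribution: Uniform(0, 1) -- analogous to the simple predefined
# distribution (e.g., Gaussian noise) that generative models sample from.
z = rng.uniform(size=100_000)

# Transformation: inverse CDF of an Exponential(rate=2.0) target distribution,
# x = -ln(1 - z) / rate.
rate = 2.0
x = -np.log(1.0 - z) / rate

# The transformed samples follow the target distribution: their empirical mean
# should be close to 1 / rate = 0.5.
print(f"empirical mean: {x.mean():.3f}  (target 1/rate = {1.0 / rate:.3f})")
```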

Vitalii Bondar, Vira Babenko, Roman Trembovetskyi, Yurii Korobeinyk, Viktoriya Dzyuba

Subject: Computing Technology; Computer Technology

Vitalii Bondar, Vira Babenko, Roman Trembovetskyi, Yurii Korobeinyk, Viktoriya Dzyuba. Deep generative models as the probability transformation functions [EB/OL]. (2025-06-20) [2025-07-16]. https://arxiv.org/abs/2506.17171.
