Revisiting Unbiased Implicit Variational Inference

Source: arXiv
Abstract

Recent years have witnessed growing interest in semi-implicit variational inference (SIVI) methods because they can rapidly generate samples from complex distributions. However, since the likelihood of these samples is non-trivial to estimate in high dimensions, current research focuses on finding effective SIVI training routines. Although unbiased implicit variational inference (UIVI) has largely been dismissed as imprecise and computationally prohibitive because of its inner MCMC loop, we revisit this method and show that the MCMC loop can be effectively replaced with importance sampling, and that the optimal proposal distribution can be learned stably and without bias by minimizing an expected forward Kullback-Leibler divergence. Our refined approach matches or outperforms state-of-the-art methods on established SIVI benchmarks.
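The final methodological claim is compact, so the following is a minimal formal sketch of one consistent reading. It assumes the standard UIVI construction (Titsias and Ruiz, 2019), where the semi-implicit family is $q_\phi(z) = \int q_\phi(z \mid \varepsilon)\, q(\varepsilon)\, d\varepsilon$; the proposal symbol $r_\eta(\varepsilon \mid z)$ is our notation, not taken from the abstract:

\[
\min_{\eta}\; \mathbb{E}_{q_\phi(z)}\!\left[\mathrm{KL}\!\left(q_\phi(\varepsilon \mid z)\,\middle\|\, r_\eta(\varepsilon \mid z)\right)\right]
\;=\; \mathrm{const} \;-\; \mathbb{E}_{q(\varepsilon)\, q_\phi(z \mid \varepsilon)}\!\left[\log r_\eta(\varepsilon \mid z)\right].
\]

Because joint samples $(\varepsilon, z) \sim q(\varepsilon)\, q_\phi(z \mid \varepsilon)$ are cheap to draw, the right-hand side admits an unbiased Monte Carlo gradient in $\eta$, which is presumably the sense in which the proposal is learned "without bias"; the fitted $r_\eta(\varepsilon \mid z)$ can then serve as the importance-sampling proposal that replaces UIVI's inner MCMC loop.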

Tobias Pielok, Bernd Bischl, David Rügamer

Computing technology, computer technology

Tobias Pielok, Bernd Bischl, David Rügamer. Revisiting Unbiased Implicit Variational Inference [EB/OL]. (2025-06-04) [2025-07-17]. https://arxiv.org/abs/2506.03839.
