
Leveraging Self-Consistency for Data-Efficient Amortized Bayesian Inference


Source: arXiv
Abstract

We propose a method to improve the efficiency and accuracy of amortized Bayesian inference by leveraging universal symmetries in the joint probabilistic model of parameters and data. In a nutshell, we invert Bayes' theorem and estimate the marginal likelihood based on approximate representations of the joint model. Upon perfect approximation, the marginal likelihood is constant across all parameter values by definition. However, errors in approximate inference lead to undesirable variance in the marginal likelihood estimates across different parameter values. We penalize violations of this symmetry with a self-consistency loss which significantly improves the quality of approximate inference in low data regimes and can be used to augment the training of popular neural density estimators. We apply our method to a number of synthetic problems and realistic scientific models, discovering notable advantages in the context of both neural posterior and likelihood approximation.
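The symmetry in question follows from rewriting Bayes' theorem as p(y) = p(θ) p(y|θ) / p(θ|y): the right-hand side must evaluate to the same marginal likelihood p(y) regardless of which parameter value θ is plugged in, so any spread across θ in these estimates signals approximation error. The sketch below illustrates this idea in PyTorch. It is not the authors' implementation; the function names (log_prior, log_lik, log_posterior) and the use of a plain variance penalty are illustrative assumptions.

```python
import torch

def self_consistency_loss(log_prior, log_lik, log_posterior, theta_samples, y):
    """Penalize variance of log marginal-likelihood estimates across parameters.

    Under a perfect approximation q(theta | y) = p(theta | y), the quantity
        log p(theta) + log p(y | theta) - log q(theta | y)
    equals log p(y) for every theta, so its variance over theta is zero.

    All arguments are hypothetical stand-ins, not the paper's API:
      log_prior(theta)        -> log p(theta)
      log_lik(theta, y)       -> log p(y | theta), exact or a learned surrogate
      log_posterior(theta, y) -> log q(theta | y) from the neural approximator
      theta_samples           -> iterable of parameter draws
    """
    log_marginals = torch.stack([
        log_prior(theta) + log_lik(theta, y) - log_posterior(theta, y)
        for theta in theta_samples
    ])
    # Any spread across theta signals a violation of self-consistency.
    return log_marginals.var()
```

In training, such a penalty would be added, with some weight, to the usual simulation-based loss of the neural posterior (or likelihood) estimator, which is how it can augment the training of popular neural density estimators as the abstract describes.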

Desi R. Ivanova, Marvin Schmitt, Daniel Habermann, Ullrich Köthe, Stefan T. Radev, Paul-Christian Bürkner

Subjects: Computing Technology; Computer Technology

Desi R. Ivanova, Marvin Schmitt, Daniel Habermann, Ullrich Köthe, Stefan T. Radev, Paul-Christian Bürkner. Leveraging Self-Consistency for Data-Efficient Amortized Bayesian Inference [EB/OL]. (2023-10-06) [2025-07-23]. https://arxiv.org/abs/2310.04395.
