Compositional amortized inference for large-scale hierarchical Bayesian models
Amortized Bayesian inference (ABI) has emerged as a powerful simulation-based approach for estimating complex mechanistic models, offering fast posterior sampling via generative neural networks. However, extending ABI to hierarchical models, a cornerstone of modern Bayesian analysis, remains a major challenge due to the difficulty of scaling to large numbers of parameters. In this work, we build on compositional score matching (CSM), a divide-and-conquer strategy for Bayesian updating using diffusion models. To address existing stability issues of CSM, we propose adaptive solvers coupled with a novel, error-damping compositional estimator. Our proposed method remains stable even with hundreds of thousands of data points and parameters. We validate our approach on a controlled toy example, a high-dimensional spatial autoregressive model, and a real-world biological application from advanced microscopy involving over 750,000 parameters.
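For intuition, the sketch below illustrates the divide-and-conquer score composition that CSM builds on, assuming the standard compositional identity for conditionally independent data groups; the names composed_score, score_posterior, and score_prior are hypothetical placeholders for amortized neural score estimates and are not taken from the paper's implementation. The naive sum accumulates error as the number of groups grows, which is the instability the paper's adaptive solvers and error-damping estimator are designed to address.

```python
# Minimal sketch (assumed, not the paper's code): compose per-group posterior
# scores into an approximate score of the full posterior for conditionally
# independent data groups x_1, ..., x_N:
#   grad log p(theta | x_{1:N}) ~= sum_i grad log p(theta | x_i)
#                                   - (N - 1) * grad log p(theta)
def composed_score(theta, t, data_groups, score_posterior, score_prior):
    n = len(data_groups)
    # Sum of amortized per-group posterior scores at diffusion time t
    per_group = sum(score_posterior(theta, t, x) for x in data_groups)
    # Subtract the prior score, which would otherwise be counted n times
    return per_group - (n - 1) * score_prior(theta, t)
```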
Jonas Arruda, Vikas Pandey, Catherine Sherry, Margarida Barroso, Xavier Intes, Jan Hasenauer, Stefan T. Radev
Subjects: biological research methods, biological research techniques, biological theory, biological methods
Jonas Arruda, Vikas Pandey, Catherine Sherry, Margarida Barroso, Xavier Intes, Jan Hasenauer, Stefan T. Radev. Compositional amortized inference for large-scale hierarchical Bayesian models [EB/OL]. (2025-05-20) [2025-06-08]. https://arxiv.org/abs/2505.14429.