Large Language Bayes
Many domain experts do not have the time or training to write formal Bayesian models. This paper takes an informal problem description as input, and combines a large language model and a probabilistic programming language to create a joint distribution over formal models, latent variables, and data. A posterior over latent variables follows by conditioning on observed data and integrating over formal models. This presents a challenging inference problem. We suggest an inference recipe that amounts to generating many formal models from the large language model, performing approximate inference on each, and then doing a weighted average. This is justified and analyzed as a combination of self-normalized importance sampling, MCMC, and variational inference. We show that this produces sensible predictions without the need to specify a formal model.
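As a minimal sketch of the weighted-average recipe the abstract describes, using assumed notation (the paper's own symbols may differ): let m denote a formal model generated by the large language model, z the latent variables, and y the observed data. Sampling models from the LLM and weighting by (estimated) marginal likelihoods yields a self-normalized importance sampling estimator of the model-averaged posterior:

p(z \mid y) = \sum_m p(m \mid y)\, p(z \mid m, y) \;\approx\; \sum_{k=1}^{K} \frac{w_k}{\sum_{j=1}^{K} w_j}\, \hat{p}(z \mid m_k, y), \qquad m_k \sim p_{\mathrm{LLM}}(m), \quad w_k \approx p(y \mid m_k),

where each \hat{p}(z \mid m_k, y) is obtained by approximate inference (e.g., MCMC or variational inference) on the corresponding probabilistic program.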
Justin Domke
Mathematics
Justin Domke. Large Language Bayes [EB/OL]. (2025-04-18) [2025-05-15]. https://arxiv.org/abs/2504.14025.