How many measurements are enough? Bayesian recovery in inverse problems with general distributions

Source: arXiv
Abstract

We study the sample complexity of Bayesian recovery for solving inverse problems with general prior, forward operator and noise distributions. We consider posterior sampling according to an approximate prior $\mathcal{P}$, and establish sufficient conditions for stable and accurate recovery with high probability. Our main result is a non-asymptotic bound that shows that the sample complexity depends on (i) the intrinsic complexity of $\mathcal{P}$, quantified by its so-called approximate covering number, and (ii) concentration bounds for the forward operator and noise distributions. As a key application, we specialize to generative priors, where $\mathcal{P}$ is the pushforward of a latent distribution via a Deep Neural Network (DNN). We show that the sample complexity scales log-linearly with the latent dimension $k$, thus establishing the efficacy of DNN-based priors. Generalizing existing results on deterministic (i.e., non-Bayesian) recovery for the important problem of random sampling with an orthogonal matrix $U$, we show how the sample complexity is determined by the coherence of $U$ with respect to the support of $\mathcal{P}$. Hence, we establish that coherence plays a fundamental role in Bayesian recovery as well. Overall, our framework unifies and extends prior work, providing rigorous guarantees for the sample complexity of solving Bayesian inverse problems with arbitrary distributions.
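
As a concrete illustration of the setting (a sketch in standard notation; the symbols $G$, $A$, $e$, $u_i$ and the exact form of the coherence quantity below are assumptions for exposition, not definitions quoted from the paper), the measurement model with a generative prior reads

$$ y = A(x) + e, \qquad x = G(z), \quad z \sim \mu \ \text{on } \mathbb{R}^k, \qquad \mathcal{P} = G_{\sharp}\mu, $$

where $G : \mathbb{R}^k \to \mathbb{R}^n$ is a DNN generator pushing forward the latent distribution $\mu$, $A$ is the (possibly random) forward operator and $e$ is the noise; recovery is performed by sampling the posterior of $x$ given $y$ under the approximate prior $\mathcal{P}$. In the random-sampling application, $A$ is formed from $m$ rows of an orthogonal matrix $U = (u_1, \dots, u_n)^\top \in \mathbb{R}^{n \times n}$ chosen at random, and a coherence-type quantity of the (assumed) form

$$ \Lambda(U, \mathcal{P}) = n \sup_{x \in \mathrm{supp}(\mathcal{P}) \setminus \{0\}} \ \max_{1 \le i \le n} \frac{|\langle u_i, x \rangle|^2}{\|x\|_2^2} $$

then governs, together with the log-linear dependence on the latent dimension $k$, how large $m$ must be for stable and accurate recovery with high probability.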

Ben Adcock, Nick Huang

Mathematics

Ben Adcock, Nick Huang. How many measurements are enough? Bayesian recovery in inverse problems with general distributions [EB/OL]. (2025-05-15) [2025-06-08]. https://arxiv.org/abs/2505.10630.
