Stochastic Variational Inference with Tuneable Stochastic Annealing
In this paper, we exploit the observation that stochastic variational inference (SVI) is a form of annealing and present a modified SVI approach -- applicable to both large and small datasets -- that allows the amount of annealing performed by SVI to be tuned. We are motivated by the fact that, in SVI, the larger the batch size, the more closely the intrinsic gradient noise approximates a Gaussian, but the smaller its variance. This low variance weakens the annealing needed to escape poor local optima. We propose a simple method that achieves both goals: higher-variance noise for escaping poor local optima, and more data information for more accurate gradient directions. The idea is to set an actual batch size, which may be as large as the full dataset, and a smaller effective batch size, then inject noise so that the gradient matches the larger variance associated with that smaller effective batch. The result approximates the maximum entropy stochastic gradient at this variance level. We theoretically motivate our approach within the framework of conjugate exponential family models and illustrate the method empirically on probabilistic matrix factorization for collaborative filtering, the Latent Dirichlet Allocation topic model, and the Gaussian mixture model.
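The variance-matching idea in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes per-sample gradients are available as a NumPy array, and the function and parameter names (annealed_svi_gradient, effective_batch_size) are hypothetical.

import numpy as np

def annealed_svi_gradient(per_sample_grads, effective_batch_size, rng=None):
    # per_sample_grads: (B, D) array of per-example stochastic gradients,
    # where B is the actual batch size (possibly the full dataset).
    # effective_batch_size: B_eff <= B whose larger noise level we emulate.
    rng = np.random.default_rng() if rng is None else rng
    B, _ = per_sample_grads.shape
    mean_grad = per_sample_grads.mean(axis=0)

    # Empirical per-coordinate variance of a single-sample gradient.
    per_sample_var = per_sample_grads.var(axis=0, ddof=1)

    # Averaging B samples yields variance s^2 / B; a batch of B_eff would
    # yield s^2 / B_eff. Inject the difference as zero-mean Gaussian noise,
    # the maximum entropy distribution at this variance level.
    extra_var = per_sample_var * max(1.0 / effective_batch_size - 1.0 / B, 0.0)
    noise = rng.normal(0.0, np.sqrt(extra_var))
    return mean_grad + noise

Under these assumptions, the returned gradient keeps the accurate direction of the large actual batch while carrying the higher noise variance of the smaller effective batch, so the amount of annealing can be tuned through effective_batch_size alone.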
John Paisley, Ghazal Fazelnia, Brian Barr
Computing Technology, Computer Technology
John Paisley, Ghazal Fazelnia, Brian Barr. Stochastic Variational Inference with Tuneable Stochastic Annealing [EB/OL]. (2025-04-04) [2025-05-04]. https://arxiv.org/abs/2504.03902.