Error bounds for particle gradient descent, and extensions of the log-Sobolev and Talagrand inequalities
We prove non-asymptotic error bounds for particle gradient descent (PGD, Kuntz et al., 2023), a recently introduced algorithm for maximum likelihood estimation of large latent variable models obtained by discretizing a gradient flow of the free energy. We begin by showing that the flow converges exponentially fast to the free energy's minimizers for models satisfying a condition that generalizes both the log-Sobolev and the Polyak--Łojasiewicz inequalities (LSI and PŁI, respectively). We achieve this by extending a result well-known in the optimal transport literature (that the LSI implies the Talagrand inequality) and its counterpart in the optimization literature (that the PŁI implies the so-called quadratic growth condition), and applying the extension to our new setting. We also generalize the Bakry--Émery Theorem and show that the LSI/PŁI extension holds for models with strongly concave log-likelihoods. For such models, we further control PGD's discretization error and obtain the non-asymptotic error bounds. While we are motivated by the study of PGD, we believe that the inequalities and results we extend may be of independent interest.
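For context, the two classical implications that the paper extends can be stated as follows; these are the standard results, given here under one common normalization of the constants, and they are not the paper's generalized versions. If a probability measure \(\pi\) satisfies the log-Sobolev inequality with constant \(\lambda > 0\),
\[ \operatorname{KL}(q \,\|\, \pi) \;\le\; \frac{1}{2\lambda}\, I(q \,\|\, \pi) \quad \text{for all probability measures } q, \]
where \(I(q \,\|\, \pi) = \int \bigl\|\nabla \log \tfrac{\mathrm{d}q}{\mathrm{d}\pi}\bigr\|^2 \,\mathrm{d}q\) denotes the relative Fisher information, then by the Otto--Villani theorem \(\pi\) also satisfies the Talagrand inequality
\[ W_2^2(q, \pi) \;\le\; \frac{2}{\lambda}\, \operatorname{KL}(q \,\|\, \pi). \]
Analogously, if a differentiable objective \(f\) with minimum value \(f^\ast\) satisfies the Polyak--Łojasiewicz inequality with constant \(\mu > 0\),
\[ \tfrac{1}{2}\, \|\nabla f(x)\|^2 \;\ge\; \mu \bigl(f(x) - f^\ast\bigr) \quad \text{for all } x, \]
then \(f\) satisfies the quadratic growth condition
\[ f(x) - f^\ast \;\ge\; \frac{\mu}{2}\, \operatorname{dist}\bigl(x, \arg\min f\bigr)^2. \]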
Samuel Power, Juan Kuntz, Rocco Caprio, Adam M. Johansen
Mathematical and computational techniques, computer technology
Samuel Power, Juan Kuntz, Rocco Caprio, Adam M. Johansen. Error bounds for particle gradient descent, and extensions of the log-Sobolev and Talagrand inequalities [EB/OL]. (2025-07-16) [2025-08-04]. https://arxiv.org/abs/2403.02004.