
Error bounds for particle gradient descent, and extensions of the log-Sobolev and Talagrand inequalities

Source: arXiv
Abstract

We prove non-asymptotic error bounds for particle gradient descent (PGD; Kuntz et al., 2023), a recently introduced algorithm, obtained by discretizing a gradient flow of the free energy, for maximum likelihood estimation of large latent variable models. We begin by showing that the flow converges exponentially fast to the free energy's minimizers for models satisfying a condition that generalizes both the log-Sobolev and the Polyak--Łojasiewicz inequalities (LSI and PŁI, respectively). We achieve this by extending a result well known in the optimal transport literature (that the LSI implies the Talagrand inequality) and its counterpart in the optimization literature (that the PŁI implies the so-called quadratic growth condition), and applying the extension to our new setting. We also generalize the Bakry--Émery theorem and show that the LSI/PŁI extension holds for models with strongly concave log-likelihoods. For such models, we further control PGD's discretization error and obtain the non-asymptotic error bounds. While we are motivated by the study of PGD, we believe that the inequalities and results we extend may be of independent interest.
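For context, the two classical implications that the abstract says are being extended can be stated in their usual forms. The display below is a paraphrase of the standard statements (Otto--Villani for the first, the familiar optimization-literature result for the second); the constants follow the common normalization, and the paper's unified condition may be stated differently.

```latex
% Standard forms of the two implications the paper extends (paraphrased;
% the paper's unified statement and constants may differ).
% Otto--Villani: the LSI with constant \lambda implies Talagrand's T_2:
\[
  \mathrm{KL}(q \,\|\, \pi) \le \tfrac{1}{2\lambda}\, I(q \,\|\, \pi)
  \quad\Longrightarrow\quad
  W_2(q, \pi)^2 \le \tfrac{2}{\lambda}\, \mathrm{KL}(q \,\|\, \pi),
\]
% and, in optimization, the PL inequality implies quadratic growth:
\[
  \tfrac{1}{2}\,\lVert \nabla f(x) \rVert^2 \ge \mu\,\bigl(f(x) - f^\ast\bigr)
  \quad\Longrightarrow\quad
  f(x) - f^\ast \ge \tfrac{\mu}{2}\,\operatorname{dist}(x, \mathcal{X}^\ast)^2,
\]
% where I denotes the relative Fisher information and W_2 the
% 2-Wasserstein distance.
```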
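To make the algorithm concrete, here is a minimal, hypothetical sketch of PGD-style updates on a toy one-dimensional model (x ~ N(theta, 1), y | x ~ N(x, 1), whose marginal MLE is theta* = y). The model, step size, and particle count are illustrative choices, not taken from the paper; the sketch only reproduces the structure described by Kuntz et al. (2023): a particle-averaged gradient step in theta paired with an unadjusted Langevin step on each particle.

```python
import numpy as np

# Hypothetical toy model (not from the paper): latent x ~ N(theta, 1),
# observation y | x ~ N(x, 1). The marginal likelihood is N(theta, 2),
# so the MLE of theta given a single observation y is theta* = y.
rng = np.random.default_rng(0)
y = 2.0       # single observation
N = 1_000     # number of particles
h = 0.05      # step size (illustrative)
steps = 500

theta = 0.0
X = rng.standard_normal(N)  # particle cloud approximating the posterior

def grad_theta_log_joint(x, theta):
    # d/dtheta of log p_theta(x, y) = x - theta for this model
    return x - theta

def grad_x_log_joint(x, theta):
    # d/dx of [-(x - theta)^2/2 - (y - x)^2/2] = theta + y - 2x
    return theta + y - 2.0 * x

for _ in range(steps):
    g_theta = grad_theta_log_joint(X, theta).mean()  # particle average
    g_x = grad_x_log_joint(X, theta)
    theta = theta + h * g_theta                      # gradient step in theta
    # Unadjusted Langevin (Euler--Maruyama) step on each particle
    X = X + h * g_x + np.sqrt(2.0 * h) * rng.standard_normal(N)

print(f"estimated theta: {theta:.3f}  (MLE is y = {y})")
```

Running the sketch, theta converges near y = 2.0, consistent with the flow's fixed point in this toy model; the paper's error bounds quantify how the step size and particle number control the gap to the exact flow.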

Samuel Power, Juan Kuntz, Rocco Caprio, Adam M. Johansen

Subjects: mathematical computation technology; computer technology

Samuel Power, Juan Kuntz, Rocco Caprio, Adam M. Johansen. Error bounds for particle gradient descent, and extensions of the log-Sobolev and Talagrand inequalities [EB/OL]. (2025-07-16) [2025-08-04]. https://arxiv.org/abs/2403.02004.
