
Rethinking Probabilistic Circuit Parameter Learning


Source: arXiv
Abstract

Probabilistic Circuits (PCs) offer a computationally scalable framework for generative modeling, supporting exact and efficient inference of a wide range of probabilistic queries. While recent advances have significantly improved the expressiveness and scalability of PCs, effectively training their parameters remains a challenge. In particular, a widely used optimization method, full-batch Expectation-Maximization (EM), requires processing the entire dataset before performing a single update, making it ineffective for large datasets. While empirical extensions to the mini-batch setting have been proposed, it remains unclear what objective these algorithms are optimizing, making it difficult to assess their theoretical soundness. This paper bridges the gap by establishing a novel connection between the general EM objective and the standard full-batch EM algorithm. Building on this, we derive a theoretically grounded generalization to the mini-batch setting and demonstrate its effectiveness through preliminary empirical results.
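To make the full-batch vs. mini-batch distinction concrete, below is a minimal sketch on the simplest probabilistic circuit: a single sum node over Bernoulli product leaves, i.e., a Bernoulli mixture. The mini-batch variant shown is the generic step-size interpolation heuristic, illustrating the kind of empirical extension the abstract critiques; it is not the theoretically grounded update derived in the paper, and all names and hyperparameters here are illustrative.

```python
import numpy as np

# Toy setup (hypothetical): one sum node over K Bernoulli leaves.
rng = np.random.default_rng(0)
N, D, K = 1000, 8, 4
X = rng.integers(0, 2, size=(N, D)).astype(float)  # binary data
w = np.full(K, 1.0 / K)                            # sum-node (mixture) weights
p = rng.uniform(0.25, 0.75, size=(K, D))           # leaf Bernoulli parameters

def posteriors(Xb, w, p):
    """E-step: responsibility of each component for each example."""
    log_lik = Xb @ np.log(p).T + (1 - Xb) @ np.log(1 - p).T
    log_post = np.log(w) + log_lik
    log_post -= log_post.max(axis=1, keepdims=True)  # stabilize
    post = np.exp(log_post)
    return post / post.sum(axis=1, keepdims=True)

def em_full_batch(X, w, p, iters=20):
    """Full-batch EM: one parameter update per pass over the *entire* dataset,
    which is the scalability bottleneck the abstract points to."""
    for _ in range(iters):
        r = posteriors(X, w, p)                       # E-step over all data
        w = r.sum(axis=0) / len(X)                    # M-step
        p = (r.T @ X) / r.sum(axis=0)[:, None]
        p = p.clip(1e-6, 1 - 1e-6)
    return w, p

def em_mini_batch(X, w, p, batch=100, epochs=20, alpha=0.1):
    """A common empirical mini-batch extension: interpolate toward the
    batch M-step solution with step size alpha. Its implicit objective is
    unclear, which is the gap the paper addresses."""
    for _ in range(epochs):
        for i in range(0, len(X), batch):
            Xb = X[i:i + batch]
            r = posteriors(Xb, w, p)
            w_hat = r.sum(axis=0) / len(Xb)
            p_hat = (r.T @ Xb) / r.sum(axis=0)[:, None].clip(1e-6)
            w = (1 - alpha) * w + alpha * w_hat
            p = ((1 - alpha) * p + alpha * p_hat).clip(1e-6, 1 - 1e-6)
    return w, p
```

Note how the full-batch update touches every example before changing a single parameter, while the mini-batch variant updates after each small slice; the price of the latter, as the abstract notes, is that it is unclear what objective such interpolated updates optimize.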

Anji Liu, Guy Van den Broeck

Subjects: Computing Technology; Computer Technology

Anji Liu, Guy Van den Broeck. Rethinking Probabilistic Circuit Parameter Learning [EB/OL]. (2025-05-26) [2025-06-29]. https://arxiv.org/abs/2505.19982.
