The Gaussian-Multinoulli Restricted Boltzmann Machine: A Potts Model Extension of the GRBM
Many real-world tasks, from associative memory to symbolic reasoning, demand discrete, structured representations that standard continuous latent models struggle to express naturally. We introduce the Gaussian-Multinoulli Restricted Boltzmann Machine (GM-RBM), a generative energy-based model that extends the Gaussian-Bernoulli RBM (GB-RBM) by replacing binary hidden units with $q$-state Potts variables. This modification enables a combinatorially richer latent space and supports learning over multivalued, interpretable latent concepts. We formally derive the GM-RBM's energy function, learning dynamics, and conditional distributions, showing that it preserves tractable inference and training through contrastive divergence. Empirically, we demonstrate that GM-RBMs model complex multimodal distributions more effectively than binary RBMs, outperforming them on tasks involving analogical recall and structured memory. Our results highlight GM-RBMs as a scalable framework for discrete latent inference with enhanced expressiveness and interpretability.
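The abstract's core construction — Gaussian visible units paired with $q$-state Potts (multinoulli) hidden units, trained by contrastive divergence — can be illustrated with a minimal sketch. The parameterization below (a weight slice per hidden state, a shared visible variance, a CD-1 update on the softmax expectations) is an assumption in GRBM style, not the paper's exact notation:

```python
# Hypothetical GM-RBM sketch: Gaussian visibles, q-state Potts hiddens.
# The energy parameterization and variable names are assumptions, not
# taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

n_vis, n_hid, q = 4, 3, 5                    # visible dim, hidden units, Potts states
W = rng.normal(0, 0.1, (n_vis, n_hid, q))    # one weight slice per hidden state
b = np.zeros(n_vis)                          # visible biases
c = np.zeros((n_hid, q))                     # hidden biases, one per (unit, state)
sigma = 1.0                                  # shared std of the Gaussian visibles

def sample_h_given_v(v):
    """Each hidden unit is a q-state Potts variable: a softmax over its states."""
    logits = c + np.einsum('i,ijk->jk', v / sigma**2, W)   # shape (n_hid, q)
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    h = np.zeros_like(p)                     # draw one-hot states per unit
    for j in range(n_hid):
        h[j, rng.choice(q, p=p[j])] = 1.0
    return h, p

def sample_v_given_h(h):
    """Visible units stay conditionally Gaussian, exactly as in the GB-RBM."""
    mean = b + np.einsum('ijk,jk->i', W, h)
    return mean + sigma * rng.normal(size=n_vis)

# One CD-1 step on a single data vector: positive phase minus negative phase.
v0 = rng.normal(size=n_vis)
h0, p0 = sample_h_given_v(v0)
v1 = sample_v_given_h(h0)
_, p1 = sample_h_given_v(v1)
lr = 0.01
W += lr * (np.einsum('i,jk->ijk', v0 / sigma**2, p0)
           - np.einsum('i,jk->ijk', v1 / sigma**2, p1))
```

Setting $q = 2$ recovers a one-hot encoding of binary hidden units, which is one way to see the GB-RBM as the special case the abstract describes extending.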
Nikhil Kapasi, William Whitehead, Luke Theogarajan
Computing Technology, Computer Technology
Nikhil Kapasi, William Whitehead, Luke Theogarajan. The Gaussian-Multinoulli Restricted Boltzmann Machine: A Potts Model Extension of the GRBM [EB/OL]. (2025-05-16) [2025-06-07]. https://arxiv.org/abs/2505.11635.