
The VampPrior Mixture Model

Source: arXiv

English Abstract

Widely used deep latent variable models (DLVMs), in particular Variational Autoencoders (VAEs), employ overly simplistic priors on the latent space. To achieve strong clustering performance, existing methods that replace the standard normal prior with a Gaussian mixture model (GMM) require defining, a priori, a number of clusters close to the expected number of ground-truth classes and are susceptible to poor initializations. We leverage VampPrior concepts (Tomczak and Welling, 2018) to fit a Bayesian GMM prior, resulting in the VampPrior Mixture Model (VMM), a novel prior for DLVMs. In a VAE, the VMM attains highly competitive clustering performance on benchmark datasets. Integrating the VMM into scVI (Lopez et al., 2018), a popular scRNA-seq integration method, significantly improves its performance and automatically arranges cells into clusters with similar biological characteristics.
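The core idea referenced in the abstract is the VampPrior: instead of a standard normal prior, the prior over the latent code is a mixture of the encoder's variational posteriors evaluated at a set of learned pseudo-inputs (Tomczak and Welling, 2018), which the VMM extends by fitting a Bayesian GMM. The sketch below illustrates only the basic VampPrior mixture density in PyTorch; the class name, tensor shapes, and the assumption that the encoder returns a Gaussian mean and log-variance are illustrative, not the authors' implementation.

```python
# Minimal PyTorch sketch (not the authors' code) of a VampPrior-style mixture prior:
# p(z) = (1/K) * sum_k q(z | u_k), where u_k are K learnable pseudo-inputs passed
# through the VAE's own encoder. The VMM replaces this equal-weight mixture with a
# Bayesian GMM fit over the latent space.
import torch
import torch.nn as nn


class VampPriorSketch(nn.Module):
    def __init__(self, encoder: nn.Module, n_pseudo: int, input_dim: int):
        super().__init__()
        self.encoder = encoder  # shared encoder; assumed to return (mean, log-variance)
        # K learnable pseudo-inputs living in data space
        self.pseudo_inputs = nn.Parameter(0.01 * torch.randn(n_pseudo, input_dim))

    def log_prob(self, z: torch.Tensor) -> torch.Tensor:
        """Log-density of latent samples z with shape (N, D) under the mixture prior."""
        mu, logvar = self.encoder(self.pseudo_inputs)             # (K, D), (K, D)
        components = torch.distributions.Normal(mu, (0.5 * logvar).exp())
        # Evaluate every z under every component: broadcast (N, 1, D) against (K, D) -> (N, K)
        log_q = components.log_prob(z.unsqueeze(1)).sum(dim=-1)
        k = float(self.pseudo_inputs.shape[0])
        # Equal mixture weights 1/K
        return torch.logsumexp(log_q, dim=1) - torch.log(torch.tensor(k, device=z.device))
```

In a VAE's ELBO, this log-density would stand in for the log-density of the standard normal prior when computing the regularization term on the latent code.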

Andrew A. Stirn, David A. Knowles

Subjects: Current State and Development of Biosciences; Bioscience Research Methods and Techniques; Computing and Computer Technology

Andrew A. Stirn, David A. Knowles. The VampPrior Mixture Model [EB/OL]. (2024-02-06) [2025-05-19]. https://arxiv.org/abs/2402.04412.
