
Transformers Simulate MLE for Sequence Generation in Bayesian Networks


Source: arXiv

Abstract

Transformers have achieved significant success in various fields, notably excelling in tasks involving sequential data like natural language processing. Despite these achievements, the theoretical understanding of transformers' capabilities remains limited. In this paper, we investigate the theoretical capabilities of transformers to autoregressively generate sequences in Bayesian networks based on in-context maximum likelihood estimation (MLE). Specifically, we consider a setting where a context is formed by a set of independent sequences generated according to a Bayesian network. We demonstrate that there exists a simple transformer model that can (i) estimate the conditional probabilities of the Bayesian network according to the context, and (ii) autoregressively generate a new sample according to the Bayesian network with the estimated conditional probabilities. We further demonstrate in extensive experiments that such a transformer not only exists in theory, but can also be effectively obtained through training. Our analysis highlights the potential of transformers to learn complex probabilistic models and contributes to a better understanding of large language models as a powerful class of sequence generators.
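To make the target computation concrete, the following is a minimal sketch (not the paper's transformer construction) of the two steps the abstract describes: count-based in-context MLE of a Bayesian network's conditional probability tables from context sequences, followed by autoregressive sampling from the estimated network. The binary chain network, the `parents` encoding, and the uniform back-off for unseen parent configurations are illustrative assumptions, not details from the paper.

```python
import random

def mle_conditionals(context, parents):
    """Count-based MLE of each binary variable's conditional probability table,
    estimated from a context of independent sequences drawn from the network.
    `parents[i]` lists the indices of variable i's parents in the DAG."""
    tables = []
    for i, pa in enumerate(parents):
        counts = {}
        for seq in context:
            key = tuple(seq[j] for j in pa)        # parent configuration
            c = counts.setdefault(key, [0, 0])
            c[seq[i]] += 1                          # tally X_i given parents
        # normalize counts into estimates of P(X_i = 1 | parent config)
        tables.append({k: c[1] / (c[0] + c[1]) for k, c in counts.items()})
    return tables

def autoregressive_sample(tables, parents, rng=random):
    """Generate one new sequence variable-by-variable from the estimated CPTs."""
    seq = []
    for i, pa in enumerate(parents):
        key = tuple(seq[j] for j in pa)
        p1 = tables[i].get(key, 0.5)                # back off to uniform if unseen
        seq.append(1 if rng.random() < p1 else 0)
    return seq
```

For example, with a three-variable chain X1 → X2 → X3 (`parents = [(), (0,), (1,)]`) and a context of sequences from that chain, `mle_conditionals` recovers the empirical conditionals and `autoregressive_sample` draws a fresh sequence from them; the paper's result is that a simple transformer can simulate this estimate-then-generate procedure in context.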

Yuan Cao, Yihan He, Dennis Wu, Hong-Yu Chen, Jianqing Fan, Han Liu

Subject: Computing Technology, Computer Technology

Yuan Cao, Yihan He, Dennis Wu, Hong-Yu Chen, Jianqing Fan, Han Liu. Transformers Simulate MLE for Sequence Generation in Bayesian Networks [EB/OL]. (2025-07-08) [2025-07-20]. https://arxiv.org/abs/2501.02547.
