
Physical models realizing the transformer architecture of large language models

Source: arXiv
English Abstract

The introduction of the transformer architecture in 2017 marked the most striking advance in natural language processing. The transformer is a model architecture that relies entirely on an attention mechanism to draw global dependencies between input and output. However, we believe there is a gap in our theoretical understanding of what the transformer is and how it works physically. From a physical perspective on modern chips, such as those fabricated at process nodes below 28 nm, modern intelligent machines should be regarded as open quantum systems going beyond conventional statistical systems. Accordingly, in this paper we construct physical models that realize large language models based on the transformer architecture as open quantum systems in the Fock space over the Hilbert space of tokens. Our physical models underlie the transformer architecture for large language models.
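The attention mechanism referred to in the abstract can be illustrated with a minimal sketch of scaled dot-product attention as introduced in the 2017 transformer paper. This is illustrative only and is not the open-quantum-system construction in Fock space described by the paper; the function name, shapes, and example values below are assumptions for demonstration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns (seq_len, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                        # pairwise token similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)         # softmax over keys
    return weights @ V                                     # attention-weighted sum of values

# Example: 4 tokens, dimension 8
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)         # (4, 8)
```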

Zeqian Chen

Subject areas: computational linguistics technology; fundamental theory of computer and automation technology

Zeqian Chen. Physical models realizing the transformer architecture of large language models [EB/OL]. (2025-07-22) [2025-08-10]. https://arxiv.org/abs/2507.13354.