MBFormer: A General Transformer-based Learning Paradigm for Many-body Interactions in Real Materials
Recent progress in machine learning (ML) has revolutionized computational materials science, enabling unprecedentedly rapid materials discovery and property prediction. The quantum many-body problem -- the key to understanding excited-state properties ranging from transport to optics -- nevertheless remains challenging due to the complexity of the nonlocal and energy-dependent interactions. Here, we propose a symmetry-aware, grid-free, transformer-based model, MBFormer, designed to learn the entire many-body hierarchy directly from mean-field inputs, exploiting the attention mechanism to accurately capture many-body correlations between mean-field states. As a proof of principle, we demonstrate the capability of MBFormer in predicting results of the GW plus Bethe-Salpeter equation (GW-BSE) formalism, including quasiparticle energies, exciton energies, exciton oscillator strengths, and exciton wavefunction distributions. Our model is trained on a dataset of 721 two-dimensional materials from the C2DB database, achieving state-of-the-art performance with a low prediction mean absolute error (MAE) on the order of 0.1-0.2 eV for state-level quasiparticle and exciton energies across different materials. Moreover, we show explicitly that the attention mechanism plays a crucial role in capturing many-body correlations. Our framework provides an end-to-end platform from ground states to general many-body prediction in real materials, which could serve as a foundation model for computational materials science.
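The core idea of mapping mean-field states to many-body quantities through attention can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the paper's actual architecture: the state embeddings, single-head attention, and scalar readout head are all hypothetical placeholders showing how self-attention lets every mean-field state exchange information with every other state, as a nonlocal many-body correction requires.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

# Hypothetical setup: n mean-field (e.g. DFT) states, each embedded as a
# d-dimensional feature vector (band energy, orbital character, k-point, ...).
n_states, d = 8, 16
X = rng.standard_normal((n_states, d))

# Single-head self-attention: every state attends to every other state,
# mimicking how nonlocal many-body interactions couple mean-field states.
Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv
A = softmax(Q @ K.T / np.sqrt(d))   # (n_states, n_states) attention weights
H = A @ V                           # correlated state representations

# Illustrative per-state readout: map each correlated representation to a
# scalar, standing in for a quasiparticle-energy correction.
w_out = rng.standard_normal(d) / np.sqrt(d)
qp_corrections = H @ w_out          # shape: (n_states,)
```

Because the attention matrix `A` is dense, each predicted correction depends on all input states at once, which is the structural property that distinguishes this from a per-state (local) regression.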
Bowen Hou, Xian Xu, Jinyuan Wu, Diana Y. Qiu
Subjects: Physics; Information Science and Information Technology; Natural-Science Research Methods; Computing and Computer Technology
Bowen Hou, Xian Xu, Jinyuan Wu, Diana Y. Qiu. MBFormer: A General Transformer-based Learning Paradigm for Many-body Interactions in Real Materials [EB/OL]. (2025-07-07) [2025-07-21]. https://arxiv.org/abs/2507.05480.