Mesh-Informed Neural Operator: A Transformer Generative Approach

Source: arXiv

Abstract

Generative models in function spaces, situated at the intersection of generative modeling and operator learning, are attracting increasing attention due to their immense potential in diverse scientific and engineering applications. While functional generative models are theoretically domain- and discretization-agnostic, current implementations heavily rely on the Fourier Neural Operator (FNO), limiting their applicability to regular grids and rectangular domains. To overcome these critical limitations, we introduce the Mesh-Informed Neural Operator (MINO). By leveraging graph neural operators and cross-attention mechanisms, MINO offers a principled, domain- and discretization-agnostic backbone for generative modeling in function spaces. This advancement significantly expands the scope of such models to more diverse applications in generative, inverse, and regression tasks. Furthermore, MINO provides a unified perspective on integrating neural operators with general advanced deep learning architectures. Finally, we introduce a suite of standardized evaluation metrics that enable objective comparison of functional generative models, addressing another critical gap in the field.
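To make the discretization-agnostic idea mentioned above concrete, the following is a minimal, illustrative sketch (not the authors' implementation): embeddings of arbitrary output query coordinates cross-attend to features defined on an arbitrary input mesh, so neither side is tied to a regular grid or rectangular domain. The class name CrossAttentionReadout, the coordinate and feature dimensions, and the use of PyTorch's nn.MultiheadAttention are assumptions for illustration only.

import torch
import torch.nn as nn


class CrossAttentionReadout(nn.Module):
    """Hypothetical sketch: attend from output query coordinates to input mesh features."""

    def __init__(self, coord_dim: int, feat_dim: int, width: int, heads: int = 4):
        super().__init__()
        self.embed_query = nn.Linear(coord_dim, width)            # embed output coordinates
        self.embed_key = nn.Linear(coord_dim + feat_dim, width)   # embed (coords, values) on the input mesh
        self.attn = nn.MultiheadAttention(width, heads, batch_first=True)
        self.proj = nn.Linear(width, feat_dim)                    # project back to function values

    def forward(self, query_coords, mesh_coords, mesh_values):
        # query_coords: (B, Nq, coord_dim)  arbitrary evaluation points
        # mesh_coords:  (B, Nm, coord_dim)  input mesh nodes (any size Nm)
        # mesh_values:  (B, Nm, feat_dim)   function values on the input mesh
        q = self.embed_query(query_coords)
        kv = self.embed_key(torch.cat([mesh_coords, mesh_values], dim=-1))
        out, _ = self.attn(q, kv, kv)       # each query gathers information from all mesh tokens
        return self.proj(out)               # predicted values at the query points


if __name__ == "__main__":
    B, Nm, Nq = 2, 500, 123                      # input mesh and query sizes can differ freely
    layer = CrossAttentionReadout(coord_dim=2, feat_dim=3, width=64)
    mesh_coords = torch.rand(B, Nm, 2)           # irregular 2-D point cloud
    mesh_values = torch.randn(B, Nm, 3)
    query_coords = torch.rand(B, Nq, 2)          # evaluate anywhere in the domain
    print(layer(query_coords, mesh_coords, mesh_values).shape)   # torch.Size([2, 123, 3])

In this toy example the input mesh and the query set have different sizes and no grid structure, which is the property the abstract attributes to combining graph neural operators with cross-attention.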

Yaozhong Shi, Zachary E. Ross, Domniki Asimaki, Kamyar Azizzadenesheli

Subject: Computing and computer technology

Yaozhong Shi, Zachary E. Ross, Domniki Asimaki, Kamyar Azizzadenesheli. Mesh-Informed Neural Operator: A Transformer Generative Approach [EB/OL]. (2025-06-26) [2025-07-01]. https://arxiv.org/abs/2506.16656.