
Geometry Aware Operator Transformer as an Efficient and Accurate Neural Surrogate for PDEs on Arbitrary Domains


Source: arXiv
Abstract

Learning solution operators of PDEs on arbitrary domains accurately and efficiently is a very challenging task of vital importance to engineering and industrial simulations. Despite the existence of many operator learning algorithms to approximate such PDEs, we find that accurate models are not necessarily computationally efficient and vice versa. We address this issue by proposing a geometry-aware operator transformer (GAOT) for learning PDEs on arbitrary domains. GAOT combines novel multiscale attentional graph neural operator encoders and decoders, together with geometry embeddings and (vision) transformer processors, to accurately map information about the domain and the inputs into a robust approximation of the PDE solution. Multiple innovations in the implementation of GAOT also ensure computational efficiency and scalability. We demonstrate significant gains in both the accuracy and efficiency of GAOT over several baselines on a large number of learning tasks from a diverse set of PDEs, including state-of-the-art performance on a large-scale three-dimensional industrial CFD dataset.
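The encode-process-decode pipeline sketched in the abstract (graph-based encoding of scattered nodes onto latent tokens, a transformer processor, and decoding back to arbitrary query points) can be illustrated with a minimal sketch. Everything below is a toy stand-in, not the paper's architecture: the distance-based attention weights, the single-head processor, and all sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def encode(points, values, latent_points, scale=0.1):
    """Attention-style scatter of node values onto coarse latent tokens.
    Weights decay with squared distance (toy stand-in for the graph encoder)."""
    d2 = ((latent_points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    w = softmax(-d2 / scale, axis=1)            # (M, N): each token attends over nodes
    return w @ values                           # (M, C) latent features

def process(z, W_qkv):
    """One self-attention layer over latent tokens (toy stand-in for the
    transformer processor), with a residual connection."""
    q, k, v = (z @ W for W in W_qkv)
    att = softmax(q @ k.T / np.sqrt(k.shape[-1]), axis=-1)
    return z + att @ v

def decode(latent_points, z, query_points, scale=0.1):
    """Attention-style interpolation of latent features back to query points."""
    d2 = ((query_points[:, None, :] - latent_points[None, :, :]) ** 2).sum(-1)
    w = softmax(-d2 / scale, axis=1)            # (Q, M): each query attends over tokens
    return w @ z

# Toy run: 200 scattered nodes on an arbitrary 2D domain, 16 latent tokens.
points = rng.uniform(size=(200, 2))
values = np.sin(3.0 * points[:, :1])            # (200, 1) input field
latent = rng.uniform(size=(16, 2))
W_qkv = [rng.normal(scale=0.1, size=(1, 1)) for _ in range(3)]

z = encode(points, values, latent)              # scattered nodes -> tokens
z = process(z, W_qkv)                           # tokens -> tokens
out = decode(latent, z, points)                 # tokens -> predictions at queries
print(out.shape)                                # (200, 1)
```

In a trained surrogate all three maps would be learned jointly; decoding at arbitrary `query_points` is what makes the surrogate discretization-flexible on unstructured domains.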

Shizheng Wen, Arsh Kumbhat, Levi Lingsch, Sepehr Mousavi, Yizhou Zhao, Praveen Chandrashekar, Siddhartha Mishra

Subjects: mathematical computing technology, computer technology

Shizheng Wen, Arsh Kumbhat, Levi Lingsch, Sepehr Mousavi, Yizhou Zhao, Praveen Chandrashekar, Siddhartha Mishra. Geometry Aware Operator Transformer as an Efficient and Accurate Neural Surrogate for PDEs on Arbitrary Domains [EB/OL]. (2025-05-24) [2025-06-14]. https://arxiv.org/abs/2505.18781.
