GFocal: A Global-Focal Neural Operator for Solving PDEs on Arbitrary Geometries
Transformer-based neural operators have emerged as promising surrogate solvers for partial differential equations, leveraging the ability of Transformers to capture long-range dependencies and global correlations, as demonstrated in language modeling. However, existing methods overlook the coordinated learning of interdependencies between local physical details and global features, which is essential for tackling multiscale problems, preserving physical consistency and numerical stability in long-term rollouts, and accurately capturing transitional dynamics. In this work, we propose GFocal, a Transformer-based neural operator that enforces simultaneous learning and fusion of global and local features. Global correlations and local features are harnessed through Nyström attention-based global blocks and slice-based focal blocks to generate physics-aware tokens, which are then modulated and integrated via convolution-based gating blocks, enabling dynamic fusion of multiscale information. GFocal achieves accurate modeling and prediction of physical features given arbitrary geometries and initial conditions. Experiments show that GFocal achieves state-of-the-art performance, with an average 15.2% relative gain on five out of six benchmarks, and also excels in industry-scale simulations such as the aerodynamics of automobiles and airfoils.
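The abstract describes a global branch (Nyström attention), a local branch (slice-based pooling), and a gated fusion of the two. Below is a minimal NumPy sketch of that global-focal idea only; the landmark selection, slice pooling, and the linear stand-in for the convolutional gate are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(a):
    a = a - a.max(axis=-1, keepdims=True)
    e = np.exp(a)
    return e / e.sum(axis=-1, keepdims=True)

def nystrom_global(x, n_landmarks=4):
    # Global branch: Nyström-style low-rank attention. Tokens attend to a
    # small set of landmark tokens, which attend back to all tokens,
    # approximating full softmax self-attention at linear cost.
    n, d = x.shape
    idx = np.linspace(0, n - 1, n_landmarks).astype(int)  # assumed: uniform landmarks
    landmarks = x[idx]                                    # (m, d)
    scale = np.sqrt(d)
    a1 = softmax(x @ landmarks.T / scale)                 # (n, m) tokens -> landmarks
    a2 = softmax(landmarks @ x.T / scale)                 # (m, n) landmarks -> tokens
    return a1 @ (a2 @ x)                                  # (n, d) global mixing

def focal_slices(x, slice_size=8):
    # Focal branch: mix information only within contiguous slices of tokens
    # (mean pooling here as a simple stand-in for the paper's focal blocks).
    n, d = x.shape
    out = np.empty_like(x)
    for s in range(0, n, slice_size):
        out[s:s + slice_size] = x[s:s + slice_size].mean(axis=0)
    return out

def gated_fusion(g, f, w):
    # Gating block: a sigmoid gate computed from both branches decides, per
    # token and channel, how much global vs. local signal passes through.
    # (A linear map `w` stands in for the convolution-based gate.)
    gate = 1.0 / (1.0 + np.exp(-(np.concatenate([g, f], axis=-1) @ w)))
    return gate * g + (1.0 - gate) * f

n_tokens, d = 32, 16
x = rng.standard_normal((n_tokens, d))        # physics-aware tokens (illustrative)
w = rng.standard_normal((2 * d, d)) / np.sqrt(2 * d)

y = gated_fusion(nystrom_global(x), focal_slices(x), w)
print(y.shape)  # (32, 16): fused tokens keep the input shape
```

In a full model such blocks would be stacked and trained end to end; the point of the sketch is only the information flow: global and local token mixtures computed in parallel, then blended by a learned gate.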
Fangzhi Fei, Jiaxin Hu, Qiaofeng Li, Zhenyu Liu
Physics; Mathematics
Fangzhi Fei, Jiaxin Hu, Qiaofeng Li, Zhenyu Liu. GFocal: A Global-Focal Neural Operator for Solving PDEs on Arbitrary Geometries [EB/OL]. (2025-08-06) [2025-08-17]. https://arxiv.org/abs/2508.04463.