BSA: Ball Sparse Attention for Large-scale Geometries
Self-attention scales quadratically with input size, limiting its use for large-scale physical systems. Although sparse attention mechanisms provide a viable alternative, they are primarily designed for regular structures such as text or images, making them inapplicable to irregular geometries. In this work, we present Ball Sparse Attention (BSA), which adapts Native Sparse Attention (NSA) (Yuan et al., 2025) to unordered point sets by imposing regularity using the Ball Tree structure from the Erwin Transformer (Zhdanov et al., 2025). We modify NSA's components to operate on ball-based neighborhoods, yielding a global receptive field at sub-quadratic cost. On an airflow pressure prediction task, we achieve accuracy comparable to Full Attention while significantly reducing the theoretical computational complexity. Our implementation is available at https://github.com/britacatalin/bsa.
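As a rough illustration of the idea described above (not the authors' implementation), the sketch below combines attention restricted to fixed-size balls with a coarse component in which every point attends to per-ball mean tokens. It assumes the points have already been reordered by a ball tree so that consecutive points form one ball; the names ball_local_attention, coarse_global_attention, and ball_size are illustrative only.

```python
# Minimal sketch of ball-restricted attention plus a coarse global component,
# assuming points are pre-sorted so every `ball_size` consecutive points form a ball.
import torch

def ball_local_attention(x, ball_size):
    """Attention computed independently within non-overlapping balls of `ball_size` points."""
    B, N, D = x.shape                                   # batch, points (multiple of ball_size), channels
    q = k = v = x.view(B, N // ball_size, ball_size, D)
    attn = torch.softmax(q @ k.transpose(-2, -1) / D**0.5, dim=-1)
    return (attn @ v).reshape(B, N, D)

def coarse_global_attention(x, ball_size):
    """Every point attends to one compressed (mean) token per ball, giving a global receptive field."""
    B, N, D = x.shape
    ball_means = x.view(B, N // ball_size, ball_size, D).mean(dim=2)   # (B, N/ball_size, D)
    attn = torch.softmax(x @ ball_means.transpose(-2, -1) / D**0.5, dim=-1)
    return attn @ ball_means

x = torch.randn(2, 256, 32)                             # 256 points per sample, balls of 16 points
out = ball_local_attention(x, 16) + coarse_global_attention(x, 16)
print(out.shape)                                        # torch.Size([2, 256, 32])
```

Because each point attends to ball_size local points plus N / ball_size compressed tokens, the cost grows sub-quadratically in N while every point still receives global information.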
Catalin E. Brita, Hieu Nguyen, Lohithsai Yadala Chanchu, Domonkos Nagy, Maksim Zhdanov
Mechanics; Fundamental Engineering Science
Catalin E. Brita, Hieu Nguyen, Lohithsai Yadala Chanchu, Domonkos Nagy, Maksim Zhdanov. BSA: Ball Sparse Attention for Large-scale Geometries [EB/OL]. (2025-06-14) [2025-07-17]. https://arxiv.org/abs/2506.12541.