
Flash Invariant Point Attention

Source: arXiv

Abstract

Invariant Point Attention (IPA) is a key algorithm for geometry-aware modeling in structural biology, central to many protein and RNA models. However, its quadratic complexity limits the input sequence length. We introduce FlashIPA, a factorized reformulation of IPA that leverages hardware-efficient FlashAttention to achieve linear scaling in GPU memory and wall-clock time with sequence length. FlashIPA matches or exceeds standard IPA performance while substantially reducing computational costs. FlashIPA extends training to previously unattainable lengths, and we demonstrate this by re-training generative models without length restrictions and generating structures of thousands of residues. FlashIPA is available at https://github.com/flagshippioneering/flash_ipa.
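The authors' implementation lives at the GitHub link above. The sketch below is not FlashIPA itself; it is only an illustrative, hypothetical example of the general idea described in the abstract: if a pairwise attention bias b_ij admits a low-rank factorization b_ij ≈ ⟨u_i, w_j⟩, the factors can be folded into the query and key tensors, so a fused memory-efficient attention kernel (here PyTorch's scaled_dot_product_attention, standing in for FlashAttention; the scale= keyword assumes a recent PyTorch) never materializes the N×N logit matrix. The function names, the factor rank R, and the tensor shapes are all assumptions made for illustration.

import torch
import torch.nn.functional as F

def biased_attention_naive(q, k, v, u, w):
    # q, k, v: (B, H, N, D); u, w: (B, H, N, R) are low-rank factors of the
    # pairwise bias b_ij = <u_i, w_j>.  This path materializes the full
    # N x N logit matrix, so memory grows quadratically with length N.
    logits = (q @ k.transpose(-2, -1)) / q.shape[-1] ** 0.5
    logits = logits + u @ w.transpose(-2, -1)
    return torch.softmax(logits, dim=-1) @ v

def biased_attention_factorized(q, k, v, u, w):
    # Fold the bias factors into the query/key tensors so the whole logit
    # becomes a single dot product; a fused kernel (FlashAttention when the
    # backend supports it) then never builds the N x N matrix explicitly.
    scale = 1.0 / q.shape[-1] ** 0.5
    q_aug = torch.cat([q * scale, u], dim=-1)   # (B, H, N, D + R)
    k_aug = torch.cat([k, w], dim=-1)           # (B, H, N, D + R)
    return F.scaled_dot_product_attention(q_aug, k_aug, v, scale=1.0)

# Quick numerical check that both paths agree.
B, H, N, D, R = 1, 4, 256, 32, 8
q, k, v = (torch.randn(B, H, N, D) for _ in range(3))
u, w = torch.randn(B, H, N, R), torch.randn(B, H, N, R)
out_naive = biased_attention_naive(q, k, v, u, w)
out_fused = biased_attention_factorized(q, k, v, u, w)
assert torch.allclose(out_naive, out_fused, atol=1e-4)

The design point of such a factorization is that the fused kernel only ever sees augmented query/key tensors of shape (B, H, N, D + R), so peak activation memory scales linearly with sequence length rather than quadratically.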

Andrew Liu, Axel Elaldi, Nicholas T Franklin, Nathan Russell, Gurinder S Atwal, Yih-En A Ban, Olivia Viessmann

Subjects: Biological science research methods, biological science research techniques

Andrew Liu, Axel Elaldi, Nicholas T Franklin, Nathan Russell, Gurinder S Atwal, Yih-En A Ban, Olivia Viessmann. Flash Invariant Point Attention [EB/OL]. (2025-05-16) [2025-06-12]. https://arxiv.org/abs/2505.11580.
