Inter2Former: Dynamic Hybrid Attention for Efficient High-Precision Interactive Segmentation
Interactive segmentation (IS) improves annotation efficiency by segmenting target regions from user prompts, with widespread applications in real-world scenarios. Current approaches face a critical trade-off: dense-token methods achieve superior accuracy and detail preservation but suffer from prohibitively slow processing on CPU devices, while the Segment Anything Model (SAM) advances the field with sparse prompt tokens for fast inference but compromises segmentation quality. In this paper, we propose Inter2Former to address this challenge by optimizing computation allocation in dense-token processing, introducing four key enhancements. First, we propose Dynamic Prompt Embedding (DPE), which adaptively processes only regions of interest and avoids the additional overhead of background tokens. Second, we introduce Dynamic Hybrid Attention (DHA), which leverages the previous segmentation mask to route tokens through either full attention (O(N²)) for boundary regions or our proposed efficient BSQ attention (O(N)) for non-boundary regions. Third, we develop Hybrid Mixture of Experts (HMoE), which applies a similar adaptive computation strategy in the FFN modules with CPU-optimized parallel processing. Finally, we present Dynamic Local Upsampling (DLU), the reverse operation of DPE, which localizes objects with a lightweight MLP and performs fine-grained upsampling only in the detected regions. Experimental results on high-precision IS benchmarks demonstrate that Inter2Former achieves SOTA performance with high efficiency on CPU devices.
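The abstract describes DHA only at a high level, but the routing idea can be illustrated concretely. Below is a minimal PyTorch sketch, not the authors' implementation: the boundary detector (morphological dilation minus erosion on the previous mask), the elu-kernel linear attention used as a stand-in for the paper's BSQ attention, and all module and function names are assumptions introduced for illustration.

# Illustrative sketch of mask-guided token routing in the spirit of DHA.
# Boundary tokens (from the previous mask) get O(N^2) softmax attention;
# the rest get an O(N) kernelized linear attention (stand-in for BSQ).
import torch
import torch.nn.functional as F
from torch import nn


def boundary_tokens(prev_mask: torch.Tensor, k: int = 3) -> torch.Tensor:
    """Mark tokens near the mask boundary, where dilation != erosion.

    prev_mask: (B, 1, H, W) binary mask from the previous round.
    Returns a (B, H*W) boolean tensor, True for boundary tokens.
    """
    pad = k // 2
    dil = F.max_pool2d(prev_mask, k, stride=1, padding=pad)
    ero = -F.max_pool2d(-prev_mask, k, stride=1, padding=pad)
    return (dil != ero).flatten(1)


class RoutedAttention(nn.Module):
    """Per-token hybrid attention guided by a boundary indicator."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.heads = heads
        self.qkv = nn.Linear(dim, dim * 3)
        self.proj = nn.Linear(dim, dim)

    def _split(self, t: torch.Tensor) -> torch.Tensor:
        # (B, N, C) -> (B, heads, N, C // heads)
        B, N, C = t.shape
        return t.view(B, N, self.heads, C // self.heads).transpose(1, 2)

    def forward(self, x: torch.Tensor, is_boundary: torch.Tensor) -> torch.Tensor:
        B, N, C = x.shape
        q, k, v = (self._split(t) for t in self.qkv(x).chunk(3, dim=-1))

        # Full softmax attention, computed for all tokens here for clarity;
        # a real implementation would gather only the boundary queries.
        full = F.scaled_dot_product_attention(q, k, v)

        # Linear attention: phi(q) (phi(k)^T v) with phi = elu + 1.
        phi_q, phi_k = F.elu(q) + 1, F.elu(k) + 1
        kv = torch.einsum("bhnd,bhne->bhde", phi_k, v)
        z = 1.0 / (torch.einsum("bhnd,bhd->bhn", phi_q, phi_k.sum(2)) + 1e-6)
        lin = torch.einsum("bhnd,bhde,bhn->bhne", phi_q, kv, z)

        # Route each token by the boundary indicator from the previous mask.
        route = is_boundary.view(B, 1, N, 1)
        out = torch.where(route, full, lin)
        return self.proj(out.transpose(1, 2).reshape(B, N, C))

Since boundary tokens are typically a small fraction of the N image tokens, restricting the quadratic branch to them is what yields the claimed savings on CPU; the sketch keeps both branches dense purely for readability.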
You Huang, Lichao Chen, Jiayi Ji, Liujuan Cao, Shengchuan Zhang, Rongrong Ji
Computing Technology, Computer Technology
You Huang, Lichao Chen, Jiayi Ji, Liujuan Cao, Shengchuan Zhang, Rongrong Ji. Inter2Former: Dynamic Hybrid Attention for Efficient High-Precision Interactive Segmentation [EB/OL]. (2025-07-13) [2025-07-23]. https://arxiv.org/abs/2507.09612