ESPFormer: Doubly-Stochastic Attention with Expected Sliced Transport Plans
While self-attention has been instrumental in the success of Transformers, it can lead to over-concentration on a few tokens during training, resulting in suboptimal information flow. Enforcing doubly-stochastic constraints on attention matrices has been shown to improve the structure and balance of attention distributions. However, existing methods rely on iterative Sinkhorn normalization, which is computationally costly. In this paper, we introduce a novel, fully parallelizable doubly-stochastic attention mechanism based on sliced optimal transport, leveraging Expected Sliced Transport Plans (ESP). Unlike prior approaches, our method enforces double stochasticity without iterative Sinkhorn normalization, significantly improving efficiency. To ensure differentiability, we incorporate a temperature-based soft sorting technique, enabling seamless integration into deep learning models. Experiments on multiple benchmarks, spanning image classification, point cloud classification, sentiment analysis, and neural machine translation, demonstrate that our enhanced attention regularization consistently improves performance across diverse applications. Our implementation code can be found at https://github.com/dariansal/ESPFormer.
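To make the mechanism concrete, the sketch below is an illustrative reading of ESP-style attention, not the authors' implementation (see the linked repository for the official code): project queries and keys onto a random unit direction, soft-sort the projections with a temperature-controlled relaxation, and average the resulting rank-matching plans over slices. Each hard one-dimensional transport plan between equal-size, uniformly weighted point sets is a permutation matrix, so their average is doubly stochastic (a convex combination of permutations); the soft-sorted version is only approximately so at finite temperature. The names `soft_sort` and `esp_attention`, and all hyperparameter choices, are assumptions made for this sketch.

```python
# Conceptual sketch of sliced-transport doubly-stochastic attention.
# Not the authors' code; function names and defaults are illustrative.

import torch

def soft_sort(s: torch.Tensor, tau: float = 0.1) -> torch.Tensor:
    """Differentiable relaxation of the sorting permutation (SoftSort-style).

    Row i of the returned (n, n) matrix is a softmax that concentrates on
    the index of the i-th smallest entry of `s` as tau -> 0.
    """
    s_sorted, _ = torch.sort(s)                             # (n,)
    pairwise = -torch.abs(s_sorted[:, None] - s[None, :])   # (n, n)
    return torch.softmax(pairwise / tau, dim=-1)

def esp_attention(Q: torch.Tensor, K: torch.Tensor, V: torch.Tensor,
                  num_slices: int = 16, tau: float = 0.1) -> torch.Tensor:
    """Average (soft) rank-matching plans over random 1-D slices."""
    n, d = Q.shape
    plan = torch.zeros(n, n, device=Q.device)
    for _ in range(num_slices):
        theta = torch.randn(d, device=Q.device)
        theta = theta / theta.norm()          # random unit direction (slice)
        Sq = soft_sort(Q @ theta, tau)        # (n, n) soft sorter for queries
        Sk = soft_sort(K @ theta, tau)        # (n, n) soft sorter for keys
        # (Sq.T @ Sk)[i, j] ~ 1 iff query i and key j share the same rank
        # along this slice, i.e. a (soft) permutation matching.
        plan = plan + Sq.T @ Sk
    plan = plan / num_slices                  # average of (soft) permutations
    return plan @ V

# Usage: self-attention over 8 tokens of dimension 32.
Q, K, V = (torch.randn(8, 32) for _ in range(3))
out = esp_attention(Q, K, V)                  # (8, 32)
```

In practice this would be batched over heads and sequences; the point of the construction is that every slice's plan is computed in a single parallel pass, with no iterative Sinkhorn normalization.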
Elaheh Akbari, Ashkan Shahbazi, Darian Salehi, Xinran Liu, Navid Naderializadeh, Soheil Kolouri
Computing Technology, Computer Technology
Elaheh Akbari, Ashkan Shahbazi, Darian Salehi, Xinran Liu, Navid Naderializadeh, Soheil Kolouri. ESPFormer: Doubly-Stochastic Attention with Expected Sliced Transport Plans [EB/OL]. (2025-07-12) [2025-07-25]. https://arxiv.org/abs/2502.07962.