National Preprint Platform

Hybrid Mamba-Transformer Decoder for Error-Correcting Codes


Source: arXiv
Abstract

We introduce a novel deep learning method for decoding error-correcting codes based on the Mamba architecture, enhanced with Transformer layers. We propose a hybrid decoder that leverages Mamba's efficient sequential modeling while retaining the global context capabilities of Transformers. To further improve performance, we design a novel layer-wise masking strategy applied to each Mamba layer, allowing selective attention to relevant code features at different depths. Additionally, we introduce a progressive layer-wise loss that supervises the network at intermediate stages and promotes robust feature extraction throughout the decoding process. Comprehensive experiments across a range of linear codes demonstrate that our method significantly outperforms Transformer-only decoders and standard Mamba models.
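The progressive layer-wise loss mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the uniform layer weights and the mean-squared criterion are assumptions (a neural decoder would more typically apply a binary cross-entropy loss on soft bit estimates at each layer).

```python
def progressive_layerwise_loss(intermediate_outputs, target, layer_weights=None):
    """Aggregate a supervision loss over every decoder layer's output.

    intermediate_outputs: list of per-layer soft bit estimates (lists of floats)
    target: ground-truth bits (list of floats)
    layer_weights: optional per-layer weights; defaults to uniform
    """
    n_layers = len(intermediate_outputs)
    if layer_weights is None:
        layer_weights = [1.0 / n_layers] * n_layers
    total = 0.0
    for w, out in zip(layer_weights, intermediate_outputs):
        # Mean-squared error between this layer's estimate and the target
        # (illustrative criterion only; see the hedge in the lead-in).
        mse = sum((o - t) ** 2 for o, t in zip(out, target)) / len(target)
        total += w * mse
    return total
```

Supervising every intermediate layer this way gives each stage of the decoder a direct gradient signal, rather than relying solely on the final output loss.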

Shy-el Cohen, Yoni Choukroun, Eliya Nachmani

Subject areas: Computing and Computer Technology; Communications

Shy-el Cohen, Yoni Choukroun, Eliya Nachmani. Hybrid Mamba-Transformer Decoder for Error-Correcting Codes [EB/OL]. (2025-05-23) [2025-07-17]. https://arxiv.org/abs/2505.17834.
