National Preprint Platform

Short Wins Long: Short Codes with Language Model Semantic Correction Outperform Long Codes


Source: arXiv
English Abstract

This paper presents a novel semantic-enhanced decoding scheme for transmitting natural language sentences with multiple short block codes over noisy wireless channels. After ASCII source coding, the natural language message is divided into segments, each of which is independently encoded with a short block channel code before transmission. At the receiver, the short codeword blocks are decoded in parallel, and a semantic error correction (SEC) model then semantically reconstructs corrupted segments. We design and train the SEC model based on Bidirectional and Auto-Regressive Transformers (BART). Simulations demonstrate that the proposed scheme significantly outperforms encoding the sentence with one conventional long LDPC code in terms of block error rate (BLER), semantic metrics, and decoding latency. Finally, we propose a semantic hybrid automatic repeat request (HARQ) scheme that further enhances error performance by selectively requesting retransmissions based on semantic uncertainty.
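The segmentation step described above (ASCII source coding followed by division into independently encoded short blocks) can be sketched as follows. This is a minimal illustration only: the segment length `k = 32` and the zero-padding policy for the final segment are assumptions for demonstration, not parameters taken from the paper.

```python
# Sketch: ASCII source coding of a sentence, then splitting the bit
# stream into fixed-length segments for independent short-block encoding.

def sentence_to_bits(sentence: str) -> list[int]:
    """ASCII source coding: 8 bits per character, MSB first."""
    bits = []
    for ch in sentence:
        code = ord(ch)
        bits.extend((code >> i) & 1 for i in range(7, -1, -1))
    return bits

def segment_bits(bits: list[int], k: int) -> list[list[int]]:
    """Split the bit stream into length-k segments; zero-pad the last one."""
    segments = [bits[i:i + k] for i in range(0, len(bits), k)]
    if segments and len(segments[-1]) < k:
        segments[-1] = segments[-1] + [0] * (k - len(segments[-1]))
    return segments

bits = sentence_to_bits("Short wins long")   # 15 chars -> 120 bits
segments = segment_bits(bits, k=32)          # -> 4 segments of 32 bits each
print(len(bits), len(segments))
```

Each segment would then be passed to a short block channel encoder; at the receiver, the segments can be decoded in parallel, which is the source of the latency advantage the abstract reports over a single long code.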

Jiafu Hao, Chentao Yue, Hao Chang, Branka Vucetic, Yonghui Li

Wireless Communications

Jiafu Hao, Chentao Yue, Hao Chang, Branka Vucetic, Yonghui Li. Short Wins Long: Short Codes with Language Model Semantic Correction Outperform Long Codes [EB/OL]. (2025-05-13) [2025-06-24]. https://arxiv.org/abs/2505.08536.
