Language Model for Large-Text Transmission in Noisy Quantum Communications
Quantum communication has the potential to revolutionize information processing, providing unparalleled security and increased capacity compared to its classical counterpart by using the principles of quantum mechanics. However, the presence of noise remains a major barrier to realizing these advantages. While strategies like quantum error correction and mitigation have been developed to address this challenge, they often come with substantial overhead in physical qubits or sample complexity, limiting their practicality for large-scale information transfer. Here, we present an alternative approach: applying machine learning frameworks from natural language processing to enhance the performance of noisy quantum communications, focusing on superdense coding. By employing bidirectional encoder representations from transformers (BERT), a model known for its capabilities in natural language processing, we demonstrate improvements in information transfer efficiency without resorting to conventional error correction or mitigation techniques. These results mark a step toward the practical realization of a scalable and resilient quantum internet.
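For readers unfamiliar with the protocol named in the abstract, the sketch below illustrates standard superdense coding of two classical bits over a shared Bell pair, with a depolarizing channel standing in for the noise the paper seeks to compensate. This is not the authors' implementation and does not include the BERT-based post-processing; the noise model, helper names, and parameter choices are assumptions for illustration only.

```python
# Minimal sketch (assumed example, not the paper's code): superdense coding of
# two classical bits over a noisy channel, simulated with plain NumPy.
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

s = 1 / np.sqrt(2)
# Shared Bell state |Phi+> = (|00> + |11>) / sqrt(2).
phi_plus = np.array([s, 0, 0, s], dtype=complex)

# Alice encodes two classical bits by acting on her qubit only.
ENCODINGS = {"00": I2, "01": X, "10": Z, "11": X @ Z}

# Bob's Bell-basis measurement vectors and the message each outcome decodes to.
BELL_STATES = {
    "00": np.array([s, 0, 0, s], dtype=complex),   # |Phi+>
    "10": np.array([s, 0, 0, -s], dtype=complex),  # |Phi->
    "01": np.array([0, s, s, 0], dtype=complex),   # |Psi+>
    "11": np.array([0, s, -s, 0], dtype=complex),  # |Psi->, up to a global phase
}

def depolarize(rho, p):
    """Two-qubit depolarizing channel with strength p (assumed noise model)."""
    return (1 - p) * rho + p * np.eye(4) / 4

def success_probability(bits, p):
    """Probability that Bob's Bell measurement recovers `bits` under noise p."""
    u = np.kron(ENCODINGS[bits], I2)              # Alice acts on her half only
    state = u @ phi_plus
    rho = depolarize(np.outer(state, state.conj()), p)
    target = BELL_STATES[bits]
    return float(np.real(target.conj() @ rho @ target))

for p in (0.0, 0.1, 0.3):
    print(p, [round(success_probability(b, p), 3) for b in ("00", "01", "10", "11")])
```

Without noise every message is recovered with certainty; as the depolarizing strength grows, the success probability falls toward 1/4, which is the regime where classical post-processing such as the language-model decoding described in the abstract becomes relevant.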
Yuqi Li, Zhouhang Shi, Haitao Ma, Li Shen, Jinge Bao, Yunlong Xiao
Communications · Wireless Communications
Yuqi Li, Zhouhang Shi, Haitao Ma, Li Shen, Jinge Bao, Yunlong Xiao. Language Model for Large-Text Transmission in Noisy Quantum Communications [EB/OL]. (2025-04-29) [2025-06-05]. https://arxiv.org/abs/2504.20842.