
Language Model for Large-Text Transmission in Noisy Quantum Communications

Yuqi Li, Zhou Shi, Haitao Ma, Lijiong Shen, Jing Bao, Yunlong Xiao · April 29, 2025
Physics


Abstract

Quantum communication has the potential to revolutionize information processing, providing unparalleled security and increased capacity compared to its classical counterpart by using the principles of quantum mechanics. However, the presence of noise remains a major barrier to realizing these advantages. While strategies like quantum error correction and mitigation have been developed to address this challenge, they often come with substantial overhead in physical qubits or sample complexity, limiting their practicality for large-scale information transfer. Here, we present an alternative approach: applying machine learning frameworks from natural language processing to enhance the performance of noisy quantum communications, focusing on superdense coding. By employing bidirectional encoder representations from transformers (BERT), a model known for its capabilities in natural language processing, we demonstrate improvements in information transfer efficiency without resorting to conventional error correction or mitigation techniques. These results mark a step toward the practical realization of a scalable and resilient quantum internet.
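For readers unfamiliar with superdense coding, the protocol the paper builds on: two parties share a Bell pair, the sender applies one of the four Pauli operations to her qubit to encode two classical bits, and the receiver recovers them with a Bell-basis measurement. The sketch below simulates the noiseless protocol with plain state vectors; it is an illustration only and does not include the noisy channel or the BERT-based decoding that the paper contributes (all function names here are hypothetical, not from the paper).

```python
import numpy as np

# Pauli operators
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Shared Bell state |Phi+> = (|00> + |11>)/sqrt(2)
BELL = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Alice's encoding: two classical bits -> one Pauli on her qubit
ENCODING = {(0, 0): I2, (0, 1): X, (1, 0): Z, (1, 1): X @ Z}

def superdense_send(bits):
    """Alice applies her Pauli to the first qubit of the shared pair."""
    return np.kron(ENCODING[bits], I2) @ BELL

def superdense_decode(state):
    """Bob's Bell measurement: project onto the four (orthogonal) Bell states."""
    probs = {bits: abs(np.vdot(np.kron(U, I2) @ BELL, state)) ** 2
             for bits, U in ENCODING.items()}
    return max(probs, key=probs.get)

# Without noise, all four two-bit messages are recovered perfectly.
for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert superdense_decode(superdense_send(bits)) == bits
```

Under a noisy channel the four received states are no longer orthogonal, which is where the paper's language-model decoder comes in: it exploits statistical structure in the transmitted text rather than correcting each qubit individually.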
