
Quantum Complex-Valued Self-Attention Model


Source: arXiv

Abstract

Self-attention has revolutionized classical machine learning, yet existing quantum self-attention models underutilize quantum states' potential due to oversimplified or incomplete mechanisms. To address this limitation, we introduce the Quantum Complex-Valued Self-Attention Model (QCSAM), the first framework to leverage complex-valued similarities, which capture amplitude and phase relationships between quantum states more comprehensively. To achieve this, QCSAM extends the Linear Combination of Unitaries (LCUs) into the Complex LCUs (CLCUs) framework, enabling precise complex-valued weighting of quantum states and supporting quantum multi-head attention. Experiments on MNIST and Fashion-MNIST show that QCSAM outperforms recent quantum self-attention models, including QKSAN, QSAN, and GQHAN. With only 4 qubits, QCSAM achieves 100% and 99.2% test accuracies on MNIST and Fashion-MNIST, respectively. Furthermore, we evaluate scalability across 3-8 qubits and 2-4 class tasks, while ablation studies validate the advantages of complex-valued attention weights over real-valued alternatives. This work advances quantum machine learning by enhancing the expressiveness and precision of quantum self-attention in a way that aligns with the inherent complexity of quantum mechanics.
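The abstract gives no implementation details, but the central idea of using complex-valued similarities can be illustrated with a small classical simulation. The sketch below is a minimal NumPy toy, assuming that query/key/value tokens are quantum state vectors (here random normalized vectors standing in for parameterized-circuit outputs) and that the complex similarity is the inner product ⟨q_i|k_j⟩. The softmax-over-magnitude weighting and the function names are illustrative assumptions, not the paper's CLCU construction.

```python
import numpy as np

def random_state(n_qubits, rng):
    """A random pure state vector, standing in for a parameterized-circuit output."""
    dim = 2 ** n_qubits
    psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return psi / np.linalg.norm(psi)

def complex_attention(queries, keys, values):
    """Toy complex-valued self-attention.

    The similarity between tokens i and j is the complex inner product
    <q_i|k_j>, which carries both a magnitude and a relative phase.
    Here the magnitudes are softmax-normalized into real weights while
    the phase factors are retained when mixing the value states
    (an illustrative choice, not the paper's CLCU scheme).
    """
    sims = np.array([[np.vdot(q, k) for k in keys] for q in queries])  # complex scores
    mags = np.exp(np.abs(sims))
    weights = mags / mags.sum(axis=1, keepdims=True)    # real, row-stochastic
    phases = np.exp(1j * np.angle(sims))                # unit-modulus phase factors
    mixed = (weights * phases) @ np.stack(values)       # complex-weighted sum of values
    # Renormalize each output so it remains a valid state vector.
    return mixed / np.linalg.norm(mixed, axis=1, keepdims=True)

rng = np.random.default_rng(0)
n_qubits, n_tokens = 4, 3
q = [random_state(n_qubits, rng) for _ in range(n_tokens)]
k = [random_state(n_qubits, rng) for _ in range(n_tokens)]
v = [random_state(n_qubits, rng) for _ in range(n_tokens)]
out = complex_attention(q, k, v)
print(out.shape)  # (3, 16): one 4-qubit output state per token
```

The point of the sketch is only that a complex score encodes strictly more information than its real part or magnitude alone, which is what the ablation comparison between complex- and real-valued attention weights probes.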

Longfei Tang, Fu Chen, Qinglin Zhao, Li Feng, Yangbin Lin, Haitao Huang

Subject: Computing and Computer Technology

Longfei Tang, Fu Chen, Qinglin Zhao, Li Feng, Yangbin Lin, Haitao Huang. Quantum Complex-Valued Self-Attention Model [EB/OL]. (2025-03-24) [2025-04-27]. https://arxiv.org/abs/2503.19002.
