When Quantum and Classical Models Disagree: Learning Beyond Minimum Norm Least Square
Quantum Machine Learning algorithms based on Variational Quantum Circuits (VQCs) are promising candidates for useful applications of quantum computing. It is known that a VQC is a linear model in a feature space determined by its architecture. Such models can be compared to classical ones using various sets of tools, and surrogate models designed to classically approximate their results have been proposed. At the same time, quantum advantages for learning tasks have been proven in the case of discrete data distributions and cryptography primitives. In this work, we propose a general theory of quantum advantages for regression problems. Using previous results, we establish conditions on the weight vectors of the quantum models that are necessary to avoid dequantization. We show that this theory is compatible with previously proven quantum advantages on discrete inputs, and provides examples of advantages for continuous inputs. This separation is connected to a large weight-vector norm, and we suggest that this can only happen with a high-dimensional feature map. Our results demonstrate that it is possible to design quantum models that cannot be classically approximated with good generalization. Finally, we discuss how concentration issues must be considered to design such instances. We expect that our work will be a starting point for designing near-term quantum models that avoid dequantization methods by ensuring non-classical convergence properties, and for identifying existing quantum models that can be classically approximated.
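As an illustrative sketch (not taken from the paper), the minimum-norm least-squares baseline that the title refers to can be computed for a linear model f(x) = ⟨w, φ(x)⟩ via the Moore-Penrose pseudoinverse. In the overparameterized regime there are infinitely many interpolating weight vectors, and the pseudoinverse selects the one of smallest norm; the abstract's separation argument concerns models whose weight vectors are necessarily far from this minimum-norm solution. All names and dimensions below are hypothetical, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features = 5, 20  # underdetermined: more features than samples
Phi = rng.normal(size=(n_samples, n_features))  # feature matrix, rows phi(x_i)
y = rng.normal(size=n_samples)                  # regression targets

# Minimum-norm least-squares weights via the Moore-Penrose pseudoinverse.
w_min = np.linalg.pinv(Phi) @ y
residual = np.linalg.norm(Phi @ w_min - y)
print(residual < 1e-8)  # exact interpolation in the underdetermined regime

# Any other interpolating solution w_min + v, with v in the null space of
# Phi, fits the data equally well but has a strictly larger norm.
v = rng.normal(size=n_features)
v -= Phi.T @ np.linalg.pinv(Phi.T) @ v  # project v onto null(Phi)
w_other = w_min + v
print(np.allclose(Phi @ w_other, y))                          # still interpolates
print(np.linalg.norm(w_other) > np.linalg.norm(w_min))        # larger norm
```

A classical surrogate trained by ordinary least squares converges to `w_min`; a quantum model realizing a large-norm interpolant such as `w_other` is, in the sense sketched by the abstract, the kind of instance that evades this dequantization baseline.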
Eliott Z. Mamon, Jonas Landman, Slimane Thabet, Léo Monbroussou
Computing Technology, Computer Technology
Eliott Z. Mamon, Jonas Landman, Slimane Thabet, Léo Monbroussou. When Quantum and Classical Models Disagree: Learning Beyond Minimum Norm Least Square [EB/OL]. (2025-07-08) [2025-07-23]. https://arxiv.org/abs/2411.04940.