
How many qubits does a machine learning problem require?

Source: arXiv
Abstract

For a machine learning paradigm to be generally applicable, it should have the property of universal approximation, that is, it should be able to approximate any target function to any desired degree of accuracy. In variational quantum machine learning, the class of functions that can be learned depends on both the data encoding scheme and the architecture of the optimizable part of the model. Here, we show that the property of universal approximation is constructively and efficiently realized by the recently proposed bit-bit encoding scheme. Further, we show that this construction allows us to calculate the number of qubits required to solve a learning problem on a dataset to a target accuracy, giving rise to the first resource estimation framework for variational quantum machine learning. We apply bit-bit encoding to a number of medium-sized datasets from OpenML and find that they require only $20$ qubits on average for encoding. Further, we extend the basic bit-bit encoding scheme to one that can handle very large datasets through batching. As a demonstration, we apply this new scheme to the giga-scale transcriptomic Tahoe-100M dataset, concluding that the number of qubits required for encoding it lies beyond classical simulation capabilities. Remarkably, we find that the number of qubits does not necessarily increase with the number of features of a dataset, but may sometimes even decrease.
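To make the resource estimation idea concrete, the following minimal Python sketch counts qubits for a toy, data-dependent encoding. It is a hypothetical illustration, not the paper's bit-bit formula: the min-max quantization step, the rule of taking the log of the number of distinct quantized rows, and the names quantize and toy_qubit_estimate are all assumptions introduced here. The sketch does, however, show one mechanism behind the abstract's closing observation: a duplicated or redundant feature adds no new distinct rows, so the estimated qubit count need not grow with the feature count.

import math
import numpy as np

def quantize(X: np.ndarray, bits: int) -> np.ndarray:
    """Min-max scale each feature column, then round to `bits`-bit integers."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)
    levels = (1 << bits) - 1
    return np.round((X - lo) / span * levels).astype(np.int64)

def toy_qubit_estimate(X: np.ndarray, bits: int = 4) -> int:
    """Hypothetical estimate: qubits needed to index the distinct quantized
    rows of X. Illustrative only; the paper's bit-bit accounting differs."""
    patterns = {row.tobytes() for row in quantize(X, bits)}
    return max(1, math.ceil(math.log2(len(patterns))))

# Duplicating a feature column doubles the feature count but leaves the
# set of distinct quantized rows unchanged, so the estimate is identical.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
print(toy_qubit_estimate(X))                  # e.g. 10 for 1000 distinct rows
print(toy_qubit_estimate(np.hstack([X, X])))  # same value, despite 20 features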

Sydney Leither, Michael Kubal, Sonika Johri

Subjects: Computing Technology, Computer Technology

Sydney Leither, Michael Kubal, Sonika Johri. How many qubits does a machine learning problem require? [EB/OL]. (2025-08-28) [2025-09-06]. https://arxiv.org/abs/2508.20992.
