Personalized Improvement of Standard Readout Error Mitigation using Low-Depth Circuits and Machine Learning
Quantum computers have shown promise in improving algorithms in a variety of fields. The realization of these advancements is limited by the presence of noise and high error rates, which become especially prominent with increasing system size. Mitigation techniques using matrix inversions, unfolding, and deep learning, among others, have been leveraged to reduce this error. However, these methods do not reflect the entire gate set of the quantum device and may require further tuning depending on the time elapsed since the most recent calibration. This paper proposes an improvement to standard numerical readout error mitigation, in which the readout error model is refined using measured probability distributions from a collection of low-depth circuits. We use machine learning to improve the readout error model for the quantum system, testing the circuits on the simulated IBM Perth backend using Qiskit. We demonstrate a median 6.6% improvement in fidelity, a 29.9% improvement in mean-squared error, and a 10.3% improvement in Hellinger distance over the standard error mitigation approach for a seven-qubit system with a circuit depth of four. With continued focus on improving such error mitigation techniques, we move one step closer to the fault-tolerant quantum computing era.
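For context, the sketch below illustrates the baseline this work refines: standard readout error mitigation by (pseudo-)inversion of a measured confusion matrix, together with two of the reported metrics (mean-squared error and Hellinger distance). It is a minimal illustration assuming a column-stochastic confusion matrix; the function names and toy data are hypothetical and do not come from the paper's code.

```python
import numpy as np

def mitigate_readout(raw_probs: np.ndarray, confusion: np.ndarray) -> np.ndarray:
    """Standard readout mitigation: solve confusion @ ideal = raw.

    raw_probs : measured probability vector over bitstrings (length 2**n)
    confusion : matrix A with A[i, j] = P(measure i | prepared j), columns sum to 1
    """
    ideal = np.linalg.pinv(confusion) @ raw_probs
    # Inversion can yield small negative entries; clip and renormalize.
    ideal = np.clip(ideal, 0.0, None)
    return ideal / ideal.sum()

def mse(p: np.ndarray, q: np.ndarray) -> float:
    """Mean-squared error between two probability distributions."""
    return float(np.mean((p - q) ** 2))

def hellinger_distance(p: np.ndarray, q: np.ndarray) -> float:
    """Hellinger distance: (1/sqrt(2)) * ||sqrt(p) - sqrt(q)||_2."""
    return float(np.linalg.norm(np.sqrt(p) - np.sqrt(q)) / np.sqrt(2))

if __name__ == "__main__":
    # Toy single-qubit example with asymmetric readout error (illustrative values).
    confusion = np.array([[0.97, 0.05],
                          [0.03, 0.95]])
    true_dist = np.array([0.5, 0.5])
    raw = confusion @ true_dist              # distribution the noisy readout reports
    mitigated = mitigate_readout(raw, confusion)
    print("raw vs. true MSE       :", mse(raw, true_dist))
    print("mitigated vs. true MSE :", mse(mitigated, true_dist))
    print("raw Hellinger distance :", hellinger_distance(raw, true_dist))
```

The paper's contribution, as described in the abstract, is to refine this kind of readout error model further using machine learning on measured distributions from low-depth circuits rather than relying on the calibration-time confusion matrix alone.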
Melody Lee
Computational techniques in physics; computer technology
Melody Lee. Personalized Improvement of Standard Readout Error Mitigation using Low-Depth Circuits and Machine Learning [EB/OL]. (2025-06-04) [2025-06-28]. https://arxiv.org/abs/2506.03920.