
Preconditioning Natural and Second Order Gradient Descent in Quantum Optimization: A Performance Benchmark

Source: arXiv
Abstract

The optimization of parametric quantum circuits is hindered by three major obstacles: the non-convex nature of the objective function, noisy gradient evaluations, and the presence of barren plateaus. As a result, the choice of classical optimizer becomes a critical factor in assessing and exploiting quantum-classical applications. One promising approach to tackling these challenges is to incorporate curvature information into the parameter update. The most prominent methods in this field are quasi-Newton and quantum natural gradient methods, which can converge faster than first-order approaches. Second-order methods, however, exhibit a significant trade-off between computational cost and accuracy, as well as heightened sensitivity to noise. This study evaluates the performance of three families of optimizers on synthetically generated MaxCut problems using a shallow QAOA circuit. To address noise sensitivity and iteration cost, we demonstrate that incorporating secant penalization in the BFGS update rule (SP-BFGS) yields improved outcomes for QAOA optimization problems, introducing a novel approach to stabilizing BFGS updates against gradient noise.
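The key idea behind secant penalization is that classical BFGS enforces the secant condition exactly, which is brittle when the gradient difference is corrupted by shot noise; SP-BFGS instead penalizes violations of that condition, which softens the update. The sketch below illustrates this softening with a simplified convex blend between the classic BFGS inverse-Hessian update and the previous approximation, controlled by a penalty weight `beta`; the exact SP-BFGS formulas are given in the cited paper, so the blending rule here is an illustrative assumption, not the authors' update.

```python
import numpy as np

def softened_bfgs_update(H, s, y, beta=10.0, eps=1e-10):
    """One inverse-Hessian update, softened against gradient noise.

    H    : current inverse-Hessian approximation (n x n)
    s    : parameter step, x_new - x_old
    y    : gradient difference, g_new - g_old (noisy on hardware)
    beta : penalty weight; beta -> inf recovers plain BFGS,
           beta = 0 keeps H unchanged (fully distrusts y).

    NOTE: the blend below is a simplified stand-in for the true
    SP-BFGS update, used only to illustrate the noise-softening idea.
    """
    sy = float(s @ y)
    # Standard curvature safeguard: skip the update when s^T y is
    # too small, since a noisy y then dominates the correction.
    if sy <= eps * np.linalg.norm(s) * np.linalg.norm(y):
        return H
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # Classic BFGS inverse-Hessian update (satisfies H_new @ y == s).
    H_bfgs = V @ H @ V.T + rho * np.outer(s, s)
    # Penalty-dependent blend toward the previous approximation.
    omega = beta / (beta + 1.0)
    return omega * H_bfgs + (1.0 - omega) * H
```

In a QAOA loop one would call this once per iteration with the measured (noisy) gradient difference, then take the step `-H @ g`; small `beta` is appropriate when shot noise is large relative to the true gradient signal.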

Théo Lisart-Liebermann, Arcesio Castañeda Medina

Subjects: Computing and Computer Technology; Physics

Théo Lisart-Liebermann, Arcesio Castañeda Medina. Preconditioning Natural and Second Order Gradient Descent in Quantum Optimization: A Performance Benchmark [EB/OL]. (2025-04-23) [2025-06-16]. https://arxiv.org/abs/2504.16518.
