Leveraging Operator Learning to Accelerate Convergence of the Preconditioned Conjugate Gradient Method
We propose a new deflation strategy to accelerate the convergence of the preconditioned conjugate gradient (PCG) method for solving parametric large-scale linear systems of equations. Unlike traditional deflation techniques that rely on eigenvector approximations or recycled Krylov subspaces, we generate the deflation subspaces using operator learning, specifically the Deep Operator Network (DeepONet). To this end, we introduce two complementary approaches for assembling the deflation operators. The first approach approximates near-null space vectors of the discrete PDE operator using the basis functions learned by the DeepONet. The second approach directly leverages solutions predicted by the DeepONet. To further enhance convergence, we also propose several strategies for prescribing the sparsity pattern of the deflation operator. A comprehensive set of numerical experiments encompassing steady-state, time-dependent, scalar, and vector-valued problems posed on both structured and unstructured geometries demonstrates the effectiveness of the proposed DeepONet-based deflated PCG method, as well as its generalization across a wide range of model parameters and problem resolutions.
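To make the deflation idea concrete, the following is a minimal sketch of a standard deflated PCG iteration in NumPy, assuming a dense SPD matrix A and a tall matrix W whose columns span the deflation subspace (in the paper's setting, W would be assembled from DeepONet basis functions or predicted solutions). The function name deflated_pcg, the callable M_inv, and the Jacobi preconditioner in the usage note are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def deflated_pcg(A, b, W, M_inv, tol=1e-8, maxit=500):
    """Deflated PCG for an SPD matrix A (dense sketch).

    W (n x k): columns span the deflation subspace, e.g. built from
               DeepONet-predicted solutions (hypothetical usage here).
    M_inv:     callable applying the preconditioner inverse to a vector.
    """
    E = W.T @ (A @ W)                       # k x k coarse (Galerkin) operator
    Q = lambda v: W @ np.linalg.solve(E, W.T @ v)   # in practice, factor E once
    P = lambda v: v - A @ Q(v)              # deflation projection P = I - A Q

    x_hat = np.zeros_like(b)                # iterate for the deflated system P A x = P b
    r = P(b)                                # deflated residual satisfies W^T r = 0
    z = M_inv(r)
    p = z.copy()
    rho = r @ z
    for _ in range(maxit):
        w = P(A @ p)                        # keep search directions in the deflated space
        alpha = rho / (p @ w)
        x_hat += alpha * p
        r -= alpha * w
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        z = M_inv(r)
        rho_new = r @ z
        p = z + (rho_new / rho) * p
        rho = rho_new
    # Recover the full solution: x = Q b + (I - Q A) x_hat
    return Q(b) + x_hat - Q(A @ x_hat)
```

As a usage sketch, one might orthonormalize a few DeepONet predictions for nearby parameter values, `W, _ = np.linalg.qr(np.column_stack(predicted_solutions))`, and pass a simple Jacobi preconditioner, `M_inv = lambda r: r / np.diag(A)`. The projection P removes the W-components from the residual, so CG only has to resolve the remaining part of the spectrum, which is the mechanism behind the accelerated convergence claimed in the abstract.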
Alena Kopaničáková, Youngkyu Lee, George Em Karniadakis
Computing technology; computer technology
Alena Kopaničáková, Youngkyu Lee, George Em Karniadakis. Leveraging Operator Learning to Accelerate Convergence of the Preconditioned Conjugate Gradient Method [EB/OL]. (2025-07-31) [2025-08-11]. https://arxiv.org/abs/2508.00101.