Expansive Natural Neural Gradient Flows for Energy Minimization
This paper develops expansive gradient dynamics in mapping spaces induced by deep neural networks. Specifically, we provide tools and concepts for minimizing a class of energy functionals in an abstract Hilbert space setting that covers a wide range of applications, such as PDE-based inverse problems and supervised learning. The approach hinges on a Hilbert space metric on the full diffeomorphism mapping space, which can be viewed as a generalized Wasserstein-2 metric. We then study a projected gradient descent method within sets parameterized by deep neural networks. More importantly, we develop an adaptive expansion strategy that progressively enlarges the deep neural network structure. In particular, the expansion mechanism aims to align the natural gradient direction induced by the neural manifold as closely as possible with the ideal Hilbert space gradient descent direction, leveraging the fact that we can evaluate projections of the Hilbert space gradient. We demonstrate the efficacy of the proposed strategy on several simple model problems for energies arising in the context of supervised learning, model reduction, or inverse problems. In particular, we highlight the importance of assembling the neural flow matrix based on the inner product of the ambient Hilbert space. The algorithms presented are the simplest specifications of a broader spectrum of methods covered by a correspondingly wider discussion; a detailed analysis is postponed to forthcoming work.
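As a minimal illustration of the construction sketched in the abstract, the snippet below assembles a neural flow matrix G = JᵀMJ from the Jacobian J of a discretized network output and the Gram matrix M of the ambient Hilbert space inner product, takes one natural-gradient step, and computes an alignment measure of the kind that could drive an expansion criterion. This is a hedged sketch under assumed discretizations, not the paper's actual algorithm; all names (`natural_gradient_step`, `alignment`, `gram_M`, etc.) are hypothetical and do not come from the paper.

```python
import numpy as np

def natural_gradient_step(theta, jac, grad_H, gram_M, lr=1e-2, reg=1e-8):
    """One natural-gradient descent step (illustrative sketch).

    theta  : (p,)   current network parameters
    jac    : (n, p) Jacobian of the discretized network output w.r.t. theta
    grad_H : (n,)   discretized Hilbert-space gradient of the energy
    gram_M : (n, n) Gram matrix of the ambient Hilbert space inner product
    """
    # Neural flow matrix induced by the ambient inner product (not J^T J).
    G = jac.T @ gram_M @ jac
    # Parameter-space gradient: projection of the Hilbert-space gradient.
    g = jac.T @ gram_M @ grad_H
    # Natural gradient direction; small regularization stands in for a
    # pseudo-inverse when G is rank-deficient.
    d = np.linalg.solve(G + reg * np.eye(len(theta)), g)
    return theta - lr * d, d

def alignment(jac, d, grad_H, gram_M, eps=1e-14):
    """Cosine, in the ambient inner product, between the realized descent
    direction J d on the neural manifold and the ideal Hilbert-space
    gradient direction. A persistently small value could signal that the
    current architecture should be expanded."""
    v = jac @ d
    num = v @ gram_M @ grad_H
    den = np.sqrt((v @ gram_M @ v) * (grad_H @ gram_M @ grad_H)) + eps
    return num / den
```

For a plain L2 discretization, `gram_M` would reduce to a diagonal matrix of quadrature weights; the point emphasized in the abstract is that replacing it with the identity (i.e., ignoring the ambient inner product) changes the induced natural gradient direction.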
Wolfgang Dahmen, Wuchen Li, Yuankai Teng, Zhu Wang
Computing Technology, Computer Technology
Wolfgang Dahmen, Wuchen Li, Yuankai Teng, Zhu Wang. Expansive Natural Neural Gradient Flows for Energy Minimization [EB/OL]. (2025-07-17) [2025-08-10]. https://arxiv.org/abs/2507.13475.