Discontinuous hybrid neural networks for the one-dimensional partial differential equations
A feedforward neural network with hidden layers activated by nonlinear functions (such as tanh, ReLU, and sigmoid) exhibits uniform approximation properties in Sobolev spaces, and discontinuous neural networks can reduce computational complexity. In this work, we present a discontinuous hybrid neural network method for solving one-dimensional partial differential equations. We construct a new hybrid loss functional that incorporates the variational formulation of the approximate equation, interface jump terms, and boundary constraints. The RMSprop algorithm and the discontinuous Galerkin method are employed to update the nonlinear and linear parameters of the networks, respectively. This approach guarantees convergence of the loss functional and yields an approximate solution with high accuracy.
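The split described above, gradient-based updates (RMSprop) for the nonlinear hidden-layer parameters combined with a linear solve for the output-layer coefficients, can be illustrated with a minimal sketch. This is not the paper's exact algorithm: it uses a strong-form collocation residual with a boundary penalty as a stand-in for the variational/DG terms, a single tanh hidden layer, and finite-difference gradients; the problem (-u'' = f on (0,1) with homogeneous Dirichlet data) and all parameter values are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of a hybrid update: RMSprop on the nonlinear parameters
# (w, b), least-squares solve for the linear output weights c, for
# u_N(x) = sum_j c_j * tanh(w_j * x + b_j) approximating -u'' = f, u(0)=u(1)=0.
rng = np.random.default_rng(0)
m = 20                               # hidden neurons (illustrative)
x = np.linspace(0.0, 1.0, 101)       # collocation points
f = np.pi**2 * np.sin(np.pi * x)     # exact solution is sin(pi x)

def features(w, b):
    """Return phi''(x) at collocation points and phi(x) itself."""
    t = np.tanh(np.outer(x, w) + b)
    d2 = (w**2) * (-2.0 * t * (1.0 - t**2))   # (tanh(wx+b))'' analytically
    return d2, t

def solve_linear(w, b):
    """Linear stage: least-squares fit of -u_N'' = f plus boundary penalty."""
    d2, t = features(w, b)
    A = np.vstack([-d2, 10.0 * t[[0, -1]]])   # residual rows + weighted BC rows
    rhs = np.concatenate([f, [0.0, 0.0]])
    c, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return c

def loss(w, b, c):
    """Hybrid loss: PDE residual term plus boundary-constraint term."""
    d2, t = features(w, b)
    res = -d2 @ c - f
    bc = t[[0, -1]] @ c
    return np.mean(res**2) + 10.0 * np.mean(bc**2)

# Nonlinear stage: RMSprop on theta = (w, b), alternating with the linear solve.
theta = np.concatenate([rng.normal(0, 3, m), rng.normal(0, 3, m)])
lr, rho, eps, v, h = 1e-3, 0.9, 1e-8, np.zeros(2 * m), 1e-6
for it in range(200):
    c = solve_linear(theta[:m], theta[m:])
    g = np.zeros_like(theta)
    base = loss(theta[:m], theta[m:], c)
    for j in range(2 * m):           # finite-difference gradient, for brevity
        tp = theta.copy(); tp[j] += h
        g[j] = (loss(tp[:m], tp[m:], c) - base) / h
    v = rho * v + (1.0 - rho) * g**2          # RMSprop accumulator
    theta -= lr * g / (np.sqrt(v) + eps)      # RMSprop step

c = solve_linear(theta[:m], theta[m:])
u_N = np.tanh(np.outer(x, theta[:m]) + theta[m:]) @ c
err = np.max(np.abs(u_N - np.sin(np.pi * x)))
print(f"max error vs exact sin(pi x): {err:.2e}")
```

The design point the sketch mirrors is that the output weights enter linearly, so they can be computed exactly by a linear (Galerkin-type) solve at each step instead of being learned by gradient descent, leaving only the low-dimensional nonlinear parameters to the RMSprop iteration.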
Xiaoyu Wang, Long Yuan, Yao Yu
Mathematical computing techniques; computer technology
Xiaoyu Wang, Long Yuan, Yao Yu. Discontinuous hybrid neural networks for the one-dimensional partial differential equations [EB/OL]. (2025-05-14) [2025-07-16]. https://arxiv.org/abs/2505.09911.