National Preprint Platform

Optimization-Induced Dynamics of Lipschitz Continuity in Neural Networks


Source: arXiv
Abstract

Lipschitz continuity characterizes the worst-case sensitivity of neural networks to small input perturbations; yet its dynamics (i.e. temporal evolution) during training remain under-explored. We present a rigorous mathematical framework to model the temporal evolution of Lipschitz continuity during training with stochastic gradient descent (SGD). This framework leverages a system of stochastic differential equations (SDEs) to capture both deterministic and stochastic forces. Our theoretical analysis identifies three principal factors driving the evolution: (i) the projection of gradient flows, induced by the optimization dynamics, onto the operator-norm Jacobian of parameter matrices; (ii) the projection of gradient noise, arising from the randomness in mini-batch sampling, onto the operator-norm Jacobian; and (iii) the projection of the gradient noise onto the operator-norm Hessian of parameter matrices. Furthermore, our theoretical framework sheds light on how factors such as noisy supervision, parameter initialization, batch size, and mini-batch sampling trajectories shape the evolution of the Lipschitz continuity of neural networks. Our experimental results demonstrate strong agreement between the theoretical implications and the observed behaviors.
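To make the quantity under study concrete: for a feedforward ReLU network, the product of the layers' operator (spectral) norms upper-bounds the network's Lipschitz constant, and that bound evolves as SGD updates the weights. The sketch below is an illustration of this idea only, not the paper's SDE framework; the architecture, data, and hyperparameters are all invented for the example. It trains a tiny two-layer network with mini-batch SGD and records the bound at every step, so one can observe how initialization scale and mini-batch sampling noise (two of the factors the abstract highlights) influence its trajectory.

```python
import numpy as np

rng = np.random.default_rng(0)
# Initialization scale is one of the factors the abstract identifies.
W1 = rng.normal(scale=0.5, size=(16, 4))
W2 = rng.normal(scale=0.5, size=(1, 16))

# Synthetic regression task (invented for this illustration).
X = rng.normal(size=(256, 4))
y = np.sin(X.sum(axis=1, keepdims=True))

def lipschitz_upper_bound():
    # For a ReLU network, the product of per-layer spectral norms
    # (largest singular values) upper-bounds the Lipschitz constant.
    return np.linalg.norm(W1, 2) * np.linalg.norm(W2, 2)

lr, batch = 0.05, 32
history = []
for step in range(200):
    # Random mini-batch sampling: the source of gradient noise.
    idx = rng.choice(len(X), size=batch, replace=False)
    xb, yb = X[idx], y[idx]
    h = np.maximum(xb @ W1.T, 0.0)          # hidden activations (ReLU)
    err = h @ W2.T - yb                     # prediction error
    # Gradients of the mean-squared error via backpropagation.
    gW2 = 2.0 / batch * err.T @ h
    gW1 = ((2.0 / batch * err @ W2) * (h > 0)).T @ xb
    W1 -= lr * gW1
    W2 -= lr * gW2
    history.append(lipschitz_upper_bound())

print(f"bound at step 0: {history[0]:.3f}, at step 200: {history[-1]:.3f}")
```

Plotting `history` against the step index gives the kind of temporal-evolution curve whose deterministic drift and stochastic fluctuations the paper's SDE analysis models.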

Róisín Luo, James McDermott, Christian Gagné, Qiang Sun, Colm O'Riordan

Subjects: Mathematical and Computational Technology; Computer Technology

Róisín Luo, James McDermott, Christian Gagné, Qiang Sun, Colm O'Riordan. Optimization-Induced Dynamics of Lipschitz Continuity in Neural Networks [EB/OL]. (2025-06-23) [2025-07-03]. https://arxiv.org/abs/2506.18588.
