A Scalable Approach for Safe and Robust Learning via Lipschitz-Constrained Networks
Certified robustness is a critical property for deploying neural networks (NNs) in safety-critical applications. A principled approach to achieving such guarantees is to constrain the global Lipschitz constant of the network. However, accurate methods for Lipschitz-constrained training often suffer from non-convex formulations and poor scalability due to their reliance on global semidefinite programs (SDPs). In this letter, we propose a convex training framework that enforces global Lipschitz constraints via semidefinite relaxation. By reparameterizing the NN using a loop transformation, we derive a convex admissibility condition that enables tractable and certifiable training. While the resulting formulation guarantees robustness, its scalability is limited by the size of the global SDP. To overcome this, we develop a randomized subspace linear matrix inequality (RS-LMI) approach that decomposes the global constraint into sketched layerwise constraints projected onto low-dimensional subspaces, yielding a smooth and memory-efficient training objective. Empirical results on MNIST, CIFAR-10, and ImageNet demonstrate that the proposed framework achieves competitive accuracy with significantly improved Lipschitz bounds and runtime performance.
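To make the RS-LMI idea concrete, below is a minimal NumPy sketch, not the authors' implementation: it assumes each layer's spectral-norm constraint ||W||_2 <= rho is written as the LMI [[rho*I, W], [W^T, rho*I]] >= 0, projects that LMI onto a random k-dimensional subspace, and returns a smooth softplus penalty on the negative eigenvalues of the sketched matrix. The function name and parameters are illustrative.

```python
import numpy as np

def sketched_lmi_penalty(W, rho, k, rng):
    """Illustrative sketched layerwise LMI penalty (hypothetical helper).

    The LMI M = [[rho*I, W], [W.T, rho*I]] >= 0 is equivalent to
    ||W||_2 <= rho (for rho > 0, by Schur complement); requiring
    S^T M S >= 0 for a random sketch S is a cheap necessary condition
    restricted to a k-dimensional subspace.
    """
    m, n = W.shape
    # Assemble the (m+n) x (m+n) layerwise LMI matrix.
    M = np.block([[rho * np.eye(m), W],
                  [W.T, rho * np.eye(n)]])
    # Gaussian sketch: project the LMI onto k random directions.
    S = rng.standard_normal((m + n, k)) / np.sqrt(k)
    M_sk = S.T @ M @ S  # k x k, with k << m + n
    # Smooth (softplus) penalty on negative eigenvalues keeps the
    # training objective differentiable.
    eigs = np.linalg.eigvalsh(M_sk)
    return np.sum(np.logaddexp(0.0, -eigs))

rng = np.random.default_rng(0)
W = 0.05 * rng.standard_normal((256, 512))  # toy layer weights
print(sketched_lmi_penalty(W, rho=1.0, k=32, rng=rng))
```

Working in the sketched subspace replaces an eigendecomposition of the full (m+n)-dimensional LMI with one of a k x k matrix, which is where the memory and runtime savings described in the abstract would come from.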
Zain ul Abdeen, Vassilis Kekatos, Ming Jin
Subjects: Control Theory and Control Technology; Computing Technology and Computer Technology
Zain ul Abdeen, Vassilis Kekatos, Ming Jin. A Scalable Approach for Safe and Robust Learning via Lipschitz-Constrained Networks [EB/OL]. (2025-06-30) [2025-07-21]. https://arxiv.org/abs/2506.23977.