
R2DN: Scalable Parameterization of Contracting and Lipschitz Recurrent Deep Networks

Source: arXiv

Abstract

This paper presents the Robust Recurrent Deep Network (R2DN), a scalable parameterization of robust recurrent neural networks for machine learning and data-driven control. We construct R2DNs as a feedback interconnection of a linear time-invariant system and a 1-Lipschitz deep feedforward network, and directly parameterize the weights so that our models are stable (contracting) and robust to small input perturbations (Lipschitz) by design. Our parameterization uses a structure similar to the previously-proposed recurrent equilibrium networks (RENs), but without the requirement to iteratively solve an equilibrium layer at each time-step. This speeds up model evaluation and backpropagation on GPUs, and makes it computationally feasible to scale up the network size, batch size, and input sequence length in comparison to RENs. We compare R2DNs to RENs on three representative problems in nonlinear system identification, observer design, and learning-based feedback control and find that training and inference are both up to an order of magnitude faster with similar test set performance, and that training/inference times scale more favorably with respect to model expressivity.
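To make the architecture concrete, the sketch below builds a toy feedback interconnection of a discrete-time LTI system and a 1-Lipschitz feedforward network. It is an illustration only, not the paper's direct parameterization: contraction is imposed crudely by shrinking the spectral norm of the state matrix, and the network is made 1-Lipschitz by spectral normalization of each layer. All class and variable names (`ToyR2DN`, `nx`, `nv`, etc.) are hypothetical. The key contrast with RENs is visible in `step`: the nonlinear block is an explicit feedforward pass, with no equilibrium layer to solve iteratively at each time-step.

```python
import numpy as np

rng = np.random.default_rng(0)


def lipschitz_normalize(W):
    # Scale W so its spectral norm (largest singular value) is at most 1.
    return W / max(np.linalg.norm(W, 2), 1.0)


class ToyR2DN:
    """Toy feedback interconnection of an LTI system and a 1-Lipschitz
    feedforward network. Illustrative only; the paper constructs a direct
    parameterization with guaranteed contraction/Lipschitz bounds."""

    def __init__(self, nx=4, nu=2, nv=8, ny=2):
        # LTI part: x_{t+1} = A x + B1 w + B2 u. Contraction is imposed
        # here by forcing the spectral norm of A below 1 (a crude stand-in
        # for the paper's parameterization).
        A = rng.standard_normal((nx, nx))
        self.A = 0.9 * A / np.linalg.norm(A, 2)
        self.B1 = 0.1 * rng.standard_normal((nx, nv))
        self.B2 = 0.1 * rng.standard_normal((nx, nu))
        self.C = 0.1 * rng.standard_normal((nv, nx))
        self.D = 0.1 * rng.standard_normal((nv, nu))
        self.Cy = 0.1 * rng.standard_normal((ny, nx))
        # Static 1-Lipschitz network: spectrally-normalized layers with
        # ReLU (itself 1-Lipschitz), so the composition is 1-Lipschitz.
        self.W1 = lipschitz_normalize(rng.standard_normal((nv, nv)))
        self.W2 = lipschitz_normalize(rng.standard_normal((nv, nv)))

    def step(self, x, u):
        v = self.C @ x + self.D @ u                  # input to static network
        w = self.W2 @ np.maximum(self.W1 @ v, 0.0)   # explicit forward pass,
        x_next = self.A @ x + self.B1 @ w + self.B2 @ u  # no equilibrium solve
        return x_next, self.Cy @ x_next


model = ToyR2DN()
x = np.zeros(4)
for t in range(10):
    x, y = model.step(x, rng.standard_normal(2))
```

Because every time-step is a fixed sequence of matrix products, the rollout maps directly onto batched GPU evaluation and ordinary backpropagation, which is the source of the speed-up over equilibrium-layer models reported above.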

Nicholas H. Barbara, Ruigang Wang, Ian R. Manchester

Subject: Computing and Computer Technology

Nicholas H. Barbara, Ruigang Wang, Ian R. Manchester. R2DN: Scalable Parameterization of Contracting and Lipschitz Recurrent Deep Networks [EB/OL]. (2025-04-01) [2025-04-28]. https://arxiv.org/abs/2504.01250
