National Preprint Platform

Critical Points of Random Neural Networks


Source: arXiv
Abstract

This work investigates the expected number of critical points of random neural networks with different activation functions as the depth increases in the infinite-width limit. Under suitable regularity conditions, we derive precise asymptotic formulas for the expected number of critical points of fixed index and of those exceeding a given threshold. Our analysis reveals three distinct regimes depending on the value of the first derivative of the covariance function evaluated at 1: the expected number of critical points may converge, grow polynomially, or grow exponentially with depth. The theoretical predictions are supported by numerical experiments. Moreover, we provide numerical evidence suggesting that, when the regularity condition is not satisfied (e.g., for neural networks with the ReLU activation function), the number of critical points increases as the map resolution increases, indicating a potential divergence in the number of critical points.
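The quantity that separates the three regimes is the derivative at 1 of the (normalized) infinite-width covariance map of the activation. As a minimal numerical sketch, not taken from the paper, the snippet below evaluates this derivative by finite differences for two activations whose infinite-width kernels have standard closed forms: the arc-cosine kernel of degree 1 for ReLU (Cho and Saul) and the arcsine kernel for the erf activation. All function names here are illustrative.

```python
import numpy as np

# Normalized infinite-width covariance maps kappa(rho), with kappa(1) = 1.

def kappa_relu(rho):
    # Arc-cosine kernel of degree 1: the infinite-width kernel of ReLU.
    theta = np.arccos(np.clip(rho, -1.0, 1.0))
    return (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / np.pi

def kappa_erf(rho):
    # Arcsine kernel for the activation x -> erf(x / sqrt(2)),
    # normalized so that kappa(1) = 1.
    return np.arcsin(rho / 2.0) / np.arcsin(0.5)

def deriv_at_one(kappa, h=1e-6):
    # One-sided finite-difference estimate of kappa'(1).
    return (kappa(1.0) - kappa(1.0 - h)) / h

# For ReLU, kappa'(1) = 1 (boundary between the regimes), but kappa is
# not twice differentiable at 1, which is why the regularity condition
# discussed in the abstract fails for ReLU.
print("ReLU kappa'(1) ~", deriv_at_one(kappa_relu))
# For erf, kappa'(1) = 6 / (pi * sqrt(3)) > 1 exactly.
print("erf  kappa'(1) ~", deriv_at_one(kappa_erf))
```

Under the classification in the abstract, a value of the derivative above, at, or below 1 selects the exponential, polynomial, or convergent regime for the expected number of critical points as depth grows.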

Simmaco Di Lillo

Subject: computing and computer technology

Simmaco Di Lillo. Critical Points of Random Neural Networks [EB/OL]. (2025-05-22) [2025-06-22]. https://arxiv.org/abs/2505.17000.
