Revisiting Deep Information Propagation: Fractal Frontier and Finite-size Effects
Information propagation characterizes how input correlations evolve across layers in deep neural networks. This framework has been well studied using mean-field theory, which assumes infinitely wide networks. However, these assumptions break down for practical, finite-size networks. In this work, we study information propagation in randomly initialized neural networks with finite width and reveal that the boundary between the ordered and chaotic regimes exhibits a fractal structure. This reveals the fundamental complexity of neural network dynamics, in a setting that is independent of input data and optimization. To extend this analysis beyond multilayer perceptrons, we leverage recently introduced Fourier-based structured transforms and show that information propagation in convolutional neural networks follows the same behavior. Our investigation highlights the importance of finite network depth with respect to the tradeoff between separation and robustness.
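As context for the abstract, here is a minimal sketch (not the authors' code) of how information propagation is typically measured empirically: two correlated inputs are passed through a randomly initialized finite-width MLP and their cosine similarity is tracked layer by layer. The width, depth, activation, and variance parameters (sigma_w, sigma_b) below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def correlation_trajectory(width=256, depth=30, sigma_w=1.5, sigma_b=0.1,
                           c0=0.6, seed=0):
    """Track the layer-wise correlation of two inputs in a random tanh MLP."""
    rng = np.random.default_rng(seed)
    # Two inputs with unit-variance entries and initial correlation ~c0.
    x1 = rng.standard_normal(width)
    x2 = c0 * x1 + np.sqrt(1.0 - c0**2) * rng.standard_normal(width)
    cos = lambda a, b: float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    corrs = [cos(x1, x2)]
    h1, h2 = x1, x2
    for _ in range(depth):
        # Random Gaussian layer: weights ~ N(0, sigma_w^2 / width), biases ~ N(0, sigma_b^2).
        W = rng.standard_normal((width, width)) * sigma_w / np.sqrt(width)
        b = rng.standard_normal(width) * sigma_b
        h1, h2 = np.tanh(W @ h1 + b), np.tanh(W @ h2 + b)
        corrs.append(cos(h1, h2))
    return corrs

# Small sigma_w pulls the correlation toward 1 (ordered regime); large sigma_w
# decorrelates the two signals (chaotic regime). Sweeping (sigma_w, sigma_b) at
# finite width and depth probes the order-chaos boundary discussed in the abstract.
print(correlation_trajectory(sigma_w=1.0)[-1])  # ordered: close to 1
print(correlation_trajectory(sigma_w=4.0)[-1])  # chaotic: much smaller
```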
Giuseppe Alessio D'Inverno, Zhiyuan Hu, Leo Davy, Michael Unser, Gianluigi Rozza, Jonathan Dong
Computing Technology, Computer Technology
Giuseppe Alessio D'Inverno, Zhiyuan Hu, Leo Davy, Michael Unser, Gianluigi Rozza, Jonathan Dong. Revisiting Deep Information Propagation: Fractal Frontier and Finite-size Effects [EB/OL]. (2025-08-05) [2025-08-16]. https://arxiv.org/abs/2508.03222.