
Sure Convergence and Constructive Universal Approximation for Multi-Layer Neural Networks

Source: arXiv
Abstract

We propose a new neural network model, 01Neuro, built on indicator activation neurons. Its boosted variant possesses two key statistical properties: (1) Sure Convergence: model optimization can be achieved with high probability given sufficient computational resources; and (2) Constructive Universal Approximation: in the infinite-sample setting, the model can approximate any finite sum of measurable functions, each depending on only k out of p input features, provided the architecture is properly tuned. Unlike most universal approximation results, which are agnostic to training procedures, our guarantees are directly tied to the model's explicit construction and optimization algorithm. To improve prediction stability, we integrate stochastic training and bagging into the boosted 01Neuro framework. Empirical evaluations on simulated and real-world tabular datasets with small to medium sample sizes highlight its strengths: effective approximation of interaction components (multiplicative terms), stable prediction performance comparable to Random Forests, robustness to many noisy features, and insensitivity to feature scaling. A major limitation of the current implementation of boosted 01Neuro is its higher computational cost, approximately 5 to 30 times that of Random Forests and XGBoost.
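The abstract does not spell out the form of an indicator activation neuron or the paper's architecture. The minimal sketch below assumes the common thresholded-projection form 1{w·x ≥ b}, which yields binary hidden activations and hence piecewise-constant predictions; all names and layer sizes here are illustrative, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def indicator_layer(X, W, b):
    """Layer of indicator-activation neurons: each unit outputs 1
    when its projection X @ w clears its threshold in b, else 0.
    (Assumed form of the 0/1 activation named in the abstract.)"""
    return (X @ W >= b).astype(float)

# Toy forward pass: one hidden layer of indicator neurons followed by
# a linear readout -- a hypothetical 01Neuro-style block, not the
# paper's exact architecture or training procedure.
X = rng.normal(size=(5, 3))    # 5 samples, p = 3 features
W = rng.normal(size=(3, 4))    # weights for 4 hidden indicator units
b = rng.normal(size=4)         # thresholds
beta = rng.normal(size=4)      # linear output weights

H = indicator_layer(X, W, b)   # binary activations in {0, 1}
y_hat = H @ beta               # piecewise-constant prediction
print(y_hat)
```

Because the activations are 0/1 indicators, the prediction is constant on each region cut out by the thresholds, which is consistent with the abstract's comparison to tree ensembles such as Random Forests and its insensitivity to feature scaling at the output of each layer.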

Chien-Ming Chi

Computing Technology; Computer Technology

Chien-Ming Chi. Sure Convergence and Constructive Universal Approximation for Multi-Layer Neural Networks [EB/OL]. (2025-07-07) [2025-07-17]. https://arxiv.org/abs/2507.04779.
