
PAC-Bayesian risk bounds for fully connected deep neural network with Gaussian priors

Source: arXiv
Abstract

Deep neural networks (DNNs) have emerged as a powerful methodology with significant practical successes in fields such as computer vision and natural language processing. Recent works have demonstrated that sparsely connected DNNs with carefully designed architectures can achieve minimax estimation rates under classical smoothness assumptions. However, subsequent studies revealed that simple fully connected DNNs can achieve comparable convergence rates, challenging the necessity of sparsity. Theoretical advances in Bayesian neural networks (BNNs) have been more fragmented. Much of that work has concentrated on sparse networks, leaving the theoretical properties of fully connected BNNs underexplored. In this paper, we address this gap by investigating fully connected Bayesian DNNs with Gaussian priors using PAC-Bayes bounds. We establish upper bounds on the prediction risk for a probabilistic deep neural network method, showing that these bounds match (up to logarithmic factors) the minimax-optimal rates in Besov space, for both nonparametric regression and binary classification with logistic loss. Importantly, our results hold for a broad class of practical activation functions that are Lipschitz continuous.
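For context, a classical PAC-Bayes bound of the McAllester/Maurer type illustrates the kind of guarantee involved; this is a generic sketch, not the exact statement of the paper's theorems, and the notation (true risk R, empirical risk \widehat{R}_n on n i.i.d. samples, prior \pi, posterior \rho, confidence level \delta) is introduced here for illustration. For a loss bounded in [0,1] and a prior \pi fixed before seeing the data, with probability at least 1-\delta, simultaneously for all posteriors \rho:

\[
\mathbb{E}_{\theta\sim\rho}\, R(\theta)
\;\le\;
\mathbb{E}_{\theta\sim\rho}\, \widehat{R}_n(\theta)
\;+\;
\sqrt{\frac{\mathrm{KL}(\rho\,\|\,\pi) + \log\!\frac{2\sqrt{n}}{\delta}}{2n}} .
\]

One reason Gaussian priors are convenient in such analyses is that the KL term admits a closed form. Assuming, for illustration, a prior \pi = \mathcal{N}(0,\sigma^2 I_d) over the d network weights and a factorized Gaussian posterior \rho = \mathcal{N}(\mu,\operatorname{diag}(s_1^2,\dots,s_d^2)):

\[
\mathrm{KL}(\rho\,\|\,\pi)
= \frac{1}{2}\sum_{j=1}^{d}\left(\frac{s_j^{2}+\mu_j^{2}}{\sigma^{2}} - 1 + \log\frac{\sigma^{2}}{s_j^{2}}\right).
\]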

The Tien Mai

Subject: Computing technology; computer technology

The Tien Mai. PAC-Bayesian risk bounds for fully connected deep neural network with Gaussian priors [EB/OL]. (2025-05-07) [2025-07-16]. https://arxiv.org/abs/2505.04341.
