
The Exploration of Error Bounds in Classification with Noisy Labels

Source: arXiv
Abstract

Numerous studies have shown that label noise can lead to poor generalization performance, negatively affecting classification accuracy. Therefore, understanding the effectiveness of classifiers trained using deep neural networks in the presence of noisy labels is of considerable practical significance. In this paper, we focus on the error bounds of excess risks for classification problems with noisy labels within deep learning frameworks. We derive error bounds for the excess risk, decomposing it into statistical error and approximation error. To handle statistical dependencies (e.g., mixing sequences), we employ an independent block construction to bound the error, leveraging techniques for dependent processes. For the approximation error, we extend these theoretical results to the vector-valued setting, where the output space consists of $K$-dimensional unit vectors. Finally, under the low-dimensional manifold hypothesis, we further refine the approximation error to mitigate the impact of high-dimensional input spaces.
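The statistical/approximation split mentioned in the abstract follows the standard excess-risk decomposition; a minimal sketch in generic notation (the symbols $R$, $R_n$, $\mathcal{F}$, $\hat{f}$, $f^*$ are illustrative here, not necessarily the paper's):

```latex
% Excess risk of an empirical risk minimizer \hat{f} over a hypothesis
% class \mathcal{F}, relative to the Bayes-optimal classifier f^*:
\mathcal{E}(\hat{f}) \;=\; R(\hat{f}) - R(f^*)
\;\le\; \underbrace{2\,\sup_{f \in \mathcal{F}} \bigl| R_n(f) - R(f) \bigr|}_{\text{statistical error}}
\;+\; \underbrace{\inf_{f \in \mathcal{F}} \bigl( R(f) - R(f^*) \bigr)}_{\text{approximation error}}
```

The first term is controlled by concentration arguments (here, via the independent block construction for mixing sequences), while the second measures how well the network class $\mathcal{F}$ can approximate $f^*$, which the paper refines under the low-dimensional manifold hypothesis.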

Haixia Liu, Boxiao Li, Can Yang, Yang Wang

Subject: Computing technology, computer technology

Haixia Liu, Boxiao Li, Can Yang, Yang Wang. The Exploration of Error Bounds in Classification with Noisy Labels [EB/OL]. (2025-06-19) [2025-07-21]. https://arxiv.org/abs/2501.15163.