Deviation Inequalities for Rényi Divergence Estimators via Variational Expression

Source: arXiv
Abstract

Rényi divergences play a pivotal role in information theory, statistics, and machine learning. While several estimators of these divergences have been proposed in the literature with their consistency properties established and minimax convergence rates quantified, existing accounts of probabilistic bounds governing the estimation error remain underdeveloped. Here, we make progress in this regard by establishing exponential deviation inequalities for smoothed plug-in estimators and neural estimators, relating the error to an appropriate empirical process and leveraging tools from empirical process theory. In particular, our approach does not require the underlying distributions to be compactly supported or have densities bounded away from zero, an assumption prevalent in existing results. The deviation inequality also leads to a one-sided concentration bound from the expectation, which is useful in random-coding arguments over continuous alphabets in information theory with potential applications to physical-layer security. As another concrete application, we consider a hypothesis testing framework for auditing Rényi differential privacy using the neural estimator as a test statistic and obtain non-asymptotic performance guarantees for such a test.
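
For context, the following is a minimal sketch of the kind of variational expression the title refers to, in illustrative notation not taken from the paper (the exact form, order range, and regularity conditions used by the authors may differ). For order $\alpha > 1$, the Rényi divergence admits the representation

\[
D_\alpha(P\|Q) \;=\; \frac{1}{\alpha-1}\log\!\int p^{\alpha} q^{1-\alpha}\,d\mu
\;=\; \sup_{g}\Big\{\tfrac{\alpha}{\alpha-1}\log \mathbb{E}_P\big[e^{(\alpha-1)g}\big]-\log \mathbb{E}_Q\big[e^{\alpha g}\big]\Big\},
\]

with the supremum over measurable functions $g$, formally attained at $g^\star=\log(dP/dQ)$ when this is well defined. A neural estimator in this spirit restricts $g$ to a neural network class $\mathcal{G}_{\mathrm{nn}}$ and replaces the expectations by empirical means over samples $X_1,\dots,X_n\sim P$ and $Y_1,\dots,Y_m\sim Q$:

\[
\widehat{D}_{\alpha}
\;=\;\sup_{g\in\mathcal{G}_{\mathrm{nn}}}\Big\{\tfrac{\alpha}{\alpha-1}\log\tfrac{1}{n}\sum_{i=1}^{n}e^{(\alpha-1)g(X_i)}-\log\tfrac{1}{m}\sum_{j=1}^{m}e^{\alpha g(Y_j)}\Big\}.
\]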

Sreejith Sreekumar, Kengo Kato

Communications

Sreejith Sreekumar, Kengo Kato. Deviation Inequalities for Rényi Divergence Estimators via Variational Expression [EB/OL]. (2025-08-12) [2025-08-24]. https://arxiv.org/abs/2508.09382.
