Communication Complexity of Exact Sampling under Rényi Information
We study the problem of communicating a sample from a probability distribution $P$ given shared access to a sequence distributed according to another probability distribution $Q$. Li and El Gamal used the Poisson functional representation to show that the minimum expected message length to communicate a sample from $P$ can be upper bounded by $D(P\|Q) + \log (D(P\|Q) + 1) + 4$, where $D(\, \cdot \, \| \, \cdot\, )$ is the Kullback-Leibler divergence. We generalize this and related results to a cost which is exponential in the message length, specifically $L(t)$, Campbell's average codeword length of order $t$, and to Rényi's entropy. We lower bound the Campbell cost and Rényi entropy of communicating a sample under any (possibly noncausal) sampling protocol, showing that it grows approximately as $D_{1/\alpha}(P\|Q)$, where $D_\beta(\,\cdot \,\| \,\cdot\,)$ is the Rényi divergence of order $\beta$. Using the Poisson functional representation, we prove an upper bound on $L(t)$ and $H_\alpha(K)$ which has a leading Rényi divergence term with order within $\epsilon$ of the lower bound. Our results reduce to the bounds of Harsha et al. as $\alpha \to 1$. We also provide numerical examples comparing the bounds in the cases of normal and Laplacian distributions, demonstrating that the upper and lower bounds are typically within 5-10 bits of each other.
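As an illustration of the Li–El Gamal bound quoted above, the sketch below evaluates $D(P\|Q) + \log(D(P\|Q) + 1) + 4$ for a pair of univariate Gaussians, using the standard closed-form KL divergence between normals. The function names and the specific parameter values are our own illustrative choices, not taken from the paper.

```python
import math

def kl_normal(mu_p, sigma_p, mu_q, sigma_q):
    """KL divergence D(P||Q) in bits between univariate normals
    P = N(mu_p, sigma_p^2) and Q = N(mu_q, sigma_q^2)."""
    nats = (math.log(sigma_q / sigma_p)
            + (sigma_p**2 + (mu_p - mu_q)**2) / (2.0 * sigma_q**2)
            - 0.5)
    return nats / math.log(2)  # convert nats to bits

def li_el_gamal_bound(d_bits):
    """Upper bound D + log2(D + 1) + 4 on the expected message length
    (in bits) for communicating one exact sample from P using shared
    randomness distributed according to Q."""
    return d_bits + math.log2(d_bits + 1.0) + 4.0

# Example: P = N(0, 1), Q = N(1, 4); the bound exceeds D(P||Q) by
# at most log2(D + 1) + 4 bits.
d = kl_normal(0.0, 1.0, 1.0, 2.0)
print(f"D(P||Q) = {d:.3f} bits, bound = {li_el_gamal_bound(d):.3f} bits")
```

When $P = Q$ the divergence is zero and the bound collapses to the additive constant of 4 bits, which shows the overhead term is small whenever the two distributions are close.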
Spencer Hill, Fady Alajaji, Tamás Linder
Communication
Spencer Hill, Fady Alajaji, Tamás Linder. Communication Complexity of Exact Sampling under Rényi Information [EB/OL]. (2025-06-13) [2025-07-16]. https://arxiv.org/abs/2506.12219.