Differentially Private Distribution Release of Gaussian Mixture Models via KL-Divergence Minimization

Source: arXiv
Abstract

Gaussian Mixture Models (GMMs) are widely used statistical models for representing multi-modal data distributions, with numerous applications in data mining, pattern recognition, data simulation, and machine learning. However, recent research has shown that releasing GMM parameters poses significant privacy risks, potentially exposing sensitive information about the underlying data. In this paper, we address the challenge of releasing GMM parameters while ensuring differential privacy (DP) guarantees. Specifically, we focus on the privacy protection of mixture weights, component means, and covariance matrices. We propose to use Kullback-Leibler (KL) divergence as a utility metric to assess the accuracy of the released GMM, as it captures the joint impact of noise perturbation on all the model parameters. To achieve privacy, we introduce a DP mechanism that adds carefully calibrated random perturbations to the GMM parameters. Through theoretical analysis, we quantify the effects of privacy budget allocation and perturbation statistics on the DP guarantee, and derive a tractable expression for evaluating KL divergence. We formulate and solve an optimization problem to minimize the KL divergence between the released and original models, subject to a given $(\epsilon, \delta)$-DP constraint. Extensive experiments on both synthetic and real-world datasets demonstrate that our approach achieves strong privacy guarantees while maintaining high utility.
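A minimal illustrative sketch of the release pipeline described above, using numpy and scikit-learn: fit a GMM, add Gaussian perturbations to its weights, means, and covariances, repair the perturbed parameters so they remain a valid model, and estimate the KL divergence between the original and released models by Monte Carlo. The noise scales below are placeholder values, not the calibrated allocation derived in the paper; the simplex projection and eigenvalue clipping are standard post-processing tricks rather than the authors' mechanism; and the Monte Carlo estimate stands in for the paper's tractable KL expression.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Fit the original (non-private) GMM to synthetic two-cluster data.
X = np.vstack([rng.normal(-3.0, 1.0, (500, 2)), rng.normal(3.0, 1.0, (500, 2))])
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X)

# Placeholder noise scales; in the paper these are calibrated jointly from the
# (epsilon, delta) budget so that the resulting KL divergence is minimized.
sigma_w, sigma_mu, sigma_cov = 0.02, 0.05, 0.05

# Perturb the mixture weights, then project back onto the probability simplex.
w = gmm.weights_ + rng.normal(0.0, sigma_w, gmm.weights_.shape)
w = np.clip(w, 1e-6, None)
w /= w.sum()

# Perturb the component means.
mu = gmm.means_ + rng.normal(0.0, sigma_mu, gmm.means_.shape)

# Perturb each covariance with symmetric noise, then clip eigenvalues so the
# released matrices stay positive definite.
cov = []
for C in gmm.covariances_:
    E = rng.normal(0.0, sigma_cov, C.shape)
    C_noisy = C + 0.5 * (E + E.T)
    vals, vecs = np.linalg.eigh(C_noisy)
    vals = np.clip(vals, 1e-6, None)
    cov.append(vecs @ np.diag(vals) @ vecs.T)
cov = np.array(cov)

# Assemble the released GMM from the perturbed parameters (fit first only to
# initialize the estimator's internal state, then overwrite its parameters).
released = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X)
released.weights_, released.means_, released.covariances_ = w, mu, cov
released.precisions_cholesky_ = np.array(
    [np.linalg.inv(np.linalg.cholesky(C)).T for C in cov]
)

# Monte Carlo estimate of KL(original || released) from samples of the original.
S, _ = gmm.sample(20_000)
kl_estimate = np.mean(gmm.score_samples(S) - released.score_samples(S))
print(f"Estimated KL divergence: {kl_estimate:.4f}")
```

Because the KL divergence between two mixtures has no closed form in general, the Monte Carlo average of log-density differences is a common stand-in; the paper instead derives a tractable expression and optimizes the privacy budget allocation against it.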

Hang Liu, Anna Scaglione, Sean Peisert

Computing Technology; Computer Technology

Hang Liu, Anna Scaglione, Sean Peisert. Differentially Private Distribution Release of Gaussian Mixture Models via KL-Divergence Minimization [EB/OL]. (2025-06-03) [2025-06-17]. https://arxiv.org/abs/2506.03467.