Quantization-based Bounds on the Wasserstein Metric
The Wasserstein metric has become increasingly important in many machine learning applications such as generative modeling, image retrieval, and domain adaptation. Despite its appeal, it is often too costly to compute. This has motivated approximation methods like entropy-regularized optimal transport, downsampling, and subsampling, which trade accuracy for computational efficiency. In this paper, we consider the challenge of computing efficient approximations to the Wasserstein metric that also serve as strict upper or lower bounds. Focusing on discrete measures on regular grids, our approach formulates and exactly solves a Kantorovich problem on a coarse grid, using quantized measures and a specially designed cost matrix, followed by an upscaling and correction stage. This is done in either the primal or the dual space to obtain valid upper and lower bounds on the Wasserstein metric of the full-resolution inputs. We evaluate our methods on the DOTmark benchmark of optimal transport between images, demonstrating a 10x-100x speedup over entropy-regularized OT while keeping the approximation error below 2%.
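To illustrate the general idea of bounding the Wasserstein metric via a coarse-grid Kantorovich problem, the following Python sketch (using NumPy and the POT library) quantizes two images onto a coarse grid and solves exact OT twice: once with the minimum possible inter-cell distance as the cost, giving a lower bound on the fine-grid 1-Wasserstein distance, and once with the maximum, giving an upper bound. The helper names (quantize, cell_cost_matrix, coarse_w1_bounds), the specific max/min cost construction, and the omission of any upscaling or correction stage are illustrative assumptions, not the paper's construction.

```python
# A minimal, illustrative sketch of coarse-grid Wasserstein bounds
# (NumPy + the POT library). The max/min inter-cell cost matrices and the
# absence of an upscaling/correction stage are simplifying assumptions for
# illustration; they are not necessarily the cost design or primal/dual
# corrections used in the paper.
import numpy as np
import ot  # Python Optimal Transport (POT)


def quantize(img, block):
    """Sum-pool a nonnegative image into block x block cells (quantized measure)."""
    h, w = img.shape
    assert h % block == 0 and w % block == 0
    return img.reshape(h // block, block, w // block, block).sum(axis=(1, 3))


def cell_cost_matrix(grid_h, grid_w, block, mode):
    """Pairwise costs between coarse cells, measured on the fine pixel grid.

    mode='max': largest possible point-to-point distance between two cells,
    yielding an upper bound; mode='min': smallest possible distance (clamped
    at zero), yielding a lower bound."""
    centers = np.array([(i * block + (block - 1) / 2.0,
                         j * block + (block - 1) / 2.0)
                        for i in range(grid_h) for j in range(grid_w)])
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    diag = np.sqrt(2.0) * (block - 1)  # corner-to-corner distance of one cell
    return d + diag if mode == "max" else np.maximum(d - diag, 0.0)


def coarse_w1_bounds(img_a, img_b, block=4):
    """Lower and upper bounds on W1 between two same-size grayscale images,
    with Euclidean ground distance on pixel coordinates."""
    a, b = quantize(img_a, block), quantize(img_b, block)
    gh, gw = a.shape
    a = (a / a.sum()).ravel()
    b = (b / b.sum()).ravel()
    lower = ot.emd2(a, b, cell_cost_matrix(gh, gw, block, "min"))  # exact OT on coarse grid
    upper = ot.emd2(a, b, cell_cost_matrix(gh, gw, block, "max"))
    return lower, upper


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x, y = rng.random((32, 32)), rng.random((32, 32))
    print(coarse_w1_bounds(x, y, block=4))
```

The bounds are valid because any fine-grid transport plan induces a coarse plan whose min-cost is no larger, and any coarse plan can be refined into a fine plan whose cost is at most the coarse max-cost; the paper's correction stage, not shown here, would tighten the gap.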
Jonathan Bobrutsky, Amit Moscovich
Computing technology; Computer technology
Jonathan Bobrutsky, Amit Moscovich. Quantization-based Bounds on the Wasserstein Metric [EB/OL]. (2025-06-01) [2025-07-18]. https://arxiv.org/abs/2506.00976.