
Discretization-free Multicalibration through Loss Minimization over Tree Ensembles

Source: arXiv

Abstract

In recent years, multicalibration has emerged as a desirable learning objective for ensuring that a predictor is calibrated across a rich collection of overlapping subpopulations. Existing approaches typically achieve multicalibration by discretizing the predictor's output space and iteratively adjusting its output values. However, this discretization departs from the standard empirical risk minimization (ERM) pipeline, introduces rounding error and an additional sensitive hyperparameter, and may distort the predictor's outputs in ways that hinder downstream decision-making. In this work, we propose a discretization-free multicalibration method that directly optimizes an empirical risk objective over an ensemble of depth-two decision trees. Our ERM approach can be implemented using off-the-shelf tree ensemble learning methods such as LightGBM. Our algorithm provably achieves multicalibration, provided that the data distribution satisfies a technical condition we term loss saturation. Across multiple datasets, our empirical evaluation shows that this condition is always met in practice. Our discretization-free algorithm consistently matches or outperforms existing multicalibration approaches, even when evaluated using a discretization-based multicalibration metric that shares its discretization granularity with the baselines.
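For context, multicalibration is commonly formalized as follows (a standard formulation from the literature, not necessarily the paper's exact definition): a predictor f is α-multicalibrated with respect to a collection of groups C if, for every group indicator c ∈ C and every value v in the range of f,

    |E[(y − f(x)) · c(x) · 1{f(x) = v}]| ≤ α.

The sketch below illustrates the recipe described in the abstract: run standard ERM over an ensemble of depth-two trees using off-the-shelf LightGBM. It is a minimal illustration under stated assumptions, not the authors' released code; the data, hyperparameter values, and the subgroup check are all hypothetical.

    # Sketch: ERM over depth-two tree ensembles with LightGBM (illustrative only).
    import numpy as np
    import lightgbm as lgb

    rng = np.random.default_rng(0)
    n, d = 5000, 10
    X = rng.normal(size=(n, d))                              # hypothetical features
    p_true = 1.0 / (1.0 + np.exp(-X[:, 0] + 0.5 * X[:, 1]))
    y = (rng.random(n) < p_true).astype(int)                 # hypothetical binary labels

    # max_depth=2 and num_leaves=4 restrict each tree to depth two,
    # matching the depth-two ensembles the paper optimizes over.
    model = lgb.LGBMClassifier(
        objective="binary",      # log loss: a standard empirical risk objective
        max_depth=2,
        num_leaves=4,            # a depth-two binary tree has at most 4 leaves
        n_estimators=500,
        learning_rate=0.05,
    )
    model.fit(X, y)
    scores = model.predict_proba(X)[:, 1]

    # Rough calibration check on one hypothetical subgroup (x_2 > 0):
    group = X[:, 2] > 0
    print("group mean residual:", np.mean(y[group] - scores[group]))

Restricting each tree to depth two is what ties the ensemble to subgroup structure: a depth-two split can condition on a group-defining feature and then on the current score region, which is the intuition for how boosting such trees can drive down group-conditional miscalibration without ever discretizing the output space.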

Hongyi Henry Jin, Zijun Ding, Dung Daniel Ngo, Zhiwei Steven Wu

Subjects: Computing Technology, Computer Technology

Hongyi Henry Jin, Zijun Ding, Dung Daniel Ngo, Zhiwei Steven Wu. Discretization-free Multicalibration through Loss Minimization over Tree Ensembles [EB/OL]. (2025-05-22) [2025-06-13]. https://arxiv.org/abs/2505.17435.
