
From Kinetic Theory to AI: a Rediscovery of High-Dimensional Divergences and Their Properties

Source: arXiv

Abstract

Selecting an appropriate divergence measure is a critical aspect of machine learning, as it directly impacts model performance. Among the most widely used, we find the Kullback-Leibler (KL) divergence, originally introduced in kinetic theory as a measure of relative entropy between probability distributions. Just as in machine learning, the ability to quantify the proximity of probability distributions plays a central role in kinetic theory. In this paper, we present a comparative review of divergence measures rooted in kinetic theory, highlighting their theoretical foundations and exploring their potential applications in machine learning and artificial intelligence.
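For reference, the Kullback-Leibler divergence mentioned above is, for discrete distributions P and Q, D_KL(P‖Q) = Σ_i p_i log(p_i / q_i). The short Python sketch below illustrates this standard definition numerically; it is not code from the paper, and the clipping constant `eps` is an illustrative choice to avoid log(0).

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D_KL(P || Q) for discrete probability vectors."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Clip to avoid log(0); assumes p and q are valid probability vectors of equal length.
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))

# Example: two discrete distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # small positive value; equals 0 only when p == q
```

Note that D_KL is asymmetric (D_KL(P‖Q) ≠ D_KL(Q‖P) in general), one of the properties that motivates comparing it with alternative divergence measures.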

Gennaro Auricchio, Giovanni Brigati, Paolo Giudici, Giuseppe Toscani

Subjects: Computational techniques in physics; Computer technology

Gennaro Auricchio, Giovanni Brigati, Paolo Giudici, Giuseppe Toscani. From Kinetic Theory to AI: a Rediscovery of High-Dimensional Divergences and Their Properties [EB/OL]. (2025-07-15) [2025-07-25]. https://arxiv.org/abs/2507.11387.
