
Neural Thermodynamic Laws for Large Language Model Training


Source: arXiv

Abstract

Beyond neural scaling laws, little is known about the laws underlying large language models (LLMs). We introduce Neural Thermodynamic Laws (NTL) -- a new framework that offers fresh insights into LLM training dynamics. On the theoretical side, we demonstrate that key thermodynamic quantities (e.g., temperature, entropy, heat capacity, thermal conduction) and classical thermodynamic principles (e.g., the three laws of thermodynamics and the equipartition theorem) naturally emerge under river-valley loss landscape assumptions. On the practical side, this scientific perspective yields intuitive guidelines for designing learning rate schedules.
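The abstract notes that this thermodynamic perspective yields guidelines for learning rate schedules. A minimal sketch of one common schedule shape that such analyses are often used to motivate — linear warmup, a stable plateau, then a cooldown (the learning rate playing a role analogous to temperature) — is shown below. The function name and parameter values are illustrative assumptions, not taken from the paper.

```python
# Hypothetical warmup-stable-decay learning rate schedule (illustrative only).
# The learning rate acts like a "temperature": held high during the stable
# phase, then annealed toward zero during the final decay phase.

def lr_schedule(step, total_steps, peak_lr=3e-4,
                warmup_frac=0.05, decay_frac=0.2):
    """Piecewise warmup -> stable -> decay learning rate."""
    warmup_steps = int(total_steps * warmup_frac)
    decay_start = int(total_steps * (1 - decay_frac))
    if step < warmup_steps:
        # Linear warmup from 0 to peak_lr.
        return peak_lr * step / max(warmup_steps, 1)
    if step < decay_start:
        # Stable phase: constant peak learning rate ("high temperature").
        return peak_lr
    # Linear decay to 0 ("cooling") over the final decay_frac of training.
    remaining = total_steps - decay_start
    return peak_lr * (total_steps - step) / max(remaining, 1)
```

With the defaults and 1000 total steps, the first 50 steps warm up linearly, steps 50-799 hold the peak rate, and the last 200 steps decay linearly to zero.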

Yizhou Liu, Jeff Gore, Max Tegmark, Ziming Liu

Subjects: Natural science theory; natural science research methods

Yizhou Liu, Jeff Gore, Max Tegmark, Ziming Liu. Neural Thermodynamic Laws for Large Language Model Training [EB/OL]. (2025-05-15) [2025-06-19]. https://arxiv.org/abs/2505.10559.
