国家预印本平台 (National Preprint Platform)

Sundial: A Family of Highly Capable Time Series Foundation Models

Source: arXiv
Abstract

We introduce Sundial, a family of native, flexible, and scalable time series foundation models. To predict the next-patch's distribution, we propose a TimeFlow Loss based on flow-matching, which facilitates native pre-training of Transformers on continuous-valued time series without discrete tokenization. Conditioned on arbitrary-length time series, our models are pre-trained without specifying any prior distribution and can generate multiple probable predictions, achieving more flexibility in representation learning than using parametric densities. Towards time series foundation models, we leverage minimal but crucial adaptations of Transformers and curate TimeBench with one trillion time points, comprising mostly real-world datasets and synthetic data. By mitigating mode collapse via TimeFlow Loss, we pre-train a family of Sundial models on TimeBench, which achieve unprecedented model capacity and generalization performance. In addition to excellent scalability, Sundial achieves state-of-the-art results on both point and probabilistic forecasting benchmarks with a just-in-time inference speed, i.e., making zero-shot predictions within a few milliseconds. We believe that Sundial's pioneering generative forecasting capability can improve model reliability in real-world decision-making. Code is available at: https://github.com/thuml/Sundial.
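The TimeFlow Loss described above builds on flow-matching to model the next-patch distribution directly on continuous values. A generic conditional flow-matching objective can be sketched as follows; this is an illustrative sketch only, and the function `timeflow_style_loss` and the toy zero-velocity "model" are assumptions of this example, not Sundial's actual implementation:

```python
import numpy as np

def timeflow_style_loss(x1, predict_velocity, rng):
    """Generic conditional flow-matching loss on a batch of
    continuous-valued patches (a sketch; Sundial's TimeFlow Loss
    conditions the velocity network on Transformer representations)."""
    x0 = rng.standard_normal(x1.shape)       # noise endpoint of the path
    t = rng.uniform(size=(x1.shape[0], 1))   # per-example time in [0, 1]
    xt = (1.0 - t) * x0 + t * x1             # linear interpolation path
    v_target = x1 - x0                       # ground-truth velocity along path
    v_pred = predict_velocity(xt, t)         # model's velocity estimate
    return float(np.mean((v_pred - v_target) ** 2))

rng = np.random.default_rng(0)
batch = rng.standard_normal((8, 16))         # 8 target patches of length 16
# toy "model": predicts zero velocity everywhere (stand-in for a network head)
loss = timeflow_style_loss(batch, lambda xt, t: np.zeros_like(xt), rng)
print(loss)
```

Because the loss regresses a velocity field rather than fitting a fixed parametric density, sampling different noise `x0` at inference yields multiple probable trajectories, which matches the generative, multi-prediction behavior the abstract describes.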

Yong Liu, Guo Qin, Zhiyuan Shi, Zhi Chen, Caiyin Yang, Xiangdong Huang, Jianmin Wang, Mingsheng Long

Subjects: Information Science and Information Technology; Computing Technology and Computer Technology

Yong Liu, Guo Qin, Zhiyuan Shi, Zhi Chen, Caiyin Yang, Xiangdong Huang, Jianmin Wang, Mingsheng Long. Sundial: A Family of Highly Capable Time Series Foundation Models [EB/OL]. (2025-02-02) [2025-06-16]. https://arxiv.org/abs/2502.00816.
