Evaluating Temporal Plasticity in Foundation Time Series Models for Incremental Fine-tuning

Source: arXiv
Abstract

Time series foundation models excel at diverse time series forecasting tasks, but their capacity for continuous improvement through incremental learning remains unexplored. We present the first comprehensive study investigating these models' temporal plasticity: their ability to progressively improve performance through continual learning while maintaining existing capabilities. Through experiments on real-world datasets exhibiting distribution shifts, we evaluate both conventional deep learning models and foundation models using a novel continual learning framework. Our findings reveal that while traditional models suffer performance deterioration during incremental fine-tuning, foundation models such as Time-MoE and Chronos show sustained improvement in predictive accuracy. This suggests that optimizing fine-tuning strategies for foundation models may be more valuable than developing domain-specific small models. Our research introduces new evaluation methodologies and insights for developing foundation time series models with robust continual learning capabilities.
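
The paper's code and exact protocol are not reproduced on this page; the following is a minimal, hypothetical PyTorch sketch of the kind of rolling incremental fine-tuning evaluation the abstract describes. A time-ordered stream with a distribution shift is split into periods; after fine-tuning on each period, forecast error is measured on the next (unseen, shifted) period to probe plasticity, and re-measured on the first period to check retention. The tiny MLP stands in for a foundation forecaster such as Time-MoE or Chronos, and all identifiers and chunk sizes here are illustrative assumptions, not the authors' framework.

```python
# Hypothetical sketch of a rolling incremental fine-tuning evaluation:
# fine-tune on each successive time period, then measure error on the
# next period (plasticity) and on the first period (retention).
import torch
import torch.nn as nn

torch.manual_seed(0)

CONTEXT, HORIZON, N_CHUNKS, CHUNK_LEN = 48, 12, 5, 400

# Synthetic stream with a gradual distribution shift (drifting mean).
t = torch.arange(N_CHUNKS * CHUNK_LEN, dtype=torch.float32)
stream = torch.sin(0.1 * t) + 0.002 * t + 0.1 * torch.randn_like(t)

def windows(series):
    """Slice a 1-D series into (context, horizon) supervised pairs."""
    X, Y = [], []
    for i in range(len(series) - CONTEXT - HORIZON):
        X.append(series[i : i + CONTEXT])
        Y.append(series[i + CONTEXT : i + CONTEXT + HORIZON])
    return torch.stack(X), torch.stack(Y)

chunks = stream.split(CHUNK_LEN)  # time-ordered, non-overlapping periods

# Stand-in for a (foundation) forecaster; a real study would load a
# pretrained model such as Time-MoE or Chronos here instead.
model = nn.Sequential(nn.Linear(CONTEXT, 64), nn.ReLU(), nn.Linear(64, HORIZON))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
mse = nn.MSELoss()

@torch.no_grad()
def evaluate(series):
    X, Y = windows(series)
    return mse(model(X), Y).item()

for step, chunk in enumerate(chunks[:-1]):
    # Incremental fine-tuning on the current period only.
    X, Y = windows(chunk)
    for _ in range(50):
        opt.zero_grad()
        loss = mse(model(X), Y)
        loss.backward()
        opt.step()
    # Plasticity: error on the upcoming, shifted period.
    fwd = evaluate(chunks[step + 1])
    # Stability: error back on the first period (forgetting check).
    back = evaluate(chunks[0])
    print(f"after period {step}: next-period MSE={fwd:.4f}, period-0 MSE={back:.4f}")
```

Under this protocol, a model with good temporal plasticity shows next-period error that keeps falling across steps while period-0 error stays flat; rising period-0 error would indicate the catastrophic forgetting the abstract attributes to conventional models.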

Jia Liu, Cheng Jinguo, Xia Fang, Zhenyuan Ma, Yuankai Wu

Subject: Computing Technology, Computer Technology

Jia Liu, Cheng Jinguo, Xia Fang, Zhenyuan Ma, Yuankai Wu. Evaluating Temporal Plasticity in Foundation Time Series Models for Incremental Fine-tuning [EB/OL]. (2025-04-20) [2025-06-03]. https://arxiv.org/abs/2504.14677.