Univariate to Multivariate: LLMs as Zero-Shot Predictors for Time-Series Forecasting
Time-series prediction, or forecasting, is critical across many real-world dynamic systems, and recent studies have proposed using Large Language Models (LLMs) for this task due to their strong generalization capabilities and ability to perform well without extensive pre-training. However, their effectiveness in handling complex, noisy, and multivariate time-series data remains underexplored. To address this, we propose LLMPred, which enhances LLM-based time-series prediction by converting time-series sequences into text and feeding them to LLMs for zero-shot prediction, combined with two main data pre-processing techniques. First, we apply time-series sequence decomposition to facilitate accurate prediction on complex and noisy univariate sequences. Second, we extend this univariate prediction capability to multivariate data using a lightweight prompt-processing strategy. Extensive experiments with smaller LLMs such as Llama 2 7B, Llama 3.2 3B, GPT-4o-mini, and DeepSeek 7B demonstrate that LLMPred achieves competitive or superior performance compared to state-of-the-art baselines. Additionally, a thorough ablation study highlights the importance of the key components proposed in LLMPred.
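To make the abstract's pipeline concrete, below is a minimal illustrative sketch of the two ideas it names: decomposing a univariate sequence into trend, seasonal, and residual components, and serializing them as text for a zero-shot LLM prompt. The decomposition method (`seasonal_decompose` from statsmodels), the number formatting, and the prompt wording are assumptions for illustration, not the paper's exact recipe.

```python
# Illustrative sketch only: decompose a univariate series and render its
# components as text that could be placed in a zero-shot forecasting prompt.
# The decomposition, formatting, and prompt text are assumptions, not the
# authors' exact implementation.
import numpy as np
from statsmodels.tsa.seasonal import seasonal_decompose

def series_to_prompt(values, period=12, decimals=2):
    """Decompose `values` into trend/seasonal/residual parts and return a
    text prompt with each component as a comma-separated number sequence."""
    result = seasonal_decompose(values, model="additive", period=period,
                                extrapolate_trend="freq")

    def fmt(component):
        return ", ".join(f"{v:.{decimals}f}" for v in np.asarray(component))

    return (
        "Continue each sequence with the next values.\n"
        f"Trend: {fmt(result.trend)}\n"
        f"Seasonal: {fmt(result.seasonal)}\n"
        f"Residual: {fmt(result.resid)}\n"
    )

# Example: a noisy series with a linear trend and yearly-style seasonality.
t = np.arange(48)
series = 0.5 * t + 3 * np.sin(2 * np.pi * t / 12) + np.random.normal(0, 0.3, 48)
print(series_to_prompt(series))
```

In such a setup, each decomposed component could be forecast separately by the LLM and the predictions recombined, which is one plausible way a decomposition step helps with noisy sequences; the multivariate extension via prompt processing is not shown here.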
Chamara Madarasingha, Nasrin Sohrabi, Zahir Tari
Computing Technology, Computer Technology
Chamara Madarasingha, Nasrin Sohrabi, Zahir Tari. Univariate to Multivariate: LLMs as Zero-Shot Predictors for Time-Series Forecasting [EB/OL]. (2025-06-02) [2025-06-19]. https://arxiv.org/abs/2506.02389.