
LLM-Prompt: Integrated Heterogeneous Prompts for Unlocking LLMs in Time Series Forecasting


Source: arXiv
English Abstract

Time series forecasting aims to model temporal dependencies among variables to infer future states, a task of significant importance with widespread real-world applications. Although deep learning-based methods have achieved remarkable progress, they still perform suboptimally in long-term forecasting and data-scarce scenarios. Recent research demonstrates that large language models (LLMs) achieve promising performance in time series forecasting. However, we find that existing LLM-based methods still have shortcomings: (1) the absence of a unified paradigm for textual prompt formulation and (2) the neglect of modality discrepancies between textual prompts and time series. To address this, we propose LLM-Prompt, an LLM-based time series forecasting framework that integrates multi-prompt information with cross-modal semantic alignment. Specifically, we first construct a unified textual prompt paradigm containing learnable soft prompts and textualized hard prompts. Second, to enhance the LLM's comprehensive understanding of the forecasting task, we design a semantic space embedding and cross-modal alignment module that fuses temporal and textual information. Finally, the time series representations transformed by the LLM are projected to obtain the forecasts. Comprehensive evaluations on 6 public datasets and 3 carbon emission datasets demonstrate that LLM-Prompt is a powerful framework for time series forecasting.
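To make the abstract's pipeline concrete (learnable soft prompts plus textualized hard prompts, cross-modal alignment of time-series and prompt embeddings, an LLM backbone, and a projection to the forecast horizon), here is a minimal PyTorch-style sketch. All module names, shapes, and the Transformer-encoder stand-in for the LLM backbone are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of the LLM-Prompt idea described in the abstract.
# Shapes, hyperparameters, and the backbone choice are assumptions only.
import torch
import torch.nn as nn

class LLMPromptSketch(nn.Module):
    def __init__(self, seq_len=96, pred_len=96, d_model=768, n_soft=8):
        super().__init__()
        # Learnable soft prompts (one half of the unified textual prompt paradigm).
        self.soft_prompt = nn.Parameter(torch.randn(n_soft, d_model) * 0.02)
        # Embed the raw time series into the LLM's semantic space.
        self.ts_embed = nn.Linear(seq_len, d_model)
        # Cross-modal alignment: time-series tokens attend to prompt tokens.
        self.align = nn.MultiheadAttention(d_model, num_heads=8, batch_first=True)
        # Stand-in for a (typically frozen) pretrained LLM backbone.
        self.backbone = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True),
            num_layers=2,
        )
        # Project the transformed series representation to the forecast horizon.
        self.head = nn.Linear(d_model, pred_len)

    def forward(self, x, hard_prompt_emb):
        # x: (batch, n_vars, seq_len); hard_prompt_emb: (batch, n_hard, d_model),
        # e.g. pre-embedded textualized hard prompts describing the task/data.
        ts_tokens = self.ts_embed(x)                          # (B, n_vars, d_model)
        prompts = torch.cat(
            [self.soft_prompt.expand(x.size(0), -1, -1), hard_prompt_emb], dim=1
        )                                                     # (B, n_soft + n_hard, d_model)
        aligned, _ = self.align(ts_tokens, prompts, prompts)  # cross-modal fusion
        hidden = self.backbone(torch.cat([prompts, ts_tokens + aligned], dim=1))
        ts_hidden = hidden[:, prompts.size(1):]               # keep time-series tokens
        return self.head(ts_hidden)                           # (B, n_vars, pred_len)


if __name__ == "__main__":
    model = LLMPromptSketch()
    series = torch.randn(4, 7, 96)    # 4 samples, 7 variables, 96 past steps
    hard = torch.randn(4, 16, 768)    # hypothetical pre-embedded hard prompts
    print(model(series, hard).shape)  # torch.Size([4, 7, 96])
```

In this reading, the soft prompts are trained end to end while the hard prompts are fixed text embeddings, and the attention step is one plausible way to realize the cross-modal alignment the abstract describes.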

Zesen Wang, Yonggang Li, Lijuan Lan

Information Science, Information Technology

Zesen Wang, Yonggang Li, Lijuan Lan. LLM-Prompt: Integrated Heterogeneous Prompts for Unlocking LLMs in Time Series Forecasting [EB/OL]. (2025-06-21) [2025-07-16]. https://arxiv.org/abs/2506.17631.
