Reducing Smoothness with Expressive Memory Enhanced Hierarchical Graph Neural Networks
Graphical forecasting models learn the structure of time series data by projecting it onto a graph, with recent techniques capturing spatial-temporal associations between variables via edge weights. Hierarchical variants offer a distinct advantage by analysing the time series across multiple resolutions, making them particularly effective in tasks such as global weather forecasting, where low-resolution variable interactions are significant. A critical challenge in hierarchical models is information loss during forward or backward passes through the hierarchy. We propose the Hierarchical Graph Flow (HiGFlow) network, which introduces a memory buffer variable of dynamic size to store previously seen information across variable resolutions. We theoretically show two key results: HiGFlow reduces smoothness when mapping onto new feature spaces in the hierarchy and non-strictly enhances the utility of message passing by improving Weisfeiler-Lehman (WL) expressivity. Empirical results demonstrate that HiGFlow outperforms state-of-the-art baselines, including transformer models, by at least an average of 6.1% in MAE and 6.2% in RMSE. Code is available at https://github.com/TB862/HiGFlow.git.
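As a rough illustration of the memory-buffer idea described in the abstract, the minimal PyTorch sketch below shows one hierarchy level that reads from, and appends a summary to, a buffer whose size grows with the number of resolutions processed. The class name, layer names, and update rules are assumptions made for illustration only and are not taken from the paper.

```python
import torch
import torch.nn as nn


class MemoryBufferLevel(nn.Module):
    """One resolution level of a hierarchical GNN with a memory buffer.

    Hypothetical sketch: the buffer carries information seen at earlier
    resolutions so it is not lost when the hierarchy maps onto a new
    feature space. Update rules here are illustrative assumptions.
    """

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.msg = nn.Linear(in_dim, out_dim)               # message transform
        self.read = nn.Linear(out_dim, out_dim)              # read from buffer
        self.write = nn.Linear(in_dim + out_dim, out_dim)    # write to buffer

    def forward(self, x, adj, buffer):
        # Standard message passing: aggregate transformed neighbour features.
        h = torch.relu(adj @ self.msg(x))
        if buffer is not None:
            # Inject previously seen information from other resolutions.
            h = h + self.read(buffer.mean(dim=0, keepdim=True))
        # Append a summary of the current level, so the buffer grows dynamically.
        summary = self.write(torch.cat([x.mean(0), h.mean(0)])).unsqueeze(0)
        buffer = summary if buffer is None else torch.cat([buffer, summary])
        return h, buffer


if __name__ == "__main__":
    n, d = 8, 16
    x = torch.randn(n, d)
    adj = torch.eye(n)              # placeholder adjacency (normalised in practice)
    level = MemoryBufferLevel(d, 32)
    h, buf = level(x, adj, None)    # buffer starts empty and grows per level
    print(h.shape, buf.shape)       # torch.Size([8, 32]) torch.Size([1, 32])
```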
Thomas Bailie, Yun Sing Koh, S. Karthik Mukkavilli, Varvara Vetrova
Atmospheric Science (Meteorology); Information Science, Information Technology
Thomas Bailie, Yun Sing Koh, S. Karthik Mukkavilli, Varvara Vetrova. Reducing Smoothness with Expressive Memory Enhanced Hierarchical Graph Neural Networks [EB/OL]. (2025-03-31) [2025-05-01]. https://arxiv.org/abs/2504.00349.