National Preprint Platform

Can LLMs Alleviate Catastrophic Forgetting in Graph Continual Learning? A Systematic Study

Source: arXiv
Abstract

Nowadays, real-world data, including graph-structured data, often arrives in a streaming manner, which means that learning systems need to continuously acquire new knowledge without forgetting previously learned information. Although many existing works attempt to address catastrophic forgetting in graph machine learning, they are all based on training from scratch with streaming data. With the rise of pretrained models, an increasing number of studies have leveraged their strong generalization ability for continual learning. Therefore, in this work, we attempt to answer whether large language models (LLMs) can mitigate catastrophic forgetting in Graph Continual Learning (GCL). We first point out that current experimental setups for GCL have significant flaws, as the evaluation stage may lead to task ID leakage. Then, we evaluate the performance of LLMs in more realistic scenarios and find that even minor modifications can lead to outstanding results. Finally, based on extensive experiments, we propose a simple-yet-effective method, Simple Graph Continual Learning (SimGCL), that surpasses the previous state-of-the-art GNN-based baseline by around 20% under the rehearsal-free constraint. To facilitate reproducibility, we have developed an easy-to-use benchmark, LLM4GCL, for training and evaluating existing GCL methods. The code is available at: https://github.com/ZhixunLEE/LLM4GCL.
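The task-ID-leakage flaw the abstract points to can be illustrated with a toy sketch (not from the paper; the class splits and logits below are hypothetical). If evaluation restricts predictions to the classes of the known task (task-incremental), a model that has forgotten earlier tasks can still look accurate; class-incremental evaluation, with no task ID, exposes the forgetting:

```python
import numpy as np

# Hypothetical per-task class splits: task 0 -> classes {0, 1}, task 1 -> {2, 3}.
TASK_CLASSES = [(0, 1), (2, 3)]

def predict_class_il(logits):
    """Class-incremental eval: no task ID, argmax over ALL classes."""
    return int(np.argmax(logits))

def predict_task_il(logits, task_id):
    """Task-incremental eval: the task ID leaks in, so the prediction is
    restricted to that task's classes -- a strictly easier problem."""
    classes = TASK_CLASSES[task_id]
    return classes[int(np.argmax(logits[list(classes)]))]

# Toy logits for a node whose true class is 2 (task 1). A forgetful model
# may still assign its highest score to a class from an old task.
logits = np.array([3.0, 0.5, 2.5, 1.0])

print(predict_class_il(logits))    # -> 0: wrong, the old class wins
print(predict_task_il(logits, 1))  # -> 2: "correct" only because the task ID was leaked
```

This is why the abstract argues for evaluating in more realistic (class-incremental style) scenarios, where the model is not told which task a test node belongs to.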

Ziyang Cheng, Zhixun Li, Yuhan Li, Yixin Song, Kangyi Zhao, Dawei Cheng, Jia Li, Jeffrey Xu Yu

Subject: Computing and Computer Technology

Ziyang Cheng, Zhixun Li, Yuhan Li, Yixin Song, Kangyi Zhao, Dawei Cheng, Jia Li, Jeffrey Xu Yu. Can LLMs Alleviate Catastrophic Forgetting in Graph Continual Learning? A Systematic Study [EB/OL]. (2025-05-24) [2025-06-05]. https://arxiv.org/abs/2505.18697.
