
Are Large Language Models Good Temporal Graph Learners?

Source: arXiv
English Abstract

Large Language Models (LLMs) have recently driven significant advances in Natural Language Processing and many other applications. While a broad range of literature has explored the graph-reasoning capabilities of LLMs, including their use as predictors on graphs, the application of LLMs to dynamic graphs -- real-world evolving networks -- remains relatively unexplored. Recent work studies synthetic temporal graphs generated by random graph models, but applying LLMs to real-world temporal graphs remains an open question. To address this gap, we introduce Temporal Graph Talker (TGTalker), a novel temporal graph learning framework designed for LLMs. TGTalker exploits the recency bias in temporal graphs to extract relevant structural information, which is converted to natural language for the LLM, while leveraging temporal neighbors as additional context for prediction. TGTalker demonstrates competitive link-prediction capabilities compared to existing Temporal Graph Neural Network (TGNN) models. Across five real-world networks, TGTalker performs competitively with state-of-the-art temporal graph methods while consistently outperforming popular models such as TGN and HTGN. Furthermore, TGTalker generates a textual explanation for each prediction, opening up exciting new directions in explainability and interpretability for temporal link prediction. The code is publicly available at https://github.com/shenyangHuang/TGTalker.
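The core idea described above — selecting a node's most recent temporal neighbors (recency bias) and verbalizing them as a natural-language prompt for the LLM — can be sketched as follows. This is a minimal illustration under assumed conventions; the function names, edge format `(source, destination, timestamp)`, and prompt wording are illustrative assumptions, not TGTalker's actual implementation:

```python
from typing import List, Tuple

# An edge in a temporal graph: (source, destination, timestamp).
Edge = Tuple[int, int, int]

def recent_neighbors(edges: List[Edge], node: int, t: int, k: int = 3) -> List[Edge]:
    """Return the k most recent edges touching `node` strictly before time t
    (the recency bias: newer interactions are more predictive)."""
    past = [e for e in edges if e[2] < t and node in (e[0], e[1])]
    past.sort(key=lambda e: e[2], reverse=True)
    return past[:k]

def verbalize_query(edges: List[Edge], src: int, dst: int, t: int, k: int = 3) -> str:
    """Convert the temporal neighborhood of `src` into a natural-language
    link-prediction prompt that an LLM can answer and explain."""
    lines = [f"Node {u} interacted with node {v} at time {ts}."
             for (u, v, ts) in recent_neighbors(edges, src, t, k)]
    history = "\n".join(lines) if lines else "No prior interactions."
    return (f"{history}\n"
            f"Question: will node {src} interact with node {dst} at time {t}? "
            f"Answer yes or no, and explain briefly.")

# Toy temporal graph: edges arrive in timestamp order.
edges = [(0, 1, 1), (0, 2, 2), (1, 2, 3), (0, 3, 4)]
prompt = verbalize_query(edges, src=0, dst=2, t=5)
```

The prompt string would then be sent to an LLM, whose free-text answer provides both the link prediction and its explanation.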

Shenyang Huang, Ali Parviz, Emma Kondrup, Zachary Yang, Zifeng Ding, Michael Bronstein, Reihaneh Rabbany, Guillaume Rabusseau

Subjects: Computing Technology, Computer Technology

Shenyang Huang, Ali Parviz, Emma Kondrup, Zachary Yang, Zifeng Ding, Michael Bronstein, Reihaneh Rabbany, Guillaume Rabusseau. Are Large Language Models Good Temporal Graph Learners? [EB/OL]. (2025-06-03) [2025-07-16]. https://arxiv.org/abs/2506.05393.
