
Advancing Graph Representation Learning with Large Language Models: A Comprehensive Survey of Techniques

Source: arXiv

Abstract

The integration of Large Language Models (LLMs) with Graph Representation Learning (GRL) marks a significant evolution in analyzing complex data structures. This collaboration harnesses the sophisticated linguistic capabilities of LLMs to improve the contextual understanding and adaptability of graph models, thereby broadening the scope and potential of GRL. Despite a growing body of research dedicated to integrating LLMs into the graph domain, a comprehensive review that deeply analyzes the core components and operations within these models is notably lacking. Our survey fills this gap by proposing a novel taxonomy that breaks down these models into primary components and operation techniques from a technical perspective. We further dissect recent literature into two primary components, knowledge extractors and knowledge organizers, and two operation techniques, integration and training strategies, shedding light on effective model design and training. Additionally, we identify and explore potential future research avenues in this nascent yet underexplored field, proposing paths for continued progress.
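The extractor/organizer decomposition described in the abstract can be illustrated with a minimal sketch. The snippet below is not from the survey; the model name, toy citation graph, and library choices (sentence-transformers and PyTorch Geometric) are illustrative assumptions. It shows one common pattern this taxonomy covers: a pretrained language model extracts embeddings from node text, and a GNN then organizes them over the graph structure.

```python
# Minimal sketch (assumed setup, not the survey's reference implementation):
# an LLM acts as a "knowledge extractor" over node text attributes, and a
# small GNN acts as the "knowledge organizer" over the graph structure.
import torch
import torch.nn.functional as F
from sentence_transformers import SentenceTransformer
from torch_geometric.nn import GCNConv

# 1. Knowledge extraction: encode each node's text with a pretrained LM.
texts = [
    "paper on graph neural networks",
    "paper on large language models",
    "survey bridging graphs and LLMs",
]
encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed lightweight encoder
x = torch.tensor(encoder.encode(texts), dtype=torch.float)  # [num_nodes, 384]

# 2. Knowledge organization: a GNN propagates the LLM-derived features
#    along (toy) citation edges to produce structure-aware representations.
edge_index = torch.tensor([[0, 1, 2, 2], [2, 2, 0, 1]], dtype=torch.long)

class TextGNN(torch.nn.Module):
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hid_dim)
        self.conv2 = GCNConv(hid_dim, out_dim)

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))
        return self.conv2(h, edge_index)

model = TextGNN(x.size(1), 64, 2)
node_repr = model(x, edge_index)
print(node_repr.shape)  # torch.Size([3, 2])
```

In this pattern the two stages can be trained separately (frozen LLM features) or jointly, which corresponds to the integration and training strategies the survey categorizes.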

Zhuo Li、Chenghao Liu、Jianling Sun、Qiheng Mao、Zemin Liu

Subject: Computing Technology, Computer Technology

Zhuo Li, Chenghao Liu, Jianling Sun, Qiheng Mao, Zemin Liu. Advancing Graph Representation Learning with Large Language Models: A Comprehensive Survey of Techniques [EB/OL]. (2024-02-04) [2025-08-02]. https://arxiv.org/abs/2402.05952.
