NOCL: Node-Oriented Conceptualization LLM for Graph Tasks without Message Passing

Source: arXiv
English Abstract

Graphs are essential for modeling complex interactions across domains such as social networks, biology, and recommendation systems. Traditional Graph Neural Networks, particularly Message Passing Neural Networks (MPNNs), rely heavily on supervised learning, limiting their generalization and applicability in label-scarce scenarios. Recent self-supervised approaches still require labeled fine-tuning, limiting their effectiveness in zero-shot scenarios. Meanwhile, Large Language Models (LLMs) excel in natural language tasks but face significant challenges when applied to graphs, including preserving reasoning abilities, managing extensive token lengths from rich node attributes, and being limited to textual-attributed graphs (TAGs) and single-level tasks. To overcome these limitations, we propose the Node-Oriented Conceptualization LLM (NOCL), a novel framework that leverages two core techniques: 1) node description, which converts heterogeneous node attributes into structured natural language, extending LLMs from TAGs to non-TAGs; 2) node concept, which encodes node descriptions into compact semantic embeddings using pretrained language models, significantly reducing token lengths by up to 93.9% compared to directly using node descriptions. Additionally, NOCL employs graph representation descriptors to unify graph tasks at various levels into a shared, language-based query format, paving a new direction for Graph Foundation Models. Experimental results validate NOCL's competitive supervised performance relative to traditional MPNNs and hybrid LLM-MPNN methods and demonstrate superior generalization in zero-shot settings.
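To make the two core techniques concrete, the following is a minimal Python sketch of what "node description" and "node concept" could look like. The attribute schema, the description template, and the choice of sentence-transformers encoder are illustrative assumptions, not the paper's actual implementation.

# Sketch of the abstract's two ideas: (1) a "node description" that renders
# heterogeneous node attributes as natural language, and (2) a "node concept"
# that compresses that description into one fixed-size embedding with a
# pretrained language model, replacing a long token sequence.
from sentence_transformers import SentenceTransformer  # any pretrained text encoder works

def node_description(attrs: dict) -> str:
    """Render a node's (possibly non-textual) attributes as one sentence."""
    parts = [f"{key} is {value}" for key, value in attrs.items()]
    return "This node's " + "; ".join(parts) + "."

# Hypothetical node from a non-textual-attributed graph (numeric/categorical
# metadata only); in a TAG the raw node text could be used directly.
node_attrs = {"degree": 12, "publication year": 2021, "venue": "KDD"}

description = node_description(node_attrs)           # structured natural language
encoder = SentenceTransformer("all-MiniLM-L6-v2")    # compact pretrained encoder (assumed choice)
node_concept = encoder.encode(description)           # one short vector instead of many tokens

print(description)
print(node_concept.shape)  # e.g. (384,), far smaller than the tokenized description

In this sketch, the LLM would then consume the compact node-concept vectors plus a language-based task query, rather than the full node descriptions, which is where the reported token-length reduction comes from.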

Wei Li, Mengcheng Lan, Jiaxing Xu, Yiping Ke

Computing Technology, Computer Technology

Wei Li, Mengcheng Lan, Jiaxing Xu, Yiping Ke. NOCL: Node-Oriented Conceptualization LLM for Graph Tasks without Message Passing [EB/OL]. (2025-05-28) [2025-07-16]. https://arxiv.org/abs/2506.10014.
