Heuristic Methods are Good Teachers to Distill MLPs for Graph Link Prediction

Source: arXiv
Abstract

Link prediction is a crucial graph-learning task with applications including citation prediction and product recommendation. Distilling Graph Neural Network (GNN) teachers into Multi-Layer Perceptron (MLP) students has emerged as an effective way to achieve strong performance while reducing computational cost, since the student removes the graph dependency. However, existing distillation methods use only standard GNNs and overlook alternative teachers such as specialized models for link prediction (GNN4LP) and heuristic methods (e.g., common neighbors). This paper first explores the impact of different teachers in GNN-to-MLP distillation. Surprisingly, we find that stronger teachers do not always produce stronger students: MLPs distilled from GNN4LP can underperform those distilled from simpler GNNs, while weaker heuristic methods can teach MLPs to reach near-GNN performance at drastically reduced training cost. Building on these insights, we propose Ensemble Heuristic-Distilled MLPs (EHDM), which eliminates graph dependencies while effectively integrating complementary signals via a gating mechanism. Experiments on ten datasets show an average 7.93% improvement over previous GNN-to-MLP approaches with 1.95-3.32 times less training time, indicating that EHDM is an efficient and effective link prediction method.
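To make the two ideas in the abstract concrete, below is a minimal PyTorch sketch of (a) distilling an MLP student from heuristic teachers such as common neighbors and Adamic-Adar, and (b) mixing several such students with a learned gate. Everything here (the toy graph, the LinkMLP and GatedEnsemble classes, the sigmoid soft labels, the MSE distillation loss, and all hyperparameters) is an illustrative assumption, not the paper's actual EHDM implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy setup: a symmetric random graph, random node features, and a batch
# of candidate node pairs to score. All sizes are illustrative.
num_nodes, feat_dim = 200, 16
adj = (torch.rand(num_nodes, num_nodes) < 0.05).float()
adj = ((adj + adj.t()) > 0).float()
adj.fill_diagonal_(0)
x = torch.randn(num_nodes, feat_dim)
edges = torch.randint(0, num_nodes, (2, 1024))  # candidate pairs (u, v)

def common_neighbors(adj, edges):
    # Heuristic teacher 1: number of shared neighbors per pair.
    return (adj[edges[0]] * adj[edges[1]]).sum(dim=1)

def adamic_adar(adj, edges):
    # Heuristic teacher 2: shared neighbors weighted by 1 / log(degree);
    # clamping the degree at 2 avoids dividing by log(1) = 0.
    w = 1.0 / torch.log(adj.sum(dim=1).clamp(min=2))
    return (adj[edges[0]] * adj[edges[1]] * w).sum(dim=1)

class LinkMLP(nn.Module):
    # Student MLP: scores a pair from concatenated node features only,
    # so inference needs no access to the graph structure.
    def __init__(self, in_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * in_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x, edges):
        pair = torch.cat([x[edges[0]], x[edges[1]]], dim=1)
        return self.net(pair).squeeze(-1)

def distill(teacher_fn, steps=300):
    # Fit one student to soft labels from one heuristic teacher.
    target = torch.sigmoid(teacher_fn(adj, edges))  # squash scores to (0, 1)
    student = LinkMLP(feat_dim)
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    for _ in range(steps):
        opt.zero_grad()
        F.mse_loss(torch.sigmoid(student(x, edges)), target).backward()
        opt.step()
    return student

class GatedEnsemble(nn.Module):
    # Learned per-pair gate mixing the heuristic-distilled students.
    def __init__(self, students, in_dim):
        super().__init__()
        self.students = nn.ModuleList(students)
        self.gate = nn.Linear(2 * in_dim, len(students))

    def forward(self, x, edges):
        pair = torch.cat([x[edges[0]], x[edges[1]]], dim=1)
        weights = F.softmax(self.gate(pair), dim=-1)  # (num_pairs, num_students)
        scores = torch.stack([s(x, edges) for s in self.students], dim=-1)
        return (weights * scores).sum(dim=-1)

students = [distill(common_neighbors), distill(adamic_adar)]
ensemble = GatedEnsemble(students, feat_dim)
print(ensemble(x, edges).shape)  # torch.Size([1024])
```

Note that both the students and the gate consume only node features at inference time, which is the removal of graph dependency the abstract refers to; the adjacency matrix is needed only once, to compute the teachers' soft labels during training.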

Zongyue Qin, Shichang Zhang, Mingxuan Ju, Tong Zhao, Neil Shah, Yizhou Sun

Computing Technology; Computer Technology

Zongyue Qin, Shichang Zhang, Mingxuan Ju, Tong Zhao, Neil Shah, Yizhou Sun. Heuristic Methods are Good Teachers to Distill MLPs for Graph Link Prediction [EB/OL]. (2025-04-08) [2025-05-19]. https://arxiv.org/abs/2504.06193.
