
Aggregation Buffer: Revisiting DropEdge with a New Parameter Block

Source: arXiv
Abstract

We revisit DropEdge, a data augmentation technique for GNNs that randomly removes edges to expose diverse graph structures during training. Although it is a promising approach for reducing overfitting to specific connections in the graph, we observe that its potential performance gain in supervised learning tasks is significantly limited. To understand why, we provide a theoretical analysis showing that the limited performance of DropEdge stems from a fundamental limitation shared by many GNN architectures. Based on this analysis, we propose Aggregation Buffer, a parameter block specifically designed to improve the robustness of GNNs by addressing this limitation. Our method is compatible with any GNN model and shows consistent performance improvements on multiple datasets. Moreover, it serves as a unifying solution to well-known problems such as degree bias and structural disparity. Code and datasets are available at https://github.com/dooho00/agg-buffer.
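For readers unfamiliar with DropEdge, a minimal sketch of the edge-dropping step it performs might look as follows. This is an illustrative assumption, not the authors' implementation: the function name, drop rate, and PyTorch Geometric-style edge_index layout are chosen for the example; the actual code for the paper's method is in the linked repository.

```python
import torch

def drop_edge(edge_index: torch.Tensor, p: float = 0.2) -> torch.Tensor:
    """DropEdge-style augmentation: keep each edge independently
    with probability 1 - p.

    edge_index: [2, E] COO edge list, as used in PyTorch Geometric.
    Returns a new edge list with roughly (1 - p) * E edges.
    """
    keep = torch.rand(edge_index.size(1)) >= p
    return edge_index[:, keep]

# Resampling the mask each epoch exposes the GNN to diverse graph
# structures during training, which is the overfitting-reduction
# mechanism the abstract refers to:
#   out = model(x, drop_edge(edge_index, p=0.2))
```

Note that applying the mask only at training time (and using the full edge list at inference) mirrors the usual dropout convention.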

Dooho Lee, Myeong Kong, Sagad Hamid, Cheonwoo Lee, Jaemin Yoo

Subjects: Computing Technology; Computer Technology

Dooho Lee, Myeong Kong, Sagad Hamid, Cheonwoo Lee, Jaemin Yoo. Aggregation Buffer: Revisiting DropEdge with a New Parameter Block [EB/OL]. (2025-05-27) [2025-06-28]. https://arxiv.org/abs/2505.20840.
