Caterpillar GNN: Replacing Message Passing with Efficient Aggregation
Message-passing graph neural networks (MPGNNs) dominate modern graph learning, typically prioritizing maximal expressive power. In contrast, we introduce an \emph{efficient aggregation} mechanism, deliberately trading off some expressivity for stronger and more structured aggregation capabilities. Our approach allows seamless scaling between classical message passing and simpler methods based on colored or plain walks. We rigorously characterize the expressive power at each intermediate step using homomorphism counts from a hierarchy of generalized \emph{caterpillar graphs}. Based on this foundation, we propose the \emph{Caterpillar GNN}, whose robust graph-level aggregation enables it to successfully tackle a synthetic graph-level task specifically designed to be challenging for classical MPGNNs. Moreover, we demonstrate that, on real-world datasets, the Caterpillar GNN achieves comparable predictive performance while significantly reducing the number of nodes in the hidden layers of the computational graph.
Marek Černý
Subject: Computing Technology; Computer Technology
Marek Černý. Caterpillar GNN: Replacing Message Passing with Efficient Aggregation [EB/OL]. (2025-06-07) [2025-06-21]. https://arxiv.org/abs/2506.06784