
On the Trade-off between Over-smoothing and Over-squashing in Deep Graph Neural Networks

Source: arXiv

English Abstract

Graph Neural Networks (GNNs) have succeeded in various computer science applications, yet deep GNNs underperform their shallow counterparts despite deep learning's success in other domains. Over-smoothing and over-squashing are key challenges when stacking graph convolutional layers, hindering deep representation learning and information propagation from distant nodes. Our work reveals that over-smoothing and over-squashing are intrinsically related to the spectral gap of the graph Laplacian, resulting in an inevitable trade-off between these two issues, as they cannot be alleviated simultaneously. To achieve a suitable compromise, we propose adding and removing edges as a viable approach. We introduce the Stochastic Jost and Liu Curvature Rewiring (SJLR) algorithm, which is computationally efficient and preserves fundamental properties compared to previous curvature-based methods. Unlike existing approaches, SJLR performs edge addition and removal during GNN training while maintaining the graph unchanged during testing. Comprehensive comparisons demonstrate SJLR's competitive performance in addressing over-smoothing and over-squashing.
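The abstract ties both phenomena to the spectral gap of the graph Laplacian, which edge rewiring can raise or lower. The following is a minimal sketch of that quantity, not the authors' SJLR implementation: it computes the second-smallest eigenvalue of the normalized Laplacian with NumPy and shows how adding a single shortcut edge to a path graph changes the gap, moving the graph along the trade-off described above.

```python
import numpy as np

def spectral_gap(adj):
    """Second-smallest eigenvalue of the symmetric normalized Laplacian
    L = I - D^{-1/2} A D^{-1/2} for an undirected graph given by its
    adjacency matrix `adj` (no isolated nodes)."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    lap = np.eye(len(adj)) - d_inv_sqrt @ adj @ d_inv_sqrt
    return np.sort(np.linalg.eigvalsh(lap))[1]

# Path graph on 5 nodes: a bottlenecked topology with a small gap.
path = np.zeros((5, 5))
for i in range(4):
    path[i, i + 1] = path[i + 1, i] = 1.0

# Adding a shortcut between the endpoints (making a cycle) raises the gap:
# information mixes faster (less over-squashing, more over-smoothing).
cycle = path.copy()
cycle[0, 4] = cycle[4, 0] = 1.0

print(spectral_gap(path))   # small gap
print(spectral_gap(cycle))  # larger gap after adding one edge
```

This illustrates why the two issues cannot be alleviated simultaneously: any edge edit that enlarges the gap (easing over-squashing) simultaneously speeds up the smoothing dynamics, and vice versa.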

Jhony H. Giraldo, Fragkiskos D. Malliaros, Thierry Bouwmans, Konstantinos Skianis

DOI: 10.1145/3583780.3614997

Subject: Computing technology; computer technology

Jhony H. Giraldo, Fragkiskos D. Malliaros, Thierry Bouwmans, Konstantinos Skianis. On the Trade-off between Over-smoothing and Over-squashing in Deep Graph Neural Networks [EB/OL]. (2022-12-05) [2025-08-02]. https://arxiv.org/abs/2212.02374.
