
Towards Anomaly-Aware Pre-Training and Fine-Tuning for Graph Anomaly Detection

Source: arXiv
Abstract

Graph anomaly detection (GAD) has garnered increasing attention in recent years, yet remains challenging due to two key factors: (1) label scarcity stemming from the high cost of annotations and (2) homophily disparity at node and class levels. In this paper, we introduce Anomaly-Aware Pre-Training and Fine-Tuning (APF), a targeted and effective framework to mitigate the above challenges in GAD. In the pre-training stage, APF incorporates node-specific subgraphs selected via the Rayleigh Quotient, a label-free anomaly metric, into the learning objective to enhance anomaly awareness. It further introduces two learnable spectral polynomial filters to jointly learn dual representations that capture both general semantics and subtle anomaly cues. During fine-tuning, a gated fusion mechanism adaptively integrates pre-trained representations across nodes and dimensions, while an anomaly-aware regularization loss encourages abnormal nodes to preserve more anomaly-relevant information. Furthermore, we theoretically show that APF tends to achieve linear separability under mild conditions. Comprehensive experiments on 10 benchmark datasets validate the superior performance of APF in comparison to state-of-the-art baselines.
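
For intuition, the Rayleigh Quotient mentioned above measures how much high-frequency (non-smooth) energy a signal carries on a graph: R(L, x) = x^T L x / x^T x, where L is a graph Laplacian, so it can score anomaly-like behavior without labels. The sketch below is an illustrative assumption of how such a score can be computed on a toy graph, not the authors' implementation; the function name and example graph are made up for demonstration.

import numpy as np

def rayleigh_quotient(adj: np.ndarray, x: np.ndarray) -> float:
    # Rayleigh Quotient x^T L x / x^T x with the unnormalized
    # Laplacian L = D - A; larger values indicate more
    # high-frequency (anomaly-like) energy in the signal x.
    lap = np.diag(adj.sum(axis=1)) - adj
    return float(x @ lap @ x) / float(x @ x)

# Toy 4-node path graph; one node carries an outlying feature value.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
smooth = np.array([1.0, 1.0, 1.0, 1.0])  # constant signal -> score 0.0
spiky  = np.array([1.0, 1.0, 5.0, 1.0])  # anomalous node  -> score ~1.14
print(rayleigh_quotient(A, smooth))
print(rayleigh_quotient(A, spiky))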

Hongzhi Yin, Tao Zheng, Jianhua Zhao, Tieke He, Yunhui Liu, Jiashun Cheng, Yiqing Lin, Qizhuo Xie, Jia Li, Fugee Tsung

Computing Technology, Computer Technology

Hongzhi Yin, Tao Zheng, Jianhua Zhao, Tieke He, Yunhui Liu, Jiashun Cheng, Yiqing Lin, Qizhuo Xie, Jia Li, Fugee Tsung. Towards Anomaly-Aware Pre-Training and Fine-Tuning for Graph Anomaly Detection [EB/OL]. (2025-04-19) [2025-05-22]. https://arxiv.org/abs/2504.14250.
