KG-BiLM: Knowledge Graph Embedding via Bidirectional Language Models
Recent advances in knowledge representation learning (KRL) highlight the urgent need to unify symbolic knowledge graphs (KGs) with language models (LMs) for richer semantic understanding. However, existing approaches typically prioritize either graph structure or textual semantics, leaving a gap: no unified framework simultaneously captures global KG connectivity, nuanced linguistic context, and discriminative reasoning semantics. To bridge this gap, we introduce KG-BiLM, a bidirectional LM framework that fuses structural cues from KGs with the semantic expressiveness of generative transformers. KG-BiLM incorporates three key components: (i) Bidirectional Knowledge Attention, which removes the causal mask to enable full interaction among all tokens and entities; (ii) Knowledge-Masked Prediction, which encourages the model to leverage both local semantic contexts and global graph connectivity; and (iii) Contrastive Graph Semantic Aggregation, which preserves KG structure via contrastive alignment of sampled sub-graph representations. Extensive experiments on standard benchmarks demonstrate that KG-BiLM outperforms strong baselines in link prediction, especially on large-scale graphs with complex multi-hop relations, validating its effectiveness in unifying structural information and textual semantics.
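To make the three components more concrete, the following is a minimal, hypothetical PyTorch sketch of how bidirectional knowledge attention, knowledge-masked prediction, and contrastive sub-graph alignment could look; all class, function, and parameter names (BidirectionalKnowledgeAttention, knowledge_masked_prediction_loss, contrastive_subgraph_loss, temperature) are illustrative assumptions and do not reflect the authors' actual implementation.

```python
# Hypothetical sketch of the three components described in the abstract.
# Names and hyperparameters are assumptions, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BidirectionalKnowledgeAttention(nn.Module):
    """Self-attention over a mixed sequence of text tokens and entity tokens.

    Unlike a causal decoder, no triangular (causal) mask is applied, so every
    token and entity can attend to every other one (component i).
    """

    def __init__(self, dim, num_heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, hidden, pad_mask=None):
        # attn_mask=None -> fully bidirectional; only padding positions are ignored.
        out, _ = self.attn(hidden, hidden, hidden, key_padding_mask=pad_mask)
        return out


def knowledge_masked_prediction_loss(logits, labels, masked_positions):
    """Cross-entropy computed only on the token/entity positions hidden from the
    encoder (component ii), so the model must recover them from local context
    and graph connectivity."""
    return F.cross_entropy(logits[masked_positions], labels[masked_positions])


def contrastive_subgraph_loss(z1, z2, temperature=0.07):
    """InfoNCE-style alignment of two sampled sub-graph views (component iii).
    Matching rows of z1 and z2 are positive pairs; other rows in the batch
    serve as negatives."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    # Toy usage: a batch of 2 sequences, 16 positions, 64-dim hidden states.
    enc = BidirectionalKnowledgeAttention(dim=64, num_heads=8)
    hidden = torch.randn(2, 16, 64)
    print(enc(hidden).shape)  # torch.Size([2, 16, 64])

    z1, z2 = torch.randn(8, 64), torch.randn(8, 64)
    print(contrastive_subgraph_loss(z1, z2).item())
```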
Zirui Chen, Xin Wang, Zhao Li, Wenbin Guo, Dongxiao He
Computing Technology, Computer Technology
Zirui Chen, Xin Wang, Zhao Li, Wenbin Guo, Dongxiao He. KG-BiLM: Knowledge Graph Embedding via Bidirectional Language Models [EB/OL]. (2025-06-04) [2025-07-17]. https://arxiv.org/abs/2506.03576.