Not All Features Deserve Attention: Graph-Guided Dependency Learning for Tabular Data Generation with Language Models

Source: arXiv
Abstract

Large Language Models (LLMs) have shown strong potential for tabular data generation by modeling textualized feature-value pairs. However, tabular data inherently exhibits sparse feature-level dependencies, where many feature interactions are structurally insignificant. This creates a fundamental mismatch as LLMs' self-attention mechanism inevitably distributes focus across all pairs, diluting attention on critical relationships, particularly in datasets with complex dependencies or semantically ambiguous features. To address this limitation, we propose GraDe (Graph-Guided Dependency Learning), a novel method that explicitly integrates sparse dependency graphs into LLMs' attention mechanism. GraDe employs a lightweight dynamic graph learning module guided by externally extracted functional dependencies, prioritizing key feature interactions while suppressing irrelevant ones. Our experiments across diverse real-world datasets demonstrate that GraDe outperforms existing LLM-based approaches by up to 12% on complex datasets while achieving competitive results with state-of-the-art approaches in synthetic data quality. Our method is minimally intrusive yet effective, offering a practical solution for structure-aware tabular data modeling with LLMs.
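To make the core idea concrete, below is a minimal PyTorch-style sketch of biasing self-attention with a sparse feature-dependency graph. The function name `graph_biased_attention`, the additive-bias formulation, and the `bias_scale` knob are illustrative assumptions, not GraDe's actual implementation (the paper's dynamic graph learning module is not reproduced here); the sketch only shows how attention mass can be steered toward structurally significant feature pairs and away from irrelevant ones.

```python
# Hedged sketch: graph-guided attention over textualized feature tokens.
# All names and the bias formulation are assumptions for illustration.
import torch
import torch.nn.functional as F

def graph_biased_attention(q, k, v, dep_adj, bias_scale=1.0):
    """Scaled dot-product attention with an additive dependency-graph bias.

    q, k, v  : (batch, n_features, d) projections, one token per
               textualized feature-value pair.
    dep_adj  : (n_features, n_features) adjacency in [0, 1]; 1.0 marks an
               (externally extracted) functional dependency between features.
    bias_scale: assumed scalar controlling the strength of the graph prior.
    """
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5  # (batch, n, n)
    # Shift dependency edges up and non-edges down before the softmax,
    # concentrating attention on key interactions and suppressing the rest.
    scores = scores + bias_scale * (2.0 * dep_adj - 1.0)
    attn = F.softmax(scores, dim=-1)
    return attn @ v

# Toy usage: 4 features, where feature 0 functionally determines 1 and 2.
n, d = 4, 8
q = k = v = torch.randn(1, n, d)
dep_adj = torch.eye(n)
dep_adj[0, 1] = dep_adj[1, 0] = 1.0
dep_adj[0, 2] = dep_adj[2, 0] = 1.0
out = graph_biased_attention(q, k, v, dep_adj)
print(out.shape)  # torch.Size([1, 4, 8])
```

In a full system the adjacency would not be fixed: per the abstract, GraDe learns it with a lightweight dynamic graph module guided by externally extracted functional dependencies, whereas this sketch hard-codes it for clarity.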

Zheyu Zhang, Shuo Yang, Bardh Prenkaj, Gjergji Kasneci

Subject: Computing Technology, Computer Technology

Zheyu Zhang, Shuo Yang, Bardh Prenkaj, Gjergji Kasneci. Not All Features Deserve Attention: Graph-Guided Dependency Learning for Tabular Data Generation with Language Models [EB/OL]. (2025-07-24) [2025-08-10]. https://arxiv.org/abs/2507.18504.