
Tractable Transformers for Flexible Conditional Generation


Source: arXiv
Abstract

Non-autoregressive (NAR) generative models are valuable because they can handle diverse conditional generation tasks in a more principled way than their autoregressive (AR) counterparts, which are constrained by sequential dependency requirements. Recent advancements in NAR models, such as diffusion language models, have demonstrated superior performance in unconditional generation compared to AR models (e.g., GPTs) of similar sizes. However, such improvements do not always lead to improved conditional generation performance. We show that a key reason for this gap is the difficulty in generalizing to conditional probability queries (i.e., the set of unknown variables) unseen during training. As a result, strong unconditional generation performance does not guarantee high-quality conditional generation. This paper proposes Tractable Transformers (Tracformer), a Transformer-based generative model that is more robust to different conditional generation tasks. Unlike existing models that rely solely on global contextual features derived from full inputs, Tracformers incorporate a sparse Transformer encoder to capture both local and global contextual information. This information is routed through a decoder for conditional generation. Empirical results demonstrate that Tracformers achieve state-of-the-art conditional generation performance on text modeling compared to recent diffusion and AR model baselines.
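The abstract only sketches the architecture, so the following is a minimal, hypothetical illustration of the general idea it describes: mask the unknown positions of an input, run an encoder that combines windowed (local) attention with full (global) attention, and route the resulting features through a decoder head that predicts every masked token. All class and variable names here are invented for illustration; this is not the paper's actual Tracformer implementation.

```python
import torch
import torch.nn as nn

class LocalGlobalEncoder(nn.Module):
    """Toy encoder: a windowed (local) self-attention layer followed by a
    full (global) self-attention layer, loosely mirroring the idea of
    combining local and global contextual features."""
    def __init__(self, dim=64, heads=4, window=8, vocab=100, max_len=256):
        super().__init__()
        self.embed = nn.Embedding(vocab + 1, dim)  # extra id acts as [MASK]
        self.pos = nn.Embedding(max_len, dim)
        self.local_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.global_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.window = window

    def forward(self, tokens):
        b, n = tokens.shape
        x = self.embed(tokens) + self.pos(torch.arange(n, device=tokens.device))
        # Band mask: position i may only attend within +/- window (local pass).
        idx = torch.arange(n, device=tokens.device)
        blocked = (idx[None, :] - idx[:, None]).abs() > self.window
        x = x + self.local_attn(x, x, x, attn_mask=blocked)[0]
        x = x + self.global_attn(x, x, x)[0]  # unrestricted (global pass)
        return x

class ToyConditionalGenerator(nn.Module):
    """Encoder features are routed to a decoder head that predicts a token
    distribution at every position; only unknown positions are read out."""
    def __init__(self, vocab=100, dim=64):
        super().__init__()
        self.encoder = LocalGlobalEncoder(dim=dim, vocab=vocab)
        self.head = nn.Linear(dim, vocab)

    def forward(self, tokens, known):  # known: bool mask, True = observed
        mask_id = self.encoder.embed.num_embeddings - 1
        inp = torch.where(known, tokens, torch.full_like(tokens, mask_id))
        return self.head(self.encoder(inp))  # logits over the vocabulary

# Usage: condition on an arbitrary subset of positions, fill in the rest.
model = ToyConditionalGenerator()
tokens = torch.randint(0, 100, (1, 32))
known = torch.rand(1, 32) > 0.5          # arbitrary conditioning pattern
logits = model(tokens, known)
filled = torch.where(known, tokens, logits.argmax(-1))
```

Because the conditioning pattern `known` can be any subset of positions, a model of this shape can in principle answer arbitrary conditional queries, which is the flexibility the abstract contrasts with autoregressive models' fixed left-to-right ordering.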

Anji Liu, Mathias Niepert, Yitao Liang, Guy Van den Broeck, Xuejie Liu, Dayuan Zhao

Subjects: Computing technology; computer technology

Anji Liu, Mathias Niepert, Yitao Liang, Guy Van den Broeck, Xuejie Liu, Dayuan Zhao. Tractable Transformers for Flexible Conditional Generation [EB/OL]. (2025-07-07) [2025-07-18]. https://arxiv.org/abs/2502.07616.
