Your LLM Knows the Future: Uncovering Its Multi-Token Prediction Potential

Source: arXiv

Abstract

Autoregressive language models are constrained by their inherently sequential nature, generating one token at a time. This paradigm limits inference speed and parallelism, especially during later stages of generation when the direction and semantics of text are relatively certain. In this work, we propose a novel framework that leverages the inherent knowledge of vanilla autoregressive language models about future tokens, combining techniques to realize this potential and enable simultaneous prediction of multiple subsequent tokens. Our approach introduces several key innovations: (1) a masked-input formulation where multiple future tokens are jointly predicted from a common prefix; (2) a gated LoRA formulation that preserves the original LLM's functionality, while equipping it for multi-token prediction; (3) a lightweight, learnable sampler module that generates coherent sequences from the predicted future tokens; (4) a set of auxiliary training losses, including a consistency loss, to enhance the coherence and accuracy of jointly generated tokens; and (5) a speculative generation strategy that expands tokens quadratically in the future while maintaining high fidelity. Our method achieves significant speedups through supervised fine-tuning on pretrained models. For example, it generates code and math nearly 5x faster, and improves general chat and knowledge tasks by almost 2.5x. These gains come without any loss in quality.
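To make innovation (2) concrete, below is a minimal PyTorch sketch of a gated LoRA layer. This is an illustrative reconstruction, not the authors' implementation: the gate tensor, rank, and scaling factor are assumptions. The idea is that the low-rank update fires only at gated positions (e.g., the appended mask tokens used for multi-token prediction), so the frozen base model's outputs at ordinary next-token positions are left unchanged.

    import torch
    import torch.nn as nn

    class GatedLoRALinear(nn.Module):
        """Sketch: apply a LoRA update only where gate == 1 (assumed design)."""

        def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
            super().__init__()
            self.base = base
            for p in self.base.parameters():
                p.requires_grad = False  # keep the pretrained weights frozen
            self.lora_a = nn.Linear(base.in_features, rank, bias=False)
            self.lora_b = nn.Linear(rank, base.out_features, bias=False)
            nn.init.zeros_(self.lora_b.weight)  # update starts at exactly zero
            self.scale = alpha / rank

        def forward(self, x: torch.Tensor, gate: torch.Tensor) -> torch.Tensor:
            # x: (batch, seq, in_features); gate: (batch, seq) with 1.0 at
            # mask-token positions and 0.0 elsewhere, so ungated positions
            # pass through the frozen base layer untouched.
            delta = self.lora_b(self.lora_a(x)) * self.scale
            return self.base(x) + gate.unsqueeze(-1) * delta

    # Usage: gate only the k appended mask positions (here k = 3).
    layer = GatedLoRALinear(nn.Linear(512, 512))
    x = torch.randn(2, 10, 512)
    gate = torch.zeros(2, 10)
    gate[:, -3:] = 1.0
    y = layer(x, gate)  # shape (2, 10, 512)

Because the update is exactly zero wherever the gate is off, standard single-token decoding through such a layer reproduces the original model, which is consistent with the abstract's claim that the gains come without any loss in quality.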

Mohammad Samragh, Arnav Kundu, David Harrison, Kumari Nishu, Devang Naik, Minsik Cho, Mehrdad Farajtabar

Subjects: Computing Technology, Computer Technology

Mohammad Samragh, Arnav Kundu, David Harrison, Kumari Nishu, Devang Naik, Minsik Cho, Mehrdad Farajtabar. Your LLM Knows the Future: Uncovering Its Multi-Token Prediction Potential [EB/OL]. (2025-07-16) [2025-08-10]. https://arxiv.org/abs/2507.11851.
