National Preprint Platform

Multiverse: Your Language Models Secretly Decide How to Parallelize and Merge Generation

Source: arXiv
Abstract

Autoregressive Large Language Models (AR-LLMs) frequently exhibit implicit parallelism in sequential generation. Inspired by this, we introduce Multiverse, a new generative model that enables natively parallel generation. Multiverse internalizes a MapReduce paradigm, generating automatically through three stages: (i) a Map stage for adaptive task decomposition, (ii) a Process stage for parallel subtask execution, and (iii) a Reduce stage for lossless result synthesis. Next, we build a real-world Multiverse reasoning model with co-design of data, algorithm, and system, enabling rapid and seamless transfer from frontier AR-LLMs. For data creation, we develop Multiverse Curator, an automated LLM-assisted pipeline that transforms sequential reasoning chains into structured training data, avoiding costly human annotations. Algorithmically, we design Multiverse Attention to separate parallel reasoning steps while keeping compatibility with causal attention for efficient training. Systematically, we implement Multiverse Engine to support parallel inference. It features a dedicated interpreter that dynamically switches between sequential and parallel generation, triggered directly by the model. After a 3-hour fine-tuning with 1K examples, our Multiverse-32B stands as the only open-sourced non-AR model achieving performance on par with leading AR-LLMs of the same scale, evidenced by AIME24 & 25 scores of 54% and 46%, respectively. Moreover, our budget control experiments show that Multiverse-32B exhibits superior scaling, outperforming AR-LLMs by 1.87% on average using the same context length. Such scaling further leads to practical efficiency gains, achieving up to 2x speedup across varying batch sizes. We have open-sourced the entire Multiverse ecosystem, including data, model weights, engine, as well as complete data curation prompts and detailed training and evaluation recipes.
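The abstract's three-stage MapReduce paradigm can be illustrated with a minimal sketch. This is not the released Multiverse Engine API; the function names (`map_stage`, `process_stage`, `reduce_stage`, `multiverse_generate`) and the fixed three-way decomposition are hypothetical stand-ins for the adaptive decomposition, parallel execution, and lossless synthesis the model performs:

```python
# Illustrative sketch of the Map/Process/Reduce generation paradigm.
# All names and the thread-based parallelism are assumptions for
# illustration, not the actual Multiverse Engine implementation.
from concurrent.futures import ThreadPoolExecutor


def map_stage(task: str) -> list[str]:
    # (i) Map: adaptively decompose the task into independent subtasks.
    # Here we hard-code a three-way split for illustration.
    return [f"{task} :: subtask {i}" for i in range(3)]


def process_stage(subtask: str) -> str:
    # (ii) Process: each subtask is generated independently,
    # so the branches can run in parallel.
    return f"result({subtask})"


def reduce_stage(results: list[str]) -> str:
    # (iii) Reduce: merge all branch outputs into one final answer
    # without discarding any branch's result.
    return " | ".join(results)


def multiverse_generate(task: str) -> str:
    subtasks = map_stage(task)
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(process_stage, subtasks))
    return reduce_stage(results)


print(multiverse_generate("solve problem"))
```

In the actual system, per the abstract, the model itself emits the control signals that trigger the switch between sequential and parallel generation; this sketch only mirrors the dataflow of the three stages.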

Tianqi Chen, Beidi Chen, Hongyi Liu, Yuwei An, Xinyu Yang

Subjects: Computing Technology, Computer Technology

Tianqi Chen, Beidi Chen, Hongyi Liu, Yuwei An, Xinyu Yang. Multiverse: Your Language Models Secretly Decide How to Parallelize and Merge Generation [EB/OL]. (2025-06-11) [2025-06-22]. https://arxiv.org/abs/2506.09991.
