When Less Language is More: Language-Reasoning Disentanglement Makes LLMs Better Multilingual Reasoners
Multilingual reasoning remains a significant challenge for large language models (LLMs), with performance disproportionately favoring high-resource languages. Drawing inspiration from cognitive neuroscience, which suggests that human reasoning functions largely independently of language processing, we hypothesize that LLMs similarly encode reasoning and language as separable components that can be disentangled to enhance multilingual reasoning. To test this hypothesis, we perform a causal intervention by ablating language-specific representations at inference time. Experiments on 10 open-source LLMs spanning 11 typologically diverse languages show that this language-specific ablation consistently boosts multilingual reasoning performance. Layer-wise analyses further confirm that language and reasoning representations can be effectively decoupled throughout the model, yielding improved multilingual reasoning capabilities, while preserving top-layer language features remains essential for maintaining linguistic fidelity. Compared to post-training methods such as supervised fine-tuning or reinforcement learning, our training-free ablation achieves comparable or superior results with minimal computational overhead. These findings shed light on the internal mechanisms underlying multilingual reasoning in LLMs and suggest a lightweight and interpretable strategy for improving cross-lingual generalization.
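The abstract's core operation, ablating language-specific representations at inference time, can be illustrated with a minimal sketch. The sketch below assumes the language-specific component is approximated by a single mean-difference direction estimated from parallel prompts, and that ablation is a projection applied to the residual stream via forward hooks; the paper's actual estimator and intervention sites may differ. The model name, prompt lists, layer index, and number of preserved top layers are hypothetical placeholders.

```python
# Minimal sketch: estimate one language direction, then project it out of
# the residual stream at inference time via forward hooks.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2.5-7B-Instruct"  # hypothetical choice of open-source LLM
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16)
model.eval()

# Hypothetical parallel prompts; in practice one would use a larger corpus.
prompts_en = ["Solve: 12 * 7 = ?", "Name three prime numbers."]
prompts_zh = ["计算：12 * 7 = ？", "说出三个质数。"]  # Chinese as the target language

LAYER = 16  # hypothetical mid-depth layer used to estimate the direction

@torch.no_grad()
def mean_last_token_hidden(texts, layer):
    """Average the residual-stream activation of the final token at `layer`."""
    vecs = []
    for text in texts:
        ids = tok(text, return_tensors="pt").to(model.device)
        out = model(**ids, output_hidden_states=True)
        vecs.append(out.hidden_states[layer][0, -1])
    return torch.stack(vecs).mean(dim=0)

# Language-specific direction: difference of mean activations between the
# target language and English, normalized to unit length.
lang_dir = mean_last_token_hidden(prompts_zh, LAYER) - mean_last_token_hidden(prompts_en, LAYER)
lang_dir = (lang_dir / lang_dir.norm()).to(model.dtype)

def ablate_hook(module, args, output):
    # Decoder layers return a tuple; the hidden states come first.
    h = output[0] if isinstance(output, tuple) else output
    h = h - (h @ lang_dir).unsqueeze(-1) * lang_dir  # remove the component along lang_dir
    return (h,) + output[1:] if isinstance(output, tuple) else h

# Ablate everywhere except the top few layers: the abstract reports that
# keeping top-layer language features preserves linguistic fidelity.
KEEP_TOP = 4  # hypothetical
layers = model.model.layers  # Llama/Qwen-style architectures
handles = [layers[i].register_forward_hook(ablate_hook) for i in range(len(layers) - KEEP_TOP)]

# ... run multilingual reasoning prompts here, then clean up:
for handle in handles:
    handle.remove()
```

The hooks leave the top `KEEP_TOP` layers untouched, mirroring the abstract's finding that top-layer language features must be preserved to maintain linguistic fidelity in the generated output.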
Weixiang Zhao, Jiahe Guo, Yang Deng, Tongtong Wu, Wenxuan Zhang, Yulin Hu, Xingyu Sui, Yanyan Zhao, Wanxiang Che, Bing Qin, Tat-Seng Chua, Ting Liu
Linguistics
Weixiang Zhao, Jiahe Guo, Yang Deng, Tongtong Wu, Wenxuan Zhang, Yulin Hu, Xingyu Sui, Yanyan Zhao, Wanxiang Che, Bing Qin, Tat-Seng Chua, Ting Liu. When Less Language is More: Language-Reasoning Disentanglement Makes LLMs Better Multilingual Reasoners [EB/OL]. (2025-05-21) [2025-06-13]. https://arxiv.org/abs/2505.15257.