A Post-trainer's Guide to Multilingual Training Data: Uncovering Cross-lingual Transfer Dynamics

Source: arXiv
Abstract

In order for large language models to be useful across the globe, they are fine-tuned to follow instructions on multilingual data. Despite the ubiquity of such post-training, a clear understanding of the dynamics that enable cross-lingual transfer remains elusive. This study examines cross-lingual transfer (CLT) dynamics in realistic post-training settings. We study two model families of up to 35B parameters in size, trained on carefully controlled mixtures of multilingual data, on three generative tasks with varying levels of complexity (summarization, instruction following, and mathematical reasoning) in both single-task and multi-task instruction tuning settings. Overall, we find that the dynamics of cross-lingual transfer and multilingual performance cannot be explained by isolated variables; they vary depending on the combination of post-training settings. Finally, we identify the conditions that lead to effective cross-lingual transfer in practice.

Luisa Shimabucoro, Ahmet Ustun, Marzieh Fadaee, Sebastian Ruder

Linguistics

Luisa Shimabucoro, Ahmet Ustun, Marzieh Fadaee, Sebastian Ruder. A Post-trainer's Guide to Multilingual Training Data: Uncovering Cross-lingual Transfer Dynamics [EB/OL]. (2025-04-23) [2025-05-24]. https://arxiv.org/abs/2504.16677.
