
Trillion 7B Technical Report


Source: arXiv
Abstract

We introduce Trillion-7B, the most token-efficient Korean-centric multilingual LLM available. Our novel Cross-lingual Document Attention (XLDA) mechanism enables highly efficient and effective knowledge transfer from English to target languages like Korean and Japanese. Combined with optimized data mixtures, language-specific filtering, and tailored tokenizer construction, Trillion-7B achieves competitive performance while dedicating only 10% of its 2T training tokens to multilingual data and requiring just 59.4K H100 GPU hours ($148K) for full training. Comprehensive evaluations across 27 benchmarks in four languages demonstrate Trillion-7B's robust multilingual performance and exceptional cross-lingual consistency.
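
The abstract names the Cross-lingual Document Attention (XLDA) mechanism but does not describe its internals. The sketch below illustrates one plausible reading, in which documents of different languages packed into the same training sequence may attend across document boundaries when they belong to the same cross-lingual group, while unrelated documents stay isolated under the usual per-document mask. The function name build_xlda_mask, the doc_ids/group_ids inputs, and the masking rule are illustrative assumptions, not the paper's actual implementation.

# Minimal sketch, assuming a group-aware document attention mask (hypothetical).
import numpy as np

def build_xlda_mask(doc_ids: np.ndarray, group_ids: np.ndarray) -> np.ndarray:
    """Return a boolean [seq, seq] mask: True where attention is permitted.

    doc_ids   -- per-token document index within the packed sequence
    group_ids -- per-token cross-lingual group index; documents sharing a
                 group are treated as parallel/related text across languages
    """
    seq = len(doc_ids)
    causal = np.tril(np.ones((seq, seq), dtype=bool))       # standard causal mask
    same_doc = doc_ids[:, None] == doc_ids[None, :]         # ordinary intra-document attention
    same_group = group_ids[:, None] == group_ids[None, :]   # cross-lingual pairs may see each other
    return causal & (same_doc | same_group)

# Toy packed sequence: doc 0 (English) and doc 1 (Korean) share group 0,
# while doc 2 is an unrelated document in group 1 and remains isolated.
doc_ids = np.array([0, 0, 0, 1, 1, 2, 2])
group_ids = np.array([0, 0, 0, 0, 0, 1, 1])
print(build_xlda_mask(doc_ids, group_ids).astype(int))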

Sungjun Han, Juyoung Suk, Suyeong An, Hyungguk Kim, Kyuseok Kim, Wonsuk Yang, Seungtaek Choi, Jamin Shin

Trillion Labs

Linguistics; Commonly Used Foreign Languages

Sungjun Han, Juyoung Suk, Suyeong An, Hyungguk Kim, Kyuseok Kim, Wonsuk Yang, Seungtaek Choi, Jamin Shin. Trillion 7B Technical Report [EB/OL]. (2025-04-21) [2025-05-12]. https://arxiv.org/abs/2504.15431.
