National Preprint Platform (国家预印本平台)

Language Graph Distillation for Low-Resource Machine Translation


Source: arXiv

Abstract

Neural machine translation on low-resource languages is challenging due to the lack of bilingual sentence pairs. Previous works usually address the low-resource translation problem through knowledge transfer in a multilingual setting. In this paper, we propose the concept of a Language Graph and design a novel graph distillation algorithm that boosts the accuracy of low-resource translations in the graph through forward and backward knowledge distillation. Preliminary experiments on the TED talks multilingual dataset demonstrate the effectiveness of the proposed method; specifically, we improve the low-resource translation pair by more than 3.13 BLEU points.
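The abstract does not detail the distillation objective itself. As general background, the word-level knowledge distillation commonly used in NMT trains the student on the teacher's output distribution (soft targets) instead of one-hot labels; a minimal pure-Python sketch follows, where the function name and the tiny two-token vocabulary are illustrative, not the paper's implementation:

```python
import math

def kd_loss(student_logits, teacher_probs):
    """Word-level knowledge distillation loss for one target sentence.

    student_logits: per-position lists of the student's raw vocabulary scores.
    teacher_probs:  per-position teacher output distributions (soft targets).
    Returns the cross-entropy between teacher and student, summed over positions.
    """
    loss = 0.0
    for s_logits, t_probs in zip(student_logits, teacher_probs):
        # Numerically stable softmax over the student's logits at this position.
        m = max(s_logits)
        exps = [math.exp(x - m) for x in s_logits]
        z = sum(exps)
        log_probs = [math.log(e / z) for e in exps]
        # Teacher probabilities weight the student's log-probabilities.
        loss -= sum(p * lp for p, lp in zip(t_probs, log_probs))
    return loss
```

When the teacher distribution is one-hot, this reduces to standard cross-entropy training; a soft teacher distribution additionally passes on the teacher's relative preferences among candidate tokens.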

Tianyu He, Tao Qin, Xu Tan, Jiale Chen

Subject: Linguistics of commonly used foreign languages

Tianyu He, Tao Qin, Xu Tan, Jiale Chen. Language Graph Distillation for Low-Resource Machine Translation [EB/OL]. (2019-08-17) [2025-08-02]. https://arxiv.org/abs/1908.06258.
