国家预印本平台 (National Preprint Platform)

Teaching a Language Model to Speak the Language of Tools

Source: arXiv
English Abstract

External tool integration through function-calling is essential for practical language model applications, yet most multilingual models lack reliable tool-use capabilities in non-English languages. Even state-of-the-art multilingual models struggle with determining when to use tools and generating the structured outputs required for function calls, often exhibiting language confusion when prompted in lower-resource languages. This work presents a methodology for adapting existing language models to enable robust tool use in any target language, using Bulgarian as a case study. The approach involves continued training of the BgGPT model series (2.6B, 9B, 27B parameters) on a novel bilingual dataset of 10,035 function-calling examples designed to support standardized protocols like MCP (Model Context Protocol). The research introduces TUCAN (Tool-Using Capable Assistant Navigator), which achieves up to 28.75% improvement in function-calling accuracy over base models while preserving core language understanding, as verified on established Bulgarian benchmarks. Beyond accuracy gains, TUCAN models demonstrate production-ready response formatting with clean, parsable function calls, contrasting with the verbose and inconsistent outputs of base models. The models, evaluation framework, and dataset are released to enable replication for other languages. This work demonstrates a practical approach for extending tool-augmented capabilities beyond English-centric systems.
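The abstract's claim of "clean, parsable function calls" (versus the verbose, prose-wrapped outputs of base models) can be illustrated with a minimal sketch. The `{"name": ..., "arguments": ...}` schema and the example tool name below are assumptions for illustration, not the paper's actual output format:

```python
import json

def parse_tool_call(response: str):
    """Try to parse a model response as a single JSON tool call.

    Returns a (name, arguments) tuple, or None when the response is not
    a clean, machine-parsable function call. The schema used here is an
    illustrative assumption, not the format defined in the paper.
    """
    try:
        call = json.loads(response.strip())
    except json.JSONDecodeError:
        return None  # prose-wrapped or malformed output fails to parse
    if not isinstance(call, dict) or "name" not in call or "arguments" not in call:
        return None
    return call["name"], call["arguments"]

# A clean, production-ready response parses directly:
clean = '{"name": "get_weather", "arguments": {"city": "София"}}'
# A verbose base-model response that mixes prose with JSON does not:
verbose = 'Разбира се! Ще извикам инструмента: {"name": "get_weather"}'

print(parse_tool_call(clean))    # ('get_weather', {'city': 'София'})
print(parse_tool_call(verbose))  # None
```

A strict parser of this kind is what makes the difference measurable: tool-tuned outputs succeed on the first parse attempt, while prose-wrapped outputs require fragile post-processing.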

Simeon Emanuilov

Subject classifications: Uralic languages (Finno-Ugric); computing and computer technology; information and knowledge dissemination

Simeon Emanuilov. Teaching a Language Model to Speak the Language of Tools [EB/OL]. (2025-06-29) [2025-07-21]. https://arxiv.org/abs/2506.23394.