Leveraging Large Language Models for Bengali Math Word Problem Solving with Chain of Thought Reasoning
Solving Bengali Math Word Problems (MWPs) remains a major challenge in natural language processing (NLP) due to the language's low-resource status and the multi-step reasoning the task requires. Existing models struggle with complex Bengali MWPs, largely because no human-annotated Bengali dataset has previously addressed this task; this gap has limited progress in Bengali mathematical reasoning. To address it, we created SOMADHAN, a dataset of 8,792 complex Bengali MWPs with manually written, step-by-step solutions, designed to support reasoning-focused evaluation and model development in a linguistically underrepresented context. Using SOMADHAN, we evaluated a range of large language models (LLMs), including GPT-4o, GPT-3.5 Turbo, the LLaMA series, DeepSeek, and Qwen, through both zero-shot and few-shot prompting, with and without Chain of Thought (CoT) reasoning. CoT prompting consistently improved performance over standard prompting, especially on tasks requiring multi-step logic; LLaMA-3.3 70B achieved the highest accuracy, 88%, with few-shot CoT prompting. We also applied Low-Rank Adaptation (LoRA) to fine-tune models efficiently, enabling them to adapt to Bengali MWPs at minimal computational cost. Our work fills a critical gap in Bengali NLP by providing a high-quality reasoning dataset and a scalable framework for solving complex MWPs. We aim to advance equitable research in low-resource languages and to strengthen reasoning capabilities in educational and language technologies.
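As a concrete illustration of the few-shot CoT prompting setup described in the abstract, the Python sketch below assembles a prompt from a worked Bengali example and queries an LLM through the OpenAI chat API. The example problems, prompt wording, and model name are illustrative assumptions made for this page, not the exact prompts or data used in SOMADHAN.

```python
# Minimal sketch of few-shot Chain-of-Thought (CoT) prompting for a Bengali
# math word problem. The exemplar, system prompt, and model are placeholders.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Few-shot exemplars: each pairs a Bengali MWP with a step-by-step solution.
FEW_SHOT_EXAMPLES = [
    {
        "problem": "রহিমের কাছে ৫টি আম আছে। করিম তাকে আরও ৩টি আম দিল। রহিমের কাছে এখন কতটি আম আছে?",
        "solution": "ধাপ ১: রহিমের শুরুতে ৫টি আম ছিল।\n"
                    "ধাপ ২: করিম আরও ৩টি আম দিল, তাই ৫ + ৩ = ৮।\n"
                    "উত্তর: ৮টি আম।",
    },
]

def build_cot_prompt(question: str) -> list[dict]:
    """Assemble chat messages: worked examples first, then the new problem."""
    messages = [{"role": "system",
                 "content": "Solve the Bengali math word problem step by step, "
                            "then state the final answer."}]
    for ex in FEW_SHOT_EXAMPLES:
        messages.append({"role": "user", "content": ex["problem"]})
        messages.append({"role": "assistant", "content": ex["solution"]})
    messages.append({"role": "user", "content": question})
    return messages

def solve(question: str, model: str = "gpt-4o") -> str:
    response = client.chat.completions.create(
        model=model,
        messages=build_cot_prompt(question),
        temperature=0.0,  # deterministic decoding for evaluation
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(solve("একটি ঝুড়িতে ১২টি কমলা আছে। ৪টি কমলা খেয়ে ফেলা হলে কতটি বাকি থাকে?"))
```

The LoRA fine-tuning step can likewise be sketched with the Hugging Face PEFT library; the base model, target modules, and hyperparameters below are assumptions and may differ from the paper's actual configuration.

```python
# Minimal sketch of parameter-efficient fine-tuning with LoRA via Hugging Face
# PEFT. Only the small low-rank adapter matrices are trained, which keeps the
# computational cost of adapting a large model to Bengali MWPs low.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_model = "meta-llama/Llama-3.1-8B"  # placeholder; any causal LM works
tokenizer = AutoTokenizer.from_pretrained(base_model)  # for tokenizing MWP solutions
model = AutoModelForCausalLM.from_pretrained(base_model)

lora_config = LoraConfig(
    r=8,                                  # rank of the low-rank update matrices
    lora_alpha=16,                        # scaling factor for the LoRA updates
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only a small fraction of weights are trainable
```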
Bidyarthi Paul, Jalisha Jashim Era, Mirazur Rahman Zim, Tahmid Sattar Aothoi, Faisal Muhammad Shah
Bidyarthi Paul, Jalisha Jashim Era, Mirazur Rahman Zim, Tahmid Sattar Aothoi, Faisal Muhammad Shah. Leveraging Large Language Models for Bengali Math Word Problem Solving with Chain of Thought Reasoning [online]. (2025-05-27) [accessed 2025-06-14]. https://arxiv.org/abs/2505.21354