TuRTLe: A Unified Evaluation of LLMs for RTL Generation
The rapid advancements in LLMs have driven the adoption of generative AI in various domains, including Electronic Design Automation (EDA). Unlike traditional software development, EDA presents unique challenges, as generated RTL code must not only be syntactically correct and functionally accurate but also synthesizable by hardware generators while meeting performance, power, and area constraints. These additional requirements introduce complexities that existing code-generation benchmarks often fail to capture, limiting their effectiveness in evaluating LLMs for RTL generation. To address this gap, we propose TuRTLe, a unified evaluation framework designed to systematically assess LLMs across key RTL generation tasks. TuRTLe integrates multiple existing benchmarks and automates the evaluation process, enabling a comprehensive assessment of LLM performance in syntax correctness, functional correctness, synthesis, PPA optimization, and exact line completion. Using this framework, we benchmark a diverse set of open LLMs and analyze their strengths and weaknesses in EDA-specific tasks. Our results show that reasoning-based models, such as DeepSeek R1, consistently outperform others across multiple evaluation criteria, but at the cost of increased computational overhead and inference latency. Additionally, base models are better suited to module completion tasks, while instruct-tuned models perform better on specification-to-RTL tasks.
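The abstract names several evaluation axes: syntax correctness, functional correctness, synthesizability, PPA optimization, and exact line completion. As a rough illustration only, the sketch below shows how the first three axes could be checked automatically for a generated Verilog module. The tool choices (Icarus Verilog and Yosys), function names, and pass criteria are assumptions made for this sketch and do not reflect TuRTLe's actual implementation or scoring.

# Illustrative sketch (not the TuRTLe code): score one generated RTL module
# on three of the axes mentioned in the abstract. Tool names and pass
# criteria are assumptions for illustration.
import subprocess
import tempfile
from dataclasses import dataclass
from pathlib import Path

@dataclass
class RTLResult:
    syntax_ok: bool       # does the module compile/elaborate?
    functional_ok: bool   # does it pass the reference testbench?
    synthesizable: bool   # does generic logic synthesis complete?

def check_syntax(rtl: str) -> bool:
    """Compile the Verilog source with Icarus Verilog using the null target
    (no output is generated; only parsing/elaboration errors are reported)."""
    with tempfile.TemporaryDirectory() as d:
        src = Path(d) / "dut.v"
        src.write_text(rtl)
        proc = subprocess.run(["iverilog", "-t", "null", str(src)],
                              capture_output=True, text=True)
        return proc.returncode == 0

def check_functional(rtl: str, testbench: str) -> bool:
    """Simulate the module against a reference testbench; treat a clean exit
    with no 'FAIL' marker in the simulation log as a pass (assumed convention)."""
    with tempfile.TemporaryDirectory() as d:
        dut, tb, out = Path(d) / "dut.v", Path(d) / "tb.v", Path(d) / "sim.out"
        dut.write_text(rtl)
        tb.write_text(testbench)
        build = subprocess.run(["iverilog", "-o", str(out), str(dut), str(tb)],
                               capture_output=True, text=True)
        if build.returncode != 0:
            return False
        sim = subprocess.run(["vvp", str(out)], capture_output=True, text=True)
        return sim.returncode == 0 and "FAIL" not in sim.stdout

def check_synthesis(rtl: str, top: str) -> bool:
    """Run Yosys generic synthesis on the module; pass if it exits cleanly."""
    with tempfile.TemporaryDirectory() as d:
        src = Path(d) / "dut.v"
        src.write_text(rtl)
        script = f"read_verilog {src}; synth -top {top}"
        proc = subprocess.run(["yosys", "-p", script],
                              capture_output=True, text=True)
        return proc.returncode == 0

def evaluate(rtl: str, testbench: str, top: str) -> RTLResult:
    """Aggregate the per-axis checks; downstream checks are skipped if the
    module does not even compile."""
    syntax = check_syntax(rtl)
    return RTLResult(
        syntax_ok=syntax,
        functional_ok=syntax and check_functional(rtl, testbench),
        synthesizable=syntax and check_synthesis(rtl, top),
    )

Repeating such checks over a benchmark suite and over multiple samples per prompt would yield pass-rate style metrics per axis; PPA measurement and exact line completion would require additional tooling beyond this sketch.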
Dario Garcia-Gasulla, Gokcen Kestor, Emanuele Parisi, Miquel Albertí-Binimelis, Cristian Gutierrez, Razine Moundir Ghorab, Orlando Montenegro, Bernat Homs, Miquel Moreto
Subjects: Microelectronics and Integrated Circuits; Automation Technology and Equipment; Computing Technology and Computer Technology
Dario Garcia-Gasulla, Gokcen Kestor, Emanuele Parisi, Miquel Albertí-Binimelis, Cristian Gutierrez, Razine Moundir Ghorab, Orlando Montenegro, Bernat Homs, Miquel Moreto. TuRTLe: A Unified Evaluation of LLMs for RTL Generation [EB/OL]. (2025-03-31) [2025-04-26]. https://arxiv.org/abs/2504.01986.