
Subitizing-Inspired Large Language Models for Floorplanning


Source: arXiv

Abstract

We present a novel approach to solving the floorplanning problem by leveraging fine-tuned Large Language Models (LLMs). Inspired by subitizing--the human ability to instantly and accurately count small numbers of items at a glance--we hypothesize that LLMs can similarly address floorplanning challenges swiftly and accurately. We propose an efficient representation of the floorplanning problem and introduce a method for generating high-quality datasets tailored for model fine-tuning. We fine-tune LLMs on datasets with a specified number of modules to test whether LLMs can emulate the human ability to quickly count and arrange items. Our experimental results demonstrate that fine-tuned LLMs, particularly GPT4o-mini, achieve high success and optimal rates while attaining relatively low average dead space. These findings underscore the potential of LLMs as promising solutions for complex optimization tasks in VLSI design.
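The abstract evaluates floorplans by their average dead space, i.e. the fraction of the chip area left uncovered by modules. The paper's own representation and metrics are not given here, but under the standard definition (and assuming a legal, non-overlapping placement), dead space can be sketched as follows; the `Module` class and function names are illustrative, not from the paper:

```python
from dataclasses import dataclass

@dataclass
class Module:
    x: float  # lower-left corner x
    y: float  # lower-left corner y
    w: float  # width
    h: float  # height

def dead_space_ratio(modules, chip_w, chip_h):
    """Fraction of the chip bounding box not covered by any module.

    Assumes a legal (non-overlapping) placement, so the covered area
    is simply the sum of the individual module areas.
    """
    used = sum(m.w * m.h for m in modules)
    total = chip_w * chip_h
    return (total - used) / total

# Two 2x2 modules tile a 4x2 chip exactly, so dead space is 0.
mods = [Module(0, 0, 2, 2), Module(2, 0, 2, 2)]
print(dead_space_ratio(mods, 4, 2))  # → 0.0
```

An "optimal" floorplan in this sense is one whose dead-space ratio reaches the minimum achievable for the given module set and outline.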

Shao-Chien Lu, Chen-Chen Yeh, Hui-Lin Cho, Yu-Cheng Lin, Rung-Bin Lin

Subject: Microelectronics; Integrated Circuits

Shao-Chien Lu, Chen-Chen Yeh, Hui-Lin Cho, Yu-Cheng Lin, Rung-Bin Lin. Subitizing-Inspired Large Language Models for Floorplanning [EB/OL]. (2025-04-16) [2025-06-05]. https://arxiv.org/abs/2504.12076
