
Generalizable Heuristic Generation Through Large Language Models with Meta-Optimization


Source: arXiv
Abstract

Heuristic design with large language models (LLMs) has emerged as a promising approach for tackling combinatorial optimization problems (COPs). However, existing approaches often rely on manually predefined evolutionary computation (EC) optimizers and single-task training schemes, which may constrain the exploration of diverse heuristic algorithms and hinder the generalization of the resulting heuristics. To address these issues, we propose Meta-Optimization of Heuristics (MoH), a novel framework that operates at the optimizer level, discovering effective optimizers through the principle of meta-learning. Specifically, MoH leverages LLMs to iteratively refine a meta-optimizer that autonomously constructs diverse optimizers through (self-)invocation, thereby eliminating the reliance on a predefined EC optimizer. These constructed optimizers subsequently evolve heuristics for downstream tasks, enabling broader heuristic exploration. Moreover, MoH employs a multi-task training scheme to promote its generalization capability. Experiments on classic COPs demonstrate that MoH constructs an effective and interpretable meta-optimizer, achieving state-of-the-art performance across various downstream tasks, particularly in cross-size settings.
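The bi-level scheme the abstract describes (an outer loop that refines the optimizer itself, an inner loop in which that optimizer evolves heuristics, evaluated across multiple tasks) can be caricatured in a few lines. The sketch below is purely illustrative and is not the paper's method: random search over a step size stands in for LLM-driven refinement of the meta-optimizer, a single scoring parameter for a greedy knapsack heuristic stands in for an evolved heuristic program, and all function names and the toy tasks are assumptions.

```python
import random

def evaluate_heuristic(weight, items, capacity):
    """Greedy knapsack value using the scoring rule: score = value - weight * size.
    The scalar `weight` plays the role of an evolved heuristic."""
    order = sorted(items, key=lambda it: it[0] - weight * it[1], reverse=True)
    total_value = total_size = 0
    for value, size in order:
        if total_size + size <= capacity:
            total_value += value
            total_size += size
    return total_value

def inner_evolve(optimizer_step, tasks, generations=20):
    """Inner loop: a given optimizer evolves the heuristic parameter,
    scored jointly over several tasks (multi-task training)."""
    weight = 1.0
    best = sum(evaluate_heuristic(weight, items, cap) for items, cap in tasks)
    for _ in range(generations):
        candidate = optimizer_step(weight)
        score = sum(evaluate_heuristic(candidate, items, cap) for items, cap in tasks)
        if score >= best:
            weight, best = candidate, score
    return best

def meta_optimize(tasks, meta_rounds=10, seed=0):
    """Outer loop: search over optimizer designs (here, mutation step sizes),
    a crude stand-in for LLM-based refinement of the meta-optimizer."""
    rng = random.Random(seed)
    best_sigma, best_score = None, float("-inf")
    for _ in range(meta_rounds):
        sigma = rng.uniform(0.01, 1.0)          # candidate optimizer design
        step = lambda w, s=sigma: w + rng.gauss(0, s)
        score = inner_evolve(step, tasks)       # multi-task evaluation
        if score > best_score:
            best_sigma, best_score = sigma, score
    return best_sigma, best_score
```

In MoH the outer loop is driven by an LLM proposing and (self-)invoking optimizer programs rather than sampling a numeric step size, so this toy only conveys the nesting of the two optimization levels.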

Yiding Shi, Jianan Zhou, Wen Song, Jieyi Bi, Yaoxin Wu, Jie Zhang

Subject classification: Computing Technology, Computer Technology

Yiding Shi, Jianan Zhou, Wen Song, Jieyi Bi, Yaoxin Wu, Jie Zhang. Generalizable Heuristic Generation Through Large Language Models with Meta-Optimization [EB/OL]. (2025-05-27) [2025-06-18]. https://arxiv.org/abs/2505.20881.
