FORGE: Foundational Optimization Representations from Graph Embeddings
Combinatorial optimization problems are ubiquitous in science and engineering, yet learning-based approaches to accelerate their solution often require solving a large number of computationally hard optimization instances to collect training data, incurring significant overhead. Existing methods require training dedicated models for each problem distribution and each downstream task, severely limiting their scalability and generalization. In this work, we introduce Forge, a method for pre-training a vector-quantized graph autoencoder on a large and diverse collection of mixed-integer programming (MIP) instances in an unsupervised fashion, without dependence on their solutions. The vector quantization process creates discrete code assignments that act as a vocabulary for representing optimization instances. We evaluate our approach in both supervised and unsupervised settings. In the unsupervised setting, we demonstrate that Forge embeddings effectively differentiate and cluster unseen instances. In the supervised setting, we fine-tune Forge embeddings and show that a single model predicts both variables for warm starts and integrality gaps for cut generation across multiple problem distributions. Both predictions improve the performance of a state-of-the-art, commercial optimization solver. Finally, we release our code and pre-trained Forge weights to encourage further research and practical use of instance-level MIP embeddings at https://github.com/skadio/forge/.
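The abstract describes vector quantization as the step that turns continuous instance representations into a discrete vocabulary. The snippet below is a minimal sketch of that idea, not the authors' implementation: node embeddings (stand-ins for a GNN encoder's output over a MIP instance graph) are matched against a learned codebook to produce discrete code assignments, then pooled into an instance-level vector. The class name, codebook size, and embedding dimension are illustrative assumptions.

```python
import torch
import torch.nn as nn


class VectorQuantizer(nn.Module):
    """Nearest-neighbor lookup into a learned codebook (VQ-VAE style)."""

    def __init__(self, num_codes: int, dim: int):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, dim)
        nn.init.uniform_(self.codebook.weight, -1.0 / num_codes, 1.0 / num_codes)

    def forward(self, z: torch.Tensor):
        # z: (num_nodes, dim) continuous node embeddings from an encoder
        dists = torch.cdist(z, self.codebook.weight)   # (num_nodes, num_codes)
        codes = dists.argmin(dim=1)                    # discrete code assignments
        z_q = self.codebook(codes)                     # quantized embeddings
        # Straight-through estimator so gradients can flow back to the encoder
        z_q = z + (z_q - z).detach()
        return z_q, codes


# Toy usage: quantize random node embeddings for one instance graph
encoder_out = torch.randn(50, 64)               # stand-in for GNN encoder output
vq = VectorQuantizer(num_codes=128, dim=64)
quantized, code_ids = vq(encoder_out)
instance_embedding = quantized.mean(dim=0)      # simple pooling to an instance-level vector
print(code_ids[:10], instance_embedding.shape)
```

In this sketch the code indices play the role of the "vocabulary" mentioned above: two instances whose nodes map to similar code distributions would receive similar discrete descriptions.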
Zohair Shafi, Serdar Kadioglu
Subject classification: computing and computer technology; automation technology and equipment
Zohair Shafi, Serdar Kadioglu. FORGE: Foundational Optimization Representations from Graph Embeddings [EB/OL]. (2025-09-01) [2025-09-06]. https://arxiv.org/abs/2508.20330.