
MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts


Source: arXiv
English Abstract

Learning to solve vehicle routing problems (VRPs) has garnered much attention. However, most neural solvers are only structured and trained independently on a specific problem, making them less generic and practical. In this paper, we aim to develop a unified neural solver that can cope with a range of VRP variants simultaneously. Specifically, we propose a multi-task vehicle routing solver with mixture-of-experts (MVMoE), which greatly enhances the model capacity without a proportional increase in computation. We further develop a hierarchical gating mechanism for the MVMoE, delivering a good trade-off between empirical performance and computational complexity. Experimentally, our method significantly promotes zero-shot generalization performance on 10 unseen VRP variants, and showcases decent results on the few-shot setting and real-world benchmark instances. We further conduct extensive studies on the effect of MoE configurations in solving VRPs, and observe the superiority of hierarchical gating when facing out-of-distribution data. The source code is available at: https://github.com/RoyalSkye/Routing-MVMoE.
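The abstract describes a mixture-of-experts layer governed by a gating network. As background, the following is a minimal sketch of a standard top-k gated (sparse) MoE feed-forward layer in PyTorch; the class name, hyperparameters, and routing loop are illustrative assumptions rather than the MVMoE implementation, and the paper's hierarchical gating mechanism is not reproduced here. Refer to the linked repository for the authors' code.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Feed-forward layer that routes each node embedding to its top-k experts (illustrative sketch)."""

    def __init__(self, d_model, d_hidden, num_experts=4, top_k=2):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )
        self.gate = nn.Linear(d_model, num_experts)   # gating network producing per-expert scores
        self.top_k = top_k

    def forward(self, x):
        # x: (batch, num_nodes, d_model) node embeddings from the encoder
        scores = self.gate(x)                          # (B, N, num_experts)
        topk_scores, topk_idx = scores.topk(self.top_k, dim=-1)
        topk_weights = F.softmax(topk_scores, dim=-1)  # normalize over the selected experts only
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = topk_idx[..., k] == e           # nodes whose k-th choice is expert e
                if mask.any():
                    out[mask] = out[mask] + topk_weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out

# Example usage: such a layer can replace a dense feed-forward block so that
# capacity grows with num_experts while each node activates only top_k experts.
layer = SparseMoELayer(d_model=128, d_hidden=512, num_experts=4, top_k=2)
h = torch.randn(8, 50, 128)   # a batch of 8 instances with 50 nodes each
print(layer(h).shape)         # torch.Size([8, 50, 128])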

Chi Xu、Yaoxin Wu、Zhiguang Cao、Jianan Zhou、Wen Song、Jie Zhang、Yining Ma

Subjects: computing technology, computer technology; automation technology; economics; automation technology, automation equipment

Chi Xu, Yaoxin Wu, Zhiguang Cao, Jianan Zhou, Wen Song, Jie Zhang, Yining Ma. MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts [EB/OL]. (2024-05-02) [2025-06-16]. https://arxiv.org/abs/2405.01029.
