
An efficient forgetting-aware fine-tuning framework for pretrained universal machine-learning interatomic potentials

Source: arXiv
Abstract

Pretrained universal machine-learning interatomic potentials (MLIPs) have revolutionized computational materials science by enabling rapid atomistic simulations as efficient alternatives to ab initio methods. Fine-tuning pretrained MLIPs offers a practical approach to improving accuracy for materials and properties where predictive performance is insufficient. However, this approach often induces catastrophic forgetting, undermining the generalizability that is a key advantage of pretrained MLIPs. Herein, we propose reEWC, an advanced fine-tuning strategy that integrates Experience Replay and Elastic Weight Consolidation (EWC) to effectively balance forgetting prevention with fine-tuning efficiency. Using Li$_6$PS$_5$Cl (LPSC), a sulfide-based Li solid-state electrolyte, as a fine-tuning target, we show that reEWC significantly improves the accuracy of a pretrained MLIP, resolving well-known issues of potential energy surface softening and overestimated Li diffusivities. Moreover, reEWC preserves the generalizability of the pretrained MLIP and enables knowledge transfer to chemically distinct systems, including other sulfide, oxide, nitride, and halide electrolytes. Compared to Experience Replay and EWC used individually, reEWC delivers clear synergistic benefits, mitigating their respective limitations while maintaining computational efficiency. These results establish reEWC as a robust and effective solution for continual learning in MLIPs, enabling universal models that can advance materials research through large-scale, high-throughput simulations across diverse chemistries.
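To make the combination described above concrete, the sketch below shows one way a fine-tuning step could mix Experience Replay with an Elastic Weight Consolidation penalty. This is a minimal illustration, not the authors' reEWC implementation: the function names (`ewc_penalty`, `finetune_step`), the weighting factors `lam_ewc` and `lam_replay`, and the generic `model(x)` interface are assumptions for the example; the paper's actual loss terms, Fisher-information estimation, and replay sampling may differ.

```python
import torch

def ewc_penalty(model, fisher, ref_params):
    """Quadratic EWC penalty: sum_i F_i * (theta_i - theta_i*)^2,
    anchoring parameters to their pretrained values ref_params."""
    penalty = 0.0
    for name, p in model.named_parameters():
        if name in fisher:
            penalty = penalty + (fisher[name] * (p - ref_params[name]) ** 2).sum()
    return penalty

def finetune_step(model, optimizer, ft_batch, replay_batch, fisher, ref_params,
                  lam_ewc=1.0, lam_replay=1.0,
                  loss_fn=torch.nn.functional.mse_loss):
    """One hypothetical fine-tuning step: loss on the target system,
    plus a replayed pretraining batch, plus the EWC regularizer."""
    optimizer.zero_grad()

    # Loss on the fine-tuning target (e.g., the new electrolyte data).
    ft_x, ft_y = ft_batch
    loss = loss_fn(model(ft_x), ft_y)

    # Experience Replay: revisit a small batch drawn from the pretraining set
    # so the model keeps fitting the data it was originally trained on.
    rp_x, rp_y = replay_batch
    loss = loss + lam_replay * loss_fn(model(rp_x), rp_y)

    # EWC: penalize drift of parameters deemed important (large Fisher value)
    # for the pretrained model's performance.
    loss = loss + lam_ewc * ewc_penalty(model, fisher, ref_params)

    loss.backward()
    optimizer.step()
    return loss.item()
```

In this reading, replay supplies gradients from the original data distribution while the EWC term keeps important pretrained weights near their reference values, which is the balance between fine-tuning efficiency and forgetting prevention that the abstract attributes to reEWC.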

Jisu Kim, Jiho Lee, Sangmin Oh, Yutack Park, Seungwoo Hwang, Seungwu Han, Sungwoo Kang, Youngho Kang

Subjects: Physics; Information Science and Information Technology

Jisu Kim, Jiho Lee, Sangmin Oh, Yutack Park, Seungwoo Hwang, Seungwu Han, Sungwoo Kang, Youngho Kang. An efficient forgetting-aware fine-tuning framework for pretrained universal machine-learning interatomic potentials [EB/OL]. (2025-06-18) [2025-07-16]. https://arxiv.org/abs/2506.15223
