Apax: A Flexible and Performant Framework For The Development of Machine-Learned Interatomic Potentials
We introduce Atomistic learned potentials in JAX (apax), a flexible and efficient open-source software package for the training of and inference with machine-learned interatomic potentials. Built on the JAX framework, apax supports GPU acceleration and implements flexible model abstractions for fast development. With features such as kernel-based data selection, well-calibrated uncertainty estimation, and enhanced sampling, it is tailored to active learning applications and ease of use. The features and design decisions made in apax are discussed before demonstrating some of its capabilities. First, a data set for the room-temperature ionic liquid EMIM+BF4- is created using active learning. We highlight how continuously learning models between iterations can reduce training times by up to 85% with only a minor reduction in the models' accuracy. Second, we show good scalability in a data-parallel training setting. We report that a Gaussian Moment Neural Network model, as implemented in apax, achieves higher accuracy and up to 10 times faster inference than a performance-optimized Allegro model. A recently published Li3PO4 dataset, reported with comparable accuracy and inference performance metrics, is used as the point of comparison. Moreover, the inference speeds of the available simulation engines are compared. Finally, to highlight the modularity of apax, an equivariant message-passing model is trained as a shallow ensemble and used to perform uncertainty-driven dynamics.
Fabian Zills, Christian Holm, Johannes Kästner, Moritz René Schäfer, Nico Segreto
Subject: Computing technology, computer technology
Fabian Zills, Christian Holm, Johannes Kästner, Moritz René Schäfer, Nico Segreto. Apax: A Flexible and Performant Framework For The Development of Machine-Learned Interatomic Potentials [EB/OL]. (2025-05-28) [2025-06-15]. https://arxiv.org/abs/2505.22168.