Implicit Delta Learning of High Fidelity Neural Network Potentials

Bibliographic Details
Title: Implicit Delta Learning of High Fidelity Neural Network Potentials
Authors: Thaler, Stephan, Gabellini, Cristian, Shenoy, Nikhil, Tossou, Prudencio
Publication Year: 2024
Collection: Computer Science; Physics (Other)
Subject Terms: Physics - Chemical Physics, Computer Science - Machine Learning
More Details: Neural network potentials (NNPs) offer a fast and accurate alternative to ab initio methods for molecular dynamics (MD) simulations, but are hindered by the high cost of training data from high-fidelity Quantum Mechanics (QM) methods. Our work introduces the Implicit Delta Learning (IDLe) method, which reduces the need for high-fidelity QM data by leveraging cheaper semi-empirical QM computations without compromising NNP accuracy or inference cost. IDLe employs an end-to-end multi-task architecture with fidelity-specific heads that decode energies from a shared latent representation of the input atomistic system. In various settings, IDLe achieves the same accuracy as single high-fidelity baselines while using up to 50x less high-fidelity data. This could significantly reduce data generation costs and, in turn, improve accuracy, generalization, and chemical coverage for NNPs, advancing MD simulations for materials science and drug discovery. Additionally, we provide a novel set of 11 million semi-empirical QM calculations to support future multi-fidelity NNP modeling.
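The multi-task design described in the abstract can be sketched minimally: a shared encoder maps the atomistic input to a latent vector, and separate fidelity-specific heads decode that latent into energy predictions. The code below is an illustrative NumPy sketch, not the authors' implementation; the dimensions, the single-layer encoder, the linear heads, and the fidelity names `semi_empirical` and `high_fidelity` are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for the sketch (not from the paper).
d_in, d_latent = 8, 16

# Shared encoder parameters: one tanh layer standing in for a full NNP backbone.
W_enc = rng.normal(size=(d_in, d_latent)) * 0.1
b_enc = np.zeros(d_latent)

# Fidelity-specific heads: here, one linear readout per QM fidelity level.
heads = {
    "semi_empirical": rng.normal(size=d_latent) * 0.1,
    "high_fidelity": rng.normal(size=d_latent) * 0.1,
}

def shared_encoder(x):
    # Latent representation of the atomistic input, shared across fidelities.
    return np.tanh(x @ W_enc + b_enc)

def predict_energy(x, fidelity):
    # Decode an energy from the shared latent with the chosen fidelity head.
    z = shared_encoder(x)
    return float(z @ heads[fidelity])

# Example: the same input decoded at two fidelity levels.
x = rng.normal(size=d_in)
e_low = predict_energy(x, "semi_empirical")
e_high = predict_energy(x, "high_fidelity")
```

Because the encoder is shared, gradients from abundant semi-empirical labels shape the latent space that the scarce high-fidelity head also reads from, which is the mechanism by which cheap data can substitute for expensive data.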
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2412.06064
Accession Number: edsarx.2412.06064
Database: arXiv