Metadynamics for Training Neural Network Model Chemistries: a Competitive Assessment

12/19/2017
by   John E. Herr, et al.

Neural network (NN) model chemistries (MCs) promise to facilitate the accurate exploration of chemical space and simulation of large reactive systems. One important path to improving these models is to add layers of physical detail, especially long-range forces. At short range, however, these models are data driven and data limited. Little is systematically known about how data should be sampled, and "test data" chosen randomly from some sampling techniques can provide poor information about generality. If the sampling method is narrow, "test error" can appear encouragingly tiny while the model fails catastrophically elsewhere. In this manuscript we competitively evaluate two common sampling methods, molecular dynamics (MD) and normal-mode sampling (NMS), and one uncommon alternative, Metadynamics (MetaMD), for preparing training geometries. We show that MD is an inefficient sampling method in the sense that additional samples do not improve generality. We also show that MetaMD is easily implemented in any NNMC software package at a cost that scales linearly with the number of atoms in a sample molecule. MetaMD is a black-box way to ensure samples always reach out to new regions of chemical space while remaining relevant to chemistry near k_BT. It is one cheap tool to address the issue of generalization.
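The core idea of metadynamics is compact enough to sketch. The toy below is an illustration under simplified assumptions, not the authors' implementation: it runs an overdamped Langevin walker on a one-dimensional double-well potential and periodically deposits Gaussian "hills" at visited positions, so the accumulated bias steadily pushes the walker out of basins it has already sampled. (In the paper's setting the walker moves in molecular-geometry space and the bias acts on a molecular similarity measure; the 1D collective variable here stands in for that.)

```python
import numpy as np

def metadynamics(steps=20000, dt=0.01, kT=0.1, hill_height=0.1,
                 hill_width=0.2, deposit_every=100, seed=0):
    """Overdamped Langevin walker on V(x) = x**4 - 2*x**2 (minima at
    x = -1 and x = +1, barrier of height 1 at x = 0), plus Gaussian
    hills deposited at visited positions to discourage revisiting."""
    rng = np.random.default_rng(seed)
    x = -1.0                 # start in the left well
    centers = []             # hill centers define the history-dependent bias
    traj = np.empty(steps)
    for step in range(steps):
        # force from the base potential: -dV/dx
        f = -(4.0 * x**3 - 4.0 * x)
        # force from the accumulated bias: -d/dx of the summed Gaussian hills
        if centers:
            diff = x - np.array(centers)
            f += np.sum(hill_height * diff / hill_width**2
                        * np.exp(-0.5 * (diff / hill_width)**2))
        # Euler-Maruyama step of overdamped Langevin dynamics
        x += f * dt + np.sqrt(2.0 * kT * dt) * rng.standard_normal()
        if step % deposit_every == 0:
            centers.append(x)
        traj[step] = x
    return traj

traj = metadynamics()
# with kT = 0.1 the unbiased walker would rarely cross the barrier;
# the growing bias carries it into the unexplored right-hand well
```

The cost per step is the hill sum, which grows with the number of deposited hills; in the molecular case the per-hill distance evaluation is what scales linearly with the number of atoms, consistent with the abstract's claim.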

