DPA-1: Pretraining of Attention-based Deep Potential Model for Molecular Simulation

08/17/2022
by Duo Zhang, et al.

Machine learning assisted modeling of the interatomic potential energy surface (PES) is revolutionizing the field of molecular simulation. With the accumulation of high-quality electronic structure data, a model that can be pretrained on all available data and fine-tuned on downstream tasks with little additional effort would bring the field to a new stage. Here we propose DPA-1, a Deep Potential model with a novel attention mechanism, which is highly effective for representing the conformational and chemical spaces of atomic systems and for learning the PES. We tested DPA-1 on a number of systems and observed superior performance compared with existing benchmarks. When pretrained on large-scale datasets containing 56 elements, DPA-1 can be successfully applied to various downstream tasks with a large improvement in sample efficiency. Surprisingly, the learned type-embedding parameters of the different elements form a spiral in the latent space and correspond naturally to the elements' positions in the periodic table, showing interesting interpretability of the pretrained DPA-1 model.
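
The abstract only names the attention mechanism, so the following is a minimal PyTorch sketch of the idea it describes: per-neighbor features built from radial information and learned element-type embeddings are updated by self-attention whose weights are gated by the angular relation between neighbors. The class name GatedSelfAttention, all layer sizes, and the cosine gating form are illustrative assumptions, not the released DPA-1 implementation.

```python
# Minimal sketch (not the authors' code) of gated self-attention over the
# neighbors of one atom, with attention weights modulated by the angles
# between neighbor directions.
import torch
import torch.nn as nn


class GatedSelfAttention(nn.Module):
    """Self-attention over one atom's neighbors, gated by angular information."""

    def __init__(self, feat_dim: int, hidden_dim: int):
        super().__init__()
        self.q = nn.Linear(feat_dim, hidden_dim)
        self.k = nn.Linear(feat_dim, hidden_dim)
        self.v = nn.Linear(feat_dim, feat_dim)
        self.scale = hidden_dim ** -0.5

    def forward(self, g: torch.Tensor, r_hat: torch.Tensor) -> torch.Tensor:
        # g:     (n_neighbors, feat_dim)  embedded per-neighbor features
        # r_hat: (n_neighbors, 3)         unit vectors toward each neighbor
        attn = (self.q(g) @ self.k(g).T) * self.scale  # (n, n) attention scores
        attn = torch.softmax(attn, dim=-1)
        gate = r_hat @ r_hat.T                         # (n, n) cosines of inter-neighbor angles
        attn = attn * gate                             # angular gating of the weights
        return g + attn @ self.v(g)                    # residual feature update


# Toy usage: one atom with 8 neighbors. In DPA-1 the per-neighbor features
# come from an embedding net over smoothed radial information concatenated
# with learned type embeddings; here we stand in random radial features.
type_embed = nn.Embedding(56, 8)        # 56 elements as in the pretraining set; dim 8 is arbitrary
types = torch.randint(0, 56, (8,))      # element type of each neighbor
radial = torch.randn(8, 24)             # stand-in for embedded radial features
g = torch.cat([radial, type_embed(types)], dim=-1)  # (8, 32)
r = torch.randn(8, 3)
r_hat = r / r.norm(dim=-1, keepdim=True)

layer = GatedSelfAttention(feat_dim=32, hidden_dim=64)
print(layer(g, r_hat).shape)            # torch.Size([8, 32])
```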

Related research

09/05/2022 · ChemBERTa-2: Towards Chemical Foundation Models
Large pretrained models such as GPT-3 have had tremendous impact on mode...

02/19/2020 · Molecule Attention Transformer
Designing a single neural network architecture that performs competitive...

11/08/2022 · Reducing Down(stream)time: Pretraining Molecular GNNs using Heterogeneous AI Accelerators
The demonstrated success of transfer learning has popularized approaches...

09/03/2022 · TransPolymer: a Transformer-based Language Model for Polymer Property Predictions
Accurate and efficient prediction of polymer properties is of great sign...

11/19/2022 · Molecular Structure-Property Co-Trained Foundation Model for In Silico Chemistry
Recently, deep learning approaches have been extensively studied for var...

04/19/2023 · NetGPT: Generative Pretrained Transformer for Network Traffic
All data on the Internet are transferred by network traffic, thus accura...

10/27/2021 · A2I Transformer: Permutation-equivariant attention network for pairwise and many-body interactions with minimal featurization
The combination of neural network potential (NNP) with molecular simulat...
