A Self-Attention Ansatz for Ab-initio Quantum Chemistry

11/24/2022
by Ingrid von Glehn, et al.

We present a novel neural network architecture using self-attention, the Wavefunction Transformer (Psiformer), which can be used as an approximation (or ansatz) for solving the many-electron Schrödinger equation, the fundamental equation of quantum chemistry and materials science. This equation can be solved from first principles, requiring no external training data. In recent years, deep neural networks like the FermiNet and PauliNet have significantly improved the accuracy of these first-principles calculations, but they lack an attention-like mechanism for gating interactions between electrons. Here we show that the Psiformer can be used as a drop-in replacement for these networks, often dramatically improving the accuracy of the calculations. On larger molecules especially, the ground-state energy can be improved by dozens of kcal/mol, a qualitative leap over previous methods. This demonstrates that self-attention networks can learn complex quantum-mechanical correlations between electrons, and are a promising route to unprecedented accuracy in chemical calculations on larger systems.
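The core idea, permutation-equivariant self-attention over per-electron features followed by a determinant to enforce fermionic antisymmetry, can be sketched in a few lines. This is a toy illustration under strong simplifying assumptions, not the actual Psiformer: the function and parameter names (`toy_psiformer`, `embed`, `readout`) are hypothetical, and a real ansatz uses multi-head attention, spin channels, envelope functions, and multiple determinants.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(h, Wq, Wk, Wv):
    # h: (n_electrons, d) per-electron features.
    # Single-head scaled dot-product attention; permuting the rows of h
    # permutes the rows of the output identically (permutation equivariance).
    q, k, v = h @ Wq, h @ Wk, h @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores, axis=-1) @ v

def toy_psiformer(r, params):
    """Toy attention-based wavefunction ansatz.

    r: (n_electrons, 3) electron coordinates. Attention layers mix
    information between electrons; a final determinant makes the output
    antisymmetric under exchange of any two electrons.
    """
    h = r @ params["embed"]                    # embed 3D coordinates
    for Wq, Wk, Wv in params["layers"]:
        h = h + self_attention(h, Wq, Wk, Wv)  # residual attention block
    orbitals = h @ params["readout"]           # (n, n) orbital matrix
    return np.linalg.det(orbitals)             # antisymmetric "wavefunction"
```

Because every layer is permutation-equivariant, swapping two electrons swaps two rows of the orbital matrix, which flips the sign of the determinant exactly, the antisymmetry a fermionic wavefunction requires.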

Related research:

- Quantum Self-Attention Neural Networks for Text Classification (05/11/2022)
- Ab-Initio Solution of the Many-Electron Schrödinger Equation with Deep Neural Networks (09/05/2019)
- QKSAN: A Quantum Kernel Self-Attention Network (08/25/2023)
- So3krates – Self-attention for higher-order geometric interactions on arbitrary length-scales (05/28/2022)
- Human Parity on CommonsenseQA: Augmenting Self-Attention with External Attention (12/06/2021)
- Scalar Coupling Constant Prediction Using Graph Embedding Local Attention Encoder (09/07/2020)
- Data-Centric Machine Learning in Quantum Information Science (01/22/2022)
