Scalar Coupling Constant Prediction Using Graph Embedding Local Attention Encoder

09/07/2020
by Caiqing Jian, et al.

The scalar coupling constant (SCC) plays a key role in analyzing the three-dimensional structure of organic molecules; however, traditional SCC prediction based on quantum mechanical calculations is very time-consuming. To calculate SCC efficiently and accurately, we propose a graph embedding local self-attention encoder (GELAE) model. First, a novel rotation- and translation-invariant representation of the coupling system in terms of bond lengths, bond angles and dihedral angles is introduced. Then, a local self-attention module embedded with the adjacency matrix of the molecular graph is designed to effectively extract the features of the coupling system. Finally, the SCC is predicted with a modified classification loss function. To validate the superiority of the proposed method, we conducted a series of comparison experiments using different structure representations, different attention modules, and different losses. The experimental results demonstrate that, compared with traditional chemical-bond structure representations, the rotation- and translation-invariant structure representation proposed in this work improves the SCC prediction accuracy; with the graph-embedded local self-attention, the mean absolute error (MAE) of the prediction model on the validation set decreases from 0.1603 Hz to 0.1067 Hz; and using the classification-based loss function instead of the scaled regression loss, the MAE of the predicted SCC further decreases to 0.0963 Hz, which is close to the quantum chemistry standard on the CHAMPS dataset.

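To make the two core ideas of the abstract concrete, below is a minimal sketch, not the authors' implementation: a rotation- and translation-invariant description of a four-atom coupling path (bond length, bond angles, dihedral angle) and a single-head self-attention layer whose score matrix is masked by the molecular graph's adjacency matrix. The names `coupling_path_features` and `GraphLocalSelfAttention`, the single-head formulation, and the use of PyTorch are illustrative assumptions, not details from the paper.

```python
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F


def coupling_path_features(p0, p1, p2, p3):
    """Rotation/translation-invariant features of a 4-atom coupling path:
    central bond length, the two bond angles, and the dihedral angle."""
    b0, b1, b2 = p1 - p0, p2 - p1, p3 - p2
    b1u = b1 / np.linalg.norm(b1)
    length = np.linalg.norm(b1)
    # bond angles at the two inner atoms
    angle1 = np.arccos(np.dot(-b0, b1) / (np.linalg.norm(b0) * np.linalg.norm(b1)))
    angle2 = np.arccos(np.dot(-b1, b2) / (np.linalg.norm(b1) * np.linalg.norm(b2)))
    # dihedral angle around the central bond (standard atan2 formulation)
    v = -b0 - np.dot(-b0, b1u) * b1u
    w = b2 - np.dot(b2, b1u) * b1u
    dihedral = np.arctan2(np.dot(np.cross(b1u, v), w), np.dot(v, w))
    return np.array([length, angle1, angle2, dihedral])


class GraphLocalSelfAttention(nn.Module):
    """Single-head self-attention masked by the graph adjacency matrix,
    so each atom attends only to itself and its bonded neighbours."""

    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x, adj):
        # x: (batch, atoms, dim); adj: (batch, atoms, atoms) with 0/1 entries
        adj = adj.float()
        scores = torch.matmul(self.q(x), self.k(x).transpose(-2, -1)) * self.scale
        eye = torch.eye(adj.size(-1), device=adj.device)
        mask = (adj + eye).clamp(max=1)            # keep self-connections
        scores = scores.masked_fill(mask == 0, float("-inf"))
        return torch.matmul(F.softmax(scores, dim=-1), self.v(x))
```

In this reading, the adjacency mask is what makes the attention "local": non-bonded atom pairs receive zero attention weight, so the encoder focuses on the chemical environment along the coupling path rather than on all atom pairs in the molecule.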