3D-Transformer: Molecular Representation with Transformer in 3D Space

10/04/2021
by Fang Wu, et al.

Spatial structure in 3D space is important for determining molecular properties. Recent work uses geometric deep learning to represent molecules and predict their properties. These approaches, however, are computationally expensive when capturing long-range dependencies among input atoms, and they do not account for the non-uniformity of interatomic distances, so they fail to learn context-dependent representations at different scales. To address these issues, we introduce 3D-Transformer, a Transformer variant for molecular representation that incorporates 3D spatial information. 3D-Transformer operates on a fully connected graph with direct connections between atoms. To cope with the non-uniformity of interatomic distances, we develop a multi-scale self-attention module that exploits local fine-grained patterns at increasing contextual scales. Because molecules of different sizes rely on different kinds of spatial features, we design an adaptive position encoding module that applies different position encoding methods to small and large molecules. Finally, to obtain a molecular representation from atom embeddings, we propose an attentive farthest point sampling algorithm that selects a subset of atoms guided by attention scores, overcoming the shortcomings of virtual-node readouts and previous distance-dominated downsampling methods. We validate 3D-Transformer across three scientific domains: quantum chemistry, materials science, and proteomics. Our experiments show significant improvements over state-of-the-art models on crystal property prediction and protein-ligand binding affinity prediction, and better or competitive performance on quantum chemistry molecular datasets. This work provides clear evidence that biochemical tasks can consistently benefit from 3D molecular representations and that different tasks require different position encoding methods.
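The abstract describes the multi-scale self-attention module only at a high level. As one way to picture it, the minimal PyTorch sketch below restricts standard scaled dot-product attention to neighbourhoods of increasing radius and averages the per-scale outputs. The class name MultiScaleSelfAttention, the cutoff radii, and the averaging step are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class MultiScaleSelfAttention(nn.Module):
    """Sketch of multi-scale self-attention over atoms in 3D space.

    The same scaled dot-product attention logits are masked at several
    distance cutoffs, so each scale attends only to atoms within an
    increasing radius; the per-scale outputs are then averaged.
    The cutoff radii below are illustrative placeholders.
    """

    def __init__(self, dim, scales=(2.0, 4.0, 8.0)):
        super().__init__()
        self.dim = dim
        self.scales = scales
        self.qkv = nn.Linear(dim, 3 * dim)
        self.out = nn.Linear(dim, dim)

    def forward(self, x, coords):
        # x: (N, dim) atom embeddings; coords: (N, 3) atom positions
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        logits = q @ k.t() / self.dim ** 0.5      # (N, N) attention logits
        dist = torch.cdist(coords, coords)        # (N, N) pairwise distances

        outs = []
        for r in self.scales:
            # Keep only neighbours within radius r (the diagonal is always
            # kept, since every atom is at distance 0 from itself).
            masked = logits.masked_fill(dist > r, float("-inf"))
            outs.append(masked.softmax(dim=-1) @ v)
        return self.out(torch.stack(outs).mean(dim=0))
```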
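Likewise, the attentive farthest point sampling step can be pictured as classic FPS whose greedy selection score blends geometric spread with per-atom attention. The function name attentive_fps, the min-max normalisation, and the linear blend weight alpha are hypothetical choices made here for illustration; the paper's exact scoring rule may differ.

```python
import torch

def attentive_fps(coords, attn_scores, ratio=0.5, alpha=0.5):
    """Sketch of attentive farthest point sampling (AFPS).

    Classic FPS greedily picks the atom farthest from the already-selected
    set; here that geometric criterion is blended with per-atom attention
    scores so that chemically salient atoms are favoured.

    coords:      (N, 3) atom coordinates
    attn_scores: (N,)   per-atom attention scores
    ratio:       fraction of atoms to keep
    alpha:       blend between geometric spread and attention salience
    """
    n = coords.size(0)
    k = max(1, int(n * ratio))

    # Normalise attention scores to [0, 1] so the two criteria are comparable.
    attn = (attn_scores - attn_scores.min()) / (
        attn_scores.max() - attn_scores.min() + 1e-9)

    selected = [int(attn.argmax())]            # start from the most attended atom
    min_dist = torch.full((n,), float("inf"))  # distance to the selected set

    for _ in range(k - 1):
        d = torch.norm(coords - coords[selected[-1]], dim=-1)
        min_dist = torch.minimum(min_dist, d)
        dist_norm = min_dist / (min_dist.max() + 1e-9)
        score = alpha * dist_norm + (1.0 - alpha) * attn
        score[selected] = -1.0                 # never re-pick an atom
        selected.append(int(score.argmax()))

    return torch.tensor(selected, dtype=torch.long)
```

In the full model, the returned indices would select which atom embeddings are pooled into the final molecular representation.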


Related research

- Substructure-Atom Cross Attention for Molecular Representation Learning (10/15/2022)
  Designing a neural network architecture for molecular representation is ...

- Multiresolution Graph Transformers and Wavelet Positional Encoding for Learning Hierarchical Structures (02/17/2023)
  Contemporary graph learning algorithms are not well-defined for large mo...

- Geometric Transformer for End-to-End Molecule Properties Prediction (10/26/2021)
  Transformers have become methods of choice in many applications thanks t...

- A Graph VAE and Graph Transformer Approach to Generating Molecular Graphs (04/09/2021)
  We propose a combination of a variational autoencoder and a transformer ...

- MUDiff: Unified Diffusion for Complete Molecule Generation (04/28/2023)
  We present a new model for generating molecular data by combining discre...

- Molecular Geometry-aware Transformer for accurate 3D Atomic System modeling (02/02/2023)
  Molecular dynamic simulations are important in computational physics, ch...

- An ensemble of VisNet, Transformer-M, and pretraining models for molecular property prediction in OGB Large-Scale Challenge @ NeurIPS 2022 (11/23/2022)
  In the technical report, we provide our solution for OGB-LSC 2022 Graph ...
