Permutation invariant graph-to-sequence model for template-free retrosynthesis and reaction prediction

10/19/2021
by Zhengkai Tu, et al.

Synthesis planning and reaction outcome prediction are two fundamental problems in computer-aided organic chemistry for which a variety of data-driven approaches have emerged. Natural language approaches that model each problem as a SMILES-to-SMILES translation lead to a simple end-to-end formulation, reduce the need for data preprocessing, and enable the use of well-optimized machine translation model architectures. However, SMILES is not an efficient representation for capturing molecular structure, as evidenced by the success of SMILES augmentation in boosting empirical performance. Here, we describe a novel Graph2SMILES model that combines the power of Transformer models for text generation with the permutation invariance of molecular graph encoders, which mitigates the need for input data augmentation. As an end-to-end architecture, Graph2SMILES can be used as a drop-in replacement for the Transformer in any task involving molecule(s)-to-molecule(s) transformations. In our encoder, an attention-augmented directed message passing neural network (D-MPNN) captures local chemical environments, and a global attention encoder allows for long-range and intermolecular interactions, enhanced by graph-aware positional embeddings. Graph2SMILES improves the top-1 accuracy of Transformer baselines by 1.7% and 1.9% for reaction outcome prediction on the USPTO_480k and USPTO_STEREO datasets, respectively, and by 9.8% for one-step retrosynthesis on the USPTO_50k dataset.
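Below is a minimal, self-contained PyTorch sketch of the encoder-decoder composition the abstract describes: a D-MPNN encodes local chemical environments, a global self-attention encoder adds long-range and intermolecular interactions over the atom embeddings, and a Transformer decoder generates SMILES tokens. This is an illustrative assumption, not the authors' implementation; all layer sizes, names, and the toy featurization are hypothetical, and the paper's attention augmentation and graph-aware positional embeddings are simplified to a plain Transformer encoder here.

import torch
import torch.nn as nn


class DMPNN(nn.Module):
    """Directed message passing over bonds (D-MPNN); each update excludes the reverse edge."""

    def __init__(self, atom_dim, bond_dim, hidden, steps=3):
        super().__init__()
        self.w_in = nn.Linear(atom_dim + bond_dim, hidden)
        self.w_msg = nn.Linear(hidden, hidden)
        self.w_out = nn.Linear(atom_dim + hidden, hidden)
        self.steps = steps

    def forward(self, x_atom, x_bond, src, dst, rev):
        # x_atom: (n_atoms, atom_dim); x_bond: (n_edges, bond_dim)
        # src/dst: (n_edges,) endpoints of each directed bond; rev[e] indexes edge e reversed
        h0 = torch.relu(self.w_in(torch.cat([x_atom[src], x_bond], dim=-1)))
        h = h0
        for _ in range(self.steps):
            # sum messages arriving at each atom, then form the message into edge (u -> v)
            # from the incoming edges at u minus the reverse edge (v -> u)
            agg = torch.zeros(x_atom.size(0), h.size(-1)).index_add_(0, dst, h)
            h = torch.relu(h0 + self.w_msg(agg[src] - h[rev]))
        atom_in = torch.zeros(x_atom.size(0), h.size(-1)).index_add_(0, dst, h)
        return torch.relu(self.w_out(torch.cat([x_atom, atom_in], dim=-1)))


class Graph2SMILESSketch(nn.Module):
    """Local D-MPNN -> global self-attention over atoms -> autoregressive SMILES decoder."""

    def __init__(self, atom_dim, bond_dim, vocab, d_model=64, nhead=4):
        super().__init__()
        self.local = DMPNN(atom_dim, bond_dim, d_model)
        self.global_enc = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True), num_layers=2)
        self.embed = nn.Embedding(vocab, d_model)
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True), num_layers=2)
        self.out = nn.Linear(d_model, vocab)

    def forward(self, x_atom, x_bond, src, dst, rev, tokens):
        # Atom embeddings do not depend on input atom ordering, so no SMILES-enumeration
        # augmentation is needed on the input side; the global encoder attends across all
        # atoms (and across molecules, if several graphs are concatenated).
        memory = self.global_enc(self.local(x_atom, x_bond, src, dst, rev).unsqueeze(0))
        tgt = self.embed(tokens).unsqueeze(0)
        causal = torch.triu(torch.full((tokens.size(0),) * 2, float("-inf")), diagonal=1)
        return self.out(self.decoder(tgt, memory, tgt_mask=causal))


# Toy usage: a 3-atom chain with 2 bonds, i.e. 4 directed edges (random features).
x_atom, x_bond = torch.randn(3, 8), torch.randn(4, 4)
src, dst = torch.tensor([0, 1, 1, 2]), torch.tensor([1, 0, 2, 1])
rev = torch.tensor([1, 0, 3, 2])
model = Graph2SMILESSketch(atom_dim=8, bond_dim=4, vocab=30)
logits = model(x_atom, x_bond, src, dst, rev, tokens=torch.tensor([1, 5, 7]))
print(logits.shape)  # torch.Size([1, 3, 30])

The reverse-edge exclusion (agg[src] - h[rev]) is what distinguishes a directed MPNN from a plain atom-centered MPNN: it prevents a message from being bounced straight back along the bond it arrived on.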


Related research

01/29/2022
Retroformer: Pushing the Limits of Interpretable End-to-end Retrosynthesis Transformer
Retrosynthesis prediction is one of the fundamental challenges in organi...

11/06/2018
Molecular Transformer for Chemical Reaction Prediction and Uncertainty Estimation
Organic synthesis is one of the key stumbling blocks in medicinal chemis...

05/04/2023
G-MATT: Single-step Retrosynthesis Prediction using Molecular Grammar Tree Transformer
Various template-based and template-free approaches have been proposed f...

07/02/2019
Predicting Retrosynthetic Reaction using Self-Corrected Transformer Neural Networks
Synthesis planning is the process of recursively decomposing target mole...

03/05/2020
Augmented Transformer Achieves 97% and 85% for Top5 Prediction of Direct and Classical Retro-Synthesis
We investigated the effect of different augmentation scenarios on predic...

10/17/2019
Predicting retrosynthetic pathways using a combined linguistic model and hyper-graph exploration strategy
We present an extension of our Molecular Transformer architecture combin...

10/27/2021
A2I Transformer: Permutation-equivariant attention network for pairwise and many-body interactions with minimal featurization
The combination of neural network potential (NNP) with molecular simulat...
