EquiformerV2: Improved Equivariant Transformer for Scaling to Higher-Degree Representations

06/21/2023
by Yi-Lun Liao, et al.

Equivariant Transformers such as Equiformer have demonstrated the efficacy of applying Transformers to the domain of 3D atomistic systems. However, they are still limited to small degrees of equivariant representations due to their computational complexity. In this paper, we investigate whether these architectures can scale well to higher degrees. Starting from Equiformer, we first replace SO(3) convolutions with eSCN convolutions to efficiently incorporate higher-degree tensors. Then, to better leverage the power of higher degrees, we propose three architectural improvements: attention re-normalization, separable S^2 activation, and separable layer normalization. Putting this all together, we propose EquiformerV2, which outperforms previous state-of-the-art methods on the large-scale OC20 dataset by up to 12% on forces and 4% on energies, offers better speed-accuracy trade-offs, and achieves a 2× reduction in the number of DFT calculations needed to compute adsorption energies.
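
Since the abstract names separable layer normalization as one of the three improvements, below is a minimal PyTorch sketch of one plausible reading of that idea, under stated assumptions: degree-0 (invariant) channels are normalized with a standard LayerNorm, while degree > 0 channels are only rescaled by their RMS magnitude, because subtracting a mean from non-scalar spherical-harmonic components would break equivariance. The class name `SeparableLayerNorm` and the `[num_nodes, (lmax + 1)**2, channels]` feature layout are illustrative assumptions, not the paper's released implementation.

```python
import torch
import torch.nn as nn


class SeparableLayerNorm(nn.Module):
    """Hypothetical sketch of a 'separable' layer norm for equivariant features.

    Assumes features are stored as [num_nodes, (lmax + 1)**2, channels], with
    the (l, m) spherical-harmonic components along dim 1 and the l = 0 row first.
    """

    def __init__(self, lmax: int, channels: int, eps: float = 1e-5):
        super().__init__()
        self.lmax = lmax
        self.eps = eps
        self.norm_l0 = nn.LayerNorm(channels)             # standard LN on scalars
        self.weight = nn.Parameter(torch.ones(channels))  # learned gain for l > 0

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        assert x.shape[1] == (self.lmax + 1) ** 2
        x0, xl = x[:, :1, :], x[:, 1:, :]  # split l = 0 from l > 0
        x0 = self.norm_l0(x0)
        # Rescale l > 0 components by their RMS magnitude only; mean
        # subtraction on these rows would not commute with rotations.
        rms = xl.pow(2).mean(dim=(1, 2), keepdim=True).clamp_min(self.eps).sqrt()
        xl = xl / rms * self.weight
        return torch.cat([x0, xl], dim=1)


# Usage: lmax = 4 gives (4 + 1)**2 = 25 spherical components per node.
ln = SeparableLayerNorm(lmax=4, channels=64)
out = ln(torch.randn(10, 25, 64))  # -> shape [10, 25, 64]
```

Handling the two blocks separately lets the scalar channels keep a conventional LayerNorm while the geometric channels are treated in a rotation-consistent way, which is the general motivation behind separating degree-0 from higher-degree statistics.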
