
Of Non-Linearity and Commutativity in BERT

01/12/2021
by Sumu Zhao, et al.

In this work we provide new insights into the transformer architecture, and in particular, its best-known variant, BERT. First, we propose a method to measure the degree of non-linearity of different elements of transformers. Next, we focus our investigation on the feed-forward networks (FFN) inside transformers, which contain 2/3 of the model parameters and have so far not received much attention. We find that FFNs are an inefficient yet important architectural element and that they cannot simply be replaced by attention blocks without a degradation in performance. Moreover, we study the interactions between layers in BERT and show that, while the layers exhibit some hierarchical structure, they extract features in a fuzzy manner. Our results suggest that BERT has an inductive bias towards layer commutativity, which we find is mainly due to the skip connections. This provides a justification for the strong performance of recurrent and weight-shared transformer models.
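The page stops at the abstract, but the commutativity claim is easy to probe informally. Below is a minimal sketch, assuming the HuggingFace transformers library and the bert-base-uncased checkpoint (assumptions made here, not necessarily the authors' exact setup): it swaps two adjacent encoder layers at inference time and measures how much the output representations change. A per-token cosine similarity close to 1 would be consistent with the inductive bias towards layer commutativity described above.

```python
# Minimal layer-swap probe of BERT commutativity (a sketch, not the
# paper's protocol). Assumes: pip install torch transformers
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased").eval()

inputs = tokenizer(
    "The quick brown fox jumps over the lazy dog.",
    return_tensors="pt",
)

# Hidden states of the unmodified model.
with torch.no_grad():
    baseline = model(**inputs).last_hidden_state

# Swap two adjacent encoder layers (indices 4 and 5 chosen arbitrarily)
# in the encoder's ModuleList.
layers = model.encoder.layer
layers[4], layers[5] = layers[5], layers[4]

# Hidden states after the swap.
with torch.no_grad():
    swapped = model(**inputs).last_hidden_state

# Per-token cosine similarity between original and swapped outputs;
# values near 1 suggest the layers roughly commute on this input.
sim = torch.nn.functional.cosine_similarity(baseline, swapped, dim=-1)
print(f"mean per-token cosine similarity after swap: {sim.mean().item():.4f}")
```

A more faithful test would evaluate downstream task performance after the swap; this snippet only compares raw hidden states, so treat it as a quick sanity check rather than a replication.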


Related Research

Pruning a BERT-based Question Answering Model (10/14/2019)
We investigate compressing a BERT-based question answering system by pru...

Do You Even Need Attention? A Stack of Feed-Forward Layers Does Surprisingly Well on ImageNet (05/06/2021)
The strong performance of vision transformers on image classification an...

Hopfield Networks is All You Need (07/16/2020)
We show that the transformer attention mechanism is the update rule of a...

DeepNet: Scaling Transformers to 1,000 Layers (03/01/2022)
In this paper, we propose a simple yet effective method to stabilize ext...

Self-attention in Vision Transformers Performs Perceptual Grouping, Not Attention (03/02/2023)
Recently, a considerable number of studies in computer vision involves d...

Thinking Like Transformers (06/13/2021)
What is the computational model behind a Transformer? Where recurrent ne...

BERTQA – Attention on Steroids (12/14/2019)
In this work, we extend the Bidirectional Encoder Representations from T...

Code Repositories

Investigate-BERT-Non-linearity-Commutativity

Investigates BERT's non-linearity and layer commutativity.

