ResMLP: Feedforward networks for image classification with data-efficient training

05/07/2021
by Hugo Touvron, et al.

We present ResMLP, an architecture built entirely upon multi-layer perceptrons for image classification. It is a simple residual network that alternates (i) a linear layer in which image patches interact, independently and identically across channels, and (ii) a two-layer feed-forward network in which channels interact independently per patch. When trained with a modern training strategy using heavy data-augmentation and optionally distillation, it attains surprisingly good accuracy/complexity trade-offs on ImageNet. We will share our code based on the Timm library and pre-trained models.
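
The alternating design described above maps directly onto a short block of code. The following is a minimal PyTorch sketch of one ResMLP residual block, not the authors' released implementation: the patch count, channel width, 4x expansion factor, and the use of LayerNorm (the paper actually replaces it with a simpler per-channel affine transformation) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ResMLPBlock(nn.Module):
    """Minimal sketch of one ResMLP residual block (dimensions illustrative)."""

    def __init__(self, num_patches: int, dim: int, expansion: int = 4):
        super().__init__()
        # (i) cross-patch sublayer: a single linear layer mixing patches,
        # applied identically and independently to every channel
        self.norm1 = nn.LayerNorm(dim)  # stand-in for the paper's affine layer
        self.patch_mix = nn.Linear(num_patches, num_patches)
        # (ii) per-patch sublayer: a two-layer feed-forward network mixing
        # channels, applied independently to every patch
        self.norm2 = nn.LayerNorm(dim)
        self.channel_mlp = nn.Sequential(
            nn.Linear(dim, expansion * dim),
            nn.GELU(),
            nn.Linear(expansion * dim, dim),
        )

    def forward(self, x):  # x: (batch, num_patches, dim)
        # transpose so the linear layer acts on the patch axis, then back
        y = self.patch_mix(self.norm1(x).transpose(1, 2)).transpose(1, 2)
        x = x + y                                # residual connection
        x = x + self.channel_mlp(self.norm2(x))  # residual connection
        return x

# usage: 196 patches (224x224 image, 16x16 patches) of dimension 384
block = ResMLPBlock(num_patches=196, dim=384)
out = block(torch.randn(2, 196, 384))  # -> (2, 196, 384)
```

Because the cross-patch sublayer is a single linear layer over the patch axis, its weights are shared across channels, which is what "independently and identically across channels" means in the abstract.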

Related research

05/06/2021
Do You Even Need Attention? A Stack of Feed-Forward Layers Does Surprisingly Well on ImageNet
The strong performance of vision transformers on image classification an...

01/03/2021
Few-shot Image Classification: Just Use a Library of Pre-trained Feature Extractors and a Simple Classifier
Recent papers have suggested that transfer learning can outperform sophi...

06/22/2021
LV-BERT: Exploiting Layer Variety for BERT
Modern pre-trained language models are mostly built upon backbones stack...

01/02/2017
Dynamic Deep Neural Networks: Optimizing Accuracy-Efficiency Trade-offs by Selective Execution
We introduce Dynamic Deep Neural Networks (D2NN), a new type of feed-for...

10/21/2017
Incomplete Dot Products for Dynamic Computation Scaling in Neural Network Inference
We propose the use of incomplete dot products (IDP) to dynamically adjus...

12/23/2020
Training data-efficient image transformers & distillation through attention
Recently, neural networks purely based on attention were shown to addres...

10/01/2021
ResNet strikes back: An improved training procedure in timm
The influential Residual Networks designed by He et al. remain the gold-...
