ConcatPlexer: Additional Dim1 Batching for Faster ViTs

08/22/2023
by Donghoon Han, et al.

Transformers have demonstrated tremendous success not only in the natural language processing (NLP) domain but also in the field of computer vision, igniting various creative approaches and applications. Yet the superior performance and modeling flexibility of transformers came with a severe increase in computation cost, and several works have therefore proposed methods to reduce this burden. Inspired by a cost-cutting method originally proposed for language models, Data Multiplexing (DataMUX), we propose a novel approach for efficient visual recognition that employs additional dim1 batching (i.e., concatenation), which greatly improves throughput with little compromise in accuracy. We first introduce a naive adaptation of DataMUX for vision models, the Image Multiplexer, and devise novel components to overcome its weaknesses, rendering our final model, ConcatPlexer, at the sweet spot between inference speed and accuracy. The ConcatPlexer was trained on the ImageNet1K and CIFAR100 datasets, achieving 23.5% and 83.4%, respectively.
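The core idea of dim1 batching can be illustrated with a short sketch: rather than stacking images along the batch axis, their patch-token sequences are concatenated along the sequence (dim 1) axis, so one forward pass processes several images as a single longer sequence. This is a minimal illustrative sketch with assumed shapes and helper names (`patchify`, `concat_multiplex`), not the authors' implementation.

```python
import numpy as np

def patchify(image, patch=4):
    """Split an (H, W, C) image into flattened patch tokens of shape
    (num_patches, patch * patch * C), ViT-style."""
    h, w, c = image.shape
    return (image.reshape(h // patch, patch, w // patch, patch, c)
                 .transpose(0, 2, 1, 3, 4)
                 .reshape(-1, patch * patch * c))

def concat_multiplex(images, patch=4):
    """Concatenate the per-image token sequences along the sequence axis
    (dim 1) so a single transformer pass sees all images at once."""
    token_seqs = [patchify(img, patch) for img in images]   # each (N, D)
    # (1, M * N, D): M images multiplexed into one longer sequence
    return np.concatenate(token_seqs, axis=0)[None, ...]

# Two 16x16 RGB images -> one sequence of 2 * 16 = 32 patch tokens
imgs = [np.random.rand(16, 16, 3) for _ in range(2)]
mux = concat_multiplex(imgs)  # shape (1, 32, 48)
```

With a transformer whose cost is roughly linear in the number of forward passes, multiplexing M images into one sequence trades a longer (and for attention, quadratically costlier) sequence for fewer passes; the paper's contribution lies in components that keep accuracy high under this trade-off.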

