Attentive Single-Tasking of Multiple Tasks

04/18/2019
by Kevis-Kokitsi Maninis, et al.

In this work we address task interference in universal networks by training a network on multiple tasks while having it perform only one task at a time, an approach we refer to as "single-tasking multiple tasks". The network modifies its behaviour through task-dependent feature adaptation, or task attention, which lets it accentuate the features relevant to the task at hand while shunning irrelevant ones. We further reduce task interference by forcing the task gradients to be statistically indistinguishable through adversarial training, ensuring that the common backbone serving all tasks is not dominated by any single task's gradients. Results on three multi-task dense labelling problems consistently show (i) a large reduction in the number of parameters while preserving, or even improving, performance and (ii) a smooth trade-off between computation and multi-task accuracy. We provide our system's code and pre-trained models at http://vision.ee.ethz.ch/~kmaninis/astmt/.
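The task-dependent feature adaptation described above can be sketched as per-task channel gating on a shared feature map: each task supplies an embedding that is projected to one sigmoid gate per channel, so the same backbone features are re-weighted differently for each task. This is a minimal numpy illustration of that idea, not the paper's implementation; the gate projection `W`, the task embeddings, and the function names are all hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def task_attention(features, task_embedding, W):
    """Per-task channel gating (squeeze-and-excitation-style sketch):
    project the task embedding to one gate per channel, then rescale
    the shared backbone features so each task accentuates the channels
    it needs and suppresses the rest."""
    gates = sigmoid(W @ task_embedding)        # (C,) gates in (0, 1)
    return features * gates[:, None, None]     # broadcast over H, W

# Toy example: one shared 4-channel feature map, two tasks with
# distinct (hypothetical) task embeddings.
rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 8, 8))         # shared features (C, H, W)
W = rng.standard_normal((4, 3))                # gate projection (C, D)
seg_emb = np.array([1.0, 0.0, 0.0])            # "segmentation" task code
depth_emb = np.array([0.0, 1.0, 0.0])          # "depth" task code

seg_feats = task_attention(feats, seg_emb, W)
depth_feats = task_attention(feats, depth_emb, W)
```

Running one task at a time through such gates is what lets a single backbone serve all tasks without the tasks' activations interfering at inference.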


Related research

Which Tasks Should Be Learned Together in Multi-task Learning? (05/18/2019)
Many computer vision applications require solving multiple tasks in real...

Reparameterizing Convolutions for Incremental Multi-Task Learning without Task Interference (07/24/2020)
Multi-task networks are commonly utilized to alleviate the need for a la...

12-in-1: Multi-Task Vision and Language Representation Learning (12/05/2019)
Much of vision-and-language research focuses on a small but diverse set ...

Regularizing Deep Multi-Task Networks using Orthogonal Gradients (12/14/2019)
Deep neural networks are a promising approach towards multi-task learnin...

Visual Prompting: Modifying Pixel Space to Adapt Pre-trained Models (03/31/2022)
Prompting has recently become a popular paradigm for adapting language m...

Mitigating Task Interference in Multi-Task Learning via Explicit Task Routing with Non-Learnable Primitives (08/03/2023)
Multi-task learning (MTL) seeks to learn a single model to accomplish mu...

NetTailor: Tuning the Architecture, Not Just the Weights (06/29/2019)
Real-world applications of object recognition often require the solution...
