Which Tasks Should Be Learned Together in Multi-task Learning?

05/18/2019
by Trevor Standley, et al.

Many computer vision applications require solving multiple tasks in real-time. A neural network can be trained to solve multiple tasks simultaneously using "multi-task learning". This saves computation at inference time, as only a single network needs to be evaluated. Unfortunately, this often leads to inferior overall performance as task objectives compete, which consequently poses the question: which tasks should and should not be learned together in one network when employing multi-task learning? We systematically study task cooperation and competition and propose a framework for assigning tasks to a few neural networks such that cooperating tasks are computed by the same neural network, while competing tasks are computed by different networks. Our framework offers a time-accuracy trade-off and can produce better accuracy using less inference time than not only a single large multi-task neural network but also many single-task networks.
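To make the task-assignment idea concrete, here is a minimal sketch of how one might pick a set of networks covering all tasks under an inference budget, given per-group accuracy estimates. The tasks, candidate groupings, costs, and accuracy numbers below are hypothetical illustrations, and the brute-force search is an assumption for clarity, not the paper's actual selection algorithm.

```python
from itertools import combinations

# Hypothetical example: 3 tasks and candidate networks trained on task subsets.
# Each candidate has an inference cost and assumed per-task accuracies.
TASKS = {"depth", "normals", "semantics"}

CANDIDATES = [
    # (tasks covered,                 cost, {task: accuracy})
    ({"depth"},                       1.0, {"depth": 0.80}),
    ({"normals"},                     1.0, {"normals": 0.78}),
    ({"semantics"},                   1.0, {"semantics": 0.75}),
    ({"depth", "normals"},            1.0, {"depth": 0.82, "normals": 0.80}),
    ({"depth", "semantics"},          1.0, {"depth": 0.79, "semantics": 0.73}),
    ({"normals", "semantics"},        1.0, {"normals": 0.76, "semantics": 0.74}),
    ({"depth", "normals", "semantics"}, 1.0,
     {"depth": 0.78, "normals": 0.75, "semantics": 0.72}),
]

def best_assignment(budget):
    """Exhaustively pick networks that cover every task within the
    inference budget, maximizing summed accuracy; each task uses its
    best-performing selected network."""
    best_score, best_choice = -1.0, None
    for r in range(1, len(CANDIDATES) + 1):
        for combo in combinations(CANDIDATES, r):
            cost = sum(c for _, c, _ in combo)
            covered = set().union(*(t for t, _, _ in combo))
            if cost > budget or covered != TASKS:
                continue
            # For each task, take the best accuracy among selected networks.
            score = sum(
                max(acc[t] for _, _, acc in combo if t in acc)
                for t in TASKS
            )
            if score > best_score:
                best_score, best_choice = score, combo
    return best_score, best_choice

if __name__ == "__main__":
    for budget in (1.0, 2.0, 3.0):
        score, choice = best_assignment(budget)
        groups = [sorted(t) for t, _, _ in choice]
        print(f"budget={budget}: groups={groups}, total accuracy={score:.2f}")
```

Varying the budget traces out the time-accuracy trade-off the abstract describes: a tight budget forces one shared network, while a looser budget lets competing tasks be split across separate networks.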
