Beyond Shared Hierarchies: Deep Multitask Learning through Soft Layer Ordering

10/31/2017
by Elliot Meyerson, et al.

Existing deep multitask learning (MTL) approaches align layers shared between tasks in a parallel ordering. Such an organization significantly constricts the types of shared structure that can be learned. The necessity of parallel ordering for deep MTL is first tested by comparing it with permuted ordering of shared layers. The results indicate that a flexible ordering can enable more effective sharing, thus motivating the development of a soft ordering approach, which learns how shared layers are applied in different ways for different tasks. Deep MTL with soft ordering outperforms parallel ordering methods across a series of domains. These results suggest that the power of deep MTL comes from learning highly general building blocks that can be assembled to meet the demands of each task.
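The soft ordering idea described above can be sketched in a few lines: each task keeps a learned weight tensor that, at every depth, mixes the outputs of all shared layers, so the same layers are reused in task-specific "soft" orders. The sketch below is a minimal illustration in NumPy, not the paper's implementation; all names, dimensions, and the tanh nonlinearity are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical sizes: K shared layers, applied across D depths, for T tasks,
# with hidden width H.
K, D, T, H = 3, 3, 2, 8

# Shared layer parameters (one weight matrix per shared layer).
shared_W = [rng.standard_normal((H, H)) * 0.1 for _ in range(K)]

# Per-task soft-ordering logits: s_logits[t, d, k] scores layer k at depth d
# for task t; a softmax over k turns them into mixing weights.
s_logits = rng.standard_normal((T, D, K))

def forward(x, task):
    """Soft-ordering forward pass: at each depth, apply every shared layer
    and combine the outputs with task-specific learned weights."""
    weights = softmax(s_logits[task], axis=-1)  # shape (D, K)
    h = x
    for d in range(D):
        outs = [np.tanh(h @ W) for W in shared_W]           # all shared layers
        h = sum(weights[d, k] * outs[k] for k in range(K))  # soft combination
    return h

x = rng.standard_normal((4, H))
y0, y1 = forward(x, task=0), forward(x, task=1)
print(y0.shape)  # each task reuses the same layers in a different soft order
```

With hard (one-hot) weights this reduces to a permuted ordering of the shared layers; letting the weights be continuous and learned is what makes the ordering "soft."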
