Fully-adaptive Feature Sharing in Multi-Task Networks with Applications in Person Attribute Classification

11/16/2016
by Yongxi Lu, et al.

Multi-task learning aims to improve the generalization performance of multiple prediction tasks by appropriately sharing relevant information across them. In the context of deep neural networks, this idea is often realized by hand-designed network architectures with layers that are shared across tasks and branches that encode task-specific features. However, the space of possible multi-task deep architectures is combinatorially large, and the final architecture is often arrived at by manual exploration of this space subject to the designer's bias, which can be both error-prone and tedious. In this work, we propose a principled approach for designing compact multi-task deep learning architectures. Our approach starts with a thin network and dynamically widens it in a greedy manner during training, using a novel criterion that promotes grouping of similar tasks together. Our extensive evaluation on person attribute classification tasks involving facial and clothing attributes suggests that the models produced by the proposed method are fast, compact, and can closely match or exceed the state-of-the-art accuracy of strong baselines that use much more expensive models.
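As a rough illustration of the grouping step implied by the abstract, the sketch below estimates how similar tasks are and then greedily merges them into a fixed number of branches. The affinity score used here (agreement of per-example error indicators) and all names (task_affinity, greedy_grouping, noisy_copy) are illustrative assumptions for this page, not the paper's actual criterion or code.

```python
import numpy as np

def task_affinity(errors: np.ndarray) -> np.ndarray:
    """errors: (num_tasks, num_examples) 0/1 matrix, 1 meaning the task got the
    example wrong. Tasks that tend to find the same examples easy or hard
    receive high affinity."""
    t = errors.shape[0]
    affinity = np.zeros((t, t))
    for i in range(t):
        for j in range(t):
            affinity[i, j] = np.mean(errors[i] == errors[j])
    return affinity

def greedy_grouping(affinity: np.ndarray, num_branches: int) -> list:
    """Greedily merge tasks into num_branches groups by repeatedly merging the
    pair of groups with the highest average cross-group affinity."""
    groups = [[i] for i in range(affinity.shape[0])]
    while len(groups) > num_branches:
        best_pair, best_score = None, -1.0
        for a in range(len(groups)):
            for b in range(a + 1, len(groups)):
                score = np.mean([affinity[i, j] for i in groups[a] for j in groups[b]])
                if score > best_score:
                    best_pair, best_score = (a, b), score
        a, b = best_pair
        groups[a].extend(groups[b])
        del groups[b]
    return groups

def noisy_copy(base: np.ndarray, flip_prob: float, rng) -> np.ndarray:
    """Flip each entry of a 0/1 vector with probability flip_prob."""
    flips = rng.random(base.shape) < flip_prob
    return np.where(flips, 1 - base, base)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Six synthetic tasks: tasks 0-2 share one error pattern, tasks 3-5 another.
    base_a = rng.integers(0, 2, size=200)
    base_b = rng.integers(0, 2, size=200)
    errors = np.stack([noisy_copy(base_a, 0.1, rng) for _ in range(3)] +
                      [noisy_copy(base_b, 0.1, rng) for _ in range(3)])
    groups = greedy_grouping(task_affinity(errors), num_branches=2)
    print(groups)  # expected: tasks {0, 1, 2} and {3, 4, 5} land in separate branches
```

In a full training loop, a grouping of this kind would decide where a shared layer is split into task-specific branches when the thin network is widened.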


Related research

Cross-stitch Networks for Multi-task Learning (04/12/2016)
Multi-task learning in Convolutional Networks has displayed remarkable s...

Integrated perception with recurrent multi-task neural networks (06/06/2016)
Modern discriminative predictors have been shown to match natural intell...

GNAS: A Greedy Neural Architecture Search Method for Multi-Attribute Learning (04/19/2018)
A key problem in deep multi-attribute learning is to effectively discove...

Feature Partitioning for Efficient Multi-Task Architectures (08/12/2019)
Multi-task learning holds the promise of less data, parameters, and time...

Deep multi-task mining Calabi-Yau four-folds (08/04/2021)
We continue earlier efforts in computing the dimensions of tangent space...

Branched Multi-Task Networks: Deciding What Layers To Share (04/05/2019)
In the context of deep learning, neural networks with multiple branches ...

Attributes for Improved Attributes: A Multi-Task Network for Attribute Classification (04/25/2016)
Attributes, or semantic features, have gained popularity in the past few...
