Stochastic Filter Groups for Multi-Task CNNs: Learning Specialist and Generalist Convolution Kernels

08/26/2019
by   Felix J. S. Bragman, et al.

The performance of multi-task learning in Convolutional Neural Networks (CNNs) hinges on the design of feature sharing between tasks within the architecture. The number of possible sharing patterns is combinatorial in the depth of the network and the number of tasks, so hand-crafting an architecture based purely on human intuition about task relationships can be time-consuming and suboptimal. In this paper, we present a probabilistic approach to learning task-specific and shared representations in CNNs for multi-task learning. Specifically, we propose "stochastic filter groups" (SFG), a mechanism that assigns the convolution kernels in each layer to "specialist" or "generalist" groups, which are specific to or shared across different tasks, respectively. The SFG modules determine the connectivity between layers and the structures of the task-specific and shared representations in the network. We employ variational inference to learn the posterior distribution over the possible groupings of kernels together with the network parameters. Experiments demonstrate that the proposed method generalises across multiple tasks and improves performance over baseline methods.
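To make the grouping mechanism concrete, the sketch below shows how a single SFG-style convolution layer for two tasks could look in PyTorch. This is not the authors' implementation: the paper learns a posterior over the kernel groupings with variational inference, whereas the sketch substitutes a simple Gumbel-Softmax relaxation of the per-filter group assignment; the class name, parameter names, and shapes are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StochasticFilterGroupConv(nn.Module):
    """Illustrative sketch of a stochastic-filter-group convolution for two tasks.

    Each output filter carries learnable logits over three groups:
    task-1 specialist, task-2 specialist, or shared (generalist). A relaxed
    group assignment is sampled per forward pass, and each task only sees
    the filters assigned to its specialist group or to the shared group.
    """

    def __init__(self, in_channels, out_channels, kernel_size=3, tau=1.0):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                              padding=kernel_size // 2)
        # Unnormalised log-probabilities of each filter's group assignment:
        # columns = [task-1 specialist, task-2 specialist, shared].
        self.group_logits = nn.Parameter(torch.zeros(out_channels, 3))
        self.tau = tau

    def forward(self, x):
        # Differentiable (soft one-hot) sample of a group per filter.
        assign = F.gumbel_softmax(self.group_logits, tau=self.tau)  # (C_out, 3)
        feats = self.conv(x)                                        # (N, C_out, H, W)
        # Each task keeps its specialist filters plus the shared filters.
        mask_t1 = (assign[:, 0] + assign[:, 2]).view(1, -1, 1, 1)
        mask_t2 = (assign[:, 1] + assign[:, 2]).view(1, -1, 1, 1)
        return feats * mask_t1, feats * mask_t2


# Usage sketch: one such layer feeding two task-specific feature streams.
layer = StochasticFilterGroupConv(in_channels=16, out_channels=32)
x = torch.randn(2, 16, 64, 64)
feat_task1, feat_task2 = layer(x)
```

Stacking such layers would let the learned assignment probabilities determine, per layer, how many kernels end up task-specific versus shared, rather than fixing that split by hand.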



Related research

04/12/2016
Cross-stitch Networks for Multi-task Learning
Multi-task learning in Convolutional Networks has displayed remarkable s...

03/23/2020
Learned Weight Sharing for Deep Multi-Task Learning by Natural Evolution Strategy and Stochastic Gradient Descent
In deep multi-task learning, weights of task-specific networks are share...

10/28/2017
Multi-Task Learning by Deep Collaboration and Application in Facial Landmark Detection
Convolutional neural networks (CNN) have become the most successful and ...

01/25/2018
NDDR-CNN: Layer-wise Feature Fusing in Multi-Task CNN by Neural Discriminative Dimensionality Reduction
State-of-the-art Convolutional Neural Network (CNN) benefits a lot from ...

07/31/2017
Convolution with Logarithmic Filter Groups for Efficient Shallow CNN
In convolutional neural networks (CNNs), the filter grouping in convolut...

01/22/2021
Network Clustering for Multi-task Learning
The Multi-Task Learning (MTL) technique has been widely studied by word-...

01/29/2021
Learning Twofold Heterogeneous Multi-Task by Sharing Similar Convolution Kernel Pairs
Heterogeneous multi-task learning (HMTL) is an important topic in multi-...
