Co-Clustering for Multitask Learning

03/03/2017
by Keerthiram Murugesan, et al.

This paper presents a new multitask learning framework that learns a shared representation among the tasks, incorporating both task and feature clusters. The jointly induced clusters yield a shared latent subspace in which task relationships are learned more effectively and more generally than in state-of-the-art multitask learning methods. The generality of the proposed framework means that several more specific or restricted state-of-the-art multitask methods can be derived as special cases of it. The paper also proposes a highly scalable multitask learning algorithm, based on the new framework, that uses conjugate gradient descent and generalized Sylvester equations. Experimental results on synthetic and benchmark datasets show that the proposed method systematically outperforms several state-of-the-art multitask learning methods.
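The abstract does not spell out the update equations, but the core computational step it names, solving a (generalized) Sylvester equation for the task-weight matrix, can be illustrated in a minimal, self-contained sketch. The function name and the use of the Kronecker/vec identity below are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np

def solve_sylvester(A, B, C):
    """Illustrative solver for the Sylvester equation A @ W + W @ B = C.

    Uses the vec identity (column-major): vec(A W + W B)
    = (I_n kron A + B^T kron I_m) vec(W), then solves the resulting
    dense linear system. Fine for small examples; real implementations
    (as the paper suggests) would use iterative methods such as
    conjugate gradient for scalability.
    """
    m, n = C.shape
    K = np.kron(np.eye(n), A) + np.kron(B.T, np.eye(m))
    w = np.linalg.solve(K, C.reshape(-1, order="F"))
    return w.reshape(m, n, order="F")

# Hypothetical usage: A couples features, B couples tasks, C collects
# the data-dependent terms; W is the recovered task-weight matrix.
A = np.array([[2.0, 0.0], [0.0, 3.0]])
B = np.array([[1.0, 0.0], [0.0, 2.0]])
W_true = np.array([[1.0, 2.0], [3.0, 4.0]])
C = A @ W_true + W_true @ B
W = solve_sylvester(A, B, C)
```

In a multitask setting of this kind, the left coefficient typically encodes feature-side structure (e.g. feature clusters) and the right coefficient task-side structure (e.g. task clusters), so solving one matrix equation couples both at once.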

