Bayesian Multitask Learning with Latent Hierarchies

08/09/2014
by Hal Daumé III, et al.

We learn multiple hypotheses for related tasks under a latent hierarchical relationship between tasks. We exploit the intuition that for domain adaptation, we wish to share classifier structure, but for multitask learning, we wish to share covariance structure. Our hierarchical model is seen to subsume several previously proposed multitask learning models and performs well on three distinct real-world data sets.
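To make the idea of sharing structure across tasks concrete, here is a minimal sketch (not the paper's model, which learns a latent hierarchy over tasks): MAP estimation under a simple two-level hierarchical Gaussian prior, where each task's weight vector is shrunk toward a shared parent. All names, variances, and the synthetic data below are illustrative assumptions.

```python
# Minimal sketch (assumption, not the paper's method): two-level hierarchical
# Gaussian prior over per-task linear-regression weights,
#   w_t ~ N(w_0, tau^2 I),  y ~ N(X w_t, sigma^2 I),
# fit by coordinate ascent on the joint MAP objective.
import numpy as np

rng = np.random.default_rng(0)
d, n_tasks, n_per_task = 5, 4, 50

# Synthetic related tasks: each task's true weights are a noisy copy of a parent.
w_parent_true = rng.normal(size=d)
tasks = []
for _ in range(n_tasks):
    w_true = w_parent_true + 0.3 * rng.normal(size=d)
    X = rng.normal(size=(n_per_task, d))
    y = X @ w_true + 0.1 * rng.normal(size=n_per_task)
    tasks.append((X, y))

sigma2, tau2 = 0.1 ** 2, 0.3 ** 2   # likelihood and prior variances (assumed known here)
w0 = np.zeros(d)                    # shared parent weight vector
W = np.zeros((n_tasks, d))          # per-task weight vectors

for _ in range(20):                 # alternate between task-level and parent-level updates
    for t, (X, y) in enumerate(tasks):
        # Ridge-style update: shrink task weights toward the current parent w0.
        A = X.T @ X / sigma2 + np.eye(d) / tau2
        b = X.T @ y / sigma2 + w0 / tau2
        W[t] = np.linalg.solve(A, b)
    w0 = W.mean(axis=0)             # parent update: average of task weights (flat hyperprior)

print("recovered parent:", np.round(w0, 2))
print("true parent:     ", np.round(w_parent_true, 2))
```

The paper's model goes further by placing the tasks in a learned hierarchy and by sharing covariance (not just mean) structure; the sketch above only illustrates the basic mechanism of coupling tasks through a shared prior.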


