Asymptotic Bayes risk of semi-supervised multitask learning on Gaussian mixture

03/03/2023
by Minh-Toan Nguyen et al.

This article considers semi-supervised multitask learning on a Gaussian mixture model (GMM). Using methods from statistical physics, we compute the asymptotic Bayes risk of each task in the regime of large, high-dimensional datasets. From this we analyze the role of task similarity in learning and quantify the performance gain when tasks are learned together rather than separately. In the supervised case, we derive a simple algorithm that attains the Bayes-optimal performance.
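For intuition, consider the simplest instance of this setting: a symmetric two-class Gaussian mixture with a known class mean, where the Bayes-optimal classifier is a linear rule and the Bayes risk has a closed form. The sketch below is a toy illustration only, not the paper's multitask algorithm; the dimension, sample size, and signal strength are arbitrary choices. It compares the empirical error of the optimal rule against the theoretical risk Φ(−‖μ‖).

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

d, n = 50, 10_000                    # dimension, number of samples (arbitrary)
mu = np.full(d, 1.0 / np.sqrt(d))    # class mean with ||mu|| = 1

# Draw labels y = ±1 and features x = y * mu + standard Gaussian noise
y = rng.choice([-1, 1], size=n)
x = y[:, None] * mu + rng.standard_normal((n, d))

# With mu known, the Bayes-optimal rule is sign(<mu, x>)
y_hat = np.sign(x @ mu)
empirical_risk = np.mean(y_hat != y)

# Closed-form Bayes risk for this mixture: Phi(-||mu||) ≈ 0.159 here
theoretical_risk = norm.cdf(-np.linalg.norm(mu))
```

With 10,000 samples the empirical error concentrates tightly around the theoretical value; the paper's contribution is the analogous (but far less elementary) asymptotic risk formula when the mean must be learned across several related tasks from labeled and unlabeled data.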

Related research

07/08/2019 · Asymptotic Bayes risk for Gaussian mixture in a semi-supervised setting
Semi-supervised learning (SSL) uses unlabeled data for training and has ...

11/09/2017 · A random matrix analysis and improvement of semi-supervised learning for large dimensional data
This article provides an original understanding of the behavior of a cla...

04/10/2019 · Multitask Hopfield Networks
Multitask algorithms typically use task similarity information as a bias...

11/25/2019 · Detecting Unknown Behaviors by Pre-defined Behaviours: An Bayesian Non-parametric Approach
An automatic mouse behavior recognition system can considerably reduce t...

12/28/2015 · Outlier Detection In Large-scale Traffic Data By Naïve Bayes Method and Gaussian Mixture Model Method
It is meaningful to detect outliers in traffic data for traffic manageme...

12/30/2019 · Semi-Supervised Learning with Normalizing Flows
Normalizing flows transform a latent distribution through an invertible ...

12/13/2017 · A Quantum Extension of Variational Bayes Inference
Variational Bayes (VB) inference is one of the most important algorithms...
