On the Sample Complexity of HGR Maximal Correlation Functions

06/30/2019
by Shao-Lun Huang, et al.

The Hirschfeld-Gebelein-Rényi (HGR) maximal correlation and the corresponding functions have been shown to be useful in many machine learning scenarios. In this paper, we study the sample complexity of estimating the HGR maximal correlation functions with the alternating conditional expectation (ACE) algorithm from a sequence of training samples in the asymptotic regime. Specifically, we develop a mathematical framework to characterize the learning error between the maximal correlation functions computed from the true distribution and the functions estimated by the ACE algorithm. For both supervised and semi-supervised learning scenarios, we establish analytical expressions for the error exponents of the learning error, which indicate the number of training samples required to estimate the HGR maximal correlation functions by the ACE algorithm. Moreover, using these theoretical results, we investigate the sampling strategy for different types of samples in semi-supervised learning under a total sampling budget constraint, and develop an optimal sampling strategy that maximizes the error exponent of the learning error. Finally, numerical simulations are presented to support our theoretical results.
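To make the estimation procedure concrete, the following is a minimal sketch (not taken from the paper) of the ACE iteration for two finite-alphabet variables, run on the empirical joint distribution of paired samples. The function name ace_maximal_correlation, the integer coding of the symbols, and the fixed iteration count are illustrative assumptions.

```python
import numpy as np

def ace_maximal_correlation(x, y, num_iters=100, seed=0):
    """Sketch of the alternating conditional expectation (ACE) iteration for
    estimating the HGR maximal correlation and its feature functions from
    paired samples of two finite-alphabet variables.

    Assumes x and y are integer-coded and that every symbol in
    {0, ..., max} appears at least once in the data (illustrative assumption).
    """
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    nx, ny = x.max() + 1, y.max() + 1

    # Empirical joint distribution P_hat(x, y) and its marginals.
    pxy = np.zeros((nx, ny))
    np.add.at(pxy, (x, y), 1.0 / n)
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)

    # Random initialization of g(y).
    rng = np.random.default_rng(seed)
    g = rng.standard_normal(ny)

    for _ in range(num_iters):
        # f(x) <- E[g(Y) | X = x], then center and normalize under P_hat(x).
        f = (pxy @ g) / px
        f -= px @ f
        f /= np.sqrt(px @ f**2)
        # g(y) <- E[f(X) | Y = y], then center and normalize under P_hat(y).
        g = (pxy.T @ f) / py
        g -= py @ g
        g /= np.sqrt(py @ g**2)

    # Estimate of E[f(X) g(Y)], i.e., the HGR maximal correlation.
    rho_hat = f @ pxy @ g
    return rho_hat, f, g
```

After convergence, f and g approximate the leading non-constant singular-function pair of the empirical distribution, and rho_hat estimates the HGR maximal correlation; the learning error studied above measures how far such empirically estimated functions deviate from those computed from the true distribution.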
