A Perturbation Resilient Framework for Unsupervised Learning

12/14/2020
by Andreas Maurer, et al.

Designing learning algorithms that are resistant to perturbations of the underlying data distribution is a problem of wide practical and theoretical importance. We present a general approach to this problem, focusing on unsupervised learning. The key assumption is that the perturbing distribution is characterized by larger losses relative to a given class of admissible models. This is exploited by a general descent algorithm which minimizes an L-statistic criterion over the model class, giving greater weight to small losses. We characterize the robustness of the method in terms of bounds on the reconstruction error for the assumed unperturbed distribution. Numerical experiments with k-means clustering and principal subspace analysis demonstrate the effectiveness of our method.
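The abstract does not spell out the algorithm, but the idea of minimizing an L-statistic that favors small losses can be illustrated with a minimal, hypothetical sketch for the k-means case. Here the L-statistic is taken to be a simple trimmed mean of the per-point losses (equivalently, 0/1 weights on the order statistics); the function name trimmed_kmeans and the parameter keep_frac are illustrative assumptions, not the authors' method or code.

import numpy as np

def trimmed_kmeans(X, k, keep_frac=0.9, n_iter=50, seed=0):
    """Alternate between (i) assigning points to their nearest centroid,
    (ii) keeping only the keep_frac fraction of points with the smallest
    losses (a trimmed-mean L-statistic), and (iii) updating centroids on
    the kept points. Points with large losses, attributed here to the
    perturbing distribution, are thus down-weighted to zero."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    m = int(keep_frac * len(X))  # number of smallest losses kept
    for _ in range(n_iter):
        # squared distance of each point to every centroid
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        assign = d2.argmin(axis=1)
        loss = d2[np.arange(len(X)), assign]
        kept = np.argsort(loss)[:m]  # smallest losses receive weight one
        for j in range(k):
            pts = X[kept][assign[kept] == j]
            if len(pts) > 0:
                centers[j] = pts.mean(axis=0)
    return centers

# Usage sketch: data drawn mostly from two clusters plus a small perturbing
# component; the trimmed criterion ignores the largest losses.
X = np.vstack([np.random.randn(200, 2), np.random.randn(20, 2) * 5 + 10])
centers = trimmed_kmeans(X, k=2)

The same template applies to principal subspace analysis by replacing the per-point loss with the squared residual of each point after projection onto a candidate subspace.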
