A Non-generative Framework and Convex Relaxations for Unsupervised Learning

10/04/2016
by   Elad Hazan, et al.

We give a novel formal theoretical framework for unsupervised learning with two distinctive characteristics. First, it does not assume any generative model and is based on a worst-case performance metric. Second, it is comparative: performance is measured with respect to a given hypothesis class. This makes it possible to sidestep known computational hardness results and to use improper algorithms based on convex relaxations. We show how several families of unsupervised learning models, which were previously analyzed only under probabilistic assumptions and are otherwise provably intractable, can be efficiently learned in our framework by convex optimization.
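The abstract does not spell out the paper's algorithms, but a minimal sketch can convey the general flavor of "improper learning via convex relaxation": replace a nonconvex constraint (here, the rank constraint of PCA-style reconstruction) with its convex surrogate, the nuclear norm, whose proximal step has a closed form via singular-value soft-thresholding. The function name, the threshold tau, and the toy data below are illustrative assumptions of ours, not taken from the paper.

    import numpy as np

    def nuclear_norm_denoise(X, tau):
        # Singular-value soft-thresholding: the proximal operator of the
        # nuclear norm, which is a convex surrogate for rank. It solves
        #     min_Z  0.5 * ||X - Z||_F^2 + tau * ||Z||_*
        # in closed form. (Illustrative sketch, not the paper's method.)
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        s_thresh = np.maximum(s - tau, 0.0)   # shrink singular values
        return (U * s_thresh) @ Vt

    # Toy data: noisy observations of a rank-2 signal (hypothetical sizes).
    rng = np.random.default_rng(0)
    n, d, r = 200, 50, 2
    signal = rng.standard_normal((n, r)) @ rng.standard_normal((r, d))
    X = signal + 0.1 * rng.standard_normal((n, d))

    Z = nuclear_norm_denoise(X, tau=5.0)
    print("relative error:", np.linalg.norm(Z - signal) / np.linalg.norm(signal))

The learned Z generally has rank larger than r, which is what makes the approach "improper" in the sense the abstract uses: the output need not lie in the original hypothesis class, only compete with it, and that relaxation is what buys a convex, efficiently solvable problem.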
