Smoothed Analysis in Unsupervised Learning via Decoupling

11/29/2018
by Aditya Bhaskara, et al.

Smoothed analysis is a powerful paradigm for overcoming worst-case intractability in unsupervised learning and high-dimensional data analysis. While polynomial-time smoothed analysis guarantees have been obtained for worst-case intractable problems like tensor decompositions and learning mixtures of Gaussians, such guarantees have been hard to obtain for several other important problems in unsupervised learning. A core technical challenge is obtaining lower bounds on the least singular value of random matrix ensembles with dependent entries that are given by low-degree polynomials of a few underlying base random variables. In this work, we address this challenge by obtaining high-confidence lower bounds on the least singular value of new classes of structured random matrix ensembles of this kind. We then use these bounds to obtain polynomial-time smoothed analysis guarantees for the following three important problems in unsupervised learning:

1. Robust subspace recovery, when the fraction α of inliers in the d-dimensional subspace T ⊂ R^n satisfies α > (d/n)^ℓ for any constant integer ℓ > 0. This contrasts with the known worst-case intractability when α < d/n, and with the previous smoothed analysis result, which required α > d/n (Hardt and Moitra, 2013).

2. Higher-order tensor decompositions, where we generalize the so-called FOOBI algorithm of Cardoso to find order-ℓ rank-one tensors in a subspace. This allows us to obtain polynomially robust decomposition algorithms for 2ℓ-th order tensors of rank O(n^ℓ).

3. Learning overcomplete hidden Markov models, where the size of the state space is any polynomial in the dimension of the observations. This gives the first polynomial-time guarantees for learning overcomplete HMMs in a smoothed analysis model.
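To make the core technical statement more concrete, below is a minimal numerical sketch, not the paper's construction, of the phenomenon the abstract alludes to: a matrix whose entries are low-degree polynomials of smoothed base vectors (here, flattened ℓ-fold tensor powers) typically has its least singular value bounded away from zero after random perturbation. All parameter choices (n, m, ell, sigma) and the helper name least_singular_value are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def least_singular_value(n=10, m=40, ell=2, sigma=0.1):
    """Smallest singular value of an n^ell x m matrix whose columns are
    vec(a_i^{⊗ ell}) for smoothed vectors a_i = base_i + sigma * noise_i.
    Illustrative sketch only; not the ensemble analyzed in the paper."""
    base = rng.standard_normal((m, n))                      # arbitrary "worst-case" base vectors
    smoothed = base + sigma * rng.standard_normal((m, n))   # Gaussian perturbation (the smoothing)
    cols = []
    for a in smoothed:
        t = a
        for _ in range(ell - 1):
            t = np.outer(t, a).ravel()                      # build the flattened tensor power a^{⊗ ell}
        cols.append(t)
    M = np.column_stack(cols)                               # entries are degree-ell polynomials of the a_i
    return np.linalg.svd(M, compute_uv=False)[-1]

# With n = 10 and ell = 2 the columns live in R^100, so even m = 40 > n columns
# are robustly linearly independent with high probability after smoothing.
print(least_singular_value())
```

Although the columns here are overcomplete (m > n), the smoothing makes them robustly linearly independent in the lifted space; quantitative versions of such least-singular-value bounds are what the paper's guarantees rely on.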


Related research

Efficient Tensor Decomposition (07/30/2020)
This chapter studies the problem of decomposing a tensor into a sum of c...

Perturbation Bounds for Orthogonally Decomposable Tensors and Their Applications in High Dimensional Data Analysis (07/17/2020)
We develop deterministic perturbation bounds for singular values and vec...

Computing linear sections of varieties: quantum entanglement, tensor decompositions and beyond (12/07/2022)
We study the problem of finding elements in the intersection of an arbit...

Smoothed Analysis of Tensor Decompositions (11/14/2013)
Low rank tensor decompositions are a powerful tool for learning generati...

Polynomial-Time Power-Sum Decomposition of Polynomials (07/30/2022)
We give efficient algorithms for finding power-sum decomposition of an i...

A Non-generative Framework and Convex Relaxations for Unsupervised Learning (10/04/2016)
We give a novel formal theoretical framework for unsupervised learning w...

Approximately Optimal Core Shapes for Tensor Decompositions (02/08/2023)
This work studies the combinatorial optimization problem of finding an o...
