Efficient Tensor Decomposition

07/30/2020
by Aravindan Vijayaraghavan, et al.

This chapter studies the problem of decomposing a tensor into a sum of constituent rank-one tensors. While tensor decompositions are very useful in designing learning algorithms and in data analysis, computing them is NP-hard in the worst case. We will see how to design efficient algorithms with provable guarantees under mild assumptions, using beyond-worst-case frameworks such as smoothed analysis.
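To make the problem concrete, the following is a minimal NumPy sketch of Jennrich's simultaneous-diagonalization algorithm, a classical method for recovering the rank-one components of a third-order tensor when its factor matrices are generic (linearly independent factors, distinct eigenvalue ratios). The tensor, dimensions, and variable names here are illustrative, not taken from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 8, 3

# Synthetic rank-r tensor T = sum_i a_i (x) b_i (x) c_i with generic factors.
A = rng.standard_normal((n, r))
B = rng.standard_normal((n, r))
C = rng.standard_normal((n, r))
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# Contract the third mode with two random vectors x, y:
#   Tx = A diag(C^T x) B^T,   Ty = A diag(C^T y) B^T.
x, y = rng.standard_normal(n), rng.standard_normal(n)
Tx = np.einsum('ijk,k->ij', T, x)
Ty = np.einsum('ijk,k->ij', T, y)

# Tx pinv(Ty) = A diag(lambda) pinv(A), so its top-r eigenvectors are the
# columns a_i (up to scaling and order), provided the lambdas are distinct.
evals, evecs = np.linalg.eig(Tx @ np.linalg.pinv(Ty))
top = np.argsort(-np.abs(evals))[:r]
A_hat = evecs[:, top].real

# Contracting pinv(A_hat) against the first mode splits T into r slices,
# each a rank-one matrix proportional to b_i c_i^T; read the factors off
# the top singular pair of each slice.
R = np.einsum('ri,ijk->rjk', np.linalg.pinv(A_hat), T)
B_hat = np.empty((n, r))
C_hat = np.empty((n, r))
for i in range(r):
    U, S, Vt = np.linalg.svd(R[i])
    B_hat[:, i] = U[:, 0] * S[0]
    C_hat[:, i] = Vt[0]

# Reconstruct the tensor from the recovered rank-one components.
T_hat = np.einsum('ir,jr,kr->ijk', A_hat, B_hat, C_hat)
err = np.abs(T_hat - T).max()
```

The per-component scaling ambiguity cancels in the reconstruction, so `err` should be near machine precision for generic inputs; smoothed analysis, discussed in the chapter, explains why such genericity assumptions are mild in practice.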


