Analysis of the Stochastic Alternating Least Squares Method for the Decomposition of Random Tensors

by Yanzhao Cao, et al.

Stochastic Alternating Least Squares (SALS) is a method that approximates the canonical decomposition of averages of sampled random tensors. Its simplicity and efficient memory usage make SALS an ideal tool for decomposing tensors in an online setting. We show, under mild regularization and readily verifiable assumptions on the boundedness of the data, that the SALS algorithm is globally convergent. Numerical experiments validate our theoretical findings and demonstrate the algorithm's performance and complexity.
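To make the idea concrete, here is a minimal sketch of a stochastic ALS iteration for a third-order CP decomposition: at each sweep a fresh random tensor is sampled and each factor matrix is updated by a Tikhonov-regularized least-squares solve, so the factors track the CP decomposition of the expected tensor. This is an illustrative NumPy sketch under assumed conventions (the function names `khatri_rao`, `regularized_ls`, and `sals`, the sampling interface, and the regularization value are ours), not the authors' reference implementation.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product: (I x R), (J x R) -> (I*J x R),
    with row index i*J + j, i.e. columnwise kron(a_r, b_r)."""
    return (A[:, None, :] * B[None, :, :]).reshape(-1, A.shape[1])

def regularized_ls(X, KR, reg):
    """Solve min_F ||X - F @ KR.T||_F^2 + reg * ||F||_F^2 in closed form."""
    R = KR.shape[1]
    G = KR.T @ KR + reg * np.eye(R)          # regularized Gram matrix
    return np.linalg.solve(G, KR.T @ X.T).T

def sals(sample_tensor, rank, n_iters=100, reg=1e-3, seed=0):
    """Stochastic ALS sketch: each sweep draws a fresh sampled tensor and
    performs one regularized ALS pass over the three factor matrices."""
    rng = np.random.default_rng(seed)
    I, J, K = sample_tensor(rng).shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iters):
        T = sample_tensor(rng)                         # fresh random sample
        # Mode unfoldings consistent with C-order reshape and khatri_rao above:
        T1 = T.reshape(I, J * K)                       # ~ A @ khatri_rao(B, C).T
        T2 = T.transpose(1, 0, 2).reshape(J, I * K)    # ~ B @ khatri_rao(A, C).T
        T3 = T.transpose(2, 0, 1).reshape(K, I * J)    # ~ C @ khatri_rao(A, B).T
        A = regularized_ls(T1, khatri_rao(B, C), reg)
        B = regularized_ls(T2, khatri_rao(A, C), reg)
        C = regularized_ls(T3, khatri_rao(A, B), reg)
    return A, B, C

# Example: recover a rank-2 CP structure from noisy samples of a fixed tensor.
rng0 = np.random.default_rng(42)
A_true = rng0.standard_normal((5, 2))
B_true = rng0.standard_normal((6, 2))
C_true = rng0.standard_normal((7, 2))
T_true = np.einsum('ir,jr,kr->ijk', A_true, B_true, C_true)

def sample_tensor(rng):
    # Random tensors whose mean is the low-rank target.
    return T_true + 0.01 * rng.standard_normal(T_true.shape)

A, B, C = sals(sample_tensor, rank=2)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
rel_err = np.linalg.norm(T_hat - T_true) / np.linalg.norm(T_true)
```

Because each sweep fits the current noisy sample exactly (up to regularization), the factors fluctuate at the noise level rather than converging exactly; the mild regularization plays the same stabilizing role as in the paper's assumptions.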



