
Stochastic Gradients for Large-Scale Tensor Decomposition

by Tamara G. Kolda et al.

Tensor decomposition is a well-known tool for multiway data analysis. This work proposes using stochastic gradients for efficient generalized canonical polyadic (GCP) tensor decomposition of large-scale tensors. GCP tensor decomposition is a recently proposed version of tensor decomposition that allows for a variety of loss functions such as logistic loss for binary data or Huber loss for robust estimation. The stochastic gradient is formed from randomly sampled elements of the tensor. For dense tensors, we simply use uniform sampling. For sparse tensors, we propose two types of stratified sampling that give precedence to sampling nonzeros. Numerical results demonstrate the advantages of the proposed approach and its scalability to large-scale problems.
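The abstract's core idea, forming a stochastic gradient from randomly sampled tensor elements, can be illustrated with a minimal sketch. The snippet below fits a rank-R CP model to a dense 3-way tensor using uniform entry sampling and a least-squares loss (one of the losses the GCP framework supports). The function name, hyperparameters, and update scheme are illustrative assumptions, not the authors' GCP-OPT implementation.

```python
import numpy as np

def gcp_sgd(X, rank, num_iters=5000, batch=32, lr=0.005, seed=0):
    """Stochastic-gradient CP fit of a dense 3-way tensor (sketch).

    Uses least-squares loss f(x, m) = (m - x)^2 and uniform sampling
    of entries, as the abstract suggests for dense tensors. This is an
    illustrative sketch, not the authors' GCP-OPT code.
    """
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    # Small random initialization of the factor matrices A, B, C.
    A = rng.standard_normal((I, rank)) * 0.1
    B = rng.standard_normal((J, rank)) * 0.1
    C = rng.standard_normal((K, rank)) * 0.1
    for _ in range(num_iters):
        # Uniformly sample a batch of entry indices (dense case).
        ii = rng.integers(0, I, batch)
        jj = rng.integers(0, J, batch)
        kk = rng.integers(0, K, batch)
        # Model values m_ijk = sum_r A[i,r] * B[j,r] * C[k,r].
        m = np.sum(A[ii] * B[jj] * C[kk], axis=1)
        # Elementwise loss derivative dF/dm for least squares.
        dldm = 2.0 * (m - X[ii, jj, kk])
        # Stochastic gradients for the sampled rows of each factor.
        gA = dldm[:, None] * (B[jj] * C[kk])
        gB = dldm[:, None] * (A[ii] * C[kk])
        gC = dldm[:, None] * (A[ii] * B[jj])
        # Unbuffered in-place updates handle repeated indices correctly.
        np.subtract.at(A, ii, lr * gA)
        np.subtract.at(B, jj, lr * gB)
        np.subtract.at(C, kk, lr * gC)
    return A, B, C
```

Swapping the two derivative lines (model value and `dldm`) for another elementwise loss, e.g. logistic loss for binary data or Huber loss for robust estimation, recovers the generality of GCP; the stratified nonzero sampling proposed for sparse tensors would replace the uniform index draws.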
