Weighted Tensor Decomposition for Learning Latent Variables with Partial Data

10/18/2017
by Omer Gottesman, et al.

Tensor decomposition methods are popular tools for learning latent variables given only lower-order moments of the data. However, the standard assumption is that we have sufficient data to estimate these moments to high accuracy. In this work, we consider the case in which certain dimensions of the data are not always observed---common in applied settings, where not all measurements may be taken for all observations---resulting in moment estimates of varying quality. We derive a weighted tensor decomposition approach that is computationally as efficient as the non-weighted approach, and demonstrate that it outperforms methods that do not appropriately leverage these less-observed dimensions.
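As a rough illustration of the idea (not the authors' algorithm, which the abstract says matches the efficiency of the unweighted method), the sketch below fits a symmetric rank-k CP model to a noisy third-order moment tensor by gradient descent on a weighted squared-error objective. Moment entries that involve a rarely observed dimension get smaller weights, reflecting their noisier estimates. All dimensions, rates, and step sizes here are hypothetical choices for the toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic third-order moment tensor with symmetric CP structure:
# T[i,k,l] = sum_j mu[i,j] * mu[k,j] * mu[l,j]  (k latent components).
d, k = 5, 2
mu = rng.normal(size=(d, k))
T = np.einsum('ij,kj,lj->ikl', mu, mu, mu)

# Hypothetical per-dimension observation rates: dimension 0 is measured
# in only 20% of samples, so moment entries involving it are noisier.
obs = np.array([0.2, 1.0, 1.0, 1.0, 1.0])
W = np.einsum('i,j,k->ijk', obs, obs, obs)  # entry-wise confidence weights

# Noisy empirical moments: noise scale grows as the weight shrinks.
T_hat = T + rng.normal(size=T.shape) * 0.05 / np.sqrt(W)

def weighted_loss(A):
    """Weighted squared error: sum_ikl W * (T_hat - [[A, A, A]])^2."""
    M = np.einsum('ij,kj,lj->ikl', A, A, A)
    return float(np.sum(W * (M - T_hat) ** 2))

# Fit the factor matrix A by gradient descent on the weighted objective.
A = rng.normal(size=(d, k)) * 0.3
loss_start = weighted_loss(A)
lr = 0.01
for _ in range(5000):
    M = np.einsum('ij,kj,lj->ikl', A, A, A)
    WR = W * (M - T_hat)  # weighted residual
    # Full gradient: one term per mode of the symmetric CP model.
    G = 2 * (np.einsum('ikl,kj,lj->ij', WR, A, A)
             + np.einsum('ikl,ij,lj->kj', WR, A, A)
             + np.einsum('ikl,ij,kj->lj', WR, A, A))
    A -= lr * G / max(1.0, np.linalg.norm(G))  # clipped step for stability

loss_end = weighted_loss(A)
print(loss_start, loss_end)
```

An unweighted fit would set every entry of `W` to 1 in the objective, letting the noisy entries tied to the rarely observed dimension pull the factors off target; down-weighting them is the point of the weighted approach.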


Related research

- Hierarchical Methods of Moments (10/17/2018): Spectral methods of moments provide a powerful tool for learning the par...
- Spectral Learning on Matrices and Tensors (04/16/2020): Spectral methods have been the mainstay in several domains such as machi...
- Tensor decompositions for learning latent variable models (10/29/2012): This work considers a computationally and statistically efficient parame...
- Uncertainty Decomposition in Bayesian Neural Networks with Latent Variables (06/26/2017): Bayesian neural networks (BNNs) with latent variables are probabilistic ...
- Learning Binary Latent Variable Models: A Tensor Eigenpair Approach (02/27/2018): Latent variable models with hidden binary units appear in various applic...
- A general method for regularizing tensor decomposition methods via pseudo-data (05/24/2019): Tensor decomposition methods allow us to learn the parameters of latent ...
- Exact first moments of the RV coefficient by invariant orthogonal integration (10/02/2022): The RV coefficient measures the similarity between two multivariate conf...
