Deep Canonically Correlated LSTMs

01/16/2018
by Neil Mallinar, et al.

We examine Deep Canonically Correlated LSTMs as a way to learn nonlinear transformations of variable-length sequences and embed them into a correlated, fixed-dimensional space. We use LSTMs to transform multi-view time-series data nonlinearly while learning temporal relationships within the data. We then perform correlation analysis on the outputs of these networks to find a correlated subspace, and obtain the final representation by projecting onto it. This work builds on Deep Canonical Correlation Analysis (DCCA), in which deep feed-forward neural networks were used to learn nonlinear transformations of data while maximizing correlation.
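
Below is a minimal sketch of this two-view architecture in PyTorch, assuming paired sequences: each view is encoded by an LSTM into a fixed-dimensional vector, and a total-correlation CCA objective (the negative sum of canonical correlations, as in DCCA) is minimized between the two batches of embeddings. The module names, layer sizes, and regularization constant are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ViewEncoder(nn.Module):
    """LSTM that maps a variable-length sequence to a fixed-dimensional embedding."""
    def __init__(self, input_dim, hidden_dim, out_dim):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, out_dim)

    def forward(self, x):
        # x: (batch, time, input_dim); the final hidden state summarizes the sequence.
        _, (h_n, _) = self.lstm(x)
        return self.proj(h_n.squeeze(0))  # (batch, out_dim)

def neg_total_correlation(H1, H2, reg=1e-4):
    """Negative sum of canonical correlations between two batches of embeddings
    (the trace-norm CCA objective used in DCCA); reg is an assumed constant."""
    n, d1, d2 = H1.size(0), H1.size(1), H2.size(1)
    H1 = H1 - H1.mean(0, keepdim=True)  # mean-center each view
    H2 = H2 - H2.mean(0, keepdim=True)
    S11 = H1.t() @ H1 / (n - 1) + reg * torch.eye(d1)  # regularized covariances
    S22 = H2.t() @ H2 / (n - 1) + reg * torch.eye(d2)
    S12 = H1.t() @ H2 / (n - 1)
    # Whiten with Cholesky factors; the singular values of the whitened
    # cross-covariance are the canonical correlations.
    L1 = torch.linalg.cholesky(S11)
    L2 = torch.linalg.cholesky(S22)
    T = torch.linalg.solve_triangular(L1, S12, upper=False)        # L1^{-1} S12
    T = torch.linalg.solve_triangular(L2, T.t(), upper=False).t()  # ... L2^{-T}
    return -torch.linalg.svdvals(T).sum()

# Toy usage with hypothetical dimensions: 32 paired sequences of length 20.
enc1, enc2 = ViewEncoder(40, 128, 50), ViewEncoder(13, 128, 50)
x1, x2 = torch.randn(32, 20, 40), torch.randn(32, 20, 13)
loss = neg_total_correlation(enc1(x1), enc2(x2))
loss.backward()  # gradients flow through the CCA objective into both LSTMs
```

After training, classical CCA would be run once on the learned embeddings to recover the projection into the shared correlated subspace, yielding the final representation the abstract describes.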
