Supervised Learning with Quantum-Inspired Tensor Networks

05/18/2016
by E. Miles Stoudenmire, et al.

Tensor networks are efficient representations of high-dimensional tensors which have been very successful for physics and mathematics applications. We demonstrate how algorithms for optimizing such networks can be adapted to supervised learning tasks by using matrix product states (tensor trains) to parameterize models for classifying images. For the MNIST data set we obtain less than 1% test set classification error. We discuss how the tensor network form imparts additional structure to the learned model and suggest a possible generative interpretation.
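The classifier described in the abstract represents the weight tensor as a matrix product state (tensor train) carrying one extra label index, which is contracted against a product of local feature vectors, one per pixel. The following is a minimal sketch of that evaluation step, assuming the cosine/sine local feature map used in the paper; the function names, the random (untrained) cores, and the bond dimension are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def local_feature_map(pixel):
    """Map a grayscale pixel value in [0, 1] to a 2-component local feature vector."""
    return np.array([np.cos(np.pi * pixel / 2.0), np.sin(np.pi * pixel / 2.0)])

def random_mps(n_sites, phys_dim=2, bond_dim=10, n_labels=10, label_site=None):
    """Build random (untrained) MPS cores; one core carries an extra label index."""
    if label_site is None:
        label_site = n_sites // 2
    cores = []
    for j in range(n_sites):
        left = 1 if j == 0 else bond_dim
        right = 1 if j == n_sites - 1 else bond_dim
        if j == label_site:
            cores.append(0.1 * np.random.randn(left, phys_dim, n_labels, right))
        else:
            cores.append(0.1 * np.random.randn(left, phys_dim, right))
    return cores, label_site

def mps_classifier_scores(pixels, cores, label_site):
    """Contract pixel feature vectors against the MPS to get one score per label."""
    msg = np.ones(1)  # left boundary vector
    for j, core in enumerate(cores):
        phi = local_feature_map(pixels[j])
        if j < label_site:
            msg = np.einsum('l,p,lpr->r', msg, phi, core)
        elif j == label_site:
            msg = np.einsum('l,p,lpcr->cr', msg, phi, core)
        else:
            msg = np.einsum('cl,p,lpr->cr', msg, phi, core)
        msg = msg / np.linalg.norm(msg)  # rescale to avoid under/overflow over many sites
    return msg[:, 0]  # right boundary dimension is 1; shape (n_labels,)

# Example: score a random 28x28 "image" flattened to 784 pixels.
pixels = np.random.rand(28 * 28)
cores, label_site = random_mps(n_sites=pixels.size)
scores = mps_classifier_scores(pixels, cores, label_site)
print(int(np.argmax(scores)))  # predicted label for this untrained model
```

In the paper the cores are optimized with a DMRG-style sweeping algorithm rather than left random; the sketch only shows how the contraction that produces a per-label score is structured.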


Related research

09/25/2020 · Locally orderless tensor networks for classifying two- and three-dimensional medical images
Tensor networks are factorisations of high rank tensors into networks of...

06/26/2023 · Distributive Pre-Training of Generative Modeling Using Matrix-Product States
Tensor networks have recently found applications in machine learning for...

05/15/2019 · Number-State Preserving Tensor Networks as Classifiers for Supervised Learning
We propose a restricted class of tensor network state, built from number...

12/31/2017 · Learning Relevant Features of Data with Multi-scale Tensor Networks
Inspired by coarse-graining approaches used in physics, we show how simi...

10/04/2019 · Tensor-based algorithms for image classification
The interest in machine learning with tensor networks has been growing r...

01/27/2020 · Supervised Learning for Non-Sequential Data with the Canonical Polyadic Decomposition
There has recently been increasing interest, both theoretical and practi...

05/21/2023 · GeometricImageNet: Extending convolutional neural networks to vector and tensor images
Convolutional neural networks and their ilk have been very successful fo...
