DeepAI

Tensor networks and efficient descriptions of classical data

03/11/2021
by Sirui Lu, et al.

We investigate the potential of tensor-network-based machine learning methods to scale to large image and text data sets. To that end, we study how the mutual information between a subregion and its complement scales with the subsystem size L, in analogy to how this is done in quantum many-body physics. We find that for text, the mutual information scales as a power law L^ν with an exponent close to a volume law, indicating that text cannot be efficiently described by 1D tensor networks. For images, the scaling is close to an area law, hinting that 2D tensor networks such as PEPS could have adequate expressibility. For the numerical analysis, we introduce a mutual information estimator based on autoregressive networks, and we also use convolutional neural networks in a neural estimator method.
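The analysis described above has two ingredients: estimating the mutual information I(A;B) between a subregion A of size L and its complement B, and fitting the exponent ν of a power law I(L) ∝ L^ν (in 1D, ν ≈ 1 signals a volume law, while an area law gives L-independent I). The paper uses neural estimators for this; the sketch below instead uses a simple plug-in (empirical-histogram) entropy estimator on toy binary sequences, plus a log-log fit of a synthetic power law, purely to illustrate the quantities involved. The toy data and all parameter choices are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def plugin_entropy(samples):
    """Empirical (plug-in) Shannon entropy in nats over the rows of `samples`."""
    _, counts = np.unique(samples, return_counts=True, axis=0)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

def mutual_information(seqs, L):
    """I(A;B) = H(A) + H(B) - H(A,B), with A the first L symbols of each sequence."""
    A, B = seqs[:, :L], seqs[:, L:]
    return plugin_entropy(A) + plugin_entropy(B) - plugin_entropy(seqs)

# Toy data: each length-8 sequence repeats one random bit, so any cut shares
# exactly one bit of information and I(L) is about ln 2 for every cut position.
rng = np.random.default_rng(0)
seqs = np.repeat(rng.integers(0, 2, size=(5000, 1)), 8, axis=1)
I_cut = mutual_information(seqs, 4)  # close to ln 2 ~ 0.693 nats

# Power-law fit I(L) = c * L^nu on synthetic scaling data: the exponent nu is
# the slope of log I versus log L (nu near 1 would indicate a volume law in 1D).
Ls = np.arange(2, 40)
I_syn = 0.3 * Ls ** 0.9
nu, log_c = np.polyfit(np.log(Ls), np.log(I_syn), 1)  # recovers nu ~ 0.9
```

The plug-in estimator is exponentially expensive in L for realistic alphabets, which is exactly why the paper resorts to autoregressive and convolutional neural estimators for large image and text data.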


Related Research

11/25/2020 · Bounds for Algorithmic Mutual Information and a Unifilar Order Estimator
Inspired by Hilberg's hypothesis, which states that mutual information b...

06/21/2016 · Criticality in Formal Languages and Statistical Physics
We show that the mutual information between two symbols, as a function o...

03/28/2021 · Explaining Representation by Mutual Information
Science is used to discover the law of world. Machine learning can be us...

11/02/2022 · There Are Fewer Facts Than Words: Communication With A Growing Complexity
We present an impossibility result, called a theorem about facts and wor...

01/15/2023 · Quantum-inspired tensor network for Earth science
Deep Learning (DL) is one of many successful methodologies to extract in...

12/30/2006 · Magnification Laws of Winner-Relaxing and Winner-Enhancing Kohonen Feature Maps
Self-Organizing Maps are models for unsupervised representation formatio...

05/10/2019 · Mutual Information Scaling and Expressive Power of Sequence Models
Sequence models assign probabilities to variable-length sequences such a...