
Tensor networks and efficient descriptions of classical data

by Sirui Lu et al.

We investigate the potential of tensor-network-based machine learning methods to scale to large image and text data sets. To this end, we study how the mutual information between a subregion and its complement scales with the subsystem size L, in analogy to how this is done in quantum many-body physics. We find that for text, the mutual information scales as a power law L^ν with a close-to-volume-law exponent, indicating that text cannot be efficiently described by 1D tensor networks. For images, the scaling is close to an area law, hinting that 2D tensor networks such as PEPS could have adequate expressibility. For the numerical analysis, we introduce a mutual information estimator based on autoregressive networks, and we also use convolutional neural networks in a neural estimator method.
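The estimation strategy described above relies on the decomposition I(A:B) = H(A) + H(B) - H(A,B), where the entropies of a subregion, its complement, and the whole are estimated from data. The paper uses autoregressive networks for the entropy estimates; as a minimal illustrative sketch (not the paper's method), the same decomposition can be shown with a simple plug-in entropy estimator over discrete sequences, where the function names and the choice of splitting each sequence at position L are my own:

```python
import math
from collections import Counter

def entropy(samples):
    """Plug-in Shannon entropy (in bits) of the empirical distribution
    of the given samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mutual_information(seqs, L):
    """Estimate I(A:B) between the first L symbols of each sequence (the
    subregion A) and the remainder (its complement B), using
    I(A:B) = H(A) + H(B) - H(A,B)."""
    a = [s[:L] for s in seqs]
    b = [s[L:] for s in seqs]
    return entropy(a) + entropy(b) - entropy(list(seqs))
```

For perfectly correlated halves (e.g. sequences "00" and "11" each appearing half the time), this estimator returns 1 bit; for independent uniform halves it returns 0. A neural estimator replaces the plug-in entropies with cross-entropies of trained models, which scales to the long sequences studied in the paper.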



