On the descriptive power of Neural-Networks as constrained Tensor Networks with exponentially large bond dimension

05/27/2019
by Mario Collura, et al.

In many cases, neural networks can be mapped into tensor networks with an exponentially large bond dimension. We show that, when used to study the ground state of short-range Hamiltonians, the tensor network resulting from this mapping is highly constrained and therefore does not deliver the drastic improvement over state-of-the-art tensor network methods that the large bond dimension would naively suggest. We demonstrate this result explicitly in two paradigmatic examples: the 1D ferromagnetic Ising model and the 2D antiferromagnetic Heisenberg model.
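The mapping the abstract alludes to can be made concrete with a small numerical experiment. The sketch below (not the authors' code; sizes N, M and names such as rbm_amplitudes are illustrative assumptions) writes a restricted-Boltzmann-machine wavefunction on N spins as an exact matrix product state via successive SVDs and prints the bond dimension at every cut, exposing the exponential growth the abstract refers to.

```python
# Minimal illustration, assuming an RBM ansatz psi(s) = prod_j 2*cosh(b_j + sum_i W_ij s_i);
# this is a generic RBM-to-MPS decomposition, not the paper's specific construction.
import numpy as np

rng = np.random.default_rng(0)
N, M = 8, 8                          # visible spins, hidden units (assumed sizes)
W = 0.5 * rng.standard_normal((N, M))
b = 0.1 * rng.standard_normal(M)

def rbm_amplitudes(N, M, W, b):
    """Dense 2^N amplitude vector of the RBM wavefunction."""
    # Enumerate all spin configurations s_i = +/-1.
    s = np.array([[1 - 2 * ((k >> i) & 1) for i in range(N)]
                  for k in range(2 ** N)], dtype=float)
    theta = s @ W + b                # shape (2^N, M)
    return np.prod(2.0 * np.cosh(theta), axis=1)

psi = rbm_amplitudes(N, M, W, b)
psi /= np.linalg.norm(psi)

# Decompose psi into an exact MPS by sweeping with SVDs; the bond
# dimension at each cut is the numerical rank of that bipartition.
bond_dims = []
mat = psi.reshape(2, -1)
for cut in range(1, N):
    u, sv, vh = np.linalg.svd(mat, full_matrices=False)
    rank = int(np.sum(sv > 1e-10 * sv[0]))
    bond_dims.append(rank)
    if cut < N - 1:
        mat = (sv[:rank, None] * vh[:rank]).reshape(rank * 2, -1)

print("bond dimensions across cuts:", bond_dims)
# For generic couplings the rank tends to saturate min(2^c, 2^(N-c), 2^M),
# i.e. it grows exponentially toward the middle of the chain.
```

This only illustrates the bond-dimension scaling of the mapping; the paper's point is that the resulting tensors are highly constrained, so the exponentially large bond dimension does not translate into a correspondingly better description of ground states of short-range Hamiltonians.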



