On the descriptive power of Neural Networks as constrained Tensor Networks with exponentially large bond dimension
In many cases, neural networks can be mapped into tensor networks with an exponentially large bond dimension. We show that, when used to study the ground state of short-range Hamiltonians, the tensor network resulting from this mapping is highly constrained: it therefore does not deliver the naively expected drastic improvement over the descriptions obtained with state-of-the-art tensor network methods. We demonstrate this result explicitly in two paradigmatic examples, the 1D ferromagnetic Ising model and the 2D antiferromagnetic Heisenberg model.
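As a minimal numerical sketch of the kind of mapping the abstract refers to (not the paper's own construction), the snippet below builds the dense amplitudes of a small, randomly initialized RBM-style neural-network state and extracts the matrix-product-state bond dimension across every cut via successive SVDs. The functions `rbm_amplitudes` and `mps_bond_dimensions`, as well as the system sizes and weight scales, are illustrative assumptions chosen only to make the exponential growth of the bond dimension visible on a laptop.

```python
import numpy as np

def rbm_amplitudes(n_vis, n_hid, seed=0):
    """Dense wavefunction of a random RBM-style neural-network state.

    psi(s) = exp(sum_i a_i s_i) * prod_j 2 cosh(b_j + sum_i W_ij s_i),
    with s_i in {-1, +1}.  (Hypothetical toy example, not the paper's model.)
    """
    rng = np.random.default_rng(seed)
    a = 0.1 * rng.standard_normal(n_vis)
    b = 0.1 * rng.standard_normal(n_hid)
    W = 0.3 * rng.standard_normal((n_vis, n_hid))

    # Enumerate all 2^n_vis spin configurations s in {-1, +1}^n_vis.
    configs = np.array([[1 - 2 * ((k >> i) & 1) for i in range(n_vis)]
                        for k in range(2 ** n_vis)], dtype=float)
    theta = configs @ W + b                       # shape (2^n_vis, n_hid)
    psi = np.exp(configs @ a) * np.prod(2 * np.cosh(theta), axis=1)
    return psi / np.linalg.norm(psi)

def mps_bond_dimensions(psi, n_sites, cutoff=1e-10):
    """Bond dimensions needed to represent psi (up to `cutoff`) as an MPS,
    obtained by sweeping an SVD across each left/right bipartition."""
    dims = []
    M = psi.reshape(2, -1)
    for _ in range(n_sites - 1):
        u, s, vh = np.linalg.svd(M, full_matrices=False)
        chi = int(np.sum(s > cutoff * s[0]))
        dims.append(chi)
        # Absorb the singular values to the right and move one site over.
        M = (np.diag(s[:chi]) @ vh[:chi]).reshape(chi * 2, -1)
    return dims

if __name__ == "__main__":
    n_vis, n_hid = 10, 10
    psi = rbm_amplitudes(n_vis, n_hid)
    print("MPS bond dimensions across cuts:", mps_bond_dimensions(psi, n_vis))
    # A generic (untrained) RBM typically saturates the exponential bound
    # min(2^k, 2^(n-k)) at each cut, whereas ground states of short-range
    # Hamiltonians are usually captured with a much smaller bond dimension.
```

Run as-is, the printout shows Schmidt ranks growing roughly as 2^k toward the middle of the chain for the random state, which is the "exponentially large bond dimension" referenced in the abstract; whether that capacity is actually usable for short-range ground states is precisely the question the paper addresses.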