From Tensor Network Quantum States to Tensorial Recurrent Neural Networks

06/24/2022
by Dian Wu, et al.

We show that any matrix product state (MPS) can be exactly represented by a recurrent neural network (RNN) with a linear memory update. We generalize this RNN architecture to 2D lattices using a multilinear memory update. It supports perfect sampling and wave function evaluation in polynomial time, and can represent an area law of entanglement entropy. Numerical evidence shows that it can encode the wave function using a bond dimension orders of magnitude lower than that of MPS, with an accuracy that can be systematically improved by increasing the bond dimension.
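The MPS-to-RNN correspondence and the perfect-sampling claim can be sketched in a few lines. In the sketch below, all names (`A`, `v_left`, `v_right`, the random test state) are illustrative assumptions, not the paper's code: the memory vector is updated linearly by the matrix selected at each site, and exact autoregressive sampling follows from precomputed right environments.

```python
import numpy as np

# Minimal sketch: an open-boundary MPS evaluated as an RNN whose memory h
# is updated *linearly* by the site matrix selected by the local spin s_t,
#     h_t = A[t, s_t] @ h_{t-1},    psi(s) = v_right . h_N
rng = np.random.default_rng(0)
n_sites, phys_dim, chi = 5, 2, 3      # chi plays the role of the bond dimension

A = rng.normal(size=(n_sites, phys_dim, chi, chi))  # one matrix per site, per spin
v_left = rng.normal(size=chi)                       # boundary vectors
v_right = rng.normal(size=chi)

def amplitude(config):
    """psi(config): run the linear-memory RNN across the chain."""
    h = v_left
    for t, s in enumerate(config):
        h = A[t, s] @ h               # linear memory update, no activation
    return v_right @ h

# Perfect (autoregressive) sampling needs the conditionals p(s_t | s_<t).
# Precompute right environments R_t so that, for a partial memory h,
#     sum over all completions of psi^2  =  h^T R_t h.
R = [None] * (n_sites + 1)
R[n_sites] = np.outer(v_right, v_right)
for t in range(n_sites - 1, -1, -1):
    R[t] = sum(A[t, s].T @ R[t + 1] @ A[t, s] for s in range(phys_dim))

def conditionals(config):
    """Exact chain p(config) = prod_t p(s_t | s_<t) from the environments."""
    h, p = v_left, 1.0
    for t, s in enumerate(config):
        weights = np.array([(A[t, k] @ h) @ R[t + 1] @ (A[t, k] @ h)
                            for k in range(phys_dim)])
        p *= weights[s] / weights.sum()
        h = A[t, s] @ h
    return p

# Check against the Born rule by brute force (feasible only for tiny chains).
configs = [tuple(c) for c in np.ndindex(*(phys_dim,) * n_sites)]
Z = sum(amplitude(c) ** 2 for c in configs)
c0 = (0, 1, 1, 0, 1)
assert np.isclose(conditionals(c0), amplitude(c0) ** 2 / Z)
```

The telescoping of the per-site normalizations is what makes the sampling "perfect": the product of conditionals collapses exactly to psi(config)^2 / Z, with no Markov-chain equilibration needed.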

Related research

03/20/2023
Investigating Topological Order using Recurrent Neural Networks
Recurrent neural networks (RNNs), originally developed for natural langu...

10/11/2017
Neural Networks Quantum States, String-Bond States and chiral topological states
Neural Networks Quantum States have been recently introduced as an Ansat...

04/29/2019
Recurrent Neural Networks in the Eye of Differential Equations
To understand the fundamental trade-offs between training stability, tem...

03/20/2019
Counterexample-Guided Strategy Improvement for POMDPs Using Recurrent Neural Networks
We study strategy synthesis for partially observable Markov decision pro...

11/08/2017
A New Hybrid-parameter Recurrent Neural Networks for Online Handwritten Chinese Character Recognition
The recurrent neural network (RNN) is appropriate for dealing with tempo...

07/28/2022
Supplementing Recurrent Neural Network Wave Functions with Symmetry and Annealing to Improve Accuracy
Recurrent neural networks (RNNs) are a class of neural networks that hav...

05/27/2020
Fast and Effective Robustness Certification for Recurrent Neural Networks
We present a precise and scalable verifier for recurrent neural networks...
