On the descriptive power of Neural Networks as constrained Tensor Networks with exponentially large bond dimension

05/27/2019
by   Mario Collura, et al.

In many cases, neural networks can be mapped into tensor networks with an exponentially large bond dimension. We show that, when used to study the ground state of short-range Hamiltonians, the tensor network resulting from this mapping is highly constrained, so it does not deliver the naively expected drastic improvement in descriptive power over state-of-the-art tensor network methods. We demonstrate this result explicitly in two paradigmatic examples: the 1D ferromagnetic Ising model and the 2D antiferromagnetic Heisenberg model.
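The mapping invoked here can be made concrete for restricted Boltzmann machine (RBM) states: tracing out each hidden unit yields a bond-dimension-2 diagonal string across the visible spins, so an RBM with M hidden units becomes a matrix product state (MPS) of bond dimension up to 2^M, but one whose site tensors are Kronecker products of diagonal matrices rather than generic. The following minimal sketch is our illustration, not the paper's code; the parameter names a, b, W and both helper functions are assumed for the example.

```python
# Minimal sketch (assumed, not from the paper): the standard RBM -> MPS
# construction.  An RBM over N visible spins sigma_i = +/-1 with M hidden
# units maps to an MPS whose bond dimension is 2**M, because each hidden
# unit contributes one bond-dimension-2 diagonal string.
import numpy as np
from functools import reduce

rng = np.random.default_rng(0)
N, M = 6, 3                       # visible spins, hidden units (toy size)
a = rng.normal(size=N)            # visible biases
b = rng.normal(size=M)            # hidden biases
W = rng.normal(size=(N, M))       # visible-hidden couplings

def rbm_amplitude(sigma):
    """psi(sigma) = exp(a.sigma) * prod_j 2*cosh(b_j + sum_i W_ij sigma_i)."""
    return np.exp(a @ sigma) * np.prod(2.0 * np.cosh(b + sigma @ W))

def build_mps():
    """Equivalent MPS: each site tensor is a Kronecker product over the M
    hidden strings of 2x2 diagonal matrices -- highly structured, despite
    the exponentially large bond dimension 2**M."""
    h = np.array([+1.0, -1.0])                 # values of one hidden unit
    tensors = []
    for i in range(N):
        per_spin = []                          # one matrix per sigma_i = +1, -1
        for s in (+1.0, -1.0):
            mats = [np.diag(np.exp(h * W[i, j] * s)) for j in range(M)]
            per_spin.append(reduce(np.kron, mats) * np.exp(a[i] * s))
        tensors.append(per_spin)
    left = reduce(np.kron, [np.exp(h * b[j]) for j in range(M)])  # hidden biases
    right = np.ones(2 ** M)
    return left, tensors, right

def mps_amplitude(sigma, left, tensors, right):
    v = left
    for i, s in enumerate(sigma):
        v = v @ tensors[i][0 if s == 1 else 1]
    return v @ right

left, tensors, right = build_mps()
for config in range(2 ** N):                   # exhaustive check, toy size only
    sigma = np.array([1 if (config >> i) & 1 else -1 for i in range(N)])
    assert np.isclose(rbm_amplitude(sigma),
                      mps_amplitude(sigma, left, tensors, right))
print("RBM amplitudes match the bond-dimension-2^M MPS on all configurations.")
```

The exhaustive check over all 2^N configurations is only feasible at toy sizes; the point is that although the bond dimension 2^M is exponentially large, the site tensors retain a rigid Kronecker-of-diagonals structure, which is the kind of constraint the abstract alludes to.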


