Low-rank representation of tensor network operators with long-range pairwise interactions

09/05/2019
by Lin Lin, et al.

Tensor network operators, such as the matrix product operator (MPO) and the projected entangled-pair operator (PEPO), can provide efficient representations of certain linear operators in high-dimensional spaces. This paper focuses on the efficient representation of tensor network operators with long-range pairwise interactions such as the Coulomb interaction. For MPOs, we find that all existing efficient methods exploit a peculiar "upper-triangular low-rank" (UTLR) property, i.e. the upper-triangular part of the matrix can be well approximated by a low-rank matrix, while the matrix itself can be full-rank. This allows us to convert the problem of finding the efficient MPO representation into a matrix completion problem. We develop a modified incremental singular value decomposition (ISVD) method to solve this ill-conditioned matrix completion problem. This algorithm yields an MPO representation equivalent to the one developed in [Stoudenmire and White, Phys. Rev. Lett. 2017]. In order to treat more general tensor network operators efficiently, we develop another strategy for compressing tensor network operators based on hierarchical low-rank matrix formats, such as the hierarchical off-diagonal low-rank (HODLR) format and the H-matrix format. Although the preconstant in the complexity is larger, the advantage of the hierarchical low-rank matrix format is that it applies to both MPOs and PEPOs. For the Coulomb interaction, the operator can be represented by a linear combination of O(log(N)log(N/ϵ)) MPOs/PEPOs, each with a constant bond dimension, where N is the system size and ϵ is the accuracy of the low-rank truncation. Neither the modified ISVD nor the hierarchical low-rank algorithm assumes that the long-range interaction takes a translation-invariant form.
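A minimal numerical sketch (not from the paper) of why the UTLR property holds for the Coulomb kernel: the quadrature rule, step size, and truncation window below are illustrative assumptions. Using 1/r = ∫₀^∞ e^{-rt} dt and a trapezoid rule after the substitution t = e^s, the kernel 1/r is approximated by a short sum of exponentials, uniformly for 1 ≤ r ≤ N. Each term e^{-t(j-i)} = e^{ti}·e^{-tj} is a rank-1 matrix, so the strict upper triangle of the Coulomb matrix agrees with a matrix of rank equal to the number of quadrature nodes, which grows only like O(log(N)log(1/ϵ)).

```python
import numpy as np

# Illustrative parameters (assumptions, not taken from the paper).
N = 256                          # system size: interaction 1/|i-j| for |i-j| in [1, N]
h = 0.25                         # trapezoid step in s, where t = e^s
s = np.arange(-30.0, 5.0 + h, h) # truncated quadrature window in s
t = np.exp(s)                    # exponents t_k
w = h * t                        # trapezoid weights for 1/r = \int_0^\infty e^{-rt} dt

# Exponential-sum approximation of 1/r on r = 1, ..., N.
r = np.arange(1, N + 1)
approx = (w[None, :] * np.exp(-np.outer(r, t))).sum(axis=1)

# Maximum relative error of the K-term exponential sum over the whole range.
err = np.max(np.abs(approx - 1.0 / r) * r)
K = len(t)
print(K, err)   # K is modest (here 141 terms) and err is far below 1e-6
```

Because every term in the sum factors as an outer product, the sum is a rank-K matrix that matches 1/|i-j| on the strict upper triangle even though the full Coulomb matrix itself is essentially full-rank; this is exactly the low-rank completion the abstract refers to.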


