Generative modeling with projected entangled-pair states

02/16/2022
by Tom Vieijra, et al.

We argue and demonstrate that projected entangled-pair states (PEPS) significantly outperform matrix product states (MPS) for the task of generative modeling of datasets with an intrinsic two-dimensional structure, such as images. Our approach builds on a recently introduced algorithm for sampling PEPS, which allows for efficient optimization and sampling of the learned distributions.
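
To make the setting concrete, the sketch below illustrates the Born-machine idea behind tensor-network generative models: a PEPS assigns a probability p(x) proportional to |<x|psi>|^2 to a pixel configuration x by fixing the physical index of each site tensor to the pixel value and contracting the remaining virtual bond indices. This is an illustrative NumPy toy under assumed parameters (a 2x2 pixel grid, binary pixels, bond dimension D = 3), not the paper's algorithm; for such a tiny grid the contraction and normalization can be done exactly, whereas image-sized PEPS require approximate contraction and the sampling scheme referenced in the abstract.

```python
# Hedged sketch (assumptions: 2x2 grid, binary pixels, bond dimension D = 3;
# an illustrative toy, not the authors' implementation): a PEPS "Born machine"
# assigns p(x) proportional to |<x|psi>|^2, where the amplitude <x|psi> is
# obtained by contracting one tensor per lattice site along shared bonds.
import itertools
import numpy as np

rng = np.random.default_rng(0)
d, D = 2, 3  # physical (pixel) dimension and virtual bond dimension

# One tensor per site of the 2x2 grid; index order: (physical, bond, bond).
# Bonds: a = (0,0)-(0,1), b = (0,0)-(1,0), c = (0,1)-(1,1), e = (1,0)-(1,1).
A00 = rng.normal(size=(d, D, D))  # indices (x, a, b)
A01 = rng.normal(size=(d, D, D))  # indices (x, a, c)
A10 = rng.normal(size=(d, D, D))  # indices (x, b, e)
A11 = rng.normal(size=(d, D, D))  # indices (x, c, e)

def amplitude(x00, x01, x10, x11):
    """Contract the 2x2 PEPS with the physical indices fixed to pixel values."""
    return np.einsum("ab,ac,be,ce->",
                     A00[x00], A01[x01], A10[x10], A11[x11])

# Born-rule probabilities: p(x) = |<x|psi>|^2 / Z. For this toy grid the
# normalization Z is computed by brute force over all 2^4 configurations;
# realistic PEPS rely on approximate contraction instead.
configs = list(itertools.product(range(d), repeat=4))
weights = np.array([amplitude(*x) ** 2 for x in configs])
probs = weights / weights.sum()

for x, p in zip(configs, probs):
    print(x, f"{p:.4f}")
```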

Related research

09/06/2017 · Unsupervised Generative Modeling Using Matrix Product States
Generative modeling, which learns joint probability distribution from tr...

12/13/2018 · Shortcut Matrix Product States and its applications
Matrix Product States (MPS), also known as Tensor Train (TT) decompositi...

01/08/2019 · Tree Tensor Networks for Generative Modeling
Matrix product states (MPS), a tensor network designed for one-dimension...

10/27/2022 · On Tsirelson pairs of C*-algebras
We introduce the notion of a Tsirelson pair of C*-algebras, which is a p...

10/19/2012 · Modeling with Copulas and Vines in Estimation of Distribution Algorithms
The aim of this work is studying the use of copulas and vines in the opt...

02/01/2022 · StyleGAN-XL: Scaling StyleGAN to Large Diverse Datasets
Computer graphics has experienced a recent surge of data-centric approac...

08/14/2020 · The Projected Belief Network Classifier: both Generative and Discriminative
The projected belief network (PBN) is a layered generative network with ...
