Deep Compression of Sum-Product Networks on Tensor Networks

11/09/2018
by Ching-Yun Ko, et al.

Sum-product networks (SPNs) represent an emerging class of neural networks with clear probabilistic semantics and inference speed superior to that of graphical models. This work reveals a strikingly intimate connection between SPNs and tensor networks, leading to a highly efficient representation that we call tensor SPNs (tSPNs). For the first time, by mapping an SPN onto a tSPN and employing novel optimization techniques, we demonstrate remarkable parameter compression with negligible loss in accuracy.
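The compression principle behind the abstract, factoring a large parameter tensor into a chain of small tensor-network cores, can be illustrated with a generic tensor-train (TT) decomposition. The sketch below is illustrative only: `tt_decompose` and `tt_reconstruct` are hypothetical names, and this is the standard TT-SVD technique, not the authors' tSPN mapping or their optimization method.

```python
import numpy as np

def tt_decompose(vector, shape, max_rank):
    """Factor a parameter vector, viewed as a tensor of the given shape,
    into tensor-train cores via sequential truncated SVDs.
    Illustrates the compression idea; not the paper's algorithm."""
    rest = vector.reshape(shape)
    cores, rank = [], 1
    for n in shape[:-1]:
        mat = rest.reshape(rank * n, -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))           # truncate to the rank budget
        cores.append(u[:, :r].reshape(rank, n, r))
        rest = np.diag(s[:r]) @ vt[:r]      # carry the remainder forward
        rank = r
    cores.append(rest.reshape(rank, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a flat parameter vector."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([out.ndim - 1], [0]))
    return out.reshape(-1)

# 1024 parameters viewed as a 2 x 2 x ... x 2 (10-way) tensor.
weights = np.random.randn(1024)
cores = tt_decompose(weights, (2,) * 10, max_rank=4)
n_compressed = sum(c.size for c in cores)   # far fewer than 1024 entries
approx = tt_reconstruct(cores)              # low-rank approximation
```

With the rank capped at 4, the cores store only a few hundred numbers in place of the original 1024; raising the cap trades storage for fidelity, which is the same storage/accuracy trade-off the tSPN construction exploits.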


Related research

- Semi-tensor Product-based Tensor Decomposition for Neural Network Compression (09/30/2021): The existing tensor networks adopt conventional matrix product for conne...
- Sum-Product Graphical Models (08/21/2017): This paper introduces a new probabilistic architecture called Sum-Produc...
- Deep Convolutional Sum-Product Networks for Probabilistic Image Representations (02/16/2019): Sum-Product Networks (SPNs) are hierarchical probabilistic graphical mod...
- Online Structure Learning for Sum-Product Networks with Gaussian Leaves (01/19/2017): Sum-product networks have recently emerged as an attractive representati...
- Optimisation of Overparametrized Sum-Product Networks (05/20/2019): It seems to be a pearl of conventional wisdom that parameter learning in...
- Lossless Compression of Structured Convolutional Models via Lifting (07/13/2020): Lifting is an efficient technique to scale up graphical models generaliz...
- Memory and Capacity of Graph Embedding Methods (08/18/2022): This paper analyzes the graph embedding method introduced in <cit.>, whi...
