Sum-Product-Attention Networks: Leveraging Self-Attention in Probabilistic Circuits

09/14/2021
by Zhongjie Yu, et al.

Probabilistic circuits (PCs) have become the de facto standard for learning and inference in probabilistic modeling. We introduce Sum-Product-Attention Networks (SPAN), a new generative model that integrates probabilistic circuits with Transformers. SPAN uses self-attention to select the most relevant parts of a probabilistic circuit, here a sum-product network, thereby improving the modeling capability of the underlying network. We show that, while modeling, SPAN focuses on a specific set of independence assumptions in every product layer of the sum-product network. Our empirical evaluation shows that SPAN outperforms state-of-the-art probabilistic generative models on various benchmark data sets and is also an efficient generative image model.
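To make the selection mechanism concrete, the following is a minimal, hypothetical sketch (in NumPy) of how scaled dot-product self-attention could re-weight the children of a sum node in a sum-product network. All function names, shapes, and the toy setup are illustrative assumptions, not the authors' implementation:

    import numpy as np

    def softmax(x):
        # numerically stable softmax over the last axis
        x = x - x.max(axis=-1, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=-1, keepdims=True)

    def attention_weights(query, keys):
        # scaled dot-product attention: one weight per child of a sum node
        # (hypothetical stand-in for SPAN's Transformer-based selection)
        d = keys.shape[-1]
        scores = keys @ query / np.sqrt(d)   # shape: (num_children,)
        return softmax(scores)

    def attentive_sum_node(child_log_probs, query, keys):
        # re-weight the sum node's mixture with attention, then combine the
        # children's log-likelihoods via log-sum-exp for numerical stability
        log_w = np.log(attention_weights(query, keys) + 1e-12)
        a = log_w + child_log_probs
        m = a.max()
        return m + np.log(np.exp(a - m).sum())

    # toy usage: a sum node with 4 children, 8-dimensional key/query embeddings
    rng = np.random.default_rng(0)
    keys = rng.normal(size=(4, 8))        # one learned key per child sub-circuit
    query = rng.normal(size=8)            # query derived from the current input
    child_ll = rng.normal(size=4) - 5.0   # children's log-likelihoods
    print(attentive_sum_node(child_ll, query, keys))

In a full SPAN model, the attention weights would come from a Transformer conditioned on the input, steering probability mass toward the sub-circuits (and hence the product-layer independence assumptions) most relevant to that input.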

Related research:

- Adaptive Attention Span in Transformers (05/19/2019): We propose a novel self-attention mechanism that can learn its optimal a...
- Variational Self-attention Model for Sentence Representation (12/30/2018): This paper proposes a variational self-attention model (VSAM) that emplo...
- A Probabilistic Interpretation of Transformers (04/28/2022): We propose a probabilistic interpretation of exponential dot product att...
- Explaining Deep Tractable Probabilistic Models: The sum-product network case (10/19/2021): We consider the problem of explaining a tractable deep probabilistic mod...
- Deep Convolutional Sum-Product Networks for Probabilistic Image Representations (02/16/2019): Sum-Product Networks (SPNs) are hierarchical probabilistic graphical mod...
- On the Expressive Efficiency of Sum Product Networks (11/27/2014): Sum Product Networks (SPNs) are a recently developed class of deep gener...
- Sum-Product-Quotient Networks (10/12/2017): We present a novel tractable generative model that extends Sum-Product N...
