
Towards Building Deep Networks with Bayesian Factor Graphs

by   Amedeo Buonanno, et al.
Università degli Studi della Campania "Luigi Vanvitelli"

We propose a Multi-Layer Network based on the Bayesian framework of Factor Graphs in Reduced Normal Form (FGrn), applied to a two-dimensional lattice. The Latent Variable Model (LVM) is the basic building block of a quadtree hierarchy built on top of a bottom layer of random variables representing the pixels of an image, a feature map, or more generally a collection of spatially distributed discrete variables. The multi-layer architecture implements a hierarchical data representation that, via belief propagation, can be used for learning and inference; typical uses are pattern completion, correction and classification. The FGrn paradigm provides great flexibility and modularity, and appears to be a promising candidate for building deep networks: the system can easily be extended by introducing new variables of different cardinality and type, and prior knowledge, or supervised information, can be injected at different scales. The paradigm also offers a handy way of building all kinds of architectures by interconnecting only three types of units: Single Input Single Output (SISO) blocks, Sources and Replicators. The network is designed like a circuit diagram, belief messages flow bidirectionally through the whole system, and the learning algorithms operate only locally within each block. The framework is demonstrated in this paper on a three-layer structure applied to images extracted from a standard data set.
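To make the three unit types concrete, the following is a minimal sketch of how belief messages might flow through them. All names, shapes and the normalization convention are illustrative assumptions, not the authors' implementation: a SISO block is modeled as a conditional-probability matrix, a Source as a fixed prior, and a Replicator as an equality node that multiplies incoming messages.

```python
import numpy as np

# Hypothetical sketch of the three FGrn unit types named in the abstract
# (SISO block, Source, Replicator). Shapes and names are assumptions.

def normalize(m):
    """Scale a belief message so its entries sum to one."""
    m = np.asarray(m, dtype=float)
    return m / m.sum()

class SISOBlock:
    """Single Input Single Output block: a conditional-probability
    matrix P(y|x) mapping belief messages between two discrete variables."""
    def __init__(self, cond_prob):
        self.P = np.asarray(cond_prob, dtype=float)  # shape (|X|, |Y|)

    def forward(self, msg_x):
        # Message in the X -> Y direction.
        return normalize(self.P.T @ msg_x)

    def backward(self, msg_y):
        # Message in the Y -> X direction.
        return normalize(self.P @ msg_y)

class Source:
    """Source: injects a fixed prior distribution as a belief message."""
    def __init__(self, prior):
        self.prior = normalize(prior)

    def emit(self):
        return self.prior.copy()

def replicate(*incoming):
    """Replicator (equality node): the outgoing message on each branch is
    the normalized product of the messages on all other branches."""
    msgs = [np.asarray(m, dtype=float) for m in incoming]
    out = []
    for i in range(len(msgs)):
        prod = np.ones_like(msgs[0])
        for j, m in enumerate(msgs):
            if j != i:
                prod = prod * m
        out.append(normalize(prod))
    return out

# Tiny example: a prior from a Source flows forward through a SISO block.
prior = Source([0.7, 0.3]).emit()
block = SISOBlock([[0.9, 0.1],
                   [0.2, 0.8]])
msg_y = block.forward(prior)   # belief about Y, e.g. [0.69, 0.31]
```

In a full quadtree hierarchy, Replicators would fan each latent variable out to its four children, with SISO blocks translating messages between layers; learning each block's matrix locally (e.g. by an EM-style update) is consistent with the abstract's statement that the learning algorithms operate only within each block.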



