Towards Building Deep Networks with Bayesian Factor Graphs

02/16/2015
by Amedeo Buonanno, et al.

We propose a multi-layer network based on the Bayesian framework of Factor Graphs in Reduced Normal Form (FGrn) applied to a two-dimensional lattice. The Latent Variable Model (LVM) is the basic building block of a quadtree hierarchy built on top of a bottom layer of random variables that represent the pixels of an image, a feature map, or, more generally, a collection of spatially distributed discrete variables. The multi-layer architecture implements a hierarchical data representation that, via belief propagation, can be used for learning and inference; typical uses are pattern completion, correction and classification. The FGrn paradigm provides great flexibility and modularity and appears to be a promising candidate for building deep networks: the system can easily be extended by introducing new variables of different cardinality and type, and prior knowledge, or supervised information, can be injected at different scales. The paradigm also offers a convenient way to build a wide range of architectures by interconnecting only three types of unit: Single Input Single Output (SISO) blocks, Sources and Replicators. The network is designed like a circuit diagram, belief messages flow bidirectionally throughout the system, and the learning algorithms operate only locally within each block. The framework is demonstrated here on a three-layer structure applied to images extracted from a standard data set.
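To make the bidirectional message flow concrete, the following is a minimal sketch, not the authors' implementation, of forward/backward belief propagation through a single SISO block fed by a Source over discrete variables. The class and variable names (SISOBlock, normalize, source_prior) and the random initialization are illustrative assumptions, not an API from the paper.

```python
import numpy as np

def normalize(msg):
    """Scale a discrete belief message so it sums to one."""
    s = msg.sum()
    return msg / s if s > 0 else np.full_like(msg, 1.0 / msg.size)

class SISOBlock:
    """Single Input Single Output block: a row-stochastic conditional
    probability matrix P (rows: input states, columns: output states)
    that maps forward messages input->output and backward messages
    output->input."""
    def __init__(self, n_in, n_out, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        P = rng.random((n_in, n_out))
        self.P = P / P.sum(axis=1, keepdims=True)

    def forward(self, msg_in):
        # Forward belief message: b_out(y) is proportional to sum_x b_in(x) P(y|x)
        return normalize(msg_in @ self.P)

    def backward(self, msg_out):
        # Backward belief message: b_in(x) is proportional to sum_y P(y|x) b_out(y)
        return normalize(self.P @ msg_out)

# Example: a two-state hidden Source feeding a four-state observable
# variable through one SISO block; messages flow in both directions.
source_prior = np.array([0.5, 0.5])              # Source: prior over latent states
block = SISOBlock(n_in=2, n_out=4)

forward_msg = block.forward(source_prior)        # downward prediction at the bottom variable
evidence = np.array([0.0, 0.0, 1.0, 0.0])        # observed pixel/feature as a delta message
backward_msg = block.backward(evidence)          # evidence propagated up to the latent variable

posterior = normalize(source_prior * backward_msg)  # belief at the latent variable
print("downward prediction:", forward_msg)
print("posterior over latent states:", posterior)
```

In the same spirit, a Replicator would combine several incoming messages on a shared variable by element-wise product followed by normalization, which is how the quadtree layers would be tied together in such a sketch; learning in each SISO block then reduces to local updates of its conditional probability matrix from the messages arriving at that block.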

