Modelling the Probability Density of Markov Sources

07/06/2006
by Stephen Luttrell, et al.

This paper introduces an objective function that seeks to minimise the average total number of bits required to encode the joint state of all of the layers of a Markov source. This type of encoder may be applied to the problem of optimising the bottom-up (recognition model) and top-down (generative model) connections in a multilayer neural network, and it unifies several previous results on the optimisation of multilayer neural networks.
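For orientation, here is a minimal sketch of the kind of coding cost such an objective measures, written for a two-layer source with a visible state x and a single hidden layer y. The symbols are illustrative assumptions rather than the paper's notation: Pr(x) is the data distribution, Q(y|x) the bottom-up (recognition) model used to infer y, and P(y)P(x|y) the top-down (generative) model used to assign code lengths:

\[
  D \;=\; -\sum_{x} \Pr(x) \sum_{y} Q(y \mid x)\,
  \log_{2}\!\bigl[\, P(y)\, P(x \mid y) \,\bigr]
\]

Under these assumptions, D is the average number of bits needed to encode the joint state (x, y) when y is drawn from the recognition model and both variables are coded against the generative model. Minimising D over Q and P jointly optimises the recognition and generative connections; replacing y by a chain of hidden layers y_1, ..., y_L gives a multilayer (Markov source) form of the same idea.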

Related research

A Comprehensive Survey on Bengali Phoneme Recognition (01/27/2017)
Hidden Markov model based various phoneme recognition methods for Bengal...

Some Theoretical Properties of a Network of Discretely Firing Neurons (05/03/2015)
The problem of optimising a network of discretely firing neurons is addr...

Think Again Networks and the Delta Loss (04/26/2019)
This short paper introduces an abstraction called Think Again Networks (...

Adaptive Cluster Expansion (ACE): A Multilayer Network for Estimating Probability Density Functions (12/16/2010)
We derive an adaptive hierarchical method of estimating high dimensional...

Core Decomposition in Multilayer Networks: Theory, Algorithms, and Applications (12/20/2018)
Multilayer networks are a powerful paradigm to model complex systems, wh...

Self-Organised Factorial Encoding of a Toroidal Manifold (10/15/2004)
It is shown analytically how a neural network can be used optimally to e...

A multilayer exponential random graph modelling approach for weighted networks (11/16/2018)
This paper introduces a new modelling approach to analyse weighted netwo...
