Bayesian Recurrent Units and the Forward-Backward Algorithm

by Alexandre Bittar et al., Idiap Research Institute

Using Bayes's theorem, we derive a unit-wise recurrence as well as a backward recursion similar to the forward-backward algorithm. The resulting Bayesian recurrent units can be integrated as recurrent neural networks within deep learning frameworks, while retaining a probabilistic interpretation from the direct correspondence with hidden Markov models. Whilst the contribution is mainly theoretical, experiments on speech recognition indicate that adding the derived units at the end of state-of-the-art recurrent architectures can improve the performance at a very low cost in terms of trainable parameters.
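To make the connection to hidden Markov models concrete, here is a minimal NumPy sketch of the classical forward-backward algorithm for a discrete-observation HMM, which computes the posterior state marginals the abstract refers to. The variable names (`A` for transitions, `B` for emissions, `pi` for the initial distribution) are my notation for illustration, not the paper's; this is the standard scaled recursion, not the unit-wise Bayesian formulation the authors derive.

```python
import numpy as np

def forward_backward(obs, A, B, pi):
    """Posterior marginals gamma[t, i] = P(z_t = i | x_{1:T}) for a
    discrete HMM with transition matrix A (N x N), emission matrix
    B (N x M), and initial distribution pi (N,)."""
    T, N = len(obs), A.shape[0]
    alpha = np.zeros((T, N))  # scaled forward messages
    beta = np.zeros((T, N))   # scaled backward messages
    c = np.zeros(T)           # per-step scaling factors (avoid underflow)

    # Forward pass: alpha[t] ∝ P(z_t | x_{1:t})
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]

    # Backward pass: beta[t] ∝ P(x_{t+1:T} | z_t), scaled by the same c
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]

    # Combine and renormalise to obtain the smoothed posteriors
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    return gamma
```

Because each recursion step consists of a matrix product followed by an elementwise nonlinearity (here, normalisation), it has the same computational shape as a recurrent neural network layer, which is what allows such units to be dropped into deep learning frameworks and trained by backpropagation.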




Related Research

A Bayesian Approach to Recurrence in Neural Networks

We begin by reiterating that common neural network activation functions ...

Hidden Markov Chains, Entropic Forward-Backward, and Part-Of-Speech Tagging

The ability to take into account the characteristics - also called featu...

Hidden Markov models are recurrent neural networks: A disease progression modeling application

Hidden Markov models (HMMs) are commonly used for sequential data modeli...

End-to-end Adaptation with Backpropagation through WFST for On-device Speech Recognition System

An on-device DNN-HMM speech recognition system efficiently works with a ...

Stabilising and accelerating light gated recurrent units for automatic speech recognition

The light gated recurrent unit (Li-GRU) is well-known for achieving imp...

Lipschitz Recurrent Neural Networks

Differential equations are a natural choice for modeling recurrent neura...

Combining Forward and Backward Abstract Interpretation of Horn Clauses

Alternation of forward and backward analyses is a standard technique in ...
