Bayesian Recurrent Units and the Forward-Backward Algorithm

07/21/2022
by Alexandre Bittar, et al.

Using Bayes's theorem, we derive a unit-wise recurrence as well as a backward recursion similar to the forward-backward algorithm. The resulting Bayesian recurrent units can be integrated as recurrent neural networks within deep learning frameworks, while retaining a probabilistic interpretation from the direct correspondence with hidden Markov models. Whilst the contribution is mainly theoretical, experiments on speech recognition indicate that adding the derived units at the end of state-of-the-art recurrent architectures can improve the performance at a very low cost in terms of trainable parameters.
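The backward recursion mentioned above parallels the classical forward-backward algorithm for hidden Markov models. As background only, here is a minimal NumPy sketch of those standard recursions with scaled messages and per-step state posteriors; the function name forward_backward and all variable names are illustrative, and this is not the derived Bayesian recurrent unit itself.

```python
import numpy as np

def forward_backward(pi, A, B, observations):
    """Classical forward-backward recursions for a discrete HMM.

    pi: (K,) initial state probabilities
    A:  (K, K) transition matrix, A[i, j] = p(z_t = j | z_{t-1} = i)
    B:  (K, V) emission matrix,   B[j, v] = p(x_t = v | z_t = j)
    observations: length-T sequence of integer symbols in [0, V)
    Returns per-step posteriors gamma[t, j] = p(z_t = j | x_{1:T}).
    """
    T, K = len(observations), len(pi)
    alpha = np.zeros((T, K))   # scaled forward messages
    beta = np.zeros((T, K))    # scaled backward messages
    scale = np.zeros(T)

    # Forward pass: alpha[t] proportional to p(z_t, x_{1:t})
    alpha[0] = pi * B[:, observations[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, observations[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]

    # Backward pass: beta[t] proportional to p(x_{t+1:T} | z_t)
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, observations[t + 1]] * beta[t + 1])
        beta[t] /= scale[t + 1]

    # Combine messages into smoothed state posteriors
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    return gamma
```

In this standard HMM view the recursions run with fixed parameters; the abstract's point is that analogous unit-wise recursions can be embedded as trainable recurrent layers while keeping this probabilistic interpretation.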


Related research

10/24/2019  A Bayesian Approach to Recurrence in Neural Networks
05/21/2020  Hidden Markov Chains, Entropic Forward-Backward, and Part-Of-Speech Tagging
06/04/2020  Hidden Markov models are recurrent neural networks: A disease progression modeling application
05/17/2019  End-to-end Adaptation with Backpropagation through WFST for On-device Speech Recognition System
02/16/2023  Stabilising and accelerating light gated recurrent units for automatic speech recognition
06/23/2020  Lipschitz Recurrent Neural Networks
07/05/2017  Combining Forward and Backward Abstract Interpretation of Horn Clauses
