Effective Mean-Field Inference Method for Nonnegative Boltzmann Machines

03/08/2016
by Muneki Yasuda, et al.

Nonnegative Boltzmann machines (NNBMs) are recurrent probabilistic neural network models that can describe multi-modal nonnegative data. NNBMs form rectified Gaussian distributions, which appear in biological neural network models, positive matrix factorization, nonnegative matrix factorization, and so on. In this paper, an effective inference method for NNBMs is proposed that combines a mean-field method, known as the Thouless--Anderson--Palmer (TAP) equation, with the recently proposed diagonal consistency method.
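As background for the mean-field approach the abstract refers to, the sketch below implements a plain naive mean-field fixed point for a rectified Gaussian distribution p(x) ∝ exp(−x᷀ᵀAx/2 + bᵀx) on x ≥ 0 — the distribution family an NNBM represents. This is a simplified baseline under assumed notation (precision matrix A, field b), not the paper's TAP-equation-plus-diagonal-consistency method; the function names and the factorized truncated-Gaussian approximation are illustrative choices.

```python
import math

def phi(z):
    """Standard normal pdf."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def Phi(z):
    """Standard normal cdf, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def trunc_mean(h, a):
    """Mean of the 1-D rectified Gaussian p(x) ∝ exp(-a x^2/2 + h x), x >= 0, a > 0.

    This is the mean of a normal(mu, sigma^2) truncated to [0, inf),
    with mu = h/a and sigma = 1/sqrt(a).
    """
    mu = h / a
    sigma = 1.0 / math.sqrt(a)
    alpha = mu / sigma
    return mu + sigma * phi(alpha) / Phi(alpha)

def naive_mean_field(A, b, iters=200, damp=0.5):
    """Damped naive mean-field iteration for the rectified Gaussian
    p(x) ∝ exp(-x'Ax/2 + b'x) on x >= 0 (illustrative sketch).

    Each variable sees an effective field h_i = b_i - sum_{j != i} A_ij m_j
    and is approximated by a 1-D rectified Gaussian with precision A_ii.
    Returns the vector of approximate means m.
    """
    n = len(b)
    m = [trunc_mean(b[i], A[i][i]) for i in range(n)]
    for _ in range(iters):
        for i in range(n):
            h = b[i] - sum(A[i][j] * m[j] for j in range(n) if j != i)
            m[i] = damp * m[i] + (1.0 - damp) * trunc_mean(h, A[i][i])
    return m
```

For weak couplings (small off-diagonal A) this iteration is a contraction and converges to a fixed point; the TAP equation of the paper adds an Onsager correction term to this naive scheme, and the diagonal consistency method further constrains the diagonal of the approximate covariance.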


