Natural-Parameter Networks: A Class of Probabilistic Neural Networks

11/02/2016
by Hao Wang, et al.

Neural networks (NN) have achieved state-of-the-art performance in various applications. Unfortunately, in applications where training data is insufficient, they are often prone to overfitting. One effective way to alleviate this problem is to take a Bayesian approach and use Bayesian neural networks (BNN). Another shortcoming of NN is the lack of flexibility to customize different distributions for the weights and neurons according to the data, as is often done in probabilistic graphical models. To address these problems, we propose a class of probabilistic neural networks, dubbed natural-parameter networks (NPN), as a novel and lightweight Bayesian treatment of NN. NPN allows arbitrary exponential-family distributions to be used to model the weights and neurons. Unlike traditional NN and BNN, NPN takes distributions as input and passes them through layers of transformation before producing distributions that match the target output distributions. As a Bayesian treatment, NPN performs efficient backpropagation (BP) to learn the natural parameters of the distributions over both the weights and neurons. The output distributions of each layer, as byproducts, can be used as second-order representations for associated tasks such as link prediction. Experiments on real-world datasets show that NPN can achieve state-of-the-art performance.
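To make the layer-wise propagation of distributions concrete, below is a minimal sketch (not the authors' code) of how a Gaussian NPN-style linear layer might push an input mean and variance through Gaussian-distributed weights via moment matching. The paper itself works with natural parameters, which for a Gaussian are a reparameterization of the mean/variance pair used here; the class and variable names (NPNLinear, W_m, W_s, etc.) are hypothetical.

```python
# Minimal sketch of a Gaussian NPN-style linear layer: instead of a point
# estimate, the layer takes an input distribution (mean, variance) and
# returns an output distribution (mean, variance). Not the authors' code;
# the mean/variance form is used here instead of natural parameters.
import numpy as np


class NPNLinear:
    def __init__(self, d_in, d_out, seed=0):
        rng = np.random.default_rng(seed)
        # Each weight and bias is itself a Gaussian: a mean and a variance.
        self.W_m = rng.normal(0.0, 0.1, size=(d_in, d_out))  # weight means
        self.W_s = np.full((d_in, d_out), 0.01)               # weight variances
        self.b_m = np.zeros(d_out)                            # bias means
        self.b_s = np.full(d_out, 0.01)                       # bias variances

    def forward(self, a_m, a_s):
        """Propagate input mean a_m and variance a_s through the layer.

        Assuming inputs and weights are independent Gaussians, the linear
        combination is matched to a Gaussian by its first two moments:
        Var(a*w) = Var(a)Var(w) + Var(a)E[w]^2 + E[a]^2 Var(w).
        """
        o_m = a_m @ self.W_m + self.b_m
        o_s = (a_s @ self.W_s
               + a_s @ (self.W_m ** 2)
               + (a_m ** 2) @ self.W_s
               + self.b_s)
        return o_m, o_s


# A deterministic input is just a distribution with zero variance.
x_m = np.array([[0.5, -1.2, 2.0]])
x_s = np.zeros_like(x_m)
layer = NPNLinear(3, 2)
h_m, h_s = layer.forward(x_m, x_s)
print(h_m.shape, h_s.shape)  # (1, 2) (1, 2): output mean and variance
```

The per-layer output variance h_s is what the abstract calls a second-order representation; a nonlinear activation would be handled in the same spirit, by matching the first two moments of the transformed distribution.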


