Testing the number of parameters with multidimensional MLP

02/21/2008
by Joseph Rynkiewicz, et al.

This work concerns testing the number of parameters in a one-hidden-layer multilayer perceptron (MLP). For this purpose, we assume that the models are identifiable up to a finite group of transformations on the weights; this is, for example, the case when the number of hidden units is known. In this framework, we show that the test statistic has a simple asymptotic distribution if the logarithm of the determinant of the empirical error covariance matrix is used as the cost function.
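
For a model with a d-dimensional output, the cost function named in the abstract is log det((1/n) Σ_t ε_t ε_tᵀ), where ε_t = y_t − F_θ(x_t) are the prediction errors of the MLP F_θ. Below is a minimal Python sketch of that criterion; it is not code from the paper, and the names log_det_cost and residuals are illustrative.

```python
import numpy as np

def log_det_cost(residuals):
    """Log-determinant of the empirical error covariance matrix.

    residuals : (n, d) array of prediction errors y_t - F_theta(x_t)
    for an MLP F_theta with a d-dimensional output.
    """
    n = residuals.shape[0]
    sigma = residuals.T @ residuals / n       # (d, d) empirical error covariance
    sign, logdet = np.linalg.slogdet(sigma)   # numerically stable log-determinant
    return logdet

# Illustrative usage on random residuals (hypothetical data):
rng = np.random.default_rng(0)
errors = rng.normal(size=(500, 3))            # n = 500 samples, d = 3 outputs
print(log_det_cost(errors))
```

For d = 1 this criterion reduces to the logarithm of the mean squared error, so it generalizes the usual least-squares cost to multidimensional regression.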

