Alternating optimization method based on nonnegative matrix factorizations for deep neural networks

05/16/2016
by   Tetsuya Sakurai, et al.

The backpropagation algorithm for calculating gradients has been widely used to compute the weights of deep neural networks (DNNs). This method requires derivatives of the objective function and can make it difficult to find appropriate parameters such as the learning rate. In this paper, we propose a novel approach for computing the weight matrices of fully-connected DNNs using two types of semi-nonnegative matrix factorizations (semi-NMFs). In this method, the optimization proceeds by updating the weight matrices alternately, and backpropagation (BP) is not used. We also present a method for computing a stacked autoencoder using an NMF; the outputs of the autoencoder are used as pre-training data for the DNNs. Experimental results show that our method, using three types of NMFs, attains error rates similar to those of conventional DNNs trained with BP.
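The paper builds its BP-free training scheme on semi-NMF, which factors a matrix X as F G with F unconstrained (like a weight matrix) and G nonnegative (like ReLU activations). As a hedged sketch of this building block, not of the paper's full alternating scheme, the following implements the standard multiplicative semi-NMF updates of Ding, Li & Jordan (2010) in NumPy; the function name and iteration count are illustrative choices:

```python
import numpy as np

def semi_nmf(X, k, n_iter=200, eps=1e-9, seed=0):
    """Approximate X (m x n) as F @ G, with F unconstrained and G >= 0,
    using the multiplicative updates of Ding, Li & Jordan (2010)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    G = rng.random((k, n))  # nonnegative factor, initialized randomly
    for _ in range(n_iter):
        # F-update: exact least-squares solution for fixed G
        F = X @ G.T @ np.linalg.pinv(G @ G.T)
        # G-update: multiplicative rule that preserves nonnegativity.
        # Split A and B into positive/negative parts: A = A+ - A-.
        A = F.T @ X            # k x n
        B = F.T @ F            # k x k (symmetric)
        Ap, An = (np.abs(A) + A) / 2, (np.abs(A) - A) / 2
        Bp, Bn = (np.abs(B) + B) / 2, (np.abs(B) - B) / 2
        G *= np.sqrt((Ap + Bn @ G) / (An + Bp @ G + eps))
    return F, G
```

Because G stays elementwise nonnegative throughout, it can play the role of a layer's post-activation output, which is what lets an alternating scheme of this kind replace gradient-based weight updates.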


Related research

- Credit Assignment Through Broadcasting a Global Error Vector (06/08/2021): Backpropagation (BP) uses detailed, unit-specific feedback to train deep...
- A Proximal Block Coordinate Descent Algorithm for Deep Neural Network Training (03/24/2018): Training deep neural networks (DNNs) efficiently is a challenge due to t...
- An induction proof of the backpropagation algorithm in matrix notation (07/20/2021): Backpropagation (BP) is a core component of the contemporary deep learni...
- Hamiltonian Deep Neural Networks Guaranteeing Non-vanishing Gradients by Design (05/27/2021): Deep Neural Networks (DNNs) training can be difficult due to vanishing a...
- A Theoretical View of Linear Backpropagation and Its Convergence (12/21/2021): Backpropagation is widely used for calculating gradients in deep neural ...
- A Novel Representation of Neural Networks (10/05/2016): Deep Neural Networks (DNNs) have become very popular for prediction in m...
- Comparison of Training Methods for Deep Neural Networks (04/26/2015): This report describes the difficulties of training neural networks and i...
