Use of Deterministic Transforms to Design Weight Matrices of a Neural Network

10/06/2021
by   Pol Grau Jurado, et al.

The self size-estimating feedforward network (SSFN) is a multilayer feedforward network. In the existing SSFN, one part of each weight matrix is trained using a layer-wise convex optimization approach (supervised training), while the other part is chosen as a random matrix instance (unsupervised). This article explores the use of deterministic transforms instead of random matrix instances for the SSFN weight matrices, which reduces computational complexity. Several deterministic transforms are investigated, such as the discrete cosine transform, the Hadamard transform, the Hartley transform, and wavelet transforms. The choice of a deterministic transform from a set of candidates is made in an unsupervised manner; to this end, two methods based on statistical parameters of the features are developed. The proposed methods make it possible to design a neural network in which the deterministic transform can vary across the layers' weight matrices. The effectiveness of the proposed approach vis-a-vis the SSFN is illustrated for object classification tasks on several benchmark datasets.
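The core idea above, replacing the random part of a layer's weight matrix with a fixed deterministic transform and choosing that transform in an unsupervised way, can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the function names are hypothetical, and the selection criterion (mean absolute off-diagonal correlation of the transformed features) is only a stand-in for the statistical-parameter methods the paper develops.

```python
import numpy as np
from scipy.fft import dct
from scipy.linalg import hadamard

def deterministic_part(x, transform="dct"):
    """Apply a deterministic transform to features x of shape
    (n_samples, n_features), playing the role of the random-matrix
    part of a layer's weight matrix. (Illustrative sketch.)"""
    if transform == "dct":
        return dct(x, axis=1, norm="ortho")
    if transform == "hadamard":
        n = x.shape[1]  # must be a power of 2 for the Hadamard transform
        H = hadamard(n) / np.sqrt(n)
        return x @ H
    raise ValueError(f"unknown transform: {transform}")

def choose_transform(x, candidates=("dct", "hadamard")):
    """Toy unsupervised selection: pick the candidate whose output
    features are least correlated on average. This stands in for the
    paper's statistical-parameter-based methods, which are different."""
    best, best_score = None, np.inf
    for t in candidates:
        z = deterministic_part(x, t)
        c = np.corrcoef(z, rowvar=False)
        score = np.abs(c - np.eye(c.shape[0])).mean()
        if score < best_score:
            best, best_score = t, score
    return best
```

Because the transform is fixed rather than learned or sampled, no weights need to be stored for this part of the layer, and fast algorithms (e.g. the FFT-based DCT) reduce the cost of applying it. A different transform can be selected at each layer by re-running the selection on that layer's input features.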


Related research

05/09/2017
New Transforms for JPEG Format
The two-dimensional discrete cosine transform (DCT) can be found in the ...

05/17/2019
SSFN: Self Size-estimating Feed-forward Network and Low Complexity Design
We design a self size-estimating feed-forward network (SSFN) using a joi...

03/14/2019
Learning Fast Algorithms for Linear Transforms Using Butterfly Factorizations
Fast linear transforms are ubiquitous in machine learning, including the...

07/05/2020
An Integer Approximation Method for Discrete Sinusoidal Transforms
Approximate methods have been considered as a means to the evaluation of...

10/02/2020
Deep Convolutional Transform Learning – Extended version
This work introduces a new unsupervised representation learning techniqu...

10/23/2017
Progressive Learning for Systematic Design of Large Neural Networks
We develop an algorithm for systematic design of a large artificial neur...

01/19/2021
On the derivation of the Khmaladze transforms
Some 40 years ago Khmaladze introduced a transform which greatly facilit...
