An Equivalence of Fully Connected Layer and Convolutional Layer

12/04/2017
by Wei Ma, et al.
EPFL

This article demonstrates that the convolution operation can be converted into a matrix multiplication, which is computed in the same way as a fully connected layer. The article helps beginners in neural networks understand how fully connected layers and convolutional layers work in the backend. To keep the article concise and readable, we consider only the linear case; it extends easily to the non-linear case by applying a non-linear function to the values, e.g. writing σ(x) as x'.
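The equivalence described above can be sketched in NumPy (this is an illustrative sketch, not the paper's code): each k×k patch of the input is unrolled into a row of a matrix (often called im2col), and multiplying that matrix by the flattened kernel reproduces the convolution — exactly the matrix–vector product a fully connected layer computes. The helper names `im2col` and `conv_as_matmul` are assumptions for this example.

```python
import numpy as np

def im2col(x, k):
    """Unroll each k x k patch of 2D input x into a row of a matrix."""
    H, W = x.shape
    out_h, out_w = H - k + 1, W - k + 1
    cols = np.empty((out_h * out_w, k * k))
    for i in range(out_h):
        for j in range(out_w):
            cols[i * out_w + j] = x[i:i + k, j:j + k].ravel()
    return cols

def conv_as_matmul(x, w):
    """'Valid' convolution (cross-correlation) of x with kernel w via matmul."""
    k = w.shape[0]
    out_h, out_w = x.shape[0] - k + 1, x.shape[1] - k + 1
    # The unrolled patches times the flattened kernel: a fully connected
    # layer with shared, sparsely reused weights reproduces the convolution.
    return (im2col(x, k) @ w.ravel()).reshape(out_h, out_w)

x = np.arange(16, dtype=float).reshape(4, 4)
w = np.ones((3, 3))

# Reference: direct sliding-window computation, for comparison.
ref = np.array([[np.sum(x[i:i + 3, j:j + 3] * w) for j in range(2)]
                for i in range(2)])
assert np.allclose(conv_as_matmul(x, w), ref)
```

In the linear case considered by the article, this makes the two layers literally the same operation up to the weight-matrix structure; a non-linearity would simply be applied elementwise to the product afterwards.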


