On the Equivalence of Convolutional and Hadamard Networks using DFT

10/27/2018
by Marcel Crasmaru, et al.

In this paper we introduce activation functions that move the entire computation of Convolutional Networks into the frequency domain, where they become Hadamard Networks. To achieve this result we employ the properties of the Discrete Fourier Transform. We present implementation details and experimental results, as well as some insights into why convolutional networks perform well in learning tasks.
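The equivalence rests on the convolution theorem: convolution in the spatial domain corresponds to an elementwise (Hadamard) product in the frequency domain. The sketch below illustrates this property with NumPy; it is an illustration of the underlying identity, not the paper's implementation, and the signal length and values are arbitrary.

```python
import numpy as np

n = 8
rng = np.random.default_rng(0)
x = rng.standard_normal(n)  # input signal
k = rng.standard_normal(n)  # kernel (same length; pad a shorter one with zeros)

# Direct circular convolution in the spatial domain.
direct = np.array(
    [sum(x[j] * k[(i - j) % n] for j in range(n)) for i in range(n)]
)

# Frequency domain: DFT both signals, take the Hadamard (elementwise)
# product, then invert the transform.
freq = np.fft.ifft(np.fft.fft(x) * np.fft.fft(k)).real

# The two computations agree up to floating-point error.
print(np.allclose(direct, freq))
```

This is why a convolutional layer, once its inputs and kernels are mapped through the DFT, can be expressed as an elementwise multiplication.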

