Efficient Neural Network Implementation with Quadratic Neuron

11/21/2020
by Zirui Xu, et al.

Previous work has shown that networks of linear neurons combined with nonlinear activation functions (e.g., ReLU) can approximate nonlinear functions. However, simply widening or deepening such a network introduces training problems. In this work, we aim to build a comprehensive second-order CNN implementation framework that covers both neuron/network design and system deployment optimization.
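The abstract does not spell out the neuron formulation, but one common way to build a quadratic (second-order) neuron without a full x^T W x weight matrix is to multiply two linear projections of the input and add an ordinary linear term. The sketch below (PyTorch; the class name QuadraticNeuronLayer and the factorized form y = (W_a x) * (W_b x) + W_c x + b are assumptions for illustration, not the paper's exact design) shows how such a layer could look.

```python
# Minimal sketch of a factorized quadratic-neuron layer.
# ASSUMPTION: the form y = (W_a x) * (W_b x) + W_c x + b is a common
# quadratic-neuron variant, not necessarily the one used in this paper.
import torch
import torch.nn as nn

class QuadraticNeuronLayer(nn.Module):
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # Two linear maps whose element-wise product gives the
        # second-order (quadratic) interaction term.
        self.lin_a = nn.Linear(in_features, out_features, bias=False)
        self.lin_b = nn.Linear(in_features, out_features, bias=False)
        # A third linear map keeps the standard first-order response.
        self.lin_c = nn.Linear(in_features, out_features, bias=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.lin_a(x) * self.lin_b(x) + self.lin_c(x)

# Usage: map 128-dimensional inputs to 64 quadratic neurons.
layer = QuadraticNeuronLayer(128, 64)
out = layer(torch.randn(8, 128))  # shape: (8, 64)
```

Compared with a full quadratic form x^T W x per output unit, this factorization keeps the parameter count linear in the input dimension, which is why variants of it are often used when deploying second-order neurons efficiently.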

Related research

09/06/2019 · Differential Equation Units: Learning Functional Forms of Activation Functions from Data
Most deep neural networks use simple, fixed activation functions, such a...

11/21/2020 · Central and Non-central Limit Theorems arising from the Scattering Transform and its Neural Activation Generalization
Motivated by analyzing complicated and non-stationary time series, we st...

04/01/2022 · QuadraLib: A Performant Quadratic Neural Network Library for Architecture Optimization and Design Exploration
The significant success of Deep Neural Networks (DNNs) is highly promote...

05/07/2020 · NetPyNE implementation and rescaling of the Potjans-Diesmann cortical microcircuit model
The Potjans-Diesmann cortical microcircuit model is a widely used model ...

07/13/2023 · Efficient SGD Neural Network Training via Sublinear Activated Neuron Identification
Deep learning has been widely used in many fields, but the model trainin...

08/07/2013 · A Note on Topology Preservation in Classification, and the Construction of a Universal Neuron Grid
It will be shown that according to theorems of K. Menger, every neuron g...

01/15/2020 · Learning a Single Neuron with Gradient Methods
We consider the fundamental problem of learning a single neuron x ↦ σ(w^⊤ x...
