Max and Coincidence Neurons in Neural Networks

10/04/2021
by Albert Lee, et al.

Network design has been a central topic in machine learning. Considerable effort has been devoted to creating efficient architectures, both through manual exploration and through automated neural architecture search. However, today's architectures have yet to account for the diversity of neurons and the existence of neurons with specific processing functions. In this work, we optimize networks containing models of the max and coincidence neurons using neural architecture search, and analyze the structure, operations, and neurons of the optimized networks to develop a signal-processing ResNet. The developed network achieves an average 2x reduction in network size across a variety of datasets, demonstrating the importance of neuronal functions in creating compact, efficient networks.

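The abstract does not spell out the neuron models themselves, so the sketch below is only an illustrative interpretation: a "max" neuron that outputs the maximum of its weighted inputs rather than their sum, and a "coincidence" neuron that responds to how many weighted inputs exceed a threshold at the same time. The layer names, the soft (sigmoid) thresholding, and all parameter shapes are assumptions for illustration, not the paper's formulation.

```python
import torch
import torch.nn as nn


class MaxNeuronLayer(nn.Module):
    """Hypothetical max-neuron layer: each output unit takes the max over its
    weighted inputs instead of a weighted sum (assumed model)."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_features) -> weighted: (batch, out_features, in_features)
        weighted = x.unsqueeze(1) * self.weight
        # Max over inputs replaces the usual summation.
        return weighted.max(dim=-1).values


class CoincidenceNeuronLayer(nn.Module):
    """Hypothetical coincidence-neuron layer: each output unit softly counts how
    many weighted inputs are above a learned threshold simultaneously (assumed model)."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.1)
        self.threshold = nn.Parameter(torch.zeros(out_features, in_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weighted = x.unsqueeze(1) * self.weight
        # Sigmoid gives a differentiable "input above threshold" indicator.
        coincidences = torch.sigmoid(weighted - self.threshold)
        # Summing the indicators yields a soft count of coinciding inputs.
        return coincidences.sum(dim=-1)


# Example usage with arbitrary sizes:
#   y = MaxNeuronLayer(16, 8)(torch.randn(4, 16))           # shape (4, 8)
#   z = CoincidenceNeuronLayer(16, 8)(torch.randn(4, 16))   # shape (4, 8)
```

Layers like these could, in principle, be mixed with standard convolutional or linear layers inside a search space so that neural architecture search selects where each neuron type is used; the specific search setup here is not described in the abstract.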

