
Max and Coincidence Neurons in Neural Networks

by Albert Lee, et al.

Network design has been a central topic in machine learning. Large amounts of effort have been devoted to creating efficient architectures, both through manual exploration and through automated neural architecture search. However, today's architectures have yet to consider the diversity of neurons and the existence of neurons with specific processing functions. In this work, we optimize networks containing models of the max and coincidence neurons using neural architecture search, and analyze the structure, operations, and neurons of the optimized networks to develop a signal-processing ResNet. The developed network achieves an average 2× reduction in network size across a variety of datasets, demonstrating the importance of neuronal functions in creating compact, efficient networks.
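The abstract contrasts conventional neurons with max and coincidence neuron models but does not define them; a minimal sketch may help. The following is an illustration, not the paper's implementation: it assumes a max neuron responds to its single strongest weighted input rather than the weighted sum, and a coincidence neuron fires only when several inputs are active at once (the threshold `k` is a hypothetical parameter).

```python
import numpy as np

def sum_neuron(x, w):
    # Conventional neuron: weighted sum followed by a ReLU.
    return max(0.0, float(np.dot(w, x)))

def max_neuron(x, w):
    # Sketch of a max neuron: responds to the single strongest
    # weighted input rather than the sum of all inputs.
    return max(0.0, float(np.max(w * x)))

def coincidence_neuron(x, w, k=2):
    # Sketch of a coincidence neuron: fires only when at least k
    # weighted inputs are simultaneously active (positive).
    weighted = w * x
    if np.count_nonzero(weighted > 0) >= k:
        return max(0.0, float(np.sum(weighted)))
    return 0.0

x = np.array([0.5, -0.2, 0.8])
w = np.array([1.0, 1.0, 1.0])
print(sum_neuron(x, w))          # sum of all weighted inputs
print(max_neuron(x, w))          # strongest single weighted input
print(coincidence_neuron(x, w))  # fires: two inputs are active
```

Under this sketch, the max neuron is insensitive to how many inputs are active, while the coincidence neuron gates its output on simultaneous activity, which is the kind of specialized processing function the abstract argues architecture search should exploit.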



