Bio-inspired Min-Nets Improve the Performance and Robustness of Deep Networks

01/06/2022
by Philipp Grüning, et al.

Min-Nets are inspired by end-stopped cortical cells, with units that output the minimum of the responses of two learned filters. We insert such Min-units into state-of-the-art deep networks, such as the popular ResNet and DenseNet, and show that the resulting Min-Nets perform better on the CIFAR-10 benchmark. Moreover, we show that Min-Nets are more robust against JPEG compression artifacts. We argue that the minimum operation is the simplest way of implementing an AND operation on pairs of filters, and that such AND operations introduce a bias that is appropriate given the statistics of natural images.
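As a rough illustration of the idea described in the abstract (not the authors' code), a Min-unit can be sketched in PyTorch as an element-wise minimum over the responses of two learned convolutions. The module name MinUnit and its hyperparameters below are assumptions made for the example.

```python
import torch
import torch.nn as nn


class MinUnit(nn.Module):
    """Illustrative Min-unit: applies two learned convolution filters to the
    same input and returns their element-wise minimum, an AND-like
    combination of the two filter responses."""

    def __init__(self, in_channels, out_channels, kernel_size=3, padding=1):
        super().__init__()
        # Two independently learned filter banks operating on the same input.
        self.conv_a = nn.Conv2d(in_channels, out_channels, kernel_size, padding=padding)
        self.conv_b = nn.Conv2d(in_channels, out_channels, kernel_size, padding=padding)

    def forward(self, x):
        # The minimum is large only where BOTH filter responses are large,
        # which is the AND-style behaviour the abstract describes.
        return torch.minimum(self.conv_a(x), self.conv_b(x))


if __name__ == "__main__":
    # Quick shape check on a CIFAR-10-sized input (3 x 32 x 32).
    unit = MinUnit(3, 16)
    y = unit(torch.randn(1, 3, 32, 32))
    print(y.shape)  # torch.Size([1, 16, 32, 32])
```

In a ResNet or DenseNet, such a unit would presumably take the place of a standard convolution stage; the minimum passes a strong response only where both filters agree, which is the bias the abstract argues matches the statistics of natural images.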

Related research:

08/18/2020 · Feature Products Yield Efficient Networks
We introduce Feature-Product networks (FP-nets) as a novel deep-network ...

10/17/2019 · On Concept of Creative Petri Nets
A new formalism of Petri nets, based on the adoption of the "position-ar...

11/12/2017 · D-PCN: Parallel Convolutional Neural Networks for Image Recognition in Reverse Adversarial Style
In this paper, a recognition framework named D-PCN using a discriminator...

06/08/2016 · DISCO Nets: DISsimilarity COefficient Networks
We present a new type of probabilistic model which we call DISsimilarity...

09/18/2014 · Deeply-Supervised Nets
Our proposed deeply-supervised nets (DSN) method simultaneously minimize...

08/11/2016 · Faster Training of Very Deep Networks Via p-Norm Gates
A major contributing factor to the recent advances in deep neural networ...

12/31/2018 · Convex Relaxations of Convolutional Neural Nets
We propose convex relaxations for convolutional neural nets with one hid...
