Variations of Squeeze and Excitation networks

04/11/2023
by Mahendran et al.

Convolutional neural networks learn spatial features that are heavily interlinked within their kernels. The squeeze-and-excitation (SE) module breaks the traditional pattern of passing a layer's entire output unchanged to the next layer: instead, it emphasizes only the important features for subsequent learning through its squeeze and excitation operations. We propose variations of the SE module that refine the squeeze and excitation process and enhance performance. The proposed ways of squeezing or exciting a layer allow a smoother transition of layer weights, while retaining the characteristics of the original SE module. Experiments are carried out on residual networks and the results are tabulated.
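For reference, the baseline SE module that these variations build on squeezes each channel of a feature map to a single descriptor via global average pooling and then excites it through a small gating network whose outputs rescale the channels. The sketch below is a minimal PyTorch illustration of that standard block only, not of the proposed variations; the class name `SEBlock` and the reduction ratio of 16 are conventional choices rather than details taken from this paper.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Standard squeeze-and-excitation block: global average pooling
    ("squeeze") followed by a two-layer gating MLP ("excitation")
    that rescales each channel of the input feature map."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        s = x.mean(dim=(2, 3))            # squeeze: (B, C) channel descriptor
        w = self.fc(s).view(b, c, 1, 1)   # excitation: per-channel gates in (0, 1)
        return x * w                      # reweight the original feature map
```

In a residual network, such a block is typically applied to the output of each residual branch before the identity shortcut is added back, which matches the experimental setting described in the abstract.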
