LogAvgExp Provides a Principled and Performant Global Pooling Operator

11/02/2021
by   Scott C. Lowe, et al.

We seek to improve the pooling operation in neural networks by applying a more theoretically justified operator. We demonstrate that LogSumExp provides a natural OR operator for logits. When one corrects for the number of elements inside the pooling operator, this becomes LogAvgExp := log(mean(exp(x))). By introducing a single temperature parameter t, LogAvgExp smoothly transitions from the max of its operands (in the limit t → 0^+) to their mean (in the limit t → +∞). We experimentally tested LogAvgExp, both with and without a learnable temperature parameter, in a variety of deep neural network architectures for computer vision.
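The temperature-parameterised form described above is t · log(mean(exp(x / t))). The following is a minimal PyTorch sketch of this operator, not the authors' released implementation; the function name and signature are illustrative, and logsumexp is used for numerical stability:

```python
import math
import torch

def logavgexp(x: torch.Tensor, dim: int = -1, t: float = 1.0) -> torch.Tensor:
    """LogAvgExp pooling: t * log(mean(exp(x / t))).

    As t -> 0+ the output approaches max(x) over `dim`; as t -> +inf it
    approaches mean(x) over `dim`. Computed via logsumexp for stability:
    t * log(mean(exp(x/t))) = t * (logsumexp(x/t) - log(n)).
    """
    n = x.shape[dim]
    return t * (torch.logsumexp(x / t, dim=dim) - math.log(n))

# Quick check of the limiting behaviour:
x = torch.tensor([0.0, 1.0, 5.0])
print(logavgexp(x, t=0.01))   # ~5.0, close to max(x)
print(logavgexp(x, t=100.0))  # ~2.0, close to mean(x)
```

To make the temperature learnable, as tested in the paper, t could be registered as an nn.Parameter and constrained positive (e.g. via softplus).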

