Deep Generalized Max Pooling
Global pooling layers are an essential part of Convolutional Neural Networks (CNNs). In several state-of-the-art architectures they aggregate the activations of all spatial locations into a fixed-size vector; global average pooling or global max pooling is commonly used to convert the convolutional features of variable-size images into a fixed-size embedding. However, both pooling types are computed spatially independently: each activation map is pooled on its own, so activations from very different locations are simply merged together. In contrast, we propose Deep Generalized Max Pooling, which balances the contribution of all activations of a spatially coherent region by re-weighting the descriptors so that the impact of frequent and rare ones is equalized. We show that this layer is superior to both average and max pooling on the classification of Latin medieval manuscripts (CLAMM'16, CLAMM'17), as well as on writer identification (Historical-WI'17).
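The re-weighting described in the abstract follows the generalized max pooling idea of Murray and Perronnin: find a pooled vector xi whose dot product with every local descriptor is (approximately) the same constant, which is solved as a ridge regression, xi = (X X^T + lambda I)^(-1) X 1, where the columns of X are the C-dimensional descriptors at the N = H*W spatial positions. Below is a minimal PyTorch sketch of such a layer, assuming activations of shape (B, C, H, W); the class name and the default regularization constant `lam` are illustrative choices, not the authors' reference implementation.

```python
import torch
import torch.nn as nn


class DeepGeneralizedMaxPooling(nn.Module):
    """Global pooling via generalized max pooling (a sketch).

    Pools a (B, C, H, W) activation tensor into a (B, C) embedding by
    solving a ridge regression so that every local descriptor has roughly
    the same dot product with the pooled vector, which equalizes the
    influence of frequent and rare activations.
    """

    def __init__(self, lam: float = 1e3):
        super().__init__()
        self.lam = lam  # regularization strength (hypothetical default)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # Treat each spatial position as one C-dimensional descriptor:
        # X has shape (B, C, N) with N = H * W descriptors per image.
        X = x.reshape(b, c, h * w)
        # Solve (X X^T + lam * I) xi = X 1 for each image in the batch.
        K = X @ X.transpose(1, 2)                        # (B, C, C)
        K = K + self.lam * torch.eye(c, device=x.device, dtype=x.dtype)
        rhs = X.sum(dim=2, keepdim=True)                 # X @ 1 -> (B, C, 1)
        xi = torch.linalg.solve(K, rhs)                  # (B, C, 1)
        return xi.squeeze(-1)                            # (B, C) embedding
```

Note the role of lambda: as it grows, xi approaches (1/lambda) X 1, i.e. a scaled version of global average pooling, while small values recover the exact equalized solution, so lambda interpolates between the two regimes.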