High-Level Features Parallelization for Inference Cost Reduction Through Selective Attention

08/09/2023
by Andre Peter Kelm, et al.

In this work, we parallelize high-level features in deep networks so that class-specific features can be selectively skipped or selected to reduce inference costs. This is challenging for most deep learning methods, which cannot efficiently and effectively focus on a selected subset of class-specific features without retraining. We propose a serial-parallel hybrid architecture with serial generic low-level features and parallel class-specific high-level features. This reflects the fact that many high-level features are class-specific rather than generic, and it connects to recent neuroscientific findings of spatially and contextually separated neural activations in the human brain. Our approach provides the unique functionality of cutouts: selecting parts of the network to focus only on relevant subsets of classes, without requiring retraining. High performance is maintained, while the cost of inference can be significantly reduced. In some of our examples, up to 75 % of the parameters are skipped and 35 % fewer GMACs (giga multiply-accumulate operations) are used as the approach adapts to a change in task complexity. This is important for mobile, industrial, and robotic applications, where reducing the number of parameters, the computational complexity, and thus the power consumption can be paramount. Another unique functionality is that processing can be directly influenced by enhancing or inhibiting high-level class-specific features, similar to the mechanism of selective attention in the human brain. This can be relevant for cross-modal applications, the use of semantic prior knowledge, and/or context-aware processing.
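
The serial-parallel split and the cutout mechanism described above can be illustrated with a short sketch. The following is a minimal PyTorch-style example written for this summary, not the authors' implementation: the class name SerialParallelHybrid, the layer choices, the grouping of classes into branches, and the gains argument are all illustrative assumptions. A shared serial backbone computes generic low-level features, each parallel branch computes high-level features for one group of classes, and a cutout is simply the subset of branches evaluated at inference time.

```python
# Minimal sketch (assumed PyTorch-style code, not the paper's implementation):
# a serial backbone extracts generic low-level features, and parallel branches
# extract class-specific high-level features. A "cutout" evaluates only the
# branches for the classes of interest, so the remaining branches contribute
# neither active parameters nor multiply-accumulate operations at inference.
import torch
import torch.nn as nn


class SerialParallelHybrid(nn.Module):
    def __init__(self, num_groups: int, feat_dim: int = 64, classes_per_group: int = 10):
        super().__init__()
        # Serial generic low-level feature extractor, shared by all classes.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(feat_dim, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Parallel high-level branches, one per group of classes.
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(feat_dim, feat_dim, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(feat_dim, classes_per_group),
            )
            for _ in range(num_groups)
        )

    def forward(self, x, active_groups=None, gains=None):
        # active_groups: indices of the branches to evaluate (the "cutout").
        # gains: optional per-branch scaling factors that enhance or inhibit
        # class-specific features, loosely analogous to selective attention.
        feats = self.backbone(x)
        if active_groups is None:
            active_groups = range(len(self.branches))
        outputs = []
        for i in active_groups:
            logits = self.branches[i](feats)
            if gains is not None:
                logits = gains[i] * logits
            outputs.append(logits)
        return torch.cat(outputs, dim=1)


# Example: a task involving only two class groups evaluates two branches;
# the other branches (and their parameters) are skipped without retraining.
model = SerialParallelHybrid(num_groups=8)
x = torch.randn(1, 3, 64, 64)
scores = model(x, active_groups=[2, 5])   # shape: (1, 2 * classes_per_group)
```

In this sketch, skipping a branch removes its parameters and multiply-accumulate operations from the forward pass entirely, which is how a cutout reduces inference cost without retraining; the optional gains argument mimics enhancing or inhibiting class-specific features in the spirit of selective attention.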

Related research

02/21/2017  Mimicking Ensemble Learning with Deep Branched Networks
This paper proposes a branched residual network for image classification...

12/17/2019  Feature Fusion Use Unsupervised Prior Knowledge to Let Small Object Represent
Fusing low level and high level features is a widely used strategy to pr...

11/15/2018  Selective Feature Connection Mechanism: Concatenating Multi-layer CNN Features with a Feature Selector
Different layers of deep convolutional neural networks (CNN) can encode d...

06/28/2022  SHELS: Exclusive Feature Sets for Novelty Detection and Continual Learning Without Class Boundaries
While deep neural networks (DNNs) have achieved impressive classificatio...

05/24/2019  Not All Features Are Equal: Feature Leveling Deep Neural Networks for Better Interpretation
Self-explaining models are models that reveal decision making parameters...

08/07/2020  Revisiting Mid-Level Patterns for Distant-Domain Few-Shot Recognition
Cross-domain few-shot learning (FSL) is proposed recently to transfer kn...

12/20/2016  Beyond Skip Connections: Top-Down Modulation for Object Detection
In recent years, we have seen tremendous progress in the field of object...
