Encoding Hierarchical Information in Neural Networks helps in Subpopulation Shift

12/20/2021
by Amitangshu Mukherjee et al.

Over the past decade, deep neural networks have proven adept at image classification, often surpassing humans in accuracy. However, standard neural networks fail to capture the hierarchical structure and dependencies among classes in vision tasks. Humans, on the other hand, appear to learn categories conceptually, progressing from high-level concepts down to granular subcategories. One consequence of a network's inability to encode such dependencies in its learned structure is subpopulation shift: models are queried with novel, unseen classes drawn from a shifted population of the training-set categories. Because the network treats each class as independent of all others, it struggles to categorize shifting populations that are related at higher levels of the hierarchy. In this work, we study these problems through the lens of a novel conditional supervised training framework. We tackle subpopulation shift with a structured learning procedure that incorporates hierarchical information conditionally through labels. Furthermore, we introduce a notion of graphical distance to model the catastrophic effect of mispredictions. We show that learning in this structured hierarchical manner yields networks that are more robust to subpopulation shift, improving on standard models by around 2% in accuracy and around 8.5% in graphical distance on subpopulation shift benchmarks.
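To make the abstract's two ideas concrete, here is a minimal PyTorch sketch, not the authors' implementation: the names HierarchicalClassifier, hierarchical_loss, graphical_distance, and fine_to_coarse are hypothetical, and the two-level conditioning (a fine-grained head fed the coarse prediction) and the tree-distance values are simplifying assumptions for a two-level label hierarchy.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HierarchicalClassifier(nn.Module):
    """Shared backbone with a coarse (superclass) head and a fine
    (subclass) head; the fine head is conditioned on the coarse output.
    Illustrative sketch only, not the paper's architecture."""
    def __init__(self, in_dim: int, n_coarse: int, n_fine: int):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.coarse_head = nn.Linear(256, n_coarse)
        # The fine head sees both the features and the coarse prediction,
        # so subclass decisions are made conditionally on the higher
        # level of the hierarchy.
        self.fine_head = nn.Linear(256 + n_coarse, n_fine)

    def forward(self, x):
        h = self.backbone(x)
        coarse = self.coarse_head(h)
        fine = self.fine_head(torch.cat([h, coarse.softmax(dim=1)], dim=1))
        return coarse, fine

def hierarchical_loss(coarse_logits, fine_logits, y_coarse, y_fine):
    # Supervise both levels; the coarse term encodes the dependency
    # between superclasses and their subclasses.
    return (F.cross_entropy(coarse_logits, y_coarse)
            + F.cross_entropy(fine_logits, y_fine))

def graphical_distance(pred_fine: int, true_fine: int,
                       fine_to_coarse: dict) -> int:
    """One possible tree distance for a two-level hierarchy: 0 for a
    correct prediction, 2 for a sibling leaf under the same superclass,
    4 when the predicted leaf sits under a different superclass."""
    if pred_fine == true_fine:
        return 0
    return 2 if fine_to_coarse[pred_fine] == fine_to_coarse[true_fine] else 4
```

A quick smoke test under the same assumptions: `model = HierarchicalClassifier(in_dim=784, n_coarse=5, n_fine=20)` followed by `coarse_logits, fine_logits = model(torch.randn(8, 784))` yields one logit tensor per hierarchy level, and mispredictions farther from the true leaf in the label tree incur a larger graphical distance.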

Related research

03/13/2019 · All You Need is a Few Shifts: Designing Efficient Convolutional Neural Networks for Image Classification
Shift operation is an efficient alternative over depthwise separable con...

10/17/2017 · Do Convolutional Neural Networks Learn Class Hierarchy?
Convolutional Neural Networks (CNNs) currently achieve state-of-the-art ...

02/28/2018 · Memory-based Parameter Adaptation
Deep neural networks have excelled on a wide range of problems, from vis...

01/14/2023 · Functional Neural Networks: Shift invariant models for functional data with applications to EEG classification
It is desirable for statistical models to detect signals of interest ind...

08/14/2020 · Abstracting Deep Neural Networks into Concept Graphs for Concept Level Interpretability
The black-box nature of deep learning models prevents them from being co...

06/07/2019 · Resampling-based Assessment of Robustness to Distribution Shift for Deep Neural Networks
A novel resampling framework is proposed to evaluate the robustness and ...

02/03/2017 · Structured Attention Networks
Attention networks have proven to be an effective approach for embedding...
