Reduction of Class Activation Uncertainty with Background Information

05/05/2023
by H M Dipu Kabir, et al.

Multitask learning is a popular approach to training high-performing neural networks with improved generalization. In this paper, we propose a background class that achieves improved generalization at a lower computational cost than multitask learning, to help researchers and organizations with limited computation power. We also present a methodology for selecting background images and discuss potential future improvements. We apply our approach to several datasets and achieve improved generalization with much lower computation. We also investigate class activation maps (CAMs) of the trained model and observe a tendency to look at a bigger picture in few-class classification problems with the proposed training methodology. Example scripts are available in the 'CAM' folder of the following GitHub repository: github.com/dipuk0506/UQ
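The following is a minimal, hypothetical PyTorch sketch of the core idea only; the data-folder layout, the ResNet-18 backbone, and all hyperparameters are illustrative assumptions, not the authors' exact configuration (their scripts live in the 'CAM' folder of the repository linked above). An extra 'background' folder of images that belong to none of the target classes adds one more output to the classification head.

```python
# Illustrative sketch: training a classifier with an extra "background" class.
# Folder names, model choice, and hyperparameters are assumptions, not the
# authors' exact setup; see the 'CAM' folder of github.com/dipuk0506/UQ.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Assumed layout: data/train/<class_0>/, ..., data/train/<class_k>/, data/train/background/
# The "background" folder holds images that belong to none of the target classes.
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("data/train", transform=transform)

# The background folder contributes one more class, so the head gets (k + 1) outputs.
num_outputs = len(train_set.classes)  # target classes + 1 background class
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, num_outputs)

loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

model.train()
for images, labels in loader:
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```

At test time the background logit can simply be ignored (or used to flag inputs that belong to no target class), and a standard CAM for any predicted class can be obtained by weighting the final convolutional feature maps with the corresponding row of the learned fc weights.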

