DeepMI: A Mutual Information Based Framework For Unsupervised Deep Learning of Tasks

01/16/2021
by Ashish Kumar, et al.

In this work, we propose DeepMI, an information-theoretic framework for training deep neural networks (DNNs) using Mutual Information (MI). DeepMI is targeted at, but not limited to, learning real-world tasks in an unsupervised manner. The primary motivation behind this work is the insufficiency of traditional loss functions for unsupervised task learning. Moreover, using MI directly as a training objective is challenging because MI is unbounded above. We therefore develop an alternative linearized representation of MI as part of the framework. The contributions of this paper are threefold: i) an investigation of MI for training deep neural networks, ii) a novel loss function, LLMI, and iii) a fuzzy-logic-based, end-to-end differentiable pipeline that integrates DeepMI into a deep learning framework. We choose a few unsupervised learning tasks for our experimental study, and demonstrate that LLMI alone provides better gradients, and hence better network performance, than combinations of multiple loss functions on a given task.
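The "unbounded above" issue the abstract raises can be seen with a small computation. The sketch below is not the paper's LLMI loss (whose exact form is not given here); it is a generic illustration, assuming discrete variables, of how plain MI grows without bound with alphabet size, and of one common normalization that bounds it in [0, 1]:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a probability vector, ignoring zero entries."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def mutual_information(joint):
    """MI of a discrete joint distribution P(X, Y) given as a 2-D array."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()
    px = joint.sum(axis=1)
    py = joint.sum(axis=0)
    # MI(X; Y) = H(X) + H(Y) - H(X, Y)
    return entropy(px) + entropy(py) - entropy(joint.ravel())

def normalized_mi(joint):
    """MI divided by min(H(X), H(Y)): one standard way to bound MI in [0, 1].
    This is an illustrative normalization, not the paper's linearized form."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()
    px = joint.sum(axis=1)
    py = joint.sum(axis=0)
    denom = min(entropy(px), entropy(py))
    return mutual_information(joint) / denom if denom > 0 else 0.0

# Perfectly correlated variables: MI equals H(X) = log(n), which grows
# without bound as the alphabet size n increases; the normalized form stays at 1.
for n in (2, 8, 32):
    joint = np.eye(n)
    print(n, mutual_information(joint), normalized_mi(joint))
```

This is why a raw MI objective is awkward to optimize or compare across tasks, and why the paper instead trains against a bounded, linearized surrogate.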


Related research

- 07/05/2023, Loss Functions and Metrics in Deep Learning. A Review: "One of the essential components of deep learning is the choice of the lo..."
- 05/24/2018, Entropy and mutual information in models of deep neural networks: "We examine a class of deep learning models with a tractable method to co..."
- 07/17/2018, Invariant Information Distillation for Unsupervised Image Segmentation and Clustering: "We present a new method that learns to segment and cluster images withou..."
- 07/26/2018, Superpixel Sampling Networks: "Superpixels provide an efficient low/mid-level representation of image d..."
- 05/26/2019, Deep Online Learning with Stochastic Constraints: "Deep learning models are considered to be state-of-the-art in many offli..."
- 05/23/2023, Reviewing Evolution of Learning Functions and Semantic Information Measures for Understanding Deep Learning: "A new trend in deep learning, represented by Mutual Information Neural E..."
- 09/17/2023, Conditional Mutual Information Constrained Deep Learning for Classification: "The concepts of conditional mutual information (CMI) and normalized cond..."
