Information theoretic learning of robust deep representations

05/30/2019
by   Nicolas Pinchaud, et al.

We propose a novel objective function, based on information theory, for learning robust deep representations of data. Data is projected into a feature-vector space such that the mutual information between every subset of features and the supervising signal is maximized. This objective yields robust representations by preserving the information relevant to supervision even when some features are noisy or unavailable. Although the objective is not directly tractable, we derive a surrogate objective whose minimization encourages features to be non-redundant and conditionally independent given the supervising signal. To evaluate the quality of the obtained solutions, we performed a set of preliminary experiments that show promising results.
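
The abstract does not spell out the surrogate objective, so the following is only a minimal illustrative sketch of the general idea: a standard supervised loss combined with a within-class feature-decorrelation penalty, used here as a second-order proxy for conditional independence of features given the supervising signal. The function name `surrogate_loss` and the `penalty_weight` parameter are assumptions for illustration, not the paper's formulation.

```python
# Hypothetical sketch (not the paper's exact surrogate): cross-entropy keeps
# features informative about the labels, while a penalty on off-diagonal
# within-class feature correlations discourages redundant features
# conditioned on the supervising signal.
import torch
import torch.nn.functional as F

def surrogate_loss(features, logits, labels, penalty_weight=0.1):
    """features: (batch, d) representation; logits: (batch, num_classes)."""
    # Supervised term: preserve information about the supervising signal.
    ce = F.cross_entropy(logits, labels)

    # Conditional-redundancy penalty: within each class, penalize
    # off-diagonal entries of the feature correlation matrix.
    penalty = features.new_zeros(())
    for c in labels.unique():
        fc = features[labels == c]
        if fc.shape[0] < 2:
            continue
        fc = fc - fc.mean(dim=0, keepdim=True)
        cov = fc.t() @ fc / (fc.shape[0] - 1)
        std = cov.diagonal().clamp_min(1e-8).sqrt()
        corr = cov / (std[:, None] * std[None, :])
        off_diag = corr - torch.diag(corr.diagonal())
        penalty = penalty + off_diag.pow(2).mean()

    return ce + penalty_weight * penalty
```

Note that decorrelation only removes second-order dependence; the paper's surrogate targets conditional independence more generally, so this sketch should be read as a rough stand-in under that stated assumption.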


