Class-Conditional Compression and Disentanglement: Bridging the Gap between Neural Networks and Naive Bayes Classifiers

06/06/2019
by Rana Ali Amjad, et al.

In this draft, which reports on work in progress, we 1) adapt the information bottleneck functional by replacing the compression term with a class-conditional compression term, 2) relax this functional using a variational bound related to class-conditional disentanglement, 3) consider this functional as a training objective for stochastic neural networks, and 4) show that the latent representations are learned such that they can be used in a naive Bayes classifier. We continue by suggesting a series of experiments along the lines of Nonlinear Information Bottleneck [Kolchinsky et al., 2018], Deep Variational Information Bottleneck [Alemi et al., 2017], and Information Dropout [Achille and Soatto, 2018]. We furthermore suggest a neural network architecture in which the decoder is a parameterized naive Bayes decoder.
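To make the description above concrete, the following is a minimal, hypothetical sketch (not the authors' reference implementation) of how such a training objective could be set up in PyTorch. Assuming the standard information bottleneck form, the adapted objective plausibly reads min I(X;Z|Y) - beta * I(Y;Z), with the class-conditional compression term upper-bounded by E[KL(q(z|x) || r(z|y))] for a per-class factorized Gaussian r(z|y); the same per-class densities can then serve as a parameterized naive Bayes decoder p(y|z) proportional to p(y) * prod_d r(z_d|y). All names (Encoder, NaiveBayesDecoder, beta) and the exact form of the bound are illustrative assumptions.

# Hypothetical sketch of a class-conditional variational IB-style objective with a
# naive Bayes decoder. Names and the specific bound are assumptions, not taken from the paper.

import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Stochastic encoder q(z|x): a diagonal Gaussian whose parameters come from an MLP."""
    def __init__(self, x_dim, z_dim, hidden=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, z_dim)
        self.logvar = nn.Linear(hidden, z_dim)

    def forward(self, x):
        h = self.net(x)
        return self.mu(h), self.logvar(h)

class NaiveBayesDecoder(nn.Module):
    """Per-class factorized Gaussian r(z|y) plus class priors, used as a naive Bayes decoder."""
    def __init__(self, z_dim, n_classes):
        super().__init__()
        self.class_mu = nn.Parameter(torch.randn(n_classes, z_dim))
        self.class_logvar = nn.Parameter(torch.zeros(n_classes, z_dim))
        self.class_logprior = nn.Parameter(torch.zeros(n_classes))

    def log_r_z_given_y(self, z):
        # log r(z|y) for every class; latent dimensions are independent given y.
        z = z.unsqueeze(1)                                    # (batch, 1, z_dim)
        mu, logvar = self.class_mu, self.class_logvar         # (n_classes, z_dim)
        log_density = -0.5 * (logvar + (z - mu) ** 2 / logvar.exp()
                              + math.log(2 * math.pi))
        return log_density.sum(-1)                            # (batch, n_classes)

    def forward(self, z):
        # Unnormalized log p(y|z) = log p(y) + log r(z|y); softmax over classes normalizes it.
        return F.log_softmax(self.class_logprior, dim=0) + self.log_r_z_given_y(z)

def class_conditional_ib_loss(encoder, decoder, x, y, beta=1e-2):
    mu, logvar = encoder(x)
    z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()      # reparameterization trick

    # Relevance term: cross-entropy of the naive Bayes decoder (a bound related to -I(Y;Z)).
    ce = F.cross_entropy(decoder(z), y)

    # Class-conditional compression term: KL( q(z|x) || r(z|y) ) for the true class y,
    # a variational upper bound related to I(X;Z|Y); closed form for two diagonal Gaussians.
    mu_y = decoder.class_mu[y]
    logvar_y = decoder.class_logvar[y]
    kl = 0.5 * (logvar_y - logvar
                + (logvar.exp() + (mu - mu_y) ** 2) / logvar_y.exp() - 1).sum(-1)

    return ce + beta * kl.mean()

At test time, classification would use only the decoder applied to the encoded latent, i.e. argmax over the class dimension of decoder(z), which is exactly a naive Bayes decision rule over the learned, class-conditionally disentangled latent dimensions.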


