Drop-Bottleneck: Learning Discrete Compressed Representation for Noise-Robust Exploration

03/23/2021
by Jaekyeom Kim, et al.

We propose a novel information bottleneck (IB) method named Drop-Bottleneck, which discretely drops features that are irrelevant to the target variable. Drop-Bottleneck not only enjoys a simple and tractable compression objective but also provides a deterministic compressed representation of the input variable, which is useful for inference tasks that require consistent representations. Moreover, it can jointly learn a feature extractor and select features according to each feature dimension's relevance to the target task, which is unattainable by most neural network-based IB methods. We propose an exploration method based on Drop-Bottleneck for reinforcement learning tasks. In a multitude of noisy and reward-sparse maze navigation tasks in VizDoom (Kempka et al., 2016) and DMLab (Beattie et al., 2016), our exploration method achieves state-of-the-art performance. As a new IB framework, we demonstrate that Drop-Bottleneck outperforms the Variational Information Bottleneck (VIB) (Alemi et al., 2017) in multiple aspects, including adversarial robustness and dimensionality reduction.
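To illustrate the core idea of discretely dropping feature dimensions, the following is a minimal NumPy sketch. It assumes a hypothetical vector of per-dimension drop probabilities (`drop_prob`, which the paper learns jointly with the feature extractor); during training each dimension is dropped stochastically, while at inference time a deterministic representation keeps only the dimensions likely to survive. All names and the 0.5 threshold here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical learned per-dimension drop probabilities: high values mark
# dimensions deemed irrelevant to the target variable.
drop_prob = np.array([0.9, 0.1, 0.8, 0.05])

def stochastic_compress(z):
    """Training-time compression: independently drop each feature dimension
    with its own drop probability (a discrete, per-dimension decision)."""
    keep_mask = rng.random(z.shape[-1]) >= drop_prob
    return z * keep_mask

def deterministic_compress(z):
    """Inference-time compression: deterministically keep only dimensions
    whose drop probability is below an assumed 0.5 threshold, giving a
    consistent representation across inference calls."""
    return z * (drop_prob < 0.5)

z = np.array([1.0, 2.0, 3.0, 4.0])
print(deterministic_compress(z))  # dimensions 0 and 2 are zeroed out
```

Because the inference-time mask depends only on the learned probabilities, the surviving dimensions can simply be discarded outright, which is how such a scheme yields dimensionality reduction.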

