Principal Bit Analysis: Autoencoding with Schur-Concave Loss

06/05/2021
by Sourbh Bhadane, et al.

We consider a linear autoencoder in which the latent variables are quantized, or corrupted by noise, and the constraint is Schur-concave in the set of latent variances. Although finding the optimal encoder/decoder pair for this setup is a nonconvex optimization problem, we show that decomposing the source into its principal components is optimal. If the constraint is strictly Schur-concave and the empirical covariance matrix has only simple eigenvalues, then any optimal encoder/decoder must decompose the source in this way. As one application, we consider a strictly Schur-concave constraint that estimates the number of bits needed to represent the latent variables under fixed-rate encoding, a setup that we call Principal Bit Analysis (PBA). This yields a practical, general-purpose, fixed-rate compressor that outperforms existing algorithms. As a second application, we show that a prototypical autoencoder-based variable-rate compressor is guaranteed to decompose the source into its principal components.
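To make the setup concrete, the following sketch illustrates a PCA-based fixed-rate compressor of the kind the abstract describes: project onto principal components, quantize the latents, and decode. This is a hypothetical simplification for illustration only; the `pca_quantize` function, its uniform per-latent quantizer, and the fixed `bits_per_latent` budget are assumptions, not the paper's PBA algorithm or its bit-allocation rule.

```python
import numpy as np

def pca_quantize(X, bits_per_latent, k=None):
    """Illustrative sketch (not the paper's PBA): project data onto its
    principal components, uniformly quantize each latent variable, and
    reconstruct from the dequantized latents."""
    mu = X.mean(axis=0)
    Xc = X - mu
    # Eigendecomposition of the empirical covariance matrix.
    cov = Xc.T @ Xc / len(X)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]            # descending variance
    W = eigvecs[:, order[: (k or X.shape[1])]]   # principal directions
    Z = Xc @ W                                   # latent variables
    # Uniform scalar quantization of each latent to 2^bits levels.
    levels = 2 ** bits_per_latent
    lo, hi = Z.min(axis=0), Z.max(axis=0)
    step = (hi - lo) / levels
    step[step == 0] = 1.0                        # guard degenerate latents
    Zq = np.clip(np.round((Z - lo) / step), 0, levels - 1)
    Z_hat = lo + (Zq + 0.5) * step               # dequantize to bin centers
    return Z_hat @ W.T + mu                      # linear decoder

# Usage: reconstruction error shrinks as the bit budget grows.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8)) @ rng.normal(size=(8, 8))
err4 = np.mean((X - pca_quantize(X, 4)) ** 2)
err8 = np.mean((X - pca_quantize(X, 8)) ** 2)
```

The sketch allocates the same number of bits to every latent; the paper's contribution is, in part, showing that under a (strictly) Schur-concave constraint on the latent variances, the principal-component encoder above is the optimal choice of linear encoder.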

