Unsupervised learning of disentangled representations in deep restricted kernel machines with orthogonality constraints

11/25/2020
by Francesco Tonin et al.

We introduce Constr-DRKM, a deep kernel method for the unsupervised learning of disentangled data representations. We propose augmenting the original deep restricted kernel machine formulation for kernel PCA with orthogonality constraints on the latent variables, both to promote disentanglement and to make it possible to carry out optimization without first defining a stabilized objective. After illustrating an end-to-end training procedure based on a quadratic penalty optimization algorithm with warm start, we quantitatively evaluate the proposed method's effectiveness in disentangled feature learning. We demonstrate on four benchmark datasets that this approach performs similarly overall to β-VAE on a number of disentanglement metrics when few training points are available, while being less sensitive to randomness and hyperparameter selection than β-VAE. We also present a deterministic initialization of Constr-DRKM's training algorithm that significantly improves the reproducibility of the results. Finally, we empirically evaluate and discuss the role of the number of layers in the proposed methodology, examining the influence of each principal component in every layer. Components in lower layers act as local feature detectors capturing the broad trends of the data distribution, while components in deeper layers build on the representation learned by previous layers and more accurately reproduce higher-level features.
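
The abstract does not give the formulation itself, so the following is a minimal, illustrative sketch of the kind of training it describes: a two-level kernel-PCA-style objective on latent variables H1 and H2, with the orthogonality constraints handled by a quadratic penalty whose weight is increased over a schedule while each stage warm-starts from the previous solution. The function names, kernel choice, two-level structure, Adam optimizer, and penalty schedule below are assumptions made for illustration, not the authors' exact Constr-DRKM objective.

```python
import torch

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix of the n x d data matrix X (an assumed kernel choice)
    sq_dists = torch.cdist(X, X) ** 2
    return torch.exp(-sq_dists / (2 * sigma ** 2))

def objective(H1, H2, K, gamma):
    # Illustrative two-level objective: each level resembles kernel PCA on the
    # previous level's representation; the quadratic penalty (weight gamma)
    # pushes the latent variables of each level toward orthonormality, H^T H = I.
    J1 = -torch.trace(H1.T @ K @ H1)             # level 1: kernel PCA energy on the inputs
    J2 = -torch.trace(H2.T @ (H1 @ H1.T) @ H2)   # level 2: PCA on the level-1 latents
    pen = ((H1.T @ H1 - torch.eye(H1.shape[1])) ** 2).sum() \
        + ((H2.T @ H2 - torch.eye(H2.shape[1])) ** 2).sum()
    return J1 + J2 + gamma * pen

def train(X, s1=10, s2=10, gammas=(1.0, 10.0, 100.0), steps=200, lr=1e-2):
    # Quadratic penalty method with warm start: solve a sequence of unconstrained
    # problems with increasing penalty weight, reusing H1, H2 between stages.
    n = X.shape[0]
    K = rbf_kernel(X)
    H1 = torch.randn(n, s1, requires_grad=True)
    H2 = torch.randn(n, s2, requires_grad=True)
    for gamma in gammas:
        opt = torch.optim.Adam([H1, H2], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            loss = objective(H1, H2, K, gamma)
            loss.backward()
            opt.step()
    return H1.detach(), H2.detach()

# Example usage on synthetic data (illustration only):
# X = torch.randn(200, 16)
# H1, H2 = train(X)
```

The outer loop over increasing penalty weights is the quadratic-penalty-with-warm-start idea mentioned in the abstract: each unconstrained subproblem starts from the solution of the previous, more weakly penalized one, so the iterates approach the orthogonality constraints gradually.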

Related research

02/22/2023 · Deep Kernel Principal Component Analysis for Multi-level Feature Learning
Principal Component Analysis (PCA) and its nonlinear extension Kernel PC...

03/19/2021 · GLOWin: A Flow-based Invertible Generative Framework for Learning Disentangled Feature Representations in Medical Images
Disentangled representations can be useful in many downstream tasks, hel...

03/25/2023 · Beta-VAE has 2 Behaviors: PCA or ICA?
Beta-VAE is a very classical model for disentangled representation learn...

06/12/2023 · Combining Primal and Dual Representations in Deep Restricted Kernel Machines Classifiers
In contrast to deep networks, kernel methods cannot directly take advant...

08/26/2019 · Theory and Evaluation Metrics for Learning Disentangled Representations
We make two theoretical contributions to disentanglement learning by (a)...

05/16/2023 · ProtoVAE: Prototypical Networks for Unsupervised Disentanglement
Generative modeling and self-supervised learning have in recent years ma...

02/04/2020 · Robust Generative Restricted Kernel Machines using Weighted Conjugate Feature Duality
In the past decade, interest in generative models has grown tremendously...
