Bayesian representation learning with oracle constraints

06/16/2015
by Theofanis Karaletsos, et al.

Representation learning systems typically rely on massive amounts of labeled data in order to be trained to high accuracy. Recently, high-dimensional parametric models such as neural networks have succeeded in building rich representations using compressive, reconstructive, or supervised criteria. However, the semantic structure inherent in observations is oftentimes lost in the process. Human perception excels at understanding semantics, but that understanding cannot always be expressed in terms of labels. Thus, oracles or human-in-the-loop systems, for example crowdsourcing, are often employed to generate similarity constraints using an implicit similarity function encoded in human perception. In this work we propose to combine generative unsupervised feature learning with a probabilistic treatment of oracle information such as triplets, in order to transfer implicit, privileged oracle knowledge into explicit nonlinear Bayesian latent factor models of the observations. We use a fast variational algorithm to learn the joint model and demonstrate its applicability on a well-known image dataset. We show how implicit triplet information provides a rich signal for learning representations that outperform previous metric learning approaches, as well as generative models without this side information, on a variety of predictive tasks. In addition, we illustrate that the proposed approach compartmentalizes the latent space semantically, which allows interpretation of the latent variables.
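The abstract describes a joint objective that pairs a generative latent variable model of the observations with a probabilistic likelihood over oracle triplets evaluated in the latent space. The sketch below is a minimal illustration of that idea, not the authors' implementation: the network sizes, the logistic triplet likelihood based on squared latent distances, and the weighting factor `lam` are assumptions made purely for the example.

```python
# Minimal sketch (assumed, not the paper's code): a VAE whose latent space is
# additionally shaped by a probabilistic triplet ("oracle") likelihood.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TripletVAE(nn.Module):
    def __init__(self, x_dim=784, h_dim=256, z_dim=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.mu, self.logvar = nn.Linear(h_dim, z_dim), nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization trick
        return self.dec(z), mu, logvar, z

def neg_elbo(x, x_logits, mu, logvar):
    # Bernoulli reconstruction term (x assumed in [0, 1]) plus analytic KL
    # divergence to a standard normal prior.
    rec = F.binary_cross_entropy_with_logits(x_logits, x, reduction='sum')
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kl

def triplet_log_likelihood(z_a, z_b, z_c):
    # Assumed oracle model: P("a is closer to b than to c") is a logistic
    # function of the squared-distance gap in latent space.
    gap = (z_a - z_c).pow(2).sum(-1) - (z_a - z_b).pow(2).sum(-1)
    return F.logsigmoid(gap).sum()

def loss(model, x_a, x_b, x_c, lam=1.0):
    # Joint training loss (minimized): negative ELBO on the three observations
    # minus the weighted log-likelihood of the oracle triplet (a, b, c).
    la, mu_a, lv_a, z_a = model(x_a)
    lb, mu_b, lv_b, z_b = model(x_b)
    lc, mu_c, lv_c, z_c = model(x_c)
    vae = (neg_elbo(x_a, la, mu_a, lv_a) + neg_elbo(x_b, lb, mu_b, lv_b)
           + neg_elbo(x_c, lc, mu_c, lv_c))
    return vae - lam * triplet_log_likelihood(z_a, z_b, z_c)
```

In this toy formulation, minimizing the loss simultaneously fits the generative model and pulls latent codes that the oracle deems similar closer together, which is the intuition behind the semantic compartmentalization the abstract mentions.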


Related research

Nonparametric Variational Auto-encoders for Hierarchical Representation Learning (03/21/2017)
The recently developed variational autoencoders (VAEs) have proved to be...

TVAE: Triplet-Based Variational Autoencoder using Metric Learning (02/13/2018)
Deep metric learning has been demonstrated to be highly effective in lea...

On the Limits of Learning Representations with Label-Based Supervision (03/07/2017)
Advances in neural network based classifiers have transformed automatic ...

A Probabilistic Semi-Supervised Approach with Triplet Markov Chains (09/07/2023)
Triplet Markov chains are general generative models for sequential data ...

The Variational Homoencoder: Learning to learn high capacity generative models from few examples (07/24/2018)
Hierarchical Bayesian methods can unify many related tasks (e.g. k-shot ...

Learning Sparsity of Representations with Discrete Latent Variables (04/03/2023)
Deep latent generative models have attracted increasing attention due to...

Architectural Optimization and Feature Learning for High-Dimensional Time Series Datasets (02/27/2022)
As our ability to sense increases, we are experiencing a transition from...
