S-JEA: Stacked Joint Embedding Architectures for Self-Supervised Visual Representation Learning

05/19/2023
by Alžběta Manová, et al.

The recent emergence of Self-Supervised Learning (SSL) as a fundamental paradigm for learning image representations has demonstrated, and continues to demonstrate, high empirical success across a variety of tasks. However, most SSL approaches fail to learn embeddings that capture hierarchical semantic concepts in a separable and interpretable way. In this work, we aim to learn highly separable, hierarchical semantic representations by stacking Joint Embedding Architectures (JEAs), where higher-level JEAs take as input the representations produced by lower-level JEAs. This yields a representation space in which higher-level JEAs exhibit distinct sub-categories of semantic concepts (e.g., the model and colour of vehicles). We empirically show that representations from stacked JEAs perform at a level comparable to traditional JEAs with similar parameter counts, and we visualise the representation spaces to validate the semantic hierarchies.
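
The stacking scheme described above can be illustrated with a minimal PyTorch sketch, assuming a VICReg/SimSiam-style joint-embedding objective at each level. The module names, dimensions, toy encoders, and the placeholder loss below are illustrative assumptions rather than the authors' implementation; the sketch only fixes the data flow, in which the higher-level JEA consumes the representations emitted by the lower-level encoder.

    # Minimal sketch of a stacked joint-embedding setup (illustrative assumptions,
    # not the paper's exact architecture or objective).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class JEALevel(nn.Module):
        """One joint-embedding level: an encoder plus a projector."""
        def __init__(self, encoder: nn.Module, in_dim: int, embed_dim: int):
            super().__init__()
            self.encoder = encoder
            self.projector = nn.Sequential(
                nn.Linear(in_dim, embed_dim), nn.ReLU(), nn.Linear(embed_dim, embed_dim)
            )

        def forward(self, x):
            h = self.encoder(x)      # representation (fed to the next level)
            z = self.projector(h)    # embedding used by the SSL objective
            return h, z

    def invariance_loss(z_a, z_b):
        # Placeholder SSL criterion; the paper's actual objective may differ.
        return F.mse_loss(z_a, z_b)

    # Level 0 operates on images; level 1 operates on level-0 representations.
    backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 512), nn.ReLU())
    level0 = JEALevel(backbone, in_dim=512, embed_dim=128)
    level1 = JEALevel(nn.Sequential(nn.Linear(512, 256), nn.ReLU()),
                      in_dim=256, embed_dim=64)

    x_a, x_b = torch.randn(8, 3, 32, 32), torch.randn(8, 3, 32, 32)  # two augmented views
    h0_a, z0_a = level0(x_a)
    h0_b, z0_b = level0(x_b)
    # Higher level consumes lower-level representations; whether gradients flow
    # back into level 0 (or h0 is detached) is a design choice not specified here.
    _, z1_a = level1(h0_a)
    _, z1_b = level1(h0_b)
    loss = invariance_loss(z0_a, z0_b) + invariance_loss(z1_a, z1_b)

Which SSL criterion is applied at each level, and whether the higher level trains on detached or back-propagated lower-level representations, are choices the full paper settles; the sketch is only meant to make the stacking idea concrete.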

