Generative-Contrastive Learning for Self-Supervised Latent Representations of 3D Shapes from Multi-Modal Euclidean Input

01/11/2023
by Chengzhi Wu, et al.

We propose a combined generative and contrastive neural architecture for learning latent representations of 3D volumetric shapes. The architecture uses two encoder branches, one for voxel grids and one for multi-view images rendered from the same underlying shape. The main idea is to combine a contrastive loss between the resulting latent representations with an additional reconstruction loss, which prevents the latent representations from collapsing to a trivial solution that minimizes the contrastive loss. A novel switching scheme cross-trains the two encoders with a shared decoder and also enables a stop-gradient operation on a randomly chosen branch. Further classification experiments show that the latent representations learned with our self-supervised method implicitly integrate more useful information from the additional input modality, leading to better reconstruction and classification performance.
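The abstract describes the architecture and training objective but gives no code. As a rough illustration, here is a minimal PyTorch sketch of how the joint loss and the switching scheme could be instantiated. All specifics (layer sizes, latent_dim=128, mean-pooling over views, the InfoNCE form of the contrastive loss, and the temperature tau) are our assumptions for illustration, not the authors' exact design.

import torch
import torch.nn as nn
import torch.nn.functional as F

class VoxelEncoder(nn.Module):
    # Encodes a (B, 1, 32, 32, 32) occupancy grid into a latent vector.
    def __init__(self, latent_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 32, 4, stride=2, padding=1),   # 32^3 -> 16^3
            nn.ReLU(),
            nn.Conv3d(32, 64, 4, stride=2, padding=1),  # 16^3 -> 8^3
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 8 ** 3, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class MultiViewEncoder(nn.Module):
    # Encodes (B, V, 3, H, W) rendered views; view features are mean-pooled.
    def __init__(self, latent_dim=128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, latent_dim),
        )

    def forward(self, views):
        b, v = views.shape[:2]
        feats = self.backbone(views.flatten(0, 1))  # (B*V, D)
        return feats.view(b, v, -1).mean(dim=1)     # pooled to (B, D)

class VoxelDecoder(nn.Module):
    # Shared decoder: maps either branch's latent back to voxel logits.
    def __init__(self, latent_dim=128):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 8 ** 3)
        self.net = nn.Sequential(
            nn.ConvTranspose3d(64, 32, 4, stride=2, padding=1),  # 8^3 -> 16^3
            nn.ReLU(),
            nn.ConvTranspose3d(32, 1, 4, stride=2, padding=1),   # 16^3 -> 32^3
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 8, 8, 8))

def training_step(voxels, views, enc_v, enc_i, decoder, tau=0.1):
    z_v = enc_v(voxels)  # latent from the voxel branch
    z_i = enc_i(views)   # latent from the multi-view branch

    # Switching scheme: a random branch drives the shared decoder while
    # gradients are stopped on the other branch (the contrastive target).
    if torch.rand(()) < 0.5:
        z_rec, z_tgt = z_v, z_i.detach()
    else:
        z_rec, z_tgt = z_i, z_v.detach()

    # Generative term: reconstruct the voxel grid from the active latent.
    recon_loss = F.binary_cross_entropy_with_logits(decoder(z_rec), voxels)

    # Contrastive term: InfoNCE over the batch, pairing the two latents of
    # the same shape as positives (one plausible choice of contrastive loss).
    z1 = F.normalize(z_rec, dim=1)
    z2 = F.normalize(z_tgt, dim=1)
    logits = z1 @ z2.t() / tau
    labels = torch.arange(z1.size(0), device=z1.device)
    contrastive_loss = F.cross_entropy(logits, labels)

    return recon_loss + contrastive_loss

The detach on the non-reconstructing branch is one way to realize the stop-gradient operation mentioned above; because the decoder is shared and the driving branch alternates randomly, both encoders are cross-trained against the same reconstruction target.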
