Concept Representation Learning with Contrastive Self-Supervised Learning

12/10/2021
by Daniel T. Chang, et al.

Concept-oriented deep learning (CODL) is a general approach to meeting future challenges of deep learning: (1) learning with little or no external supervision, (2) coping with test examples drawn from a different distribution than the training examples, and (3) integrating deep learning with symbolic AI. In CODL, as in human learning, concept representations are learned from concept exemplars. Contrastive self-supervised learning (CSSL) provides a promising approach for doing so, since it (1) uses data-driven associations, moving away from semantic labels, (2) supports incremental and continual learning, moving away from (large) fixed datasets, and (3) accommodates emergent objectives, moving away from fixed objectives (tasks). We discuss the major aspects of concept representation learning using CSSL: dual-level concept representations, CSSL for feature representations, exemplar similarity measures and self-supervised relational reasoning, incremental and continual CSSL, and contrastive self-supervised concept (class) incremental learning. The discussion leverages recent findings from cognitive neuroscience and CSSL.
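To make the CSSL step concrete, the sketch below shows a SimCLR-style NT-Xent contrastive loss over two augmented views of the same concept exemplar: views of one exemplar are pulled together in feature space while views of other exemplars are pushed apart. This is an illustrative assumption, not the paper's implementation; the encoder architecture, the names `ExemplarEncoder` and `nt_xent_loss`, and the temperature value are all placeholders.

```python
# Minimal sketch of SimCLR-style contrastive self-supervised learning on
# concept exemplars. All names and dimensions are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ExemplarEncoder(nn.Module):
    """Toy encoder: maps a flattened exemplar to a unit-norm feature vector."""

    def __init__(self, in_dim: int = 784, feat_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, feat_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.normalize(self.net(x), dim=-1)  # L2-normalized features


def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """NT-Xent: each view's positive is the other view of the same exemplar."""
    n = z1.size(0)
    z = torch.cat([z1, z2], dim=0)          # (2n, d) stacked views
    sim = z @ z.t() / temperature           # cosine similarities (features are unit-norm)
    sim.fill_diagonal_(float("-inf"))       # exclude self-similarity from the softmax
    # Positive index for sample i is (i + n) mod 2n.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)


if __name__ == "__main__":
    torch.manual_seed(0)
    encoder = ExemplarEncoder()
    x = torch.randn(32, 784)                # a batch of concept exemplars
    # Stand-in "augmentations": small additive noise produces two views per exemplar.
    view1 = x + 0.1 * torch.randn_like(x)
    view2 = x + 0.1 * torch.randn_like(x)
    loss = nt_xent_loss(encoder(view1), encoder(view2))
    print(f"contrastive loss: {loss.item():.4f}")
```

In practice the augmentations, encoder, and projection head would be chosen per modality; the point here is only the data-driven association between exemplar views, with no semantic labels involved.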


research
05/13/2022

Embodied-Symbolic Contrastive Graph Self-Supervised Learning for Molecular Graphs

Dual embodied-symbolic concept representations are the foundation for de...
research
06/05/2018

Concept-Oriented Deep Learning

Concepts are the foundation of human deep learning, understanding, and k...
research
12/01/2022

A General Purpose Supervisory Signal for Embodied Agents

Training effective embodied AI agents often involves manual reward engin...
research
11/22/2020

Run Away From your Teacher: Understanding BYOL by a Novel Self-Supervised Approach

Recently, a newly proposed self-supervised framework Bootstrap Your Own ...
research
01/28/2023

Unbiased and Efficient Self-Supervised Incremental Contrastive Learning

Contrastive Learning (CL) has been proved to be a powerful self-supervis...
research
06/08/2023

Sy-CON: Symmetric Contrastive Loss for Continual Self-Supervised Representation Learning

We introduce a novel and general loss function, called Symmetric Contras...
research
03/01/2022

Dual Embodied-Symbolic Concept Representations for Deep Learning

Motivated by recent findings from cognitive neural science, we advocate ...
