Personalizing Pre-trained Models

06/02/2021
by Mina Khan, et al.

Self-supervised or weakly supervised models trained on large-scale datasets have shown sample-efficient transfer to diverse datasets in few-shot settings. We consider how such upstream pretrained models can be leveraged for downstream few-shot, multi-label, and continual learning tasks. Our model, CLIPPER (CLIP PERsonalized), uses image representations from CLIP, a large-scale image representation learning model trained with weak natural language supervision. We developed Multi-label Weight Imprinting (MWI), a technique for multi-label, continual, and few-shot learning, and CLIPPER combines MWI with CLIP's image representations. We evaluated CLIPPER on 10 single-label and 5 multi-label datasets, where it shows robust and competitive performance, and we set new benchmarks for few-shot, multi-label, and continual learning. Our lightweight technique is also compute-efficient and enables privacy-preserving applications, since user data is not sent to the upstream model for fine-tuning.
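To make the idea concrete, the sketch below shows how a weight-imprinting classifier of this kind can sit on top of frozen image embeddings from a pretrained encoder: class weights are imprinted from a few normalized embeddings, new classes can be added at any time (continual learning), and each class is scored independently with a sigmoid so that multiple labels can fire (multi-label). This is a minimal sketch under stated assumptions, not the authors' implementation; the names (MWIClassifier, imprint, predict), the running-average update for repeated imprinting, the logit scale, and the 0.5 threshold are all illustrative choices.

```python
# Minimal sketch of multi-label weight imprinting on top of frozen image
# embeddings (e.g. from CLIP's image encoder). Illustrative only.
import torch
import torch.nn.functional as F


class MWIClassifier:
    def __init__(self, embed_dim: int, logit_scale: float = 10.0):
        self.embed_dim = embed_dim
        self.logit_scale = logit_scale      # assumed scaling of cosine scores
        self.weights = {}                    # class_id -> imprinted weight vector

    def imprint(self, class_id: int, embeddings: torch.Tensor) -> None:
        """Imprint (or update) a class weight from few-shot embeddings.

        embeddings: (n_shots, embed_dim) image features from the frozen encoder.
        Classes can be imprinted at any time, supporting continual learning.
        """
        proto = F.normalize(embeddings, dim=-1).mean(dim=0)
        if class_id in self.weights:
            # Assumed update rule: average with the existing weight so
            # earlier shots are not forgotten when new shots arrive.
            proto = (self.weights[class_id] + proto) / 2
        self.weights[class_id] = F.normalize(proto, dim=0)

    def predict(self, embedding: torch.Tensor, threshold: float = 0.5):
        """Score every imprinted class independently with a sigmoid over the
        scaled cosine similarity; zero, one, or many labels may exceed the
        threshold (multi-label prediction)."""
        x = F.normalize(embedding, dim=0)
        scores = {
            c: torch.sigmoid(self.logit_scale * (w @ x)).item()
            for c, w in self.weights.items()
        }
        labels = [c for c, s in scores.items() if s >= threshold]
        return labels, scores


# Hypothetical usage: embeddings would come from a frozen image encoder.
# clf = MWIClassifier(embed_dim=512)
# clf.imprint(class_id=0, embeddings=cat_shots)      # (k, 512) few-shot features
# clf.imprint(class_id=1, embeddings=indoor_shots)   # a new label added later
# labels, scores = clf.predict(query_embedding)      # multi-label prediction
```

Because the upstream encoder stays frozen and only these small imprinted weights are stored and updated, personalization can happen on-device without sending data to the upstream model, consistent with the privacy and compute claims above.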

