Smoothed Embeddings for Certified Few-Shot Learning

02/02/2022
by   Mikhail Pautov, et al.

Randomized smoothing is considered to be the state-of-the-art provable defense against adversarial perturbations. However, it relies heavily on classifiers that map inputs to class probabilities, and it does not address models that instead learn a metric space in which classification is performed by computing distances to the embeddings of class prototypes. In this work, we extend randomized smoothing to few-shot learning models that map inputs to normalized embeddings. We analyze the Lipschitz continuity of such models and derive a robustness certificate against ℓ_2-bounded perturbations that may be useful in few-shot learning scenarios. Our theoretical results are confirmed by experiments on different datasets.
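The idea sketched in the abstract can be illustrated with a minimal Monte-Carlo version: average the (normalized) embeddings of Gaussian-perturbed copies of the input, then classify by nearest class prototype. The sketch below is a hypothetical illustration, not the paper's implementation; `embed` is a toy stand-in for a learned embedding network, and all parameter names are assumptions.

```python
import numpy as np

def embed(x):
    # Toy stand-in (hypothetical) for a learned embedding network:
    # a fixed random projection, a tanh nonlinearity, L2 normalization.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((4, x.shape[-1]))
    z = np.tanh(W @ x)
    return z / np.linalg.norm(z)

def smoothed_embedding(x, sigma=0.25, n_samples=1000, seed=1):
    # Monte-Carlo estimate of the smoothed embedding
    #   E_{delta ~ N(0, sigma^2 I)} [ f(x + delta) ],
    # i.e. the expected normalized embedding under Gaussian noise.
    rng = np.random.default_rng(seed)
    noise = sigma * rng.standard_normal((n_samples, x.shape[-1]))
    zs = np.stack([embed(x + d) for d in noise])
    return zs.mean(axis=0)

def classify(x, prototypes, sigma=0.25):
    # Nearest-prototype rule on the smoothed embedding: the predicted
    # class is the prototype closest in Euclidean distance.
    z = smoothed_embedding(x, sigma)
    dists = np.linalg.norm(prototypes - z, axis=1)
    return int(np.argmin(dists))
```

Note that the averaged embedding always has norm at most 1 (Jensen's inequality applied to the unit-norm per-sample embeddings), which is the kind of property the paper's Lipschitz analysis exploits to obtain ℓ_2 certificates.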


