Explore the Power of Dropout on Few-shot Learning

01/26/2023
by Shaobo Lin, et al.

The generalization power of the pre-trained model is key to few-shot deep learning. Dropout is a regularization technique widely used in conventional deep learning. In this paper, we explore the power of dropout on few-shot learning and provide insights into how to use it. Extensive experiments on few-shot object detection and few-shot image classification datasets, i.e., Pascal VOC, MS COCO, CUB, and mini-ImageNet, validate the effectiveness of our method.
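As a rough illustration only (the paper's own method and code are not shown here), the following minimal PyTorch sketch shows the kind of setup the abstract describes: a dropout layer inserted between a pre-trained backbone and a newly initialized few-shot classification head. The FewShotHead class, the stand-in backbone, and the dropout rate p=0.5 are illustrative assumptions, not details from the paper.

import torch
import torch.nn as nn

class FewShotHead(nn.Module):
    """Pre-trained backbone + dropout + a new n-way classification head."""
    def __init__(self, backbone: nn.Module, feat_dim: int, n_way: int, p: float = 0.5):
        super().__init__()
        self.backbone = backbone            # pre-trained feature extractor
        self.dropout = nn.Dropout(p=p)      # regularizes few-shot fine-tuning
        self.classifier = nn.Linear(feat_dim, n_way)  # head for the n-way episode

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.backbone(x)
        feats = self.dropout(feats)         # active in .train(), identity in .eval()
        return self.classifier(feats)

# Toy usage: a 5-way episode with a stand-in backbone. In practice the
# backbone would be a real pre-trained network (e.g. a ResNet).
backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 64), nn.ReLU())
model = FewShotHead(backbone, feat_dim=64, n_way=5)
logits = model(torch.randn(8, 3, 32, 32))   # shape: (8, 5)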
