Iterative Self-Learning: Semi-Supervised Improvement to Dataset Volumes and Model Accuracy

06/06/2019
by Robert Dupre, et al.

A novel semi-supervised learning technique is introduced, based on a simple iterative learning cycle combined with learned thresholding techniques and an ensemble decision support system. By exploiting unlabelled data during the training of deeply learned classification models, the approach achieves state-of-the-art model performance while increasing training data volume. The proposed approach is evaluated on datasets commonly used to benchmark semi-supervised learning techniques, as well as on a number of more challenging image classification datasets (CIFAR-100 and a 200-class subset of ImageNet).
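The abstract describes the learning cycle only at a high level. As a rough illustration, the sketch below implements the generic iterative self-learning loop it refers to: train on the labelled set, pseudo-label confident unlabelled samples, promote them into the training set, and retrain. The scikit-learn classifier, the fixed confidence threshold, and the round limit are assumptions for illustration; the paper's learned thresholding and ensemble decision support system are not reproduced here.

```python
# Minimal sketch of an iterative self-learning cycle (not the paper's exact
# method): the fixed threshold and single classifier stand in for the paper's
# learned thresholding and ensemble decision support, which are not public
# from the abstract alone.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

def iterative_self_learning(X_lab, y_lab, X_unlab, threshold=0.95, max_rounds=5):
    """Grow the labelled set by pseudo-labelling confident predictions."""
    model = LogisticRegression(max_iter=1000)
    for _ in range(max_rounds):
        model.fit(X_lab, y_lab)
        if len(X_unlab) == 0:
            break
        probs = model.predict_proba(X_unlab)
        preds = model.predict(X_unlab)
        # Fixed confidence cut-off; the paper learns this threshold instead.
        confident = probs.max(axis=1) >= threshold
        if not confident.any():
            break  # nothing confident enough to pseudo-label this round
        # Promote confident samples, with their pseudo-labels, into the training set.
        X_lab = np.vstack([X_lab, X_unlab[confident]])
        y_lab = np.concatenate([y_lab, preds[confident]])
        X_unlab = X_unlab[~confident]
    return model, X_lab, y_lab

# Toy usage: start with 100 labelled samples and a pool of 900 unlabelled ones.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
model, X_grown, y_grown = iterative_self_learning(X[:100], y[:100], X[100:])
print(f"Labelled set grew from 100 to {len(X_grown)} samples.")
```

In the paper's formulation, the fixed `threshold` would be replaced by the learned thresholding techniques and the single model's vote by the ensemble decision support system before samples are promoted.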


