ScreenerNet: Learning Curriculum for Neural Networks

01/03/2018
by Tae-Hoon Kim, et al.

We propose to learn a curriculum, or syllabus, for supervised learning with deep neural networks. Specifically, we learn a weight for each training sample via an auxiliary neural network, called ScreenerNet, attached to the original network, and train the two jointly in an end-to-end fashion. In extensive experiments on three popular vision datasets (MNIST, CIFAR10, and Pascal VOC2012) and a Cartpole task trained with deep Q-learning, we show that networks augmented with our ScreenerNet converge earlier and reach better accuracy than state-of-the-art rule-based curriculum learning methods.
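The abstract describes per-sample weighting learned by an attached network that is trained jointly with the task network. Below is a minimal PyTorch sketch of that idea; the small MNIST-style architectures, the margin-based screener objective, and the `margin` hyperparameter are assumptions made for illustration, not necessarily the paper's exact formulation.

```python
# Sketch: a ScreenerNet predicts a weight in [0, 1] for each training sample;
# the main network trains on the weighted per-sample loss, and both networks
# are updated jointly in one optimizer step.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MainNet(nn.Module):              # task network (toy MNIST-style classifier)
    def __init__(self):
        super().__init__()
        self.fc = nn.Sequential(nn.Flatten(),
                                nn.Linear(28 * 28, 128), nn.ReLU(),
                                nn.Linear(128, 10))
    def forward(self, x):
        return self.fc(x)

class ScreenerNet(nn.Module):          # predicts one weight per sample
    def __init__(self):
        super().__init__()
        self.fc = nn.Sequential(nn.Flatten(),
                                nn.Linear(28 * 28, 64), nn.ReLU(),
                                nn.Linear(64, 1))
    def forward(self, x):
        return torch.sigmoid(self.fc(x)).squeeze(1)

main_net, screener = MainNet(), ScreenerNet()
opt = torch.optim.Adam(list(main_net.parameters()) + list(screener.parameters()), lr=1e-3)
margin = 1.0                           # assumed margin hyperparameter

def train_step(x, y):
    opt.zero_grad()
    per_sample_err = F.cross_entropy(main_net(x), y, reduction="none")  # [B]
    w = screener(x)                                                     # [B]
    # Main network sees errors weighted by the screener's per-sample weights.
    main_loss = (w.detach() * per_sample_err).mean()
    # Illustrative screener objective: push weights up for hard samples
    # (large error) and down for easy ones (error below the margin).
    err = per_sample_err.detach()
    screener_loss = ((1 - w) ** 2 * err
                     + w ** 2 * torch.clamp(margin - err, min=0)).mean()
    (main_loss + screener_loss).backward()
    opt.step()
    return main_loss.item()

# Example usage on a random batch:
x = torch.randn(32, 1, 28, 28)
y = torch.randint(0, 10, (32,))
train_step(x, y)
```

Detaching the weights inside the main loss and the errors inside the screener loss keeps each objective from back-propagating into the other network, which is one simple way to realize the joint end-to-end training the abstract mentions.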


