Learning with Pseudo-Ensembles

12/16/2014
by Philip Bachman, et al.

We formalize the notion of a pseudo-ensemble, a (possibly infinite) collection of child models spawned from a parent model by perturbing it according to some noise process. For example, dropout (Hinton et al., 2012) in a deep neural network trains a pseudo-ensemble of child subnetworks generated by randomly masking nodes in the parent network. We present a novel regularizer based on making the behavior of a pseudo-ensemble robust with respect to the noise process generating it. In the fully-supervised setting, our regularizer matches the performance of dropout. But unlike dropout, our regularizer naturally extends to the semi-supervised setting, where it produces state-of-the-art results. We provide a case study in which we transform the Recursive Neural Tensor Network of (Socher et al., 2013) into a pseudo-ensemble, which significantly improves its performance on a real-world sentiment analysis benchmark.
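To make the idea concrete, below is a minimal PyTorch sketch of pseudo-ensemble training. It is an illustrative simplification, not the paper's exact method: the paper's regularizer penalizes disagreement between child models at multiple layers of the network, while this sketch penalizes only output-level disagreement between two dropout-sampled children, added to a standard supervised loss on labeled data. The names (`pea_step`, `lam`) and the network sizes are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Parent model: each forward pass in train mode samples a child model
# by applying an independent dropout mask (the noise process).
parent = nn.Sequential(
    nn.Linear(784, 256), nn.ReLU(), nn.Dropout(p=0.5),
    nn.Linear(256, 10),
)
opt = torch.optim.Adam(parent.parameters(), lr=1e-3)

def pea_step(x_lab, y_lab, x_unlab, lam=1.0):
    """One step: supervised loss on labeled data plus a consistency
    penalty making two noise-sampled child models agree on unlabeled
    data. (Simplified: the paper penalizes disagreement layer-wise.)"""
    parent.train()  # keep dropout active so each pass draws a new child

    # Supervised term, available only for labeled examples.
    sup_loss = F.cross_entropy(parent(x_lab), y_lab)

    # Two forward passes = two child models with independent masks.
    out1 = parent(x_unlab)
    out2 = parent(x_unlab)
    # Symmetric output-level agreement penalty (an assumption; the
    # paper uses per-layer penalties on child-model activations).
    cons = F.mse_loss(F.softmax(out1, dim=1), F.softmax(out2, dim=1))

    loss = sup_loss + lam * cons
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

Because the consistency term needs no labels, the same step applies unchanged when `x_unlab` is drawn from an unlabeled pool, which is what lets this style of regularizer extend from the fully-supervised to the semi-supervised setting.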
