Teacher Improves Learning by Selecting a Training Subset

02/25/2018
by Yuzhe Ma et al.

We call a learner super-teachable if a teacher can trim down an iid training set while making the learner learn even better. We provide sharp super-teaching guarantees on two learners: the maximum likelihood estimator for the mean of a Gaussian, and the large margin classifier in 1D. For general learners, we provide a mixed-integer nonlinear programming-based algorithm to find a super teaching set. Empirical experiments show that our algorithm is able to find good super-teaching sets for both regression and classification problems.
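The Gaussian-mean setting can be illustrated with a toy sketch. This is not the paper's MINLP algorithm or its analytical construction, just an assumed brute-force illustration: the teacher knows the true mean and trims the iid sample to the subset whose sample mean (the learner's MLE) lands closest to it.

```python
import itertools
import random
import statistics

random.seed(0)
mu = 0.0          # true mean, known only to the teacher
n = 12            # full iid sample size (kept tiny for brute force)
data = [random.gauss(mu, 1.0) for _ in range(n)]

# Learner's MLE on the full iid sample is just the sample mean.
mle_full = statistics.fmean(data)

# Teacher searches all nonempty subsets; the full sample is itself a
# candidate, so the taught estimate can only match or beat it.
candidates = (s for k in range(1, n + 1)
              for s in itertools.combinations(data, k))
best = min(candidates, key=lambda s: abs(statistics.fmean(s) - mu))
mle_taught = statistics.fmean(best)

print("full-sample error :", abs(mle_full - mu))
print("taught-set error  :", abs(mle_taught - mu))
```

The exhaustive search is exponential in n; it only serves to show that a well-chosen subset of an iid sample can yield a strictly better estimate, which is the super-teaching phenomenon the abstract describes.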


Related research

Iterative Machine Teaching (05/30/2017)
In this paper, we consider the problem of machine teaching, the inverse ...

Collaborative and Privacy-Preserving Machine Teaching via Consensus Optimization (05/07/2019)
In this work, we define a collaborative and privacy-preserving machine t...

Towards Black-box Iterative Machine Teaching (10/21/2017)
In this paper, we make an important step towards the black-box machine t...

Iterative Teaching by Data Hallucination (10/31/2022)
We consider the problem of iterative machine teaching, where a teacher s...

When an Active Learner Meets a Black-box Teacher (06/30/2022)
Active learning maximizes the hypothesis updates to find those desired u...

Time-Constrained Learning (02/04/2022)
Consider a scenario in which we have a huge labeled dataset D and a limi...

Teaching Multiple Concepts to Forgetful Learners (05/21/2018)
How can we help a forgetful learner learn multiple concepts within a lim...
