Everything old is new again: A multi-view learning approach to learning using privileged information and distillation

03/08/2019
by   Weiran Wang, et al.

We adopt a multi-view approach for analyzing two knowledge transfer settings---learning using privileged information (LUPI) and distillation---in a common framework. Under reasonable assumptions about the complexities of the hypothesis spaces, and being optimistic about the expected loss achievable by the student (in distillation) and by a transformed teacher predictor (in LUPI), we show that encouraging agreement between the teacher and the student leads to a reduced search space. As a result, an improved convergence rate can be obtained with regularized empirical risk minimization.
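To make the "encouraging agreement" idea concrete, here is a minimal Python/PyTorch sketch of a distillation-style regularized empirical risk: a standard task loss plus a teacher-student agreement term. This is an illustration of the general technique, not the paper's exact objective; the weighting `alpha` and temperature `T` are assumed hyperparameters introduced here for the example.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      alpha=0.5, T=2.0):
    """Regularized empirical risk: task loss plus a teacher-student
    agreement regularizer (illustrative sketch, not the paper's loss)."""
    # Standard empirical risk on the labeled data.
    task_loss = F.cross_entropy(student_logits, targets)
    # Agreement term: KL divergence pushing the student's
    # (temperature-softened) predictions toward the teacher's.
    agreement = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients are comparable across T
    return (1 - alpha) * task_loss + alpha * agreement
```

Minimizing this combined objective restricts the student's effective search space to predictors that agree with the teacher, which is the mechanism the abstract credits for the improved convergence rate.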
