Unifying distillation and privileged information

11/11/2015
by David Lopez-Paz, et al.

Distillation (Hinton et al., 2015) and privileged information (Vapnik & Izmailov, 2015) are two techniques that enable machines to learn from other machines. This paper unifies these two techniques into generalized distillation, a framework to learn from multiple machines and data representations. We provide theoretical and causal insight about the inner workings of generalized distillation, extend it to unsupervised, semi-supervised and multitask learning scenarios, and illustrate its efficacy on a variety of numerical simulations on both synthetic and real-world data.
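Though only the abstract is shown here, the recipe it refers to is compact enough to sketch: fit a teacher on the privileged representation, soften its outputs with a temperature, and fit a student on the regular features against a mixture of the hard labels and the teacher's soft labels, weighted by an imitation parameter. The snippet below is a minimal NumPy illustration of that idea on synthetic binary-classification data; it is not the authors' code, and the names (fit_logreg, temperature, imitation) and data-generating choices are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup: the privileged features x_priv fully determine the label,
# while the regular features x are a noisy view available at test time.
n, d = 500, 10
w_true = rng.normal(size=d)

def make_split(n):
    x_priv = rng.normal(size=(n, d))          # privileged representation (train-time only)
    y = (x_priv @ w_true > 0).astype(float)   # hard labels
    x = x_priv + rng.normal(scale=1.0, size=(n, d))  # degraded features
    return x_priv, x, y

x_priv_tr, x_tr, y_tr = make_split(n)
_, x_te, y_te = make_split(n)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logreg(features, targets, lr=0.1, epochs=2000):
    """Logistic regression fit by gradient descent on cross-entropy.
    Works for soft targets in [0, 1] as well as hard 0/1 labels."""
    w = np.zeros(features.shape[1])
    for _ in range(epochs):
        p = sigmoid(features @ w)
        w -= lr * features.T @ (p - targets) / len(targets)
    return w

# Step 1: the teacher learns from the privileged representation.
w_teacher = fit_logreg(x_priv_tr, y_tr)

# Step 2: soften the teacher's scores with a temperature T > 0.
temperature = 2.0
soft_labels = sigmoid(x_priv_tr @ w_teacher / temperature)

# Step 3: the student learns from the regular features against a mixture of
# the teacher's soft labels and the hard labels (imitation weight in [0, 1]).
# Training on the mixed targets with cross-entropy equals the weighted sum of
# the two cross-entropy terms, since the gradient w.r.t. the logit is p - target.
imitation = 0.5
mixed_targets = imitation * soft_labels + (1.0 - imitation) * y_tr
w_student = fit_logreg(x_tr, mixed_targets)

# Baseline: a student trained on the hard labels alone, without the teacher.
w_plain = fit_logreg(x_tr, y_tr)

accuracy = lambda w: float(((sigmoid(x_te @ w) > 0.5) == y_te).mean())
print(f"distilled student: {accuracy(w_student):.3f}  plain student: {accuracy(w_plain):.3f}")
```

In this toy setup the privileged features are the clean inputs and the regular features a noisy copy of them, so the teacher's softened outputs carry confidence information that the hard labels alone do not; that is the intuition the framework builds on, though the choice of temperature and imitation weight here is arbitrary.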


