Everything old is new again: A multi-view learning approach to learning using privileged information and distillation

03/08/2019
by   Weiran Wang, et al.

We adopt a multi-view approach for analyzing two knowledge transfer settings---learning using privileged information (LUPI) and distillation---in a common framework. Under reasonable assumptions about the complexities of the hypothesis spaces, and being optimistic about the expected loss achievable by the student (in distillation) and by a transformed teacher predictor (in LUPI), we show that encouraging agreement between the teacher and the student leads to a reduced search space. As a result, an improved convergence rate can be obtained with regularized empirical risk minimization.
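As a concrete illustration of the agreement-based regularization the abstract describes, the sketch below writes a regularized empirical risk in PyTorch: the usual supervised loss plus a penalty that encourages the student's predictions to agree with a fixed teacher's. This is only a minimal sketch under assumptions not stated in the abstract---the Hinton-style softened KL agreement term, the function name `distillation_erm_loss`, and the weight `lambda_agree` are illustrative choices, not the paper's exact objective.

```python
import torch
import torch.nn.functional as F

def distillation_erm_loss(student_logits, teacher_logits, targets,
                          lambda_agree=1.0, temperature=2.0):
    """Regularized empirical risk: supervised loss plus a teacher-agreement
    penalty (an illustrative stand-in for the paper's regularizer)."""
    # Standard empirical risk on the labeled examples.
    erm = F.cross_entropy(student_logits, targets)
    # Agreement term: KL divergence between temperature-softened teacher and
    # student distributions, the common distillation surrogate for "agreement".
    agree = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    return erm + lambda_agree * agree
```

In a typical training loop, the teacher's logits would be computed under `torch.no_grad()` so that only the student is updated; the agreement weight controls how strongly the student is pulled toward the teacher, and hence how much the effective search space shrinks.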


Related research:

02/22/2021 · Multi-View Feature Representation for Dialogue Generation with Bidirectional Distillation
12/19/2021 · Controlling the Quality of Distillation in Response-Based Network Compression
03/28/2023 · Enhancing Depth Completion with Multi-View Monitored Distillation
11/16/2018 · A generalized meta-loss function for distillation and learning using privileged information for classification and regression
12/07/2021 · ADD: Frequency Attention and Multi-View based Knowledge Distillation to Detect Low-Quality Compressed Deepfake Images
03/11/2022 · A New Learning Paradigm for Stochastic Configuration Network: SCN+
