Modularizing Deep Learning via Pairwise Learning With Kernels

05/12/2020
by Shiyu Duan, et al.

By redefining the conventional notions of layers, we present an alternative view of finitely wide, fully trainable deep neural networks as stacked linear models in feature spaces, leading to a kernel machine interpretation. Based on this construction, we then propose a provably optimal modular learning framework for classification that avoids between-module backpropagation. This modular training approach brings new insights into the label requirements of deep learning: it leverages weak pairwise labels when learning the hidden modules. When training the output module, on the other hand, it requires full supervision but achieves high label efficiency, needing as few as 10 randomly selected labeled examples (one from each class) to achieve 94.88% accuracy on CIFAR-10 using a ResNet-18 backbone. Moreover, modular training enables fully modularized deep learning workflows, which in turn simplify the design and implementation of pipelines and improve the maintainability and reusability of models. To showcase the advantages of such a modularized workflow, we describe a simple yet reliable method for estimating the reusability of pre-trained modules as well as task transferability in a transfer learning setting. At practically no computational overhead, it precisely describes the task-space structure of 15 binary classification tasks from CIFAR-10.
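The abstract sketches a two-stage recipe: hidden modules are trained with only weak pairwise labels (whether two examples share a class), and the output module alone sees full labels. The paper's exact kernel objective is not given here, so the following is a minimal PyTorch sketch of that two-stage idea under stated assumptions: a cosine-similarity surrogate stands in for the kernel, the modules and data are toy placeholders, and all names and hyperparameters are illustrative rather than the authors' implementation.

```python
# Minimal sketch of modular, pairwise-supervised training (illustrative only).
# Stage 1 trains a hidden module from weak pairwise labels; Stage 2 trains the
# output module with full labels on top of the frozen hidden module.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical hidden module: any feature extractor could take this place.
hidden = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 64))

def pairwise_loss(module, x1, x2, same_class):
    # Push a similarity score (cosine here, as a stand-in for the kernel)
    # toward the weak pairwise label: 1 if same class, 0 otherwise.
    k = F.cosine_similarity(module(x1), module(x2))      # values in [-1, 1]
    return F.mse_loss((k + 1) / 2, same_class.float())   # rescaled to [0, 1]

# --- Stage 1: hidden module, pairwise labels only ---
opt = torch.optim.Adam(hidden.parameters(), lr=1e-3)
for _ in range(200):
    x1, x2 = torch.randn(128, 32), torch.randn(128, 32)  # toy input pairs
    same = torch.randint(0, 2, (128,))                    # weak pairwise labels
    opt.zero_grad()
    pairwise_loss(hidden, x1, x2, same).backward()
    opt.step()

# --- Stage 2: output module, full labels, hidden module frozen ---
for p in hidden.parameters():
    p.requires_grad_(False)
output = nn.Linear(64, 10)                                # e.g., 10 CIFAR-10 classes
opt = torch.optim.Adam(output.parameters(), lr=1e-3)
for _ in range(200):
    x, y = torch.randn(128, 32), torch.randint(0, 10, (128,))
    opt.zero_grad()
    F.cross_entropy(output(hidden(x)), y).backward()
    opt.step()
```

The structural point is that the two loops share no gradients: no error signal flows between modules, so each module can be trained, saved, and reused independently, which is what enables the modularized workflows the abstract describes.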

Related research

Finding trainable sparse networks through Neural Tangent Transfer (06/15/2020)
Deep neural networks have dramatically transformed machine learning, but...

Improving Sample Efficiency with Normalized RBF Kernels (07/30/2020)
In deep learning models, learning more with less data is becoming more i...

Improving Transferability of Deep Neural Networks (07/30/2018)
Learning from small amounts of labeled data is a challenge in the area o...

Training Deep Architectures Without End-to-End Backpropagation: A Brief Survey (01/09/2021)
This tutorial paper surveys training alternatives to end-to-end backprop...

Transfer Learning Between Related Tasks Using Expected Label Proportions (09/01/2019)
Deep learning systems thrive on abundance of labeled training data but s...

Intra-Model Collaborative Learning of Neural Networks (05/20/2021)
Recently, collaborative learning proposed by Song and Chai has achieved ...

Playing with blocks: Toward re-usable deep learning models for side-channel profiled attacks (03/16/2022)
This paper introduces a deep learning modular network for side-channel a...
