muNet: Evolving Pretrained Deep Neural Networks into Scalable Auto-tuning Multitask Systems

05/22/2022
by Andrea Gesmundo et al.

Most uses of machine learning today involve training a model from scratch for a particular task, or sometimes starting with a model pretrained on a related task and then fine-tuning it on a downstream task. Both approaches offer limited knowledge transfer between tasks, require time-consuming human-driven customization for each individual task, and incur high computational costs, especially when starting from randomly initialized models. We propose a method that uses the layers of a pretrained deep neural network as building blocks to construct an ML system that can jointly solve an arbitrary number of tasks. The resulting system can leverage cross-task knowledge transfer while being immune to common drawbacks of multitask approaches such as catastrophic forgetting, gradient interference and negative transfer. We define an evolutionary approach designed to jointly select the prior knowledge relevant for each task, choose the subset of the model parameters to train and dynamically auto-tune its hyperparameters. Furthermore, a novel scale control method is employed to achieve quality/size trade-offs that outperform common fine-tuning techniques. Compared with standard fine-tuning on a benchmark of 10 diverse image classification tasks, the proposed model improves the average accuracy by 2.39% while using 47% fewer parameters per task.
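The abstract describes the method only at a high level. The short Python sketch below illustrates what such an evolutionary loop could look like: candidates assembled from the layers of a pretrained network, mutations that decide which layers are reused, which parameters are trained and how hyperparameters are perturbed, and per-task selection of the best candidate. It is a minimal toy under stated assumptions, not the authors' implementation; the candidate encoding, mutation operators, scoring function and all names (Candidate, mutate, score, evolve, NUM_PRETRAINED_LAYERS, the task names) are illustrative, and the placeholder score stands in for actually training the unfrozen parameters on a task.

# Illustrative-only sketch: evolve per-task candidates that reuse layers of a
# pretrained network, choose which parameters to train, and auto-tune
# hyperparameters via mutation. All names and numbers are assumptions.
import copy
import random
from dataclasses import dataclass, field

NUM_PRETRAINED_LAYERS = 12  # assumed depth of the pretrained backbone

@dataclass
class Candidate:
    layers: list        # indices of pretrained layers reused as building blocks
    trainable: list     # per-layer flag: True = fine-tune, False = keep frozen
    hparams: dict = field(default_factory=lambda: {"lr": 1e-3, "dropout": 0.1})

def mutate(parent: Candidate) -> Candidate:
    """Apply one random mutation: re-wire a layer, toggle training, or perturb a hyperparameter."""
    child = copy.deepcopy(parent)
    op = random.choice(["swap_layer", "toggle_trainable", "perturb_hparam"])
    if op == "swap_layer":
        i = random.randrange(len(child.layers))
        child.layers[i] = random.randrange(NUM_PRETRAINED_LAYERS)
    elif op == "toggle_trainable":
        i = random.randrange(len(child.trainable))
        child.trainable[i] = not child.trainable[i]
    else:
        key = random.choice(list(child.hparams))
        child.hparams[key] *= random.choice([0.5, 2.0])
    return child

def score(candidate: Candidate, task: str) -> float:
    """Placeholder fitness: a real system would train the unfrozen parameters on
    `task` and return validation accuracy, minus a size penalty to control the
    quality/size trade-off mentioned in the abstract."""
    return random.random() - 0.01 * sum(candidate.trainable)

def evolve(task: str, generations: int = 10, population_size: int = 8) -> Candidate:
    population = [
        Candidate(layers=list(range(NUM_PRETRAINED_LAYERS)),
                  trainable=[False] * NUM_PRETRAINED_LAYERS)
        for _ in range(population_size)
    ]
    for _ in range(generations):
        ranked = sorted(population, key=lambda c: score(c, task), reverse=True)
        parents = ranked[: population_size // 2]  # keep the best half
        population = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(population, key=lambda c: score(c, task))

if __name__ == "__main__":
    for task in ["cifar100", "stanford_cars"]:  # hypothetical task names
        best = evolve(task)
        print(task, sum(best.trainable), best.hparams)

Note that in this sketch each task keeps its own candidate, and therefore its own choice of reused layers and trainable parameters; informally, this is the kind of per-task isolation that the abstract credits with avoiding catastrophic forgetting and gradient interference, since parameters trained for one task are never overwritten on behalf of another.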

Related research

05/25/2022
An Evolutionary Approach to Dynamic Introduction of Tasks in Large-scale Multitask Learning Systems
Multitask learning assumes that models capable of learning from multiple...

01/29/2023
Debiased Fine-Tuning for Vision-language Models by Prompt Regularization
We present a new paradigm for fine-tuning large-scale vision-language pre...

06/29/2016
Learning without Forgetting
When building a unified vision system or gradually adding new capabiliti...

04/03/2022
Revisiting a kNN-based Image Classification System with High-capacity Storage
In existing image classification systems that use deep neural networks, ...

07/02/2020
Learn Faster and Forget Slower via Fast and Stable Task Adaptation
Training Deep Neural Networks (DNNs) is still highly time-consuming and ...

02/15/2023
The Expressive Power of Tuning Only the Norm Layers
Feature normalization transforms such as Batch and Layer-Normalization h...

01/16/2023
Multimodal Side-Tuning for Document Classification
In this paper, we propose to exploit the side-tuning framework for multi...
