Multitask and Transfer Learning for Autotuning Exascale Applications

08/15/2019
by Wissam M. Sid-Lakhdar, et al.

Multitask learning and transfer learning have proven useful in machine learning when additional knowledge is available to aid a prediction task. We aim to derive methods following these paradigms for use in autotuning, where the goal is to find the optimal performance parameters of an application treated as a black-box function. We present comparative results against state-of-the-art autotuning techniques; for instance, we observe an average 1.5x improvement in application runtime compared to the OpenTuner and HpBandSter autotuners. We explain why our approaches can be more suitable than some state-of-the-art autotuners for tuning applications in general, and expensive exascale applications in particular.
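To make the setting concrete, autotuning treats each application run as one evaluation of a black-box objective (here, runtime) over a discrete parameter space. Below is a minimal random-search sketch of that setup; it is illustrative only and is not the paper's method, which instead leverages multitask and transfer learning across related tuning tasks. The parameter names (`block_size`, `threads`) and the toy objective are hypothetical.

```python
import random

def autotune(objective, space, budget=50, seed=0):
    """Minimal black-box autotuner: randomly sample configurations from a
    discrete parameter space and keep the one with the lowest objective
    value (e.g., measured runtime). Illustrative sketch only."""
    rng = random.Random(seed)
    best_cfg, best_val = None, float("inf")
    for _ in range(budget):
        # Draw one configuration from the search space.
        cfg = {name: rng.choice(values) for name, values in space.items()}
        val = objective(cfg)  # one black-box run of the application
        if val < best_val:
            best_cfg, best_val = cfg, val
    return best_cfg, best_val

# Toy stand-in for "application runtime" as a black-box function of two
# hypothetical tuning parameters (lower is better).
def fake_runtime(cfg):
    return abs(cfg["block_size"] - 64) + 10 * abs(cfg["threads"] - 8)

space = {"block_size": [16, 32, 64, 128], "threads": [1, 2, 4, 8, 16]}
best_cfg, best_val = autotune(fake_runtime, space, budget=100)
```

In practice each `objective(cfg)` call is an expensive application run, which is exactly why sample-efficient strategies such as the paper's multitask/transfer-learning approaches, or the Bayesian/bandit methods in HpBandSter, are preferred over plain random search at exascale.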

Related research

- An Integrated Transfer Learning and Multitask Learning Approach for Pharmacokinetic Parameter Prediction (12/21/2018)
- Transfer Learning without Knowing: Reprogramming Black-box Machine Learning Models with Scarce Data and Limited Resources (07/17/2020)
- jiant: A Software Toolkit for Research on General-Purpose Text Understanding Models (03/04/2020)
- Sparse coding for multitask and transfer learning (09/04/2012)
- Multitask Prompt Tuning Enables Parameter-Efficient Transfer Learning (03/06/2023)
- Dynamic Knowledge Distillation for Black-box Hypothesis Transfer Learning (07/24/2020)
- Just Pick a Sign: Optimizing Deep Multitask Models with Gradient Sign Dropout (10/14/2020)
