Jack and Masters of All Trades: One-Pass Learning of a Set of Model Sets from Foundation AI Models

05/02/2022
by Han Xiang Choong, et al.

For deep learning, size is power. Massive neural nets trained on broad data for a spectrum of tasks are at the forefront of artificial intelligence. These foundation models, or 'Jacks of All Trades' (JATs), when fine-tuned for downstream tasks, are gaining importance in driving deep learning advancements. However, environments with tight resource constraints, changing objectives and intentions, or varied task requirements could limit the real-world utility of a single JAT. Hence, in tandem with current trends towards building increasingly large JATs, this paper conducts an initial exploration into concepts underlying the creation of a diverse set of compact machine learning model sets. We formulate the Set of Sets, composed of many smaller and specialized models, to simultaneously fulfil many task settings and environmental conditions. A means to arrive at such a set tractably in one pass of a neuroevolutionary multitasking algorithm is presented for the first time, bringing us closer to models that are collectively 'Masters of All Trades'.
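
The core idea, a single evolutionary run whose population is scored on several tasks at once so that one specialized champion per task falls out of the same pass, can be illustrated with a toy sketch. The snippet below is a minimal, hypothetical evolutionary-multitasking loop over plain weight vectors; the toy tasks, operators, and all names are illustrative assumptions, not the paper's actual algorithm.

```python
# Toy sketch of one-pass evolutionary multitasking (illustrative only):
# a single population is evaluated on several tasks simultaneously, and
# the best genome per task is retained, so one run yields a set of
# specialized models. Tasks, operators, and hyperparameters are made up.
import random

random.seed(0)
DIM, POP, GENS = 8, 40, 100

# Toy "tasks": each scores a weight vector against a different target.
targets = [[t * 0.1 * (i + 1) for i in range(DIM)] for t in range(3)]

def fitness(genome, target):
    # Negative squared error to the task's target: higher is better.
    return -sum((g - t) ** 2 for g, t in zip(genome, target))

def mutate(genome, sigma=0.1):
    # Gaussian perturbation of every gene.
    return [g + random.gauss(0, sigma) for g in genome]

population = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(POP)]
champions = [None] * len(targets)            # best genome so far, per task
best_fit = [float("-inf")] * len(targets)

for gen in range(GENS):
    # The multitasking step: every genome is scored on every task.
    scores = [[fitness(g, t) for t in targets] for g in population]
    for k in range(len(targets)):
        i = max(range(POP), key=lambda j: scores[j][k])
        if scores[i][k] > best_fit[k]:
            best_fit[k], champions[k] = scores[i][k], population[i][:]
    # Parents are chosen by tournament on their best task, so traits
    # useful for one task can implicitly transfer to the others.
    parents = [max(random.sample(range(POP), 3),
                   key=lambda j: max(scores[j]))
               for _ in range(POP)]
    population = [mutate(population[p]) for p in parents]

print(len(champions), "specialized models from one run")
print("per-task best fitness:", [round(f, 3) for f in best_fit])
```

In a full neuroevolutionary setting the genomes would encode network weights or architectures and the tasks would be real datasets and resource budgets, but the one-pass structure, shared variation with per-task elitism, is the same.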
