An Efficient Source Model Selection Framework in Model Databases

by Minjun Zhao, et al.

With the explosive growth of big data, training a Machine Learning (ML) model has become a computation-intensive workload that can take days or even weeks. Reusing an already trained model has therefore received attention, an approach known as transfer learning. Transfer learning avoids training a new model from scratch by transferring knowledge from a source task to a target task. Existing transfer learning methods mostly focus on improving the performance of the target task through a specific source model, and assume that the source model is given. In practice, many source models are available, and it is difficult for data scientists to select the best source model for the target task manually. Hence, how to efficiently select a suitable source model from a model database for reuse is an interesting but unsolved problem. In this paper, we propose SMS, an effective, efficient, and flexible source model selection framework. SMS is effective even when the source and target datasets have significantly different data labels, is flexible enough to support source models with any type of structure, and is efficient because it avoids any training process. For each source model, SMS first vectorizes the samples in the target dataset into soft labels by directly applying the model to the target dataset, then fits Gaussian distributions to the clusters of soft labels, and finally measures the distinguishing ability of the source model using a Gaussian mixture-based metric. Moreover, we present an improved SMS (I-SMS), which decreases the output number of the source model. I-SMS can significantly reduce the selection time while retaining the selection performance of SMS. Extensive experiments on a range of practical model reuse workloads demonstrate the effectiveness and efficiency of SMS.
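The scoring pipeline described above can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: it assumes soft labels have already been obtained by applying a candidate source model to the target dataset (e.g. via a classifier's `predict_proba`), fits one diagonal Gaussian per target class to those soft labels, and uses the average pairwise Bhattacharyya distance between the class Gaussians as a simple stand-in for the paper's Gaussian mixture-based distinguishing metric. All function names here are hypothetical.

```python
import numpy as np

def bhattacharyya_diag(mu1, var1, mu2, var2):
    # Bhattacharyya distance between two diagonal-covariance Gaussians:
    # larger values mean the two class clusters are easier to tell apart.
    var = (var1 + var2) / 2.0
    term1 = 0.125 * np.sum((mu1 - mu2) ** 2 / var)
    term2 = 0.5 * np.sum(np.log(var / np.sqrt(var1 * var2)))
    return term1 + term2

def sms_score(soft_labels, target_labels, eps=1e-6):
    """Score a source model by how well its soft labels separate target classes.

    soft_labels:   (n, k) array of source-model outputs on the target dataset.
    target_labels: (n,) array of target class ids.
    Returns the mean pairwise Bhattacharyya distance between per-class
    diagonal Gaussians fitted to the soft labels (higher = better model).
    """
    classes = np.unique(target_labels)
    params = []
    for c in classes:
        x = soft_labels[target_labels == c]
        # Fit a diagonal Gaussian to this class's soft-label cluster.
        params.append((x.mean(axis=0), x.var(axis=0) + eps))
    total, pairs = 0.0, 0
    for i in range(len(params)):
        for j in range(i + 1, len(params)):
            total += bhattacharyya_diag(*params[i], *params[j])
            pairs += 1
    return total / max(pairs, 1)
```

Ranking candidate source models then amounts to computing `sms_score` once per model and picking the maximum; no model is retrained, which matches the training-free property the abstract claims for SMS.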


