Offsite-Tuning: Transfer Learning without Full Model

02/09/2023
by Guangxuan Xiao, et al.

Transfer learning is important for foundation models to adapt to downstream tasks. However, many foundation models are proprietary, so users must share their data with the model owners to fine-tune the models, which is costly and raises privacy concerns. Moreover, fine-tuning large foundation models is computation-intensive and impractical for most downstream users. In this paper, we propose Offsite-Tuning, a privacy-preserving and efficient transfer learning framework that can adapt billion-parameter foundation models to downstream data without access to the full model. In offsite-tuning, the model owner sends a lightweight adapter and a lossy compressed emulator to the data owner, who then fine-tunes the adapter on the downstream data with the emulator's assistance. The fine-tuned adapter is returned to the model owner, who plugs it into the full model to create an adapted foundation model. Offsite-tuning preserves both parties' privacy and is computationally more efficient than existing fine-tuning methods that require access to the full model weights. We demonstrate the effectiveness of offsite-tuning on various large language and vision foundation models: it achieves accuracy comparable to full-model fine-tuning while being privacy-preserving and efficient, with a 6.5x speedup and a 5.6x memory reduction. Code is available at https://github.com/mit-han-lab/offsite-tuning.
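To make the adapter/emulator split concrete, here is a minimal PyTorch sketch of one way such a split could look, assuming a transformer whose blocks are exposed as an nn.ModuleList. The function names (split_offsite, downstream_forward), the choice of two blocks per adapter, and the plain uniform layer-drop are illustrative assumptions, not the API of the linked repository; the paper additionally distills the dropped middle layers into the emulator, which is omitted here for brevity.

```python
import copy
import torch.nn as nn

def split_offsite(blocks: nn.ModuleList, num_adapter: int = 2, keep_every: int = 3):
    """Split a stack of transformer blocks for offsite-tuning.

    Returns (bottom_adapter, emulator, top_adapter):
      * adapters: exact copies of the first/last `num_adapter` blocks, trainable;
      * emulator: a lossy stand-in for the frozen middle blocks, built here by
        uniform layer-drop (the paper also distills it; omitted for brevity).
    """
    bottom = nn.ModuleList(copy.deepcopy(b) for b in blocks[:num_adapter])
    top = nn.ModuleList(copy.deepcopy(b) for b in blocks[len(blocks) - num_adapter:])
    middle = blocks[num_adapter:len(blocks) - num_adapter]
    emulator = nn.ModuleList(copy.deepcopy(b) for b in middle[::keep_every])
    for p in emulator.parameters():
        p.requires_grad_(False)  # the emulator stays frozen on the data owner's side
    return bottom, emulator, top

def downstream_forward(x, bottom, emulator, top):
    """Data owner's forward pass: exact adapters sandwich the lossy emulator.

    Assumes each block maps a hidden-state tensor to a hidden-state tensor.
    """
    for blk in list(bottom) + list(emulator) + list(top):
        x = blk(x)
    return x
```

In this sketch, only the adapter parameters would receive gradients during downstream training; after fine-tuning, the adapters are shipped back and re-inserted around the original, uncompressed middle layers, so the model owner obtains the adapted full model without ever seeing the downstream data.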


Related research

- 11/24/2021 · One to Transfer All: A Universal Transfer Framework for Vision Foundation Model with Few Data ("The foundation model is not the last chapter of the model production pip...")
- 03/23/2023 · MELTR: Meta Loss Transformer for Learning to Fine-tune Video Foundation Models ("Foundation models have shown outstanding performance and generalization...")
- 09/14/2023 · When is a Foundation Model a Foundation Model ("Recently, several studies have reported on the fine-tuning of foundation...")
- 11/29/2022 · On the power of foundation models ("With infinitely many high-quality data points, infinite computational po...")
- 03/29/2023 · RetClean: Retrieval-Based Data Cleaning Using Foundation Models and Data Lakes ("Can foundation models (such as ChatGPT) clean your data? In this proposa...")
- 08/25/2023 · Fine-tuning can cripple your foundation model; preserving features may be the solution ("Pre-trained foundation models, owing primarily to their enormous capacit...")
- 09/21/2023 · SAM-OCTA: A Fine-Tuning Strategy for Applying Foundation Model to OCTA Image Segmentation Tasks ("In the analysis of optical coherence tomography angiography (OCTA) image...")
