Refocusing Is Key to Transfer Learning

05/24/2023
by Baifeng Shi, et al.

Transfer learning involves adapting a pre-trained model to novel downstream tasks. However, we observe that current transfer learning methods often fail to focus on task-relevant features. In this work, we emphasize the importance of refocusing the attention in transfer learning. We introduce Top-Down Attention Steering (TOAST), a novel transfer learning algorithm that keeps the pre-trained backbone frozen, selects the task-relevant elements in the output, and feeds them back into the model to steer its attention toward task-specific features. By refocusing the attention alone, TOAST achieves state-of-the-art results on a number of transfer learning benchmarks while tuning only a small fraction of the parameters. Compared to full fine-tuning, LoRA, and prompt tuning, TOAST substantially improves performance across a range of fine-grained visual classification datasets (e.g., 81.1 on FGVC). TOAST also outperforms the fully fine-tuned Alpaca model on instruction-following language generation. Code is available at https://github.com/bfshi/TOAST.
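The core idea, as described in the abstract, is a frozen backbone plus a small tunable feedback path that carries task-relevant output signals back to the input to steer attention. The sketch below is a minimal illustration of that reading, not the official implementation; the class name SteeredClassifier, the feedback projection, and the toy backbone are hypothetical stand-ins (see https://github.com/bfshi/TOAST for the actual code).

    import torch
    import torch.nn as nn

    class SteeredClassifier(nn.Module):
        # Frozen backbone (embed + encoder) with a tunable top-down feedback path.
        def __init__(self, embed: nn.Module, encoder: nn.Module, dim: int, num_classes: int):
            super().__init__()
            self.embed, self.encoder = embed, encoder
            for p in self.parameters():           # freeze all pre-trained backbone weights
                p.requires_grad = False
            # Only these small modules are tuned: the feedback projection and the task head.
            self.feedback = nn.Linear(dim, dim)
            self.head = nn.Linear(dim, num_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            tokens = self.embed(x)                # (B, N, dim) input token embeddings
            out = self.encoder(tokens)            # bottom-up pass through the frozen encoder
            top_down = self.feedback(out)         # select task-relevant signal from the output
            out = self.encoder(tokens + top_down) # feed it back to steer the second pass
            return self.head(out.mean(dim=1))     # pool tokens and classify

    # Toy usage (in practice the backbone would be a pre-trained vision transformer)
    dim = 64
    embed = nn.Linear(32, dim)                    # stand-in patch embedding
    encoder = nn.TransformerEncoder(
        nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True), num_layers=2)
    model = SteeredClassifier(embed, encoder, dim, num_classes=10)
    logits = model(torch.randn(8, 16, 32))        # batch of 8, 16 "patches" of dim 32

Because the backbone parameters are frozen before the feedback projection and head are created, only the small feedback path and classifier receive gradients, which mirrors the abstract's claim of tuning only a small fraction of the parameters.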
