Sparse Structure Search for Parameter-Efficient Tuning

06/15/2022
by Shengding Hu, et al.

Adapting large pre-trained models (PTMs) through fine-tuning imposes prohibitive computational and storage burdens. Recent studies of parameter-efficient tuning (PET) find that optimizing only a small portion of parameters conditioned on PTMs can yield performance on par with conventional fine-tuning. Generally, PET methods carefully design parameter-efficient modules (PET modules) that can be applied at arbitrary fine-grained positions inside PTMs. However, the effectiveness of these fine-grained positions largely relies on sophisticated manual designation, which usually produces sub-optimal results. In contrast to manual designation, we explore constructing PET modules automatically. We automatically Search for the Sparse Structure of Parameter-Efficient Tuning (S^3PET). Building on a unified framework covering various PET methods, S^3PET conducts a differentiable PET structure search through bi-level optimization and proposes a shifted global sigmoid method to explicitly control the number of trainable parameters. Extensive experiments show that S^3PET surpasses manual and random structures with fewer trainable parameters. The searched structures preserve more than 99% of fine-tuning performance with 0.01% trainable parameters. Moreover, the advantage of S^3PET is amplified under extremely low trainable-parameter budgets (0.0009%∼0.01%). The searched structures are transferable and explainable, providing suggestions and guidance for the future design of PET methods.
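To make the search idea concrete, below is a minimal PyTorch sketch of one plausible reading of the shifted-sigmoid gating described in the abstract. The class name `ShiftedSigmoidGate`, the parameter `shift`, and the budget proxy are hypothetical illustrations, not the paper's actual implementation; the abstract only states that a shifted global sigmoid is used to explicitly control the number of trainable parameters.

```python
import torch
import torch.nn as nn

class ShiftedSigmoidGate(nn.Module):
    """Hypothetical sketch: gate each candidate PET position with a
    learnable architecture score squashed by a shifted sigmoid, so that
    most gates saturate near 0 and only a small set of PET modules
    remains active (and hence trainable)."""

    def __init__(self, num_candidates: int, shift: float = 4.0):
        super().__init__()
        # One architecture parameter per candidate fine-grained position.
        self.alpha = nn.Parameter(torch.zeros(num_candidates))
        # Larger shift pushes gates toward 0, i.e. a sparser structure.
        # (Illustrative assumption; the paper's exact form may differ.)
        self.shift = shift

    def forward(self) -> torch.Tensor:
        # Shifting the sigmoid input means a gate opens only if its
        # alpha grows large enough, bounding the expected number of
        # active PET modules in a differentiable way.
        return torch.sigmoid(self.alpha - self.shift)

gate = ShiftedSigmoidGate(num_candidates=12)
probs = gate()                 # per-position activation probabilities
expected_active = probs.sum()  # differentiable proxy for the budget
```

In a bi-level setup of the kind the abstract mentions, an inner loop would update the PET module weights on training data while an outer loop updates `alpha` on held-out data, with the budget proxy (or the shift itself) keeping the searched structure sparse.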
