Simple Distillation Baselines for Improving Small Self-supervised Models

06/21/2021
by Jindong Gu, et al.

While large self-supervised models have rivalled the performance of their supervised counterparts, small models still struggle. In this report, we explore simple baselines for improving small self-supervised models via distillation, called SimDis. Specifically, we present an offline-distillation baseline, which establishes a new state-of-the-art, and an online-distillation baseline, which achieves similar performance with minimal computational overhead. We hope these baselines will serve as a useful reference for future research. Code is available at: https://github.com/JindongGu/SimDis/
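For context, a minimal sketch of what a feature-level distillation step for a small self-supervised student could look like is given below. It assumes a frozen, pre-trained teacher (the offline setting) and an MSE-style loss on L2-normalized embeddings; the function names, loss choice, and training loop are illustrative assumptions, not the exact SimDis objective (see the repository linked above for the authors' implementation).

# Illustrative sketch of offline feature distillation, not the exact SimDis method.
import torch
import torch.nn.functional as F

def distillation_loss(student_emb, teacher_emb):
    # Match L2-normalized student embeddings to the teacher's embeddings.
    s = F.normalize(student_emb, dim=-1)
    t = F.normalize(teacher_emb, dim=-1)
    # Equivalent to MSE between unit vectors: ||s - t||^2 = 2 - 2*cos(s, t).
    return (2 - 2 * (s * t).sum(dim=-1)).mean()

def train_step(student, teacher, images, optimizer):
    # Teacher is pre-trained and kept frozen in the offline setting.
    with torch.no_grad():
        teacher_emb = teacher(images)
    student_emb = student(images)
    loss = distillation_loss(student_emb, teacher_emb)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

In an online variant, the teacher would be trained jointly with the student rather than frozen, which is what allows distillation at minimal extra cost; the sketch above only covers the simpler offline case.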
