S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration

02/17/2021
by   Zhiqiang Shen, et al.

Previous studies have predominantly targeted self-supervised learning on real-valued networks and achieved many promising results. However, on the more challenging binary neural networks (BNNs), this task has not yet been fully explored by the community. In this paper, we focus on this more difficult scenario: learning networks where both weights and activations are binary, and doing so without any human-annotated labels. We observe that the commonly used contrastive objective does not yield competitive accuracy on BNNs, since the backbone network has relatively limited capacity and representation ability. Hence, instead of directly applying existing self-supervised methods, which cause a severe decline in performance, we present a novel guided learning paradigm that distills a real-valued network into the binary network on the final prediction distribution, minimizing the loss and obtaining desirable accuracy. Our proposed method boosts the simple contrastive learning baseline by an absolute gain of 5.5~15% on BNNs. We further reveal that it is difficult for BNNs to recover predictive distributions similar to those of real-valued models when training without labels; thus, how to calibrate them is key to addressing the performance degradation. Extensive experiments are conducted on the large-scale ImageNet and downstream datasets. Our method achieves substantial improvement over the simple contrastive learning baseline, and is even comparable to many mainstream supervised BNN methods. Code will be made available.
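The guided paradigm described above amounts to matching the binary student's final prediction distribution to that of a real-valued teacher. Below is a minimal PyTorch sketch of such a distribution-calibration loss, a KL divergence between the two softmax outputs; the names teacher, student, and the temperature parameter are illustrative assumptions based on standard distillation practice, not the paper's exact implementation.

import torch
import torch.nn.functional as F

def distribution_calibration_loss(student_logits: torch.Tensor,
                                  teacher_logits: torch.Tensor,
                                  temperature: float = 1.0) -> torch.Tensor:
    """KL divergence from the teacher's prediction distribution to the student's."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # 'batchmean' matches the mathematical definition of KL divergence;
    # the temperature**2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

# Hypothetical training step (teacher frozen, no labels required):
# with torch.no_grad():
#     t_logits = teacher(images)   # real-valued, self-supervised teacher
# s_logits = student(images)       # binary (1-bit) student network
# loss = distribution_calibration_loss(s_logits, t_logits)
# loss.backward()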
