Augment with Care: Contrastive Learning for the Boolean Satisfiability Problem

02/17/2022
by Haonan Duan, et al.

Supervised learning can improve the design of state-of-the-art solvers for combinatorial problems, but labelling large numbers of combinatorial instances is often impractical due to exponential worst-case complexity. Inspired by the recent success of contrastive pre-training for images, we conduct a scientific study of the effect of augmentation design on contrastive pre-training for the Boolean satisfiability problem. While typical graph contrastive pre-training uses label-agnostic augmentations, our key insight is that many combinatorial problems have well-studied invariances, which allow for the design of label-preserving augmentations. We find that label-preserving augmentations are critical for the success of contrastive pre-training. We show that our representations are able to achieve comparable test accuracy to fully-supervised learning while using only 1% of the labels. We also demonstrate that our representations are more transferable to larger problems from unseen domains.
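To make the idea concrete, here is a minimal Python sketch of the kind of label-preserving augmentations the abstract alludes to, using a DIMACS-style CNF encoding (a formula as a list of clauses, each a list of signed integer literals). The three transformations shown, variable renaming, polarity flipping, and clause/literal shuffling, are standard SAT invariances used here for illustration; the paper's actual augmentation set may differ.

```python
import random

# A CNF formula in DIMACS-style encoding: a list of clauses, each clause a
# list of non-zero integers; literal v means variable v, -v its negation.
# Each transformation below preserves satisfiability, so the SAT/UNSAT label
# of the instance is unchanged. (Illustrative sketch, not the paper's code.)

def permute_variables(cnf, rng=random):
    """Rename variables with a random permutation (an invariance of SAT)."""
    num_vars = max(abs(lit) for clause in cnf for lit in clause)
    perm = list(range(1, num_vars + 1))
    rng.shuffle(perm)
    mapping = {v: perm[v - 1] for v in range(1, num_vars + 1)}
    return [[mapping[abs(l)] * (1 if l > 0 else -1) for l in clause]
            for clause in cnf]

def flip_polarity(cnf, rng=random):
    """Negate every occurrence of one random variable (substitute x := not x)."""
    num_vars = max(abs(lit) for clause in cnf for lit in clause)
    v = rng.randint(1, num_vars)
    return [[-l if abs(l) == v else l for l in clause] for clause in cnf]

def shuffle_clauses(cnf, rng=random):
    """Reorder clauses and literals; a CNF formula is a set of sets."""
    out = [clause[:] for clause in cnf]
    for clause in out:
        rng.shuffle(clause)
    rng.shuffle(out)
    return out

if __name__ == "__main__":
    # (x1 v ~x2) & (x2 v x3) & (~x1 v ~x3): satisfiable.
    cnf = [[1, -2], [2, 3], [-1, -3]]
    view1 = shuffle_clauses(flip_polarity(permute_variables(cnf)))
    view2 = shuffle_clauses(flip_polarity(permute_variables(cnf)))
    print(view1, view2)
```

Composing two independent draws of these transformations yields two "views" of the same instance with an identical SAT/UNSAT label, which is exactly what a contrastive objective needs as a positive pair.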


Related research

- Recovering Petaflops in Contrastive Semi-Supervised Learning of Visual Representations (06/18/2020): We investigate a strategy for improving the computational efficiency of ...

- Robust Pre-Training by Adversarial Contrastive Learning (10/26/2020): Recent work has shown that, when integrated with adversarial training, s...

- Relation Extraction with Weighted Contrastive Pre-training on Distant Supervision (05/18/2022): Contrastive pre-training on distant supervision has shown remarkable eff...

- Towards noise robust trigger-word detection with contrastive learning pre-task for fast on-boarding of new trigger-words (11/06/2021): Trigger-word detection plays an important role as the entry point of use...

- Patient Contrastive Learning: a Performant, Expressive, and Practical Approach to ECG Modeling (04/09/2021): Supervised machine learning applications in health care are often limite...

- ConCL: Concept Contrastive Learning for Dense Prediction Pre-training in Pathology Images (07/14/2022): Detecting and segmenting objects within whole slide images is essential in compu...

- Soft-Labeled Contrastive Pre-training for Function-level Code Representation (10/18/2022): Code contrastive pre-training has recently achieved significant progress...
