Semi-Supervised Learning with Declaratively Specified Entropy Constraints
We propose a technique for declaratively specifying strategies for semi-supervised learning (SSL). The proposed method can be used to specify ensembles of semi-supervised learners, as well as agreement and entropic regularization constraints between these learners, and can model both well-known heuristics such as co-training and novel domain-specific heuristics. Beyond representing individual SSL heuristics, we show that multiple heuristics can be combined automatically using Bayesian optimization methods. We report consistent improvements on a suite of well-studied SSL benchmarks, including a new state-of-the-art result on a difficult relation extraction task.
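To illustrate the kinds of constraints the abstract describes, the sketch below shows a toy objective for a two-learner SSL ensemble that combines a supervised loss, an entropy-regularization term on unlabeled data, and an agreement penalty between the learners. It is not the paper's declarative specification language; all function names, variable names, and weights are hypothetical and chosen only for exposition.

```python
# Illustrative sketch only: a toy objective combining (1) supervised
# cross-entropy, (2) entropy regularization on unlabeled data, and
# (3) an agreement penalty between two learners. Names are hypothetical.
import numpy as np

def entropy(p, eps=1e-12):
    """Shannon entropy of each row of a probability matrix."""
    return -np.sum(p * np.log(p + eps), axis=1)

def ssl_objective(p1_lab, p2_lab, y_lab, p1_unlab, p2_unlab,
                  w_entropy=0.1, w_agree=0.1, eps=1e-12):
    """Toy SSL loss for an ensemble of two probabilistic learners.

    p1_lab, p2_lab : (n_lab, k) class-probability predictions on labeled data
    y_lab          : (n_lab,) integer gold labels
    p1_unlab, p2_unlab : (n_unlab, k) predictions on unlabeled data
    """
    n_lab = len(y_lab)
    # Supervised cross-entropy, averaged over the two learners.
    ce1 = -np.log(p1_lab[np.arange(n_lab), y_lab] + eps).mean()
    ce2 = -np.log(p2_lab[np.arange(n_lab), y_lab] + eps).mean()
    supervised = 0.5 * (ce1 + ce2)
    # Entropy regularization: prefer confident predictions on unlabeled data.
    ent = 0.5 * (entropy(p1_unlab).mean() + entropy(p2_unlab).mean())
    # Agreement constraint: penalize disagreement between the two learners'
    # predictive distributions on the unlabeled data.
    agree = np.mean(np.sum((p1_unlab - p2_unlab) ** 2, axis=1))
    return supervised + w_entropy * ent + w_agree * agree
```

In this toy form, the relative weights `w_entropy` and `w_agree` stand in for the constraint strengths that, per the abstract, could be tuned automatically (e.g., via Bayesian optimization) rather than set by hand.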