Learning SMaLL Predictors

03/06/2018
by Vikas K. Garg, et al.

We present a new machine learning technique for training small resource-constrained predictors. Our algorithm, the Sparse Multiprototype Linear Learner (SMaLL), is inspired by the classic machine learning problem of learning k-DNF Boolean formulae. We present a formal derivation of our algorithm and demonstrate the benefits of our approach with a detailed empirical study.
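The abstract does not spell out the prediction rule, but the name "Sparse Multiprototype Linear Learner" together with the k-DNF inspiration suggests a predictor built from several sparse linear threshold rules combined disjunctively: an example is labeled positive if at least one prototype fires. The sketch below illustrates that structure under this assumption; the class name, prototypes, and biases are hypothetical, and it is not the SMaLL training algorithm from the paper.

# Illustrative sketch only (not the authors' implementation): a multiprototype
# linear predictor keeps k sparse weight vectors ("prototypes") and predicts
# positive if any prototype fires, mirroring the disjunctive form of a k-DNF.
import numpy as np

class SparseMultiPrototypePredictor:
    def __init__(self, prototypes, biases):
        # prototypes: (k, d) array of sparse weight vectors; biases: (k,) array
        self.prototypes = np.asarray(prototypes)
        self.biases = np.asarray(biases)

    def predict(self, X):
        # Score each example against every prototype and take the max:
        # label +1 if at least one prototype's linear rule is positive.
        scores = X @ self.prototypes.T + self.biases  # shape (n, k)
        return np.where(scores.max(axis=1) > 0, 1, -1)

# Example usage with two hand-picked sparse prototypes in 5 dimensions.
if __name__ == "__main__":
    prototypes = np.array([[1.0, 0.0, 0.0, -1.0, 0.0],
                           [0.0, 2.0, 0.0, 0.0, 0.0]])
    biases = np.array([-0.5, -1.0])
    X = np.array([[1.0, 0.0, 0.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0, 1.0, 0.0]])
    model = SparseMultiPrototypePredictor(prototypes, biases)
    print(model.predict(X))  # prints [ 1 -1]

Because each prototype is sparse, evaluating such a rule requires only a handful of multiplications per prototype, which is what makes this family of predictors attractive in resource-constrained settings.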
