Continuous Weight Balancing

03/30/2021
by Daniel J Wu, et al.

We propose a simple method for choosing sample weights for problems with highly imbalanced or skewed traits. Rather than naively discretizing regression labels to find binned weights, we take a more principled approach: we derive sample weights from the transfer function between an estimated source distribution and a specified target distribution. Our method outperforms both unweighted and discretely-weighted models on both regression and classification tasks. We also open-source our implementation of this method (https://github.com/Daniel-Wu/Continuous-Weight-Balancing) to the scientific community.
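The core idea lends itself to a short sketch: estimate the density of the observed (source) labels, specify the density you want the model to train against (the target), and weight each sample by the ratio of target to source density at its label. The code below is an illustration under those assumptions, not the authors' released implementation; the function name `continuous_balance_weights`, the Gaussian KDE source estimate, and the uniform target are hypothetical defaults.

```python
import numpy as np
from scipy.stats import gaussian_kde

def continuous_balance_weights(y, target_pdf=None, eps=1e-12):
    """Per-sample weights that reweight the estimated source label
    distribution toward a specified target distribution."""
    y = np.asarray(y, dtype=float)
    # Estimated source density p_s(y): a Gaussian kernel density estimate.
    source_pdf = gaussian_kde(y)
    # Specified target density p_t(y): uniform over the label range
    # unless the caller provides something else.
    if target_pdf is None:
        lo, hi = y.min(), y.max()
        target_pdf = lambda v: np.full_like(v, 1.0 / (hi - lo))
    # Transfer-function weights w(y) = p_t(y) / p_s(y), with the source
    # density floored to avoid dividing by a vanishing estimate.
    weights = target_pdf(y) / np.maximum(source_pdf(y), eps)
    # Normalize so the weights average to 1 across the dataset.
    return weights * (len(y) / weights.sum())
```

The resulting weights can be passed to any estimator that accepts per-sample weights, e.g. `model.fit(X, y, sample_weight=continuous_balance_weights(y))`. Note that with a uniform target over discrete classes, the same density ratio reduces to ordinary inverse-frequency class balancing.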


