Wrapped Loss Function for Regularizing Nonconforming Residual Distributions

08/21/2018
by Chun Ting Liu, et al.

Multi-output learning is essential in machine learning, but it can suffer from nonconforming residual distributions, i.e., the residual distributions across the multiple outputs do not conform to the expected distribution. In this paper we propose the "Wrapped Loss Function", which wraps the original loss function to alleviate this problem. The wrapped loss function behaves like the original loss function, and its gradient can be used directly for backpropagation. Empirical evaluations show that the wrapped loss function offers faster convergence, better accuracy, and improved handling of imbalanced data.
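The listing does not reproduce the paper's formulation, but the general mechanism described in the abstract, wrapping an existing loss so its output stays differentiable for backpropagation, can be sketched as below. This is a minimal illustration only: the `wrapped_loss` helper, its per-output `weights` argument, and the choice of MSE as the base loss are assumptions for the sake of the example, not the paper's actual construction.

```python
import torch

def wrapped_loss(base_loss, weights=None):
    """Wrap an elementwise base loss so per-output contributions can be
    reweighted while remaining differentiable. Hypothetical illustration,
    not the paper's exact wrapper."""
    def loss_fn(pred, target):
        # Per-element loss, shape (batch, n_outputs).
        per_output = base_loss(pred, target)
        if weights is not None:
            # Hypothetical per-output reweighting; broadcasts over the batch.
            per_output = per_output * weights
        # Reduce to a scalar so .backward() can be called on the result.
        return per_output.mean()
    return loss_fn

# Usage: wrap an elementwise MSE over a 3-output regression head.
mse = torch.nn.MSELoss(reduction="none")
loss_fn = wrapped_loss(mse, weights=torch.tensor([1.0, 2.0, 0.5]))

pred = torch.randn(8, 3, requires_grad=True)
target = torch.randn(8, 3)
loss = loss_fn(pred, target)
loss.backward()  # gradient flows through the wrapper as usual
```

Because the wrapper only composes differentiable operations on top of the base loss, the gradient with respect to the predictions is available exactly as it would be for the unwrapped loss, which is the property the abstract highlights.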


Related research

03/29/2023 · Lipschitzness Effect of a Loss Function on Generalization Performance of Deep Neural Networks Trained by Adam and AdamW Optimizers
The generalization performance of deep neural networks with regard to th...

10/25/2019 · Components of Machine Learning: Binding Bits and FLOPS
Many machine learning problems and methods are combinations of three com...

01/03/2020 · The Real-World-Weight Cross-Entropy Loss Function: Modeling the Costs of Mislabeling
In this paper, we propose a new metric to measure goodness-of-fit for cl...

10/27/2020 · Nonlinear Monte Carlo Method for Imbalanced Data Learning
For basic machine learning problems, expected error is used to evaluate ...

09/08/2020 · Empirical Strategy for Stretching Probability Distribution in Neural-network-based Regression
In regression analysis under artificial neural networks, the prediction ...

01/04/2022 · AutoBalance: Optimized Loss Functions for Imbalanced Data
Imbalanced datasets are commonplace in modern machine learning problems....

05/14/2023 · ReSDF: Redistancing Implicit Surfaces using Neural Networks
This paper proposes a deep-learning-based method for recovering a signed...