Wrapped Loss Function for Regularizing Nonconforming Residual Distributions
Multi-output learning is essential in machine learning, yet it can suffer from nonconforming residual distributions, i.e., residual distributions across the outputs that do not conform to the expected distribution. In this paper we propose the "Wrapped Loss Function", which wraps the original loss function to alleviate this problem. The wrapped loss function behaves like the original loss function in that its gradient can be used for backpropagation. Empirical evaluations show that the wrapped loss function converges faster, achieves better accuracy, and improves performance on imbalanced data.
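The abstract does not specify the wrapping transform, so the following is only a hypothetical sketch of the general idea: a higher-order function that wraps a base loss with a monotone transform (here `np.log1p`, chosen purely as a placeholder), so that minimizing the wrapped loss still minimizes the original loss and its gradient remains usable for backpropagation.

```python
import numpy as np

def mse(pred, target):
    """A standard base loss: mean squared error."""
    return np.mean((pred - target) ** 2)

def wrap_loss(base_loss, transform=np.log1p):
    """Wrap a base loss with a monotone transform.

    Because the transform is monotonically increasing, the wrapped
    loss preserves the ordering of the base loss, and by the chain
    rule its gradient is the base gradient scaled by transform'(L).
    NOTE: `log1p` is an illustrative placeholder, not the paper's
    actual wrapping function.
    """
    def wrapped(pred, target):
        return transform(base_loss(pred, target))
    return wrapped

# Usage: the wrapped loss is zero exactly where the base loss is zero,
# and grows with the base loss.
wrapped_mse = wrap_loss(mse)
target = np.array([1.0, 2.0])
perfect = wrapped_mse(np.array([1.0, 2.0]), target)   # 0.0
imperfect = wrapped_mse(np.array([2.0, 2.0]), target) # > 0
```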