Improving evidential deep learning via multi-task learning

12/17/2021
by   Dongpin Oh, et al.

The evidential regression network (ENet) estimates a continuous target and its predictive uncertainty without costly Bayesian model averaging. However, the target may be predicted inaccurately because of the gradient shrinkage problem of the ENet's original loss function, the negative log marginal likelihood (NLL) loss. This paper aims to improve the prediction accuracy of the ENet while preserving its efficient uncertainty estimation by resolving the gradient shrinkage problem. To this end, a multi-task learning (MTL) framework, referred to as MT-ENet, is proposed. In the MTL, a Lipschitz-modified mean squared error (MSE) loss is defined as an auxiliary loss and added to the existing NLL loss. The Lipschitz-modified MSE loss is designed to mitigate gradient conflict with the NLL loss by dynamically adjusting its Lipschitz constant; as a result, it does not disturb the uncertainty estimation of the NLL loss. The MT-ENet improves the predictive accuracy of the ENet without losing uncertainty estimation capability on a synthetic dataset and on real-world benchmarks, including drug-target affinity (DTA) regression. Furthermore, the MT-ENet shows remarkable calibration and out-of-distribution detection capability on the DTA benchmarks.
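To make the two losses concrete, the sketch below combines the standard Normal-Inverse-Gamma NLL used in evidential regression (Amini et al., 2020) with an illustrative Lipschitz-capped MSE. The exact threshold `U` used by MT-ENet to bound the Lipschitz constant is a detail of the paper; the form used here (derived from the evidential parameters `nu`, `alpha`, `beta`) is an assumption for illustration, not the authors' exact formulation.

```python
import math

def nig_nll(y, gamma, nu, alpha, beta):
    """NLL of the Normal-Inverse-Gamma evidential output.
    gamma: predicted mean; nu > 0, alpha > 1, beta > 0: evidential parameters."""
    omega = 2.0 * beta * (1.0 + nu)
    return (0.5 * math.log(math.pi / nu)
            - alpha * math.log(omega)
            + (alpha + 0.5) * math.log(nu * (y - gamma) ** 2 + omega)
            + math.lgamma(alpha) - math.lgamma(alpha + 0.5))

def lipschitz_mse(y, gamma, nu, alpha, beta):
    """Illustrative Lipschitz-modified MSE (simplified, assumed threshold form):
    once the squared error exceeds an uncertainty-derived cap U, switch to a
    linear penalty that is continuous at the boundary, so the loss gradient
    is bounded and cannot overwhelm the NLL gradient."""
    U = min(beta * (1.0 + nu) / (alpha * nu), beta / nu)  # assumed cap
    err2 = (y - gamma) ** 2
    if err2 <= U:
        return err2
    # linear branch: equals U at the boundary, slope capped at 2*sqrt(U)
    return 2.0 * math.sqrt(U) * abs(y - gamma) - U

def mt_enet_loss(y, gamma, nu, alpha, beta):
    """Multi-task objective: NLL plus the Lipschitz-capped auxiliary MSE."""
    return nig_nll(y, gamma, nu, alpha, beta) + lipschitz_mse(y, gamma, nu, alpha, beta)
```

The key design choice is that the auxiliary loss becomes linear (constant-slope) for large errors, so its gradient magnitude is bounded while the quadratic behavior near the target is preserved.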


