Loss Functions for Neural Networks for Image Processing

11/28/2015
by   Hang Zhao, et al.

Neural networks are becoming central in several areas of computer vision and image processing, and different architectures have been proposed to solve specific problems. The impact of the loss layer of neural networks, however, has not received much attention in the context of image processing: the default and virtually only choice is the L2 loss. In this paper we bring attention to alternative choices. We study the performance of several losses, including perceptually-motivated losses, and propose a novel, differentiable error function. We show that the quality of the results improves significantly with better loss functions, even when the network architecture is left unchanged.
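To make the idea concrete, here is a minimal NumPy sketch of a perceptually-motivated alternative to plain L2: a weighted mix of an SSIM-based term and the L1 error. The SSIM here uses global image statistics for brevity (the full metric uses local Gaussian windows, and the paper works with a multi-scale variant); the weight `alpha` is illustrative, not a value taken from the paper.

```python
import numpy as np

def ssim_global(x, y, data_range=1.0, k1=0.01, k2=0.03):
    """Simplified single-scale SSIM computed from global image
    statistics. The standard metric averages SSIM over local
    Gaussian-weighted windows; this global version only sketches
    the structure of the formula."""
    c1 = (k1 * data_range) ** 2
    c2 = (k2 * data_range) ** 2
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))

def mixed_loss(pred, target, alpha=0.84):
    """Blend a structural term (1 - SSIM) with the mean absolute
    error. Both terms are differentiable, so the same form can be
    used as a training loss in an autodiff framework."""
    structural = 1.0 - ssim_global(pred, target)
    l1 = np.abs(pred - target).mean()
    return alpha * structural + (1.0 - alpha) * l1

# Identical images give (near-)zero loss; a shifted image does not.
img = np.linspace(0.0, 1.0, 64).reshape(8, 8)
print(mixed_loss(img, img))                          # ~0
print(mixed_loss(img, np.clip(img + 0.1, 0, 1)))     # > 0
```

In practice such a loss would be written with a framework's differentiable ops (e.g., windowed SSIM as a convolution) so gradients flow to the network; the NumPy version above only illustrates the arithmetic of combining a perceptual term with a pixel-wise one.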


