Unified Backpropagation for Multi-Objective Deep Learning

10/20/2017
by Arash Shahriari, et al.

A common practice in most deep convolutional neural architectures is to employ fully-connected layers followed by a Softmax activation to minimize the cross-entropy loss for classification. Recent studies show that replacing or augmenting the Softmax objective with the cost functions of support vector machines or linear discriminant analysis substantially improves the classification performance of hybrid neural networks. We propose a novel paradigm that links the optimization of several hybrid objectives through unified backpropagation. This greatly alleviates the burden of extensive boosting of independent objective functions or complex formulation of multi-objective gradients. The hybrid loss functions are linked by basic probability assignment from evidence theory. We conduct experiments on a variety of scenarios and standard datasets to show that our proposed unification approach delivers consistent improvements in the classification performance of deep convolutional neural networks.
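The abstract's core idea of linking several loss functions through basic probability assignment (BPA) can be illustrated with a minimal sketch. The function names, the use of per-objective accuracies as evidence, and the simple normalization below are assumptions for illustration, not the paper's exact formulation:

```python
# Hypothetical sketch: combine gradients of several hybrid objectives
# (e.g. Softmax cross-entropy, SVM hinge, LDA-style losses) into one
# update direction, weighted by a basic probability assignment derived
# from per-objective evidence such as validation accuracies.
import numpy as np

def bpa_weights(accuracies):
    """Normalize per-objective evidence into BPA masses that sum to 1."""
    masses = np.asarray(accuracies, dtype=float)
    return masses / masses.sum()

def unified_gradient(gradients, accuracies):
    """Weighted sum of per-objective gradients under the BPA masses."""
    w = bpa_weights(accuracies)
    grads = [np.asarray(g, dtype=float) for g in gradients]
    return sum(wi * g for wi, g in zip(w, grads))
```

For example, with two objectives of equal evidence, the unified update is simply the average of the two gradients; stronger evidence for one objective shifts the combined direction toward it.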


