Tensor Normalization and Full Distribution Training

09/06/2021
by Wolfgang Fuhl, et al.

In this work, we introduce pixel-wise tensor normalization, which is inserted after rectified linear units and, together with batch normalization, yields a significant improvement in the accuracy of modern deep neural networks. In addition, this work addresses the robustness of networks. We show that the factorized superposition of images from the training set, together with the reformulation of the multi-class problem as a multi-label problem, yields significantly more robust networks. The reformulation and the adjustment of the multi-class log loss also improve the results compared to an overlay with only a single class as label. https://atreus.informatik.uni-tuebingen.de/seafile/d/8e2ab8c3fdd444e1a135/?p=
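The abstract's two ideas can be illustrated with a minimal sketch. The exact formulation is in the paper; here we assume "pixel-wise tensor normalization" normalizes each spatial position across the channel dimension, and that the "factorized superposition" blends two training images and assigns the mix a two-hot multi-label target. All function names and the blend parameter `alpha` are hypothetical, not taken from the paper.

```python
import numpy as np

def pixelwise_tensor_normalization(x, eps=1e-5):
    """Sketch: normalize each spatial position of a (N, C, H, W)
    activation tensor across its channels (assumed interpretation)."""
    mean = x.mean(axis=1, keepdims=True)   # per-pixel mean over channels
    std = x.std(axis=1, keepdims=True)     # per-pixel std over channels
    return (x - mean) / (std + eps)        # eps guards against division by zero

def superpose_multilabel(img_a, img_b, label_a, label_b, num_classes, alpha=0.5):
    """Sketch: superimpose two training images and return a multi-label
    target in which both source classes are active."""
    mixed = alpha * img_a + (1.0 - alpha) * img_b
    target = np.zeros(num_classes)
    target[label_a] = 1.0
    target[label_b] = 1.0
    return mixed, target
```

In a training loop, the normalization would sit directly after each ReLU (alongside batch normalization, per the abstract), and the superposed images with their multi-label targets would be fed to a sigmoid-based multi-label log loss instead of a softmax cross-entropy.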


Related research

09/13/2016: Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach
We present a theoretically grounded approach to train deep neural networ...

06/07/2021: Sum of Ranked Range Loss for Supervised Learning
In forming learning objectives, one oftentimes needs to aggregate a set ...

07/10/2019: Deep Multi Label Classification in Affine Subspaces
Multi-label classification (MLC) problems are becoming increasingly popu...

05/28/2023: T2FNorm: Extremely Simple Scaled Train-time Feature Normalization for OOD Detection
Neural networks are notorious for being overconfident predictors, posing...

09/12/2021: CropDefender: deep watermark which is more convenient to train and more robust against cropping
Digital image watermarking, which is a technique for invisibly embedding...

10/07/2013: Least Squares Revisited: Scalable Approaches for Multi-class Prediction
This work provides simple algorithms for multi-class (and multi-label) p...

10/16/2015: Normalization of Relative and Incomplete Temporal Expressions in Clinical Narratives
We analyze the RI-TIMEXes in temporally annotated corpora and propose tw...
