Enhancing Classifier Conservativeness and Robustness by Polynomiality

03/23/2022
by   Ziqi Wang, et al.

We illustrate the detrimental effects, such as overconfident decisions, that exponential behavior can have in methods like classical LDA and logistic regression. We then show how polynomiality can remedy the situation. Among other benefits, this purposefully leads to random-level performance in the tails, away from the bulk of the training data. A directly related, simple, yet important technical novelty we subsequently present is softRmax: a reasoned alternative to the standard softmax function employed in contemporary (deep) neural networks. It is derived by linking the standard softmax to Gaussian class-conditional models, as employed in LDA, and replacing those with a polynomial alternative. We show that two aspects of softRmax, conservativeness and inherent gradient regularization, lead to robustness against adversarial attacks without gradient obfuscation.
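To make the contrast concrete, below is a minimal sketch of the idea the abstract describes: the standard softmax recovered as the posterior of Gaussian class-conditional models (shared identity covariance, equal priors), next to a polynomial class-conditional alternative. The function names `gaussian_posterior` and `polynomial_posterior`, the inverse-polynomial density `1 / (1 + ||x - mu||^2)^k`, and the exponent `k` are illustrative assumptions for this sketch, not necessarily the paper's exact softRmax formulation; the point is only that the polynomial posterior decays to uniform (random-level) predictions far from the training data, whereas the exponential one saturates.

```python
import numpy as np

def gaussian_posterior(x, means):
    """Posterior under Gaussian class-conditionals with identity covariance and
    equal priors; this reduces to a standard softmax over negative squared
    distances to the class means."""
    d2 = np.array([np.sum((x - m) ** 2) for m in means])
    logits = -0.5 * d2
    logits -= logits.max()            # subtract max for numerical stability
    p = np.exp(logits)
    return p / p.sum()

def polynomial_posterior(x, means, k=1.0):
    """Illustrative polynomial alternative (an assumption for this sketch):
    class-conditionals proportional to 1 / (1 + ||x - mu||^2)^k. Far from all
    means the densities become proportional to each other, so the posterior
    tends to the uniform, i.e. random-level, distribution."""
    d2 = np.array([np.sum((x - m) ** 2) for m in means])
    p = (1.0 + d2) ** (-k)
    return p / p.sum()

# Two class means; one query near the data, one far in the tails.
means = [np.array([0.0, 0.0]), np.array([3.0, 0.0])]
for x in [np.array([0.5, 0.0]), np.array([50.0, 0.0])]:
    print(x, gaussian_posterior(x, means), polynomial_posterior(x, means))

# Near the data both posteriors are informative. Far away (x = [50, 0]) the
# exponential/softmax posterior saturates to roughly [0, 1] (overconfident),
# while the polynomial posterior stays close to [0.5, 0.5] (conservative).
```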

