Logitron: Perceptron-augmented classification model based on an extended logistic loss function

04/05/2019
by Hyenkyun Woo, et al.

Classification is among the most important tasks in data analysis. However, because the zero-one loss of the classification model is inherently non-convex and non-smooth, various convex surrogate loss functions, such as the hinge loss, squared hinge loss, logistic loss, and exponential loss, have been introduced. These loss functions have been used for decades in diverse classification models: SVM (support vector machine) with the hinge loss, logistic regression with the logistic loss, AdaBoost with the exponential loss, and so on. In this work, we present a Perceptron-augmented convex classification framework, Logitron. Its loss function smoothly stitches the extended logistic loss to the well-known Perceptron loss. The extended logistic loss is a parameterized function built on the extended logarithmic and extended exponential functions. The main advantage of the proposed Logitron classification model is that it reveals a connection between SVM and logistic regression via polynomial parameterization of the loss function. In more detail, depending on the choice of parameters, we obtain the Hinge-Logitron, which has a generalized k-th order hinge loss with an additional k-th root stabilization function, and the Logistic-Logitron, which has a logistic-like loss function for relatively large |k|. Interestingly, even for k=-1, the Hinge-Logitron satisfies the classification-calibration condition and shows reasonable classification performance at low computational cost. Numerical experiments in the linear-classifier framework demonstrate that the Hinge-Logitron with k=4 (the fourth-order SVM with the fourth-root stabilization function) outperforms logistic regression, SVM, and the other Logitron models in terms of classification accuracy.
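To make the family of surrogate losses concrete, the sketch below compares the classical hinge and logistic losses as functions of the margin m = y·f(x), alongside an extended logistic loss built from a Tsallis-style extended logarithm ln_k(x) = (x^(1-k) - 1)/(1-k), which recovers the ordinary logarithm at k=1. This `ext_logistic` form is an illustrative assumption in the spirit of the abstract, not the paper's exact parameterization, and the smooth stitching with the Perceptron loss is omitted:

```python
import numpy as np

def hinge(m):
    """Classical hinge loss used by SVM: max(0, 1 - m)."""
    return np.maximum(0.0, 1.0 - m)

def logistic(m):
    """Logistic loss used by logistic regression: log(1 + exp(-m))."""
    return np.log1p(np.exp(-m))

def ext_log(x, k):
    """Tsallis-style extended logarithm: (x^(1-k) - 1)/(1-k).

    Recovers the natural logarithm in the limit k -> 1.
    """
    if k == 1:
        return np.log(x)
    return (x ** (1.0 - k) - 1.0) / (1.0 - k)

def ext_logistic(m, k):
    """Illustrative extended logistic loss: -ln_k(sigmoid(m)).

    NOTE: a hypothetical parameterized form for illustration only;
    at k=1 it reduces exactly to the standard logistic loss.
    """
    sigmoid = 1.0 / (1.0 + np.exp(-m))
    return -ext_log(sigmoid, k)
```

Varying k sweeps the loss between logistic-like and more polynomial-shaped behavior, which is the intuition behind the hinge-to-logistic connection described above.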

