LBL: Logarithmic Barrier Loss Function for One-class Classification

07/20/2023
by Tianlei Wang, et al.

One-class classification (OCC) aims to train a classifier using only target-class data and has attracted considerable attention for its strong applicability to real-world problems. Although many advances have been made in OCC, effective OCC loss functions for deep learning are still lacking. In this paper, a novel logarithmic-barrier-based OCC loss (LBL) is first proposed by smoothly approximating the OCC objective; it assigns large gradients to margin samples and thus yields a more compact hypersphere. However, the optimization of LBL can be unstable, especially when samples lie on the boundary, where the loss becomes infinite. To address this issue, a unilateral relaxation Sigmoid function is introduced into LBL, producing a novel OCC loss named LBLSig. LBLSig can be viewed as a fusion of the mean squared error (MSE) and the cross entropy (CE), and its optimization is smoother owing to the unilateral relaxation Sigmoid function. The effectiveness of the proposed LBL and LBLSig is demonstrated experimentally in comparisons with several state-of-the-art OCC algorithms on different network structures. The source code is available at https://github.com/ML-HDU/LBL_LBLSig.
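To make the two losses concrete, here is a minimal NumPy sketch of one plausible reading of the abstract. The function names, the form of the barrier, and the Sigmoid relaxation are assumptions for illustration, not the paper's exact formulation: `lbl_loss` penalizes squared feature-to-center distances through a log barrier that diverges at the hypersphere boundary (the instability the abstract mentions), while `lblsig_loss` replaces the hard barrier with a smooth Sigmoid-based term that stays finite on and beyond the boundary.

```python
import numpy as np

def lbl_loss(features, center, radius, eps=1e-8):
    """Hypothetical log-barrier OCC loss sketch.

    Penalizes -log(R^2 - d^2), where d is the distance of each
    feature vector to the hypersphere center. The gradient grows
    without bound as d -> R, pushing margin samples inward hard,
    but the loss itself diverges at the boundary (d = R).
    """
    d2 = ((features - center) ** 2).sum(axis=1)
    # eps-clamp only to keep the log defined; this divergence is
    # exactly what motivates the relaxed variant below
    return -np.log(np.clip(radius ** 2 - d2, eps, None)).mean()

def lblsig_loss(features, center, radius):
    """Hypothetical Sigmoid-relaxed variant (LBLSig-style) sketch.

    -log(sigmoid(R^2 - d^2)) = softplus(d^2 - R^2): roughly linear
    in d^2 outside the sphere (MSE-like) and log-shaped inside
    (CE-like), and finite everywhere, so optimization is smoother.
    """
    d2 = ((features - center) ** 2).sum(axis=1)
    margin = radius ** 2 - d2
    return np.log1p(np.exp(-margin)).mean()
```

A quick sanity check on this sketch: a sample near the center should incur a much smaller loss than one near the boundary under both variants, and only the relaxed variant remains finite exactly on the boundary.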

