A Generalized Weighted Loss for SVC and MLP

02/22/2023
by Filippo Portera, et al.

In a regression task, standard algorithms typically employ a loss in which each error is simply the absolute difference between the true value and the prediction. Here we introduce several error-weighting schemes that generalize this established procedure. We study both a binary classification model based on Support Vector Classification and a regression network based on a Multi-layer Perceptron. The results show that the error is never worse than with the standard procedure, and in several cases it is better.
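As a rough illustration of the idea (a minimal sketch, not the authors' implementation): a per-sample weighted variant of the absolute-error loss can be written so that uniform weights recover the standard loss. The function name weighted_l1_loss and the weight vector w are hypothetical; w is assumed to be produced by one of the proposed weighting schemes.

    import torch

    def weighted_l1_loss(y_pred, y_true, w):
        # Per-sample absolute error |y_i - f(x_i)|, scaled by a weight w_i.
        # Uniform weights (w_i = 1) recover the standard mean absolute error.
        per_sample = torch.abs(y_true - y_pred)
        return (w * per_sample).mean()

    # Example: uniform weights reproduce the standard (unweighted) loss.
    y_pred = torch.tensor([2.5, 0.0, 2.0])
    y_true = torch.tensor([3.0, -0.5, 2.0])
    w = torch.ones_like(y_true)                 # hypothetical weighting-scheme output
    loss = weighted_l1_loss(y_pred, y_true, w)  # equals the plain L1 loss here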

Related research

03/27/2020
Random Machines Regression Approach: an ensemble support vector regression model with free kernel choice
Machine learning techniques always aim to reduce the generalized predict...

08/23/2022
pystacked: Stacking generalization and machine learning in Stata
pystacked implements stacked generalization (Wolpert, 1992) for regressi...

11/11/2020
Linear Dilation-Erosion Perceptron for Binary Classification
In this work, we briefly revise the reduced dilation-erosion perceptron ...

05/14/2015
Pinball Loss Minimization for One-bit Compressive Sensing
The one-bit quantization can be implemented by one single comparator, wh...

09/14/2016
Very Simple Classifier: a Concept Binary Classifier to Investigate Features Based on Subsampling and Locality
We propose Very Simple Classifier (VSC) a novel method designed to incor...

01/23/2022
A Generalized Weighted Optimization Method for Computational Learning and Inversion
The generalization capacity of various machine learning models exhibits ...

12/04/2022
Label Encoding for Regression Networks
Deep neural networks are used for a wide range of regression problems. H...
