Numerical analysis of least squares and perceptron learning for classification problems

04/02/2020
by L. Beilina, et al.

This work presents a study of regularized and non-regularized versions of perceptron learning and least squares algorithms for classification problems. Fréchet derivatives for the regularized least squares and perceptron learning algorithms are derived. Different techniques for choosing the regularization parameter are discussed, and the decision boundaries obtained by the non-regularized algorithms on simulated and experimental data sets are analyzed.
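
As a rough illustration of the two methods compared in the abstract, the sketch below fits a Tikhonov-regularized least squares classifier, minimizing ||Aw - y||^2 + gamma*||w||^2 (whose Fréchet derivative 2*A^T(Aw - y) + 2*gamma*w = 0 yields the regularized normal equations), alongside a classical perceptron on a toy two-class data set. The toy data, the value of the regularization parameter gamma, and all variable names are assumptions chosen for demonstration; they are not taken from the paper.

    # Illustrative sketch (not the authors' code): regularized least squares
    # vs. perceptron learning on a toy binary classification problem.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy 2-D data: two Gaussian clusters with labels in {-1, +1} (assumption).
    X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)),
                   rng.normal(+1.0, 0.5, (50, 2))])
    y = np.hstack([-np.ones(50), np.ones(50)])
    A = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column

    # Regularized least squares: setting the Frechet derivative
    # 2 A^T (A w - y) + 2 gamma w to zero gives the normal equations below.
    gamma = 0.1  # regularization parameter (assumed value)
    w_ls = np.linalg.solve(A.T @ A + gamma * np.eye(A.shape[1]), A.T @ y)

    # Perceptron learning: update w on every misclassified sample.
    w_p = np.zeros(A.shape[1])
    for _ in range(100):                   # epochs
        for a, t in zip(A, y):
            if t * (a @ w_p) <= 0:         # misclassified
                w_p += t * a               # perceptron update

    print("least squares weights:", w_ls)
    print("perceptron weights:   ", w_p)

Both weight vectors define a linear decision boundary w[0]*x1 + w[1]*x2 + w[2] = 0; on separable data like this toy example the two boundaries are typically close, while the regularization term mainly controls the size of the least squares weights.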

Related research

01/10/2023  First-projection-then-regularization hybrid algorithms for large-scale general-form regularization
11/11/2020  Linear Dilation-Erosion Perceptron for Binary Classification
02/19/2017  Harmonic Grammar, Optimality Theory, and Syntax Learnability: An Empirical Exploration of Czech Word Order
06/01/2020  Analysis of Least Squares Regularized Regression in Reproducing Kernel Krein Spaces
06/19/2023  Multigrid preconditioning for regularized least-squares problems
12/24/2019  Broad Learning System Based on Maximum Correntropy Criterion
02/28/2023  Safe peeling for l0-regularized least-squares with supplementary material
