Informative regularization for a multi-layer perceptron RR Lyrae classifier under data shift

03/12/2023
by Francisco Pérez-Galarce, et al.

In recent decades, machine learning has provided valuable models and algorithms for processing and extracting knowledge from time-series surveys, and a variety of classifiers have achieved excellent performance. Nevertheless, few papers have tackled the data shift problem in labeled training sets, which arises when the data distributions of the training and testing sets do not match. This mismatch can degrade predictive performance on unseen data. Consequently, we propose a scalable and easily adaptable approach, based on an informative regularizer and an ad-hoc training procedure, to mitigate the shift problem while training a multi-layer perceptron for RR Lyrae classification. We collect ranges of characteristic features to construct a symbolic representation of prior knowledge, which is used to model the informative regularizer. In parallel, we design a two-step back-propagation algorithm to integrate this knowledge into the neural network: in each epoch, one step minimizes the classification error and the other enforces the regularization. The algorithm defines a subset of parameters (a mask) for each loss function, which mitigates the forgetting effect arising from the trade-off between the two losses (learning from data versus learning expert knowledge) during training. Experiments on recently proposed shifted benchmark sets for RR Lyrae stars show that our more reliable classifier outperforms baseline models by up to 3%. Our method provides a new path for incorporating knowledge from characteristic features into artificial neural networks in order to manage the underlying data shift problem.
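The core of the method is the interplay between a data-fitting loss and a prior-knowledge regularizer, each tied to its own parameter mask. Below is a minimal PyTorch-style sketch of that idea; it assumes the setting described in the abstract but does not reproduce the paper's exact regularizer or mask construction. The names `prior_penalty`, `feature_ranges`, and the per-loss masks are illustrative placeholders, and the two steps are applied per batch here rather than per epoch for brevity.

```python
# Sketch (not the authors' code) of a two-step update with per-loss parameter
# masks: one step fits the data, the other enforces a prior-knowledge
# regularizer on its own subset of weights.
import torch
import torch.nn as nn


class MLP(nn.Module):
    def __init__(self, n_features, n_classes, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):
        return self.net(x)


def prior_penalty(model, feature_ranges):
    # Illustrative placeholder for the informative regularizer: penalize
    # first-layer weights that leave expert-provided per-feature ranges.
    w = model.net[0].weight            # shape (hidden, n_features)
    lo, hi = feature_ranges            # tensors of shape (n_features,)
    return (torch.relu(lo - w) + torch.relu(w - hi)).mean()


def train_epoch(model, loader, opt, feature_ranges, masks, lam=0.1):
    data_mask, prior_mask = masks      # dicts: parameter name -> bool
    ce = nn.CrossEntropyLoss()
    for x, y in loader:
        # Step 1: minimize classification error on the "data" parameters only.
        opt.zero_grad()
        ce(model(x), y).backward()
        for name, p in model.named_parameters():
            if not data_mask[name]:
                p.grad = None          # parameters with no grad are skipped
        opt.step()

        # Step 2: enforce the informative regularizer on the "prior" parameters.
        opt.zero_grad()
        (lam * prior_penalty(model, feature_ranges)).backward()
        for name, p in model.named_parameters():
            if not prior_mask[name]:
                p.grad = None
        opt.step()


# Example mask choice: the data loss updates every layer, while the prior
# only adjusts the first (feature-facing) layer.
# model = MLP(n_features=8, n_classes=2)
# masks = ({n: True for n, _ in model.named_parameters()},
#          {n: n.startswith("net.0") for n, _ in model.named_parameters()})
```

Keeping a separate mask per loss means each update only touches the parameters assigned to that loss, which is one simple way to limit the forgetting effect the abstract describes when the two objectives pull in different directions.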

