On the Validation of Gibbs Algorithms: Training Datasets, Test Datasets and their Aggregation

06/21/2023
by Samir M. Perlaza et al.

The dependence of the Gibbs algorithm (GA) on its training data is analytically characterized. By adopting the expected empirical risk as the performance metric, the sensitivity of the GA is obtained in closed form, where sensitivity is defined as the difference in performance between the GA and an arbitrary alternative algorithm. This characterization leads to explicit expressions involving the training errors and test errors of GAs trained with different datasets. Using these tools, dataset aggregation is studied and several figures of merit for evaluating the generalization capabilities of GAs are introduced. For particular dataset sizes and GA parameters, a connection between Jeffrey's divergence, training errors and test errors is established.
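In this line of work, the GA usually denotes the randomized learning rule whose output is the Gibbs measure over models, i.e., a reference measure tilted by the exponentiated negative empirical risk. The sketch below is only an illustration of that setup under stated assumptions, not the paper's construction: the finite model set, the squared-error risk, the parameter lam, and the synthetic datasets data1 and data2 are all introduced for the example. It builds the Gibbs measure for each dataset and for their aggregation, and reports the corresponding expected empirical risks, which play the roles of training and test errors.

```python
import numpy as np

# Minimal sketch of a Gibbs algorithm over a finite model set, assuming the
# usual Gibbs-measure form P(theta) proportional to Q(theta) * exp(-lam * risk).
# The model class, loss, lam, and synthetic data are illustrative assumptions.

rng = np.random.default_rng(0)

models = np.linspace(-2.0, 2.0, 41)               # candidate constant predictors
prior = np.full(models.shape, 1.0 / models.size)  # reference measure Q

def empirical_risk(theta, data):
    """Average squared error of the constant predictor theta on data."""
    return np.mean((data - theta) ** 2)

def gibbs_measure(data, lam):
    """Gibbs measure over the finite model set (lam > 0 assumed)."""
    risks = np.array([empirical_risk(t, data) for t in models])
    logw = np.log(prior) - lam * risks
    logw -= logw.max()                            # numerical stability
    weights = np.exp(logw)
    return weights / weights.sum()

def expected_empirical_risk(measure, data):
    """Expected empirical risk of the GA's output measure on a dataset."""
    risks = np.array([empirical_risk(t, data) for t in models])
    return float(measure @ risks)

# Two training datasets and their aggregation.
data1 = rng.normal(0.5, 1.0, size=50)
data2 = rng.normal(-0.3, 1.0, size=80)
aggregated = np.concatenate([data1, data2])

lam = 5.0
p1 = gibbs_measure(data1, lam)
p2 = gibbs_measure(data2, lam)
pa = gibbs_measure(aggregated, lam)

# Training error of each GA on its own dataset, test error on the other one.
print("GA(data1): train", expected_empirical_risk(p1, data1),
      "test", expected_empirical_risk(p1, data2))
print("GA(data2): train", expected_empirical_risk(p2, data2),
      "test", expected_empirical_risk(p2, data1))
print("GA(aggregated): train", expected_empirical_risk(pa, aggregated))
```

Comparing the training error of the GA trained on the aggregated dataset with the training and test errors of the GAs trained on the individual datasets mirrors, at the level of this toy example, the kind of comparison the paper makes analytically.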

