Understanding the effect of hyperparameter optimization on machine learning models for structure design problems

07/04/2020
by Xianping Du, et al.

To relieve the computational cost of design evaluations that rely on expensive finite element simulations, surrogate models have been widely applied in computer-aided engineering design. Machine learning algorithms (MLAs) have been implemented as surrogate models because they can learn the complex interrelations between design variables and responses from large datasets. Typically, an MLA regression model contains model parameters and hyperparameters. The model parameters are obtained by fitting the training data. The hyperparameters, which govern the model structure and the training process, are assigned by users before training. There is a lack of systematic studies on the effect of hyperparameters on the accuracy and robustness of the surrogate model. In this work, we propose a hyperparameter optimization (HOpt) framework to deepen our understanding of this effect. Four frequently used MLAs, namely Gaussian Process Regression (GPR), Support Vector Machine (SVM), Random Forest Regression (RFR), and Artificial Neural Network (ANN), are tested on four benchmark examples. For each MLA model, the accuracy and robustness before and after HOpt are compared. The results show that HOpt generally improves the performance of the MLA models. However, HOpt yields only marginal improvements in accuracy and robustness for complex problems, which are characterized by high-dimensional, mixed-variable design spaces. HOpt is therefore recommended for design problems of intermediate complexity. We also investigated the additional computational cost incurred by HOpt. The training cost is closely related to the MLA architecture: after HOpt, the training cost of ANN and RFR increases more than that of GPR and SVM. In summary, this study informs the selection of HOpt methods for different types of design problems based on their complexity.
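The workflow the abstract describes, fitting a cheap surrogate to expensive simulation samples, then searching the hyperparameter space against a validation set, can be illustrated with a minimal sketch. This is not the paper's framework: the benchmark function, the polynomial ridge surrogate, the default hyperparameters, and the random-search budget below are all illustrative assumptions standing in for the finite element response and the four MLAs studied.

```python
import numpy as np

rng = np.random.default_rng(0)

def benchmark(x):
    # Cheap analytic stand-in for an expensive finite element response.
    return np.sin(3.0 * x) + 0.5 * x

# Sample the "simulation" and split into fit / validation / test sets.
x = rng.uniform(-2.0, 2.0, 80)
y = benchmark(x)
x_fit, y_fit = x[:50], y[:50]
x_val, y_val = x[50:65], y[50:65]
x_test, y_test = x[65:], y[65:]

def train(degree, alpha):
    # Polynomial ridge surrogate; degree and alpha are its hyperparameters.
    X = np.vander(x_fit, degree + 1)
    return np.linalg.solve(X.T @ X + alpha * np.eye(degree + 1), X.T @ y_fit)

def rmse(w, x_eval, y_eval):
    # Root-mean-square error of the surrogate on a held-out set.
    pred = np.vander(x_eval, len(w)) @ w
    return float(np.sqrt(np.mean((pred - y_eval) ** 2)))

# Baseline: user-assigned default hyperparameters (before HOpt).
default = (2, 1.0)
candidates = [default] + [
    (int(rng.integers(1, 10)), 10.0 ** rng.uniform(-6, 1)) for _ in range(50)
]

# HOpt via random search: keep the candidate with the lowest validation RMSE.
best = min(candidates, key=lambda hp: rmse(train(*hp), x_val, y_val))

print("default test RMSE:", rmse(train(*default), x_test, y_test))
print("tuned   test RMSE:", rmse(train(*best), x_test, y_test))
```

Because the default setting is included among the candidates, the tuned validation error can never exceed the baseline; the extra cost of HOpt is simply the 50 additional surrogate trainings, mirroring the training-cost trade-off discussed above.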


research
02/14/2018

Stealing Hyperparameters in Machine Learning

Hyperparameters are critical in machine learning, as different hyperpara...
research
11/12/2015

Prediction of the Yield of Enzymatic Synthesis of Betulinic Acid Ester Using Artificial Neural Networks and Support Vector Machine

3β-O-phthalic ester of betulinic acid is of great importance in anti...
research
01/07/2021

An automated machine learning-genetic algorithm (AutoML-GA) approach for efficient simulation-driven engine design optimization

In recent years, the use of machine learning techniques as surrogate mod...
research
11/01/2018

Efficient Online Hyperparameter Optimization for Kernel Ridge Regression with Applications to Traffic Time Series Prediction

Computational efficiency is an important consideration for deploying mac...
research
01/24/2020

PairNets: Novel Fast Shallow Artificial Neural Networks on Partitioned Subspaces

Traditionally, an artificial neural network (ANN) is trained slowly by a...
research
11/30/2022

Learning non-stationary and discontinuous functions using clustering, classification and Gaussian process modelling

Surrogate models have shown to be an extremely efficient aid in solving ...
research
06/01/2020

Surrogate sea ice model enables efficient tuning

Predicting changes in sea ice cover is critical for shipping, ecosystem ...
