Understanding the effect of hyperparameter optimization on machine learning models for structure design problems

by Xianping Du, et al.

To relieve the computational cost of design evaluations using expensive finite element simulations, surrogate models have been widely applied in computer-aided engineering design. Machine learning algorithms (MLAs) have been implemented as surrogate models owing to their capability of learning the complex interrelations between design variables and responses from large datasets. Typically, an MLA regression model contains model parameters and hyperparameters. The model parameters are obtained by fitting the training data, while the hyperparameters, which govern the model structure and the training process, are assigned by the user before training. There is a lack of systematic studies on the effect of hyperparameters on the accuracy and robustness of the surrogate model. In this work, we establish a hyperparameter optimization (HOpt) framework to deepen our understanding of this effect. Four frequently used MLAs, namely Gaussian Process Regression (GPR), Support Vector Machine (SVM), Random Forest Regression (RFR), and Artificial Neural Network (ANN), are tested on four benchmark examples. For each MLA model, the accuracy and robustness before and after HOpt are compared. The results show that HOpt generally improves the performance of the MLA models. However, HOpt yields little improvement in accuracy and robustness for the most complex problems, which feature high-dimensional, mixed-variable design spaces. HOpt is therefore recommended for design problems of intermediate complexity. We also investigated the additional computational cost incurred by HOpt. The training cost is closely related to the MLA architecture: after HOpt, the training cost of ANN and RFR increases more than that of GPR and SVM. In summary, this study informs the selection of HOpt methods for different types of design problems based on their complexity.
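A minimal sketch of the study's core comparison, assuming scikit-learn as the toolkit: fit one of the four MLAs (here, SVM regression) with default hyperparameters, then with hyperparameters tuned by a simple cross-validated grid search standing in for the paper's HOpt framework. The analytic test function and the grid below are illustrative stand-ins, not the paper's actual benchmarks or search spaces.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV, cross_val_score

# Synthetic stand-in for data from an expensive finite element simulation:
# 3 design variables -> 1 structural response, with light noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=200)

# Baseline surrogate: SVR with default hyperparameters.
baseline = cross_val_score(SVR(), X, y, cv=5, scoring="r2").mean()

# HOpt step: search over regularization strength C and kernel width gamma.
grid = {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1, 10]}
search = GridSearchCV(SVR(), grid, cv=5, scoring="r2").fit(X, y)
tuned = search.best_score_

print(f"default R^2: {baseline:.3f}, tuned R^2: {tuned:.3f}")
```

The same before/after comparison extends to the other three MLAs by swapping in `GaussianProcessRegressor`, `RandomForestRegressor`, or `MLPRegressor` with their own hyperparameter grids; wall-clock time of the `fit` calls gives the training-cost comparison the abstract discusses.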






