A comparative study on different neural network architectures to model inelasticity

03/06/2023
by Max Rosenkranz, et al.

The mathematical formulation of constitutive models to describe the path-dependent, i.e., inelastic, behavior of materials is a challenging task and has been a focus of mechanics research for several decades. There have been increased efforts to facilitate or automate this task through data-driven techniques, driven in particular by the recent revival of neural networks (NNs) in computational mechanics. However, it seems questionable to simply ignore the fundamental findings of constitutive modeling gained over the last decades of research when developing NN-based approaches. Herein, we present a comparative study of different feedforward and recurrent neural network architectures to model inelasticity. Within this study, we divide the models into three basic classes: black box NNs, NNs enforcing physics in a weak form, and NNs enforcing physics in a strong form. The first class of networks learns constitutive relations from data while completely ignoring the underlying physics, whereas the latter two are constructed such that they account for fundamental physics; special attention is paid to the second law of thermodynamics in this work. Conventional linear and nonlinear viscoelastic as well as elastoplastic models are used for training data generation and, later on, as references. After training with random walk time sequences containing information on stress, strain, and, for some models, internal variables, the NN-based models are compared to the reference solutions, whereby both interpolation and extrapolation are considered. Besides the quality of the stress prediction, the related free energy and dissipation rate are analyzed to evaluate the models. Overall, the presented study enables a clear assessment of the advantages and disadvantages of different NN architectures for modeling inelasticity and gives guidance on how to train and apply these models.
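To make the distinction between the model classes concrete, the following minimal PyTorch sketch contrasts a black box recurrent model (class one) with a strong-form, free-energy-based model (class three). It is not the authors' implementation; the class names, layer sizes, and the single scalar internal variable are illustrative assumptions, and only the general idea (stress from a learned free energy, dissipation rate as a thermodynamic check) follows the description above.

import torch
import torch.nn as nn


class BlackBoxRNN(nn.Module):
    """Class 1 (hypothetical sketch): maps a strain history directly to a
    stress history; the underlying physics is ignored entirely."""
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, eps_seq):                    # eps_seq: (batch, time, 1)
        h, _ = self.rnn(eps_seq)
        return self.out(h)                         # predicted stress sequence


class FreeEnergyNN(nn.Module):
    """Class 3 (hypothetical sketch): the network parametrizes a free energy
    psi(eps, alpha); stress and the thermodynamic driving force follow by
    automatic differentiation, so the dissipation rate can be monitored."""
    def __init__(self, n_internal=1, hidden=32):
        super().__init__()
        self.psi = nn.Sequential(
            nn.Linear(1 + n_internal, hidden), nn.Softplus(),
            nn.Linear(hidden, 1),
        )

    def stress_and_driving_force(self, eps, alpha):
        x = torch.cat([eps, alpha], dim=-1).requires_grad_(True)
        psi = self.psi(x).sum()
        dpsi = torch.autograd.grad(psi, x, create_graph=True)[0]
        sigma = dpsi[..., :1]                      # sigma = d(psi)/d(eps)
        tau = -dpsi[..., 1:]                       # driving force on alpha
        return sigma, tau                          # dissipation rate: D = tau * alpha_dot >= 0


if __name__ == "__main__":
    # Random walk strain path, loosely mimicking the training sequences described above.
    eps_path = torch.cumsum(0.01 * torch.randn(1, 100, 1), dim=1)
    print(BlackBoxRNN()(eps_path).shape)           # torch.Size([1, 100, 1])
    sigma, tau = FreeEnergyNN().stress_and_driving_force(eps_path[:, -1], torch.zeros(1, 1))
    print(sigma.shape, tau.shape)

Note that this sketch only exposes the dissipation rate; actually enforcing the second law in a strong form additionally requires an evolution law for the internal variable that guarantees tau * alpha_dot >= 0, e.g., one derived from a convex dissipation potential.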


Related research

Learning Constitutive Relations using Symmetric Positive Definite Neural Networks (04/01/2020)
Neural Network and Particle Filtering: A Hybrid Framework for Crack Propagation Prediction (04/27/2020)
Thermodynamically Consistent Machine-Learned Internal State Variable Approach for Data-Driven Modeling of Path-Dependent Materials (05/01/2022)
Observing how deep neural networks understand physics through the energy spectrum of one-dimensional quantum mechanics (01/18/2022)
Explainable artificial intelligence for mechanics: physics-informing neural networks for constitutive models (04/20/2021)
Neural network-based multiscale modeling of finite strain magneto-elasticity with relaxed convexity criteria (08/30/2023)
Learning to Represent Mechanics via Long-term Extrapolation and Interpolation (06/06/2017)
