HiDeNN-PGD: reduced-order hierarchical deep learning neural networks

05/13/2021 ∙ by Lei Zhang, et al.

This paper presents a proper generalized decomposition (PGD) based reduced-order model of hierarchical deep-learning neural networks (HiDeNN). The proposed HiDeNN-PGD method retains the advantages of both the HiDeNN and PGD methods. Its automatic mesh adaptivity makes HiDeNN-PGD more accurate than the finite element method (FEM) and conventional PGD, while using a fraction of the FEM degrees of freedom. The accuracy and convergence of the method are studied theoretically and numerically, with comparisons to several alternatives, including FEM, PGD, HiDeNN, and deep neural networks. In addition, we show theoretically that PGD converges to FEM as the number of modes increases, and that the PGD error is a direct sum of the FEM error and the mode-reduction error. The proposed HiDeNN-PGD achieves high accuracy with orders of magnitude fewer degrees of freedom, showing strong potential for fast, highly accurate computations on large-scale engineering problems.
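The separated representation behind PGD and the error decomposition stated above can be sketched as follows. The notation is illustrative rather than taken from the paper, and the squared-norm (Pythagorean) form of the error split assumes the two error components are orthogonal in the chosen norm, which is what a "direct sum" decomposition suggests:

```latex
% PGD separated representation of a 2D field u(x, y):
% a sum of M products of one-dimensional mode functions.
u_{\mathrm{PGD}}(x, y) = \sum_{m=1}^{M} X_m(x)\, Y_m(y)

% Claimed error decomposition: as M grows, u_PGD converges to the
% FEM solution on the same mesh, and the total error splits into a
% discretization part and a mode-reduction part (assuming the two
% parts are orthogonal in the norm used):
\| u - u_{\mathrm{PGD}} \|^2
  = \| u - u_{\mathrm{FEM}} \|^2
  + \| u_{\mathrm{FEM}} - u_{\mathrm{PGD}} \|^2
```

In this reading, increasing the number of modes M drives the second term to zero, so the PGD error floor is set by the FEM discretization error on the underlying mesh.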


