Adaptive Reduced Basis Methods for Multiscale Problems and Large-scale PDE-constrained Optimization
This thesis presents recent advances in model order reduction methods, with the primary aim of constructing online-efficient reduced surrogate models for parameterized multiscale phenomena and of accelerating large-scale PDE-constrained parameter optimization methods. In particular, we present several adaptive reduced basis (RB) approaches that can be used in an error-aware trust-region framework for the progressive construction of a surrogate model during a certified outer optimization loop. In addition, we develop several enhancements of the trust-region reduced basis (TR-RB) algorithm and generalize it to handle parameter constraints. Thanks to the a posteriori error estimation of the reduced model, the resulting algorithm can be considered certified with respect to the high-fidelity model. Moreover, we follow the first-optimize-then-discretize approach in order to take full advantage of the underlying optimality system of the problem.

In the first part of this thesis, the theory is based on global RB techniques that use an accurate finite element (FEM) discretization as the high-fidelity model. In the second part, we focus on localized model order reduction methods and develop a novel online-efficient reduced model for the localized orthogonal decomposition (LOD) multiscale method. The reduced model is internally based on a two-scale formulation of the LOD and, in particular, is independent of the coarse and fine discretizations of the LOD.

The last part of this thesis is devoted to combining the results on TR-RB methods with the localized RB approaches for the LOD. To this end, we present an algorithm that uses adaptive localized reduced basis methods within a trust-region localized reduced basis (TR-LRB) framework. The basic ideas of the TR-RB algorithm are followed, but FEM evaluations of the involved systems are entirely avoided.
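To illustrate the error-aware trust-region idea described above, the following toy Python sketch mimics the interplay of a cheap surrogate, an a posteriori error indicator that delimits the trust region, and adaptive enrichment at accepted iterates. All names (`J_h`, `SurrogateModel`, `error_estimate`, `tr_rb`) and the scalar quadratic toy problem are hypothetical stand-ins, not the thesis's actual TR-RB implementation, which builds certified RB reduced models of PDE-constrained optimality systems.

```python
# Minimal, purely illustrative sketch of an error-aware trust-region loop.
# All names and models here are hypothetical stand-ins: in the thesis, the
# high-fidelity model is a FEM discretization and the surrogate is a reduced
# basis (RB) model with a rigorous a posteriori error estimator.

def J_h(mu):
    """Toy 'high-fidelity' objective with minimizer mu* = 3."""
    return (mu - 3.0) ** 2 + 1.0

class SurrogateModel:
    """Linear model around a center point, with a crude error indicator
    that grows with distance from the center (mimicking an RB estimator
    that is small near the snapshots used to build the basis)."""
    def __init__(self, center):
        self.center = center
        self.value = J_h(center)
        h = 1e-6  # finite differences as a stand-in for an adjoint gradient
        self.grad = (J_h(center + h) - J_h(center - h)) / (2.0 * h)

    def J_r(self, mu):
        """Cheap reduced objective."""
        return self.value + self.grad * (mu - self.center)

    def error_estimate(self, mu):
        """Hypothetical a posteriori indicator Delta(mu)."""
        return (mu - self.center) ** 2

def tr_rb(mu0, radius=1.0, tol=1e-6, max_it=50):
    """Error-aware trust-region loop: the trust region is the set where
    the error indicator stays below `radius`, and the surrogate is
    rebuilt (enriched) at every accepted iterate."""
    mu, model = mu0, SurrogateModel(mu0)
    for _ in range(max_it):
        if abs(model.grad) < tol:  # first-order criticality reached
            break
        # Approximate sub-problem solve on the surrogate: gradient step,
        # backtracked until the indicator certifies we stay in the region.
        step = -model.grad
        while model.error_estimate(mu + step) > radius:
            step *= 0.5
        candidate = mu + step
        # Acceptance test; the certified algorithm performs such checks
        # with error bounds rather than plain high-fidelity comparisons.
        if J_h(candidate) < J_h(mu):
            mu = candidate
            model = SurrogateModel(mu)       # enrich/rebuild the surrogate
            radius = min(2.0 * radius, 1.0)  # possibly enlarge the region
        else:
            radius *= 0.5                    # shrink the region and retry
    return mu

print(tr_rb(0.0))  # converges to the toy minimizer mu* = 3
```

The design point mirrored from the abstract is that the expensive model is queried only at accepted iterates, where its output also serves to enrich the surrogate; all intermediate steps are certified through the error indicator alone.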