Statistical identification of penalizing configurations in high-dimensional thermalhydraulic numerical experiments: The ICSCREAM methodology

04/08/2020
by A. Marrel, et al.

In the framework of risk assessment in nuclear accident analysis, best-estimate computer codes are used to estimate safety margins. Several inputs of the code can be uncertain, due to a lack of knowledge but also to the particular choice of accidental scenario under consideration. The objective of this work is to identify the most penalizing (or critical) configurations of several input parameters (called "scenario inputs"), corresponding to extreme values of the code output, independently of the uncertainty of the other input parameters. However, complex computer codes, such as those used in thermal-hydraulic accident scenario simulations, are often too CPU-expensive to be used directly for these studies. A solution consists in fitting the code output with a metamodel, built from a reduced number of code simulations. When the number of input parameters is very large (around a hundred here), building the metamodel remains a challenge. To overcome this, we propose a methodology, called ICSCREAM (Identification of penalizing Configurations using SCREening And Metamodel), based on screening techniques and Gaussian process (Gp) metamodeling.

The efficiency of this methodology is illustrated on a thermal-hydraulic industrial case simulating an accident of primary coolant loss in a Pressurized Water Reactor. This use case includes 97 uncertain inputs, two scenario inputs to be penalized, and 500 code simulations for the learning database. The study focuses on the peak cladding temperature (PCT), and critical configurations are defined as those exceeding the 90%-quantile of the PCT.

For the screening step, statistical tests of independence based on the Hilbert-Schmidt independence criterion (HSIC) are used for global and target sensitivity analyses. They allow a significant reduction of the inputs (from 97 to 20) and a ranking of these influential inputs by order of influence. A Gp metamodel is then sequentially built, reaching a satisfactory predictivity (82% of explained PCT variance) and a high capacity to identify PCT critical areas (94% of them detected). The Gp metamodel is finally used to estimate, within a Bayesian framework, the conditional probabilities of exceeding the threshold, according to the two scenario inputs. The analysis reveals a strong interaction of the two scenario inputs in the occurrence of critical configurations, the worst cases corresponding to medium values of both inputs.
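To make the screening step concrete, below is a minimal Python sketch of an HSIC-based test of independence between one input sample x and the output sample y. It is an illustration under simple assumptions (Gaussian kernels with a median-heuristic bandwidth, a plain permutation test), not the implementation used in the paper; the weight function suggested in the final comment for the target version is likewise only one possible choice.

import numpy as np

def rbf_gram(v, bandwidth=None):
    # Gram matrix of a 1-D sample under a Gaussian kernel;
    # bandwidth set by the median heuristic when not given.
    d2 = (v[:, None] - v[None, :]) ** 2
    if bandwidth is None:
        bandwidth = np.sqrt(np.median(d2[d2 > 0]))
    return np.exp(-d2 / (2 * bandwidth ** 2))

def hsic(x, y):
    # Biased V-statistic estimator: trace(K H L H) / (n - 1)^2.
    n = x.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(rbf_gram(x) @ H @ rbf_gram(y) @ H) / (n - 1) ** 2

def hsic_independence_pvalue(x, y, n_perm=300, seed=0):
    # Permutation test of H0: "input and output are independent" (HSIC = 0).
    rng = np.random.default_rng(seed)
    obs = hsic(x, y)
    hits = sum(hsic(x, rng.permutation(y)) >= obs for _ in range(n_perm))
    return (1 + hits) / (1 + n_perm)

# Screening: keep input j when its p-value falls below, e.g., 5%.
# For a *target* sensitivity analysis, one may replace y by a weight
# concentrated on the critical region, e.g. 1.0 * (y > np.quantile(y, 0.90)).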
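The final two steps can be sketched in the same spirit with standard tools: fit a Gp metamodel on the screened inputs, then estimate, for fixed values of the two scenario inputs, the probability of exceeding the PCT threshold by integrating the remaining inputs out by Monte Carlo. The Matérn kernel choice, the column ordering (scenario inputs first), and the draw_nuisance sampler are assumptions of this sketch; the paper's sequential Gp construction and full Bayesian treatment are more elaborate.

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def fit_gp(X, y):
    # Anisotropic Matern 5/2 kernel, one length scale per screened input.
    kernel = Matern(length_scale=np.ones(X.shape[1]), nu=2.5)
    return GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

def exceedance_probability(gp, s1, s2, draw_nuisance, threshold,
                           n_mc=2000, seed=0):
    # P(PCT > threshold | scenario inputs = (s1, s2)): the remaining
    # ("nuisance") inputs are integrated out by Monte Carlo, while the
    # Gp prediction uncertainty enters through its Gaussian posterior.
    rng = np.random.default_rng(seed)
    X_nuis = draw_nuisance(n_mc, rng)                    # (n_mc, d - 2)
    X = np.column_stack([np.full(n_mc, s1), np.full(n_mc, s2), X_nuis])
    mu, sd = gp.predict(X, return_std=True)
    return norm.sf(threshold, loc=mu, scale=sd).mean()

# Evaluating this probability over a grid of (s1, s2) values maps out the
# critical configurations, such as the medium-value region reported above.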

