How is model-related uncertainty quantified and reported in different disciplines?

06/24/2022 ∙ by Emily G. Simmonds, et al.
How do we know how much we know? Quantifying the uncertainty associated with our modelling work is the only way to answer how much we know about any phenomenon. With quantitative science now highly influential in the public sphere, and results from models translating into action, we must support our conclusions with sufficient rigour to produce useful, reproducible results. Incomplete consideration of model-based uncertainties can lead to false conclusions with real-world impacts. Despite these potentially damaging consequences, uncertainty consideration is incomplete both within and across scientific fields. We take a uniquely interdisciplinary approach and conduct a systematic audit of model-related uncertainty quantification from seven scientific fields, spanning the biological, physical, and social sciences. Our results show that no single field achieves complete consideration of model uncertainties, but together we can fill the gaps. We propose opportunities to improve the quantification of uncertainty through use of a source framework for uncertainty consideration, model-type-specific guidelines, improved presentation, and shared best practice. We also identify shared outstanding challenges (uncertainty in input data, balancing trade-offs, error propagation, and defining how much uncertainty is required). Finally, we make nine concrete recommendations for current practice (following good-practice guidelines and an uncertainty checklist, presenting uncertainty numerically, and propagating model-related uncertainty into conclusions), future research priorities (uncertainty in input data, quantifying uncertainty in complex models, and the importance of missing uncertainty in different contexts), and general research standards across the sciences (transparency about study limitations and dedicated uncertainty sections in manuscripts).
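Two of the recommended practices, presenting uncertainty numerically and propagating model-related uncertainty into conclusions, can be illustrated with a minimal sketch. The example below is not from the paper: it assumes a hypothetical linear model whose slope and intercept carry Gaussian parameter uncertainty, and propagates that uncertainty to a prediction by Monte Carlo sampling rather than reporting a bare point estimate.

```python
import numpy as np

# Hypothetical linear model y = a + b * x with uncertain parameters.
# Assumed (illustrative) parameter estimates and standard deviations:
rng = np.random.default_rng(42)
slope_mean, slope_sd = 2.0, 0.1
intercept_mean, intercept_sd = 1.0, 0.3

# Propagate parameter uncertainty by Monte Carlo: draw many plausible
# parameter sets and compute one prediction per draw.
n_draws = 10_000
slopes = rng.normal(slope_mean, slope_sd, n_draws)
intercepts = rng.normal(intercept_mean, intercept_sd, n_draws)

x_new = 5.0
predictions = intercepts + slopes * x_new

# Report the conclusion numerically, with its uncertainty attached.
mean = predictions.mean()
lo, hi = np.percentile(predictions, [2.5, 97.5])
print(f"prediction: {mean:.2f} (95% interval {lo:.2f} to {hi:.2f})")
```

The same pattern extends to nonlinear models where analytic error propagation is intractable: any quantity computed from the sampled parameters inherits a full distribution, so downstream conclusions can be stated with intervals instead of single numbers.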


