Adjusting Rate of Spread Factors through Derivative-Free Optimization: A New Methodology to Improve the Performance of Forest Fire Simulators

09/11/2019
by   Jaime Carrasco, et al.

In practical applications, wildfire simulators often fail to correctly predict the evolution of the fire scar. This is usually due to multiple factors, including inaccuracies in the input data (such as land cover classification, moisture, and poorly represented local winds), cumulative errors in the fire growth simulation model, and a high level of discontinuity/heterogeneity within the landscape, among many others. In practice it is therefore necessary to adjust the propagation of the fire to obtain better results, either to support suppression activities or to improve the simulator's performance by deriving new default parameters for future events that better represent the observed fire spread. In this article, we address this problem through a new methodology that uses Derivative-Free Optimization (DFO) algorithms to adjust the Rate of Spread (ROS) factors in a fire growth simulation model called Cell2Fire. To achieve this, we solve an error-minimization problem that captures the difference between the simulated and the observed fire: each iteration of the DFO framework evaluates the simulator output, allowing us to find the best possible factors for each fuel type present in the landscape. Numerical results for different objective functions are shown and discussed, including a performance comparison of alternative DFO algorithms.
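The loop described in the abstract, where a derivative-free optimizer repeatedly evaluates the simulator and adjusts per-fuel ROS factors to shrink the simulated-vs-observed error, can be sketched as follows. This is a minimal illustration, not the paper's implementation: `simulate_scar`, the observed values, and the objective are hypothetical stand-ins for a Cell2Fire run and a real fire-scar comparison, and the optimizer is a simple coordinate (pattern) search, one of the simplest DFO methods.

```python
def simulate_scar(ros_factors):
    # Toy surrogate for a simulator run: burned area per fuel type
    # grows linearly with its ROS factor (hypothetical, for illustration).
    return [10.0 * f for f in ros_factors]

# Hypothetical observed burned area per fuel type.
OBSERVED = [12.0, 7.5, 20.0]

def objective(ros_factors):
    # Error between simulated and observed fire: sum of squared differences.
    sim = simulate_scar(ros_factors)
    return sum((s - o) ** 2 for s, o in zip(sim, OBSERVED))

def coordinate_search(x, step=0.5, tol=1e-6, max_iter=1000):
    # Derivative-free pattern search: probe +/- step on each factor,
    # keep any improving move, and halve the step when no move improves.
    best = objective(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                val = objective(trial)
                if val < best:
                    x, best = trial, val
                    improved = True
        if not improved:
            step *= 0.5
            if step < tol:
                break
    return x, best

# Start from neutral factors (1.0 for every fuel) and adjust.
factors, err = coordinate_search([1.0, 1.0, 1.0])
```

In the paper's setting each `objective` call would run a full Cell2Fire simulation, so the number of evaluations matters; this is why specialized DFO algorithms, rather than this naive search, are compared.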


Related research

- 05/22/2019, Cell2Fire: A Cell Based Forest Fire Growth Model. Cell2Fire is a new cell-based forest and wildland landscape fire growth ...
- 06/30/2020, Parameter Estimation of Fire Propagation Models Using Level Set Methods. The availability of wildland fire propagation models with parameters est...
- 08/11/2016, Warm Starting Bayesian Optimization. We develop a framework for warm-starting Bayesian optimization, that red...
- 06/23/2020, Inexact Derivative-Free Optimization for Bilevel Learning. Variational regularization techniques are dominant in the field of mathe...
- 04/19/2019, Derivative-Free Global Optimization Algorithms: Bayesian Method and Lipschitzian Approaches. In this paper, we will provide an introduction to the derivative-free op...
- 02/10/2021, Derivative-Free Reinforcement Learning: A Review. Reinforcement learning is about learning agent models that make the best...
- 11/26/2017, A note on using performance and data profiles for training algorithms. It is shown how to use the performance and data profile benchmarking too...
