Sensitivity-based Heuristic for Guaranteed Global Optimization with Nonlinear Ordinary Differential Equations

We focus on interval algorithms for computing guaranteed enclosures of the solutions of constrained global optimization problems involving differential constraints. Such problems of global optimization with nonlinear ordinary differential equations can be solved with a branch-and-bound algorithm built on guaranteed numerical integration methods. This kind of algorithm is, however, computationally expensive, and defining new methods that reduce the number of branches remains a challenge. Bisection based on the smear value is often the most efficient branching heuristic: it bisects along the coordinate direction in which the values of the considered function change most rapidly. We propose a smear-like function based on the sensitivity functions obtained by differentiating the ordinary differential equation with respect to its parameters. Sensitivity has already been used in validated simulation for local optimization, but not as a bisection heuristic. We implement this heuristic in a branch-and-bound algorithm to solve a problem of global optimization with nonlinear ordinary differential equations. Experiments show that the gain in terms of number of branches can reach up to 30%.
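As a rough illustration of the kind of heuristic described above, the Python sketch below selects the bisection direction whose sensitivity-weighted width (a smear-like value) is largest and then splits the parameter box along it. This is not the authors' implementation; the names (`smear_direction`, `bisect`) and the toy sensitivity values are illustrative assumptions, and in a validated setting the sensitivities would come from guaranteed integration of the sensitivity ODEs rather than the plain floats used here.

```python
# Minimal sketch (assumed, not the paper's code) of a smear-like bisection
# heuristic driven by ODE sensitivities d y_i / d p_j.

from typing import List, Tuple

Interval = Tuple[float, float]  # (lower, upper) bounds of one parameter


def width(iv: Interval) -> float:
    """Width of an interval."""
    return iv[1] - iv[0]


def smear_direction(box: List[Interval], sensitivities: List[List[float]]) -> int:
    """Pick the bisection direction with the largest smear-like value.

    box           -- interval enclosure of the parameter vector p
    sensitivities -- matrix S with S[i][j] ~ d y_i / d p_j (e.g. representative
                     magnitudes obtained from the sensitivity equations)

    The smear-like value of direction j is
        smear_j = max_i |S[i][j]| * width(box[j]),
    i.e. roughly how much the trajectory can vary across the j-th interval.
    """
    best_j, best_val = 0, -1.0
    for j in range(len(box)):
        smear_j = max(abs(row[j]) for row in sensitivities) * width(box[j])
        if smear_j > best_val:
            best_j, best_val = j, smear_j
    return best_j


def bisect(box: List[Interval], j: int) -> Tuple[List[Interval], List[Interval]]:
    """Split the box along direction j at its midpoint."""
    lo, hi = box[j]
    mid = 0.5 * (lo + hi)
    left, right = list(box), list(box)
    left[j], right[j] = (lo, mid), (mid, hi)
    return left, right


if __name__ == "__main__":
    # Toy example: two parameters, sensitivities of two state variables.
    box = [(0.0, 1.0), (0.0, 0.1)]
    S = [[0.2, 50.0],   # d y_0 / d p_0, d y_0 / d p_1
         [0.1, 10.0]]   # d y_1 / d p_0, d y_1 / d p_1
    j = smear_direction(box, S)  # direction 1 wins: 50 * 0.1 = 5 > 0.2 * 1
    left, right = bisect(box, j)
    print(f"bisect along p_{j}: {left} | {right}")
```

Within a branch-and-bound loop, such a selector would replace a naive largest-width rule: boxes whose objective enclosure cannot contain the global optimum are discarded, and the remaining ones are bisected along the direction returned by the smear-like criterion.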
