First derivatives at the optimum analysis (fdao): An approach to estimate the uncertainty in nonlinear regression involving stochastically independent variables

02/25/2018
by   Carlos Sevcik, et al.

An important problem in optimization analysis arises when parameters {θ_i}_{i=1,…,k}, determining a function y = f(x | {θ_i}), must be estimated from a set of observables {x_j, y_j}_{j=1,…,m}, where the {x_j} are independent variables assumed to be uncertainty-free. It is known that analytical solutions are possible when y = f(x | {θ_i}) is a linear combination of the {θ_i}. Here it is proposed that the uncertainty of parameters that do not enter the function linearly may be estimated from the derivatives ∂f(x | {θ_i})/∂θ_i at an optimum, provided the parameters are stochastically independent.
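The idea of using first derivatives at the optimum can be sketched with the standard linearized approach to parameter uncertainty in nonlinear regression, where the sensitivities ∂f/∂θ_i are evaluated at the fitted optimum and combined with the residual variance. This is an illustrative sketch, not the paper's exact fdao estimator; the exponential model and parameter values below are hypothetical.

```python
import numpy as np

def f(x, theta):
    """Hypothetical nonlinear model: y = theta[0] * exp(-theta[1] * x)."""
    return theta[0] * np.exp(-theta[1] * x)

def sensitivities(x, theta, h=1e-6):
    """Central finite-difference first derivatives df/dtheta_i at theta."""
    k = len(theta)
    J = np.empty((len(x), k))
    for i in range(k):
        dt = np.zeros(k)
        dt[i] = h
        J[:, i] = (f(x, theta + dt) - f(x, theta - dt)) / (2 * h)
    return J

rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 40)
theta_hat = np.array([2.0, 0.7])   # treat the true values as the fitted optimum
y = f(x, theta_hat) + rng.normal(0.0, 0.05, x.size)

J = sensitivities(x, theta_hat)                  # derivatives at the optimum
resid = y - f(x, theta_hat)
s2 = resid @ resid / (x.size - len(theta_hat))   # residual variance estimate
cov = s2 * np.linalg.inv(J.T @ J)                # linearized parameter covariance
se = np.sqrt(np.diag(cov))                       # standard errors of theta_hat
```

Note that the covariance formula above assumes the parameters are (approximately) stochastically independent near the optimum, which is the same assumption the abstract makes explicit.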

