Gradient boosting-based numerical methods for high-dimensional backward stochastic differential equations
In this work we propose a new algorithm for solving high-dimensional backward stochastic differential equations (BSDEs). Based on the general theta-discretization of the time integrands, we show how eXtreme Gradient Boosting (XGBoost) regression can be used to efficiently approximate the resulting conditional expectations in high dimensions. Numerical results illustrate the efficiency and accuracy of the proposed algorithm for solving very high-dimensional (up to 10,000 dimensions) nonlinear BSDEs.
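To make the idea concrete, the following is a minimal sketch (an illustration under stated assumptions, not the authors' implementation) of a regression-based backward induction for a simple BSDE, using the fully explicit case theta = 0 and XGBoost regressors in place of the conditional expectations. The forward process, the driver f, the terminal condition g, and all parameter values below are illustrative assumptions.

```python
import numpy as np
from xgboost import XGBRegressor

d, M, N, T, sigma = 5, 10_000, 10, 1.0, 0.4   # dimension, paths, time steps, horizon, volatility
dt = T / N
rng = np.random.default_rng(0)

def g(x):                                     # illustrative terminal condition g(X_T)
    return np.maximum(x.sum(axis=1) - d, 0.0)

def f(t, x, y, z):                            # illustrative nonlinear driver f(t, X, Y, Z)
    return -0.05 * y + 0.1 * np.abs(z).sum(axis=1)

def make_regressor():                         # one small XGBoost model per conditional expectation
    return XGBRegressor(n_estimators=100, max_depth=4, learning_rate=0.1)

# Simulate the forward diffusion X_{i+1} = X_i + sigma * dW_i (Brownian paths).
dW = rng.normal(scale=np.sqrt(dt), size=(N, M, d))
X = np.empty((N + 1, M, d))
X[0] = 1.0
for i in range(N):
    X[i + 1] = X[i] + sigma * dW[i]

# Backward induction, starting from the terminal condition.
Y = g(X[N])
Z = np.zeros((M, d))                          # crude Z_N (a sharper choice would use grad g)

for i in range(N - 1, -1, -1):
    # Z_i ~ (1/dt) E[ Y_{i+1} * dW_i | X_i ], one regressor per component.
    Z_new = np.empty((M, d))
    for k in range(d):
        reg_z = make_regressor().fit(X[i], Y * dW[i][:, k] / dt)
        Z_new[:, k] = reg_z.predict(X[i])
    # Y_i ~ E[ Y_{i+1} + dt * f(t_{i+1}, X_{i+1}, Y_{i+1}, Z_{i+1}) | X_i ]  (theta = 0).
    target_y = Y + dt * f((i + 1) * dt, X[i + 1], Y, Z)
    reg_y = make_regressor().fit(X[i], target_y)
    Y, Z = reg_y.predict(X[i]), Z_new

print("approximate Y_0:", Y.mean())           # Y_0 is nearly deterministic; average over paths
```

For a general theta in (0, 1], the Y-update becomes implicit and is typically resolved by a few fixed-point (Picard) iterations per time step; the regression step itself is unchanged, which is where the gradient-boosting approximation enters.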