Extensions of Morse-Smale Regression with Application to Actuarial Science

08/17/2017
by Colleen M. Farrelly, et al.

The problem of subgroups is ubiquitous in scientific research (e.g., disease heterogeneity, spatial distributions in ecology), and piecewise regression is one way to deal with this phenomenon. Morse-Smale regression offers a way to partition the regression function based on level sets of a defined function and that function's basins of attraction. This topologically-based piecewise regression algorithm has shown promise in its initial applications, but the current implementation in the literature has been limited to elastic net and generalized linear regression. It is possible that nonparametric methods, such as random forest or conditional inference trees, may provide better prediction and insight by modeling interaction terms and other nonlinear relationships between predictors and a given outcome. This study explores the use of several machine learning algorithms within a Morse-Smale piecewise regression framework, including boosted regression with linear baselearners, homotopy-based LASSO, conditional inference trees, random forest, and a wide neural network framework called extreme learning machines. Simulations on Tweedie regression problems with varying Tweedie parameter and dispersion suggest that many machine learning approaches to Morse-Smale piecewise regression improve the original algorithm's performance, particularly for outcomes with lower dispersion and linear or a mix of linear and nonlinear predictor relationships. On a real actuarial problem, several of these new algorithms perform as well as or better than the original Morse-Smale regression algorithm, and most yield information on the nature of predictor relationships within each partition, offering insight into differences between dataset partitions.
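To make the general idea concrete, below is a minimal Python sketch of Morse-Smale-style piecewise regression, not the authors' implementation. It approximates the Morse-Smale partitions ("crystals") on a k-nearest-neighbour graph by labelling each point with the local maximum and local minimum of the outcome it reaches via steepest ascent and descent, then fits a separate learner (here a random forest, one of the learners the paper considers) inside each partition. The helper names, the kNN approximation, and the Friedman simulation data are all illustrative assumptions.

```python
# Hypothetical sketch of Morse-Smale piecewise regression with a machine
# learning base learner; the partition step is a rough kNN-graph proxy for
# the basins of attraction described in the paper.
import numpy as np
from sklearn.datasets import make_friedman1
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import NearestNeighbors


def morse_smale_partitions(X, y, k=15):
    """Assign each sample to a (max-basin, min-basin) 'crystal'."""
    nn = NearestNeighbors(n_neighbors=k).fit(X)
    _, idx = nn.kneighbors(X)  # idx[i] = indices of i's k nearest neighbours

    def follow(i, ascend=True):
        # Steepest ascent (or descent) of y on the kNN graph until a
        # local extremum is reached; that extremum labels the basin.
        while True:
            nbrs = idx[i]
            j = nbrs[np.argmax(y[nbrs])] if ascend else nbrs[np.argmin(y[nbrs])]
            if (y[j] <= y[i]) if ascend else (y[j] >= y[i]):
                return i
            i = j

    pairs = np.array([(follow(i, True), follow(i, False)) for i in range(len(y))])
    # Each unique (local max, local min) pair becomes one partition id.
    _, part = np.unique(pairs, axis=0, return_inverse=True)
    return part


# Simulated data standing in for the Tweedie/actuarial examples in the paper.
X, y = make_friedman1(n_samples=500, noise=0.5, random_state=0)
parts = morse_smale_partitions(X, y)

# One learner per partition; in practice tiny crystals would be merged first.
models = {
    p: RandomForestRegressor(n_estimators=200, random_state=0).fit(
        X[parts == p], y[parts == p]
    )
    for p in np.unique(parts)
}
print({p: int((parts == p).sum()) for p in models})  # partition sizes
```

Swapping the per-partition learner (boosted linear models, LASSO, conditional inference trees, extreme learning machines) is the only change needed to mirror the comparisons in the study, since the partitioning step is shared.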

