The Selectively Adaptive Lasso

05/22/2022
by Alejandro Schuler, et al.

Machine learning regression methods allow estimation of functions without unrealistic parametric assumptions. Although they can perform exceptionally well in terms of prediction error, most lack the theoretical convergence rates necessary for semi-parametric efficient estimation (e.g., TMLE, AIPW) of parameters like average treatment effects. The Highly Adaptive Lasso (HAL) is the only regression method proven to converge quickly enough for a meaningfully large class of functions, independent of the dimensionality of the predictors. Unfortunately, HAL is not computationally scalable. In this paper we build upon the theory of HAL to construct the Selectively Adaptive Lasso (SAL), a new algorithm that retains HAL's dimension-free, nonparametric convergence rate but also scales computationally to massive datasets. To accomplish this, we prove some general theoretical results pertaining to empirical loss minimization in nested Donsker classes. The resulting algorithm is a form of gradient tree boosting with an adaptive learning rate, which makes it fast and trivial to implement with off-the-shelf software. Finally, we show that our algorithm retains the performance of standard gradient boosting on a diverse group of real-world datasets. SAL makes semi-parametric efficient estimators practically possible and theoretically justifiable in many big-data settings.
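The abstract characterizes SAL as gradient tree boosting with an adaptive learning rate. The paper's actual rate rule (tied to HAL's variation-norm constraint) is not given here, so the sketch below is only a generic illustration of that idea for squared-error loss, where the per-round rate is chosen by exact line search. The names `adaptive_boost`, `boost_predict`, `n_rounds`, and `max_depth` are hypothetical, not from the paper.

```python
# Minimal sketch (not the authors' implementation): gradient tree boosting
# where each round's learning rate is chosen adaptively by line search,
# assuming squared-error loss.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def adaptive_boost(X, y, n_rounds=100, max_depth=3):
    """Fit an additive model F(x) = f0 + sum_m eta_m * h_m(x)."""
    f0 = float(np.mean(y))
    pred = np.full(len(y), f0)        # F_0: constant initial fit
    trees, rates = [], []
    for _ in range(n_rounds):
        residual = y - pred           # negative gradient of squared error
        h = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
        step = h.predict(X)
        # Adaptive rate: exact line search for squared error,
        # eta = argmin_e ||residual - e*step||^2 = <residual, step> / <step, step>
        eta = float(residual @ step) / max(float(step @ step), 1e-12)
        pred = pred + eta * step
        trees.append(h)
        rates.append(eta)
    return f0, trees, rates

def boost_predict(X, f0, trees, rates):
    """Evaluate the fitted additive model on new data."""
    pred = np.full(X.shape[0], f0)
    for h, eta in zip(trees, rates):
        pred = pred + eta * h.predict(X)
    return pred
```

Because each round is an ordinary regression-tree fit plus a scalar line search, a scheme like this runs with off-the-shelf software; SAL's specific rate choice is what yields the convergence guarantees claimed above.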

Related research

06/18/2018 · Flexible Collaborative Estimation of the Average Causal Effect of a Treatment using the Outcome-Highly-Adaptive Lasso
Many estimators of the average causal effect of an intervention require ...

10/26/2020 · A Homotopic Method to Solve the Lasso Problems with an Improved Upper Bound of Convergence Rate
In optimization, it is known that when the objective functions are stric...

12/25/2019 · An efficient penalized estimation approach for a semi-parametric linear transformation model with interval-censored data
We consider efficient estimation of flexible transformation models with ...

09/16/2022 · Nonparametric Estimation via Mixed Gradients
Traditional nonparametric estimation methods often lead to a slow conver...

11/13/2020 · Adaptive Estimation In High-Dimensional Additive Models With Multi-Resolution Group Lasso
In additive models with many nonparametric components, a number of regul...

12/05/2021 · Local Adaptivity of Gradient Boosting in Histogram Transform Ensemble Learning
In this paper, we propose a gradient boosting algorithm called adaptive ...

08/01/2017 · Fast Exact Conformalization of Lasso using Piecewise Linear Homotopy
Conformal prediction is a general method that converts almost any point ...
