L1-norm vs. L2-norm fitting in optimizing focal multi-channel tES stimulation: linear and semidefinite programming vs. weighted least squares

06/03/2022
by Fernando Galaz Prieto, et al.

This study focuses on multi-channel transcranial electrical stimulation (tES), a non-invasive method for stimulating neuronal activity in the brain via low-intensity currents. We introduce a mathematical formulation for finding a current pattern that optimizes an L1-norm fit between a given focal target distribution and the volume current density inside the brain. The L1-norm is well known to favor well-localized or sparse distributions compared to L2-norm (least-squares) fitted estimates. We present a linear programming approach that performs L1-norm fitting together with L1-norm penalization of the current pattern (L1L1) to control the number of non-zero currents. The optimizer selects a solution via a two-stage metaheuristic search over a pre-filtered set of candidates. The numerical simulation results, obtained with both 8- and 20-channel electrode montages, support our hypothesis on the benefits of L1-norm data fitting. Compared to L1-norm regularized L2-norm fitting (L1L2) via semidefinite programming and the weighted Tikhonov least-squares method, the L1L1 results were overall preferable with respect to maximizing the focused current density at the target position and the ratio between the focused and nuisance current magnitudes. We propose the metaheuristic L1L1 optimization approach as a potential technique for obtaining a well-localized stimulus with a controllable magnitude at a given target position. L1L1 finds a current pattern with a steep contrast between the anodal and cathodal electrodes while suppressing the nuisance currents in the brain, hence providing a potential alternative for modulating the effects of the stimulation, e.g., the sensation experienced by the subject.
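An L1-norm data fit with an L1-norm penalty on the electrode currents can be cast as a linear program by introducing auxiliary bound variables for the absolute values. The sketch below illustrates this standard reduction with SciPy's `linprog` on a toy random lead field; it is not the authors' implementation, and the matrix `A`, target `b`, penalty weight `alpha`, and problem sizes are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 40, 8                   # field samples, electrode channels (toy sizes)
A = rng.normal(size=(m, n))    # hypothetical lead-field matrix
b = rng.normal(size=m)         # target volume current density pattern
alpha = 0.1                    # L1 penalty weight on electrode currents

# Decision vector z = [x (n currents), t (m residual bounds), s (n current bounds)]:
#   minimize  sum(t) + alpha * sum(s)
#   s.t.      -t <= A x - b <= t    (t >= |A x - b|, the L1 data fit)
#             -s <= x <= s          (s >= |x|, the L1 current penalty)
#             sum(x) = 0            (injected currents must balance)
c = np.concatenate([np.zeros(n), np.ones(m), alpha * np.ones(n)])

A_ub = np.block([
    [ A,            -np.eye(m),       np.zeros((m, n))],  #  A x - t <=  b
    [-A,            -np.eye(m),       np.zeros((m, n))],  # -A x - t <= -b
    [ np.eye(n),     np.zeros((n, m)), -np.eye(n)],       #  x - s <= 0
    [-np.eye(n),     np.zeros((n, m)), -np.eye(n)],       # -x - s <= 0
])
b_ub = np.concatenate([b, -b, np.zeros(2 * n)])
A_eq = np.concatenate([np.ones(n), np.zeros(m + n)])[None, :]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[0.0],
              bounds=[(None, None)] * (n + m + n))
x = res.x[:n]
print("L1 residual:", np.abs(A @ x - b).sum())
```

Because the L1 objective is piecewise linear, the LP solver tends to drive many entries of `x` exactly to zero, which is the sparsity property the abstract exploits to limit the number of active electrodes.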


