Reducing Computational and Statistical Complexity in Machine Learning Through Cardinality Sparsity

02/16/2023
by Ali Mohades et al.

High-dimensional data has become ubiquitous across the sciences, but it poses computational and statistical challenges. A common approach to dealing with these challenges is sparsity. In this paper, we introduce a new notion of sparsity, called cardinality sparsity. Broadly speaking, we call a tensor sparse if it contains only a small number of unique values. We show that cardinality sparsity can improve deep learning and tensor regression both statistically and computationally. Along the way, we generalize recent statistical theories in those fields.
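To make the idea concrete, here is a minimal sketch (not the paper's implementation) of the distinction between cardinality sparsity and classical zero-based sparsity: a tensor can be fully dense in the classical sense yet still cardinality sparse if its entries are drawn from a small set of values. The function name `cardinality` is a hypothetical helper introduced for illustration.

```python
import numpy as np

def cardinality(tensor):
    """Number of distinct values in the tensor (its 'cardinality')."""
    return np.unique(tensor).size

# A 100x100 matrix whose entries are drawn from just three values:
# dense in the classical sense (no zero entries), yet cardinality sparse.
rng = np.random.default_rng(0)
dense_but_sparse = rng.choice([-1.0, 0.5, 2.0], size=(100, 100))

print(cardinality(dense_but_sparse))          # 3 distinct values
print(np.count_nonzero(dense_but_sparse))     # 10000 nonzeros: classically dense
```

Classical sparsity would treat this tensor as fully dense, while the cardinality view recognizes that it can be stored and processed in terms of only three values plus an index pattern.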


