
Nonsmoothness in Machine Learning: specific structure, proximal identification, and applications

10/02/2020
by Franck Iutzeler, et al.

Nonsmoothness is often a curse for optimization, but it is sometimes a blessing, in particular for applications in machine learning. In this paper, we present the specific structure of nonsmooth optimization problems appearing in machine learning and illustrate how to leverage this structure in practice, for compression, acceleration, or dimension reduction. We pay special attention to keeping the presentation concise and easily accessible, with both simple examples and general results.
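To make the identification idea concrete, here is a minimal sketch (ours, not taken from the paper) of the textbook example: proximal gradient descent (ISTA) on an l1-regularized least-squares problem. The soft-thresholding prox sets coordinates exactly to zero, so after enough iterations the support of the iterates stabilizes; that identified structure is what can then be exploited for compression or dimension reduction. All names, dimensions, and parameter values below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista(A, b, lam, step, n_iter=500):
    """Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                         # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)  # nonsmooth prox step
    return x

# Illustrative data: a sparse signal observed through a random matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true

step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, with L the gradient's Lipschitz constant
x = ista(A, b, lam=0.1, step=step)

# The prox zeroes coordinates exactly, so the nonzero pattern of the
# iterates eventually stabilizes: this is the identification property.
print("identified support:", np.flatnonzero(x))
```

Once the support has stabilized, the problem can be restricted to the identified coordinates and solved in a much smaller space, which is the kind of compression and dimension reduction the abstract alludes to.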


Related research

06/17/2019 · A Survey of Optimization Methods from a Machine Learning Perspective
Machine learning develops rapidly, which has made many theoretical break...

09/24/2021 · Accelerated nonlinear primal-dual hybrid gradient algorithms with applications to machine learning
The primal-dual hybrid gradient (PDHG) algorithm is a first-order method...

04/18/2020 · Optimization in Machine Learning: A Distribution Space Approach
We present the viewpoint that optimization problems encountered in machi...

07/05/2020 · Participation is not a Design Fix for Machine Learning
This paper critically examines existing modes of participation in design...

06/15/2016 · Optimization Methods for Large-Scale Machine Learning
This paper provides a review and commentary on the past, present, and fu...

09/06/2020 · Screening Rules and its Complexity for Active Set Identification
Screening rules were recently introduced as a technique for explicitly i...

04/27/2020 · The Dark Side of Unikernels for Machine Learning
This paper analyzes the shortcomings of unikernels as a method of deploy...