Jensen: An Easily-Extensible C++ Toolkit for Production-Level Machine Learning and Convex Optimization

07/17/2018
by Rishabh Iyer, et al.

This paper introduces Jensen, an easily extensible and scalable toolkit for production-level machine learning and convex optimization. Jensen implements a framework of convex (or loss) functions, convex optimization algorithms (including Gradient Descent, L-BFGS, Stochastic Gradient Descent, and Conjugate Gradient), and a family of machine learning classifiers and regressors (Logistic Regression, SVMs, Least Squares Regression, etc.). This framework makes it possible to deploy and train models with a few lines of code, and to extend the toolkit by integrating new loss functions and optimization algorithms.
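To make the abstraction concrete, below is a minimal C++ sketch of the kind of loss-function/optimizer interface the abstract describes. All names here (ConvexFunction, LogisticLoss, gradientDescent) are illustrative assumptions rather than Jensen's actual API; the point is to show how, once losses and optimizers share a common interface, a logistic-regression model can be trained in a few lines and new losses or optimizers can be slotted in without changing the rest of the code.

#include <cmath>
#include <iostream>
#include <vector>

using Vector = std::vector<double>;

// A convex function exposes its value and gradient at a point w.
// (Hypothetical interface, for illustration only.)
struct ConvexFunction {
    virtual double eval(const Vector& w, Vector& grad) const = 0;
    virtual ~ConvexFunction() = default;
};

// Binary logistic loss over a small dense dataset (labels in {-1,+1}).
struct LogisticLoss : ConvexFunction {
    std::vector<Vector> X;  // feature rows
    Vector y;               // labels

    LogisticLoss(std::vector<Vector> X_, Vector y_)
        : X(std::move(X_)), y(std::move(y_)) {}

    double eval(const Vector& w, Vector& grad) const override {
        double loss = 0.0;
        grad.assign(w.size(), 0.0);
        for (size_t i = 0; i < X.size(); ++i) {
            double margin = 0.0;
            for (size_t j = 0; j < w.size(); ++j) margin += X[i][j] * w[j];
            margin *= y[i];
            loss += std::log1p(std::exp(-margin));
            // d/dw log(1 + exp(-y w.x)) = -y x / (1 + exp(y w.x))
            double coeff = -y[i] / (1.0 + std::exp(margin));
            for (size_t j = 0; j < w.size(); ++j) grad[j] += coeff * X[i][j];
        }
        return loss;
    }
};

// Plain gradient descent with a fixed step size; other optimizers
// (L-BFGS, SGD, conjugate gradient, ...) would sit behind the same
// function interface.
Vector gradientDescent(const ConvexFunction& f, Vector w,
                       double step = 0.1, int iters = 200) {
    Vector grad;
    for (int t = 0; t < iters; ++t) {
        f.eval(w, grad);
        for (size_t j = 0; j < w.size(); ++j) w[j] -= step * grad[j];
    }
    return w;
}

int main() {
    // Tiny linearly separable toy problem.
    LogisticLoss loss({{1.0, 2.0}, {2.0, 1.0}, {-1.0, -2.0}, {-2.0, -1.0}},
                      {+1, +1, -1, -1});
    Vector w = gradientDescent(loss, {0.0, 0.0});
    std::cout << "learned weights: " << w[0] << ", " << w[1] << "\n";
}

Adding a new loss (say, hinge or least squares) means implementing one more ConvexFunction subclass; adding a new optimizer means writing one more function against the same interface, which is the extensibility pattern the abstract claims.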
