Deeplite Neutrino: An End-to-End Framework for Constrained Deep Learning Model Optimization

01/11/2021
by Anush Sankaran, et al.

Designing deep learning-based solutions is becoming a race to train ever deeper models with more layers. While a large, deep model can deliver competitive accuracy, it creates significant logistical challenges and heavy resource requirements during development and deployment. This has been one of the key reasons deep learning models are not widely used in production environments, especially on edge devices. There is an immediate need to optimize and compress these deep learning models to enable on-device intelligence. In this research, we introduce Deeplite Neutrino, a black-box framework for production-ready optimization of deep learning models. The framework provides an easy mechanism for end users to specify constraints, such as a tolerable drop in accuracy or a target size for the optimized model, that guide the whole optimization process. The framework is easy to integrate into an existing production pipeline and is available as a Python package supporting the PyTorch and TensorFlow libraries. The optimization performance of the framework is demonstrated across multiple benchmark datasets and popular deep learning models. Further, the framework is currently used in production, and results and testimonials from several clients are summarized.
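The abstract describes a black-box, constraint-driven workflow: the user supplies a tolerable accuracy drop or a target model size, and the framework performs the compression. The snippet below is a minimal sketch of that idea only, not the Neutrino API: dynamic quantization in PyTorch stands in for Neutrino's optimization step, and `evaluate` is an assumed user-supplied accuracy function.

```python
# Minimal sketch of constraint-guided model compression (not the Neutrino API).
# Dynamic quantization stands in for the framework's optimization step;
# `evaluate` is an assumed user-supplied function returning accuracy in [0, 1].

import copy
import torch
import torch.nn as nn

def optimize_with_constraints(model, evaluate, max_accuracy_drop=0.01):
    """Compress `model` and keep the result only if its accuracy stays
    within `max_accuracy_drop` (absolute) of the original model."""
    baseline_acc = evaluate(model)

    # Stand-in compression step: dynamically quantize Linear layers to int8.
    candidate = torch.quantization.quantize_dynamic(
        copy.deepcopy(model), {nn.Linear}, dtype=torch.qint8
    )

    candidate_acc = evaluate(candidate)
    if baseline_acc - candidate_acc <= max_accuracy_drop:
        return candidate  # constraint satisfied: use the smaller model
    return model          # otherwise fall back to the original
```

A Neutrino-style tool would search over many such compression candidates automatically; this sketch shows only the accept/reject logic around a single candidate, driven by the user's accuracy constraint.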

