Neural Network Distiller: A Python Package For DNN Compression Research

10/27/2019
by Neta Zmora et al.

This paper presents the philosophy, design, and feature set of Neural Network Distiller, an open-source Python package for DNN compression research. Distiller is a library of DNN compression algorithm implementations, with tools, tutorials, and sample applications for various learning tasks. Its target users are both engineers and researchers, and its rich content is complemented by a design for extensibility that facilitates new research. Distiller is open source and available on GitHub at https://github.com/NervanaSystems/distiller.
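To give a concrete sense of the kind of algorithm such a compression library implements, below is a minimal sketch of one-shot magnitude pruning in plain PyTorch. This is not Distiller's API; the function name `magnitude_prune_` and the sparsity level are arbitrary choices for the example, and a real workflow would follow pruning with fine-tuning.

```python
# Illustrative sketch only: one-shot magnitude pruning in plain PyTorch,
# showing the kind of compression algorithm a library like Distiller provides.
# Names and the sparsity level are assumptions made for this example.
import torch
import torch.nn as nn


def magnitude_prune_(module: nn.Module, sparsity: float = 0.5) -> None:
    """Zero out the smallest-magnitude weights of every Linear/Conv2d layer, in place."""
    for layer in module.modules():
        if isinstance(layer, (nn.Linear, nn.Conv2d)):
            w = layer.weight.data
            # Pick a threshold such that `sparsity` fraction of weights fall at or below it.
            k = int(sparsity * w.numel())
            if k == 0:
                continue
            threshold = w.abs().flatten().kthvalue(k).values
            mask = (w.abs() > threshold).to(w.dtype)
            w.mul_(mask)  # keep large weights, zero the rest


# Example: prune roughly half the weights of a small MLP.
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
magnitude_prune_(model, sparsity=0.5)
```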
