DLOPT: Deep Learning Optimization Library

07/10/2018
by Andrés Camero, et al.

Deep learning hyper-parameter optimization is a tough task. Finding an appropriate network configuration is key to success; however, most of the time this labor is done roughly, in an ad hoc manner. In this work we introduce a novel library to tackle this problem, the Deep Learning Optimization Library (DLOPT). We briefly describe its architecture and present a set of usage examples. DLOPT is an open source project developed under the GNU GPL v3 license and is freely available at https://github.com/acamero/dlopt
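The abstract does not show DLOPT's own API, so the sketch below is only a hedged illustration of the kind of problem the library targets: a plain random search over a handful of network configurations, written with standard Keras and synthetic data. Every name and setting here (build_model, the candidate hidden sizes and learning rates, the toy dataset) is an assumption made for illustration and is not part of DLOPT.

```python
# Hypothetical sketch of hyper-parameter optimization (NOT DLOPT's API):
# random search over a few MLP configurations using plain Keras.
import random
import numpy as np
from tensorflow import keras

# Synthetic regression data stands in for a real dataset (assumption).
rng = np.random.default_rng(seed=1)
x = rng.normal(size=(500, 10))
y = x.sum(axis=1, keepdims=True) + rng.normal(scale=0.1, size=(500, 1))

def build_model(hidden_units, learning_rate):
    """Build a small MLP for one candidate configuration."""
    model = keras.Sequential([
        keras.layers.Input(shape=(10,)),
        keras.layers.Dense(hidden_units, activation="relu"),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer=keras.optimizers.Adam(learning_rate), loss="mse")
    return model

best_config, best_loss = None, float("inf")
for _ in range(5):  # evaluate a handful of random configurations
    config = {"hidden_units": random.choice([8, 16, 32, 64]),
              "learning_rate": random.choice([1e-2, 1e-3, 1e-4])}
    model = build_model(**config)
    model.fit(x, y, epochs=5, batch_size=32, verbose=0)
    loss = model.evaluate(x, y, verbose=0)
    if loss < best_loss:
        best_config, best_loss = config, loss

print("best configuration:", best_config, "loss:", best_loss)
```

Random search is shown here only because it is the simplest baseline for the configuration-search problem the abstract describes; the techniques actually implemented in DLOPT are documented in the paper and repository.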


Related research

10/03/2018 - McTorch, a manifold optimization library for deep learning
In this paper, we introduce McTorch, a manifold optimization library for...

09/10/2018 - Torchbearer: A Model Fitting Library for PyTorch
We introduce torchbearer, a model fitting library for pytorch aimed at r...

06/12/2021 - Lessons learned from hyper-parameter tuning for microservice candidate identification
When optimizing software for the cloud, monolithic applications need to ...

09/23/2020 - ANNdotNET – deep learning tool on .NET Platform
ANNdotNET is an open source project for deep learning written in C# with...

06/09/2023 - HypLL: The Hyperbolic Learning Library
Deep learning in hyperbolic space is quickly gaining traction in the fie...

03/12/2021 - Latent Space Explorations of Singing Voice Synthesis using DDSP
Machine learning based singing voice models require large datasets and l...

02/03/2020 - Torch-Struct: Deep Structured Prediction Library
The literature on structured prediction for NLP describes a rich collect...
