Blocks and Fuel: Frameworks for deep learning

06/01/2015
by Bart van Merriënboer et al.

We introduce two Python frameworks to train neural networks on large datasets: Blocks and Fuel. Blocks is based on Theano, a linear algebra compiler with CUDA support. It facilitates the training of complex neural network models by providing parametrized Theano operations, attaching metadata to Theano's symbolic computational graph, and supplying an extensive set of utilities that assist in training networks, e.g., training algorithms, logging, monitoring, visualization, and serialization. Fuel provides a standard format for machine learning datasets. It allows the user to easily iterate over large datasets, performing many types of preprocessing on the fly.
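
The parametrized Theano operations that Blocks provides are called bricks. As a minimal sketch of how a graph is assembled from them (Linear, Rectifier, Softmax, IsotropicGaussian, and Constant are real Blocks classes; the dimensions and names below are illustrative, not from the paper):

    import theano.tensor as T
    from blocks.bricks import Linear, Rectifier, Softmax
    from blocks.initialization import IsotropicGaussian, Constant

    x = T.matrix('features')

    # Bricks are parametrized Theano operations: applying one extends the
    # symbolic graph and annotates it with metadata (names, parameter roles).
    hidden = Linear(name='hidden', input_dim=784, output_dim=256,
                    weights_init=IsotropicGaussian(0.01),
                    biases_init=Constant(0))
    output = Linear(name='output', input_dim=256, output_dim=10,
                    weights_init=IsotropicGaussian(0.01),
                    biases_init=Constant(0))

    h = Rectifier().apply(hidden.apply(x))
    y_hat = Softmax().apply(output.apply(h))

    # Parameters are allocated lazily; initialize() fills them in using the
    # initialization schemes attached to each brick.
    hidden.initialize()
    output.initialize()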
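Fuel's iteration model, sketched below under similar assumptions (MNIST, DataStream, SequentialScheme, and Flatten are real Fuel classes; the batch size is illustrative), separates which examples to visit (a scheme) from how they are served (a stream), with transformers performing preprocessing on the fly:

    from fuel.datasets import MNIST
    from fuel.streams import DataStream
    from fuel.schemes import SequentialScheme
    from fuel.transformers import Flatten

    mnist = MNIST(('train',))

    # The scheme decides which examples each request covers; the stream ties
    # the scheme to the dataset; the transformer reshapes batches lazily.
    stream = Flatten(
        DataStream.default_stream(
            mnist,
            iteration_scheme=SequentialScheme(mnist.num_examples, 128)),
        which_sources=('features',))

    for features, targets in stream.get_epoch_iterator():
        pass  # each iteration yields one preprocessed minibatch

Because transformers wrap streams rather than datasets, the same preprocessing pipeline can be reused unchanged across training, validation, and test splits.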


