Theano: new features and speed improvements

11/23/2012
by Frédéric Bastien et al.

Theano is a linear algebra compiler that optimizes a user's symbolically-specified mathematical computations to produce efficient low-level implementations. In this paper, we present new features and efficiency improvements to Theano, and benchmarks demonstrating Theano's performance relative to Torch7, a recently introduced machine learning library, and to RNNLM, a C++ library targeted at recurrent neural networks.
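
The paper itself covers compiler internals and benchmarks, but for readers unfamiliar with Theano the following minimal sketch (not taken from the paper) illustrates the workflow the abstract describes: computations are specified symbolically, then compiled by Theano into an efficient low-level implementation. It assumes NumPy and Theano are installed and uses only Theano's public API.

    # Illustrative sketch of Theano's define-then-compile workflow.
    import numpy as np
    import theano
    import theano.tensor as T

    # Declare symbolic variables; no numerical values are attached yet.
    x = T.dmatrix('x')
    w = T.dvector('w')

    # Build a symbolic expression; Theano records it as a computation graph.
    y = T.nnet.sigmoid(T.dot(x, w))

    # Compile the graph into an optimized callable (generating C code,
    # and GPU code when a GPU backend is configured).
    f = theano.function(inputs=[x, w], outputs=y)

    # Execute the compiled function on concrete NumPy arrays.
    print(f(np.random.randn(3, 4), np.random.randn(4)))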

Related research

08/29/2019 · TapirXLA: Embedding Fork-Join Parallelism into the XLA Compiler in TensorFlow Using Tapir
This work introduces TapirXLA, a replacement for TensorFlow's XLA compil...

04/04/2021 · LAGraph: Linear Algebra, Network Analysis Libraries, and the Study of Graph Algorithms
Graph algorithms can be expressed in terms of linear algebra. GraphBLAS ...

06/28/2022 · Memory Safe Computations with XLA Compiler
Software packages like TensorFlow and PyTorch are designed to support li...

05/09/2016 · Theano: A Python framework for fast computation of mathematical expressions
Theano is a Python library that allows to define, optimize, and evaluate...

05/07/2020 · TIRAMISU: A Polyhedral Compiler for Dense and Sparse Deep Learning
In this paper, we demonstrate a compiler that can optimize sparse and re...

03/05/2013 · GURLS: a Least Squares Library for Supervised Learning
We present GURLS, a least squares, modular, easy-to-extend software libr...

04/10/2019 · Compiling a Calculus for Relaxed Memory: Practical constraint-based low-level concurrency
Crary and Sullivan's Relaxed Memory Calculus (RMC) proposed a new declar...