JANUS: Fast and Flexible Deep Learning via Symbolic Graph Execution of Imperative Programs

12/04/2018
by Eunji Jeong, et al.

The rapid evolution of deep neural networks demands that deep learning (DL) frameworks not only satisfy the traditional requirement of executing large computations quickly, but also support straightforward programming models for implementing and experimenting with complex network structures. However, existing frameworks fail to excel in both areas simultaneously, leading to divergent efforts: some optimize performance while others improve usability. This paper presents JANUS, a system that combines the advantages of both approaches by transparently converting an imperative DL program written in Python, the de facto scripting language for DL, into an efficiently executable symbolic dataflow graph. JANUS can convert various dynamic features of Python, including dynamic control flow, dynamic types, and impure functions, into elements of a symbolic dataflow graph. Experiments demonstrate that JANUS achieves fast DL training by exploiting the optimization techniques of symbolic graph-based DL frameworks, while maintaining the simple and flexible programmability of imperative DL frameworks.
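A minimal sketch of the idea the abstract describes: an imperative training step containing data-dependent Python control flow is traced into a symbolic dataflow graph and then executed as a graph. JANUS itself is not released as a library, so the sketch below uses TensorFlow's tf.function (AutoGraph), an analogous publicly available imperative-to-graph mechanism; the model, loss threshold, and gradient-clipping rule are hypothetical choices for illustration only and are not taken from the paper.

import tensorflow as tf

# Hypothetical model and training setup, used only to give the traced
# function something concrete to run.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10),
])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

@tf.function  # traces the imperative Python body into a reusable dataflow graph
def train_step(x, y):
    with tf.GradientTape() as tape:
        logits = model(x, training=True)
        loss = loss_fn(y, logits)
    grads = tape.gradient(loss, model.trainable_variables)
    # Data-dependent control flow: AutoGraph lowers this Python `if` on a
    # tensor value into a graph conditional, which is the kind of dynamic
    # behavior the paper's graph conversion also has to capture.
    if loss > 5.0:
        grads = [tf.clip_by_norm(g, 1.0) for g in grads]
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

Inside the traced function, the Python conditional on a tensor becomes part of the symbolic graph rather than being resolved at trace time, illustrating (under the assumptions above) how imperative dynamic control flow can coexist with graph execution.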

Related research

01/23/2022
Terra: Imperative-Symbolic Co-Execution of Imperative Deep Learning Programs
Imperative programming allows users to implement their deep neural netwo...

07/11/2023
DyCL: Dynamic Neural Network Compilation Via Program Rewriting and Graph Optimization
DL compiler's primary function is to translate DNN programs written in h...

06/01/2015
Blocks and Fuel: Frameworks for deep learning
We introduce two Python frameworks to train neural networks on large dat...

08/22/2023
Towards Safe Automated Refactoring of Imperative Deep Learning Programs to Graph Execution
Efficiency is essential to support responsiveness w.r.t. ever-growing da...

10/01/2020
Symbolic Techniques for Deep Learning: Challenges and Opportunities
As the number of deep learning frameworks increase and certain ones gain...

12/11/2017
Cavs: A Vertex-centric Programming Interface for Dynamic Neural Networks
Recent deep learning (DL) models have moved beyond static network archit...

10/13/2022
Graph-based Neural Modules to Inspect Attention-based Architectures: A Position Paper
Encoder-decoder architectures are prominent building blocks of state-of-...
