STANNIS: Low-Power Acceleration of Deep Neural Network Training Using Computational Storage

02/17/2020
by Ali HeydariGorji, et al.

This paper proposes a framework for distributed, in-storage training of neural networks on clusters of computational storage devices. Such devices not only contain hardware accelerators but also eliminate data movement between the host and storage, resulting in both improved performance and power savings. More importantly, this in-storage approach to training ensures that private data never leaves the storage while fully controlling the sharing of public data. Experimental results show up to 2.7x speedup, a 69% reduction in energy consumption, and no significant loss in accuracy.
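The abstract's central architectural claim is that training is distributed across the storage devices themselves, so each device processes its private data in place and only model updates ever cross the device boundary. Below is a minimal, illustrative sketch of that data-parallel pattern, not the paper's actual STANNIS implementation: it simulates a handful of CSDs in plain NumPy with a logistic-regression stand-in for a DNN, and the device count, shard size, and hyperparameters are arbitrary choices for the demo.

```python
# Illustrative sketch (NOT the paper's STANNIS framework): distributed
# data-parallel training where each simulated computational storage device
# (CSD) trains on its own private shard and shares only gradients, never
# raw data -- mirroring the claim that private data never leaves storage.
import numpy as np

rng = np.random.default_rng(0)

NUM_DEVICES = 4           # hypothetical number of CSDs in the cluster
FEATURES, LR, STEPS = 8, 0.5, 200

# Ground-truth weights used only to synthesize labels for this demo.
w_true = rng.normal(size=FEATURES)

def make_device_shard(samples=256):
    """A private shard that conceptually never leaves its CSD."""
    X = rng.normal(size=(samples, FEATURES))
    y = (X @ w_true > 0).astype(float)
    return X, y

shards = [make_device_shard() for _ in range(NUM_DEVICES)]
w = np.zeros(FEATURES)    # shared (public) model parameters

def local_gradient(X, y, w):
    """Logistic-regression gradient computed entirely on-device."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    return X.T @ (p - y) / len(y)

for _ in range(STEPS):
    # Only gradients cross the device boundary; raw shards never do.
    grads = [local_gradient(X, y, w) for X, y in shards]
    w -= LR * np.mean(grads, axis=0)

# Evaluate per device, then aggregate, so data again stays local.
acc = np.mean([np.mean(((X @ w) > 0) == y) for X, y in shards])
print(f"aggregate accuracy: {acc:.3f}")
```

The property the sketch preserves is the one the abstract emphasizes: raw shards are touched only by code running "on" their own device, and the only traffic between devices and the host is gradients (equivalently, weights), which is what makes in-storage training attractive for both bandwidth and privacy.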

Related research

In-storage Processing of I/O Intensive Applications on Computational Storage Drives (12/23/2021)
Computational storage drives (CSD) are solid-state drives (SSD) empowere...

HyperTune: Dynamic Hyperparameter Tuning For Efficient Distribution of DNN Training Over Heterogeneous Systems (07/16/2020)
Distributed training is a novel approach to accelerate Deep Neural Netwo...

Domain-Specific Computational Storage for Serverless Computing (03/06/2023)
While (1) serverless computing is emerging as a popular form of cloud ex...

Pre-Defined Sparse Neural Networks with Hardware Acceleration (12/04/2018)
Neural networks have proven to be extremely powerful tools for modern ar...

The Dirty Secret of SSDs: Embodied Carbon (07/08/2022)
Scalable Solid-State Drives (SSDs) have revolutionized the way we store ...

An Analytical Model-based Capacity Planning Approach for Building CSD-based Storage Systems (06/07/2023)
The data movement in large-scale computing facilities (from compute node...
