A parallel Fortran framework for neural networks and deep learning

02/18/2019
by Milan Curcic, et al.

This paper describes neural-fortran, a parallel Fortran framework for neural networks and deep learning. It features a simple interface to construct feed-forward neural networks of arbitrary structure and size, several activation functions, and stochastic gradient descent as the default optimization algorithm. Neural-fortran also leverages the Fortran 2018 standard collective subroutines to achieve data-based parallelism on shared- or distributed-memory machines. First, I describe the implementation of neural networks with Fortran derived types, whole-array arithmetic, and collective sum and broadcast operations to achieve parallelism. Second, I demonstrate the use of neural-fortran in an example of recognizing hand-written digits from images. Finally, I evaluate the computational performance in both serial and parallel modes. Ease of use and computational performance are similar to an existing popular machine learning framework, making neural-fortran a viable candidate for further development and use in production.
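
As a rough illustration of the data-based parallelism the abstract describes, the sketch below shows how the Fortran 2018 intrinsic collective subroutines co_broadcast and co_sum can synchronize weights and average gradients across parallel images. This is a minimal standalone example, not code taken from neural-fortran; the variable names and the learning rate are illustrative assumptions.

  ! Illustrative sketch (not neural-fortran source code): data-based parallelism
  ! with Fortran 2018 collective subroutines. Variable names are hypothetical.
  program data_parallel_sketch
    implicit none
    real, allocatable :: weights(:), gradients(:)

    allocate(weights(4), gradients(4))

    ! Initialize weights on image 1, then broadcast so all images start identical.
    if (this_image() == 1) call random_number(weights)
    call co_broadcast(weights, source_image=1)

    ! Each image computes gradients on its own shard of the training data;
    ! a per-image placeholder value stands in for that computation here.
    gradients = real(this_image())

    ! Sum gradients across images and average, so every image applies the
    ! same stochastic gradient descent update.
    call co_sum(gradients)
    gradients = gradients / num_images()
    weights = weights - 0.01 * gradients

    if (this_image() == 1) print *, 'Updated weights:', weights
  end program data_parallel_sketch

Compiled with coarray support (for example, gfortran with OpenCoarrays or a single-image build via -fcoarray=single), each image holds its own copy of the arrays, and the two collectives are all that is needed to keep the copies consistent.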


