Alpha-Net: Architecture, Models, and Applications

07/05/2020
by Jishan Shaikh, et al.

Training deep learning networks is usually computationally expensive and conceptually complex. We present a novel network architecture for custom training and weight evaluation. We reformulate the layers as ResNet-like blocks, each with its own inputs and outputs; these blocks (called Alpha blocks) form their own network through their connection configuration and, combined with our novel loss function and normalization function, constitute the complete Alpha-Net architecture. We provide an empirical mathematical formulation of the network loss function to better understand accuracy estimation and to enable further optimization. We implement Alpha-Net with four different layer configurations to characterize the architecture's behavior comprehensively. On a custom dataset derived from the ImageNet benchmark, Alpha-Net v1, v2, v3, and v4 achieve image-recognition accuracies of 78.2%, 79.1%, 79.5%, and 78.3%, respectively. Alpha-Net v3 improves accuracy by approximately 3% over ResNet-50, the previous state-of-the-art network on the ImageNet benchmark. We also present an analysis of our dataset with 256, 512, and 1024 layers and with different versions of the loss function. Input representation is also crucial for training, since initial preprocessing selects only a handful of features, making training less complex than it would otherwise be. Finally, we compare network behavior across different layer structures, loss functions, and normalization functions for better quantitative modeling of Alpha-Net.
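To make the block-based description concrete, below is a minimal PyTorch sketch of what a ResNet-like "Alpha block" with its own input and output could look like. The class name AlphaBlock, the layer widths, and the use of BatchNorm and ReLU are illustrative assumptions; the abstract does not specify the paper's actual block internals, loss function, or normalization function.

```python
import torch
import torch.nn as nn


class AlphaBlock(nn.Module):
    """Hypothetical ResNet-like block with its own input/output.

    This is only an illustrative sketch of the block structure described in
    the abstract; the actual Alpha-Net block design is defined in the paper.
    """

    def __init__(self, in_channels: int, out_channels: int, stride: int = 1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, out_channels, kernel_size=3,
                               stride=stride, padding=1, bias=False)
        self.norm1 = nn.BatchNorm2d(out_channels)  # placeholder for the paper's normalization
        self.conv2 = nn.Conv2d(out_channels, out_channels, kernel_size=3,
                               stride=1, padding=1, bias=False)
        self.norm2 = nn.BatchNorm2d(out_channels)
        self.relu = nn.ReLU(inplace=True)

        # Projection shortcut when the block changes shape, as in ResNet.
        self.shortcut = nn.Identity()
        if stride != 1 or in_channels != out_channels:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_channels, out_channels, kernel_size=1,
                          stride=stride, bias=False),
                nn.BatchNorm2d(out_channels),
            )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.norm1(self.conv1(x)))
        out = self.norm2(self.conv2(out))
        # Residual connection: block output is added to its (projected) input.
        return self.relu(out + self.shortcut(x))


if __name__ == "__main__":
    block = AlphaBlock(64, 128, stride=2)
    y = block(torch.randn(1, 64, 56, 56))
    print(y.shape)  # torch.Size([1, 128, 28, 28])
```

In the architecture sketched by the abstract, several such blocks would be wired together according to a connection configuration to form the full network, with the loss and normalization functions supplied by Alpha-Net itself.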

