
3DQ: Compact Quantized Neural Networks for Volumetric Whole Brain Segmentation

by Magdalini Paschali, et al.

Model architectures have grown dramatically in size, improving performance at the cost of increased resource requirements. In this paper we propose 3DQ, a ternary quantization method, applied for the first time to 3D Fully Convolutional Neural Networks (F-CNNs), enabling 16x model compression while maintaining performance on par with full-precision models. We extensively evaluate 3DQ on two datasets for the challenging task of whole brain segmentation. Additionally, we showcase our method's ability to generalize across two common 3D architectures, namely 3D U-Net and V-Net. Outperforming a variety of baselines, the proposed method compresses large 3D models to a few megabytes, alleviating storage needs in space-critical applications.
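To make the idea concrete: ternary quantization restricts each weight to three values {-α, 0, +α}, so a weight can be stored in 2 bits instead of 32, which is where the roughly 16x compression comes from. The sketch below shows a generic threshold-based ternarization in the spirit of Ternary Weight Networks; the exact thresholding and scaling used by 3DQ may differ, and the `delta_scale` value is an illustrative assumption, not taken from the paper.

```python
import numpy as np

def ternarize(weights, delta_scale=0.7):
    """Generic threshold-based ternary quantization (illustrative sketch,
    not necessarily the exact 3DQ scheme).

    Weights above +delta map to +alpha, below -delta map to -alpha,
    and the rest are zeroed out.
    """
    # Per-tensor threshold: a fraction of the mean absolute weight
    # (0.7 follows a common heuristic from the ternarization literature).
    delta = delta_scale * np.mean(np.abs(weights))
    mask_pos = weights > delta
    mask_neg = weights < -delta
    # Scale factor: mean magnitude of the retained weights, which
    # minimizes the L2 error to the full-precision tensor for this support.
    retained = np.abs(weights[mask_pos | mask_neg])
    alpha = retained.mean() if retained.size else 0.0
    q = np.zeros_like(weights)
    q[mask_pos] = alpha
    q[mask_neg] = -alpha
    return q, alpha

w = np.array([0.9, -0.8, 0.05, -0.02, 0.6])
q, alpha = ternarize(w)
# q contains at most three distinct values: {-alpha, 0, +alpha}
```

In practice such quantizers are applied to the convolution kernels of each layer during training (keeping full-precision shadow weights for the gradient updates), and only the 2-bit codes plus one scale per layer are stored at deployment time.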

