SBNet: Sparse Blocks Network for Fast Inference

01/07/2018
by   Mengye Ren, et al.

Conventional deep convolutional neural networks (CNNs) apply convolution operators uniformly in space across all feature maps for hundreds of layers, which incurs a high computational cost for real-time applications. For many problems, such as object detection and semantic segmentation, we are able to obtain a low-cost computation mask, either from a priori problem knowledge or from a low-resolution segmentation network. We show that such computation masks can be used to reduce computation in the high-resolution main network. Variants of sparse-activation CNNs have previously been explored on small-scale tasks and showed no degradation in object classification accuracy, but gains were often measured in theoretical FLOPs without realizing a practical speed-up over highly optimized dense convolution implementations. In this work, we leverage the sparsity structure of computation masks and propose a novel tiling-based sparse convolution algorithm. We verify the effectiveness of our sparse CNN on LiDAR-based 3D object detection, and we report significant wall-clock speed-ups compared to dense convolution, as well as improved detection accuracy.
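The tiling-based scheme described above can be sketched in plain NumPy: downsample the computation mask into a block-level "reduce mask", gather each active tile together with a one-pixel halo, run a dense 3x3 convolution on the tile, and scatter the result back into the output. This is a minimal illustrative sketch, not the paper's GPU implementation; the function names (`reduce_mask`, `sparse_conv3x3`) and the block size are assumptions for illustration.

```python
import numpy as np

def reduce_mask(mask, block_size):
    # A block is "active" if any pixel of the computation mask inside it
    # is nonzero. Returns the (row, col) indices of active blocks.
    H, W = mask.shape
    bh, bw = H // block_size, W // block_size
    blocks = mask[:bh * block_size, :bw * block_size]
    blocks = blocks.reshape(bh, block_size, bw, block_size)
    active = blocks.max(axis=(1, 3)) > 0
    return np.argwhere(active)

def sparse_conv3x3(x, weight, mask, block_size=8):
    """Sparse 'same' 3x3 convolution: gather active tiles (plus a
    1-pixel halo), convolve each tile densely, scatter results back.
    Inactive regions of the output remain zero.
    x: (C_in, H, W), weight: (C_out, C_in, 3, 3), mask: (H, W)."""
    C_out = weight.shape[0]
    _, H, W = x.shape
    out = np.zeros((C_out, H, W))
    xp = np.pad(x, ((0, 0), (1, 1), (1, 1)))  # zero padding for 'same' conv
    for bi, bj in reduce_mask(mask, block_size):
        i0, j0 = bi * block_size, bj * block_size
        # Gather: tile plus halo from the padded input.
        tile = xp[:, i0:i0 + block_size + 2, j0:j0 + block_size + 2]
        # Dense convolution on the tile (naive loops for clarity),
        # scattered directly into the output.
        for oi in range(block_size):
            for oj in range(block_size):
                patch = tile[:, oi:oi + 3, oj:oj + 3]
                out[:, i0 + oi, j0 + oj] = np.tensordot(
                    weight, patch, axes=([1, 2, 3], [0, 1, 2]))
    return out
```

With a mostly-zero mask, only the active tiles are convolved, which is where the wall-clock savings come from in an optimized implementation; the halo ensures each tile's convolution matches the dense result at tile boundaries.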


