
Scaling Wide Residual Networks for Panoptic Segmentation

11/23/2020
by Liang-Chieh Chen et al.

Wide Residual Networks (Wide-ResNets), a shallow but wide variant of Residual Networks (ResNets) built by stacking a small number of residual blocks with large channel sizes, have demonstrated outstanding performance on multiple dense prediction tasks. However, since it was proposed, the Wide-ResNet architecture has barely evolved. In this work, we revisit its architecture design for the challenging task of panoptic segmentation, which unifies semantic segmentation and instance segmentation. A baseline model is obtained by incorporating the simple and effective Squeeze-and-Excitation and Switchable Atrous Convolution modules into the Wide-ResNets. Its network capacity is then scaled up or down by adjusting the width (i.e., channel size) and depth (i.e., number of layers), resulting in a family of SWideRNets (short for Scaling Wide Residual Networks). We demonstrate that this simple scaling scheme, coupled with a grid search, identifies several SWideRNets that significantly advance the state of the art on panoptic segmentation datasets in both the fast and the strong model regimes.
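The abstract combines three concrete ingredients: Squeeze-and-Excitation (SE) channel gating, Switchable Atrous Convolution (SAC), and width/depth multipliers. Below is a minimal PyTorch sketch of those ideas, assuming a pre-activation wide residual block; the module names, the single-switch SAC simplification, and the scaling helper are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only, not the authors' code. It shows a wide residual
# block augmented with SE and a simplified SAC, plus width/depth multipliers
# for scaling a stage up or down.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SqueezeExcite(nn.Module):
    """Channel reweighting: global pool -> bottleneck MLP -> sigmoid gate."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)

class SwitchableAtrousConv(nn.Module):
    """Simplified SAC: apply the same 3x3 weights at two atrous rates and
    blend the outputs with a learned, spatially varying switch (cf. DetectoRS)."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.switch = nn.Sequential(nn.Conv2d(channels, 1, 1), nn.Sigmoid())

    def forward(self, x):
        s = self.switch(x)
        small = self.conv(x)                                 # dilation 1
        large = F.conv2d(x, self.conv.weight,                # same weights,
                         padding=3, dilation=3)              # dilation 3
        return s * small + (1 - s) * large

class WideSEBlock(nn.Module):
    """Pre-activation wide residual block: SAC conv -> 3x3 conv -> SE."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True),
            SwitchableAtrousConv(channels),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            SqueezeExcite(channels),
        )

    def forward(self, x):
        return x + self.body(x)

def swidernet_stage(base_channels, base_depth, width_mult=1.0, depth_mult=1.0):
    """Scale one stage by multiplying its channel size (width) and its
    number of blocks (depth) -- the two knobs the paper grid-searches."""
    channels = max(1, int(round(base_channels * width_mult)))
    depth = max(1, int(round(base_depth * depth_mult)))
    return nn.Sequential(*[WideSEBlock(channels) for _ in range(depth)])

# Example: trade depth for width within one stage.
stage = swidernet_stage(base_channels=256, base_depth=3,
                        width_mult=2.0, depth_mult=0.5)
out = stage(torch.randn(1, 512, 64, 64))  # input channels match scaled width
```

Varying `width_mult` and `depth_mult` per stage and sweeping the combinations is the kind of grid search the paper describes for producing the SWideRNet family.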
