resnet-1k-layers
Deep Residual Networks with 1K Layers
Deep residual networks have emerged as a family of extremely deep architectures showing compelling accuracy and nice convergence behaviors. In this paper, we analyze the propagation formulations behind the residual building blocks, which suggest that the forward and backward signals can be directly propagated from one block to any other block, when using identity mappings as the skip connections and after-addition activation. A series of ablation experiments support the importance of these identity mappings. This motivates us to propose a new residual unit, which makes training easier and improves generalization. We report improved results using a 1001-layer ResNet on CIFAR-10 (4.62% error). Code is available at: https://github.com/KaimingHe/resnet-1k-layers
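The key property described above, that an identity skip connection lets the forward signal reach any later block unchanged, can be sketched in a few lines. This is a minimal NumPy illustration, not the paper's implementation: the residual branch `F` here is a hypothetical two-matrix stand-in for the unit's convolution/BN layers, with batch normalization omitted for brevity. Note the "pre-activation" ordering: the ReLU is applied inside the residual branch before the weights, while the skip path stays a pure identity.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def pre_act_unit(x, W1, W2):
    # Pre-activation residual unit (sketch): the activation sits on
    # the residual branch, before the weight layers, so the skip
    # path is a pure identity:  x_{l+1} = x_l + F(x_l).
    # W1, W2 are hypothetical stand-ins for the unit's weight layers.
    return x + relu(x) @ W1 @ W2

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
W1 = rng.standard_normal((8, 8)) * 0.1
W2 = rng.standard_normal((8, 8)) * 0.1

# Stacking units: the output equals the input plus a sum of residual
# terms, so the identity signal propagates directly through every
# unit -- the property the paper's propagation analysis relies on.
out = x
residual_sum = np.zeros_like(x)
for _ in range(3):
    residual_sum += pre_act_unit(out, W1, W2) - out
    out = pre_act_unit(out, W1, W2)

# out decomposes exactly as x + (sum of per-unit residuals)
assert np.allclose(out, x + residual_sum)
```

The decomposition `out = x + sum(F_i)` is what distinguishes this design from one where an activation or scaling sits on the skip path; in that case the input would be transformed at every unit and could no longer propagate directly.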
Recreating the Deep Residual Network in Lasagne
ResNet for Cifar10
MatConvNet Implementation for Deep Residual Networks
Deep Residual Networks for MatConvNet