ImageNet pre-trained models with batch normalization

12/05/2016
by Marcel Simon et al.

Convolutional neural networks (CNNs) pre-trained on ImageNet are the backbone of most state-of-the-art computer vision approaches. In this paper, we present a new set of pre-trained models with popular state-of-the-art architectures for the Caffe framework. The first release includes Residual Networks (ResNets) with a generation script, as well as batch-normalization variants of AlexNet and VGG19. All models outperform previously published models with the same architecture. The models and training code are available at http://www.inf-cv.uni-jena.de/Research/CNN+Models.html and https://github.com/cvjena/cnn-models.
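As a rough sketch of how one of the released models might be used from Python, the snippet below loads a network with Caffe's pycaffe interface and runs a forward pass. The file names VGG19_BN.prototxt and VGG19_BN.caffemodel, and the blob names 'data' and 'prob', are placeholder assumptions standing in for whatever the actual release at https://github.com/cvjena/cnn-models provides.

    import numpy as np
    import caffe

    caffe.set_mode_cpu()

    # Load the network definition and pre-trained weights.
    # File names are placeholders; substitute the deploy .prototxt and
    # .caffemodel shipped with the release.
    net = caffe.Net('VGG19_BN.prototxt', 'VGG19_BN.caffemodel', caffe.TEST)

    # Feed a single 224x224 BGR image (already mean-subtracted) and
    # classify it. A zero array stands in for real preprocessed input.
    image = np.zeros((1, 3, 224, 224), dtype=np.float32)
    net.blobs['data'].reshape(*image.shape)
    net.blobs['data'].data[...] = image

    # forward() returns a dict of output blobs; 'prob' is the usual
    # softmax output name in classification deploy definitions.
    output = net.forward()
    print('predicted class:', output['prob'].argmax())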
