Learning Multiple Categories on Deep Convolution Networks

02/21/2018
by Mohamed Hajaj, et al.

Deep convolutional networks have proved very successful on large datasets such as the 1000-class ImageNet, where the error rate grows only slowly as the dataset size increases. The experiments presented here may explain why these networks are so effective at large recognition problems. When the large task is composed of multiple smaller tasks, the results show that a deep convolutional network can decompose the complex task into its smaller constituents and learn them simultaneously: the performance of solving the big task on a single network is very close to the average performance of solving each of the smaller tasks on a separate network. The experiments also show the advantage of using task-specific (category) labels in combination with class labels.
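The abstract's last point, combining category (task) labels with class labels, suggests training a single network against both a coarse and a fine objective. The page does not show the authors' implementation, so the PyTorch sketch below is only an illustration of that idea under stated assumptions: the architecture, the label counts, and the equal weighting of the two losses are all hypothetical.

```python
import torch
import torch.nn as nn

class DualHeadCNN(nn.Module):
    """Illustrative CNN with two classification heads: one for coarse
    category (task) labels and one for fine class labels. Layer sizes
    are placeholders, not the paper's architecture."""

    def __init__(self, num_categories: int, num_classes: int):
        super().__init__()
        # Shared convolutional trunk learned by both objectives jointly.
        self.trunk = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
        )
        self.category_head = nn.Linear(64, num_categories)
        self.class_head = nn.Linear(64, num_classes)

    def forward(self, x):
        features = self.trunk(x).flatten(1)
        return self.category_head(features), self.class_head(features)

# One joint training step: the total loss adds the coarse category
# loss to the fine class loss (equal weighting is an assumption).
model = DualHeadCNN(num_categories=10, num_classes=100)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

images = torch.randn(8, 3, 32, 32)            # dummy batch
category_labels = torch.randint(0, 10, (8,))  # coarse task labels
class_labels = torch.randint(0, 100, (8,))    # fine class labels

cat_logits, cls_logits = model(images)
loss = criterion(cat_logits, category_labels) + criterion(cls_logits, class_labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

A shared trunk with two heads lets one network learn the coarse categories and the fine classes simultaneously, which mirrors the task decomposition described above.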

Related research

NetTailor: Tuning the Architecture, Not Just the Weights (06/29/2019)
Real-world applications of object recognition often require the solution...

Batch Normalization and the impact of batch structure on the behavior of deep convolution networks (02/21/2018)
Batch normalization was introduced in 2015 to speed up training of deep ...

Universality of underlying mechanism for successful deep learning (09/14/2023)
An underlying mechanism for successful deep learning (DL) with a limited...

Decomposed Prompting: A Modular Approach for Solving Complex Tasks (10/05/2022)
Few-shot prompting is a surprisingly powerful way to use Large Language ...

SdcNet: A Computation-Efficient CNN for Object Recognition (05/03/2018)
Extracting features from a huge amount of data for object recognition is...

Dense Prediction on Sequences with Time-Dilated Convolutions for Speech Recognition (11/28/2016)
In computer vision pixelwise dense prediction is the task of predicting ...
