Does Optimal Source Task Performance Imply Optimal Pre-training for a Target Task?

06/21/2021
by   Steven Gutstein, et al.

Pre-trained deep nets are commonly used to improve accuracies and training times for neural nets. It is generally assumed that pre-training a net for optimal source task performance best prepares it to learn an arbitrary target task. We show that this is generally not true: stopping source task training prior to optimal performance can create a pre-trained net better suited for learning a new task. We performed several experiments demonstrating this effect, as well as the influence of the amount of training and of the learning rate. Additionally, we show that this reflects a general loss of learning ability that extends even to relearning the source task.
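The core experimental idea (pre-train on a source task, snapshot the weights at several stopping points, then fine-tune each snapshot on a target task and compare) can be illustrated with a minimal toy sketch. This is a hypothetical setup using linear regression with numpy, not the paper's actual deep-net experiments; the tasks, dimensions, epoch counts, and learning rates here are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two related toy regression tasks: the target weights are a perturbed
# copy of the source weights (hypothetical; the paper uses deep nets).
d = 20
w_source = rng.normal(size=d)
w_target = w_source + 0.5 * rng.normal(size=d)

def make_data(w, n):
    X = rng.normal(size=(n, d))
    y = X @ w + 0.1 * rng.normal(size=n)
    return X, y

Xs, ys = make_data(w_source, n=200)
Xt, yt = make_data(w_target, n=20)  # small target set, typical of transfer settings

def train(X, y, w, lr=0.01, epochs=1, snapshots=None):
    """Full-batch gradient descent on squared error; optionally save snapshots."""
    saved = []
    for e in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
        if snapshots and (e + 1) in snapshots:
            saved.append((e + 1, w.copy()))
    return w, saved

# Pre-train on the source task, snapshotting weights at several stopping points.
_, snaps = train(Xs, ys, np.zeros(d), epochs=400, snapshots={10, 50, 400})

# Fine-tune each snapshot briefly on the target task and compare target loss.
for epoch, w0 in snaps:
    w, _ = train(Xt, yt, w0, epochs=20)
    loss = np.mean((Xt @ w - yt) ** 2)
    print(f"source epochs={epoch:4d}  target loss={loss:.3f}")
```

Whether an earlier snapshot wins depends on how related the tasks are and how long fine-tuning runs; the paper's point is that the fully trained source net is not automatically the best starting point.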
