Probabilistic Knowledge Transfer for Deep Representation Learning

03/28/2018
by   Nikolaos Passalis, et al.
Knowledge Transfer (KT) techniques tackle the problem of transferring the knowledge of a large and complex neural network into a smaller and faster one. However, existing KT methods are tailored to classification tasks and cannot be used efficiently for other representation learning tasks. This paper proposes a novel knowledge transfer technique that trains a student model to maintain the same amount of mutual information between the learned representation and a set of (possibly unknown) labels as the teacher model. Apart from outperforming existing KT techniques, the proposed method overcomes several limitations of existing methods, providing new insight into KT as well as enabling novel KT applications, ranging from knowledge transfer from handcrafted feature extractors to cross-modal KT from the textual modality into the representation extracted from the visual modality of the data.
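The abstract does not spell out the loss, but the core idea of matching the probabilistic structure of the teacher's representation can be illustrated with a minimal sketch: model the pairwise similarity of samples in each feature space as a conditional probability distribution and train the student to minimize the KL divergence to the teacher's distribution. All function names below are illustrative, not the paper's API, and cosine-based kernel probabilities are one possible choice of density model.

```python
import numpy as np

def cosine_pairwise_probs(feats, eps=1e-8):
    # Convert a batch of feature vectors into a row-wise conditional
    # probability distribution based on pairwise cosine similarity.
    norms = np.linalg.norm(feats, axis=1, keepdims=True)
    unit = feats / np.maximum(norms, eps)
    sim = unit @ unit.T            # cosine similarities in [-1, 1]
    sim = (sim + 1.0) / 2.0        # shift to [0, 1] so rows can be normalized
    np.fill_diagonal(sim, 0.0)     # ignore self-similarity
    return sim / np.maximum(sim.sum(axis=1, keepdims=True), eps)

def probabilistic_kt_loss(teacher_feats, student_feats, eps=1e-8):
    # KL divergence from the student's pairwise distribution to the
    # teacher's; zero when the student reproduces the teacher's geometry.
    p = cosine_pairwise_probs(teacher_feats)   # target distribution
    q = cosine_pairwise_probs(student_feats)   # student distribution
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

rng = np.random.default_rng(0)
teacher = rng.normal(size=(8, 64))       # hypothetical teacher representations
student = rng.normal(size=(8, 16))       # smaller, untrained student

identical_loss = probabilistic_kt_loss(teacher, teacher)
random_loss = probabilistic_kt_loss(teacher, student)
```

Because the loss depends only on the in-batch similarity structure (not on class labels), the same sketch applies when the teacher is a handcrafted feature extractor or a model from another modality, which is what enables the cross-modal applications mentioned above.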
