Ensemble Manifold Segmentation for Model Distillation and Semi-supervised Learning
Manifold theory has been a central concept in many learning methods. However, training modern CNNs with manifold structures has not received due attention, mainly because it is inconvenient to impose manifold structures onto CNN architectures. In this paper we present ManifoldNet, a novel method that encourages the learning of manifold-aware representations. Our approach segments the input manifold into a set of fragments. By assigning the corresponding segment id to every sample as a pseudo label, we convert the problem of preserving the local manifold structure into a point-wise classification task. Because the segmentation is unsupervised, it tends to be noisy. We mitigate this by introducing ensemble manifold segmentation (EMS). EMS accounts for the manifold structure by dividing the training data into an ensemble of classification training sets, each containing samples in local proximity. CNNs are trained on these ensembles under a multi-task learning framework to conform to the manifold. ManifoldNet can be trained with the pseudo labels alone or together with task-specific labels. We evaluate ManifoldNet on two tasks: network imitation (distillation) and semi-supervised learning. Our experiments show that manifold structures are effectively exploited in both unsupervised and semi-supervised settings.
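To make the pseudo-labeling idea concrete, the sketch below illustrates one way an ensemble of manifold segmentations could be produced: the feature manifold is partitioned several times with differently initialized clusterings, and each sample receives the segment id of its fragment as a pseudo label for a point-wise classification task. The abstract does not specify how ManifoldNet forms its fragments or builds the ensemble of training sets, so the use of k-means here, along with the function name and parameters, is purely an assumed stand-in for illustration.

```python
# Conceptual sketch only: k-means over features is an ASSUMED stand-in for
# "segmenting the input manifold"; the paper's actual EMS procedure is not given here.
import numpy as np
from sklearn.cluster import KMeans


def ensemble_pseudo_labels(features, n_segments=50, n_views=5, seed=0):
    """Segment the feature manifold several times with different random
    initializations and return one pseudo-label assignment per segmentation."""
    rng = np.random.RandomState(seed)
    ensemble = []
    for _ in range(n_views):
        km = KMeans(n_clusters=n_segments, n_init=10,
                    random_state=rng.randint(0, 2**31 - 1))
        # Each sample's segment id serves as its pseudo label for classification.
        ensemble.append(km.fit_predict(features))
    return ensemble  # list of (n_samples,) integer label arrays


if __name__ == "__main__":
    X = np.random.randn(1000, 128)        # placeholder features (hypothetical)
    labels = ensemble_pseudo_labels(X)    # one pseudo-label set per ensemble member
    print(len(labels), labels[0].shape)
```

Each label array in the ensemble could then drive one head of a multi-task classifier, so that the network is encouraged to keep nearby samples (same fragment) close in its learned representation.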