Generative Adversarial Networks (GANs) and their variants achieve state-of-the-art performance among generative models. To capture high-dimensional distributions, the standard learning procedure requires high computational complexity and a large number of parameters. In this paper, we present a new generative adversarial framework that represents each layer as a tensor structure connected by multilinear operations, aiming to reduce the number of model parameters by a large factor while preserving the quality of the generated results. To learn the model, we develop an efficient algorithm that alternately optimizes the mode connections. Experimental results demonstrate that our model achieves a compression rate for model parameters of up to 40x compared with existing GANs.
10/30/2017 ∙ by Xingwei Cao, et al.
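The layer-as-tensor idea above can be sketched numerically. The example below is a minimal illustration, not the paper's exact construction: a dense 1024-to-1024 layer is replaced by a multilinear map with two small mode factors (the names `A`, `B`, and the 32x32 reshaping are assumptions for illustration), showing how parameter count collapses.

```python
import numpy as np

# A dense layer mapping 1024 -> 1024 units stores a full weight matrix:
dense_params = 1024 * 1024  # 1,048,576 parameters

# Tensorized alternative (a sketch of the general idea): view the input
# and output as 32x32 tensors and connect them by a multilinear map
#   Y = A @ X @ B.T
# with two small mode factors A and B (hypothetical names).
rng = np.random.default_rng(0)
A = rng.standard_normal((32, 32))
B = rng.standard_normal((32, 32))
tensor_params = A.size + B.size  # 2,048 parameters

X = rng.standard_normal((32, 32))  # input activations reshaped to a tensor
Y = A @ X @ B.T                    # one "mode connection" per tensor mode

print(dense_params // tensor_params)  # 512x fewer parameters in this layer
```

The 512x figure is for this single toy layer; the paper's reported end-to-end compression of the full model is up to 40x.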
Tensor regression networks achieve a high compression rate of model parameters in multilayer perceptrons (MLPs) with only a slight impact on performance. The tensor regression layer, which replaces the flattening operation of a traditional MLP, imposes low-rank constraints on the regression weight tensor. We investigate tensor regression networks under various low-rank tensor approximations, aiming to leverage the multi-modal structure of high-dimensional data by enforcing efficient low-rank constraints. We provide a theoretical analysis giving insight into the choice of the rank parameters. We evaluate the proposed model against state-of-the-art deep convolutional models. On the CIFAR-10 dataset, we achieve a compression rate of 0.018 while sacrificing less than 1% accuracy.
12/27/2017 ∙ by Xingwei Cao, et al.
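A tensor regression layer of the kind described above can be sketched with a CP-rank factorization. This is a minimal sketch under assumed names and shapes (`U`, `V`, `Wf`, rank `R`), not the paper's exact model: instead of flattening a (C, H, W) activation tensor and applying a dense weight vector of C*H*W entries per output, the weight tensor is constrained to rank R, needing only R*(C+H+W) parameters.

```python
import numpy as np

# Low-rank tensor regression sketch: regress a scalar output from an
# activation tensor X of shape (C, H, W) with a rank-R weight tensor
#   W = sum_r  u_r (outer) v_r (outer) w_r
rng = np.random.default_rng(0)
C, H, W, R = 64, 8, 8, 4
X = rng.standard_normal((C, H, W))   # activation tensor (not flattened)
U = rng.standard_normal((R, C))      # CP factors (hypothetical names)
V = rng.standard_normal((R, H))
Wf = rng.standard_normal((R, W))

# Inner product <X, W> computed by successive mode contractions,
# never materializing the full C*H*W weight tensor.
y = sum(U[r] @ np.einsum('chw,h,w->c', X, V[r], Wf[r]) for r in range(R))

full_params = C * H * W              # 4096 weights if X were flattened
cp_params = R * (C + H + W)          # 320 weights under the rank constraint
print(full_params / cp_params)       # 12.8x compression for this layer
```

The rank R is the tuning knob the abstract's theoretical analysis addresses: smaller R gives higher compression but a more restrictive low-rank constraint.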