Balancing Generalization and Specialization in Zero-shot Learning
Zero-Shot Learning (ZSL) aims to transfer classification capability from seen to unseen classes. Recent methods have shown that generalization and specialization are two essential abilities for good ZSL performance. However, existing methods focus on only one of these abilities, yielding models that are either too general, with degraded classification ability, or too specialized to generalize to unseen classes. In this paper, we propose an end-to-end network with balanced generalization and specialization abilities, termed BGSNet, which exploits both abilities and balances them at the instance and dataset levels. Specifically, BGSNet consists of two branches: the Generalization Network (GNet), which applies episodic meta-learning to acquire generalized knowledge, and the Balanced Specialization Network (BSNet), which adopts multiple attentive extractors to extract discriminative features and achieve instance-level balance. A novel self-adjusting diversity loss is designed to optimize BSNet with less redundancy and more diversity. We further propose a differentiable dataset-level balance whose weights are updated with a linear annealing schedule to simulate network pruning, thus obtaining the optimal structure for BSNet at low cost while achieving dataset-level balance. Experiments on four benchmark datasets demonstrate the effectiveness of our model. Extensive component ablations confirm the necessity of integrating generalization and specialization abilities.
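As a rough illustration only, since the abstract does not spell out the loss, the sketch below shows one common way to encourage diversity among the features produced by multiple attentive extractors: penalizing their pairwise cosine similarity. The function name, tensor shapes, and PyTorch usage are assumptions for illustration; the paper's "self-adjusting" weighting of this term is not modeled here.

```python
import torch
import torch.nn.functional as F

def pairwise_diversity_loss(features: torch.Tensor) -> torch.Tensor:
    """Hypothetical diversity penalty over K attentive extractors.

    features: tensor of shape (B, K, D) -- batch size, number of
    attentive extractors, feature dimension. Lower values mean the
    extractors produce less redundant (more diverse) features.
    """
    # L2-normalize each extractor's feature vector
    f = F.normalize(features, dim=-1)               # (B, K, D)
    # Cosine similarity between every pair of extractors
    sim = torch.matmul(f, f.transpose(1, 2))        # (B, K, K)
    K = features.size(1)
    # Zero out self-similarity on the diagonal
    off_diag = sim - torch.eye(K, device=sim.device)
    # Average absolute pairwise similarity across the batch
    return off_diag.abs().sum(dim=(1, 2)).mean() / (K * (K - 1))

# Example usage with dummy features: 32 images, 4 extractors, 512-dim
feats = torch.randn(32, 4, 512)
loss_div = pairwise_diversity_loss(feats)
```

In practice such a term would be added to the classification objective with a weighting coefficient; how BGSNet adjusts that weighting is described in the full paper, not here.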