
Learning Inward Scaled Hypersphere Embedding: Exploring Projections in Higher Dimensions

10/16/2018
by Muhammad Kamran Janjua, et al. (SEECS · Uninsubria)

Most current dimensionality reduction and retrieval techniques rely on embedding learned feature representations into a computable metric space. Once the learned features are mapped, a distance metric bridges the gap between similar instances. Because these methods do not exploit scaled projection, discriminative embedding onto a hyperspace remains a challenge. In this paper, we propose to inwardly scale feature representations while projecting them onto a hypersphere manifold for discriminative analysis. We further propose a novel yet simple convolutional neural network architecture and extensively evaluate the proposed methodology on classification and retrieval tasks, obtaining results comparable to state-of-the-art techniques.
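To make the idea of a scaled hypersphere projection concrete, the sketch below L2-normalizes feature vectors and rescales them to a fixed radius, then ranks a gallery by cosine similarity for retrieval. This is a generic illustration under our own assumptions, not the paper's exact formulation; the function names and the `radius` parameter are hypothetical.

```python
import numpy as np

def hypersphere_embed(features, radius=1.0, eps=1e-12):
    """Project each row of `features` onto a hypersphere of the given
    radius: L2-normalize, then scale. A generic scaled projection, not
    necessarily the paper's inward-scaling scheme."""
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    return radius * features / np.maximum(norms, eps)

def cosine_retrieve(query, gallery, k=5):
    """Rank gallery embeddings by cosine similarity to a single query.
    Since both sides are projected to unit norm, the dot product is
    exactly the cosine similarity."""
    q = hypersphere_embed(query[None, :])[0]
    g = hypersphere_embed(gallery)
    sims = g @ q
    return np.argsort(-sims)[:k]

# Toy usage: 100 random 64-dimensional feature vectors.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(100, 64))
emb = hypersphere_embed(gallery, radius=4.0)  # all rows now have norm 4
top = cosine_retrieve(gallery[7], gallery, k=5)  # item 7 ranks itself first
```

Fixing the radius matters because it decouples the angular (discriminative) component of the embedding from feature magnitude, which is the usual motivation for hypersphere-based metric learning.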


Related research:

06/20/2018 · DEFRAG: Deep Euclidean Feature Representations through Adaptation on the Grassmann Manifold — "We propose a novel technique for training deep networks with the objecti..."

09/19/2019 · Interpretable Discriminative Dimensionality Reduction and Feature Selection on the Manifold — "Dimensionality reduction (DR) on the manifold includes effective methods..."

01/25/2018 · NDDR-CNN: Layer-wise Feature Fusing in Multi-Task CNN by Neural Discriminative Dimensionality Reduction — "State-of-the-art Convolutional Neural Network (CNN) benefits a lot from ..."

08/13/2018 · Deep Randomized Ensembles for Metric Learning — "Learning embedding functions, which map semantically related inputs to n..."

10/16/2016 · Probabilistic Dimensionality Reduction via Structure Learning — "We propose a novel probabilistic dimensionality reduction framework that..."

12/20/2013 · Learned versus Hand-Designed Feature Representations for 3d Agglomeration — "For image recognition and labeling tasks, recent results suggest that ma..."