Deep Stochastic Attraction and Repulsion Embedding for Image Based Localization

08/27/2018
by   Liu Liu, et al.

This paper tackles the problem of large-scale image-based localization (IBL), where the geographical location at which a query image was taken is estimated by retrieving, from a large database, geo-tagged reference images depicting the same place. An important yet under-researched issue for this problem is how to learn discriminative image representations that are best tailored to the task of geo-localization. Aiming to find a novel image representation with higher location-discriminating power, this paper presents the following contributions: 1) we represent a place (location) as a set of exemplar images depicting the same landmarks, instead of as pre-defined geographic cells obtained by partitioning the world; 2) we advocate competitive learning among places, directly via feature embeddings, to maximize similarities among intra-class images while minimizing similarities among inter-class images. This is a significant departure from state-of-the-art IBL methods based on the triplet ranking loss, which only enforce that intra-place visual similarities be greater than inter-place ones; 3) we propose a new Stochastic Attraction and Repulsion Embedding (SARE) loss function to facilitate this competitive learning. The SARE loss is easy to implement and can be plugged into any Convolutional Neural Network. Experiments show that the method improves localization performance on standard benchmarks by a large margin.
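The abstract does not give the exact SARE formulation, but a loss that simultaneously attracts a query embedding to an intra-place exemplar and repels it from inter-place ones can be sketched as a softmax cross-entropy over embedding distances. The following is a minimal NumPy illustration of that idea, not the authors' implementation; the function name and the use of squared Euclidean distance are assumptions for the sketch.

```python
import numpy as np

def attraction_repulsion_loss(query, positive, negatives):
    """Sketch of a stochastic attraction-repulsion objective.

    query:     (d,) embedding of the query image
    positive:  (d,) embedding of an image of the same place
    negatives: (k, d) embeddings of images of other places

    A softmax over negated distances turns the match into a
    classification problem: minimizing the negative log-probability
    of the positive pulls it toward the query (attraction) while
    pushing all negatives away (repulsion).
    """
    d_pos = np.sum((query - positive) ** 2)
    d_negs = np.sum((query - negatives) ** 2, axis=1)
    logits = -np.concatenate(([d_pos], d_negs))
    # Numerically stable log-softmax of the positive's logit
    m = logits.max()
    log_prob_pos = logits[0] - (m + np.log(np.sum(np.exp(logits - m))))
    return -log_prob_pos

# Toy usage: a positive near the query should give a lower loss
# than a positive drawn at random.
rng = np.random.default_rng(0)
q = rng.normal(size=16)
p_close = q + 0.05 * rng.normal(size=16)
p_far = rng.normal(size=16)
negs = rng.normal(size=(5, 16))
loss_close = attraction_repulsion_loss(q, p_close, negs)
loss_far = attraction_repulsion_loss(q, p_far, negs)
```

Because the loss is the negative log of a probability, it is always positive, and shrinking the query-positive distance relative to the query-negative distances drives it toward zero.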
