User-driven mobile robot storyboarding: Learning image interest and saliency from pairwise image comparisons

06/19/2017
by Michael Burke, et al.

This paper describes a novel storyboarding scheme that uses a model trained on pairwise image comparisons to identify images likely to be of interest to a mobile robot user. Traditional storyboarding schemes typically attempt to summarise robot observations using predefined novelty or image quality objectives, but we propose a user training stage that allows the incorporation of user interest when storyboarding. Our approach dramatically reduces the number of image comparisons required to infer image interest by applying a Gaussian process smoothing algorithm on image features extracted using a pre-trained convolutional neural network. As a particularly valuable by-product, the proposed approach allows the generation of user-specific saliency or attention maps.
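The abstract sketches the pipeline only at a high level. Below is a minimal, illustrative sketch of that idea, assuming images are already encoded as fixed-length feature vectors from a pre-trained CNN and that user training produces (winner, loser) index pairs. The simple win-rate scoring and the scikit-learn Gaussian process used here are stand-ins, not the authors' implementation.

```python
# Minimal sketch, not the paper's implementation.
# Assumes: `features` holds pre-trained CNN feature vectors (one row per image),
# and `comparisons` is a list of (winner_idx, loser_idx) pairs from user training.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def interest_from_comparisons(features, comparisons, n_images):
    """Infer per-image interest scores and smooth them over CNN feature space."""
    # Crude per-image win rate as a stand-in for a full probabilistic
    # preference model inferred from pairwise comparisons.
    wins = np.zeros(n_images)
    games = np.zeros(n_images)
    for w, l in comparisons:
        wins[w] += 1.0
        games[w] += 1.0
        games[l] += 1.0
    raw_scores = np.divide(wins, games,
                           out=np.full(n_images, 0.5), where=games > 0)

    # Gaussian process regression over CNN features: visually similar images
    # receive similar interest scores, so far fewer comparisons are needed.
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=10.0) + WhiteKernel(),
                                  normalize_y=True)
    observed = games > 0
    gp.fit(features[observed], raw_scores[observed])

    # Predicted interest for every image, including those never compared.
    return gp.predict(features)
```

Because the Gaussian process smooths scores over feature space, interest propagates to visually similar images that were never directly compared, which is the mechanism by which the number of required comparisons is reduced.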
