Visual Search Asymmetry: Deep Nets and Humans Share Similar Inherent Biases

06/05/2021
by Shashi Kant Gupta, et al.

Visual search is a ubiquitous and often challenging daily task, exemplified by looking for the car keys at home or a friend in a crowd. An intriguing property of some classical search tasks is an asymmetry such that finding a target A among distractors B can be easier than finding B among A. To elucidate the mechanisms responsible for asymmetry in visual search, we propose a computational model that takes a target and a search image as inputs and produces a sequence of eye movements until the target is found. The model integrates eccentricity-dependent visual recognition with target-dependent top-down cues. We compared the model against human behavior in six paradigmatic search tasks that show asymmetry in humans. Without prior exposure to the stimuli or task-specific training, the model provides a plausible mechanism for search asymmetry. We hypothesized that the polarity of search asymmetry arises from experience with the natural environment. We tested this hypothesis by training the model on an augmented version of ImageNet in which the biases of natural images were either removed or reversed. The polarity of search asymmetry disappeared or was altered depending on the training protocol. This study highlights how classical perceptual properties can emerge in neural network models, without the need for task-specific training, but rather as a consequence of the statistical properties of the developmental diet fed to the model. All source code and stimuli are publicly available at https://github.com/kreimanlab/VisualSearchAsymmetry.
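To make the search loop described above concrete, here is a minimal, hypothetical sketch of such an architecture: a greedy fixation loop that combines a target-similarity map (top-down cue) with an eccentricity-dependent acuity fall-off and inhibition of return. This is not the authors' model; the function `visual_search` and all parameter names (`sigma`, `tol`, `max_fixations`) are illustrative assumptions.

```python
import numpy as np

def visual_search(search_img, target_patch, target_pos, max_fixations=20,
                  sigma=4.0, tol=2.0):
    """Illustrative greedy fixation loop (NOT the paper's model).

    At each step, recognition acuity falls off with eccentricity from the
    current fixation; the next fixation lands where the acuity-weighted
    similarity to the target template is highest.
    """
    H, W = search_img.shape
    th, tw = target_patch.shape
    # Top-down similarity map: raw cross-correlation with the target template.
    sim = np.zeros((H - th + 1, W - tw + 1))
    for i in range(sim.shape[0]):
        for j in range(sim.shape[1]):
            sim[i, j] = np.sum(search_img[i:i + th, j:j + tw] * target_patch)

    fix = (sim.shape[0] // 2, sim.shape[1] // 2)   # search starts at centre
    fixations = [fix]
    visited = np.zeros(sim.shape, dtype=bool)
    yy, xx = np.mgrid[0:sim.shape[0], 0:sim.shape[1]]

    for _ in range(max_fixations):
        # Eccentricity-dependent acuity: a Gaussian fall-off away from fixation.
        ecc2 = (yy - fix[0]) ** 2 + (xx - fix[1]) ** 2
        score = sim * np.exp(-ecc2 / (2 * sigma ** 2))
        score[visited] = -np.inf                   # inhibition of return
        fix = np.unravel_index(np.argmax(score), score.shape)
        visited[fix] = True
        fixations.append(fix)
        if np.hypot(fix[0] - target_pos[0], fix[1] - target_pos[1]) <= tol:
            break                                  # target found
    return fixations
```

Because the acuity term penalizes peripheral locations, the same similarity map can yield different fixation sequences depending on the starting point, which is one way an A-among-B search can take more fixations than B-among-A.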


