Measuring and Understanding Sensory Representations within Deep Networks Using a Numerical Optimization Framework

02/17/2015
by Chuan-Yung Tsai, et al.

A central challenge in sensory neuroscience is describing how the activity of populations of neurons can represent useful features of the external environment. However, while neurophysiologists have long been able to record the responses of neurons in awake, behaving animals, it is another matter entirely to say what a given neuron does. A key problem is that in many sensory domains, the space of all possible stimuli that one might encounter is effectively infinite; in vision, for instance, natural scenes are combinatorially complex, and an organism will only encounter a tiny fraction of possible stimuli. As a result, even describing the response properties of sensory neurons is difficult, and investigations of neuronal function are almost always critically limited by the number of stimuli that can be considered. In this paper, we propose a closed-loop, optimization-based experimental framework for characterizing the response properties of sensory neurons, building on past efforts in closed-loop experimental methods, and leveraging recent advances in artificial neural networks to serve as a proving ground for our techniques. Specifically, using deep convolutional neural networks, we asked whether modern black-box optimization techniques can be used to interrogate the "tuning landscape" of an artificial neuron in a deep, nonlinear system, without imposing significant constraints on the space of stimuli under consideration. We introduce a series of measures to quantify the tuning landscapes, and show how these relate to the performance of the networks in an object recognition task. To the extent that deep convolutional neural networks increasingly serve as de facto working hypotheses for biological vision, we argue that developing a unified approach for studying both artificial and biological systems holds great potential to advance both fields together.
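To make the core idea concrete, the sketch below shows what black-box (gradient-free) interrogation of an artificial neuron's tuning landscape can look like in practice. It is a minimal illustration, not the paper's method: the network (torchvision's AlexNet), the chosen layer and channel, the raw-pixel stimulus parameterization, and the simple (1+λ) evolution strategy are all assumptions made here for the example.

```python
# Minimal sketch: gradient-free activation maximization for one unit of a CNN.
# Assumptions (not from the paper): torchvision AlexNet, layer features[8],
# channel 42, raw-pixel stimuli, and a (1+lambda) evolution strategy.
import torch
import torchvision.models as models

model = models.alexnet(weights=None).eval()  # in practice, a trained network would be probed

target_layer, target_channel = model.features[8], 42  # hypothetical choice of neuron

activation = {}
def hook(module, inp, out):
    # record the mean activation of the chosen channel for the current stimulus
    activation["value"] = out[0, target_channel].mean().item()
target_layer.register_forward_hook(hook)

def fitness(stimulus):
    # run the stimulus through the network and return the unit's response
    with torch.no_grad():
        model(stimulus.unsqueeze(0))
    return activation["value"]

# (1+lambda) evolution strategy over pixel space: keep the best stimulus found so far,
# perturb it with Gaussian noise, and accept any offspring that responds more strongly.
stimulus = torch.rand(3, 224, 224)
sigma, n_offspring = 0.05, 16
best = fitness(stimulus)
for step in range(200):
    candidates = [(stimulus + sigma * torch.randn_like(stimulus)).clamp(0, 1)
                  for _ in range(n_offspring)]
    scores = [fitness(c) for c in candidates]
    i = int(torch.tensor(scores).argmax())
    if scores[i] > best:
        best, stimulus = scores[i], candidates[i]

print(f"best activation found: {best:.3f}")
```

Because the optimizer only needs a scalar response per stimulus, the same loop could in principle be driven by recorded firing rates rather than a forward pass, which is what makes a closed-loop framework of this kind applicable to both artificial and biological neurons.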


