Integrating Temporal Information to Spatial Information in a Neural Circuit

03/01/2019
by   Mien Brabeeba Wang, et al.

In this paper, we consider a network of spiking neurons with a deterministic synchronous firing rule at discrete time. We propose three problems -- "first consecutive spikes counting", "total spikes counting" and "k-spikes temporal to spatial encoding" -- to model how brains extract temporal information into spatial information from different neural codings. For a max input length T, we design three networks that solve these three problems with matching lower bounds in both time O(T) and number of neurons O(log T) in all three problems.


1. Introduction

Algorithms in the brain are inherently distributed. Although each neuron has relatively simple dynamics, as a distributed system, a network of neurons exhibits strong computational power. There have been many attempts to model the brain computationally. At the single-neuron level, theoretical neuroscientists have been able to model the dynamics of a single neuron to high accuracy with the Hodgkin-Huxley model [HH52]. At the circuit level, to make the analysis tractable, neuroscientists approximate the detailed dynamics of neurons with simplified models such as the nonlinear integrate-and-fire model [FTHvVB03] and the spike response model [WWvJ97]. Recently, Lynch et al. used stochastic neurons firing in discrete time to solve problems such as winner-take-all and similarity testing [LMP17a, LMP17b]. These models vary in their assumptions about spike/rate coding, deterministic/stochastic responses, and continuous/discrete time. In this paper, we consider a network of spiking neurons with a deterministic synchronous firing rule in discrete time to simplify the analysis and focus on the computational principles.

One of the most important questions in neuroscience is how humans integrate information over time. Sensory inputs such as visual and auditory stimuli are inherently temporal; however, brains are able to integrate the temporal information into a single concept, such as a moving object in a visual scene, or an entity in a sentence. There are two kinds of neural coding: rate coding and temporal coding. Rate coding is a neural coding scheme that assumes most of the information is coded in the firing rates of neurons. It is most commonly seen in muscles, where higher firing rates of motor neurons correspond to stronger muscle contraction [AZ26]. On the other hand, rate coding cannot be the only neural coding brains employ. A fly is known to react to new stimuli and change its direction of flight within tens of milliseconds; there is simply not enough time for neurons to compute averages [RWdRvSB96]. Therefore, neuroscientists proposed the idea of temporal coding, which assumes the information is coded in specific temporal firing patterns. One popular temporal coding is first-to-spike coding: it has been shown that the timing of the first spike encodes most of the information about an image in retinal cells [GM08]. We propose three toy problems to model how brains extract information from these different codings. “First consecutive spikes counting” (FCSC) counts the length of the first consecutive interval of spikes, which is equivalent to counting the distance between the first two spikes, a prevalent neural coding scheme in sensory cortex. “Total spikes counting” (TSC) counts the number of spikes over an arbitrary interval, which is an example of rate coding. Lastly, “k-spikes temporal to spatial encoding” (kSTS) is a generalization of “first consecutive spikes counting” and an example of temporal coding. In particular, TSC contains an interesting difficulty: there are conflicting objectives between maintaining the count when no spike arrives and updating the count when a spike arrives. To overcome this difficulty, we allow the network to enter an unstable intermediate state which carries the information of the count. The intermediate state then converges to a stable state that represents the count after a computation step without inputs. Hitron and Parter, in a newly-submitted paper [HP19], propose a different solution to our TSC problem.

In this paper, we design three networks that solve the above three problems by translating temporal information into spatial information, with matching lower bounds in both time and number of neurons for all three questions. The organization of this paper is as follows. In Section 2, we present the definition of a network of spiking neurons and the problem statements. In Section 3, we present the FCSC network that counts consecutive spikes in binary. In Section 4, we generalize Section 3 and present the TSC network that counts spikes over arbitrary intervals. In Section 5, we present the kSTS network that embeds sparse temporal inputs into spatial codings as an easy application of Section 3. In Section 6, we discuss our model assumptions and their implications along with possible future directions.

2. Problem Statements/Goals

In this section, we cover the model definition and the following three problems: first consecutive spikes counting (FCSC), total spikes counting (TSC) and k-spikes temporal to spatial encoding (kSTS). In particular, we will use FCSC networks as subroutines in a kSTS network.

2.1. Model

In this paper, we consider a network of spiking neurons with deterministic synchronous firing at discrete times. Formally, a neuron $v$ fires according to

$$y_v(t+1) = f\Big(\sum_{u \in P(v)} w_{uv}\, y_u(t) - b_v\Big),$$

where $y_v(t) \in \{0,1\}$ is the indicator function of neuron $v$ firing at time $t$, $b_v$ is the threshold (bias) of neuron $v$, $P(v)$ is the set of presynaptic neurons of $v$, $w_{uv}$ is the strength of the connection from neuron $u$ to neuron $v$, and $f$ is a nonlinear function. Here we take $f$ to be the Heaviside function, given by $f(x) = 1$ if $x \geq 0$ and $f(x) = 0$ otherwise. At $t = 0$, we let $y_v(0) = 0$ if $v$ is not one of the input neurons.
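To make the update rule concrete, the following is a minimal simulation sketch of this model; the two-neuron example network, its weights, and the matrix convention are illustrative assumptions, not a construction from this paper.

```python
# A minimal sketch of the deterministic synchronous firing rule above.
import numpy as np

def step(y, W, b):
    """One synchronous update: y_v(t+1) = f(sum_u w_{uv} y_u(t) - b_v),
    with f the Heaviside function (fires iff the potential is >= 0)."""
    return (W.T @ y - b >= 0).astype(int)

# Example: neuron 0 is an externally driven input x; neuron 1 is a
# self-exciting neuron v that latches on after x spikes once.
# Entry W[u, v] holds the connection strength w_{uv} from u to v.
W = np.array([[0.0, 1.0],
              [0.0, 1.0]])
b = np.array([1.0, 1.0])

y = np.array([0, 0])
for t, x_t in enumerate([1, 0, 0, 0]):
    y = step(y, W, b)
    y[0] = x_t                 # clamp the input neuron to x(t)
    print(t + 1, y.tolist())   # v turns on one step after x and stays on
```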

2.2. First consecutive spikes counting (FCSC)

Given an input neuron $x$ and the max input length $T$, we consider any input firing sequence such that $x(t) = 0$ for all $t > T$. Define $L$ in terms of this firing sequence as follows: if $x(t) = 1$ for some $t$, then there must exist integers $t_1 \leq t_2$ such that $x(t) = 1$ for all $t_1 \leq t \leq t_2$, $x(t) = 0$ for all $t < t_1$, and $x(t_2 + 1) = 0$. Define $L = t_2 - t_1 + 1$. (I.e., $L$ is the length of the first consecutive spikes interval in the sequence.) Otherwise, that is if $x(t) = 0$ for all $t$, then define $L = 0$.

Let $o_1, \dots, o_k$ be output neurons. Then we say a network of neurons solves FCSC in time $t$ with $n$ neurons if there exists an injective function $h : \{0, 1, \dots, T\} \to \{0,1\}^k$ such that $(o_1(t'), \dots, o_k(t')) = h(L)$ for all $t' \geq t$, and the network has $n$ neurons.
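The target value $L$ can be stated as a plain reference function; the network must output an injective encoding $h(L)$, so this sketch only pins down the semantics:

```python
# Reference implementation of the FCSC target value L: the length of the
# first run of consecutive spikes (L = 0 if the input never spikes).
def first_consecutive_spikes(x):
    """x is the input firing sequence x(1), ..., x(T) as a list of 0/1."""
    if 1 not in x:
        return 0
    t1 = x.index(1)             # first spike
    t2 = t1
    while t2 + 1 < len(x) and x[t2 + 1] == 1:
        t2 += 1                 # extend the run while spikes continue
    return t2 - t1 + 1

assert first_consecutive_spikes([0, 1, 1, 1, 0, 1]) == 3
assert first_consecutive_spikes([0, 0, 0]) == 0
```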

2.3. Total spikes counting (TSC)

Given an input neuron $x$ and the max input length $T$, we consider any input firing sequence such that $x(t) = 0$ for all $t > T$. Define $N = \sum_{t=1}^{T} x(t)$ as the total number of spikes in the sequence. Let $o_1, \dots, o_k$ be output neurons. Then we say a network of neurons solves TSC in time $t$ with $n$ neurons if there exists an injective function $h : \{0, 1, \dots, T\} \to \{0,1\}^k$ such that $(o_1(t'), \dots, o_k(t')) = h(N)$ for all $t' \geq t$, and the network has $n$ neurons.
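Analogously, the TSC target is just the spike count; the sketch below pairs it with one valid injective encoding $h$ (binary digits), which is an illustrative choice rather than the encoding the network necessarily computes:

```python
# The TSC target value and one injective encoding h into k output bits.
def total_spikes(x):
    return sum(x)

def h(n, k):
    """One valid injective encoding: n written in k binary digits."""
    return tuple((n >> i) & 1 for i in range(k))

x = [1, 0, 1, 1, 0, 1]
print(total_spikes(x), h(total_spikes(x), k=3))  # 4 -> (0, 0, 1)
```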

2.4. k-spikes Temporal to Spatial Encoding (kSTS)

Given an input neuron $x$ and the max input length $T$, we consider any input firing sequence such that $x(t) = 0$ for all $t > T$ and $\sum_{t=1}^{T} x(t) = k$ (i.e., there are $k$ spikes at $k$ distinct time points). We also assume that there is a designated neuron $z$ that fires at time $T$ to notify the network that the input ends. Let $o_1, \dots, o_m$ be output neurons. Denote the set of input temporal signals of max input length $T$ with $k$ distinct spikes as $S_{T,k}$. Then we say a network of neurons solves kSTS in time $t$ with $n$ neurons if there exists an injective function $h : S_{T,k} \to \{0,1\}^m$ such that $(o_1(t'), \dots, o_m(t')) = h(x)$ for all $t' \geq t$, and the network has $n$ neurons.
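To make the injectivity requirement concrete, here is one reference encoding: record the $k$ spike times, each written in binary. This particular $h$ is an illustrative assumption, not necessarily the encoding the paper's network computes.

```python
# One injective spatial code for kSTS: the k spike times, each written in
# ceil(log2(T+1)) binary digits, for k * ceil(log2(T+1)) bits in total.
from math import ceil, log2

def ksts_encode(x, T):
    times = [t + 1 for t, s in enumerate(x) if s == 1]  # 1-indexed spike times
    width = ceil(log2(T + 1))
    bits = []
    for t in times:
        bits.extend((t >> i) & 1 for i in range(width))
    return tuple(bits)

print(ksts_encode([0, 1, 0, 0, 1, 0], T=6))  # spikes at times 2 and 5
```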

Our contributions in this paper are to design networks that solve these three problems respectively with matching lower bounds.

Theorem 2.1.

There exists a network with $O(\log T)$ neurons that solves the FCSC problem in $O(T)$ time.

Theorem 2.2.

There exists a network with $O(\log T)$ neurons that solves the TSC problem in $O(T)$ time.

Theorem 2.3.

There exists a network with $O(k \log T)$ neurons that solves the kSTS problem in $O(T)$ time.

It is easy to see that we also have the corresponding information-theoretic lower bounds, $\Omega(T)$ time and $\Omega(\log T)$ neurons, for all three problems if we treat $k$ as a constant.
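A short sketch of the neuron lower bound under the definitions above: a deterministic network of $n$ neurons has at most $2^n$ distinct firing patterns, and $h$ must be injective on the $T + 1$ possible values, so

```latex
2^{n} \;\ge\; |\{h(0), h(1), \dots, h(T)\}| \;=\; T + 1
\quad\Longrightarrow\quad
n \;\ge\; \log_2(T + 1) \;=\; \Omega(\log T).
```

The time bound is analogous: two inputs that first differ at time $T$ must eventually produce different outputs, so no network can commit to a stable answer before $\Omega(T)$ steps.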

3. First Consecutive Spikes Counting

We present the constructions in two stages. In the first stage, we count consecutive spikes in binary transiently. In the second stage, we transform the transient firing into persistent firing. By composing the two stages, we get our desired network.

First stage: The network contains neurons $c_0, c_1, \dots, c_{\lceil \log T \rceil}$, and we build the network inductively. For the base network that counts mod 2, we have the weights shown in Figure 1.

Figure 1. Base Network

By noticing that $c_0(t+1) = 1$ if and only if $x(t) = 1$ and $c_0(t) = 0$ for $t \geq 0$, we have the following lemma

Lemma 3.1.

For the base network, if $x(t') = 1$ for all $t_1 \leq t' \leq t_2$ for some $t_1 \leq t_2$, then at time $t_2 + 1$, $c_0(t_2 + 1) = (t_2 - t_1 + 1) \bmod 2$.

Now we iteratively build the network for $c_i$ with $i \geq 1$ on top of the base network with the following rule:

Figure 2. First Stage

This completes the construction. From the construction, we can deduce the following lemma

Lemma 3.2.

For $i \geq 1$, neurons fire according to the following rules:

  1. if and only if , , and either for all or

  2. if and only if for all

Proof.

Case (1): The potential of is

Only if: Let’s show the only if direction for the firing rule of by proving the contrapositive.
If , then the potential of is

If , then the potential of is

If there exists such that and , then the potential of is

In all three cases, we have .
If: For the if direction, if , and for all , then the potential of is

If , and , then the potential of is

In both cases, we have .

Case (2): The firing rule of can be analyzed similarly.

The potential of is

Only If: For the only if direction, if there exists such that , then the potential of is

We have .
If: For the if direction, if for all , then the potential of is

We have as desired. ∎

Using the above lemma, we can verify that the network at the first stage indeed fires in binary, with $c_i$ encoding the $i$-th digit of the binary representation.

Theorem 3.3.

For , , if for all , then

  1. for where .

  2. if and only if or .

Proof.

First, let’s verify that the claim is true for . Since for all , if and only if . This implies exactly as desired (for all the modular arithmetic in this paper, we choose the smallest nonnegative number from the equivalence class). Now let’s do the induction on and we will verify the induction by checking fires in according to the induction hypothesis for all . When , the induction statement is trivially satisfied for all . Fix , we have the following cases:

  1. :
    This implies that . By induction hypothesis, not all for . Now by Lemma 3.2, we have as desired.

  2. :
    This implies that . By induction hypothesis, for all . Now by Lemma 3.2, we have as desired.

  3. :
    This implies that . By induction hypothesis, not for all . Now by Lemma 3.2, we have as desired.

  4. :
    This implies that . By induction hypothesis, for all . Now by Lemma 3.2, we have as desired.

  5. :
    This implies that . By induction hypothesis, for all . Now by Lemma 3.2, we have as desired.

This completes the induction. ∎
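As a sanity check on this behavior, the sketch below simulates only the semantics that Theorem 3.3 establishes for the first stage; it is a plain software binary counter, not the network's actual weights.

```python
# Behavioral model of the first stage: on each input spike the digit vector
# increments as a binary counter; digit i flips exactly when digits
# 0..i-1 are all 1 (the carry chain realized by the weights of Figure 2).
def step_binary(digits, spike):
    if not spike:
        return digits
    out = digits[:]
    for i in range(len(out)):
        out[i] ^= 1            # flip this digit
        if out[i] == 1:        # no carry to propagate further
            break
    return out

digits = [0, 0, 0]
for x_t in [1, 1, 1, 1, 1, 0]:
    digits = step_binary(digits, x_t)
print(digits)  # after 5 consecutive spikes: [1, 0, 1] = 5 in binary
```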

Second stage: The second stage is a simple “capture network” with input neurons $c_i$ for all $i$, output neurons $o_i$ for all $i$, and an auxiliary neuron $z$. Intuitively, the network persistently captures the state of $c_i$ for all $i$ into $o_i$ for all $i$. We will specify the timing of the states of $c_i$ being captured later. The network is defined as the following:


Figure 3. Second Stage

Notice that the above weights ensure the following one-step firing rule:

Lemma 3.4.

For , neurons fire according to the following rules:

  1. if and only if , or ( and )

  2. if and only if , or (there exists such that or , and )

Proof.

Case (1): The potential of is

Only If: Let’s show the only if direction for the firing rule of first. If , the potential of is

If , the potential of is

If , the potential of is

In all three cases, we have .
If: For the if direction, if , then the potential of is

If , the potential of is

In both cases, we have .

Case (2): The potential of is

Only If: For the only if direction, if for all , then the potential of is

If , the potential of is

In both cases, we have .
If: For the if direction, if there exists such that and , then the potential of is

If there exists such that and , the potential of is

If , the potential of is

In all three cases, we have as desired. ∎

Now we can describe the behavior of the capture network in the following theorem. The network persistently captures the state of $c_i$ for all $i$, at the first time point $t$ such that $x(t) = 0$ and there exists some $i$ such that $c_i(t) = 1$, into $o_i$ for all $i$.

Theorem 3.5.

For the network at the second stage, let $t$ be such that $x(t) = 0$ and there exists $i$ such that $c_i(t) = 1$, and for all $t' < t$, either $x(t') = 1$ or $c_j(t') = 0$ for all $j$. Then $o_i(t'') = c_i(t)$ for all $t'' > t$ and all $i$.

Proof.

First by Lemma 3.4, for all , for all . Now at time , by Lemma 3.4, we see that and . Now by Lemma 3.4, we know that for all . Now by Lemma 3.4 again, if , then since for all , for all ; and if , then we also have for all as desired. ∎
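Since the precise weights are given in Figure 3, a minimal illustration of the capture idea in the threshold model of Section 2.1 is sketched below; the weights (self-loop 1, inputs 0.5 each, threshold 1) are my own illustrative choices, not necessarily the paper's.

```python
# A minimal capture gadget in the Heaviside model: output o_i latches the
# value of c_i at the step when the trigger z fires, then holds it forever
# via a self-loop. Weights here are illustrative assumptions.
def capture_step(o, c, z):
    # o fires iff 1*o + 0.5*c + 0.5*z - 1 >= 0,
    # i.e. iff o fired before (persistence) or both c and z fired (capture).
    return 1 if (1.0 * o + 0.5 * c + 0.5 * z - 1.0) >= 0 else 0

o = 0
trace = [(1, 0), (0, 0), (1, 1), (0, 0), (0, 0)]  # (c_i(t), z(t)) over time
for c, z in trace:
    o = capture_step(o, c, z)
print(o)  # 1: o_i latched c_i = 1 at the trigger step and persists
```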

Now we are ready to prove the main Theorem 2.1.

Proof.

We are going to prove the main theorem by composing the networks from stages one and two together. If $x(t) = 0$ for all $t$, then the network satisfies the criterion trivially since $o_i(t) = 0$ for all $i$ and all $t$. If not, then there exist $t_1 \leq t_2$ such that $x(t) = 1$ for all $t_1 \leq t \leq t_2$, $x(t) = 0$ for all $t < t_1$ and $x(t_2 + 1) = 0$, where $L = t_2 - t_1 + 1$ is the length of the first consecutive spikes interval. Let $t = t_2 + 1$; then by Theorem 3.3 and Lemma 3.1, the neurons $c_i$ encode $L$ in binary at time $t$. Now because $L \geq 1$, we know there exists $i$ such that $c_i(t) = 1$ by Theorem 3.3. And by Lemma 3.2, we know that for all $t' < t$, either $x(t') = 1$ or $c_j(t') = 0$ for all $j$. Now the assumption of Theorem 3.5 is satisfied with this $t$. By Theorem 3.5, the output neurons hold the binary representation of $L$ for all $t' > t$, as desired. This shows that the above network solves the FCSC problem in $O(T)$ time with $O(\log T)$ neurons. ∎
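For intuition, here is an end-to-end behavioral check of the composed construction, modeling the two stages' semantics in software; this is an illustrative sketch of the behavior, not the network itself.

```python
# End-to-end behavioral check of FCSC: count the first run of spikes in
# binary (stage one), then latch the digits when the run ends (stage two).
def fcsc(x, num_digits):
    digits = [0] * num_digits
    latched = None
    seen_spike = False
    for x_t in x:
        if latched is None and seen_spike and x_t == 0:
            latched = digits[:]          # capture at the end of the first run
        if latched is None and x_t == 1:
            seen_spike = True
            for i in range(num_digits):  # binary increment
                digits[i] ^= 1
                if digits[i]:
                    break
    return latched if latched is not None else digits

print(fcsc([0, 1, 1, 1, 0, 1, 1, 0], num_digits=3))  # [1, 1, 0] = 3
```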

4. Total Spikes Counting

Counting the total number of spikes in an arbitrary interval requires persistence of neurons without external spikes. Notice that in the FCSC network, each neuron toggles itself according to the binary representation without delay. However, persistence of neurons and toggling without delays are conflicting objectives; persistence of neurons stabilizes the network, while toggling without delays changes the firing patterns of the network. For example, we use self-inhibition to count mod 2, but a self-inhibiting neuron cannot maintain the count during intervals with no inputs. In this section, we circumvent this difficulty by allowing the network to enter an unstable intermediate state that still stores the information of the count when the spikes arrive; however, the network will converge to a clean state that encodes the count in binary representation after one step of computation without external signals, and this clean state is stable over an arbitrary interval with no input.
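To see the conflict concretely, consider a single self-inhibiting neuron that counts mod 2 in the model of Section 2.1; the weights below are illustrative choices of mine. It toggles correctly while spikes arrive but cannot hold its state through a quiet interval:

```python
# A self-inhibiting mod-2 neuron: fires iff the input spiked and it did not
# fire at the previous step. It tracks parity while spikes arrive, but its
# state decays to 0 during a quiet interval, so the count is lost.
def mod2_step(y, x):
    # potential = 1*x - 1*y - 1; fires iff potential >= 0 (Heaviside)
    return 1 if (x - y - 1) >= 0 else 0

y = 0
for x_t in [1, 1, 1, 0, 0, 1]:
    y = mod2_step(y, x_t)
    print(x_t, y)
# After three spikes the parity is odd (y = 1), but the first quiet step
# resets y to 0, so the network forgets the count before the next spike.
```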

In this section, because the self-inhibition used in Section 3 to count mod 2 cannot induce persistence, we build a network of four neurons to count mod 2 and replace the function of $c_0$ in Section 3. We then iteratively build the rest of the network, which approximately fires in binary, on top of the counter network.

The construction of the counter network is the following:


Figure 4. Counter Network

We have the following lemma to specify the firing rules of the counter network:

Lemma 4.1.

Neurons of the counter network at time $t + 1$ fire according to the following rules:

  1. if and only if , and or

  2. for if and only if , and or

Proof.

Case (1): The potential of is

Only If: Let’s show the only if direction for the firing rule of first. If , then the potential of is

If , then the potential of is

If , then the potential of is

In all three cases, we have .
If: For the if direction, if , then the potential of is

If , then the potential of is

In both cases, we have .

Case (2): For , the potential of is

Only If: For the only if direction, if , then the potential of is

If , then the potential of is

If , then the potential of is

In all three cases, we have .
If: For the if direction, if , then the potential of is

If , then the potential of is

In both cases, we have as desired. ∎

Define a clean state with value at time of the counter network to be a state in which