1 Introduction
One of the most fundamental questions in neuroscience is the problem of neural coding, which is a key to understanding the brain. A neural code is a system of rules and mechanisms by which a signal carries information (1). Generally, two types of neural coding are considered: rate coding (2; 3; 4; 5; 6; 7) and temporal coding (8; 9; 10; 11; 12). In the rate code paradigm, neuronal information is carried by the mean firing rate of neurons. In the temporal code paradigm, also called synchrony coding, neuronal information is represented by the precise spike timings of neurons. Though the two paradigms are alternatives to each other, some works have suggested that the brain may work with a combination of both (13; 14; 15; 16; 17). Up to now, how neuronal information is coded in the cerebral cortex is still under debate.
Among these neural coding hypotheses, population rate coding is an ingenious one worth attention. According to classical rate coding, neurons encode information in the number of spikes per observation time window, i.e., the firing rate (18; 19). Neurons, however, have only a short integration time available to estimate the signal (20). If every synaptic stage were required to average over time to compute a firing rate, a rate code would be highly implausible for information coding in the brain, since the brain is a highly efficient machine and its coding mechanism ought to be efficient as well. Population rate coding was therefore introduced: it computes the population firing rate instead of the time-averaged rate of single neurons, improving the efficiency of rate coding. Population rate coding is grounded in experimental observations relating population activity to the intensity of external stimuli (21; 22). It allows faster and more accurate processing, since averaging is performed across many fast-responding neurons (23; 24; 21; 25). Recently, population rate coding has been widely investigated (14; 26; 6; 27; 28). Most of the theoretical studies mentioned above, however, considered only excitatory neurons or synapses and ignored the recurrent properties of neuronal networks. Real biological neuronal networks contain more than excitatory neurons and synapses. The dynamics of excitatory-inhibitory (EI) neuronal networks has been investigated in many works (29; 30; 31; 32; 33), which systematically studied activity patterns and spiking dynamics in EI recurrent networks. Moreover, signal coding in recurrent EI networks has also attracted much research (31; 12; 34; 35; 17; 36; 34; 37; 27; 28; 38; 39). In most works, neurons are divided into excitatory and inhibitory types, and whether a synaptic connection is excitatory or inhibitory is determined by the type of the presynaptic neuron. According to physiological evidence, however, the property of a synapse should be determined by the type of activated receptors (40): if excitatory receptors are activated, the synapse is excitatory; otherwise, it is inhibitory. Shrivastava et al.
reviewed evidence that the inhibitory transmitter GABA can be co-released with excitatory transmitters such as glutamate at the same presynaptic terminal, suggesting that crosstalk between different types of receptors might be a general phenomenon in the nervous system (41). Furthermore, Kantamneni recently reviewed how GABA receptor activity influences glutamate receptor function and vice versa, concluding that the crosstalk between excitatory and inhibitory receptors plays a key role in the balance between excitatory and inhibitory neurotransmission in the brain (42). Inspired by these experimental studies, instead of taking predetermined excitatory and inhibitory neurons, we construct a novel neuronal network in which excitatory and inhibitory synapses grow from the same neuron with a ratio of 4:1. For simplicity, a postsynaptic neuron is not allowed to receive an excitatory and an inhibitory synapse from the same presynaptic neuron at the same time. Unlike former works, in this paper we discuss population rate coding in such a neuronal network, in which the excitation or inhibition of a synaptic connection is determined solely by the type of activated receptors; that is, the same presynaptic neuron might exert excitatory or inhibitory effects on different postsynaptic neurons. In the following, we mainly study the representation of signals by the population firing rate in such a recurrent neuronal network.
In this paper, we construct a neuronal network in a more biological manner to study population rate coding. The contents are arranged as follows. In the 'Model and method' section, we introduce the computational models of neurons, synapses, and networks as well as the measures of neuronal information coding. In the 'Results' section, the effects of several critical parameters on population rate coding are illustrated and discussed, and we compare the performance of population rate coding in our model with that in a conventional model. In the 'Conclusion' section, we summarize the conclusions of this work and discuss directions for future work.
2 Model and Method
2.1 Network Topology
The recurrent network constructed in this paper is shown in Fig. 1.
The network consists of neurons coupled through excitatory and inhibitory synaptic connections. Neurons connect to each other through recurrent connections with the recurrent connection probability; consistent with the 4:1 ratio introduced above, 4/5 of these connections are excitatory and 1/5 are inhibitory, which defines the excitatory and inhibitory connection probabilities. Notably, excitatory connections may grow from a neuron that also grows inhibitory connections. The default parameter values given below are used unless otherwise specified.
2.2 Neurons and Synapses
The neuron model used in this paper is the Hodgkin-Huxley model (43), whose dynamics is given as follows.
C_m dV_i/dt = -g_Na m_i^3 h_i (V_i - E_Na) - g_K n_i^4 (V_i - E_K) - g_L (V_i - E_L) + I_ext + I_syn,i + D ξ_i(t),
dx_i/dt = α_x(V_i)(1 - x_i) - β_x(V_i) x_i,  x = m, h, n   (1)
where V_i denotes the membrane potential of neuron i, C_m is the membrane capacitance, and I_ext denotes the external stimulus injected into all neurons in the network. m_i and h_i are the activation and inactivation gating variables of the sodium channels, and n_i is the activation gating variable of the potassium channels. g_Na and g_K are the maximal sodium and potassium conductances, and g_L is the leak conductance. E_Na, E_K, and E_L are the reversal potentials of the sodium, potassium, and leak currents, respectively. I_syn,i is the synaptic current of neuron i from its coupled presynaptic neurons. D ξ_i(t) is the background noise, with D denoting the noise intensity and ξ_i(t) being white Gaussian noise with zero mean and unit standard deviation.
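As a concrete illustration, the stochastic integration of Eq. (1) can be sketched in Python. This is a minimal sketch, not the paper's code: the parameter values and rate functions below are the classic squid-axon Hodgkin-Huxley ones rather than the variant of Hansel et al. (44), the synaptic current is omitted, and the noise enters through a simple Euler-Maruyama step.

```python
import numpy as np

# Classic Hodgkin-Huxley constants (illustrative, not necessarily the paper's values).
C_M = 1.0                                  # membrane capacitance, uF/cm^2
G_NA, G_K, G_L = 120.0, 36.0, 0.3          # maximal conductances, mS/cm^2
E_NA, E_K, E_L = 50.0, -77.0, -54.4        # reversal potentials, mV

def rates(v):
    """Opening/closing rates alpha, beta for the gates m, h, n (1/ms)."""
    am = 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
    bm = 4.0 * np.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * np.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
    an = 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
    bn = 0.125 * np.exp(-(v + 65.0) / 80.0)
    return am, bm, ah, bh, an, bn

def simulate(i_ext=10.0, d_noise=0.0, t_max=100.0, dt=0.01, seed=0):
    """Euler-Maruyama integration of one neuron; returns (V trace, spike times)."""
    rng = np.random.default_rng(seed)
    n_steps = int(t_max / dt)
    v, m, h, n = -65.0, 0.05, 0.6, 0.32    # resting-state initial conditions
    trace, spikes = np.empty(n_steps), []
    for k in range(n_steps):
        am, bm, ah, bh, an, bn = rates(v)
        i_ion = (G_NA * m**3 * h * (v - E_NA) + G_K * n**4 * (v - E_K)
                 + G_L * (v - E_L))
        v_new = (v + dt * (-i_ion + i_ext) / C_M
                 + np.sqrt(dt) * d_noise * rng.standard_normal())
        m += dt * (am * (1.0 - m) - bm * m)
        h += dt * (ah * (1.0 - h) - bh * h)
        n += dt * (an * (1.0 - n) - bn * n)
        if v < 0.0 <= v_new:               # upward crossing of 0 mV = one spike
            spikes.append(k * dt)
        v = v_new
        trace[k] = v
    return trace, spikes
```

With i_ext around 10 in the units above, the neuron fires tonically; setting d_noise > 0 adds the background-noise term of Eq. (1).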
and with are given by: , , , , , from Hansel et al(44).A neuron’s synaptic currents are shown as follows. Synaptic connections here are modeled as a conductancebased model.
I_syn,i = Σ_{j=1}^{N_i} g_ij(t) (E_syn - V_i)   (2)
where N_i denotes the number of presynaptic neurons coupled to neuron i, and E_syn denotes the reversal potential of the synapse, with E_syn = E_E for excitatory connections and E_syn = E_I for inhibitory connections. g_ij(t) denotes the synaptic conductance. In detail, the conductance is written as
g_ij(t) = g_syn Σ_k Θ(t - t_j^k) exp(-(t - t_j^k)/τ_syn)   (3)
with Θ the Heaviside step function. t_j^k denotes the spike timings of presynaptic neuron j, and τ_syn is the synaptic time constant, equal to τ_E for excitatory connections and τ_I for inhibitory connections. If not specified below, the excitatory synaptic strength g_E is fixed at its default value, and the inhibitory synaptic strength g_I is calculated from the excitatory one. The ratio between them is
K = g_I / g_E   (4)
through which we can modulate the degree of inhibition in the population as well as the population firing pattern.
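The conductance-based synapse of Eqs. (2)-(3) can be sketched as follows; the values of g_max, tau, and the reversal potentials here are illustrative placeholders, not the paper's parameters.

```python
import numpy as np

def synaptic_conductance(t, spike_times, g_max, tau):
    """Eq. (3): g(t) = g_max * sum_k exp(-(t - t_k)/tau) over past spikes t_k <= t."""
    ts = np.asarray(spike_times, dtype=float)
    past = ts[ts <= t]
    return g_max * np.exp(-(t - past) / tau).sum()

def synaptic_current(t, v_post, spike_times, g_max, tau, e_syn):
    """Eq. (2) for one connection: I = g(t) * (E_syn - V_post)."""
    return synaptic_conductance(t, spike_times, g_max, tau) * (e_syn - v_post)

# An excitatory (E_syn = 0 mV) and an inhibitory (E_syn = -80 mV) synapse
# driven by the same presynaptic spike train, seen at V_post = -65 mV:
spikes = [1.0, 3.0, 4.5]                       # presynaptic spike times, ms
i_exc = synaptic_current(5.0, -65.0, spikes, g_max=0.1, tau=2.0, e_syn=0.0)
i_inh = synaptic_current(5.0, -65.0, spikes, g_max=0.1, tau=2.0, e_syn=-80.0)
```

The same spike train thus yields a depolarizing (positive) current through the excitatory synapse and a hyperpolarizing (negative) one through the inhibitory synapse, matching the receptor-determined sign convention of the model.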
2.3 Time-varying external input current
To study the rate code, we use a kind of time-varying external input current that has been used ubiquitously in many papers (6; 27; 28; 38), also named half-wave rectified Gaussian noise.
I(t) = A [x(t)]^+   (5)
where A denotes the modulation strength and [·]^+ denotes half-wave rectification. x(t) is an Ornstein-Uhlenbeck process whose dynamics equation is
dx/dt = -x/τ_c + ξ(t)   (6)
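Assuming Eq. (5) simply scales the half-wave rectified process x(t) by the modulation strength A (the exact scaling convention is our assumption), the input can be generated as below, with the correlation time set to the paper's 80 ms.

```python
import numpy as np

def ou_input(t_max=1000.0, dt=0.1, tau_c=80.0, amplitude=1.0, seed=0):
    """Half-wave rectified Ornstein-Uhlenbeck input, sampled every dt ms."""
    rng = np.random.default_rng(seed)
    n = int(t_max / dt)
    x = np.zeros(n)
    for k in range(1, n):
        # Euler-Maruyama step of Eq. (6): dx/dt = -x/tau_c + xi(t)
        x[k] = x[k - 1] - dt * x[k - 1] / tau_c + np.sqrt(dt) * rng.standard_normal()
    return amplitude * np.maximum(x, 0.0)    # half-wave rectification of Eq. (5)
```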
where ξ(t) is Gaussian white noise and the correlation time τ_c is set to 80 ms. If not specified, the external input current intensity is set to its default value.

2.4 Methods
2.4.1 Population firing rate
The population firing rate is used to represent the signal and is calculated as
r(t) = N_sp(t; Δt) / (N Δt)   (7)
where Δt is a given short time interval centered at time t, whose value here is taken as 1 ms, N is the number of neurons in the population, and N_sp(t; Δt) is the total number of spikes generated by the neuronal population during the given time interval.
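Eq. (7) can be sketched directly: pool the spike times of all neurons, count the spikes falling in the window Δt centered at t, and normalise by the population size and the window length (the per-neuron normalisation is our reading of the formula).

```python
import numpy as np

def population_rate(all_spike_times, n_neurons, t, window=1.0):
    """Spikes of the whole population within [t - window/2, t + window/2],
    divided by (n_neurons * window); units: spikes per ms per neuron."""
    ts = np.asarray(all_spike_times, dtype=float)
    count = np.count_nonzero(np.abs(ts - t) <= window / 2.0)
    return count / (n_neurons * window)

# Example: 4 neurons, pooled spike times in ms; 4 of the 5 spikes fall
# inside the 1 ms window centered at t = 10 ms.
all_spikes = [9.6, 9.8, 10.0, 10.3, 12.0]
r = population_rate(all_spikes, n_neurons=4, t=10.0, window=1.0)
```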
2.4.2 Encoding quality
The correlation coefficient C between the input and the population firing rate is introduced to quantify how well the input is encoded and propagated by the network. C is calculated as (12)
C(τ) = ⟨(I(t) - ⟨I⟩)(r(t + τ) - ⟨r⟩)⟩ / (σ_I σ_r)   (8)
where r is the population firing rate computed in a 10 ms time window sliding with a step of 1 ms, τ is the time lag, and σ_I and σ_r are the standard deviations of the input and the rate. Here we term the maximum of the correlation coefficients over the time lag the encoding fidelity Q, through which we know how much of the input information is captured by the population firing rate.
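The encoding fidelity can be sketched as the Pearson correlation between the input signal and the population rate, maximised over a range of time lags to absorb the response delay; the lag range searched below is an assumption.

```python
import numpy as np

def encoding_fidelity(signal, rate, max_lag=20):
    """Max over lags 0..max_lag of corr(signal[:T-lag], rate[lag:])."""
    best = -1.0
    for lag in range(max_lag + 1):
        s = signal[: len(signal) - lag] if lag > 0 else signal
        r = rate[lag:]
        best = max(best, np.corrcoef(s, r)[0, 1])
    return best

# Example: the rate is a noisy copy of the signal delayed by 5 samples,
# so the maximum correlation is found near lag = 5.
rng = np.random.default_rng(1)
sig = rng.standard_normal(500)
rate = np.concatenate([np.zeros(5), sig[:-5]]) + 0.1 * rng.standard_normal(500)
q = encoding_fidelity(sig, rate)
```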
2.4.3 Synchrony degree
The synchrony degree is used to quantify how synchronously the neurons in the network generate spikes, by which we can determine the local state of the neuronal population for further analysis. The synchrony degree is quantified by the population coherence measure κ, calculated as (45)
κ = [2 / (N(N - 1))] Σ_{i<j} κ_ij(τ)   (9)
where N is the size of a subnetwork and κ_ij(τ) is the coherence measure of a neuron pair (i, j), calculated as
κ_ij(τ) = Σ_{l=1}^{L} X(l) Y(l) / sqrt( Σ_{l=1}^{L} X(l) Σ_{l=1}^{L} Y(l) )   (10)
where X(l) and Y(l) are the discretized action potential trains of neurons i and j, respectively. In detail, one divides the two spike trains, recorded over a time interval T, into L = T/τ small bins of width τ, and writes the trains as X(l), Y(l) ∈ {0, 1}, l = 1, 2, ..., L, where a bin takes the value 1 if the corresponding neuron fired within it and 0 otherwise.
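For a single neuron pair, Eqs. (9)-(10) can be sketched as follows: binarise each spike train into bins of width τ and take the normalised count of coincident bins.

```python
import numpy as np

def binarize(spike_times, t_total, tau):
    """0/1 train over bins of width tau: bin l is 1 iff a spike fell in it."""
    n_bins = int(np.ceil(t_total / tau))
    train = np.zeros(n_bins, dtype=int)
    idx = (np.asarray(spike_times, dtype=float) // tau).astype(int)
    train[idx[idx < n_bins]] = 1
    return train

def pair_coherence(spikes_i, spikes_j, t_total, tau):
    """Eq. (10): kappa_ij = sum X(l)Y(l) / sqrt(sum X(l) * sum Y(l)), in [0, 1]."""
    x = binarize(spikes_i, t_total, tau)
    y = binarize(spikes_j, t_total, tau)
    denom = np.sqrt(x.sum() * y.sum())
    return float((x * y).sum() / denom) if denom > 0 else 0.0

# Identical trains are perfectly coherent; disjoint trains are not.
k_same = pair_coherence([5.0, 25.0, 45.0], [5.0, 25.0, 45.0], t_total=60.0, tau=2.0)
k_disjoint = pair_coherence([5.0, 25.0], [15.0, 35.0], t_total=60.0, tau=2.0)
```

The network-level measure of Eq. (9) is then the average of κ_ij over all neuron pairs of the subnetwork.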
3 Results
In this section, the simulation results and relevant analysis are presented. We mainly discuss how stochastic stimuli can be represented by the population firing rate in the studied neuronal network and how to improve the encoding quality. The key parameters we investigate include the I/E strength ratio, the recurrent probability, the noise intensity, the excitatory synaptic strength, and the synaptic time constant. Finally, we compare the encoding quality of our network with that of a neuronal network with predetermined neuron types as used in previous studies.
3.1 I/E strength ratio and recurrent probability
The network state has a direct effect on information encoding, so how the network state affects the quality of population rate coding is an important problem. In the studied network, the ratio between the inhibitory and excitatory synaptic strengths directly determines whether the network is more excitatory or more inhibitory: if the ratio is small, excitation dominates; otherwise, inhibition dominates. Meanwhile, the recurrent probability determines how tightly the neurons in the network are connected, which directly influences the total numbers of excitatory and inhibitory connections. That is to say, the I/E strength ratio and the recurrent probability have a great influence on the network state. In this subsection, therefore, we first discuss their effects on the encoding quality.
We first show a curve of the encoding quality versus the I/E strength ratio under fixed values of the other parameters in Fig. 2(a). From this first glimpse of the effect of the ratio on the encoding quality, we see that there might exist an optimal range of the ratio for better encoding performance. Within this range the encoding quality is high, indicating that the studied network encodes the stimuli into the population firing rate well. This is a positive result for us, because a high encoding quality suggests that the model constructed in this paper is able to encode the input information and that the population rate code can represent the stimuli well. Typical results are shown in Fig. 2(b)-(d), in which the top, middle, and bottom panels show the stochastic input, the network spiking raster, and the corresponding population firing rate, respectively. Compared with the other two cases, the intermediate case performs better at representing the input with the population firing rate, preserving the input more completely, whereas in the other two cases some parts of the signal are weakened or amplified.
Fig. 2 clearly shows that the I/E strength ratio plays an important role in population rate coding and in the firing of the neurons in the network. What happens if the recurrent probability is altered? We now test the encoding quality for different values of the I/E ratio and the recurrent probability; the results are shown in Fig. 3. From this figure, however, we see that the curve shown in Fig. 2(a) does not hold for every value of the recurrent probability. In the two-dimensional parameter space of the I/E ratio and the recurrent probability, the encoding quality takes higher values below the black dashed curve (noted as the dark blue area) and smaller values above it (noted as the light blue area). Interestingly, for every recurrent probability there exists an optimal range of the I/E ratio within which the encoding quality takes high values, while outside this range the values are small.
The statistics in Fig. 4(a) show that the mean encoding quality presents a decaying trend as the recurrent probability increases, where the large error bars mostly stem from the lower encoding quality in the light blue area. We should therefore focus on the dark blue area with high encoding performance. The green curve in Fig. 4(b) shows that the encoding quality decays with increasing I/E ratio and takes optimal values at intermediate ratios. With these two statistical results in mind, we turn to Fig. 3 again. Obviously, most optimal ranges of the I/E ratio are located in the dark blue area, though the range gets narrower as the recurrent probability increases. In the dark blue area, the encoding quality clearly tends to increase first and then decrease, similar to the curve shown in Fig. 2(a); we call a curve with such a tendency the idealized curve. The encoding quality performs better when the recurrent probability is set around 0.1.
To determine the relationship between the encoding quality and the I/E ratio in this regime, we show several curves selected from the results in Fig. 3. As Fig. 4 illustrates, almost all five curves follow the idealized tendency of encoding quality versus I/E ratio. In detail, however, the curves tend to saturate with increasing I/E ratio when the recurrent probability is small, and to decay rapidly when it is large; in other words, the optimal range of the ratio is wider for small recurrent probability and narrower for large recurrent probability. In the following simulations, we choose a recurrent probability around 0.1, a compromise between the two cases that shows neither obvious saturation nor sharp decay but a suitable optimal range of the I/E ratio.
3.2 Effects of considered parameters on population rate coding
3.2.1 Noise intensity
Neurons in a biological environment receive background noise continually and fire spikes spontaneously. In this work, we use Gaussian noise to mimic the background noise and spontaneous spiking. Previous studies have shown that the noise intensity is a key parameter for information coding (6; 46). The dependence of the encoding quality on the noise intensity and the I/E ratio is shown in Fig. 5.
The left part of Fig. 5 shows the encoding quality on the parameter space of the I/E ratio and the noise intensity, with red representing better coding performance. Obviously there exists an optimal area of the I/E ratio and the noise intensity in which the encoding quality takes high values (the red area). Within an optimal range of noise intensity, the network encodes the input information very well. Meanwhile, the lower and upper limits of the optimal range of the I/E ratio change with the noise intensity: when the noise intensity is small, both limits are larger; otherwise, they are smaller. To see the relationship between the encoding quality and the noise intensity, we perform a statistical analysis of these data. The statistical results are shown in Fig. 6.
The orange curve shows the same trend: the encoding quality is lower for weak noise and higher for moderate noise. Notably, the trend is weaker than what we 'see' in Fig. 5, owing to the bad performance outside the red area, but that does not affect the conclusion. Similarly, the blue curve in Fig. 6(b) tells us that there exists an optimal range of the I/E ratio where the encoding quality takes high values.
Taking a closer look at the results, we slice three cases of noise intensity in the left part of Fig. 5, marked in different colors, and plot the corresponding curves of encoding quality versus the I/E ratio in the right part. As discussed above, all three cases have an optimal range of the ratio. The small-noise case has a wider optimal range with larger lower and upper limits, while the large-noise case has a narrower optimal range with smaller limits. Compared with these, the intermediate case has a moderate optimal range that maintains a high encoding quality and does not decay rapidly when the ratio exceeds the optimal range, which makes it a compromise between the other two cases and confirms the rationality of the parameters selected before. Of course, an even higher encoding quality can be obtained at certain noise intensities; we nevertheless keep the previously selected noise intensity in the later simulations, since this choice does not affect the study of the effects of the other parameters on the encoding performance.
3.2.2 Excitatory synaptic strength
The excitatory synaptic strength is a key parameter that directly determines the strength of excitatory synapses and, together with the I/E ratio, indirectly determines the inhibitory synaptic strength. Therefore, the excitatory synaptic strength and the I/E ratio together have a great influence on the network state and on the performance of population rate coding. The dependence of the encoding quality on these two parameters is shown in Fig. 7.
As shown in the figure, there is an obvious orange area, located between the black dashed parallel lines, denoting better performance, where the encoding quality is mostly above 0.9. The encoding quality is high in this band running from the lower-right corner to the upper-left corner and lower outside it. In detail, the encoding quality takes high values in the middle of the two-dimensional parameter space (orange), second-highest values in the lower-left corner (white), and low values in the upper-right corner (purple). Meanwhile, the product of the excitatory strength and the I/E ratio takes low values in the lower-left corner, moderate values in the middle, and high values in the upper-right corner. This product is exactly the inhibitory synaptic strength, which therefore appears to have a great influence on the encoding quality. This can be understood as follows. For a large inhibitory strength, the encoding quality is poor, suggesting that overly strong inhibitory synapses depress the encoding ability of the network. For a small inhibitory strength, the quality is higher than for large inhibition but still lower than for intermediate inhibition, suggesting that decreasing inhibition can facilitate encoding, but too weak inhibition indirectly over-excites the network activity and decreases the accuracy of information coding. For intermediate inhibitory strength, the network stays in an intermediate state, neither too excitatory nor too inhibitory, so that it responds to the signals well and achieves better coding performance. Notably, this also explains why, with the excitatory strength fixed as above, there exists an optimal range of the I/E ratio: the network needs an intermediate state to code well.
In summary, the excitatory synaptic strength and the I/E ratio jointly influence the encoding quality. For an excitatory strength that is not too large, there always exists an optimal range of the I/E ratio located at intermediate values of the inhibitory strength. We therefore fixed the excitatory strength at a single representative value in the previous simulations, which is sufficient for our purposes.
3.2.3 Synaptic time constant
In this subsection, we investigate the effects of the synaptic time constant on the encoding performance. In the simulations, we fix the ratio between the excitatory and inhibitory synaptic time constants and vary their overall scale. The results, shown in Fig. 8, suggest that only a small synaptic time constant is good for encoding performance. This is easy to understand: only a synapse with a small time constant, i.e., fast dynamics, can respond quickly to incoming synaptic currents, so that the coding is more accurate.
3.3 Comparison with the determined network
We have studied the effects of multiple parameters on the population rate coding quality of a neuronal network in which one presynaptic neuron may exert either excitatory or inhibitory effects on its postsynaptic neurons. This differs from previous works, which divided neurons into predetermined types, inhibitory or excitatory. To pin down the differences, we compare the population rate coding quality between such previous networks (the determined EI model) and our network (the undetermined EI model). The synaptic strength and the noise intensity are the main parameters considered in this subsection.
First, let us introduce the setup of the determined EI model. The network consists of 100 neurons, 80 excitatory and 20 inhibitory, forming an excitatory subpopulation and an inhibitory subpopulation. Connection probabilities are defined for the excitatory-excitatory, excitatory-inhibitory, inhibitory-excitatory, and inhibitory-inhibitory connections, respectively. The other parameters are the same as those of our network described above.
The results of the determined model are illustrated in Fig. 9. As shown in panel (a), the noise intensity has an effect on the encoding quality similar to that in our model: too weak and too strong noise depress population rate coding, while intermediate noise facilitates it. The synaptic strength likewise has a similar effect: only intermediate strengths are optimal for population rate coding. We note that the optimal values of both parameters are smaller than their counterparts in our model, which might be because the determined excitatory and inhibitory neurons have distinct, separated effects and thus do not require larger parameter values. With the same parameters, however, the encoding quality in the determined model appears worse than in ours, as in Fig. 9(b). To confirm this point, we show comparisons of several cases in the following figures. The comparison of the effect of noise intensity in the two models is shown in Fig. 10(a): in both models, as the synaptic strength increases, the encoding quality reaches its optimum sooner under large noise intensity than under small noise intensity, but the optimal value reached is smaller; moreover, the optimal encoding quality is higher in our model than in the determined model. As shown in Fig. 10(b), the synaptic strength has a similar effect in both models, again with better performance in ours.
4 Conclusion
Information coding in cortical networks is one of the critical questions in understanding the brain and has attracted extensive attention. In the past decades, many potential strategies of information coding have been proposed (2; 6; 10; 12; 47; 48; 49; 50; 51), among which there are two main coding paradigms: temporal coding (8; 10; 11) and rate coding (2; 6). Among these neural code hypotheses, population rate coding has been widely studied (23; 24; 21; 25; 14; 26; 6; 27; 28).
Many recurrent networks consisting of predetermined excitatory and inhibitory neurons have been constructed to model cortical neuronal networks. In these works, whether a synaptic connection is excitatory or inhibitory is determined by the type of the presynaptic neuron. Considering the physiological evidence, however, the property of a synaptic connection should be determined by the type of activated receptors. Experimental works have shown that excitatory and inhibitory transmitters may be co-released at the same presynaptic terminal (41; 42), and that the crosstalk between excitatory and inhibitory receptors plays a key role in the balance of excitation and inhibition in the brain (42). Inspired by this evidence, we constructed a recurrent neuronal network in which one presynaptic neuron may exert excitatory or inhibitory effects on its postsynaptic neurons.
In this paper, we have studied population rate coding in this network. The I/E strength ratio is a key parameter that determines how different the excitatory and inhibitory synaptic strengths are, indirectly influencing the network state. We find that there exists an optimal range of the I/E ratio, usually located at intermediate values, for better population rate coding performance. The recurrent probability has a similar optimal intermediate range in which population rate coding performs well. After determining these two key parameters, the effects of the noise intensity, synaptic strength, and synaptic time constant on population rate coding were investigated. The noise intensity has an obvious effect: intermediate noise facilitates the encoding quality, whereas both large and small noise intensities depress it. The excitatory synaptic strength and the I/E ratio jointly influence the encoding quality: if the excitatory strength is not too large, there exists an optimal range of the I/E ratio, and this range shifts as the excitatory strength increases. The synaptic time constant determines the speed of the neurons' response to signals, so a small synaptic time constant promotes the accuracy of population rate coding. All these results suggest that the optimal range of every parameter (apart from the synaptic time constant) is located at intermediate values; with a suitable combination of these parameters, the neuronal network encodes signal information very well in a population rate code.
Furthermore, we compared the population rate coding performance of our network (the undetermined EI model) with that of the previous networks (the determined EI model). We find that the determined EI model has a similar optimal range of the I/E ratio, but its encoding quality is usually lower than ours. The noise intensity has a similar effect in both models: too weak and too strong noise are both bad for population rate coding. The synaptic strength has a slightly different effect: weak excitatory synaptic strength enhances the performance of population rate coding, while strong excitation decreases it. Notably, with the same parameter settings, the population rate coding performance of the determined model is worse than ours. In conclusion, our simulation results suggest that a network with determined neuron types behaves similarly to one without determined types (ours) regarding population rate coding, but our model achieves better coding performance and has, to some degree, a more plausible architecture.
Although we have improved the neuronal network by letting the effect of a synaptic connection be determined by the type of activated receptors, the model is still not fully biological. In fact, many issues about the co-release of excitatory and inhibitory neurotransmitters remain unknown, leaving much room for further work. On the one hand, more complex and realistic synapse models should be considered, such as introducing stochastic processes for neurotransmitters and receptors. On the other hand, apart from population rate coding, there may be more accurate codes representing neural information, which are worth studying.
Acknowledgements
This work is supported by the National Natural Science Foundation of China (Grant Nos. 11472061, 11572084) and the Fundamental Research Funds for the Central Universities (No. 2018XKJC02).
References
 (1) R. C. deCharms, A. Zador, Neural representation and the cortical code, Annual Review of Neuroscience 23 (1) (2000) 613–647, pMID: 10845077. doi:10.1146/annurev.neuro.23.1.613.
 (2) W. T. Newsome, K. H. Britten, J. A. Movshon, Neuronal correlates of a perceptual decision, Nature 341 (1989) 52–54. doi:10.1038/341052a0.
 (3) A. Georgopoulos, M. Taira, A. Lukashin, Cognitive neurophysiology of the motor cortex, Science 260 (5104) (1993) 47–52. doi:10.1126/science.8465199.
 (4) M. N. Shadlen, W. T. Newsome, Noise, neural codes and cortical organization, Current Opinion in Neurobiology 4 (4) (1994) 569 – 579. doi:10.1016/0959-4388(94)90059-0.
 (5) P. Marsalek, C. Koch, J. H. R. Maunsell, On the relationship between synaptic input and spike output jitter in individual neurons, Proceedings of the National Academy of Sciences of the United States of America 94 (2) (1997) 735–740. doi:10.1073/pnas.94.2.735.
 (6) M. C. W. van Rossum, G. G. Turrigiano, S. B. Nelson, Fast propagation of firing rates through layered networks of noisy neurons, Journal of Neuroscience 22 (5) (2002) 1956–1966. doi:10.1523/JNEUROSCI.22-05-01956.2002.
 (7) M. E. Mazurek, M. N. Shadlen, Limits to the temporal fidelity of cortical spike rate signals, Nature Neuroscience 5 (2002) 463–471. doi:10.1038/nn836.

 (8) A. Aertsen, M. Diesmann, M. Gewaltig, Propagation of synchronous spiking activity in feedforward neural networks, Journal of Physiology-Paris 90 (3) (1996) 243 – 247. doi:10.1016/S0928-4257(97)81432-5.
 (9) A. Riehle, S. Grün, M. Diesmann, A. Aertsen, Spike synchronization and rate modulation differentially involved in motor cortical function, Science 278 (5345) (1997) 1950–1953. doi:10.1126/science.278.5345.1950.
 (10) M. Diesmann, M.O. Gewaltig, A. Aertsen, Stable propagation of synchronous spiking in cortical neural networks, Nature 402 (1999) 529–533. doi:10.1038/990101.
 (11) V. Litvak, H. Sompolinsky, I. Segev, M. Abeles, On the transmission of rate code in long feedforward networks with excitatory–inhibitory balance, Journal of Neuroscience 23 (7) (2003) 3006–3015. doi:10.1523/JNEUROSCI.23-07-03006.2003.
 (12) T. P. Vogels, L. F. Abbott, Signal propagation and logic gating in networks of integrate-and-fire neurons, Journal of Neuroscience 25 (46) (2005) 10786–10795. doi:10.1523/JNEUROSCI.3508-05.2005.
 (13) N. Masuda, K. Aihara, Bridging rate coding and temporal spike coding by effect of noise, Phys. Rev. Lett. 88 (2002) 248101. doi:10.1103/PhysRevLett.88.248101.
 (14) N. Masuda, K. Aihara, Duality of rate coding and temporal coding in multilayered feedforward networks, Neural Computation 15 (2003) 103–125. doi:10.1162/089976603321043711.
 (15) N. Masuda, K. Aihara, Dual coding and effects of global feedback in multilayered neural networks, Neurocomputing 58–60 (2004) 33 – 39, Computational Neuroscience: Trends in Research 2004. doi:10.1016/j.neucom.2004.01.019.
 (16) R. M. Bruno, B. Sakmann, Cortex is driven by weak but synchronously active thalamocortical synapses, Science 312 (5780) (2006) 1622–1627. doi:10.1126/science.1124593.
 (17) A. Kumar, S. Rotter, A. Aertsen, Spiking activity propagation in neuronal networks: reconciling different perspectives on neural coding, Nature Reviews Neuroscience 11 (2010) 615–627. doi:10.1038/nrn2886.
 (18) E. D. Adrian, The basis of sensation, W W Norton and Co, 1928. doi:10.1016/0166-2236(92)90361-B.
 (19) L. Kostal, P. Lansky, J.-P. Rospars, Review article: Neuronal coding and spiking randomness, European Journal of Neuroscience 26 (10) (2007) 2693–2701. doi:10.1111/j.1460-9568.2007.05880.x.
 (20) M. N. Shadlen, W. T. Newsome, The variable discharge of cortical neurons: Implications for connectivity, computation, and information coding, Journal of Neuroscience 18 (10) (1998) 3870–3896. doi:10.1523/JNEUROSCI.18-10-03870.1998.
 (21) W. Gerstner, Population dynamics of spiking neurons: Fast transients, asynchronous states, and locking, Neural Comput. 12 (1) (2000) 43–89. doi:10.1162/089976600300015899.
 (22) E. Kandel, J. Schwartz, T. Jessell, Principles of Neural Science, Prentice-Hall International Edition, Elsevier, 1991.
 (23) N. Brunel, V. Hakim, Fast global oscillations in networks of integrate-and-fire neurons with low firing rates, Neural Computation 11 (7) (1999) 1621–1671. doi:10.1162/089976699300016179.
 (24) B. W. Knight, Dynamics of encoding in neuron populations: Some general mathematical features, Neural Computation 12 (3) (2000) 473–518. doi:10.1162/089976600300015673.
 (25) P. Dayan, L. F. Abbott, Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems, The MIT Press, 2005.
 (26) N. Masuda, B. Doiron, A. Longtin, K. Aihara, Coding of temporally varying signals in networks of spiking neurons with global delayed feedback, Neural Computation 17 (2005) 2139–2175. doi:10.1162/0899766054615680.
 (27) S. Wang, C. Zhou, Information encoding in an oscillatory network, Phys. Rev. E 79 (2009) 061910. doi:10.1103/PhysRevE.79.061910.
 (28) D. Guo, C. Li, Population rate coding in recurrent neuronal networks with unreliable synapses, Cognitive Neurodynamics 6 (1) (2012) 75–87. doi:10.1007/s11571-011-9181-x.
 (29) C. van Vreeswijk, H. Sompolinsky, Chaos in neuronal networks with balanced excitatory and inhibitory activity, Science 274 (5293) (1996) 1724–1726. doi:10.1126/science.274.5293.1724.
 (30) E. Salinas, T. J. Sejnowski, Correlated neuronal activity and the flow of neural information, Nature Reviews Neuroscience 2 (2001) 539–550. doi:10.1038/35086012.
 (31) N. Brunel, Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons, Journal of Computational Neuroscience 8 (3) (2000) 183–208. doi:10.1023/A:1008925309027.
 (32) C. Mehring, U. Hehl, M. Kubo, M. Diesmann, A. Aertsen, Activity dynamics and propagation of synchronous spiking in locally connected random networks, Biological Cybernetics 88 (2003) 395–408. doi:10.1007/s00422-002-0384-4.
 (33) J.-n. Teramae, T. Fukai, Local cortical circuit model inferred from power-law distributed neuronal avalanches, Journal of Computational Neuroscience 22 (3) (2007) 301–312. doi:10.1007/s10827-006-0014-6.
 (34) J. Kremkow, A. Aertsen, A. Kumar, Gating of signal propagation in spiking neural networks by balanced and correlated excitation and inhibition, Journal of Neuroscience 30 (47) (2010) 15760–15768. doi:10.1523/JNEUROSCI.3874-10.2010.
 (35) A. Kumar, S. Rotter, A. Aertsen, Conditions for propagating synchronous spiking and asynchronous firing rates in a cortical network model, Journal of Neuroscience 28 (20) (2008) 5268–5280. doi:10.1523/JNEUROSCI.2542-07.2008.
 (36) T. P. Vogels, L. F. Abbott, Gating multiple signals through detailed balance of excitation and inhibition in spiking networks, Nature Neuroscience. doi:10.1038/nn.2276.
 (37) J. Mayor, W. Gerstner, Transient information flow in a network of excitatory and inhibitory model neurons: Role of noise and signal autocorrelation, Journal of Physiology-Paris 98 (4) (2004) 417–428. doi:10.1016/j.jphysparis.2005.09.009.
 (38) R. Han, J. Wang, H. Yu, B. Deng, X. Wei, Y. Qin, H. Wang, Intrinsic excitability state of local neuronal population modulates signal propagation in feedforward neural networks, Chaos: An Interdisciplinary Journal of Nonlinear Science 25 (4) (2015) 043108. doi:10.1063/1.4917014.
 (39) J. Barral, X.-J. Wang, A. Reyes, Propagation of spike timing and firing rate in feedforward networks reconstituted in vitro, bioRxiv. doi:10.1101/151134.
 (40) Wikipedia contributors, Neuron — Wikipedia, the free encyclopedia (2018). [Online; accessed 26 August 2018]. URL https://en.wikipedia.org/wiki/Neuron
 (41) A. Shrivastava, A. Triller, W. Sieghart, GABAA receptors: Post-synaptic co-localization and cross-talk with other receptors, Frontiers in Cellular Neuroscience 5 (2011) 7. doi:10.3389/fncel.2011.00007.
 (42) S. Kantamneni, Cross-talk and regulation between glutamate and GABAB receptors, Frontiers in Cellular Neuroscience 9 (2015) 135. doi:10.3389/fncel.2015.00135.
 (43) A. L. Hodgkin, A. F. Huxley, Currents carried by sodium and potassium ions through the membrane of the giant axon of Loligo, The Journal of Physiology 116 (4) (1952) 449–472. doi:10.1113/jphysiol.1952.sp004717.
 (44) D. Hansel, G. Mato, C. Meunier, Phase dynamics for weakly coupled Hodgkin-Huxley neurons, EPL (Europhysics Letters) 23 (5) (1993) 367. doi:10.1209/0295-5075/23/5/011.
 (45) X.-J. Wang, G. Buzsáki, Gamma oscillation by synaptic inhibition in a hippocampal interneuronal network model, Journal of Neuroscience 16 (20) (1996) 6402–6413. doi:10.1523/JNEUROSCI.16-20-06402.1996.
 (46) S. Wang, W. Wang, F. Liu, Propagation of firing rate in a feedforward neuronal network, Phys. Rev. Lett. 96 (2006) 018103. doi:10.1103/PhysRevLett.96.018103.
 (47) S. J. Thorpe, D. Fize, C. Marlot, Speed of processing in the human visual system, Nature 381 (1996) 520–522. doi:10.1038/381520a0.
 (48) S. J. Thorpe, A. Delorme, R. van Rullen, Spike-based strategies for rapid processing, Neural Networks 14 (6-7) (2001) 715–725. doi:10.1016/S0893-6080(01)00083-1.
 (49) R. van Rullen, S. J. Thorpe, Rate coding versus temporal order coding: What the retinal ganglion cells tell the visual cortex, Neural Computation 13 (2001) 1255–1283. doi:10.1162/08997660152002852.
 (50) T. Gollisch, M. Meister, Rapid neural coding in the retina with relative spike latencies, Science 319 (5866) (2008) 1108–1111. doi:10.1126/science.1149639.
 (51) B. A. Olshausen, D. J. Field, Sparse coding of sensory inputs, Current Opinion in Neurobiology 14 (4) (2004) 481–487. doi:10.1016/j.conb.2004.07.007.