Period-halving Bifurcation of a Neuronal Recurrence Equation

10/17/2011 ∙ by René Ndoundam, et al.

We study the sequences generated by neuronal recurrence equations of the form x(n) = 1[∑_{j=1}^{h} a_j x(n-j) - θ]. From a neuronal recurrence equation of memory size h that describes a cycle of length ρ(m) × lcm(p_0, p_1, ..., p_{ρ(m)-1}), we construct a set of ρ(m) neuronal recurrence equations whose dynamics describe, respectively, a transient of length O(ρ(m) × lcm(p_0, ..., p_d)) and a cycle of length O(ρ(m) × lcm(p_{d+1}, ..., p_{ρ(m)-1})) if 0 ≤ d ≤ ρ(m)-2, and of length 1 if d = ρ(m)-1. This result establishes that a neuronal recurrence equation can take exponential time to converge to a fixed point, and that a period-halving bifurcation exists.


1 Introduction

Caianiello and De Luca [3] have suggested that the dynamic behavior of a single neuron with memory, which does not interact with other neurons, can be modeled by the following recurrence equation:

x(n) = 1[∑_{j=1}^{h} a_j x(n-j) - θ],   n ≥ h   (1)

where:

  • x(n) is a variable representing the state of the neuron at time n.

  • x(0), x(1), ..., x(h-1) are the initial states.

  • h is the memory length, i.e., the state of the neuron at time n depends on the states assumed by the neuron at the h previous steps n-1, n-2, ..., n-h.

  • a_j (1 ≤ j ≤ h) are real numbers called the weighting coefficients. More precisely, a_j represents the influence of the state of the neuron at time n-j on the state assumed by the neuron at time n.

  • θ is a real number called the threshold.

  • 1[u] = 0 if u < 0, and 1[u] = 1 if u ≥ 0.
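
As a minimal illustration, the recurrence of Equation (1) can be simulated in a few lines of Python; the numerical parameter values below are arbitrary and serve only as an example:

    def heaviside(u):
        # 1[u] = 1 if u >= 0, and 0 otherwise
        return 1 if u >= 0 else 0

    def simulate(a, theta, initial, steps):
        # a: weighting coefficients (a_1, ..., a_h), theta: threshold,
        # initial: the h initial states x(0), ..., x(h-1)
        h = len(a)
        x = list(initial)
        for n in range(h, steps):
            s = sum(a[j - 1] * x[n - j] for j in range(1, h + 1))
            x.append(heaviside(s - theta))
        return x

    # Example run with arbitrary parameters (memory length h = 3)
    print(simulate(a=[0.5, -0.25, 1.0], theta=0.6, initial=[1, 0, 1], steps=20))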

The system obtained by interconnecting several neurons is called a neural network. These networks were introduced by McCulloch and Pitts [7], and are quite powerful: neural networks can simulate any sequential machine, or any Turing machine if an infinite number of cells is provided. Neural networks have been studied extensively as tools for solving various problems such as classification, speech recognition, and image processing [19]. The field of application of threshold functions is large [1, 9, 10, 19]. The spin moment of a spin glass system is one of the most frequently cited examples in solid-state physics that has been simulated by neural networks.


Neural networks are usually implemented using electronic components or simulated in software on a digital computer. One way in which the collective properties of a neural network may be used to implement a computational task is by way of the concept of energy minimization. The Hopfield network is a well-known example of such an approach. It has attracted great attention in the literature as a content-addressable memory [2].
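
To make the energy-minimization idea concrete, the following generic sketch (independent of the construction studied in this paper, with randomly chosen symmetric weights) runs asynchronous threshold updates of a Hopfield-style network; with symmetric weights and zero diagonal, such updates never increase the energy E(s) = -(1/2)·sᵀWs + θ·s, so the network settles into a fixed point that can act as a stored pattern:

    import numpy as np

    def hopfield_pass(W, theta, s):
        # One asynchronous sweep: each unit takes the value that does not
        # increase the energy E(s) = -0.5 * s @ W @ s + theta @ s.
        for i in range(len(s)):
            s[i] = 1.0 if W[i] @ s - theta[i] >= 0 else 0.0
        return s

    rng = np.random.default_rng(0)
    n = 8
    A = rng.normal(size=(n, n))
    W = (A + A.T) / 2.0           # symmetric weights
    np.fill_diagonal(W, 0.0)      # no self-coupling
    theta = np.zeros(n)
    s = rng.integers(0, 2, size=n).astype(float)
    for _ in range(20):
        s = hopfield_pass(W, theta, s)
    print(s)                      # typically a fixed point (a local minimum of E)
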
Given a finite neural network, the configuration assumed by the system at time t is ultimately periodic. As a consequence, there is an integer p, called the period (or the length of a cycle), and another integer T, called the transient length, such that:

x(n + p) = x(n)

where n ≥ T. The period and the transient length of the generated sequences are good measures of the complexity of the neuron. A bifurcation occurs when a small smooth change made to the parameter values (the bifurcation parameters) of a system causes a sudden 'qualitative' or topological change in its behaviour. A period-halving bifurcation in a dynamical system is a bifurcation in which the system switches to a new behaviour with half the period of the original system.

A great variety of results have been established on recurrence equations modeling neurons with memory [1, 4, 5, 6, 11, 12, 14, 15, 20]. However, some mathematical properties are still very intriguing and many problems remain open. For example, the question remains as to whether there exists a neuronal recurrence equation with transients of exponential length [18]. In [13], we give a positive answer to this question by exhibiting a neuronal recurrence equation that generates a sequence whose transient length and period are both exponential with respect to the memory length. Despite this positive answer, one question remains: does there exist a neuronal recurrence equation with an exponential transient length that converges to a fixed point?
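
Because the next state in Equation (1) depends only on the h most recent values, the sequence of memory vectors (x(n-h), ..., x(n-1)) must eventually repeat, which is what makes the configuration ultimately periodic. A small sketch along these lines, operating on a sufficiently long 0/1 prefix of the generated sequence, recovers the transient length and the period:

    def transient_and_period(x, h):
        # Scan the memory vectors (x(n-h), ..., x(n-1)); the first vector
        # that reappears marks the entry point of the cycle.
        seen = {}
        for n in range(h, len(x) + 1):
            state = tuple(x[n - h:n])
            if state in seen:
                transient = seen[state] - h   # terms produced before the cycle
                period = n - seen[state]      # length of the cycle
                return transient, period
            seen[state] = n
        return None  # the prefix was too short to close the cycle

    # Example: an eventually periodic 0/1 sequence with memory length h = 2
    x = [1, 1, 0, 1, 0, 1, 0, 1, 0, 1]
    print(transient_and_period(x, h=2))   # -> (1, 2)
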
In this work, from a neuronal recurrence equation of memory size h whose dynamics contain a cycle of length ρ(m) × lcm(p_0, p_1, ..., p_{ρ(m)-1}), we build a set of ρ(m) neuronal recurrence equations whose dynamics describe respectively:

  • a transient of length O(ρ(m) × lcm(p_0, ..., p_d)), if 0 ≤ d ≤ ρ(m)-1;

  • a cycle of length O(ρ(m) × lcm(p_{d+1}, ..., p_{ρ(m)-1})) if 0 ≤ d ≤ ρ(m)-2, and of length 1 if d = ρ(m)-1.

Thus, we give a positive answer to the preceding question.
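
As a toy numerical illustration of how the cycle lengths in such a family collapse down to a fixed point, one can tabulate ρ(m) × lcm(p_{d+1}, ..., p_{ρ(m)-1}) for an assumed list of primes; the primes below are placeholders chosen only for the example, not the ones used in the construction:

    from math import lcm

    primes = [3, 5, 7, 11]       # placeholder p_0, ..., p_{rho(m)-1}, so rho(m) = 4
    rho = len(primes)

    for d in range(rho):
        if d == rho - 1:
            cycle = 1            # the last equation of the family reaches a fixed point
        else:
            cycle = rho * lcm(*primes[d + 1:])
        print(d, cycle)
    # d = 0: 4*lcm(5, 7, 11) = 1540;  d = 1: 4*lcm(7, 11) = 308;
    # d = 2: 4*11 = 44;  d = 3: 1 (fixed point)
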
The technique used in this paper to obtain the period-halving bifurcation is to modify some parameters (weighting coefficients and threshold) of the neuronal recurrence equation. This technique relies on control theory: controllability refers to the possibility of forcing the system into a particular state by applying an appropriate control signal.

The paper is organized as follows: in Section 2, some previous results are presented. Section 3 presents some preliminaries. Section 4 is devoted to the construction of the neuronal recurrence equations. Section 5 deals with the behavior of these neuronal recurrence equations. Concluding remarks are stated in Section 6.

2 Previous Results

The only previous study of bifurcation for such equations is due to Cosnard and Goles [6], who studied the bifurcation of the neuronal recurrence equation in two particular cases:
Case 1: geometric coefficients and bounded memory
Cosnard and Goles completely described the structure of the bifurcation of Equation (1) with geometric weighting coefficients and bounded memory when the threshold θ varies. They showed that the associated rotation number is an increasing function of the parameter θ.
Case 2: geometric coefficients and unbounded memory
Cosnard and Goles completely described the structure of the bifurcation of the analogous equation with geometric coefficients and unbounded memory, in which the sum runs over all past states, when the threshold θ varies. They showed that the associated rotation number, as a function of θ, is a devil's staircase.

From line 11 to line 15 of page 15 in [5], it is written: “This shows that, if there is a neuronal recurrence equation with memory length h that generates k sequences of periods T_1, T_2, ..., T_k, then there is a neuronal recurrence equation with memory length k × h that generates a sequence of period k × lcm(T_1, T_2, ..., T_k), where lcm denotes the least common multiple.” This allows us to state the following fundamental lemma on the composition of neuronal recurrence equations:

Lemma 1

[5]
If there is a neuronal recurrence equation with memory length h that generates k sequences of periods T_1, T_2, ..., T_k, then there is a neuronal recurrence equation with memory length k × h that generates a sequence of period k × lcm(T_1, T_2, ..., T_k).
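
For instance, with k = 2 and two sequences of periods 3 and 5 generated by an equation of memory length h, Lemma 1 yields a neuronal recurrence equation of memory length 2h that generates a sequence of period 2 × lcm(3, 5) = 30.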


Lemma 1 does not take the transient length into account. One can amend Lemma 1 to obtain the following lemma:

Lemma 2

[13, 14] If there is a neuronal recurrence equation with memory length h that generates k sequences of transient lengths T_1, ..., T_k and of periods P_1, ..., P_k, then there is a neuronal recurrence equation with memory length k × h that generates a sequence of transient length k × max(T_1, ..., T_k) and of period k × lcm(P_1, ..., P_k).

In the following example, we will show that Lemma 1 and Lemma 2 are incomplete.

Example 1:
Let us suppose that the neuronal recurrence equation defined by Equation (1) generates six sequences

(2)

of periods

(3)

It is clear that each sequence defined by Equation (2) is a fixed point. We present two different cases of evolution.
First case:
We suppose that

(4)
(5)

It is easy to verify that the shuffle of the sequences defined by Equations (4) and (5) is

(6)

The sequence defined by Equation (6) describes a period of length . By application of Lemma 1, the period of the sequence defined by Equation (6) should be  (more precisely ).
Second case:
We suppose that

(7)
(8)

It is easy to verify that the shuffle of the sequences defined by Equations (7) and (8) is

(9)

The sequence defined by Equation (9) describes a period of length . By application of Lemma 1, the period of the sequence defined by Equation (9) should be  (more precisely ).
The first case and the second case of Example 1 show that Lemma 1 and Lemma 2 do not cover all the cases.
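
The phenomenon can be reproduced with a small sketch; the fixed-point values below, all-zeros and all-ones, are placeholders and not the actual sequences of Equations (2)-(9). Shuffling two distinct fixed points gives the period 2 = k × lcm(1, 1) predicted by Lemma 1, whereas shuffling two identical fixed points gives period 1:

    def shuffle(y, z):
        # Interleave two sequences: w(2n) = y(n), w(2n + 1) = z(n)
        w = []
        for u, v in zip(y, z):
            w.extend([u, v])
        return w

    def period(w):
        # Smallest p such that w(n + p) = w(n) on the observed window
        for p in range(1, len(w)):
            if all(w[n + p] == w[n] for n in range(len(w) - p)):
                return p
        return len(w)

    zeros, ones = [0] * 8, [1] * 8
    print(period(shuffle(zeros, ones)))    # 2 = k * lcm(1, 1)
    print(period(shuffle(zeros, zeros)))   # 1: the shuffle of identical fixed
                                           # points is itself a fixed point
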
One can amend Lemma 1 as follows:

Lemma 3

If there is a neuronal recurrence equation with memory length h that generates k sequences of periods T_1, ..., T_k, then there is a neuronal recurrence equation with memory length k × h that generates a sequence whose period is defined as follows:
First case: such that

Second case: ; .

The improvement of Lemma 1 does not affect the main results about periods obtained in the papers [5, 11, 12, 14, 15], because all these results consider only the case where the periods of the sequences are greater than or equal to 2.
We can also amend Lemma 2 as follows:

Lemma 4

If there is a neuronal recurrence equation with memory length h that generates k sequences of transient lengths T_1, ..., T_k and of periods P_1, ..., P_k, then there is a neuronal recurrence equation with memory length k × h that generates a sequence whose transient length and period are defined as follows:
First case: such that

Second case: ; .

3 Preliminaries

Let h be a positive integer. For a vector of weighting coefficients (a_1, ..., a_h), a real number θ, and a vector of initial states (x(0), ..., x(h-1)), we define the sequence (x(n)) by the following recurrence:

x(n) = 1[∑_{j=1}^{h} a_j x(n-j) - θ],   n ≥ h   (10)

We denote by (x(n)) the sequence generated by Equation (10), and we consider its period and its transient length.

Let be a positive integer, we denote the cardinality of the set by . Let us denote by the prime numbers belonging to the set , the sequence is defined as .
We also suppose that:

(11)

Subsequently, we consider only the integers such that .
It is easy to check that contains at most odd integers. It follows that

(12)

We set and , we define :

From the previous definitions, we have .
It is clear that

This implies that

Therefore

(13)

, we want to construct a neuronal recurrence equation with memory of length which evolves as follows:

(14)

and which describes a cycle of length .
, let be the vector defined by

(15)

In other words, is defined by:

We define the neuronal recurrence equation by the following recurrence:

(16)

where is defined as follows:

First case: is even and

(17)

Second case: is odd, and

(18)

We also define:

(19)
(20)
(21)
(22)
(23)
(24)
(25)

By definition represents the set of indices such that
From the definition of and from Equation (15), one can easily verify that

(26)
(27)

, we also denote the set of indices such that , in other words:

and , we denote:

The neuronal recurrence equation with memory of length is defined by Equations (15) and (16).
We will show that the neuronal recurrence equation evolves as specified in Equation (14).
In the following proposition, we present an important property.

Proposition 1

[14] and

The following proposition characterizes the sum of the interaction coefficients when

Proposition 2

, we have

The following lemma characterizes the evolution of the sequence at time .

Lemma 5

From Lemma 5 and Equation (15), it is easy to verify that

(28)

From the definition of , from Equation (15), from Equation (28) and from Lemma 5, we can easily check that:

(29)

The values of the sequence at time are given by the following lemma.

Lemma 6

It is easy to verify that , we have:

Lemma 7

Lemma 8

, and

In order to present some properties of the sequence , we introduce the following notation:

Notation 1

Let us define as:

and let be a strictly negative real number such that:


Lemma 9

such that and ,

Let be the sequence whose first terms are defined as follows:

(30)

and the other terms are generated by the following neuronal recurrence equation:

(31)
Remark 1

The term is equal to ; this implies that is equal to .

The parameters and used in neuronal recurrence Equation (31) are those defined in Equations (17), (18) and (24).

The following lemma, which is easy to prove, characterizes the evolution of the sequence .

Lemma 10

In the evolution of the sequence , we have:
(a) ,
(b) ,
(c) The sequence describes a transient of length and a fixed point.

The instability of the sequence occurs as a result of the convergence of the sequence to .

Notation 2

. is the length of the memory of some neuronal recurrence equations.

Let us also note:

(32)
(33)
(34)

, and represent the periods of some neuronal recurrence equations.

Let be the sequence whose first terms are defined as follows:

(35)

and the other terms are generated by the following neuronal recurrence equation:

(36)

where

(37)
(38)

The parameters are those defined in Equations (17) and (18). The parameters and are defined in Equations (24) and (25).

Remark 2

(a) The first terms of the sequence are obtained by shuffling the terms of each subsequence where .
(b) The neuronal recurrence equation (36) is obtained by applying the construction of Lemma 1 to the neuronal recurrence Equation (16) whose parameters are given in Equations (17), (18), (24) and (25).

From the fact that the sequence is the shuffle of the subsequences and from its construction, we can write:

Lemma 11

such that with and , we have:

The next lemma gives the period of the sequences .

Lemma 12

The sequence describes a cycle of length .

such that , we denote by the sequence whose first terms are defined as:

(39)

and

(40)

The first terms of the sequence are obtained by shuffling the terms of each of the sequences:

(41)

and

(42)

where:

(43)

The other terms of the sequence are generated by the following neuronal recurrence equation:

(44)

The next lemma gives the period of the sequence .

Lemma 13

The sequence generates a transient of length and a cycle of length .

Notation 3

Let us define as:

Remark 3

On the basis of the composition of automata [5] and the definition of , we can conclude that:

  • , and

  • .

3.1 Results on the dynamics of sequences and

In this subsection, we recall and give some results on the dynamics of the sequences and .
The following lemma characterizes the sequence