# Period-halving Bifurcation of a Neuronal Recurrence Equation

We study the sequences generated by neuronal recurrence equations of the form x(n) = 1[∑_{j=1}^{h} a_j x(n−j) − θ]. From a neuronal recurrence equation of memory size h which describes a cycle of length ρ(m) × lcm(p_0, p_1, …, p_{−1+ρ(m)}), we construct a set of ρ(m) neuronal recurrence equations whose dynamics describe, respectively, the transient of length O(ρ(m) × lcm(p_0, …, p_d)) and the cycle of length O(ρ(m) × lcm(p_{d+1}, …, p_{−1+ρ(m)})) if 0 ≤ d ≤ −2+ρ(m), and 1 if d = −1+ρ(m). This result establishes the exponential convergence time of neuronal recurrence equations to fixed points and the existence of a period-halving bifurcation.


## 1 Introduction

Caianiello and De Luca [3] suggested that the dynamic behavior of a single neuron with memory, which does not interact with other neurons, can be modeled by the following recurrence equation:

 x(n) = 1[∑_{j=1}^{k} a_j x(n−j) − θ] (1)

where:

• x(n) is a variable representing the state of the neuron at time n.

• x(0), …, x(k−1) are the initial states.

• k is the memory length, i.e., the state of the neuron at time n depends on the states assumed by the neuron at the k previous steps n−1, …, n−k.

• a_j (1 ≤ j ≤ k) are real numbers called the weighting coefficients. More precisely, a_j represents the influence of the state of the neuron at time n−j on the state assumed by the neuron at time n.

• θ is a real number called the threshold.

• 1[u] = 0 if u < 0, and 1[u] = 1 if u ≥ 0.
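As an aside, Equation (1) is easy to simulate directly. The sketch below uses illustrative weights, threshold, and initial states chosen for this note, not values taken from the paper.

```python
def heaviside(u):
    """1[u]: returns 0 if u < 0, and 1 if u >= 0."""
    return 1 if u >= 0 else 0

def neuronal_sequence(a, theta, phi, length):
    """First `length` terms of x(n) = 1[sum_{j=1}^{k} a_j x(n-j) - theta].

    a     -- weighting coefficients (a_1, ..., a_k)
    theta -- threshold
    phi   -- initial states x(0), ..., x(k-1)
    """
    k = len(a)
    x = list(phi)
    for n in range(k, length):
        s = sum(a[j - 1] * x[n - j] for j in range(1, k + 1))
        x.append(heaviside(s - theta))
    return x

# k = 3, all weights 1, theta = 2: the neuron fires iff at least two of
# its three previous states are 1.
seq = neuronal_sequence([1, 1, 1], 2, [1, 1, 0], 12)
```

With these choices the sequence reaches the fixed point 1 after a short transient.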

The system obtained by interconnecting several neurons is called a neural network. These networks were introduced by McCulloch and Pitts [7], and are quite powerful: neural networks can simulate any sequential machine, or any Turing machine if an infinite number of cells is provided. Neural networks have been studied extensively as tools for solving various problems such as classification, speech recognition, and image processing [19]. The field of application of threshold functions is large [1, 9, 10, 19]. The spin moment of the spin glass system is one of the most cited examples in solid-state physics that has been simulated by neural networks.

Neural networks are usually implemented with electronic components or simulated in software on a digital computer. One way in which the collective properties of a neural network may be used to implement a computational task is via the concept of energy minimization. The Hopfield network is a well-known example of such an approach; it has attracted great attention in the literature as a content-addressable memory [2].
Given a finite neural network, the configuration assumed by the system at time t is ultimately periodic. As a consequence, there is an integer p called the period (or the length of a cycle) and another integer T called the transient length such that:

 x(n + p) = x(n),  ∀ n ≥ T

where p and T are the smallest integers with this property. The period and the transient length of the generated sequences are good measures of the complexity of the neuron. A bifurcation occurs when a small smooth change made to the parameter values (the bifurcation parameters) of a system causes a sudden 'qualitative' or topological change in its behaviour. A period-halving bifurcation in a dynamical system is a bifurcation in which the system switches to a new behaviour with half the period of the original system. A great variety of results have been established on recurrence equations modeling neurons with memory [1, 4, 5, 6, 11, 12, 14, 15, 20]. However, some mathematical properties are still very intriguing and many problems remain open. For example, the question was raised whether there exists a neuronal recurrence equation with transients of exponential length [18]. In [13], we gave a positive answer to this question by exhibiting a neuronal recurrence equation with memory which generates a sequence of exponential transient length and exponential period length with respect to the memory length. Despite this positive answer, one question remains: does there exist a neuronal recurrence equation with exponential transient length that converges to a fixed point?
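Since the state of a memory-k recurrence at time n is the window of its k most recent values, the transient length and the period can be read off from the first repeated window. A brute-force sketch, applied to an illustrative hard-coded 0/1 sequence:

```python
def transient_and_period(seq, k):
    """(transient, period) of a sequence generated by a memory-k recurrence.

    The state at time n is the window (seq[n], ..., seq[n+k-1]); the first
    repeated window marks the entry into the cycle."""
    seen = {}
    for n in range(len(seq) - k + 1):
        window = tuple(seq[n:n + k])
        if window in seen:
            return seen[window], n - seen[window]  # (transient, period)
        seen[window] = n
    raise ValueError("no repetition found; generate more terms")

# An illustrative sequence that settles on the fixed point 1: the window
# (1, 1, 1) first appears at n = 3 and repeats at n = 4, so the transient
# is 3 and the period is 1.
result = transient_and_period([1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1], 3)
```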
In this work, from a neuronal recurrence equation of memory size h whose dynamics contains a cycle of length ρ(m) × lcm(p_0, p_1, …, p_{−1+ρ(m)}), we build a set of ρ(m) neuronal recurrence equations whose dynamics describe respectively:

• the transient of length O(ρ(m) × lcm(p_0, …, p_d)), if 0 ≤ d ≤ −1+ρ(m);

• the cycle of length O(ρ(m) × lcm(p_{d+1}, …, p_{−1+ρ(m)})) if 0 ≤ d ≤ −2+ρ(m), and 1 if d = −1+ρ(m).

Thus, we give a positive answer to the preceding question.
The technique used in this paper to obtain the period-halving bifurcation is to modify some parameters (weighting coefficients and threshold) of the neuronal recurrence equation. This technique relies on control theory: controllability is related to the possibility of forcing a system into a particular state by applying an appropriate control signal.

The paper is organized as follows: in Section 2, some previous results are presented. Section 3 presents some preliminaries. Section 4 is devoted to the construction of the neuronal recurrence equations. Section 5 deals with their behavior. Concluding remarks are stated in Section 6.

## 2 Previous Results

The only study of bifurcation of neuronal recurrence equations was done by Cosnard and Goles [6], who studied two particular cases:
Case 1: geometric coefficients and bounded memory.
Cosnard and Goles completely described the structure of the bifurcation of the equation

 x_{n+1} = 1[θ − ∑_{i=0}^{k−1} b^i x_{n−i}]

as θ varies. They showed that the associated rotation number is an increasing function of the parameter θ.
Case 2: geometric coefficients and unbounded memory.
Cosnard and Goles completely described the structure of the bifurcation of the equation

 x_{n+1} = 1[θ − ∑_{i=0}^{n} b^i x_{n−i}]

as θ varies. They showed that the associated rotation number is a devil's staircase.

From line 11 to line 15 of page 15 in [5], it is written: “This shows that, if there is a neuronal recurrence equation with memory length h that generates sequences of periods p_1, …, p_r, then there is a neuronal recurrence equation with memory length h × r that generates a sequence of period r × lcm(p_1, …, p_r), where lcm denotes the least common multiple.” This allows us to state the following fundamental lemma on the composition of neuronal recurrence equations:

###### Lemma 1

[5]
If there is a neuronal recurrence equation with memory length h that generates sequences of periods p_1, …, p_r, then there is a neuronal recurrence equation with memory length h × r that generates a sequence of period r × lcm(p_1, …, p_r).
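The period arithmetic of Lemma 1 can be checked numerically: interleaving r purely periodic sequences of periods p_1, …, p_r gives a sequence of period r × lcm(p_1, …, p_r) when some p_i ≥ 2. A sketch with two illustrative sequences of periods 2 and 3:

```python
from math import lcm

def minimal_period(seq, max_p):
    """Smallest p with seq[n + p] == seq[n] for all valid n
    (the input is assumed purely periodic)."""
    for p in range(1, max_p + 1):
        if all(seq[n + p] == seq[n] for n in range(len(seq) - p)):
            return p
    return None

s0 = [0, 1] * 30       # period 2
s1 = [0, 0, 1] * 20    # period 3

# Shuffle with r = 2: position r*t + i of y holds the t-th term of sequence i.
y = [s[t] for t in range(60) for s in (s0, s1)]
```

Here `minimal_period(y, 60)` returns 12, which is r × lcm(2, 3).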

Lemma 1 does not take into account the transient length. One can amend Lemma 1 to obtain the following lemma:

###### Lemma 2

[13, 14] If there is a neuronal recurrence equation with memory length h that generates r sequences of transient lengths T_1, …, T_r and of periods p_1, …, p_r, then there is a neuronal recurrence equation with memory length h × r that generates a sequence of transient length r × max(T_1, …, T_r) and of period r × lcm(p_1, …, p_r).

In the following example, we will show that Lemma 1 and Lemma 2 are incomplete.

Example 1:
Let us suppose that neuronal recurrence equations of the form of Equation (1) generate six sequences

 {x_i(n) : n ≥ 0},  0 ≤ i ≤ 5 (2)

of periods

 p_i = 1,  0 ≤ i ≤ 5 (3)

It is clear that each sequence defined by Equation (2) is a fixed point. We present two different cases of evolution.
First case:
We suppose that

 x_{2i}(n) = 0, ∀ n, i such that n ≥ 0 and 0 ≤ i ≤ 2 (4)
 x_{2i+1}(n) = 1, ∀ n, i such that n ≥ 0 and 0 ≤ i ≤ 2 (5)

It is easy to verify that the shuffle of the sequences defined by Equations (4) and (5) is

 x_0(0) x_1(0) … x_5(0) x_0(1) x_1(1) … x_5(1) ⋯ x_0(i) x_1(i) … x_5(i) ⋯ = 010101010101010101 ⋯ 010101 ⋯ (6)

The sequence defined by Equation (6) describes a period of length 2. By application of Lemma 1, the period of the sequence defined by Equation (6) should be 6 (more precisely, 6 = 6 × lcm(1, …, 1)).
Second case:
We suppose that

 x_i(n) = 0, ∀ n, i such that n ≥ 0 and i ∈ {0, 1, 3, 4} (7)
 x_i(n) = 1, ∀ n, i such that n ≥ 0 and i ∈ {2, 5} (8)

It is easy to verify that the shuffle of the sequences defined by Equations (7) and (8) is

 x_0(0) x_1(0) … x_5(0) x_0(1) x_1(1) … x_5(1) ⋯ x_0(i) x_1(i) … x_5(i) ⋯ = 001001001001001001 ⋯ 001001 ⋯ (9)

The sequence defined by Equation (9) describes a period of length 3. By application of Lemma 1, the period of the sequence defined by Equation (9) should be 6 (more precisely, 6 = 6 × lcm(1, …, 1)).
The first and second cases of Example 1 show that Lemma 1 and Lemma 2 do not cover all the cases.
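Example 1 can be verified mechanically: shuffling the six fixed-point sequences of the two cases yields periods 2 and 3 respectively, both proper divisors of r = 6, rather than the value 6 predicted by a literal reading of Lemma 1.

```python
def minimal_period(seq, max_p):
    """Smallest p with seq[n + p] == seq[n] for all valid n
    (purely periodic input assumed)."""
    for p in range(1, max_p + 1):
        if all(seq[n + p] == seq[n] for n in range(len(seq) - p)):
            return p
    return None

def shuffle(seqs, t_max):
    """Interleave: output position r*t + i holds seqs[i][t]."""
    return [s[t] for t in range(t_max) for s in seqs]

# First case, Equations (4)-(5): x_{2i}(n) = 0 and x_{2i+1}(n) = 1.
case1 = shuffle([[i % 2] * 20 for i in range(6)], 20)                     # 010101...
# Second case, Equations (7)-(8): x_i(n) = 1 exactly for i in {2, 5}.
case2 = shuffle([[1 if i in (2, 5) else 0] * 20 for i in range(6)], 20)   # 001001...
```

Both observed periods divide r = 6, in line with the "divisor of r" case of Lemma 3 below.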
One can amend Lemma 1 as follows:

###### Lemma 3

If there is a neuronal recurrence equation with memory length h that generates r sequences of periods p_1, …, p_r, then there is a neuronal recurrence equation with memory length h × r that generates a sequence of period Per, where Per is defined as follows:
First case: ∃ i, 1 ≤ i ≤ r, such that p_i ≥ 2:

 Per = r × lcm(p_1, ⋯, p_r).

Second case: ∀ i, 1 ≤ i ≤ r: p_i = 1.

 Per is a divisor of r.

The improvement of Lemma 1 does not modify the main results about periods obtained in the papers [5, 11, 12, 14, 15], because all these results consider only the case where the periods of the sequences are greater than or equal to 2.
We can also amend Lemma 2 as follows:

###### Lemma 4

If there is a neuronal recurrence equation with memory length h that generates g sequences of transient lengths T_1, …, T_g and of periods p_1, …, p_g, then there is a neuronal recurrence equation with memory length h × g that generates a sequence of some transient length and of period Per, where Per is defined as follows:
First case: ∃ i, 1 ≤ i ≤ g, such that p_i ≥ 2:

 Per = g × lcm(p_1, ⋯, p_g).

Second case: ∀ i, 1 ≤ i ≤ g: p_i = 1.

 Per is a divisor of g.

## 3 Preliminaries

Let k be a positive integer. For a vector a = (a_1, …, a_k) ∈ R^k, a real number θ, and a vector ϕ = (ϕ(0), …, ϕ(k−1)) ∈ {0,1}^k, we define the sequence {x(t) : t ≥ 0} by the following recurrence:

 x(t) = ϕ(t), if t ∈ {0, …, k−1};  x(t) = 1[∑_{i=1}^{k} a_i x(t−i) − θ], if t ≥ k (10)

We denote by {x(t)} the sequence generated by Equation (10), by Per(a, θ, ϕ) its period, and by Tra(a, θ, ϕ) its transient length.

Let m be a positive integer. We denote by ρ(m) the cardinality of the set of prime numbers belonging to {2m+1, …, 3m−1}, and we denote these prime numbers by p_0, p_1, …, p_{−1+ρ(m)}.
We also suppose that:

 p_{−1+ρ(m)} < p_{−2+ρ(m)} < ⋯ < p_{i+1} < p_i < ⋯ < p_1 < p_0 (11)

Subsequently, we consider only the integers m such that ρ(m) ≥ 1.
It is easy to check that {2m+1, …, 3m−1} contains at most ⌈(m−1)/2⌉ odd integers. It follows that

 ρ(m) ≤ ⌈(m−1)/2⌉ (12)
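Assuming the interval in question is {2m+1, …, 3m−1} (consistent with the inequalities used below), the bound (12) can be checked numerically: the interval has m−1 elements, at most ⌈(m−1)/2⌉ of them are odd, and every prime larger than 2 is odd.

```python
from math import ceil

def is_prime(n):
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def rho(m):
    """Number of primes in {2m+1, ..., 3m-1} (assumed reading of rho(m))."""
    return sum(1 for n in range(2 * m + 1, 3 * m) if is_prime(n))

# For m = 5 the interval is {11, 12, 13, 14}, containing the primes 13 and 11,
# so rho(5) = 2 and p_0 = 13 > p_1 = 11 as in ordering (11).
```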

We set α_i = 3m − p_i (so that 3m − α_i = p_i) and k = (6m−1) × ρ(m); we define:

 μ(m, α_i) = ⌊ k / (3m−α_i) ⌋
 β(m, α_i) = k − ((3m−α_i) μ(m, α_i))

From the previous definitions, we have 0 ≤ β(m, α_i) < 3m−α_i.
It is clear that

 2m+1 ≤ 3m−α_i ≤ 3m−1

This implies that

 (6m−1)ρ(m)/(3m−1) ≤ k/(3m−α_i) ≤ (6m−1)ρ(m)/(2m+1)

Therefore

 2ρ(m) ≤ μ(m, α_i) ≤ 3ρ(m) (13)
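A quick numerical check of the bounds (13), under the reading α_i = 3m − p_i (so that 3m − α_i = p_i) and k = (6m−1) × ρ(m); the value m = 5 is an illustrative choice.

```python
m = 5
primes = [13, 11]            # primes in {2m+1, ..., 3m-1} = {11, ..., 14}, decreasing
rho_m = len(primes)          # rho(5) = 2
k = (6 * m - 1) * rho_m      # k = 29 * 2 = 58

mus = []
for p in primes:             # p = p_i = 3m - alpha_i
    mu = k // p              # mu(m, alpha_i) = floor(k / (3m - alpha_i))
    beta = k - p * mu        # beta(m, alpha_i), the remainder
    assert 0 <= beta < p
    assert 2 * rho_m <= mu <= 3 * rho_m   # the bounds (13)
    mus.append(mu)
```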

∀ i, 0 ≤ i ≤ −1+ρ(m), we want to construct a neuronal recurrence equation with memory of length k which evolves as follows:

 0 0 … 0 (β(m,α_i) zeros)  1 0 0 … 0  1 0 0 … 0  ⋯  1 0 0 … 0 ⋯  (blocks of length 3m−α_i) (14)

and which describes a cycle of length 3m−α_i = p_i.
∀ i, let ϕ_{α_i} ∈ {0,1}^k be the vector defined by

 ϕ_{α_i}(0) … ϕ_{α_i}(k−1) = 0 … 0 (β(m,α_i) zeros)  1 0 … 0  ⋯  1 0 … 0  (μ(m,α_i) blocks of length p_i) (15)

In other words, ϕ_{α_i} is defined by:

 ϕ_{α_i}(j) = 1 if ∃ ℓ, 0 ≤ ℓ ≤ μ(m,α_i)−1, such that j = β(m,α_i) + ℓ p_i; ϕ_{α_i}(j) = 0 otherwise.

We define the neuronal recurrence equation {x_{α_i}(n)} by the following recurrence:

 x_{α_i}(n) = 1[∑_{j=1}^{k} ¯a_j x_{α_i}(n−j) − ¯θ],  n ≥ k (16)

where ¯a_j is defined as follows:

First case: ρ(m) is even and 0 ≤ i ≤ −1+ρ(m):

 ¯a_j = 2 if j ∈ Pos(α_i) and j ≤ (3 × ρ(m) × p_i)/2,
 ¯a_j = −2 if j ∈ Pos(α_i) and j > (3 × ρ(m) × p_i)/2,
 ¯a_j = 0 otherwise. (17)

Second case: ρ(m) is odd and 0 ≤ i ≤ −1+ρ(m):

 ¯a_j = 2 if j ∈ Pos(α_i) and j ≤ ((3ρ(m)−1)/2) × p_i,
 ¯a_j = −2 if j ∈ Pos(α_i) and ((3ρ(m)+1)/2) × p_i ≤ j ≤ (2ρ(m)−2) × p_i,
 ¯a_j = −1 if j ∈ {(2ρ(m)−1) × p_i, 2ρ(m) × p_i},
 ¯a_j = 0 otherwise. (18)

We also define:

 Pos(α_i) = {j p_i : j = 1, …, 2ρ(m)} (19)
  = {p_i, 2p_i, …, (−1+2ρ(m)) p_i, 2ρ(m) p_i},  0 ≤ i ≤ −1+ρ(m) (20)
 D = {i : i = 1, …, k} = {1, 2, …, k−1, k} (21)
 F = ⋃_{i=0}^{−1+ρ(m)} Pos(α_i) (22)
 G = D ∖ F (23)
 ¯θ = 2 × ρ(m) (24)
 k = (6m−1) × ρ(m) (25)

By definition, G represents the set of indices j such that ¯a_j = 0.
From the definition of Pos(α_i) and from Equation (15), one can easily verify that

 j ∈ Pos(α_i) ⟹ x_{α_i}(k−j) = 1 (26)
 j ∈ D ∖ Pos(α_i) ⟹ x_{α_i}(k−j) = 0 (27)

∀ d ≥ 0, we also denote by PPos(α_i, d) the set of indices j such that x_{α_i}(k+d−j) = 1; in other words:

 PPos(α_i, d) = { j : x_{α_i}(k+d−j) = 1 and 1 ≤ j ≤ k }

and, ∀ d ≥ 0, we denote:

 Q(α_i, d) = {d + j p_i : j = 0, 1, …, μ(m, α_i)},  0 ≤ i ≤ −1+ρ(m)

The neuronal recurrence equation {x_{α_i}(n)} with memory of length k is defined by Equations (15) and (16).
We will show that the neuronal recurrence equation {x_{α_i}(n)} evolves as specified in Equation (14).
In the following proposition, we present an important property.

###### Proposition 1

[14] ∀ d and ∀ i, 0 ≤ i ≤ −1+ρ(m),

 card E(α_i, d) ≤ ρ(m) − 1.

The following proposition characterizes the sum of the interaction coefficients over Pos(α_i).

###### Proposition 2

∀ i, 0 ≤ i ≤ −1+ρ(m), we have

 ∑_{j ∈ Pos(α_i)} ¯a_j = 2 × ρ(m).
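Proposition 2 can be sanity-checked against the reconstruction of Equations (17) and (18) used above: under that reading of the split points, the nonzero weights over Pos(α_i) sum to 2ρ(m) in both the even and the odd case. The values of ρ(m) and p_i below are illustrative, not taken from the paper.

```python
def weights_on_pos(rho, p):
    """{j: a_bar_j} for j in Pos(alpha_i) = {p, 2p, ..., 2*rho*p},
    following the reconstructed split points of Equations (17)-(18)."""
    pos = [l * p for l in range(1, 2 * rho + 1)]
    a = {}
    if rho % 2 == 0:                               # Equation (17)
        for j in pos:
            a[j] = 2 if j <= (3 * rho * p) // 2 else -2
    else:                                          # Equation (18)
        for j in pos:
            if j <= ((3 * rho - 1) // 2) * p:
                a[j] = 2
            elif j <= (2 * rho - 2) * p:
                a[j] = -2
            else:                                  # j in {(2rho-1)p, 2rho*p}
                a[j] = -1
    return a

# Even case rho = 4 and odd case rho = 5, both with illustrative p_i = 7:
# the weight sums are 8 = 2*4 and 10 = 2*5, matching Proposition 2.
even_sum = sum(weights_on_pos(4, 7).values())
odd_sum = sum(weights_on_pos(5, 7).values())
```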

The following lemma characterizes the evolution of the sequence {x_{α_i}(n)} at time k.

###### Lemma 5
 x_{α_i}(k) = 1.

From Lemma 5 and Equation (15), it is easy to verify that

 PPos(α_i, 1) = Q(α_i, 1) (28)

From the definition of E(α_i, 1), from Equation (15), from Equation (28) and from Lemma 5, we check easily that:

 ℓ ∈ D ∖ E(α_i, 1) ⟹ x_{α_i}(k+1−ℓ) = 0 or ¯a_ℓ = 0. (29)

The values of the sequence {x_{α_i}(n)} at times k+1, …, k+3m−1−α_i are given by the following lemma.

###### Lemma 6
 ∀ t ∈ N such that 1 ≤ t ≤ 3m−1−α_i, we have x_{α_i}(k+t) = 0.

It is easy to verify that:

 PPos(α_i, j) = Q(α_i, j),  ∀ j, 1 ≤ j ≤ 3m−1−α_i

###### Lemma 7
 There exist ¯a, ϕ_{α_i} ∈ R^k and ¯θ ∈ R such that:
 Per(¯a, ¯θ, ϕ_{α_i}) = p_i.

###### Lemma 8

∀ t ≥ k and ∀ i, 0 ≤ i ≤ −1+ρ(m),

 μ(m, α_i) ≤ ∑_{j=1}^{k} x_{α_i}(t−j) ≤ 1 + μ(m, α_i).

In order to present some properties of the sequence {x_{α_i}(n)}, we introduce the following notation:

###### Notation 1

Let us define S1(α_i, n) as:

 S1(α_i, n) = ∑_{j=1}^{k} ¯a_j x_{α_i}(n−j)

and let λ be a strictly negative real number such that:

 max { S1(α_i, n) − ¯θ : S1(α_i, n) < ¯θ and n ≥ k } ≤ λ

###### Lemma 9

∀ n and ∀ i such that n ≥ k and 0 ≤ i ≤ −1+ρ(m),

 S1(α_i, n) ∈ [−2(1+μ(m,α_i)), ¯θ−1] ∪ {¯θ},
 λ ∈ [−1, 0[.

Let {v_{α_i}(n)} be the sequence whose first k terms are defined as follows:

 v_{α_i}(0) v_{α_i}(1) … v_{α_i}(k−1) = x_{α_i}(1) ⋯ x_{α_i}(k−1) (1 − x_{α_i}(k)), (30)

and the other terms are generated by the following neuronal recurrence equation:

 v_{α_i}(n) = 1[∑_{j=1}^{k} ¯a_j v_{α_i}(n−j) − ¯θ],  n ≥ k. (31)

###### Remark 1

The term x_{α_i}(k) is equal to 1; this implies that its complement 1 − x_{α_i}(k) is equal to 0.

The parameters ¯a_j and ¯θ used in neuronal recurrence Equation (31) are those defined in Equations (17), (18) and (24).

The following lemma, which is easy to prove, characterizes the evolution of the sequence {v_{α_i}(n)}.

###### Lemma 10

In the evolution of the sequence {v_{α_i}(n)}, we have:
(a) ,
(b) ,
(c) the sequence {v_{α_i}(n)} describes a transient and then a fixed point.

The instability occurs as a result of the convergence of the sequence {v_{α_i}(n)} to a fixed point.

###### Notation 2

h = ρ(m) × k. h is the length of the memory of some of the neuronal recurrence equations considered below.

Let us also note:

 L0(d) = ρ(m) × lcm(p_{d+1}, p_{d+2}, …, p_{−2+ρ(m)}, p_{−1+ρ(m)}) if 0 ≤ d ≤ −2+ρ(m), and L0(d) = 1 if d = −1+ρ(m). (32)
 L1(d) = ρ(m) × lcm(p_0, p_1, …, p_d),  0 ≤ d ≤ ρ(m)−1 (33)
 L2 = ρ(m) × lcm(p_0, p_1, …, p_{−1+ρ(m)}) (34)

L0(d), L1(d) and L2 represent the periods of some neuronal recurrence equations.

Let {y(n)} be the sequence whose first h terms are defined as follows:

 ∀ j ∈ N, 0 ≤ j ≤ k−1:  y((ρ(m) × j) + i) = x_{α_i}(1+j),  0 ≤ i ≤ −1+ρ(m) (35)

and the other terms are generated by the following neuronal recurrence equation:

 y(n) = 1[∑_{f=1}^{h} b_f y(n−f) − θ_1],  n ≥ h (36)

where

 b_f = ¯a_j if f = ρ(m) × j, 1 ≤ j ≤ k; b_f = 0 otherwise. (37)
 θ_1 = ¯θ. (38)

The parameters ¯a_j are those defined in Equations (17) and (18). The parameters ¯θ and k are defined in Equations (24) and (25).
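The passage from Equation (16) to Equation (36) via Equation (37) just spaces the memory-k weights out by a factor ρ(m), giving memory h = ρ(m) × k. A sketch with small illustrative numbers, not values from the paper:

```python
def spaced_weights(a_bar, rho):
    """b_1, ..., b_h with h = rho * k: b_f = a_bar_j when f = rho * j,
    and b_f = 0 otherwise (Equation (37)), using 0-based list storage."""
    k = len(a_bar)
    h = rho * k
    b = [0] * h
    for j in range(1, k + 1):
        b[rho * j - 1] = a_bar[j - 1]   # b_f at f = rho * j (1-indexed)
    return b

# k = 3 weights spaced by rho = 3 give memory h = 9, nonzero only at
# positions f = 3, 6, 9.
b = spaced_weights([2, -2, 1], 3)
```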

###### Remark 2

(a) The first h terms of the sequence {y(n)} are obtained by shuffling the terms of each subsequence {x_{α_i}(n)}, where 0 ≤ i ≤ −1+ρ(m).
(b) The neuronal recurrence equation (36) is obtained by applying the construction of Lemma 1 to the neuronal recurrence Equation (16), whose parameters are given in Equations (17), (18), (24) and (25).

From the fact that the sequence {y(n)} is the shuffle of the subsequences {x_{α_i}(n)}, and from its construction, we can write:

###### Lemma 11

∀ t such that t = ρ(m) × q + i, with 0 ≤ i ≤ −1+ρ(m) and q ≥ 0, we have:

 y(t) = x_{α_i}(1+q).

The next lemma gives the period of the sequence {y(n)}.

###### Lemma 12

The sequence {y(n)} describes a cycle of length L2.

∀ d such that 0 ≤ d ≤ −1+ρ(m), we denote by {w(n, d)} the sequence whose first h terms are defined as:

 ∀ i, 0 ≤ i ≤ d:  w(ρ(m) j + i, d) = x_{α_i}(1+j) if 0 ≤ j ≤ k−2, and w(ρ(m) j + i, d) = 1 − x_{α_i}(k) if j = k−1. (39)

and

 ∀ i, d+1 ≤ i ≤ −1+ρ(m):  w(ρ(m) j + i, d) = y(ρ(m) j + i + L1(d)),  0 ≤ j ≤ k−1 (40)

The first h terms of the sequence {w(n, d)} are obtained by shuffling the terms of each of the sequences:

 v_{α_i}(0) v_{α_i}(1) v_{α_i}(2) … v_{α_i}(k−1),  0 ≤ i ≤ d (41)

and

 x_{α_i}(1+γ_i(d)) x_{α_i}(2+γ_i(d)) x_{α_i}(3+γ_i(d)) … x_{α_i}(k+γ_i(d)),  d+1 ≤ i ≤ −1+ρ(m) (42)

where:

 L1(d)/ρ(m) ≡ γ_i(d) (mod p_i),  d+1 ≤ i ≤ −1+ρ(m). (43)

The other terms of the sequence {w(n, d)} are generated by the following neuronal recurrence equation:

 w(n, d) = 1[∑_{f=1}^{h} b_f w(n−f, d) − θ_1],  n ≥ h (44)

The next lemma gives the transient length and the period of the sequence {w(n, d)}.

###### Lemma 13

The sequence {w(n, d)} generates a transient of length L1(d) and a cycle of length L0(d).

###### Notation 3

Let us define S2(n) and S3(n, d) as:

 S2(n) = ∑_{f=1}^{h} b_f y(n−f),  S3(n, d) = ∑_{f=1}^{h} b_f w(n−f, d).

###### Remark 3

On the basis of the composition of automata [5] and the definition of {w(n, d)}, we can conclude that:

• , and

• .

### 3.1 Results on the dynamics of the sequences y and w

In this subsection, we recall and give some interesting results on the dynamics of the sequences {y(n)} and {w(n, d)}.
The following lemma characterizes the sequence