# A System of Billiard and Its Application to Information-Theoretic Entropy

In this article, we define an information-theoretic entropy based on the Ihara zeta function of a graph, which we call the Ihara entropy. A dynamical system consisting of a billiard ball and a set of reflectors corresponds to a combinatorial graph. The reflectors are represented by the vertices of the graph, and the movement of the billiard ball between two reflectors is represented by the edges. The prime cycles of this graph generate the bi-infinite sequences of the corresponding symbolic dynamical system. The number of distinct prime cycles of a given length can be expressed in terms of the adjacency matrix of the oriented line graph, which also yields the formal power series expansion of the Ihara zeta function. Therefore, the Ihara entropy has a deep connection with the dynamical system of billiards. As an information-theoretic entropy, it fulfils the generalized Shannon-Khinchin axioms. It is a weakly decomposable entropy whose composition law is given by the Lazard formal group law.

## 1 Introduction

In information theory, entropy is a measure of information, where information is the uncertainty inherent in a probability distribution. Shannon entropy is the best-known such measure. The idea of entropy is studied widely in the literature of thermodynamics, information theory, dynamical systems, graph theory, and social science. The mathematical physics community is interested in generalizing the concept of entropy due to its emerging applications in economics, astrophysics, and informatics. In recent years, the generalization of entropy has become a crucial topic of investigation in mathematics.

There are different approaches in the literature to generalizing entropy. An entropy is a function over the set of all probability distributions satisfying the Shannon-Khinchin axioms or their generalizations; the function itself is independent of any particular probability distribution. The literature on generalized entropy is concerned with the foundation and properties of such entropy functions. To define new entropy functions, one introduces a number of parameters into the expression of the Shannon entropy. These parameters may not have any physical significance. The Tsallis entropy is a generalized entropy with a single parameter $q$. Given a discrete probability distribution $P = \{p_i : i = 1, 2, \dots, W\}$, the Tsallis entropy is defined by $S_q(P) = \frac{1}{q - 1}\left(1 - \sum_{i=1}^{W} p_i^q\right)$. Observe that $\lim_{q \to 1} S_q(P) = -\sum_{i=1}^{W} p_i \log(p_i)$, which is the Shannon entropy. Entropies with two or more parameters are also investigated in the literature. Note that there is no dependence between the parameter $q$ and the probability distribution; therefore, different values of $q$ generate different measures of information for a particular $P$. Another formulation for generalizing the Shannon entropy replaces the logarithm with varieties of generalized logarithms, such as deformed logarithms, formal group logarithms, polylogarithms, etc. In this scenario also, the literature concerns the properties and the structure of the entropy function.
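As a quick illustration of the limit above, the following Python snippet (ours, not part of the paper) checks numerically that the Tsallis entropy approaches the Shannon entropy as $q \to 1$:

```python
import math

def shannon(P):
    # Shannon entropy with the natural logarithm
    return -sum(p * math.log(p) for p in P if p > 0)

def tsallis(P, q):
    # Tsallis entropy S_q(P) = (1 - sum_i p_i^q) / (q - 1), for q != 1
    return (1.0 - sum(p ** q for p in P)) / (q - 1.0)

P = [0.5, 0.25, 0.125, 0.125]
gap = abs(tsallis(P, 1.0 + 1e-6) - shannon(P))
```

For `q` close to 1 the gap is tiny, while other values of `q` give genuinely different measures of information for the same `P`.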

Following similar ideas, we introduce the Ihara entropy in this article. The Ihara zeta function [1, 2] of a combinatorial graph $G$ is defined by

$$\zeta_G(z) = \prod_{P} \left(1 - z^{\gamma(P)}\right)^{-1}, \tag{1}$$

where $P$ is a prime cycle in the graph and $\gamma(P)$ is its length. The Ihara zeta function is defined on a class of graphs satisfying a number of particular characteristics. In this article, we present a physical meaning of these characteristics. Consider the vertices of the graph as reflectors and the edges as the movements of a billiard ball between them. This allows us to present the billiard dynamics as a symbolic dynamical system, for which the Ihara zeta function acts as a Ruelle zeta function. There are invertible formal power series [3] which can be expressed in terms of the Ihara zeta function. We consider one of them as a formal group logarithm, which replaces the natural logarithm in the Shannon entropy. The resulting generalized entropy of probability distributions, which depends on the structure of the graph, is called the Ihara entropy. We then prove that this entropy fulfills the Shannon-Khinchin axioms. A number of formal group-theoretic entropies have recently been introduced in the literature [4, 5, 6]. This article discusses the dynamical-system-theoretic nature of this entropy. The entropy function depends on the prime cycles of a graph, which are induced by the movement of a billiard ball between reflectors; therefore, the billiard dynamics is inherent in the entropy function. Another important characteristic is that the new entropy is a member of a one-parameter class of entropies, where the parameter scales a probability distribution into the domain of the Ihara zeta function. In addition, this entropy is a measure of uncertainty in a probability distribution, and is different from the graph entropy or the dynamical entropy.

This article is organized into four sections. In section 2, we present a model of billiard dynamics. This section describes a combinatorial graph associated with a billiard dynamical system. It also introduces a symbolic dynamical system whose symbols are the oriented edges of the graph. The bi-infinite sequences of symbols represent bi-infinite walks, which can be decomposed into prime cycles. The next section is dedicated to the Ihara zeta function and its formal power series representations; there we define the Ihara entropy and discuss its characteristics. We then conclude the article.

## 2 A model of billiard dynamics

This article considers a particular model of the motion of a billiard ball on a smooth plane. At least four round reflectors are placed at arbitrarily chosen positions on the plane, such that they are not arranged on a single straight line. A billiard ball moves between the reflectors and is reflected elastically when it collides with a reflector. The ball cannot be reflected by the same reflector twice in succession. We are not interested in the radius of the reflectors, their mutual distances, the initial and terminal positions of the ball, or the initial speed and angles of reflection of the billiard.

To associate a combinatorial graph with this system, we regard the reflectors as the vertices. There is an edge between two vertices if a ball can be reflected between the corresponding reflectors. A ball can move in either direction between two reflectors; therefore, each edge has two opposite orientations. As the ball cannot be consecutively reflected by the same reflector, there is no loop at any vertex. The reflectors do not form a straight line; hence, path graphs are excluded from our consideration. This assumption also indicates that there is no vertex in these graphs which is adjacent to only one vertex. A cycle graph is also excluded from our discussion: if we arrange the reflectors in a cycle and a ball moves from a reflector $u$ to another reflector $v$, then it has a chance to move towards a third reflector $w$ located near $v$, so the arrangement does not constrain the ball to the cycle. Combining all these observations, we find that a graph $G$ describing the billiard dynamics under our consideration is a simple, finite, connected, undirected graph without any vertex of degree one; in addition, $G$ is neither a cycle graph nor a path graph. We call such graphs admissible graphs. Different arrangements of the reflectors are represented by different admissible graphs. As an example, consider figure 0(a), which contains a set of reflectors represented by circles. This system can be represented as the combinatorial graph depicted in figure 0(b).
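The admissibility conditions distilled above can be sketched as a small Python check. The function below is an illustrative helper of our own (not from the paper), encoding: simple, connected, minimum degree at least two, and not $2$-regular (which rules out cycle graphs; path graphs are already excluded by the degree condition):

```python
from itertools import combinations

def is_admissible(n, edges):
    """Check the admissibility conditions of the reflector model:
    simple, connected, every degree >= 2, neither a cycle nor a path."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        if u == v or v in adj[u]:
            return False                      # loop or multi-edge: not simple
        adj[u].add(v); adj[v].add(u)
    seen, stack = {0}, [0]                    # connectivity by depth-first search
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w); stack.append(w)
    if len(seen) != n:
        return False
    degs = [len(adj[v]) for v in range(n)]
    if min(degs) < 2:
        return False                          # degree-1 vertex: path-like end
    if all(d == 2 for d in degs):
        return False                          # connected 2-regular graph is a cycle
    return True

K4 = list(combinations(range(4), 2))          # four mutually visible reflectors
C4 = [(0, 1), (1, 2), (2, 3), (3, 0)]         # cyclic arrangement: excluded
```

Here `K4` (the complete graph on four vertices) is admissible, while the cycle `C4` and any path graph are rejected.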

Let a particular billiard system be represented by a graph $G$ with $n$ vertices and $m$ edges. As every edge has two opposite orientations, the set of all oriented edges can be collected as

$$E = \left\{e^{(1)}, e^{(2)}, \dots, e^{(m)}, e^{(m+1)} = (e^{(1)})^{-1}, \dots, e^{(2m)} = (e^{(m)})^{-1}\right\}. \tag{2}$$

Here, $e^{(i)}$ is a directed edge with initial and terminal vertices $i(e^{(i)})$ and $t(e^{(i)})$, respectively. Every directed edge $e^{(i)}$ has an inverse $(e^{(i)})^{-1} = e^{(m+i)}$ in $E$ for $i = 1, 2, \dots, m$.

The movement of a ball between the reflectors generates a directed path, that is, a sequence of directed edges, in the graph. We are not interested in the initial and terminal positions of the billiard ball; hence, we assume that the sequence of directed edges forms a bi-infinite directed walk on the graph. Two directed edges $e^{(i)}$ and $e^{(j)}$ arise consecutively on a walk if $t(e^{(i)}) = i(e^{(j)})$. It indicates that a ball moving along the direction of $e^{(i)}$ will follow the direction of $e^{(j)}$ after being reflected at $t(e^{(i)}) = i(e^{(j)})$. We say that the edges $e^{(i)}$ and $e^{(j)}$ are composable, and the composition is represented by $e^{(i)} e^{(j)}$. Note that the set $E$ forms a collection of symbols, that is, an alphabet [7] of a symbolic dynamical system. Symbolically, we write a bi-infinite walk as $w = \dots e^{(i_{-1})} e^{(i_0)} e^{(i_1)} \dots$, such that any two consecutive edges $e^{(i_k)}$ and $e^{(i_{k+1})}$ are composable for all $k \in \mathbb{Z}$. The set of all such walks is a full $E$-shift, which is denoted by $E^{\mathbb{Z}}$. A block over $E^{\mathbb{Z}}$ is a walk of finite length. The ball rarely reflects back and forth between two reflectors repeatedly; therefore, we neglect the situation $e^{(i)} (e^{(i)})^{-1}$ in a bi-infinite walk and define the set of forbidden blocks $\mathcal{F} = \{e^{(i)} (e^{(i)})^{-1} : e^{(i)} \in E\}$. Let $X_{\mathcal{F}}$ be the subset of $E^{\mathbb{Z}}$ whose walks do not contain any block in $\mathcal{F}$. Note that $X_{\mathcal{F}}$ is a shift space of finite type. A cycle of length $k$ is a closed finite walk $C = e^{(i_1)} e^{(i_2)} \dots e^{(i_k)}$, such that $t(e^{(i_k)}) = i(e^{(i_1)})$. Two cycles are equivalent if one is a cyclic permutation of the other. An equivalence class of cycles which is not a repetition of a shorter cycle is called a prime cycle. The length of a prime cycle $P$ is denoted by $\gamma(P)$. Two primes $P_1$ and $P_2$ are composable if $t(P_1) = i(P_2)$, and the composition is denoted by $P_1 P_2$. In a similar fashion, we define the product of a prime $P$ and a finite walk $w_1$, or the product of two finite walks. A simple observation indicates that any bi-infinite walk can be expressed as $w = \dots P_1^{k_1} w_1 P_2^{k_2} w_2 \dots$, where $k_j$ is the number of consecutive repetitions of the prime $P_j$ and $w_j$ is the finite walk after the repetitions of $P_j$.
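A minimal Python sketch of the alphabet and the composability rule may help; the helper names (`darts_of`, `may_follow`) are our own, and the backtracking test implements the forbidden blocks described above:

```python
def darts_of(edges):
    # every undirected edge {u, v} contributes two oriented edges (darts)
    return [(u, v) for u, v in edges] + [(v, u) for u, v in edges]

def may_follow(d1, d2):
    # d2 can follow d1 in a walk when t(d1) = i(d2); the immediate
    # backtrack d2 = d1^{-1} is a forbidden block of the shift space
    return d1[1] == d2[0] and d2 != (d1[1], d1[0])

def is_allowed_block(walk):
    # a finite block is allowed iff every consecutive pair composes
    return all(may_follow(a, b) for a, b in zip(walk, walk[1:]))

# alphabet E for the complete graph K4 (12 oriented edges)
E = darts_of([(0, 1), (1, 2), (2, 0), (0, 3), (1, 3), (2, 3)])
triangle = [(0, 1), (1, 2), (2, 0)]   # a cycle of length 3
```

Concatenating `triangle` with itself bi-infinitely gives a point of the shift space generated by a single prime cycle.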

## 3 Ihara zeta function and entropy

Recall from the last section that the movement of the billiard ball in the system of reflectors generates bi-infinite walks on a graph $G$. To illustrate the properties of these walks, we consider the oriented edges as the vertices of a new graph, called the oriented line graph. Formally, the oriented line graph $\overline{G} = (V(\overline{G}), E(\overline{G}))$ of the graph $G$ is given by $V(\overline{G}) = E$ and

$$E(\overline{G}) = \left\{(e^{(i)}, e^{(j)}) \in E \times E : t(e^{(i)}) = i(e^{(j)}) \text{ and } i(e^{(i)}) \neq t(e^{(j)})\right\}. \tag{3}$$

It is known that the number of cycles of length $k$ starting and ending at the vertex $e^{(i)}$ is expressed as the $(i, i)$-th element of $T^k$, where $T$ is the adjacency matrix of the oriented line graph $\overline{G}$, defined by

$$T(e^{(i)}, e^{(j)}) = \begin{cases} 1 & \text{if } (e^{(i)}, e^{(j)}) \in E(\overline{G}), \\ 0 & \text{if } (e^{(i)}, e^{(j)}) \notin E(\overline{G}). \end{cases} \tag{4}$$

Therefore, $\operatorname{tr}(T^k)$ represents the number of all cycles of length $k$, which is a non-negative integer. Now the generating function for the numbers of cycles in a graph is given by $\sum_{k=1}^{\infty} \operatorname{tr}(T^k) z^k$. The Ihara zeta function of the graph $G$ is alternatively represented by the formal power series

$$\zeta_G(z) = \exp\left(\sum_{k=1}^{\infty} \frac{\operatorname{tr}(T^k)}{k} z^k\right), \tag{5}$$

where $|z| < \frac{1}{\lambda_T}$ [8]. Here, $\lambda_T$ is the greatest eigenvalue of $T$, which is a positive number.

In this work, we are interested in the entropy of a probability distribution depending on the billiard dynamics. As a probability is a positive real number, we restrict $\zeta_G$ to the real interval $\left(0, \frac{1}{\lambda_T}\right)$. The restricted function $\zeta_G : \left(0, \frac{1}{\lambda_T}\right) \to \mathbb{R}$, such that $x \mapsto \zeta_G(x)$, can be expressed as

$$\zeta_G(x) = 1 + \sum_{k=1}^{\infty} \frac{\operatorname{tr}(T^k)}{k} x^k + \frac{1}{2!}\left(\sum_{k=1}^{\infty} \frac{\operatorname{tr}(T^k)}{k} x^k\right)^2 + \frac{1}{3!}\left(\sum_{k=1}^{\infty} \frac{\operatorname{tr}(T^k)}{k} x^k\right)^3 + \dots = 1 + c_1 x + c_2 x^2 + c_3 x^3 + c_4 x^4 + c_5 x^5 + \dots. \tag{6}$$

In the above expression, $c_1 = \operatorname{tr}(T) = 0$, since $T$ is the adjacency matrix of a graph without a loop. Also, $c_2 = \frac{\operatorname{tr}(T^2)}{2}$. As $\operatorname{tr}(T^k)$ is non-negative for all $k$, the coefficients $c_k$ in equation (6) are all non-negative. Hence, $\zeta_G(x) > 0$ for all $x \in \left(0, \frac{1}{\lambda_T}\right)$, and all its derivatives exist and are positive there. Clearly, $\zeta_G$ and its derivatives are all monotone increasing functions.

Given two formal power series [9] $f(x) = \sum_{k=0}^{\infty} a_k x^k$ and $g(x) = \sum_{k=0}^{\infty} b_k x^k$, the composition is defined by another power series $(f \circ g)(x) = f(g(x))$. The power series $g$ is said to be the compositional inverse of $f$ if $f(g(x)) = g(f(x)) = x$ holds. The power series $f$ has an inverse with respect to the composition if and only if $a_0 = 0$ and $a_1 \neq 0$. The coefficients in equation (6) suggest that the formal power series of $\zeta_G$ has no compositional inverse.
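The coefficients $c_k$ of equation (6) can be generated concretely with the standard exponential-of-a-series recurrence $n\,c_n = \sum_{k=1}^{n} k\,b_k\,c_{n-k}$, where $b_k = \operatorname{tr}(T^k)/k$. The sketch below (ours, for illustration) confirms for $K_4$ that $c_0 = 1 \neq 0$ and $c_1 = 0$, which is exactly why $\zeta_G$ has no compositional inverse.

```python
import numpy as np

# Adjacency matrix T of the oriented line graph of K4, as in equation (4).
und = [(u, v) for u in range(4) for v in range(u + 1, 4)]
darts = und + [(v, u) for (u, v) in und]
m2 = len(darts)
T = np.zeros((m2, m2))
for a, (ia, ta) in enumerate(darts):
    for b_, (ib, tb) in enumerate(darts):
        if ta == ib and ia != tb:
            T[a, b_] = 1.0

# b_k = tr(T^k)/k are the coefficients inside the exponential of (5).
N = 12
b = [0.0] * (N + 1)
Tk = np.eye(m2)
for k in range(1, N + 1):
    Tk = Tk @ T
    b[k] = np.trace(Tk) / k

# exp of a power series: c_0 = 1 and n c_n = sum_{k=1}^{n} k b_k c_{n-k}.
c = [1.0] + [0.0] * N
for n in range(1, N + 1):
    c[n] = sum(k * b[k] * c[n - k] for k in range(1, n + 1)) / n
```

For $K_4$ the shortest cycles are the $4$ triangles, each traversed from $3$ starting darts in $2$ directions, so $\operatorname{tr}(T^3) = 24$ and $c_3 = 8$, while $c_1 = c_2 = 0$.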

The formal group entropy [4] of a discrete probability distribution $P = \{p_i : i = 1, 2, \dots, W\}$ is given by $S(P) = \sum_{i=1}^{W} p_i G\left(\log\left(\frac{1}{p_i}\right)\right)$, where $G(t)$ is an invertible formal power series. Let $t = \log\left(\frac{1}{p}\right)$, which refers to $p = e^{-t}$. Now, equation (6) indicates that

$$\zeta_G(ae^{-t}) = 1 + c_2(ae^{-t})^2 + c_3(ae^{-t})^3 + c_4(ae^{-t})^4 + \dots. \tag{7}$$

Here, $a$ is a non-zero scaling factor, such that $0 < a < \frac{1}{\lambda_T}$, which keeps $ap$ in the domain of $\zeta_G$. In addition,

$$\zeta_G(a) = 1 + c_2 a^2 + c_3 a^3 + c_4 a^4 + \dots. \tag{8}$$

Hence,

$$\zeta_G(ae^{-t}) - \zeta_G(a) = c_2 a^2 (e^{-2t} - 1) + c_3 a^3 (e^{-3t} - 1) + c_4 a^4 (e^{-4t} - 1) + \dots \tag{9}$$

Note that, $e^{-kt} - 1$ has no constant term as a power series in $t$, for every $k \geq 1$. Now,

$$\zeta_G(ae^{-t}) - \zeta_G(a) + e^{-t} - 1 = (e^{-t} - 1) + c_2 a^2 (e^{-2t} - 1) + c_3 a^3 (e^{-3t} - 1) + c_4 a^4 (e^{-4t} - 1) + \dots \tag{10}$$

Clearly, $\zeta_G(ae^{-t}) - \zeta_G(a) + e^{-t} - 1$ has no constant term. The coefficient of $t$ in its power series is

$$\frac{d}{dt}\left[\zeta_G(ae^{-t}) - \zeta_G(a) + e^{-t} - 1\right]\Big|_{t=0} = \left[-ae^{-t}\zeta_G'(ae^{-t}) - e^{-t}\right]\Big|_{t=0} = -\left[1 + a\zeta_G'(a)\right]. \tag{11}$$

Hence, the formal power series corresponding to

$$G(t) = \frac{\zeta_G(ae^{-t}) - \zeta_G(a) + e^{-t} - 1}{-\left(1 + a\zeta_G'(a)\right)} \tag{12}$$

has zero constant coefficient, and the coefficient of $t$ is $1$. Therefore, there exists a formal power series $F$, such that $G(F(t)) = F(G(t)) = t$. Now, replacing $t = \log\left(\frac{1}{p}\right)$ in the expression of $G$, we find

$$G\left(\log\left(\frac{1}{p}\right)\right) = \frac{\zeta_G(a) + 1 - \left(\zeta_G(ap) + p\right)}{1 + a\zeta_G'(a)}. \tag{13}$$

As $\zeta_G$ is a monotone increasing function, $\zeta_G(ap) \leq \zeta_G(a)$ for $0 < p \leq 1$. Therefore, $G\left(\log\left(\frac{1}{p}\right)\right) \geq 0$. This leads us to construct the formal group theoretic entropy associated to the Ihara zeta function, which is defined below.

###### Definition 1.

Given a graph $G$, the Ihara entropy of a discrete probability distribution $P = \{p_i : i = 1, 2, \dots, W\}$ is defined by

$$S_G(P) = \sum_{i=1}^{W} p_i G\left(\log\left(\frac{1}{p_i}\right)\right) = \sum_{i=1}^{W} p_i \frac{\zeta_G(a) + 1 - \left(\zeta_G(ap_i) + p_i\right)}{1 + a\zeta_G'(a)},$$

where $0 < a < \frac{1}{\lambda_T}$. Here, $\lambda_T$ is the largest eigenvalue of the adjacency matrix $T$ of $\overline{G}$, which is the oriented line graph of $G$.
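A numerical sketch of Definition 1 for the admissible graph $K_4$ may be useful. This is our own illustration: we evaluate $\zeta_G$ through the determinant form $\zeta_G(x) = 1/\det(I - xT)$ and its logarithmic derivative, rather than through the truncated series.

```python
import numpy as np

# Adjacency matrix T of the oriented line graph of K4 (equation (4)).
und = [(u, v) for u in range(4) for v in range(u + 1, 4)]
darts = und + [(v, u) for (u, v) in und]
m2 = len(darts)
T = np.zeros((m2, m2))
for a_, (ia, ta) in enumerate(darts):
    for b_, (ib, tb) in enumerate(darts):
        if ta == ib and ia != tb:
            T[a_, b_] = 1.0

I = np.eye(m2)
lam = max(np.linalg.eigvals(T).real)       # lambda_T = 2 for K4

def zeta(x):
    # determinant form of equation (5): zeta_G(x) = 1/det(I - xT)
    return 1.0 / np.linalg.det(I - x * T)

def zeta_prime(x):
    # d/dx log zeta_G = tr((I - xT)^{-1} T), hence zeta' = zeta * tr(...)
    return zeta(x) * np.trace(np.linalg.inv(I - x * T) @ T)

a = 0.25                                   # scaling factor, 0 < a < 1/lambda_T

def G_of_log(p):
    # equation (13): G(log(1/p))
    return (zeta(a) + 1.0 - (zeta(a * p) + p)) / (1.0 + a * zeta_prime(a))

def ihara_entropy(P):
    # Definition 1: S_G(P) = sum_i p_i G(log(1/p_i))
    return sum(p * G_of_log(p) for p in P)

S_uniform = ihara_entropy([0.25] * 4)
S_skewed = ihara_entropy([0.97, 0.01, 0.01, 0.01])
```

As expected for a measure of uncertainty, the uniform distribution carries more entropy than a sharply peaked one, and a certain event ($p = 1$) contributes zero.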

In information theory, an entropy of a probability distribution satisfies the Shannon-Khinchin axioms [10, 11] which are mentioned below:

1. The function $S(P)$ is continuous with respect to all its arguments $p_1, p_2, \dots, p_W$, where $P = \{p_i : i = 1, 2, \dots, W\}$ is a discrete probability distribution.

2. Adding a zero-probability event to a probability distribution does not alter its entropy, that is, $S(p_1, p_2, \dots, p_W, 0) = S(p_1, p_2, \dots, p_W)$.

3. The function $S(P)$ is maximum for the uniform distribution $p_i = \frac{1}{W}$ for $i = 1, 2, \dots, W$.

4. Given two independent subsystems $A$ and $B$ of a statistical system, $S(A \cup B) = S(A) + S(B)$.

Define a function $s : (0, 1] \to \mathbb{R}$, such that

$$s(p) = p \times \frac{\zeta_G(a) + 1 - \left(\zeta_G(ap) + p\right)}{1 + a\zeta_G'(a)}. \tag{14}$$

Therefore, the Ihara entropy is $S_G(P) = \sum_{i=1}^{W} s(p_i)$. Clearly, $s$ is a continuous function of $p$; that is, $S_G$ is also continuous with respect to all its arguments $p_i$. Thus, $S_G$ satisfies axiom 1. Axiom 2 is also trivially satisfied, since $s(p) \to 0$ as $p \to 0$; that is, a zero-probability event alters nothing in $S_G$. Axioms 3 and 4 are non-trivial and are illustrated in the following two theorems.

###### Theorem 1.

There exists a global maximum of $s(p)$ in $(0, 1)$, where $s$ is defined in equation (14).

###### Proof.

We have

$$s'(p) = \frac{1 + \zeta_G(a) - 2p - \zeta_G(ap) - ap\zeta_G'(ap)}{1 + a\zeta_G'(a)}. \tag{15}$$

Now $s'(p) = 0$ holds if and only if

$$h(p) = 1 + \zeta_G(a) - 2p - \zeta_G(ap) - ap\zeta_G'(ap) \tag{16}$$

has a root in $(0, 1)$. Equation (6) suggests that for any graph $G$ we have $\zeta_G(0) = 1$. Therefore, $h(0) = \zeta_G(a) > 0$ and $h(1) = -\left(1 + a\zeta_G'(a)\right) < 0$. As $h$ is a continuous function of $p$, there is at least one point $p_0$ in $(0, 1)$, such that $h(p_0) = 0$, that is, $s'(p_0) = 0$. Also, $h'(p) = -2 - 2a\zeta_G'(ap) - a^2 p\zeta_G''(ap) < 0$ for all $p \in (0, 1)$. Therefore, $h$ is a strictly monotone decreasing function; that is, $p_0$ is the unique point in $(0, 1)$ such that $s'(p_0) = 0$. Now,

$$s''(p) = \frac{-2 - 2a\zeta_G'(ap) - a^2 p\zeta_G''(ap)}{1 + a\zeta_G'(a)} < 0, \tag{17}$$

for all $p \in (0, 1)$. Hence, $p_0$ is a global maximum of $s(p)$ in $(0, 1)$. ∎

Theorem 1 leads us to the conclusion that the entropy $S_G(P) = \sum_{i=1}^{W} s(p_i)$ attains its maximum value when $s(p_i)$ is maximum for all $i$. Thus, to maximize $S_G$ we need $p_i = p_0$ for all $i$, which is the uniform distribution after a normalization. Therefore, the Ihara entropy mentioned in definition 1 fulfills axiom 3 of the Shannon-Khinchin axioms.
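The following experiment (ours, purely illustrative) samples random distributions and confirms that none exceeds the uniform distribution's Ihara entropy, as axiom 3 demands; this is consistent with the strict concavity of $s$ established in theorem 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Adjacency matrix of the oriented line graph of K4, as in equation (4).
und = [(u, v) for u in range(4) for v in range(u + 1, 4)]
darts = und + [(v, u) for (u, v) in und]
m2 = len(darts)
T = np.zeros((m2, m2))
for a_, (ia, ta) in enumerate(darts):
    for b_, (ib, tb) in enumerate(darts):
        if ta == ib and ia != tb:
            T[a_, b_] = 1.0
I = np.eye(m2)

a = 0.25                                  # 0 < a < 1/lambda_T = 1/2 for K4

def zeta(x):
    return 1.0 / np.linalg.det(I - x * T)

def zeta_prime(x):
    return zeta(x) * np.trace(np.linalg.inv(I - x * T) @ T)

def ihara_entropy(P):
    norm = 1.0 + a * zeta_prime(a)
    return sum(p * (zeta(a) + 1.0 - (zeta(a * p) + p)) / norm for p in P)

W = 5
S_uniform = ihara_entropy([1.0 / W] * W)
random_entropies = [ihara_entropy(rng.dirichlet(np.ones(W))) for _ in range(50)]
```

By Jensen's inequality applied to the concave function $s$, every sampled value stays at or below `S_uniform`.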

We generalize axiom 4 of the Shannon-Khinchin axioms by utilizing the Lazard formal group law. Recall that a commutative one-dimensional formal group law over a commutative ring $R$ is a formal power series $\Phi(x, y)$ in two indeterminates $x$ and $y$ of the form $\Phi(x, y) = x + y + \text{higher order terms}$, such that

$$\Phi(x, 0) = \Phi(0, x) = x, \quad \Phi(\Phi(x, y), z) = \Phi(x, \Phi(y, z)), \quad \Phi(x, y) = \Phi(y, x). \tag{18}$$

Recall from equation (12) that $F$ is the compositional inverse of $G$. Now, the Lazard formal group law [12] is defined by the formal power series

$$\Phi(s_1, s_2) = G(F(s_1) + F(s_2)). \tag{19}$$
###### Theorem 2.

Let $P_A = \left\{p_i^{(A)} : i = 1, 2, \dots, W_A\right\}$ and $P_B = \left\{p_j^{(B)} : j = 1, 2, \dots, W_B\right\}$ be two independent probability distributions. Then the Ihara entropy of the joint probability distribution is given by

$$S_G(P_A P_B) = \sum_{i=1}^{W_A} \sum_{j=1}^{W_B} p_i^{(A)} p_j^{(B)} \Phi\left(G\left(\log\left(\frac{1}{p_i^{(A)}}\right)\right), G\left(\log\left(\frac{1}{p_j^{(B)}}\right)\right)\right),$$

where $\Phi$ is the Lazard formal group law given by equation (19).

###### Proof.

The joint probability distribution is given by $p_{ij}^{(A \cup B)} = p_i^{(A)} p_j^{(B)}$. Now,

$$S_G(P_A P_B) = \sum_{i=1}^{W_A} \sum_{j=1}^{W_B} p_{ij}^{(A \cup B)} G\left(\log\left(\frac{1}{p_{ij}^{(A \cup B)}}\right)\right) = \sum_{i=1}^{W_A} \sum_{j=1}^{W_B} p_i^{(A)} p_j^{(B)} G\left(\log\left(\frac{1}{p_i^{(A)}}\right) + \log\left(\frac{1}{p_j^{(B)}}\right)\right). \tag{20}$$

Denote $t_i^{(A)} = \log\left(\frac{1}{p_i^{(A)}}\right)$ and $t_j^{(B)} = \log\left(\frac{1}{p_j^{(B)}}\right)$. It leads us to write

$$S_G(P_A P_B) = \sum_{i=1}^{W_A} \sum_{j=1}^{W_B} p_i^{(A)} p_j^{(B)} G\left(t_i^{(A)} + t_j^{(B)}\right) = \sum_{i=1}^{W_A} \sum_{j=1}^{W_B} p_i^{(A)} p_j^{(B)} G\left(F\left(s_i^{(A)}\right) + F\left(s_j^{(B)}\right)\right), \tag{21}$$

where $F$ is the compositional inverse of $G$, and $s_i^{(A)} = G\left(t_i^{(A)}\right)$ and $s_j^{(B)} = G\left(t_j^{(B)}\right)$. Applying the Lazard formal group law, we have

$$S_G(P_A P_B) = \sum_{i=1}^{W_A} \sum_{j=1}^{W_B} p_i^{(A)} p_j^{(B)} \Phi\left(s_i^{(A)}, s_j^{(B)}\right) = \sum_{i=1}^{W_A} \sum_{j=1}^{W_B} p_i^{(A)} p_j^{(B)} \Phi\left(F^{-1}\left(t_i^{(A)}\right), F^{-1}\left(t_j^{(B)}\right)\right) = \sum_{i=1}^{W_A} \sum_{j=1}^{W_B} p_i^{(A)} p_j^{(B)} \Phi\left(G\left(\log\left(\frac{1}{p_i^{(A)}}\right)\right), G\left(\log\left(\frac{1}{p_j^{(B)}}\right)\right)\right). \tag{22}$$

Axiom 4 is generalized by the composition law of the Lazard formal group law mentioned in equation (19) [5, Theorem 1]. ∎
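Theorem 2 can be exercised numerically. In the illustrative sketch below (ours, not from the paper), the compositional inverse $F$ is obtained by bisection, which is valid because $G$ is monotone increasing in $t = \log(1/p)$, and the double sum computed through $\Phi$ agrees with the entropy of the joint distribution.

```python
import math
import numpy as np

# Adjacency matrix of the oriented line graph of K4 and the entropy ingredients.
und = [(u, v) for u in range(4) for v in range(u + 1, 4)]
darts = und + [(v, u) for (u, v) in und]
m2 = len(darts)
T = np.zeros((m2, m2))
for a_, (ia, ta) in enumerate(darts):
    for b_, (ib, tb) in enumerate(darts):
        if ta == ib and ia != tb:
            T[a_, b_] = 1.0
I = np.eye(m2)
a = 0.25

def zeta(x):
    return 1.0 / np.linalg.det(I - x * T)

def zeta_prime(x):
    return zeta(x) * np.trace(np.linalg.inv(I - x * T) @ T)

norm = 1.0 + a * zeta_prime(a)

def G(t):
    # equations (12)-(13) with p = e^{-t}; monotone increasing for t >= 0
    p = math.exp(-t)
    return (zeta(a) + 1.0 - (zeta(a * p) + p)) / norm

def F(s, lo=0.0, hi=60.0):
    # numerical compositional inverse of G by bisection
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if G(mid) < s:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def Phi(s1, s2):
    # Lazard formal group law, equation (19)
    return G(F(s1) + F(s2))

PA = [0.6, 0.4]
PB = [0.7, 0.2, 0.1]

# left side: entropy of the joint distribution, directly from Definition 1
S_joint = sum(pa * pb * G(math.log(1.0 / (pa * pb))) for pa in PA for pb in PB)
# right side: theorem 2, through the formal group law
S_phi = sum(pa * pb * Phi(G(math.log(1.0 / pa)), G(math.log(1.0 / pb)))
            for pa in PA for pb in PB)
```

The two evaluations coincide up to floating-point error, illustrating the weak decomposability of the Ihara entropy.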

## 4 Conclusion

This article lies at the interface of dynamical systems, information theory, and graph theory. It focuses on the information-theoretic entropy of a discrete probability distribution, and its significance is two-fold. First, it presents a physical interpretation for selecting the particular class of graphs used in the literature on the Ihara zeta function. We begin with a dynamical system consisting of a billiard ball moving between reflectors. We describe the reflectors as the vertices of a combinatorial graph; an edge between two vertices represents the possibility of movement of the ball between the corresponding reflectors. A bi-infinite path generated by the movement of the ball represents a bi-infinite walk in the graph, and every bi-infinite walk can be decomposed into prime cycles. The number of prime cycles of finite length can be expressed in terms of the adjacency matrix of an oriented line graph, and we can represent this system in terms of symbolic dynamics over the corresponding graph. The Ihara zeta function is the dynamical zeta function for this system; it can be represented as a formal power series which depends on the distribution of reflectors in the system. Second, we introduce the idea of the Ihara entropy, an entropy defined in terms of the Ihara zeta function. The composition law of this entropy is induced by the Lazard formal group law, and it satisfies the other properties of the Shannon-Khinchin axioms.

## Acknowledgment

SD is thankful to Dr. Subhashish Banerjee who introduced the author to the Ihara Zeta function and its applications in quantum information theory.

## References

• [1] Audrey Terras. Zeta functions of graphs: a stroll through the garden, volume 128. Cambridge University Press, 2010.
• [2] Yasutaka Ihara. On discrete subgroups of the two by two projective linear group over p-adic fields. Journal of the Mathematical Society of Japan, 18(3):219–235, 1966.
• [3] Jean A Dieudonne. Introduction to the theory of formal groups, volume 20. CRC Press, 1973.
• [4] Piergiulio Tempesta. A theorem on the existence of trace-form generalized entropies. Proc. R. Soc. A, 471(2183):20150165, 2015.
• [5] Piergiulio Tempesta. Beyond the shannon–khinchin formulation: the composability axiom and the universal-group entropy. Annals of Physics, 365:180–197, 2016.
• [6] Supriyo Dutta and Partha Guha. Ihara zeta entropy. arXiv preprint arXiv:1906.02514, 2019.
• [7] Douglas Lind and Brian Marcus. An introduction to symbolic dynamics and coding. Cambridge University Press, 1995.
• [8] Motoko Kotani and Toshikazu Sunada. Zeta functions of finite graphs. Journal of Mathematical Sciences, University of Tokyo, 7(1):7–26, 2000.
• [9] Thomas S Brewer. Algebraic properties of formal power series composition. 2014.
• [10] Claude Elwood Shannon. A mathematical theory of communication. Bell system technical journal, 27(3):379–423, 1948.
• [11] A Ya Khinchin. Mathematical foundations of information theory. Courier Corporation, 2013.
• [12] Michiel Hazewinkel. Formal groups and applications, volume 78. Elsevier, 1978.