denotes the proximity cell of point generator , i.e., the locus of points closer to  than to any other generator .
When the dissimilarity is chosen as the Euclidean distance, we recover the ordinary Voronoi diagram . Figure 1 (left) displays the Voronoi cells of an ordinary Voronoi diagram for a given set of generators.
The Voronoi diagram and its dual Delaunay complex  are fundamental data structures of computational geometry . These geometric data structures find many applications in robotics, 3D reconstruction, geographic information systems (GISs), etc. See the textbook  for some applications. The Delaunay simplicial complex is obtained by drawing a straight edge between two generators iff their Voronoi cells share an edge. In Euclidean geometry, the Delaunay simplicial complex triangulates the convex hull of the generators, and is called the Delaunay triangulation. Figure 1 (middle, right) depicts the Delaunay triangulations dual to ordinary Voronoi diagrams. In general, when considering an arbitrary dissimilarity , the Delaunay simplicial complex may not triangulate the convex hull of the generators.
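In the Euclidean case, both structures are readily computed with standard libraries. The following minimal Python sketch (the generator coordinates are arbitrary illustrative values, not from the paper) builds an ordinary Voronoi diagram and its dual Delaunay triangulation with SciPy:

```python
# Ordinary Euclidean Voronoi diagram and its dual Delaunay triangulation (SciPy).
import numpy as np
from scipy.spatial import Delaunay, Voronoi

# Five arbitrary generators: four corners of the unit square and an interior point.
generators = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.6]])

vor = Voronoi(generators)   # Voronoi cells of the generators
tri = Delaunay(generators)  # dual Delaunay triangulation (triangulates the convex hull)

# Each Delaunay edge joins two generators whose Voronoi cells are adjacent.
print(len(vor.point_region), "Voronoi cells")
print(len(tri.simplices), "Delaunay triangles")
```

Here the interior point lies inside the circumcircle of every triple of corners, so every Delaunay triangle is incident to it, yielding a fan of four triangles.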
When the dissimilarity is oriented or asymmetric, i.e., , one can define the reverse or dual dissimilarity . This duality is termed reference duality in , and is an involution:
The dissimilarity is called the forward dissimilarity.
In the remainder, we shall use the ‘:’ notational convention  between the arguments of the dissimilarity to emphasize that a dissimilarity is asymmetric: . For an oriented dissimilarity , we can define two types of dual Voronoi cells as follows:
with the property that
That is, the dual Voronoi cell with respect to a dissimilarity is the primal Voronoi cell for the dual (reverse) dissimilarity .
We can build a Voronoi diagram as a minimization diagram  by defining the functions . Then  iff  for all . Thus, by building the lower envelope  of the functions , we get the Voronoi diagram.
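A brute-force version of this lower-envelope viewpoint can be sketched in Python: each query point is assigned to the generator minimizing an arbitrary (possibly asymmetric) dissimilarity. The function and variable names below are illustrative, not from the paper:

```python
# Voronoi diagram as a minimization diagram: a point x belongs to the cell of
# generator p_i iff D(x, p_i) <= D(x, p_j) for all j.  This brute-force labelling
# works for any dissimilarity D (a sketch, not an efficient algorithm).
import numpy as np

def voronoi_labels(points, generators, D):
    """Label each point by the index of its closest generator under D."""
    dists = np.array([[D(x, p) for p in generators] for x in points])
    return dists.argmin(axis=1)

sq_euclidean = lambda x, p: float(np.sum((x - p) ** 2))
gens = np.array([[0.0, 0.0], [1.0, 0.0]])
pts = np.array([[0.1, 0.0], [0.9, 0.0]])
print(voronoi_labels(pts, gens, sq_euclidean))  # [0 1]
```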
An important class of smooth asymmetric dissimilarities are the Bregman divergences . A Bregman divergence is defined for a -strictly convex functional generator by
where denotes the gradient of . In information geometry [13, 4, 45], Bregman divergences are the canonical divergences of dually flat spaces . Dually flat spaces generalize the (self-dual) Euclidean geometry obtained for the generator . In information sciences, dually flat spaces can be obtained as the induced information geometry of the Kullback-Leibler divergence  of an exponential family manifold [24, 4] or a mixture manifold . The dual Bregman Voronoi diagrams and their dual regular complexes have been studied in .
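A generic Bregman divergence can be sketched directly from this definition; the two generators used below (squared Euclidean norm and negative Shannon entropy) are standard textbook examples, not specific to this paper:

```python
# Bregman divergence B_F(t1 : t2) = F(t1) - F(t2) - <t1 - t2, grad F(t2)>
# for a strictly convex, differentiable generator F (a minimal sketch).
import numpy as np

def bregman(theta1, theta2, F, gradF):
    return F(theta1) - F(theta2) - np.dot(theta1 - theta2, gradF(theta2))

# F(x) = ||x||^2 recovers the squared Euclidean distance...
F_sq = lambda x: float(np.dot(x, x))
grad_sq = lambda x: 2 * x

# ...while the negative Shannon entropy F(x) = sum x_i log x_i yields the
# KL divergence on normalized arguments.
F_ent = lambda x: float(np.sum(x * np.log(x)))
grad_ent = lambda x: np.log(x) + 1

a, b = np.array([0.3, 0.7]), np.array([0.5, 0.5])
print(bregman(a, b, F_sq, grad_sq))    # squared Euclidean distance ||a - b||^2
print(bregman(a, b, F_ent, grad_ent))  # KL(a : b) since a and b are normalized
```

Note the asymmetry in the arguments: the gradient is taken at the second parameter, which is exactly why Bregman divergences are oriented dissimilarities.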
In this paper, we study the Voronoi diagrams induced by the Fisher-Rao distance [52, 6, 51], the Kullback-Leibler (KL) divergence  and the chi square distance  for the family of Cauchy distributions. Cauchy distributions are also called Lorentzian distributions in the literature [34, 30].
The paper is organized with our main contributions as follows:
In Section 2, we concisely review the information geometry of the Cauchy family: We first describe the hyperbolic Fisher-Rao geometry in §2.1 and make a connection between the Fisher-Rao distance and the chi square divergence, then we point out the remarkable fact that any -geometry coincides with the Fisher-Rao geometry (§2.2), and we finally present the dually flat geometric structures on the Cauchy manifold related to Tsallis’ quadratic entropy  which amount to a conformal flattening of the Fisher-Rao geometry (§2.4). Section 3.3 proves that the square root of the KL divergence between any two Cauchy distributions yields a metric distance (Theorem 3), and that this metric distance can be isometrically embedded in a Hilbert space for the case of the Cauchy scale families (Theorem 4). Section 4 shows that the Cauchy Voronoi diagram induced either by the Fisher-Rao distance, the chi-square divergence, or the Kullback-Leibler divergence (and its square root metrization) all coincide with a hyperbolic Voronoi diagram calculated on the Cauchy location-scale parameters. This result yields a practical and efficient construction algorithm of hyperbolic Cauchy Voronoi diagrams  (Theorem 5) and their dual hyperbolic Cauchy Delaunay complexes. We prove that the hyperbolic Cauchy Voronoi diagrams are Fisher orthogonal to the dual Delaunay complexes (Theorem 6). Finally, we conclude this work in §5.
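As a preview of the construction behind Theorem 5, the following sketch labels points of the upper half-plane by their nearest generator under the Poincaré upper half-plane distance. An efficient algorithm would reduce the hyperbolic Voronoi diagram to an affine diagram rather than test points one by one; the brute-force version below (all names illustrative) only serves to make the diagram concrete:

```python
# Brute-force hyperbolic Voronoi labelling in the Poincare upper half-plane
# (points (l, s) with s > 0); a sketch, not the paper's efficient algorithm.
import numpy as np

def hyperbolic_dist(p, q):
    """Geodesic distance in the Poincare upper half-plane model."""
    (l1, s1), (l2, s2) = p, q
    return float(np.arccosh(1.0 + ((l1 - l2) ** 2 + (s1 - s2) ** 2) / (2.0 * s1 * s2)))

def hyperbolic_voronoi_label(x, generators):
    """Index of the generator whose hyperbolic Voronoi cell contains x."""
    return int(np.argmin([hyperbolic_dist(x, g) for g in generators]))

gens = [(0.0, 1.0), (4.0, 1.0)]
print(hyperbolic_voronoi_label((0.5, 2.0), gens))  # 0: closer to the first generator
```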
2 Information geometry of the Cauchy family
We start by reporting the Fisher-Rao geometry of the Cauchy manifold (§2.1), then show that all -geometries coincide with the Fisher-Rao geometry (§2.2). Then we recall that we can associate an information-geometric structure to any divergence (§2.3), and finally dually flatten this Fisher-Rao geometry using Tsallis’ quadratic entropy  (§2.4) and a conformal Fisher metric.
2.1 Fisher-Rao geometry of the Cauchy manifold
investigates the geometry of families of probability measures. The 2D family of Cauchy distributions
is the standard Cauchy distribution.
Let denote the log density. The parameter space of the Cauchy family is called the upper plane. The Fisher-Rao geometry [27, 52, 51] of consists in modeling as a Riemannian manifold by choosing the Fisher information metric 
as the Riemannian metric tensor, where  for  (i.e.,  and ).
The Fisher-Rao distance is then defined as the Riemannian geodesic length distance on the Cauchy manifold :
The Fisher information metric tensor for the Cauchy family  is
A generic formula for the Fisher-Rao distance between two univariate elliptical distributions is reported in . This formula when instantiated for the Cauchy distributions yields the following closed-form for the Fisher-Rao distance:
we get a relationship between the squared infinitesimal lengths (line elements)  and  as follows:
It follows that the Fisher-Rao distance between two Cauchy distributions is simply obtained by rescaling the 2D hyperbolic distance expressed in the Poincaré upper plane :
This latter term shall naturally appear in §2.4 when studying the dually flat space obtained by conformally flattening the Fisher-Rao geometry. The expression of Eq. 23 can be interpreted as a conformal divergence for the squared Euclidean distance .
We may also write the delta term using the 2D Cartesian coordinates as:
In particular, when , we get the simplified Fisher-Rao distance for the Cauchy scale family:
The Fisher-Rao distance between two Cauchy distributions is
The Fisher-Rao manifold of Cauchy distributions has constant negative scalar curvature , see  for detailed calculations.
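Numerically, the Fisher-Rao distance between two Cauchy distributions can thus be sketched as a rescaled Poincaré upper half-plane distance. The 1/sqrt(2) scaling below is an assumption derived from the Cauchy Fisher metric being (1/(2 s^2)) times the identity, and should be checked against the paper's closed-form formula:

```python
# Fisher-Rao distance between Cauchy(l1, s1) and Cauchy(l2, s2), sketched as a
# rescaled Poincare upper half-plane distance (scaling 1/sqrt(2) assumed from
# the Cauchy Fisher metric g = (1/(2 s^2)) I).
import numpy as np

def cauchy_fisher_rao(p, q):
    (l1, s1), (l2, s2) = p, q
    rho_hyp = np.arccosh(1.0 + ((l1 - l2) ** 2 + (s1 - s2) ** 2) / (2.0 * s1 * s2))
    return float(rho_hyp / np.sqrt(2.0))

# For the scale subfamily (same location), the formula reduces to |log(s2/s1)|/sqrt(2):
print(cauchy_fisher_rao((0.0, 1.0), (0.0, np.e)))  # 1/sqrt(2), approximately 0.7071
```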
It is well-known that the Fisher-Rao geometry of location-scale families amounts to a hyperbolic geometry . For -variate scale-isotropic Cauchy distributions with , the Fisher information metric is , where  denotes the identity matrix. It follows that
where is the -dimensional Euclidean norm: . That is, is the scaled -dimensional real hyperbolic distance  expressed in the Poincaré upper space model.
2.2 The dualistic -geometry of the statistical Cauchy manifold
A statistical manifold  is a triplet where is a Riemannian metric tensor and a cubic totally symmetric tensor (i.e., for any permutation ). For a parametric family of densities , the cubic tensor is called the skewness tensor , and defined by:
A statistical manifold structure allows one to construct Amari’s dualistic -geometry  for any : namely, a quadruplet  where  and  are dual affine connections (i.e., ). We refer the reader to the textbook  and the overview  for further details concerning the dual torsion-free affine connections coupled with the metric tensor.
The Fisher-Rao geometry corresponds to the -geometry, i.e., the self-dual geometry where is the Levi-Civita metric connection : .
In information geometry, the invariance principle
states that the geometry should be invariant under the transformation of a random variable  to  provided that  is a sufficient statistic . The -geometries are invariant geometries [4, 45].
A remarkable fact is that all the -geometries of the Cauchy family coincide with the Fisher-Rao geometry since the cubic skewness tensor vanishes everywhere , i.e., . The non-zero coefficients of the Christoffel symbols of the -connections (including the Levi-Civita metric connection derived from the Fisher metric tensor) are:
All -geometries coincide and have constant negative scalar curvature . In other words, we cannot choose a value for to make the Cauchy manifold dually flat . To contrast with this result, Mitchell  reported values of for which the -geometry is dually flat for some parametric location-scale families of distributions: For example, it is well known that the manifold
of univariate Gaussian distributions is -flat . The manifold of -Student’s distributions with  degrees of freedom is proven dually flat when . Dually flat manifolds are Hessian manifolds  with dual geodesics being straight lines in one of the two dual global affine coordinate systems. On a global Hessian manifold, the canonical divergences are Bregman divergences. Thus these dually flat Bregman manifolds are computationally friendly , as many techniques of computational geometry can be naturally extended to these spaces .
2.3 Dualistic structures induced by a divergence
A divergence or contrast function  is a smooth parametric dissimilarity. Let  denote the manifold of its parameter space. Eguchi  showed how to associate to any divergence  a canonical information-geometric structure . Moreover, the construction allows one to prove that  (see [4, 45] for details). That is, the dual affine connection associated to  coincides with the primal connection associated to the dual divergence . Conversely, Matsumoto  proved that given an information-geometric structure , one can build a divergence  such that , from which we can derive the structure . Thus when calculating the Voronoi diagram for an arbitrary divergence , we may use the induced information-geometric structure to investigate some of its properties. For example: is the bisector -autoparallel? Is the bisector of two generators orthogonal, with respect to the metric , to their -geodesic? Section 4 will study these questions.
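Eguchi's construction can be probed numerically: the induced metric tensor is the Hessian of the divergence in its second argument, evaluated on the diagonal. The finite-difference sketch below is illustrative code (names and step size are our choices, not the paper's):

```python
# Divergence-induced metric (Eguchi): g_ij(theta) is the second derivative of
# D(theta : theta') in theta', evaluated at theta' = theta.  Central differences.
import numpy as np

def divergence_metric(D, theta, h=1e-4):
    n = len(theta)
    g = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei, ej = np.eye(n)[i] * h, np.eye(n)[j] * h
            # Second-order central difference of t -> D(theta, t) at t = theta.
            g[i, j] = (D(theta, theta + ei + ej) - D(theta, theta + ei - ej)
                       - D(theta, theta - ei + ej) + D(theta, theta - ei - ej)) / (4 * h * h)
    return g

# Sanity check: D = half squared Euclidean distance induces the identity metric.
half_sq = lambda a, b: 0.5 * float(np.sum((a - b) ** 2))
g = divergence_metric(half_sq, np.array([0.3, 2.0]))
print(g)
```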
2.4 Dually flat geometry of the Cauchy manifold by conformal flattening
The Cauchy distributions are usually handled in information geometry using the wider scope of -Gaussians [35, 30, 4] (deformed exponential families ) which also include the Student’s -distributions. Cauchy distributions are -Gaussians for . These -Gaussians are also called  [58], and they can be obtained as maximum entropy distributions with respect to Tsallis’ entropy  (see Theorem 4.12 of ):
When , we have the following Tsallis’ quadratic entropy:
That is, -Gaussians are -exponential families , generalizing the maxent exponential families derived from Shannon entropy . The integral corresponds to Onicescu’s informational energy [50, 46].
A dually flat structure construction for -Gaussians is reported in  (Sec. 4.3, p. 84–89). We instantiate this construction for the Cauchy distributions (-Gaussians):
denote the deformed -exponential and
its compositional inverse, the deformed -logarithm. The probability density of a -Gaussian can be factorized as
where denotes the 2D natural parameters. We have
Therefore the natural parameter is (for ) and the deformed log-normalizer is
In general, we obtain a strictly convex and -function , called the -free energy for a -Gaussian family. Here, we let for the Cauchy family.
We convert back the natural parameter to the ordinary parameter as follows:
The gradient of the deformed log-normalizer is
The gradient defines the dual global affine coordinate system where is the dual parameter space.
This yields the following divergence  between Cauchy densities, which is by construction equivalent to a Bregman divergence between their corresponding natural parameters:
where and . We term the Bregman-Tsallis quadratic divergence ( for general -Gaussians).
We used a computer algebra system (CAS, see Appendix A) to calculate the closed forms of the following definite integrals:
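Such CAS computations can be reproduced with SymPy. As a small illustrative example (not one of the paper's integrals verbatim), the following sketch checks the normalization of a Cauchy density and computes its Onicescu informational energy:

```python
# Symbolic definite integrals for a Cauchy density p_{l,s}, in the spirit of
# the paper's CAS calculations (Appendix A); a SymPy sketch.
import sympy as sp

x, l = sp.symbols('x l', real=True)
s = sp.symbols('s', positive=True)

p = s / (sp.pi * ((x - l) ** 2 + s ** 2))  # Cauchy(l, s) density

norm = sp.integrate(p, (x, -sp.oo, sp.oo))         # normalization: 1
energy = sp.integrate(p ** 2, (x, -sp.oo, sp.oo))  # informational energy: 1/(2*pi*s)
print(norm)
print(energy)
```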
Here, observe that the equivalent Bregman divergence does not have its parameter order swapped, as is the case for ordinary exponential families:  where  is the cumulant function of the exponential family, see [4, 45].
We term the divergence  the flat divergence because its induced affine connection  has zero curvature (i.e., the Riemann-Christoffel curvature tensor induced by the connection vanishes, see  p. 134). We refer to  for the -geometry construction from a divergence (also called contrast function). Reciprocally, a statistical manifold  has a contrast function .
Since , the flat divergence is interpreted as a conformal squared Euclidean distance  with conformal factor . The Fisher-Rao geometry of -Gaussians has scalar curvature  . Thus we recover the scalar curvature for the Fisher-Rao Cauchy manifold since .
The flat divergence between two Cauchy distributions is equivalent to a Bregman divergence on the corresponding natural parameters with the following closed-form formula in the ordinary location-scale parameterization:
In general, we call the Bregman divergence arising from the -Gaussian flattening the -Bregman-Tsallis divergence .
The conversion of -coordinates to -coordinates is
is the Legendre-Fenchel convex conjugate :
that is independent of the location parameter . Moreover, we have 
We can convert the dual parameter to the ordinary parameter as follows:
It follows that we have the following equivalent expressions for the flat divergence:
is the Legendre-Fenchel divergence measuring the inequality gap of the Fenchel-Young inequality:
That is, , where and .
The Hessian metrics of the dual convex potential functions and are:
The Hessian metric is also called the -Fisher metric  (for ). Let and denote the Fisher information metric expressed using the -coordinates and the -coordinates, respectively. Then, we have
where denotes the Jacobian matrix:
Similarly, we can express the Hessian metric using the -coordinate system:
We have the following Jacobian matrices:
We check that we have
That is, the Riemannian metric tensors and are conformally equivalent for a smooth function .
Notice that this dually flat geometry can be recovered from the divergence-based structure of §2.3 by considering the Bregman-Tsallis divergence. Figure 2 illustrates the relationships between the invariant -geometry and the dually flat geometry of the Cauchy manifold. The -Gaussians can further be generalized by the -family with corresponding deformed logarithm and exponential functions [4, 3]. The -family unifies both the dually flat exponential families and the dually flat mixture families . A statistical dissimilarity between two parametric distributions  and  amounts to an equivalent dissimilarity between their parameters: . When the parametric dissimilarity is smooth, one can construct the divergence-based -geometry [2, 45]. Thus the dually flat space structure of the Cauchy manifold can also be obtained from the divergence-based -geometry induced by the flat divergence (see Figure 2). It can be shown that the dually flat space -geometry is the unique geometry in the intersection of the conformal Fisher-Rao geometry with the deformed -geometry (Theorem 13 of ) when the manifold is the positive orthant .
3 Invariant divergences: -divergences and -divergences
3.1 Invariant divergences in information geometry
The KL divergence is a -divergence obtained for the generator .
An invariant divergence is a divergence which satisfies the information monotonicity :  with equality iff  is a sufficient statistic. The invariant divergences are the -divergences for the simplex sample space . Moreover, the standard -divergences (with  and ) induce the Fisher information metric for their metric tensor : , see .
3.2 -Divergences between location-scale densities
Let denote the -divergence  between and :
We have .
The -divergences include the chi square divergence (), the squared Hellinger divergence () and in the limit cases the KL divergence () and the reverse KL divergence (). The -divergences are -divergences for the generator:
For location-scale families, let
Using a change of variables in the integrals, one can show that
For the location-scale families, which include the normal family , the Cauchy family  and the -Student families with fixed degree of freedom , the -divergences are not symmetric in general (e.g., -divergences between two normal distributions). However, we have shown that the chi-square divergence and the KL divergence are symmetric when the densities belong to the Cauchy family. Thus it is of interest to prove that the -divergences between Cauchy densities are symmetric, and to report their closed-form formula for all .
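The symmetry of the KL divergence between Cauchy densities can be checked numerically by quadrature. The closed form used for comparison below, log(((s1+s2)^2 + (l1-l2)^2)/(4 s1 s2)), is the one reported in the literature for Cauchy distributions; this is a sketch, not the paper's derivation:

```python
# Numerical check that KL between Cauchy densities is symmetric and matches
# the closed form log(((s1+s2)^2 + (l1-l2)^2) / (4 s1 s2)).
import numpy as np
from scipy.integrate import quad

def cauchy_pdf(x, l, s):
    return s / (np.pi * ((x - l) ** 2 + s ** 2))

def kl_cauchy_numeric(p, q):
    f = lambda x: cauchy_pdf(x, *p) * np.log(cauchy_pdf(x, *p) / cauchy_pdf(x, *q))
    return quad(f, -np.inf, np.inf)[0]

p1, p2 = (0.0, 1.0), (2.0, 3.0)
kl_fwd = kl_cauchy_numeric(p1, p2)
kl_rev = kl_cauchy_numeric(p2, p1)
closed = np.log(((1.0 + 3.0) ** 2 + (0.0 - 2.0) ** 2) / (4.0 * 1.0 * 3.0))
print(kl_fwd, kl_rev, closed)  # all approximately equal
```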
Using symbolic integration described in Appendix A, we found that