I Introduction
Image processing designates any type of signal processing in which the input is an image and the output is either another image or a set of features extracted from the input image. Since computer vision involves the identification and subsequent classification of objects in a given image, edge detection is an essential tool in image analysis. Performing edge detection on an image reduces the amount of information to be processed, because redundant information (considered less relevant) can be discarded.
Segmentation by edge detection is based on two important concepts: similarity and discontinuity. The algorithms thus look for points (or curves and contours) of the digital image where the intensity changes abruptly. This sudden change in intensity may occur for various reasons, for example, orientation discontinuities on a surface and changes in brightness and illumination in a scene. Applications of edge detection are found in various fields of science: medicine Gudmundsson et al. (1998), engineering and satellite imagery Augusto et al. (1984), robotics and machine vision Jain et al. (1995).
There are several methods for edge detection, such as Canny, Sobel, Prewitt, and those based on Gaussian masks (kernels), like the Laplacian of Gaussian (LoG) and the Difference of Gaussians (DoG) Gonzalez and Woods (2011). The DoG method generally uses classical Gaussians in its approach, but this work suggests the use of the q-Gaussian for the composition of the mask that is applied to the image to extract its edges. The q-Gaussian probability distribution comes from the q-algebra introduced by Tsallis.
The q-algebra is derived from Tsallis' definition of nonextensive entropy. Several works in the literature have successfully applied the Tsallis q-entropy to image processing de Albuquerque et al. (2004); Sathya and Kayalvizhi (2010); Kilic and Kayacan (2012) and image analysis Fabbri et al. (2013, 2012). q-Gaussian kernels were previously used for noise reduction Soares and Murta (2013). In this work, we propose composing the DoG filter using q-Gaussian kernels. The potential of the proposed method is demonstrated by comparison with the traditional DoG. One can notice that it performs edge extraction with one extra parameter (q), which makes the DoG approach more flexible.
II Laplacian of Gaussian vs Difference of Gaussians
Consider the one-dimensional Gaussian distribution:

$$G(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}},$$

with $x \in \mathbb{R}$ and $\sigma > 0$, where $\mu$ is the mean, $\sigma$ is the standard deviation and $\sigma^2$ is the variance.
If we take the second derivative of the one-dimensional Gaussian function considering $\mu = 0$, we obtain (up to sign and normalization) the Ricker wavelet: $\psi(x) = \frac{1}{\sigma^{3}\sqrt{2\pi}}\left(1 - \frac{x^{2}}{\sigma^{2}}\right) e^{-\frac{x^{2}}{2\sigma^{2}}}$.
In two dimensions the Gaussian distribution becomes:

$$G(x, y) = \frac{1}{2\pi\sigma^{2}}\, e^{-\frac{x^{2}+y^{2}}{2\sigma^{2}}}. \qquad (1)$$
The Laplacian of Gaussian (LoG) is a multidimensional generalization of the Ricker wavelet. To obtain it we take the two-dimensional Laplacian of the Gaussian distribution:

$$\nabla^{2} G(x, y) = -\frac{1}{\pi\sigma^{4}}\left(1 - \frac{x^{2}+y^{2}}{2\sigma^{2}}\right) e^{-\frac{x^{2}+y^{2}}{2\sigma^{2}}}. \qquad (2)$$
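As a quick consistency check, applying the Laplacian operator to the two-dimensional Gaussian of Eq. (1) reproduces Eq. (2):

```latex
\nabla^{2} G(x,y)
  = \frac{\partial^{2} G}{\partial x^{2}} + \frac{\partial^{2} G}{\partial y^{2}}
  = \frac{1}{2\pi\sigma^{2}}
    \left(\frac{x^{2}+y^{2}}{\sigma^{4}} - \frac{2}{\sigma^{2}}\right)
    e^{-\frac{x^{2}+y^{2}}{2\sigma^{2}}}
  = -\frac{1}{\pi\sigma^{4}}
    \left(1 - \frac{x^{2}+y^{2}}{2\sigma^{2}}\right)
    e^{-\frac{x^{2}+y^{2}}{2\sigma^{2}}}.
```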
However, in practice the Laplacian of Gaussian (LoG), Figure 1, is approximated by the Difference of Gaussians (DoG) function, since this reduces the computational cost in two or more dimensions. The DoG is obtained by subtracting two Gaussian kernels, one of which has a standard deviation slightly smaller than the other.
Figure 1 compares the LoG function with the DoG function built from kernels with standard deviations $\sigma_1$ and $\sigma_2$. The DoG method has lower computational cost, which justifies its use in this study. The convolution of the DoG filter with the input image produces the edge detection for that image.
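A minimal sketch of this construction (the kernel size, σ values and function names below are illustrative choices, not values from this work): the DoG mask is the pointwise subtraction of two sampled, normalized Gaussian kernels.

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """Sampled, sum-normalized 2-D Gaussian kernel (size must be odd)."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return g / g.sum()

def dog_kernel(size, sigma1, sigma2):
    """Difference of Gaussians mask; expects sigma1 < sigma2."""
    return gaussian_kernel(size, sigma1) - gaussian_kernel(size, sigma2)
```

Because each Gaussian is normalized to unit sum, the DoG mask sums to zero: it responds only to intensity variations, not to flat regions, which is exactly the band-pass behaviour that approximates the LoG.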
III q-Gaussian
In 1988, Tsallis proposed a nonadditive statistical mechanics, entitled "q-statistics" Tsallis (1988). This theory suggests that different systems require different analysis tools, appropriate to the particularities of each system. The informational entropy, applied to information theory by Shannon Shannon (1948), is defined as $S = -\sum_{i=1}^{k} p_i \ln p_i$, where $p_i$ is the occurrence probability of state $i$ and $k$ is the total number of states.
The generalization proposed by Tsallis gives the definition of the q-entropy $S_q = \frac{1}{q-1}\left(1 - \sum_{i=1}^{k} p_i^{q}\right)$, where $p_i$ is the occurrence probability, $k$ is the total number of states and $q$ is an adjustable, freely variable parameter. The correct choice of this parameter can reveal important characteristics of the system. When $q \to 1$, one retrieves the standard entropy.
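The two definitions above can be sketched in a few lines (the helper name `tsallis_entropy` is a hypothetical choice, not from this work); note how the Tsallis form reduces to Shannon's entropy in the limit $q \to 1$:

```python
import numpy as np

def tsallis_entropy(p, q):
    """S_q = (1 - sum_i p_i^q) / (q - 1); Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                                 # zero-probability terms contribute 0
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))     # Shannon limit
    return float((1.0 - np.sum(p**q)) / (q - 1.0))
```

For a uniform distribution over four states, the Shannon value is ln 4, and evaluating the Tsallis form at q slightly above 1 approaches that same value, illustrating the limit.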
The q-Gaussian probability distribution comes from the maximization of the Tsallis entropy under appropriate constraints Tsallis (2011). Again, when $q \to 1$, one retrieves the Gaussian distribution. The q-Gaussian is defined as:

$$G_q(x) = \frac{\sqrt{\beta}}{C_q}\, e_q\!\left(-\beta x^{2}\right), \qquad (3)$$

with $e_q(x) = \left[1 + (1-q)x\right]_{+}^{\frac{1}{1-q}}$ the q-exponential function (where $[u]_{+} = \max(u, 0)$) and $C_q$ a normalization constant that depends on $q$.
Figure 2 shows some of the curves generated by the q-Gaussian equations, compared with the classical Gaussian ($q = 1$):
It is important to note that all the curves have the same width parameter $\beta$, but through the generalization proposed by Tsallis we gain a second adjustable parameter, $q$. Changes in this parameter promote changes in the traditional Gaussian shape, adapting it to the peculiarities of the problem in which it is applied. As $q \to -\infty$, the one-dimensional q-Gaussian tends to a Dirac delta function; conversely, as $q$ approaches the value $3$, its shape flattens toward a straight line.
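A numerical sketch of the building blocks of Eq. (3) (unnormalized, with illustrative function names; the constant $C_q$ is omitted here, since filter masks are typically rescaled anyway):

```python
import numpy as np

def q_exp(x, q):
    """q-exponential e_q(x) = [1 + (1-q)x]^(1/(1-q)) where the base is
    positive, 0 otherwise (the usual cutoff); plain exp(x) at q = 1."""
    x = np.asarray(x, dtype=float)
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    out = np.zeros_like(base)
    pos = base > 0
    out[pos] = base[pos] ** (1.0 / (1.0 - q))
    return out

def q_gaussian(x, q, beta=1.0):
    """Unnormalized q-Gaussian e_q(-beta x^2); Gaussian bell shape at q = 1."""
    x = np.asarray(x, dtype=float)
    return q_exp(-beta * x * x, q)
```

At $q = 2$ this reproduces the Lorentzian $1/(1 + \beta x^2)$, and for $q < 1$ the curve has compact support, vanishing beyond $|x| = 1/\sqrt{(1-q)\beta}$.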
Just as the classical Gaussian has a two-dimensional version, we can derive the multidimensional generalization of the q-Gaussian. The bidimensional q-Gaussian is defined by $G_q(x, y) \propto e_q\!\left(-\beta\,(x^{2} + y^{2})\right)$. It is important to note that curves with the same parameter $\beta$ can have their shape changed by adjusting the parameter $q$, adapting them to the peculiarities of the problem in which they are employed. Figures 3(a), 3(b) and 3(c) show some representatives of the family of 2D q-Gaussians.
IV Method
This work introduces the DoG method with q-Gaussian kernels as an alternative to the traditional use of Gaussian kernels in edge detection. Following the metric proposed by the DoG filter, standard deviations $\sigma_1$ and $\sigma_2$ are set, with $\sigma_1$ smaller than $\sigma_2$.
Once the filter has the appropriate size, we convolve it with the input image in gray scale. After the convolution, we identify the edges using a "zero-cross" detector. Figure 4 summarizes the described process.
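The final step can be sketched as a simple sign-change test between neighbouring pixels of the filtered image (the name `zero_cross` and the 4-neighbour rule are illustrative assumptions, not the exact detector used in this work):

```python
import numpy as np

def zero_cross(filtered):
    """Mark edge pixels where the DoG-filtered image changes sign
    between horizontally or vertically adjacent pixels."""
    edges = np.zeros(filtered.shape, dtype=bool)
    edges[:, :-1] |= filtered[:, :-1] * filtered[:, 1:] < 0   # horizontal neighbours
    edges[:-1, :] |= filtered[:-1, :] * filtered[1:, :] < 0   # vertical neighbours
    return edges
```

A pixel is flagged when the band-pass response crosses zero on its way to a neighbour, which is where the second-derivative-like DoG response indicates an intensity edge in the original image.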
V Results: Gaussian vs q-Gaussian Edge Detection
The results of edge detection using the DoG method with q-Gaussian kernels are richer in detail than those of the DoG method with classical Gaussian kernels, because the q-Gaussian probability distribution has the adjustable parameter $q$. This parameter allows us to define the degree of detail that we seek in the detection. Figure 5 shows results obtained with the q-Gaussian for different parameter settings.
VI Conclusions
The results presented in this work show that the DoG filter with q-Gaussian kernels is an excellent alternative to the LoG and to the DoG with classical Gaussian kernels. Compared to the LoG filter, the proposed method has lower computational cost: constructing the convolution mask from the subtraction of two Gaussian kernels with slightly different standard deviations $\sigma_1$ and $\sigma_2$ is much less costly than taking the Laplacian of the q-Gaussian function (which involves calculating derivatives).
Compared to the DoG filter with kernels using the normal probability distribution, we note a gain in the detail of edge detection. That is because, in addition to the variable parameter $\sigma$, responsible for more or less blurring (Gaussian blur), we also have the entropic index $q$, which is variable and responsible for the shape of the q-Gaussian, allowing more detail to be captured than with the traditional approach when both have the same blur.

The extensivity or nonextensivity of the entropy depends on the characteristics of the system; thus, it can be extensive for certain values of $q$. At this point we can apply this concept to our work: by using the q-Gaussian method, the entropic index $q$ allows us to adjust the function used in the filter to obtain the details and results that are most relevant.
Acknowledgments
Lucas Assirati acknowledges a grant from the Coordination for the Improvement of Higher Education Personnel (CAPES). Núbia R. Silva, Lilian Berton and Odemir M. Bruno are grateful to the São Paulo Research Foundation (FAPESP), grant Nos. 2011/21467-9, 2011/21880-3 and 2011/23112-3. Bruno also acknowledges the National Council for Scientific and Technological Development (CNPq), grant Nos. 308449/2010-0 and 473893/2010-0.
References
Gudmundsson et al. (1998) M. Gudmundsson, E. El-Kwae, and M. Kabuka, "Edge detection in medical images using a genetic algorithm," IEEE Transactions on Medical Imaging, 17, 469–474 (1998).
Augusto et al. (1984) G. Augusto, M. Goltz, and J. Demísio, "Detecção de bordas em imagens aéreas e de satélite com uso de redes neurais artificiais" [Edge detection in aerial and satellite images using artificial neural networks], 1044–1045 (1984).
Jain et al. (1995) R. Jain, R. Kasturi, and B. Schunck, Machine Vision (McGraw-Hill, 1995).
Gonzalez and Woods (2011) R. Gonzalez and R. Woods, Digital Image Processing (Pearson Education, 2011).
de Albuquerque et al. (2004) M. P. de Albuquerque, I. A. Esquef, and A. R. G. Mello, "Image thresholding using Tsallis entropy," Pattern Recognition Letters, 25, 1059–1065 (2004).
Sathya and Kayalvizhi (2010) P. D. Sathya and R. Kayalvizhi, "PSO-based Tsallis thresholding selection procedure for image segmentation," Pattern Recognition Letters, 5, 39–46 (2010).
Kilic and Kayacan (2012) I. Kilic and O. Kayacan, "Generalized ICM for image segmentation based on Tsallis statistics," Physica A: Statistical Mechanics and its Applications, 391, 4899–4908 (2012).
Fabbri et al. (2013) R. Fabbri, I. Bastos, F. Neto, F. Lopes, and O. Bruno, "Multi-q pattern classification of polarization curves," arXiv:1305.2876 [cs.CE] (2013).
Fabbri et al. (2012) R. Fabbri, W. N. Gonçalves, F. J. Lopes, and O. M. Bruno, "Multi-q pattern analysis: A case study in image classification," Physica A: Statistical Mechanics and its Applications, 391, 4487–4496 (2012).
Soares and Murta (2013) I. J. A. Soares and L. O. Murta, "Noise reduction using nonadditive q-Gaussian filters in magnetic resonance images," in Proc. SPIE 8669, Medical Imaging 2013: Image Processing, 86692J (2013).
Tsallis (1988) C. Tsallis, "Possible generalization of Boltzmann–Gibbs statistics," Journal of Statistical Physics, 52, 479–487 (1988).
Shannon (1948) C. E. Shannon, "A mathematical theory of communication," The Bell System Technical Journal, 27, 379–423, 623–656 (1948).
Tsallis (2011) C. Tsallis, "The nonadditive entropy S_q and its applications in physics and elsewhere: Some remarks," Entropy, 13, 1765–1804 (2011).