A Parallel Way to Select the Parameters of SVM Based on the Ant Optimization Algorithm

05/19/2014, by Chao Zhang, et al.

A large body of experimental evidence shows that the Support Vector Machine (SVM) algorithm has clear advantages in text classification, handwriting recognition, image classification, bioinformatics, and other fields. To a large degree, the performance of an SVM depends on its kernel function and slack variable, which are determined by the parameters δ and C in the classification function. In other words, optimizing these two parameters plays a decisive role in optimizing the SVM algorithm. Ant Colony Optimization (ACO) is an optimization algorithm that simulates the way ants find an optimal path. Building on the available literature, we combine the ACO algorithm with parallel computing to find good parameters.


I Support Vector Classification and Parameters

SVM is based on the principle of structural risk minimization: it uses a limited set of training samples to obtain a decision function with high generalization ability. Suppose a sample set $T = \{(x_1, y_1), \dots, (x_l, y_l)\}$, where $l$ is the number of training samples, $x_i \in \mathbb{R}^n$ is the vector of sample features, and $y_i \in \{-1, +1\}$ is the sample class.
SVM classification function:

$f(x) = \operatorname{sign}(w \cdot x + b)$

($w$ is the weight vector, $b$ is the bias)

Functional margin:

$\hat{\gamma}_i = y_i (w \cdot x_i + b)$

The functional margin of the hyperplane with respect to the training set is the minimum margin over all samples: $\hat{\gamma} = \min_{i=1,\dots,l} \hat{\gamma}_i$.

Geometrical margin:

$\gamma = \dfrac{\hat{\gamma}}{\|w\|}$

When classifying a data point, the larger the margin, the more credible the classification; improving credibility therefore means maximizing the margin. However, $w$ and $b$ can be scaled proportionally without changing the hyperplane, so the functional margin $\hat{\gamma}$ can be made arbitrarily large, and it is not appropriate to maximize it. The geometrical margin $\gamma$ is invariant under such scaling of $w$ and $b$, so it is the appropriate quantity to maximize.
Slack variable:
Introducing $\xi_i \geq 0$ allows sample $x_i$ to deviate from the hyperplane to a certain extent, relaxing the constraint to $y_i(w \cdot x_i + b) \geq 1 - \xi_i$.
Radial Basis Function kernel:

$K(x_i, x_j) = \exp\left(-\dfrac{\|x_i - x_j\|^2}{2\delta^2}\right)$
Maximum margin classifier:

$\max_{w,b} \gamma \quad \text{s.t.} \quad y_i(w \cdot x_i + b) \geq \hat{\gamma}, \; i = 1, \dots, l$

Let $\hat{\gamma} = 1$, so the problem (with slack variables and penalty $C$) becomes

$\min_{w,b,\xi} \dfrac{1}{2}\|w\|^2 + C \sum_{i=1}^{l} \xi_i \quad \text{s.t.} \quad y_i(w \cdot x_i + b) \geq 1 - \xi_i, \; \xi_i \geq 0$

The constraints are associated with the objective function through the Lagrange function:

$L(w, b, \xi, \alpha, \mu) = \dfrac{1}{2}\|w\|^2 + C\sum_{i=1}^{l}\xi_i - \sum_{i=1}^{l}\alpha_i\left[y_i(w \cdot x_i + b) - 1 + \xi_i\right] - \sum_{i=1}^{l}\mu_i\xi_i$

Let $\theta(w) = \max_{\alpha_i \geq 0,\, \mu_i \geq 0} L(w, b, \xi, \alpha, \mu)$.

When all the constraints are satisfied, $\theta(w) = \frac{1}{2}\|w\|^2 + C\sum_i \xi_i$, which is exactly the value we want to minimize. So the objective function is

$\min_{w,b,\xi}\; \theta(w) = \min_{w,b,\xi}\; \max_{\alpha,\mu} L = p^*$

Dual function:

$\max_{\alpha,\mu}\; \min_{w,b,\xi} L = d^*$

To solve the problem, we require the partial derivatives of $L$ to vanish:

$\dfrac{\partial L}{\partial w} = 0 \Rightarrow w = \sum_{i=1}^{l}\alpha_i y_i x_i, \qquad \dfrac{\partial L}{\partial b} = 0 \Rightarrow \sum_{i=1}^{l}\alpha_i y_i = 0, \qquad \dfrac{\partial L}{\partial \xi_i} = 0 \Rightarrow C - \alpha_i - \mu_i = 0$

The resulting dual problem:

$\max_{\alpha} \sum_{i=1}^{l}\alpha_i - \dfrac{1}{2}\sum_{i,j=1}^{l}\alpha_i\alpha_j y_i y_j K(x_i, x_j) \quad \text{s.t.} \quad 0 \leq \alpha_i \leq C, \; \sum_{i=1}^{l}\alpha_i y_i = 0$

The final classification function:

$f(x) = \operatorname{sign}\left(\sum_{i=1}^{l}\alpha_i y_i K(x_i, x) + b\right)$
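To make the role of the two parameters concrete, the following minimal sketch (ours, not the paper's code) trains a C-SVC with an RBF kernel through the C API of LIBSVM, the library used in the experiments below. LIBSVM writes the RBF kernel as exp(−γ‖x_i − x_j‖²), so γ corresponds to 1/(2δ²) in the notation above; the toy data and the chosen parameter values are illustrative only.

#include <stdio.h>
#include "svm.h"                              /* LIBSVM header */

int main(void) {
    /* Toy 1-D training set: two samples per class, feature index 1,
       each vector terminated by index -1 as LIBSVM requires. */
    double y[4] = {+1, +1, -1, -1};
    struct svm_node x0[] = {{1, 0.0}, {-1, 0.0}};
    struct svm_node x1[] = {{1, 0.2}, {-1, 0.0}};
    struct svm_node x2[] = {{1, 1.0}, {-1, 0.0}};
    struct svm_node x3[] = {{1, 1.2}, {-1, 0.0}};
    struct svm_node *x[4] = {x0, x1, x2, x3};
    struct svm_problem prob = {4, y, x};

    double C = 18.605, delta = 0.6643;        /* the values ACO must choose */
    struct svm_parameter param = {0};
    param.svm_type    = C_SVC;
    param.kernel_type = RBF;
    param.C           = C;
    param.gamma       = 1.0 / (2.0 * delta * delta);
    param.cache_size  = 100;                  /* MB */
    param.eps         = 1e-3;
    param.shrinking   = 1;

    const char *msg = svm_check_parameter(&prob, &param);
    if (msg) { fprintf(stderr, "%s\n", msg); return 1; }

    struct svm_model *model = svm_train(&prob, &param);
    struct svm_node query[] = {{1, 0.1}, {-1, 0.0}};
    printf("f(0.1) = %+.0f\n", svm_predict(model, query));

    svm_free_and_destroy_model(&model);
    return 0;
}

The ACO search described next is, in effect, a loop over candidate (C, δ) pairs, each scored by exactly such a training run under cross-validation.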

II Modified Ant Colony Optimization Algorithm

Unlike the traditional TSP setting, in this algorithm coordinates are used to represent the nodes. In a two-dimensional rectangular coordinate system, X is defined as the index of a significant digit of the parameters C and δ, and Y varies from 0 to 10 [1]. The number of significant digits of C is assumed to be five, with the highest digit in the hundreds place; similarly, δ is assumed to have five significant digits, with the highest digit in the units place.
To realize the ant colony optimization, we follow the steps below:

  1. Suppose there are $m$ ants. Each ant has a one-dimensional array $R_k$ with $n$ elements ($n$ is the total number of significant digits of C and δ), used to store the vertical coordinate of each point that the ant visits.

  2. Set the loop counter $N = 0$ (up to a maximum of $N_{max}$ loops). Initialize each point's pheromone concentration to the same initial constant. Set $k = 1$.

  3. For $x = 1$ to $n$: calculate the deflection probability $p^k_{xy}$ of ant $k$ moving to each node $(x, y)$ on vertical line $x$, then select the next point by roulette-wheel selection and store that point's vertical coordinate $y$ in $R_k$.

    Here $A_k$ denotes ant $k$'s cross-validation accuracy, $p^k_{xy}$ the deflection probability from the current point to $(x, y)$, $Q$ a constant, $W$ the weight coefficient, and $Set$ the minimum acceptable accuracy.

  4. Let $k = k + 1$; if $k \leq m$, go to step 3; otherwise go to step 5.

  5. Record each ant's motion path and calculate the mapped parameter data $(C, \delta)$.

  6. Divide the training samples evenly into $k$ mutually exclusive subsets $S_1, \dots, S_k$.

  7. Calculate the cross-validation accuracy:

    1. Initialize $i = 1$;

    2. Reserve subset $S_i$ as the test set, take the remaining subsets as the training set, and train the SVM;

    3. Calculate the classification accuracy $A_i = \frac{Right}{Right + Error}$ on subset $S_i$; set $i = i + 1$; while $i \leq k$, repeat step 2;

    4. Calculate the mean of the $k$ classification accuracies: $A = \frac{1}{k}\sum_{i=1}^{k} A_i$

    $Right$: number of correctly classified (+1) samples

    $Error$: number of misclassified (-1) samples

    $A$: the mean sample classification accuracy

  8. Update the pheromone concentration at each point with the usual evaporation-and-deposit rule $\tau_{xy} \leftarrow (1 - \rho)\tau_{xy} + \Delta\tau_{xy}$. Clear each $R_k$.

  9. Reset $k = 1$ and set $N = N + 1$. When $N < N_{max}$ and the entire colony has not converged to the same path, go to step 3; when $N \geq N_{max}$, or the colony has substantially converged to the same path, the algorithm ends. The last updated path and its mapped data give the final optimized SVM parameters C and δ (a sketch of this mapping follows the list).
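As a concrete reading of steps 5 and 9, here is a hedged sketch (our reconstruction, with illustrative digits) of how a recorded path maps to the parameter pair: the first five vertical coordinates are read as the significant digits of C, highest digit in the hundreds place, and the last five as the digits of δ, highest digit in the units place.

#include <stdio.h>

/* Read five significant digits, starting from the given highest place value. */
static double digits_to_value(const int *d, double highest_place) {
    double value = 0.0, place = highest_place;
    for (int i = 0; i < 5; i++) {
        value += d[i] * place;
        place /= 10.0;
    }
    return value;
}

int main(void) {
    int R[10] = {0, 1, 8, 6, 1,     /* digits of C     */
                 0, 6, 6, 4, 3};    /* digits of delta */
    double C     = digits_to_value(R,     100.0);  /* -> 18.61   */
    double delta = digits_to_value(R + 5,   1.0);  /* -> 0.6643  */
    printf("C = %.4f, delta = %.4f\n", C, delta);
    return 0;
}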

To validate the algorithm, we ran a simulation experiment in MATLAB R2010b on a PC with a Windows 7 64-bit operating system, 4 GB of memory, and a Core i5 processor. We divided the data set into a training set of 90 samples and a test set of 88 samples. We set the ACO parameters to m = 30, N_max = 500, ρ = 0.7, Q = 100, and the remaining weight coefficients to 1. To calculate the classification accuracy, we used LIBSVM [3].
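The cross-validation score of step 7 can also be computed directly with LIBSVM's built-in routine; the sketch below (ours, reusing prob and param as set up in the earlier listing) returns the mean accuracy A for one candidate (C, δ):

#include <stdlib.h>
#include "svm.h"

/* Score one (C, delta) candidate by k-fold cross-validation. */
double cross_validation_accuracy(const struct svm_problem *prob,
                                 const struct svm_parameter *param, int k) {
    double *target = malloc(prob->l * sizeof(double));
    int right = 0;                            /* correctly classified samples */

    svm_cross_validation(prob, param, k, target);  /* predicted labels */
    for (int i = 0; i < prob->l; i++)
        if (target[i] == prob->y[i])
            right++;

    free(target);
    return (double)right / prob->l;           /* the mean accuracy A */
}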

LIBSVM has gained wide popularity in machine learning and many other areas [2].

Figure: (a) the best accuracy's path; (b) all accuracy's paths.

The best accuracy is 95.4545% and the convergence accuracy is 86.3636% (76/88).
The resulting parameters are C = 18.605 and δ = 0.6643.

III Parallel Optimization of the Parameters

The Open Computing Language (OpenCL) is an open specification for heterogeneous computing released by the Khronos Group in 2008. It resembles the NVIDIA CUDA platform but can be considered a superset of the latter; they basically differ in the following points [4]:

  • OpenCL is an open specification that is managed by a set of distinct representatives from industry, software development, academia and so forth.

  • OpenCL is meant to be implemented by any compute-device vendor, whether they produce CPUs, GPUs, hybrid processors, or other accelerators such as digital signal processors (DSPs) and field-programmable gate arrays (FPGAs).

  • OpenCL is portable across architectures, meaning that parallel code written in OpenCL is guaranteed to run correctly on every other supported device.

In this ant colony optimization algorithm, we have m ants and we loop N_max times. If we choose a large loop count, such as ten thousand or one hundred thousand, the updates of the node pheromone concentrations will be more accurate and reliable, but will take more time. In every cycle, each ant's visits to nodes are independent of the other ants', so the ant tour-construction process can be parallelized. In this article, we use OpenCL to parallelize the ants [4].

OpenCL kernel for the ant-based solution construction:

__kernel void construct_solutions(__global const float *tau,    /* pheromone, n x 10 */
                                  __global const float *rand01, /* uniforms in [0,1) */
                                  __global int *R,              /* chosen digits, m x n */
                                  const int n)
{
    int k = get_global_id(0);               /* one work-item per ant */
    for (int i = 0; i <= n - 1; i++) {      /* one column per significant digit */
        float cum[10];
        float sum = 0.0f;
        for (int j = 0; j <= 9; j++) {      /* accumulate pheromone on column i */
            sum += tau[i * 10 + j];
            cum[j] = sum;
        }
        float r = rand01[k * n + i] * sum;  /* roulette-wheel selection */
        int j = 0;
        while (cum[j] < r && j < 9)
            j = j + 1;
        R[k * n + i] = j;                   /* store the chosen vertical coordinate */
    }
}
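For completeness, here is a hedged host-side sketch (ours; buffer names and sizes are illustrative, and kernel_src must hold the source of the kernel above) showing how the construction kernel is launched with one work-item per ant, so that all m tours are built in parallel:

#include <CL/cl.h>
#include <stdio.h>
#include <stdlib.h>

#define M 30                          /* ants               */
#define N 10                          /* significant digits */

static const char *kernel_src = "";   /* paste construct_solutions here */

int main(void) {
    cl_int err;
    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_DEFAULT, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, &err);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kernel_src, NULL, &err);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel kern = clCreateKernel(prog, "construct_solutions", &err);

    /* Initial pheromone and pre-generated random numbers for all ants. */
    float tau[N * 10], rnd[M * N];
    for (int i = 0; i < N * 10; i++) tau[i] = 1.0f;
    for (int i = 0; i < M * N; i++) rnd[i] = rand() / (float)RAND_MAX;

    cl_mem d_tau = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                  sizeof(tau), tau, &err);
    cl_mem d_rnd = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                  sizeof(rnd), rnd, &err);
    cl_mem d_R = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, M * N * sizeof(int),
                                NULL, &err);

    int n = N;
    clSetKernelArg(kern, 0, sizeof(cl_mem), &d_tau);
    clSetKernelArg(kern, 1, sizeof(cl_mem), &d_rnd);
    clSetKernelArg(kern, 2, sizeof(cl_mem), &d_R);
    clSetKernelArg(kern, 3, sizeof(int), &n);

    size_t global = M;                /* one work-item per ant */
    clEnqueueNDRangeKernel(queue, kern, 1, NULL, &global, NULL, 0, NULL, NULL);

    int R[M * N];                     /* read back all chosen digits */
    clEnqueueReadBuffer(queue, d_R, CL_TRUE, 0, sizeof(R), R, 0, NULL, NULL);
    printf("ant 0, digit 0 -> y = %d\n", R[0]);
    return 0;
}

The pheromone update and the cross-validation scoring remain on the host between iterations.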

The training samples are divided evenly into subsetnumber subsets, each containing samplenumber samples. One subset serves as the test set and the remaining subsets as the training set; the SVM is then trained with the current parameters and the error of K-fold cross-validation is calculated.

OpenCL kernel for the sample classification accuracy:

__kernel void classification_accuracy(__global const int *label,     /* true labels */
                                      __global const int *predicted, /* SVM outputs */
                                      const int samplenumber,
                                      __global int *right,           /* per-subset counts */
                                      __global int *error)
{
    int s = get_global_id(0);                    /* one work-item per subset */
    int r = 0, e = 0;
    for (int i = 0; i <= samplenumber - 1; i++) {
        int idx = s * samplenumber + i;
        if (predicted[idx] == label[idx])
            r = r + 1;                           /* correctly classified (+1) */
        else
            e = e + 1;                           /* misclassified (-1) */
    }
    right[s] = r;
    error[s] = e;
}

IV Conclusion

Through the ant colony optimization algorithm, we can find satisfactory SVM parameters, and the convergence accuracy can be guaranteed to exceed 85%. There are also many other ways to optimize the parameters, such as genetic algorithms (GA) [5], the dynamic encoding algorithm [6] for handwritten digit recognition, and particle swarm optimization (PSO) [7]. Ant colony optimization can also be parallelized in other ways: article [8] introduced a new way to parallelize ant colony optimization on graphics processing units, and article [9] improves the ant colony optimization algorithm for data clustering.

Acknowledgment

References

  • [1] Liu Chun-bo, Wang Xian-fang, and P. F., “Parameters selection and simulation of support vector machines based on ant colony optimization algorithm,” J. Cent. South Univ., 2008.
  • [2] C.-C. Chang and C.-J. Lin, “LIBSVM: a library for support vector machines,” 2001.
  • [3] C.-C. Chang and C.-J. Lin, “LIBSVM: a library for support vector machines,” ACM Transactions on Intelligent Systems and Technology, vol. 2, pp. 1–27, 2011.
  • [4] H. J. C. Barbosa, Ed., Ant Colony Optimization: Techniques and Applications. InTech, 2013. ISBN 978-953-51-1001-9, 203 pages.
  • [5] Zheng Chun-hong et al., “Automatic parameters selection for SVM based on GA,” IEEE Press, 2004, pp. 1869–1872.
  • [6] Y. Park, S.-W. Kim, and H.-S. Ahn, “Support vector machine parameter tuning using dynamic encoding algorithm for handwritten digit recognition,” 2005.
  • [7] X. Li, S.-d. Yang, and J.-x. Qi, “A new support vector machine optimized by improved particle swarm optimization and its application,” Journal of Central South University of Technology, vol. 13, pp. 568–572, 2006.
  • [8] A. Delevacq, P. Delisle, and M. Gravel, “Parallel ant colony optimization on graphics processing units,” Journal of Parallel and Distributed Computing, vol. 73, 2013.
  • [9] R. Tiwari, M. Husain, S. Gupta, and A. Srivastava, “Improving ant colony optimization algorithm for data clustering,” pp. 529–534, 2010.