1. Introduction
Soft computing concerns the design of techniques that can provide inexact though reasonable solutions to a given problem in a feasible amount of time. A branch of such approaches is represented by the so-called "bio-inspired" techniques, which aim at solving optimization problems by means of nature-driven heuristics (Yang, 2008). Also known as "metaheuristics", such techniques play an important role in both scientific and commercial communities. The research area devoted to the study and development of optimization techniques based on nature and living beings has grown widely in the last decades. Approaches based on bird flocks (Kennedy and Eberhart, 2001), bats (Yang and Gandomi, 2012), fireflies (Yang, 2010), ants (Dorigo and Stützle, 2004), and bees (Karaboga and Basturk, 2007) are some successful examples of techniques that have been used in a number of optimization-oriented applications, just to name a few.
As a consequence, one may refer to several optimization libraries based on metaheuristics in the literature (Egea et al., 2014; Beukelaer et al., 2016), e.g., https://pypi.python.org/pypi/metaheuristic-algorithms-python/0.1.6, http://opt4j.sourceforge.net, https://github.com/yasserglez/metaheuristics, http://home.gna.org/momh/, https://projects.coin-or.org/metslib, and http://dev.heuristiclab.com/trac.fcgi/wiki/Download, some of them implemented in Java, C++, C#, Python, R, and Matlab. There are plenty of techniques implemented in such libraries, ranging from population- and evolutionary-based optimization techniques to multi-objective algorithms. Some libraries even consider machine learning techniques, thus making a number of tools available to users and readers.
Although one may refer to a number of libraries, most of them have their bottlenecks, such as not being self-contained (i.e., they require other libraries to be installed alongside them), and some of them comprise a number of techniques other than optimization ones, which may take the user longer to get used to. In this work, we present LibOPT, an open-source library implemented in the C language for the development and use of metaheuristic-based optimization techniques. The library is self-contained, which means it does not require additional packages, it contains only optimization techniques, and it can be easily integrated with other tools. Currently, LibOPT implements a number of metaheuristic techniques and benchmarking functions, and it also contains hypercomplex-oriented optimization techniques based on quaternions and octonions. In fact, the library allows the user to work with any number of hypercomplex dimensions. To the best of our knowledge, LibOPT is the only library that groups all these features, mainly with respect to hypercomplex-based search spaces.
LibOPT has been used in a number of scientific works that comprise feature selection (Ramos et al., 2012; Rodrigues et al., 2014; Rodrigues et al., 2015; Ramos et al., 2016), deep learning fine-tuning (Papa et al., 2015a; Papa et al., 2015b; Papa et al., 2016; Papa et al., 2017; Rosa et al., 2015; Rosa et al., 2016), hyper-heuristics (Papa et al., 2016a), and quaternion-based optimization (Papa et al., 2016b), among others. The library is built upon the concept of fast prototyping: since all techniques share the very same underlying idea, it is quite easy to implement additional techniques. Also, if one desires to use the library rather than going deeper into metaheuristic design, the only thing to be implemented is the function to be minimized. In this paper, we present the main functionalities of LibOPT, as well as how one can use it in two different ways: (i) for application purposes, i.e., when we want to minimize some function, or (ii) for development purposes, i.e., when we want to implement a new metaheuristic optimization technique in the library. The paper provides a comprehensive but quite simple viewpoint of the main data structures used in the library, and shows how one can install and configure the package for further usage.
The remainder of the paper is organized as follows. Section 2 presents LibOPT and its main functionalities, and Section 3 addresses a "how-to"-like approach, i.e., how one can use the library to minimize her/his own function, as well as how to add a new technique. Finally, Section 4 states conclusions and future works.
2. Library Organization
In this section, we present the main tools implemented in LibOPT, as well as how to install the library and design your own function.
2.1. General Tools
LibOPT is freely available at GitHub (https://github.com/jppbsi/LibOPT), where a homepage presents all techniques and benchmarking functions currently available in the library (https://github.com/jppbsi/LibOPT/wiki). To date, LibOPT comprises the following techniques:

- Particle Swarm Optimization (Kennedy and Eberhart, 2001);
- Particle Swarm Optimization with Adaptive Inertia Weight (A. Nickabadi and Safabakhsh, 2011);
- Bat Algorithm (Yang and Gandomi, 2012);
- Flower Pollination Algorithm (Yang et al., 2014);
- Firefly Algorithm (Yang, 2010);
- Cuckoo Search (Yang and Deb, 2010);
- Genetic Programming (Koza, 1992);
- Black Hole Algorithm (Hatamlou, 2013);
- Migrating Birds Optimization (Duman et al., 2012);
- Geometric Semantic Genetic Programming (Moraglio et al., 2012);
- Artificial Bee Colony (Karaboga and Basturk, 2007);
- Water Cycle Algorithm (Eskandar et al., 2012);
- Harmony Search (Geem, 2009);
- Improved Harmony Search (Mahdavi et al., 2007); and
- Parameter-setting-free Harmony Search (Geem and Sim, 2010).
As one can observe, the library comprises a broad variety of techniques, including population- and evolutionary-based, phenomenon-mimicking, and nature-inspired ones.
In regard to hypercomplex-based techniques, LibOPT implements the following approaches concerning quaternion- and octonion-based representations:

- Particle Swarm Optimization;
- Particle Swarm Optimization with Adaptive Inertia Weight;
- Bat Algorithm (Fister et al., 2015);
- Flower Pollination Algorithm;
- Firefly Algorithm (Fister et al., 2013);
- Cuckoo Search;
- Black Hole Algorithm;
- Artificial Bee Colony;
- Harmony Search (Papa et al., 2016b);
- Improved Harmony Search (Papa et al., 2016b); and
- Parameter-setting-free Harmony Search.
Notice that most of the above techniques in their hypercomplex representation have not even been published yet. Additionally, LibOPT implements a large set of benchmarking functions (Jamil and Yang, 2013), which are not displayed here for the sake of space; the full list is available at https://github.com/jppbsi/LibOPT/wiki/Benchmarking-functions.
2.2. Installation
The library was implemented and tested under Unix- and MacOS-based operating systems, and it can be quickly installed by executing the make command right after decompressing the file. On MacOS, if one faces any problem, one should try using the GNU gcc compiler (see https://github.com/jppbsi/LibOPT/wiki/Installation).
2.3. Data Structures
Apart from other directories, LibOPT contains two main folders, namely LibOPT/include and LibOPT/src, the former in charge of the header files and the latter responsible for the source files and main implementation.
In order to allow fast prototyping, the library was designed around one main structure called Agent, which has the following implementation in its simplest version:
typedef struct Agent_ {
    /* common definitions */
    int n;       /* number of decision variables */
    double *x;   /* position */
    double fit;  /* fitness value */
    double **t;  /* tensor */
} Agent;
The above implementation comprises all common information shared by the techniques implemented to date, which means all techniques available in the library must set these parameters: n stands for the number of decision variables to be optimized, and x is an array that encodes the current position of the agent when working under standard search spaces. Further, the variable fit stores the fitness value, and t stands for a matrix-like structure used to implement the hypercomplex-based versions of the naïve techniques; it works similarly to x, but in another search-space representation.
Another main structure models the whole search space, which includes additional information concerning the optimization problem other than the agents, as follows:
typedef struct SearchSpace_ {
    /* common definitions */
    int m;               /* number of agents (solutions) */
    int n;               /* number of decision variables */
    int iterations;      /* number of iterations */
    Agent **a;           /* array of pointers to agents */
    double *LB;          /* lower boundaries */
    double *UB;          /* upper boundaries */
    double *g;           /* global best agent */
    double **t_g;        /* global best tensor */
    int best;            /* index of the best agent */
    double gfit;         /* global best fitness */
    int is_integer_opt;  /* integer-valued problem? */
} SearchSpace;
Notice the library contains a quite detailed explanation of every attribute in order to avoid possible misunderstandings, thus allowing the user to take maximum advantage of LibOPT.
The main purpose of the SearchSpace structure is to encode crucial information about the optimization problem, such as the number of agents (solutions) m, the lower and upper boundaries of each decision variable, the global best agent, and the global best fitness, among others. As the reader may have noticed, SearchSpace also includes the number of decision variables n (the dimensionality of the search space), although the Agent structure contains the very same information. The reason is related to the fitness function, as shall be explained later, which takes an Agent structure as its main input parameter instead of the whole search space. Thus, we need Agent to be self-contained.
Both Agent and SearchSpace structures are defined in LibOPT/include/common.h, as well as other common structures and functions; their implementations can be found in LibOPT/src/common.c. In order to facilitate the allocation and deallocation of every structure in LibOPT, the library comprises constructors and destructors, similarly to an implementation in C++. As an example, we have the constructor Agent *CreateAgent(int n, int opt_id), which receives the number of dimensions (decision variables) and the identifier of the metaheuristic technique to be considered. For instance, one can create an agent with 10 dimensions for the Particle Swarm Optimization (PSO) technique as follows: Agent *A = CreateAgent(10, _PSO_), where _PSO_ is the directive concerning PSO. More detailed information about these directives is given further. The deallocation of that agent can be easily accomplished with the following command: DestroyAgent(&A, _PSO_).
2.4. Model Files
As aforementioned, although most techniques have something in common (e.g., number of decision variables, current position, and maybe velocity), they may also differ in their number of parameters. Such a circumstance led us to design a model-file-based implementation, which means all the parameter setup required by a given optimization technique must be provided in a single text file, hereinafter called a "model file".
For the sake of explanation, let us consider the model file of Particle Swarm Optimization (detailed information concerning the model files of the techniques implemented in LibOPT can be found at https://github.com/jppbsi/LibOPT/wiki/Model-files). Roughly speaking, the user must input all information required by that technique, as follows:
10 2 100     #<n_particles> <dimension> <max_iterations>
1.7 1.7      #<c1> <c2>
0.7 0.0 0.0  #<w> <w_min> <w_max>
-5.12 5.12   #<LB> <UB> x[0]
-5.12 5.12   #<LB> <UB> x[1]
The first line contains three integers: the number of agents (particles), the number of decision variables (dimension), and the number of iterations. Notice everything right after the character # is considered a comment, thus not taken into account by the parser. The next two lines configure the PSO parameters c1 and c2, and the inertia weight w. Since LibOPT implements the naïve PSO, it does not employ an adaptive inertia weight (w_min and w_max are used only by Particle Swarm Optimization with Adaptive Inertia Weight). Therefore, there is no need to set w_min and w_max. The last two lines set up the range of each decision variable. Since we have two dimensions in the example, each line stands for one variable, say x[0] and later x[1]. In the above example, we have a problem with 10 particles, 2 decision variables, and 100 iterations for convergence. Also, we used c1 = c2 = 1.7, w = 0.7, and x[0], x[1] in [-5.12, 5.12].
3. Using LibOPT
In this section, we present a toy example of using LibOPT to optimize your own function, and another one discussing how to add a new technique to the library.
3.1. Function Optimization
LibOPT works with the concept of "function minimization", which means you need to take that into account when trying to "maximize" some function. Suppose we want to minimize the following 2D function:

(1) f(x, y),

where x in [-10, 10] and y in [-10, 10]. Note that, for simplicity reasons, we will be using x[0] as x and x[1] as y. Since all functions are implemented in both LibOPT/include/function.h (header) and LibOPT/src/function.c files, one must add the function's signature to the first file and the function's implementation to the second one.
In LibOPT/include/function.h, the following line of code must be added: double MyFunction(Agent *a, va_list arg);. With respect to the file LibOPT/src/function.c, one should implement the function as follows:
In the above source code, the first two conditional structures verify whether the Agent has been allocated and whether the number of decision variables is valid. The next line implements the function itself: since n = 2, each agent has two dimensions only, i.e., a->x[0] and a->x[1]. Notice LibOPT uses double as the data type to allow higher precision.
Although the user can implement any function to be optimized, one needs to follow the guidelines defined in LibOPT/include/common.h by the following function pointer type: typedef double (*prtFun)(Agent *, va_list arg). This signature tells us the function to be minimized should return a double value, and that its first parameter should be an Agent, followed by a list of arguments that depends on the function.
In our example, suppose we want to use Particle Swarm Optimization to minimize MyFunction. We first need to define the parameters in a model file, according to Section 2.4. In this case, for the sake of explanation, we will use a model file similar to the one given in that section:
10 2 100     #<n_particles> <dimension> <max_iterations>
1.7 1.7      #<c1> <c2>
0.7 0.0 0.0  #<w> <w_min> <w_max>
-10 10       #<LB> <UB> x[0]
-10 10       #<LB> <UB> x[1]
Notice we have two decision variables to be optimized, as defined in the first line of the model file. Therefore, as the boundaries at the end of the file, we have set x[0], x[1] in [-10, 10].
Let pso_model.txt be the file name of the above model. Basically, one needs to create a main file that calls the PSO procedure as follows:
As one can observe, it is quite simple to execute PSO, since we need to call only five main functions:

- ReadSearchSpaceFromFile: it reads the model file and creates a search space;
- InitializeSearchSpace: it initializes the search space;
- CheckSearchSpace: it checks whether the search space is valid or not;
- runPSO: it minimizes function MyFunction; and
- DestroySearchSpace: it deallocates the search space.
Notice one can find a number of similar examples in LibOPT/examples.
3.2. Adding New Techniques
In this section, we discuss how to add a new technique to LibOPT (a more detailed explanation about this topic can be found at https://github.com/jppbsi/LibOPT/wiki/How-to-add-a-new-technique%3F). Let us consider a fictitious optimization algorithm called Brazilian Soccer Optimization (BSO), and the following steps:

Add the following line in LibOPT/include/opt.h: #define _BSO_ X, where X stands for a natural number not used before. This parameter (directive) stands for a unique number used as the metaheuristic technique identifier.
If your technique needs a different structure not implemented in LibOPT, you must do the following:

In the structure Agent, add your desired parameters. For instance, suppose BSO needs a player's strength for each decision variable. Thus, we need to consider the following structure:

typedef struct Agent_ {
    int n;            /* number of decision variables */
    double *x;        /* position */
    double *v;        /* velocity */
    double fit;       /* fitness value */
    ...
    double *strength; /* >>> NEW LINE HERE <<< */
} Agent;
In the structure SearchSpace, add your desired parameters. For instance, suppose BSO uses an additional variable that encodes the quality of the grass during the match; we need to add the following line:

typedef struct SearchSpace_ {
    int m;                 /* number of agents (solutions) */
    int n;                 /* number of decision variables */
    Agent **a;             /* array of pointers to agents */
    ...
    double grass_quality;  /* >>> NEW LINE HERE <<< */
} SearchSpace;
In function CreateAgent (LibOPT/src/common.c), you should add one more switch case in order to allocate your new variable, as well as to initialize it:

/* It creates an agent
Parameters:
n: number of decision variables
opt_id: identifier of the optimization technique */
Agent *CreateAgent(int n, int opt_id) {
    if ((n < 1) || (opt_id < 1)) {
        fprintf(stderr, "\nInvalid parameters @CreateAgent.\n");
        return NULL;
    }
    Agent *a = NULL;
    a = (Agent *)malloc(sizeof(Agent));
    a->v = NULL;
    a->strength = NULL; /* >>> NEW LINE HERE <<< */
    switch (opt_id) {
        case _PSO_:
            a->v = (double *)malloc(n * sizeof(double));
            break;
        ...
        case _BSO_: /* >>> NEW CASE HERE <<< */
            a->strength = (double *)malloc(n * sizeof(double));
            break;
        default:
            free(a);
            fprintf(stderr, "\nInvalid optimization identifier @CreateAgent\n");
            return NULL;
    }
    a->x = (double *)malloc(n * sizeof(double));
    return a;
}
In function DestroyAgent (LibOPT/src/common.c), you should deallocate your new variable:

/* It deallocates an agent
Parameters:
a: address of the agent to be deallocated
opt_id: identifier of the optimization technique */
void DestroyAgent(Agent **a, int opt_id) {
    Agent *tmp = NULL;
    tmp = *a;
    if (!tmp) {
        fprintf(stderr, "\nAgent not allocated @DestroyAgent.\n");
        exit(1);
    }
    if (tmp->x) free(tmp->x);
    switch (opt_id) {
        case _PSO_:
            if (tmp->v) free(tmp->v);
            break;
        case _BSO_:
            if (tmp->strength) free(tmp->strength); /* >>> DEALLOCATE YOUR VARIABLE HERE <<< */
            break;
        default:
            fprintf(stderr, "\nInvalid optimization identifier @DestroyAgent.\n");
            break;
    }
    free(tmp);
}
In function CreateSearchSpace (LibOPT/src/common.c), you should add one more switch case in order to allocate your new variable, as well as to initialize it. Notice you must do that only if you have an array-like variable.

/* It creates a search space
Parameters:
m: number of agents
n: number of decision variables
opt_id: identifier of the optimization technique */
SearchSpace *CreateSearchSpace(int m, int n, int opt_id) {
    SearchSpace *s = NULL;
    int i;
    if ((m < 1) || (n < 1) || (opt_id < 1)) {
        fprintf(stderr, "\nInvalid parameters @CreateSearchSpace.\n");
        return NULL;
    }
    s = (SearchSpace *)malloc(sizeof(SearchSpace));
    s->m = m;
    s->n = n;
    s->a = (Agent **)malloc(s->m * sizeof(Agent *));
    s->a[0] = CreateAgent(s->n, opt_id);
    if (s->a[0]) {
        for (i = 1; i < s->m; i++)
            s->a[i] = CreateAgent(s->n, opt_id);
    } else {
        free(s->a);
        free(s);
        return NULL;
    }
    switch (opt_id) {
        case _BSO_:
            /* >>> NEW VARIABLE HERE <<< */
            break;
    }
    return s;
}
In function DestroySearchSpace, you should deallocate your new variable.

/* It deallocates a search space
Parameters:
s: address of the search space to be deallocated
opt_id: identifier of the optimization technique */
void DestroySearchSpace(SearchSpace **s, int opt_id) {
    SearchSpace *tmp = NULL;
    int i;
    tmp = *s;
    if (!tmp) {
        fprintf(stderr, "\nSearch space not allocated @DestroySearchSpace.\n");
        exit(1);
    }
    for (i = 0; i < tmp->m; i++)
        if (tmp->a[i]) DestroyAgent(&(tmp->a[i]), opt_id);
    free(tmp->a);
    switch (opt_id) {
        case _BSO_:
            /* >>> DEALLOCATE YOUR VARIABLE HERE <<< */
            break;
        default:
            fprintf(stderr, "\nInvalid optimization identifier @DestroySearchSpace.\n");
            break;
    }
    free(tmp);
}


Finally, you need to update the Makefile in order to compile your new technique. You can just copy and paste the lines regarding any technique that has already been written.
4. Conclusions
In this paper, we presented LibOPT, an open-source library for handling metaheuristic techniques and function optimization. The main features of the library concern fast prototyping and self-contained code, as well as a simple but efficient implementation.
The library implements a number of optimization techniques, as well as more than a hundred benchmark functions. Additionally, LibOPT implements hypercomplex-based search spaces, which we believe makes it one of the first of its kind in the literature. We also showed how to use LibOPT to optimize functions, as well as how to add your own technique. Currently, LibOPT's implementation targets nonlinear optimization problems with simple bounds.
In regard to future works, we intend to make available multi- and many-objective versions of the techniques, to support constraint-handling techniques such as the penalty method, to extend the library's usage to discrete problems, and to support more efficient implementations based on Graphics Processing Units.
Acknowledgements.
The authors are grateful to FAPESP grants #2013/20387-7, #2014/12236-1, #2014/16250-9, #2015/25739-4, and #2016/21243-7, CNPq grant #306166/2014-3, and Capes.

References
 A. Nickabadi and Safabakhsh (2011) A. Nickabadi, M. M. Ebadzadeh, and R. Safabakhsh. 2011. A novel particle swarm optimization algorithm with adaptive inertia weight. Applied Soft Computing 11, 4 (2011), 3658–3670.
 Beukelaer et al. (2016) H. D. Beukelaer, G. F. Davenport, G. D. Meyer, and V. Fack. 2016. JAMES: An object-oriented Java framework for discrete optimization using local search metaheuristics. Journal of Software: Practice and Experience (2016). DOI:http://dx.doi.org/10.1002/spe.2459
 Dorigo and Stützle (2004) M. Dorigo and T. Stützle. 2004. Ant Colony Optimization. Bradford Company, Scituate, MA, USA.
 Duman et al. (2012) E. Duman, M. Uysal, and A. F. Alkaya. 2012. Migrating Birds Optimization: A New Metaheuristic Approach and Its Performance on Quadratic Assignment Problem. Information Sciences 217 (2012), 65–77.
 Egea et al. (2014) J. A. Egea, D. Henriques, T. Cokelaer, A. F. Villaverde, A. MacNamara, D. P. Danciu, J. R. Banga, and J. Saez-Rodriguez. 2014. MEIGO: an open-source software suite based on metaheuristics for global optimization in systems biology and bioinformatics. BMC Bioinformatics 15, 136 (2014), 1–9.
 Eskandar et al. (2012) H. Eskandar, A. Sadollah, A. Bahreininejad, and M. Hamdi. 2012. Water cycle algorithm: A novel metaheuristic optimization method for solving constrained engineering optimization problems. Computers & Structures 110 (2012), 151–166.

 Fister et al. (2015) I. Fister, J. Brest, I. Fister Jr., and X.-S. Yang. 2015. Modified bat algorithm with quaternion representation. In IEEE Congress on Evolutionary Computation. 491–498.
 Fister et al. (2013) I. Fister, X.-S. Yang, J. Brest, and I. Fister Jr. 2013. Modified firefly algorithm using quaternion representation. Expert Systems with Applications 40, 18 (2013), 7220–7230. DOI:http://dx.doi.org/10.1016/j.eswa.2013.06.070
 Geem (2009) Z. W. Geem. 2009. Music-Inspired Harmony Search Algorithm: Theory and Applications (1st ed.). Springer Publishing Company, Incorporated.
 Geem and Sim (2010) Z. W. Geem and K.-B. Sim. 2010. Parameter-setting-free harmony search algorithm. Appl. Math. Comput. 217, 8 (2010), 3881–3889.
 Hatamlou (2013) A. Hatamlou. 2013. Black hole: A new heuristic optimization approach for data clustering. Information Sciences 222 (2013), 175–184.
 Jamil and Yang (2013) M. Jamil and X.-S. Yang. 2013. A Literature Survey of Benchmark Functions for Global Optimization Problems. International Journal of Mathematical Modelling and Numerical Optimisation 4, 2 (2013), 150–194.
 Karaboga and Basturk (2007) D. Karaboga and B. Basturk. 2007. A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm. Journal of Global Optimization 39, 3 (2007), 459–471.
 Kennedy and Eberhart (2001) J. Kennedy and R. C. Eberhart. 2001. Swarm Intelligence. Morgan Kaufmann Publishers Inc., San Francisco, USA.
 Koza (1992) J.R. Koza. 1992. Genetic programming: on the programming of computers by means of natural selection. The MIT Press, Cambridge, USA.
 Mahdavi et al. (2007) M. Mahdavi, M. Fesanghary, and E. Damangir. 2007. An improved harmony search algorithm for solving optimization problems. Appl. Math. Comput. 188, 2 (2007), 1567–1579.
 Moraglio et al. (2012) A. Moraglio, K. Krawiec, and C. G. Johnson. 2012. Geometric Semantic Genetic Programming. Springer Berlin Heidelberg, Berlin, Heidelberg, 21–31.
 Papa et al. (2017) J. P. Papa, S. E. N. Fernandes, and A. X. Falcão. 2017. Optimum-Path Forest based on k-connectivity: Theory and Applications. Pattern Recognition Letters 87 (2017), 117–126. Issue 1.
 Papa et al. (2016a) J. P. Papa, L. P. Papa, R. J. Pisani, and D. R. Pereira. 2016a. A Hyper-Heuristic Approach for Unsupervised Land-Cover Classification. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 9 (2016), 2333–2346. Issue 6.
 Papa et al. (2016b) J. P. Papa, D. R. Pereira, A. B., and X.-S. Yang. 2016b. On the Harmony Search Using Quaternions. Springer International Publishing, Cham, 126–137.

 Papa et al. (2015a) J. P. Papa, G. H. Rosa, K. A. P. Costa, A. N. Marana, W. Scheirer, and D. D. Cox. 2015a. On the Model Selection of Bernoulli Restricted Boltzmann Machines Through Harmony Search. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO '15). ACM, New York, USA, 1449–1450.
 Papa et al. (2015b) J. P. Papa, G. H. Rosa, A. N. Marana, W. Scheirer, and D. D. Cox. 2015b. Model Selection for Discriminative Restricted Boltzmann Machines Through Metaheuristic Techniques. Journal of Computational Science 9 (2015), 14–18.

 Papa et al. (2016) J. P. Papa, W. Scheirer, and D. D. Cox. 2016. Fine-tuning Deep Belief Networks using Harmony Search. Applied Soft Computing 46 (2016), 875–885. Issue C.
 Ramos et al. (2016) C. C. O. Ramos, D. Rodrigues, A. N. de Souza, and J. P. Papa. 2016. On the Study of Commercial Losses in Brazil: A Binary Black Hole Algorithm for Theft Characterization. IEEE Transactions on Smart Grid PP, 99 (2016), 1–1.
 Ramos et al. (2012) C. C. O. Ramos, A. N. Souza, A. X. Falcão, and J. P. Papa. 2012. New Insights on Non-technical Losses Characterization Through Evolutionary-Based Feature Selection. IEEE Transactions on Power Delivery 27, 1 (2012), 140–146.
 Rodrigues et al. (2014) D. Rodrigues, L. A. M. Pereira, R. Y. M. Nakamura, K. A. P. Costa, X.-S. Yang, A. N. Souza, and J. P. Papa. 2014. A wrapper approach for feature selection based on Bat Algorithm and Optimum-Path Forest. Expert Systems with Applications 41, 5 (2014), 2250–2258.
 Rodrigues et al. (2015) D. Rodrigues, X.-S. Yang, A. N. Souza, and J. P. Papa. 2015. Recent Advances in Swarm Intelligence and Evolutionary Computation. Springer International Publishing, Cham, Chapter Binary Flower Pollination Algorithm and Its Application to Feature Selection, 85–100. DOI:http://dx.doi.org/10.1007/978-3-319-13826-8_5
 Rosa et al. (2016) G. H. Rosa, J. P. Papa, K. A. P. Costa, L. A. Passos, C. R. Pereira, and X.-S. Yang. 2016. Learning Parameters in Deep Belief Networks Through Firefly Algorithm. Springer International Publishing, Cham, 138–149.

 Rosa et al. (2015) G. H. Rosa, J. P. Papa, A. N. Marana, W. Scheirer, and D. D. Cox. 2015. Fine-Tuning Convolutional Neural Networks Using Harmony Search. In Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications. Lecture Notes in Computer Science, Vol. 9423. 683–690. 20th Iberoamerican Congress on Pattern Recognition.
 Yang et al. (2014) X.-S. Yang, M. Karamanoglu, and X. He. 2014. Flower pollination algorithm: A novel approach for multiobjective optimization. Engineering Optimization 46, 9 (2014), 1222–1237.
 Yang (2008) X.S. Yang. 2008. NatureInspired Metaheuristic Algorithms. Luniver Press.
 Yang (2010) X.-S. Yang. 2010. Firefly algorithm, stochastic test functions and design optimisation. International Journal of Bio-Inspired Computing 2, 2 (2010), 78–84.
 Yang and Deb (2010) X.-S. Yang and S. Deb. 2010. Engineering Optimisation by Cuckoo Search. International Journal of Mathematical Modelling and Numerical Optimisation 1, 4 (2010), 330–343.
 Yang and Gandomi (2012) X.-S. Yang and A. H. Gandomi. 2012. Bat algorithm: a novel approach for global engineering optimization. Engineering Computations 29, 5 (2012), 464–483.