An intelligent extension of Variable Neighbourhood Search for labelling graph problems

In this paper we describe an extension of Variable Neighbourhood Search (VNS) which integrates the basic VNS with complementary approaches from machine learning, statistics and experimental algorithmics, in order to produce high-quality performance and to completely automate the resulting optimization strategy. The resulting intelligent VNS has been successfully applied to a pair of optimization problems whose solution space consists of the subsets of a finite reference set. These are the labelled spanning tree and forest problems, formulated on an undirected labelled graph; that is, a graph where each edge has a label from a finite set of labels L. The problems consist in selecting the subset of labels such that the subgraph generated by these labels has an optimal spanning tree or forest, respectively. These problems have several real-world applications where one aims to ensure connectivity by means of homogeneous connections.



1 Introduction

In this paper we sketch an Intelligent Variable Neighbourhood Search (Int-VNS) aimed at achieving further improvements of a successful VNS for the Minimum Labelling Spanning Tree (MLST) and the k-Labelled Spanning Forest (kLSF) problems. This approach integrates the basic VNS with other complementary intelligence tools and has been shown to be a promising strategy in [3] for the MLST problem and in [5] for the kLSF problem. The approach could easily be adapted to other optimization problems where the solution space consists of the subsets of a reference set, such as feature subset selection or some location problems. First we introduce a local search mechanism that is inserted on top of the basic VNS to obtain the Complementary Variable Neighbourhood Search (Co-VNS). Then we insert a probability-based constructive method and a reactive setting of the size of the shaking process.

2 The Labelled Spanning Tree and Forest problems

A labelled graph G = (V, E, L) consists of an undirected graph where V is its set of nodes and E is its set of edges, each edge carrying a label from the set L. In this paper we consider two problems defined on a labelled graph: the MLST and the kLSF problems. The MLST problem [2] consists in, given a labelled input graph G, finding the spanning tree with the minimum number of labels; i.e., the labelled spanning tree T of the input graph that minimizes the size of its label set L_T. The kLSF problem [1] is defined as follows: given a labelled input graph G and a positive integer k, find a labelled spanning forest F of the input graph having the minimum number of connected components, subject to the upper bound k on the number of labels used, i.e. with |L_F| <= k. Given a subset of labels C of L, the labelled subgraph induced by C may contain cycles, but one can arbitrarily break each of them by eliminating edges in polynomial time until a forest or a tree is obtained. Therefore, in both problems, the task is to find the optimal set of labels C. Since an MLST solution is also a solution to the kLSF problem whenever the obtained solution tree does not violate the limit on the number of labels used, it is easy to see that the two problems are deeply correlated. The NP-hardness of the MLST and kLSF problems was established in [2] and [1], respectively. Therefore any practical solution approach to both problems requires heuristics [1, 4].
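Both objectives reduce to counting the connected components of the subgraph induced by a candidate label set. A minimal sketch (the graph encoding and function names are ours, for illustration only):

```python
def n_components(n, edges, labels):
    """Connected components of the n-vertex subgraph restricted to the
    edges (u, v, lab) whose label lab belongs to `labels` (union-find)."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    count = n
    for u, v, lab in edges:
        if lab in labels:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                count -= 1
    return count

# Toy labelled graph: a label set C is MLST-feasible when it yields one
# component, and its cost is |C|; for kLSF the objective is the component
# count itself, subject to |C| <= k.
edges = [(0, 1, 'a'), (1, 2, 'a'), (2, 3, 'b'), (0, 3, 'c')]
print(n_components(4, edges, {'a'}))       # 2 components: {0,1,2} and {3}
print(n_components(4, edges, {'a', 'b'}))  # 1 component: feasible for MLST
```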

3 Complementary Variable Neighbourhood Search

The first extension of the VNS metaheuristic that we introduced for these problems is a local search mechanism inserted on top of the basic VNS [4]. The resulting local search method is referred to as Complementary Variable Neighbourhood Search (Co-VNS) [5, 3]. Given a labelled graph G, Co-VNS iteratively replaces each incumbent solution C with another solution selected from the complementary space of C, defined as the set of labels that are not contained in C, that is L \ C. The iterative extraction of a complementary solution helps the algorithm escape from possible traps in local minima, since the complementary solution lies in a very different zone of the search space with respect to the incumbent solution. This process yields an immediate peak of diversification of the whole local search procedure. To obtain a complementary solution, Co-VNS uses a greedy heuristic as constructive method in the complementary space of the current solution. For the MLST and kLSF problems this greedy heuristic is the Maximum Vertex Covering Algorithm (MVCA) [2], applied to the subgraph of G with labels in L \ C. Note that Co-VNS stops if either the set of unused labels contained in the complementary space is empty (L \ C is empty) or a final feasible solution is produced. Successively, the basic VNS is applied in order to improve the resulting solution.
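The complementary construction step can be sketched as follows; the MVCA-style greedy and `n_components` below are our illustrative reimplementations under this encoding, not the authors' code:

```python
def n_components(n, edges, labels):
    # Union-find count of components of the subgraph induced by `labels`.
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    count = n
    for u, v, lab in edges:
        if lab in labels:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru], count = rv, count - 1
    return count

def mvca(n, edges, allowed):
    """MVCA-flavoured greedy: repeatedly add the label in `allowed` that
    minimises the component count, until the subgraph is connected or the
    candidate labels run out."""
    chosen, remaining = set(), set(allowed)
    while remaining and n_components(n, edges, chosen) > 1:
        best = min(remaining,
                   key=lambda l: n_components(n, edges, chosen | {l}))
        chosen.add(best)
        remaining.remove(best)
    return chosen

# Co-VNS step: rebuild a solution using only labels outside the incumbent C.
edges = [(0, 1, 'a'), (1, 2, 'a'), (2, 3, 'b'), (0, 3, 'c'), (1, 3, 'c')]
incumbent = {'a'}
complement = mvca(4, edges, {'a', 'b', 'c'} - incumbent)
print(sorted(complement))  # a solution drawn entirely from L \ C
```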

At the starting point of VNS, it is necessary to define a suitable series of neighbourhood structures N_k, for k = 1, ..., k_max. In order to impose a neighbourhood structure on the solution space we use the Hamming distance between two solutions C and C', given by the size of their symmetric difference, which consists of the labels that are in one of the solutions but not in the other. VNS starts from an initial solution C, with k increasing iteratively from 1 up to the maximum neighbourhood size, k_max. The basic idea of VNS, changing the neighbourhood structure when the search is trapped at a local minimum, is implemented by the shaking phase. It consists of the random selection of another point in the neighbourhood N_k(C) of the current solution C. Given C, we consider its neighbourhood N_k(C) comprised of the sets having a Hamming distance from C equal to k labels, with k = 1, ..., k_max. In order to construct a neighbour of a solution C, the algorithm proceeds with the deletion of k labels from C.
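The shaking step under this Hamming-distance neighbourhood can be sketched as below (helper names are ours):

```python
import random

def hamming(c1, c2):
    """Hamming distance between two label sets: |C symmetric-difference C'|."""
    return len(set(c1) ^ set(c2))

def shake(solution, k, rng):
    """Pick a random point in N_k(C): as in the text, move to Hamming
    distance k from C by deleting k randomly chosen labels."""
    drop = rng.sample(sorted(solution), min(k, len(solution)))
    return set(solution) - set(drop)

rng = random.Random(0)
c = {'a', 'b', 'c', 'd'}
for k in range(1, 4):
    neighbour = shake(c, k, rng)
    print(k, sorted(neighbour), hamming(c, neighbour))  # distance equals k
```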

4 Intelligent Variable Neighbourhood Search

The proposed intelligent metaheuristic (Int-VNS) is built from Co-VNS by inserting a probability-based local search as the constructive method used to obtain the complementary-space solutions. In particular, this local search is a modification of the greedy heuristic, obtained by introducing a probabilistic choice of the next label to be added to incomplete solutions. By allowing worse components to be added to incomplete solutions, this probabilistic constructive heuristic produces a further increase in the diversification of the optimization process. The construction criterion is as follows. The procedure starts from an initial solution and iteratively selects a candidate move at random. If this move leads to a solution with a better objective function value than the current solution, it is accepted unconditionally; otherwise the move is accepted with a probability that depends on the deterioration, δ, of the objective function value. This construction criterion takes inspiration from Simulated Annealing (SA) [7]. However, the probabilistic local search works with partial solutions, which are iteratively extended with additional components until complete solutions emerge. In the probabilistic local search, the acceptance probability of a worse component into a partial solution is evaluated according to the usual SA criterion by the Boltzmann function exp(−δ/T), where the temperature parameter T controls the dynamics of the search. Initially the value of T is large, allowing many worse moves to be accepted, and it is gradually reduced by a geometric cooling law of the form T ← α · T, whose initial temperature and cooling factor are derived from the current best solution C*. This cooling law is very fast and produces a good balance between intensification and diversification capabilities. In addition, this cooling schedule does not require any intervention from the user regarding the setting of its parameters, as it is guided automatically by the best solution C*.

Therefore the whole optimization process is able to react to the search algorithm's behaviour and to adapt its setting online according to the problem instance under evaluation [7]. The probabilistic local search thus also allows the inclusion of less promising labels into incomplete solutions. The probability value assigned to each label decreases with the quality of the solution it yields. In this way, at each step, labels of better quality have a higher probability of being selected; the progressive reduction of the temperature in the adaptive cooling law produces, step by step, an increase of this spread in probabilities.
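The acceptance rule can be sketched as follows; the cooling constants here are illustrative placeholders, since in the paper they are tied automatically to the current best solution C*:

```python
import math
import random

def accept(delta, temperature, rng):
    """SA-style test for adding a candidate label to a partial solution:
    improving or neutral moves (delta <= 0) always pass; worsening moves
    pass with the Boltzmann probability exp(-delta / T)."""
    if delta <= 0:
        return True
    return rng.random() < math.exp(-delta / temperature)

def cool(temperature, alpha=0.9):
    """Geometric cooling T <- alpha * T; alpha in (0, 1) is a placeholder
    here (the paper derives the schedule from the best solution C*)."""
    return alpha * temperature

rng = random.Random(1)
t = 5.0                        # large initial T: many worse moves accepted
print(accept(1.0, t, rng))     # high temperature: usually accepted
print(accept(1.0, 1e-9, rng))  # exp(-1e9) underflows to 0: never accepted
```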

At the beginning of Int-VNS, the algorithm generates an initial feasible solution at random, which becomes the first current best solution C*, and sets the parameter k_max to the number of labels of the initial solution (k_max = |C*|). Then the Complementary procedure is applied to C* to obtain a solution C from the complementary space of C* by means of the probabilistic local search. The Complementary procedure stops if either a feasible solution is obtained, or the set of unused labels contained in the complementary space is empty, producing a final infeasible solution. Subsequently, the shaking phase used in the basic VNS is applied to the resulting solution C. It consists of the random selection of a point C' in the neighbourhood N_k(C) of the current solution C, as in Co-VNS. The successive local search also corresponds to that of Co-VNS. However, since either Co-VNS or the deletion of labels in the shaking phase can produce an incomplete solution, the first step of the local search consists of including additional labels in the current solution in order to restore feasibility, if needed. The addition of labels at this step follows the probabilistic procedure. Then the local search tries to drop labels from the solution, and then to add further labels following the greedy rule, until a feasible solution emerges. At this step, if no improvement is obtained, the neighbourhood structure is enlarged (k ← k + 1), producing progressively a larger diversification. Otherwise, the algorithm moves to the new solution, restarting the search with the smallest neighbourhood (k ← 1). This iterative process is repeated until the maximum size of the shaking phase, k_max, is reached. The resulting local minimum is compared with the current best solution C*, which is updated in case of improvement. At this point a reactive setting for the parameter k_max is used [6]. In case of an improved solution, k_max is decreased (k_max ← k_max − 1) in order to raise the intensification factor of the search process. Conversely, in case of no improvement, the maximum size of the shaking is increased (k_max ← k_max + 1) in order to enlarge the diversification factor of the algorithm. In either case, the adaptive value of k_max is bounded to lie within prescribed lower and upper limits, to avoid a lack of balance between intensification and diversification factors. The algorithm proceeds with the same procedure until the user termination conditions are satisfied, producing at the end the best solution to date, C*, as output.
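The reactive adjustment of the shaking size can be sketched as below; the bounds `lower` and `upper` stand in for interval limits the text does not specify:

```python
def update_kmax(k_max, improved, lower, upper):
    """Reactive setting of the maximum shaking size: shrink after an
    improvement (intensify), grow otherwise (diversify), clamped to the
    interval [lower, upper] to keep the search balanced."""
    k_max += -1 if improved else 1
    return max(lower, min(upper, k_max))

k_max = 5
k_max = update_kmax(k_max, improved=True, lower=2, upper=10)   # intensify
k_max = update_kmax(k_max, improved=False, lower=2, upper=10)  # diversify
print(k_max)  # back to 5 after one decrease and one increase
```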

5 Summary and Outlook

The achieved optimization strategy seems highly promising for both labelling graph problems. Ongoing work consists of statistical comparisons of the resulting strategy against the best algorithms in the literature for these problems, in order to quantify and qualify the improvements obtained. Further investigation will address the application of this strategy to other problems.

References

  • [1] R. Cerulli, A. Fink, M. Gentili, and A. Raiconi. The k-labeled spanning forest problem. Procedia - Social and Behavioral Sciences, 108:153–163, 2014.
  • [2] R. S. Chang and S. J. Leu. The minimum labelling spanning trees. Information Processing Letters, 63(5):277–282, 1997.
  • [3] S. Consoli, N. Mladenović, and J. A. Moreno-Pérez. Solving the minimum labelling spanning tree problem by intelligent optimization. Applied Soft Computing, 28:440–452, 2015.
  • [4] S. Consoli and J. A. Moreno-Pérez. Variable neighbourhood search for the k-labelled spanning forest problem. Electronic Notes in Discrete Mathematics, to appear, 2014.
  • [5] S. Consoli, J. A. Moreno-Pérez, and N. Mladenović. Intelligent variable neighbourhood search for the minimum labelling spanning tree problem. Electronic Notes in Discrete Mathematics, 41:399–406, 2013.
  • [6] A. Stenger, D. Vigo, S. Enz, and M. Schwind. An adaptive variable neighborhood search algorithm for a vehicle routing problem arising in small package shipping. Transportation Science, 47(1):64–80, 2013.
  • [7] E. Triki, Y. Collette, and P. Siarry. A theoretical study on the behavior of simulated annealing leading to a new cooling schedule. European Journal of Operational Research, 166:77–92, 2005.