Online Mincut: Advice, Randomization and More

04/25/2020 · Avah Banerjee et al., Louisiana State University

In this paper we study the mincut problem on connected graphs in the online setting. We consider the vertex arrival model: whenever a new vertex arrives, its adjacencies to the previously revealed vertices are given. An online algorithm must make an irrevocable decision about which side of the cut the vertex belongs to, with the goal of minimizing the size of the cut. We consider several models. 1) For the classical and advice models we give tight bounds on the competitive ratio of deterministic algorithms. 2) Next we consider semi-adversarial inputs: a random order of arrival, both for adversarially generated graphs and for sparse graphs. 3) Lastly we introduce a new model, which we call the friendly sequence model. We look at several online optimization problems (mincut, maxcut and submodular maximization) and show that there are input orderings for which a greedy strategy produces an optimal answer.


1 Introduction

Let $\mathcal{H}$ be a (possibly infinite) graph. An online graph is a finite subgraph $G$ of $\mathcal{H}$ together with a total order $\sigma$ on $V(G)$ or (and) $E(G)$. Sometimes $\mathcal{H}$ is not mentioned when we describe a problem, because $G$ is allowed to be any finite graph and $\mathcal{H}$ is the disjoint union of all finite graphs (and thus there is no need to mention $\mathcal{H}$). In the vertex arrival model, the vertices of $G$ are revealed one at a time according to $\sigma$, each along with its neighbors in the current set of revealed vertices. In the edge arrival model, the vertices of $G$ are known and the edges arrive one at a time according to $\sigma$. We do not explicitly consider the edge arrival model in this paper. However, some of our results in the vertex arrival model can be extended to the latter setting without much effort.
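For concreteness, the following minimal Python sketch (our own illustration; none of these names appear in the paper) represents the vertex arrival model as a stream of pairs, each consisting of a newly revealed vertex and its neighbors among the previously revealed vertices.

# Minimal sketch of the vertex arrival model (representation is ours, not the paper's).
# The adversary fixes a graph and an order; the online algorithm only ever sees each
# new vertex together with its neighbors among the previously revealed vertices.

def vertex_arrival_stream(edges, order):
    """Yield (vertex, neighbors among already-revealed vertices) following `order`."""
    adjacency = {}
    for u, v in edges:
        adjacency.setdefault(u, set()).add(v)
        adjacency.setdefault(v, set()).add(u)
    revealed = set()
    for v in order:
        yield v, adjacency.get(v, set()) & revealed
        revealed.add(v)

# Example: a 4-cycle revealed in the order 1, 2, 3, 4.
if __name__ == "__main__":
    cycle = [(1, 2), (2, 3), (3, 4), (4, 1)]
    for v, nbrs in vertex_arrival_stream(cycle, [1, 2, 3, 4]):
        print(v, sorted(nbrs))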

1.1 Computational Models

We introduce some standard notions in online computation [5]; however, we frame our discussion in terms of online graph problems. Let $P$ be some optimization problem on a graph and let $\mathrm{OPT}_P(G)$ be the optimal value of $P$ when the input is $G$. It is also known as the offline optimal. Let $\mathcal{A}(G, \sigma)$ be the value of the output computed by some online algorithm $\mathcal{A}$ given the ordering $\sigma$. We use competitive analysis to measure the relative performance of $\mathcal{A}$ with respect to the offline minimum. Specifically, we say that $\mathcal{A}$ is $c$-competitive if for every input graph $G$ and order $\sigma$,

$\mathcal{A}(G, \sigma) \le c \cdot \mathrm{OPT}_P(G) + \alpha$

for some constant $\alpha$. If $\alpha = 0$ then the algorithm is said to be strictly $c$-competitive. The smallest $c$ for which an algorithm is (strictly) $c$-competitive is known as the (strict) competitive ratio. An algorithm is said to be competitive if it is 1-competitive. For maximization problems the competitive ratio is defined in a similar manner. It is important to note that the constant $\alpha$ must be independent of the input but may depend on the problem $P$ and on $\mathcal{A}$. We omit the subscript $P$ whenever the context is clear.

In the adversary model the input and its order of arrival (be it vertex or edge arrival) are determined by an adversary. At each step the algorithm makes an irrevocable decision based on the part of the input seen so far. No other knowledge about the input is available to the algorithm in advance (not even its length). The model is used for both deterministic and randomized algorithms [5]. In the deterministic setting, the adversary knows the algorithm (also referred to as the online player) in advance: for every input sequence the adversary knows the sequence of actions performed by the algorithm. Hence it is often assumed that the adversary creates the entire input and then feeds it to the online algorithm one piece at a time. In the case of randomized algorithms, however, the notion of an adversary is a bit more complex. Due to the use of random bits, the behaviour of a randomized online algorithm may differ from run to run even on the same input sequence. Informally, the power of an adversary depends on whether it is allowed to look at the current state of the online algorithm before deciding the next input.

Many different models can be considered semi-adversarial or non-adversarial. They are often characterized in terms of a weak adversary. In some models the online algorithm is supplied with additional information by a benevolent oracle. Interested readers can refer to [15, 11, 17, 9, 19] for a more detailed overview of these models. Some of the better known models are the random-order model (for the matroid secretary problem), the diffuse adversary (for paging), Markov processes (for paging), etc. Resource-augmentation models, where the adversary is made weak by giving more “resources” to the online algorithm, can also be thought of as semi-adversarial. A good example is the (h, k)-server problem [1], where h is the number of servers the adversary is allowed to use to process the requests it generates. These models are an important alternative to the adversarial models as they strive to represent real-world situations more accurately. We consider two such models: the first one is a randomized/restricted input model and the latter is a beyond-worst-case measure. We discuss these next.

1.1.1 Semi-adversarial Inputs

In the context of online graph problems, we look at a relevant semi-adversarial model: the arrival order of the vertices is chosen uniformly at random. In this setting we consider two situations: (1) the graph is adversarially generated; (2) the graph comes from a particular family of graphs which is known to the algorithm in advance. In particular we look at sparse graphs.

In the random order model we want to determine the competitive ratio in terms of the expected value of the solution computed by the algorithm. That is,

$\mathbb{E}_{\sigma}[\mathcal{A}(G, \sigma)] \le c \cdot \mathrm{OPT}(G) + \alpha.$

The above expectation is over the random permutation $\sigma$ and possibly over the random choices made by $\mathcal{A}$. Since $G$ is not random, the optimal value $\mathrm{OPT}(G)$ is not a random variable.

1.1.2 Friendly Sequence Model

For many online problems the classical worst-case model yields pessimistic results. A review of some well known alternatives can be found in [15, 11]. However, there are online problems, particularly in the minimization setting, where these models fail to distinguish the hardness of the problems. Graph problems such as finding the mincut, min-degree, minimum spanning tree, or a minimum dominating set (discussed later) are good examples of online problems which are considered hard even under many beyond-worst-case measures.

With this in mind we look at the following measure to evaluate the hardness of some online problems on graphs. An extension of this idea can also be used to compare the “robustness” of different online algorithms even if their worst-case performances are indistinguishable. At a high level, we classify problems based on whether, for every possible input graph, there is a “good ordering” of the input such that we can always find the optimal output using a fixed (necessarily greedy) strategy. Let $\mathcal{A}$ be an online algorithm for a graph minimization problem under the vertex arrival model. The input of $\mathcal{A}$ is a permutation $\sigma$ of (the vertices of) a graph $G$. To measure the performance of $\mathcal{A}$, in the adversary model we usually consider the worst case of $\mathcal{A}(G, \sigma)$ over all permutations $\sigma$. Similarly, for the random order model with adversarially generated input we are interested in the average of $\mathcal{A}(G, \sigma)$ over $\sigma$. Along this line, one natural question we may ask is: what is $\min_\sigma \mathcal{A}(G, \sigma)$, over all $\sigma$?

For many problems it is easy to construct an algorithm $\mathcal{A}$ such that $\min_\sigma \mathcal{A}(G, \sigma) = \mathrm{OPT}(G)$ holds for all graphs $G$. For instance, if the problem is minimum vertex cover, then it is clear that the following $\mathcal{A}$ satisfies the requirement: place $v_{j+1}, \ldots, v_n$ in the cover, where $j$ is the largest index such that $\{v_1, \ldots, v_j\}$ is an independent set. To see that this achieves the optimum for some order, we only need to take a maximum independent set $I$ and define $\sigma$ to be a permutation that first lists all vertices of $I$ and then the vertices of $V(G) \setminus I$.
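As a concrete rendering of this vertex-cover example (a sketch based on the description above; the function names are ours), the rule below keeps a vertex out of the cover as long as the revealed vertices remain independent, and an ordering that lists a maximum independent set first makes it output a minimum vertex cover.

# Sketch of the online vertex-cover rule described above (our own formalization).
# A vertex stays outside the cover as long as the revealed vertices form an
# independent set; every vertex revealed after independence is broken joins the cover.

def online_vertex_cover(stream):
    """`stream` is a list of (vertex, neighbors among already-revealed vertices)."""
    cover, outside = set(), set()
    still_independent = True
    for v, revealed_nbrs in stream:
        if still_independent and not revealed_nbrs:
            outside.add(v)                 # revealed prefix is still independent
        else:
            still_independent = False
            cover.add(v)
    return cover

# Friendly order for the path a-b-c: list the maximum independent set {a, c} first.
stream = [("a", set()), ("c", set()), ("b", {"a", "c"})]
print(online_vertex_cover(stream))         # {'b'}, a minimum vertex cover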

However, there are also problems for which, no matter what $\mathcal{A}$ is, $\min_\sigma \mathcal{A}(G, \sigma)$ is different from $\mathrm{OPT}(G)$ for at least one graph $G$. For instance, consider the minimum domination problem: find a smallest set $D$ of vertices of $G$ such that every vertex outside $D$ is adjacent to at least one vertex inside $D$. This is such a problem. Suppose otherwise that some $\mathcal{A}$ satisfies $\min_\sigma \mathcal{A}(G, \sigma) = \mathrm{OPT}(G)$ for all $G$. Then

  1. $\mathcal{A}$ must place $v_1$ in $D$ because $G$ might have only one vertex. In general, if $\{v_1, \ldots, v_i\}$ is independent then $\mathcal{A}$ has to place all of them in $D$.

  2. If $v_2$ is adjacent to $v_1$ then $\mathcal{A}$ must place $v_2$ outside $D$ because $G$ might consist of the single edge $v_1 v_2$. In general, if $v_{i+1}$ is adjacent to exactly one vertex of the independent set $\{v_1, \ldots, v_i\}$ then $\mathcal{A}$ must place $v_{i+1}$ outside $D$, since $G$ might consist of exactly these $i+1$ vertices.

Now let $T$ be a suitable tree with five edges; such a tree can be chosen so that it has a unique minimum dominating set. If $\sigma$ is a permutation for which $\mathcal{A}(T, \sigma) = \mathrm{OPT}(T)$, then by (1) above we may assume that the first revealed vertex is placed in the dominating set, and (2) then yields a contradiction. So, no matter what $\mathcal{A}$ is, $\mathcal{A}(T, \sigma) \neq \mathrm{OPT}(T)$ for all $\sigma$.

The above two examples show the two extremes concerning $\min_\sigma \mathcal{A}(G, \sigma)$. In this paper we establish that for the mincut problem there exists an algorithm $\mathcal{A}$ with $\min_\sigma \mathcal{A}(G, \sigma) = \mathrm{OPT}(G)$ for all $G$. We extend our results to other graph optimization problems such as online maxcut and submodular function maximization [20].

1.1.3 Advice Model

Advice in the context of online computation is a model where some information about the future input is available to the algorithm. Its inception is somewhat recent [6, 10]. The informal idea is as follows. The online algorithm is given access to a friendly oracle which knows the input in advance. The oracle is assumed to have unlimited computational power. The algorithm is allowed to ask arbitrary questions of this oracle at any stage of the computation. We do not care about the nature of the information received, but rather about its amount, in terms of the number of bits. This quantity is known as the advice complexity of the algorithm. Given some online problem we want to determine the lower (upper) bound on the amount of advice needed by any (some) algorithm to achieve a certain competitive ratio. This model has been shown to be useful in proving certain lower bounds for online problems.

There are various flavors of advice models, which are more or less equivalent. The model we use here is a variant of the tape model [16]. Let $P$ be some online graph minimization problem and let $\mathcal{A}$ be an algorithm solving $P$ which has access to an advice string $\phi$. We say $\mathcal{A}$ is $c$-competitive with advice complexity $b$ for $P$ if for every input there is an advice string $\phi$ of size at most $b$ such that:

$\mathcal{A}^{\phi}(G, \sigma) \le c \cdot \mathrm{OPT}(G) + \alpha,$

where $\alpha$ is some constant independent of the size of $G$. The advice complexity $b$ can be a function of the size of $G$; however, it does not depend on $G$ itself. In the above definition we implicitly assume that the length of the advice string is known to the algorithm. Otherwise we may assume that advice strings are self-delimiting, adding a logarithmic overhead.

1.2 Problem Definition and Notations

Let $G$ be a graph. For any disjoint subsets $X, Y \subseteq V(G)$, we denote by $E(X, Y)$ the set of all edges of $G$ that are between $X$ and $Y$. A partition of $V(G)$ is a pair $(A, B)$ of disjoint subsets of $V(G)$ with $A \cup B = V(G)$. A cut of $G$ is a set that can be expressed as $E(A, B)$ for a partition $(A, B)$ of $V(G)$ with $A, B \neq \emptyset$. Note that every graph with two or more vertices must have at least one cut.

The minimum cut problem (MinCut) is to minimize $|E(A, B)|$ over all cuts $E(A, B)$ of $G$. Note that the minimum is finite for all $G$ with two or more vertices, and the minimum is $+\infty$ if $G$ has fewer than two vertices, since we are then minimizing over the empty set. For a graph with positive edge weights, the weighted problem is to minimize $w(A, B)$, where $w(A, B) = \sum_{e \in E(A, B)} w(e)$.

Let $\mathcal{C}$ be a class of graphs. All graphs considered in the paper are simple. By MinCut($\mathcal{C}$) we denote the problem with its input limited to graphs in $\mathcal{C}$. According to our definition (in section 1.1), an online algorithm $\mathcal{A}$ for MinCut($\mathcal{C}$) is called $c$-competitive if there exists a constant $\alpha$ (which may depend on $\mathcal{C}$) such that $\mathcal{A}(G, \sigma) \le c \cdot \mathrm{OPT}(G) + \alpha$ for every $G \in \mathcal{C}$ and every vertex order $\sigma$.

For any integer $k$, let $\mathcal{C}_k$ denote the class of $k$-edge-connected graphs. Equivalently, $\mathcal{C}_k$ consists of all graphs $G$ with minimum cut value at least $k$. In addition, every graph in $\mathcal{C}_k$ has at least $k+1$ vertices. We use $\mathcal{G}$ to denote an infinite collection of graphs; such a collection contains a graph of size $n$ whenever $n$ is sufficiently large.

We consider the minimum cut problem in the advice model as follows. The input, which is generated by the adversary, is a graph $G$ together with a total order on its vertices. We denote the vertices, under this order, by $v_1, v_2, \ldots, v_n$ throughout our discussion. A partial input sequence $v_1, \ldots, v_i$ is termed a prefix sequence. By symmetry we assume $v_1$ is placed in $A$. The algorithm may choose to ask questions even before $v_2$ is revealed. Since the placement of $v_1$ is fixed, it does not matter if these questions are asked before or after $v_1$ is revealed. To be consistent with all other steps, we assume that $\mathcal{A}$ does not ask anything before $v_2$ is revealed. So the process goes as follows:

Step 1: $v_1$ is revealed and is placed in $A$.
Step 2: $v_2$ is revealed, then $\mathcal{A}$ asks a question and gets an answer, then $v_2$ is placed in $A$ or $B$.
Step 3: $v_3$ is revealed, then $\mathcal{A}$ asks a question and gets an answer, ... and so on.

At the $i$th ($i \ge 2$) step of the computation, $\mathcal{A}$ has already placed $v_1, \ldots, v_{i-1}$ in $A$ or $B$, $v_i$ is just revealed, and $\mathcal{A}$ needs to decide where to place $v_i$. At this point, $\mathcal{A}$ will ask a question about $G$, with the knowledge of the subgraph of $G$ induced on $\{v_1, \ldots, v_i\}$ and possibly other information about $G$ that was obtained by $\mathcal{A}$ from the previous inquiries. We define $\mathcal{I}_i$ as the collection of potential inputs after seeing the first $i$ vertices. A partition of $\{v_1, \ldots, v_i\}$ is called extendable if it can be extended into an optimal solution.

1.3 Related Work

To the best of our knowledge, MinCut and its siblings (like min-bisection) have not been studied in the online setting. In contrast, there have been a few results related to MaxCut. The folklore randomized $\frac{1}{2}$-approximation for offline MaxCut also works in the online setting. In [2] the authors gave an almost tight bound on the competitive ratio of the maximum directed cut problem under the vertex arrival model.

A few other studies have considered online minimization problems on graphs. Two important problems in this area are online minimum spanning tree and online coloring [13, 12, 3]. For the minimum spanning tree problem, generally the edge arrival model is used. In [18] the authors study this problem when the edge weights are selected uniformly at random from $[0,1]$. More recently this problem has been studied in the advice setting [4].

1.4 Our Results

This section serves to summarize our main results and present some selected proofs. The proofs that are omitted can be found in the appendix.

1.4.1 Adversarial Input and Advice Complexity

Theorem 1.

(i) Let be an online algorithm for , where knows in advance. Then the following hold.
(a) If then is not -competitive for any .
(b) If and is -competitive then for some .
(ii) Suppose . Then there exists an online algorithm for , where does not know in advance, such that is -competitive for all .

We formally prove this result in appendix A.1. There is a trivial algorithm which creates a partition with one part containing a solitary vertex. We show that this is the best one can do without any additional information about the input except for its length. This result stands in contrast to that for the online maxcut problem. In the case of online minimization problems like mincut, making a single mistake can prove to be costly. Can advice help? There are two interesting cases to consider: one where we want to find an optimal cut, and the other where an approximate value would suffice. As it turns out, the advice complexity of these two problems is more or less the same. This is a bit surprising, as there are AOC-complete online problems for which this is not the case. Here AOC stands for asymmetric online cover, a class introduced in [7]. For problems in this class the number of advice bits needed by a $c$-competitive algorithm is known exactly, and these bounds are tight. But for the minimum cut the results are pessimistic. The following two theorems give the advice complexity of finding an optimal cut.

Theorem 2.

There is a competitive algorithm that finds a minimum cut using at most $n$ bits of advice.

This is proved formally in appendix A.2. To achieve this we simply ask the oracle whether putting the newly revealed vertex in $A$ keeps the current solution extendable; if not, we put it in $B$. Note that in this case the answer to each question is a single bit, hence there is no overhead due to self-delimiting strings. The algorithm correctly determines a minimum cut even if the graph is disconnected. Unfortunately, as Theorem 3 shows, this naive strategy is almost optimal.
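A minimal sketch of this advice scheme follows (our own rendering; `is_extendable` is a stand-in for the oracle's one-bit answers, not an interface defined in the paper).

# Sketch of the one-bit-per-vertex advice scheme (interface is ours, not the paper's).
# `is_extendable(A, B, v)` stands in for the oracle's one-bit answer: True means that
# placing v in A still allows the partial partition to be completed to a minimum cut.

def mincut_with_advice(vertices, is_extendable):
    A, B = set(), set()
    for i, v in enumerate(vertices):
        if i == 0:
            A.add(v)                       # by symmetry, the first vertex goes to A
        elif is_extendable(A, B, v):
            A.add(v)                       # advice bit says "A keeps the solution extendable"
        else:
            B.add(v)                       # otherwise the vertex must go to B
    return A, B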

Theorem 3.

There is a collection of graphs $\mathcal{G}$ such that any competitive algorithm solving MinCut($\mathcal{G}$) requires $\Omega(n)$ bits of advice.

Figure 1: The class of graphs used in the proof of Theorem 3.
Proof.

For every sufficiently large $n$ we present a graph for which a competitive algorithm requires the claimed number of advice bits. Each graph in the collection has a path of length 4 (see Figure 1). Additionally, all other vertices of the graph are divided into two parts, and each vertex of each part is adjacent to exactly two of the path vertices; there are no other edges. Suppose the adversary first reveals the vertices outside the path. The induced subgraph on these vertices forms an independent set. Let $\mathcal{I}$ be the set of potential graphs remaining after processing this set. The revealed set can be partitioned in many different ways, depending on which of its vertices (if any) are adjacent to which path vertices, and since the two sides are interchangeable there are exponentially many pairwise distinct optimal solutions in $\mathcal{I}$.

An optimal algorithm without advice must be able to distinguish between these pairwise distinct solutions before the path is revealed. By the standard information-theoretic argument, the claimed number of advice bits is necessary to solve the problem optimally. ∎

Next we ask: how much advice is necessary and sufficient to approximate the value of the mincut? Theorem 1 already gives a nontrivially competitive algorithm without any advice. However, with only $O(\log n)$ bits of advice we can produce a cut of size $\delta$, where $\delta$ is the minimum degree of $G$. At the beginning we ask the oracle for the position, in the arrival order, of a vertex of minimum degree, which requires $O(\log n)$ bits; a lower-order term accounts for the extra bits used to make the advice string self-delimiting. The algorithm puts this vertex in one part and all other vertices into the other part, resulting in a cut of size $\delta$. Unfortunately, if the minimum degree is large then this is no better than the algorithm without advice. In the next theorem we show that this is essentially the best one can do.
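The logarithmic-advice scheme just described can be sketched as follows (our own encoding; the paper's exact advice format may differ): the oracle encodes the arrival position of a minimum-degree vertex as a fixed-width binary string, and the algorithm isolates that vertex.

import math

# Sketch of the logarithmic-advice scheme (the encoding is our own choice).
# The oracle writes the arrival position of a minimum-degree vertex using a
# fixed-width binary string; the algorithm isolates that vertex, so the cut it
# produces has size equal to the minimum degree.

def encode_min_degree_position(position, n):
    width = max(1, math.ceil(math.log2(n)))
    return format(position, "b").zfill(width)      # about ceil(log2 n) advice bits

def cut_from_advice(arrived_vertices, advice_bits):
    special = int(advice_bits, 2)                   # position of a min-degree vertex
    A = {arrived_vertices[special]}                 # isolate that vertex
    B = set(arrived_vertices) - A
    return A, B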

Theorem 4.

Let be a -competitive algorithm for where . For every , if uses bits of advice then .

The proof uses a counting argument similar to the one above. The main technical contribution is devising a hard instance for the problem; we use the graph from Figure 2 for this purpose. This graph has a unique minimum cut, and for any other partition of its vertices the size of the resulting cut is large. The technical details are given in the appendix.

In some respects, the minimum cut problem shows a limitation of the advice model. Unlike AOC-complete problems, the advice complexity of mincut has a sharp phase transition: either we have a sufficient amount of advice to produce an optimal solution, or a sub-linear competitive ratio cannot be guaranteed.

1.4.2 On Semi-adversarial Models

In the previous section we showed that there is a competitive algorithm (with the ratio given in Theorem 1) when both the input graph and the order of arrival are determined by an adversary. This upper bound also holds when the order of arrival is determined by a random permutation. Unfortunately, it turns out this is the best we can do without any restriction on the input graph; we show this next. We complement this with an upper bound for sparse connected graphs.

Theorem 5.

For any deterministic algorithm for MinCut under the random vertex-order model there exists a class of graphs, defined for infinitely many values of $n$, for which

Here the expectation is taken over the random order.

Figure 2: A graph used in the proof of Theorem 4.
Proof.

We use the class of graphs from Theorem 4 (Figure 2), with suitably chosen parameters; the minimum cut value is the same as before, and an optimal offline algorithm returns this value. Consider any online algorithm $\mathcal{A}$. Without loss of generality we may assume $v_1$ is assigned to the part $A$. Let $V_i$ denote the set of the first $i$ vertices to arrive, and let $\mu_i$ be the expected value of the mincut computed by the online algorithm after processing $v_1$ through $v_i$. Let $\mathcal{A}'$ be the following algorithm, which has two phases, online and offline. In the online phase it processes the first $i$ vertices exactly as $\mathcal{A}$ does, creating a partial solution. Then it is allowed to read the rest of the input just like an offline algorithm; this is the offline phase. It outputs a final partition that minimizes the cut value while respecting the decisions made during its online phase. Let $\mu'_i$ be the expected value of the minimum cut computed by $\mathcal{A}'$. It is clear that the expected value $\mu_n$ achieved by $\mathcal{A}$ satisfies $\mu_n \ge \mu'_i$ for every $i$. Further, $\mu'_i$ is monotonically increasing in $i$. Hence we have,

We give a lower bound for as claimed in the theorem. Let be the minimum cut achievable after assigning the first vertices by , where is the resulting partition. There are two cases as follows.

Case 1:

[ and are not adjacent]. either puts (1) of them in or (2) puts in . Suppose chooses (1). Then,

(1)

Here is a lower bound on the minimum cut found by when and are in different stable sets and . Clearly . Since are picked from a random order,

(2)

From Equation 1 we get:

(3)

Now suppose puts in . A similar argument to the one above can be made to show that,

(4)
Case 2:

[ and are adjacent.] Again we have two possibilities. (1) puts in and (2) puts in . For the first case we have,

(5)
(6)

In a similar manner we find that if puts in then,

(7)

In all of the above cases, regardless of what $\mathcal{A}$ chooses to do with the vertex in question, we have

The right hand side of the above expression is maximized when and we get . ∎

For sparse connected graphs with $O(n)$ edges we can do significantly better in the random order model. The proof of the following theorem is given in appendix B.

Theorem 6.

In the random order model there is an algorithm that is $O(1)$-competitive in expectation for sparse connected graphs with $O(n)$ edges.

1.4.3 Friendly Sequence and Greedy Order

As we have discussed in section 1.1.2, the best-order performance $\min_\sigma \mathcal{A}(G, \sigma)$ could serve as a measure of the complexity of an online problem. In this section we study this measure for the online mincut and maxcut problems. In both cases, we establish that there exists an algorithm $\mathcal{A}$ satisfying $\min_\sigma \mathcal{A}(G, \sigma) = \mathrm{OPT}(G)$ for all $G$. In addition, we obtain an analogous result for maximizing a submodular function, and we refute the existence of such a result for minimizing a submodular function.

We begin by clarifying our terminology. Let $G$ be a graph. For the current discussion, we allow parallel edges but not loops in $G$. This is the same as allowing a nonnegative weight on edges and measuring the size of a cut by its total weight. In the mincut and maxcut problems, a cut is represented by a partition $(A, B)$ of $V(G)$ such that $E(A, B)$ is the set of edges between $A$ and $B$. We insist that neither $A$ nor $B$ may be empty, and we assume that $G$ has at least two vertices.

For any disjoint $X, Y \subseteq V(G)$, let $d(X, Y)$ denote the number of edges of $G$ between $X$ and $Y$. We write $d(x, Y)$ or $d(X, y)$ for $d(X, Y)$ if $X = \{x\}$ or $Y = \{y\}$, respectively. If $E(A, B)$ is a minimum or maximum cut of $G$ for a partition $(A, B)$ of $V(G)$ then we may simply call $(A, B)$ a minimum or maximum cut of $G$, respectively. If $S \subseteq V(G)$ then we use $G[S]$ to denote the subgraph of $G$ induced on $S$.

We will consider a greedy type algorithm. Let $v_1, v_2, \ldots, v_n$ be a permutation of $V(G)$. Let $(A, B)$ be the partition determined by the algorithm during the process. Since $n$ is unknown to the algorithm, it has to place $v_1$ in $A$ and $v_2$ in $B$, because it needs to ensure that both parts are nonempty even when $n = 2$. In the $i$th iteration ($i \ge 3$), vertex $v_i$ is revealed and the algorithm needs to decide if $v_i$ should go to $A$ or $B$. A simple greedy strategy is to make the choice depending on $d(v_i, A_{i-1})$ and $d(v_i, B_{i-1})$, the numbers of edges from $v_i$ to the parts $A_{i-1}$ and $B_{i-1}$ constructed so far. In the mincut problem, $v_i$ goes to $A$ if $d(v_i, A_{i-1}) > d(v_i, B_{i-1})$, while in the maxcut problem, $v_i$ goes to $A$ if $d(v_i, A_{i-1}) < d(v_i, B_{i-1})$. When $d(v_i, A_{i-1}) = d(v_i, B_{i-1})$, the algorithm needs a tie breaking rule to decide where $v_i$ should go.

Such a greedy strategy is a common-sense approach. The difficulty in studying such an algorithm is to come up with a simple tie breaking rule. It turns out that letting $v_i$ go with $v_{i-1}$ makes things work. To be more specific, in case $d(v_i, A_{i-1}) = d(v_i, B_{i-1})$, $v_i$ goes to $A$ if $v_{i-1}$ went to $A$, and $v_i$ goes to $B$ if $v_{i-1}$ went to $B$. Let $\mathcal{A}_{\min}$ and $\mathcal{A}_{\max}$ be our greedy algorithms with this tie breaking rule for the mincut and maxcut problems, respectively. For every graph $G$ with two or more vertices, we construct two permutations $\sigma_{\min}$ and $\sigma_{\max}$ such that $\mathcal{A}_{\min}(G, \sigma_{\min})$ is a minimum cut of $G$, and $\mathcal{A}_{\max}(G, \sigma_{\max})$ is a maximum cut of $G$. To achieve this, we need the following graph-theoretic result.

Theorem 7.

Every loopless graph $G$ has a minimum cut $(A, B)$ for which there exists a permutation $v_1, v_2, \ldots, v_n$ of $V(G)$ such that the following conditions are satisfied. For each $i$, let $A_i = A \cap \{v_1, \ldots, v_i\}$ and $B_i = B \cap \{v_1, \ldots, v_i\}$.
(i) $v_1 \in A$ and $v_2 \in B$.
(ii) For every $i \ge 3$, if $d(v_i, A_{i-1}) > d(v_i, B_{i-1})$ then $v_i \in A$, and if $d(v_i, A_{i-1}) < d(v_i, B_{i-1})$ then $v_i \in B$.
(iii) If is minimum with then and .

This theorem suggests the tie-breaking rule (R) we mentioned above:

(R)  If $d(v_i, A_{i-1}) = d(v_i, B_{i-1})$ then $v_i$ goes where $v_{i-1}$ went.

This is equivalent to the following rule.

(R)  If $d(v_i, A_{i-1}) = d(v_i, B_{i-1})$ and $v_{i-1} \in A$ then $v_i$ goes to $A$; if $d(v_i, A_{i-1}) = d(v_i, B_{i-1})$ and $v_{i-1} \in B$ then $v_i$ goes to $B$.

Now we can formally describe our Greedy Algorithm 1.

1:  Input: A graph $G$ whose vertices arrive online in the order $v_1, v_2, \ldots$
2:  Output: A cut $(A, B)$ of $G$.
3:  Initialize: $A \leftarrow \{v_1\}$, $B \leftarrow \{v_2\}$ and $i \leftarrow 3$.
4:  while a new vertex $v_i$ arrives do
5:     $a \leftarrow d(v_i, A)$ and $b \leftarrow d(v_i, B)$
6:     if $a > b$ then
7:        $A \leftarrow A \cup \{v_i\}$
8:     else if $a < b$ then
9:        $B \leftarrow B \cup \{v_i\}$
10:     else
11:        Use tie breaker (R) to decide whether $v_i$ joins $A$ or $B$; this decision is based on where $v_{i-1}$ went.
12:     end if
13:     $i \leftarrow i + 1$
14:  end while
15:  return $(A, B)$
Algorithm 1 A Greedy proto-Algorithm
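The proto-algorithm, together with tie-breaking rule (R), can be rendered as the runnable sketch below (our own Python version; the `maximize` flag switches between the mincut and the maxcut variants).

# Runnable sketch of the greedy proto-algorithm with tie-breaking rule (R)
# (our own rendering; `maximize=False` gives the mincut greedy of Algorithm 1,
# `maximize=True` the maxcut greedy of Algorithm 2).

def greedy_cut(stream, maximize=False):
    """`stream` yields (vertex, neighbors among already-revealed vertices)."""
    A, B = set(), set()
    prev_side = None
    for i, (v, nbrs) in enumerate(stream):
        if i == 0:
            side = A                              # v_1 is forced into A
        elif i == 1:
            side = B                              # v_2 is forced into B
        else:
            to_A, to_B = len(nbrs & A), len(nbrs & B)
            if to_A == to_B:
                side = prev_side                  # rule (R): follow v_{i-1}
            elif maximize:
                side = A if to_A < to_B else B    # maxcut: join the side with fewer edges
            else:
                side = A if to_A > to_B else B    # mincut: join the side with more edges
        side.add(v)
        prev_side = side
    return A, B

With a friendly arrival order, as guaranteed by the theorems that follow, this greedy recovers an optimal cut.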

Then the following is an immediate consequence of Theorem 7.

Theorem 8.

For the online mincut problem there exists a greedy algorithm with the following property: for every loopless graph $G$ there exists a permutation of $V(G)$ such that the algorithm, when taking this permutation as its input, produces a minimum cut.

To establish a similar result for the online MaxCut problem we need the following theorem.

Theorem 9.

Every loopless graph $G$ has a maximum cut $(A, B)$ for which there exists a permutation $v_1, v_2, \ldots, v_n$ of $V(G)$ such that the following conditions are satisfied. For each $i$, let $A_i = A \cap \{v_1, \ldots, v_i\}$ and $B_i = B \cap \{v_1, \ldots, v_i\}$.
(i) $v_1 \in A$ and $v_2 \in B$.
(ii) For every $i \ge 3$, if $d(v_i, A_{i-1}) < d(v_i, B_{i-1})$ then $v_i \in A$, and if $d(v_i, A_{i-1}) > d(v_i, B_{i-1})$ then $v_i \in B$.
(iii) If and then either or .

This theorem leads to the following.

Theorem 10.

For online MaxCut there exists a greedy algorithm satisfying the following property: for every loopless graph $G$ there exists a permutation of $V(G)$ such that the algorithm, when taking this permutation as its input, produces a maximum cut.

1:  Input: A graph $G$ whose vertices arrive online in the order $v_1, v_2, \ldots$
2:  Output: A cut $(A, B)$ of $G$.
3:  Initialize: $A \leftarrow \{v_1\}$, $B \leftarrow \{v_2\}$ and $i \leftarrow 3$.
4:  while a new vertex $v_i$ arrives do
5:     $a \leftarrow d(v_i, A)$ and $b \leftarrow d(v_i, B)$
6:     if $a < b$ then
7:        $A \leftarrow A \cup \{v_i\}$
8:     else if $a > b$ then
9:        $B \leftarrow B \cup \{v_i\}$
10:     else
11:        Put $v_i$ where $v_{i-1}$ went.
12:     end if
13:     $i \leftarrow i + 1$
14:  end while
15:  return $(A, B)$
Algorithm 2 A Greedy Algorithm for MaxCut
Proof.

We consider the greedy algorithm given in Algorithm 2. To see that it satisfies the theorem, for any loopless graph $G$, let the partition $(A, B)$ and the permutation be determined as in Theorem 9. On this permutation the algorithm produces exactly the partition $(A, B)$, which is a maximum cut, as required. ∎

Therefore, for the MinCut and MaxCut problems, we have established the existence of an algorithm $\mathcal{A}$ with $\min_\sigma \mathcal{A}(G, \sigma) = \mathrm{OPT}(G)$ for all $G$.

There are two related problems. Let $U$ be a finite set and let $f$ be a function defined on all subsets of $U$. If $f(X) + f(Y) \ge f(X \cup Y) + f(X \cap Y)$ holds for all $X, Y \subseteq U$ then $f$ is called a submodular function. Suppose $G$ is a graph and $f(S) = d(S, V(G) \setminus S)$, the number of edges between $S$ and $V(G) \setminus S$. Then it is not difficult to verify that $f$ is a submodular function. So minimizing and maximizing a submodular function can be considered as generalizations of mincut and maxcut. However, for the corresponding online problems there is a subtle difference. For the online submodular problem, if $S$ is the set of currently revealed elements, then the algorithm can access $f(X)$ for all $X$ contained in $S$. In contrast, if $S$ is the set of currently revealed vertices and $X \subseteq S$, the algorithm cannot access $d(X, V(G) \setminus X)$; it can only compute the number of edges between $X$ and $S \setminus X$.
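The cut-function example can be checked directly; the following small script (our own illustration) evaluates f(S) = |E(S, V \ S)| on a toy graph and verifies the submodular inequality over all pairs of subsets.

from itertools import combinations

# Quick numerical check (our own illustration) that the cut function
# f(S) = |E(S, V \ S)| of a graph is submodular, i.e.
# f(X) + f(Y) >= f(X | Y) + f(X & Y) for all X, Y.

edges = [(1, 2), (2, 3), (3, 4), (4, 1), (1, 3)]     # a small example graph
V = {1, 2, 3, 4}

def cut_value(S):
    return sum(1 for u, v in edges if (u in S) != (v in S))

def all_subsets(ground):
    items = list(ground)
    return [set(c) for r in range(len(items) + 1) for c in combinations(items, r)]

assert all(cut_value(X) + cut_value(Y) >= cut_value(X | Y) + cut_value(X & Y)
           for X in all_subsets(V) for Y in all_subsets(V))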

Nevertheless, we develop a greedy type algorithm which behaves very similarly to $\mathcal{A}_{\min}$ and $\mathcal{A}_{\max}$. In particular, for every submodular function $f$, we construct a permutation of $U$ under which the algorithm outputs a subset of $U$ that maximizes $f$. In other words, we establish that the best-order performance of this greedy matches the optimum for all submodular functions $f$. Finally, we remark that no such algorithm exists for minimizing a submodular function; an example is presented in appendix C.4.

References

  • [1] N. Bansal, M. Eliáš, Ł. Jeż, and G. Koumoutsos (2019). The (h, k)-server problem on bounded depth trees. ACM Transactions on Algorithms (TALG) 15(2), pp. 1–26.
  • [2] A. Bar-Noy and M. Lampis (2012). Online maximum directed cut. Journal of Combinatorial Optimization 24(1), pp. 52–64.
  • [3] Y. Bartal, A. Fiat, and S. Leonardi (1996). Lower bounds for on-line graph problems with application to on-line circuit and optical routing. In Proceedings of the Twenty-Eighth Annual ACM Symposium on Theory of Computing, pp. 531–540.
  • [4] M. P. Bianchi, H. Böckenhauer, T. Brülisauer, D. Komm, and B. Palano (2018). Online minimum spanning tree with advice. International Journal of Foundations of Computer Science 29(4), pp. 505–527.
  • [5] A. Borodin and R. El-Yaniv (2005). Online Computation and Competitive Analysis. Cambridge University Press.
  • [6] J. Boyar, L. M. Favrholdt, C. Kudahl, K. S. Larsen, and J. W. Mikkelsen (2016). Online algorithms with advice: a survey. ACM SIGACT News 47(3), pp. 93–129.
  • [7] J. Boyar, L. M. Favrholdt, C. Kudahl, and J. W. Mikkelsen (2015). Advice complexity for a class of online problems. In 32nd International Symposium on Theoretical Aspects of Computer Science (STACS 2015).
  • [8] C. Chekuri, S. Gupta, and K. Quanrud (2015). Streaming algorithms for submodular function maximization. In International Colloquium on Automata, Languages, and Programming, pp. 318–330.
  • [9] S. Dehghani, S. Ehsani, M. Hajiaghayi, V. Liaghat, and S. Seddighin (2017). Stochastic k-server: how should Uber work? arXiv preprint arXiv:1705.05755.
  • [10] S. Dobrev, R. Královič, and D. Pardubská (2009). Measuring the problem-relevant information in input. RAIRO – Theoretical Informatics and Applications 43(3), pp. 585–613.
  • [11] R. Dorrigiv (2010). Alternative measures for the analysis of online algorithms.
  • [12] M. Forišek, L. Keller, and M. Steinová (2012). Advice complexity of online coloring for paths. In International Conference on Language and Automata Theory and Applications, pp. 228–239.
  • [13] M. M. Halldórsson and M. Szegedy (1994). Lower bounds for on-line graph coloring. Theoretical Computer Science 130(1), pp. 163–174.
  • [14] E. Hazan and S. Kale (2012). Online submodular minimization. Journal of Machine Learning Research 13(Oct), pp. 2903–2922.
  • [15] B. Hiller and T. Vredeveld (2012). Probabilistic alternatives for competitive analysis. Computer Science – Research and Development 27(3), pp. 189–196.
  • [16] J. Hromkovič, R. Královič, and R. Královič (2010). Information complexity of online problems. In International Symposium on Mathematical Foundations of Computer Science, pp. 24–36.
  • [17] E. Koutsoupias and C. H. Papadimitriou (2000). Beyond competitive analysis. SIAM Journal on Computing 30(1), pp. 300–317.
  • [18] J. Remy, A. Souza, and A. Steger (2007). On an online spanning tree problem in randomly weighted graphs. Combinatorics, Probability and Computing 16(1), pp. 127–144.
  • [19] J. A. Soto (2013). Matroid secretary problem in the random-assignment model. SIAM Journal on Computing 42(1), pp. 178–211.
  • [20] M. Streeter and D. Golovin (2009). An online algorithm for maximizing submodular functions. In Advances in Neural Information Processing Systems, pp. 1577–1584.

Appendix

E Adversarial Input and Advice Complexity

E.1 Classical Model

Theorem 1 (restated).

Proof.

(a) Let $G$ be obtained from a suitable graph by adding an isolated vertex. The adversary first reveals two nonadjacent vertices. If the algorithm places them in the same part of the partition, then the adversary can complete the input so that the algorithm's cut is nonzero while the minimum cut is zero. If the algorithm places them in different parts of the partition then the adversary can again complete the input so that the same holds. If the algorithm were $c$-competitive, there would exist a constant $\alpha$, independent of the input, bounding the algorithm's cut for all such graphs and orders. Since the minimum cut is zero and the algorithm's cut can be made arbitrarily large as the graphs grow, this is impossible, and thus the algorithm is not $c$-competitive for any $c$.

(b) Since the algorithm is $c$-competitive, there exists a constant $\alpha$ satisfying the competitiveness inequality for all inputs in $\mathcal{C}_k$ and all vertex orders. Let $G$ be obtained from a suitable graph by adding a new vertex and joining it to some of its vertices so that $G$ is $k$-edge-connected and thus belongs to $\mathcal{C}_k$. The adversary first reveals two adjacent vertices. If the algorithm places them in the same part of the partition, the adversary can complete the input so that the algorithm's cut is much larger than the minimum cut; if the algorithm places them in different parts, the adversary can declare that neither of them is the newly added vertex, and the same conclusion holds. Comparing the resulting cut with the minimum cut and with the constant $\alpha$ yields the claimed lower bound on $c$, as required.

(ii) Let $\mathcal{A}$ be the following simple online algorithm: place the first revealed vertex in the first part of the partition and all other vertices in the second part. Note that this algorithm does not need to know $k$ in advance. We now prove that it is competitive with the ratio claimed in the theorem. To do so, we choose a suitable constant $\alpha$ and show that the competitiveness inequality holds for all graphs in $\mathcal{C}_k$ and all orders, which will prove (ii). We consider two cases.

If then .

If then .

Thus (ii) is verified.

E.2 Advice Complexity of MinCut

In Theorems 2 and 3 we give upper and lower bounds on the advice complexity of competitive algorithms.

Theorem 2 (restated).

Proof.

Let $\mathcal{A}$ define $A_1 = \{v_1\}$ and $B_1 = \emptyset$ when it receives $v_1$. For each $i \ge 2$, suppose $A_{i-1}$ and $B_{i-1}$ have been constructed. When $v_i$ is revealed, $\mathcal{A}$ asks: is $(A_{i-1} \cup \{v_i\}, B_{i-1})$ extendable? If the answer is yes then set $A_i = A_{i-1} \cup \{v_i\}$ and $B_i = B_{i-1}$; if the answer is no then set $A_i = A_{i-1}$ and $B_i = B_{i-1} \cup \{v_i\}$. At the end, $\mathcal{A}$ finds an optimal solution with at most $n$ bits of advice. ∎

Theorem 4 (restated).

Proof.

We will show that there is an infinite family of graphs for which the theorem holds. Consider a graph as shown in Figure 2. The induced subgraphs on the two sides are cliques, and we connect them via two intermediate sets. The intermediate sets are independent sets, and each vertex of an intermediate set is adjacent to all vertices of the corresponding clique. The adversary sends the vertices of the intermediate sets before sending any of the remaining vertices. Depending on how the intermediate vertices are connected to the rest of the graph, there are many pairwise different optimal solutions with the same minimum cut value; this is essentially the same argument we used when proving Theorem 3. With $b$ bits of advice there are only $2^b$ possible advice strings. Hence there exists some advice string which is read by the algorithm for many pairwise different optimal solutions. The adversary can then fool the algorithm into choosing a non-optimal solution: suppose that after reading the advice the algorithm chooses a partition according to one of these optimal solutions (that is, a partition of $G$); the adversary then sends the rest of the vertices according to some other such partition. Since the algorithm has no means of distinguishing these two cases based on the advice string, it will fail to optimally partition at least one of the two graphs. It is easy to see that for any non-optimal partition the resulting cut is large, which yields the claimed bound on the competitive ratio. ∎

F Semi-adversarial Inputs

F.1 Upper Bounds for Sparse Graphs

In this section we present a result on sparse connected graphs. Sparseness here is defined to mean that the graph has a linear number of edges. Many important families of graphs fall into this category, such as planar graphs, degree-bounded expanders, etc.

1:  Input: A sparse connected graph $G$ with a vertex arrival order chosen uniformly at random.
2:  Output: A cut $(A, B)$ of $G$.
3:  Initialize: $A \leftarrow \emptyset$, $B \leftarrow \emptyset$ and $i \leftarrow 1$.
4:  while a new vertex $v_i$ arrives do
5:     if $i = 1$ then
6:        $A \leftarrow \{v_i\}$
7:     else
8:        $B \leftarrow B \cup \{v_i\}$
9:     end if
10:     $i \leftarrow i + 1$
11:  end while
Algorithm 3 An algorithm for sparse graphs

Theorem 6 (restated).

Proof.

We show that Algorithm 3 achieves the claimed bound. Suppose the graph $G$ is connected and has $m$ edges. Algorithm 3 essentially puts a random vertex in $A$ and the rest in $B$. Since $G$ is connected, $\mathrm{OPT}(G) \ge 1$. Let $v$ be the vertex chosen to be placed in $A$ and let $\mathbb{E}[\deg(v)]$ be its expected degree. Let $d_1, d_2, \ldots, d_n$ be the degree sequence of $G$. The number of vertices of degree at least $d$ is at most $2m/d$. Since the total number of edges is $m$, we have $\sum_i d_i = 2m$.

We have

$\mathbb{E}[\deg(v)] = \frac{1}{n}\sum_{i} d_i = \frac{2m}{n},$

which is a constant when $m = O(n)$. Hence the competitive ratio is bounded.
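As a quick sanity check of this argument (our own illustration, not part of the paper), the cut produced by Algorithm 3 is the degree of the first arriving vertex, so its average over random orders can be estimated by simulation and compared with 2m/n.

import random

# Small illustration (ours) of Algorithm 3 in the random order model: the cut it
# returns is the degree of the first arriving vertex, so its average over random
# orders approaches the average degree 2m/n.

def algorithm3_cut(degrees, order):
    return degrees[order[0]]                 # A = {first vertex}, B = everything else

def estimate_expected_cut(n, edges, trials=10000):
    degrees = [0] * n
    for u, v in edges:
        degrees[u] += 1
        degrees[v] += 1
    total = 0
    for _ in range(trials):
        order = list(range(n))
        random.shuffle(order)
        total += algorithm3_cut(degrees, order)
    return total / trials                    # close to 2 * len(edges) / n

# Example: a path on 5 vertices (m = 4), expected cut is 2 * 4 / 5 = 1.6.
print(estimate_expected_cut(5, [(0, 1), (1, 2), (2, 3), (3, 4)]))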

Corollary 11.

For the class of connected graphs with $m$ edges there is an $O(m/n)$-competitive algorithm in expectation.

Proof.

This follows immediately from Theorem 6. ∎

G Greedy algorithms and Friendly Sequence Model

G.1 Online MinCut

Theorem 7 (restated).

Proof.

Let us choose a minimum cut $(A, B)$ with $|A|$ as small as possible. We prove that, with respect to this partition $(A, B)$, there exists a permutation satisfying (i)–(iii).

Claim 1. If $|A| = 1$ then the desired permutation exists.

Let $v_1$ be the unique member of $A$ and let $v_2$ be an arbitrary vertex of $B$. We prove that there is a desired permutation starting with the two specified terms $v_1, v_2$. Note that no matter how the permutation is determined, conditions (i) and (iii) are always satisfied. So when we define the permutation we only need to ensure condition (ii), which is equivalent to: for each $i \ge 3$, $d(v_i, A_{i-1}) \le d(v_i, B_{i-1})$ holds.

We define the permutation inductively. Suppose the terms $v_1, \ldots, v_j$ have been selected, where $2 \le j < n$. We prove that there exists a vertex among the remaining ones, which we call $v_{j+1}$, such that $d(v_{j+1}, A_j) \le d(v_{j+1}, B_j)$. Suppose otherwise that