Convex body chasing is a fundamental problem in online combinatorial optimization. It asks for an incrementally computed path that traverses a given sequence of convex sets, provided one at a time in an online fashion, and is as short as possible. Formally, the input consists of an initial point $x_0 \in \mathbb{R}^d$ and a sequence $X_1, X_2, \ldots, X_n$ of convex sets. The objective is to find a path $x_0, x_1, \ldots, x_n$ with $x_t \in X_t$ for each $t$ and minimum total length $\sum_{t=1}^{n} d(x_{t-1}, x_t)$. (Throughout the paper, by $d(x,y)$ we denote the Euclidean distance between points $x$ and $y$ in $\mathbb{R}^d$.) This path must be computed online, in the following sense: the sets are revealed over time, one per time step. At step $t$, when set $X_t$ is revealed, we need to immediately and irrevocably identify its visit point $x_t$. Thus the choice of $x_t$ does not depend on the future sets $X_{t+1}, \ldots, X_n$.
As can be easily seen, in this online scenario computing an optimal solution is not possible, so the best we can hope for is a path whose length approximates the optimum value. A widely accepted measure for the quality of this approximation is the competitive ratio. For a constant $R \ge 1$, we say that an online algorithm is $R$-competitive if it computes a path whose length is at most $R$ times the length of the optimum solution (computed offline). This constant $R$ is called the competitive ratio of the algorithm. Our objective is then to design an online algorithm whose competitive ratio is as close to $1$ as possible.
The convex body chasing problem was originally introduced in 1993 by Friedman and Linial [9], who gave a constant-competitive algorithm for chasing convex bodies in $\mathbb{R}^2$ (the plane) and conjectured that it is possible to achieve constant competitiveness in $d$-dimensional space $\mathbb{R}^d$ for every $d$. As shown in [9], this constant would have to depend on $d$; in fact, it needs to grow at least as fast as $\Omega(\sqrt{d})$.
The Friedman-Linial conjecture remained open for over two decades. In the last several years this topic has experienced a sudden increase in research activity, partly motivated by connections to machine learning (see [3, 7]), resulting in rapid progress. In 2016, Antoniadis et al. [1] gave an algorithm, with competitive ratio depending only on $d$, for chasing affine spaces of any dimension. In 2018, Bansal et al. [3] gave an algorithm with competitive ratio depending only on $d$ for nested families of convex sets, where the input set sequence satisfies $X_1 \supseteq X_2 \supseteq \cdots \supseteq X_n$. Soon after, their bound was improved to one nearly linear in $d$ by Argue et al. [2], and then to $O(\sqrt{d \log d})$ by Bubeck et al. [6]. Finally, Bubeck et al. [7] just recently announced a proof of the Friedman-Linial conjecture, providing an algorithm for arbitrary convex sets whose competitive ratio depends only on $d$.
Another natural variant of convex body chasing that has attracted attention in the literature is line chasing, where all sets $X_t$ are lines. Friedman and Linial [9] gave a constant-competitive online algorithm for line chasing in $\mathbb{R}^2$. Their algorithm was simplified by Antoniadis et al. [1], who also slightly improved its ratio. Earlier, in 2014, Sitters [13] showed that a generalized work function algorithm has constant competitive ratio for line chasing, but he did not determine the value of the constant.
Our results. We study the line chasing problem discussed above. Our main result is a $3$-competitive algorithm for line chasing in $\mathbb{R}^d$, for any dimension $d$, significantly improving the competitive ratios from [9, 1, 13]. Our algorithm is very simple and essentially memoryless, as it only needs to keep track of the last line in the request sequence. Its amortized analysis is based on a simple potential function. We start by presenting the algorithm for line chasing in the plane, in Section 2, and later, in Section 3, we extend it to arbitrary dimension. We also provide a lower bound (see Section 4), showing that no online algorithm can achieve a competitive ratio better than $1.5358$, even in the plane. This improves the lower bound of $\sqrt{2} \approx 1.412$ for line chasing established in [9]. To the best of our knowledge, this is also the best known lower bound for convex body chasing in the plane.
Other related work. A very general model for online optimization and competitive analysis, called Metrical Task Systems (MTS), was introduced in [5]. An instance of MTS specifies a metric space $M$, an initial point $x_0 \in M$, and a sequence $f_1, f_2, \ldots, f_n$ of non-negative functions on $M$ called tasks. These tasks arrive online, one at a time. At each step $t$, the algorithm needs to choose a point $x_t$ to which it moves in order to “process” the current task $f_t$. The goal is to minimize the total cost $\sum_{t=1}^{n} \left( d(x_{t-1}, x_t) + f_t(x_t) \right)$, where $d$ is the metric in $M$. Thus in an MTS, in addition to the movement cost, at each step we also pay the cost of “processing” $f_t$. For any metric space with $N$ points, if we allow arbitrary non-negative task functions then a competitive ratio of $2N-1$ can be achieved, and this is optimal. This general bound is not particularly useful, because in many online optimization problems that can be modelled as an MTS, the metric space has additional structure and only tasks of some special form are allowed, which makes it possible to design online algorithms with constant competitive ratios, independent of the size of $M$.
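As a small illustration of the MTS cost model, here is a minimal sketch with the real line as the metric space. The helper name `mts_cost` and the two sample tasks are our own illustrative choices, not taken from the literature:

```python
def mts_cost(x0, points, tasks):
    """Total MTS cost on the real line: movement plus processing.

    x0:     initial point
    points: the point x_t chosen by the algorithm at each step t
    tasks:  one non-negative task function f_t per step
    """
    cost, prev = 0.0, x0
    for x, f in zip(points, tasks):
        cost += abs(x - prev) + f(x)  # d(x_{t-1}, x_t) + f_t(x_t)
        prev = x
    return cost

# Two tasks: the first is cheapest at 0, the second at 1.
tasks = [lambda x: x * x, lambda x: abs(x - 1.0)]
print(mts_cost(0.0, [0.0, 1.0], tasks))  # 0 + 0 + 1 + 0 = 1.0
```

Staying put for the first task and moving only for the second pays pure movement here; processing a task away from its minimizer would add the corresponding $f_t(x_t)$ term.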
An MTS where $M = \mathbb{R}^d$ and all task functions are convex is referred to as convex function chasing, and was studied in [1, 4, 11]. For the special case of convex functions on the real line, a $2$-competitive algorithm was given in [4].
An MTS where each task function $f_t$ is the characteristic function of a subset $X_t \subseteq M$ (that is, $f_t(x) = 0$ for $x \in X_t$ and $f_t(x) = \infty$ otherwise) is called a Metrical Service System (MSS) [8]. In other words, in an MSS, in each step the algorithm needs to move to a point in $X_t$. One variant of MSS that has been particularly well studied is the famous $k$-server problem (see, for example, [12, 10]), in which one needs to schedule the movement of $k$ servers in response to requests arriving online in a metric space, where each request must be covered by one server. (In the MSS representation of the $k$-server problem, each set $X_t$ consists of all $k$-tuples of points that include the request point at step $t$.) Naturally, convex body chasing can be thought of as an MSS where $M = \mathbb{R}^d$ and the request sets are arbitrary convex subsets of $\mathbb{R}^d$.
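To make the MSS view concrete, here is a toy encoding of our own: request sets are intervals on the real line, the task charges $0$ inside the set and is prohibitive outside, so the algorithm must move into the set and pays movement only. The nearest-point rule used below is merely the simplest feasible choice (and, as Section 2 shows for lines, not competitive in general):

```python
def mss_step(prev, request_interval):
    """Serve one MSS request on the real line: move to the nearest
    point of the interval [lo, hi]; the cost is pure movement."""
    lo, hi = request_interval
    target = min(max(prev, lo), hi)  # nearest point of the request set
    return target, abs(target - prev)

pos, total = 0.0, 0.0
for interval in [(2.0, 3.0), (1.0, 5.0), (-1.0, 0.5)]:
    pos, cost = mss_step(pos, interval)
    total += cost
print(pos, total)  # 0.5 3.5
```

The second request costs nothing because the point reached for the first request already lies inside it.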
2 A 3-Competitive Algorithm in the Plane
In this section, we present our online algorithm for line chasing in $\mathbb{R}^2$ with competitive ratio $3$. The intuition is this: suppose that the last requested line is $\ell$ and that the algorithm moved to point $a \in \ell$. Let $\ell'$ be the new request line, $o$ the intersection point of $\ell$ and $\ell'$, and $\alpha$ the angle between them. A naïve greedy algorithm would move to the point on $\ell'$ nearest to $a$ (see Figure 1), at cost $d(a, \ell')$. If $\alpha$ is small, this cost is small, but the distance between the greedy algorithm's point and $o$ also decreases only by a negligible amount. The adversary can move to $o$, paying cost $d(a, o)$, and then alternate requests on $\ell$ and $\ell'$. On this sequence the overall cost of the greedy algorithm would be of order $d(a,o)/\alpha$, so it would not be constant-competitive. This example shows that if the angle between $\ell$ and $\ell'$ is small then the drift distance towards $o$ needs to be roughly proportional to $\alpha \cdot d(a,o)$. Our algorithm is designed so that this distance has roughly this form if $\alpha$ is small (with the coefficient chosen to optimize the competitive ratio), and that it becomes $d(a,o)$ when $\ell'$ is perpendicular to $\ell$.
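The failure of the naive greedy strategy can be checked numerically. In the sketch below (the function name and the parameters `D`, `alpha` are ours), both lines pass through a common point `o` at angle `alpha`; projecting a point at distance `r` from `o` onto the other line costs `r*sin(alpha)` and leaves the point at distance `r*cos(alpha)`:

```python
import math

def greedy_alternation_cost(D, alpha, rounds):
    """Total cost of the nearest-point (greedy) strategy when requests
    alternate between two lines through a common point o at angle alpha,
    starting at distance D from o."""
    r, cost = D, 0.0
    for _ in range(rounds):
        cost += r * math.sin(alpha)  # cost of projecting onto the other line
        r *= math.cos(alpha)         # remaining distance to o
    return cost

# The adversary serves the whole sequence by moving to o once, paying about D.
D, alpha = 1.0, 0.01
print(greedy_alternation_cost(D, alpha, 10**6))  # ~199.998, about 2*D/alpha
```

As `alpha` shrinks, greedy's total cost grows like $2D/\alpha$ while the adversary still pays about $D$, so no constant ratio is possible for greedy.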
Algorithm Drift. Suppose that the last request is line $\ell$ and that the algorithm is at point $a \in \ell$. Let the new request be $\ell'$ and, for any point $x$, let $x'$ be the orthogonal projection of $x$ onto $\ell'$. If $\ell'$ does not intersect $\ell$, move to $a'$. Otherwise, let $o$ be the intersection point of $\ell$ and $\ell'$. Let also , , and (see Figure 1). Move to the point such that , where .
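The exact drift coefficient is defined via quantities in Figure 1, which we do not reproduce here; below is only a sketch of the two geometric primitives the move relies on: the orthogonal projection onto the new line and the intersection point, with the parallel case (where the algorithm simply moves to the projection) signalled by `None`. The line representation (anchor point plus direction vector) and the function names are our own choices:

```python
import math

def project(point, anchor, direction):
    """Orthogonal projection of `point` onto the line anchor + t*direction."""
    ux, uy = direction
    n = math.hypot(ux, uy)
    ux, uy = ux / n, uy / n
    t = (point[0] - anchor[0]) * ux + (point[1] - anchor[1]) * uy
    return (anchor[0] + t * ux, anchor[1] + t * uy)

def intersect(a1, d1, a2, d2, eps=1e-12):
    """Intersection point of two lines given as (anchor, direction) pairs;
    returns None when the lines are parallel."""
    det = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(det) < eps * math.hypot(*d1) * math.hypot(*d2):
        return None
    t = ((a2[0] - a1[0]) * d2[1] - (a2[1] - a1[1]) * d2[0]) / det
    return (a1[0] + t * d1[0], a1[1] + t * d1[1])
```

When `intersect` returns `None`, Drift moves to `project(a, ...)` on the new line; otherwise the returned point plays the role of $o$ in the drift rule.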
Algorithm Drift is $3$-competitive for the line chasing problem in $\mathbb{R}^2$.
We establish an upper bound on the competitive ratio via an amortized analysis based on a potential function. The (always non-negative) value of this potential function, $\Phi$, depends on the locations of the algorithm's and the adversary's points on the current line. If $\ell'$ is the new request line, we want this function, evaluated at the new locations of the algorithm's and the adversary's points, to satisfy
Since initially the potential is $0$ and it is always non-negative, summing inequality (1) over all moves establishes $3$-competitiveness of Algorithm Drift.
The potential function we use in our proof is . Substituting this formula, inequality (1) reduces to
It thus remains to prove inequality (2). Let , , and .
We first discuss the trivial case of non-intersecting and . Keeping with the general notation, here we have and thus . Moreover, as well. For fixed , we have , i.e., the right hand side of (2) is fixed, whereas the left hand side is maximized if is on the other side of than . The left hand side is thus at most
where the first inequality follows from the power mean inequality (for powers and ), proving this easy case.
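For completeness, and assuming the elided powers are $1$ and $2$, the power mean inequality invoked here states that for non-negative reals $u, v$:

```latex
\frac{u+v}{2} \;\le\; \left(\frac{u^2+v^2}{2}\right)^{1/2},
\qquad\text{equivalently}\qquad
u+v \;\le\; \sqrt{2}\,\sqrt{u^2+v^2}.
```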
The situation when and do intersect is illustrated in Figure 2. (The figure shows only the case when is between and .) Orient from left to right (with being to the right of ), as shown in this figure. We want to express the distances in the above inequality in terms of , , , and (keeping in mind that and are functions of and ):
The values of and depend on several cases, which we consider below.
Case 1: is between and , as in Figure 2. Then . Our goal is first to find for which the bound in (2) is tightest. For a given , among the two locations of at distance from , the one on the left gives a larger value of the left-hand side of (2), while the right-hand side is the same for both. Thus we can assume that is to the left of , so . Then we can rewrite (2) as follows:
By elementary calculus, the right-hand side is minimized for , so we can assume that has this value. Then inequality (3) reduces to
After substituting and , inequality (4) reduces further to
The expression in the parenthesis on the right-hand side of (5) is non-positive by triangle inequality, so the right-hand side is minimized when is maximized, that is , and then it reduces to
Recall that . Since , we have
Case 2: is before . In this case we have . Just as in Case 1, we can assume that is to the left of , so that , and (2) reduces to
After substituting and , inequality (4) reduces further to
The expression in the parenthesis on the right-hand side of (8) is non-negative, so the right-hand side is minimized when (because in this case ), so (8) reduces to the same inequality (6) as in Case 1, completing the argument for Case 2.
After substituting and , inequality (9) reduces further to
The expression in the parenthesis on the right-hand side of (10) is non-negative, so the right-hand side is minimized when , and then it reduces to
To prove this, we proceed similarly as in Case 1:
Case 4: is between and . Then (as in Case 1). Similar to Case 3, we can assume that is to the right of , so that now , and that . Then, analogously to (4), we can rewrite (2) for this case as follows:
After substituting and , inequality (12) reduces further to
We now have two sub-cases. If the expression in the parenthesis on the right-hand side of (13) is non-negative then the right-hand side is minimized when , so inequality (13) reduces to inequality (11) from Case 3. If this expression is negative (that is when ), then it is sufficient to prove (13) with on the right-hand side replaced by (because ). This reduces it to . This last inequality follows from and . ∎
Tightness of the analysis. We now show that our analysis of Algorithm Drift is tight; that is, the algorithm is no better than $3$-competitive. To see this, note that (assuming that is very small compared to ) there are two types of moves that make inequality (2) tight:
One move is when , is to the right of with .
The second move is when and .
The adversary can use the first move to move away from our server, and from then on use moves of the second type until our server converges. This sequence can be repeated arbitrarily many times, proving that the competitive ratio of Algorithm Drift is not better than $3$.
3 An Algorithm for Arbitrary Dimension
In this section, we show how to extend Algorithm Drift to Euclidean space $\mathbb{R}^d$ of arbitrary dimension $d$. This extension, which we call ExtDrift, is quite simple and consists of projecting the whole space onto an appropriately chosen plane that contains the new request line. While such an approach was already suggested by Friedman and Linial [9], their choice of the plane may lose a constant factor in the competitive ratio. We choose the plane carefully, so that ExtDrift is also $3$-competitive.
Let $a$ be the current ExtDrift position and $\ell'$ the new request line. If $a \in \ell'$, ExtDrift makes no move. Otherwise, let $P$ be the uniquely determined plane that contains both $\ell'$ and $a$. ExtDrift makes the move prescribed by Drift in the plane $P$, for the point $a$, the new request $\ell'$, and the projection of the previous request line onto $P$.
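The linear algebra of this reduction can be sketched as follows (labels and function names are ours): build an orthonormal basis of the plane through the request line and the off-line current position, then project any point of $\mathbb{R}^d$ orthogonally onto that plane. Projection onto the plane never increases distances to points of the plane, which is the fact exploited below for the adversary's free move:

```python
import numpy as np

def plane_basis(anchor, direction, point):
    """Orthonormal basis (u, v) of the plane through the line
    anchor + t*direction and the off-line point `point`."""
    u = direction / np.linalg.norm(direction)
    w = (point - anchor) - np.dot(point - anchor, u) * u
    return u, w / np.linalg.norm(w)

def project_to_plane(x, anchor, u, v):
    """Orthogonal projection of x onto the plane anchor + span{u, v}."""
    r = x - anchor
    return anchor + np.dot(r, u) * u + np.dot(r, v) * v

# In R^3: the request line is the x-axis; the current position (0, 2, 0)
# lies off it, so the plane P is the xy-plane.
anchor = np.zeros(3)
u, v = plane_basis(anchor, np.array([1.0, 0.0, 0.0]), np.array([0.0, 2.0, 0.0]))
b = np.array([5.0, 3.0, 4.0])               # adversary's point, off the plane
b_proj = project_to_plane(b, anchor, u, v)  # its "free move" target in P
```

Here `b_proj` is `[5, 3, 0]`, and for any point $x \in P$ the Pythagorean theorem gives $d(b, x)^2 = d(b, b')^2 + d(b', x)^2 \ge d(b', x)^2$.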
Algorithm ExtDrift is $3$-competitive for the line chasing problem in $\mathbb{R}^d$, for arbitrary dimension $d$.
We prove that (1) holds in arbitrary dimension. If then and are co-planar, so the analysis from the previous section works directly.
So assume otherwise. We first allow the adversary to perform a free move from its current position $b$ to the point $b'$ defined as the orthogonal projection of $b$ onto $P$, and then we analyze the move within $P$ (that is, in a two-dimensional setting), as if the adversary started from point $b'$.
We note that $d(b', x) \le d(b, x)$ for any point $x \in P$, as $b'$ is the orthogonal projection of $b$ onto $P$. It follows that:
In the free adversary move from $b$ to $b'$ the potential function decreases (by the above inequality) and both costs are $0$. Further, in the move within $P$, with the adversary starting from $b'$, Algorithm ExtDrift makes the same move as Drift, which implies that (1) is satisfied. Thus the complete move (combining the free adversary move and the move inside $P$) satisfies inequality (1) as well.
The free move is only beneficial for the adversary: by the above inequality, moving from $b'$ to any point of $P$ is no more costly for the adversary than moving to it from $b$.
4 A Lower Bound of 1.5358
Finally, in this section, we show how to improve the existing lower bound of $\sqrt{2} \approx 1.412$ to $1.5358$. Our bound holds even in two dimensions.
The competitive ratio of any deterministic online algorithm for the line chasing problem is at least $1.5358$.
We describe our adversarial strategy below. On the constructed input, we will compare the cost of the online algorithm to the cost of an offline optimum Opt. We assume that both the algorithm and Opt start at the origin.
Our construction is parameterized by seven positive real numbers, fixed below.
We fix points , , and ; see Figure 3 for an illustration. For succinctness, we use the notation .
Initial part: Line . The first request line is the line , denoted . Without loss of generality, we can assume that the algorithm moves to point . This is because the adversary can either play the strategy described below or its mirror image (reflected across the line ), so any deviation from , either to the left or to the right, can only increase the algorithm's cost.
From now on, for any point we denote its projection onto line by .
Middle part: Line . Next, the adversary issues the request line , denoted . Let and be the points to the left of , such that and .
Let be the point on chosen by the algorithm. If it lies to the right of point , then the adversary forces the algorithm to move to (by issuing sufficiently many different lines that go through at different angles). Opt may then serve the whole sequence by going from to at cost
while the cost of the algorithm is then at least
Hence, the competitive ratio in this case is at least .
We call the half-line of to the right of point the forbidden region. From now on, we assume that the point chosen by the algorithm on does not lie in this region.
Final part: Line . Finally, the adversary issues the request line , denoted . Let be the intersection of line with line . Next, let and be the points on the line to the left of , such that and . Note that belongs to the interval .
Let be the point on chosen by . We consider two cases.
Case 1: lies at point or to its left. In this case, the adversary forces the algorithm to move to . Opt may serve the whole sequence by going from to , paying
We may now argue that the cost of the algorithm is minimized if is equal to : If is to the left of point , then the cost of the algorithm is at least . Both the second and the third summands decrease when we move towards . Hence, we may now assume that belongs to the interval . As the algorithm's path must avoid the forbidden region, its cost is at least . The sum of the last two summands decreases when we move towards . Therefore, we obtain that the cost of the algorithm is at least
Thus, in this case the competitive ratio is at least .
Case 2: lies to the right of point . In this case, the adversary forces the algorithm to move to . Opt may serve the whole sequence by going from to at cost
To go from to while avoiding the forbidden region, the algorithm has to pay at least . Therefore, its cost is at least
Thus, in this case the ratio is also at least . ∎
5 Final Comments
Establishing the optimal competitive ratio for line chasing remains an open problem. We believe that neither of our bounds is tight.
For instance, it should be possible to improve the upper bound using an algorithm with memory, for example one that stores the actual work function at each step. The intuition is that in the first move, if $\ell$ and $x_0$ are the initial line and position and $\ell'$ is the new request line, then the algorithm should move to the nearest point of $\ell'$. More generally, if the requests alternate between $\ell$ and $\ell'$ (and their angle is small), the algorithm should initially drift slowly and only gradually accelerate to a rate that is proportional to the distance to the other line.
It also appears that our lower-bound strategy can be improved by introducing additional steps, although this gives only very small improvements and leads to a very involved analysis. It is possible that an approach fundamentally different from ours may give a better bound with a simpler analysis.
-  Antoniadis, A., Barcelo, N., Nugent, M., Pruhs, K., Schewior, K., Scquizzato, M.: Chasing convex bodies and functions. In: Proc. Latin American Theoretical Informatics Symposium (LATIN’16). pp. 68–81 (2016)
-  Argue, C.J., Bubeck, S., Cohen, M.B., Gupta, A., Lee, Y.T.: A nearly-linear bound for chasing nested convex bodies. CoRR abs/1806.08865 (2018), http://arxiv.org/abs/1806.08865
-  Bansal, N., Böhm, M., Eliáš, M., Koumoutsos, G., Umboh, S.W.: Nested convex bodies are chaseable. In: Proceedings of the Twenty-Ninth Annual ACM-SIAM Symposium on Discrete Algorithms. pp. 1253–1260. SODA’18 (2018)
-  Bansal, N., Gupta, A., Krishnaswamy, R., Pruhs, K., Schewior, K., Stein, C.: A 2-competitive algorithm for online convex optimization with switching costs. In: Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques, APPROX/RANDOM 2015, August 24-26, 2015, Princeton, NJ, USA. pp. 96–109 (2015)
-  Borodin, A., Linial, N., Saks, M.E.: An optimal on-line algorithm for metrical task system. J. ACM 39(4), 745–763 (1992)
-  Bubeck, S., Lee, Y.T., Li, Y., Sellke, M.: Chasing nested convex bodies nearly optimally. CoRR abs/1811.00999 (2018), http://arxiv.org/abs/1811.00999
-  Bubeck, S., Lee, Y.T., Li, Y., Sellke, M.: Competitively chasing convex bodies. CoRR abs/1811.00887 (2018), http://arxiv.org/abs/1811.00887
-  Chrobak, M., Larmore, L.L.: Metrical task systems, the server problem and the work function algorithm. In: Online Algorithms, The State of the Art (Proc. Dagstuhl Seminar, June 1996). pp. 74–96 (1996)
-  Friedman, J., Linial, N.: On convex body chasing. Discrete & Computational Geometry 9(3), 293–321 (1993)
-  Koutsoupias, E., Papadimitriou, C.H.: On the k-server conjecture. J. ACM 42(5), 971–983 (1995)
-  Lin, M., Wierman, A., Andrew, L.L.H., Thereska, E.: Dynamic right-sizing for power-proportional data centers. In: INFOCOM 2011. 30th IEEE International Conference on Computer Communications, Joint Conference of the IEEE Computer and Communications Societies, 10-15 April 2011, Shanghai, China. pp. 1098–1106 (2011)
-  Manasse, M.S., McGeoch, L.A., Sleator, D.D.: Competitive algorithms for server problems. J. Algorithms 11(2), 208–230 (1990)
-  Sitters, R.: The generalized work function algorithm is competitive for the generalized 2-server problem. SIAM J. Comput. 43(1), 96–125 (2014)