Pricing for Online Resource Allocation: Beyond Subadditive Values

Shuchi Chawla, et al. · July 31, 2017

We consider the problem of truthful online resource allocation to maximize social welfare in a stochastic setting. Sequential posted pricing has emerged as a desirable mechanism for this problem that is at once simple, easy to implement in practice, and approximately optimal in several settings. In this mechanism, the seller uses his knowledge of the demand distribution to determine and announce prices for individual resources or bundles of resources. Buyers then arrive in sequence and can purchase their favorite bundles while supplies last. Previous work shows that sequential posted pricing achieves good approximations when buyers' values exhibit subadditivity. We consider settings where buyers desire bundles, that is, their values exhibit complementarities, and where the seller faces a cost function for supplying the resource. We present both upper and lower bounds on the approximation factors achieved by sequential posted pricing in these settings.


1 Introduction

Consider an online resource allocation setting in which a seller offers multiple items for sale and buyers with preferences over bundles of items arrive over time. We desire an incentive-compatible mechanism for allocating items to buyers that maximizes social welfare—the total value obtained by all the buyers from the allocation. In the offline setting, this is easily achieved by the VCG mechanism. In the online context, however, we would like the mechanism to allocate items and charge payments as buyers arrive, without waiting for future arrivals. What would such a mechanism look like, and can it obtain close to the optimal social welfare?

We consider two settings for online resource allocation motivated by applications in cloud economics. In the interval preferences setting, a cloud provider has multiple copies of a single resource available to allocate over time. Customers have jobs that require renting the resource for some amount of time. If we think of each time unit as a different item, customers desire intervals of items. Different intervals, corresponding to scheduling a job at different times or renting the resource for different lengths of time, may bring different values to the customer. This model closely follows the framework described in [4]. Our second setting, the path preferences setting, models bandwidth allocation over a communication network. Each customer is a source-sink pair in the network that wishes to communicate, and assigns values to paths in the network between the source and the sink. The items are the edges in the network. We focus on the special case where the network is a tree. (Footnote 1: Observe that the interval preferences setting is a special case of the path preferences setting. See [19] for further motivation behind this setting.)

It is easy to observe that because of the online nature of the problem no algorithm for online resource allocation, truthful or otherwise, can obtain optimal social welfare: the competitive ratio is at least $2$ even when there is only a single item for sale and the seller knows both the order of arrival of buyers as well as their value distributions. (Footnote 2: Suppose there is one item and two buyers. The buyer that arrives first has value $1$ for the item; the second buyer has value $1/\epsilon$ with probability $\epsilon$, and $0$ otherwise, for some small $\epsilon > 0$. The optimal allocation is to give the item to the first buyer if the second buyer has zero value for it, and otherwise give it to the second buyer. This achieves social welfare of $2 - \epsilon$ in expectation over the buyers' values. However, any online algorithm produces an allocation with expected welfare at most $1$.) In a remarkable result, Feldman et al. [15] showed that this gap of 2 is the worst possible over a large class of buyer preferences: a particularly simple and natural incentive-compatible mechanism, namely posted item pricing, achieves a $2$-approximation to the optimal social welfare in those settings.
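As a sanity check, the gap in this example is easy to verify numerically; the following small simulation (illustrative code, not part of the paper) estimates the two welfares:

```python
import random

def simulate(eps: float, trials: int = 200_000) -> None:
    """Monte-Carlo check of the single-item example from Footnote 2."""
    opt = alg = 0.0
    for _ in range(trials):
        v1 = 1.0                                          # first buyer's value
        v2 = 1.0 / eps if random.random() < eps else 0.0  # second buyer's value
        opt += max(v1, v2)  # the offline optimum sees both values
        alg += v1           # one optimal online policy: always sell to buyer 1
    print(f"eps={eps}: OPT ~ {opt / trials:.3f}, online ~ {alg / trials:.3f}")

simulate(0.01)  # prints OPT ~ 1.99, online ~ 1.00 -- the gap approaches 2
```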

Posted pricing is perhaps the most ubiquitous real-world mechanism for allocating goods to consumers. Supermarkets are a familiar example: the store determines prices for items, which may be sold individually or packaged into bundles. Customers arrive in arbitrary order and purchase the items they most desire at the advertised prices, unless they’re sold out. Many other domains have a similar sequential posted pricing format, from airfares to online retail to concert tickets. Feldman et al.’s results apply to settings where buyers’ values over bundles of items are fractionally subadditive, a.k.a. XOS. In these settings, the seller determines a price for each item based on his knowledge of the buyers’ value distributions. These prices are anonymous and static, meaning that the same prices are offered to each customer and remain unchanged until supply runs out.

However, the settings we consider exhibit complementarity in buyers’ values: buyers may require certain minimal bundles of items to satisfy their requirements; anything less brings them zero value. For these settings, Feldman et al. [15] show that anonymous item pricings cannot achieve a competitive ratio better than linear in the degree of complementarity (that is, the size of the minimal desired bundle of items). See Footnote 3 for an example. Is a better competitive ratio possible? Can it be achieved through a truthful mechanism as simple as static, anonymous item pricings?

We show that near-optimal competitive ratios can be achieved for the interval and path preferences settings via a static, anonymous bundle pricing mechanism.

Our mechanism is a posted bundle pricing: the seller partitions items into bundles, and prices each bundle based on his knowledge of the distribution of buyers’ preferences. Customers arrive in arbitrary order and purchase the bundles they most desire at the advertised prices, unless they’re sold out. As in Feldman et al.’s work, our bundle pricings are static and anonymous.

We now elaborate on our results and techniques.

1.1 Our results

Recall that in the interval preferences setting, items are arranged in a total order and buyers desire intervals of items. We assume that each buyer's value function is drawn independently from some arbitrary but known distribution over possible value vectors. The seller's computational problem is essentially a stochastic online interval packing. Let $L$ denote the length of the longest interval that may be desired by some buyer. Feldman et al. [15] show that no item pricing can achieve a competitive ratio of $o(L)$ in this setting. Im and Wang [18] previously showed that in fact no online algorithm can achieve a competitive ratio of $o\!\left(\frac{\log L}{\log\log L}\right)$. Our first main result matches this lower bound to within constant factors.

Theorem 1.1.

For the interval preferences setting, there exists a static, anonymous bundle pricing with competitive ratio $O\!\left(\frac{\log L}{\log\log L}\right)$ for social welfare.

The interval preferences setting was studied recently by Chawla et al. [11], who also designed a static item pricing mechanism for the problem. Chawla et al. showed that when the item supply is large, specifically, when the seller has polynomially many (in $L$ and $1/\epsilon$) copies of each item for some $\epsilon > 0$, static anonymous item pricings achieve a $(1+\epsilon)$-approximation to social welfare. Chawla et al.'s result suggests that there is a tradeoff between item supply and the performance of posted pricings in this setting. Our second main result maps out this tradeoff exactly (to within constant factors) for bundle pricing. We show that the approximation factor decreases inversely with item supply, and is a constant when supply is $\Omega(\log L)$. In other words, to achieve a constant approximation via bundle pricing, we require an exponentially smaller bound on the item supply relative to that required by Chawla et al. for a $(1+\epsilon)$-approximation. Furthermore, this tradeoff is tight to within constant factors.

Theorem 1.2.

For the interval preferences setting, if every item has at least $B$ copies available, then a static, anonymous bundle pricing achieves a competitive ratio of $O\!\left(\frac{\log L}{B \log\log L}\right)$ when $B = O(\log L)$, and $O(1)$ otherwise.

Theorem 1.3.

For the interval preferences setting, if every item has at least $B$ copies available, no online algorithm can obtain a competitive ratio of $o\!\left(\frac{\log L}{B \log\log L}\right)$ for social welfare.

We then turn to the path preferences setting, which appears to be considerably harder than the special case of interval preferences. In this setting buyers are single-minded in that they desire a particular path, although their value for this path is unknown to the seller. As before let $L$ denote the length of the longest bundle/path that any buyer desires. We first observe that no online algorithm can achieve a competitive ratio subpolynomial in $L$ for this setting (see Theorem 4.3). We therefore explore competitive ratios as functions of $\mathcal{V}$, the ratio of the maximum possible value to the minimum possible non-zero value. In terms of $\mathcal{V}$, the construction of Im and Wang [18] provides a lower bound of $\Omega\!\left(\frac{\log \mathcal{V}}{\log\log \mathcal{V}}\right)$ on the competitive ratio of any online algorithm. We nearly match this lower bound:

Theorem 1.4.

For the path preferences setting, there exists a static, anonymous bundle pricing with competitive ratio $O(\log \mathcal{V})$ for social welfare.

As for the interval preferences setting, we obtain a linear tradeoff between item supply and the performance of bundle pricing for the path preferences setting.

Theorem 1.5.

For the path preferences setting, if every edge has at least $B$ copies available, then a static, anonymous bundle pricing achieves a competitive ratio of $O\!\left(\frac{\log \mathcal{V}}{B}\right)$ for welfare.

Theorems 1.1 and 1.2 for the interval preferences setting are proved in Section 3. Section 4 presents the lower bounds—Theorems 1.3 and 4.3. Our main results for the path preferences setting, Theorem 1.4 and Theorem 1.5, are proved in Section 5. Some of our results extend with the same competitive ratios to settings with non-decreasing marginal production costs instead of a fixed supply; this setting is discussed in Section 6. Any proofs skipped in the main body of the paper can be found in Section 7.

All of our theorems are constructive but require an explicit description of the buyers’ value distributions. The pricings guaranteed above can be constructed in time polynomial in the sum over the buyers of the sizes of the supports of their value distributions.

1.2 Related work and our techniques

Competitive ratios for welfare in online settings are known as prophet inequalities following the work of Krengel and Sucheston [22] and Samuel-Cahn [25] on the special case of allocating a single item. Most arguments for prophet inequalities follow a standard approach, introduced by Kleinberg and Weinberg [21] and further developed by Feldman et al. [15], of setting “balanced” prices or thresholds for each buyer: informally, prices should be low enough that buyers can afford their optimal allocations, and at the same time high enough so that allocating items non-optimally recovers as revenue a good fraction of the optimal welfare that is “blocked” by that allocation. Given such balanced prices, we can then account for the social welfare of the online allocation in two components—the seller’s share of the welfare, namely his revenue, and the buyers’ share of the welfare, namely their utility. On the one hand, the prices of any items/bundles sold by the mechanism contribute to the seller’s revenue. On the other hand, any item/bundle that goes unsold in the online allocation contributes to the buyers’ utility: the buyer who receives it in the optimal allocation forgoes it in the online allocation for another one with even higher utility. The argument asserts that in either case good social welfare is achieved.
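To make this accounting concrete, consider the simplest instance of the argument, for a single item (a standard calculation in the spirit of [21, 15], included here for intuition). Post the static anonymous price $p = \frac{1}{2}\,\mathbb{E}[\max_b v_b]$, and let $q$ denote the probability that the item is sold, so the seller's share is at least $q\,p$. Whether the item is still available when a fixed buyer $b$ arrives depends only on the other buyers' values; hence, with probability at least $1-q$, buyer $b$ has the option of securing utility $(v_b - p)^+$, and therefore

$$\mathrm{Utility} \;\ge\; (1-q)\,\mathbb{E}\Big[\big(\max\nolimits_b v_b - p\big)^+\Big] \;\ge\; (1-q)\Big(\mathbb{E}\big[\max\nolimits_b v_b\big] - p\Big).$$

Adding the two shares,

$$\mathrm{Welfare} \;\ge\; q\,p + (1-q)\Big(\mathbb{E}\big[\max\nolimits_b v_b\big] - p\Big) \;=\; \frac{1}{2}\,\mathbb{E}\big[\max\nolimits_b v_b\big],$$

recovering the $2$-approximation mentioned above.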

When buyers have values with complements and an item pricing is used, this argument breaks down. In particular, it may be the case that a bundle allocated by the optimal solution goes unsold because only one of the items in the bundle is sold out. The loss of the buyer's utility in this case may not be adequately covered by the revenue generated by the sold subset of items. (Footnote 3: Consider, for concreteness, the following example due to Feldman et al. Suppose there are two buyers and $L$ items; the first has value $1$ for any non-empty allocation (i.e., is unit-demand with value $1$ for every item), and the second has value $L$ for the set of all $L$ items and value zero for every strict subinterval. Observe that for any item prices, either the first buyer will be willing to purchase some item, thereby blocking the second buyer from purchasing anything, or else the second buyer will be unwilling to purchase the full set of items. As the optimal welfare is $L$, no item prices lead to better than an $L$-approximation.) We therefore consider pricing bundles of items.

Recently Duetting et al. [14] developed a framework for obtaining balanced prices via a sort of extension theorem. They showed that if one can define prices achieving a somewhat stronger balance condition in the full-information setting, where the seller knows the buyers’ value functions exactly, then a good approximation through posted prices can be obtained in the Bayesian setting as well. A significant benefit of using this approach is that it suffices to focus on the full-information setting, and the designer need no longer worry about the value distribution.

In the full-information setting, even with arbitrary values over bundles of items, it is easy to construct a static, anonymous bundle pricing that achieves a $2$-approximation for welfare, as demonstrated, for example, by Cohen-Addad et al. [12]. Unfortunately, these prices do not satisfy the strong balance condition of Duetting et al. In fact we do not know of any distribution-independent way of defining bundle prices in the full information setting that satisfies the balance condition of Duetting et al. with approximation factors matching the ones we achieve. Furthermore, while a main goal of our work is to establish a tradeoff between supply and competitive ratio for welfare, the framework of Duetting et al. does not seem to lend itself to such a tradeoff.

The crux of our argument for both of our settings lies in constructing a distribution-dependent partition of items into bundles, and pricing (subsets of) these bundles. For the interval preferences setting we construct a partition for which there exists an allocation of bundles to buyers such that each buyer receives at most one bundle and a good fraction of the social welfare is achieved. Given such a bundling, we are essentially left with a "unit-demand" setting (Footnote 4: A buyer is said to have unit-demand preferences if he desires buying at most one item. In our context, buyers may desire buying multiple bundles, but the "optimum" we compare against allocates at most one to each buyer. This allows for the same charging arguments that work in the unit-demand case.) for which a prophet inequality can be constructed using the techniques described above. We call such an allocation a unit allocation. The main technical depth of this result lies in constructing such a bundling.

In fact, it is straightforward to construct a bundling that leads to an $O(\log L)$ competitive ratio (Footnote 5: A detailed discussion of this solution can be found in [18].): pick a random power of $2$ between $1$ and $L$; partition items into intervals of that length starting with a random offset; and construct an optimal allocation that allocates entire bundles in the constructed partition. However, our improved approximation requires much more care in partitioning items into bundles of many different sizes, and requires the partitioning to be done in a distribution-dependent manner.
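For concreteness, the straightforward bundling can be sketched as follows (illustrative code; the power-of-two scales and the uniformly random offset follow the description above):

```python
import random

def random_bundling(num_items: int, max_length: int):
    """Partition items 0..num_items-1 into intervals of a random
    power-of-two length with a random offset (the simple O(log L)
    bundling described above)."""
    scales = [2 ** s for s in range(max_length.bit_length())]
    length = random.choice(scales)       # random power of 2 in [1, L]
    offset = random.randrange(length)    # random shift of the partition
    bundles, start = [], -offset
    while start < num_items:
        end = min(start + length, num_items)
        bundles.append(range(max(start, 0), end))
        start = end
    return bundles
```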

As mentioned earlier, the path preferences setting generalizes interval preferences, but appears to be much harder. In particular, we don't know how to obtain a constant competitive ratio even when all desired paths are of equal length. We show, however, that if all buyers have equal values and all edges have equal capacity, it becomes possible to identify up to two most contentious edges on every path, and the problem behaves like one where every buyer desires only two items. For this special case, it becomes possible to construct a pricing using techniques from [15]. Unfortunately, this argument falls apart when different items have different multiplicity. In order to deal with multiplicities, we present a different kind of partitioning of items into bundles or layers and a constrained allocation that we call a layered allocation, such that each layer behaves like a unit-capacity setting. These ideas altogether lead to an $O(\log \mathcal{V})$ competitive ratio.

As mentioned previously, our approach lends itself to achieving tradeoffs between item supply and competitive ratio. On the one hand, when different items are available to different extents, we need to be careful in constructing a partition into bundles and in some places this complicates our arguments. On the other hand, large supply allows us some flexibility in partitioning the instance into multiple smaller instances, leading to improved competitive ratios. The key to enabling this partitioning is a composability property of our analysis: suppose we have multiple disjoint instances of items and buyers, for each of which in isolation our argument provides a good welfare guarantee; then running these instances together and allowing buyers to purchase bundles from any instance provides at least half the sum of the individual welfares. In the path preferences setting, obtaining this composability crucially requires buyers to be single-minded.

Discussion and open questions.

Two implications of our results seem worthy of further study. First, for the settings we consider as well as those studied previously, posted pricings perform nearly as well as arbitrary online algorithms, truthful or not. Can this be formalized into a meta-theorem that holds for broader contexts? Second, a modest increase in supply brings about significant improvements in competitive ratio for the settings we study. However, once the competitive ratio hits a certain constant factor, our techniques do not provide any further improvement. Can this tradeoff be extended all the way to a $(1+\epsilon)$ competitive ratio?

Another natural open problem is to extend our guarantees for the path preferences setting to general graphs. This appears challenging. Our arguments rely on a fractional relaxation of the optimal allocation. General graphs exhibit an integrality gap that is polynomial in the size of the graph even when all buyers are single-minded, have the same values (0 or 1 with some probability), and have identical path lengths; this integrality gap is driven by the combinatorial structure exhibited by paths in graphs. Indeed this setting appears to be as hard as the most general setting with no constraints on buyers’ values. It may nevertheless be possible to obtain a non-trivial competitive ratio relative to a different relaxation of the offline optimum.

Other related work.

Sequential pricing mechanisms (SPMs) were first studied for the problem of revenue maximization, where computing the optimal mechanism is a computationally hard problem and no simple characterizations of optimal mechanisms are known. A series of works (e.g., [8, 5, 9, 10, 3, 24, 7, 6]) showed that in settings where buyers have subadditive values, SPMs achieve constant-factor approximations to revenue. In most interesting settings, good approximations to revenue necessarily require non-anonymous pricings. As a result, techniques in this literature are quite different from those for welfare. [16] gives (non-truthful) online algorithms which obtain constant-factor approximations in our settings when supply is unit-capacity, buyers are single-minded, and all values are identical. There is also a long line of work on welfare-maximizing mechanisms with buyers arriving online in the worst case setting where the seller has no prior information about buyers’ values [23, 17, 13, 1, 2, 11]. The worst case setting generally exhibits very different techniques and results relative to the Bayesian setting we study.

2 Model and definitions

We consider a setting with $n$ buyers and a set of items $\mathcal{I}$. We index buyers by $b$ and items by $i$ (or $e$ for edges). Buyer $b$'s valuation function is denoted $v_b : 2^{\mathcal{I}} \to \mathbb{R}_{\ge 0}$, with $v_b(\emptyset) = 0$. Our setting is Bayesian: $v_b$ is drawn from a distribution $\mathcal{D}_b$ that is independent of other buyers and known to the seller. We emphasize that values may be correlated across bundles, but not across buyers. Let $c_i$ denote the number of copies available, a.k.a. supply or capacity, for item $i$. Let $B = \min_i c_i$. The unit-capacity setting is a special case where $c_i = 1$ for all $i$. An allocation is an assignment of bundles of items to buyers such that no item $i$ is allocated more than $c_i$ times. Our goal is to maximize the buyers' total welfare—that is, the sum over buyers of the value each buyer derives from his allocated bundle.

Jobs.

We now describe assumptions and notational shortcuts for buyers' value functions that hold without loss of generality and simplify exposition. First, we assume that values are monotone: for all $b$ and $v_b$, and all bundles $S \subseteq T$, $v_b(S) \le v_b(T)$. Second, we assume that each buyer's value distribution is a discrete distribution (Footnote 6: For constructive versions of our results, we require the supports of the distributions to be explicitly given; however, our arguments about the existence of a good pricing work also for continuous distributions.) with finite support. In other words, each buyer can only have finitely many possible valuation functions: with probability $p_b^k$, buyer $b$'s values are given by the known function $v_b^k$.

Third, we think of a buyer with value function $v$ as being a collection of "jobs" represented by tuples $(S, v(S))$, one for each bundle of items desired by the buyer. Informally, each job is a potential (minimal) allocation to a buyer with a given value. (Footnote 7: In particular we remove "duplicates", or bundles $S$ such that $v(S) = v(S')$ for some $S' \subsetneq S$.) Let $J(v)$ denote the set of all such jobs for a buyer with value function $v$; let $J$ denote the union of these sets over all possible buyers and value functions. When a buyer with value $v$ arrives, we interpret this event as the simultaneous arrival of all of the jobs in $J(v)$. If the buyer is allocated the bundle $S$, we say that job $(S, v(S))$ is allocated. In what follows, it will be convenient to consider jobs as fundamental entities. We therefore identify a job by the single index $j$. Then $S_j$ is the corresponding bundle of items, $v_j$ is $v(S_j)$, and $p_j$ is the probability with which the corresponding buyer instantiates this value function.
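In code, jobs admit a very simple representation (an illustrative data model, not from the paper):

```python
from dataclasses import dataclass
from typing import FrozenSet

@dataclass(frozen=True)
class Job:
    bundle: FrozenSet[int]  # S_j: the minimal bundle of items the job desires
    value: float            # v_j: the value for receiving the whole bundle

# A buyer with a given valuation is the set of its jobs; e.g. a buyer who
# wants items {1,2,3} for value 5, or just item {2} for value 2:
buyer_jobs = {Job(frozenset({1, 2, 3}), 5.0), Job(frozenset({2}), 2.0)}
```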

Interval and path preferences.

In the interval preferences setting, the set of items is totally ordered, and buyers assign values to intervals of items. In particular, $v(S) = \max_I v(I)$ for any bundle $S$ of items, where $I$ ranges over all contiguous intervals contained in $S$. Accordingly, jobs as defined above also correspond to intervals. In the path preferences setting, $\mathcal{I}$ corresponds to the set of edges in a given tree. Each buyer $b$ has a fixed and known path, denoted $P_b$, and a scalar (random) value $v_b$, with $v_b(S) = v_b$ for $S \supseteq P_b$ and $v_b(S) = 0$ otherwise. Accordingly, buyers are single-minded and each instantiated value function of a buyer is associated with a single job.

Bundle pricings.

The mechanisms we study are static and anonymous bundle pricings. Let $\pi$ denote such a pricing function. For the interval preferences setting, we partition the multiset of items into disjoint intervals and price each interval in the partition. For the path preferences setting, we partition items into disjoint "layers" and construct a different pricing function for each layer, which assigns a price to every path contained in that layer. (Footnote 8: The pricing is essentially a layer-specific item pricing, with bundle totals subject to a layer-specific reserve price.) Observe that different copies of an item end up in different bundles/layers, and may therefore be priced differently. Buyers arrive in adversarial order. When buyer $b$ arrives, he selects a subset of remaining unsold bundles to maximize his value for the items contained in the subset minus the total payment as specified by the pricing $\pi$.
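The mechanism itself is a single pass over buyers; the following sketch (illustrative code, with buyers represented as value functions over item sets) makes the "purchase your favorite subset of unsold bundles" rule explicit, using brute-force search over subsets for clarity:

```python
from itertools import combinations

def run_posted_pricing(bundles, prices, buyers):
    """Sequential posted bundle pricing: bundles[k] is a set of items,
    prices[k] is its static anonymous price, and each buyer is a function
    mapping a set of items to that buyer's value for the set."""
    unsold = set(range(len(bundles)))
    welfare = 0.0
    for value_fn in buyers:  # buyers arrive in adversarial order
        best_combo, best_util = (), 0.0
        for r in range(1, len(unsold) + 1):  # brute-force search for clarity
            for combo in combinations(sorted(unsold), r):
                items = set().union(*(bundles[k] for k in combo))
                util = value_fn(items) - sum(prices[k] for k in combo)
                if util > best_util:
                    best_combo, best_util = combo, util
        if best_combo:
            unsold.difference_update(best_combo)  # sold while supplies last
            welfare += value_fn(set().union(*(bundles[k] for k in best_combo)))
    return welfare
```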

Let Opt denote the hindsight/offline optimal expected social welfare, and $\textsc{SW}(\pi)$ denote the expected social welfare obtained by the static, anonymous bundle pricing $\pi$.

A fractional relaxation.

A fractional allocation $x$ assigns an $x_j$-fraction of each item in $S_j$ to job $j$. (Footnote 9: Note that fractional allocations are used only in determining prices and in bounding the expected optimal (integer) solution, not in the mechanism itself.) A fractional allocation is feasible if it satisfies the supply and demand constraints of the integral problem: no item $i$ may be allocated more than $c_i$ times and no buyer obtains additional value for more than one bundle. Let $\mathcal{P}$ denote the polytope defined by constraints (1)–(3) below.

$$\sum_{j : i \in S_j} x_j \le c_i \quad \forall i \in \mathcal{I} \qquad \text{(supply)} \quad (1)$$
$$\sum_{j \in J(v_b^k)} x_j \le p_b^k \quad \forall b, k \qquad \text{(demand)} \quad (2)$$
$$x_j \ge 0 \quad \forall j \in J \qquad (3)$$

For any $x$, $v(x) = \sum_j v_j x_j$ is the total fractional value in $x$ (without regard for feasibility), and $w_i(x) = \sum_{j : i \in S_j} x_j$ is the total fractional weight of $x$ at item $i$.

For a subset $T$ of jobs, we use $x_T$ (and sometimes $x|_T$) to denote the fractional allocation confined to set $T$ and zeroed out everywhere else. That is, $(x_T)_j = x_j$ for $j \in T$ and $(x_T)_j = 0$ for $j \notin T$.

Fractional allocations provide an upper bound on the optimal welfare; see Section 7 for the proof of a more general statement (Lemma 6.1). Here $\mathcal{P}$ is the polytope defined by constraints (1)–(3).

Lemma 2.1.

$\textsc{Opt} \le \max_{x \in \mathcal{P}} v(x)$.
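Since the relaxation is a linear program, it can be solved directly from an explicit description of the value distributions; a sketch using scipy (with a hypothetical instance encoding) follows:

```python
import numpy as np
from scipy.optimize import linprog

def solve_fractional_relaxation(jobs, capacities, demand_groups):
    """jobs: list of (bundle, value) pairs; capacities: dict item -> c_i;
    demand_groups: list of (job_index_set, probability) pairs, one per
    (buyer, valuation) pair, encoding constraint (2)."""
    n = len(jobs)
    c = [-v for (_, v) in jobs]                      # maximize => negate
    A, b = [], []
    for i, cap in capacities.items():                # (1) supply
        A.append([1.0 if i in S else 0.0 for (S, _) in jobs])
        b.append(cap)
    for idx, p in demand_groups:                     # (2) demand
        A.append([1.0 if j in idx else 0.0 for j in range(n)])
        b.append(p)
    res = linprog(c, A_ub=np.array(A), b_ub=np.array(b),
                  bounds=[(0, None)] * n)            # (3) nonnegativity
    return res.x, -res.fun                           # allocation, value v(x)
```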

3 Bundle Pricing for Interval Preferences

In this section we present our main results, Theorems 1.1 and 1.2, for the interval preferences setting. We begin by defining a special kind of fractional allocation that we call a fractional unit allocation. Such allocations have properties that guarantee the existence of a bundle-pricing mechanism which obtains a good fraction of the welfare of the fractional allocation. The intent is to decompose the fractional allocation across disjoint bundles such that the fractional value assigned to each bundle can be recovered by pricing that bundle individually. Section 3.1 presents a definition of unit allocations and their connection to pricing.

The remaining technical content of this section then focuses on designing fractional unit allocations. We begin with the special case of unit capacities in Section 3.2, where we show the existence of an $O\!\left(\frac{\log L}{\log\log L}\right)$-approximate fractional unit allocation. In Section 3.3 we extend our analysis to the general case of arbitrary multiplicities, proving Theorem 1.1. Finally, in Section 3.4 we show that if the capacity for every item is large enough, specifically at least $B$, then the approximation ratio decreases by a factor of $\Theta(B)$ (Theorem 1.2).

3.1 Fractional unit allocations

Definition 3.1.

A fractional allocation $\hat{x}$ is a fractional unit allocation if there exists a partition of the multiset of items (where item $i$ has multiplicity $c_i$) into bundles $B_1, \ldots, B_m$, and a corresponding partition of the jobs $j$ with $\hat{x}_j > 0$ into sets $T_1, \ldots, T_m$, such that:

  • For all $j$ with $\hat{x}_j > 0$, there is exactly one index $k$ with $j \in T_k$.

  • For all $k$ and $j \in T_k$, $S_j \subseteq B_k$.

  • For all $k$, we have $\sum_{j \in T_k} \hat{x}_j \le 1$.

A note on the terminology: we call fractional allocations satisfying the above definition unit allocations because once the partition of items is specified, each job can be assigned at most one bundle in the partition and each bundle can be fractionally assigned to at most one job.

We note that given any fractional unit allocation $\hat{x}$, for any instantiation of values, it is possible to define a pricing function over the bundles that is balanced with respect to $\hat{x}$ within the framework of [14]. This is because fractional unit allocations behave essentially like feasible allocations for unit-demand buyers. For completeness we present a simpler first-principles argument based on the techniques of Feldman et al. [15] showing that such a pricing obtains at least half the value of the fractional unit allocation. The proof is deferred to Section 7.

Lemma 3.1.

For any feasible fractional unit allocation $\hat{x}$, there exists a static, anonymous bundle pricing $\pi$ such that $\textsc{SW}(\pi) \ge \frac{1}{2}\, v(\hat{x})$.

3.2 The unit capacity setting

The main claim of this section is that for any feasible fractional allocation $x$ there exists a fractional unit allocation $\hat{x}$ such that $v(\hat{x}) = \Omega(1/K) \cdot v(x)$, where $K$ is defined such that $K^K = L$. Then, if $x^*$ is the optimal fractional allocation for the given instance, we can apply Lemma 3.1 to the corresponding unit allocation to obtain a pricing that gives an $O(K)$ approximation. Observe that $K = O\!\left(\frac{\log L}{\log\log L}\right)$, so we get the desired approximation factor. A formal statement of this claim is given at the end of this subsection.

Before we prove the claim, let us discuss the intuition behind our analysis. One approach for producing a fractional unit allocation is to find an appropriate partition of items into disjoint intervals $B_1, \ldots, B_m$, and remove from $x$ all jobs that do not fit neatly into one of the intervals. We can then rescale the fractional allocations of jobs within each interval $B_k$, so that their total fractional weight is at most $1$. The challenge in carrying out this approach is that if the intervals are too short, then they leave out too much value in the form of long jobs. On the other hand, if they are too long, then we may require a large renormalizing factor for the weight, again reducing the value significantly.

To account for these losses in a principled manner we consider a suite of nested partitions, one at each length scale, of which there are $O(\log L)$ in number. (Footnote 10: All of our arguments extend trivially to settings where $L$ denotes the ratio of lengths of the largest to the smallest interval of interest, because the number of relevant length scales is logarithmic in this ratio.) We then place a job of length in $(2^{\ell-1}, 2^\ell]$ in the interval that contains it (if one exists) at length scale $\ell$. The intervals over all of the length scales together capture much of the fractional value of the allocation $x$. Furthermore, the fractional weight within each interval can be bounded by a constant. At this point, any single partition gives us a fractional unit allocation. However, since the number of length scales is $O(\log L)$, picking a single one of these unit allocations only gives an $O(\log L)$ approximation in the worst case. In order to do better, we argue that there are two possibilities: either (1) it is the case that many intervals have a much larger than average contribution to total weight, in which case grouping these together provides a good unit allocation; or, (2) it is the case that most intervals have low total weight, so that we can put together multiple length scales to obtain a unit allocation without incurring a large renormalization factor for weight.

We now proceed to prove the claim. The proof consists of modifying $x$ to obtain $\hat{x}$ through a series of refining steps. Steps 1 and 2 define the suite of partitions and placement of jobs into intervals as discussed above. Step 3 provides a criterion for distinguishing between the cases (1) and (2) above. Step 4 then provides an analysis of case (1), and Step 5 an analysis of case (2).

Step 1: Filtering low value jobs

We first filter out jobs that do not contribute enough to the solution depending on when they are scheduled. Accordingly, we define:

  • For each job $j$, the value density of $j$ is $\rho_j = v_j / |S_j|$.

  • For each item $i$, the fractional value at $i$ is $F(i) = \sum_{j : i \in S_j} \rho_j x_j$. Note that $\sum_i F(i) = v(x)$. We drop the argument $x$ when it is clear from the context.

  • For any set of items $I$ (that may or may not be an interval), define $F(I) = \sum_{i \in I} F(i)$.

In this step, we remove from consideration jobs with value less than half the total fractional value of their interval. In particular, let $J' = \{ j : v_j \ge \frac{1}{2} F(S_j) \}$. The following lemma shows that we do not lose too much value in doing so.

Lemma 3.2.

For any fractional allocation $x$ and the set of jobs $J'$ as defined above, we have $v(x_{J'}) \ge \frac{1}{2}\, v(x)$.

The proof of Lemma 3.2 is a simple counting argument and we defer it to Section 7.

Step 2: Bucketing

In the rest of the argument, we will group jobs according to both their value and the length of their interval. Let $\ell$ denote a length scale, and let $u$ denote a value scale. We will further partition jobs of similar length across non-overlapping intervals. Accordingly, let $\eta$ be an offset picked u.a.r. from $\{0, 1, \ldots, \Lambda - 1\}$ where $\Lambda = 2^{\lceil \log L \rceil + 1}$. The partition corresponding to length scale $\ell$ then consists of length-$2^{\ell+1}$ intervals $I_{\ell,t}$ where $I_{\ell,t} = \{\eta + t \cdot 2^{\ell+1} + 1, \ldots, \eta + (t+1) \cdot 2^{\ell+1}\}$.

See Figure 1.

We are now ready to define job groups formally. Group $G_{\ell,u,t}$ consists of every job of length between $2^{\ell-1}$ (exclusive) and $2^\ell$ (inclusive) and value between $2^{u-1}$ (exclusive) and $2^u$ (inclusive) whose interval lies within the $t$th interval at length scale $\ell$: $G_{\ell,u,t} = \{ j \in J' : 2^{\ell-1} < |S_j| \le 2^\ell,\ 2^{u-1} < v_j \le 2^u,\ S_j \subseteq I_{\ell,t} \}$. We say that the jobs in $G_{\ell,u,t}$ are assigned to interval $I_{\ell,t}$.

$A_{\ell,t}$ denotes all the jobs "assigned" to $I_{\ell,t}$: $A_{\ell,t} = \bigcup_u G_{\ell,u,t}$.

Observe that our choice of the offset may cause us to drop some jobs, in particular those that do not fit neatly into one of the intervals in the partition corresponding to the relevant length scale. Let $J'' = \bigcup_{\ell,t} A_{\ell,t}$. The next lemma bounds the loss in value at this step.

Lemma 3.3.

For any fractional allocation $x$ and with $J'$ and $J''$ defined as above, $\mathbb{E}_\eta\!\left[ v(x_{J''}) \right] \ge \frac{1}{2}\, v(x_{J'})$.

Proof.

Recall that $\eta$ is chosen u.a.r. from $\{0, \ldots, \Lambda - 1\}$. Moreover, the length of any job $j \in J'$ is at most half the length of the intervals at the scale at which $j$ is considered for bucketing. Therefore, $j$ survives with probability at least $1/2$. ∎

Figure 1: Bucketing jobs by length as in Step 2. Jobs $j_1$ and $j_2$ have length scale $\ell$ (i.e., lengths between $2^{\ell-1}$, exclusive, and $2^\ell$, inclusive) and will therefore be assigned to a bucket $I_{\ell,t}$. Job $j_3$, however, will be dropped; it has length scale $\ell$, but does not fit entirely within any bucket at that scale.
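A sketch of the bucketing rule of Step 2 (illustrative code; buckets have width twice the maximum job length at each scale, with a single offset shared across scales, matching the description above):

```python
import math

def bucket(job_start: int, job_len: int, job_value: float, offset: int):
    """Return the (length scale, value scale, bucket index) of a job, or
    None if the job straddles a bucket boundary at its scale (dropped)."""
    scale = math.ceil(math.log2(job_len)) if job_len > 1 else 0
    width = 2 ** (scale + 1)          # buckets are twice the max job length
    t = (job_start + offset) // width
    if (job_start + job_len - 1 + offset) // width != t:
        return None                   # does not fit within a single bucket
    u = math.ceil(math.log2(job_value)) if job_value > 1 else 0
    return scale, u, t
```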

Step 3: Classification into heavy-weight and light-weight sets of jobs

We first discuss the intuition behind Steps 3 and 4. Consider a single group $G_{\ell,u,t}$. Let us assume briefly, for simplicity, that all jobs in this group have the same value $v$. Recall that all of the jobs, by virtue of being in $J'$, satisfy the property that $v_j \ge \frac{1}{2} F(S_j)$. Furthermore, since every job is of size at least $2^{\ell-1}$ whereas the interval covered by $G_{\ell,u,t}$ lies within $I_{\ell,t}$, which is of size $2^{\ell+1}$, the total fractional value of the covered interval is (Footnote 11: To be precise, the total value of the covered interval in this case is at most $8v$.) comparable to $v$. Therefore, as long as the jobs in the group have sufficient total fractional weight, this group of jobs alone would recover (a constant fraction of) the fractional value of the interval $I_{\ell,t}$. In other words, we could then immediately throw away jobs in other groups (corresponding to other length or value scales) that overlap with this interval. Step 3 filters out such "heavy-weight" groups of jobs and in Step 4 we argue that these groups form a good unit allocation. Intervals that do not have any length scales with heavy-weight groups of jobs are relegated to Step 5.

We now present the details of this argument. First, for every group $G_{\ell,u,t}$, we break the group up into contiguous components. Observe, in particular, that the set of items covered by jobs in $G_{\ell,u,t}$, namely $\bigcup_{j \in G_{\ell,u,t}} S_j$, may not be an interval itself but is composed of at most three disjoint intervals, because each contiguous component has length at least a quarter of $|I_{\ell,t}|$. Consider each of the at most three corresponding sets of overlapping jobs. We will use $G^r$ for $r \in \{1, 2, 3\}$ to denote these sets. For any such set of overlapping jobs, let $I(G^r)$ denote the interval of items it covers.

Next, we classify these sets of jobs into heavy-weight or light-weight, according to whether their total fractional weight exceeds a fixed threshold. In the definition we hide the arguments $(\ell, u, t)$ to simplify notation.

The following fact is immediate:

Fact 3.4.

$v(x_{J''}) = v(x_{\mathsf{H}}) + v(x_{\mathsf{L}})$, where $\mathsf{H}$ and $\mathsf{L}$ denote the sets of jobs belonging to heavy-weight and light-weight sets, respectively.

We will now proceed to construct two fractional unit allocations, one of which extracts the value of the heavy-weight sets (see Step 4), and the other extracts the value of the light-weight sets (see Step 5).

Step 4: Extracting the value of heavy-weight intervals

Consider any heavy-weight set $G^r$ (with the arguments $\ell$, $u$, $t$ being implicit). The following lemma is implicit in the discussion above and follows from the observations that (1) all jobs in $G^r$ are high-value jobs (that is, they belong to $J'$); (2) they have roughly the same value (within a factor of $2$); (3) the interval $I(G^r)$ can be covered by at most four such jobs; and (4) the total weight of $G^r$, $w(G^r)$, is sufficiently large by the definition of heavy-weight sets.

Lemma 3.5.

For all $j \in G^r$, $v_j \ge \frac{1}{16}\, F(I(G^r))$, and therefore $v(x_{G^r}) \ge \frac{1}{16}\, w(G^r) \cdot F(I(G^r))$.

There are two remaining issues in going from the allocation $x_{\mathsf{H}}$ to a unit allocation. First, the intervals $I(G^r)$ overlap, and second, the fractional weight of a set $G^r$ can be larger than $1$. The second issue can be dealt with by rescaling the allocation by an appropriate factor. To deal with the first, we state the following simple lemma without proof; a constructive sketch appears after the statement.

Lemma 3.6.

For any given collection $\mathcal{C}$ of intervals, one can efficiently construct two sets $\mathcal{C}_1, \mathcal{C}_2 \subseteq \mathcal{C}$ such that

  • $\mathcal{C}_1$ (and likewise $\mathcal{C}_2$) is composed of disjoint intervals; that is, $I \cap I' = \emptyset$ for all distinct $I, I' \in \mathcal{C}_1$.

  • Together they cover the entire collection $\mathcal{C}$; that is, $\bigcup_{I \in \mathcal{C}_1 \cup \mathcal{C}_2} I = \bigcup_{I \in \mathcal{C}} I$.
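Lemma 3.6 follows from a standard greedy argument: greedily cover the union of the intervals, observe that the greedy cover has ply at most two, and split the picks alternately into two disjoint sets. A sketch (assuming integer endpoints, as items here are discrete):

```python
def greedy_cover(intervals):
    """Greedily cover the union of closed integer intervals (start, end):
    every point of the union is covered, and each point lies in at most
    two of the selected intervals."""
    ivs = sorted(intervals)
    picks, i, n = [], 0, len(ivs)
    while i < n:
        reach = ivs[i][0] - 1                # start a new contiguous component
        while i < n and ivs[i][0] <= reach + 1:
            best = None
            while i < n and ivs[i][0] <= reach + 1:
                if best is None or ivs[i][1] > best[1]:
                    best = ivs[i]            # furthest-reaching candidate
                i += 1
            if best[1] > reach:
                picks.append(best)
                reach = best[1]
    return picks

def two_disjoint_covers(intervals):
    # Consecutive picks may overlap, but picks two apart never do, so
    # alternating picks yields two collections of pairwise-disjoint intervals.
    picks = greedy_cover(intervals)
    return picks[0::2], picks[1::2]
```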

We can now put these lemmas together to construct a unit allocation that covers the fractional value of heavy-weight intervals.

Lemma 3.7.

There exists a fractional unit allocation $\hat{x}^{\mathsf{H}}$ such that $v(\hat{x}^{\mathsf{H}}) = \Omega(1) \cdot v(x_{\mathsf{H}})$.

Proof.

Apply Lemma 3.6 to the collection of intervals $\{I(G^r)\}$ of heavy-weight sets to obtain sets $\mathcal{C}_1$ and $\mathcal{C}_2$. We think of $\mathcal{C}_1$ and $\mathcal{C}_2$ as sets of groups in $\mathsf{H}$. Assume without loss of generality that $\mathcal{C}_1$ has larger fractional value than $\mathcal{C}_2$. Let $\hat{x}^{\mathsf{H}}$ be the fractional allocation $x_{\mathcal{C}_1}$ scaled down by a factor of $4$. Then we have:
$$v(\hat{x}^{\mathsf{H}}) \;=\; \frac{1}{4}\, v(x_{\mathcal{C}_1}) \;=\; \Omega\Big( F\Big( \bigcup\nolimits_{G^r \in \mathcal{C}_1 \cup \mathcal{C}_2} I(G^r) \Big) \Big) \;=\; \Omega\big( v(x_{\mathsf{H}}) \big).$$

Here the first inequality follows from applying Lemma 3.5 to every $G^r \in \mathcal{C}_1$, and the fact that intervals in $\mathcal{C}_1$ are disjoint. The second inequality follows by recalling that the intervals in $\mathcal{C}_1$ and $\mathcal{C}_2$ together cover the collection, and so the fractional value of all heavy-weight jobs is accounted for by $F$ over the covered items.

It remains to show that $\hat{x}^{\mathsf{H}}$ is a unit allocation. To see this, consider the partition of jobs into the groups in $\mathcal{C}_1$. The corresponding collection of intervals forms a partition of the items by virtue of the fact that the intervals corresponding to groups in $\mathcal{C}_1$ are disjoint. It remains to argue that the weight within each bundle is appropriately bounded, that is, $\sum_{j \in T_k} \hat{x}_j \le 1$ for all $k$.

To prove this claim, observe that since each job in $G^r$ has length at least a quarter of the length of $I(G^r)$, we can find up to four items in $I(G^r)$ such that each job in $G^r$ contains at least one of the four items in its interval. Since $x$ is a feasible fractional allocation, the total weight of all jobs containing any one of those items is at most $1$, and therefore the total weight of jobs in $G^r$ altogether is at most $4$. ∎

Step 5: Extracting the value of light-weight intervals

We now consider the light-weight groups. As discussed at the beginning of this section, in order to obtain a good approximation from these sets, we must construct a unit allocation out of jobs at multiple length scales.

Define $\mathsf{L}_u$ to be the set of all jobs in $J''$ with value scale $u$ that belong to light-weight groups. Let $\mathsf{L}(I_{\ell,t})$ denote all light-weight jobs assigned to $I_{\ell,t}$. Since each individual group in $\mathsf{L}(I_{\ell,t})$ has total weight at most the light-weight threshold, the weight of $\mathsf{L}(I_{\ell,t})$ is at most that threshold times the number of value scales. In order to obtain a partition of jobs and items, we would now like to associate with each interval in the partition a single set of jobs that is simultaneously high value and low weight. Unfortunately, the set $\mathsf{L}(I_{\ell,t})$ may have very large total weight since it combines together many low weight sets. We use the fact that the values of jobs in these low weight sets increase geometrically to argue that it is possible to extract a subset of jobs from $\mathsf{L}(I_{\ell,t})$ that is both light-weight (i.e. has total weight $O(1)$) and captures a large fraction of the total value in $\mathsf{L}(I_{\ell,t})$.

Lemma 3.8.

For each interval $I_{\ell,t}$, there exists a set of jobs $D(I_{\ell,t}) \subseteq \mathsf{L}(I_{\ell,t})$ such that

  1. $w(D(I_{\ell,t})) = O(1)$, and

  2. $v(x_{D(I_{\ell,t})}) = \Omega\big( v(x_{\mathsf{L}(I_{\ell,t})}) \big)$.

We defer the proof of Lemma 3.8 to Section 7. The remainder of our analysis then hinges on the fact that if we consider the partition into intervals at some length scale $\ell$, namely $\{I_{\ell,t}\}_t$, and consider for every interval in this partition the sets of jobs extracted via Lemma 3.8 at the length scales below $\ell$, the total fractional weight of these jobs remains bounded. We therefore obtain a unit allocation while capturing the fractional value in many consecutive scales.

Lemma 3.9.

There exists a fractional unit allocation $\hat{x}^{\mathsf{L}}$ such that $v(\hat{x}^{\mathsf{L}}) = \Omega(1/K) \cdot v(x_{\mathsf{L}})$.

We defer the proof of Lemma 3.9 to Section 7.

Putting everything together

Combining Lemmas 3.2 and 3.3, Fact 3.4, and Lemmas 3.7 and 3.9, we get that the better of the two unit allocations $\hat{x}^{\mathsf{H}}$ and $\hat{x}^{\mathsf{L}}$ provides an $O(K)$ approximation to the total fractional value of $x$, where we used the fact that $v(x_{J''}) = v(x_{\mathsf{H}}) + v(x_{\mathsf{L}})$.

Theorem 3.10.

For every fractional allocation $x$ for the unit-capacity setting, there exists a fractional unit allocation $\hat{x}$ such that $v(\hat{x}) = \Omega(1/K) \cdot v(x)$, where $K$ is such that $K^K = L$.

3.3 Extension to arbitrary capacities

We will now give a reduction from the setting with arbitrary item multiplicities to the unit-capacity case discussed in the previous section. Once again we start with an arbitrary fractional allocation $x$ and construct a unit allocation with large fractional value. The main idea behind the reduction is to first partition the fractional allocation into "layers". Layer $k$ corresponds to the $k$th copy of each item (if it exists). Each job is assigned to one layer, with its fractional allocation appropriately scaled down, in such a manner that the layers together capture much of the fractional value of $x$. We then apply Theorem 3.10 to each layer separately, obtaining unit allocations for each layer. The union of these unit allocations immediately gives us a unit allocation overall.

Theorem 3.11.

For every feasible fractional allocation $x$ in the interval preferences setting with arbitrary capacities, there exists a fractional unit allocation $\hat{x}$, such that $v(\hat{x}) = \Omega(1/K) \cdot v(x)$.

Proof.

We begin by defining layers. Recall that $c_i$ denotes the number of copies of item $i$ that are available. We assume without loss of generality that $c_i \le n$ for all $i$ (and otherwise redefine $c_i$ to be the latter quantity). Let $c_{\max} = \max_i c_i$. Then we define the $k$th layer for $k \in \{1, \ldots, c_{\max}\}$ as $\mathcal{I}_k = \{ i : c_i \ge k \}$.

We will now construct a fractional allocation $x^{(k)}$ for each layer with the properties that (i) for each $k$, $x^{(k)}$ is feasible with respect to the set of items in $\mathcal{I}_k$ with unit capacities, and (ii) $\sum_k v(x^{(k)}) \ge \frac{1}{2}\, v(x)$.

We proceed via induction over $c_{\max}$. For the base case, suppose that $c_{\max} = 1$. Then we define $x^{(1)}_j = x_j$ for all $j$. Observe that both the properties (i) and (ii) are satisfied by this definition. For the inductive step, we pick a set of jobs $T$ as given by the following lemma (proved in Section 7).

Lemma 3.12.

For any feasible fractional allocation $x$ in the interval preferences setting with arbitrary capacities, one can efficiently construct a set of jobs $T$ such that the total fractional weight of $T$ at any item is at least the item's saturated weight and at most $2$. Formally, for all items $i$, $\min\{1, w_i(x)\} \le w_i(x_T) \le 2$.

Having constructed such a set $T$, we set $x^{(1)} = \frac{1}{2}\, x_T$, and recursively construct allocations for the remaining layers using $x_{J \setminus T}$ with capacities $c_i - 1$. Observe that $x^{(1)}$ is feasible for $\mathcal{I}_1$ with unit capacities by the definition of $T$. Furthermore, by removing jobs in $T$ from $x$, we reduce $w_i(x)$ by at least $\min\{1, w_i(x)\}$, and therefore, we can apply the inductive hypothesis to construct allocations for the remaining layers. This provides us with (i) and (ii) as desired above.

Finally, for each layer $k$, we apply Theorem 3.10 to the allocation $x^{(k)}$ to obtain a fractional unit allocation $\hat{x}^{(k)}$ for that layer. Then, the allocation $\hat{x} = \sum_k \hat{x}^{(k)}$ is a fractional unit allocation with $v(\hat{x}) = \Omega(1/K) \sum_k v(x^{(k)}) = \Omega(1/K) \cdot v(x)$. ∎

Combining Theorem 3.11 with Lemmas 2.1 and 3.1 immediately implies Theorem 1.1, which we state here for completeness.

Theorem 1.1.

For the interval preferences setting, there exists a static, anonymous bundle pricing with competitive ratio $O\!\left(\frac{\log L}{\log\log L}\right)$ for social welfare.

3.4 The large markets setting

In this section we consider the setting where every item is available in large supply. Specifically, let $B = \min_i c_i$. We show that as $B$ increases, the approximation ratio achieved by bundle pricing gradually decreases.

Theorem 1.2.

For the interval preferences setting, if every item has at least $B$ copies available, then a static, anonymous bundle pricing achieves a competitive ratio of $O\!\left(\frac{\log L}{B \log\log L}\right)$ when $B = O(\log L)$, and $O(1)$ otherwise.

Proof.

Let $m = \lfloor B/2 \rfloor$ and $\lambda = L^{1/m}$. We will partition both the item supply and the jobs into $m$ instances, such that on the one hand, the fractional solution confined to jobs within an instance will be feasible for the item supply in that instance; on the other hand, within each instance job lengths will differ by a factor of at most $\lambda$. Then, applying Theorem 3.11 will give us a fractional unit allocation for every instance with a factor of $O\!\left(\frac{\log \lambda}{\log\log \lambda}\right)$ loss in social welfare. Applying Lemma 3.1 to the union of these unit allocations then implies the theorem.

Let $x$ be the optimal fractional allocation. Divide all jobs into groups according to length: $G_k$ contains all jobs of length between $\lambda^{k-1}$ and $\lambda^k$. Let $x^{(k)}$ denote the allocation confined to the set $G_k$ scaled by $\frac{1}{2}$, that is, $x^{(k)} = \frac{1}{2}\, x_{G_k}$. We now specify the supply for instance $k$. Let $c^{(k)}_i = \big\lceil w_i(x^{(k)}) \big\rceil$.

Note that no item is over-provisioned (although some supply may be wasted):
$$\sum_k c^{(k)}_i \;\le\; \sum_k \Big( w_i(x^{(k)}) + 1 \Big) \;\le\; \frac{1}{2}\, w_i(x) + m \;\le\; \frac{c_i}{2} + \frac{B}{2} \;\le\; c_i.$$

Note also that $x^{(k)}$ is feasible for the supply $c^{(k)}$.

Applying Theorem 3.11 and Lemma 3.1 to the instances defined above then implies an approximation factor of $O\!\left(\frac{\log \lambda}{\log\log \lambda}\right)$, which is $O\!\left(\frac{\log L}{B \log\log L}\right)$ for $B = O(\log L)$, and $O(1)$ otherwise. ∎

4 Lower Bounds

We now turn to lower bounds. In this section, we ignore incentive constraints and bound the social welfare that can be achieved by any online algorithm for the online resource allocation problem. We consider the case of interval preferences in Section 4.1 and path preferences in Section 4.2.

4.1 Lower bound for interval preferences

We focus on the special case where every buyer is single-minded. That is, for every buyer $b$ and valuation $v_b$ in the support of $b$'s distribution, there exists an interval $I$ and a scalar $v$ such that $v_b(S) = v$ for all $S \supseteq I$ and $v_b(S) = 0$ otherwise. This setting is also called online interval scheduling. A lower bound for the unit-capacity version of this problem was previously developed by Im and Wang [18]. Although Im and Wang's focus was on the secretary-problem version (i.e. with random arrival order and adversarial values rather than worst-case arrival order and stochastic values), their lower bound construction uses jobs drawn independently from a fixed distribution, and it thus provides a lower bound for the Bayesian adversarial-order setting as well. We restate Im and Wang's result; our proof of Theorem 1.3 builds upon this result.

Lemma 4.1.

(Restated from Theorem 1.3 in [18]) For the unit-capacity setting of the online interval scheduling problem, every randomized online algorithm has a competitive ratio of $\Omega\!\left(\frac{\log L}{\log\log L}\right)$.

Theorem 1.3.

For the online interval scheduling problem, every randomized online algorithm has approximation ratio $\Omega\!\left(\frac{\log L}{B \log\log L}\right)$, where $B = \min_i c_i$.

Proof.

We argue that if there exists an online algorithm with competitive ratio $\alpha$ for the setting with minimum capacity $B$, then we can construct an online algorithm for the unit-capacity setting with competitive ratio $O(\alpha B)$, contradicting Lemma 4.1 above when $\alpha = o\!\left(\frac{\log L}{B \log\log L}\right)$.

Our main tool is the following lemma which shows that, for any set of intervals, if the intervals arrive in arbitrary order, there exists an online algorithm that colors them (i.e. such that no two overlapping intervals have the same color) using not many more colors than optimal.

Lemma 4.2.

(Restated from Theorem 5 in [20]) For any set of intervals $\mathcal{S}$ such that every point belongs to at most $k$ intervals in $\mathcal{S}$, there exists an online coloring algorithm utilizing at most $O(k)$ colors.

Suppose that for some $\alpha$ there exists an online algorithm $\mathcal{A}$ for markets with minimum capacity $B$ with competitive ratio $\alpha$. Consider the following algorithm for the unit-capacity setting. Let $r$ be a color chosen u.a.r. from the $O(B)$ colors used by the coloring algorithm of Lemma 4.2. As each buyer $b$ arrives, simulate $\mathcal{A}$ on $b$ (pretending every item has $B$ copies); if $\mathcal{A}$ accepts $b$, then input $I_b$ to the online coloring algorithm; if the coloring algorithm assigns color $r$ to $I_b$, then accept $b$.

By Lemma 4.2, the expected value of intervals with color $r$ is an $\Omega(1/B)$-fraction of the social welfare obtained by $\mathcal{A}$. Thus the approximation ratio of the resulting algorithm for the unit-capacity setting is $O(\alpha B)$, which contradicts Lemma 4.1 when $\alpha = o\!\left(\frac{\log L}{B \log\log L}\right)$. ∎
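A sketch of this reduction (illustrative code: `algo_B` stands for the assumed capacity-$B$ online algorithm, and the coloring subroutine is first-fit, which uses $O(k)$ colors on interval graphs with maximum overlap $k$, in the spirit of Lemma 4.2):

```python
import random

def unit_capacity_from_large_market(algo_B, num_colors: int):
    """Wrap an online algorithm for minimum capacity B into one for unit
    capacity: feed accepted intervals to an online first-fit coloring and
    keep a single random color class (cf. the proof of Theorem 1.3).
    num_colors should be chosen O(B), large enough for first-fit."""
    target = random.randrange(num_colors)   # the color class we keep
    colored = []                            # (interval, color) pairs so far

    def first_fit(interval):
        s, e = interval
        used = {c for (s2, e2), c in colored if not (e < s2 or e2 < s)}
        color = next(c for c in range(num_colors) if c not in used)
        colored.append((interval, color))
        return color

    def accept(buyer_interval, buyer_value) -> bool:
        if not algo_B(buyer_interval, buyer_value):  # simulate capacity-B algorithm
            return False
        return first_fit(buyer_interval) == target   # keep one color class
    return accept
```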

4.2 Lower bound for path preferences

In this section, we show that for the "online path scheduling" problem on trees no online algorithm can achieve a subpolynomial approximation with respect to $L$, the length of the longest possible path in the instance.

Theorem 4.3.

There exists an instance of the path preferences setting, where every buyer desires a path of length between $1$ and $L$, for which every randomized online algorithm has competitive ratio $L^{\Omega(1)}$ for social welfare.

Proof.

Consider a complete binary tree with height $h$ and unit capacity on every edge in the tree. We will define an instance with a unique buyer for every (node, leaf) pair in the tree where the leaf belongs to the subtree rooted at the node. Formally, define the level of edges bottom-up: for example, the leaf edges (i.e. edges adjacent to a leaf node) have level $0$, and the edges adjacent to the root have level $h-1$. There are $2^{h-\ell}$ edges at level $\ell$, and each of these edges has $2^\ell$ leaves in its subtree. For every level $\ell$, every edge $e$ at level $\ell$, and every leaf edge $f$ in the subtree rooted at $e$, our instance contains a unique buyer that arrives independently with probability $\frac{1}{2} \cdot 2^{-\ell}$ and has value $2^\ell$. We say that the buyer is at level $\ell$. Note that there are exactly $2^h$ distinct buyers at each level $\ell$, and therefore $h \cdot 2^h$ buyers in all.

Let us now compute the offline optimal welfare Opt for this instance. We claim $\textsc{Opt} = \Omega(h \cdot 2^h)$. Consider a greedy allocation that admits buyers in order from the highest to the lowest level: every buyer whose path is still available when considered gets allocated. Note that when a level-$\ell$ path beginning with edge $e$ is considered, it can be allocated if and only if $e$ is not allocated to previous buyers. We now make two observations. First, the expected number of buyers at levels above $\ell$ whose paths contain $e$ and that arrive is at most $\frac{1}{2}$. So, the probability that an edge is allocated prior to considering buyers at its level is at most $\frac{1}{2}$. Second, if prior to considering level-$\ell$ buyers, edge $e$ is available, then the probability that it gets allocated to a level-$\ell$ buyer is at least $1 - e^{-1/2} > \frac{1}{3}$. Therefore, the total contribution of level-$\ell$ buyers to the social welfare is at least $\frac{1}{6} \cdot 2^{h-\ell} \cdot 2^\ell = \frac{1}{6} \cdot 2^h$. This proves our claim.

Next we prove via a charging argument that the welfare of any online algorithm falls short of Opt by a factor polynomial in $h$. Assume that buyers arrive in order from lowest level to highest level. Construct a grid with height $h$ and width $2^h$ as follows. Each edge $e$ at level $\ell$ corresponds to a $1 \times 2^\ell$ rectangle of cells in the grid at the same level; see Figure 2. With each leaf edge, we also associate the entire column above its cell in the grid. So each cell in the grid is indexed by a level and a column corresponding to a leaf edge. As buyers arrive and the online algorithm makes allocation decisions, we mark cells in the grid to indicate availability. Initially all cells are unmarked. Whenever the online algorithm allocates a path to a buyer at level $\ell$ with first edge $e$, we mark all cells in the rectangle corresponding to edge $e$, as well as all cells above this rectangle. Now suppose that a buyer at level $\ell$ arrives whose path begins with edge $e$ and ends at leaf edge $f$. If the cell corresponding to column $f$ and level $\ell$ in the grid is already marked, this means the buyer cannot be legally allocated, because a buyer starting at some edge on the path from $f$ to $e$ was previously allocated. Let $U_e$ be the number of unmarked cells in the rectangle corresponding to $e$ when buyers beginning with $e$ start to arrive. If the online algorithm allocates to a buyer with starting edge $e$, the total number of cells that get marked at this iteration is at least $U_e$.

Figure 2: (a) An example of a binary tree and its corresponding grid; (b) jobs arrive bottom-up and get allocated. The grid shows the cells marked by each job.

Say an edge $e$ at level $\ell$ is "good" if, when the buyers beginning with $e$ start to arrive, $U_e$ exceeds a suitable threshold (a polynomial-in-$h$ multiple of $2^\ell$). For any buyer beginning with a good edge that is allocated by the algorithm, the allocation marks at least $U_e$ cells, i.e., a region whose area is a large multiple of the value of the buyer. Since the total area of the grid is $h \cdot 2^h$, the total value of buyers with good edges allocated by the algorithm is correspondingly small.

Now consider the welfare contributed by buyers at bad edges. At any such bad edge $e$ at level $\ell$, only buyers whose leaf column is unmarked, of which there are at most $U_e$, can get allocated; since each such buyer arrives with probability $\frac{1}{2} \cdot 2^{-\ell}$ and $U_e$ is below the threshold, the expected number of such buyers that arrive, and hence their total welfare contribution, is small as well.

Summing up the total welfare contribution from buyers at good and bad edges gives us a bound on the total welfare obtained by any online algorithm that falls short of Opt by a factor polynomial in $h$. ∎

5 Bundle Pricings for Path Preferences

We now turn to the path preferences setting. Recall that here the items correspond to edges in a fixed tree, and each buyer $b$ desires a path $P_b$. Our techniques for this setting are similar to those in Section 3 — we construct a partition of the items into layers and a corresponding fractional allocation that respects this layer structure, in that every buyer is allocated items from at most two layers and no single edge within a layer carries too much fractional weight. We show in Section 5.2 that such a good fractional layered allocation always exists. In Section 5.3 we show that a layered allocation guarantees a good bundle pricing. In Section 5.4 we present the corresponding "large markets" result, namely that the competitive ratio decreases linearly with capacity.

5.1 Fractional layered allocations

As in the case of fractional unit allocations that we defined for the interval preferences setting, we would like to construct a partition of items into bundles and a corresponding allocation of jobs such that each job fits cleanly into one bundle, no bundle contains too much fractional weight, and no item belongs to more bundles than its multiplicity. Unfortunately, it turns out that due to the rich combinatorial structure among paths in trees, such an allocation cannot be constructed without losing too much value in the worst case.

To overcome this issue, we simplify the combinatorial structure by orienting the underlying tree with an arbitrary root and breaking each path into two "monotone" components. Formally, for an arbitrary rooting of the underlying tree, define the depth of an edge $e$, denoted $d(e)$, to be the number of edges on the shortest path from the root to this edge, including $e$ itself. (Footnote 12: So, the edges incident on the root have depth $1$.) For a job $j$ with corresponding path $P_j$, define the peak of $j$'s desired path to be the edge(s) closest to the root: $\mathrm{peak}(j) = \arg\min_{e \in P_j} d(e)$. Let $a^1_j$ and $a^2_j$ denote the two subpaths of $P_j$ that begin at one of the peak edges in $\mathrm{peak}(j)$ and consist of all subsequent edges in $P_j$ with increasing depth; we call these the arms of the job. Note that one of these arms may be empty; in that case, we take $a^2_j$ to be the empty arm.

Definition 5.1.

A fractional allocation $x$ is a fractional layered allocation if there exists a partition of the multiset of items (where item $i$ has multiplicity $c_i$) into bundles ("layers") $B_1, \ldots, B_m$, and a corresponding partition of the arms with non-zero weight under $x$ into sets $T_1, \ldots, T_m$, such that:

  • For all $j$ with $x_j > 0$, there is exactly one index $k$ with $a^1_j \in T_k$ and, if $a^2_j$ is not empty, exactly one index $k'$ with $a^2_j \in T_{k'}$.

  • For all $k$ and $a \in T_k$, the edges of $a$ belong to $B_k$.

  • For all $k$ and every edge $e \in B_k$, we have $\sum_{a \in T_k : e \in a} x_a \le 1$.

Observe that there are two main differences between the layered allocations defined above and fractional unit allocations as defined in Section 3.1. First, layered allocations partition arms into layers rather than entire jobs, so a job can end up in two different layers. Second, and more importantly, a layer can have very large total weight. All we guarantee is that the weight of any single edge in a layer will be bounded by $1$. As such, the pricing we construct is allowed to allocate multiple subpaths of a single layer to different buyers in sequence.

5.2 Layering a tree

We now show that good fractional layered allocations always exist. For a fractional allocation $x$, let $v_{\max}(x) = \max\{v_j : x_j > 0\}$ and $v_{\min}(x) = \min\{v_j : x_j > 0\}$.

Lemma 5.1.

For every feasible fractional allocation $x$, one can efficiently construct a feasible layered fractional allocation $\tilde{x}$ such that $v(\tilde{x}) = \Omega\!\left(\frac{1}{\log \mathcal{V}}\right) \cdot v(x)$, where $\mathcal{V} = v_{\max}(x)/v_{\min}(x)$.

Before we proceed we need more notation. For a subset of jobs (or arms) $T$, let $E(T)$ denote the set of edges collectively desired by those jobs: $E(T) = \bigcup_{j \in T} P_j$. For an edge $e$, let $w_e(x)$ denote the total fractional weight of jobs (or arms) whose paths contain $e$. Likewise, let $w_e(T)$ (or $w_e(x_T)$) denote the fractional weight on $e$ from jobs (or arms) in the set $T$.

We construct the layering recursively. Informally, given a fractional allocation $x$, we find a set of jobs $T$ such that the fractional demand on any item desired by a job in $T$ is a constant. That is, for all $e \in E(T)$, $w_e(T) = \Theta(1)$. We scale the fractional allocation of jobs in $T$ by a constant factor, and take this scaled allocation along with the set $E(T)$ to be the first layer. We thus obtain a feasible layer, and lose a factor of at most a constant in fractional value.

However, in order to apply this step recursively and arrive at a feasible layered allocation, we need the remaining fractional solution to be feasible for the items remaining after reducing the capacity of every item in $E(T)$ by one. Therefore, we need the fractional weight on every edge in $E(T)$ to decrease by at least 1 (or else to zero) after removing $T$. It may not be possible to find a set $T$ which satisfies both this and the constant-demand condition described above. Instead, we find a set $D$ such that jobs in $T$ and $D$ together account for one unit of demand on every edge in $E(T)$, and then we drop $D$. Because we drop jobs in $D$ rather than assigning them to a layer, we do not need to ensure that the fractional weight on edges in $E(D)$ decreases by any particular amount. We pick a set $D$ with small fractional weight relative to that of $T$, so that we can charge its lost fractional value to $T$.

Finally, although we have informally described the argument in terms of jobs, in our formal argument we consider sets of arms. This introduces an extra complication: when we drop the set of arms, we must also drop their sibling arms. We show that we can do this without losing more than another constant factor in fractional value.

We formalize the recursive step in the following lemma; the proof is deferred to Section 7.

Lemma 5.2.

Given a set of jobs and a non-zero fractional allocation $x$, there exist sets of arms $T$ and $D$ such that

  1. $T$ and $D$ are disjoint, and consist of arms with non-zero weight under $x$;

  2. the fractional weight of $D$ is at most a constant fraction of that of $T$, where weight is measured with respect to $x$;

  3. for all $e \in E(T)$, $w_e(T) = \Theta(1)$;

  4. for all $e \in E(T)$, the arms in $T$ and $D$ together account for a full unit of residual demand: $w_e(T) + w_e(D) \ge \min\{1, w_e(x)\}$.