For many problems in social choice, the number of alternatives is very large. For example, consider the problem of voting over possible budgets in a given municipality, where the number of alternatives is infinite (for a divisible budget) or exponential (for funding integral projects). In such settings, it may be impractical to elicit full rankings over alternatives from every voter. Instead, we may want to design mechanisms that only require voters to rank at most a constant number of alternatives. In this paper, we study such mechanisms.
We consider the standard problem in social choice wherein there is a set of $n$ voters and a set of $m$ alternatives from which we must select a single winner. However, we assume that $m$ is large enough to prohibit eliciting full rankings over the alternatives. We also allow $n$ to be large. We adopt the implicit utilitarian perspective with metric constraints [BCH15, CDK17, AP17, GKM17, ABE18, FFG16]. That is, we assume that voters have cardinal costs over alternatives, and these costs are constrained to be metric, but voters cannot directly report cardinal costs. We want to design social choice mechanisms that minimize the total social cost while only asking voters to rank at most a constant number of alternatives. We measure the efficiency of a mechanism by its Distortion (see Section 2), the worst-case approximation to the total social cost.
It is easy to see that randomization is necessary to achieve constant Distortion if we cannot elicit the ordinal preferences of voters over all alternatives. One natural form of randomization is to elicit alternatives by randomly sampling voters and querying them for their favorite alternatives. More generally, in this paper we study mechanisms of the following type: the set of candidate alternatives is the set of favorite alternatives of a subset of the voters. Subsequently, these alternatives are ranked either (i) by the entire population of voters or (ii) by a small subset of the voters. We refer to these as the full and limited participation models, respectively.
These assumptions are not merely of theoretical interest, but model social choice in emergent domains. Assumption (i) is natural in contexts where all voters are entitled to participate in the final election. For instance, in real-world Participatory Budgeting applications (see Section 5), a small subset of individuals propose projects, but a much larger number participate in the subsequent vote. Assumption (ii) models situations where we want a lightweight social choice mechanism that only involves a small number of voters overall, such as the many department-level decisions made at universities by committees representing samples of the faculty.
Prior work [AP17, GAX17, FGMP19] analyzed simple social choice mechanisms for achieving constant Distortion. However, focusing on the expected Distortion can yield randomized mechanisms whose realized Distortion deviates significantly from its expectation ex post, and which may hence be risky to implement in practice.
We address this problem by considering higher moments of Distortion. The $p$-th moment of Distortion is the expected value of the $p$-th power of the approximation ratio to the optimal utilitarian social cost. The goal of bounding higher moments of Distortion is directly analogous to providing high probability bounds on approximation guarantees with respect to the total social cost. We note that obtaining such a bound does not follow in a trivial manner from standard sampling arguments: the higher moments depend on the entire distribution of the Distortion obtained by the mechanism, and if this distribution has unbounded variance, then it is not possible to bound the second moment by a constant with any number of samples, let alone higher moments. Moreover, it is initially unclear how to take the “best” result out of many randomly sampled alternatives. Our key insight is that the metric assumption enables us to derive tight bounds on higher moments with only a few samples by using existing deterministic social choice rules to take the “best” from many randomly sampled alternatives.
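To make the unbounded-variance phenomenon concrete, consider the following small simulation (a hypothetical two-point instance constructed purely for illustration; it is not from the paper's experiments). A mechanism that outputs a uniformly random voter's favorite alternative has expected Distortion below 3, yet its second moment grows without bound as the minority shrinks:

```python
import random

def random_dictatorship_distortion(delta, trials=200_000, rng=None):
    """Simulate Random Dictatorship on a two-point metric: a (1 - delta)
    fraction of voters sit at the optimum a, a delta fraction at b, with
    d(a, b) = 1.  Returns the empirical first and second moments of the
    approximation ratio (Distortion)."""
    rng = rng or random.Random(0)
    opt_cost = delta          # average cost of a: delta voters at distance 1
    bad_cost = 1 - delta      # average cost of b
    ratios = []
    for _ in range(trials):
        # The dictator is a voter located at b with probability delta.
        winner_cost = bad_cost if rng.random() < delta else opt_cost
        ratios.append(winner_cost / opt_cost)
    m1 = sum(ratios) / trials
    m2 = sum(r * r for r in ratios) / trials
    return m1, m2

# As delta shrinks, the mean ratio stays below 3 but the second moment blows up:
for delta in [0.1, 0.01, 0.001]:
    m1, m2 = random_dictatorship_distortion(delta)
    print(f"delta={delta}: E[ratio]={m1:.2f}, E[ratio^2]={m2:.1f}")
```

Here the expected ratio is $2(1-\delta) < 3$, while the second moment is roughly $1/\delta$, which diverges as $\delta \to 0$.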
1.1 Summary of Results
Our primary contribution is the development and analysis of randomized social choice mechanisms that achieve constant $p$-th moment of Distortion in the metric implicit utilitarian model while requiring each voter to rank at most a constant number of sampled alternatives, regardless of the total number of voters and alternatives. The normalized $p$-th moment of Distortion is defined formally in Section 2, and our results are summarized in Table 1. In particular, we design two families of mechanisms that have constant $p$-th moment of Distortion. The first asks just a small number of randomly chosen voters for their favorite alternatives, assuming all voters can subsequently participate in a vote among these alternatives. The second asks a small number of randomly chosen voters for their favorite alternatives, and only these sampled voters participate in a vote among their favorite alternatives. To the best of our knowledge, these are the first results in implicit utilitarian social choice providing guarantees for arbitrarily high moments of Distortion and approximating the optimal social cost with high probability.
| Participation model | Lower bound | Upper bound |
| --- | --- | --- |
| Full | (Thm. 1) | (Thm. 2) |
| Limited | (Thm. 3) | (Thm. 4) |

Table 1: Sample complexity bounds for achieving constant normalized $p$-th moment of Distortion.
Additionally, we show that our upper bounds on the number of samples needed are tight. We show that the $p$-th moment of Distortion is unbounded in the following two settings: first, when we sample too few favorite alternatives, even if all voters can subsequently compare these alternatives; and second, when we sample too few voters and the entire mechanism uses only their favorite alternatives and their comparisons between these alternatives. From a practical perspective, we demonstrate the value of using additional voters and alternatives: at most two additional samples guarantee that another higher moment of Distortion can be bounded. Finally, in Section 5, we present simulations on real-world Participatory Budgeting data to qualitatively complement our theoretical insights.
1.2 Related Work
1.2.1 Metric Distortion.
The Distortion of randomized social choice mechanisms in metric spaces is well studied [BCH15, AP17, GKM17, GAX17]. The Random Dictatorship mechanism outputs the favorite alternative of a single randomly sampled voter, and the 2-Agree mechanism [GAX17] samples voters until two of them agree on a favorite alternative. Random Dictatorship has Distortion at most 3 [AP17], and 2-Agree improves this bound when the number of alternatives is small. Nothing better than Random Dictatorship is known if the goal is to minimize the expected Distortion. However, it is easy to show that such mechanisms do not have constant second (or higher) moment of Distortion [FGMP19].
Using the second moment of Distortion as a proxy for risk was introduced in [FGMS17, FGMP19], where it was shown that making one sampled voter compare the favorite alternatives of two randomly sampled voters bounds the second moment of Distortion. In this paper, we consider the natural question: What is the value of each additional voter in how well the Distortion concentrates? We provide a tight characterization by bounding not just the second moment, but any higher moment of Distortion.
The extreme case in which full rankings are elicited from all voters is the deterministic setting, where it is known that the Copeland mechanism, or any mechanism based on choosing from the uncovered set [Mil77], yields Distortion at most 5 [ABE18]. This bound was improved to $2 + \sqrt{5}$ in [MW19] via a weighted generalization of the uncovered set. However, both of these methods require eliciting full ordinal preferences from voters.
1.2.2 Communication and Sample Complexity.
For a more thorough survey on the complexity of eliciting ordinal preferences to implement social choice rules, we refer the interested reader to [BCE16]. [CS05] comprehensively characterizes the communication complexity (in terms of the number of bits communicated) of common deterministic voting rules. [BCDL17] and [CP10] design social choice mechanisms with low communication complexity when there are a small number of voters, but potentially a large number of alternatives.
[DB15, DN15] study the sample complexity of predicting the outcome of deterministic social choice rules. However, a “sample” in this work is the entire ordinal preference list for a single voter, whereas a sample for us is only the top alternative for a given voter. Even then, they show that predicting the outcome of rules with small Distortion (such as Copeland) requires a number of samples that grows with the total number of alternatives. We show that a smaller number of more limited samples suffice to bound higher moments of Distortion.
Recently, [MPSW19] studied a different notion of communication complexity in a non-metric implicit utilitarian model where voters can communicate a limited number of bits of information about their cardinal preferences. In this case, the baseline is ordinal voting, and the other extreme is communicating the entire set of cardinal utilities. They show tight results for how Distortion trades off with the communication complexity in terms of bits of information communicated per voter. In our setting, voters only convey ordinal information, and we study the sample complexity needed to bound not just the Distortion but also how well it concentrates.
We have a set $N$ of $n$ voters and a set $A$ of $m$ alternatives, from which we must choose a single outcome. For each voter $v$ and alternative $a$, there is some underlying dis-utility $d(v, a)$. Let $a_v = \arg\min_{a \in A} d(v, a)$; that is, $a_v$ is the favorite alternative for voter $v$. Ordinal preferences are specified by a total order consistent with these dis-utilities (i.e., an alternative is ranked above another only if it has lower dis-utility). A preference profile $\sigma$ specifies the ordinal preferences of all voters, and we write $\sigma \triangleright d$ to mean that $\sigma$ is consistent with the dis-utilities $d$ for every voter. A deterministic social choice rule is a function $f$ that maps a preference profile to an alternative in $A$. A randomized social choice rule maps a preference profile to a distribution over $A$.
2.1 Metric Implicit Utilitarian Model
We measure the quality of an alternative $a$ by its social cost, given by $SC(a) = \sum_{v \in N} d(v, a)$. Where $d$ is obvious from context, we will simply write $SC(a)$. Let $a^* = \arg\min_{a \in A} SC(a)$ be the minimizer of social cost. The Distortion [PR06] measures the worst-case approximation to the optimal social cost of a given mechanism, in expectation for randomized mechanisms.
The Distortion of a social choice rule $f$ is
$$D(f) = \sup_{d} \; \sup_{\sigma \triangleright d} \; \frac{\mathbb{E}\left[SC(f(\sigma))\right]}{SC(a^*)},$$
where $SC(\cdot)$ denotes social cost, $a^*$ is the social cost minimizer, and the expectation is over the randomness of $f$.
We assume that the voters and alternatives are points in a metric space. Specifically, we assume the dis-utility function $d$ is the distance function over this metric space. This assumption models social choice scenarios where there is an objective notion of the distance between alternatives. The metric assumption is common in the implicit utilitarian literature [AP17, GKM17, FGMS17, GAX17, CDK17, ABE18, FFG16, FGMP19], and we consider an example from participatory budgeting in Section 5 where the metric assumption is plausible.
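In code, these definitions are straightforward; the sketch below uses points on the real line as both voters and alternatives (an illustrative metric chosen for simplicity, not tied to any instance in the paper):

```python
def social_cost(alt, voters):
    """Total social cost of an alternative: sum of metric distances to the
    voters.  Here the metric is the real line; `voters` and `alt` are
    coordinates."""
    return sum(abs(v - alt) for v in voters)

def distortion(chosen, voters, alternatives):
    """Approximation ratio of `chosen` against the social-cost minimizer."""
    opt = min(social_cost(a, voters) for a in alternatives)
    return social_cost(chosen, voters) / opt

voters = [0.0, 0.1, 0.2, 5.0]        # one outlier voter
alternatives = [0.1, 5.0]
print(distortion(0.1, voters, alternatives))  # optimal choice -> 1.0
print(distortion(5.0, voters, alternatives))
```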
2.2 Sampling and Higher Moments of Distortion
We consider mechanisms that implement a randomized social choice rule by first eliciting favorite alternatives from a random sample of voters and then using only these alternatives for the rest of the mechanism. The size of this random sample is the sample complexity of our mechanism. We are interested in mechanisms whose sample complexity is constant with respect to $n$ and $m$. A mechanism with sample complexity $k$ only requires voters to rank at most $k$ alternatives, so constant sample complexity implies that the number of alternatives voters must rank is constant with respect to $n$ and $m$.
We consider two models that differ in how voters participate after we elicit these alternatives. In the full participation model of Section 3, we allow all voters to rank the alternatives from the first step and we aggregate these votes to output the winner. While this requires two distinct rounds, it is close to how real Participatory Budgeting processes work, where proposals are constructed by a subset of the population in the first stage and are put to a vote in the second stage. In the limited participation model of Section 4, only the sample of voters from the first step votes over the alternatives. Thus, mechanisms in the limited participation model do not require a second distinct round involving different voters. It is worth noting that while the sample complexity of our results is lower in the full participation model, the total communication complexity is higher because all voters participate in the second round.
In order to capture the notion of risk inherent in a randomized social choice mechanism, we consider higher statistical moments of Distortion. In order to fairly compare the bounds for different moments, we normalize the $p$-th moment by taking the $p$-th root.
The normalized $p$-th moment of Distortion of a social choice rule $f$ is
$$D_p(f) = \sup_{d} \; \sup_{\sigma \triangleright d} \; \left(\mathbb{E}\left[\left(\frac{SC(f(\sigma))}{SC(a^*)}\right)^p\right]\right)^{1/p}.$$
Note that by Jensen’s inequality, if a mechanism $f$ satisfies $D_p(f) \le c$, then $D_q(f) \le c$ for all $q \le p$. By contrast, lower moments do not imply anything about higher moments of Distortion.
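The monotonicity implied by Jensen's inequality (equivalently, the power-mean inequality) is easy to observe numerically; the distribution of "ratios" below is synthetic, chosen only for illustration:

```python
import random

def normalized_moment(samples, p):
    """Normalized p-th moment: (E[X^p])^(1/p), estimated from samples."""
    return (sum(x ** p for x in samples) / len(samples)) ** (1.0 / p)

rng = random.Random(42)
# Synthetic "distortion ratio" draws: 1 plus an exponential tail.
ratios = [1.0 + rng.expovariate(2.0) for _ in range(100_000)]

moments = [normalized_moment(ratios, p) for p in (1, 2, 3, 4)]
print(moments)
# Non-decreasing in p, as the power-mean inequality guarantees:
assert all(moments[i] <= moments[i + 1] for i in range(3))
```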
2.3 Relationship Between Higher Moments and High Probability Guarantees
Upper bounds on higher moments of Distortion immediately provide high probability guarantees for approximating the optimal social cost via Markov’s inequality (see Corollaries 1 and 2). However, one can reasonably ask whether the high probability bounds we achieve in this way are “tight.”
More precisely, suppose we want to approximate the optimal social cost with high probability: i.e., for a given constant approximation factor and constant $\delta > 0$, find an alternative whose social cost is within that factor of optimal with probability at least $1 - \delta$. How many samples (favorite alternatives of random voters) are necessary as a function of $\delta$? The example in Theorem 1 shows that one needs at least $\Omega(\log(1/\delta))$ samples in the full participation model. On the other hand, Corollary 2 shows that our PRC mechanism needs just $O(\log(1/\delta))$ samples. So our results are tight with respect to the dependence on the probability term $\delta$, but the factor of 11 in Corollary 1 is a consequence of the analysis for Theorem 2 and may be improvable.
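This conversion from moment bounds to sample counts can be sketched as a small calculation (illustrative only; it assumes, as discussed above for PRC, that bounding the $p$-th moment at some constant $c$ costs $p + 1$ samples): if $D_p \le c$, Markov's inequality gives $\Pr[\text{Distortion} \ge c\,t] \le t^{-p}$, so failure probability $\delta$ at factor $c\,t$ needs $p \ge \log(1/\delta)/\log t$.

```python
from math import ceil, log

def samples_for_guarantee(t, delta):
    """Samples for a PRC-style mechanism to guarantee, with probability
    >= 1 - delta, social cost within c*t of optimal, assuming the p-th
    moment is bounded by c using p + 1 samples and applying Markov's
    inequality: Pr[Distortion >= c*t] <= t**-p."""
    p = ceil(log(1 / delta) / log(t))
    return p + 1

print(samples_for_guarantee(t=2, delta=0.01))  # p = ceil(log2(100)) = 7 -> 8 samples
```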
3 Full Participation Model
In this section, we consider mechanisms that first elicit alternatives by sampling a number of voters and querying them for their most preferred alternatives, and then apply a social choice rule to the elicited alternatives with all voters. We begin with a lower bound on the number of samples needed to bound the $p$-th moment of Distortion.
Theorem 1. Any mechanism with sample complexity less than $p$ has unbounded normalized $p$-th moment of Distortion.
Consider a metric space with two outcomes $a$ and $b$ separated by distance $1$. The fraction of voters located at $b$ is $\delta$ and at $a$ is $1 - \delta$. Note that the average (per-voter) social cost of $a$ is $\delta$. If $k$ voters are sampled, then with probability $\delta^k$, all of them lie at $b$, in which case any voting mechanism using these samples is run on only outcome $b$. Therefore, the average social cost in this case is $1 - \delta$. The normalized $p$-th moment of Distortion is therefore at least:
$$\left(\delta^k \left(\frac{1-\delta}{\delta}\right)^p\right)^{1/p} = (1-\delta)\,\delta^{k/p - 1}.$$
Letting $\delta \to 0$, so that all but a vanishing fraction of voters lie at $a$, the above expression is unbounded whenever $k < p$. ∎
3.1 The PRC Mechanism
On the constructive side, we consider a family of mechanisms that achieves constant normalized $p$-th moment of Distortion using the minimum possible number of samples. We call this family the Partially Random Copeland rules.
The Partially Random Copeland rule parameterized by a positive integer $k$, denoted $PRC_k$, proceeds as follows. First, sample a set $S$ of $k$ voters drawn independently and uniformly at random from $N$ with replacement. All voters in $S$ are queried for their favorite alternative, and the union of all such alternatives is denoted $A_S$. Finally, $PRC_k$ returns the winning alternative under the Copeland social choice rule with voters $N$ and alternatives $A_S$.
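A minimal Python sketch of $PRC_k$ under these definitions (illustrative only: voters and alternatives are points with a distance function `dist`, and ranking by distance stands in for the elicited ordinal preferences):

```python
import random
from itertools import combinations

def copeland_winner(voters, alts, dist):
    """Copeland rule: an alternative scores 1 for each pairwise contest it
    wins by strict majority (0.5 for a tie); the highest score wins."""
    score = {a: 0.0 for a in alts}
    for a, b in combinations(alts, 2):
        wins_a = sum(1 for v in voters if dist(v, a) < dist(v, b))
        wins_b = sum(1 for v in voters if dist(v, b) < dist(v, a))
        if wins_a > wins_b:
            score[a] += 1
        elif wins_b > wins_a:
            score[b] += 1
        else:
            score[a] += 0.5
            score[b] += 0.5
    return max(alts, key=lambda a: score[a])

def prc(voters, all_alts, dist, k, rng):
    """Partially Random Copeland (PRC_k) sketch: sample k voters with
    replacement, collect their favorite alternatives, then run Copeland with
    ALL voters restricted to the sampled alternatives."""
    sampled = [rng.choice(voters) for _ in range(k)]
    elicited = sorted({min(all_alts, key=lambda a: dist(v, a)) for v in sampled})
    return copeland_winner(voters, elicited, dist)
```

With all voters participating in the Copeland stage, only the elicitation step is randomized.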
In the rest of this section, we will show the following. Intuitively, Theorem 2 asserts that every additional sample in the elicitation step of PRC provides a constant approximation to the next higher moment of Distortion.
Theorem 2. For any number of voters and any $k \ge p + 1$, $D_p(PRC_k) \le 9 + 2\left(\frac{k}{k-p}\right)^{1/p}$. For $k = p + 1$, this bound is $9 + 2(p+1)^{1/p}$, which approaches $11$ as $p \to \infty$.
As a simple consequence, using Markov’s inequality, this yields a high probability bound on Distortion. In particular, every additional sample in the elicitation step of PRC provides a geometric improvement in the high probability bound.
Corollary 1. For any $t \ge 1$, the probability that $PRC_{p+1}$ outputs an alternative with social cost more than $\left(9 + 2(p+1)^{1/p}\right) t$ times that of the social optimum is at most $t^{-p}$; the leading factor approaches $11$ as $p \to \infty$.
We first present a useful lemma bounding the $p$-th moment of the minimum of $k$ random variables.

Lemma 1. Let $X_1, X_2, \ldots, X_k$ be drawn i.i.d. from a distribution $\mathcal{D}$ over the nonnegative reals with mean $\mu$, and let $X = \min_i X_i$. Then, for $k > p$,
$$\mathbb{E}[X^p] \le \frac{k}{k-p}\,\mu^p.$$

By Markov’s inequality, $\Pr[X_i \ge t] \le \mu/t$ for each $i$, and since the draws are independent, $\Pr[X \ge t] \le \min\left\{1, (\mu/t)^k\right\}$. Therefore, we have:
$$\mathbb{E}[X^p] = \int_0^\infty p\,t^{p-1}\,\Pr[X \ge t]\,dt \le \int_0^\mu p\,t^{p-1}\,dt + \int_\mu^\infty p\,t^{p-1}\left(\frac{\mu}{t}\right)^k dt = \mu^p + \frac{p}{k-p}\,\mu^p = \frac{k}{k-p}\,\mu^p. \qquad \square$$
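As a quick numerical sanity check (using Exp(1) draws, so $\mu = 1$, and the Markov-based bound $\frac{k}{k-p}\mu^p$ for $k > p$, which follows from integrating the tail bound $\Pr[\min_i X_i \ge t] \le (\mu/t)^k$):

```python
import random

def empirical_min_moment(k, p, trials, rng):
    """Estimate E[(min of k i.i.d. Exp(1) draws)^p] by Monte Carlo."""
    total = 0.0
    for _ in range(trials):
        m = min(rng.expovariate(1.0) for _ in range(k))
        total += m ** p
    return total / trials

rng = random.Random(7)
k, p, mu = 4, 2, 1.0                      # Exp(1) has mean 1
est = empirical_min_moment(k, p, 200_000, rng)
bound = (k / (k - p)) * mu ** p           # Markov-based bound: k/(k-p) * mu^p
print(est, bound)
assert est <= bound
```

Here the true value ($\mathbb{E}[X^2] = 2/k^2 = 0.125$, since the minimum of $k$ Exp(1) draws is Exp($k$)) sits well below the bound of $2.0$.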
We now proceed to prove Theorem 2. Let $a^*$ denote the social optimum. Let $\mu = SC(a^*)/n$ denote the average distance of a voter to $a^*$. Suppose we sample a set $S$ of $k$ voters. For $v_i \in S$, let $X_i = d(v_i, a^*)$. Note that $\mathbb{E}[X_i] = \mu$, and the $X_i$ are i.i.d. random variables.
Let $j \in S$ be the sampled voter closest to $a^*$, and let $a_j$ denote their favorite alternative. Note $d(j, a_j) \le d(j, a^*) = \min_i X_i$, so by the triangle inequality $d(a_j, a^*) \le 2\min_i X_i$.
Let $X = \min_i X_i$. Consider a ball centered at $a^*$ of radius $2\mu$, denoted $B$. By Markov’s inequality, at least half of all voters, i.e., at least $n/2$, lie within the ball $B$, since the average distance of a voter to $a^*$ is $\mu$. In particular, any strict majority of voters must contain at least one voter in $B$.
Given $X$, suppose $PRC_k$ chooses alternative $w$, and suppose $w \ne a_j$ (otherwise $d(w, a^*) \le 2X$ and the bound below holds immediately). We will show an upper bound on $d(w, a^*)$ using the random variable $X$. Since a Copeland winner must be a member of the uncovered set [Mil77], either a majority of voters prefer $w$ to $a_j$, or a majority of voters must prefer $w$ to an alternative $c$ such that a majority of voters also prefer $c$ to $a_j$. The first case is easier: if a majority of voters prefer $w$ to $a_j$, then there exists a voter $v \in B$ that prefers $w$ to $a_j$. This implies that $d(v, w) \le d(v, a_j)$.
Recall that $d(v, a^*) \le 2\mu$ and $d(a_j, a^*) \le 2X$, so
$$d(w, a^*) \le d(w, v) + d(v, a^*) \le d(v, a_j) + 2\mu \le d(v, a^*) + d(a^*, a_j) + 2\mu \le 4\mu + 2X.$$
The second case yields a worse bound, so we continue the analysis in that case without loss of generality. Let $c$ be the covering alternative. Since a majority of voters prefer $c$ to $a_j$, there is at least one voter $v_1 \in B$ that prefers $c$ to $a_j$, that is, $d(v_1, c) \le d(v_1, a_j)$. By the triangle inequality,
$$d(c, a^*) \le d(c, v_1) + d(v_1, a^*) \le d(v_1, a_j) + 2\mu \le d(v_1, a^*) + d(a^*, a_j) + 2\mu,$$
where the rightmost inequalities follow from the fact that $v_1 \in B$. Combining the above inequalities and using $d(a_j, a^*) \le 2X$, we have $d(c, a^*) \le 4\mu + 2X$. Similarly, since a majority of voters prefer $w$ to $c$, there exists some $v_2 \in B$ such that $d(v_2, w) \le d(v_2, c)$. Again, by the triangle inequality:
$$d(w, a^*) \le d(w, v_2) + d(v_2, a^*) \le d(v_2, c) + 2\mu \le d(v_2, a^*) + d(a^*, c) + 2\mu \le 4\mu + d(c, a^*),$$
where we used that $v_2 \in B$ and $d(v_2, a^*) \le 2\mu$. Combining the above inequalities, we have $d(w, a^*) \le 8\mu + 2X$.
Thus, we know that for $w$ to win Copeland, $d(w, a^*) \le 8\mu + 2X$. By the triangle inequality, and using $SC(a^*) = n\mu$, we have:
$$SC(w) \le \sum_{v \in N} \left(d(v, a^*) + d(a^*, w)\right) = SC(a^*) + n\,d(w, a^*) \le n\left(9\mu + 2X\right).$$
Setting $Y = X/\mu$, we have:
$$\frac{SC(w)}{SC(a^*)} \le 9 + 2Y.$$
Since $X$ is the minimum of $k$ i.i.d. random variables with mean $\mu$, applying Lemma 1, we have $\mathbb{E}[Y^p] \le \frac{k}{k-p}$ for $k > p$. Applying Jensen’s inequality (via the triangle inequality for the $L^p$ norm), for $k \ge p + 1$, we have
$$\left(\mathbb{E}\left[(9 + 2Y)^p\right]\right)^{1/p} \le 9 + 2\left(\mathbb{E}[Y^p]\right)^{1/p} \le 9 + 2\left(\frac{k}{k-p}\right)^{1/p}.$$
Therefore, we have
$$D_p(PRC_k) \le 9 + 2\left(\frac{k}{k-p}\right)^{1/p}.$$
For $k = p + 1$, this is $9 + 2(p+1)^{1/p}$, completing the proof of Theorem 2. ∎
4 Limited Participation Model
In this section, we consider mechanisms that sample some number of voters, query these voters for their most preferred alternatives, and then hold an election among just the sampled voters. We first show that limiting participation in this way necessarily increases the sample complexity.
Theorem 3. Any anonymous limited participation randomized mechanism with sample complexity less than $2p$ has unbounded normalized $p$-th moment of Distortion.
Consider the same instance as in Theorem 1. Suppose we sample $2k$ voters with $k < p$. Then the probability that we sample an equal number of voters located at $a$ and at $b$ is
$$\binom{2k}{k}\,\delta^k (1-\delta)^k \ge \delta^k (1-\delta)^k,$$
where we have assumed $\delta \le 1/2$. In this event, since there is no majority of voters in the sample that prefer either alternative, by symmetry any anonymous mechanism outputs $b$ with probability at least $1/2$, in which case the average social cost is $1 - \delta$. Therefore, the normalized $p$-th moment of Distortion is at least:
$$\left(\frac{1}{2}\,\binom{2k}{k}\,\delta^k (1-\delta)^k \left(\frac{1-\delta}{\delta}\right)^p\right)^{1/p} = \Omega\!\left(\delta^{k/p - 1}\right).$$
Letting $\delta \to 0$, so that all but a vanishing fraction of voters lie at $a$, the above expression is unbounded whenever $k < p$. ∎
4.1 The FRC Mechanism
Complementing the above impossibility, we show that sample complexity $2p + 1$ is also sufficient to achieve constant $p$-th moment of Distortion. In particular, we define another family of social choice rules called Fully Random Copeland.
The Fully Random Copeland rule parameterized by a positive integer $k$, denoted $FRC_k$, proceeds as follows. First, sample a set $S$ of $2k+1$ voters drawn independently and uniformly at random from $N$ with replacement. All voters in $S$ are queried for their favorite alternative, and the union of all such alternatives is denoted $A_S$. Finally, $FRC_k$ returns the winning alternative under the Copeland social choice rule with voters $S$ and alternatives $A_S$.
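A matching sketch of FRC in the same illustrative setting (voters and alternatives as points with a distance function `dist`; the number of sampled voters is left as a parameter `s`, which the analysis takes to be odd):

```python
import random
from itertools import combinations

def copeland_winner(voters, alts, dist):
    """Copeland rule among `alts`, with each voter ranking by distance."""
    score = {a: 0.0 for a in alts}
    for a, b in combinations(alts, 2):
        wins_a = sum(1 for v in voters if dist(v, a) < dist(v, b))
        wins_b = sum(1 for v in voters if dist(v, b) < dist(v, a))
        if wins_a > wins_b:
            score[a] += 1
        elif wins_b > wins_a:
            score[b] += 1
        else:
            score[a] += 0.5
            score[b] += 0.5
    return max(alts, key=lambda a: score[a])

def frc(voters, all_alts, dist, s, rng):
    """Fully Random Copeland sketch: sample s voters with replacement; only
    these voters vote, and only over their own favorite alternatives
    (the limited participation model -- no second round with new voters)."""
    sampled = [rng.choice(voters) for _ in range(s)]
    favorites = sorted({min(all_alts, key=lambda a: dist(v, a)) for v in sampled})
    return copeland_winner(sampled, favorites, dist)
```

Unlike PRC, both the elicited alternatives and the electorate are random here, which is why the analysis needs the extra samples.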
In the rest of this section, we will show the following. Intuitively, Theorem 4 says that every additional two voters participating in FRC provide a constant approximation to the next higher moment of Distortion.
Again, as a consequence of Markov’s inequality, we have the following high probability bound. In particular, every additional two voters in FRC provide a geometric improvement in the high probability bound.
Corollary 2. For $p \le k$ and any $t \ge 1$, the probability that $FRC_k$ outputs an alternative with social cost more than $c\,t$ times that of the social optimum is at most $t^{-p}$, where $c$ is the constant bound on $D_p(FRC_k)$ given by Theorem 4.
As in Section 3, we first present a result bounding the $p$-th moment of a function of random variables; this time the function is the median instead of the minimum.
Lemma 2. Let $X_1, \ldots, X_{2k+1}$ be drawn i.i.d. from a distribution $\mathcal{D}$ over the nonnegative reals with mean $\mu$, and let $Y$ denote the median of $X_1, \ldots, X_{2k+1}$. Then, for $p \le k$,
$$\mathbb{E}[Y^p] \le \frac{k+1}{k+1-p}\,\binom{2k+1}{k+1}^{\frac{p}{k+1}}\,\mu^p.$$
We follow the same structure as the proof of Lemma 1. If $Y \ge t$, then at least $k+1$ of the $2k+1$ values are at least $t$; in particular, there exists a subset of $k+1$ values from $X_1, \ldots, X_{2k+1}$ whose minimum is at least $t$. Using the expression for the probability of the minimum of $k+1$ values from Lemma 1 and a union bound over subsets, the probability of the latter event can be upper bounded as:
$$\Pr[Y \ge t] \le \binom{2k+1}{k+1}\left(\frac{\mu}{t}\right)^{k+1}.$$
From the proof of Lemma 1, integrating this tail bound for $p \le k$, we have:
$$\mathbb{E}[Y^p] = \int_0^\infty p\,t^{p-1}\,\Pr[Y \ge t]\,dt \le \frac{k+1}{k+1-p}\,\binom{2k+1}{k+1}^{\frac{p}{k+1}}\,\mu^p. \qquad \square$$
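Numerically, the median's moments sit far below this tail bound. The check below uses Exp(1) draws (so $\mu = 1$) and the constant $\frac{k+1}{k+1-p}\binom{2k+1}{k+1}^{p/(k+1)}$, which follows from integrating the union-bound tail $\Pr[Y \ge t] \le \binom{2k+1}{k+1}(\mu/t)^{k+1}$:

```python
import random
import statistics
from math import comb

def empirical_median_moment(k, p, trials, rng):
    """Estimate E[(median of 2k+1 i.i.d. Exp(1) draws)^p] by Monte Carlo."""
    total = 0.0
    for _ in range(trials):
        med = statistics.median(rng.expovariate(1.0) for _ in range(2 * k + 1))
        total += med ** p
    return total / trials

k, p, mu = 3, 2, 1.0
rng = random.Random(11)
est = empirical_median_moment(k, p, 100_000, rng)
bound = (k + 1) / (k + 1 - p) * comb(2 * k + 1, k + 1) ** (p / (k + 1)) * mu ** p
print(est, bound)
assert est <= bound
```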
We will also need the following straightforward property of the Copeland Rule.
Lemma 3. Suppose there are $m$ alternatives and an odd number of voters with strict preferences. We construct a tournament graph on the alternatives where there is a directed edge from alternative $a$ to alternative $b$ if a strict majority of voters strictly prefer $a$ to $b$. Then the Copeland rule always picks an alternative with in-degree strictly less than $m/2$.
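For a complete tournament (odd voters, no pairwise ties), this property is easy to verify by simulation: the maximum-out-degree vertex has out-degree at least $(m-1)/2$, hence in-degree below $m/2$. A small illustrative check (not the paper's code):

```python
import random

def copeland_from_tournament(edges, m):
    """Given directed majority edges (i -> j means a strict majority prefers
    alternative i to j), return the Copeland choice (max out-degree) and its
    in-degree."""
    out = [0] * m
    indeg = [0] * m
    for i, j in edges:
        out[i] += 1
        indeg[j] += 1
    winner = max(range(m), key=lambda i: out[i])
    return winner, indeg[winner]

rng = random.Random(3)
m = 9
# Random complete tournament: orient every pair randomly.
edges = [(i, j) if rng.random() < 0.5 else (j, i)
         for i in range(m) for j in range(i + 1, m)]
winner, indeg = copeland_from_tournament(edges, m)
print(winner, indeg)
assert indeg < m / 2   # the winner is beaten by fewer than half the alternatives
```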
We now proceed to prove Theorem 4. As in the proof of Theorem 2, let $a^*$ denote the optimal alternative, and let $\mu = SC(a^*)/n$. Suppose we sample a subset $S$ of voters, of size $2k+1$. For $v_i \in S$, let $X_i = d(v_i, a^*)$. Order these voters so that $X_1 \le X_2 \le \cdots \le X_{2k+1}$, and let $v_{k+1}$ be the voter that corresponds to the median of this sequence. Let $Y = X_{k+1}$. Note from Lemma 2 that $\mathbb{E}[Y^p] = O(\mu^p)$ for $p \le k$.
Suppose the Copeland rule chooses an alternative $w$, and suppose $d(w, a^*) > 4Y$. We will derive a contradiction, thereby obtaining an upper bound for $d(w, a^*)$. Consider the ball centered around $a^*$ with radius $Y$; call this $B$. By definition, at least $k+1$ agents in $S$ lie within $B$. Note that for any $v \in S \cap B$, $d(v, a^*) \le Y$, so the favorite alternative $a_v$ of $v$ satisfies $d(a_v, a^*) \le 2Y$. Therefore, for any $u, v \in S \cap B$, we have
$$d(u, a_v) \le d(u, a^*) + d(a^*, a_v) \le 3Y.$$
Now, for any $u \in S \cap B$, we have
$$d(u, w) \ge d(w, a^*) - d(u, a^*) > 4Y - Y = 3Y.$$
Combining the above two observations, for all $u, v \in S \cap B$, we have $d(u, a_v) < d(u, w)$. This means that the set of at least $k+1$ voters in $S \cap B$, a strict majority of the sample, strictly prefer all of the favorite alternatives of voters in $S \cap B$ to $w$. From Lemma 3, this means that $w$ cannot be the Copeland winner. Thus, for $w$ to win in the Copeland rule, we must have $d(w, a^*) \le 4Y$. By the triangle inequality,
$$SC(w) \le \sum_{v \in N}\left(d(v, a^*) + d(a^*, w)\right) \le n\left(\mu + 4Y\right).$$
Using Jensen’s inequality in a fashion similar to the proof of Theorem 2, and using Lemma 2 to bound $\left(\mathbb{E}[(Y/\mu)^p]\right)^{1/p}$ by a constant for $p \le k$, we have:
$$D_p(FRC_k) \le 1 + 4\left(\mathbb{E}\left[(Y/\mu)^p\right]\right)^{1/p} = O(1),$$
so the normalized $p$-th moment of Distortion of $FRC_k$ is bounded by a constant. This completes the proof of Theorem 4. ∎
5 Empirical Simulation
In this section, we augment our theoretical worst case analysis with a qualitative empirical demonstration of the concentration achieved by the PRC and FRC mechanisms on real world data. We use data from the Participatory Budgeting project; see [GKSA15]. In this domain, there are a number of public projects (such as new sidewalks, park renovations, etc.). Each project has a monetary cost, and we want to select a set of projects subject to not exceeding a total budget. In participatory budgeting, local community members vote directly over their preferred projects, and these votes are aggregated to decide which projects to fund.
We consider knapsack voting data [GKSA15], where each voter reports the set of projects they most prefer, subject to the total budget constraint. This makes knapsack voting data particularly useful for us: voters select a single alternative out of a very large space, the power set of the projects. Because we also have information about the latent combinatorial space (the specific projects selected and their costs), we can impose simplistic but natural notions of distance that allow us to simulate our mechanisms and study their performance with respect to the imposed distance.
It is important to note that this is a simulation; actually running our mechanisms does not require specifying a notion of distance, and we do not know how these voters would have responded to ordinal queries in reality. We are treating an entire budget allocation as a single outcome and imputing preferences of voters over these outcomes. This reduces the problem to a single-winner election over a large space of alternatives, in keeping with the theoretical model in the paper. Therefore, natural baseline mechanisms are single-winner rules with small sample complexity, particularly Random Dictatorship, which is the best-known mechanism with respect to the first moment of Distortion. Other mechanisms for participatory budgeting are tailored to specific models of voter preferences over the combinatorial space of projects, and do not, in general, provide constant Distortion guarantees for arbitrary metrics.
We consider two simple notions of distance: budget distance and Jaccard distance. Suppose there are $m$ public projects numbered $1, \ldots, m$ with costs $c_1, \ldots, c_m$, and there is a total budget of $C$. A feasible budget is a set $T$ of projects such that $\sum_{j \in T} c_j \le C$. The budget distance between budgets $T_1$ and $T_2$ is the total cost of the projects in their symmetric difference, $\sum_{j \in T_1 \triangle T_2} c_j$. The Jaccard distance between $T_1$ and $T_2$ is $1 - \frac{|T_1 \cap T_2|}{|T_1 \cup T_2|}$. The social cost of a given budget is the average distance to the proposed budgets of the voters. We use knapsack voting data from the Participatory Budgeting election held in Cambridge, MA, USA in 2015. There were 945 voters and 23 projects (implying over 8 million possible budgets), along with a fixed total budget constraint.
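The two distances can be sketched as follows (hedged: we read "budget distance" as the total cost of the symmetric difference of the two project sets, and the project names and costs below are invented for illustration, not taken from the Cambridge data):

```python
def budget_distance(b1, b2, cost):
    """Cost-weighted symmetric-difference distance between two budgets,
    each represented as a set of project names (an assumed reading of the
    'budget distance' used in the simulations)."""
    return sum(cost[p] for p in b1 ^ b2)

def jaccard_distance(b1, b2):
    """Jaccard distance: 1 - |intersection| / |union| of the project sets."""
    if not (b1 | b2):
        return 0.0
    return 1.0 - len(b1 & b2) / len(b1 | b2)

# Hypothetical projects and costs for illustration.
cost = {"sidewalks": 100, "park": 250, "wifi": 50}
b1 = {"sidewalks", "park"}
b2 = {"park", "wifi"}
print(budget_distance(b1, b2, cost))   # 100 + 50 = 150
print(jaccard_distance(b1, b2))        # 1 - 1/3
```

Both functions are metrics on the space of budgets, so either can serve as the distance $d$ in the simulated metric implicit utilitarian model.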
In Figure 1, we present the box plots of the distributions of social cost of PRC and FRC alongside Random Dictatorship (RD) when simulating using budget distance and Jaccard distance respectively. The RD mechanism samples a single most preferred alternative uniformly at random and has Distortion at most 3 [AP17], which is asymptotically the best known bound for any randomized social choice mechanism for arbitrary metrics. The examples qualitatively verify that the PRC and FRC mechanisms do provide substantial concentration in terms of the approximation to the optimal social cost. Furthermore, in practice we observe better average performance of PRC and FRC over that of RD, despite RD’s theoretical optimality with respect to the first moment of Distortion. The results also show that FRC requires more samples to achieve similar performance as PRC. To summarize, even on real datasets, just a few additional samples provide substantially improved concentration.
6 Future Directions
There are several avenues of future research. Our mechanisms involve first sampling some alternatives and then putting them to a vote. Is there a truly one-shot mechanism that can bound higher moments of Distortion while only eliciting a constant amount of information from each voter with respect to the number of alternatives? Our intuition is that this should be impossible. Also, though our sample complexity bounds are tight, the exact constants in the Distortion bounds can likely be improved. However, this improvement may be nontrivial: we do not use the Distortion of Copeland as a black box, so results such as [MW19] do not directly improve our bounds. As in [MPSW19], it would be interesting to analyze the effect of bits of cardinal information on the sample complexity. For instance, what if we sample fewer voters, but these voters could express limited cardinal information? In a related vein, could methods that make voters interact, as in [FGMS17], help reduce the sample complexity of the process?
Kamesh Munagala was supported by NSF grants CCF-1408784, CCF-1637397, and IIS-1447554, ONR award N00014-19-1-2268, and awards from Adobe and Facebook.
- [ABE18] Elliot Anshelevich, Onkar Bhardwaj, Edith Elkind, John Postl, and Piotr Skowron. Approximating optimal social choice under metric preferences. Artificial Intelligence, 264:27–51, 2018.
- [AP17] Elliot Anshelevich and John Postl. Randomized social choice functions under metric preferences. Journal of Artificial Intelligence Research, 58(1):797–827, January 2017.
- [BCDL17] Sylvain Bouveret, Yann Chevaleyre, François Durand, and Jérôme Lang. Voting by sequential elimination with few voters. In Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence, IJCAI-17, pages 128–134, 2017.
- [BCE16] Felix Brandt, Vincent Conitzer, Ulle Endriss, Jérôme Lang, and Ariel D. Procaccia. Handbook of Computational Social Choice. Cambridge University Press, New York, New York, 2016.
- [BCH15] Craig Boutilier, Ioannis Caragiannis, Simi Haber, Tyler Lu, Ariel D Procaccia, and Or Sheffet. Optimal social choice functions: A utilitarian view. Artificial Intelligence, 227:190–213, 2015.
- [CDK17] Yu Cheng, Shaddin Dughmi, and David Kempe. Of the people: Voting is more effective with representative candidates. In Proceedings of the 2017 ACM Conference on Economics and Computation, EC ’17, 2017.
- [CP10] Ioannis Caragiannis and Ariel D. Procaccia. Voting almost maximizes social welfare despite limited communication. In Proceedings of the Twenty-Fourth AAAI Conference on Artificial Intelligence, AAAI’10, pages 743–748, 2010.
- [CS05] Vincent Conitzer and Tuomas Sandholm. Communication complexity of common voting rules. In Proceedings of the 6th ACM Conference on Electronic Commerce, EC ’05, pages 78–87, 2005.
- [DB15] Palash Dey and Arnab Bhattacharyya. Sample complexity for winner prediction in elections. In Proceedings of the 2015 International Conference on Autonomous Agents and Multiagent Systems, AAMAS’15, pages 1421–1430, Richland, SC, 2015. International Foundation for Autonomous Agents and Multiagent Systems.
- [DN15] Palash Dey and Y. Narahari. Estimating the margin of victory of an election using sampling. In Proceedings of the 24th International Conference on Artificial Intelligence, IJCAI’15, pages 1120–1126, 2015.
- [FFG16] Michal Feldman, Amos Fiat, and Iddan Golomb. On voting and facility location. In Proceedings of the 2016 ACM Conference on Economics and Computation, EC ’16, pages 269–286, 2016.
- [FGMP19] Brandon Fain, Ashish Goel, Kamesh Munagala, and Nina Prabhu. Random dictators with a random referee: Constant sample complexity mechanisms for social choice. In Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence, AAAI’19, 2019.
- [FGMS17] Brandon Fain, Ashish Goel, Kamesh Munagala, and Sukolsak Sakshuwong. Sequential deliberation for social choice. In Proceedings of the 13th International Conference on Web and Internet Economics, WINE’17, pages 177–190, 2017.
- [GAX17] Stephen Gross, Elliot Anshelevich, and Lirong Xia. Vote until two of you agree: Mechanisms with small distortion and sample complexity. In Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence, AAAI’17, 2017.
- [GKM17] Ashish Goel, Anilesh K. Krishnaswamy, and Kamesh Munagala. Metric distortion of social choice rules: Lower bounds and fairness properties. In Proceedings of the 2017 ACM Conference on Economics and Computation, EC ’17, pages 287–304, New York, NY, USA, 2017. ACM.
- [GKSA15] Ashish Goel, Anilesh K Krishnaswamy, Sukolsak Sakshuwong, and Tanja Aitamurto. Knapsack voting. Collective Intelligence, 2015.
- [Mil77] Nicholas R. Miller. Graph-theoretical approaches to the theory of voting. American Journal of Political Science, 21(4):769–803, 1977.
- [MPSW19] Debmalya Mandal, Ariel D Procaccia, Nisarg Shah, and David Woodruff. Efficient and thrifty voting by any means necessary. In Thirty-third Conference on Neural Information Processing Systems, pages 7178–7189, 2019.
- [MW19] Kamesh Munagala and Kangning Wang. Improved metric distortion for deterministic social choice rules. In Proceedings of the 2019 ACM Conference on Economics and Computation, EC’19, pages 245–262, 2019.
- [PR06] Ariel D. Procaccia and Jeffrey S. Rosenschein. The distortion of cardinal preferences in voting. In Proceedings of the 10th International Workshop on Cooperative Information Agents, pages 317–331, 2006.