1 Introduction
The aim of this contribution is to provide an overview of conceptual approaches to incorporating a decision maker’s nonknowledge into economic theory. We will focus here on the particular kind of nonknowledge which we consider to be one of the most important for economic discussions: nonknowledge of possible consequence-relevant uncertain events which a decision maker would have to take into account when selecting between different strategies.
It should be noted that — especially after the recent worldwide economic crisis — economics has been frequently blamed for neglecting this kind of nonknowledge. Allegedly it failed to incorporate unexpected events into its theoretical framework, which resulted in severe negative consequences for economies and societies. (For example, the subprime mortgage crisis or the Lehman Brothers bankruptcy of 2008 can be viewed as “Black Swan events” in the sense of Taleb (2007) [50].) We argue, however, that such sweeping accusations are not entirely justified. When one looks back at the long history of the debate on uncertainty and nonknowledge in economics, one will identify ongoing efforts to formalize these conceptually difficult issues by means of the mathematical language on the one hand, and tireless criticisms of this formal approach on the other. The first movement is often interpreted as essentially excluding nonknowledge from economic theory, while the second is considered a heroic effort to reestablish this issue in the scientific discourse (Frydman and Goldberg 2007 [19]; Akerlof and Shiller 2009 [2]). However, we would like to stress and demonstrate that both developments are deeply interwoven and, rather, mutually support and complement each other.
In the course of the debate, the theoretical representations of nonknowledge have taken some specific technical forms. In this paper, we review the historical development of two basic approaches to formalizing nonknowledge in economic theory, in the context of static one-shot choice situations for decision makers. These are

representations of nonknowledge of a decision maker in terms of probability measures, or related non-probabilistic measures, over sets of mutually exclusive and exhaustive consequence-relevant (past, present or, in most applications, future) states of Nature;

modelling unawareness of a decision maker of potentially important events by means of sets of states that are less complete than the full set of consequence-relevant states of Nature.
As is well known, the most popular method of dealing with nonknowledge in economic theory has been to formalize it by means of probability measures; this approach allowed one to quantify the matter and, thus, to rationalize and to “cultivate” it (Smithson 1989 [48, p 43]). Introduced into economic theory by Edgeworth, Jevons and Menger during the so-called “marginal revolution” of the late 19th century, probability measures, especially frequentist probability measures as probabilities learned from the past, were celebrated as instruments that allowed quantifying and measuring manifestations of uncertainty (cf. Bernstein 1996 [5, p 190ff]). However, the euphoria was halted by the critiques of Knight (1921) [33], Keynes (1921, 1937) [31, 32], Shackle (1949, 1959) [45, 46] and Hayek (1945) [26], who argued that the application of frequentist probability measures precludes systematic analysis of the principal nonknowledge of some consequence-relevant events. They initiated the first line of discussion on nonknowledge in economics and decision-making theory; namely, they raised the question as to what extent nonknowledge can be represented by means of measurable or immeasurable probability concepts, or whether other, non-probabilistic measures are necessary. Knight’s (1921) [33] solution, for example, was the famous distinction between risk, as situations where probabilities of uncertain events can be unambiguously and objectively determined, and uncertainty, as situations where they cannot be accurately measured and, therefore, should rather be treated as “estimates of the estimates,” or subjective probabilities.
This critique gave rise to an axiomatic approach to the definition of subjective probability measures by Ramsey (1931) [41] and de Finetti (1937) [18], who demonstrated that such measures can always be derived from the observed betting behaviour of a decision maker (namely their willingness to bet), and that they can be powerfully used to formalize a decision maker’s proclaimed utility maximization. Both authors helped to establish the concept of probabilistic sophistication, which posits that — even if objective probability measures cannot be determined — the decision maker’s behaviour can always be interpreted as if they had a subjective probability measure which they employ in their personal calculations of expected utility. In this approach, individual imprecise knowledge of consequence-relevant events was conceptualized to form the basis for the introduction of an adequate probability measure to represent this status, and this method rendered the whole discussion about the measurability and objectivity of probability measures obsolete for the years to come. Savage (1954) [42] famously combined probabilistic sophistication with expected utility theory as conceived originally by Bernoulli (1738) [4] and von Neumann and Morgenstern (1944) [40] to arrive at a subjective expected utility theory. Savage’s axiomatization of decision-making under conditions of uncertainty thus led to the formalization of nonknowledge of the likelihood of uncertain events in terms of a
unique (finitely additive) Bayes–Laplace prior probability measure
over a complete space of consequence-relevant states of Nature. The latter is assumed to be known to a decision maker before committing to a certain action. Yet, this theoretical move to “absorb” nonknowledge by means of probability distributions obviously precludes the consideration of “unknown unknowns” (e.g. Li 2009
[37, p 977]), as, by assumption, this space of states of Nature is common knowledge for all decision makers. The prior probability measures employed just formalize nonknowledge of which uncertain event from a given list of possibilities will occur. The incorporation of surprises into a theoretical framework, however, necessitates a notion of incomplete sets of uncertain events on the decision maker’s part. Surprising events, by definition, cannot be known at the instant of choice and, thus, cannot be part of the set of possible events known to a decision maker. However, many accounts which aspire to introduce true nonknowledge and uncertainty into economic theory primarily criticize the use of a unique and additive prior probability measure over a given set of states of Nature, but maintain the assumption that the latter set is finite and exhaustive, and that the states are mutually exclusive. These works thus pursue the first line of research mentioned above. In their attempt to formally deal with true uncertainty of some events, and so to reestablish this issue in economic theory, they replace the unique, additive prior probability measure by entire sets of additive prior probability measures (e.g. Gilboa and Schmeidler 1989 [22], Bewley 1986, 2002 [6, 7]), by non-additive prior probability measures (e.g. Schmeidler 1989 [44], Mukerji 1997 [39], Ghirardato 2001 [20]), or they introduce some alternative non-probabilistic concept such as fuzzy logic, possibility measures, and weights (Zadeh 1965, 1978 [51, 52]; Dubois and Prade 2011 [12]; Kahneman and Tversky 1979 [30]). We interpret all of these works as attempts to conceptualize Knightian uncertainty in mathematical terms. In all of these cases, nonknowledge is generally captured by an unknown probability measure.
However, we would like to stress that theorizing about the principal nonknowledge of some events necessitates the aforementioned representation of the incompleteness of a decision maker’s subjective space of consequence-relevant states of Nature, because only then can the failure of probability theory to represent nonknowledge and surprises adequately be overcome. This state of affairs motivates the discussion of the second line of research on nonknowledge in the list above. Here, we are dealing with attempts to formalize choice situations where decision makers are aware of the fact that they do not possess the full list of consequence-relevant states of Nature due to unforeseen contingencies. In our view, the development of this second line shifts emphasis from the issue of the importance of prior probability measures in dealing with nonknowledge to the more fundamental question as to what extent the full space of consequence-relevant states of Nature can actually be known to a decision maker in the first place.
In what follows, we first present in Section 2 the standard mathematical framework in terms of which discussions on the formal representation of nonknowledge and uncertainty in economic theory are usually conducted. Subsequently, we describe developments of the inclusion/exclusion movements of nonknowledge along the two lines mentioned above. In Section 3 we address the representation of nonknowledge based on the usage of (various kinds of) probability measures, while in Section 4 we discuss the representation of nonknowledge based on particular formal descriptions of the state space. In Section 5 we conclude with a discussion and provide a brief outlook.
2 The basic mathematical framework
In economic theory, nonknowledge of the likelihood of uncertain events at the initial decision stage of a static one-person, one-shot decision problem is the crucial feature. Formulated within the set-theoretic descriptive behavioural framework developed by Savage (1954) [42] and Anscombe and Aumann (1963) [3], a decision maker chooses from a set F of alternative acts. The set C of consequences of their choice (i.e., von Neumann–Morgenstern (1944) [40] lotteries over sets of outcomes X) depends on which relevant state of Nature out of an exclusive and exhaustive set S will occur following a decision (the state-contingency structure). In this framework, acts are perceived as mappings of states of Nature into consequences, f: S → C. An ordinal binary preference relation ≽ is defined over the set C, which in turn induces an analogous preference relation on the set F via the mapping f. The actual state of Nature that will be realized is usually understood as a move by the exogenous world which resolves all uncertainty (Nature “chooses” the state of the world; Debreu 1959 [8], Hirshleifer and Riley 1992 [29, p 7]). The decision maker does not know which consequence-relevant state of Nature will occur, but (like the modeller) has complete knowledge of all possibilities. In the subjective expected utility context of Savage (1954) [42] and Anscombe and Aumann (1963) [3], this kind of nonknowledge is formalized by means of a unique finitely additive prior probability measure p over the set of states of Nature S, which expresses the decision maker’s assessment of the likelihood of all uncertain events possible. Existence of such a prior probability measure (usually interpreted as representing a decision maker’s beliefs) is ensured provided the preference relation on F satisfies the five behavioural axioms of weak order, continuity, independence, monotonicity and nontriviality (see Anscombe and Aumann 1963 [3, p 203f], Gilboa 2009 [21, p 143f]).
As is well known, this axiomatization gives rise to a subjective expected utility representation of the preference relation ≽ on F in terms of a real-valued preference function V: F → ℝ, where for every act f ∈ F one defines

V(f) := Σ_{s ∈ S} p(s) E_{f(s)}[u] .

Here u: X → ℝ constitutes a decision maker’s real-valued personal utility function of an outcome x ∈ X, and it is unique up to positive linear transformations; E_{f(s)}[u] denotes the expectation value of u with respect to the von Neumann–Morgenstern lottery f(s). The decision maker weakly prefers an act f to an act g, namely f ≽ g, whenever V(f) ≥ V(g). It is presupposed here that the prior probability measure p is used to express the nonknowledge of the decision maker about exactly which state (from the given exhaustive list) will occur. Figure 1 outlines the structure of the decision matrix for a static one-person, one-shot choice problem in the subjective expected utility framework due to Savage (1954) [42] and Anscombe and Aumann (1963) [3]. We remark in passing that the exposition by the latter two authors in particular provided the formal basis for more recent decision-theoretical developments by Schmeidler (1989) [44], Gilboa and Schmeidler (1989) [22] and Schipper (2013) [43]. The decision matrix suggests that, besides coding uncertainty about the likelihood of events via a unique prior probability measure, there are at least two further ways to incorporate aspects of a decision maker’s nonknowledge: either to suppose that the particular kind of prior probability measure is unknown (while the set of consequence-relevant states of Nature is complete), or to accept that the set of consequence-relevant states of Nature can be known only incompletely. In the second case, nonknowledge of events is directly captured by means of nonknowledge of the full state space, hereby allowing for unexpected events. In what follows, we will discuss these two general possibilities to formally deal with nonknowledge in more detail — the application of various probability measures on the one hand, and the representation of incomplete state spaces on the other.
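To fix intuition, the subjective expected utility calculation just described can be sketched in a few lines of Python; the state names, lottery, and utility values below are purely hypothetical illustrations, not part of the formal framework:

```python
def seu(act, prior, u):
    """Subjective expected utility V(f) = sum over s of p(s) * E_{f(s)}[u].

    act   : maps each state to a lottery, i.e. a dict {outcome: probability}
    prior : dict {state: subjective prior probability p(s)}
    u     : real-valued utility function on outcomes
    """
    return sum(
        p_s * sum(q * u(x) for x, q in act[s].items())  # p(s) * E_{f(s)}[u]
        for s, p_s in prior.items()
    )

# Hypothetical example: act f pays a 50-50 lottery in a "boom" state
# and a sure low outcome in a "bust" state.
prior = {"boom": 0.7, "bust": 0.3}
f = {"boom": {"high": 0.5, "low": 0.5}, "bust": {"low": 1.0}}
u = {"high": 10.0, "low": 2.0}.get
value = seu(f, prior, u)  # 0.7*(0.5*10 + 0.5*2) + 0.3*2 = 4.8
```

Comparing two acts then amounts to comparing their values of V, exactly as in the weak-preference criterion above.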
3 First way of formalization: probabilistic and nonprobabilistic approaches
As already mentioned, the application of additive prior probability measures to capture nonknowledge about the likelihood of uncertain events has been the silver bullet of economic theory in dealing with this problem. In terms of elements of the decision matrix in Figure 1, both the modeller and the decision maker have complete knowledge of all consequence-relevant states of Nature, and of all possible outcomes/lotteries over outcomes contingent on these states. In this respect, both subjects need to be perceived as omniscient. However, throughout the entire history of applications of probability theory in its various manifestations in economics, discussions have revolved around the question of whether the different kinds of probability measures are measurable at all in economic settings, and thus suitable to represent nonknowledge of some events. According to Knight (1921) [33], the conceptual basis for such an operationalization is principally absent from economic life in most cases. Thus, mathematical and statistical probabilities are — though basically measurable — not applicable in an economic context.
The concerns as formulated in the works of Knight (1921) [33], Keynes (1921) [31], and also Shackle (1949) [45], however, were played down for a while by the opposing movement of strong formalization and deliberate exclusion of nonknowledge of the probability measure from the theoretical economic framework: Ramsey (1931) [41], de Finetti (1937) [18] and Savage (1954) [42] demonstrated that subjective probabilities can be measured in principle when taking a behavioural approach. Follow-up research, however, drew attention to cases in which nonknowledge of the probability measure is essential for decision-making. Especially after Ellsberg’s (1961) [14] paper, a new branch of research appeared that endeavoured to reintroduce absence of perfect knowledge of relevant probability measures into economic theory. Ellsberg (1961) [14] had demonstrated empirically that many people tend to prefer situations with known probability measures over situations with unknown ones, thus violating Savage’s (1954) [42] behavioural “sure-thing principle” axiom in particular. He explicitly referred to situations with unknown probability measures as “ambiguous” and named the phenomenon of avoiding such situations “ambiguity aversion” (this corresponds to the term “uncertainty aversion” coined by Knight 1921 [33]).
Subsequently, efforts to formalize Knightian uncertainty were resumed. Relevant work has developed in two directions (cf. Mukerji 1997 [39, p 24]). First, it was stressed that, in Savage’s (1954) [42] static choice framework, the decision maker ‘mechanically’ assigns probabilities without differentiating between those cases in which they have some knowledge and, thus, can reason about the likelihood of future events, and those cases in which they are completely ignorant about what might happen. Second, Savage’s (1954) [42] framework precludes modelling the decision maker “…who doubts his own ability to imagine and think through an exhaustive list of possible states of the world” (Mukerji 1997 [39, p 24]). Savage’s (1954) [42] axiomatization assumes that the decision maker is completely unaware of the limitations of their knowledge about the future. However, as surprises are a part of real life, this assumption is too strong and cuts back the power of the theory.
Both lines of research represent efforts to incorporate the limitations of a decision maker’s knowledge into economic theory. In the remaining parts of this section, we briefly discuss the development of the first line of research mentioned in the introduction, which employs alternative concepts of probability measures; in the next section, we then turn to review representations of nonknowledge by means of various formalizations of the state space.
Knight’s (1921) [33] work, and later Ellsberg’s (1961) [14] paradox, gave rise to the intuition that there are differences in how people assign and treat probability measures, and that those differences are related to the quality of the decision maker’s knowledge. Some probability measures are based on more or less reliable information (evidence, or knowledge), and some result from a default rule based on ignorance. For example, there should be a difference between a probability formed by an expert and one formed by a layman. The intuition behind Schmeidler’s (1989) [44] non-additive prior probability measures framework is exactly this: there is a difference between
“…bets on two coins, one which was extensively tested and was found to be fair, and another about which nothing is known. The outcome of a toss of the first coin will be assigned a 50–50 distribution due to ‘hard’ evidence. The outcome of a toss of the second coin will be assigned the same distribution in accord with Laplace’s principle of indifference. But as Schmeidler (1989) argues, the two distributions feel different, and, as a result, our willingness to bet on them need not to be the same” (Gilboa et al 2008 [23, p 179]).
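Schmeidler’s (1989) [44] intuition is commonly made operational via a non-additive capacity together with the Choquet integral. The following minimal Python sketch (coin labels and capacity values are hypothetical choices of ours) reproduces the two-coins example: the untested coin receives a subadditive capacity, so a bet on it is valued below an identical bet on the tested fair coin:

```python
def choquet_eu(utilities, capacity):
    """Choquet expected utility with respect to a capacity (a monotone,
    generally non-additive set function with capacity(empty set) = 0).

    utilities : dict {state: utility of the bet in that state}
    capacity  : function frozenset(states) -> [0, 1]
    """
    ranked = sorted(utilities, key=utilities.get, reverse=True)
    value, prev, cum = 0.0, 0.0, set()
    for s in ranked:  # integrate layer by layer, from the best state downwards
        cum.add(s)
        nu = capacity(frozenset(cum))
        value += utilities[s] * (nu - prev)
        prev = nu
    return value

bet_on_heads = {"heads": 1.0, "tails": 0.0}
tested = {frozenset({"heads"}): 0.5, frozenset({"tails"}): 0.5,
          frozenset({"heads", "tails"}): 1.0}
untested = {frozenset({"heads"}): 0.4, frozenset({"tails"}): 0.4,
            frozenset({"heads", "tails"}): 1.0}  # subadditive: 0.4 + 0.4 < 1

v_tested = choquet_eu(bet_on_heads, tested.get)      # 0.5
v_untested = choquet_eu(bet_on_heads, untested.get)  # 0.4
```

The gap between the two valuations is exactly the differing “willingness to bet” the quote refers to.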
The failure of Savage’s (1954) [42] model to account for the differences in knowledge quality in the two cases was called by Gilboa et al (2008) [23, p 181] “an agnostic position.” Ellsberg (1961) [14] underlined this issue empirically and demonstrated that the preference of a decision maker for “known” probabilities violates the “sure-thing principle” in Savage’s axiomatization: people do not necessarily behave as though they were subjective expected utility maximizers. To model a decision maker’s state of imperfect knowledge in such situations more accurately, it was suggested that the unique prior probability measure be replaced with an entire set of prior probability measures (Gilboa and Schmeidler 1989 [22], Bewley 1986, 2002 [6, 7]): nonknowledge regarding the likelihood of uncertain states of Nature is here linked to the number of elements contained in a decision maker’s set of prior probability measures used in calculations of expected utility of acts and consequences, and so is represented in a more comprehensive fashion than in Savage’s (1954) [42] framework. For example, to account for their ignorance, the decision maker assigns not a unique prior probability to an event but rather a certain range of values. In the case of the untested coin (when knowledge of the coin’s properties is vague or nonexistent), this range for heads/tails could be “between 45 and 55 percent.” We note that Epstein and Wang (1994) [15] later extended Gilboa and Schmeidler’s (1989) [22] multiple-priors approach to intertemporal settings.
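Under Gilboa and Schmeidler’s (1989) [22] maxmin criterion, an act is evaluated by its worst-case expected utility over the set of priors. A minimal sketch (the 45–55 percent range and the unit-stake bet are the hypothetical coin example):

```python
def maxmin_eu(act_utilities, priors):
    """Worst-case expected utility of an act over a set of prior
    probability vectors (Gilboa-Schmeidler maxmin criterion)."""
    return min(
        sum(p * u for p, u in zip(prior, act_utilities))
        for prior in priors
    )

# Bet paying 1 on heads, 0 on tails; priors for heads range over [0.45, 0.55].
priors = [(q, 1.0 - q) for q in (0.45, 0.50, 0.55)]
v = maxmin_eu((1.0, 0.0), priors)  # 0.45: the least favourable prior decides
```

The larger the set of priors, the more pronounced the decision maker’s ignorance, and the lower the maxmin valuation of an ambiguous bet.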
A different way to account for the limitations of a decision maker’s knowledge of future contingencies was the development of non-probabilistic concepts, for example, fuzzy logic and possibility theory (Zadeh 1965, 1978 [51, 52]; Dubois and Prade 2011 [12]). Interestingly, the economist Shackle (1961) [47], whose work was ignored for decades, was one of the founders of this particular line of research. For Shackle, possibility in particular expresses the incompleteness of a decision maker’s knowledge about the future, and thereby allows representing the “degree of potential surprise” of an event. Possibility as a measure of subjective nonknowledge is less precise (“fuzzier”) than probability and is based either on a numerical (quantitative) or on a qualitative scaling of events from “totally possible” to “impossible.” These measures need not be additive: two or more events can simultaneously be considered absolutely possible (or impossible, “surprising”). The modern formalized version of this idea suggests that there is a finite set of states S to which a possibility distribution π is assigned (Dubois and Prade 2011 [12, p 3]):
“A possibility distribution is a mapping π from S to a totally ordered scale L, with top 1 and bottom 0, such as the unit interval. The function π represents the state of knowledge of an agent (about the actual state of affairs) distinguishing what is plausible from what is less plausible, what is the normal course of things from what is not, what is surprising from what is expected.”
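The non-additivity just described is immediate in code. In the following minimal sketch (state names and possibility degrees are hypothetical), the possibility of an event is taken as the maximum of π over its states, and necessity as the usual dual:

```python
def possibility(event, pi):
    """Pi(A) = max over s in A of pi(s); pi maps states to [0, 1]."""
    return max((pi[s] for s in event), default=0.0)

def necessity(event, pi):
    """N(A) = 1 - Pi(complement of A): how strongly the evidence entails A."""
    return 1.0 - possibility(set(pi) - set(event), pi)

# A low pi-value plays the role of Shackle's "degree of potential surprise".
pi = {"boom": 1.0, "stagnation": 1.0, "crash": 0.2}

# Two disjoint events can both be fully possible -- the measure is not additive:
p_boom = possibility({"boom"}, pi)            # 1.0
p_stag = possibility({"stagnation"}, pi)      # 1.0
n_no_crash = necessity({"boom", "stagnation"}, pi)  # 1 - 0.2 = 0.8
```

Note that p_boom + p_stag exceeds 1, which no probability measure over disjoint events would permit.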
Despite this seemingly radical innovation, and some promising applications in economics (e.g., Dow and Ghosh 2009 [11]), possibility theory could not “revolutionize” decision theory. Zadeh (1978) [52, p 7], the founder of fuzzy logic, famously hinted that “our intuition concerning the behaviour of possibilities is not very reliable” and called for an axiomatization of possibilities “in the spirit of axiomatic approaches to the definition of subjective probabilities,” i.e., in line with Savage’s (1954) [42] axiomatization. To make the connection with decision theory, such a theoretical framework was successfully developed by Dubois et al (2001) [13]. Also, more generally, various probability–possibility transformations were discussed, i.e., how to translate, for example, quantitative possibilities into probabilities and vice versa. In the end, as Halpern (2005) [24, p 40] states, “possibility measures are yet another approach to assigning numbers to sets,” implying all the benefits and limits of alternative probability theories.
Finally — and this is very crucial for our discussion — note that the set of possible consequence-relevant states of Nature in all cases discussed in this section, i.e., in the case of a unique prior probability measure, in the case of a set of prior probability measures, for non-additive prior probability measures, and in the possibility framework, is assumed to be finite, so that a real surprise (a completely unexpected event) cannot be incorporated. However, to properly account for surprising events, this list should not be modelled as exhaustive. It is important to emphasise in this context that assigning subjective probability zero does not help to represent true unawareness of particular events because
“[s]tatements like ‘I am assigning probability zero to the event E because I am unaware of it’ are nonsensical, since the very statement implies that I think about the event E” (Schipper 2013 [43, p 739]; cf. also Dekel et al 1998 [9]).
By definition, a decision maker must be perfectly unaware of surprising events before committing to a specific action, and it lies in the very nature of surprise that this issue cannot be captured solely by means of more or less well-defined probability measures.
4 Second way of formalization: genuine nonknowledge of the state space and the possibility of true surprises
The second line of thought on incorporating nonknowledge on a decision maker’s part into economic theory likewise has its history and tradition. It was recognized by a number of authors that, in order to include true nonknowledge concerning future contingencies and surprising events into the framework of decision theory, it is necessary to shift research efforts from the issue of determination of adequate (prior) probability measures (i.e., risk and uncertainty in the modern economic parlance) to the issue of representation of a decision maker’s unawareness with respect to possible states of Nature beyond their imagination, which could also affect the consequences of their choice behaviour. This unawareness may be interpreted as a manifestation of a decision maker’s natural bounded rationality. Their nonknowledge should not be limited to just a lack of knowledge as to which state from the exhaustive list of states of Nature will materialize (“uncertainty about the true state”); rather, nonknowledge about the full state space itself should be a part of decision theory. This challenge was met in the economics literature in particular by Kreps (1979) [34], Fagin and Halpern (1988) [17], Dekel et al (1998, 2001) [9, 10], and Modica and Rustichini (1999) [38]. Their proposals presuppose a coarse (imperfect) subjective knowledge of all consequence-relevant states of Nature possible, and so criticize a central assumption in Savage’s (1954) [42] and Anscombe and Aumann’s (1963) [3] axiomatizations of a decision maker’s choice behaviour, suggesting a radical departure from their frameworks. First of all, proving two famous impossibility results, Dekel, Lipman and Rustichini (1998) [9] demonstrated that the standard partitional information structures of economic theory (i.e., the set-theoretic state space models discussed earlier in Section 2) preclude unawareness.
Specifically, in such settings, only two very extreme situations can be captured: either a decision maker has complete knowledge of the full space of consequencerelevant states of Nature (as has the modeller), or they have no knowledge of this state space whatsoever. In addition, Dekel et al (1998) [9] made explicit crucial epistemic properties of true unawareness: e.g., that it is necessarily impossible for a decision maker to be aware of their own unawareness (technically termed AU introspection); cf. also Heifetz et al (2006) [27].
Following this discussion, new accounts were developed which suggested different ways to depart from the set-theoretic state space concepts of Savage (1954) [42] and of Anscombe and Aumann (1963) [3]; foremost from their assumption of the existence of an exhaustive list of mutually exclusive consequence-relevant states of Nature which is available to both the modeller and the decision maker alike. These new accounts formalize a principally different kind of nonknowledge compared to the nonknowledge of (prior) probability measures over a complete state space: unawareness of potentially ensuing important events, or of additional future subjective contingencies. In terms of elements of the decision matrix in Figure 1, only the modeller now has complete knowledge of all consequence-relevant states of Nature, and of all possible outcomes/lotteries over outcomes contingent on these states. The decision maker has a restricted perception of matters depending on the awareness level they managed to attain.
Three ways to overcome Dekel et al’s (1998) [9] impossibility results concerning standard partitional information structures can be identified in the economics literature, two of which maintain the status of a (now enriched) state space concept as a primitive of the framework proposed. These are

the two-stage choice approach,

the epistemic approach, and

the set-theoretic approach.
We now briefly review these in turn.
One solution is to formalize an endogenous subjective state space of a decision maker as a derived concept, as was initially suggested by Kreps (1979, 1992) [34, 35], and then further developed by Dekel et al (2001) [10] and Epstein et al (2007) [16]. These researchers proposed a decision maker who is unaware of certain future subjective contingencies, and a modeller who can infer a decision maker’s subjective state space regarding these contingencies from observing the decision maker’s choice behaviour. (To a certain extent this strategy can be viewed as analogous to Savage’s (1954) [42] reconstruction of a decision maker’s beliefs from their revealed preferences.) Kreps (1979) [34] developed a twostage model in which a decision maker first chooses from a set of finite action menus. Subsequently, a particular state of Nature is realized. The decision maker chooses a specific action from the selected menu only afterwards. The central idea is that although the decision maker does not know all the states that are possible, they know their subjective subset of possibilities, and this subset is not exogenous. The decision maker anticipates future scenarios which affect their expected later choices from the action menus and their ex ante utility evaluation of these menus. Thus, these scenarios (or the subjective state space) form the basis for ordinal binary preference relations with respect to the menus and can be revealed through observation of those preferences.
The more unaware a decision maker is regarding consequence-relevant states of Nature, the more flexibility they prefer when choosing the menus during the first phase. This intuition was more rigorously formalized by Dekel et al (2001) [10], who provided the conditions required to determine the endogenous subjective state space uniquely. For example, they replaced the action menus by menus of lotteries over finite sets of actions, in the spirit of Anscombe and Aumann (1963) [3]. Epstein et al (2007) [16] proposed ways for the two-stage choice approach to account for a decision maker’s manifested uncertainty aversion according to Ellsberg’s (1961) [14] empirical result. The pioneers of the unawareness concept depart from Savage’s (1954) [42] and Anscombe and Aumann’s (1963) [3] axiomatizations by replacing the state space in the list of primitives by a set of menus of actions which are the objects of choice. This theoretical move allows for dealing with unforeseen contingencies due to a decision maker’s natural bounded rationality, the latter of which is manifested by their inability to list all the states of the exogenous world that could be relevant. For further details on this approach refer also to Svetlova and van Elst (2012) [49].
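The preference-for-flexibility intuition behind the two-stage choice approach can be sketched as follows (the scenarios, actions, and payoffs are hypothetical): a menu is valued ex ante by anticipating the best choice from it in each subjective scenario, so adding actions to a menu never hurts and typically helps:

```python
def menu_value(menu, subjective_states, u):
    """Kreps-style ex ante valuation of a menu: sum over anticipated
    subjective scenarios of the best payoff achievable from the menu."""
    return sum(max(u[a][s] for a in menu) for s in subjective_states)

# Payoffs of actions across two anticipated scenarios.
u = {"umbrella": {"rain": 3, "sun": 0},
     "sunscreen": {"rain": 0, "sun": 3}}
states = ("rain", "sun")

small = menu_value({"umbrella"}, states, u)               # 3
large = menu_value({"umbrella", "sunscreen"}, states, u)  # 6: flexibility pays
```

Observing that the decision maker strictly prefers the larger menu is precisely the kind of choice behaviour from which the modeller can reconstruct the subjective state space.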
In the epistemic approach to formalizing a decision maker’s unawareness, initiated by Fagin and Halpern (1988) [17], and subsequently pursued by Modica and Rustichini (1999) [38], Heifetz, Meier and Schipper (2008) [28], and Halpern and Rêgo (2008) [25], a modal logic syntax is employed to elucidate the fine-structure of the (consequence-relevant) states of Nature. Such states are here perceived as maximally consistent sets of propositions which are constructed from a set of countably many primitive propositions, their binary truth values, and a set of related inference rules defined on the set of propositions. The propositional logic models so obtained extend the standard Kripke (1963) [36] information structures of mathematics. The concrete awareness level attributed to a decision maker is associated with a specific subset of consistent propositions and their corresponding binary truth values; the awareness level varies with the number of elements in these subsets. Depending on the approach taken, the awareness level of a decision maker in a given state of Nature is expressed in terms of an explicit awareness modal operator defined over propositions (Fagin and Halpern 1988 [17]), or indirectly in terms of a knowledge modal operator (Modica and Rustichini 1999 [38], and Heifetz, Meier and Schipper 2008 [28]).
While Fagin and Halpern (1988) [17], in their multi-person awareness structure, deal with a single state space and propose two kinds of knowledge (implicit and explicit) a decision maker may have depending on their awareness level, Modica and Rustichini (1999) [38], in their one-person generalized partitional information structure, distinguish between the full state space associated with the modeller on the one hand, and the (typically lower-dimensional) subjective state space of the decision maker on the other, with a projection operator defined between these two kinds of spaces. A consequence of this construction is that a 2-valued propositional logic obtains in the full state space, while a 3-valued propositional logic applies in the decision maker’s subjective state space: a proposition of which they are not aware at a given state can be neither true nor false. A decision maker is thus unaware of a particular event when this event cannot be described in terms of states in their subjective state space. According to Halpern and Rêgo (2008) [25], an advantage of addressing the fine-structure of the states of Nature is that it offers a language of concepts for decision makers at a given state, as well as flexibility in covering different notions of awareness. Furthermore, these authors demonstrated that all of the propositional logic models of the epistemic approach to unawareness referred to above are largely equivalent. So far, however, propositional logic models have not been tied to any specific decision-theoretic framework.
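The distinction between the modeller’s full state space and the decision maker’s projected subjective state space, together with the resulting 3-valued logic, admits a minimal schematic instance. The sketch below is purely illustrative; the primitive propositions and the awareness set are our own assumptions, not part of the Modica and Rustichini (1999) [38] construction itself.

```python
from itertools import product

# Full state space: all truth assignments over the primitive
# propositions {p, q}, i.e. the modeller's description of Nature.
primitives = ["p", "q"]
full_space = [dict(zip(primitives, vals))
              for vals in product([True, False], repeat=2)]

def project(state, awareness):
    """Project a full state onto the propositions the agent is aware of."""
    return frozenset((prop, val) for prop, val in state.items()
                     if prop in awareness)

# A decision maker aware only of p: their subjective state space
# collapses the q-dimension of the full space.
awareness = {"p"}
subjective_space = {project(s, awareness) for s in full_space}

def truth(subjective_state, prop):
    """3-valued truth: True, False, or None if the agent is unaware of prop."""
    for q, val in subjective_state:
        if q == prop:
            return val
    return None  # neither true nor false at this subjective state

# q has no truth value in the subjective state space.
assert any(truth(s, "q") is None for s in subjective_space)
```

The full space has four states and the subjective space only two; any event defined through q cannot be described in the subjective space, which is precisely the formal sense of unawareness here.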
The set-theoretic approach, finally, can be viewed as a less refined subcase of the propositional logic models of the epistemic approach in that it discards the fine-structure of the states of Nature, thus leading to a syntax-free formalization of unawareness. The key realization here is that, in order to overcome Dekel et al’s (1998) [9] troubling impossibility results regarding a non-trivial representation of unawareness in standard partitional information structures, an entire hierarchy of disjoint state spaces of differing dimensionality should be introduced amongst the primitives of a decision-theoretic framework, so as to describe decision makers who have attained different levels of awareness.
Heifetz, Meier and Schipper (2006) [27] implement this insight by devising, in a multi-person context, a finite lattice of disjoint finite state spaces which encode decision makers’ different strengths of expressive power through the cardinality of these spaces. Hence, these state spaces share a natural partial rank order between them; each of them is associated with a specific awareness level of a decision maker. The uppermost state space in the hierarchy of this unawareness structure corresponds to a full description of the consequence-relevant states of Nature and may be identified either with an omniscient decision maker or with a modeller. The different state spaces are linked by projection operators from higher-ranked to lower-ranked spaces. These projection operators are non-invertible: they filter out knowledge existing at a higher level of awareness that cannot be expressed at a lower level. In this way, it is possible to formulate events at a given state of which a decision maker of a certain awareness level has no conception at all. A 3-valued logic applies in each state space, with the exception of the uppermost one, where the standard 2-valued logic obtains. A decision maker’s unawareness, respectively awareness, of a particular event is formally defined indirectly in terms of a knowledge operator, which satisfies all the properties demanded of such an operator in standard partitional information structures; cf. Dekel et al (1998) [9, 164f]. We remark that the Heifetz et al (2006) [27] proposal may have the potential to provide a framework for capturing Taleb’s (2007) [50] “Black Swan events” in a decision-theoretic context. For this purpose, a scenario is required in which no decision maker’s awareness level corresponds to the uppermost state space in the hierarchy.
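A minimal instance of such a lattice, reduced to two state spaces, can be sketched as follows. All names and events below are illustrative assumptions of ours, not drawn from Heifetz et al (2006) [27]; the point is only to show how projection makes an event inexpressible at a lower awareness level.

```python
# Upper space: states resolve two issues (weather, economy).
# Lower space: states resolve only the first issue.
S_up = [("rain", "crisis"), ("rain", "boom"),
        ("dry", "crisis"), ("dry", "boom")]
S_lo = ["rain", "dry"]

def r(omega):
    """Projection from the upper to the lower space: drops the second issue."""
    return omega[0]

# The event "crisis" is expressible in S_up ...
crisis = {w for w in S_up if w[1] == "crisis"}

# ... but its projection is the whole lower space: a decision maker whose
# awareness level corresponds to S_lo cannot distinguish "crisis" from
# its complement, i.e. has no conception of this event at all.
projected = {r(w) for w in crisis}
assert projected == set(S_lo)
```

Because the projection maps a four-element space onto a two-element one, information resolved only in the upper space is necessarily filtered out on the way down.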
A related set-theoretic framework was suggested by Li (2009) [37]. In her “product model of unawareness,” she distinguishes factual information on the (consequence-relevant) states of Nature from awareness information characterizing a decision maker, and so provides a formal basis for, again, differentiating between the full space of states of Nature and a decision maker’s (generically lower-dimensional) subjective state space. With a projection operator defined between these two spaces, events of which a decision maker is unaware can be made explicit.
In contrast to the epistemic approach, direct contact with decision theory was recently established by Schipper (2013) [43] for the set-theoretic unawareness structure of Heifetz et al (2006) [27]. He puts forward an awareness-dependent subjective expected utility proposal in the tradition of Anscombe and Aumann (1963) [3], where a set of awareness-dependent ordinal binary preference relations for a collection of decision makers is defined over a set of acts on the union of all state spaces in the lattice. Acts map consequence-relevant states of Nature in this union to von Neumann–Morgenstern (1944) [40] lotteries over outcomes contingent on these states. That is, preferences over acts can now depend on the awareness level of a decision maker, and thus may change upon receipt of new consequence-relevant information. This is clearly a major conceptual step forward for representations of nonknowledge in economic theory, especially since it addresses the important multi-person case. However, Schipper’s (2013) [43] proposal, too, is likely to be subject to Ellsberg’s (1961) [14] paradox, as decision makers’ experimentally manifested uncertainty aversion has not been formally addressed in his framework. In this respect, one expects that Schipper’s (2013) [43] work could be combined with the multiple-priors methodology of Gilboa and Schmeidler (1989) [22] in order to settle this matter, in analogy to Epstein et al’s (2007) [16] extension of the work by Dekel et al (2001) [10] in the context of the two-stage choice approach.
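The multiple-priors evaluation rule of Gilboa and Schmeidler (1989) [22] that such a combination would invoke can be sketched in a few lines. The states, utilities, and priors below are illustrative assumptions of ours and are not taken from Schipper’s (2013) [43] framework.

```python
def maxmin_eu(act_utility, priors):
    """Maxmin expected utility: the worst expected utility over a set of priors."""
    return min(
        sum(p[s] * act_utility[s] for s in act_utility)
        for p in priors
    )

# An ambiguous act over two states, and a non-singleton set of priors
# reflecting the decision maker's imprecise beliefs.
act = {"s1": 1.0, "s2": 0.0}
priors = [{"s1": 0.4, "s2": 0.6},
          {"s1": 0.6, "s2": 0.4}]

# Uncertainty aversion: the act is evaluated under its least favourable
# prior (here 0.4), below the 0.5 a single midpoint prior would yield.
ambiguous_value = maxmin_eu(act, priors)

# An unambiguous act (constant across states) is unaffected by the
# multiplicity of priors.
constant_value = maxmin_eu({"s1": 1.0, "s2": 1.0}, priors)
```

An awareness-dependent version would, in addition, let both the state space over which acts are defined and the set of priors vary with the decision maker’s awareness level.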
5 Discussion
Having reviewed the major approaches to nonknowledge in static decision-making frameworks, we would now like to address some open questions. We discussed concepts that investigate two key elements of the decision matrix in Figure 1: the space of consequence-relevant states of Nature, and prior probability measures over this space. Until recently, the two strands have been developed in detachment from one another: the respective papers have been concerned either with the determination of adequate probability measures, or with handling imperfect knowledge of the state space. The paper by Schipper (2013) [43], however, makes an important attempt to connect these two issues. In our view, further work should be done in this direction.
Moreover, other elements of the decision matrix, particularly the set of available actions and the set of possible consequences, have been widely excluded from the discussion about nonknowledge in economics to date. We suggest that more conceptual work be done to clarify whether it is justified to presuppose that actions and their consequences are perfectly known to decision makers, as has been the case in economic decision-making theory so far. Another important open question is how the elements of the decision matrix — probability measure, state space, actions and consequences — are related to each other. For example, recent research on performativity, reflexivity and nonlinearity (cf. the recent special issue on reflexivity and economics of the Journal of Economic Methodology) suggests that the actions chosen could causally influence the states of Nature; cf. also Gilboa (2009) [21].
These considerations raise the issue of the very nature of the states possible, the actions available, and their resultant consequences. It is important to understand properly what it means to know states, actions and consequences, and to be aware or unaware of them. For example, it obviously makes a difference whether one conceives of possible states as “states of nature” or as “states of the world” (Schipper 2013 [43, p 741]). The two types of states differ with respect to the role of the decision maker. In a “states-of-nature” approach, the decision maker, i.e., their beliefs and actions, is irrelevant for the construction of the state space: only Nature plays against them. Thus, the elimination of nonknowledge (of the future?) would depend on the improvement of our understanding of the physical world. If, however, we conceive of the states as “states of the world,” the decision maker’s beliefs and actions are part of the world description, which necessitates considering the interrelation of all elements of the decision matrix, as well as the interconnections between the decision matrices of different decision makers. For the conception of nonknowledge, our understanding of the social world would then be as relevant as our views about the physical world. We think these insights, which relate to epistemic game theory, should be developed further, though the complexity of the resultant theoretical framework might become its own constraint.
Finally, we would like to ask whether the assumption of omniscience on the part of the modeller in the unawareness concepts reviewed is justified. Is it warranted to presuppose that there is an institution that possesses a complete view of all possible states of Nature, while an ordinary decision maker has only imperfect knowledge of them? Heifetz et al (2006) [27, p 90] stress that “…unawareness …has to do with the lack of conception.” For us, this conception includes knowledge of the interrelationships between all elements of the decision matrix. But who possesses this knowledge? And, given the complexity of those interrelations, can anybody possess it at all? Economic modellers have not yet settled this issue.
Acknowledgments
We thank the editors of this volume and an anonymous referee for useful comments on an earlier draft.
References
 [2] Akerlof G A and Shiller R 2009 Animal Spirits: How Human Psychology Drives the Economy, and Why it Matters for Global Capitalism (Princeton, NJ: Princeton University Press)
 [3] Anscombe F J and Aumann R J 1963 A definition of subjective probability The Annals of Mathematical Statistics 34 199–205
 [4] Bernoulli D 1738 Specimen theoriae novae de mensura sortis; English translation: 1954 Exposition of a new theory on the measurement of risk Econometrica 22 23–36
 [5] Bernstein P L 1996 Against the Gods — The Remarkable Story of Risk (New York: Wiley)
 [6] Bewley T F 1986 Knightian decision theory: part I (Cowles Foundation: discussion paper)
 [7] Bewley T F 2002 Knightian decision theory: part I Decisions in Economics and Finance 25 79–110
 [8] Debreu G 1959 Theory of Value: An Axiomatic Analysis of Economic Equilibrium (New Haven, CT and London: Yale University Press)
 [9] Dekel E, Lipman B L and Rustichini A 1998 Standard statespace models preclude unawareness Econometrica 66 159–173
 [10] Dekel E, Lipman B L and Rustichini A 2001 Representing preferences with a unique subjective state space Econometrica 69 891–934
 [11] Dow S C and Ghosh D 2009 Fuzzy logic and Keynes’s speculative demand for money Journal of Economic Methodology 16 57–69
 [12] Dubois D and Prade H 2011 Possibility Theory and its applications: where do we stand? IRIT Institut de Recherche en Informatique de Toulouse, http://www.irit.fr/~Didier.Dubois/Papers1208/possibilityEUSFLATMag.pdf (accessed on June 23, 2014)
 [13] Dubois D, Prade H and Sabbadin R 2001 Decisiontheoretic foundations of qualitative possibility theory European Journal of Operational Research 128 459–478
 [14] Ellsberg D 1961 Risk, ambiguity, and the Savage axioms The Quarterly Journal of Economics 75 643–669
 [15] Epstein L G and Wang T 1994 Intertemporal asset pricing under Knightian uncertainty Econometrica 62 283–322
 [16] Epstein L G, Marinacci M and Seo K 2007 Coarse contingencies and ambiguity Theoretical Economics 2 355–394
 [17] Fagin R and Halpern J Y 1988 Belief, awareness, and limited reasoning Artificial Intelligence 34 39–76
 [18] de Finetti B 1937 La prévision: ses lois logiques, ses sources subjectives Annales de l’Institut Henri Poincaré 7 1–68
 [19] Frydman R and Goldberg M 2007 Imperfect Knowledge Economics: Exchange Rates and Risk (Princeton, NJ: Princeton University Press)
 [20] Ghirardato P 2001 Coping with ignorance: unforeseen contingencies and nonadditive uncertainty Economic Theory 17 247–276
 [21] Gilboa I 2009 Theory of Decision under Uncertainty (Cambridge: Cambridge University Press)
 [22] Gilboa I and Schmeidler D 1989 Maxmin expected utility with nonunique prior Journal of Mathematical Economics 18 141–153
 [23] Gilboa I, Postlewaite A W and Schmeidler D 2008 Probability and uncertainty in economic modelling Journal of Economic Perspectives 22 173–188
 [24] Halpern J Y 2005 Reasoning about Uncertainty (Cambridge, MA: MIT Press)
 [25] Halpern J Y and Rêgo L C 2008 Interactive unawareness revisited Games and Economic Behavior 62 232–262 [arXiv:cs/0509058v1 [cs.AI]]
 [26] Hayek F A 1945 The use of knowledge in society The American Economic Review 35 519–530
 [27] Heifetz A, Meier M and Schipper B C 2006 Interactive unawareness Journal of Economic Theory 130 78–94
 [28] Heifetz A, Meier M and Schipper B C 2008 A canonical model for interactive unawareness Games and Economic Behavior 62 304–324
 [29] Hirshleifer J and Riley J G 1992 The Analytics of Uncertainty and Information (Cambridge: Cambridge University Press)
 [30] Kahneman D and Tversky A 1979 Prospect Theory: an analysis of decision under risk Econometrica 47 263–292
 [31] Keynes J M 1921 A Treatise on Probability (London: Macmillan)
 [32] Keynes J M 1937 The general theory of employment The Quarterly Journal of Economics 51 209–233
 [33] Knight F H 1921 Risk, Uncertainty and Profit (Boston, MA: Houghton Mifflin)
 [34] Kreps D M 1979 A representation theorem for “preference for flexibility” Econometrica 47 565–577
 [35] Kreps D M 1992 Static choice and unforeseen contingencies Economic Analysis of Markets and Games: Essays in Honor of Frank Hahn ed P Dasgupta, D Gale, O Hart and E Maskin (Cambridge, MA: MIT Press) 259–281
 [36] Kripke S 1963 Semantical analysis of modal logic Z. Math. Logik Grundl. Math. 9 67–96
 [37] Li J 2009 Information structures with unawareness Journal of Economic Theory 144 977–993
 [38] Modica S and Rustichini A 1999 Unawareness and partitional information structures Games and Economic Behavior 27 265–298
 [39] Mukerji S 1997 Understanding the nonadditive probability decision model Economic Theory 9 23–46
 [40] von Neumann J and Morgenstern O 1944 Theory of Games and Economic Behavior (Princeton, NJ: Princeton University Press)
 [41] Ramsey F P 1931 Truth and probability The Foundations of Mathematics and Other Logical Essays (London: Routledge and Kegan Paul) 156–198
 [42] Savage L J 1954 The Foundations of Statistics (New York: Wiley)
 [43] Schipper B C 2013 Awarenessdependent subjective expected utility International Journal of Game Theory 42 725–753
 [44] Schmeidler D 1989 Subjective probability and expected utility without additivity Econometrica 57 571–587
 [45] Shackle G L S 1949 Expectations in Economics (Cambridge: Cambridge University Press)
 [46] Shackle G L S 1959 Time and thought The British Journal for the Philosophy of Science 9 285–298
 [47] Shackle G L S 1961 Decision, Order and Time in Human Affairs 2nd edition (Cambridge: Cambridge University Press)
 [48] Smithson M 1989 Ignorance and Uncertainty: Emerging Paradigms (New York: Springer)
 [49] Svetlova E and van Elst H 2013 How is nonknowledge represented in economic theory? Ungewissheit als Herausforderung für die ökonomische Theorie: Nichtwissen, Ambivalenz und Entscheidung eds B Priddat and A Kabalak (Marburg: Metropolis) 41–72 [arXiv:1209.2204v1 [q-fin.GN]]
 [50] Taleb N N 2007 The Black Swan — The Impact of the Highly Improbable (London: Penguin)
 [51] Zadeh L A 1965 Fuzzy sets Information and Control 8 338–353
 [52] Zadeh L A 1978 Fuzzy sets as a basis for a theory of possibility Fuzzy Sets and Systems 1 3–28