# A Description Logic Framework for Commonsense Conceptual Combination Integrating Typicality, Probabilities and Cognitive Heuristics

We propose a nonmonotonic Description Logic of typicality able to account for the phenomenon of combining prototypical concepts. The proposed logic relies on the logic of typicality ALC+TR, whose semantics is based on the notion of rational closure, as well as on the distributed semantics of probabilistic Description Logics, and is equipped with a cognitive heuristic used by humans for concept composition. We first extend the logic of typicality ALC+TR with typicality inclusions whose intuitive meaning is that "there is probability p that typical Cs are Ds". As in the distributed semantics, we define different scenarios containing only some typicality inclusions, each one having a suitable probability. We then focus on those scenarios whose probabilities belong to a given and fixed range, and we exploit such scenarios in order to ascribe typical properties to a concept C obtained as the combination of two prototypical concepts. We also show that reasoning in the proposed Description Logic is EXPTIME-complete, as for the underlying ALC.


## 1 Introduction

Inventing novel concepts by combining the typical knowledge of pre-existing ones is among the most creative cognitive abilities exhibited by humans. This generative phenomenon highlights some crucial aspects of knowledge processing in human cognition and concerns high-level capacities associated with creative thinking and problem solving. Still, it represents an open challenge in the field of artificial intelligence (AI) Boden (1998). Dealing with this problem requires, from an AI perspective, the harmonization of two conflicting requirements that are hardly accommodated in symbolic systems (including formal ontologies Frixione and Lieto (2012)): the need for syntactic and semantic compositionality (typical of logical systems) and the need to exhibit typicality effects. According to a well-known argument Osherson and Smith (1981), in fact, prototypes are not compositional. The argument runs as follows: consider a concept like pet fish. It results from the composition of the concept pet and of the concept fish. However, the prototype of pet fish cannot result from the composition of the prototypes of pet and fish: e.g. a typical pet is furry and warm, a typical fish is grayish, but a typical pet fish is neither furry and warm nor grayish (typically, it is red).

In this work we provide a framework able to account for this type of human-like concept combination. We propose a nonmonotonic Description Logic (from now on DL) of typicality called T^CL (typicality-based compositional logic). This logic combines two main ingredients. The first one relies on the DL of typicality introduced in Giordano et al. (2015). In this logic, "typical" properties can be directly specified by means of a "typicality" operator T enriching the underlying DL, and a TBox can contain inclusions of the form T(C) ⊑ D to represent that "typical Cs are also Ds". As a difference with standard DLs, in this logic one can consistently express exceptions and reason about defeasible inheritance as well. For instance, a knowledge base can consistently express that "normally, athletes are fit", whereas "sumo wrestlers usually are not fit", by suitable typicality inclusions

given that sumo wrestlers are athletes. The semantics of the T operator is characterized by the properties of rational logic Lehmann and Magidor (1992), recognized as the core properties of nonmonotonic reasoning. The logic is characterized by a minimal model semantics corresponding to an extension to DLs of the notion of rational closure defined in Lehmann and Magidor (1992) for propositional logic: the idea is to adopt a preference relation among models, where intuitively a model is preferred to another one if it contains fewer exceptional elements, as well as a notion of minimal entailment restricted to models that are minimal with respect to such a preference relation. As a consequence, the logic inherits well-established properties like specificity and irrelevance: in the example, the logic allows us to infer that typical bald athletes are fit (being bald is irrelevant with respect to being fit) and, if one knows that Hiroyuki is a typical sumo wrestler, to infer that he is not fit, giving preference to the most specific information.

As a second ingredient, we consider a distributed semantics similar to the DISPONTE semantics proposed by Riguzzi et al. (2015b, a) for probabilistic extensions of DLs, allowing one to label inclusions (and facts) with degrees representing probabilities, but restricted - in T^CL - to typicality inclusions. Our basic idea is to label typicality inclusions with a real number between 0.5 and 1, representing their probabilities (Footnote 1: We want to stress that, as in any probabilistic formal framework, probabilities are assumed to come from an application domain. This is true also for other frameworks such as, for example, fuzzy logics or probabilistic extensions of logic programs. In this paper we focus on the proposal of the formalism itself; the machinery for obtaining probabilities from a dataset of the application domain is therefore out of scope.), assuming that each axiom is independent of the others (as in the DISPONTE semantics). The resulting knowledge base defines a probability distribution over

scenarios: roughly speaking, a scenario is obtained by choosing, for each typicality inclusion, whether it is considered as true or false. In a slight extension of the above example, we could need to represent that both the typicality inclusions about athletes and sumo wrestlers have a given probability, whereas we also believe that athletes are usually young, with a higher probability, with the following KB:

We consider eight different scenarios, representing all possible combinations of the typicality inclusions: as an example, one of them represents the scenario in which (2) and (4) hold, whereas (3) does not. We equip each scenario with a probability depending on those of the involved typicality inclusions; we can then restrict our attention to scenarios whose probabilities belong to a given and fixed range.

As an additional element of the proposed formalization, we employ a method inspired by cognitive semantics Kamp and Partee (1995); Smith et al. (1988); Hampton (1987, 1988) for the identification of a dominance effect between the concepts to be combined. Namely, for every combination we distinguish a HEAD and a MODIFIER, where the HEAD represents the stronger element of the combination. The basic idea is as follows: given a KB and two concepts C_H and C_M occurring in it, where C_H is the HEAD and C_M the MODIFIER, we consider only some scenarios in order to define a revised knowledge base, enriched by the typical properties of the combined concept C = C_H ⊓ C_M. Such scenarios are those that are (i) consistent with respect to the initial knowledge base; (ii) not trivial, i.e. we discard the scenarios with the highest probability, containing either all the properties that can be consistently ascribed to C, or all the properties of the HEAD that can be consistently ascribed to C; and (iii) giving preference to the typical properties of the HEAD (with respect to those of C_M) having the highest probability.

We are able to exploit the logic T^CL from two different perspectives. On the one hand, we show that it is able to capture well-established examples from the cognitive science literature on concept combination and, as such, we argue that T^CL is a promising candidate for tackling the problem of typicality-based concept combination (Sections 4.1, 4.2 and 4.3). On the other hand, we use T^CL as a tool for the generation and exploration of novel creative concepts (Section 5), which could be useful in many application scenarios, ranging from video games to the creation of new movie or story characters.

As a further result, we show that the proposed approach comes essentially at no additional cost, in the sense that reasoning in T^CL is ExpTime-complete, as for the underlying standard Description Logic ALC.

The plan of the paper is as follows. In Section 2 we recall the two semantics that represent the starting points of our proposal: the DISPONTE semantics for probabilistic DLs and the rational closure for the logic of typicality. In Section 3 we present the logic T^CL for concept combination, and we show its reasoning complexity. In Section 4 we show that the proposed logic is able to capture some well-known and paradigmatic examples of concept combination coming from the cognitive science literature. In Section 5 we exploit the logic T^CL in the application domain of computational creativity: i.e. we show how T^CL can be used for inventing/generating novel concepts as the result of the combination of two (or more) prototypes. In Section 6 we proceed further by showing that the logic T^CL can be iteratively applied to combine prototypical concepts already resulting from the combination of prototypes. We conclude in Section 7 by mentioning some related approaches addressing the problem of commonsense concept combination, as well as by discussing possible future work.

## 2 Background: probabilistic DLs and DLs of typicality

The main aim of this work is to introduce a nonmonotonic Description Logic able to deal with the combination of prototypical concepts. In order to achieve this goal, we exploit two well established logical frameworks:

• the DISPONTE semantics of probabilistic extensions of DLs

• the nonmonotonic logic of typicality based on a notion of rational closure for DLs.

In this section we briefly recall such ingredients, before introducing our proposal in Section 3.

### 2.1 Probabilistic DLs: the DISPONTE semantics

A probabilistic extension of Description Logics under the distribution semantics is proposed in Riguzzi et al. (2015a). In this approach, called DISPONTE, the authors propose the integration of probabilistic information with DLs based on the distribution semantics for probabilistic logic programs Sato (1995). The basic idea is to label inclusions of the TBox as well as facts of the ABox with a real number between 0 and 1, representing their probabilities, assuming that each axiom is independent of the others. The resulting knowledge base defines a probability distribution over worlds: roughly speaking, a world is obtained by choosing, for each axiom of the KB, whether it is considered as true or false. The distribution is further extended to queries, and the probability of a query is obtained by marginalizing the joint distribution of the query and the worlds.

As an example, consider the following variant of the knowledge base inspired by the people and pets ontology in Riguzzi et al. (2015a):

(1)
(2)
(3)
(4)

The inclusion (1) expresses that individuals that own a pet are nature lovers with a 30% probability, whereas (2) is used to state that cats are pets with probability 60%. The ABox fact (3) represents that Tom is a cat with probability 90%. Inclusions (1), (2) and (3) are probabilistic axioms, whereas (4) is a certain axiom, that must always hold. The KB has the following eight possible worlds:

representing all possible combinations of considering/not considering each probabilistic axiom. For instance, one world represents the situation in which (1) and (3) hold, whereas (2) does not. The query

 NatureLover(kevin)

is true only in the last world, i.e. the one in which (1), (2) and (3) are all true, whereas it is false in all the other ones. The probability of such a query is therefore 0.3 · 0.6 · 0.9 = 0.162.
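The world enumeration and the query probability above can be sketched concretely. This is a minimal illustration, not DISPONTE itself: the axiom labels are just strings, and only the probabilities 0.3, 0.6 and 0.9 are taken from the example.

```python
from itertools import product

# The three probabilistic axioms of the example, by label:
# (1) pet owners are nature lovers, p = 0.3
# (2) cats are pets, p = 0.6
# (3) Tom is a cat, p = 0.9
probs = {"(1)": 0.3, "(2)": 0.6, "(3)": 0.9}

# A world fixes, for each probabilistic axiom, whether it holds.
worlds = [dict(zip(probs, choice))
          for choice in product([True, False], repeat=len(probs))]
assert len(worlds) == 8  # all combinations of the three axioms

def world_probability(world):
    """Product of p for axioms that hold and 1 - p for those that do not."""
    p = 1.0
    for axiom, holds in world.items():
        p *= probs[axiom] if holds else 1.0 - probs[axiom]
    return p

# The query is true only in the world where (1), (2) and (3) all hold,
# so marginalizing reduces to the probability of that single world.
query_prob = sum(world_probability(w) for w in worlds
                 if w["(1)"] and w["(2)"] and w["(3)"])
print(query_prob)  # 0.3 * 0.6 * 0.9
```

Note that the eight world probabilities sum to 1, as the distribution semantics requires.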

### 2.2 Reasoning about typicality in DLs: the nonmonotonic logic ALC+TR

The logic ALC+TR is obtained by adding the typicality operator T to standard ALC Giordano et al. (2009). The intuitive idea is that T(C) selects the typical instances of a concept C. We can therefore distinguish between the properties that hold for all instances of a concept C (C ⊑ D), and those that only hold for the normal or typical instances of C (T(C) ⊑ D).

The semantics of the T operator can be given by means of a set of postulates that are a reformulation of the axioms and rules of nonmonotonic entailment in rational logic R  Lehmann and Magidor (1992): in this respect, an assertion of the form T(C) ⊑ D is equivalent to the conditional assertion C |~ D in R. The basic ideas are as follows: given a domain Δ^I and an evaluation function .^I, one can define a function f_T that selects the typical instances of any S ⊆ Δ^I; in the case S = C^I for a concept C, the selection function selects the typical instances of C, namely:

 (T(C))^I = f_T(C^I).

f_T has the following properties for all subsets S of Δ^I, which are essentially a restatement of the properties characterizing rational logic R:

The semantics of the T operator can be equivalently formulated in terms of rational models Giordano et al. (2015): a model is any structure ⟨Δ^I, <, .^I⟩ where Δ^I is the domain and < is an irreflexive, transitive, well-founded and modular (for all x, y, z in Δ^I, if x < y then either x < z or z < y) relation over Δ^I. In this respect, x < y means that x is "more normal" than y, and the typical members of a concept C are the minimal C-elements with respect to this relation. An element x ∈ Δ^I is a typical instance of some concept C if x ∈ C^I and there is no C-element in Δ^I more typical than x. In detail, .^I is the extension function that maps each concept C to C^I ⊆ Δ^I, and each role R to R^I ⊆ Δ^I × Δ^I. For concepts of ALC, C^I is defined as usual. For the T operator, we have

 (T(C))^I = Min_<(C^I),

where Min_<(S) = {x ∈ S : there is no y ∈ S such that y < x}.

Given standard definitions of satisfiability of a KB in a model, we define a notion of entailment in ALC+TR. Given a query F (either an inclusion, an assertion C(a), or an assertion of the form T(C)(a)), we say that F is entailed from a KB if F holds in all ALC+TR models satisfying the KB.

Even if the typicality operator T itself is nonmonotonic (i.e. T(C) ⊑ E does not imply T(C ⊓ D) ⊑ E), what is inferred from a KB can still be inferred from any KB' with KB ⊆ KB', i.e. the logic ALC+TR is monotonic. In order to perform useful nonmonotonic inferences, in Giordano et al. (2015) the authors have strengthened the above semantics by restricting entailment to a class of minimal models. Intuitively, the idea is to restrict entailment to models that minimize the atypical instances of a concept. The resulting logic corresponds to a notion of rational closure on top of ALC+TR. Such a notion is a natural extension of the rational closure construction provided in Lehmann and Magidor (1992) for propositional logic.

The nonmonotonic semantics of ALC+TR relies on minimal rational models that minimize the rank of domain elements. Informally, given two models of a KB, one in which a given domain element has rank 2 and another in which it has rank 1, we prefer the latter, as in this model the element is assumed to be "more typical" than in the former.

Query entailment is then restricted to minimal canonical models. The intuition is that a canonical model contains all the individuals that enjoy properties that are consistent with the KB. A model is a minimal canonical model of a KB if it satisfies the KB, it is minimal and it is canonical (Footnote 2: In Theorem 10 of Giordano et al. (2015) the authors have shown that, for any consistent KB, there exists a finite minimal canonical model of the KB.). A query is minimally entailed from a KB if it holds in all minimal canonical models of the KB. In Giordano et al. (2015) it is shown that query entailment in ALC+TR is in ExpTime.

## 3 A Logic for Concept Combination

In this section, we introduce a new nonmonotonic Description Logic T^CL that combines the semantics based on the rational closure of ALC+TR Giordano et al. (2015) with the DISPONTE semantics Riguzzi et al. (2015a, b) of probabilistic DLs.

Taking inspiration from Lieto et al. (2017), in our representational assumptions we consider two different types of properties associated with a given concept: rigid and typical. Rigid properties are those defining a concept, e.g. C ⊑ D (all Cs are Ds). Typical properties are represented by inclusions equipped with a degree of belief expressed through probabilities, as in the DISPONTE semantics. Additionally, as mentioned, we employ insights coming from cognitive science for the determination of a dominance effect between the concepts to be combined, distinguishing between a concept HEAD and a MODIFIER. Since conceptual combination is usually expressed via natural language, we consider the following common situations: in an ADJECTIVE-NOUN combination (for instance, red apple) the HEAD is represented by the NOUN (apple) and the MODIFIER by the ADJECTIVE (red). In the more complex case of NOUN-NOUN combinations (for instance, pet fish), usually the HEAD is represented by the last expressed concept (fish in this case). As we will see, however, in the NOUN-NOUN case (i.e. the one we will take into account in this paper) there does not exist a clear rule to follow (Footnote 3: It is worth noting that a general framework for the automatic identification of a HEAD/MODIFIER combination is currently not available in the literature. In this work we take for granted that some method for the correct identification of these pairs exists, and we focus on the reasoning part.).

The language of T^CL extends the basic DL ALC with typicality inclusions of the form T(C) ⊑ D, equipped with a real number p ∈ (0.5, 1) - observe that this is an open interval, whose extremes are not included - representing its probability, whose meaning is that "normally, Cs are also Ds with probability p" (Footnote 4: The reason why we only allow typicality inclusions equipped with probabilities p > 0.5 is detailed below.).

###### Definition 1 (Language of T^CL)

We consider an alphabet of concept names, of role names, and of individual constants; complex concepts are built from these as in standard ALC. We define a knowledge base K = ⟨R, T, A⟩ where:

R is a finite set of rigid properties of the form C ⊑ D;

T is a finite set of typicality properties of the form

 p :: T(C)⊑D

where p ∈ (0.5, 1) is the probability of the typicality inclusion;

A is the ABox, i.e. a finite set of formulas of the form either C(a) or R(a, b), where a and b are individual constants and R is a role name.

###### Example 1

Let us consider and extend the previous example about athletes and sumo wrestlers, already mentioned in the Introduction. In the logic T^CL we can have a knowledge base K = ⟨R, T, A⟩ as follows:

:

:

:

Rigid properties of R are intended as usual in standard ALC: all sumo wrestlers are athletes, and all athletes are human beings. Typicality properties of T represent the following facts, respectively:

• usually, athletes are fit, and this fact has the specified probability;

• typical sumo wrestlers are not fit, with the specified probability;

• normally, athletes are young persons, with the specified probability.

The ABox facts are used to represent that Roberto is an athlete, whereas Hiroyuki is a sumo wrestler.

We remind the reader that, since we exploit the logic of typicality ALC+TR, our logic inherits its nonmonotonic reasoning capabilities. For instance, given the above knowledge base, we can infer

the last one stating that being bald is irrelevant with respect to being fit. Furthermore, since we know that Hiroyuki is a sumo wrestler and Roberto is an athlete, we can infer the following facts:

Observe that, in the last one, the logic gives preference to the most specific information (Hiroyuki is both an athlete and a sumo wrestler).

It is worth noticing that we avoid typicality inclusions with degree 1. Indeed, an inclusion 1 :: T(C) ⊑ D would mean that D is a certain property of Cs, which we represent with the rigid inclusion C ⊑ D. Also, observe that we only allow typicality inclusions equipped with probabilities p > 0.5. The reasons guiding this choice are the following:

• the very cognitive notion of typicality derives from the notion of probability distribution Rosch (1975); in particular, the typical properties attributed to entities are those characterizing the majority of the instances involved;

• in our effort to integrate two different semantics - DISPONTE and the typicality logic - the choice of allowing only probabilities higher than 0.5 for typicality inclusions seems to be the only one compliant with both formalisms. In fact, although the DISPONTE semantics allows one to assign also low probabilities/degrees of belief to standard inclusions, in the logic T^CL it would be misleading to allow low probabilities for typicality inclusions. For example, the logic does not allow a low-probability inclusion on T(Student) ⊑ Young, which could be interpreted as "normally, students are not young people". Please note that this is not a limitation of the expressivity of the logic T^CL: we can in fact represent properties not holding for typical members of a category. For instance, if one needs to represent that typical students are not married, we can use a high-probability inclusion on T(Student) ⊑ ¬Married, rather than a low-probability one on T(Student) ⊑ Married.

Following the DISPONTE semantics, each axiom is independent of the others. This avoids the problem of dealing with probabilities of inconsistent inclusions. Let us consider the following knowledge base:

Even in the scenarios where both conflicting typicality inclusions are considered, the two probabilities describe, respectively, the probability of having exceptional students paying working taxes, and the probability of having exceptional working students not paying working taxes; both probabilistic inclusions are acceptable due to the independence assumption. The two probabilities will contribute to the definition of the probability of such a scenario (as we will describe in Definition 7). It is worth noticing that the underlying logic of typicality allows us to get for free the correct way of reasoning in this case: namely, if the ABox contains the information that some individual is a working student, we obtain that he pays working taxes.

A model in the logic T^CL extends standard ALC models by a preference relation among domain elements, as in the logic of typicality Giordano et al. (2015). In this respect, x < y means that x is "more normal" than y, and the typical members of a concept C are the minimal C-elements with respect to this relation (Footnote 5: It would be possible to consider an alternative semantics whose models are equipped with multiple preference relations, whence with multiple typicality operators. In this case, it would be possible to distinguish different aspects of exceptionality; however, the approach based on a single preference relation in Giordano et al. (2015) ensures good computational properties (reasoning in the resulting nonmonotonic logic has the same complexity as the standard ALC), whereas adopting multiple preference relations could lead to higher complexities.). An element x ∈ Δ^I is a typical instance of some concept C if x ∈ C^I and there is no C-element in Δ^I more normal than x. Formally:

###### Definition 2 (Model of T^CL)

A model is any structure

 ⟨Δ^I, <, .^I⟩

where:

• Δ^I is a non-empty set of items called the domain;

• < is an irreflexive, transitive, well-founded and modular (for all x, y, z in Δ^I, if x < y then either x < z or z < y) relation over Δ^I;

• .^I is the extension function that maps each atomic concept C to C^I ⊆ Δ^I, and each role R to R^I ⊆ Δ^I × Δ^I, and is extended to complex concepts as follows:

• (T(C))^I = Min_<(C^I), where Min_<(S) = {x ∈ S : there is no y ∈ S such that y < x}.

A model can be equivalently defined by postulating the existence of a function k_M assigning a finite rank to each domain element Giordano et al. (2015): the rank of x is the length of the longest chain from x to a minimal element x_0, i.e. an x_0 such that there is no x' with x' < x_0. The rank function k_M and the relation < can be defined from each other by letting x < y if and only if k_M(x) < k_M(y).
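The correspondence between the preference relation and the rank function can be made concrete with a small sketch. The three-element domain and the relation below are hypothetical; the rank of an element is computed as the length of the longest chain down to a minimal element, as in the definition above.

```python
# (y, x) in `less` means y < x, i.e. y is "more normal" than x.
# Hypothetical domain {a, b, c} with c < b < a (and c < a by transitivity).
less = {("b", "a"), ("c", "b"), ("c", "a")}
domain = {"a", "b", "c"}

def rank(x):
    """Length of the longest <-chain from x down to a minimal element."""
    below = [y for (y, z) in less if z == x]  # elements more normal than x
    if not below:
        return 0                              # x is minimal, rank 0
    return 1 + max(rank(y) for y in below)

ranks = {x: rank(x) for x in sorted(domain)}
print(ranks)  # {'a': 2, 'b': 1, 'c': 0}

# Sanity check of one direction of the correspondence:
# whenever y < x, the rank of y is strictly smaller than the rank of x.
assert all(rank(y) < rank(x) for (y, x) in less)
```

Modularity of < is what makes the converse direction work as well: elements become totally preordered by rank.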

###### Definition 3 (Model satisfying a knowledge base in T^CL)

Let K = ⟨R, T, A⟩ be a KB. Given a model M = ⟨Δ^I, <, .^I⟩, we assume that .^I is extended so as to assign a domain element a^I of Δ^I to each individual constant a. We say that:

• M satisfies R if, for all C ⊑ D ∈ R, we have C^I ⊆ D^I;

• M satisfies T if, for all p :: T(C) ⊑ D ∈ T, we have (T(C))^I ⊆ D^I, i.e. Min_<(C^I) ⊆ D^I;

• M satisfies A if, for all assertions in A, if the assertion is of the form C(a) then a^I ∈ C^I, otherwise if it is of the form R(a, b) then (a^I, b^I) ∈ R^I.

Even if the typicality operator T itself is nonmonotonic (i.e. T(C) ⊑ E does not imply T(C ⊓ D) ⊑ E), what is inferred from a KB can still be inferred from any KB' with KB ⊆ KB', i.e. the resulting logic is monotonic. As already mentioned in Section 2, in order to perform useful nonmonotonic inferences, in Giordano et al. (2015) the authors have strengthened the above semantics by restricting entailment to a class of minimal models. Intuitively, the idea is to restrict entailment to models that minimize the untypical instances of a concept. The resulting logic corresponds to a notion of rational closure on top of ALC+TR. Such a notion is a natural extension of the rational closure construction provided in Lehmann and Magidor (1992) for propositional logic. This nonmonotonic semantics relies on minimal rational models that minimize the rank of domain elements. Informally, given two models of a KB, one in which a given domain element has rank 2 and another in which it has rank 1, we prefer the latter, as in this model the element is assumed to be "more typical" than in the former. Query entailment is then restricted to minimal canonical models. The intuition is that a canonical model contains all the individuals that enjoy properties that are consistent with the KB. This is needed when reasoning about the rank of the concepts: it is important to have them all represented. A query is minimally entailed from a KB if it holds in all minimal canonical models of the KB. In Giordano et al. (2015) it is shown that query entailment in the nonmonotonic ALC+TR is in ExpTime.

###### Definition 4 (Entailment)

Let K = ⟨R, T, A⟩ be a KB and let F be a query, either an inclusion (possibly involving the T operator) or an assertion. We say that F follows from K if, for all minimal models M satisfying K, M also satisfies F.

Let us now define the notion of a scenario for the composition of concepts. Intuitively, a scenario is a knowledge base obtained by adding to all the rigid properties in R and to all the ABox facts in A only some typicality properties. More in detail, we define an atomic choice on each typicality inclusion, and then a selection as a set of atomic choices, in order to select which typicality inclusions have to be considered in a scenario.

###### Definition 5 (Atomic choice)

Given K = ⟨R, T, A⟩, where T = {E_1, E_2, …, E_n}, we call an atomic choice a pair (E_i, k_i), where k_i ∈ {0, 1}.

###### Definition 6 (Selection)

Given K = ⟨R, T, A⟩, where T = {E_1, E_2, …, E_n}, and a set of atomic choices ν, we say that ν is a selection if, for each E_i, exactly one decision is taken, i.e. either (E_i, 0) ∈ ν and (E_i, 1) ∉ ν, or (E_i, 1) ∈ ν and (E_i, 0) ∉ ν, for i = 1, 2, …, n. The probability of ν is P(ν) = ∏_{(E_i, 1) ∈ ν} p_i × ∏_{(E_i, 0) ∈ ν} (1 − p_i), where p_i is the probability of the typicality inclusion E_i.

###### Definition 7 (Scenario)

Given K = ⟨R, T, A⟩ and given a selection ν, we define a scenario w_ν = ⟨R, {E_i | (E_i, 1) ∈ ν}, A⟩. We also define the probability of a scenario as the probability of the corresponding selection, i.e. P(w_ν) = P(ν). Last, we say that a scenario is consistent with respect to K when it admits a model in the logic T^CL satisfying K.

We denote with S the set of all scenarios. It immediately follows that P is a probability distribution over scenarios, that is to say, ∑_{w ∈ S} P(w) = 1.
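The scenario construction of Definitions 5-7 can be sketched as follows. The three inclusion labels and their probabilities are hypothetical stand-ins for those of Example 1 (the paper leaves the concrete values to the KB); the point is only the mechanics of selections and their probabilities.

```python
from itertools import product

# Hypothetical probabilities for three typicality inclusions (illustrative only).
typicality = {
    "T(Athlete) ⊑ Fit": 0.8,
    "T(SumoWrestler) ⊑ ¬Fit": 0.8,
    "T(Athlete) ⊑ Young": 0.95,
}

def selections(inclusions):
    """Yield every selection: one atomic choice (E_i, 0 or 1) per inclusion."""
    for bits in product([0, 1], repeat=len(inclusions)):
        yield dict(zip(inclusions, bits))

def probability(selection):
    """P(nu) = prod of p_i for chosen inclusions, (1 - p_i) for the others."""
    p = 1.0
    for inclusion, bit in selection.items():
        q = typicality[inclusion]
        p *= q if bit == 1 else 1.0 - q
    return p

scenarios = list(selections(typicality))
assert len(scenarios) == 2 ** len(typicality)  # eight scenarios

# The scenario probabilities form a distribution summing to 1.
total = sum(probability(s) for s in scenarios)
print(round(total, 10))  # 1.0
```

Each selection corresponds to the scenario keeping exactly the inclusions chosen with bit 1, alongside all rigid properties and ABox facts.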

Given a KB and given two concepts C_H and C_M occurring in it, our logic allows defining the compound concept C = C_H ⊓ C_M as the combination of the HEAD C_H and the MODIFIER C_M, where the typical properties of the form T(C) ⊑ D (or, equivalently, T(C_H ⊓ C_M) ⊑ D) to ascribe to the concept C are obtained in the set of scenarios that:

1. are consistent with respect to the initial knowledge base in the presence of at least one C-element; in other words, the knowledge base extending the initial one with the properties ascribed to the combined concept C in the scenario, together with a fact asserting the existence of a C-element not occurring in the initial knowledge base, admits a model in T^CL;

2. are not trivial, i.e. we discard the scenarios with the highest probability containing either all the properties that can be consistently ascribed to C, or all the properties of the HEAD that can be consistently ascribed to C;

3. are those giving preference to the typical properties of the HEAD (with respect to those of the MODIFIER C_M) having the highest probability; that is to say, in the case of conflicting properties D and ¬D, a scenario is discarded if it contains the inclusion ascribing to C the property of the MODIFIER whereas it does not include the one ascribing the property of the HEAD.

In order to select the resulting scenarios, we apply points 1, 2 and 3 above to blocks of scenarios with the same probability, in decreasing order starting from the highest one. More in detail, we first discard all the inconsistent scenarios, and then consider the remaining (consistent) ones in decreasing order of probability. We then consider the blocks of scenarios with the same probability, and we proceed as follows:

• we discard those considered as trivial, i.e. those consistently inheriting all the properties of the HEAD from the starting concepts to be combined (therefore, scenarios inheriting all the properties of both HEAD and MODIFIER are also discarded);

• among the remaining ones, we discard those inheriting properties of the MODIFIER that are in conflict with properties that could be consistently inherited from the HEAD;

• if the set of scenarios of the current block is empty, i.e. all the scenarios have been discarded either because they are trivial or because they prefer the MODIFIER, we repeat the procedure on the block of scenarios having the immediately lower probability;

• the remaining scenarios are those selected by the logic T^CL.
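The block-based procedure above can be sketched as follows. This is a simplified reading of the steps just listed, not Algorithm 1 itself: `consistent`, `trivial` and `prefers_modifier` are placeholders standing in for the actual T^CL reasoning tasks, and the scenarios are opaque labels.

```python
def select_scenarios(scenarios, probability, consistent, trivial, prefers_modifier):
    # 1. discard the inconsistent scenarios
    candidates = [s for s in scenarios if consistent(s)]
    # 2. group the survivors into blocks of equal probability,
    #    visited in decreasing order of probability
    blocks = {}
    for s in candidates:
        blocks.setdefault(probability(s), []).append(s)
    for p in sorted(blocks, reverse=True):
        # discard trivial scenarios and those preferring the MODIFIER
        block = [s for s in blocks[p]
                 if not trivial(s) and not prefers_modifier(s)]
        # 3. the first non-empty block gives the selected scenarios;
        #    otherwise fall through to the immediately lower probability
        if block:
            return block
    return []

# Toy run: s1 is trivial and s2 prefers the MODIFIER, so their block is
# emptied and the procedure falls through to the lower-probability block.
selected = select_scenarios(
    scenarios=["s1", "s2", "s3"],
    probability={"s1": 0.6, "s2": 0.6, "s3": 0.4}.get,
    consistent=lambda s: True,
    trivial=lambda s: s == "s1",
    prefers_modifier=lambda s: s == "s2",
)
print(selected)  # ['s3']
```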

More formally, our mechanism is described in Algorithm 1. Please note that this block-based procedure extends a previously developed method that simply selected the consistent scenarios in the probability range immediately below that of the trivial ones Lieto and Pozzato (2018b). Notice also that, in the initial knowledge base, the set of typicality inclusions contains neither inclusions of the HEAD of the form T(C_H) ⊑ D nor inclusions of the MODIFIER of the form T(C_M) ⊑ D already lifted to the compound concept (Algorithm 1, line 2).

Lastly, we define the ultimate output of our mechanism: a knowledge base in the logic T^CL whose set of typicality properties is enriched by those of the compound concept C. Given a scenario w satisfying the above properties, we define the properties of C as the set of inclusions p :: T(C) ⊑ D, for all T(C) ⊑ D that are entailed (Definition 4) from w in the logic T^CL. The probability p is such that:

• if T(C_H) ⊑ D is entailed from w, that is to say D is a property inherited from the HEAD (or from both the HEAD and the MODIFIER), then p corresponds to the probability of such an inclusion of the HEAD in the initial knowledge base;

• otherwise, i.e. if T(C_M) ⊑ D is entailed from w, then p corresponds to the probability of such an inclusion of the MODIFIER in the initial knowledge base.

The knowledge base obtained as the result of combining the concepts C_H and C_M into the compound concept C is called the C-revised knowledge base, and it is defined as follows:

 K_C = ⟨R, T ∪ {p :: T(C) ⊑ D}, A⟩,

for all D such that either T(C_H) ⊑ D or T(C_M) ⊑ D is entailed in w by Definition 4, and p is defined as above.

Let us now define the probability that a query is entailed from a C-revised knowledge base. We restrict our concern to ABox facts. The intuitive idea is that, given a query F and its associated probability q, the probability of F is the product of q and the probability of the inclusion in the C-revised knowledge base which is responsible for it.

###### Definition 8 (Probability of query entailment)

Given a knowledge base, its C-revised knowledge base K_C, a query and its probability, we define the probability of the entailment of the query from K_C as follows:

• it is 0, if the query is not entailed from K_C;

• otherwise, it is the product of the probability of the query and the probability p of the responsible inclusion p: T(C) ⊑ D, where such an inclusion either belongs to the initial knowledge base or belongs to the C-revised one and the query is entailed from it in standard ALC.
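As a hypothetical reading of Definition 8 (the function and its arguments below are ours, not the paper's notation): the entailment probability is zero when the query is not entailed, and otherwise the product of the query's own probability and that of the responsible inclusion.

```python
def query_entailment_probability(entailed, query_prob, inclusion_prob):
    # Definition 8, sketched: 0 if the query is not entailed from the
    # C-revised KB; otherwise the product of the query's associated
    # probability and the probability of the responsible inclusion.
    if not entailed:
        return 0.0
    return query_prob * inclusion_prob
```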

We conclude this section by showing that reasoning in T^CL remains in the same complexity class as the underlying standard Description Logic ALC.

###### Theorem 1

Reasoning in T^CL is ExpTime-complete.

Proof. For membership, let n be the size of the KB; the number of typicality inclusions is then bounded by n. It is straightforward to observe that we have an exponential number of different scenarios and, for each one, we need to check whether the resulting KB is consistent in ALC + T_R, which is ExpTime-complete. Hardness immediately follows from the fact that T^CL extends standard ALC. Reasoning in the revised knowledge base relies on reasoning in T^CL; therefore we can conclude that reasoning in T^CL is ExpTime-complete.
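The exponential factor in the proof comes from scenario enumeration: a scenario selects a subset of the typicality inclusions, so n inclusions yield 2^n candidate scenarios, each requiring a consistency check. A toy enumeration (our own illustration):

```python
from itertools import combinations

def all_scenarios(inclusions):
    # A scenario is any subset of the set of typicality inclusions,
    # hence there are 2^n of them for n inclusions.
    items = list(inclusions)
    return [set(subset)
            for r in range(len(items) + 1)
            for subset in combinations(items, r)]
```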

## 4 Applications of the logic T^CL

We propose three different types of examples adopting the logic T^CL, along with its embedded HEAD-MODIFIER heuristic, to model the phenomenon of typicality-based conceptual combination. In the first case (pet fish) we show how our logic is able to handle this concept composition, which is problematic for other formalisms. In the second case (Linda the feminist bank teller) we show how T^CL is able to model the well-known conjunction fallacy problem Tversky and Kahneman (1983). In the third case (stone lion) we show how our logic is also able to account for complex forms of metaphorical concept combination. These examples do not come ex abrupto: they represent classical challenging cases to model in the field of cognitive science and cognitive semantics (see e.g. Lewis and Lawry (2016)) and have been shown in the past to be problematic to model with other kinds of logics (for example fuzzy logic, Osherson and Smith (1981); Smith and Osherson (1984); Hampton (2011)).

In addition, we exploit T^CL to present an example of a possible application in the area of the creative generation of new characters. Finally, we show that the logic can be iteratively applied to combine concepts that already result from the combination of other concepts. This type of iterative process has never been provided in previous formalizations addressing similar or the very same phenomena (e.g. Lewis and Lawry (2016); Eppe et al. (2018)). We show that the procedures provided in T^CL are robust and consistent enough to also deal with higher, iterative levels of prototype-based compositionality.

### 4.1 Pet Fish

In this section we exploit the logic T^CL in order to define the typical properties of the concept pet fish, obtained as the combination of the concepts Pet and Fish. As mentioned before, this represents a well-known, paradigmatic example in cognitive science. The problem of combining the prototype of a pet with that of a fish is the following: a typical pet is affectionate and warm, whereas a pet fish is not; on the other hand, unlike a typical fish, a pet fish is not greyish, but it does inherit being scaly.

Let K = ⟨R, T, A⟩ be the KB, where the ABox A is empty, the set of rigid inclusions R is

 R={Fish⊑∀livesIn.Water}

and the set of typicality properties is as follows:

By the properties of the typicality operator T, we have that

 (∗) T(Pet⊓Fish)⊑∀livesIn.Water.

Indeed, Fish ⊑ ∀livesIn.Water is a rigid property, which is always preferred to a typical one: in this case, additionally, the rigid property is also associated with the HEAD element Fish. Therefore, this element is reinforced.

Since each scenario selects a subset of the typicality inclusions in T, we have exponentially many different scenarios. We can observe that some of them are not consistent, more precisely those

• containing the inclusion , thus contradicting (∗);

• containing both inclusions and ;

• containing both inclusions and .

It is worth noticing that this example represents the worst case for our analysis: indeed, the probabilities associated to the properties in T related to the MODIFIER are not lower than the ones associated to the properties in T related to the HEAD. Furthermore, the typical property of being warm and its negation have the same probability both in the HEAD and the MODIFIER.

The scenario with the highest probability is both trivial and inconsistent: indeed, since the probabilities equipping typicality inclusions are, by definition, strictly greater than 0.5, we immediately have that the more inclusions a scenario contains, the higher its probability. Since some of the typicality inclusions introduce properties that are pairwise inconsistent, it follows that such scenarios must be discarded (Algorithm 1, from line 17).
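This monotonicity can be checked directly against the distributed-semantics product. In the sketch below (our own, under the assumption that a scenario's probability multiplies p_i for each selected inclusion and 1 − p_i for each discarded one, with every p_i > 0.5):

```python
def scenario_probability(probs, selected):
    # probs[i]: probability of the i-th typicality inclusion (each > 0.5);
    # selected: set of indices of the inclusions the scenario keeps.
    result = 1.0
    for i, p in enumerate(probs):
        result *= p if i in selected else (1.0 - p)
    return result
```

Since every p_i > 0.5, adding an inclusion swaps a factor 1 − p_i < 0.5 for p_i > 0.5, so the trivial scenario containing all inclusions always has the highest probability, as observed above.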

The consistent scenarios with the highest probability (two scenarios) contain the MODIFIER's inclusion but not the conflicting one of the HEAD, namely they privilege the MODIFIER with respect to the corresponding negation in the HEAD, obtaining that being affectionate is a typical property of a pet fish. In both of these scenarios we pay the price of discarding some properties of the HEAD.

As described in the previous section, we consider the other blocks of consistent scenarios in descending order of probability (Algorithm 1, lines 21-23). Figure 1 shows the different scenarios, one row for each scenario, again in descending order of probability. Inconsistent scenarios are highlighted in the last column.

All scenarios in the higher probability ranges are inconsistent. The four consistent scenarios of the first valid block, however, are discarded: they either contain the MODIFIER's inclusion but not the conflicting one of the HEAD, namely they give preference to the MODIFIER concerning a conflicting property of the HEAD, or they are trivial, i.e. they inherit all the properties of the HEAD (Algorithm 1, lines 31-34).

The next block contains four scenarios. The first two, again, contain the MODIFIER's inclusions but not the conflicting ones of the HEAD, namely they again privilege the MODIFIER with respect to the corresponding negation in the HEAD; these scenarios are therefore discarded. The same holds for the last one, where the conflicting inclusions of the MODIFIER are included rather than those of the HEAD. The remaining scenario of this block includes three out of four properties of the HEAD; it is therefore not trivial and is selected by the logic T^CL for the composition of the two initial prototypes.

In conclusion, in our proposal, the non-trivial scenario defining the prototypical properties of a pet fish contains inclusions 3, 6, and 7, and is as follows:

The resulting (Pet ⊓ Fish)-revised knowledge base suggested by the logic T^CL is as follows:

 K_{Pet ⊓ Fish} = ⟨{Fish ⊑ ∀livesIn.Water}, T′, ∅⟩,

where T′ is:

Notice that, in our logic T^CL, adding a new inclusion would not be problematic: this means that our formalism is able to tackle the phenomenon of prototypical attribute emergence for the new compound concept, a well-established effect within the cognitive science literature Hampton (1987).

### 4.2 Linda the feminist bank teller

We now exploit the logic T^CL in order to tackle the conjunction fallacy problem (or “Linda Problem”). The problem configuration is as follows: suppose we know that Linda is a 31-year-old, single, outspoken, and bright lady. She majored in philosophy, was concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations. When asked to rank the probability of the statements 1) “Linda is a bank teller” and 2) “Linda is a bank teller and is active in the feminist movement”, the majority of people rank 2) as more probable than 1), violating the classic probability rules. In our logic, let K = ⟨R, T, A⟩ be a KB, where: