1.1 Section 2.2, p. 5 of [Sep13]
An analogical argument has the following form:
1. S is similar to T in certain (known) respects.
2. S has some further feature Q.
3. Therefore, T also has the feature Q, or some feature Q* similar to Q.
(1) and (2) are premises. (3) is the conclusion of the argument. The argument form is inductive; the conclusion is not guaranteed to follow from the premises.
S and T are referred to as the source domain and target domain, respectively. A domain is a set of objects, properties, relations and functions, together with a set of accepted statements about those objects, properties, relations and functions. More formally, a domain consists of a set of objects and an interpreted set of statements about them. The statements need not belong to a first-order language, but to keep things simple, any formalizations employed here will be first-order. We use unstarred symbols (a, P, R, f) to refer to items in the source domain and starred symbols (a*, P*, R*, f*) to refer to corresponding items in the target domain.
Formally, an analogy between S and T is a one-to-one mapping between objects, properties, relations and functions in S and those in T.
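The mapping requirement can be illustrated with a minimal sketch (all example items are invented):

```python
# An analogy as a one-to-one (injective) mapping between items of the
# source domain and items of the target domain; starred names mark
# target items, as in the text. All items are invented examples.
analogy = {
    "earth": "mars",            # objects
    "has_water": "has_water*",  # properties
    "orbits_sun": "orbits_sun*",
}

def is_one_to_one(mapping):
    """A mapping is one-to-one iff no two keys share a value."""
    return len(set(mapping.values())) == len(mapping)
```

Here is_one_to_one(analogy) holds, while a mapping sending two source items to the same target item would fail the check.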
1.2 Section 2.2, pp. 6-7 of [Sep13]
In an earlier discussion of analogy, Keynes, in [Key21], introduced some terminology that is also helpful.
Let P stand for a list of accepted propositions P1, …, Pn about the source domain S. Suppose that the corresponding propositions P1*, …, Pn*, abbreviated as P*, are all accepted as holding for the target domain T, so that P and P* represent accepted (or known) similarities. Then we refer to P as the positive analogy.
Let A stand for a list of propositions A1, …, Ar accepted as holding in S, and B* for a list of propositions B1*, …, Bs* holding in T. Suppose that the analogous propositions A1*, …, Ar* fail to hold in T, and similarly the propositions B1, …, Bs fail to hold in S, so that A, ~A* and ~B, B* represent accepted (or known) differences. Then we refer to A and B as the negative analogy.
The neutral analogy consists of accepted propositions about S for which it is not known whether an analogue holds in T.
The hypothetical analogy is simply the proposition Q in the neutral analogy that is the focus of our attention.
These concepts allow us to provide a characterization for an individual analogical argument that is somewhat richer than the original one.
Correspondence between SOURCE (S) and TARGET (T):

  SOURCE (S)   TARGET (T)
  P            P*         (positive analogy)
  A            ~A*        (negative analogy)
  ~B           B*
  Q            Q*         (plausible inference)
An analogical argument may thus be summarized: It is plausible that Q* holds in the target because of certain known (or accepted) similarities with the source domain, despite certain known (or accepted) differences.
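The classification into positive, negative, and neutral analogy can be sketched as follows (a toy example; the propositions and their truth values are invented):

```python
# For each accepted proposition about the source S, record whether its
# analogue is known to hold in the target T (True), known to fail
# (False), or not known (None). All data are invented examples.
analogue_in_target = {
    "P1": True,   # known similarity
    "P2": True,   # known similarity
    "A1": False,  # known difference
    "Q":  None,   # analogue unknown
}

positive_analogy = [p for p, v in analogue_in_target.items() if v is True]
negative_analogy = [p for p, v in analogue_in_target.items() if v is False]
neutral_analogy  = [p for p, v in analogue_in_target.items() if v is None]
```

The hypothetical analogy Q is then picked from the neutral class.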
1.3 Section 2.4 of [Sep13]
Of course, it is difficult to show that no successful analogical inference rule will ever be proposed. But consider the following candidate, formulated using the concepts of the schema of Section 1.2 and taking us only a short step beyond that basic characterization.
Suppose S and T are the source and target domains. Suppose P1, …, Pn (with n ≥ 1) represents the positive analogy, A1, …, Ar and ~B1, …, ~Bs represent the (possibly vacuous) negative analogy, and Q represents the hypothetical analogy. In the absence of reasons for thinking otherwise, infer that Q* holds in the target domain with degree of support p > 0, where p is an increasing function of n and a decreasing function of r and s.
We use the generic phrase “degree of support” in place of probability, since other factors besides the analogical argument may influence our probability assignment for Q*.
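As a toy illustration of such a degree of support, one might take p = n / (n + r + s + 1); this particular formula is our own invention, chosen only so that p > 0, p increases with n, and p decreases with r and s:

```python
def degree_of_support(n, r, s):
    """Toy degree of support for Q*: increases with the size n of the
    positive analogy, decreases with the sizes r and s of the negative
    analogy. The formula is invented purely for illustration."""
    assert n >= 1 and r >= 0 and s >= 0
    return n / (n + r + s + 1)
```

For instance, degree_of_support(4, 1, 1) exceeds degree_of_support(2, 1, 1), while adding known differences lowers the value.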
So, how do we choose the “right one”?
1.4 A Side Remark
2 The Idea
We now describe the idea, and compare it to other ideas in philosophical and AI-related logics.
But first, we formalize the above ideas into a definition.
Let L be an alphabet.
Let L' ⊆ L, and let f be an injective function on L', preserving the type of symbol, e.g.,
if x stands for an object of the universe, then so will f(x),
if X stands for a subset of the universe, then so will f(X),
if P stands for a unary predicate over the universe, then so will f(P),
etc., also for higher symbols, like the symbol for the universe.
Let Φ be a subset of the formulas formed with symbols from L'.
For φ ∈ Φ, let f(φ) be the obvious formula constructed from φ with the function f.
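For formulas represented crudely as token lists, the construction of f(φ) amounts to renaming symbols (a sketch; all names are invented):

```python
# Transporting a formula along f by renaming its symbols; tokens not
# in the domain of f (here, the parentheses) are left unchanged.
f = {"earth": "mars", "HasWater": "HasWater*"}  # invented symbol map

def transport(formula, f):
    """Build f(phi): replace every symbol of phi by its f-image, if any."""
    return [f.get(tok, tok) for tok in formula]

phi = ["HasWater", "(", "earth", ")"]
f_phi = transport(phi, f)  # ["HasWater*", "(", "mars", ")"]
```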
We now look at the truth values of the φ and the f(φ), for φ ∈ Φ. In particular, there may be φ s.t. φ is known, f(φ) not, and we extrapolate that f(φ) holds; this is then the analogical reasoning based on f.
There may be φ s.t. φ is not known, while f(φ) is known or not; such φ do not interest us here.
Assume in the following that φ is known.
φ and f(φ) are known, and they have the same truth value. The set of such φ is the positive support of f, denoted Φ+(f).
φ and f(φ) are known, and they have different truth values. The set of such φ is the negative support of f, denoted Φ−(f).
φ is known, f(φ) is not known. The set of such φ is denoted Φ?(f).
The “effect” of f is to conjecture, by analogy, that f(φ) has the same truth value as φ for such φ.
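The three supports can be computed directly once the known truth values are listed (a sketch; formulas and values are invented):

```python
# 'f' maps each formula phi to f(phi); 'val' records the known truth
# values, and formulas absent from 'val' count as unknown. Invented data.
f = {"p": "p*", "q": "q*", "r": "r*"}
val = {"p": True, "p*": True,   # agree: positive support
       "q": True, "q*": False,  # disagree: negative support
       "r": True}               # f(r) unknown

positive_support = {x for x in f if x in val and f[x] in val and val[x] == val[f[x]]}
negative_support = {x for x in f if x in val and f[x] in val and val[x] != val[f[x]]}
unknown_support  = {x for x in f if x in val and f[x] not in val}

# The "effect" of f: conjecture val[f(x)] = val[x] for the unknown part.
conjectures = {f[x]: val[x] for x in unknown_support}
```

Here positive_support is {"p"}, negative_support is {"q"}, and f conjectures that "r*" is true.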
Suppose now we have two such functions, f1 and f2:
f1 works well for φ but not for ψ, so φ ∈ Φ+(f1) and ψ ∈ Φ−(f1);
f2 works well for ψ but not for φ, so ψ ∈ Φ+(f2) and φ ∈ Φ−(f2).
Let further φ' ∈ Φ?(f1) and ψ' ∈ Φ?(f2).
What shall we do: should we choose one, f1 or f2, for guessing, or combine f1 and f2, choosing f1 for expressions involving φ and f2 for expressions involving ψ (more precisely, φ' and ψ')?
A natural answer is to compare the analogy functions by a relation ≺ expressing which is the “better” analogy, and to use the best ones. Usually, this “best” relation will be partial only, and there will be many “best” f. Thus, it seems natural to conclude those properties which hold under ALL best f.
We then write |∼ ψ iff ψ holds under all best f.
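Skeptical inference over the best analogy functions can be sketched as follows (candidates, conjecture sets, and the preference relation are all invented):

```python
# Each candidate analogy function is represented only by the set of
# formulas it conjectures; 'better' is a strict partial order, given
# as a set of pairs (x, y) meaning "x is a better analogy than y".
conjectures = {
    "f1": {"a", "b"},
    "f2": {"a", "c"},
    "f3": {"d"},
}
better = {("f1", "f3"), ("f2", "f3")}

def best(candidates, better):
    """The candidates not beaten by any other candidate."""
    return {x for x in candidates
            if not any((y, x) in better for y in candidates)}

# Accept exactly the conjectures shared by ALL best candidates.
accepted = set.intersection(*(conjectures[g] for g in best(conjectures, better)))
```

With these data the best candidates are f1 and f2, and only "a" is accepted, since it is conjectured by both.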
(This is a sketch only; details have to be filled in according to the situation considered.)
This sounds like cheating: we changed the level of abstraction, and packed the question of “good” analogies into the relation ≺.
But when we look at the Stalnaker–Lewis semantics of counterfactual conditionals (see [Sta68], [Lew73]), the preferential semantics for non-monotonic reasoning and deontic logic (see e.g. [Han69], [KLM90], [Sch04], [Sch18]), and the distance semantics for theory revision (see e.g. [LMS01], [Sch04]), this is a well-used “trick” we need not be ashamed of.
In the above examples, the comparison was between possible worlds; here, it is between usually more complicated structures (functions), but this is no fundamental difference.
2.2 Problems and solutions
In the case of infinitely many f's, we might have a definability problem, as the resulting best guess might not be definable any more, as in the case of preferential structures; see e.g. [Sch04].
Abstract treatments of representation problems for the abovementioned logics work with arbitrary sets, so we have a well-studied machinery for representation results for various types of relations of “better” analogies; see e.g. [LMS01], [Sch04], [Sch18].
To give the reader an idea of such representation results, we mention some, slightly simplified.
(1) Let again ≺ be the relation, and let μ(X) := {x ∈ X : there is no x' ∈ X with x' ≺ x} be the set of best elements of X.
(2) ≺ is called smooth iff for all x ∈ X, either x ∈ μ(X) or there is x' ∈ μ(X) with x' ≺ x.
(3) ≺ is called ranked iff for all x, y, z: if neither x ≺ y nor y ≺ x, then: if x ≺ z, then y ≺ z too, and, analogously, if z ≺ x, then z ≺ y too.
We then have, e.g.:
General and transitive relations are characterised by (μ⊆): μ(X) ⊆ X, and (μPR): X ⊆ Y ⇒ μ(Y) ∩ X ⊆ μ(X).
Smooth and transitive smooth relations are characterised by (μ⊆) and (μPR), and the additional property (μCUM): μ(Y) ⊆ X ⊆ Y ⇒ μ(X) = μ(Y).
Ranked relations are characterised by (μ⊆) and (μPR), and the additional property (μ=): X ⊆ Y and μ(Y) ∩ X ≠ ∅ ⇒ μ(X) = μ(Y) ∩ X.
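On a small finite example, the conditions can be checked by brute force; the relation below is invented for illustration, with ≺ encoded as a set of pairs:

```python
from itertools import chain, combinations

U = {1, 2, 3}
prec = {(1, 2), (1, 3), (2, 3)}  # the invented preference order: 1 best

def mu(X):
    """The set of best elements of X: those not beaten within X."""
    return {x for x in X if not any((y, x) in prec for y in X)}

def subsets(S):
    return [set(c) for c in
            chain.from_iterable(combinations(S, k) for k in range(len(S) + 1))]

# (mu subset): mu(X) is a subset of X
assert all(mu(X) <= X for X in subsets(U))
# (muPR): X subset of Y implies mu(Y) intersect X subset of mu(X)
assert all(mu(Y) & X <= mu(X)
           for X in subsets(U) for Y in subsets(U) if X <= Y)
# (mu=) also holds, as expected for this ranked relation
assert all(mu(X) == (mu(Y) & X)
           for X in subsets(U) for Y in subsets(U)
           if X <= Y and mu(Y) & X)
```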
For more explanation and details, see e.g. [Sch18], in particular Table 1.6 there.
- [Han69] B. Hansson, “An analysis of some deontic logics”, Nous 3, 373–398. Reprinted in R. Hilpinen, ed. “Deontic Logic: Introductory and Systematic Readings”, Reidel, pp. 121–147, Dordrecht 1971
- [KLM90] S. Kraus, D. Lehmann, M. Magidor, “Nonmonotonic reasoning, preferential models and cumulative logics”, Artificial Intelligence, 44 (1–2), pp. 167–207, July 1990
- [Key21] J. M. Keynes, “A treatise on probability”, London, 1921
- [LMS01] D. Lehmann, M. Magidor, K. Schlechta, “Distance semantics for belief revision”, Journal of Symbolic Logic, Vol. 66, No. 1, pp. 295–317, March 2001
- [Lew73] D. Lewis, “Counterfactuals”, Blackwell, Oxford, 1973
- [Sep13] “Analogy and analogical reasoning”, Fall 2013 edition, Stanford Encyclopedia of Philosophy, 2013
- [Sep19c] “Analogy and analogical reasoning”, Stanford Encyclopedia of Philosophy, 2019
- [Sch04] K. Schlechta, “Coherent systems”, Elsevier, Amsterdam, 2004.
- [Sch18] K. Schlechta, “Formal Methods for Nonmonotonic and Related Logics”, Vol. 1: “Preference and Size”, Vol. 2: “Theory Revision, Inheritance, and Various Abstract Properties” Springer, 2018
- [Sta68] R. Stalnaker, “A theory of conditionals”, in N. Rescher (ed.), “Studies in Logical Theory”, Blackwell, Oxford, pp. 98–112, 1968