1 Introduction
1.1 Homogenousness as a default metarule
Homogenousness was discussed as an important, though rarely explicitly addressed, concept by the author in [Sch972], section 1.3.11, page 32, and treated in more detail in [GS16], chapter 11; see also [Sch18b], section 5.7.
It is a second order hypothesis about the world, more precisely about the adequacy of our concepts analysing the world, and discussed in an informal way in [GS16] and [Sch18b].
The aim of these notes is to make the discussion more formal, treating it as a second order application of the fundamental concept of nonmonotonicity (that the set of exceptions is small), and, in particular, to base the intuitively very appealing idea of specificity (a way of solving conflicts between contradictory homogenousness requirements) on that same fundamental concept.
The author recently discovered (reading [SEP13], section 4.3) that J. M. Keynes’s Principle of the Limitation of Independent Variety, see [Key21], expresses essentially the same idea as homogenousness. (It seems, however, that the epistemological aspect, the naturalness of our concepts, is missing in his work.) By the way, [SEP13] also mentions “inference pressure” (in section 3.5.1), discussed in [Sch972], section 1.3.4, page 10. Thus, the ideas are quite interwoven.
Our main formal contribution here is to analyse a size relation between sets, generated by a relation between elements, similarly to Definition 2.6 and Fact 2.7 in [Sch972].
1.2 A general comment
The reader will see that we treat here again semantics based on the notions of distance and size. These notions seem very natural, perhaps also because they have a neurological correspondence: semantically close neurons or groups of neurons tend to fire together, and a large number of neurons has a potentially bigger effect than a small number, as their effect on other neurons might add up.
In an appendix, we discuss a different and, we think, important concept, the core of a set; we base it on distance, and find it by repeated application of standard theory revision.
2 Filters and Ideals
Definition 2.1
Let $X \neq \emptyset$, $\mathcal{F}, \mathcal{I} \subseteq \mathcal{P}(X)$.

$\mathcal{F}$ is called a filter on $X$ iff
(1) $X \in \mathcal{F}$
(2) $A \in \mathcal{F}$ and $A \subseteq B \subseteq X$ imply $B \in \mathcal{F}$
(3) $A, B \in \mathcal{F}$ imply $A \cap B \in \mathcal{F}$ (finite intersection suffices here)

If there is $A \subseteq X$ such that $\mathcal{F} = \{B \subseteq X : A \subseteq B\}$, we say that $\mathcal{F}$ is the (principal) filter generated by $A$.

$\mathcal{I}$ is called an ideal on $X$ iff
(1) $\emptyset \in \mathcal{I}$
(2) $B \in \mathcal{I}$ and $A \subseteq B$ imply $A \in \mathcal{I}$
(3) $A, B \in \mathcal{I}$ imply $A \cup B \in \mathcal{I}$ (finite union suffices here)
Definition 2.2
Let $\mathcal{F}$ be a filter over $X$; then $\mathcal{I} := \{X \setminus A : A \in \mathcal{F}\}$ is the corresponding ideal (and conversely).
Given $\mathcal{F}$ and the corresponding $\mathcal{I}$, we set $\mathcal{M} := \mathcal{P}(X) \setminus (\mathcal{F} \cup \mathcal{I})$.
The intuition is that elements of the filter are big subsets, of the ideal small subsets, and subsets in $\mathcal{M}$ have medium size.
When speaking about $\mathcal{F}$, $\mathcal{I}$, $\mathcal{M}$ over the same set, we will always assume that they correspond to each other as just defined.
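As a concrete illustration, the big/small/medium trichotomy can be computed over a finite base set. The following is a minimal sketch in Python; the function names and the example base set are our own, not taken from the text.

```python
from itertools import combinations

def powerset(xs):
    """All subsets of xs, as frozensets."""
    xs = list(xs)
    return [frozenset(c) for r in range(len(xs) + 1)
            for c in combinations(xs, r)]

def principal_filter(gen, base):
    """The (principal) filter generated by gen: all supersets of gen within base."""
    return {s for s in powerset(base) if gen <= s}

def corresponding_ideal(filt, base):
    """The corresponding ideal: complements of the big sets."""
    b = frozenset(base)
    return {b - a for a in filt}

def medium_sets(filt, ideal, base):
    """Subsets that are neither big nor small."""
    return {s for s in powerset(base) if s not in filt and s not in ideal}

base = {1, 2, 3}
F = principal_filter(frozenset({1, 2}), base)
I = corresponding_ideal(F, base)
M = medium_sets(F, I, base)
# F contains {1,2} and {1,2,3}; I contains {} and {3}; the four remaining subsets are medium.
```

Note that with a smaller generating set (e.g. a singleton), the medium zone can be empty; the trichotomy depends on the filter chosen.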
Remark 2.1
but not necessarily the converse.
Proof
and so
For the converse: Consider then but
Definition 2.3
Given (and corresponding and we define:


(a) and
or
(b) and

If we write and instead of and
Obviously, and are irreflexive.
We define two coherence properties:
Definition 2.4
(Coh1)
(Coh2)
These properties will be discussed in more detail below in Section 3 (page 3), as they are closely related to properties of a preferential relation between elements of see [Sch18a].
First, an initial remark: if (Coh1) and (Coh2) hold, and behave well:
Fact 2.2
imply:
(1) Let then
(2) Let then the following four conditions are equivalent:
Proof
(1)
“”: so by (Coh1), so
“”: so and so Moreover and so by (Coh2).
(2)
We use the finite union and downward closure properties of the ideal without further mention.
We also use the following without further mention:
(a)
(b)
(c)
(d) by (Coh1)
We now show the equivalences.
(2.1) :
so by (Coh2) by (Coh1).
(2.2) :
by (Coh1). so by (Coh2), but
Note that we did not use is just an arbitrary set.
(2.3) :
Let by (2.1) so by (2.2)
(2.4) :
Trivial by (Coh1).
(2.5) :
so
Note that we did not use is just an arbitrary set.
(2.6) :
Let so by (2.4), so by (2.5).
Definition 2.5
Let $\prec$ be a binary relation on $U$; we define for $X \subseteq U$: $\mu(X) := \{x \in X : \neg\exists y \in X\,(y \prec x)\}$, the set of $\prec$-minimal elements of $X$.
We assume in the sequel that $\mu(X) \neq \emptyset$ for any such $X \neq \emptyset$.
Fact 2.3
Let $\mathcal{F}(X)$ be the filter over $X$ generated
by $\mu(X)$, i.e. $\mathcal{F}(X) = \{A \subseteq X : \mu(X) \subseteq A\}$; then the corresponding
$\mathcal{I}(X) = \{A \subseteq X : A \cap \mu(X) = \emptyset\}$ and $\mathcal{M}(X) = \{A \subseteq X : A \cap \mu(X) \neq \emptyset \text{ and } \mu(X) \not\subseteq A\}$.
When we discuss size relations for subsets of $U$, we implicitly mean the filters, ideals, etc. generated by the relation on subsets of $U$, as discussed in Fact 2.3.
It is now easy to give examples:
Example 2.1
is neither upward nor downward absolute. Intuitively, in a bigger set, formerly big sets might become small; conversely, in a smaller set, formerly small sets might become big.
Let Then
(1) does not imply
(2) does not imply
(1): Let Then but both
(2): Let but NOT Then but both
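The non-absoluteness can be made concrete with a tiny sketch in Python. The two elements and the relation below are our own illustration (the concrete elements of the stripped example are not recoverable), but they exhibit both failures at once.

```python
def mu(X, prec):
    """Minimal elements of X: nothing in X lies below them.
    prec is a set of pairs (y, x), read as: y is preferred to x."""
    return {x for x in X if not any((y, x) in prec for y in X)}

def size(A, X, prec):
    """Classify A (a subset of X) via the filter generated by mu(X):
    big if A contains all minimal elements, small if it misses them all."""
    m = mu(X, prec)
    if m <= A:
        return "big"
    if not (m & A):
        return "small"
    return "medium"

prec = {("a", "b")}                      # a is preferred to b
print(size({"b"}, {"a", "b"}, prec))     # "small": {b} misses the minimal element a
print(size({"b"}, {"b"}, prec))          # "big": in the smaller base set, b itself is minimal
```

So the same set $\{b\}$ is small in the bigger base set but big in the smaller one, matching the intuition stated above.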
3 $\prec$ on $U$ and $<$ on $\mathcal{P}(U)$
Definition 3.1
We define the following standard properties for $\prec$:
(1) Transitivity: $a \prec b$ and $b \prec c$ imply $a \prec c$ (trivial)
(2) Smoothness: If $x \in X$, then $x \in \mu(X)$, or there is $y \in \mu(X)$ with $y \prec x$
(3) Rankedness: If neither $a \prec b$ nor $b \prec a$, and $a \prec c$ (or $c \prec a$), then also $b \prec c$ (or $c \prec b$)
(Rankedness implies transitivity.)
See, e.g. Chapter 1 in [Sch18a].
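For finite relations, these properties can be checked by brute force. The following is a sketch of our own (not from [Sch18a]), with the relation given as a set of pairs.

```python
from itertools import chain, combinations

def mu(X, prec):
    """Minimal elements of X under prec (pairs (y, x): y below x)."""
    return {x for x in X if not any((y, x) in prec for y in X)}

def is_transitive(prec):
    """a < b and b < c imply a < c."""
    return all((a, c) in prec
               for (a, b) in prec for (b2, c) in prec if b2 == b)

def is_smooth(U, prec):
    """Every element of every nonempty subset is minimal in it,
    or lies above one of its minimal elements."""
    subsets = chain.from_iterable(combinations(U, r) for r in range(1, len(U) + 1))
    return all(all(x in m or any((y, x) in prec for y in m) for x in S)
               for S in map(set, subsets) for m in [mu(S, prec)])

def is_ranked(U, prec):
    """Incomparable elements behave identically towards all others."""
    for a in U:
        for b in U:
            if a != b and (a, b) not in prec and (b, a) not in prec:
                for c in U:
                    if ((a, c) in prec) != ((b, c) in prec):
                        return False
                    if ((c, a) in prec) != ((c, b) in prec):
                        return False
    return True
```

For example, a two-element cycle {(a, b), (b, a)} is not smooth (the whole set has no minimal element), while any finite strict total order passes all three checks.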
3.1 Simple and smooth
Recall:
Fact 3.1
(1) (Coh1) is equivalent to the basic property of preferential structures,
(2) The basic property of smooth preferential structures, implies (Coh2), and imply
Proof
As there is a biggest we can argue with elements.
(1) (Coh1):
suppose contradiction.
(2) (Coh2): Let so
Let and so so follows from (Coh1)
Example 3.1
(1) Consider but not with Then but Nontransitivity of is crucial here.
(2) Consider with and closed under transitivity. Then but Note that this structure is not smooth, but transitive.
Fact 3.2
is transitive, if is smooth.
Proof
Let so and We have to show i.e.
3.2 Ranked
Rankedness speaks about so it is not surprising that behaves well for ranked
Definition 3.3
We define
This is well-defined.
Fact 3.3
(1)
Let
Then iff
(a) and or
(b) and
(2)
iff
(a) or
(b) and
Proof
(1)
iff
and or
and or
and
(2)
The case is immediate.
Example 3.2
Here, is transitive and smooth, but not ranked, and is not transitive.
Consider
Let
If
If
So have same size in
Fact 3.4
Let the relation be ranked. Then is transitive.
Proof
Let If both and hold by case (2) (b) in Fact 3.3, then again by case (2) (b), otherwise by case (2) (a).
Fact 3.5
Let then (and for
(This does not hold for of course.)
Proof
By but so
4 Specificity and Differentiation of Size
We now base the specificity criterion on the same notion of size as nonmonotonicity.
In this section, and are the positive or negative arrows of defeasible inheritance diagrams.
Fact 4.1
Suppose (Coh1) holds. If or and for some then
(Likewise, if only the contradiction matters.)
Proof
If or then moreover so by the finite intersection property. Thus by (Coh1). by so by (Coh1), thus by the finite union property.
Corollary 4.2
If then and together are impossible by irreflexivity.
We may now base the specificity principle, like nonmonotonicity itself, on small exception sets, but this time, the exceptions are second order. We thus have a uniform background principle for reasoning.
4.1 Summary
It seems best to illustrate the situation with an example.
Consider the Tweety Diagram: If we treat like we violate a comparatively smaller subset: As and is a comparatively smaller subset of than of and smaller exception sets are more tolerable than bigger exception sets.
Thus, size is a very strong concept for the foundation of nonmonotonic reasoning.
(In addition, otherwise, the chain has two changes: but this way, we have only one change:
4.2 Differentiation of Size
Our basic approach has only three sizes: small, medium, big.
Using the above ideas, we may differentiate further: if and then is doubly small in etc.
5 Homogenousness
We have now the following levels of reasoning:

Classical logic:
monotony, no exceptions, clear semantics

Preferential logic:
small sets of exceptions possible, clear semantics, strict rules about exceptions, like no other restrictions

Meta-default rules (Homogenousness):
They have the form: and even if in the nonmonotonic sense of (2), we prefer those models where but exceptions are possible by nonmonotonicity itself, as, e.g., in (2).
We minimize those exceptions, and resolve conflicts whenever possible, as in Fact 4.1, by using the same principle as in level (2): we keep exception sets small. This is summarized in the specificity criterion.
(We might add a modified length-of-path criterion as follows: Let We know by Fact 4.1 that then any shorter chain has a shorter possible size reduction (if there are no other chains, of course!), and we can work with this. This is the same concept as in [Sch18e], section 4.)
This has again a clear (preferential) semantics, as our characterisations are abstract, see e.g. [Sch18a].
Remark: Inheritance diagrams and Reiter defaults are based on homogenousness, e.g. in the concatenation by default.
6 Extensions
All distance-based semantics, like theory revision and counterfactuals, have a natural notion of size: the set of closest elements is the smallest set in the filter. Thus, we can apply our techniques to them, too.
Analogical reasoning and induction also operate with distance (analogical reasoning) and size (comparison of the sample size with the target size), so we may apply our principles here, too.
7 Defeasible Inheritance Diagrams
We discuss two of the main questions about defeasible inheritance diagrams in the light of our above analysis.

Upward versus downward chaining

Extension based versus directly sceptical approaches
7.1 Upward versus downward chaining
Diagram 7.1
[Figure omitted: an inheritance diagram titled “The problem of downward chaining”, with nodes $Z$, $U$, $V$, $X$, $Y$ and positive and negative arrows between them.]
We assume (Coh1) and (Coh2).
By specificity, there is with
Of course, only (a big subset of) is affected by as we do only downward reasoning, no analogical (sideways) reasoning.
There is with thus, is not affected by
But there is no information that
So there is and inherits from that (or, better: there is
So, in this example, our (downward) approach coincides with upward chaining, see [Sch972], section 6.1.3. Basically, the reason is that we look inside at and not only globally at which would involve (hidden) analogical reasoning.
7.2 Extension based versus directly sceptical approaches
Extension based approaches branch into different extensions when a conflict cannot be solved by specificity.
Each extension violates the homogenousness assumption rather drastically. Assuming that half of (a medium size subset) is in the other half in is a less drastic violation, thus it corresponds to the overall strategy to minimize violation of homogenousness.
But there is no principal difference between two medium size subsets and one big and one small subset which are in conflict. So they should be treated the same way. Thus, in one single picture we have not only conflicting big and small subsets, but also conflicting medium size subsets, so this is more in the directly sceptical colour, without saying it is strictly the same as the traditional directly sceptical approach.
Thus, we have with and
Of course, any is NOT affected by
Thus, considering Diagram 7.3, there is and this inherits only from i.e. that it is mostly in
Diagram 7.2
Diagram 7.3
Example 7.1
Then we have to split into two sets with (basically) and two sets with and, as they are independent, we split into four sets, all in with e.g. but but and
Note that it is unimportant in which order we treat the conflicts, or if we treat them simultaneously, as should be the case.
7.3 Ideas for a Semantics
If we want to treat the Nixon Diamond, we have to consider So far, we have not considered abstract coherence properties for We do this now.
Definition 7.1
This is a condition for ranked preferential structures.
We rewrite this:
(The case does not interest us here very much.)
Illustration of (the main part of)
If, in addition, we add and and a negative arrow (not as in Diagram 7.3), is mostly in with and we may still conclude by the above that inherits to be (mostly) in from
7.3.1 Principles
We work with very few background principles:

For many purposes, reasoning with abstract size seems the adequate approach.

As always in nonmonotonic reasoning, small sets of (first-level) exceptions are possible, so we work with and instead of the full or empty set (we used above as an abbreviation).

The hard rules of the background logic and of the filter/ideal properties tell us how to treat big/small/medium subsets on the first level.

This is complemented by the homogenousness principle and conflict resolution by specificity on the second level.

We treat all subsets the same way, not medium size sets differently by branching into different possibilities.

Specificity is based on the same idea as nonmonotonicity itself: we tolerate (better) small exception sets (than bigger ones).
Based on these principles, we proceed as follows:

We decide for a background logic, i.e. for coherence conditions. (Coh1), (Coh2), seems a good choice.

We respect the coherence conditions, and inherit properties strictly downward, not by analogy. Contradictions are either solved by specificity, or we choose one half for the one property, the other half for the other.
It is important to choose the (sub)sets from which we inherit carefully; there is no analogical reasoning here. This was illustrated in the above examples.
8 Appendix  the Core of a Set
The following remarks are only loosely related to the main part of these notes. The core of a set is a concept derived from the notion of distance, and the formal approach is based on theory revision, see e.g. [LMS01], or [Sch18b], section 4.3.
We define the core of a set as the subset of those elements which are “sufficiently” far away from elements which are NOT in the set. Thus, even if we move a bit, we still remain in the set.
This has interesting applications. E.g., in legal reasoning, a witness may not be very sure about colour and make of a car, but if he errs in one aspect, this may not be so important, as long as the other aspect is correct. We may also use the idea for a differentiation of truth values, where a theory may be “more true” in the core of its models than in the periphery, etc.
In the following, we have a set $U$ and a distance $d$ between elements of $U$. All sets etc. will be subsets of $U$; $U$ will be finite. The intuition is that $U$ is the set of models of a propositional language.
Definition 8.1
Let $x \in U$, $X, Y \subseteq U$, $X, Y \neq \emptyset$.
(1) $d(x, Y) := \min\{d(x, y) : y \in Y\}$
(2) $d(X, Y) := \min\{d(x, y) : x \in X, y \in Y\}$
Definition 8.2
Fix some $d_0 > 0$; the core will be relative to $d_0$. One might write $\mathrm{core}_{d_0}$, but this is not important here, where the discussion is conceptual.
Define $\mathrm{core}(X) := \{x \in X : d(x, U \setminus X) > d_0\}$.
(We might add some constant like 1/2, so singletons have a nonempty core, but this is not important for the conceptual discussion.)
It does not seem to be easy to describe the core operator with rules e.g. about set union, intersection, etc. It might be easier to work with pairs (X, but we did not pursue this.
We may, however, base the notion of core on repeated application of the theory revision operator (for formulas) or (for sets) as follows:
Given (defined by some formula and (defined by the outer elements of (those of depth 1) are The elements of depth 2 are M( ) respectively, etc.
We make this formal.
Fact 8.1

The set version
Consider we want to find its core.
Let
Let
Let
Let
Continue etc. until it becomes constant, say
Now we go back:

The formula version
Consider we want to find its core.
Let
Let
Let
Let
Continue etc. until it becomes constant, say
Now we go back:
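The set version above can be sketched in Python. The Hamming distance on propositional models and the concrete threshold are our assumptions for illustration (the text leaves the distance abstract); the peeling function is one possible reading of the iterative scheme, stripping layers by distance to the original complement.

```python
from itertools import product

def hamming(u, v):
    """Hamming distance between two models, given as 0/1 tuples."""
    return sum(a != b for a, b in zip(u, v))

def core(X, U, d, d0):
    """Direct definition: points of X farther than d0 from every point outside X."""
    comp = U - X
    if not comp:
        return set(X)
    return {x for x in X if min(d(x, y) for y in comp) > d0}

def core_by_peeling(X, U, d, depth):
    """Iterative reading: repeatedly strip the outermost layer of X,
    i.e. the remaining points closest to the ORIGINAL complement of X."""
    comp = U - X
    cur = set(X)
    for _ in range(depth):
        if not cur or not comp:
            break
        dist = {x: min(d(x, y) for y in comp) for x in cur}
        closest = min(dist.values())
        cur = {x for x in cur if dist[x] > closest}
    return cur

# Models over three propositional variables; X excludes only the all-false model.
U = set(product([0, 1], repeat=3))
X = U - {(0, 0, 0)}
inner = core(X, U, hamming, 1)   # models with at least two variables true
assert inner == core_by_peeling(X, U, hamming, 1)
```

Peeling with larger depth proceeds further inwards, corresponding conceptually to the elements of greater depth before the “go back” step.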
8.1 Acknowledgement
The author would like to thank David Makinson for an important comment.
References
 [GS16] D. Gabbay, K. Schlechta, “A New Perspective on Nonmonotonic Logics”, Springer, Heidelberg, Nov. 2016, ISBN 978-3-319-46815-0

 [Key21] J. M. Keynes, “A treatise on probability”, London, 1921
 [LMS01] D. Lehmann, M. Magidor, K. Schlechta, “Distance semantics for belief revision”, Journal of Symbolic Logic, Vol. 66, No. 1, pp. 295–317, March 2001
 [SEP13] “Analogy and analogical reasoning”, Stanford Encyclopedia of Philosophy, Fall 2013 edition, 2013
 [Sch18a] K. Schlechta, “Formal Methods for Nonmonotonic and Related Logics”, Vol. 1: “Preference and Size”, Springer, 2018
 [Sch18b] K. Schlechta, “Formal Methods for Nonmonotonic and Related Logics”, Vol. 2: “Theory Revision, Inheritance, and Various Abstract Properties”, Springer, 2018
 [Sch18e] K. Schlechta, “Operations on partial orders”, Unpublished manuscript, arXiv 1809.10620
 [Sch972] K. Schlechta, “Nonmonotonic logics: basic concepts, results, and techniques”, Springer Lecture Notes series, LNAI 1187, Jan. 1997