What Logics of Induction are There?
John D. Norton
The nogo result arises because an inductive logic that is entirely defined in terms of deductive relations is too responsive. It tries to accommodate too many deductive structures, including all possible refinements of the deductive structure with which we might start.
We can still generate deductively definable logics of induction if we strengthen the logic to make it less responsive to all possible refinements. This amounts to adding inherently inductive suppositions to the definitions of the logic. To do this, we consider all possible disjunctive refinements of the propositions of our system. These we shall call the different "partitions" of the system. We declare that the degrees of inductive support may be deductively defined only in certain of these partitions but not others. The designation of these preferred partitions in which deductive definability holds is the addition of inherently inductive content. More precisely, we shall henceforth consider the following sorts of inductive logics.
An inductive logic is asymptotically stable and deductively definable in preferred partitions, if:
• There are "inductively adapted partitions" that form a proper subset
of all possible partitions.
• The strengths of inductive support are definable in each of these
inductively adapted partitions solely in terms of the deductive relations of
that partition.
• There are inductively adapted partitions of arbitrarily large size.
• The inductive strength [A|B], for each pair of propositions A and B,
stabilizes toward a unique value as the size of the partition grows.
(asymptotic stability)
The result (S) derived earlier still obtains. However, it now holds only in the preferred partitions. It now says:

In inductively adapted partitions with N atoms, there exists a function f_N such that

(S)   [A|B] = f_N(#A&B, #A&~B, #~A&B)
This idea of introducing purely inductive suppositions by means of inductively adapted partitions may seem abstruse. However it is an idea that has been present since the beginning of formal work on inductive logic. It arose at the earliest moments in the invention of probability theory.
When probability theory was set up centuries ago, it did not start with a notion of probability, a numerical measure between 0 and 1. It started with a more primitive notion, the equally likely case. A familiar example is that of a die throw. By design, the equally likely cases are the six faces of the die:

⚀ ⚁ ⚂ ⚃ ⚄ ⚅
A partition with these six outcomes as its atoms is an inductively adapted partition of a probabilistic logic of induction.
The probability of some outcome was then determined as a derived quantity by counting cases and computing the ratio of favorable to all cases. Thus the probability of throwing a ⚀ was 1/6, since there is one favorable case out of the six total cases.
Or the probability of throwing an even outcome, ⚁ or ⚃ or ⚅, given a "low" outcome, ⚀ or ⚁ or ⚂, is 1/3, since there is one favorable outcome, ⚁, among the three possible outcomes ⚀ or ⚁ or ⚂.
If we write "#A" for the number of equally likely cases in outcome A, the general formula is
(P)   P(A|B) = #A&B / #B = #A&B / (#A&B + #~A&B)
This formula should look familiar. It is just a special case of the formula (S). So we might rewrite it more explicitly as

P(A|B) = f_N(#A&B, #A&~B, #~A&B) = f_6(#A&B, 0, #~A&B) = #A&B / (#A&B + #~A&B)
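The case-counting rule (P) is easy to sketch in a few lines. The following is a hypothetical illustration, not the text's own construction; the set of faces and the helper name `prob` are my choices:

```python
from fractions import Fraction

# A minimal sketch of formula (P): probability as the ratio of
# favorable to total equally likely cases. Outcomes are modeled as
# sets of die faces 1..6; `prob` is an illustrative name, not the text's.
FACES = frozenset({1, 2, 3, 4, 5, 6})

def prob(A, B=FACES):
    """P(A|B) = #A&B / #B, counting equally likely cases."""
    return Fraction(len(A & B), len(B))

assert prob({1}) == Fraction(1, 6)                   # one face of six
assert prob({2, 4, 6}, {1, 2, 3}) == Fraction(1, 3)  # even given low
```

Exact rational arithmetic (`Fraction`) keeps the case counts visible rather than collapsing them into floating-point decimals.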
When we introduce probabilities through these equally likely cases, we have a special case of an inductive logic deductively definable just in the partition in which the equally likely cases arise.
Since probability is deductively definable in partitions with equally likely cases, we have as before that the symmetries of the deductive relations are also symmetries of the inductive relations. To pick a simple example, we know that the following two probabilities are the same:
P( ⚀ | ⚀ or ⚁ or ⚂ ) = P( ⚃ | ⚃ or ⚄ or ⚅ )
We can know this without computing the values of the probabilities. It follows simply because the probabilities are deductively definable in this partition; and the deductive relations between
⚀ and (⚀ or ⚁ or ⚂)
are the same as the relations between
⚃ and (⚃ or ⚄ or ⚅)
The situation is exactly the same as we saw earlier with the tiles. The first pair turns into the second merely by switching around labels:
We switch ⚀ and ⚃; and we switch ⚁ and ⚄; and we switch ⚂ and ⚅.
Just imagine that the dice are tiles with the numbers affixed as labels; we peel them off and swap them around, just as we did with the tiles.
The deductive symmetries will match the inductive symmetries only in inductively adapted partitions. That matching will fail if we choose a partition that is not inductively adapted. Here is a simple example. Instead of considering the usual six-outcome partition for a die throw, consider one in which we replace the outcomes (⚀ or ⚁ or ⚂) by "LOW." The atomic outcomes in the new partition of size 4 are:
LOW, ⚃, ⚄, ⚅
If we now mistakenly apply formula (S), we would conclude that

P( LOW | Ω ) = 1/4
P( ⚃ | Ω ) = 1/4

suggesting incorrectly that the two probabilities are the same. (Ω is just the set of all outcomes, (⚀ or ... or ⚅).)
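The mismatch can be made concrete with a quick count. This is a sketch (the helper name `count_prob` is mine): in the adapted six-atom partition, LOW covers three of six atoms, while blind atom counting in the four-atom partition yields the mistaken 1/4:

```python
from fractions import Fraction

def count_prob(favorable_atoms, total_atoms):
    # Formula (P) applied blindly: ratio of atom counts.
    return Fraction(favorable_atoms, total_atoms)

# Adapted 6-atom partition: LOW = three of the six faces.
assert count_prob(3, 6) == Fraction(1, 2)   # the correct value

# Non-adapted 4-atom partition {LOW, 4, 5, 6}: LOW is one atom of four.
assert count_prob(1, 4) == Fraction(1, 4)   # the mistaken value
```

The two results disagree precisely because the four-atom partition is not inductively adapted, so its atom counts carry no inductive authority.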
The example of probability theory also lets us illustrate the notion of asymptotic stability. The central idea is that the inductive strengths stabilize as the size of the inductively adapted partition is increased.
The simplest way to increase the size of the inductively adapted partitions is to tack on outcomes from a different space. Consider, for example, that we have a second die, whose outcomes are the six faces ⚀, ⚁, ⚂, ⚃, ⚄, ⚅.
The original partition has 6 atoms, so N=6. Its disjunctive refinement consists of the 36 pairs combining the six faces of the first die with the six faces of the second. It has N=36.
This refinement does not affect the probabilities assigned in the original partition. There we had the unconditional probability:

P( ⚀ ) = 1/6

We recover the same result in the N=36 sized refinement, writing P( ⚀, ⚁ ) for the probability that the first die shows ⚀ and the second shows ⚁:

P( ⚀ ) = P( ⚀, ⚀ ) + P( ⚀, ⚁ ) + P( ⚀, ⚂ ) + P( ⚀, ⚃ ) + P( ⚀, ⚄ ) + P( ⚀, ⚅ )
       = 1/36 + 1/36 + 1/36 + 1/36 + 1/36 + 1/36 = 1/6
Hence we see the strongest form of asymptotic stability for P( ⚀ ): the probability does not change at all.
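This stability check is easy to reproduce by brute force over the 36 pairs, under the assumption (from the text) that each pair is an atom of weight 1/36:

```python
from fractions import Fraction

# All 36 pairs (first die, second die), each an atom of weight 1/36.
pairs = [(i, j) for i in range(1, 7) for j in range(1, 7)]
p_atom = Fraction(1, 36)

# Probability that the first die shows 1, summed over the refinement.
p_one = sum(p_atom for i, j in pairs if i == 1)
assert p_one == Fraction(1, 6)   # same value as in the coarse partition
```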
This last example of asymptotic stability works quite well. But one might feel that it is a cheap example. The disjunctive refinement is formed by tacking on manifestly irrelevant outcomes to the original outcome of interest and it leads to no change in the original probability assignment. Here's an illustration that shows converging changes in probabilities as well as refinements that are of direct relevance to the content.
Consider a dartboard. At the simplest level of description, there are just two outcomes possible for the dart throw: a bull's eye "A" and the surrounding ring "B". These are the two atoms of a two-atom algebra. If this is an inductively adapted partition, the symmetry theorem tells us immediately that each of A and B gets the same inductive support on the background information Ω:

[A|Ω] = [B|Ω]

This might be a reasonable judgment, depending on the circumstances. However, in the two-atom algebra, it is the only judgment possible for a deductively defined logic of induction. If we are to discriminate between A and B, we will need to refine the algebra of propositions. We would expect that A gets less support than B merely because it is smaller and harder to hit. This needs to be expressible in the logic, and we can introduce such judgments by means of inductively adapted partitions.

Let us say that we assume the dart is thrown so that parts of equal area have an equal chance of being struck. This "equal area" rule is knowledge that must be drawn from elsewhere; we cannot find it within the algebra of propositions. We start to capture this extra knowledge by dividing the board up into smaller parts of roughly equal area. These smaller parts correspond to the atoms of an inductively adapted partition. There are 4 atoms comprising A and 32 atoms comprising B. So now it is quite possible for us to arrive at the judgment

[A|Ω] < [B|Ω]

If we use the probability formula (P), we arrive at the probabilities

P(A) = 4/36 = 1/9 = 0.111
P(B) = 32/36 = 8/9 = 0.889

Our external knowledge tells us that equal areas have equal probabilities, so these last probability assignments are not quite right. The difficulty is that the smaller parts above are not all of equal area. Some are larger and some smaller. We arrive at a better approximation by dividing the dartboard into smaller parts, as shown at right. We now have 16 atoms associated with A and 76 with B. This yields the probabilities

P(A) = 16/92 = 0.174
P(B) = 76/92 = 0.826

These values are getting closer to the probabilities given by the full physical description, which are

P(A) = 0.16
P(B) = 0.84

These final values are derived from the fact that the ratio of the bull's eye radius to that of the full board is 2 to 5, so P(A) = (2/5)^2 = 0.16. We continue introducing partitions with more parts, striving as much as possible to keep the areas of the parts equal. If we do that well enough, the inductive judgments arising from the application of formula (P) will stabilize on the physically motivated probabilities. That means that the associated sequence of inductively adapted partitions will be asymptotically stable.

The three figures above illustrate a sequence of inductively adapted partitions that yield probabilities that approach the physically motivated probabilities asymptotically. There is nothing internal to the logic that forces us to designate these particular partitions as inductively adapted. We must have knowledge from another source. In this case, that other source was the knowledge that parts of equal area were equally likely to be hit. At left we have a different refinement that, by the "equal area" criterion, is far from inductively adapted. All we can say about it is that it is a logically possible refinement. Since it is not adapted, we cannot use its atom counts in the rule (S) to assign inductive strengths.
Here's a second example of inductively adapted partitions that are asymptotically stable.