What Logics of Induction are There? John D. Norton

# 4. Deductively Definable Inductive Logics in Preferred Partitions

The no-go result arises because an inductive logic that is entirely defined in terms of deductive relations is too responsive. It tries to accommodate too many deductive structures, including all possible refinements of the deductive structure with which we might start.

We can still generate deductively definable logics of induction if we strengthen the logic to make it less responsive to all possible refinements. This amounts to adding inherently inductive suppositions to the definitions of the logic. To do this, we consider all possible disjunctive refinements of the propositions of our system. These we shall call the different "partitions" of the system. We declare that the degrees of inductive support may be deductively defined only in certain of these partitions but not others. The designation of these preferred partitions in which deductive definability holds is the addition of inherently inductive content. More precisely, we shall henceforth consider the following sorts of inductive logics.

An inductive logic is asymptotically stable and deductively definable in preferred partitions, if:

• There are "inductively adapted partitions" that form a proper subset of all possible partitions.

• The strengths of inductive support are definable in each of these inductively adapted partitions solely in terms of the deductive relations of that partition.

• There are inductively adapted partitions of arbitrarily large size.

• The inductive strength [A|B], for each pair of propositions A and B, stabilizes towards a unique value as the size of the partition grows. (asymptotic stability)

The result (S) derived earlier still obtains. However, it now holds only in the preferred partitions. It now says:

In inductively adapted partitions with N atoms, there exists a function fN such that

(S)   [A|B] = fN(#A&B, #A&~B, #~A&B)
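As a sketch of what (S) asserts (the helper `atom_counts` and the example atoms here are mine, not the author's): the strength [A|B] may depend on A and B only through three atom counts, so any two pairs of propositions with matching counts must receive the same strength, whatever the function fN may be.

```python
def atom_counts(A, B):
    """The three counts that, by (S), fix [A|B] in an inductively
    adapted partition: #A&B, #A&~B and #~A&B."""
    A, B = set(A), set(B)
    return len(A & B), len(A - B), len(B - A)

# Two pairs of propositions with the same three counts: whatever
# fN may be, (S) forces [A1|B1] = [A2|B2].
A1, B1 = {"a1", "a2"}, {"a2", "a3"}
A2, B2 = {"a4", "a5"}, {"a5", "a6"}
assert atom_counts(A1, B1) == atom_counts(A2, B2) == (1, 1, 1)
```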

## The Model of the Classical Approach to Probability Theory

This idea of introducing purely inductive suppositions by means of inductively adapted partitions may seem abstruse. However it is an idea that has been present since the beginning of formal work on inductive logic. It arose at the earliest moments in the invention of probability theory.

When probability theory was set up centuries ago, it did not start with a notion of probability, a numerical measure between 0 and 1. It started with a more primitive notion, the equally likely case. A familiar example is that of a die throw. By design, the equally likely cases are the six faces of the die, showing the outcomes 1 through 6.

A partition with these six outcomes as its atoms is an inductively adapted partition of a probabilistic logic of induction.

The probability of some outcome was then determined as a derived quantity by counting cases and computing the ratios of favorable to all cases.

Thus the probability of throwing a one was 1/6 since there is one favorable case out of the six total cases.

Or the probability of throwing an even outcome (2 or 4 or 6), given a "low" outcome (1 or 2 or 3), is 1/3 since there is one favorable outcome (the 2) among the three possible outcomes 1, 2 and 3.

If we write "#A" for the number of equally likely cases in outcome A, the general formula is

(P) P(A|B) = #A&B / #B = #A&B / (#A&B + #~A&B)
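The counting rule (P) is easy to run mechanically. A minimal sketch (the function name is mine), using exact fractions and the six-face die partition:

```python
from fractions import Fraction

def prob(A, B):
    """(P): the ratio of favorable to all cases,
    #A&B / #B  =  #A&B / (#A&B + #~A&B)."""
    A, B = set(A), set(B)
    return Fraction(len(A & B), len(B))

die = {1, 2, 3, 4, 5, 6}            # the six equally likely cases

print(prob({1}, die))               # 1/6: one favorable case of six
print(prob({2, 4, 6}, {1, 2, 3}))   # 1/3: "even" given "low"
```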

This formula should look familiar. It is just a special case of the formula (S). So we might rewrite it more explicitly as

P(A|B) = fN(#A&B, #A&~B, #~A&B)
= f6(#A&B, 0, #~A&B) = #A&B / (#A&B + #~A&B)

When we introduce probabilities through these equally likely cases, we have a special case of an inductive logic deductively definable just in the partition in which the equally likely cases arise.

## Inductive symmetries of probability theory

Since probability is deductively definable in partitions with equally likely cases, we have as before that the symmetries of the deductive relations are also symmetries of the inductive relations. To pick a simple example, we know that the following two probabilities are the same:

P( 2 | 1 or 2 or 3 ) = P( 5 | 4 or 5 or 6 )

We can know this without computing the values of the probabilities. It follows simply because the probabilities are deductively definable in this partition; and the deductive relations between

2 and (1 or 2 or 3)

are the same as the relations between

5 and (4 or 5 or 6)

The situation is exactly the same as we saw earlier with the tiles. The first pair turns into the second merely by switching around labels:

We switch 1 and 4;
and we switch 2 and 5;
and we switch 3 and 6.

Just imagine that the dice are tiles with the numbers affixed as labels; we peel them off and swap them around, just as we did with the tiles.
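The label-swapping argument can be checked mechanically. In this sketch (the particular propositions are my own choice), the permutation swapping 1 with 4, 2 with 5 and 3 with 6 carries one conditional probability onto another; since the swap preserves all the case counts, the two values must agree:

```python
from fractions import Fraction

def prob(A, B):
    """Case-counting rule (P): ratio of favorable to all cases."""
    A, B = set(A), set(B)
    return Fraction(len(A & B), len(B))

swap = {1: 4, 2: 5, 3: 6, 4: 1, 5: 2, 6: 3}

A, B = {2}, {1, 2, 3}              # "two" given "low"
A2 = {swap[x] for x in A}          # {5}
B2 = {swap[x] for x in B}          # {4, 5, 6}

# The swap preserves case counts, so the probabilities coincide.
assert prob(A, B) == prob(A2, B2) == Fraction(1, 3)
```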

## Partitions that are not inductively adapted

The deductive symmetries will match the inductive symmetries only in inductively adapted partitions. That matching will fail if we choose a partition that is not inductively adapted. Here is a simple example. Instead of considering the usual six-outcome partition for a die throw, consider one in which we replace the outcomes (1 or 2 or 3) by "LOW." The atomic outcomes in the new partition of size 4 are:

LOW, 4, 5, 6

If we now mistakenly apply formula (S), we would conclude that

P( LOW | Ω ) = 1/4
P( 4 | Ω ) = 1/4

suggesting incorrectly that the two probabilities are the same. (Ω is just the set of all outcomes, LOW or 4 or 5 or 6.)
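The mismatch can be checked directly by running the case-counting rule on both partitions (the helper name is mine; treating the coarsened atoms as equally likely is exactly the mistake at issue):

```python
from fractions import Fraction

def prob(A, B):
    """Ratio of favorable to all counted cases. This is a genuine
    probability only when the cases counted are equally likely."""
    A, B = set(A), set(B)
    return Fraction(len(A & B), len(B))

adapted = {1, 2, 3, 4, 5, 6}        # inductively adapted partition
coarsened = {"LOW", 4, 5, 6}        # LOW replaces (1 or 2 or 3)

# Naive counting in the coarsened partition: both come out 1/4 ...
assert prob({"LOW"}, coarsened) == prob({4}, coarsened) == Fraction(1, 4)

# ... but counting in the adapted partition gives the correct values:
assert prob({1, 2, 3}, adapted) == Fraction(1, 2)   # LOW
assert prob({4}, adapted) == Fraction(1, 6)
```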

## Asymptotic stability illustrated

The example of probability theory also lets us illustrate the notion of asymptotic stability. The central idea is that the inductive strengths stabilize as the size of the inductively adapted partition is increased.

The simplest way to increase the size of the inductively adapted partitions is to tack on outcomes from a different space. Consider, for example, that we have a second die, whose outcomes are again the six faces 1 through 6.

The original partition has 6 atoms, so N=6. Its disjunctive refinement consists of the 36 paired combinations of the six faces of the first die with the six faces of the second. It has N=36.

This refinement does not affect the probabilities assigned in the original partition. There we had the unconditional probability:

P( 1 ) = 1/6

We recover the same result in the N=36 sized refinement:

P( 1 )
= P( 1 & 1 ) + P( 1 & 2 ) + P( 1 & 3 ) + P( 1 & 4 ) + P( 1 & 5 ) + P( 1 & 6 )
= 1/36 + 1/36 + 1/36 + 1/36 + 1/36 + 1/36 = 1/6

where the first number in each conjunction is the outcome of the first die and the second is the outcome of the second die.

Hence we see the strongest form of asymptotic stability for P( 1 ): the probability does not change at all.

## Asymptotic stability illustrated again

This last example of asymptotic stability works quite well. But one might feel that it is a cheap example. The disjunctive refinement is formed by tacking manifestly irrelevant outcomes onto the original outcome of interest, and it leads to no change in the original probability assignment. Here is an illustration that shows converging changes in the probabilities, under refinements that are directly relevant to the content.