Quantitative rules

The product rule

Let's find the plausibility of the expression \(AB \, | \, C\). We can break down the decision process into elementary decisions about \(A\) and \(B\) separately. The robot can

  1. Decide that \(B\) is true \((B \, | \, C)\)
  2. Having accepted \(B\) as true, decide that \(A\) is true \((A \, | \, BC)\)

or, equally well,

  1. Decide that \(A\) is true \((A \, | \, C)\)
  2. Having accepted \(A\) as true, decide that \(B\) is true \((B \, | \, AC)\)

In order for \(AB\) to be a true proposition, it is necessary that \(B\) be true, so the robot first evaluates the plausibility \(B \, | \, C\). If \(B\) is true, it is further necessary that \(A\) be true, so the plausibility \(A \, | \, BC\) is also needed. However, if \(B\) is false, then \(AB\) is false regardless of \(A\). The plausibility of \(A\) is relevant only if \(B\) is true; therefore, once the robot has \(B \, | \, C\) and \(A \, | \, BC\), it does not need \(A \, | \, C\).

In the same way, \(A \, | \, B\) and \(B \, | \, A\) are not needed: whatever plausibility \(A\) or \(B\) might have in the absence of information \(C\) could not be relevant to judgements of a case in which the robot knows that \(C\) is true.

In more definite form, taking into account desideratum (IIIa), \((AB \, | \, C)\) will be some function of \(B \, | \, C\) and \(A \, | \, BC\):

$$(AB \, | \, C) = F[(B \, | \, C), (A \, | \, BC)] \tag{2.1}$$

Notice that

$$(AB \, | \, C) = F[(A \, | \, C), (B \, | \, C)] \tag{2.2}$$

is not a permissible form because of desideratum (II): proposition \(A\) might be very plausible given \(C\), and \(B\) might be very plausible given \(C\), yet \(AB\) could still be either very plausible or very implausible.
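As a concrete illustration (a hypothetical coin-toss example, stated in terms of ordinary probabilities only for the sake of a numerical check), consider a single toss of a fair coin. If \(A\) and \(B\) both assert "heads", then \(A\), \(B\) and \(AB\) are all equally plausible; if \(A\) asserts "heads" and \(B\) asserts "tails", the plausibilities of \(A\) and \(B\) separately are unchanged, yet \(AB\) is impossible. No function of \((A \, | \, C)\) and \((B \, | \, C)\) alone can distinguish the two cases:

```python
# Hypothetical illustration: two situations with identical plausibilities for
# (A|C) and (B|C) but different plausibilities for (AB|C), showing that no
# function F[(A|C), (B|C)] can determine (AB|C).  For concreteness the
# plausibilities are taken as probabilities over equally likely outcomes.

def plausibilities(outcomes, A, B):
    """Return P(A), P(B), P(AB) for equally likely outcomes."""
    n = len(outcomes)
    pA = sum(A(o) for o in outcomes) / n
    pB = sum(B(o) for o in outcomes) / n
    pAB = sum(A(o) and B(o) for o in outcomes) / n
    return pA, pB, pAB

coin = ["heads", "tails"]

# Case 1: A and B are the same proposition ("the coin shows heads").
print(plausibilities(coin, lambda o: o == "heads", lambda o: o == "heads"))
# -> (0.5, 0.5, 0.5)

# Case 2: A and B are mutually exclusive ("heads" versus "tails").
print(plausibilities(coin, lambda o: o == "heads", lambda o: o == "tails"))
# -> (0.5, 0.5, 0.0)
```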

We can also test all the other possibilities. Denote the relevant plausibilities by the real numbers

$$u = (AB \, | \, C), \quad v = (A \, | \, C), \quad w = (B \, | \, AC), \quad x = (B \, | \, C), \quad y = (A \, | \, BC)$$

and subject them to various extreme conditions. Tribus (Rational Descriptions, Decisions and Designs, 1969) has shown that only two of the possibilities survive: \(u = F(x,y)\) and \(u = F(w,v)\).

\(F(x,y)\) must be a continuous monotonic increasing function of both \(x\) and \(y\); we assume it is also differentiable. According to desideratum (IIIa) of structural consistency, the robot's reasoning must satisfy the functional equation

$$F[F(x,y),z] = F[x,F(y,z)] \tag{2.13}$$

This associativity equation was proved by Aczel (A Short Course on Functional Equations, 1987) without assuming differentiability; however, the proof given by Cox (The Algebra of Probable Inference, 1961), which assumes differentiability, is shorter.
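To connect the associativity equation with the product rule below, here is a brief sketch of the standard solution (the function \(w\) introduced here is the same one appearing in the product rule). The general continuous monotonic solution of (2.13) can be written in terms of a positive continuous monotonic function \(w\) as

$$F(x,y) = w^{-1}[\,w(x)\,w(y)\,]$$

Applying \(w\) to both sides of \(u = F(x,y)\), with \(u = (AB \, | \, C)\), \(x = (B \, | \, C)\) and \(y = (A \, | \, BC)\), gives \(w(u) = w(x)\,w(y)\), which is exactly the form obtained next.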

Solving this functional equation, one finds that the associativity of the logical product requires the relation we seek to take the functional form

$$w(AB \, | \, C) = w(A \, | \, BC) w(B \, | \, C) = w(B \, | \, AC) w(A \, | \, C)$$

which we call the product rule. Here \(w(x)\) must be a positive continuous monotonic function. Certainty is represented by \(w(A \, | \, C) = 1\) and impossibility by \(w(A \, | \, C) = 0\). We adopt the range \(0 \le w(x) \le 1\) as a convention; this choice loses no generality, and all possibilities allowed by the desiderata are included in this form.
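As a quick sanity check (a hypothetical worked example in which \(w\) is taken to be ordinary probability on a small discrete model, an identification justified only later in the development), the product rule can be verified numerically for a fair die roll:

```python
# Hypothetical check of the product rule, with w taken as ordinary probability.
# C = "a fair six-sided die is rolled", B = "the result is even", A = "the result is 2".

outcomes = set(range(1, 7))    # the six equally likely results given C
A = {2}                        # "the result is 2"
B = {2, 4, 6}                  # "the result is even"

def w(event, given=outcomes):
    """Probability of `event` restricted to the conditioning set `given`."""
    return len(event & given) / len(given)

lhs  = w(A & B)                # w(AB | C)
rhs1 = w(A, given=B) * w(B)    # w(A | BC) w(B | C)
rhs2 = w(B, given=A) * w(A)    # w(B | AC) w(A | C)

print(lhs, rhs1, rhs2)         # all three equal 1/6
```

Of course, at this stage of the derivation \(w\) is still an arbitrary monotonic function; its identification with probability rests on the sum rule and the normalization convention developed next.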

The sum rule