and Kahneman. The most conspicuous indication of its fashionableness is that Kahneman received the 2002 Nobel prize in economics for the theory he had developed with Tversky. (See Kahneman et al., 1982; Moser 1990; Piattelli-Palmarini 1994; Tversky and Kahneman 1983.)

Socratic Epistemology 202

Our results also have implications for the evaluation of Quine's philosophy. It is often said that his main insight is that factual meaning and conceptual assumptions cannot be disentangled from each other. From one perspective, what has been found might look like a vindication of Quine's position. For what we have seen is that in trying to measure factual information, we cannot eliminate the influence of a priori assumptions. I am in fact prepared to give Quine credit for having perceived this unavoidable role of prior assumptions in our factual judgments. But at the same time we have to realize that, if so, Quine has both misidentified the nature of the prior assumptions in question and misjudged the way in which his insight is to be implemented. As to the first point, what has been seen shows that the prior assumptions in question are factual rather than linguistic or otherwise conceptual. They are naturally coded by means of the choice of one's conceptual base, in that they typically appear in the form of symmetry assumptions, as in the classical formula (3). But this does not make these assumptions themselves conceptual.

As to the question of how the inextricability of conceptual and factual information is to be dealt with, Quine's recommendation simply to give up the distinction between factual and conceptual is a counsel of theoretical despair. Even the true nature of the very fact of inextricability cannot be adequately understood without a theoretical apparatus that involves even finer distinctions than the one Quine proposes to abolish. Even though Quine's ideas seem to be motivated by a valid insight, the way he has sought to implement it has been injurious to the course of analytic philosophy. We need a far sharper logical analysis of the problem situation than Quine was capable of.

How, then, can we define measures of surface information? In principle, we can try to work along the same lines as in the case of depth information; in other words, to use the weighted number of constituents ruled out as a guideline. It is not easy, however, to find simple theoretically motivated measures of this kind. One complicating factor is the relativity of such measures to one's conceptual basis. As an illustration, consider a derivation of G from F. This can increase one's surface information, and that increase is measured by the sum of the weights of the constituents we have to eliminate to demonstrate the consequence relation. But constituents with what basis? Those involving all the non-logical constants occurring in F and in G together? But such measures may be changed by adding redundant parts to F or to G. It turns out that it suffices to consider constituents whose non-logical vocabulary is shared by F and G. But that basis can then be different from that of the normal forms of both F and G. Moreover, suitable interpolation theorems can bring out those constituents that have to be eliminated from the reduct of the normal form of F to the common vocabulary in order to see the consequence relation, and similarly for G. Thus we have many interesting conceptual possibilities here, but no simple measures of surface information.
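The relativity to one's basis can be made concrete in a small sketch (my illustration, not the author's formalism). If a formula is identified, crudely, with the set of its non-logical symbols, then padding F with a redundant conjunct in fresh vocabulary disturbs a measure built on all the symbols of F and G, but not one built on their shared vocabulary:

```python
def union_basis(f_vocab, g_vocab):
    """Basis using all non-logical symbols occurring in F and G together."""
    return f_vocab | g_vocab

def shared_basis(f_vocab, g_vocab):
    """Basis using only the non-logical symbols common to F and G."""
    return f_vocab & g_vocab

# Toy formulas, identified with their sets of non-logical symbols.
F = {"P", "Q"}
G = {"Q", "R"}

# Pad F with a redundant conjunct, e.g. (S(x) v ~S(x)), in fresh vocabulary.
F_padded = F | {"S"}

# The union basis is disturbed by the redundant addition...
assert union_basis(F_padded, G) != union_basis(F, G)
# ...but the shared basis is not.
assert shared_basis(F_padded, G) == shared_basis(F, G)
```

This is only the stability point of the paragraph above; it says nothing about how the constituents on the shared basis are to be weighted.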

In such circumstances, it is tempting to take an altogether different tack. Instead of trying to measure the surface information of a proposition in terms of the “impossible possibilities” it rules out, one can try to use the complexity of the elimination process of these possibilities as the measure of their surface information. In other words, the weight of an eliminable possibility depends on the amount of work (which one can, for instance, think of as Kopfarbeit, or as computer time) that it takes to rule it out.
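A minimal sketch of this proposal, under my own simplifying assumptions (propositional clause logic in place of constituents, and a naive backtracking check standing in for the elimination process): the "work" of ruling out an impossible possibility is taken to be the number of search nodes visited before the contradiction is detected.

```python
def refutation_work(clauses):
    """Run a naive backtracking consistency check and count search nodes.

    clauses: iterable of sets of nonzero ints, where -v negates v.
    Returns (satisfiable, nodes_visited); for an unsatisfiable set,
    nodes_visited is the toy measure of the work of ruling it out.
    """
    nodes = 0

    def solve(assignment):
        nonlocal nodes
        nodes += 1
        remaining = []
        for clause in clauses:
            if any(lit in assignment for lit in clause):
                continue                      # clause already satisfied
            reduced = {l for l in clause if -l not in assignment}
            if not reduced:
                return False                  # clause falsified: backtrack
            remaining.append(reduced)
        if not remaining:
            return True                       # every clause satisfied
        var = abs(next(iter(remaining[0])))   # branch on an unassigned variable
        return solve(assignment | {var}) or solve(assignment | {-var})

    sat = solve(frozenset())
    return sat, nodes
```

On this measure, two inconsistent clause sets that rule out the same possibilities can still carry different weights, which is exactly the feature (and the danger) of a complexity-based notion of information.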

Who Has Kidnapped Information? 203

In order to weight inconsistent constituents in a theoretically (and practically) interesting way, we would have to have some insight into the structural reasons for their inconsistency. In particular, it might be relevant to anticipate how adding to the number of individuals we are considering together, which means adding to the depth of a constituent, affects the situation. But at this moment, neither logical nor computational theory seems to offer any real insights into the behavior of constituents. The only crude parameter of interest here seems to be the depth at which an inconsistent constituent turns out to be trivially inconsistent.

At this point, the reader is supposed to have another déjà vu experience. What was just recommended as a possible measure of surface information is, roughly speaking, the same notion of information as is used by complexity theorists. What we have thus reached is a vantage point from which this third main type of information can be seen to occupy a legitimate conceptual niche. We might call this kind of notion of information “computational.”

At the same time, we can see that such “information” is quite different from our usual ideas of information and should be kept strictly apart from them for the sake of clarity. For one thing, it is related only to surface information, not depth information. It behaves in many ways differently from both these senses of information. For instance, if an inconsistent constituent is deep, it represents a narrow possibility, and hence its elimination does not, in any non-computational sense, add much to our surface information. Because of its depth, it may nevertheless be computationally cumbersome to eliminate, and hence it will be assigned a large measure of computational information.

Another anomaly is that computational information does not depend only on the proposition to which the concept is applied. It depends also on the method of logical proof. This dependence can be neutralized in some cases by choosing a suitable normalized proof method. But when there is no theoretically privileged proof method, this dependence may lead, and has led, to nonsensical results. As an example, we can consider Raatikainen's (1998a and 1998b) analysis of Chaitin's measures of information in formalized elementary number theory. There, the information of a proposition (coded in its Gödel number) does not depend only, or even roughly, on the proposition itself. It depends crucially on the way in which Turing machines are coded into the elementary arithmetic in question. By changing this code, a given proposition can be assigned any arbitrarily chosen value. Obviously, in such contexts, the term “information” is used, to put the point charitably, in a Pickwickian sense.

Suitably defined measures of computational information can nevertheless be used in logical theorizing, if handled properly. For instance, in a formal system, the computational information of a formula F (as a function d(g(F)) of the Gödel number g(F) of F) can be characterized as the depth to which one has to go in order to prove its inconsistency (if it is inconsistent). Such a notion of computational information offers a way to discuss the decision problem of the system in question. For instance, the unsolvability of the decision problem for the system in question means that for any recursive function r(x) we have d(g(y)) > r(y) from some y₀ on (that is, for all y > y₀).
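For a toy analogue of such a depth function d (my construction; the d of the text concerns constituents and formalized arithmetic, not propositional logic), one can take d(F) to be the first saturation level at which resolution produces the empty clause:

```python
from itertools import combinations

def inconsistency_depth(clauses, max_depth=10):
    """Least resolution-saturation level at which the empty clause appears.

    clauses: iterable of sets of nonzero ints, where -v negates v.
    Returns None if no refutation is found by max_depth (or the set is
    consistent), mirroring the proviso "if it is inconsistent".
    """
    level = {frozenset(c) for c in clauses}
    for depth in range(1, max_depth + 1):
        new = set(level)
        for a, b in combinations(level, 2):
            for lit in a:
                if -lit in b:                       # complementary pair: resolve
                    new.add((a - {lit}) | (b - {-lit}))
        if frozenset() in new:
            return depth                            # refutation found at this depth
        if new == level:
            return None                             # saturated without refutation
        level = new
    return None
```

On this toy measure, harder inconsistencies receive larger depth values; the unsolvability point above then says that for an undecidable system no recursive function can bound the analogous d from above.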

Applied to a given system of elementary number theory, since any method of computation in it corresponds to a recursive function, this means that from some point y₀ on we can no longer calculate the computational information of sentences with a Gödel number greater than y₀. This result is a restatement of Chaitin's well-known Limiting Theorem. It sounds like a surprising and deep result, but in reality it is only a trivial (and misleading) restatement of the unsolvability of the decision problem for the system in question.

Writers such as Chaitin are misleading their readers by using the term “information” as if it were meant in something like our everyday sense, when in reality it means, in his results and his colleagues' results, something entirely different. As a result, their approach to problems such as those prompted by Gödel's incompleteness results has led only to obfuscation of the issues. I am appalled that philosophers have given up their critical mission so completely as not to have pointed out this shady practice.

Instead of such misleading popularization, students of computational information might have been well advised to take a closer look at what their sense of information depends on in those applications in which it is possible to compare it with other senses of information. Here I can only point out one important distinction. An attempt to prove that G follows logically from F in first-order logic can be thought of as an experimental attempt to construct a countermodel in which F is true and G false. This attempt is guided by the sets of formulas A so far reached that are intended to be true in the hypothetical countermodel. Apart from propositional rules, there are essentially two kinds of applications of rules of inference that contribute to the complexity of the proof. There are (i) existential instantiations that introduce new individuals into the attempted countermodel, and (ii) applications of the rule of universal instantiation to formulas in A with respect to all the permutations of names of individuals already in the approximation to a countermodel so far reached. The number of new formulas that can be added at any stage by (i) is at most the number of existentially quantified formulas in A, while the number of new formulas that can be introduced by (ii) can be of the order of n^a, where a is the number of all the different universal quantifiers in A and n is the number of names that have not yet been substituted for universally quantified variables. Thus, universal instantiation is likely to introduce many more new formulas and in this way increase the computational complexity of the theorem-proving process more quickly than existential