the stream of random bits, and observe the behavior of the algorithm. The reader may wish to visualize σ as a finite path in an infinite binary tree, where we start at the root, branching to the left if the next bit in σ is a 0 bit, and branching to the right if the next bit in σ is a 1 bit. In this context, we call σ an execution path. Some further terminology will be helpful:

• if A halts in at most |σ| steps, then we call σ a complete execution path;
• if A halts in exactly |σ| steps, then we call σ an exact execution path;
• if A does not halt in fewer than |σ| steps, then we call σ a partial execution path.
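The three cases above can be captured in a short sketch. This is purely illustrative: the function and its arguments are hypothetical, and we summarize a run of A simply by the number of steps after which it halts (None if it never halts when driven by the given bit stream).

```python
def classify(sigma_len, halt_steps):
    """Classify an execution path sigma of length sigma_len, where
    halt_steps is the number of steps after which A halts when driven
    by sigma's bit stream (None if A never halts on it)."""
    labels = []
    if halt_steps is not None and halt_steps <= sigma_len:
        labels.append("complete")   # A halts in at most |sigma| steps
    if halt_steps is not None and halt_steps == sigma_len:
        labels.append("exact")      # A halts in exactly |sigma| steps
    if halt_steps is None or halt_steps >= sigma_len:
        labels.append("partial")    # A does not halt in fewer than |sigma| steps
    return labels

print(classify(5, 3))   # ['complete']
print(classify(5, 5))   # ['complete', 'exact', 'partial']
print(classify(5, 8))   # ['partial']
```

Note that, under these definitions, an exact execution path is both complete and partial.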

The sample space S of the probability distribution associated with A on input x consists of all exact execution paths. Clearly, S is prefix free; that is, no string in S is a proper prefix of another.

Theorem 7.1. If S is a prefix-free set of bit strings, then Σ_{σ∈S} 2^{−|σ|} ≤ 1.

Proof. We first claim that the theorem holds for any finite prefix-free set S. We may assume that S is non-empty, since otherwise, the claim is trivial. We prove the claim by induction on the sum of the lengths of the elements of S. The base case is when S contains just the empty string, in which case the claim is clear. If S contains non-empty strings, let τ be a string in S of maximal length, and let τ′ be the prefix of length |τ| − 1 of τ. Now remove from S all strings which have τ′ as a prefix (there are either one or two such strings), and add to S the string τ′. It is easy to see (verify) that the resulting set S′ is also prefix-free, and that

Σ_{σ∈S} 2^{−|σ|} ≤ Σ_{σ∈S′} 2^{−|σ|}.

The claim now follows by induction.

For the general case, let σ1, σ2, ... be a particular enumeration of S, and consider the partial sums Si := Σ_{j=1}^{i} 2^{−|σj|} for i = 1, 2, .... From the above claim, each of these partial sums is at most 1, from which it follows that lim_{i→∞} Si ≤ 1. □
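As a quick sanity check on the theorem, one can compute the sum Σ_{σ∈S} 2^{−|σ|} exactly for small prefix-free sets. The helper names below are ours, not the text's; this is a sketch, not part of the formal development.

```python
from fractions import Fraction

def is_prefix_free(S):
    # no string in S is a proper prefix of another
    return not any(a != b and b.startswith(a) for a in S for b in S)

def path_sum(S):
    # the quantity bounded by Theorem 7.1, computed exactly
    return sum(Fraction(1, 2 ** len(sigma)) for sigma in S)

S = {"1", "01", "001", "000"}
assert is_prefix_free(S)
print(path_sum(S))            # 1: here the bound is attained

T = {"1", "01", "000"}        # drop "001": the sum falls below 1
print(path_sum(T))            # 7/8
```

Dropping a string can only decrease the sum, while inserting any string that keeps the set prefix-free can never push it past 1.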

7.1 Basic definitions 151

From the above theorem, if S is the sample space associated with algorithm A on input x, we have

S := Σ_{σ∈S} 2^{−|σ|} ≤ 1.

Assume that S = 1. Then we say that A halts with probability 1 on input x, and we define the distribution DA,x associated with A on input x to be the distribution on S that assigns the probability 2^{−|σ|} to each bit string σ ∈ S. We also define TA(x) and A(x) as random variables on the distribution DA,x in the natural way: for each σ ∈ S, we define TA(x) to be |σ| and A(x) to be the output produced by A on input x using σ to drive its execution.
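To make the distribution DA,x concrete, consider a toy algorithm that reads bits until it sees a 1 and outputs the number of 0s read; for simplicity we equate "steps" with bits consumed. Enumerating short strings recovers the exact execution paths and their probabilities. The code and all names in it are our illustration, not the text's.

```python
from fractions import Fraction

def run_toy(bits):
    """Toy algorithm: read bits until a 1 appears; output the number of 0s.
    Returns (steps, output) with steps = bits consumed, or None if the
    finite string bits is exhausted before the algorithm halts."""
    for t, b in enumerate(bits, start=1):
        if b == "1":
            return t, t - 1
    return None

# Collect exact execution paths of length <= 6; each gets probability 2^{-|sigma|}.
dist = {}
for k in range(1, 7):
    for n in range(2 ** k):
        sigma = format(n, "0{}b".format(k))
        r = run_toy(sigma)
        if r is not None and r[0] == k:      # sigma is an exact execution path
            dist[sigma] = Fraction(1, 2 ** k)

print(sorted(dist))          # ['000001', '00001', '0001', '001', '01', '1']
print(sum(dist.values()))    # 63/64 of the mass so far, approaching 1
```

Here TA(x) restricted to these paths takes the value |σ| with probability 2^{−|σ|}, and the missing 1/64 of mass corresponds to longer exact paths not yet enumerated.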

All of the above definitions assumed that A halts with probability 1 on input x, and indeed, we shall only be interested in algorithms that halt with probability 1 on all inputs. However, to analyze a given algorithm, we still have to prove that it halts with probability 1 on all inputs before we can use these definitions and bring to bear all the tools of discrete probability theory. To this end, it is helpful to study various finite probability distributions associated with the execution of A on input x. For every integer k ≥ 0, let us consider the uniform distribution on bit strings of length k, and for each j = 0, ..., k, define H_j^{(k)} to be the event that such a random k-bit string causes A on input x to halt within j steps.

A couple of observations are in order. First, if S is the set of all exact execution paths for A on input x, then we have (verify)

P[H_j^{(k)}] = Σ_{σ∈S, |σ|≤j} 2^{−|σ|}.

From this it follows that for all non-negative integers j, k, k′ with j ≤ min{k, k′}, we have

P[H_j^{(k)}] = P[H_j^{(k′)}].

(k)

De¬ning Hk := P[Hk ], it also follows that the sequence {Hk }k≥0 is non-

decreasing and bounded above by 1, and that A halts with probability 1 on

input x if and only if

lim Hk = 1.

k’∞

A simple necessary condition for halting with probability 1 on a given input is that for all partial execution paths, there exists some extension that is a complete execution path. Intuitively, if this does not hold, then with some non-zero probability, the algorithm falls into an infinite loop. More formally, if there exists a partial execution path of length j that cannot be extended to a complete execution path, then for all k ≥ j we have

H_k ≤ 1 − 2^{−j}.

This does not, however, guarantee halting with probability 1. A simple sufficient condition is the following:

There exists a bound ℓ (possibly depending on the input) such that for every partial execution path σ, there exists a complete execution path that extends σ and whose length is at most |σ| + ℓ.

To see why this condition implies that A halts with probability 1, observe that if A runs for kℓ steps without halting, then the probability that it does not halt within (k+1)ℓ steps is at most 1 − 2^{−ℓ}. More formally, let us define H̄_k := 1 − H_{kℓ}, and write H̄_j^{(m)} for the complement of the event H_j^{(m)}. Then for all k ≥ 0, we have

H̄_{k+1} = P[H̄_{(k+1)ℓ}^{((k+1)ℓ)}]
         = P[H̄_{(k+1)ℓ}^{((k+1)ℓ)} | H̄_{kℓ}^{((k+1)ℓ)}] · P[H̄_{kℓ}^{((k+1)ℓ)}]
         ≤ (1 − 2^{−ℓ}) · P[H̄_{kℓ}^{((k+1)ℓ)}]
         = (1 − 2^{−ℓ}) · H̄_k,

where the last equality uses the identity P[H_j^{(k)}] = P[H_j^{(k′)}] noted above. Hence (by an induction argument on k), we have

H̄_k ≤ (1 − 2^{−ℓ})^k,

from which it follows that

lim_{k→∞} H_k = 1.

It is usually fairly straightforward to verify this property for a particular algorithm "by inspection."
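Writing ℓ for the bound in the sufficient condition, the probability of running kℓ steps without halting is at most (1 − 2^{−ℓ})^k, which decays geometrically; this is what forces the halting probability to 1. A quick numeric illustration (the choice ℓ = 4 is arbitrary):

```python
# The tail bound (1 - 2**-ell)**k from the argument above, for ell = 4.
ell = 4
for k in (1, 10, 100, 1000):
    print(k, (1 - 2.0 ** -ell) ** k)
# The bound tends to 0 as k grows, so H_k tends to 1.
```

Even a weak per-window escape probability of 2^{−ℓ} is enough, because the windows are independent in the sense captured by the conditioning argument above.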

It is usually fairly straightforward to verify this property for a particular algorithm "by inspection."

Example 7.1. Consider the following algorithm:

repeat
    b ←R {0, 1}
until b = 1

Since each loop iteration consists of only a constant number of instructions, and since there is one chance to terminate with every loop iteration, the algorithm halts with probability 1. □
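A Monte Carlo check of Example 7.1 (our own sketch, not part of the text): every run of the simulation below terminates, and the average number of iterations comes out close to 2, the mean of the geometric distribution governing the loop.

```python
import random

def example_7_1(rng):
    """One run of the Example 7.1 loop; returns the number of iterations."""
    t = 0
    while True:
        t += 1
        if rng.getrandbits(1) == 1:   # b <-R {0,1}; stop when b = 1
            return t

rng = random.Random(7)          # fixed seed for reproducibility
runs = [example_7_1(rng) for _ in range(100_000)]
print(sum(runs) / len(runs))    # close to the exact expectation, 2
```

Each iteration consumes exactly one random bit, so the run's iteration count equals the length of its exact execution path.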

Example 7.2. Consider the following algorithm:

i ← 0
repeat
    i ← i + 1
    s ←R {0, 1}^{×i}
until s = 0^{×i}

For positive integer n, consider the probability p_n of executing at least n loop iterations (each p_n is defined using an appropriate finite probability distribution). We have

p_n = Π_{i=1}^{n−1} (1 − 2^{−i}) ≥ Π_{i=1}^{n−1} e^{−2^{−i+1}} = e^{−Σ_{i=0}^{n−2} 2^{−i}} ≥ e^{−2},

where we have made use of the estimate (iii) in §A1. As p_n does not tend to zero as n → ∞, we may conclude that the algorithm does not halt with probability 1.

Note that every partial execution path can be extended to a complete execution path, but the length of the extension is not bounded. □
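The bound on p_n in Example 7.2 is easy to check numerically (again our sketch): the exact partial products stay well above e^{−2} ≈ 0.135, and in fact converge to roughly 0.2888, the value of the infinite product Π_{i≥1}(1 − 2^{−i}).

```python
import math
from fractions import Fraction

def p(n):
    """p_n = prod_{i=1}^{n-1} (1 - 2^{-i}): the probability of executing
    at least n loop iterations in Example 7.2, computed exactly."""
    out = Fraction(1)
    for i in range(1, n):
        out *= 1 - Fraction(1, 2 ** i)
    return out

for n in (2, 5, 20):
    print(n, float(p(n)))
print(math.exp(-2))   # the lower bound from the text, ~0.1353
```

So the algorithm runs forever with probability about 0.29, comfortably above the e^{−2} guaranteed by the estimate.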

The following three exercises develop tools which simplify the analysis of