describe turbulent fluids with limited predictability (i.e., small perturbations grow quickly, so two paths through phase space diverge quickly).

2.6.2 Continuous Random Variable. We have expanded the concept of the sample space S to the concept of a phase space S. We must also expand the concept of the probability rule, P(·), used to compute the probability of events, by converting P(·) into a function that measures the relative size of an event.

The way events are measured is not uniform because measurements must reflect the likelihood of events. For example, let T represent temperature at a northern midlatitude location in January, and consider events A and B, where A = {T ∈ (−5, 5)°C} and B = {T ∈ (30, 40)°C}. Both A and B describe 10°C temperature ranges but P(A) ≠ P(B); that is, the probability measure of these events is not the same.

Now assume that we are able to observe temperature on a continuous scale (i.e., that the intervening instruments do not discretize the observed temperature) and consider the event C = {T = 0.48°C}. This event challenges our intuition because P(C) = 0. Why? Consider the sequence of events

C_k = {T ∈ (0.48 − 1/k, 0.48 + 1/k)°C}.

Note that lim_{k→∞} C_k = C and that the event C_{k+1} is a subset of C_k, or in mathematical terms C_1 ⊃ C_2 ⊃ ···. Therefore

P(C_1) > P(C_2) > ···.

Intuitively, we see that, for large k, the probability of event C_k is proportional to k^{−1}. It follows that P(C) = 0.

Let us consider another situation. Assume that the probability measure is continuous and that there is a point x and an ε > 0 such that

P(X = x) = 2ε.

Then, because of continuity, there must exist a δ > 0 such that, for all y with |x − y| < δ,

P(X = y) > ε.

Now, if we choose n > 1/ε points x_1, ..., x_n such that |x − x_i| < δ, we obtain the contradiction that P(X ∈ {x_1, ..., x_n}) > 1.

While counterintuitive, the result is reasonable; the chance of observing a specific value is zero because innumerable different values can occur.

Finally, a continuous random variable is defined as follows:

Let S be a phase space and let P(·) be a continuous probability measure on S. Then a continuous random variable X is a continuous function of S that takes values in an interval I ⊆ R, the real line, in such a way that

1 P(X ∈ A) ≥ 0 for all A ⊆ I,
2 P(X ∈ I) = 1.

2.6.3 The Probability Density and Distribution Functions. Events described in terms of continuous random variables are expressed as open intervals on the real line, R, and the probability of an event is expressed as the integral of a probability density function (pdf) taken over the interval that describes the event. In theory, the density function is derived from the definition of the random variable and the probability measure P(·). In practice, we will use intuition and simple mathematical arguments wherever possible.

Our working definition of the probability density function will be as follows:

Let X be a continuous random variable that takes values in the interval I. The probability density function for X is a continuous function f_X(·) defined on R with the following properties:

1 f_X(x) ≥ 0 for all x ∈ I,
2 ∫_I f_X(x) dx = 1,
3 P(X ∈ (a, b)) = ∫_a^b f_X(x) dx for all (a, b) ⊆ I.

An equivalent description of the stochastic characteristics of a continuous random variable is given by the distribution function, frequently referred to more descriptively as the cumulative distribution function (cdf).

The distribution function for X is a non-decreasing differentiable function F_X(·) defined on R with the following properties:

1 lim_{x→−∞} F_X(x) = 0,
2 lim_{x→+∞} F_X(x) = 1,
3 d/dx F_X(x) = f_X(x).
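The limit argument for P(C) = 0 can be illustrated numerically. The sketch below assumes, purely for illustration, that T follows a standard normal distribution (a hypothetical choice, not something the text specifies) and evaluates P(C_k) for the shrinking intervals C_k = (0.48 − 1/k, 0.48 + 1/k):

```python
import math

def std_normal_cdf(x):
    # Phi(x), the standard normal cdf, expressed via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def p_ck(k, center=0.48):
    # P(C_k) = Phi(center + 1/k) - Phi(center - 1/k)
    return std_normal_cdf(center + 1.0 / k) - std_normal_cdf(center - 1.0 / k)

# P(C_k) shrinks toward zero, while k * P(C_k) approaches the constant
# 2 * f(0.48), confirming the k^{-1} scaling and hence P(C) = 0 in the limit.
for k in (1, 10, 100, 1000):
    print(k, p_ck(k), k * p_ck(k))
```

Here k·P(C_k) stabilizes at twice the density at 0.48, which is exactly the proportionality to k^{−1} used in the text.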

2.6: Continuous Random Variables 31
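The pdf properties listed above, and the pdf–cdf relationship, can be checked for a concrete density. A minimal sketch, using the exponential density f_X(x) = λe^{−λx} on (0, ∞) as a hypothetical example (λ = 1.5 is an arbitrary choice, not from the text):

```python
import math

LAM = 1.5  # rate parameter of the illustrative exponential density

def pdf(x):
    # Property 1: f_X(x) = LAM * exp(-LAM * x) is non-negative everywhere
    return LAM * math.exp(-LAM * x) if x >= 0 else 0.0

def cdf(x):
    # F_X(x) = 1 - exp(-LAM * x) for x >= 0, the integral of the pdf
    return 1.0 - math.exp(-LAM * x) if x >= 0 else 0.0

def integrate(f, a, b, n=200_000):
    # midpoint rule for the integral of f over (a, b)
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# Property 2: the density integrates to one (tail beyond x = 50 is negligible).
total_mass = integrate(pdf, 0.0, 50.0)

# Property 3: P(X in (a, b)) as an integral agrees with F_X(b) - F_X(a).
a, b = 0.3, 2.0
p_ab = integrate(pdf, a, b)

# cdf property 3: d/dx F_X(x) = f_X(x), checked by a central difference at x = 1
deriv = (cdf(1.0 + 1e-6) - cdf(1.0 - 1e-6)) / 2e-6
```

The same three numerical checks work for any candidate density, which makes them a useful sanity test when a pdf is proposed from intuition rather than derived.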

[Figure 2.1: The 10th, 50th, and 90th quantiles of daily mean temperature at Potsdam, Germany (1983–94). Axes: Julian Day (0–300) versus Temperature (°C, −10 to 20).]

The last equation tells us that

F_X(x) = ∫_{−∞}^x f_X(r) dr.  (2.14)

The cumulative distribution function is often useful for computing probabilities because

P(X ∈ (a, b)) = F_X(b) − F_X(a).

2.6.4 Median and Quantiles. The median, x_{0.5}, is the solution of

F_X(x_{0.5}) = 0.50.

It represents the middle of the distribution in the sense that

P(X ≤ x_{0.5}) = P(X ≥ x_{0.5}) = 0.5.

Exactly 50% of all realizations will be less than the median, the other 50% will be greater.

The median is an example of a p-quantile, the point x_p on the real line such that

P(X ∈ (−∞, x_p)) = p,
P(X ∈ [x_p, ∞)) = 1 − p.

2.6.5 Expectation. The expected value of a continuous random variable X is given by

E(X) = ∫_I x f_X(x) dx.

If g(·) is a function, then the definition of the expected value of g(X) generalizes from the discrete case in the same way, and

E(g(X)) = ∫_I g(x) f_X(x) dx.

Results (2.4) and (2.5), about the expectation of a sum of functions and about linear transformations of random variables, also apply in the continuous case:

E(g_1(X) + g_2(X)) = E(g_1(X)) + E(g_2(X)),  (2.15)
E(ag(X) + b) = aE(g(X)) + b.  (2.16)

2.6.6 Interpreting Expectation as the Long-term Average. The expectation is often also named 'the mean' value; that is, this number is identified with the average of an infinite number of realizations of X. We will show this here with an intuitive limit argument. Another heuristic argument is presented in [5.2.5].

First, we approximate the continuous random variable X with a discrete random variable X_δ that takes values in the set {kδ: k = 0, ±1, ±2, ...} for some small positive number δ and with probabilities

p_{kδ} = ∫_{(k−1/2)δ}^{(k+1/2)δ} f_X(x) dx ≈ δ f_X(kδ).

The expected value of the discrete random variable X_δ is given by

E(X_δ) = Σ_{k=−∞}^{∞} kδ p_{kδ}.

By interpreting p_{kδ} as the frequency with which X takes a value in the neighbourhood of x = kδ, we see that the expectation of the approximating
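The discrete approximation E(X_δ) = Σ_k kδ p_{kδ} can be evaluated directly and compared with the exact mean. A sketch, again assuming a hypothetical exponential density with rate λ (for which E(X) = 1/λ exactly; the density and λ = 1.5 are illustrative choices, not from the text):

```python
import math

LAM = 1.5  # rate of the illustrative exponential density; E(X) = 1/LAM exactly

def pdf(x):
    # f_X(x) = LAM * exp(-LAM * x) for x >= 0, zero otherwise
    return LAM * math.exp(-LAM * x) if x >= 0 else 0.0

def discretized_mean(delta, kmax=200_000):
    # E(X_delta) = sum over k of (k * delta) * p_kdelta, with the
    # approximation p_kdelta ~ delta * f_X(k * delta);
    # terms with k < 0 vanish here because the density is zero for x < 0
    return sum(k * delta * (delta * pdf(k * delta))
               for k in range(-kmax, kmax + 1))
```

As δ → 0 the sum approaches the integral ∫ x f_X(x) dx defining E(X), which is the convergence the limit argument above appeals to.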