
Suraiya Kassim
Sem 1, 2014

CHAPTER 7
EXPECTATION

7.1 Introduction

One of the most important concepts in probability theory is that of the expectation of a
random variable. In Chapters 4 and 5, we discussed the expected values of several
discrete and continuous random variables. In this chapter, we will learn more about the
properties of expected values and their uses. Recall from Chapters 4 and 5 that:
For a discrete random variable X with probability mass function p(x), the expected value of X
is defined by
E[X] = Σ_x x p(x).

For a continuous random variable X with probability density function f(x), the expectation of
X is defined by
E[X] = ∫_{−∞}^{∞} x f(x) dx.

The expected value of X is a weighted average of all possible values of X ; hence, if X must
lie between a and b, then so must its expected value. In other words, if

P(a ≤ X ≤ b) = 1,
then

a ≤ E[X] ≤ b.
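As a quick numerical illustration (not part of the original notes), the sketch below computes E[X] = Σ_x x p(x) for an assumed example, a fair six-sided die, and checks that the result lies between the smallest and largest possible values.

```python
# Minimal sketch (assumed example): E[X] = sum over x of x * p(x) for a fair die,
# plus a check that the expectation lies between the smallest and largest values.
values = [1, 2, 3, 4, 5, 6]
pmf = [1 / 6] * 6                      # p(x) = 1/6 for each face

expectation = sum(x * p for x, p in zip(values, pmf))
print(expectation)                     # 3.5
assert min(values) <= expectation <= max(values)
```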

Remark: The expectation of a random variable is defined in terms of a sum (discrete case) or
an integral (continuous case). This means the expectation exists only when the
corresponding sum or integral converges.
Example (The Cauchy random variable). The probability density function of X, a Cauchy
random variable is, in its simplest form, given by

f(x) = 1 / (π(1 + x²)),   −∞ < x < ∞.

It can be shown that E[X] does not exist, since ∫ |x| f(x) dx diverges.
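A short simulation sketch (the standard Cauchy generator from NumPy is an assumed illustrative choice) makes the nonexistence of E[X] visible: the running sample mean never settles down, no matter how large the sample.

```python
# Sketch: running sample mean of standard Cauchy draws keeps wandering,
# reflecting the fact that the Cauchy distribution has no expectation.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.standard_cauchy(1_000_000)
running_mean = np.cumsum(samples) / np.arange(1, samples.size + 1)

# Inspect the running mean at a few sample sizes; the values do not converge.
for n in (10**2, 10**3, 10**4, 10**5, 10**6):
    print(n, running_mean[n - 1])
```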


Lemma For any nonnegative random variable X,

E[X] = ∫_0^∞ P(X > x) dx.

Proof For a continuous nonnegative random variable X with density f, interchanging the order of integration gives
∫_0^∞ P(X > x) dx = ∫_0^∞ ∫_x^∞ f(y) dy dx = ∫_0^∞ (∫_0^y dx) f(y) dy = ∫_0^∞ y f(y) dy = E[X].
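As an illustrative numerical check (the exponential distribution and the grid below are assumptions made only for this sketch), integrating the survival function P(X > x) = e^(−λx) over [0, ∞) should reproduce E[X] = 1/λ.

```python
# Numerical check of E[X] = ∫_0^∞ P(X > x) dx for X ~ Exponential(lam).
# The truncation point and grid spacing are arbitrary illustrative choices.
import numpy as np

lam = 2.0
x = np.linspace(0.0, 50.0, 200_001)          # truncate the infinite range
survival = np.exp(-lam * x)                  # P(X > x) for the exponential
integral = np.sum(survival) * (x[1] - x[0])  # simple Riemann sum
print(integral, 1 / lam)                     # both close to 0.5
```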


Using the above lemma, it can be shown that


E[g(X)] = ∫ g(x) f(x) dx,
where the integral is taken over all x.
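A brief Monte Carlo sketch of this formula (the choices g(x) = x² and X ~ Uniform(0, 1) are assumptions for illustration); the exact value of E[g(X)] here is 1/3.

```python
# Sketch: estimate E[g(X)] = ∫ g(x) f(x) dx by simulation, with g(x) = x**2
# and X uniform on (0, 1); the exact answer is 1/3.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 1_000_000)
print(np.mean(x**2))   # ≈ 0.3333
```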
7.2 Expectation of a Function of Random Variables

Consider two continuous random variables, X and Y. Let g(X, Y) be a function of both X and
Y. If f_{X,Y}(x, y) denotes the joint probability density of X and Y, then

E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy,

where the double integral is taken over all x and y.

For two discrete random variables X and Y with joint probability mass function p_{X,Y}(x, y),

E[g(X, Y)] = Σ_y Σ_x g(x, y) p(x, y).

Example (Expectation of Sum of Random Variables) Suppose X and Y are continuous


random variables, and E[X] and E[Y] are both finite. Determine E[X + Y].
Solution For g(X, Y) = X + Y,
E[g(X, Y)] = E[X + Y] = ∫∫ (x + y) f(x, y) dx dy
= ∫∫ x f(x, y) dy dx + ∫∫ y f(x, y) dx dy
= ∫ x f_X(x) dx + ∫ y f_Y(y) dy
= E[X] + E[Y].
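A quick Monte Carlo sketch of E[X + Y] = E[X] + E[Y]; the distributions below (X exponential with mean 2, Y normal with mean 5) are assumed purely for illustration, and independence is used only to make the samples easy to generate.

```python
# Simulation sketch of E[X + Y] = E[X] + E[Y] for two assumed distributions.
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=2.0, size=1_000_000)
y = rng.normal(loc=5.0, scale=1.0, size=1_000_000)
print(np.mean(x + y), np.mean(x) + np.mean(y))   # both ≈ 7
```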
Example Suppose X and Y are independent uniform (0, L) random variables. Find
E[|X − Y|].
Solution The joint probability density function of X and Y is
f(x, y) = 1/L²,   0 < x < L, 0 < y < L,
so
E[|X − Y|] = (1/L²) ∫_0^L ∫_0^L |x − y| dx dy = (1/L²)(L³/3) = L/3.
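A simulation sketch of the value computed above; the choice L = 3 is an arbitrary assumption for illustration.

```python
# Monte Carlo check that E[|X - Y|] = L/3 for independent Uniform(0, L) X and Y.
import numpy as np

rng = np.random.default_rng(3)
L = 3.0
x = rng.uniform(0.0, L, 1_000_000)
y = rng.uniform(0.0, L, 1_000_000)
print(np.mean(np.abs(x - y)), L / 3)   # both ≈ 1.0
```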


In general, if E[Xi] is finite for all i = 1, 2, ..., n, then

E[X1 + X2 + ... + Xn] = E[X1] + E[X2] + ... + E[Xn].
Example (Expectation of the Sample Mean) Let X1, X2, ..., Xn be independent and
identically distributed random variables having distribution function F and expected value μ.
Compute E[X̄], where X̄ = (1/n) Σ_{i=1}^n Xi (the sample mean).

Solution
E[X̄] = E[(1/n) Σ_{i=1}^n Xi]
= (1/n) Σ_{i=1}^n E[Xi]
= (1/n)(nμ)
= μ.

Example Calculate the expected sum obtained when 10 independent rolls of a fair die are
made.
Solution Let X denote the sum obtained. We can represent X as
X = X1 + X 2 + ... + X10
where X i = value of the ith roll.

E[Xi] = 1(1/6) + 2(1/6) + ... + 6(1/6) = 21/6 = 7/2.
Thus
E[X] = E[X1 + X2 + ... + X10] = E[X1] + E[X2] + ... + E[X10] = 10(7/2) = 35.
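A short simulation sketch of this example: the average total of 10 fair-die rolls should be close to the computed value 35 (the number of repetitions is an arbitrary choice).

```python
# Simulation: sum of 10 independent rolls of a fair die, averaged over many runs.
import numpy as np

rng = np.random.default_rng(4)
rolls = rng.integers(1, 7, size=(200_000, 10))   # 10 rolls per experiment
print(rolls.sum(axis=1).mean())                  # ≈ 35
```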

Example (Expected Value of a Binomial Random Variable)


Let X be a binomial random variable with parameters n and p. Recall that X represents the
number of successes in n independent trials, each with probability of success p. Thus,
X can be written as
X = X1 + X2 + ... + Xn
where Xi = 1 if the ith trial is a success, and Xi = 0 if the ith trial is a failure.

Each X i is, in fact, a Bernoulli random variable with parameter p, and

E[Xi] = 1·P(the ith trial is a success) + 0·P(the ith trial is a failure)
= 1(p) + 0(1 − p) = p.
Hence, E[ X ] = E[ X1 + X 2 + ... + X n ]
= E[ X1 ] + E[ X 2 ] + ... + E[ X n ] = p + p + ... + p = np
Example (Expected Value of a Hypergeometric Random Variable)
Suppose n balls are selected from a box containing N balls of which m are white. Let X
denote the number of white balls selected, then
X = X1 + X 2 + ... + X n
where Xi = 1 if the ith selection results in a white ball, and Xi = 0 if the ith selection results in a non-white ball. Then

E[Xi] = 1·P(the ith selection results in a white ball) + 0·P(the ith selection results in a non-white ball)
= m/N   (since the ith selection is equally likely to be any of the N balls).

Hence, E[X] = E[X1 + X2 + ... + Xn]
= E[X1] + E[X2] + ... + E[Xn] = m/N + m/N + ... + m/N = nm/N.

7.3 Covariance and Correlation

The following proposition tells us that the expectation of a product of independent random
variables is just the product of their expectations.
Proposition 7.1 If X and Y are independent, then for any functions g and h,
E[ g ( X )h(Y )] = E[ g ( X )] E[h(Y )]

Proof Suppose X and Y are jointly continuous with joint density f(x, y). Then

E[g(X)h(Y)] = ∫∫ g(x)h(y) f(x, y) dx dy
= ∫∫ g(x)h(y) f_X(x) f_Y(y) dx dy   (by independence)
= ∫ g(x) f_X(x) dx · ∫ h(y) f_Y(y) dy
= E[g(X)] E[h(Y)].

Note: The proof for the discrete case is similar.
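A Monte Carlo sketch of Proposition 7.1; the functions g(x) = x², h(y) = eʸ and the distributions X ~ Uniform(0, 1), Y ~ Normal(0, 1) are assumed here only for illustration.

```python
# Simulation check of E[g(X)h(Y)] = E[g(X)] E[h(Y)] for independent X and Y.
import numpy as np

rng = np.random.default_rng(5)
x = rng.uniform(0.0, 1.0, 1_000_000)
y = rng.normal(0.0, 1.0, 1_000_000)
print(np.mean(x**2 * np.exp(y)))            # E[g(X)h(Y)]
print(np.mean(x**2) * np.mean(np.exp(y)))   # E[g(X)] E[h(Y)]; both ≈ 0.55
```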


Covariance of X and Y
The covariance of any two random variables X and Y, denoted cov(X, Y), is defined by

cov(X, Y) = E[(X − E[X])(Y − E[Y])]   .......................(1)

Expanding the term inside the expectation, we have

cov(X, Y) = E[XY − E[X]Y − XE[Y] + E[X]E[Y]]
= E[XY] − E[X]E[Y] − E[X]E[Y] + E[X]E[Y]
= E[XY] − E[X]E[Y],

i.e. the covariance of X and Y is the difference between the expected value of the product
and the product of the expected values.
If X and Y are independent, then by Proposition 7.1, E[XY] = E[X]E[Y] and hence
cov(X, Y) = E[X]E[Y] − E[X]E[Y] = 0.
Note: If cov( X , Y ) = 0, then X and Y are not necessarily independent.
Example Let X be a random variable such that
P(X = −1) = P(X = 0) = P(X = 1) = 1/3.
Define
Y = 0 if X ≠ 0, and Y = 1 if X = 0.
Find the covariance of X and Y. Are X and Y independent?
Solution The possible values of X, Y and XY are:

X    Y    XY
−1   0    0
 0   1    0
 1   0    0

Since the values of XY are all 0, E[XY] = 0.


E[X] = (−1)(1/3) + 0(1/3) + 1(1/3) = 0 and E[Y] = 1·P(X = 0) = 1/3.
Hence, Cov(X, Y) = E[XY] − E[X]E[Y] = 0 − 0(1/3) = 0.
However, X and Y are clearly not independent: for example, P(X = 1, Y = 1) = 0 while
P(X = 1)P(Y = 1) = (1/3)(1/3) = 1/9.
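A simulation sketch of this example (sample sizes and the random seed are arbitrary choices): the sample covariance is essentially 0, even though Y is a deterministic function of X.

```python
# X uniform on {-1, 0, 1}; Y = 1 when X = 0, else 0. Covariance ≈ 0, yet dependent.
import numpy as np

rng = np.random.default_rng(6)
x = rng.choice([-1, 0, 1], size=1_000_000)
y = (x == 0).astype(float)
print(np.mean(x * y) - np.mean(x) * np.mean(y))   # ≈ 0 (sample covariance)
print(np.mean(y[x == 0]), np.mean(y[x != 0]))     # 1.0 vs 0.0: clearly dependent
```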
A useful expression for the variance of the sum of two random variables in terms of their
covariance is obtained as follows:
Var(X + Y) = E[(X + Y − E[X + Y])²]
= E[((X − E[X]) + (Y − E[Y]))²]
= E[(X − E[X])²] + E[(Y − E[Y])²] + 2E[(X − E[X])(Y − E[Y])]
= Var(X) + Var(Y) + 2 cov(X, Y).
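A numerical sketch of this identity using a deliberately correlated pair; the construction Y = X + noise below is an assumption made only to give a nonzero covariance.

```python
# Check Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) on correlated samples.
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(0.0, 1.0, 1_000_000)
y = x + rng.normal(0.0, 0.5, 1_000_000)    # correlated with x
cov_xy = np.cov(x, y)[0, 1]
print(np.var(x + y))
print(np.var(x) + np.var(y) + 2 * cov_xy)  # both ≈ 4.25
```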


The variance of the sum of n random variables in terms of their covariance can be obtained
with a similar argument. For n random variables X 1 , X 2 , ..., X n , we have

Var(X1 + X2 + ... + Xn) = Var(X1) + Var(X2) + ... + Var(Xn) + 2 Σ_{i<j} cov(Xi, Xj)

or

Var(Σ_{i=1}^n Xi) = Σ_{i=1}^n Var(Xi) + 2 Σ_{i<j} cov(Xi, Xj)   .....................(2)

If X1, X2, ..., Xn are pairwise independent, then cov(Xi, Xj) = 0 for all i ≠ j. The above
equation (eq. 2) reduces to

Var(Σ_{i=1}^n Xi) = Σ_{i=1}^n Var(Xi),   ...............(3)

i.e. the variance of the sum equals the sum of the variances.


Example (Variance of the Sample Mean) Let X1, X2, ..., Xn be independent and identically
distributed random variables having distribution function F, expected value μ and variance
σ². Compute Var(X̄), where X̄ = (1/n) Σ_{i=1}^n Xi (the sample mean).

Solution
Var[X̄] = Var[(1/n) Σ_{i=1}^n Xi]
= (1/n²) Var[Σ_{i=1}^n Xi]
= (1/n²) Σ_{i=1}^n Var[Xi]   (from eq. (3))
= (1/n²)(nσ²)
= σ²/n.
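A simulation sketch of E[X̄] = μ and Var(X̄) = σ²/n; the Exponential(1) population (μ = 1, σ² = 1) and the sample size n = 25 are assumed illustrative choices.

```python
# Repeatedly draw samples of size n, compute the sample mean, and compare its
# empirical mean and variance with μ and σ²/n.
import numpy as np

rng = np.random.default_rng(8)
n, reps = 25, 200_000
samples = rng.exponential(scale=1.0, size=(reps, n))
xbar = samples.mean(axis=1)
print(xbar.mean())           # ≈ 1      (= μ)
print(xbar.var(), 1.0 / n)   # ≈ 0.04   (= σ²/n)
```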

Example (Variance of a Binomial Random Variable) Let X be a binomial random
variable with parameters n and p. Since X represents the number of successes in n independent
trials, each with probability of success p, X can be written as
X = X1 + X2 + ... + Xn
where Xi = 1 if the ith trial is a success, and Xi = 0 if the ith trial is a failure,
i.e. each Xi is a Bernoulli random variable with parameter p. Then
Var[Xi] = E[Xi²] − (E[Xi])² = p − p² = p(1 − p).
Hence, by independence,
Var[X] = Var[X1 + X2 + ... + Xn] = Var[X1] + Var[X2] + ... + Var[Xn] = np(1 − p).
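A Monte Carlo sketch of the binomial mean np and variance np(1 − p); the values n = 20 and p = 0.3 are arbitrary illustrative choices.

```python
# Simulation check of E[X] = np and Var(X) = np(1 - p) for X ~ Binomial(n, p).
import numpy as np

rng = np.random.default_rng(9)
n, p = 20, 0.3
x = rng.binomial(n, p, size=1_000_000)
print(x.mean(), n * p)            # both ≈ 6.0
print(x.var(), n * p * (1 - p))   # both ≈ 4.2
```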

