CHAPTER 7: EXPECTATION
Sem 1, 21014

7.1 Introduction
One of the most important concepts in probability theory is that of the expectation of a random variable. In Chapters 4 and 5, we discussed the expected values of several discrete and continuous random variables. In this chapter, we will learn more about the properties of expected values and their uses. Recall from Chapters 4 and 5 that:
For a discrete random variable X with probability mass function p(x), the expected value of X is defined by
E[X] = \sum_{x} x \, p(x).
For a continuous random variable X with probability density function f(x), the expectation of X is defined by
E(X) = \int_{-\infty}^{\infty} x \, f(x) \, dx.
The expected value of X is a weighted average of all possible values of X; hence, if X must lie between a and b, then so must its expected value. In other words, if
P(a \le X \le b) = 1,
then
a \le E[X] \le b.
Remark: The expectation of a random variable is defined in terms of a sum (discrete case) or an integral (continuous case). This means the expectation is defined only when the corresponding sum or integral converges.
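The discrete definition above can be exercised directly. The following is a minimal sketch (not part of the notes; the die example is an illustration) that computes E[X] from a probability mass function:

```python
# A minimal sketch (not from the notes): computing E[X] for a discrete
# random variable directly from its probability mass function.

def expectation(pmf):
    """Return E[X] = sum over x of x * p(x), for a pmf given as {x: p(x)}."""
    return sum(x * p for x, p in pmf.items())

# A fair six-sided die: p(x) = 1/6 for x = 1, ..., 6.
die = {x: 1 / 6 for x in range(1, 7)}
ex = expectation(die)  # 3.5, which indeed lies between the extremes 1 and 6
```

Note that the result is a weighted average of the possible values, so it falls between the smallest and largest value of X, as stated above.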
Example (The Cauchy random variable). The probability density function of X, a Cauchy random variable, is, in its simplest form, given by
f(x) = \frac{1}{\pi (1 + x^2)}, \qquad -\infty < x < \infty.
Since \int_{-\infty}^{\infty} |x| \, f(x) \, dx = \infty, the defining integral does not converge, and E(X) is not defined.

Proposition. For a nonnegative random variable X,
E(X) = \int_{0}^{\infty} P\{X > x\} \, dx.
Proof
\int_{0}^{\infty} P\{X > x\} \, dx = \int_{0}^{\infty} \int_{x}^{\infty} f(y) \, dy \, dx = \int_{0}^{\infty} \left( \int_{0}^{y} dx \right) f(y) \, dy = \int_{0}^{\infty} y \, f(y) \, dy = E(X),
where the middle step interchanges the order of integration.
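The failure of the Cauchy expectation can be seen numerically. The sketch below (not part of the notes) uses the closed form \int_{-T}^{T} |x| / (\pi(1 + x^2)) \, dx = (1/\pi) \ln(1 + T^2), which grows without bound as T increases:

```python
import math

# A numerical illustration (a sketch, not from the notes): for the Cauchy
# density f(x) = 1/(pi (1 + x^2)), the truncated integral of |x| f(x) has
# the closed form (1/pi) ln(1 + T^2).  It keeps growing as T increases,
# so the integral defining E(X) does not converge.

def truncated_abs_moment(T):
    return math.log(1 + T * T) / math.pi

values = [truncated_abs_moment(10.0 ** k) for k in range(1, 7)]
# each entry exceeds the previous one; the sequence never settles down
```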
Suraiya Kassim
Consider two continuous random variables, X and Y. Let g(X, Y) be a function of both X and Y. If f_{X,Y}(x, y) denotes the joint probability density of X and Y, then
E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) \, f(x, y) \, dx \, dy.
For two discrete random variables, X and Y, with joint probability mass function p_{X,Y}(x, y),
E[g(X, Y)] = \sum_{y} \sum_{x} g(x, y) \, p(x, y).
In particular, taking g(X, Y) = X + Y gives
E[X + Y] = \int \int x \, f(x, y) \, dy \, dx + \int \int y \, f(x, y) \, dx \, dy
= \int x \, f_X(x) \, dx + \int y \, f_Y(y) \, dy
= E[X] + E[Y].
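Linearity requires no independence. The exact check below (a sketch on a small hypothetical joint pmf, not from the notes) shows E[X + Y] = E[X] + E[Y] for a dependent pair:

```python
# An exact check (hypothetical joint pmf, not from the notes): linearity
# E[X + Y] = E[X] + E[Y] holds even when X and Y are dependent.

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}  # dependent pair

e_sum = sum((x + y) * p for (x, y), p in joint.items())
e_x = sum(x * p for (x, y), p in joint.items())
e_y = sum(y * p for (x, y), p in joint.items())
# e_sum equals e_x + e_y (both 1.0 here)
```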
Example Suppose X and Y are independent uniform (0, L) random variables. Find E[|X - Y|].
Solution The joint probability density function of X and Y is
f(x, y) = \frac{1}{L^2}, \qquad 0 < x < L, \ 0 < y < L,
so
E[|X - Y|] = \frac{1}{L^2} \int_0^L \int_0^L |x - y| \, dy \, dx = \frac{2}{L^2} \int_0^L \int_0^x (x - y) \, dy \, dx = \frac{2}{L^2} \int_0^L \frac{x^2}{2} \, dx = \frac{L}{3}.
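A quick numerical cross-check (a sketch, not part of the notes; L = 1 is an assumed choice): a midpoint-grid approximation of E[|X - Y|] for independent uniform(0, 1) variables lands very close to 1/3.

```python
# A grid approximation (assumption: L = 1, not from the notes) of
# E[|X - Y|] for independent uniform(0, 1) random variables.

n = 400  # midpoints of n equal subintervals per axis
pts = [(i + 0.5) / n for i in range(n)]
approx = sum(abs(x - y) for x in pts for y in pts) / (n * n)
# approx is close to 1/3
```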
Example Let X_1, X_2, \ldots, X_n be independent and identically distributed random variables with mean \mu. Compute E(\bar{X}), where \bar{X} = \sum_{i=1}^{n} \frac{X_i}{n} (the sample mean).
Solution
E[\bar{X}] = E\left[ \sum_{i=1}^{n} \frac{X_i}{n} \right] = \frac{1}{n} \sum_{i=1}^{n} E[X_i] = \frac{1}{n} \, n\mu = \mu.
Example Calculate the expected sum obtained when 10 independent rolls of a fair die are
made.
Solution Let X denote the sum obtained. We can represent X as
X = X1 + X 2 + ... + X10
where X i = value of the ith roll.
E[X_i] = 1 \left( \tfrac{1}{6} \right) + 2 \left( \tfrac{1}{6} \right) + \cdots + 6 \left( \tfrac{1}{6} \right) = \tfrac{7}{2}.
Thus
E[X] = E[X_1 + X_2 + \cdots + X_{10}] = E[X_1] + E[X_2] + \cdots + E[X_{10}] = 10 \left( \tfrac{7}{2} \right) = 35.
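The dice computation above can be verified in exact arithmetic. A small sketch (not part of the notes):

```python
from fractions import Fraction

# An exact arithmetic check (not from the notes): one fair roll has
# expected value 7/2, so by linearity ten independent rolls have
# expected sum 10 * 7/2 = 35.

e_one = sum(k * Fraction(1, 6) for k in range(1, 7))  # 7/2
e_sum = 10 * e_one                                    # 35
```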
Example (Expected Value of a Binomial Random Variable). Let X be a binomial random variable with parameters n and p. Write X = X_1 + X_2 + \cdots + X_n, where X_i = 1 if the ith trial is a success and X_i = 0 otherwise. Then
E[X_i] = 1(p) + 0(1 - p) = p.
Hence, E[X] = E[X_1 + X_2 + \cdots + X_n]
= E[X_1] + E[X_2] + \cdots + E[X_n] = p + p + \cdots + p = np.
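The indicator argument can be cross-checked against the binomial pmf directly. The parameters n = 12, p = 0.3 below are hypothetical choices for illustration (not from the notes):

```python
import math

# An exact check (hypothetical parameters n = 12, p = 0.3): summing
# k * P(X = k) over the binomial(n, p) pmf recovers np, matching the
# indicator argument above.

def binomial_mean(n, p):
    return sum(k * math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1))

m = binomial_mean(12, 0.3)  # equals 12 * 0.3 = 3.6 up to rounding
```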
Example (Expected Value of a Hypergeometric Random Variable)
Suppose n balls are selected from a box containing N balls, of which m are white. Let X denote the number of white balls selected; then
X = X_1 + X_2 + \cdots + X_n,
where X_i = 1 if the ith selection results in a white ball, and X_i = 0 if it results in a non-white ball. Now
E[X_i] = 1 \cdot P(\text{the ith selection is white}) + 0 \cdot P(\text{the ith selection is non-white}) = \frac{m}{N}
(since the ith selection is equally likely to be any of the N balls).
Hence, E[X] = E[X_1 + X_2 + \cdots + X_n]
= E[X_1] + E[X_2] + \cdots + E[X_n] = \frac{m}{N} + \frac{m}{N} + \cdots + \frac{m}{N} = \frac{nm}{N}.

7.3
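The hypergeometric mean nm/N above can also be recovered by summing over the pmf. The parameters N = 20, m = 7, n = 5 below are hypothetical choices for illustration (not from the notes):

```python
import math

# An exact check (hypothetical N = 20, m = 7, n = 5): summing k * P(X = k)
# over the hypergeometric pmf recovers nm/N, matching the indicator
# argument above.

def hypergeometric_mean(N, m, n):
    total = math.comb(N, n)
    return sum(k * math.comb(m, k) * math.comb(N - m, n - k) / total
               for k in range(max(0, n - (N - m)), min(n, m) + 1))

mu = hypergeometric_mean(20, 7, 5)  # equals 5 * 7 / 20 = 1.75
```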
The following proposition tells us that the expectation of a product of independent random
variables is just the product of their expectations.
Proposition 7.1 If X and Y are independent, then for any functions g and h,
E[ g ( X )h(Y )] = E[ g ( X )] E[h(Y )]
Proof Suppose X and Y are jointly continuous with joint density f(x, y). Then
E[g(X)h(Y)] = \int \int g(x) h(y) \, f(x, y) \, dx \, dy
= \int \int g(x) h(y) \, f_X(x) f_Y(y) \, dx \, dy
= \left( \int g(x) \, f_X(x) \, dx \right) \left( \int h(y) \, f_Y(y) \, dy \right)
= E[g(X)] \, E[h(Y)].
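Proposition 7.1 can be verified exactly on a small discrete pair. The pmfs and functions g, h below are hypothetical choices for illustration (not from the notes):

```python
# An exact check on a hypothetical pair of independent pmfs (not from
# the notes): E[g(X)h(Y)] = E[g(X)] E[h(Y)] under independence.

px = {0: 0.3, 1: 0.7}  # pmf of X
py = {1: 0.5, 2: 0.5}  # pmf of Y

def g(x):
    return x * x + 1

def h(y):
    return 2 * y

lhs = sum(g(x) * h(y) * px[x] * py[y] for x in px for y in py)
rhs = sum(g(x) * px[x] for x in px) * sum(h(y) * py[y] for y in py)
# lhs and rhs agree (both 5.1 here)
```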
Covariance of X and Y
The covariance of any two random variables X and Y, denoted cov(X, Y), is defined by
cov(X, Y) = E[(X - E[X])(Y - E[Y])]. .......................(1)
Expanding the product shows the equivalent form cov(X, Y) = E[XY] - E[X] E[Y].
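The two forms of the covariance can be compared on a small worked case. The joint pmf below is a hypothetical example (not from the notes):

```python
# A worked check on a small hypothetical joint pmf (not from the notes):
# the defining formula cov(X, Y) = E[(X - E[X])(Y - E[Y])] agrees with
# the equivalent form E[XY] - E[X]E[Y].

joint = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

ex = sum(x * p for (x, y), p in joint.items())
ey = sum(y * p for (x, y), p in joint.items())
cov_def = sum((x - ex) * (y - ey) * p for (x, y), p in joint.items())
cov_alt = sum(x * y * p for (x, y), p in joint.items()) - ex * ey
# cov_def equals cov_alt (both 0.05 here)
```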
The variance of the sum of n random variables in terms of their covariances can be obtained with a similar argument. For n random variables X_1, X_2, \ldots, X_n, we have
\operatorname{Var}\left( \sum_{i=1}^{n} X_i \right) = \sum_{i=1}^{n} \operatorname{Var}(X_i) + \sum_{i \ne j} \operatorname{cov}(X_i, X_j),
or
\operatorname{Var}\left( \sum_{i=1}^{n} X_i \right) = \sum_{i=1}^{n} \operatorname{Var}(X_i) + 2 \sum_{i < j} \operatorname{cov}(X_i, X_j). .......................(2)
If X_1, X_2, \ldots, X_n are pairwise independent, then cov(X_i, X_j) = 0 for all i \ne j, and equation (2) reduces to
\operatorname{Var}\left( \sum_{i=1}^{n} X_i \right) = \sum_{i=1}^{n} \operatorname{Var}(X_i).
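The n = 2 case of equation (2), Var(X + Y) = Var(X) + Var(Y) + 2 cov(X, Y), can be checked exactly on a dependent pair. The joint pmf below is a hypothetical example (not from the notes):

```python
# An exact check (hypothetical dependent pair, not from the notes):
# Var(X + Y) = Var(X) + Var(Y) + 2 cov(X, Y), equation (2) with n = 2.

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def e(f):
    """Expectation of f(X, Y) under the joint pmf."""
    return sum(f(x, y) * p for (x, y), p in joint.items())

ex, ey = e(lambda x, y: x), e(lambda x, y: y)
var_x = e(lambda x, y: (x - ex) ** 2)
var_y = e(lambda x, y: (y - ey) ** 2)
cov_xy = e(lambda x, y: (x - ex) * (y - ey))
var_sum = e(lambda x, y: (x + y - ex - ey) ** 2)
# var_sum equals var_x + var_y + 2 * cov_xy (both 0.8 here)
```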
Example Let X_1, X_2, \ldots, X_n be independent and identically distributed random variables with common variance \sigma^2. Compute \operatorname{Var}(\bar{X}), where \bar{X} = \sum_{i=1}^{n} \frac{X_i}{n} (the sample mean).
Solution
\operatorname{Var}(\bar{X}) = \frac{1}{n^2} \operatorname{Var}\left( \sum_{i=1}^{n} X_i \right) = \frac{1}{n^2} \sum_{i=1}^{n} \operatorname{Var}(X_i) = \frac{1}{n^2} \, n\sigma^2 = \frac{\sigma^2}{n}.
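The \sigma^2 / n formula for the variance of a sample mean can be confirmed by brute force. The sketch below (not from the notes) enumerates all outcomes of n = 3 fair-die rolls, a hypothetical choice for illustration:

```python
from itertools import product

# A brute-force check (hypothetical case: n = 3 fair-die rolls, not from
# the notes): enumerating every outcome of the sample mean gives variance
# sigma^2 / n exactly.

faces, n = range(1, 7), 3
outcomes = list(product(faces, repeat=n))          # all 6^3 equally likely rolls
means = [sum(o) / n for o in outcomes]
mu = sum(means) / len(means)
var_mean = sum((m - mu) ** 2 for m in means) / len(means)

sigma2 = sum((k - 3.5) ** 2 for k in faces) / 6    # variance of one roll, 35/12
# var_mean equals sigma2 / n (both 35/36 here)
```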