Objectives:
1. Develop techniques for determining the probabilities of
events that involve the joint behavior of two or more
random variables.
2. Determine when a set of random variables is
independent.
3. Quantify the degree of correlation when they are not
independent.
Example 1
A random experiment consists of selecting a
student name from an urn; let ζ be the outcome
of the experiment. Define the following three
functions:
X(ζ): age of the student in years
Y(ζ): height of the student in inches
Z(ζ): weight of the student in lbs
Example 2
A random experiment consists of finding the number of
defects in a semiconductor chip and identifying their
locations. The outcome of the experiment is a vector
ζ = (n, x₁, x₂, …, xₙ), where the first component specifies
the total number of defects and the remaining components
specify the coordinates of their locations.
Suppose the chip consists of M regions. Let N₁(ζ), N₂(ζ),
…, N_M(ζ) be the number of defects in each of these
regions, i.e., N_k(ζ) is the number of x's that fall in
region R_k. Thus, the vector N(ζ) = (N₁, N₂, …, N_M) is a
vector random variable. Here, both the outcome ζ and the
random variable N are vectors.
Example 3
Let the outcome ζ of some random experiment
be a voltage waveform X(t, ζ). Let the random
variable X_k = X(kT, ζ) be the sample of the
voltage taken at time kT.
The vector consisting of the first n samples,
X = (X₁, X₂, …, Xₙ), is then a vector random
variable.
Example 4
Let X = (X, Y) be a two-dimensional random
variable. Find the regions in the plane
corresponding to the events:
(a) A = {X + Y ≤ 10}
(b) B = {X² + Y² ≤ 100}
(c) C = {min(X, Y) ≤ 5}

The region for (a) is the half-plane on and below the line
x + y = 10; the region for (b) is the disk of radius 10
centered at the origin. For (c), note that
{min(X, Y) ≤ 5} = {X ≤ 5} ∪ {Y ≤ 5}
that is, the minimum of X and Y is less than or equal to 5
if X and/or Y is less than or equal to 5.
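The set identity for event C can be sanity-checked pointwise; this short script (not part of the original slides) compares the two event indicators over a grid straddling the threshold:

```python
import itertools

def in_C(x, y):
    # event C = {min(X, Y) <= 5}
    return min(x, y) <= 5

def in_union(x, y):
    # event {X <= 5} union {Y <= 5}
    return x <= 5 or y <= 5

# grid of points from -5 to 20 in steps of 0.5 in each coordinate
grid = [(i / 2.0, j / 2.0) for i, j in itertools.product(range(-10, 41), repeat=2)]
all_match = all(in_C(x, y) == in_union(x, y) for x, y in grid)
print(all_match)
```

The two indicator functions agree at every grid point, as the identity predicts.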
(2) The joint pmf sums to 1 over the range of (X, Y):
Σ_{j=1}^∞ Σ_{k=1}^∞ p_{X,Y}(xⱼ, yₖ) = 1

Marginal pmfs:
p_X(xⱼ) = P[X = xⱼ] = Σ_{k=1}^∞ p_{X,Y}(xⱼ, yₖ)
Similarly,
p_Y(yₖ) = P[Y = yₖ] = Σ_{j=1}^∞ p_{X,Y}(xⱼ, yₖ)
Example 5
The number of bytes N in a message has a
geometric distribution with parameter 1 − p and
range S_N = {0, 1, 2, …}. Suppose that messages are
broken into packets of maximum length M bytes.
Let Q be the number of full packets in a
message, and let R be the number of bytes left over.
Find the joint pmf and the marginal pmfs of Q
and R.
Since Q = q and R = r if and only if N = qM + r, the joint pmf is
p_{Q,R}(q, r) = P[N = qM + r] = (1 − p) p^{qM+r},
q = 0, 1, 2, …; r = 0, 1, …, M − 1
The marginal pmf of Q:
p_Q(q) = Σ_{r=0}^{M−1} (1 − p) p^{qM+r}
= (1 − p) p^{qM} (1 − p^M)/(1 − p)
= p^{qM} (1 − p^M),  q = 0, 1, 2, …
The marginal pmf of R:
p_R(r) = Σ_{q=0}^∞ (1 − p) p^{qM+r}
= (1 − p) p^r Σ_{q=0}^∞ (p^M)^q
= (1 − p) p^r / (1 − p^M),  r = 0, 1, …, M − 1
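The closed-form marginals above can be checked numerically by summing the joint pmf directly. The values p = 0.9 and M = 8 below are arbitrary choices for the check, not values taken from the example:

```python
p, M = 0.9, 8     # assumed example values, not from the slides
QMAX = 500        # truncation point for the infinite sum over q

def joint(q, r):
    # joint pmf p_{Q,R}(q, r) = (1 - p) p^(qM + r)
    return (1 - p) * p ** (q * M + r)

# marginal of Q by summing the joint pmf over r, vs. the closed form
q_ok = all(
    abs(sum(joint(q, r) for r in range(M)) - p ** (q * M) * (1 - p ** M)) < 1e-12
    for q in range(50)
)

# marginal of R by (truncated) summing over q, vs. the closed form
r_ok = all(
    abs(sum(joint(q, r) for q in range(QMAX)) - (1 - p) * p ** r / (1 - p ** M)) < 1e-9
    for r in range(M)
)
print(q_ok, r_ok)
```

Both sums match the closed forms to within floating-point truncation error.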
2. F_{X,Y}(x₁, −∞) = 0 and F_{X,Y}(−∞, y₁) = 0
3. F_{X,Y}(∞, ∞) = 1,
since it is certain that X and Y will assume values less than infinity.
4. The joint cdf is continuous from the right in each variable:
lim_{x→a⁺} F_{X,Y}(x, y) = F_{X,Y}(a, y),  lim_{y→b⁺} F_{X,Y}(x, y) = F_{X,Y}(x, b)
Example 6
The joint cdf for the vector random variable
X = (X, Y) is given by
F_{X,Y}(x, y) = (1 − e^{−αx})(1 − e^{−βy}),  x ≥ 0, y ≥ 0
and F_{X,Y}(x, y) = 0 otherwise.
Find the marginal cdfs.

The marginal cdfs are obtained by letting one of the variables approach infinity:
F_X(x) = lim_{y→∞} F_{X,Y}(x, y) = 1 − e^{−αx},  x ≥ 0
F_Y(y) = lim_{x→∞} F_{X,Y}(x, y) = 1 − e^{−βy},  y ≥ 0
The probability of a semi-infinite strip follows from the cdf:
F_{X,Y}(x₂, y₁) = F_{X,Y}(x₁, y₁) + P[x₁ < X ≤ x₂, Y ≤ y₁]
Decomposing the event {X ≤ x₂, Y ≤ y₂} in the same way, we have
F_{X,Y}(x₂, y₂) = P[x₁ < X ≤ x₂, y₁ < Y ≤ y₂] + F_{X,Y}(x₂, y₁) − F_{X,Y}(x₁, y₁) + F_{X,Y}(x₁, y₂)
so the probability of the rectangle is
P[x₁ < X ≤ x₂, y₁ < Y ≤ y₂] = F_{X,Y}(x₂, y₂) − F_{X,Y}(x₂, y₁) − F_{X,Y}(x₁, y₂) + F_{X,Y}(x₁, y₁)
Example 7
The joint cdf of the random variables X and Y is
given by
F_{X,Y}(x, y) = (1 − e^{−αx})(1 − e^{−βy}),  x ≥ 0, y ≥ 0
and F_{X,Y}(x, y) = 0 otherwise.
(a) Find P[X ≤ 1, Y ≤ 1]:
P[X ≤ 1, Y ≤ 1] = F_{X,Y}(1, 1) = (1 − e^{−α})(1 − e^{−β})
(b) Find P[X > x, Y > y]:
P[X > x, Y > y] = 1 − P[{X ≤ x} ∪ {Y ≤ y}]
= 1 − (P[X ≤ x] + P[Y ≤ y] − P[X ≤ x, Y ≤ y])
= 1 − (1 − e^{−αx}) − (1 − e^{−βy}) + (1 − e^{−αx})(1 − e^{−βy})
Therefore,
P[X > x, Y > y] = e^{−αx} e^{−βy}
(c) Find P[1 < X ≤ 2, 2 < Y ≤ 5]:
P[1 < X ≤ 2, 2 < Y ≤ 5]
= F_{X,Y}(2, 5) − F_{X,Y}(2, 2) − F_{X,Y}(1, 5) + F_{X,Y}(1, 2)
= (1 − e^{−2α})(1 − e^{−5β}) − (1 − e^{−2α})(1 − e^{−2β})
− (1 − e^{−α})(1 − e^{−5β}) + (1 − e^{−α})(1 − e^{−2β})
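The inclusion-exclusion step in part (b) can be checked numerically. The rate values a = 1.3 and b = 0.7 are arbitrary assumptions for the check:

```python
import math

a, b = 1.3, 0.7  # assumed rate parameters, not from the slides

def F_joint(x, y):
    # joint cdf (1 - e^{-ax})(1 - e^{-by}) for x, y >= 0
    return (1 - math.exp(-a * x)) * (1 - math.exp(-b * y))

def F_X(x):
    return 1 - math.exp(-a * x)

def F_Y(y):
    return 1 - math.exp(-b * y)

# 1 - F_X(x) - F_Y(y) + F_{X,Y}(x,y) should equal e^{-ax} e^{-by}
ok = all(
    abs((1 - F_X(x) - F_Y(y) + F_joint(x, y)) - math.exp(-a * x) * math.exp(-b * y)) < 1e-12
    for x in [0.0, 0.5, 1.0, 2.5]
    for y in [0.0, 0.4, 1.7, 3.0]
)
print(ok)
```

The identity holds exactly (up to floating-point rounding) at every tested point.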
For jointly continuous random variables, the probability of an event A is an integral of the joint pdf:
P[(X, Y) ∈ A] = ∫∫_{(x,y)∈A} f_{X,Y}(x′, y′) dx′ dy′
Note the similarity between this equation and that for the discrete case:
P[(X, Y) ∈ A] = Σ Σ_{(xⱼ,yₖ)∈A} p_{X,Y}(xⱼ, yₖ)

The joint pdf integrates to 1 over the whole plane:
∫_{−∞}^∞ ∫_{−∞}^∞ f_{X,Y}(x′, y′) dx′ dy′ = 1
and the joint cdf is obtained by integrating the joint pdf:
F_{X,Y}(x, y) = ∫_{−∞}^x ∫_{−∞}^y f_{X,Y}(x′, y′) dy′ dx′
The joint pdf gives the probability of an infinitesimal rectangle:
P[x < X ≤ x + dx, y < Y ≤ y + dy] = ∫_x^{x+dx} ∫_y^{y+dy} f_{X,Y}(x′, y′) dy′ dx′ ≈ f_{X,Y}(x, y) dx dy

Marginal pdfs
f_X(x) = d/dx F_X(x) = d/dx F_{X,Y}(x, ∞)
= d/dx ∫_{−∞}^x [ ∫_{−∞}^∞ f_{X,Y}(x′, y′) dy′ ] dx′
= ∫_{−∞}^∞ f_{X,Y}(x, y′) dy′
Similarly,
f_Y(y) = ∫_{−∞}^∞ f_{X,Y}(x′, y) dx′
Example 8
A randomly selected point (X, Y) in the unit
square has a uniform joint pdf given by
f_{X,Y}(x, y) = 1,  0 ≤ x ≤ 1, 0 ≤ y ≤ 1
and f_{X,Y}(x, y) = 0 otherwise.
Find the joint cdf.

Integrating the pdf over (−∞, x] × (−∞, y] gives five cases:
F_{X,Y}(x, y) = 0,  if x < 0 or y < 0
F_{X,Y}(x, y) = ∫_0^x ∫_0^y (1) dy′ dx′ = xy,  if 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
F_{X,Y}(x, y) = ∫_0^x ∫_0^1 (1) dy′ dx′ = x,  if 0 ≤ x ≤ 1, y > 1
F_{X,Y}(x, y) = ∫_0^1 ∫_0^y (1) dy′ dx′ = y,  if x > 1, 0 ≤ y ≤ 1
F_{X,Y}(x, y) = 1,  if x > 1, y > 1
Example 9
Given the joint pdf
f_{X,Y}(x, y) = c e^{−x} e^{−y},  0 ≤ y ≤ x < ∞
and f_{X,Y}(x, y) = 0 otherwise,
(a) find the normalization constant c, (b) find the marginal pdfs,
and (c) find P[X + Y ≤ 1].

(a) The constant is found from the normalization condition:
1 = ∫_{x=0}^∞ ∫_{y=0}^x c e^{−x} e^{−y} dy dx
= ∫_{x=0}^∞ c e^{−x} (1 − e^{−x}) dx
= c [1 − 1/2]
= c/2
Therefore, c = 2.

(b) The marginal pdfs:
f_X(x) = ∫_{−∞}^∞ f_{X,Y}(x, y′) dy′ = ∫_0^x 2 e^{−x} e^{−y} dy
= 2 e^{−x} (1 − e^{−x}),  0 ≤ x < ∞
f_Y(y) = ∫_{−∞}^∞ f_{X,Y}(x′, y) dx′ = ∫_y^∞ 2 e^{−x} e^{−y} dx
= 2 e^{−y} [0 + e^{−y}] = 2 e^{−2y},  0 ≤ y < ∞
(c)
The region corresponding to the intersection of the event {X + Y ≤ 1} and the region where the
pdf is non-zero is the triangle 0 ≤ y ≤ x with x + y ≤ 1.
P[X + Y ≤ 1] = ∫_{y=0}^{1/2} ∫_{x=y}^{1−y} 2 e^{−x} e^{−y} dx dy
= ∫_{y=0}^{1/2} 2 e^{−y} [e^{−y} − e^{−(1−y)}] dy
= ∫_{y=0}^{1/2} (2 e^{−2y} − 2 e^{−1}) dy
= [−e^{−2y}]_{y=0}^{1/2} − 2 e^{−1} (1/2)
= (1 − e^{−1}) − e^{−1}
= 1 − 2 e^{−1}
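The results of parts (a) and (c) can be verified by numerically integrating the reduced one-dimensional forms that appear in the derivation (midpoint rule; the grid sizes below are my choice for the check):

```python
import math

def midpoint(g, lo, hi, n=100000):
    # midpoint-rule numerical integration of g over [lo, hi]
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) for i in range(n)) * h

# normalization: the marginal f_X(x) = 2 e^{-x}(1 - e^{-x}) integrated over
# [0, 40] (the tail beyond 40 is negligible) should be 1
total = midpoint(lambda x: 2 * math.exp(-x) * (1 - math.exp(-x)), 0.0, 40.0)

# P[X + Y <= 1]: outer integral over y of 2 e^{-y}(e^{-y} - e^{y-1}),
# with the inner integral over x already done in closed form
prob = midpoint(lambda y: 2 * math.exp(-y) * (math.exp(-y) - math.exp(y - 1)), 0.0, 0.5)

print(round(total, 4), round(prob, 4))
```

The two numbers come out as 1 and 1 − 2/e ≈ 0.2642, matching the derivation.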
Example 10
The joint pdf of X and Y is given by
f_{X,Y}(x, y) = (1/(2π√(1 − ρ²))) e^{−(x² − 2ρxy + y²)/2(1−ρ²)},  −∞ < x, y < ∞
Find the marginal pdfs.

The marginal pdf of X:
f_X(x) = ∫_{−∞}^∞ f_{X,Y}(x, y) dy
= (1/(2π√(1 − ρ²))) ∫_{−∞}^∞ e^{−(x² − 2ρxy + y²)/2(1−ρ²)} dy
Note that by completing the square,
x² − 2ρxy + y² = (y − ρx)² + (1 − ρ²) x²
Therefore,
f_X(x) = (e^{−x²/2}/√(2π)) · (1/√(2π(1 − ρ²))) ∫_{−∞}^∞ e^{−(y − ρx)²/2(1−ρ²)} dy
Note that
(1/√(2π(1 − ρ²))) ∫_{−∞}^∞ e^{−(y − ρx)²/2(1−ρ²)} dy = 1
since the integral is that of a Gaussian pdf with mean
m = ρx and variance σ² = 1 − ρ².
Thus,
f_X(x) = e^{−x²/2}/√(2π)
The marginal pdf of X is therefore a 1-D Gaussian pdf with
mean m = 0 and variance σ² = 1.
Because of the symmetry of the joint pdf in x and y, the marginal pdf of Y is
f_Y(y) = e^{−y²/2}/√(2π)
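The marginal result of Example 10 can be checked by numerically integrating the joint pdf over y and comparing with the standard normal pdf. The value ρ = 0.6 is an arbitrary assumption for the check:

```python
import math

rho = 0.6  # assumed correlation value, not from the slides

def f_joint(x, y):
    # jointly Gaussian pdf with zero means, unit variances, correlation rho
    c = 1.0 / (2 * math.pi * math.sqrt(1 - rho ** 2))
    return c * math.exp(-(x * x - 2 * rho * x * y + y * y) / (2 * (1 - rho ** 2)))

def marginal(x, n=40000, ymax=12.0):
    # midpoint-rule integration over y in [-ymax, ymax]
    h = 2 * ymax / n
    return sum(f_joint(x, -ymax + (i + 0.5) * h) for i in range(n)) * h

gauss_ok = all(
    abs(marginal(x) - math.exp(-x * x / 2) / math.sqrt(2 * math.pi)) < 1e-6
    for x in [-2.0, -0.5, 0.0, 1.0, 2.5]
)
print(gauss_ok)
```

The integrated marginal agrees with e^{−x²/2}/√(2π) regardless of ρ, as the completion-of-the-square argument shows.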
Independence of two random variables
X and Y are independent random variables if and only if the joint cdf factors:
F_{X,Y}(x, y) = F_X(x) F_Y(y),  for all x and y
For discrete random variables this is equivalent to
p_{X,Y}(xⱼ, yₖ) = p_X(xⱼ) p_Y(yₖ),  for all xⱼ and yₖ
and for jointly continuous random variables it is equivalent to
f_{X,Y}(x, y) = f_X(x) f_Y(y),  for all x and y
Independence implies that
P[X ∈ A₁, Y ∈ A₂] = P[X ∈ A₁] P[Y ∈ A₂]
Example 11
In Example 5, we obtained
p_{Q,R}(q, r) = (1 − p) p^{qM+r},  q = 0, 1, …; r = 0, 1, …, M − 1
p_Q(q) = (1 − p^M) p^{qM},  q = 0, 1, …
p_R(r) = (1 − p) p^r / (1 − p^M),  r = 0, 1, …, M − 1
Are Q and R independent? Checking the product of the marginals:
p_Q(q) p_R(r) = (1 − p^M) p^{qM} · (1 − p) p^r / (1 − p^M)
= (1 − p) p^{qM+r}
= p_{Q,R}(q, r)
Therefore, Q and R are independent.
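The factorization argument of Example 11 can be confirmed numerically; the values p = 0.8 and M = 4 below are arbitrary assumptions for the check:

```python
p, M = 0.8, 4  # assumed example values, not from the slides

def joint(q, r):
    # joint pmf from Example 5
    return (1 - p) * p ** (q * M + r)

def pQ(q):
    # marginal pmf of Q
    return (1 - p ** M) * p ** (q * M)

def pR(r):
    # marginal pmf of R
    return (1 - p) * p ** r / (1 - p ** M)

# the joint pmf should equal the product of the marginals everywhere
factors = all(
    abs(joint(q, r) - pQ(q) * pR(r)) < 1e-12
    for q in range(30) for r in range(M)
)
print(factors)
```

The joint pmf equals the product of the marginals at every tested (q, r), confirming independence.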
Example 12
In Example 9, given the joint pdf:
f_{X,Y}(x, y) = 2 e^{−x} e^{−y},  0 ≤ y ≤ x < ∞
(and 0 otherwise), are X and Y independent?
We found the marginal pdfs
f_X(x) = 2 e^{−x} (1 − e^{−x}),  0 ≤ x < ∞
f_Y(y) = 2 e^{−2y},  0 ≤ y < ∞
Since f_X(x) f_Y(y) ≠ f_{X,Y}(x, y), X and Y are not independent.
Example 13
In Example 6, given the joint cdf:
F_{X,Y}(x, y) = (1 − e^{−αx})(1 − e^{−βy}),  x ≥ 0, y ≥ 0
and F_{X,Y}(x, y) = 0 otherwise,
we obtained the marginal cdfs:
F_X(x) = 1 − e^{−αx},  x ≥ 0
F_Y(y) = 1 − e^{−βy},  y ≥ 0
Since F_X(x) F_Y(y) = (1 − e^{−αx})(1 − e^{−βy}) = F_{X,Y}(x, y)
for all x and y, X and Y are independent.
Conditional probability
The probability that Y ∈ A, given that we know the exact value of X, i.e.
X = x:
P[Y ∈ A | X = x] = P[Y ∈ A, X = x] / P[X = x],  P[X = x] > 0   (1)

(i) When X is discrete, (1) can be used to obtain the conditional cdf of Y
given X = xₖ:
F_Y(y | xₖ) = P[Y ≤ y, X = xₖ] / P[X = xₖ],  P[X = xₖ] > 0   (2)
The conditional pdf of Y given X = xₖ, if the derivative exists, is
f_Y(y | xₖ) = (d/dy) F_Y(y | xₖ)   (3)
If X and Y are independent, then
P[Y ≤ y, X = xₖ] = P[Y ≤ y] P[X = xₖ] = F_Y(y) P[X = xₖ]
Thus, from (2),
F_Y(y | xₖ) = F_Y(y)
f_Y(y | xₖ) = f_Y(y)
(ii)
When X and Y are discrete random variables, the pdf will consist of delta functions,
so it is easier to work with the conditional pmf of Y given X = xₖ:
p_Y(yⱼ | xₖ) = P[Y = yⱼ | X = xₖ]
= P[X = xₖ, Y = yⱼ] / P[X = xₖ]
= p_{X,Y}(xₖ, yⱼ) / p_X(xₖ),  p_X(xₖ) > 0   (4)
The probability of any event A given X = xₖ is found by summing the pmf over the event:
P[Y ∈ A | X = xₖ] = Σ_{yⱼ∈A} p_Y(yⱼ | xₖ)
If X and Y are independent,
p_Y(yⱼ | xₖ) = P[X = xₖ, Y = yⱼ] / P[X = xₖ]
= P[X = xₖ] P[Y = yⱼ] / P[X = xₖ]
= P[Y = yⱼ] = p_Y(yⱼ)
(iii)
When X is continuous, P[X = x] = 0, so we condition on the event {x < X ≤ x + h} and let h → 0:
F_Y(y | x < X ≤ x + h) = P[Y ≤ y, x < X ≤ x + h] / P[x < X ≤ x + h]
≈ ∫_{−∞}^y f_{X,Y}(x, y′) h dy′ / (f_X(x) h)
As h → 0, this yields the conditional cdf of Y given X = x:
F_Y(y | x) = ∫_{−∞}^y f_{X,Y}(x, y′) dy′ / f_X(x)
The conditional pdf of Y given X = x:
f_Y(y | x) = f_{X,Y}(x, y) / f_X(x)   (5)
Similarly,
f_X(x | y) = f_{X,Y}(x, y) / f_Y(y)
If X and Y are independent, then
f_Y(y | x) = f_X(x) f_Y(y) / f_X(x) = f_Y(y)
and similarly
f_X(x | y) = f_X(x)
Example 14
In Example 9, given the joint pdf
f_{X,Y}(x, y) = 2 e^{−x} e^{−y},  0 ≤ y ≤ x < ∞
and 0 otherwise, find f_X(x | y) and f_Y(y | x).

(a) Using the marginal f_Y(y) = 2 e^{−2y}:
f_X(x | y) = f_{X,Y}(x, y) / f_Y(y)
= 2 e^{−x} e^{−y} / (2 e^{−2y})
= e^{−(x−y)},  x ≥ y ≥ 0
(b) Using the marginal f_X(x) = 2 e^{−x}(1 − e^{−x}):
f_Y(y | x) = f_{X,Y}(x, y) / f_X(x)
= 2 e^{−x} e^{−y} / (2 e^{−x} (1 − e^{−x}))
= e^{−y} / (1 − e^{−x}),  0 ≤ y ≤ x
The conditional pdf of X is an exponential pdf shifted
by y to the right.
The conditional pdf of Y is an exponential pdf that has
been truncated to the interval [0, x].
The joint pmf can be expressed as the product of a conditional pmf and a marginal pmf:
p_{X,Y}(xₖ, yⱼ) = P[X = xₖ, Y = yⱼ]
= P[Y = yⱼ | X = xₖ] P[X = xₖ]
Therefore,
p_{X,Y}(xₖ, yⱼ) = p_Y(yⱼ | xₖ) p_X(xₖ)
Suppose we are interested in the probability that Y ∈ A:
P[Y ∈ A] = Σₖ P[Y ∈ A | X = xₖ] p_X(xₖ)   (6)
This is the theorem on total probability. When X is continuous, the sum in (6)
is replaced by an integral against the pdf of X:
P[Y ∈ A] = ∫_{−∞}^∞ P[Y ∈ A | X = x] f_X(x) dx   (7)
Example 15
The total number of defects N on a chip is a
Poisson random variable with mean α. Suppose
that each defect has a probability p of falling in a
specific region R, and the location of each defect
is independent of the locations of the other defects.
Find the pmf of the number of defects Y that fall
in the region R.

By the theorem on total probability,
P[Y = j] = Σ_{k=0}^∞ P[Y = j | N = k] P[N = k]
Given N = k defects, Y is binomial with parameters k and p:
P[Y = j | N = k] = C(k, j) p^j (1 − p)^{k−j},  0 ≤ j ≤ k
and P[N = k] = α^k e^{−α}/k!,  k = 0, 1, …
Since P[Y = j | N = k] = 0 for k < j,
P[Y = j] = Σ_{k=j}^∞ (k!/(j!(k−j)!)) p^j (1 − p)^{k−j} · α^k e^{−α}/k!
= ((αp)^j e^{−α}/j!) Σ_{k=j}^∞ (α(1 − p))^{k−j}/(k − j)!
= ((αp)^j e^{−α}/j!) e^{α(1−p)}
= ((αp)^j/j!) e^{−αp},  j = 0, 1, …
Thus, Y is a Poisson random variable with mean αp.
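The Poisson-thinning conclusion of Example 15 can be verified by evaluating the binomial-times-Poisson sum numerically and comparing it with the Poisson(αp) pmf. The values α = 3 and p = 0.25 are arbitrary assumptions for the check:

```python
import math

alpha, p = 3.0, 0.25  # assumed mean and region probability, not from the slides
KMAX = 150            # truncation of the sum over the total defect count N

def pmf_Y(j):
    # sum over k of P[Y = j | N = k] P[N = k]
    return sum(
        math.comb(k, j) * p ** j * (1 - p) ** (k - j)
        * alpha ** k * math.exp(-alpha) / math.factorial(k)
        for k in range(j, KMAX)
    )

# compare with the Poisson pmf with mean alpha * p
poisson_ok = all(
    abs(pmf_Y(j) - (alpha * p) ** j * math.exp(-alpha * p) / math.factorial(j)) < 1e-10
    for j in range(15)
)
print(poisson_ok)
```

The truncated sum matches the Poisson(αp) pmf to numerical precision for every tested j.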
Example 16
The random variable X is selected at random
from the unit interval (0, 1). The random variable
Y is then selected at random from the interval
[0, X]. Find the cdf of Y.

By the theorem on total probability,
F_Y(y) = ∫_0^1 F_Y(y | x) f_X(x) dx = ∫_0^1 F_Y(y | x) dx
where the conditional cdf of Y given X = x is
F_Y(y | x) = y/x,  0 ≤ y ≤ x
F_Y(y | x) = 1,  y > x
Therefore, for 0 ≤ y ≤ 1,
F_Y(y) = ∫_0^y (1) dx + ∫_y^1 (y/x) dx
= y + y ln x |_y^1
= y + y (ln 1 − ln y)
= y − y ln y,  0 ≤ y ≤ 1
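The closed form F_Y(y) = y − y ln y can be checked by numerically averaging the conditional cdf over the uniform distribution of X (midpoint rule; the grid size is my choice for the check):

```python
import math

def F_cond(y, x):
    # conditional cdf of Y given X = x: y/x for y <= x, 1 for y > x
    return min(y / x, 1.0)

def F_Y(y, n=100000):
    # midpoint-rule integral of F_Y(y | x) over x in (0, 1)
    h = 1.0 / n
    return sum(F_cond(y, (i + 0.5) * h) for i in range(n)) * h

cdf_ok = all(
    abs(F_Y(y) - (y - y * math.log(y))) < 1e-5
    for y in [0.1, 0.3, 0.5, 0.9]
)
print(cdf_ok)
```

The numerical average reproduces y − y ln y at every tested y.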
Conditional expectation
The conditional expectation of Y, given X = x, is
defined by
E[Y | x] = ∫_{−∞}^∞ y f_Y(y | x) dy   (2)
When X and Y are both discrete random variables:
E[Y | xₖ] = Σⱼ yⱼ p_Y(yⱼ | xₖ)   (3)
A very useful result is that E[Y] = E[E[Y | X]], i.e.,
E[Y] = ∫_{−∞}^∞ E[Y | x] f_X(x) dx   (4)
when X is continuous, and
E[Y] = Σₖ E[Y | xₖ] p_X(xₖ)   (5)
when X is discrete.
Proof, using the case when X and Y are jointly continuous random variables:
∫_{−∞}^∞ E[Y | x] f_X(x) dx = ∫_{−∞}^∞ ∫_{−∞}^∞ y f_Y(y | x) dy f_X(x) dx
Since f_Y(y | x) f_X(x) = f_{X,Y}(x, y), this becomes
= ∫_{−∞}^∞ y ∫_{−∞}^∞ f_{X,Y}(x, y) dx dy = ∫_{−∞}^∞ y f_Y(y) dy = E[Y]
The same reasoning applies to any function h(Y):
E[h(Y)] = ∫_{−∞}^∞ E[h(Y) | x] f_X(x) dx   (6)
E[h(Y)] = Σₖ E[h(Y) | xₖ] p_X(xₖ)   (7)
Example 17
In Example 15, given N = n defects in the chip,
the total number Y of defects that fall in the
region R has a binomial distribution:
P[Y = j | N = n] = C(n, j) p^j (1 − p)^{n−j},  0 ≤ j ≤ n
so E[Y | N = n] = np.
From (7):
E[Y] = Σ_{n=0}^∞ E[Y | N = n] p_N(n)
= Σ_{n=0}^∞ n p · p_N(n)
= p Σ_{n=0}^∞ n p_N(n)
= p E[N]
Since N is a Poisson random variable with mean α,
p_N(n) = α^n e^{−α}/n!,  n = 0, 1, …
and
E[N] = α
Therefore,
E[Y] = pα
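Averaging the conditional means over the pmf of N gives the same answer numerically. The values α = 2.5 and p = 0.4 below are arbitrary assumptions for the check (so the expected answer is pα = 1):

```python
import math

alpha, p = 2.5, 0.4  # assumed values, not from the slides
KMAX = 150           # truncation of the Poisson sum

# E[Y] = sum over n of E[Y | N = n] * p_N(n), with E[Y | N = n] = n * p
mean_Y = sum(
    n * p * alpha ** n * math.exp(-alpha) / math.factorial(n)
    for n in range(KMAX)
)
print(mean_Y)
```

The sum evaluates to pα, in agreement with E[Y] = p E[N].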
Expected value of a function of two random variables, Z = g(X, Y):
E[Z] = ∫_{−∞}^∞ ∫_{−∞}^∞ g(x, y) f_{X,Y}(x, y) dx dy   (jointly continuous case)
E[Z] = Σᵢ Σₙ g(xᵢ, yₙ) p_{X,Y}(xᵢ, yₙ)   (discrete case)
Example 18
Let Z = X + Y. Find E[Z].
E[Z] = E[X + Y]
= ∫_{−∞}^∞ ∫_{−∞}^∞ (x + y) f_{X,Y}(x, y) dx dy
= ∫∫ x f_{X,Y}(x, y) dx dy + ∫∫ y f_{X,Y}(x, y) dx dy
= E[X] + E[Y]
In general,
E[X₁ + X₂ + ⋯ + Xₙ] = E[X₁] + E[X₂] + ⋯ + E[Xₙ]
Example 19
Let X and Y be independent random variables and g(X, Y) =
g₁(X) g₂(Y). Find E[g(X, Y)].
Since X and Y are independent random variables,
f_{X,Y}(x, y) = f_X(x) f_Y(y)
so
E[g₁(X) g₂(Y)] = ∫∫ g₁(x) g₂(y) f_X(x) f_Y(y) dx dy
= E[g₁(X)] E[g₂(Y)]
In general, if X₁, X₂, …, Xₙ are independent random variables, then
E[g₁(X₁) g₂(X₂) ⋯ gₙ(Xₙ)]
= E[g₁(X₁)] E[g₂(X₂)] ⋯ E[gₙ(Xₙ)]
Correlation of X and Y:
E[XY]
(evaluated as a double integral when X and Y are jointly continuous, and as a double sum when X and Y are discrete).
Covariance of X and Y:
The covariance of X and Y is defined as the j = k = 1 central moment:
COV(X, Y) = E[(X − E[X])(Y − E[Y])]
The covariance of X and Y as defined above can also be expressed
as
COV(X, Y) = E[XY − X E[Y] − Y E[X] + E[X] E[Y]]
= E[XY] − 2 E[X] E[Y] + E[X] E[Y]
= E[XY] − E[X] E[Y]
Note that if either of the random variables has a zero mean, then
COV(X, Y) = E[XY].
If X and Y are independent random variables, then (X − E[X]) and
(Y − E[Y]) are also independent. Therefore,
COV(X, Y) = E[(X − E[X])(Y − E[Y])]
= E[X − E[X]] E[Y − E[Y]] = 0 · 0 = 0
Therefore, pairs of independent random variables have a zero
covariance.
The correlation coefficient of X and Y is defined as
ρ_{X,Y} = COV(X, Y)/(σ_X σ_Y)
where σ_X and σ_Y are the standard deviations of X and Y.
To see that ρ_{X,Y} is bounded, note that the expected value of a square is nonnegative:
0 ≤ E[((X − E[X])/σ_X ± (Y − E[Y])/σ_Y)²] = 1 ± 2 ρ_{X,Y} + 1
i.e., 1 ± 2 ρ_{X,Y} + 1 ≥ 0
Therefore,
−1 ≤ ρ_{X,Y} ≤ 1
The extreme values (i.e. +1 and −1) of ρ_{X,Y} are achieved
when X and Y are related linearly, that is Y = aX + b:
ρ_{X,Y} = 1, if a > 0
ρ_{X,Y} = −1, if a < 0
X and Y are said to be uncorrelated if ρ_{X,Y} = 0.
Example 20
Given the joint pdf
f_{X,Y}(x, y) = 2 e^{−x} e^{−y},  0 ≤ y ≤ x < ∞
and 0 otherwise, obtain the correlation coefficient ρ_{X,Y}.

From the joint pdf, we first obtain the marginal pdfs f_X(x) and
f_Y(y):
f_X(x) = ∫_{−∞}^∞ f_{X,Y}(x, y′) dy′ = ∫_0^x 2 e^{−x} e^{−y} dy
= 2 e^{−x} (1 − e^{−x}),  0 ≤ x < ∞
f_Y(y) = ∫_{−∞}^∞ f_{X,Y}(x′, y) dx′ = ∫_y^∞ 2 e^{−x} e^{−y} dx
= 2 e^{−2y},  0 ≤ y < ∞
The moments of X:
E[X] = ∫_0^∞ x · 2 e^{−x} (1 − e^{−x}) dx = 2 − 1/2 = 3/2
E[X²] = ∫_0^∞ x² · 2 e^{−x} (1 − e^{−x}) dx = 4 − 1/2 = 7/2
VAR[X] = E[X²] − (E[X])² = 7/2 − 9/4 = 5/4
The moments of Y:
E[Y] = ∫_0^∞ y · 2 e^{−2y} dy = 1/2
E[Y²] = ∫_0^∞ y² · 2 e^{−2y} dy = 1/2
VAR[Y] = E[Y²] − (E[Y])² = 1/2 − 1/4 = 1/4
The correlation:
E[XY] = ∫_{x=0}^∞ ∫_{y=0}^x x y · 2 e^{−x} e^{−y} dy dx
= 2 ∫_0^∞ x e^{−x} [1 − e^{−x}(1 + x)] dx = 1
Therefore,
COV(X, Y) = E[XY] − E[X] E[Y] = 1 − (3/2)(1/2) = 1/4
ρ_{X,Y} = COV(X, Y)/(σ_X σ_Y) = (1/4)/√((5/4)(1/4)) = 1/√5
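The value ρ_{X,Y} = 1/√5 ≈ 0.447 can be checked by brute-force numerical integration of the moments over the triangular support (midpoint rule on a grid; the grid size and truncation point are my choices for the check):

```python
import math

n, xmax = 1200, 25.0  # grid resolution and truncation of the infinite region
h = xmax / n
EX = EY = EX2 = EY2 = EXY = 0.0
for i in range(n):
    x = (i + 0.5) * h
    ex = math.exp(-x)
    for j in range(i + 1):               # only cells with y <= x contribute
        y = (j + 0.5) * h
        frac = 0.5 if j == i else 1.0    # diagonal cells lie half inside y <= x
        w = 2 * ex * math.exp(-y) * h * h * frac
        EX += x * w
        EY += y * w
        EX2 += x * x * w
        EY2 += y * y * w
        EXY += x * y * w

cov = EXY - EX * EY
rho = cov / math.sqrt((EX2 - EX ** 2) * (EY2 - EY ** 2))
print(rho)
```

The numerically computed moments reproduce E[X] = 3/2, E[Y] = 1/2, and E[XY] = 1, giving ρ ≈ 1/√5.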
where m is the mean vector defined by
m = (E[X₁], E[X₂], …, E[Xₙ])ᵀ
and K is the covariance matrix defined by
K = [ VAR[X₁]       COV(X₁, X₂)  ⋯  COV(X₁, Xₙ)
      COV(X₂, X₁)   VAR[X₂]      ⋯  COV(X₂, Xₙ)
      ⋮                               ⋮
      COV(Xₙ, X₁)   COV(Xₙ, X₂)  ⋯  VAR[Xₙ] ]
Suppose the covariance matrix K is diagonal, i.e.,
K_{ii} = VAR[Xᵢ] = σᵢ², i = 1, 2, …, n, and K_{ij} = COV(Xᵢ, Xⱼ) = 0 for i ≠ j.
Therefore, K = diag(σ₁², σ₂², …, σₙ²) and
K⁻¹ = diag(1/σ₁², 1/σ₂², …, 1/σₙ²)
and
|K| = σ₁² σ₂² ⋯ σₙ²
The quadratic form in the exponent then reduces to
(x − m)ᵀ K⁻¹ (x − m) = Σ_{i=1}^n (xᵢ − mᵢ)²/σᵢ²
Thus,
f_{X₁,X₂,…,Xₙ}(x₁, x₂, …, xₙ) = (1/((2π)^{n/2} σ₁ σ₂ ⋯ σₙ)) exp( −(1/2) Σ_{i=1}^n (xᵢ − mᵢ)²/σᵢ² )
= ∏_{i=1}^n (1/(√(2π) σᵢ)) e^{−(xᵢ − mᵢ)²/2σᵢ²}
Thus, for jointly Gaussian random variables X₁, X₂, …, Xₙ, if the covariance matrix is a diagonal matrix
(i.e., the pairwise covariances COV(Xᵢ, Xⱼ), i ≠ j, are zero), then the individual random variables are
independent Gaussian random variables.
For the case of two jointly Gaussian random variables (n = 2) with means m₁, m₂,
variances σ₁², σ₂², and correlation coefficient ρ_{X,Y}, the covariance matrix is
K = [ σ₁²             ρ_{X,Y} σ₁ σ₂
      ρ_{X,Y} σ₁ σ₂   σ₂² ]
with determinant
|K| = σ₁² σ₂² (1 − ρ²_{X,Y})
and inverse
K⁻¹ = (1/(σ₁² σ₂² (1 − ρ²_{X,Y}))) [ σ₂²              −ρ_{X,Y} σ₁ σ₂
                                      −ρ_{X,Y} σ₁ σ₂   σ₁² ]
Substituting into the general jointly Gaussian pdf, the quadratic form in the exponent becomes
(x − m)ᵀ K⁻¹ (x − m)
= (1/(1 − ρ²_{X,Y})) [ (x₁ − m₁)²/σ₁² − 2ρ_{X,Y}(x₁ − m₁)(x₂ − m₂)/(σ₁ σ₂) + (x₂ − m₂)²/σ₂² ]
so the joint pdf of two jointly Gaussian random variables is
f_{X,Y}(x₁, x₂) = (1/(2π σ₁ σ₂ √(1 − ρ²_{X,Y})))
× exp{ −(1/2(1 − ρ²_{X,Y})) [ (x₁ − m₁)²/σ₁² − 2ρ_{X,Y}(x₁ − m₁)(x₂ − m₂)/(σ₁ σ₂) + (x₂ − m₂)²/σ₂² ] }