
Probability Review

Course : AAOC C312


Review of Basic Probability (Ch 12)
Laws of probability
Addition Law
Conditional Probability
Random Variables
Probability Distribution
Joint random variable
Some Common Probability Distributions
Binomial, Poisson, Exponential, Normal
Probability
Probability provides a measure of uncertainty
associated with the occurrence of events or
outcomes of a random experiment.
Experiments
Sample Space
Events
0 ≤ P(E) ≤ 1
Impossible event: P(∅) = 0
Certain event: P(S) = 1
Mutually Exclusive Events
Pairwise Mutually Exclusive Events
Equally Likely Events


Some definitions
An experiment
Any process that yields a result or an
observation
Outcome
A particular result of an experiment
Sample space
The set of all possible outcomes of an
experiment
Some definitions
An event
Any subset of the sample space.
If the event is A, then n (A) is the number
of sample points that belong to event A
If the event is getting heads on a series of
coin flips, then n (heads) is the number of
heads in the sample of flips
Mutually Exclusive Events
Events defined in such a way that the
occurrence of one event precludes the
occurrence of any of the other events
If one of them happens, the other cannot
happen
Probability
Probability provides a measure of uncertainty
associated with the occurrence of events or
outcomes of a random experiment.
Definition:
If in an n-trial experiment an event E occurs
m times, then the probability of occurrence of
event E is

P[E] = lim_{n→∞} (m/n)

By definition,
0 ≤ P[E] ≤ 1
P[E] = 0: impossible event
P[E] = 1: sure event

Example: What is the probability of getting an
even number when rolling a die?
Example: What is the probability of getting a
total of 7 on two dice?
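For a quick check of both examples: a fair die has three even
faces, so P(even) = 3/6 = 1/2; with two dice there are 36 equally
likely outcomes and six of them, (1,6), (2,5), (3,4), (4,3), (5,2),
(6,1), give a total of 7, so P(total = 7) = 6/36 = 1/6.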

Addition law of Probability

P{E + F} = P{E} + P{F}, if E and F are mutually exclusive
P{E + F} = P{E} + P{F} − P{EF}, otherwise
For two events E and F, E + F represents
union, and EF represents intersection.
Conditional Law of Probability
For two events E and F with P{F} > 0, the
conditional probability of E given F is
P{E | F} = P{EF} / P{F}
If E is a subset of F, then P{EF} = P{E}.
Two events E and F are independent if and only if
P{E | F} = P{E}.
In this case the multiplication rule becomes
P{EF} = P{E} P{F}.
Problem: In a certain college, 25 percent of the
students failed mathematics, 15 percent failed
chemistry, and 10 percent failed both
mathematics and chemistry. A student is
selected at random.
(a) if the student failed chemistry, what is the
probability that he failed mathematics?
(b) if the student failed mathematics, what is the
probability that he failed chemistry?
(c) what is the probability that he failed chemistry
or mathematics?
(d) what is the probability that he failed neither
chemistry nor mathematics?
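Solution sketch, writing M and C for failing mathematics and
chemistry, so P(M) = 0.25, P(C) = 0.15, P(MC) = 0.10:
(a) P(M | C) = P(MC)/P(C) = 0.10/0.15 = 2/3
(b) P(C | M) = P(MC)/P(M) = 0.10/0.25 = 2/5
(c) P(M + C) = 0.25 + 0.15 − 0.10 = 0.30
(d) 1 − P(M + C) = 1 − 0.30 = 0.70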

Problem: Two men A and B fire at a target.
Suppose P(A) = 1/3 and P(B) = 1/5 denote
their probabilities of hitting the target. ( we
assume that the events A and B are
independent). Find the probability that
(a) A does not hit the target
(b) Both hit the target
(c) One of them hits the target
(d) Neither hits the target.
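Solution sketch, using independence (and reading (c) as
"at least one of them hits"):
(a) P(A does not hit) = 1 − 1/3 = 2/3
(b) P(both hit) = (1/3)(1/5) = 1/15
(c) P(at least one hits) = 1 − (2/3)(4/5) = 7/15
(d) P(neither hits) = (2/3)(4/5) = 8/15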
Bayes' Theorem
For two events A and B with P{B} > 0,
P{A | B} = P{B | A} P{A} / P{B}
Let E be an event in a sample space S, and let A_1, A_2, ..., A_n
be mutually disjoint events whose union is S. Then
P{E} = P{E | A_1} P{A_1} + P{E | A_2} P{A_2} + ... + P{E | A_n} P{A_n}
and, for k = 1, 2, ..., n,
P{A_k | E} = P{E | A_k} P{A_k} / P{E}
Problem: Three machines A, B and C produce,
respectively, 40%, 10% and 50% of the items in
a factory. The percentages of defective items
produced by the machines are, respectively, 2%,
3% and 4%. An item from the factory is selected
at random.
(a) Find the probability that the item is defective
(b) If the item is defective, find the probability that
the item was produced by (i) machine A, (ii)
machine B, (iii) machine C.
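Solution sketch, by total probability and Bayes' theorem,
with D = "item is defective":
(a) P(D) = 0.40(0.02) + 0.10(0.03) + 0.50(0.04)
         = 0.008 + 0.003 + 0.020 = 0.031
(b) P(A | D) = 0.008/0.031 = 8/31, P(B | D) = 0.003/0.031 = 3/31,
    P(C | D) = 0.020/0.031 = 20/31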
Random Variables
Definition: A random variable X on a sample space S is
a rule that assigns a numerical value to each outcome
of S; in other words, a function from S into the set R
of real numbers.
X : S → R
x : a value of the random variable X
R_X : the set of numbers assigned by the random variable X,
i.e., the range space.
Random Variables (contd)
Random variables are classified according to the
number of values which they can assume, i.e. the
number of elements in R_X.
Discrete Random Variables: random variables which
can take on only a finite number, or a countable
infinity, of values, i.e. R_X is finite or countably infinite.
Continuous Random Variables: the range space
R_X is a continuum of numbers, for example an
interval or a union of intervals.
Random Variables (contd)
Example: Consider the experiment consisting of 4
tosses of a coin; then the sample space is
S = {HHHH, HHHT, HHTH, HTHH, THHH, HHTT,
HTTH, TTHH, THTH, HTHT, THHT, TTTH,
TTHT, THTT, HTTT, TTTT}
Let X assign to each (sample) point in S the total
number of heads that occurs. Then X is a random
variable with range space
R_X = {0, 1, 2, 3, 4}
Since the range space is finite, X is a discrete random
variable.
Random Variables (contd)
Example: A point P is chosen at random in
a circle C with radius r. Let X be the
distance of the point from the center of the
circle. Then X is a (continuous) random
variable with R_X = [0, r].
(Figure: point P inside circle C of radius r, at distance X from the center O.)
Probability Distributions
If X is discrete random variable, the function given
by
f(x) = P[X = x]
for each x within the range of X is called the
probability mass function (pmf) of X.
To express the probability mass function, we
give a table that exhibits the correspondence
between the values of the random variable and the
associated probabilities.
Probability Distributions (contd)
Ex: In the experiment consisting of four tosses
of a coin, assume that all 16 outcomes are
equally likely. Then the probability mass
function for the total number of heads is
x 0 1 2 3 4
f(x) 1/16 1/4 3/8 1/4 1/16
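The table can be checked by brute-force enumeration. The short
Python sketch below (not part of the original slides) lists all 16
outcomes and tallies the number of heads:

    from itertools import product
    from fractions import Fraction

    # All 16 equally likely outcomes of four coin tosses.
    outcomes = list(product("HT", repeat=4))

    # X = total number of heads; tally how many outcomes give each value of X.
    counts = {}
    for outcome in outcomes:
        x = outcome.count("H")
        counts[x] = counts.get(x, 0) + 1

    for x in sorted(counts):
        print(x, Fraction(counts[x], len(outcomes)))
    # prints: 0 1/16, 1 1/4, 2 3/8, 3 1/4, 4 1/16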
Probability Distributions (contd)
A function can serve as the probability mass function
of a discrete random variable X if and only if its
values f(x) satisfy the conditions
1. f(x) ≥ 0 for every value of x.
2. Σ_{all x} f(x) = 1.
Example: Check whether the following can define
probability distributions

(a) f(x) = x/15 for x = 0, 1, 2, 3, 4, 5.
Probability Distributions (contd)
(b) f(x) = (5 − x²)/6 for x = 0, 1, 2, 3.
(c) f(x) = 1/4 for x = 3, 4, 5, 6.
(d) f(x) = (x + 1)/25 for x = 1, 2, 3, 4, 5.
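As a quick check of the two conditions for (a): the values x/15
for x = 0, ..., 5 are all non-negative and sum to 15/15 = 1, so (a)
defines a valid probability mass function. The same two checks
(non-negativity and total probability 1) settle parts (b)-(d).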
Distribution Function
If X is a discrete random variable, the function given
by
F(x) = P(X ≤ x) = Σ_{t ≤ x} f(t), for −∞ < x < ∞,
where f(t) is the value of the probability mass function
of X at t, is called the distribution function or the
cumulative distribution function (cdf) of X.
Example
Cumulative Distribution function of the total
number of heads obtained in four tosses of a
balanced coin
We know that f(0) = 1/16, f(1) = 4/16, f(2) = 6/16,
f(3) = 4/16, f(4) = 1/16. It follows that
F(0) = f(0) = 1/16
F(1) = f(0) + f(1) = 5/16
F(2) = f(0) + f(1) + f(2) =11/16
F(3) = f(0) + f(1) + f(2) + f(3) = 15/16
F(4) = f(0) + f(1) + f(2) + f(3) + f (4) = 1

The distribution function is given by
F(x) = 0       for x < 0
     = 1/16    for 0 ≤ x < 1
     = 5/16    for 1 ≤ x < 2
     = 11/16   for 2 ≤ x < 3
     = 15/16   for 3 ≤ x < 4
     = 1       for x ≥ 4
The distribution function is defined not only for the
values taken on by the given random variable, but
for all real numbers.


(Figure: graph of the distribution function F(x), a step function with
jumps at x = 0, 1, 2, 3, 4 reaching heights 1/16, 5/16, 11/16, 15/16, 1.)
Distribution Function (contd)
The values F(x) of the distribution function of a
discrete random variable X satisfy the conditions
1. F(−∞) = 0 and F(∞) = 1; that is, it ranges from 0 to 1.
2. If a < b, then F(a) ≤ F(b) for any real numbers a
and b. Hence it is non-decreasing.
If the range of a random variable X consists of the
values x_1 < x_2 < x_3 < ... < x_n, then f(x_1) = F(x_1) and
f(x_i) = F(x_i) − F(x_{i-1}) for i = 2, 3, ..., n.
That is, f(x_i) is the size of the jump in the graph.
Similarly, for a continuous random variable X, we associate
a probability density function (pdf) f, such that
(a) f(x) ≥ 0, for all real x.
(b) f is integrable and P(a ≤ X ≤ b) = ∫_a^b f(x) dx, if a < b.
(c) ∫_{−∞}^{∞} f(x) dx = 1.
(d) F(x) = ∫_{−∞}^{x} f(t) dt, for each real x.
(e) f(x) = dF(x)/dx.
Parameters of random variables
(i) Expectation of a random variable X is
μ = E(X) = Σ_{all x} x f(x)  or  ∫_{−∞}^{∞} x f(x) dx
If h is a real-valued function of X, then
E(h(X)) = Σ_{all x} h(x) f(x)  or  ∫_{−∞}^{∞} h(x) f(x) dx
(ii) Variance
σ² = Var(X) = E[(X − μ)²] = E(X²) − {E(X)}²
(iii) Moments: the r-th moment about the origin is
μ'_r = E(X^r)
The r-th moment about the mean is
μ_r = E[(X − μ)^r]
Probability Density Function (pdf)

Characteristic        Discrete X                      Continuous X
Applicable range      x = a, a+1, ..., b              a ≤ x ≤ b
Conditions for pdf    p(x) ≥ 0,                       f(x) ≥ 0,
                      Σ_{x=a}^{b} p(x) = 1            ∫_a^b f(x) dx = 1
CDF                   P{x ≤ X} = Σ_{x=a}^{X} p(x)     F(X) = ∫_a^X f(x) dx
Problem: The number of units, x, needed for an
item is discrete from 1 to 5. The probability p(x)
is directly proportional to the number of units
needed. The constant of proportionality is K.
(a) Find the pdf of x.
(b) Find the value of the constant K.
(c) Determine the CDF, and find the probability that
x is an even value.
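Solution sketch: p(x) = Kx for x = 1, ..., 5, so Σ p(x) = 15K = 1
and K = 1/15. The CDF is P{X ≤ x} = x(x + 1)/30, and
P(x even) = p(2) + p(4) = 2/15 + 4/15 = 2/5.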

Problem: Consider the following function
f(x) = k/x², 10 ≤ x ≤ 20.
(a) Find the value of the constant k that will make
f(x) a pdf.
(b) Determine the CDF, and find the probability that
x is (i) larger than 12, and (ii) between 13 and 15.
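Solution sketch: ∫_10^20 k/x² dx = k(1/10 − 1/20) = k/20 = 1, so k = 20.
Then F(x) = 2 − 20/x for 10 ≤ x ≤ 20, giving
(i) P(x > 12) = 1 − F(12) = 2/3 and
(ii) P(13 ≤ x ≤ 15) = F(15) − F(13) = 20/13 − 20/15 = 8/39 ≈ 0.21.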
Expectation of Random Variable
Given that h(x) is a real function of a
random variable x, we define the expected
value of h(x), E{h(x)}, as the weighted
average with respect to the pdf of x.

E{h(x)} = Σ_{x=a}^{b} h(x) p(x),   x discrete
        = ∫_a^b h(x) f(x) dx,      x continuous
Moments of Random Variable
The m-th moment of a random variable x,
denoted by E{x^m}, also called the
expected value of x^m, is defined as
E{x^m} = Σ_{x=a}^{b} x^m p(x),   x discrete
       = ∫_a^b x^m f(x) dx,      x continuous
Mean
E{x} = Σ_{x=a}^{b} x p(x),   x discrete
     = ∫_a^b x f(x) dx,      x continuous
The mean of x, E{x}, is a numeric measure of the
central tendency of the random variable.
It is the first moment of x.
Variance
Var{x} = Σ_{x=a}^{b} (x − E{x})² p(x),   x discrete
       = ∫_a^b (x − E{x})² f(x) dx,      x continuous
stdDev{x} = √(Var{x})
The variance Var{x} is a measure of the
dispersion of x around the mean.
Problems
Consider a random variable X that is equal to 1,2 or
3. If we know p(1) =1/2 and p(2) = 1/3 then p(3)=?
Find E{x} and Var{x}, where x is the outcome when
we roll a fair die.
Suppose the r.v. X has the following distribution function
F(x) = 0              for x ≤ 0
     = 1 − exp(−x²)   for x > 0
What is the probability that X exceeds 1?
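Solution sketches: p(3) = 1 − 1/2 − 1/3 = 1/6.
For a fair die, E{x} = (1 + 2 + ... + 6)/6 = 7/2 and
Var{x} = E{x²} − (E{x})² = 91/6 − 49/4 = 35/12.
For the distribution function as written above,
P(X > 1) = 1 − F(1) = exp(−1) ≈ 0.37.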
Problems
A construction firm has recently sent in
bids for 3 jobs worth (in profit) 10, 20 and
40 (thousand) dollars. If its probabilities of
winning the jobs are respectively 0.2, 0.8
and 0.3, what is the firm's expected total
profit?
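Solution sketch: by linearity of expectation, the expected total
profit is 10(0.2) + 20(0.8) + 40(0.3) = 2 + 16 + 12 = 30 thousand dollars.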
Some Standard Distributions
Bernoulli Distribution: A r.v. X is said to have a
Bernoulli distribution if and only if the corresponding
probability mass function is given by
P(X = x) = p^x (1 − p)^(1−x), x = 0, 1.
Also, E(X) = p, Var(X) = p(1 − p), and M_X(t) = 1 − p + p e^t.
Binomial Distribution: A r.v. X is said to have a
Binomial distribution if and only if the corresponding
probability mass function is given by
P(X = x) = C(n, x) p^x (1 − p)^(n−x), x = 0, 1, ..., n
where C(n, x) = n!/(x!(n − x)!).
E(X) = np, Var(X) = np(1 − p), and M_X(t) = (1 − p + p e^t)^n.
Geometric Distribution: A r.v. X is said to have a
Geometric distribution if and only if the corresponding
probability mass function is given by
P(X = x) = p q^(x−1), x = 1, 2, 3, ...; q = 1 − p
E(X) = 1/p, Var(X) = q/p², and M_X(t) = p e^t (1 − q e^t)^(−1)
Memoryless Property
P(X > t + h | X > t) = P(X > h), t > 0, h > 0
Poisson Distribution: A random variable X is
said to be a Poisson random variable with parameter
λ > 0 if X has the mass points 0, 1, 2, ... and its
probability mass function is
P(X = x) = e^(−λ) λ^x / x!, x = 0, 1, 2, ...
In this case,
E(X) = λ, Var(X) = λ, and M_X(t) = e^(λ(e^t − 1)).
Theorem: If X and Y are independent Poisson
random variables with parameters λ₁ and λ₂
respectively, then X + Y will be a Poisson random
variable with parameter λ₁ + λ₂.
Theorem: Suppose X has a binomial distribution with
parameters n and p. If n is large and p is small so
that λ = np stays moderate, then X will (approximately)
follow a Poisson distribution with parameter λ = np.
Exponential Distribution
A continuous r.v. whose probability density
function is given, for some λ > 0, by
f(x) = λ e^(−λx)  if x ≥ 0
     = 0          if x < 0
Its CDF is
F(x) = 1 − e^(−λx), x ≥ 0
E[X] = 1/λ, V[X] = 1/λ².
Markov or Memoryless Property of the Exponential
Distribution
P(X > t + h | X > t) = P(X > h), t > 0, h > 0
If the number of arrivals at a service facility
during a specified time period follows a
Poisson distribution, then the distribution
of the time interval between successive
arrivals must be the exponential
distribution.
If λ is the rate at which events occur,
then 1/λ is the average time interval
between successive events.
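This relationship is easy to see in simulation. A minimal Python
sketch (not from the slides; the rate lam = 3 and the horizon are
arbitrary assumed values) generates exponential inter-arrival times
and checks that the per-unit-interval counts have mean and variance
close to λ, as a Poisson count should:

    import random

    lam = 3.0          # assumed arrival rate (events per unit time)
    horizon = 10000.0  # total simulated time

    # Arrival epochs built from exponential inter-arrival gaps with mean 1/lam.
    t, arrival_times = 0.0, []
    while t < horizon:
        t += random.expovariate(lam)
        arrival_times.append(t)

    # Count arrivals falling in each unit-length interval [k, k+1).
    counts = [0] * int(horizon)
    for t in arrival_times:
        if t < horizon:
            counts[int(t)] += 1

    mean = sum(counts) / len(counts)
    var = sum((c - mean) ** 2 for c in counts) / len(counts)
    print(mean, var)   # both should be close to lam for a Poisson count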
Uniform: X ~ U(a, b), a < b, if (Fig. 1)
f_X(x) = 1/(b − a),  a ≤ x ≤ b
       = 0,          otherwise.
(Fig. 1: f_X(x) is constant at height 1/(b − a) on the interval [a, b].)
Exponential: X ~ E(λ) if (Fig. 2)
f_X(x) = λ e^(−λx),  x ≥ 0
       = 0,          otherwise.
(Fig. 2: f_X(x) decays exponentially from the value λ at x = 0.)
Mean of Uniform Distribution
μ = ∫_a^b x f(x) dx = ∫_a^b x · 1/(b − a) dx = (a + b)/2
σ² = (1/12)(b − a)²
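For reference, the variance line follows from the second moment
(a standard computation, not shown on the slide):
E(X²) = ∫_a^b x²/(b − a) dx = (a² + ab + b²)/3,
so σ² = E(X²) − μ² = (a² + ab + b²)/3 − (a + b)²/4 = (b − a)²/12.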
Normal Distribution
Normal (Gaussian): X is said to be a normal or Gaussian r.v. if
f_X(x) = (1/(σ√(2π))) e^(−(x − μ)²/(2σ²)), −∞ < x < ∞
This is a bell-shaped curve, symmetric around the
parameter μ, and its distribution function is given by
F_X(x) = ∫_{−∞}^{x} (1/(σ√(2π))) e^(−(y − μ)²/(2σ²)) dy = G((x − μ)/σ),
where G(x) = (1/√(2π)) ∫_{−∞}^{x} e^(−y²/2) dy is often tabulated.
Since f_X(x) depends on the two parameters μ and σ², the notation
X ~ N(μ, σ²) will be used to represent it.
(Fig. A: bell-shaped density f_X(x) centered at μ, with spread determined by σ².)
The Standard Normal
Distribution
To find P(a < x < b), we need to
find the area under the appropriate
normal curve. There are several
such normal curves, but one of
them is called standard normal
curve.
The Standard Normal
Distribution
Definition : The normal distribution
with Mean = 0; Standard deviation = 1
is called the standard normal
distribution (standard normal variable
is denoted by Z).


Normal to Standard Normal Distribution
A normal r.v. X with mean μ and standard deviation σ is
standardized by
Z = (X − μ)/σ
and conversely X = σZ + μ.
The Normal Approximation
to the Binomial
We can calculate binomial probabilities
using
The binomial formula
The cumulative binomial tables
When n is large, and p is not too close
to zero or one, areas under the
normal curve with mean np and
variance npq can be used to
approximate binomial probabilities.
Approximating the Binomial
While approximating a random variable with integer
values by a continuous random variable, use a
continuity correction: the integer value x₀ of the
discrete random variable is replaced by the interval
(x₀ − 1/2, x₀ + 1/2) of the continuous random variable.
Thus if a and b are integers, and X* is a continuous
random variable approximating the discrete random
variable X, then
P(a ≤ X ≤ b) = P(a − 1/2 ≤ X* ≤ b + 1/2)


Make sure that np and nq are both
greater than 15 to avoid inaccurate
approximations!
Exercise :The probability that an
electronic component will fail in less
than 1200 hours of continuous use is
0.2. Use normal approximation to find
the probability that among 250 such
components, fewer than 50 will fail in
less than 1200 hours of continuous
use.
X= no. of electronic components among 250
randomly chosen which fail in less than
1200 hours.
X has a binomial distribution with n = 250, p = 0.2.
We can approximate X by a normal random
variable X* with mean (250)(0.2) = 50 and
variance (50)(0.8) = 40, so σ = 6.324.
Z = (X* − 50)/6.324 has the standard normal distribution.

P(X < 50) = P(X ≤ 49) = P(X* ≤ 49.5)
          = P(Z ≤ (49.5 − 50)/6.324)
          = F(−0.079) = 1 − F(0.079) ≈ 0.4681
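The approximation can be checked against the exact binomial sum.
A short Python sketch (not part of the slides) computes both:

    from math import comb, erf, sqrt

    n, p = 250, 0.2

    # Exact binomial probability P(X <= 49).
    exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(50))

    # Normal approximation with continuity correction: P(X* <= 49.5).
    mu, sigma = n * p, sqrt(n * p * (1 - p))
    def phi(z):                      # standard normal CDF via the error function
        return 0.5 * (1 + erf(z / sqrt(2)))
    approx = phi((49.5 - mu) / sigma)

    print(round(exact, 4), round(approx, 4))   # approx ~ 0.468, close to the exact value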
Central Limit Theorem
Let x₁, x₂, ..., xₙ be independent and
identically distributed random variables,
each with mean μ and standard deviation σ,
and define sₙ = x₁ + x₂ + ... + xₙ.
As n becomes large (n → ∞), the distribution
of sₙ becomes normal with mean nμ and
variance nσ², regardless of the original
distribution of x₁, x₂, ..., xₙ.


... = 0.2266
From (1) and (2) we get z = −0.37.
(b) P(−z < Z < z) = 0.9298
or F(z) − F(−z) = 0.9298
or F(z) − (1 − F(z)) = 0.9298
or 2F(z) = 1.9298
or F(z) = P(Z ≤ z) = 0.9649
From the table, z = 1.81.
Joint random variables
Consider two continuous r.v.s x₁, a₁ ≤ x₁ ≤ b₁,
and x₂, a₂ ≤ x₂ ≤ b₂. Define f(x₁, x₂) as the joint
pdf of x₁ and x₂, and f₁(x₁) and f₂(x₂) as the
marginal pdfs of x₁ and x₂ respectively. Then
f(x₁, x₂) ≥ 0, a₁ ≤ x₁ ≤ b₁, a₂ ≤ x₂ ≤ b₂
∫_{a₁}^{b₁} ∫_{a₂}^{b₂} f(x₁, x₂) dx₁ dx₂ = 1
f₁(x₁) = ∫_{a₂}^{b₂} f(x₁, x₂) dx₂
f₂(x₂) = ∫_{a₁}^{b₁} f(x₁, x₂) dx₁
f(x₁, x₂) = f₁(x₁) f₂(x₂), if x₁ and x₂ are independent
E[c₁x₁ + c₂x₂] = c₁E[x₁] + c₂E[x₂]
Var[c₁x₁ + c₂x₂] = c₁²Var[x₁] + c₂²Var[x₂] + 2c₁c₂Cov{x₁, x₂}
Cov{x₁, x₂} = E[x₁x₂] − E[x₁]E[x₂]
Example: The joint pdf of x₁ and x₂, p(x₁, x₂), is

            x₂ = 3   x₂ = 5   x₂ = 7
  x₁ = 1      0.2       0       0.2
  x₁ = 2       0       0.2       0
  x₁ = 3      0.2       0       0.2

(a) Find the marginal pdfs p₁(x₁) and p₂(x₂).
(b) Are x₁ and x₂ independent?
(c) Compute E{x₁ + x₂}.
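A small Python sketch (not from the slides, and using the table as
reconstructed above) carries out all three parts:

    # Joint pmf: rows x1 = 1, 2, 3 and columns x2 = 3, 5, 7.
    p = {(1, 3): 0.2, (1, 5): 0.0, (1, 7): 0.2,
         (2, 3): 0.0, (2, 5): 0.2, (2, 7): 0.0,
         (3, 3): 0.2, (3, 5): 0.0, (3, 7): 0.2}

    x1_vals, x2_vals = (1, 2, 3), (3, 5, 7)
    p1 = {a: sum(p[(a, b)] for b in x2_vals) for a in x1_vals}   # marginal of x1
    p2 = {b: sum(p[(a, b)] for a in x1_vals) for b in x2_vals}   # marginal of x2

    # Independence requires p(x1, x2) = p1(x1) * p2(x2) for every cell.
    independent = all(abs(p[(a, b)] - p1[a] * p2[b]) < 1e-12
                      for a in x1_vals for b in x2_vals)

    e_sum = sum((a + b) * p[(a, b)] for a in x1_vals for b in x2_vals)
    print(p1, p2, independent, e_sum)   # marginals 0.4/0.2/0.4, not independent, E = 7.0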





Example: 12.3-3
A lot includes four defective (D) items and
six good (G) ones. You select one item
randomly and test it. Then, without
replacement, you test a second item. Let
the r.v.s x₁ and x₂ represent the outcomes
for the first and second item, respectively.
a) Determine the joint and marginal pdfs of x₁ and x₂.
b) Suppose that you get $ 5 for each good
item you select but pay $ 6 if it is defective.
Determine the mean of your revenue after
two items have been selected.
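Solution sketch (sampling without replacement from 4 D and 6 G items):
a) p(D, D) = (4/10)(3/9) = 2/15, p(D, G) = (4/10)(6/9) = 4/15,
   p(G, D) = (6/10)(4/9) = 4/15, p(G, G) = (6/10)(5/9) = 1/3;
   the marginals are P(x₁ = D) = P(x₂ = D) = 0.4 and P(x₁ = G) = P(x₂ = G) = 0.6.
b) Each selected item brings in 5(0.6) − 6(0.4) = 0.6 dollars on average,
   so the mean revenue after two items is 2(0.6) = $1.2.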
