
2.3 Counting Techniques
Product rule for ordered pairs - if the first element or object of an ordered pair can be selected in n1 ways, and for each such selection the second element can be selected in n2 ways, then the number of possible pairs is n1 * n2
2.1-2.2
Sample Space - set of all possible outcomes of an experiment
Event - simple or compound
Complement (A′) - everything not in A
Union (A ∪ B) - A or B (everything in A, in B, or in both)
Intersection (A ∩ B) - A and B (everything that's in both A and B)
Disjoint/Mutually exclusive - A ∩ B = ∅
Axiom 1: for any event A, P(A) ≥ 0
Axiom 2: P(S) = 1
Axiom 3: for disjoint events, P(A1 ∪ A2 ∪ A3 ∪ ...) = the sum of each probability
For any two events A and B, P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
For any three events A, B, and C, P(A ∪ B ∪ C) = P(A) + P(B) + P(C) - P(A ∩ B) - P(A ∩ C) - P(B ∩ C) + P(A ∩ B ∩ C)
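Under an equally-likely-outcomes model, the two-event addition rule can be checked directly with Python sets; the die-roll events below are illustrative choices, not from the notes:

```python
# Sample space: one roll of a fair six-sided die (equally likely outcomes)
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}        # event "roll is even"
B = {4, 5, 6}        # event "roll is at least 4"

def P(E):
    """Classical probability: |E| / |S| for equally likely outcomes."""
    return len(E) / len(S)

# Addition rule: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
lhs = P(A | B)                      # set union
rhs = P(A) + P(B) - P(A & B)        # set intersection
print(round(lhs, 4))  # 0.6667  (4 of the 6 outcomes are in A ∪ B)
```

Subtracting P(A ∩ B) corrects for the outcomes {4, 6} that would otherwise be counted twice.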

2.4 Conditional Probability
P(A | B) = P(A ∩ B) / P(B)
Multiplication Rule - P(A ∩ B) = P(A | B) * P(B)
Bayes' Theorem - P(A | B) = P(B | A) * P(A) / P(B)
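A quick numeric sketch of Bayes' theorem; all the rates below (sensitivity, false-positive rate, prevalence) are made-up illustrative values:

```python
# Bayes' theorem with illustrative numbers: a diagnostic test with
# 99% sensitivity, 5% false-positive rate, 1% prevalence (all assumed).
p_A = 0.01                 # P(disease)
p_B_given_A = 0.99         # P(positive | disease)
p_B_given_notA = 0.05      # P(positive | no disease)

# Denominator via the law of total probability: P(positive)
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_A_given_B = p_B_given_A * p_A / p_B
print(round(p_A_given_B, 4))  # 0.1667
```

Even with a very accurate test, a positive result here gives only about a 1-in-6 chance of disease, because the condition is rare.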
2.5 - Independence
Two events A and B are independent if P(A | B) = P(A) and dependent otherwise
Multiplication Rule for independent events - P(A ∩ B) = P(A) * P(B)
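Independence can be verified by brute-force enumeration; this sketch assumes two fair coin flips as the experiment:

```python
from itertools import product

# Sample space: two fair coin flips, all four outcomes equally likely
S = set(product("HT", repeat=2))
A = {s for s in S if s[0] == "H"}   # event "first flip is heads"
B = {s for s in S if s[1] == "H"}   # event "second flip is heads"

def P(E):
    """Classical probability for equally likely outcomes."""
    return len(E) / len(S)

# Multiplication rule for independent events: P(A ∩ B) = P(A) * P(B)
print(P(A & B) == P(A) * P(B))  # True
```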
3.3 Expected Values
Expected value - the mean, E(X) or μ
E(aX + b) = a*E(X) + b
Variance - E[(X - μ)^2], written σ^2; shortcut: σ^2 = E(X^2) - [E(X)]^2
Standard deviation - σ = √(σ^2)
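The mean, the variance shortcut, and linearity of expectation can all be checked on a small discrete PMF (the pmf values below are made up for illustration):

```python
from math import sqrt

# Illustrative discrete pmf (values chosen to sum to 1)
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

E = sum(x * p for x, p in pmf.items())        # mean E(X)
E2 = sum(x**2 * p for x, p in pmf.items())    # E(X^2)
var = E2 - E**2                               # shortcut: E(X^2) - [E(X)]^2
sd = sqrt(var)                                # standard deviation

# Linearity: E(3X + 2) should equal 3*E(X) + 2
E_lin = sum((3 * x + 2) * p for x, p in pmf.items())
print(round(E, 4), round(var, 4))  # 1.1 0.49
```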
3.1-3.2 Random Variables and Probability Distributions
Bernoulli RV - only possible values are 0 & 1
Discrete - possible values form a finite (or countably infinite) set
Continuous - possible values form an interval; probabilities come from the pdf/cdf
Probability mass function - defined for every possible value x of a discrete random variable
CDF - running sum of the PMF; its last value should be 1 (the PMF sums to 1)
CDF can be obtained from PMF and vice versa
3.4 Binomial Probability Distribution
Conditions: 1) fixed number of trials (n), 2) each trial may
result in one of two events (S/F), 3) trials are independent, 4)
probability of success is constant (p)
X = number of Ss among the trials
b(x; n, p) = C(n, x) * p^x * (1 - p)^(n - x), for x = 0, 1, 2, ..., n
o n = number of trials, x = total number of successes
X ~ Bin(n, p)
E(X) = np
V(X) = np(1-p) = npq
σ = √(np(1 - p)) = √(npq)
Replacement isn't always possible; if sample size n is at most 5% of the population size N, then the experiment can be treated as binomial
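The binomial PMF and the E(X) = np formula can be checked with Python's standard library; the parameters n = 10, p = 0.3 below are arbitrary illustrative choices:

```python
from math import comb

def binom_pmf(x, n, p):
    """b(x; n, p) = C(n, x) * p^x * (1-p)^(n-x)"""
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.3   # illustrative parameters

# The pmf should sum to 1 over x = 0..n, and the mean should equal np
total = sum(binom_pmf(x, n, p) for x in range(n + 1))
mean = sum(x * binom_pmf(x, n, p) for x in range(n + 1))
print(round(total, 10), round(mean, 10))  # 1.0 3.0
```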
3.6 Poisson Probability Distribution
p(x; μ) = e^(-μ) * μ^x / x!, for x = 0, 1, 2, ...
p(x; μ) is always greater than 0 because μ is always greater than 0
E(X) = V(X) = μ (μ = np when approximating a binomial); x = number of successes
Conditions: 1) length of time of observation period is fixed, 2) events occur at constant average rate, 3) events are
independent, 4) RV X has no upper limit
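The Poisson PMF and the fact that E(X) = μ can be verified numerically; the rate μ = 2 is an arbitrary illustrative choice (summing to x = 49 makes the neglected tail negligible):

```python
from math import exp, factorial

def poisson_pmf(x, mu):
    """p(x; mu) = e^(-mu) * mu^x / x!"""
    return exp(-mu) * mu**x / factorial(x)

mu = 2.0   # illustrative rate
probs = [poisson_pmf(x, mu) for x in range(50)]  # tail beyond 49 is negligible

mean = sum(x * p for x, p in enumerate(probs))
print(round(sum(probs), 6), round(mean, 6))  # 1.0 2.0
```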

∫ u*v dx = u * ∫ v dx - ∫ [u′ * (∫ v dx)] dx (integration by parts)
3.5 Hypergeometric & Negative Binomial Distributions
Hypergeometric Conditions: 1) consists of a finite population (N), 2) each individual can be S or F, and there
are M successes, 3) sample of n individuals is selected without replacement each subset of size n is equally
likely to be chosen
P(X = x) = h(x; n, M, N) = [C(M, x) * C(N - M, n - x)] / C(N, n)
o M = total number of successes, N = total population, n = subset of population, x = number of successes chosen
E(X) = n * (M / N)
V(X) = [(N - n) / (N - 1)] * n * (M/N) * (1 - M/N)
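The hypergeometric PMF and mean formula can be checked with `math.comb`; drawing n = 5 from a population of N = 20 with M = 8 successes is an illustrative assumption:

```python
from math import comb

def hypergeom_pmf(x, n, M, N):
    """h(x; n, M, N) = C(M, x) * C(N-M, n-x) / C(N, n)"""
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

# Illustrative: sample n=5 without replacement from N=20 items, M=8 successes
n, M, N = 5, 8, 20
mean = sum(x * hypergeom_pmf(x, n, M, N) for x in range(n + 1))
print(round(mean, 6))  # 2.0, matching E(X) = n * (M/N) = 5 * 8/20
```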
Negative Binomial Conditions: 1) consists of a sequence of independent trials, 2) each trial can result in S/F, 3)
p is constant, 4) experiment continues until total of r successes observed
o Number of trials is random
nb(x; r, p) - probability that x F's occur before the r-th success
o nb(x; r, p) = C(x + r - 1, r - 1) * p^r * (1 - p)^x, for x = 0, 1, 2, ...
o r = number of successes, x = number of failures
E(X) = r(1 - p) / p, V(X) = r(1 - p) / p^2
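The negative binomial PMF (counting failures before the r-th success) and its mean can be checked the same way; r = 3 and p = 0.4 are illustrative parameters:

```python
from math import comb

def nb_pmf(x, r, p):
    """nb(x; r, p) = C(x+r-1, r-1) * p^r * (1-p)^x, x = number of failures"""
    return comb(x + r - 1, r - 1) * p**r * (1 - p)**x

r, p = 3, 0.4   # illustrative parameters

# Truncate the infinite sum at x = 499; the remaining tail is negligible
mean = sum(x * nb_pmf(x, r, p) for x in range(500))
print(round(mean, 4))  # 4.5, matching E(X) = r(1-p)/p = 3 * 0.6 / 0.4
```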

4.1 Probability Density Functions (f(x))
For continuous variables
P(a ≤ X ≤ b) = ∫[a, b] f(x) dx
Total area: ∫[-∞, ∞] f(x) dx = 1
To be a legitimate pdf, f(x) must be ≥ 0 for all x
Uniform Distribution -
f(x; A, B) = 1 / (B - A) for A ≤ x ≤ B, and 0 otherwise
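The uniform pdf can be checked numerically with a Riemann sum: the total area should be 1 and the mean should land at the midpoint (A + B)/2. The endpoints A = 2, B = 5 are illustrative:

```python
# Numerical check of the uniform pdf on [A, B] (illustrative endpoints)
A, B = 2.0, 5.0
f = lambda x: 1 / (B - A) if A <= x <= B else 0.0

# Midpoint-rule approximations of total area and E(X)
n = 100_000
dx = (B - A) / n
xs = [A + (i + 0.5) * dx for i in range(n)]
area = sum(f(x) * dx for x in xs)
mean = sum(x * f(x) * dx for x in xs)
print(round(area, 6), round(mean, 6))  # 1.0 3.5
```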
4.2 Cumulative Distribution Functions (F(x)) &
Expected Values
CDF - integrate the pdf (the continuous analogue of summing the PMF)
o F(x) = P(X ≤ x) = ∫[-∞, x] f(y) dy
(100p)th percentile - denoted η(p), defined by p = F(η(p)) = ∫[-∞, η(p)] f(y) dy
Median - the 50th percentile, μ̃ = η(0.5); the point where F(μ̃) = 0.5
E(X) = ∫[-∞, ∞] x * f(x) dx
V(X) = ∫[-∞, ∞] (x - μ)^2 * f(x) dx = E[(X - μ)^2]
V(X) = E(X^2) - [E(X)]^2
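The continuous variance shortcut V(X) = E(X^2) - [E(X)]^2 can be checked numerically; the example below assumes a uniform pdf on [0, 1], whose variance is known to be 1/12:

```python
# Check V(X) = E(X^2) - [E(X)]^2 numerically for f(x) = 1 on [0, 1]
n = 100_000
dx = 1.0 / n
xs = [(i + 0.5) * dx for i in range(n)]   # midpoints; f(x) = 1 here

EX = sum(x * dx for x in xs)              # E(X)   -> 0.5
EX2 = sum(x * x * dx for x in xs)         # E(X^2) -> 1/3
var = EX2 - EX**2
print(round(var, 6))  # 0.083333, i.e. 1/12
```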
