
Mark's Formula Sheet for Exam P

Discrete Distributions
Uniform, U(m)
PMF: $f(x) = \frac{1}{m}$, for $x = 1, 2, \ldots, m$

$\mu = \frac{m+1}{2}$ and $\sigma^2 = \frac{m^2 - 1}{12}$
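As a quick sanity check of these formulas, here is a minimal sketch using only the Python standard library (the choice $m = 6$, a fair die, is illustrative):

```python
# Discrete uniform U(m): verify mu = (m+1)/2 and sigma^2 = (m^2-1)/12 by direct summation
m = 6
pmf = {x: 1 / m for x in range(1, m + 1)}

mu = sum(x * p for x, p in pmf.items())               # E[X]
var = sum((x - mu) ** 2 * p for x, p in pmf.items())  # E[(X - mu)^2]

assert abs(mu - (m + 1) / 2) < 1e-12
assert abs(var - (m ** 2 - 1) / 12) < 1e-12
```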

Hypergeometric
PMF: $f(x) = \dfrac{\binom{N_1}{x}\binom{N_2}{n-x}}{\binom{N}{n}}$

$x$ is the number of items from the sample of $n$ items that are from group/type 1.

$\mu = n\left(\frac{N_1}{N}\right)$ and $\sigma^2 = n\left(\frac{N_1}{N}\right)\left(\frac{N_2}{N}\right)\left(\frac{N-n}{N-1}\right)$
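The PMF, mean, and variance above can be checked numerically with `math.comb` (a sketch; the values $N_1 = 5$, $N_2 = 7$, $n = 4$ are illustrative):

```python
from math import comb

# Hypergeometric: N1 items of type 1, N2 of type 2, sample of n without replacement
N1, N2, n = 5, 7, 4
N = N1 + N2

def pmf(x):
    return comb(N1, x) * comb(N2, n - x) / comb(N, n)

support = range(max(0, n - N2), min(n, N1) + 1)
total = sum(pmf(x) for x in support)                 # should be 1
mu = sum(x * pmf(x) for x in support)                # should be n * N1 / N
var = sum((x - mu) ** 2 * pmf(x) for x in support)   # n (N1/N)(N2/N)((N-n)/(N-1))

assert abs(total - 1) < 1e-12
assert abs(mu - n * N1 / N) < 1e-12
assert abs(var - n * (N1 / N) * (N2 / N) * ((N - n) / (N - 1))) < 1e-12
```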
Binomial, b(n, p)
 
PMF: $f(x) = \binom{n}{x} p^x (1-p)^{n-x}$, for $x = 0, 1, \ldots, n$

$x$ is the number of successes in $n$ trials.

$\mu = np$ and $\sigma^2 = np(1-p) = npq$

MGF: $M(t) = [(1-p) + pe^t]^n = (q + pe^t)^n$
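A quick numerical check of the binomial MGF identity, comparing $(q + pe^t)^n$ against the direct expectation $E[e^{tX}]$ (illustrative parameters, standard library only):

```python
from math import comb, exp

# Binomial b(n, p): compare the MGF formula (q + p e^t)^n with E[e^{tX}]
n, p = 10, 0.3
q = 1 - p

def pmf(x):
    return comb(n, x) * p ** x * q ** (n - x)

t = 0.7
mgf_direct = sum(exp(t * x) * pmf(x) for x in range(n + 1))
mgf_formula = (q + p * exp(t)) ** n

assert abs(sum(pmf(x) for x in range(n + 1)) - 1) < 1e-12
assert abs(mgf_direct - mgf_formula) < 1e-9
```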
Negative Binomial, nb(r, p)


PMF: $f(x) = \binom{x-1}{r-1} p^r (1-p)^{x-r}$, for $x = r, r+1, r+2, \ldots$

$x$ is the number of trials necessary to see $r$ successes.

$\mu = r\left(\frac{1}{p}\right) = \frac{r}{p}$ and $\sigma^2 = \frac{r(1-p)}{p^2} = \frac{rq}{p^2}$

MGF: $M(t) = \dfrac{(pe^t)^r}{[1 - (1-p)e^t]^r} = \left(\dfrac{pe^t}{1 - qe^t}\right)^r$
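Because the support is infinite, a truncated sum suffices to check the mean and variance formulas numerically (a sketch with illustrative $r = 3$, $p = 0.4$; the tail beyond $x = 200$ is negligible):

```python
from math import comb

# Negative binomial nb(r, p): x = trial on which the r-th success occurs
r, p = 3, 0.4
q = 1 - p

def pmf(x):
    return comb(x - 1, r - 1) * p ** r * q ** (x - r)

xs = range(r, 200)                  # truncated support
total = sum(pmf(x) for x in xs)
mu = sum(x * pmf(x) for x in xs)
var = sum(x * x * pmf(x) for x in xs) - mu ** 2

assert abs(total - 1) < 1e-9
assert abs(mu - r / p) < 1e-6       # mu = r/p
assert abs(var - r * q / p ** 2) < 1e-6  # sigma^2 = rq/p^2
```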
Geometric, geo(p)
PMF: $f(x) = (1-p)^{x-1} p$, for $x = 1, 2, \ldots$

$x$ is the number of trials necessary to see 1 success.

CDF: $P(X \le k) = 1 - (1-p)^k = 1 - q^k$ and $P(X > k) = (1-p)^k = q^k$

$\mu = \frac{1}{p}$ and $\sigma^2 = \frac{1-p}{p^2} = \frac{q}{p^2}$

MGF: $M(t) = \dfrac{pe^t}{1 - (1-p)e^t} = \dfrac{pe^t}{1 - qe^t}$

The distribution is said to be memoryless, because $P(X > k + j \mid X > k) = P(X > j)$.
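The memoryless property follows directly from $P(X > k) = q^k$, and can be checked in a couple of lines (the values of $p$, $k$, $j$ are illustrative):

```python
# Geometric: memorylessness P(X > k + j | X > k) = P(X > j), using P(X > k) = q^k
p = 0.25
q = 1 - p

def tail(k):                  # P(X > k)
    return q ** k

k, j = 4, 7
lhs = tail(k + j) / tail(k)   # conditional probability P(X > k+j | X > k)
rhs = tail(j)

assert abs(lhs - rhs) < 1e-12
```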

Poisson
PMF: $f(x) = \dfrac{\lambda^x e^{-\lambda}}{x!}$, for $x = 0, 1, 2, \ldots$

$x$ is the number of changes in a unit of time or length.

$\lambda$ is the average number of changes in a unit of time or length in a Poisson process.

CDF: $P(X \le x) = e^{-\lambda}\left(1 + \lambda + \frac{\lambda^2}{2!} + \cdots + \frac{\lambda^x}{x!}\right)$

$\mu = \sigma^2 = \lambda$

MGF: $M(t) = e^{\lambda(e^t - 1)}$
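A truncated-sum check of $\mu = \sigma^2 = \lambda$ and of the MGF (illustrative $\lambda = 2.5$; the tail beyond $k = 100$ is negligible):

```python
from math import exp, factorial

# Poisson: verify mu = sigma^2 = lam and the MGF e^{lam(e^t - 1)} by truncated summation
lam = 2.5
ks = range(100)

def pmf(k):
    return lam ** k * exp(-lam) / factorial(k)

mu = sum(k * pmf(k) for k in ks)
var = sum(k * k * pmf(k) for k in ks) - mu ** 2
t = 0.5
mgf_direct = sum(exp(t * k) * pmf(k) for k in ks)   # E[e^{tX}]

assert abs(mu - lam) < 1e-9
assert abs(var - lam) < 1e-9
assert abs(mgf_direct - exp(lam * (exp(t) - 1))) < 1e-9
```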

Continuous Distributions
Uniform, U(a, b)
PDF: $f(x) = \frac{1}{b-a}$, for $a \le x \le b$

CDF: $P(X \le x) = \frac{x-a}{b-a}$, for $a \le x \le b$

$\mu = \frac{a+b}{2}$ and $\sigma^2 = \frac{(b-a)^2}{12}$

MGF: $M(t) = \dfrac{e^{tb} - e^{ta}}{t(b-a)}$, for $t \ne 0$, and $M(0) = 1$

Exponential
PDF: $f(x) = \frac{1}{\theta} e^{-x/\theta}$, for $x \ge 0$

$x$ is the waiting time we are experiencing to see one change occur.

$\theta$ is the average waiting time between changes in a Poisson process. (Its reciprocal $\lambda = 1/\theta$ is sometimes called the hazard rate.)

CDF: $P(X \le x) = 1 - e^{-x/\theta}$, for $x \ge 0$.

$\mu = \theta$ and $\sigma^2 = \theta^2$

MGF: $M(t) = \dfrac{1}{1 - \theta t}$

The distribution is said to be memoryless, because $P(X \ge x_1 + x_2 \mid X \ge x_1) = P(X \ge x_2)$.
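The memoryless property can be verified from the survival function $P(X \ge x) = e^{-x/\theta}$ (the values of $\theta$, $x_1$, $x_2$ are illustrative):

```python
from math import exp

# Exponential with mean theta: survival P(X >= x) = e^{-x/theta}
theta = 2.0

def surv(x):
    return exp(-x / theta)

x1, x2 = 1.5, 3.0
lhs = surv(x1 + x2) / surv(x1)   # P(X >= x1 + x2 | X >= x1)
rhs = surv(x2)

assert abs(lhs - rhs) < 1e-12
```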
Gamma
PDF: $f(x) = \dfrac{1}{\Gamma(\alpha)\,\theta^\alpha}\, x^{\alpha-1} e^{-x/\theta} = \dfrac{1}{(\alpha-1)!\,\theta^\alpha}\, x^{\alpha-1} e^{-x/\theta}$, for $x \ge 0$

$x$ is the waiting time we are experiencing to see $\alpha$ changes.

$\theta$ is the average waiting time between changes in a Poisson process and $\alpha$ is the number of changes that we are waiting to see.

$\mu = \alpha\theta$ and $\sigma^2 = \alpha\theta^2$

MGF: $M(t) = \dfrac{1}{(1 - \theta t)^\alpha}$
Chi-square (Gamma with $\theta = 2$ and $\alpha = \frac{r}{2}$)
PDF: $f(x) = \dfrac{1}{\Gamma(r/2)\,2^{r/2}}\, x^{r/2-1} e^{-x/2}$, for $x \ge 0$

$\mu = r$ and $\sigma^2 = 2r$

MGF: $M(t) = \dfrac{1}{(1-2t)^{r/2}}$

Normal, $N(\mu, \sigma^2)$
PDF: $f(x) = \dfrac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/2\sigma^2}$

MGF: $M(t) = e^{\mu t + \sigma^2 t^2/2}$

Integration formulas
$\displaystyle\int p(x)e^{ax}\,dx = \frac{1}{a}p(x)e^{ax} - \frac{1}{a^2}p'(x)e^{ax} + \frac{1}{a^3}p''(x)e^{ax} - \cdots$

$\displaystyle\int_a^\infty x \cdot \frac{1}{\theta}\, e^{-x/\theta}\,dx = (a+\theta)e^{-a/\theta}$

$\displaystyle\int_a^\infty x^2 \cdot \frac{1}{\theta}\, e^{-x/\theta}\,dx = \left((a+\theta)^2 + \theta^2\right)e^{-a/\theta}$
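These closed forms can be confirmed by simple numerical quadrature; the sketch below checks the first-moment integral with the midpoint rule (illustrative $\theta$ and $a$):

```python
from math import exp

# Check  integral from a to inf of x (1/theta) e^{-x/theta} dx = (a+theta) e^{-a/theta}
theta, a = 2.0, 1.0

def f(x):
    return x * (1 / theta) * exp(-x / theta)

# Midpoint rule on [a, a + 60*theta]; the tail beyond is negligible
n = 200_000
upper = a + 60 * theta
h = (upper - a) / n
integral = h * sum(f(a + (i + 0.5) * h) for i in range(n))

closed_form = (a + theta) * exp(-a / theta)
assert abs(integral - closed_form) < 1e-5
```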
Other Useful Facts
$\sigma^2 = E[(X-\mu)^2] = E[X^2] - \mu^2 = M''(0) - M'(0)^2$

$\mathrm{Cov}(X, Y) = E[(X-\mu_x)(Y-\mu_y)] = E[XY] - \mu_x\mu_y$

$\mathrm{Cov}(X, Y) = \sigma_{xy} = \rho\,\sigma_x\sigma_y$ and $\rho = \dfrac{\sigma_{xy}}{\sigma_x\sigma_y}$

Least squares regression line: $y = \mu_y + \rho\,\dfrac{\sigma_y}{\sigma_x}(x - \mu_x)$

When variables $X_1, X_2, \ldots, X_n$ are not pairwise independent, then

$\mathrm{Var}\left(\displaystyle\sum_{i=1}^n X_i\right) = \sum_{i=1}^n \sigma_i^2 + 2\sum_{i<j} \sigma_{ij}$

and

$\mathrm{Var}\left(\displaystyle\sum_{i=1}^n a_i X_i\right) = \sum_{i=1}^n a_i^2\sigma_i^2 + 2\sum_{i<j} a_i a_j \sigma_{ij}$

where $\sigma_{ij}$ is the covariance of $X_i$ and $X_j$.
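The variance-of-a-sum formula can be checked on a small joint PMF, where every expectation is a finite sum (the joint table below is a made-up example):

```python
# Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y), checked on a small joint PMF
joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.4}  # f(x, y)

def E(g):
    return sum(g(x, y) * p for (x, y), p in joint.items())

mx, my = E(lambda x, y: x), E(lambda x, y: y)
vx = E(lambda x, y: (x - mx) ** 2)
vy = E(lambda x, y: (y - my) ** 2)
cov = E(lambda x, y: (x - mx) * (y - my))

ms = E(lambda x, y: x + y)
vs = E(lambda x, y: (x + y - ms) ** 2)   # Var(X + Y) computed directly

assert abs(vs - (vx + vy + 2 * cov)) < 1e-12
```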


When $X$ depends upon $Y$, $E[X] = E[E[X \mid Y]]$.
When $X$ depends upon $Y$, $\mathrm{Var}(X) = E[\mathrm{Var}(X \mid Y)] + \mathrm{Var}(E[X \mid Y])$. (Called the Total Variance of $X$.)
Chebyshev's Inequality: For a random variable $X$ having any distribution with finite mean $\mu$ and variance $\sigma^2$, $P(|X - \mu| \ge k\sigma) \le \frac{1}{k^2}$.
For the variables $X$ and $Y$ having the joint PMF/PDF $f(x, y)$, the moment generating function for this distribution is

$M(t_1, t_2) = E[e^{t_1 X + t_2 Y}] = E[e^{t_1 X} e^{t_2 Y}] = \displaystyle\sum_x \sum_y e^{t_1 x} e^{t_2 y} f(x, y)$

$\mu_x = M_{t_1}(0, 0)$ and $\mu_y = M_{t_2}(0, 0)$ (These are the first partial derivatives.)
$E[X^2] = M_{t_1 t_1}(0, 0)$ and $E[Y^2] = M_{t_2 t_2}(0, 0)$ (These are the pure second partial derivatives.)
$E[XY] = M_{t_1 t_2}(0, 0) = M_{t_2 t_1}(0, 0)$ (These are the mixed second partial derivatives.)
Central Limit Theorem: As the sample size $n$ grows,

the distribution of $\displaystyle\sum_{i=1}^n X_i$ becomes approximately normal with mean $n\mu$ and variance $n\sigma^2$

the distribution of $\bar{X} = \dfrac{1}{n}\displaystyle\sum_{i=1}^n X_i$ becomes approximately normal with mean $\mu$ and variance $\dfrac{\sigma^2}{n}$.
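A small simulation illustrates the CLT for Uniform(0, 1), where $\mu = 1/2$ and $\sigma^2 = 1/12$ (a sketch; the seed and sample sizes are arbitrary):

```python
import random
import statistics

# CLT sketch: sample means of Uniform(0,1) (mu = 1/2, sigma^2 = 1/12) cluster
# around mu with variance roughly sigma^2 / n
random.seed(42)
n, reps = 50, 2000
means = [statistics.fmean(random.random() for _ in range(n)) for _ in range(reps)]

avg = statistics.fmean(means)        # should be near mu = 0.5
spread = statistics.pvariance(means) # should be near (1/12)/n

assert abs(avg - 0.5) < 0.01
assert abs(spread - (1 / 12) / n) < 0.001
```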

If $X$ and $Y$ are jointly distributed with PMF $f(x, y)$, then

the marginal distribution of $X$ is given by $f_x(x) = \displaystyle\sum_y f(x, y)$

the marginal distribution of $Y$ is given by $f_y(y) = \displaystyle\sum_x f(x, y)$

$f(x \mid y = y_0) = \dfrac{f(x, y_0)}{f_y(y_0)}$

$E[X \mid Y = y_0] = \displaystyle\sum_x x f(x \mid y = y_0) = \dfrac{\sum_x x f(x, y_0)}{f_y(y_0)}$
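The marginal and conditional formulas can likewise be checked on a small joint PMF (the table of probabilities is a made-up example):

```python
# Marginal and conditional distributions from a small joint PMF f(x, y)
joint = {(1, 0): 0.1, (2, 0): 0.3, (1, 1): 0.2, (2, 1): 0.4}

xs = {x for x, _ in joint}
ys = {y for _, y in joint}

def f_y(y):                      # marginal of Y: sum over x of f(x, y)
    return sum(joint[(x, y)] for x in xs)

def f_cond(x, y0):               # f(x | y = y0) = f(x, y0) / f_y(y0)
    return joint[(x, y0)] / f_y(y0)

y0 = 1
cond_mean = sum(x * f_cond(x, y0) for x in xs)          # E[X | Y = y0]
direct = sum(x * joint[(x, y0)] for x in xs) / f_y(y0)  # same, via the ratio formula

assert abs(sum(f_cond(x, y0) for x in xs) - 1) < 1e-12  # conditional PMF sums to 1
assert abs(cond_mean - direct) < 1e-12
```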
