
Slide 1 of 4

Continuous distributions
Continuous distributions are defined on a range: 0 \le x \le 1, 0 \le x, …
They can be univariate or multivariate
▪ Definitions
Prob[a < X < b] = \int_a^b f(x) dx        f(x) is the probability density
Prob[X < b] = \int_{-\infty}^{b} f(x) dx = F(b)        F(b) is the cumulative probability distribution
Hopefully, \int_{-\infty}^{\infty} f(x) dx = F(\infty) = 1

[Plot omitted: example distribution, vertical axis 0 to 1, x from 0 to 8.]
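A minimal numerical check of these definitions (a sketch, assuming an exponential density e^{-x} on {0, ∞}; the names fdens and cum are illustrative, not from the slide):

fdens[x_] := Exp[-x]
cum[b_] = Integrate[fdens[x], {x, 0, b}, Assumptions -> b > 0]   (* F(b) = 1 - e^-b *)
Integrate[fdens[x], {x, 0, Infinity}]                            (* total probability, equals 1 *)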

▪ A continuous probability density may have discrete components:

[Plot omitted: a density with a discrete component, x from 0 to 8.]
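A sampling sketch of such a mixed distribution (the point mass of weight 0.3 at x = 2 and the Exponential[1/2] continuous part are illustrative choices, not taken from the slide):

mixedSample := If[RandomReal[] < 0.3, 2., RandomReal[ExponentialDistribution[1/2]]]
BarChart[BinCounts[Table[mixedSample, {10000}], {0, 8, 0.25}]]   (* a spike at x = 2 on top of a smooth part *)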

▪ Transforming the scale
Prob[x < X < x + dx] ~ f(x) dx
Prob[y < Y < y + dy] ~ g(y) dy = f(x) dx
So, g(y) = f(x) \frac{dx}{dy}
e.g. f(x) = 1 for 0 < x < 1, and y = x^2  \Rightarrow  g(y) = 1 \times \frac{d\sqrt{y}}{dy} = \frac{1}{2\sqrt{y}}


[Plot omitted: the transformed density g(y).]
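The change of variables can be checked by simulation (a sketch): square 10,000 uniform values and the histogram piles up near 0 like 1/(2\sqrt{y}).

ys = RandomReal[UniformDistribution[{0, 1}], 10000]^2;   (* y = x^2 with x uniform on {0, 1} *)
BarChart[BinCounts[ys, {0, 1, 0.05}]]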

▪ A distribution is described by its moments:
mean \bar{x} = \int_{-\infty}^{\infty} x f(x) dx        … k'th moment M_k = \int_{-\infty}^{\infty} x^k f(x) dx = E[x^k]
variance v = \int_{-\infty}^{\infty} (x - \bar{x})^2 f(x) dx        … k'th central moment \int_{-\infty}^{\infty} (x - \bar{x})^k f(x) dx = E[(x - \bar{x})^k]
moments can be infinite: e.g. a Cauchy distribution f(x) = \frac{1}{\pi (1 + x^2)}

Sort[RandomReal[CauchyDistribution[0, 1], 100]]
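A sketch of what infinite moments mean in practice: the sample mean of Cauchy draws never settles down as the sample grows.

Table[Mean[RandomReal[CauchyDistribution[0, 1], 10^k]], {k, 2, 5}]   (* keeps jumping around; no convergence *)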


Slide 2 of 4

Some examples
▪ Uniform distribution
range {a, b}        density f(x) = \frac{1}{b - a}

Sort[RandomReal[UniformDistribution[{0, 1}], 100]]
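A quick numerical check (for {0, 1} the exact values are mean (a+b)/2 = 1/2 and variance (b-a)^2/12 = 1/12; these facts are not stated on the slide):

unif = RandomReal[UniformDistribution[{0, 1}], 10000];
{Mean[unif], Variance[unif]}   (* close to 1/2 and 1/12 *)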

▪ Exponential distribution
range {0, \infty}        density f(x) = \lambda e^{-\lambda x}   where \lambda is the rate
Mean: \bar{x} = E[x] = \frac{1}{\lambda}        Variance: E[(x - \frac{1}{\lambda})^2] = \frac{1}{\lambda^2}

An exponential is the distribution of the smallest of very many uniformly distributed values
tab = Sort[Table[Min[RandomReal[UniformDistribution[{0, 100}], 100]], {10000}]];

{Mean[tab], Variance[tab]}

BarChart[BinCounts[tab, {0, 5, 0.1}]]

[Bar chart omitted: histogram of tab, falling off roughly exponentially.]

2.5% chance that x < 0.025 \bar{x}; 2.5% chance that x > 3.7 \bar{x}
If events occur at exponentially distributed intervals at rate \lambda, the number of events in time T is Poisson, with expectation \lambda T
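A sketch of this exponential/Poisson connection (the rate λ = 2 and window T = 5 are illustrative values, not from the slide): accumulate exponential gaps until time T and count the completed events.

counts = Table[Length[NestWhileList[# + RandomReal[ExponentialDistribution[2]] &, 0, # < 5 &]] - 2, {5000}];
{N[Mean[counts]], 2*5}   (* the mean count should be close to λT = 10 *)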
▪ Gamma distribution
What is the distribution of the sum of two exponentials?
f_2(x) = \int_0^x f_1(y) f_1(x - y) dy = \int_0^x e^{-y} e^{-(x - y)} dy = x e^{-x}
This is a convolution: f_2 = f_1 * f_1
The sum of k exponentials has a distribution:
f_k(x) = f_1 * f_1 * … = \frac{x^{k-1}}{(k-1)!} e^{-x} or \frac{x^{k-1}}{\Gamma(k)} e^{-x}, where \Gamma(k) is the gamma function


[Plot omitted: Gamma densities f_k(x), x from 0 to 20.]
More generally, f_k(x) = \frac{\lambda^k x^{k-1}}{\Gamma(k)} e^{-\lambda x}.   The mean is \frac{k}{\lambda} and the variance is \frac{k}{\lambda^2};   \frac{variance}{mean^2} = \frac{1}{k}
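A sketch checking the mean and variance: sum k = 5 exponentials of rate λ = 1 (illustrative values) and compare with k/λ and k/λ². The same sums could also be compared against Mathematica's built-in GammaDistribution[k, 1/λ], whose second argument is a scale rather than a rate.

sums = Total[RandomReal[ExponentialDistribution[1], {10000, 5}], {2}];   (* 10 000 sums of 5 exponentials *)
{Mean[sums], Variance[sums]}                                             (* both should be close to 5 *)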

▪ Normal distribution
The sum of many independent random variables follows a normal distribution:
f(x) = \frac{1}{\sqrt{2\pi V}} \exp\left(-\frac{(x - \bar{x})^2}{2V}\right)

[Plots omitted: the normal density and its cumulative distribution, x from -4 to 4.]

A linear combination of normal variables also follows a normal distribution
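A sketch of the linear-combination property: for 2X - 3Y with X ~ Normal(1, 1) and Y ~ Normal(0, 4) (parameters chosen arbitrarily), the result should again be normal, with mean 2·1 - 3·0 = 2 and variance 4·1 + 9·4 = 40.

xs = RandomReal[NormalDistribution[1, 1], 10000];
ys = RandomReal[NormalDistribution[0, 2], 10000];   (* second argument is the standard deviation *)
{Mean[2 xs - 3 ys], Variance[2 xs - 3 ys]}          (* close to 2 and 40 *)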


Slide 3 of 4

Moment generating functions


The generating function of a discrete distribution is E[z^j]
The moment-generating function of a continuous distribution is \tilde{f}(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f(x) dx
▪ Differentiating \tilde{f} around zero gives the moments:
\tilde{f}(0) = 1
\tilde{f}'(0) = E[X]        \tilde{f}''(0) = E[X^2]        …        \tilde{f}^{(k)}(0) = E[X^k]
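A sketch using Mathematica's MomentGeneratingFunction (available in version 8 and later, so possibly newer than the version these slides were made with): the derivatives of the exponential's MGF at t = 0 give E[X] = 1/λ and E[X²] = 2/λ².

mgf[t_] = MomentGeneratingFunction[ExponentialDistribution[lambda], t];
{mgf[0], D[mgf[t], t] /. t -> 0, D[mgf[t], {t, 2}] /. t -> 0}   (* 1, 1/lambda, 2/lambda^2 *)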
▪ The MGF for the convolution of two distributions is the product of their MGFs:
(f * g)(x) = \int_{-\infty}^{\infty} f(y) g(x - y) dy
\widetilde{f * g}(t) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{tx} f(y) g(x - y) dy dx = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{ty} e^{t(x - y)} f(y) g(x - y) dy dx
= \left(\int_{-\infty}^{\infty} e^{ty} f(y) dy\right) \left(\int_{-\infty}^{\infty} e^{tz} g(z) dz\right) = \tilde{f} \tilde{g}
▪ Examples
Exponential:   \lambda e^{-\lambda x}  →  \frac{\lambda}{\lambda - t}
Gamma:   \frac{\lambda^k x^{k-1}}{\Gamma(k)} e^{-\lambda x}  →  \left(\frac{\lambda}{\lambda - t}\right)^k
Normal:   \frac{1}{\sqrt{2\pi V}} e^{-(x - \bar{x})^2 / 2V}  →  e^{t \bar{x}} e^{t^2 V / 2}
2 ΠV


Slide 4 of 4

Central limit theorem


“I know of scarcely anything so apt to impress the imagination as the wonderful form of cosmic order expressed
by the “Law of Frequency of Error”. The law would have been personified by the Greeks and deified, if they had
known of it. It reigns with serenity and in complete self-effacement, amidst the wildest confusion. The huger the
mob, and the greater the apparent anarchy, the more perfect is its sway. It is the supreme law of Unreason. When-
ever a large sample of chaotic elements are taken in hand and marshaled in the order of their magnitude, an
unsuspected and most beautiful form of regularity proves to have been latent all along.” Galton, 1889
The sum of a large number of independent random variables, with finite mean and variance, tends to a normal
distribution
BarChart[BinCounts[
  Total[RandomReal[UniformDistribution[{0, 1}], {10, 10000}]], {0, 10, 0.05}]]

[Bar chart omitted: histogram of the 10,000 sums of 10 uniform values, peaked near 5.]

▪ The MGF of the sum of many independent variables is the product of their MGFs:
\tilde{f}_1(t) \tilde{f}_2(t) \tilde{f}_3(t) …
Near t = 0, \tilde{f}_i \approx \exp(t \bar{x}_i + V_i t^2/2 + …), so overall we have \exp(t \sum_i \bar{x}_i + \sum_i V_i t^2/2 + …), which is just the MGF of a
normal distribution with mean \sum_i \bar{x}_i and variance \sum_i V_i
For example, consider the sum of 10 exponentially distributed variables, each with mean 1. These follow a Gamma distribution. The figure shows the exponential e^{-x} (black), the distribution of the sum of 10 exponentials (blue), and the corresponding normal (red), with the same mean and variance.

[Plot omitted: the three densities described above, x from 0 to 20.]
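A sketch reproducing a plot like this one: the exponential density, the Gamma(10) density of the sum, and the normal with the same mean 10 and variance 10.

Plot[{Exp[-x], PDF[GammaDistribution[10, 1], x], PDF[NormalDistribution[10, Sqrt[10]], x]},
 {x, 0, 20}, PlotStyle -> {Black, Blue, Red}, PlotRange -> All]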


This shows the corresponding generating functions. The GF for the normal and the gamma are actually indistinguishable: the mean and variance are the same, so they must have the same slope and second derivative at t = 0. The actual distributions are not so close, though, not as close as for the sum of uniform distributions, say. That is because the exponential is more widely spread than the uniform; in fact, the exponential has the fattest possible tails consistent with finite moments.

[Plot omitted: the corresponding generating functions, t from -0.2 to 0.2.]
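A sketch of the generating-function comparison (the scaling of the original figure's axes is not recoverable from the export, so this simply plots the three MGFs: 1/(1-t) for the exponential, (1-t)^-10 for the Gamma(10), and exp(10t + 5t^2) for the matching normal):

Plot[{1/(1 - t), (1 - t)^-10, Exp[10 t + 5 t^2]}, {t, -0.2, 0.2},
 PlotStyle -> {Black, Blue, Red}]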
