Bayesian experiment
In this experiment p + q = N. With o uniformly distributed and, once o is determined, M having a binomial conditional distribution (o being the probability of success on a single trial), the model is

P(M = p \mid o) = \binom{N}{p} o^p (1 - o)^{N - p}, \qquad p = 0, \ldots, N
Probability of heads: 1/3; probability of tails: 2/3. With N = 3 throws:

P(M = 1) = \frac{3!}{1!\,2!} \cdot \frac{1}{3} \cdot \left(\frac{2}{3}\right)^2 \approx 0.444

P(M = 0) = \frac{3!}{0!\,3!} \cdot \left(\frac{2}{3}\right)^3 \approx 0.296
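The two probabilities above can be reproduced with standard-library Python; this is a minimal sketch, and the helper name binomial_pmf is my own:

```python
from math import comb

def binomial_pmf(p, N, o):
    # P(M = p | o) = C(N, p) * o^p * (1 - o)^(N - p)
    return comb(N, p) * o**p * (1 - o)**(N - p)

# Coin with probability of heads 1/3, thrown N = 3 times.
print(round(binomial_pmf(1, 3, 1/3), 3))  # 0.444: exactly one head
print(round(binomial_pmf(0, 3, 1/3), 3))  # 0.296: no heads
```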
[Figures: probability mass functions of binomial(0.3, 3), binomial(0.3, 10), binomial(0.3, 100), binomial(0.5, 10), and binomial(0.5, 100)]
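The qualitative behavior in these figures, with the mass peaking near o \cdot N and the peak flattening as N grows, can be checked numerically. A small standard-library sketch (the helper name binomial_pmf is mine):

```python
from math import comb

def binomial_pmf(k, N, o):
    # P(M = k | o) = C(N, k) * o^k * (1 - o)^(N - k)
    return comb(N, k) * o**k * (1 - o)**(N - k)

# For o = 0.3 the most probable count sits near 0.3 * N, and its
# probability shrinks as the mass spreads over more values of k.
for N in (3, 10, 100):
    mode = max(range(N + 1), key=lambda k: binomial_pmf(k, N, 0.3))
    print(N, mode, round(binomial_pmf(mode, N, 0.3), 3))
```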
Questions to answer
Lemma 1
The probability that the point o will fall between any two points f and b in the line AB is the ratio of the distance between the two points to the whole line:

P(f < o < b) = \frac{fb}{AB}
Lemma 2
The ball W having been thrown, and the line os drawn, the probability of the event M in a single trial is the ratio of Ao to AB:

P(M = 1) = \frac{Ao}{AB}
Proposition 8
P\big((f < o < b) \cap (M = p)\big) = \int_f^b \binom{N}{p} o^p (1 - o)^{N - p} \, do

\int_0^1 \binom{N}{p} o^p (1 - o)^{N - p} \, do = \frac{1}{N + 1}
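The second identity, that the integral over the whole interval equals 1/(N+1) regardless of p, can be verified numerically. A sketch using a simple midpoint rule (function names are my own):

```python
from math import comb

def integrand(o, N, p):
    # The joint density from Proposition 8: C(N, p) * o^p * (1 - o)^(N - p)
    return comb(N, p) * o**p * (1 - o)**(N - p)

def integral01(N, p, steps=100_000):
    # Midpoint-rule approximation of the integral over [0, 1].
    h = 1.0 / steps
    return h * sum(integrand((i + 0.5) * h, N, p) for i in range(steps))

N = 8
for p in range(N + 1):
    print(p, round(integral01(N, p), 6))  # each value is 1/(N+1) = 1/9
```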
Marginalization of probability

P(\text{test}) = P(\text{test} \cap \text{cancer}) + P(\text{test} \cap \overline{\text{cancer}})

P(\text{test} \cap \text{cancer}) = P(\text{test} \mid \text{cancer}) \, P(\text{cancer}) = 0.8 \times 0.01 = 0.008

P(\text{test} \cap \overline{\text{cancer}}) = P(\text{test} \mid \overline{\text{cancer}}) \, P(\overline{\text{cancer}}) = 0.1 \times 0.99 = 0.099

P(\text{test}) = 0.008 + 0.099 = 0.107

P(\text{cancer} \mid \text{test}) = \frac{P(\text{cancer}) \, P(\text{test} \mid \text{cancer})}{P(\text{cancer}) \, P(\text{test} \mid \text{cancer}) + P(\overline{\text{cancer}}) \, P(\text{test} \mid \overline{\text{cancer}})} = \frac{0.01 \times 0.8}{0.01 \times 0.8 + 0.99 \times 0.1} = \frac{0.008}{0.107} \approx 0.075
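The arithmetic of this example translates directly into code; the variable names here are my own:

```python
# Numbers from the slide: prevalence, sensitivity, false-positive rate.
p_cancer = 0.01
p_test_given_cancer = 0.80
p_test_given_no_cancer = 0.10

# Marginalize over the two ways a positive test can occur.
p_test = (p_cancer * p_test_given_cancer
          + (1 - p_cancer) * p_test_given_no_cancer)

# Bayes's rule for the posterior probability of cancer given a positive test.
p_cancer_given_test = p_cancer * p_test_given_cancer / p_test

print(round(p_test, 3))               # 0.107
print(round(p_cancer_given_test, 3))  # 0.075
```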
Physicians say...
The article [3] discusses some of the reactions of the
physicians to even considering such problems. Here
are some quotes:
On such a basis one can't make a diagnosis.
Statistical information is one big lie.
I never inform my patients about statistical data.
I would tell the patient that mammography is not
so exact, and I would in any case perform a
biopsy.
Oh, what nonsense. I can't do it. You should test
my daughter. She studies medicine.
Statistics is alien to everyday concerns and of
little use for judging individual persons.
Proposition 9
P\big((f < o < b) \mid (M = p)\big) = \frac{\int_f^b \binom{N}{p} o^p (1 - o)^{N - p} \, do}{\int_0^1 \binom{N}{p} o^p (1 - o)^{N - p} \, do}
Beta function
\int_0^1 o^p (1 - o)^{N - p} \, do = \frac{p! \, (N - p)!}{(N + 1)!}
Beta distribution
The binomial coefficient cancels between numerator and denominator, so the posterior is the Beta(p + 1, N - p + 1) distribution:

P\big((f < o < b) \mid (M = p)\big) = \frac{\int_f^b o^p (1 - o)^{N - p} \, do}{B(p + 1, N - p + 1)}, \qquad 0 \le o \le 1
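This posterior probability can be computed directly; a sketch, again with a midpoint rule and with posterior_prob my own name, computing B(p+1, N-p+1) from the factorial identity above:

```python
from math import factorial

def posterior_prob(f, b, N, p, steps=100_000):
    """P(f < o < b | M = p): mass of the Beta(p+1, N-p+1) posterior on (f, b)."""
    # B(p+1, N-p+1) = p! (N-p)! / (N+1)!
    beta = factorial(p) * factorial(N - p) / factorial(N + 1)
    # Midpoint-rule integral of o^p (1 - o)^(N - p) over (f, b).
    h = (b - f) / steps
    total = sum((f + (i + 0.5) * h) ** p * (1 - (f + (i + 0.5) * h)) ** (N - p)
                for i in range(steps))
    return h * total / beta

# After p = 7 successes in N = 10 trials, the chance that o lies in (0.5, 0.9):
print(round(posterior_prob(0.5, 0.9, 10, 7), 3))
```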
[Figures: Beta density functions for Beta(0.5, 0.5), Beta(1, 1), Beta(1, 2), Beta(1, 5), Beta(1, 20), Beta(2, 20), Beta(10, 20), Beta(10, 100), Beta(200, 200), Beta(100, 100), Beta(50, 50), and Beta(10, 10)]
Dirichlet prior

The Dirichlet prior is conjugate to the multinomial distribution. The Dirichlet distribution can be written as

D(\theta \mid \alpha) = \frac{\Gamma\!\big(\sum_{i=1}^{K} \alpha_i\big)}{\prod_{i=1}^{K} \Gamma(\alpha_i)} \prod_{i=1}^{K} \theta_i^{\alpha_i - 1}

where the \theta_i satisfy 0 \le \theta_i \le 1 and \sum_{i=1}^{K} \theta_i = 1.
Multinomial distribution
The multinomial distribution corresponding to k balls dropped into n boxes with fixed probabilities (p_1, \ldots, p_n), with the ith box containing k_i balls, is

\binom{k}{k_1, \ldots, k_n} p_1^{k_1} \cdots p_n^{k_n} = \frac{k!}{k_1! \cdots k_n!} \, p_1^{k_1} \cdots p_n^{k_n}

The Dirichlet is a convenient prior because the posterior, having observed frequencies (k_1, \ldots, k_n), is Dirichlet with parameters (\alpha_1 + k_1, \ldots, \alpha_n + k_n).
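The conjugate update itself is a one-line addition of counts to prior parameters. A small sketch under the assumption of a three-box example (all numbers illustrative):

```python
# Dirichlet(alpha) prior + multinomial counts k  ->  Dirichlet(alpha + k) posterior.
alpha = [1, 1, 1]    # uniform prior over the 2-simplex
counts = [5, 2, 3]   # observed box frequencies (k1, k2, k3)

posterior = [a + k for a, k in zip(alpha, counts)]

# Posterior mean of each box probability: alpha_i' / sum(alpha').
total = sum(posterior)
mean = [a / total for a in posterior]

print(posterior)                       # [6, 3, 4]
print([round(m, 3) for m in mean])     # [0.462, 0.231, 0.308]
```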
[Figures: Dirichlet density surfaces over the simplex, including Dirichlet(1, 1, 1) and Dirichlet(1, 2, 1)]
References
[1] Rev. Thomas Bayes, An essay towards solving a problem in the doctrine of chances, Philosophical Transactions of the Royal Society of London (1763).
[2] D. M. Eddy, Variations in physician practice: the role of uncertainty, Professional Judgment: A Reader in Clinical Decision Making (A. Elstein and J. Dowie, eds.), Cambridge University Press, 1988, pp. 45–59.
[3] Gerd Gigerenzer, The psychology of good judgment, Medical Decision Making 16 (1996), no. 3, 273–280.