Uncertainty in Rule-Based Expert Systems

• Fundamental ideas
• Bayesian reasoning
• Certainty factors theory
• Comparison



Uncertainty
• Lack of exact knowledge to reach a perfectly reliable conclusion
• Information is often incomplete, inconsistent, or uncertain → uncertainty is unavoidable
• Without uncertainty, exact rules are feasible:
  – A = TRUE → A = ¬FALSE
  – B = FALSE → B = ¬TRUE



Sources of Uncertainty
• Weak implications: vague associations
  – difficult to establish a strong correlation between the condition and action parts of a rule
• Imprecise language: often, sometimes, hardly
• Unknown data: requires approximate reasoning
• Combination of the views of different experts
  – different experts may reach different conclusions on the same question



Expert Systems and Uncertainty
• Uncertainty is unavoidable
• A mechanism is needed to handle uncertainty in an expert system
• Most popular paradigms:
  – Bayesian reasoning
  – certainty factors



Definition of Probability
• Proportion of cases in which the event occurs
• Scientific measure of chance
• Range : [0,1]

p = # Event_occurred / # Possible_events



Conditional and Joint Probability
• Conditional probability: the probability that event A will occur given that event B has occurred

p(A | B) = p(A ∩ B) / p(B)

• Joint probability p(A ∩ B): the probability that both events A and B occur



Common Relationships
p(A ∩ B) = p(B ∩ A)

p(A ∩ B) = p(A | B) p(B)

p(A | B) p(B) = p(B | A) p(A)

p(A) = Σi p(A | Bi) p(Bi),   i = 1, ..., n

p(A | B) = p(B | A) p(A) / [p(B | A) p(A) + p(B | ¬A) p(¬A)]   (Bayes' rule)
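The last relationship (Bayes' rule) is the basis for the reasoning that follows. A minimal Python sketch, with made-up numbers purely for illustration:

```python
def posterior(p_b_given_a, p_b_given_not_a, p_a):
    """Bayes' rule: p(A|B) = p(B|A) p(A) / [p(B|A) p(A) + p(B|~A) p(~A)]."""
    numerator = p_b_given_a * p_a
    evidence = numerator + p_b_given_not_a * (1.0 - p_a)
    return numerator / evidence

# Illustrative values (not from the slides): p(A) = 0.1, p(B|A) = 0.9, p(B|~A) = 0.2
print(posterior(0.9, 0.2, 0.1))   # about 0.33
```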
Bayesian Reasoning
• Rules:
IF E is true
THEN H is true (with probability p),
where E : evidence, H : hypothesis
• When event E occurs, H occurs with probability p(H | E)
• p(H|E) : posterior probability



Multiple Hypotheses
• Select a hypothesis from a number of
hypotheses to explain the current evidence

p(Hi | E) = p(E | Hi) p(Hi) / Σk p(E | Hk) p(Hk),   k = 1, ..., m



Multiple Evidences, Multiple Hypotheses
• Select a hypothesis from a number of hypotheses to explain the observed multiple evidences

p(Hi | E1 E2 ... En) = p(E1 E2 ... En | Hi) p(Hi) / Σk p(E1 E2 ... En | Hk) p(Hk),   k = 1, ..., m



Difficulty and Solution
• The conditional probabilities of all combinations of evidences must be known
• Solution: assume conditional independence among evidences

p(Hi | E1 E2 ... En) = p(E1 | Hi) p(E2 | Hi) ... p(En | Hi) p(Hi)
                       / Σk p(E1 | Hk) p(E2 | Hk) ... p(En | Hk) p(Hk),   k = 1, ..., m



Computation Example
                Hypothesis
Probability     i = 1    i = 2    i = 3
p(Hi)            0.40     0.35     0.25
p(E1|Hi)         0.3      0.8      0.5
p(E2|Hi)         0.9      0.0      0.7
p(E3|Hi)         0.6      0.7      0.9
Find P(Hi|E3), P(Hi|E1E3) and P(Hi|E1E2E3)
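A short Python sketch of this computation, applying the conditional-independence formula from the previous slide to the table above:

```python
# Table values: priors p(Hi) and conditionals p(Ej|Hi) for i = 1, 2, 3
prior = [0.40, 0.35, 0.25]
p_e_given_h = {1: [0.3, 0.8, 0.5],   # p(E1|Hi)
               2: [0.9, 0.0, 0.7],   # p(E2|Hi)
               3: [0.6, 0.7, 0.9]}   # p(E3|Hi)

def posteriors(evidences):
    """p(Hi|E...) for all i, assuming conditional independence of the evidences."""
    numerators = []
    for i, p_h in enumerate(prior):
        num = p_h
        for e in evidences:
            num *= p_e_given_h[e][i]
        numerators.append(num)
    total = sum(numerators)
    return [round(num / total, 2) for num in numerators]

print(posteriors([3]))        # p(Hi|E3)     -> about [0.34, 0.35, 0.32]
print(posteriors([1, 3]))     # p(Hi|E1E3)   -> about [0.19, 0.52, 0.30]
print(posteriors([1, 2, 3]))  # p(Hi|E1E2E3) -> about [0.45, 0.0, 0.55]
```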
Case Study: FORECAST
• See Table 3.3
• Two basic rules
  – Rule 1: Today is rain → Tomorrow is rain
  – Rule 2: Today is dry → Tomorrow is dry
  – Error rate: 10 mistakes in 30 forecasts

How can probability be included in the rules?



Parameters for Belief in Hypothesis
• Likelihood of sufficiency (LS): a measure of the expert's belief in hypothesis H when evidence E is present

LS = p(E | H) / p(E | ¬H)

• Likelihood of necessity (LN): a measure of discredit to hypothesis H when evidence E is absent

LN = p(¬E | H) / p(¬E | ¬H)
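A small sketch of these two ratios, using hypothetical conditional probabilities. Both are computed from the underlying conditionals, which is why LN cannot be obtained from LS alone (next slide):

```python
def likelihood_ratios(p_e_given_h, p_e_given_not_h):
    """Return (LS, LN) computed from p(E|H) and p(E|~H)."""
    ls = p_e_given_h / p_e_given_not_h                    # p(E|H) / p(E|~H)
    ln = (1.0 - p_e_given_h) / (1.0 - p_e_given_not_h)    # p(~E|H) / p(~E|~H)
    return ls, ln

# Hypothetical values (not from the slides): p(E|H) = 0.8, p(E|~H) = 0.4
print(likelihood_ratios(0.8, 0.4))   # LS = 2.0, LN = 0.33
```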
Determination of LN and LS
• Directly decided by experts
• High LS (LS >> 1) : rule strongly supports the
hypothesis
• Low LN (0 < LN < 1): the rule strongly opposes the hypothesis when the evidence is missing
• LN cannot be derived from LS



Probability Inclusion
• Assume each event is equally possible (p = 0.5)
• Two adapted rules:
  – Rule 1: Today is rain {LS 2.5, LN 0.6} → Tomorrow is rain {prior 0.5}
  – Rule 2: Today is dry {LS 1.6, LN 0.4} → Tomorrow is dry {prior 0.5}



Calculation for Overall Probability
• Objective: find the maximum posterior probability
• Use the prior probability for the first calculation
• Update by LS if the evidence is true
• Update by LN if the evidence is false
• Continue until all evidences have been applied



Prior Odds
• Define the prior odds: the ratio of occurrence to non-occurrence

O(H) = p(H) / (1 − p(H))

• Makes calculation of the posterior probability easier



Find Posterior Probability by O(H)
O(H | E) = LS × O(H)
O(H | ¬E) = LN × O(H)

p(H | E) = O(H | E) / (1 + O(H | E))
p(H | ¬E) = O(H | ¬E) / (1 + O(H | ¬E))
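A minimal sketch of one such update, using the numbers of Rule 1 of FORECAST on the next slide (prior 0.5, LS 2.5, LN 0.6):

```python
def odds(p):
    """Convert a probability to odds."""
    return p / (1.0 - p)

def prob(o):
    """Convert odds back to a probability."""
    return o / (1.0 + o)

def update(p_h, ls, ln, evidence_true):
    """Posterior p(H|E) or p(H|~E) from the prior and LS/LN."""
    factor = ls if evidence_true else ln
    return prob(factor * odds(p_h))

# Rule 1: prior 0.5, LS 2.5, LN 0.6
print(update(0.5, 2.5, 0.6, True))    # evidence true  -> about 0.71
print(update(0.5, 2.5, 0.6, False))   # evidence false -> 0.375
```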
FORECAST: Rules (1)
• Rule 1: Today is rain {LS 2.5, LN 0.6} → Tomorrow is rain {prior 0.5}
• Rule 2: Today is dry {LS 1.6, LN 0.4} → Tomorrow is dry {prior 0.5}
• Rule 3: Today is rain and rainfall is low {LS 10, LN 1} → Tomorrow is dry {prior 0.5}



FORECAST: Rules (2)
• Rule 4: Today is rain, rainfall is low and temperature is cold {LS 1.5, LN 1} → Tomorrow is dry {prior 0.5}
• Rule 5: Today is dry and temperature is warm {LS 2, LN 0.9} → Tomorrow is rain {prior 0.5}
• Rule 6: Today is dry, temperature is warm and sky is overcast {LS 5, LN 1} → Tomorrow is rain {prior 0.5}
Information about Today
• Weather: rain
– use rules with the event on weather only (1, 2)
• Rainfall: low
– use rules with the event on rainfall (3)
• Temperature: cold
– use rules with the event on temperature (4, 5)
• Cloud cover: overcast
– use rules with the event on cloud cover (6)
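A sketch of how these observations could drive the posterior updates, chaining the odds-based update from the earlier slide over Rules 1-6. This is an illustrative reconstruction rather than the exact trace of the original FORECAST run; a rule whose antecedent is fully true applies its LS, otherwise its LN:

```python
def update(p_h, factor):
    """Odds-based posterior update (see the O(H) slides above)."""
    o = factor * p_h / (1.0 - p_h)
    return o / (1.0 + o)

# (antecedent evidences, LS, LN, hypothesis) for Rules 1-6
rules = [
    ({"today rain"},                                2.5, 0.6, "tomorrow rain"),
    ({"today dry"},                                 1.6, 0.4, "tomorrow dry"),
    ({"today rain", "rainfall low"},               10.0, 1.0, "tomorrow dry"),
    ({"today rain", "rainfall low", "temp cold"},   1.5, 1.0, "tomorrow dry"),
    ({"today dry", "temp warm"},                    2.0, 0.9, "tomorrow rain"),
    ({"today dry", "temp warm", "sky overcast"},    5.0, 1.0, "tomorrow rain"),
]

observed = {"today rain", "rainfall low", "temp cold", "sky overcast"}

p = {"tomorrow rain": 0.5, "tomorrow dry": 0.5}   # priors
for antecedent, ls, ln, hypothesis in rules:
    factor = ls if antecedent <= observed else ln
    p[hypothesis] = update(p[hypothesis], factor)

print(p)   # roughly {'tomorrow rain': 0.69, 'tomorrow dry': 0.86} -> forecast: dry
```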
Problems of Bayesian Method
• Primary input: probabilities
• Humans are often unable to elicit probability values consistent with the Bayesian rules → conditional probabilities may be inconsistent with the prior probabilities
• Prior probabilities may also be inconsistent with LS and LN
• Conditional independence of evidences is rarely satisfied
• Impractical for large knowledge bases
Inconsistency in Bayesian Method
• Case study: starting car
• Given probabilities:
  – p(bad starter | odd noises) = 0.7
  – p(odd noises | bad starter) = 0.85
  – p(bad starter) = 0.05
• Probability derived by Bayes' rule from p(odd noises | bad starter) and p(bad starter):
  – p(bad starter | odd noises) = 0.23  (far below the expert's 0.7!)
Cause of Inconsistency
• Different assumptions on conditional and prior
probability
• p(bad starter) calculated from p(bad starter | odd noises) and p(odd noises | bad starter) is 0.29, not the 0.05 stated by the expert. Hint: rearrange the terms in the following equation:

p(H | E) = p(E | H) p(H) / [p(E | H) p(H) + p(E | ¬H) p(¬H)]
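A sketch that reproduces both figures. The slides do not give p(odd noises | good starter); the value 0.15 is assumed here because it yields the quoted 0.23 and 0.29, and that assumption is flagged in the code:

```python
# Expert-supplied values (previous slide)
p_noise_given_bad = 0.85     # p(odd noises | bad starter)
p_bad = 0.05                 # p(bad starter)
p_bad_given_noise = 0.70     # expert's p(bad starter | odd noises)

# ASSUMPTION: p(odd noises | good starter) is not given on the slides;
# 0.15 is chosen because it reproduces the quoted 0.23 and 0.29.
p_noise_given_good = 0.15

# Posterior implied by Bayes' rule from the expert's prior
posterior = (p_noise_given_bad * p_bad) / (
    p_noise_given_bad * p_bad + p_noise_given_good * (1 - p_bad))
print(round(posterior, 2))       # 0.23, not the expert's 0.70

# Prior implied by Bayes' rule from the expert's posterior (solve for p(H))
r = p_bad_given_noise / (1 - p_bad_given_noise)   # posterior odds
prior_implied = r * p_noise_given_good / (p_noise_given_bad + r * p_noise_given_good)
print(round(prior_implied, 2))   # 0.29, not the expert's 0.05
```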
Certainty Factor Theory
• Popular alternative to Bayesian reasoning
• Advantages: works even when
  – there is no reliable statistical data about the problem domain
  – the hypothesis cannot be explained in logically and mathematically consistent terms
• Introduces a certainty factor (cf) for measuring expert belief
  – Range [−1, 1]: from definitely false to definitely true



Rules in Certainty Factors Theory
• IF <evidence> THEN <hypothesis> {cf}
Term                   Certainty factor
Definitely not         −1.0
Almost certainly not   −0.8
Probably not           −0.6
Maybe not              −0.4
Unknown                −0.2 to +0.2
Maybe                  +0.4
…



cf and Probabilities
cf = [MB(H, E) − MD(H, E)] / (1 − min[MB(H, E), MD(H, E)])

MB(H, E) = 1                                                  if p(H) = 1
           (max[p(H|E), p(H)] − p(H)) / (max[1, 0] − p(H))    otherwise

MD(H, E) = 1                                                  if p(H) = 0
           (min[p(H|E), p(H)] − p(H)) / (min[1, 0] − p(H))    otherwise
MB : Measure of Belief, MD : Measure of Disbelief
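A small sketch of these definitions as Python functions, evaluated at a hypothetical prior and posterior purely for illustration:

```python
def mb(p_h, p_h_given_e):
    """Measure of Belief."""
    if p_h == 1:
        return 1.0
    return (max(p_h_given_e, p_h) - p_h) / (1.0 - p_h)     # max[1, 0] = 1

def md(p_h, p_h_given_e):
    """Measure of Disbelief."""
    if p_h == 0:
        return 1.0
    return (min(p_h_given_e, p_h) - p_h) / (0.0 - p_h)     # min[1, 0] = 0

def cf(p_h, p_h_given_e):
    """Certainty factor from MB and MD."""
    b, d = mb(p_h, p_h_given_e), md(p_h, p_h_given_e)
    return (b - d) / (1.0 - min(b, d))

# Hypothetical values: prior p(H) = 0.4
print(cf(0.4, 0.8))   # evidence supports H  -> about 0.67
print(cf(0.4, 0.1))   # evidence opposes H   -> -0.75
```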
Example of Rules
• IF A is X
THEN B is Y {cf 0.7}
B is Z {cf 0.2}
• More than one value assigned to B
• For the remaining 10% (the two cf values sum to 0.9), anything can happen, including values that have not yet been observed



Propagation of cf
• cf(H,E) = cf(E) × cf
• E.g.
IF the sky is clear
THEN the forecast is sunny {cf 0.8}
Current cf of "the sky is clear" = 0.5
cf(H, E) = 0.5 × 0.8 = 0.4
→ It may be sunny



Multiple Evidences: Conjunctive
• A hypothesis implied from a number of evidences
cf(H, E1 ∩ E2 ∩ ... ∩ En) = min[cf(E1), cf(E2), ..., cf(En)] × cf
• E.g.
IF sky is clear {cf 0.9}
AND the forecast is sunny {cf 0.7}
THEN the action is “wear sunglasses” {cf 0.8}



Multiple Evidences: Disjunctive
• A hypothesis implied from one or more evidences

cf(H, E1 ∪ E2 ∪ ... ∪ En) = max[cf(E1), cf(E2), ..., cf(En)] × cf

• E.g.
IF sky is overcast {cf 0.6}
OR the forecast is rain {cf 0.8}
THEN the action is "take an umbrella" {cf 0.8}
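A minimal sketch of the two combination rules, applied to the two examples above (sunglasses: conjunctive; umbrella: disjunctive):

```python
def cf_conjunctive(rule_cf, *evidence_cfs):
    """cf(H, E1 AND ... AND En) = min(evidence cfs) * rule cf."""
    return min(evidence_cfs) * rule_cf

def cf_disjunctive(rule_cf, *evidence_cfs):
    """cf(H, E1 OR ... OR En) = max(evidence cfs) * rule cf."""
    return max(evidence_cfs) * rule_cf

# "wear sunglasses": sky is clear {0.9} AND forecast is sunny {0.7}, rule cf 0.8
print(cf_conjunctive(0.8, 0.9, 0.7))   # min(0.9, 0.7) * 0.8 = 0.56

# "take an umbrella": sky is overcast {0.6} OR forecast is rain {0.8}, rule cf 0.8
print(cf_disjunctive(0.8, 0.6, 0.8))   # max(0.6, 0.8) * 0.8 = 0.64
```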
Rules Combination
• Rules that lead to the same hypothesis
• Rule 1: IF A is X
THEN C is Z {cf 0.8}
• Rule 2: IF B is Y
THEN C is Z {cf 0.6}
How to combine the two rules…



Combined cf

cf(cf1, cf2) = cf1 + cf2 × (1 − cf1)                    if cf1 > 0 and cf2 > 0
             = (cf1 + cf2) / (1 − min[|cf1|, |cf2|])    if cf1 < 0 or cf2 < 0
             = cf1 + cf2 × (1 + cf1)                    if cf1 < 0 and cf2 < 0

Combine according to the confidence in the hypothesis given by each evidence.
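A direct Python transcription of this combination function, applied to the two rules on the previous slide (assuming, for illustration, that both antecedents are fully believed, so the rules contribute cf1 = 0.8 and cf2 = 0.6):

```python
def cf_combine(cf1, cf2):
    """Combine two certainty factors supporting the same hypothesis."""
    if cf1 > 0 and cf2 > 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    # mixed signs
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Rule 1 gives "C is Z" {cf 0.8}, Rule 2 gives "C is Z" {cf 0.6}
print(cf_combine(0.8, 0.6))   # 0.8 + 0.6 * (1 - 0.8) = 0.92
```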



FORECAST: Rules for cf (1)
• Control statements: control cf, control "threshold 0.01"
• Rule 1: Today is rain → Tomorrow is rain {cf 0.5}
• Rule 2: Today is dry → Tomorrow is dry {cf 0.5}
• Rule 3: Today is rain and rainfall is low → Tomorrow is dry {cf 0.6}



FORECAST: Rules for cf (2)
• Rule 4: Today is rain, rainfall is low and temperature is cold → Tomorrow is dry {cf 0.7}
• Rule 5: Today is dry and temperature is warm → Tomorrow is rain {cf 0.65}
• Rule 6: Today is dry, temperature is warm and sky is overcast → Tomorrow is rain {cf 0.55}
• Seek tomorrow
Information about Today
• Weather: rain
– use rules on weather evidence (1)
• Rainfall: low with cf of 0.8
– use rules with the event on rainfall and weather (3)
• Temperature: cold with cf of 0.9
– use rules with the event on temperature (4)
• Combine rules with the same hypothesis
– combine result from rule (3) with rule (4)
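A sketch of this combination using the cf rules above, assuming (for illustration) that the observation "today is rain" carries cf 1.0, since no cf is given for it on the slide:

```python
# Evidence cfs for today: rain (assumed 1.0), rainfall low 0.8, temperature cold 0.9
cf_rain, cf_low, cf_cold = 1.0, 0.8, 0.9

# Rule 3: today is rain AND rainfall is low -> tomorrow is dry {cf 0.6}
cf3 = min(cf_rain, cf_low) * 0.6              # 0.48

# Rule 4: today is rain AND rainfall is low AND temperature is cold -> tomorrow is dry {cf 0.7}
cf4 = min(cf_rain, cf_low, cf_cold) * 0.7     # 0.56

# Combine the two conclusions about "tomorrow is dry" (both positive)
combined = cf3 + cf4 * (1 - cf3)
print(combined)   # about 0.77

# Rule 1 separately gives "tomorrow is rain" with cf = 1.0 * 0.5 = 0.5
```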
Pitfall of cf
• Assumption of independence
• Violation of independence
1) IF sprinkler ON last night {cf 1}
THEN grass wet in the morning {cf 0.8}
2) IF grass wet in the morning
THEN rained last night {cf 0.9}
cf[wet, sprinkler ON] = 1 × 0.8 = 0.8
cf[rain, wet] = 0.8 × 0.9 = 0.72
Evidence of RAIN when grass was wet by sprinkler!?
How to Avoid Pitfall
• Use only causal or inverse-causal relations
  – Sprinkler on → grass wet : causal
  – Grass wet → rain : inverse causal
• Partition causal and inverse-causal relationships so that derived evidence cannot be used to reason back the other way without new evidence



Bayesian Reasoning VS cf
Bayesian Reasoning
• Relies on valid data and statistical information
• Assumes conditional independence of evidence
• Mathematically correct
• E.g. PROSPECTOR

Certainty Factor Theory
• Relies on subjective (unclear) information
• Heuristic approach
• Assumes independence of evidence
• E.g. MYCIN
Both aim for expert systems able to quantify personal, subjective, and qualitative information.
HOMEWORK
• Design the diagnostic system for

