
CS464 Introduction to Machine Learning

Fall 2010
Questions 2: Decision Tree Learning
Q1) Give decision trees to represent the following boolean functions:
a) A ∧ ¬B
b) A ∨ [B ∧ C]
c) A XOR B (in the tree for this function, A1 means A and A2 means B)
d) [A ∧ B] ∨ [C ∧ D]
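A decision tree is just a nesting of attribute tests, so each of these functions can be written as nested if-statements. Below is a minimal Python sketch for the A XOR B case; the function name xor_tree and the use of Python are illustrative, not part of the exercise.

def xor_tree(A: bool, B: bool) -> bool:
    # Root node tests A; each branch then tests B and ends in a leaf.
    if A:
        if B:
            return False   # A=T, B=T -> XOR is False
        return True        # A=T, B=F -> True
    else:
        if B:
            return True    # A=F, B=T -> True
        return False       # A=F, B=F -> False

# Quick check against the XOR truth table.
assert [xor_tree(a, b) for a in (True, False) for b in (True, False)] == [False, True, True, False]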
Q2) Consider the following set of training examples:
a) What is the entropy of this collection of training examples with respect to the target function classification?
Entropy = -(3/6) log2(3/6) - (3/6) log2(3/6) = 1
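This value can be checked numerically; the helper entropy() below is an illustrative name, not something given in the course material, and the counts [3, 3] are the 3 positive and 3 negative examples used in the calculation above.

from math import log2

def entropy(counts):
    """Entropy (in bits) of a class distribution given as a list of class counts."""
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c > 0)

print(entropy([3, 3]))   # 1.0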
b) What is the information gain of a2 relative to these training examples?
E(a2 = T) = -(2/4) log2(2/4) - (2/4) log2(2/4) = 1
E(a2 = F) = -(1/2) log2(1/2) - (1/2) log2(1/2) = 1
Gain of a2 = E(S) - (4/6) E(a2 = T) - (2/6) E(a2 = F)
= 1 - 4/6 - 2/6 = 0
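The gain can likewise be verified numerically. The sketch below is illustrative (information_gain() is not part of the assignment); the partition sizes come from the equations above: a2 = T covers 4 examples split 2/2, and a2 = F covers 2 examples split 1/1.

from math import log2

def entropy(counts):
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c > 0)

def information_gain(parent_counts, partitions):
    """Gain = entropy(parent) minus the size-weighted entropies of the partitions."""
    total = sum(parent_counts)
    return entropy(parent_counts) - sum(sum(p) / total * entropy(p) for p in partitions)

print(round(information_gain([3, 3], [[2, 2], [1, 1]]), 6))   # 0.0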
c) Create the decision tree for these training examples using ID3.
Gain(S, a1) > Gain(S, a2), so a1 is chosen as the root attribute.
Q3) Consider the following set of training examples:
Instance Classification Attrb1 Attrb2 Attrb3
1 c1 a T a
2 c1 a T b
3 c2 b F c
4 c1 c T d
5 c3 a F a
6 c3 b T b
7 c2 c F c
8 c2 b T c
9 c1 a T a
10 c1 b F b
a) What is the entropy of this collection of training examples with respect to the target function classification?
Entropy = -(5/10) log2(5/10) - (2/10) log2(2/10) - (3/10) log2(3/10) ≈ 1.485
b) What is the information gain of Attrb1 relative to these training examples?
E(Attrb1 = a) = -(3/4) log2(3/4) - (1/4) log2(1/4) = 0.811
E(Attrb1 = b) = -(2/4) log2(2/4) - (1/4) log2(1/4) - (1/4) log2(1/4) = 1.5
E(Attrb1 = c) = -(1/2) log2(1/2) - (1/2) log2(1/2) = 1
Gain of Attrb1 = Entropy - (4/10) E(Attrb1 = a) - (4/10) E(Attrb1 = b) - (2/10) E(Attrb1 = c)
≈ 0.36
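These numbers can be reproduced directly from the Q3 table. The sketch below is illustrative only (the data list and the helper names entropy and gain are not part of the assignment).

from collections import Counter
from math import log2

# Q3 training set, transcribed from the table above: (classification, Attrb1, Attrb2, Attrb3)
data = [
    ("c1", "a", "T", "a"), ("c1", "a", "T", "b"), ("c2", "b", "F", "c"),
    ("c1", "c", "T", "d"), ("c3", "a", "F", "a"), ("c3", "b", "T", "b"),
    ("c2", "c", "F", "c"), ("c2", "b", "T", "c"), ("c1", "a", "T", "a"),
    ("c1", "b", "F", "b"),
]

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def gain(rows, attr_index):
    labels = [r[0] for r in rows]
    remainder = 0.0
    for v in {r[attr_index] for r in rows}:
        subset = [r[0] for r in rows if r[attr_index] == v]
        remainder += len(subset) / len(rows) * entropy(subset)
    return entropy(labels) - remainder

print(round(entropy([r[0] for r in data]), 3))  # 1.485
print(round(gain(data, 1), 2))                  # Gain of Attrb1 = 0.36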
c) Create the decision tree for these training examples using ID3.
d) Convert the decision tree into rules.
If (Attrb3 = a) ∧ (Attrb2 = T) Then Classification = c1
If (Attrb3 = a) ∧ (Attrb2 = F) Then Classification = c3
If (Attrb3 = b) ∧ (Attrb1 = a) Then Classification = c1
If (Attrb3 = b) ∧ (Attrb1 = b) ∧ (Attrb2 = T) Then Classification = c3
If (Attrb3 = b) ∧ (Attrb1 = b) ∧ (Attrb2 = F) Then Classification = c1
If (Attrb3 = c) Then Classification = c2
If (Attrb3 = d) Then Classification = c1
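The rules above correspond to the tree ID3 builds on the Q3 table (Attrb3 has the highest gain at the root). A minimal ID3 sketch that reproduces this structure is given below; all function and variable names are illustrative, not part of the course material.

from collections import Counter
from math import log2

# Q3 training examples: (classification, {attribute: value})
examples = [
    ("c1", {"Attrb1": "a", "Attrb2": "T", "Attrb3": "a"}),
    ("c1", {"Attrb1": "a", "Attrb2": "T", "Attrb3": "b"}),
    ("c2", {"Attrb1": "b", "Attrb2": "F", "Attrb3": "c"}),
    ("c1", {"Attrb1": "c", "Attrb2": "T", "Attrb3": "d"}),
    ("c3", {"Attrb1": "a", "Attrb2": "F", "Attrb3": "a"}),
    ("c3", {"Attrb1": "b", "Attrb2": "T", "Attrb3": "b"}),
    ("c2", {"Attrb1": "c", "Attrb2": "F", "Attrb3": "c"}),
    ("c2", {"Attrb1": "b", "Attrb2": "T", "Attrb3": "c"}),
    ("c1", {"Attrb1": "a", "Attrb2": "T", "Attrb3": "a"}),
    ("c1", {"Attrb1": "b", "Attrb2": "F", "Attrb3": "b"}),
]

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def gain(rows, attr):
    labels = [c for c, _ in rows]
    rem = 0.0
    for v in {a[attr] for _, a in rows}:
        sub = [c for c, a in rows if a[attr] == v]
        rem += len(sub) / len(rows) * entropy(sub)
    return entropy(labels) - rem

def id3(rows, attrs):
    labels = [c for c, _ in rows]
    if len(set(labels)) == 1:                      # pure node -> leaf
        return labels[0]
    if not attrs:                                  # no attributes left -> majority leaf
        return Counter(labels).most_common(1)[0][0]
    best = max(attrs, key=lambda a: gain(rows, a)) # split on the highest-gain attribute
    tree = {}
    for v in {a[best] for _, a in rows}:
        branch = [(c, a) for c, a in rows if a[best] == v]
        tree[v] = id3(branch, [a for a in attrs if a != best])
    return (best, tree)

# Prints a nested (attribute, {value: subtree}) structure; the root test is Attrb3,
# matching the rules above.
print(id3(examples, ["Attrb1", "Attrb2", "Attrb3"]))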
Q4) Consider the following set of training examples:
Instance Classification Attrb1 Attrb2
1 T 4 F
2 T 10 T
3 F 20 F
4 F 34 T
5 F 50 F
6 F 70 F
7 T 76 F
8 T 80 T
9 F 90 T
10 F 92 T
a) Create the decision tree for these training examples using ID3. Note that Attrb1 is a continuous-valued attribute.
b) Convert the decision tree into rules.
If (Attrb1 < 15) Then Classification = True
If (Attrb1 > 15) ∧ (Attrb1 < 73) Then Classification = False
If (Attrb1 > 15) ∧ (Attrb1 > 73) ∧ (Attrb1 < 85) Then Classification = True
If (Attrb1 > 15) ∧ (Attrb1 > 73) ∧ (Attrb1 > 85) Then Classification = False
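The thresholds 15, 73, and 85 in these rules are the midpoints between adjacent sorted Attrb1 values whose classifications differ. The sketch below shows how the candidates arise and which one ID3 would pick first by information gain; the names (points, threshold_gain, etc.) are illustrative assumptions, not part of the assignment.

from math import log2

# Q4 training set transcribed from the table above: (Attrb1, classification).
points = [(4, True), (10, True), (20, False), (34, False), (50, False),
          (70, False), (76, True), (80, True), (90, False), (92, False)]

def entropy(labels):
    n = len(labels)
    pos = sum(labels)
    return -sum(p * log2(p) for p in (pos / n, (n - pos) / n) if p > 0)

def threshold_gain(points, t):
    """Information gain of splitting the examples at Attrb1 < t versus Attrb1 >= t."""
    labels = [c for _, c in points]
    left = [c for v, c in points if v < t]
    right = [c for v, c in points if v >= t]
    return (entropy(labels)
            - len(left) / len(points) * entropy(left)
            - len(right) / len(points) * entropy(right))

# Candidate thresholds: midpoints between adjacent values whose classes differ.
values = sorted(points)
candidates = [(a + c) / 2 for (a, b), (c, d) in zip(values, values[1:]) if b != d]
print(candidates)                                                  # [15.0, 73.0, 85.0]
print(max(candidates, key=lambda t: threshold_gain(points, t)))    # 15.0, the root split used above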
