WPI Worcester Polytechnic Institute

Computer Science Department
------------------------------------------

CS 4445 Data Mining and Knowledge Discovery in Databases - B Term 2014 
Homework and Project 2: Data Pre-processing, Mining, and Evaluation of Decision Trees

Prof. Carolina Ruiz 

DUE DATES: Thursday, Nov. 13, 9:00 am (electronic submission) and 11:00 am (hardcopy submission) 
------------------------------------------


HOMEWORK AND PROJECT OBJECTIVES

The purpose of this project is multi-fold:

HOMEWORK AND PROJECT ASSIGNMENTS

Readings: Read in great detail Chapter 4 (except for Section 4.6) of your textbook.

This project consists of two parts:

  1. Part I. INDIVIDUAL HOMEWORK ASSIGNMENT

    See solutions by Chiying Wang.

    Consider the following dataset.

    @relation simple-weather
    
    @attribute outlook {sunny,overcast,rainy}
    @attribute humidity numeric
    @attribute windy {TRUE,FALSE}
    @attribute play {yes,no}
    
    @data
    sunny,    80, FALSE, no      % instance 1
    sunny,    90, TRUE,  no      % instance 2
    overcast, 80, FALSE, yes     % instance 3
    rainy,    96, FALSE, yes     % instance 4
    rainy,    80, FALSE, yes     % instance 5
    rainy,    72, TRUE,  no      % instance 6
    overcast, 72, TRUE,  yes     % instance 7
    sunny,    96, FALSE, no      % instance 8
    sunny,    72, FALSE, yes     % instance 9
    rainy,    80, FALSE, yes     % instance 10
    sunny,    72, TRUE,  yes     % instance 11
    overcast, 90, TRUE,  yes     % instance 12
    overcast, 80, TRUE,  yes     % instance 13
    rainy,    96, TRUE,  no      % instance 14
    
    where the play attribute is the classification target.

    1. (30 points) Construct the full ID3 decision tree using entropy to rank the predicting attributes (outlook, humidity, and windy) with respect to the target/classification attribute (play). Keep humidity as a numeric attribute (do not discretize it). Show all the steps of the calculations. Make sure you compute logarithms in the appropriate base b correctly, as some calculators do not provide a log_b key for every b (recall that log_b x = log x / log b). Also, state explicitly which instances belong to each node of your tree, using the instance numbers provided next to each data instance in the dataset above.
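As a sanity check on the hand calculations (not a substitute for showing the steps), the entropy and information-gain quantities for this dataset can be sketched in Python; the function names here are ours, not part of the assignment:

```python
# Entropy and information gain for the simple-weather dataset above.
from collections import Counter
from math import log2

# (outlook, humidity, windy, play) -- instances 1..14 from the dataset
data = [
    ("sunny",    80, "FALSE", "no"),
    ("sunny",    90, "TRUE",  "no"),
    ("overcast", 80, "FALSE", "yes"),
    ("rainy",    96, "FALSE", "yes"),
    ("rainy",    80, "FALSE", "yes"),
    ("rainy",    72, "TRUE",  "no"),
    ("overcast", 72, "TRUE",  "yes"),
    ("sunny",    96, "FALSE", "no"),
    ("sunny",    72, "FALSE", "yes"),
    ("rainy",    80, "FALSE", "yes"),
    ("sunny",    72, "TRUE",  "yes"),
    ("overcast", 90, "TRUE",  "yes"),
    ("overcast", 80, "TRUE",  "yes"),
    ("rainy",    96, "TRUE",  "no"),
]

def entropy(rows):
    """Entropy (base 2) of the play labels in rows."""
    counts = Counter(r[-1] for r in rows)
    n = len(rows)
    return -sum(c / n * log2(c / n) for c in counts.values())

def gain_nominal(rows, i):
    """Information gain of splitting rows on nominal attribute i."""
    n = len(rows)
    parts = {}
    for r in rows:
        parts.setdefault(r[i], []).append(r)
    return entropy(rows) - sum(len(p) / n * entropy(p) for p in parts.values())

def gain_numeric(rows, i):
    """Best binary-split gain for numeric attribute i, trying each
    candidate threshold between consecutive observed values."""
    n = len(rows)
    best = 0.0
    for t in sorted({r[i] for r in rows})[:-1]:
        left  = [r for r in rows if r[i] <= t]
        right = [r for r in rows if r[i] >  t]
        g = entropy(rows) - (len(left) / n * entropy(left)
                             + len(right) / n * entropy(right))
        best = max(best, g)
    return best

print(f"entropy(play)  = {entropy(data):.4f}")
print(f"gain(outlook)  = {gain_nominal(data, 0):.4f}")
print(f"gain(humidity) = {gain_numeric(data, 1):.4f}")
print(f"gain(windy)    = {gain_nominal(data, 2):.4f}")
```

Comparing these gains tells you which attribute ID3 places at the root; the same computation is then repeated recursively on each subset.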

    2. (5 points) Propose approaches for using your decision tree above to classify instances that contain missing values. Use the following instance to illustrate your ideas:
      outlook = ?, humidity = 80, and windy = FALSE.
      
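One common approach (the "fractional instance" idea used by C4.5) sends the instance down every branch of the missing attribute, weighted by how often each branch value occurs in the training data, and then combines the resulting class distributions. A minimal sketch, assuming outlook is the tree's root; the per-branch leaf probabilities below are hypothetical placeholders that must come from your actual tree in question 1:

```python
# Fractional-instance classification for outlook = ?, humidity = 80,
# windy = FALSE, assuming the tree tests outlook at the root.
branch_weights = {"sunny": 5/14, "overcast": 4/14, "rainy": 5/14}  # from the data

# Hypothetical P(play=yes) at the leaf reached in each branch, given
# humidity = 80 and windy = FALSE -- replace with your tree's values:
p_yes_at_leaf = {"sunny": 0.0, "overcast": 1.0, "rainy": 1.0}

p_yes = sum(branch_weights[v] * p_yes_at_leaf[v] for v in branch_weights)
print(f"P(play=yes) = {p_yes:.3f} -> predict {'yes' if p_yes >= 0.5 else 'no'}")
```

Simpler alternatives worth discussing include imputing the most frequent outlook value, or imputing the most frequent value among training instances of the same class.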

    3. Study how J4.8 performs post-pruning by reading in detail:
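As background for that reading: J4.8 (Weka's C4.5 implementation) estimates the error of a node pessimistically, using the upper limit of a binomial confidence interval on the observed training error (default confidence factor 0.25, i.e. z ≈ 0.69), and prunes a subtree when the node collapsed to a single leaf is estimated to do no worse. A sketch of that estimate, assuming this standard formulation; the child counts below are hypothetical placeholders:

```python
# Pessimistic error estimate behind C4.5/J4.8 post-pruning.
from math import sqrt

def pessimistic_error(N, E, z=0.69):
    """Upper limit of the binomial confidence interval on the error rate
    for a leaf covering N training instances with E errors
    (z = 0.69 corresponds to the default confidence factor 0.25)."""
    f = E / N
    return (f + z*z/(2*N)
            + z * sqrt(f/N - f*f/N + z*z/(4*N*N))) / (1 + z*z/N)

# Prune when the node as a single leaf is no worse than the weighted
# sum over its children. Hypothetical (N, E) counts for illustration:
node = pessimistic_error(14, 5)
children = [(6, 2), (2, 0), (6, 1)]
subtree = sum(n / 14 * pessimistic_error(n, e) for n, e in children)
print(f"node as leaf: {node:.3f}  subtree: {subtree:.3f}  "
      f"prune: {node <= subtree}")
```

Note that the estimate always exceeds the observed error rate E/N, and the smaller the leaf, the larger the penalty, which is what drives pruning of small, overfit subtrees.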

  2. Part II. GROUP PROJECT ASSIGNMENT