CS539 MACHINE LEARNING. SPRING 99
SYLLABUS

PROF. CAROLINA RUIZ
Department of Computer Science
Worcester Polytechnic Institute



COURSE DESCRIPTION:

Machine learning is concerned with the design and study of computer programs that are able to improve their own performance with experience, or in other words, computer programs that learn. In this graduate course we cover several theoretical and practical aspects of machine learning. We study different machine learning techniques/paradigms, including decision trees, neural networks, genetic algorithms, Bayesian learning, rule learning, and reinforcement learning. We discuss applications of these techniques to problems in data analysis, knowledge discovery and data mining.

We will closely follow the excellent recent book "Machine Learning" by Tom M. Mitchell and will discuss several state-of-the-art research articles. The course will provide substantial hands-on experience through four computer projects. These projects use code and datasets provided online as companions to the textbook.

For the catalog description of this course see the WPI Graduate Catalog.


CLASS MEETING:

Tuesdays 5:30-8:20 pm
KH116

Students are also encouraged to attend the AIRG Seminar Thursdays at 11 am.


INSTRUCTOR:

Prof. Carolina Ruiz
ruiz@cs.wpi.edu
Office: FL 232
Phone Number: (508) 831-5640
Office Hours: Tu 4:30-5:20 pm, Th 10-10:50 am, or by appointment.

Other speakers may occasionally be invited to lecture to the class.


TEXTBOOK:

Tom M. Mitchell. "Machine Learning" McGraw-Hill, 1997.


PREREQUISITE:

CS 534 or equivalent, or permission of the instructor.


GRADES:

Exam 1 20%
Exam 2 20%
Project 1 15%
Project 2 15%
Project 3 15%
Project 4 15%
Class Participation Extra Points

Your final grade will reflect your own work and achievements during the course. Any type of cheating will be penalized with an F grade for the course and will be reported to the WPI Judicial Board in accordance with the Academic Honesty Policy.


EXAMS:

There will be a total of 2 exams. Each exam will cover the material presented in class since the beginning of the semester. In particular, the final exam is cumulative. Both will be in-class exams.

PROJECTS:

There will be a total of 4 projects. These projects may be implemented using any high-level programming language (Lisp, Prolog, C, C++, ...). More detailed descriptions of the projects will be posted to the course webpage at the appropriate times during the semester. Although you may find similar programs/systems available online or in the references, the design and all code you use and submit for your projects MUST be your own original work.

The code you submit for each of the projects should run on the WPI CS machines or the CCC machines and should rely only on software available on those machines.

Project 1 - Decision Tree Learning

Construction of a decision tree learner using the decision tree learning code from Chapter 3 of the textbook and application of the system to several learning tasks.
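
As a rough illustration of the core computation behind decision tree induction (this is not the textbook's companion code, and the attribute names and toy data below are made up), the following Python sketch computes entropy and information gain and uses them to choose the attribute to split on:

import math
from collections import Counter

def entropy(labels):
    """Entropy (in bits) of a list of class labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(examples, attribute, target):
    """Reduction in entropy of 'target' obtained by splitting on 'attribute'.
    Each example is a dict mapping attribute names to discrete values."""
    base = entropy([ex[target] for ex in examples])
    remainder = 0.0
    for value in {ex[attribute] for ex in examples}:
        subset = [ex[target] for ex in examples if ex[attribute] == value]
        remainder += (len(subset) / len(examples)) * entropy(subset)
    return base - remainder

# Toy weather-style data, invented for illustration.
data = [
    {"outlook": "sunny",    "windy": "false", "play": "no"},
    {"outlook": "sunny",    "windy": "true",  "play": "no"},
    {"outlook": "rain",     "windy": "false", "play": "yes"},
    {"outlook": "rain",     "windy": "true",  "play": "no"},
    {"outlook": "overcast", "windy": "false", "play": "yes"},
]
# Pick the attribute with the highest information gain as the root test.
best = max(["outlook", "windy"], key=lambda a: information_gain(data, a, "play"))
print(best)

A full learner applies this split selection recursively to the example subsets produced by each test.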

Project 2 - Neural Networks

Design and implementation of a learning system for face recognition using neural networks and the error back propagation procedure. This project is based on the source code and dataset provided online as a companion to Chapter 4 of the textbook.
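
As a rough sketch of the error back propagation updates themselves (this is not the Chapter 4 face-recognition code; the tiny XOR task, network size, and learning rate below are all made up for illustration), a one-hidden-layer network of sigmoid units might be trained in Python as follows:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1 = rng.normal(scale=0.5, size=(2, 3)); b1 = np.zeros(3)    # input -> hidden
W2 = rng.normal(scale=0.5, size=(3, 1)); b2 = np.zeros(1)    # hidden -> output
eta = 0.5                                                    # learning rate

for _ in range(20000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    o = sigmoid(h @ W2 + b2)
    # Backward pass: error terms for sigmoid units.
    delta_o = (y - o) * o * (1 - o)
    delta_h = (delta_o @ W2.T) * h * (1 - h)
    # Gradient-descent weight updates.
    W2 += eta * h.T @ delta_o; b2 += eta * delta_o.sum(axis=0)
    W1 += eta * X.T @ delta_h; b1 += eta * delta_h.sum(axis=0)

print(np.round(o, 2))  # typically close to [[0], [1], [1], [0]], depending on the initialization

The output error is scaled by the sigmoid derivative o(1 - o) and propagated back through the hidden-to-output weights to obtain the hidden-layer error terms.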

Project 3 - Bayesian Learning

Development of a Bayesian learning system. This system will be based on the code from Chapter 6 of the textbook, which can be found online.
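
For orientation only (this is not the Chapter 6 companion code; the attributes and toy data are invented), a minimal naive Bayes classifier for discrete attributes, with add-one smoothing, can be sketched in Python as follows:

from collections import Counter, defaultdict

def train_naive_bayes(examples, target):
    """Estimate class counts and per-class attribute-value counts from
    examples (each a dict of attribute -> value, including the target)."""
    priors = Counter(ex[target] for ex in examples)
    cond = defaultdict(Counter)    # (class, attribute) -> Counter of values
    values = defaultdict(set)      # attribute -> set of observed values
    for ex in examples:
        for attr, val in ex.items():
            if attr == target:
                continue
            cond[(ex[target], attr)][val] += 1
            values[attr].add(val)
    return priors, cond, values

def classify(instance, priors, cond, values):
    """Return the class maximizing P(class) * prod P(value | class),
    using add-one (Laplace) smoothing for unseen attribute values."""
    total = sum(priors.values())
    best_class, best_score = None, float("-inf")
    for cls, cls_count in priors.items():
        score = cls_count / total
        for attr, val in instance.items():
            num = cond[(cls, attr)][val] + 1
            den = sum(cond[(cls, attr)].values()) + len(values[attr])
            score *= num / den
        if score > best_score:
            best_class, best_score = cls, score
    return best_class

# Toy data, invented for illustration.
data = [
    {"outlook": "sunny",    "windy": "false", "play": "no"},
    {"outlook": "rain",     "windy": "false", "play": "yes"},
    {"outlook": "rain",     "windy": "true",  "play": "no"},
    {"outlook": "overcast", "windy": "false", "play": "yes"},
]
model = train_naive_bayes(data, "play")
print(classify({"outlook": "rain", "windy": "false"}, *model))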

Project 4 - Comparison of Machine Learning Methods

Given a common learning task to be determined, each student in the class is expected to:
  1. select an ML technique/paradigm (mutually agreed upon with the instructor) not covered by the previous three projects or by other students in the class. These techniques include, but are not limited to, instance-based learning, genetic algorithms, rule learning, and reinforcement learning (a small instance-based learning sketch follows this project description as an illustration);
  2. research the ML literature on this technique;
  3. design and implement a prototype system that solves the learning task using the chosen learning technique;
  4. write a webpage summarizing the relevant background knowledge and project results;
  5. give a 30-minute oral, in-class presentation describing the achievements of this project.

A comparison of the results obtained by the different learning techniques/algorithms will be drawn as a group effort.
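
As one illustration of a paradigm outside the first three projects, a minimal instance-based learner (k-nearest neighbor) might look like the following Python sketch; the data and the choice of k are made up:

import math
from collections import Counter

def knn_classify(query, examples, k=3):
    """Classify 'query' by majority vote among the k training examples
    (feature_vector, label) closest to it in Euclidean distance."""
    neighbors = sorted(examples, key=lambda ex: math.dist(query, ex[0]))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Two made-up clusters of points with labels "A" and "B".
training = [
    ((1.0, 1.0), "A"), ((1.2, 0.8), "A"), ((0.9, 1.1), "A"),
    ((3.0, 3.2), "B"), ((3.1, 2.9), "B"), ((2.8, 3.0), "B"),
]
print(knn_classify((1.1, 1.0), training, k=3))  # expected: "A"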


CLASS PARTICIPATION:

Students are expected to read the material assigned for each class in advance and to participate in class discussions. Class participation will be taken into account when deciding students' final grades.

CLASS MAILING LIST:

The mailing list for this class is: cs539@cs.wpi.edu
If your email address does not belong to the class mailing list, you can subscribe to it by sending the following one-line email message to majordomo@cs.wpi.edu:
subscribe cs539

CLASS WEB PAGES:

The web pages for this class are located at
http://www.cs.wpi.edu/~cs539/s99/
Announcements will be posted on the web pages and/or the class mailing list, so you are urged to check your email and the class web pages frequently.

ADDITIONAL SUGGESTED REFERENCES:

(See also the list of assigned papers in the Class Schedule.)

Machine Learning

  1. Tom M. Mitchell. "Machine Learning" McGraw-Hill, 1997.

  2. P. Langley. "Elements of Machine Learning" Morgan Kaufmann Publishers, Inc. 1996.

  3. Fayyad, Piatetsky-Shapiro, Smyth, and Uthurusamy, eds. "Advances in Knowledge Discovery and Data Mining" The MIT Press, 1995.

  4. See http://www.aic.nrl.navy.mil/~aha/research/ml/books.html for an extensive list of ML books organized by topics.

General AI

  1. T. Dean, J. Allen, Y. Aloimonos. "Artificial Intelligence: Theory and Practice" The Benjamin/Cummings Publishing Company, Inc. 1995.

  2. B. L. Webber, N. J. Nilsson, eds. "Readings in Artificial Intelligence" Tioga Publishing Company, 1981.

  3. Patrick H. Winston. "Artificial Intelligence" 3rd edition, Addison-Wesley.

  4. S. L. Tanimoto. "The Elements of Artificial Intelligence Using Common Lisp" Computer Science Press, 1990.

  5. E. Rich and K. Knight. "Artificial Intelligence" 2nd edition, McGraw-Hill, 1991.

  6. P. Norvig. "Paradigms of Artificial Intelligence Programming: Case Studies in Common Lisp" Morgan Kaufmann Publishers, 1992.

  7. M. Ginsberg. "Essentials of Artificial Intelligence" Morgan Kaufmann Publishers, 1993.

  8. G. F. Luger and W. A. Stubblefield. "Artificial Intelligence: Structures and Strategies for Complex Problem Solving" 3rd edition, Addison-Wesley, 1998.

  9. M.R. Genesereth and N. Nilsson. "Logical Foundations of Artificial Intelligence" Morgan Kaufmann, 1987.

Lisp/Prolog Textbooks and Manuals

  1. G. L. Steele Jr. "Common Lisp: The Language" 2nd edition, Digital Press, 1990. (ISBN 1-55558-041-6)
    This reference is online.

  2. Patrick H. Winston and Berthold K.P. Horn. "Lisp" 3rd edition.

  3. L. Sterling, E. Shapiro. "The Art of Prolog" MIT Press, 1986.

WARNING:

Small changes to this syllabus may be made during the course of the term.

OTHER AI/ML RESOURCES ONLINE: