## CS4341 Introduction to Artificial Intelligence. C2000 Exam 2 - February 29, 2000

### Prof. Carolina Ruiz
Department of Computer Science, Worcester Polytechnic Institute

#### Instructions

• Write your name on each page
• Ask in case of doubt

### NAME: _____________________________________________________

#### Problem I. Constraint Satisfaction (20 points)

Suppose that a medical doctor needs your help constructing a constraint net to find prescriptions for his/her flu patients. The following table shows the symptoms exhibited by flu patients and the list of medications that can be prescribed for each symptom.
```
Symptom                Medications

Headache               H1, H2, H3
Runny nose             R1, R2
Fever                  F1, F2
Sore throat            S1, S2
```
Assume also that the FDA has found that each of the following sets of medications presents dangerous drug interactions, so the drugs in each set must not be prescribed together.
##### Drugs that must NOT be prescribed together
• {H1,S1}
• {H3,S2}
• {H2,R2}
• {F1,S1}
On the other hand, the following sets of drugs work in conjunction, so if one drug in a set is prescribed then the other must be prescribed as well.
##### Drugs that must be prescribed together
• {H2,S2}
• {S2,F1}
1. (10 points) Construct a constraint net that helps the medical doctor find appropriate prescriptions for his/her flu patients. Explicitly provide the constraint matrix associated with each link in the net.

2. (10 points) A patient may be allergic to some drugs. Hence, an appropriate prescription for a flu patient should consist of 4 medications, one for each symptom, that satisfy the constraints above and to which the patient is not allergic. Find an appropriate prescription for Lucy, who is allergic to F1 and to R2. Describe in detail each of the steps of your problem-solving strategy:

1. Arc Consistency Check
```
```

2. Depth First Search
1. Selection of the 1st node
```
```
2. Selection of each consecutive node
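
For reference, the search over this constraint net can be sketched in Python. This is a brute-force enumeration, not the arc-consistency-plus-DFS procedure the problem asks you to describe; the `Headache` domain (H1, H2, H3) is inferred from the H-drugs appearing in the constraint lists above.

```python
from itertools import product

# One variable per symptom; the Headache row is inferred from the
# H-drugs that appear in the FDA constraint sets.
domains = {
    "headache":    ["H1", "H2", "H3"],
    "runny_nose":  ["R1", "R2"],
    "fever":       ["F1", "F2"],
    "sore_throat": ["S1", "S2"],
}

forbidden = [{"H1", "S1"}, {"H3", "S2"}, {"H2", "R2"}, {"F1", "S1"}]
together  = [{"H2", "S2"}, {"S2", "F1"}]   # prescribe both or neither

def ok(drugs):
    s = set(drugs)
    if any(pair <= s for pair in forbidden):   # a forbidden pair is included
        return False
    # "must be prescribed together": one drug present without its partner fails
    return all(len(pair & s) != 1 for pair in together)

allergies = {"F1", "R2"}   # Lucy's allergies
choices = [[d for d in dom if d not in allergies]
           for dom in domains.values()]
solutions = [c for c in product(*choices) if ok(c)]
```

Under these assumptions the only surviving assignment is (H3, R1, F2, S1), which matches what arc consistency followed by depth-first search would produce.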

#### Problem II. Logic-Based Systems (20 points)

Consider the following set of axioms:
1. forall x [equal(x,x)]

2. forall y,z [equal(y,z) -> equal(z,y)]

3. forall w,s,t [equal(w,s) and equal(s,t) -> equal(w,t)]

4. equal(b,a)

5. equal(b,c)
And the conclusion:
• equal(c,a)
As usual, a, b, and c denote constants; and x, y, z, w, s, and t denote variables.

Prove the conclusion from the axioms by refutation using resolution:

• (10 points) Translate the axioms and the negation of the conclusion into clausal form:

1.
```
```
2.
```
```
3.
```
```
4.
```
```
5.
```
```
6. (negated conclusion)
```
```

• (10 points) Apply resolution (state explicitly which substitutions are made):
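
One way to check a refutation like the one requested is to mechanize each resolution step. The sketch below hard-codes the substitutions for one possible proof rather than performing general unification; the literal encoding (a negation flag plus an argument tuple, all literals over `equal`) is my own.

```python
# A literal is (negated, args): (True, ("b","c")) means ~equal(b,c).
def subst(lit, sigma):
    neg, args = lit
    return (neg, tuple(sigma.get(a, a) for a in args))

def resolve(c1, c2, l1, l2, sigma):
    """Resolve clauses c1, c2 on complementary literals l1, l2 after
    applying the hand-chosen substitution sigma (no unification)."""
    a1 = [subst(l, sigma) for l in c1]
    a2 = [subst(l, sigma) for l in c2]
    s1, s2 = subst(l1, sigma), subst(l2, sigma)
    assert s1[0] != s2[0] and s1[1] == s2[1], "literals not complementary"
    a1.remove(s1); a2.remove(s2)
    return a1 + a2

C2 = [(True, ("y", "z")), (False, ("z", "y"))]                       # symmetry
C3 = [(True, ("w", "s")), (True, ("s", "t")), (False, ("w", "t"))]   # transitivity
C4 = [(False, ("b", "a"))]                                           # equal(b,a)
C5 = [(False, ("b", "c"))]                                           # equal(b,c)
C6 = [(True, ("c", "a"))]                                            # ~equal(c,a)

# C2 + C5 with {y/b, z/c}  =>  equal(c,b)
step1 = resolve(C2, C5, (True, ("y", "z")), (False, ("b", "c")),
                {"y": "b", "z": "c"})
# C3 + step1 with {w/c, s/b}  =>  ~equal(b,t) v equal(c,t)
step2 = resolve(C3, step1, (True, ("w", "s")), (False, ("c", "b")),
                {"w": "c", "s": "b"})
# step2 + C4 with {t/a}  =>  equal(c,a)
step3 = resolve(step2, C4, (True, ("b", "t")), (False, ("b", "a")),
                {"t": "a"})
# step3 + C6  =>  empty clause: contradiction, refutation complete
empty = resolve(step3, C6, (False, ("c", "a")), (True, ("c", "a")), {})
```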

#### Problem III. Learning - Decision Trees (20 points)

The following table contains training examples that help a robot janitor predict whether or not an office contains a recycling bin.

```
     STATUS    DEPT.   OFFICE SIZE   RECYCLING BIN?
---------------------------------------------------
1.   faculty   ee      large         no
2.   staff     ee      small         no
3.   faculty   cs      medium        yes
4.   student   ee      large         yes
5.   staff     cs      medium        no
6.   faculty   cs      large         yes
7.   student   ee      small         yes
8.   staff     cs      medium        no
```

1. (15 points) Use information theory to construct a minimal decision tree that predicts whether or not an office contains a recycling bin. Show each step of the computation.
```
    x        1    2/3    1/3    1/2
-----------------------------------
log_2(x)     0   -0.6   -1.6     -1
```
2. (5 points) Translate your decision tree into a collection of decision rules.
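
As a cross-check on the information-theory computation, the weighted average disorder of each attribute can be computed directly. This sketch uses exact logarithms rather than the rounded values in the table above.

```python
from math import log2
from collections import Counter

data = [  # (status, dept, size, recycling bin?)
    ("faculty", "ee", "large",  "no"),  ("staff",   "ee", "small",  "no"),
    ("faculty", "cs", "medium", "yes"), ("student", "ee", "large",  "yes"),
    ("staff",   "cs", "medium", "no"),  ("faculty", "cs", "large",  "yes"),
    ("student", "ee", "small",  "yes"), ("staff",   "cs", "medium", "no"),
]

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def avg_disorder(attr_idx):
    """Weighted average entropy of the class label after splitting
    the examples on the attribute at position attr_idx."""
    groups = {}
    for row in data:
        groups.setdefault(row[attr_idx], []).append(row[3])
    return sum(len(g) / len(data) * entropy(g) for g in groups.values())

disorder = {name: avg_disorder(i)
            for i, name in enumerate(["STATUS", "DEPT", "OFFICE SIZE"])}
```

Splitting on DEPT leaves both branches at entropy 1 (no information gained), while STATUS yields the lowest average disorder, so it is the natural root attribute.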


#### Problem IV. Learning - Neural Networks (20 points)

Consider the following function of two variables:
```
A  B   Desired Output
---------------------
0  0         1
1  0         0
0  1         0
1  1         1
```
1. (5 points) Prove that this function cannot be learned by a single perceptron that uses the step function as its activation function.
2. (15 points) Explain how you would train a neural network (in which each unit uses the sigmoid function as its activation function) to learn this function by describing in detail each of the steps below:
• (5 points) Topology of your neural net:
1. How many hidden layers would you use?
2. How many hidden units per layer?
3. How many connections would your net have?
4. Draw the neural net that would result from your choices above.
• (10 points) Training your neural net:
1. How would you select the initial weights of the connections?
2. Explain how the error back propagation algorithm would work when applied to your neural net.
3. When would you stop the iterations of the error back propagation algorithm?
4. What would you do if the trained neural net does not generate the desired outputs?
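
The linear-inseparability argument behind part 1 can be illustrated (though not proved) by brute force: no perceptron weights on a grid reproduce the table. The grid bounds below are arbitrary choices of mine; the comment sketches the algebraic contradiction that holds for all real weights.

```python
import itertools

def step(x):
    return 1 if x > 0 else 0   # step activation, threshold at 0

# The function from the problem statement (equivalence / XNOR).
target = {(0, 0): 1, (1, 0): 0, (0, 1): 0, (1, 1): 1}

# Algebraically: (0,0)->1 forces bias > 0; (1,0)->0 and (0,1)->0 force
# w1 <= -bias and w2 <= -bias, hence w1 + w2 + bias <= -bias < 0,
# contradicting (1,1)->1. The grid search merely illustrates this.
grid = [i / 2 for i in range(-8, 9)]   # weights and bias in [-4, 4]
found = any(
    all(step(w1 * a + w2 * b + bias) == out
        for (a, b), out in target.items())
    for w1, w2, bias in itertools.product(grid, repeat=3)
)
```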

#### Problem V. Learning - Genetic Algorithms (20 points)

Consider a genetic algorithm in which the rank method is used to select candidates. Recall that, given a fixed value p between 0 and 1, the first-ranked candidate is picked with probability p; the second-ranked candidate with probability p times the remaining probability (1-p), that is, p*(1-p); the third-ranked candidate with probability p times the probability remaining after that, 1-(p + p*(1-p)), that is, p*(1-(p + p*(1-p))) = p*(1-p)*(1-p); and so on. In general, the k-th ranked candidate is picked with probability p*(1-p)^(k-1), until only one candidate is left; that last candidate receives all the remaining probability, so that the probabilities of all the individuals in the population sum to 1.
1. (7 points) Suppose that you have five candidates. You decide to select one using the rank method with p=0.25. Compute the probability of selecting each of the five candidates as a function of rank.
```
Individuals     Rank      Rank Fitness
--------------------------------------
C1              1st       0.25

C2              2nd

C3              3rd

C4              4th

C5              5th
```
2. (7 points) What is peculiar about the rank fitnesses that you computed in the previous part?
```
```
3. (6 points) Can you avoid the peculiarity you observed in part 2 by restricting the probability p to a particular range? Your answer should be independent of the number of individuals in the population.
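
The rank-method probabilities described above can be sketched as follows (the function name is mine); computing them this way also makes it easy to check that they sum to 1 for any p and population size.

```python
def rank_probabilities(n, p):
    """Rank method: the k-th ranked candidate (k = 1..n-1) is picked
    with probability p*(1-p)**(k-1); the last-ranked candidate absorbs
    whatever probability remains, so the total is exactly 1."""
    probs = [p * (1 - p) ** k for k in range(n - 1)]
    probs.append(1 - sum(probs))   # remainder goes to the last candidate
    return probs

probs = rank_probabilities(5, 0.25)   # five candidates, p = 0.25
```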