CS4341 Introduction to Artificial Intelligence. C2000
Exam 2 - February 29, 2000
Instructions
- Show your work
- Justify your answers
- Use the spaces provided to write your answers
- Write your name on each page
- Ask in case of doubt
NAME: _____________________________________________________
Problem I. Constraint Satisfaction (20 points)
Suppose that a medical doctor needs your help constructing
a constraint net to find prescriptions for his/her flu patients.
The following table shows the symptoms exhibited by flu patients
and the list of medications that can be prescribed for each symptom.
Symptom Medications
Headache H1, H2, H3
Runny nose R1, R2
Fever F1, F2
Sore throat S1, S2
Assume also that the FDA has found that each of the following sets of
medications presents dangerous drug interactions, and so drugs in each
set should not be prescribed together.
Drugs that must NOT be prescribed together
- {H1,S1}
- {H3,S2}
- {H2,R2}
- {F1,S1}
On the other hand, the following sets of drugs work in conjunction, and so
if one of them is prescribed then the other must be prescribed as well.
Drugs that must be prescribed together
Answer the following questions:
- (10 points) Construct a constraint net that helps the medical doctor
find appropriate prescriptions for his/her flu patients.
Explicitly provide the constraint matrix associated with each link in the net.
- (10 points)
A patient may be allergic to some drugs. Hence, an appropriate
prescription for a flu patient should consist of four medications,
one for each symptom, that satisfy the constraints above and
to which the patient is not allergic.
Find an appropriate prescription
for Lucy, who is allergic to F1 and to R2.
Describe in detail each of the steps of your problem-solving strategy:
- Arc Consistency Check
- Depth First Search
- Selection of the 1st node
- Selection of each consecutive node
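For illustration only, here is a minimal Python sketch of one way this problem could be encoded as a constraint net and pruned with arc consistency. The variable and function names are illustrative choices, not part of the exam; any "must be prescribed together" constraints would be added analogously, and exhaustive enumeration stands in for a real depth-first search on these tiny domains.

```python
# Illustrative sketch: one CSP variable per symptom, domains from the table,
# and the "must NOT be prescribed together" pairs as binary constraints.
from itertools import product

domains = {
    'Headache':   {'H1', 'H2', 'H3'},
    'RunnyNose':  {'R1', 'R2'},
    'Fever':      {'F1', 'F2'},
    'SoreThroat': {'S1', 'S2'},
}

# Pairs of drugs that must not appear in the same prescription.
forbidden = {frozenset(pair) for pair in
             [('H1', 'S1'), ('H3', 'S2'), ('H2', 'R2'), ('F1', 'S1')]}

def consistent(d1, d2):
    """Two drug choices are compatible unless the pair is forbidden."""
    return frozenset((d1, d2)) not in forbidden

def revise(xi, xj):
    """One arc-consistency step: drop values of xi with no support in xj."""
    removed = False
    for v in set(domains[xi]):
        if not any(consistent(v, w) for w in domains[xj]):
            domains[xi].discard(v)
            removed = True
    return removed

# Allergies (e.g. Lucy's) simply shrink the corresponding domains up front.
domains['Fever'].discard('F1')
domains['RunnyNose'].discard('R2')

# Exhaustive enumeration stands in for depth-first search on these tiny domains.
for choice in product(*(sorted(domains[v]) for v in sorted(domains))):
    if all(consistent(a, b) for a, b in product(choice, repeat=2) if a != b):
        print(dict(zip(sorted(domains), choice)))
        break
```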
Problem II. Logic-Based Systems (20 points)
Consider the following set of axioms:
- forall x [equal(x,x)]
- forall y,z [equal(y,z) -> equal(z,y)]
- forall w,s,t [equal(w,s) and equal(s,t) -> equal(w,t)]
- equal(b,a)
- equal(b,c)
And the conclusion:
As usual, a, b, and c denote constants; and x, y, z, w, s, and t denote
variables.
Prove the conclusion from the axioms by refutation using resolution:
- (10 points) Translate the axioms and the negation of the conclusion
into clausal form:
-
-
-
-
-
- (negated conclusion:)
- (10 points) Apply resolution (state explicitly which substitutions are
made):
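For illustration only, here is a minimal Python sketch of a single resolution step with an explicit substitution. The clause encoding (sign, predicate, arguments) is an illustrative choice, and the substitution shown is supplied by hand rather than computed by a unification routine; the example resolves the clausal form of the symmetry axiom against the unit clause equal(b,a).

```python
# Illustrative sketch: a clause is a set of literals; a literal is a tuple
# (positive?, predicate, arguments).  The substitution theta maps variables
# to terms and is given by hand here (no unification algorithm is shown).
def apply_subst(literal, theta):
    sign, pred, args = literal
    return (sign, pred, tuple(theta.get(a, a) for a in args))

def resolve(clause1, clause2, lit1, lit2, theta):
    """Resolve clause1 and clause2 on the complementary literals lit1, lit2
    under substitution theta, returning the resolvent clause."""
    l1, l2 = apply_subst(lit1, theta), apply_subst(lit2, theta)
    assert l1[0] != l2[0] and l1[1:] == l2[1:], "literals must be complementary"
    return ({apply_subst(l, theta) for l in clause1 if l != lit1} |
            {apply_subst(l, theta) for l in clause2 if l != lit2})

# Example: the symmetry axiom in clausal form, ~equal(y,z) v equal(z,y),
# resolved against the unit clause equal(b,a) with theta = {y: b, z: a}.
symmetry = {(False, 'equal', ('y', 'z')), (True, 'equal', ('z', 'y'))}
fact     = {(True, 'equal', ('b', 'a'))}
print(resolve(symmetry, fact,
              (False, 'equal', ('y', 'z')), (True, 'equal', ('b', 'a')),
              {'y': 'b', 'z': 'a'}))        # -> {(True, 'equal', ('a', 'b'))}
```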
Problem III. Learning - Decision Trees (20 points)
The following table contains training examples that help a robot janitor
predict whether or not an office contains a recycling bin.
| # | STATUS  | DEPT. | OFFICE SIZE | RECYCLING BIN? |
|---|---------|-------|-------------|----------------|
| 1 | faculty | ee    | large       | no             |
| 2 | staff   | ee    | small       | no             |
| 3 | faculty | cs    | medium      | yes            |
| 4 | student | ee    | large       | yes            |
| 5 | staff   | cs    | medium      | no             |
| 6 | faculty | cs    | large       | yes            |
| 7 | student | ee    | small       | yes            |
| 8 | staff   | cs    | medium      | no             |
- (15 points)
Use information theory to construct a minimal decision tree that
predicts whether or not an office contains a recycling bin.
Show each step of the computation.
| x        | 1 | 2/3  | 1/3  | 1/2 |
|----------|---|------|------|-----|
| log_2(x) | 0 | -0.6 | -1.6 | -1  |
- (5 points)
Translate your decision tree into a collection of decision rules.
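For reference, here is a minimal Python sketch of the information-gain arithmetic on the table above (illustration only; the attribute names and code layout are editorial choices, and the rounded log values in the small table may be used instead when working by hand).

```python
# Illustrative sketch: entropy and information gain for the recycling-bin data.
from math import log2

data = [
    ('faculty', 'ee', 'large',  'no'),
    ('staff',   'ee', 'small',  'no'),
    ('faculty', 'cs', 'medium', 'yes'),
    ('student', 'ee', 'large',  'yes'),
    ('staff',   'cs', 'medium', 'no'),
    ('faculty', 'cs', 'large',  'yes'),
    ('student', 'ee', 'small',  'yes'),
    ('staff',   'cs', 'medium', 'no'),
]
ATTRS = {'status': 0, 'dept': 1, 'size': 2}

def entropy(rows):
    """Entropy of the yes/no label over a set of rows."""
    if not rows:
        return 0.0
    p = sum(1 for r in rows if r[-1] == 'yes') / len(rows)
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def remainder(attr):
    """Weighted entropy left after splitting on one attribute."""
    i, total = ATTRS[attr], 0.0
    for v in {r[i] for r in data}:
        subset = [r for r in data if r[i] == v]
        total += len(subset) / len(data) * entropy(subset)
    return total

for attr in ATTRS:
    print(attr, round(entropy(data) - remainder(attr), 3))  # information gain
```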
NAME: _____________________________________________________
Problem IV. Learning - Neural Networks (20 points)
Consider the following function of two variables:
A B Desired Output
0 0 1
1 0 0
0 1 0
1 1 1
- (5 points) Prove that this function cannot be learned by a single
perceptron that uses the step function as its activation function.
- (15 points) Explain how you would train a neural network (in which
each unit uses the sigmoid function as its activation function)
to learn this function by describing in detail each of the steps below:
- (5 points) Topology of your neural net:
- How many hidden layers would you use?
- How many hidden units per layer?
- How many connections would your net have?
- Draw the neural net that would result from your choices above.
- (10 points) Training your neural net:
- How would you select the initial weights of the connections?
- Explain how the error back propagation algorithm would work
when applied to your neural net.
- When would you stop the iterations of the error back propagation
algorithm?
- What would you do if the trained neural net does not generate
the desired outputs?
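For illustration only, here is a minimal numpy sketch of one possible answer shape: a 2-2-1 network of sigmoid units trained by error back propagation with plain batch gradient descent. The layer sizes, learning rate, and epoch count are illustrative choices, and training may need a different random seed or more epochs before the outputs approach the targets.

```python
# Illustrative sketch: 2 inputs, one hidden layer of 2 sigmoid units,
# 1 sigmoid output unit, trained on the four examples above.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
y = np.array([[1], [0], [0], [1]], dtype=float)      # desired outputs

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Small random initial weights; biases start at zero.
W1, b1 = rng.normal(0.0, 1.0, (2, 2)), np.zeros((1, 2))
W2, b2 = rng.normal(0.0, 1.0, (2, 1)), np.zeros((1, 1))
lr = 1.0

for _ in range(20000):
    h = sigmoid(X @ W1 + b1)                 # forward pass: hidden layer
    out = sigmoid(h @ W2 + b2)               # forward pass: output unit
    d_out = (out - y) * out * (1 - out)      # backprop: output deltas
    d_h = (d_out @ W2.T) * h * (1 - h)       # backprop: hidden deltas
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

# After training, outputs should be close to 1, 0, 0, 1 (if training succeeded).
print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))
```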
Problem V. Learning - Genetic Algorithms (20 points)
Consider a genetic algorithm in which the rank method is used
to select candidates.
Recall that, given a fixed value p between 0 and 1,
the probability of picking the first-ranked candidate is p;
the probability of picking the second-ranked candidate is
p times the remaining probability (1-p), that is p*(1-p);
the probability of picking the third-ranked candidate is
p times the remaining probability (1-(p + p*(1-p))), that is
p*(1-(p + p*(1-p))) = p*(1-p)^2;
and so on, until only one candidate is left, in which case this candidate
receives the remaining probability, so that the probabilities of all
the individuals in the population sum up to 1.
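For illustration only, a minimal Python sketch of the selection probabilities just described; the function name is an illustrative choice, and the values in the example call are not the values asked about in the questions below.

```python
def rank_probabilities(n, p):
    """Probability of selecting each of n rank-ordered candidates: each
    candidate gets p times the probability not yet handed out, and the
    last candidate absorbs whatever remains so the total is 1."""
    probs, remaining = [], 1.0
    for rank in range(1, n + 1):
        if rank < n:
            probs.append(p * remaining)
            remaining -= p * remaining
        else:
            probs.append(remaining)   # last candidate gets the remainder
    return probs

# Example (illustrative values): four candidates with p = 0.5.
print(rank_probabilities(4, 0.5))   # [0.5, 0.25, 0.125, 0.125]
```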
- (7 points) Suppose that you have five candidates. You decide to select
one using the rank method with p=0.25. Compute the probabilities
for each of the five as a function of rank.
Individuals Rank Rank Fitness
-----------------------------------------
C1 1st 0.25
C2 2nd
C3 3rd
C4 4th
C5 5th
- (7 points) What is peculiar
about the rank fitnesses that you computed in the previous part?
- (6 points) Can you avoid the peculiarity you observed in part 2 by
restricting the probability p to a particular range?
Your answer should be independent of
the number of individuals in the population.