Knowledge Elicitation Tool Classification

 

Janet E. Burge

Artificial Intelligence Research Group

Worcester Polytechnic Institute

 

 

1 Knowledge Elicitation Methods

2 KE Methods by Interaction Type

2.1 Interviewing
2.2 Case Study
2.3 Protocols
2.4 Critiquing
2.5 Role Playing
2.6 Simulation
2.7 Prototyping
2.8 Teachback
2.9 Observation
2.10 Goal Related
2.11 List Related
2.12 Construct Elicitation
2.13 Sorting
2.14 Laddering
2.15 20 Questions
2.16 Document Analysis

3 KE Methods by Knowledge Type Obtained

3.1 Procedures
3.2 Problem Solving Strategy
3.3 Goals/Subgoals
3.4 Classification
3.5 Dependencies/Relationships
3.6 Evaluation

4 KE Techniques by Design Knowledge Type

4.1 Choosing KE Techniques for Design
4.2 Knowledge Types for Design
4.2.1 Needs and Desires
4.2.2 Requirements Formation Knowledge
4.2.3 Problem Specification Knowledge
4.2.4 Problem Solving Knowledge
4.2.5 Solution Analysis Knowledge
4.2.6 Documentation and Rationale Recovery Knowledge
4.2.7 Presentation Knowledge
4.3 Design Plan Knowledge

5 References

 

 

 

Table 1. KE Techniques Grouped by Interaction Type
Table 2. Interview Methods
Table 3. Case Study Methods
Table 4. Protocol Methods
Table 5. Critiquing Methods
Table 6. Role Playing Methods
Table 7. Simulation Methods
Table 8. Prototyping Methods
Table 9. Teachback Methods
Table 10. Observation Methods
Table 11. Goal Related Methods
Table 12. List Related Methods
Table 13. Construct Elicitation Methods
Table 14. Sorting Methods
Table 15. Laddering Methods
Table 16. 20 Questions Method
Table 17. Document Analysis Methods
Table 18. Methods that Elicit Procedures
Table 19. Methods that Elicit Problem Solving Strategy
Table 20. Methods that Elicit Goals/Subgoals
Table 21. Methods that Elicit Classification of Domain Entities
Table 22. Methods that Elicit Relationships
Table 23. Methods that Elicit Evaluations

 

 

 

 

1. Knowledge Elicitation Methods

Many Knowledge Elicitation (KE) methods have been used to obtain the information required to solve problems. These methods can be classified in many ways. One common way is by how directly they obtain information from the domain expert. Direct methods involve directly questioning a domain expert about how they do their job. For these methods to be successful, the domain expert has to be reasonably articulate and willing to share information. The information also has to be easy for the expert to express, which is often difficult because frequently performed tasks tend to become 'automatic.' Indirect methods are used to obtain information that cannot easily be expressed directly.

     

    Three other ways of classifying methods are discussed in this document. One classifies the methods by how they interact with the domain expert. Another classifies them by what type of information is obtained. The third classifies them according to the type of design knowledge obtained.

     

    Other factors that influence the choice of KE method are the amount of domain knowledge required by the knowledge engineer and the effort required to analyze the data.

     

     

2. KE Methods by Interaction Type

    There are many ways of grouping KE methods. One is to group them by the type of interaction with the domain expert. Table 1 shows the categories and the type of information produced.

     

    Table 1. KE Techniques Grouped by Interaction Type

     

| Category | Examples | Type | Results |
|---|---|---|---|
| Interview | Structured, Unstructured, Semi-Structured | Direct | Varies depending on questions asked |
| Case Study | Critical Incident Method, Forward Scenario Simulation, Critical Decision Method | Direct | Procedures followed, rationale |
| Protocols | Protocol Analysis | Direct | Procedures followed, rationale |
| Critiquing | Critiquing | Direct | Evaluation of problem solving strategy compared to alternatives |
| Role Playing | Role Playing | Indirect | Procedures, difficulties encountered due to role |
| Simulation | Simulation, Wizard of Oz | Direct | Procedures followed |
| Prototyping | Rapid Prototyping, Storyboarding | Direct | Evaluation of proposed approach |
| Teachback | Teachback | Direct | Correction of misconceptions |
| Observation | Observation | Direct | Procedure followed |
| Goal Related | Goal Decomposition, Dividing the Domain | Direct | Goals and subgoals, groupings of goals |
| List Related | Decision Analysis | Direct | Estimate of worth of all decisions for a task |
| Construct Elicitation | Repertory Grid, Multi-dimensional Scaling | Indirect | Entities, attributes, sometimes relationships |
| Sorting | Card Sorting | Indirect | Classification of entities (dimension chosen by subject) |
| Laddering | Laddered Grid | Indirect | Hierarchical map of the task domain |
| 20 Questions | 20 Questions | Indirect | Information used to solve problems, organization of problem space |
| Document Analysis | Document Analysis | Indirect (usually) | Varies depending on available documents, interaction with experts |

     

     

     

2.1 Interviewing

Interviewing consists of asking the domain expert questions about the domain of interest and how they perform their tasks. Interviews can be unstructured, semi-structured, or structured. The success of an interview session depends on the questions asked (it is difficult to know which questions should be asked, particularly if the interviewer is not familiar with the domain) and on the ability of the expert to articulate their knowledge. The expert may not remember exactly how they perform a task, especially if it is one that they perform automatically. Some interview methods are used to build a particular type of model of the task. The model is built by the knowledge engineer based on information obtained during the interview and then reviewed with the domain expert. In some cases, the models can be built interactively with the expert, especially if there are software tools available for model creation. Table 2 shows a list of interview methods.

       

       

       

      Table 2. Interview Methods

       

| Method | Type | Output | Reference |
|---|---|---|---|
| Interviewing (structured, unstructured, semi-structured) | Direct | Procedures followed, knowledge used (easily verbalized knowledge) | [Hudlicka, 1997], [Geiwitz, et al., 1990] |
| Concept Mapping | Direct | Procedures followed | [Hudlicka, 1997], [Thordsen, 1991], [Gowin & Novak, 1984] |
| Interruption Analysis | Direct | Procedures, problem-solving strategy, rationale | [Hudlicka, 1997] |
| ARK (ACT-based representation of knowledge) (combination of methods) | Direct | Goal-subgoal network, including production rules describing goal/subgoal relationships | [Geiwitz, et al., 1990] |
| Cognitive Structure Analysis (CSA) | Direct | Representational format of expert's knowledge; content of the knowledge structure | [Geiwitz, et al., 1990] |
| Problem discussion | Direct | Solution strategies | [Geiwitz, et al., 1990] |
| Tutorial interview | Direct | Whatever the expert teaches | [Geiwitz, et al., 1990] |
| Uncertain information elicitation | Direct | Uncertainty about problems | [Geiwitz, et al., 1990] |
| Data flow modeling | Direct | Data flow diagram (data items and data flow between them; no sequence information) | [OTT, 1998], [Gane & Sarson, 1977] |
| Entity-relationship modeling | Direct | Entity relationship diagram (entities, attributes, and relationships) | [OTT, 1998], [Swaffield & Knight, 1990] |
| Entity life modeling | Direct | Entity life cycle diagram (entities and state changes) | [OTT, 1998], [Swaffield & Knight, 1990] |
| Object oriented modeling | Direct | Network of objects (types, attributes, relations) | [OTT, 1998], [Riekert, 1991] |
| Semantic nets | Direct | Semantic net (including relationships between objects) | [OTT, 1998], [Atkinson, 1990] |
| IDEF modeling | Direct | IDEF model (functional decomposition) | [OTT, 1998], [McNeese & Zaff, 1991] |
| Petri nets | Direct | Functional task net | [OTT, 1998], [Coovert et al., 1990], [Hura, 1987], [Weingaertner & Lewis, 1988] |
| Questionnaire | Direct | Sequence of task actions, cause and effect relationships | [OTT, 1998], [Bainbridge, 1979] |
| Task action mapping | Direct | Decision flow diagram (goals, subgoals, actions) | [OTT, 1998], [Coury et al., 1991] |
| User Needs Analysis (decision process diagrams) | Direct | Decision process diagrams | [OTT, 1998], [Coury et al., 1991] |

       

       

2.2 Case Study

In Case Study methods, different examples of problems/tasks within a domain are discussed. The problems consist of specific cases that can be typical, difficult, or memorable. These cases are used as a context within which directed questions are asked. Table 3 shows a list of methods that use cases to obtain information.

       

      Table 3. Case Study Methods

       

       

| Method | Type | Output | Reference |
|---|---|---|---|
| Retrospective case description | Direct | Procedures followed | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Critical incident strategy | Direct | Complete plan, plus factors that influenced the plan | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Forward scenario simulation | Direct | Procedures followed, reasons behind them | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Critical Decision Method | Direct | Goals considered, options generated, situation assessment | [Hudlicka, 1997], [Thordsen, 1991], [Klein et al., 1986] |
| Retrospective case description | Direct | Procedures used to solve past problems | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Interesting cases | Direct | Procedures used to solve unusual problems | [Geiwitz, et al., 1990], [Cordingley, 1989] |

       

2.3 Protocols

Protocol analysis [Ericsson and Simon, 1984] involves asking the expert to perform a task while "thinking aloud." The intent is to capture both the actions performed and the mental process used to determine these actions. As with all direct methods, the success of protocol analysis depends on the ability of the expert to describe why they are making their decisions. In some cases, the expert may not remember why they do things a certain way. In many cases, the verbalized thoughts will be only a subset of the actual knowledge used to perform the task. One method used to augment this information is interruption analysis: the knowledge engineer interrupts the expert at critical points in the task to ask why they performed a particular action.

       

For design, protocol analysis would involve asking the expert to perform the design task. This may or may not be possible, depending on what is being designed and the length of time normally required to perform a design task. Interruption analysis would be useful in determining why subtasks are performed in a particular order. One disadvantage, however, is that the questions could distract the expert enough that they make mistakes or start "second guessing" their own decisions.

       

If time and resources were available, it would be interesting to perform protocol analysis of the same task with multiple experts, noting any differences in ordering. This could capture both alternative orderings and, after questioning the experts, the rationale for their decisions.

       

Table 4 lists protocol analysis methods.

       

      Table 4. Protocol Methods

       

| Method | Type | Output | Reference |
|---|---|---|---|
| Protocol analysis (think aloud, talk aloud, eidetic reduction, retrospective reporting, behavioral descriptions, playback) | Direct | Procedures, problem-solving strategy | [Hudlicka, 1997], [Ericsson & Simon, 1984], [Geiwitz, et al., 1990] |

       

       

2.4 Critiquing

In Critiquing, an approach to the problem/task is evaluated by the expert. This is used to check the validity of the results of previous KE sessions. Table 5 lists critiquing methods.

       

      Table 5. Critiquing Methods

       

| Method | Type | Output | Reference |
|---|---|---|---|
| Critiquing | Direct | Evaluation of a problem solving strategy compared to alternatives | [Geiwitz, et al., 1990], [Cordingley, 1989] |

       

       

2.5 Role Playing

In Role Playing, the expert adopts a role and acts out a scenario where their knowledge is used [Geiwitz, et al., 1990]. The intent is that by viewing a situation from a different perspective, information will be revealed that was not discussed when the expert was asked directly. Table 6 shows role playing methods.

       

      Table 6. Role Playing Methods

       

| Method | Type | Output | Reference |
|---|---|---|---|
| Role playing | Indirect | Procedures, difficulties encountered due to role | [Geiwitz, et al., 1990], [Cordingley, 1989] |

       

2.6 Simulation

      In Simulation methods, the task is simulated using a computer system or other means. This is used when it is not possible to actually perform the task. Table 7 shows simulation methods.

       

      Table 7. Simulation Methods

       

| Method | Type | Output | Reference |
|---|---|---|---|
| Wizard of Oz | Direct | Procedures followed | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Simulations | Direct | Problem solving strategies, procedures | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Problem analysis | Direct | Procedures, rationale (like simulated interruption analysis) | [Geiwitz, et al., 1990] |

       

2.7 Prototyping

      In Prototyping, the expert is asked to evaluate a prototype of the proposed system being developed. This is usually done iteratively as the system is refined. Table 8 shows prototyping methods.

       

      Table 8. Prototyping Methods

       

| Method | Type | Output | Reference |
|---|---|---|---|
| System refinement | Direct | New test cases for a prototype system | [Geiwitz, et al., 1990] |
| System examination | Direct | Expert's opinion on the prototype's rules and control structures | [Geiwitz, et al., 1990] |
| System validation | Direct | Outside experts' evaluation of cases solved by the expert and the prototype system | [Geiwitz, et al., 1990] |
| Rapid prototyping | Direct | Evaluation of system/procedure | [Geiwitz, et al., 1990], [Diaper, 1989] |
| Storyboarding | Direct | Prototype display design | [OTT, 1998], [McNeese & Zaff, 1991] |

       

2.8 Teachback

      In Teachback, the knowledge engineer attempts to teach the information back to the expert, who then provides corrections and fills in gaps. Table 9 shows teachback methods.

       

      Table 9. Teachback Methods

       

| Method | Type | Output | Reference |
|---|---|---|---|
| Teachback | Direct | Correction of misconceptions | [Geiwitz, et al., 1990], [Cordingley, 1989] |

       

2.9 Observation

      In Observation methods, the knowledge engineer observes the expert performing a task. This prevents the knowledge engineer from inadvertently interfering in the process, but does not provide any insight into why decisions are made. Table 10 shows observation methods.

       

      Table 10. Observation Methods

       

| Method | Type | Output | Reference |
|---|---|---|---|
| Discourse analysis (observation) | Direct | Taxonomy of tasks/subtasks or functions | [OTT, 1998], [Belkin & Brooks, 1988] |
| On-site observation | Direct | Procedures, problem solving strategies | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Active participation | Direct | Knowledge and skills needed for task | [Geiwitz, et al., 1990], [Cordingley, 1989] |

       

       

       

2.10 Goal Related

      In Goal Related methods, focused discussion techniques are used to elicit information about goals and subgoals. Table 11 shows goal related methods.

       

      Table 11. Goal Related Methods

       

| Method | Type | Output | Reference |
|---|---|---|---|
| Goal Decomposition | Direct | Goals and subgoals | [Geiwitz, et al., 1990] |
| Dividing the domain | Direct | How data is grouped to reach a goal | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Reclassification | Direct | Evidence needed to prove that a decision was correct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Distinguishing goals | Direct | Minimal sets of discriminating features | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Goal Directed Analysis (goal-means network) | Direct | Goal-means network | [OTT, 1998], [Woods & Hollnagel, 1987] |

       

       

2.11 List Related

      In List Related methods, the expert is asked to provide lists of information, usually decisions. Table 12 shows list related methods.

       

      Table 12. List Related Methods

       

| Method | Type | Output | Reference |
|---|---|---|---|
| Decision analysis | Direct | Estimate of worth for all possible decisions for a task | [Geiwitz, et al., 1990], [Cordingley, 1989] |

       

       

2.12 Construct Elicitation

Construct Elicitation methods are used to obtain information about how the expert discriminates between entities in the problem domain. The most commonly used construct elicitation method is Repertory Grid Analysis [Kelly, 1955]. For this method, the domain expert is presented with a list of entities and is asked to describe the similarities and differences between them. These similarities and differences are used to determine the important attributes of the entities. After completing the initial list of attributes, the knowledge engineer works with the domain expert to assign ratings to each entity/attribute pair. Table 13 shows construct elicitation methods; a small illustrative sketch of a repertory grid follows the table.

       

      Table 13. Construct Elicitation Methods

       

| Method | Type | Output | Reference |
|---|---|---|---|
| Repertory grid | Indirect | Attributes (and entities if provided by subject) | [Hudlicka, 1997], [Kelly, 1955] |
| Multi-dimensional scaling | Indirect | Attributes and relationships | |
| Proximity scaling | Indirect | Attributes and relationships | [Hudlicka, 1997] |
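To make the grid concrete: a repertory grid is essentially a matrix of ratings of entities against elicited constructs. The following minimal sketch (the entities, constructs, and ratings are hypothetical, not drawn from the sources above) stores such a grid and uses rating distance to suggest which entities the expert treats as similar.

```python
# Minimal repertory-grid sketch: entities rated 1-5 against bipolar
# constructs elicited from the expert. All names and ratings are hypothetical.
grid = {
    # construct (left pole vs. right pole) -> rating per entity
    "simple vs. complex":  {"pump A": 2, "pump B": 4, "pump C": 5},
    "cheap vs. expensive": {"pump A": 1, "pump B": 3, "pump C": 5},
    "standard vs. custom": {"pump A": 1, "pump B": 2, "pump C": 4},
}

def distance(grid, e1, e2):
    """Sum of absolute rating differences; smaller values suggest the
    expert discriminates less between the two entities."""
    return sum(abs(ratings[e1] - ratings[e2]) for ratings in grid.values())

print(distance(grid, "pump A", "pump B"))  # 5
print(distance(grid, "pump A", "pump C"))  # 10 -> A and B are more alike
```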

       

2.13 Sorting

In Sorting methods, domain entities are sorted to determine how the expert classifies their knowledge. Table 14 shows sorting methods; a sketch of one common way to analyze card sorts follows the table.

       

      Table 14. Sorting Methods

       

| Method | Type | Output | Reference |
|---|---|---|---|
| Card sorting | Indirect | Hierarchical cluster diagram (classification) | [Geiwitz, et al., 1990], [Cordingley, 1989] |
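One conventional way to analyze card sorts (an assumption here, not a procedure prescribed by the sources above) is to count how often the expert places two cards in the same pile across repeated sorts, convert the counts to distances, and cluster the result into the hierarchical diagram the table mentions.

```python
# Minimal card-sort analysis sketch; the cards and sorts are hypothetical.
from itertools import combinations
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import squareform

cards = ["valve", "pump", "gauge", "sensor"]
sorts = [  # each sort: the expert's piles (sets of cards)
    [{"valve", "pump"}, {"gauge", "sensor"}],
    [{"valve", "pump", "gauge"}, {"sensor"}],
]

n = len(cards)
co = np.zeros((n, n))  # co-occurrence counts
for piles in sorts:
    for pile in piles:
        for a, b in combinations(sorted(pile), 2):
            i, j = cards.index(a), cards.index(b)
            co[i, j] += 1
            co[j, i] += 1

dist = 1.0 - co / len(sorts)   # never grouped together -> distance 1
np.fill_diagonal(dist, 0.0)

# Average-linkage hierarchical clustering; pass the linkage matrix to
# scipy.cluster.hierarchy.dendrogram() to draw the cluster diagram.
print(linkage(squareform(dist), method="average"))
```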

       

       

2.14 Laddering

In Laddering, a hierarchical structure of the domain is formed by asking questions designed to move up, down, and across the hierarchy. Table 15 shows laddering methods; an illustrative sketch follows the table.

       

      Table 15. Laddering Methods

       

| Method | Type | Output | Reference |
|---|---|---|---|
| Laddered grid | Indirect | A hierarchical map of the task domain | [Geiwitz, et al., 1990], [Cordingley, 1989] |
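As an illustration, a laddered grid can be represented as a concept hierarchy grown from the expert's answers to downward ("how"), upward ("why"), and sideways ("what else") probes. The sketch below is a toy model; the probe wording and seed concept are hypothetical.

```python
# Minimal laddering sketch: a hierarchy grown from the expert's answers.
hierarchy = {"select pump": []}  # concept -> child concepts

def ladder_down(parent, answer):
    """Record the expert's answer to 'How do you <parent>?' as a child."""
    hierarchy.setdefault(parent, []).append(answer)
    hierarchy.setdefault(answer, [])

def probes(concept):
    """The three directions of movement through the hierarchy."""
    return [
        f"How do you '{concept}'?",                    # move down
        f"Why do you '{concept}'?",                    # move up
        f"What else do you do when you '{concept}'?",  # move across
    ]

ladder_down("select pump", "match flow rate")
ladder_down("select pump", "check duty cycle")
print(probes("match flow rate"))
print(hierarchy)
```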

       

2.15 20 Questions

This method is used to determine how the expert gathers information by having the expert ask the knowledge engineer questions. Table 16 shows the 20 questions method.

       

      Table 16. 20 Questions Method

       

| Method | Type | Output | Reference |
|---|---|---|---|
| 20 questions | Indirect | Amount and type of information used to solve problems; how the problem space is organized, or how the expert has represented task-relevant knowledge | [Cordingley, 1989], [Geiwitz, et al., 1990] |

       

2.16 Document Analysis

     

Document analysis involves gathering information from existing documentation. It may or may not involve interaction with a human expert to confirm or add to this information. Table 17 shows document analysis methods.

     

    Table 17. Document Analysis Methods

     

| Method | Type | Output | Reference |
|---|---|---|---|
| Collect artifacts of task performance | Indirect | How expert organizes or processes task information, how it is compiled to present to others | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Document analysis | Direct | Conceptual graph | [OTT, 1998], [Gordon et al., 1993] |
| Goal Directed Analysis (goal-means network) | Direct | Goal-means network | [OTT, 1998], [Woods & Hollnagel, 1987] |

     

     

3. KE Methods by Knowledge Type Obtained

 

Besides being grouped into direct and indirect categories, KE methods can also be grouped (to some extent) by the general type of knowledge obtained. For example, many of the indirect KE methods are best at obtaining classification knowledge, while direct methods are better suited to obtaining procedural knowledge. This does not, however, mean that the techniques cannot be used for other knowledge types. Since some designers may not be able to directly express how they perform a design task, it might be useful to use an indirect method in conjunction with a direct method to obtain this information.

 

Information types used here are:

  1. Procedures
  2. Problem solving strategy
  3. Goals/subgoals
  4. Classification
  5. Dependencies/relationships
  6. Evaluation

Many methods fit into more than one category and are listed more than once. Also, this classification shows the information most commonly extracted using a method and does not imply that only that type of information can be elicited.

 

 

3.1 Procedures

      These are methods that can be used to determine the steps followed to complete a task. Table 18 lists methods used to elicit procedures.

       

      Table 18. Methods that Elicit Procedures

       

| Method | Category | Output | Type | Reference |
|---|---|---|---|---|
| Interviewing (structured, unstructured, semi-structured) | Interview | Procedures followed, knowledge used | Direct | [Hudlicka, 1997], [Geiwitz, et al., 1990] |
| Concept Mapping | Interview | Procedures followed | Direct | [Hudlicka, 1997], [Thordsen, 1991], [Gowin & Novak, 1984] |
| Interruption Analysis | Interview | Procedures, problem-solving strategy, rationale | Direct | [Hudlicka, 1997] |
| Problem discussion | Interview | Solution strategies | Direct | [Geiwitz, et al., 1990] |
| Tutorial interview | Interview | Whatever the expert teaches | Direct | [Geiwitz, et al., 1990] |
| Entity life modeling | Interview | Entity life cycle diagram (entities and state changes) | Direct | [OTT, 1998], [Swaffield & Knight, 1990] |
| IDEF modeling | Interview | IDEF model (functional decomposition) | Direct | [OTT, 1998], [McNeese & Zaff, 1991] |
| Petri nets | Interview | Functional task net | Direct | [OTT, 1998], [Coovert et al., 1990], [Hura, 1987], [Weingaertner & Lewis, 1988] |
| Questionnaire | Interview | Sequence of task actions, cause and effect relationships | Direct | [OTT, 1998], [Bainbridge, 1979] |
| Task action mapping | Interview | Decision flow diagram (goals, subgoals, actions) | Direct | [OTT, 1998], [Coury et al., 1991] |
| Retrospective case description | Case Study | Procedures followed | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Critical incident strategy | Case Study | Complete plan, plus factors that influenced the plan | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Forward scenario simulation | Case Study | Procedures followed, reasons behind them | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Retrospective case description | Case Study | Procedures used to solve past problems | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Interesting cases | Case Study | Procedures used to solve unusual problems | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Protocol analysis (think aloud, talk aloud, eidetic reduction, retrospective reporting, behavioral descriptions, playback) | Protocols | Procedures, problem-solving strategy | Direct | [Hudlicka, 1997], [Ericsson & Simon, 1984], [Geiwitz, et al., 1990] |
| Teachback | Teachback | Correction of misconceptions | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Critiquing | Critiquing | Evaluation of a problem solving strategy compared to alternatives | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Role playing | Role Playing | Procedures, difficulties encountered due to role | Indirect | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Wizard of Oz | Simulation | Procedures followed | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Simulations | Simulation | Problem solving strategies, procedures | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Problem analysis | Simulation | Procedures, rationale (like simulated interruption analysis) | Direct | [Geiwitz, et al., 1990] |
| On-site observation | Observation | Procedures, problem solving strategies | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |

       

       

3.2 Problem Solving Strategy

      These methods attempt to determine how the expert makes their decisions. Table 19 lists methods that elicit a problem solving strategy.

       

      Table 19. Methods that Elicit Problem Solving Strategy

       

| Method | Category | Output | Type | Reference |
|---|---|---|---|---|
| Interviewing (structured, unstructured, semi-structured) | Interview | Procedures followed, knowledge used | Direct | [Hudlicka, 1997], [Geiwitz, et al., 1990] |
| Interruption Analysis | Interview | Procedures, problem-solving strategy, rationale | Direct | [Hudlicka, 1997] |
| Problem discussion | Interview | Solution strategies | Direct | [Geiwitz, et al., 1990] |
| Tutorial interview | Interview | Whatever the expert teaches | Direct | [Geiwitz, et al., 1990] |
| Uncertain information elicitation | Interview | Uncertainty about problems | Direct | [Geiwitz, et al., 1990] |
| Critical incident strategy | Case Study | Complete plan, plus factors that influenced the plan | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Forward scenario simulation | Case Study | Procedures followed, reasons behind them | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Protocol analysis (think aloud, talk aloud, eidetic reduction, retrospective reporting, behavioral descriptions, playback) | Protocols | Procedures, problem-solving strategy | Direct | [Hudlicka, 1997], [Ericsson & Simon, 1984], [Geiwitz, et al., 1990] |
| Critiquing | Critiquing | Evaluation of a problem solving strategy compared to alternatives | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Wizard of Oz | Simulation | Procedures followed | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Simulations | Simulation | Problem solving strategies, procedures | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Problem analysis | Simulation | Procedures, rationale (like simulated interruption analysis) | Direct | [Geiwitz, et al., 1990] |
| Reclassification | Goal Related | Evidence needed to prove that a decision was correct | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| On-site observation | Observation | Procedures, problem solving strategies | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Goal Directed Analysis (goal-means network) | Interview/Document Analysis | Goal-means network | Direct | [OTT, 1998], [Woods & Hollnagel, 1987] |
| 20 questions | 20 Questions | Amount and type of information used to solve problems; how the problem space is organized, or how the expert has represented task-relevant knowledge | Indirect | [Cordingley, 1989], [Geiwitz, et al., 1990] |
| Cloze experiments | | Model of decision-making rules and structures | Indirect | [Geiwitz, et al., 1990] |

       

3.3 Goals/Subgoals

These methods are concerned with extracting the goals and subgoals for performing the task. They are listed separately from procedures since ordering is not necessarily provided. Table 20 lists methods that elicit this information.

       

      Table 20. Methods that Elicit Goals/Subgoals

       

| Method | Category | Output | Type | Reference |
|---|---|---|---|---|
| ARK (ACT-based representation of knowledge) (combination of methods) | Interview | Goal-subgoal network, including production rules describing goal/subgoal relationships | Direct | [Geiwitz, et al., 1990] |
| Task action mapping | Interview | Decision flow diagram (goals, subgoals, actions) | Direct | [OTT, 1998], [Coury et al., 1991] |
| Critical Decision Method | Case Study | Goals considered, options generated, situation assessment | Direct | [Hudlicka, 1997], [Thordsen, 1991], [Klein et al., 1986] |
| Goal decomposition | Goal Related | Goals and subgoals | Direct | [Geiwitz, et al., 1990] |
| Dividing the domain | Goal Related | How data is grouped to reach a goal | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Reclassification | Goal Related | Evidence needed to prove that a decision was correct | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Distinguishing goals | Goal Related | Minimal sets of discriminating features | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Goal Directed Analysis (goal-means network) | Interview/Document Analysis | Goal-means network | Direct | [OTT, 1998], [Woods & Hollnagel, 1987] |

       

3.4 Classification

These methods are used to classify entities within a domain. Table 21 lists methods concerned with classification.

       

      Table 21. Methods that Elicit Classification of Domain Entities

       

| Method | Category | Output | Type | Reference |
|---|---|---|---|---|
| Cognitive Structure Analysis (CSA) | Interview | Representational format of expert's knowledge; content of the knowledge structure | Direct | [Geiwitz, et al., 1990] |
| Data flow modeling | Interview | Data flow diagram (data items and data flow between them; no sequence information) | Direct | [OTT, 1998], [Gane & Sarson, 1977] |
| Entity-relationship modeling | Interview | Entity relationship diagram (entities, attributes, and relationships) | Direct | [OTT, 1998], [Swaffield & Knight, 1990] |
| Entity life modeling | Interview | Entity life cycle diagram (entities and state changes) | Direct | [OTT, 1998], [Swaffield & Knight, 1990] |
| Object oriented modeling | Interview | Network of objects (types, attributes, relations) | Direct | [OTT, 1998], [Riekert, 1991] |
| Semantic nets | Interview | Semantic net (including relationships between objects) | Direct | [OTT, 1998], [Atkinson, 1990] |
| Distinguishing goals | Goal Related | Minimal sets of discriminating features | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Decision analysis | List Related | Estimate of worth for all possible decisions for a task | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Discourse analysis (observation) | Observation | Taxonomy of tasks/subtasks or functions | Direct | [OTT, 1998], [Belkin & Brooks, 1988] |
| Collect artifacts of task performance | Document Analysis | How expert organizes or processes task information, how it is compiled to present to others | Indirect | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Document analysis | Document Analysis | Conceptual graph | Direct | [OTT, 1998], [Gordon et al., 1993] |
| Repertory grid | Construct Elicitation | Attributes (and entities if provided by subject) | Indirect | [Hudlicka, 1997], [Kelly, 1955] |
| Multi-dimensional scaling | Construct Elicitation | Attributes and relationships | Indirect | |
| Proximity scaling | Construct Elicitation | Attributes and relationships | Indirect | [Hudlicka, 1997] |
| Card sorting | Sorting | Hierarchical cluster diagram (classification) | Indirect | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Laddered grid | Laddering | A hierarchical map of the task domain | Indirect | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Ranking augmented conceptual ranking | Other | Conceptual ranking (ordering by value) | Direct | [OTT, 1998], [Chignell & Peterson, 1988], [Kagel, 1986], [Whaley, 1979] |

       

3.5 Dependencies/Relationships

      Table 22 lists methods that obtain relationships between domain entities.

       

      Table 22. Methods that Elicit Relationships

       

| Method | Category | Output | Type | Reference |
|---|---|---|---|---|
| Data flow modeling | Interview | Data flow diagram (data items and data flow between them; no sequence information) | Direct | [OTT, 1998], [Gane & Sarson, 1977] |
| Entity-relationship modeling | Interview | Entity relationship diagram (entities, attributes, and relationships) | Direct | [OTT, 1998], [Swaffield & Knight, 1990] |
| Object oriented modeling | Interview | Network of objects (types, attributes, relations) | Direct | [OTT, 1998], [Riekert, 1991] |
| Semantic nets | Interview | Semantic net (including relationships between objects) | Direct | [OTT, 1998], [Atkinson, 1990] |
| Questionnaire | Interview | Sequence of task actions, cause and effect relationships | Direct | [OTT, 1998], [Bainbridge, 1979] |
| Discourse analysis (observation) | Observation | Taxonomy of tasks/subtasks or functions | Direct | [OTT, 1998], [Belkin & Brooks, 1988] |
| Multi-dimensional scaling | Construct Elicitation | Attributes and relationships | Indirect | |
| Proximity scaling | Construct Elicitation | Attributes and relationships | Indirect | [Hudlicka, 1997] |
| Card sorting | Sorting | Hierarchical cluster diagram (classification) | Indirect | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Laddered grid | Laddering | A hierarchical map of the task domain | Indirect | [Geiwitz, et al., 1990], [Cordingley, 1989] |

       

3.6 Evaluation

 

Table 23 lists methods that are used for evaluation of prototypes or other types of KE session results.

 

Table 23. Methods that Elicit Evaluations

 

| Method | Category | Output | Type | Reference |
|---|---|---|---|---|
| Teachback | Teachback | Correction of misconceptions | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Critiquing | Critiquing | Evaluation of a problem solving strategy compared to alternatives | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| System refinement | Prototyping | New test cases for a prototype system | Direct | [Geiwitz, et al., 1990] |
| System examination | Prototyping | Expert's opinion on the prototype's rules and control structures | Direct | [Geiwitz, et al., 1990] |
| System validation | Prototyping | Outside experts' evaluation of cases solved by the expert and the prototype system | Direct | [Geiwitz, et al., 1990] |
| Rapid prototyping | Prototyping | Evaluation of system/procedure | Direct | [Geiwitz, et al., 1990], [Diaper, 1989] |
| Storyboarding | Prototyping | Prototype display design | Direct | [OTT, 1998], [McNeese & Zaff, 1991] |
| Decision analysis | List Related | Estimate of worth for all possible decisions for a task | Direct | [Geiwitz, et al., 1990], [Cordingley, 1989] |
| Ranking augmented conceptual ranking | Other | Conceptual ranking (ordering by value) | Direct | [OTT, 1998], [Chignell & Peterson, 1988], [Kagel, 1986], [Whaley, 1979] |

 

 

4. KE Techniques by Design Knowledge Type

Design knowledge covers several aspects of the item being designed. Information needs to be obtained to cover the structure, function, and behavior of the design artifact [Gero, 1990]. Design information can be broken into several types of knowledge that are independent of the specific domain and problem. If a particular type of knowledge is desired, it can be acquired most effectively using the knowledge elicitation techniques best suited for obtaining that type of knowledge.
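As a rough illustration of the three aspects (the field names and values below are hypothetical, not from [Gero, 1990]), a record of design knowledge for an artifact might separate them like this:

```python
# Minimal sketch of the structure/function/behavior split.
# All field names and values are hypothetical.
artifact = {
    "structure": {  # what the artifact is made of
        "components": ["motor", "pump"],
        "connections": [("motor", "pump")],
    },
    "function": [   # what the artifact is for
        "move fluid from tank A to tank B",
    ],
    "behavior": {   # how the artifact acts
        "flow_l_min": 40,
        "power_kw": 2,
    },
}
print(artifact["function"][0])
```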

     

4.1 Choosing KE Techniques for Design

To choose a KE technique, the type of design knowledge needs to be mapped to the type of information given by a particular technique or class of techniques. In some cases, this mapping may be a sufficient reason to choose one technique over another; in other cases, it will not be. The domain expert may not always be able to articulate all the required knowledge when a direct method is used, often because the expert applies this knowledge automatically. The expert and knowledge engineer are unlikely to realize that information is missing until later steps in the development process. For this reason, it is desirable to use indirect techniques with modification (to obtain knowledge of the required type) and/or in conjunction with a direct technique.

       

4.2 Knowledge Types for Design

Different types of knowledge are used at different stages of the design process. One process, defined in Smithers [1998], identifies the knowledge needed for requirements definition, problem statement definition, solution generation, analysis, and documentation. In addition, presentations are made throughout the process to provide information to the customer/client. Each step in this process requires a different type of knowledge.

       

These knowledge requirements are presented at a high level. The information available at each step will vary depending on the problem to be solved and how much information is provided by the customer. In some cases, only the needs and desires of the customer are specified; in others, the designer may be given a detailed problem specification [Bernaras, 1993]. Many cases fall in between, where the designer is presented with initial requirements that may or may not be complete. The following subsections discuss what is involved with each kind of knowledge.

       

4.2.1 Needs and Desires

        This involves a statement of what the customer wants (or thinks he or she wants). The level of detail varies depending on the customer and the problem.

         

4.2.2 Requirements Formation Knowledge

 

This consists of the additional information needed to turn the needs and desires into actual requirements. In the software world, these are often referred to as "testable requirements." Requirements revision knowledge will also be needed, since requirements are likely to require adjustment throughout the process. One way to make "testable" concrete is sketched after the list below.

 

Knowledge needed to form requirements includes:

 

  1. Description of the artifact to be designed
  2. Requirements on its behavior
  3. Resources required (if applicable)
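A minimal sketch of "testable requirements": each requirement pairs a statement with an executable check against a design description. The requirement texts, field names, and checks below are hypothetical.

```python
# Minimal testable-requirements sketch. All names/values are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Requirement:
    text: str
    satisfied_by: Callable[[dict], bool]  # check against a design description

requirements = [
    Requirement("Flow rate of at least 40 l/min",
                lambda d: d["flow_l_min"] >= 40),
    Requirement("Unit cost below $200",
                lambda d: d["cost_usd"] < 200),
]

design = {"flow_l_min": 45, "cost_usd": 250}
for r in requirements:
    print(r.text, "->", "pass" if r.satisfied_by(design) else "fail")
```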

 

4.2.3 Problem Specification Knowledge

 

This is the knowledge needed to transform the requirements into an actual specification of the item being built. Knowledge needed to create a problem specification includes the following (a small data-structure sketch follows the list):

 

  1. What the design components are (including whether a component is always required)
  2. Attributes and possible values for these components
  3. Constraints between components
  4. Priorities on constraints, information on which can be relaxed
  5. Relationships (dependencies) between components
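A minimal data-structure sketch of these five items (all component, attribute, and constraint names are hypothetical):

```python
# Minimal problem-specification sketch: components with attributes and
# possible values, plus prioritized constraints. Names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    required: bool                                  # always required?
    attributes: dict = field(default_factory=dict)  # attribute -> possible values

@dataclass
class Constraint:
    between: tuple   # components the constraint relates (dependency)
    rule: str        # informal statement of the constraint
    priority: int    # higher = harder to relax
    relaxable: bool

spec = {
    "components": [
        Component("motor", required=True, attributes={"power_kw": [1, 2, 5]}),
        Component("pump", required=True, attributes={"flow_l_min": [20, 40, 60]}),
    ],
    "constraints": [
        Constraint(("motor", "pump"), "motor power must support pump flow",
                   priority=1, relaxable=False),
    ],
}
print(spec["constraints"][0].rule)
```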

 

4.2.4 Problem Solving Knowledge

 

This involves the knowledge required to turn the specification into a solution (or solutions): a plan for how to perform the design, and knowledge of what resources are available to build the artifact. Note that the resources available are distinct from the resources required stated in the requirements. In some cases, the client/customer may want specific components used for various reasons (for example, they have a warehouse of part x and want to get rid of them); in others, the actual choice of component can be deferred until an exact solution is designed.

 

Knowledge used to solve the problem, given a specification, includes:

 

  1. Decomposition of the design problem
  2. Ordering in which to perform the design
  3. Relationships between subproblems
  4. How to recompose the subproblems into a final solution
  5. Resources available (such as a part catalog)

 

4.2.5 Solution Analysis Knowledge

        This involves knowledge needed to determine if a given solution meets the requirements.

         

4.2.6 Documentation and Rationale Recovery Knowledge

 

This is the knowledge required both to document the process and to justify design decisions. It needs to be captured at each step of the process. Some of this information will be (or should be) a natural output of previous design steps, while other information (such as rationale) will need to be explicitly captured. The knowledge to be documented includes:

 

  1. Requirements (see above)
  2. Problem specification (see above)
  3. Solution description
  4. Rationale for design decisions
  5. Results of solution analysis

 

4.2.7 Presentation Knowledge

 

This is the knowledge required to provide feedback to the customer on the design process. How detailed this information should be will depend on the needs of the customer. It should, however, provide the customer with enough information to evaluate whether their needs and desires are being met. The knowledge used will be a subset of that needed for documentation and rationale recovery.

 

4.3 Design Plan Knowledge

 

Of these items, the most interesting are the design plan [Chandrasekaran, 1990] and the design rationale. The design plan discussed here uses a decomposition method/model [Maher, 1990] to perform the design. This involves breaking the problem into subproblems, which are then solved sequentially or, when possible, in parallel. Since subproblems may depend on other subproblems, it is necessary to solve them in the (or a) correct order; otherwise, the system would need to backtrack and make adjustments before arriving at the final solution [Liu & Brown, 1994]. The design plan needs to indicate both the decomposition and the order in which the subproblems should be solved.

 

Two factors influence the order in which subproblems should be addressed: the dependencies between subproblems and the number of constraints on a subproblem. If one subproblem depends on the solution to another, they need to be solved in order. If a subproblem is heavily constrained, it makes sense to solve it first, for two reasons: it minimizes the amount (and/or length) of backtracking, and it establishes early on that a solution is even possible. The ordering information (dependencies and constraints) needs to be obtained by some method.
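The ordering heuristic just described can be sketched as a topological sort over the subproblem dependencies that, among the subproblems ready to be solved, picks the most heavily constrained one first. The subproblem names and constraint counts below are hypothetical.

```python
# Minimal design-plan ordering sketch: respect dependencies; among ready
# subproblems, solve the most constrained first. Names are hypothetical.
from graphlib import TopologicalSorter  # Python 3.9+

deps = {                 # subproblem -> subproblems it depends on
    "casing": set(),
    "motor": set(),
    "pump": {"motor"},
    "controls": {"motor", "pump"},
}
constraints = {"casing": 1, "motor": 4, "pump": 3, "controls": 2}

ts = TopologicalSorter(deps)
ts.prepare()
order = []
while ts.is_active():
    # All dependency-free subproblems, most constrained first.
    for sub in sorted(ts.get_ready(), key=lambda s: -constraints[s]):
        order.append(sub)
        ts.done(sub)

print(order)  # ['motor', 'casing', 'pump', 'controls']
```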

 

 

5. References

 

Atkinson, G. (1990). Practical experience using an automated knowledge acquisition tool. Proceedings of the Second Annual Conference of the International Association of Knowledge Engineers, 87-97.

 

Bainbridge, L. (1979). Verbal reports as evidence of the process operator's knowledge. International Journal of Man-Machine Studies, 11, 411-436.

 

Belkin, N. J., Brooks, H. M. (1988). Knowledge elicitation using discourse analysis. In B. Gaines and J. Boose (Eds.) Knowledge based systems, Vol. 1, pp 107-124. Academic Press Limited.

 

Bernaras, A. (1993). Models of Design for the CommonKADS Library, ESPRIT Project P5248 KADS-II.

 

Chandrasekaran, B. (1990) Design Problem Solving: A Task Analysis, AI Magazine, pp. 59-71.

 

Chignell, M. H., Peterson, J. G. (1988). Strategic issues in knowledge engineering. Human Factors, 30(4), 381-394.

 

Coovert, M. D., Cannon-Bowers, J. A., & Salas, E. (1990). Applying mathematical modeling technology to the study of team training and performance. Paper presented at the 12th Annual Interservice/Industry Training Systems Conference, Orlando, FL, November.

 

Cordingley, E. S. (1989). Knowledge elicitation techniques for knowledge-based systems. In D. Diaper (Ed.), Knowledge elicitation: Principles, techniques and applications. Chichester, England: Ellis Horwood Ltd.

 

Coury, B. G., Motte, S., & Seiford, L. M. (1991). Capturing and representing decision processes in the design of an information system. Proceedings of the Human Factors Society 35th Annual Meeting, 1223-1227. Santa Monica, CA: Human Factors Society.

 

Diaper, D. (Ed.). (1989). Knowledge elicitation: Principles, techniques and applications. Chicester, England: Ellis Horwood Ltd.

 

Ericsson, K.A., Simon, H.A. (1984). Protocol Analysis: Verbal Reports as Data. Cambridge, MA: The MIT Press.

 

Gane, C., Sarson, T. (1977). Structured Systems Analysis: Tools and Techniques. Unpublished document, McDonnell Douglas Corporation.

 

Geiwitz, J., Kornell, J., McCloskey, B. (1990). An Expert System for the Selection of Knowledge Acquisition Techniques. Technical Report 785-2, Contract No. DAAB07-89-C-A044. California: Anacapa Sciences.

 

Gero, J. (Winter 1990). Design Prototypes: A Knowledge Representation Schema for Design. AI Magazine, pp. 26-36.

Gordon, S. E., Schmierer, K. A., & Gill, R. T. (1993). Conceptual graph analysis: Knowledge acquisition for instructional system design. Human Factors, 35, p. 459-481.

 

Gowin, R., Novak, J.D. (1984). Learning how to learn. NY: Cambridge University Press.

 

Hudlicka, E. (1997). Summary of Knowledge Elicitation Techniques for Requirements Analysis, Course Material for Human Computer Interaction, Worcester Polytechnic Institute.

 

Hura, G. S. (1987). Petri net applications. IEEE Potentials, October, 25-28.

 

Kagel, A. S. (1986). The unshuffle algorithm. Computer Language, 1(11), 61-66.

 

Kelly, G. (1955). The Psychology of Personal Constructs. New York: Norton.

 

Klein, G. A., Calderwood, R., Clinton-Cirocco, A. (1986). Rapid decision making on the fireground. Proceedings of the 30th Annual Human Factors Society, 1, 576-580. Dayton, OH: Human Factors Society.

 

Liu J., Brown D. (1994), Generating Design Decomposition Knowledge for Parametric Design Problems, Proceedings of AID-94, Kluwer Academic Publishers, pp. 661-678.

 

Maher, M. (Winter 1990). Process Models for Design Synthesis. AI Magazine, pp. 49-58.

 

McNeese, M. D., Zaff, B. S. (1991). Knowledge as design: A methodology for overcoming knowledge acquisition bottlenecks in intelligent interface design. Proceedings of the Human Factors Society 35th Annual Meeting, 1181-1185. Santa Monica, CA: Human Factors Society.

 

OTT (1998), http://www.ott.navy.mil/2_2/2_2_6/ , Task Analysis, Chief of Naval Operations' Office of Training Technology.

 

Riekert, W. (1991). Knowledge acquisition as an object-oriented modeling process. In M. J. Tauber and D. Ackermann (Eds.) Mental models and human computer interactions, 373-381. Amsterdam: Elsevier Sciences Publishers B. V.

 

Smithers, T. (1998) Towards a Knowledge Level Theory of Design Process, to appear in Proceedings of AID-98, Kluwer Academic Publishers.

 

Swaffield, G., Knight, B. (1990). Applying system analysis techniques to knowledge engineering. Expert Systems, 1, 82-93.

 

Thordsen, M. (1991). A Comparison of Two Tools for Cognitive Task Analysis: Concept Mapping and the Critical Decision Method. Proceedings of the Human Factors Society 35th Annual Meeting.

 

Weingaertner, S. T., Lewis, A. H. (1988). Evaluation of decision aiding in submarine emergency decision making. In J. Ranta (Ed.) Analysis, Design, and Evaluation of Man-Machine Systems: Selected Papers from the 3rd IFAC/IEA/IFORS Conference, 195-201. Oxford, UK: Pergamon.

Whaley, C. P. (1979). Collecting paired-comparison data with a sorting algorithm. Behavior Research Methods and Instrumentation, 11, 147-150.

Woods, D. D., Hollnagel, E. (1987). Mapping cognitive demands in complex problem-solving worlds. International Journal of Man-Machine Studies, 26, 257-275.