Knowledge Elicitation Tool Classification

 

Janet E. Burge

Artificial Intelligence Research Group

Worcester Polytechnic Institute

 

 

Knowledge Elicitation Methods
  KE Methods by Interaction Type
    Interviewing
    Case Study
    Protocols
    Critiquing
    Role Playing
    Simulation
    Prototyping
    Teachback
    Observation
    Goal Related
    List Related
    Construct Elicitation
    Sorting
    Laddering
    20 Questions
    Document Analysis
  KE Methods by Knowledge Type Obtained
    Procedures
    Problem Solving Strategy
    Goals/Subgoals
    Classification
    Dependencies/Relationships
    Evaluation
References

Table 1. KE Techniques Grouped by Interaction Type
Table 2. Interview Methods
Table 3. Case Study Methods
Table 4. Protocol Methods
Table 5. Critiquing Methods
Table 6. Role Playing Methods
Table 7. Simulation Methods
Table 8. Prototyping Methods
Table 9. Teachback Methods
Table 10. Observation Methods
Table 11. Goal Related Methods
Table 12. List Related Methods
Table 13. Construct Elicitation Methods
Table 14. Sorting Methods
Table 15. Laddering Methods
Table 16. 20 Questions Method
Table 17. Document Analysis Methods
Table 18. Methods that Elicit Procedures
Table 19. Methods that Elicit Problem Solving Strategy
Table 20. Methods that Elicit Goals/Subgoals
Table 21. Methods that Elicit Classification of Domain Entities
Table 22. Methods that Elicit Relationships
Table 23. Methods that Elicit Evaluations

 

 Knowledge Elicitation Methods

 Many Knowledge Elicitation (KE) methods have been used to obtain the information required to solve problems. These methods can be classified in many ways. One common way is by how directly they obtain information from the domain expert. Direct methods involve directly questioning a domain expert about how they do their job. For these methods to be successful, the domain expert has to be reasonably articulate and willing to share information. The information also has to be easy for the expert to express, which is often difficult because frequently performed tasks become 'automatic.' Indirect methods are used to obtain information that cannot be easily expressed directly.

 Two other ways of classifying methods are discussed in this document. One classifies the methods by how they interact with the domain expert. Another classifies them by what type of information is obtained.

 Other factors that influence the choice of KE method are the amount of domain knowledge required by the knowledge engineer and the effort required to analyze the data.

KE Methods by Interaction Type

 There are many ways of grouping KE methods. One is to group them by the type of interaction with the domain expert. Table 1 shows the categories and the type of information produced.

Table 1. KE Techniques Grouped by Interaction Type

 

| Category | Examples | Type | Results |
| --- | --- | --- | --- |
| Interview | Structured, Unstructured, Semi-Structured | Direct | Varies depending on questions asked |
| Case Study | Critical Incident Method, Forward Scenario Simulation, Critical Decision Method | Direct | Procedures followed, rationale |
| Protocols | Protocol Analysis | Direct | Procedures followed, rationale |
| Critiquing | Critiquing | Direct | Evaluation of problem solving strategy compared to alternatives |
| Role Playing | Role Playing | Indirect | Procedures, difficulties encountered due to role |
| Simulation | Simulation, Wizard of Oz | Direct | Procedures followed |
| Prototyping | Rapid Prototyping, Storyboarding | Direct | Evaluation of proposed approach |
| Teachback | Teachback | Direct | Correction of misconceptions |
| Observation | Observation | Direct | Procedure followed |
| Goal Related | Goal Decomposition, Dividing the Domain | Direct | Goals and subgoals, groupings of goals |
| List Related | Decision Analysis | Direct | Estimate of worth of all decisions for a task |
| Construct Elicitation | Repertory Grid, Multi-dimensional Scaling | Indirect | Entities, attributes, sometimes relationships |
| Sorting | Card Sorting | Indirect | Classification of entities (dimension chosen by subject) |
| Laddering | Laddered Grid | Indirect | Hierarchical map of the task domain |
| 20 Questions | 20 Questions | Indirect | Information used to solve problems, organization of problem space |
| Document Analysis | Document Analysis | Indirect (usually) | Varies depending on available documents, interaction with experts |

 

Interviewing

 Interviewing consists of asking the domain expert questions about the domain of interest and how they perform their tasks. Interviews can be unstructured, semi-structured, or structured. The success of an interview session depends on the questions asked (it is difficult to know which questions should be asked, particularly if the interviewer is not familiar with the domain) and on the ability of the expert to articulate their knowledge. The expert may not remember exactly how they perform a task, especially if it is one that they perform automatically. Some interview methods are used to build a particular type of model of the task. The model is built by the knowledge engineer based on information obtained during the interview and then reviewed with the domain expert. In some cases, the models can be built interactively with the expert, especially if there are software tools available for model creation. Table 2 shows a list of interview methods.

Table 2. Interview Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| Interviewing (structured, unstructured, semi-structured) | Direct | Procedures followed, knowledge used (easily verbalized knowledge) | [Hudlicka, 1997], [Geiwitz et al., 1990] |
| Concept Mapping | Direct | Procedures followed | [Hudlicka, 1997], [Thordsen, 1991], [Gowin & Novak, 1984] |
| Interruption Analysis | Direct | Procedures, problem-solving strategy, rationale | [Hudlicka, 1997] |
| ARK (ACT-based representation of knowledge; combination of methods) | Direct | Goal-subgoal network, including production rules describing goal/subgoal relationships | [Geiwitz et al., 1990] |
| Cognitive Structure Analysis (CSA) | Direct | Representational format of the expert's knowledge; content of the knowledge structure | [Geiwitz et al., 1990] |
| Problem discussion | Direct | Solution strategies | [Geiwitz et al., 1990] |
| Tutorial interview | Direct | Whatever the expert teaches | [Geiwitz et al., 1990] |
| Uncertain information elicitation | Direct | Uncertainty about problems | [Geiwitz et al., 1990] |
| Data flow modeling | Direct | Data flow diagram (data items and data flow between them; no sequence information) | [OTT, 1998], [Gane & Sarson, 1977] |
| Entity-relationship modeling | Direct | Entity relationship diagram (entities, attributes, and relationships) | [OTT, 1998], [Swaffield & Knight, 1990] |
| Entity life modeling | Direct | Entity life cycle diagram (entities and state changes) | [OTT, 1998], [Swaffield & Knight, 1990] |
| Object oriented modeling | Direct | Network of objects (types, attributes, relations) | [OTT, 1998], [Riekert, 1991] |
| Semantic nets | Direct | Semantic net (including relationships between objects) | [OTT, 1998], [Atkinson, 1990] |
| IDEF modeling | Direct | IDEF model (functional decomposition) | [OTT, 1998], [McNeese & Zaff, 1991] |
| Petri nets | Direct | Functional task net | [OTT, 1998], [Coovert et al., 1990], [Hura, 1987], [Weingaertner & Lewis, 1988] |
| Questionnaire | Direct | Sequence of task actions, cause and effect relationships | [OTT, 1998], [Bainbridge, 1979] |
| Task action mapping | Direct | Decision flow diagram (goals, subgoals, actions) | [OTT, 1998], [Coury et al., 1991] |
| User Needs Analysis (decision process diagrams) | Direct | Decision process diagrams | [OTT, 1998], [Coury et al., 1991] |
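
 Several of the interview methods above produce a model that the knowledge engineer builds and then reviews with the expert. As a minimal sketch of what such a model can look like in software, the Python below stores a small concept map as a directed graph with labeled relations; the class name, relation labels, and domain examples are invented for illustration and are not taken from the cited tools.

```python
# A concept map held as a directed, labeled graph: concepts are nodes,
# and each edge carries a relation label such as 'is-a' or 'part-of'.
from collections import defaultdict

class ConceptMap:
    def __init__(self):
        # concept -> list of (relation, target concept)
        self.edges = defaultdict(list)

    def add(self, source, relation, target):
        self.edges[source].append((relation, target))

    def neighbors(self, concept):
        return self.edges[concept]

# Built up question by question during the interview,
# then reviewed with the domain expert for corrections.
cmap = ConceptMap()
cmap.add("pump", "is-a", "component")
cmap.add("pump", "part-of", "cooling loop")
print(cmap.neighbors("pump"))  # [('is-a', 'component'), ('part-of', 'cooling loop')]
```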

 

Case Study

 In Case Study methods, different examples of problems/tasks within a domain are discussed. The problems consist of specific cases that can be typical, difficult, or memorable. These cases are used as a context within which directed questions are asked. Table 3 shows a list of methods that use cases to obtain information.

Table 3. Case Study Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| Retrospective case description | Direct | Procedures followed to solve past problems | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Critical incident strategy | Direct | Complete plan, plus factors that influenced the plan | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Forward scenario simulation | Direct | Procedures followed, reasons behind them | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Critical Decision Method | Direct | Goals considered, options generated, situation assessment | [Hudlicka, 1997], [Thordsen, 1991], [Klein et al., 1986] |
| Interesting cases | Direct | Procedures used to solve unusual problems | [Geiwitz et al., 1990], [Cordingley, 1989] |

 

Protocols

 Protocol analysis [Ericsson and Simon, 1984] involves asking the expert to perform a task while "thinking aloud." The intent is to capture both the actions performed and the mental process used to determine these actions. As with all the direct methods, the success of protocol analysis depends on the ability of the expert to describe why they are making their decisions. In some cases, the expert may not remember why they do things a certain way. In many cases, the verbalized thoughts will only be a subset of the actual knowledge used to perform the task. One method used to augment this information is interruption analysis. In this method, the knowledge engineer interrupts the expert at critical points in the task to ask questions about why they performed a particular action.

 For design, protocol analysis would involve asking the expert to perform the design task. This may or may not be possible depending on what is being designed or the length of time normally required to perform a design task. Interruption analysis would be useful in determining why subtasks are performed in a particular order. One disadvantage, however, is that the questions could distract the expert enough that they make mistakes or start "second guessing" their own decisions.

 If time and resources were available, it would be interesting to perform protocol analysis of the same task using multiple experts noting any differences in ordering. This could obtain both alternative orderings and, after questioning the expert, the rationale for their decisions.

 Table 4 lists protocol analysis methods.

Table 4. Protocol Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| Protocol analysis (think aloud, talk aloud, eidetic reduction, retrospective reporting, behavioral descriptions, playback) | Direct | Procedures, problem-solving strategy | [Hudlicka, 1997], [Ericsson & Simon, 1984], [Geiwitz et al., 1990] |
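
 As a minimal sketch of the analysis step that follows a think-aloud session, the Python below codes invented transcript segments as either actions or rationale and tallies them; real protocol coding schemes are much richer than this two-category example.

```python
# Each transcript segment is paired with a code assigned by the analyst.
from collections import Counter

transcript = [
    ("I'll check the inlet pressure first", "action"),
    ("because a low reading usually means a blocked filter", "rationale"),
    ("now I open the bypass valve", "action"),
]

# Tallying the codes shows how much of the session verbalized rationale
# versus plain actions -- a rough signal of how complete the protocol is.
counts = Counter(code for _, code in transcript)
print(counts)  # Counter({'action': 2, 'rationale': 1})

# Interruption analysis augments this record: at a chosen step, the knowledge
# engineer pauses the task and records an explicit 'why' for that action.
```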

 

Critiquing

 In Critiquing, an approach to the problem/task is evaluated by the expert. This is used to determine the validity of results of previous KE sessions. Table 5 lists critiquing methods.

Table 5. Critiquing Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| Critiquing | Direct | Evaluation of a problem solving strategy compared to alternatives | [Geiwitz et al., 1990], [Cordingley, 1989] |

 

Role Playing

 In Role Playing, the expert adopts a role and acts out a scenario in which their knowledge is used [Geiwitz et al., 1990]. The intent is that by viewing a situation from a different perspective, information will be revealed that was not discussed when the expert was asked directly. Table 6 shows role playing methods.

Table 6. Role Playing Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| Role playing | Indirect | Procedures, difficulties encountered due to role | [Geiwitz et al., 1990], [Cordingley, 1989] |

 

Simulation

 In Simulation methods, the task is simulated using a computer system or other means. This is used when it is not possible to actually perform the task. Table 7 shows simulation methods.

Table 7. Simulation Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| Wizard of Oz | Direct | Procedures followed | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Simulations | Direct | Problem solving strategies, procedures | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Problem analysis | Direct | Procedures, rationale (like simulated interruption analysis) | [Geiwitz et al., 1990] |

 

Prototyping

In Prototyping, the expert is asked to evaluate a prototype of the proposed system being developed. This is usually done iteratively as the system is refined. Table 8 shows prototyping methods.

Table 8. Prototyping Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| System refinement | Direct | New test cases for a prototype system | [Geiwitz et al., 1990] |
| System examination | Direct | Expert's opinion on the prototype's rules and control structures | [Geiwitz et al., 1990] |
| System validation | Direct | Outside expert's evaluation of cases solved by the expert and the prototype system | [Geiwitz et al., 1990] |
| Rapid prototyping | Direct | Evaluation of system/procedure | [Geiwitz et al., 1990], [Diaper, 1989] |
| Storyboarding | Direct | Prototype display design | [OTT, 1998], [McNeese & Zaff, 1991] |

 

Teachback

 In Teachback, the knowledge engineer attempts to teach the information back to the expert, who then provides corrections and fills in gaps. Table 9 shows teachback methods.

Table 9. Teachback Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| Teachback | Direct | Correction of misconceptions | [Geiwitz et al., 1990], [Cordingley, 1989] |

 

Observation

 In Observation methods, the knowledge engineer observes the expert performing a task. This prevents the knowledge engineer from inadvertently interfering in the process, but does not provide any insight into why decisions are made. Table 10 shows observation methods.

Table 10. Observation Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| Discourse analysis (observation) | Direct | Taxonomy of tasks/subtasks or functions | [OTT, 1998], [Belkin & Brooks, 1988] |
| On-site observation | Direct | Procedures, problem solving strategies | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Active participation | Direct | Knowledge and skills needed for the task | [Geiwitz et al., 1990], [Cordingley, 1989] |

 

Goal Related

In Goal Related methods, focused discussion techniques are used to elicit information about goals and subgoals. Table 11 shows goal related methods. 

Table 11. Goal Related Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| Goal decomposition | Direct | Goals and subgoals | [Geiwitz et al., 1990] |
| Dividing the domain | Direct | How data is grouped to reach a goal | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Reclassification | Direct | Evidence needed to prove that a decision was correct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Distinguishing goals | Direct | Minimal sets of discriminating features | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Goal Directed Analysis (goal-means network) | Direct | Goal-means network | [OTT, 1998], [Woods & Hollnagel, 1987] |
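
 As a minimal sketch of what goal decomposition produces, the Python below builds a small goal/subgoal tree and prints it as an indented outline; the goals shown are invented for illustration.

```python
# A goal with an ordered (but not necessarily sequential) list of subgoals.
class Goal:
    def __init__(self, name, subgoals=None):
        self.name = name
        self.subgoals = subgoals or []

    def outline(self, depth=0):
        # Print this goal, then its subgoals indented one level deeper.
        print("  " * depth + self.name)
        for g in self.subgoals:
            g.outline(depth + 1)

restore = Goal("restore cooling", [
    Goal("locate fault", [Goal("check pressure"), Goal("check flow")]),
    Goal("repair or bypass"),
])
restore.outline()
```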

 

List Related

 In List Related methods, the expert is asked to provide lists of information, usually decisions. Table 12 shows list related methods.

Table 12. List Related Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| Decision analysis | Direct | Estimate of worth for all possible decisions for a task | [Geiwitz et al., 1990], [Cordingley, 1989] |
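
 As a minimal sketch of the "estimate of worth" idea behind decision analysis, the Python below scores each candidate decision by probability-weighted utility; the decisions, probabilities, and utilities are invented for illustration.

```python
# decision -> list of (probability of outcome, utility of outcome)
decisions = {
    "replace part": [(0.9, 80), (0.1, -20)],
    "patch part":   [(0.6, 60), (0.4, -40)],
}

# Expected worth = sum of probability * utility over a decision's outcomes.
for decision, outcomes in decisions.items():
    worth = sum(p * u for p, u in outcomes)
    print(f"{decision}: expected worth = {worth:.1f}")
# replace part: 70.0, patch part: 20.0
```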

 

Construct Elicitation

 Construct Elicitation methods are used to obtain information about how the expert discriminates between entities in the problem domain. The most commonly used construct elicitation method is Repertory Grid Analysis [Kelly, 1955]. In this method, the domain expert is presented with a list of entities and asked to describe the similarities and differences between them. These similarities and differences are used to determine the important attributes of the entities. After completing the initial list of attributes, the knowledge engineer works with the domain expert to assign ratings to each entity/attribute pair. Table 13 shows construct elicitation methods.

 

Table 13. Construct Elicitation Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| Repertory grid | Indirect | Attributes (and entities if provided by subject) | [Hudlicka, 1997], [Kelly, 1955] |
| Multi-dimensional scaling | Indirect | Attributes and relationships | |
| Proximity scaling | Indirect | Attributes and relationships | [Hudlicka, 1997] |
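
 As a minimal sketch of the grid described above, the Python below stores expert ratings of entities against elicited constructs and uses them to find the pair of entities the expert rates as most alike; the entities, constructs, and ratings are invented for illustration.

```python
# Entities and constructs as elicited from the expert (illustrative values).
entities = ["valve A", "valve B", "pump"]
constructs = ["easy to service", "failure-prone"]

# ratings[construct][entity] on a 1..5 scale, assigned with the expert.
ratings = {
    "easy to service": {"valve A": 5, "valve B": 4, "pump": 1},
    "failure-prone":   {"valve A": 1, "valve B": 2, "pump": 4},
}

# One simple use of the grid: total rating difference between two entities,
# summed over all constructs, as a crude dissimilarity measure.
def distance(e1, e2):
    return sum(abs(ratings[c][e1] - ratings[c][e2]) for c in constructs)

pairs = [(e1, e2) for i, e1 in enumerate(entities) for e2 in entities[i + 1:]]
print(min(pairs, key=lambda p: distance(*p)))  # ('valve A', 'valve B')
```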

 

Sorting

In sorting methods, domain entities are sorted to determine how the expert classifies their knowledge. Table 14 shows sorting methods. 

Table 14. Sorting Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| Card sorting | Indirect | Hierarchical cluster diagram (classification) | [Geiwitz et al., 1990], [Cordingley, 1989] |
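
 As a minimal sketch of how repeated card sorts can be turned into the hierarchical cluster diagram the table mentions, the Python below (assuming NumPy and SciPy are available) treats cards that often land in the same pile as close and clusters them; the cards and pile assignments are invented for illustration.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import squareform

cards = ["wrench", "gauge", "valve", "pump"]
# One row per sort; values are the pile labels the expert assigned.
sorts = np.array([[0, 1, 1, 1],
                  [0, 0, 1, 1],
                  [0, 1, 2, 2]])

# Distance = fraction of sorts in which two cards landed in different piles.
n = len(cards)
dist = np.array([[np.mean(sorts[:, i] != sorts[:, j]) for j in range(n)]
                 for i in range(n)])

# Average-linkage clustering over the condensed distance matrix; the
# resulting tree corresponds to the expert's implicit classification.
tree = linkage(squareform(dist), method="average")
print(dendrogram(tree, labels=cards, no_plot=True)["ivl"])  # leaf order
```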

 

Laddering

 In Laddering, a hierarchical structure of the domain is formed by asking questions designed to move up, down, and across the hierarchy. Table 15 shows laddering methods.

Table 15. Laddering Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| Laddered grid | Indirect | A hierarchical map of the task domain | [Geiwitz et al., 1990], [Cordingley, 1989] |
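
 As a minimal sketch of the three laddering moves, the Python below generates "up," "down," and "across" prompts from question templates; the templates and the seed concept are illustrative, not prescribed wording.

```python
# Question templates for moving through the hierarchy being elicited.
PROMPTS = {
    "up":     "What is '{item}' an example of?",     # toward more general
    "down":   "Can you give examples of '{item}'?",  # toward more specific
    "across": "What are alternatives to '{item}'?",  # same level
}

def ladder(item, direction):
    return PROMPTS[direction].format(item=item)

for move in ("up", "down", "across"):
    print(ladder("centrifugal pump", move))
```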

 

20 Questions

 This is a method used to determine how the expert gathers information by having the expert ask the knowledge engineer questions. Table 16 shows the 20 questions method.

Table 16. 20 Questions Method

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| 20 questions | Indirect | Amount and type of information used to solve problems; how the problem space is organized, or how the expert has represented task-relevant knowledge | [Cordingley, 1989], [Geiwitz et al., 1990] |
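
 As a minimal sketch of what the knowledge engineer might record in such a session, the Python below logs each question the expert asks together with a tag for the kind of information requested; the questions and tags are invented for illustration.

```python
# (question asked by the expert, analyst's tag for the information sought)
session = [
    ("Is the fault intermittent?",      "symptom timing"),
    ("Did it start after maintenance?", "history"),
    ("Is more than one unit affected?", "scope"),
]

# The order and mix of tags hint at how the expert organizes the problem space.
for i, (question, tag) in enumerate(session, 1):
    print(f"Q{i} [{tag}]: {question}")
```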

 

Document Analysis

 Document analysis involves gathering information from existing documentation. It may or may not involve interaction with a human expert to confirm or add to this information.

Table 17 shows documentation analysis methods.

Table 17. Document Analysis Methods

 

| Method | Type | Output | Reference |
| --- | --- | --- | --- |
| Collect artifacts of task performance | Indirect | How the expert organizes or processes task information, and how it is compiled to present to others | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Document analysis | Indirect (usually) | Conceptual graph | [OTT, 1998], [Gordon et al., 1993] |
| Goal Directed Analysis (goal-means network) | Direct | Goal-means network | [OTT, 1998], [Woods & Hollnagel, 1987] |

 

KE Methods by Knowledge Type Obtained

 Besides being grouped into direct and indirect categories, KE methods can also be grouped (to some extent) by the type of knowledge obtained. For example, many of the indirect KE methods are best at obtaining classification knowledge, while direct methods are more suited to obtaining procedural knowledge. This does not, however, mean that the techniques cannot be used for other knowledge types. Since some designers may not be able to directly express how they perform a design task, it might be useful to use an indirect method in conjunction with a direct method to obtain this information.

 The information types used here are procedures, problem solving strategy, goals/subgoals, classification, dependencies/relationships, and evaluation.

Many methods fit into more than one category and are listed more than once. Also, this classification shows the information most commonly extracted using a method and does not imply that only that type of information can be elicited.

Procedures

 These are methods that can be used to determine the steps followed to complete a task. Table 18 lists methods used to elicit procedures.

Table 18. Methods that Elicit Procedures

 

| Method | Category | Output | Type | Reference |
| --- | --- | --- | --- | --- |
| Interviewing (structured, unstructured, semi-structured) | Interview | Procedures followed, knowledge used | Direct | [Hudlicka, 1997], [Geiwitz et al., 1990] |
| Concept Mapping | Interview | Procedures followed | Direct | [Hudlicka, 1997], [Thordsen, 1991], [Gowin & Novak, 1984] |
| Interruption Analysis | Interview | Procedures, problem-solving strategy, rationale | Direct | [Hudlicka, 1997] |
| Problem discussion | Interview | Solution strategies | Direct | [Geiwitz et al., 1990] |
| Tutorial interview | Interview | Whatever the expert teaches | Direct | [Geiwitz et al., 1990] |
| Entity life modeling | Interview | Entity life cycle diagram (entities and state changes) | Direct | [OTT, 1998], [Swaffield & Knight, 1990] |
| IDEF modeling | Interview | IDEF model (functional decomposition) | Direct | [OTT, 1998], [McNeese & Zaff, 1991] |
| Petri nets | Interview | Functional task net | Direct | [OTT, 1998], [Coovert et al., 1990], [Hura, 1987], [Weingaertner & Lewis, 1988] |
| Questionnaire | Interview | Sequence of task actions, cause and effect relationships | Direct | [OTT, 1998], [Bainbridge, 1979] |
| Task action mapping | Interview | Decision flow diagram (goals, subgoals, actions) | Direct | [OTT, 1998], [Coury et al., 1991] |
| Retrospective case description | Case Study | Procedures followed to solve past problems | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Critical incident strategy | Case Study | Complete plan, plus factors that influenced the plan | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Forward scenario simulation | Case Study | Procedures followed, reasons behind them | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Interesting cases | Case Study | Procedures used to solve unusual problems | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Protocol analysis (think aloud, talk aloud, eidetic reduction, retrospective reporting, behavioral descriptions, playback) | Protocols | Procedures, problem-solving strategy | Direct | [Hudlicka, 1997], [Ericsson & Simon, 1984], [Geiwitz et al., 1990] |
| Teachback | Teachback | Correction of misconceptions | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Critiquing | Critiquing | Evaluation of a problem solving strategy compared to alternatives | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Role playing | Role Playing | Procedures, difficulties encountered due to role | Indirect | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Wizard of Oz | Simulation | Procedures followed | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Simulations | Simulation | Problem solving strategies, procedures | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Problem analysis | Simulation | Procedures, rationale (like simulated interruption analysis) | Direct | [Geiwitz et al., 1990] |
| On-site observation | Observation | Procedures, problem solving strategies | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |

 

Problem Solving Strategy

 These methods attempt to determine how the expert makes their decisions. Table 19 lists methods that elicit a problem solving strategy.

 

Table 19. Methods that Elicit Problem Solving Strategy

 

| Method | Category | Output | Type | Reference |
| --- | --- | --- | --- | --- |
| Interviewing (structured, unstructured, semi-structured) | Interview | Procedures followed, knowledge used | Direct | [Hudlicka, 1997], [Geiwitz et al., 1990] |
| Interruption Analysis | Interview | Procedures, problem-solving strategy, rationale | Direct | [Hudlicka, 1997] |
| Problem discussion | Interview | Solution strategies | Direct | [Geiwitz et al., 1990] |
| Tutorial interview | Interview | Whatever the expert teaches | Direct | [Geiwitz et al., 1990] |
| Uncertain information elicitation | Interview | Uncertainty about problems | Direct | [Geiwitz et al., 1990] |
| Critical incident strategy | Case Study | Complete plan, plus factors that influenced the plan | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Forward scenario simulation | Case Study | Procedures followed, reasons behind them | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Protocol analysis (think aloud, talk aloud, eidetic reduction, retrospective reporting, behavioral descriptions, playback) | Protocols | Procedures, problem-solving strategy | Direct | [Hudlicka, 1997], [Ericsson & Simon, 1984], [Geiwitz et al., 1990] |
| Critiquing | Critiquing | Evaluation of a problem solving strategy compared to alternatives | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Wizard of Oz | Simulation | Procedures followed | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Simulations | Simulation | Problem solving strategies, procedures | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Problem analysis | Simulation | Procedures, rationale (like simulated interruption analysis) | Direct | [Geiwitz et al., 1990] |
| Reclassification | Goal Related | Evidence needed to prove that a decision was correct | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| On-site observation | Observation | Procedures, problem solving strategies | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Goal Directed Analysis (goal-means network) | Interview/Document Analysis | Goal-means network | Direct | [OTT, 1998], [Woods & Hollnagel, 1987] |
| 20 questions | 20 Questions | Amount and type of information used to solve problems; how the problem space is organized, or how the expert has represented task-relevant knowledge | Indirect | [Cordingley, 1989], [Geiwitz et al., 1990] |
| Cloze experiments | Other | Model of decision-making rules and structures | Indirect | [Geiwitz et al., 1990] |

 

Goals/Subgoals

 These are methods that are concerned with extracting the goals and subgoals for performing the task. These methods are listed separately from procedures since ordering is not necessarily provided. Table 20 lists methods that elicit this information. 

Table 20. Methods that Elicit Goals/Subgoals

 

| Method | Category | Output | Type | Reference |
| --- | --- | --- | --- | --- |
| ARK (ACT-based representation of knowledge; combination of methods) | Interview | Goal-subgoal network, including production rules describing goal/subgoal relationships | Direct | [Geiwitz et al., 1990] |
| Task action mapping | Interview | Decision flow diagram (goals, subgoals, actions) | Direct | [OTT, 1998], [Coury et al., 1991] |
| Critical Decision Method | Case Study | Goals considered, options generated, situation assessment | Direct | [Hudlicka, 1997], [Thordsen, 1991], [Klein et al., 1986] |
| Goal decomposition | Goal Related | Goals and subgoals | Direct | [Geiwitz et al., 1990] |
| Dividing the domain | Goal Related | How data is grouped to reach a goal | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Reclassification | Goal Related | Evidence needed to prove that a decision was correct | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Distinguishing goals | Goal Related | Minimal sets of discriminating features | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Goal Directed Analysis (goal-means network) | Interview/Document Analysis | Goal-means network | Direct | [OTT, 1998], [Woods & Hollnagel, 1987] |

 

Classification

 These methods are used to classify entities within a domain. Table 21 lists methods concerned with classification.

Table 21. Methods that Elicit Classification of Domain Entities

 

| Method | Category | Output | Type | Reference |
| --- | --- | --- | --- | --- |
| Cognitive Structure Analysis (CSA) | Interview | Representational format of the expert's knowledge; content of the knowledge structure | Direct | [Geiwitz et al., 1990] |
| Data flow modeling | Interview | Data flow diagram (data items and data flow between them; no sequence information) | Direct | [OTT, 1998], [Gane & Sarson, 1977] |
| Entity-relationship modeling | Interview | Entity relationship diagram (entities, attributes, and relationships) | Direct | [OTT, 1998], [Swaffield & Knight, 1990] |
| Entity life modeling | Interview | Entity life cycle diagram (entities and state changes) | Direct | [OTT, 1998], [Swaffield & Knight, 1990] |
| Object oriented modeling | Interview | Network of objects (types, attributes, relations) | Direct | [OTT, 1998], [Riekert, 1991] |
| Semantic nets | Interview | Semantic net (including relationships between objects) | Direct | [OTT, 1998], [Atkinson, 1990] |
| Distinguishing goals | Goal Related | Minimal sets of discriminating features | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Decision analysis | List Related | Estimate of worth for all possible decisions for a task | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Discourse analysis (observation) | Observation | Taxonomy of tasks/subtasks or functions | Direct | [OTT, 1998], [Belkin & Brooks, 1988] |
| Collect artifacts of task performance | Document Analysis | How the expert organizes or processes task information, and how it is compiled to present to others | Indirect | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Document analysis | Document Analysis | Conceptual graph | Indirect | [OTT, 1998], [Gordon et al., 1993] |
| Repertory grid | Construct Elicitation | Attributes (and entities if provided by subject) | Indirect | [Hudlicka, 1997], [Kelly, 1955] |
| Multi-dimensional scaling | Construct Elicitation | Attributes and relationships | Indirect | |
| Proximity scaling | Construct Elicitation | Attributes and relationships | Indirect | [Hudlicka, 1997] |
| Card sorting | Sorting | Hierarchical cluster diagram (classification) | Indirect | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Laddered grid | Laddering | A hierarchical map of the task domain | Indirect | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Ranking augmented conceptual ranking | Other | Conceptual ranking (ordering by value) | Direct | [OTT, 1998], [Chignell & Peterson, 1988], [Kagel, 1986], [Whaley, 1979] |

 

Dependencies/Relationships

 Table 22 lists methods that obtain relationships between domain entities. 

Table 22. Methods that Elicit Relationships

 

| Method | Category | Output | Type | Reference |
| --- | --- | --- | --- | --- |
| Data flow modeling | Interview | Data flow diagram (data items and data flow between them; no sequence information) | Direct | [OTT, 1998], [Gane & Sarson, 1977] |
| Entity-relationship modeling | Interview | Entity relationship diagram (entities, attributes, and relationships) | Direct | [OTT, 1998], [Swaffield & Knight, 1990] |
| Object oriented modeling | Interview | Network of objects (types, attributes, relations) | Direct | [OTT, 1998], [Riekert, 1991] |
| Semantic nets | Interview | Semantic net (including relationships between objects) | Direct | [OTT, 1998], [Atkinson, 1990] |
| Questionnaire | Interview | Sequence of task actions, cause and effect relationships | Direct | [OTT, 1998], [Bainbridge, 1979] |
| Discourse analysis (observation) | Observation | Taxonomy of tasks/subtasks or functions | Direct | [OTT, 1998], [Belkin & Brooks, 1988] |
| Multi-dimensional scaling | Construct Elicitation | Attributes and relationships | Indirect | |
| Proximity scaling | Construct Elicitation | Attributes and relationships | Indirect | [Hudlicka, 1997] |
| Card sorting | Sorting | Hierarchical cluster diagram (classification) | Indirect | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Laddered grid | Laddering | A hierarchical map of the task domain | Indirect | [Geiwitz et al., 1990], [Cordingley, 1989] |

 

Evaluation

 Table 23 lists methods that are used for evaluation of prototypes or other types of KE session results. 

Table 23. Methods that Elicit Evaluations

 

| Method | Category | Output | Type | Reference |
| --- | --- | --- | --- | --- |
| Teachback | Teachback | Correction of misconceptions | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Critiquing | Critiquing | Evaluation of a problem solving strategy compared to alternatives | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| System refinement | Prototyping | New test cases for a prototype system | Direct | [Geiwitz et al., 1990] |
| System examination | Prototyping | Expert's opinion on the prototype's rules and control structures | Direct | [Geiwitz et al., 1990] |
| System validation | Prototyping | Outside expert's evaluation of cases solved by the expert and the prototype system | Direct | [Geiwitz et al., 1990] |
| Rapid prototyping | Prototyping | Evaluation of system/procedure | Direct | [Geiwitz et al., 1990], [Diaper, 1989] |
| Storyboarding | Prototyping | Prototype display design | Direct | [OTT, 1998], [McNeese & Zaff, 1991] |
| Decision analysis | List Related | Estimate of worth for all possible decisions for a task | Direct | [Geiwitz et al., 1990], [Cordingley, 1989] |
| Ranking augmented conceptual ranking | Other | Conceptual ranking (ordering by value) | Direct | [OTT, 1998], [Chignell & Peterson, 1988], [Kagel, 1986], [Whaley, 1979] |

 

References 

Atkinson, G. (1990). Practical experience using an automated knowledge acquisition tool. Proceedings of the Second Annual Conference of the International Association of Knowledge Engineers, 87-97.

Bainbridge, L. (1979). Verbal reports as evidence of the process operator's knowledge. International Journal of Man-Machine Studies, 11, 411-436.

Belkin, N. J., Brooks, H. M. (1988). Knowledge elicitation using discourse analysis. In B. Gaines and J. Boose (Eds.), Knowledge based systems, Vol. 1, pp. 107-124. Academic Press Limited.

Chignell, M. H., Peterson, J. G. (1988). Strategic issues in knowledge engineering. Human Factors, 30(4), 381-394. 

Coovert, M. D., Cannon-Bowers, J. A., & Salas, E. (1990). Applying mathematical modeling technology to the study of team training and performance. Paper presented at the 12th Annual Interservice/Industry Training Systems Conference, Orlando, FL, November. 

Cordingley, E. S. (1989). Knowledge elicitation techniques for knowledge-based systems. In D. Diaper (Ed.), Knowledge elicitation: Principles, techniques and applications. Chichester, England: Ellis Horwood Ltd. 

Coury, B. G., Motte, S., & Seiford, L. M. (1991). Capturing and representing decision processes in the design of an information system. Proceedings of the Human Factors Society 35th Annual Meeting, 1223-1227. Santa Monica, CA: Human Factors Society. 

Diaper, D. (Ed.). (1989). Knowledge elicitation: Principles, techniques and applications. Chichester, England: Ellis Horwood Ltd.

Ericsson, K.A., Simon, H.A. (1984). Protocol Analysis: Verbal Reports as Data. Cambridge, MA: The MIT Press. 

Gane, C., Sarson, T. (1977). Structured Systems Analysis: Tools and Techniques. Unpublished document, McDonnell Douglas Corporation.

Geiwitz, J., Kornell, J., McCloskey, B. (1990). An Expert System for the Selection of Knowledge Acquisition Techniques. Technical Report 785-2, Contract No. DAAB07-89-C-A044. California: Anacapa Sciences.

Gordon, S. E., Schmierer, K. A., & Gill, R. T. (1993). Conceptual graph analysis: Knowledge acquisition for instructional system design. Human Factors, 35, 459-481.

Gowin, R., Novak, J.D. (1984). Learning how to learn. NY: Cambridge University Press.

Hudlicka, E. (1997). Summary of Knowledge Elicitation Techniques for Requirements Analysis, Course Material for Human Computer Interaction, Worcester Polytechnic Institute.

Hura, G. S. (1987). Petri net applications. IEEE Potentials, October, 25-28.

Kagel, A. S. (1986). The unshuffle algorithm. Computer Language, 1(11), 61-66.

Kelly, G. (1955). The Psychology of Personal Constructs. New York: Norton.

Klein, G. A., Calderwood, R., Clinton-Cirocco, A. (1986). Rapid decision making on the fireground. Proceedings of the 30th Annual Human Factors Society Meeting, 1, 576-580. Dayton, OH: Human Factors Society.

McNeese, M. D., Zaff, B. S. (1991). Knowledge as design: A methodology for overcoming knowledge acquisition bottlenecks in intelligent interface design. Proceedings of the Human Factors Society 35th Annual Meeting, 1181-1185. Santa Monica, CA: Human Factors Society.

OTT (1998). Task Analysis. Chief of Naval Operations' Office of Training Technology. http://www.ott.navy.mil/2_2/2_2_6/

Riekert, W. (1991). Knowledge acquisition as an object-oriented modeling process. In M. J. Tauber and D. Ackermann (Eds.) Mental models and human computer interactions, 373-381. Amsterdam: Elsevier Sciences Publishers B. V.

Swaffield, G., Knight, B. (1990). Applying system analysis techniques to knowledge engineering. Expert Systems, 1, 82-93.

Thordsen, M. (1991). A Comparison of Two Tools for Cognitive Task Analysis: Concept Mapping and the Critical Decision Method. Proceedings of the Human Factors Society 35th Annual Meeting.

Weingaertner, S. T., Lewis, A. H. (1988). Evaluation of decision aiding in submarine emergency decision making. In J. Ranta (Ed.), Analysis, Design, and Evaluation of Man-Machine Systems: Selected Papers from the 3rd IFAC/IEA/IFORS Conference, 195-201. Oxford, UK: Pergamon.

Whaley, C. P. (1979). Collecting paired-comparison data with a sorting algorithm. Behavior Research Methods and Instrumentation, 11, 147-150.

Woods, D. D., Hollnagel, E. (1987). Mapping cognitive demands in complex problem-solving worlds. International Journal of Man-Machine Studies, 26, 257-275.