
NDM Tools

Through its scientific and applied endeavors, the NDM community has generated many tools, including methods, models, guidance, and research supports. Some of the tools are available as Courses.

 

This list is not intended to be comprehensive. The NDMA is seeking ideas and resources for developing this toolbox; send recommendations and/or links for other NDM tools to: info@naturalisticdecisionmaking.org.


These NDM community members contributed to the list: Cindy Dominguez, Julie Gore, Robert Hoffman, Devorah Klein, Gary Klein, Laura Militello, Brian Moon, Emilie Roth, Jan Maarten Schraagen, and Neelam Naikar. Adam Zaremsky added the short descriptors and references.


Knowledge Elicitation

The Critical Decision Method (CDM) is a knowledge elicitation technique. It uses a set of cognitive probes, applied over multiple retrospective sweeps through a nonroutine incident, to determine the basis for situation assessment and decision making during that incident.

 

Sources:

  • Klein, G. A., Calderwood, R., & MacGregor, D. (1989). Critical decision method for eliciting knowledge. IEEE Transactions on Systems, Man, and Cybernetics, 19(3), 462-472.
  • Hoffman, R. R., Crandall, B., & Shadbolt, N. (1998). Use of the critical decision method to elicit expert knowledge: A case study in the methodology of cognitive task analysis. Human Factors, 40(2), 254-276.

A representation of the specific changes in cue usage and goals in an expert's understanding of the dynamics of a particular case.

 

Sources:

  • Klein, G. A., Calderwood, R., & MacGregor, D. (1989). Critical decision method for eliciting knowledge. IEEE Transactions on Systems, Man, and Cybernetics, 19(3), 462-472.

A streamlined set of interview methods for identifying cognitive demands and skills needed to perform a task proficiently.

Sources:

  • Militello, L. G., & Hutton, R. J. (1998). Applied cognitive task analysis (ACTA): A practitioner's toolkit for understanding cognitive task demands. Ergonomics, 41(11), 1618-1641.

A set of probes organized to capture the most important knowledge categories recognized as characteristic of expertise.

 

Sources:

  • Militello, L. G., & Hutton, R. J. (1998). Applied cognitive task analysis (ACTA): A practitioner's toolkit for understanding cognitive task demands. Ergonomics, 41(11), 1618-1641.
  • Klein, G., & Militello, L. (2004). The knowledge audit as a method for cognitive task analysis. How Professionals Make Decisions, 335-342.

A method to assess and identify the specific trainable cognitive skills an instructor might want to address.

 

Sources:

A tool for organizing knowledge that shows the relationships among events and objects between which a perceived regularity exists, using connecting lines and labeled shapes such as circles and boxes (a minimal sketch of the underlying structure follows the sources below).

 

Sources:

  • Novak, J. D., & Cañas, A. J. (2006). The theory underlying concept maps and how to construct them. Florida Institute for Human and Machine Cognition, 1(1), 1-31.
  • Moon, B., Hoffman, R. R., Novak, J., & Cañas, A. (Eds.). (2011). Applied concept mapping: Capturing, analyzing, and organizing knowledge. CRC Press.
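Because a concept map is, at bottom, a labeled graph, its structure is straightforward to represent in software. The following Python sketch is illustrative only (it is not the data model of any tool cited above); it stores a map as a set of concept-link-concept propositions:

```python
# Minimal sketch of a concept map as a set of labeled propositions.
# Illustrative only: names and structure are assumptions, not the
# data model of any particular concept-mapping tool.

from dataclasses import dataclass


@dataclass(frozen=True)
class Proposition:
    """A concept-link-concept triple, the basic unit of a concept map."""
    source: str  # concept the link starts from
    link: str    # linking phrase naming the perceived regularity
    target: str  # concept the link points to

    def __str__(self) -> str:
        return f"{self.source} --[{self.link}]--> {self.target}"


# A tiny example map about concept maps themselves.
concept_map = {
    Proposition("Concept maps", "organize", "knowledge"),
    Proposition("Concept maps", "are made of", "propositions"),
    Proposition("Propositions", "connect", "concepts"),
}

for prop in sorted(concept_map, key=str):
    print(prop)
```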

Cognitive Specifications and Representation

A format that helps practitioners focus their analysis on intended project goals, with headings based on the types of information needed to develop a new course or design a new system.

 

Sources:

  • Militello, L. G., & Hutton, R. J. (1998). Applied cognitive task analysis (ACTA): A practitioner's toolkit for understanding cognitive task demands. Ergonomics, 41(11), 1618-1641.

A use of Naturalistic Decision Making cognitive frameworks to more accurately describe the processes involved in real-world decision making, and to support decision makers in what they are actually trying to do.

 

Sources:

 

  • Militello, L. G., & Klein, G. (2013). Decision-centered design. The Oxford handbook of cognitive engineering, 261-271.
  • Militello, L. G., Dominguez, C. O., Lintern, G., & Klein, G. (2010). The role of cognitive systems engineering in the systems engineering design process. Systems Engineering, 13(3), 261-273.

A collection of all of the informational and perceptual cues that are pinpointed in critical decision interviews.

 

Sources:

  • Klein, G. A., Calderwood, R., & MacGregor, D. (1989). Critical decision method for eliciting knowledge. IEEE Transactions on Systems, Man, and Cybernetics, 19(3), 462-472.

A system for visual monitoring of a user's mental state, assessing human emotion, mental workload, and stress in real time.

 

Sources:

  • Chrenka, J., Hutton, R. J. B., Klinger, D. W., & Anastasi, D. (2001). The cognimeter: Focusing cognitive task analysis in the cognitive function model. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 4, 1738-1742.

A strategy for facilitating human-machine teaming, consisting of activities to analyze the function allocation trade space, analyze operational demands and work requirements, analyze interdependencies between humans and automation, evaluate alternative options with human performance modeling and simulation, and identify and evaluate alternative function allocation and crewing options.

 

Sources:

  • Ernst, K., Roth, E., Militello, L., Sushereba, C., DiIulio, J., Wonderly, S., … & Taylor, G. (2019). A strategy for determining optimal crewing in future vertical lift: Human-automation function allocation. Proceedings of the Vertical Lift Society’s 75th Annual Forum & Technology Display, Philadelphia, PA.
  • Sushereba, C. E., DiIulio, J. B., Militello, L. G., & Roth, E. (2019, November). A tradespace framework for evaluating crewing configurations for future vertical lift. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 63, No. 1, pp. 352-356). SAGE Publications.

A representation for characterizing activity in work systems, decomposing it into work situations and work functions and capturing all of the possible combinations of work situations, work functions, and control tasks (a toy illustration of such a matrix follows the source below).

 

Sources:

  • Naikar, N., Moylan, A., & Pearce, B. (2006). Analyzing activity in complex systems with cognitive work analysis: Concepts, guidelines and case study for control task analysis. Theoretical Issues in Ergonomics Science, 7(4), 371-394.
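One way to picture this representation is as a matrix marking which work functions can occur in which work situations. The Python sketch below is a toy illustration, not tooling from the source above; the situations, functions, and markings are invented for the example:

```python
# Toy sketch of a work-situation x work-function possibility matrix,
# in the spirit of the representation described above. The situations,
# functions, and markings are invented for illustration.

situations = ["pre-flight", "en route", "approach"]
functions = ["navigate", "communicate", "manage systems"]

# True means the function can occur in that situation.
possible = {
    ("pre-flight", "navigate"): False,
    ("pre-flight", "communicate"): True,
    ("pre-flight", "manage systems"): True,
    ("en route", "navigate"): True,
    ("en route", "communicate"): True,
    ("en route", "manage systems"): True,
    ("approach", "navigate"): True,
    ("approach", "communicate"): True,
    ("approach", "manage systems"): False,
}

# Print the matrix, with X marking possible combinations.
print(f"{'':>12}" + "".join(f"{f:>16}" for f in functions))
for s in situations:
    row = "".join(f"{'X' if possible[(s, f)] else '-':>16}" for f in functions)
    print(f"{s:>12}" + row)
```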

A diagram for complex sociotechnical systems that shows which work demands an actor can be responsible for and depicts the fundamental boundaries on the allocation or distribution of work demands, from which various possibilities, regarded as emergent, may be derived.

Sources:

  • Naikar, N., & Elix, B. (2016). Integrated system design: Promoting the capacity of sociotechnical systems for adaptation through extensions of cognitive work analysis. Frontiers in Psychology, 7, 962.

The HMT Systems Engineering Guide provides guidance for human-machine teaming (HMT) based on a 2016-17 literature search and analysis of applied research. The guide provides a framework organizing HMT research, along with a methodology for engaging with users of a system to elicit user stories and/or requirements that reflect applied research findings. The framework uses the organizing themes of Observability, Predictability, Directing Attention, Exploring the Solution Space, Directability, Adaptability, Common Ground, Calibrated Trust, Design Process, and Information Presentation.

 

Sources:

 

Training

A training approach that allows trainees to learn how experts make sense of situations, what they pay attention to, why they make their choices, and what their mental models are, all without the expert having to be present.

 

Sources:

  • Klein, G., & Borders, J. (2016). The ShadowBox approach to cognitive skills training: An empirical evaluation. Journal of Cognitive Engineering and Decision Making, 10(3), 268-280.
  • Klein, G., Hintze, N., & Saab, D. (2013, May). Thinking inside the box: The ShadowBox method for cognitive skill development. In Proceedings of the 11th International Conference on Naturalistic Decision Making (pp. 121-124).

Tactical Decision Games (TDGs) are simple, fun, and effective exercises that improve decision-making ability and tactical acumen: by repeatedly playing through problems, trainees learn both to make better decisions and to make decisions better.

 

Sources:

  • Schmitt, J. F. (1994). Mastering tactics: A tactical decision games workbook. Marine Corps Association.

A set of tools for human-machine systems designed to increase the human operators' knowledge and understanding of the technology's explainability.

 

Sources:

A cognitive model of training that has one primary function, learning management, and six subordinate functions of a provider: setting/clarifying goals, providing instruction, assessing trainee proficiency and diagnosing barriers to progress, sharing expertise, setting a climate conducive to learning, and promoting ownership of the learning process and performance of the trainee.

 

Sources:

  • Zsambok, C. E., Kaempf, G. L., Crandall, B., & Kyne, M. (1997). A comprehensive program to deliver on-the-job training (OJT). Klein Associates Inc., Fairborn, OH.

A training tool designed to improve the ability of decision makers in the oil, gas, and petrochemical industries to help plant operators build richer mental models and more effective mindsets, so that the operators make better decisions.

 

Sources:

  • Borders, J., Klein, G., & Besuijen, R. (2020). Mental models: Cognitive after-action review guide for observers video final report. Center for Operator Performance (unpublished report).

Design

A framework that uses CTA methods to identify the tough, key decisions in performance, and then bases the design of technology, training, and processes on those identified requirements.

 

Sources:

  • Militello, L. G., & Klein, G. (2013). Decision-centered design. The Oxford Handbook of Cognitive Engineering, 261-271.

A replanning tool that supports human operators in collaborating with an automated planner. The planner is designed to be observable and directable and to share a frame of reference with the human operator, allowing work to be completed iteratively and jointly.

 

Sources:

  • Scott, R., Roth, E., Truxler, R., Ostwald, J., & Wampler, J. (2009, October). Techniques for effective collaborative automation for air mission replanning. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 53, No. 4, pp. 202-206). SAGE Publications.

A set of methodologies applicable in any context where humans directly interact with computing devices and systems. It attends to personal, social, and cultural aspects, and addresses issues such as information design, human-information interaction, human-computer interaction, human-human interaction, and the relationships between computing technology and art, social, and cultural issues.

 

Sources:

  • Hoffman, R. (Ed.). (2012). Collected Essays on Human-Centered Computing, 2001-2011. Piscataway, NJ: IEEE CS Press.
  • Zhang, J., Patel, V. L., Johnson, K. A., & Smith, J. W. (2002). Designing human-centered distributed information systems. IEEE Intelligent Systems, 17(5), 42-47.

Evaluation and Assessment

A tool for organizing knowledge, showing the relationships of events and objects between which a perceived regularity exists, through the use of connecting lines, circles, and boxes of some type. Sero! is a software platform for conducting concept mapping-based assessments of mental models.

 

Sources:

  • Novak, J. D., & Cañas, A. J. (2007). Theoretical origins of concept maps, how to construct them, and uses in education. Reflecting Education, 3(1), 29-42.
  • www.serolearn.com
  • Moon, B., Johnston, C., & Moon, S. (2018). A case for the superiority of concept mapping-based assessments for assessing mental models. In Concept mapping: Renewing learning and thinking. Proceedings of the 8th International Conference on Concept Mapping. Medellín, Colombia: Universidad EAFIT.

An alternative to standard performance reviews, in which the reviewer and reviewee independently draw up and evaluate a list of decisions made in the prior year, focusing attention on the quality of the decisions themselves, not the decision maker.

 

Sources:

An evaluation framework that focuses on a support system’s usability, usefulness, and impact in supporting human performance in complex work environments.

 

Sources:

  • Eggleston, R. G., Roth, E. M., & Scott, R. (2003, October). A framework for work-centered product evaluation. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 47, No. 3, pp. 503-507). SAGE Publications.

Teamwork

A process for team members to share situation awareness (SA) information, which may include a group exercise of questioning norms, checking for conflicting information, coordinating and prioritizing tasks, and establishing contingency plans.

Sources:

  • Klinger, D. W., & Klein, G. (1999). Emergency response organizations: An accident waiting to happen. Ergonomics in Design, 7(3), 20-25.

A model that captures the nature and origin of international cognitive differences, which usually arise from a group's origin in a specific physical and social ecology, and provides mechanisms for increasing comprehension and effectiveness in the face of these differences.

Sources:

  • Klein, H. A. (2004). Cognition in natural settings: The cultural lens model. Cultural Ergonomics, 249-280.

A set of CTA methods for investigating teamwork.

 

Sources:

  • Klinger, D., & Thordsen, M. (1998, October). Team CTA applications and methodologies. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 42, No. 3, pp. 206-209). SAGE Publications.

Risk Assessment

A managerial strategy applied at the outset of a project: the team hypothetically presumes the project has failed spectacularly, then works backwards to identify the weaknesses and threats that could have caused the failure, so that they can be avoided.

 

Sources:

  • Klein, G. (2007). Performing a project premortem. Harvard Business Review, 85(9), 18-19.

Measurement

The six main macrocognitive functions are identified as decision making, sensemaking, problem detection, planning, adapting, and coordinating.

 

Sources:

  • Klein, G. (2018). Macrocognitive measures for evaluating cognitive work. In Macrocognition metrics and scenarios (pp. 47-64). CRC Press.

A novel method of statistical data analysis for assessing the learnability of cognitive work methods.

 

Sources:

  • Hoffman, R. R., Marx, M., Amin, R., & McDermott, P. L. (2010). Measurement for evaluating the learnability and resilience of methods of cognitive work. Theoretical Issues in Ergonomics Science, 11(6), 561-575.

A means to explain a computational system to decision makers who rely on Artificial Intelligence, so that they may decide on the reasonableness of that system.

 

1. Trust: A series of active exploration measures aimed at maintaining appropriate, context-dependent expectations, so that users know whether, when, and why to trust or mistrust an XAI system.

 

2. Explanation Satisfaction: The degree to which users feel that they understand the AI system or process being explained to them.

 

3. Explanation Goodness: A checklist, based on factors such as clarity and precision, that researchers can use either to design goodness into the explanations their XAI system generates or to evaluate the a priori goodness of the explanations generated.

A measure to assess the “goodness” (i.e., correctness, comprehensiveness, coherence, usefulness) of a user’s mental model of an XAI system, calculated as the percentage of concepts, relations, and propositions in the user’s explanation that also appear in an expert’s explanation (a minimal sketch of this calculation follows the source below).

 

Sources:

  • Hoffman, R. R., Mueller, S. T., Klein, G., & Litman, J. (2018). Metrics for explainable AI: Challenges and prospects. arXiv preprint arXiv:1812.04608.
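The overlap score described above is simple to compute. The following Python sketch is a minimal, assumed reading of that description (element sets and names are invented for illustration, not taken from the source):

```python
# Minimal sketch of the overlap score described above: the share of
# elements in a user's explanation that also appear in an expert's.
# The element sets below are invented for illustration.

def overlap_percentage(user_elements: set[str], expert_elements: set[str]) -> float:
    """Percentage of the user's elements that also appear in the expert's."""
    if not user_elements:
        return 0.0
    return 100.0 * len(user_elements & expert_elements) / len(user_elements)


user = {"AI system", "training data", "prediction", "uses", "produces"}
expert = {"AI system", "training data", "model weights", "prediction",
          "uses", "learns", "produces"}

print(f"Mental model goodness: {overlap_percentage(user, expert):.0f}%")
```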

Conceptual Descriptions

A decision-making model that explains how people use situation assessment to generate plausible courses of action, while simultaneously using mental simulation to evaluate generated courses of action.

 

Sources:

  • Klein, G., Calderwood, R., & Clinton-Cirocco, A. (2010). Rapid decision making on the fire ground: The original study plus a postscript. Journal of Cognitive Engineering and Decision Making, 4, 186-209.

A combination of the conceptual RPD model with computer programming to simulate realistic expert decision making (a toy sketch of the recognition-primed loop follows the source below).

 

Sources:

  • Hutton, R. J., Warwick, W., Stanard, T., McDermott, P. L., & McIlwaine, S. (2001, October). Computational model of recognition-primed decisions (RPD): Improving realism in computer-generated forces (CGF). In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 45, No. 26, pp. 1833-1837). SAGE Publications.
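As a rough illustration of the recognition-primed loop such models implement (recognize a pattern, retrieve a typical course of action, mentally simulate it, and either act or move on to the next option), here is a toy Python sketch. It is not the model from the source above; the patterns, actions, and evaluation stub are invented:

```python
# Toy sketch of a recognition-primed decision loop: recognize the
# situation, retrieve typical courses of action in order of typicality,
# and accept the first one that survives a (stubbed) mental simulation.
# The patterns and actions are invented for illustration.

from typing import Callable, Optional

# Recognized pattern -> courses of action, ordered by typicality.
EXPERIENCE = {
    "kitchen fire": ["interior attack", "ventilate then attack", "defensive attack"],
    "basement fire": ["defensive attack", "ventilate then attack"],
}


def rpd_decide(pattern: str,
               simulate: Callable[[str], bool]) -> Optional[str]:
    """Return the first course of action that passes mental simulation."""
    for action in EXPERIENCE.get(pattern, []):
        if simulate(action):   # mental simulation: will this work here?
            return action
    return None                # no recognized option survives; reassess


# Stub evaluation standing in for mental simulation.
workable = lambda action: action != "interior attack"

print(rpd_decide("kitchen fire", workable))  # -> "ventilate then attack"
```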

A description of the ever-changing and interconnected relationship between data, which is available information, and cognitive frames, which are explanatory structures that define entities by explaining their relationships to other entities (a toy sketch of the data/frame cycle follows the sources below).

 

Sources:

  • Klein, G., Moon, B., & Hoffman, R. (2006a). Making sense of sensemaking 1: Alternative perspectives. IEEE Intelligent Systems, 21(4), 70-73.

  • Klein, G., Moon, B., & Hoffman, R. (2006b). Making sense of sensemaking 2: A macrocognitive model. IEEE Intelligent Systems, 21(5), 88-92.

  • Klein, G., Phillips, J. K., Rall, E. L., & Peluso, D. A. (2007, May). A data-frame theory of sensemaking. In Expertise out of context: Proceedings of the sixth international conference on naturalistic decision making (pp. 113-155). New York, NY: Lawrence Erlbaum Assoc Inc.

  • Moore, D. T., & Hoffman, R. R. (2011). Data-frame theory of sensemaking as a best model for intelligence. American Intelligence Journal, 29(2), 145-158.
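The core cycle (incoming data either elaborates the current frame or, when it cannot be accounted for, triggers reframing) can be sketched in a few lines of Python. This is a toy reading of the model, not code from the sources; the frames and data stream are invented:

```python
# Toy sketch of the data/frame cycle described above: each datum either
# elaborates the current frame or, if the frame cannot account for it,
# triggers reframing. Frames and data are invented for illustration.

def fits(frame: dict, datum: str) -> bool:
    """Stub: does the frame account for this datum?"""
    return datum in frame["expects"]


def sensemaking(frame: dict, stream: list[str]) -> dict:
    for datum in stream:
        if fits(frame, datum):
            frame["evidence"].append(datum)  # elaborate the frame
        else:
            # Anomaly: question the frame and replace it with one
            # that can account for the surprising datum.
            frame = {"name": f"reframed on {datum!r}",
                     "expects": frame["expects"] | {datum},
                     "evidence": frame["evidence"] + [datum]}
    return frame


initial = {"name": "routine shift", "expects": {"normal readings"}, "evidence": []}
final = sensemaking(initial, ["normal readings", "pressure spike"])
print(final["name"])  # -> "reframed on 'pressure spike'"
```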

Use of the Data/Frame model of sensemaking in the creation of computational simulations for machine systems to support human decision makers, utilizing key characteristics of computational cognition, including ontology representation, network theory, and reasoning processes with recursive feedback.

 

Sources:

  • Kodagoda, N., Pontis, S., Simmie, D., Attfield, S., Wong, B. W., Blandford, A., & Hankin, C. (2017). Using machine learning to infer reasoning provenance from user interaction log data: Based on the data/frame theory of sensemaking. Journal of Cognitive Engineering and Decision Making, 11(1), 23-41.
  • Codjoe, E. A. A., Ntuen, C., & Chenou, J. (2010). A case study in sensemaking using Data/Frame Model. In IIE Annual Conference. Proceedings (p. 1). Institute of Industrial and Systems Engineers (IISE).

Intertwined concepts, both revolving around good managers' ability to change goals as they go, based on discoveries, and to learn more about those goals even as they pursue them.

 

Sources:

  • Klein, G. (2007). Flexecution as a paradigm for replanning, part 1. IEEE Intelligent Systems, 22(5), 79-83.
  • Klein, G. A. (2011). Streetlights and shadows: Searching for the keys to adaptive decision making. MIT Press.

This model describes three paths that can lead people to having insights: contradictions, connections, and creative desperation.

 

Sources:

  • Klein, G. (2013). Seeing what others don’t: The remarkable ways we gain insights. PublicAffairs.

Three models of how explanations are formed when a person tries to explain the reasons for the decisions, actions, or workings of a device to another person: local explaining, global explaining, and self-explaining.

 

Sources:

  • Klein, G., Hoffman, R., & Mueller, S. (2019). Naturalistic psychological model of explanatory reasoning: How people explain things to others and to themselves. International Conference on Naturalistic Decision Making, San Francisco, California.

A model of sensemaking in which a person who becomes aware of a problem, or of an unexpected or undesirable direction in a situation, begins to reconceptualize the situation they are in.

 

Sources:

  • Klein, G., Pliske, R. M., Crandall, B., & Woods, D. (2005). Problem detection. Cognition, Technology, and Work, 7, 14-28.
