Accelerating Workforce Training: Introducing the CTA in E/Affect Initiative

CTA in Effect: Case studies demonstrating the benefits of Cognitive Task Analysis

Assessing the Proficiency of Supplemental Nutrition Assistance Program (SNAP) Case Workers


Primary Submitter:

David Feldon


Supplemental Nutrition Assistance Program (SNAP) Case Workers in Social Services

Generic description of sponsoring organization or customer:

A cabinet agency whose State Director is appointed by the Governor with the advice and consent of the Senate. Among many other services, the agency administers food assistance programs in every county across the state. The Supplemental Nutrition Assistance Program (SNAP) is a federal benefit that assists low-income citizens in purchasing food. Caseworkers meet with applicants, help them complete applications, and then evaluate those applications to ensure accuracy and compute appropriate SNAP benefit levels.

Cognitive Task Analysis Method(s):

Concepts, Processes, & Procedures (CPP; Clark et al., 2008; Clark, 2006, 2014) and PARI (Precursor, Action, Result, Interpretation; Hall et al., 1995)

Number of Participants:

Total Number = 23; Total Number of Proficient Performers = 23

Method for determining proficiency:

DSS-administered standard evaluation case administered as a proficiency assessment at the conclusion of the case worker provisional training period.


1.5 calendar months


Findings shared
A copy of the CTA-derived branching decision tree, with criteria for each decision, was provided to the sponsor, along with updated training materials that incorporated the decision tree and other CTA-derived content. An evaluation report was also provided to the sponsor; it assessed participant performance on the end-of-training exam and analyzed quality assurance (QA) audits of participants’ work products over the following three months. Each of these outcomes was compared against those of participants who completed the conventional training used before CTA content was incorporated into the materials.
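The decision tree itself is not reproduced in this write-up, but the general structure such a CTA-derived artifact captures can be sketched as follows. The node layout is a generic branching decision tree; every criterion, threshold, and outcome string below is hypothetical and merely stands in for the agency's actual SNAP decision rules, which are not published here.

```python
# Sketch of a branching decision tree with explicit criteria at each node.
# All criteria, thresholds, and outcomes are HYPOTHETICAL illustrations --
# they are not the agency's actual SNAP decision rules.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Node:
    criterion: str                                   # human-readable decision criterion
    test: Optional[Callable[[dict], bool]] = None    # applies the criterion to a case
    yes: Optional["Node"] = None                     # branch when the test passes
    no: Optional["Node"] = None                      # branch when the test fails
    outcome: Optional[str] = None                    # leaf nodes carry the decision

def decide(node: Node, case: dict) -> str:
    """Walk the tree, applying each criterion until a leaf outcome is reached."""
    if node.outcome is not None:
        return node.outcome
    branch = node.yes if node.test(case) else node.no
    return decide(branch, case)

# Hypothetical two-level tree: verify identity, then screen gross income.
tree = Node(
    criterion="Applicant identity documents verified?",
    test=lambda c: c["identity_verified"],
    yes=Node(
        criterion="Gross income at or below screening threshold?",
        test=lambda c: c["gross_income"] <= c["income_threshold"],
        yes=Node(criterion="", outcome="proceed to benefit computation"),
        no=Node(criterion="", outcome="deny: income above limit"),
    ),
    no=Node(criterion="", outcome="request additional documentation"),
)

print(decide(tree, {"identity_verified": True,
                    "gross_income": 1500, "income_threshold": 2000}))
# -> proceed to benefit computation
```

Encoding the criteria as explicit, inspectable nodes is what lets the tree double as both a training aid and a job aid: each branch names the question a caseworker must answer before moving on.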

Instructional and/or training experience
The CTA team was provided with editable electronic versions of the agency’s standardized training, which the team revised to include CTA-derived content and decision rules. Digital versions of the revised materials were provided to the sponsor and to the sponsor’s training delivery team for use.

Demonstration of value

Evidence of value
The sponsoring organization offered the training experience to 23 employees who had been hired within the previous six months. Following the training, the employees completed course evaluation surveys and a post-test of content knowledge. Passive QA audits were then conducted on both the CTA training group and the control group to examine the likelihood of detectable errors over time. The control group had completed its training six weeks before the CTA group, so the groups’ timing and professional experience were equivalent at the point of comparison.

Key findings include:

a. The CTA-based training required six fewer hours than the control version.

b. The CTA-based training cohort performed significantly better on the post-training assessment (F = 2.6, p = .044).

c. Survival and hazard analyses were used to compare QA audit results as cumulative odds of errors over three months. Survival analysis estimates the probability of remaining error-free over time; hazard analysis estimates the log cumulative hazard of making an error over time. The CTA-trained group was significantly less likely than the control group to make errors over time, resulting in fewer cumulative errors during the audit period (χ2 = 21.34, p < .001).
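The survival idea behind finding (c) can be illustrated with a minimal Kaplan-Meier product-limit sketch: the estimated probability that a caseworker remains error-free through each audited week. All data below are invented for illustration; the study's actual audit data and its chi-square comparison are not reproduced here.

```python
# Minimal Kaplan-Meier product-limit estimator for "time to first error".
# Illustrative only -- the numbers below are made up, not the study's data.

def kaplan_meier(durations, observed):
    """Estimate S(t), the probability of remaining error-free through time t.

    durations: time (e.g. weeks) until a first error, or until observation ended
    observed:  True if an error occurred at that time, False if censored
    Returns a list of (time, survival probability) at each error time.
    """
    events = sorted(zip(durations, observed))   # ties on time are contiguous
    n_at_risk = len(events)
    survival = 1.0
    curve = []
    i = 0
    while i < len(events):
        t = events[i][0]
        at_t = [o for d, o in events if d == t]
        errors = sum(at_t)                      # errors observed at time t
        if errors:
            survival *= 1 - errors / n_at_risk  # product-limit step
            curve.append((t, survival))
        n_at_risk -= len(at_t)                  # drop everyone seen at time t
        i += len(at_t)
    return curve

# Hypothetical audit data: weeks until a first detected error
# (False = no error observed before the audit window closed).
weeks     = [4, 6, 6, 12, 12, 12]
had_error = [True, True, False, True, False, False]
print(kaplan_meier(weeks, had_error))
```

A steeper drop in one group's curve than the other's is the visual signature of the between-group difference the chi-square statistic summarizes.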

Customer-provided perspective
Accountability, Data, and Research Director said: “I appreciate all that you have done for (the agency) – we have truly learned a great deal and have benefited from your work. Just so, you know – your CTA modules are still in place and being used. I spoke with __ and she continues to be enthusiastic on this method. You have a convert.”


References

Clark, R. E. (2006). Not knowing what we don’t know: Reframing the importance of automated knowledge for educational research. In G. Clarebout & J. Elen (Eds.), Avoiding simplicity, confronting complexity: Advances in studying and designing powerful learning environments (pp. 3-15). Rotterdam: Sense Publishers.

Clark, R. E. (2014). Cognitive task analysis for expert-based instruction in healthcare. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of Research on Educational Communications and Technology, 4th Edition (pp. 541-551).

Clark, R. E., Feldon, D., van Merriënboer, J. J. G., Yates, K., & Early, S. (2008). Cognitive task analysis. In J. M. Spector, M. D. Merrill, J. J. G. van Merriënboer, & M. P. Driscoll (Eds.). Handbook of research on educational communications and technology (3rd ed.) (pp. 577-593). New York: Macmillan/Gale.

Hall, E. M., Gott, S., & Pokorny, R. A. (1995). A procedural guide to cognitive task analysis: The PARI methodology. Brooks Air Force Base, TX: Manpower and Personnel Division, U.S. Air Force.