Running head: COGNITIVE TASK ANALYSIS


USING INCREMENTAL COGNITIVE TASK ANALYSIS TO CAPTURE EXPERT
INSTRUCTION IN EXPOSITORY WRITING FOR SECONDARY STUDENTS


By


Nicolas Lim







________________________________________________________________________




A Dissertation Presented to the  
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the  
Requirements for the Degree
DOCTOR OF EDUCATION



May 2015










Copyright 2015 Nicolas Lim
Dedication
I wholeheartedly dedicate this dissertation to my family. Isabelle and Ryan, I
know that you are both too young to understand what daddy was doing and why he was
not home very much, but I want you both to know that you two were my greatest
motivation throughout this difficult process, and there is absolutely nothing else that I am
looking forward to more than just being more present in your lives. Annie, this
dissertation is as much yours as it is mine. Without your support every step of the way, it
would have been impossible for me to complete any of this.  















Acknowledgements
First, I would like to acknowledge my dissertation chair, Dr. Kenneth Yates, for
inspiring me to study cognitive task analysis. I will always be amazed at how you always
treated us in your thematic group as though we were colleagues and not just your
students. Your enthusiasm, patience, and encouragement not only paved the way to
obtaining my doctorate, but you motivated me to be a better teacher for my students. The
dissertation process was the most challenging task I have ever undertaken in my
academic career, and yet you managed to also make it one of the most enjoyable, and for
that I cannot thank you enough. I also would like to acknowledge my dissertation
committee members, Dr. Anthony Maddox and Dr. Paula Carbone, for your generous
support of this research study. Knowing that I had two instructors of your caliber as part
of this process allows me to feel a greater sense of pride in my work.  
I would also like to acknowledge all of the amazing people I had the privilege to
call friends as a result of this program. First, to my CTA thematic group (Megan, Milo,
Chris, Muteti, Doug, Judith, Kari, Deidre, Charlotte, Chad), I could not have asked for a
kinder and more intelligent group of professionals to work alongside for the past two
years. I would also like to acknowledge the best cohort ever (OC Cohort 2012). If it were
not for you all, I probably would have dropped out of this program after the second week.
I would especially like to acknowledge my research partner and trusted friend, Milo Jury.
I consider myself extremely fortunate to have had you by my side during this entire
process. And to Megan McGuinness, you were my closest friend from day one. This
program would not have been the same without you.  

Table of Contents
Dedication          2

Acknowledgements         3

List of Tables          7

List of Figures          8

List of Abbreviations         9

Abstract          10

Chapter One: Overview of the Study       11
Statement of the Problem       11
Purpose of the Study        14
Methodology of the Study       15
Definition of Domain Terms       16
Definition of CTA Terms       18
Organization of the Study       19

Chapter Two: Review of the Literature      20
Expository Writing        22
 Definition of Expository Writing     22
 The Connections between Expository Writing and Academic  
Success        23
Students’ Underachievement in Expository Writing   25
 Teachers’ Influence on Student Achievement   26
The Need to Improve Teacher Preparation Programs and  
Professional Development      27
The Need for Expertise in Writing Instruction   28
Summary         31
Knowledge Types        32
 Declarative Knowledge      32
 Procedural Knowledge      33
 Conditional Knowledge      33
Automaticity         34
Expertise         35
 Characteristics of Experts      35
 Building Expertise       37
 Consequences of Expertise      39
 Expert Omissions       40
Cognitive Task Analysis       41
 Definition of CTA       41
 CTA History        41
 CTA Methodology       44
 Effectiveness of CTA       46
 Benefits of CTA for Instruction     47
Summary         48

Chapter Three: Methods        50
Participants         51
Data Collection for Question 1      54
 Phase 1: Collect Primary Knowledge     55
 Phase 2: Identify Knowledge Types     55
 Phase 3: Apply Knowledge Elicitation Techniques   56
  Instrumentation      56
  Interviews       57
 Phase 4: Data Analysis      58
  Coding        58
  Inter-rater Reliability      58
  Subject Matter Expert Protocol and Verification  59
 Phase 5: Formatting the Results     59
  Gold Standard Protocol     59
  Summary       60
Data Collection for Question 2      60
 Spreadsheet Analysis       60
Data Collection for Question 3      61

Chapter Four: Results         62
Overview of Results        62
Research Questions        62
 Question 1        62
  Inter-rater Reliability      62
  Flowchart Analysis      62
  Gold Standard Protocol     63
  Recalled Action and Decision Steps    65
  Action and Decision Steps Contributed by Each SME 65
  Action and Decision Steps Captured in Follow-up    
Interviews       67
  Alignment of SMEs in Describing the Same Action  
and Decision Steps      68
 Question 2        70
  Total Knowledge Omissions     70
  Analysis of Action and Decision Step Omissions  72
 Question 3        73
  Total Time       73
  Total Cost       74
  Knowledge Elicitation     75
  Expert Contribution of Action and Decision Steps  76
  Incremental (1i) Method Analysis    78
  Individual (3i) Method Analysis    79
Summary         79

Chapter Five: Discussion        81
Overview of Study        81
Process of Conducting Cognitive Task Analysis    82
 Selection of Experts       82
 Collection of Data       84
Discussion of Findings       87
 Question 1        87
  Action Steps Versus Decision Steps    87
 Question 2        89
 Question 3        90
Limitations         96
 Confirmation Bias       97
 Internal Validity       97
 External Validity       97  
Implications         98
Future Research        101
Conclusion         103

References          105

Appendix A: Cognitive Task Analysis Interview Protocol    120

Appendix B: Inter-rater Reliability Code Sheet     123

Appendix C: Job Aid for Developing a Gold Standard Protocol   124  

Appendix D: SME A Initial Individual Protocol Flowchart    126

Appendix E: Gold Standard Protocol       147

Appendix F: Incremental Coding Spreadsheets     159






List of Tables
Table 1: SMEs Interviewed for Both the Current Study and the                    
Concurrent Study (Jury, 2015) 53
Table 2:  Example of Incremental (1i) Process toward the Preliminary                  
Gold Standard Protocol (PGSP) 64    
Table 3:  Cumulative Action and Decision Steps Captured for Each                      
SME in the Initial Individual Protocols 66
Table 4:  Additional Expert Knowledge Captured, in Action and                      
Decision Steps, During Follow-up Interviews 68
Table 5:  Number and Percentage of Action and Decision Steps that                        
are in Complete Alignment, Substantial Alignment, Partial          
Alignment, and No Alignment 69
Table 6:  Total Action and Decision Steps, or Expert Knowledge,                
Omissions by SMEs when Compared to the Gold Standard                
Protocol 71
Table 7:  Comparison of Total Time Spent Doing the Incremental (1i)              
Method and the Individual (3i) Method of CTA 73
Table 8:  Comparison of Total Cost Doing the Incremental (1i) Method                  
and the Individual (3i) Method of CTA 74
Table 9:  Comparison of Overall Action and Decision Steps Between                      
the Incremental (1i) Method and the Individual (3i) Method                    
Gold Standard Protocols 75








List of Figures
Figure 1: A timeline representing SME selection to either the incremental                
(1i) method or the individual (3i) method 54
Figure 2:  A step-by-step description of the incremental (1i) method to                  
reach the gold standard protocol (GSP) 60
Figure 3:  Number of action steps, decision steps, and action and decision              
steps for SME A, SME B, SME C, and SME D captured through            
CTA 67
Figure 4:  Number and percentage of action and decision steps that are in        
complete alignment, substantial alignment, partial alignment,                    
and no alignment 70
Figure 5:  Total SME knowledge omissions when compared to the gold            
standard protocol 72
Figure 6:  Total expert knowledge recall for the incremental (1i) method                  
and the individual (3i) method 76













List of Abbreviations
1i  1i+3r (1 Independent Interview + 3 Reviews)
3i  3i+3r (3 Independent Interviews + 3 Reviews)
BTA  Behavioral Task Analysis
CDE  California Department of Education
CCSS  Common Core State Standards
CTA  Cognitive Task Analysis
CDM  Critical Decision Method
CPP  Concepts, processes, and principles
EAP  Early Assessment Program  
ELA  English Language Arts
ELL  English Language Learner
ERWC  Expository Reading and Writing Course
GSP  Gold Standard Protocol
IRR  Inter-rater Reliability
NAEP  National Assessment of Educational Progress
NCES  National Center for Educational Statistics
NCLB  No Child Left Behind
PARI  Precursor, Action, Result, and Interpretation
PD  Professional Development
PGSP  Preliminary Gold Standard Protocol
SME  Subject Matter Expert
TPP  Teacher Preparation Programs
Abstract
Cognitive Task Analysis (CTA) methods were used to capture the knowledge and skills,
represented by action and decision steps, of expert English teachers when they recall and
describe how they provide expository writing instruction at the eleventh-grade level.
Three semi-structured CTA interviews were conducted to elicit and capture the
declarative and procedural knowledge in the form of action and decision steps. The data
was then coded, analyzed, and aggregated into a gold standard protocol (GSP) that was
verified by a fourth expert. This study also sought to identify and quantify the number
and percentage of expert knowledge and skills omissions when expert teachers recall how
they teach expository writing. The results confirmed previous studies that suggested
experts may omit up to 70% of critical information, and that 3-4 experts are optimal for
reversing the 70% omission effect of knowledge automaticity. Lastly, this study along
with a concurrent study (Jury, 2015) compared the efficiency of two methods of CTA.  
Both studies operationalized efficiency in terms of which knowledge elicitation method
captures more action and decision steps from the experts for less cost and time. The
results of the comparison yielded rich data; however, there was no definitive answer as to
which method is more efficient. The expert knowledge and skills captured by CTA
methods may be used to train pre-service and in-service teachers in performing the
complex task of instructing eleventh-grade students to write expository essays.





CHAPTER ONE: OVERVIEW OF THE STUDY
Statement of the Problem
According to the Common Core State Standards Initiative (CCSS; 2010), high
school students are expected to acquire the necessary cognitive skills to successfully
demonstrate expository writing ability prior to entering their post-secondary education.  
The general purpose of expository writing, as stated by the California Department of
Education (CDE; 2008), is to explain or clarify an idea. Roughly synonymous with the
more commonly used label of argument, expository writing typically involves providing
evidence in support of the student’s position on a particular topic (Beck, Llosa, &
Fredrick, 2013; Chandrasegaran, 2013). According to Beck et al. (2013), expository
writing is arguably the most significant writing genre for students’ academic success, as
well as central to advanced academic literacy.  
The importance of expository writing should not be overlooked since it is the
writing genre most often required of students on high-stakes assessments such as the
ACT and SAT (Beck et al., 2013). For instance, the SAT, a college entrance exam used
in the admission process at nearly all four-year colleges and universities in the United
States (College Board, 2013), introduced a writing section to its general test in 2005 that
required all participating students to demonstrate expository writing skills, which counts
for approximately 30% of the overall score. In California, the demand for expository
writing skills continues to grow with exams such as the California High School Exit Exam
(CaHSEE) and the Early Assessment Program (EAP). For example, in 2004, the
California State University (CSU) system in partnership with the CDE and the State
Board of Education developed the EAP, a statewide assessment that seeks to identify
students at the eleventh-grade level who need additional support with their expository
reading and writing skills (Cardenas, n.d.). The CSU has gone as far as developing an
Expository Reading and Writing Course (ERWC) for students to take during their senior
year in order to become college-ready (Cardenas, n.d.).  
Adding to the challenge of successfully writing expository essays, students are
also expected to complete their essays within an allotted period of time. The added
component of a time limit works in conjunction with the CCSS Initiative that states
students as early as the 3rd grade are expected to routinely write over shorter time frames.
Shorter time frames for students to write a full-length expository essay differ with each
exam. For example, students who take the SAT have as little as 25 minutes, and up to 45
minutes when writing their essays for the EAP exam.  
Results from such high-stakes assessments paint a discouraging picture of
students’ writing achievement and call for greater attention to the improvement of
writing instruction for students (Beck et al., 2013). For instance, the ACT (2005) reports
that almost one third of high school graduates are unprepared to face the demands of
college-level writing (Beck et al., 2013). Similarly, the College Board, sponsor of the
SAT, stated that the latest scores show that roughly 6 in 10 college-bound high school
students who took the test were so lacking in their reading, writing, and math skills, they
were unprepared for college-level work (Sanchez, 2013). More precisely, the SAT
writing scores for the graduating class of 2011 dipped to 489, two points less than the
previous year (Banchero, 2011). More troubling is the data that indicates that writing
scores on the SAT have gone down almost every year since the revised exam was first
given in 2006 (Banchero, 2011). Additionally, the latest results of the EAP exam show
that 63% of eleventh-grade students in California are not ready for college English
(Cardenas, n.d.). The percentage of underprepared students jumps to 78% when it
includes students who were deemed only “conditionally-ready” for college English
(Cardenas, n.d.).  
Consequently, students leave high school decidedly underprepared for college,
especially in the area of writing (Fanetti, Bushrow, & DeWeese, 2010). In fact, in-depth
research performed with proficient adult writers has revealed important information about
the cognitive activity that underlies the act of writing (Graham & Perin, 2007). For
example, the National Writing Project model promotes the notion that writing is made up
of closely linked processes that operate simultaneously, as teachers model and guide
students through various writing strategies (Graham & Perin, 2007). Therefore, the
expertise of teachers who teach expository writing will play a vital role in improving the
instruction of expository writing, and in turn, student achievement. According to Smith
(2004), there seems to be general agreement among researchers that teachers’ expertise
relates to subject matter knowledge, knowing how to teach the subject matter to others
(didactic knowledge), knowledge about how children learn, feel and develop
(pedagogical knowledge), self-awareness and social skills, and organizational
competence. Students can ultimately benefit from such teacher expertise, especially if
they receive explicit scaffolding, constructed within expertly delivered instructional
conversations that address the language, knowledge, and strategies required for problem
solving in writing (Gibson, 2008).
Expertise, however, is acquired as a result of continuous and deliberate practice in
solving problems in a particular domain, and therefore becomes automated and
unconscious (Yates & Clark, 2011). In fact, a number of studies have reported that
experts may omit up to 70% of the critical decisions they make when describing a
complex task (Yates & Clark, 2011). For that reason, Cognitive Task Analysis (CTA)
may be useful to capture the unobserved and automated knowledge of subject-matter
experts (i.e. SMEs) to inform teacher preparation programs (TPP) and in-service
teachers’ professional development (PD) (Yates & Clark, 2011). Furthermore, there are
many ways of conducting CTA, and this study was conducted concurrently with
another (Jury, 2015) that compares two methods of CTA.
Purpose of the Study
The purpose of this study is threefold. The first is to conduct a CTA with ELA
teachers who are considered experts in expository writing instruction (i.e. Subject Matter
Experts or SMEs) in order to elicit the action and decision steps when they describe how
they instruct students in expository writing. In light of the number of the high-stakes
exams (i.e. SAT, ACT, EAP) that are emphasized during the students’ eleventh-grade
year, this study will interview SMEs who teach or have taught expository writing at the
eleventh-grade level. The second purpose of the study is to determine the number of
action and decision steps the teachers omit in their descriptions when compared to a gold
standard that is aggregated from all the teachers’ descriptions.  The third and final
purpose is to determine which method of CTA is more effective at capturing the
automated knowledge of SMEs – the 1i+3r incremental, in which one SME protocol (1i)
is reviewed by all three SMEs, or the 3i+3r individual, in which three separate SME
protocols (3i) are generated. More specifically, the goal is to determine whether the
incremental (1i) approach provides as many, if not more, action and decision steps from
the SMEs as the individual (3i) for less cost and time. The individual (3i) approach is
being conducted by Jury (2015).
As such, the following questions will be answered through this research study:
1. What are the action and decision steps that expert teachers recall when they
describe how they provide expository writing instruction to their eleventh-
grade students?
2. What percentage of action and/or decision steps, when compared to a gold
standard, do expert teachers omit when they describe how they provide
expository writing instruction to their eleventh-grade students?
3. Which method of CTA, the incremental (1i) or the individual (3i) (Jury,
2015), is more efficient, as represented by the number of action and decision
steps captured and by the cost and time required?
Methodology of the Study
CTA was used in this study to determine the knowledge and skills,
operationalized as the action and decision steps, of English teachers identified as experts
or SMEs in describing how they provide expository writing instruction to eleventh-grade
students from a school district in Southern California. Four SMEs were selected, three to
participate in interviews and the fourth to verify the data collected from the three SMEs
on expository writing instruction. CTA is a five-step process (Clark, Feldon, van
Merriënboer, Yates, & Early, 2008), as follows:
1) A preliminary phase to build general familiarity frequently called
“bootstrapping;”
2) The identification of declarative and procedural knowledge and any hierarchical
relationships in the application of these knowledge types;  
3) Knowledge elicitation through semi-structured interviews;
4) Data analysis involving coding, inter-rater reliability, and individual SME
protocol verification; and  
5) The development of a gold standard protocol that was used to analyze and
determine expert omissions and ultimately for use in the training of novice
teachers.
Definition of Domain Terms
Expository essay: a type of argument that asks students to take a position on a
specific topic or issue and support their position with evidence.
Argument: A formal argument emphasizes a line of reasoning that attempts to
prove by logic. When presenting an argument, the goal is to convince an audience of the
rightness of the claims being made using logical reasoning and relevant evidence.  
Introduction: The first paragraph of the essay. The overall purpose of an
introductory paragraph is to properly contextualize the essay’s topic or issue to help the
reader understand what is being written about and why. The introduction typically
includes a hook and a thesis.
Thesis or the primary claim: Typically a sentence that clearly conveys the
student’s position on the topic or issue of the essay. An assertion based on evidence of
some sort.
Supporting paragraphs or Body paragraphs: The portion of the essay where
students provide evidence in support of their thesis. Evidence can come in many forms,
including but not limited to: a syllogism, numerical data, personal observations, current
and historical events, and fictional and non-fictional literature. All of the following terms
below are commonly used by teachers when teaching their students to write an effective
supporting paragraph:  
Topic Sentence: The first sentence of each supporting paragraph. The topic
sentence typically indicates the argument that will be made for that particular paragraph.  
Concrete Detail or Evidence: A term that often refers to the specific evidence
students use to support their thesis. A concrete detail should not be debatable. For
example, if a student chooses to use iPhones as his example to make a point about
technology, the iPhone example is the concrete detail because iPhones do exist and
clearly qualify as a technological device.
Commentary or Warrant: The student’s explanation or rationale as to how the
concrete detail supports his thesis. It is opinion-based and explains how the evidence
supports the claim. A warrant may rest on a commonplace rule that people accept as
generally true, on laws, on scientific principles or studies, or on thoughtfully argued
definitions.
Closing sentence or Transition sentence: The last sentence of the supporting
paragraph. This sentence attempts to communicate to the reader that the argument
presented in that paragraph has now come to an end. Concurrently, the closing sentence
serves as a transition to the next paragraph.  
Conclusion: The final paragraph of the essay. The general expectation is that
students finalize their argument. Students may do so by addressing opposing views,
offering pertinent arguments that were outside the scope of the essay, exploring other
possible solutions or explanations, etc.  
Definition of CTA Terms
The following are definitions of terms related to CTA as suggested by Zepeda-
McZeal (2014).
Adaptive expertise: When experts can rapidly retrieve and accurately apply
appropriate knowledge and skills to solve problems in their fields of expertise; to possess
cognitive flexibility in evaluating and solving problems (Gott, Hall, Pokorny, Dibble, &
Glaser, 1993; Hatano & Inagaki, 2000).
Automaticity: An unconscious fluidity of task performance following sustained
and repeated execution; results in an automated mode of functioning (Anderson, 1996a;
Ericsson, 2004).
Automated knowledge: Knowledge about how to do something; operates outside
of conscious awareness due to repetition of task (Wheatley & Wegner, 2001).
Cognitive load: Simultaneous demands placed on working memory during
information processing that can present challenges to learners (Sweller, 1988).
Cognitive tasks: Tasks that require mental effort and engagement to perform
(Clark & Estes, 1996).
Cognitive task analysis: Knowledge elicitation techniques for extracting implicit
and explicit knowledge from multiple experts for use in instruction and instructional
design (Clark et al., 2008; Schraagen, Chipman, & Shalin, 2000).
Conditional knowledge: Knowledge about why and when to do something; a type
of procedural knowledge to facilitate the strategic application of declarative and
procedural knowledge to problem solve (Paris, Lipson, & Wixson, 1983).
Declarative knowledge: Knowledge about why or what something is; information
that is accessible in long-term memory and consciously observable in working memory
(Anderson, 1996a; Clark & Elen, 2006).
Expertise: The point at which an expert acquires knowledge and skills essential
for consistently superior performance and complex problem solving in a domain;
typically develops after a minimum of 10 years of deliberate practice or repeated
engagement in domain-specific tasks (Ericsson, 2004).
Procedural knowledge: Knowledge about how and when something occurs;
acquired through instruction or generated through repeated practice (Anderson, 1982;
Clark & Estes, 1996).
Subject matter expert: An individual with extensive experience in a domain who
can perform tasks rapidly and successfully; demonstrates consistent superior performance
or ability to solve complex problems (Clark et al., 2008).
Organization of the Study
Chapter Two reviews the literature in the specific writing genre of exposition,
particularly in its impact on academic success, and the relevant literature related to CTA
as a knowledge elicitation technique for subject matter expertise. Next, Chapter Three
specifically addresses the methods of this study and how the approach to the research
answers the research questions. Chapter Four then reviews the results of the study and
describes, in detail, the findings for each of the research questions. Lastly, Chapter Five
serves as a discussion of the findings, their implications for expository writing
instruction and CTA, limitations of the study, and implications for future research.

CHAPTER TWO: REVIEW OF THE LITERATURE
Several reports and research studies agree that writing is an important tool for
educational, occupational, and social success in the United States (College Board, 2006;
Graham & Perin, 2007; Kiuhara, Graham, & Hawken, 2009; NAEP, 2011; Troia &
Olinghouse, 2013). For example, in 2011, the National Assessment of Educational
Progress (NAEP) published results that revealed 80% or more of salaried employees in
corporations in nearly all industries and services have some responsibility for writing in
their professions. The National Center for Educational Statistics (NCES; 2012) also
suggested that given the necessity for proficient writing in today’s society and the ability
to use written language to communicate with others, the need for effective writing
instruction has become more relevant than ever.  
Likewise, the College Board established the National Commission on Writing in
America’s Schools and Colleges in 2002 and has since published multiple reports in an
effort to focus national attention on the teaching and learning of writing. In 2003, the
Commission issued a benchmark report, The Neglected “R”: The Need for a Writing
Revolution, which declared that “Writing today is not a frill for the few, but an essential
skill for the many” (College Board, 2003). One of the report’s many discoveries was that
more than 90% of midcareer professionals cited the “need to write effectively” as a skill
“of great importance” in their day-to-day work (College Board, 2003).  
A year later, the Commission surveyed 120 major American corporations
employing nearly 8 million people in the United States and published another report,
Writing: A Ticket to Work…Or a Ticket Out, which discovered that two-thirds of salaried
(i.e. professional) employees in large American companies have some writing
responsibility (College Board, 2004). Writing thus acts as a “threshold skill” in such a
way that people who cannot write and communicate clearly will not be hired and are
unlikely to last long enough to be considered for promotion (College Board, 2004).
Corporate responses essentially left little doubt that writing is a ticket to professional
opportunity, while poorly written applications are a figurative “kiss of death” for job
seekers (College Board, 2004).  
Expanding on those earlier reports, the Commission published its third report,
Writing: A Powerful Message from State Government, which surveyed human resource
directors in 49 out of the 50 states. Survey results showed that for the nearly 2.7 million
state employees in the nation, writing is considered an important job requirement more
than ever before (College Board, 2005). The report evinced that writing is how students
connect the dots in their learning, how graduates connect the dots in their careers in the
private sector, and how public servants connect with themselves and their constituents
(College Board, 2005). Thus, the Commission arrived at the bold conclusion that
“Without writing, government would not function long, or well” (College Board, 2005).  
In 2006, the Commission published yet another report, Writing and School
Reform, which summarized what the Commission heard during a yearlong seminar that
consisted of five hearings held in different regions of the United States in 2004. At the
five hearings, teachers, administrators, university faculty, and academic leaders,
including experts from school and campus writing programs, commented on the
challenges of expanding and improving writing instruction (College Board, 2006). It was
very clear during the hearings that the tradition of analytic and expository writing
continues to be highly valued by academics, and also happens to be the writing genre
most heavily emphasized in K-12 education, especially at the secondary level (College
Board, 2006).
Expository Writing
Definition of Expository Writing
Expository is a broad term that encompasses numerous text structures that include
definition, description, process, classification, comparison, analysis, and persuasion
(Piccolo, 1987; Wilder & Mongillo, 2007), but for the purpose of this study, expository
will be regarded as a genre of writing that asks students to present a point of view and
support it with examples and evidence (Graham & Perin, 2007; Schleppegrell, 2004). In
other words, students who write expository essays are expected to provide their rationale
using evidence to substantiate their claims (i.e. thesis statements) (Beck, et al., 2013;
Chandrasegaran, 2013; Schleppegrell, 2004). Additionally, expository writing is
commonly topic oriented, focuses on concepts and issues, and expresses the unfolding of
ideas, claims, and arguments in terms of the logical interrelations among them (Berman
& Nir-Sagiv, 2007).  
Also worth noting is that exposition is closely related to what is commonly
referred to as an argument, since the nature of an argument is to support a position with
evidence (Beck, et al., 2013; Chandrasegaran, 2013). This overlap may lead to some
confusion because the persuasive essay, another popular writing genre taught at the
secondary level, is also described in terms of an argument. The two should not be
confused, however. To clarify, a formal argument
emphasizes a line of reasoning that attempts to prove its point solely on the merits of
logic (Hillocks, 2010). The goal when presenting an argument is to convince an audience
of the rightness of the claims being made using logical reasoning and relevant evidence
(Hillocks, 2010). Persuasive writing, on the other hand, focuses more on swaying its
audience (Hillocks, 2010). Therefore, in a persuasive essay, one is expected to select the
most favorable evidence, appeal to emotions, and use a particular style that seeks to
connect with the reader (Hillocks, 2010).  
The Connections between Expository Writing and Academic Success
Grounded in the premise put forth by the College Board (2003) that “if students
are to learn, they must write” (p. 9), there is a growing trend within academic contexts
to use writing proficiency to determine graduation eligibility and to make decisions
regarding grade retention and promotion (Troia & Olinghouse, 2013). In
particular, expository writing or building an argument is at the heart of critical thinking
and academic discourse, as well as the kind of writing students need to know for success
in college (Hillocks, 2010). Expository writing is also considered to be central to
advanced academic literacy as it often serves as a key gatekeeping milestone as students
move from one level of schooling to the next (Graham & Perin, 2007; Schleppegrell,
2004). For instance, Graham and Perin (2007) discovered that 60% of writing
assignments in 4th grade, 65% in 8th grade, and 75% in 12th grade are expository in
nature. The results of the study showed that students are increasingly expected to write
expository essays as they move through K-12 and into higher education (Graham &
Perin, 2007).  
Furthermore, expository essays have become symbolic of students’ success with
language at school and often serve as an evaluation metric for acceptance at colleges or
universities and placement in a writing program (Schleppegrell, 2004). In fact, directors
of admissions, deans, provosts, and university presidents have all spoken of how highly
they regard and value the tradition of analytic and expository writing (College
Board, 2006). In postsecondary education, universities use writing to evaluate applicants’
qualifications, and proficient writing is expected for completion of a college degree
(Troia & Olinghouse, 2013). Not surprisingly, expository writing is arguably one of the
most important tasks students face and the most significant for students’ academic
success (Beck, et al., 2013).  
Additionally, writing appears to be crucial for students’ success on high-stakes
achievement tests that have become vital in school reform efforts in the United States
(Troia & Olinghouse, 2013). The most current and relevant example can be seen in the
new CCSS. Unlike previous reforms such as No Child Left Behind (NCLB) where
writing was all but absent (Kiuhara, et al., 2009), the CCSS pay explicit attention to
students’ writing skills (Graham, Gillespie, & McKeown, 2013; McQuitty, 2012),
specifically on writing to sources by using evidence from texts to present careful
analyses, well-defended claims, and clear information, all of which are hallmarks of
expository writing (Coleman, Pimentel, & Zimba, 2012). Therefore, in order to meet
CCSS writing benchmarks, teachers and schools must place greater emphasis on learning
how to write and how to use expository text, especially persuasive and informational
texts, to promote learning within and across disciplines for a variety of purposes and
audiences (Graham, et al., 2013). As a result, increased attention to writing means it will
join reading and math as a highly scrutinized and rigorously tested subject area
(McQuitty, 2012). Furthermore, exposition is typically the required genre in extended
writing tasks on high-stakes exit-level assessments of writing and on assessments of
students’ readiness for college-level writing, such as the ACT and SAT (Beck, et al.,
2013). Those who know the needs of college writers and who are familiar with the new
ACT and SAT writing samples know that persuasive writing will not suffice; for college
and career success, one needs to know how to make an effective case and make a good
argument (Hillocks, 2010).  
Students’ Underachievement in Expository Writing
Unfortunately, the statistics on adolescents’ writing achievement are discouraging
(Beck, et al., 2013), and many students exit high school without the writing skills needed
for success in college or work (Kiuhara, et al., 2009). For instance, students meet
proficiency on the NAEP writing framework when they are able to produce an effectively
organized and fully developed expository response (i.e. include details that support and
develop the main idea of the piece) within an allotted period of time (i.e. approximately
50 minutes), and demonstrate analytical, evaluative, or creative thinking (Applebee &
Langer, 2009). In 2007, only 23% of twelfth-grade students demonstrated such writing
proficiency (Applebee & Langer, 2009). Other reports shared similar results. For
example, the ACT (2007) reported that nearly one-third of high school graduates are
unprepared to face the demands of college-level writing (Troia & Olinghouse, 2013), and
the NCES (2012) revealed that only 24% of students at both grades 8 and 12 performed at
the proficient level in writing in 2011 (Beck, et al., 2013). Not surprisingly, college
instructors have reported that 50% of high school graduates are not prepared for college
writing (Kiuhara, et al., 2009).
Further still, the California State University (CSU) administers the Early
Assessment Program (EAP) exam every year to all eleventh-grade students in California
in order to identify those students who do not possess the basic English skills to be
successful in a first-year English course. The exam is composed of a multiple-choice
section, as well as an in-class expository essay that students must complete within 45
minutes. Results of the EAP exam revealed that of the 384,722 eleventh-grade students
within California who participated in 2013, only 23% (i.e. 88,486) demonstrated English
proficiency. Additional data from the CSU released in 2013 showed that only 32% of
incoming freshmen demonstrated proficiency on the essay subtest of the English
Placement Test (EPT). Although the data indicate a slight improvement as students enter
their first year of college, the majority of the incoming students are still in need of writing
remediation.  
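The reported EAP proficiency rate can be checked directly against the participation counts cited above; a minimal arithmetic sketch (Python is used here purely for illustration):

```python
# 2013 EAP figures cited above for California eleventh-graders.
participants = 384_722  # eleventh-grade students who took the exam
proficient = 88_486     # students who demonstrated English proficiency

rate = proficient / participants
print(f"{rate:.1%}")  # prints 23.0%, matching the reported 23%
```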
Teachers’ Influence on Student Achievement
If there is to be any real chance of improving the writing abilities of students,
teachers will have to play an essential role because how well students write is largely
influenced by how they are taught to write (Kiuhara, et al., 2009). The research has
shown that highly qualified teachers have the greatest positive impact on student
achievement (Darling-Hammond, 2000; Hanushek, 1992). For instance, the
difference in annual achievement growth between having a good teacher and a bad one
can be more than one grade-level equivalent in test performance (Hanushek,
1992). Additionally, students who are assigned to several ineffective teachers in a row
perform significantly lower than those who are assigned to several highly effective
teachers in sequence (Darling-Hammond, 2000; Hanushek, 1992). Furthermore, when
aggregated at the state level, teacher quality variables appear to be more strongly related
to student achievement than class sizes, overall spending levels, teacher salaries, or such
factors as the statewide proportion of staff who are teachers (Darling-Hammond, 2000).
Considering the significant role teachers play in student learning, they must be properly
trained in order to provide effective writing instruction that is rich in content and
meaningful to their students (Darling-Hammond, 2004; Kiuhara, et al., 2009; McQuitty,
2012; Sternberg & Horvath, 1995).  
The Need to Improve Teacher Preparation Programs and Professional Development
Responsibility to provide such preparation rests in part with college teacher
preparation programs (TPP), the efforts of school districts to provide in-service teachers
professional development (PD), and the efforts of individual teachers to obtain needed
information (Kiuhara, et al., 2009). A reason for concern, however, is that Applebee
(1981) recognized over 30 years ago that TPPs did not stress writing instruction with
their pre-service teachers, a concern that continues today. More recently, the research
has found that despite the growing number of TPPs, there is very little consistency among
them in the amount of required writing (Chambless & Bass, 1995; Saphier, 2011).
In fact, teachers in the field have reported that their TPPs did not prepare them in
the use of instructional practices associated with the writing process (Brimi, 2012;
Chambless & Bass, 1995). For instance, 71% of all teachers indicated that they received
minimal to no preparation to teach writing during their pre-service preparation, and 44%
continued to report the same low level of preparation following their in-service PDs
(Kiuhara, et al., 2009). More specifically, when asked about their preparation to teach
writing within their content area or discipline, 61% of language arts teachers still
indicated that they received minimal to no preparation (Kiuhara, et al., 2009).
Compounding the lack of writing instruction is the finding that state department
requirements for certification in teaching writing are minimal (Chambless & Bass, 1995).  
In addition to TPPs, greater efforts should be given to the PDs of teachers who
will be expected to guide students’ writing (Chambless & Bass, 1995; Street, 2003).
Kiuhara et al. (2009) discovered that although most language arts teachers indicated that
their in-service training to teach writing was adequate, one in four did not agree.
Applebee and Langer (2009) also discovered that ELA teachers, by and large, are aware
of the potential usefulness of standards and respond positively to PD experiences that
help them support their students’ reading and writing processes. However, such learning
experiences were not made available to 20%-30% of the teachers surveyed, and the
extent and usefulness of the experiences that were provided is unclear (Applebee &
Langer, 2009). As teachers have indicated, it is not uncommon that they learned more
about teaching writing only once they were faced with the task, through interaction with
colleagues, from books, or through their own trial and error in the classroom (Brimi,
2012).
The Need for Expertise in Writing Instruction
John Dewey addressed this issue nearly 100 years ago when he acknowledged
that the observation of expert teachers is a necessary component of teacher education
(Ethell & McMeniman, 2000). However, Dewey cautioned that observation must focus
on the cognitions underlying the observable teaching practice, because observation alone
has a danger of focusing only on visible classroom behaviors, and pre-service teachers
may limit their practices to mere imitation that is devoid of insight and initiative (Ethell
& McMeniman, 2000).  
Historically, observation has played a significant role in the learning and
acquisition of new skills (Collins, Brown, & Newman, 1988). More precisely, before
formal schooling in America and other industrialized nations was fully realized,
apprenticeships were the most common means of learning used to transmit the knowledge
required for expert practices in fields from painting and sculpting to medicine and law
(Collins, et al., 1988). Typically within traditional apprenticeships, the expert shows the
apprentice how to do a task, watches as the apprentice practices portions of the task, and
then turns over more and more responsibility until the apprentice is proficient enough to
accomplish the task independently (Collins, et al., 1988). Observation thus plays a key
role in that it aids learners in developing a conceptual model of the target task or process
prior to attempting to execute it, and having a conceptual model is an important factor in
apprenticeship’s success in teaching complex skills without resorting to lengthy practice
of isolated sub-skills (Collins, et al., 1988).  
As a result, Collins et al. (1988) began to reimagine aspects of a traditional
apprenticeship within the context of learning in school, and proposed the notion of
“cognitive apprenticeship.” On one hand, cognitive apprenticeship emphasizes methods
that are aimed primarily at teaching the processes that experts use to handle complex
tasks (Collins, et al., 1988). In other words, where conceptual and factual knowledge is
addressed, cognitive apprenticeship emphasizes its uses in solving problems and carrying
out tasks (Collins, et al., 1988). On the other hand, cognitive apprenticeship also refers to
the fact that the focus of the learning-through-guided-experience is on cognitive and
metacognitive, rather than on physical, skills and processes (Collins, et al., 1988).
Whereas physical skills and processes lend themselves to observation, cognitive and
metacognitive skills and processes are by their nature tacit and therefore
unobservable (Collins, et al., 1988). Thus, cognitive apprenticeship teaching methods are
designed, among other things, to bring these tacit processes into the open, where students
can observe, enact, and practice them with help from the teacher and from other students
(Collins, et al., 1988).  
Examining a successful model of cognitive apprenticeship, Collins et al. (1988)
cite Palincsar and Brown’s (1984) Reciprocal Teaching of reading, which exemplifies
many of the features of cognitive apprenticeship and demonstrated remarkable effects in
raising students’ scores on reading comprehension tests, especially those of poor readers.
As the name suggests, the teacher and students take turns playing the role of teacher
(Collins, et al., 1988). After reading a passage, the teacher begins by modeling specific
reading strategies and skills, then coaches the students to perform those reading strategies
and skills, and eventually fades away allowing the student to practice and play the role of
teacher (Collins, et al., 1988). In a pilot study that used Reciprocal Teaching with
individual students who were poor readers, the method raised subjects’ reading
comprehension test scores from 15% to 85% accuracy after about 20 training sessions
(Collins, et al., 1988). Six months later the students were still at 60% accuracy,
recovering to 85% after only one session (Collins, et al., 1988). In a subsequent study
with groups of two students, the scores increased from about 30% to 80% accuracy, with
very little change eight weeks later (Collins, et al., 1988). Furthermore, in classroom
studies with groups of four to seven students, test scores increased from about 40% to
80% correct, again with only a slight decline eight weeks later (Collins, et al., 1988).  
Ethell and McMeniman (2000) also conducted a study where they adopted a
teaching approach derived from cognitive apprenticeship to provide novice teachers with
access to the thinking underlying the practice of one expert teacher. Specifically, they
used video recordings of the classroom practice and related stimulated-recall interviews,
which allowed the expert teacher to reflect on the thinking underlying his classroom
practice and make explicit the typical tacit cognitive and metacognitive processes that
guided his teaching practice (Ethell & McMeniman, 2000). The goal of the study was to
reconcile the conflict that often occurs between the theory pre-service teachers receive at
the university and the on-the-job practicum they experience during student teaching
(Ethell & McMeniman, 2000). The results showed that the pre-service teachers gained
access to the cognitive world of the expert practitioner, and revealed how expert
practitioners make sense of the day-to-day complexity of classroom practice through
drawing on their professional knowledge of teaching and learning (Ethell & McMeniman,
2000). Pre-service teachers’ reactions to the intervention demonstrated that although they
had previously had the opportunity to observe expert teachers, both in university and
school classrooms, they had not been able to identify or articulate the intentions
underlying teachers’ practice (Ethell & McMeniman, 2000).  
Summary
Writing is an essential skill for advancement in both college and career. The most
popular form of writing taught at the secondary level is exposition, which asks students
to present a point of view and support it with examples and evidence. Research has
revealed, however, that at least half of secondary students tested do not demonstrate
proficiency in this skill. Research has
also shown that teachers are the most influential factor in students’ achievement, thus it is
natural to believe that teachers will need to play a central role in improving students’
expository writing skills. Unfortunately, significant numbers of in-service teachers admit
that they were not properly or sufficiently taught to instruct students in writing during
their TPPs and in the PD they have attended. In response, research suggests that
educators can greatly benefit from capturing the knowledge of experts and using that expertise in
the training of novice teachers. As seen in the next section, the knowledge can be
disaggregated into knowledge types for further examination.
Knowledge Types
The research is clear that experts are more knowledgeable than novices (Ethell &
McMeniman, 2000). More precisely, the three knowledge types that are necessary to
perform a complex task are declarative, procedural, and conditional, the last of which is
a type of procedural knowledge.
Declarative Knowledge
Declarative knowledge is essentially factual knowledge (Anderson & Krathwohl,
2001; Corbett & Anderson, 1995). In general, knowledge is first learned in a conscious,
declarative form (Anderson, 1982). In that form, knowledge can be consciously
controlled and changed in working memory; once committed to long-term memory,
declarative knowledge becomes retrievable information that can answer questions of
what and why with regard to concepts and facts (Anderson & Schunn, 2000). In other
words, declarative knowledge is that which one can say or tell. The
declarative stage is also where the domain knowledge is directly embodied in procedures
for performing certain skills (Anderson, 1982). However, declarative knowledge by itself
is insufficient to execute skilled performance (Anderson, 1982). Declarative knowledge
clears the path for, and supports the attainment of, procedural knowledge.
Procedural Knowledge  
Declarative and procedural knowledge are not the same and enable different types
of performance (Ambrose, Bridges, DiPietro, Lovett, & Norman, 2010). One may know
facts but not be able to perform the procedure or know when to execute the function
(Ambrose, et al., 2010). Similarly, one may be able to perform a function but not be able
to explain the rationale for why one is doing it (Ambrose, et al., 2010). More precisely,
procedural knowledge is knowing how and when (that is, conditional knowledge) to
apply specific procedures, skills, or methods (Ambrose, et al., 2010). It is
goal-oriented, subject-specific, and promotes problem solving
(Anderson & Krathwohl, 2001; Corbett & Anderson, 1995). It also includes sequences
and steps to be followed during a simple or complex task (Anderson & Krathwohl, 2001).
With repetition and practice, both declarative and procedural knowledge become stronger
and performance becomes more fluid, consistent, and automated.  
Conditional Knowledge
Conditional knowledge is a type of procedural knowledge concerning when to use, or
not use, a given procedure (Anderson & Krathwohl, 2001; Paris, et al., 1983). Conditional
knowledge provides the circumstances or rationale for various actions, includes value
judgments, and helps modulate procedural and declarative knowledge (Paris, et al., 1983).  
Taken all together, declarative, procedural, and conditional knowledge are
required for completing complex tasks and are acquired as one transitions from novice to
expert. With repetition and practice, both declarative and procedural knowledge become
stronger and performance becomes more fluid, rapid and consistent (Corbett & Anderson,
1995). Once procedural knowledge is developed, it becomes automated and very difficult
to change (Anderson, 1993).  
Automaticity
Automated knowledge in experts is based not only on more knowledge, but also
more advanced knowledge that is expressed through abstractions (Hinds, Patterson, &
Pfeffer, 2001). There are three stages of automaticity (Anderson, 1996b). The first stage is
the interpretive stage or cognitive stage in which a learner is able to complete a task or at
least a close approximation of the task with initial instructions that are often verbal
(Anderson, 1996b).  This stage frequently involves talking to oneself when performing
the action (Anderson, 1996b). The second stage is the knowledge compilation or
associative stage.  In this stage the learner works through the procedure and applies or
learns the declarative knowledge necessary to correct procedural errors. As the learner
corrects errors and develops stronger procedural knowledge the verbal cueing of talking
to oneself decreases and ultimately disappears (Anderson, 1996b). The third stage is the
strengthening and tuning or autonomous stage where the learner performs the procedure
automatically without verbal cueing and any changes made to the procedure serve to
strengthen or make the process more efficient (Anderson, 1996b).  
Automated processes often initiate without prompting and once they initiate,
automated processes run to completion without being available for conscious monitoring
(Feldon, 2007). For example, even when teachers are made aware of omissions in their
automated teaching processes, or are provided with goals to change them, they may
nevertheless fail to make changes: working memory quickly becomes occupied with the
intended changes, and the automated processes begin and run to completion unchecked
(Feldon, 2007). The result is that
teachers may be unable to attend to and monitor automated processes in order to bring
about desired changes (Feldon, 2007). As experts develop their conscious declarative
knowledge, it becomes gradually more automated (Feldon, 2007).
Consequently, automaticity acts as a double-edged sword. In one sense, the
automation process is advantageous to expertise as it supports the capacity to respond to
novel problems with speed, accuracy, and consistency within an expert’s domain (Clark,
1999; Clark & Elen, 2006; Wheatley & Wegner, 2001). Automation can alleviate the
cognitive overload and/or processes that can impede the efficiency of working memory,
thereby freeing it to attend to new tasks (Feldon, 2007; Kirschner, Sweller, & Clark,
2006; Wheatley & Wegner, 2001). However, automated knowledge can be detrimental in
that it is very resistant to change and difficult to unlearn (Wheatley & Wegner, 2001).
Research has suggested that experts are unaware of the information they use to complete
complex tasks as a consequence of automaticity (Clark & Elen, 2006). As a result,
experts have automated procedural knowledge that they cannot consciously explain
(Clark & Elen, 2006).  
Expertise
Characteristics of Experts
An expert can be described as the distinguished or brilliant journeyman, highly
regarded by peers, whose judgments are uncommonly accurate and reliable, and who has
a number of years of experience working in the domain of interest (Chi, 2006; Feldon, 2007).
Additionally, experts perform with consistently high results that demonstrate
consummate skill and economy of effort, and they can deal effectively with certain types
of rare or “tough” cases (Chi, 2006; Feldon, 2007). The mark of expert performance is that they
can display their superior performance reliably upon demand (Ericsson & Lehmann,
1996). Expert memory performance is not an automatic consequence of domain-specific
experience, but rather requires the acquisition of specific memory skills tailored to the
demands of working memory by the particular activity (Ericsson & Lehmann, 1996).  
Analysis of expert performance shows that experts increase their level of performance
through structural changes in how they perform. The ability of experts to exceed usual capacity
limitations is important because it demonstrates how acquired skills can supplant critical
limits within a specific type of activity (Ericsson & Lehmann, 1996). Experts’ ability to
accurately report anticipated outcomes refutes claims that experts completely automate
the execution of perceptual-motor performance (Ericsson & Lehmann, 1996). Experts’
memory is not general across all types of information in a domain but reflects selective
encoding of relevant information and mechanisms acquired to expand the functional
capacity of the experts’ working memory (Ericsson & Lehmann, 1996). The vastly
superior memory of experts for briefly presented information appears to reflect memory
skills and long term-working memory that they acquire to support many important
activities, such as planning, reasoning, and anticipation of future events (Ericsson &
Lehmann, 1996).
An expert can also be described through the differences in their relationship to
novices (Bedard & Chi, 1992). For example, novices are characteristically more literal,
predictable, and only have surface information (Bedard & Chi, 1992). Experts, on the
other hand, have more domain knowledge, have developed schemas allowing them to
organize information so it is quickly and efficiently retrieved with minimal effort, and
solve problems more efficiently than novices (Bedard & Chi, 1992; Glaser & Chi, 1988).
As a result, expert performance is consistently superior to novices because experts
perceive more meaningful, interconnected patterns in their domain, are faster and more
accurate than novices when performing domain skills, perceive problems in a domain at
deeper (more principled) levels than novices, spend a larger proportion of time
qualitatively analyzing problems, and self-monitor effectively during problem solving
(Chi, 2006; Feldon, 2007; Glaser & Chi, 1988).  
Building Expertise
Expertise was initially thought to be a gift from the gods (Ericsson & Charness,
1994). When research on expertise was first approached, theorists leaned toward innate
ability, treating superior talent as a result of being “born” with it (Ericsson & Charness,
1994). For instance, Galton (1869/1979) first investigated and suggested that exceptional
performance was a result of inherited natural ability (Ericsson, Krampe, & Tesch-Romer,
1993). Since then, the search for innate characteristics that could predict or at least
account for the superior performance of eminent individuals has been surprisingly
unsuccessful (Ericsson, et al., 1993). The perception is that if individuals are innately
talented, they can more easily and rapidly achieve expert levels of performance once they
have acquired basic skills and knowledge. However, biographical materials disprove this
notion (Ericsson, et al., 1993). The more modern view supports the idea that expert
performance is the result of skill that develops over time and with increased exposure to a
task (Ericsson & Charness, 1994; Ericsson, et al., 1993). Thus, experience is a key
element in the development of expertise (Ericsson & Charness, 1994).  
According to Ericsson, et al. (1993), deliberate practice is also necessary to
building expertise. Deliberate practice can be characterized in several ways. First,
deliberate practice can be seen through the subjects’ motivation to attend to the task and
exert effort to improve their performance (Ericsson, et al., 1993). Second, the design of
the task should take into account the preexisting knowledge of the learners so that the
task can be correctly understood after a brief period of instruction (Ericsson, et al., 1993).
Third, the subjects should receive immediate informative feedback and knowledge of
results of their performance (Ericsson, et al., 1993). Fourth, the subjects should
repeatedly perform the same or similar tasks (Ericsson, et al., 1993). By engaging in
deliberate practice and problem solving, a novice learner develops over time more
efficient schema, knowledge, skills and decision steps. More precisely, Simon and Chase
(1973) argued that high levels of mastery require at least 10 years to achieve and
maintain expert performance, which corresponds to several thousand hours of deliberate
practice (Ericsson, et al., 1993).
For most individuals, the primary goal is to reach a level of mastery that allows them to
perform everyday tasks at an acceptable level or to engage proficiently in recreational
activities with their friends (Ericsson, 2004). During the first phase of learning, novices
try to understand the activity and concentrate on avoiding mistakes (Ericsson, 2004).
With more experience in the middle phase of learning, gross mistakes become rare,
performance appears smoother, and learners no longer need to concentrate as hard to
perform at an acceptable level (Ericsson, 2004). After a limited period of training and
experience—frequently less than 50 hours for most recreational activities such as typing,
playing tennis, and driving a car—an acceptable standard of performance is typically
attained (Ericsson, 2004). As individuals adapt to a domain and their skills become
automated, they are able to execute skills smoothly and without apparent effort (Ericsson,
2004). As a consequence of automation, performers lose conscious control over
execution of those skills, making intentional modifications difficult. Once the automated
phase of learning has been attained, performance reaches a stable plateau with no further
improvements (Ericsson, 2004).
Consequences of Expertise
Despite the wealth of knowledge and skills experts possess, they also fall short in
relatively significant ways. For instance, expertise has been found to be domain-limited,
and experts rely on contextual cues within their domains of expertise (Chi, 2006). Experts
have also been found to be overconfident and inflexible, and to sometimes fail to recall
surface features and overlook details (Chi, 2006). For example, Feldon (2007) examined the
accuracy of experts’ self-reports and found errors to be prevalent. Experts’ schemas are
highly adaptive for problem solving, but they can subsequently interfere with the accurate
recall of problem-solving situations (Feldon, 2007). Moreover, the patterns in the self-
report data indicated that self-report errors and omissions increased as skills improved
(Feldon, 2007). Individuals tend to attribute most, if not all, of their actions to intentional
decision making processes, and the strength of this belief can unintentionally lead them to
fabricate consciously reasoned explanations for their automated behaviors (Feldon,
2007). Hence, reports may be inaccurate when participants rely on incorrect preexisting
causal theories to explain their processes (Feldon, 2007). Consequently, the most
frequently employed elements – presumably those of greatest utility within a domain of
expertise – would be the most difficult to articulate through recall (Feldon, 2007).
Expert Omissions
The most significant limitation of experts is that they unintentionally omit up to 70% of
the critical information that novices need to learn in order to successfully perform a
procedure (Bedard & Chi, 1992; Clark, Pugh, Yates, Inaba, Green, & Sullivan, 2011). For instance,
in a study of instruction of research design in psychology, automaticity and the accuracy
of self-reporting were negatively correlated (Feldon, 2004). This is a serious problem
because it forces novices to “fill in the blanks” using less efficient and error-prone trial-
and-error methods. Moreover, as these errors are practiced over time, they become more
difficult to ‘‘unlearn’’ and correct (Clark, et al., 2011). There are two reasons for this
problem. First, as expertise increases, skills become automated and the steps of the
procedure blend together. Experts perform tasks largely without conscious knowledge as
a result of years of practice and experience (Clark & Estes, 1996). This causes experts to
omit specific steps when trying to describe a procedure because this information is no
longer accessible to conscious processes (Clark & Elen, 2006). Second, many physicians
are unable to articulate the complex thought processes involved in executing technical
skills. Consequently, it is difficult to identify points during a procedure where
an expert makes decisions (Clark & Elen, 2006). For instance, when asked to describe a
procedure, many surgeons rely on self-recall of specific skills. Studies from the field of
cognitive psychology suggest that the use of standard self-report or interview protocols to
extract descriptions of events, decision making, and problem solving strategies can lead
to inaccurate or incomplete reports. Experts often do not recognize these errors because
of the automated and unconscious nature of the knowledge they describe (Wheatley &
Wegner, 2001). Moreover, errors are likely to increase in number and impact under
stressful situations (Hunt & Joslyn, 2000). However, omissions from one expert can be
mitigated by employing methods such as cognitive task analysis to capture expertise
from multiple experts.
Cognitive Task Analysis (CTA)
Definition of CTA  
CTA was first introduced in the early 1970s by the educational technology
community (Annett, 2000), but it was not until 1979 that J. P. Gallagher published an
article reviewing instructional design and used the term “cognitive task analysis” while
discussing cognitive influences on instructional design (Hoffman & Militello, 2009). In
general, CTA is defined as a family of practices used to elicit, analyze, and organize both
the explicit and implicit knowledge and goal structures experts use to perform complex
tasks (Chipman, 2000; Clark, et al., 2008).  
CTA History  
The seeds for CTA were sown as far back as the late 1800s as a result of the
Industrial Revolution (1860-1914) in the United States. The Industrial Revolution was a
significant period in American history, during which the country evolved from a
predominantly agrarian, rural nation to one that was more commercial and urban (Annett,
2000). More specifically, products that had generally been crafted by hand were now
being manufactured in larger quantities by machines in large factories. As newer forms of
technology (e.g., the steam engine, the Ford production line, and the transcontinental
railroad) were developed, whole industries were forced to reorganize, and larger, more
complex organizations were formed (Nyland, 1996).
 
During this time, Frederick Taylor and Frank and Lillian Gilbreth, all of whom had
experience with manufacturing industries, noticed not only that laborers and
machines were not working at full capacity, but also that some processes were not
necessary to complete a task. As a result, Taylor and the Gilbreths recognized the need
for a scientific approach to describe, identify, and optimize human tasks during
production (Pigage & Tucker, 1954). Taylor’s (1911) research studied experts as they
worked on certain tasks and determined the exact amount of time it took to complete them
(Schraagen, et al., 2000). Based on the results of his time study, Taylor (1911)
recommended that a scientific method of management be implemented in order to
transform work from rule-of-thumb and apprenticeship-style learning to a systematic
way of training workers to perform a series of simplified tasks in a time-efficient
manner that would minimize error and increase efficiency (Gowler & Legge, 1983;
Krenn, 2011; Nyland, 1996). Moreover,
the Gilbreths, both engineers by trade, studied the movements of bricklayers in order to
identify a set of universal elements that could be used to describe a wide range of
activities to reduce the number of movements required to achieve a given result (Annett,
2000). The Gilbreths’ work on motion study was eventually used to estimate likely labor
costs, and it prompted a rethinking of systems design to make systems more efficient
(Annett, 2000). Eventually, time and motion studies were used together as an approach to
the scientific management of production, which called for managers to observe and
record workers’ physical behavior while executing tasks in order to identify, describe,
and analyze tasks for efficiency (Pigage & Tucker, 1954; Thorndike, 1921).


The process of observing overt behavior, delineating componential tasks, and
describing these tasks became known as traditional task analysis or Behavioral Task
Analysis (BTA) in the United States, where behavioral psychology was prevalent. BTA
held that only the observable sequence of actions should be the focus of any analysis of
a task, and it was a sufficient training method in the early part of the 1900s for
manufacturing jobs that primarily required physical labor and simple decisions (Clark
& Estes, 1996; Hoffman & Militello, 2009). By the 1950s, however, new technologies
that were more elaborate, complex, and automated emerged, along with a global market
economy, causing a significant rise in the complexity of tasks in the workplace (Clark &
Estes, 1996; Hoffman & Militello, 2009). Jobs now required more troubleshooting and
advanced problem-solving skills of teams of people working together as opposed to a
single individual performing a physical task (Annett, 2000; Cooke, 1992; Berryman,
1993). As a result, industry trainers who had relied on BTA for training were challenged
because BTA only focused on physical, observable tasks, and could not be used in
providing trainees with the knowledge of mental strategies, such as problem-solving or
making decisions when presented with multiple alternatives (Annett, 2000).
Consequently, incomplete knowledge inevitably led to incomplete instruction and errors
in performance (Clark, et al., 2008; Annett, 2000).  
The foundations for cognitive psychology also began to take hold during the
1950s as behavioral psychology could no longer meet the conceptual and mental
demands of increasingly complex tasks (Annett, 2000). Due to the realization that BTA
was not sufficient to fully analyze the skills and knowledge necessary to teach and
structure material in a more technologically sophisticated world, CTA came into general
practice among the educational technology community in the early 1970s (Annett, 2000).
In 1956, Allen Newell and Herbert Simon offered their theory of information processing.
Their metaphor comparing human memory to computer information processing framed
the mental processes of data intake, processing, and retrieval as physical events that
could be studied scientifically (Crandall, Klein, & Hoffman, 2006). Furthermore, as
cognitive psychology became more prominent, the artificial intelligence community
began to use computers to simulate thinking and problem-solving skills (Annett, 2000).
As a result, there was a growing interest in capturing human expertise, which earlier
studies of time and motion could never capture (Annett, 2000).  
CTA Methodology  
A number of researchers have identified the stages through which a typical, ideal
cognitive task analysis would proceed. The ideal model of cognitive task analysis, one
that is not subject to resource restrictions, is typified by a series of discrete steps: 1) a
preliminary phase, 2) the identification of knowledge representations, 3) knowledge
elicitation techniques, 4) a review and possible modification of the knowledge elicited to
date by experts, and 5) using the results of the analysis as a basis for an expert system or
expert cognitive model (Chipman, Schraagen, & Shalin, 2000; Clark, et al., 2008).
According to Cooke (1999), there are more than 100 variations of CTA methods
and applications, which can then be organized into four categories of knowledge
elicitation. The first is observation. Knowledge elicitation begins with observing task
performance within the domain of interest and provides a general conceptualization of the
domain and any constraints or issues to be addressed in the later phases (Cooke, 1999).
The second is interviews, which also happens to be the most frequently used elicitation
method. There are various interview techniques such as unstructured, structured,
conditional procedure, goal decomposition, diagram focused, teach-back, “twenty
questions,” and PARI (precursors, actions, results, interpretations) (Cooke, 1999). The
third is process tracing, which involves the collection of sequential behavioral events and
the analysis of the resulting event protocols so that inferences can be made about
underlying cognitive processes (Cooke, 1999). Process tracing is also most often used to
elicit procedural information, such as conditional rules used in decision making (Cooke,
1999). The fourth is conceptual methods, which elicit and represent conceptual structure
in the form of domain-related concepts and their interrelations (Cooke, 1999). It is used
mainly to elicit knowledge to improve user interface design, guide development of
training programs, and understand expert-novice differences (Cooke, 1999).
Likewise, Yates (2007) identified over 100 different CTA methods in his review
of existing literature. Yates and Feldon (2011) recognized that existing classification
schemes tend to sort CTA methods by the knowledge elicitation processes rather than the
outcome. As a result, CTA practices can present challenges for practitioners in the
selection of techniques and the ways in which these techniques are applied to knowledge
elicitation (Yates & Feldon, 2011). Because of the wide variety in the application of
knowledge elicitation techniques to the solution of problems, it is worth noting that in the
absence of a research-based taxonomy, CTA remains a craft rather than a technology
(Yates & Feldon, 2011).  



Effectiveness of CTA
CTA has proven to be an effective method for capturing the explicit observable
behaviors, as well as the tacit, unobservable knowledge of experts, such as domain
content, concepts and principles, experts’ schemas, reasoning and heuristics, mental
models, and sensemaking (Hoffman & Militello, 2009). Means and Gott (1988)
discovered that in domains that emphasize technical-functional capabilities, such as
engineering or the military, simply listing the action steps for a particular procedure or
task is not an adequate way to train. Even if context is captured, traditional methods,
such as asking experts to list steps or making observations, do not accurately account
for the abstract knowledge experts possess (Means & Gott, 1988).
CTA has thus proven to be the optimal method for capturing knowledge because it
emphasizes aspects of the task that are important to the learner, assists in the scalability
of understanding abstract knowledge across domains, and provides a framework for
abstract-problem solving and general principles of knowledge (Means & Gott, 1988).
Research has also shown that CTA-informed instruction is more cost effective and
efficient than other instructional models. Not only has CTA-informed instruction yielded
stronger results than traditional instruction, but it has also been shown to decrease
total training days by nearly half (Clark, et al., 2008). Additionally, Means and Gott (1988)
discovered that five years of work experience and knowledge can be taught in only 50
hours of CTA-based training. Ultimately, CTA has proven to generate long-term savings.  
 



Benefits of CTA for Instruction
CTA has also been shown to be effective at eliciting the nuances in expert knowledge,
such as decision points and perspectives, resulting in a variety of instructional strategies
utilizing the outcomes of a CTA-informed instruction (Means & Gott, 1988; Crandall, et
al., 2006; Hoffman & Militello, 2009). Studies across a variety of domains, including
software design, the military, the business sector, and the medical field, have indicated
that CTA-informed instruction can lead to increases of 30-45% in learning performance
compared to instruction that is informed by traditional observation or task analysis (Clark,
2014).  
As a result, CTA-informed instruction can be used for training in a variety of
ways, such as “cognitive training requirements, scenario design, cognitive feedback, and
on-the-job training” (Crandall, et al., 2006, p. 196). For instance, CTA-informed
instruction has been shown to reduce the number of mistakes made by recently graduated
students within the field of healthcare, and CTA-informed learners, or employees, may be
considered better trained and perhaps more appealing to employers throughout the
medical field as a result (Clark, 2014).  
CTA-informed instruction has also been shown to enhance the training systems of the
National Emergency Training Center, particularly for task-based instruction (Crandall, et
al., 2006). More specifically, CTA assisted in capturing the critical decision points,
judgments, and patterns of expert firefighters, which is essential to their training
(Crandall, et al., 2006). Decision training was then built into the how-to course
curriculum, which focused on teaching novice firefighters the general principles of
decision making through scenarios specific to the context of the learner (Crandall, et al.,
2006). Additionally, CTA facilitator questions could be used for the trainee to reflect
upon the process they took to make a decision within a given scenario (Crandall, et al.,
2006).  
When CTA is used to capture complete and accurate cognitive processes, the
resulting instruction is consistently more effective than non-CTA approaches (Clark,
2014; Tofel-Grehl & Feldon, 2013). For example, Lee’s (2004) meta-analytic review of
studies on the instructional effectiveness of CTA-based training as measured by
performance gains revealed a 75.2% post-training gain compared to more traditional
instructional design using behavioral task analysis. Likewise, Tofel-Grehl and Feldon’s
(2013) meta-analysis of CTA studies across multiple domains discovered post-training
learning gains of 31% for CTA-based instruction compared to other instructional
approaches, such as behavioral task analysis and unguided expert self-reporting.
Furthermore, Critical Decision Method (CDM) and PARI, two CTA methods of
knowledge elicitation, have been demonstrated to be more effective at improving training
outcomes than non-CTA methods. Studies showed effect sizes of .329 for CDM and
1.598 for PARI, corresponding to a 13% learning increase for CDM and a 45% learning
increase for PARI (Clark, 2014; Tofel-Grehl & Feldon, 2013).
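The reported percentages can be approximately recovered from these effect sizes under a common reading of Cohen's d: the percentage gain is the share of the normal curve lying between the comparison-group mean and the treated-group mean. This conversion is a sketch of how such figures are typically derived, not a method stated in the source:

```python
from statistics import NormalDist

def gain_from_effect_size(d: float) -> float:
    """Convert a Cohen's-d effect size into a percentage-point learning
    gain: the probability that a treated learner scores above the
    comparison-group mean, minus the 50% chance baseline."""
    return (NormalDist().cdf(d) - 0.5) * 100

# Effect sizes reported for the two CTA elicitation methods.
for method, d in [("CDM", 0.329), ("PARI", 1.598)]:
    print(f"{method}: d = {d} -> roughly {gain_from_effect_size(d):.1f}% learning increase")
```

Under this reading, d = .329 yields roughly a 13-point gain and d = 1.598 roughly a 45-point gain, matching the percentages cited above.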
Summary
Experts are frequently called upon for their knowledge and skills to teach, inform
curriculum content and instructional materials, and mentor and coach others to perform
complex tasks and solve difficult problems. However, current research shows that
experts’ knowledge becomes automated over time and experts consequently omit up to
70% of the critical knowledge and skills novices need to replicate expert performance.
These omissions result in expert self-reports that are incomplete and inaccurate, which in
turn hinders novice performance. More
specifically, when novices receive incomplete information they fill in the void with their
own information, which often contains misconceptions about successful strategies.
Experts also omit critical knowledge and skills during self-reporting because they have
automated their knowledge and skills through repeated practice so that it becomes
unconscious and unavailable for recall. In response, CTA has been shown to be an
effective method for capturing both the unconscious and automated knowledge experts
use to perform complex skills and solve difficult problems.  

CHAPTER THREE: METHODS

The purpose of this study is to conduct a CTA in order to capture the expertise of
English teachers as they describe how they provide expository writing instruction at the
eleventh-grade level. The specific task examined in this study is how expert teachers
instruct eleventh-grade students to acquire and demonstrate the skills necessary to
successfully write an expository essay. Research has discovered that expertise, however
necessary to inform instruction, becomes highly automated over time and experts may
omit up to 70% of the critical information when recalling the action and decision steps of
the task (Clark, 1999). Thus, this study relied on the incremental (1i) method of CTA to
elicit the action and decision steps necessary to successfully replicate expert expository
writing instruction at the eleventh-grade level. This study, along with a concurrent study
(Jury, 2015), also compares the efficiency of two effective but differing methods of
CTA: the incremental (1i) and the individual (3i).
This chapter discusses the research methodology employed in this descriptive
study on CTA and presents details on the study design, the selection of participants, task,
instrumentation, data collection, and data analysis. The research questions that guided
this study are:  
1. What are the action and decision steps that expert teachers recall when they
describe how they provide expository writing instruction to their eleventh-grade
students?
2. What percentage of action and/or decision steps, when compared to a gold
standard, do expert teachers omit when they describe how they provide expository
writing instruction to their eleventh-grade students?
3. Which method of CTA, the incremental (1i) or the individual (3i) (Jury, 2015), is
more efficient, as represented by the number of action and decision steps elicited
and by cost and time?
Participants  
A total of eight expert ELA high school teachers within a Southern California
School District were interviewed across this study, which used the incremental (1i)
method, and the concurrent CTA study, which used the individual (3i) method (Jury, 2015).
The following criteria were used to select the English teachers for both studies.
SMEs must be at the top of their profession, with a minimum of five, preferably ten,
years of consistent and exceptionally successful on-the-job experience (Ericsson, et al.,
1993). Their “success” is based on reliable, industry-standard outcomes that have been or
can be validated, and not solely on “time on the job” (Ericsson & Charness, 1994). SMEs
also characteristically have extensive domain knowledge (Bedard & Chi, 1992). Thus,
SMEs were selected based on their levels of education (i.e., a Master’s degree or higher)
and their histories of professional development in expository writing instruction (e.g.,
ERWC training). Furthermore, peer recognition (Berliner, 1986) as well as experience with the
widest possible variety of settings, problems, specialties and applications (Clark, et al.,
2008) were taken into consideration. Lastly, potential SMEs were selected based on not
having provided instruction to others on the performance of the task within the past year
or longer. Yates (2007) suggested that instructors or trainers often describe how
they train others in a task as opposed to how they actually perform the task while on the
job.
The researcher initially contacted the Assistant Superintendent of the school
district with a brief description of the research study, including a list of potential SMEs.
The Assistant Superintendent approved both the study and the potential SMEs, as well as
obtained permission from a fellow Assistant Superintendent, the school board, and the
principals of the school sites where the SMEs worked. The researcher then contacted
each prospective SME through email detailing the purpose of the study, requirements of
the study, and an invitation to participate. Table 1 provides a complete list of the SMEs
selected for both studies.
The assignment of SMEs to either the incremental (1i) or the individual (3i) method of
CTA was made pragmatically: SMEs were assigned primarily on the basis of their
availability and the researchers’ timetables. For example, the data collected from
the initial in-depth interview with SME 1 was used for the incremental (1i) method so
that the researcher could immediately begin developing the CTA protocol for the
subsequent SME interview. SMEs 2, 3, and 4 were then selected to inform the concurrent
study using the individual (3i) method as a result of those SMEs all becoming available
for interviews in close order shortly after the interview with SME 1. In other words, the
decision to not use SME 2, SME 3, and SME 4 to inform the incremental (1i) method
was based on the knowledge that the researcher would need enough time to develop the
CTA protocol and confirm it prior to the interview with SME 4 to review the protocol. A
timeline showing SME selection to either the incremental (1i) method or the individual
(3i) method is represented in Figure 1.


Table 1

SMEs Interviewed for Both the Current Study and the Concurrent Study (Jury, 2015)

       Years teaching    Level of education               Eleventh-grade      Hours of PD      Recommended by
SME    expository        attained                         courses taught at   in expository    district, peers,
       writing                                            varying levels      writing          or both
-----  ----------------  -------------------------------  ------------------  ---------------  ----------------
1*     17                M.A. in English Literature       2                   75               YES
2††    8                 M.F.A. in Creative Writing       1                   40               YES
3††    7                 M.A. in English                  2                   20+              YES
4††    13                M.A. in English/Ph.D. candidate  2                   50               YES
5*     11                M.A. in English Literature       4                   50               YES
6*     18                M.A. in English Literature       3                   80               YES
7*     20                M.A. in English                  0**                 50               YES
8††    20                M.A. in English Literature       3                   75               YES

Note: All data are de-identified. Each SME is numbered for demonstration purposes only, and
numbering does not represent any rank order or selection criteria. SMEs marked with an asterisk (*)
were assigned to the incremental (1i) method, and SMEs marked with a double dagger (††) to the
individual (3i) method of CTA. The double asterisk (**) indicates that SME 7 has taught 20 total
years of English, including the twelfth-grade and post-secondary levels, but not at the
eleventh-grade level.

The total number of SMEs used in this study was based on research indicating
that three to four SMEs are sufficient to elicit expertise and that beyond four
SMEs the law of diminishing marginal utility sets in, that is, the knowledge
acquired from an additional SME yields less than 10% in additional action and decision steps
(Bartholio, 2010; Chao & Salvendy, 1994; Crispen, 2010). Therefore, to remain
consistent with research on the optimal number of experts for knowledge elicitation, three
SMEs were randomly assigned to the incremental (1i) method, along with verification
from a fourth SME, and three were randomly assigned to the individual (3i) method
used in the concurrent Jury (2015) study, along with verification from a fourth SME.

Incremental (1i) method:
  SME 1 (6/7/14) → initial CTA protocol developed → SME 5 (7/9/14) → SME 6 (7/24/14) → SME 7 (10/21/14)

Individual (3i) method:
  SME 2 (6/9/14) → SME 3 (6/10/14) → SME 4 (6/24/14) → PGSP developed (Jury, 2015) → SME 8 (12/16/14)

Figure 1. A timeline representing SME selection to either the incremental (1i) method or
the individual (3i) method.
Note: Dates indicate when the SMEs were interviewed. Several factors were considered when determining
dates to conduct CTA interviews with SMEs, such as SME response time to researchers’ emails, SME
availability, researchers’ availability, and the time allotted to coding transcripts and developing
protocols.

Data Collection for Question 1: What are the action and decision steps that expert
teachers recall when they describe how they provide expository writing instruction to
their eleventh-grade students?
The CTA procedure followed the five steps of knowledge elicitation suggested by
Clark, et al. (2008), which includes:  
1) Collecting preliminary information that builds general familiarity with the topic of the
study through document analysis, observation, and informal interviews.
2) Identifying knowledge types used when performing the task that requires the
researcher to identify declarative and procedural knowledge and any hierarchical
relationship in the application of these knowledge types.
3) Applying the knowledge elicitation techniques best suited to the study.
4) Verifying and analyzing the data gathered through the use of qualitative data analysis
techniques.  
5) Formatting the results into a training tool (i.e., a job aid).
Phase 1: Collect preliminary knowledge. The researcher is currently an English
teacher at the secondary level and has 14 years of experience giving instruction in
expository writing. Furthermore, a thorough literature review was conducted to help
gather preliminary information and build general familiarity on expository writing
instruction.
Phase 2: Identify knowledge types. During the process of completing the
literature review, the researcher was able to develop a thorough understanding of the
distinction between declarative and procedural knowledge. To practice distinguishing
between these two knowledge types and to understand hierarchical relationships, the
researcher participated in exercises with other researchers under the guidance of a senior
researcher to identify action steps, decision steps, as well as conceptual knowledge types
such as concepts, processes, and principles that might be required for teaching expository
writing. These knowledge types were used in the interview protocol.

Phase 3: Identify knowledge elicitation techniques.  
Instrumentation. For the purposes of this study, a semi-structured interview was
used to capture the knowledge and skills of the SMEs using the concepts, processes,
and principles (CPP) technique. The CPP technique interviews at least three experts and
has each of them describe the same procedure. Interviews are then followed by experts
engaging in self- and peer-review. Clark (2008) found that having three experts describe
the same procedure reveals similar strategies while also capturing decisions and analysis
strategies that the other experts might have missed. The semi-structured interview focuses not only on concepts,
processes, and principles, but also on the conditions under which to start a subtask; the
equipment, materials, and sensory experiences needed; the performance standards
required; and action and decision steps, among other relevant areas (Clark, 2008). In particular,
action and decision steps are considered the critical information novices need to
successfully perform the task. Action steps should begin with a verb and are statements
about what a person should do, such as “Insert car keys into the ignition and turn to start
the car.” Decision steps should contain two or more alternatives to consider before taking
an action, such as “IF the traffic light is yellow and you cannot maintain current speed
through the intersection before the light turns red, THEN proceed to stop the car; IF the
traffic light turns yellow and you can maintain current speed through the intersection
before the light turns red, THEN proceed with caution.” The semi-structured interview
protocol is attached as Appendix A.
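The distinction between action and decision steps above can be sketched in code: an action step is a fixed imperative sequence, while a decision step branches between alternatives before acting. This is an illustrative sketch of the driving examples only; the function names are ours, not part of the CPP protocol:

```python
def start_car() -> list:
    """Action step: a fixed imperative sequence with no branching."""
    return ["insert key into the ignition", "turn key to start the car"]

def yellow_light_decision(can_clear_before_red: bool) -> str:
    """Decision step: two or more alternatives weighed before acting."""
    if can_clear_before_red:
        return "proceed with caution"  # IF you can maintain speed through
    return "stop the car"              # IF you cannot clear before red
```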
This study used the CPP incremental (1i) method, in contrast with the individual
(3i) method used by Jury (2015). Both methods interview three SMEs; however, this
study followed the incremental method by initially completing one in-depth interview
(1i) followed by reviews of the CTA protocol with the first SME and two more SMEs
(3r). In other words, once the CTA protocol was generated using the data from the initial
in-depth interview with SME A, a follow-up interview took place for SME A to review
the protocol with the researcher in order to provide SME A with the opportunity to make
corrections and/or additions. SME B was then asked to review SME A’s protocol and
make any possible corrections and/or improvements. This process of review was repeated
with SME C in order to achieve a Preliminary Gold Standard Protocol (PGSP) of the
task. As a final step, the PGSP was taken to SME D for one final review at which point
the Gold Standard Protocol (GSP) was achieved.  
In the concurrent study (Jury, 2015), the individual (3i) method required three
complete interviews with SMEs (3i). Each of the three interviews was transcribed, coded,
and analyzed separately to produce three separate CTA protocols. Each CTA protocol
was then reviewed by each SME who was interviewed to make any necessary additions
and/or corrections (3r). The three corrected CTA protocols were then aggregated into a
PGSP by the researcher. The PGSP was then submitted to a fourth senior SME for review
and correction in order to achieve a final GSP.  
Interviews. Following IRB approval, a total of eight SMEs were asked to
participate in semi-structured interviews that follow the methods described above. One of
the goals during interviews was to determine whether the incremental (1i) approach
provides as much, if not more, useful decision information from the SMEs as the
individual (3i) method in less time and at a smaller cost. Therefore, both researchers were
present for all eight SME interviews. The duration of each SME interview was
approximately 90 minutes. With the SME’s approval, each interview was audio recorded.
Both researchers used the CPP (Clark, 2008) protocol (see Appendix A) previously
described to capture the automated declarative and procedural knowledge of each
individual SME, along with other cognitive processes they use during time of instruction.  
Phase 4: Data Analysis. Each interview was recorded for transcription.
Recording the SMEs during interviews permitted deep analysis of the data
collected through elicitation of the skills, decisions, and steps provided by each SME.
Coding. All of the interviews were transcribed and then coded based on Clark’s
(2008) CPP semi-structured interview method. The information was broken down and
listed as tasks and sub-tasks. Each sub-task was assigned a procedure that includes an
action step and a decision step. This analysis created a written protocol of the knowledge,
steps, and skills of each SME and allowed for further review.  
Inter-rater reliability (IRR). The transcript of the initial in-depth interview with
SME A was coded separately by the researcher and a fellow researcher to safeguard
against researcher bias. IRR was completed using SME A’s transcript because it was the
only in-depth interview as part of the incremental (1i) method. The independent result of
the coding was then discussed and analyzed in detail by both the researcher and fellow
researcher in order to determine IRR. A standard IRR was then reached by calculating the
percentage of agreement between the two researchers for each of the action and decision
steps. If differences were found in the coding, then the researcher and colleague
discussed their rationales and justifications until they agreed on a code or until
both sides agreed to disagree. Hoffman, Crandall, and Shadbolt (1998) determined that
once there is an 85% or higher agreement in IRR, the coding process is consistent and
reliable among different coders. However, if the IRR is less than 85%, Crandall, et al.
(2006) suggest that the coding scheme and the function-unit categories may need to
be further refined. The results of the IRR are presented in greater detail in Chapter Four.
The completed IRR tally sheet is included as Appendix B.
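The percentage-of-agreement calculation described above can be sketched in Python. The six step codes below are hypothetical illustrations, not SME A’s actual transcript codes:

```python
def percent_agreement(coder_a, coder_b):
    """Return the percentage of coded items on which two coders agree."""
    assert len(coder_a) == len(coder_b), "coders must code the same items"
    matches = sum(1 for a, b in zip(coder_a, coder_b) if a == b)
    return 100 * matches / len(coder_a)

# Hypothetical codes for six steps: "A" = action step, "D" = decision step.
researcher = ["A", "A", "D", "A", "D", "A"]
colleague  = ["A", "A", "D", "A", "A", "A"]

irr = percent_agreement(researcher, colleague)  # 5 of 6 codes match, about 83.3%
# Below the 85% threshold, so the coders would discuss the disagreement
# (the fifth step here) until reaching agreement or agreeing to disagree.
```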
SME protocol and verification. After coding of the initial in-depth interview with
SME A was completed, a protocol was generated. This protocol was written as a list of
action and decision steps that also included the objectives, standards, reasons, etc. elicited
during the interview. This protocol was given to two additional SMEs as previously
described, who were asked to review to verify the action and decision steps, as well as
make any improvements based on their expert knowledge of the task (Clark, et. al.,
2008). The result of this process was the Preliminary Gold Standard Protocol.  
Phase 5: Formatting the results
Gold standard protocol (GSP). The PGSP was taken to a fourth and final SME
for a final review and verification of the total action and decision steps that make up the
PGSP. After the fourth SME approved all of the action and decision steps of the PGSP,
the GSP was achieved. See Figure 2 for a description of the complete procedure for
creating the GSP.
Both studies relied on the job aid created by Clark and Yates (2010) to develop
the GSP, which has been attached as Appendix C. Because the job aid was written to
assist in the development of the individual (3i) GSP, the researcher had to slightly adapt
the job aid in order to accurately reflect the incremental nature of the GSP. For example,
the only time this study gave credit to one SME and not to another SME was when they
added a step that the others did not. In order to ensure that both studies could make valid
comparisons between the results of the incremental (1i) and the individual (3i) methods,
both researchers used the same criteria set forth in the job aid.  
Summary. The five-phase process (Clark et al., 2008) noted above is referred to
as the incremental (1i) method of CTA: one in-depth interview with review, followed
by two more interviews to review and refine the resulting protocol. Figure 2 visually
represents the incremental (1i) method.


Data Collection for Question 2: What percentage of action and/or decision steps, when
compared to a gold standard, do expert teachers omit when they describe how they
provide expository writing instruction to their eleventh-grade students?
Spreadsheet Analysis. The final stage of data analysis was completed by
transferring the action and decision steps of the final GSP into a spreadsheet. Each
individual SME protocol was reviewed and compared to the GSP. If the individual SME
protocol included the action and decision step, then a “1” was placed in the appropriate
[Figure 2 flowchart: the researcher conducts an in-depth interview with SME A and develops
the initial CTA protocol, reviews it with SME A, revises the protocol in a review with SME B,
develops the PGSP in a review with SME C, and reviews the PGSP with SME D to reach the
Gold Standard Protocol.]

Figure 2. A step-by-step description of the incremental (1i) method to reach the gold standard
protocol (GSP).
cell for that SME. If there was no agreement, or the SME missed this action or decision
step, a “0” was recorded. A frequency count was conducted and a percentage of the total
number of agreements and omissions between the individual SMEs and the GSP was
calculated.
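A minimal sketch of this frequency-count procedure, assuming a hypothetical five-step GSP (the steps and the inclusion pattern are invented for illustration):

```python
# 1 = the SME's protocol included the GSP step; 0 = the step was omitted.
gsp = {
    "step 1": {"A": 1, "B": 1, "C": 1, "D": 1},
    "step 2": {"A": 0, "B": 1, "C": 1, "D": 1},
    "step 3": {"A": 0, "B": 0, "C": 1, "D": 1},
    "step 4": {"A": 1, "B": 1, "C": 1, "D": 1},
    "step 5": {"A": 0, "B": 0, "C": 0, "D": 1},
}

def agreements_and_omissions(sme, gsp):
    """Frequency count of agreements and omissions for one SME against the GSP."""
    total = len(gsp)
    agreed = sum(marks[sme] for marks in gsp.values())
    omitted = total - agreed
    return agreed, omitted, round(100 * omitted / total, 2)

# SME A agrees on 2 of 5 steps, omitting 3 steps (60% of this toy GSP).
result = agreements_and_omissions("A", gsp)
```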
Data Collection for Question 3: Which method of CTA, the incremental (1i) or the
individual (3i) (Jury, 2015), is more efficient, as represented by the number of action and
decision steps and by cost and time?
Both this study using the incremental (1i) method and the concurrent study (Jury,
2015) using the individual (3i) method operationalized efficiency in terms of which
knowledge elicitation method can capture as many, if not more, action and decision steps
from the SMEs for less cost and time. Therefore, the final GSPs using the incremental
(1i) and the independent (3i) methods were compared to determine which knowledge
elicitation method yielded a greater number of critical action and decision steps.
Additionally, the total amount of time and cost that was spent to do the incremental (1i)
method as opposed to the total amount of time and cost to do the individual (3i) method
was calculated. Specifically, the total amount of time was calculated by adding the
amount of time spent by the knowledge analyst to conduct the CTA interviews, and the
total cost was calculated by adding up the individual costs of transcribing each CTA
interview.  
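Both efficiency measures reduce to simple sums. A sketch, using the one-dollar-per-minute transcription rate reported in Chapter Four and hypothetical interview durations:

```python
RATE_PER_MINUTE = 1.00  # transcription service rate: $1 per audio minute

def totals(durations_min):
    """Total interview time (as 'Xh YYmin') and transcription cost in dollars."""
    total_min = sum(durations_min)
    hours, minutes = divmod(total_min, 60)
    return f"{hours}h {minutes:02d}min", RATE_PER_MINUTE * total_min

# Hypothetical durations (minutes) for three first-round interviews:
time_label, cost = totals([133, 92, 89])  # "5h 14min" of audio, $314.00
```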
 
CHAPTER FOUR: RESULTS
Overview of Results
This study examined both the declarative and procedural knowledge, represented
by action and decision steps, of four expert secondary English teachers using an
incremental method of CTA to capture their expertise. More precisely, four semi-
structured interviews were conducted with SMEs using the incremental (1i) method
described in Chapter Three to elicit descriptions of procedures used to provide expository
writing instruction to students at the eleventh-grade level. The results of the data analysis
are organized by research questions.  
Research Questions
Question 1
What are the action and decision steps that expert teachers recall when they
describe how they provide expository writing instruction to their eleventh-grade
students?
Inter-rater reliability (IRR). As described in Chapter Three, IRR was calculated
by tallying the number of coded items that were in agreement, and dividing that number
by the total number of coded items. The final result of the IRR was 100% agreement. A
copy of the IRR has been attached in Appendix B.  
Flowchart analysis. A flowchart of SME A’s CTA protocol was also developed
in addition to the IRR. The flowchart not only confirmed the action and
decision steps that were captured during SME A’s interview, but also ensured the fluidity
between all of the action and decision steps that comprised SME A’s expository writing
instruction. Additionally, the flowchart informed the researcher about additional
questions to consider during the protocol review of SME B and SME C. A copy of the
flowchart has been attached in Appendix D.  
Gold standard protocol (GSP). Following the methodology described in Chapter
Three, a GSP was developed to represent what expert teachers do when they teach
expository writing to their students, as well as the critical decisions that are typically
associated with those actions. More precisely, SME A initially contributed six main
procedures that encompass all of the action and decision steps to teach expository
writing. After reviewing SME A’s protocol, SME B then added three more main
procedures that were omitted by SME A. SME C also contributed to the GSP with
modifications to the existing nine main procedures. An example of the process is shown
in Table 2.  
In response to Research Question 1, there were a total of nine main procedures to
inform the GSP of providing expert expository writing instruction to eleventh-grade
students. The final GSP has been attached as Appendix E.
1. Prepare to teach  
2. Introduce/Review expository writing to students  
3. Teach how to formulate an argument  
4. Teach how to support an argument  
5. Conduct close reading of the text to prepare students for writing  
6. Conduct research to cite outside sources  
7. Begin writing the expository essay  
8. Teach how to edit writing  
9. Teach reflection and evaluate students’ writing  
Table 2
Example of Incremental (1i) Process toward the Preliminary Gold Standard Protocol
(PGSP)

SME A
1. Prepare to teach
2. Conduct close reading of a literary text to prepare students for writing
3. Teach how to formulate an argument
4. Teach how to support an argument
5. Teach how to edit writing
6. Teach how to write reflections on their writing

SME B
1. Prepare to teach
2. Introduce expository writing to students [new]
3. Conduct close reading of a literary text to prepare students for writing
4. Teach how to formulate an argument
5. Teach how to support an argument
6. Conduct research to cite outside sources [new]
7. Begin writing the expository essay [new]
8. Teach how to edit writing
9. Teach how to write reflections on their writing

SME C
1. Prepare to teach
2. Introduce/Review expository writing to students [revised]
3. Teach how to formulate an argument [moved]
4. Teach how to support an argument [moved]
5. Conduct close reading of a literary text to prepare students for writing [moved]
6. Conduct research to cite outside sources
7. Begin writing the expository essay
8. Teach how to edit writing
9. Teach reflection and evaluate students’ writing [revised]

Note. Steps marked [new] indicate a newly added step, steps marked [moved] indicate a new
arrangement of steps, and steps marked [revised] indicate revised wording. A column for SME D
was not added because nothing was added or changed to the main procedures during the interview
with SME D.

Each of these nine main procedures houses the expertise of the SMEs, including action
and decision steps, reasons, standards, conditions, etc. For example, under Procedure 1:
Prepare to teach, teachers are directed to “Select either an article or a novel for students
to read and eventually use to write an expository essay.” This single action step is
followed by further decision steps, such as “If it is the beginning of the school year (i.e.,
first quarter), then teach expository writing using an article,” and “If it is the second
quarter of the school year or later, then teach expository writing using a novel.” Those
decision steps are also followed by reasons, such as “Starting with an article can be more
effective because it takes less time to read, which allows you to focus more on the writing
aspect,” as well as conditions, such as “You will have to teach more rhetorical devices
when doing a close reading of an article, whereas you would teach more literary devices
with a literary text.” The following sections describe the disaggregated results in further
detail.
Recalled action and decision steps. In order to conduct an analysis of each
SME’s action and decision steps and determine the total number of steps recalled by each
SME, the researcher entered every action and decision step of the final GSP in its own
row of a Microsoft Excel spreadsheet. Each row with a step of the final GSP was coded
with either an “A” for an action step or a “D” for a decision step. Each SME was also
assigned a letter (e.g. SME A, SME B, SME C, and SME D) for identification purposes,
as well as to indicate the order in which the SMEs were interviewed. The number of
action and decision steps was then determined through frequency counts. In other words,
all action and decision steps that were included in an individual SME’s protocol and the
GSP were marked with a “1.” The numbers of action and decision steps for each SME were
totaled at the bottom of each respective SME’s column. The spreadsheet analysis is
attached as Appendix F. Table 3 also provides a total of each SME’s action and decision
steps.  
Action and decision steps contributed by each SME. Table 3 represents the
disaggregated data of action and decision steps that were elicited through the incremental
(1i) method of CTA for each SME. All of the action and decision steps for each SME in
Table 3 are meant to illustrate the gradual elicitation of automated knowledge as a result of using the  
Table 3
Cumulative Action and Decision Steps Captured for Each SME in the Initial Individual
Protocols

           Action Steps    Decision Steps    Total Action and Decision Steps
SME A      88              25                113
SME B      141             40                181
SME C      155             46                201
SME D      159             51                210

Note. A fourth row was included for this table because SME D provided new action and decision steps
during the interview.

incremental (1i) method of CTA. It should also be noted that due to the progressive
design of data collection in the incremental (1i) method, many of the action and decision
steps identified in Table 3 are not unique to one individual SME. The disaggregation of
action and decision steps among all four SMEs is also visually represented in Figure 3.
Collectively, SME A recalled a total of 113 steps, of which 88 were action
steps and 25 were decision steps, and contributed 53.81% to the GSP. SME B added 53
new action steps and 15 new decision steps to SME A’s protocol for a total of 181 steps,
and contributed 86.19% to the GSP. Likewise, SME C added 14 new action steps and 6
new decision steps for a total of 201 steps, and contributed 95.71% to the GSP. The
researcher then took the PGSP to SME D who was asked to verify the action and decision
steps of the PGSP. While verifying the PGSP, SME D was able to add 4 new action steps
and 5 new decision steps, for a grand total recall of 210 steps to complete the GSP. As
such, SME D’s additional steps represent 100% of the GSP. In total, all four SMEs
recalled more action steps than decision steps.  
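The cumulative GSP contributions reported above follow directly from the step totals in Table 3; a short sketch reproduces the percentages:

```python
GSP_TOTAL = 210  # total action and decision steps in the final GSP

# Cumulative steps captured after each SME's interview or review (Table 3).
cumulative = {"SME A": 113, "SME B": 181, "SME C": 201, "SME D": 210}

contribution = {sme: round(100 * steps / GSP_TOTAL, 2)
                for sme, steps in cumulative.items()}
# {'SME A': 53.81, 'SME B': 86.19, 'SME C': 95.71, 'SME D': 100.0}
```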
Figure 3. Number of action steps, decision steps, and action and decision steps for SME
A, SME B, SME C, and SME D captured through the incremental (1i) method of CTA.

Action and decision steps captured in the follow-up interviews. As an
additional analysis, the researcher wanted to know how many action and decision steps
were added during the follow-up interviews with each SME. The results of this analysis
are shown in Table 4.
With the exception of SME D, when the SMEs participated in second-round
interviews and reviewed their individual protocols, the process resulted in increased
action and decision steps. For example, SME A contributed 16.8% more action steps and
6.2% more decision steps, for a total of 23% more action and decision steps. SME B also
contributed 12.7% more action steps and 1.7% more decision steps, for a total of 14.4%
more action and decision steps. Additionally, SME C contributed 2.5% more action steps
and 1.5% more decision steps, for a total of 4% more action and decision steps during the
second rounds of CTA interviews.


Table 4
Additional Expert Knowledge Captured, in Action and Decision Steps, During Follow-up
Interviews

           Action Steps    Decision Steps    Total Action and Decision Steps
SME A      19              7                 26
SME B      23              3                 26
SME C      5               3                 8
SME D      0               0                 0

Alignment of SMEs in describing the same action and decision steps. The
spreadsheet analysis was also used to determine the number and percentage of action and  
decision steps described by each SME that were in complete alignment, substantial
alignment, partial alignment, or no alignment. More precisely, if an action or decision
step could be attributed to only one SME, a “1” was added in the column indicating “no
alignment.” If an action or decision step was confirmed by two of the four SMEs, then
the number “2” was added in the column indicating that the step was in “partial
alignment.” If an action or decision step was confirmed by three of the four SMEs, then a
“3” was added in the column indicating the step was in “substantial alignment.” Lastly, if
an action or decision step was confirmed by all four SMEs, then a “4” was added in the
column indicating that the step was in “complete alignment.” This same process of
frequency counts was used to determine alignment of action steps and decision steps.
Table 5 shows the results of this analysis.
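The alignment classification amounts to bucketing each GSP step by how many of the four SMEs confirmed it. A sketch with hypothetical confirmation counts for eight steps:

```python
from collections import Counter

LABELS = {4: "complete", 3: "substantial", 2: "partial", 1: "no alignment"}

def classify(confirmations):
    """Bucket each step's confirmation count (1-4 SMEs) into an alignment label."""
    return Counter(LABELS[n] for n in confirmations)

# Hypothetical counts for eight GSP steps:
tally = classify([4, 4, 3, 2, 4, 1, 3, 4])
# Counter({'complete': 4, 'substantial': 2, 'partial': 1, 'no alignment': 1})
```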
 
 
Table 5
Number and Percentage of Action and Decision Steps that are in Complete Alignment,
Substantial Alignment, Partial Alignment, and No Alignment

                         Action Steps     Decision Steps    Total Action and Decision Steps
Complete Alignment       88 (77.88%)      25 (22.12%)       113 (53.81%)
Substantial Alignment    54 (79.41%)      15 (22.06%)       68 (32.38%)
Partial Alignment        14 (70%)         6 (30%)           20 (9.52%)
No Alignment             4 (44.44%)       5 (55.56%)        9 (4.29%)

Collectively, the SMEs were in “complete alignment” on 113 of the combined
action and decision steps, or approximately 53.81% of the GSP. The SMEs were also in
“substantial alignment” on 68 action and decision steps, or 32.38% of the GSP, in “partial
alignment” on 20 action and decision steps, or 9.52% of the GSP, and in “no alignment” on
9 action and decision steps, or 4.29% of the GSP.  
A closer look at the results also revealed that the completely aligned steps
comprised 88 action steps and 25 decision steps, or 77.88% and 22.12% of that category
respectively. Furthermore, the substantially aligned steps comprised 54 action steps
and 15 decision steps, or 79.41% and 22.06% respectively. The partially aligned steps
comprised 14 action steps and 6 decision steps, or 70% and 30% respectively. Lastly, the
non-aligned steps comprised 4 action steps and 5 decision steps, or 44.44% and 55.56%
respectively. The disaggregation of alignment among all four SMEs is also visually
represented in Figure 4.
Question 2
What percentage of action and/or decision steps, when compared to a gold
standard, do expert teachers omit when they describe how they provide expository writing
instruction to their eleventh-grade students?
Figure 4. Number and percentage of action and decision steps that are in complete
alignment, substantial alignment, partial alignment, and no alignment.

Total knowledge omissions. To answer Research Question Two, the Microsoft
Excel spreadsheet that was used to analyze the percentage of action and decision steps
captured was also used to determine the number of action and decision steps omitted by
the individual SMEs when describing expository writing instruction at the eleventh-grade
level. Action and decision steps that were included in the gold standard protocol, but
omitted by the SME, were marked “0.” The total number of action and decision steps
omitted was divided by the total number of cumulative action and decision
steps for all SMEs in the GSP, which produced a percentage of knowledge omissions for
action steps, decision steps, and total steps. Table 6 provides a comparison of action and  
Table 6
Total Action and Decision Steps, or Expert Knowledge, Omissions by SMEs when
Compared to the Gold Standard Protocol

                   Action Steps         Decision Steps       Total Action and Decision Steps
Steps Omitted      Omitted    %         Omitted    %         Omitted    %
SME A              71         44.65     26         50.98     97         46.19
SME B              18         11.32     11         21.57     29         13.81
SME C              4          2.52      5          9.80      9          4.29
SME D              0          0.00      0          0.00      0          0.00
Mean Omissions     23.25      14.62     10.50      20.59     33.75      16.07
Range              71                   26                   97
SD                 28.37                9.76                 38
decision steps omitted by each SME when compared to the GSP, including the range and
standard deviation.
Among all of the SMEs, the total omission of action and decision steps ranged
from 0 to 97 steps. Between action and decision steps, a wider range of omissions
occurred with recalling action steps (i.e. 71) than decision steps (i.e. 26). Furthermore,
the mean omission of total action and decision steps was 33.75, or 16.07% (SD ± 38).
Further breakdown of the data revealed that the mean number of action steps omitted was
23.25, or 14.62% (SD ± 28.37), and the mean number of decision steps omitted was 10.50,
or 20.59% (SD ± 9.76).
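The mean, range, and SD values in Table 6 can be reproduced with Python's statistics module; the SD figures match the population standard deviation (`pstdev`), not the sample one:

```python
from statistics import mean, pstdev

# Steps omitted by SMEs A-D when compared to the GSP (Table 6).
action_omitted = [71, 18, 4, 0]
decision_omitted = [26, 11, 5, 0]
total_omitted = [a + d for a, d in zip(action_omitted, decision_omitted)]  # [97, 29, 9, 0]

summary = {
    "mean": round(mean(total_omitted), 2),             # 33.75
    "range": max(total_omitted) - min(total_omitted),  # 97
    "sd": round(pstdev(total_omitted), 2),             # 38.0
}
```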
Analysis of action and decision step omissions. Figure 5 represents the
omissions of action and decision steps, or expert knowledge omissions data for SMEs A
through D when compared to the GSP.
SME A, whose data was used to construct the initial CTA protocol, omitted the
most critical information, with 71 action steps and 26 decision steps, for a total omission of
97 steps, or 46.19% of the GSP. SME B omitted far fewer steps than SME A, with 18
action steps and 11 decision steps, for a total omission of 29 steps, or 13.81% of the GSP.
The decline in omissions continued with SME C, who omitted 4 action steps and 5 decision
steps, for a total omission of only 9 steps, or 4.29% of the GSP. SME D had no omissions,
as this SME represented 100% of the steps.

Figure 5. Total SME Knowledge Omissions when Compared to the Gold Standard
Protocol.


Question 3
Which method of CTA, the incremental (1i) or the individual (3i) (Jury, 2015), is
more efficient, as represented by the number of action and decision steps and by cost
and time?
Following the methodology described in Chapter Three to determine the overall
efficiency between the incremental (1i) method and the individual (3i) method (Jury,
2015) of CTA, the results do not indicate a clear, definitive answer. The following
sections provide detailed comparisons between the incremental (1i) and the individual
(3i) methods with regard to time, cost, and expert knowledge elicitation.  
Total time. The total amount of time spent by the researchers of this study and the
concurrent study to conduct two rounds of CTA interviews with four SMEs and to
develop CTA protocols was calculated to determine, in part, which method of CTA is
more efficient. Table 7 provides the amount of hours and minutes spent by the
researchers to conduct their respective method of CTA.  
Table 7

Comparison of Total Time Spent Doing the Incremental (1i) Method and the Individual
(3i) Method of CTA

                                Incremental (1i)    Individual (3i)    Difference
Time Spent Conducting SME       09h 03min           11h 46min          2h 43min
Interviews

The results of the data show that the incremental (1i) method is more time
efficient than the individual (3i) method of CTA. Specifically, the interviews using the
incremental (1i) method lasted a total of 9 hours and 3 minutes, while the interviews
using the individual (3i) method lasted a total of 11 hours and 46 minutes. The results
indicate a difference of 2 hours and 43 minutes in favor of the incremental (1i) method.
Total cost. The total cost to transcribe audio recordings of CTA interviews was
also calculated by the researchers of this study and the concurrent study to determine
efficiency. Both of the researchers of this study and the concurrent study used the same
transcription service to transcribe the first round of interviews with their respective first
three SMEs. The transcription service charged one dollar per minute for each audio
recording. Table 8 provides the total costs, as well as the individual costs of each SME
interview from both the incremental (1i) and the individual (3i) methods.
Table 8

Comparison of Total Cost Doing the Incremental (1i) Method and the Individual (3i)
Method of CTA

First-Round Interviews    Incremental (1i)    Individual (3i)    Difference
SME A/1                   $133.00             $106.25            $26.75
SME B/2                   $92.00              $205.00            $113.00
SME C/3                   $89.00              $163.75            $74.75
Total Costs               $314.00             $475.00            $161.00

Note. The researcher chose to be more thorough and decided to transcribe and code the interviews for
SME B and SME C even though those steps are typically excluded in the incremental (1i) method.

In sum, the incremental (1i) method proved to be more cost effective than the
individual (3i) method of CTA. The total cost to transcribe the audio recordings for
SMEs A, B, and C using the incremental (1i) method was $314.00. On the other hand, the
total cost to transcribe the audio recordings for SMEs A, B, and C using the individual
(3i) method was $475.00. The difference in costs amounted to a total of $161.00 in favor
of the incremental (1i) method. However, the cost to transcribe audio recordings using
the incremental (1i) method was not more cost efficient with each individual SME
interview. For instance, the first-round interview with SME A was less expensive using
the individual (3i) method at $106.25, as opposed to the incremental (1i) method at
$133.00, a difference of $26.75. Conversely, the costs to transcribe audio recordings for
SME B and SME C using the incremental (1i) method, $92.00 and $89.00 respectively,
were less expensive than the costs to transcribe the audio recordings for SME B and SME
C using the individual (3i) method, $205.00 and $163.75 respectively.  
Knowledge elicitation. The total number of action and decision steps from the
incremental (1i) method and the individual (3i) method was also compared to determine
which knowledge elicitation method is more efficient. Table 9 provides a side-by-side
comparison of total action and decision steps generated from the incremental (1i) and the
individual (3i) method GSPs.  
Table 9
Comparison of Overall Action and Decision Steps from the Individual (3i) Method and
Incremental (1i) Method Gold Standard Protocols

                                   Incremental (1i)    Individual (3i)    Difference    % Difference
Total Action and Decision Steps    210                 333                123           45.30%
Action Steps                       159                 281                122           55.45%
Decision Steps                     51                  52                 1             1.94%

A comparison of the two knowledge elicitation methods revealed that the
individual (3i) method captured a greater total of action and decision steps than the
incremental (1i) method. Specifically, SMEs collectively recalled 210 nonrepeating
action and decision steps with the incremental (1i) method and 333 nonrepeating action
and decision steps with the individual (3i) method. The individual (3i) method captured a
total of 123 more action and decision steps, a difference of 45.30%. A closer look at the
comparison revealed that the individual (3i) method captured significantly more action
steps, with 281 steps compared to the 159 action steps captured by the incremental (1i)
method. The individual (3i) method captured a total of 122 more action steps, a
difference of 55.45%. On the other hand, the number of decision steps was nearly
identical between the incremental (1i) and the individual (3i) methods. The individual
(3i) method captured 52 decision steps, while the incremental (1i) method captured 51
decision steps. The individual (3i) method captured only 1 more decision step, a
difference of 1.94%.
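The percentage differences in Table 9 appear to be the absolute difference expressed relative to the mean of the two methods' counts; a sketch under that assumption reproduces the reported values:

```python
def pct_difference(a, b):
    """Absolute difference as a percentage of the mean of the two values."""
    return round(100 * abs(a - b) / ((a + b) / 2), 2)

# Totals from Table 9 (incremental 1i vs. individual 3i):
total_steps = pct_difference(210, 333)    # 45.30
action_steps = pct_difference(159, 281)   # 55.45
decision_steps = pct_difference(51, 52)   # 1.94
```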
Expert contribution of action and decision steps. Figure 6 reports action and
decision steps recalled by each SME for the two knowledge elicitation methods.  

Figure 6. Total Expert Knowledge Recall for the Incremental (1i) Method and Individual
(3i) Method.
Regardless of the method that was used, SMEs consistently recalled more action
steps than decision steps. SMEs who were interviewed using the incremental (1i) method
recalled a total of 543 action steps and 162 decision steps for a grand total of 705 action
and decision steps. Collectively, the SMEs interviewed using the incremental (1i)
method recalled approximately three times more action steps than decision steps, with
action steps accounting for 77.02% and decision steps for 22.98% of the total. After two rounds of CTA
interviews, SME A recalled a total of 113 steps, of which 88 were action steps (i.e.
77.88%) and 25 were decision steps (i.e. 22.12%), and recalled 55.76% more action than
decision steps. SME B not only confirmed the 113 steps of SME A’s protocol, but also
added 68 new action and decision steps, for an incremental contribution of 181 steps.
More specifically, SME B contributed 141 action steps (i.e. 77.90%) and 40 decision
steps (i.e., 22.10%), which calculates to a 55.80% difference. When the protocol was taken to
SME C, she confirmed the 181 steps that SME A and B contributed and was also able to
add 20 new action and decision steps of her own for a total incremental contribution of
201 steps. More specifically, SME C contributed 155 action steps (i.e. 77.11%) and 46
decision steps (i.e. 22.89%), which results in a 54.22% difference. The PGSP that was
now created by SMEs A, B, and C was then taken to a fourth expert, SME D, for
verification. While reviewing the PGSP, SME D not only verified the 201 action and
decision steps SMEs A, B, and C contributed, but also identified some omissions and
added 9 new action and decision steps, for an incremental total of 210 steps. More
specifically, SME D was able to contribute 159 action steps (i.e. 75.71%) and 51 decision
steps (i.e. 24.29%), a difference of 51.42%.  
Likewise, SMEs interviewed under the individual (3i) method recalled a total of
633 action steps and 122 decision steps for a grand total of 755 action and decision steps.
Taken all together, the SMEs recalled a total of 83.84% action steps and 16.16% decision
steps under the individual (3i) method. Individually, SME 1 recalled a total of 128 steps,
of which 100 were action steps (i.e. 78.13%) and 28 were decision steps (i.e. 21.87%),
and was able to recall 56.26% more action than decision steps. SME 2 recalled a total of
150 steps, of which 132 were action steps (i.e. 88%) and 18 were decision steps
(i.e.12%), and was able to recall 76% more action than decision steps. SME 3 recalled a
total of 147 steps, of which 123 were action steps (i.e. 83.67%) and 24 were decision
steps (i.e. 16.33%), and was able to recall 67.34% more action than decision steps. When
asked to verify the PGSP, SME 4 recalled 330 steps, of which 278 were action steps (i.e.
84.24%) and 52 were decision steps (i.e. 15.76%). Furthermore, SME 4 recalled 68.48%
more action steps than decision steps.  
Incremental (1i) method analysis. Upon further analysis of the incremental (1i)
method, a total of 159 nonrepeating action steps and 51 nonrepeating decision steps were
captured from SMEs A through D. More precisely, SME A recalled 55.35% of
action steps and 49.02% of decision steps when compared to the incremental (1i) method
GSP. Including the steps in agreement with SME A’s initial protocol, SME B
contributed 88.68% of action steps and 78.43% of decision steps. Additionally, SME C
contributed 97.48% of action steps and 90.20% of decision steps including the steps in
agreement with SME B’s incremental protocol. SME D contributed 100% of action and
decision steps.  
Individual (3i) method analysis. Further analysis of the individual (3i) method
revealed that it captured a total of 281 nonrepeating action steps and 52 nonrepeating
decision steps from SMEs 1 through 4. When compared to the individual (3i) GSP,
SME 1 recalled 35.59% of action steps and 53.85% of decision steps. Moreover, SME 2
recalled 46.98% of action steps and 34.62% of decision steps, SME 3 recalled 43.77% of
action steps and 46.15% of decision steps, and SME 4 recalled 98.93% of action steps
and 100% of decision steps.  
Summary
A final GSP of expository writing instruction was developed as a result of
interviewing four expert English teachers using the incremental (1i) method of CTA.
Once the data from the GSP was disaggregated and analyzed, the results yielded several
significant findings. For example, in total, all four SMEs contributed more action than
decision steps when describing how they provide expository writing instruction to
eleventh-grade students. However, a closer look at the data of each SME’s contributions
revealed that this was not entirely accurate. Additional analysis showed that SMEs were
able to add more action and decision steps during the follow-up interviews, and were
completely aligned on over half of the GSP. Furthermore, SMEs did indeed omit critical
action and decision steps; however, they omitted progressively fewer critical steps with
each additional interview. Lastly, a definitive conclusion could not be drawn as to which
method of CTA, the incremental (1i) or the individual (3i), is more efficient. Whereas
this study using the incremental (1i) method proved to be more time and cost efficient,
the concurrent study using the individual (3i) method (Jury, 2015) captured significantly
more action and decision steps. Further analysis revealed, however, that the numbers of
decision steps captured using both the incremental (1i) and the individual (3i) methods
were nearly identical.  
 
CHAPTER FIVE: DISCUSSION
Overview of Study
In general, CTA is defined as a family of practices used to elicit, analyze, and
organize both the explicit and implicit knowledge and goal structures experts use to
perform complex tasks (Chipman, 2000; Clark, et al., 2008). Moreover, the effects of
CTA have been researched across multiple domains, including medicine, military, and
computer software design (Clark, 2014). However, studies on the effects of CTA in the
area of K12 education are relatively new. Therefore, one of the purposes of this study is
to add to the existing research on CTA by capturing knowledge in the form of action and
decision steps of expert teachers who provide expository writing instruction to eleventh-
grade students.  
Furthermore, as research has shown that experts may omit up to 70% of critical
information when describing how to perform a complex task to novices (Clark, 2014;
Clark, et al., 2008; Feldon & Clark, 2006), this study also sought to identify the
omissions of critical knowledge and skills by expert teachers when they recall how to
provide expository writing instruction to eleventh-grade students.  
Thus far, only two studies have attempted to determine which method of CTA,
the incremental (1i) or the individual (3i), is more efficient. Specifically, Flynn (2012)
discovered that the incremental (1i) method captured more decision steps, which
operationalized the critical information required to perform a task, in less time and with
less money than the individual (3i) method. Zepeda-McZeal (2014), on the other hand,
found that the individual (3i) method was more effective at capturing a greater total number of
action and decision steps than the incremental (1i) method. Therefore, in order to shed
further light on this issue, this study using the incremental (1i) method and the concurrent
study using the individual (3i) method (Jury, 2015) sought to replicate aspects of both
Flynn (2012) and Zepeda-McZeal’s (2014) studies to determine which of these two
methods is more efficient.
Lastly, the knowledge captured from SMEs may be used to inform and design
instruction. In fact, CTA-informed instruction has been shown to be more effective than
traditional training methods that are based on behavioral task analysis and the self-
reporting of experts. Moreover, Means and Gott (1988) reported that five years of work
experience and knowledge could be condensed into only 50 hours of CTA-based
training. Therefore, the goal of this study is to inform future pre-service TPPs and in-
service teachers’ PD on expository writing instruction.  
The following sections of this chapter discuss the process of conducting CTA
with expert teachers in the context of prior CTA studies, followed by the results of the
study organized by research questions, limitations of the study, implications of the study,
and areas for future research.  
Process of Conducting Cognitive Task Analysis
Selection of Experts
As previously stated in Chapter Three, a total of eight SMEs were selected for this
study and the concurrent study (Jury, 2015) within a Southern California School District.
Of the eight SMEs who volunteered to participate in the CTA research, three were
randomly selected and interviewed using the incremental (1i) method along with a fourth
SME for verification. Likewise, three more SMEs were randomly selected and
interviewed using the individual (3i) method, also with a fourth SME for verification.  
Moreover, the literature on expertise suggests that SMEs have a minimum of five,
preferably ten, years of experience (Ericsson, et al., 1993), demonstrated success that is
based on reliable, industry standard outcomes (Ericsson & Charness, 1994), extensive
domain knowledge (Bedard & Chi, 1992), recognition by his/her peers as having
expertise in that particular domain (Berliner, 1986), a wide range of experiences within
the domain (Clark, et al., 2008), and have not instructed others on the performance of the
task within the past year or longer (Yates, 2007).  
The SMEs selected for this study and the concurrent study (Jury, 2015) met all of
the criteria as described in the literature above. However, identifying expert teachers
within the domain of expository writing instruction proved to be a challenge. The
challenge lay not only in the difficulty of identifying the characteristics of expertise for
teachers, but also in how expertise is developed in teaching. For example, in the
elementary grades, a teacher’s reputation, along with classroom observations and
consistently excellent classroom performance on standardized tests, may be taken as
indicators of expertise despite all of the well-known faults inherent in reputational
measures, observation, and standardized tests (Berliner, 1986). Identifying expert
teachers at the secondary level can be even more difficult as students may have five or
more teachers a day, and standardized testing may not occur annually, or may not be tied
clearly to a particular course (Berliner, 1986). In fact, Berliner (1986) discovered that the
level of preparation and criteria used to judge Teacher of the Year candidates was not
nearly as rigorous as when judging prized livestock. Further problems exist
with trying to understand what domains of knowledge are used by expert teachers in
accomplishing their tasks, as well as understanding the differences between teaching
experience and expertise so as not to correlate the two unnecessarily (Berliner, 1986).
Therefore, further research may need to be conducted to more accurately determine what
constitutes expertise in teaching.  
Collection of Data
As outlined in Clark, et al. (2008) and consistent with other research studies
(Canillas, 2010; Flynn, 2012; Tolano-Leveque, 2010), CTA was conducted in the
following five stages: (a) collection of preliminary knowledge; (b) identification of
knowledge representations; (c) application of focused knowledge elicitation methods; (d)
analysis and verification of data collected; and (e) formatting of results for the intended
purpose or application. Moreover, data were collected using the incremental (1i) method
of knowledge elicitation as discussed in the methodology of Chapter Three.  
Results of the data collection showed that CTA interviews using the incremental
(1i) method required a greater length of time than expected. For example, SME A’s initial
CTA interview, which was expected to last approximately an hour and a half, lasted more
than two hours. The interview with SME A was also conducted with the assistance of a
senior researcher, and thus the additional length of time was not a result of a lack of
experience on the part of the novice researcher. Likewise, when the novice researcher for
this study conducted the subsequent interviews with SMEs B and C, the length of the
interviews also extended beyond what was anticipated. A possible explanation for the
unexpected time increase to conduct the interviews may be a result of the nature of the
task. For example, Hoffman (1987) described the degrees of difficulty to which expert
knowledge can be elicited, and during his review of the literature on expert systems, he
consistently found that researchers struggled with the large amounts of time required to
elicit expert knowledge. Furthermore, other CTA studies that have focused on tasks, such
as inserting a central venous catheter (Bartholio, 2010) or performing an open
cricothyrotomy procedure (Crispen, 2010), may take a few hours to successfully
complete. In contrast, teaching eleventh-grade students to write expository essays
typically takes weeks.  
Furthermore, during SME B’s initial CTA interview, SME B questioned a number
of details that had been provided by SME A. For example, SME A recalled developing
writing prompts for students as part of the “Preparing to Teach” main procedure. SME B,
however, remarked how she does not give her students writing prompts. Alternatively,
SME B mentioned how her students create organic arguments. When the researcher
probed further, SME B expressed that she believes writing prompts limit what students
write about. SME A also recalled providing her students with an essay checklist at the
beginning of the school year that serves as a reminder of the elements that should be
present in any of their essays (i.e. MLA formatting, thesis statement, conclusion, etc.).
After reading that step, SME B expressed that she would prefer the term criteria as
opposed to checklist, finding the latter too formulaic. Moreover, SME B disagreed with the
use of an action step by SME A where she cuts up several worked examples of body
paragraphs into separate strips of paper so that each sentence of the paragraphs is on a
strip of paper. SME A then distributes the strips of paper to the students in the class and
instructs them to go around the room to try and find the other students with the strips of
paper that logically and accurately complete the body paragraph. After some probing by
the researcher, SME B asserted that this exercise was too simplistic for eleventh-grade
students. Furthermore, unlike SME A who typically models a method of essay writing
that is commonly referred to as the Jane Schaeffer method, SME B was strictly against
using anything that may be perceived as coming from the Jane Schaeffer method. For
example, SME B refuses to let her students write essays that are five paragraphs long
because the five-paragraph essay has become so closely associated with the Jane
Schaeffer method. In fact, SME B informs her students that they may write essays that
are four, six, or seven paragraphs long, but just not five.
The differences that emerged during SME B’s review of SME A’s protocol may
be explained by the influence that student demographics have on teachers’ classroom
instruction. Whereas SME A has spent the past seventeen years teaching at a school that
serves a large English Language Learner (ELL) population, SME B has spent the past
thirteen years teaching at a school that serves a student population where English is the
primary language spoken in the home. More specifically, the latest data from the
California Department of Education (CDE; n.d.) identified a total of 663 ELLs in
grades nine through twelve at the school where SME A teaches, of which 143 were in the
eleventh-grade. In contrast, SME B’s school was identified as having a total of 81 ELL
students in grades nine through twelve, of which 19 were in the eleventh-grade.
Both school-wide and at the eleventh grade, the difference in ELL populations between
the two schools is considerable. When Bloom and
Peters (2012) conducted their study of student teachers and their field placement, they
found that the efficacy levels of student teachers decreased as the number of students of
color in the classroom increased. Bloom and Peters’ (2012) research demonstrates that a
correlation does exist between teachers’ attitudes and the diverse students they teach.
Furthermore, when teachers work at schools with a more diverse student population, they
may unintentionally allow their attitudes, such as cultural biases, to influence their
instruction in ways that may not always benefit students and their achievement (Green,
Kent, Lewis, Feldman, Motley, Baggett, Shaw Jr., Byrd, & Simpson, 2011; Portes &
Smagorinsky, 2010). Therefore, further research may be necessary to understand the
relationship between student demographics and writing instruction.  
Discussion of Findings
No formal hypotheses were developed for this research study. Rather, the study
was guided by the following three research questions.  
Question 1
What are the action and decision steps that expert teachers recall when they
describe how they provide expository writing instruction to their eleventh-grade
students?
In total, four SMEs were interviewed using the incremental (1i) method to capture
the total number of action and decision steps they recall to provide expository writing
instruction to students at the eleventh-grade. As a result of using the incremental (1i)
method, a total of 210 action and decision steps were captured, of which 159 were action
steps and 51 were decision steps.
Action steps versus decision steps.  Analyzing only the percentages of total
action and decision steps, all four SMEs consistently recalled more action steps than
decision steps, including the results during the second-round interviews. These results
suggest the impact of expertise upon knowledge recall. Previous studies have
demonstrated that although automation of expert knowledge frees up working memory to
respond to novel problems and alleviate cognitive overload, recalling those critical action
and decision steps once they have been automated becomes difficult (Clark, 2014; Clark
& Estes, 1996; Ericsson, 2004). For example, Clark (2014) suggested that experts in the
healthcare field typically recall more action steps because they form a “mental image”
that can be recalled more easily, unlike decision steps, which do not. Likewise, Canillas
(2010) found that SMEs are able to consistently describe “how” to perform a task as
opposed to “when” to do a task. Canillas’ (2010) research revealed that experts described
75.8% action steps compared to 24.2% decision steps when describing the critical
information required to properly insert a central venous catheter.  
However, analyzing the additional action and decision steps of each individual
SME provided a slightly different result. Consistent with the earlier findings, SMEs A, B,
and C contributed more action steps than decision steps during first- and second-round
interviews. SME D, on the other hand, contributed 4 action steps and 5 decision steps
during the review of the PGSP. Based on research that SMEs recall action steps more
easily than decision steps, the results of SME D are unexpected. One possible explanation
for the marginal increase of decision steps over action steps may be that SME D
benefitted more than SMEs A, B, and C from having the most complete iteration of the
protocol describing expository writing instruction. Unlike SMEs A, B, and C, SME D
was asked primarily to review the protocol and verify the action and decision steps that
should go into providing eleventh-grade students with effective expository writing
instruction. According to Sweller (1988), a heavy cognitive load has been shown to
impede performance. Therefore, as a result of not having the
pressure of providing critical steps like SMEs A, B, and C, it may be that the cognitive
load of SME D was reduced, which then opened the door for more of the higher-order
thinking that is necessary in the formation of decision steps.  
The results also showed that SMEs progressively recalled a smaller percentage of
action and decision steps with each additional incremental interview. In other words,
SME A contributed the greatest percentage of non-repeating action and decision steps,
followed by SME B, SME C, and SME D, who contributed the least. These particular
results support the existing research that suggests interviewing three to four SMEs is
optimal for knowledge elicitation (Bartholio, 2010; Crispen, 2010). Moreover, research
has demonstrated that when more than four SMEs are interviewed to conduct a CTA, the
law of diminishing marginal utility goes into effect (Bartholio, 2010; Chao & Salvendy,
1994). Only when the researcher interviewed the fourth and final SME did both the
percentages of new action and decision steps fall below 10%, thus reaching the point
of diminishing marginal utility.  
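As an illustrative check, the diminishing-returns pattern can be expressed as the fraction of new, nonrepeating steps each incremental interview added. This is a sketch only: the cumulative counts below are inferred from the percentages reported earlier in this chapter (e.g., 55.35% of 159 action steps = 88 for SME A), not taken from the study's data files.

```python
# Cumulative nonrepeating (action, decision) steps after each incremental
# interview; counts inferred from the percentages reported in the text.
cumulative = {"SME A": (88, 25), "SME B": (141, 40),
              "SME C": (155, 46), "SME D": (159, 51)}
TOTAL_ACTION, TOTAL_DECISION = 159, 51

prev_a, prev_d = 0, 0
for sme, (a, d) in cumulative.items():
    new_action_pct = 100 * (a - prev_a) / TOTAL_ACTION
    new_decision_pct = 100 * (d - prev_d) / TOTAL_DECISION
    both_below_10 = new_action_pct < 10 and new_decision_pct < 10
    print(f"{sme}: {new_action_pct:.2f}% new action, "
          f"{new_decision_pct:.2f}% new decision, both < 10%: {both_below_10}")
    prev_a, prev_d = a, d
```

Only at SME D, the fourth interviewee, do both percentages drop below 10%, which is the pattern the diminishing-marginal-utility argument rests on. The inferred counts are also consistent with the 181 steps reported for SMEs A and B combined (141 + 40).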
Question 2
What percentage of action and/or decision steps, when compared to a gold
standard do expert teachers omit when they describe how they provide expository writing
instruction to their eleventh-grade students?
In order to answer this research question, the final GSP was compared to each
SME’s individual protocol to determine expert knowledge omissions, derived from
omissions of action and decision steps, for the task of teaching how to write expository
essays. Research has demonstrated that when experts describe how they perform a
complex task, they may unintentionally omit up to 70% of the critical information
novices need to learn to successfully perform the task (Clark, et al., 2011; Feldon, 2004;
Feldon & Clark, 2006).  
The results of this study using the incremental (1i) method demonstrate that
though the SMEs do omit critical action and decision steps when recalling performance
of a complex task, the omissions were not nearly as high as 70%. A possible explanation
may have to do with the way in which the incremental (1i) method elicits and captures
expertise. In other words, SMEs B and C benefited from the work of SME A, who
provided the initial protocol for expository writing instruction. The average omission of
action and decision steps are consistent with the existing research studies that have
reported higher percentages of omissions for decision steps than action steps when
compared to the GSP (Canillas, 2010; Clark, 2014; Tolano-Leveque, 2010). Furthermore,
an analysis of the disaggregated omission results showed that omissions decreased with
each additional SME interview. These findings are consistent with the research that
suggests that it takes up to three to four experts to reverse the 70% rule (Bartholio, 2010;
Crispen, 2010).  
Question 3
Which method of CTA, the incremental (1i) or the individual (3i) (Jury, 2015), is
more efficient, as represented by the number of action and decision steps and by cost
and time?
The final goal of this study was to replicate aspects of Flynn (2012) and Zepeda-
McZeal’s (2014) dissertations to determine which method of CTA, the incremental (1i)
or the individual (3i), is more efficient. More specifically, Flynn (2012) confirmed that
the incremental (1i) method was more efficient than the individual (3i) method based on
its capacity to capture more decision steps in a shorter amount of time and at a lower cost.
Zepeda-McZeal (2014), on the other hand, concluded that the individual (3i) method was
more effective at capturing critical action and decision steps than the incremental (1i)
method. Although the results of this study using the incremental (1i) method and those of
the concurrent study using the individual (3i) method (Jury, 2015) were inconclusive as
to which method is more efficient, the findings from both studies merited further analysis
and discussion.  
Of the three criteria used to determine the efficiency between the incremental (1i)
and the individual (3i) methods, the most glaring result was the difference between the
total number of critical action and decision steps that were captured in the GSPs of each
method. More precisely, the incremental (1i) method was able to capture a total of 210
action and decision steps, while the individual (3i) method amassed a total of 333 action
and decision steps, for a substantial difference of 123 action and decision steps describing
the same complex task.  
A possible explanation for the individual (3i) method eliciting more action and
decision steps than the incremental (1i) method may be that the SMEs participating in the
incremental (1i) method were unconsciously influenced by the knowledge that the
protocol they were reviewing was developed on the knowledge provided by other
experts. This non-conscious, cognitive process is what Tulving and Schacter (1990) refer
to as priming. Similar to priming, SMEs interviewed using the incremental (1i) method
may also have been influenced by a concept known as groupthink, a way of thinking that
individuals engage in when they are part of a cohesive group (Schneider, Gruman, &
Coutts, 2012). More precisely, groupthink occurs when people in a group setting allow
their desire to comply or maintain group harmony to supersede their desire to challenge
existing beliefs and ideas (Aronson, 2012). In fact, the pressure to conform is so great
that individual group members begin to doubt their own concerns and refrain from
voicing conflicting opinions (Aronson, 2012). For example, whereas SME B added 68
more action and decision steps to SME A’s protocol, SME C added significantly fewer,
with only 20 new action and decision steps to SME B’s protocol. SME C may have
assumed that the protocol was accurate simply because she had been primed with the
knowledge that two SMEs had already approved it and did not review the protocol with
as critical an eye. Likewise, SME C may have succumbed to the effects of groupthink
and unintentionally chosen to maintain the existing knowledge of the previous SMEs as
opposed to challenging what they had described.  
Another example occurred with SME C when SME D discovered a procedural
error while reviewing the PGSP that had been previously verified by SME C. During the
initial interview, SME B had recalled reviewing a list of about 80 topics with the students
in order to help them generate thesis statements. After reading the chosen text, SME B
would ask students to review the list of 80 topics and identify all of the topics the text
possibly addresses. According to SME B, this step was one of the first in helping
students begin developing arguments for their thesis statements. The researcher
had unintentionally placed the action step of using the list of 80 topics under a different
section of the protocol where it did not belong. More specifically, the list of 80 topics was
unintentionally placed as an action step when reading an article with students, as opposed
to reading a novel. However, during the follow-up interview, SME B confirmed the
placement of the action step using the list of 80 topics with an article to teach how
to formulate an argument. SME C also reviewed the protocol and confirmed the
placement of the list of 80 topics. SME D, however, questioned the placement of the list
reasoning that articles are relatively short and focus on a single topic to be
discussed in further detail throughout the rest of the article. SME D then suggested that it
would be more sensible to use a list of 80 topics to begin generating thesis statements
when reading a lengthier and more complex piece of literature, such as a novel. As
confirmation, the researcher contacted SME B through email to clarify with which type
of literature she uses the list of 80 topics. SME B’s email response confirmed that she does
in fact use the list of 80 topics early in the school year with a piece of literature, such as a
short story or a play.  
Another explanation for the significantly larger number of action and decision
steps using the individual (3i) method may be found in the nature of the task. As
previously mentioned, Hoffman (1987) noted the tremendous amount of time that is often
required to elicit expert knowledge. Additionally, Hoffman (1987) offered several criteria
by which to analyze and compare tasks, including brevity of the task, flexibility of the
task, and method efficiency. These three criteria, in particular, denote the differences
between Flynn’s (2012) task and Zepeda-McZeal’s (2014) task. Flynn (2012) analyzed
the task of recruitment interviews conducted by the Army, which may require at most an
hour or two with a single individual. On the other hand, Zepeda-McZeal (2014) analyzed
expository reading instruction for students with mild to moderate learning disabilities,
which would require a number of weeks with multiple students. Therefore, a task such as
Flynn’s (2012) with a shorter time frame and fewer individuals to interact with may be
more conducive to eliciting a greater number of action and decision steps to perform the
specified task.  
Although the individual (3i) method captured significantly more steps,
a closer look at the data revealed that the majority of steps captured by the individual (3i)
method were action steps. Specifically, the individual (3i) method captured a total of 281
action steps, whereas the incremental (1i) method captured a total of 159 action steps, for
a difference of 122 action steps. However, the number of decision steps was nearly
identical between the incremental (1i) and the individual (3i) methods. Specifically, the
incremental (1i) method captured 51 decision steps, while the individual (3i) method
captured only 1 more decision step for a total of 52 decision steps.  
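The comparison between the two methods reduces to simple differences over the two GSP totals. A minimal sketch (totals taken from the text above) makes the asymmetry explicit:

```python
# GSP totals for each method, as reported in the text.
incremental_1i = {"action": 159, "decision": 51}
individual_3i = {"action": 281, "decision": 52}

for kind in ("action", "decision"):
    gap = individual_3i[kind] - incremental_1i[kind]
    print(f"{kind} steps: individual (3i) captured {gap} more")
```

The gap is 122 action steps but only 1 decision step, consistent with the observation that the two methods differ almost entirely in the action steps they elicit.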
One possible explanation for such similar results may be the time intervals
between the first- and second-round interviews with the SMEs. Whereas months passed
between the first- and second-round interviews under the individual (3i) method, only a
week or two elapsed between first- and second-round interviews with the SMEs using the
incremental (1i) method. The length of time may be significant because the data supports
the notion that SMEs recalled more decision steps during the second-round interviews
under the incremental (1i) method than the individual (3i) method. In fact, the individual
(3i) method did not elicit any decision steps during the second-round of SME interviews,
while the incremental (1i) method was able to capture 13 additional non-repeating
decision steps. To date, it is unknown whether any studies have examined the impact of
the time between the first- and second-round SME interviews. Therefore, this area of
CTA may be a topic for future research.  
Another possible explanation for the minimal difference in decision steps may be
that the SMEs experienced more cognitive overload (Sweller, 1988) during their
interviews using the individual (3i) method than the SMEs interviewed using the
incremental (1i) method. The individual (3i) method requires that all three SMEs undergo
an in-depth first-round interview with the researcher. The researcher of this study
accompanied the researcher of the concurrent study to the first round of interviews.
total duration of each individual (3i) interview averaged over 2 hours, with the longest
interview lasting 2 hours and 42 minutes. For comparison purposes, the incremental (1i)
first-round interviews averaged 1 hour and 23 minutes, with the longest interview lasting
2 hours and 12 minutes. A comparison of the decision steps for both methods showed that
35 of the 52 decision steps for the individual (3i) method were elicited within the first
four main procedures of the protocol, while 37 of the 51 decision steps for the
incremental (1i) method were elicited within the final five main procedures of the
protocol. These findings suggest that the way the individual (3i) method asks experts to
recall both the actions and decisions of the task created cognitive overload for the SMEs,
which may be the reason why the decision steps taper off as the protocol progresses.  
Conversely, the SMEs interviewed using the incremental (1i) method may have
experienced less cognitive overload, which then freed up their working memory for more
of the higher-order thinking that is necessary in the formation of decision steps. Research
has shown that people tend to try to conserve cognitive energy in certain situations,
such as when they are asked to recall a decision quickly or experience cognitive overload
(Aronson, 2012). These people are known as “cognitive misers” (p. 122) because they
create cognitive shortcuts: ignoring some information to reduce the cognitive load,
overusing other information to keep from having to search for more, or accepting a
less-than-perfect alternative because it is almost good enough (Aronson, 2012). Thus, in
order to ease their cognitive overload, SMEs interviewed using
the individual (3i) method may have succumbed to their “cognitive miserliness” and
recalled fewer decision steps than was actually possible.  
In sum, this study and the concurrent study (Jury, 2015) sought to determine
which method of CTA, the incremental (1i) or the individual (3i), is more efficient.
Both studies operationalized efficiency in terms of which knowledge elicitation
method can capture as many, if not more, action and decision steps from the SMEs for
less cost and time. The results of this study and the concurrent study (Jury, 2015) revealed
that whereas the individual (3i) method was more efficient in terms of capturing total
action and decision steps, the incremental (1i) method was more efficient in terms of time
and cost. Thus, the results did not provide a definitive answer as to which method is more
efficient across all three criteria.  
Limitations
The present study produced results that are both consistent and inconsistent with
existing CTA research regarding expert knowledge captured in the form of action
and decision steps, expert knowledge omissions, and the relative efficiency of the
incremental (1i) and individual (3i) methods in capturing expertise. The following
sections discuss the limitations of the present study: confirmation bias and threats
to internal and external validity.

Confirmation Bias
When a CTA analyst has experience in a task domain, the analyst has a natural
tendency to edit the knowledge captured from SMEs to align with the analyst’s own
experiences (Clark, 2014). The researcher has taught expository writing to
eleventh-grade students for the past 14 years and therefore had to remain alert to
researcher bias while conducting the CTA study. Although the researcher’s background
and years of experience in the task domain meant that only minimal bootstrapping was
required (Schraagen et al., 2000), conscious effort was needed to avoid imposing
his preexisting expectations and experiences on the data that were collected.
Internal Validity  
Observation of the experts’ knowledge and skills as they performed the task would
have strengthened the internal validity of the data that were captured in this study
and that informed the GSP. However, observation of the SMEs did not occur as part of
this study, and therefore the results cannot presently be validated from this
perspective. Although the scope of this study did not include validating the GSP
through observation, such validation would be an appropriate topic for future research.
External Validity
For this study to have external validity, the results would have to be generalizable
and transferable to other settings with similar domain characteristics. This study
used a mixed-methods approach, but from a qualitative viewpoint; more specifically,
it comprised four separate but related case studies of expert teachers. Merriam
(2009) describes qualitative case study research as anchored in real-life situations
that result in a rich and holistic account of a phenomenon.
Case studies also play an important role in advancing a field’s knowledge base,
particularly in applied fields of study such as education (Merriam, 2009). The issue
of generalizability may actually be more apparent with a case study than with other
types of qualitative research because it focuses on a single unit or instance; at the
same time, this focus allows readers to learn vicariously through the vivid portraits
provided by the researcher’s narrative description (Merriam, 2009). In fact, Erickson
(1986) has argued that the general lies in the particular, and what is learned in a
particular case can be transferred to similar situations (Merriam, 2009). Likewise,
an English teacher can read this study and consider how to apply it in his or her
own workplace.
Implications
First, experts are frequently called upon for their knowledge and skills to teach, to
inform curriculum content and instructional materials, and to mentor and coach others
in performing complex tasks and solving difficult problems (Jackson, 1985). However,
experts have automated their knowledge and may omit up to 70% of critical information
when teaching or training novices. In response, CTA has proven effective at capturing
highly automated expert knowledge, otherwise unavailable for recall, across a variety
of domains. Moreover, the current study supports the use of CTA to capture expert
knowledge and skills in complex instructional tasks, such as teaching expository
writing at the eleventh-grade level.
Previous research has demonstrated that CTA-based training and instruction
enhance performance efficiency by increasing human “speed, accuracy and adaptability”
while providing significant, long-term cost savings (Clark et al., 2008). Velmahos,
Toutouzas, Sillin, Chan, Clark, Theodorou, and Maupin (2004) also concluded that
surgical residents may complete tasks 25% faster, learn 40% more information, and
make 50% fewer critical mistakes. These results suggest that school districts that
use CTA-based training and instruction can similarly produce more highly trained and
skilled teachers for the classroom in less time and at lower cost. In fact,
quantitative analyses indicate that measures of teacher preparation and certification
are by far the strongest correlates of student achievement in reading and math, both
before and after controlling for students’ socio-economic and language status
(Darling-Hammond, 2000; Hanushek, 1992; McQuitty, 2012). Additionally, poor
preparation of teachers is one of the factors contributing to low academic
achievement for a vast majority of minority and low-income children (Ukpokodu, 2002).
Therefore, harnessing the expertise captured within CTA-based instruction may not
only improve the efficiency of teacher development but may also be the most effective
way to move the needle on student achievement.
Moreover, the newly designed CCSS call for students to write expository text at
nearly every grade level and to write more across the curriculum, including subject
areas such as social studies, science, and math. Teachers in content areas other than
English will likely need future PD to help their students with writing. To meet the
expectations of the CCSS, CTA-based instruction may be the solution for providing
in-service teachers, especially those outside of English Language Arts, with
effective writing instruction developed from experts in the field.
Research has also noted that TPPs are inconsistent in their writing instruction and
do not devote the time they should to preparing their candidates to teach writing
(Applebee, 1981; Chamblass & Bass, 1995; Saphier, 2011). Within the context of teacher
education, multiple researchers recommend that teacher educators make their
pedagogical practice more transparent in order to improve the preparation of their
pre-service teachers (Loughran, 2006; Zeichner, 2005). For example, Loughran (2006)
describes such pedagogical elucidation as making the tacit explicit: teacher
educators unpack the “hidden” aspects of their instruction so that students of
teaching might better appreciate not only what teachers know, need to know, and are
able to do, but also how to actively develop, assess, adjust, and articulate such
knowledge in relation to their own teaching. In addition, Zeichner (2005) suggests
that although some teacher educators see their role primarily as passing along
knowledge about good teaching practices, the task of teacher education must also
include developing the pre-service teacher’s ability to judge when to use particular
practices and how to adapt them to the specific circumstances in which he or she is
teaching.
Zeichner (2005) also suggests that cooperating teachers will have to make the
thinking and reasoning that underlie their particular classroom choices more apparent
to their student teachers, who must likewise develop this judgment about when to use
particular practices and how to adapt them. CTA-based instruction has been designed
to address these very issues because it provides novice teachers with access to more
than just what they observe during instruction. Thus, CTA-based writing instruction,
such as the one developed by this study, may provide a foundation for expert writing
instruction for pre-service teachers, or at the
very least, serve as a model for making the tacit aspects of expert instruction more
explicit to beginning teachers.  
As a result of CTA-based research, more TPPs and PDs should seek not only to identify
what expert teachers do with their students, but also to understand when and why they
do it. Teacher educators may therefore want to spend more time thinking out loud with
their pre-service teachers in order to make the tacit aspects of their teaching more
explicit. They may also want to spend more time analyzing video-recorded lessons to
help pre-service teachers focus on the cognitive aspects of a lesson. Likewise, PDs
that meet several times over the span of a few weeks can begin by introducing ELA
teachers to CTA-based instruction, have those teachers return to their school sites
to test its effectiveness against their traditional methods of teaching, and then
reconvene to discuss the results.
Future Research
Although several studies on expertise provide criteria for identifying individuals
who may be deemed experts in other fields, one recommendation for further research is
the identification of experts in the field of education. For example, this study
relied on several criteria to identify expert teachers, one of which was consistent
and exceptionally successful on-the-job experience as measured by student achievement
in expository writing. In this particular case, “success” was based on reliable,
industry-standard outcomes that have been or can be validated, such as standardized
test scores. However, an important and relevant concern, and one that this study did
not address, was the number of variables that can possibly influence student achievement
other than the teacher. A reasonable assumption is that students enter classrooms
each year with varying skill levels, content knowledge, motivation, access to
resources, and psychosocial support, to name a few. With so many factors that can
influence a student’s writing achievement, identifying an expert teacher using
“success” as part of the criteria becomes far more complex.
CTA-guided instruction is also relatively new to the field of education, and it is
unknown whether it has been applied to pre-service or in-service teacher training
programs to date; a search of the literature did not reveal any such studies.
Therefore, future research may consider a randomized experimental study that
implements pre-service and in-service teacher training, particularly for novice
teachers, and compares learning gains from CTA-guided instruction and traditional
instructional methods on expository writing tasks similar to those outlined in this
study. Longitudinal research would also benefit this body of work by determining
short- and long-term learning gains in expository writing.
Furthermore, to optimize knowledge elicitation procedures, future research is needed
to compare the relative effectiveness of the incremental (1i) and individual (3i)
methods in capturing critical declarative and procedural knowledge for complex task
performance in K-12 school settings. Further research within the domain of expository
writing instruction should determine which of the two methodologies is more
appropriate for complex learning tasks. The incremental (1i) method may be a
worthwhile addition to the repertoire of CTA; however, additional research is needed
to determine its overall effectiveness in
capturing complete and accurate knowledge representations of complex, problem-solving
tasks.  
As a final recommendation, further studies should analyze the optimal length of time
required to perform a complex task in a CTA study. As previously mentioned, the
nature of the task can greatly influence the outcomes of CTA (Hoffman, 1987). If the
goal is to maximize the educational benefits of instructional content generated
through CTA, certain complex tasks may be more suitable for CTA based on their
relative time frame.
Conclusion
The purpose of this study was to add to the existing body of research on CTA as an
effective way to capture the knowledge and skills experts use to perform complex
tasks. Previously, CTA has demonstrated its effectiveness in capturing the expert
knowledge novice learners need in order to perform complex tasks in a variety of
domains, including computer software design, the military, and medicine. Only
recently have researchers attempted to demonstrate the effectiveness of CTA within
the field of education.
The aim of this study was threefold: first, to describe the benefits of CTA for
capturing more complete descriptions of the action and decision steps that experts
use when providing expository writing instruction at the eleventh-grade level;
second, to identify the omissions these experts make when recalling those action and
decision steps; and lastly, to determine which of two common CTA methods, the
incremental (1i) and the individual (3i), is more effective at capturing action and
decision steps in less time and at lower cost. The GSP that was developed as a
result of this study identifies both the observable and unobservable behaviors
necessary for novice teachers to replicate expert instruction. Moreover, the
expertise captured within the GSP can serve as the foundation for the instruction of
pre-service teachers enrolled in TPPs and for the PD of in-service teachers at the
eleventh-grade level. By replicating the expertise that so often goes into informing
professional training, CTA-based instruction such as that developed in this study may
lead to significant gains in student writing achievement.
References
ACT. (2005). Crisis at the core: Preparing all students for college and work. Iowa, IA:
ACT, Inc.
ACT. (2007). Aligning postsecondary expectations and high school practice: The gap
defined. Policy implications of the ACT national curriculum survey results
2005-2006. Iowa, IA: ACT, Inc.
Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010).
How learning works: Seven research-based principles for smart teaching. San
Francisco, CA: John Wiley & Sons.
Anderson, J.R. (1982). Acquisition of cognitive skill. Psychological Review, 89(4), 369-
406.
 
Anderson, J. R. (1993). Problem solving and learning. American Psychologist, 48(1),  
25-44.
Anderson, J. R. (1996a). ACT: A simple theory of complex cognition. American
Psychologist, 51(4), 355-365.
Anderson, J. R. (1996b). The architecture of cognition. Mahwah, NJ: Erlbaum.
Anderson, L. W., & Krathwohl, D. (Eds.). (2001). A taxonomy for learning, teaching,  
and assessing: A revision of Bloom's taxonomy of educational objectives. New  
York, NY: Longman.
Anderson, J. R., & Schunn, C. D. (2000). In R. Glaser (Ed.), Advances in instructional
psychology (5). Mahwah, NJ: Erlbaum
Annett, J. (2000). Theoretical and pragmatic influences on task analysis methods. In J.M.
Schraagen, S.F. Chipman & V.L. Shalin (Eds.), Cognitive Task Analysis (pp. 24-
36). Mahwah, NJ: Lawrence Erlbaum Associates.
Applebee, A. N. (1981). Looking at writing. Educational Leadership, 38(6), 458-462.  
Applebee, A. N. & Langer, J. A. (2009). What is happening in the teaching of writing?  
English Journal, 98(5), 18-28.  
Aronson, E. (2012). The social animal, tenth edition. Madison, NY: Worth Publishers.
Banchero, S. (2011). SAT reading, writing scores hit low. The Wall Street Journal.
Retrieved from http://online.wsj.com/news/articles/SB10001424053111904491704576571060049856724
Bartholio, C. W. (2010). The use of cognitive task analysis to investigate how many
experts must be interviewed to acquire the critical information needed to perform
a central venous catheter placement (Doctoral dissertation). Retrieved from
http://digitallibrary.usc.edu/cdm/ref/collection/p15799coll127/id/385767
Beck, S. W., Llosa, L., & Fredrick, T. (2013). The challenges of writing exposition:
Lessons from a study of ELL and non-ELL high school students. Reading &
Writing Quarterly: Overcoming Learning Difficulties, 29(4), 358-380.  
Bedard, J. & Chi, M.T.H. (1992). Expertise. Current Directions in Psychological  
Science, 1(4), 135-139.
Berliner, D. C. (1986). In pursuit of the expert pedagogue. Educational Researcher,  
15(7), 5-13.  
Berman, R. A., & Nir-Sagiv, B. (2007). Comparing narrative and expository text
construction across adolescence: A developmental paradox. Discourse processes,  
43(2), 79-120.

Berryman, S. E. (1993). Learning in the workplace. In L. Darling-Hammond, Ed.,  
Review of Research on Education, vol. 19. Washington, D. C.: American  
Educational Research Association.  
Bloom, D. S., & Peters, T. (2012). Student teaching experience in diverse settings, white  
racial identity development and teacher efficacy. Journal of Educational and  
Developmental Psychology, 2(2), 72-84.  
Brimi, H. (2012). Teaching Writing in the Shadow of Standardized Writing Assessment:
An Exploratory Study. American Secondary Education, 41(1), 52-77.
California Department of Education (CDE; 2008). English-language arts study guide:  
California High School Exit Examination. Retrieved from  
http://www.cde.ca.gov/ta/tg/hs/documents/studyela08sec6.pdf
California Department of Education. (CDE; n.d.). Data quest. Retrieved January 25,
2015, from http://dq.cde.ca.gov/dataquest/
Canillas, E. N. (2010). The use of cognitive task analysis for identifying the critical  
information omitted when experts describe surgical procedures (Doctoral
dissertation). University of Southern California, Los Angeles, CA.  
Cardenas, C. (n.d.). CSU: The California state university: Working for California.
Retrieved from http://www.calstate.edu/eap/
Chamblass, M. S., & Bass, J. A. (1995). Effecting changes in student teachers’ attitudes
toward writing. Reading Research and Instruction 35(2), 153-160.  
Chandrasegaran, A. (2013). The effect of a socio-cognitive approach to teaching writing
on stance support moves and topicality in students’ expository essays. Linguistics
and Education, 24(2), 101-111. doi: http://dx.doi.org/10.1016/j.linged.2012.12.005
Chao, C. J., & Salvendy, G. (1994). Percentage of procedural knowledge acquired as a
function of the number of experts from whom knowledge is acquired for
diagnosis, debugging, and interpretation tasks. International Journal of
Human-Computer Interaction, 6(3), 221-233.
Chi, M. T. H. (2006). Two approaches to the study of experts’ characteristics. In K. A.
Ericsson, N. Charness, P. J. Feltovich, & R. R. Hoffman (Eds.), The Cambridge  
handbook of expertise and expert performance (pp. 21–30). New York:  
Cambridge University Press
Chipman, S.F. (2000). Introduction to Cognitive Task Analysis. In J.M. Schraagen,S.F.
Chipman & V.L. Shalin (Eds.), Cognitive Task Analysis (pp. 24-36). Mahwah,  
NJ: Lawrence Erlbaum Associates
Chipman, S. F., Schraagen, J. M., & Shalin, V. L. (2000). Introduction to cognitive task
analysis. In J. M. Schraagen, S. F. Chipman, & V. L. Shalin (Eds.), Cognitive task
analysis (pp. 3-23). Mahwah, NJ: Erlbaum
Clark, R.E. (1999). Ying and yang: Cognitive motivational processes in multimedia
learning environments. In J. van Merriënboer (Ed.) Cognition and multimedia
Design. Herleen, Netherlands: Open University Press
Clark, R. E. (2008). Resistance to change: Unconscious knowledge and the challenge of  
unlearning. In D. C. Berliner, & H. Kupermintz (Eds.), Changing institutions,  
environments, and people. Mahwah, NJ: Lawrence Erlbaum Associates
Clark, R. (2014). Cognitive task analysis for expert-based instruction in healthcare. In
Handbook of research on educational communications and technology (pp. 541-
551). Springer, New York.
Clark, R., & Elen, J. (2006). When less is more: Research and theory insights about
instruction for complex learning. In J. Elen & R.E. Clark (Eds.), Handling  
complexity in learning environments: Theory and research (pp. 283-295). New  
York: Elsevier.
Clark, R. E. and Estes, F. (1996). Cognitive task analysis. International Journal of
Educational Research. 25(5). 403-417.
Clark, R. E., Feldon, D. F., van Merriënboer, J. J. G., Yates, K. A., & Early, S. (2008).  
Cognitive task analysis. In J. M. Spector, M. D. Merrill, J. van Merriënboer, & M.  
P. Driscoll (Eds.), Handbook of research on educational communications and  
technology (3rd ed., pp. 577-593). Mahwah, NJ: Lawrence Erlbaum Associates  
Clark, R. E., Pugh, C. M., Yates, K. A., Inaba, K., Green, D., & Sullivan, M. (2011). The  
use of cognitive task analysis to improve instructional descriptions of procedures.  
Journal for Surgical Research. http://www.ncbi.nlm.nih.gov/pubmed/22099596
Coleman, D., Pimentel, S., & Zimba, J. (2012). Three core shifts to deliver on the
promise of the common core standards. State Education Standard, 12(2), 9-12.
College Board. (2003). The neglected “r”: The need for a writing revolution. New York:
The National Commission on Writing in America’s Schools and Colleges.  
College Board. (2004). Writing: A ticket to work…or a ticket out. A survey of business
leaders. New York: The National Commission on Writing in America’s Schools
and Colleges.
College Board. (2005). Writing: A powerful message from state government. New York:
The National Commission on Writing in America’s Schools and Colleges.
College Board. (2006). Writing and school reform. New York: The National Commission
on Writing in America’s Schools and Colleges.
College Board. (2013). SAT report on college and career readiness. New York: NY.  
Collins, A., Brown, J. S., & Newman, S. E. (1988). Cognitive apprenticeship: Teaching  
the craft of reading, writing, and mathematics. Thinking, The Journal of  
Philosophy for Children, 8(1), 2-11.  
Common Core State Standards. (CCSS; 2010). Key shifts in English language arts.
Retrieved from http://www.corestandards.org/other-resources/key-shifts-in-
english-language-arts/
Cooke, N. J. (1992). Modeling human expertise in expert systems. In R. R. Hoffman  
(Ed.), The psychology of expertise: Cognitive research and empirical AI (pp. 29-
60). Mahwah, NJ: Lawrence Erlbaum Associates.  
Cooke, N. J. (1999). Knowledge elicitation. In F. T. Durso (Ed.), Handbook of  Applied
Cognition (pp. 479-509). New York: Wiley.
Corbett, A. T., Anderson, J. R. (1995). Knowledge Tracing: modeling the acquisition of  
procedural knowledge. User Modeling and User-Adapted Interaction, 4, 253-278.
Crandall, B., Klein, G., & Hoffman, R. R. (2006). Working minds: A practitioner’s guide  
to cognitive task analysis. Cambridge, MA: The MIT Press.
Crispen, P. D. (2010). Identifying the point of diminishing marginal utility for cognitive
task analysis surgical subject matter expert interviews (Doctoral dissertation)  
University of Southern California, Los Angeles, CA.
Darling-Hammond, L. (2000). Teacher quality and student achievement: A review of
state policy evidence. Education Policy Analysis Archives, 8(1).
Darling-Hammond, L. (2004). Standards, accountability, and school reform. Teachers  
College Record, 106(6), 1047-1085.  
Ericsson, K. A. (2004). Invited address: Deliberate practice and the acquisition and
maintenance of expert performance in medicine and related domains. Academic  
Medicine, 79(10), s70-s81.
Ericsson, K.A. & Charness, N. (1994). Expert performance its structure and acquisition.
American Psychologist, 49(8), 725-747.
Ericsson, K. A., & Lehmann, A. C. (1996). Expert and exceptional performance:
Evidence of maximal adaptation to task constraints. Annual Review of  
Psychology, 47(1), 273-305.
Ericsson, K. A., Krampe, R. T., & Tesch-Römer, C. (1993). The role of deliberate
practice in the acquisition of expert performance. Psychological Review, 100(3),  
363-406
Ethell, R. G., & McMeniman, M. M. (2000). Unlocking the knowledge in action of an
expert practitioner. Journal of Teacher Education, 51(2), 87-101.  
Fanetti, S., Bushrow, K. M., & DeWeese, D. L. (2010). Closing the gap between high  
school writing instruction and college writing expectations. English Journal,  
99(4), 77-83.  
Feldon, D. F. (2004). Inaccuracies in expert self-report: Errors in the description of  
strategies for designing psychology experiments (Doctoral dissertation)  
University of Southern California, Los Angeles, CA.
Feldon, D. F. (2007). Implications of research on expertise for curriculum and pedagogy.
Educational Psychology Review, 19(2), 91-110.
Feldon, D. F., & Clark, R. E. (2006). Instructional implications of cognitive task analysis
as a method for improving the accuracy of experts’ self-report. In G. Clarebout &
J. Elen (Eds.), Avoiding simplicity, confronting complexity: Advances in studying
and designing (computer-based) powerful learning environments (pp. 109-116).
Rotterdam, The Netherlands: Sense Publishers.
Flynn, C. L. (2012). The relative efficiency of two strategies for conducting cognitive task  
analysis (Doctoral dissertation). Retrieved from http://gradworks.umi.com/35/61/
3561771.html
Gibson, S. (2008). An effective framework for primary-grade guided writing instruction.  
The Reading Teacher, 62(4), 324-334. doi: 10.1598/RT.62.4.5  
Glaser, R., & Chi, M. T. H. (1988). Overview. In M. T. H. Chi, R. Glaser, & M. Farr
(Eds.), The Nature of Expertise (p. xv-xxviii). Mahwah, NJ: Lawrence Erlbaum  
Associates.
Gott, S. P., Hall, E. P., Pokorny, R. A., Dibble, E, & Glaser, R. (1993). A naturalistic
study of transfer: Adaptive expertise in technical domains. In D. K. Detterman &
R. J. Sternberg (Eds.), Transfer on trial: Intelligence, cognition, and instruction
(pp. 255-288). Norwood, NJ: Ablex.
Gowler, D., & Legge, K. (1983). The meaning of management and the management of  
meaning: A view from social anthropology. In Perspectives on management. M.  
D. Earl (Ed.), 197-233. Oxford: Oxford University.  
Graham, S., & Perin, D. (2007). Writing next: Effective strategies to improve writing of  
adolescents in middle and high schools. A report to Carnegie corporation of New  
York. Alliance for Excellent Education. Washington, DC. Retrieved from
http://search.proquest.com/docview/864940908?accountid=14749
Graham, S., Gillespie, A., & McKeown, D. (2013). Writing: Importance, development,
and instruction. Reading and Writing: An Interdisciplinary Journal, 26(1), 1-15.
Green, A. M., Kent, A. M., Lewis, J., Feldman, P., Motley, M. R., Baggett, P. V., Shaw  
Jr., E. L., Byrd, K., & Simpson, J. (2011). Experiences of elementary pre-service  
teachers in an urban summer enrichment program. Western Journal of Black  
Studies, 35(4), 227-239.  
Hanushek, E. A. (1992). The trade-off between child quantity and quality. Journal of
Political Economy, 100(1), 84-117.  
Hatano, G., & Inagaki, K. (2000). Domain-specific constraints of conceptual
development. International Journal of Behavioral Development, 24(3), 267-275.
Hillocks Jr., G. (2010). Teaching argument for critical thinking and writing: An
introduction. English Journal, 99(6), 24-32.  
Hinds, P. J., Patterson, M., & Pfeffer, J. (2001). Bothered by abstraction: The effect of
expertise on knowledge transfer and subsequent novice performance. Journal of
Applied Psychology, 86(6), 1232-1243.  
Hoffman, R. R. (1987). The problem of extracting the knowledge of experts from the
perspective of experimental psychology. AI Magazine, 8(2), 53-67.
Hoffman, R. R., Crandall, B., & Shadbolt, N. (1998). A case study in cognitive task
analysis methodology: The Critical Decision Method for the elicitation of expert
knowledge. Human Factors, 40(2), 254-276.

Hoffman, R. & Militello, L. (2009). Perspectives on cognitive task analysis: Historical
origins and modern communities of practice. New York: Psychology Press.
Hunt, E., & Joslyn, S. L. (2000). A functional task analysis of time-pressured decision
making. In J. M. Schraagen, S. Chipman, & V. J. Shalin (Eds.), Cognitive task
analysis (pp. 119-132). Mahwah, NJ: Erlbaum.  
Jackson, P. W. (1985). Private lessons in public schools: Remarks on the limits of
adaptive instruction. In M. C. Wang & H. J. Walberg (Eds.), Adapting instruction
to individual differences (pp. 66–81). Berkeley, CA: McCutchan.
Jury, M. (2015). Using individual cognitive task analysis to capture expert writing
instruction in expository writing for secondary students (Unpublished doctoral
dissertation). University of Southern California, Los Angeles, CA.
Kirschner, P. A, Sweller, J., & Clark, R. E. (2006). Why minimally guided instruction
does not work: An analysis of the failure of constructivist, discovery, problem  
based, experiential, and inquiry based teaching. Educational psychologist, 41(2),  
75-86.
Kiuhara, S. A., Graham, S., & Hawken, L. S. (2009). Teaching writing to high school
students: A national survey. Journal of Educational Psychology, 101(1), 136.
Krenn, M. (2011). From scientific management to homemaking: Lillian M. Gilbreth’s  
contributions to the development of management thought. Management &  
Organisational History, 6(2), 145-161.  
Lee, R. L. (2004). The impact of cognitive task analysis on performance: A meta-analysis  
of comparative studies (Doctoral dissertation). University of Southern California,
Los Angeles, CA.  
Loughran, J. (2006). Developing a pedagogy of teacher education: Understanding  
teaching and learning about teaching (pp.43-62). New York, NY: Routledge
McQuitty, V. (2012). Emerging possibilities: A complex account of learning to teach
writing. Research in the Teaching of English, 46(4), 358-389
Means, B., & Gott, S. (1988). Cognitive task analysis as a basis for tutor development:  
Articulating abstract knowledge representations. In J. Psotka, L. D. Massey, & S.  
A. Mutter (Eds.), Intelligent Tutoring Systems: Lessons Learned (pp. 35-58).
Merriam, S. B. (2009). Qualitative research: A guide to design and implementation. San  
Francisco: Jossey-Bass.  
National Assessment of Educational Progress. (NAEP; 2011). Writing framework for the
2011 national assessment of educational progress. Washington, DC: National
Assessment Governing Board and U.S. Department of Education.
National Center for Education Statistics. (NCES; 2012). The nation's report card:
Writing 2011 (NCES 2012-470). Washington, DC: Institute of Education
Sciences, U.S. Department of Education.
Nyland, C. (1996). Taylorism, John R. Commons, and the hoxie report. Journal of  
Economic Issues, 30(4), 985-1016.
Palinscar, A. S., & Brown, A. L. (1984). Reciprocal teaching of comprehension-fostering  
and comprehension-monitoring activities. Cognition and Instruction, 1(2), 117-
175.  
Paris, S. G., Lipson, M. Y., & Wixson, K. K. (1983). Becoming a strategic reader.  
Contemporary Educational Psychology, 8(3), 293-316.  

Piccolo, J. A. (1987). Expository text structure: Teaching and learning strategies. The  
Reading Teacher, 40(9), 838-847.  
Pigage, L. C., & Tucker, J. L. (1954). Motion and time study. Bulletin no. 24.  
Champaign, Il.: Institute of Labor and Industrial Relations. University of Illinois  
at Urbana-Champaign.  
Portes, P. R., & Smagorinsky, P. (2010). Static structures, changing demographics:  
Educating teachers for shifting populations in stable schools. English Education,  
42(3), 236-247.  
Sanchez, C. (2013). College board ‘concerned’ about low SAT scores. National Public  
Radio. Retrieved from http://www.npr.org/2013/09/26/226530184/college-board-
concerned-about-low-sat-scores
Saphier, J. (2011). Outcomes: Coaching, teaching standards, and feedback mark the  
teacher’s road to mastery. Journal of Staff Development, 32(4), 58-62.  
Schleppegrell, M. J. (2004). The language of schooling: A functional linguistics
perspective. New York: Routledge.  
Schneider, F. W., Gruman, J. A., & Coutts, L. M. (2012). Applied social psychology:
Understanding and addressing social and practical problems (2nd ed.). Thousand
Oaks, CA: Sage.
Schraagen, J. M., Chipman, S. F., & Shalin, V. L. (Eds.). (2000). Cognitive task analysis.
Psychology Press.
Simon, H. A., & Chase, W. G. (1973). Skill in chess: Experiments with chess-playing
tasks and computer simulation of skilled performance throw light on some human
perceptual and memory processes. American Scientist, 61(4), 394-403.
Smith, K. (2004). Teacher educators' expertise: What do novice teachers and teacher
educators say? Teaching and Teacher Education, 21(2), 177-192.
doi:10.1016/j.tate.2004.12.008
Sternberg, R. J., & Horvath, J. A. (1995). A prototype view of expert teaching.  
Educational Researcher, 24(6), 9-17.
Street, C. (2003). Pre-service teachers’ attitudes about writing and learning to teach
writing: Implications for teacher educators. Teacher Education Quarterly, 30(3),
33-50.
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive
Science, 12(2), 257-285.
Taylor, F. W. (1911). Principles of scientific management. New York: Harper and Row.
Thorndike, E. L. (1921). Measurement in education. Teachers College Record, 22(5),  
371-379.
Tofel-Grehl, C., & Feldon, D. F. (2013). Cognitive task analysis-based training: A
meta-analysis of studies. Journal of Cognitive Engineering and Decision
Making, 7(3), 293-304.
Tolano-Leveque, M. (2010). Using cognitive task analysis to determine the percentage of  
critical information that experts omit when describing a surgical procedure
(Doctoral dissertation). Available from ProQuest Dissertations and Theses
database (UMI No. 3418184)
Tulving, E., & Schacter, D. L. (1990). Priming and human memory systems. Science,
247, 301-306.

Troia, G. A., & Olinghouse, N. G. (2013). The common core state standards and  
evidence-based educational practices: The case of writing. School Psychology  
Review, 42(3), 343-357.
Ukpokodu, N. (2002). Breaking through preservice teachers' defensive dispositions in a  
multicultural education course: A reflective practice. Multicultural Education,
9(3), 25-33. Retrieved from http://search.proquest.com/docview/216309385?
accountid=14749
Velmahos, G. C., Toutouzas, K. G., Sillin, L. F., Chan, L., Clark, R. E., Theodorou, D., &
Maupin, F. (2004). Cognitive task analysis for teaching technical skills in an
inanimate surgical skills laboratory. American Journal of Surgery, 187(1), 114-
119. doi:10.1016/j.amjsurg.2002.12.005
Wheatley, T., & Wegner, D. M. (2001). Automaticity of action, psychology of. In N. J.
Smelser & P. B. Baltes (Eds.), International encyclopedia of the social and  
behavioral sciences (pp. 991-993). Oxford: Elsevier Science Ltd.
Wilder, H., & Mongillo, G. (2007). Improving expository writing skills of preservice  
teachers in an online environment. Contemporary Issues in Technology and  
Teacher Education, 7(1), 476-489.  
Yates, K. A. (2007). Towards a taxonomy of cognitive task analysis methods: A search for
cognition and task analysis interactions (Doctoral dissertation). University of
Southern California, Los Angeles, CA.
Yates, K. A., & Clark, R. E. (2011). Cognitive task analysis process description.
Yates, K. A., & Feldon, D. F. (2011). Advancing the practice of cognitive task analysis: a
call for taxonomic research. Theoretical Issues in Ergonomics Science, 12(6),  
472-495.
Zeichner, K. (2005). Becoming a teacher educator: A personal perspective. Teaching and
Teacher Education, 21(2), 117-124. doi:10.1016/j.tate.2004.12.001
Zepeda-McZeal, D. (2014). Using cognitive task analysis to capture expert reading
instruction in informal text for students with mild to moderate learning
disabilities (Doctoral dissertation). University of Southern California, Los
Angeles, CA.  
































Appendix A
Cognitive Task Analysis Interview Protocol
Begin the Interview: Meet the Subject Matter Expert (SME) and explain the purpose of the interview. Ask the SME for permission to record the interview. Explain to the SME that the recording will be used only to ensure that you do not miss any of the information the SME provides.

Name of task(s): Teaching expository writing

Performance Objective:
Ask: "What is the objective of teaching expository writing? What action verb should be used?"

Step 1:
Objective: Capture a complete list of outcomes for expository writing instruction.
A. Ask the Subject Matter Expert (SME) to list outcomes when these tasks are complete. Ask them to make the list as complete as possible.
B. Ask the SME how the outcomes are assessed.

Step 2:
Objective: Provide practice exercises that are authentic to the task of teaching expository writing.
A. Ask the SME to list all the tasks that are performed during expository writing instruction.
B. Ask the SME how the tasks would change when teaching expository writing among various student skill levels.

Step 3:
Objective: Identify main steps or stages to accomplish the task.
A. Ask the SME the key steps or stages required to accomplish the task.
B. Ask the SME to arrange the list of main steps in the order they are performed, or if there is no order, from easiest to most difficult.

Step 4:
Objective: Capture a list of "step by step" actions and decisions for each task.
A. Ask the SME to list the sequence of actions and decisions necessary to complete the task and/or solve the problem.

Ask: "Please describe how you accomplish this task step-by-step, so a first year teacher could perform it."

For each step the SME gives you, ask yourself, "Is there a decision being made by the SME here?" If there is a possible decision, ask the SME.

If the SME indicates that a decision must be made…

Ask: "Please describe the most common alternatives (up to a maximum of three) that must be considered to make the decision and the criteria first year teachers should use to decide between the alternatives."

Step 5:
Objective: Identify prior knowledge and information required to perform the task.
A. Ask the SME about the prerequisite knowledge and other information required to perform the task.

1. Ask the SME about Cues and Conditions.
Ask: "For this task, what must happen before someone starts the task? What prior task, permission, order, or other initiating event must happen? Who decides?"

2. Ask the SME about New Concepts and Processes.
Ask: "Are there any concepts or terms required of this task that may be new to the first year teacher?"

Concepts – terms mentioned by the SME that may be new to the first year teacher. Ask for a definition and at least one example.

Processes – how something works. If the first year teacher is operating equipment, or working on a team that may or may not be using equipment, ask the SME to "Please describe how the team and/or the equipment work – in words that a first year teacher will understand. Processes usually consist of different phases, and within each phase there are different activities – think of it as a flow chart."

Ask: "Must first year teachers know this process to do the task?" "Will they have to use it to change the task in unexpected ways?"

IF the answer is NO, do NOT collect information about the process.

3. Ask the SME about Equipment and Materials.
Ask: "What equipment and materials are required to succeed at this task in routine situations? Where are they located? How are they accessed?"

4. Performance Standard.
Ask: "How do we know the objective has been met? What are the criteria, such as time, efficiency, quality indicators (if any)?"

5. Sensory experiences required for the task.
Ask: "Must first year teachers see, hear, smell, feel, or taste something in order to learn any part of the task? For example, are there any parts of this task they could not perform unless they could smell something?"

Step 6:
Objective: Identify problems that can be solved by using the procedure.
A. Ask the SME to describe at least one routine problem that the first year teacher should be able to solve if they can perform each of the tasks on the list you just made.

Ask: "Of the task we just discussed, describe at least one routine problem that the first year teacher should be able to solve IF they learn to perform the task."

Appendix B
Inter-rater Reliability Code Sheet


Appendix C
Job Aid for Developing a Gold Standard
Richard Clark and Kenneth Yates
(2010, Proprietary)

The goals of this task are to 1) aggregate CTA protocols from multiple experts to create a "gold standard" protocol and 2) create a "best sequence" for each of the tasks and steps you have collected, and the best description of each step, for the design of training.

Trigger: After having completed interviews with all experts and captured all goals, settings, triggers, and all action and decision steps from each expert – and after all experts have edited their own protocols.

Create a gold standard protocol

STEPS: Actions and Decisions
1. For each CTA protocol you are aggregating, ensure that the transcript line number is present for each action and decision step.
   a. If the number is not present, add it before going to Step 2.
2. Compare all the SMEs' corrected CTA protocols side-by-side and select one protocol (marked as P1) that meets all the following criteria:
   a. The protocol represents the most complete list of action and decision steps.
   b. The action and decision steps are written clearly and succinctly.
   c. The action and decision steps use the most accurate language and terminology.
3. Rank and mark the remaining CTA protocols as P2, P3, and so forth, according to the same criteria.
4. Starting with the first step, compare the action and decision steps of P2 with P1 and revise P1 as follows:
   a. IF the step in P2 has the same meaning as the step in P1, THEN add "(P2)" at the end of the step.
   b. IF the step in P2 is a more accurate or complete statement of the step in P1, THEN revise the step in P1 and add "(P1, P2)" at the end of the step.
   c. IF the step in P2 is missing from P1, THEN revise the list of steps by adding the step to P1 and add "(P2N)"* at the end of the step.
5. Repeat Step 4 by comparing P3 with P1, and so forth for each protocol.
6. Repeat Steps 4 and 5 for the remaining components of the CTA report, such as triggers, main procedures, equipment, standards, and concepts, to create a "Draft Gold Standard" (DGS) protocol.
7. Verify the DGS protocol by either:
   a. Asking a senior SME, who has not been interviewed for a CTA, to review the DGS and note any additions, deletions, revisions, and comments, OR
   b. Asking each participating SME to review the DGS and, either by hand or using MS Word Track Changes, note any additions, deletions, revisions, or comments.
      i. IF there is disagreement among the SMEs, THEN either:
         1. Attempt to resolve the differences by communicating with the SMEs, OR
         2. Ask a senior SME, who has not been interviewed for a CTA, to review and resolve the differences.
8. Incorporate the final revisions from the previous step to create the "Gold Standard" protocol.

* N = New

Create a best sequence of tasks and steps for training

STEPS: Actions and Decisions
IF two of the experts agree on the tasks that must be performed in a specific sequence, THEN organize your tasks in that sequence, place all leftover tasks on a separate list, and go to Step 2.
1. Work with novice trainees if possible in order to classify all of the non-sequenced tasks on the separate list into a list where the first tasks on the list are those that appear to be simpler for trainees to learn and those later in the list are more complex and difficult to learn.
2. IF any of the tasks on the simple-to-difficult list seem to be applied often in the larger procedure that you are capturing, THEN place a "P" (for Prerequisite) next to those tasks and consider moving them up in priority and training them earlier rather than later, even if they are more complex.
3. Integrate the two task lists into one master outline of the training by placing the formerly non-sequenced tasks into the list created in Step 1 using the decision rules in Steps 5, 6, and 7.
4. IF the task seems to be prerequisite and/or performed earlier by the expert, THEN place it earlier in the training.
5. IF the task is not prerequisite but at least two of the experts told us it was performed earlier than other tasks, THEN place it in the same order that the two experts suggested.
6. IF the experts disagreed about the exact order of tasks, THEN select what appears to be a logical order and ask the experts later if you've made a mistake.
7. IF you have completed the ordering of the tasks, THEN create a flow chart of the tasks that will serve as a job aid for trainees.
8. IF your flow chart indicates that the tasks may be difficult to perform in the given order and so are in the wrong order, THEN revise the order of the tasks to reflect the best performance sequence in the flow chart.
9. IF you are finished ordering the tasks, THEN go back and start with the first task and order the steps in the task using the same rules that you used to order the tasks in Steps 1, 2, 5, 6, and 7.
10. IF you are finished ordering the steps in each task, THEN select the description of each step that appears to be stated in a way that novices will be able to apply more easily.
11. IF in doubt about which description of steps you will use for the gold standard, THEN ask trainees to identify the descriptions that are clearer and most helpful to them.
12. IF none of the experts or trainees has provided step descriptions that are clear and helpful to trainees, THEN rewrite the steps yourself, check your revisions with the trainees, and ask one of the experts whether your description is accurate.
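The aggregation rules in the job aid above (Steps 2 through 5) amount to a small merge algorithm over ranked protocols. The sketch below illustrates rules 4a and 4c in Python; the data model (a protocol as an ordered list of step strings), the function names, and the trivial same_meaning matcher are assumptions for illustration only. The actual job aid relies on analyst judgment, including rule 4b (revising a step when another protocol states it more accurately), which is omitted here because it cannot be reduced to string matching.

```python
# Illustrative sketch of the gold-standard aggregation rules (Steps 4a and 4c).
# The data model and matching function are assumptions, not part of the job aid.

def same_meaning(step_a: str, step_b: str) -> bool:
    """Stand-in for the analyst's judgment that two steps mean the same thing.
    Approximated here by case-insensitive equality."""
    return step_a.strip().lower() == step_b.strip().lower()

def aggregate(protocols: dict[str, list[str]]) -> list[tuple[str, list[str]]]:
    """Merge ranked protocols P1, P2, ... into a draft gold standard.

    Each resulting step carries the labels of the protocols that contributed
    it, mirroring the "(P1, P2)" and "(P2N)" annotations in the job aid."""
    labels = sorted(protocols)  # e.g. ["P1", "P2", "P3"]
    base = labels[0]
    gold = [(step, [base]) for step in protocols[base]]
    for label in labels[1:]:
        for step in protocols[label]:
            for g_step, sources in gold:
                if same_meaning(step, g_step):
                    sources.append(label)           # rule 4a: same meaning
                    break
            else:
                gold.append((step, [label + "N"]))  # rule 4c: new step
    return gold
```

For example, merging P1 = ["Select a text", "Write prompts"] with P2 = ["Select a text", "Annotate the text"] would tag the shared step with both sources and append the annotation step as "(P2N)".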
 




Appendix D
SME A Initial Individual Protocol Flowchart












































[The original flowchart is transcribed here as a sequential outline; decision diamonds appear as IF/THEN branches, and page-break connectors are omitted.]

Procedure 1: Prepare to teach
1. Send out essay checklist to your students.
2. Prepare mock exercises for your students:
   a. Write 3-5 fairly simple questions allowing students the opportunity to defend their positions with specific reasons and/or evidence.
   b. Select two exemplar body paragraphs for students to color code in class.
   c. Cut up a sample body paragraph into strips of paper so that each strip of paper has a sentence of the paragraph on it.
3. Select a piece of literature for the students to read and eventually write an expository essay on.
4. Read the text closely and annotate it for the following literary elements: difficult vocabulary, tone of the text, important or relevant word choice, structure of the text, and important and relevant themes and motifs.
5. Formulate writing prompts:
   a. Ask yourself: How do I connect with the text?
   b. Ask yourself: How would my students connect to the text?
   c. Based on your responses to the previous two questions, write 3-5 prompts of varying levels of difficulty pertaining to each: theme, critical analysis, and compare and contrast.
   d. Adjust/differentiate prompts so they address various levels of reading, ability, comprehension, and understanding.
6. Decision: Is the text complex and rich with concepts, themes, and motifs? IF YES, write upwards of five prompts. IF NO, write closer to three prompts.
End Procedure 1.

Procedure 2: Conduct close reading of a literary text to prepare students for writing
1. Do first mock exercise from Procedure 1: Pose a fairly simple question to students.
   a. Write one of the questions on the board.
   b. Tell students to think and write about their answers to the question along with possible reasons and/or evidence.
   c. Discuss informally as a whole class.
2. Do second mock exercise from Procedure 1: Color-code sample body paragraphs.
   a. Distribute one of the two paragraphs along with highlighters.
   b. Practice color-coding as an entire class.
   c. Distribute the second paragraph and have students color code it independently.
3. Do third mock exercise from Procedure 1: Cut up a body paragraph into strips of paper.
   a. Distribute the strips of paper to the students.
   b. Tell students to get out of their seats and walk around the room in order to identify the other sentences that make up their paragraph.
   c. Tell students to refer back to the previous exercise to determine whether they are a topic sentence, concrete detail, or commentary.
   d. Once students believe they have successfully constructed their paragraph, they will read their strips of paper in the order that makes logical sense.
   e. Decision: Did the students successfully reconstruct their paragraphs? IF NO, discuss as a class how to make improvements so that the sentences read in logical order. IF YES, continue.
4. Distribute all of the writing prompts to the students before you begin reading the text.
5. Read and discuss each of the prompts with the students, clarifying for meaning and difficult vocabulary.
6. Distribute copies of the chosen text to students.
7. Alternate between your reading of the text aloud in class and the students reading the text at home.
   a. Decision: Are students proficient at reading academic text? IF YES, assign more reading at home. IF NO, assign less reading at home.
8. Alternate between students reading on their own and you reading aloud to them.
9. Stop at annotations from your preparation and identify the literary elements within the text.
10. Alert students that there will be an assessment of the reading assignment when they return to class.
    a. Tell students to take notes on the reading and that they will be able to use those notes during the assessment.
    b. Have students identify parts of the text (i.e., with color-coded sticky notes) that correspond with the various writing prompts.
    c. The assessment can be one open-ended question that ties to one of the writing prompts.
11. Continue to check for understanding as you finish close reading of the text and assess students' readiness to write (i.e., quickwrites, student reflections, comprehension quizzes, etc.).
12. Revisit writing prompts explicitly to determine which prompts would be most appropriate for students' abilities.
13. Decision: Has the student demonstrated a strong understanding of the text? IF YES, for Procedure 3 recommend a more challenging prompt, or the student can write on any prompt after discussing it with you. IF NO, for Procedure 3 recommend a less challenging prompt, or the student can write on any prompt after discussing it with you.
End Procedure 2.

Procedure 3: Teach how to formulate an argument
1. Model writing for students using a worked example of a student expository essay on a text different than the one they are writing on.
2. Read through the entire essay with the students and pause when necessary to define and show examples of introduction, body paragraphs, and conclusion.
3. Tell students to write an answer to the writing prompt they have chosen.
4. Tell students to share out their answers to the class and discuss whether or not the answers are arguable.
5. Decision: Are students' answers arguable? IF NO, help students revise their answers to make them arguable statements. IF YES, continue.
6. Teach students to write the remaining part of the introduction that comes before the thesis statement.
   a. Ask students: What should readers know as they begin reading your essay?
   b. Use the worked example to demonstrate how the author tells the reader who the author is, as well as the title and genre of the work.
   c. Tell students to write a brief (i.e., 3-5 sentences) summary or synopsis of what the book is mainly about.
   d. Tell students to write a transition to the idea in the book that is going to tie to their thesis statements.
7. Tell students to refer to the essay checklist to evaluate whether or not they have completed writing their introductions.
8. Decision: Do students' introductions meet all of the requirements of the essay checklist? IF NO, help students to revise their introductions to meet all of the requirements listed on the essay checklist. IF YES, end Procedure 3.
End Procedure 3.

Procedure 4: Teach how to support an argument
1. Tell students to review their thesis statements and have them answer the question: Why would you say that?
2. Tell students that their answers to the previous question will become the topic sentence of their first body paragraph.
3. Decision: Is the students' topic sentence arguable? IF NO, have students revise. IF YES, continue.
4. Tell students to locate evidence from the text that corroborates the argument in their topic sentence.
   a. Instruct students to refer back to their notes from the close reading stage and identify the specific passages/quotes that support their topic sentence.
5. Tell students to write commentary.
   a. Tell students to review their topic sentence and think about the argument they are making.
   b. Tell students to review the textual support they have chosen and identify the specific word(s), tone, and/or image that specifically support what they are arguing.
6. Decision: Does the student summarize as opposed to analyze the plot? IF YES, use a worked example to demonstrate the difference between summary and analysis/explanation to help the student revise his/her commentary. IF NO, continue.
7. Conference with students over their first body paragraph; students make revisions per teacher recommendations.
8. Decision: Has the student successfully written the first body paragraph? IF NO, provide further instruction on how to make corrections and improvements. IF YES, tell the student to write the remaining two body paragraphs.
9. Tell students to write a conclusion.
   a. Revisit the conclusion from the worked example in Procedure 3.
   b. Tell students that you will read the conclusion, but will not grade it against them.
End Procedure 4.

Procedure 5: Teach how to edit writing
1. Students submit the final draft of their essays with all of the work leading up to it.
2. Only grade the final draft of the essay and look at the previous student work only if necessary.
3. As you grade the final essay, make comments on thesis statement, topic sentences, and commentary.
4. Decision: Are there noticeable grammatical mistakes? IF YES, provide instruction on how to make corrections and improvements to revise errors. IF NO, end Procedure 5.
End Procedure 5.

Procedure 6: Teach how to write reflections on their writing
1. Return graded work back to students.
2. Tell students to look over the comments and/or adjustments you made and write a one-page reflection.
3. Tell students to attach their reflections to the bottom of all their work and resubmit one final time.
4. Grading student reflections is optional.
End Procedure 6.
Appendix E
Gold Standard Protocol

1i+3r Incremental Method Protocol  

Task: To teach expository writing at the eleventh-grade level  

Objective: Students will be able to construct and defend an argument represented by a
written analysis  

Main Procedures:  
1. Prepare to teach  
2. Introduce/Review expository writing to students  
3. Teach how to formulate an argument  
4. Teach how to support an argument  
5. Conduct close reading of the text to prepare students for writing  
6. Conduct research to cite outside sources  
7. Begin writing the expository essay  
8. Teach how to edit writing  
9. Teach reflection and evaluate students’ writing  

Procedure 1. Prepare to teach.  
1.1. Develop essay criteria that includes the following.  
1.1.1. Standard: A clearly defined, argumentative thesis statement.  
1.1.2. Standard: Written in the literary present.  
1.1.3. Standard: Written in 3rd person.  
1.1.4. Standard: Written in active voice.  
1.1.5. Standard: Topic sentences as mini-arguments.  
1.1.6. Standard: Lead-ins with the quotations.  
1.1.7. Standard: Commentary that connects the chosen text to the topic sentence.  
1.1.8. Standard: Body paragraphs that include a closing statement.  
1.1.9. Standard: Free of major grammatical and mechanical errors.  
1.1.10. Standard: MLA format.  
1.2. Prepare mock exercises for students:  
1.2.1. Generate 3-5 simple and relatable questions that allow students the
opportunity to defend their positions with specific reasons and/or evidence  
1.2.1.1. Reason: This exercise is designed to give students practice making
an argument.  
1.2.1.2. Reason: It helps students to understand that there is no one right
answer, and that they are doing a good job as far as meeting the writing
goal as long as they can support their answers.  
1.2.2. Select two exemplar body paragraphs and/or an article for students to
color code in class.  
1.2.2.1. Reason: This exercise is designed to help students visualize their
writing and see how much of their writing goes to forming their argument.  
1.2.2.2. Choose paragraphs or articles that are accessible to your students
based on the following criteria (i.e. use a text that is congruent with
students’ reading comprehension levels and is on a topic of interest to
students).  
1.2.3. Choose a different paragraph from 1.2.2 and cut it up into individual strips
of paper so that each strip of paper has a sentence of the paragraph on it.  
1.2.3.1. Condition: The class should be lower level (i.e. English 1 or 2,
Limited English Proficient (LEP), or English Language Development
(ELD)).  
1.2.3.2. Reason: This exercise is designed to help students understand the
purpose of each sentence and make the connections that make up an
argument.  
1.2.3.3. Take into account the number of students in the class and the
number of sentences in the paragraph so that there are enough strips for
each student.  
1.3. Select a novel or an article for the students to read and eventually use to write an
expository essay.  
1.3.1. IF it is the beginning of the school year (i.e. first quarter), THEN teach
expository writing using an article.  
1.3.1.1. Reason: English at the eleventh-grade is typically taught as a
chronological survey of American literature, therefore the first opportunity
to teach a novel may not be until the second quarter during Romanticism.  
1.3.1.2. Reason: Starting with an article that is rich in opposition can be
more accessible to the average student than a novel because it can be
more debatable and easier for students to identify the thesis statement and
the argument.  
1.3.1.3. Reason: Starting with an article can be more effective because it
takes less time to read, which allows you to focus more on the writing
aspect.  
1.3.2. IF it is the second quarter of the school year or later, THEN teach
expository writing using a novel.  
1.3.2.1. Reason: Teaching with a novel and going through the richness of
the language together helps students to understand the reading and
motivates them to enjoy what they are reading even further.  
1.3.2.2. Reason: The knowledge of close reading skills students learn with
a literary text does transfer when they are asked to read articles.  
1.3.2.3. Condition: You will have to teach more rhetorical devices when
doing a close reading of an article, whereas you would teach more literary
devices with a literary text.  
1.4. Read the chosen text closely and annotate it for the following literary elements:  
1.4.1. Standard: Difficult vocabulary  
1.4.2. Standard: Tone of the text  
1.4.3. Standard: Important or relevant word choice (i.e. diction)  
1.4.4. Standard: Structure of the text  
1.4.5. Standard: Important and relevant themes/motifs of the text  
1.5. Formulate writing prompts  
1.5.1. Ask yourself: How do I connect with the text?  
1.5.2. Ask yourself: How would my students connect to the text?  
1.5.3. Ask yourself: How do I get my students to figure out what they can argue?  
1.5.4. Use your responses to 1.5.1 to 1.5.3 to develop 2-5 writing prompts of
varying levels of difficulty pertaining to a theme or a universal truth of mankind
(i.e. love, community, death, etc.), the structure of how the text was written,
critical analysis, and compare and contrast.  
1.5.4.1. Standard: Prompts should provide a brief summary or statement
about the work.  
1.5.4.2. Standard: Prompts should be very general and open-ended, and not
directive and tell students what to write.  
1.5.4.3. Standard: Prompts should describe a topic/theme/concept/dilemma
and pose a question.  
1.5.4.4. Standard:  Questions should invite conversation about the text,
help students brainstorm topics and ideas, and help students arrive at their
own conclusion about the text in the form of a thesis or claim.  
1.5.5. Adjust/Differentiate prompts so they address various levels of reading
ability, comprehension, and understanding by simplifying vocabulary and
avoiding lengthy complex sentences  
1.5.6. IF the text is more complex and rich with concepts, themes, and motifs,
THEN write upwards of five prompts.  
1.5.7. IF the text is less complex and less endowed with concepts, themes, and
motifs, THEN write closer to 3 prompts.  
1.5.8. Identify potential arguments that students can make based on the writing
prompts, as well as the complexity of those arguments.  
1.5.8.1. IF the potential argument can be made with clearly defined, black
and white explanations, THEN the argument is less complex.  
1.5.8.2. IF the potential argument cannot be made with explanations that
are clearly defined and there are no clear right or wrong answers, THEN
the argument is more complex.  

Procedure 2. Introduce/Review expository writing to students.  
2.1. Tap into students’ prior knowledge about essay writing by doing the following:  
2.1.1. Ask students what the purpose of an expository essay is.  
2.1.2. Review academic language such as claim, topic sentence, concrete detail,
and commentary, and ask students to identify synonyms to those terms.  
2.1.3. Review and define the four rhetorical writing modes:  
2.1.3.1. Teach narrative as a story that can be told from different points of
view, such as the overarching narrator (i.e. 3rd person omniscient) versus
the more focused narrator (i.e. 1st person limited).  
2.1.3.2. Teach descriptive as a detailed description of a specific event or
topic that does not have to be written in the first-person.  
2.1.3.3. Teach expository writing as a way to inform and/or construct and
defend an argument represented by written analysis.  
2.1.3.4. Teach that everything is an argument, and that argument has an
element of persuasion.  
2.2. Provide a baking analogy to help students understand that writing requires both
structure (i.e. essential ingredients) and creativity (i.e. non-essential ingredients you
use to experiment), and/or any other analogy that connects with students’ interests
and promotes structure, creativity, and finding their voice.  

Procedure 3. Teach how to formulate an argument.  
3.1. Do mock exercise 1.2.1 with students.  
3.1.1. Write the questions on the board, or show a movie clip that is followed by
a controversial question to help students understand that an argument is a claim
(e.g. watch the opening scene of “Vertical Limit” and ask students whether the
son should cut the climbing rope to save himself and his sister at the expense of
his father’s life).  
3.1.2. Tell students to think and write about their answers to the question along
with possible reasons and/or evidence.  
3.1.3. Ask students to share out their answers in order to discuss informally as a
whole class.  
3.1.4. IF students’ responses vary, THEN keep the dialogue going by having
students discuss and debate their answers in class.  
3.1.5. IF students share similar responses, THEN have them group together and
defend their positions with reasons and evidence.  
3.1.6. Use student responses during this exercise to connect to the academic
language in 2.1.2.  
3.2. Do mock exercise 1.2.2 with students.  
3.2.1. Distribute the paragraphs from 1.2.2 along with highlighters.  
3.2.1.1. Give students time to practice color-coding one of the two
paragraphs as an entire class (e.g. blue indicates topic sentence, red
indicates concrete detail, and green indicates commentary)
3.2.1.2. Tell students to work individually to color code the second
paragraph and share out to the rest of the class.  
3.2.2. Distribute the article from 1.2.2 or allow students to bring an article of
their choosing that is no more than 2 pages in length.  
3.2.3. Tell students to identify the main argument.  
3.2.4. Tell students to read and color code the article.  
3.2.5. IF students do not know how to classify a sentence, THEN tell them to
mark it using a distinct color and discuss it with the class to resolve it.  
3.2.6. Ask students to share their favorite sentence(s) from the article.  
3.2.7. Tell students to hold up their papers to prompt discussion about which
articles are well written and which are poorly written.  
3.2.7.1. Reason: The juxtaposition of both strong and weak articles helps
students to understand how to improve their own writing.  
3.2.8. IF students have a poorly written article, THEN discuss why it is poorly
written and how it could be improved.  
3.2.9. IF students bring in articles that are from a blog or another type of
informal source, THEN teach them the differences between colloquial and
formal language.  
3.3. Do mock exercise 1.2.3 with students.  
3.3.1. Distribute the strips of paper to the students accordingly.  
3.3.2. Tell students to get up out of their seats and walk around the room and
connect with the other sentences in their paragraph.  
3.3.3. Tell students to refer back to the previous exercise to determine whether
their sentence is the topic sentence, a concrete detail, and/or commentary.  
3.3.4. Ask students to volunteer their findings once they believe they have
successfully reconstructed their paragraph.  
3.3.4.1. IF students reconstruct their paragraph successfully, THEN affirm
their work.  
3.3.4.2. IF students do not reconstruct their paragraph successfully, THEN
work with the class to make the appropriate changes.  
3.4. Model writing for students using a worked example of a student expository essay on
a text that they are familiar with and that is different from the one they are writing on.  
3.5. Read through the entire essay with the students and pause when necessary to identify
and teach the following elements:  
3.5.1. Teach the introduction as the opening paragraph of the essay that includes
the following:  
3.5.1.1. Title, Author, Genre  
3.5.1.2. Formal language  
3.5.1.3. Essential summary  
3.5.1.4. Sentence structure (e.g. appositives)  
3.5.1.5. Present tense  
3.5.2. Teach the thesis statement as the main argument of the essay that meets
the following criteria:  
3.5.2.1. The last sentence of the introduction.  
3.5.2.2. Answers the question from the writing prompt.  
3.5.3. Teach body paragraphs as the paragraphs where students can discuss
reasons and evidence to support the thesis statement and include the following:  
3.5.3.1. Teach topic sentence as a mini-argument that supports the thesis or
the main argument.  
3.5.3.2. Teach concrete detail as a textual example that supports the
argument in the topic sentence.  
3.5.3.3. Teach that a concrete detail can also be a paraphrase of something
that happened in the text.  
3.5.3.3.1. Tell students the concrete detail should be as succinct as possible
and not take up more than 40% of the paragraph.  
3.5.3.4. Teach commentary as the students’ rationale that explains how
they are connecting the text to their topic sentence.  
3.5.3.4.1. Tell students that commentary should be multiple sentences and
roughly 60% of the body paragraph.  
3.5.3.4.2. Tell students that effective commentary analyzes the literary
elements from 1.4.1 to 1.4.5.  
3.5.4. Teach the conclusion as a restating and a closing of their entire argument.  
3.5.4.1. Show worked examples to demonstrate how the conclusion is a
restating and closing of an argument, that it provides closure, settles the
tone of the argument, takes the reader beyond the topic, and/or explains
why the argument matters.  

Procedure 4. Teach how to support an argument.  
4.1. Teach the rhetorical appeals of ethos, pathos, and logos, and that arguments are
made based on these appeals.  
4.2. Ask students: What do you need to have to make a convincing argument?  
4.2.1. Reason: Asking this question gets students to recognize that they need
support to make an argument.  
4.3. Tell students that if they want people to believe their arguments, they cannot solely
rely on one type of appeal.  
4.4. Teach students to develop their arguments by using a combination of the following
types of support that are relevant for this particular writing assignment:  
4.4.1. Standard: Emotional appeals  
4.4.2. Standard: Statistics  
4.4.3. Standard: Historical and/or Societal relevance and impact.  
4.4.4. Standard: Refuting the author’s claims  
4.4.5. Standard: Direct quotes from the article or novel
4.5. Revisit the article from 3.2.2 and tell students to identify the types of support the
author used to support his/her argument.  
4.6. Pose to students either a random controversial question or one of the questions you
generated in 1.2.1 in order to give them practice formulating a claim as an
argument.  
4.7. Tell students to write down their answer to the question in the form of an argument
statement.  
4.8. Tell students to bullet point at least 4 reasons or evidence to support their claim.  
4.9. Tell students to go through their bulleted reasons and evidence and identify
whether each is based on an emotional appeal, a statistic, or its relevance to or
impact on history and/or society.  
4.10. IF students rely too much on one type of reasoning or evidence, THEN
recommend they incorporate other types.  
4.11. Pose another random controversial question (e.g. should the school enforce a
dress code?), divide the class in half to indicate those who are for and those who are
against and assign small groups of students to present the types of reasons and
evidence from 4.4.1 to 4.4.5.  

Procedure 5. Conduct close reading of the text to prepare the students for writing.  
5.1. IF you have students read a novel, THEN do the following steps:  
5.1.1. Distribute all of the writing prompts to the students before you begin
reading the text.  
5.1.2. Read and discuss each of the prompts with the students clarifying for
meaning and difficult vocabulary.  
5.1.3. Distribute copies of the chosen text to students.  
5.1.4. Alternate between your reading of the text aloud in-class and students
reading the text at-home.  
5.1.5. IF students are less proficient at reading, THEN read more of the text
together in class.  
5.1.5.1. Alternate between students reading on their own or you reading
aloud to the students.  
5.1.5.1.1. Condition: Chapters or passages that are rich in dialogue lend
themselves to reading aloud as a class.  
5.1.5.2. Stop at annotations from the preparation in 1.4 and identify the
literary elements within the text (e.g. author’s tone, diction, use of
symbolism, etc.).  
5.1.5.2.1. IF students demonstrate understanding during discussion of the
literary element, THEN move forward with the close reading.  
5.1.5.2.2. IF students do not demonstrate understanding during discussion of
the literary element, THEN stop and address students’ confusion.  
5.1.5.3. Ask students to identify parts of the text (e.g. color-coded sticky
notes) that correspond with the various writing prompts.  
5.1.5.4. IF the quote or passage in the text obviously connects to one of the
writing prompts, THEN ask students to make the connection.  
5.1.5.5. IF it is not obvious which of the writing prompts the quote or
passage in the text connects to, THEN ask 3 more clarifying questions to
give students more opportunities to think and reach an answer.  
5.1.5.5.1. IF students show they are reaching their frustration level, THEN
make the connection for them.  
5.1.6. IF students are more proficient at reading, THEN assign more reading at
home.  
5.1.6.1. Tell students that there will be an assessment of the reading
assignment when they return to class.  
5.1.6.2. Tell the students to take notes on their reading and that they will be
able to use those notes during the assessment.  
5.1.6.3. IF students are assigned reading at home, THEN give assessment
(i.e. quiz) when they return to class.  
5.1.6.4. Standard: Assessment can be one open-ended question that ties to
one of the writing prompts.  
5.1.7. Continue to check for student understanding as you finish close reading of
the text, and assess students’ readiness to write using the following methods:  
5.1.7.1. Administer quickwrites  
5.1.7.2. Give reading comprehension quizzes  
5.1.7.3. Tell students to answer free-response questions on the text  
5.1.7.4. Tell students to write reflections on the text  
5.1.7.5. Observe students during class  
5.1.7.6. Hold class discussions  
5.1.8. Invite students to converse about the text and discuss multiple
interpretations.  
5.1.8.1. Objective: Give students an opportunity to produce multiple
interpretations of the text.  
5.1.8.2. Distribute a list of about 80 topics to the students.  
5.1.8.3. Review the list of topics and ask students to identify which topics
are relevant to the text.  
5.1.8.4. Mark a star next to those topics that students believe are relevant to
the text and do not ask why.  
5.1.8.4.1. Reason: Answering “why” is part of the argument, and you don’t
want students generating thesis statements for one another.  
5.1.8.5. IF students do not recognize a topic that you believe is appropriate,
THEN ask them, “Could you argue this?” and wait to see if they change
their minds.  
5.1.9. Revisit writing prompts explicitly to determine which prompts would be
most appropriate with students’ abilities.  
5.1.9.1. IF students demonstrate a strong understanding of the text, THEN
recommend one of the more challenging prompts.
5.1.9.2. IF students demonstrate minimal understanding of the text, THEN
recommend one of the less challenging prompts.  
5.1.9.3. IF a student would like to write on a prompt different from the one
you’ve recommended and has discussed it with you, THEN let the student
write to the desired prompt.  
5.1.10. Tell students to write an answer to the writing prompt they have chosen.  
5.1.11. Ask students to share out their answers to the class when they are ready
and discuss whether or not the answers are arguable.  
5.2. IF you have students read an article, THEN do the following steps:  
5.2.1. Provide students with the article you have chosen from 1.3.
5.2.2. Tell students to read the article at home and annotate referring to the
annotation guide they received at the beginning of the year.  
5.2.3. IF there is time in class, THEN read and annotate the first paragraph
together.  
5.2.4. Ask students to identify the author’s main claim.  
5.2.5. IF students identify the author’s main claim in the article, THEN ask them
to share out with the rest of the class how they know that.  
5.2.6. Ask students to generate their thesis or claim based on their responses
from 5.2.4 and their own experiences, without sharing or discussing it with other
students.  
5.2.7. Meet with each student and read his/her claim or thesis statement.  
5.2.8. IF student’s claim or thesis is not clearly written as an arguable position
on the author’s claim, THEN make time to conference with the student to help
revise.  
5.2.9. IF student’s claim or thesis statement is clearly written as an arguable
position on the topic, THEN direct students to procedure 6: Conduct research.  

Procedure 6. Conduct research to cite outside sources.  
6.1. IF it is the beginning of the school year and you are following the chronology of
American literature (i.e. Puritans, Revolutionary writers), THEN provide the
following scaffold for your students:  
6.1.1.  Provide 3-4 articles per subject/topic with bibliographies related to the
text you are reading.  
6.1.2. Review how to conduct research using reliable sources (e.g. Google
Scholar, JSTOR, university websites).  
6.1.3. Pull all of the available and relevant resources from the school library
prior to sending students to do their research.  
6.1.3.1. Reason: This prevents students from checking out resources before
other students researching similar topics can get to them.  
6.1.4. Ask students to evaluate the reliability of the sources you’ve provided
them in 6.1.1 by asking questions such as: Why should you trust me? How do
you know my choices are reliable? How do you know these are credible
sources? Where did I get them? How do you know where I got them? When
were they written, and how does that change the ideas we are discussing?  
6.2. Tell students to read the articles you have provided and choose 1 or 2 of those
articles to incorporate into the argument of their essays.  
6.3. Tell students to conduct their own research and find articles that support their thesis
or claim, or are based on the topic or motif that is found in the text they are reading
as the school year progresses.  
6.4. IF it is the beginning of the year, THEN require only 1 article.  
6.5. IF it is later in the school year, THEN require 3-4 articles.  
6.6. Tell students to bring the article to class and annotate it.  

Procedure 7. Begin writing the expository essay.  
7.1. Teach the overall structure of the essay by providing a visual outline as follows.  
7.1.1. Draw an upside-down triangle to visually represent how the introduction
progresses from more general statements to something more specific like their
claim.  
7.1.2. Teach students that the claim goes at the end of their introduction or at the
bottom of the upside-down triangle.  
7.1.3. Tell students to avoid writing in the 1st person in order to establish more of
an authoritative voice.  
7.1.4. Draw squares to represent body paragraphs of the essay.  
7.1.5. Remind students of their knowledge of topic sentences, concrete details,
and commentary.  
7.1.6. Draw a right side up triangle to represent the essay’s conclusion.  
7.2. Tell students to write a minimum of 4 paragraphs (including introduction and
conclusion), and as many more as necessary to make their arguments.  
7.3. Refer back to worked examples and the texts you have read with the students to
analyze how many paragraphs they needed to successfully make their arguments.  
7.4. Tell students to write the remaining part of their introductions that comes before their
thesis statements from 5.2.7.  
7.5. Ask students: What should readers know as they begin reading your essay?  
7.6. Use the worked example to demonstrate how the author tells the reader who the
author is, as well as the title and genre of the work.  
7.7. Tell students to write a brief (i.e. 3-5 sentences) summary or synopsis of what this
novel or article is mainly about.  
7.8. Tell students to write a transition to the idea in the novel or article that is going to tie
to their thesis statements.  
7.9. Tell students to refer to essay checklist from 1.1 to evaluate whether or not they have
completed writing their introductions.  
7.10. IF students’ introductions meet all of the requirements of the essay checklist for
introductions, THEN move on to the next step.  
7.11. Tell students to review their thesis statements and have them answer the question:
“Why would you say that?”  
7.12. Tell students that their answers to 7.11 will become the topic sentence of their
first body paragraph.  
7.13. IF student’s topic sentence is not arguable, THEN have students revise.  
7.14. IF student’s topic sentence is arguable, THEN move on to the next step.  
7.15. Tell students to refer back to their notes at 5.1.5.3. and identify the specific
passages/quotes that support their topic sentence.  
7.16. Ask students to think about how the article(s) from their research in Procedure 6
supports and/or furthers their argument.
7.17. Explain to students the difference between quoting from the text as opposed to
paraphrasing the text.  
7.18. IF the author’s wording is essential (i.e. you need to keep the author’s words in
their original form because you are specifically analyzing author’s word choice
and/or style or you need to include specific words or phrases from the author in your
commentary) for your commentary, THEN quote.  
7.19. IF it is information that students can successfully restate without losing anything
(i.e. a statistic, a fact, or a plot detail), THEN students should paraphrase.  
7.20. Tell students not to rely only on quotes simply because quoting is easier.  
7.21. Review the worked example in 3.4, and provide worked examples of “lead-ins” or
introductory phrases that precede quotations to show students how to integrate
concrete details into their essays.  
7.22. Teach students how to do in-text citations as well as citations for their Works
Cited page by providing students with the detailed information stated in the MLA
guidelines.  
7.23. IF it is the first essay of the year, THEN provide a full citation of the primary
source they are writing about, along with additional examples of sources they would
be familiar with from previous works they have read.  
7.24. IF students demonstrate the ability to cite sources correctly in their first essays,
THEN do not provide the citation for their primary source.  
7.25. Tell students to write commentary  
7.26. IF there is time, THEN prepare some worked examples of commentary prior to
students writing commentary.  
7.27. Tell students to review their topic sentence and think about the argument they are
making.  
7.28. Tell students to review the textual support they have chosen and identify the
specific word(s), tone, and/or image that specifically support what they are arguing.  
7.29. IF students write commentary that is plot summary, THEN use another worked
example to demonstrate the difference between summary and analysis/explanation.  
7.30. Conference with students over their first complete body paragraph.  
7.31. Tell students to revise their first body paragraph per teacher recommendations.  
7.32. IF students have successfully completed their first body paragraphs, THEN have
them write the remaining two body paragraphs.  
7.33. Tell students to write a conclusion.  
7.34. Revisit the conclusion and the suggestions you gave them from the worked
example in 3.5.4.1.  
7.35. Tell students that you will read the conclusion, but will not grade it against them.  

Procedure 8. Teach how to edit writing.  
8.1. Tell students to get into groups of four and take out a pen that is a different color
from everyone else’s in their group.  
8.2. Assign more capable and skilled students to each group if possible.  
8.3. Tell students to take their most recent draft of the essay and highlight their thesis
statements.  
8.4. Tell students to pass their essays to the student on their left for the first reading.  
8.5. Tell students to read for about 7-10 minutes, looking specifically at formatting, the
thesis statement, whether or not the commentary addresses the thesis statement,
and grammar and syntax.  
8.6. Tell students to pass their essays again to the student on their left for the second
reading.  
8.7. Tell students to read for about 7-10 minutes, looking specifically at whether or not
the opening paragraph introduces a logical and coherent line of thinking, beginning
with a hook and ending with the thesis statement, as well as at grammar and syntax.  
8.8. Tell students to pass their essays again to the student on their left for the third
reading.  
8.9. Tell students to read the essay paying close attention to the comments and
suggestions made by the previous two students and adding anything else they can.  
8.10. Tell students to pass their essays again for the last time so that essays are returned
to their original owners.  
8.11. Tell each group to have a table discussion about each individual paper, where
students can ask questions about the feedback they have received from their group
members.

Procedure 9. Teach reflection and evaluate students’ writing  
9.1. Tell students to answer the question: “What was positive about the essay?” on the
last page of their essays.  
9.2. Tell students to answer the question: “What was negative about the essay?” on the
last page of their essays.  
9.3. Tell students to submit their final draft of their essays with all of the work leading up
to it.  
9.4. Grade only the final draft of students’ essays; read their comments afterwards, and
look at the previous student work only if necessary.  
9.5. Give feedback on the following as you grade their essays:  
9.5.1. Formatting (i.e. MLA)  
9.5.2. Sentence structure  
9.5.3. Diction and word choice  
9.5.4. More sophisticated use of transitional words and/or phrases  
9.5.5. Thesis statement  
9.5.6. Topic sentences  
9.5.7. Commentary or voice  
9.5.8. Whether or not the student took a risk (i.e. has the student attempted
more complex arguments based on his or her level of writing proficiency).  
9.6. IF you read the final draft and suspect plagiarism, THEN revisit students’ prior work.  
9.7. IF you read the final draft and suspect the student has made little to no progress
based on your feedback, THEN revisit prior work.  
9.8. IF there are noticeable grammatical errors in the essays, THEN provide instruction
on how to make corrections and improvements.  
9.9. Return student essays quickly so that their work remains relevant along with your
grade and with enough feedback to guide them.  
9.10. Offer students the option of revising their essays.  
9.11. IF students’ essays need significant improvement (i.e. would receive a letter
grade of C- or lower), THEN assign the letter R to indicate that the essay is worth
going back to again.  
9.11.1. Reason: Students have an opportunity to demonstrate improvement right
away as opposed to waiting for the next essay.  
9.11.2. Reason: The improvements students need to make for their essays do not
always transfer to another essay, nor is that transfer easy to see for novice
students.  
9.12. IF students choose to revise their essays, THEN do the following:  
9.12.1. Be available to conference with students you required to rewrite or who
choose to rewrite.  
9.12.2. Give students who revise no more than 2 weeks to complete their revisions.  
9.12.3. Allow students who revise the opportunity to receive the highest grade
possible as long as their work merits it.  
9.13. IF students do not revise their essays, THEN assign the letter grade you would
have originally assigned.  
9.14. Tell all students to look over the comments and/or adjustments you made, and
write a one-page reflection with the following guidelines.  
9.15. Tell students they can write openly and freely for their reflection.
9.16. Recommend to students that they reflect on what they plan to do differently for
the next essay, what they learned through the writing process, and on any possible
questions that remain.
9.17. Tell students to resubmit all of their work, including their reflections, for
discussion during individual student-teacher conferences.
9.17.1. Reason: Students can understand the rationale behind their grade and
reflect upon the writing process.  
9.17.2. Grading student reflections is optional.  

Appendix E
Incremental Coding Spreadsheets

 
 
SME
Ste
p Type
Final Incremental Gold Standard
Protocol Procedures A B C D
 
Procedure 1. Prepare to teach.
(A1R269; B1R52; C1R93; D1R)  
 
1 A
1.1. Develop essay criteria that
includes the following. (A1R1474;
B1R165-184; C1R103-125; D1R)  1 1 1 1
2 A
1.2. Prepare mock exercises for
students: (A1R862-864; B1R188;
C1R107; D1R) 1 1 1 1
3 A
1.2.1. Generate 3-5 simple and
relatable questions that allow
students the opportunity to defend
their positions with specific reasons
and/or evidence (A1R252-256,
879-889; B1R188; C1R127; D1R)  1 1 1 1
4 A
1.2.2. Select two exemplar body
paragraphs and/or an article for
students to color code in class.
(A1R954-955; B1R201; C1R130;
D1R) 1 1 1 1
5 A
1.2.2.2. Choose paragraphs or
articles that are accessible to your
students based on the following
criteria (i.e. uses text that is
congruent with students’ reading
comprehension levels and is on a
topic of interest to students. (A2R;
B1R201; C1R130; D1R) 1 1 1 1
6 A
1.2.3. Choose a different paragraph
from 1.2.2 and cut it up into
individual strips of paper so that
each strip of paper has a sentence of
the paragraph on it. (A1R922-925;
B2R; C1R131; D1R) 1 1 1 1
7 A
1.2.3.3. Take in to account the
number of students in the class and
the number of sentences in the
paragraph to have enough for each
student. (A2R; B2R; C1R131;
D1R) 1 1 1 1
8 A
1.3. Select a novel or an article for
the students to read and eventually
use to write an expository essay.
(A1R1108-1116; B1R257-279;
C1R190; D1R) 1 1 1 1
9 D
1.3.1. IF it is the beginning of the
school year (i.e. first quarter),
THEN teach expository writing
using an article. (B2R; C1R190;
D1R) 0 1 1 1
10 D
1.3.2. IF it is the second quarter of
the school year or later, THEN
teach expository writing using a
novel. (B2R; C1R190; D1R) 0 1 1 1
11 A
1.4. Read the chosen text closely
and annotate it for the following
literary elements: (A1R590-596,
603-611, 650; B1R334; C1R206;
D1R) 1 1 1 1
12 A
1.5. Formulate writing prompts
(A1R252, 304, 309-318, 652;
B1R385-490; C1R213; D1R) 1 1 1 1
13 A
1.5.1. Ask yourself: How do I
connect with the text? (A1R673;
C1R214; D1R) 1 1 1 1
14 A
1.5.2. Ask yourself: How would my
students connect to the text?
(A1R675; C1R215; D1R) 1 1 1 1
15 A
1.5.3. Ask yourself: How do I get my
students to figure out what they can
argue? (B1R399-400; C1R216;
D1R) 0 1 1 1
16 A
1.5.4. Use your responses to 1.5.1
to 1.5.3 to develop 2-5 writing
prompts of varying levels of
difficulty pertaining to a theme or a
universal truth of mankind (i.e.
love, community, death, etc.), the
structure of how the text was
written, critical analysis, and
compare and contrast. (A1R677,
685, 689, 690, 711; B1R427, 441,
490; C1R225, D1R) 1 1 1 1
17 A
1.5.5. Adjust/Differentiate prompts so they address various levels of
reading ability, comprehension, and understanding by simplifying
vocabulary and avoiding lengthy complex sentences. (A1R704-713,
772-774, 784-801; A2R; C1R241; D1R) 1 1 1 1
18 D
1.5.6. IF the text is more complex
and rich with concepts, themes, and
motifs, THEN write upwards of
five prompts. (A1R734-735;
C1R246; D1R) 1 1 1 1
19 D
1.5.7. IF the text is less complex
and less endowed with concepts,
themes, and motifs, THEN write
closer to three prompts. (A1R731-
732; C1R246; D1R) 1 1 1 1
20 A
1.5.8. Identify potential arguments
that students can make based on the
writing prompts, as well as the
complexity of those arguments.
(C2R; D1R) 0 0 1 1
21 D
1.5.8.1. IF the potential argument
can be made with clearly defined,
black and white explanations,
THEN the argument is less
complex. (C2R; D1R) 0 0 1 1
22 D
1.5.8.2. IF the potential argument
cannot be made with explanations
that are clearly defined and there
are no clear right or wrong answers,
THEN the argument is more
complex. (C2R; D1R) 0 0 1 1
 
Procedure 2. Introduce/Review
expository writing to students.
(B1R52, 599-689; C1R249; D1R)  
1 A
2.1. Tap into students’ prior
knowledge about essay writing by
doing the following: (B1R615;
C1R275; D1R) 0 1 1 1
2 A
2.1.1. Ask students what the
purpose of an expository essay is.
(B2R; C1R277; D1R) 0 1 1 1
3 A
2.1.2. Review academic language such as claim, topic sentence,
concrete detail, and commentary, and ask students to identify
synonyms to those terms. (B1R604, 633; C1R283; D1R) 0 1 1 1
4 A
2.1.3. Review and define the four
rhetorical writing modes: (B1R609;
C1R284; D1R) 0 1 1 1
5 A
2.1.3.1. Teach narrative as a story
that can be told from different
points of view such as the
overarching narrator (i.e. 3rd
person omniscient) versus the more
focused narrator (i.e. 1st person
limited). (B2R; C1R295-307, 873-
878; D1R) 0 1 1 1
6 A
2.1.3.2. Teach descriptive as a
detailed description of a specific
event or topic that does not have to
be written in the first-person. (B2R;
C1R294; D1R)  0 1 1 1
7 A
2.1.3.3. Teach expository writing as
a way to inform and/or construct
and defend an argument
represented by written analysis.
(A1R405; B1R609; C1R294; D1R) 1 1 1 1
8 A
2.1.3.4. Teach that everything is an
argument, and that argument has an
element of persuasion. (B2R;
C1R492; D1R) 0 1 1 1
9 A
2.2. Provide a baking analogy to help
students understand that writing
requires both structure (i.e.
essential ingredients) and creativity
(i.e. non-essential ingredients you
use to experiment) and/or any other
analogy that connects with
students’ interests, promotes
structure, creativity, and finding
their voice. (B1R603, 617, 678;
C1R324; D1R)  0 1 1 1
 
Procedure 3. Teach how to
formulate an argument.
(A1R352-356, 463-464, 472; 1304-
1307; B1R695-696; C1R380;
D1R)
 
1 A
3.1. Do mock exercise 1.2.1 with
students. (A2R; B1R695-765;
C1R382; D1R)  1 1 1 1
2 A
3.1.1. Write the questions on the
board or show a movie clip that is
followed by a controversial
question to help students
understand that an argument is a
claim (e.g. watch the opening scene of
“Vertical Limit” and ask students:
should the son cut the climbing
rope to save himself and his sister
at the expense of his father’s life?)
(A2R; B1R698, 715, 741; C1R384;
D1R) 1 1 1 1
3 A
3.1.2. Tell students to think and
write about their answers to the
question along with possible
reasons and/or evidence. (A2R;
C1R393; D1R) 1 1 1 1
4 A
3.1.3. Ask students to share out
their answers in order to discuss
informally as a whole class. (A2R;
B1R721; C1R395; D1R)  1 1 1 1
5 D
3.1.4. IF students’ responses vary,
THEN keep the dialogue going by
having students discuss and debate
their answers in class. (D1R) 0 0 0 1
6 D
3.1.5. IF students share similar
responses, THEN have them group
together and defend their positions
with reasons and evidence.
(B1R737; C1R397; D1R) 0 1 1 1
7 A
3.1.6. Use student responses during
this exercise to connect to the
academic language in 2.1.2.
(B1R727; C1R399; D1R) 0 1 1 1
8 A
3.2. Do mock exercise 1.2.2 with
students. (A2R; B1R844; C1R401;
D1R) 1 1 1 1
9 A
3.2.1. Distribute the paragraphs
from 1.2.2 along with highlighters.
(A2R; B1R844; C1R402; D1R) 1 1 1 1
10 A
3.2.1.1. Give students time to practice color-coding one of the
two paragraphs as an entire class (e.g. blue indicates topic sentence,
red indicates concrete detail, and green indicates commentary).
(A2R948; B1R844; C1R405-407; D1R) 1 1 1 1
11 A
3.2.1.2. Tell students to work
individually to color code the
second paragraph and share out to
the rest of the class. (A2R;
B1R844; C1R408; D1R) 1 1 1 1
12 A
3.2.2. Distribute the article from
1.2.2 or allow students to bring an
article of their choosing that is no
more than 2 pages in length.
(B1R853, 858; C1R409; D1R) 0 1 1 1
13 A
3.2.3. Tell students to identify the
main argument. (C2R; D1R) 0 0 1 1
14 A
3.2.4. Tell students to read and
color code the article. (B1R207,
865; C1R412; D1R) 0 1 1 1
15 D
3.2.5. IF students do not know how
to classify a sentence, THEN tell
them to mark it using a distinct
color and discuss it with the class to
solve. (B1R867; C1R413; D1R) 0 1 1 1
16 A
3.2.6. Ask students to share their
favorite sentence(s) from the
article. (B2R; C1R416; D1R) 0 1 1 1
17 A
3.2.7. Tell students to hold up their
papers to prompt discussion about
which articles are well written and
which are poorly written.
(B1R875; C1R416; D1R) 0 1 1 1
18 D
3.2.8. IF students have a poorly
written article, THEN discuss why it is
poorly written and how it could be
improved. (B1R876; C1R421;
D1R) 0 1 1 1
19 D
3.2.9. IF students bring in articles
that are from a blog or another type
of informal source, THEN teach them the
differences between colloquial and
formal language. (B1R882;
C1R422; D1R) 0 1 1 1
20 A
3.3. Do mock exercise 1.2.3 with
students. (A2R; C1R451; D1R) 1 1 1 1
21 A
3.3.1. Distribute the strips of paper
to the students accordingly. (A2R;
C1R452; D1R) 1 1 1 1
22 A
3.3.2. Tell students to get up out of
their seats, walk around the room,
and connect with the other
sentences in their paragraph. (A2R;
C1R452; D1R)  1 1 1 1
23 A
3.3.3. Tell students to refer back to
the previous exercise to determine
whether their sentence is the topic sentence,
a concrete detail, and/or commentary.
(A2R; C1R455; D1R) 1 1 1 1
24 A
3.3.4. Ask students to volunteer
their findings once they believe
they have successfully
reconstructed their paragraph.
(A2R; C1R457; D1R) 1 1 1 1
25 D
3.3.4.1. IF students reconstruct their
paragraph successfully, THEN
affirm their work. (A2R; C1R459;
D1R)  1 1 1 1
26 D
3.3.4.2. IF students did not
reconstruct their paragraph
successfully, THEN work with the
class to make the appropriate
changes. (A2R; C1R460; D1R) 1 1 1 1
27 A
3.4. Model writing for students
using a worked example of a
student expository essay on a text
that they are familiar with and that
is different than the one they are
writing on. (A1R1320-1335;
C1R628; D1R) 1 1 1 1
28 A
3.5. Read through the entire essay
with the students and pause when
necessary to identify and teach the
following elements: (A1R1371-
1373; C1R631; D1R) 1 1 1 1
29 A
3.5.1. Teach the introduction as the
opening paragraph of the essay that
includes the following: (A1R1336-
1344; C1R633; D1R) 1 1 1 1
30 A
3.5.2. Teach the thesis statement as
the main argument of the essay that
meets the following criteria:
(A1R1504; C1R636; D1R) 1 1 1 1
31 A
3.5.3. Teach body paragraphs as the paragraphs where students can
discuss reasons and evidence to support the thesis statement and
include the following: (A1R1348; C1R639; D1R) 1 1 1 1
32 A
3.5.3.1. Teach topic sentence as a
mini-argument that supports the
thesis or the main argument.
(A1R476, 1505, 1634; C1R641;
D1R) 1 1 1 1
33 A
3.5.3.2. Teach concrete detail as a
textual example that supports the
argument in the topic sentence.
(A1R476-477, 1598-1599;
C1R644; D1R) 1 1 1 1
34 A
3.5.3.3. Teach that a concrete detail can
also be a paraphrase of something
that happened in the text. (A1R476-
477, 1596-1599, 1841; C1R644;
D1R) 1 1 1 1
35 A
3.5.3.3.1. Tell students the concrete
detail should be as succinct as
possible and not take up more than
40% of the paragraph. (A1R1817-
1821, 1831-1834; C1R647; D1R)  1 1 1 1
36 A
3.5.3.4. Teach commentary as the
students’ rationale that explains
how they are connecting the text to
their topic sentence. (A1R476-477,
1596, 1738; C1R650; D1R)  1 1 1 1
37 A
3.5.3.4.1. Tell students that
commentary should be multiple
sentences and roughly 60% of the
body paragraph (A1R1734-1740,
1817-1821; C1R652; D1R). 1 1 1 1
38 A
3.5.3.4.2. Tell students that
effective commentary analyzes the
literary elements from 1.4.1 to 1.4.5
(A1R1759-1769; C1R654; D1R) 1 1 1 1
39 A
3.5.4. Teach the conclusion as a
restating and a closing of their
entire argument. (A1R1905-1909;
C1R657; D1R) 1 1 1 1
40 A
3.5.4.1. Show worked examples to demonstrate how the conclusion is
a restating and closing of an argument, that it provides closure,
settles the tone of the argument, takes the reader beyond the topic,
and/or explains why the argument matters. (A1R1914-1923; C1R658;
D1R) 1 1 1 1
 
Procedure 4. Teach how to
support an argument (A1R373-
375, 466, 476, 1586; B1R766;
C1R719; D1R)  
1 A
4.1. Teach rhetorical ideas: ethos,
pathos, and logos, and that arguments
are made based on these rhetorical
ideas. (B1R766-821; C1R720;
D1R) 0 1 1 1
2 A
4.2. Ask students: What do you
need to have to make a convincing
argument? (B1R776; C1R723;
D1R) 0 1 1 1
3 A
4.3. Tell students that if they want
people to believe their arguments,
they cannot solely rely on one type
of appeal. (B1R774; C1R725; D1R) 0 1 1 1
4 A
4.4. Teach students to develop their
arguments by using a combination
of the following types of support
that are relevant for this particular
writing assignment: (B1R781;
C1R727; D1R) 0 1 1 1
5 A
4.5. Revisit the article from 3.2.2
and tell students to identify the
types of support the author used to
support his/her argument. (C2R;
D1R) 0 0 1 1
6 A
4.6. Pose to students either a random
controversial question or one of the
questions you generated in 1.2.1,
in order to give them practice
formulating a claim as an argument.
(B1R798; C1R732; D1R) 0 1 1 1
7 A
4.7. Tell students to write down
their answer to the question in the
form of an argument statement.
(B1R803; C1R736; D1R)  0 1 1 1
8 A
4.8. Tell students to bullet point at least 4 reasons or evidence to
support their claim. (B1R804; C1R737; D1R) 0 1 1 1
9 A
4.9. Tell students to go through
their bulleted points of reasons and
evidence and identify if it is based
on an emotional appeal, a statistic,
or its relevance or impact on history
and/or society. (B1R813; C1R738;
D1R) 0 1 1 1
10 D
4.10. IF students rely too much on
one type of reasoning or evidence,
THEN recommend they incorporate
other types. (B1R815; C1R741;
D1R) 0 1 1 1
11 A
4.11. Pose another random
controversial question (e.g. should
the school enforce a dress code?),
divide the class in half to indicate
those who are for and those who are
against and assign small groups of
students to present the types of
reasons and evidence from 4.4.1 to
4.4.5. (B2R; C1R743; D1R) 0 1 1 1
 
Procedure 5. Conduct close
reading of the text to prepare the
students for writing. (A1R598;
B1R890; C1R751; D1R)  
1 D
5.1. IF you have students read a
novel, THEN do the following
steps: (A1R; C1R; D1R) 1 1 1 1
2 A
5.1.1. Distribute all of the writing
prompts to the students before you
begin reading the text. (A1R1020-
1022, 1033-1042; C1R464; D1R) 1 1 1 1
3 A
5.1.2. Read and discuss each of the
prompts with the students clarifying
for meaning and difficult
vocabulary. (A2R; C1R465; D1R) 1 1 1 1
4 A
5.1.3. Distribute copies of the
chosen text to students. (A1R;
C1R468; D1R) 1 1 1 1
5 A
5.1.4. Alternate between your
reading of the text aloud in class
and students reading the text at
home. (A1R1164-1166; C1R468;
D1R) 1 1 1 1
6 D
5.1.5. IF students are less proficient
at reading, THEN read more of the
text together in class. (A1R1164-
1166, 1214-1216; C1R470; D1R) 1 1 1 1
7 A
5.1.5.1. Alternate between students
reading on their own or you reading
aloud to the students. (A1R1206;
C1R475; D1R)  1 1 1 1
8 A
5.1.5.2. Stop at annotations from
the preparation in 1.4 and identify
the literary elements within the text
(e.g. author’s tone, diction, use of
symbolism, etc.). (A1R1240, 1246;
C1R478; D1R) 1 1 1 1
9 D
5.1.5.2.1. IF students demonstrate
understanding during discussion of
literary element, THEN move
forward with the close reading.
(A2R; C1R481; D1R) 1 1 1 1
10 D
5.1.5.2.2. IF students do not
demonstrate understanding during
discussion of literary elements,
THEN stop and address students’
confusion (A2R; C1R482; D1R) 1 1 1 1
11 A
5.1.5.3. Ask students to identify
parts of the text (e.g. color-coded
sticky notes) that correspond with
the various writing prompts.
(A1R1074; C1R485; D1R) 1 1 1 1
12 D
5.1.5.4. IF the quote or passage in
the text obviously connects to one
of the writing prompts, THEN ask
students to make the connection.
(A1R1254-1255; C1R489; D1R) 1 1 1 1
13 D
5.1.5.5. IF the quote or passage in
the text is not obvious as to which
of the writing prompts it connects
to, THEN ask 3 more clarifying
questions to give them more
opportunities to think and reach an
answer. (A1R1257-1258; C1R501;
D1R) 1 1 1 1
14 D
5.1.5.5.1. IF students show they are reaching their frustration level,
THEN make the connection for them. (A1R1257-1258; C1R511;
D1R) 1 1 1 1
15 D
5.1.6. IF students are more
proficient at reading, THEN assign
more reading at home. (A1R1164-
1166; C1R471; D1R) 1 1 1 1
16 A
5.1.6.1. Tell students that there will
be an assessment of the reading
assignment when they return to
class. (A1R1174; C2R; D1R) 1 1 1 1
17 A
5.1.6.2. Tell the students to take
notes on their reading and that they
will be able to use those notes
during the assessment. (A1R1214-
1218; C2R; D1R) 1 1 1 1
18 D
5.1.6.3. IF students are assigned
reading at home, THEN give
assessment (i.e. quiz) when they
return to class. (A1R1211-1212;
C2R; D1R)  1 1 1 1
19 A
5.1.7. Continue to check for student
understanding as you finish close
reading of the text, and assess
students’ readiness to write using
the following methods: (A1R218-
219, 236, 277, 1153; C1R542;
D1R) 1 1 1 1
20 A
5.1.7.1. Administer quickwrites
(A1R231; C1R545; D1R) 1 1 1 1
21 A
5.1.7.2. Give reading
comprehension quizzes (A1R1043;
C1R545; D1R) 1 1 1 1
22 A
5.1.7.3. Tell students to answer
free-response questions on the text
(A1R1044; C1R545; D1R) 1 1 1 1
23 A
5.1.7.4. Tell students to write
reflections on the text (A1R1044;
C1R546; D1R) 1 1 1 1
24 A
5.1.7.5. Observe students during
class (A1R232; C1R547; D1R) 1 1 1 1
25 A
5.1.7.6. Hold class discussions
(A1R232, 276; C1R547; D1R) 1 1 1 1
26 A
5.1.8. Invite students to converse
about the text and discuss multiple
interpretations. (B1R73-104, 966,
988-1027; C1R586; D1R) 0 1 1 1
27 A
5.1.8.2. Distribute a list of about 80 topics to the students. (B1R533,
555-557; C1R593; D1R) 0 1 1 1
28 A
5.1.8.3. Review the list of topics
and ask students to identify which
topics are relevant to the text.
(B1R582-583; C1R594; D1R) 0 1 1 1
29 A
5.1.8.4. Mark a star next to those
topics that students believe are
relevant to the text and do not ask
why. (B1R582-583; C1R596; D1R) 0 1 1 1
30 D
5.1.8.5. IF students do not
recognize a topic that you believe is
appropriate, THEN ask them,
“Could you argue this?” and wait to
see if they change their minds.
(B1R587-590; C1R602; D1R) 0 1 1 1
31 A
5.1.9. Revisit writing prompts
explicitly to determine which
prompts would be most appropriate
with students’ abilities. (A2R;
C1R551; D1R) 1 1 1 1
32 D
5.1.9.1. IF students demonstrate a
strong understanding of the text,
THEN recommend one of the more
challenging prompts. (A1R294-
303, 1041-1051; C1R555; D1R) 1 1 1 1
33 D
5.1.9.2. IF students demonstrate
minimal understanding of the text,
THEN recommend one of the less
challenging prompts. (A1R1041-
1051; C1R557; D1R) 1 1 1 1
34 D
5.1.9.3. IF a student would like to
write on a prompt different from the
one you’ve recommended and has
discussed it with you, THEN let the
student write to the desired prompt.
(A1R1041-1051; C1R559; D1R) 1 1 1 1
35 A
5.1.10. Tell students to write an
answer to the writing prompt they
have chosen. (A2R; C1R663; D1R) 1 1 1 1
36 A
5.1.11. Ask students to share out
their answers to the class when they
are ready and discuss whether or
not the answers are arguable.
(A1R350-351; C1R664; D1R) 1 1 1 1
37 D
5.2. IF you have students read an
article, THEN do the following
steps: (B1R890; C1R578; D1R) 0 1 1 1
38 A
5.2.1. Provide students with the
article you have chosen from 1.3
(B1R890; C1R579; D1R) 0 1 1 1
39 A
5.2.2. Tell students to read the
article at home and annotate
referring to the annotation guide
they received at the beginning of
the year. (B1R924; C1R579; D1R)    0 1 1 1
40 D
5.2.3. IF there is time in class,
THEN read and annotate the first
paragraph together. (B1R956;
C1R583; D1R) 0 1 1 1
41 A
5.2.4. Ask students to identify the
author’s main claim. (B1R366-367,
990; C1R589; D1R) 0 1 1 1
42 D
5.2.5. IF students identify the
author’s main claim in the article,
THEN ask them to share out with
the rest of the class how they know
that. (B1R990-991; C1R590; D1R) 0 1 1 1
43 A
5.2.6. Ask students to generate their
thesis or claim based on their
responses from 5.2.4 and their own
experiences, without sharing or
discussing it with other students.
(B1R992-1006; C1R605; D1R) 0 1 1 1
44 A
5.2.7. Meet with each student and
read his/her claim or thesis
statement. (B1R1069-1074, 1080-
1081; C1R607; D1R) 0 1 1 1
45 D
5.2.8. IF a student’s claim or thesis is
not clearly written as an arguable
position on the author’s claim,
THEN make time to conference
with the student to help revise.
(B1R1074-1076; C1R610; D1R) 0 1 1 1
46 D
5.2.9. IF a student’s claim or thesis
statement is clearly written as an
arguable position on the topic,
THEN direct students to procedure
6: Conduct research. (B1R1076-
1078; C1R612, 753; D1R) 0 1 1 1
   
Procedure 6. Conduct research to
cite outside sources (B1R280,
295-296, 312-318, 1010-1078;
C1R756; D1R)
1 D
6.1. IF it is the beginning of the
school year and you are following
the chronology of American
literature (i.e. Puritans,
Revolutionary writers), THEN
provide the following scaffold for
your students: (C1R768-786; D1R) 0 0 1 1
2 A
6.1.1.  Provide 3-4 articles per
subject/topic with bibliographies
related to the text you are reading.
(C1R788; D1R) 0 0 1 1
3 A
6.1.2. Review how to conduct
research using reliable sources (e.g.
Google Scholar, JSTOR, university
websites). (C1R808-825; D1R) 0 0 1 1
4 A
6.1.3. Pull all of the available and
relevant resources from the school
library prior to sending students to
do their research. (C1R836; D1R) 0 0 1 1
5 A
6.1.4. Ask students to evaluate the
reliability of the sources you’ve
provided them in 6.1.1 by asking
questions such as: Why should they
trust you? How do they know your
choices are reliable? How do they
know these are credible sources?
Where did you get them, and how do
they know that? When were they
written, and how does that change
the ideas being discussed?
(C1R844-856; D1R) 0 0 1 1
6 A
6.2. Tell students to read the
articles you have provided and
choose 1 or 2 of those articles to
incorporate into the argument of
their essays. (C1R789; D1R) 0 0 1 1
7 A
6.3. Tell students to conduct their own research and find articles that
support their thesis or claim, or are based on the topic or motif
found in the text they are reading as the school year progresses.
(B1R312-318, 1036, 1046-1048, 1051-1052; C1R760; D1R) 0 1 1 1
8 D
6.4. IF it is the beginning of the
year, THEN require only 1 article.
(B1R1032; C1R764-765; D1R) 0 1 1 1
9 D
6.5. IF it is later in the school year,
THEN require 3-4 articles.
(B1R1032; C1R764-765; D1R) 0 1 1 1
10 A
6.6. Tell students to bring the
article to class and annotate it.
(B1R1053-1054; C1R763; D1R) 0 1 1 1
 
Procedure 7. Begin writing the
expository essay (B1R1079-1143;
C1R861; D1R)  
1 A
7.1. Teach the overall structure of
the essay by providing a visual
outline as follows. (B1R1096-1109;
C1R863; D1R) 0 1 1 1
2 A
7.1.1. Draw an upside down
triangle to visually represent how
the introduction progresses from
more general statements to
something more specific like their
claim. (B1R1113-1115; C1R865-
869; D1R) 0 1 1 1
3 A
7.1.2. Teach students that the claim
goes at the end of their introduction
or the bottom of the upside down
triangle. (B1R1099, 1110; C1R870;
D1R) 0 1 1 1
4 A
7.1.3. Tell students to avoid writing
in the 1st person in order to
establish more of an authoritative
voice. (B1R1057-1066; C1R871;
D1R) 0 1 1 1
5 A
7.1.4. Draw squares to represent
body paragraphs of the essay.
(B1R1116-1118; C1R880; D1R) 0 1 1 1
6 A
7.1.5. Remind students of their
knowledge of topic sentences,
concrete details, and commentary.
(B1R1116-1118; C1R880; D1R) 0 1 1 1
7 A
7.1.6. Draw a right side up triangle
to represent the essay’s conclusion.
(B1R1120-1126; C1R894-927;
D1R) 0 1 1 1
8 A
7.2. Tell students to write a
minimum of 4 paragraphs
(including introduction and
conclusion) and as many more as
necessary to make their arguments.
(B1R1120-1126; C1R894-927; D1R) 0 1 1 1
9 A
7.3. Refer back to worked examples
and the texts you have read with the
students to analyze how many
paragraphs they needed to
successfully make their arguments.
(C1R934-947; D1R) 0 0 1 1
10 A
7.4. Tell students to write the
remaining part of their
introductions that comes before
their thesis statements from 5.2.7.
(A1R363-375, 1380-1409;
C1R668; D1R)  1 1 1 1
11 A
7.5. Ask students: What should
readers know as they begin reading
your essay? (A1R1396-1407;
C1R672; D1R) 1 1 1 1
12 A
7.6. Use the worked example to
demonstrate how the author tells
the reader who the author is, as well
as the title and genre of the work.
(A1R1329-1335; C1R673; D1R) 1 1 1 1
13 A
7.7. Tell students to write a brief
(i.e. 3-5 sentences) summary or
synopsis of what this novel or
article is mainly about. (A1R1380,
1385-1386; C1R675; D1R) 1 1 1 1
14 A
7.8. Tell students to write a
transition to the idea in the novel or
article that is going to tie to their
thesis statements. (A1R1386-1390;
C1R676; D1R) 1 1 1 1
15 A
7.9. Tell students to refer to essay
checklist from 1.1 to evaluate
whether or not they have completed
writing their introductions.
(A1R1466-1476; C1R678; D1R) 1 1 1 1
16 D
7.10. IF students’ introductions meet all of the requirements of the
essay checklist for introductions, THEN move on to the next step.
(A2R; C1R681; D1R) 1 1 1 1
17 A
7.11. Tell students to review their
thesis statements and have them
answer the question: “Why would
you say that?” (A1R1588-1593;
C1R687; D1R) 1 1 1 1
18 A
7.12. Tell students that their
answers to 7.11 will become the
topic sentence of their first body
paragraph. (A1R1591-1593;
C1R690; D1R) 1 1 1 1
19 D
7.13. IF student’s topic sentence is
not arguable, THEN have students
revise. (A1R1616; C1R691; D1R) 1 1 1 1
20 D
7.14. IF student’s topic sentence is
arguable, THEN move on to the
next step. (A1R1616; C1R692;
D1R) 1 1 1 1
21 A
7.15. Tell students to refer back to
their notes at 5.1.5.3. and identify
the specific passages/quotes that
support their topic sentence.
(A1R1597-1599; C1R696; D1R) 1 1 1 1
22 A
7.16. Ask students to think about
how the article(s) from their
research in Procedure 6 supports
and/or furthers their argument.
(C2R; D1R) 0 0 1 1
23 A
7.17. Explain to students the
difference between quoting from
the text as opposed to paraphrasing
the text. (D1R) 0 0 0 1
24 D
7.18. IF the author’s wording is
essential (i.e. you need to keep the
author’s words in their original
form because you are specifically
analyzing author’s word choice
and/or style or you need to include
specific words or phrases from the
author in your commentary) for
your commentary, THEN quote.
(D1R) 0 0 0 1
25 D
7.19. IF it is information that students can successfully restate
without losing anything (i.e. a statistic, a fact, or a plot detail),
THEN students should paraphrase. (D1R) 0 0 0 1
26 A
7.20. Tell students that they should
not rely only on quotes simply
because quoting is easier. (D1R) 0 0 0 1
27 A
7.21. Review the worked example
in 3.4, and provide worked
examples of “lead-ins” or
introductory phrases that precede
quotations to show students how to
integrate concrete details into their
essays. (D1R) 0 0 0 1
28 A
7.22. Teach students how to do in-
text citations as well as citations for
their Works Cited page by providing
students with the detailed
information stated in the MLA
guidelines. (D1R) 0 0 0 1
29 D
7.23. IF it is the first essay of the
year, THEN provide a full citation
of the primary source they are
writing about, along with additional
examples of sources they would be
familiar with from previous works
they have read. (D1R) 0 0 0 1
30 D
7.24. IF students demonstrate the
ability to cite sources correctly in
their first essays, THEN do not
provide the citation for their
primary source. (D1R)   0 0 0 1
31 A
7.25. Tell students to write
commentary (A1R1668-1769;
C1R699; D1R) 1 1 1 1
32 D
7.26. IF there is time, THEN
prepare some worked examples of
commentary prior to students
writing commentary. (A1R1688-
1693; C1R700; D1R) 1 1 1 1
33 A
7.27. Tell students to review their
topic sentence and think about the
argument they are making.
(A1R1798-1799; C1R701; D1R) 1 1 1 1
34 A
7.28. Tell students to review the textual support they have chosen
and identify the specific word(s), tone, and/or image that specifically
support what they are arguing. (A1R1802-1811; C1R703; D1R) 1 1 1 1
35 D
7.29. IF students write commentary
that is plot summary, THEN use
another worked example to
demonstrate the difference between
summary and analysis/explanation.
(A1R1686-1698; C1R705; D1R) 1 1 1 1
36 A
7.30. Conference with students over
their first complete body paragraph.
(A1R1864-1865, 1884-1889;
C1R709; D1R) 1 1 1 1
37 A
7.31. Tell students to revise their
first body paragraph per teacher
recommendations. (A1R1872-1873,
1890; C1R710; D1R) 1 1 1 1
38 D
7.32. IF students have successfully
completed their first body
paragraphs, THEN have them write
the remaining two body paragraphs.
(A1R1877-1878; C1R711; D1R) 1 1 1 1
39 A
7.33. Tell students to write a
conclusion (A1R1896-1906;
C1R714; D1R) 1 1 1 1
40 A
7.34. Revisit the conclusion and the
suggestions you gave them from the
worked example in 3.5.4.1.
(A1R1905, 1919; C1R715; D1R) 1 1 1 1
41 A
7.35. Tell students that you will
read the conclusion, but will not
grade it against them. (A1R1921-
1923; C1R717; D1R) 1 1 1 1
 
Procedure 8. Teach how to edit
writing (A1R385-386, 483-485;
B1R1157; C1R968; D1R)  
1 A
8.1. Tell students to get into groups of
four and take out a different color
pen from everyone else in their
group. (B2R; C1R973; D1R)  0 1 1 1
2 A
8.2. Assign more capable and
skilled students to each group if
possible. (B2R; C1R974; D1R) 0 1 1 1
3 A
8.3. Tell students to take their most recent draft of the essay and
highlight their thesis statements. (B2R; C1R976; D1R) 0 1 1 1
4 A
8.4. Tell students to pass their
essays to the student on their left
for the first reading. (B2R;
C1R978; D1R) 0 1 1 1
5 A
8.5. Tell students to read for about
7-10 minutes looking specifically at
formatting, the thesis statement,
and whether or not the commentary
addresses the thesis statement, and
grammar and syntax. (B2R;
C1R979; D1R) 0 1 1 1
6 A
8.6. Tell students to pass their
essays again to the student on their
left for the second reading. (B2R;
C1R983; D1R) 0 1 1 1
7 A
8.7. Tell students to read for about
7-10 minutes looking specifically at
whether or not the opening
paragraph introduces a logical and
coherent line of thinking beginning
with a hook and ending with the
thesis statement, and grammar and
syntax. (B2R; C1R984; D1R) 0 1 1 1
8 A
8.8. Tell students to pass their
essays again to the student on their
left for the third reading. (B2R;
C1R989; D1R) 0 1 1 1
9 A
8.9. Tell students to read the essay
paying close attention to the
comments and suggestions made by
the previous two students and
adding anything else they can.
(B2R; C1R991; D1R) 0 1 1 1
10 A
8.10. Tell students to pass their
essays again for the last time so that
essays are returned to their original
owners. (B2R; C1R993; D1R) 0 1 1 1
11 A
8.11. Tell each group to have a
table discussion about each
individual paper, where students
can ask questions about the
feedback they have received from
their group members. (B2R;
C1R995; D1R)   0 1 1 1
   
Procedure 9. Teach reflection and
evaluate students’ writing
(A1R532-541; B1R1157;
C1R1021; D1R)
1 A
9.1. Tell students to answer the
question: “What was positive about
the essay?” on the last page of their
essays. (A1R563; B2R; C1R1022-
1023; D1R) 1 1 1 1
2 A
9.2. Tell students to answer the
question: “What was negative about
the essay?” on the last page of their
essays. (A1R563; B2R; C1R1022-
1024; D1R) 1 1 1 1
3 A
9.3. Tell students to submit their
final draft of their essays with all of
the work leading up to it. (A1R487-
488, 505-506, 1990-1994;
C1R1000; D1R) 1 1 1 1
4 A
9.4. Grade only the final draft of
students’ essays, read their
comments afterward, and look at
the previous student work only if
necessary. (A1R529, 1995; B2R;
C1R1005; D1R) 1 1 1 1
5 A
9.5. Give feedback on the following
as you grade their essays:
(A1R1996-2003; C1R1010; D1R) 1 1 1 1
6 D
9.6. IF you read the final draft and
suspect plagiarism, THEN revisit
students’ prior work. (A2R;
C1R1007; D1R) 1 1 1 1
7 D
9.7. IF you read the final draft and
suspect the student has made little
to no progress based on your
feedback, THEN revisit prior work.
(A2R; C1R1008; D1R) 1 1 1 1
8 D
9.8. IF there are noticeable
grammatical errors in the essays,
THEN provide instruction on how
to make corrections and
improvements. (A1R2028-2031;
B2R; C1R1017; D1R) 1 1 1 1
9   A   9.9. Return student essays promptly, along with your grade and enough feedback to guide them, so that their work remains relevant. (A1R529-530; B2R; C1R1026; D1R)   1 1 1 1
10  A   9.10. Offer students the option of revising their essays. (C1R1043; D1R)   0 0 1 1
11  D   9.11. IF students’ essays need significant improvement (i.e., would receive a letter grade of C- or lower), THEN assign the letter R to indicate that the essay is worth returning to. (C1R1105; D1R)   0 0 1 1
12  D   9.12. IF students choose to revise their essays, THEN do the following: (C1R1129; D1R)   0 0 1 1
13  A   9.12.1. Be available to conference with students whom you required to rewrite or who choose to rewrite. (C1R1129; D1R)   0 0 1 1
14  A   9.12.2. Give students who revise no more than two weeks to complete their revisions. (C1R1210; D1R)   0 0 1 1
15  A   9.12.3. Allow students who revise the opportunity to earn the highest grade possible, as long as their work merits it. (C1R1194; D1R)   0 0 1 1
16  D   9.13. IF students do not revise their essays, THEN assign the letter grade you would have originally assigned. (C1R1007; D1R)   0 0 1 1
17  A   9.14. Tell all students to look over the comments and/or adjustments you made, and to write a one-page reflection with the following guidelines. (A1R530, 540-541; C1R1032; D1R)   1 1 1 1
18  A   9.15. Tell students that they can write openly and freely in their reflections. (A1R558; C1R1034; D1R)   1 1 1 1
19  A   9.16. Recommend to students that they reflect on what they plan to do differently for the next essay, what they learned through the writing process, and any questions that remain. (A1R563; B2R; C1R1028, 1036; D1R)   1 1 1 1
20  A   9.17. Tell students to resubmit all of their work, including their reflections, for discussion during individual student-teacher conferences. (A1R542-548; C1R1038; D1R)   1 1 1 1
Total Action and Decision Steps (210 total)   113   181   201   210
Action Steps (159 total)                       88   141   155   159
Decision Steps (51 total)                      25    40    46    51

Steps Captured (%)
Total Action and Decision Steps   53.81%   86.19%   95.71%   100.00%
Action Steps                      55.35%   88.68%   97.48%   100.00%
Decision Steps                    49.02%   78.43%   90.20%   100.00%

Steps Omitted
Action and Decision Steps Omitted   97   29   9   0
Action Steps Omitted                71   18   4   0
Decision Steps Omitted              26   11   5   0

Steps Omitted (%)
Action and Decision Steps Omitted   46.19%   13.81%   4.29%   0.00%
Action Steps Omitted                44.65%   11.32%   2.52%   0.00%
Decision Steps Omitted              50.98%   21.57%   9.80%   0.00%
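The captured and omitted percentages follow directly from the step counts, and the "Total Mean/Average" figures reported in the next table appear to be the means of the four cumulative capture percentages. A minimal Python sketch of this arithmetic (variable and dictionary names are ours, not the study's):

```python
# Cumulative action and decision steps captured across the four columns of
# the table above, together with the fixed totals in the gold standard
# protocol (GSP): 210 steps overall, 159 action and 51 decision steps.
totals = {"all": 210, "action": 159, "decision": 51}
captured = {
    "all": [113, 181, 201, 210],
    "action": [88, 141, 155, 159],
    "decision": [25, 40, 46, 51],
}

for kind, counts in captured.items():
    total = totals[kind]
    # Percentage captured and omitted after each successive column.
    pct_captured = [round(100 * c / total, 2) for c in counts]
    pct_omitted = [round(100 * (total - c) / total, 2) for c in counts]
    # Mean capture rate across the four columns, computed from raw counts.
    mean_captured = round(100 * sum(counts) / (4 * total), 2)
    print(kind, pct_captured, pct_omitted, mean_captured)
```

Running this reproduces the tabulated values, e.g. 53.81%, 86.19%, 95.71%, and 100.00% captured for all steps, and means of 83.93%, 85.38%, and 79.41%.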
 
 
 
Total Mean/Average                Captured   Omitted
Total Action and Decision Steps    83.93%    16.07%
Action Steps                       85.38%    14.62%
Decision Steps                     79.41%    20.59%
 
 
 
Complete Alignment (Total Action and Decision Steps)      113   53.81%
Substantial Alignment (Total Action and Decision Steps)    68   32.38%
Partial Alignment (Total Action and Decision Steps)        20    9.52%
No Alignment (Total Action and Decision Steps)              9    4.29%
Total                                                     210
 
 
 
 
Complete Alignment (Action Steps)        88   77.88%
Complete Alignment (Decision Steps)      25   22.12%
Substantial Alignment (Action Steps)     54   79.41%
Substantial Alignment (Decision Steps)   15   22.06%
Partial Alignment (Action Steps)         14   70.00%
Partial Alignment (Decision Steps)        6   30.00%
No Alignment (Action Steps)               4   44.44%
No Alignment (Decision Steps)             5   55.56%
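The alignment percentages follow the same pattern: each category's share is taken over the 210 total steps, while the action/decision split within a category is taken over that category's own count. A minimal sketch (Python; the names are ours, not the study's):

```python
# Alignment of GSP steps with the verifying expert (counts from the tables above).
alignment = {"complete": 113, "substantial": 68, "partial": 20, "none": 9}
total_steps = sum(alignment.values())  # 210

# Each category's share of all 210 steps.
for category, count in alignment.items():
    print(f"{category}: {count} steps, {100 * count / total_steps:.2f}% of all steps")

# Within a category, shares are taken over that category's own count,
# e.g. complete alignment: 88 action + 25 decision = 113 steps.
print(f"complete: {100 * 88 / 113:.2f}% action, {100 * 25 / 113:.2f}% decision")
```

This reproduces 53.81%, 32.38%, 9.52%, and 4.29% for the four categories, and the 77.88%/22.12% action/decision split within complete alignment.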
Asset Metadata
Creator Lim, Nicolas (author) 
Core Title Using incremental cognitive task analysis to capture expert instruction in expository writing for secondary students 
Contributor Electronically uploaded by the author (provenance) 
School Rossier School of Education 
Degree Doctor of Education 
Degree Program Education (Leadership) 
Publication Date 04/09/2015 
Defense Date 02/19/2015 
Publisher University of Southern California (original), University of Southern California. Libraries (digital) 
Tag expository writing,incremental cognitive task analysis,OAI-PMH Harvest,secondary students 
Format application/pdf (imt) 
Language English
Advisor Yates, Kenneth A. (committee chair), Carbone, Paula M. (committee member), Maddox, Anthony B. (committee member) 
Creator Email limn@usc.edu,nicolaslim15@gmail.com 
Permanent Link (DOI) https://doi.org/10.25549/usctheses-c3-544213 
Unique identifier UC11297852 
Identifier etd-LimNicolas-3262.pdf (filename),usctheses-c3-544213 (legacy record id) 
Legacy Identifier etd-LimNicolas-3262.pdf 
Dmrecord 544213 
Document Type Dissertation 
Rights Lim, Nicolas 
Type texts
Source University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection) 
Access Conditions The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law.  Electronic access is being provided by the USC Libraries in agreement with the a... 
Repository Name University of Southern California Digital Library
Repository Location USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA
Abstract
Cognitive Task Analysis (CTA) methods were used to capture the knowledge and skills, represented by action and decision steps, of expert English teachers as they recall and describe how they provide expository writing instruction at the eleventh-grade level. Three semi-structured CTA interviews were conducted to elicit and capture declarative and procedural knowledge in the form of action and decision steps. The data were then coded, analyzed, and aggregated into a gold standard protocol (GSP) that was verified by a fourth expert. This study also sought to identify and quantify the number and percentage of knowledge and skill omissions when expert teachers recall how they teach expository writing. The results confirmed previous studies suggesting that experts may omit up to 70% of critical information, and that 3-4 experts are optimal for reversing the 70% omission effect of knowledge automaticity. Lastly, this study, along with a concurrent study (Jury, 2015), compared the efficiency of two CTA methods. Both studies operationalized efficiency as which knowledge elicitation method captures more action and decision steps from the experts for less cost and time. The comparison yielded rich data; however, there was no definitive answer as to which method is more efficient. The expert knowledge and skills captured by CTA methods may be used to train pre-service and in-service teachers in performing the complex task of instructing eleventh-grade students to write expository essays.