AN EXAMINATION OF THE IMPACT OF PROFESSIONAL DEVELOPMENT ON
TEACHERS’ KNOWLEDGE AND SKILLS IN THE USE OF TEXT COMPLEXITY
by
Donald David Moore
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
December 2020
Copyright 2020 Donald David Moore
Dedication
I dedicate my dissertation work to my merciful Lord and Savior, and to my wife, Phoebe
Moore. Thank you for your unwavering support during my dissertation research and writing. My
success was possible because of your support, encouragement, and understanding. I love you!
Acknowledgements
It is with great pleasure that I acknowledge my deepest thanks and gratitude to Dr. Sandra
Kaplan, dissertation committee chair. Thank you for your words of encouragement, direction,
and scholarship. Your continuous support was instrumental in my success. Thank you for your
dedication to bringing out the best in me throughout the writing of my dissertation.
I want to express my deepest thanks and sincere appreciation to Dr. Raymond Gallagher,
a dissertation committee member. Thank you for your guidance, insight, and encouragement
throughout this process. Your contributions to my dissertation work were of tremendous value
and crucial to completing my doctoral studies and my Ed.D.
I want to express my sincere gratitude and appreciation to Dr. Lizbeth Johnson, a
dissertation committee member. Thank you for the many hours you invested in reading my
dissertation. Your encouraging, creative, and comprehensive advice never wavered. I
appreciate you more than you know.
Finally, I would like to thank my wife and family for their support and understanding
while I worked on my dissertation. Thank you to my Mom and Dad, Dr. Donald and Dana
Moore, for sacrificing so that I could receive a quality early education that served as a solid
foundation for lifelong scholarship.
Table of Contents
Dedication ....................................................................................................................................... ii
Acknowledgements ........................................................................................................................ iii
List of Tables ................................................................................................................................. ix
List of Figures ................................................................................................................................ xi
List of Abbreviations .................................................................................................................... xii
Abstract ........................................................................................................................................ xiii
Chapter One: Overview of the Study ...............................................................................................1
Statement of the Problem ...........................................................................................................1
Background of the Problem .......................................................................................................6
What Is Text Complexity? .................................................................................................. 9
Purpose of the Study ................................................................................................................14
Research Questions ..................................................................................................................14
Conceptual Framework ............................................................................................................15
Michael Fullan’s Coherence Framework .......................................................................... 15
Guskey and Sparks’s Professional Development Model .................................................. 17
Statement of Methodology .......................................................................................................18
Significance of the Problem .....................................................................................................18
Assumptions .............................................................................................................................19
Limitations ...............................................................................................................................19
Delimitations ............................................................................................................................20
Definition of Terms..................................................................................................................20
Organization of the Study ........................................................................................................22
Historical, Political, and Social Frameworks of Literacy ........................................................25
Basal Readers: A Phonics Approach to Reading .............................................................. 28
Whole Language: A Philosophical Approach to Reading ................................................ 30
Balanced Reading: A Balance of Phonics and Whole Language ..................................... 30
Leveled Reading Versus Scaffolded Reading................................................................... 31
Anchor Reading Standard 10 and College Readiness....................................................... 32
English Language Arts (ELA): District Assessment Data .......................................................33
Summary of District Assessment Data ....................................................................................36
Conceptual Framework ............................................................................................................37
Framework for Professional Development ....................................................................... 37
Establishing Clarity of Deep Learning Goals ................................................................... 38
Content Characteristics of Professional Development ..................................................... 39
Building Precision in Pedagogies ..................................................................................... 42
Process Variables in Professional Development............................................................... 44
Shifting Practices through Capacity Building................................................................... 47
Context Characteristics in Professional Development ...................................................... 47
Targeted Professional Development in Text Complexity ................................................. 51
Chapter Three: Methodology .........................................................................................................54
Restatement of the Problem .....................................................................................................55
Purpose of the Study Restated .................................................................................................57
Research Questions ..................................................................................................................57
Research Methodology ............................................................................................................57
Sample: Participation Selection ...............................................................................................59
Site Selection ...........................................................................................................................60
Data Collection and Instrumentation .......................................................................................61
Pre-Survey and Post-Survey .............................................................................. 63
Validity and Reliability ............................................................................................................64
Professional Development .......................................................................................................65
Triangulation ............................................................................................................................65
Data Analysis ...........................................................................................................................66
Ethical Considerations .............................................................................................................66
Limitations ...............................................................................................................................66
Chapter Summary ....................................................................................................................67
Chapter Four: Results and Findings ...............................................................................................68
Statement of the Problem .........................................................................................................68
Research Questions ..................................................................................................................69
Participants ...............................................................................................................................70
Factor Analysis Results..................................................................................................... 71
Communalities and Number of Components Extracted ................................................... 74
T-Test ....................................................................................................................... 77
Findings....................................................................................................................................79
Results: Research Question One ....................................................................................... 79
Results: Research Question Two ...................................................................................... 80
Results: Research Question Three .................................................................................... 82
Results: Research Question Four ...................................................................................... 84
Professional Development Discussion Notes ................................................................... 88
Summary of Findings ...............................................................................................................89
Chapter Five: Discussion ...............................................................................................................91
Statement of the Problem .........................................................................................................91
Summary of Study ...................................................................................................................92
Text Complexity: Skill Level............................................................................................ 93
Text Complexity: Preparedness ........................................................................................ 94
Participants ...............................................................................................................................95
Instrument ................................................................................................................................95
Data Analysis ...........................................................................................................................95
Research Questions ..................................................................................................................96
Discussion of Findings .............................................................................................................96
Research Questions One and Two: Pre- and Post-Treatment ........................................... 96
Research Questions Three and Four: Pre- and Post-Treatment ...................................... 100
Implications............................................................................................................................102
Theoretical Implications ................................................................................................. 103
Practical Implications...................................................................................................... 104
Future Implications and Recommendations.................................................................... 105
Limitations .............................................................................................................................107
Conclusion .............................................................................................................................108
References ....................................................................................................................................110
Appendix B: Pre- and Post-Surveys ............................................................................................143
Appendix C: Professional Development Treatment: Facilitator’s Slides and Notes ...................149
Appendix D: Letter of Invitation to Participate in Study .............................................................171
Appendix E: Flyer for PD Treatment...........................................................................................173
Appendix F: Consent Form ..........................................................................................................174
Appendix G: USC IRB Approval of Study..................................................................................178
List of Tables
Table 1: Text Complexity Framework 10
Table 2: Original Scale: Increased Lexile Ranges for CCR Reading Anchor Standard 10 12
Table 3: Change Theory: Phases of the Change Process 15
Table 4: RUSD Student CAASPP Data by Subgroup from 2015-2019: English: Standard Met or
Exceeded 34
Table 5: RUSD Student CAASPP Data by Subgroup from 2015-2019: English: Nearly Met or
Not Met 35
Table 6: Teacher Agency 50
Table 7: Participants’ Teaching Level and Years of Experience in Education 60
Table 8: Participant Demographics 71
Table 9: Total Variance Explained by Each Component 72
Table 10: Component Loadings > 0.32 73
Table 11: Communalities of the Principal Components Analysis 75
Table 12: Text Complexity Preparedness Scale: Component Loadings > 0.32 76
Table 13: Professional Development and Other Scale: Component Loadings >0.32 76
Table 14: Additional Items: Component Loadings > 0.32 77
Table 15: Cronbach’s Alpha and Internal Consistency 77
Table 16: Descriptive Statistics of Pre- and Post-Treatment: Text Complexity Preparedness 78
Table 17: Descriptive Statistics of Pre-Treatment: Self-perceived Knowledge of Text Complexity
Measures 79
Table 18: Descriptive Statistics of Pre-Treatment: Self-perceived Knowledge of Text Complexity
Measures 79
Table 19: Descriptive Statistics of Pre- and Post-Treatment: Knowledge of Text Complexity
Measures 81
Table 20: Descriptive Statistics Pre- and Post-Treatment: Perceived Knowledge of Text
Complexity Measures 81
Table 21: Descriptive Statistics of Pre-Treatment: Preparedness in Reading Comprehension
Strategies 83
Table 22: Descriptive Statistics of Pre-Treatment: Preparedness in Text Complexity Measures 83
Table 23: Descriptive Statistics of Pre-Treatment: Preparedness in Reading Strategies 85
Table 24: Descriptive Statistics of Pre- and Post-Treatment: Preparedness in Text Complexity
Measures 86
Table 25: Participants’ Responses to Question: How Would You Define Text Complexity, and to
What Extent Is Text Complexity Considered When You Evaluate the Texts Used in Your
Classroom Instruction? 88
List of Figures
Figure 1: Change theory: Phases of the change process 15
Figure 2: Coherence framework grounded in change theory 17
Figure 3: RUSD student CAASPP data trends by subgroup from 2015-2019 35
Figure 4: RUSD student CAASPP data trends by subgroup from 2015-2019 36
Figure 5: Guskey and Sparks model of relationships between PD and improvements in student
learning 38
Figure 6: Operation theory: Professional development 42
Figure 7: Cycle of systems thinking 49
Figure 8: Pre-treatment mean of perceived knowledge and skills in text complexity 80
Figure 9: Comparison of pre- and post-treatment means 82
Figure 10: Pre-treatment mean of perceived level of preparedness in text complexity 85
Figure 11: Comparison of pre- and post-treatment means 87
Figure 12: Pre and post comparison 97
Figure 13: Comparison of pre- and post-treatment means 101
List of Abbreviations
CAASPP California Assessment of Student Performance and Progress
CCR College and Career Readiness
CCSS Common Core State Standards
CCSSO Council of Chief State School Officers
CSRT Content Support Resource Teacher
ELA English Language Arts
ELDRT English Language Development Resource Teacher
NGA National Governors Association
PD Professional Development
SBE Standards-Based Education
Abstract
An Examination of the Impact of Professional Development on Teachers’
Knowledge and Skills in the Use of Text Complexity
Donald David Moore
Preparing students for College and Career Readiness (CCR) requires instruction that
prepares students to interact with complex, college-level text. The challenge is that teachers are
not prepared to implement reading instruction that adequately prepares students to read complex
texts, and students are entering college unprepared to read at the college level. Students entering
college in need of remediation in reading brought about a renewed emphasis on text complexity.
The renewed emphasis is highlighted in the Common Core State Standards (CCSS) and CCR
Reading Anchor Standard 10, which require capable and effective teachers to teach reading using
the knowledge and skills of text complexity.
This dissertation examines the impact of targeted professional development on teachers’
perceived pre- and post-professional development skill levels in the qualitative, quantitative, and
reader and task measures of text complexity. Utilizing a quasi-experimental one-group pretest-
posttest design and a professional development session in text complexity, the researcher
solicited the responses of 11 teacher participants. Based on the results of the principal component
analysis and the T-Test, the professional development session increased teachers’ knowledge
and skills in text complexity.
Chapter One: Overview of the Study
College readiness is at the forefront of much of the current educational discourse, as K-12
institutions and higher education grapple with the fact that students are not prepared for reading
and writing at the college level (Jackson & Kurlaender, 2013). As a result, the Common Core
State Standards (CCSS) were created to ensure that all students acquire the skills and knowledge
in their kindergarten through 12th grade education to be college and career ready (CCR).
CCSS K-12 standards correspond to the CCR Anchor Standards, providing additional specificity
to the CCSS. More specifically, within the Common Core State Standards English Language
Arts-Literacy (CCSS ELA-Literacy), College and Career Anchor Reading Standard 10
(CCRA.R.10), the anchor standard for reading, addresses the reading skills a high school student
should have to enter postsecondary education (National Governors Association [NGA] Center
for Best Practices & Council of Chief State School Officers [CCSSO], 2010a).
Although the CCSS are clear, the K-12 system has done little to develop
teachers’ instructional practices in the CCSS to prepare students for the demands of post-
secondary study (Cruce, Lowe, & Mattern, 2018; National Center for Educational Statistics
[NCES], 2020). Additionally, K-12 teachers have not experienced consistent and targeted
professional development (PD) in the CCSS. Adding to this problem, there is a continued debate
among states and school districts regarding how PD should address high-quality teaching.
Statement of the Problem
There are two persistent problems plaguing the educational system in the United States.
First, student achievement in reading remains low across all demographics; however, the largest
deficit is in demographic subgroups, as noted in the National Assessment of Educational
Progress (NAEP, 2019) fourth-grade reading assessment. The 2019 scores fall within one of
three NAEP achievement levels: Basic (208), Proficient (238), or Advanced (268), on a scale of
0-500 (NAEP, 2019). The NAEP data reported that African American students scored 204,
Hispanic students scored 209, and students who are National School Lunch Program (NSLP)
eligible, a key indicator of poverty, scored 207 on the assessment. The second problem plaguing
the educational system is a lack of teacher PD to adequately train teachers to implement the CCSS
CCR Anchor Reading Standard 10 with fidelity (Harrington, 2017).
Data on student literacy achievement in the United States over the last decade have
catalyzed a renewed emphasis on text complexity in English Language Arts (ELA) curriculum as
well as research-based instructional practices in reading (NAEP, 2019). Furthermore, the CCSS-
ELA has created heightened expectations for student achievement in reading, placing valid
demands on teachers to engage students in accessing complex text. Unfortunately, teachers may
not have been exposed to the PD that is needed to understand the dimensions of text complexity
and the instructional considerations needed for implementation.
The need to prepare students who are CCR has been at the forefront of standards-based
education (SBE) reform for many years; however, the CCSS are different in how students are
required to interact with complex texts, which requires reading instruction that is different from
the way in which teachers have been trained to teach reading (Hiebert, 2013; Hiebert & Grisham,
2012). To be college ready, according to D. Conley and McGaughy (2012), a student should be
able to:
read with understanding a range of nonfiction publications and materials, using
appropriate decoding and comprehension strategies to identify key points, note areas of
questions or confusion, remember key terminology, and understand the basic conclusions
reached and points of view expressed. (p. 51)
To have the skills to interact with complex text at this level is a predictor of CCR (Zygouris-Coe,
2014). Evidence regarding the predictor of CCR is based on an ACT study noting that the
primary difference between students who succeed or struggle in introductory college courses is
based on the complexity of what students can read (Barton, 2006). Consequently, “what students
could read, in terms of its complexity, was at least as important as what they could do with what
they read” (CCRA.R.10, p. 2).
The CCR Anchor Reading Standard 10, Range of Reading and Level of Text Complexity,
requires teachers to approach reading instruction through a framework that focuses on students’
comprehension of text at a deeper level than required under the previous standards. In order for
students to comprehend text, reading instruction needs to shift from a skills approach to a
pedagogical approach that combines the analysis of text with content outcomes (Wixson &
Valencia, 2014). The skills approach in reading instruction remains prevalent among teachers, as
highlighted in a teacher survey completed by T. Shanahan and Duffett (2013). Their findings
indicate that teachers’ lesson design and pedagogy are dominated by skills; they are more likely
to fit text to skills than to ground their skills instruction in what is appropriate to the content of
the texts they are teaching. Since the implementation of CCSS-ELA, teachers have been
grappling with a lack of direction on how to implement instructional strategies in reading that are
aligned with the CCSS-ELA Standards without the needed PD to do so (Hiebert & Morris,
2012). Burkins and Yaris (2012a) considered how teachers are “integrating quantitative and
individual reader/text variables when decisions are made about grade-level complex text” (p. 1).
This question is indicative of what teachers face with the changes in reading instruction as a
result of the shifts in the learning accountability and assessment outcomes in the CCSS-ELA.
The CCSS-ELA asserts a school-wide shared responsibility in the instruction of reading,
writing, speaking, listening, and language. This integrated model of literacy development and
proficiency is intended to “define general, cross-disciplinary literacy expectations that are
identified as critical to meet for students to be prepared to enter college and workforce training
programs ready to succeed” (NGA & CCSSO, 2010a, p. 2). The problem with insisting that
reading instruction be a shared responsibility among all teachers in all disciplines is that not all
teachers have the skills to teach reading or even view themselves as teachers of reading
(Achieve, College Summit, National Association of Secondary School Principals [NASSP], &
National Association of Elementary School Principals [NAESP], 2013). The CCSS-ELA outlines
the increasing rigor in student comprehension of text, which insists that all educators
strategically build students’ skills and stamina with increasingly complex text (Fisher, Frey, &
Lapp, 2012; Gewertz, 2013; Liben, 2010; T. Shanahan, 2013). Another feature of the CCSS-
ELA is its articulation of how CCR standards clearly define what students should know and
be able to do by the end of each grade level. This feature of CCSS-ELA is exemplified in CCR
Anchor Reading Standard 10 in which students are expected to “read and comprehend complex
literary and informational texts independently and proficiently” (NGA & CCSSO, 2010a, p. 10).
Moving students to the point of interacting with texts independently and proficiently requires
instructional shifts around text complexity. These shifts include: (a) building students’
knowledge through content-rich nonfiction informational texts; (b) grounding reading and
writing in evidence from texts; (c) offering consistent instruction that provides opportunities for
students to interact with complex texts and academic vocabulary; and (d) exposing all students to
grade-level text complexity, regardless of their reading ability (see Appendix A).
The literacy demands of CCR Anchor Reading Standard 10 require capable and effective
teachers with the knowledge and skills to teach reading. William and Baumann (2008) defined
an effective reading teacher as one who exhibits the ability to improve students’ literacy
achievement. However, students continue to struggle in their reading skills, as highlighted in
research showing that between 40-60% of first-year college students require remediation in
English courses, i.e., “classes that cover content [students] should have learned in high school”
(Polumbo, 2018, para. 2).
To add to the problem of student achievement data, teachers have not been adequately
prepared to understand or implement text complexity as it was intended. A report by the Thomas
B. Fordham Institute raised several red flags about the implementation of text complexity,
including: (a) a centrality of text in the ELA curriculum, (b) lessons continuing to be dominated
by skills, (c) 73% of elementary and 56% of middle school teachers placing greater emphasis on
reading skills than texts, and (d) 64% of elementary teachers continuing to match students with
books that align with their instructional reading levels (T. Shanahan & Duffett, 2013).
Implementing text complexity incorrectly is a dilemma that can be resolved if teachers
have access to focused training on text complexity implementation. The research is clear
regarding the connection between student achievement in reading and teachers’ preparedness in
their instructional practices in reading (Perkins & Cooter, 2013). Despite the dilemma teachers
face in preparing students to meet accelerated literacy expectations, researchers believe it
can be countered with an extensive shift in reading instruction that is supported by targeted PD.
The predominant approach to improving reading instruction is through targeted PD: a process
that increases inservice teacher quality and fosters educational change (Darling-Hammond, 2000;
Fullan, 1990, 2001a; Joyce & Showers, 2002). In their coherence framework, Fullan and Quinn
(2016) emphasize what they call building precision in pedagogy. According to Fullan and Quinn,
teachers will change when they are provided with “consistent and sustained capacity-building
based on research-proven practices to build precision in pedagogy” (p. 89). Targeted PD is
essential for educational improvement. When teachers have ongoing access to effective PD in
research-based instructional reading practices and are empowered in their practices, they will
meet the goals of CCR Anchor Reading Standard 10, as well as adjust to shifts in text
expectations (Darling-Hammond, 2000; Fullan, 2001b; Sparks, 2002). Significant improvement
in teachers’ instructional capacities in the area of reading requires dynamic intervention within
school districts. The fundamental improvement needed in instructional practices across all
content areas highlights the importance of mobilizing the abilities of teachers across levels and
disciplines to “invent, adapt, and implement reliable ways to improve instruction,” because “the
success of this enterprise [CCSS]—including but not limited to literacy instruction—will depend
on it” (Cohen & Bhatt, 2012, pp. 133-134).
Background of the Problem
Released in 2010, the CCSS were developed through the Common Core State Standards
Initiative (CCSSI), led by the National Governors Association Center for Best Practices (NGA
Center) and the Council of Chief State School Officers (CCSSO). The development of the CCSS was
a collaborative partnership among leading education organizations, national organizations,
community groups, researchers, educators, and content experts (Common Core State Standards
Initiative [CCSSI], 2010b).
The CCSS are the culmination of over 25 years of effort to establish consensus in
defining the expected knowledge and skills of all K-12 students in mathematics, ELA, and
literacy in history/social science, science, and technical subjects. The
central purpose for designing and implementing the CCSS was to ensure consistency across
states and provide clear expectations of the knowledge and skills needed for all students to
succeed in college, career, and life following high school.
As highlighted in the CCSS (2010) mission statement, their purpose is to:
Provide a consistent, clear understanding of what students are expected to learn, so
teachers and parents know what they need to do to help them. The standards are designed
to be robust and relevant to the real world, reflecting the knowledge and skills that our
young people need for success in college and career. (p. 2)
The standards promote equity by preparing all students in an educational system that is based on
a common set of standards (NGA & CCSSO, 2010a). The college and career readiness (CCR)
focus within the CCSS places greater emphasis on what students need to know and their ability
to demonstrate the application of their knowledge, and less emphasis on how to teach the
standards. Schwartz (2014)
stated, “Standards are the learning goal, the what of education” (paras. 4-5), and the CCSS
“standards are what curriculum, assessments and professional development are designed to
support and achieve” (Apex Learning, 2017, section 2). The CCSS “inform teachers
what the learning outcome should be” and “form a common set of goals that can be measured
within a state or across the country to determine student success” (Apex Learning, 2017, section
2).
With a decreased emphasis on how to teach specific knowledge and skills in the CCSS,
there is concern among educators regarding instructional delivery, especially in the area of
effective literacy instruction that is needed to build students’ comprehension, writing skills, and
overall communication skills (Porter, McMaken, Hwang, & Yang, 2013). This concern is
twofold. First, student literacy data indicate that students are underperforming across the K-12
continuum. Second, there are indications that teachers’ skills do not reflect adequate
preparation to implement effective reading instruction (Lambert, 2020). For example, teacher
preparation in reading instruction continues to be an issue, as noted by the National Council on
Teacher Quality (NCTQ, 2016). According to the NCTQ report, of the 820 elementary pre-service
teacher programs evaluated, only 39% provided instruction in all five essential
components of early reading instruction. The NCTQ study confirms what researchers have
reported about teacher pre-service programs: that many of the programs “fall short in preparing
teachers adequately for the classroom…leaving schools with an urgent need that can only be
addressed via professional development programs” (section 2). According to these findings,
teachers completing pre-service programs are not fully prepared to implement text complexity
strategies effectively (Bayar, 2014; Hiebert & Grisham, 2012; Palardy & Rumberger, 2008). To
fill the knowledge and skills gap that exists in reading instruction, school districts need to
leverage targeted PD focused on the strategies needed to teach the CCR Anchor Reading
Standard 10 effectively (Draper, Broomhead, Jensen, & Nokes, 2012; Fisher, Frey, & Nelson,
2012).
Within the CCSS staircase of text complexity, the bar for Lexile levels has been raised
for students in second through 12th grades. An example of this acceleration is
confirmed in the high end of Lexile expectations for grades two and three, topping out at 790
Lexile (L), which is approximately one grade level higher than previous recommendations made
under No Child Left Behind (NCLB; Hiebert, 2013). This accelerated expectation not only
challenges teachers to instruct and engage students in more complex texts, but also places more
demanding reading expectations on students (Hiebert, 2013). Hiebert (2012) challenged the
acceleration, stating, “Two-thirds of the American fourth-grade cohort is failing to reach the
current proficient reading standard on the [National Assessment of Educational Performance]
2009 NAEP,” noting that “At present, two-thirds of a third-grade cohort fails to attain the
proficient standard with current levels of text complexity” (pp. 26-27). With so many students
not meeting expectations in reading levels under NCLB, Hiebert contended, “Before we increase
the levels of text complexity in primary-level reading programs, we need to examine why it is
that so many exiting third graders are not reading proficiently at current complexity level”
(p. 27).
The renewed attention to text complexity in the CCSS arose from findings that
“nearly half of the students who graduate high school need some kind of remediation to cope
with the reading required in college and during their careers” (Margerison, 2017), a statement
supported by the ACT (2013) scores indicating that only 44% of the students who took the ACT
in 2013 possessed the reading skills necessary for success in college (Graduate NYC, 2016). The
ACT data confirm a dilemma educational practitioners, publishers, and researchers face given
the increased attention placed on text complexity and comprehension of text. The initial
challenge is in establishing criteria for how students interact with text. Additional challenges
include preparing students to read the range of complex texts independently, as well as preparing
students for the rigorous comprehension required in college and careers (Mesmer, Cunningham,
& Hiebert, 2012).
What Is Text Complexity?
Text complexity is exemplified in the CCSS-ELA and CCR Anchor Reading Standard
10. Text complexity refers to the text-derived difficulty of a given passage and focuses on those
features central to the text itself that make the passage more or less challenging to decode and
understand (Munir-Mchill, 2013). The complexity of a text, or the degree of challenge it
presents, is the result of specific combinations and interactions of these factors (Hess &
Biggam, 2004). CCR Anchor Reading Standard 10 requires students to be able to “read and
comprehend complex literary and informational texts independently and proficiently” (NGA &
CCSSO, 2010a, p. 10). The Text Complexity Framework is based on three
dimensions: (a) qualitative, (b) quantitative, and (c) reader and task. Each of the dimensions has
various factors that influence text complexity. Table 1 outlines the Text Complexity Framework,
which includes the measures, factors, or variables that influence text complexity.
Table 1
Text Complexity Framework
Qualitative Measures

Levels of Meaning or Purpose
• Single level of meaning vs. multiple levels of meaning
• Explicitly stated purpose vs. implicit purpose, which may be hidden or obscure

Structure
• Simple vs. complex
• Explicit vs. implicit
• Conventional vs. unconventional (chiefly literary texts)
• Events related in chronological order vs. events related out of chronological order (chiefly
literary texts)
• Traits of a common genre or subgenre vs. traits specific to a particular discipline (chiefly
informational texts)
• Simple graphics vs. sophisticated graphics
• Graphics unnecessary or merely supplementary to understanding the text vs. graphics
essential to understanding the text, possibly providing information not otherwise conveyed
in the text

Language Conventionality and Clarity
• Literal vs. figurative
• Clear vs. ambiguous or purposefully misleading
• Contemporary, familiar vs. archaic or otherwise unfamiliar
• Conversational vs. general academic and domain specific

Knowledge Demands
Literary texts:
• Simple theme vs. complex or sophisticated themes
• Single theme vs. multiple themes
• Common, everyday experiences or clearly fantastical situations vs. experiences distinctly
different from one’s own
• Single perspective vs. multiple perspectives
• Perspective(s) like one’s own vs. perspective(s) unlike or in opposition to one’s own
• Everyday knowledge and familiarity with genre conventions required vs. cultural and
literary knowledge useful
• Low intertextuality (few if any references/allusions to other texts) vs. high intertextuality
(many references/allusions to other texts)
Informational texts:
• Everyday knowledge and familiarity with genre conventions required vs. extensive,
perhaps specialized, discipline-specific content knowledge required
• Low intertextuality (few if any references to/citations of other texts) vs. high
intertextuality (many references to/citations of other texts)

Quantitative Measures

Word Length or Frequency
• Word frequency measures the use of rare words. A high score indicates that fewer rare
words are used in the text; a low score indicates that more rare words are used.

Sentence Length
• Sentence length is a proxy for the syntactic and semantic demands on the reader, such as
prepositional phrases, dependent clauses, and adverbs.

Text Cohesion
• Text coherence refers to the logical and plausible connections of ideas within and across
paragraphs.
• Coherence can be enhanced with explicitly stated connections, repeated concepts and key
words, and clear pronoun references.
• Coherent texts link ideas clearly for readers.
• Coherent texts are considerate of the audience: they use familiar words and concepts, build
on background knowledge, introduce new ideas and vocabulary at an appropriate pace, and
provide information in interesting ways.
• Coherent texts may use a variety of literary devices, such as flashback, imagery, metaphor,
and humor, to make texts interesting and unusual. These features are difficult to quantify
because their complexity depends on the knowledge and experiences of the reader, but they
are important because they engage readers in the meanings of text.

Reader and Task Considerations

Motivation
• Will readers at this grade level understand the purpose (which might shift over the course
of the reading experience) for reading the text (i.e., skimming, studying to retain content,
close reading, etc.)?
• Will readers at this grade level be interested in the content of the text?
• Might readers at this grade level develop an interest in this content because of this text?
• Do readers at this grade level believe that they will be able to read and understand the text?
• Will readers at this grade level be interested in and engaged with the style of writing and
the presentation of ideas within the text?
• Will the text maintain the reader’s motivation and engagement throughout the reading
experience?

Knowledge and Experience
• Do readers at this grade level possess adequate prior knowledge and/or experience
regarding the topic of the text to manage the material that is presented?
• Are there any explicit connections that can be made between the content readers at this
grade level will encounter in the text and other learning that may occur in this or another
class?
• Do readers at this grade level possess adequate prior knowledge and/or experience
regarding the vocabulary used within the text to manage the material that is presented?
• Do readers at this grade level possess adequate knowledge of and/or experience with the
genre of the text to manage the material that is presented?
• Do readers at this grade level possess adequate knowledge of and/or experience with the
language (i.e., syntax, diction, rhetoric) of the text to manage the material that is presented?

Purpose and Complexity of Task
• Will the complexity of any tasks associated with the text interfere with the reading
experience?
• Will the complexity of any questions asked or discussed concerning specific texts interfere
with the reading experience?

Note. Adapted from “Literacy Achievement Through Sustained Professional Development,” by D. Fisher, N. Frey, & J. Nelson,
2012, The Reading Teacher, 65(8), 551-563. Copyright 2012 by the authors. Adapted from “Teaching With Challenging Texts in
the Disciplines,” by Z. Fang & B. G. Pace, 2013, Journal of Adolescent & Adult Literacy, 57(2), 104-108. Copyright 2013 by the
authors. Adapted from “Upping the Ante of Text Complexity in the Common Core State Standards,” by E. H. Hiebert & H. A. E.
Mesmer, 2013, Educational Researcher, 42(1), 44-51. Copyright 2013 by the authors.
The CCSS identify variables through the Smarter Balanced Assessment, administered in
California as the California Assessment of Student Performance and Progress (CAASPP). The variables
indicate that students in kindergarten through 12th grade require a significant shift in instruction
that changes the way students interact with text. However, the level at which students interact
with text is driven first by their ability to read the text. Williamson, Fitzgerald, and Stenner
(2013) posited that the idea espoused in the standards regarding the skills of graduating students
and their ability to read, analyze, synthesize, and critique text routinely encountered in college
and the workplace contradicts the actual reading levels of current K-12 students. Research
indicates students have historically achieved a reading level at or above 820L by the completion
of third grade; however, struggling readers have not fared as well, averaging 400L by the end
of third grade. Making up the roughly 400L difference from kindergarten through third grade
would require a Herculean effort on the part of teachers.
Table 2 exemplifies the increasing difficulty levels of text experienced by students in
second through 12th grades (NGA & CCSSO, 2010b).
Table 2
Original Scale: Increased Lexile Ranges for CCR Anchor Reading Standard 10
Grade      Previous Lexile Range      CCSS Expected Lexile Range (2010)
K-1 N/A N/A
2-3 450-725 450-790
4-5 645-845 770-980
6-8 860-1010 955-1155
9-10 960-1115 1080-1305
11-CCR 1070-1220 1215-1355
Note: Adapted from “Key Considerations in Implementing Text Complexity,” by the National Governors
Association Center for Best Practices & Council of Chief State School Officers, 2010, Common Core State
Standards for English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects, p. 8.
Copyright 2010 by the authors.
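The overlap among the expected bands in Table 2 can be made concrete with a small illustrative sketch (the function name and data structure below are my own; the band boundaries are copied directly from the CCSS expected ranges in Table 2):

```python
# Illustrative sketch: map a Lexile measure to the CCSS expected
# grade bands from Table 2 (NGA & CCSSO, 2010b). Band boundaries
# come from the table; the function name is invented for this example.

CCSS_BANDS = [
    ("2-3", 450, 790),
    ("4-5", 770, 980),
    ("6-8", 955, 1155),
    ("9-10", 1080, 1305),
    ("11-CCR", 1215, 1355),
]

def ccss_bands_for(lexile):
    """Return the grade bands whose CCSS expected range contains `lexile`.

    Bands overlap by design, so a single measure may fall in more
    than one band (e.g., 790L sits in both the 2-3 and 4-5 bands).
    """
    return [grade for grade, low, high in CCSS_BANDS if low <= lexile <= high]

print(ccss_bands_for(790))   # a text at 790L spans the 2-3 and 4-5 bands
print(ccss_bands_for(1200))  # 1200L falls in the 9-10 band only
```

The overlapping ranges reflect the staircase design of CCR Anchor Reading Standard 10: the top of each band intentionally reaches into the next so that students encounter progressively more demanding text.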
The CCSS-ELA outline the increasing rigor in student comprehension of text, which
demands that all educators strategically build students’ skills and stamina with increasingly
complex text (Fisher, Frey, & Lapp, 2012; Gewertz, 2013; Liben, 2010; T. Shanahan & Duffett,
2013). Moving students to the point of interacting with texts independently and proficiently will
require significant instructional shifts around text complexity. These shifts include: (a) building
students’ knowledge through content-rich nonfiction informational texts, (b) ensuring that
reading and writing are grounded in evidence from texts, (c) offering consistent instruction that
provides opportunities for students to interact with complex texts and academic vocabulary, and
(d) exposing all students to grade-level text complexity, regardless of reading ability (see
Appendix A). Before these shifts around text complexity can occur, teachers require further
training in reading instruction and ongoing support. For decades, reading instruction to improve
student comprehension has been an issue for teachers. The RAND Reading Study Group
examined the improvement of reading comprehension (Snow, 2002). The study found
that several factors affecting students’ reading comprehension are at play: (a) high
school students must have the capacity to comprehend complex text because subject-matter
learning and discipline-specific content are essential to the curriculum, yet the comprehension
outcomes are not improving; (b) teachers lack explicit training in developing the necessary skills
to support reading comprehension, ensure learning of content through reading, and appropriately
address the various levels of reading comprehension within a classroom; and (c) the programs
and policies often adopted to increase student achievement in reading comprehension are not
based on empirical data or evaluated adequately.
This chapter will describe the added dimension of text complexity in the CCSS-ELA.
Additionally, this chapter will address the importance of implementing text complexity in
reading instruction, as well as the challenges teachers face in the implementation of text
complexity. Next, it is proposed that targeted PD in text complexity may counter the entrenched
instructional practices in reading that contribute to the gap in students’ reading levels.
Purpose of the Study
The purpose of this mixed methods study was to determine the relationship between
targeted PD and K-8 teachers’ perceived skill in implementing the trifold measures
of text complexity. The significance of developing a greater understanding of text
complexity is that it may: (a) accelerate learning of strategies for the implementation of text
complexity, (b) develop precision in teachers’ instructional practices around text complexity, and
(c) help teachers identify processes that will shift instructional practices around literacy
instruction. As a result, teachers’ increased knowledge of text complexity will provide
instructional strategies to teach students how to interact with complex text.
Research Questions
To achieve the purpose of this study, the following research questions were posed:
1. What is the perceived knowledge and skill level of teachers in text complexity who
have not experienced prior professional development?
2. What is the perceived level of preparedness of teachers in text complexity who have
not experienced prior professional development?
3. To what degree did the teachers’ knowledge and skill levels in text complexity
change after engaging in professional development?
4. To what degree did the teachers’ preparedness in text complexity change after
engaging in professional development?
Conceptual Framework
Michael Fullan’s Coherence Framework
The changes in teacher knowledge and preparedness in the dimensions of text complexity
in this study were viewed through the lens of Michael Fullan’s coherence framework (Fullan &
Quinn, 2016), which is rooted in his previous work related to change theory (Fullan, 1993,
2001a, 2009). To understand the coherence framework, it is important to first understand the
three overlapping phases of the change theory process: initiation, implementation, and
institutionalization (Fullan, 2007; Miles & Ekholm, 1987). See Figure 1 and Table 3 for the
phases of the change process.
Figure 1. Change theory: Phases of the change process.
Table 3
Change Theory: Phases of the Change Process
Initiation Implementation Institutionalization
What needs to be done to
prepare for the initiative?
What support, learning
opportunities, and
professional development
designs can be employed to
meet the intended goal?
What structures can be put
into place to ensure that the
initiative becomes a part of
the culture and is maintained?
Note. Adapted from The New Meaning of Educational Change, by M. Fullan, 2007, Routledge.
Copyright 2007 by the author. Adapted from Lasting School Improvement: Exploring the
Process of Institutionalization, by M. B. Miles, M. Ekholm, R. Vandenberghe, 1987, Acco.
Copyright 1987 by the authors.
Making the connection between Fullan’s coherence framework and his work in change
theory is important because, in this study, change is imperative in two areas: (a) reading
instruction and (b) PD. Using change theory as the driver, the premise is that effective reading
instruction and PD will become institutionalized because they are implemented clearly and
consistently as a result of launching a reading initiative that recognizes the fact that “how well
something begins affects how it ends” (Fullan, 2007, p. 21).
With change theory as the driver, Fullan and Quinn’s (2016) coherence framework was
selected for this study due to the focus on deep learning, precision in pedagogy, and building
capacity: the three elements through which the changes in teachers’ reading instruction, as well
as PD, can be framed. Figure 2 shows how change theory supports the coherence framework.
The coherence framework identifies four components that are used to foster systematic change
within districts and schools: (a) cultivating collaborative cultures, (b) focusing direction,
(c) deepening learning, and (d) securing accountability (Fullan & Quinn, 2016). Although all
four of the components of the framework are critical to the change process, for the purpose of the
study, greater emphasis was placed on the deepening learning component of the framework,
which focuses on: (a) developing clarity of learning goals, (b) building precision in pedagogy,
and (c) shifting practices through capacity building.
This study sought to accomplish Fullan and Quinn’s (2016) deepening learning through
Guskey and Sparks’s (2002) model for PD. In this study, the initiation (change theory) was to
change teacher knowledge and ability to implement text complexity. The implementation
(change theory) occurred through PD (coherence framework; Fullan & Quinn, 2016) using a
specific PD model (Guskey & Sparks, 2002) to deepen the teachers’ learning regarding text
complexity.
Figure 2. Coherence framework grounded in change theory. Adapted from The New Meaning of
Educational Change, by M. Fullan, 2007, Routledge. Copyright 2007 by the author. Adapted
from Coherence: The Right Drivers in Action for Schools, Districts, and Systems, by M. Fullan
& J. Quinn, 2016, Corwin. Copyright 2016 by the authors. Adapted from Lasting School
Improvement: Exploring the Process of Institutionalization, by M. B. Miles, M. Ekholm, R.
Vandenberghe, 1987, Acco. Copyright 1987 by the authors.
Guskey and Sparks’s Professional Development Model
The Guskey and Sparks (2002) PD model was used to facilitate a change process in
teachers’ knowledge of the trifold dimensions of text complexity. The premise of the model is
based on the belief that quality PD is influenced by multiple factors. The Guskey and Sparks PD
model identifies three categories of direct influences on PD: content
characteristics, process variables, and context characteristics. The content characteristics refer to
the what of PD, which concerns the new knowledge, skills, and understandings of the PD effort.
The process variables refer to the how of PD, which concerns the way in which PD activities are
planned, organized, carried out, and followed up. Finally, the context characteristics refer to the
who, when, where, and why of PD, which concern the traits of the group of educators involved in
the PD, as well as the organization, system, or culture in which the PD takes place, and the
district- or school-level policies that may influence the implementation.
Statement of Methodology
To understand how PD in text complexity influences instructional practices in literacy,
one must have the ability to study the context of text complexity, as well as the teachers involved
in the process of learning about text complexity. To fully understand the factors, the study cannot
be limited to surveys (Yin, 2009). Although numerical data may provide general understandings
and cultivate specific beliefs around best instructional practices in literacy, the numerical data
cannot portray the live interactions teachers experience within the process of learning best
practices through PD regarding the implementation of text complexity (Creswell & Plano Clark,
2018).
In this study, teachers were pre- and post-surveyed, as well as asked to participate in a PD
setting focused on the topic of CCR Anchor Reading Standard 10, Text Complexity. The data
collection from the pre- and post-surveys provided a rich and extensive analysis related to
teachers’ efficacy and ability to implement text complexity (Merriam, 1998; Yin, 2009). The data
from the survey and PD activity were analyzed using both qualitative and quantitative measures
through a comparative analysis of pre- and post-PD data.
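As an illustrative sketch only (the rating scale and all values below are invented for demonstration, not data from this study), the quantitative side of such a pre/post comparison might look like:

```python
# Illustrative sketch of a pre/post comparison of survey ratings,
# such as teachers' self-rated knowledge of text complexity on a
# 1-5 scale. The data below are invented, not from the study.
from statistics import mean

pre_ratings = [2, 3, 2, 1, 3, 2, 2, 3]   # hypothetical pre-PD self-ratings
post_ratings = [4, 4, 3, 3, 4, 3, 4, 4]  # hypothetical post-PD self-ratings

def mean_change(pre, post):
    """Average per-participant change from pre- to post-survey."""
    return mean(b - a for a, b in zip(pre, post))

print(f"pre mean:  {mean(pre_ratings):.2f}")
print(f"post mean: {mean(post_ratings):.2f}")
print(f"mean gain: {mean_change(pre_ratings, post_ratings):.2f}")
```

A real analysis would pair each computed gain with the qualitative responses from the same participant, which is the mixing of methods the study describes.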
Significance of the Problem
The impact of targeted PD focused on teachers’ understanding of text complexity and the
perceptions regarding their ability to implement text complexity in the classroom could yield
valuable information for the Education Services Divisions in PK-12 school districts. By
identifying whether teachers understand text complexity, school districts can tailor explicit and
focused PD that builds on identified knowledge of text complexity and establishes a
baseline of expectations for implementing text complexity in the classroom. The benefit of a
focused PD on text complexity could result in coherent instructional practices in the teaching of
reading in all content areas across all grade levels, a result that would serve as a foundation for the
effective implementation of CCR Anchor Reading Standard 10.
Assumptions
The principal focus of this dissertation study was the perceived knowledge and skills of
teachers, as well as their understanding of text complexity that is needed for the effective
implementation of the CCSS-ELA shifts in text expectations, in addition to the impact of
targeted PD on teachers’ knowledge and skills. Two assumptions were made in this study. First,
it was assumed that the participants would respond openly and honestly to the pre- and post-
surveys that assessed teacher understanding of text complexity and teacher efficacy in
implementing text complexity in the classroom. It was further assumed the teachers were able to
understand the survey questions based on their knowledge of reading instruction.
Limitations
The limitations in this study were characteristic of a mixed methods instrumental
study incorporating a single setting, a small number of participants, and purposeful sampling.
These characteristics may limit future researchers’ ability to generalize findings across a broader
spectrum of school districts. Creswell, Hanson, Plano Clark, and Morales (2007) contended that
a central phenomenon can be understood to a greater degree by researchers through purposeful
sampling of selected individuals who provide information-rich input. A second limitation was
related to the timeframe of the study. The data collection period spanned over a 2-week period
and consisted of a pre-survey, followed by a PD exercise, and concluded with a post-survey.
In order to optimize evidence of change in teacher knowledge and understanding of text
complexity, it may have been beneficial to add pre- and post-PD teacher observation to the
exercise. Although this study was preliminary, it may contribute to the limited extant literature
and provide a framework for expanded research in this area.
Delimitations
This study was limited to a sample population of elementary teachers and district content
support resource teachers (CSRTs) who were either kindergarten through eighth grade classroom
teachers or who had curriculum specialist responsibilities within a single California urban school
district consisting of 16 elementary schools. The teachers and CSRTs included in the study
participated in both the pre- and post-surveys as well as the 2.5-hour PD activity.
Definition of Terms
Literacy. Literacy is defined as “an individual’s ability to read, write, compute, and solve
problems at levels of proficiency necessary to function on the job, in the family of the individual,
and in society” (Workforce Investment Act, 1998, Section 203). Paulo Freire (1970) broadened
the understanding of literacy to an all-encompassing view that the words of language,
mathematical symbols, or even images can transform lives. Freire believed literacy “moves
beyond strict decoding and reproducing of language into issues of economics, health, and
sustainable development” (Zemliansky & Amant, 2008, p. 214).
Text complexity. Text complexity is “the inherent difficulty of reading and
comprehending a text combined with consideration of reader and task variables; in the Standards,
a three-part assessment of text difficulty that pairs qualitative and quantitative measures with
reader-task considerations” (CCRA.R.10, p. 43).
Qualitative dimensions. Qualitative dimensions and qualitative factors refer to those
aspects of text complexity best measured or only measurable by an attentive human reader, such
as levels of meaning or purpose, structure, language conventionality and clarity, and knowledge
demands (see Appendix A).
Quantitative dimensions. The terms quantitative dimensions and quantitative factors
refer to those aspects of text complexity—such as word length or frequency, sentence length, and
text cohesion—that are difficult if not impossible for a human reader to evaluate efficiently,
especially in long texts, and are thus today typically measured by computer software (see
Appendix A).
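As a rough illustration of the kind of computation such software performs (this is not the Lexile algorithm, which is proprietary and far more sophisticated; the word list, thresholds, and sample text below are invented):

```python
# Illustrative proxies for two quantitative dimensions named above:
# sentence length and word rarity. NOT the Lexile algorithm; the
# common-word list here is a tiny invented stand-in for a real
# word-frequency corpus.
import re

COMMON_WORDS = {
    "the", "a", "and", "to", "of", "in", "it", "was", "he", "she",
    "cat", "sat", "on", "mat", "dog", "ran",
}

def sentence_lengths(text):
    """Word counts per sentence, splitting on ., !, or ?."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return [len(s.split()) for s in sentences]

def rare_word_ratio(text):
    """Fraction of words not on the common-word list (a rarity proxy)."""
    words = re.findall(r"[a-z']+", text.lower())
    rare = [w for w in words if w not in COMMON_WORDS]
    return len(rare) / len(words) if words else 0.0

text = "The cat sat on the mat. The dog ran."
print(sentence_lengths(text))   # [6, 3]
print(rare_word_ratio(text))    # 0.0: every word is on the common list
```

Longer sentences and a higher rare-word ratio would both push such a quantitative estimate of complexity upward, which is why these dimensions are typically left to software rather than to a human reader.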
Reader and task. Whereas the prior two elements of the model focus on the inherent
complexity of text, variables specific to particular readers (such as motivation, knowledge, and
experiences) and particular tasks (such as purpose and the complexity of the task assigned and
the questions posed) must also be considered when determining whether a text is appropriate for
a given student; such assessments are best made by teachers employing their professional
judgment, experience, and knowledge of their students and the subject (see Appendix A).
Targeted professional development. Targeted PD is not a one-time workshop, but rather
of significant duration and ongoing, collaborative, intensive, and connected to practice (Darling-
Hammond & Richardson, 2009). PD is a comprehensive, sustained, and intensive approach to
improving teachers’ and principals’ effectiveness in raising student achievement. PD focuses on
improving instructional practice by providing teachers with new knowledge and techniques for
assessing learning with the ultimate goal of improving students’ learning (Wei, Andree,
Richardson, & Orphanos, 2009).
Reading comprehension. Reading comprehension is defined as the process of
simultaneously extracting and constructing meaning. Extracting is figuring out how print
represents words and then engaging in the translation of print to sound accurately and efficiently.
Constructing meaning is formulating a representation of the information being printed, which
inevitably requires building new meanings and integrating new and old information (Snow,
2010).
Teacher agency. “Teacher agency is the capacity of teachers to act purposefully and
constructively to direct their professional growth and contribute to the growth of their
colleagues” (Calvert, 2016, p. 4).
Balanced literacy. In this study, balanced literacy refers to an instructional approach that
is research-based, guided by assessed needs, integrated across the curriculum, and flexible so that
teachers can respond to the individual needs of each child in decoding, vocabulary, and
comprehension in a socio-cultural setting with the purpose of empowering children to develop
into lifelong readers. In the balanced literacy classroom, all the essentials noted in the
aforementioned definitions must be addressed, and the teacher must meet the demands of the
society represented in the classroom. The definition reflects balance between reading and
writing, between teacher directed and student-centered activities, and between skill-based and
meaning-based approaches to literacy instruction (Frey, Lee, Tollefson, Pass, & Massengill,
2005). A balanced literacy approach requires that teachers plan lessons based on systematic
study of all student needs, think about moment-by-moment decisions in the classroom, and
reflect on lessons in a productive and student-centered way (Stein & D’Amico, 2002).
Organization of the Study
This dissertation study is organized into five distinct chapters, beginning with the
introduction chapter highlighting the purpose of the study. Chapter One presented a statement of
the problem along with research questions that shaped and guided the critical analysis of the
study. The chapter continued with key assumptions, delimitations, and limitations, and then
concluded with the definitions of key terms that provide the reader with a clear understanding of
the work and structure of the research.
Chapter Two presents a theoretical framework expanding the tenets of the Fullan and
Quinn (2016) coherence framework using the Guskey and Sparks (2002) framework for PD, as
well as a review of the literature to show the correlation to change theory and its implications for
teacher PD and student achievement in literacy.
Chapter Three details the research methodology used in the study. The methodology
presents a research design that mixes qualitative and quantitative approaches in a single study.
The mixed methods approach combines the what and why in the process of analyzing and
explaining the data. The topics in Chapter Three include: (a) research questions and
methodology, (b) participant selection, (c) site selection, and (d) data collection and analysis.
Chapter Four contains the findings and disaggregates the results of the study to provide
explanations of the values derived from the data analysis. The findings from the study are examined
in light of the literature review for a triangulation of data that can improve the reliability and
validity of the findings. Chapter Five provides a brief discussion and summary of the study,
triangulates the findings to prior research and literature, and recommends a direction for future
studies in the area of effective literacy instruction, as well as the PD opportunities that support
instruction.
Chapter Two: Review of Literature
As long as educators are in any way expected to base their educational decisions on the
issues, debates, politics and polemics of the Great Debate, and as long as we limit our
horizons to approaches and philosophies that have been advocated by one faction or
another, there is no reason to believe that real progress in reading education will ever be
made. (Wren, 2020, Section 13)
The implementation of text complexity in the classroom poses two very complex
challenges for teachers: measuring the complexity of texts to make certain the texts assigned to
students are appropriately complex, and effectively preparing students to interact with
increasingly complex texts (Varlas, 2012). A third, and most telling, challenge for
implementation lies in teachers' ability to identify and assign complex texts and then to have
the skills to provide students with the strategies they need to interact with them.
The third challenge is the most critical because it is tied to the direct development of
teachers’ knowledge and skills and their ability to implement text complexity. It is the most
telling challenge because student achievement data showed that by 2017—7 years after most
states adopted the CCSS—there had been a modest decline in fourth-grade reading scores. The
greatest contributing factor to the student achievement data was the lack of teacher PD.
Harrington (2017) reported that teachers and administrators have not had access to high-quality,
sustained PD that would enable them to teach or implement more rigorous reading
instruction.
This chapter will present and critique the literature related to approaches in reading
instruction that have contributed to low student achievement in reading in the decades leading up
to the implementation of CCSS, beginning with a review of the historical, political, and social
frameworks of literacy and followed by a review of what is known about current reading
instruction and its impact on student achievement in literacy. The history of reading achievement
demonstrates how the students most affected by low reading achievement have been the same low
socio-economic group for hundreds of years. The difference now is that the education system is
perhaps not deliberately keeping students from learning how to read; rather, the current system
may not be providing the instruction needed for our students to become proficient
readers. Lastly, the review will take a broad look at the impact of PD on reading instruction and
its relation to text complexity implementation.
Historical, Political, and Social Frameworks of Literacy
Countless philosophers, classicists, historians, and researchers have agreed that the ability
to read and write (literacy), a multifaceted phenomenon, commenced with the introduction of the
Greek letters into inscription around 700 B.C., altering the culture of humanity and having many
implications on both individual and social levels. The invention of the Greek alphabet was
heralded not merely as the beginning of literacy, but also as the literate basis for modern thought
(Powell, 1989). John Eisenberg (1992) advanced a theory known as causal indeterminacy to
explain the ardent transformation of “human consciousness, perceptions, relationships, society,
even values” (p. 13) that followed the embracing of the Greek phonetic alphabet. People were
prone to logical and analytical thought, abstract uses of language, critical and rational thought,
political stability, wealth and productivity, and urbanization. Indeed, as a result of literacy, people
became innovative, achievement-oriented, productive, politically aware, more globally oriented,
and less likely to commit crimes (Gee, 1989; Gibson, 2005).
Throughout history, literacy has been rife with implications, often serving the political,
religious, and social frameworks of the time. The implications of literacy are most prominently
seen in the social, religious, or economic elites who often reasoned that the lower classes should
not be literate because they would then be unhappy with their lot in life, politically critical, and
begrudging of the menial jobs in society (Gee, 1989). The debate over literacy began centuries
ago with Plato’s belief that the written word would lead to the deterioration of human memory;
he considered only dialogic thought and speaking to be truly authentic discourse. Oral tradition,
according to his belief, was being challenged needlessly through poetry and Sophistic rhetoric.
Plato’s concerns about the written word were also rooted in his political opposition to the
traditional order of society. He believed people were “born for a particular place in a naturally
given hierarchy" (Gee, 1989, p. 199). This order meant that philosopher-kings ruled and all others
beneath them had no actual say in the government. However, the societal order of Greek culture
was interrupted at the time because people were enamored with the rhapsody in Homer’s epic
verse, causing Plato to lose much of his audience for his oral and written dialogues. As a result,
Plato attacked poetry and Sophistic rhetoric because he believed it to be a “rejection of the entire
oral tradition of Greek education” (Gibson, 2005, p. 4).
Plato's views on literacy, however, were seemingly conflicted. On the one hand, he
believed literacy was constraining and authoritarian. On the other hand, Plato contradicted his
authoritarian view of literacy by viewing literacy as a liberator—that is, perceiving the use of
literacy as “an emancipatory literacy for religious, political, and cultural resistance to
domination" (Gee, 1989, p. 206).
Similar to Plato's view is that of Paulo Freire, who is also known for an
emancipatory view of literacy; however, Freire believed people were empowered through
literacy only when they questioned their social reality:
Reading the world always precedes reading the word, and reading the word implies
continually reading the world… In a way, however, we can go further and say that
reading the word is not preceded merely by reading the world, but by a certain form of
writing it or rewriting it using conscious, practical work. For me, this dynamic movement
is central to the literacy process. (Freire & Macedo, 1987, p. 35)
According to Freire, literacy is more than strict decoding and reproducing of language; literacy
serves as a tool for people to gain the capacity to become critical readers, to understand, and to be
active in social, political, and economic change and progress (Timpson, 1988). Instead of
passively accepting the information being presented, Shor and Freire (1987) suggested that
critical readers should read text not only to understand the written word but also to understand
the text’s purpose in order to avoid being manipulated by it. Kaestle (1985) contrasted the
distinguishing effects of literacy on one’s life using Greer and Graff’s (1981) literacy myth, an
argument positioned against the historically perpetual idea that literacy has either a “liberating or
constraining effect on individuals’ lives” (p. 13). In a report to the Ford Foundation, Hunter and
Harman (1979) further contrasted the connection between literacy and individuals’ privilege or
plight through the assertion that poverty and the power structures of society are responsible for
low literacy, and illiteracy is only one factor among many forms of discrimination such as class,
race, and sex discrimination. Other factors include welfare, unemployment, poor housing, and “a
general sense of powerlessness" (p. 9).
The role of literacy in the historical, political, and social frameworks is widely debated,
and within this debate are countless large-scale literacy reforms that have been promoted in the
United States for many years (Strauss, 2014). At the core of large-scale literacy reforms is an
understanding and consensus that there is a “cultural price-tag to literacy” (Kaestle, 1985, p. 30).
Also, “Whether literacy is liberating or constraining…Literacy is discriminatory in two ways,
with regard to access and with regard to content” (Kaestle, 1985, p. 30).
Literacy has been at the forefront of history for centuries; it has been ardently politicized
in the most recent decades, with government intervention in the form of federal acts and the
standardization of curriculum. The following section outlines the brief history of the
philosophies and approaches to reading instruction in the decades leading up to CCSS. The
purpose of this discussion is to examine reading instruction focusing specifically on how reading
comprehension was addressed in various instructional approaches. The comprehension of
complex text is at the core of CCR Anchor Reading Standard 10: range, quality, and complexity.
The research supporting the need for Anchor Reading Standard 10 “showed that the clearest
differentiator in reading between students who are and are not college ready is the ability to
comprehend complex texts” (T. Shanahan & Duffett, 2013, p. 13). Decades of reading
instruction and practices have influenced and shaped the CCSS’s call for student achievement in
reading: a call for students to be strategically prepared to understand a
wide range of non-fiction and technical texts (M. Conley, 2008).
Basal Readers: A Phonics Approach to Reading
Reading pedagogy over the decades has encompassed a wide variety of conceptions and
beliefs that have shaped reading instruction in the classroom. Researchers have studied the
implications of various approaches to reading to determine if these approaches are preparing
students in the foundations of English, namely reading comprehension and literature, information
gathering, writing, editing, analysis, and making connections (M. Conley, 2008). Throughout the
1970s and 1980s, the emphasis in reading instruction at the primary grade level was primarily
technical, dividing reading instruction into sub-skills of reading, writing, speaking, and listening
(Au, Scheu, Kawakami, & Herman, 1990). The basal reading series, created by William
McGuffey (1866), was the dominant reading program throughout the United States beginning in
the 1950s and is still used in schools today (Hoffman, Sailors, & Patterson, 2002).
Basal reading programs of this era were developed on the premise that an all-inclusive set of
instructional materials could teach students to read despite variations in teacher competence and
differences among learners (Goodman, Shannon, Freeman, & Murphy, 1988). Basal readers are
grade-leveled books focusing on the teaching of reading through either a code-emphasis or a
meaning-emphasis approach. The code-emphasis approach is based on phonemic awareness,
decoding, and word attack skills. In contrast, the meaning-emphasis approach relies heavily on
reading for understanding through the use of worksheets and vocabulary lessons (Solor, 2016;
Wexler, 2019).
The leveled reading approach to reading instruction is based on the belief that reading
comprehension improves when an extensive number of books are assigned that are just right for
the individual student. Consequently, the level of text difficulty is matched to the student’s
reading level, and the reading level is assessed based on the number of errors made while reading
an assigned passage. The goal in the assessment is usually 95% accuracy (Goodman, Shannon,
Freeman, & Murphy, 1988). The theory behind the leveled reading view suggests that every
student’s reading level falls in one of three levels: (a) the independent reading level in which
teacher scaffolding and support are not needed, (b) the instructional level in which students
access text with some teacher scaffolding and support, and (c) the frustration reading level in
which students experience frustration in trying to access text (Goodman, Shannon, Freeman, &
Murphy, 1988; Solor, 2016; Wexler, 2019).
An integral component of learning to read is reading comprehension; however, various
studies (Durkin, 1978; Englemann & Meyer, 1984; Mason & Osborn, 1982; Shannon, 1983)
conducted during the basal reading era concluded that: (a) reading comprehension focused on
skills rather than focusing on strategies to develop students’ ability to predict, confirm, modify or
summarize; and (b) reading comprehension instruction was secondary to the hierarchical
implementation of the components of the basal program, which focuses on monitoring and
controlling the program based on well-defined goals and objectives, “treating reading instruction
as the application of the materials" (Shannon, 1983, p. 83). While basal readers were at the
forefront of reading programs in the era, the whole language philosophy persistently battled the
basal reading approach and its impact on students’ desire and ability to learn to read (Henk &
Moore, 1992).
Whole Language: A Philosophical Approach to Reading
The late 1980s and 1990s witnessed a shift in emphasis in reading instruction in which
reading comprehension became the next instructional outcome for students. As a result, the
whole language reading philosophy was implemented. Whole language, as described by Ken
Goodman (1986), offered a better approach to reading instruction that focuses on how readers
construct meaning from language by developing a knowledge of the graphophonic, syntactic,
semantic, and pragmatic aspects of language. The effectiveness of whole language and reading
comprehension is difficult to assess empirically because whole language is a philosophy, not an
instructional approach (McKenna, Miller, & Robinson, 1990). However, the Cohen (1968) study,
which was replicated by Cullinan, Jaggar, and Strickland (1974), showed how the whole
language approach resulted in significant increases in word knowledge, reading comprehension,
and vocabulary as measured by an achievement test. In contrast, later studies by Bateman (1991),
Blachman (1991), Liberman and Liberman (1990), and Yates (1988) highlighted the detrimental
effects of the whole language approach on at-risk students and students with disabilities.
Balanced Reading: A Balance of Phonics and Whole Language
Balanced reading, also known as balanced literacy, is an approach to reading instruction
that includes phonemic awareness, phonics, fluency, vocabulary, and text comprehension (Snow,
Burns, & Griffin, 1998). In the late 1990s and into the 2000s, a balanced approach to reading
instruction was implemented through the reading and writing workshop model that resulted from
the Report of the National Reading Panel (NRP): Teaching Children to Read: An Evidence-
Based Assessment of the Scientific Research Literature on Reading and Its Implications for
Reading Instruction (National Reading Panel, 2000). The report was instrumental in the
development of the Reading First Initiative, incorporated into Title I of the Elementary and
Secondary Education Act (ESEA) in 2001.
Leveled Reading Versus Scaffolded Reading
The CCSS challenges teachers to place complex and difficult texts in front of students
and then provide scaffolded instructional supports that will increase the students’ bank of related
language, knowledge, skills, or metacognition to help them comprehend the information.
However, the scaffolded reading approach can be challenging for teachers in schools where
students’ reading levels are significantly below grade level because students are not reading on
their own, but rather relying on teachers to help them struggle through the difficult texts (Beed,
Hawkins, & Roller, 1991).
The debate between the leveled and scaffolded reading approaches is not in how to teach
reading, but rather how to help students build knowledge and improve comprehension of text.
For example, Porter-Magee (2012) stated, “It’s about how to choose the books you are asking
students to read, and the outcome of this debate could go a long way towards deciding the long-
term impact of CCSS-ELA" (para. 1). The CCSS-ELA is subject to the problem that exists as a
result of the debate between the leveled and scaffolded reading camps. Teachers within the two
camps hold fast to their conceptions about how students learn to read, and these conceptions may
or may not be grounded in current research-based theory. This fact is illustrated by teachers who
adhere to the theories posited in the leveled reading approach, in which students are only
provided access to the level of text difficulty relative to their reading level; however, this
approach ignores what a student can accomplish with teacher-provided guidance, support, and
scaffolding (C. Shanahan, Shanahan, & Misischia, 2011).
Under the current CCSS-ELA, text comprehension is not solely a function of the
intermediate grade levels. The accelerated expectations of CCSS-ELA require reading
instruction to include guided instruction on the comprehension strategies students need to access
text. For this to happen, teachers first need to understand that a shift in mindset around the issue
of reading instruction is critical. Williamson et al. (2014) studied the historical trends for readers
related to CCSS and Anchor Reading Standard 10 expectations and noted that a shift in
instructional strategies needs to happen in order to help struggling readers climb the text
complexity trajectory outlined by the standards. To that end, CCSS-ELA insists that teachers
honor all three factors in determining text complexity. In order for teachers to honor all three
factors, they need to have a working understanding of what the factors are and how to use them
when carefully considering the kinds of texts students will read. The problem with teachers’
knowledge and understanding of text complexity was noted by Burkins and Yaris (2012b), who
stated, “For as many teachers who give careful consideration to quantitative, qualitative, and
reader/task factors, we see publishers, administrators and teachers who see text complexity as
synonymous with a Lexile number” (para. 5).
Anchor Reading Standard 10 and College Readiness
The CCSS were created to ensure that all students acquire the skills and knowledge in
their kindergarten through 12th grade education to succeed in college, career, and life.
“Readiness for college or a career means readiness to read and understand challenging and
complex texts—a bar many of today’s students are missing” (Preiss, 2016, para. 2). Research
from the Center for American Progress reported that 40% to 60% of first-year college students
require remedial English instruction. Moreover, the problem is worse for low-income students
and students of color (Jimenez, Sargrad, Morales, & Thompson, 2019).
According to M. Conley (2008), preparing students for college readiness means K-12
schools are developing students’ cognitive and metacognitive capabilities, which include
analysis, interpretation, precision and accuracy, problem solving, and reasoning. For these
metacognitive capabilities to materialize, the CCSS will require a significant and profound shift
in instruction that changes the way students interact with text.
English Language Arts (ELA): District Assessment Data
To establish a context for the study, it is important to look at the reality presented in the
current CAASPP-ELA data from the Reagan Unified School District (pseudonym). Analyzing
the CAASPP-ELA data from RUSD will provide input in two areas: (a) the trajectory of the
RUSD students in their college and career readiness, specifically in the area of reading and
understanding challenging and complex text; and (b) teacher preparedness in teaching the
strategies necessary to interact with challenging and complex texts.
The NAEP (2009) determined that reading comprehension skills are a key indicator of
performance on standardized tests. Pavonetti, Brimmer, and Cipielewski (2003) confirmed that
reading comprehension is a fundamental skill affecting standardized testing. Students’
metacognitive skills are essential in providing students with the ability to make inferences,
understand figurative language, or use various reading strategies and monitor the understanding
of complex text (Kintsch & Kintsch, 2005; Miller, 2005; Oakhill & Cain, 2000).
The RUSD results from the California student assessment system known as the CAASPP
are reflective of the San Diego and California State results. Refer to Table 4 for the RUSD
CAASPP-ELA data by specific student subgroups in 2015, 2016, 2017, 2018, and 2019 for
the percentage of students meeting or exceeding CCSS-ELA standards (California Department
of Education [CDE], 2020).
Table 4
RUSD Student CAASPP Data by Subgroup from 2015-2019: English: Standard Met or Exceeded
Year African American Hispanic White Asian
2015 43.0% 31.0% 62.0% 71.0%
2016 39.0% 35.0% 67.0% 77.0%
2017 43.78% 35.97% 66.12% 77.24%
2018 48.39% 38.91% 69.45% 77.01%
2019 46.63% 40.32% 71.04% 76.39%
The subgroups represented in the CAASPP-ELA data show the student achievement over
a span of 5 years for all students in grades 3-8 and 11. For the purpose of this study, the focus
will be on two significant subgroups: White and Hispanic. These subgroups were selected due to
their significant enrollment numbers. With an enrollment of 23,922 in
RUSD, the Hispanic subgroup accounts for 64.8% of the total enrollment, and the White
subgroup accounts for 23.4% of the total enrollment. The CAASPP-ELA data for the Hispanic
subgroup increased from 31.0% to 40.32% over a 5-year span. The same data for the White
subgroup increased from 62.0% to 71.04% over a 5-year period. Refer to Figure 3 for the
CAASPP-ELA data by specific student subgroups for trends in data related to students
meeting or exceeding standard over a 5-year span from 2015-2019 (CDE, 2020).
Figure 3. RUSD student CAASPP data trends by subgroup from 2015-2019.
The CAASPP-ELA data for Hispanic students who met or exceeded standard trended
upward over a 5-year span from 31.0% in 2015 to 40.32% in 2019. The same data for the White
subgroup who met or exceeded standard trended upward over a 5-year span from 62.0% in 2015
to 71.04% in 2019. The CAASPP-ELA data in Table 5 are organized by student subgroups in
2015, 2016, 2017, 2018, and 2019 for percentage of students nearly meeting or not meeting
standard (CDE, 2020).
Table 5
RUSD Student CAASPP Data by Subgroup from 2015-2019: English: Nearly Met or Not Met
Year African American Hispanic White Asian
2015 57.0% 69.0% 38.0% 29.0%
2016 61.0% 65.0% 33.0% 23.0%
2017 56.22% 64.03% 33.88% 22.76%
2018 51.61% 61.09% 30.55% 22.99%
2019 50.37% 59.68% 28.96% 23.61%
The CAASPP-ELA data in Figure 4 are organized by student subgroups for trends in data
for students nearly meeting or not meeting standard over a 5-year span beginning with 2015 and
ending in 2019 (CDE, 2020).
Figure 4. RUSD student CAASPP data trends by subgroup from 2015-2019.
The CAASPP-ELA data for Hispanic students who nearly met or did not meet standard
trended downward over a 5-year span from 69.0% in 2015 to 59.68% in 2019. The same data for
the White subgroup who nearly met or did not meet standard trended downward over a 5-year
span from 38.0% in 2015 to 28.96% in 2019.
Summary of District Assessment Data
In the 2018-2019 school year, RUSD tested approximately 10,091 third through eighth
and 11th graders on the annual CAASPP. Out of the 10,091 students who tested, approximately
5,014 did not meet the standard, which means approximately 50% met the standard and 50% did
not. Although the number of students not meeting the standard declined in most of the
subgroups, the numbers remain consistently high.
Conceptual Framework
The conceptual framework for this study was Fullan and Quinn’s (2016) coherence
framework. Emphasis is placed on the third component, deepening learning, due to the
correlation between the variables of deepening learning and the targeted PD on text complexity
implemented in the study. Grounded in change theory, the deepening learning variables were
used to determine if targeted PD on text complexity would change (a) teachers’ skill level in
determining the level of challenge a text provides based on the quantitative features, qualitative
features, and reader and task measures of text complexity; and (b) teachers’ level of preparedness
in implementing text complexity. The targeted PD model used in this study was Guskey and
Sparks’s (2002) PD model.
Framework for Professional Development
School districts in the United States spend over $18 billion a year on PD, amounting to
$15,000 to $20,000 per teacher, yet it has done little to change student achievement outcomes
(Wexler, 2019). A study conducted by an education reform nonprofit called The New Teacher
Project (TNTP; 2015) concluded that:
The pervasive beliefs "that we know what works," that more support for teachers is
inherently good regardless of the results, and that development is the key to instructional
excellence have all contributed to a vision of widespread teaching excellence just over
the horizon that is mostly a mirage. (Wexler, 2019, p. 112)
Changing the mirage is imperative and it can be done by implementing targeted PD that will
ultimately affect student achievement outcomes (Desimone, 2009; Guskey & Sparks, 2002;
Wang, Frechtling, & Sanders, 1999). Guskey and Sparks (2002) posited that although improved
student achievement outcomes are expected to increase, it is the influence of PD on teachers’ and
school administrators’ knowledge and practices that will ultimately change student achievement
outcomes.
The four research questions in this study were investigated through the lens of the Fullan
and Quinn (2016) coherence framework while implementing Guskey and Sparks’s (2002) three
major factors of PD: content characteristics, process variables, and context characteristics. Refer
to Figure 5 for the Guskey and Sparks model of the relationship between PD and improvements
in student learning.
Figure 5. Guskey and Sparks model of relationships between PD and improvements in student
learning.
According to Guskey and Sparks (2002), the quality of PD, which is the central
component of their PD model, is determined by three factors: (a) content characteristics,
(b) process variables, and (c) context characteristics. The three factors in the PD model provide
what Fullan and Quinn (2016) called clarity and coherence in that “once the purpose and goals
are identified” (content characteristics), “it is crucial that everyone perceive that there is a clear
strategy for achieving them” (process variables) “and be able to see their part in the strategy”
(context characteristics; p. 24).
Establishing Clarity of Deep Learning Goals
The targeted PD implemented in this study had two clear goals: (a) to deepen teachers’
knowledge of text complexity, and (b) to increase and accelerate teachers’ level of preparedness
in implementing text complexity. The ultimate goal of the PD in this study is to increase
students’ ability to navigate and interact with complex text. Fullan and Quinn (2016) asserted
[Figure 5 depicts the model: content characteristics, process variables, and context characteristics determine the quality of professional development, which influences teacher and administrator knowledge and practices, leading to improved student learning outcomes.]
that CCSS provides opportunities for students to “build knowledge about the world and other
disciplines through text rather than their teacher. This requires students to be able to read closely,
think deeply, and learn independently” (p. 8).
Content Characteristics of Professional Development
In the Guskey and Sparks (2002) PD model, the first factor that contributes to high-
quality PD is the content characteristics, which are the what of PD.
content characteristics of a PD define what a teacher should be teaching and what students
should be learning (DuFour, 2004; Guskey, 2002). At the center of content characteristics is the
purpose of PD, which is for teachers to acquire new knowledge, skills, and a deeper
understanding of academic disciplines and specific pedagogical processes (Guskey & Sparks,
2002; Shulman, 1985).
According to Garet, Porter, Desimone, Birman, and Yoon (2001), PD should be specific
in nature, “addressing a central topic and application of the theme rather than a generic
presentation with no clear purpose” (p. 293). The authors emphasize that PD is most effective
when it is one piece of a cohesive approach to learning. Researchers have argued that cohesion is
critical in PD to the extent that it builds on the knowledge and skills teachers have already
learned. Reading instruction is an example of a specific pedagogical process teachers need.
Faced with the challenges many students have in reading, teachers require the instructional
knowledge needed to understand the struggles students are having with the reading process
(Gyovai, Cartledge, Kourea, Yurick, & Gibson, 2009).
New knowledge, skills, and pedagogical processes. Central to the content
characteristics of targeted PD is the learning of new knowledge, skills, and instructional practices
(Guskey & Sparks, 2002). For this study, the targeted PD was designed to provide explicit
training on each of the trifold dimensions of text complexity: knowledge critical to having the
ability to implement Anchor Reading Standard 10. Whitworth and Chiu (2015) asserted that
teacher knowledge is imperative because it "plays a vital role in both the quality of instruction
and student performance" (p. 123). Cottingham et al. (2008) stated that PD should shape teachers'
instructional practices in content such as reading and writing. According to Scherff (2018), PD
that is devoid of a strong content component is usually an event that “happens to” teachers, is
"often associated with workshops, seminars, or lectures, and is typically a one-size-fits-all
approach" (p. 1).
Researchers have concluded that when PD focuses explicitly on training teachers in the
pedagogical processes of critical academic disciplines, such as reading instruction, instructional
practices begin to change (Guskey, 2009; Malatesha Joshi et al., 2009). Researchers such as Ball
and Forzani (2009); Ball, Sleep, Boerst, and Bass (2009); Grossman and McDonald (2008); and
Lampert (2009) have long highlighted the need for PD that focuses on effective teaching
practices. For example, Hiebert and Morris (2012) stated,
If one examines up close the nature of teachers’ practice, it is possible, claim the
researchers, to identify instructional routines that could be taught to teachers and that
significantly affect the quality of teaching and the subsequent learning of students. (p. 93)
Additional studies have shown a correlation between content-focused PD and an increase
in student achievement outcomes. In the Firestone and Riehl (2005) study, the findings
indicated that the three school districts’ coherent, content-focused approach to PD changed
teachers’ instructional practices, which positively affected student achievement outcomes.
Research conducted on the Science Teachers Learning from Lesson Analysis (STeLLA)
program highlighted how it strengthens teachers’ understanding of how to teach science. The
study draws attention to the focus on content, but also to its pedagogical process. The goal of the
STeLLA Program is twofold: (a) to help teachers understand their students’ science thinking,
thus, paving the way for teachers to respond to students’ ideas and misunderstandings about
science; and (b) to help strengthen teachers’ knowledge and skills on how to sequence
science ideas so that students construct a coherent story that makes sense (Roth et al., 2011). The
STeLLA Program teachers’ students made significantly higher gains on pre- and post-science
tests when compared to students who received content training only, “a finding further confirmed
by a second randomized study of the program several years later” (Darling-Hammond,
Hyler, & Gardner, 2017, p. 3).
The work of Guskey and Sparks (2002) is supported in Desimone’s (2009) operational
theory regarding the influence of PD on teachers. Desimone’s operational theory outlines four
steps that affect instructional practice and ultimately student learning outcomes. Key to
Desimone’s theory is the increase of teacher knowledge and skills as a result of PD. Desimone’s
theory also highlights the change in teachers’ instructional practices. Understanding the
important link between PD and how instructional practices can change, as well as how these
changes can improve student achievement outcomes, is the basis on which district leadership can
target support for teachers that ultimately will result in student achievement (King, 2012). See
Figure 6 for the four steps in Desimone’s operational theory.
Step One: Teachers participate in effective professional development.
Step Two: The professional development increases teachers’ knowledge and skills.
Step Three: Teachers adapt their instructional practices.
Step Four: Changes in instructional practices promote student learning.
Figure 6. Operational theory: Professional development. Adapted from “Improving Impact Studies
of Teachers’ Professional Development: Toward Better Conceptualizations and Measures,” by L.
M. Desimone, 2009, Educational Researcher, 38(3), p. 207. Copyright 2009 by the author.
Building Precision in Pedagogies
Building precision in pedagogies means that teachers are developing precision in their
instructional strategies. Sustainable change in a school district is most often the result of a
concerted effort to make instruction changes based on a specific framework, model, or
pedagogical system (Fullan & Quinn, 2016). Four components make up what Fullan and Quinn
(2016) termed a pedagogical system. The first of the four components is building common
language and knowledge base. The targeted PD in this study built common language around the
dimensions of text complexity: quantitative, qualitative, and reader and task. The text complexity
dimensions were clearly defined, and the participants had the opportunity to practice what they
had learned through various activities. The second component is to identify proven pedagogical
practices. This component was evidenced in the targeted PD in this study by identifying
instructional practices to implement the qualitative and reader and task dimensions of text
complexity. This portion of the PD empowered teachers to look at the kinds of texts they are
placing in front of students while considering the qualitative and reader and task dimensions of
text complexity. The third component of the pedagogical system is building capacity. Building
teacher capacity in text complexity requires a change in teacher mindset and pedagogy around
reading instruction. Even so, Snow et al. (1998) cautioned that “providing teachers with information
about new instructional strategies does not necessarily result in changes in existing teaching
behaviors” (p. 292). Darling-Hammond (2005) challenged the educational system to improve the
quality of teaching through the systematic development of teacher knowledge and skills and to
extend their expertise. To support teachers in making changes in deeply held beliefs, knowledge,
and habits of practice, there needs to be a focus on implementing high-quality and focused
sources of PD (Bean, Eichelberger, Swan, & Tucker, 1999; Thompson, Zeuli, & Sykes, 1999).
Miller (2012) stated:
The changes in instruction and curriculum that come hand-in-hand with the Common
Core ELA standards are not inconsequential, and focusing on text complexity will likely
require additional opportunities for teacher professional development. (p. 2)
The fourth component of the pedagogical system is to provide clear causal links to
impact. Fullan and Quinn (2016) defined this construct as strengthening “the specificity of
instructional practice and its causal efficacy in making a difference to learning” (p. 89).
Teacher beliefs. A critical factor affecting teaching practices is teachers’ pedagogical
beliefs because they are manifested in teaching methods, decision-making, and the selection of
strategies and activities used within the classroom (M. Borg, 2001; Handal & Harrington, 2003;
Stipek, Givin, Salmon, & MacCyvers, 2001; Richardson, Anders, Tidwell, & Lloyd, 1991).
Richardson et al. (1991) highlighted the research around teacher beliefs, stating that “practices
without theory may lead to [an incorrect] implementation or no implementation at all unless
teachers’ beliefs are congruent with the theoretical assumptions of the practice” (p. 579). In
Anders, Tidwell, and Lloyd’s (1991) study on teachers’ beliefs and practices in reading
comprehension instruction, the researchers found that many of the teachers in the study may
have had knowledge and belief related to research-based reading comprehension instruction;
however, they “did not know the practices that would allow them to act upon those beliefs”
(p. 579). The Anders et al. study demonstrates the impact of teachers’ beliefs on literacy
instruction. As a result, these factors become competing elements in instructional practices that
create a misalignment of teacher practices, theoretical assumptions, and currently implemented
reading initiatives.
The added dimension of text complexity in literacy underscores the need for teachers to
move beyond personal conceptions, self-perceptions, and pedagogical beliefs around reading
instruction. To that end, the CCSS-ELA shifts require reading instruction to be framed through
the lens of text choice, text rigor, text complexity, and the balance of literary nonfiction and
informational texts. For teachers to fully understand the instructional practices needed for any
initiative, Marzano (2009) emphasized the need “to first develop a viable tool for fostering
expertise” in teachers’ instructional practices by providing “opportunities for deliberate practice
within a comprehensive professional learning system in which there are clear and focused tasks,
clear criteria for success, and motivation to improve within the context of mentoring and
professional development” (p. 5). The comprehensive PD Marzano proposed is the strategic tool
that may provide teachers with the pedagogical practices needed for the implementation of text
complexity.
Process Variables in Professional Development
In the Guskey and Sparks (2002) PD model, the second factor that contributes to high-
quality PD is the process variable, which refers to the how of PD. Process variables are
concerned with the types and forms of PD, as well as how PD is planned, implemented,
monitored, and supported (Guskey & Sparks, 2002; Sparks & Hirsh, 1997). The delivery of PD
takes various forms, including:
• Periodic workshops
• In-class observation
• Single session
• Sustained and continuous
• Active learning
• Job-embedded
• Collaborative
• Coaching
• Content or standard specific
• Aligned with school/district goals
• Connected to practice (Archibald, Coggshall, Croft, & Goe, 2011; Calvert, 2016;
Darling-Hammond et al., 2017; Labone & Long, 2016).
To accomplish the pedagogical precision teachers need, it is important to provide the kind
of PD experiences that target and enhance pedagogical precision (Guskey, 2002; Fullan & Quinn,
2016). PD takes on several forms, and Marzano (2003) warned against standardized PD activities
that have no lasting impact on teachers’ instructional behaviors in the classroom. Based on the
research affirmed by Ross and Gray (2004), successful PD is reliant upon the participants’
efficacy beliefs, and these beliefs have a critical impact on teachers’ willingness to try and then
embrace new ideas. For this reason, designing PD that also includes opportunities for teachers to
practice the new learning around text complexity would foster growth in teachers’ efficacy in
implementing text complexity (Fang & Pace, 2013; Hiebert & Mesmer, 2013).
Planned, monitored, and supported. Research shows that school districts’ limited plans
for PD around the CCSS did not offer teachers the deep learning needed to implement the
standards as they were intended, and “professional development is not up to the task of
supporting the substantive changes required of teachers to meet the new standards” (Hirsh, 2012,
p. 1). Odden (2002) contended that PD should be supported with a well thought out plan that
aligns with district goals and student learning. The direction of PD in school districts is typically
determined at the district level with little to no input from teachers or administrators. The PD
may or may not be aligned to district or school site goals, with topics unrelated to the specific
needs of schools or teachers (Choy et al., 2006; Corcoran, 1995). This approach to PD is what
Guskey (2002) calls a top-down approach that contributes to the gap between the PD teachers get
and the PD teachers need.
District-mandated PD opportunities usually result in half-day or full-day workshops
frequently delivered by outside consultants (Kesson & Henderson, 2010). By contrast, research
supports the need for PD that is longer in duration (Banilower, Heck, & Weiss, 2007; Gerard,
Varma, Corliss, & Linn, 2011; Lamprianou & Boyle, 2005). When teachers are given the
opportunity to practice new instructional strategies with sufficient duration, it increases their
self-efficacy and willingness to implement specific instructional practices in the classroom
(Roberts, Henson, Tharp, & Moreno, 2000). McKnight (2018) referenced a 3-year longitudinal
study (Johnson, Kahle, & Fargo, 2007) to examine the effect of “sustained whole school
professional development” on student achievement in science (p. 27). The results were
encouraging, supporting the fact that the duration and structure of PD are linked to significant
increases in students’ achievement in science.
The success of PD and implementation following PD are clearly linked to support and
accountability for teachers. Support is critical so that the implementation of new strategies is not
left to the teachers’ discretion (Wei et al., 2009). Implementation of new teaching strategies is
often nonexistent due to the lack of follow-up and support from the district level, unfortunately
leading to the demise of many district initiatives. In districts where follow-up and accountability
are in place, there is usually a system for mentoring and coaching of teachers through the
implementation phase (Corcoran, 1995; Luft et al., 2011). Fullan (2014) stated, “What matters is
what happens after (or between) workshops: Who tries things out? Who supports you, who gives
you feedback?... all these questions depend on the culture to which one returns, and especially on
its social capital” (p. 79).
Shifting Practices through Capacity Building
After learning goals have been identified and teachers have had the opportunity to
develop precision in their pedagogical practices, according to Fullan and Quinn (2016), the focus
should be on making changes in the instructional practices of teachers. Guskey and Sparks
(2002) contended that shifting a district or school’s instructional practices requires a close
examination of the organizational systems. The examination of the organizational systems should
result in what Fullan (1993) calls individual and institutional development, “to break the
impasse” of the status quo and build “a new conceptualism of teacher professionalism that
integrates moral purpose and change agentry” (p. 3).
Context Characteristics in Professional Development
In the Guskey and Sparks (2002) PD model, the third factor that contributes to high-
quality PD is the context characteristics, which refer to the “who,” “when,” “where,” and “why”
of PD (Guskey & Sparks, 2002, p. 3). The context characteristics refer to the educators involved
in the PD and the organization, system, and culture in which the PD takes place. Additionally,
context also focuses on school- and district-level policies. The targeted PD in this study sought to
shift the instructional practices in reading through the implementation of Anchor Reading
Standard 10: Text Complexity. For a complete district or schoolwide shift in the instructional
practices in reading, PD would need to be a part of a district or school wide systems change
called systems thinking.
Systems thinking. Before systemic change can happen through PD, Calvert (2016)
asserted that teachers must first decide to improve their instructional practices. For many
teachers, the idea of using PD to improve instruction is difficult because many believe PD is “an
empty exercise in compliance, one that falls short of its objectives and rarely improves
professional practice” (p. 2). In a 2014 survey conducted by the Bill and Melinda Gates
Foundation, more than 1,600 teachers responded and characterized their PD experience as
irrelevant, ineffective, and “not connected to their core work of helping students learn” (Calvert,
2016, p. 2). When such a belief about PD is systemic within an organization, it is often rooted in
teachers’ underlying beliefs about their own instructional practices (Opfer & Pedder, 2011; Stoll
& Fink, 1996), as well as their overall disconnect with incoherent district-led PD (Calvert, 2016).
District-led PD garnered this reputation because districts have often failed to recognize how
the system of PD either hinders or supports PD efforts (Murphy, 2005). District leadership needs to
implement a systems thinking approach to PD (Garland, Layland, & Corbett, 2018). Systems
thinking is an approach that is “characterized by synergy—the whole (system) is greater than the
sum of its parts (elements), because the relationship among the different elements adds value to
the system” (Garland et al., 2018, p. 9; Betts, 1992). Fullan (2016) referred to systems as system
transformation collaboratives: systems that build coherence “because (a) [they increase] clarity
of goals and strategy, (b) [provide] sustained learning experiences with cycles of application and
reflection, (c) [increase] both vertical and lateral capacity; and (d) [foster] relentless focus on
student learning” (p. 69). In the McKnight (2018) study, systems thinking within a school district
setting was described as a process that will affect the students, as initiatives and the subsequent
PD are sustained and implemented. See Figure 7 for the cycle of systems thinking.
• Team analyzes student data to determine instructional needs
• Team researches and reviews best teaching practices to address student needs
• Team determines the best strategies that will address student needs
• District policies are reviewed and modified to better support teachers and students
• Professional development is designed in accordance with district policies and schedules
• Time is set aside for teachers to receive ongoing training
• Coaching and support for teachers are organized
• District departments are consulted: technology, educational services, business
• Business office and purchasing are consulted
Figure 7. Cycle of systems thinking.
Teacher agency. Earley and Bubb (2004) believed that teachers can be united in changing
instructional practices by building a collaborative culture based on trusting relationships. Senge
(2000) asserted that profound and positive change within an organization must be nurtured and
cannot emerge as a result of a top-down mandate. Change comes about when the principal,
teachers, and parents are engaged in creating something new. Carreón and Rau (2014) endorsed
the idea of coordinating PD around a common vision such as literacy instruction. The element of
a common vision around literacy eliminates the barrage of training on a variety of topics teachers
often find difficult to implement in the classroom. Fisher, Frey, and Nelson (2012) advocated for
a common vision of literacy through an instructional framework that guides what teachers do by
having a common vocabulary to discuss and implement the necessary literacy practices. For
districts to foster a collaborative and trusting culture, there must be a platform for teacher agency
(Guskey, 2002). Calvert (2016) defined teacher agency as “the capacity of teachers to act
purposefully and constructively to direct their professional growth and contribute to the growth
of their colleagues” (p. 4). Understanding teacher agency is critical because, as King (2013)
asserted, “teachers are the gatekeepers of change or the change agents in the PD process” (p. 13).
When districts value the role of teacher in the change process, school improvement initiatives
have a greater chance of success (NCCA, 2010).
King (2012) referenced a study conducted in Ireland that involved teachers from urban
disadvantaged schools who participated in a collaborative PD literacy initiative. One significant
finding in the study was the presence of teacher agency in the literacy initiative. The teachers
described the initiative as “feasible, focused, structured, workable, and with a clear framework”
(p. 12). The teachers in the literacy initiative participated due to teacher agency, more
specifically due to the three factors of teacher agency as outlined by King. Refer to Table 6 for a
description of various aspects of teacher agency.
Table 6
Teacher Agency
Openness and Willingness
• Bottom-up approach
• Teachers elected to engage with new practices
• Value other teachers’ opinions about what works
Motivation
• Teachers’ personal and professional needs met
• May be intrinsic or extrinsic or both
• Motivated by practices that will increase student outcomes
Deep Learning
• Procedural and conceptual knowledge of practices
• Pedagogy and pedagogical content knowledge related to the new practice
• To facilitate teachers’ ability to adapt the practices to meet the needs of their students
• To facilitate teachers’ ability to see how it aligns with existing practice
Note. Adapted from Developing and Sustaining Teachers’ Professional Learning: A Case Study of Collaborative
Professional Development [Doctoral dissertation, University of Lincoln], by King, 2012, DCU Online Research
Access Service (http://doras.dcu.ie/22058/). Copyright 2012 by the author.
King’s (2012) findings support the evidence of teacher agency in the implementation of
the literacy initiative: (a) teachers engaged in the initiative because it aligned with their personal
and professional needs., (b) teachers engaged when they understood the literacy initiative would
meet the needs of their students in their individual contexts, and (c) teachers engaged because
they had pedagogical content knowledge related to the practice.
School and district level policies. The effectiveness of PD is dependent upon the
district’s commitment to collaboratively plan, implement, and provide follow-up support during
and beyond implementation (Desimone, 2009; King, 2012). The important question to answer
when fostering a vision for PD is whether or not the PD aligns with the vision, mission, goals,
and policies of the district and schools (Desimone, 2009; Garet et al., 2001). Effective PD also
aligns with standards, assessments, and supports other PD being implemented in the district or
school. The direction of PD in a district or school is shaped by leadership and can have a
significant impact on teaching and learning that results from high-quality PD (Desimone et al.,
2002; Leithwood, Jantzi, & Steinbach, 2002; Ogawa & Bossert, 1995).
Targeted Professional Development in Text Complexity
When it comes to the skills of teaching reading through text complexity, it stands to
reason that teachers require extensive PD to expand their tool kits beyond the surface-level
literacy instruction of the past. The current standards call for pedagogical practices that reflect
the accelerated expectations of CCSS, so that students in kindergarten through 12th grade are
taught the comprehension strategies necessary to access complex text. CCR Anchor Reading
Standard 10 outlines the text complexity measures teachers need to consider when examining the
texts placed in front of students, i.e., quantitative, qualitative, and reader and task measures.
These measures serve as a framework for teachers to understand the interconnectedness of text
through (a) the content; (b) what students should learn, understand, and grapple with as a result
of interacting with complex text; and (c) the features of the text that make it difficult or easy for
students to access and learn the content (Draper, 2012; Fang, 2014). As teachers begin to
acknowledge that students are facing challenges in reading a more complex text, they begin the
first step toward understanding the need to implement the text complexity measures in literacy
instruction.
If the CCSS-ELA, and specifically CCR Anchor Reading Standard 10, are to be
implemented with fidelity, teachers need to experience targeted PD in the dimensions of text
complexity so that they possess an understanding of the dimensions of text complexity.
According to Hiebert (2013), teachers have received limited guidance in terms of how to
evaluate the qualitative features of text, as well as the intent of text complexity’s triad of
dimensions. Targeted PD would provide support for teachers in their understanding of text
complexity and teach them how to design learning experiences that promote reading
achievement as students attempt to engage with complex text (Hiebert, 2012; Hiebert &
Grisham, 2012). The Center for Instruction (2006) outlined three components of high-quality
targeted PD: (a) PD includes opportunities for teaching to be observed, strategies demonstrated,
and opportunities to practice, (b) PD provides time for teachers to collaborate, and (c) PD
provides opportunities to work with expert coaches and instructional leaders using a cycle of
feedback to address instructional practices.
Targeted PD is a pivotal component in building teacher capacity and expertise as new skills
like those required with CCSS-ELA are brought on board; however, the PD offered should be
high-impact PD that is focused on people and practices, as well as the implementation of
practices (Reeves, 2008). Reeves (2008) stated that professional practices are distinguished “not
by label (‘We’re doing professional learning communities’ or ‘We’re doing high-yield
instructional strategies’) but rather by implementation—the extent to which 90 percent or more
of the teachers [are] actually using these practices in their classroom” (p. 23). Reaching the goal
of 90% or more of teachers implementing text complexity in their classroom can happen when
teachers experience high quality PD. Moreover, teachers must then have the opportunity to
practice using the strategic tools needed to improve student outcomes in comprehending complex
text. Studies have shown that, on average, teachers need 20 separate instances of practicing new
skills before they can demonstrate mastery of the skill; this number increases along with the
complexity of the skill in question (Joyce & Showers, 2002).
The quality of PD is contingent upon how much coaching and mentoring are available to
teachers when implementing specific instructional strategies. Researchers have focused on
quality PD as a roadmap to address the quality of instructional practices in the classroom to make
a difference in the literacy outcomes of students (Carreon & Rau, 2014; Lyons & Pinnell, 2001).
Once teachers understand how to use and implement text complexity strategies, it is imperative
for teachers to see the results in student achievement so that the teachers’ instructional
practices are validated. Student achievement often changes what teachers believe about their own
practices, and, as Guskey (2002) asserted, teachers will change their underlying beliefs about
pedagogical practices only after they see success with students.
The PD also needs to be substantive so that teachers do not approach the PD with what
Huberman (1995) calls a tinkering mindset and then proceed to implement only a fraction of
what was presented. Joyce, Wolf, and Calhoun (1993) contended that teachers who have
experienced extensive PD implement in their instructional practices a mere 10% of what they
have learned. The substantive approach to PD in text complexity will affect the way in which
teachers make meaning of CCR Anchor Reading Standard 10, ultimately influencing the reading
experiences of students as they move through the levels of text complexity required by the CCSS
(Hiebert, 2013; Hiebert & Grisham, 2012).
Chapter Three: Methodology
Due to the growing percentage of college students who are inadequately prepared for the
rigor of college level text, the CCSS present a case for implementing the use of complex texts in
K-12 classroom instruction (Barton, 2006). Although the CCSS framework does not address how
to prepare 21st century students who are career and college ready, the framework does speak to
the critical need for students to advance in their capacity to:
1. Demonstrate independent and engaged reading;
2. Build strong content knowledge;
3. Actively use literacy in responsive and purposeful ways;
4. Comprehend and critique texts;
5. Use evidence from multiple texts to strengthen argumentation;
6. Use technology and digital media strategically; and
7. Demonstrate cross-cultural awareness and understanding (Turner & Danridge, 2014).
ACT and SAT data combined with additional information are used for college admissions
and granting merit-based scholarships to students. Viewed through the lens of literacy, the data
from these assessments signal that students may not have the skills necessary to interact with and
understand complex text. Analysis of ACT test-takers in 2011 indicates that 52% of students at
the time were performing at a college- or career-ready level when interacting with and
understanding text. Additional data show that 57% of the high school graduating students who
completed the SAT did not meet the CCR benchmarks (American College Testing [ACT], 2013;
Fisher, Frey, & Lapp, 2012). ACT also reported that only 35% of high school students
understand key aspects of language, such as (a) a knowledge of language varieties and the ability
to use language skillfully, and (b) the ability to acquire and use rich vocabulary. Further, data
show that 24% of students are able to read and understand content-area texts, especially in the
area of science (ACT, 2017).
When high school students graduate and enter college, only to find themselves
challenged when required to interact with complex texts as first year college students, they are
often counseled toward non-credit remedial courses. According to Thomas Bailey (2009) from
the Community College Research Center, nearly 60% of first year college eligible students
discover they are not prepared for post-secondary studies and must enroll in non-credit remedial
English or mathematics courses. Once students have completed the remedial courses, they then
enroll in a gatekeeper English course that prepares students to enter required credit granting
courses. Fewer than 50% of students pass these courses: a barrier for many students continuing
their college careers (Bailey, 2009; Jenkins, Jaggars, & Roksa, 2009). Data indicate that less than
20% of students who begin in remedial courses actually persist to graduate with a certificate or
degree, and over 75% of surveyed students who dropped out of college courses indicated a
difficulty with reading as a major contributing factor (Bailey, 2009; Chait & Venezia, 2009;
Roksa, Jenkins, Jaggars, Zeidenberg, & Cho, 2009). To add to the college literacy dilemma,
Baer, Cook, and Baldi (2006) highlighted the findings in the National Survey of America’s
College Students (2005), in which 75% of students at 2-year colleges and more than 50% of
students at 4-year colleges did not score at the proficient level in literacy.
Restatement of the Problem
Teachers in kindergarten through 12th grades have the task of building a foundation of
college and career readiness (CCR). Two significant factors in building CCR include having the
knowledge and skills necessary for preparing students to read widely and deeply using an
extensive range of high-quality, increasingly challenging literary and informational texts that
develop rich and rigorous content knowledge across all grade levels (Common Core Standards
Initiative, 2010). The critical challenges related to these tasks are teachers’ ability to select
increasingly rigorous literary and informational texts and the availability of targeted PD to
advance teachers’ skills in implementing text complexity.
Developing teachers’ knowledge of CCR Anchor Reading Standard 10 (see Appendix A)
is paramount. The International Reading Association (IRA, 2012) emphasizes the fact that
teachers are the ones who will make the ELA-CCSS “an effective instructional reality” in their
classrooms (p. 1). Effectively influencing the instructional reality in classrooms requires PD that
targets skills teachers need for implementation of CCR Anchor Reading Standard 10. Limited
research has suggested that teacher knowledge of the CCSS is developing at a rapid pace;
however, teacher knowledge of CCSS has not translated into instructional practices that develop
the “deeper level of mastery that students need to meet the standards” (ASCD, 2012, p. 5). For
teachers to gain a clear understanding of text complexity, educational leaders need to provide
targeted PD that builds teachers’ knowledge and skills around the dimensions of text complexity
and how these dimensions influence the kinds of texts used in reading instruction (Fisher, Frey,
& Lapp, 2012; Hiebert, 2013; Hiebert & Grisham, 2010). The challenge is that focusing on text
complexity requires changes in both instructional practices and curriculum. Likewise, changes
that increase teachers’ competence and enhance student achievement in interacting with complex
text also require targeted PD (Guskey, 1986; Miller, 2012). A targeted approach to PD,
according to Loucks-Horsley et al. (1999), changes instructional practices by aligning ongoing
practice and support for teachers and not providing a one-time PD.
Purpose of the Study Restated
The purpose of this mixed methods study was to determine the relationship between
targeted PD and the perceived skill level of K-8 teachers’ implementation of the trifold measures
of text complexity. The significance of deeper learning around text complexity is that it may:
(a) accelerate clarity and deep learning of strategies for the implementation of text complexity,
(b) develop precision in teachers’ instructional practices around text complexity, and (c) help
teachers identify processes that will shift instructional practices around literacy instruction. As a
result, teachers’ increased knowledge of text complexity will provide instructional strategies to
teach students how to interact with complex text.
Research Questions
The following questions were used to investigate the subject of this research:
1. What are the perceived knowledge and skill levels of teachers in text complexity who
have not experienced prior professional development?
2. What is the perceived level of preparedness of teachers in text complexity who have
not experienced prior professional development?
3. To what degree did the teachers’ knowledge and skill levels in text complexity
change after engaging in professional development?
4. To what degree did the teachers’ preparedness in text complexity change after
engaging in professional development?
Research Methodology
The research study employed a mixed methods approach using a quasi-experimental, one-
group pretest-posttest design. The mixed methods approach builds on the strengths of
quantitative and qualitative research while minimizing the weaknesses of both methodologies
(Creswell et al., 2003; Johnson & Onwuegbuzie, 2004). The design of the study was quasi-
experimental because the subjects in the single group were not randomly assigned. The subjects
were composed of kindergarten through eighth grade teachers, as well as district office
administrators. Quasi-experimental research designs test causal hypotheses and lack random
assignment. The one-group pretest-posttest design can be diagrammed as follows:
t1 (Pre-Test) X t2 (Post-Test)
In a one-group pretest-posttest design, an observation in the form of a pretest (t1) is given first,
then the intervention is given to all participants, and, finally, a second observation (t2) or posttest
is conducted (Morgan, Gliner, & Harmon, 2000). In this study, the research sought to implement
a focused intervention by taking the participants through a PD treatment with a focus on text
complexity. Bärnighausen et al. (2017) asserted that interventions such as a PD treatment
“generate results that are of higher external validity than experimental results because they take
place in ‘real-world’ settings rather than in the artificial context of experiments” (p. 28).
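The pre/post comparison implied by the t1–X–t2 design can be sketched as a paired analysis in which each participant serves as their own control. The scores below are hypothetical placeholders, not the study's survey data.

```python
# Sketch of the one-group pretest-posttest comparison (t1, X, t2).
# All scores are hypothetical; the study's actual data are not reproduced here.
from statistics import mean

pre  = [2.1, 2.8, 3.0, 2.5, 2.2, 3.1, 2.7, 2.4, 2.9, 2.6, 2.3]  # t1: pre-survey means
post = [3.6, 3.9, 4.1, 3.5, 3.2, 4.3, 3.8, 3.4, 4.0, 3.7, 3.3]  # t2: post-survey means

# Paired differences: each participant is compared against their own pretest.
diffs = [b - a for a, b in zip(pre, post)]
print(f"mean pre = {mean(pre):.2f}, mean post = {mean(post):.2f}, mean gain = {mean(diffs):.2f}")
```

With real data, the paired differences would typically feed a paired-samples test rather than a simple mean comparison, since the single-group design lacks a control condition.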
The pre-survey and post-survey element of the design provided data for analysis from
surveys addressing teacher knowledge of text complexity and the choices teachers make in the
process of implementing text complexity in the ELA CCSS. The mixed methods approach
combined analysis from teacher surveys along with the qualitative analysis of the discussion with
the participants during the PD treatment. Using qualitative research, according to Merriam
(2009), is an attempt to confirm “how people interpret their experience, construct their world,
and what meaning they attribute to their experiences” (p. 14). To directly align the questions in
the study to qualitative research (Patton, 2015), critical considerations were made such as (a) the
challenges teachers face in implementing complex text strategies, and (b) the challenges teachers
face in motivating students who are not interested in reading. The implementation of the study
began with a pre-survey that was followed by a focused PD and discussion on text complexity.
The PD concluded with a post-survey (Johnson et al., 2007). The data from both surveys and the
notes from the discussion were analyzed using qualitative and quantitative measures to yield a
comparative analysis of participants’ pre- and post-perceptions of text complexity, as well as
their responses to PD focused on text complexity.
Although numerical data provide general understanding and measure specific beliefs
related to literacy development, numerical data cannot portray the live interactions teachers have
regarding the process of teaching and learning to read through the implementation of text
complexity. The interaction that took place during the PD event and the discussion following the
event provided further qualitative data to add depth and breadth to the quantitative research
(Creswell & Plano Clark, 2007).
Sample: Participation Selection
The study focused on the Reagan Unified School District (RUSD, pseudonym) and the
kindergarten through eighth grade (K-8) teachers’ implementation of text complexity across all
content areas. The study questions sought to understand teachers’ perceptions regarding their
own knowledge and skills in text complexity. Additionally, the study sought to determine the
participants’ preparedness to implement text complexity.
To gain access to conduct the study, the superintendent of RUSD was contacted via
email, followed by a letter outlining the purpose and rationale for the study. The letter also
included how the research could benefit the district, the expected commitment from the district
and participants, and a request for approval by the superintendent. A letter of agreement was
provided by the superintendent and submitted to the University of Southern California (USC)
Human Subjects Institutional Review Board (IRB). Upon the district’s approval of the study, all
K-8 teachers in RUSD were invited to participate in the study. Eleven instructional staff
members accepted the invitation to participate in the text complexity study and provide relevant
information on the research questions. Table 7 summarizes the study’s significant participant
demographics.
Table 7
Participants’ Teaching Level and Years of Experience in Education
Participants Educational Role Teaching Level Years in Education
#1 Director of Curriculum and Instruction (K-5) n/a 21-25
#2 Teacher K-2 11-15
#3 Teacher 3-5 30+
#4 Teacher 6-8 21-25
#5 Teacher 3-5 21-25
#6 Teacher 3-5 0-5
#7 Teacher 6-8 16-20
#8 Teacher/Instructional Coach n/a 6-10
#9 Teacher K-2 26-30
#10 Teacher K-2 6-10
#11 Teacher/Instructional Coach n/a 6-10
Included among the 11 participants in the study were three K-2 teachers, three third-
through fifth-grade teachers, two sixth- through eighth-grade teachers, three CSRTs, one English language
development resource teacher (ELDRT), and a K-5 Curriculum and Instruction Director. One
participant has served in education 0-5 years, three participants have served 6-10 years, one has
served 11-15 years, one has served 16-20 years, three have served 21-25 years, one has served 26-
30 years, and one has served in education over 30 years. The district employs 1,185 teachers
with an average of 13 years of teaching experience. The pupil-teacher ratio is 21.9 to 1.
Site Selection
To develop a comprehensive and genuine investigation into teachers’ experiences in the
implementation of text complexity in the CCSS-ELA, this study explicitly attempted to highlight
information about text complexity across a single district. A key element to this study lay in the
purposeful, or more specifically, critical case sampling used to decisively explain the text
complexity phenomenon and allow for logical generalizations about the knowledge and
implementation of text complexity district wide. To illustrate the core of critical case samplings,
Patton (2002) acknowledged the power of critical case sampling in permitting generalizations
and maximizing application of information to other cases because if a generalization is true of
one case, it is likely to be true of all other cases. When research situations do not feasibly warrant
the participation of all members of the population, Wiersma and Jurs (1998) recommended
selecting a random sample of subjects who can represent the whole population. Bogdan and
Biklen (2016) emphasized the importance of randomly selecting subjects who have the same
characteristics that exist proportionate to the total population. The 11 study participants represent
the total K-8 teaching population with a balanced selection of teachers in primary and middle
school grades, as well as district level instructional support teachers.
The 11 participants were informed that the pre-survey, the 2.5-hour PD, and the post-
survey were being conducted as research for the completion of the
researcher’s dissertation and doctoral program. The participants were also informed that the
surveys were anonymous and would not be shared. Making the determination that teachers in
RUSD were capable of addressing the research questions was essential to this study’s process.
Data Collection and Instrumentation
The data process began after the researcher completed the initial steps to protect the
human subjects who participated in the study. The researcher sought and received approval from
the USC IRB to conduct this study by submitting the following documents for review: a letter of
invitation (see Appendix D) and informed consent for the quantitative and qualitative data
collection processes (see Appendix F).
Ensuring the confidentiality, safety, and full knowledge of the research for the study
participants is a critical outcome of obtaining IRB approval prior to commencing research (Yin,
2014). Upon approval by the USC IRB (see Appendix G), the superintendent of the RUSD was
contacted seeking approval to contact kindergarten through eighth grade teachers and district
personnel who fit the criteria for the study.
A full-color flyer identifying the researcher, introducing the purpose of the study, and
requesting interested participants to contact the researcher via phone or email was distributed to
583 kindergarten through eighth-grade teachers and district personnel in February 2015 (see
Appendix E). There were 18 respondents to the invitation to participate flyer; however, only 11
participants fit the criteria and were selected through purposive sampling. Eleven participants out
of a pool of 583 may have limited the study, but the researcher was able to develop logical
generalizations from the rich evidence produced by the small number of participants. Patton
(2002) emphasized the fact that working with a small group is likely to “yield the most
information and have the greatest impact on the development of knowledge” (p. 236).
The respondents were advised that their participation was voluntary and their responses
would be kept completely confidential. As an incentive to complete the pre-survey, the 2.5-hour
PD event, and finally the post-survey, the respondents’ names were entered into a random
drawing for a Dell Venue Pro 3000 Series Windows Tablet (32GB). Additional drawings
included two $25 Starbucks gift cards and five $5 Starbucks gift cards.
Once the participants completed the post-survey, the surveys were matched with the pre-
surveys with corresponding names. Following the analysis of the surveys, all documents were
destroyed.
For data collection, a mixed methods approach was used for collecting and analyzing
both quantitative and qualitative data for the purposes of examining teachers’ current knowledge
of text complexity implementation and the outcomes of teacher PD focused on text complexity
(Creswell, 2006). The mixed methods approach combined the quantitative data with
rich, detailed qualitative data from educator responses to discussion questions (Greene &
Caracelli, 2003).
Pre-Survey and Post Survey
The pre- and post-survey data were collected online through USC’s Qualtrics system.
Eleven surveys were distributed through Qualtrics and all were completed, representing a 100%
response rate. The pre- and post-surveys consisted of 34 items (see Appendix B). Aligning the
study research questions, the activity theory of literacy instruction, the Guskey framework for
PD, as well as the existing literature was key in guiding the development of the pre- and post-
surveys. The alignment of the survey items with the research questions and the text complexity
matrix was critical to the outcomes of the study because the development of survey items as a
“method must depend on the formulation of a core research question that is amenable to being
answered through a survey” (Baumann & Bason, 2011, p. 414).
In Section A (items 1-7) of the pretest, the participants answered questions regarding
biographical data such as gender, age, teaching experience, and degree or certification. In
Section B (items 8-34), a 5-point Likert scale was used to collect data. The items in this section
focused on the participants’ perceptions of their knowledge of text complexity. The participants
answered questions about their readiness to implement text complexity and the extent to which
they implement various reading strategies in their instructional practices. The Likert scale
provided a broad range of data with which to explore the various aspects of the participants’
perceptions of their knowledge of and their ability to implement text complexity.
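Scoring a 5-point Likert instrument amounts to mapping each response label to a numeric value and summarizing per item. The labels and responses below are illustrative assumptions, not the study's actual instrument or data.

```python
# Hypothetical sketch: scoring 5-point Likert responses numerically.
# The response labels and data are assumptions, not the study's instrument.
SCALE = {"Strongly disagree": 1, "Disagree": 2, "Neutral": 3, "Agree": 4, "Strongly agree": 5}

responses = {  # item -> one response per participant (illustrative)
    "q8": ["Disagree", "Neutral", "Agree", "Neutral", "Disagree"],
    "q9": ["Agree", "Agree", "Strongly agree", "Neutral", "Agree"],
}

# Per-item mean score across participants.
item_means = {item: sum(SCALE[r] for r in rs) / len(rs) for item, rs in responses.items()}
print(item_means)
```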
Validity and Reliability
The surveys measured the participants’ knowledge and perceptions about text
complexity, and were validated to ensure accuracy and reliability. Validity is the extent to which
accurate conclusions can be drawn from the measurement data (Creswell, 2003; Meier &
Brudney, 2002). To that end, the study instrument must bring to life the researcher’s mental
model and also measure its intended concept. Reliability is best understood as an instrument’s
capacity to measure a construct consistently with each administration (Trochim, 2006).
The validity and reliability constructs of the survey were supported in two areas. First, to
ensure the internal validity of the survey, three education professors who were familiar with PD
and statistical procedures reviewed the survey instrument in an effort to reduce researcher bias in
the wording of the questions. Additionally, six teachers (who did not participate in the study)
field-tested the survey questions to test its usability. Second, to ensure the reliability of the
survey instrument the survey was standardized, with all of the respondents receiving the same
surveys with the same questions. The principal purpose of creating the standardized instrument
used in the study was to measure the impact of PD on the participants’ knowledge and perception
of text complexity. To further measure the reliability of the survey, future researchers would
need to yield similar results using the exact instrument and procedures as the initial researcher
(S. Borg, 2003; Messick, 1989). The process of ensuring validity and reliability led to the
rewording of a few survey questionnaire items and added focus and clarity to the survey. The
process of refining is critical to researchers developing a clearly worded survey instrument
(Baumann & Bason, 2011).
Professional Development
Following the pre-survey, the participants attended a 2.5-hour PD presented by the
researcher. The learning expectations for the participants in the PD included: (a) to learn the
elements of text complexity in the context of the CCSS; (b) to learn how to employ a structured
process of evaluating text complexity and choosing complex texts for instruction; and, (c) to
understand the importance of providing access to all learners, regardless of ability level. The PD
experience included a PowerPoint presentation, as well as hands-on experience with the
quantitative, qualitative, and reader and task components of text complexity. The PD concluded
with a post-survey and group discussion.
Triangulation
The mixed methods quasi-experimental design in this study provided both quantitative
and qualitative data through a pre-survey, post-survey, PD, and participant discussion. The
analysis of quantitative findings from the surveys, and the qualitative findings from the
discussion notes were triangulated to increase the confidence in the findings, “creating
innovative ways of understanding a phenomenon, revealing unique findings, challenging or
integrating theories, and providing a clearer understanding of the problem” (Thurmond, 2001, p.
254). O’Donoghue and Punch (2003) added that triangulation is a “method of cross-checking
data from multiple sources to search for regularities in the research data” (p. 78).
Throughout the study, the participants served as multiple sources of data. Analysis of the
pre- and post-surveys provided the opportunity to understand the responses from participants
with similar and differing grade-level perspectives on text complexity implementation. Overall,
the participants in this study shared various levels of personal knowledge and perceptions about
text complexity, both in how they explicitly understand the components of text complexity and
also in their personal experiences in implementing text complexity in their reading instruction.
Data Analysis
Marshall and Rossman (2011) asserted that:
Data analysis is the process of bringing order, structure and meaning to the mass of
collected data. It is a messy, ambiguous, time consuming, creative, and fascinating
process. It does not proceed in a linear fashion; it is not neat. Qualitative data analysis is a
search for general statements about relationships among categories of data. (p. 111)
This research was conducted through the framework of Guskey and Sparks’s (2002) PD model,
which validates the emergence of teachers’ understanding of text complexity as a result of the
PD treatment. Guskey and Sparks’s PD model as a framework for analyzing data provided a
means for observing emergent patterns in the participants’ knowledge around text complexity, in
addition to exploring and generating a theory regarding the relationship between PD and
improvements in student learning (Creswell, 2009; Guskey & Sparks, 2002; Patton, 2002).
Ethical Considerations
Prior to beginning this research, permission was secured from the IRB of USC. All of the
participants in the study were assured of the following considerations: informed consent,
anonymity, voluntary participation, confidentiality of all of the data generated and disseminated
from this research, the potential benefits of the study, and that their privacy would be respected.
A copy of the consent form is included in Appendix F.
Limitations
Every effort was made to minimize the potential limitations of this study; however, there
were factors beyond the researcher’s control. This section identifies and describes these factors,
as well as highlights alternative methods and how these methods could minimize possible
limitations in further research. One critical limitation in this study was the sample size. The
researcher selected purposive sampling due to the limited number of respondents to the invitation
to participate in the study. Although the 11 participants may not be considered representative of
the larger group to whom the results are statistically generalized, purposive sampling allows for
carefully made logical generalizations. The researcher was motivated by the potential to
contribute to a cumulative body of knowledge around text complexity through rigorous
systematic inquiry and through the legitimacy of the findings: findings that may be further
authenticated through continued research by others (Efron & Ravid, 2020; McNiff & Whitehead,
2010).
Chapter Summary
This study was designed using a mixed methods approach adopting a quasi-experimental,
pre-test/post-test design with no control group. The sample was gathered through non-probability
purposive sampling. The participants were selected through homogeneous and typical case
sampling. The target population included all kindergarten through eighth grade teachers, as well
as the education services department in RUSD.
The researcher conducted the study collecting data using a pre-/post-survey, combined
with a PD activity. A culminating discussion with the participants during the PD was followed
by the post-survey at the end of the PD. Information from the participants during the culminating
discussion provided the corpus of the study’s qualitative data. Validity for the surveys was
established through the guidance of educational experts and a field test. Trends were identified in
the data and these trends were then triangulated with the study questions and the components of
text complexity research.
Chapter Four: Results and Findings
The purpose of this study was to examine teachers’ conceptualizations of literacy
instruction after implementing the CCR Anchor Reading Standard 10 into their reading
curriculum construction, as well as to gain an understanding of how teacher PD could more
effectively prepare teachers with the necessary tools to interpret and utilize the trifold text
complexity formula, which expects students to “read and comprehend complex literary and
informational texts independently and proficiently” (NGA & CCSSO, 2010a, p. 8). This chapter
presents the findings related to the study’s research questions to gain clarity on the influence of
PD on teachers’ instructional practices in teaching students the skills to navigate complex texts
following the implementation of CCR Anchor Reading Standard 10.
Statement of the Problem
Every teacher faces the essential challenge of preparing students for the literacy demands
outlined in the CCSS. The responsibility for literacy development of today’s K-12 students tests
the belief that literacy is solely the responsibility of the language arts teacher. Under the CCSS,
the demand for literacy achievement places the responsibility of reading instruction on all
teachers. This responsibility is challenged by teachers’ perceptions regarding their ability to
navigate the selection of increasingly rigorous literary and informational texts without the
knowledge, skills, and strategies that a targeted PD in text complexity would provide. Preparing
students to interact with complex text is a critical component of the CCR Anchor Reading
Standard 10, yet after a decade of CCSS, the availability of targeted PD to advance teachers’
skills in the implementation of text complexity is nearly nonexistent. For that reason, the current
levels of text complexity implementation are a result of teachers’ confusion on how to address
student comprehension skills while also trying to engage students with more rigorous text.
Given teachers’ challenges in perceived skill level, preparedness, and access to PD, the
teacher’s role in the successful implementation of CCR Anchor Reading Standard
10 should not be underestimated. Targeted PD is critical to increasing teachers’ competence in
navigating the dimensions of text complexity and to equipping teachers with the strategies they
need to increase students’ ability to interact with complex text (Miller, 2012). District leaders are
instrumental in making the right kind of PD decisions for teachers. This point is highlighted by
Fisher, Frey, and Nelson (2012b), who asserted that educational leaders influence the
implementation of PD opportunities that develop teachers’ skills in targeting the dimensions of
text complexity, so that teachers can use those dimensions to determine the kinds
of texts they should be using in reading instruction.
Research Questions
In this chapter, the researcher presents the results from surveys of educators in the
kindergarten through eighth-grade setting, as well as leadership from the district office. The
surveys highlighted the participants’ perceived skill level, as well as their level of preparedness
in text complexity. Additionally, the surveys examined the participants’ ability to implement the
quantitative dimensions, qualitative measures, and reader and task considerations of text
complexity. The researcher framed the results and data analysis around the effect of PD on the
participants’ knowledge of and ability to implement text complexity utilizing the following
research questions:
1. What are the perceived knowledge and skill levels of teachers in text complexity who
have not experienced prior professional development?
2. What is the perceived level of preparedness of teachers in text complexity who have
not experienced prior professional development?
3. To what degree did the teachers’ knowledge and skill levels in text complexity
change after engaging in professional development?
4. To what degree did the teachers’ preparedness in text complexity change after
engaging in professional development?
Participants
Eleven instructional staff members participated in this mixed methods research study.
The respondents had varying levels of understanding and training in the implementation of text
complexity and Standard 10 of the CCSS. The participants volunteered for the study, and all of
the participants were based in a single suburban school district in Southern California. The
participants’ experience in education ranged from fewer than 5 years to more than 30 years. The participants’ years
of experience were gained in the kindergarten through eighth grade setting, and in various
district leadership roles. Table 8 provides the participants’ demographic data.
The sample population included 11 participants (91% female, 9% male), with three
kindergarten through second-grade teachers, three third-grade through fifth-grade teachers, two
sixth-grade through eighth-grade teachers, three CSRTs, one ELDRT, and a kindergarten
through fifth-grade Curriculum and Instruction Director. The median age range of the
participants was 41-50 years. The participants’ median years of service in education was 16-20
years.
The district employs 1,185 teachers with an average of 13 years of experience in
teaching. The pupil-teacher ratio is 21.9 to 1. Five hundred eighty-three educators were invited
to participate in the study and 18 educators volunteered to participate. From this group,
purposive sampling was used to ensure participants met the criteria for participating. Purposive
sampling was applied to focus on particular characteristics in the population
of interest and to draw more specific generalizations from the rich evidence produced by the small number
of participants.
Table 8
Participant Demographics
Variable n %
Gender
Male 1 9%
Female 10 91%
Age range
21-30 years old 1 9%
31-40 years old 2 18%
41-50 years old 2 18%
51-60 years old 5 45%
60+ years old 1 9%
Years teaching prior to this year
0-5 years 1 9%
6-10 years 3 27%
11-15 years 1 9%
16-20 years 1 9%
21-25 years 3 27%
26-30 years 1 9%
30+ years 1 9%
Grade level taught
K-2 3 27%
3-5 3 27%
6-8 2 18%
Instructional Coach/Special Assignment 2 18%
Other 1 9%
Additional degrees or certificates
K-12 Reading Certificate 2 18%
Master’s Degree in a Reading/Literacy-related field 2 18%
Education Specialist’s Degree in a Reading/Literacy-related field 2 18%
Factor Analysis Results
Components were extracted via principal component analysis (PCA), which is a
“dimensionality-reduction method that is used to…transform a large set of variables into a
smaller one that still contains most of the information in the large set” (Jaadi, 2020, para. 4). Five
components were extracted with the PCA using the traditional Kaiser criterion, which specifies
that only components with an eigenvalue of 1.0 or greater should be retained for analysis
(Kaiser, 1960). These five components with an eigenvalue above 1.0 explained 85% of the total
variance in the scores on the pre- and post-treatment surveys (see Table 9).
Table 9
Total Variance Explained by Each Component
                Initial Eigenvalues                 Rotation Sums of Squared Loadings
Component   Total     % of Variance  Cumulative %   Total    % of Variance  Cumulative %
1          12.307        53.507         53.507      8.444       36.711         36.711
2           3.658        15.906         69.413      5.118       22.252         58.964
3           1.473         6.405         75.819      2.694       11.715         70.678
4           1.210         5.263         81.081      1.837        7.987         78.665
5           1.012         4.402         85.483      1.568        6.818         85.483
6            .691         3.005         88.487
7            .600         2.610         91.098
8            .437         1.902         93.000
9            .393         1.709         94.709
10           .284         1.234         95.943
11           .251         1.090         97.033
12           .209          .909         97.942
13           .154          .671         98.613
14           .126          .546         99.159
15           .067          .291         99.450
16           .049          .213         99.662
17           .036          .156         99.818
18           .023          .101         99.919
19           .012          .051         99.970
20           .005          .023         99.993
21           .002          .007        100.000
22           .000          .000        100.000
23           .000          .000        100.000
Note. Components 1-5 (eigenvalues greater than 1.0) were those extracted for the purposes of this analysis.
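The Kaiser retention rule described above can be sketched in a few lines: compute the eigenvalues of the items' correlation matrix and keep only components whose eigenvalue exceeds 1.0. The data below are synthetic placeholders, not the study's survey responses.

```python
# Sketch of the Kaiser criterion: retain components whose eigenvalue of the
# correlation matrix exceeds 1.0. The data are random placeholders, not the
# study's survey responses.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))            # 100 respondents x 8 survey items (synthetic)
R = np.corrcoef(X, rowvar=False)         # item correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]   # eigenvalues, descending

retained = eigvals[eigvals > 1.0]        # Kaiser criterion: eigenvalue > 1.0
explained = eigvals / eigvals.sum()      # proportion of total variance per component
print(len(retained), explained[: len(retained)].sum())
```

Because the eigenvalues of a correlation matrix sum to the number of items, each eigenvalue divided by that sum gives the proportion of variance a component explains, which is how the 85% figure for the five retained components in Table 9 is obtained.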
Kaiser’s criterion is the default retention criterion for SPSS (Hayton, Allen, & Scarpello,
2004). Conway and Huffcutt (2003) asserted that Kaiser’s criterion is the most highly utilized
factor analysis method for identifying the number of components and providing the most
interpretable solution when conducting a factor analysis. The five components with eigenvalues
greater than 1.0 were rotated using the varimax method to generate the orthogonal solution shown in
Table 10. Varimax is the most widely used method for producing an orthogonally rotated
matrix (Comrey & Lee, 1992; Stevens, 2009; Tabachnick & Fidell, 2007). Using the method
preferred by Tabachnick and Fidell (1996a) of identifying the point in the scree plot where the
points change slope, three components explaining 71% of the variance of the PCA were
ultimately chosen and reported here.
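Varimax rotation can be sketched directly; the following is a minimal numpy implementation of the standard SVD-based varimax algorithm, offered as an illustration rather than the exact SPSS routine used in the study.

```python
# Minimal varimax rotation (orthogonal), following the standard SVD-based
# algorithm; an illustrative sketch, not the SPSS routine used in the study.
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    p, k = loadings.shape
    R = np.eye(k)          # accumulated rotation matrix (orthogonal)
    d = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        # SVD of the varimax criterion gradient gives the next rotation.
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - (gamma / p) * L @ np.diag((L**2).sum(axis=0)))
        )
        R = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):   # criterion no longer improving
            break
        d = d_new
    return loadings @ R
```

Because the rotation is orthogonal, each variable's communality (the row sum of squared loadings) is unchanged, which is why the cumulative variance explained by the five retained components in Table 9 is identical before and after rotation (85.483%).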
Table 10
Component Loadings > 0.32
Item on Pre- and Post-Treatment Survey                                                      Component
                                                                                     1      2      3      4      5
q8  Perceived knowledge of Quantitative Measures of Text Complexity                .855    --    .353    --     --
q9  Perceived knowledge of Qualitative Measures of Text Complexity                 .856    --     --     --     --
q10 Perceived knowledge of Reader and Task Measures of Complexity                  .845    --    .408    --     --
q11 Preparation to implement a variety of reading comprehension strategies in
    your reading instruction                                                       .684    --     --     --    .459
q12 Preparation to provide differentiated instruction when students are unclear
    about your reading instruction                                                  --     --    .628   .526    --
q13 Preparation to assess student comprehension (check for understanding) of the
    lessons in your reading instruction                                            .326   .328    --     --    .728
q14 Preparation to provide strategies for your students to learn how to interact
    with complex texts                                                             .874    --     --     --     --
q15 Preparation to assist students in believing they can do well in reading         --    .530   .670    --     --
q16 Preparation to motivate students who demonstrate a low interest in reading      --     --    .835    --     --
q17 Preparation to improve the reading skills of students who are not working to
    the expectations in the Reading Standards                                       --    .634   .525    --     --
q18 Preparation to help your students think critically about the texts they are
    reading                                                                        .425   .652    --   -.327    --
q19 Extent that you understand conventional readability formulas used to rate
    text difficulty                                                                .887    --     --     --     --
q20 Extent that you are able to use apps such as Book Wizard, or websites such
    as lexile.com, as resources to provide readability formulas                    .711    --     --    .341    --
q21 Extent that you understand the qualitative dimensions and factors of text      .850    --     --     --     --
q22 Extent that you are able to do qualitative reviews of student text             .927    --     --     --     --
q23 Extent that you understand the element of matching the reader and task in
    consideration of variables specific to particular readers and particular
    tasks                                                                          .725   .359    --     --     --
q24 Extent that you are able to match the reader and task in consideration of
    variables specific to particular readers and particular tasks                  .598   .447    --     --    .388
q25 Belief that professional development improves teaching practice                 --    .857    --     --     --
q26 Belief that professional development increases student achievement              --    .904    --     --     --
q27 Belief that professional development often helps teachers develop new
    teaching strategies                                                             --    .917    --     --     --
q28 Belief that teacher has been enriched by the professional development events
    I have attended                                                                 --    .415    --    .826    --
q29 Attended professional development focused on Text Complexity                   .705    --     --    .565    --
q30 Belief that teacher reading instruction would improve by attending a
    professional development focused on Text Complexity                            .462   .699    --     --     --
Communalities and Number of Components Extracted
Loadings of .30 or greater are generally accepted as having interpretive value
(Comrey & Lee, 1992). A loading is simply the correlation between the variable and the
extracted component (Stevens, 2009). Although there is no conclusive standard, the higher the
loading, the greater confidence the researcher can have that a strong relationship exists. Comrey
and Lee (1992) developed the most cited guideline, which is 0.71 = excellent, 0.63 = very good,
0.55 = good, 0.45 = fair and 0.32 = poor. The greater the loading, the more the variable is a pure
measure of the component. When there are several variables with loadings in the very good to
excellent range, the researcher has a basis on which more definitive conclusions about the
component can be made (Tabachnick & Fidell, 2001).
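The Comrey and Lee benchmarks above amount to a simple threshold lookup. The sketch below (the function name is hypothetical, not part of the study's analysis) labels a loading's strength, treating negative loadings by magnitude since the sign indicates direction rather than strength:

```python
def loading_quality(loading):
    """Classify a component loading per Comrey and Lee's (1992) guideline."""
    magnitude = abs(loading)  # sign shows direction of the relationship, not strength
    for threshold, label in [(0.71, "excellent"), (0.63, "very good"),
                             (0.55, "good"), (0.45, "fair"), (0.32, "poor")]:
        if magnitude >= threshold:
            return label
    return "below interpretive value"  # under the commonly cited .32 cutoff

print(loading_quality(0.887))   # q19's Component 1 loading; prints: excellent
print(loading_quality(-0.327))  # q18's Component 4 loading; prints: poor
```

Applied to Table 12, for example, q22's loading of .927 falls in the excellent range, while q24's .598 falls in the good range.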
Table 11 shows the communalities for the items in the pre- and post-treatment surveys. The communalities “indicate the variation in a variable that overlaps the variance in the factors” and can be “inspected to see if the variables are well defined by the solution” (Tabachnick & Fidell, 1996a, p. 692). In general, the higher the communality, the better the factor solution, with communalities ranging from 0.6 to 0.8 considered high. The desired mean level of communality is at least .70, and communalities should not vary over a wide range (MacCallum, Widaman, Zhang, & Hong, 1999). In this study, communalities ranged from 0.684 to 0.938 with an average communality of 0.855, indicating that the items are a good fit for the solution and can be considered reliable.
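An item's extracted communality is the sum of its squared loadings across the retained components. A minimal sketch (the example loadings are illustrative, not the study's full loading rows, since Tables 12-14 report only loadings above .32):

```python
def communality(loadings):
    """Communality = sum of squared loadings across all retained components."""
    return sum(l ** 2 for l in loadings)

# Hypothetical item loading on two components at .80 and .45:
print(round(communality([0.80, 0.45]), 4))  # prints: 0.8425
```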
Component 1, shown in Table 12, contains 12 variables, or 52% of the total variables included in the study. Eleven of these variables loaded in the very good to excellent range; one loaded in the good range. Six variables, representing 26% of the total variables, loaded at .32 or higher on component 2, all of them in the good to excellent range.
Table 11
Communalities of the Principal Components Analysis
Item on Pre and Post-Treatment Survey
Communalities
Initial Extraction
q8.1 Perceived knowledge of Quantitative Measures of Text Complexity 1 0.890
q8.2 Perceived knowledge of Qualitative Measures of Text Complexity 1 0.887
q8.3 Perceived knowledge of Reader and Task Measures of Complexity 1 0.911
q11 Preparation to implement a variety of reading comprehension strategies in your reading instruction 1 0.806
q13 Preparation to provide differentiated instruction when students are unclear about your reading instruction 1 0.819
q19 Preparation to assess student comprehension (check for understanding) of the lessons in your reading instruction 1 0.797
q21 Preparation to provide strategies for your students to learn how to interact with complex texts 1 0.820
q23 Preparation to assist students in believing they can do well in reading 1 0.844
q25 Preparation to motivate students who demonstrate a low interest in reading 1 0.905
q27 Preparation to improve the reading skills of students who are not working to the expectations in the Reading Standards 1 0.872
q29 Preparation to help your students think critically about the texts they are reading 1 0.814
q33.1 Extent that you understand conventional readability formulas used to rate text difficulty 1 0.901
q33.2 Extent that you are able to use apps such as Book Wizard, or websites such as lexile.com as resources to provide readability formulas 1 0.684
q33.4 Extent that you understand the qualitative dimensions and factors of text 1 0.925
q33.5 Extent that you are able to do qualitative reviews of student text 1 0.938
q33.7 Extent that you understand the element of matching the reader and task in consideration of variables specific to particular readers and particular tasks 1 0.801
q33.8 Extent that you are able to match the reader and task in consideration of variables specific to particular readers and particular tasks 1 0.796
q34.1 Belief that professional development improves teaching practice 1 0.895
q34.2 Belief that professional development increases student achievement 1 0.868
q34.3 Belief that professional development often helps teachers develop new teaching strategies 1 0.884
q34.4 Belief that teacher has been enriched by the professional development events I have attended 1 0.920
q34.5 Attendance in professional development focused on Text Complexity 1 0.854
q34.6 Belief that teacher reading instruction would improve by attending a professional development focused on Text Complexity 1 0.829
Table 12
Text Complexity Preparedness Scale: Component Loadings > 0.32
Text Complexity Preparedness Survey Item Component 1 Loading
q8.1 Perceived knowledge of Quantitative Measures of Text Complexity .855
q8.2 Perceived knowledge of Qualitative Measures of Text Complexity .856
q8.3 Perceived knowledge of Reader and Task Measures of Complexity .845
q11 Preparation to implement a variety of reading comprehension strategies in your reading instruction .684
q14 Preparation to provide strategies for your students to learn how to interact with complex texts .874
q19 Extent that you understand conventional readability formulas used to rate text difficulty .887
q20 Extent that you are able to use apps such as Book Wizard, or websites such as lexile.com as resources to provide readability formulas .711
q21 Extent that you understand the qualitative dimensions and factors of text .850
q22 Extent that you are able to do qualitative reviews of student text .927
q23 Extent that you understand the element of matching the reader and task in consideration of variables specific to particular readers and particular tasks .725
q24 Extent that you are able to match the reader and task in consideration of variables specific to particular readers and particular tasks .598
q29 Attended professional development focused on Text Complexity .705
Table 13
Professional Development and Other Scale: Component Loadings >0.32
Professional Development and Other Preparedness Survey Item Component 2 Loading
q17 Preparation to improve the reading skills of students who are not working to the expectations in the Reading Standards .634
q18 Preparation to help your students think critically about the texts they are reading .652
q25 Belief that professional development improves teaching practice .857
q26 Belief that professional development increases student achievement .904
q27 Belief that professional development often helps teachers develop new teaching strategies .917
q30 Belief that teacher reading instruction would improve by attending a professional development focused on Text Complexity .699
In component 3, shown in Table 14, there are three loadings, representing 13% of the total variables, all in the good to excellent range. Components 4 and 5 had a combined three loadings. Caution is advised when researchers interpret components with only a few loading variables (Nunnally & Bernstein, 1994; Tabachnick & Fidell, 2001; Stevens, 2009). Stevens (2009) suggested that components with at least four loadings greater than .60, or at least three loadings greater than .80, can be considered reliable. In consideration of these guidelines, components 3-5 were deemed unreliable and are not included in further analysis.
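Stevens's rule of thumb can be written as a simple check; the sketch below (function name hypothetical) applies it to illustrative loading sets:

```python
def component_reliable(loadings):
    """Stevens (2009): a component is considered reliable if it has at
    least four loadings > .60 or at least three loadings > .80."""
    strong = sum(1 for l in loadings if abs(l) > 0.60)
    very_strong = sum(1 for l in loadings if abs(l) > 0.80)
    return strong >= 4 or very_strong >= 3

print(component_reliable([0.85, 0.83, 0.81]))  # prints: True (three loadings > .80)
print(component_reliable([0.63, 0.67, 0.73]))  # prints: False (only three > .60)
```

By this check, components 3-5, with at most three loadings each and not all above .80, fall short of the guideline, consistent with the decision to exclude them.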
Table 14
Additional Items: Component Loadings > 0.32
Other Components (Loadings on Components 3, 4, 5)
q12 Preparation to provide differentiated instruction when students are unclear about your reading instruction .628 .526 --
q15 Preparation to assist students in believing they can do well in reading .670 -- --
q16 Preparation to motivate students who demonstrate a low interest in reading .835 -- --
q28 Belief that teacher has been enriched by the professional development events I have attended -- .826 --
q13 Preparation to assess student comprehension (check for understanding) of the lessons in your reading instruction -- -- .728
T-Test
As a follow-up to the principal component analysis, the researcher constructed a scale to measure pre- and post-treatment differences in teachers’ knowledge of and perceived ability to implement text complexity in the classroom. After combining all pre- and post-treatment responses and conducting a scale reliability analysis, it was determined that the scale had high internal consistency (11 items; α = 0.961). A commonly accepted rule for describing internal consistency using Cronbach’s alpha is as follows (DeVellis, 1991; see Table 15).
Table 15
Cronbach’s Alpha and Internal Consistency
Cronbach’s alpha Internal consistency
0.9 ≤ α Excellent
0.8 ≤ α < 0.9 Good
0.7 ≤ α < 0.8 Acceptable
0.6 ≤ α < 0.7 Questionable
0.5 ≤ α < 0.6 Poor
α < 0.5 Unacceptable
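Cronbach's alpha is computed from the item variances and the variance of respondents' total scores. A minimal sketch (illustrative helper, not the study's actual analysis script) using sample variances:

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of per-item score lists, each of length n respondents.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    k = len(item_scores)
    totals = [sum(person) for person in zip(*item_scores)]  # each respondent's total
    item_var_sum = sum(variance(item) for item in item_scores)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

# Two perfectly consistent items yield the maximum alpha of 1.0:
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))  # prints: 1.0
```

The study's scale, with 11 items and α = 0.961, falls in the excellent band of Table 15.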
Results showed a significant difference such that post-treatment scores (M = 3.84, SD = 0.64) were significantly higher than pre-treatment scores (M = 2.77, SD = 1.04), t(20) = 2.906, p = 0.0087. Descriptive statistics showed that post-treatment individual-item and overall scale mean ratings were higher than pre-treatment ratings. In addition to the descriptive analyses, the researcher wanted to determine the effectiveness of the PD using a holistic measure of text complexity. As such, all areas of the administered surveys were analyzed to determine the best components with which to measure text complexity. The result was a highly reliable measure of text complexity that, when treated as a scale, strongly supported the efficacy of the PD in improving the participants’ perceived knowledge (see Table 16).
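The reported degrees of freedom, t(20), are consistent with a two-sample t statistic on two groups of 11 (df = n1 + n2 - 2 = 20). A pooled two-sample t statistic can be sketched as follows (illustrative helper with made-up data, not the study's analysis script):

```python
from math import sqrt
from statistics import mean, variance

def pooled_t(group1, group2):
    """Two-sample t statistic with pooled variance; df = n1 + n2 - 2."""
    n1, n2 = len(group1), len(group2)
    # Pooled variance weights each group's sample variance by its df
    sp2 = ((n1 - 1) * variance(group1) + (n2 - 1) * variance(group2)) / (n1 + n2 - 2)
    return (mean(group1) - mean(group2)) / sqrt(sp2 * (1 / n1 + 1 / n2))

# Toy data; for two groups of 11 the degrees of freedom would be 11 + 11 - 2 = 20
print(round(pooled_t([2.0, 3.0, 4.0], [1.0, 2.0, 3.0]), 4))  # prints: 1.2247
```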
Table 16
Descriptive Statistics of Pre- and Post-Treatment: Text Complexity Preparedness
Pre-Treatment Post-Treatment
Text Complexity Preparedness Survey Item N M SD N M SD MD
q8.1 Perceived knowledge of Quantitative Measures of Text Complexity 11 2.64 0.92 11 3.55 0.52 0.91
q8.2 Perceived knowledge of Qualitative Measures of Text Complexity 11 2.64 0.92 11 3.64 0.67 1.00
q8.3 Perceived knowledge of Reader and Task Measures of Complexity 11 2.64 0.92 11 3.36 0.51 0.72
q11 Preparation to implement a variety of reading comprehension strategies in your reading instruction 11 3.64 0.67 11 4.00 0.45 0.36
q14 Preparation to provide strategies for your students to learn how to interact with complex texts 11 3.18 0.87 11 3.91 0.70 0.73
q19 Extent that you understand conventional readability formulas used to rate text difficulty 11 2.55 1.21 11 3.73 0.65 1.18
q20 Extent that you are able to use apps such as Book Wizard, or websites such as lexile.com as resources to provide readability formulas 11 2.91 1.22 11 3.91 0.54 1.00
q21 Extent that you understand the qualitative dimensions and factors of text 11 2.36 1.29 11 4.09 0.70 1.73
q22 Extent that you are able to do qualitative reviews of student text 11 2.18 1.33 11 4.00 0.89 1.82
q23 Extent that you understand the element of matching the reader and task in consideration of variables specific to particular readers and particular tasks 11 3.00 1.00 11 4.18 0.75 1.18
q24 Extent that you are able to match the reader and task in consideration of variables specific to particular readers and particular tasks 11 2.91 1.04 11 3.82 0.60 0.91
Overall Text Complexity Preparedness Scale 11 2.77 1.04 11 3.84 0.64 1.07
Findings
Results: Research Question One
Research question 1 asked, What is the perceived knowledge and skill level of teachers in
text complexity who have not experienced prior professional development? Descriptive statistics,
including means and standard deviation for the independent variables, are summarized in Table
17. The participants’ knowledge of quantitative, qualitative, and reader and task measures of text
complexity variables share a pre-treatment mean of 2.64.
Table 17
Descriptive Statistics of Pre-Treatment: Self-perceived Knowledge of Text Complexity Measures
Text Complexity Preparedness Survey Item
Pre-Treatment
N M SD
q8.1 Perceived knowledge of Quantitative Measures of Text Complexity 11 2.64 0.92
q8.2 Perceived knowledge of Qualitative Measures of Text Complexity 11 2.64 0.92
q8.3 Perceived knowledge of Reader and Task Measures of Complexity 11 2.64 0.92
Descriptive statistics, including means and standard deviation for the independent
variables, are summarized in Table 18. The participants’ understanding of conventional
readability formulas (quantitative measure of text complexity) had a pre-treatment mean of 2.55.
The participants’ understanding of the qualitative dimensions and factors of text complexity had
a pre-treatment mean of 2.36. The participants’ understanding of the element of matching reader
with task had a pre-treatment mean of 3.00.
Table 18
Descriptive Statistics of Pre-Treatment: Self-perceived Knowledge of Text Complexity Measures
Text Complexity Preparedness Survey Item
Pre-Treatment
N M SD
q19 Extent that you understand conventional readability formulas used to rate text difficulty 11 2.55 1.21
q21 Extent that you understand the qualitative dimensions and factors of text 11 2.36 1.29
q23 Extent that you understand the element of matching the reader and task in consideration of variables specific to particular readers and particular tasks 11 3.00 1.00
As shown in Figure 8, the participants’ self-perceived pre-treatment knowledge of the
quantitative, qualitative, and reader and task measures of text complexity fell within a very
limited range.
Figure 8. Pre-treatment mean of perceived knowledge and skills in text complexity.
Results: Research Question Two
Research question 2 asked, What is the perceived level of preparedness of teachers in text
complexity who have not experienced prior professional development? Descriptive statistics,
including means and standard deviation for the independent variables, are summarized in Table
19. The participants’ perceived knowledge of the quantitative, qualitative, and reader and task
measures of text complexity increased from a pre-treatment mean of 2.64 to a post-treatment
mean of 3.55 (quantitative), 3.64 (qualitative), and 3.36 (reader and task).
Table 19
Descriptive Statistics of Pre- and Post-Treatment: Knowledge of Text Complexity Measures
Text Complexity Preparedness Survey Item
Pre-Treatment Post-Treatment
N M SD N M SD MD
q8.1 Perceived knowledge of Quantitative Measures of Text Complexity 11 2.64 0.92 11 3.55 0.52 0.91
q8.2 Perceived knowledge of Qualitative Measures of Text Complexity 11 2.64 0.92 11 3.64 0.67 1.00
q8.3 Perceived knowledge of Reader and Task Measures of Complexity 11 2.64 0.92 11 3.36 0.51 0.72
Descriptive statistics, including means and standard deviation for the independent
variables, are summarized in Table 20. The participants’ understanding of conventional
readability formulas used to rate text difficulty increased from a pre-treatment mean of 2.55 to a
post-treatment mean of 3.73. The participants’ understanding of the qualitative dimensions and
factors of text increased from a pre-treatment mean of 2.36 to a post-treatment mean of 4.09. The
participants’ understanding of the element of matching the reader and task increased from a pre-
treatment mean of 3.00 to a post-treatment mean of 4.18.
Table 20
Descriptive Statistics Pre- and Post-Treatment: Perceived Knowledge of Text Complexity
Measures
Text Complexity Preparedness Survey Item
Pre-Treatment Post-Treatment
N M SD N M SD MD
q19 Extent that you understand conventional readability formulas used to rate text difficulty 11 2.55 1.21 11 3.73 0.65 1.18
q21 Extent that you understand the qualitative dimensions and factors of text 11 2.36 1.29 11 4.09 0.70 1.73
q23 Extent that you understand the element of matching the reader and task in consideration of variables specific to particular readers and particular tasks 11 3.00 1.00 11 4.18 0.75 1.18
As shown in Figure 9, the participants’ perceived knowledge of the quantitative, qualitative, and reader and task measures of text complexity increased from a pre-treatment mean of 2.64 to post-treatment means of 3.55 (quantitative), 3.64 (qualitative), and 3.36 (reader and task). The respective increases (0.91, 1.00, and 0.72) moved the participants’ perceived knowledge of the three dimensions of text complexity from the higher end of a very limited knowledge to the mid-range of a working knowledge.
Figure 9. Comparison of pre- and post-treatment means.
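The knowledge ranges referenced above correspond to the survey's five anchors (1 = No Knowledge through 5 = Expert Knowledge). A sketch (hypothetical helper, not part of the study's instruments) that reports the two anchors a mean rating falls between:

```python
ANCHORS = ["No Knowledge", "Very Limited Knowledge", "Working Knowledge",
           "Very Knowledgeable", "Expert Knowledge"]  # anchors 1 through 5

def anchor_band(mean_rating):
    """Return the pair of scale anchors bounding a mean rating on the 1-5 scale."""
    low = min(int(mean_rating), 4)  # clamp so a rating of 5.0 maps to the top band
    return ANCHORS[low - 1], ANCHORS[low]

# A pre-treatment mean of 2.64 sits between Very Limited Knowledge and Working Knowledge
print(anchor_band(2.64))
```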
Results: Research Question Three
Research question 3 asked, To what degree did the teachers’ knowledge and skill levels
in text complexity change after engaging in professional development? Descriptive statistics,
including means and standard deviation for the independent variables, are summarized in Table
21. The participants’ preparation to implement a variety of reading comprehension strategies
during reading instruction had a pre-treatment mean of 3.64. The participants’ preparation to
provide strategies for their students to learn how to interact with complex texts had a pre-
treatment mean of 3.18.
Table 21
Descriptive Statistics of Pre-Treatment: Preparedness in Reading Comprehension Strategies
Text Complexity Preparedness Survey Item
Pre-Treatment
N M SD
q11 Preparation to implement a variety of reading comprehension strategies in your reading instruction 11 3.64 0.67
q21 Preparation to provide strategies for your students to learn how to interact with complex texts 11 3.18 0.87
Descriptive statistics, including means and standard deviation for the independent
variables, are summarized in Table 22. The participants’ ability to use apps such as Book
Wizard, or websites such as lexile.com as resources to provide readability formulas had a pre-
treatment mean of 2.91. The participants’ ability to do qualitative reviews of student text had a
pre-treatment mean of 2.18. The participants’ ability to match the reader and task in
consideration of variables specific to particular readers and particular tasks had a pre-treatment
mean of 2.91.
Table 22
Descriptive Statistics of Pre-Treatment: Preparedness in Text Complexity Measures
Text Complexity Preparedness Survey Item
Pre-Treatment
N M SD
q20 Extent that you are able to use apps such as Book Wizard, or websites such as lexile.com as resources to provide readability formulas 11 2.91 1.22
q22 Extent that you are able to do qualitative reviews of student text 11 2.18 1.33
q24 Extent that you are able to match the reader and task in consideration of variables specific to particular readers and particular tasks 11 2.91 1.04
As shown in Table 22, the participants’ belief in their ability to do qualitative reviews of student text had a pre-treatment mean of 2.18, which is slightly above the very limited knowledge range. However, the participants’ belief in their preparation to implement a variety of reading comprehension strategies during reading instruction had a pre-treatment mean of 3.64, which is in the mid-range between working knowledge and very knowledgeable.
What is not known about the participants prior to the treatment is whether or not they had
a clearly defined understanding of the CCSS meaning of the qualitative dimensions of text
complexity; the aspects of text complexity best measured or only measurable by an attentive
human reader, such as levels of meaning or purpose; structure; language conventionality and
clarity; and knowledge demands (NGA & CCSSO, 2010b).
A lack of knowledge of the qualitative dimensions of text may account for the participants’ pre-treatment mean of 2.18, slightly above the very limited knowledge range, in the extent of their ability to conduct qualitative reviews of student text. Whether or not the participants understood the CCSS qualitative dimensions of text complexity prior to the treatment, it is important to point out that the qualitative dimension of text complexity was a weakness, with ratings in the very limited range compared to the pre-treatment reading comprehension strategies mean of 3.64. In short, the participants did not earn low scores across all areas of their knowledge of text complexity; however, the qualitative factors were comparatively low.
Results: Research Question Four
Research question 4 asked, To what degree did the teachers’ preparedness in text complexity change after engaging in professional development? Descriptive statistics, including means and standard deviations for the independent variables, are summarized in Table 23. The participants’ preparedness to implement a variety of reading comprehension strategies in their reading instruction increased from a pre-treatment mean of 3.64 to 4.00. The participants’ preparedness to provide strategies for their students to learn how to interact with complex texts increased from a pre-treatment mean of 3.18 to 3.91.
Table 23
Descriptive Statistics of Pre- and Post-Treatment: Preparedness in Reading Strategies
Text Complexity Preparedness Survey Item
Pre-Treatment Post-Treatment
N M SD N M SD MD
q11 Preparation to implement a variety of reading comprehension strategies in your reading instruction 11 3.64 0.67 11 4.00 0.45 0.36
q14 Preparation to provide strategies for your students to learn how to interact with complex texts 11 3.18 0.87 11 3.91 0.70 0.73
As shown in Figure 10, the participants’ self-perceived pre-treatment understanding of conventional readability formulas, the qualitative dimensions of text, and the element of matching reader and task fell within a very limited knowledge range.
Figure 10. Pre-treatment mean of perceived level of preparedness in text complexity.
Descriptive statistics, including means and standard deviation for the independent
variables, are summarized in Table 24. The participants’ ability to use apps such as Book
Wizard, or websites such as lexile.com as resources to provide readability formulas increased
from a pre-treatment mean of 2.91 to a post-treatment mean of 3.91. The participants’ ability to
conduct qualitative reviews of student text increased from a pre-treatment mean of 2.18 to a post-
treatment mean of 4.00. The participants’ preparedness to match the reader and task in
consideration of variables specific to particular readers and particular tasks increased from a pre-
treatment mean of 2.91 to a post-treatment mean of 3.82.
Table 24
Descriptive Statistics of Pre- and Post-Treatment: Preparedness in Text Complexity Measures
Text Complexity Preparedness Survey Item
Pre-Treatment Post-Treatment
N M SD N M SD MD
q20 Extent that you are able to use apps such as Book Wizard, or websites such as lexile.com as resources to provide readability formulas 11 2.91 1.22 11 3.91 0.54 1.00
q22 Extent that you are able to do qualitative reviews of student text 11 2.18 1.33 11 4.00 0.89 1.82
q24 Extent that you are able to match the reader and task in consideration of variables specific to particular readers and particular tasks 11 2.91 1.04 11 3.82 0.60 0.91
As shown in Figure 11, the participants’ preparedness in implementing a variety of
reading comprehension strategies increased from a pre-treatment mean of 3.64 to a post-
treatment mean of 4.0. The mean difference increase of 0.36 moved the participants’
preparedness from a working knowledge to a very knowledgeable range. The participants’
preparedness in providing strategies for students to learn how to interact with complex text
increased from a pre-treatment mean of 3.18 to a post-treatment mean of 3.91. The mean
difference increase of 0.73 moved the participants’ preparedness from a lower working
knowledge to a higher working knowledge. The participants’ ability to use apps as resources to provide readability formulas increased from a pre-treatment mean of 2.91 to a post-treatment mean of 3.91. The mean difference increase of 1.00 moved the participants’ preparedness from the high end of the very limited knowledge range to the high end of the working knowledge range. The participants’ ability to conduct qualitative reviews of student text increased from a pre-treatment mean of 2.18 to a post-treatment mean of 4.00. The mean difference increase of 1.82 moved the participants’ preparedness from a very limited knowledge to a very knowledgeable range. The participants’ ability to match the reader and task in consideration of variables specific to particular readers and particular tasks increased from a pre-treatment mean of 2.91 to a post-treatment mean of 3.82. The mean difference increase of 0.91 moved the participants’ preparedness from the high end of the very limited knowledge range to the high end of the working knowledge range.
Figure 11. Comparison of pre- and post-treatment means.
Professional Development Discussion Notes
The PD discussion focused on defining text complexity and how the participants use text
complexity to evaluate text used in classroom instruction. See Table 25 for the participants’
responses.
Table 25
Participants’ Responses to Question: How Would You Define Text Complexity, and to What Extent
Is Text Complexity Considered When You Evaluate the Texts Used in Your Classroom Instruction?
Participant Responses
Text Complexity
Measure
#1 “I use text complexity to determine the level of texts my students read.”
“It is important to choose books the students are interested in.”
Quantitative
Reader and Task
#2 “Our school uses the STAR Reading Assessment, which addresses text complexity for
the teacher.”
“There isn’t a lot of time in the day to focus on text complexity, and every student is at a
different reading level. It is really difficult to manage.”
Quantitative
#3 “Text complexity is about how difficult the text is.”
“It is up to the teacher to make sure the students are reading at and a little above their
Lexile levels.”
Quantitative
#4 “I don’t really understand all that much about text complexity. I know it has a lot to do
with the reading levels of students.”
“I think text complexity is directed more toward English learners and their reading level.”
Quantitative
#5 “Quite honestly, I signed up for this PD because I know nothing about text complexity. I
know it is important because of the focus on text complexity in the standards, but I just
have not been trained.”
#6 “I know that text complexity is not just about Lexile. It has more to do with how we
match text with what students want to read. We need to consider reading level, but also
high interest books or they will not read.”
“I understand quantitative much more than I do the qualitative and reader and task areas.”
Reader and Task
#7 “I hate to say this, but middle school teachers are pressed for time, so I’m not sure we can
consider the text complexity level of every student. We see over one hundred students a
day.”
“Many of my students cannot read middle school level materials, so I have to make sure I
don’t give them text that is too difficult. If I do, they can’t do the assignments.”
#8 “Text complexity is tied to the standards, which is my understanding. So, we need to
teach reading, especially comprehension, as outlined in the standards.”
#9 “When I was in the classroom, I used Fountas and Pinnell to level my students. I focused
heavily on guided reading, but like some of you have said, I think student interest in what
they are reading is critical to get students to read.”
Quantitative
“I visit so many classrooms throughout the district, and I think the problem many
teachers have with text complexity is that so many students are lacking the foundational
skills in reading. Before we focus on text complexity, we need to make sure students can
read.”
#10 “If STAR Reading Assessment determines the reading level of students, does the teacher
need to consider text complexity? I know the assessment levels of the students so that
they can choose the AR books in their ZPD.”
Quantitative
#11 “I’m here to learn about text complexity. I wish I knew more, but I am fairly new to
teaching.”
Out of the 19 responses from the discussion, 32% of the responses indicated that the
participants’ understanding of text complexity is connected to students’ reading levels, which is
the quantitative measure of text complexity. Only 11% of the responses indicated an
understanding of the reader and task measure of text complexity. Zero percent of the responses
indicated an understanding of the qualitative measure of text complexity. Four of the responses,
21%, indicated that participants have little to no understanding of the quantitative, qualitative,
and reader and task measures of text complexity.
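As an arithmetic check of the percentages above, the rounding works out as reported, assuming per-category counts of 6, 2, 0, and 4 out of 19 responses (only the count of four is stated directly in the text; the others are inferred from the percentages):

```python
# Per-category response counts inferred from the reported percentages;
# only the count of 4 ("little to no understanding") is stated directly.
TOTAL_RESPONSES = 19

category_counts = {
    "quantitative (reading levels)": 6,
    "reader and task": 2,
    "qualitative": 0,
    "little to no understanding": 4,
}

# Round each share to the nearest whole percent, as the narrative does.
percentages = {
    label: round(100 * count / TOTAL_RESPONSES)
    for label, count in category_counts.items()
}
print(percentages)
# {'quantitative (reading levels)': 32, 'reader and task': 11,
#  'qualitative': 0, 'little to no understanding': 21}
```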
Summary of Findings
The PD treatment provided by the researcher in this study demonstrated how targeted PD
can increase participants’ belief in their knowledge of and capacity to implement the various text
complexity measures. The highest gains were noted in the post-treatment means in the
understanding of the qualitative measures of text complexity. The post-treatment means
demonstrated a 1.0 gain in the participants’ perceived knowledge of the qualitative measures of
text complexity. Finally, the post-treatment means demonstrated a 1.82 gain in the ability to
do qualitative reviews of student text.
Significant gains were noted in the post-treatment means in the reader and task measures
of text complexity. The post-treatment means demonstrated a 0.72 gain in the participants’
perceived knowledge of the reader and task measures of text complexity. The post-treatment
means also demonstrated a 1.24 gain in the ability to match the reader and task.
Slight gains were noted in the post-treatment means in the quantitative measures of text
complexity. The post-treatment means demonstrated a 0.91 gain in the participants’ perceived
knowledge of the quantitative measures of text complexity. The post-treatment means also
demonstrated a 1.0 gain in the ability to use apps as resources to provide readability formulas.
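The gains summarized above follow directly from the pre- and post-treatment means reported in the skill-level results (a limited-knowledge pre-PD mean of 2.64 for each measure). A minimal arithmetic sketch:

```python
# Pre/post means for perceived knowledge of each measure, taken from the
# skill-level results reported in Chapter Five of this study.
reported_means = {
    "quantitative": (2.64, 3.55),
    "qualitative": (2.64, 3.64),
    "reader and task": (2.64, 3.36),
}

# Gain for each measure is simply the post-treatment mean minus the
# pre-treatment mean, rounded to two decimals.
gains = {m: round(post - pre, 2) for m, (pre, post) in reported_means.items()}
print(gains)
# {'quantitative': 0.91, 'qualitative': 1.0, 'reader and task': 0.72}
```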
Chapter Five focuses on the findings of the study, examining the implications with
recommendations for further study.
Chapter Five: Discussion
This study examined teachers’ perceived skill level and their preparedness in determining
the level of challenge a text provides based on its quantitative and qualitative features, as well as
reader and task measures of text complexity before and after a PD experience focusing on text
complexity. Additionally, the study examined the outcomes of targeted teacher PD focused on
text complexity. This chapter provides a summary of the study, a review of the findings,
conclusions based on the findings, implications regarding the highlights from the research, and
suggestions for future research.
Statement of the Problem
Targeted PD can transform teaching and learning in the area of student reading when
teachers are empowered with the strategies they need to implement text complexity and CCR
Anchor Reading Standard 10 in the CCSS effectively. Darling-Hammond (2000) asserted that
the right PD has been demonstrated to fill the gap when teachers lack pedagogical knowledge or
skills in a specific area. However, almost a decade into the implementation of CCSS, little is
known and understood by teachers about text complexity and CCR Anchor Reading Standard 10
in the CCSS. This is a problem because the initial focus on text found in Appendix A of the
CCSS makes a research-based case for why the complexity of the text students read is critical to
their readiness for college and careers. By adding text complexity as a dimension of literacy,
CCSS strove to ensure that high school seniors will have the ability to read at college and career
levels.
Being able to read complex text independently and proficiently is essential for high
achievement in college and the workplace and important in numerous life tasks…In
particular, if students cannot read complex expository text to gain information, they
will likely turn to text-free and text-light sources, such as video, podcasts, and
tweets. These sources, while not without value, cannot capture the nuance, subtlety,
depth, or breadth of ideas developed through complex text. (CCRA.R.10, p. 4)
To a great degree, the state of the union of PD seems broken; however, PD cannot be
abandoned (Gray et al., 2015). With PD in the hands of state, county, and district leadership,
there is a call to reform PD so that it is content-focused, collaborative, sustained, and continuous
(Darling-Hammond et al., 2017; Gray et al., 2015; Hiebert & Morris, 2012).
Summary of Study
The implementation of CCSS and CCR Anchor Reading Standard 10 underscores the need
for students to demonstrate their ability to interact with complex texts. Consequently, teachers’
perceptions, skill level, and preparedness in teaching students how to interact with complex texts
deeply affect the level of student reading and learning in the classroom. Researchers such as
Hattie (2010) have made explicit the notion that teachers are a critical component of student
reading. In his synthesis of over 800 meta-analyses regarding influences on student learning, he
concluded, “Teachers are among the most powerful influences on learning” (p. 238), attributing
30% of the variance in student learning to teachers. Hattie (2003) emphasized, “It is what
teachers know, do, and care about which is very important in the learning equation” (p. 2). At the
same time, teachers’ conceptualizations of practices in reading instruction shape the depth of
learning for students through the increasing interaction with complex text.
The skills needed for reading instruction aligned with the demands of CCSS and CCR
Anchor Reading Standard 10 require extensive PD to expand a teacher’s toolkit beyond the
surface level literacy instruction of the past. The current standards call for instructional practices
that reflect the accelerated expectations of CCSS so that students in kindergarten through 12th
grade are taught the comprehension strategies necessary to access complex text. CCR Anchor
Reading Standard 10 outlines the text complexity measures teachers need to consider when
examining the texts placed in front of students. When teachers acknowledge that students are
facing challenges in reading complex text and then participate in targeted PD focused on text
complexity, they take the first step toward understanding how to implement text complexity
measures in reading instruction (Ryan & Frazee, 2012).
A key analysis of the study highlighted the relationship between targeted PD on text
complexity and teachers’ perceptions about their skill level and preparedness to implement the
three dimensions of text complexity.
Text Complexity: Skill Level
In the trifold measures of text complexity, the data from the study show that the
participants increased in their skill level following the PD treatment. In the quantitative measure
of text complexity, the participants increased from a limited knowledge pre-PD (M = 2.64) to a
working knowledge post-PD (M = 3.55). In the qualitative measure of text complexity, the
participants increased from a limited knowledge pre-PD (M = 2.64) to a working knowledge
post-PD (M = 3.64). In the reader and task measure of text complexity, the participants increased
from a limited knowledge pre-PD (M = 2.64) to a working knowledge post-PD (M = 3.36).
The research data on teacher skill level presented in this study add to the existing
research by demonstrating the impact PD can have on teachers’ skills in instructional practices.
Guskey (2002) asserted that the kind of PD teachers experience matters. Equally important is the
opportunities for teachers to practice new learning (Fang & Pace, 2013; Hiebert & Mesmer,
2013). The PD treatment in this study provided opportunities for teachers to experience the
application of working with specific texts while considering the quantitative, qualitative, and
reader and task measures of text complexity.
Text Complexity: Preparedness
In the trifold measures of text complexity, the data from the study show that the
participants increased their preparedness to implement text complexity following the PD
treatment. In the quantitative measure of text complexity, the participants increased from a lower
working knowledge pre-PD (M = 2.91) to very knowledgeable post-PD (M = 3.91). In the
qualitative measure of text complexity, the participants increased from a lower working
knowledge pre-PD (M = 2.91) to a high working knowledge post-PD (M = 3.82). In the reader
and task measure of text complexity, the participants increased from a limited knowledge pre-PD
(M = 2.64) to a working knowledge post-PD (M = 3.36).
The research data on teacher preparedness presented in this study add to the existing
research by demonstrating the impact PD can have on teachers’ preparedness in using various
instructional strategies to implement text complexity. A key result from the PD treatment was the
noticeable change in the participants’ beliefs regarding the importance of considering the trifold
measures when selecting texts to use in the classroom. The participants came to recognize that
their current practices in reading instruction may have been misguided or rooted in
beliefs about reading instruction that are not supported by research. In Anders et al.’s (1991)
study on teachers’ beliefs and practices in reading comprehension instruction, the researchers
found that many teachers in their study lacked knowledge and belief in research-based reading
comprehension instruction; however, they “did not know the practices that would allow them to
act upon those beliefs” (p. 579).
The findings from this study could help educational leaders in their decision-making
processes regarding targeted PD, as this study validated a difference in teachers’ perceived skill
levels and preparedness in text complexity. Based on the data gathered in this study, targeted PD
was a key factor in influencing teachers’ perceived skill level and preparedness in implementing
text complexity. To gain a clear understanding of a possible relationship between targeted PD
and teachers’ perceived skill level and preparedness in implementing text complexity, pre- and
post-surveys were administered to the participants via Qualtrics. For quantitative analysis, the
study sought to find variances between the pre- and post-surveys following the teachers’
participation in targeted PD.
Participants
The 11 participants in the study were selected from elementary and middle schools within
a suburban school district in Southern California based on their willingness to participate in the
study. The participants had 1-3 years of CCSS implementation experience at the time of the
study. The participants provided insight and perspective on teacher knowledge and level of
implementation of text complexity in a larger district setting.
Instrument
The pre- and post-survey data were collected through the Qualtrics system. The surveys
were piloted to gauge the content validity of the survey questions in relation to the research
questions in the study. The participants in the pilot group provided input that facilitated revisions
in the survey. The final survey instrument consisted of 34 questions. The pre-survey included
questions related to text complexity pre-treatment. The post-survey addressed the same questions
related to text complexity post-treatment.
Data Analysis
Principal component analysis (PCA) was used to identify groups of highly correlated
variables and combine them into principal components. PCA, according to
Leech et al. (2005), reduces a larger number of variables to a smaller set of underlying factors
that summarize the crucial information in the variables. The decision on which of the principal
components to retain was determined through the use of Varimax rotation, which transformed
the components into factors that could be clearly interpreted to simplify the analysis. Two
essential themes emerged from the PCA and Varimax rotation data: (a) text complexity
preparedness scale, and (b) PD and other preparedness. Additionally, the data were utilized for a
paired t-test to compare the means of the pre- and post-survey to determine if there was a
significant difference in the responses following the treatment, or PD.
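The analysis pipeline described above can be illustrated with a short, hedged sketch. The data below are synthetic stand-ins for the actual Qualtrics responses (11 participants, six 5-point items, neither of which is reproduced here), the two-component retention mirrors the two themes that emerged, and the varimax routine is the standard textbook algorithm rather than the exact statistical-package procedure used in the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic stand-in for the pre/post survey data: 11 participants, six items.
pre = rng.uniform(1, 5, size=(11, 6))
post = np.clip(pre + rng.uniform(0.2, 1.5, size=(11, 6)), 1, 5)

# --- PCA via eigendecomposition of the item correlation matrix ---
corr = np.corrcoef(pre, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]  # components ordered by variance explained
loadings = eigvecs[:, order] * np.sqrt(np.clip(eigvals[order], 0, None))

def varimax(phi, gamma=1.0, max_iter=100, tol=1e-6):
    """Textbook varimax rotation of a loading matrix."""
    p, k = phi.shape
    rotation = np.eye(k)
    var = 0.0
    for _ in range(max_iter):
        lam = phi @ rotation
        u, s, vt = np.linalg.svd(
            phi.T @ (lam ** 3 - (gamma / p) * lam @ np.diag((lam ** 2).sum(axis=0)))
        )
        rotation = u @ vt
        new_var = s.sum()
        if var != 0 and new_var < var * (1 + tol):
            break
        var = new_var
    return phi @ rotation

# Retain two components, mirroring the two themes found in the study.
rotated = varimax(loadings[:, :2])

# --- Paired t-test on each participant's mean pre vs. post response ---
t_stat, p_value = stats.ttest_rel(post.mean(axis=1), pre.mean(axis=1))
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

Because the synthetic post scores are generated by adding positive increments to the pre scores, the paired t-test here will show a positive effect; with real survey data the test would determine whether the pre/post difference is statistically significant.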
Research Questions
The following research questions organized the study’s focus on teachers’ knowledge,
skills, and preparedness in text complexity:
1. What is the perceived knowledge and skill levels of teachers in text complexity who
have not experienced prior professional development?
2. What is the perceived level of preparedness of teachers in text complexity who have
not experienced prior professional development?
3. To what degree did the teachers’ knowledge and skill levels in text complexity
change after engaging in professional development?
4. To what degree did the teachers’ preparedness in text complexity change after
engaging in professional development?
Discussion of Findings
Research Questions One and Two: Pre- and Post-Treatment
Research question one asked, What is the perceived knowledge and skill levels of
teachers in text complexity who have not experienced prior professional development? Research
question two asked, What is the perceived level of preparedness of teachers in text complexity
who have not experienced prior professional development?
The basis for questions one and two was to examine teachers’ self-perceptions regarding
their ability to determine the level of challenge a text provides based on the text complexity
measures before and after experiencing PD. These data (see Figure 12) are important because they
provide insight into participants’ level of understanding of CCR Anchor Reading Standard 10
in the CCSS. The participants in this study provided either direct reading instruction to students
or coaching for teachers in the area of reading instruction. However, based on the pre-survey
findings, their collective self-perceived knowledge of the trifold measures of text
complexity was well below the working knowledge range before experiencing a PD in text
complexity.
Figure 12. Pre- and post-survey comparison of means (1-5 scale).

Item                                                          Pre   Post
q8  Perceived knowledge of quantitative measures of text
    complexity                                                2.64  3.55
q9  Perceived knowledge of qualitative measures of text
    complexity                                                2.64  3.64
q10 Perceived knowledge of reader and task measures of
    text complexity                                           2.64  3.36
q19 Extent that you understand conventional readability
    formulas used to rate text difficulty                     2.55  3.73
q21 Extent that you understand the qualitative dimensions
    of text                                                   2.36  4.09
q23 Extent that you understand the element of matching
    the reader and task in consideration of variables
    specific to particular readers and particular tasks       2.91  4.18
The data from the post-survey following the PD revealed that teachers made appreciable
gains toward having a strong working knowledge of the trifold measures of text complexity. The
most notable gains in the post-survey were witnessed in the participants’ knowledge of the
quantitative and qualitative measures of text complexity. The gains were significant given the
data from the pre-survey and the participants’ discussion near the beginning of the PD, which
highlighted the participants’ limited knowledge about the quantitative and qualitative measures
of text complexity. The participants’ discussion in the PD highlighted consistent references to the
quantitative measures of text complexity, such as students’ Lexile levels or Accelerated Reader
(AR) reading levels. What was not evident in the pre-survey and the PD discussion was the
participants’ knowledge of the quantitative measures of text complexity beyond Lexile levels.
There was no mention of the syntactical or semantic demands of readers or the cohesion of text,
both of which are variables of the quantitative measures of text complexity (Fisher, Frey, &
Lapp, 2012). Likewise, the participants in the PD discussion made no reference to the qualitative
features of text complexity, such as understanding the levels of meaning or purpose in the
reading of a text, or the language conventionality and clarity in text complexity (Fang & Pace,
2013; Fisher, Frey, & Nelson, 2012; Hiebert & Mesmer, 2013). Finally, the reader and task
measure of text complexity demonstrated the lowest gains in the pre- and post-survey
comparison as well as in the participants’ discussion. In the discussion, one participant
referenced the importance of considering the students’ interest in the kinds of texts used in the
classroom. However, there was no evidence that the participants had knowledge of the variables
of text complexity, such as students’ motivation in reading or students’ prior knowledge of and
experience with the texts being used in the classroom. The implementation of strategies around
CCR Anchor Reading Standard 10 is based on teachers’ understanding of the reader and task
measure of text complexity (Fisher, Frey, & Lapp, 2012; Mesmer et al., 2012; Pardo, 2004).
Various studies have highlighted the deficiencies in teachers’ understanding of text
complexity. In the Smith (2018) study, a case study involving elementary teachers, the findings
revealed that teachers’ conceptions and understanding of text complexity were limited. Smith
stated,
Their limited understanding of these concepts not only shapes how and what they teach
but impacts students’ learning. If teachers are unfamiliar with the concept of “text
complexity” and how to identify complex texts they can use with their students, then they
are unable to meet the goals outlined by [CCR Reading] Anchor Standard 10 of the
CCSS. (p. 97)
The confusion around text complexity persists from elementary to high school. Middle and high
school teachers are especially challenged in implementing the trifold measures of text
complexity due to the seemingly extensive process that is not common practice for content-area
teachers at the secondary level. Research on secondary teachers as teachers of reading points to
the fact that many secondary content-area teachers continue to believe the teaching of reading is
someone else’s responsibility (Cunningham & Mesmer, 2014; Gewertz, 2011; Hiebert, 2012;
Liben, 2013; Mayher, 2012; Nesi, 2012).
The results of this study confirm that teachers are ill-prepared to consider the measures of
text complexity in their selection of texts implemented in the classroom. However, based on this
study, the participants’ skill level and preparedness can be improved using well-planned and
targeted PD. Desimone’s (2009) research indicates that effective PD increases knowledge and
skills that cause teachers to change their instructional practices to increase student learning. The
effectiveness of the PD in this study is consistent with prior research findings in that the content
of the PD was specific and cohesive (Garet, Porter, Desimone, Birman, & Yoon, 2001), the PD
targeted a pedagogical process for teaching reading (Gyovai et al., 2009), and the PD provided
new knowledge to shape future instructional practices in reading (Cottingham et al., 2008). The
implications from the data related to research questions one and two demonstrate a need for
district and school leaders to shift away from a workshop approach to PD to a targeted approach
that addresses teachers’ instructional needs. The purpose of PD should be to target support for
teachers so that their instructional practices result in higher levels of student achievement (King,
2013).
Research Questions Three and Four: Pre- and Post-Treatment
Research question three asked, To what degree did the teachers’ knowledge and skill
levels in text complexity change after engaging in professional development? Research question
four asked, To what degree did the teachers’ preparedness in text complexity change after
engaging in professional development?
The basis for research questions three and four was to examine teachers’ self-perceptions
regarding their preparedness in using the tools for measuring the quantitative features, qualitative
features, and reader and task measures of text complexity before and after experiencing PD. In a
broad sense, questions one and two asked: Do you understand the dimensions of text
complexity? Whereas, questions three and four asked: How prepared are you to implement text
complexity? These data are critical (see Figure 13) because they provide insight into the level of
understanding and preparedness to implement text complexity in the classroom: data that could
guide professional developers in their understanding of what teachers need to prepare students to
be college and career ready in the area of reading.
Figure 13. Comparison of pre- and post-treatment means.
The data from the post-survey demonstrated an increase in the participants’ ability to
conduct qualitative reviews of text. The increase is significant considering the fact that the pre-
survey showed that teachers had limited skills or preparedness to consider the qualitative features
of text. The participants’ skill level in the qualitative measure of text complexity was influenced
by the PD due to the explicit content of the targeted PD, which clearly defined what the
participants needed to know about text complexity and how they should use the new knowledge
to teach students how to interact with complex text (DuFour, 2004; Guskey, 2002). The
understanding, increased skill level, and preparedness in considering the qualitative measures of
text are critical because the CCSS challenges teachers to place complex and difficult texts in
front of students and then provide scaffolded instructional supports that will increase the
students’ bank of related language, knowledge, skills, or metacognition to help them comprehend
the information (Fisher, Frey, & Lapp, 2012). Data from prior research and this study confirm
that teachers do not have a high level of skill to make decisions about text using the qualitative
feature of text complexity.
Figure 13 data (1-5 scale):

Item                                                          Pre   Post
q11 Preparation to implement a variety of reading
    strategies in your reading instruction                    3.64  4.00
q14 Preparation to provide strategies for your students to
    learn how to interact with complex texts                  3.18  3.91
q20 Extent that you are able to use apps such as Book
    Wizard, or websites such as lexile.com as resources to
    provide readability formulas                              2.91  3.91
q22 Extent that you are able to do qualitative reviews of
    student text                                              2.18  4.00
q24 Extent that you are able to match the reader and task
    in consideration of variables specific to particular
    readers and particular tasks                              2.91  3.82
For school districts and schools to increase teachers’ knowledge, skills, and preparedness
in making text complexity a significant component of reading instruction, it will take PD that is
sustainable and consistent, representing a move away from one-off workshops that are easily
scheduled and do not require extensive follow-up with teachers. Likewise, the PD cannot lack
coherent, integrated approach to instruction. The PD approach in this study could prove to be
impactful beyond the one-time PD experience. To extend the work of this study, district and
school leaders could develop a comprehensive coaching approach to support teachers in the use
of text complexity in their reading instruction. However, the coaches selected must have
expertise in text complexity, as well as in the teachers’ grade level or content area. The depth of
observation, feedback, and suggestions for various strategies depends upon the coach’s expertise
in all of these areas (Tooley & Connally, 2016). Finally, for teachers to implement text complexity
as intended by CCR Anchor Reading Standard 10 of the CCSS, teachers need a solid foundation
in their understanding of what makes text complex (Cunningham & Mesmer, 2014; Fisher, Frey,
& Lapp, 2012; Jago, 2011).
Implications
The study examined the perceived knowledge, skills, and preparedness of the participants
before and after a PD focused on text complexity. The motivation for conducting research
around text complexity and CCR Anchor Reading Standard 10 is due to the lack of student
progress in reading over many years. Student achievement data in reading over the last 20 years
have demonstrated a need to get to the core of why students are leaving high school unprepared
to meet the demand of college-level reading.
Although student literacy data show that students are not reading at grade level, evidence
suggests it has much to do with how teachers are trained, or more accurately not trained, in
delivering reading instruction (Hiebert, 2012). As an instructional leader in education for over
two decades, I have witnessed students not receiving the kind of reading instruction they need to
learn how to read and comprehend text. Students enter middle school and high school with
reading deficits, leaving teachers to provide remediation. However, many secondary teachers do
not view themselves as teachers of reading and will not provide instruction beyond the delivery
of core content. As a result, students leave high school unprepared to read college-level text.
Theoretical Implications
Fullan and Quinn’s (2016) coherence framework was used to conceptualize the study.
One of the four variables of the coherence framework, deepening learning, provided the
scaffolding for the research questions in the study that examined the participants’ perceived
knowledge, skills, and preparedness before and after a PD that focused on text complexity.
Deepening learning focuses on: (a) developing clarity of learning goals, (b) building precision in
pedagogy, and (c) shifting practices through capacity building.
The three deepening learning variables were examined to determine the impact of
targeted PD in text complexity. The targeted PD in text complexity focused on developing clarity
in the participants’ learning and understanding of text complexity. As evidenced in Figure 12,
there was an increase in the participants’ overall understanding of the trifold measures of text
complexity pre- and post-PD.
Of the three focus variables in deepening learning, the strength of the study was most
notable in (a) developing clarity of learning goals, and to some degree in (b) building precision
in pedagogy. However, the weakness of the study was evident in the variable of (c) shifting
practices through capacity building. To a lesser degree, the second variable of deepening
learning (b) building precision in pedagogy was a strength in that the PD treatment provided
strategies the participants could use in looking at text through the trifold measures of text
complexity. However, the study did not yield evidence that precision in pedagogy in the area of
text complexity improved in the classroom setting following the PD treatment. In the third
variable of deepening learning, (c) shifting practices through capacity building, there was no
evidence of instructional practices shifting in text complexity. The lack of evidence in the third
variable of deepening learning was caused by the fact that the participants were not observed in the
implementation of text complexity following the PD treatment. Without observing the
participants in an instructional setting, which was not a part of this study, a shift in instructional
practices would not be noted.
Practical Implications
The most glaring practical implication of this study is that, after a decade of CCSS
and CCR Anchor Reading Standard 10, a lack of teacher knowledge about text
complexity still persists. When teachers do not understand the dimensions of text complexity, the
result is a deficiency in teaching students how to interact with complex text: a result that
Andrianatos (2019) reported “has been known for a number of years that many learners who pass
through the education system exit without being able to comprehend what they read” (p. 1). This
study highlighted the lack of teacher understanding of the trifold measures of text complexity in
the pre-PD treatment survey.
Although the study PD treatment had a positive impact on teacher knowledge and skills
in text complexity, there was no evidence of the impact on teacher practices following the PD.
However, given the increase in teachers’ knowledge and skills, it could be stated that PD is a
step in the right direction in the process of changing instructional practices in K-12 classrooms.
Fullan and Quinn (2016) called this directional vision, noting that it “emerges in partnership to
develop a shared purpose and vision by engaging in continuous collaborative conversations that
build shared language, knowledge, and expectations” (p. 29).
An additional practical implication evidenced in this study is the need for targeted PD to
address the current deficits in reading instruction. Although the CCSS call for teachers to
understand the various dimensions of text complexity, Hiebert (2013) highlighted the limited
guidance teachers have received in attempting to evaluate the various features of text, which is
why, as Seidenberg (2017) suggested, teachers are often more comfortable basing their reading
instruction practices and philosophies on beliefs rather than science. The limited guidance of
teachers in the area of reading instruction also plays a role in secondary teachers’ belief that they
are not teachers of reading, a belief that contradicts the basic message in the CCSS-ELA. The
message that all teachers are teachers of reading is still not being heard after a decade into CCSS.
The findings from this study could serve as a launching pad in making a change regarding how
secondary teachers view themselves only as deliverers of content rather than teachers of reading.
This can be accomplished through targeted PD for secondary teachers to increase their
understanding of text complexity and their role in teaching students how to navigate complex
text.
Future Implications and Recommendations
The participants’ understanding of text complexity improved as a result of
2.5 hours of PD. However, questions remain regarding the long-term impact of PD in changing
and shaping instructional practices around text complexity. Future research on the
implementation of text complexity could include an observational longitudinal study to measure
the impact of text complexity implementation over time. Data from such a study would provide
input on the effect of targeted PD on teacher change, instructional practices, and student
achievement outcomes in reading. Although the current study was not a longitudinal one, the
findings are consistent with previous research around text complexity knowledge and
implementation in the classroom (Hiebert & Mesmer, 2013; Williamson et al., 2013; Wixson &
Valencia, 2014), including survey responses, statistical analysis, and teacher participation in the
targeted PD.
A recommendation for future study would be to continue the current study and include a
longitudinal observational component of the study that examines precision in instructional
practices around text complexity, as well as how these instructional practices shift the trajectory
of student achievement in the area of reading complex text. The study could begin with targeted
PD around text complexity that provides:
1. Clarity in learning goals for CCSS-ELA.
2. Targeted strategies for building precision in text complexity and reading instruction.
3. Opportunities for building teacher capacity in effective reading instruction.
These three areas of focus for targeted PD are key to changing instructional practices that
address students’ academic needs (DuFour, DuFour, Eaker, Many, & Mattos, 2020; Kennedy,
2005), a change that occurs most often in schools where collaboration is a component of PD
(DuFour et al., 2020). The most prominent collaborative model in school districts is the
professional learning community (PLC) model. PLCs are “collaborative teams whose members
work interdependently to achieve common goals for which members are mutually accountable”
(DuFour et al., 2020, p. 11).
The longitudinal observational study should include the provision of ongoing support and
coaching to examine the impact on teachers’ reading instruction. Such an initiative requires
instructional leadership that will stay the course in improving students’ achievement outcomes in
reading. Dutro, Fisk, Koch, Roop, and Wixson (2002) contended that targeted PD increases
teachers’ capacity to become learners who implement change in their instructional practices. As
teachers become more reflective in their practices and begin to make changes, students benefit
through improved outcomes. When teachers receive targeted PD in content-specific and
higher-order thinking strategies, their students outperform their peers by upwards of 40%, based
on data from the NAEP (2019).
Limitations
The limitations of this study were characteristic of those within a mixed instrumental
study incorporating a single setting, a small number of participants, and purposeful sampling.
These characteristics may limit future researchers’ ability to generalize findings across a broader
spectrum of school districts. Creswell (2008) contended that researchers can understand a central
phenomenon to a greater degree through purposeful sampling of selected individuals who
provide information-rich input. A second limitation was related to the timeframe of the study.
The data collection spanned 2 weeks and consisted of a pre-survey, followed by a PD exercise,
and concluded with a post-survey. To better capture evidence of change in teacher knowledge
and understanding of text complexity, it would have been beneficial to include classroom
observations to determine the extent to which instructional practices change as a result of new
text complexity knowledge. Changing instructional practices in reading to account for text
complexity measures would also require follow-up support and instructional coaching.
Although this study was preliminary, it may contribute to the
limited extant literature and provide a framework for expanded research in this area.
Conclusion
The CCSS-ELA impose a school-wide shared responsibility for the instruction of
reading, writing, speaking, listening, and language. This integrated model of literacy
development and proficiency is intended to “define general, cross-disciplinary literacy
expectations that are identified as critical to meet for students to be prepared to enter college and
workforce training programs ready to succeed” (NGA & CCSSO, 2010a, p. 2).
The analysis phase of this study revealed that teachers’ perceptions and knowledge
around text complexity can be affected through targeted PD. The results may be attributed to a
variety of factors. First, the PD treatment systematically targeted each of the text complexity
measures and provided time for the participants to practice identifying text complexity
dimensions within various texts. Second, the sample size was small, which allowed time for a
focus on differentiation of the PD content to meet the participants’ needs. Further study is
warranted to examine the relationship between the PD treatment, the participants’ use of text
complexity in the classroom, and student achievement data.
The way in which students interact with complex text is contingent upon the scope and
depth of teachers’ knowledge of text complexity and CCR Anchor Reading Standard 10 of the
CCSS. The findings of the current study indicate that teachers need ongoing PD to facilitate a
deeper understanding of text complexity. The CCSS-ELA have changed the way reading
instruction is delivered to students, and this researcher recommends that teachers have access to
targeted PD to gain the strategies they need to be effective reading teachers. Teachers need
extensive training on CCR Anchor Reading Standard 10 and strategies to increase students’
interaction with complex text. As part of a comprehensive reading initiative, school districts
should implement a plan for PD that includes: (a) targeted PD in the trifold measures of text
complexity, (b) opportunities for teacher coaching to assist teachers in the strategies to
implement CCR Anchor Reading Standard 10 of the CCSS, and (c) reflective practices and
collaborative conversations across grade levels and content areas related to the instructional
practices and considerations around the implementation of CCR Anchor Reading Standard 10.
References
Workforce Investment Act of 1998, 20 U.S.C. § 9201 (1998).
Au, K. H., Scheu, J. A., Kawakami, A. J., & Herman, P. A. (1990). Assessment and
accountability in whole language literacy. The Reading Teacher, 43, 574-578.
American College Testing. (2016). The condition of college and career readiness. ACT, Inc.
American College Testing. (2017). The condition of college and career readiness 2017. ACT,
Inc.
Andrianatos, K. (2019). Barriers to reading in higher education: Rethinking reading support.
Reading & Writing. https://rw.org.za/index.php/rw/article/view/241/557
Apex Learning. (2017, January 16). 3 reasons standards are essential to educational success.
https://www.apexlearning.com/blog/3-reasons-standards-are-essential-to-educational-
success
Archibald, S., Coggshall, J. G., Croft, A., & Goe, L. (2011). High-quality professional
development for all teachers: Effectively allocating resources. Research & policy brief
(ED520732). ERIC. https://eric.ed.gov/?id=ED520732
Baer, J. D., Cook, A. L., & Baldi, S. (2006). The literacy of America’s college students: The
National Survey of America’s College Students. (ED518670). ERIC.
https://files.eric.ed.gov/fulltext/ED518670.pdf
Bailey, T. (2009). Challenge and opportunity: Rethinking the role and function of developmental
education in community college. New Directions for Community Colleges, 2009(145),
11–30. https://doi.org/10.1002/cc.352
Ball, D. L., & Forzani, F. M. (2009). The work of teaching and the challenge for teacher
education. Journal of Teacher Education, 60(5), 497–511.
https://doi.org/10.1177/0022487109348479
Ball, D. L., Sleep, L., Boerst, T. A., & Bass, H. (2009). Combining the development of practice
and the practice of development in teacher education. The Elementary School Journal,
109(5), 458–474. https://doi.org/10.1086/596996
Banilower, E. R., Heck, D. J., & Weiss, I. R. (2007). Can professional development make the
vision of the standards a reality? The impact of the national science foundation’s local
systemic change through teacher enhancement initiative. Journal of Research in Science
Teaching, 44(3), 375–395. https://doi.org/10.1002/tea.20145
Bärnighausen, T., Tugwell, P., Røttingen, J.-A., Shemilt, I., Rockers, P., Geldsetzer, P., … Atun,
R. (2017). Quasi-experimental study designs series—paper 4: Uses and value. Journal of
Clinical Epidemiology, 89, 21-29. https://doi.org/10.1016/j.jclinepi.2017.03.012
Barton, P. E. (2006). Reading between the lines: What the ACT reveals about college readiness
in reading. ACT.
Bateman, B. (1991). Teaching word recognition to slow-learning children. Reading & Writing
Quarterly, 7(1), 1-16. https://doi.org/10.1080/0748763910070102
Baumann, J. F., & Bason, J. J. (2011). Survey research. In N. K. Duke & M. H. Mallette (Eds.),
Literacy research methodologies (pp. 404–426). Guilford.
Bayar, A. (2014). The components of effective professional development: Activities in terms of
teachers’ perspectives. International Online Journal of Educational Sciences, 6(2), 319-
327. https://doi.org/10.15345/iojes.2014.02.006
Bean, R. M., Eichelberger, R. T., Swan, A., & Tucker, R. (1999). Professional development to
promote early literacy achievement. In J. R. Dugan, P. E. Linder, W. M. Linek, & E. F.
Sturtevant (Eds.), Advancing the world of literacy: moving into the 21st century (pp. 94-
106). Reading Association.
Beed, P. L., Hawkins, M., & Roller, C. M. (1991). Moving learners toward independence: The
power of scaffolded instruction. The Reading Teacher, 44(9), 648–655.
Blachman, B. (1991). Early intervention for children’s reading problems: Clinical applications of
the research in phonological awareness. Topics in Language Disorders, 12(1), 51-65.
https://doi.org/10.1097/00011363-199112010-00006
Bogdan, R., & Biklen, S. K. (2016). Qualitative research for education: An introduction to
theories and methods. Pearson India Education Services.
Borg, M. (2001). Teachers’ beliefs. ELT Journal, 55(2), 186-187.
Borg, S. (2003). Teacher cognition in language teaching: A review of research on what language
teachers think, know, believe, and do. Language Teaching, 36(2), 81–109.
https://doi.org/10.1017/s0261444803001903
Boyle, B., Lamprianou, I., & Boyle, T. (2005). A longitudinal study of teacher change: What
makes professional development effective? Report of the second year of the study.
School Effectiveness and School Improvement, 16(1), 1–27.
https://doi.org/10.1080/09243450500114819
Bubb, S., & Earley, P. (2004). Managing teacher workload. Sage.
Burkins, J., & Yaris, K. (2012a). Lost in translation. Burkins & Yaris.
https://www.burkinsandyaris.com/lost-in-translation/
Burkins, J., & Yaris, K. (2012b). Text complexity 101. Burkins & Yaris.
https://www.burkinsandyaris.com/text-complexity-101/
California Department of Education. (2020). DataQuest. https://dq.cde.ca.gov/dataquest/
Calvert, L. (2016). Moving from compliance to agency: What teachers need to make professional
learning work. https://learningforward.org/wp-content/uploads/2017/08/moving-from-
compliance-to-agency.pdf
Carreón, V., & Rau, S. (2014). Reforming professional development to improve literacy
outcomes in Nevada. Guinn Center for Policy Priorities/Nevada Succeeds.
Center on Instruction. (2006). Designing high quality professional development: Building a
community of reading experts in elementary schools. Author.
http://www.centeroninstruction.org/designing-high-quality-professional-development-
building-a-community-of-reading-experts-in-elementary-schools
Chait, R., & Venezia, A. (2009). Improving academic preparation for college: What we know
and how state and federal policy can help. Center for American Progress.
https://www.americanprogress.org/issues/economy/reports/2009/01/27/5411/improving-
academic-preparation-for-college/.
Choy, S. P., Chen, X., & Bugarin, R. (2006). Teacher professional development in 1999-2000:
What teachers, principals, and district staff report. Statistical analysis report. NCES
2006-305. National Center for Education Statistics.
Cohen, D. (1968). The effect of literature on vocabulary and reading achievement. Elementary
English, 45(2), 209-213.
Cohen, D. K., & Bhatt, M. P. (2012). The importance of infrastructure development to high-
quality literacy instruction. The Future of Children, 22(2), 117-138.
https://doi.org/10.1353/foc.2012.0012
Common Core State Standards Initiative. (2010). English language arts & literacy in
history/social studies, science, and technical subjects. Corestandards.org.
http://www.corestandards.org/assets/Appendix_A.pdf
Common Core State Standards Initiative. (2010). Preparing America’s students for success.
http://www.corestandards.org/
Comrey, A. L., & Lee, H. B. (1992). A first course in factor analysis. L. Erlbaum.
Conley, D. T., & McGaughy, C. (2012). College and career readiness: Same or different?
Educational Leadership, 69(7), 28-34.
Conley, M. W. (2008). Meeting the challenge of adolescent literacy: Research we have, research
we need. Guilford Press.
Conway, J. M., & Huffcutt, A. I. (2003). A review and evaluation of exploratory factor analysis
practices in organizational research. Organizational Research Methods, 6(2), 147-168.
https://doi.org/10.1177/1094428103251541
Corcoran, T. C. (1995). Transforming professional development for teachers: a guide for state
policymakers. National Governors Association, Education Policy Studies, Center for
Policy Research.
Cottingham, P., Cronen, S., Eaton, M., Garet, M. S., Jones, W., Kurki, A., & Ludwig, M. (2008).
The impact of two professional development interventions on early reading instruction
and achievement (REL 2008-No. 4030). U.S. Department of Education, Institute of
Education Sciences, National Center for Education Evaluation and Regional Assistance.
Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed methods
approaches. Sage.
Creswell, J. W. (2006). Understanding mixed method research. Sage.
Creswell, J. W. (2008). Educational research: planning, conducting, and evaluating quantitative
and qualitative research. Pearson Education International.
Creswell, J. W. (2009). Research design: qualitative, quantitative, and mixed methods
approaches. SAGE.
Creswell, J. W., & Clark, V. L. (2007). Understanding mixed methods research. In J. Creswell
(Ed.), Designing and conducting mixed methods research (pp. 1–19). Sage.
Creswell, J. W., Hanson, W. E., Plano Clark, V. L., & Morales, A. (2007). Qualitative research
designs. The Counseling Psychologist, 35(2), 236–264.
https://doi.org/10.1177/0011000006287390
Creswell, J. W., & Plano Clark, V. L. (2018). Designing and conducting mixed methods
research. Sage.
Creswell, J. W., Plano Clark, V. L., Gutmann, M. L., & Hanson, W. E. (2003). Advanced mixed
methods research designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed
methods in social and behavioral research (pp. 209-240). Sage.
Cruce, T. M., Lowe, J., & Mattern, K. D. (2018). The use of collegeready to improve course
performance in English without the need for formal remediation: A case study at
Chattanooga Community College. ACT, Inc.
Cullinan, B., Jaggar, A., & Strickland, D. (1974). Language expansion for black children in the
primary grades: A research report. Young Children, 29(1), 98-112.
Cunningham, J. W., & Mesmer, H. A. (2014). Quantitative measurement of text difficulty. The
Elementary School Journal, 115(2), 255–269. https://doi.org/10.1086/678292
Darling-Hammond, L. (1999). Educating teachers: The academy’s greatest failure or its most
important future? Academe, 85(1). https://doi.org/10.2307/40251715
Darling-Hammond, L. (2000). Teacher quality and student achievement. Education Policy
Analysis Archives, 8. https://doi.org/10.14507/epaa.v8n1.2000
Darling-Hammond, L. (2005). Prepping our teachers for teaching as a profession. The Education
Digest, 71(4), 22–27.
Darling-Hammond, L., & Richardson, N. (2009). Teacher learning: What matters. Educational
Leadership, 66(5), 46-53.
Darling-Hammond, L., Hyler, M. E., & Gardner, M. (2017). Effective teacher professional
development. Learning Policy Institute.
Desimone, L. M. (2009). Improving impact studies of teachers’ professional development:
Toward better conceptualizations and measures. Educational Researcher, 38(3), 181-199.
https://doi.org/10.3102/0013189x08331140
Desimone, L., Porter, A. C., Birman, B. F., Garet, M. S., & Yoon, K. S. (2002). How do district
management and implementation strategies relate to the quality of the professional
development that districts provide to teachers? Teachers College Record, 104(7), 1265-
1312. https://doi.org/10.1111/1467-9620.00204
DeVellis, R. F. (1991). Scale development: Theory and applications. Sage.
Draper, D. (2012). Comprehension Strategies: Comprehension strategies applied to
mathematics.
http://gabriellec12.weebly.com/uploads/3/7/1/7/37176479/comprehension_and_mathema
_1.pdf
Draper, R. J., Broomhead, P., Jensen, A. P., & Nokes, J. D. (2012). (Re)Imagining literacy and
teacher preparation through collaboration. Reading Psychology, 33(4), 367-398.
https://doi.org/10.1080/02702711.2010.515858
DuFour, R. (2004). Whatever it takes: How professional learning communities respond when
kids don’t learn. National Educational Service.
DuFour, R., DuFour, R. B., Eaker, R. E., Many, T. W., & Mattos, M. (2020). Learning by doing:
a handbook for professional learning communities at work. Solution Tree Press.
Durkin, D. (1978). What classroom observations reveal about reading comprehension
instruction. Reading Research Quarterly, 14(4), 481-533.
https://doi.org/10.1598/rrq.14.4.2
Dutro, E., Fisk, M. C., Koch, R., Roop, L. J., & Wixson, K. (2002). When state policies meet
local district contexts: Standards-based professional development as a means to
individual agency and collective ownership. Teachers College Record, 104(4), 787–811.
https://doi.org/10.1111/1467-9620.00179
Efron, S. E., & Ravid, R. (2020). Action research in education: a practical guide. The Guilford
Press.
Eisenberg, J. A. (1992). The limits of reason: Indeterminacy in law, education, and morality.
OISE Press.
Engelmann, S., & Meyer, L. A. (1984). Reading comprehension instruction in grades 4, 5, and
6: Program characteristics; Teacher perceptions; teacher behaviors; and student
performance. National Institute of Education.
Fang, Z. (2014). Preparing content area teachers for disciplinary literacy instruction. Journal of
Adolescent & Adult Literacy, 57(6), 444-448. https://doi.org/10.1002/jaal.269
Fang, Z., & Pace, B. G. (2013). Teaching with challenging texts in the disciplines. Journal of
Adolescent & Adult Literacy, 57(2), 104-108. https://doi.org/10.1002/jaal.229
Firestone, W. A., & Riehl, C. (2005). A new agenda for research in educational leadership.
Teachers College Press.
Fisher, D., Frey, N., & Lapp, D. (2012). Text complexity: Raising rigor in reading. International
Reading Association.
Fisher, D., Frey, N., & Nelson, J. (2012). Literacy achievement through sustained professional
development. The Reading Teacher, 65(8), 551-563. https://doi.org/10.1002/trtr.01082
Freire, P. (1970). Pedagogy of the oppressed. Seabury Press.
Freire, P., & Macedo, D. P. (1987). Literacy: reading the word and the world. Routledge &
Kegan Paul.
Frey, B., Lee, S. W., Tollefson, N., Pass, L., & Massengill, D. (2005). Balanced literacy in an
urban school district. The Journal of Educational Research, 98(5), 272-280.
https://doi.org/10.3200/joer.98.5.272-280
Fullan, M. (1990). Staff development, innovation, and institutional development. In B. Joyce
(Ed.), Changing school culture through staff development (pp. 3-15). Association of
Supervision and Curriculum Development.
Fullan, M. (2001a). Leading in a culture of change. Jossey-Bass.
Fullan, M. (2001b). The new meaning of educational change.
https://doi.org/10.4324/9780203986561
Fullan, M. (2007). The new meaning of educational change. Routledge.
Fullan, M. (2009). What’s worth fighting for in headship? Open University Press.
Fullan, M. (2014). The principal. Wiley.
Fullan, M. G. (1993). Why teachers must become change agents. Educational Leadership, 50(6),
1–13.
Fullan, M., & Quinn, J. (2016). Coherence: The right drivers in action for schools, districts, and
systems. Corwin.
Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001). What makes
professional development effective? Results from a national sample of teachers.
American Educational Research Journal, 38(4), 915-945.
https://doi.org/10.3102/00028312038004915
Garland, J., Layland, A., & Corbett, J. (2018). Systems thinking leadership for district and school
improvement. Illinois Center on School Improvement at American Institutes for
Research.
Gee, J. P. (1989). The legacies of literacy: From Plato to Freire through Harvey Graff. Journal
of Education, 171(1), 147-165.
Gerard, L. F., Varma, K., Corliss, S. B., & Linn, M. C. (2011). Professional development for
technology-enhanced inquiry science. Review of Educational Research, 81(3), 408–448.
https://doi.org/10.3102/0034654311415121
Gewertz, C. (2011). Teachers seek ways to gauge rigor of texts. Education Week, 30(24), 12-13.
Gewertz, C. (2013). A Common Core challenge: Learners with special needs. Education Week,
33, 4-6.
Gibson, T. (2005). Epilogue to Plato: The bias of literacy. Proceedings of the Media Ecology
Association, 6, 1-17.
Goodman, K. (1986). What’s whole in whole language? Heinemann.
Goodman, K. S., Shannon, P., Freeman, Y., & Murphy, S. (1988). Report card on basal readers.
R.C. Owen.
Graduate NYC. (2016). The state of college readiness and degree completion in New York City.
http://www.graduatenyc.org/wp-content/uploads/2016/05/GNYC-Report-Brief-2.pdf
Gray, J., Haynie, K., Packman, S., Boehm, M., Crawford, C., & Muralidhar, D. (2015, February-
March). A mid-project report on a statewide professional development model for CS
principles [Paper presentation]. The 46th ACM Technical Symposium on Computer
Science Education - SIGCSE ’15, Kansas City, MO, United States.
https://doi.org/10.1145/2676723.2677306
Greene, J., & Caracelli, V. (2003). Making paradigmatic sense of mixed methods practice. In A.
Tashakkori & C. Teddie (Eds.), Handbook of mixed methods in social behavior research
(pp. 91-110). Sage.
Greer, A., & Graff, H. J. (1981). The literacy myth: Literacy and social structure in the
nineteenth-century city. Labour/Le Travail, 7, 187. https://doi.org/10.2307/25140033
Grossman, P., & McDonald, M. (2008). Back to the future: Directions for research in teaching
and teacher education. American Educational Research Journal, 45(1), 184–205.
https://doi.org/10.3102/0002831207312906
Guskey, T. R. (1986). Staff development and the process of teacher change. Educational
Researcher, 15(5), 5-12. https://doi.org/10.3102/0013189x015005005
Guskey, T. R. (1989). Attitude and perceptual change in teachers. International Journal of
Educational Research, 13(4), 439-453. https://doi.org/10.1016/0883-0355(89)90039-6
Guskey, T. R. (2002). Does it make a difference? Evaluating professional development.
Educational Leadership, 59(6), 45-51.
Guskey, T. R. (2009). Closing the knowledge gap on effective professional development.
Educational Horizons, 87(4), 224–233.
Guskey, T. R., & Sparks, D. (2002). Linking professional development to improvements in
student learning. In E. Guyton, J. D. Rainer, & J. Rainer Dangel (Eds.), Research linking
teacher preparation and student performance (pp. 11-21). Kendall-Hunt.
Gyovai, L. K., Cartledge, G., Kourea, L., Yurick, A., & Gibson, L. (2009). Early reading
intervention: Responding to the learning needs of young at-risk English language
learners. Learning Disability Quarterly, 32(3), 143-162.
https://doi.org/10.2307/27740365
Handal, B., & Harrington, A. (2003). Mathematics teachers’ beliefs and curriculum reform.
Mathematics Education Research Journal, 15(1), 59-69.
https://doi.org/10.1007/bf03217369
Harrington, T. (2017). More teacher preparation needed to fully implement Common Core
standards in California. https://edsource.org/2017/more-teacher-preparation-needed-to-
fully-implement-common-core-standards-in-california/575306
Hattie, J. (2003). Teachers make a difference: What is the research evidence? New Zealand
Ministry of Education. http://www.educationalleaders.govt.nz/Pedagogy-and-
assessment/Building-effective-learning-environments/Teachers-make-a-difference-What-
is-the-research-evidence
Hattie, J. (2010). Visible learning: A synthesis of over 800 meta-analyses relating to
achievement. Routledge.
Hayton, J. C., Allen, D. G., & Scarpello, V. (2004). Factor retention decisions in exploratory
factor analysis: A tutorial on parallel analysis. Organizational Research Methods, 7(2),
191-205. https://doi.org/10.1177/1094428104263675
Henk, W. A., & Moore, J. C. (1992). Facilitating change in school literacy: From state initiatives
to district implementation. Journal of Reading, 35(7).
Hess, K., & Biggam, S. (2004). A discussion of increasing text complexity. National Center for
the Improvement of Educational Assessment.
https://www.nciea.org/publications/TextComplexity_KH05.pdf
Hiebert, E. H. (2012). The common core’s staircase of text complexity: Getting the size of the
first step right. Reading Today, 29, 26-27.
Hiebert, E. H. (2013). Supporting students’ movement up the staircase of text complexity. The
Reading Teacher, 66(6), 459-468. https://doi.org/10.1002/trtr.1149
Hiebert, E. H., & Grisham, D. (2012). What literacy teacher educators need to know about
supporting teachers in understanding text complexity within the common core. Journal of
Reading Education, 37(3), 5-12.
Hiebert, E. H., & Mesmer, H. A. E. (2013). Upping the ante of text complexity in the Common
Core State Standards. Educational Researcher, 42(1), 44-51.
https://doi.org/10.3102/0013189x12459802
Hiebert, J., & Morris, A. K. (2012). Teaching, rather than teachers, as a path toward improving
classroom instruction. Journal of Teacher Education, 63(2), 92–102.
https://doi.org/10.1177/0022487111428328
Hinkle, D. E., Wiersma, W., & Jurs, S. G. (1998). Applied statistics for the behavioral sciences
(4th ed.). Houghton Mifflin.
Hirsh, S. (2012, January 31). Common-core work must include teacher development. Education
Week. http://mdk12-
archive.msde.maryland.gov/instruction/teacher_induction/pdf/CommonCoreContradictio
n_Edweek_FEB012012.pdf
Hoffman, J. V., Sailors, M., & Patterson, E. U. (2002). Decodable texts for beginning reading
instruction: The year 2000 basals. Journal of Literacy Research, 34(3), 269–298.
https://doi.org/10.1207/s15548430jlr3403_2
Huberman, M. (1995). Networks that ALTER teaching: Conceptualizations, exchanges and
experiments. Teachers and Teaching, 1(2), 193-211.
https://doi.org/10.1080/1354060950010204
Hunter, C. S.-J., & Harman, D. (1979). Adult illiteracy in the United States: A report to the Ford
Foundation. McGraw Hill.
Jaadi, Z. (2020). A step by step explanation of principal component analysis. Built In.
https://builtin.com/data-science/step-step-explanation-principal-component-analysis
Jackson, J., & Kurlaender, M. (2013). College readiness and college completion at broad access
four-year institutions. American Behavioral Scientist, 58(8), 947–971.
https://doi.org/10.1177/0002764213515229
Jago, C. (2011). With rigor for all: meeting common core standards for reading literature.
Heinemann.
Jenkins, D., Jaggars, S. S., & Roksa, J. (2009). Promoting gatekeeper course success among
community college students needing remediation: Findings and recommendations from a
Virginia study. CCRC Publications.
Jimenez, L., Sargrad, S., Morales, J., & Thompson, M. (2016). Remedial education. Center for
American Progress. https://www.americanprogress.org/issues/education-k-
12/reports/2016/09/28/144000/remedial-education/
Johnson, C. C., Kahle, J. B., & Fargo, J. D. (2007). A study of the effect of sustained, whole-
school professional development on student achievement in science. Journal of Research
in Science Teaching, 44(6), 775-786. https://doi.org/10.1002/tea.20149
Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm
whose time has come. Educational Researcher, 33(7), 14-26.
https://doi.org/10.3102/0013189x033007014
Joyce, B. R., & Showers, B. (2002). Student achievement through staff development. Association
for Supervision and Curriculum Development.
Joyce, B. R., Wolf, J., & Calhoun, E. F. (1993). The self-renewing school. Association for
Supervision and Curriculum Development.
Kaestle, C. F. (1985). The history of literacy and the history of readers. Review of Research in
Education, 12, 11-53. https://doi.org/10.2307/1167145
Kennedy, C. H. (2005). Single case designs for educational research. Allyn & Bacon.
Kesson, K. R., & Henderson, J. G. (2010). Reconceptualizing professional development for
curriculum leadership: Inspired by John Dewey and informed by Alain
Badiou. Educational Philosophy and Theory, 42(2), 213-229.
https://doi.org/10.1002/9781444391527.ch5
King, F. (2012). Developing and sustaining teachers’ professional learning: A case study of
collaborative professional development [Doctoral dissertation, University of Lincoln].
DCU Online Research Access Service. http://doras.dcu.ie/22058/
King, F. (2013). Evaluating the impact of teacher professional development: an evidence-based
framework. Professional Development in Education, 40(1), 89–111.
https://doi.org/10.1080/19415257.2013.823099
Kintsch, W., & Kintsch, E. (2005). Comprehension. In S. G. Paris & S. A. Stahl (Eds.),
Children’s reading comprehension and assessment (pp. 71-104). Lawrence Erlbaum
Associates.
Labone, E., & Long, J. (2014). Features of effective professional learning: a case study of the
implementation of a system-based professional learning model. Professional
Development in Education, 42(1), 54–77. https://doi.org/10.1080/19415257.2014.948689
Lambert, D. (2020, March 11). California moves closer to eliminating, replacing reading
instruction test that has blocked thousands from teaching credential. EdSource.
https://edsource.org/2020/california-moves-closer-to-eliminating-replacing-reading-
instruction-test-that-has-blocked-thousands-from-teaching-credential/622830.
Lampert, M. (2009). Learning teaching in, from, and for practice: What do we mean? Journal of
Teacher Education, 61(1-2), 21–34. https://doi.org/10.1177/0022487109347321
Lazarin, M. (2016). Reading, writing, and the Common Core State Standards. Center for
American Progress. https://cdn.americanprogress.org/wp-
content/uploads/2016/08/16084151/ELAmatters-report.pdf
Leech, N. L., Barrett, K. C., & Morgan, G. A. (2005). SPSS for intermediate statistics: Use and
interpretation. Lawrence Erlbaum Associates.
Leithwood, K., Jantzi, D., & Steinbach, R. (2002). Leadership practices for accountable schools.
In K. Leithwood & P. Hallinger (Eds.), Second international handbook of educational
leadership and administration, part 2 (pp. 849-879). Kluwer.
https://doi.org/10.1007/978-94-010-0375-9_29
Liben, D. (2010). Why complex text matters. http://achievethecore.org/page/62/why-complex-
text-matters-detail-pg
Liben, D. (2013). Which words do I teach and how? Achieve the Core.
https://achievethecore.org/page/61/which-words-do-i-teach-and-how
Liberman, I. Y., & Liberman, A. M. (1990). Whole language vs. code emphasis: Underlying
assumptions and their implications for reading instruction. Annals of Dyslexia, 40, 51-76.
https://doi.org/10.1007/bf02648140
Loucks-Horsley, S., & Matsumoto, C. (1999). Research on professional development for
teachers of mathematics and science: The state of the scene. School Science and
Mathematics, 99(5), 258–271. https://doi.org/10.1111/j.1949-8594.1999.tb17484.x
Luft, J. A., Firestone, J. B., Wong, S. S., Ortega, I., Adams, K., & Bang, E. (2011). Beginning
secondary science teacher induction: A two-year mixed methods study. Journal of
Research in Science Teaching, 48(10), 1199–1224. https://doi.org/10.1002/tea.20444
Lyons, C. A., & Pinnell, G. S. (2001). Systems for change in literacy education a guide to
professional development. Heinemann.
MacCallum, R. C., Widaman, K. F., Zhang, S., & Hong, S. (1999). Sample size in factor
analysis. Psychological Methods, 4(1), 84-99. https://doi.org/10.1037/1082-989x.4.1.84
Malatesha Joshi, R., Binks, E., Hougen, M., Dahlgren, M. E., Ocker-Dean, E., & Smith, D. L.
(2009). Why elementary teachers might be inadequately prepared to teach reading.
Journal of Learning Disabilities, 42(5), 392-402.
https://doi.org/10.1177/0022219409338736
Margerison, J. (2017). A beginner's guide to text complexity. Generation Ready.
https://www.generationready.com/a-beginners-guide-to-text-complexity/
Marshall, C., & Rossman, G. B. (2011). Designing qualitative research. SAGE.
Marzano, R. J. (2003). What works in schools: Translating research into action. ASCD.
Mayher, J. S. (2012). English teacher education as literacy teacher education. English Education,
44(2), 180–187. https://www.jstor.org/stable/23238753
McGuffey, W. H. (1866). McGuffey’s new eclectic reader. Van Antwerp, Bragg & Co.
McKenna, M. C., Miller, J. W., & Robinson, R. D. (1990). Whole language and the need for
open inquiry: A rejoinder to Edelsky. Educational Researcher, 19(8), 12-13.
https://doi.org/10.3102/0013189x019008012
McKnight, G. (2018). Exploring the relationship between professional development leaders’
competencies of effective professional learning and teachers’ perceptions of professional
development [Doctoral dissertation, University of North Carolina]. Carolina Digital
Repository. https://cdr.lib.unc.edu/concern/dissertations/2j62s5576
McNiff, J., & Whitehead, J. (2009). Action research: What it is and what it is not. In You and
your action research project (pp. 13–30). Routledge.
https://doi.org/10.4324/9780203871553-6
Meier, K. J., & Brudney, J. L. (2002). Applied statistics for public administration. Harcourt
College.
Merriam, S. B. (1998). Qualitative research and case study applications in education: Revised
and expanded from case study research in education. Jossey-Bass.
Merriam, S. B. (2009). Qualitative research: A guide to design and implementation. Jossey-Bass.
Mesmer, H. A., Cunningham, J. W., & Hiebert, E. H. (2012). Toward a theoretical model of text
complexity for the early grades: Learning from the past, anticipating the future. Reading
Research Quarterly, 47(3), 235-258. https://doi.org/10.1002/rrq.019
Messick, S. (2005). Standards of validity and the validity of standards in performance
assessment. Educational Measurement: Issues and Practice, 14(4), 5–8.
https://doi.org/10.1111/j.1745-3992.1995.tb00881.x
Miles, M. B., Ekholm, M., & Vandenberghe, R. (1987). Lasting school improvement: Exploring
the process of institutionalization. Acco.
Miller, P. (2005). Reading comprehension and its relation to the quality of functional hearing:
Evidence from readers with different functional hearing abilities. American Annals of the
Deaf, 150(3), 305-323. https://doi.org/10.1353/aad.2005.0031
Miller, T. (2012). Ethics in qualitative research. SAGE.
Morgan, G. A., Gliner, J. A., & Harmon, R. J. (2000). Quasi-experimental designs. Journal of
the American Academy of Child & Adolescent Psychiatry, 39(6), 794–796.
https://doi.org/10.1097/00004583-200006000-00020
Munir-McHill, S. (2013). Evaluating passage-level contributors to text complexity [Doctoral
dissertation, University of Oregon].
http://scholarsbank.uoregon.edu/xmlui/handle/1794/13422
Murphy, J. (2005). Connecting teacher leadership and school improvement. Corwin.
National Assessment of Educational Progress. (2009). The Nation’s Report Card: Reading 2009.
National Center for Education Statistics.
https://nces.ed.gov/nationsreportcard/pdf/main2009/2010458.pdf
National Assessment of Educational Progress. (2019). NAEP reading 2019 highlights. The
Nation’s Report Card. https://www.nationsreportcard.gov/highlights/reading/2019/
National Center for Education Statistics. (2020, May). The Condition of Education at a glance:
Reading performance. https://nces.ed.gov/programs/coe/indicator_cnb.asp
National Council on Teacher Quality. (2016). NCTQ releases new ratings of elementary teacher
prep programs. https://www.nctq.org/publications/NCTQ-Releases-New-Ratings-of-
Elementary-Teacher-Prep-Programs
National Governors Association and Council of Chief State School Officers. (2010a). Appendix
A: Research supporting key elements of the standards (ED576695). ERIC.
https://eric.ed.gov/?id=ED576695
National Governors Association and Council of Chief State School Officers. (2010b). Reaching
higher: The Common Core State Standards validation committee.
National Reading Panel. (2000). Teaching children to read: An evidence-based assessment of the
scientific research literature on reading and its implications for reading instruction.
National Institute of Child Health and Human Development.
https://www.nichd.nih.gov/sites/default/files/publications/pubs/nrp/Documents/report.pdf
Nesi, H., & Gardner, S. (2012). Genres across the disciplines: Student writing in higher
education. Cambridge University Press.
Nunnally, J. C., & Bernstein, I. H. (2010). Psychometric theory. McGraw-Hill Education.
O’Donoghue, T. A., & Punch, K. (2003). Qualitative educational research in action: Doing and
reflecting. Routledge Falmer.
Oakhill, J. (2000). Children’s difficulties in text comprehension: Assessing causal issues.
Journal of Deaf Studies and Deaf Education, 5(1), 51–59.
https://doi.org/10.1093/deafed/5.1.51
Odden, A., Archibald, S., Fermanich, M., & Gallagher, H. A. (2002). A cost framework for
professional development. Journal of Education Finance, 28(1), 51–74.
Ogawa, R. T., & Bossert, S. T. (1995). Leadership as an organizational quality. Educational
Administration Quarterly, 31(2), 224-243.
https://doi.org/10.1177/0013161x95031002004
Opfer, V. D., & Pedder, D. (2011). The lost promise of teacher professional development in
England. European Journal of Teacher Education, 34(1), 3-24.
https://doi.org/10.1080/02619768.2010.534131
Palardy, G. J., & Rumberger, R. W. (2008). Teacher effectiveness in first grade: The importance
of background qualifications, attitudes, and instructional practices for student learning.
Educational Evaluation and Policy Analysis, 30(2), 111-140.
https://doi.org/10.3102/0162373708317680
Pardo, L. S. (2004). What every teacher needs to know about comprehension. The Reading
Teacher, 58(3), 272–280. https://doi.org/10.1598/rt.58.3.5
Patton, M. Q. (2002). Qualitative research and evaluation methods. Sage.
Patton, M. Q. (2015). Qualitative research & evaluation methods: Integrating theory and
practice. Sage.
Pavonetti, L. M., Brimmer, K. M., & Cipielewski, J. F. (2003). Accelerated reader: What are the
lasting effects on the reading habits of middle school students exposed to accelerated
reader in elementary grades? Journal of Adolescent and Adult Literacy, 46(4), 300-311.
https://www.jstor.org/stable/40013588
Perkins, J. H., & Cooter, K. (2013). An investigation of the efficacy of one urban literacy
academy: Enhancing teacher capacity through professional development. Reading
Horizons: A Journal of Literacy and Language Arts, 52(2).
https://scholarworks.wmich.edu/reading_horizons/vol52/iss2/6/
Polumbo, B. (2018). Up to 60 percent of college students need remedial classes. The Federalist.
https://thefederalist.com/2018/09/18/60-percent-college-students-need-
remedial-classes-needs-change-now/
Porter-Magee, K. (2012, April 18). Are “just right” books right for the Common Core? Thomas
B. Fordham Institute.
https://fordhaminstitute.org/ohio/commentary/are-just-right-books-right-common-core#
Porter, A., McMaken, J., Hwang, J., & Yang, R. (2011). Common core standards. Educational
Researcher, 40(3), 103-116. https://doi.org/10.3102/0013189x11405038
Powell, B. B. (1989). Why was the Greek alphabet invented? The epigraphical evidence.
Classical Antiquity, 8(2), 321–350. https://doi.org/10.2307/25010912
Preiss, A. (2016). RELEASE: Common Core English Language Arts Standards arm students with
necessary literacy skills needed for college and career, new CAP report says. Center for
American Progress.
https://www.americanprogress.org/press/release/2016/08/17/142724/release-common-
core-english-language-arts-standards-arm-students-with-necessary-literacy-skills-needed-
for-college-and-career-new-cap-report-says/
Reeves, J. (2008). Between a rock and a hard place? Curriculum for excellence and the quality
initiative in Scottish schools. Scottish Educational Review, 40(2), 6-16.
http://www.storre.stir.ac.uk/handle/1893/980
Richardson, V., Anders, P., Tidwell, D., & Lloyd, C. (1991). The relationship between teachers’
beliefs and practices in reading comprehension instruction. American Educational
Research Journal, 28(3), 559–586. https://doi.org/10.3102/00028312028003559
Roberts, J. K., Henson, R. K., Tharp, B. Z., & Moreno, N. P. (2001). An examination of change
in teacher self-efficacy beliefs in science education based on the duration of inservice
activities. Journal of Science Teacher Education, 12(3), 199-213.
https://doi.org/10.1023/a:1016708016311
Roksa, J., Jenkins, D., Jaggars, S. S., Zeidenberg, M., & Cho, S. W. (2009). Strategies for
promoting gatekeeper course success among students needing remediation: Research
report for the Virginia Community College System. Community College Research Center,
Teachers College, Columbia University.
Ross, J. A., & Gray, P. (2006). School leadership and student achievement: The mediating
effects of teacher beliefs. Canadian Journal of Education/Revue Canadienne De
L’éducation, 29(3), 798. https://doi.org/10.2307/20054196
Roth, K. J., Garnier, H. E., Chen, C., Lemmens, M., Schwille, K., & Wickler, N. I. (2011). Video
based lesson analysis: Effective science PD for teacher and student learning. Journal of
Research in Science Teaching, 48(2), 117–148. https://doi.org/10.1002/tea.20408
Scherff, L. (2018). Distinguishing professional learning from professional development. Institute
of Education Sciences, Regional Educational Laboratory Program.
https://ies.ed.gov/ncee/edlabs/regions/pacific/blogs/blog2_DistinguishingProfLearning.as
p
Shulman, L. S. (1985). Those who understand: Knowledge growth in teaching. CERAS, School
of Education, Stanford University.
Schwartz, K. (2014). How to teach the standards without becoming standardized. KQED.
https://www.kqed.org/mindshift/34403/how-to-teach-the-standards-without-becoming-
standardized
Seidenberg, M. S. (2017). Language at the speed of sight: How we read, why so many can’t, and
what can be done about it. Basic Books.
Senge, P. (2000). The leadership of profound change. SPC Press.
http://www.spcpress.com/pdf/other/Senge.pdf
Shanahan, C., Shanahan, T., & Misischia, C. (2011). Analysis of expert readers in three
disciplines. Journal of Literacy Research, 43(4), 393–429.
https://doi.org/10.1177/1086296x11424071
Shanahan, T. (2013). Letting the text take center stage: How the Common Core State Standards
will transform English language arts instruction. American Educator, 37(3), 4–11.
https://eric.ed.gov/?id=EJ1021044
Shanahan, T., & Duffett, A. (2013). Common core in the schools: A first look at reading
assignments. Thomas B. Fordham Institute.
Shannon, P. (1983). The use of commercial reading materials in American elementary schools.
Reading Research Quarterly, 19(1), 68-85. https://doi.org/10.2307/747338
Shor, I., & Freire, P. (1987). A pedagogy for liberation: A dialogue on transforming education.
Bergin & Garvey.
Smith, D. S. (2018). An examination of teachers’ understanding and use of text complexity and
complex text in second grade classrooms (Publication No. 10826286) [Doctoral
dissertation, The University of North Carolina at Charlotte]. ProQuest Dissertations and
Theses Global.
Snow, C. E. (2002). Reading for understanding: Toward an R&D program in reading
comprehension. RAND.
Snow, C. E. (2010). Reading comprehension: Reading for learning. In P. Peterson, E. Baker, &
B. McGaw (Eds.), International encyclopedia of education (pp. 413-418). Elsevier.
https://doi.org/10.1016/b978-0-08-044894-7.00511-x
Snow, C. E., Burns, M. S., & Griffin, P. (1998). Preventing reading difficulties in young children.
National Academies Press. https://doi.org/10.17226/6023
Soler, J. (2016). The politics of the teaching of reading. Prospects, 46(3-4), 423–433.
https://doi.org/10.1007/s11125-017-9415-8
Sparks, D. (2002). Designing powerful professional development for teachers and principals.
National Staff Development Council.
Sparks, D., & Hirsh, S. (1997). A new vision for staff development. Association for Supervision
and Curriculum Development.
Stein, M. K., & D’Amico, L. (2002). Inquiry at the crossroads of policy and learning: A study of
a district-wide literacy initiative. Teachers College Record, 104(7), 1313-1344.
https://doi.org/10.1111/1467-9620.00205
Stevens, J. (2009). Applied multivariate statistics for the social sciences. Routledge Academic.
Stipek, D. J., Givvin, K. B., Salmon, J. M., & MacGyvers, V. L. (2001). Teachers’ beliefs and
practices related to mathematics instruction. Teaching and Teacher Education, 17(2),
213-226. https://doi.org/10.1016/s0742-051x(00)00052-4
Stoll, L., & Fink, D. (1996). Changing our schools. Open University Press.
Strauss, V. (2014, August 21). For first time, minority students expected to be majority in U.S.
public schools this fall. The Washington Post.
https://www.washingtonpost.com/news/answer-sheet/wp/2014/08/21/for-first-time-
minority-students-expected-to-be-majority-in-u-s-public-schools-this-fall/
Tabachnick, B. G., & Fidell, L. S. (1996a). SPSS for Windows workbook to Accompany large
sample examples of using multivariate statistics. HarperCollins College.
Tabachnick, B. G., & Fidell, L. S. (1996b). Using multivariate statistics. Harper Collins.
Tabachnick, B. G., & Fidell, L. S. (2001). Using multivariate statistics. Allyn & Bacon.
Tabachnick, B. G., & Fidell, L. S. (2007). Using multivariate statistics. Pearson/Allyn & Bacon.
The New Teacher Project. (2015). The mirage. https://tntp.org/assets/documents/TNTP-
Mirage_2015.pdf
Thompson, C. L., Zeuli, J. S., & Sykes, G. (1999). The frame and the tapestry: Standards-based
reform and professional development. In L. Darling-Hammond (Ed.), Teaching as the
learning profession: Handbook of policy and practice (pp. 341-375). Jossey-Bass.
Thurmond, V. A. (2001). The point of triangulation. Journal of Nursing Scholarship, 33(3), 253-
258. https://doi.org/10.1111/j.1547-5069.2001.00253.x
Timpson, W. M. (1988). Paulo Freire: Advocate of literacy through liberation. Educational
Leadership, 45(5), 62-66.
Tooley, M., & Connally, K. (2016). No panacea: Diagnosing what ails teacher professional
development before reaching for remedies (ED570895). ERIC.
https://eric.ed.gov/?id=ED570895
Trochim, W. M. (2006). Idea of construct validity.
http://www.socialresearchmethods.net/kb/considea.php
Turner, J. D., & Danridge, J. C. (2014). Accelerating the college and career readiness of diverse
K-5 literacy learners. Theory Into Practice, 53(3), 212-219.
https://doi.org/10.1080/00405841.2014.916963
Varlas, L. (2012). It’s complicated. ASCD Education Update, 54(4).
Wang, Y. L., Frechtling, J. A., & Sanders, W. L. (1999, April). Exploring linkages between
professional development and student learning: A pilot study. Paper presented at the
annual meeting of the American Educational Research Association, Montreal, Canada.
Wei, R. C., Andree, A., Richardson, N., & Orphanos, S. (2009). Professional learning in the
learning profession: A status report on teacher development in the United States and
abroad. National Staff Development Council.
Wexler, N. (2019). The knowledge gap: The hidden cause of America’s broken education
system-and how to fix it. Avery.
Whitworth, B. A., & Chiu, J. L. (2015). Professional development and teacher change: The
missing leadership link. Journal of Science Teacher Education, 26(2), 121–137.
https://doi.org/10.1007/s10972-014-9411-2
Williamson, G. L., Fitzgerald, J., & Stenner, A. J. (2013). The common core state standards’
quantitative text complexity trajectory. Educational Researcher, 42(2), 59-69.
https://doi.org/10.3102/0013189x12466695
Wixson, K. K., & Valencia, S. W. (2014). CCSS-ELA suggestions and cautions for addressing
text complexity. The Reading Teacher, 67(6), 430-434. https://doi.org/10.1002/trtr.1237
Wren, S. A. (2020). What does a “balanced approach” to reading instruction mean? Balanced
Reading. https://balancedreading.com/
Yates, C. G. R. (1988). Classroom research into effective teaching. Australian Journal of
Remedial Education, 20(1), 4-9.
Yin, R. K. (2009). Case study research: Design and methods. Sage.
Yin, R. K. (2014). Case study research: Design and methods. Sage.
Zemliansky, P., & Amant, K. S. (2008). Handbook of research on virtual workplaces and the
new nature of business practices. Information Science Reference.
Zygouris-Coe, V. I. (2014). Teaching discipline-specific literacies in grades 6-12. Routledge.
https://doi.org/10.4324/9780203073162
Appendix A: CCRA.R.10
English Language Arts Standards » Standard 10: Range, Quality, & Complexity » Measuring
Text Complexity: Three Factors
Qualitative evaluation of the text: Levels of meaning, structure, language conventionality and
clarity, and knowledge demands
Quantitative evaluation of the text: Readability measures and other scores of text complexity
Matching reader to text and task: Reader variables (such as motivation, knowledge, and
experiences) and task variables (such as purpose and the complexity generated by the task
assigned and the questions posed)
The Standards’ Approach to Text Complexity
To help redress the situation described above, the Standards define a three-part model for
determining how easy or difficult a particular text is to read as well as grade-by-grade
specifications for increasing text complexity in successive years of schooling (Reading standard
10). These are to be used together with grade-specific standards that require increasing
sophistication in students’ reading comprehension ability (Reading standards 1–9). The
Standards thus approach the intertwined issues of what and how students read.
A Three-Part Model for Measuring Text Complexity
As signaled by the graphic at right, the Standards’ model of text complexity consists of three
equally important parts.
(1) Qualitative dimensions of text complexity. In the Standards, qualitative dimensions and
qualitative factors refer to those aspects of text complexity best measured or only measurable by
an attentive human reader, such as levels of meaning or purpose; structure; language
conventionality and clarity; and knowledge demands.
(2) Quantitative dimensions of text complexity. The terms quantitative dimensions and
quantitative factors refer to those aspects of text complexity, such as word length or frequency,
sentence length, and text cohesion, that are difficult if not impossible for a human reader to
evaluate efficiently, especially in long texts, and are thus today typically measured by computer
software.
(3) Reader and task considerations. While the prior two elements of the model focus on the
inherent complexity of text, variables specific to particular readers (such as motivation,
knowledge, and experiences) and to particular tasks (such as purpose and the complexity of the
task assigned and the questions posed) must also be considered when determining whether a text
is appropriate for a given student. Such assessments are best made by teachers employing their
professional judgment, experience, and knowledge of their students and the subject.
Qualitative and Quantitative Measures of Text Complexity
The qualitative and quantitative measures of text complexity described below are representative
of the best tools presently available. However, each should be considered only provisional; more
precise, more accurate, and easier-to-use tools are urgently needed to help make text complexity a
vital, everyday part of classroom instruction and curriculum planning.
Qualitative Measures of Text Complexity
Using qualitative measures of text complexity involves making an informed decision about the
difficulty of a text in terms of one or more factors discernible to a human reader applying trained
judgment to the task. In the Standards, qualitative measures, along with professional judgment in
matching a text to reader and task, serve as a necessary complement and sometimes as a
corrective to quantitative measures, which, as discussed below, cannot (at least at present)
capture all of the elements that make a text easy or challenging to read and are not equally
successful in rating the complexity of all categories of text. Built on prior research, the four
qualitative factors described below are offered here as a first step in the development of robust
tools for the qualitative analysis of text complexity. These factors are presented as continua of
difficulty rather than as a succession of discrete “stages” in text complexity. Additional
development and validation would be needed to translate these or other dimensions into, for
example, grade-level- or grade-band-specific rubrics. The qualitative factors run from easy (left-
hand side) to difficult (right-hand side). Few, if any, authentic texts will be low or high on all of
these measures, and some elements of the dimensions are better suited to literary or to
informational texts.
(1) Levels of Meaning (literary texts) or Purpose (informational texts). Literary texts with a
single level of meaning tend to be easier to read than literary texts with multiple levels of
meaning (such as satires, in which the author’s literal message is intentionally at odds with his or
her underlying message). Similarly, informational texts with an explicitly stated purpose are
generally easier to comprehend than informational texts with an implicit, hidden, or obscure
purpose.
(2) Structure. Texts of low complexity tend to have simple, well-marked, and conventional
structures, whereas texts of high complexity tend to have complex, implicit, and (particularly in
literary texts) unconventional structures. Simple literary texts tend to relate events in
chronological order, while complex literary texts make more frequent use of flashbacks, flash-
forwards, and other manipulations of time and sequence. Simple informational texts are likely
not to deviate from the conventions of common genres and subgenres, while complex
informational texts are more likely to conform to the norms and conventions of a specific
discipline. Graphics tend to be simple and either unnecessary or merely supplementary to the
meaning of texts of low complexity, whereas texts of high complexity tend to have similarly
complex graphics, graphics whose interpretation is essential to understanding the text, and
graphics that provide an independent source of information within a text. (Note that many books
for the youngest students rely heavily on graphics to convey meaning and are an exception to the
above generalization.)
(3) Language Conventionality and Clarity. Texts that rely on literal, clear, contemporary, and
conversational language tend to be easier to read than texts that rely on figurative, ironic,
ambiguous, purposefully misleading, archaic or otherwise unfamiliar language or on general
academic and domain-specific vocabulary.
(4) Knowledge Demands. Texts that make few assumptions about the extent of readers’ life
experiences and the depth of their cultural/literary and content/discipline knowledge are
generally less complex than are texts that make many assumptions in one or more of those areas.
Quantitative Measures of Text Complexity
A number of quantitative tools exist to help educators assess aspects of text complexity that are
better measured by algorithm than by a human reader. The discussion is not exhaustive, nor is it
intended as an endorsement of one method or program over another. Indeed, because of the
limits of each of the tools, new or improved ones are needed quickly if text complexity is to be
used effectively in the classroom and curriculum.
Numerous formulas exist for measuring the readability of various types of texts. Such formulas,
including the widely used Flesch-Kincaid Grade Level test, typically use word length and
sentence length as proxies for semantic and syntactic complexity, respectively (roughly, the
complexity of the meaning and sentence structure). The assumption behind these formulas is that
longer words and longer sentences are more difficult to read than shorter ones; a text with many
long words and/or sentences is thus rated by these formulas as harder to read than a text with
many short words and/or sentences would be. Some formulas, such as the Dale-Chall Readability
Formula, substitute word frequency for word length as a factor, the assumption here being that
less familiar words are harder to comprehend than familiar words. The higher the proportion of
less familiar words in a text, the theory goes, the harder that text is to read. While these
readability formulas are easy to use and readily available—some are even built into various word
processing applications—their chief weakness is that longer words, less familiar words, and
longer sentences are not inherently hard to read. In fact, a series of short, choppy sentences can
pose problems for readers precisely because these sentences lack the cohesive devices, such as
transition words and phrases, that help establish logical links among ideas and thereby reduce the
inference load on readers.
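The proxy logic described above can be made concrete with a short sketch. The Flesch-Kincaid Grade Level formula itself is published (0.39 × words-per-sentence + 11.8 × syllables-per-word − 15.59), but the vowel-group syllable counter below is a naive stand-in assumed for illustration, so the scores it produces are rough estimates rather than the calibrated values a commercial readability tool would report.

```python
import re

def flesch_kincaid_grade(text):
    """Approximate Flesch-Kincaid Grade Level for a passage of English text.

    Implements the published formula
        0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    with a naive syllable counter, so results are illustrative only.
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        raise ValueError("text must contain at least one word and one sentence")

    def count_syllables(word):
        # Treat each run of consecutive vowels as one syllable;
        # drop a silent final 'e' when the word has other vowel groups.
        groups = re.findall(r"[aeiouy]+", word.lower())
        if word.lower().endswith("e") and len(groups) > 1:
            return len(groups) - 1
        return max(1, len(groups))

    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

simple = "The cat sat on the mat. The dog ran to the park."
dense = ("Notwithstanding considerable methodological heterogeneity, "
         "longitudinal investigations demonstrate substantial associations.")
print(flesch_kincaid_grade(simple) < flesch_kincaid_grade(dense))  # prints True
```

Swapping the syllables-per-word term for the proportion of words absent from a familiar-word list would yield a Dale-Chall-style estimate instead. Note that such a sketch also exhibits the weakness discussed above: a passage of short, choppy sentences scores as "easy" even when low cohesion makes it hard to follow.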
Like Dale-Chall, the Lexile Framework for Reading, developed by MetaMetrics, Inc., uses word
frequency and sentence length to produce a single measure, called a Lexile, of a text’s
complexity. The most important difference between the Lexile system and traditional readability
formulas is that traditional formulas only assign a score to texts, whereas the Lexile Framework
can place both readers and texts on the same scale. Certain reading assessments yield Lexile
scores based on student performance on the instrument; some reading programs then use these
scores to assign texts to students. Because it too relies on word familiarity and sentence length as
proxies for semantic and syntactic complexity, the Lexile Framework, like traditional formulas,
may underestimate the difficulty of texts that use simple, familiar language to convey
sophisticated ideas, as is true of much high-quality fiction written for adults and appropriate for
older students. For this reason and others, it is possible that factors other than word familiarity
and sentence length contribute to text difficulty. In response to such concerns, MetaMetrics has
indicated that it will release the qualitative ratings it assigns to some of the texts it rates and will
actively seek to determine whether one or more additional factors can and should be added to its
quantitative measure. Other readability formulas also exist, such as the ATOS formula associated
with the Accelerated Reader program developed by Renaissance Learning. ATOS uses word
difficulty (estimated grade level), word length, sentence length, and text length (measured in
words) as its factors. Like the Lexile Framework, ATOS puts students and texts on the same
scale.
A nonprofit service operated at the University of Memphis, Coh-Metrix attempts to account for
factors in addition to those measured by readability formulas. The Coh-Metrix system focuses on
the cohesiveness of a text—basically, how tightly the text holds together. A high-cohesion text
does a good deal of the work for the reader by signaling relationships among words, sentences,
and ideas using repetition, concrete language, and the like; a low-cohesion text, by contrast,
requires the reader him- or herself to make many of the connections needed to comprehend the
text. High-cohesion texts are not necessarily “better” than low-cohesion texts, but they are easier
to read.
The standard Coh-Metrix report includes information on more than sixty indices related to text
cohesion, so it can be daunting to the layperson or even to a professional educator unfamiliar
with the indices. Coh-Metrix staff have worked to isolate the most revealing, informative factors
from among the many they consider, but these “key factors” are not yet widely available to the
public, nor have the results they yield been calibrated to the Standards’ text complexity grade
bands. The greatest value of these factors may well be the promise they offer of more advanced
and usable tools yet to come.
Reader and Task Considerations
The use of qualitative and quantitative measures to assess text complexity is balanced in the
Standards’ model by the expectation that educators will employ professional judgment to match
texts to particular students and tasks. Numerous considerations go into such matching. For
example, harder texts may be appropriate for highly knowledgeable or skilled readers, and easier
texts may be suitable as an expedient for building struggling readers’ knowledge or reading skill
up to the level required by the Standards. Highly motivated readers are often willing to put in the
extra effort required to read harder texts that tell a story or contain information in which they are
deeply interested. Complex tasks may require the kind of information contained only in similarly
complex texts.
Numerous factors associated with the individual reader are relevant when determining whether a
given text is appropriate for him or her. The RAND Reading Study Group identified many such
factors in the 2002 report Reading for Understanding:
The reader brings to the act of reading his or her cognitive capabilities (attention,
memory, critical analytic ability, inferencing, visualization); motivation (a purpose for
reading, interest in the content, self-efficacy as a reader); knowledge (vocabulary and
topic knowledge, linguistic and discourse knowledge, knowledge of comprehension
strategies); and experiences.
As part of describing the activity of reading, the RAND group also named important task-related
variables, including the reader’s purpose (which might shift over the course of reading), “the
type of reading being done, such as skimming (getting the gist of the text) or studying (reading
the text with the intent of retaining the information for a period of time),” and the intended
outcome, which could include “an increase in knowledge, a solution to some real-world problem,
and/or engagement with the text.”
Appendix B: Pre- and Post-Surveys
Appendix C: Professional Development Treatment: Facilitator’s Slides and Notes
Appendix D: Letter of Invitation to Participate in Study
INVITATION TO PARTICIPATE IN DOCTORAL STUDY
Study Title: Assessing Teacher Knowledge: A Study of K-8 Teachers’ Knowledge of Text
Complexity, and the Implications of This Knowledge on Instructional Practices in Reading.
Dear Prospective Study Participant:
My name is D. David Moore, and I am a doctoral candidate in the Rossier School of Education at
the University of Southern California. I am conducting a research study as part of the requirements
of my degree in Doctor of Education in Educational Leadership (Ed.D.), and I would like to invite
you to participate.
The purpose of this dissertation study is to examine teacher knowledge in selecting texts to provide
text complexity for a diverse group of students, and to measure the impact of a focused, activity-
based, coherent, interactive professional development on teacher knowledge of text complexity.
The intent of the professional development is to transform teacher knowledge of text complexity
in such a way as to impact student motivation to read and ultimately, student achievement in
literacy.
If you agree to participate in this study, you will be asked to: (1) complete a 34-question online
Pre-Survey, (2) attend a two-and-a-half-hour (2.5-hour) professional development on Text
Complexity, and (3) complete a short online Post-Survey. Each online survey should take
15 minutes to complete. The link to the online pre-survey will be given to you once you have agreed
to participate in the study. Following the pre-survey, the participants will attend a 2.5-hour
professional development to be held at the Literacy Center in the Vista Unified School District
on April 28, 2015. I will facilitate the professional development as the lead researcher in the study.
The professional development will focus on a short presentation on text complexity, followed by
hands-on activities involving all participants. Finally, the participants will take the online post-
survey at the conclusion of the professional development.
We hope the results of this study will inform the larger body of research on the types of instruction
that can increase students’ abilities to read and comprehend complex literary and informational
texts independently and proficiently.
We will keep your records for this study confidential as far as permitted by law. Your participation
in this study is voluntary, and can be terminated at any time with no penalty.
Following the completion of the Post-Survey at the end of the 2.5-hour Professional Development,
a drawing will be done for a Dell Venue 8 Pro 3000 Series Windows Tablet (32GB). Additionally,
there will be drawings for two $25 Starbucks gift cards and five $5 Starbucks gift cards. Upon the
participant’s arrival at the professional development, a ticket with a number will be issued to each
person. The number on the ticket will match a number in the basket from which the drawings will
be made. The Dell Tablet and Starbucks gift cards will be awarded following each of the
drawings. The drawings will take place within the last half hour of the professional development.
All participants in the professional development will have an equal opportunity to win any of the
drawings.
You will not be paid cash for participating in the study. Since the data will be collected at your
work site, no reimbursement will be provided for your travel or parking expenses.
We will be happy to answer any questions you have about the study. If you have any questions or
concerns about the research, please feel free to contact David Moore by calling (951) 595-1718 or
via email at donaldmo@usc.edu.
If you have questions, concerns, or complaints about your rights as a research participant or the
research in general and are unable to contact the research team, or if you want to talk to someone
independent of the research team, please contact the University Park Institutional Review Board
(UPIRB), 3720 South Flower Street #301, Los Angeles, CA 90089-0702, (213) 821-5272 or
upirb@usc.edu
Thank you for your consideration. If you would like to participate, please contact me by phone or
email using the contact information listed below.
With kind regards,
D. David Moore
D. David Moore
951-595-1718
donaldmo@usc.edu
Appendix E: Flyer for PD Treatment
Appendix F: Consent Form
TEACHER’S CONSENT FORM
University of Southern California
Rossier School of Education
Waite Phillips Hall
3470 Trousdale Parkway, Los Angeles, CA 90089
Assessing Teacher Knowledge: A Study of K-8 Teachers’ Knowledge of Text Complexity,
and the Implications of this Knowledge on Instructional Practices in Reading
TEACHERS
Because you are a Kindergarten - 8th grade teacher at your school site, you are being invited
to participate in a research study conducted by David Moore (M.Ed) and Sandra Kaplan (Ed.D)
for the University of Southern California. Your participation is voluntary. You should read the
information below, and ask questions about anything you do not understand, before deciding
whether to participate. Please take as much time as you need to read the consent form. If you decide
to participate, you will be asked to sign this form. A copy of this form will be provided for your
records.
PURPOSE OF THE STUDY
The purpose of this dissertation study is to examine teacher knowledge in selecting texts
to provide text complexity for a diverse group of students, and to measure the impact of a focused,
activity-based, coherent, interactive professional development on teacher knowledge of text
complexity. The intent of the professional development is to transform teacher knowledge of text
complexity in such a way as to impact student motivation to read and ultimately, student
achievement in literacy.
STUDY PROCEDURES
As a teacher volunteering to participate in this study, you will be asked to perform three
duties: (1) complete a 34-question Pre-Survey, (2) attend a two-and-a-half-hour (2.5-hour)
professional development on Text Complexity, and (3) complete a short Post-Survey.
The Pre- and Post-Surveys
The Pre- and Post-Surveys will assess teacher knowledge of Text Complexity before and
after the 2.5-hour professional development. Each survey should take 15 minutes to complete.
The Professional Development
The professional development will focus on text complexity in such a way as to impact
student motivation to read and, ultimately, student achievement in literacy.
POTENTIAL RISKS AND DISCOMFORTS
No foreseeable risks or discomforts will be part of the study.
POTENTIAL BENEFITS TO PARTICIPANTS AND/OR TO SOCIETY
There are two possible benefits that you might experience from seeing the analysis of the
data. First, you could get a sense of how your knowledge increases in the area of Text Complexity.
Second, you will be able to see how your instructional practices in reading begin to build the
capacity your students have in reading and comprehending complex literary and informational
texts independently and proficiently.
Regarding the benefits to the larger society, the results of this study may inform teachers
across the world on the type of instruction that can increase students’ ability to read and
comprehend complex literary and informational texts independently and proficiently. This
possible shift in understanding has the potential to increase the capacity of students to succeed in
high school and college courses, as well as to increase their ability to be creative about problems
in general. Of course, the benefits previously mentioned are conditional upon the results
obtained in the study.
PAYMENT/COMPENSATION FOR PARTICIPATION
Following the completion of the Post-Survey at the end of the 2.5-hour Professional
Development, a drawing will be done for a Dell Venue 8 Pro 3000 Series Windows Tablet (32GB).
The Dell Tablet will be awarded following the drawing. You will not be paid cash for participating
in the study. Since the data will be collected at your work site, no reimbursement will be provided
for your travel or parking expenses.
CONFIDENTIALITY
We will keep your records for this study confidential as far as permitted by law.
However, if we are required to do so by law, we will disclose confidential information about you.
The members of the research team and the University of Southern California’s
Human Subjects Protection Program (HSPP) may access the data. The HSPP reviews and monitors
research studies to protect the rights and welfare of research subjects.
The data will be stored inside a locked briefcase during transit and inside a locked filing
cabinet at all other times. After results are analyzed, all participants will have the ability to view
them. However, the raw data will not be viewed by, or released to, anyone besides the principal
investigator (David Moore) and the faculty advisors overseeing the study.
In order to maintain anonymity, each participating teacher will complete the Pre- and Post-
Survey online.
The raw data will be kept for the required minimum of three years after the completion of the
study. After the three years are over, the principal investigator will destroy the data beyond any
possible recognition.
CERTIFICATE OF CONFIDENTIALITY
Any identifiable information obtained in connection with this study will remain confidential,
except if necessary to protect your rights or welfare (for example, if you are injured and need
emergency care). A Certificate of Confidentiality has been obtained from the Federal Government
for this study to help protect your privacy. This certificate means that the researchers can resist the
release of information about your participation to people who are not connected with the study,
including courts. The Certificate of Confidentiality will not be used to prevent disclosure to local
authorities of child abuse and neglect, or harm to self or others. When the results of the research
are published or discussed in conferences, no identifiable information will be used.
PARTICIPATION AND WITHDRAWAL
Your participation is voluntary. Your refusal to participate will involve no penalty or loss of
benefits to which you are otherwise entitled. You may withdraw your consent at any time and
discontinue participation without penalty. You are not waiving any legal claims, rights or remedies
because of your participation in this research study. If the principal investigator observes any
misconduct, dishonesty, or unethical behavior on the part of any participant, he reserves the right
to terminate participation immediately.
INVESTIGATOR’S CONTACT INFORMATION
If you have any questions or concerns about the research, please feel free to contact David Moore
by calling (951) 595-1718 or via email at donaldmo@usc.edu.
RIGHTS OF RESEARCH PARTICIPANT – IRB CONTACT INFORMATION
If you have questions, concerns, or complaints about your rights as a research participant or the
research in general and are unable to contact the research team, or if you want to talk to someone
independent of the research team, please contact the University Park Institutional Review Board
(UPIRB), 3720 South Flower Street #301, Los Angeles, CA 90089-0702, (213) 821-5272 or
upirb@usc.edu
I have read the information provided above. I have been given a chance to ask questions. My
questions have been answered to my satisfaction, and I agree to participate in this study. I have
been given a copy of this form.
____________________________________________________
Name of Participant
____________________________________________________ ________________________
Signature of Participant Date
I have explained the research to the participant and answered all of his/her questions. I believe
that he/she understands the information described in this document and freely consents to
participate.
___________________________________________________
Name of Person Obtaining Consent
___________________________________________________ ________________________
Signature of Person Obtaining Consent Date
SIGNATURE OF RESEARCH PARTICIPANT
SIGNATURE OF INVESTIGATOR
Appendix G: USC IRB Approval of Study
UNIVERSITY OF SOUTHERN CALIFORNIA UNIVERSITY PARK INSTITUTIONAL
REVIEW BOARD
3720 South Flower Street Credit Union Building (CUB) #301
Los Angeles, CA 90089-0702
Phone: 213-821-5272
Fax: 213-821-5276
upirb@usc.edu
Date: May 07, 2015, 03:56pm
Action Taken: Approve
Principal Investigator: Donald Moore, ROSSIER SCHOOL OF EDUCATION
Faculty Advisor: Sandra Kaplan, ROSSIER SCHOOL OF EDUCATION
Co-Investigator(s):
Project Title: _Protocol - Tue Dec 9 16:05:10 PST 2014
Study ID: UP-15-00016
Funding: NO FUNDING SOURCES
The University Park Institutional Review Board (UPIRB) designee determined that your
project qualifies for exemption from IRB review under the USC Human Research
Protection Program Flexibility Policy. The study was approved on 05/07/2015 and is
not subject to 45 CFR 46 regulations, including informed consent requirements or
further IRB review.
IF THERE ARE MODIFICATIONS THAT INCREASE RISK TO SUBJECTS OR IF THE
FUNDING STATUS OF THIS RESEARCH IS TO CHANGE, YOU ARE REQUIRED TO
SUBMIT AN AMENDMENT TO THE IRB FOR REVIEW AND approval.
The IRBA has made minor revisions to sections 21.2, 24.2, 24.4 and the recruitment
flyer, the recruitment letter and the information sheet.
The following materials were reviewed and approved:
-- Certified Information Sheet UP-15-00016 4.30.15
-- Certified Recruitment Flyer UP-15-00016 4.30.15
-- Certified Recruitment Letter UP-15-00016 4.30.15
TO ACCESS IRB-approved DOCUMENTS, CLICK ON THE
“Approved DOCUMENTS” LINK IN THE STUDY WORKSPACE. THESE ARE ALSO
AVAILABLE UNDER THE “DOCUMENTS” TAB.
PLEASE CHECK WITH ALL PARTICIPATING SITES TO MAKE SURE YOU HAVE
THEIR PERMISSION AND ANY NECESSARY DISTRICT approvals/ETHICS BOARD
REVIEW TO CONDUCT RESEARCH PRIOR TO BEGINNING YOUR STUDY.
Attachments:
Social-behavioral health-related interventions or health-outcome studies must register
with clinicaltrials.gov or other International Committee of Medical Journal Editors
(ICMJE) approved registries in order to be published in an ICMJE journal. The ICMJE
will not accept studies for publication unless the studies are registered prior to
enrollment, despite the fact that these studies are not applicable “clinical trials” as
defined by the Food and Drug Administration (FDA). For support with registration, go
to www.clinicaltrials.gov or contact Jean Chan (jeanbcha@usc.edu, 323-442-2825).
Abstract
Preparing students for College and Career Readiness (CCR) requires instruction that prepares students to interact with complex, college-level text. The challenge is that teachers are not prepared to implement reading instruction that adequately prepares students to read complex texts, and students are entering college unprepared to read at the college level. Students entering college in need of remediation in reading brought about a renewed emphasis on text complexity. The renewed emphasis is highlighted in the Common Core State Standards (CCSS) and CCR Reading Anchor Standard 10, which require capable and effective teachers to teach reading using the knowledge and skills of text complexity.

This dissertation examines the impact of targeted professional development on teachers’ perceived pre- and post-professional-development skill levels in the qualitative, quantitative, and reader and task measures of text complexity. Utilizing a quasi-experimental one-group pretest-posttest design and a professional development session in text complexity, the researcher solicited the responses of 11 teacher participants. Based on the principal component analysis and the t-test scale reliability, the professional development session increased teachers’ knowledge and skills in text complexity.
Asset Metadata
Creator
Moore, Donald David (author)
Core Title
An examination of the impact of professional development on teachers' knowledge and skills in the use of text complexity
School
Rossier School of Education
Degree
Doctor of Education
Degree Program
Education (Leadership)
Publication Date
12/13/2020
Defense Date
08/10/2020
Publisher
University of Southern California (original), University of Southern California. Libraries (digital)
Tag
Anchor Reading Standard 10,college and career readiness,college and career ready,Common Core State Standards,early literacy,frequency sentence length,knowledge demands,language conventionality and clarity,levels of meaning or purpose,Literacy,literacy instruction,mixed methods,Motivation,OAI-PMH Harvest,principal component analysis,professional development,purpose and complexity of task,qualitative,quantitative,reading pedagogy,structure,text cohesion,text complexity,word length
Language
English
Contributor
Electronically uploaded by the author (provenance)
Advisor
Gallagher, Raymond (committee chair), Johnson, Lizbeth (committee chair), Kaplan, Sandra (committee chair)
Creator Email
davidmoore0102@gmail.com,donaldmo@usc.edu
Permanent Link (DOI)
https://doi.org/10.25549/usctheses-c89-404829
Unique identifier
UC11668080
Identifier
etd-MooreDonal-9218.pdf (filename),usctheses-c89-404829 (legacy record id)
Legacy Identifier
etd-MooreDonal-9218.pdf
Dmrecord
404829
Document Type
Dissertation
Rights
Moore, Donald David
Type
texts
Source
University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection)
Access Conditions
The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the a...
Repository Name
University of Southern California Digital Library
Repository Location
USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA