AN EVALUATION OF THE IMPACT OF DIRECT INSTRUCTION
INTERVENTION ON THE ACADEMIC ACHIEVEMENT OF ENGLISH
LANGUAGE LEARNERS
by
Miguel Angel Guerrero
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
August 2010
Copyright 2010 Miguel Angel Guerrero
DEDICATION
This work is dedicated to
my mother, Martha Hagan, who showed me that education provides
the keys to unlock all doors;
to all my daughters, Chelsea, Janessa, Alexis, Alyssa, and Adriana, who,
I hope, will never stop trying to open them;
and, finally, to my beautiful wife, without whose love, support, and endless
patience this dream would never have been realized.
ACKNOWLEDGEMENTS
With much gratitude, I would like to thank Dr. Hocevar, my dissertation chair,
for guiding me through this process.
TABLE OF CONTENTS
DEDICATION ii
ACKNOWLEDGEMENTS iii
LIST OF TABLES vii
ABSTRACT ix
CHAPTER 1: PROBLEM IDENTIFICATION 1
Problem Identification 1
Problem Analysis 6
Knowledge Factor 6
Teacher Motivation 7
Organizational Factors 9
Establishing a Data-Driven School Culture 10
Problem Solution 11
Implementation of Explicit Direct Instruction 11
ELD Program Design 13
Value Added Model 16
Purpose, Design, and Utility 17
Purpose 17
Research Questions 18
Study Design 18
Utility 20
CHAPTER 2: LITERATURE REVIEW 21
Introduction 21
No Child Left Behind Act and Its Impact on English-Language
Learners 23
Common Criteria for Identifying ELL 24
ELL Typologies 26
English Learners’ Achievement on Standardized Testing 29
Impact of Standardized Testing on ELLs 29
Value Added Model Growth 33
Literature on Direct Instruction 36
Teaching Models 36
Direct Instruction and English Language Learners 40
Summary of the Literature 43
CHAPTER 3: METHODOLOGY 45
Design Summary 45
Pre-Post Design 45
Description of Benchmark School 47
Participants/Sampling 48
Study Participants 48
Intervention Description 48
Instrumentation 49
Quantitative 49
Qualitative 51
Data Analysis 52
Quantitative Analysis 52
Qualitative Analysis 53
Delimitations and Limitations of the Study 53
CHAPTER 4: RESULTS 55
Overview 55
Pre/Post Dependent Group Design 56
Non-equivalent Comparison Group Design 57
Grade Level Growth Scores 57
8th to 9th Grade Transition 58
9th to 10th Grade Transition 59
10th to 11th Grade Transition 60
Comparison School Results 61
API Comparisons 62
Adequate Yearly Progress Comparison 66
CHAPTER 5: SUMMARY, DISCUSSION, AND
RECOMMENDATIONS 69
Overview 69
Purpose and Method 69
Summary of Findings 71
Patterson High School 71
Statistical Significance 73
Practical Significance 73
Interviews 77
Observations 79
Patterson High School and Selma High School 80
Academic Performance Index 80
Adequate Yearly Progress 81
Title III English Language Proficiency – AMAO 82
Factors Impacting PHS Overall Performance 82
Implications 83
Implications for Implementation of Direct Instruction 85
Site-Based Recommendations 87
Limitations 89
Conclusions 91
REFERENCES 93
APPENDICES 99
Appendix A: API Targets Matrix 99
Appendix B: English Language Learners’ Class Placements 100
Appendix C: Patterson High School’s Lesson Design 101
Appendix D: Walk-Through Observation Form 103
LIST OF TABLES
Table 1: Patterson High School Ethnic Breakdown 2
Table 2: Academic Performance Index (API) School Report 3
Table 3: 2006 CST ELA Comparisons of English Language Learners
vs. Schoolwide 4
Table 4: Patterson High School’s Preliminary CELDT Scores 2006 by
Proficiency Level 14
Table 5: English Language Learners Typologies and Key
Characteristics 27
Table 6: Lesson Design Models 38
Table 7: Patterson HS and Selma HS Comparisons Based on the 2006
Accountability Progress Report 47
Table 8: Normal Curve Equivalent Scores 61
Table 9: Grade Level Growth Scores 61
Table 10: Comparison of API Base Scores of Patterson HS and Selma
HS from 1999–2007 62
Table 11: API Results of ELL Students at Patterson HS Who Have CST
Results for Both the 2006 and 2007 CST in ELA 64
Table 12: Performance Levels of the ELA Portion on the CST for ELL
between 2006 and 2007 65
Table 13: Number of ELL Students Broken Down by Performance Level
and Grade for 2006 and 2007 65
Table 14: AMOs – Percent of Students Scoring Proficient or Advanced
in English-Language Arts 67
Table 15: Comparison of Title III School Accountability 68
Table 16: Patterson High School Enrollment by Ethnicity 83
Table A-1: Patterson High School: API Targets Met Matrix 99
Table B-1: Patterson High School: English Learners’ Class Placements 100
ABSTRACT
The purpose of this case study was to analyze the impact of direct instruction
on the academic achievement of 342 high school English Language Learners on the
English Language Arts portion of the California Standards Test (CST). Using
dependent-group and non-equivalent comparison-group benchmark designs, the study
employed a mixed-methods approach that focused primarily on the quantitative data to
determine both statistical and practical significance. DATAWORKS Explicit Direct
Instruction was the direct instruction model selected for the intervention. Data were
drawn from the English Language Arts portion of the California Standards Test for the
2006 and 2007 administrations.
This research-based intervention demonstrated mixed results on the
California Standards Test. On traditional indicators of accountability (CST, API, and
AYP), declines in test performance from pre-intervention to post-intervention were
observed. However, both California’s Academic Performance Index (API) and the No
Child Left Behind (NCLB) Adequate Yearly Progress (AYP) indices are based on a
comparison of “successive cohorts” of students. Such a comparison assumes that the
students attending the school have similar background characteristics and prior
learning on a year-to-year basis. The inherent problem of using successive cohorts to
measure school improvement is widely recognized by practitioners and researchers,
and there is broad consensus that longitudinal measurement of individual student
growth better measures student academic achievement. In the present study, the
successive cohorts were dramatically different because of a large influx of students
from a nearby urban area. Thus, the declines in the CST, API, and AYP scores likely
were due to a changing student population and not the direct instruction intervention.
CHAPTER 1
PROBLEM IDENTIFICATION
Patterson High School (PHS) is the only comprehensive high school in the
Patterson Unified School District. PHS was established in 1914 and serves the
communities of Patterson, Westley, Vernalis, Grayson, and the San Antonio Valley.
The area PHS serves is rural, and the local economy is based on agriculture
and food processing. The school, as well as the community, has been experiencing
steady growth and a changing population. Many families have recently moved to
Patterson from the Bay Area, and many of these newcomers commute to the Bay Area
for employment. As a result, Patterson is beginning to exhibit some of the
characteristics of a bedroom community.
The student population at Patterson High School is becoming more culturally
and linguistically diverse, with an English Learner population of 28.1%. Students
speak seventeen different languages, with Spanish the dominant language. Forty-five
percent of the student population receives breakfast and lunch through the free and
reduced-price meal program of the National School Lunch Program. The following
table presents the ethnic breakdown of Patterson High School, grades 9 through 12.
Table 1
Patterson High School Ethnic Breakdown
Ethnicity Enrollment Percentage
American Indian 16 1%
Asian 40 2.6%
Pacific Islander 21 1.3%
Filipino 23 1.5%
Latino 975 63.5%
African American 96 6.2%
White 372 23.8%
Multiple/ No Response 17 1.1%
Total 1560 100%
The teaching staff comprises 68 full-time-equivalent teachers: 50 subject
area teachers, 10 vocational education teachers, and 8 special education teachers.
One hundred percent of the subject area teachers are fully credentialed and meet the
highly qualified criteria under the No Child Left Behind Act of 2001. The ethnic
breakdown of the staff differs from that of the student population: the staff is 68.7%
White, 23.9% Latino, 1.5% African American, and 6.0% Multiple/No Response.
Furthermore, data from the 2006 Base Academic Performance Index (API)
School Demographic Characteristic Report for Patterson High School provide
information on the parent education level of students in grades 9 through 11 who
participated in STAR testing. According to that source, 78% of the student answer
documents included parent education level information. Of those responding, 31%
indicated “not a high school graduate,” 33% marked “high school graduate,” 28%
indicated “some college,” 6% responded “college graduate,” and 2% indicated
“graduate school.” In addition, the average parent education level was 2.14,
where “1” represents “not a high school graduate” and “5” represents “graduate
school.”
Patterson High School is facing a set of challenges common in today’s
American public high schools: high growth in student population, limited financial
resources, high teacher turnover, and continual pressure to improve on state and local
assessments. Below is an overview of how PHS has performed over the last five years
under the current state accountability system. PHS has met its API growth target in
three of the last five years, as reflected in the table below. With the exception of 2002
and 2006, Patterson High School has continually shown growth over the base rankings
on the Academic Performance Index (API).
Table 2
Academic Performance Index (API) School Report
Year   STAR Percent Tested   API Score   API Base   API Target   API Growth   Met Target   Statewide Rank   Similar Schools Rank
2002            99              577         583          11          -6           No              3                  8
2003            98              625         584          11          41           Yes             4                  9
2004            99              642         626           9          16           Yes             4                  9
2005            98              665         631           8          34           Yes             4                  8
2006            98              661         665           7          -4           No              4                  8
From California Department of Education, 2007.
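The Growth column in Table 2 is simply the difference between each year’s API score and the prior base, and the target is met when that growth reaches the API target. The short sketch below (my own illustration, not part of any state reporting tool) recomputes the Growth and Met Target columns from the table’s data:

```python
# Recompute Table 2's derived columns from the reported score, base, and target.
rows = [
    # (year, api_score, api_base, api_target)
    (2002, 577, 583, 11),
    (2003, 625, 584, 11),
    (2004, 642, 626, 9),
    (2005, 665, 631, 8),
    (2006, 661, 665, 7),
]

for year, score, base, target in rows:
    growth = score - base        # API Growth: this year's score minus the base
    met = growth >= target       # target is met when growth reaches the target
    print(year, growth, "Yes" if met else "No")
```

Rerunning the arithmetic this way confirms that the two “No” years, 2002 and 2006, are exactly the years in which growth fell short of the target.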
Schoolwide, there is a cumulative growth change of 81 points over the last six
years (please see Appendix A). PHS’s socioeconomically disadvantaged and Latino
subgroups have continually met or exceeded their targets. The White subgroup
is the only subgroup to show no improvement or a negative change, in 2002 and 2004.
The cumulative target-to-growth difference for each subgroup over the last five years
shows that the socioeconomically disadvantaged subgroup has grown 117 points, the
Latino subgroup has a growth difference of 98 points, and the White subgroup shows
37 points over its target change. Scores for Special Education and English Learners
became available in the 2005–2006 school year. The PHS special education population
has a cumulative target-to-growth score of 59, while EL students show a decline of 19
points.
The decline of EL students has forced PHS to engage in several data-driven
activities that have brought the school face-to-face with the achievement gap of its EL
student population. The data revealed that the majority of our English Language
Learners were performing Far Below Basic or Below Basic on the ELA portion of the
CST.
Table 3
2006 CST ELA Comparisons of English Language Learners vs. Schoolwide
Percent in Each Performance Level

             Far Below Basic   Below Basic   Basic   Proficient   Advanced
Schoolwide         15%             21%        33%       22%          9%
ELL                20%             34%        37%        8%          1%
EduSoft (2006)
The table above indicates that PHS has a long way to go in closing the
achievement gap between English Language Learners (ELL) and the general
population. In 2006, 342 ELL students took the CST in English Language Arts.
Only 4 students performed at the Advanced level, and a total of 26 students scored at
the Proficient level. It became obvious that PHS needed to implement a new approach
to its ELD course. According to Johnson (2002), “the central purpose of all of the data
activities is to improve learning opportunities and outcomes for students” (p. 72).
With 184 students scoring Below Basic or Far Below Basic, the need to restructure
was evident.
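Because Table 3 reports rounded percentages, the underlying student counts can be approximated from the 342 ELL test-takers. The following back-of-the-envelope sketch (my own arithmetic, not from the study) converts the table’s ELL row to counts:

```python
# Convert Table 3's ELL percentages into approximate student counts.
n_ell = 342
levels = {
    "Far Below Basic": 0.20,
    "Below Basic": 0.34,
    "Basic": 0.37,
    "Proficient": 0.08,
    "Advanced": 0.01,
}

counts = {name: round(n_ell * p) for name, p in levels.items()}
below_basic_or_lower = counts["Far Below Basic"] + counts["Below Basic"]

print(counts)
print(below_basic_or_lower)  # 184, the count cited in the text
```

The reconstructed counts differ slightly from the exact figures reported in the text (4 Advanced, 26 Proficient) because the published percentages are themselves rounded, but the 184 students at Below Basic or lower is recovered exactly.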
No Child Left Behind (NCLB) accountability measures have highlighted the
demographic split at PHS. As stated above, there is a significant achievement gap
between ELL students and the general population. PHS did not meet the benchmark
measures for its EL population’s performance on the English Language Arts California
Standards Test. Failure to meet the benchmark in two consecutive years means PHS
will be identified under NCLB as a Program Improvement school.
The purpose of the study was to conduct action research to solve a problem of
practice around English Language Arts instruction for ELL students who are not
performing at the Proficient or Advanced level. If Patterson High School does not
improve its ELL performance on the CST, it will be placed on Program Improvement
(PI) status and be subject to numerous interventions that the State deems applicable to
the school and its student population. This study evaluated the effectiveness of a local
intervention in English Language Arts for students who are Far Below Basic and
Below Basic.
Problem Analysis
ELL achievement at PHS is a major concern. Several factors have
contributed to PHS’s failure to meet required accountability benchmarks, and these
factors need to be analyzed in order to understand the continual performance gap of
English Learners at Patterson High School. Clark and Estes (2002) provide a process
for analyzing “the cause of the gaps between current and desired performance” (p. 22)
and assert that “the gap between desired and actual performance must be assessed and
closed if organizational goals are to be achieved” (p. 22). Ultimately, in order to attain
the benchmarks established by NCLB, three factors must be addressed: knowledge
and skill factors, motivation factors, and organizational factors.
Knowledge Factor
This area focuses upon the pedagogical and content expertise of the teacher.
The classroom teacher is the main issue that needs to be addressed in relation to
student achievement at PHS. Marzano (2003) notes that “all researchers agree that
the impact of decisions made by individual teachers is far greater than the impact of
decisions made at the school level” (p. 71). Clearly, teachers who are not familiar
with effective instructional strategies are a huge obstacle to student achievement,
because they make instructional decisions that negatively affect their students. A
skilled teacher, by contrast, can have a profound effect on student motivation and
student achievement.
These factual and procedural knowledge gaps require that teachers learn
effective instructional strategies to address the needs of their ever-changing student
population. Currently at PHS, teachers have aligned their curriculum and created
pacing calendars based upon state content standards. However, there seems to be a
reliance on textbook-driven instruction rather than standards-driven instruction.
Therefore, teacher training is needed to resolve a situation rooted in a lack of
knowledge and skills, as identified by Clark and Estes (2002): “Knowledge and skills
enhancement are required when people do not know how to accomplish their
performance goals and second when you anticipate that future challenges will require
novel problem solving” (Clark & Estes, 2002, p. 33).
Patterson High made several attempts to enhance the knowledge and skills of
teachers. These attempts were mainly one-day workshops in which only a select few
participated, and they did not have much impact on student achievement. Patterson
High School needed professional development that was ongoing and focused on
helping teachers develop the range of knowledge and skills they need to tailor
instruction to their students. Strong professional development is based on a process
through which teachers are partners in identifying student achievement concerns and
defining the changes they need to make in their own practices (Olsen, Romero, &
Gold, 2006b).
Teacher Motivation
Teacher motivation is another issue contributing to PHS’s EL academic
performance. “Central to excellent schooling is excellent teaching” (DuFour & Eaker,
1992, p. 113). It is impossible to have an excellent school unless the school’s
instructional program is characterized by excellence in teaching. Recently the district
had to provide Cross-Cultural Language and Academic Development training to 10
teachers at PHS. This was the district’s third attempt to certify that all teachers have
been trained in effective strategies for EL students.
In addition, the district, as well as the site, has invested numerous resources
in training all staff members in DATAWORKS Explicit Direct Instruction.
Although there has been some success utilizing direct instruction strategies at PHS,
some staff members remain reluctant to utilize these strategies; they refuse to invest
the mental effort necessary to be effective. Clark and Estes (2002) have found that
mental effort is determined, in large measure, by confidence: those who lack
confidence tend not to invest much mental effort in a task.
Another teacher-level factor that Marzano (2001) has identified, and that is at
play at PHS, is poor monitoring of student progress to provide “effective feedback.”
Currently at PHS, students are tested every six weeks on the standards they have
covered, through EduSoft, the district’s adopted online data system. However, several
teachers expressed concern that they lacked the skills necessary to provide effective
feedback on the results of the benchmark assessments. Most teachers at PHS are not
providing their students with specific feedback based on their knowledge and skill;
most of the feedback students receive is just a percentage score on tests. Teachers at
PHS are missing out on what Marzano (2001) says: “the most powerful single
modification a teacher can implement in their class to enhance achievement is
feedback” (p. 96).
Organizational Factors
As a major step toward developing the resources and expertise needed to
achieve its school improvement goals, PHS sent a team of 10 faculty and staff to the
2004 Redesigning American High Schools Institute at the Harvard Graduate School of
Education (HGSE). In the course of the program, the PHS team learned several
concepts and practices for improving student learning outcomes. While at Harvard,
the PHS team identified the concept of “school-within-a-school” as an element of
school structure that had the potential to contribute to personalization of the learning
experience for their students. PHS believes the school-within-a-school format will
allow students to form stronger relationships with their teachers and allow teachers to
better tailor the curriculum and instructional methods to children they know well.
The current English Learner program design at Patterson High School was
another organizational barrier that impeded English Learners’ ability to learn. A
typical ELL student schedule was made up mostly of double-rostered classes that
appear on paper to provide specially designed instruction for English Language
Learners but in reality are mainstreamed placements, with instructional pace and
strategies geared toward English-fluent students. The effect was a “sink or swim”
approach that failed to provide the support English Learners required to
access the curriculum and overcome the language barrier (Olsen, Romero, & Gold,
2006a).
At the district level, Patterson Unified School District has improved greatly in
many areas while facing several ongoing challenges. The use of data to make
informed decisions represents a positive shift, and the districtwide adaptation and
implementation of research-based instructional programs continue to institutionalize
best practices. The district has directed all sites to train all certificated staff members
in Explicit Direct Instruction; with the assistance of the Regional System of District
and School Support, Region VI, all sites have been trained in the effective use of
Direct Instruction. This model is currently being used throughout the district.
Establishing a Data-Driven School Culture
Moving PHS in the right direction has met cultivating and developing a mind
set that looks at data in a thoughtful and meaningful way. According to Johnson
(2002) no single person can bring schoolwide reform. It takes a collaborative effort by
a core group of individuals that are committed to reform. “Collaborating in teams
creates a shared vision and shared responsibility”(Johnson, 2002, p. 54). PHS has an
established leadership team composed of department chairs, our ELD director, head
counselor, and administrative team. Before my arrival to PHS, the leadership team
was basically a means for the Principal to disseminate information. Members had
little opportunity to engage in the decision making process. The leadership team now
11
finds itself in leading the staff in data-analysis, creating, evaluating and monitoring
curriculum.
In addition, the staff at PHS has been engaged in several professional
development activities that have brought them face-to-face with their students’
achievement data.
One of the most important goals of data is to stimulate
dialogue in the school community. Whenever data are
presented, time – and ideally facilitation as well – must be
invested for the dialogue. Inevitably, some of the discoveries
that the school and district make regarding their beliefs,
practices and outcomes will be painful. (Johnson, 2002, p. 49)
Problem Solution
The problem of low achievement for EL students is not unique to PHS, and it
is a problem the district must address. While there are many pros and cons to NCLB,
holding schools responsible for the achievement of all students is a positive element of
the law. The problem is not a small one and is being addressed on multiple levels.
Fortunately, many pieces are in place that will assist our teachers in meeting the needs
of our ELL students.
Implementation of Explicit Direct Instruction
One of the pieces Patterson High School implemented as an intervention to
address the English Learner achievement gap was the DATAWORKS Explicit Direct
Instruction model. Direct Instruction was selected as the intervention because of its
proven track record for closing the achievement gap (Chall, 2002; Good & Grouws,
1979; Hunter, 1982; Stevens & Rosenshine, 1981). DATAWORKS Explicit Direct
Instruction is an approach based on the premise that all teachers can learn how to
deliver effective lessons that significantly improve achievement for all learners,
including English language learners and students with special needs. DATAWORKS
combines educational theory, brain research, and data analysis to present a step-by-
step guide for implementation of the direct instruction model. The model is suited to
all content areas and shows teachers how to use this highly effective approach to
improve instruction and achievement for every student.
Therefore, the whole staff was provided with several daylong trainings giving
an overview of Explicit Direct Instruction. These trainings were followed by early-
release Wednesday in-service days, in which teachers participated in ongoing training
that led them through a step-by-step process using the DATAWORKS Explicit Direct
Instruction template. Teachers developed their own lessons using the DATAWORKS
lesson design and in some cases were observed implementing them (please see
Appendix C for the lesson design). The goal was to establish a common vocabulary
and a consistent instructional framework. This instructional framework provided the
structure and job aid for a common lesson design at Patterson High School (Clark &
Estes, 2002).
Direct instruction is a method of designing well-crafted lessons to teach skills
and concepts to students. The method uses research-based instructional components
proven to optimize student learning (Adams & Engelmann, 1996; Duffy & Roehler,
1984; Good & Brophy, 1987; Hunter, 1982; Stevens & Rosenshine, 1981). Each
lesson is designed and taught to ensure that students can do the independent work
successfully. An essential part of direct instruction is checking for understanding,
performed throughout the lesson to verify that students are learning what is being
taught while it is being taught; this is in line with Marzano’s (2001) characteristics of
effective feedback. When checking for understanding, the teacher calls on non-
volunteers, gives effective feedback, and provides wait time for students. This
instructional model is currently being used throughout Patterson High School.
In order to provide immediate feedback to teachers, the principal and assistant
principal have been trained in conducting walk-throughs with a single focus (please
see Appendix D for the walk-through form), for example, gathering evidence that
teachers are checking for understanding, or that teachers are presenting the learning
objective at the start of the lesson.
ELD Program Design
With the assistance of the EduSoft computer software, PHS is able to do what
Johnson (2002) calls “peeling the data” (p. 84). By further disaggregating PHS’s
results, the researcher was able to identify content clusters in English Language Arts
that the pacing calendars either neglected or dwelt on at the expense of more relevant
standards. We were not giving our EL students what Marzano (2003) calls “an
opportunity to learn” with our current ELD program design.
Furthermore, it was evident that PHS needed to restructure its English
Language Development program. With the assistance of our ELL Director, we began
to look at what was in place and what needed to be added. We examined numerous
assessments, both summative (i.e., CST and CELDT) and formative (i.e., chapter tests,
the district writing test, and the Accelerated Reading Test). What became apparent
was that we had a very heterogeneous EL student population. Our ELD program
consisted of one class meant to meet the needs of all students regardless of their
CELDT level, and one class could not meet the varied needs of all the English
Language Learners in our school. The one-size-fits-all mentality was not working.
We had to look at all students and their appropriate placement according to CELDT
levels. The following is a breakdown, by grade and CELDT level, of all the students
at PHS who have been identified as English Learners.
Table 4
Patterson High School’s Preliminary CELDT Scores 2006 by Proficiency Level
GRADES   Beginning   Early Intermediate   Intermediate   Early Advanced   Advanced   TOTAL
         (Level 1)       (Level 2)          (Level 3)       (Level 4)     (Level 5)
9th          16             23                 45              55             18        157
10th          8             20                 49              46              8        131
11th          8             12                 37              42             13        112
12th          1             10                 17              35             27         90
TOTAL        33             65                148             178             66        490
As the table above indicates, the bulk of our students are at CELDT level 3
or 4. Appendix B illustrates a draft of what was implemented at PHS as its ELD
program. This program design is a comprehensive program addressing the specific
needs of our English Learner population. The diversity of the English Learner
population requires specific support services and curriculum offerings that address
unique needs. Students will now be placed in the appropriate class based on their
CELDT level, and each level’s course will address our EL students’ specific language
proficiency level. No longer will we have students at multiple levels in one class; our
instructors will be able to address each student’s language proficiency specifically.
According to Olsen, Romero, and Gold (2006b), a comprehensive program for English
Language Learners includes the following:

…initial assessment of home language literacy, academic
content knowledge and mastery, and English skills. The
ability to monitor student progress and to provide good
dedicated counselling support is also essential to ensure
appropriate placement and to facilitate accelerated movement
through the program for students grappling with the double
challenge of achieving English mastery and academic content
mastery (p. 13).

This new ELD program design meets the orientation and transition needs of newly
arrived immigrants, addresses the academic gaps of students, and provides the
flexibility of movement and access to high-level academic courses needed by highly
educated adolescent immigrant students at Patterson High School.
Value Added Model
While reforms are always designed with the genuine intent of serving all
students, a general lack of expertise about the needs of English Learners, and
inadequate awareness of effective strategies for meeting those needs, among those
who have designed and led the implementation of reforms has tended to result in
reforms that do not adequately address the unique needs of students who are limited in
English proficiency (Olsen, Romero, & Gold, 2006b). A lack of monitoring and of
assessments that are valid and reliable for English Learners has often meant that gaps
in meeting the needs of this group of students go unnoticed for too long (Lachat,
1999).
Therefore, a new method of measuring student academic growth was
incorporated into this study: one that measures gains for the same individual student
from year to year in the same school. This method of accountability, the “student-
level longitudinal data model” or “value added model,” was first introduced by
Sanders and Horn in 1994. Currently, year-to-year growth is the basis for assessing
school performance in most states. In California, this involves a comparison between
last year’s students and this year’s students using test scores aggregated into the
Academic Performance Index (API). This model is based on what Linn (2000) calls
the “successive groups” model of accountability. The model is inherently flawed
because scores are based on different cohorts of students, and it is difficult to know
whether yearly changes in API scores are attributable to changes in the school’s
curriculum and instruction or to changes in the composition of the school’s students
(Hocevar, 2009).
Implementing a “value added” approach to accountability would mitigate
some of the inherent limitations associated with the current “successive groups”
model of accountability. (Further discussion of these limitations will be addressed in
Chapter 2.) According to Hocevar (2009), a value added or longitudinal
accountability system is inherently fairer than other accountability models because
value added models of school accountability are more precise and reliable indicators
of school performance.
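The distinction between the two accountability models can be made concrete with a toy example (all numbers invented, not drawn from the study): when lower-scoring newcomers join the tested population, a successive-cohorts average can fall even though every continuing student grew.

```python
import statistics as stats

# Same five students tested in year 1 and year 2: every student grew.
year1 = {"s1": 300, "s2": 320, "s3": 340, "s4": 360, "s5": 380}
year2 = {"s1": 315, "s2": 333, "s3": 352, "s4": 371, "s5": 392}

gains = [year2[s] - year1[s] for s in year1]   # longitudinal: same students
mean_gain = stats.mean(gains)                  # positive: real individual growth

# Successive-cohort comparison: year 2's tested population also includes
# newly arrived lower-scoring students, so the school-level average drops.
newcomers = [270, 280, 285]
cohort1_mean = stats.mean(year1.values())
cohort2_mean = stats.mean(list(year2.values()) + newcomers)

print(mean_gain > 0)                # True: every continuing student improved
print(cohort2_mean < cohort1_mean)  # True: yet the successive-cohort average fell
```

The longitudinal (value-added) view and the successive-groups view thus reach opposite conclusions about the same school, which is exactly the ambiguity Linn (2000) and Hocevar (2009) describe.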
Purpose, Design, and Utility
Purpose
The purpose of this study was to conduct a formative and summative
evaluation to determine whether the DATAWORKS Explicit Direct Instruction design
was effective in meeting the needs of the English Learner population at Patterson High
School in becoming proficient on the English Language Arts portion of the California
Standards Test. The study attempted to quantify the impact of the intervention on EL
students. The formative evaluation examined how direct instruction pedagogy was
implemented with the ELL population; the summative evaluation examined
quantitative data to see whether the intervention was effective in helping to improve
student achievement on the 2007 CST.
18
Research Questions
The focus of this evaluation study was to discover the depth of impact that
DATAWORKS Explicit Direct Instruction had in the academic achievement of ELL
students at Patterson High School. The overarching research question used to guide
the summative discovery process was
• What effects did DATAWORKS Explicit Direct Instruction have on the
achievement of English Learners at Patterson High School?
Based on the discovery process in this evaluation study, utilizing both summative and
formative evaluation, the researcher strove to provide an increased understanding of
the efficacy of the intervention, Explicit Direct Instruction, in addressing the academic
achievement in English-Language Arts of English language learners at Patterson High
School.
Study Design
The design incorporated a pre/post examination of data collected through the
state testing and accountability system, as well as a benchmark school that had
outperformed Patterson High School in both API and percentage of students identified
as proficient and advanced. The benchmark school, Selma High School, was selected
from the 100 similar schools grouping.
Selma High School (SHS) and Patterson High School (PHS) share similar
school characteristics in percentage of students classified as English Language
Learners (PHS 26%, SHS 32%), average parent education level (PHS 2.14, SHS 2.28),
and percent of participants in the free or reduced lunch program (PHS 47%, SHS
69%). Both schools had high mobility rates as defined by the California Department
of Education on the Base API School Report (PHS 93%, SHS 96%).
Selma High School was selected as the comparison school in order to learn
what programs, procedures, or processes were making the difference, especially for
ELLs. Our intent was to identify the specific actions occurring at Selma High School
that were contributing to the success of its ELL population. Using a similar school as
our benchmark measure of what we want to become will enable PHS to avoid being
identified as a Program Improvement school.
Formative data included interviews with school principals and teachers,
walkthrough observations, and surveys focused on improving PHS instructional
practices. Through visitations and interviews we can identify and discuss actions that
can improve the delivery and fidelity of the intervention.
The summative data came predominantly from the CST results, comparing the
2006-07 outcomes to the 2005-06 outcomes on the ELA portion of the test. The data
were disaggregated by grade level and by the English Learner subgroup. These data
were used to determine the effectiveness of DATAWORKS Explicit Direct
Instruction with the English Language Learners at Patterson High School and helped
in identifying the strengths and weaknesses of current practices and in designing
needed changes.
The boundary of this study was the EL population of 342 students who had
valid scores on the ELA portion of the CST for the 2005-06 and 2006-07 school years
and the 68 teachers employed at PHS. Inquiry and research related to instructional
practices promoting growth in student achievement. Interventions affecting
instruction were monitored schoolwide. Accountability for specific student
achievement focused predominantly on students identified as ELL, as measured by
the CST and CELDT.
As mentioned before, the primary data were obtained through the California
Department of Education DataQuest Web site. Additional quantitative data included
EduSoft reports.
Utility
The information gleaned from this study is directly applicable to my job and
role in the district. As principal, it is my responsibility to ensure that all students are
provided a high-quality education and to prevent PHS from being identified as a
Program Improvement school.
CHAPTER 2
LITERATURE REVIEW
Introduction
With an ever increasing number of students who do not speak English as their
primary language attending public schools in both California and the United States, it
is imperative that schools meet the academic needs of non-English speakers so that
they can achieve their full potential. Public Law 107, also known as The No Child Left
Behind Act of 2002, requires that all students meet “minimum proficiency on
challenging State academic achievement standards and state academic assessments”
(Public Law 107-110, p. 1439) and strives to ensure that all students receive an
equitable education regardless of their language background.
In the United States, according to the National Center for Educational Statistics
(2004), over 6 million American students in grades 6 through 12 are at risk of failure
because they read and comprehend below the basic levels needed for success in high
school, postsecondary education and the workforce. Only 30% of all secondary
students read proficiently. For students of color, the situation is even worse. Eighty-
nine percent of Latinos and 86% of African-American middle and high school
students read below grade level (NCES, 2004). To further underscore this point,
almost 50% of students of color do not graduate from high school with a regular
diploma in 4 years of instruction.
With the accountability measures of the No Child Left Behind (NCLB)
legislation of 2002 and the increased number of students that are identified as English
Language Learners (ELL), the urgency to identify instructional strategies that can be
associated with a continuous improvement in student achievement has become even
more of a priority. The intent and purpose of NCLB is to address and devise methods
to eliminate the persistent “achievement gap” between advantaged and disadvantaged
students including the English Language Learners. NCLB requires annual high stakes
testing of all students to determine progress toward mastery of rigorous content
standards designed by each individual state with the hope that these assessments will
provide results that can be interpreted and utilized to improve educational
opportunities.
The purpose of this literature review is to identify and summarize recent
research findings concerning direct instruction and its effect on the academic
achievement of English Language Learners and explore the possibility of a “value
added growth model.” The time has come to discuss the possibility of an alternative
accountability model that is valid, precise, equitable, and fair. In this chapter, four
areas related to English-language learners and their academic achievement as
measured by the current accountability model will be discussed. The first segment of
this chapter reviews the challenges of the No Child Left Behind Act for
English-language learners. The second section consists of a discussion of the validity of using
standardized assessments as a means of measuring ELL students’ academic
achievement. The third section reviews the literature on a value added growth model
for accountability. The last section summarizes recent findings concerning direct
instruction.
No Child Left Behind Act and Its Impact on English-Language
Learners
Under No Child Left Behind, schools, districts, and states are required to
demonstrate that English Language Learners are making progress not only in meeting
academic standards but also in becoming fully proficient in English. Both types of
progress depend on the effectiveness of the instructional program. However, assessing
this progress is a very challenging task.
In school, English Language Learners (ELLs) face the dual challenge of
working toward English-language proficiency for social and academic purposes while
also meeting the academic challenges faced by all other students. Given the
complexity of involving ELLs in standardized testing programs, it is not surprising
that schools feel challenged. According to Lachat (1999), including ELLs in
standardized testing does not necessarily guarantee that meaningful information is
collected. Often ELLs are unable to demonstrate their knowledge and skills in content
areas because of a lack of English proficiency, which is a strong determiner of test
performance for ELLs. According to Abedi, Hofstetter, and Lord (2004),
standardized tests that aim to measure knowledge are not sensitive to
second language literacy development. What is perceived as lack of mastery of the
content is often instead the normal pace of the second language acquisition process.
In addition, tests often refer to cultural experiences or historical background to which
many ELLs have not yet been exposed. The ambiguity of this situation means that the
test is not measuring what it intends to measure. Thus, the scores do not tell
teachers what they need to know about students’ content knowledge. Therefore the
challenge for educators is to create equitable systems which balance high quality and
fair assessment strategies with the learning needs of the English Language Learner.
Common Criteria for Identifying ELL
Developing a system that balances high quality and fair assessment strategies
is difficult for English Language Learners. At present, there is no uniform national
definition of what constitutes an ELL. This makes it very difficult to determine
precisely who these students are, how well they are doing academically, and what
kinds of services they need. According to Abedi (2004), “different states and even
different districts and schools within a state use different LEP classification criteria,
thus causing inconsistencies in LEP classification/reclassification across different
educational agencies” (p. 4). The outcome is that it is virtually impossible to collect
and analyze relevant comparable data about these students at the national or even state
level.
According to the NCLB legislation, Limited English Proficient students are
defined as students between the ages of 3 and 21 enrolled in elementary or secondary
education, often born outside the United States or speaking a language other than
English in their homes, and lacking sufficient mastery of English to meet state
standards and excel in an English-language classroom. However, individual states
vary widely in their definitions. They may use the terms ELL or Limited English
Proficiency to refer to this student population. Some states define these students as
those who are eligible for language instruction services, whereas others define them as
those who are actually receiving services. Yet another definition, proposed by Lachat
(1999), describes an ELL as a student with a language background other than English
whose proficiency in English is such that the probability of the student’s academic
success in an English-only classroom is below that of an academically successful peer
with an English-language background. As a result of these inconsistencies among
states, it is exceedingly difficult to measure the relative success of schools and
programs in helping students develop English proficiency.
Furthermore, the lack of common criteria for identifying English Language
Learners is further complicated by the measurements used to classify students as
ELLs. Wiley (1994) has identified three measurements that tend to be used most often
by states when identifying a student as an ELL: (a) self-reported information on the
U.S. Census, (b) parents’ replies to questions about their children on district-developed
home language surveys, and (c) language proficiency tests. According to Wiley
(1994), of the three, the language proficiency test offers the most consistent and
reliable way to assign ELL status to students. However, in most school districts the
first criterion, being a nonnative English speaker, is based on information gathered
from the home language survey. Unfortunately, the validity of this survey is
threatened by parents’ concern over equity of opportunity for their children
(Abedi, 2004).
A study conducted by Abedi, Lord, and Plummer (1997) found significant
discrepancies between student reporting and school records of students speaking a
language other than English at home. The researchers concluded that the school
record of the number of students speaking a language other than English at home was
significantly lower than what the students themselves reported. Consequently,
estimates of the size of the ELL student population tend to differ even within a given
state or district. The ELL population is more diverse than the data suggest.
Students labeled as ELLs differ substantially in many respects, including family
characteristics, cultural and language backgrounds, and level of English
proficiency. Thus, the ELL subgroup is not a well-defined, homogeneous group of
students.
ELL Typologies
As English Language Learner students progress through the educational
system, many factors beyond English fluency impact their participation and
achievement in school. Several researchers (Hill & Flynn, 2006; Olsen & Jaramillo,
1999; Olsen & Romero, 2006a) have concluded that educators need to look beyond
individual student characteristics and create typologies of academic needs among the
school’s English Learner population. According to Olsen and Romero (2006a), there
are many ways of clustering students into typologies that can help plan programs and
services. These typologies are not necessarily mutually exclusive; in fact, an
individual student may fit into several, but their usefulness is in helping to plan
programs and services that address the needs of each type. Table 5 outlines the
different typologies and key characteristics developed by Olsen and Jaramillo (1999):
Table 5
English Language Learners Typologies and Key Characteristics
Typology Key Characteristics
Newcomers
• In U.S. three years or less
• Little or no English proficiency on arrival
• Some well-prepared in native language, on grade level, others are
below
• Steady progress through ESL sequence
• If school offers native-language content courses, credit accrual
toward graduation can be rapid
• Difficulty passing CAHSEE within four-year time frame
• Academic achievement in terms of grades similar to rest of the
school
• Facing cultural transition to U.S.
Well-Educated
Newcomer Students
• In U.S. three years or less
• Schooling in native country usually excellent
• Strong literacy skills in home language
• Rapid movement through ESL sequence
• Academic achievement in terms of grades exceeds rest of school
• Often highly motivated
• Good possibility of graduating in four years
Under schooled
• In U.S. several years or less
• Little to no English language fluency or proficiency
• Little to no literacy in native language
• Schooling in native country interrupted, disjointed, inadequate, or no
schooling at all
• Three or more years below grade level in Math
• Slow acquisition of English; tendency to repeat ESL level
• Lack of credit accrual over time
• Unable to pass CAHSEE
Long-Term
Limited English
Proficient
• In U.S. 7+ years
• Multiple countries of origin
• Usually orally fluent in English
• Reading/writing below level of native English peers
• Bi-modal academically: some doing well, others not
• Some have literacy in primary language, others not
• Some were in bilingual programs, most not
• Mismatch between student’s own perception of academic
achievement (high) and actual grades or tests scores (low)
• Similar mismatch between perception of language ability and reality
Over-age for grade
level
• May have gaps in prior schooling or a history of school failure and
in-grade retention
Fluent English
Proficient, but
struggling
academically
• Redesignated from limited English proficiency to fully proficient
• Receiving at least one D or F in core academic classes
• Following redesignation, decline in grades and achievement
Every school needs to know who its English Learner population is and
design a program that provides services around their needs. What one school has in
place may not be sufficient or appropriate for another school. Resources are wasted
and student needs are bypassed when programs are not designed around the needs of
the specific English Learner enrollment in a school. According to Olsen and Romero
(2006b), in order
to know and serve the English Learner population well,
educators need to understand and know specific background
characteristics of their students (nationality, prior schooling,
cultural identity) and be able to categorize them by academic
typologies (long term English Learner, under schooled, over
age, newcomer, accelerated/college bound, moving regularly
through the ELD sequence at normative rates). (p. 7)
The current state, then, is that teaching English-language skills to English-
Language Learners becomes the responsibility of the entire school staff.
English Learners’ Achievement on Standardized Testing
Impact of Standardized Testing on ELLs
Currently, the No Child Left Behind Act of 2002 mandates that programs
receiving federal funding are held accountable for what they do. The accountability
measures are intended to ensure all students succeed in schools. For English Learners,
according to Munoz (2002) these requirements have “intensified the debate among
practitioners, researchers, and policy makers as to what constitutes success and how to
measure it” (p. 4). While there is a definite need to show the public what students are
learning in schools and to hold schools accountable for the education of students, the
use of the same standardized assessments with all groups of students is problematic
and may not be the best approach to accountability. Therefore, the accountability
requirements of No Child Left Behind (NCLB) call into question the validity of
standardized testing of students who are required to meet grade-level standards while
they are acquiring English language skills.
Butler and Stevens (2001) found that
[T]est scores are not always reliable and valid for all of the
purposes for which they are used for. This is particularly true
for students who speak English as a second language.
Commercially developed large-scale content assessments were
produced and intended for students who are native speakers of
English or highly proficient non-native speakers. (p. 411)
In addition, studies conducted by Abedi (2004) and Lachat (1999) have
concluded that very few English language learners are included in the norming
samples of the most widely used standardized tests. The academic achievement tests
that are constructed and normed for native English speakers have lower reliability and
validity for the English language learners population. “If English language learners
are not included in the population sample used for validation, the assessment will not
be valid for these students and cannot assess them fairly” (Lachat, 1999, p. 105).
Therefore, consideration must be given to whether the assessment provides both a fair
opportunity for all students and provides a reliable and consistent measure of the
performance of English language learners. The current situation in California is that
all ELLs who have lived in the USA at least twelve months are expected to take
standardized assessments with their native English-speaking peers, regardless of how
well they have mastered English. For students whose English proficiency is still
developing, the tests often pose significant reading challenges that interfere with the
assessment of the content they have learned, making their test scores questionable as
indicators of content knowledge or achievement.
Several states have implemented accommodations in order to limit the impact
of language proficiency on standardized tests. Some of the accommodations are
designed to reduce the English language demand on ELL students by simplifying the
directions, allowing the use of dictionaries, and reading the questions aloud (Abedi,
Hofstetter, & Lord, 2004). Other permitted accommodations include separate testing
sessions, flexible scheduling, extra time, and small-group administration
(August & Lara, 1996). However, Rivera and Vincent (1996) caution that
accommodations do not work equally well for English language learners because of
wide variation in English language proficiency. While accommodations may make a
positive difference for English language learners who already are fairly proficient in
English (Abedi, Hofstetter, & Lord, 2004), for those who have very little proficiency
in English, they may not make enough of a difference to enable students to perform at
high levels.
As noted above, the use of standardized assessments to measure the academic
progress of EL students is problematic. As Butler and Stevens (2001) note in their
research, the vast majority of standardized assessments are designed with English-only
students in mind; therefore, their utility for measuring the content knowledge of
English Learners is severely limited. Abedi, Hofstetter, and Lord (2004) have also
found that “LEP students exhibit substantially lower performance than non-ELL
students in subject areas high on language demand” (p. 11). These assessment results
do not inform classroom teachers about what ELs need to learn in order to be
successful academically; rather, they highlight the discrepancy between performance
on standardized testing and levels of English proficiency.
Numerous studies have been conducted about the causes of the achievement
gap between the performance of EL students on standardized tests and that of their
English-speaking peers. Abedi, Leon, and Mirocha (2005) note that “English language
learner (ELL) students generally perform lower than non-ELL students in reading,
science, math and other content areas—a strong indication of the relationship of
English proficiency with achievement assessment.” (p. 2). Furthermore, studies have
shown that the more language skills a content area requires the wider the gap between
EL students and non-EL students performance. Abedi, Leon, and Mirocha (2005)
consider this “language load” (p. 2) as a threat to the validity of EL student
achievement on standardized tests and cite the example of mathematics computation as
an area where language is not a significant factor in EL performance because, “The
difference between ELL and non-ELL student performance becomes the smallest in
math, particularly on math items where language has less impact, such as on math
computation items” (p. 3).
The achievement gap between the performance of EL students and non-EL
student widens as the requirement of increased language proficiency (“language load”)
rises (Abedi, Leon, & Mirocha, 2005, p. 39). Abedi (2004) found that language
proficiency plays an integral part in the academic achievement of EL students. The
interplay between language factors and content-based knowledge makes assessment
and accountability complicated for English Language Learners. Ultimately, whether
standardized tests can accurately determine what EL students have learned in school is
a question that will remain unanswered until an alternative method of measuring
growth for English Language Learners is developed.
Value Added Growth Model
One method that has picked up momentum over the last decade as a means to
measure student performance more accurately is the “value added” approach to
accountability. This approach was first introduced by Sanders and Horn in 1994.
According to Millman (1997), value added models are the gold standard in the
assessment of school accountability.
The cornerstone to an effective accountability system is access to accurate and
fair information that educators can use to improve student learning. Both California’s
API and NCLB’s AYP are blunt instruments when applied at the local level, lacking
as they do the capacity to generate credible descriptions of school improvement that
school teachers and leaders need (Abedi, Hofstetter, & Lord, 2004; Hocevar, 2009).
The current accountability system in California uses cross-sectional data on
“successive groups” (Linn, 2006). The “successive groups” model of accountability
involves a comparison between last year’s students and this year’s using test scores of
students aggregated into the API. Because API scores are based on different cohorts
of students, it is difficult to know if yearly changes in the API scores are attributed to
changes in the school’s curriculum and instruction or changes in the composition of
the school’s students (Hocevar, 2009).
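The confound described above can be illustrated with a toy calculation. The sketch below (all numbers invented for illustration) holds a school’s instructional contribution fixed at 10 scale-score points and changes only the composition of successive cohorts; the successive-groups comparison still registers a change in performance.

```python
def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical baseline scale scores for two successive cohorts.
# Instruction is assumed to add a fixed 10 points for every student.
INSTRUCTIONAL_EFFECT = 10

year1_baseline = [340, 350, 360, 370]  # year 1 cohort: higher-scoring entrants
year2_baseline = [300, 310, 350, 360]  # year 2 cohort: more lower-scoring entrants

year1_index = mean([s + INSTRUCTIONAL_EFFECT for s in year1_baseline])
year2_index = mean([s + INSTRUCTIONAL_EFFECT for s in year2_baseline])

# The successive-groups comparison reports a 25-point decline,
# even though the school's contribution was identical both years.
print(year2_index - year1_index)  # -25.0
```

Because the two cohorts are different students, the 25-point drop says nothing about curriculum or instruction, which is precisely the flaw Hocevar (2009) identifies.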
34
The inherent problem of using successive cohorts to measure school
improvement is widely recognized by practitioners and researchers, and there is a
growing need to alter the current method of accountability. The work of
Goldschmidt et al. (2005) has emerged as a preferred assessment methodology for
value added models (VAMs). According to Goldschmidt et al. (2005), “the main
purpose of VAMs is to separate the effects of nonschool-related factors (such as
family, peer, and individual influence) from a school’s performance at any point in
time so that student performance can be attributed appropriately” (p. 5). A value
added estimate is simply the difference between a school’s actual growth and its
expected growth. Under a value added accountability model, states can use student
background characteristics and/or prior achievement and other data as statistical
controls in order to isolate the specific effects of a particular school, program, or
teacher on student academic progress (Goldschmidt et al., 2005).
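To make the statistical-control logic concrete, a minimal value added estimate can be sketched as an ordinary least-squares regression of each student’s current-year score on the prior-year score, with a school’s value added taken as the mean residual (actual minus expected score) for its students. The scores and school labels below are hypothetical, and operational VAMs, such as the one introduced by Sanders and Horn, use far richer mixed-effects models with additional covariates.

```python
from collections import defaultdict

def value_added(prior, current, schools):
    """Estimate each school's value added as the mean residual from a
    simple OLS regression of current-year scores on prior-year scores.
    (Illustrative only; operational VAMs use richer statistical models.)"""
    n = len(prior)
    mx = sum(prior) / n
    my = sum(current) / n
    # OLS slope and intercept for: current = a + b * prior
    b = sum((x - mx) * (y - my) for x, y in zip(prior, current)) / \
        sum((x - mx) ** 2 for x in prior)
    a = my - b * mx
    residuals = defaultdict(list)
    for x, y, s in zip(prior, current, schools):
        residuals[s].append(y - (a + b * x))  # actual minus expected score
    return {s: sum(r) / len(r) for s, r in residuals.items()}

# Hypothetical scaled scores for students in two schools
prior = [300, 320, 340, 360, 310, 330, 350, 370]
current = [315, 330, 355, 372, 318, 335, 352, 368]
schools = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(value_added(prior, current, schools))
```

In this sketch, a positive mean residual indicates a school whose students outperformed their expected scores; as Goldschmidt et al. (2005) describe, a real model would add further background controls to the regression before attributing the remainder to the school.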
There are several good arguments, supported by Goldschmidt et al. (2005),
for using a value added accountability model to supplement or even replace the
existing “successive groups” approach that is used in California and other states.
First, year-to-year changes in a school performance index are largely due to changes
in the characteristics of the students in a school rather than changes in the school
curriculum or instruction (Hocevar, 2009). Second, value added analyses focus on
individual student gains for the same students from year to year in the same school.
This model requires that a student be in a school for two consecutive years, thus
limiting the effects of mobility (Linn, 2006). Lastly, there is a uniform consensus that
the single most important characteristic of an accountability system is fairness
(Millman, 1997). A value added accountability model is inherently fairer than other
accountability models because it is a more valid indicator of the effects of a school’s
curriculum and instruction and a more precise and reliable indicator of school
performance (Hocevar, 2009).
However, there are some concerns about a value added model of
accountability. Tekwe et al. (2004) and Raudenbush (2004) have identified a major
deficiency in value added models: both studies found that estimating teacher effects
is most problematic when schools serve very different kinds of students, and that it is
difficult to separate teacher and school effects using currently available
accountability data. According to Raudenbush (2004), “state officials would not want
to hold school personnel accountable for a school’s context (e.g., the neighborhood in
which it is located)” (p. 122). Therefore, accurate estimation of teacher effects is
essential in order to limit school-context bias.
Literature on Direct Instruction
Teaching Models
More than thirty years ago Rosenshine introduced the term “direct instruction”
into the education literature in his review of effective teaching (Stein, Carnine, &
Dixon, 1998). Since then, direct instruction has taken on many forms and names over
its four-decade existence. For the purposes of this literature review, three specific
models will be highlighted. All three models reported high success rates and were
widely integrated into practice in K-12 settings. According to Stein, Carnine, and
Dixon (1998), three key components have persisted throughout all the variations of
direct instruction models: (1) all models begin with some type of opening activity,
(2) the main idea is then enacted, and (3) students are then given opportunities for
practice.
First is Rosenshine’s explicit teaching model. Rosenshine’s (1979) model was
designed to be sensitive to differences in student ability and complexity of subject
matter. The central theme of the explicit teaching model is that teachers need to enact
intentionally clear and well-defined lessons (Rosenshine, 1979). The functions of each
lesson include review, presentation, guided practice, corrections and feedback,
independent practice, and weekly and monthly reviews. The major instructional
strategies include teaching in small steps with student practice after each step, guiding
students during initial practice, and providing all students with a high level of
successful practice.
Stevens and Rosenshine (1986) also brought attention to the need to modify
lessons based on the materials or content to be taught. Their modification for difficult
material emphasized additional monitoring, with the lesson cycle focused on
presentation, guided practice, and supervised independent practice.
Second is Good and Grouws’s strategies for effective teaching model, which
focused on the teaching and learning of mathematics. Good and Grouws’s (1979)
research resulted in scripted procedures that included both instructional and management
strategies. While following the basic direct instruction procedures, this model offered
suggested lesson management strategies and time allotments for each phase of the
lesson, including weekly and monthly practice intervals.
Last is Hunter’s design, Instructional Theory Into Practice (ITIP). During the
1980s, Madeline Hunter developed and disseminated a widely known and used
teaching model that merged features from direct instruction. Hunter has often been
identified as the founder of the direct instruction model with regard to the current
pedagogy seen in classrooms across the United States. Hunter describes her direct
instruction model as an efficient and effective method of instruction that is focused on
the way in which learning occurs in the classroom (Hunter, 1982). Her model
represents a direct instructional model designed for practitioners who were trying to
infuse the concept and terminology of the learning objective.
Different groups and organizations have modified her design, but the basic
tenets are still apparent. DATAWORKS is a derivative of the Hunter design; it has
added more detail to the instruction component but has not wavered far from the
original model. Table 6 provides a comparison of the three direct instruction models
discussed above and the DATAWORKS Explicit Direct Instruction model.
Table 6
Lesson Design Models

Rosenshine’s Explicit Teaching Model: Review; Presentation; Guided Practice;
Checking for Understanding; Independent Practice; Weekly and Monthly Reviews

Good & Grouws’s Strategies for Effective Teaching Model: Daily Review;
Development; Seatwork; Homework; Special Review

Hunter’s Instructional Theory Into Practice: Objective and Purpose; Anticipatory Set;
Instruction; Checking of Understanding; Closure; Independent Practice

DATAWORKS Explicit Direct Instruction: State Content Standard; Learning
Objective; Importance; Activate Prior Knowledge; Concept Development; Skill
Development; Guided Practice; Closure; Independent Practice; English Language
Development; Checking for Understanding (embedded throughout every component)
As Table 6 compares the lesson designs, it should be noted that Checking for
Understanding (CFU) is an essential component of all the direct instruction models
listed, and it is addressed in two distinct ways. Two of the models list CFU as a
separate activity in the overall lesson design. The DATAWORKS model, however, is
unique in that it emphasizes that CFU occurs in all components of the lesson delivery.
This ongoing CFU is more aligned with the “effective and timely feedback” referred
to by Marzano (2001) and the critical feedback that is necessary for the success of the
“teacher-centered classroom.”
As noted above, direct instruction has had a varied history when it comes to
titles and descriptions. Some researchers have gone as far as to define direct
instruction as a comprehensive system of instruction that integrates effective teaching
practices with sophisticated curriculum design (Adams & Engleman, 1996). The
distinction between Engleman's Direct Instruction Model and the instructional models
listed above is an important one because it has been the source of many
misconceptions about the working definition of direct instruction (Stein, Carnine, &
Dixon, 1998). Engleman's Direct Instruction Model incorporates the development of a
curriculum design along with the teaching techniques that are widely used in direct
instructional models.
For the purposes of this section, Duffy and Roehler's (1982) definition of direct
instruction is utilized. Duffy and Roehler (1982) defined direct instruction as "an
academic focus, precise sequencing of content, high pupil engagement, careful teacher
monitoring and specific corrective feedback to students" (p. 35). In direct instruction,
students and teacher are focused on a goal or objective, that is, on what is to be
learned; students are aware of why it is important to learn the task at hand; and
students are explicitly taught how to do a particular process through teacher modeling
and explanation. This is followed by guided practice, in which students try out their
interpretations of what was taught while the teacher monitors these tryouts. Then
independent practice takes place: students work independently and apply what they
have learned. Spiegel (1992) describes this phase of the lesson as "giving students
opportunities to experience, as soon as possible, the value of what they are being
asked to do" (p. 41).
In a direct instructional model the teacher tells, shows, models, demonstrates,
and teaches the skill to be learned. The key word is teacher: direct instruction is a
teacher-centered approach to instruction in which the teacher is in command of the
learning situation and leads the lesson (Baumann, 1984). Therefore, the direct
instruction paradigm used in this study required the teacher to be responsible for the
academic focus, the sequence of content, student engagement, and effective feedback,
with a gradual shift of responsibility for learning from teacher to student as a lesson
progressed (DATAWORKS workshop, 10/07).
Direct Instruction and English Language Learners
Numerous studies have been conducted to evaluate the effectiveness of direct
instruction on student academic achievement. One of the earliest was Project Follow
Through, which evaluated a variety of educational approaches to teaching low-income
children in various communities from kindergarten (or first grade) through third grade.
The researchers of Follow Through believed that gains made by students in Head Start
could be enhanced and solidified in a comprehensive, systematic 3- or 4-year program.
One of the approaches found to be most effective in the longitudinal evaluation
conducted by Abt Associates and Stanford Research Institute under the auspices of the
U.S. Office of Education (USOE) was the Direct Instruction Model (Becker &
Gersten, 1982).
Becker and Gersten (1982) conducted a follow-up study to trace the
longitudinal progress of the Follow Through children through their entire 6 years in
elementary school and to contrast their scores with the standardization sample of the
achievement tests. The researchers wanted to examine the later effects of the Direct
Instruction Follow Through program: did the students maintain the gains they made
during the first study, and in which academic domains were those gains maintained?
Becker and Gersten (1982) concluded that "there is evidence that, in most domains
assessed by standardized achievement tests, low-income graduates of a 3-year Direct
Instruction Follow Through program perform better than comparable children in their
communities who did not attend the program" (p. 88). The second finding was less
optimistic: when compared with the national norm sample, these children invariably
lost ground in the 3 years after they left Follow Through. The researchers attributed
this drop in achievement to the absence of teacher-directed instruction such as guided
practice, checking for understanding, and clarity of tasks.
In a study utilizing direct instruction as the intervention, Baumann (1984)
concluded that the group receiving direct instruction outperformed all other groups.
The focus of the study was the effectiveness of a direct instruction model for teaching
children the comprehension skill of identifying the main idea. Baumann (1984) stated
that "main idea skills can be taught effectively when instruction is direct and
systematic and students are given responsibility for the learning" (p. 103). Sterbinsky,
Ross, and Redfield (2003) had similar results with a much different population. Their
study included 12 urban and rural schools with high ELL and Title I student
populations; the two schools that implemented direct instruction models had higher
reading scores when assessed with the Durrell Analysis of Reading Difficulty test. In
both cases direct instruction models were used to enhance the academic achievement
of all students, and both studies incorporated key components of direct instruction, for
example, "opportunity to learn" (Marzano, 2003), "corrective feedback" (Duffy &
Roehler, 1982), and "systematic instruction" (Spiegel, 1992).
These findings were validated in a similar study by Rosenshine and Stevens
(1981). The researchers concluded that a teacher-centered, direct instruction model
provided the greatest impact on student achievement, especially in reading and math,
and that students' use of higher-order thinking skills was also positively affected
(Rosenshine & Stevens, 1981).
In 1996, Adams and Engleman came to the same conclusion in their book
Research on Direct Instruction: 25 Years Beyond DISTAR. They found that students
who were taught using the direct instruction model did significantly better than
students taught by other means. In addition, they found that students identified as
poor who received direct instruction showed positive effects on achievement.
Summary of the Literature
Offering high-quality instruction to English Learners requires strategies that
build the English language skills students need while simultaneously helping students
access the core curriculum. Hill and Flynn (2006) have done extensive research on
effective instructional strategies for English Learners, and direct instruction aligns
well with their research as an effective instructional framework that can help ELL
students meet today's challenges.
However, one major challenge that no curriculum or instructional framework
can overcome is the current accountability system. Its requirements have driven the
use of standardized assessments as the tools for measuring students' academic
proficiency. As noted earlier, the current California accountability system relies on
cross-sectional data from successive cohorts, an approach that has been criticized as
unfair to educators because it holds them accountable for student background and
prior educational experience as well as current educational effectiveness (Millman,
1997). Critics also declare that this system cannot account for variables such as
socioeconomic status, innate ability, and school context, and thus yields little valid
information about year-to-year growth in student achievement (Abedi et al., 2004;
Goldschmidt et al., 2005).
In conclusion, schools face many challenges in meeting federal and state
guidelines for student achievement, and schools that serve large populations of
English Learners can find these challenges almost insurmountable given the amount
of time it takes ELs to achieve proficiency in English. Despite this rather grim
prospect, it is possible for schools to meet and even exceed achievement goals with
the appropriate instructional framework, curriculum, and an accountability system
that truly measures individual student growth.
CHAPTER 3
METHODOLOGY
The primary purpose of this study was to evaluate the effect of the direct
instruction model, DATAWORKS Explicit Direct Instruction (Independent Variable)
on the achievement of ELL (Dependent Variable) as measured by the English
Language Arts (ELA) portion of the California Standards Test (CST) at Patterson
High School. The overarching question of this study is
• What effect did DATAWORKS Explicit Direct Instruction have on the
achievement of English Learners at Patterson High School?
Design Summary
Pre-Post Design
A pre-post design was used: the experimental group, the Patterson High
School English Learner subgroup, received continuous direct instruction as an
intervention in the 2006-07 school year, and the 2006 archival data served as the
standard of comparison. The following summative evaluation design format was
used: O pre X O post. A cohort match group was utilized. The following statistics
were used for each dependent variable: (a) a dependent groups t-test to assess the
statistical significance of the change (criterion for statistical significance: p < .05),
and (b) Cohen's d to assess practical significance (criterion for practical significance:
d > .20).
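The two summary statistics named above can be sketched as a small computation. This is a hedged illustration, not the study's actual analysis code: the function name and sample data are the author's of this sketch, and it follows the study's stated convention of dividing the mean change by the pretest SD to obtain Cohen's d.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t_and_cohens_d(pre, post):
    """Dependent groups t statistic and Cohen's d for matched pre/post scores.

    t is the mean paired difference over its standard error; d divides the
    mean difference by the pretest SD, the convention described in the text.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / sqrt(n))  # compare against critical t, n-1 df
    d = mean(diffs) / stdev(pre)
    return t, d
```

A computed t would then be checked against the p < .05 criterion for statistical significance, and d against the d > .20 criterion for practical significance.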
A non-equivalent control group design was also utilized in this study. This
design included an experimental group and a benchmark comparison group that were
not randomly assigned. The experimental and benchmark groups were compared on
post-test data using the raw change in percent proficient and above on the CST ELA
results for ELLs at the experimental and benchmark schools. It must be noted that the
nonequivalent control group design lacked internal validity because of selection bias;
therefore, causal inferences about the DATAWORKS intervention and student
achievement cannot be drawn.
Patterson High School – Experimental School
Pre-test Observation: 2006 California Standards Test
Treatment (X): Direct Instruction
Post-test Observation: 2007 California Standards Test
Selma High School – Benchmark School
Pre-test Observation: 2006 California Standards Test
Post-test Observation: 2007 California Standards Test
A qualitative (formative evaluation) component was used in this study in the
form of informal, open-ended interviews with Patterson High School ELD teachers
who implemented the direct instruction intervention. Teachers were interviewed
throughout the school year and asked to describe the level of implementation attained
in their classroom.
Description of Benchmark School
Benchmark School (Selma High School) was one of the 100 similar schools
grouped with the Experimental School (Patterson High School) as identified by the
California Department of Education. Selma High School was selected because of the
demographic similarities in three areas: average parent education level, number of
students who qualify for free and reduced lunch, and number of ELL. Table 7
provides a comparison of the Experimental School and Benchmark School in these
areas, as well as the Academic Performance Index for both schools. The Benchmark
School's average parent education level is slightly higher than Patterson High
School's. Selma High School has a higher percentage of students who qualify for free
and reduced lunch, and a slightly lower percentage of ELLs, than Patterson High
School, as reported on the 2006 Base API School Report.
Table 7

Patterson HS and Selma HS Comparisons Based on the 2006 Accountability
Progress Report

                          State     Similar School   Percent of English   Average Parent    Participation in Free and
School             API    Ranking   Ranking          Language Learners    Education Level   Reduced Lunch Program
--------------------------------------------------------------------------------------------------------------------
Patterson High     661    4         8                32%                  2.14              47%
School
Selma High         718    6         10               26%                  2.28              69%
School
Participants/Sampling
Study Participants
Participants in this study consisted of the English Language Learner subgroup
population who participated in the STAR assessments at Patterson High School.
Subgroup data were reviewed, including mean scaled scores and the percentages of
students scoring in the Advanced, Proficient, Basic, Below Basic, and Far Below
Basic bands on the CST. Subgroup results for the California Standards Tests for the
Spring 2006 and 2007 reports were obtained from the STAR Web site provided by
the California Department of Education.
Intervention Description
The treatment intervention provided to the experimental group, the ELL
population at PHS, consisted of varying degrees of implementation of the direct
instruction methodology. Patterson High School began implementing direct
instruction at the start of the 2005-2006 school year. The DATAWORKS Explicit
Direct Instruction Model was selected; the instructional model consists of the
following characteristics:
1. Deconstruction of identified essential standards into daily learning
objectives.
2. Development of an instructional timeline to align curriculum with state
standards.
3. Learning objective matches the independent practice.
4. Checking for understanding is implemented throughout the lesson.
5. Presentation – provides initial explanation of the new concept and/or
skill.
6. Structured Practice – master each step one at a time.
7. Guided Practice – move students towards independence.
8. Independent Practice – transfer new knowledge from short to long term
memory.
In addition, teachers were provided with classroom support as needed and attended
professional development focused on direct instruction during early-release days
throughout the year.
Instrumentation
Quantitative
Quantitative data were collected by reviewing existing data provided by the
STAR and CELDT reports on the California Department of Education Web site. The
California Standards Test (CST) and CELDT results were reviewed for the English
Language Learner subgroup population at Patterson High School. Additionally, only
the English-language arts portion of the CST was utilized in this study. According to
the information provided by the California Standardized Testing and Reporting
Program (STAR), the English Language Arts test in grades 9-11 consists of 75
questions in a multiple-choice format. Scores are reported by grade level and content
area for each school, district, county, and the state. A mean scaled score
method is used to determine proficiency. CST scaled scores range from approximately
150 to 600. Scores between 300 and 349 are at the Basic Performance Standard and
scores of 350 or higher are at or above the Proficient Performance Standard.
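The scoring rules above can be expressed as a small lookup. This sketch uses only the cut points stated in this passage; the function name is hypothetical, and because the boundary between Below Basic and Far Below Basic is not given here, scores under 300 are not split further.

```python
def cst_ela_level(scaled_score):
    """Classify a CST ELA scaled score (roughly 150-600) using the cut
    points stated in the text: 300-349 is Basic, 350 and above is at or
    above the Proficient Performance Standard."""
    if scaled_score >= 350:
        return "Proficient or above"
    if scaled_score >= 300:
        return "Basic"
    # The Below Basic / Far Below Basic boundary is not stated in this passage.
    return "Below Basic or lower"
```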
For California, the CST represents a foundation to measure whether schools
and districts are achieving academically. The scores collected from the state’s STAR
program are used as one of many accountability elements in calculating each school
and district’s Academic Performance Index (API) rating. The API score is represented
by a number between 200 and 1000. The state’s goal for every school is to have a
minimum API score of 800. A school that has already achieved an API of 800 must
work to maintain that score; a school that has not will be assigned a point goal to
achieve each year until it meets the state's minimum API score of 800 (EdSource,
2005).
Federally, the CST also represents a foundation for measuring whether schools
and districts are achieving academically. Under the No Child Left Behind Act of
2001 (NCLB), every subgroup of students must meet the proficiency target, called
Adequate Yearly Progress (AYP), set each year by the federal government in English
and mathematics. In order for a subgroup to be measured, 95% of the students in the
subgroup must be included in the testing administration at each school. If a school
does not meet the AYP target for two consecutive years, the school will "face an
escalating set of consequences – from providing tutoring services to shutting the
school down – within a process called Program Improvement" (EdSource, 2005, p. 3).
The CST score is therefore a principal measure of how a school is rated and ranked,
and of whether it is allowed to continue providing services to students.
Qualitative
Qualitative data for this study were collected through an informal interview
process with teachers, conducted as issues arose and at the beginning of every
six-week grading period. The interviews took place individually and during
departmental meetings. The purpose of these interviews was to determine how best to
support teachers' classroom implementation of direct instruction and to determine the
efficacy of each teacher's implementation of the lessons. Interviews were conducted
by the evaluator and consisted of both open-ended and simple-response questions, for
example:
1. How did the staff react to the implementation of the DATAWORKS
Explicit Direct Instruction Model?
2. How has direct instruction training impacted PHS?
3. What are some of the changes you have implemented since adopting
direct instruction?
Data Analysis
Quantitative Analysis
Pre-post differences in standard scores were tested for significance using a
dependent groups t-test. Because t-test results depend on sample size, Cohen's d (the
ratio of the observed difference to the population SD) was computed to assess effect
size. Further analysis was conducted using a residualized-change score formula to
account more accurately for the year-over-year data. Using x as the independent
variable and y as the dependent variable, a slope, or regression line, can be obtained
by finding the residual of each point in relation to the line. "A residual is the error or
distance between the point and the slope of the regression line. The normal procedure
to find the residual is to minimize the sum of the squares of all the errors" (Cameron,
2010, p. 63).
According to Cameron a residualized-change formula is based on the
following:
if x has the value of x, y can be determined using the formula
y=bx+a where b is the slope of the line or line of regression,
and a is the vertical intercept. To find the residualized-change
score for a value, we would simply take the value of y and
subtract the predicted value of y’ to find the residual change or
y-y’=residual. If the residual is above the predicted regression
line, the value or change score is positive, if the residual is
below the predicted regression line, the value or change score
is negative. (p. 64)
By utilizing the above formula, the researcher was able to demonstrate year to year
growth of the ELL population by grade level at Patterson High School.
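Cameron's formula amounts to an ordinary least-squares fit of y on x followed by a per-student residual. The following is a minimal sketch of that computation, not the study's actual code; the function and variable names are this sketch's own.

```python
def residualized_change(x, y):
    """Fit y = b*x + a by least squares, then return each residual y - y'.

    A positive residual means the point lies above the regression line
    (a positive change score); a negative residual means it lies below.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Slope b from the normal equations; intercept a follows from the means.
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return [yi - (b * xi + a) for xi, yi in zip(x, y)]
```

With pretest scores as x and post-test scores as y, each student's residual is the growth not predicted by the pretest, which is the residualized-change score the passage describes.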
Qualitative Analysis
Qualitative information generated by the informal interviews was analyzed
using six steps recommended by Creswell (2003), as follows:
Step 1: Transcribe the interviews.
Step 2: Record general ideas generated by the interview questions.
Step 3: "Chunk" information from the interview questions (Creswell, 2003, p.
192).
Step 4: Identify themes generated by Step 3.
Step 5: Provide a detailed description of the themes arising from the interview
questions.
Step 6: Provide a personal interpretation of the themes.
Delimitations and Limitations of the Study
It is widely recognized that the internal validity of the pre-post design is
limited. History, instrumentation, selection, redefinition of the treatment, and Type 1
error are all possible confounding factors in this study. The use of a non-equivalent
control group design increases the internal validity of a study; however, the control
group in this study was not randomly assigned, so selection bias was a problem.
Thus, causal inferences about the effectiveness of direct instruction are
risky. The problem was further aggravated because other interventions were being
implemented at the same time as direct instruction. As always, the external validity of
any single study is limited and the results can only be generalized to similar
populations, measurement, settings, and treatments.
CHAPTER 4
RESULTS
Overview
The primary purpose of this study was to determine the impact of direct
instruction, specifically the DATAWORKS Explicit Direct Instruction design model,
on the achievement of ELLs at Patterson High School on the English Language Arts
portion of the California Standards Test. A multiple method design was utilized
including a pre/post dependent group t-test and non-equivalent comparison group for
the summative evaluation portion of this study. As mentioned previously, Selma
High School was selected as the benchmark school because it had a higher Academic
Performance Index, state ranking, and similar-school ranking than PHS.
Results from the non-equivalent comparison group, Selma HS, were compared
with the PHS post-test results of ELLs on the English Language Arts portion of the
California Standards Test. Selma High School was included in the study in order to
improve the generalizability of the results using the concept of proximal similarity
(Creswell, 2003).
The methodology for comparison focused on three dependent variables:
(a) value-added growth model Grade Level Growth Scores, (b) California Standards
Test (CST) English Language Arts (ELA) scale scores for ELLs at Patterson High
School, and (c) the percentage of ELLs who scored proficient and above on the
CST ELA.
Pre/Post Dependent Group Design
Because the groups were not randomly selected, selection bias was a
serious limitation of the study. To address this limitation and enhance the internal
validity, a dependent group design was utilized. The dependent group consisted of
English Language Learner students at Patterson High School who had taken the
English Language Arts portion of the CST in both 2006 and 2007. These results were
analyzed for both statistical and practical significance.
The dependent group design was employed to analyze individual score
changes at Patterson High School from 2006 (pre-intervention) to 2007 (post-
intervention) for English Language Learners in English Language Arts (ELA). The
following statistics were used to analyze California Standards Test ELA scaled scores
at the experimental school:
1. A dependent group t-test to assess the statistical significance of the
change (criterion for statistical significance = p < .05).
2. Cohen’s d to assess practical significance (criterion for practical
significance = d > .20).
3. Raw change from 2006 to 2007 to assess practical significance
(criterion for practical significance = 10%).
Non-equivalent Comparison Group Design
This design included an experimental group and one comparison group that
were not randomly selected. The comparison group, Selma High School, was selected
as the benchmark school because of its similarities to Patterson High School: both
schools were similar in their ethnic/racial composition, socioeconomic status, English
Learner percentages, and parent education levels.
Furthermore, Selma High School was also selected because of its academic
performance. SHS had a higher Academic Performance Index (API) in 2006 than PHS
(SHS API 718, PHS API 661), a higher state rank in 2006 (SHS State Rank = 5, PHS
State Rank = 2), and a higher similar-school rank in 2006 (SHS Similar School
Rank = 10, PHS Similar School Rank = 5).
The treatment, a direct-instruction model known as DATAWORKS Explicit
Direct Instruction, was administered to the experimental group’s ELL during the
2006–2007 school year. The statistical analysis for the experimental versus
comparison group contrast was descriptive and focused on data gathered from the state
and federal accountability system.
Grade Level Growth Scores
Table 8 shows the normal curve equivalent (NCE) scores for the 8th to 9th,
9th to 10th, and 10th to 11th grade transitions. The Grade Level Growth Score
(GLGS) results for the same transitions are displayed in Table 9. The GLGS is the
difference between
the actual student growth and the expected growth, and it is utilized to illustrate
individual student growth from one year to the next. This value-added method stands
in contrast to the current accountability system, which uses "successive cohorts" and
assumes that students attending the school are comparable from year to year
regardless of school mobility and school factors (Linn, 2000).
8th to 9th Grade Transition
For the 8th to 9th grade transition, the normal curve equivalent (NCE) scores
were 57.39 and 59.02, respectively. NCEs range from 1 to 99 and were based on
statewide results for the ELL subgroup; the state mean was 50. The 8th to 9th
transition difference (1.63 points) was tested for significance using a dependent
groups t-test, and the increase in 9th grade performance relative to 8th grade
performance was not statistically significant, t(140) = 1.11, p > .05. The effect size
(Cohen's d), computed by dividing the difference (1.63) by the eighth grade standard
deviation (26.79), is .060. This effect size is generally recognized as very small.
To enhance interpretability of the results, grade level scores (GLS) and grade
level growth scores (GLGS) were computed. The mean grade level score for grade
eight was 8.57 on a theoretical range of 8.01 to 8.99; thus, the eighth grade ELL
subgroup was above the mean (8.50) for the ELL state normative sample of eighth
graders. In the ninth grade, the average for the same group of ELLs was 9.59, above
the state average of 9.50. A grade level growth score was computed by subtracting
the 8th grade GLS from the 9th grade GLS. For the 8th to 9th grade transition, the
GLGS was 1.02 (9.59 - 8.57). Thus, at Patterson High School, the ELL growth was
102% of the expected growth of ELLs statewide.
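The grade-level growth arithmetic above reduces to a subtraction and a comparison against the expected one-grade-level gain. A sketch using the study's 8th-to-9th figures follows; the function name is this sketch's own, not the study's.

```python
def grade_level_growth(pre_gls, post_gls, expected=1.0):
    """GLGS = actual growth in grade level scores, also expressed as a
    percentage of the expected one-grade-level (+1.0) yearly growth."""
    growth = post_gls - pre_gls
    return growth, 100.0 * growth / expected
```

For the 8th to 9th grade transition, grade_level_growth(8.57, 9.59) yields a GLGS of 1.02, or 102% of expected growth, matching the text.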
9th to 10th Grade Transition
Results for the 9th to 10th grade transition are shown in Tables 8 and 9. For
this group of 111 students, the 9th grade mean NCE score was 57.70 and the 10th
grade mean was 57.78, a slight increase of .08. This increase was tested for
significance using a dependent groups t-test; the increase in 10th grade performance
compared to 9th grade was not statistically significant, t(111) = .060, p > .05.
Because statistical significance is largely a function of sample size, two indices of
effect size were also computed. Cohen's d, the ratio of the difference to the standard
deviation of the pretest, equaled .042. This effect size is generally recognized as
small.
To enhance interpretability of the results, grade level scores (GLS) and grade
level growth scores (GLGS) were computed. The mean grade level score for grade
nine was 9.57 on a theoretical range of 9.01 to 9.99; therefore, the ninth grade ELL
subgroup was slightly above the mean (9.50) for the ELL state normative sample of
ninth graders. In the tenth grade, the average for the same group of ELLs was 10.57,
slightly above the state average (10.50). A grade level growth score was computed by
subtracting the 9th grade GLS from the 10th grade GLS. For the 9th to 10th grade
transition, the GLGS was 1.00 (10.57 - 9.57). Thus, at Patterson High School, the
ELL growth was 100% of the expected growth of ELLs statewide.
10th to 11th Grade Transition
The 10th to 11th grade transition results are shown in Tables 8 and 9. This
cohort consisted of 91 students; the 10th grade mean NCE score was 64.22 and the
11th grade mean was 65.33. Thus, the scores increased by 1.11 points, resulting in
about-average and slightly-above-average performance in 2005-06 and 2006-07,
respectively. The increase was not statistically significant, t(91) = .789, p > .05.
Because statistical significance is largely a function of sample size, two indices of
effect size were also computed. Cohen's d, the ratio of the difference to the standard
deviation of the pretest, equaled .045. This effect size is relatively small.
The grade level scores (GLS) and grade level growth scores (GLGS) (Hocevar,
2010) were computed as another way to illustrate the growth of ELL students at
Patterson High School. The mean grade level score for grade ten was 10.64 on a
theoretical range of 10.01 to 10.99; therefore, the tenth grade ELL subgroup was
slightly above the mean (10.50) for the ELL state normative sample of tenth graders.
In the eleventh grade, the average for the same group of ELLs was 11.65, slightly
above the state average (11.50). A grade level growth score was computed by
subtracting the 10th grade GLS from the 11th grade GLS. For the 10th to 11th grade
transition, the GLGS was 1.01 (11.65 - 10.64). Thus, at Patterson High School, the
ELL growth was slightly over 100% of the expected growth of ELLs statewide.
Table 8

Normal Curve Equivalent Scores

                Pre NCE            Post NCE
Transition      M (SD)             M (SD)             Diff.    t               p      d
----------------------------------------------------------------------------------------
8 to 9          57.39a (26.79)     59.02a (25.39)     1.63     t(140) = 1.11   N.S.   .060
9 to 10         57.70 (20.34)      57.78 (22.40)      .087     t(111) = .060   N.S.   .042
10 to 11        64.22 (24.40)      65.33 (21.64)      1.11     t(91) = .789    N.S.   .045
----------------------------------------------------------------------------------------
a The state mean for ELLs in each grade is 50 points.
Table 9

Grade Level Growth Scores

                Pre GLS    Post GLS
Transition      M          M           GLGS
--------------------------------------------
8 to 9          8.57a      9.59a       1.02b
9 to 10         9.57       10.57       1.00
10 to 11        10.64      11.65       1.01
--------------------------------------------
a The state mean for the 8th and 9th grade ELLs is 8.50 and 9.50, respectively.
b Expected growth is one grade level (+1.0).
Comparison School Results
The design of this study incorporated the use of a comparison/benchmark
school. Selma High School was selected from the similar-schools list in the
2005-2006 Accountability Progress Report (APR). The California Department of
Education classifies similar schools using 17 characteristics that include, but are not
limited to, school population, demographics, free and reduced lunch percentage,
English Language Learner percentage, and average level of parent education. The
comparison school was also selected, as previously explained, because of its strong
academic performance as measured by Academic Performance Index, state school
rank, and similar-school ranking.
API Comparisons
Overall, Selma High School has outperformed Patterson High School for
several years, even though the two schools are very similar in their demographic
characteristics. Table 7 compares PHS and SHS based on the 2006 Accountability
Progress Report: PHS had a slightly lower average parent education level, a lower
percentage of students receiving free and reduced lunch, and a higher percentage of
students identified as ELL than Selma High School. Given this, the researcher
expected Patterson High School's API to be about the same as Selma High School's;
however, that was not the case. Table 10 illustrates the differences in the two
schools' APIs.
Table 10

Comparison of API Base Scores of Patterson HS and Selma HS from 1999-2007

Year         1999   2000   2001   2002   2003   2004   2005   2006   2007
-------------------------------------------------------------------------
PHS API       549    552    583    584    626    631    665    661    650
SHS API       548    596    589    623    682    697    704    718    727
Difference     -1    -44     -6    -39    -56    -66    -39    -57    -77
Since 1999, Patterson High School API had increased 125 point while Selma
High School had increased by 267 points. The previous API comparisons and analysis
were based on the API components for secondary schools that are a made up of the
following five areas: English Language Arts, Mathematics, Science, History,
CAHSEE English Language Arts, and CAHSEE Mathematics results. Under the
secondary API calculation, the ELA portion of the CST is weighted the highest at
30%, followed by the CST tests in History 22.5%, Math at 20%, and Science at 15%.
The rest of the API is made up by both sections of the CAHSEE that are each
weighted at 30% of the overall API calculation. As noted above, the determination of
the API is influenced by the results of the other areas.
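The API is, in essence, a weighted composite of content-area results. As a rough illustration of the weighted-average mechanics only, the sketch below uses hypothetical per-area index values and hypothetical weights normalized to sum to 1.0; it is not the official California Department of Education formula, which involves additional rescaling of raw performance onto the 200-1000 API scale before weighting.

```python
# Illustrative sketch of a weighted composite index like the API.
# Both the per-area scores and the weights below are hypothetical
# placeholders; the weights are normalized to sum to 1.0.

def composite_index(scores: dict, weights: dict) -> float:
    """Weighted average of per-content-area index values."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[area] * weights[area] for area in weights)

# Hypothetical per-area index values for one school year.
scores = {
    "CST ELA": 660, "CST History": 640, "CST Math": 620,
    "CST Science": 650, "CAHSEE ELA": 700, "CAHSEE Math": 690,
}
# Hypothetical weights (illustration only, not the CDE formula).
weights = {
    "CST ELA": 0.30, "CST History": 0.175, "CST Math": 0.15,
    "CST Science": 0.125, "CAHSEE ELA": 0.125, "CAHSEE Math": 0.125,
}

print(round(composite_index(scores, weights)))  # composite on same scale
```

Because ELA carries the largest weight, a decline in ELA results pulls the composite down more than an equal decline in any other single area, which is the point made in the text above.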
Furthermore, the previous comparisons are also limited by the fact that the
year-to-year API ratings were calculated with different cohorts of students. Therefore,
a more precise measure of how Patterson High School's English Learner population
performed is an API calculated utilizing the English Language Arts results of the
ELL students only. Utilizing the API calculation spreadsheet provided by the
California Department of Education, the researcher calculated a grade-level API using
only the 342 ELLs in this study who had scores for both the 2006 English Language
Arts California Standards Test (pre-test) and the 2007 ELA CST (post-test); Table 11
compares these scores. These calculations showed that the 9th grade cohort outgained
all other grades, while the 10th and 11th grade cohorts were consistent with the
schoolwide results in the table below, which demonstrated a decrease in student
achievement.
Table 11
API Results of ELL Students at Patterson HS Who Have CST Results for Both
the 2006 and 2007 CST in ELA

                          Pre 2006   Post 2007   Gain/Loss
All Students - ELA            690         655         -35
All ELLs                      567         537         -30
Grade 09 ELL Subgroup         602         622         +20
Grade 10 ELL Subgroup         568         504         -64
Grade 11 ELL Subgroup         512         447         -65
A further analysis of the ELA test scores for 2006 and 2007, broken down by
performance band level, appears in Table 12. Overall, there was a downward shift,
with about 16 students scoring one performance band lower across the Advanced
through Basic levels. The biggest change came in the number of students scoring Far
Below Basic: an increase of 22 students scoring FBB between 2006 and 2007. The
other big drop came from the Basic level, where fourteen students fell to Below Basic
and Far Below Basic.
Table 13 illustrates, by grade level, where most of the 22 additional students
scoring Far Below Basic came from. Most came from the eleventh grade, which saw
an increase of 15 students scoring Far Below Basic from 2006 to 2007. The other
major drop came in grade 10, with an increase of nine students scoring Far Below
Basic during the same period. Ninth grade had the only decrease, with two fewer
students scoring FBB between 2006 and 2007. Overall, these drops help to explain
the decline in API for Patterson High School in 2007.
Table 12
Performance Levels of the ELA Portion on the CST for ELL between 2006 and
2007

All ELL
PL            2006       %    2007       %   Change   % Change
Advanced         4   1.17%       3   0.88%       -1     -0.29%
Proficient      26   7.60%      25   7.31%       -1     -0.29%
Basic          128  37.43%     114  33.33%      -14     -4.09%
Below B        117  34.21%     111  32.46%       -6     -1.75%
FBB             67  19.59%      89  26.02%      +22     +6.43%
Total          342             342
Table 13
Number of ELL Students Broken Down by Performance Level and Grade for
2006 and 2007

              Advanced  Proficient  Basic  Below B  FBB
Grade 9
  2007               3          20     57       41   19
  2006               2          13     62       42   21
  Difference        +1          +7     -5       -1   -2
Grade 10
  2007               0           4     35       39   33
  2006               2          12     37       36   24
  Difference        -2          -8     -2       +3   +9
Grade 11
  2007               0           1     22       31   37
  2006               0           1     29       39   22
  Difference         0           0     -7       -8  +15
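The Change and % Change columns in Table 12 follow directly from the raw band counts for the 342 matched students. A minimal sketch of that arithmetic, using the counts from Table 12:

```python
# Recompute the Change and % Change columns of Table 12 from the raw
# performance-band counts (n = 342 ELLs tested in both years).

counts_2006 = {"Advanced": 4, "Proficient": 26, "Basic": 128,
               "Below Basic": 117, "Far Below Basic": 67}
counts_2007 = {"Advanced": 3, "Proficient": 25, "Basic": 114,
               "Below Basic": 111, "Far Below Basic": 89}
n = sum(counts_2006.values())  # 342 matched students

for band in counts_2006:
    change = counts_2007[band] - counts_2006[band]
    pct_change = 100.0 * change / n  # change as a share of all 342
    print(f"{band:16s} {change:+4d}  {pct_change:+.2f}%")
```

Running this reproduces, for example, the +22 students (+6.43% of the cohort) shift into Far Below Basic reported in the table.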
Adequate Yearly Progress Comparison
The federal No Child Left Behind Act of 2001 requires that all schools and
local education agencies (LEAs) meet Adequate Yearly Progress (AYP). To comply
with NCLB, California adopted AYP criteria that consist of the following:
Requirement 1: Participation Rate
Requirement 2: Percent Proficient (Annual Measurable Objectives)
Requirement 3: API
Requirement 4: Graduation Rate
Requirements 1 and 2 apply at the school, LEA, and subgroup levels. Requirements 3
and 4 apply only at the school and LEA levels. Potentially, a school or LEA may have
up to 46 different criteria to meet in order to make AYP.
Adequate Yearly Progress (AYP), unlike the California Academic
Performance Index (API), gives credit only for the percentage of students who are
identified as proficient or advanced on the statewide achievement tests, known as the
California Standards Test (CST). Each elementary, middle, and secondary school must
demonstrate progress in student achievement schoolwide and on the achievement of
numerically significant subgroups of students within schools. The primary measure of
success is the achievement of a specific, and gradually increasing, percentage of
students in each subgroup scoring proficient or advanced on the CST in English-
Language Arts and Mathematics.
The AYP target for the percentage of students scoring proficient or advanced
in English-Language Arts was 22.3% for both 2006 and 2007. Table 14 illustrates that
Selma High School met its AYP targets for 2006 and 2007. When the same targets are
compared to Patterson HS's performance, PHS met its target only in 2006, not in 2007.
Table 14
AMOs – Percent of Students Scoring Proficient or Advanced in English-
Language Arts

          2006    2007   Met All AYP Criteria
Target    22.3%   22.3%
PHS       23.2%    8.2%  No
SHS       24.9%   24.6%  Yes
Another component of the AYP is the set of Title III English Language
Proficiency Annual Measurable Achievement Objectives (AMAOs). An AMAO is a
performance objective, or target, that school districts receiving a Title III subgrant
must meet each year for their English Learners. All school districts receiving a Title
III subgrant are required to meet two English language proficiency AMAOs and a
third, academic achievement AMAO based on the AYP information. Both English
language proficiency AMAOs are calculated from data on the California English
Language Development Test (CELDT). The third AMAO, relating to meeting AYP
requirements for the EL subgroup, is based on data from the California Standards Test
(CST) and California High School Exit Exam (CAHSEE).
Table 15 illustrates that Patterson High School posted a greater increase in
AMAO I and II when compared to Selma High School during the same period. Under
AMAO I, which is the percentage of ELs making annual progress on the CELDT,
PHS exceeded the state target by 8.2% in 2007 (post-intervention), while SHS missed
its target by 1.3%. Comparing both schools' performance on AMAO I in 2007, PHS
outperformed SHS by 9.5 percentage points. This means that a higher percentage of
ELs at PHS are attaining English proficiency, at a faster rate than both the state target
and SHS.
PHS had similar results for AMAO II. AMAO II measures the percentage of
ELs, in a defined cohort, who have attained English proficiency on the CELDT at a
given point in time (Information Guide Title III, 2006). The cohort for AMAO II
contains those students who could reasonably be expected to have reached English
language proficiency at the time of the 2007 annual CELDT administration. Patterson
High School once again outperformed Selma High School; however, neither school
met the state target. PHS had an overall score of 23.5% in 2007 (post-intervention),
while SHS had an overall score of 22.8% during the same time span.
Table 15
Comparison of Title III School Accountability

                          AMAO I                     AMAO II
                  2006    2007   Diff.       2006    2007   Diff.
Target           52.0%   48.7%   -3.3%      31.4%   27.2%   -4.2%
PHS              87.8%   56.9%  -30.9%      45.0%   23.5%  -21.5%
SHS              69.8%   47.4%  -22.4%      26.8%   22.8%   -4.0%
PHS - SHS       +18.0%   +9.5%             +18.2%   +0.7%
CHAPTER 5
SUMMARY, DISCUSSION, AND RECOMMENDATIONS
Overview
The preceding chapters of this study provide the rationale, methodology, and
results for the present evaluation of the intervention, DATAWORKS Explicit Direct
Instruction, and its impact on the achievement of the ELLs at Patterson High School.
This chapter concludes the study by elaborating on the implications of the
quantitative findings, illuminating conclusions drawn from the informal qualitative
data, and comparing results with the benchmark school. Recommendations are
provided for the site and for possible future study.
Purpose and Method
The purpose of this study was to examine the impact of the direct instruction
intervention, specifically DATAWORKS' Explicit Direct Instruction model, on the
achievement of ELLs at Patterson High School. Specifically, the changes in individual
ELLs' scale scores on the California Standards Test (CST) for English Language Arts
from pre-intervention (2006) to post-intervention (2007) were analyzed by school and
grade level in the experimental school, Patterson High School. A sequential
explanatory design model (Creswell, 2003) was used for the study, with a focus
predominantly on the quantitative data and minimal use of qualitative data.
Participants in this study consisted of the ELL student population at Patterson
High School, a comprehensive high school located thirty minutes west of Modesto,
California. The dependent group design was selected in order to enhance the validity
of the results. External validity is one of the limitations of any study that does not
employ random assignment; the dependent group design allows for some control of
this issue by allowing the participants to serve as their own control group (Creswell,
2003). Therefore, the students in this study were only those ELLs who had
participated in both the pre-intervention (2006) and post-intervention (2007)
administrations of the California Standards Test in English Language Arts. There
were 342 ELLs in grades 9 through 11 who met these qualifications and who had
received the direct instruction intervention, the DATAWORKS Explicit Direct
Instruction model.
Informal interviews and observations of the Patterson High School staff were
conducted in grades 9-12 to obtain formative information about the fidelity of
implementation of the direct instruction intervention model. Interviews varied from
10 to 20 minutes in length. Teachers were interviewed individually, in small groups,
and as a whole faculty. The observations were informal and varied in length from
3-5 minute informal walk-throughs to a 40-minute formal observation.
Summary of Findings
Patterson High School
The following section provides evidence for answering the overarching research
question that guided this study: Did the intervention, direct instruction, have an effect
on the academic achievement of English Language Learners (ELLs) on the English
Language Arts (ELA) portion of the California Standards Test (CST)? Overall, mixed
outcomes were observed in the academic achievement of the ELL population at
Patterson High School.
As previously mentioned, a dependent group design was utilized for analysis
of data from the 2006 CST ELA results (pre-test) compared to the 2007 CST ELA
(post-test) results for the 342 ELLs. The overall Academic Performance Index (API)
score of the entire school decreased from a base (pre-intervention) of 661 in 2006 to
650 (post-intervention) in 2007. One limitation of utilizing the API as a comparison
tool is that the schoolwide APIs for 2006 (pre-intervention) and 2007
(post-intervention) were computed using different cohorts of students. To address
this concern, the researcher computed an API strictly for the 342 ELLs who were
participants in the study, utilizing only their English Language Arts results in the
dependent group design. This comparison is illustrated in Table 11.
Table 11 illustrated that, overall, utilizing solely the English Language Arts
results, the API for all students in grades 9-11 decreased from a 2006
(pre-intervention) base of 697 to a 2007 (post-intervention) API of 677, a decline of
20 points. The API of the 342 ELL participants in the study also decreased during the
same assessment period, from a 2006 (pre-intervention) base of 607 to a 2007
(post-intervention) API of 583, a decline of 24 points. The ELA-only results were
consistent with regard to gains and declines in API at the various grade levels as well:
Grade 9 posted the only positive gain, of 19 points, while grades 10 and 11 posted
large declines in their grade-level APIs (64 and 65 API points, respectively).
On the surface, Patterson High did not fare very well when measured by the
API components. However, when the findings are reviewed using the value-added
methodology proposed by Hocevar (2009), the grade level growth score results were
slightly above expected growth for ELLs statewide. In all three grade-level
transitions, the ELL subgroup outperformed the state ELL normative sample for its
respective grade. More importantly, the posttest grade level scores (Table 9) were
9.59, 10.57, and 11.65 for grades 9-11. These scores suggest that, compared to ELL
students statewide (mean = .50), the students at Patterson were well above average,
particularly in the 11th grade (.65 versus .50 for the state).
As previously highlighted in Chapter 4, the percentage of ELL students
attaining English proficiency at Patterson High School exceeded state targets in 2006
(pre-intervention) and 2007 (post-intervention) by 35% and 8.2%, respectively. This
indicates that 57% of the PHS ELL population made annual progress on the CELDT
and is attaining English proficiency at a faster rate than the statewide targets.
Statistical Significance
In reviewing the findings for statistical significance, as seen in Table 8,
Patterson High School demonstrated positive gains for (a) ELLs in 9th grade, (b)
ELLs in 10th grade, and (c) ELLs in 11th grade. The pre/post changes in the means
and the increases in the three components played a significant role in Patterson High
School's meeting 20 of 22 criteria for the 2007 school year's AYP.
Table 9 reflected an increase in the grade level growth scores of the ELL
subgroup from pre-intervention (2006) to post-intervention (2007). All three grades
met their expected growth of one grade level on the English Language Arts portion of
the California Standards Test. The 8th-to-9th grade transition posted the highest
grade-level growth, with a Grade Level Growth Score of +1.02.
Practical Significance
Tables 12 and 13 illustrate that the ninth grade findings reflect the largest
mean difference, +1.63, which indicates a consistent positive change of moving ELL
9th graders toward the "proficient" and "advanced" performance bands. The
percentage of EL students moving from "basic" to "proficient" increased from 9%
pre-intervention (2006) to 15% post-intervention (2007), an increase of seven more
students scoring "proficient" in 2007. Furthermore, the percentage of ELL students
scoring in the "below basic" and "far below basic" performance bands decreased
slightly, by about 3%, from pre-intervention (2006) to post-intervention (2007).
Further investigation of the data showed that the ELL subgroup actually
outperformed all 9th graders schoolwide. The ELL ninth grade subgroup had a higher
percentage of students scoring in the "basic," "proficient," and "advanced"
performance bands when compared to all ELLs schoolwide (ELL 9th grade
subgroup = 57%, ELL schoolwide = 40%).
The data for tenth grade students illustrated a mean change of +.087 from the
pre-intervention (2006) to the post-intervention (2007) year on the CST ELA. The
results for all ELL tenth graders showed an overall decrease across performance
bands. The biggest decrease came from students moving from "below basic" to "far
below basic": there was an increase of 8% in students scoring "far below basic" and
"below basic" from pre-intervention (2006) to post-intervention (2007). About 20%
(24/111) of all ELL 10th graders moved to lower quintiles and, therefore, negatively
impacted the overall API.
Data for ELLs in the eleventh grade demonstrated an increase in the mean of
+1.11 from pre-intervention (2006) to post-intervention (2007). Of significance,
however, is the percentage of ELL eleventh graders scoring "far below basic," which
rose from 24% pre-intervention to 40% post-intervention.
Tables 8 and 9 illustrate the grade level scores (GLS) and grade level growth
scores (GLGS) as another way to demonstrate the growth of ELL students at Patterson
High School. The main purpose of the Grade Level Growth Score is to separate
the effects of nonschool-related factors (such as family, peer, and individual
influences) from the school's performance at any point in time so that student
performance can be attributed appropriately (Goldschmidt et al., 2005). The Grade
Level Growth Score is simply the difference between a cohort's actual growth and its
expected growth. As noted in Chapter 4, the ninth grade ELL subgroup had the
highest GLGS of any grade in this study. Their GLGS was +1.02, which signifies a
full grade level of growth from 8th to 9th grade. The tenth (+1.00) and eleventh
(+1.01) grades also posted one year's growth. In addition, the eleventh grade ELL
subgroup had the highest grade level score mean, 11.65, while the tenth grade had the
lowest, 10.57.
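The grade-level growth arithmetic described above can be sketched as follows. The posttest grade level scores are those reported in the text; the pretest scores here are back-computed (posttest minus reported growth) purely for illustration, so treat them as assumed values rather than figures from Table 8:

```python
# Sketch of the Grade Level Growth Score (GLGS) arithmetic.
# Posttest grade level scores (GLS) are taken from the text; pretest
# scores are back-computed from the reported growth for illustration.

post_gls = {"grade 9": 9.59, "grade 10": 10.57, "grade 11": 11.65}
pre_gls = {"grade 9": 8.57, "grade 10": 9.57, "grade 11": 10.64}  # assumed
EXPECTED_GROWTH = 1.0  # one grade level of growth per year

for grade in post_gls:
    # Actual growth over the year, compared against the expected +1.0.
    growth = round(post_gls[grade] - pre_gls[grade], 2)
    met = growth >= EXPECTED_GROWTH
    print(f"{grade}: growth {growth:+.2f}  met expected growth: {met}")
```

Under this convention a value of +1.00 means exactly one year of growth, so the +1.02, +1.00, and +1.01 figures in the text all correspond to meeting or slightly exceeding expected growth.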
Another indicator utilized in this study to gauge the impact of the
intervention on ELL student achievement was the California English Language
Development Test (CELDT). The CELDT is administered yearly to all students
classified as ELLs and is intended to assess students' English-language proficiency in
four domains: (a) listening, (b) speaking, (c) reading, and (d) writing. The CELDT
defines five levels of performance: (a) beginning, (b) early intermediate, (c)
intermediate, (d) early advanced, and (e) advanced. Students in grades 9 through 12
receive an overall score composed of listening, speaking, reading, and writing, as well
as a comprehension score for listening and reading. The following data summarize
the 2005-06 and 2006-07 CELDT results. It should be noted that prior to 2006-07,
CELDT results were not vertically linked; therefore, comparisons across grade spans
and years could not be made (Technical Report for CELDT Form F, 2007).
Data from the 2005-06 CELDT provide information on English-proficiency
levels for English learners in grades 9 through 11 at the experimental school,
Patterson High School. The findings revealed that, of the 330 English learners
assessed in 2005-06, 35% scored "advanced," 38% scored "early advanced," 16%
scored "intermediate," 6% scored "early intermediate," and 5% scored "beginning."
In 2006-07, 12% of English learners scored "advanced," 37% scored "early
advanced," 34% scored "intermediate," 13% scored "early intermediate," and 4%
scored "beginning." An important feature of the results for the pre-intervention
(2006) and post-intervention (2007) years is that, in both years, the majority of
English learners fell within the "early advanced" and "intermediate" proficiency
bands. In addition, fewer students achieved the "advanced" level of English-language
proficiency in 2007 than in 2006 (12% versus 35%). This is due to the recalibration
of the CELDT in 2007 to a common scale.
Overall, the results for Patterson High School in this study were mixed. There
was a downward shift from 2006 to 2007, with more ELL students scoring in the
lower quintiles on the ELA portion of the CST. However, CELDT scores and GLGS
results showed some promising trends.
Interviews
In addition to the quantitative data, informal qualitative data were collected
through interviews and observations. Individual and group interviews were conducted
using Creswell's (2003) methods for gathering qualitative data. Participants
responded to the following questions:
1. What were some of your reactions to the implementation of the
DATAWORKS Direct Instruction Model?
2. How has direct instruction training impacted PHS?
3. What are some of the changes you have implemented in your classroom
since Direct Instruction?
The researcher gathered information with a primary focus on teachers'
perceptions of the effectiveness of the direct instruction intervention (DATAWORKS)
and its effect on the English Language Learners in their classrooms.
Interview question 1. What are some of your reactions to the implementation
of the DATAWORKS Direct Instruction Model?
At first, there was some resistance to the DATAWORKS model of direct
instruction from most of the staff. The strongest resistance came from the most
tenured teachers, who viewed the DATAWORKS model as too rigid and as limiting
their creativity. The younger teachers found the model very helpful and adapted it
into their instructional methods very quickly. Most teachers agreed that they liked the
structured lesson design and instructional strategies.
Furthermore, several teachers noted that administrative support enhanced the
implementation of the direct instruction intervention. The principal as well as the Vice
Principal of Instruction attended all trainings and actively participated in the monthly
collaborative meetings.
Interview question 2. How has direct instruction training impacted PHS?
One of the major themes on which most teachers commented was the common
vocabulary around direct instruction and the importance of checking for understanding
throughout the lesson. The common vocabulary of the direct instruction model
(DATAWORKS) provided teachers with a narrowed focus on specific portions of the
lesson design. All teachers had a fundamental understanding of the common
pedagogy being used throughout the school. For example, checking for understanding
was presented as something required throughout the lesson, not just at its end. There
was a new spirit of collaboration as teachers began to share ideas on how they
implemented direct instruction in their classrooms.
Teachers also stated that their experience in deconstructing the standards had
given them a very detailed comprehension of each standard. Deconstructing standards
allowed teachers to develop learning objectives and to distinguish between a learning
objective and a standard. Most teachers felt this had a tremendous impact on their
ability to meet the needs of their students.
Interview question 3. What are some of the changes you have implemented in
your classroom since Direct Instruction?
Overall, teachers commented on how much more detail they provided for each
lesson that they designed and the amount of time spent collaborating with their
colleagues on instructional planning and delivery. Prior to the implementation of
DATAWORKS, there was evidence of inconsistency in the overall instructional
program utilized at Patterson High School. Teachers taught mainly in isolation with
little time spent collaborating and observing other teachers teach.
Several teachers felt that one of the best things about the implementation of
DATAWORKS at PHS was the instructional tours. Teachers were encouraged to
visit other teachers' classrooms during their prep period, and the principal and vice
principal would schedule these tours throughout the day. This practice helped to
enhance the implementation of the direct instruction model at PHS.
Observations
Frequent formal and informal observations were conducted to help validate
the implementation of direct instruction in the classroom. The observations ranged
from five minutes to 40 minutes in length in order to obtain a more global view of the
implementation of direct instruction. Feedback was provided utilizing the
DATAWORKS walk-through form (Appendix D). The fidelity of implementation of
the direct instruction intervention was determined by how many components of the
DATAWORKS lesson plan were evident in the classroom. Checking for
understanding and learning objectives were two of the key components that were
utilized by all.
Patterson High School and Selma High School
Academic Performance Index
Patterson High School did not improve its schoolwide API in 2007; it
experienced a decrease of 11 points from 2006 to 2007, as depicted in Table 10. The
benchmark school, Selma High School, grew by 9 points on the API, moving from a
2006 API of 718 to a 2007 API of 727. The API for the English Language Learner
subgroup at Patterson High School also decreased, by 30 points, moving from a 2006
API of 567 to a 2007 API of 537. The ninth grade ELL subgroup was the only group
that grew from pre-intervention (2006) to post-intervention (2007), by 20 points.
The decline in API scores can be attributed to a changing population due to an
influx of students from a nearby urban area. To illustrate the magnitude of this influx,
there were 388 ELL students included in the API in 2006 and 435 in 2007, an increase
of 47 ELL students in the PHS 2007 API. Table 16 further illustrates the impact of the
changing population at PHS. Because of this problem, the declining CST and API
scores at Patterson likely were not due to the direct instruction intervention.
Therefore, more credence can be given to the highly positive results using
grade level scores. Grade level scores put schools on a .01-.99 scale with a mean of
.50; state data for all ELL students were used as the normative group. For example, a
grade level at .50 is doing about average compared to other schools with ELL students
at that grade level. At Patterson, the posttest grade level scores were .59, .57, and .65
for grades 9-11, respectively, all well above the state mean of .50. However, because
the GLS at Patterson were high on the pretest and the change from pre to post was not
significant, no inference can be made that the intervention caused this difference.
The grade level growth score results also were highly positive. Each grade
level met or exceeded its expected growth of one grade level. It is possible that the
direct instruction intervention facilitated Patterson's growth. However, because there
was no randomized control group, a causal inference should not be made.
Adequate Yearly Progress
As discussed previously, the No Child Left Behind (NCLB) legislation of
2001 (United States Department of Education, 2002) required that all schools have a
certain percentage of students identified as proficient or advanced, with the goal of
100% of students being proficient or advanced by 2014. NCLB sets an annual
measurable objective (AMO) each year that is used to determine whether a school has
demonstrated adequate yearly progress (AYP). The 2007 AMO for English Language
Arts was 22.3% for all high schools. Patterson High School had a total of 10 ELL
students out of a possible 122 who were identified as proficient or advanced in 2007,
so Patterson High School did not meet the AMO for English Language Arts (ELA).
Selma High School did meet its AMO in ELA, with 44 ELL students out of a possible
179 scoring proficient or advanced on the CST.
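The AMO comparison above reduces to simple proportions. Using the counts reported in this section, the percentages line up with those in Table 14:

```python
# Percent proficient-or-advanced versus the 2007 ELA AMO target,
# using the ELL counts reported in the text.

AMO_TARGET = 22.3  # percent, 2007 ELA target for all high schools

def pct_proficient(n_proficient_or_advanced: int, n_tested: int) -> float:
    """Percentage of tested students scoring proficient or advanced."""
    return round(100.0 * n_proficient_or_advanced / n_tested, 1)

phs = pct_proficient(10, 122)   # Patterson HS ELL subgroup
shs = pct_proficient(44, 179)   # Selma HS ELL subgroup

print(f"PHS: {phs}%  met AMO: {phs >= AMO_TARGET}")
print(f"SHS: {shs}%  met AMO: {shs >= AMO_TARGET}")
```

This recovers the 8.2% (PHS) and 24.6% (SHS) figures shown in Table 14, with only SHS clearing the 22.3% target.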
Title III English Language Proficiency – AMAO
As noted previously, the Title III English Language Proficiency Annual
Measurable Achievement Objectives (AMAOs) are performance objectives that
school districts receiving a Title III subgrant must meet each year for their English
Learners. In 2007, Patterson High School outperformed the benchmark school, Selma
High School, in both AMAO I and AMAO II. Under AMAO I, PHS (56.9%)
exceeded SHS (47.4%) by 9.5 percentage points, and under AMAO II by 0.7 points.
Factors Impacting PHS Overall Performance
As noted above, the current accountability system uses "successive cohorts" as
the basis for measuring improvement in student achievement. Interpreting results
under the successive-cohorts approach requires the assumption that the students
attending the school are comparable from year to year in terms of background
characteristics, mobility, and prior learning. According to Linn (2006), "changes in
the composition of the student body due to student mobility call into question the
basic assumption of comparability of the successive cohorts of students" (p. 10).
Below is a snapshot of Patterson High School enrollment over the past five years.
Overall, during 2003-2008, PHS experienced an influx of about 100 new
students per year moving from the Bay Area to the City of Patterson. Patterson High
School grew by about 504 students during this period and saw its Asian, Pacific
Islander, and African American populations double and, in some cases, triple in size.
This influx of new students changed the student characteristics of PHS from one year
to the next. Changes in student characteristics make it difficult to accurately measure
student achievement under the current accountability system.
Table 16
Patterson High School Enrollment by Ethnicity

Year       A.I.  Asian  P.I.  Filipino  Latino  A.A.  White  M.R.  Total
03-04        11     14     3         7     749    37    351     0   1172
04-05        10     17     8         8     823    42    389     6   1303
05-06        16     29    16        22     902    85    358    16   1444
06-07        16     40    21        23     975    96    372    17   1560
07-08        14     39    30        31    1045   132    371    14   1676
Change       +3    +25   +27       +24    +296   +95    +20   +14   +504
% Change    27%   178%  900%      342%     39%  256%     5%   n/a    43%

A.I. = American Indian; P.I. = Pacific Islander; A.A. = African American;
M.R. = Multiple Response (percent change is undefined for M.R., whose
2003-04 count was zero)
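The Change and % Change rows in Table 16 can be reproduced from the first- and last-year enrollment counts. Note that the table's percentages appear to be truncated rather than rounded (e.g., 178.6% is reported as 178%), so the sketch below truncates to match:

```python
# Reproduce the Change and % Change rows of Table 16 from the
# 2003-04 and 2007-08 enrollment counts.

enroll_0304 = {"A.I.": 11, "Asian": 14, "P.I.": 3, "Filipino": 7,
               "Latino": 749, "A.A.": 37, "White": 351, "M.R.": 0}
enroll_0708 = {"A.I.": 14, "Asian": 39, "P.I.": 30, "Filipino": 31,
               "Latino": 1045, "A.A.": 132, "White": 371, "M.R.": 14}

for group, start in enroll_0304.items():
    change = enroll_0708[group] - start
    # int() truncates toward zero, matching the table's reported values;
    # M.R. started at zero, so its percent change is undefined.
    pct = f"{int(100 * change / start)}%" if start else "n/a"
    print(f"{group:8s} {change:+5d}  {pct}")

total_change = sum(enroll_0708.values()) - sum(enroll_0304.values())
total_pct = int(100 * total_change / sum(enroll_0304.values()))
print(f"Total    {total_change:+5d}  {total_pct}%")
```

Running this recovers the +504 total change (43% growth) and the near-tripling of the African American population (+256%) cited in the paragraph above.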
Implications
During the course of this study, changes in the academic achievement of
students at Patterson High School were noted. The ninth grade experienced the
largest mean change and had the greatest percentage of students (16%) scoring
proficient and above on the CST ELA from pre-intervention (2006) to
post-intervention (2007). Furthermore, all ELL grade levels at Patterson High School
made at least one year's growth on the CST between the pre-intervention (2006) and
post-intervention (2007) years, as measured by the value-added Grade Level Growth
Scores developed by Hocevar (2009).
This study has provided evidence that English Language Learner achievement
made positive gains and that the direct instruction intervention, DATAWORKS,
could possibly have contributed to that success. Most importantly, the teacher-level
factors of self-efficacy identified by Clark and Estes (2002) were addressed through
the direct instruction model. Both interviews and observations indicate that teachers'
self-efficacy seemed to improve as they focused on the direct instruction model and
began to take ownership in determining which research-based best practices are
essential for their continuous improvement.
There are also implications that support the research of Marzano (2003),
Schmoker (1999), and DuFour and Eaker (1998) on the effect of a good teacher on
student achievement. Their research indicates that teachers have a profound influence
on student achievement and that an effective teacher requires time for reflection and
collaboration. As a result of the implementation of the direct instruction model,
Patterson High School redesigned its instructional day to allow collaboration time on
Wednesdays. This collaboration time, paired with the implementation of the direct
instruction model, provided a way for teachers to discuss the instructional practices
that were most effective in assisting student learning. Teachers were able to discuss
more openly the strategies and lesson designs they were utilizing to address gaps in
student achievement.
While student achievement at Patterson High School is not where it needs to
be, PHS continues to work collaboratively to plan instruction that is focused on
helping students achieve. Direct instruction has become an essential part of the
Patterson High School instructional program. It has provided a vehicle for
collaboration, planning and enhancing what is critical to the success of all students.
However, there is considerable work that still needs to be done in order to truly
institutionalize the implementation of explicit direct instruction throughout the school.
Implications for Implementation of Direct Instruction
This study was limited to an analysis of direct instruction on the achievement
of English Language Learners. The implementation of direct instruction created a
need for more time for teachers to plan together and discuss what Marzano (2003)
calls a "guaranteed and viable curriculum." This need forced teachers and
administrators to distinguish essential from supplemental content and to ensure that
the essential content is sequenced appropriately and can be adequately addressed in
the instructional time available. The administration at PHS had to reconstruct the
instructional day to provide a weekly early release, creating time for collaboration
between teachers and administrators. According to DuFour et al. (2004), without
collaboration it becomes difficult to guarantee that all students have access to the
same essential outcomes. When collaboration is in place, teachers are stretched to
develop and implement more effective strategies in their classrooms. A commitment
to high levels of learning for all students can be realized when teachers work
together to clarify essential student outcomes, gather timely evidence regarding
student learning, and collaborate with one another to identify ways to address
student weaknesses and build upon student strengths.
The principal and the leadership team met biweekly with each teacher for
about 120 minutes to discuss maximizing student learning and to further refine the
delivery of the intended curriculum. For example, PHS increased the rate of
checking for understanding throughout lessons in order to constantly monitor
student progress and provide immediate feedback. All of these changes came about
as a result of implementing the direct instruction model, DATAWORKS Explicit
Direct Instruction.
Furthermore, administrative support for the implementation of direct instruction
at Patterson High School had a positive impact on changing the school culture, even
though more could have been done at specific grade levels. According to DuFour et
al. (2004), Marzano et al. (2005), and Schmoker (1999), a school's administration has
the ability to keep a school focused on student achievement, and the extent to which
the leader establishes clear goals and keeps those goals at the forefront of the
school's attention can lead to greater student gains on standardized tests. To keep
teachers focused on the explicit direct instruction model, the administration was
responsible for classroom observations and weekly "walk-throughs." The weekly
walk-through form was aligned to the DATAWORKS lesson design. This method of
feedback was used to gather data on how frequently teachers used direct instruction
strategies; the data gathered were presented to teachers once a month.
Site-Based Recommendations
The intent of this study was to evaluate the effect of direct instruction on
the achievement of ELLs as measured by the English Language Arts (ELA) portion of
the California Standards Test (CST) at Patterson High School. This study was limited
to the English Language Learner students at Patterson High School and to the
implementation of the intervention, direct instruction. The direct instruction model
that was implemented was DATAWORKS Explicit Direct Instruction. Its purpose was
to improve the achievement of English Language Learners as measured by the ELA
portion of the CST. The qualitative results gathered through informal interviews and
observations indicated that the intervention was implemented with some fidelity.
The focus of the researcher in this study was the achievement of the Patterson
High School ELL population on the ELA portion of the CST. By utilizing the methods
established by Clark and Estes (2002), the researcher was able to identify the
knowledge, motivation, and organizational performance gaps that have impeded
academic achievement at PHS. The following are the researcher's recommendations
on how best to continue the implementation of direct instruction throughout the
school and possibly expand it to other sites within the district.
As part of the analysis of this study, it is recommended that the staff continue
to develop a collaborative environment in order to sustain and strengthen school
improvement goals. Creating a collaborative environment has been described as the
single most important factor for successful school improvement initiatives and the first
order of business for those seeking to enhance the effectiveness of their school
(DuFour & Eaker, 1998). The creation of small working groups, learning
communities, or inquiry groups makes it easier to target specific instructional
challenges and improve instructional practices. Through collaboration, adults can
support each other's learning continually over time. According to DuFour et al. (2004),
learning communities "allow teachers to work together to clarify ways to address
student weakness and build upon student strengths" (p. 183). By continuing to
establish learning communities at PHS, the staff will have a better understanding of
the appropriate actions to take in the future with regard to direct instruction.
Improvement and reform are collaborative efforts, and they must continue in
order to improve teaching and learning at PHS.
Another recommendation to support Patterson High School in its quest for
academic improvement lies in developing a data-driven culture. Data
can be a powerful tool to keep the needs of the students visible and to help identify
what works and what does not. With the push for high standards in education and the
advent of high-stakes accountability, we need data to monitor how well our students
are doing.
Properly used, data can be a compelling means of launching, sustaining, and
institutionalizing reform efforts (Johnson, 2002). For data to do all that, they have to be
good, valid, and meaningful. The challenge, then, is to understand the various measures
and assessments and to make sense of them in terms of English Learner achievement.
Test scores must be combined with other data to glean the information we need to
target instruction and tailor course placement to the needs of the diverse English
Learner population. This recommendation also includes the development of additional
formative assessments that will provide more timely feedback than the semester
assessments and the state's summative assessment. By utilizing EduSoft's capability to
design standards-based exams, the school will be able to create rigorous exams that
will, in turn, increase the rigor of the instruction in the classroom. A systematic
method of measuring progress will address the lack of rigor that was an issue in
the classrooms at PHS.
Limitations
External validity is a concern for this study. This study utilized a unique
population; therefore, generalizations cannot be applied beyond the experimental
group. The lack of external validity is due to the non-random selection
procedure used in the study. The use of disaggregated data from the California
Department of Education is an attempt to maximize external validity. These data were
used to compare growth between the control school and the benchmark school as
reported by the state. It should also be noted that the interaction of the causal
relationship with treatment variations is another limitation to the external validity of
this study.
A non-equivalent control group design is susceptible to selection bias;
therefore, internal validity issues will always be present. Given the results, there are
many known and unknown factors, not addressed here, that may have influenced
the results of this study.
The selection threat is a factor that commonly occurs between pretest and
posttest when analyzing ELL data. This threat is created by the reclassification
process for ELLs: any time an ELL student scores proficient or advanced on the
CST, that student is most likely reclassified and is no longer part of the ELL cohort
subgroup. Although the state has made some modifications to ELL subgroup
pairing, the issue remains: high-performing ELL students are no longer counted as
part of the ELL subgroup.
Maturation is another factor that must be considered as a threat to the internal
validity of the study. ELLs are working to master English while also learning their
grade-level content, and students do not develop at the same rate physically or
emotionally. These differences in maturation, along with students' years of learning
English, could have influenced the results of the study.
Conclusions
There has never been a more difficult time to lead a school site in creating
strong programs for English Learners. In the past several decades, millions of school-
aged immigrants have arrived in our schools, filling our classrooms with an
increasingly ethnically, linguistically, and culturally diverse population. Along with
teaching English to these new populations, educators now must ensure that these
learners also reach the new, higher standards being set for all students. Additionally,
schools must show that their programs are research based, have sufficient resources
to succeed, and overcome the language barrier.
This study’s focus was to highlight the impact of Explicit Direct Instruction on
the academic achievement of ELL students. This research-based intervention
demonstrated mixed results on the California Standards Test. On traditional indicators
of accountability (CST, API, and AYP), declines in test performance from pre-
intervention to post-intervention were observed. However, both California’s Academic
Performance Index (API) and the No Child Left Behind (NCLB) Adequate Yearly
Progress (AYP) indices are based on a comparison of “successive cohorts” of students.
Such a comparison “requires an assumption that the students attending the school are
comparable from year to year in terms of background characteristics and prior learning”
(Linn, 2006, p. 9). The inherent problem of using successive cohorts to measure
school improvement is widely recognized by practitioners and researchers, and there is
broad consensus that longitudinal measurements of individual student growth are a
better way to measure student academic achievement. In the present study, the
successive cohorts were dramatically different because of a large influx of students
from a nearby urban area. Thus, the declines in the CST, API, and AYP scores likely
were due to a changing student population and not the direct instruction intervention.
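The contrast between a successive-cohorts indicator and a longitudinal growth measure can be sketched with a small, hypothetical example (the scores and function names below are purely illustrative, not the study's data): when a lower-scoring population enters, a year-over-year cohort comparison can show a decline even though every individual student grew.

```python
# Illustrative sketch with hypothetical scores: why API/AYP-style
# successive-cohort comparisons can diverge from longitudinal growth.

def cohort_change(last_year_scores, this_year_scores):
    """Successive-cohorts indicator: compare this year's students to a
    different set of students who sat the test last year."""
    return (sum(this_year_scores) / len(this_year_scores)
            - sum(last_year_scores) / len(last_year_scores))

def mean_growth(pre, post):
    """Longitudinal indicator: mean gain of the SAME students, pre to post."""
    return sum(b - a for a, b in zip(pre, post)) / len(pre)

# Last year's cohort scored high; an influx of lower-scoring students
# changes this year's cohort, even though each student made gains.
last_cohort = [350, 360, 355, 345]       # last year's grade-level scores
this_pre    = [300, 310, 305, 295, 290]  # this year's students, pretest
this_post   = [320, 335, 325, 315, 310]  # same students, posttest

print(round(cohort_change(last_cohort, this_post), 1))  # negative: apparent decline
print(round(mean_growth(this_pre, this_post), 1))       # positive: real growth
```

Here the cohort comparison is negative while every student gained about 20 points, mirroring the pattern the study attributes to its changing population.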
The value-added indices (grade level scores and grade level growth scores)
stood in stark contrast to the traditional status and improvement accountability
indicators. Patterson High School's ELL subgroup scored considerably higher than
the state average in 2007. Their growth from 2006 to 2007 was slightly more than
expected (i.e., more than a full academic year). However, the school's performance in
2006 was also above average, and the pre to post change from 2006 to 2007 was not
significant. Thus, no judgment about the effectiveness of direct instruction can be
made.
Although the results were inconclusive, DATAWORKS Explicit Direct
Instruction did provide a framework at Patterson High School for addressing
students' academic achievement. A common vocabulary around instructional
strategies has been developed, with a focus on the process of teaching and learning.
For the first time in years, the staff at Patterson High School is developing the self-
efficacy necessary to meet the needs of the ELL population.
REFERENCES
Abedi, J. (2004). The No Child Left Behind Act and English-language learners:
assessment and accountability issues. Educational Researcher, 33(4), 4-14.
Retrieved September 17, 2007, from http://edr.sagepub.com
Abedi, J., Hofstetter, C., & Lord, C. (2004). Assessment accommodations for English
language learners: Implications for policy-based empirical research. Review of
Educational Research, 74(1), 1-28.
Abedi, J., Leon, S., & Mirocha, J. (2000/2005). Examining ELL and non-ELL student
performance differences on their relationship to background factors: Continued
analyses of extant data. In The validity of administering large-scale content
assessments to English language learners: An investigation from three
perspectives (CSE Rep. No. 663, pp. 1-45). Los Angeles: University of
California, National Center for Research on Evaluation, Standards, and Student
Testing (CRESST).
Abedi, J., Lord, C., & Plummer, J. (1997). Language background as a variable in
NAEP mathematics performance: NAEP TRP Task 3D: Language background
study (CSE Tech Rep. No. 429) Los Angeles: University of California,
National Center for Research on Evaluations, Standards, and Student Testing.
Adams, G. L., & Engelmann, S. (1996). Research on direct instruction: 25 years
beyond DISTAR. Seattle, WA: Educational Achievement Systems.
August, D., & Lara, J. (1996). Systemic reform and limited English proficient students.
Washington, D.C: Council of Chief State School Officers.
Baumann, J. F. (1984, autumn). The effectiveness of a direct instruction paradigm for
teaching main idea comprehension. Reading Research Quarterly, 20(1), 93-
115.
Becker, W. C., & Gersten, R. (1982, spring). A follow-up of Follow Through: The
later effects of the direct instruction model on children in fifth and sixth
grades. American Educational Research Journal, 19(1), 75-92.
Butler, F. A., & Stevens, R. (2001). Standardized assessments of the content
knowledge of English language learners K – 12: Current trends and old
dilemmas. Language Testing 2001, 18(4), 409-427.
California Department of Education. (2006, November). 2005–2006 accountability
progress reporting system, 2005–2006 academic performance index reports
information guide. Retrieved November 17, 2007, from
http://api.cde.ca.gov/APIBase2006/2005BaseSchSS
California Department of Education. 1999-2007 API reports. Retrieved November 27,
2008, from http://star.cde.ca.gov./STAR2006/report.asp
Chall, J. S. (2002). The academic achievement challenge: What really works in the
classroom? New York: Guilford Publications.
Clark, R. E., & Estes, F. (2002). Turning research into results: A guide to select the
right performance solutions. Atlanta, GA: CEP Press.
Creswell, J. W. (2003). Research design: Qualitative, quantitative and mixed methods
approaches (2nd ed.). Thousand Oaks, CA: Sage Publications.
EdSource. (2005, June). The state’s official measures of school performance.
Mountain View, CA: EdSource, Inc.
Duffy, G., & Roehler, L. R. (1982). The illusion of instruction. Reading Research
Quarterly, 17(3), 438-445.
DuFour, R., & Eaker, R. (1992). Creating the new American school: A principal’s
guide to school improvement. Bloomington, IN: Solution Tree.
DuFour, R., & Eaker, R. (1998). Professional learning communities at work: Best
practices for enhancing student achievement. Bloomington, IN: Solution Tree.
DuFour, R., DuFour, R., Eaker, R., & Karhanek, G. (2004). Whatever it takes: How
professional learning communities respond when kids don't learn.
Bloomington, IN: National Educational Services.
EduSoft (Version 1.5) [Computer software]. Rolling Meadows, IL: Riverside
Publishing.
Goldschmidt, P., Roschewski, P., Choi, K., Auty, W., Hebbler, S., Blank, R., &
William, A. (2005). Policymakers’ guide to growth models for school
accountability: How do accountability models differ? Washington, DC: The
Council of Chief State School Officers.
Good, T. L., & Brophy, J. E. (1987). Looking in classrooms. New York: Harper &
Row.
Good, T. L., & Grouws, D. (1979). The Missouri mathematics effectiveness project.
Journal of Educational Psychology, 71, 355-366.
Harvard University. (2004, July 26-31) Redesigning American high schools.
Cambridge, MA: Author.
Hill, J. D., & Flynn, K. M. (2006). Classroom instruction that works with English
language learners. Alexandria, VA: Association for Supervision and
Curriculum Development.
Hocevar, D. (2009). [Leveled assessment modeling project]. Unpublished.
Hunter, M. (1982). Mastery teaching: Increasing instructional effectiveness in
elementary, secondary schools, colleges and universities. El Segundo, CA: TIP
Publications.
Johnson, R. S. (2002). Using data to close the achievement gap: How to measure
equity in our schools (2nd ed.). Thousand Oaks, CA: Corwin Press.
Kinder, D., & Douglas, C. (1991). Direct instruction: What it is and what it is
becoming. Journal of Behavioral Education, 1(2), 193-213.
Lachat, M. A. (1999). What policymakers and school administrators need to know
about assessment reform for English language learners. Providence, RI: LAB
at Brown University.
Linn, R. L. (2000). Assessment and accountability. Educational Researcher, 29(2),
4-16.
Linn, R. L. (2006, June). Educational accountability systems (CSE Report 687). Los
Angeles: University of California, National Center for Research on Evaluation,
Standards and Student Testing.
Marzano, R. J. (2003). What works in schools. Alexandria, VA: Association of
Supervision and Curriculum Development.
Marzano, R. J., Pickering, D. J., & Pollock, J. E. (2001). Classroom instruction that
works: Research-based strategies for increasing student achievement. Aurora,
CO: Mid-continent Research for Education and Learning.
Marzano, R. J., Waters, T., & McNulty, B. A. (2005). School leadership that works:
From research to results. Aurora, CO: Mid-continent Research for Education
and Learning.
Millman, J. (1997). Grading teachers, grading schools: Is student achievement a valid
evaluation measure? Thousand Oaks, CA: Corwin Press.
Munoz, M. A. (2002). High stakes accountability environments: Its impact on the
administration of English language learners programs. Paper presented at the
Annual Meeting of the American Evaluation Association. Washington, D.C.
National Center for Educational Statistics (NCES). (2004). Language minorities and
their educational and labor market indicators—Recent trends, NCES 2004-09.
Washington, DC: U.S. Department of Education.
Olsen, L., & Jaramillo, A. (1999). Turning the tides of exclusion: A guide for educators
and advocates for immigrant students. Oakland, CA: California Tomorrow.
Olsen, L., & Romero, A. (2006a) Knowing our English learners students: Secondary
school leadership for English learner success. Produced and Published by
California Tomorrow for Los Angeles County Office of Education.
Olsen, L., & Romero, A. (2006). Supporting effective instruction for English learners:
Secondary school leadership for English learner success. Produced and
Published by California Tomorrow for Los Angeles County Office of
Education.
Olsen, L., Romero, A., & Gold, N. (2006b). Designing an effective and comprehensive
program for English learners: Secondary school leadership for English
learner success. Produced and Published by California Tomorrow for Los
Angeles County Office of Education.
Olsen, L., Romero, A., & Gold, N. (2006). Leadership and infrastructure for English
learner success: Secondary school leadership for English learner success.
Produced and Published by California Tomorrow for Los Angeles County
Office of Education.
Project Follow Through. (n.d.). Direct instruction research summary. Retrieved
January 27, 2008, from http://www.projectpro.com
Raudenbush, S. W. (2004). What are value-added models estimating and what does
this imply for statistical practice? Journal of Educational and Behavioral
Statistics, 29(1), 121-129.
Rivera, C., & Vincent, C. (1996). High school graduation testing: Policies and
practices in the assessment of limited English proficient students. Paper
presented at annual meeting of the National Conference on Large-Scale-
Assessment, June 24, 1996, Phoenix, AZ. Washington, DC: George
Washington University Center for Equity and Excellence in Education.
Rosenshine, B. (1979). Content, time, and direct instruction. Research on teaching:
concepts, findings and implication of education. Berkeley, CA: McCutchan.
Sanders, W. L., & Horn, S. P. (1994). The Tennessee Value-Added Assessment
System (TVAAS): Mixed-model methodology in educational assessment.
Journal of Personnel Evaluation in Education, 8(3), 299-311.
Schmoker, M. (2001). The results fieldbook: Practical strategies from dramatically
improved schools. Alexandria, VA: Association of Supervision and
Curriculum Development.
Spiegel, D. L. (1992, September). Blending whole language and systemic direct
instruction. The Reading Teacher, 46(1), 38-44.
Stein, M., Carnine, D., & Dixon, R. (1998, March). Direct instruction: Integrating
curriculum design and effective teaching practice. Intervention in School and
Clinic, 33(4), 227-234.
Sterbinsky, A., Ross, S., & Redfield, D. (2003, April). Comprehensive school reform:
A multi-site replicated experiment. Paper presented at the Annual Meeting of
the American Educational Research Association. Chicago, IL.
Stevens, R., & Rosenshine, B. (1981). Advances in research on teaching. Exceptional
Education Quarterly, 2(1), 1-9.
Tekwe, C. D., Carter, R. L., Chang-Xing, M., Algina, J., Lucas, M. E., Roth, J., Ariet,
M., Fisher, T., & Resnick, M. B. (2004). An empirical comparison of statistical
models for value-added assessment of school performance. Journal of
Educational and Behavioral Statistics, 29(1), 11-35.
Van Cameron, G. (2010, May). The usability of teacher-growth scores versus
CST/API/AYP math status scores in sixth and seventh grade mathematics
classes. Dissertation, University of Southern California, Los Angeles.
United States Department of Education. (2002). No Child Left Behind Act of 2001.
Public Law 107-110. Retrieved May 27, 2007, from
http://www.nclb.gov/next/overview/index.html
Wiley, T. G. (1994). Estimating literacy in the multilingual United States: Issues and
concerns. Washington, DC: Adjunct ERIC Clearinghouse for ESL Literacy
Education.
Technical Report for the California English Language Development Test (CELDT)
2006-07. (2007, May 30). Retrieved November 4, 2008, from
http://www.cde.ca.gov/ta/tg/el/documents/formftechreport.pdf
APPENDICES
Appendix A: API Targets Matrix
Table A-1
Patterson High School: API Targets Met Matrix
Patterson High School
API TARGETS MET MATRIX
Cumulative Targets to Growth Comparisons 2001-2006
Years: 2001, 2002, 2003, 2004, 2005, 2006
Reported values by group:
School: 27, -6, 41, 16, 34, -4, 58, 108, 50
Socioeconomically Disadvantaged (N = 574): 34, 7, 59, 15, 32, 17, 47, 164, 117
English Learners (N = 377): -13, 6, -13, -19
Students with Disabilities (N = 139): 65, 6, 65, 59
African American, American Indian, Asian, Filipino, Pacific Islander: 0, 0, 0
Hispanic (N = 678): 27, 11, 43, 29, 33, 2, 47, 145, 98
White (N = 259): 32, -19, 40, -3, 27, 7, 47, 84, 37
Column key: Year; Cumulative Growth; Change; Cumulative Target; Change;
Cumulative Target-to-Growth Difference
Legend: Met or exceeded target; Improved; No improvement or negative change;
No data available or incomplete
Appendix B: English Language Learners’ Class Placements
Table B-1
Patterson High School: English Learners’s Class Placements
Columns: Class Level (with textbook); CELDT Score (+20 to last year's score);
Other Classes.

ESL 1 (2 periods): High Point (Level: Basics), Hampton-Brown.
Beginning, Level 1. CELDT: 9th, 251-457; 10th-12th, 251-463. Other classes:
BLG/SEI.

ESL 2 (2 periods): High Point (Level: A), Hampton-Brown.
Early Intermediate, Level 2. CELDT: 9th, 458-517; 10th-12th, 464-527. Other
classes: BLG/SEI.

ESL 3 (1 period): Making Connections 2, Heinle & Heinle; with ENGLISH 3
(1 period): High Point (Level: B), Hampton-Brown.
Low Intermediate, Level 3. CELDT: 9th, 518-548; 10th-12th, 528-560. Other
classes: SEI.

ENGLISH 3B (2 periods): High Point (Level: C), Hampton-Brown.
High Intermediate, Level 3. CELDT: 9th, 549-578; 10th-12th, 561-590. Other
classes: SEI.

ENGLISH 4 (9th grade core literature): The Language of Literature,
McDougal-Littell.
Early Advanced, Level 4. CELDT: 9th, 579-637; 10th-12th, 591-651. Other
classes: English only; mainstream English (if recommended).

ENGLISH 5: Write Ahead, Write Source.
Advanced, Level 5. CELDT: 9th, 638-761; 10th-12th, 652-761. Other classes:
Mainstream.
Appendix C: Patterson High School’s Lesson Design
Patterson High School
Lesson Design
Teacher Name:_________________ Subject: _____________Period ______
Learning Objective (skill-concept-context)
CFU:
What are we going to do today?
What are we going to (skill)?
Activate or provide Prior Knowledge (universal experience or sub-skill):
CFU (make-connection to LO)
Concept Development (definition, examples, non-examples)
CFU-RAJ
Importance (LO)
CFU:
Does anyone else have any other reason as to why it is important? Which
reason is most important to you? Why? You can give me my reason or your
reason.
Skill Development & Guided Practice (concept-LO-steps or processes)
CFU (strategic)
Closure: (Concept, Importance-pair-share, Skill)
1.
2.
3.
Independent Practice
Notes
Differentiating Strategies (Skills, time, questions)
Teaching Strategies (Explain, Model, Demonstrate)
Cognitive Strategies (Rehearsal, Elaboration, Organization)
English Learner Strategies (Comprehensible Input, Contextual Clues, Supplementary
Materials, Adaptations of Content, Vocabulary Development-Language Objectives)
Vocabulary Words
Language Objectives: Listen, Speak, Read, Write
Lesson Delivery: Comprehensible Input, Contextual Clues, Vocabulary
Development, Academic Content Support
Appendix D: Walk-Through Observation Form
PATTERSON HIGH SCHOOL
WALK-THROUGH CHECK LIST
Teacher:___________________ Period:_______________ Date:____________
STUDENT AWARENESS
- Standards poster
- Benchmark results
- Standards-based student work displayed
- Focus standards in progress are displayed
- Behavior expectations; environment conducive to learning

INDEPENDENT WORK
- Matches independent practice to the lesson delivered
- Whole class, small group, or individual
- Students practice what they were just taught
- Teacher works with students who are not successful

CONTENT PRESENTATION
- Concept development
- Skill development
- Importance: provide personal, academic, or real-life examples
- Guided practice: teacher works along with students at the same time; highly
  structured, step-by-step practice; release students slowly
- Closure: students prove to the teacher what they have learned (the concept,
  how to do the skill, and its importance)

LESSON DELIVERY
- Check for understanding: teach; ask questions; pause; pick a non-volunteer;
  listen to the response; effective feedback; calling on students with hands
  raised or accepting shouted-out answers
- Active participation: student-teacher and student-student interaction
- Differentiating strategies: reduce sub-skill difficulty so ALL students learn
- Cognitive strategies to help students comprehend, learn, and remember:
  rehearsal, elaboration, organization
- Teaching strategies: model, explain, demonstrate
- English Learner strategies: comprehensible input, contextual clues,
  supplementary materials, adaptation of materials, vocabulary development,
  students answering in complete sentences

*DNO: Did Not Observe
Abstract
The purpose of this case study was to analyze the impact of direct instruction on the academic achievement of 342 high school English Language Learners on the English Language Arts portion of the California Standards Test (CST). Using dependent-group and non-equivalent control group benchmark designs, a mixed-methods approach focused primarily on the quantitative portion to determine both statistical and practical significance. DATAWORKS Explicit Direct Instruction was the direct instruction model selected for the intervention. Data were drawn from the English Language Arts portion of the CST for the 2006 and 2007 administrations.
Asset Metadata
Creator: Guerrero, Miguel Angel (author)
Core Title: An evaluation of the impact of direct instruction intervention on the academic achievement of English language learners
School: Rossier School of Education
Degree: Doctor of Education
Degree Program: Education (Leadership)
Publication Date: 08/03/2010
Defense Date: 05/28/2010
Publisher: University of Southern California (original); University of Southern California. Libraries (digital)
Tags: direct instruction; English language learners; OAI-PMH Harvest
Place Name: California (states)
Language: English
Contributor: Electronically uploaded by the author (provenance)
Advisor: Hocevar, Dennis (committee chair); Hentschke, Guilbert C. (committee member); Soto, Iztaccihuatl (committee member)
Creator Email: maguerre@usc.edu, mguerrero@tipton.k12.ca.us
Permanent Link (DOI): https://doi.org/10.25549/usctheses-m3246
Unique identifier: UC1123762
Identifier: etd-Guerrero-3864 (filename); usctheses-m40 (legacy collection record id); usctheses-c127-336409 (legacy record id); usctheses-m3246 (legacy record id)
Legacy Identifier: etd-Guerrero-3864.pdf
Dmrecord: 336409
Document Type: Dissertation
Rights: Guerrero, Miguel Angel
Type: texts
Source: University of Southern California (contributing entity); University of Southern California Dissertations and Theses (collection)
Repository Name: Libraries, University of Southern California
Repository Location: Los Angeles, California
Repository Email: cisadmin@lib.usc.edu