RAISING STUDENT ACHIEVEMENT AT
EBERMAN ELEMENTARY SCHOOL
WITH EFFECTIVE TEACHING STRATEGIES
by
Laura Lee Twining
____________________________________________________
A Dissertation Presented to the
FACULTY OF THE ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
May 2008
Copyright 2008 Laura Lee Twining
DEDICATION
I’d like to acknowledge and thank my parents, Carolee Beeson Brown Twining and George Eberman Twining Jr., for instilling the importance of education and ensuring I completed my first degree. This dissertation is dedicated to you for teaching me early on how important school was and is.
Thank you to my birth mom, Kaye Sells, for supporting me to be who I am. I’m thankful every day for being in your life and having you in mine.
ACKNOWLEDGMENTS
Acknowledgment and thanks to Dr. Dennis Hocevar, professor extraordinaire, for countless hours of patience and guidance throughout the dissertation process. Your knowledge and faith in us really made a difference!
I greatly appreciate the support and dedication from Dr. Kathy Stowe
and Dr. Denise Hexom for serving on our committee. Thanks for your reflective
and beneficial comments to improve our finished product and help us reach our
goal.
Thank you to Katie Curry for letting me visit Tahoe School numerous
times and with my entire staff! Teachers and staff were so supportive of our
desire to learn. I look forward to our continued visits and conversations about
making a difference in student learning!
The support and dedication of Dr. Steven Lawrence and Sue Brothers,
my Superintendent and Associate Superintendent, are greatly appreciated. I am
thankful to be part of your team—because it is clear you care about students and
learning and keep that as our priority.
Thank you to Stephanie Gregson for sharing her knowledge of Open
Court and giving time to make a difference in learning and instruction at
Eberman Elementary.
Acknowledgment and thank you to the teachers and staff who are
dedicated to reaching every student every day—no matter what.
To our Sacramento Cohort—thank you for the memories of hours and
hours spent together learning and exploring. This has been such a tremendous
experience and I loved every minute of it—thanks for your support and wisdom.
I treasure the time we spent together and look forward to our paths crossing
many times in the future!
Fight on!
TABLE OF CONTENTS

DEDICATION
ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT

CHAPTER 1. PROBLEM OF PRACTICE AT EBERMAN ELEMENTARY SCHOOL
    Problem Identification
        The School
        District
        Community
    Problem Analysis and Interpretation
        Knowledge Factors
        Teacher Motivation Factors
        Organization Factors
    Problem Solution
        Curriculum Fidelity
        Focus on Learning and the Learner
        Benchmarking
    Purpose, Design and Utility
        Purpose
        Design
        Utility

CHAPTER 2. LITERATURE REVIEW
    Core Curriculum Implementation
    Effective Instructional Practices
        Lesson Objectives/Goal Setting
        Student Engagement
        Checking for Understanding
        Differentiating Instruction
        Level of Rigor of Instruction
    Staff Expectations
    Data Driven Instruction
    English-Language Learner Support and Instructional Needs
    Benchmarking
    Conclusion

CHAPTER 3. RESEARCH SUMMARY
    Quantitative Evaluation Design
    Qualitative Evaluation Design
    Interventions
        Core Curriculum Instruction
        Learning Objectives
        Benchmark School Visit
        After-School Intervention Program
    Participants and Setting
    Instrumentation: Achievement
        Procedures
    Instrumentation: Surveys, Observational Checklists, and Questionnaires
        Procedures
    Instrumentation: Informal Interviews
        Procedures
    Instrumentation: Qualitative Fieldwork
        Procedures
    Instrumentation: 2006-2007 Individual Student Performance Band Change
        Pivot Chart Analysis
        The 2006-2007 Academic Performance Index
        Performance Band Scoring
    Qualitative Analysis
        Reaction
        Learning
        Behavior
    Quantitative Analysis
    Limitations

CHAPTER 4. FINDINGS
    Proficiency Changes from 2006-2007 (Independent Groups)
    Proficiency Band Changes from 2006 to 2007 (Dependent Groups)
        Statistical Significance: 2006-07 Scaled Score Change
        Practical Significance: 2006-07 Scaled Score Change
        Performance Level Changes from 2006-2007: Statistical Significance
        Performance Level Changes from 2006 to 2007: Practical Significance
    Overall Changes in 2006-2007 Individual Student Status
        Pivot Tables
    Academic Performance Index and Annual Yearly Progress
    Comparison with Benchmark School Growth
    California English Language Development Test Results
    Conclusion

CHAPTER 5. SUMMARY, DISCUSSION, AND RECOMMENDATIONS
    Results Overview
        Independent t-test Results
        Subgroup Progress
        Dependent t-test Results
    Discussion
        Core Curriculum Intervention
        Effective Instructional Strategies
        Benchmark School Visit
    Future Suggestions and Site-Based Recommendations
        Future Suggestions
        Site Based Recommendations
    Conclusions
        Limitations: Internal and External Validity

REFERENCES

APPENDICES
    A. LEARNING OBJECTIVES
    B. STAFF DEVELOPMENT DAY, JANUARY 29, 2007
    C. KAYE BEESON SCHOOL VISIT
    D. EFFECTIVE TEACHING PRACTICES CHECKLIST
    E. STUDENT ACADEMIC PROGRESS SUMMARY

LIST OF TABLES

1. CST Percent Proficient English Language Arts 2006
2. California English Language Development Test Proficiency Data 2006
3. Cut Off Points
4. CDE Proficiency Bands by Scaled Scores and Grade Levels
5. The 2006 Student Performance Bands Percentage Totals
6. The 2007 Student Performance Bands Percentage Totals
7. Subgroup Proficiency Change 2006-2007
8. Scaled Score Changes from 2006 to 2007
9. Practical Significance: 2006-07 Scaled Score Change
10. Basic and Above Change
11. Proficient and Above Change
12. Basic and Above Growth
13. Proficient and Above Growth
14. Pre-Basic and Above—Basic and Above Cross Tabulation
15. Pre-Proficiency/Post-Proficiency Cross Tabulation
16. Academic Performance Index
17. 2007 Proficiency Rates by Grade Level
18. Academic Performance Index Progress
19. Academic Yearly Performance Comparison of Percent Proficient
20. Eberman Student Results 2005-2006 CELDT - Form E
21. Eberman Student Results 2006-2007 CELDT - Form F
22. Student Gains and Losses - 2006-2007

LIST OF FIGURES

1. Student Achievement by Grade Levels, 2006 CST ELA
2. Rigor/Relevance Framework
3. Training Evaluation Program Levels
4. 2005 CST ELA Proficiency, Grade 6, Class 1
5. 2005 CST ELA Proficiency, Grade 6, Class 2
6. 2007 CST ELA Proficiency Level, Grade 6, Class 1
7. Excerpt from Eberman Report of Findings
ABSTRACT
The purpose of this dissertation was to examine interventions to improve
student achievement at Eberman Elementary. The problem was that over 75%
of the students in grades two through six were not proficient or advanced as
measured through the English Language Arts (ELA) California Standards Test
(CST), and the ELA core curriculum of Open Court was not being implemented
with fidelity. Equally significant was that the students were not engaged and
participating in the lessons.
The method to provide the necessary change in instructional practices
included a focus on effective strategies such as student engagement, lesson
objectives, and checking for understanding. Teachers visited a higher-performing school that had fully implemented the ELA core program. The intent was to reinforce the efforts of those striving to implement the program and to show that the desired results could be achieved when it was taught with fidelity.
The results of the multiple interventions varied across and within grade levels, primarily because the interventions were implemented unevenly. Eberman Elementary also had a student mobility challenge. To
account for this problem, data were examined with both independent and
dependent groups designs. The quantitative data matched the inconsistency of
curriculum implementation with varying results throughout the grade levels.
Change within proficiency bands for the proficient and above student groups showed statistically significant change for grades 2 to 3 (9%), 3 to 4 (11%), and 5 to 6 (11%). Grades 4 to 5 showed little growth (2%).
To increase academic achievement, Eberman Elementary will need reading coaches to support teachers so they may gain a better understanding of the significance of the various components of the curriculum.
Support from an outside source, such as a School Assistance and Intervention
Team (SAIT), would provide another method of curriculum implementation
support. This additional lens should increase fidelity of instruction. With the
core curriculum in place, the main focus must be using student data to drive
instructional practices with collaboration in and across grade levels.
CHAPTER 1
PROBLEM OF PRACTICE AT EBERMAN ELEMENTARY SCHOOL
Problem Identification
The School
Eberman Elementary School was in Year 1 of Program Improvement for
the 2006-2007 school year. The school had 440 students in grades K-6 and 45
children in pre-school. There was a high number of English-Language Learners
and socioeconomically disadvantaged (SED) students. There was a high
population of Hispanic students, not all of whom were bilingual. Russian was the next most common home language, and there were small populations of East Indian and Mongolian children. Specific population data for Eberman included: 48%
Hispanic, 30% white, 9% Asian, 7% African American, and 5% American
Indian. Thirteen percent of students received special education services. There
were 36% English-Language Learners, 9% immigrants, and 83% of the students
received Free and Reduced Meals.
Student achievement was low. Unfortunately, the two previous
principals involved with the standards-based language arts curriculum did not
mandate teachers to follow it as prescribed. I am the school principal at
Eberman Elementary. Because of these practices, teachers were not teaching the
adopted curriculum of Open Court with fidelity. There was little differentiation
in the classroom, with fewer than 10% of the classes conducting “workshop”
and even fewer following the state guideline of 30-45 minutes of English-Language Development a day. It was not known whether the problem was the teaching or the curriculum. To begin this diagnosis effectively, once all teachers were using the prescribed curriculum, the caliber of teaching and its results could be examined.
Looking at state test data from 1999-2003, Eberman saw an increase in
Academic Performance Index (API) from 583 to 717. Since then, there had been a steady decline of 48 points in the schoolwide API. In 2006, individual subgroup data for English-Language Learners showed a loss of 42 points, down to 645 points. Based on the most recent API data, there was a gap between the White subgroup and other subgroups: 57 points for Hispanic students, 51 points for English-learner (EL) students, and 38 points for SED students. It was
important to keep in mind that it is common for students to be represented in
more than one subgroup. All subgroups and schoolwide data showed a steady
decline over the past few years. The gravest concern these data show was that
94% of the English-Language Learner students were not proficient based on the
2006 California Standards Test (CST). The re-designated EL data reflected 96%
were not proficient. Equally alarming was the percentage of Hispanic students
who were not reaching proficiency—87%. Considering 48% of the student
population was Hispanic, immediate changes needed to be made, or we were
failing our students.
The important information these data provide in Table 1 is the number of students who were not proficient in each group. In 2006, 188 students were not proficient in English Language Arts (ELA) on the California Standards Test (CST), and although this is an alarming number of non-proficient students, the kindergarten and first-grade students were not reflected in it, as the CST did not begin until the second grade. Open Court Assessments help identify students who are not reaching proficiency. Looking at multiple sources of data, we know that approximately 75% of the students were not proficient. Another way to evaluate student achievement is to look at the progress
of each grade level. Table 1 shows the percent proficient based on the 2006
CST ELA.
District
Elements of our district needed to be considered, as the shifts in leadership affected the morale and beliefs of the staff at Eberman. During the last
10 years, Mountain Meadows Unified School District had six superintendents,
with four of them in the past four years.
Table 1

CST Percent Proficient English Language Arts 2006

Population                              Total     Number at or       Percentage at or    Number Not
                                        Scores    above Proficient   above Proficient    Proficient
Schoolwide                              234       46                 19                  188
African-American                        12        1                  8                   11
American Indian or Alaska Native        9         2                  22                  7
Asian                                   22        6                  27                  16
Hispanic                                115       16                 13                  99
White                                   73        20                 27                  53
Special Education                       34        3                  8                   31
English Learner                         87        6                  6                   81
Re-designated as fluent English
  proficient                            27        11                 4                   16
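The derived columns in Table 1 follow directly from the raw counts. As a minimal illustration (the variable names and the truncation choice below are mine, not part of the dissertation's analysis), the percentage at or above proficient and the number not proficient can be recomputed from the total scores and the number proficient; the table appears to truncate fractional percentages.

```python
# Illustrative sketch only: recomputing Table 1's derived columns from its raw
# counts. Variable names and the truncation choice are assumptions for this
# example, not part of the original analysis.
table_1 = {
    # population: (total scores, number at or above proficient)
    "Schoolwide": (234, 46),
    "Hispanic": (115, 16),
    "English Learner": (87, 6),
}

for population, (total, proficient) in table_1.items():
    pct_proficient = int(100 * proficient / total)  # truncated, e.g. 46/234 -> 19
    not_proficient = total - proficient             # e.g. 234 - 46 -> 188
    print(f"{population}: {pct_proficient}% at or above proficient, "
          f"{not_proficient} not proficient")
```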
Our new superintendent hired the entire cabinet during the first four months of his appointment. Our district was in deep trouble academically and financially. We did not settle contracts and were at impasse through January 2007. Figure 1 shows student achievement by grade level on the 2006 CST ELA.
[Figure 1. Student Achievement by Grade Levels, 2006 CST ELA. Bar chart showing, for grades 2 through 6, the percentage of students scoring Below Basic (BB), Basic, and Proficient & Advanced (P & A); the vertical axis runs from 0% to 50%.]
Contract agreements fortunately were reached without a strike. This
situation was not positive for student achievement and detracted from our focus
on meeting the needs of all students.
Community
The community was historic, with many families having lived in this
area for multiple generations. Some students were in “transition”—doubled up with other families or homeless and living in local motels. The community was across the river from Sacramento and had many students and parents who had “never crossed the bridge” into Sacramento. Many of our families did not have vehicles and apparently had no reason to travel when their needs were being met within their small community. West Sacramento had seen quite a bit of growth, with new developments on the south side of town. Our northern area had one exclusive development, and children who lived there either attended private schools or went across the bridge into Sacramento. There was a great divide within our community between the “haves and the have-nots.”
Problem Analysis and Interpretation
Eberman Elementary School was not delivering the core curriculum
exclusively and few classrooms were teaching the mandated 30-45 minutes of
English-Language Development a day with fidelity. Both concerns were
addressed within our school plan with strong accountability throughout the year
for teachers and administration. The plan was predominantly written by the
Data Team in June of 2006 with detail and input from the staff and site council
parents.
Knowledge Factors
Implementation of Core Curriculum. A great concern was the fact that
Eberman teachers were not teaching the core curriculum of Open Court in
Language Arts, Scott Foresman in Math, or Moving Into English (our English-
Language Development curriculum). This dissertation focused on the Language
Arts and English-Language Development Curriculum. Marzano (2003)
identified the number one factor of school effectiveness as a guaranteed and
viable curriculum. “That is, a guaranteed and viable curriculum is the school-
level factor with the most impact on student achievement” (Marzano, 2003, p.
15). Both of our textbooks were aligned with the California State Standards and
on the required adoption list; thus, they constituted our guaranteed and viable curriculum.
State Content Standards. In addition to teaching the core curriculum,
teachers must have time to see how the California State Content Standards
appear within the subject matter. It is significant to note that within the Open
Court manual, there are references to the state standards. Frequently, the depth
of the standard is not, in fact, connected to the material in the text. Teachers
need to correlate the rigor of the standard with the material in the text and make
adjustments as needed to ensure students are working at the right level of rigor.
Failure to Implement English-Language Development. We were failing
to implement with fidelity the state-mandated English-Language Development
requirement of 30-45 minutes of daily instruction. This was largely due to a lack of knowledge and skills, as identified by Clark and Estes (2002).
Knowledge and skill enhancement are required for job performance
under only two conditions. First, they are required when people do not
know how to accomplish their performance goals; and second, when you
anticipate that future challenges will require novel problem solving. The
first condition usually indicates a need for information, job aids, or
training. The second condition suggests a need for continuing or
advanced education. (Clark & Estes, 2002, p. 58)
Clark and Estes (2002) promote the job performance need of training.
Training is defined as any situation where people must acquire “how to”
knowledge and skills, and need practice and corrective feedback to help
them achieve specific work goals. . . . Education is any situation in
which people acquire “conceptual, theoretical, and strategic” knowledge
and skills that might help them handle novel and unexpected future
challenges and problems. (Clark & Estes, 2002, p. 59)
In our case, our teachers had a variety of needs: some required merely training in the prescribed curriculum, and others needed a deeper comprehension of the needs that our English-Language Learners faced. Table 2 shows the California English Language Development Test proficiency data for the 2006 school year. These data were retrieved from the California Department of Education website.
Table 2

California English Language Development Test Proficiency Data 2006

Proficiency                Third Grade    Fourth Grade    Fifth Grade    Sixth Grade
Number Proficient          13             5               13             16
Number Not Proficient      44             30              41             62
Principal Knowledge. Understanding the changes in expectations for
teachers and the focus on student learning is a large shift for our school. Within
the Balanced Leadership article, the definitions of first- and second-order change were explored (Waters, Marzano, & McNulty, 2003). In 2006 I attended a two-day workshop presented by Tim Waters, Ed.D., one of the authors of the book School Leadership That Works (Marzano, Waters, & McNulty, 2005). Revisiting this information was timely and helped me understand that, as leaders, we need to make shifts in our approach to compensate for the severity of what we are asking teachers and staff to do. Eberman was looking at several instructional delivery changes beginning in 2006-2007.
Effective leadership means more than knowing what to do—it’s knowing
when, how, and why to do it. Effective leaders understand how to
balance pushing for change while at the same time, protecting aspects of
culture, values, and norms worth preserving (Waters, Marzano, &
McNulty, 2003, p. 2).
For some teachers, our shift this year matched the first four areas of second-order change: a break with the past, outside of existing paradigms, in conflict with prevailing values and norms, and emergent. Other teachers saw the same changes as first-order: an extension of the past, within existing paradigms, consistent with prevailing values and norms, and focused. As
principal, I attempted to blend our two teacher camps into one with a focused
mission of reaching all of our students. Different teachers required different
approaches based on whether they viewed our shift as second- or first-order
change.
Teacher Motivation Factors
Because previous principals did not mandate implementation of the core curriculum, and because continual shifts in district administration left school sites with little support, Eberman Elementary School was in chaos. As mentioned previously, an additional motivation challenge, one that did resolve but that greatly impacted learning and teaching during our 2006-2007 school year, was the late settlement of the certificated contract. It was no surprise there were great motivation issues on our campus.
Prior to our break, teachers were on “work to rule,” meaning they worked only the minimum hours required, arriving at the start of the school day and leaving fifteen minutes after the final bell. We were set to begin our after-school intervention program when we returned from Thanksgiving Break, but as a result of work to rule, we could not; sadly, we were failing to provide our students with additional instructional time because of contract negotiations.
Organization Factors
There were multiple factors within the school organization that had to be addressed. The greatest factor has already been addressed—not teaching the core curriculum. Additional factors included failure to identify learning objectives for both teachers and students, not connecting concepts for students within and
beyond lessons, and not using data to drive instruction.
Objectives: If students are to reach proficiency and advance in grade-
level standards, they must know what is expected of them. When teachers provide objectives prior to the lesson, students know what they will be learning and how they will show their teacher they have learned the concept. Hill and Flynn
(2006) capitalized on Marzano, Pickering, and Pollock’s (2001) research about
effective classroom instruction and the specific strategies that benefit English-
Language Learners. “First, by setting instructional goals, teachers can narrow
the focus for students. Second, students should be encouraged to adapt the
teacher’s goals to their own personal needs and desires” (Hill & Flynn, 2006, p.
6). Having objectives posted prior to a lesson frontloads the students as to what
concept they will be working on and what they will need to do to show they
understand the material. Additionally, Hill and Flynn (2006) addressed the
challenges that English-Language Learners face. “For ELLs, setting objectives
is especially important: Imagine the incredible amount of incoming stimuli
bombarding these students as they try to learn both a new language and content
knowledge” (Hill & Flynn, 2006, p. 22).
Connecting Concepts, English Language Development (ELD): In the
article, Beating the Odds (Langer, 2001), the author discussed that in order for
new information to be learned, students must have an existing knowledge
concept to latch onto, or the learner will not be able to retrieve the information.
Learners faced with a new concept must have something tangible on which to
connect new information or they will fail to access knowledge in the future.
Our English-Language Learners, and many of our socio-economically disadvantaged students, were at a greater disadvantage because they lacked the background knowledge and experiences to which new information could be connected. It is common for many EL students to speak minimal English during the course of the school day, return home and speak their native language all afternoon and evening, and then return to school the following morning to repeat the routine. Keeping in mind that 83% of our students qualified for free and reduced meals, a large portion of our student population had limited exposure to the commonalities of the middle-class world and to many of the life experiences available to more affluent children.
Using Data to Drive Instruction: We had the data program, Data
Director, and our school board approved our joining the Just 4 Kids website.
Both of these instruments could provide a wealth of data for teachers to use in identifying individual student strengths and weaknesses and devising intervention plans based on specific student needs.
One of the most important goals of data is to stimulate dialogue in the
school community. Whenever data are presented, time—and ideally
facilitation as well—must be invested for the dialogue. Inevitably, some
of the discoveries that the school and district make regarding their
beliefs, practices, and outcomes will be painful. (Johnson, 2002, p. 49)
Eberman staff were just entering the discomfort zone of beginning to confront why some of our students were succeeding and making progress while others were not making the growth needed to reach proficiency and real learning during school.
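As a concrete illustration of the kind of question such tools help answer (which students fall below proficiency, and how that varies by grade), the sketch below uses pandas with hypothetical column names, an assumed cut score, and made-up records; it does not reproduce Data Director or Just 4 Kids output.

```python
# Hypothetical sketch of data-driven grouping for intervention planning.
# Column names, the scaled-score cut, and the sample records are illustrative
# only; they are not taken from Data Director or Just 4 Kids.
import pandas as pd

scores = pd.DataFrame({
    "student": ["A", "B", "C", "D", "E", "F"],
    "grade": [3, 3, 4, 4, 5, 5],
    "ela_scaled_score": [310, 365, 298, 412, 344, 351],
})

PROFICIENT_CUT = 350  # assumed proficiency cut score for this illustration

scores["proficient"] = scores["ela_scaled_score"] >= PROFICIENT_CUT

# Percentage of students at or above proficient, by grade level.
print(scores.groupby("grade")["proficient"].mean().mul(100).round(1))

# Students a teacher might flag for the after-school intervention groups.
print(scores.loc[~scores["proficient"], ["student", "grade", "ela_scaled_score"]])
```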
Financial Impact: We were losing over $140,000 in our categorical
funding from 2006-2007 to the 2007-2008 school year. This affected our ability
to provide supplemental services after school and provide specialty teachers
such as our Reading Specialist. During the spring of 2007, plans were made for how best to provide for our students with such a drastic reduction in categorical money.
Problem Solution
Fortunately, we did have many pieces in place that assisted our teachers
to meet the needs of our English-Language Learners and our non-proficient
students in English Language Arts. First and foremost was to teach the adopted
curriculum with fidelity (Marzano, 2003; Schmoker, 2006). This included our
Open Court LA program and the ELD curriculum, Moving Into English.
Curriculum Fidelity
An action to improve student achievement was to make certain I was inside classrooms daily, sharing successes with teachers and beginning to have some of the more difficult conversations about instruction as I continued to develop relationships with teachers and staff. The teachers had been charged with
implementing Open Court with fidelity and, as mentioned previously, this was a
huge shift. I needed to be able to offer support and feedback for teachers daily
and weekly. Each week, I wrote our “Week at a Glance” with specific strategies
I looked for during walk-throughs.
I began my year at Eberman with a focus on student engagement and
checking for understanding. I modeled the strategies during my first two days with teachers before school started, then did mini-lessons during staff meetings, provided literature to support the strategies, and explained the purpose of these concepts. The walk-through strategies did not change weekly; the list was simply a reminder of what I wanted to see when visiting. We also had the conversation about a walk-through being a snapshot in time: I was not present for the entire lesson, and I did not expect teachers to perform for me, just to do what they did best every day—that is what I wanted to see. It was important for me to
celebrate successes from my visits and let teachers know what they were doing
was exactly what our children needed.
I observed in classrooms that it was possible for students to attend school
and not practice reading on a daily or consistent basis. Unfortunately, within the
Open Court materials there were CD-ROMs with the stories on them. Teachers
would play the CD and have students follow along in their books with little, if
any, interaction on the part of the teacher. The number of students proficient in fluency was low, based on our Open Court Assessment and initial fluency scores gathered in August 2006. Reading comprehension was another area in which a low number of students schoolwide were reaching proficiency, as demonstrated by our Open Court and CST data; immediate change had to occur. How could our students develop reading strategies and improve their comprehension and fluency if they were not reading at school? We asked that children read at home nightly for 20-30 minutes, and honestly, this was occurring in very few households. We needed to design a schoolwide accountability system to get families excited about the benefits of reading.
After I had taught the concept—using small groups and a choral reading
from the Open Court text—in a fifth-grade classroom, I asked teachers to try this
new strategy to ensure all students were reading out loud daily to increase
fluency. Comprehension strategies from within Open Court should be used to connect reading to understanding. We wanted our learners to know why and how they were reading to gain understanding. To ensure that all students were reading aloud, I wanted students doing choral reading in small, differentiated groups. As the teacher walks around the room, students need to read loudly enough to be heard individually. This means that students are reading and tracking together, with differentiated small groups enabling students to read at their own speed, or just a little bit faster. With a goal connecting comprehension
and “Read Aloud Prompts” to standards and choral reading, I expected to see an
improvement in comprehension and fluency. This was one strategy I looked for
during weekly walk-throughs as reflected on our Week at a Glance.
Focus on Learning and the Learner
An additional weak area for Eberman students was teacher focus on
learning and the learner. Use of the core curriculum is paramount because it provides a consistent basis for evaluating students’ learning schoolwide. Anderson and
Krathwohl (2001) capitalized upon the concept of using Bloom’s Taxonomy to
connect instruction and learning for the learner.
Strategic knowledge is knowledge of the general strategies for learning,
thinking, and problem solving. The strategies in this subtype can be used
across many different tasks and subject matters, rather than being most
useful for one particular type of task in one specific area. (Anderson &
Krathwohl, 2001, p. 56)
A focus on having students gain strategic knowledge through the prescribed curriculum would ensure that our students were learning and able to carry knowledge across subject matters. This was not occurring schoolwide; it was happening in very few classrooms.
Bloom’s taxonomy of educational objectives has been an instrument used by many educators in lesson planning since the 1950s. In the textbook A Taxonomy for Learning, Teaching, and Assessing, Anderson and Krathwohl (2001) revised the taxonomy by connecting objectives and planning at a higher level of thinking and performance. With the introduction of the knowledge
dimension, students are working on a higher level in both knowledge and
cognition. This is addressed through the standards and student learning
objectives. Teachers are able to identify cognitive domains (similar to the
original Bloom’s Taxonomy) as well as the expectation of knowledge produced
when they are planning the lesson. The four levels of knowledge are: factual,
conceptual, procedural, and metacognitive. The grid created by Anderson and
Krathwohl (2001) is a tool teachers may use to identify objectives for the learner
and assess what dimension the teacher is expecting in student outcome.
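A small sketch of how such a grid can be represented is shown below. The example objective and its placement in the grid are hypothetical, and the cognitive-process labels are those of the revised taxonomy rather than anything specified in this dissertation; only the four knowledge levels come from the text above.

```python
# Illustrative sketch of the two-dimensional grid described above. The example
# objective and its placement are hypothetical; the four knowledge levels are
# those named in the text, and the cognitive processes are the revised
# taxonomy's categories.
KNOWLEDGE_DIMENSION = ["factual", "conceptual", "procedural", "metacognitive"]
COGNITIVE_PROCESS = ["remember", "understand", "apply", "analyze", "evaluate", "create"]

# A lesson objective classified on both dimensions.
objective = {
    "text": "Students will compare two characters' motivations using "
            "evidence from the story.",  # hypothetical example objective
    "cognitive_process": "analyze",
    "knowledge_dimension": "conceptual",
}

# Simple check that the classification falls inside the grid.
assert objective["cognitive_process"] in COGNITIVE_PROCESS
assert objective["knowledge_dimension"] in KNOWLEDGE_DIMENSION
print(f"({objective['cognitive_process']}, {objective['knowledge_dimension']}): "
      f"{objective['text']}")
```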
Another strategy is the posting of learning objectives for the student
(Appendix A). The teachers needed more knowledge of, practice in, and feedback on this skill. Additional staff development on this concept was provided following our Winter Break (Appendix B). We began by identifying the parts of a standard—the content and the concept—and then identifying the level of learning
the standard required versus the level of rigor our adopted curriculum was
demanding.
Ironically, the newer teachers were the first ones to implement “objective
writing” at a high level of accuracy. It was the more experienced teachers who
were not comfortable with this new task. I made certain all teachers were able
to meet this requirement to benefit their instructional level and student
achievement.
Information had to be connected so our students could access it and build
upon known concepts. Frequently, older students new to the United States have
background knowledge about a concept in their home language. When teachers
use the technique of activating prior knowledge with pictures, ELL students can
make the connection in their primary language and attach that to the new
English concept and vocabulary. Students without an experience to which to attach the concept gain an understanding through a picture and a location to store the information for future reference. In order to show teachers what could be and was being done with core curriculum implementation, I decided to benchmark (Tucker, 1996) Kaye Beeson Elementary.
Benchmarking
For our staff development day on January 29, 2007, we visited another
elementary school that was outside our district. Kaye Beeson Elementary was
not a “similar school” according to the California Department of Education
(CDE) data; however, the school was looking at exiting Program Improvement after meeting criteria in year three (Appendix C). If they met their goals this year, they would have made such a dramatic improvement in student learning that they would be out of Program Improvement.
I took our Reading Specialist to visit this school in January, when we were still on break and they were back in session. What we observed was a high level of
student engagement, checking for understanding, rigor and differentiation with
implementation of the core curriculum, and smooth transitions. This was
exactly what I had been working to achieve since I arrived at Eberman in August
2006. For our non-student staff development day, all certificated staff could
visit Kaye Beeson School. Thus, our teachers had an opportunity to visit a
school that had made tremendous growth and shared many commonalities with
us. They were a Title 1 school with 100% of students qualifying for free and/or reduced meal prices. The largest difference between our populations was that they had a higher percentage of African-American students and not as many white
or Russian students.
During the visit, we had focused Observation Points and the teachers
recorded what they saw instructionally and organizationally in regard to curriculum delivery. We had staff development in tandem with this visit—our
presenter from Action Learning Systems, Inc. went with us. This allowed us to
have a clear reason for the visit, debriefing after the visit, and a common base
for future discussion and development.
The targeted strategies for observation were:
• Core curriculum implementation with fidelity in both English-
Language Arts and the English-Language Development program.
• High level of engagement for all students throughout a lesson.
• Checking for understanding during direct instruction.
• Differentiation in lesson delivery to meet the needs of all students
and to provide access to the core curriculum.
• Level of rigor of instruction.
• Staff expectations for student performance and achievement.
Each of these strategies was part of our intervention or was in the process of being implemented to benefit our students. The opportunity to visit a nearby school benefited our teachers both in verifying what they were doing well and in identifying areas they might want to modify to better meet the needs of their students. Students ideally benefit from a shift in instructional practices that
focus on differentiation and accountability in learning.
A solution for our failure to comply with state and federal mandates requiring daily ELD instruction was to have staff development from the Moving Into English curriculum publisher. Additional monitoring of the instruction and
delivery should occur through classroom visits and accountability for student
learning using data from classroom assessments and student work.
We began an after-school intervention program for targeted students in
March 2007. This was a short-term program intended to provide three hours a
week of additional support for some of our struggling students in grades 3-6
who were close to moving across a proficiency band on the CST. Students in
kindergarten, first, and second grades who would benefit from additional
support were also included.
These small group interventions served about 50 students with a
maximum of 12 students per group. Curricula varied, from developing fluency with Read Naturally and building comprehension skills with Soar to Success to phonics and letter identification using Open Court resources. We also offered a
multi-age reading class for students reading below grade level who did not have
decoding skills. Data were collected on student fluency rates prior to the onset
of the intervention programs and at the conclusion of the program in June.
Additional formative measures of student growth were classroom assessments in Open Court.
Purpose, Design, and Utility
Purpose
The purpose of this study was to be both summative and formative in
nature. We wanted to determine whether the interventions in place were
effective in meeting the needs of our students and if not, what interventions or
schoolwide actions would, in fact, get the desired results for all students to be
proficient in ELA as required by No Child Left Behind (NCLB) (United States
Department of Education, 2002). Since the required percentage of proficient
students increased each year, what must we do to meet and exceed the state
benchmarks in order to be a proficient school?
Equally significant was the goal of improving the instructional program
at Eberman to benefit our ELL population. With the multiple interventions in
practice, how could we strengthen our student achievement and continue to
move ELL students toward re-designation in addition to being fluent and
successful students?
Design
We used Kaye Beeson Elementary School as our benchmark. This
school and the practices observed were our model school. As previously
mentioned, Eberman certificated staff visited this school while it was in session.
The reason this school was selected was that they had made tremendous
academic growth of 95 Academic Performance Index (API) points in the past
two years and significant growth in the Hispanic subgroup moving from 582 to
694 and the socio-economically disadvantaged (SED) group moving from 594 to
699. Based on this progress, the expectation was that they would exit Program
Improvement status following the 2007 testing year. Equally significant was the
fact that they had a highly diverse population and used the same core
curriculum—Open Court for language arts.
It was ideal to use Kaye Beeson School as a benchmark measure of what
we wanted to become and planned to achieve in order to best serve our
community. Through visitations, interviews, and reflective conversations, we
could identify and discuss actions that could improve our program delivery. In
looking at the tremendous achievement gains schoolwide and within subgroups,
we could connect Kaye Beeson School’s practices with student achievement.
Eberman needed to eliminate the opportunity gap for children who were not reaching proficient or advanced levels on standardized tests and in student work.
Formative data included interviews and conversations with the Kaye
Beeson School principal and Eberman teachers; walk-through observations with
comments and checklists; survey results from Eberman teachers and school
staff; as well as agendas from School Site Council, ELAC, and staff
development—both site and district. Teacher lesson plans and templates were
explored for detail and documentation of curriculum implementation. The
purpose of gathering this formative data was to improve instruction and services
for students of Eberman.
Our goal was to remain at the same Program Improvement Level 1 and
not move on to second-year Program Improvement based on our 2006-2007
STAR data. The following year we wanted to exit the Program Improvement
status due to our gains in student achievement. With the required percentage of proficient students increasing by up to 11% a year over the next seven years, we would need enormous progress and growth each year to meet the NCLB (United States Department of Education, 2002) goal of 100% of Eberman students being proficient by 2014.
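The arithmetic behind that trajectory can be sketched as follows; the starting target and the size of the yearly increase are assumptions chosen for illustration, not the actual state annual measurable objectives.

```python
# Illustrative only: the actual California annual measurable objectives are not
# reproduced here. This shows the arithmetic of a proficiency target that rises
# by a fixed amount each year until it reaches 100% in 2014.
start_year, start_target = 2007, 24.4  # assumed starting point
yearly_increase = 10.8                 # assumed increase, roughly "up to 11%" a year

target = start_target
for year in range(start_year, 2015):
    print(f"{year}: {min(target, 100):.1f}% of students must be proficient in ELA")
    target += yearly_increase
```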
The summative data were predominantly the CST ELA data comparing
the 2006-2007 year to the results from the 2005-2006 school year. This was
disaggregated by various subgroups and grade levels for in-depth review of
results and to identify any failing populations. Additional data included unit
assessments from the core curriculum of Open Court, Benchmark Assessments
from Action Learning Systems, and grade-level student work. The intention of
the summative data was to determine the effectiveness of the current
interventions and instructional practices of Eberman. Another result was
identifying the strengths and weaknesses of current practices and designing
needed changes, if deemed appropriate.
Additional summative data were accumulated by looking at the progress our English-Language Learners made on the California English Language Development Test, comparing 2006 to 2007. In order to make this comparison, we needed data from the CDE on how to compare results from the new test used in 2007 with those from the previous test used in 2006.
Primary data were obtained through the California Department of
Education STAR results. Additional quantitative data included, but were not
limited to: Action Learning Systems Language Arts Benchmarks for the 2006-
2007 school year, Reading Lions Open Court Assessments 1-6 for the 2005-
2006 and 2006-2007 years, and Just for the Kids (2006) data. The opportunity
gap was examined between Eberman and Kaye Beeson Elementary School.
The quantitative analysis was a pre-post dependent-groups design using the CST to compare the data of 2006 with 2007. Additional quantitative analysis included a pre-post independent-groups design, also comparing changes and school improvement on the CST. Although we had a mobility rate below
25%, we still had movement within our school and community. To identify
specific strengths and weaknesses as related to student improvement, we also
compared grade-level progress from year-to-year and followed the improvement
of the same student cohorts in grades 3-6 for the two-year period. Schoolwide
and subgroup data were disaggregated to gather supplementary information to
validate or disprove the success of our interventions.
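A minimal sketch of the two designs described above is given below, using scipy with placeholder scores rather than actual Eberman CST data: an independent-groups comparison of the 2006 and 2007 test-takers, and a dependent (paired, pre-post) comparison for students matched across both years.

```python
# Sketch of the two quantitative designs described above, run on placeholder
# scores rather than actual Eberman CST data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Independent-groups design: e.g., all grade 4 scaled scores in 2006 versus the
# (different) grade 4 students tested in 2007.
scores_2006 = rng.normal(330, 40, size=80)
scores_2007 = rng.normal(345, 40, size=75)
t_ind, p_ind = stats.ttest_ind(scores_2006, scores_2007)

# Dependent-groups (pre-post) design: the same students matched across years,
# e.g., a cohort tested in grade 3 in 2006 and again in grade 4 in 2007.
pre = rng.normal(330, 40, size=60)
post = pre + rng.normal(12, 25, size=60)  # assumed average gain, for illustration
t_dep, p_dep = stats.ttest_rel(pre, post)

print(f"independent groups: t = {t_ind:.2f}, p = {p_ind:.3f}")
print(f"dependent groups:   t = {t_dep:.2f}, p = {p_dep:.3f}")
```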
Utility
With the signing of the No Child Left Behind Act, Public Law 107-110 (United States Department of Education, 2002, Sec. 1001), which reauthorized the Elementary and Secondary Education Act on January 8, 2002, the Statement of Purpose for Title 1 states:
The purpose of this title is to ensure that all children have a fair, equal,
and significant opportunity to obtain a high-quality education and reach,
at a minimum, proficiency on challenging state academic achievement
standards and state academic assessments. (United States Department of
Education, 2002, Sec 1001)
There can be no better reason for the purpose of this study. We must
ensure that children learn and are, at a minimum, reaching proficiency. NCLB
(United States Department of Education, 2002) proposes a deadline of 2014; however, that is seven years away. Of the 12 purposes cited in the NCLB, this
study encompasses the following 10:
1. Ensuring that high-quality academic assessments, accountability
systems, teacher preparation and training, curriculum, and
instructional materials are aligned with challenging state academic
standards so that students, teachers, parents, and administrators can
measure progress against common expectations for student academic
achievement.
2. Meeting the educational needs of low-achieving children in our
nation's highest-poverty schools, limited English-proficient children,
migratory children, children with disabilities, Indian children,
neglected or delinquent children, and young children in need of
reading assistance.
3. Closing the achievement gap between high- and low-performing
children, especially the achievement gaps between minority and
nonminority students, and between disadvantaged children and their
more advantaged peers.
4. Holding schools, local educational agencies, and states accountable
for improving the academic achievement of all students, and
identifying and turning around low-performing schools that have
failed to provide a high-quality education to their students, while
providing alternatives to students in such schools to enable the
students to receive a high-quality education.
6. Improving and strengthening accountability, teaching, and learning by
using state-assessment systems designed to ensure that students are
meeting challenging state academic achievement and content
standards and increasing achievement overall, but especially for the
disadvantaged.
7. Providing greater decision-making authority and flexibility to schools
and teachers in exchange for greater responsibility for student
performance.
8. Providing children an enriched and accelerated educational program,
including the use of schoolwide programs or additional services that
increase the amount and quality of instructional time.
9. Promoting school-wide reform and ensuring the access of children to
effective, scientifically based instructional strategies and challenging
academic content.
10. Significantly elevating the quality of instruction by providing staff in
participating schools with substantial opportunities for professional
development. (United States Department of Education, 2002, Sec
1001)
Author and researcher Lachat (1999) identified who our students are. The largest growing population in America is non-native English speakers. This means that, as time passes, we will have more children entering our schools with needs that Eberman is not currently addressing. Our practices must meet the needs of our student population.
The intent of this research was to validate the practices used to meet the educational needs that students are entitled to have met, and to identify actions that must change in order to do so. Do students learn more when given an objective
prior to lesson delivery? How does student engagement impact student
performance on class work and assessments? When teachers reach a higher
level of rigor and focus on Bloom’s Taxonomy and connect state standards with
core curriculum, is there a positive shift in academic performance? When
English-Language Learners and non-proficient students have equal access to the
core curriculum and grade-level standards, do we see positive results?
The bottom line is: will the interventions adopted during the 2006-2007
school year make an impact on our student achievement? Are we doing what is
best for kids?
CHAPTER 2
LITERATURE REVIEW
Throughout the United States, a growing number of high-poverty, low-
performing schools have become high-performing schools. By employing
research on effective schools and best practices for low-performing and other at-
risk students, and by monitoring student performance, these schools have
transformed learning in dramatic ways. (Barr & Parrett, 2007, p. 10)
The intent in this chapter was to identify key research-based concepts
and strategies that enable a low-performing, high-poverty school, such as
Eberman, to address changes in instructional practices that will make significant
differences in student achievement. With multiple interventions in place for the
2006-2007 school year, it is important to remember that many are related to one
another in lesson design and delivery. In identifying strategies within
instruction, we focused on the following:
1. Core Curriculum Implementation
2. Effective Instructional Practices
3. Data Driven Instruction
4. English-Language Learner Support and Instructional Needs
5. Benchmarking
Core Curriculum Implementation
Marzano (2003) combines opportunity to learn and time into the concept
of a viable and guaranteed curriculum. Opportunity to learn has the strongest
relationship with student achievement of all school-level factors identified in
Marzano’s (2003) comprehensive review. Schmoker (2006) in Results Now
identifies the lack of teaching the core curriculum as curricular chaos and refers
to the research in Marzano’s (2003) book as an example of how to avoid
curricular chaos. The work of Marzano (2003) and Schmoker (2006), two well-
known published authors who both cite the significance of core curriculum,
cannot be ignored—all teachers must be teaching the core curriculum if students
are to get ahead academically.
The truth of the matter is that in California’s elementary schools, there are only
two standards-based core-curriculum options for English Language Arts—
Houghton Mifflin or Open Court. Mountain Meadows Unified School District
adopted the Open Court curriculum in 2002. Therefore, we should be using this
curriculum exclusively, but at Eberman, we are not. When teachers do not
consistently use the adopted curriculum, students miss chunks of information
and the common terms and strategies needed to become good readers. Suppose students have a teacher in second grade who does not use the core curriculum; when they go to third grade, they might be missing certain standards that were covered in second grade but not by that teacher, and this causes them to fall behind. Unfortunately, for students already below grade level, in many cases
they will fall even further behind.
With three principals over the past five years and four superintendents,
there has been little consistency in monitoring the implementation of a core
curriculum. DuFour, DuFour, Eaker, and Karhanek (2004) addressed what
happens when a district and/or school are not consistent with core curriculum
delivery and the results that may occur.
Some districts simply allowed each teacher to continue to determine
what was significant and important for students to learn, resulting in
wildly varying content and outcomes for students in the same grade
level or course within a school. (DuFour, DuFour, Eaker, & Karhanek, 2004, p. 22)
In looking at phonics instruction, the Open Court curriculum is very
specific with written routines for the teacher to follow. If a lower-performing
student receives one instructional program in class and then is pulled out for
additional support with another program with different routines, there is little
connection for the student to build understanding. This is a grave concern that
must be addressed.
Consistent instruction across grade levels and within each grade level in
a school allows students to activate prior knowledge and build upon concepts
with new knowledge from year-to-year, which is the intent of having adopted a
state-approved standards-based core curriculum. Also, in a district with high student mobility, when every school adheres to the core curriculum, students experience a consistent language arts program and can fill in gaps when moving from one school to another because they already know how the program works.
Barr and Parrett (2007) synthesized research on what works in high-
performing, high-poverty schools. In the 18 studies that the authors reviewed,
there were 8 components identified as present in high-performing schools. The
three strategies and practices that are relevant to implementing a core curriculum
are:
• Align, Monitor, and Manage the Curriculum
• Create a Culture of Assessment and Data Literacy
• Build and Sustain Instructional Capacity
One of the studies, Just for the Kids: Studies of High-Performing School
Systems (Just for the Kids, 2006), focused on the work of the nonprofit, Texas-based Just For The Kids organization in identifying a best-practices framework for improving student achievement. This agency provides a well-known tool for individuals to
compare student achievement from one school site to another, and within a
school by grade level and subgroups. Just For The Kids motivates educators and
the public to take action to improve schools by giving them a clear picture of a
school’s academic condition and identifying the effective practices found in
high-performing schools (Barr & Parrett, 2007).
The framework was developed by Luce (cited in Barr & Parrett, 2007), who created the web-based data access for examining student achievement. The framework was based on interviews, observations, and data gathered over four years from 100 high-performing schools. The framework divides
actions into district, school, and classroom practices. For the school actions
regarding instructional practices, the focus is on providing programs for learning
that are evident and scientifically based for every student. This concept aligns
with the requirements in California for our standards-based programs in English
Language Arts and English Language Development.
Effective Instructional Practices
In this section the following interventions are combined:
1. Lesson Objectives/Goal Setting
2. Student Engagement
3. Checking for Understanding
4. Differentiating Instruction
5. Level of Rigor of Instruction
6. Staff Expectations
These practices are necessary for a successful learning experience, as
evidenced by student work, walk-throughs, and conversation with students
during and following a lesson. In conducting this literature review, these topics
were infrequently isolated and often mixed within the same document source.
So, to connect the concepts within and across research from various sources, they are addressed under the heading of effective instructional practices.
Equally significant, in the search for strategies specific to English-Language Learners, these effective instructional practices crossed over and could produce results for all students, not just English-Language Learners.
Lesson Objectives/Goal Setting
Marzano, Pickering, and Pollock (2001) focus on classroom techniques
and instruction. Marzano et al. (2001) identified the research on goal setting or
lesson objectives and the results this practice has on student achievement.
“Broadly defined, goal setting is the process of establishing a direction for
learning. It is a skill that successful people have mastered to help them realize
both short-term and long-term desires” (Marzano, Pickering, & Pollock, 2001,
p. 93).
Marzano (2007) published studies on goal setting in his most recent
publication, The Art and Science of Teaching. Three meta-analyses each synthesized research, with average effect sizes ranging from .40 to 1.37, and found percentile gains from setting goals or objectives to be as low as 16 and as high as 41. It is interesting to note that the meta-analysis with an average effect size of 1.37 included only three studies, the lowest of which averaged .40. The meta-analysis with 204 effect sizes had an average effect size of .55. The conclusion about setting goals or objectives with students was that there is a "general tendency to enhance learning" (Marzano, 2007, p. 11).
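These percentile figures can be reproduced from the effect sizes under a standard assumption; the following is an illustrative sketch only, not Marzano's own calculation. If achievement scores are normally distributed, an effect size d moves the average student from the 50th percentile to the percentile given by the standard normal cumulative distribution function evaluated at d:

```python
# Illustrative only: converting an effect size (Cohen's d) into an expected
# percentile gain, assuming normally distributed achievement scores.
from scipy.stats import norm

for d in (0.40, 0.55, 1.37):
    # Percentile gain of the average treated student over the average control student.
    gain = norm.cdf(d) * 100 - 50
    print(f"effect size {d:.2f} -> roughly {gain:.0f} percentile-point gain")

# Prints roughly 16, 21, and 41 percentile points, consistent with the
# range of 16 to 41 reported above.
```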
Surprisingly, previously published research reported by Marzano (2003) showed that students were hindered when given overly specific learning goals, with an effect size of -.20. The emphasis on specific objectives narrowed the focus of the learner. Marzano's interpretation of this finding was that the student may lose some interest in the topic when the subject matter is too narrowly defined by the instructor (Marzano, Pickering, & Pollock, 2001). Marzano's recommendation for working with goals is that they should not be too
specific and that students will benefit from taking the teacher’s goals and
personalizing them to suit their own learning.
Within the written objective, teachers are connecting a specific grade-
level standard with the level of cognition required based on the content. This
does require teachers to unpack the standard—identifying the specific task that
is required by students to show mastery of a lesson component.
Students need to know what is expected of them and why they are
learning a particular strategy or concept. With instruction that is explicit and
well-organized, students are able to follow the tasks required of them and know
what the end result is. This is simply effective teaching (Appendix D). We
must provide a means for our students to access a lesson design and understand
what will be produced. Most importantly, with our grading shift statewide to
standards, teachers must know what standard is being assessed within
instruction to note whether a student is acquiring a standard, or will need more
targeted instruction to reach mastery.
Stronge (2007) states:
Conscientious planning for student instruction and engagement is a key
to connecting the classroom to future success for students … research
indicates that instructional planning for effective teaching includes the
following elements:
• Identifying clear lessons and learning objectives while carefully
linking activities to them, which is essential for effectiveness.
• Planning lessons that have clear goals, are logically structured, and
progress through the content step-by-step. (p. 59)
Using the state-adopted core curriculum, Eberman Elementary was providing quality assignments with instruction to benefit student learning. Furthermore, by following our district-directed pacing guide and publisher-annotated teacher editions, we were implementing the program as intended and
designed for best student comprehension.
Student Engagement
Following an objective, and assuming students and teachers have clear
and defined expectations for the lesson outcome, the next step in learning would
be participation of the learner. In addition to knowing what a particular lesson
entails, students must actively participate in the learning process. Bringing engagement to learning is one strategy a teacher can employ to expose students to a particular concept.
Stronge (2007) referenced multiple studies on the significance of high-
quality instruction. With regard to student engagement, he referenced a study by
Walberg in 1984 (Walberg, 1986). This study connected time spent engaged in learning through effective instruction with positive results for student learning.
Using student engagement as a strategy to improve student achievement
is crucial for the learner and the teacher. Students are more likely to learn in a
setting that allows them to participate in the learning. When teachers monitor
learning and raise the accountability within a lesson by keeping students
engaged, they are better able to ascertain the intended results of a lesson by
watching students' participation. Students' participation and engagement in a project give the instructor knowledge of what a student learns and when that learning occurs during a particular lesson.
Checking for Understanding
Stronge (2007) agrees with Marzano (2007) about time on task in that
when students are engaged in the learning they are much more likely to be
successful. Teachers use various techniques to engage learners and see what
they are learning. Checking for Understanding (Fisher & Frey, 2007) is a newly published book from the Association for Supervision and Curriculum Development. The research in this publication shows the strengths of checking
in during the learning process. “The act of checking for understanding not only
corrects misconceptions; it can also improve learning . . . checking for
understanding is a systematic approach to formative assessment” (Fisher &
Frey, 2007, p. 3).
If our students are able to check out during instruction, how can we
expect them to master concepts taught? With teachers planning lessons and
having the expectation that students will participate throughout the learning
process, students know what they must do to meet the instructors’ expectations.
When students are engaged in the learning process within a lesson, they will
gain information through participation. Equally significant with student learning
is the teacher’s ability to monitor student academic progress during a lesson
(Appendix E). This allows a teacher the luxury of re-teaching a concept mid-
lesson—before a student learns the wrong information. Quick corrective actions
of the instructor during a lesson save time and energy for the learner to modify
what they were doing incorrectly and learn the necessary tactics to succeed.
One of the most effective means to ensure students are attentive and
accountable during lessons is calling on them randomly and using wait time so
the students have time to compose a response. Using response boards and markers also allows all students to respond in writing so the teacher can check for understanding by glancing at the answers (Stronge, 2007).
Differentiating Instruction
Within the Open Court language arts curriculum, there is a prescribed
method for reaching students at various levels of learning ability; this portion is
called “workshop.” Workshop is to occur daily and allows students to be in
smaller groups rotating through various centers or activities. An important
aspect of workshop is that it allows the teacher to work with smaller and flexible
groups of students on specific tasks or standards. Teachers may choose to pre-
teach a concept or vocabulary to English-Language Learners or practice sight
words with students that need time and reinforcement, or work with students
editing their written work from an assignment. The most significant part of
workshop is the flexibility this gives both the teacher and the learner. Teachers
can have a plethora of activities related to the concept they are studying within
the language arts program and continue to build background knowledge for
children during this time of the day.
Stronge (2007) cited multiple studies related to student achievement with
differentiated instruction as a means to enhance achievement:
• Covino and Iwanicki (1996) noted that students are more successful
when they are engaged in instruction at the level matching their needs.
• Brophy and Good (1986), Molnar et al. (1999), Taylor, Pearson, Clark,
& Walpole (1999), and Walberg (1984) found that teachers are effective
when using various strategies of grouping during differentiated learning.
• Bain and Jacobs (1990) and Brookhart and Loadman (1992) found that
teachers were effective when they could identify individual student
needs including abilities, achievement, and learning styles.
Teachers achieve much better instructional results when they know what their students' strengths and deficiencies are. This is a key component
for success within the Open Court Workshop—planning based on student needs
as evidenced by student work. With the flexibility in planning and scheduling of
these activities, teachers are able to meet the diverse needs of their students.
Level of Rigor of Instruction
The concept of level of rigor of instruction deals with the connection
between the desired level of cognition and the delivery of the instruction. In
addition to following the core curriculum with fidelity, equally significant is the
manner of presentation to students including pacing and teacher comprehension
of the objective desired. The instructional planning must include the desired
student-learning outcome embedded within the rigor of instruction. It is
common for the instruction from the teacher’s edition of core curriculum to lack
the depth needed to meet the intent of each California State Standard.
A visual example of using rigor in instruction was designed by the International Center for Leadership in Education as a tool called the "Rigor/Relevance Framework." This visual framework connects instruction
with planning and desired student outcomes (Figure 2).
Figure 2. Rigor/Relevance Framework
As the figure shows, the level of rigor relates to the knowledge and
cognitive processes in Bloom’s Taxonomy. Higher level thinking concepts in
the taxonomy equate to a more rigorous level of understanding. Higher level
application in the taxonomy ranges from near to far transfer. This framework
extends the concept for the desired learner outcomes from acquisition to
adaptation, with adaptation being the highest level of rigor. Additional degrees
of learner outcomes include assimilation and application. Ideally, teachers want
concepts for students to be relevant and something they can reproduce in the
appropriate context—such as “real world” shown in Figure 2.
Dr. W. Daggett, President of the International Center for Leadership in Education (ICLE), created the "Rigor/Relevance Framework," which
has served as the “cornerstone of many school reform efforts throughout the
United States.” He stated, “A rigorous and relevant education is a product of
effective learning, which takes place when standards, curriculum, instruction,
and assessment interrelate and reinforce each other” (Daggett, 2005, p. 1). The
White Paper includes discussion about the various student learning styles and
notes the instructional practices found in the application and adaptation
quadrants which will benefit all students.
All students will benefit because they will be challenged to achieve
academic excellence, which ultimately boils down to applying rigorous
knowledge to unpredictable, real-world situations, such as those that
drive our rapidly changing world … and the tests will take care of
themselves. (Daggett, 2005, p. 5)
Including level of rigor in instructional planning ensures the learner will
achieve the intent of the standards.
Staff Expectations
A challenge for schools in poverty areas is that many of the students are at risk for numerous difficulties, both in the academic setting and in the neighborhood.
It is imperative that instruction is not dumbed down because students are lacking
basic knowledge. Stronge noted that for students to succeed, teachers need to
maintain higher-level thinking activities and not lower expectations if the
students have not acquired the basic skills (Stronge, 2007).
An example of this practice would be to allow students that do not yet
know their multiplication facts to use either a calculator or a multiplication
chart. If students spend the better part of math lessons trying to multiply, they will continue to fall further behind, missing new concepts as they are taught. Exposure to the new content is paramount for all students to
become proficient. In addition to math homework or new material, children can
continue to memorize their math facts working in partnership with parents as
coaches. For those parents unable to do so, it is the schools’ responsibility to
bridge this gap to help children succeed.
Data Driven Instruction
Tomlinson (2001) addressed the importance of using data to differentiate
instructional practices within her second edition of How to Differentiate
Instruction in Mixed-Ability Classrooms. This is just one significant benefit of
using data to drive instruction. Using knowledge of what the student has learned
to re-teach or target instruction allows for individual learners to gain knowledge
and concepts they missed with the first teaching. If teachers do not have a
planned outcome for learning, how will they know when their students have
reached mastery?
With accountability today driven strongly by federal and state mandates related to NCLB for student achievement, educators are beginning to be held responsible for what students learn. This responsibility is changing
school communities out of necessity. DuFour, DuFour, and Eaker (2005) have
written numerous books about schools becoming professional learning
communities. Originally, there were two focus questions when developing
professional learning communities: what do you want students to learn and how
will you know when they learn it? A third question was later added, one that our educators and school staff need to ask ourselves: "What will we do when students don't learn?" This is the main issue behind the achievement or
opportunity gap. We know some students learn and others are not able to keep
up. Using data to drive instruction and intervention is how we can ensure that
each student will learn. There is no single trick to reach and teach every child,
so as educators we must know who has learned what and give those who have
not reached mastery additional exposure and practice with concepts in order to
gain proficiency.
Monitoring students as they progress through the system tells us about
their progress or lack of progress and about what teachers, curriculum,
and program interventions they may have experienced. From this
information, schools can describe conditions and patterns for individuals
or groups of students. Using this information, practices and policies can
be examined in terms of whether they enhance or inhibit student
progress. (Johnson, 2002, p. 37)
With data showing student gains are directly related to classroom
instruction and the teacher’s level of effectiveness, we must strive to make
certain all teachers are effective. Marzano published the results of multiple
researchers in this area. The findings “rather dramatically illustrate the profound
impact an individual teacher can have on student achievement” (Marzano, 2003,
p. 72). In a single year, there is a gap of 39 percentile points in student achievement between students receiving instruction from the least effective and the most effective teachers. It is startling that such a gap appears after students have had a teacher for only one year. As the variables change, so
do the outcomes on student learning. In three years’ time, the gap grows to 54
percentile points, with students receiving instruction from an effective teacher at
the 83rd percentile and those with an ineffective teacher at the 29th percentile.
English-Language Learner Support and Instructional Needs
School and classroom practices must be consistent with our
hopes for children and our vision of achieving both excellence and
equity in our education system. Implementing sounder practices for
English-Language Learners will require teachers and administrators to
make different decisions about instruction and assessment, develop
greater awareness of how cultural and linguistic factors impact on
learning, and embrace the belief that children from highly diverse
backgrounds can learn at high levels. (Lachat, 2004, p. x)
Currently in California, in order to comply with NCLB, a “Highly
Qualified Teacher” must be certified to work with English-Language Learners.
It is one thing for a teacher to hold these varying credentials, and another to
understand how to teach our English-Language Learners and modify instruction
to meet the varying needs of children who do not have comprehension in
English. Frequently, new teachers have the intellectual knowledge, but lack the
accompanying delivery skills, or the ability to adapt practices to meet student
needs.
It is important to differentiate the numerous variations within our
English-Language Learners’ population. Some students were born in the United
States and the language of their household was not English. They came to
school with little exposure to English, because of their home environment. We
also had students who had immigrated to the United States, leaving their homes to build a new life here. Some of these children attended school in their home country. They may have had background knowledge of concepts, but
only in their primary language. Instruction in English could connect ideas and concepts to knowledge they had already acquired and allow them to access the information and build upon it in their current classroom setting. Children in this
classification frequently return home to where their first language is spoken
among family and community members.
What students from both of these English-Language Learner classifications had in common was that their exposure to English occurred primarily during the school day. Some may have had additional exposure to English through after-school and community activities. Unfortunately, children in poverty frequently rely on television for entertainment and do not have the opportunity to engage in extracurricular activities.
“English-Language Learners are the fastest-growing population group in
public schools today. Their growing number reflect demographic trends
occurring over the past 20 years that are changing the make-up of communities
across the United States” (Lachat, 2004, p. 22). Lachat discusses the difference
between cultures of testing and assessment. With the accountabilities of NCLB,
our educational system has shifted to an assessment culture—one where all
students must have access to the core curriculum and are assessed on their
academic progress throughout their educational experience.
“Because standards-based assessments are part of the push toward higher
levels of learning, they drive demands that schools verify that all students,
including students who are not fully proficient in English, are achieving at
acceptable levels" (Lachat, 2004, p. 18). In California, this progress is measured for grades 2-11 through the STAR state testing program and the accountability measure of Adequate Yearly Progress (AYP).
The academic progress of students in sub-groups that fall within a school
population is monitored. These sub-groups have specific goals to reach in
performance in order to comply with NCLB. AYP monitors yearly performance
of specific groups of students.
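As a rough illustration of how this subgroup monitoring works, the check amounts to comparing each subgroup's percent proficient against an annual target rate. The sketch below uses hypothetical counts and an assumed target for illustration only; actual AYP determinations involve additional rules, such as minimum subgroup sizes and participation rates.

```python
# Illustrative sketch of AYP-style subgroup monitoring.
# The counts and the annual target below are hypothetical, not actual school data.
ANNUAL_TARGET = 24.4  # assumed percent-proficient target, for illustration only

subgroups = {
    "English-language learners": {"tested": 120, "proficient": 22},
    "Hispanic": {"tested": 150, "proficient": 40},
    "White": {"tested": 90, "proficient": 35},
    "Socioeconomically disadvantaged": {"tested": 200, "proficient": 45},
}

for name, counts in subgroups.items():
    rate = 100 * counts["proficient"] / counts["tested"]
    status = "meets" if rate >= ANNUAL_TARGET else "does not meet"
    print(f"{name}: {rate:.1f}% proficient, {status} the target")
```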
Connecting this concept to instructional practices and interventions
would understandably indicate that standards-based core curriculum must be
taught to all students. Teachers must ensure that their non-proficient English-
learner students comprehend and understand materials taught and know how and
where to support their learning. Assessments show whether students have made
adequate progress in reaching mastery of the subject matters tested on the STAR
CST. Additionally, teachers must monitor student progress through the year and
differentiate instruction as needed in order to ensure all students have access to
the core curriculum.
It is no secret that employers of today want employees to be versed in
subjects not taught within school, or at least, not measured as a standard of
learning.
A recent bestseller, The World is Flat, by Thomas Friedman (2006)
stressed the significance of qualities an employee must have in order to succeed
in the workplace. Alarmingly, lower-performing schools have even more needs
to address within their student populations—the children are not only competing
against measurable learning standards, but the global marketplace requires
additional skills the students must reach to be successful and self-sufficient
adults. We must provide all children a chance to succeed and thrive in this
global economy.
Teaching the prescribed state-adopted core curriculums in Language Arts
and English-Language Development will certainly enhance the opportunity for
students to compete in this ever-changing world. However, if we do not monitor
the progress of student learning with every lesson and modify instruction to
meet their needs, students will not be ready to meet the level of rigor placed
upon them to reach proficiency in learning. This unites the necessity for using
data to drive instruction and differentiating instruction based on student-learning
needs.
By connecting the research on Stronge's (2007) effective teaching strategies with strategies that activate learning for ELL students, we would be
activating prior knowledge and using questioning as a strategy to connect new
information to old.
Further, the effective teacher knows how to respond to those needs and
engage learners in the process. One way to engage students and increase
learning is to use a variety of instructional strategies, depending on
factors such as prior knowledge of the students and the content and skills
to be taught. At-risk students benefit from direct instruction, hands-on
learning, simulations, inquiry, and other strategies that work well with
the general population of students. (Stronge, 2007, p. 78)
It is the teacher's skill in implementing these strategies, however, that
distinguishes the more effective from the less effective.
Hill and Flynn (2006) used the work of Marzano (Marzano, Pickering, &
Pollock, 2001) and developed instruction specific to benefit ELLs. Additional
research to support these practices adds the connection of Bloom’s Taxonomy
and the stages of language acquisition or development. Their book offers a tremendously useful tool in the form of a table that features the levels of Bloom's Taxonomy on the y-axis and the stages of language development across the x-axis. This tool allows teachers to monitor instruction and connect the planning of cognitive tasks to the higher levels of the taxonomy for our ELL students based on how they should be able to respond—moving from the Preproduction level with nonverbal responses, to Speech Emergence with phrases or sentences, and finally to Advanced Fluency with near-narrative responses. This tool shows teachers the connection and importance of
working with all students across Bloom’s levels regardless of their English
ability level.
Knowledge of specific details and elements refers to the knowledge of
events, locations, people, dates, sources of information, and the like.…
Every subject matter contains some events, locations, people, dates, and
other details that experts know and believe to represent important
knowledge about the field. (Anderson & Krathwohl, 2001, p. 47)
Anderson and Krathwohl further connect the concept of knowledge of
specific details and elements to the term “common language.” A strategy to
benefit our students that can be used school wide is using common language
from our core curriculum. This tactic would particularly enable ELL students to
gain a concrete understanding of academic terms they will encounter again. A
simple example would be using formal and academic terms in math from one
grade level to the next. With a committed focus on building common language, students would have a way to retrieve older information and a place to connect new concepts.
Benchmarking
The actual benchmark goal is established through the Federal and State
compliance requirements of the No Child Left Behind Act—every student will
be proficient by the year 2014. Research evidence provides the rationale of
using a benchmark to establish goals for participants. Tucker (1996) described
the components for benchmarking and noted that it is a process for reaching a
level of best practices. It is not simply looking at something and moving
towards it as one would hope in this time of accountability. Tucker further
explained the need for understanding where you are before you begin to identify
your benchmark. With higher-student achievement—the desired best practice
51
level—it is imperative that current practices and goals are established prior to
seeking a benchmark target. The detailed components are:
• Plan your study: identify what to benchmark and select your
team.
• Study and document your own practices, success measures, and
problems.
• Identify best practices and establish a benchmarking partnership.
• Develop a questionnaire and a process to study and document
your partner’s practices.
• Analyze the information: the gap between you and your partner,
the enablers, and the best ideas to emerge from the study.
• Develop recommendations to adapt the learning to your own
school and widely communicate your findings.
• Implement the recommendations and monitor progress.
(Tucker, 1996, p. 9)
Providing a tangible benchmark allows individuals to know exactly what
the target is and this is a new concept for many in education. The ability to visit
a school that is showing the results you desire is a powerful step for a staff to
take. Fullan (2003) cited Kotter and Cohen's (2002) three steps related to how leaders create change and the role that leadership plays in creating that change—the fact that emotions are a part of change and how
specifically change is related to emotions: “(a) Helps people see [new
possibilities and situations], (b) seeing something new hits the emotions, and (c)
emotionally charged ideas change behavior or reinforce changed behavior”
(Fullan, 2003, p. 2).
Bandura (1994) has published numerous articles on the concept of self-
efficacy. His work on self-efficacy centered on people's beliefs in their ability to produce effects. Connecting benchmarking to self-efficacy is one way
to show teachers what others are capable of achieving when visiting another
school or program that has similar populations and programs. Another popular
way of developing self-efficacy in education is through the use of modeling or
having teachers observe one another conducting a lesson.
Perceived self-efficacy is defined as people’s beliefs about their
capabilities to produce designated levels of performance that exercise
influence over events that affect their lives. Self-efficacy beliefs
determine how people feel, think, motivate themselves and behave.
Such beliefs produce these diverse effects through four major phases.
They include cognitive, motivational, affective, and selection processes.
(Bandura, 1994, p. 1)
With the shift in accountability and the pressure to produce results,
teachers must address their own self-efficacy. For teachers in the field for more
than 10 years, this can be a difficult shift. Long gone is the ability to
teach the favorite dinosaur unit as a fun project for students. On a daily basis,
teachers must face the question whether they are capable of meeting the high
demands of instructing students at the level they need to achieve results.
Another complexity within the teacher's scope that can affect self-efficacy is the diversity of the students—a wide range of background knowledge and student performance affects the instructional levels needed.
Conclusion
The research presented clearly supports core curriculum implementation, effective instructional practices, data-driven instruction, ELL support, and benchmarking. There is a common thread running through all of these interventions, and
that is the teacher. Marzano (2007) states:
In short, research will never be able to identify instructional strategies
that work with every student in every class. The best research can do is
tell us which strategies have a good chance (i.e., high probability) of
working well with students. Individual classroom teachers must
determine which strategies to employ with the right students at the right
time. In effect, a good part of effective teaching is an art—hence the
title, The Art and Science of Teaching. (p. 5)
What the research does not show is the benefit of having all these pieces
in place simultaneously. The interventions in place at Eberman during the 2006-
2007 school year combined these successful strategies to improve student
achievement. This dissertation focused on raising student achievement at
Eberman and addressed the question of using multiple interventions
concurrently, and the results achieved were reflected in student data and
interviews with teachers.
CHAPTER 3
RESEARCH SUMMARY
The rationale for this study was to determine whether the multiple interventions in place during the 2006-2007 school year at Eberman Elementary School would make an impact on student achievement. The anticipated impact
may be positive, negative, or neutral. If the findings are positive, what
interventions should remain the same? If the findings are negative, what
interventions are not meeting the needs of our students? If there is no school
growth, why are we not making a difference and what must change to do better?
Quantitative Evaluation Design
The quantitative data compared the results of the 2007 Standardized
Testing and Reporting (STAR) measure of the California Standards Test (CST)
to the 2006 CST results. This was a Pre-Post Dependent Groups Design:
Pre X Post
2006 X 2007
For the 2007 year, dependent t-test results were limited to students who were present and tested in 2006, comparing their growth in 2007. Thus,
scores for 2006 included grades two through five and 2007 scores were limited
to grades three through six.
In addition to the dependent group design, data were also analyzed in a
Pre-Post Independent Groups Design:
Pre X Post
2006 X 2007
Analyzing both the independent and dependent results provides the benefit of looking at the data from more than one viewpoint. The major emphasis of this study was to compare change in students who attended Eberman for two consecutive testing years. Descriptive test results were also examined through California English Language Development Test (CELDT) levels for English-language learners, but the main emphasis was the California Standards Test (CST).
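A minimal sketch of how these two comparisons could be computed is shown below. The scale scores and column names are hypothetical and are included only to illustrate the dependent (paired) and independent analyses; the actual study used the school's CST data.

```python
# Illustrative sketch of the pre-post analyses described above.
# Scores and column names are hypothetical, not actual CST data.
import pandas as pd
from scipy.stats import ttest_rel, ttest_ind

# Dependent (paired) design: students tested at Eberman in both 2006 and 2007.
matched = pd.DataFrame({
    "cst_ela_2006": [285, 310, 342, 298, 327],
    "cst_ela_2007": [301, 322, 350, 305, 333],
})
t_dep, p_dep = ttest_rel(matched["cst_ela_2007"], matched["cst_ela_2006"])

# Independent design: all students tested in 2006 versus all students tested in 2007.
scores_2006 = [285, 310, 342, 298, 327, 264, 355]
scores_2007 = [301, 322, 350, 305, 333, 280]
t_ind, p_ind = ttest_ind(scores_2007, scores_2006, equal_var=False)

print(f"dependent groups:   t = {t_dep:.2f}, p = {p_dep:.3f}")
print(f"independent groups: t = {t_ind:.2f}, p = {p_ind:.3f}")
```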
An additional quantitative evaluation included comparing the data from
the STAR CST ELA of Eberman with the benchmark school, Kaye Beeson.
This comparison examined the progress made by the two schools and determined whether Eberman made growth similar to, above, or below that of the benchmark school.
With this design, the main concern was the issue of control. This concern was addressed by disaggregating the 2007 CST results and looking at the progress of subgroups: English-language learners, Hispanic, White, and socioeconomically disadvantaged (SED) students. Specific attention was given to each individual student's movement between performance bands. An additional means of adjusting for control was the use of the pre-post comparisons.
Qualitative Evaluation Design
Our visit to Kaye Beeson School, our introduction to the "benchmark school," provided formative exchanges of information with the teachers who attended. Throughout the visit, teachers were assigned
different classes to observe instruction in either their grade level or one grade
above/below their current level of instruction. A recording sheet was provided
for teachers to focus the visit on the key areas of student engagement, checking
for understanding, rigor of instruction, curriculum delivery, and room set-up.
Appendix A shows this instrument. These recording sheets were not collected by
the principal, but served as a resource for teachers to remember what was
specifically observed during their visit. During subsequent staff and grade-level
meetings, teachers were able to refer to this tool for dialogue and reflection of
personal practices. Open-ended surveys were given to teachers following the
visit to Kaye Beeson School. A copy of the survey is found in Appendix B.
An additional manner of gathering feedback from these teachers
occurred during our trimester academic progress meetings. This was an oral
exchange for teachers to share the “pros and cons” from their visit. The
Leadership Team Meeting in January was another forum to gather qualitative
feedback about the impact of visiting Kaye Beeson School.
One manner of analyzing this visit was the use of the Kirkpatrick and
Kirkpatrick (2006) model from Evaluating Training Programs. “The four levels
represent a sequence of ways to evaluate programs" (Kirkpatrick & Kirkpatrick, 2006, p. 21). The book begins with a list of ten steps to consider before
implementing a training program.
1. Determining needs
2. Setting objectives
3. Determining subject content
4. Selecting participants
5. Determining the best schedule
6. Selecting appropriate facilities
7. Selecting appropriate instructors
8. Selecting and preparing audiovisual aids
9. Coordinating the program
10. Evaluating the program (Kirkpatrick & Kirkpatrick, 2006, p. 3).
In addition to the reasons for implementing a visit to a benchmark
school, I evaluated the training using the four levels of Kirkpatrick and
Kirkpatrick (2006): Reaction, Learning, Behavior, and Results. Figure 3 shows
how three levels are addressed in this study for qualitative purposes.
Level                      Written Surveys    Interviews    Observations
1  Reaction                X                  X
2  Learning (Attitudes)                       X
3  Transfer (Behavior)                                       X (Participant)
                                                             X (Non-Participant)
Figure 3. Training Evaluation Program Levels
Interventions
It is important to note that when using multiple interventions, this study has the limitation of multiple-treatment interference. It was not possible to
identify the specific intervention or interventions that made an impact on student
achievement, nor to isolate results exclusive to one intervention, as most
students participated in all interventions. The exception is for the 50 students
who attended the after-school program 3 hours a week with small-group, needs-based instruction. This is a specific group of students that received this
treatment, while others did not. Still, if these students showed growth, we could
not determine which intervention specifically allowed for the change in
achievement. Four interventions are described as follows:
Core Curriculum Instruction
Instructional delivery of standards-based core curriculum in language
arts and English Language Development is one intervention. It is important to
mention that neither of these core curriculum pieces was in place school wide and delivered with fidelity by all teachers. The expectation had been shared with all teachers, and they were expected to adhere to the delivery of our core curriculum daily.
In-services had been provided to teachers during the school year.
"Workshop," the differentiated instructional component of Open Court, provides time for teachers to work with small groups of students daily and weekly. The groupings are intended to be needs-based and flexible. When students master an
area of weakness or complete an enrichment project—meeting grade level
standards—they can be moved from one group to another. Teachers monitored
the progress of individual students continually and made changes as deemed
necessary by student progress.
Prior to the In-service on workshop, many teachers still did not have this
component of Open Court offered to students daily, which is the purpose for
differentiating instruction within this program. Teachers were given grade-level
time to collaborate on activities for workshop to assist students in gaining skills,
knowledge, and most importantly, to becoming proficient in their grade-level
standards.
Another part of the language arts core curriculum not being implemented
with fidelity was the "green section." This section covered the daily lessons in
phonics and phonemic awareness, which is a critical component for students
who are not meeting standards due to low fluency and/or poor reading
comprehension. We had a staff In-service meeting on this area in April 2007.
For primary grades K-3, the focus was on fluency within the Open Court
program. Grades four through six focused on the concept of morphology—the
parts of words, word structure, and meaning. The intent of this one-hour
intensive training was to provide teachers with the research that supported these
practices within our adopted language arts curriculum, Open Court. We wanted
teachers to understand and connect the foundation of this program and the
established practices and procedures that must be adhered to by every teacher in
every grade level to give our students consistency in their access to the core
curriculum. Our District Reading First Coordinator and a Reading Coach from
another school conducted the In-service.
Although the state mandates daily English Language Development, this
was not occurring in classrooms at Eberman. We had one certificated position, a Language Development Specialist, who provided additional support for our English learners outside of the classroom in a pull-out model. The challenge
with this format was that students were losing more instructional time moving
from classroom to classroom and the instruction was provided in isolation from
the language arts program. Our Language Development Specialist was making
strides to connect core curriculum support and pull out program time through
communicating with classroom teachers and using the supplemental materials of
the language arts program.
In-service was provided by the publisher of our ELD program, Moving
Into English, for all classroom teachers in February 2007. Prior practice was to
have one classroom at each grade level with all the English-language learners placed in that one class. Teachers of these designated classes
received staff development 3 years ago, with no additional support offered
during the last 2 years. This year, all classrooms had students with language
needs and all but one of the classroom teachers met the English Language
Learner credential requirements. Teachers had the ELD curriculum but were not
familiar with the curriculum due to lack of exposure and training.
Learning Objectives
Learning objectives were introduced to the teachers in October 2006.
The format for writing the objectives is located in Appendix C. Writing
objectives prior to lesson delivery did take preparation on the part of the
classroom teacher. Many teachers shared that, even after multiple mini-lessons and two staff development sessions on writing objectives, they did not understand what was expected of them.
In conjunction with the Staff Development Day in January 2007,
following our visit to Kaye Beeson School, the afternoon was spent “unpacking
standards." It appeared that teachers were reluctant to take on the task of writing objectives because, due to a lack of exposure and training, they could not identify the different parts of a standard: content, context, and level of cognition. I
worked with our provider, Action Learning Systems, to deliver this important
training piece.
Following the staff development, one staff meeting provided dedicated
time for teachers to create the materials/posters they used to post their objectives
in their classrooms. The expectation was that objectives would be posted for
language arts, math, and grade-level choice of science or social studies,
beginning with Monday, Wednesday, and Friday for the third quarter and
moving to daily for the fourth quarter of instruction.
Principal visibility/classrooms. The challenge with the principal being in the classrooms daily was that the principal handled all discipline issues, as there was not a Vice Principal on site. In addition to being new to the school and district and handling discipline, I also had an entirely new office staff. The combination
of these situations made daily classroom visits a challenge and frequently an
impossible feat. In order to make instruction the priority, school-wide behavior
systems needed to be in place: if a student was causing a disruption in a
classroom, teachers needed a procedure to handle the situation, so minimal
instructional time was lost.
Following spring break, I worked with my secretary to schedule
uninterrupted time daily in classrooms. Previously, I was striving to have 3-
hour blocks in the classrooms twice a week, which proved to be unsuccessful.
With this new approach, taking a designated shorter time block, I expected to
have positive results.
An educational psychologist, Dr. Robert Mackenzie, who specialized in setting limits on behavior for both students and teachers, presented a 2-hour staff development in April 2007. We also purchased his book, Setting Limits in the
Classroom for a book study to occur in the 2007-2008 school year.
One of the main functions of an effective instructional leader is to provide feedback on classroom instruction in the areas of student engagement, checking for understanding, rigor of instruction, and student expectations and objectives, and to maintain the integrity of our focus for the year.
With a checklist in place (Appendix D) acknowledging which effective teaching practices were evident during the lesson, teachers had immediate feedback on what was observed during the classroom visit. With this feedback, they knew how they were meeting expectations and which instructional pieces most benefited their students.
Marzano, Pickering, and Pollock (2001) share strategies that work to
increase student achievement through instruction and planning. Research
included numerous studies about how teachers affect student learning. Wright,
Horn, and Sanders (1997), after researching over 100,000 students, concluded
the following:
The results of this study will document that the most important factor
affecting student learning is the teacher.… The immediate and clear
implication of this finding is that seemingly more can be done to
improve education by improving the effectiveness of teachers than by
any other single factor. (Wright, Horn & Sanders, 1997, p. 63)
Thus, it is imperative as an instructional leader to provide feedback for
teachers to know what components of effective teaching strategies are observed
in their classroom. Furthermore, the checklist allows me to write
commendations and recommendations, which will lead to informal reflective
conversations. The dialogue with teachers following the visits is focused specifically on what was seen during the visit and how the teachers' actions improve student learning.
The feedback tool was used during the fourth quarter to provide timely
feedback for teachers. It was introduced to teachers during the spring
articulation meeting. The focus was on providing feedback on instruction
observed by the principal during a short walk-through visit. This is simply
another means of reflecting upon instruction and providing an opening for
communication about instruction and student achievement.
Using Data to Drive Instruction. The principal had the opportunity to
meet four times a year with grade-level teams and specialists—reading and
language development, focusing on individual and classroom academic data.
The intent of these meetings was to look at progress gained, identify students not
mastering standards, and develop action plans to allow students additional
exposure through small-group instruction and one-on-one time as needed. The
remediation predominantly was to occur within the workshop time of Open
Court.
What really could be created from this time was a professional learning community using data to determine what results our practices produced, which students were meeting the benchmark assessments, and, for those who were not, what support could be given to reach each specific student. This concept had been
explored in depth by DuFour, DuFour, Eaker, and Karhanek (2005). DuFour, DuFour, and Eaker (2005) further explore the power a professional learning
community can bring to a school. “As the school moves forward, every
professional in the building must engage with colleagues in the ongoing
exploration of three crucial questions that drive the work of those within a
professional learning community:
• What do we want each student to learn?
• How will we know when each student has learned it?
• How will we respond when a student experiences difficulty in learning?" (DuFour et al., 2005, p. 33)
Teachers identify where students fall academically based on recent
assessments in language arts. Both the most recent Reading Lions/Open Court
Assessment and our Language Arts Benchmark test were used as tools to
identify academic progress. The five categories are the same as the California
Standards Test: Far Below Basic, Below Basic, Basic, Proficient, and
Advanced. Appendix E shows the form used to place students within these five performance bands and to identify strategies and instruction to improve academic progress.
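A minimal sketch of how benchmark results might be mapped into these five bands appears below. The cut scores and student scores are hypothetical and are used only to illustrate the placement step; the actual placements relied on the Reading Lions/Open Court and district benchmark criteria.

```python
# Illustrative sketch: placing students into the five performance bands.
# Cut scores and student scores are hypothetical, not the actual assessment criteria.
BANDS = [  # (minimum percent correct, band label), checked from highest to lowest
    (85, "Advanced"),
    (70, "Proficient"),
    (55, "Basic"),
    (40, "Below Basic"),
    (0, "Far Below Basic"),
]

def place_student(percent_correct: float) -> str:
    """Return the performance band for a benchmark score."""
    for cutoff, label in BANDS:
        if percent_correct >= cutoff:
            return label
    return "Far Below Basic"

benchmark_scores = {"Student A": 92, "Student B": 68, "Student C": 37}
for name, score in benchmark_scores.items():
    print(f"{name}: {score}% correct, placed at {place_student(score)}")
```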
At the end of the school year—late May-early June—teachers met with
the principal in “one-on-one” meetings and looked at student data over the
course of the school year celebrating successes and identifying areas of growth
for the following school year. This was an ideal setting to discuss future staff
development needs and assist teachers in selecting appropriate workshops over
the summer to supplement their existing individual growth plans for credential
renewal.
Next year, I will change Grade Level Articulation meetings to be a short
one-on-one conference with each teacher monitoring and discussing every
student and his or her academic progress. Changing this structure will allow me to pull each teacher's student data into the conversation. We need this forum for accountability—we must stop blaming the test and our students' lack of prior knowledge. We will continue to have grade-level meetings looking at data in
addition to the one-on-one meetings to further build our professional learning
community. The first step is for all of us to be on the same page regarding the belief and expectation that every child can and will learn. Then, within our
professional learning community, we can look for results without blame.
Benchmark School Visit
Our visit to Kaye Beeson School was unique in that we were able to visit a nearby school during the course of a regular school day, thanks to an all-day staff-development day for us in the middle of the school year. Bringing an
entire instructional staff to enter classrooms of teachers within a school that
made tremendous academic growth over the past three years was ideal. Kaye
Beeson School had an academic performance index (API) growth from 578 to
The achievement gap is narrowing for their subgroups as well. The socioeconomically disadvantaged subgroup's API moved from 594 to 699 during the same three testing years, and English Learners reached a 710 API based on the 2006 CST scores.
Kaye Beeson School was selected because of the following factors:
significant student achievement growth, implementation of core curriculum,
diverse student population, and supreme instructional delivery—student
engagement and checking for understanding with high rigor and expectations.
Effective teaching practices were evident during previous visits. Predominantly, the focus of the visit was for Eberman teachers to see instruction that was effective with students similar to our population and with the same language arts program, Open Court.
Referring to the Kirkpatrick and Kirkpatrick (2006) evaluation training
model and the 10 steps of an effective training program, the following are the reasons for selecting a "benchmark" school and for selecting Kaye Beeson School specifically for Eberman teachers to visit:
• The need for this training is simple: the No Child Left Behind Act. Eberman School was leaving children behind. We were not meeting the requirements established by the federal and state measures, API and adequate yearly progress (AYP).
• The objective was for Eberman teachers to observe instruction of
effective teaching with a population similar to our students in ethnic
subgroups and SED students.
• Subject content is the core curriculum. Kaye Beeson School began
implementation of the Open Court curriculum for primary grades in 1998
and intermediate grades in 1999. Since then, that is the only language
arts curriculum taught school wide.
Participants were the teachers and credentialed specialists of Eberman. They were selected because we needed to make immediate changes in the manner in which we delivered instruction. A few Eberman teachers were providing great direct instruction and delivering the core curriculum with fidelity. The problem was that not all teachers were consistent, and that is a school wide change that needed to occur.
Schedule selection was determined by our In-service day. Our students
were not present, so teachers could have a school day for in-service. I was
required, by the contract, to allow teachers time to work in their classrooms—so
this was offered in the afternoon. Since the primary focus was on language arts,
we visited Kaye Beeson School during their morning, which was predominantly
dedicated to language arts instruction.
The facility was chosen based on progress made toward API and AYP targets in complying with NCLB. Equally significant was an additional, personal reason behind selecting Kaye Beeson School—I began my teaching and administrative career there as a fifth grade teacher and then Head Teacher. I knew teachers there whom I admired as effective, both from a teacher's and an administrator's standpoint, and most importantly, the current
principal was a district curriculum specialist in the language arts curriculum,
Open Court, prior to becoming a site administrator.
Kaye Beeson teachers were selected as "instructors" because they taught at a school that made tremendous gains in student achievement due to effective teaching of the core curriculum.
Additional instructors were Isabelle Jacomb, Kaye Beeson School principal since 1999, and our consultant from Action Learning Systems. Ms.
Jacomb provided answers to questions the teachers may have had following
their visit to the classrooms. Our consultant introduced the ALS document on
Direct Instruction with the effective teaching strategies, which have been the
focus for this school year.
Audiovisual aids were not used within this presentation; the visual focus was teachers watching teachers deliver direct instruction and differentiate lessons within the Open Court workshop segment.
The program was coordinated through the two principals and the ALS
consultant.
The program was evaluated following the four levels of Kirkpatrick and
Kirkpatrick: Reaction, Learning, Behavior, and Results.
After-School Intervention Program
Beginning in March, an after-school intervention program was offered to
students not making acceptable growth toward grade-level standards. The 50
identified students received an additional 3 hours of instruction with certificated
staff in small groups weekly for eight weeks. Primary focus was given to
language arts decoding skills for nonreaders and students needing more exposure to phonemic awareness and phonics. For students with decoding skills who needed work on reading comprehension, another intervention group was established to meet their needs. Additional options included a small group of
identified English-language learners working with the language development
specialist, a multi-age class of lower readers below the second grade level but
chronologically in grades 2-4, and a math support group.
These intervention programs used various curricula focused on specific
skills and exposure to grade-level content—access to the core curriculum.
Students were given information on testing skills throughout the eight-week
program. The program ran for five weeks prior to the STAR Tests and ran an
additional three weeks after testing was completed.
Participants and Setting
Participants included all students attending Eberman during the 2006-
2007 school year for in-class instructionally based interventions of effective
teaching practices. Particular attention was on the proficiency of students in
English Language Arts based on the 2006 CST results. From the 2006 STAR
findings, we had 188 students that did not meet the proficiency rate of 24.4. The
student population tested incorporated our grades three through six, as the sixth
graders from 2006 had moved on to the middle school for seventh grade for the
2006-2007 school year.
In addition to the students identified as not proficient on the CST, we
targeted students in grades K-2 that were not meeting grade-level benchmarks at
the proficient level. The tools used to ascertain proficiency included, but were
not limited to Reading Lions/Open Court, CELDT, teacher observation and
recommendation, and student work.
English-language learners were a distinct subgroup within our overall
population of students. We had 157 students identified as EL based on the 2006
California Basic Educational Data System (CBEDS) report. Total student
population was 440 students, making 36% of our population identified as
English-language learners.
An instrumental group of participants within this study was the
classroom teachers and certificated support staff: the reading specialist and
language development specialist. The teachers were the individuals receiving
the staff development and were expected to carry out the multiple interventions
within their classrooms. Teachers were required to monitor the academic
achievement of their students and intervene when students were not making the
growth needed leading to grade-level proficiency.
Additionally, the principal of Eberman was a participant of the study.
The principal was the researcher and facilitator of the interventions being
implemented during the 2006-2007 school year. Significant attention should be
paid to the fact that Eberman was the principal's school site and the principal
was the sole researcher of this study. This means that every decision and focus
of this school year was accountable to one main purpose—to improve the
achievement of students attending Eberman. All research served the express
purpose of helping teachers with instructional delivery to best serve the current
440 children of Eberman and additional students for years to come.
Also included as participants were the district leaders: my Superintendent
and the Associate Superintendent of Educational Services. They were included
because they were supportive of the actions and efforts made to improve student
achievement at Eberman. It is interesting to note that all three of us were new to
the district as of July 2006. Both administrators had a great desire and drive to
immediately affect the success of learners in our district.
Eberman Elementary School was the setting. It is located in the West
Sacramento community with a high population of families qualifying for free or
reduced meal costs. The setting incorporates the 20 individual classrooms of
Eberman and seven small-group intervention groups outside the regular school
day. About 50 students participated in the after-school intervention.
Instrumentation: Achievement
The tests in the STAR Program ensure that information about the
academic achievement of all students is collected on a regular basis.
This information is critical in evaluating the quality of the education
provided for California students. In order to ensure that schools have the
most complete information possible, all students need to participate in
these tests. By having different tests, all students are given the
opportunity to participate in the STAR Program…. STAR Program test
results, along with other available information, help school staffs form a
more complete picture of students’ academic achievement. (California
Department of Education, 2006, p. 27)
In order to meet the requirements established by the No Child Left
Behind Act, students in grades 2-11 take the CST every spring. Student
achievement is monitored through the results of this yearly state-wide testing
and assessment tool. Data are available for each student, school, significant
student subgroup, and district through the California Department of Education.
Also, through our district data tool, Data Director, we are able to monitor and
measure student achievement by each teacher.
Individual student data shows specific progress in grade-level standards
and assigns scores to each significant area.
The STAR Student Report provides overall scale scores, performance
levels, and content area results for each subject area tested by a CST.
Overall scores are reported on a scale ranging from 150 to 600. The
CST results for each subject area tested are also reported by performance
levels: advanced, proficient, basic, below basic, or far below basic.
Each performance level indicates how well a student is achieving the
state content standards tested. The state target is for all students to score
at the proficient or advanced level on the CSTs. (California Department
of Education Testing and Accountability Office, 2006, p. 32)
The documents examined were the individual student reports as
generated from the STAR CST and CAT6 and the specific subgroups of
Eberman School. The data were disaggregated and analyzed using multiple
measures, including progress and movement of students within each
performance level band and growth and scores on the Academic Performance
Index (API) and Adequate Yearly Progress (AYP). The growth guidelines are
defined within the No Child Left Behind Act.
Information on progress for our English-language learners was available
with the CELDT results. The data were examined for descriptive purposes of
students. The students were classified in the following categories: beginning,
early intermediate, intermediate, early advanced, and advanced. Assessment
areas for grades K and one were speaking and reading. Writing was an
additional area for grades three to six.
When receiving the STAR CST and CAT6 results, I looked at the school
wide API growth from 2006 to 2007. The goal was to increase 7 points from
our base score of 669. In order for us to meet our NCLB growth goal, we
needed a minimum score of 677. Each identified subgroup API was monitored
and assessed for reaching the growth targets.
To reach the Adequate Yearly Progress (AYP) for every student, school
wide AYP was reviewed in addition to every subgroup identified within the
Eberman population. It is essential to note that Eberman did not meet the AYP
for 2006, nor for the two previous years, which qualified us for Program
Improvement Year 1 status during the 2006-2007 school year. Particular
attention was paid to the growth within this area, as failure to meet the 24.4%
proficiency target in English Language Arts for all subgroups and school wide
could move Eberman to Program Improvement Year 2 status.
English-language learners take the CELDT exam yearly between July
and October until they are re-designated. At the time of initial testing, which is
required to be completed prior to October 31st of each year, a predicted score is
available by using the tools provided by CDE. Final scores are available from
the state of California usually by January, midway through the school year.
Because the actual scores are not available until this time, instruction tends to be
based on the predicted score and teacher judgment.
Procedures
The STAR CST was administered to all enrolled students in grades two
through six at Eberman as of May 1, 2007. The testing procedures mandated by
the state of California were adhered to by every teacher, staff member, and
student.
Student results were available in mid-August, when they were sent to the
individual school districts. The districts then distributed the information to the
schools, and our school mailed the parent/guardian reports to each student's
address. API and AYP results were released by CDE on approximately August
15, 2007.
Another measure of monitoring student growth was to compare the
percentage of students within each of the five performance bands by grade level.
We looked at growth within cohorts and non-cohort groups. The challenge with
these data was that the cut points for the specific performance bands vary
somewhat from grade to grade. Table 3 presents the cut points for each of the
bands, as taken from the CDE website.
Table 3
Cut Off Points

Grade    Far Below Basic   Below Basic   Basic     Proficient   Advanced
Second   150-261           262-299       300-349   350-401      402-600
Third    150-258           259-299       300-349   350-401      402-600
Fourth   150-268           269-299       300-349   350-392      393-600
Fifth    150-270           271-299       300-349   350-394      395-600
Sixth    150-267           268-299       300-349   350-393      394-600
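To make the use of these cut points concrete, the short Python sketch below classifies a CST ELA scaled score into its performance band for a given grade. The cut points are taken from Table 3; the function and dictionary names are illustrative assumptions of mine and are not part of any CDE or Data Director tool.

# Illustrative sketch: classify a CST ELA scaled score (150-600) into a
# performance band using the grade-level lower cut points from Table 3.
CUT_POINTS = {
    2: {"Below Basic": 262, "Basic": 300, "Proficient": 350, "Advanced": 402},
    3: {"Below Basic": 259, "Basic": 300, "Proficient": 350, "Advanced": 402},
    4: {"Below Basic": 269, "Basic": 300, "Proficient": 350, "Advanced": 393},
    5: {"Below Basic": 271, "Basic": 300, "Proficient": 350, "Advanced": 395},
    6: {"Below Basic": 268, "Basic": 300, "Proficient": 350, "Advanced": 394},
}

def performance_band(grade, scaled_score):
    """Return the performance band name for a grade and scaled score."""
    band = "Far Below Basic"
    for name, lower_cut in CUT_POINTS[grade].items():
        if scaled_score >= lower_cut:
            band = name
    return band

print(performance_band(3, 355))   # Proficient
print(performance_band(5, 298))   # Below Basic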
Instrumentation: Surveys, Observational
Checklists, and Questionnaires
Surveys were used to determine the impact of our visit to Kaye Beeson
School in January. The survey was open-ended (Appendix B). An
observational checklist (Appendix D) was introduced to teachers during the
Open Court Articulation meeting in April. The articulation meeting was with
the principal, reading specialist, language specialist, and grade-level teachers.
New with this articulation meeting was the attendance and joint facilitation by
our District Reading First Coordinator and me. The purpose of this change was
for teachers to hear a consistent message of the significance of our
implementation of our core curriculum of Open Court from both the site and
district levels.
Procedures
An advantage of focusing the research on Eberman School was that
interviews and observations occurred during the school year monitoring the
intervention results in progress. In order to triangulate the formative data,
multiple individuals were interviewed and all classrooms were observed.
The survey was administered upon the completion of the visit to Kaye
Beeson School. Teachers were required to complete and turn in the anonymous
surveys, as monitored by the school secretary.
The observational checklist was introduced with the Open Court
Articulation meeting in April of 2007. Follow up on the checklists occurred in
the May staff meeting, after the checklist had begun to be used during classroom
visits.
With every walk-through visit, teachers received a copy of the checklist
in their mailbox on the same day of the visit. Individual conversations occurred
following visits that required corrective action based on the observation of the
classroom instruction. It was important to note that this was not a “gotcha.”
Rather, we wanted to celebrate instruction and learning. The checklist
(Appendix D) was not intended to be punitive, but informative. The challenge
was that this was a change from previous practice. Particular attention was paid
to positive instructional practices at first, with more difficult conversations
following the initial implementation of the checklist as teachers grew
comfortable with it.
Instrumentation: Informal Interviews
Following the benchmark school visit, teachers were asked individually
about the pros and cons of the visit during a meeting with school specialists—
reading, language, and behavior. The informal format was designed to allow
teachers to share what they liked and did not like about their visit. This allowed
teachers to comment upon what they felt was positive and negative about
instructional practices at Kaye Beeson School.
Procedures
The procedure for gathering more information about the benchmark
school visit was simple. Teachers were given folders during the visit with
individual reflection pages for each classroom they visited (Appendix A).
Teachers were told the papers would not be collected, but were for their personal
use. Prior to the specialist meeting, teachers were informed that the visit to
Kaye Beeson School would be discussed briefly during this session.
Many teachers brought their folders to the specialist meeting and used
them for reference as they shared their opinions and feelings about the visit.
The principal recorded individual comments by teacher name in two columns—
one for pros and one for cons. Recording data by teacher allowed the principal
to reference the comments in the future and follow up on the concepts each
teacher shared.
Instrumentation: Qualitative Fieldwork
Qualitative fieldwork focused on the use of the observational checklists
and formal teacher observations as part of the evaluation process of Mountain
Meadows Unified School District.
An additional tool including non-participants in this study was our
School Accountability and Intervention Team (SAIT) Visit. The SAIT Visit
was at the end of April, allowing another view of the progress made during the
year and areas requiring more attention to improve student achievement in the
future.
Procedures
Teachers received a copy of the checklist following walk-through visits
in April, May, and June. Individual conversations resulting from the feedback
occurred in person and via e-mail. The schedule for implementing the checklist
during walkthrough visits follows:
April 9: Teachers and staff received a copy of the weekly WAG with
classroom visit times specified during the week. Walk-throughs began.
April 12: Teachers received a copy of the checklist template and more
information about the intent of each item.
April 13-June 12: Use of the checklist began and continued daily, with
teachers receiving copies following each visit.
Formal observations for teachers being evaluated were established at the
beginning of the school year, as per the teacher contract. Formal observations
were used as a tool to note instructional practices of effective teaching and
implementation of the core curriculum. I met with teachers in the fall to discuss
goals for the year and to provide a template for lesson plans, with specific
attention to checking for understanding and student engagement. Prior to the lesson
observation, teachers turned in a copy of their lesson plans, so I had them during
the observed formal lesson. During each lesson, I scripted the teacher and
student dialog, in addition to providing a diagram of the teacher movements
during monitoring of student work. Equally significant was the tallying of
students who were not on task during a lesson. This included students in
transition from an outside “pull-out” program—speech, Resource Specialist
Program, language, or reading specialist. Also included were disruptions to the
lesson, students coming in from another class for time out, phone calls from the
school office, and so on.
Following the lesson observation, I met individually with teachers,
provided them a copy of the script, and reviewed it—this was also a chance to
address suggestions I offered for improving the delivery of the instruction. With
the copy in writing, teachers had a resource with helpful hints on how to reach
specific students with behavior problems and/or tools to increase student
achievement.
Our SAIT visit provided feedback on the Essential Program
Components, also known as EPCs. The areas include instructional practices,
fiscal commitments, core curriculum compliance, and the qualifications of
teachers and administrators. This was a tremendous
opportunity to have an outside lens measure our progress. Action Learning
Systems was hired to conduct this visit—they are a SAIT Provider for the state
of California and conduct site visits such as this with great frequency. We
gained from this visit further actions to improve instruction at Eberman. I
worked with my Leadership Team and School Site Council to meet the
recommendations that resulted from this visit. Changes were not expected to
occur until the 2007-2008 school year.
Instrumentation: 2006-2007 Individual
Student Performance Band Change
Pivot Chart Analysis
Our district tool, Data Director, is an electronic program that allows
individuals to manipulate and extrapolate data by student, grade level, and
individual teacher reports. One of the most impressive measures is comparing
student progress within a classroom moving from one performance band to
another. Figures 4 and 5 show examples comparing 2005 to 2006 data for two
sixth grade classes in English Language Arts on the STAR CST.
The numbers 1 through 5 correlate to the performance level bands with
number 1 designated as Far Below Basic and number 5 serving as Advanced.
The x-axis is the student level on the 2005 CST in ELA and the y-axis shows the
2006 growth.
Cross Tabulation Analysis. Analyzing individual student progress, or
movement within the performance bands, provides another measure of growth
or lack of progress. The ideal benchmarks identify students in the Basic and
above performance bands and in Proficient and above.
Figure 4. 2005 CST ELA Proficiency, Grade 6 Class 1
Figure 5. 2005 CST ELA Proficiency Grade 6, Class 2
The specific target is to look at the number of students in the Basic
performance band in the pre test year as compared to the number of students in
Proficient and above in the post test year.
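As a rough sketch of how such a cross tabulation can be produced outside of Data Director, the Python lines below use pandas to tabulate pre-year against post-year band status for a handful of invented students; the data values and column names are hypothetical and are not Eberman data.

import pandas as pd

# Hypothetical matched records: 2006 (pre) and 2007 (post) status for the
# same students, coded 0 = below Basic, 1 = Basic and above.
df = pd.DataFrame({
    "pre_basic_and_above":  [0, 0, 1, 1, 0, 1, 1, 0],
    "post_basic_and_above": [0, 1, 1, 1, 0, 0, 1, 1],
})

# Counts and row percentages, analogous in form to the cross tabulations
# reported in the findings.
counts = pd.crosstab(df["pre_basic_and_above"], df["post_basic_and_above"],
                     margins=True)
row_pct = pd.crosstab(df["pre_basic_and_above"], df["post_basic_and_above"],
                      normalize="index") * 100
print(counts)
print(row_pct.round(1))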
The 2006-2007 Academic Performance Index
In addition to Pivot Charts, Data Director also tabulates API scores for
individual classrooms. Both of these tools are an ideal avenue for
reflective discussions with teachers about instructional practices. The Pivot
Charts will be used in one-on-one meetings with teachers next year to address
student results from instructional practices in 2006-2007 compared to student
achievement in 2007-2008 and the changes in instructional practices to improve
student achievement. Using Pivot Chart data in this manner focuses the
discussion on instruction and results in student achievement specific to the
individual teacher.
Performance Band Scoring
The data to be measured are the scaled score results, which place students
in the appropriate performance bands as reflected by their scores. In order to
identify which band a student falls into, the cut scores determined by the CDE
are used. Values are assigned to the performance bands, with a 0 applied to
scores not reaching Basic or above and a 1 applied to students at Basic and
above. For the Proficient and above group, a 0 is applied to all students below
Proficient and a 1 to students at Proficient and Advanced. This coding does not
distinguish whether students reach the individual bands of Basic, Proficient, or
Advanced—only that they fall within the range of Basic to Advanced. Thus
another cross tabulation of scaled scores for Proficient and above is needed to
measure the change in the total number of students in the smaller range of
Proficient and Advanced.
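A minimal sketch of this 0/1 coding, assuming scaled scores have already been converted to performance levels coded 0 (Far Below Basic) through 4 (Advanced); the variable names are mine, not Data Director's, and the data are invented for illustration.

# Hypothetical performance levels for a small group of students:
# 0 = Far Below Basic, 1 = Below Basic, 2 = Basic, 3 = Proficient, 4 = Advanced.
levels = [0, 2, 3, 1, 4, 2, 3, 1]

# 1 if the student is Basic or above, otherwise 0.
basic_and_above = [1 if level >= 2 else 0 for level in levels]

# 1 if the student is Proficient or above, otherwise 0.
proficient_and_above = [1 if level >= 3 else 0 for level in levels]

print(sum(basic_and_above) / len(levels))        # proportion Basic and above
print(sum(proficient_and_above) / len(levels))   # proportion Proficient and above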
Qualitative Analysis
Reaction
The Kirkpatrick model focuses on a quantitative reaction form. For the
purpose of this study, a reflection page was provided so teachers could share
their reactions to the visit in a non-threatening manner. The reason for this
change is that visiting a benchmark school was a new concept for this staff. The
entire certificated staff had never been to visit another school in session. It was
intended for the visit to produce results, but the needs for each teacher were
different. A few teachers were implementing the core curriculum with fidelity.
The needs of teachers not yet implementing the key program concepts differed
from those of teachers who were not following the designed curriculum at all,
but instead were clinging to familiar programs and worksheets used in the past.
Following the visit, an open-ended survey was given to each certificated
teacher who attended the training. Although the survey did not provide
quantifiable data, the qualitative data gave areas of focus: what the teachers
saw; what changes, if any, they made in their own instructional practices; and
connecting the results of student actions with the implementation of effective
teaching strategies and the core curriculum.
Some quantifiable data were gathered through the informal interview
process and asking for pros and cons from each teacher. This was beneficial in
comparing the information shared from teachers who were implementing the
core curriculum, to those who were not in support, and being able to find
common denominators between the two groups. It is interesting to note that
teachers who appeared more effective at using the direct instruction strategies
and achieving positive student results did, in fact, share techniques they were
already using in the classroom; the visit allowed them to see their own practices
reinforced by multiple teachers in a higher-performing school.
Learning
Learning from the visit to the benchmark school was evidenced by
showing knowledge of effective teaching skills and strategies. I did not offer a
quantitative measure of the learning as a direct result of our visit to the
benchmark school. In this study, learning was primarily measured by attitudes
of teachers gained through interviews and conversations. Interpreting the results
and the impact on individual learning was possible through the interviews and
subsequent conversations following walk-throughs, formal observations, grade-
level meetings, and staff development. A change in attitudes and procedures
also showed individual learning had occurred as a result of this intervention.
Behavior
If teachers were not using these methods prior to the visit, but then
modified their own habits or practices to use what was observed through
classroom visits, this indicated that transfer of learning had occurred through
behavior—the true focus of the visit. We wanted a change in our instructional
practices that would cause positive growth in student achievement.
Quantitative Analysis
In order to determine whether the various interventions were effective,
we compared the 2006 pre intervention STAR data for students in grades two
through five to the 2007 post intervention data for students in grades three
through six on the CST ELA. The specific test we used was the dependent
groups t test. The software program SPSS was used to analyze these data to
determine whether there was a statistically significant difference between the
pre and post test scores.
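The analysis itself was run in SPSS; the sketch below shows how an equivalent dependent (paired) groups t test could be computed in Python with scipy. The score lists are invented for illustration only and are not Eberman data.

from scipy import stats

# Hypothetical matched CST ELA scaled scores for the same students:
# pre = 2006, post = 2007.
pre  = [310, 285, 342, 298, 355, 320, 301, 288]
post = [322, 290, 351, 305, 349, 333, 315, 296]

# Dependent (paired) groups t test on the pre/post difference.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")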
The independent groups t-test design compared growth between
Eberman and the benchmark school. These comparisons used the 2007 school
wide CST API as well as progress in the multiple subgroups of Eberman: SED,
EL, Hispanic, and White. The only difference in subgroups was that the
benchmark school had an African American student subgroup and we did not.
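For the independent groups comparison, a corresponding sketch (again with invented scores) uses scipy's two-sample t test; setting equal_var=False applies the Welch correction, an assumption on my part since the original SPSS settings are not described.

from scipy import stats

# Hypothetical 2007 scores for students at the two schools.
eberman   = [310, 285, 342, 298, 355, 320, 301, 288, 330, 275]
benchmark = [335, 360, 310, 348, 372, 325, 341, 356, 318, 365]

# Independent groups t test comparing the two schools.
t_stat, p_value = stats.ttest_ind(benchmark, eberman, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")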
Practical significance was assessed with Cohen's d, which identifies the
effect size for the entire group tested, both for the comparison with the
benchmark school and for the progress of Eberman students.
Other measures of practical significance were the raw change and the
percent of change from pre to post for each of three variables: the change in
proficiency band score on the STAR CST in ELA, the percent of students
reaching the Basic and above bands, and the percent of students reaching
Proficient and above between the 2006 and 2007 scores.
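As a rough sketch of how these practical significance measures could be computed, the Python lines below use invented performance band scores on the 0-4 rubric; for CST scaled scores, which lack a true zero, the percent-of-change denominator would instead be the pre mean minus 150, the low end of the scale. All values and names here are assumptions for illustration.

import numpy as np

# Hypothetical band scores (0-4 rubric) for the same students in 2006 (pre)
# and 2007 (post).
pre  = np.array([1, 2, 3, 1, 2, 0, 3, 2], dtype=float)
post = np.array([2, 2, 3, 2, 3, 1, 3, 2], dtype=float)

raw_change = post.mean() - pre.mean()          # raw change from pre to post
cohens_d   = raw_change / pre.std(ddof=1)      # change divided by the pre-year SD
pct_change = raw_change / pre.mean()           # percent of change relative to pre mean

print(f"raw change = {raw_change:.2f}")
print(f"Cohen's d  = {cohens_d:.2f}")          # practically significant if > .20
print(f"% change   = {pct_change:.1%}")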
Equally informative is using data for individual students who attended
Eberman in 2006 in grades 2-6 and attended Eberman during 2007. This is a
dependent t-test design. Only students in grades three to six were used in the
2007 data. The intent of this analysis was to specifically see the growth for
students who attended Eberman for the year prior to the interventions and who
participated in the intervention. Additional information sought was the
difference in student achievement between students who had continually
attended Eberman and those who had not; there is a high mobility rate of
approximately 25%. One question the data analysis answered regarding students
who had attended Eberman for two years and taken the STAR CST both years
was whether they performed higher or lower than students who attended this
school for only one year.
Individual progress for our English Language Learner students was
measured by movement on the CELDT. It was significant to note that while the
CELDT assessment was specific to our ELL population, it occurred early in the
school year—prior to October—with a projected score available in the fall and
the formal score coming from the California Department of Education early in
the calendar year, generally by late January.
Limitations
One limitation was that the CELDT test used in 2007 was new and vastly
different from the 2006 version. The CDE website states that a comparison is
not available between our two testing years. Once the new form is established,
comparisons of student growth will be possible.
Quantitatively, this study has several limitations. Internal validity is
limited by selection. With subgroup numbers significantly below 300 and no
random selection in the intervention process, we could anticipate a high degree
of sampling error within the STAR CST ELA subgroups.
External validity limitations deal with the treatment interference of the
multiple interventions. With the number of interventions in place, it was
impossible to determine which intervention made an impact on the students.
Furthermore, it was not possible to monitor which students truly benefited from
the teacher visit to the benchmark school.
As mentioned previously, the CELDT scores could not be compared
from 2007 to 2006 results until the documents were provided from CDE. This
was a huge limitation.
Changes in the intervention plans for the 2007-2008 school year will be
based on the outcome of the results from the 2007 test data as one indicator.
Some of the interventions are non-negotiable and will continue regardless of the
data outcome—this would be the implementation and instruction of the core
curriculum and the use of effective teaching strategies.
Objectivity is another limitation that affects formative measurements.
As the researcher and participant, my objectivity must be questioned. The
selection of Kaye Beeson School presents the question: could there be a better
benchmark, one that I did not have a history with? I work at Eberman every
day. My research is my day-to-day work and relevant to me in more than one
aspect due to this status. It is my personal belief that my connection with the
research site and involvement with the participants makes me a more effective
researcher, as I am vested in every step along the way. There is nothing I want
to do more than make great progress in instruction to positively impact student
learning at Eberman for the students, teachers, and community.
CHAPTER 4
FINDINGS
The summative results are derived from the 2007 STAR CST ELA test.
Multiple outcomes are examined—the number of students scoring Basic and
Proficient at each grade level, the 2006-2007 progress of the same students, and
a comparison with the benchmark school's results.
Proficiency Changes from 2006-2007
(Independent Groups)
The scaled scores for the CST Performance Levels for the English
Language Arts and the five performance bands are shown in Table 4.
Table 4
CDE Proficiency Bands by Scaled Scores and Grade Levels

Grade   Far Below Basic   Below Basic   Basic     Proficient   Advanced
2       150-261           262-299       300-349   350-401      402-600
3       150-258           259-299       300-349   350-401      402-600
4       150-268           269-299       300-349   350-392      393-600
5       150-270           271-299       300-349   350-394      395-600
6       150-267           268-299       300-349   350-393      394-600
It is interesting to note the bottom cut points are the same in three of the
proficiency bands for elementary grades two to six and vary within the two
bands of Below Basic and Advanced.
For the 2006 CST results, Table 5 shows the percent of students by grade
level within each of the five performance bands defined in Table 4.
Table 5
The 2006 Student Performance Bands Percentage Totals

Grade   Far Below Basic (%)   Below Basic (%)   Basic (%)   Proficient (%)   Advanced (%)
2       17                    32                31          17               3
3       25                    25                34          13               3
4       13                    21                41          19               6
5       13                    20                43          15               10
6       8                     18                38          29               6
The 2007 results showed changes in student growth with limited
movement between bands. There were more students tested in both the third
and sixth grade classes due to enrollment shifts. An additional change to note:
fourth grade had fewer students, as there was one combination class and only
one full fourth-grade class.
The independent groups t-test was used to compare the pre and post test
means for the entire school population and also was computed for each grade
level. A rubric of 0-4 was used for each of the performance bands with 0 for Far
Below Basic, 1 for Below Basic, and so forth with 4 representing the Advanced
performance band.
The total school population included 680 students—362 were tested in
2006 and 318 were tested in 2007. The change in pre and post means was
minimal and negative; to see it, the means must be examined to the hundredths
place. In 2006, the pre test year, the mean was 1.776, and in 2007 the mean was
1.732. Rounded to hundredths, the change is a slight decline of -.05. This
difference was not statistically significant, t(664.5) = .516, p = .606.
Grade two showed a positive change of .09 with a 2007 mean of 1.67. In
grade three, the change was -.10 with a pre mean of 1.46 and a post mean of
1.36. Grade four also showed a decline of -.08 with the post mean falling to
1.78. The grade five results fell from 1.88 to 1.67 for a loss of -.21 with 1.67 as
the 2007 post mean. Progress was made in the sixth grade with a gain of .04 for
a post mean of 2.12.
The results show that the sixth-grade average had the most students
above the Basic performance band as represented by the mean above 2.0 which
is reflective of the Basic performance band. All other grades were between the
1.0 band of Below Basic and the 2.0 band of Basic. Table 6 shows the
percentage of students within each grade level and the performance band of the
2007 individual student results in ELA.
Table 6
The 2007 Student Performance Bands Percentage Totals

Grade   Far Below Basic (%)   Below Basic (%)   Basic (%)   Proficient (%)   Advanced (%)
2       22                    17                36          22               3
3       30                    24                30          11               4
4       7                     34                37          17               5
5       16                    30                33          13               8
6       5                     17                45          27               6
In comparing growth within the same grade levels from one year to the
next, the following observations were noted:
Grade two increased the percent of students Proficient and Advanced
from 20% to 25% in 2007. The number of second grade students tested in 2006
was 79 and 64 in 2007, as there was one less class in the grade level for the
recent testing year. The growth in the mean scale score was 9.4 points reaching
313.2 in 2007 which is still in the low range of the Basic performance band.
Small positive changes in grade three included the percent of advanced
students changing from 3 to 4 percent. Results show lack of growth for students
moving from Far Below Basic to Below Basic, with an increase in Far Below
Basic of 5% from 25% to 30%. The mean scale score remained in the Below
Basic performance band falling one point from 298 to 297. The bottom cut
point of the Basic band was 300.
The percent of fourth-grade students at Proficient or Advanced declined
from 25% to 22%. The Below Basic band showed an increase from 21% to
34%. A negative change in the mean scale score of 5.1 points, to 314.5,
indicates an "average" performance band of Basic, similar to grades two and
three.
In fifth grade, 94 students in 2006 were tested and 61 in 2007. Twenty-
five percent of the 94 students in 2006 were Proficient or Advanced with only
21% of 61 students reaching the same benchmark in 2007. Thus there was a
decline in performance. Equally frustrating, the mean scale score dropped closer
to Basic, with a loss of 7.6 points, falling to 315.3 in 2007.
Within sixth grade, the percent of students Proficient or above remained
the same at 33%. While the percentage remained the same, there was still
growth of the number of students reaching Proficient or above from 2006 to
2007 due to an increase in the number of students tested from 65 to 82. The
mean scale score showed little growth of 1.9 points reaching 332.8. This grade
level shows the highest mean scale score, but it was still near the basic
performance band.
Another manner to assess progress within the whole school population
was to look at the change in the percent of students Proficient or above within
significant subgroups. According to the CDE, a population of students is
identified as a “subgroup” when the percent of students is 15% or higher.
Eberman has four primary subgroups with data from 2006-2007 as shown in
Table 7. The table represents the change in proficiency from 2006 to 2007 for
each of the subgroups. Unfortunately, every single subgroup showed a lack of
progress toward the NCLB goal for 2014 of having 100% of students proficient
or above. The target for 2007 in ELA was 24.4%. The most significant change
is the school wide shift from meeting the criterion in 2006 to falling .1
percentage points below the cut point in 2007. The white subgroup was the only
group to meet the 2007 target, but even this group dropped in proficiency from
2006 to 2007.
Table 7
Subgroup Proficiency Change 2006-2007

Group              Change
School Wide        25.6 → 24.3
Hispanic           20.4 → 18.8
White              31.8 → 29.9
SED                23.7 → 22.0
English Learner    21.9 → 21.3
Proficiency Band Changes from 2006 to 2007
(Dependent Groups)
In the dependent groups design, growth for the same students from 2006
to 2007 was assessed for statistical and practical significance. Growth data were
beneficial to the interpretation of the effects of the intervention due to the high
mobility rate of 26% at Eberman. These data show the degree of growth only
for students attending Eberman for two consecutive years. In order for the
students to be counted in both testing years, they were enrolled at a minimum
from October 2005 to May 2007. Because the analysis of the nonequivalent
cohorts was confounded by unknown cohort differences between this and last
year’s classes at Eberman, an analysis of the same students over 2 years is a
better indicator of progress than the grade comparisons discussed in the prior
section. However, the analysis of student growth is confounded by the fact the
tests in consecutive grades measure different standards and may be inherently
more or less difficult from year to year.
Statistical Significance: 2006-07 Scaled Score Change
Table 8 shows the statistical findings for the CST scaled score change
from 2006 to 2007. For the total sample of 210 students at Eberman, there was a
gain of 6.05 points in CST ELA scaled scores. In addition, there were
statistically significant gains in grades 3 to 4 and 5 to 6, reflected by observed
probability values of .001.
Table 8
Scaled Score Changes from 2006 to 2007

Grade Level   Post 2007   Pre 2006   Gain    t Test (df)   Observed p
2 → 3         299.79      304.59     -4.80   -.991 (55)    .326
3 → 4         322.47      301.62     20.85   4.261 (33)    .001*
4 → 5         320.17      321.38     -1.21   -.290 (46)    .773
5 → 6         332.85      320.70     12.15   4.071 (72)    .001*
Total         319.51      313.47     6.05    2.81 (209)    .005*

*p < .150.
Practical Significance: 2006-07 Scaled Score Change
Three indicators of effect size are shown in Table 9: raw growth,
Cohen’s d, and percentage growth from 2006-2007.
Raw growth is the gain or loss from 2006 to 2007. For the CST scaled
scores, raw growth was difficult to interpret because of the arbitrary scaling of
the CST, but this index was included anyway.
Cohen’s d measures the change from 2006 to 2007, divided by the 2006
standard deviation. In order to show practical significance, our pre-set criterion
was that Cohen’s d needs to be greater than .20.
The next effect size indicator was the percentage growth from year-to-
year. Percentage growth was computed by dividing the change from 2006 to
2007 by the 2006 average minus 150 points, the arbitrary set zero point on the
CST scaled scores. A common standard for practical significance assessed
using percentage growth is 10%. Because CST scaled scores do not have a true
zero point and percentage growth is dependent on the pretest mean, percentage
growth should be interpreted with great caution.
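As a rough check of this formula, using the third to fourth grade values reported in Table 8 (a gain of 20.85 scaled score points on a 2006 mean of 301.62): 20.85 / (301.62 - 150) = 20.85 / 151.62 ≈ .14, or about 14% growth, which is consistent with the value reported for that grade span in Table 9.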
The subsequent tables show the same outcomes described previously, but
for the number of students at Basic and above and at Proficient and above.
When looking at practical significance for Basic and above and Proficient and
above, the scale has a true zero point, so no additional computation is required
beyond dividing the change by the pre mean.
The raw change measure found two grade levels with positive change:
third to fourth grade with a value of 20.85 and fifth to sixth grade with 12.15.
These were also the only two grades that met the practical significance criterion
of a Cohen's d greater than .20, with the third to fourth grade change at .46 and
the fifth to sixth grade change at .27.
Table 9
Practical Significance: 2006-07 Scaled Score Change

Grade Level    Pre SD   Change    Cohen's d   % Growth
2 → 3          49.03    -4.8      -.10        4%
3 → 4          44.89    20.85a    .46a        14%
4 → 5          46.58    -1.21     .03         1%
5 → 6          44.61    12.15a    .27a        7%
Total Sample   46.80    2.00      .04         4%

a Cohen's d > .20, % Growth > .10.
The results in Table 9 show that only one indicator of percentage growth
reached the preset standard for practical significance of 10%: the percentage
growth for third to fourth grade progress. That 14% growth exceeds the 10%
level.
Performance Level Changes from 2006-2007
Statistical Significance
Another way to look at growth is to look at the change for the Basic and
above group of students and the students classified as Proficient and above.
Tables 10 and 11 show the results. The first columns show the post and pre
means and the resulting change from 2006 to 2007; a positive change reflects a
gain and a negative number reflects a loss. The t-test results and the observed
probabilities are also given in Tables 10 and 11. It is important to note that with
small sample sizes, such as in these data, growth may occur yet not reach
statistical significance.
In looking at the Basic and above results in Table 10, fifth to sixth grade
students showed a positive change of .10 with a t-test value of 2.41; the
observed probability of .019 did not meet the requirement for statistical
significance of .015. Third to fourth grade students also showed a positive
change of .12 with a t-test value of 1.44; consistent with the fifth to sixth grade
results, the observed probability of .160 was not statistically significant.
Changes were also positive in the Proficient and above groups for the
third to fourth grade students and the fifth to sixth grade students. Both grade-
level groups reached a change of .11, above the .10 criterion. Proficient and
above students from fourth to fifth grade also showed positive growth with a
change of .02. T-test results were positive for fifth to sixth grade students at
2.38 and for third to fourth grade students at 2.10. Observed probabilities did
not reach statistical significance for any grade-level group of students.
Table 10
Basic and Above Change

Grade Level   Post 2007 Mean   Pre 2006 Mean   Change   t Test (df)   Observed p
2 → 3         .52              .50             .02      .299 (55)     .766
3 → 4         .68              .56             .12      1.44 (33)     .160
4 → 5         .60              .66             -.06     -1.14 (46)    .261
5 → 6         .77              .67             .10      2.41 (72)     .019*
Total         .65              .60             .05      1.53 (209)    .129*

*p < .150.
Table 11
Proficient and Above Change

Grade Level   Post 2007 Mean   Pre 2006 Mean   Change   t Test (df)    Observed p
2 → 3         .16              .25             -.09     -.991 (55)     .024*
3 → 4         .26              .15             .11      2.098 (33)     .044*
4 → 5         .23              .25             .02      -.44 (46)      .660
5 → 6         .33              .22             .11      2.38 (72)      .020*
Total         .25              .22             .03      1.178 (209)    .240

*p < .150.
Unfortunately, the grade-level group with the least growth in both Basic
and above and Proficient and above was the second to third grade students.
Many excuses may be made, but the bottom line is that these students were not
showing adequate growth, which is unacceptable. For the Basic and above
students, the change in mean was .02, the t test was positive at .299, and the
observed probability was .766. Table 11 shows the results for the Proficient and
above students, with a negative change in mean of -.09 and a negative t result of
-.991. The observed probability was .024.
Performance Level Changes from 2006 to 2007:
Practical Significance
Tables 12 and 13 show the practical significance of the change from
2006 to 2007. As explained previously in the scaled score change section, the
same markers are used—raw change, Cohen’s d, and percent growth to show
practical significance.
The raw change data do show practical significance in light of our preset
criteria of 10%. The change from grade three to four is 12% and five to six
shows a 10% gain in the basic and above category. The same grade levels each
show an 11% gain in the proficient and above data.
Cohen's d results were positive and practically significant for the third to
fourth grade students at .24 and the fifth to sixth grade students at .21 in the
Basic and above category, and within the Proficient and above category at .31
and .26, respectively.
Three grade-level groups showed positive percentage growth within the
Basic and above category; the third to fourth grade value was .21 and the fifth to
sixth grade value was .15. Both of these exceed the preset 10% growth
requirement.
Table 12
Basic and Above Growth

Grade Level    Pre SD   Change   Cohen's d   % Growth
2 → 3          .50      .02      .04         .04
3 → 4          .50      .12a     .24a        .21a
4 → 5          .48      -.06     -.13        -.09
5 → 6          .47      .10a     .21a        .15a
Total Sample   .49      .05      .10         .08

a Change > .10, Cohen's d > .20, % Growth > .10.
Table 13
Proficient and Above Growth

Grade Level    Pre SD   Change   Cohen's d   % Growth
2 → 3          .44      -.09     -.20        -.36
3 → 4          .36      .11a     .31         .73a
4 → 5          .44      .02      -.05        .08
5 → 6          .42      .11a     .26         .50a
Total Sample   .42      .03      .07         .14a

a Change > .10, Cohen's d > .20, % Growth > .10.
It is interesting to note that the other grade-level group with positive
growth was the second to third grade students, with a growth value of .04.
Although there was growth, it did not meet the 10% benchmark. The least
growth was from fourth to fifth grade students, with a growth value of -.09.
Overall Changes in 2006-2007 Individual Student Status
Utilizing scaled score data and assigning a value of 0 for not reaching the
Basic or above level and 1 for Basic or above showed a gain of .04 in the
proportion of students at Basic or above. Using the same formula for Proficient
and above, with 0 assigned to Far Below Basic, Below Basic, and Basic and 1
assigned only to Proficient and above, showed a gain of .03 in the proportion of
Proficient or above students from 2006 to 2007.
Another manner to view these data was a cross tabulation of the number
of students who moved from below Basic to Basic, Proficient, or Advanced and
the students who failed to move. The first row of Table 14 shows that 61
students remained below Basic and 22 students moved up to Basic, Proficient,
or Advanced. The second row shows that 13 students who were Basic or above
in 2006 moved out of that level and fell below Basic, while 114 students
remained in the Basic and above band.
Table 14
Pre-Basic and Above—Basic and Above Cross Tabulation

                                       Basic and Above (2007)
                                       .00       1.00      Total
Pre-Basic and Above  .00   Count       61        22        83
                           % within    73.5%     26.5%     100%
Pre-Basic and Above  1.00  Count       13        114       127
                           % within    10.2%     89.8%     100%
Total                      Count       74        136       210
                           % within    35.2%     64.8%     100%
Table 15 shows a similar cross tabulation, but for the number of students
moving into the Proficient or Advanced bands. One hundred forty-seven
students remained below Proficient, with 16 students moving into Proficient or
Advanced during the 2007 testing year. Ten students lost their Proficient status
and fell below Proficient, while 37 students remained in the Proficient or
Advanced bands.
Pivot Tables
Although the cross tabulations above show student progress, the Pivot
Chart from Data Director provides specific student progress by teacher and
subject matter from the CST.
Table 15
Pre-Proficiency/Post-Proficiency Cross Tabulation

                           Post-Proficient
                           .00       1.00      Total
Pre-Proficient  .00   Count      147       16        163
                      % within   90.2%     9.8%      100%
Pre-Proficient  1.00  Count      10        37        47
                      % within   21.3%     78.7%     100%
Total                 Count      157       53        210
                      % within   74.8%     25.2%     100%
Figure 6 represents one 2007 sixth-grade class of students from
Eberman. The performance bands are reflected by the numbers 1-5 with 1
serving as Far Below Basic, 2 as Below Basic, 3 as Basic, 4 as Proficient, and 5
as Advanced. Students showing growth in the ELA portion of the CST are
noted in the balloons north of the diagonal line. These data show that 10
students moved one performance band above from where they scored during the
2005-2006 testing year. Fifteen students remained in the same performance
band for the two testing years. This is noted by the numbers associated with
each balloon that stayed on the main diagonal. Almost 20% of the students are
caught in the Basic Band from this class, which is a concern. An additional area
of concern is the two students who fell one performance band below where they
were the previous year—one fell from Advanced to Proficient and the other
from Proficient to Basic.
The advantage of this data program was that individual students are
identified by simply selecting the balloon while in the Data Director Program.
This was a tremendous tool to aid teachers in differentiating instruction specific
to the needs of individual students. Equally valuable was the ability to create
flexible groupings for specific skill instruction from this resource.
Figure 6. 2007 CST ELA Proficiency Level, Grade 6, Class 1
Academic Performance Index and Adequate Yearly Progress
Eberman did not meet the state or federal benchmarks for the No Child
Left Behind Act and therefore fell into Year 2 of Program Improvement. Our
academic performance index (API) showed a decline of 9 points, from 669 to
660, and subgroup progress is shown in Table 16, with the change in API
reflected from the 2006 to 2007 scores.
Table 16
Academic Performance Index

Group/Subgroup      2006 API   2007 API   Change
Schoolwide          669        660        -9
Hispanic            637        650        13
White               697        685        -12
SED                 658        648        -10
English Learners    643        648        5
Growth targets varied from 7 to 11 points. Only one group met the
growth requirement—the Hispanic student growth was 13 points which
exceeded the target by 5 points. English Learners were the only other subgroup
with growth, but did not meet the target of 8 points, falling 3 points below the
target score.
Table 17 shows the percent of proficient students by grade level and
subgroup. The 2007 Adequate Yearly Progress benchmark under NCLB was
24.4% proficient in ELA. The school wide data are also shown for comparison
purposes. The English Only subgroup data differ from our white subgroup, as
we have a large Russian/Ukrainian population included in the white subgroup.
There are small pockets of students who did make adequate yearly progress
(AYP). Sixth grade exceeded the 24.4% requirement with 27%. Second grade
EL and socio-economically disadvantaged (SED) students barely met the
minimum, with 25% of these students reaching proficiency. Third grade was the
lowest performing grade in the school wide category.
The existing AYP Index started in the 2001-2002 year with 13.6% for
three years. The next benchmark of 24.4% began in the 2004-2005 school year
and continued for the next three years. For the 2008 year, the elementary school
AYP benchmark moves to 35.2% and continues to rise rapidly every year, with
increases ranging between 9 and 11 percentage points yearly until reaching the
100% mark in 2014. We are looking at an increase of almost 10 percentage
points expected in each of the next six years. This is a tremendous change for
all schools in California.
Table 17
2007 Proficiency Rates by Grade Level
Grade School wide English Learners SED English Only
2 22% 25% 25% 17%
3 11% 3% 14% 27%
4 17% 7% 24% 16%
5 13% 4% 20% 24%
6 27% 13% 25% 34%
Comparison with Benchmark School Growth
Kaye Beeson School made growth for the third consecutive year and
exited Program Improvement status after three years of being in Program
Improvement. This is not an easy feat as the percentage of proficient students
changed from 13.6 to 24.4 in ELA from the first year they entered Program
Improvement to the past year. Overall, significant progress had been made in
the benchmark school for the past three years. Table 18 addresses the difference
in 2007 results between Eberman and the benchmark school, Kaye Beeson
Elementary. Table 18 shows that Kaye Beeson School gained 15 API points
while Eberman declined by 9 points. The greatest difference is that the gap
between the schools is 56 points on the API, with 716 for Kaye Beeson and 660
for Eberman.
With the comparison of AYP, found in Table 19, it is interesting to note
that both schools had changes in the number of students tested. Kaye Beeson
had 52 fewer students from 2006 to 2007, whereas Eberman increased by 44
students during the same timeframe. The main difference is that the benchmark
school's scores are higher than Eberman's and above the target of 24.4% in
every subgroup.
The greatest information gained from the benchmark school comparison
is the difference between the two schools in progress toward the 2014 goal of
100% of all students proficient or above. There was room for significant
improvement at Eberman. School wide there was a difference of 9.6%, nearly a
10 percentage point gap between Kaye Beeson and Eberman. The greatest
disparity was in the white subgroup, with a 13.4% difference. The least
disparity was within the Hispanic subgroup, with a difference of 7.5%.
Although the difference in this subgroup was lower than in any other, it is
important to state that the Hispanic students were the least proficient students in
both schools.
One common trend is the achievement gap between various subgroups.
Clearly the continuing gap is evident in both schools and in particular for the
white students at Eberman as compared to any other subgroup.
Table 18
Academic Performance Index Progress
API Score Kaye Beeson Eberman
2006 701 669
2007 716 660
Change 15 -9
California English Language Development Test Results
As mentioned previously, we cannot compare the results from the 2005-
2006 to 2006-2007 California English Language Development Tests (CELDT).
Table 19
Adequate Yearly Progress Comparison of Percent Proficient

Group             Kaye Beeson 2007   Eberman 2007   Difference
School wide       33.9               24.3           9.6
Hispanic          26.3               18.8           7.5
White             43.3               29.9           13.4
SED               33.9               22.0           11.9
English Learner   30.7               21.3           9.4
As an information indicator, we can look at changes in the population
from year to year. Tables 20 and 21 show the CELDT results. The data simply
show the number of students within each CELDT performance band. The five
bands from low to high are: Beginner, Early Intermediate, Intermediate, Early
Advanced, and Advanced. In the current year, as of October 31, 2006 we had
128 English language learners (ELL) in grades 1-6. The previous year showed
134 students identified as ELL on October 31, 2005. The date is determined by
the state of California as the “cut off” point for when ELL students must be
tested. A disclaimer on the data obtained via the California Department of
Education website for CELDT scores for the 2006-2007 year states under notes,
CELDT Form F results administered in 2006-2007, are reported using a
new common scale. Beginning with these results, the common scale will
allow year-to-year comparisons to be made in the future. Summary
Results for Form F are not to be compared with any CELDT results of
previous years (Forms A-E) including those available on this Web site.
(California Department of Education Dataquest, 2007)
Table 20
Eberman Student Results 2005-2006 CELDT - Form E

Grade/Level          Kinder   First   Second   Third   Fourth   Fifth   Sixth   Total in Level
Advanced             0        1       2        0       6        6       1       16
Early Advanced       0        12      8        3       10       12      5       50
Intermediate         0        10      16       9       5        5       3       48
Early Intermediate   0        1       2        3       4        1       1       12
Beginner             0        0       2        4       1        0       1       8
Total in Grade       0        24      30       19      26       24      11      134
Table 21
Eberman Student Results 2006-2007 CELDT - Form F

Grade/Level          Kinder   First   Second   Third   Fourth   Fifth   Sixth   Total in Level
Advanced             0        3       0        0       0        0       3       6
Early Advanced       0        12      3        2       8        8       6       39
Intermediate         0        7       7        17      5        9       10      55
Early Intermediate   0        2       8        2       2        1       4       19
Beginner             0        2       3        3       1        0       0       9
Total in Grade       0        26      21       24      16       18      23      128
Although we cannot make data comparisons within the performance
bands, we can look at the shift in the number of ELL students tested by grade-
level cohort. This does show the mobility of Eberman students. In addition to
the change in tests, it is not possible to know how many of these students were
the same for both testing years, making the CELDT an unusable tool for
comparing data between these two years.
Conclusion
There were, in fact, levels of growth in Eberman student achievement, as
evidenced by the preceding tables and data. The growth is not evident in
every grade, nor within each of the subgroups. However, the target through the
No Child Left Behind Act of 100% students reaching proficiency or advanced
by 2014 may be within reach if there is a tremendous increase in learning
reflected in student achievement.
Due to the multiple interventions in place during the 2006-2007 school
year, we cannot determine whether one intervention was more successful in
raising student achievement than another. What we do know through research
and data is that the teacher is the greatest influence on student learning.
Therefore, one can assume that effective instructional practices and faithful core
curriculum delivery—for and with engaged students—will make an impact
toward reaching the federal benchmark and leaving no child behind at Eberman.
CHAPTER 5
SUMMARY, DISCUSSION, AND RECOMMENDATIONS
As stated previously, the interventions were: Core curriculum
instruction, effective teaching practices, and visiting the benchmark school. The
intent of this chapter is to elaborate on the research and examine further the
results acquired during the 2006-2007 school year. Finally, I will identify areas
for growth and recommend actions and practices to continue raising student
achievement at Eberman Elementary School.
With the numerous interventions beginning during the 2006 school year,
we had the limitation of multiple, overlapping interventions. No single
intervention can claim responsibility for the results on the California Standards
Test (CST), whether positive, negative, or unchanged. The CST showed
academic gains in some grades, such as three to four and five to six, while
losses were apparent in others.
Multiple analyses of the CST data were conducted comparing student
achievement in the 2005-2006 school year to the 2006-2007 school year. The
greatest challenge in comparing student progress was the high mobility rate of
Eberman students. During the 2006-2007 school year we had 103 students enter
after the start of school and 78 students leave. Our average daily attendance was
440 based on the California Basic Educational Data System (CBEDS)
definition. Fifty-two additional students were also relocated
from one classroom to another. The majority of these classroom reassignments
occurred at the beginning of the school year with the collapsing of a third grade
class due to low enrollment and adding a kindergarten class due to high
enrollment. Two teachers were reassigned—one from third grade to
kindergarten and one from fourth to a third/fourth combination class. Table 22
shows all changes in gains and losses for grades 2-6 which may account for the
CST results.
Table 22
Student Gains and Losses - 2006-2007

Grade            Second   Third   Fourth   Fifth   Sixth
Gains            16       34      7        10      16
Losses           11       31      7        8       14
Total Students   27       65      14       18      30
In light of the mobility of students during the two school years, it is not difficult to understand why the results were examined through two lenses. Two analyses were conducted: an independent t-test looking at the results of all students in grades 2-6 and assessing the change in student achievement from 2006 to 2007, and a dependent t-test assessing the progress of students
that were enrolled during both testing years. For this reason, the dependent t-
test results only looked at students in grades 3-6.
Results Overview
Independent t-test Results
In comparing progress using grade-level means from 2006 to 2007, grade two showed a small positive change of .09, to a mean of 1.67. The mean reflects performance-band progress on a 5-point rubric, with 0 representing the Far Below Basic band, 1 representing Below Basic, and so forth, with 4 representing the Advanced performance band. Grade six showed growth of .04 points, reaching a post mean of 2.12, and was the only grade with a mean reflecting average student performance in the Basic performance band; all other grades had means below the Basic band. Grades three, four, and five showed negative changes ranging from -.08 for fourth grade to -.21 for fifth grade. The grade three loss was -.10, for a post mean of 1.36. Grades four and five had slightly higher 2007 means of 1.78 and 1.67, respectively.
Schoolwide data reflected a slight, statistically nonsignificant loss of -.05, from 1.78 in 2006 to 1.73 in 2007. This is not a dramatic change, but it deserves attention because the shift is negative rather than positive.
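As a rough illustration of the logic behind this comparison (not the actual data or software used in the study), the following Python sketch shows how performance-band scores on the 0-4 rubric from two testing years might be compared with an independent t-test; the band values below are made-up placeholders.

    import numpy as np
    from scipy import stats

    # Hypothetical performance-band scores on the 0-4 rubric
    # (0 = Far Below Basic, 1 = Below Basic, 2 = Basic, 3 = Proficient, 4 = Advanced).
    # These values are illustrative placeholders, not Eberman data.
    bands_2006 = np.array([0, 1, 1, 2, 2, 3, 1, 2, 0, 4])
    bands_2007 = np.array([1, 1, 2, 2, 2, 3, 1, 2, 1, 0])

    change = bands_2007.mean() - bands_2006.mean()             # shift in the mean band
    t_stat, p_value = stats.ttest_ind(bands_2007, bands_2006)  # independent t-test
    print(f"mean change = {change:+.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")

Because the two testing years include different students, the groups are treated as independent samples, which mirrors the schoolwide comparison reported above.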
Subgroup Progress
The proficiency rate of every subgroup fell during the 2007 testing year. The white subgroup had the largest decline, with a loss of 1.9%, to 29.9%. However, white students were the only subgroup to exceed the 2007 NCLB state benchmark of 24.4% proficiency in English Language Arts (ELA). The group with the least change from 2006 to 2007 was the English Language Learner population, which moved from 21.9% to 21.3%.
Dependent t-test Results
When the analysis was limited to those students who attended Eberman
for two years, the results were in marked contrast to those reported above. When
comparing the same students and their progress over the study period, there
were statistically and practically significant gains for students from grades three
to four and five to six. These students were in the lower grades during the 2006
testing cycle and the higher grades for 2007. When identifying student progress,
the grade level refers to the grade of the students in 2007, which is why the
dependent t-test was only used on results in grades three to six.
Scaled-score comparisons showed a gain of 6.05 for all students, with an average post-test scaled score of 319.51 on the CST ELA. This score places the average performance of students at Eberman in the Basic performance band. Grades four and six showed the greatest gains, with a 20.85-point gain for the fourth grade and 12.15 for the sixth grade. Grades three and five showed statistically nonsignificant losses of -4.80 and -1.21, respectively, in CST ELA scaled-score means.
Under No Child Left Behind (NCLB), another way to assess progress is the change in the percentage of students who move from one performance band to another. In particular, we examined growth for students moving to Basic and above and to Proficient and above. Looking at the movement of students into Basic and above, grades three, four, and six showed positive change, with fourth grade making the most progress. Cohen's d compared the progress to the pretest standard deviation: grade four scored a .24 standard deviation gain, grade six was close behind at .21, and third grade also showed a positive change of .04 standard deviations. Grade five was the only grade with a negative change, falling -.13. Using the same standard-deviation units for growth in students reaching Proficient or above, only fourth and sixth grades showed positive results, with gains of .31 and .26 respectively. Grade five had a slight loss of -.05, while third grade had the greatest decline in proficiency at -.20.
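As a companion sketch, again with made-up numbers rather than study data, the matched-cohort analysis described above can be expressed as a paired (dependent) t-test on scaled scores, with Cohen's d computed as the mean gain divided by the pretest standard deviation:

    import numpy as np
    from scipy import stats

    # Hypothetical CST ELA scaled scores for the same students in both years;
    # illustrative placeholders only, not Eberman data.
    pre = np.array([298.0, 310.0, 325.0, 287.0, 301.0, 340.0, 276.0, 315.0])   # 2006
    post = np.array([305.0, 322.0, 331.0, 290.0, 320.0, 348.0, 280.0, 333.0])  # 2007

    t_stat, p_value = stats.ttest_rel(post, pre)               # paired (dependent) t-test
    cohens_d = (post.mean() - pre.mean()) / pre.std(ddof=1)    # gain in pretest SD units
    print(f"gain = {post.mean() - pre.mean():.2f}, t = {t_stat:.2f}, "
          f"p = {p_value:.3f}, d = {cohens_d:.2f}")

Dividing by the pretest standard deviation expresses each grade's gain in pretest standard-deviation units, matching the convention used in the results above.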
Discussion
With the inconsistent results of the analysis of students who attended Eberman for two years, it is important to ask why some grades made gains while other grades experienced losses. It is possible, and probable, that the level of implementation of the interventions varied from teacher to teacher and from grade to grade. Teacher experience can also impact the level of curriculum implementation. Open Court, the core language arts curriculum, is an extremely detailed and vast program. Inexperienced teachers are learning to manage many facets of instruction and curriculum delivery, while more experienced teachers have had exposure to a wide variety of language arts curricula. It is most probable that teacher experience and resources contributed to the inconsistency of results in student achievement.
Elmore (2004) offers a possible explanation for the lack of progress and positive schoolwide results. In School Reform from the Inside Out, he discussed the progress of two schools. Both schools were working on improving student achievement through schoolwide reforms and initially made progress, but they then did not make the growth required by NCLB and are now considered failing schools.
This is actually a predictable pattern through the entire improvement
process.… Significant gains in performance … are usually followed by
periods of flat performance. These periods of flat performance are
actually very important parts of the improvement process—they are the
periods in which individual teachers consolidate and deepen the
knowledge and practices they acquired in earlier stages. (Elmore, 2004,
p. 248)
Could Eberman simply be in a period of flat performance due to teachers
digesting previous professional development concepts brought in as
interventions during the 2006-2007 school year?
Core Curriculum Intervention
The California Comprehensive Center produced a research summary in
2006 on nine Essential Program Components. The nine Essential Program
Components address instructional necessities that are to be in place in order for
a school to improve academic achievement. Frequently this is a tool used by
individual schools that are not reaching the required state benchmarks
established by NCLB. As part of the school’s failure to get results, they may
become a School Assistance Intervention Team (SAIT) School. Districts that
are failing have a similar resource in the District Assistance Intervention Team
(DAIT).
While each individual component is considered an important piece of the
school improvement process, no one component should be seen as a
silver bullet. The nine components are intended to work together to form
a comprehensive and coherent improvement process; the strength of each
component is bolstered by addressing all components in combination.
Recent research shows that a coherent instructional program is key to
successful improvement. (American Institutes for Research, 2006, p. 2)
As one of the main study interventions and a state-mandated regulation, implementing the standards-based core curriculum in reading is non-negotiable. The Open Court curriculum was adopted as the district curriculum in 2002 by the board of education of Washington Unified School District. Implementation at Eberman varied greatly from teacher to teacher and from grade level to grade level. It is not surprising that the CST results also varied greatly from grade level to grade level. Progress was made for students who moved from third to fourth grade and for those in sixth grade during the 2007 testing year, as discussed previously.
Another factor that emerged from teacher interviews and conversations was the perceived degree of implementation. The teachers felt they were implementing the program more fully than was observed during the SAIT walk-through conducted on April 25, 2007. Comments about implementing the core curriculum included, “I think we are doing what other schools are doing. Maybe we need to be more consistent.” The intent of the SAIT walk-through was to have an outside observation of our core curriculum implementation. The walk-through was not intended to be a “gotcha” for teachers, but simply another snapshot of our implementation of the standards-based language arts curriculum and our compliance with the ELD mandates.
Prior to the walk-through, teachers completed the Academic Program Survey, a four-point rubric rating the implementation status of the nine Essential Program Components (EPC). A 3 is the highest rating, meaning the curriculum is fully implemented; a 2 reflects curriculum that is substantially implemented; a 1 means partially implemented; and a program that is minimally implemented receives a 0. The component most relevant to this study is EPC 1, which looks at the implementation level of the instructional program. Objective
1.1 states: “The school/district provides the most recent State Board-adopted
core instructional programs in reading/language arts (2002-2008 adoption,
including interventions), documented to be in daily use in every classroom, with
materials for every student” (California Department of Education Intervention
Assistance, 2007, p. 1).
With a 3 being the desired score, reflecting 100% implementation, the average for Eberman teachers on EPC 1.1 was 2.4. The SAIT Team's report of findings on Objective 1.1 for language arts scored us a 1, meaning we were only partially implementing our Open Court curriculum. There was a discrepancy between the level of core curriculum implementation the teachers perceived and what the SAIT visit showed. The SAIT Team included two consultants from Action Learning Systems; two district personnel, the Associate Superintendent of Educational Services and the Coordinator of Reading Intervention; and myself, the site principal.
When teachers were presented with the findings of our SAIT visit, they were defensive and displeased with the results. Comments teachers made included, “We were getting ready for our CST, and our room had all the posters off the wall.” “I do implement only Open Court.” “Who isn't implementing the program?” The conclusion I came to was that most teachers were truly not being defiant; they simply felt that what they were doing was better for students, and many had been doing it for years. They felt they were, in fact, teaching the prescribed curriculum. An excerpt on changes and actions from the Eberman Report of Findings for Objective 1.1 is provided below, and Figure 7 gives additional data.
Essential Program Component #1 Instructional Program
Findings: WUSD has provided all students with core materials. Some supplemental, non-SBE-adopted materials are in use. There are teacher concerns about using only OCR materials to meet all student needs and about how to adjust lessons when students do not succeed. Continual student engagement during all parts of the lessons is a concern.
• The site adopted the EL program “Moving Into English” five years ago, but use in classrooms is inconsistent, with no ELD pacing guide and no assessment monitoring for the program [which had not been purchased].
• Understanding grade-level standards, using CST released test questions to guide the rigor of instruction, and planning with the core curriculum are next steps for the teachers.
Clearly, the missing piece is a schoolwide commitment to consistent delivery of the core program. Consistent delivery does benefit students, especially highly mobile ones: students need consistency to achieve, as evidenced by schools that have full implementation, such as the benchmark school we visited in January 2007, which is described in further detail later in this chapter.
EPC #1: Instructional Program

Corrective Action: Full implementation of OCR lesson components by all teachers in all classrooms.

Benchmark/Evidence:
• Observation of consistent instructional routines in all classrooms.
• All teachers will be moving all students appropriately through the Phases of Direct Instruction: Orientation, Presentation, Highly Structured Practice, Guided Practice, and Independent Practice.
• It will be evident students and teachers understand their different roles during instruction.
• Differentiating ELA areas during workshop instruction to meet all student needs based on data.
• Evident student engagement and student-focused independent practice as related to the grade-level standards and student needs.

Personnel Responsible: Principal, Reading Coaches, Teachers, District Administration. Date: November 2007.

Details/Tools:
• Review with staff the Phases of Direct Instruction and student engagement strategies during all parts of the lesson.
• Grade level/staff agreement on engagement strategies that will be used routinely in all rooms.
• ELA Coach support and modeling.
• Use of WUSD data analysis protocol and grade-level action plan.

Figure 7. Excerpt from Eberman Report of Findings
Effective Instructional Strategies
Another significant intervention was to get teachers to focus on student
learning in addition to their instruction. We need students engaged and taking
part in the learning process. I have viewed terrific lessons from an instructional
delivery standpoint—but the teachers were the ones doing all the work.
Students need to understand what the purpose of the lesson is and how they will show the teacher what they have learned.
course of the year, there were changes in the level of engagement of students as
evidenced by the various engagement strategies that teachers brought into their
lesson delivery. Numerous comments were shared with teachers from the walk-
through observation form which addressed the implementation of the
interventions—core curriculum implementation, effective teaching strategies,
and lesson objectives for students.
Using the new feedback form (Appendix D) teachers received feedback
following the principal walk-throughs during the spring of 2007. The focus of
this tool was to let teachers know what was observed in their classroom during a
brief and unscheduled visit. Observations during the walk-through focused on
what we had worked to develop during the year as far as implementing the core
curriculum, effective teaching strategies, and lesson objectives. The comments
were based on evidence of these practices from the visit.
With our focus on effective lesson delivery and rigor of instruction and
learning, one feedback area specifically addressed the concept of expectations
and rigor. The descriptor for this area reads, “Level of expectation for student work is high for every student as evidenced by rigor during the lesson.” Actual feedback comments from a visit to a first-grade classroom late in April of 2007 included, “Teacher reviews concepts and has clear expectations” and “Great redirection for students! They learn to feel safe and take risks and most importantly to keep trying.” Visits to intermediate grades four through six produced the following observations: “Not all students have their book out; two students had their heads on the desk and were not working.” “Teacher encourages students to continue to work and not sit back and let others do the work.” “Allowing students to work together is a great strategy to promote discussion and improve achievement. Keep up the great work!” Providing teachers with timely feedback allowed them to reflect on what they intended to accomplish compared to the learning outcome that was visible. In addition, everyone likes to know when they are doing something well. Positive, sincere, and truthful comments can build relationships and create topics for further discussion, enhancing teaching and learning.
Fullan, Hill, and Crevola (2007) addressed the challenge of school reform. The previous performance bar was to move students from below the 50th percentile to the 70th percentile on state assessments. The current goal no longer allows us to move only 70% of our student population to proficiency: by 2014, in compliance with NCLB, all students must reach or exceed the proficiency benchmark. A study conducted by the Cross City Campaign for Urban School Reform (2005) looked at reform implementation in three school districts. The research found that none of the districts were successful in sustaining long-term change.
So what is the problem? In our view, the strategy lacks a focus on what
needs to change in instructional practice. In Chicago, teachers did focus
on standards and coverage, but in interviews, they ‘did not articulate any
deep changes in teaching practice that may have been under way.’
(Cross City Campaign, 2005, p. 23)
Long-term change is the only way we, as educators, can choose to do business. The only way today's elementary-age children will meet the demands of the country and the world when they graduate is through deep change in instructional delivery now.
Fullan, Hill, and Crevola (2007) addressed the shift in classroom instruction over the past 10 years toward looking at the learning and not simply the teaching. This is an advancement toward school reform, but it is not the only piece needed to improve student achievement: “(B)ut we are all too aware of the enormous task still to be achieved in extending and embedding the reform work and ensuring that best practice in classroom instruction becomes the norm and an institutionalized feature of schools and systems” (p. 28).
Effective teaching practices include a vast array of components: planning, instructional delivery, student engagement, expected outcomes, and assessment of the learning. One of the biggest challenges in education is to evaluate whether a student achieved the intent of the lesson during the lesson. Waiting until the student completes and turns in independent work, usually assigned as homework, and then evaluating whether the student learned the material is too late.
Fisher and Frey (2007) noted numerous ways for teachers to assess whether students mastered a concept during instruction, as opposed to waiting for individual student work to be turned in. With the opportunities technology is bringing to the classroom, one newer method is an electronic device that enables teachers to get feedback on the learning in “real time”: the Audience Response System (ARS). This is a tool all schools should have access to. The teacher can use an ARS and know immediately which students comprehended the material and which need more instruction. Fisher and Frey provided an example of this tool in use during a science lesson. The teacher asked a multiple-choice question about cells. When the class responded, the answers were displayed using an LCD projector. The teacher realized multiple students did not understand the concept, retaught the material, and had the students answer the question again. Still, some students selected a wrong response. At this point, the teacher was able to determine that they needed more guidance on test-taking strategies, as some were caught by the option “all of the above,” and the teacher reviewed, using visuals, why one of the answers could not be correct. When some students select a wrong answer, instruction can be geared toward the knowledge gap without losing precious instructional time (Fisher and Frey, 2007, p. 52). Most importantly, because the class had been responding with roughly 90% accuracy on the ARS throughout the instruction prior to the question on cells, the teacher knew immediately there was a problem and could address it at the moment of learning, before students internalized the wrong information and required further reteaching.
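To make this check-for-understanding loop concrete, here is a minimal Python sketch of how responses to one question might be tallied and flagged for reteaching when accuracy falls below the roughly 90% rate mentioned above. The function name, student identifiers, and threshold are hypothetical and are not tied to any particular ARS product.

    from collections import Counter

    def summarize_question(responses, correct, threshold=0.90):
        """Tally one multiple-choice question and flag whether to reteach.

        `responses` maps a student identifier to the option chosen; `threshold`
        mirrors the roughly 90% accuracy rate noted above (an illustrative cut-off).
        """
        tally = Counter(responses.values())
        accuracy = sum(1 for choice in responses.values() if choice == correct) / len(responses)
        missed = sorted(s for s, choice in responses.items() if choice != correct)
        return tally, accuracy, accuracy < threshold, missed

    # Example: a question about cells whose correct answer is "B"
    answers = {"s01": "B", "s02": "D", "s03": "B", "s04": "B", "s05": "A"}
    tally, accuracy, reteach, missed = summarize_question(answers, correct="B")
    print(tally, f"accuracy = {accuracy:.0%}", "reteach" if reteach else "move on", missed)

The point of the sketch is the decision rule, not the device: instruction is redirected the moment accuracy drops, rather than after independent work is collected.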
Benchmark School Visit
One of the greatest challenges for Eberman teachers was adapting to the
concept of implementing the core curriculum 100% with fidelity. As mentioned
previously, the adoption of the curriculum continued through two previous
principals and four superintendents. During that time, teachers were able to do
what they felt was best for students and many did not follow the intent of the
current language arts adoption. Due to this history, it was difficult to convince
teachers to follow the program 100%. I felt that if they could see a school with a diverse population implementing the program successfully, it would strengthen Eberman teachers' ability to return and do the same with our students. This was the intent in visiting the benchmark school: for Eberman teachers to see a school with a similar population succeeding with a program that many teachers felt was not going to work for them. The purpose of the benchmark school visit was not only for teachers to view the core curriculum in use, but also to identify effective teaching strategies and to make the connection between the two and how learning is affected when they work in tandem.
The California Department of Education created the Similar Schools Ranking so that underachieving schools can see the results that demographically similar schools are achieving.
How Are Similar Schools Ranks Used? California public schools serve
students with many different backgrounds and needs. As a result,
schools face different educational challenges. The similar schools ranks
allow schools to look at their academic performance compared to other
schools with some of the same opportunities and challenges. The similar
schools ranks can be used in at least two ways. First, schools can use
this information as a reference point for judging their academic
achievement against the other 100 schools facing similar opportunities and
challenges. Second, schools may improve their academic performance
by studying what similar schools with higher rankings are doing.
(California Department of Education, 2007a, p. 3)
It is important to note that Kaye Beeson Elementary School was not a
similar school for Eberman according to the state of California, but I knew the
principal, having myself been a teacher at this location years ago when Open
Court was first introduced to this school. The principal had been there for 8
years. Prior to accepting this position, she worked at the district level
coordinating the reading coaches for implementing the Open Court program.
Due to her previous position, she had a vast knowledge of the intent of the
program and understood the research-based curriculum.
Kaye Beeson Elementary School was selected because it had made great gains in improving student achievement since the passage of NCLB. Kaye Beeson raised achievement by following the prescribed delivery of Open Court, and for the past 5 years its staff development had focused on effective instructional practices and on using data to drive instruction. Previous visits to Kaye Beeson had shown me that this was exactly what I wanted the teachers at Eberman to do. What better way is there to encourage teachers, change thinking patterns, and gain buy-in than by visiting and seeing instruction and learning firsthand?
We were fortunate to have a full staff development day when
neighboring school districts were in session. Due to this schedule, I brought all
my certificated staff to visit Kaye Beeson Elementary School. In addition to the
staff, I also invited our consultant from Action Learning Systems to join us. The
consultant had a history with the staff due to extensive staff development prior
to my arrival, which is why I wanted him to join us. Mr. Wood was fully aware
of the lack of consistent core curriculum implementation at the school.
Teachers visited classrooms according to a schedule. In addition to their
own grade, they also saw instruction either above or below their current grade
level. The purpose of this was so they could identify consistency across the
grade levels and see how this benefits student learning. Teachers filled out an
evaluation following the visit (Appendix B). The focus of the evaluation was to
have teachers identify what learning and teaching practices they observed and
assess whether they could bring back any strategies to benefit Eberman students.
Names were not required on the evaluation, although some teachers chose to
place their name on their own evaluation. Teacher comments regarding
practices or strategies to benefit our students included:
• “Answer in full sentences and use academic language.”
• “A teacher behavior I saw pretty consistently was the pace of the lesson
delivery—it moved along at what seemed to be the right amount of speed
to keep all the cars on track.”
• “Teachers knew their students were learning because they seemed to be
always looking at the kids during response time.”
Multiple responses along these lines told me that our teachers knew what needed to be done to improve our achievement.
The greatest change from this visit was the modification of pre-existing teaching practices that could be observed following the visit. Additional evidence of changes in behavior came from dialogue with teachers. One teacher shared that she rearranged her classroom after the visit. Prior to the visit, her students' desks were placed in rows with little opportunity for student-to-student interaction during lessons, and numerous Eberman teachers had their rooms configured in this manner. I did not ask teachers to specifically connect effective teaching to room setup; instead, I asked each teacher to draw the room setup during the visit. It was exciting to see teachers make the connection between the physical barriers that room designs can impose and the interaction between students and the teacher that is necessary to improve comprehension of the subject matter presented.
Eberman teachers definitely got a sense of the urgency that filled the benchmark school. Time was not lost during transitions or instruction. One key concern that did permeate the evaluation responses was improper student behavior at Eberman; teachers saw students who were respectful and on task in the classrooms they visited. Unfortunately, the concept that was not acquired during the visit was that when students are engaged and challenged in learning during instruction, their behavior is focused on learning and succeeding.
Future Suggestions and Site-Based Recommendations
Future Suggestions
Due to the varying progress students made, continued research should be conducted on the progress school sites make when all teachers are on the same page. The results they produce may make a great difference for other schools that are in similar places and lack continuity. The state provides us with benchmark schools. Contacting higher-performing schools and those listed as “similar schools” will allow underperforming schools to learn from them and continue to make progress toward all students reaching proficiency.
It is helpful to utilize the CDE website as a resource for locating higher-performing schools to use as benchmarks and then to research what those schools are doing to achieve results. In addition to using higher-performing schools as a resource, it is also informative to use the similar-schools list provided for each individual school. Using a benchmark school and the similar-schools listings were avenues to pursue to discover what commonalities appeared among school sites with higher ratings than Eberman. Previous use of this resource has brought cries from staff that the schools are not really “similar,” but it is actually easy to locate a high-performing similar school with matched demographics.
What must be addressed is the lack of consistency in student learning. For some students and grade levels to make progress, as measured by CST growth from 2006 to 2007, while other grade levels decline is not an acceptable outcome. The specific cause of the discrepancy was not determined through this study. Possible explanations may simply be that the teachers within certain grade levels were not consistent with curriculum delivery, or that the effectiveness of instruction varies from grade level to grade level. Providing support for teachers through coaching and professional development may be the key to creating an environment that accelerates student learning. Using data to drive instruction and devising opportunities for reflective conversations based on results will allow teachers to work together effectively and ultimately continue to improve instruction and results at Eberman Elementary.
The bottom line is that all schools must reach 100% of students scoring proficient or above by 2014. Exact school populations will not be the concern, just the end result: who is achieving and how are they doing it? What are other schools doing that we are not? With fewer than 25% of Eberman students scoring proficient or above in 2007, the reality is that we are failing more than 75% of our children in grades two through six, as measured by the CST and the results we have produced.
Site-Based Recommendations
Suggestions for Eberman are to continue with the delivery of the core
language arts curriculum with delivery support in addition to the site principal.
The questions that arose from the SAIT visit and the knowledge gap between
teachers about what full implementation looks like, why it is significant, and
what our students need is great. Two manners of support would be ideal. First
of all, bringing in Reading Coaches who only focus on working hand-in-hand
with teachers in a non-threatening and non-evaluative manner is crucial.
Teachers observing model lessons and then having professional and reflective
140
conversations with coaches will allow for a level of honesty that is difficult to
achieve with an administrator who is also one who evaluates their instructional
performance. Second, bringing in a complete SAIT process would allow
teachers to hear the message of delivering the core program and what that
implies from another source and also one that does not evaluate them
individually. With these support pieces in hand, the site administrator can still
serve as the instructional leader but with a continued message with other
professionals that are outside the school.
Recommendations for future implementation center on a focus on student outcomes. Once the curriculum is implemented with 100% fidelity consistently schoolwide and effective instructional practices are the norm and expectation, what results are we achieving? Data must drive instruction. If teachers know what students know and what they need to know, then they can design instruction to meet the needs of the students. Creating a culture of teachers who identify weaknesses in instruction and devise ways to improve it, with the result of improved student learning, is critical. Johnson (2002) identified data as the critical piece needed to close the achievement gap. Teachers must examine data as a team. When grade-level teachers are able to develop and implement lessons, immediately assess student learning, and then use the data to reteach and refine instruction, we will have a culture focused on data. How do we invent and develop this culture?
Johnson (2002) stated that the need for change and school reform can be
created internally or externally. “Whatever the reason, individuals must
recognize from the beginning the value of an inclusive and equitable reform
process and how data can be used as a fundamental tool in the process”
(Johnson, 2002, p. 11). With the additional instructional support that Reading Coaches would bring, along with the complete SAIT process, we would have an external impetus to help drive our internal need to change the way we do things. Eberman needs to build a culture of teachers and staff who believe our students can learn and who will do whatever is necessary to achieve results. Johnson (2002) identified five stages in the change process for incorporating data as a tool to address the achievement gap, stages that I believe would drive our shift toward looking at student outcomes as a team of teachers.
First, having school leadership and a data team with the proper training and time to examine school data is critical. These individuals will serve as a resource for the staff, and communicating the data results plays a critical role in the school reform. Second, dispelling the myth that students from various racial groups and low-income environments will have low outcomes is essential. For the myth to disappear, teachers need research and dialogue to “see the broad discrepancies between rhetoric and actual teaching practices at schools like their own” (Johnson, 2002, p. 11). The third step is to create a culture of inquiry. This step allows educators to question and search for true answers. The outcome of this step is for the data to stimulate the school change
process and determine what direction the school needs to take. What changes
must occur to get the desired results? Fourth, “school wide priorities must be
identified, responsibilities assigned, resources allocated, and reallocated,
measures of progress determined, and timelines established” (Johnson, 2002, p.
12). This is done with the production of the school vision and plan. Ironically,
many schools begin the vision and plan without the development of inquiry and
addressing the fact that we have a severe inequity in achievement. Designing a
culture willing to examine instructional outcomes and address the changes to
create an equitable environment for all will allow a vision and plan that can be
achieved. The fifth step is monitoring progress. Without this accountability
piece, how will teachers know what is working and what needs to change? “The
quest must be to evaluate whether the reforms are appropriate and whether they
are raising the level of student achievement” (Johnson, 2002, p. 12).
This study protocol has relevance to other school sites across the nation.
With research, state, and federal mandates supporting the implementation of
standards-based core curriculum, there really is no argument for standards-based
curriculum not being delivered to our students. Providing access to the core
curriculum is simply what must be offered to our students in order to provide
them the best chance to be competitive in an ever-changing global society.
Friedman (2006) addressed the global impact of technology and how economic
changes are affecting the world. One of the relevant concepts is about the
difference in education across the globe. American companies have moved their
operations to other countries, such as India. This change is not simply about
factories that produce items to ship back to America. We are talking about
corporate America moving its production and headquarters to India.
The book begins with Friedman on a golf course, getting ready to hit the
ball. His partner tells him to “Aim at either Microsoft or IBM” (Friedman,
2006, p. 3). Other corporations with buildings nearby are HP, Texas Instruments, and 3M. This particular course is in Bangalore, in southern India; Friedman states that Bangalore is India's Silicon Valley. The move of corporations overseas is known as outsourcing. Companies are able to divide parts of projects, send them across the globe electronically, and receive a finished product in less time because of the difference in working hours: while we are sleeping, India is up and running.
What does this economic change have to do with student achievement?
If our students today do not get access to knowledge in school, how can they
compete as young adults with others in a divergent job market? No longer can children who drop out of middle school or high school necessarily get a job to make ends meet. Computers are now an integral part of every aspect of life. Yes, a wrench may still be a tool uneducated youth can operate; however, dropouts cannot operate the computer chips within engine systems or the computer-based diagnostic tools. It is difficult to navigate a computer without the ability to read and comprehend print!
A strategy Friedman expresses to allow our youth to compete in this
global market involves education. “It must be accompanied by a focused
domestic strategy aimed at upgrading the education of every American, so that
he or she will be able to compete for the jobs in a new flat world” (Friedman,
2006, p. 263). This real-life situation shows the connection between our instructional delivery and caliber of teaching and how they affect our students. We must know what the students are learning every day in every subject.
The most relevant benefit teachers would gain from these two outside sources of support, coaches and SAIT, is that the outside parties can share results they have been involved with in other schools or districts. Providing teachers with firsthand views of Open Court implementation, through model lessons from coaches and visits to other local schools, should reassure them of the potential for success; it is important that teachers have the efficacy necessary to impact performance.
Conclusions
The original research questions were simply these: will the interventions adopted during the 2006-2007 school year make an impact on our student achievement, and are we doing what is best for kids? The first answer is yes. We exhibited statistically significant growth during the 2006-2007 school year for students in grades four and six who were present at Eberman for two consecutive testing cycles. As to whether we are doing what is best for kids, the research has shown beyond a doubt that the interventions in this study should soon not be considered interventions at all, but simply expectations for what is to occur in every school that is improving student achievement.
Let us revisit what Title I now encompasses: the legislation of No Child Left Behind. An excerpt from the statement of purpose can be found in Sec. 1001 of No Child Left Behind. Previously, 10 of the components were shared due to their relevance to this particular study. Now, I am sharing the remaining two to show the significance and intended impact of this law.
11. Coordinating services under all parts of this title with each other,
with other educational services, and, to the extent feasible, with
other agencies providing services to youth, children, and families.
12. Affording parents substantial and meaningful opportunities to
participate in the education of their children. (United States
Department of Education, 2002, Sec 1001)
In keeping with the intent of No Child Left Behind, every child should succeed through learning and have the opportunities open to proficient members of society. I believe this is our obligation as educators and the commitment we made when acquiring our teaching and administrative credentials. Further, this is what site instructional leaders strive for every day as we work with learners and teachers. All schools shall provide opportunities for every student to reach his or her potential to become a productive lifelong learner with higher-order thinking skills who can compete globally with peers from other states and countries.
Limitations: Internal and External Validity
Internal validity must be addressed in reference to the following issues:
instrumentation, selection bias, mobility, sample size, and researcher
participation. The greatest difficulty with internal validity for this study was the
selection and mobility issue, discussed in detail earlier in this chapter. During
the 2006-2007 school year, we had 103 students enter after the start of the
school year and 78 students who exited Eberman prior to the last day of school.
Relative to our Average Daily Attendance of 440, the mobility rate was 25%. This was exceptionally high, and it is important to recognize the impact mobility may have had on the results. It is not known whether that impact is negative or positive, but I speculate that it is a negative effect on student achievement results for our highly mobile population.
An added internal validity issue raised during this study, stemming from the use of the California Standards Test (CST), is instrumentation between the pre- and post-test years. Students were not tested on the same material or standards across grade levels, and the relative difficulty of the individual grade-level exams is not known. Because of instrumentation, the gains in this study may not reflect growth, but rather the difference in difficulty levels of the CST across grades.
Steps were taken to account for the high mobility and selection bias problem by comparing, with the dependent t-test, students who attended Eberman during both testing years. Significant growth was reported for two of the four grade levels examined in this manner.
Another factor when dealing with a small sample is the risk of not detecting significant growth because of the small original sample. For many contrasts, larger samples would have been needed for the data to show statistical significance. With such small sample sizes (well below 100, and as low as 42) and no random assignment, there is a chance that sampling errors occurred and affected the results reported herein. Indices of practical significance were used to counteract this problem, but agreed-upon standards for practical significance do not exist.
Internal validity was also affected by the researcher participating, as the site principal, in the day-to-day operations and implementation of the interventions. I had much to gain from increased student achievement. Student achievement is the greatest responsibility of the site administrator as an instructional leader. Positive, as well as negative, conclusions may be, in part, a reflection of my own bias.
External validity factors affecting this study must also be identified. We
have a question of external validity due to treatment interference from having
multiple interventions. The validity of this study was affected by the numerous interventions working concurrently, and the study cannot be replicated. There is no method to determine which intervention produced the progress that students exhibited. Equally significant is the question of whether student achievement was affected negatively by the multiple interventions. Student achievement at Eberman, whether a success or a lack of progress, cannot be connected to any single intervention.
Another factor affecting external validity is that results can be generalized only within the context of Eberman and not beyond it. An example of this is our English Language Learner (ELL) population. We had over 120 students identified as ELL in 2007, as reported on our California English Language Development Test (CELDT) results, which is more than 25% of the entire student population. It is possible that the multiple interventions impacted this challenging group differentially. Further, we are unable to compare progress on the CELDT due to a change in the test administered, dictated by the state: there is no way to measure progress using the group data, as we are not able to compare Form E, used in 2006, with Form F, used in 2007. Fortunately, in 2008 we will be able to compare results with 2007 and with the following years that continue to use the current Form F.
REFERENCES
Academic Program Survey Elementary Level. Retrieved November 10, 2007,
from http://www.cde.ca.gov/ta/lp/vl/documents/egaps.doc.
American Institutes for Research. (2006). Research summary supporting the
nine essential program components and academic program survey.
Retrieved September 15, 2006, from
www.cacompcenter.org/pdf/aps_research_summary.pdf.
Anderson, L. & Krathwohl, D. (2001). A taxonomy for learning, teaching, and assessing. New York: Addison Wesley Longman, Inc.
Bain, H.P., & Jacobs, R. (1990, September). The case for smaller classes and
better teachers. Streamlined Seminar—National Association of
Elementary School Principals, 9(1), n.p.
Bandura, A. (1994). Self-efficacy. In V. S. Ramachaudran (Ed.), Encyclopedia
of human behavior (pp. 71-81). New York: Academic Press. Retrieved
September 30, 2007 from www.des.emory.edu/mfp/BanEncy.html.
Barr, R & Parrett, W. (2007). The kids left behind: Catching up the
underachieving children of poverty. Bloomington, IN: Solution Tree.
Bloom, B. (1956) Taxonomy of educational objectives handbook 1: Cognitive
domain. Longman, NY: David McKay Publishing.
Brookhart, S.M., & Loadman, W.E. (1992). Teacher assessment and validity:
What do we want to know? Journal of Personnel Evaluation in
Education, 5, 347-357.
Brophy, J. E. & Good, T. L. (1986). Teacher behavior and student achievement. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 214-229). New York: Macmillan.
California Basic Educational Data System, retrieved on November 10, 2007,
from http://www.cde.ca.gov/ds/sd/cb/reports.asp.
California Comprehensive Center and American Institutes for Research.
Retrieved November 20, 2007, from
http://www.ed.gov/policy/elsec/leg/esea02/pg1.html#sec101.
California Department of Education (2007a). Intervention Assistance Office.
Academic Program Survey Elementary School Level. Retrieved
November 9, 2007, from www.cde.ca.gov/ta/lp/vl/documents/egaps.doc.
California Department of Education (2007b). Testing and Accountability
Office. How are similar schools ranked? Retrieved November 24,
2007, from www.cde.ca.gov/ta/ac/ap/documents/simsch106b.pdf.
California Department of Education Dataquest (2007). CELDT. Retrieved November 24, 2007, from http://dqcde.ca.gov/dataquest/CELDT/Celdt03_Sch.asp?cSelect=NORMAN^(ALYCE).
California Department of Education Testing and Accountability Office (2006).
Standardized Testing and Reporting (STAR) program explaining 2006
test to parents and guardians assistance for school districts and schools.
Retrieved September 15, 2006, from
www.cde.ca.gov/ta/tg/sr/resources.asp.
Clark, R. & Estes, F. (2002). Turning research into results: A guide to
selecting the right performance solutions. Atlanta, GA: CEP Press.
Covino, E. A., & Iwanicki, E. (1996). Experienced teachers: Their constructs
on effective teaching. Journal of Personnel Evaluation in Education, 11,
325-363.
Cross City Campaign for Urban School Reform. (2005). A delicate balance:
District policies and classroom practice. Chicago: Author.
Daggett, W. (2005). Achieving academic excellence through rigor and
relevance. Retrieved October 1, 2007, from
www.leadered.com/pdf/Academic_Excellence.pdf.
DuFour, R, DuFour, R, & Eaker, R. (Eds.). (2005). On common ground: The
power of professional learning communities. Bloomington, IN:
Solution Tree.
DuFour, R, DuFour, R, Eaker, R. & Karhanek, G. (2004). Whatever it takes:
How professional learning communities respond when kids don’t learn.
Bloomington, IN: Solution Tree.
Elmore, R. (2004). School reform from the inside out: Policy, practice, and
performance. Cambridge, MA: Harvard Education Press.
Fisher, D. & Frey, N. (2007). Checking for understanding. Alexandria, VA:
Association for Supervision and Curriculum Development.
Friedman, T. L. (2006). The world is flat: A brief history of the 21st century. New York: Farrar, Straus, & Giroux.
Fullan, M. (2003). The moral imperative of school leadership. Thousand Oaks, CA: Corwin Press.

Fullan, M., Hill, P., & Crevola, C. (2007). Breakthrough. Thousand Oaks, CA: Corwin Press.
Hill, J. & Flynn, K. (2006). Classroom instruction that works with English-
language learners. Alexandria, VA. Association for Supervision and
Curriculum Development.
Hill, J. & Flynn, K. (2007, March 15). Classroom instruction that works with
English language learners. Association for Supervision and Curriculum
Development Conference. ASCD 2007 Annual Conference, Anaheim,
CA. Mid-continent Research for Education and Learning.
Johnson, R. (2002). Using data to close the achievement gap: How to measure
equity in our schools. Thousand Oaks, CA: Corwin Press, Inc.
Just for the Kids. (2006). [Data file]. Available from
www.just4kids.org/jftk/index.cfm?st=US&loc=Educators.
Kirkpatrick, D & Kirkpatrick J. (2006). Evaluating training programs. San
Francisco, CA: Berrett-Koehler Publishers, Inc.
Kotter, J. & Cohen, D. (2002). The heart of change. Boston: Harvard Business
School Press.
Lachat, M. (1999). What policymakers and school administrators need to know
about assessment reform for English-language learners. Providence, RI.
Northeast and Islands Regional Educational Laboratory at Brown
University, The Education Alliance.
Lachat, M. (2004). Standards-based instruction and assessment for English-
language learners. Thousand Oaks, CA: Corwin Press.
Langer, J, A. (2001). Beating the odds: Teaching middle and high school
students to read and write well. American Educational Research
Journal, 38(4), 837-880.
Mackenzie, R. (2007, April 25). Setting limits workshop. Held at Alyce
Norman Elementary School. Staff Development Seminar.
Marzano, R. (2003). What works in schools. Alexandria, VA: Association for
Supervision and Curriculum Development.
Marzano, R. (2007). The art and science of teaching. Alexandria, VA:
Association for Supervision and Curriculum Development.
Marzano, R., Pickering, D. & Pollock, J. (2001). Classroom instruction that
works. Alexandria, VA: Association for Supervision and Curriculum
Development.
Marzano, R., Waters, T., & McNulty, B. (2005). School leadership that works. Alexandria, VA: Association for Supervision and Curriculum Development.
Molnar, A., Smith, P., Zahorik, J., Palmer, A., Halbach, A. & Ehrle, K. (1999).
Evaluating the SAGE program: A pilot program in targeted
pupil/teacher reduction in Wisconsin. Educational Evaluation and
Policy Analysis, 21 (2), 165-178.
Overview of California’s 2006 Similar schools ranks based on the API.
Retrieved on November 24, 2007, from
http://www.cde.ca.gov/ta/ac/ap/documents/simschl06b.pdf.
Schmoker, M. (2006). Results now. Alexandria, VA: Association for
Supervision and Curriculum Development.
Standardized Testing and Reporting (STAR). Program Explaining 2006 Tests
to Parents and Guardians Assistance for School Districts and Schools.
Retrieved from www.cde.ca.gov.
Stronge, J. (2007). Qualities of effective teaching. Alexandria, VA:
Association of Supervision and Curriculum Development.
Taylor, B.M., Pearson, D.S., Clark, K. F., & Walpole, S. (1999). Center for
improvement of early reading achievement: Effective
schools/accomplished teachers. The Reading Teacher, 53 (2), 156-159.
Tomlinson, C. (2001). How to differentiate instruction in a mixed-ability
classroom. Alexandria, VA: Association for Supervision and
Curriculum Development.
Tucker, S. (1996). Benchmarking a guide for educators. Thousand Oaks, CA:
Sage Publications.
United States Department of Education (2002). Public Law 107-110.
Elementary and Secondary Education Act adopted January 8, 2002.
Retrieved February 25, 2007 from:
www.ed.gov/policy/elsec/leg/esea02/pg1.html#sec101.
Walberg, H. J. (1984, May). Improving the productivity of America’s schools.
Educational Leadership, 41(8), 19-27.
Walberg, H. (1986). Synthesis of research on teaching. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 214-229). New York: Macmillan.
Waters, T, Marzano, R, & McNulty, B. (2003). Balanced Leadership: What 30
years of research tells us about the effect of leadership on student
achievement. Aurora, CO: Mid-continent Research for Education and
Learning.
Wenglinsky, H. (2002). How schools matter: The link between classroom
practices and student academic performance. Education Policy Analysis
Archives, 10(12), 1-31. Retrieved August 21, 2006, from
http://epsa.asu.edu/epaa/v10n12/.
Wenglinsky, H. (2004). Closing the racial achievement gap: The role of
reforming instructional practices. Education Policy Analysis Archives,
12(64), 1-24. Retrieved August 21, 2006, from
http://epsa.asu.edu/epaa/v12n64/.
Wright, S. P., Horn, S. P., & Sanders, W. L. (1997). Teacher and classroom context effects on student achievement: Implications for teacher evaluation. Journal of Personnel Evaluation in Education, 11, 57-67.
APPENDIX A
LEARNING OBJECTIVES
Purpose of objectives: To tell the learner what they will be learning, at what level of cognition (per Bloom's taxonomy) they will learn the concept, how they will prove they have learned it, and what percent accuracy is required.
Five parts of an objective:
1. Content: what is the specific piece?
2. Cognition Level: what level is to be reached by the student (i.e.,
knowledge, comprehension, application, analysis, synthesis, or
evaluation)?
3. Proving Behavior: what does the student need to do to show they
learned the concept or skill?
4. Condition/Given: what is needed or provided to the student to build on
(materials and time)?
5. Expectation/Performance Level: what percent accuracy is acceptable to
you?
Formula:
Given __________(condition)________
The learner will____(cognition level—knowledge, comprehension, application,
analysis, synthesis, and evaluation)
By ____(observable/congruent student-proving behavior)
With ____(percent of accuracy).
Example:
Given a review of Bloom’s Taxonomy, teachers will understand the connection
and importance of writing objectives using cognition levels and student learning
for a minimum of three lessons per day with 90% accuracy.
This objective formula was created by Pat Lawrence Ed.D.
APPENDIX B
STAFF DEVELOPMENT DAY, JANUARY 29, 2007
Visit to Kaye Beeson Elementary School
What was the highlight of your visit and why?
What student behaviors did you see that reinforce the value of student
engagement?
How did teachers know their students were learning?
Are there any practices or strategies we can bring back to benefit our students?
What additional support will help you with your Open Court Delivery?
APPENDIX C
KAYE BEESON SCHOOL VISIT
Grade Level: __________ Room # ____________
Time: _________________
Activity:
Engagement Tools and Techniques:
Checking for Understanding:
Levels of Rigor/Examples of High Expectations:
Curriculum Delivery:
Room Set Up:
Draw a diagram of the desk/table set up on the back of this paper.
How does this design help or hinder instruction?
APPENDIX D
EFFECTIVE TEACHING PRACTICES CHECKLIST
Date: Time: Teacher: Grade Level:
Components of Faithful Implementation (for each item, mark Evident, Not Evident, or N/A, and note Evidence/Comments)

Standards-Based Curriculum and Assessment
• Standards-based objective (content and student-proving behavior) is written and posted on the board.
• Standards-based objective is explicitly stated and clarified during the lesson.
• All students have and use the state-adopted instructional program.
• Instructional sequence is evident (from lesson plans, posted work, and explicit instruction) and provides students access to the content and level of cognition of grade-level standards.
• Lesson uses Direct Instruction from the appropriate ATE, as evidenced by lesson format.

Effective Teaching Strategies
• Lesson delivery includes a high level of teacher/student interaction.
• Lesson demonstrates evidence of multiple checks for understanding and feedback.
• Room design encourages interaction between students and teacher.
• Individual, pair, gesture, and choral response.
• Level of expectation for student work is high for every student, as evidenced by rigor during the lesson.
• Closure demonstrates progress toward mastery of the standards-based objective.
Commendations:
Recommendations:
Adapted from Action Learning Systems, Inc. Direct Instruction
and Mary Greeson.
APPENDIX E
STUDENT ACADEMIC PROGRESS SUMMARY
Teacher: ___________________ Date: ___________________
Levels/Students Areas of Concern Strategies/Action Results
Far Below Basic
Below Basic
Basic
Proficient
Advanced
Targeted Area of Instruction:
Materials/Tools Needed:
Assessment/Monitoring Methods:
Additional resources needed to ensure student progress:
Abstract
The purpose of this dissertation was to examine interventions to improve student achievement at Eberman Elementary. The problem was that over 75% of the students in grades two through six were not proficient or advanced as measured through the English Language Arts (ELA) California Standards Test (CST), and the ELA core curriculum of Open Court was not being implemented with fidelity. Equally significant was that the students were not engaged and participating in the lessons.