THE EFFECT OF SITE SUPPORT TEAMS ON STUDENT ACHIEVEMENT
IN SEVEN NORTHERN CALIFORNIA SCHOOLS
by
Anne Rita Zeman
___________________________________________________
A Dissertation Presented to the
FACULTY OF THE ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
May 2008
Copyright 2008 Anne Rita Zeman
ACKNOWLEDGMENTS
The author would like to acknowledge a number of individuals who
provided much appreciated support that made this project possible. Dr. Dennis
Hocevar has provided consistent support, guidance, and sage advice. Dr. Rod
Goodyear and Dr. Magaly Lavadenz offered welcome suggestions. The
encouragement of Dr. Steven M. Ladd has helped to bridge the worlds of
practice and academics.
The initial prompting of Dr. Steven Winlock led to a life-altering
learning experience. Numerous teachers, principals, directors, and associate
superintendents have given support in the substantive as well as the emotional
aspects of this undertaking. Mentioned last, but prominent in the author’s mind,
are the members of the thematic dissertation group who provided collegiality,
encouragement, humor, and inspiration throughout the last three years.
TABLE OF CONTENTS
ACKNOWLEDGMENTS..............................................................................ii
LIST OF TABLES........................................................................................ vi
ABSTRACT ............................................................................................... x
CHAPTER 1. PROBLEM IDENTIFICATION ............................................. 1
Problem Analysis ............................................................................... 6
Knowledge and Skill Factors ................................................. 7
Motivation Factors.................................................................. 9
Organization Factors ........................................................... 11
Problem Solution.............................................................................. 12
Purpose, Design and Utility.............................................................. 16
Purpose................................................................................. 16
Design .................................................................................. 18
Utility ................................................................................... 20
CHAPTER 2. FOUNDATIONS OF SITE SUPPORT TEAMS ................... 22
Self-Efficacy .................................................................................... 23
Motivation Theory............................................................................ 34
Accountability.................................................................................. 43
CHAPTER 3. DESIGN SUMMARY .......................................................... 54
Intervention...................................................................................... 55
Participants and Setting .................................................................... 55
Instrumentation and Statistical Procedures........................................ 58
Achievement......................................................................... 64
Tests and Statistical Methods................................................ 64
Qualitative and Mixed Methods Applied to Findings ........................ 65
Interview Field Work............................................................ 67
CHAPTER 4. RESULTS ............................................................................ 69
Quantitative Analysis ....................................................................... 70
Results of Pre-Post Independent Groups Design.................... 71
Results of Quasi-Experimental Design Using Matched Control
Group Comparison ........................................................................ 93
Summary of Findings ..................................................................... 104
CHAPTER 5. OVERVIEW OF FINDINGS.............................................. 106
High Expectations .......................................................................... 108
Adherence to Standards.................................................................. 109
Learning Environment.................................................................... 109
Discussion of Patterns of Results in Site Support Team Schools..... 111
Statistical Significance........................................................ 111
Successful Schools.............................................................. 112
Practical Significance.......................................................... 112
Strong Improvement in “Proficient + Advanced”
Dependent Variable..................................................................... 113
Experimental Schools Fared Well as Compared to
Control Schools ............................................................... 114
API as an Extension of the Study ........................................ 116
School-by-School Discussion ......................................................... 117
Van Buren High School ...................................................... 117
Fillmore High School.......................................................... 120
Johnson Middle School....................................................... 121
Roosevelt Middle School .................................................... 122
Lincoln High School........................................................... 123
Ford Elementary School...................................................... 126
Madison Elementary School ............................................... 129
Summary of Findings ..................................................................... 131
Recommendations .......................................................................... 137
REFERENCES .......................................................................................... 141
APPENDICES
APPENDIX A: SITE SUPPORT TEAM TEACHER QUESTIONNAIRE 147
APPENDIX B: SIMILAR SCHOOLS’ DATA IN YEAR
PRIOR TO ONSET OF INTERVENTION ...................... 148
APPENDIX C: STUDENT CST PERFORMANCE DATA,
PER SCHOOL................................................................. 150
APPENDIX D: PRE/POST-DATA OF CONTROL SCHOOLS................ 164
APPENDIX E: API CHANGES FOR EXPERIMENTAL SCHOOLS:
BACKGROUND FOR INTERVIEWS ............................ 167
APPENDIX F: LINCOLN HIGH SCHOOL PERCENTAGES
OF SUB-GROUPS, 2004-2007......................................... 168
APPENDIX G: COMMON CHARACTERISTICS THAT MAKE SITE
SUPPORT TEAMS EFFECTIVE AS MENTIONED IN
INTERVIEWS................................................................. 169
APPENDIX H: VAN BUREN HIGH SCHOOL SUPPORT TEAM
MEETING MINUTES..................................................... 171
LIST OF TABLES
Table 1. 2002 Achievement Data, Selected District Schools ....................... 4
Table 2. Years in which Schools Became Site Support Team Schools ...... 17
Table 3. Motivational Generalization and Design Principles..................... 41
Table 4. Composition of Site Support Teams............................................ 57
Table 5. Interventions............................................................................... 62
Table 6. Onset of Interventions................................................................. 63
Table 7. Pre- and Post-Intervention CST ELA Performance Band Data,
Van Buren High School, 2002-2007 ........................................... 71
Table 8. Pre- and Post-Performance Band Findings:
Practical Significance, Van Buren High School, 2002-2007........ 72
Table 9. Pre- and Post-Intervention CST ELA Performance Band Data,
Fillmore High School, 2003-2007............................................... 74
Table 10. Pre- and Post-Performance Band Findings: Practical
Significance, Fillmore High School, 2003-2007.......................... 76
Table 11. Pre- and Post-Intervention CST ELA Performance Band Data,
Johnson Middle School, 2003-2007 ............................................ 78
Table 12. Pre- and Post-Performance Band Findings: Practical
Significance, Johnson Middle School 2003-2007........................ 79
Table 13. Pre- and Post-Intervention CST ELA Performance Band Data,
Roosevelt Middle School, 2003-2007 ......................................... 80
Table 14. Pre- and Post-Performance Band Findings: Practical
Significance, Roosevelt Middle School....................................... 81
Table 15. Pre- and Post-Intervention CST ELA Performance Band Data,
Lincoln High School, 2004-2007 ................................................ 82
Table 16. Pre- and Post-Performance Band Findings: Practical
Significance, Lincoln High School, 2004-2007........................... 82
Table 17. Pre- and Post-Intervention CST ELA Performance Band Data,
Ford Elementary School, 2005-2007........................................... 84
Table 18. Pre-and Post-Performance Band Findings: Practical
Significance, Ford Elementary School, 2005-2007...................... 85
Table 19. Pre- and Post-Intervention CST ELA Performance Band Data,
Madison Elementary School, 2005-2007..................................... 86
Table 20. Pre- and Post-Performance Band Findings: Practical
Significance, Madison ................................................................ 87
Table 21. Summary of Statistical and Practical Significance of CST
Performance Band Changes at Experimental
(Site Support Team) Schools ...................................................... 88
Table 22. API Outcomes for Control and Experimental Schools................. 94
Table 23. Comparison of Change within Dependent Variables,
Control School, and Van Buren .................................................. 96
Table 24. Comparison of Change within Dependent Variables,
Control School, and Fillmore...................................................... 98
Table 25. Comparison of Change within Dependent Variables,
Control School, and Johnson Middle School............................... 99
Table 26. Comparison of Change within Dependent Variables,
Control School, and Roosevelt.................................................. 100
Table 27. Comparison of Change within Dependent Variables,
Control School, and Lincoln High School................................. 101
Table 28. Comparison of Change within Dependent Variables,
Control School, and Ford Elementary School............................ 103
Table 29. Comparison of Change within Dependent Variables,
Control School, and Madison.................................................... 103
Table 30. Table of Higher Performing Schools, Control Versus
Experimental, 2007 ELA Portion of CST.................................. 105
Table 31. Similar Schools’ Data in Year Prior to
Onset of Intervention ................................................................ 148
Table 32. Number of Students Per Performance Band, Van Buren
High School.............................................................................. 150
Table 33. Number of Students Per Performance Band, John G. Downey,
Control for Van Buren High School.......................................... 151
Table 34. Number of Students Per Performance Band, Fillmore
High School.............................................................................. 152
Table 35. Number of Students Per Performance Band,
Ronald Reagan HS, Control for Fillmore HS ............................ 153
Table 36. Number of Students Per Performance Band,
Johnson MS.............................................................................. 154
Table 37. Number of Students Per Performance Band, Budd,
Control for Johnson MS............................................................ 155
Table 38. Number of Students Per Performance Band, Roosevelt
Middle School .......................................................................... 156
Table 39. Number of Students Per Performance Band,
John McDougall MS, Control for Roosevelt ............................. 157
Table 40. Number of Students Per Performance Band, Lincoln
High School.............................................................................. 158
Table 41. Number of Students Per Performance Band, Latham City,
Control for Lincoln................................................................... 159
Table 42. Number of Students Per Performance Band,
Ford Elementary School ........................................................... 160
Table 43. Number of Students Per Performance Band, Haight ES,
Control for Ford ES .................................................................. 161
Table 44. Number of Students Per Performance Band,
Madison Elementary................................................................. 162
Table 45. Number of Students Per Performance Band,
Pete Wilson ES, Control for Madison ....................................... 163
Table 46. Pre/Post Data for John G. Downey High School, Control
School for Van Buren High School........................................... 164
Table 47. Pre/Post Data for Ronald Reagan High School, Control School
for Fillmore High School.......................................................... 165
Table 48. Pre/Post Data for Budd, Control School for
Johnson Middle School............................................................. 165
Table 49. Pre/Post Data for John McDougall Middle School,
Control School for Roosevelt Middle School ............................ 165
Table 50. Pre/Post Data for Latham City Senior High School,
Control School for Lincoln High............................................... 166
Table 51. Pre/Post Data for Haight Elementary School,
Control School for Ford Elementary School.............................. 166
Table 52. Pre/Post Data for Pete Wilson Elementary School,
Control School for Madison Elementary ................................... 166
Table 53. API Changes for Experimental Schools: Background
for Interviews ........................................................................... 167
Table 54. Lincoln High School Percentages of
Sub-Groups, 2004-2007............................................................ 168
Table 55. Common Characteristics that Make Site Support Teams
Effective as Mentioned in Interviews........................................ 169
ABSTRACT
The purpose of this study was to evaluate school-level interventions
called Site Support Teams in terms of their correlation to student achievement.
Site Support Teams were in use in a school district in northern California.
Comprised of district-level and school-level leaders, the teams were charged
with working together to improve student achievement. Using a quasi-
experimental design, two elementary schools, two middle schools, and three
high schools served as experimental schools whose improvement in
achievement was measured using the English/Language Arts portion of the
California Standards Test for grades 2–11. Improvement rates at each school
were compared to improvements at control schools. The duration of the
intervention at experimental schools ranged from 2 to 5 years. Qualitative data
taken from surveys of teachers at two schools and interviews of 10 Site Support
Team members were used to help interpret the quantitative portion of this study.
Site Support Teams were found to be a successful tool in assisting schools to
improve student achievement when several features were present: sustained
focus, teacher involvement, classroom visitations, and mutual accountability.
CHAPTER 1
PROBLEM IDENTIFICATION
In 2003, the Board of Education of the El Presidente Unified School
District adopted three Bold Goals for student achievement: (a) 100% of
students would score “Proficient” or “Advanced” in math and English/Language
Arts as measured by the California Standards Test (CST), (b) 100% of 12th
graders would pass the California High School Exit Exam (CAHSEE), and (c)
100% of schools would meet annual Adequate Yearly Progress (AYP) and
Academic Performance Index (API) targets. These goals provided clear
directives for district-level and school staff. Although morally essential in
terms of educators’ responsibilities to students, the boldness of these goals,
and the evident gaps that appeared when performance indicators were compared
to the goals, highlighted for staff the stark status of student achievement in
each school in the district.
The district is a large Northern California school district serving more
than 61,000 students and encompassing urban, suburban, and rural areas (Elk
Grove Unified School District, 2007a). The district’s Academic Performance
Index (API) in 2006 was 758 (California Department of Education, 2007a). The
district comprises 61 schools: 38 elementary schools, 8 middle schools, 8
high schools, 1 special education school, 4 alternative education schools, 1
adult school, and 1 charter school (Elk Grove Unified School District, 2007a).
Forty-three percent of the district’s students live in poverty, and 17% are
English Learners (Elk Grove Unified School District, 2007a).
analysis of students along racial lines shows that 29% of its students are white,
22% are Latino, 20% are African-American, 20% are Asian, and 10% are other,
including Filipino, Pacific Islander, and Native American (Elk Grove Unified
School District, 2007b).
Demographic diversity in El Presidente Unified School District is echoed
in diversity of academic achievement. Annual state testing indicates widely
varying student performance across the district.
lowest performing school had an API of 648. Subgroups within schools
similarly demonstrated great disparity of achievement. Such variances were not
new in El Presidente, but had been drawn into a spotlight of growing intensity
over the preceding several years.
In the 2001-02 school year, the district was comprised of 53 schools and
had a total enrollment of 49,970 students. Thirty-six percent of the students
lived in poverty. Twenty-one percent were English Learners. The district’s API
was 720. The district was installing a district-wide student data system that
would yield, over the course of the following two years, instant desktop access
by every teacher and administrator in the district to student directory
information, student attendance information, and various test scores.
Although opportunity gaps, also known as achievement gaps, existed at
all schools, Van Buren High School was the school with the lowest API in the
district among all of the regular education and comprehensive schools. Van
Buren High School’s API in 2002 was 579 (California Department of Education,
2003). The school’s statewide rank of three placed Van Buren in the bottom
third of high schools in the state. Its similar schools rank of six indicated that,
when compared to schools with similar characteristics and demographics, Van
Buren was slightly better than average, albeit in a bleakly performing group. On
the API, there was a 67-point achievement gap between the lowest-performing
subgroup at Van Buren High School—Hispanic students—and the white
subgroup.
In terms of percentages of advanced/proficient AYP data, the African-
American subgroup fared even worse than the Hispanic subgroup. While 22%
of Van Buren High School Hispanic students were advanced or proficient, only
18% of African-American Van Buren High School students achieved this
proficiency level. This was especially disturbing because the African-American
subgroup was also the largest subgroup at the school comprising 34% of the
tested population, while the white subgroup comprised just under 11% of the
tested population.
While Van Buren High School was the lowest-performing school in the
district as measured by CST results, other district schools were low-achieving,
as well. Table 1 shows CST performance of seven of the district’s comprehensive
schools.
Table 1
2002 Achievement Data, Selected District Schools

School                     API   API, White    Lowest-Performing   API of That    % Adv + Prof,   % Adv + Prof,
                                 Subgroup(a)   Subgroup by API     Subgroup(a)    ELA(b)          Math(c)
Van Buren High School      579   609           Hispanic            542            20              15
Johnson Middle School      588   631           Hispanic            546            15               4
Fillmore High School       622   671           Hispanic            566            28              13
Roosevelt Middle School    628   669           African-American    566            22              15
Madison Elementary         643   Not sig.      African-American    620            20              35
Ford Elementary            666   678           African-American    643            23              22
Lincoln High School        715   751           African-American    640            44              26

(a) 2002 API Base. (b) CST English Language Arts (ELA), grades 6, 8, and 10. (c) CST Math, grades 6 and 8 (General Math) and grade 9 (Algebra I).
Source: California Department of Education, 2002.
Data such as those presented above prompted the leadership of El
Presidente to begin looking for deep, sustainable solutions, i.e., support
systems that would help schools dramatically raise achievement among all
students, but notably among students in their lowest-performing subgroups.
The solution that is the focus of this study is the provision of Site Support
Teams to low-performing schools in the district.
As the 2002 state testing scores were being released, and as the 2002-03
school year was dawning, factors converged within the district to bring about a
higher degree of urgency concerning achievement problems. Schools in the
district, always concerned with preparing students to demonstrate rigorous
learning, sensed a growing focus on academic performance on state tests as
these results increasingly became a focus of public scrutiny. Simultaneously,
changes in district-level administration led to new discussions throughout the
district that evolved into the development of the Bold Goals. Finally,
technological systems were being put into place to give easy data access to all
teachers and administrators in the district. Conditions were right for a new type
of intervention to be applied.
As one of two directors in the elementary division in the 2002-03 school
year, I worked as the Director of Curriculum and Professional Learning in the
district. In that role, I served as a regular district support administrator on four
Site Support Teams and on one elementary mini support team. The organization
of site support meetings was under the purview of my department. Additionally,
as the number of Site Support Teams grew, and as permutations of Site Support
Teams occurred due to changing levels of response and achievement, members
of my department saw a need to formalize training for schools that had Site
Support Teams. With considerable resources being put into Site Support Teams,
it was, therefore, important that the district understand the efficacy of Site
Support Teams as structures that foster improved student achievement.
Problem Analysis
In their illuminating approach to achieving peak performance, Clark and
Estes (2002) describe three causes of gaps between desired and actual
performance. These causes are knowledge and skill factors, motivation factors,
and organizational factors. This framework was used as the basis of the analysis
of the underlying reasons for underperformance of schools in the district.
According to the Bold Goals set by the Board of Education, 100% of
students should be proficient or advanced. At some district schools, the actual
performance level was far below this. For instance, at Van Buren High School
in 2002, approximately 23% of students were proficient or advanced, a
77-percentage-point gap between actual performance and the goal of 100%.
Why did students at some schools, such as Van Buren High School, have such
an achievement gap?
Could the cause of the gap be related to the students themselves? In
terms of such student factors, gaps in any of the three areas of knowledge/skills,
motivation, or organizational factors could cause a gap of academic
performance. For instance, it seems self-apparent that a student who was
scoring below basic lacked the knowledge and lacked the skills that a student
scoring advanced or proficient would have. The student might also face a
motivational gap. As an example, he or she may enjoy staying up late each
night playing video games and then be unmotivated to get out of bed in
time to arrive at school and be attentive. The student may also be subject to an
organizational gap in which the family’s need for his or her help at the family
business prevents studying or completing homework at home.
The problem with the student-level gap analysis, of course, is that it is
the school or school system that shoulders the accountability for student
learning. Therefore, the focus of this analysis was the classroom-level, school-
level, or school-system-level factors that accounted for the gap in student
performance.
Knowledge and Skill Factors
Why was it that students had a gap in knowledge and skills as evidenced
above? Did the teachers have a gap in the knowledge and skills area, i.e., did
they lack subject matter knowledge, pedagogical knowledge, or knowledge
about adjusting instruction for their particular students? Did the site
administration have a knowledge and skills gap? Did the principal know how to
lead instruction? Did he or she know how to monitor practice to ensure that
students were learning? Did he or she have the skill to work with teachers in such
a way that brought out the best in instructional practices? Did the district, as a
collection of administrators, have a knowledge and skills gap? Did
administrators know how to guide instruction and how to ensure uniform
adherence to desired curriculum and effective instruction?
In the mid- to late-nineties, reports of 90-90-90 schools began to change
the outlook of the education world which had a history of accepting as inevitable
negative academic associations with poverty and race. Reeves (2004a) wrote
about his own work in 1995 on the 90-90-90 schools. These were defined as
schools with 90% poverty (free-and-reduced-lunch qualifiers), 90% minority
enrollment, and 90% of students meeting standards. Staff in these schools
achieved strong results with students similar to those in the district.
commonalities of practice in these 90-90-90 schools. All five practices (focus
on academic achievement, clear curriculum choices, frequent assessment,
emphasis on nonfiction writing, and collaborative scoring of student work)
imply knowledge and skill on the part of teachers and administrators. Clearly,
the teachers and administrators in these locales had the knowledge and skill to
effect strong student learning. One must question whether district teachers have
the same sets of knowledge and skill.
Motivation Factors
As one reads Pintrich and Schunk (1996), a frame for looking at
motivation arises out of social cognitive theory. Pintrich and Schunk organize
motivation not only into success-orientation and failure avoidance, but also
consider the incentive value of success in which the pride of accomplishment is,
in itself, motivational. The expectancy construct may be very relevant to the
problem in district schools and may be a factor contributing to low test scores.
Pintrich and Schunk (1996) review many pieces of research on outcome
expectancy. They define outcome expectancy as individuals’ judgments or
beliefs regarding the contingency between their behavior and its outcomes. The
expectancy construct may apply to the problem on the teacher level, or it may
apply to either the total population of students at underperforming schools, or to
sub-populations of students.
Self-efficacy, i.e., individuals’ judgments of their own capabilities in
doing what it takes to achieve a desired outcome, is a factor that may affect not
only students at low-achieving schools, but teachers as well. If the self-efficacy
of teachers is low in terms of their beliefs about their abilities to teach students
on grade level, the impact on student achievement, logically, could be
significant.
Although data on the outcome expectancy and self-efficacy of teachers
and students in the target schools were not available, district-level observers at
these schools, e.g., administrators and instructional coaches, anecdotally claim
that these factors have been issues that contribute to the problem of low
academic performance. As an example, instructional coach Jo Elbert recalls her
initial impressions when she was first assigned to Van Buren High School after
previous service only in elementary grades (J. M. Elbert, personal
communication, February 10, 2007).
In the early days the first thing that was very apparent to me was the lack
of student engagement in their learning. There was a lack of
connectedness between most students and many of their teachers. The
expectation level was extremely low. Most teacher attitudes reflected an
air of "I've taught it; it's their responsibility to learn it.... It's not my fault
if they aren't paying attention or doing their homework.... They're old
enough to know what they're responsible for.... These kids just don't
care.”
The underlying beliefs in my opinion are reflected in the comments and
attitudes above. The level of teacher efficacy was so low there were
frequent times where I wondered why some of the teachers went into the
profession. It was a very teacher-centered atmosphere that really had
little to do with students or their learning.
Because of years of low-student achievement on multiple measures of
performance (CST, SAT/9, SAT/ACT, GPA, course failure rates, etc.),
the motivation of the staff was reflective of the students. In my opinion
this was a part of the reason why students lacked motivation. While it is
certainly understandable that staff morale would be low given the lack of
student achievement, it was also clear to me that very little had been
done to support the teachers in their efforts (some better than others) to
try to increase their students’ level of understanding and performance.
Another layer of the expectancy construct lies in the arena of site
administrators’ expectations for teacher performance. Do the principals at these
schools demand, support, and monitor first-rate instructional practices?
Organization Factors
The work that goes on in each classroom every day, the backdrop for all
interactions between teachers and students, occurs within the context of the
organization. If students in some schools of the district are not learning and
performing academically to the extent that the district would prefer, one must
look at the context of school-level and district-level organization. Former
Program Director of WestEd’s Western Assessment Collaborative Kate
Jamentz (2002) writes about this context of learning in her article about
supporting teachers for improved student performance.
But, irrespective of what might be printed in any documents, the real
standards in a school or district are evident in the quality of work
expected and/or tolerated of students and of teachers in their day-to-day
work. (Jamentz, 2002, p. 51)
Relevant questions, then, to the problem of academic underperformance
of students include questions of school expectations, structures to monitor and
support those expectations, and district structures that define, support, and
monitor the quality of day-to-day classroom work. Reeves (2004a) advises that,
in the 90-90-90 schools, structures are in place to allow for frequent teacher
collaboration. Elmore (2000) and Reeves (2006) cite the need for distributed
leadership. Certain district schools lacked the structures of expectations,
monitoring, collaborating, and sharing leadership. Elmore (2002) also describes
the organization’s capacity to respond to knowledge and skill needs of teachers
as critical to success. He refers to a capacity for effective professional
development. Considering these factors of best structures and best district
practices, there was an organizational gap in our district.
Problem Solution
Mac Iver and Farley (2003), after reviewing the field of research on
district-level role in improved student achievement, concluded that “central
office administrators are crucial to the school improvement process” (p. 3).
Then-Associate-Superintendent Marcus Herrera first identified Van Buren High
School as a school in need of direct and intense intervention in the fall of 2002.
He formed the first Site Support Team that met every other week at Van Buren
High School.
Indeed, the concept of the Site Support Team mimics the technical
assistance that is provided to schools in California that are in Program
Improvement status. In a 2002 letter to County and District Superintendents
then-State Deputy Superintendent Joanne Mendoza (January 23, 2002) outlined
new information for California education leaders concerning help for under-
performing schools. Schools that had failed to meet AYP targets for two
consecutive years were declared to be in “Program Improvement.” Leaders of
Program Improvement schools were advised by Ms. Mendoza (personal
communication, January 23, 2002, p. 2) to “work closely with their districts and
utilize resources and funding available to implement interventions that will
result in school improvement and the academic progress of all students.” Other
sections of this letter outlined for superintendents the progressive nature of
Program Improvement in a manner consistent with that stipulated by the federal
government under the No Child Left Behind Act of 2001 (2002). The United
States Department of Education defines and describes the actions for schools
that have failed to meet AYP. For each year of required school improvement,
the school “must receive technical assistance that enables [the staff] to
specifically address the academic achievement problem that caused the school to
be identified for improvement” (Paige, 2002, p. 4). In essence, the concept of the
Site Support Team was an attempt to effect improved achievement and to
forestall program improvement by providing additional technical expertise and
support to schools before they failed to meet AYP two years consecutively.
The Site Support Team consisted of district-level administrators: an
associate superintendent and directors of key areas such as research, curriculum
and professional learning, special education, and learning support (categorical
programs). These individuals met on a regular basis with the school team:
principal, vice principals, instructional coach, and other site-specific relevant
personnel. Over time, and at a pace dependent on the school's culture, teachers
became involved, rotating into the Site Support Team meetings. The teams
spent one to three hours focusing on student needs at the school. At times,
the school staff presented and explained initiatives and programs that were being
implemented. At times, student data were presented. Often the site visit
included "walk-through" visits of numerous classrooms, with mixed teams
(district office and site personnel) visiting four or more classrooms each. When
classroom visits occurred, they were followed by some form of group debriefing of
the informal evidence and findings from the observations. The site team decided
how these data would be provided to the entire staff. These cumulative observations
resulted in site-level intervention or professional learning, and/or the visits
generated questions that became the focus for the next agenda.
Leithwood (2004) considers "providing instructional guidance" as "an
important set of leadership practices in almost all districts and schools aiming to
improve student learning" (p. 16). This instructional guidance is a primary
objective of the site support team.
Site Support Teams are structures that assist in the monitoring and
support of day-to-day academic progress. Additionally, as prescribed by
Jamentz (2002) and Reeves (2006), they are structures that both exemplify and
promote collaboration and feedback. At the core of the Site Support Team is
mutual accountability. Teachers become accountable to administrators and to
peers, which prompts them toward the ultimate accountability: that to their
students. Through the Site Support process, site administrators become
accountable to both their teachers and district-level administrators. District
administrators are accountable to both the teaching staff and the site principal.
Teacher accountability typically is exemplified through collaborative planning,
pacing and methodological adjustments of instruction, and the design of more
effective lessons. The accountability of site administrators can include the
adjusting of schedules, the provision of collaboration time, and the
dissemination of feedback and other important information. At the district level,
accountability in terms of a Site Support Team could include visibility and
interactions with staff, the provision of specifically requested data, and,
occasionally, policy interpretations that would better accommodate site-
identified needs.
During that first meeting of a Site Support Team with key staff members
of Van Buren High School, an approach was laid out to begin to address the
knowledge/skills gap. As recorded in the minutes from that meeting, coaches
requested to visit classrooms without administrators, and they were told when
and how to provide information about effective instructional strategies to staff.
The group identified Marzano's Classroom Instruction that Works (Marzano,
Pickering, & Pollock, 2001) as the starting point for addressing what was,
essentially, the knowledge/skills gap. Moreover, the conception of the Site
Support Team and the work upon which it was embarking exemplify several of
Marzano's (Marzano, Waters, & McNulty, 2005) 21 characteristics of effective
school leadership, including monitoring and evaluating; knowledge of
curriculum, instruction, and assessment; focus; and serving as a change agent.
By the end of the 2002-03 school year, district-level leadership in the
district saw promise in the effects of Site Support Teams. Van Buren High
School's 2003 API was 616, a 37-point increase. Its statewide and
similar-schools rankings moved up to 4 and 9, respectively. Due to the success that
was perceived to be a result of the Site Support Team, Site Support Teams were
added to other schools. Table 2 shows the years in which various district
schools became Site Support Team schools.
Purpose, Design and Utility
Purpose
The purpose of this study was to evaluate the efficacy of Site Support
Teams in the El Presidente Unified School District in terms of their impact on
improved student achievement. A basic question of interest is whether schools
with Site Support Teams improved their student achievement and, in fact,
whether they improved student achievement more than would be expected of
any school, i.e., more than similar schools without Site Support Teams.
Table 2

Years in Which Schools Became Site Support Team Schools
(School, onset date of intervention, and notes regarding documentation of onset)

Van Buren High School (October 9, 2002): Minutes exist from the first meeting.

Fillmore High School (2003-04): Fillmore HS is listed on Site Support Team
planning grids from 2003-04. Many original members have retired, have left
the district, or, in one case, have died. In spite of Fillmore's appearance on
planning grids dated from fall of 2003, the only agenda or minutes that could
be found were from May 2004. Two vice principals, the instructional coach,
the principal's secretary, the former coaching coordinator, and two reading
specialists were consulted regarding the date of onset. The strongest evidence
suggests that the team began during the 2003-04 school year.

James Roosevelt Middle School (2003-04): The former principal attests to this
start; moreover, the school was on district planning grids.

Samuel Johnson Middle School (January 2004): On the planning grid for
2003-04; confirmed by an E/LA program specialist who was on the first four
teams, including Johnson's.

Madison Elementary School (Fall 2005): The author was on this team at the
outset; existence confirmed by planning grids.

Ford Elementary School (Fall 2005): The author was on this team at the
outset; existence confirmed by planning grids.

Lincoln High School (January 2005): The same principal had been at the
school since 2001 and had notes regarding onset.
A related area of inquiry in this study was to consider the resources spent
on Site Support Teams. The primary resource considered was the time of
personnel. A descriptive summary of time spent was juxtaposed against the
desired improvement as described above.
This study provided both summative and formative evaluations. The
former was determined by comparing the achievement of the experimental
schools (those with Site Support Teams) with that of similar schools without
Site Support Teams, i.e., the control schools. The latter was helpful to district
administrators as they embarked on decisions to continue, to modify, or to
abort the Site Support Team effort.
Design
The primary method used in this study was a pre-post quasi-
experimental design. One similar school was identified for each of seven Site
Support Team schools. The similar schools were selected from the California
Similar Schools list. Schools within that list were selected that seemed to mirror
the demographics of the experimental schools. English-learner populations,
percentages of poverty, and relative location (urban, suburban, rural) were
important factors in selecting schools that were truly similar. The rates of
improvement of these schools, measured by California STAR test results, were
compared over two or more years' time, the number of years of outcomes
depending upon the number of years’ involvement in the experimental school’s
Site Support Team.
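The gain comparison at the heart of this design can be sketched with a toy calculation. The figures below are hypothetical (only Van Buren's 37-point API gain appears in this chapter), and the function name is the author's illustration, not part of the study:

```python
# Hypothetical sketch of the pre-post quasi-experimental comparison:
# each Site Support Team (experimental) school is paired with a similar
# (control) school, and their gains over the same period are compared.

def gain(pre, post):
    """Improvement from the pre measure to the post measure."""
    return post - pre

# Illustrative scores only; the control school's figures are invented.
experimental = {"pre": 579, "post": 616}  # mirrors Van Buren's 37-point gain
control = {"pre": 580, "post": 595}

difference_in_gains = gain(**experimental) - gain(**control)
print(difference_in_gains)  # 22: the experimental school outgained its match
```

A positive difference in gains for a matched pair suggests, though does not prove, an effect beyond what a similar school achieved without the intervention.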
Surveys and interviews of key players at the Site Support Team Schools
were used to capture participants’ thoughts about why the Site Support Team
seemed to be effective or why it might have had less of an impact than expected.
Focus. While the focus of this study was quantitative (growth of student
achievement at the experimental schools and a comparison of post scores for the
experimental and control schools), qualitative methods were also used. Interviews
and surveys of selected administrators and teachers who had been participants
on Site Support Teams helped to explain positive quantitative outcomes, if
found, and to build understanding of the influence of Site Support Teams.
The qualitative aspects of this study should help district administrators to more
deeply understand the results of the research question, "Is there a relationship
between Site Support Teams and improved student achievement?" as they
provide information about implementation variations that seem to be more or
less helpful.
Unit of analysis. The unit of analysis for the quantitative portion of this
study was the California Standards Test (CST) data. Performance band scores
(numbers of students at the advanced, proficient, basic, below basic, and far
below basic levels) at the whole-school level and at the disaggregated level were
used. Additionally, Adequate Yearly Progress (AYP) data (proficient or not
proficient) were compared. Finally, a third dependent variable, the percentage
of students scoring basic or above, was also analyzed. All of these were
considered for a minimum of two years, depending upon the longevity of the
Site Support Team. Time boundaries were between 2002 and 2007. The school
with the greatest Site Support Team longevity was Van Buren High School.
Data for this school were taken from 2002-2007.
Sampling and sources of data. In addition to STAR scores from the
state-wide data bank, interviews and surveys were used to sample Site Support
Team teacher and administrator participant groups. Individuals selected for
study were those in leadership roles who were, or who have been, a part of a Site
Support Team and who were still available to offer a perspective.
Utility
Information gleaned from this study is useful to cabinet-level
administrators in the district. Given the time and resources invested by the
district in the Site Support Team structure, it is important for the superintendent
and his cabinet to know if these structures could be linked to student
improvement gains and to the ability of the district to keep schools out of
Program Improvement status. At a more specific level, the interpreted results of
this study yield insights as to variations and components of the Site Support
Team structure that seem to be the most useful. These data would help the
district to either replicate these portions of Site Support Teams at other schools
or to reduce some of the resources going into Site Support Teams without
undermining their efficacy if, in fact, efficacy is indicated. Therefore, the study
results will assist district administrators to know whether Site Support Teams
should be replicated, modified, or eliminated as the staff of the district pursues
its Bold Goal of 100% proficiency for all students.
CHAPTER 2
FOUNDATIONS OF SITE SUPPORT TEAMS
Site Support Teams are groups of district-level officials with
complementary areas of expertise that work together with school site teams to
analyze, to make recommendations, and to spur improvements in practices that
will effect improved student achievement. The concept of the Site Support
Team rests upon a foundation of theories founded in psychology and sociology
and backed by specific research in the field of education. Foundational
psychological theories are those of self-efficacy and motivation. In the field of
education, related seminal research is that of Marzano (Marzano, Pickering, &
Pollock, 2001; Marzano, Waters, & McNulty, 2005), Elmore (2000), and others
who have researched effective
practices in instruction and in professional development. An emphasis on
accountability is among those practices and is consistent with overall national
tone set by the No Child Left Behind Act of 2001 (United States Department of
Education, 2002).
What is involved in stimulating changes in instructional practice in order
to improve the learning, i.e., achievement, of students? Various educators and
educational change theorists might offer varying responses to this question. In
the simplest sense, Site Support Teams operate under familiar notions: several
heads focused on a problem are better than one; shared expertise focused on
that same problem yields a synergy of ideas and solutions; and repeated
meetings of the same group of people working on the same problem bring
about increased focus and accountability of action. Translated into terminology
more common in the academic world, Site Support Teams are undergirded by
notions of self-efficacy, motivation theory, and accountability, as well as best
practices of leadership and instruction.
In sum, the support that Site Support Teams give to schools rests on
beliefs in these areas:

Self-efficacy: Does the administration team feel capable of taking the
steps needed to bring about the changes that will effect improved student
achievement? Do teachers believe that their students can learn, and learn at the
level of the standards? Do teachers believe that they are capable of teaching
students to achieve at higher levels? Do students believe that they are
successful learners?

Motivation: Are administrators motivated to do what needs to be done to
effect improved student achievement? Are teachers motivated to make the
changes necessary to effect improved student achievement?

Accountability: Are systems in place that keep teachers accountable for
progress in student achievement? Are systems in place to keep administrators
accountable for progress in improved student achievement?
Self-Efficacy
What do teachers expect of their students? In other words, what is it that
teachers expect that their students will be able to learn and to do? To what
degree do teachers believe that their students will be able to learn challenging
material, engage in and apply rigorous thinking, and perform academic tasks to
an astute level of precision and specificity? These questions are aspects of a
critical question in education—that of teacher- and student-efficacy.
Tschannen-Moran, Woolfolk Hoy, and Hoy (1998) reviewed 23 years’
worth of studies on the question of teacher self-efficacy. They began by
defining teacher efficacy as the extent to which the teacher believes he or she
has the capacity to affect student performance. They also address the notions of
teachers’ beliefs and convictions that they can influence how their students will
learn, even students who may be challenging or unmotivated. The articles
reviewed included studies of the correlation between teacher self-efficacy and
teachers' likelihood to use varied instructional methods, tendencies to work
longer and harder preparing academic lessons, and subject-specific correlations
such as those between teacher efficacy and pedagogy in science. Tschannen-
Moran, Woolfolk Hoy, and Hoy (1998) acknowledged the difficulty that arises
in measuring teacher efficacy, as it derives from inferences based upon self-
reports. Notwithstanding, the research review found consistently positive
correlations from study to study regarding the far-reaching effects of teacher
efficacy on a variety of teacher behaviors that are associated with increased
likelihood of student learning. These behaviors include those of implementing
varied methodologies, persisting even when presented with obstacles,
responding with resilience to instances of failure, and coping behaviors when
faced with circumstances that might lead to depression or to stress. The size and
breadth of this review of decades of studies lend credibility to the notion
that teacher efficacy is an influential factor in classroom outcomes.
In concluding sections of their review, Tschannen-Moran, Woolfolk
Hoy, and Hoy (1998) addressed studies by Hoy and Woolfolk (1993) and Lee,
Dedick, and Smith (1991) that link principal behavior to teacher efficacy. Some
of the work in this area is indirect, such as links between how teachers perceive
their principal’s influence in the district and teachers’ own efficacy. No
replications of this study were described, and one might question the relevance
of such an association given its distance from effect on students. However,
other studies consistently link to teacher efficacy certain behaviors of principals
that are more directly tied to the school climate such as those of enforcing order
and discipline or behaviors of emphasizing to teachers their powers of influence
on learning.
In sum, the work of Tschannen-Moran, Woolfolk Hoy, and Hoy (1998)
provides a comprehensive review that leads to a strong conclusion that teacher
efficacy matters. Teacher efficacy is linked to a number of teacher behaviors
that seem intuitively connected to student learning. Their work reinforces that
done a few years earlier by Bandura (1993).
Bandura of Stanford University is a noted expert in the area of self-
efficacy. Bandura (1993) organized the studies related to self-efficacy into four
major processes representing the diverse ways in which self-efficacy affects
cognitive development and the general functioning of human beings. Bandura
asserts that individuals’ self-perceptions influence their cognitive processes,
their motivational processes, their affective processes, and their selective
processes representing cognitive, psycho-social, emotional, and behavioral
domains.
In terms of cognitive processes, Bandura (1993) reviews studies of the
effects of strong and weak perceptions of self-efficacy on learning and on
performance in a variety of different tasks. For instance, he cites work by a
Stanford student by the name of Collins (1982) who presented her unpublished
dissertation at the annual meeting of the American Educational Research
Association. This particular study analyzed the performance in mathematics of
high-ability, medium-ability, and low-ability math students as plotted against
their respective self-efficacy. Although the author could not find information
about the size of this study or the details of its methodology, it is noted that
Collins concluded that positive attitudes toward math were better predicted by
perceived self-efficacy than by actual ability.
Bandura cites his own studies that have led to his conclusions that self-
efficacy is positively associated with persistence to tasks and with task
accomplishment, even as the difficulty level of the task increases. He also
draws positive causal links between self-efficacy and resilience and between
self-efficacy and performance on memory tasks.
McLaughlin and Talbert (1993) describe a response that is seen when
teachers believe that their current sets of students will not be able to access the
curriculum at a high level:
Teachers who lower their expectations for today’s students often water-
down curriculum. Often, this retreat from traditional standards and
academics represents a well-meaning attempt to structure a supportive
classroom environment…. Regardless of teachers’ rationale, both
teachers and students in classrooms of this stripe find themselves bored
and disengaged from teaching and learning. (McLaughlin & Talbert,
1993, p. 6)
Pintrich and Schunk (2002) assert that the self-efficacy of individuals,
whether teachers or students, affects their choices of activities, the degree of
effort they will invest, and the degree of persistence they afford to the work.
They further discuss the relationship between teacher self-efficacy and student
achievement, concluding that teacher self-efficacy is a significant predictor of
student achievement. Interestingly, Pintrich and Schunk (2002) call upon
Bandura's (1997) work in the area of collective efficacy. They found that
teachers' group efficacy was associated with their commitment to tasks, their
resilience in the face of difficulty, and their common focus on a goal.
Pintrich and Schunk's (2002) work is a theoretical review discussing a
large number of studies in the area of self-efficacy conducted over decades. In
some cases, Pintrich himself was the researcher.
In the early 1990s, McLaughlin and Talbert (1993) studied teachers who
either had retained or had returned to a sense of high self-efficacy for effecting
high achievement in their students. They found that, when the conditions of
teaching and the challenges of students' backgrounds proved difficult, many
teachers were not able to sustain a high level of self-efficacy and performance
over time, resulting in significant frustration on the part of the teacher. In their
work they found that those teachers who were able to maintain
strong senses of self-efficacy were part of active professional communities that
bolstered each other and gave mutual support. Not surprisingly, McLaughlin
and Talbert found that teachers at the same high school could experience
enormously divergent contexts and levels of support and performance depending
on the culture and norms of the subject-area department to which they belonged.
Therefore, the professional level of support, and the norms and expectations, can
vary at the department level within a school, at the school level, and at the
district level.
McLaughlin and Talbert’s research (1993) also analyzed state-level
support of professional communities as it affected teachers’ adaptation to
reforms. In sum, in order to effect change in instructional practices, the optimal
context would be to have a deep and broad professional community that
supported the teacher at the local levels of department and school; at the district
level in terms of trust, resources, and support; and at the state level in terms of
direction and policy supportive of professional, reform-oriented communities.
The research was conducted over three years and involved two states, Michigan
and California; seven districts; 16 schools; and surveys of nearly 900 teachers.
One might ask whether the relevance of this study could be called into question
given the 15-year interval. The author believes that the findings are still highly
relevant today; one supportive rationale is the popularity within the national
education community of professional learning communities which may, in fact,
have been an outgrowth of this earlier research. In terms of study design,
changes in student outcomes and classroom observations verifying actual
practice were not considered. However, the continuing relevance of the study
rests in the area of teacher self-efficacy, the question of a teacher’s beliefs about
his or her ability to produce student learning, as this is a foundational context for
the curriculum, instructional strategies, and measurements that may be layered
upon it.
Bandura discusses personal as well as collective self-efficacy among
members of a teaching staff. He found that "Teachers who lack a secure sense of
instructional efficacy show weak commitment to teaching and spend less time
on academic matters” (Bandura, 1993, p. 134). He cites work by Woolfolk and
Hoy (1990) who noted that teachers with a low sense of instructional efficacy
focus more on control and management issues in the classroom and tend to rely
on negative sanctions for students. In contrast, teachers who have strong senses
of instructional efficacy support the development of students’ intrinsic interests
and academic self-directedness. In this particular study, the researchers studied
a population of prospective teachers and, therefore, the findings may or may not
generalize to the population of all teachers.
Teacher self-efficacy seems to be a central principle in the work to help
under-performing students to achieve. Given its importance, Hoy and Woolfolk
(1993) studied the question of factors that might help to promote strong senses
of self-efficacy in teachers. While this study could not, by design, demonstrate
causes of self-efficacy, the researchers surveyed a fairly large sample of 179
teachers and subsequently performed correlation and regression analyses to
examine the correlations among variables. Hoy and Woolfolk studied patterns
associated with the self-efficacy of individual teachers rather than the aggregate
at the school level.
The findings of this study indicated that a number of factors were
correlated with the self-efficacy of individual teachers. The statistically
significant (in some cases, p < .01, and in some cases p < .05) factors that
showed the strongest relationship to teacher efficacy were principal influence,
an academic emphasis in the school, and the teacher’s education level. Multiple
regression analyses were performed in order to discern the individual effects of
each of several variables on teacher efficacy. The results revealed that principal
influence (r = .26), academic emphasis in the school environment (r = .23), and
the education level of the teacher (r = .21) were all statistically significant
factors with modest positive correlation coefficients. In sum, a strong, assertive
principal who spoke on behalf of the teachers, an environment that valued
academics, and a higher level of teacher education were all associated with
teachers' beliefs that they could successfully teach and motivate even the most
challenging students. Interestingly, teacher experience was mildly negatively
associated with teachers' sense of teaching efficacy.
The effect sizes in this study were not particularly large for individual
factors. However, the combination of several factors such as principal
influence, education level, and academic emphasis in the environment yielded a
multiple correlation of R = .35. Therefore, approximately 12% of the variance
in teacher efficacy (R² = .35² ≈ .12) could be attributed to the combination of
these factors.
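The 12% figure follows from squaring the multiple correlation coefficient. A minimal arithmetic check (the function is the author's illustration, not from Hoy and Woolfolk):

```python
# The proportion of variance explained by a set of predictors is the
# square of the multiple correlation coefficient R.
def variance_explained(r):
    """Return R squared, the proportion of variance explained."""
    return r ** 2

# Hoy and Woolfolk (1993) report a combined correlation of R = .35.
print(round(variance_explained(0.35), 4))  # 0.1225, i.e., about 12%
```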
In the study above, the 179 teachers were all at the elementary level in
New Jersey. They represented 37 schools. Twenty-seven out of the 37 schools
were above average in terms of wealth. In this case, wealth meant resources
available to the school rather than the students’ socio-economic level. Given
these parameters of the study, the research outcomes may not generalize to
teachers in all schools, in all states, and at various levels.
Does teacher efficacy affect student academic self-efficacy? Bandura
(1993) cites work by Midgley, Feldlaufer, and Eccles (1989). This was a
narrowly focused study analyzing students' changes in self-efficacy in
mathematics at the time of transition from elementary school to junior high
school. In spite of the narrow focus, the sample size was 1,329 students. The
researchers found that students of teachers with low senses of self-efficacy lost
self-efficacy and increased self-doubts at this critical juncture in academic life.
Collective teacher efficacy in any given school contributes to the beliefs
held within the school culture as a whole. These shared beliefs become a
normative part of the school culture and rest on social cognitive theory and
interactive group dynamics. Basically, teachers are socialized to act within the
bounds of the social milieu of the school. Goddard, Hoy, and Woolfolk Hoy
(2000) studied this collective teacher efficacy and its effects on student
achievement. In a complex study conducted in urban school districts in the
Midwest, the researchers first developed, tested, and piloted a scale to measure
collective teacher efficacy. After finding the instrument to be reliable and valid,
the researchers studied the teachers’ scores on the instrument and compared
them to students’ achievement scores in math and in reading. Forty-seven
elementary schools were involved in the study. These schools represented
thousands of students. In comparing schools, Goddard, Hoy, and Woolfolk Hoy
(2000) found that a one-unit increase in collective teacher efficacy was
associated with an increase of more than 40% of a standard deviation in student
achievement. The authors asserted that principals who wanted to increase
student achievement scores could do so by effecting improvements in collective
teacher efficacy. Of the four ways hypothesized to do that (providing a mastery
experience, providing a vicarious experience, using social persuasion, and
creating affective states), the researchers recommended the strategy of providing a
mastery experience. Certainly, if teachers change instructional practices, student
learning is likely to increase. This increase in student performance will be noted
by the teachers who will then feel more control and influence over students’
academic outcomes.
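The reported association, roughly 0.40 standard deviations of achievement per unit of collective efficacy, can be made concrete with a back-of-the-envelope conversion. The standard deviation of 50 scale points below is an invented example, not a figure from Goddard, Hoy, and Woolfolk Hoy (2000):

```python
# Translate an effect expressed in standard-deviation units into raw
# score points: gain = (change in efficacy units) x (effect per unit,
# in SDs) x (SD of the achievement measure).
def predicted_gain(delta_units, effect_sd, score_sd):
    """Predicted achievement gain, in score points."""
    return delta_units * effect_sd * score_sd

# One-unit efficacy increase, ~0.40 SD effect, hypothetical SD of 50 points.
print(predicted_gain(1, 0.40, 50))  # 20.0 score points
```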
In a literature review, Bandura (1989) bridged self-efficacy and
motivation. He summarized research that he and others had
conducted, noting that individuals with high self-efficacy remained motivated,
persisted in difficult tasks, and could overcome temporary lapses of confidence
that might follow disappointing performance. While this article reviewed
studies in which the general public had been studied rather than teachers per se,
Bandura’s conclusions logically have strong implications for teachers in that
teachers with strong senses of self-efficacy are likely to be able to remain
effective in schools with challenging student populations. One might also infer
from Bandura that these teachers are likely both to stay motivated to invest
energy in their work and to extend positive expectations to their students.
Pajares (1996) offers a thorough review of the literature regarding self-
efficacy in education. He points to the overlapping and often unclear lines
among terms such as self-efficacy, self-concept, and confidence. The nuances
amount to fine distinctions, with self-efficacy usually being a much more task-
specific or skill-specific term. However, the psychological principles related to
determinants and fluctuations in self-concept, self-efficacy, and confidence are
highly similar. Pajares addresses the notion of reciprocity of motivation and
self-concept. He points out that it is possible that increased performance or
increased achievement results in increases in self-concept (or self-efficacy) just
as improved self-efficacy can result in increases in achievement or performance.
Pajares also raises the logical extension that a motivated teacher with beliefs of
self-efficacy can also effect improvements in students’ self-efficacy, motivation,
and performance. Pajares synthesizes this discussion by asserting the known
and complex relationships between behavior and human motivation. Motivation
is looked at more closely below.
Motivation Theory
Teaching at a Title I school, with its students’ emotional and learning
needs, can drain staff. What factors counteract this drain and help teachers to
stay motivated in their commitments to teaching, to their students, and to
professional and instructional excellence? In terms of
psychological theory applied to employee management, Frederick Herzberg
(2003) founded the Motivation-Hygiene Theory. In January 2003, the
Harvard Business Review reprinted Herzberg’s article “How Do You Motivate
Employees?” The motivators that Herzberg cited included giving people added
responsibility, increasing accountability by removing some controls, and
enabling people to work on new, more difficult tasks. These factors tended to
enrich employees’ work lives and caused a corresponding surge in motivation.
Are there aspects of the Site Support Team that result in this type of motivation
for site administrators and teachers?
Herzberg’s motivation theory was exemplified in a case study that
recently appeared in Educational Leadership, the publication of the Association
for Supervision and Curriculum Development (September 2007). The article
describes a turnaround by Curtis Middle School in San Bernardino. Data posted
on the California Department of Education website (2007b) verifies that the
school progressed from an API of 445 in 2002 to 628 in 2007. The school’s
statewide and similar-schools rankings went from 1 and 2, respectively, in 2002
to 2 and 8 in 2006. Curtis Middle School’s success rested in the recognition and
autonomy given to the highest performing teachers at the school who led their
peers into a productive new culture. Specific aspects of the change in culture
were standardized strategies, peer-to-peer visitations, and common lesson
designs and assessments. The peer visitations and the reporting to peers of
results of common assessments brought about a strong sense of accountability to
colleagues. However, the teacher-led nature of the work most likely improved
teacher self-efficacy and increased motivation. Teachers saw their work as
collaborative, social, relevant, and efficacious. These are key contributors to
motivation.
The article cited above was a brief report. Without the fullness of a
rigorous case study, the reader is left with many questions. The author asserted,
for instance, that morale among staff was much higher than in pre-intervention
years, yet no data were shared to support this. Moreover, the school’s ranking
on the similar-schools scale rose remarkably from 2 to 8.
However, the students’ performance as measured by the standards test still
shows that the students were under-performing when measured against
standardized criteria. Test results for Curtis yield only a “2” in statewide rank;
additionally, the API of 628 is still a far cry from the proficiency expectation of
800. From 2006 to 2007, the API boost was only 2 points. Was the trajectory of
progress sustainable? Were there other background factors related to the
particular individuals who were leaders relevant to the school’s success? Given
the context of the school’s work, and the particular individuals involved, would
that type of improvement be replicable in other settings?
In spite of uncertainties about this report, the structure that the school
applied was sound in respect to current thinking in the field about best practices.
The faculty identified areas of instructional practice to which all members
committed for implementation into practice. They then built in accountability
through peer visitations and teacher support through time for collaboration.
These are practices that could inform other school reform efforts.
The concept of Site Support Teams as units of intervention rests, in part,
on the notion that the team members can prompt motivation by school staff to
improve their practice for better student learning. Spillane (2002) studied
teacher learning and changes in teacher practice in nine school districts in
Michigan. In a complex, mixed-methods study that was based upon 5 years’
worth of data, Spillane analyzed degrees of implementation of reformed
teaching methods that were aligned to the then-new mathematics and science
standards. One central question was whether certain perspectives and sets of
beliefs held by key district leaders in curriculum and instruction might be
associated with deeper or more shallow implementation of the new policies by
teachers. Qualitative methods included the gathering of 165 interviews as well
as classroom observations and surveys.
Spillane (2002) found a correlation between teachers who had deeply
implemented reforms and central office leaders who had beliefs about teacher
learning that were centered in a situative-sociohistoric perspective rather than in
a behaviorist or cognitive perspective. Extrapolating from interview data,
Spillane found that a situative-sociohistoric perspective held by the district was
manifested in a view of the teacher as an active learner within a social context.
Peer encouragement, reflective practice, and ongoing opportunities for discourse
about instruction were key motivators of teacher change. Although the term was
not in common use at the time of the study’s data collection, teachers who had
opportunities to become communities of learners improved their practice more
thoroughly and deeply than teachers in districts with a more traditional,
behaviorist approach or in districts characterized by a cognitive, individually
oriented approach to teacher learning.
There are acknowledged limitations to Spillane’s (2002) study. These
include the absence of a true experiment, the ultimately small numbers of
teachers who showed deep improvements, and selection of study districts based
upon reputation rather than on more objective variables such as multiple student
outcomes measures. Nonetheless, the overall size of the study and the depth of
the qualitative portion suggest that the study should be considered as the district
weighs its Site Support Teams and other improvement efforts. In the vein of
Spillane’s study, does the school district convey and support teacher-to-teacher
reflection on practice, collaborative planning, and group lesson study? These
may be valuable components of teacher motivation.
Bandura’s work (1993) on self-beliefs of efficacy is significant to the
topic of staff motivation. Bandura cites his own studies and those of others that
support the conclusion that causal attributions—attributions related to
individuals’ self-efficacy—are linked to motivation. One of the studies Bandura
cites is that of Chwalisz, Altmaier, and Russell (1992). In this study of teacher
burnout among 316 public school teachers, Chwalisz and colleagues found
interrelationships among causal attributions (whether success or failure is
attributed to ability or to degree of effort), self-efficacy, and coping. Teachers
with higher self-efficacy attributed outcomes to their own degree of effort; these
attributions were associated with coping strategies and led either to increased
motivation or, in cases of lower self-efficacy, to burnout.
While by definition this study did not analyze a random cross-section of
teachers, it is one of numerous studies cited by Bandura which lead to the same
general conclusions in spite of focus areas of varying populations and
methodologies.
Pintrich of the University of Michigan spent a career studying motivation
theory. In an article on student motivation, Pintrich (2003) reviewed current
knowledge in the field, categorizing the research while advocating for studies
that are both strong in scientific method while also being highly applicable to
practice. Pintrich organized the research studies and language within
motivational science into five categories. He then listed design principles,
essentially recommendations for practice, for each of the five categories. One
might argue that Pintrich’s synthesis draws unsupported inferences in that he
coalesces dozens of studies taken in varied contexts into parsed generalizations.
Specific studies that contribute to the body of the information he
discusses may not be generalizable. However, given Pintrich’s (2003) depth of
knowledge and earned stature in the field, one can trust that Pintrich’s
generalizations and categories are, at least, based upon well-reasoned insight
borne out of depth of understanding. One might further extrapolate that the
motivational generalizations that he cites, as well as the corresponding design
principles, could apply to both students and to the teachers who design learning
experiences for students. Pintrich’s ideas apply to teachers for whom
motivational issues are a factor in instructional efficacy. Among Pintrich’s
(2003) generalizations and design principles are those listed in Table 3 which
are most salient to application to staff and students:
Table 3

Motivational Generalization and Design Principles

Motivational generalization: Adaptive self-efficacy and competence beliefs
motivate students (and teachers).
Design principles:
  Provide clear and accurate feedback regarding competence and self-efficacy.
  Design tasks that allow students (and staff) to be successful but to be
  challenged (Vygotskian model).

Motivational generalization: Adaptive attributions and control beliefs motivate
students.
Design principles:
  Provide feedback that emphasizes learning through things such as effort and
  strategies.
  Provide opportunities to exercise choice and control.
  Build caring and supportive personal relationships.

Motivational generalization: Higher levels of interest and intrinsic motivation
motivate students.
Design principles:
  Provide content and material that are personally meaningful (to staff and to
  students).
  At the teacher-to-student level, provide interesting and stimulating tasks that
  include novelty and variety.

Motivational generalization: Higher levels of values motivate students.
Design principles:
  Build in relevance and develop personal identification with the school.

Motivational generalization: Goals motivate and direct students.
Design principles:
  Use organizational and classroom-management structures that encourage
  personal and social responsibility and provide a safe, predictable
  environment.
  Use collaborative groups.
  Promote mastery, learning, understanding, effort, progress, and
  self-improvement. (Pintrich, 2003, p. 672)
Pintrich and Schunk (2002) discuss the correlation of expectancy with
achievement. They cite longitudinal studies by Eccles (1983) and Wigfield
(1994) that have consistently found that students’ levels of self-expectancy were
more highly correlated with academic grades than were other factors, even
previous grades. The authors also discuss students’ beliefs about their own
intelligence as incremental (malleable, changeable over time) or entity-based
(fixed). Students who had the former beliefs are more likely to feel control over
their own learning and achievement and have higher levels of self-expectancy
than students who believe that their intelligence is fixed, that academic success
is due to factors out of their control. The implications drawn by these authors lie in the
influences that teachers can have over students’ own non-productive beliefs
about themselves as learners and, ultimately, over their motivation.
Other factors related to motivation and relevant to student achievement
are noted by Pintrich and Schunk (2002). For instance, the control attributions
that teachers assign to their students’ challenges impact the decisions, the
methodologies, and the practices that they employ which, in turn, affect
improved student performance or a continuing decline. If a teacher believes that
a student is not exerting effort and that the effort is within the student’s control,
he or she may withhold assistance, apply rigid rules for submission of make-up
work, etc. In contrast, if the teacher understands that the lack of effort may, in
fact, be due to the student’s extraordinary responsibilities and inadequate
resources at home, the teacher may provide for extra tutoring, may apply more
lenient make-up policies, or allow a student to re-submit corrected and re-
worked problems. These variations in teacher control attributions can reinforce
student motivation or deplete it.
Along the same lines are issues of teacher feedback to students. The
implied messages in feedback may bolster or diminish students’ self-efficacy,
commitment to goals, or views about fateful or malleable degrees of
intelligence. Such beliefs and practices as a part of the normative culture of a
school can affect student achievement at that school, and, therefore, are areas
that should be examined as targets of change efforts.
Accountability
Accountability is a term for responsibilities in a relationship. Burke
(2004), a Senior Fellow of the Rockefeller Institute of Government, refers to
dictionary definitions of answerability and an obligation to accept responsibility
for actions. Accountability relationships in education are multi-faceted and
involve teachers, principals, district offices, states, parents, the general public,
the business community, higher education, and the federal government.
Ultimately, all of these entities are accountable to the student and to his or her
preparation for a future role in society. Burke (2004) describes six different
models of accountability. The model of current use has changed over time along
with society’s view of the role of education and its relationship to the nation’s
needs. While Burke’s focus is higher education, features of the bureaucratic, the
professional, and the political models are apparent in K-12 education. Key
features of the bureaucratic model such as the goal of efficiency and the
condition of stability may hark back several decades in education. However,
that model’s features of centralization of governance and techniques of
regulation are apparent in the NCLB (United States Department of Education,
2002) landscape. The professional model characterized by goals of quality,
conditions of autonomy, techniques of consultation, and programs of
standardized testing is alive and well to varying degrees in today’s K-12
environment. However, also alive is the political model characterized by
priorities and outcomes.
Features of accountability at the macro level distill down to working
practices at the district, the school, and the teacher level. To what degree does
accountability impact efforts to improve the achievement of students in K-12
education?
Like the technical support teams that are sent to assist a school that has
failed to meet the NCLB requirements and is deemed to be in program
improvement, Site Support Teams provide support but not without increased
accountability. Elmore (2005) discusses the learning of adults and the role of
accountability in that endeavor, what he calls:
Reciprocity of accountability for capacity. Accountability must be a
reciprocal process. For every increment of performance I demand from
you, I have an equal responsibility to provide you with the capacity to
meet that expectation. Likewise, for every investment that you make in
my skill and knowledge, I have a reciprocal responsibility to demonstrate
some new increment in performance. (p.5)
At least some staff members of schools with Site Support Teams feel
increased accountability. Clearly, the principal feels the scrutiny of district
office administrators who participate in Site Support Teams and, wittingly or
unwittingly, make judgments about the performance of the school, the
principal’s leadership, the performance of teachers, the quality, decisions, and
content of the principal’s work. Is corresponding support, in terms of the
provision of new understandings and new tools, felt by the principal? Are other
staff members, such as the vice principal and teachers, equally accountable for
improved performance if, in fact, they receive feedback and new learning from
the visits of the Site Support Team?
According to Elmore (2002):
Schools do not “succeed” in responding to external cues or pressures
unless they have their own internal system for reaching agreement on
good practice and for making that agreement evident in organization and
pedagogy.… These schools have a clear, strong internal focus on issues
of instruction, student learning and the expectations for teacher and
student performance. In academia, we call this a strong “internal
accountability system.” By this we mean that there is a high degree of
alignment among individual teachers about what they can do and about
their responsibility for the improvement of student learning. Such
schools also have shared expectations among teachers, administrators,
and students about what constitutes good work and a set of processes for
observing whether these expectations are being met. (p. 20)
O’Day (2002) studied under-performing Chicago Public Schools over a
6-year period in the late 1990s. She talks about the complexity of accountability
in schools. Similar to Elmore’s (2002) internal accountability systems, O’Day
(2002) found that the readiness of staffs, including the presence of normative
structures that supported a focus on data, collaboration, and change, have a high
degree of influence on the degree to which a school will show improvement. In
other words, among the schools that had less than 20% of their students
performing proficiently in 1996, those schools that significantly increased their
performance by 1998 shared the characteristic of readiness in terms of peer
collaboration, teacher-to-teacher trust, and collective responsibility for student
learning. O’Day’s findings imply that the internal accountability included a
shared belief among teachers that student learning was under their control and
could be influenced, at least to some significant degree.
In terms of effecting improved methodology, Elmore (2002) subscribes
to the notion that, presumably through accountability, teachers and
administrators can be encouraged or cajoled or pressured into changing a
practice even though they do not believe in the new practice. The accountability
causes compliance in a rather rote manner. However, if teachers observe that
their new strategy or method actually, and surprisingly to them, seems effective
in bringing about better student performance, the teachers’ belief systems will
change as though to catch up with the behavior. Although it seems backward
that the behavior precedes a belief or commitment, nonetheless the new belief
is likely to help the new behavior to become firmly incorporated into the
teacher’s repertoire, perhaps because the teacher has already seen evidence of its
efficacy.
There are various forms and outcomes of accountability. Stein,
Hubbard, and Mehan (2004) studied two school districts that had accomplished
massive reform—New York City’s School District #2 and San Diego Unified
School District. The top leadership in New York City, Chancellor Anthony Alvarado
and Deputy Superintendent Elaine Fink, were reported to have spent large
amounts of time conducting walkthroughs of schools. Such visits by top
administrators increase the accountability that site leaders—and possibly
teachers—feel.
One area influenced by accountability measures was highlighted in
studies by O’Day, Bitter, Kirst, Carnoy, Woody, Buttles, et al. (2004).
O’Day and colleagues found that accountability measures could influence
schools in which teachers had a culture of analyzing their students’ data,
including data disaggregated by race. As a corollary to this finding, schools that
did not have a culture that embraced the use of data were not likely to respond
well to accountability measures.
Togneri and Anderson (2003), working through The Learning First
Alliance, conducted in-depth studies on five school districts throughout the
nation. These districts had demonstrated success in increasing student
achievement, over time, and across grade levels, races, and ethnicities in spite of
poverty levels of more than 25%. Their findings speak to a system-wide
approach to the improvement of instruction. Distributed instructional leadership
was a key component of the districts’ modes of operating. Togneri and
Anderson (2003) found that principal and teacher leaders were crucial to
districts’ instructional leadership. The authors found that the districts built
networks of instructional expertise. The professional development in these
districts was data-driven, marked by collaboration among colleagues, and
included a value of reflection upon practice. The focus on data and the
professional development that was based upon student data caused principals to
report significant changes in their behavior. The principals reported conducting
regular classroom visits, observing instruction and providing feedback, and
putting into place systems of accountability around regular analyses of student
assessments.
The elements for success described by Togneri and Anderson (2003)
above are likely cultural shifts born of accountability.
One district that was studied, Providence, Rhode Island, listed as values
“accountable talk” and “socializing intelligence” (p. 16). When shared
accountability becomes a normative structure among staff, the groundwork is
laid for support of increased student learning.
Among the large-scale efforts to catalog instances of exceptional
achievement by schools that serve students in poverty and students of color is
the work of Reeves (2004b). His second edition of Accountability in Action
(2004b) updates the report on the 90/90/90 schools. The 90/90/90 schools are
schools in which 90% of students achieve proficiency even though the student
populations are 90% minority and 90% high-poverty. Reeves
(2004b) summarizes and synthesizes case studies of schools that serve as models
of efficacy. In the late 1990s, data taken over a 4-year period from more than
100,000 students in Milwaukee showed consistently high results. Although the
results are merely associative and not causative, Reeves (2004b) cites clear
associations between these high achieving schools and a number of consistently
used instructional strategies. The strategies that are characteristic of these
schools are a focus on student achievement, clear curricular choices, frequent
assessment and multiple opportunities for improvement, an emphasis on non-
fiction writing, and collaboration among staff especially that centered on student
work.
Reeves’ work (2004b) identified more recent case examples of schools
that showed similar patterns to those in Milwaukee, schools that also served
low-income populations and sustained positive results. These cases were in
Norfolk, Virginia and Wayne Township in Indianapolis. Additionally, Reeves
mentions, without supportive detail, apparently successful cases in St. Louis and
in Southern California. Again, in these examples from recent years, teacher
collaboration around student work surfaced as a consistent practice. The
provision of extra time in the content areas of reading,
English, and math was a characteristic that was similar to that of clear curricular
choices in the Milwaukee examples. The use of common assessments and the
continual use of data were also characteristic of these schools and not
inconsistent with the earlier cases.
Reeves’ (2004b) work describes the results of summative analyses of
quantitative data followed by rigorous and extensive qualitative studies. While
one might assume that there could be inconsistency between what is reported in
interviews and surveys and actual practice, the very large sizes of these studies,
and the similarity of findings in the geographically diverse locations of
Milwaukee, Indianapolis, and Norfolk mitigate any questions of reliability.
Moreover, the studies were conducted by different research agencies.
Milwaukee’s study was conducted by the Center for Performance Assessment
and verified 3 years later by Schmoker (2001), according to Reeves (2004b).
Reeves credits Simpson (2003) with the original Norfolk study, but Reeves
himself corroborated the report with his own review of the data, site visits, and
interviews. The size, the consistency, and the persistence of results make these
studies ones that should not be ignored.
What do the studies performed and cited by Reeves (2004b) tell us about
the role of accountability in effecting improved student achievement? The
findings of characteristics in several cities can be distilled into three themes.
First, in the combined studies, schools focus on learning of a clearly chosen,
focused curriculum. A second theme of the combined characteristics is that of
attending frequently to common assessments. A third theme is that of
collaboration over student work. The teaching and learning of a clearly chosen,
focused curriculum brings accountability to the teachers and principals, as all of
these professional adults are expected to teach the same agreed-upon
curriculum. The attention to common assessments brings accountability both to
the adults who must teach to prepare for them and to the students who work for
improvement based upon feedback on these assessments. Finally, teacher
collaboration over student work supports the accountability of the teachers to
each other. These are examples of the internal accountability that Elmore
(2002) discusses.
As a counter-argument to the notion that accountability prompts
improvements in performance and learning, research can be found that does not
give carte blanche support for accountability as a means of improving student
achievement. In the early 2000s, three well-respected policy study institutes
analyzed the accountability that is a part of California’s Public Schools
Accountability Act of 1999 (California Department of Education, 2000). The
findings of the three studies are summarized in a policy brief by O’Day, et al.
(2004). Specifically, the studies found that teachers in the lowest-performing
schools were less likely to believe that they could make a difference in student
learning. In essence, a back-drop of strong self-efficacy among staff was a
necessary ingredient for accountability measures to spur improvements in
student learning.
Mintrop and Trujillo (2007) studied nine middle schools and their
responses to standardized testing accountability measures. Their study
presented thorough and varied mixed-methods which included classroom
observations, student and teacher surveys, and interviews covering numerous
variables to follow up their analysis of the schools’ multi-year data. Mintrop
and Trujillo concluded that the data provided by California’s accountability
measures show little or no correlation with the quality of the school as measured
by numerous instructional and attitudinal indicators. They did find, however,
that among the schools that showed the greatest improvement, the teaching
faculty responded positively to the accountability measures. The
teachers in the successful schools looked at the accountability system as an
opportunity that would help them to improve, and they expected that they could
perform effectively in the system. In contrast, schools that fared poorly on the
annual tests had faculties that held negative views of the accountability
measures, felt them to lack value and relevance, and thought them to be
disconnected from instructional practices.
CHAPTER 3
DESIGN SUMMARY
Do Site Support Teams have an effect on the achievement of students in
under-performing schools? In this chapter, the author presents the methods used
to study this question and discusses ancillary questions, and their respective
methodologies, with the intention of providing an in-depth analysis of
Site Support Teams as a method of improving student achievement.
The author first studied the effectiveness of Site Support Teams through
a quantitative study of a quasi-experimental design. Seven experimental
schools, of which two were elementary, two were middle, and three were high
schools, were matched to control schools selected as strong matches from
California’s Similar Schools list. Because the Site Support Team intervention
was first implemented in four different academic years at the seven
experimental schools, performance-band data for the matched pairs of schools
were examined for intervals ranging from 2 to 5 years, depending upon
duration of implementation. Specifically, the author looked at the changes in
performance among students at intervention and control schools. Changes were
analyzed in three dependent variables: “proficient and advanced,” “basic and
above,” and “all performance bands.”
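The matched-pairs comparison described above can be sketched in code. The
following is a minimal illustration only, with invented school pairs and
percentages; it is not the dissertation's actual data or analysis:

```python
# Illustrative sketch (hypothetical data): comparing changes in the percentage
# of students scoring "proficient and advanced" for matched pairs of
# intervention and control schools. All figures below are invented.

matched_pairs = [
    # ((intervention baseline %, final %), (control baseline %, final %))
    ((28.0, 39.5), (27.5, 33.0)),
    ((31.0, 41.0), (30.0, 36.5)),
    ((22.5, 30.0), (23.0, 28.0)),
]

def change(scores):
    """Change in percent proficient-and-advanced from baseline to final year."""
    baseline, final = scores
    return final - baseline

# The quantity of interest for each pair is the intervention school's gain
# relative to its matched control (a difference-in-differences comparison).
relative_gains = [change(exp) - change(ctl) for exp, ctl in matched_pairs]
mean_relative_gain = sum(relative_gains) / len(relative_gains)

print(f"Per-pair relative gains: {relative_gains}")
print(f"Mean gain relative to matched controls: {mean_relative_gain:.2f} points")
```

The same per-pair subtraction could be repeated for each of the three
dependent variables ("proficient and advanced," "basic and above," and "all
performance bands") and for each interval of implementation.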
Intervention
The Site Support Teams of the district were technical support groups for
low-performing schools. The teams, each comprised of district-level
administrators and specialists, become an extension of the school’s staff. One
purpose of the team was to help the school administration and faculty to identify
practices that were in place that were counter-productive to student learning,
and, conversely, to identify practices absent from the school that were known to
be supportive of student achievement. Once identified, the team then guided the
staff through implementation of the prescribed changes, monitored progress, and
provided encouragement and feedback. As noted in Chapter 1, Site Support
Teams were interventions to ward off a slide into Program Improvement status.
Because of perceived success, Site Support Teams multiplied from 1 in 2002 to
10 in the 2006-07 school year and to 19 (with variations in design) in 2007-08.
They have been generalized to be interventions for any low-performing school
that is not, of its own accord, showing sufficient regular gains
in student achievement.
Participants and Setting
Participants in the quantitative portion of this study were the 2nd- through
11th-grade students at the seven EGUSD Site Support Team schools as well as
students in the matched control schools. In the second tier of the study,
participants were selected members of Site Support Teams and teachers at
two Site Support Team sites.
Using a version of extreme case sampling design, teachers at the Site
Support Team schools showing the most favorable student achievement results
were surveyed. Teachers were asked to complete a brief questionnaire eliciting
opinions about Site Support Teams (Appendix A). Since the goal of Site
Support Teams is to effect improved student outcomes through improvements in
professional practice at the schools, teachers are key players in the process.
Their awareness and opinions of Site Support Team work may be used to inform
best practice for future Site Support Teams.
Interviews were conducted with the principals of three Site Support
Team schools. Other key members of Site Support Teams were also
interviewed. Table 4 shows the composition of typical Site Support Teams as
well as the selection of participants who were surveyed and those who were
interviewed.
The author had been a member of Site Support Teams since September
2005, having participated on one Site Support Team in 2005-06, and on five in
subsequent years. As a participant observer, she drew upon her own
observations of differences from team to team.
Table 4
Composition of Site Support Teams

                                       District  School-site  Candidate for  Candidate for
Participant                            member    member       questionnaire  interview
Associate Superintendent                  X                                       2
Director, Curriculum and
  Professional Learning (a)               X
Director, Research and Evaluation         X
Director, Learning Support Services       X
Director, Special Education               X
Director, Instructional Support           X
Director, Elementary Ed.                  X                                       1
Director, Secondary Ed.                   X                                       1
Program Specialist, Special Ed            X
Program Specialist, Curriculum            X
Program Mgr, Research                     X
Principal, Elementary                               X                             1
Principal, Middle School                            X
Principal, High School                              X                             2
Vice Principal, Elementary                          X
Vice Principal, Middle School                       X
Vice Principal, High School                         X
Instructional Coach                                 X                             2
Teacher, Elementary                                 X             X
Teacher, Middle School                              X
Teacher, High School                                X             X

(a) The author was a participant-observer.
Her observations also served to guide the development of the interviews
with key Site Support Team members.
Instrumentation and Statistical Procedures
According to the California Department of Education, annual testing data
are key to accountability in K-12 education throughout the state:
The Standardized Testing and Reporting (STAR) Program is an
important part of the state assessment system. Administered annually in
the spring in grades two through eleven, the STAR Program was first
authorized in 1997 and reauthorized until 2011 by state law (Education
Code Section 60640). Tests in the STAR Program measure how well
students in California public schools are learning the knowledge and skills
identified in the California content standards. (California Department of
Education, 2007c, p. 8)
Using data from the state of California’s STAR program, the author
compared the pre- and post-intervention progress of student achievement in the
intervention schools against that of control schools with like demographics.
While the STAR program comprises a blend of criterion-referenced tests and a
norm-referenced test (CAT-6), more attention has been given over time to the
criterion-referenced portion known as the California Standards Test or CST.
The CST data are used to calculate schools’ Academic Performance Index (API)
scores. According to the California Department of Education (2007d),
The API is a state requirement under the Public Schools
Accountability Act (PSAA) of 1999. The API is a single number that
ranges from 200 to 1,000 and indicates how well a school performs
academically from year-to-year. (n.p.)
This study references the Academic Performance Index of each school as the
criterion that administrators of the district used to determine which schools were
in need of intervention. However, the results for determining school growth
relied solely on comparisons of scores of students relative to CST proficiency
bands.
Site Support Team intervention schools were compared to non-intervention
control schools. Control schools were selected from the state’s Similar Schools
list. Among the listed schools, the author looked for those that seemed most like
the respective intervention schools in the following ways:
• API score within +/- 7 points of the experimental school.
• Size: This was considered in a broad sense only, and varied by up to
50% from control to experimental school. The author surmised that
this was important because the dynamics, availability of resources,
and overall climates of schools are likely to be dissimilar if one
school is very small and the other is quite large.
• Location/environment, i.e., suburban or urban as opposed to rural.
• Percentage of English Learners (EL): Because learning English as a
non-native language while attending typical California schools has a
significant effect on learning and on CST scores, it was a priority to
select similar schools that had reasonably similar percentages of EL.
The match in this way was most often a match within a band of
relatively similar percentages rather than an absolute numeric match.
For instance, schools can be thought of as being “no-impact EL,”
“low-impact EL,” “moderate-impact EL,” or “high-impact EL.” If
these terms were interpreted as approximate percentage bands, they
would fall approximately as follows:
    “little or no-impact”   0-15% EL
    “low-impact”            15-25% EL
    “moderate-impact”       25-60% EL
    “high-impact”           more than 60% EL
• Ethnic mix: Many schools in California serve populations that are
largely of Hispanic culture. The author believes that the student
environment in such a homogeneous school is likely to be significantly
different from schools in the district that tend to have more equal mixes
of white, African-American, Hispanic, and Asian sub-populations. The
absence or presence of several significant subgroups, rather than just one
or two, is likely to make a difference in the learning environment and
possibly affect how teachers teach. Therefore, it was deemed a priority
to mirror somewhat similar percentages of African-American, Asian,
Hispanic, and white subgroups in the control group as compared to the
experimental group.
• Poverty: Percentages of the student population that were eligible for
free and reduced-price lunch were used for this measure.
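The screening criteria above can be summarized as a simple rule. The sketch below is purely illustrative: the field names (api, size, setting, el_pct) and the band-equality check for EL percentages are the author's paraphrase of the text, not the study's actual procedure, which was a hand review of the state's Similar Schools list (the ethnic-mix and poverty checks are omitted for brevity).

```python
# Illustrative screen of a Similar Schools candidate against the criteria
# described above. All field names are hypothetical.

def el_impact_band(el_pct):
    """Map an English Learner percentage onto the approximate bands above."""
    if el_pct <= 15:
        return "little or no-impact"
    if el_pct <= 25:
        return "low-impact"
    if el_pct <= 60:
        return "moderate-impact"
    return "high-impact"

def is_candidate_match(exp, ctrl):
    """Screen one candidate control school against the stated criteria."""
    return (
        abs(exp["api"] - ctrl["api"]) <= 7                        # API within +/- 7
        and abs(exp["size"] - ctrl["size"]) <= 0.5 * exp["size"]  # size, broadly
        and exp["setting"] == ctrl["setting"]                     # urban vs. rural
        and el_impact_band(exp["el_pct"]) == el_impact_band(ctrl["el_pct"])
    )

experimental = {"api": 652, "size": 1100, "setting": "urban", "el_pct": 31}
candidate = {"api": 649, "size": 980, "setting": "urban", "el_pct": 45}
print(is_candidate_match(experimental, candidate))  # prints True
```

Note that the band check, rather than an absolute EL-percentage match, mirrors the text's allowance that matches were "within a band of relatively similar percentages."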
In some cases, other school characteristics were considered as a reason to
eliminate possible similar schools from consideration. As an example, John
McDougall Middle School in Oakland was initially chosen to be the control
school for James Roosevelt Middle School. The API of John McDougall was
only three points lower than Roosevelt’s. Like Roosevelt, John McDougall had
large numbers of African-American, Asian, Hispanic, and white students. The
23% of EL at John McDougall compared favorably to the 31% at Roosevelt.
Both schools had free-and-reduced-lunch eligibility rates between 50% and
60%. However, in terms of configuration, John McDougall served grades six
through eight, whereas James Roosevelt’s configuration included only grades
seven and eight. After consideration of this fact and a review of the factors
listed above for the schools that were within five API points of Roosevelt, it
was decided that the difference in grade configuration did not outweigh the
factors that made John McDougall a strong match for Roosevelt. Only scores
from grades seven and eight were used in calculations. Table 5 shows the
interventions. A data
summary for the schools in this study is provided in Appendix B.
Table 5
Interventions

O = STAR test was administered (spring).
X = Intervention of Site Support Team.

               2002    2003    2004    2005    2006    2007
Van Buren HS    O   X   O   X   O   X   O   X   O   X   O
Control         O       O       O       O       O       O
Fillmore HS     O       O   X   O   X   O   X   O   X   O
Control         O       O       O       O       O       O
Johnson MS      O       O   X   O   X   O   X   O   X   O
Control         O       O       O       O       O       O
Roosevelt MS    O       O   X   O   X   O   X   O   X   O
Control         O       O       O       O       O       O
Lincoln         O       O       O   X   O   X   O   X   O
Control         O       O       O       O       O       O
Ford ES         O       O       O       O   X   O   X   O
Control         O       O       O       O       O       O
Madison         O       O       O       O   X   O   X   O
Control         O       O       O       O       O       O
In most cases, the intervention of the Site Support Team was started in the
fall, at or toward the beginning of the school year. An exception to this was
Lincoln High School. Its Site Support Team was first convened in January
2005. Therefore, in that case, 2004 is used as the pre-intervention year
(Table 6).
Table 6
Onset of Interventions

School                   Onset of intervention      Notes
Van Buren High School    October 2002               Principal changes in 2003 and
                                                    2006 but stable Site Support
                                                    Team; VPs promoted to principal
Johnson Middle School    January 2004               Change of principal in August
                                                    2004, although new principal
                                                    arrived as VP earlier in 2004
Fillmore High School     2003-04 (month uncertain)  Three principals in 5 years
Roosevelt Middle School  2003-04                    Principal change in October 2004
Lincoln High School      January 2005               Same principal, 2001-2007
Ford Elementary          September 2005             Same principal for some 20 years
Madison Elementary       September 2005             Lack of staff support of
                                                    principal through 2006-2007
Achievement
The goal of Site Support Teams was to effect improvements in student
achievement. For the purposes of this study, the measure of achievement was
the degree of improvement in three dependent variables based on CST
performance bands. Under the No Child Left Behind (NCLB) Act of 2001
(United States Department of Education, 2002), schools must annually meet or
exceed steadily increasing criteria, measured in terms of increasingly high
percentages of students deemed “proficient” or “advanced,” in order to comply
with a measure known as Adequate Yearly Progress (AYP). According to the
California Department of Education (2007, March), the CST cut scale score for
a determination of proficiency in ELA and mathematics is 350.
Tests and Statistical Methods
Using AYP advanced/proficient data, the control and experimental
schools were compared for growth of student achievement. In simple terms, the
slopes of the rates of improvement of intervention schools were compared to
those of the respective control schools. Using t-tests and Cohen’s d, calculated
with the help of SPSS, the author assessed whether the differences between
intervention-school and control-school improvement ratios were significant.
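The study ran these computations in SPSS. As a rough illustration of what the two statistics involve, the following stdlib Python sketch (not the study's code) computes an equal-variance independent-groups t ratio and a Cohen's d standardized on the pre-group SD, the convention used in the tables that follow; the toy data are invented.

```python
import math
from statistics import mean, stdev

def pooled_t(pre, post):
    """Independent-groups t ratio with pooled variance (equal variances
    assumed), returning the t ratio and its degrees of freedom."""
    n1, n2 = len(pre), len(post)
    m1, m2 = mean(pre), mean(post)
    s1, s2 = stdev(pre), stdev(post)
    sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
    t = (m2 - m1) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, n1 + n2 - 2

def cohens_d(pre, post):
    """Effect size using the pre-intervention SD as the standardizer."""
    return (mean(post) - mean(pre)) / stdev(pre)

# Toy data: coded CST performance bands (0-4) for two testing years.
pre = [0, 1, 1, 2, 2, 3]
post = [1, 2, 2, 3, 3, 4]
t, df = pooled_t(pre, post)
print(round(t, 2), df, round(cohens_d(pre, post), 2))  # prints: 1.65 10 0.95
```

Note that df = n1 + n2 - 2 under the pooled-variance assumption, which matches the df values reported in Tables 7 through 15 (e.g., 1610 + 1231 - 2 = 2839 for Van Buren).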
Qualitative and Mixed Methods Applied to Findings
Results from the quantitative study described above lacked internal
validity because there was no random assignment of students or schools to
control groups versus experimental groups. Nonetheless, it was
hoped that the results of the quantitative portion would help the administration
of the district to better judge whether Site Support Teams may be a factor in
boosting student achievement. The results should assist district-level
administrators in judging the likelihood that the resources expended on Site
Support Teams might be associated with improved student achievement. Since
results indicated that Site Support Teams might be effective in improving
student achievement, and since results were more positive at some of the seven
schools than at others, a deeper look was taken at the Site Support Team schools
that were most improved in order to help to inform a judgment about the likely
efficacy of Site Support Teams.
Since the composition, agendas, procedures, frequencies of meetings,
and focus areas of Site Support Teams varied from school-to-school, the district
administrators needed a clearer picture of implementation practices. Patton
(1987) provided a guiding perspective, “Where outcomes are evaluated without
knowledge of implementation, the results seldom provide a direction for action
because the decision-maker lacks information about what produced the observed
outcome (or lack of outcomes)” (p. 26).
Through the qualitative/mixed-methods portion of this study, the author
attempted to answer questions about relative efficacy of the teams.
• How did Site Support Teams work in the schools that showed the most
improvement in student achievement?
• What were the differences in Site Support Teams in most-improved
versus least-improved schools?
• How did teachers view Site Support Teams?
• How did principals view the Site Support Team?
• In the opinion of teachers, what changes in practice resulted from the
presence of the Site Support Team?
• In the opinion of the principal, what changes in practice at the school
resulted from the presence of the Site Support Team?
• Under what conditions is a Site Support Team likely to effect improved
student achievement in other schools?
These questions were addressed through the use of surveys (Appendix
A), interviews, and observations. It is hoped that the results of these methods
help the district administration to see the nuances and discrepancies in
implementation that might account for variations in program efficacy among
intervention schools.
Interview Field Work
The Site Support Team is an indirect intervention. More specifically, the
Site Support Team influences the school faculty, and the principal and
especially the teachers then make changes which, theoretically, affect students
and their learning. In one sense, then, the teachers were the key target of the
Site Support Teams: if they improved their teaching, students’ learning would
also improve. This is the classic professional development model. Because the
intervention is indirect by definition, and because myriad other factors might
intervene, causality has, historically, been very difficult to demonstrate.
This presents another instance in which qualitative data proved to be
important. Following the Kirkpatrick model (Clark & Estes, 2002), surveys
and interviews of teachers and principals helped to verify whether (a) the Site
Support Team caused learning on the part of teachers, and whether (b) teachers
have applied their learning to their teaching practices.
Interviews of district-level Site Support Team members who had been
participants on one or more teams helped to clarify other questions. How had
the school staff’s response to the Site Support Team changed over time, if at all?
What might have caused this change in response? Given that Site Support
Teams conduct brief-visit observations of classrooms, did the team members
perceive a change in the overall instructional practices on campus over the
course of the work of the Site Support Team?
The combined methodology of the two tiers of this study yielded the data
necessary for a comprehensive understanding of the likely efficacy of Site
Support Teams under various conditions.
CHAPTER 4
RESULTS
Site Support Teams are teams of individuals that meet regularly with the
school principal and key site leaders in order to provide support to the school’s
academic endeavors. The goal of the Site Support Team is to increase student
achievement and student learning. The Site Support Team consists of district-
level administrators such as an associate superintendent, directors of key areas
such as research, curriculum and professional learning, special education, and
learning support (categorical programs). These individuals meet on a regular
basis with the school team: principal, vice principals, instructional coach, and
other site-specific relevant personnel. Over time, and at a pace dependent on the
school’s culture, teachers become involved, rotating into the Site Support Team
meetings. The teams spend one to three hours focusing on student needs at the
school. At times, the school staff presents and explains initiatives and programs
that are being implemented, and student data are presented. Often the site visit
includes “walk-through” visits of numerous classrooms with mixed teams
(district office and site personnel) visiting four or more classrooms each. When
classroom visits occur, they are followed by some form of group debriefing of
the informal evidence and findings from the observations. The site team decides
how these data will be provided to the entire staff. These cumulative observations
result either in site-level intervention or professional learning, and/or the visits
result in questions that will become the focus for the next agenda. The format
and the work of the Site Support Team vary and are tailored to each school, as it
is believed that each school has unique challenges and needs on its path to
improved student achievement.
Quantitative Analysis
Through the quantitative portion of this summative evaluation, the author
measured effects of the intervention in two ways. First, each experimental
school was analyzed in terms of growth on the ELA portion of the CST
performance bands from the year prior to the onset of intervention through the
2007 testing, a pre/post- independent groups design. The analysis included tests
of statistical significance using the t-test and tests of practical significance using
Cohen’s d, raw change, and percentage change. The dependent variables were
CST ELA performance band scores as well as percentages of students
“proficient + advanced” and students “basic + above.”
Second, the author paired control and experimental schools for
comparison. In this portion of the study, the practical significance and
percentages of change were compared using a nonequivalent comparison group
design. This was a post-only design using a control school that was a similar
school to the experimental school in the pre-intervention year. In all cases,
performance bands were coded as follows: 0 = Far Below Basic; 1 = Below
Basic; 2 = Basic; 3 = Proficient; 4 = Advanced.
Results of Pre-Post Independent Groups Design
Van Buren High School. When student performance band data from
2002 (pre-intervention) are compared to data from 2007 (post-intervention)
using a t-test for statistical significance, p < .001 for all performance bands;
thus, the results were statistically significant at the .15 level used in this study.
Similarly, the changes in basic + proficient + advanced bands are statistically
significant, p < .001, as are the changes in proficient + advanced bands,
p < .001. Table 7 shows t-test results.
Table 7
Pre- and Post-Intervention CST ELA Performance Band Data, Van Buren High
School, 2002-2007

Dependent         Pre-N   Post-N   Pre-M   Post-M   Difference   t ratio        Observed
variable          2002    2007     2002    2007     in M         (df)           probability
All perf. bands   1610    1231     1.486   1.965    .479         10.819 (2839)  .000*
Basic and above   1610    1231     .501    .649     .148         7.961 (2839)   .000*
Prof + Adv        1610    1231     .186    .334     .148         9.168 (2839)   .000*

Note. Equal variances were assumed. *p < .15.
It is noted that there is a significant decline in N over the five-year
intervention period. This change was most likely due to decreasing enrollment
in the attendance area as well as to a regional boundary change that affected
several district high schools.
Since Van Buren High School was the first school in the district to have
a Site Support Team, the intervention has been in place longer than at other
schools. Therefore, when calculating the practical significance of the gains on
the ELA portion of the CST, the long pre-to-post-intervention interval must be
considered. The practical significance of the data was calculated using raw
change, percentage change, and effect size (Cohen’s d), but each data point has
to be interpreted with consideration of the 5-year interval. Practical Significance
data are shown in Table 8.
Table 8
Pre- and Post-Performance Band Findings: Practical Significance, Van Buren
High School, 2002-2007

Dependent            Pre-    Post-                Percentage   SD      Cohen's
variable             mean    mean     Difference  change       pre     d
Performance bands    1.486   1.965    .479        32%          1.121   0.43
Basic + prof + adv   .501    .649     .148        29%          .500    0.30
Prof + Adv           .186    .334     .148        80%          .389    0.38
As indicated in Table 8, results for Van Buren High School show strong
practical significance on all measures. In terms of raw change, all dependent
variables show strong gains. The first dependent variable, “all performance
bands,” shows a gain of .48 when pre- and post-intervention means are
compared. Practically speaking, whereas students in grades nine, ten, and
eleven were, on average, hovering at the mid-point between below basic and
basic in 2002, the average in 2007 was at the basic level. Overall, the .48
increase was nearly one-half a performance band. While the “basic + above”
and the “proficient + advanced” dependent variables do not show as great a raw
difference, it must be kept in mind that the pre-intervention means were lower
for these variables, so the same raw gain represents a larger relative change.
Indeed, the difference in means for the “proficient + advanced” dependent
variable shows that the pre-intervention mean nearly doubled by the time of the
post-intervention measure.
When using Cohen’s d as the measure of effect size, all dependent
variables show solid effects, with “performance band” scores at the .43 level.
This is a substantial effect size given that the standard used for practical
significance for Cohen’s d in this study is d > .20. The gain in the percentage of
students in the “proficient + advanced” band is similarly practically significant,
d = .38. The “basic + above” group also shows gains that have practical
significance, d = .30.
Perhaps the strongest result rests in the percentage change during the
interval of pre-intervention to 2007. The standard for practical significance in
this study is 10%. Therefore, an 80% change in the “proficient + advanced”
category has strong practical significance even if spread over 5 years.
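Table 8's practical-significance figures follow directly from the reported means and pre-intervention SDs, as the quick check below shows; minor discrepancies against the table (e.g., 30% here versus the reported 29% for "basic + above") stem from the means being rounded to three decimals.

```python
# Recomputing Table 8's raw change, percentage change, and Cohen's d for
# Van Buren from its reported pre/post means and pre-intervention SDs.
rows = {
    "all performance bands": (1.486, 1.965, 1.121),
    "basic + above":         (0.501, 0.649, 0.500),
    "proficient + advanced": (0.186, 0.334, 0.389),
}
for name, (pre, post, sd_pre) in rows.items():
    raw = post - pre        # raw change in mean
    pct = 100 * raw / pre   # percentage change relative to the pre mean
    d = raw / sd_pre        # Cohen's d using the pre-intervention SD
    print(f"{name}: raw {raw:.3f}, {pct:.0f}%, d = {d:.2f}")
```

The 80% figure, in particular, is simply the .148 raw gain expressed against the small .186 pre-intervention proportion, which is why the same raw gain yields only 29-30% for the "basic + above" variable.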
The practical significance of the results for Van Buren High School rests
on data that have changed over a five-year interval. When one questions the
significance of the change for that period, one realizes that a number of other
intervening factors may have been at play. The school has, however, maintained
strong gains over an extended period. For this school, the comparison to its
“best match” similar school will serve to aid in the interpretation of statistical
and practical significance of the gains.
Fillmore High School. The changes in performance bands on the ELA
portion of the CST were statistically significant at the .15 level for the “basic +
above” and for the “proficient + advanced” dependent variables. For the
dependent variable of all performance bands, p = .367, p > .15, and the change
is therefore not statistically significant. For the dependent variable of “basic +
above,” p = .038, p < .15; the change (a decline) is therefore significant. For
the dependent variable of “proficient + advanced,” p = .041, p < .15; the change
is therefore statistically significant. These data are summarized in Table 9.
Table 9
Pre- and Post-Intervention CST ELA Performance Band Data, Fillmore High
School, 2003-2007

Dependent         Pre-N   Post-N   Pre-M   Post-M   Difference   t ratio       Observed
variable          2003    2007     2003    2007     in M         (df)          probability
All perf. bands   1654    1387     1.947   1.984    .037         .902 (3039)   .367
Basic and above   1654    1387     .688    .653     -.035        2.080 (3039)  .038*
Prof + Adv        1654    1387     .308    .343     .035         2.045 (3039)  .041*

Note. Equal variances were assumed. *p < .15.
It is noted that the N for Fillmore was significantly lower for the post-
intervention year as compared to pre-intervention, 4 years earlier. This is most
likely due to a reduction in the size of Fillmore’s attendance area when a new
school opened in the fall of 2004.
While the results of the t-test were mixed, the question of practical
significance of changes in mean is critical to all schools in this study. These
results are displayed in Table 10.
Table 10
Pre- and Post-Performance Band Findings: Practical Significance, Fillmore
High School, 2003-2007

Dependent            Pre-    Post-                Percentage   SD      Cohen's
variable             mean    mean     Difference  change       pre     d
Perf bands           1.947   1.984    .037        2%           1.059   .04
Basic + prof + adv   .688    .653     -.035       -5%          .463    .08
Prof + Adv           .308    .343     .035        11%          .462    .08
The results for Fillmore High School do not demonstrate practical
significance for any of the dependent variables as measured by Cohen’s d.
Using the standard of d > .20, no dependent variable showed significant gains.
In terms of raw change, the “basic + above” dependent variable showed a
decrease over the 4-year period. This might, in truth, be a positive outcome if
the reason was that a sizeable portion of the 2003 “basic” students moved to
“proficient” by 2007. In fact, the “proficient + advanced” dependent variable
showed a positive gain. In terms of percentage change, this dependent variable
had the highest gain, 11%. The typical standard for percentage change is 10%.
The 11% gain could be discounted because it accrued over a 4-year span, or it
could be interpreted as the maintenance of significant improvement. It is
believed, however, that most professionals would agree that 11% over 4 years
represents a steady but lackluster gain.
Johnson Middle School. When 2003 scores are compared to 2007 scores,
results for Johnson Middle School on the ELA portion of the CST show
statistical significance for the “all performance bands” and the “proficient +
advanced” dependent variables. The dependent variable “basic + above” did
not show statistical significance.
For all performance bands, p = .02, p < .15; therefore, the results
are statistically significant at the .15 level. For the dependent variable of
“proficient + advanced,” p < .001, p < .15; therefore, these results are
statistically significant. However, for the dependent variable of “basic + above,”
p = .952, p > .15, and so there is no statistical significance. These data are
summarized in Table 11.
It is noted that there is a significant decline in N over the 4-year
intervention period. This change was most likely due to decreasing enrollment
in the attendance area as well as to a regional boundary change that affected
several district middle schools.
In terms of practical significance, the raw changes in performance at
Johnson on the ELA portion of the CST seem significant for the “proficient +
advanced” group. In fact, when percent change is considered, this dependent
variable increased 42%.
Table 11
Pre- and Post-Intervention CST ELA Performance Band Data, Johnson Middle
School, 2003-2007

Dependent         Pre-N   Post-N   Pre-M   Post-M   Difference   t ratio       Observed
variable          2003    2007     2003    2007     in M         (df)          probability
All perf. bands   1372    1097     1.610   1.715    .105         2.369 (2467)  .02*
Basic and above   1372    1097     .565    .566     .001         .061 (2467)   .95
Prof + Adv        1372    1097     .185    .263     .078         4.687 (2467)  .00*

Note. Equal variances were assumed. *p < .15.
This is a significant change although one must consider that this
percentage increase occurred over 4 years’ time. The raw change and the
percentage change for the dependent variable of all performance bands and the
dependent variable of “basic + above” did not manifest practical significance;
indeed, there were only 6% and 2% changes, respectively, over 4 years.
Using the standard for practical significance using Cohen’s d of .20, the
changes in performance on the ELA portion of the CST at Johnson Middle
School show practical significance only in the “proficient + advanced”
dependent variable, d = .20. For the dependent variable of all performance
bands (d = .10) and for the “basic + above” dependent variable (d = .00), no
practical significance is shown. These data are reported in Table 12.
Table 12
Pre- and Post-Performance Band Findings: Practical Significance, Johnson
Middle School, 2003-2007

Dependent            Pre-    Post-                Percent   SD      Cohen's
variable             mean    mean     Difference  change    pre     d
Perf bands           1.610   1.715    .105        6%        1.019   .10
Basic + prof + adv   .565    .566     .001        2%        .496    .00
Prof + Adv           .185    .263     .078        42%       .389    .20
Roosevelt Middle School. Changes in performance at Roosevelt Middle
School were statistically significant for all three dependent
variables. As shown in Table 13, the observed probability for the “all
performance bands” dependent variable is .002. Using the significance level of
.15, p = .002, p < .15. For the dependent variable “proficient + advanced,” p =
.024, p < .15, and therefore these results are statistically significant. Similarly,
the dependent variable of “basic + above” also shows statistical significance;
p = .103, p < .15.
It is noted that the N for Roosevelt was significantly lower for the post-
intervention year as compared to pre-intervention, 4 years earlier. This is most
likely due to a reduction in the size of Roosevelt’s attendance area when a new
school opened in the fall of 2004. What is the practical significance of the
changes in these CST data? The changes over 4 years’ time at Roosevelt all
reflect positive change.
Table 13
Pre- and Post-Intervention CST ELA Performance Band Data, Roosevelt Middle
School, 2003-2007

Dependent         Pre-N   Post-N   Pre-M   Post-M   Difference   t ratio       Observed
variable          2003    2007     2003    2007     in M         (df)          probability
All perf. bands   1334    1111     1.820   1.958    .138         3.101 (2443)  .002*
Basic and above   1334    1111     .629    .661     .032         1.631 (2443)  .103*
Prof + Adv        1334    1111     .276    .318     .042         2.262 (2443)  .024*

Note. Equal variances were assumed. *p < .15.
However, the practical significance is negligible or slight in most cases.
This is true in terms of raw change for all three dependent variables. In terms of
percentage change, “all performance bands” and “basic + above” dependent
variables showed insignificant improvement, 8% and 5%, respectively, but the
“proficient + advanced” dependent variable showed a 15% gain. This is a
significant increase; however, one notes that the increase occurred over a period
of 4 years, thus diminishing the practical significance of the finding.
In terms of effect size as measured by Cohen’s d, and using the standard
for education of .20 as the level of significance, no dependent variables at
Roosevelt show strong effect sizes. The “all performance bands” dependent
variables showed the highest effect size, with d = .13. The “basic + above” and
the “proficient + advanced” dependent variables showed effect sizes of d = .07
and d = .09, respectively. These data are summarized in Table 14.
Table 14
Pre- and Post-Performance Band Findings: Practical Significance, Roosevelt
Middle School

Dependent            Pre-    Post-                Percentage   SD      Cohen's
variable             mean    mean     Difference  change       pre     d
Perf bands           1.820   1.958    .138        8%           1.070   .13
Basic + prof + adv   .629    .661     .032        5%           .483    .07
Prof + Adv           .276    .318     .042        15%          .447    .09
Lincoln High School. The pre- and post-intervention measures at
Lincoln High School represent a 3-year span, 2004-2007. The changes that
occurred were not statistically significant except for the dependent variable of
“proficient + advanced.” For the dependent variable “all performance bands,”
p = .19, .19 > .15; therefore, statistical significance is not indicated. For the
“basic + above” dependent variable, p = .23, .23 > .15, and so p is not
significant. For the “proficient + advanced” dependent variable, p = .15, which
sits exactly at the threshold used in this study and was treated as statistically
significant. It is noted that the mean for the “basic + above” dependent variable
showed a loss. These data are summarized below, in Table 15.
As indicated in Table 16, results for Lincoln High School show very
weak practical significance on all measures. In terms of raw change, all
dependent variables show minimal gains. The “basic + above” dependent
variable had a negative change.
Table 15
Pre- and Post-Intervention CST ELA Performance Band Data, Lincoln High
School, 2004-2007

Dependent         Pre-N   Post-N   Pre-M   Post-M   Difference   t ratio       Observed
variable          2004    2007     2004    2007     in M         (df)          probability
All perf. bands   1560    1707     2.079   2.137    .058         1.299 (3265)  .19
Basic and above   1560    1707     .697    .677     -.020        1.205 (3265)  .23
Prof + Adv        1560    1707     .400    .425     .025         1.434 (3265)  .15*

Note. Equal variances were assumed. *p < .15.
Table 16
Pre- and Post-Performance Band Findings: Practical Significance, Lincoln High
School, 2004-2007

Dependent        Pre-    Post-                 Percent   SD      Cohen's
Variable         Mean    Mean    Difference    change    Pre     d

Perf bands       2.079   2.137    .058          03%      1.241    .05
Basic+pro+adv     .697    .677   -.020         -03%       .460   -.03
Prof+Adv          .400    .425    .025          06%       .490    .05
Of course, this could be a practically positive outcome if the “proficient
+ advanced” dependent variable had shown a particularly large gain, indicating
that many basic students had moved into proficient. However, this was not the
case for Lincoln High School. In terms of raw change, results were small for all
dependent variables. When using Cohen’s d as the measure of effect size, all
dependent variables showed minor effects. The dependent variables “all
performance bands” and “proficient + advanced” both showed d = .05. The
standard commonly used for practical significance with Cohen’s d is d > .20,
so these results are not practically significant. The “basic + above” group
showed a negligible effect of d = -.03. Data are shown in Table 16.
Ford Elementary School. As shown in Table 17, pre-post changes,
2005-2007, are statistically significant using .15 as the level of statistical
significance. For all bands, the observed probability was reported as .000
(p < .001), well below .15. For students in the performance bands of “basic +
above,” p < .001 and is therefore statistically significant. The p value for
students in the “proficient + advanced” band is .003 with equal variances
assumed; p < .15 and is therefore statistically significant. These data are
displayed in Table 17, which shows t-test results.
It is noted that there is a substantial change in N over the 2-year
intervention period. This change was most likely due to decreasing enrollment
in the attendance area as well as to a reduction in the number of students “off-
loaded” to Ford Elementary School from other impacted schools in the area.
Table 17
Pre- and Post-Intervention CST ELA Performance Band Data, Ford Elementary
School, 2005-2007

Dependent         Pre-N,   Post-N,   Pre-M,   Post-M,   Difference   t-ratio       Observed
Variable          2005     2007      2005     2007      in M         (df)          probability

All perf. bands   566      429       1.705    2.040     +.335        4.875 (993)   .000*
Basic and above   566      429        .583     .706     +.123        4.031 (993)   .000*
Prof + Adv        566      429        .231     .315     +.084        2.948 (993)   .003*

Note. Equal variances were assumed.
Practical significance was calculated using Cohen’s d, as shown in Table
18. Using the standard of practical significance of d > .20, the change in
performance bands among all tested students was large, with an effect size of
d = .31. There is a smaller, but still somewhat strong, effect for the movement
of students into the “basic + above” bands over the 2-year period. The least
growth, although still practically significant, was shown by the “proficient +
advanced” dependent variable, with a Cohen’s d of .20. Using the standard for
practical significance of .20, Cohen’s d results are practically significant for
all dependent variables.
Table 18
Pre- and Post-Performance Band Findings: Practical Significance, Ford
Elementary School, 2005-2007

Dependent        Pre-    Post-                 Percent   SD      Cohen's
Variable         Mean    Mean    Difference    Change    Pre     d

Perf bands       1.705   2.040   +.335         19%       1.076   0.31
Basic+above       .583    .706   +.123         21%        .493   0.25
Prof+Adv          .231    .315   +.084         36%        .422   0.20
For practical significance, raw change was also calculated; this is
displayed as “Difference” in Table 18. All dependent variables showed gains.
The “all performance bands” and “basic + above” groups showed practical
significance that exceeded the standard of .10. In the all performance bands
dependent variable, a change of .335 indicates strong practical significance.
The “basic + above” dependent variable showed improvement of .123. However,
given that this difference occurred over a 2-year period, it may or may not be
truly practically significant. The gain in the “proficient + advanced”
dependent variable, .084, does not show practical significance using the
standard of .10.
Another view of practical significance is the percentage of change. For
the calculation of this ratio, the researcher calculated the change in the mean
from the pre-intervention year to the post-intervention year and divided this
number by the mean of the pre-intervention-year results. The results showed a
19% change in performance bands over the 2-year intervention period, or an
average of close to 10% per year, which approximates the NCLB requirement of
10%. The percentages of change for both the “basic + above” and the
“proficient + advanced” dependent variables also showed gains that were
practically significant over the 2-year period.
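The percentage-of-change ratio described above is a single division. As a check, this sketch reproduces the “all performance bands” figure for Ford Elementary School reported in Table 18 (pre-intervention mean 1.705, post-intervention mean 2.040):

```python
def percent_change(pre_mean, post_mean):
    """Change in mean divided by the pre-intervention mean."""
    return (post_mean - pre_mean) / pre_mean

pc = percent_change(1.705, 2.040)
print(f"{pc:.3f}")  # ≈ .196, consistent with the 19% reported in Table 18
```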
Madison Elementary School. When a t-test was applied to the pre-
intervention and post-intervention data for Madison Elementary School,
statistical significance was found for all dependent variables at the .15 level.
For the “all performance bands” dependent variable, p = .015, which is below
.15, so the change is statistically significant. For the “basic + above”
dependent variable, p = .06, also below .15 and statistically significant. For
the “proficient + advanced” dependent variable, p = .002, again below .15 and
statistically significant.
Table 19
Pre- and Post-Intervention CST ELA Performance Band Data, Madison
Elementary School, 2005-2007

Dependent         Pre-N,   Post-N,   Pre-M,   Post-M,   Difference   t-ratio        Observed
Variable          2005     2007      2005     2007      in M         (df)           probability

All perf. bands   704      698       1.645    1.788     .143         2.43 (1400)    .015*
Basic and above   704      698        .570     .619     .049         1.881 (1400)   .06*
Prof + Adv        704      698        .213     .285     .072         3.127 (1400)   .002*

Note. Equal variances were assumed.
The gains in ELA performance on the CST at Madison Elementary
School indicated little practical significance with an important exception in one
dependent variable. Using raw change, the gains are positive but modest over a
2-year period. However, the raw change in the “proficient + advanced”
dependent variable shows large practical significance. In terms of Cohen’s d,
the standard of d > .20 is not met by any dependent variable, although the
“proficient + advanced” dependent variable showed gains close to the .20 level.
When percentage change was calculated, the “proficient + advanced” dependent
variable showed strong practical significance, 34%, using 10% as the
significance level. Table 20 displays these data.
Table 20
Pre- and Post-Performance Band Findings: Practical Significance, Madison
Elementary School

Dependent        Pre-    Post-                 Percentage   SD      Cohen's
Variable         Mean    Mean    Difference    change       Pre     d

Perf bands       1.645   1.788   .143           9%          1.078   .13
Basic+pro+adv     .570    .619   .049           9%           .495   .10
Prof+Adv          .213    .285   .072          34%           .410   .18
Summary of Findings of Significance. A summary of the key results
from the pre-/post-intervention independent-groups portion of this study appears
in Table 21. This table is provided as a quick reference for district-level
stakeholders who have an interest in the findings.
Table 21
Summary of Statistical and Practical Significance of CST Performance Band
Changes at Experimental (Site Support Team) Schools

School /                     Raw       t-test:        Cohen's d      Percent Change in
Dependent                    Change    Obs. Prob. /   Test of        Band / Avg Percent
Variable                     in M      Stat. Sig      Practical      Change per Yr
                                       (Y/N)          Significance

Van Buren HS (5 years of Site Support Teams)
  Performance Band Score     .479      .000*  Y       .43            32% / 6.4% per year
  Basic+Prof+Adv             .148      .000*  Y       .30            29% / 5.8% per year
  Prof+Adv                   .148      .000*  Y       .38            80% / 16% per year

Fillmore HS (4 years)
  Performance Band Score     .037      .367   N       .04            02% / <1% per year
  Basic+Prof+Adv             .036      .038*  Y       .08            -5% / -1% per year
  Prof+Adv                   .035      .041*  Y       .08            11% / 2.2% per year

Johnson (4 years)
  Performance Band Scores    .105      .02*   Y       .10            06% / 1.5% per year
  Basic+Prof+Adv             .001      .95    N       .00            02% / <1% per year
  Prof+Adv                   .078      .00*   Y       .20            42% / 10.5% per year

Roosevelt MS (4 years)
  Performance Band Scores    .138      .002*  Y       .13            08% / 2% per year
  Basic+Prof+Adv             .032      .103*  Y       .07            05% / 1.25% per year
  Prof+Adv                   .042      .024*  Y       .09            15% / 3.75% per year

Lincoln (3 years)
  Performance Band Scores    .058      .19    N       .05            03% / 1% per year
  Basic+Prof+Adv             -.020     .23    N       .03            -03% / <1% per year
  Prof+Adv                   .025      .15*   Y       .05            06% / 2% per year

Ford ES (2 years)
  Performance Band Scores    .33       .000*  Y       .31            19% / 10% per year
  Basic+Prof+Adv             .12       .000*  Y       .25            21% / 10% per year
  Prof+Adv                   .08       .003*  Y       .20            36% / 18% per year

Madison ES (2 years)
  Performance Band Scores    .143      .015*  Y       .13            09% / 5% per year
  Basic+Prof+Adv             .049      .06*   Y       .10            09% / 5% per year
  Prof+Adv                   .072      .002*  Y       .18            34% / 17% per year

Notes. The criterion used for statistical significance is p < .15. Statistical
significance is indicated with an asterisk. The criterion used for practical
significance using Cohen's d is .20. Bold = d > .20.
The table is arranged chronologically, by year of onset of the
intervention. Each of the seven schools in this study is listed in the left
column with the three dependent variables of CST ELA performance, for all
performance bands, for “basic + proficient + advanced” (“basic + above”), and
for “proficient + advanced,” listed per school. In all cases, results were
calculated using the English Language Arts portion of CST scores for the year
prior to the onset of the intervention (Site Support Teams) through 2007. To
calculate means for dependent variables, each student’s ELA score was assigned
the number associated with his/her performance band. Each student who
scored “Far Below Basic” received a 0 in the calculation, each student who
scored “Below Basic” received a 1 for the calculation, and so on.
FBB = 0
BB = 1
Basic = 2
Proficient = 3
Advanced = 4
The statistical analysis program SPSS was used for the calculation of all
means, standard deviations, and t-tests.
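A minimal sketch of this coding and of the dependent-variable means (the five-student roster below is hypothetical; the actual calculations were done in SPSS on the full data file):

```python
# Performance-band coding used for the dependent-variable means (FBB = 0 ... Advanced = 4).
BAND_CODE = {
    "Far Below Basic": 0,
    "Below Basic": 1,
    "Basic": 2,
    "Proficient": 3,
    "Advanced": 4,
}

def band_mean(bands):
    """Mean of the coded scores: the 'all performance bands' dependent variable."""
    codes = [BAND_CODE[b] for b in bands]
    return sum(codes) / len(codes)

def prop_at_or_above(bands, threshold_code):
    """Proportion of students at or above a band, e.g. 2 for 'basic + above',
    3 for 'proficient + advanced'."""
    codes = [BAND_CODE[b] for b in bands]
    return sum(c >= threshold_code for c in codes) / len(codes)

# Hypothetical five-student school:
students = ["Far Below Basic", "Basic", "Basic", "Proficient", "Advanced"]
print(band_mean(students))             # (0 + 2 + 2 + 3 + 4) / 5 = 2.2
print(prop_at_or_above(students, 2))   # 4 of 5 students at basic or above = 0.8
```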
The table summarizes the raw change of the mean for each dependent
variable in the third column. The statistical significance of the change from
the pre-intervention year to the post-intervention year is indicated in the
fourth column. Statistical significance was computed using a t-test with a .15
significance level for observed probability; results found to be significant
at that level are indicated with an asterisk. The column labeled Cohen’s d
summarizes the practical significance of the change using .20 as the
significance level. Cohen’s d was calculated by dividing the difference between
the pre- and post-intervention means by the standard deviation of the
pre-intervention scores. The final column displays the percentage of change,
which was calculated by dividing the difference in the means by the mean of the
pre-intervention year. Figures used in these calculations are displayed earlier
in this chapter in Tables 7 through 20. The criterion used for practical
significance using percent change is 10%.
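The Cohen’s d calculation described above divides the change in means by the pre-intervention standard deviation. This sketch checks it against the Ford Elementary School “all performance bands” figures reported in Table 18:

```python
def cohens_d(pre_mean, post_mean, pre_sd):
    """Effect size: difference in means divided by the pre-intervention SD."""
    return (post_mean - pre_mean) / pre_sd

# Ford Elementary, all performance bands (Table 18): pre 1.705, post 2.040, pre SD 1.076.
d = cohens_d(1.705, 2.040, 1.076)
print(round(d, 2))  # 0.31, matching Table 18
```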
The results in Table 21 show statistical significance in most schools
and for most dependent variables. The picture of statistical significance
might shift if the longevity of the intervention could be factored into the
calculation. However, the overarching concern of the school district would be
the practical significance of the results.
In terms of practical significance, a few patterns are evident in the
summary Table 21. Two of the seven intervention schools showed significant
gains in achievement across all dependent variables over the duration of the
implementation of Site Support Teams as bounded by the years of this study,
i.e., through 2007. Those schools are Van Buren High School and Ford
Elementary School. In these cases, there were strong gains as measured by t-
tests, Cohen’s d, and percent change; moreover, the significance of the increases
in student achievement remained strong when the number of years of the
intervention were considered (average percent change per year).
In addition to very strong results for these two schools, a pattern
emerged of practical significance or relative strength of improvement in all
seven schools in the “proficient + advanced” dependent variable. The percent
change in this dependent variable at Van Buren High School was very strong,
80% change in 5 years, and at Ford Elementary School, with 36% change over 2
years. Madison Elementary results showed a 34% change over 2 years in the
“proficient + advanced” dependent variable, also a moderately strong change;
Johnson Middle School showed 42% gain over 4 years. While Roosevelt
Middle School, Fillmore High School, and Lincoln High School showed lower
increases (15% in 4 years, 11 % in 4 years, and 6% in 3 years, respectively), the
“proficient + advanced” dependent variable was the variable of relative strength
for all three of these schools and reflected definite gains.
These summary results are addressed further in Chapter 5.
Results of Quasi-Experimental Design Using Matched
Control Group Comparison
The control and experimental schools were pairs of “best fit” matches in
the pre-intervention year. Each control school was within +/- 7 API points of its
respective experimental school. All schools were well below the state’s target
of 800 API points in the pre-intervention year. It is reasonable to assume that all
schools worked to effect improvements in achievement scores during the course
of the intervention at the experimental schools. While API is a difficult number
to use statistically, gains or losses in API have practical significance for
practitioners in California. Table 22 shows API outcomes as compared to those
of the pre-intervention year, for both experimental and control schools
(California Department of Education, 2007e).
As seen in Table 22, schools that started out “similar” in the pre-
intervention year did not necessarily remain similar due to different rates of API
growth. While 13 of the 14 schools in the table increased API scores in 2007 as
compared to the pre-intervention year, six of the seven experimental schools
showed a greater increase than their respective control schools. In fact, Van
Buren High School, Madison Elementary School, and Fillmore High School were
experimental schools whose API gains exceeded those of their control schools by
20 or more points. In contrast, only Lincoln High School failed to make API
gains as strong as its control school’s.
Table 22
API Outcomes for Control and Experimental Schools

                                   Control Schools' API       Experimental Schools' API
                                   Scores (Growth API)        Scores (Growth API)
Control v. Exp. Schools /          Pre-Int.          API      Pre-Int.          API      API net
# Years between pre-               Year      2007    Gain     Year      2007    Gain     diff.
intervention and 2007

Van Buren - Downey / 5              579      663      84       579      693     114      +30
Fillmore HS - Ronald Reagan / 4     639      664      25       644      689      45      +20
Johnson - Budd / 4                  598      641      43       588      637      49       +6
Roosevelt - McDougall / 4           647      682      35       650      695      45      +10
Lincoln - Latham City / 3           714      751      37       713      711      -2      -39
Ford ES - Haight / 2                697      751      54       690      745      55       +1
Madison - Wilson / 2                655      661       6       658      693      35      +29
Notes. Bold indicates the school with the greater API increase.
The author analyzed the similar-schools list to identify schools within +/- 5
API points of each experimental school’s API. In the case of Ford Elementary
School, none of the schools within this band reflected Ford’s degree of
diversity or had percentages of EL students similar to Ford’s. For these
reasons, the author selected a school within 7 API points of Ford’s.
Table 22 is based upon Base API, as Similar Schools lists are associated with
the Base API calculation. Elsewhere in this report, Growth API was reported, as
that was the only calculation available through 2007 at the time of writing.
The difference between the “Growth” and “Base” measures is significant for
Lincoln High School, which had, for the 2004 testing, a Growth API of 693 but,
when recalculated, a Base API of 713.
All control schools made API gains during the period of intervention at
the experimental schools, although one control school, Pete Wilson Elementary,
made a net gain of only six API points over a 2-year period. Ford Elementary
School showed a 55-point increase over 2 years and demonstrated strong
practical significance on other measures (Cohen’s d and percent change); it may
be coincidence that Ford’s control school made a very similar gain, 54 API
points, during the same period.
One final set of analyses was used to compare the progress of the
experimental schools with that of the control schools. Appendix D provides
tables of calculations of changes in mean, standard deviation, and the resultant
Cohen’s d and percent change calculations. Raw change in Mean, Cohen’s d
test of practical significance, and percent change in band were then displayed in
tables below for each pairing of experimental school and control school.
Van Buren High School. John G. Downey High School was the “best
fit” control school as matched to Van Buren High School based upon 2002 (pre-
intervention) data and demographics. Results after the 5-year period show the
schools’ changes in Mean, percentage change, and Cohen’s d scores. These data
are expressed in Table 23.
Table 23
Comparison of Change within Dependent Variables, Control School, and Van
Buren

School            Pre-N    Post-N    Change in M    Percent Change    Cohen's d

C-perf bands      1331     1524      .434           27%               .38
E-perf bands      1610     1231      .479           32%               .43
C-Basic+above     1331     1524      .110           20%               .22
E-Basic+above     1610     1231      .148           29%               .30
C-Prof+Adv        1331     1524      .157           72%               .38
E-Prof+Adv        1610     1231      .148           80%               .38

Note. “Perf bands” refers to data related to changes in mean in students’
performance when all performance bands are considered. “Basic + above”
refers to mean and numbers of students in “basic,” “proficient,” and “advanced”
bands. “Proficient + Advanced” refers to mean of students in one of those two
performance bands. All are based on a 0-4 coding as explained earlier in
chapter.
Both high schools improved significantly during the 5-year period of the
intervention at Van Buren High School. Van Buren High School outpaced the
improvements at its control school in all dependent variables on measures of
percentage change. In terms of effect size as measured by Cohen’s d, Van
Buren High School improved more than the control school in the dependent
variables of “all performance bands” and “basic + above”; the schools had
identical Cohen’s d scores for the “proficient + advanced” dependent variable.
In terms of raw change of mean, Van Buren High School’s change was greater for
the “all performance bands” and “basic + above” dependent variables, but lower
for “proficient + advanced.” It is noted that the control school added
approximately 200 students during this period while the experimental school’s N
declined by approximately 280 students. These changes in N could have
influenced test scores if the composition of the student body changed, such as
in the percentage of students in poverty or the percentage of English Learners.
Fillmore High School. How did Fillmore High School perform over a 4-
year intervention period as compared to its control school? Fillmore High
School performed better on most measures as compared to the school that was
its “best fit” match in 2003. In this case, the control school showed a decrease
in Mean for all dependent variables, resulting in negative percentage change and
meaningless effect sizes as measured by Cohen’s d. In contrast, Fillmore High
School had a small positive gain in Mean in the “all performance bands” and
“proficient + advanced” dependent variables. Like its control, the experimental
school showed small negative changes in Mean and also percentage change for
the “Basic + above” dependent variable. Cohen’s d was low for the
experimental school, especially given the long duration of the intervention.
These results are displayed in Table 24.
Table 24
Comparison of Change within Dependent Variables, Control School, and
Fillmore

School            Pre-N    Post-N    Change in M    Percent Change    Cohen's d

C-perf bands      1869     1973      -.039          - 2%              -.03
E-perf bands      1654     1387       .037            2%               .04
C-Basic+above     1869     1973      -.035          - 5%              -.08
E-Basic+above     1654     1387      -.036          - 5%              -.08
C-Prof+Adv        1869     1973      -.008          - 2%              -.02
E-Prof+Adv        1654     1387       .035           11%               .08

Note. “Perf bands” refers to data related to changes in mean in students’
performance when all performance bands are considered. “Basic + above”
refers to mean and numbers of students in “basic,” “proficient,” and “advanced”
bands. “Proficient + Advanced” refers to mean of students in one of those two
performance bands. All are based on a 0-4 coding as explained earlier in
chapter.
It is noted that the schools diverged in the numbers of students tested
during the 4-year period. While the control school increased the number of
students tested by 5%, approximately 100 students, Fillmore’s student numbers
decreased by over 260 students, or 16%. While it is known that Fillmore High
School’s attendance area was reduced when a new school opened, whether that
change was a factor in the changes in scores is not known.
Johnson Middle School. Johnson Middle School and its control school
realized positive but modest gains during the 4 years of the intervention. The
achievement improvements of the control school, although small, were stronger
than those of Johnson Middle School on the “all performance bands” and the
“basic + above” dependent variables. However, both schools showed substantial
percentage change for the “proficient + advanced” dependent variable.
Interestingly, the results of the two schools were similar, although not
identical, on measures of change in mean and percentage change. Both had
borderline practical significance as measured by Cohen’s d at a significance
level of d > .20. The strong change in the “proficient + advanced” dependent
variable as measured by both percentage change and Cohen’s d is tempered by the
fact that the improvements occurred over a 4-year period (Table 25).
Table 25
Comparison of Change within Dependent Variables, Control School, and
Johnson Middle School

School            Pre-N    Post-N    Change in M    Percent Change    Cohen's d

C-perf bands      1190      920      .229           16%               .23
E-perf bands      1372     1097      .105           06%               .10
C-Basic+above     1190      920      .066           13%               .13
E-Basic+above     1372     1097      .001           02%               .00
C-Prof+Adv        1190      920      .068           49%               .20
E-Prof+Adv        1372     1097      .078           42%               .20

Note. “Perf bands” refers to data related to changes in mean in students’
performance when all performance bands are considered. “Basic + above” refers
to mean and numbers of students in “basic,” “proficient,” and “advanced”
bands. “Proficient + Advanced” refers to mean of students in one of those two
performance bands. All are based on a 0-4 coding as explained earlier in
chapter.
Roosevelt Middle School. Roosevelt Middle School and the control
school with which it was paired remained remarkably similar after a 4-year
period. Both demonstrated modest positive gains for all dependent variables.
Cohen’s d scores were unremarkable for both schools. The percentage change
was strongest for both schools for the “proficient + advanced” dependent
variable, although the control school’s 20% increase and Roosevelt’s 15%
increase were both modest given that the intervention spanned 4 years. Both
schools also had 13%-16% declines in N during this period.
Table 26
Comparison of Change within Dependent Variables, Control School, and
Roosevelt

School            Pre-N    Post-N    Change in M    Percent Change    Cohen's d

C-perf bands       623      541      .147           08%               .13
E-perf bands      1334     1111      .138           08%               .13
C-Basic+above      623      541      .014           02%               .03
E-Basic+above     1334     1111      .032           05%               .07
C-Prof+Adv         623      541      .056           20%               .12
E-Prof+Adv        1334     1111      .042           15%               .09

Note. “Perf bands” refers to data related to changes in mean in students’
performance when all performance bands are considered. “Basic + above”
refers to mean and numbers of students in “basic,” “proficient,” and “advanced”
bands. “Proficient + Advanced” refers to mean of students in one of those two
performance bands. All are based on a 0-4 coding as explained earlier in
chapter.
Lincoln High School. Both Lincoln High School and its control school
showed very flat student achievement changes in the English Language Arts
portion of the CST over the 3-year intervention period. Changes in Cohen’s d
and percentage change of mean were in single digits for both schools. The
“Basic + above” dependent variables of both schools posted very small negative
changes in mean. For all dependent variables, and given the 3-year period,
available data show that any interventions yielded no practical significance.
Table 27 displays these data.
Table 27
Comparison of Change within Dependent Variables, Control School, and
Lincoln High School

School            Pre-N    Post-N    Change in M    Percent Change    Cohen's d

C-perf bands      1568     1581       .075           03%               .07
E-perf bands      1560     1707       .058           03%               .05
C-Basic+above     1568     1581      -.008          -01%              -.02
E-Basic+above     1560     1707      -.020          -03%              -.03
C-Prof+Adv        1568     1581       .009           02%               .02
E-Prof+Adv        1560     1707       .025           06%               .05

Note. “Perf bands” refers to data related to changes in mean in students’
performance when all performance bands are considered. “Basic + above”
refers to mean and numbers of students in “basic,” “proficient,” and “advanced”
bands. “Proficient + Advanced” refers to mean of students in one of those two
performance bands. All are based on a 0-4 coding as explained earlier in
chapter.
Ford Elementary School. Both Ford Elementary School and the control
school posted strong gains in all dependent variables over the 2-year period.
Ford’s increases in English Language Arts achievement were stronger than that
of the control school across all measures for the dependent variables of “all
performance bands” and “basic + above.” While the control school shows
greater improvement for the “proficient + advanced” dependent variable, Ford’s
percentage change of 36% reflects strong practical significance, and the
Cohen’s d calculation of .20 also indicates practical significance. The data
representing changes in the ELA portion of CST for Ford and its control school
are displayed in Table 28.
Madison Elementary School. The 2-year changes in ELA performance
on the CST for Madison Elementary School and for Pete Wilson Elementary
School are displayed in Table 29. In this case, the control school declined in
performance in all dependent variables. Madison showed increases, although
these 2-year increases showed little or no practical significance for the “all
performance bands” and the “basic + above” dependent variables. Of note,
however, are the positive changes in the “proficient + advanced” dependent
variable for Madison Elementary School.
Table 28
Comparison of Change within Dependent Variables, Control School, and Ford
Elementary School

School            Pre-N    Post-N    Change in M    Percent Change    Cohen's d

C-perf bands       558      390      .283           16%               .25
E-perf bands       566      429      .33            19%               .31
C-Basic+above      558      390      .085           14%               .17
E-Basic+above      566      429      .123           21%               .25
C-Prof+Adv         558      390      .120           48%               .28
E-Prof+Adv         566      429      .083           36%               .20

Note. “Perf bands” refers to data related to changes in mean in students’
performance when all performance bands are considered. “Basic + above” refers
to mean and numbers of students in “basic,” “proficient,” and “advanced” bands.
“Proficient + Advanced” refers to mean of students in one of those two
performance bands. All are based on a 0-4 coding as explained earlier in
chapter.
Table 29
Comparison of Change within Dependent Variables, Control School, and
Madison

School            Pre-N    Post-N    Change in M    Percent Change    Cohen's d

C-perf bands       316      280      -.051          -03%              -.05
E-perf bands       704      698       .143           09%               .13
C-Basic+above      316      280      -.026          -05%              -.05
E-Basic+above      704      698       .049           09%               .10
C-Prof+Adv         316      280      -.008          -03%              -.02
E-Prof+Adv         704      698       .072           34%               .18

Note. “Perf bands” refers to data related to changes in mean in students’
performance when all performance bands are considered. “Basic + above” refers
to mean and numbers of students in “basic,” “proficient,” and “advanced” bands.
“Proficient + Advanced” refers to mean of students in one of those two
performance bands. All are based on a 0-4 coding as explained earlier in
chapter.
In Table 29, the 34% change in mean shows strong practical
significance. The Cohen’s d result of .18 is very close to the threshold for
practical significance that is being applied in this study, .20. While Madison’s
achievement shows mixed results, it fared well on all measures in relation to the
school that was its 2005 “best match” similar school.
Summary of Findings
Schools with Site Support Teams show mixed results in terms of changes
in CST performance in the area of English Language Arts. Schools’ results
were examined for three dependent variables: “all performance bands,” “basic +
above,” and “proficient + advanced.” This study first analyzed each school’s
improvements over the period of time from the year prior to the onset of Site
Support Teams through 2007. In two cases, schools showed strong results with
both statistical and practical significance. In most cases, schools posted gains of
little or no practical significance. A pattern emerged of stronger relative gains
across most schools in the “proficient + advanced” dependent variable.
A second effort of this study was to consider the changes in CST
achievement of the seven experimental schools as compared to “best match”
similar schools over the same period. Two experimental schools showed strong
gains that exceeded those of their matched school on most variables. However,
most schools showed either mixed results as compared to the control, or results
that were better than the control school’s but nonetheless lackluster.
When analyzing changes in school achievement, it must be recognized
that multiple factors may influence the results. If quantitative analysis yields
mixed results when the efficacy of Site Support Teams is considered, are there
other factors that might have overshadowed or influenced any effect that the Site
Support Teams were having? Are there practices of implementation that vary
from one Site Support Team to another that seem associated with more positive
results? What do the practitioners at Site Support Team schools think about the
influence of Site Support Teams on student achievement? Responses to such
questions inform the conclusions about Site Support Teams and are considered,
along with the qualitative data gathered through surveys and interviews, in the
discussion in Chapter 5.
CHAPTER 5
OVERVIEW OF FINDINGS
Do Site Support Teams have an effect on student achievement? In
attempting to evaluate Site Support Teams, the author first reviewed the changes
in student achievement over the course of time in seven schools that had Site
Support Teams. Changes in the dependent variables of “all CST performance
bands,” “basic + above” performance bands, and “proficient + advanced”
performance bands were analyzed for statistical and practical significance. The
seven schools were also compared to control schools, each identified because of
similarity to its matched experimental school in the pre-intervention year.
Qualitative data taken from teacher surveys and interviews of key Site Support
Team members were used to help interpret the results.
In order for Site Support Teams to be linked to student achievement,
two sequential outcomes would need to take place between the onset of the Site
Support Team as an intervention and the post-intervention measurement of
student achievement. These outcomes can be framed as questions:
(a) whether Site Support Teams have prompted learning on behalf of the staff,
and (b) whether that learning resulted in behavior changes that yielded improved
student achievement. In this chapter the author discusses this chain of events
and then turns to the student outcomes displayed in Chapter 4.
Alexander and Murphy (1997) organized and grouped learner-centered
psychological principles of the American Psychological Association. These
principles summarized years of well-supported and replicated research.
Alexander and Murphy concluded that a mix of nomothetic and idiographic
factors affect learning. These include cognitive, metacognitive, affective-social,
developmental, situational, and contextual factors. Given these varied and
numerous factors, certainly the answer to the question, “Do Site Support Teams
yield learning and behavioral changes on the part of staff which, in turn, effect
improvements in student learning?” is going to be tentative and contextual.
Conclusions are further confounded by the possibility that the study’s
quantitative measure, California’s CST, may be used to draw inappropriate
inferences. Linn (2006), working at the University of
Colorado through the National Center for Research on Evaluation, Standards,
and Student Testing (CRESST) Center, prepared a technical report that calls into
question the integrity of the typical inferences drawn from standardized test
data. After reviewing relevant literature and conducting analyses of
standardized test scores in Colorado, Linn concluded that it was inappropriate to
draw the causal inference that gains in test scores reflected real improvement in
student learning. He discusses many other plausible alternate explanations of
results rather than school quality. While the study at hand has relied heavily
upon comparative standardized test results, the study does consider factors that
Linn advocates such as teachers’ and administrators’ reports of changes in
instructional practices. Moreover, practitioners cannot ignore standardized
measures such as the CST as these measures provide significant influence on
behaviors within the K-12 community in California.
Some of the improvements in both control and experimental schools may
have been a result of various factors not related to Site Support Teams. In
experimental schools, it was hoped that the Site Support Team would bring the
school’s staff to a new level of awareness and action, and do so
faster than might have happened had the school been left to its own devices.
Examples of this are seen in areas of high expectations, adherence to standards,
and learning environment.
High Expectations
As an example, since the first school became a Site Support Team
school, the California Department of Education began releasing test items.
These sample items helped teachers to see the high level of rigor that can be
interpreted within a standard. This simple action, the release of test items, in
itself, could cause teachers to become alert to the possibility of high
expectations. This is an example of a change in the educational environment
that might have contributed to improvements in school performance apart from
Site Support Teams; alternatively, Site Support Teams could call these tools to
the attention of teachers who might otherwise have used them less fully. (This
latter example seemed to be the case at Madison Elementary School.)
Adherence to Standards
In many schools, change is gradual and incremental. Standards have
been a strong part of California education for a decade, and yet it takes years for
teachers both to deeply understand the standards and their implications and
to critically analyze practices ingrained over time (largely because they were
thought to be successful) and to modify these practices for alignment to
standards. In other words, full acculturation to a major change such as the
standards movement takes time. Have Site Support Teams helped to accelerate
this process?
Learning Environment
Numerous factors form the backdrop for learning at a given school. Among
these are the social and emotional support for learning at the student and adult
levels, the resources for learning tools, and the staffing that allows appropriately
rigorous classes to be matched to students’ needs. Some of these are the
nomothetic factors discussed by Alexander and Murphy (1997), as noted above.
The factors above are among the contextual factors that are discussed
along with the results in the hope of understanding whether Site Support Teams
are an effective tool and, if so, under what circumstances. This chapter is
divided into the following sections:
• Discussion of Patterns of Results in Site Support Team Schools
• School-by-School Discussion
• Summary of Findings
• Recommendations
In all sections, the analysis of the quantitative findings is informed by
observed patterns, by information about contextual factors, and by the
qualitative data gathered from teacher surveys and from 10 interviews of key
Site Support Team participants. Survey results were taken from a February
2008 survey of teachers at the two Site Support Team schools that showed the
greatest improvement in student achievement during the intervention years.
Only teachers who taught English/Language Arts at a CST-tested grade level (2-
6 for elementary and 9-11 for high school) were surveyed. Twenty-four
elementary and 11 high school surveys were distributed. Thirteen surveys
(37%) were returned, nine at the elementary level and four at the high school
level. Interview data were gleaned from individual interviews conducted from
February 20–March 14, 2008. Individuals interviewed were two associate
superintendents, three principals, two directors, a retired curriculum specialist
who served for 4 years on Site Support Teams, and two instructional coaches at
Site Support Team schools.
Discussion of Patterns of Results in Site Support Team Schools
Statistical Significance
Dependent variables of “all performance bands,” “basic + above,” and
“proficient + advanced” were studied for each school. The changes in most
dependent variables were found to be statistically significant. At each of the
seven experimental schools, the three dependent variables were studied yielding
a total of 21 dependent variables. Changes were statistically significant in 15
out of the 21 instances. The results at two schools were statistically significant
across all three dependent variables. Four of the remaining five schools showed
statistically significant results in two of the three dependent variables, and one
school showed statistical significance in the “proficient + advanced” dependent
variable, only. The dependent variable of “proficient + advanced” was
significant for all schools with significance at the .05 level in six schools and at
the .15 level in one school.
Statistical significance tells us that the results found were unlikely to be
attributable to chance. Moreover, given the large Ns in this study, one might
conclude that statistical significance is an important outcome. However, a
confounding factor in this study is the length of time over which the
intervention has been in place. Since that interval varies from a minimum of 2
to a maximum of 5 years, it is likely to skew any conclusion that one might
otherwise draw from statistical significance alone.
Successful Schools
Two of the seven Site Support Team schools were particularly
successful. These schools showed significant gains in achievement across all
dependent variables over the duration of the implementation of Site Support
Teams as bounded by the years of this study. Those schools are Van Buren
High School and Ford Elementary School. In these cases, there were strong
gains as measured by t-tests, Cohen’s d, and percent change; moreover, the
significance of the increases in student achievement remained strong when the
number of years of the intervention was considered (average percent change per
year). Overall, the improvements in these schools were greater than those of
their respective control schools even though the control schools also showed
strong improvement. In the individual school sections, the contextual factors
that surrounded these schools’ efforts and that may be significant are discussed.
Practical Significance
Overall gains in performance band dependent variables show practical
significance that is negligible or slight in most cases. There are notable
exceptions to this such as the two “successful schools” described above, and
instances at other schools in which the Cohen’s d and percentage change tests
for practical significance are strong or moderately strong in the “proficient +
advanced” dependent variable. This occurred at three schools in addition to the
two “successful” schools.
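The two practical-significance measures used throughout this chapter, Cohen’s d and percent change, can be illustrated with a brief sketch. All values below are hypothetical and are not drawn from any school in the study; the sketch assumes Cohen’s d is computed as the difference in means divided by the pooled standard deviation, and that percent change may be averaged over the number of intervention years.

```python
from statistics import mean, stdev

def cohens_d(pre, post):
    """Standardized mean difference: (post mean - pre mean) / pooled SD."""
    n1, n2 = len(pre), len(post)
    v1, v2 = stdev(pre) ** 2, stdev(post) ** 2
    pooled_sd = (((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)) ** 0.5
    return (mean(post) - mean(pre)) / pooled_sd

def percent_change(pre_pct, post_pct, years=1):
    """Percent change in a performance-band percentage, averaged per year."""
    return (post_pct - pre_pct) / pre_pct * 100 / years

# Hypothetical "proficient + advanced" percentages, pre- vs. post-intervention
print(percent_change(20.0, 36.0))           # 80.0 (total percent change)
print(percent_change(20.0, 36.0, years=5))  # 16.0 (average per year)
```

A school moving from 20% to 36% proficient + advanced would thus show an 80% change, or a 16% average change per year over a 5-year intervention.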
Strong Improvement in “Proficient + Advanced”
Dependent Variable
Within the results, a pattern emerges of relative strength of improvement
in all seven schools in the “proficient + advanced” dependent variable. The
percent change in this dependent variable at Van Buren High School was very
strong, 80% change in 5 years, and at Ford Elementary School, with 36%
change over 2 years. Madison Elementary results showed a 34% change over 2
years in the “proficient + advanced” dependent variable; Johnson Middle School
showed 42% gain over 4 years. While Roosevelt Middle School, Fillmore High
School, and Lincoln High School showed lower increases (15% in 4 years, 11%
in 4 years, and 6% in 3 years, respectively), the “proficient + advanced”
dependent variable was the variable of relative strength for all. The Cohen’s d
test also showed practical significance in the “proficient + advanced” dependent
variable for Van Buren, Johnson, Ford Elementary School, and Madison.
What does this relative improvement in this dependent variable tell us?
With concerns of sliding into program improvement status, did the schools shift
their focus to that of “proficient + advanced” to align with NCLB (No Child Left
Behind Act, 2002) rather than focusing on moving students out of far below
basic and below basic? Did Site Support Teams, which were initially
implemented, in part, due to program improvement concerns, prompt this shift?
Did the schools simultaneously focus on the factors mentioned earlier of high
expectations, adherence to standards, and learning environment? If so, was this
shift in the focus of staff related to the work of Site Support Teams? These
questions remain to be explored.
Experimental Schools Fared Well as
Compared to Control Schools
Overall, experimental schools performed better than their respective
control schools over the period of the intervention. In the pre-intervention year,
control schools were +/- 7 API points of the experimental schools and had
similar characteristics in terms of size of school, ethnic make-up, percentage
English Learners, and percentages participating in the Free and Reduced Lunch
Program. In 2007, CST English Language Arts results were compared for the
three dependent variables. Table 30 summarizes the comparative results. Of the
three dependent variables at each of the seven pairs of schools, the control
schools out-performed the experimental schools only 3 out of 21 times. In
contrast, the experimental schools out-performed the control schools 9 out of 21
times. Schools showed similar results 9 out of 21 times. Therefore, the
experimental schools’ improvements were equal to or better than that of their
respective control schools in measures of 18 out of 21 combined dependent
variables.
Table 30
Table of Higher Performing Schools, Control Versus Experimental, 2007 ELA
Portion of CST
Name of Experimental School     “All Performance Bands”   “Basic + Above”   “Proficient + Adv”
Van Buren High School           E                         E                 S
Fillmore High School            E                         S                 E
Johnson Middle School           C                         C                 S
Roosevelt Middle School         S                         S                 S
Lincoln High School             S                         S                 S
Ford Elementary School          E                         E                 C
Madison Elementary School       E                         E                 E
Notes. Table is based upon differences in the Cohen’s d test of practical
significance.
E = Cohen’s d value of experimental school is .05 or more higher than that of
the control school.
C = Cohen’s d value of experimental school is .05 or more lower than that of
the control school.
S = Cohen’s d value of experimental school is within .04 of that of the control
school.
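The coding rule in the table notes can be expressed as a short sketch. The function name and structure are illustrative only, not part of the study’s instruments; the sketch assumes Cohen’s d differences are reported to two decimal places, so that “within .04” and “.05 or more” partition all cases.

```python
def code_school_pair(d_experimental, d_control):
    """Code a matched pair as E, C, or S from the difference in Cohen's d.

    E: experimental value is .05 or more higher than the control value.
    C: experimental value is .05 or more lower than the control value.
    S: the two values are within .04 of one another.
    """
    diff = round(d_experimental - d_control, 2)
    if diff >= 0.05:
        return "E"
    if diff <= -0.05:
        return "C"
    return "S"

print(code_school_pair(0.40, 0.30))  # E
print(code_school_pair(0.25, 0.32))  # C
print(code_school_pair(0.30, 0.28))  # S
```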
API as an Extension of the Study
While API was not a part of the study design except as a limiting
criterion for selection of control schools, the author wanted an understanding of
API fluctuations during intervention years as background knowledge prior to
conducting interviews of key Site Support Team participants. API rankings are
certainly part of the reality of public schools and, therefore, of interest to
practitioners. Therefore, the author includes discussion of API in this section
as an extension of the formal study. As is reviewed in the section of school-by-
school findings, the patterns of growth of the respective schools may be related
to the discussion of context of changes in student achievement. The fluctuations
in API for the experimental schools are displayed in Appendix E.
In terms of improvements as measured by the state’s API system, all
seven experimental schools improved during the Site Support Team years.
(Note that there was a 20-point disparity for Lincoln High School between the
Growth API and Base API calculations of 2004 data. Lincoln shows an
improvement over the course of the study only if one uses the Growth API
calculation for 2004. If the Base API is used, there is a decline. Base API
calculations of 2007 data are not available as of this writing in March 2008.)
The improvements ranged from notable to large. Two schools, those shown to
be most successful on all measures, had average API growth of 23-27 API
points per year of the intervention. Four schools averaged 11-17 API points per
year of intervention. Six of the seven experimental schools showed more
improvement on API than their “matched” control school. Given that one can
assume that all of these 14 schools felt pressure to show improvements given the
dual public accountability systems of API and NCLB, might one infer that the
stronger API gains in the district could be related to the Site Support Teams?
School-by-School Discussion
Van Buren High School
Van Buren High School was the first Site Support Team school. Since
the pre-intervention year of 2002, Van Buren has demonstrated significant
improvement on all measures. Its steady growth on API varied from 13 points
to 36 points each year for a total of 114 points during the course of the
intervention. In terms of the current study, Van Buren High School has shown
strong growth in all dependent variables.
A district-level Site Support Team member who participated on the team
during the 5 years represented in this study talked about leadership. The Site
Support Team was initiated by a very strong district-level leader. There was
strong accountability from the outset, according to the Site Support Team
member. Meetings were initially held every other week and district members of
the team were expected to return to the school to conduct classroom visitations
in between visits. According to this Site Support member, “Leadership is key.
If not from the site level, it must come from the district level. Someone needs to
say, ‘This matters. I am looking for this during walk-throughs.’” The current
principal shared a similar conclusion. “It was all about accountability early on.
The assistant superintendent insisted that there must be a change. It needed to
happen.”
A new member of Van Buren High School’s Site Support Team
indicated that meetings in the current year no longer involved classroom walks.
He commented that this no longer seems necessary, as the administrative team is
in classrooms on a daily basis. This fact was borne out during a visit to the
principal’s office, where a sign tracks the number of classroom visits
made each month by administrative team members. This kind of ongoing
monitoring is likely a strong part of the school’s continued success.
During the 5 years of intervention as described in this study, Van Buren
High school has had three principals, and yet has had strong improvement each
year. An associate superintendent explained that Van Buren High School has
had consistency of purpose that has allowed the improvement to continue. She
also commented that it was significant that the successive principals have come
from within the school, serving first within the developing culture of the school
as vice principals before their promotions to principal.
In addition to leadership, focus on student learning and instruction, and
accountability, is there any other factor that has been significant in Van Buren’s
success? According to the instructional coach who is new this year to Van
Buren High School, teachers at the school truly believe that “all students can and
will learn,” a theme of the school. The coach says that he hears this discussed
by teachers and sees motivational and cultural signs to this effect in classrooms.
As contrasted with reports made by Van Buren High School’s first instructional
coach who arrived at the school in 2002 (her observations are summarized in
Chapter 1), there has been a transformation at Van Buren in collective self-
efficacy of the staff. Both the new coach and the first coach believe that the Site
Support Team has been a strong part of this transformation. The new coach
stated that he has seen the effects of the Site Support Team in experiences at the
school in his first six months’ tenure there. Additionally, as an editor of this
year’s accreditation report for the Western Association of Schools and Colleges
(WASC), he reported that he has read several references to it in the WASC
report. He concluded that the Site Support Team has been a big part of the
school’s improvement.
The current principal would concur with this assessment. He reports that
he sees a definite relationship between Site Support Team and student
achievement accountability systems. The Site Support Team helped the staff to
focus on what mattered most. Specific areas of instructional focus were, at first,
student engagement, and later a focus on mathematics instruction, on the
program for English Learners, and now a focus on special education services.
Van Buren High School is not likely to be a Site Support Team school in the
upcoming school year. It has outgrown the need, having developed its own
strong leadership, a focus on student learning, a belief that staff can make a
difference, and strong, yet supportive, accountability. It is expected that Van
Buren’s rate of improvement will continue along a steady trajectory.
Fillmore High School
Fillmore High School has had a Site Support Team for 4 years as of
2007. Gains demonstrated by the school were modest or inconsistent. While
the improvements in the dependent variable of “proficient + advanced” were
relatively strong, the dependent variables of “all performance bands” and “basic
and above” showed little change. When these results are considered in light of
the context of the school’s changes in the 4-year period, however, the results are
easily understood.
During the course of the 4-year period of intervention, the school has had
three principals. Unlike Van Buren High School, the new principals at Fillmore
High came in from the outside. In addition, the school underwent a significant
reduction in students tested during this time, presumably due to the opening of a
new high school. The number of students tested declined by 16% during this
time. While the percentage of English Learner (EL) students also declined from
35% to 28%, the number of students in poverty, as measured by the Free and
Reduced Lunch Program, increased by 21% from the level of 57% in 2004 to
69% in 2007. These contextual factors indicate that the school was undergoing
very significant changes during the years of this study.
To further explain Fillmore High School’s results, an associate
superintendent offers insight into the school culture. During its 18-year history,
the school had established a culture of great pride in its work. The associate
superintendent commented that, for a school like that, it may be more difficult to
accept the arrival of an “outside” team who may be perceived to review and
question the very practices that have previously been such a source of pride.
Fillmore High School’s Site Support Team continues through the present day.
As a new culture begins to take hold with the settling in of a new, strong site
leader, the collaborative effort by members of the school with members of the
Site Support Team is likely to lead to positive results.
Johnson Middle School
Johnson’s Site Support Team was in place for 4 years for the purposes of
this study. Johnson’s dependent variable of “proficient + advanced” showed
improvement that had practical as well as statistical significance. Other
dependent variables showed relatively flat changes.
Johnson underwent changes of leadership and a reduction in student
enrollment of 23% during the intervention years. Demographically, Johnson’s
percentage of EL dropped 16% during the years of this study; poverty increased
by 23%. There was a change in principal in August 2004, about six months
after the first Site Support Team was convened.
Given the significant changes in Johnson’s population as well as changes
in leadership, it is notable that the school had slight to significant increases in
dependent variables during this 4-year span. Ideally, a Site Support Team
would assist a school in making adjustments to practice in order to
accommodate such changes. Whether or not Johnson would have experienced a
downturn in improvement without the presence of the Site Support Team is
unknown.
Roosevelt Middle School
Roosevelt’s academic improvements were noted in all dependent
variables from 2003–2007. While practical significance was not demonstrated,
all dependent variables did show a positive change with the “proficient +
advanced” showing the largest increase. Perhaps these changes are significant
in real terms given that Roosevelt Middle School also underwent significant
changes in the 4 years of Site Support Team intervention. The student
population decreased by 17% during those years; most probably, the opening of
a new school was a factor. During this same time, the poverty level at Roosevelt
took a large jump as evidenced by participation in the Free and Reduced Lunch
Program. Participation rates jumped 52% from 54% in 2003 to 82% in 2007.
Finally, there was a change in leadership in the 2004-2005 school year. The
combination of these contextual factors may have had a strong influence on
student achievement.
Lincoln High School
Lincoln High School showed flat performance during the 3 years of the
study. The growth in dependent variables of “all performance bands” and “basic
+ above” were not statistically significant even at the .15 level. Moreover, the
“basic + above” dependent variable showed a decrease in mean over the course
of the 3 years. The “proficient + advanced” dependent variable showed
increases and was an area of relative strength; however, the increases were not
practically significant as measured by either Cohen’s d or by percentage change.
Lincoln’s study results were echoed in its API pattern. While Lincoln
High School increased by 28 API points from the pre-intervention year to the
first year of the intervention, that gain was not sustained. The school’s API took
a 15-point dip during the second year of the intervention and then rebounded in
the third year, but only by six API points. When discussing this inconsistent
growth pattern with the principal of Lincoln High School, insights emerged.
The principal indicated that fluctuating differences in students in attendance
from year-to-year might account for the oscillation in achievement. Lincoln
High School had been affected by the opening of a new high school as well as
by the ebb and flow of students who were redirected to Lincoln from their
impacted school of residence. This “redirection” caused a pattern of students
who were in attendance in one year and gone the next. The principal also noted
shifts in demographics of the neighborhood. When this hypothesis was
researched using DataQuest demographic data from the California Department
of Education website for the years 2005, 2006, and 2007, an interesting pattern
became evident (Appendix F).
As Appendix F shows, the percentages of students of various ethnic
subgroups changed little from year to year during the course of this study,
2004-2007. Similarly, percentages of English Learners and mobility rates
(mobility within a year, i.e., from October through April) showed minimal
fluctuation. However, Lincoln High School’s student body underwent
significant changes in percentages of the Free-and-Reduced Lunch (FRLP)
subgroup. In the pre-intervention year, students’ participation in the FRLP was
31%. By 2007, it had grown to 39%, an increase of 26%. However, during the
first year of the intervention, 2005, the FRLP rate dropped for one year to only
25%. By the following year, it had increased almost 50% to 37% FRLP. The
drop in poverty as measured by the FRLP in 2005 coincided with a one-time
spike in API of 28 points. The API growth dropped by 15 points the following
year, a year when the FRLP rate rose to 37%.
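The relative changes described above follow from ordinary percent-change arithmetic. A minimal check, using the FRLP participation rates quoted in this section:

```python
def percent_increase(old_rate, new_rate):
    """Relative increase between two FRLP participation rates, in percent."""
    return (new_rate - old_rate) / old_rate * 100

# Lincoln High School FRLP rates quoted above (percent of enrollment)
print(round(percent_increase(31, 39)))  # 26: pre-intervention year to 2007
print(round(percent_increase(25, 37)))  # 48: "almost 50%," 2005 to 2006
```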
The coincidence of swings in these two measures, API and FRLP as a
measure of poverty, may be just that, coincidence. However, given the
knowledge base in the field of the effects of poverty on student achievement,
these results may be a sad and striking exemplar. As Lincoln High School’s
poverty rate rose, perhaps the staff was unable to stave off its effects on student
achievement. The work of the Site Support Team, then, could have been to help
the faculty to make quick adjustments in curriculum and instruction in order to
provide the new group of students with necessary support. The reality of what
often happens in a school, however, is that demographic trends are not caught
quickly enough for adjustments to be made in academic schedules and
instruction. In addition, it often takes time to make adjustments in instruction.
Questions remain in terms of the Site Support Team and the patterns of
achievement noted above. Did the presence of the Site Support Team, in its
second year of existence at Lincoln High School, actually help the staff to focus
on instruction and to brake what might otherwise have been an even sharper
slide? After a year of gains of 28 API points, and given a 50% rise in poverty
rate, might it be considered a positive outcome that the API dipped only 15
points? Did the presence of the Site Support Team help the school to regain
upward momentum in the third year of intervention in spite of an FRLP rate that
continued to increase?
The principal of Lincoln High School believes that the school is “a
perfect example of not being able to pinpoint anything” in terms of causes for
fluctuations. “Perhaps by holding steady, we’ve done a lot.” The Site Support
Team took several years to grow into a helpful and productive entity at his school. He
claims that last year, the third year of Site Support Teams, was the first year that
teachers were really aware of and involved in the Site Support Team. The initial
years were years of confusion and frustration regarding Site Support Teams.
Perhaps there was lack of clarity for the principal about the direction that he was
supposed to take in terms of Site Support. He says that last year was the first
time that classroom visits occurred. The Team last year began to help with
progress monitoring data analysis and the planning of instructional adjustments
that should come from that process. Lincoln’s principal predicts even more
positive outcomes this year. “This year is the first time there has been the
pervasive feeling that Site Support Team members are here to help us.” It takes
time at some schools to get to that point. This year, he claims, they are using the
Site Support Team to bring in a broader perspective and to help the staff by
“informing us of best practices.”
Ford Elementary School
With a Site Support Team in place for two academic years, Ford
Elementary School made tremendous improvements in student achievement.
Improvements were significant across all dependent variables and favorable to,
or exceeding, those of the control school even though that school also had strong
improvements. In terms of API, Ford Elementary School made a 34-point
improvement in the first year of the intervention and then a 21-point
improvement in the second year of the intervention. In terms of context, Ford
Elementary School has low mobility among a large percentage of staff who are
long-term tenured faculty. The school has had the same principal for 20 years.
In the words of the principal, “Longevity has been an issue.” He explained that
while the stability brought by a principal’s long service at a school was positive
in terms of relationships, it could cause difficulty in terms of moving forward
curricular and instructional issues. He pointed to a degree of complacency that
can be associated with familiarity.
Ford Elementary’s principal would subscribe to the thesis that the Site
Support Team was a motivating factor for staff. “It helped to move the staff out
of complacency.” He mentioned that it was helpful to have eyes other than his
own identify the needs at the school and bring them to the attention of the staff. He said
it would have been difficult to move the school as much as he did without the
Site Support Team.
A director who participated on Ford’s Site Support Team talked about
both motivation and accountability. She said, “They’re [school principal and
teachers] accountable, but it’s not accountability with a big stick. It’s more,
‘Tell us how we can help you.’” The accountability is motivational, in itself.
The director points out that it is not a negligible aspect of accountability to
know, as a teacher, that your peers will be in your room watching you teach.
In the first year of the intervention, an associate superintendent was
worried about Ford Elementary School’s failure to meet Adequate Yearly
Progress (AYP) with its African-American subgroup. He challenged the staff of
Ford, including teachers, who participated on the Site Support Team, to identify
one African-American student who had been at Ford for his entire elementary
tenure and to look at the academic progress of that student. The staff identified
one boy and was shocked to learn that this student was “below basic” in his
sixth grade year, but had performed at the “advanced” level in second grade.
The staff traced the boy’s history of difficulty that included some very difficult
experiences outside of school, but this personalization of a statistic brought the
problem squarely home to the staff.
In addition to motivation, Ford’s principal talked, indirectly, about a shift
in teachers’ shared sense of self-efficacy. He said that the Site Support Team
made recommendations that all seemed “do-able” to the teachers. The Site
Support Team helped teachers to shift their focus. “We used to focus on API.
Teachers see a moral dilemma of letting low students flourish.” He spoke of the
difficult change in shifting to a belief that these students needed to be, and could
become, proficient.
A director who served on Ford’s Site Support Team talked about the
self-efficacy of teachers, as well. “They learned to take charge.” She described
how a Site Support Team meeting had been largely designed by the teachers
who wanted the team to watch a coach teach a lesson and then to have members
gather to debrief the instructional aspects of the lesson.
A recently retired central office staff member had participated on
numerous Site Support Teams for all of the years of the study. She saw
teachers’ changing collective self-efficacy as key at Ford Elementary School.
She said that a “strong variable was the participation of a contingent [of
teachers] who had just gone through an EL certification program through
Stanford.” She said that they provided a small, tight contingent that held each
other accountable and some of their fervor spread to the rest of the staff. One
teacher leader emerged who had knowledge and the respect of her peers.
Significantly, Ford Elementary School involved a rotating group of
teachers on the Site Support Team from its inception. It is unknown whether the
school would have enjoyed the gains it made without the direct teacher
involvement with the team.
Madison Elementary School
Madison Elementary School is one of the experimental schools that
exhibited only modest gains in terms of practical significance as measured by
raw change and Cohen’s d (although percentage change in “prof + adv” was
strong). However, the author noted that this particular school showed a
large gain in API during the first year of the intervention but slid back during
the second year (Appendix E). At this school, dissension had grown between the
teaching staff and the administration. After the first year of implementation of
Site Support Teams, teachers felt empowered and motivated; their collective
self-efficacy had improved to the point that they believed they could affect
student learning. By the second year of the intervention, teacher frustration
with what they perceived to be a lack of leadership had become a barrier to
continued improvement.
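For readers less familiar with the effect-size statistic referenced above, Cohen's d expresses the difference between two means in units of the pooled standard deviation. The following minimal Python sketch illustrates the calculation; the score samples are hypothetical and are not data from this study:

```python
from statistics import mean, stdev

def cohens_d(pre, post):
    """Effect size: difference in means divided by the pooled standard deviation."""
    n1, n2 = len(pre), len(post)
    s1, s2 = stdev(pre), stdev(post)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(post) - mean(pre)) / pooled_sd

# Hypothetical scale scores before and after an intervention
pre = [610, 625, 640, 655, 670]
post = [640, 655, 670, 685, 700]
print(round(cohens_d(pre, post), 2))  # 1.26
```

By the common rule of thumb, values near 0.2 are "small," 0.5 "medium," and 0.8 "large," which is why a gain that looks substantial as raw change can still register as modest when expressed as d.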
An associate superintendent who had strong involvement in Site Support
Teams, including that at Madison Elementary, concluded that, in judging
whether Site Support Teams are effective, one must return to the purpose of the
team. He asserted that Site Support Teams were designed to help with the
culture, to help the school to focus, to get them to analyze student work, and to
focus on their own lesson design. He concluded that the Site Support Team at
Madison achieved that. As evidence, the associate superintendent
produced documents from the most recent Site Support Team meeting in which
the staff had presented their recent work. This included outcomes from their
mid-year progress monitoring analysis and data from their cycle of inquiry
related to lesson study. Their progress monitoring data had been tracked for 2 years and
demonstrated clear improvement in areas identified as high-need and high-focus.
The associate superintendent believed that, in the long run, Site Support Teams
have been, and continue to be, effective in helping this school to improve, and
that a return to strong improvements in achievement scores is likely.
Summary of Findings
The author originally hypothesized that Site Support Teams would be
effective vehicles for spurring improved student achievement because of
foundational principles of accountability, motivation, and staff self-efficacy.
How strong were these principles in the views of teachers and administrators?
In fact, accountability was specifically mentioned by 6 out of 10 Site
Support Team members who were interviewed for this study. In addition, a
teacher who completed a survey wrote specifically about the positive impact of
increased accountability in response to an open-ended invitation to provide
additional comments. Five administrators who were interviewed about Site
Support Teams mentioned that teacher involvement was key to a Site Support
Team’s positive impact on achievement. Indeed, greater involvement often goes
hand in hand with greater accountability.
“Necessary but not sufficient.” A district office staff member who
participated on numerous Site Support Teams over a 5-year period saw
contrasts between the two schools that showed strong improvement and other
schools that did not. She said that, at the two successful schools, the leadership
for the change either came from the bottom up (small group of influential
teachers) or from a principal who was setting clear expectations. She said there
must be accountability. To extend this thought, this Site Support Team member
explained that there could be mutually planned accountability or supportive
accountability, but that there had to be some assurance of implementation and
persistence. “Leadership is key. If leadership does not come from the site level,
it must come from the district level. Someone needs to say, ‘This matters. I am
looking for this during walk-throughs.’”
According to Pintrich and Schunk (2002), motivation is “the process
whereby goal-directed activity is instigated and sustained” (p. 5). The
instigation of goal-directed activity may come from establishing a focus.
In their review of cognitive consistency theories, Pintrich and Schunk
(2002) observe that when one’s behaviors and one’s cognitions are not aligned,
one becomes motivated to change behaviors. Teachers and principals believe that
their daily work helps students and brings about learning. When they come to
see that learning is not occurring, or that their behaviors or practices are having
unintended harmful outcomes, they become motivated to make changes. They
are able to return their focus to the alignment of behaviors and goals. Site
Support Teams can provide cognitive dissonance by calling to teachers’
attention facts, patterns, or observations that may have been overlooked by staff
who have become accustomed to the culture and practices of the school.
Similarly, Site Support Team members bring new thoughts and new questions to
the site team for consideration. Clearly, the involvement of teachers, mentioned
as important by 4 of the 10 Site Support Team members interviewed, is a way to
increase motivation of teachers. One teacher, in response to a survey, described
being motivated to see “all of the great teaching” at her school as well as to
focus on some areas that she felt the whole staff needed to work on.
Seven out of ten Site Support Team members who were interviewed said
that Site Support Teams help the school to set and maintain a focus. Two
teachers, in response to open-ended questions on a survey, wrote about the Site
Support Team’s impact on staff motivation to focus on specific details of
practice. One Site Support Team member who served on numerous teams saw
Site Support as helpful in building focus and motivation of the site
administration. Another Site Support Team member who was interviewed
reported watching seasoned teachers with years of experience become renewed
as they were motivated to refocus on learning standards and student
engagement. Accountability, itself, is motivating, according to an interviewee.
When one knows that his or her practices or progress will be observed or
monitored, he or she is motivated to perform well. As an unintended positive
outcome, one teacher explained that the Site Support Team helped her to keep
from feeling isolated.
There are also indications that Site Support Teams worked by helping
teachers and administrators to increase their individual senses of self-efficacy as
well as their collective self-efficacy. For some, the challenges of working at a
high-needs school had brought about some discouragement. The attention and
focus of the Site Support Team can help to change that. According to one Site
Support Team member, “The team helps teachers to build self-efficacy, to think,
‘We are going to do this!’” However, the interviewee added that the support
needs to be coupled with instructional leadership and strong supervisory
practices by the site administrative team.
One teacher, in a survey response, said that it was not district personnel
that made her and her colleagues better teachers. Rather, she said it was her
colleagues’ own attention to detail that had made the difference. This sense of
excitement, of motivation, of ownership was echoed by others. One Site
Support Team interviewee cited, as evidence of increased teacher self-efficacy,
the spark that appeared again in teachers who had been teaching for 20 to 30
years.
During interviews in which Site Support Team members talked about
specific aspects of Site Support Team work that seemed to make a difference, a
number of characteristics that could be related to self-efficacy were revealed.
The involvement of teachers and the identification of effective teaching
practices through classroom visitations could, conceivably, be related to
teachers’ senses of self-efficacy. One principal cited the Site Support Team’s
provision of next steps that were “do-able” as motivational to teachers and
related to their self-efficacy. Finally, the relationships established between and
among district and site-level colleagues contributed to a shared sense of efficacy
in the face of a need to effect improved student learning.
All individuals who were interviewed for this study felt that Site Support
Teams were effective. However, returned teacher surveys showed mixed results
which split on elementary versus secondary lines. The nine elementary
respondents reported overall positive reactions and attributions to the Site
Support Team at their school. When asked to rate the degree to which the Site
Support Team had an influence on their teaching or professional practice, eight
out of nine teachers responded with scores of “3” or “4” on a 4-point scale with
a mean score for all nine responses of 3.11. When asked to what extent they had
seen changes at their school that might be attributed to the work of the team,
again eight out of nine respondents gave positive responses of “3” or “4,” with
a mean of 3.11 across the nine responses.
Four high school surveys were returned, a return rate of approximately
36%. When asked to rate the degree to which the Site Support Team had an
influence on their teaching or professional practice, two respondents rated a
score of “3,” a moderately positive response, and two gave a response of “1,” a
strong negative response. When asked to what extent they had seen changes at
their school that might be attributed to the work of the team, only one
respondent gave a strong negative score of “1”; two gave scores of “3,” or
“some change,” and one circled “4,” “significant change.” The mean was 2.75.
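The survey summaries reported above are simple descriptive statistics: a mean rating and a count of positive ("3" or "4") responses. As an illustration, the sketch below uses hypothetical individual ratings chosen only to be consistent with the reported elementary figures (mean 3.11, eight of nine positive); the actual response-level data are not reproduced here:

```python
def summarize(ratings):
    """Return (mean rating rounded to 2 places, count of ratings of 3 or 4)."""
    positive = sum(1 for r in ratings if r >= 3)
    return round(sum(ratings) / len(ratings), 2), positive

# Hypothetical elementary responses to the "influence on practice" item
elementary = [4, 3, 3, 3, 4, 3, 3, 3, 2]
print(summarize(elementary))  # (3.11, 8)
```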
What is the sum of these data? Quantitatively, we see that two schools
showed significant improvement, a few held steady in the face of rather
significant demographic and leadership changes, and a few had modest gains.
Most schools showed improvements in at least the “proficient + advanced”
dependent variable. Qualitatively, all 10 Site Support Team members who were
interviewed, and most, although not all survey respondents, were positive about
Site Support Teams and believed in their contribution to improvements in
student learning. How might the seeming contradictions in these data be
resolved? One explanation is that the Site Support Team does, indeed, spur the
learning and behavioral changes of staff. However, the results of this learning
require time before significant changes in student outcomes are realized.
Elmore and City (2007) of the Harvard Graduate School of Education write
about the oscillating pattern of school improvement. After years of studying
school improvement, they explain that the student performance curve is
typically flat during periods of learning by the staff. It is as a consequence of
improvements in the knowledge and skills of staff that effects are seen in student
performance. Elmore and City (2007) explain, “It takes time for these new
practices to mature and become part of the working repertoire of teachers and
administrators” (p. 170).
Recommendations
What are the next steps that members of Site Support Teams should
consider in their work in schools? Given the power of teacher efficacy at both a
personal level and at a collective level, Site Support Teams are advised to
consider the works of Bandura (1993) and Tschannen-Moran, Woolfolk Hoy,
and Hoy (1998). Bandura (1993) summarizes:
Schools in which the staff collectively judge themselves as powerless to
get students to achieve academic success convey a group sense of
academic futility that can pervade the entire life of the school. School
staff members who collectively judge themselves capable of promoting
academic success imbue their schools with a positive atmosphere for
development. (p. 141)
The charge of the Site Support Team members, then, is to provide school staffs
with substantial feedback and evidence of sufficient weight as to cause teachers
to modify their preexisting beliefs to new convictions that they, themselves,
have the capacity to effect rigorous student learning.
An associate superintendent in the district has a strong belief in teacher
involvement and a classroom focus from the outset of the Site Support
Team’s work. He believes that the Site Support Team must have a connection to the
school. Furthermore, he feels that classroom visits are essential. “You can’t
coach a school if you don’t know the school.” In terms of the coaching
function, the district office members of the team come in with fresh eyes and
help the principal to focus. According to the associate superintendent, the team
must empower the principal to adopt the focus and to carry it into his or her
leadership of the school.
This study has produced a quantitative analysis of the progress of seven
schools, and these data have been contextualized and enriched by the viewpoints
and analyses of study participants. In an attempt to synthesize these various
ideas, the author sought commonalities between the quantitative patterns and the
qualitative findings and concludes that Site Support Teams are most
effective when certain conditions exist.
• The principal has assistance in developing clarity about the purpose and
use of Site Support Teams. A Site Support Team leader should provide
assistance with establishing a focus.
• Teachers are involved in the discussions and in walk-throughs of
classrooms.
• Site Support Teams allow ample time for teachers to engage with peers
about practice so that the positive aspects of motivation are felt rather
than only the negative aspects of accountability.
• Team members use case histories of students to help “personalize” the
numbers.
• The Site Support Team is considered a “necessary but not sufficient”
practice in helping schools to improve. There is alignment of clear site
expectations, support for the expectations, and mutual accountability.
• A Site Support Team needs to recognize when a school is in a transition
of leadership or student demographics and design ways to help the
school to compensate quickly for these contextual changes.
Site Support Teams provide a blend of accountability, of improved
individual and collective efficacy, and of motivation. These three underlying
factors are interrelated, yet each contributes to the process of improving
achievement. Elmore and City (2007) cite three processes related to school
improvement. The first process is that of improvements in the knowledge and
skills of certificated staff in the realm of instruction. The second process is the
movement of staff into collective groups that share the responsibility for student
learning. The third process is the alignment of school resources, practices, and
policies to support learning and instruction. While these three processes are not
accomplished by a school in a short period of time, a Site Support Team can
assist a school in attending to and in working through each type of process.
When a school is embarking on an improvement effort through the use of a
Site Support Team, certain initial steps are critical to set up an environment for
success.
1. District members of the Site Support Team need to spend time at the
school getting to know the school and building trust. Discussion
opportunities with teachers are key.
2. Ensure teacher involvement in the process. Communicate the purpose to
the entire staff, along with information about what to expect and how
teachers can and will be involved.
3. Working collaboratively, the Site Support Team helps the school to
establish a focus.
• Observe classrooms with the “fresh eyes” of an outsider.
• Ask non-judgmental, data-driven questions grounded in classroom
observations and various types of student data.
4. As a Site Support Team, develop mutual accountability with all team
members committing to following through on promised actions in a
timely manner.
5. Help the school leader to align the expectations, practices, and
communication at the school so as to support the focus.
6. Provide, model, and expect supportive and mutual accountability.
7. Celebrate successes.
Site Support Teams can be effective structures in helping the staff of a school
to bring about improved student learning. Focused and supportive Site Support
Teams have great potential for making a difference in our schools.
REFERENCES
Alexander, P.A., & Murphy, K.P. (1997). The research base for APA’s learner-
centered psychological principles. In M. Lambert & B. L. McCombs
(Eds.), How students learn: Reforming schools through learning-
centered education (pp. 25-60). Washington, D.C.: American
Psychological Association.
Atkins, K., & Rossi, M. (2007, September). Change from within. Educational
Leadership, 65(1), 1-5. Retrieved September 27, 2007, from
http://www.ascd.org/portal/site/ascd/template.MAXIMIZE/menuitem.459dee008f99653.
Bandura, A. (1989). Regulation of cognitive processes through perceived self-
efficacy. Developmental Psychology, 25(5), 729-735. Retrieved
September 3, 2007, from
http://content.apa.org/journals/dev/25/5/729.pdf
Bandura, A. (1993). Perceived self-efficacy in cognitive development and
functioning. Educational Psychologist, 28(2), 117-148.
Bandura, A. (1997). Self-efficacy: The exercise of control. In P. Pintrich & D.
Schunk (Eds.), Motivation in education: Theory, research, and
applications (2nd ed., pp. 185-186). Upper Saddle River, NJ: Pearson
Education.
Burke, J.C. (2004). Achieving accountability in higher education: Balancing
public, academic, and market demands. In J. C. Burke (Ed.), The many
faces of accountability (pp. 1-24). San Francisco, CA: Jossey-Bass.
California Department of Education. (2000). California Public Schools
Accountability Act of 1999. Retrieved February 10, 2008, from
http://www.cde.ca.gov/ta/ac/pa.
California Department of Education. (2003). API Reports. Retrieved January
23, 2007, and February 4, 2007, from
http://star.cde.ca.gov/STAR2002/report.asp.
California Department of Education. (2006). Data Quest Demographic Data.
Retrieved February 27, 2008, from http://dq.cde.ca.gov/dataquest/
APIbase2006/2005base and http://dq.cde.ca.gov/dataquest/APIbase2007.
California Department of Education. (2007a). 2005-06 Accountability Project
Report. Retrieved January 8, 2007 from
http://api.cde.ca.gov/APIBase2006/2006Growth_DstApi.
California Department of Education. (2007b). Curtis Middle School data.
Retrieved September 29, 2007, from http://www.ed-data.k12.ca.us.
California Department of Education. (2007c). Standardized Testing and
Reporting (STAR) Program. Understanding 2007 STAR program tests:
Information for school district and school staff. Retrieved April 23,
2007, from http://www.cde.ca.gov/ta/tg/sr/documents/undrstnd07tsts.pdf.
California Department of Education. (2007d, March). 2006-07 Accountability
Progress Reporting System: Technical questions and answers, 2006-07
Academic Performance Index and Adequate Yearly Progress. Retrieved
April 24, 2007, from
http://www.cde.ca.gov/ta/ac/ap/documents/techqa06.pdf.
California Department of Education. (2007e, October 31). Academic
Performance Index (API) Report. Retrieved January 27, 2008, from
http://dq.cde.ca.gov/dataquest.
Chwalisz, K.D., Altmaier, E.M., & Russell, D.W. (1992). Causal attributions,
self-efficacy cognitions, and coping with stress. Journal of Social and
Clinical Psychology, 11, 377-400.
Clark, R.E., & Estes, F. (2002). Turning research into results: A guide to
selecting the right performance solutions. Atlanta, GA: CEP Press.
Collins, J.L. (1982, March). Self-efficacy and ability in achievement behavior.
Paper presented at the annual meeting of the American Educational
Research Association, New York.
Eccles, J. (1983). Expectancies, values, and academic behaviors. In J. T.
Spence (Ed.), Achievement and achievement motives (pp. 75-146). San
Francisco: Freeman.
Eccles, J., Wigfield, A., Flanagan, C., Miller, C., Reuman, D., & Yee, D.
(1989). Self-concepts, domain values, and self-esteem: Relations and
changes at early adolescence. Journal of Personality, 57, 283-310.
Education Data Partnership (2005). District reports. Retrieved January 8, 2007,
from http://www.ed-data.k12.ca.us/navigation.
Elk Grove Unified School District. (2007a). Anna Kirchgater School
Accountability Report Card. Retrieved January 8, 2007, from
http://www.egusd.net/schools/scores/short/Kirchgater.pdf.
Elk Grove Unified School District. (2007b). Demographics. Retrieved January
8, 2007, from www.egusd.net/discover_egusd/demographics.cfm.
Elmore, R. (2000). Building a new structure for school leadership.
Washington, D.C.: Albert Shanker Institute.
Elmore, R. (2002). Bridging the gap between standards and achievement: The
imperative for professional development in education. Washington,
D.C.: Albert Shanker Institute.
Elmore, R., & City, E.A. (2007). The road to school improvement. In N. Walser
& C. Chauncey (Eds.), Spotlight on leadership and school change (pp.
167-175). Cambridge, MA: Harvard Education Press.
Goddard, R., Hoy, W.L., & Woolfolk Hoy, A. (2000). Collective teacher
efficacy: Its meaning, measure, and impact on student achievement.
American Educational Research Journal, 37(2), 479-507. Retrieved
November 13, 2007, from http://www.jstor.org/search.
Herzberg, F. (2003, January). One more time: How do you motivate
employees? HBR Classic. Retrieved August 7, 2007, from
http://harvardbusinessonline.hbsp.harvard.edu/b01/en.
Hoy, W.K., & Woolfolk, A.E. (1993, March). Teachers’ sense of efficacy and
the organizational health of schools. The Elementary School Journal,
93(4), 355-372.
Jamentz, K. (2002). Isolation is the enemy of improvement: Instructional
leadership to support standards-based practice. San Francisco:
WestEd.
Lee, V., Dedick, R., & Smith, J. (1991). The effect of the social organization of
school on teachers’ efficacy and satisfaction. Sociology of Education,
64, 190-208.
Leithwood, K. (2004). Educational leadership, a review of the research.
Prepared for the Laboratory for Student Success at Temple University
Center for Research in Human Development and Education. Retrieved
November 23, 2005, from http://www.temple.edu/lss/pdf/Leithwood/pdf.
Linn, R.L. (2006). Educational Accountability Systems: CSE Technical Report
687. Los Angeles: UCLA Graduate School of Education and
Information Studies, Center for the Study of Evaluation, National Center
for Research on Evaluation, Standards, and Student Testing. Retrieved
September 29, 2007, from http://www.cse.ucla.edu.
Mac Iver, M.A., & Farley, E. (2003). Bringing the district back in: The role of
central office in improving instruction and student achievement.
Baltimore, MD: Center for Research on the Education of Students
Placed at Risk, Johns Hopkins University. Retrieved October 20, 2006,
from http://www.csos.jhu.edu/crespar/tech/Reports/Report65.pdf.
Marzano, R.J., Pickering, D.J., & Pollock, J.E. (2001). Classroom instruction
that works: Research-based strategies for increasing student
achievement. Alexandria, VA: Association for Supervision and
Curriculum Development.
Marzano, R.J., Waters, T., & McNulty, B.A. (2005). School leadership that
works: From research to results. Alexandria, VA: Association for
Supervision and Curriculum Development.
McLaughlin, M.W., & Talbert, J.E. (1993). Contexts that matter for teaching
and learning. (ERIC Document Reproduction Service No. ED357023.)
Stanford, CA: Stanford University, School of Education, Center for
Research on the Context of Teaching. Retrieved September 30, 2007,
from http://www.ERIC.ed.gov:80/ERICWebPortal.
Mendoza, J. (2002, January 23). Letter from the Office of the California State
Superintendent of Public Schools. Retrieved February 20, 2007, from
http://www.cde.ca.gov/ta/ac/ti/piltrjan02.asp.
Midgley, C., Feldlaufer, H., & Eccles, J.S. (1989). Change in teacher efficacy
and student self- and task-related beliefs in mathematics during the
transition to junior high school. Journal of Educational Psychology, 81,
247-258.
Mintrop, H., & Trujillo, T. (2007). The practical relevance of accountability
systems for school improvement: A descriptive analysis of California
schools: CSE Report 713. Los Angeles: UCLA Graduate School of
Education and Information Studies, National Center for Research on
Evaluation, Standards, and Student Testing. Retrieved September 29,
2007, from http://www.cse.ucla.edu.
O’Day, J.A. (2002). Complexity, accountability, and school improvement.
Harvard Educational Review, 72(3), 293-329.
O’Day, J., Bitter, C., Kirst, M., Carnoy, M., Woody, E., Buttles, M., Fuller, B.,
and Ruenzel, D. (2004). Assessing California’s accountability system:
Successes, challenges, and opportunities for improvement. PACE Policy
Brief, 4(2). Retrieved September 16, 2007, from
http://www-gse.berkeley.edu/research/PACE/reports/PB.04.2.pdf.
Paige, R. (2002, July). Key policy letter signed by the Education Secretary.
Retrieved February 20, 2007, from
http://www.ed.gov/policy/elsec/guid/secletter/020724.html.
Pajares, F. (1996). Self-efficacy beliefs in academic settings. Review of
Educational Research, 66(4), 543-578. Retrieved November 3, 2007,
from http://links.jstor.org.
Patton, M.Q. (1987). How to use qualitative methods in evaluation. Los
Angeles: Center for the Study of Evaluation, University of California.
Pintrich, P.R. (2003). A motivational science perspective on the role of student
motivation in learning and teaching contexts. Journal of Educational
Psychology, 95(4), 667-686. Retrieved May 29, 2007, from
http://content.apa.org/journals/edu.
Pintrich, P., & Schunk, D. (1996). Motivation in education: Theory, research,
and applications. Englewood Cliffs, NJ: Prentice Hall.
Pintrich, P.R., & Schunk, D.H. (2002). Motivation in education: Theory,
research, and applications (2nd ed.). Upper Saddle River, NJ:
Pearson Education.
Reeves, D.B. (2004a). High performance in high poverty schools: 90/90/90
and beyond. Retrieved February 8, 2007, from
http://www.sabine.k12.la.us/online/leadershipacademy.
Reeves, D.B. (2004b). Accountability in action: A blueprint for learning
organizations. Denver, CO: Advanced Learning Press & Center for
Performance Assessment.
Reeves, D.B. (2006). The learning leader: How to focus school improvement
for better results. Alexandria, VA: Association for Supervision and
Curriculum Development.
Schmoker, M. (2001). The results fieldbook. Alexandria, VA: Association for
Supervision and Curriculum Development.
Simpson, J. O. (2003, January). Beating the odds. American School Boards
Journal, 190, 43-47.
Spillane, J. (2002). Local theories of teacher change: The pedagogy of district
policies and programs. Teachers College Record 104(3), 377-420.
Retrieved September 20, 2007, from http://www.tcrecord.org, ID number:
10849.
Stein, M.K., Hubbard, L., & Mehan, H. (2004). Reform ideas that travel far
afield: The two cultures of reform in New York City’s District #2 and
San Diego. Journal of Educational Change, 5, 161-197.
Togneri, W., & Anderson, S.E. (2003). Beyond islands of excellence: What
districts can do to improve instruction and achievement in all schools--A
leadership brief. Washington, D.C.: The Learning First Alliance and the
Association for Supervision and Curriculum Development. Retrieved
September 9, 2007, from
http://www.learningfirst.org/publications/districts.
Tschannen-Moran, M., Woolfolk Hoy, A., & Hoy, W.K. (1998). Teacher
efficacy: Its meaning and measure. Review of Educational Research,
68(2), 202-248. Retrieved November 3, 2007, from http://www.jstor.org.
United States Department of Education. (2002). No Child Left Behind Act of
2001. Retrieved February 3, 2007, from
http://www.nclb.gov/next/overview/index.html.
Wigfield, A. (1994). Expectancy-value theory of achievement and motivation:
A developmental perspective. Educational Psychology Review, 6, 49-78.
Wigfield, A., & Eccles, J. (1992). The development of achievement task values:
A theoretical analysis. Developmental Review, 12, 265-310.
Woolfolk, A.E., & Hoy, W.K. (1990). Prospective teachers’ sense of efficacy
and belief about control. Journal of Educational Psychology, 82, 81-91.
APPENDIX A
SITE SUPPORT TEAM
TEACHER QUESTIONNAIRE
Please respond to each question by circling 1 – 4, with 1 being a “low” and 4
being “high.”

1. To what degree are you aware of the existence of the Site Support Team
   at your school?
   Low 1   2   3   4 High

2. To what degree has the Site Support Team had an influence on your
   teaching or professional practice?
   1   2   3   4

3. How often have you received information about the work of the Site
   Support Team?
   Never 1   Rarely 2   Occasionally 3   Often 4

4. How often have you received feedback from the Site Support Team about
   your teaching or professional practice?
   Never 1   Rarely 2   Occasionally 3   Often 4

5. To what extent have you seen changes at your school that you think may
   be attributed to the work of the Site Support Team?
   No change 1   Little change 2   Some change 3   Significant change 4

Please provide any thoughts or comments that you have about the Site
Support Team at your school.

Thank you for sharing your time and perspective!
APPENDIX B
SIMILAR SCHOOLS’ DATA IN YEAR
PRIOR TO ONSET OF INTERVENTION
Table 31
Similar Schools’ Data in Year Prior to Onset of Intervention

School           District         County          API   # Tested  % Af-Am  % Asian  % Hisp.  % White  % EL  % FRLP*
2002
Van Buren High   El Presidente    Sacramento      579   1610      31       26       20       10       38    59
John G. Downey   E. Side          Santa Clara     579   1331      5        29       53       6        33    49
2003
Fillmore HS      El Presidente    Sacramento      644   1654      22       34       18       16       35    57
Ronald Reagan    Moreno Vl        Riverside       639   1869      21       4        38       34       19    36
Johnson MS       El Presidente    Sacramento      597   1372      37       18       27       9        32    66
Budd MS          Fontana          San Bernardino  599   1190      13       0        73       11       29    68
Roosevelt MS     El Presidente    Sacramento      649   1334      23       29       24       15       31    54
John McDougall   Oakland          Alameda         647   623       39       26       21       9        23    58
2004
Lincoln HS       El Presidente    Sacramento      713   1560      26       19       16       24       16    31
Latham City      Latham City      L.A. County     714   1568      23       10       38       24       13    25
2005
Ford ES          El Presidente    Sacramento      690   566       18       23       31       23       35    98
Haight ES        Stockton         San Joaquin     697   558       9        8        67       5        41    82
Madison ES       El Presidente    Sacramento      658   704       25       12       52       7        47    100
Pete Wilson      Sacramento City  Sacramento      655   316       24       25       39       5        53    100

* FRLP = Percentages of students who qualify for the Free and Reduced-Price Lunch Program
APPENDIX C
STUDENT CST PERFORMANCE DATA, PER SCHOOL
Table 32
Number of Students Per Performance Band, Van Buren High School

English/Language Arts

                   9th grade   10th grade   11th grade   Total
2002               (n = 573)   (n = 564)    (n = 473)    (n = 1610)
Advanced               11          28           23           62
Proficient             86          85           66          237
Basic                 172         169          167          508
Below Basic           155         158          104          417
Far Below Basic       149         124          113          386

2007               (n = 464)   (n = 425)    (n = 342)    (n = 1231)
Advanced               74          38           48          160
Proficient            107          72           72          251
Basic                 139         141          108          388
Below Basic            93         102           55          250
Far Below Basic        51          72           59          182
Table 33
Number of Students Per Performance Band, John G. Downey, Control for Van
Buren High School

English/Language Arts

                   9th grade   10th grade   11th grade   Total
2002               (n = 481)   (n = 465)    (n = 385)    (n = 1331)
Advanced               24          28           19           71
Proficient             96          74           50          220
Basic                 168         149          131          448
Below Basic           101         121           93          315
Far Below Basic        92          93           92          277

2007               (n = 583)   (n = 483)    (n = 458)    (n = 1524)
Advanced              123          53           64          240
Proficient            157          97           78          332
Basic                 163         145          133          441
Below Basic            93         116           82          291
Far Below Basic        47          72          101          220
Table 34

Number of Students Per Performance Band, Fillmore High School
English/Language Arts

2003             9th grade  10th grade  11th grade  Total
                 n = 594    n = 578     n = 482     n = 1654
Advanced         41         28          29          98
Proficient       160        127         125         412
Basic            226        233         169         628
Below Basic      119        121         96          336
Far Below Basic  48         69          63          180

2007             9th grade  10th grade  11th grade  Total
                 n = 456    n = 507     n = 414     n = 1387
Advanced         84         51          33          178
Proficient       124        91          83          298
Basic            142        167         120         429
Below Basic      78         127         83          288
Far Below Basic  28         71          95          194

n = number of students
Table 35

Number of Students Per Performance Band, Ronald Reagan HS, Control for Fillmore HS
English/Language Arts

2003             9th grade  10th grade  11th grade  Total
                 n = 633    n = 584     n = 652     n = 1869
Advanced         82         52          65          199
Proficient       158        152         157         467
Basic            203        199         240         642
Below Basic      114        99          105         318
Far Below Basic  76         82          85          243

2007             9th grade  10th grade  11th grade  Total
                 n = 735    n = 691     n = 547     n = 1973
Advanced         111        69          50          230
Proficient       184        164         110         458
Basic            256        211         156         623
Below Basic      140        144         110         394
Far Below Basic  44         103         121         268

n = number of students
Table 36

Number of Students Per Performance Band, Johnson Middle School
English/Language Arts

2003             7th grade  8th grade  Total
                 n = 683    n = 689    n = 1372
Advanced         20         14         34
Proficient       130        90         220
Basic            267        254        521
Below Basic      164        207        371
Far Below Basic  102        124        226

2007             7th grade  8th grade  Total
                 n = 545    n = 552    n = 1097
Advanced         44         32         76
Proficient       125        88         213
Basic            152        180        332
Below Basic      142        132        274
Far Below Basic  82         120        202
Table 37

Number of Students Per Performance Band, Budd, Control for Johnson MS
English/Language Arts

2003             7th grade  8th grade  Total
                 n = 609    n = 581    n = 1190
Advanced         6          6          12
Proficient       90         64         154
Basic            234        215        449
Below Basic      145        168        313
Far Below Basic  134        128        262

2007             7th grade  8th grade  Total
                 n = 488    n = 432    n = 920
Advanced         14         12         26
Proficient       96         69         165
Basic            175        170        345
Below Basic      140        112        252
Far Below Basic  63         69         132
Table 38

Number of Students Per Performance Band, Roosevelt Middle School
English/Language Arts

2003             7th grade  8th grade  Total
                 n = 658    n = 676    n = 1334
Advanced         27         34         61
Proficient       172        135        307
Basic            221        250        471
Below Basic      165        156        321
Far Below Basic  73         101        174

2007             7th grade  8th grade  Total
                 n = 535    n = 576    n = 1111
Advanced         53         46         99
Proficient       150        104        254
Basic            156        225        381
Below Basic      123        132        255
Far Below Basic  53         69         122
Table 39

Number of Students Per Performance Band, John McDougall MS, Control for Roosevelt
English/Language Arts

2003             7th grade  8th grade  Total
                 n = 357    n = 266    n = 623
Advanced         25         13         38
Proficient       72         66         138
Basic            127        103        230
Below Basic      86         39         125
Far Below Basic  47         45         92

2007             7th grade  8th grade  Total
                 n = 274    n = 267    n = 541
Advanced         30         32         62
Proficient       57         64         121
Basic            97         80         177
Below Basic      60         54         114
Far Below Basic  30         37         67
Table 40

Number of Students Per Performance Band, Lincoln High School
English/Language Arts

2004             9th grade  10th grade  11th grade  Total
                 n = 554    n = 548     n = 458     n = 1560
Advanced         77         82          46          205
Proficient       161        148         110         419
Basic            161        165         137         463
Below Basic      89         82          69          240
Far Below Basic  66         71          96          233

2007             9th grade  10th grade  11th grade  Total
                 n = 609    n = 594     n = 504     n = 1707
Advanced         134        89          91          314
Proficient       175        125         111         411
Basic            156        154         121         431
Below Basic      97         125         75          297
Far Below Basic  47         101         106         254
Table 41

Number of Students Per Performance Band, Latham City, Control for Lincoln
English/Language Arts

2004             9th grade  10th grade  11th grade  Total
                 n = 632    n = 518     n = 418     n = 1568
Advanced         100        92          58          250
Proficient       226        160         113         499
Basic            176        160         142         478
Below Basic      81         76          63          220
Far Below Basic  49         30          42          121

2007             9th grade  10th grade  11th grade  Total
                 n = 566    n = 569     n = 446     n = 1581
Advanced         159        125         107         391
Proficient       153        131         94          378
Basic            140        182         134         456
Below Basic      74         80          58          212
Far Below Basic  40         51          53          144
Table 42

Number of Students Per Performance Band, Ford Elementary School
English/Language Arts

2005             Grade 2  Grade 3  Grade 4  Grade 5  Grade 6  Total
                 n = 102  n = 111  n = 118  n = 104  n = 131  n = 566
Advanced         3        0        12       5        5        25
Proficient       24       11       27       20       24       106
Basic            31       41       37       38       52       199
Below Basic      28       38       31       22       30       149
Far Below Basic  16       21       11       19       20       87

2007             Grade 2  Grade 3  Grade 4  Grade 5  Grade 6  Total
                 n = 91   n = 78   n = 73   n = 92   n = 95   n = 429
Advanced         15       9        10       4        4        42
Proficient       22       24       14       19       14       93
Basic            30       26       31       41       40       168
Below Basic      19       16       13       21       23       92
Far Below Basic  5        3        5        7        14       34
Table 43

Number of Students Per Performance Band, Haight ES, Control for Ford ES
English/Language Arts

2005             Grade 2  Grade 3  Grade 4  Grade 5  Grade 6  Total
                 n = 98   n = 93   n = 105  n = 141  n = 121  n = 558
Advanced         4        1        9        10       17       41
Proficient       16       9        20       35       18       98
Basic            33       27       35       56       39       190
Below Basic      22       32       29       23       34       140
Far Below Basic  23       24       12       17       13       89

2007             Grade 2  Grade 3  Grade 4  Grade 5  Grade 6  Total
                 n = 67   n = 54   n = 89   n = 85   n = 95   n = 390
Advanced         2        7        15       12       10       46
Proficient       19       13       24       22       20       98
Basic            21       16       24       30       28       119
Below Basic      12       11       12       14       29       78
Far Below Basic  13       7        14       7        8        49
Table 44

Number of Students Per Performance Band, Madison Elementary
English/Language Arts

2005             Grade 2  Grade 3  Grade 4  Grade 5  Grade 6  Total
                 n = 160  n = 139  n = 142  n = 144  n = 119  n = 704
Advanced         5        1        8        7        6        27
Proficient       33       22       26       24       18       123
Basic            46       51       48       51       55       251
Below Basic      43       43       37       29       27       179
Far Below Basic  33       22       23       33       13       124

2007             Grade 2  Grade 3  Grade 4  Grade 5  Grade 6  Total
                 n = 160  n = 122  n = 143  n = 129  n = 144  n = 698
Advanced         9        2        13       3        7        34
Proficient       37       29       41       26       32       165
Basic            42       41       53       48       49       233
Below Basic      42       26       18       28       37       151
Far Below Basic  30       24       18       24       19       115
Table 45

Number of Students Per Performance Band, Pete Wilson ES, Control for Madison
English/Language Arts

2005             Grade 2  Grade 3  Grade 4  Grade 5  Grade 6  Total
                 n = 60   n = 60   n = 59   n = 68   n = 69   n = 316
Advanced         10       0        3        1        1        15
Proficient       21       6        15       12       9        63
Basic            16       15       19       31       22       103
Below Basic      10       21       13       11       23       78
Far Below Basic  3        18       9        13       14       57

2007             Grade 2  Grade 3  Grade 4  Grade 5  Grade 6  Total
                 n = 58   n = 37   n = 67   n = 64   n = 54   n = 280
Advanced         4        1        6        1        2        14
Proficient       10       10       11       9        13       53
Basic            11       9        31       20       15       86
Below Basic      18       14       7        19       13       71
Far Below Basic  15       3        12       15       11       56
APPENDIX D

PRE/POST-DATA OF CONTROL SCHOOLS

Table 46

John G. Downey High School, Control School for Van Buren High School

Dependent      Pre-   Post-  Pre-    Post-   Change  SD       Percent  Cohen's
Variable       n      n      M       M       in M    Pre      Change   d
All Perf Band  1331   1524   1.6191  2.0531  .434    1.1418   27%      .38
Basic + above  1331   1524   .5552   .6647   .1095   .4971    20%      .22
Prof + Adv.    1331   1524   .2186   .3753   .1567   .4135    72%      .38
Appendix D: Continued

Table 47

Ronald Reagan High School, Control School for Fillmore High School

Dependent      Pre-   Post-  Pre-    Post-   Change  SD       Percent  Cohen's
Variable       n      n      M       M       in M    Pre      Change   d
All Perf Band  1869   1973   2.0326  1.9939  -.0387  1.1686   -2%      -.03
Basic + above  1869   1973   .6998   .6645   -.0353  .45845   -5%      -.08
Prof + Adv.    1869   1973   .3563   .3487   -.0076  .47905   -2%      -.02
Table 48

Budd, Control School for Johnson Middle School

Dependent      Pre-   Post-  Pre-    Post-   Change  SD       Percent  Cohen's
Variable       n      n      M       M       in M    Pre      Change   d
All Perf Band  1190   920    1.446   1.675   .229    1.0038   16%      .23
Basic + above  1190   920    .5168   .5826   .0658   .4999    13%      .13
Prof + Adv.    1190   920    .1395   .2076   .0681   .3466    49%      .20
Table 49

John McDougall Middle School, Control School for Roosevelt Middle School

Dependent      Pre-  Post-  Pre-    Post-   Change  SD       Percent  Cohen's
Variable       n     n      M       M       in M    Pre      Change   d
All Perf Band  623   541    1.8475  1.9945  .14694  1.11155  8%       .13
Basic + above  623   541    .6517   .6654   .01375  .47682   2%       .03
Prof + Adv.    623   541    .2825   .3383   .05576  .45058   20%      .12
Table 50

Latham City Senior High School, Control School for Lincoln High

Dependent      Pre-   Post-  Pre-    Post-   Change  SD       Percent  Cohen's
Variable       n      n      M       M       in M    Pre      Change   d
All Perf Band  1568   1581   2.3425  2.4175  .0750   1.13512  3%       .07
Basic + above  1568   1581   .7825   .7748   -.0077  .41266   -1%      -.02
Prof + Adv.    1568   1581   .4777   .4864   .0087   .49966   2%       .02
Table 51

Haight Elementary School, Control School for Ford Elementary School

Dependent      Pre-  Post-  Pre-    Post-   Change  SD       Percent  Cohen's
Variable       n     n      M       M       in M    Pre      Change   d
All Perf Band  558   390    1.7527  2.0359  .28321  1.14     16%      .25
Basic + above  558   390    .5896   .6744   .08475  .49235   14%      .17
Prof + Adv.    558   390    .2491   .3692   .12013  .43288   48%      .28
Table 52

Data for Pete Wilson Elementary School, Control School for Madison Elementary

Dependent      Pre-  Post-  Pre-    Post-   Change  SD       Percent  Cohen's
Variable       n     n      M       M       in M    Pre      Change   d
All Perf Band  316   280    1.6867  1.6357  -.051   1.12403  -3%      -.05
Basic + above  316   280    .5728   .5464   -.0264  .49546   -5%      -.05
Prof + Adv.    316   280    .2468   .2393   -.0075  .43185   -3%      -.02
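The derived columns in Tables 46 through 52 are consistent with "Percent Change" computed as the change in mean divided by the pre-intervention mean, and Cohen's d computed as the change in mean divided by the pre-intervention standard deviation (the "SD Pre" column). A minimal sketch follows; the function and variable names are illustrative and not taken from the study itself.

```python
# Sketch of how the "Percent Change" and "Cohen's d" columns in
# Tables 46-52 can be reproduced from the reported means and SD Pre.

def effect_size(pre_mean, post_mean, pre_sd):
    """Return (change in mean, percent change, Cohen's d).

    Cohen's d here standardizes the change by the pre-intervention
    standard deviation, which matches the reported values.
    """
    change = post_mean - pre_mean
    percent_change = change / pre_mean   # e.g., .434 / 1.6191 -> 27%
    d = change / pre_sd                  # e.g., .434 / 1.1418 -> .38
    return round(change, 4), round(percent_change, 2), round(d, 2)

# Table 46, "All Perf Band" row (John G. Downey):
print(effect_size(1.6191, 2.0531, 1.1418))  # (0.434, 0.27, 0.38)
```

Applied to the other control-school rows, the same formula reproduces the reported effect sizes (for example, Budd's "All Perf Band" row in Table 48 yields d = .229 / 1.0038 = .23).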
APPENDIX E

API CHANGES FOR EXPERIMENTAL SCHOOLS:
BACKGROUND FOR INTERVIEWS

Table 53

API Changes for Experimental Schools: Background for Interviews

School       2002  2003  2004  2005  2006  2007
Van Buren    579   616   629   657   670   693
Fillmore HS        644   656   675   672   689
Johnson            597   601   638   648   637
Roosevelt          649   658   699   685   695
Lincoln                  693   721   706   711
Ford E.S.                      690   724   745
Madison                        658   700   693
APPENDIX F

LINCOLN HIGH SCHOOL PERCENTAGES OF
SUB-GROUPS, 2004-2007

Table 54

Lincoln High School Percentages of Sub-Groups, 2004-2007

                               Year CST Was Given
                               2004   2005     2006   2007
School-wide API Growth Scores  693    721(a)   706    711(a)
African-American               26     26       25     25
Asian                          19     20       19     20
Hispanic                       17     17       19     19
Socio-Economic (FRLP)          31     25       37     39
English Learners (EL)          16     17       15     14
Mobility(b)                    90     87       89     89

(a) Year of API growth.
(b) Mobility, as defined by CDE, is the percent of students present at the school on CBEDS day in the fall and still present at time of CST testing.
APPENDIX G

COMMON CHARACTERISTICS THAT MAKE SITE SUPPORT TEAMS EFFECTIVE
AS MENTIONED IN INTERVIEWS

Table 55

Common Characteristics that Make Site Support Teams Effective as Mentioned in Interviews

Interviewees: Associate Superintendent A, Associate Superintendent B, Director A, Director B, Principal A, Principal B, Principal C, District Office Site Support Team member (5 years), Coach (3 years with Site Support), and Coach (new to Site Support).

Characteristic of Effective Site Support Team                    Interviews Mentioning
Teacher involvement                                              5
Classroom visitations                                            6
District members bring broader perspective, "fresh set of eyes"  4
Helps the school to set and maintain a focus                     7
Provide staff with "doable" next steps                           1
Establish clear expectations                                     1
Remove barriers and/or provide resources to the school           3
School site follow-up and/or accountability are key              6
Build relationships between site team and district office        4

Note. Table reflects characteristics mentioned in interviews. The fact that an item was not mentioned in one interview does not necessarily mean that the person does not believe that the factor is a characteristic of effective Site Support Teams.
APPENDIX H
VAN BUREN HIGH SCHOOL SUPPORT TEAM MEETING MINUTES
October 9, 2002
Meeting Convened: 7:30 A.M.
Members Present
Dr. Oliver DeGroot, Marcus Herrera, Jill Wilson, Sue Simpson, Jo
Elbert, Kim Hiayakamoto, Carly Cook, Billie Bacon-Bacerra, Shawn Pillard,
Rosena Morzenov, Ron Still, Jackie Lull, Rudolph Svensen and George
Pandetti.
Members Absent: Serena Star and Rosa Camarera.
Agenda Items
Introductions: Team members were introduced.
Reprioritize Items: The agenda was reviewed to assess any need to
reprioritize—none required.
Defining Roles: The Team next began to discuss the roles of coaches and team members. The point was made that staff had been open and receptive to the idea of support from coaches as discussions continued in one-on-one meetings with department chairs and in focus groups. It was further suggested that we promote our efforts as part of a new pilot program not yet available to other secondary schools.
The coaches present stated they were in favor of visiting classrooms
without the accompaniment of administrators so as not to give the appearance of
their observations being evaluative or supervisory in nature. Coaches were also
very interested in receiving input from other team members. Part of their
function will be to look at professional development needs of staff and to
develop a calendar. The point was made that we must ensure the teachers are
part of the decision-making process. Some work on assessing needs has been
accomplished in one of the WASC sub-committees. Language coaches will be
providing teachers with surveys. This will provide the data to tailor in-service
training.
The Professional Learning day hosted for teachers, while setting a strong
foundation for the direction we are moving, did not specifically provide the
latest in research, strategies, and best practices. These will be identified.
The point was made that student work should be the focus of much of
our work. This will aid in teachers being more comfortable with the process.
Yet, the California Standards for the Teaching Profession must remain a
priority—they are not personal or emotional. Content is another priority.
Suggestions were made to review the WASC Final Report and Action
Plan. Furthermore, it was suggested that the Action Plan incorporate some of
the goals, objectives, and actions of the Support Team.
The discussion proceeded to what we should be looking for in the classroom and how, as a team, to support the school in our designated instructional focus (reading and vocabulary), the development of instructional calendars, and the identification of effective strategies and disconnects. Foremost, we should be seeing reading, reading, and more reading, as reading is key to improving student achievement. We must also consider what support is needed to share data and to ensure conversations take place about the data. Rosena Morzenov, from Research and Evaluation, is here to support us, and we have the new Web-based SIS in the works.
It was suggested that we should emulate whatever Advancement Via Individual Determination (AVID) has done successfully. We want to see an increase in academic achievement: "We are excited about the journey and the learning!"
Next, the focus of the 2nd and 4th Late Start Wednesday meetings, interim assessment, was discussed. Mr. Still volunteered to provide a calendar of each Wednesday's focus.
It was determined that Dr. DeGroot is the District Point of Contact on all
matters pertaining to the Support Team, while Ms. Wilson will serve as the Site
Point of Contact.
Coaching questions will be coordinated with Jo.
This group will meet on a bi-monthly basis.
Sharing Impressions of Initial Classroom Observations: It was noted that
instruction throughout much of the math department was traditional in nature,
not lending itself to student engagement. Moreover, in some cases the rigor was
not at the level expected. It was however noted that many of the teachers in this
department were new and in a probationary status. This constitutes both a
challenge and an opportunity.
It was also noted that there does not appear to be sufficient use of cooperative-group and pairs learning in some classrooms.
Data will help to identify student weaknesses in basic computational
skills and other areas, which, in turn, will help drive the teacher’s instructional
focus.
The vocabulary strategies previously discussed and agreed upon are not
being utilized by all teachers. Departmentally, content vocabulary words have
been identified.
Evidence supporting the word for the day is not consistently available on classroom boards or in the work being performed throughout the class period. The question came up regarding the number of minutes per class. The school is on a traditional schedule, with 55-minute classes (except for Late Start Wednesday, Rally, and Minimum Day schedules).
The question came up, “when should coaches share strategies with
teachers?” The consensus was that it should occur at smaller meetings, i.e.,
department or focus groups as opposed to large faculty meetings.
Identify Next Steps: The point was made that the Team needs to limit the
number of things that we attempt to do as a team this year. We must be careful
not to pile different layers of work on teachers as we visit classrooms in our
different teams for observations.
We must identify the strategies and best practices we want teachers to
use. Classroom Instruction That Works by Marzano, Pickering, & Pollock
(2001) will serve as one source.
The point was made that a consistent statement has been made by the last three Western Association of Schools and Colleges (WASC) Visiting Teams, that is: "These kids are a lot smarter than most think!" The point being that we must reverse the belief, on the part of teachers and students alike, that our kids cannot do better. One suggestion made as a response to kids who do not believe they can is to "pretend you can . . . what would you do then!"
The point was also made that we must hold the line on students and
parents seeking release from classes with rigorous instruction.
What we most need to do is focus on systemic change, change across the
school, and change that will endure. Systemic change we can agree to. We can
choose only a couple of things, so that the change can be clearly measured. We
must marshal our resources. We must use best practices that will assist the most
students. We must purchase materials as able.
Per our on-site coach, there are three things that are absolutely needed.
They are:
Observations—with data sharing.
Professional learning/staff development.
Team Teaching /coaching demonstrations.
Teaching can be an isolating experience, but it does not have to be!
Allow others to see masterful teaching. We need to get teachers comfortable
and accepting of watching peers.
There needs to be coherence between and among workshops.
Ensure teachers are aware of California Standards for the Teaching
Profession (CSTP). Standards four and five are key.
Planning and assessment: Is the intended objective being taught and assessed? Is feedback evaluated? Is re-teaching required? Are students being engaged in the monitoring of learning and assessment? Students must buy into what we are doing.
Next Meeting Dates: Oct 31, 2002 @ 7:30 A.M.
Nov 13, 2002 @ 7:30 A.M.
Nov 21, 2002 @ 7:30 A.M.
Dec 11, 2002 @ 7:30 A.M.
Dec 19, 2002 @ 7:30 A.M.
Action Items:
Develop Template for Calendar of Support Action Plan—Wilson
Develop Template for Calendar of Professional Development—Wilson
Provide WASC Visit Summary and Action Plan—Svensen
Provide Research-Based Strategies—Coaches
Provide Calendar of Late Start Wednesday's Instructional Focus—Still
Continue Reading/Vocabulary Strategies—All
Set-up Support Team e-mail/address—Wilson
Provide Meeting Minutes—Pandetti
Meeting Adjourned: 9:05 A.M.
Abstract
The purpose of this study was to evaluate school-level interventions called Site Support Teams in terms of their correlation to student achievement. Site Support Teams were in use in a school district in northern California; comprised of district-level and school-level leaders, the teams worked together to improve student achievement. Using a quasi-experimental design, two elementary schools, two middle schools, and three high schools served as experimental schools whose improvement in achievement was measured using the English/Language Arts portion of the California Standards Test for grades 2-11. Improvement rates at each school were compared to improvements at control schools. The duration of the intervention at experimental schools ranged from 2 to 5 years. Qualitative data taken from surveys of teachers at two schools and interviews of 10 Site Support Team members were used to help interpret the quantitative portion of this study. Site Support Teams were found to be a successful tool in assisting schools to improve student achievement when several features were present: sustained focus, teacher involvement, classroom visitations, and mutual accountability.
Asset Metadata

Creator: Zeman, Anne Rita (author)
Core Title: The effect of site support teams on student achievement in seven northern California schools
School: Rossier School of Education
Degree: Doctor of Education
Degree Program: Education (Leadership)
Publication Date: 04/28/2008
Defense Date: 03/20/2008
Publisher: University of Southern California (original), University of Southern California. Libraries (digital)
Tags: accountability, focus on achievement, school interventions, school support, teacher involvement, teacher self-efficacy
Place Name: California (states)
Language: English
Advisor: Hocevar, Dennis (committee chair), Goodyear, Rodney K. (committee member), Lavadenz, Magaly (committee member)
Creator Email: zeman@usc.edu
Permanent Link (DOI): https://doi.org/10.25549/usctheses-m1208
Document Type: Dissertation
Rights: Zeman, Anne Rita
Source: University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection)
Repository Name: Libraries, University of Southern California
Repository Location: Los Angeles, California