HOW EFFECTIVE SCHOOLS USE DATA TO
IMPROVE STUDENT ACHIEVEMENT
By
William Todd Duncan
A Dissertation Presented to the
FACULTY OF THE ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
December 2002
Copyright 2002 William Todd Duncan
UMI Number: 3093755
Copyright 2002 by
Duncan, William Todd
All rights reserved.
UMI Microform 3093755
Copyright 2003 by ProQuest Information and Learning Company.
All rights reserved. This microform edition is protected against
unauthorized copying under Title 17, United States Code.
ProQuest Information and Learning Company
300 North Zeeb Road
P.O. Box 1346
Ann Arbor, MI 48106-1346
UNIVERSITY OF SOUTHERN CALIFORNIA
School of Education
Los Angeles, California 90089-0031
This dissertation, written by
William Todd Duncan
under the direction of his Dissertation Committee, and
approved by all members of the Committee, has been
presented to and accepted by the Faculty of the School
of Education in partial fulfillment of the requirements for
the degree of
Doctor of Education
Dissertation Committee
ACKNOWLEDGEMENTS
I would like to thank my wonderful wife Lela and my children
William, Stephen, and Geneva for their support as I undertook this
daunting task. Lela was always there to support, encourage, and motivate
me to ensure that the work necessary to complete this study was
accomplished. My children were extremely understanding while dad was
away gathering data or busy writing. They were willing to share their time
so that I could accomplish my task.
I am also very thankful to the chair of my committee, Dr. David
Marsh, for providing the guidance and structure necessary for making this
project successful. The deadlines, the assistance in developing
instruments, and the guidance in developing the frameworks were of
immeasurable value to me and to my teammates. The opportunity to
complete this dissertation as a member of a parallel study team has made
the process so much more manageable and has allowed me to complete the
project successfully.
Finally, I would like to thank the USC Rossier School of Education
and Dr. Stuart Gothold in particular, for the opportunity to be a part of the
first cohort of the Urban Superintendency program. This educational
experience will serve my cohort team and me well for years to come, and I
am confident that we will have a huge and powerful impact on the world
of urban education in future years.
TABLE OF CONTENTS
ACKNOWLEDGMENTS ii
LIST OF TABLES vi
LIST OF FIGURES vii
ABSTRACT viii
CHAPTER Page
1 OVERVIEW OF THE STUDY 1
Introduction 1
Statement of the Problem 8
Purpose of the Study 10
The Importance of the Study 11
Organization of the Study 20
2 LITERATURE REVIEW 21
Introduction 21
The Status of Student Performance in the
U.S. and California 22
Conclusion 57
3 RESEARCH METHODOLOGY 58
Introduction 58
Methodology 58
Justification for Qualitative Research Design 59
Sample and Population 61
Instrumentation 64
Data Collection 67
Data Analysis 70
Summary 72
4 ANALYSIS AND INTERPRETATION OF DATA
AND FINDINGS 74
Introduction 74
Discussion 120
Summary 129
5 SUMMARY, DISCUSSION OF FINDINGS,
CONCLUSIONS, AND RECOMMENDATIONS 130
Research Problem 130
Purpose of the Study 131
Methodology 131
Summary of Data 134
Recommendations 136
Suggestions for Additional Research 139
REFERENCES 140
APPENDICES 148
LIST OF TABLES
Table Page
1 API Performances 63
2a The Percentage of Students Performing At or Above
the 50th Percentile in Reading by Grade Level 76
2b The Percentage of Students Performing At or Above
the 50th Percentile in Math by Grade Level on the SAT-9
between 1998 and 2001 76
3a Percentage of Students Scoring in the Below Average
Range in Reading, Mathematics, Language Arts, and Spelling
Between 2000 and 2001 78
3b Percentage of Students Scoring in the Average Range in
Reading, Mathematics, Language Arts, and Spelling 78
3c Percentage of Students Scoring in the Above Average Range
in Reading, Mathematics, Language Arts, and Spelling on the
SAT-9 between 2000 and 2001 78
4 Teacher Questionnaire 97
5 Strongest Stage of Concern for Each Teacher 106
6 Average Scores Across All Stages, Highest Score 0-45, and
Number of Stages Per Teacher with a Score of 3.5 or More 108
LIST OF FIGURES
Figure Page
1 Characteristics of the Sample 63
2 Displays the Relationship Between the Three Research
Questions and Data Collection 67
3 District Support for Standards-based Instruction and
Assessment 114
4 District and School Accountability to Standards-based
Curriculum 114
5 Research Rating Scale of the Success Indicators for
District Design for Data Use 120
ABSTRACT
This dissertation was one of 13 parallel dissertations that looked at the extent
of data usage in districts and schools across the State of California. The dissertations
studied elementary, middle, and high schools in northern, central, and southern
California. This study investigates the use of data in a district and elementary school
setting in meeting the State of California’s expectations of improved student
achievement. The study queries whether or not the district has developed a systemic
plan for data use, and if the plan aims to prepare students and inform instructional
practices in light of current and emerging practices of the state accountability system.
Furthermore, the study seeks to discover how well the district plan has been
implemented at the school and classroom levels, and rates how well the plan has
been constructed.
CHAPTER 1
OVERVIEW OF THE STUDY
Introduction
It has become increasingly clear over the last two decades that schools in
this nation have not been providing high quality instruction to our children. That
being said, a dichotomy exists in our education system today. On the one hand,
schools today educate more students, with more challenges, to a higher level of
learning than at any time in the past 100 years. At the same time, our schools
produce record numbers of students who are ready for neither jobs nor additional
schooling. These students cannot read, write, or compute well (Reeves, 2001).
The record numbers of students who leave our schools without the requisite skills
to be successful in the workplace have had a deleterious effect on the industries that
have to employ these individuals. Economic journalist Robert J. Samuelson has
pointed out that if students leave with poor skills, there are consequences. One is
waste. Giving people a third or fourth chance (whether in college or on the job) is
expensive. Some people learn skills later that they could have learned earlier.
Some skills are never learned (Larson, Guidera, & Smith, 1998).
The society at large began to realize the trend toward large numbers of ill-
prepared students entering the workforce in the early 1980s. This alarming trend
led to the National Commission on Excellence in Education report called A Nation
at Risk. The report told the nation that the educational foundations of our society
are being eroded by a rising tide of mediocrity that threatens our very future as a
Nation. The report also warned that America was losing its competitive edge in the
world economy because of a decline in the quality of education (Congressional
Digest, 1997).
According to an analysis of student performance on the National
Assessment of Educational Progress (NAEP) conducted by Paul Barton (2001),
there were several noteworthy findings. They included the following: States are
generally making more progress in mathematics achievement than in reading, good
readers are getting better at the same time weak readers are losing ground, and
during the 1990s fourth grade students made more improvement in mathematics
achievement than in reading in most states.
The Third International Mathematics and Science Study (TIMSS) report does not
offer much better statistics. The TIMSS report, which compared performance in
mathematics and science of a half-million students world wide at three age ranges
corresponding roughly to grades 4, 8, and 12, indicated that while U.S. students at
the fourth grade level were near the first in the world in science, and were above
the international average in mathematics, by the eighth grade, U.S. performance
had fallen to slightly above the international average in science and to below the
international average in mathematics (Congressional Digest, 1997).
The statistics from the NAEP and the TIMSS studies demonstrate that our
students are not as prepared as they should be as they matriculate through
their school careers. When our students complete their schooling, they must be
prepared to take on the ever-changing demands of the workforce. Laurie
Sachtleben of Chevron, Inc., says that “from the perspective of having enough
skilled employees tomorrow, math and science education is the key to our future”
(Larson, Guidera, Rogstad & Smith, 1998). Individuals need the academic
knowledge and skills that equip them to succeed in today’s ever-changing
economy. It should come as no surprise then, that almost 90% of new jobs require
more than a high school level of literacy and math skills (Larson, Guidera, Rogstad
& Smith, 1998).
Since the release of the Nation at Risk Report in 1983, schools have been
compelled to begin reforms aimed at improving student achievement. However,
systemic reforms have been difficult to achieve. In 1997 the U.S. Department of
Education released a paper entitled “National Standards of Academic Excellence:
President Clinton’s Call to Action for American Education in the 21st Century.” In
the paper the authors point out that student achievement is not improving fast
enough. Across our Nation in our cities, suburbs, and rural communities alike, far
too many students are still not meeting the standards that will prepare them for the
challenges of today and tomorrow (Congressional Digest, 1997). Reforms have
been difficult to achieve because change is a difficult and complex process.
Knowing about the change process is crucial, but there are as many myths as there
are truths associated with change, and it is time to deepen the way we think about
change (Fullan & Miles, 1992).
Systemic reform must impact the entirety of the organization in order for it
to have any substantive effect. Serious education reform will never be achieved
until there is a significant increase in the number of people, leaders and other
participants alike, who have come to internalize and habitually act on basic
knowledge of how successful change takes place (Fullan & Miles, 1992).
For the reasons cited above, school reform is a complex issue. Fullan
(1992) says that reform fails because:
Schools and districts are overloaded with problems and,
ironically, with solutions that don’t work. Thus things get
worse instead of better. Even our rare success stories appear
as isolated pockets of excellence and are as likely to atrophy
as to prosper over time. We get glimpses of the power of
change, but we have little confidence that we know how to
harness forces for continuous improvement. The problem
is not really a lack of innovation, but the enormous overload
of fragmented, uncoordinated, and ephemeral attempts at
change.
In the last few years, researchers have found that the use of data to improve
instructional practice and student achievement can be a powerful innovation
(Schmoker, 1999). The role of data in the California state expectations for schools
lies in the development of accountability systems for both schools and districts. The
accountability system rests on the data from student testing. These data form the
basis for decisions made by the state regarding the effectiveness of our schools and
districts to educate our children. These data are also shared with districts, schools,
and the general public for their own individual analysis.
The accountability system developed in the State of California is the
Academic Performance Index (API). The API is used to rank the state’s schools on
a continuum ranging from underperforming to high achieving. The centerpiece of
the API is the Standardized Testing and Reporting (STAR) system, which at present uses
the Stanford Achievement Test-Ninth Edition (SAT-9) norm referenced test as its
only measure for determining how well students are performing academically and
schools are doing instructionally (EdSource, 2000). The expectation is that
eventually all schools will be considered to be performing at acceptable levels,
meaning that schools score between 800 and 1000 on the API,
which ranges from 200 to 1000.
John Merrow (2001) points out the danger in our politicians and policy
makers using a single source measure for high stakes results. He writes: “Their
goal may be the improved ‘educational health’ of students, but their method, high-
stakes testing, is little more than a sophisticated, technological form of ‘Gotcha!’ ”
The overarching intent of the accountability system is to improve student
achievement, not catch children up in a no-win situation. In order to avoid a
continuous loop of “gotcha,” schools, districts, and states will have to depend on
multiple measures to inform them of student progress and instructional
effectiveness. California is expected to eventually include additional factors in the
API such as student and teacher attendance, graduation rates, and number of high
school students passing the new high school exit exam (EdSource, 2000).
In the final analysis, California expects to have a standards-based education
system. State standards were formally adopted in 1998 (EdSource, 2001) and the
standards were then disseminated to all schools statewide. The key to successfully
transforming California into a standards-based system is to ensure that the
standards are linked to a credible assessment vehicle and that the results of the
assessment, along with other measures, are used to determine how well students are
achieving and how effective the instruction is at a given school. Regardless of how
performance is measured, schools must show academic improvement for the school
as a whole and for significant subgroups within the school (Warren, 1999).
In order for data to have a significant impact on student performance, they
must be used at all levels of the organization. District administrators, school site
administrators, and teachers must have an understanding of how to use data
effectively. Many school districts underutilize one of the most powerful and
common symbol systems available to them—numbers—to monitor, evaluate, and
revise programs and policies (Noyce, Perda & Traver, 2000). In a study evaluating
California’s standards-based accountability system, WestEd (Guth, Holtzman,
Schneider & Carlos, 1999) found that classroom teachers did not use data to a
great extent. The WestEd study went on to point out that “...interestingly, only
28.8% of districts said that schools use assessment data ‘to guide curriculum and
instruction on an ongoing basis’ to a ‘great extent’” (Guth, Holtzman, Schneider &
Carlos, 1999, p. 92). Schools and districts will have to learn how to use data
effectively at all levels in order to inform instruction and bring about improved
student achievement.
There has been considerable thought in recent years about how data can be
used in schools, especially in the area of reform, restructure, and renewal. Some of
this thought has come from within education (educational researchers) and some
has come from the business community and social policymakers. Some of the
thought coming from business shows us how to think differently when approaching
problems or developing innovations. Systems thinking by such authors as Senge
(1991) informs us about how to take information about what is happening within
the organization and use it to help improve operations and improve results.
John Brown (1991) of the Xerox Corporation discusses research that
reinvents the corporation. The research team (known as the Palo Alto Research
Center or PARC) explores how to think differently than the norm. They developed
something they call pioneering research which takes the best from applied research
and basic research in order to come up with ideas that redefine what is meant by
technology, by innovation, and research itself (p. 103). Brown suggests that the
company of the future must understand how people really work and how
technology can help them work more effectively. It must know how to create an
environment for continual innovation on the part of all employees. It must rethink
traditional business assumptions and tap needs that customers don’t even know
they have yet. It must use research to reinvent the corporation (p. 104).
In the same way, schools must learn to take the information gleaned from
multiple data sources in order to figure out how to improve the organization.
Sometimes it might mean “thinking outside of the box” in order to come up with
the solutions for certain problems that have blocked student achievement.
Some educational scholars have developed complete reform systems to help
accomplish positive change in schools. These systems are built on the developer’s
view of child development or learning theory (Traub, 1999). Some well known
systems (reform programs) include: Accelerated Schools, Coalition of Essential
Schools, Expeditionary Learning Outward Bound, and Success For All. These
programs were developed because the researchers looked at data coming out of
schools and set about developing ways to improve outcomes. Educators must
continually look at research happening within the educational context and outside
of it in order to find innovative ways for improving their organizations. Fullan
(1993) says that schools, even empowered ones, will never become hotbeds of
continuous reform and improvement, if left to their own devices.
Statement of the Problem
Student achievement over the last couple of decades has been low
compared with that of students in other countries. In an effort to improve achievement,
states have developed accountability systems, which rely on testing, and the data
that results from such testing. However, there is evidence that teachers and
administrators are not using data effectively to positively impact student
achievement. Teachers and administrators need to know how to read, disaggregate,
and understand data in order to translate the information into powerful instructional
change.
Although the current research literature indicates that teachers rarely use
data to improve student achievement, there are schools that are beginning to
effectively use data to impact instruction and achievement in a positive manner
(Schmoker, 1999). It is important that research be conducted that will help map out
how schools can systematically use data to drive instruction and improve student
achievement. More information needs to be presented that will help school districts
make systemic changes that will raise expectation levels in the use of data, and
move schools toward the emerging state practices related to academic
accountability.
Because most teachers do not currently use data effectively, more needs to
be known as to how teachers can learn more about data use and become confident
in their ability to analyze data and turn their findings into powerfully informative
documents that can be used as road maps for improved instruction. Additionally,
more needs to be known about how district level administration uses data and
whether or not they use them as a means for improving instruction and achievement
across the organization. If districts are using data, then what are the data sources
and is there a strong assessment mechanism that helps embed standards-based
instruction within the organization?
Two essential pieces in the state accountability system are the API and the
High School Exit Exam (HSEE). We need to know how district administrators,
school administrators, and teachers are using or could use data and standards to
improve API scores and prepare students for the HSEE. The preparation should
begin at the elementary school and continue through high school. How do districts
begin developing a plan for using data to monitor student progress toward passing
the HSEE and schools’ progress toward ever increasing API scores?
Purpose of the Study
The primary purpose of this study was to investigate how good schools and
districts use data to improve student achievement. This purpose is driven by the
researcher’s need to learn how successful schools and districts design uses for
collected data and how those designs are implemented. This study was also
intended to find out how schools and districts use data to assess learning and
inform instruction across the organization.
The study was conducted to learn whether or not districts have begun
developing a systemic approach to data use that will allow administrators and
teachers to ensure that students are prepared for state expectations for learning at
the elementary and secondary levels. The study would look for a link between the
district designed plan for data use and state emerging practices for academic
accountability.
Research Questions
The study addressed the following research questions:
1. What is the district design for using data regarding student performance, and
how is that design linked to the current and the emerging state context for
assessing student performance?
2. To what extent has the district design actually been implemented at the district,
school and individual level?
3. To what extent is the district design a good one?
The Importance of the Study
This study is important because it will add to the literature base by
investigating how schools and districts develop plans for using data to improve
student achievement and how such plans are tied to the accountability systems
developed by state policymakers. The literature on accountability, systemic
reform, and educational finance either implicitly or explicitly discusses the need for
educators to understand how to use data to improve instruction and student
achievement, and that such data use become an integral part of the organizational
system. This study seeks to investigate how organizations that use data have
integrated data use systemically with the express purpose of improving the
academic achievement of the students within the system, and informing teachers of
the effectiveness of their instructional practices.
The findings of this study will be of value to all constituents in the
education community. First, administrators at the district level will glean
information that will help them use data to improve instruction, student
achievement, and support efforts system-wide. An analysis of how good designs
for data use are developed will help school districts create strong designs of their
own. Administrators at the school site level will be able to see how data use, in a
structured context, will help them monitor the progress of student achievement
throughout the course of the school year, thereby achieving consistently greater
success on accountability measures from year to year.
Secondly, this study will help principals find ways to help their staffs learn
to use data for the improvement of student outcomes school-wide. Of particular
importance to school leaders will be the findings on how successful schools
implemented district designs for data use. Likewise, teachers will be helped to see
the importance of incorporating data analysis into their instructional planning, and
that such incorporation can become a valuable tool for improving their craft and not
a weapon for administrators and policymakers to denigrate teacher performance.
Finally, the findings in this study will help all stakeholders—policymakers,
administrators, teachers, and parents—see the value of embedding a systemic plan
for using data within their organization. Successful schools have teachers who
have learned or are learning to use data to inform their instructional practices.
Administrators in successful schools not only monitor school-wide achievement
through data analysis, but also use the information that academic data provide to
direct teachers toward improvement and to lead course corrections
when achievement is not what it should be during the course of the school year.
When this study is completed, its importance will lie in enlightening the
educational community about the necessity of having a systemic plan for utilizing
data to improve student achievement and about how such a plan could be implemented
in a particular organization. Research has indicated that schools in this country need
to improve student performance. Research has also indicated that
improving teachers’ ability to use and analyze data would result in powerful
learning outcomes. A study that investigates how schools plan for and use
instructional data for the improvement of achievement outcomes would have a
powerful effect on organizational efforts to meet state and local accountability
mandates.
Limitations
This study had inherent limitations. The investigation was limited to one
school and conducted during a one-month period. The scope of the investigation
would make looking at more than one school by a single researcher an extremely
difficult task. Time constraints were an inherent part of the limitations. The
researcher was limited in the amount of time granted for subject interviews and
data collection within the context of the school setting. Many other variables (e.g.,
student attendance, classroom instruction, and parental attitudes) affected the
outcome of student achievement. Limitations of the case study format were also
inherent in the study.
Delimitations
All subjects who participated in this study were in the same school and in
the same district. The district and elementary school were selected because of the
success of students on success indicators and because there was evidence that the
school and district have successfully used data to inform their decision-making.
Therefore, the size of the sample was small and may have been limited in terms of
generalizability.
Generalizability was increased, however, by selecting a case that was
typical of the phenomenon and designing the case study using clear methods that
would apply to similar schools.
Assumptions
The study was conducted with a set of assumptions. The district
administrators and the principal were believed to have a good understanding of the
collection of data and how to use the gathered information. The subjects of the
study were assumed to respond honestly to questionnaires and to face-to-face
interviews. Finally, it was assumed that data collection techniques such as surveys,
interviews, and questionnaires would provide valid and adequate data for the
purposes of this study.
Definitions
The following terms are presented and defined to clarify their meanings and
to operationalize their usage and interpretation in a consistent manner.
Academic Performance Index (API)
One of three components of California’s new Public Schools Accountability
Act (PSAA), the API measures the performance of schools, especially the academic
achievement of pupils, and improvement over time. A school’s API score is used
as the basis for ranking California’s public schools. In 1999 and 2000, the only
performance indicator included in the API was the school’s Stanford-9 test scores, a
point of controversy and contention among educators. Other performance
indicators are being reviewed for addition to the API. Results of the augmented
items to the Stanford-9 and the High School Exit Exam will be added once the tests
are found to be valid and reliable. The law specifies that at least 60% of the API
must consist of test results. The remaining 40% may include measures such as
student and staff attendance rates and graduation rates.
Accountability
The notion that people (e.g., students or teachers) or an organization (e.g., a
school, school district, or state department of education) should be held
responsible for improving student achievement and should be rewarded or
sanctioned for their success or lack of success in doing so.
Achievement test
A test to measure a student’s knowledge and skills.
Assessment
A system for testing and evaluating students, groups of students, schools, or
districts.
Content standards
For this study, content standards describe what students should know and be
able to do in core academic subjects at each grade level.
Criterion-referenced test
A test that measures specific performance or content standards, often along
a continuum from total lack of skill to excellence. These tests can also have cut
scores that determine whether a test-taker has passed or failed the test or has basic,
proficient, or advanced skills. Criterion-referenced tests, unlike norm-referenced assessments,
are not primarily created to compare students to each other. The goal is typically to
have everyone attain a passing mark.
Curriculum
The courses of study offered by a school or district. As the term is applied
in the context of the state of California’s educational improvement efforts, it refers
to the set of standards developed by the state that are intended to guide curriculum
and instruction.
High School Exit Exam (HSEE)
Beginning with the Class of 2004, California public school students must
pass the state’s new exit exam before receiving their high school diploma. The exit
exam is not a college entrance or honors exam. Instead, its purpose is to test
whether students have mastered the academic skills necessary to succeed in the
adult world. It is a pass-fail exam divided into two sections: Language arts
(reading and writing) and mathematics. Freshmen, sophomores, juniors, and
seniors can take the test. Once students pass a section of the test, they do not have
to take it again.
National Assessment of Educational Progress (NAEP)
This is a national test that is given to specific grade levels in specific
subjects every other year in which a small sample of students representative of the
state is tested. NAEP test scores can be compared to national averages. Not all
states participate in NAEP.
Nationally normed assessment
A test that has been administered to a national control group that reflects the
demographic profile of the target population (e.g., 4th graders) throughout the
country. The scores of all subsequent test-takers are then compared against the
scores of this control (or “norming”) group.
Public Schools Accountability Act
Signed into law in California in April 1999, this act outlines a
comprehensive process for ranking schools based on specific criteria and
improvement over time. When schools fall short of the expectations, the state may
intervene—first with assistance and later with sanctions. Successful schools will
be rewarded. The PSAA has three main components: the API, the Immediate
Intervention/Underperforming Schools Program (II/USP), and the Governor’s
Performance Award program (GPA).
School district
There are three types of school districts in the state of California:
elementary, high school, and unified. An elementary district is generally
kindergarten through eighth grade (K-8); high school is generally grades 9 through
12; unified is kindergarten through 12th grades (K-12).
Standardized test
A test that is in the same format for all takers. It often relies heavily or
exclusively on multiple-choice questions. The testing conditions—including
instructions, time limits, and scoring rubrics—are the same for all students, though
sometimes accommodations on time limits and instructions are made for disabled
students. Reporting of scores to parents, students, or schools is the same. The
procedures used for creating the test and analyzing the test results are standardized.
Stanford-9 (SAT-9)
This test, also called the SAT-9 or the Stanford-9, is officially known as the
“Stanford Achievement Test, Ninth Edition Form T” and is published by Harcourt
Brace Educational Measurement. It is a standardized, nationally normed, multiple-
choice test that measures basic skills in math, reading, English, and other areas. The
Stanford-9 was adopted by California in 1997 for a five-year period as its statewide
student performance test in grades 2-11 and is currently the sole indicator used in
California’s Academic Performance Index (API).
Third International Mathematics and Science Study (TIMSS)
TIMSS is the third attempt by educators to compare achievement in
mathematics and science across nations. Although criticized by some as lacking
reliability, it is the largest, most recent, and best-controlled study of its sort.
TIMSS administered tests to students in 26 countries at grade 4 and 41 countries at
grade 8. Depending on the specific test, it was administered to up to 21 countries at
the Final Year of Secondary School. (It is called the Final Year because, in many
instances, it does not correspond to 12th grade in the United States.)
Organization of the Study
Chapter 1 is an introduction to the study. Chapter 2 reviews the literature
organized by the following topics: definitions of data and data usage, key
components of data use in education, conceptual frameworks for analyzing data use
in schools, and the district framework for data use for improved instruction and
student achievement. Chapter 3 describes the procedures used for sample
selections, the development and use of instruments, data collection, and data
analysis. Chapter 4 presents the findings and a discussion of the findings. Chapter
5 summarizes the findings, conclusions, and implications of the study, and presents
recommendations for further research.
CHAPTER 2
LITERATURE REVIEW
Introduction
The discussion of the use of data in schools is a developing
phenomenon in education. The topic goes beyond the consultation of
achievement data to incorporate ideas for restructuring and
renewal and to inform teachers, administrators, and policymakers on the best
methods for improving instruction. This chapter will present a review of
the literature on topics germane to the purpose of this study. The first
section will discuss the status of student performance (especially in math
and literacy) in the US and California, including comparisons with other
states and countries.
The next section will discuss the nature of comprehensive school
reform in the US, focusing especially on standards-based reform. The
discussion of school reform will be followed by a discussion on the role that
data/information about student performance and related matters is intended
to play in comprehensive school reform. That topic is followed by the role
of data in the California state expectations for schools and, finally, by research
about the current use of data in schools at both the individual teacher and
school-wide levels.
The Status of Student Performance in the U.S. and California
In 1983 the United States was jarred by a scathing report on the
status of the educational system. The document entitled “A Nation at
Risk,” commissioned by the Secretary of Education to investigate the
quality of education in this country, informed the general public that
students in the US were trailing their counterparts in other countries and
that our schools were houses of mediocrity (A Nation at Risk, 1983). The
contents of this report led to the educational reform movement that has been
with us now for at least 20 years.
One of the earliest attempts to quantify the progress of students in
the US was the National Assessment of Educational Progress (NAEP)
mandated by Congress to monitor the knowledge, skills, and performance of
the Nation’s children and youth (Congressional Digest, 1997). The
Congressional Digest (1997) continued to point out several disturbing
findings based on the NAEP study. First, tests indicated that by 1997, 40%
of American fourth graders could not read as well as they should to hold a
solid job in tomorrow’s economy. Secondly, the average reading
proficiency of 12th grade students declined significantly from 1992 to 1994,
and this decline was observed across a broad range of subgroups. For
example:
♦ The decline in average proficiency among 12th graders between 1992
and 1994 was concentrated among lower-performing students—those
scoring at the 10th, 25th, and 50th percentiles.
♦ The percentage of 12th grade students who reached the proficient level
in reading declined from 1992 to 1994. There also was a decrease in
the percentage of twelfth graders at or above the Basic level.
Further analysis of NAEP results shows that generally students are making
more progress in mathematics achievement than in reading, and good
readers are getting better at the same time weak readers are losing ground.
Furthermore, states have not generally reduced the achievement gap
between top and bottom quartiles or between white and minority students in
reading or in mathematics (Barton, 2001).
The performance of California’s 4th grade students on the NAEP
between 1992 and 1996 yielded results similar to those found in the national
aggregate. In mathematics, average scores remained the same, scores for
students in the bottom quartile improved, scores for students in the top
quartile remained unchanged, percent of students scoring proficient
remained unchanged, the closing of the gap between top and bottom
quartiles improved, and the gap between white and minority students
remained unchanged. In reading, average scores were unchanged, scores
for students in the bottom quartile were unchanged, scores for students in
the top quartile were unchanged, the percent of students scoring proficient
were unchanged, there was no change in the gap between the top and
bottom quartiles, and there was no change in the gap between white and
minority students (Barton, 2001).
When comparing California students against the national norm on
the Stanford Achievement Test, Ninth Edition (SAT-9), it is evident that
California students do less well than the norm. Typically, the 50th
percentile is deemed to be the line of demarcation indicating average
performance in the normed group. For example, SAT-9 data for fourth
grade students in reading from 1998 to 2000 indicated that fewer than 45%
of the students were scoring at or above the 50th percentile. In
mathematics, 51% of the state’s 4th graders scored at or above the 50th
percentile in 2000 while 44% did so in 1999 and 39% did so in 1998.
Results for all other grades were similar to the fourth grade performance
between 1998 and 2000, with the exception of the year 2000 scores in
mathematics. Better than 50% of the students in grades 2-6 and grade 9
scored at or above the 50th percentile in mathematics (California
Department of Education, 2001).
Unfortunately, when we compare the achievement of US students to
their counterparts in other countries in the areas of mathematics and science,
the picture does not look too much brighter. According to a guide to
business leaders for the support of mathematics by Larson, Guidera, Smith
& Nelson (1998), U.S. students’ performance tends to decline as they
matriculate through the grade levels. They report the following:
Results from the recent Third International Math and
Science Study (TIMSS), which compared performance in
mathematics and science of a half-million students
worldwide at three age ranges corresponding roughly to
grades 4, 8, and 12, including 33,000 Americans,
indicated that while U.S. students at the fourth-grade level
were near the first in the world in science, and were
above the international average in mathematics, by the
eighth grade, U.S. performance had fallen to slightly
above the international average in science and to below
the international average in mathematics.
In 1999 the Third International Mathematics and Science Study-
Repeat (TIMSS-R), the successor to the 1995 TIMSS, was administered in
38 countries. Whereas the original TIMSS evaluated three grade levels, this
study focused solely on eighth grade students. The TIMSS-R allowed
researchers not only the opportunity to see how eighth grade students
performed against their counterparts around the world in 1999, but it also
provided data as to how the same cohort of students performed four years
later.
The results of TIMSS-R showed that in 1999, U.S. eighth-graders
exceeded the international average of the 38 nations in mathematics and science.
In a cohort comparison of U.S. students and their counterparts who
participated in TIMSS 1995 and TIMSS-R 1999, the study found that there
was no change in eighth-grade mathematics or science achievement in the
United States. Among 22 other nations, there was no change in
mathematics achievement for 18 nations, and no change in science
achievement in 17 nations (U.S. Department of Education, 2000).
The results from TIMSS and TIMSS-R show that U.S. students do
not start out behind in elementary school. U.S. students in elementary
school are at the top of the world in science and mathematics. The problem
of lack of competitiveness tends to happen once students reach middle
school and high school. As the TIMSS data are analyzed, the findings seem
to indicate a lack of rigor in the U.S. curriculum in
secondary schools. U.S. students in high school no longer have to take
math and science by their senior year. Students in the highest performing
countries, such as Japan and Germany, are learning algebra, geometry,
physics, and chemistry in their middle school years. By contrast, middle
school students in the U.S. are still doing elementary arithmetic and
introductory science (National Center for Education, 2001).
The lack of academic competitiveness that U.S. students have
demonstrated creates major implications for the future of the economic well
being of the nation. There are long-term consequences for low student
performance. Advances in the workplace are increasing at astounding rates.
What successes that have been made in academic achievement have been
relatively slow and are far and away outpaced by changes and skill
requirements in the workplace. The NAEP results paint a troubling picture
of American student’s readiness for the demands of living, working, and
learning in a changing world (Progress of Education Reform, 1996).
Individuals need the academic knowledge and skills that equip them to
succeed in today’s ever-changing economy. It should come as no surprise
then that almost 90% of new jobs require more than a high school level of
literacy and math skills. Jobs that require strong academic skills are vacant
due to a lack of qualified workers, and remedial training costs are rising
(Larson, Guidera, Rogstad, Smith & Nelson, 1998).
Students who have poor academic skills will be ill-prepared to
survive in the workplace of the present and the future. America’s ability to
compete in the world economy when our students enter the workforce will
be seriously compromised if they are consistently outperformed by their
counterparts around the globe. The future of the nation’s economic
prosperity hinges on the ability of U.S. students to be able to read, write,
compute, and analyze well.
The Nature of School Reform in the U.S.
No major social institution has been more subject to pressure for
change than the public school system (Sarason, 1971). The 1983 “Nation
at Risk” (U.S. Department of Education) document was the catalyst for the
current move toward educational reform. This document caused many to
begin to seriously question the adequacy of the nation’s school system to
teach the nation’s children. It became increasingly clear to many that the
system needed to be reformed.
Some of the seminal work in the current educational reform
movement was the work of the late Ron Edmunds and his effective schools
research. The heart of this research was the investigation of the
major elements of high performing schools, especially in low income, high
minority populated areas. His research garnered five effective schools
correlates. Effective schools a) have strong administrative leadership; b)
have a climate of expectation in which no children are permitted to fall
below minimum but efficacious levels of achievement; c) have an
atmosphere that is orderly without being rigid, quiet without being oppressive,
and generally conducive to the instructional business at hand; d) get that
way partly by making it clear that pupil acquisition of basic school skills
takes precedence over all other school activities and that, when necessary,
school energy and resources can be diverted from other business in
furtherance of the fundamental objectives; and e) have some means by which
pupil progress can be frequently monitored (Edmunds, 1979).
This research caused the educational community to take a hard look
at effective instruction in the classroom and the effects of strong leadership
at the site level. Metaphors for leadership being applied to restructured
schools began to change from principal as manager to principal as
facilitator, from teacher as worker to teacher as leader (Murphy, 1991). The
investigation of effective instruction and strong leadership has given rise to
several reform models designed to improve academic outcomes at
underachieving schools.
Examples of these models are Slavin’s Success For All, Sizer’s
Coalition of Essential Schools, and Chris Whittle’s Edison Project. All of
these school reform models are based on specific philosophical beliefs
about teaching and learning. Indeed schools by their very nature are
expressions of belief, and philosophers have been devising the ideal schools
since the time of Plato (Traub, 1999). Traub goes on to point out that the
idea of a model, a replicable design that comes equipped with assembly and
operating instructions, is quite new and...of all of the “movements”
crisscrossing the educational landscape it is this one (school-wide reform
models) which arguably holds out the most promise for producing
categorical change (Traub, 1999, p. 6).
As the reform movement continued its momentum, the emphasis
began to shift from structures at the school site—although reform models
for individual school sites continue to be central pieces in state and
national efforts to improve the performance of low performing schools.
This approach is supported by the literature on school restructure and
renewal which says that school improvement is an integrated rather than a
piecemeal activity and that improvement occurs on a school-by-school basis
(Murphy, 1991)—to standards-based accountability systems designed to
make systemic educational reforms, beginning at the state department of
education, through county and district offices, to individual school sites.
Nearly all states are involved in standards-based reform, and their
commitment to this strategy remains strong (Education Commission of the
States, 1996). State initiated change can positively impact student
achievement if properly implemented. Marsh and Bowman (1989) found
top-down strategies of reform (i.e., top referring to reforms initiated by
state policymakers and top-down meaning the progression of impact of the
policy as it is carried through district and school leaders before reaching the
classroom teacher) to be very effective when all students were targeted and
existing academic programs were strengthened; when the local
implementation process is stimulated by external pressure, especially in the
form of testing; the content of the reform extends across the school and
includes alignment of curriculum, textbooks, teaching strategies and testing;
the roles of the state, district and school are complementary; the local
decision-making process complements rather than competes with the
existing structures; the reform fits within the school and district, and finally,
key players are able to institutionalize their efforts within the regular
program of the school. Additionally Marsh (1988) found that state-initiated
“top-down, content-focused” reform in secondary schools is successful
when the content of the reform fits with the priorities of the district, and
when districts and schools are able to transform the reform into their local
agenda and context.
A couple of major findings by Odden and Marsh (1988) in their
study of a broad state-level initiative for secondary schools (the PACE
study) were that a) education reform legislated at the state level can be an
effective means of improving schools when it is woven into a cohesive
strategy at the local level, and b) attention to the substance of curriculum
and instruction and to the process of school change correlates with higher
test scores and improved learning conditions for all students.
However, states tend to differ in the manner in which they develop
their standards based programs and the progress targets developed for their
respective systems. For example, states such as Florida and Texas have
absolute progress systems, which means that students in this system must
meet a specific performance standard (e.g., meeting a specific standard such
as making a certain score on a given test). Other states such as Kentucky,
Maryland and California have implemented a system of relative progress.
This means that students must show improvement over time (EdSource,
2000).
The structures of these programs differ as well. Texas has a
curriculum standards-based system with a criterion-referenced test to
measure student outcomes relative to the standards. By contrast, California
developed curriculum standards, but has relied on a norm-referenced test to
measure student achievement. The Standardized Testing and Reporting
(STAR) program is the process that the state of California uses to assess
student achievement. STAR requires all districts to administer the same
nationally normed, basic skills, standardized test. The standardized test
chosen by the state of California is the Stanford Achievement Test Ninth
Edition (SAT-9). The SAT-9 portion of the STAR is now the linchpin of
the state’s accountability law (Guth, Holtzman, Schneider & Carlos, 1999).
Emerging practice in California, with respect to assessing student
achievement, is changing to include a standards-based exam (known as the
California Standards Test) and a High School Exit Exam (which will be
required for high school graduation).
The fact that states have developed different approaches to reform
and accountability should not at all be surprising. Gerald Hayward, Co-
Director of Policy Analysis for California Education, has pointed out that
the subject of school accountability is extraordinarily complex, and that
educators and policymakers are going to have to realize that implementing
effective accountability systems will take a lot of money and a lot of time,
on the order of perhaps 10 years and hundreds of millions of dollars
(EdSource, 1998). The main idea here is that policymakers must have the
vision and the intestinal fortitude to sustain long-term approaches to
systemic reform.
Though most states have begun to take the high road to systemic
reform, there are voices of caution along the avenue of this reform
movement. These messages warn of the dangers of overreliance on high
stakes testing, and point out the difficulties in maintaining reform efforts.
John Merrow (2001) points out that:
politicians and education policy makers demonstrate an
increasing willingness to make life-changing decisions about
our children based on their performance on a standardized
test. Their goal may be the improved educational health of
the student, but their method is little more than a
sophisticated, technological form of "Gotcha.”
Many reform efforts have tended to fail for various reasons. Fullan
and Miles (1992) say that some of the reasons for failure are that many times
reform efforts have faulty maps of change, problems are complex and
solutions are not easy, impatient and superficial solutions are employed,
leaders many times tend to misunderstand resistance, and there is
sometimes a misuse of the knowledge of the change process.
Reforms also tend to fail because teachers do not have the necessary
training to re-tool their thinking and pedagogical practices. Educational
reform is meaningless unless change takes place at the grassroots level, the
classroom. It is effective teaching that positively impacts student
achievement. Cohen and Ball (1990) point out that teaching is not a
uniform practice where instruction can be changed at will, much like one
changes a set of garments, but rather is an art form, a state of being, of
knowing, and of seeing. They go on to say that “if teachers are to
significantly alter their pedagogy, they must come to terms not only with
the practices that they have constructed over decades, but also with their
students’ practices of learning, and the expectations of teachers entailed in
them” (Cohen & Ball, 1990, p. 335).
Teachers’ understanding of students’ practices of learning and
improved pedagogy are enhanced when teachers develop their craft through
collegial exchange and collaboration. Fullan, Bennett and Rolheiser-
Bennett (1990) tell us that the basic features of school improvement are
these: shared purpose, norms of collegiality, norms of continuous
improvement, and structures representing the organizational conditions
necessary for significant improvement.
Darling-Hammond (1990) tells us that teachers teach from what they
know. If reform is going to be successful then we must ensure that the
knowledge base for teachers is ever increasing. This is especially true if the
reform effort is initiated from the top, and the innovation is something that
is new. Policymakers cannot just issue an edict and expect change to
magically happen. Darling-Hammond (1997) cautions that:
although policymakers may think that educators implement
their directives, what typically happens is much more a
process of redefinition or sometimes subversion. Teachers’
responses to policies depend on the degree to which the
policies permit flexibility or impose constraints on their
ability to meet what they perceive to be the needs of their
students (p. 70).
Innovation happens because of effective implementation at the local
level. District administrators, site administrators and teachers may be
enthusiastic about a particular idea or program, but if there is no connection
between the innovation and practice, or if the teaching staff believes that the
organization is not supportive of the innovation, then the implementation
will not happen.
McLaughlin (1990) found that although teachers at a site may be eager
to embrace a change effort, they may elect not to do so, or to participate on
only a pro forma basis, because their institutional setting is not supportive.
Consequently, the enthusiasm engendered in teachers may come to little
because of insufficient will or support in the broader organizational
environment, which is hard to orchestrate by means of federal (or even
state) policy.
Policy must be well communicated if it is to be understood.
Meaningful discussion and professional development are imperative for clear
communication of change (Darling-Hammond, 1990). Directives by
themselves are not enough. Policymakers must be cognizant of the fact that
change in student achievement is directly tied to the quality of instruction
students receive. Therefore, educational reform will only be effective
if teachers receive comprehensive professional development and the
parameters for change are clearly articulated. There must also be a
Reproduced with permission of the copyright owner. Further reproduction prohibited without permission.
36
commitment by state policymakers to financially support the reforms that
they initiate.
States and districts have continued to march toward systemic
reform, especially standards-based accountability. Since the early 1970s
and throughout the 1980s, numerous reform initiatives have sought to
increase the accountability and effectiveness of public education in
America... efforts to improve student achievement outcomes have included
increasing graduation requirements and implementing assessment programs
that define both standards of performance of students and standards of
accountability for the educational system (Winfield, 1993).
In California, the Public School Accountability Act of 1999 (PSAA)
requires the Superintendent of Public Instruction, with the approval of the
State Board of Education, to develop an Academic Performance Index
(API), to measure the performance of schools. The API scores schools on a
continuum of 200 to 1000. The state has established a score of 800 as the
benchmark all schools are expected to reach in order to be considered high
performing. All schools are expected to make progress toward the score of
800 at 5% growth each year until they eventually meet the mark (EdSource,
1999).
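To make the growth expectation concrete, the following sketch (written in
Python purely for illustration) computes an annual growth target under one
plausible reading of the 5% rule, namely that a school is expected to close
5% of the gap between its current API and the 800 benchmark each year. The
function name, the handling of schools already at or above 800, and the
example score are assumptions made for this illustration, not the state's
official formula.

    def annual_growth_target(current_api, benchmark=800, rate=0.05):
        # Assumed reading: expected annual growth is a percentage of the
        # distance between the current API and the 800 benchmark.
        if current_api >= benchmark:
            return 0  # schools at or above the benchmark need only maintain it
        return round(rate * (benchmark - current_api))

    # Example: a school with an API of 520 would be expected to grow by
    # about 14 points, since 0.05 * (800 - 520) = 14.
    print(annual_growth_target(520))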
Guth, Holtzman, Schneider and Carlos (1999) conducted an
empirical study on California’s Accountability system. They found that
districts and schools report that accountability has had positive effects,
especially on curriculum, instruction, and assessment practices. They also
noted that because most districts’ standards-based accountability systems
have been in place for less than two years, it is too early to know for sure
what effects, if any, these systems are having on student achievement.
It must be acknowledged that there is a financial component to
educational reform. If we are to improve the academic performance of all
students, we must first acknowledge that the playing field is not level.
Every student does not come to the educational playing field with the same
set of skills and preparation. Is it, therefore, fair to expect all students to
meet the same target without adequate resources? There are some external
forces that impact student learning that individuals in the educational setting
have absolutely no control over. For instance, poverty and its effects can
have a deleterious effect on a student’s capacity to achieve academically.
The school system must grapple with the questions of adequacy and equity
in order to ensure that all students have what they need to successfully
navigate the academic landscape and achieve at high levels.
Besides the question of vertical equity is the question of horizontal
equity. Not all schools have the same allocations because of the differences
in area wealth. These discrepancies have led to lawsuits and finance
reforms in various states. Once finance reforms are mandated, does the
increase in funding lead to spending changes in school districts that improve
student performance?
Odden (2000) says that one of the most problematic issues raised in
comprehensive school change has to do with cost. Most discussions of
comprehensive school reform seem to assume that such reforms can be
implemented with little if any new funds. This type of thinking ignores the
fact that spending is not equal across the state and national landscape.
School financing has been quite an issue over the past two decades simply
because of the problems of equity inherent in the funding system in many
states. In times past, many policymakers voiced concerns over how districts
would spend the funds once additional school aid was provided. One of
the main predictions was that districts would put the bulk of the funding
toward teachers’ salaries, which account for 68% or more of most districts’
budget allocations, as opposed to other expenditures that would have a
greater impact on student learning (Goertz & Natriello, 1999). Some
estimates put personnel related expenditures at 75-85% of a district’s
budget (Picus, 2000).
Elliot (1998) points out that there is considerable controversy among
educational researchers over the relationship between school finance and
students’ achievement. One side maintains that there is no significant
relationship between increasing expenditures and improving students’
achievement because schools do not effectively use these funds to improve
the learning environment. The other side maintains that there is a
significant relationship between money and achievement because money
buys smaller classrooms and schools and more qualified teachers. School
districts tend to be on the side of the latter as their spending patterns bear
out. Goertz and Natriello (1999) point out that studies show that high cost
and low cost districts tended to spend additional funding allocations on the
same kinds of things, mainly lowered class sizes, additional support
services, and more instructional materials and equipment. These types of
expenditures directly impact students at the classroom level. Picus (2000,
p. 2) points out the following:
Schools often tend to implement new educational
programs but rarely try to eliminate programs that don’t
appear to be working or are not as cost-effective as other
programs. Without careful analysis of existing
programs, a school district can find itself with a vast
number of unrelated programs, some of which might
even work at cross-purposes. That’s why it’s important
to evaluate and analyze all school district programs
regularly and to give careful thought to their costs and to
their effectiveness in improving student achievement.
According to Duncombe and Yinger (1999), in order for school
report cards and performance-based state aid systems to be fair, they must
distinguish between poor performance based on external factors and poor
performance based on school inefficiency. In order to help make such a
distinction, Duncombe and Yinger (1999) propose a cost index. They say
that “the key to removing external factors is to calculate an educational cost
index; as we use the term, such an index measures the impact of input and
environmental costs, not just input prices. This cost index plays a key role
in public policy; it is not fair to expect a high-cost district to achieve the
same performance as other districts unless it is given enough resources to
compensate for the high cost” (pp. 265-266).
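As a minimal illustration of the cost index idea (and not Duncombe and
Yinger's actual model), the index can be read as a multiplier on the
resources a district needs before it can fairly be expected to reach a given
performance level. The per-pupil figure and index value in the Python sketch
below are hypothetical.

    # Hypothetical figures used only to illustrate the concept.
    statewide_base_spending = 6000   # assumed per-pupil amount adequate in an average-cost district
    cost_index = 1.25                # assumed index for a high-cost district (25% above average)

    # Under a simple multiplicative reading, the high-cost district needs
    # proportionally more resources before equal performance can be expected.
    adjusted_requirement = statewide_base_spending * cost_index
    print(adjusted_requirement)      # 7500.0 per pupil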
There are schools where the cost of doing business is higher than in
other areas. For example, many inner city schools cannot attract teachers
unless they pay more money (combat pay) because of the dangerous
circumstances indigenous to the inner city environment. Many students
from low socioeconomic conditions need after school tutorials and other
opportunities to learn because families in poverty may not be able to
provide the additional support (help doing homework, purchasing of books
for reading outside of the school setting, and proper pre-natal and post-natal
nutrition) needed for academic success. For the school to fill in these gaps,
additional funding is necessary. Duncombe and Yinger (1999) remind us
that if performance declines as student poverty increases, then a district
with a high poverty rate cannot achieve the same performance as a district
with a low poverty rate without running programs (which of course, cost
money) to offset the impact of poverty.
Elliot (1998), in speaking of opportunities to learn (OTL) in the
context of math and science instruction, points out the following:
expenditures correlate significantly with most measures
of OTL. More money is spent in schools where the
teachers tend to be more educated and more
experienced, classes tend to be smaller, teachers tend to
put a greater emphasis on higher-order thinking in math,
to make greater use of computers in math, and to have
greater access to science equipment in good condition.
In schools where less money is spent, classes tend to be
larger and teachers are more likely to emphasize the
relevance of math and science and memorizing facts in
math and to make greater use of calculators.
Elliot (1998) points out elsewhere in the document that the emphasis on the
relevance of math is much less effective than the emphasis on higher order
thinking skills, and that the emphasis on the relevance of math is more
likely to be found in the instruction of minority and poor students.
It is interesting to note that even though comprehensive educational
reform will need to be accompanied by adequate funding to ensure that the
dollars necessary for supporting change are available, finance reform and
education reform are typically addressed separately. For example, Goertz
and Natriello (1999) point out that in the cases of Kentucky, Texas, and
New Jersey the relationship between finance reform and education reform
remained separate at the state level, despite these states’ attempts at
providing guidance for spending new allocations and shaping of a vision for
school reform.
In the final analysis, the reform movement has moved all
stakeholders in the educational process to focus on improving student
achievement outcomes. The reforms are targeting how instruction is
presented (school wide reforms) and what is being taught (standards-based
reform). Adequacy of funding is also key to successful educational reform.
The types of outcomes sought by the educational reform movement cannot
be achieved without adequately funding schools according to their
situational needs (controlling for needs of vertical equity) and providing all
schools with the funding that will allow them to purchase goods, services
and facilities that will ensure basic opportunities to learn (horizontal equity
issues).
Such concentrated efforts should have a positive impact on student
achievement outcomes. Higher student achievement, improved instruction,
and greater rigor in academic content are the true nature of the educational
reform movement.
The Role that Data about Student Performance and Related Matter is
Intended to Play in Comprehensive School Reform
In comprehensive school reform, data is intended to be used to guide
education leaders and classroom teachers toward improved instructional
practice. In order for improved instructional practice to take place, data
must be used effectively by all stakeholders. However, effective use of data
by educational leaders and classroom teachers has not always been
accomplished.
In many schools and districts, data analysis has never been viewed
as a high priority. Many state education departments put little emphasis on
schools gathering data, and thus provide little incentive for districts and
schools to devote time, money, and staff resources to using data in new
ways (Bernhardt, 2000). When data are used at the district and school site
levels, research has found that district administrators are more likely to use
and analyze student performance data than schools and teachers (Guth,
Holtzman, Schneider & Carlos, 1999).
Nevertheless, policymakers have become much more attuned to the
need for schools to use data to improve instruction and student
achievement. “The notion that schools should be held directly responsible
for improving student achievement—and should be rewarded or sanctioned
for their ability or lack of ability to do so - is taking both California and the
nation by storm” (EdSource, 1998). In looking at how math teachers in
California interpret state educational policy, Cohen and Ball (1990) noted
that if instructional changes are going to take place, teachers and students are
the ones who will have to make them. They also note that “changing one's
teaching is not like changing one's socks. Teachers construct their practices
gradually, out of their experience as students, their professional education,
and their previous encounters with policy designed to change their
practice.”
State and local policies have been, and continue to be, designed to
effect change in the instructional practice of classroom teachers and
improve student academic achievement. However, much change has
typically been highly influenced by the textbooks used in the classrooms.
Because teacher practice is so widely divergent, teachers can be using the
same text and covering very different topics (Cohen & Ball, 1990).
Teachers, in many instances, teach what they are comfortable with
and what they understand. What the textbook recommends in many
respects may be inconsequential. Thus, when top down policy is issued it is
important for policymakers and administrators to realize that extensive and
in-depth professional development must accompany the policy if
instructional change is to be realized. It is important to invest in sound
professional development that will help current employees learn how to
make new programs successful. This requires a commitment to helping
employees understand the need for change and providing them with support
to deal with change when it comes (Picus, 2000). As Cohen and Ball
(1990) point out:
any teacher, in any system of schooling, interprets and
enacts new instructional policies in light of his or her
own experience, beliefs, and knowledge. Hence to
argue that government policy is the only operating force
is to portray teachers as utterly passive agents without
agency. That is unsupported by our investigations (p.
335).
Therefore, staff development is crucial for teachers if they are to be able to
utilize data effectively and utilize the state adopted texts, which must adhere
to the state standards, to their ultimate advantage.
The use of data is key to holding schools accountable for the
achievement of students. Data is information that tells exactly what is
happening in a given situation. For example, performance data from
achievement tests provide information which tells how well students
perform academically as measured by a given test instrument. If
disaggregated and analyzed properly, achievement data can provide a
wealth of information about how teachers can improve instruction. Data
help us to monitor and assess performance. Just as goals are an essential
element of success, so data are an essential piece of working toward goals
(Schmoker, 1999).
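A minimal sketch of what such disaggregation might look like in practice
appears below, written in Python with the pandas library. The grade levels,
subgroup labels, and scores are invented for illustration and are not drawn
from any assessment discussed in this study.

    import pandas as pd

    # Hypothetical student-level results; every column and value is invented.
    scores = pd.DataFrame({
        "grade":      [3, 3, 3, 4, 4, 4],
        "subgroup":   ["EL", "non-EL", "EL", "EL", "non-EL", "non-EL"],
        "math_score": [48, 61, 52, 45, 67, 70],
    })

    # Disaggregate: average score and student count by grade and subgroup,
    # so strengths and weaknesses hidden in the overall mean become visible.
    summary = scores.groupby(["grade", "subgroup"])["math_score"].agg(["mean", "count"])
    print(summary)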
However, schools must become more knowledgeable about the
effective use of data. Noyce, Perda and Traver (2000) point out that “many
school districts underutilize one of the most powerful and common symbol
systems available to them—numbers—to monitor, evaluate, and revise
programs and policies.” Programs and policies should directly affect what
happens in classrooms. In fact, what is key for California’s six million
public school students (and students nationally) is what happens in their
local schools and classrooms (EdSource, 2001).
Data, especially assessment data, are intended to be used as tools to
change what happens inside classrooms. For example, assessment involves
the systematic and purposeful collection of data to inform actions. The
primary purpose of assessment data is to help students by providing
information about how instruction can be improved (Ransom et al., 1999).
Schmoker (1999) provides evidence that there are teachers who are using
data to track and improve achievement in writing and higher-order math
skills. He goes on to say that “criteria-based assessments and rubrics have
changed the nature of assessment by providing numerical data that take us
beyond the ability to test mere recall. We can now assess understanding,
application, and other thinking skills in new ways” (p. 37).
Although many schools and districts are not using data to the benefit
of their organizations, there is exciting evidence that there are schools and
districts that are beginning to effectively use data to improve instruction and
student achievement. In an empirical study performed by the Dana Institute
at the University of Texas at Austin, researchers Skira et al. (2000)
investigated four school districts in Texas to find out what was responsible
for the dramatic turnaround in the achievement of their students. These
four districts, which serve predominantly minority populations, were at one
time extremely low performing as measured by the state of Texas
accountability system and its assessment, the Texas Assessment of
Academic Skills (TAAS). These four districts are now high performing districts, as
measured by TAAS, and exemplars of how entire districts can be
transformed into highly successful organizations.
One of the main components in the change in these districts was
their use of data.
Principals discussed with teachers data for the campus
and for each teacher, and the teachers discussed it with
each other. Central office personnel helped campus
leaders and teachers use data to focus, plan, and monitor
the implementation of their plans. In all these cases, the
data were used to drive improvement efforts (Skira et
al., 2000, p. 28).
Data use is an important component in comprehensive reform
because of data's ability to make the need for improvement, and the
strategies that follow, plain. Data make the invisible visible, revealing strengths and
weaknesses that are easily concealed. Data promote certainty and precision,
which increases teachers’ confidence in their abilities (Schmoker, 1999).
Furthermore, common goals that are regularly evaluated against common
measures—data—sustain collective focus and reveal the best opportunities
for practitioners to learn from each other and hence to get better results
(Schmoker, 1999).
Data, as related to this study, refers to information whose use will have
a measurable impact on student achievement and facilitate improved
instruction. Teachers and instructional leaders must begin using data in
ever increasing numbers if there is to be the type of wide scale
improvements sought by the comprehensive reform movement. Teachers
are not anti-improvement. Teachers and their administrators must be taught
how to make sense of the information that is placed before them. Only then
will improvement take place.
Good and Brophy (1997) declare that teachers seek opportunities to
evaluate and improve their teaching if acceptable and useful methods are
available. Data is now proliferating in large amounts through state
assessments, district assessments, and classroom assessments. The way to
make this proliferation of data acceptable and useful is for leaders to present
them in ways that make sense to teachers so that the road map to
instructional improvement is clear.
The Role of Data in the California State Expectations for Schools
Just as in the framework of comprehensive reform, the role of data
in the state of California’s expectations for schools is to provide schools and
districts with information that will guide them to improved instruction and
improved student achievement.
The state of California is using performance data from the various
tests that make up the Public Schools Accountability Act (PSAA) of 1999.
The assessments that are used in the PSAA are the SAT-9, the California
Standards Test (formerly known as the STAR augmented test), the writing
assessment, and the High School Exit Exam. When achievement results are
compiled from the SAT-9, points are given based on multiple variables
(e.g., socio-economic status of the students, language proficiency of the
students, and number of certificated teachers at the school). The points
determine a school’s rank on the Academic Performance Index (API). At
present, SAT-9 achievement data carries the greatest weight in the
determination of a school’s API ranking (Consortium for Policy Research in
Education, 2000).
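The weighting idea can be illustrated with a simple composite calculation.
The component names and weights in the Python sketch below are invented for
illustration only and are not the state's actual API formula, which is
documented by the California Department of Education.

    # Purely illustrative component scores and weights; not the official formula.
    component_scores = {"sat9_reading": 640, "sat9_math": 655, "writing": 610}
    weights          = {"sat9_reading": 0.4,  "sat9_math": 0.4,  "writing": 0.2}

    # Heavier weights on the SAT-9 components mean they dominate the index,
    # mirroring the point that SAT-9 data currently carries the greatest weight.
    composite = sum(component_scores[k] * weights[k] for k in component_scores)
    print(round(composite))   # a single index-style score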
However, the future of the PSAA envisions the use of multiple
measures to determine school performance. Executive director of the State
Board of Education, John Mockler, has predicted that “standards-based
tests, rather than the Stanford-9, would become the core of the state
accountability system in the future. The API will rapidly become a
standards-driven index” (EdSource, 2001).
When results from the various tests are distributed, schools are
expected to take the data and begin making improvements in their
performance rating. The results are distributed in various report types.
Individual student, school, and district reports are sent to districts, as results
are completed by the testing company (Consortium for Policy Research in
Education, 2000). The assessment data is provided so that schools and
districts can gauge how well they have performed against students in the
normed sample. The ranking data is provided so that schools can see how
they perform relative to other schools in the state of California.
All of this ranking and indexing is intended to provide
accountability to public schools in the state of California. National
researcher Margaret Goertz provided some insight into the intent of public
school accountability when discussing the question, accountable for what?
She said: What are the state and policymakers trying to measure?
Do they care about student’s knowledge of basic skills? Do
they want to know if students can write? Are they
interested in attendance and high school dropout rates? Is it
important to know how many of the state’s teachers have
credentials? Will they need to be able to compare their state
to the nation? These decisions shape not only the
assessment system but also affect what teachers teach and
students learn (EdSource, 2000).
What teachers teach and students learn is at the heart of
accountability and comprehensive reform. Darling-Hammond (1997)
points out that
research has shown that teachers who plan with regard
to students’ abilities and needs and who are flexible
while teaching are more effective, especially at stimulating
higher-order thinking, than teachers who engage in
extensive preplanning that is tightly focused on
behavioral objectives and coverage of facts.
The use of assessment data is crucial for teachers in diagnosing
student performance and in informing them of the need to change their
instructional practice.
The state of California provides assessment data to give schools
tools for improving achievement. However, data provided by the state are
not sufficient; they are only a part of the needed information. Schools
need to analyze their own locally generated data, whether district-driven
assessments or classroom-generated data. Information gleaned from these
sources helps schools make improvements in a steady and sustained way.
Black and Wiliam (1998) assert that educational standards are raised
through classroom assessments. Furthermore, they cite a research review
concentrating primarily on classroom assessment, which concludes that:
“improved formative assessment helps low achievers more than other
students and so reduces the range of achievement while raising achievement
overall” (p. 3). Schmoker (1999) indicates that the frequent collecting of
data provides consistent incremental growth. Activities such as quarterly
assessment in math or other subject areas, writing data collected
periodically, and running-records or sight-word-mastery data collected
monthly or quarterly, will have a positive impact on instructional
improvement (p. 53).
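The Python sketch below illustrates how a grade-level team might track one
such wave of data across grading periods; the use of a writing rubric and the
scores themselves are hypothetical.

    # Hypothetical quarterly writing-rubric averages for one grade level.
    data_waves = {
        "Quarter 1": [2.1, 2.4, 2.0, 2.3],
        "Quarter 2": [2.5, 2.6, 2.4, 2.7],
        "Quarter 3": [2.9, 2.8, 2.7, 3.0],
    }

    # Summarizing each wave lets the team see whether growth is steady and
    # incremental rather than waiting for a single annual test result.
    for period, rubric_scores in data_waves.items():
        average = sum(rubric_scores) / len(rubric_scores)
        print(f"{period}: average rubric score {average:.2f}")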
In sum, the role of data in the California state expectations for
schools is as a guide or catalyst for change in schools. The state provides
the results of the various mandated assessments to the schools. The state
also provides the schools with the performance index, which is calculated
using the results of the assessments and other non-school factors. The
California Department of Education (2000) has indicated the purposes of
the API and the data generated as a result of this program in their
framework for the Academic Performance Index. They write “as important
as it is to focus on the many central features of school that might be
considered as indicators, the primary emphasis of the API is student
achievement.” Thus the implication is that the data are intended to spur
schools to action toward improved student achievement.
Current Uses of Information In Schools
If the intended use of data is improved student achievement, then it
stands to reason that data analysis should be an indispensable activity in
strategic planning and lesson development. But as indicated earlier, studies
show that data are rarely used at the classroom level to inform instructional
practice. When surveyed, teachers rarely mentioned using data at all (Guth,
Holtzman, Schneider & Carlos, 1999). In fact, while teachers report they
feel pressure to improve test scores, they believe such scores are not
particularly useful in helping to drive instruction in a positive way (Khanna,
Trousdale, Penuel & Kell, 1999). Reeves (2001) reports that some schools
acknowledge that:
we have not been very effective at using the
information we have gotten over the years to help us
understand what we could be doing better, but we also
feel that most of the information which we receive is
not in small, diagnostic, useful pieces. Beyond this,
we also notice that we are not able to help teachers
become better consumers of this information.
There have been some reasons proffered as to why schools do not
use data well. Some of the reasons are lack of cultural emphasis, lack of
training, and fear. With regard to lack of cultural emphasis, data analysis
is not deemed particularly important. Regarding lack of training, too few
people at the school level are adequately trained to gather and analyze data,
or to establish and maintain databases. With regard to fear, many
educators are afraid that data analysis will turn up something they do not
want to see, such as evidence of their incompetence, or they have seen other
educators beaten up with data (Bernhardt, 2000). Schmoker (1999)
concurs: “Why do we avoid data? The reason is fear—of data’s capacity to
reveal strength and weakness, failure and success.” Slotnik and Gratz
(1999) offer as an explanation that “many districts are data rich but
information poor—they have compiled enormous quantities of data, but
often lack the capacity to learn from it.”
Regarding the role of school leaders’ use of data, Khanna,
Trousdale, Penuel, and Kell (1999) point out that: “comparative studies
have found that in general, principals’ knowledge of testing and assessment
is somewhat higher than teachers’ knowledge. At the same time, principals
have historically not appeared to give particularly strong emphasis to the
analysis of data.”
However, where teachers have used data to inform their instruction
there has been marked improvement in instructional practice. The most
effective teachers are constantly renewing their professional knowledge,
their classroom practice, and their classroom activities (Reeves, 1998). This
type of renewal comes from understanding how to use data to inform one's
practice and make necessary adjustments in instruction. In order for
teachers to be able to learn how to use data effectively and to be able to
make necessary adjustments in a timely manner, data must be received
frequently and come in waves. In a discussion of standards-based education
and its implication on student achievement, Robert Marzano offered the
following insight:
Typically, schools rely on state tests that are taken once
a year. Schools do not receive the results for months
afterward. Effective feedback has to be timely; schools
need to examine multiple data waves throughout the
year, at least one data wave every grading period
(Scherer, 2001).
For effective use of data to be replicated on a broader scale,
intensive training must take place. Understanding of how to use and
interpret data will be meaningless if it is not accompanied by an
understanding of what to do differently in the classroom to address
shortcomings indicated by the data. Professional development on the use of
data is not enough; professional development on improving instruction is
also critical (Guth et al., 1999). According to Shanker (1990):
The school-oriented dimension of staff development is
really at the heart of the matter. The continuous
examination of practice is integral to the improvement
of practice. Collegial interaction among teachers that
allows them to discuss, observe, analyze, and study
problems together is necessary if teachers are to be
able to generate the kind of practitioner-based
knowledge needed for improvement of practice.
Thus, use of data and instructional improvement go hand in hand, and
teachers must be trained to bring both elements together.
In order for teachers and administrators to be able to use data to their
ultimate benefit, they must learn to master the art of disaggregating and
analyzing data. Mastery comes only from strong initial and sustained
training. According to Fullan (1993):
Mastery involves strong initial teacher education, and
continuous staff development throughout the career, but
is more than this when we place it in the perspective of
comprehensive change agentry. It is a learning habit
that permeates everything we do. It is not enough to be
exposed to new ideas. We have to know where new
ideas fit, and we have to become skilled in them, not just
like them (p. 16).
Once teachers develop mastery of manipulating data and tying that
data to instructional improvement, schools will begin to see the type of
change that states and districts are looking for. For example, in a study
where middle schools emphasized data analysis as a part of their
improvement effort, evidence of improved student achievement became
quite apparent. Overall student performance in the middle schools that
were studied showed continuous improvement in both reading and math for
the 1997-98 school year. The percentage of students testing in the bottom
quartile decreased. These statistics are particularly encouraging, as students
in the bottom quartile are the most academically at-risk students and were
now being addressed due to the disaggregation of data by quartiles
(Khanna, Trousdale, Penuel & Kell, 1999).
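A simple sketch of quartile disaggregation is given below in Python; the
percentile ranks are invented, and the cut points assume the conventional
national quartiles.

    # Hypothetical national percentile ranks for one classroom.
    percentile_ranks = [12, 22, 28, 35, 41, 47, 55, 60, 68, 74, 81, 90]

    # Count students in each national quartile; the bottom quartile
    # (percentile rank below 25) flags the most academically at-risk students.
    quartile_counts = {"Q1": 0, "Q2": 0, "Q3": 0, "Q4": 0}
    for rank in percentile_ranks:
        if rank < 25:
            quartile_counts["Q1"] += 1
        elif rank < 50:
            quartile_counts["Q2"] += 1
        elif rank < 75:
            quartile_counts["Q3"] += 1
        else:
            quartile_counts["Q4"] += 1

    bottom_share = 100 * quartile_counts["Q1"] / len(percentile_ranks)
    print(quartile_counts)
    print(f"Bottom quartile: {bottom_share:.1f}% of students")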
In another study, several schools were involved in a project where
the objective was for teachers to analyze data, reflect on findings and devise
effective strategies and solutions. According to the findings, 350 primary
and 200 intermediate students taught by the 28 teachers involved in the project
showed a higher rate of achievement in classrooms where teachers employed
the Student Achievement Model (SAM) strategies. The median Comprehensive
Test of Basic Skills (CTBS) score for students in grades 3 through 5 was 76
(75.5% at or above the 50th percentile) in SAM classrooms, compared to 66
(63.9% at or above the 50th percentile) across the district (Frye, Frugerer,
Harvey, McKay & Robinson, 1999).
In sum, most schools do not use data to inform their instructional
practices or their improvement strategies. It is clear that most states have
implemented accountability measures and that these measures include
assessment data to monitor the improvement efforts of schools. States want
their schools to use the data/information provided by them to guide
improvement efforts. If schools, especially classroom teachers, are not
using data to inform instruction, then the change that policymakers seek
will be very slow coming indeed.
There must be concerted efforts by instructional leaders at the
district levels and school site levels to provide meaningful staff
development that gives teachers the skills necessary to analyze data and
develop strategies for improved instructional practice. As teachers embrace
the use of data and employ innovative strategies, student achievement will
invariably improve. However, it must be noted that districts and schools
cannot mandate the use of data. You cannot mandate what matters, because
what really matters for complex goals of change are skills, creative
thinking, and committed action (Fullan, 1993). Teachers must acquire the skill
of data use, as well as the ability to develop instructional improvement
strategies, through training that is sustained and consistent.
When strong staff development that incorporates data analysis for the improvement
of instructional practice becomes an integral part of the school culture,
schools will begin to have more teachers participating in the process of
using data to inform their instruction.
Conclusion
The literature base points out the need for educators to systemically
reform educational processes for the improvement of student achievement
outcomes. Research indicates that teachers, as a whole, do not use data to
inform their instruction, and that ongoing staff development is important for
administrators and teachers if they are to successfully implement the
innovation of data use into their instructional practice. This study sought to
investigate how successful schools and school systems plan and use data to
improve instructional strategies for teachers and academic achievement for
students.
CHAPTER 3
RESEARCH METHODOLOGY
Introduction
This chapter describes the research methodology used for the study,
including sample selection, instrumentation, data collection and conceptual
framework for data analysis. The primary purpose of this study was to find out
how effective schools use data to inform instruction and improve student
achievement.
This study investigated the following research questions:
1. What is the district design for using data regarding student performance, and
how is that design linked to the current and the emerging state context for assessing
student performance?
2. What does data use to improve student performance look like at the district,
school, and classroom teacher level?
3. To what extent is the district design a good one?
Methodology
The research methods for this study were qualitative in nature. In-depth
interviews provided the basis for the analysis of beliefs, perceptions and
inclinations of the participants of this study, which included teachers, the principal,
and a district level administrator of an elementary school in a unified school district
in the Inland Empire of Southern California. The taped interviews probed the
subjects’ understanding of the district design for data use, the extent to which they
used the data provided in their particular context to improve student achievement,
and their beliefs about the adequacy of the design for data use. The surveys
determined the teachers’ and principal’s level of implementation of the district
design for data use. All subjects participating in the interviews and
surveys were assured anonymity. All of the information presented about the
interviewees and the school is factual. Archival data collected and reviewed are
public documents.
Justification for Qualitative Research Design
This particular research project is a case study. The case study is one of
several approaches to qualitative research. A case study is the in-depth
investigation of an individual, group, or institution (Gay, 1992). According to Gay
(1992) the purpose of a case study is to determine the why, not just the what, of the
behavior or status of the subject of the study. Gall, Borg and Gall (1996) point out
that one of the goals of case studies is to develop an understanding of a
phenomenon from the points of view of its participants. In research there are two
kinds of viewpoints: the emic and the etic perspectives. The “emic” perspective is
the point of view of the participant (sometimes called the “insider’s” perspective).
The "emic" perspective is gained through direct observation and informal
conversations or interviews. The "etic" perspective is the point of view of the
researcher, or outsider’s point of view. The "etic" perspective helps to make
conceptual and theoretical sense of the case, and to report the findings in such a
manner that the contribution to the literature is clear (Gall, Borg & Gall, 1996).
Gay (1992) points out that qualitative research is also known as
ethnographic research. According to Gay (1992), ethnographic research involves
the collection of data on many variables in a naturalistic setting. The “naturalistic
setting” refers to variables being studied where they naturally occur. Ericson,
Florio and Buschman (1980) explained that qualitative research methods are best at
yielding answers to five questions:
1. What is happening in this setting?
2. What do these happenings mean to the participants in the setting?
3. What do these participants need to be able to do in order to know how to
function in this setting?
4. How does what is happening in this setting relate to what is happening in
the wider social context of this setting?
5. What are the differences between what is happening in this setting and that
found in other settings?
Sample and Population
Selecting the Sample
The selection method for narrowing the subjects was purposive. The reason
for selecting a purposive strategy was to develop a deeper understanding of the
phenomenon under study—the use of data in the improvement of student
achievement (Gall, Borg & Gall, 1996). Gall, Borg and Gall (1996) also point out
that the goal in purposeful sampling is “to select cases that are likely to be
‘information rich’ with respect to the purposes of the study.” The only schools
considered for use in this study were schools that had mixed demographics (i.e.,
a student population with a mix of minority representation as well as majority
representation). The socioeconomic status (SES) of the school had to be
mid-ranged; the school could not be exclusively high-SES or exclusively
low-SES. The study sought schools with the above demographic makeup and
substantive gains on the SAT-9, district assessments, and other student
performance data. The reason for choosing the above characteristics for this
purposive study was so that the sample would be reflective of populations in most
schools in California, and through this study the researcher might achieve an in-
depth understanding of the selected sample.
The researcher had no personal knowledge of the school or participants
chosen for this study. Thus, concerns about subjectivity due to previous knowledge
of the sample affecting the objectivity of the data collection would be minimized.
In choosing the subject, the researcher looked for a district that would have
schools with the characteristics needed for the study. Upon locating the district, the
researcher looked up the API data on the schools in the district. Once schools of
promise were located, the researcher spoke with the Assistant Superintendent for
Instruction to explain the purpose of the study, verify the choice for the study, and
gain permission to conduct research in the district. The district is a relatively small
school district in the Inland Empire region of Southern California. There are six
elementary schools, one middle school and one high school. The district has a total
student population of 16,598. The school that was studied has a student population
of about 700 students. The school has a demographic makeup of 74% Latino, 20%
Anglo, 4% African American, and 3% other. The teaching staff consists of 27
regular education teachers and two special education teachers, a program manager
(in other districts this person is known as compensatory education resource
teacher), and a literacy specialist. Instructional support is provided by five special
education aides, one computer lab aide, four instructional aides, and four
bilingual aides.
Figure 1. Characteristics of the Sample (see attached copy)

Subjects                        Characteristics
Assistant Superintendent        White male, Administrative Credential, Ed.D.,
                                4 years in current position
Principal                       White female, Administrative Credential, 1 year
                                as a principal and 1 year as principal at the
                                current school
Reading Specialist              White female, 15 years as a teacher, 6 years as
                                specialist at this school
Program Manager                 White female, 15 years as a teacher, 10 years as
                                a program manager at this school
33 Teachers (includes 2         31 females, 2 males; 25 Whites, 6 Latinos,
special education teachers)     1 African American, 1 Filipino; 6 with BAs,
                                13 with MAs + 30 hrs., 1 with a doctorate;
                                3 first-year, 3 second-year, 3 with three to
                                five years, 24 with six or more years
The school had an API score for the 2001 school year of 561, which
represents an increase of 53 points over the 2000 score of 508. The base year
(1999) API score was 520. Overall API growth from base year to 2001 was 41
points. Thus, there is an indication that improvement had been made in each of the
academic areas tested by the state between the baseline year and the present year.
Table 1 illustrates the API performances over the baseline year.
Table 1. API Performances

API Year        API Score
1999            520
2000            508
2001            561
Instrumentation
Frameworks for Instrument Design
The instruments were designed collaboratively by a group of 15 doctoral
students investigating the same topic. The group consisted of individuals who
were, at the time of the instrumentation design and framework development,
classroom teachers, assistant principals, principals, an assistant superintendent,
an educational consultant, and full-time graduate students. The group met during the
summer of 2001 with their leader, Dr. David Marsh, to plan the study and discuss
conceptual frameworks to be used in the study.
The research group analyzed current and emerging practices of the state
accountability system in developing conceptual framework A (see Appendix 1),
which addresses research question #1, “What is the district design for using data
regarding student performance, and how is that design linked to the current and
emerging state context for assessing student performance?” Conceptual framework
A considers the overview of the elements of the district design of data use to
improve student performance, district decisions and rulings that support use of
district design, and the intended results of design plans to improve student
performance at the district, school, and classroom levels.
Conceptual framework B (Appendix 2) was developed to help answer
research question #2, “To what extent has the district design actually been
implemented at the district, school, and individual teacher level?” The framework
considers three elements: (1) the degree of design implementation in both the
current and emerging contexts, (2) accountability for data use at the district,
school, and individual levels, and (3) the improvement of student achievement
through implementation of data use.
Research question #3, to what extent is the district design a good one,
would be answered through the analysis of district administrator, principal, and
teacher interviews, as well as through other post data analysis. The post-data
analysis includes mapping of the data flow, artifact analysis, and researcher rater
form analysis.
Interviews were conducted using interview questions developed by the
researchers. There were formal interviews and situated interviews. The formal
interview questions (Appendix 3) were linked to the conceptual frameworks to help
answer the main research questions. There was a set of questions developed to
interview the assistant superintendent for curriculum. These questions were
developed to help the researcher get a sense of the district's perspective of and
design for data use. The next set of questions was developed to interview the
principal. These questions sought to discover how the school implemented the
district design for data use, and to understand the schema for data use developed for
the school. The final set of questions were interview questions for teachers. These
questions were developed to find out the teachers perspective on data use in the
school, how effectively the teachers were trained to use data, if the teachers viewed
the district as partners in the use of data for the improvement of student
achievement, and if the use of data made a difference in improving student
achievement.
The purpose of the situated interviews (Appendix 4) was to generate stories
and examples about the way data are used (and not used) in the school. These
stories helped to form the vignettes in this study and help illustrate the information
gathered through some of the instruments. Another aspect of the instrumentation
included two teacher questionnaires (Appendix 5). One questionnaire was
designed to determine the teachers’ stages of concern as they began using the
innovation of data analysis and use in their instructional practice. The other
questionnaire was designed to determine the degree of design implementation of
current data practices, the degree of design implementation of emerging state data
practices, the accountability for data use at the district, school, and individual level,
and the improvement of student achievement through implementation of data use.
A questionnaire rating form (Appendix 6) was used to help guide the researcher
through the analysis and rating of teacher responses to the teacher questionnaire.
Figure 2. Displays the Relationship Between the Three Research Questions and
Data Collection

Collection Instruments                               RQ1: Design   RQ2: Implementation   RQ3: Adequacy of Design
Case Study Guide
  - Interviews: District Administrator, Site
    Administrator, 6 Teachers (made up of grade
    level/department leaders and average teachers)
  - Mapping of Data Flow at District and School Site
  - Artifact Analysis/Collection
  - Quantitative Data                                     X                 X                       X
Situated Interviews
  - Forming vignettes with 6 teachers                                       X
Teacher Questionnaire                                                       X
Stages of Concern Questionnaire                                             X
Researcher Rating Form (Post Data Collection)                                                       X
Innovation Configuration (Post Data Collection)                             X
Data Collection
The intellectual framework regarding the collection of data was based on
existing and emerging state practices. The study sought to discover how the district
design considered the existing practices of using the SAT-9 and API to determine
student achievement and school performance in its use of data. The study also
sought to discover whether the district considered emerging practices such as the
California Standards Test (CST) in calculating the school API, the impact of the
school’s/district’s use of data in determining student performance on the High
School Exit Exam (HSEE), and the need for districts to develop interim
assessments to determine the schools’ progress toward meeting state requirements.
Therefore, the interview questions developed by the researcher, based on the
conceptual framework developed by the research team, were geared toward finding out
whether the district, in its use of data, was indeed considering these existing and
emerging state practices.
The second question asked about implementation of the district plan at the
school and classroom levels. The teacher questionnaire and stages of concern
questionnaire were developed with the Hall and Hord (1987) concern based
adoption model and research literature findings in mind. The Hall and Hord model
looks at the stages of concern that individuals go through as they implement an
innovation of change within their organization. The feelings that individuals
express have an impact on the level of implementation of a given innovation. By
including this element in the framework of the instrumentation development, the
researcher gained a basis for understanding the extent to which the innovation may or
may not be implemented in the organization.
The leader of the research team, Dr. David Marsh, engaged the team in
extensive training in the use of the instruments developed by the team. The team
met for six hours on a Saturday in October. The training consisted of 1) a review of
the team data collection efforts (overview of the data collection-timeframe,
instruments, site selection, criteria review, obtaining permission); 2) conceptual
frameworks and instruments (overview of the list of frameworks and instruments, walk
through the conceptual frameworks, and walk through the instruments); 3) instrument
strategies (lecturette: interviewing, exercise: good and bad field notes, exercise:
practice interviewing/debrief); 4) from notes to Chapter 4 (steps and strategies,
managing the process and getting it done, and indicators of a good Chapter 4); and
5) next steps. Once the data had been collected, the team met again for another six-
hour session in January to debrief about the interviews and discuss, in a preliminary
fashion, the findings. The training helped members of the team to interpret the
findings of the questionnaires and the responses to the interview questions.
The researcher made contact with the Assistant Superintendent for
curriculum and obtained his permission to conduct the study in the district, and
with his help, determined the school for study. The researcher interviewed the
Assistant Superintendent, the administrator for research, the school principal, and
six teachers. The researcher did his best to put each individual at ease so that
he or she could give his or her best response to the questions. All participants,
especially teachers, were
assured of their anonymity. Teachers were allowed to express themselves as much
as they wished while striking a balance in time management. The researcher
visited several classrooms to obtain a “snap-shot” view of the instructional
environments. There was a need to collect observational evidence as to whether or
not an environment of data use, high expectations for all students, and standards
based instruction truly existed, as these elements were highly featured in the
interview questions and teacher questionnaire.
The administration of the various instruments went well for the most part.
The researcher was successful in interviewing the individuals critical to the study
(i.e., the district level administrators, the school principal, two teacher leaders, and
four classroom teachers). The questionnaires (stages of concern and teacher
questionnaire) were distributed to all faculty members. Unfortunately, there was
not 100% participation by the teachers in this area. Some teachers indicated that the
questionnaires were too intrusive; many others chose not to participate without
giving any explanation. Fortunately, there was sufficient participation to get a good
reading of how the teachers on staff felt about the issues of data use
implementation and teacher expectations.
Notes were taken by the researcher using a notepad, a laptop computer, and
a portable tape recorder. At the beginning of the interviewing process, the
researcher used a regular notepad. Because the interviews were extensive, the
researcher decided that it would be better to use a laptop computer and a tape
recorder. By using a tape recorder, the researcher could ensure the fidelity of the
interview; comments and quotes could be reported accurately by listening back to
the tape. The use of the computer allowed the researcher to record the responses
and store them electronically, thereby minimizing the possibility of losing note pages
and incurring gaps in information. After visiting classrooms, the researcher was
able to write impressions on a notepad and then transfer these notes to the laptop
with the balance of the interview notes.
Data Analysis
The analysis was done in answer to the three research questions of the
study. The researcher used the archival data, interview notes, answers to the survey
questionnaires, and impressions from classroom visitations to come to the findings
expressed in Chapter 4.
The first step was to look at and analyze all of the archival data in order to
begin answering question #1. The archival data consisted of SAT-9 data, API data,
the School Accountability Report Card (SARC), and district developed and
purchased interim assessment reports. The researcher looked at two to three years
of SAT-9 and API data to establish that there was growth for the school in
this study and to determine how much of that growth was linked to data use at the
district and school levels. Other archival data were analyzed to substantiate the
indications of student achievement and the school’s use of data. Besides using
archival data, the researcher used transcribed audio data and analyzed the
transcriptions and field notes to determine whether or not there was a plan for using
data introduced by the district and recognized by the school administration and
teachers.
The next step was to analyze the findings from the stages of concern and
teacher questionnaire surveys and conclude what these findings meant. The
analysis and findings were used to answer research question #2. The triangulation
of archival data, interview questions, and survey questionnaires helped validate the
findings for questions #1 and #2.
The final step was to recapitulate the findings of questions #1 and #2 in
order to answer question #3. Question #1 asked what the district plan for using
data was; question #2 asked to what extent the plan had been implemented.
The findings of these two questions were vital to answering question #3: to what
extent the plan was a good one. The researcher reviewed the findings and
offered an evaluation of the district plan based on the results of the first two
questions. Once all findings were determined, the researcher was then able to
summarize the overall findings in a discussion section.
A couple of problems arose during the analysis of the data for this study.
Some of the interim assessment data seemed incomplete and the researcher had to
make calls to the site principal and the administrator for research to receive
answers to the questions and to have a more complete copy of the documents
forwarded by the district. Another difficulty stemmed from a misunderstanding
on the researcher’s part about the use of one of the reports provided
by the school. It was believed that the report in question was used across grade
levels, but multiple copies of one grade level were given to the researcher. After
further discussion with the principal, it was found that the test was given only to
second grade students and that multiple copies were given to the researcher by
mistake. The researcher was then able to accurately analyze the report in concert
with the balance of the achievement data that had been collected.
Summary
This chapter discussed the research methodology used for the study. This
included a description of the research design, the sample, the instruments, the
procedures, and the data analysis. Data collection instruments included conceptual
frameworks for the research questions, formal and situated interview questions, two
teacher questionnaires, and researcher rater forms. Procedures included an
interviews of a district administrator and the school principal, a presentation to the
teaching staff of the subject school, interviews of selected teachers, dissemination
and collection of surveys, and audio recording and note taking during interviews. Chapter 4
will present the findings and results of the research, which address the purposes of
this study.
CHAPTER 4
ANALYSIS AND INTERPRETATION OF DATA AND FINDINGS
Introduction
The purpose of this chapter is to present the data and findings of the study.
The data consists of standardized test reports and periodic assessments such as the
district benchmark exams, as well as school archival records such as
the annual school report and the schoolwide plan. Surveys and self-reports
gathered from 18 classroom teachers also form part of the data. Additional data is in
the form of written transcriptions of individual interviews with the district assistant
superintendent, the director of assessments, the school principal, two lead
teachers, and four classroom teachers.
The presentation of findings gives a detailed description of (a) the district
design for using data regarding student performance, and how the design is linked
to the current and emerging state context for assessing student performance, (b) the
extent to which the district design actually has been implemented at the district,
school, and individual teacher level, and (c) the extent to which the district plan is a
good one.
Research Question #1: The District Design for Using Data and Its Link to
Emerging State Context for Assessment of Student Performance
Framework for Research Question #1
The first research question was “What is the district design for using data
regarding student performance, and how is the design linked to the current and the
emerging state context for assessing student performance?” The primary sources
used to answer this question were Stanford Achievement Test Ninth Edition (SAT-
9) data compiled by the district on each of its schools, data from district benchmark
exams created by the district for interim assessments, and interviews with district
administrators.
Findings for Research Question #1
District
The district and school (the latter referred to from this point forward as Grand Elementary) are
focusing their efforts on improving student performance on the SAT-9 and their
API ratings as defined by the state. Evidence to support this is found in the
district’s collection of SAT-9 data and its analysis of the results. The district,
through its assessment department, sends out reports to all schools that tell
them how each school has performed on the SAT-9 exam. For example, each
school receives a report that shows the percentage of students scoring above the 50th
percentile in reading by grade level. The report shows the growth progression year
by year, beginning with the baseline year, 1998, and continuing to the most recent year, 2001. The same
report shows the information in a bar graph format. Tables 2a and 2b illustrate the
above report, showing the percentage of students at Grand Elementary School
scoring at the 50th percentile or higher on the SAT-9 in reading and math.
Table 2a. The Percentage of Students Performing At or Above the 50th Percentile
in Reading by Grade Level.
School Year 1998 1999 2000 2001
Grade 2 33 32 21 30
Grade 3 14 15 26 20
Grade 4 16 27 16 25
Grade 5 21 16 20 24
Grade 6 28 33 25 33
Table 2b. The Percentage of Students Performing At or Above the 50th Percentile
in Math by Grade Level on the SAT-9 between 1998 and 2001.
School Year 1998 1999 2000 2001
Grade 2 44 34 16 36
Grade 3 21 26 23 26
Grade 4 17 24 21 32
Grade 5 30 25 30 37
Grade 6 39 64 47 61
Between 1998 and 2000, one can see an inconsistent pattern in student
achievement, which can possibly be attributed to a lack of focus on achievement
patterns and instructional practice. Interviews with the principal and teachers
indicate that the school began its practice of data analysis during the 1999-2000
school year. In the year that the school began using data, each grade level posted
increases in the percentage of students scoring over the 50th percentile in math,
whereas in the previous two years some grade levels made gains and some suffered
losses. The same case can be stated for reading, except for third grade during 2001
where there was a 6-point loss. With the exception of second grade, all other
grades tested on the SAT-9 have posted higher scores in 2001 than they did in the
baseline year of 1998.
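To make this comparison concrete, the following short script is a minimal sketch, added here for illustration only and not an instrument or analysis from the study itself. It takes the math figures reported in Table 2b and computes each grade level's change in the percentage of students at or above the 50th percentile from one test year to the next.

# Illustrative sketch only: year-to-year changes in the percentage of
# students at or above the 50th percentile in math, from Table 2b.
years = [1998, 1999, 2000, 2001]
math_pct = {  # grade level -> percentages by year
    2: [44, 34, 16, 36],
    3: [21, 26, 23, 26],
    4: [17, 24, 21, 32],
    5: [30, 25, 30, 37],
    6: [39, 64, 47, 61],
}

for grade, scores in math_pct.items():
    changes = [later - earlier for earlier, later in zip(scores, scores[1:])]
    print(f"Grade {grade}: changes by year {dict(zip(years[1:], changes))}")

# Every grade level shows a positive change in 2001, consistent with the
# pattern described above; in 1999 and 2000 the changes are mixed.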
It is interesting to note that the percentage scores tended to be higher in
math than in reading. This could be attributed to a couple of things. One, that the
area of emphasis during the year was math (although there is nothing in the data to
suggest that this was the case) and that this area of focus brought about a higher
achievement outcome; or two, that it is easier to show achievement
growth in math because reading is a much more complex subject for students to learn and
for teachers to teach (McNeil, 1996).
Student achievement growth can also be demonstrated by looking at the
content cluster report from the SAT-9. This report shows the achievement in
columns giving the percentage of students who are below average, average, and
above average. Although Grand Elementary School does not have a high
percentage of its students in the above average range at any grade level, the data
does support the fact that greater numbers of students are moving toward above
average performance. Tables 3a, 3b, and 3c show the percentage of students
scoring in the below average range, average range, and above average range
between the 1999-2000 and 2000-2001 school years.
Table 3a. Percentage of Students Scoring in the Below Average Range in Reading,
Mathematics, Language Arts, and Spelling between 2000 and 2001

                             Grade 2  Grade 3  Grade 4  Grade 5  Grade 6  Average
School Year                  00  01   00  01   00  01   00  01   00  01   Improvement
Reading Vocabulary           51  34   43  52   51  42   45  41   38  26   7
Reading Comprehension        47  35   50  43   47  31   51  41   41  29   9
Mathematics Problem Solving  60  35   43  38   42  31   42  28   26  18   12.6
Mathematics Procedures       41  26   35  30   34  33   26  26   13  15   6.4
Language                     57  32   46  41   38  26   47  28   31  22   14
Spelling                     51  31   35  33   54  39   49  32   45  38   11.4
Table 3b. Percentage of Students Scoring in the Average Range in Reading,
Mathematics, Language Arts, and Spelling

                             Grade 2  Grade 3  Grade 4  Grade 5  Grade 6  Average
School Year                  00  01   00  01   00  01   00  01   00  01   Improvement
Reading Vocabulary           42  56   50  43   41  52   47  51   57  62   3.8
Reading Comprehension        47  57   45  50   46  64   40  57   54  55   10.2
Mathematics Problem Solving  34  54   48  51   52  56   49  59   59  60   7.6
Mathematics Procedures       47  63   58  63   58  61   63  58   66  54   2.2
Language                     31  51   47  51   54  65   40  51   64  67   9.8
Spelling                     40  55   58  60   39  54   40  55   50  51   9.6
Table 3c. Percentage of Students Scoring in the Above Average Range in Reading,
Mathematics, Language Arts, and Spelling on the SAT-9 between 2000 and 2001

                             Grade 2  Grade 3  Grade 4  Grade 5  Grade 6  Total %
School Year                  00  01   00  01   00  01   00  01   00  01   Improvement
Reading Vocabulary           7   10   8   5    8   7    8   7    5   12   5
Reading Comprehension        7   8    5   7    7   5    9   2    5   15   4
Mathematics Problem Solving  6   11   9   11   6   13   9   13   15  23   5
Mathematics Procedures       13  11   7   7    8   6    12  17   21  31   2.2
Language                     13  17   7   8    8   10   14  21   5   11   4
Spelling                     9   14   7   7    8   8    10  13   5   11   2.8
The data indicates that the school has been successful in moving more
students out of the below average range and into the average and above average
ranges. In some areas (such as reading comprehension on Table 3b and math
procedures on Table 3c) there have been large improvement gains, which indicates
that in the year the school began systematically using data, it was able to reap
positive benefits.
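One plausible reading of the Average Improvement column in Table 3a is that it is the mean of each grade level's drop (the 2000 percentage minus the 2001 percentage) in the below average range. The sketch below, included only as an illustration and resting on that assumption, reproduces the reported figures for two rows; other rows may reflect rounding or a slightly different calculation.

# Illustrative sketch only: one possible derivation of the "Average
# Improvement" column in Table 3a, assuming it is the mean of the
# per-grade drops in the below average range from 2000 to 2001.
below_average = {  # subtest -> (2000, 2001) pairs for grades 2 through 6
    "Mathematics Problem Solving": [(60, 35), (43, 38), (42, 31), (42, 28), (26, 18)],
    "Language": [(57, 32), (46, 41), (38, 26), (47, 28), (31, 22)],
}

for subtest, pairs in below_average.items():
    drops = [y2000 - y2001 for y2000, y2001 in pairs]
    print(f"{subtest}: per-grade drops {drops}, mean {sum(drops) / len(drops)}")

# Prints a mean of 12.6 for Mathematics Problem Solving and 14.0 for
# Language, matching the values reported in the table.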
API data is available over the Internet for anyone who desires information
on any given school in the state. In addition, the district provides each school with
a report entitled API Findings by District School. This report provides each school
with an estimate of their API score prior to the official report from the state. Each
school receives a copy of this report and sees its estimated API standing relative
to other schools in the district. Each school has its API scores printed in its
School Accountability Report Card (SARC). The SARC provides brief
information to the public on all aspects of the school operations. Examples of
headings found in the SARC are the District Mission Statement, School Staff,
School Description, School Facilities, Safety, Curriculum, Instructional Program
and Leadership, Academic Performance Index Growth Report, and Student
Achievement.
The district and schools are using the California Content Standards to
improve student achievement. According to the Assistant Superintendent for
Instructional Support Services, the district uses the state standards exclusively. He
goes on to say “results over the last two to three years has been great, specifically
the last two years.” The Administrator for Assessment adds, “Drastic
improvements have occurred in the last two and a half to three years. We’re using
data to make decisions. There’s been excellent growth, especially at the elementary
school level at almost every one of our schools.” All elementary schools in the
district, with the exception of one, met their API growth target on the 2000-2001
API report. Indeed, several of the schools posted strong gains. For example, three
schools had growth of 50+ points, one school had growth of 64 points, and one
other school had growth of 86 points. Although the API has been based
exclusively on the SAT-9, the district credits its API growth to their focus on state
standards.
To help the district move students toward mastery of the state standards and
give teachers a tool for informing their instruction, the district has developed
benchmark tests for the California Standards. The benchmark test is a criterion-
referenced instrument that tests student progress toward mastery of the California
state standards. These tests are given four times per year and the results are shared
with each school. Teachers receive performance data for their individual
classrooms and the school receives reports showing outcomes for each classroom at
each grade level, aggregated grade level results, and how each school performed
relative to its district counterparts. Because the emerging practice of the state will
be to place greater weight on the California Standards Test (CST) in calculating
API scores, the district’s practice of benchmarking student progress through the use
of its criterion-referenced test will provide teachers and site administrators with
timely information in their quest for improving student achievement scores and API
outcomes.
The schools are also using the Stanford Diagnostic Reading Test, 4th Edition
to provide them with achievement data on their second grade students. The test
provides diagnostic data in the areas of phonetic analysis, vocabulary, and
comprehension. The report gives the number tested, the mean raw score, national
percentile rank for this particular test, skills analysis summary, and progress
indicators (number and percent at or above the 50th percentile and the number and
percent below the 50th percentile in each of the tested categories).
The district also analyzes student performance on the California writing
assessment. The district provides a school-by-school report entitled 4th Grade
Writing Results by Elementary School. The report charts out the score range from 2
through 8 and gives the number of students that scored in a given range and the
percentage of the grade level that the number represents. For example, at school A,
19 4th grade students received a score of 2, which is the lowest score possible on the
exam. Those 19 students represent 12.5% of the 4th grade students at the school.
Ten students received a score of 3, which represents 6.6% of the 4th grade students
at the school, etc.
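The arithmetic behind these percentages can be sketched briefly. The grade-level total used below (roughly 152 fourth graders at school A) is inferred from the reported counts and percentages; it is an assumption made for illustration and is not a figure stated by the district.

# Illustrative sketch only: how the writing-results counts relate to the
# reported percentages. The enrollment figure of 152 is inferred, not given.
total_fourth_graders = 152  # assumed 4th grade enrollment at school A

score_counts = {2: 19, 3: 10}  # students per score, from the example above

for score, count in score_counts.items():
    percent = 100 * count / total_fourth_graders
    print(f"Score {score}: {count} students = {percent:.1f}% of the grade level")

# Prints 12.5% for a score of 2 and 6.6% for a score of 3, matching the report.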
To help students prepare for the writing exam and to help them meet state
standards, the district has developed writing prompts tied to the state standards.
The district is working to perfect the prompts so that they can have a reliable
instrument for improving student achievement on the state exam. According to the
administrator for assessment, “we’re in different stages of pilot testing our writing
prompts.”
The district has linked the use of technology to their preparation of students
for current and emerging exams. The district has decided to use a number of
software programs to help students improve reading and math skills. In interviews
with the Assistant Superintendent for Instructional Services and the Administrator
for Assessment Services, the researcher was told of specific programs that have
been used to help students improve reading comprehension skills, decoding skills,
math computation and application skills. Programs represented across the district
for use at the elementary school level include Statistical Package for Social
Sciences (SPSS), Computer Curriculum Company (CCC), Accelerated Reader and
Accelerated Math, Waterford Reading, River Deep Math, and Edge Mart sight
word program. These programs are tutorials that give students practice at a
particular skill and then provide the teacher with diagnostic feedback so that the
teacher knows how much progress the student has made toward skill mastery and
where the student is in the program lesson progression.
The district is using a program called Nova Net at the secondary level.
According to the assistant superintendent, “Nova Net is a 7-12 curriculum that lines
up directly with the state standards. It allows kids to get credits or advance in
credits. This program is particularly useful for kids that just learn a little bit
differently and need a form of differentiated instruction.” The district is also
linking other computer technology to student achievement improvement strategies.
The administrator for assessment says that:
We have our information system, which we’re updating, to be
able to take High School Exit Exam Data and link it to
classrooms and use the scale scores for the High School Exit
Exam to identify areas of improvement for kids in the math
and English/language arts area. We’re developing some
access programs and technology software to better take those
data and put them in formats in timely ways for people to use
the data.
According to the district, they are constantly evaluating the usefulness of
the intervention software programs that they have in place in their schools. The
assistant superintendent says “we don’t buy anything for the district without it
being evaluated. If we test a program that is not showing us the kind of results that
we want, we will no longer use it or buy anymore of it.” In sum, the district has
utilized computer technology as an intervention strategy for students who need
additional resources to help them meet mastery of curriculum standards.
Regarding the board rulings that support the use of a district design, there
are none. The school board of the district is not directly involved in the structure
or design of a district plan for data use. Data use and implementation are a function
of administration. The district office, under the umbrella of the department of
Instructional Services, collects various data types, analyzes the data, and sends
findings out to the schools. There might be an analysis of exams that have been
given to all schools, such as SAT-9, or there might be studies initiated by the
district. For example, the district administrator for assessment may determine a
need to conduct a study in order to get more information on the progress of students
than he is able to get from other data sources. He says,
As evaluator for the district, I often deal with data collection
that’s beyond the SAT-9. I may have to use a modified Terra
Nova test or a test that’s geared to assessing a specific program.
We do what we can to make sure that we communicate the
results to both experimental and control groups to give them the
opportunity to make necessary modifications to their programs.
We are constantly evaluating everything using data to make
necessary decisions.
The district has not set a series of goals for students to reach with respect to
SAT-9 achievement. The expectation they do have is that, at a minimum, schools improve their
API numbers in the manner prescribed by the state. According to the
district strategic plan, all students are expected to achieve or exceed grade level
standards as defined by the district adopted grade level curriculum. The assistant
superintendent says “API is one of our goals. Of course we would all like for our
students to be above the 50th percentile.” There is nothing found in the data
archives that gives an expectation of a measurable rate of progress for student
growth in the district as a whole or for each school individually. Thus, the district
is collecting, analyzing, and distributing data to the schools and making future
decisions based on the results of their assessments.
Though the district has not established a set of achievement goals for the
schools to reach, they do hold principals accountable for the achievement numbers
at their respective schools. When asked the question “What evidence is there that
points to the fact that the district expects schools to use data to improve student
achievement?” the assistant superintendent answered,
Our principals’ evaluations are based on it. We don’t put a
number on improvement because we understand that students
perform differently from year to year, especially at our
intermediate school...when they come in for evaluation we ask
them some pretty poignant things and it’s all data driven. I think
a principal a couple of years ago would have gotten a different
evaluation.
The district strategic plan says that instruction and achievement are based
on state standards. “Priorities of the district are to align the district standards,
instruction, materials and assessment with State standards, assess student progress
in achieving standards on an ongoing basis using multiple measures...”
The district has committed financial resources to ongoing data use. When
asked, “how has the district financially supported the district’s design for the use of
data?” the assistant superintendent replied, “we put our money where our mouth is
by hiring a program evaluator, I don’t think most districts have a program
evaluator. We’ve made that a priority.” He goes on to say that they have
committed to using data and purchasing programs that they feel make a difference
with the students in the schools. The administrator for assessment adds that the
district is committed to creating a position called an evaluation technology
specialist and hiring someone to fill the position. The duties of this individual
would be to merge data from the different software systems to provide more
succinct information for district administrators and school building personnel.
Staff development in the area of data usage is provided by district level
personnel or building administrators, and teachers are encouraged to go to training
sessions provided by such speakers as Doug Reeves or Mike Schmoker, which are
paid for by categorical funds or other funding from school building budgets. There
is no direct evidence that indicates that the district is allocating funds for
professional development in the area of data analysis. Therefore, it would seem
that based on the allocation of funding, the emphasis in the district is on data
collection and dissemination to school level administrators and teachers. As will be
seen later in interviews and questionnaires, teachers do not feel that they are
receiving enough training to feel comfortable with data analysis. Many do not feel
that the district is providing enough help in this area.
School
The school is actively using data to analyze student outcomes and improve
instruction and student achievement. According to the principal, “the commitment
to using data is high.” The school has established a time once a week where
teachers can collaborate and discuss instruction, student progress, and assessment.
On the weekly collaboration days, one day each month is set aside solely for
analysis of data. Teachers are given testing data (state testing, district testing, and
classroom testing) and a set of guiding questions to assist in the analysis. The
teachers work in grade level teams during the analysis and are guided through the
process by the principal, assistant principal, and program specialist.
The school has articulated its procedures for data analysis in its strategic
plan. The plan indicates that teachers utilize a variety of assessments to determine
students’ success. The plan reads “Analysis of the data from these assessments
provides teachers, students, and parents information needed to drive instruction.
The data also informs individual and group instruction needed to remediate and/or
address areas of weakness.”
To get a greater sense of how the school is using data, the researcher asked
the principal the following question: “What is the school doing to use data (directly
or indirectly) to promote student learning?” The principal answered in the
following manner: “Directly, the school site, grade level, and classroom data has
been analyzed. Most specifically it has helped grade levels look at instructional
practices and helped them to formulate and create instructional practices that are
lacking. Indirectly, the school shares [information-data] to see how the school is
moving toward both the big picture and the day to day.”
The school has no plans to change or improve the way data are
gathered. In terms of analysis, the process goes beyond teacher analysis of the
data. According to the principal, the program specialist and the principal meet and
confer after every Tuesday meeting. They meet to ensure that they are getting what they
need out of the data. In terms of use, the school uses the data to develop an action
plan based on data analysis. This is an ongoing process that the school feels it
needs to reflect on, use, and then follow through on.
There have been efforts to improve the administrators' and teachers’ ability
to use data to increase student academic achievement. The researcher asked the
principal to explain how these improvement efforts have taken place. The principal
replied,
The administrators are sent to conferences to improve
understanding of how to use and analyze data. The
leadership team goes to training as well. The leadership
team has been going to the California School Leadership
Academy (CSLA). Each of the team members are grade
level leaders, and their training allows them to help grade
level members better understand data analysis. After each
Tuesday meeting there is an evaluation form filled out by
the teachers that the principal and program manager review
and analyze to determine whether staff has understood the
task or project regarding data analysis.
The school has raised student and teacher expectations for increased
achievement through a series of conversations on the topic. The teachers were
challenged to raise the expectations for achievement that they had for the children
at the school. They were challenged to find new and creative ways to reach their
students and ensure that they received the practice that they need. For example, the
principal said to the teaching staff “if children are not doing homework because
they do not have a place to do homework or because they have no one to assist
them, then why are you assigning homework? Find new and creative ways to give
the children the practice that they need, think outside the box.”
The school continued its push toward improved expectations by accepting
no excuses for student failure. Teachers are expected to hold every student to high
expectations for student performance. The administration believes that the school
must be prepared to accept and handle every student that comes through the school
doors. The principal tells teachers “we cannot use the students as excuses for lack
of success.” The school also holds parents to the same expectations. They provide
the parents with a bi-monthly newsletter as a vehicle for raising parental awareness
of state and district standards, providing information on how they can help their
children at home, and informing them about the availability of parenting classes and
community references.
Based on all of the vehicles that have been put into place to promote data
analysis in the school, it is quite obvious that the administration wholeheartedly
supports school-wide implementation of standards-based curriculum. In
order to underscore the point, the researcher asked the principal to comment on the
question: Would your staff agree that you have led them in school-wide
implementation of standards-based curriculum to improve student achievement?
What evidence can you cite to support your position? The principal’s response was
as follows: “We address standards in every Tuesday meeting. Teachers break
down where standards are in the curriculum, where standards are in the instruction,
and we allow time for teacher collaboration and implementation for all of the
above.”
As further evidence of the school administration’s support of standards-
based curriculum, the school’s strategic plan mentions the use of standards quite
prominently. At the beginning of the strategic plan under student outcome the
document says “all students will achieve or exceed grade level standards as defined
by the district and adopted grade level curriculum.” The first four priorities
mentioned in the strategic plan relate to standards and instruction. The first four
priorities are as follows: 1. Define district curriculum standards. 2. Communicate
the standards to students, parents, staff, and community. 3. Align the district
standards, instruction, materials, and assessment with State Standards. 4. Provide
quality instruction for students using multiple strategies effectively.
The strategic plan continues its discussion of standards in the section
entitled Curriculum Support Area. The document reads:
California’s SEE has adopted high academic standards in
Reading/Language Arts, Mathematics, and Science. The
standards serve as the framework for directing district goals,
objectives, and expected learning outcomes into an
articulated curricular program designed to maximize learning
for all students. The standards enable the articulation of
curriculum and learning expectations from grade level to
grade level. The school staff are implementing a standards
based curriculum and through II/USP funding will develop
assessments that are aligned to the standards along with
reporting strategies that reflect student success in relation to
the standards and associated exit criteria. The teachers
develop a variety of their own assessments that they utilize
in determining each student’s mastery of the standards.
The school has worked to evaluate the effectiveness of the instructional
program. As an example of how the school has gone about the task of evaluating
the instructional program, the principal discussed their process in evaluating the
spelling program.
An instructional focus on spelling was done. It began with
SAT-9 results. The curriculum was analyzed to see if it
could support the improvement. The teachers decided after
analysis of the existing program to create their own (meaning
their own school-wide spelling program). Instructional
programs are also analyzed through SAT-9 data, benchmark
results, API subgroup goals and other STAR information.
This process is done for each of the subject areas. The
school analyzes math, reading, and language arts results as
well as ELL (English Language Learner) progress.
As schools seek to change instructional practices or implement new
strategies, they often run into roadblocks or challenges along the way. The
researcher asked the principal what roadblocks the school faced as they began
implementing a culture of data use. There were several issues that the staff had to
work through in order for them to get to the place where they could become data
driven. The principal replied that the challenges involved
getting beyond the no excuses attitude. Trying to embed
the high standards without alienating the staff. Helping
teachers deal with the feeling of being overwhelmed.
Supporting the teachers over time. At this school, it was
not really a problem because there had been no real focus
in the past. It was a matter of bringing focus to the school.
There is much empirical evidence to support the use of data to inform
instructional practices and improve student achievement. Standards based
education goes hand-in-hand with data use because the standard is the basis upon
which the instruction is founded. The data (information) from standards-based
instruction are typically tests from lessons taught, actual work produced by the
students, criterion referenced tests that show how much of the curriculum the
students have mastered, and achievement tests which are indicators of how much
knowledge students have in a given academic subject area. These standard-based
data are used to improve instructional practices. Hampton (1997) says that a
successful standards-based classroom will have clearly defined standards that lead
to curriculum mastery. She goes on to say that the criteria for mastery are based on
the students’ achievement on various indicators such as student work, course
outcomes, core assignments, and student assessments. Instructional tools, lessons,
instructional methodology, and professional development must all be tied to
standards.
The researcher asked the principal whether the school and district used
empirical evidence to support their efforts in using data to improve student
achievement. The principal answered:
No. The only thing that the district has used is the book
Results by Schmoker. They have alluded to the work of
Doug Reeves. However, the school does share articles from
refereed journals. For example, I’m sharing an article from
Leadership Magazine on how high poverty schools improve
achievement.
Thus, the school is implementing a standards based and data driven system
absent the research base to support it. Such omission could make it difficult for all
members of the organization to understand the process for standards based
instruction or what direction the district expects to go with regard to the use of data.
Several teachers interviewed indicated that they truly did not fully understand how
to analyze and use the data. This is so because the meaning of the numbers
provided in the data is difficult to understand. Additionally, they did not really
understand the district’s role in data use.
Since there was not an extensive use of empirical evidence to establish the
use of data and standards based instruction in the school, it was helpful to find out
what mechanisms the school used to support the teachers in data analysis. The
principal responded to the inquiry by saying “we model, provide worksheets, and
provide action plans. There are mini-trainings at monthly meetings to help
teachers use data. I think the worksheets are the most helpful because they provide
teachers with a guideline for doing things.” Indeed, data artifacts indicate that
teachers are guided through the analytical process in a concrete fashion. The
worksheets tell teachers what they should be looking for in the data and what
questions to ask as they encounter the data. For example, an analysis of a
parent/student/staff survey tells teachers to compare last year’s survey to this
year’s, and then asks, “what are the findings and what does the data tell us?”
In order to develop an understanding as to how the school uses data, it is
helpful to understand the flow of data to the school and what type of data is
analyzed. According to the principal, the school receives data from the district
office and from school generated materials. Much of the data that the school
receives comes from the district office. The district sends results from STAR
testing in various forms. For example, one report shows the school’s performance
on the SAT-9 by quintile, while another report provides a comparison of the
percentage of students at or above the 50th percentile on the SAT-9 between 1999
and 2000 and the amount of change. The types of data provided by the district and
school for analysis are: SAT-9 results, district benchmarks, ELD levels,
accelerating literacy profiles, reading reports (a Stanford Reading Diagnostic),
SAT-9 writing report, CCC diagnostic and reports from tutorial programs for
reading improvement.
In sum, the school has demonstrated that it does have a commitment to
standards-based instruction and data analysis. The school demonstrates that
commitment by establishing an environment where the organization is expected to
collect and analyze data, and use the findings to help improve the instructional
quality of the teaching staff. The district and school provide the teaching staff with
a variety of data sources and guide them through the analytical process. However,
the culture of data use was not built on a foundation of empirical evidence.
Therefore, the teaching staff has no basis for understanding the concepts and
constructs of standards based instruction, or the methodology for data analysis of
standards-based outcomes.
Nevertheless, the school ensures that the process moves forward by
providing time for teachers to engage in the analytical process, providing guidance
and assistance through the process, and providing time for teachers to discuss
instructional practice and plan instructional strategy. In the absence of an initial
platform of empirical evidence, the school is providing source material from the
research literature to help build the necessary constructs for conceptualizing the
standards based model of instruction. Concept development is going to be crucial
for the teachers if they are to scaffold their learning and become more adept at
standards based instruction and data usage.
Classroom
The classroom teachers’ design for using data mirrors that of the school
design for using data. The classroom teachers meet in grade level teams, with the
guidance of the school administration, to collaboratively discuss data that are
distributed to them by the district office, the results of surveys administered by the
school, or collections of diagnostics from their intervention software programs.
Because teachers have voiced the fact that they have not been fully trained
in data analysis, it is difficult to find evidentiary data to support classroom-level
designs for data use. For example, when asked the question “how has the school
trained teachers to analyze data to inform their instruction” one teacher responded
“we have not been trained to use data” while another alluded to being sent to
“training with Doug Reeves and CSLA training.” While these seminar training
sessions are attempts at providing exposure to data analysis, they are not
highly focused training for the staff on analyzing the data provided to them about the
children that they teach.
The fact that the classroom teacher participates in the school design for data
use and that there seems to be limited staff development for teacher use of data
leads one to conclude that there is little evidence of an explicit or implicit plan for a
classroom level design for using data to improve student achievement. The
classroom teachers’ use of data is guided by the administration’s collaborative
design whereby the teachers in grade level teams are led in discussions about what
the data outcomes mean and what the instructional implications are for the results
found in the analysis. For example, the quote from the section above “what are the
findings and what does the data tell us” serves to illustrate how the discussions are
linked to outcomes and instructional implications.
Research Question #2: The Extent to Which the District Design has Actually
Been Implemented
Framework for Research Question #2
The second question for this study was, “to
what extent has the district design actually been implemented at the district, school,
and individual teacher level?” To answer this question, the researcher
used transcriptions from interviews with the principal and teachers. The researcher
also used a teacher questionnaire which helped the researcher to develop a
“picture” of how the implementation of data use took place in the school.
Findings for Question #2
As stated earlier, there is no evidence that there is a
systemic plan in the district for the use of data. However, the school has begun to
create its own system for using data, having set aside time for discussing and
analyzing data on a regular basis. Although the school has begun the development
of a system for collecting and analyzing data, the important question is “have they
successfully implemented the findings from their analysis into their instructional
practices?” Table 4 displays the results of the Teacher Questionnaire, which rates
the teachers’ beliefs about the degree of implementation of data use in classroom
practice.
Table 4. Teacher Questionnaire Results

Degree of design implementation of current data practices
Don’t Know  Disagree Strongly  Disagree Somewhat  Agree Somewhat  Agree Strongly
24          27                 10                 106             86

Degree of design implementation of emerging state data practices
Don’t Know  Disagree Strongly  Disagree Somewhat  Agree Somewhat  Agree Strongly
14          37                 5                  29              29

Accountability for data use at district and individual level
Don’t Know  Disagree Strongly  Disagree Somewhat  Agree Somewhat  Agree Strongly
23          6                  9                  22              30

Improving student achievement through implementation of data use
Don’t Know  Disagree Strongly  Disagree Somewhat  Agree Somewhat  Agree Strongly
15          8                  6                  33              15
The Teacher Questionnaire was administered on a Likert scale from 0-4 where 0
represented a response of don’t know and 4 represented a response of agree strongly.
The responses for each respondent were tallied and added together to arrive at a final
score. For example, in the first category (design implementation) the count of
106 means that there were a total of 106 agree somewhat responses to the
questions in that category. By contrast, in the third category (accountability) there
were a total of 6 disagree strongly responses. The number of
questions in each category varied. That accounts for the differences in overall totals
from one category to the next. For example, in the first category there were 16
questions while in the third category there were 6 questions.
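To make the tallying procedure concrete, the sketch below counts hypothetical responses by scale point within a category. The respondent data and category groupings are invented for the example, and the mapping of the intermediate scale points (1 through 3) to the labels shown in Table 4 follows the column order of that table and is assumed rather than documented.

# Illustrative sketch only: tallying 0-4 Likert responses into the kind of
# per-category counts shown in Table 4. All data below are hypothetical.
from collections import Counter

SCALE = {0: "Don't Know", 1: "Disagree Strongly", 2: "Disagree Somewhat",
         3: "Agree Somewhat", 4: "Agree Strongly"}

# Each hypothetical respondent maps a category to that teacher's answers.
responses = [
    {"current_practices": [3, 4, 3, 2], "accountability": [4, 3]},
    {"current_practices": [4, 3, 0, 3], "accountability": [1, 3]},
]

def tally(responses, category):
    """Count how many answers fell on each point of the scale for one category."""
    counts = Counter()
    for teacher in responses:
        counts.update(teacher.get(category, []))
    return {label: counts.get(value, 0) for value, label in SCALE.items()}

print(tally(responses, "current_practices"))
print(tally(responses, "accountability"))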
Most teachers believe that there is a high degree of implementation of
current data practices. This response is supported by the fact that faculty look at
data on a weekly basis. The district provides the school with data related to current
state data practices. The questions related to this category ask about the teacher’s
use of data to monitor student practice, improve student outcomes, and compare
students in class and across grade levels. In other words, the questions ask whether the
teacher is using the data provided to guide instructional practice. The results of the survey indicate that
teachers perceive that they do. However, there is a noticeable response to the
strongly disagree and don’t know areas. This would seem to indicate that although
teachers believe that there is a high degree of implementation of data usage, they
are not quite comfortable with their ability to manipulate the data for their own
purposes and needs, as validated by the results of the Stages of Concern survey.
Teachers’ Perceptions and Commitment to Data Use
The teachers’ perceptions of the implementation of emerging state data
practices are mixed, with a slight edge toward the belief that the school is
implementing emerging practices. The category of implementation of emerging
practices was built on questions regarding frequent professional development,
frequent discussion of new data practices with colleagues, and assistance from
school administrators in implementing new data practices.
It is interesting to note that the highest response in the emerging practices
category was in the strongly disagree area. This response is once again consistent
with the results from the stages of concern survey and teacher interviews that
indicate that the teachers have not received a high degree of professional
development training in the use of data and the emergence of new data practices.
Nevertheless, when looking at the combined total of the agree somewhat and agree
strongly responses, we can determine that the teachers feel that the school as a
whole is moving toward the implementation of the emerging state practices. The
state of California, for example, is emphasizing the importance of the High School
Exit Exam (HSEE) to students at all grade levels. Administrators and teachers at
the elementary school level are finding that it is in their students’ best interest
to begin measuring success based on the likelihood that their students will be
able to pass the HSEE (which is a component of the emerging state practices).
A higher number of teachers agree than disagree that there is a definite
sense of accountability regarding the use of data at the district, school, and
individual levels. This sense of accountability is most likely rooted in the fact that
district administration and school level administration talk about using data. Once
again, the systems set in place by the school for analyzing and discussing data
contribute to the level of accountability seen by the teachers. The expectation that
the teachers respond to the data in writing enhances the level of individual
accountability for teachers.
Teachers believe that they are improving student achievement through
implementation of data use. These questions were rooted in the teachers’
observations of improved student achievement through the use of data. Teacher
responses were highly positive regarding their experiences of seeing student achievement
rise because of their use of data. By using interim assessments such as criterion-referenced
tests (CRTs) and teacher-made tests, one can notice improvement in student achievement on an
ongoing basis and, thereby, be aware of whether or not students are acquiring a
certain level of mastery. This is true even when keeping only a cursory view of student
performance without in-depth analysis, although an in-depth analysis
will provide a basis for a more precise change in instructional practice.
One important part of implementation of data use is the interventions that
come with findings regarding instructional practice. Once teachers identify student
needs through assessments, they must determine interventions that will help
students meet academic standards. When asked what interventions the school
has implemented to help struggling students achieve the established standards, most of
the teachers pointed to the after-school tutoring program. However, one teacher
discussed a multiplicity of activities that the school has been engaged in over the
last few years. She says:
What we’ve done over the last couple of years is to react
in a hodgepodge kind of way. We try something for six
months or a year and the next year we try something else.
I don’t see a consistency of use in any one thing. It is
stressful on every one, the parents, the teachers, and the
students. Something that has been tried in the past is our
off track tutoring. Facilitated by the learning facilitator
and supported by the aides. It was tried for a little while
and then stopped. To be quite honest I didn’t see a lot of
growth. I was convinced that it really wasn’t working.
Last year there were two programs that were
implemented. One was that a credentialed teacher, rather
than an aide would work with students who were off
track. So for 2 1/2 months of the off track time the
students would receive an extended year. Working with
these children in their own classrooms, I saw a difference.
It was like cracking a code with them. In addition to the
off track tutoring, many teachers participated in after
school tutoring with many of the same children. I have
to tell you that I saw growth. This year with many of the
changes across the street (district office) determining that
teacher’s needed to spend more time with the children
who were held back, the intervention program was
dismantled. They no longer receive the help off track.
Many of the teachers who spent time teaching after-
school are burnt out. The paperwork is daunting,
horrendous. Some teachers are still tutoring after-school,
but off the clock. I just put in paperwork for extra time
and it took me three after-school sessions to complete it.
It makes people not want to do the tutoring.
After-school tutoring may be a very effective intervention, providing students with
an extended learning opportunity for meeting curriculum standards. However,
there must be a strong commitment to the intervention by administration and
teachers. All roadblocks to sustaining the innovation must be eliminated if the
innovation is to be successful. Thus, it would be good if the district looked into
minimizing the amount of required paperwork in order to make it easier for
teachers to want to participate.
A series of interviews of six teachers, two teacher leaders and four
classroom teachers, was conducted to find out classroom teachers’ level of
commitment to, and depth of understanding of, the use of data to improve
instruction and student achievement. The indicators are that the teachers in this
school are committed to the use of standards, are developing an ever-increasing
expectation for student achievement, and appreciate the use of data to inform their
instructional practices. However, the teachers interviewed were not uniform in
their comfort level for understanding and using data. The more experienced
teachers seemed to have a better understanding of the use of data than did the
newest teachers. They also did not feel strongly that the district was a partner with
them in the use of data. Though they knew that the district provided them with
data, they did not view the district as helpful in giving them the tools and training
that they needed to make sense of data.
Teachers’ difficulty with the use of data may be tied to their feeling
that they have not had sufficient training in how to use
data. The teachers are unable to articulate, in any concrete manner, that the school
or district has provided them with any training on the use of data. The district says
that it provides in-service training where needed, but teachers are unable to
corroborate this. For example, when asked, how has the school trained teachers to
analyze data, one lead teacher responded:
The leadership team met and viewed a video on how to
analyze data. It talked about quintiles and things like that. I
still walked away with some confusion on how to read class
profiles. I did not receive enough information to understand
the complexity of the reports that we get.
By contrast, another teacher responded, as quoted earlier, “The school has not
trained us to use data.” There does not appear to be any evidence of a systematic
plan by the district to provide systemic training in data analysis. The method by
which Grand school collects and analyzes the data seems to have been developed
through their participation in the II/USP process. Nevertheless, teachers are
systematically looking at data and analyzing them. They acknowledge that the
school and district provide them with specific data to look at, and that they meet
together in grade level teams to determine how well students are performing on
tests. The findings provide them with implications for their pedagogical
development.
A major part of developing a truly standards-based instructional program is
the development of rubrics. Rubrics help teachers analyze the effectiveness of the
instruction by setting a standardized guide for determining mastery of the
curriculum, or lack thereof, on an ongoing basis. There is evidence that the
teachers at Grand Elementary are using rubrics to analyze classroom data. Each
teacher interviewed affirmed that they use rubrics. They discussed how the
rubrics were developed, and how they ensure that standards are taught. For
example, one lead teacher said: “We sat down and looked at the standards and built
a rubric based on the standards.” Another lead teacher gave a more thorough
accounting:
Rubrics have been used off and on throughout our history.
Initially, when we began 10 years ago they were just handed
to us and we were asked to use them. Then some of us
became more sophisticated in the use of rubrics and we began
creating them on our own and creating them with our students.
Then there came a time when the district placed less emphasis
on using rubrics and the use of rubrics just fell away. It seems
to be in vogue again in our district and in our school and so
we’re beginning to adopt and rewrite them. At this point
rubrics are being used on a monthly basis with one assignment
and they’re coming from the teachers rather than from the
students.
The rubrics, being built from the standards, help teachers ensure that their
instruction is standards-based. This is one way that they ensure that the standards
are being taught. Teachers also ensure this by writing the standards into their
lesson plans and by matching activities and assignments to the list of standards
that is provided to all teachers. One teacher pointed out that, “Trimester benchmark
testing helps ensure that standards are taught.”
The building of rubrics and the making of time for teachers to plan
instruction and analyze data are indicators that the school has made the use of data
a priority. Teachers have indicated that there is no question that the use of data has
become a priority in the school. The teachers spoke of how the use of data became
a priority in their school in the following manner:
Teacher 1: We were not given a choice in that. An outside evaluator (a function of
the II/USP process) helped the school set up a leadership team. Through this team
the mechanism was set up for using data to inform instruction. One of the things
that we did was to adopt a reading curriculum. It had an interesting effect on
teachers. It gave the teachers the ability to assess how the students were
performing.
Teacher 2: At one point in the past it was used to hit us over the head with, but
lately we’ve been looking at data to see how we can improve our teaching.
Teacher 3: Making it a regular focus makes it a priority.
Teacher 4: Institute the weekly meetings, planning and implementation, providing
conference opportunities, putting it out there and saying this is where we need to be
are all ways that data use became a priority in this school. The teachers have been
empowered to make a difference.
Regarding a systemic use of data, from the district office to the classroom,
the reviews are mixed. It is clear that the district provides the school with data to
use, but not everyone feels that the data they receive are clear or useful. Some
view the district as a partner with the school in using data, but only to the extent
that the district provides the school with data to use. Others do not see the district
as a partner at all because they say that they do not fully understand the data or they
feel that the data are redundant or manipulated. For example, a couple of teachers
put it this way: “The district office has been wonderful about getting data to
the school. They have hired a professional statistician to help... yes, I see the
district as a partner with the school. Their goal is to improve students, but also by
helping teachers do their job.” Yet, a couple of other teachers had this to say:
No, I do not see the district as a partner with the school. I
see that they want us to use data, but sometimes it is
manipulated... I know that the administrator in special
services has provided us assistance from time to time. I
don’t see the district as a whole partnering with us. They’ve
hired a lot of people across the street (district office) to
analyze data, but I don’t see any benefit from it. I don’t see
it coming down to me.
Concerns Regarding Implementation
Any time there is an implementation of a new innovation there is a natural
questioning that takes place in the minds of the individuals within the organization.
To develop an understanding of the concerns that individuals have during the
implementation of an innovation, Hall and Hord (1987) developed the stages of
concern concept. This conceptual framework was the basis upon which the
research team developed the stages of concern questionnaire. The questionnaire
asked teachers to respond to questions that touched on each of the seven stages of
concern (stage 0 awareness, stage 1 informational, stage 2 personal, stage 3
management, stage 4 consequence, stage 5 collaboration, and stage 6 refocusing).
At Grand Elementary School, the pattern of teacher concerns was analyzed. Of the
33 teachers on staff, 16 filled out the Stages of Concern Questionnaire.
Table 5 shows the strongest stage of concern for each teacher.
Table 5: Strongest Stage of Concern for Each Teacher
Teacher      Strongest Stage of Concern
One          Stages 1 & 2 (5.00 each)
Two          Stage 2 (4.20)
Three        Stages 1 & 4 (3.00 each)
Four         Stage 4 (6.40)
Five         Stage 1 (6.17)
Six          Stage 1 (6.83)
Seven        Stage 5 (6.00)
Eight        Stage 4 (4.80)
Nine         Stage 1 (5.17)
Ten          Stage 2 (5.20)
Eleven       Stage 2 (3.80)
Twelve       Stages 3 & 4 (2.60 each)
Thirteen     Stage 2 (5.40)
Fourteen     Stage 2 (6.40)
Fifteen      Stage 1 (6.00)
Sixteen      Stage 4 (6.20)
The stages of concern questionnaire items were rated on a Likert scale from 0 to 7. A
response of 0 indicated that the question was irrelevant to the individual. A
response of 1 indicated that the question of concern was not true for the individual.
A response of 2-4 indicated that the question of concern was somewhat true for the
individual. A response of 5-7 indicated that the question of concern was very true
for the individual.
Table 5 shows that 11 of the 16 teachers who responded to the questionnaire
had their strongest concerns at the informational stage (1) and the personal stage
(2). Their responses tended to be high (between 5 and 6 points). According to Hall
and Hord (1987), individuals at the informational stage have a general awareness of
the innovation. They are not concerned about themselves in relation to the
innovation. These individuals need more information about the innovation and all
that is entailed in the implementation of the innovation. Individuals who have a
high degree of concern at the personal stage are uncertain about the demands of the
innovation, their inadequacy to meet those demands, and their role with the
innovation.
These findings are consistent with what the teachers have said regarding a
lack of training for using data. This table shows that many on the staff really are
not sure about what is involved in analyzing data and how to make the best use of
the data presented to them. It further indicates that the teachers desire more
information about how the district intends for the teachers to use data and more
training on data analysis and standards-based instruction. Table 6 shows the
average score across all stages of concerns, the highest score for each teacher in
stages 0-4, and the number of stages per teacher (stages 0-4) that had at least
averages of 3.5.
Table 6. Average Scores Across All Stages, Highest Score 0-4, and Number of
Stages Per Teacher with a Score of 3.5 or More.
Average Score        Highest Score        Number of Stages per Teacher with a
Across All Stages    (Stages 0-4)         Score of at Least 3.5 (Stages 0-4)
3.97                 5.00                 3
2.91                 4.20                 2
2.82                 3.00                 0
3.20                 6.40                 1
5.19                 6.17                 5
4.43                 6.83                 5
4.12                 5.00                 3
3.33                 4.80                 3
3.85                 5.40                 4
3.79                 5.20                 2
2.82                 3.80                 1
2.31                 2.60                 0
3.41                 5.40                 3
5.06                 6.40                 5
4.20                 6.00                 4
4.91                 6.20                 5
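To make the derivation of these three columns concrete, the following is a minimal
illustrative sketch, not part of the study's instrumentation, of how such summaries
could be computed from one teacher's per-stage averages on the 0-7 scale; the
profile values shown are hypothetical.

# Minimal sketch: summarizing one teacher's Stages of Concern averages.
# Assumes one average rating (0-7 scale) per stage 0-6; values are hypothetical.
def summarize_concerns(stage_averages, early_stages=range(0, 5), threshold=3.5):
    """Return the three measures reported in Table 6 for one teacher."""
    overall_average = sum(stage_averages.values()) / len(stage_averages)
    early = [stage_averages[s] for s in early_stages]
    highest_early = max(early)
    num_above_threshold = sum(1 for score in early if score >= threshold)
    return round(overall_average, 2), highest_early, num_above_threshold

# Hypothetical teacher profile: stage number -> average rating on that stage's items.
teacher = {0: 2.0, 1: 6.17, 2: 5.4, 3: 4.0, 4: 5.8, 5: 3.2, 6: 2.5}
print(summarize_concerns(teacher))  # prints (4.15, 6.17, 4) for this profile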
A high number of the teachers surveyed indicate that they have a high level
of concern regarding the change toward the use of data to improve student
achievement. For example, the majority of the teachers (10/16) scored 3 points or
higher across all stages of concern. The scale scores indicate that the teachers are
still looking to receive more information about the innovation, are trying to
understand their place in it, are not sure how to manage it as part of their
instructional strategy, and do not yet understand what consequence or impact the
innovation will have on their instruction.
This analysis is validated by the results in the other two columns. A very
high number of teachers surveyed (11/16) had as their highest score 5 points or
more between stages 0 and 4. The same number had at least 3 or more stages at 3.5
or higher between stages 0 and 4. These data further indicate that the teachers are
at the first stages of concern and need a greater amount of staff development
training regarding the use of data as a methodology for improving student
achievement. To understand the significance of these results, it is helpful to explain
what the concerns are for stages 0 through 4. Hall and Hord (1987) explain the
stages as follows:
Stage 0 Awareness: Little concern about or involvement with the innovation
Stage 1 Informational: A general awareness of the innovation and interest in
learning more detail about it is indicated. The person seems to be unworried about
himself/herself in relation to the innovation. She/he is interested in substantive
aspects of the innovation in a selfless manner such as general characteristics,
effects, and requirements for use.
Stage 2 Personal: Individual is uncertain about the demands of the innovation,
his/her inadequacy to meet those demands, and his/her role with the innovation.
This includes analysis of his/her role in relation to the reward structure of the
organization, decision making, and consideration of potential conflicts with
existing structures or personal commitment. Financial or status implications of the
program for self and colleagues may also be reflected.
Stage 3 Management: Attention is focused on the processes and tasks of using the
innovation and the best use of information and resources. Issues related to
efficiency, organizing, managing, scheduling, and time demands are utmost.
Stage 4 Consequence: Attention focuses on impact of the innovation on students in
his/her immediate sphere of influence. The focus is on relevance of the innovation
for students, evaluation of student outcomes, including performance and
competencies, and changes needed to increase student outcomes (p. 60).
Therefore, if an individual rates high for stage 1, for example, he or she is
not really ready to use the innovation at present. This individual still wants more
information about the innovation. Thus, with regard to using data, the individual
needs to understand what the research says about using data (to a greater extent
than he or she has received to date), why using data is important and how it fits
into his or her work, and how it can be used to its fullest capacity.
In the absence of knowledge regarding the innovation there can be a bit of
ambivalence once the innovation has been introduced. For example, when asked
how he or she felt about the changes that have taken place regarding data use,
one lead teacher responded: “Some of it is beneficial, some of it is redundant. It is
based too heavily on one set of information. We’re turning education into a race
instead of a journey.”
Although the above table does not show it, the majority of the teachers
surveyed had the highest number of concerns (greater than 3.5) in stages 1 and 2.
The fact that the concerns are highly concentrated at these two stages indicates that
the teachers want to know more about using data and are not yet comfortable with
their ability to make effective use of data in their teaching. These findings are
consistent with the research. Teachers ought to be given sufficient time to
understand an innovation during the process of implementation. Again, Hall and
Hord (1987) tell us that it is natural for teachers to be at the beginning stages of
concern when an innovation is newly introduced. Last school year was this
teaching staff's first introduction to the systematic use of data. Most are going to
need more information and more practice in order to gain a higher level of
confidence with the innovation. That being said, it is interesting to note that the
newest, least experienced teachers are enthusiastic about the use of data and are
pleased with what they are learning in the journey of data use. A couple of new
teachers who were responding to the question of how they felt about the changes
that have taken place regarding data use answered in the following fashion:
Teacher 4-“I like it. I am relatively new. I believe the process has helped.”
Teacher 5- “I am really thankful for the time people have given to help me
understand things. The data helps me focus on what the students need.”
In sum, the district provides data to the schools in multiple fashions. The
district has an expectation that the schools will make use of the data that they
provide. The district uses the testing outcomes to evaluate the effectiveness of the
school’s performance and that of the principal. The school has developed a system,
with the help of an external evaluator, for collecting and analyzing data. Classroom
teachers have participated on a regular basis in the analysis of data, the
development of rubrics to aid in the analysis, and the implementation of standards-
based instruction as a means to a more productive instructional methodology. The
available data indicate that the school has begun to improve student achievement.
Its attention to data and its development of standards-based instruction have been
keys to its success. Yet there is a real need for staff development in the use of data,
as teachers are not fully confident in their knowledge of data in its various forms
or in their ability to use data effectively as a tool for improving student
achievement.
Finally, there is no codified plan created by the district for the use of data.
There are many examples of how data are disseminated, the types of technologies
used to aid in the development of student data, the commitment to the collection of
data, and some minimal effort toward staff development in the use of data. Yet,
there is no evidence that there is a systemic plan for the collection of data, training
on how to analyze data and identify data in its various forms, the development of
rubrics for the implementation of standards-based instruction, or the provision of
funding to ensure the ongoing training of staff, collection of data, and analytical
processes. Based on all of the evidence, each school is responsible for making sure
that all of the above takes place. Each school implements the district expectation
for using data. Schools must develop and cultivate this understanding on their own, or
go outside of the organization and find assistance on how to use data and develop
standards-based instruction.
Research Question #3: The Extent to Which the District Design is a Good One
Framework for Research Question #3. The third question for this study was “To
what extent was the district plan a good one?” This question had to do with
evidence of the effectiveness of the district and school use of data in promoting
improvement of student achievement. The question sought to determine the level
of district support for standards-based instruction and assessment, district and
school accountability to standards-based curriculum, and the degree to which high
student performance is aligned to standards and communicated to teachers,
students, and parents. To answer the question, the researcher used a Researcher’s
Observation and Rating Form, developed by the research team and its leader,
Dr. David Marsh, and designed to elicit the researcher’s assessment of how
effective the district plan for the use of data to improve student achievement
actually was.
Findings for question #3. As indicated earlier, there is no evidence of a codified
plan to use data in a systemic manner. So it is difficult to say that a plan that
essentially does not exist is a good one or a bad one. Nevertheless, the researcher rated whether
or not the actions that the district and school have taken in the use of data have had
the desired effect of improved student achievement. Based on the Researcher’s
Observation and Rating Form, the following figures provide the findings on the
adequacy of the design at the district and school site levels.
Figure 3.
District Support for Standards-based Instruction and Assessment
Intended District Impact: To provide schools with information regarding
their progress toward meeting state API targets. To lead schools toward
standards-based instruction. To encourage schools to use multiple data
sources to inform practice.
Observed Impact on School Site: Schools receive a multiplicity of data.
The school site systematically analyzes data. Teachers have a better idea of
student achievement levels on an interim basis due to the weekly analysis.
There is insufficient training for teachers to better analyze the data and
effect change in instructional practice. No budgeting by the district to
support data use at the school site level.
Figure 4.
District and School Accountability to Standards-based Curriculum
Student Performance Data Forwarded to School: The district sends
state assessment data out to schools at the beginning of each school year.
The results are disaggregated in several ways and forwarded to schools.
The district uses interim criterion referenced tests and sends the results data
to the school on a quarterly basis. The district has provided schools with
diagnostic testing programs as another vehicle for interim assessments.
How School is Utilizing Student Data to Impact Student Performance:
The school disseminates the data to the teachers. The teachers meet in
grade level teams on a weekly basis and analyze the data in a guided
fashion with school administrators and lead teachers. The teachers take the
findings and determine implications for change in instruction. Ideally, the
implications for change will translate into improved instructional strategies
and stronger pedagogy for classroom teachers.
District Support for Standards-Based Instruction and Assessment
Intended District Impact: To provide schools with information regarding their
progress toward meeting state API targets. To lead schools toward standards-based
instruction. To encourage schools to use multiple data sources to inform practice.
The district has mandated that all schools follow state standards. This
directive indicates, implicitly if not explicitly, that the schools are expected to
move to a standards-based instructional model. When asked how the district
provides training for teachers to use data to improve student achievement, one
teacher and the principal responded that the district had sent them to
conferences featuring Doug Reeves. Doug Reeves is a researcher whose area of
interest is standards-based instruction. Thus, this example is an indicator that the
district expects standards-based instruction.
This move toward standards-based instruction should have the effect of
improving achievement test scores, thus improving the API rankings of the schools.
The schools are sent test results in multiple formats. The multiple formats of data
are evidence that the district intends for the schools to look at the results from
various perspectives. By doing so the schools can get a clearer picture of the
meaning of the data.
Observed Impact on School Site. Schools receive a multiplicity of data.
The school site systematically analyzes data. Teachers have a better idea of student
achievement levels on an interim basis due to the weekly analysis. There is
insufficient training for teachers to better analyze the data and effect change in
instructional practice. No budgeting by the district to support data use at the school
site level.
Since the school has begun a systematic approach to using data, the
intended outcome is that a collaborative investigation of student outcomes will lead
to improved instruction and improved student achievement. An indicator suggests
that this position has merit is that the student achievement and API scores have
improved during the first year that the school has implemented this system of data
analysis. There are also indications that the system has had an impact on the
teaching staff. One teacher commented that “the expectation and goal is that
students meet the state standards. Teachers are working really hard to see that the
children meet the expected standards.”
District and School Accountability to Standards-Based Curriculum
Student Performance Data Forwarded to School. The district sends state
assessment data out to schools at the beginning of each school year. The results are
disaggregated in several ways and forwarded to schools. The district uses interim
criterion referenced tests and sends the results data to the school on a quarterly
basis. The district has provided schools with diagnostic testing programs as
another vehicle for interim assessments.
The criterion referenced tests that the district has created are based on the
state standards. This quarterly assessment data is intended to help teachers monitor
student progress throughout the course of the year. Teachers seem to understand
the importance of utilizing this data type. For instance, two teachers responded in
similar fashions during the interview when reflecting on how teachers are
developing standards-based rubrics: “Teachers are developing writing rubrics.
Trimester bench-mark testing helps ensure standards are taught... We sat down and
looked at the standards and built a rubric based on the standards.”
How School is Utilizing Student Data to Impact Student Performance
The school disseminates the data to the teachers. The teachers meet in
grade level teams on a weekly basis and analyze the data in a guided fashion with
school administrators and lead teachers. The teachers take the findings and
determine implications for change in instruction. Ideally, the implications for
change will translate into improved student achievement and stronger pedagogy for
classroom teachers.
There is evidence that the teachers are taking steps toward making changes
in their instruction based on data analysis. For example, one teacher said “I’ve
made questions for my daily math work. I plan to change my questioning during
reading.” Another teacher said “notations are made in the lesson plan book
regarding the standards to be taught.” These are changes in practice to help
teachers improve what they are doing in the classroom to impact student
achievement.
Degree to Which High Student Performance is Aligned to Standards and
Communicated to Teachers, Students, and Parents
Presumably, the district curriculum is based exclusively on the state
standards. Thus, all decisions that stem from data analysis of state and district
assessments would be tied to standards-based instruction. Five of the six teachers
interviewed for this study say that they keep track of student achievement and
growth in the following manner:
Teacher 1: I use authentic assessment.
Teacher 3: I look at student work samples, I use teacher observation, student
participation, and grade books.
Teacher 4: Through a database that helps teachers develop a portfolio. SAT-9
tests and district tests.
Teacher 5: I test them using end of chapter tests. Mostly through observations, etc.
Teacher 6: We do it on paper.
The teachers discuss how they use the information:
Teacher 1: I look at their writing and write it down in their portfolios.
Teacher 4: We use it by keeping data as a focus. Keep on moving through staff
development.
Teacher 5: If I find that students are having trouble, I tutor them after-school. I re-
teach when necessary.
Using the researcher rating instrument, the researcher rated how effectively
the district design improves student performance as demonstrated in standardized
assessment results. The results were that the design was somewhat effective
because although there was evidence of improvement, the innovations are relatively
recent and time will tell whether there will be greater increases in student
performance. The degree to which the district-provided student data are used by
the school was rated as effective. School administrators and teachers are
analyzing data using a collaborative model. The efforts are guided, the
administrators and lead teachers meet to reflect on the process, and teachers use the
findings to make changes in their instructional practices. How effectively high
student performance is being developed throughout the school was rated as
unclear. This rating was given because there is no direct evidence, whether through
student work samples or student achievement data, that there are large numbers of
students working at high performance levels. Typically, there are always students
who are high achievers regardless of the overall achievement levels of the school.
Nevertheless, the researcher could not clearly rate this section without archival data
to indicate to what extent the activities that the school is engaged in have impacted
high performance levels by the students. Figure 5 shows the results of the
researcher rating of each of the indicators.
Figure 5. Researcher Rating Scale of the Success Indicators for District Design for
Data Use
Researcher’s rating on how effectively the district design improves student
performance as demonstrated in standardized assessment results.
Researcher’s Rating: 2 (somewhat effective)
Researcher’s rating on the degree to which the district-provided student data are
used by the school.
Researcher’s Rating: 4 (effective)
Researcher’s rating on how effectively high student performance is developed
throughout the school learning community.
Researcher’s Rating: 3 (unclear)
Legend: 1 = not effective, 2 = somewhat effective, 3 = unclear, 4 = effective, 5 = very effective
Discussion
The literature reveals that the use of data at the school and classroom level
is not common practice (Guth et al., 1999). Teachers have not been eager to use
data because in many instances past practice had been for administrators and
policymakers to use data as a club to “beat them over the head with.” One teacher
in this study echoed this very sentiment. Thus, for many teachers the idea of using
data has been a source of fear. Such practice rarely leads to constructive change.
However, when data are used in their proper contexts they can be powerful
instruments for improving instructional strategies and student achievement.
Schmoker (1999) underscores this point when he says “many enlightened educators
recognize that we need data to improve teaching practice.” The school in this study
began systematically using data during the 2000-2001 school year. The data
suggest that their attention to data analysis began to pay dividends in improved
student achievement.
There are other reasons why teachers have not used data as a tool to
improve achievement. One such reason is that published data can cause one to be
vulnerable with regard to instructional practice. Testing data show the results of
instructional performance. Once again Schmoker (1999) points out this very issue:
Why do we avoid data? The reason is fear—of data’s
capacity to reveal strength and weakness, failure and
success. Education seems to maintain a tacit bargain
among constituents at every level not to gather or use
information that will reveal where we need to do better,
where we need to make changes. Data almost always point
to action—they are the enemy of comfortable routines. By
ignoring data, we promote inaction and inefficiency.
Teachers must learn to work in a collaborative manner using data as an instrument
to gauge their instructional technique as opposed to viewing data as a mirror for
pointing out imperfections or failed practices. Schmoker (1999) continues by
saying “collective pride and enthusiasm are the result of an organized effort to
monitor and adjust progress based on data.”
In order for teachers to begin using data in a constructive manner there must
be training so that teachers understand how to use the instruments effectively.
Darling-Hammond (1992) points out that if an innovation is going to take hold
there must be effective staff development. Otherwise teachers will revert to what
feels comfortable. Because districts have not put a premium on data analysis,
there has not been a strong commitment to staff development in this area. Indeed,
the teachers in this study reveal that they have not received training from their
district on how to use data. Nevertheless, the interviews also show that teachers
are not averse to the use of data. They would, however, like to receive a
greater amount of information and training on the subject in order to become more
effective. This willingness to receive information and participate in the process is a
positive opportunity for the district to begin systemically embedding the innovation
into the fabric of the organization.
Discussion of Research Questions
The first question of the study asks: what is the district design for using data
to improve student performance, and how is the design linked to current and
emerging state context for assessing student performance? The answer to this
question in this study is that there was no evidence to indicate the existence of a
strategic plan for using data. The district did collect data to monitor how schools
are progressing toward improved achievement on the API and how students are
progressing on the SAT-9. The district distributed SAT-9 and district benchmark
data to the schools in various formats. The district provided schools with
technology that would provide diagnostic data to the schools on an interim basis.
However, all of these actions do not make up a systematic use of data. There is no
direction for guiding the whole of the organization toward the effective use of the
data provided. There is no central database that gives the whole organization a
single source of data that can be formatted in ways meaningful to each particular
user. There are no provisions for systemic
staff development so that everyone in the organization will receive training in data
analysis that will be germane to their particular job functions.
The study found that Grand Elementary had begun to develop a systematic
procedure for using data to improve student achievement. School administration
collects data and analyzes them in order to make decisions regarding instructional
direction for the school. The school administration leads and guides the faculty in
data analysis and provides opportunities for collaborative strategic planning efforts.
However, the development of their procedures came as a result of outside
consultants and not as a result of a district level plan. As stated earlier, Grand’s use
of data seems to have had a positive effect on student achievement that year. It
would seem that if they can make substantive gains by having exposure to data
results and giving thought to the implications of those results, then once the staff
truly begins to understand the process of data analysis through increased
information and training, their achievement results would be even stronger than
those of the current indicators.
Research question #2 asked: To what extent is the district design actually
being implemented at the district, school and individual teacher level? The study
found that there was no structural platform for a systemic implementation of data
collection and analysis. It would seem that all schools are left to figure out how to
analyze the data on their own. Although the district expects that principals will
utilize data to influence their decisions regarding improved academic achievement,
there are no specific achievement goals set forward by the district. Without goals
there can hardly be any strong expectation for high achievement.
The study found that the district uses the state standards exclusively as the
district curriculum. The district has begun developing a benchmark testing system
that will be administered on a trimester basis. This is an important element in the
development of a standards-based instructional system. According to Reeves
(1998) “standards without standards-based assessments are merely a very
expensive and time-consuming pep talk—one in a string of educational initiatives
and innovations that shed more heat than light.” However, the district testing
system is still in its infancy and is still being written and revised. Also, all
indications are that the data that the schools receive provide only raw and
percentage scores. The reports lack key components such as item analysis pages
and matrix pages that allow the reviewer to know which standards were tested on
any given item. This type of vital information can have a significant impact on
instruction and achievement by allowing teachers to see which standards are being
mastered by the students and which ones still need continuous review.
The study also revealed that teachers in this school were at the beginning
stages of the Hall and Hord stages of concern model. As was pointed out in the
analysis portion of this chapter, the teachers are not entirely comfortable with their
knowledge of data use. They are at the information and personal stages, which
means they would like to receive more information about the innovation and are
not yet to the place where implementation is a major point of emphasis. They still
need to understand how the innovation is going to impact them personally. The
implication here for school and district administration is that there needs to be a
steady flow of information about the importance of data to the improvement of
instructional practice and student achievement. This information can come in the
form of empirical research data (e.g., articles from refereed journals and studies) as
well as informal talks regarding data and data analysis. Although the principal is
providing some research articles at present, there is still a need for a continued
distribution of such to the staff. Such resources should come from the district
office as well as the principal’s office.
Formal and situated interviews validated the results of the teacher
questionnaire and stages of concern survey. The interviews demonstrated that
although teachers did not mind having time for data analysis, and some were even
thankful for such an opportunity, they did not always understand what the data
meant or how to use the results to inform their instruction. These answers were
most typical of the veteran teachers, although the answers from new teachers were
similar. The reasons for their lack of understanding were mentioned above in that
the data does not provide sufficient information for making in-depth analyses and
there is insufficient training to build the knowledge for such analyses.
Finally, there seems to be a dichotomy between the district’s perception and the
teachers’ perception of data use within the district. The district
believes that there is a plan that exists, whether implicit or explicit, for how data is
used. The teachers do not believe that such a plan exists. The reason for such a
difference in perception seems to be related to the need for more information, as
mentioned earlier. The creation of a formalized plan would ensure that everyone
within the organization shared the same understanding of what the expectations
for data use are, how data are disseminated and collected, how support for data
analysis is provided, and how results from said analysis are achieved.
Research question #3 asked: To what extent was the district plan a good
one? The answer to this question seems to be that there is no way to rate the
district plan for using data to improve student achievement as a good one because
there is no systemic plan in place. The researcher rated the district support for
standards-based instruction and assessment as somewhat effective. This low rating
was given because although there were some elements of standards-based
instruction present in the district system (e.g., the development of a standards-based
assessment, distribution of state and local testing results, and the development of a
systematic approach to data analysis at the school that was studied), there was no
evidence to support the existence of a system-wide action plan for the collection of
data, the analysis of results, the implementation of strategic planning, and the
setting of achievable goals built around a standards-based instructional approach.
In rating the degree to which data that is provided to the schools by the
district is being used, the researcher rated this area as effective. This high rating was
given because the school has developed a systematic procedure for analyzing data.
The school has provided once a week planning time for collaborative meetings on
instructional improvement. One of the four weekly meetings is set aside for data
analysis. As mentioned above, the school leadership guides the teachers through
the process of analyzing data to help them see implications for their instructional
processes from the data that is provided. Archival data supports this finding.
The researcher’s rating on how effectively high student performance is
developed throughout the school learning community was a rating of unclear. The
unclear rating was given because although the teachers and principal indicate that
high student achievement is talked about throughout the organization, there is no
concrete evidence to substantiate that high achievement expectations are
implemented within the instructional atmosphere. In walking through classrooms
to do an environmental scan there were no posters on the walls in any classroom
visited that directly stated that there was an expectation for high achievement.
Many classrooms had standards charts posted to the walls, but there did not seem to
be too much more evidence than that. It would seem that in an atmosphere where
students were expected to achieve at high levels there would be posters on the walls
indicating what the expected levels of achievement would be and how far the
students have come toward meeting the expected levels. One would also expect to
see rubrics for expected achievement posted on the walls or some kind of indicator
that students are following such a rubric. Interviews with teachers tend to indicate
that rubrics have been in use in the school, but there was no direct evidence of such
in the classrooms that were visited. Thus, the verbal communication of high
expectations for all students, combined with the lack of tangible evidence of the
same, makes for an unclear picture of a cultural expectation for high student
achievement.
In the final analysis, this school and district seem to be typical of most in
the state or country. The literature is clear that most teachers do not
use data to improve student achievement. The research literature is also clear on
the point that a standards-based instructional program where data use is prominent
can have a positive effect on the improvement of student achievement. The school
and district, however, do seem to be on the right track with regard to data use.
The district has spent its resources in building the beginnings of an infrastructure
for systemic data use. They have hired an administrator whose job is to collect
achievement data, analyze results and submit the data and findings to the schools,
and conduct studies to find out the effectiveness of instructional and diagnostic
programs in use in each of the schools. The study school is using the data
submitted to it in order to analyze the effectiveness of its instruction. It would
seem that through the process currently in place the school will experience
continued success. However, if the school is to experience powerful results, there
must be a clear plan set forth by the district for improving the use of data at this
school site and throughout the district.
Summary
The data provided answers to the three research questions of the study. The
questionnaires that generated the data were derived from conceptual frameworks
presented in the review of literature. The findings yielded descriptions of how the
district disseminates data to the schools, and the manner in which the school in this
case study analyzes data and uses the results. Chapter 5 summarizes the study and
the findings, leading to important suggestions and recommendations.
CHAPTER 5
SUMMARY, DISCUSSION OF FINDINGS, CONCLUSIONS, AND
RECOMMENDATIONS
Research Problem
The central problem addressed by this research study is one that has arisen
out of the accountability and school reform movements in education: how to
systemically improve the achievement levels in our schools. Many types of
reforms have been proposed and many types of programs have been developed to
help address this issue. However, the research literature suggests that when schools
learn to effectively utilize the data that they receive on a regular basis, they can
have a hugely positive impact on student achievement. If district administrators,
school administrators, and teachers use data in a systematic fashion the
improvements can be seen throughout the organization.
This study found that most districts have not developed a systemic approach
to the use of data. Most teachers do not use data on a regular basis nor have they
been trained and encouraged to do so. Within the context of the current and
emerging practices of accountability, it becomes all the more important for schools
to become effective users of achievement data. They must become adept at
using the data that they receive from the state and the data generated locally.
Purpose of the Study
The primary purpose of this study was to investigate how good schools and
districts use data to improve student achievement. The focus of the investigation
was to discover whether or not the district had developed a plan for the use of data
that would cause a systemic change in the way teachers approach instruction and
thus an improvement in student achievement.
Three research questions defined the problem and guided the procedures for
the study: 1) What is the district design for using data regarding student
performance, and how is that design linked to the current and the emerging state
context for assessing student performance? 2) To what extent has the district
design actually been implemented at the district, school and individual level? 3) To
what extent is the district design a good one?
Methodology
The methods for this study were qualitative in nature. The approach used
was based on the principles of instrument design and qualitative data collection
from Gall, Borg and Gall (1996), Gay (1992), and Hall and Hord (1987). The use
of three types of instruments—teacher surveys, self-reports, and interview guides—
produced triangulation of data, strengthening the validity of the findings and
reducing bias. District and school archival data such as SAT-9 test results,
district benchmark tests, the district strategic plan, and the school site action plan
supported the primary instruments.
The school for the case study was chosen using the following criteria: a)
The school had to be one of mixed demographics (i.e., there must have been a
student population with a mix of minority representation as well as majority
representation) so as to more closely resemble most schools throughout the state of
California, b) the socioeconomic status of the school had to be mid-ranged (not
exclusively high and not exclusively low), and c) the school had to show
substantive gains on the SAT-9 and other student performance data. The sample for
the case study consisted of the principal and 33 classroom teachers in a year round
elementary school. An interview with the assistant superintendent and the
administrator for research development was included in the study. To protect the
participants’ anonymity, a fictitious name was given to the school (Grand
Elementary School).
Data Collection and Analysis
This section explains how the data was organized, analyzed, and
interpreted. First the researcher transcribed all of the taped interviews and aligned
them with the typed notes taken during the interviews. The teacher survey was
entered into a database while the self-report was hand scored. Next, a report was
generated from the database and final scores were gathered from the self-report. Once
the data had been sorted and organized, the findings were analyzed.
To determine growth, the researcher collected two years of SAT-9 and API
data. SAT-9 data were disaggregated by grade level and subject matter and
compared to determine growth between the two years. To further verify
achievement gains, the researcher analyzed results from district benchmarks (CRT).
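The study reports these comparisons narratively; purely as an illustration, and
assuming hypothetical grade levels, subjects, and percentile scores that are not the
study's data, a year-over-year comparison of this kind could be tabulated as follows.

# Minimal sketch: comparing two years of SAT-9 results by grade level and subject.
# All grade levels, subjects, and percentile scores below are hypothetical examples.
year_one = {("Grade 2", "Reading"): 38, ("Grade 2", "Math"): 45,
            ("Grade 3", "Reading"): 41, ("Grade 3", "Math"): 48}
year_two = {("Grade 2", "Reading"): 44, ("Grade 2", "Math"): 52,
            ("Grade 3", "Reading"): 43, ("Grade 3", "Math"): 55}
for grade, subject in sorted(year_one):
    before = year_one[(grade, subject)]
    after = year_two[(grade, subject)]
    print(f"{grade} {subject}: {before} -> {after} (growth {after - before:+d})")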
Some questions arose during the data analysis regarding the type of data
that the researcher had been given. Follow-up phone calls were made to the
principal and the administrator for research development in order to get questions
clarified and better data samples forwarded.
Each research question was devised using frameworks developed by the
research cohort and supported by the research literature. The frameworks provided
a research base for each question and served as a way of organizing the findings of
the study.
Framework for the first research question. The first research question
asked, “What is the district design for using data regarding student performance,
and how is that design linked to the current and emerging state context for
assessing student performance?” To answer this question, the researcher analyzed
the SAT-9 and benchmark data to determine patterns in student achievement. The
analysis led to an interpretation of the inferences from the data results, which were
reported in each section of the analysis. The researcher also included vignettes and
responses derived from interviews of the study participants to help draw
conclusions about the district design for data use.
Framework for the second research question. The second research question
asked, “To what extent has the district design actually been implemented at the
district, school, and teacher level?” To answer this question the researcher used the
triangulated results from the teacher survey, teacher questionnaire, and teacher
interviews. The results were reviewed and interpreted in this section of the study.
Framework for the third research question. The third research question
asked, “To what extent is the district design a good one?” To answer this question,
the responses of the teacher interviews were interpreted and the results from the
teacher questionnaire and survey were analyzed. The analysis was recorded in this
section as a means of articulating the findings.
Summary of Data
First Research Question: What is the District’s Design for using Data?
The study found that the district design for using data is rated low. There is
no planned design for data use, one that would include how data are distributed to
schools, articulated expectations for data use, staff development training to ensure
competency, and funding to ensure that the above is adequately implemented and
maintained. Although the district provides data in multiple formats, the data tend
to be primarily from one source (SAT-9). Teachers do not receive the data in
formats that they can readily interpret.
Second Research Question: To What Extent is the Design Being Implemented at
the District, School, and Teacher Level?
The second research question concerned the implementation of the district
design throughout the district. This effort was also rated low since there is no
structured design to implement. However, the school has developed a system for
using data on a regular basis. This systematic approach receives a medium rating
as the school is in the beginning processes of using data. The school has built in a
format of weekly meetings to foster collaborative efforts in planning instructional
strategies and analysis of achievement data.
The majority of teachers that responded to the survey scored 3.5 or higher
in stages 0-4 on the scale. These results indicate that the teachers would like to
receive a greater amount of information in the use of data, and that they are at the
early stages of concern (information and personal). The stages of concern were
derived from the Hall and Hord (1987) concerns-based adoption model framework.
These findings validate the interpretations of the interviews that show that teachers
have not received sufficient information or training about how to adequately use
data and how the process can help them improve their instruction.
Third Research Question: To What Extent is the District Design a Good One?
The adequacy of the design from the systemic perspective is low. The
district needs to develop a process that would be uniform across all schools.
The district office employs multiple data types to complete particular
studies and analyze specific phenomena (such as how do English language learners
achieve as compared to their English only counterparts, or how much growth has
been made by students who have been retained during the current year). However,
the data that the district office uses is not necessarily useful to all schools all of the
time. According to the administrator for research development, the CRT tests
mentioned in this study are relatively new (only in existence since the 2000-2001
school year) and are still being revised, so they are an incomplete data source. Most of
the data to be analyzed are the SAT-9 data, and that one source is not timely as it is a
once-a-year test. Thus, it does not tend to be very useful during the course of
the year when teachers need to make mid-course corrections to their instructional
practices.
Recommendations
Teachers in this school are participating in data use practices and seem to be
glad that there is a system in place that gives them a way to look at how students
are achieving. For many of the new staff members, there is an excitement to the
process of looking at data. For the veteran staff, there seems to be a wariness about
the whole process. The staff as a whole, however, is in the beginning stages of the
process of collecting and analyzing student data. They need much more
information and training in order for them to become proficient at manipulating and
analyzing data. They will need more comprehensible data from their district—such
as a complete criterion-referenced test with results that include such features as an
item analysis page and standards matrix—so that they can have interim assessments
that will provide more timely information for informing instructional practice and
noting student growth through the year.
In order for organizations to experience systemic reforms that utilize data
in a manner that will positively impact student achievement, educators should
consider the following recommendations:
1. Districts should put together a comprehensive plan for the systemic use of data.
This means that the plan should delineate how schools will use data as a part of
their accountability instrumentation, and how teachers will use data to measure
growth on an interim basis—throughout the year—and on an annual basis. The
plan will also articulate to teachers how the data that they collect should be
used to help them improve their instructional strategies. Finally, the plan would
guide the district, school, and teachers toward standards-based instruction so
that the data that schools utilize would have a common point of reference and
have relevance. It is through standards-based instruction, and the utilization of
data to monitor the success of the instructional activity, that schools will find a
sustained level of academic achievement in their students (Reeves, 1998;
Schmoker, 1999).
2. When developing a plan for the systemic use of data, it will be imperative to
ensure that staff development is a central component in the plan. The research
literature is clear that any innovation that is to be implemented in an
organization will need to have strong staff development in order for the
innovation to be successful. The staff development needs to be provided to
administrators and teachers alike so that all members of the organization will
become expert users of data and can make wise decisions regarding curriculum
development and improved instructional practice.
3. District administrators and policymakers need to account for the funding required to develop a strong plan for using data. There needs to be funding for the above-mentioned staff development, as well as for data collection and dissemination. Achievement results must be shared system-wide, including with parents and community members, and released in formats that are easily understood (typically accompanied by some type of explanation of the numbers, graphs, or charts), with sufficient funding to ensure that all stakeholders receive the information in a timely manner.
4. Benchmark exams based on content standards should be developed that district officials, school administrators, and teachers can use to determine student progress throughout the school year. The data that accompany the exam results should include an item analysis page(s) and a standards-based matrix, so that the data are easily analyzed and teachers can determine the implications for improving instruction between benchmarks.
5. Besides benchmark exams, the district plan should help teachers develop or
identify meaningful interim assessments so that they can monitor the success of
their instructional strategies between benchmark exams and quarterly grading
reports. Teachers should be instructed in how to use student classroom work as
data to assess the effectiveness of their instruction and as a guide for modifying
their instruction.
Suggestions for Additional Research
This study examined how schools use data to improve student achievement.
However, once students begin achieving at higher levels, it appears to be an even more difficult task for schools to maintain that momentum (as evidenced by the many schools that made large gains on the California API one year and showed declines the following year). Research is needed to
investigate how some schools manage to sustain achievement momentum after
having made significant improvements. The results of such a study might give
administrators and policymakers a road map for continued improvement in
achievement outcomes.
This study revealed that California and other states have developed current and emerging practices for data use in accountability reforms. A useful future study might examine the effectiveness of district interim assessments or intervention strategies when they are tied directly to state accountability efforts.
REFERENCES
Adcock, E. & Phillips, G. (2000, April). Accountability evaluation of magnet
school programs: A value added model approach. A paper presented at the
annual meeting of the American Educational Research Association.
Abernathy, P.E. & Serfass, R.W. (1992). One district's quality improvement story. Educational Leadership, 50(3): 14-17.
A Nation at Risk. (1983). U.S. Department of Education. Washington D.C.
Barton, P. (2001, March). Raising achievement and reducing gaps: Reporting
progress toward goals for achievement. National Education Goals Panel.
Bernhardt, V.L. (2000, Winter). Intersections. National Staff Development Council, 33-36.
Black, P. & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80(1): 139-149.
Blank, R. & Wilson, L. (2001). Understanding NAEP and TIMSS results: Three
types of analyses useful to educators. ERS Spectrum. Winter, 19(1): 23-33.
Brown, J.S. (1991). Research that reinvents the corporation, Harvard Business
Review, 69(1): 102-11.
Busch, C. & Odden, A. (1997, Winter). Introduction to the special issue: Improving educational policy and results with school-level data—A synthesis of multiple perspectives. Journal of Education Finance, 22: 225-245.
Caldwell, B. & Spinks, J. (1992). Leading the self-managing school. Lewes,
Falmer Press.
California Curriculum News Report. (1999, Dec.). Raising student achievement:
Implementing systems of accountability. Vol. 25(2): 1-8.
California Department of Education. (2001). The 1999 base year Academic
Performance Index.
Cohen, D.K. & Ball, D. (1990). Relations between policy and practice: A
commentary, Educational Evaluation and Policy Analysis, 12(3): 331-338.
Congressional Digest. (1997, Nov.). National education tests, 257-267.
Consortium for Policy Research in Education. (2000). Assessment and
Accountability in the Fifty States.
Conzemius, A. (2000, Winter). Framework: System builds change efforts beyond
hopes, hunches, guesses. National Staff Development Council, 38-41.
Corbett, H.K. & Wilson, B. (1990). Testing Reform and Rebellion. Norwood, NJ: Ablex.
Darling-Hammond, L. (1990). Instructional policy into practice: The power of the
bottom over the top. Educational Evaluation and Policy Analysis 12(3):
339-347.
Darling-Hammond, L. (1997). The right to learn: A blueprint for creating schools
that work. San Francisco: Jossey Bass
Darling-Hammond, L. & Ascher, C. (1992). Creating accountability in big city schools. New York, NY: ERIC Clearinghouse on Urban Education.
Duncombe, W. D. & Rothstein, R. (1999). Performance standards and educational
cost indexes: You can’t have one without the other. In Ladd, H.F., Chalk,
R. and Hansen, J.S. (Eds.) Equity and Adequacy in Education Finance.
National Academy Press: Washington.
Easton, J. (1991). Decision making and school improvement: LSCs in the first two years of reform. Chicago, IL: Chicago Panel on Public Policy and Finance.
Edmonds, R. (1979, Oct.). Effective schools for the urban poor. Educational
Leadership, 37: 15-24.
EdSource. (1998, June). Shifting the focus to learning: California’s accountability
debates.
EdSource. (2000, May). National accountability movement offers lessons for
California.
EdSource. (2001, May). California’s new academic standards take hold.
EdSource. (2001, June). California's student testing system: Hard choices and new directions.
Education Commission of the States. (1996). The progress of education reform.
Elliot, M. (1998). School finance and opportunities to learn: Does money well spent enhance students' achievement? Sociology of Education, 71: 223-245.
Farber, B. (1991). Crisis in Education, San Francisco, CA, Jossey-Bass.
Frye, Frugerer, Harvey, McKay & Robinson. (1999). SAM: A student achievement model designed to empower teachers and increase student achievement through action research. Report from EDRS, ERIC No. ED 430015.
Fullan, M. (1992). Successful School Improvement, Buckingham, UK, Open
University Press.
Fullan, M. (1993). Change Forces: Probing the depths o f educational reform.
New York: The Falmer Press.
Fullan, M. (1993). Coordinating school and district development in restructuring, in Murphy, J. and Hallinger, P. (Eds.) Restructuring Schools: Learning from Ongoing Efforts, Newbury Park, CA: Corwin Press.
Fullan, M. & Rolheiser-Bennett, C. (1990). Linking classroom and school
improvement, Educational Leadership, 47(8): 13-19.
Fullan, M. with Stiegelbauer, S. (1991). The New Meaning of Educational
Change. New York: Teacher’s College Press.
Fullan, M.G. & Miles, M.B. (1992). Getting reform right: What works and what
doesn’t. Phi Delta Kappan, 73: 744-752.
Gall, M., Borg, W., & Gall, J. (1996). Educational Research: An Introduction.
Longman Publishers: White Plains.
Gay, L.R. (1992). Educational Research: Competencies for Analysis and
Application (4th ed.). McMillian Publishing: New York.
Glickman, C. (1993). Renewing America’s Schools, San Francisco, CA, Jossey-
Bass.
Goertz, M. & Natriello, G. (1999). Court mandated school finance reform: What
do the new dollars buy? In Ladd, H.F., Chalk, R. and Hansen, J.S. (eds.)
Equity and Adequacy in Education Finance: Issues and Perspectives.
National Academy Press: Washington.
Goodlad, J. (1992b). On taking school reform seriously, Phi Delta Kappan, 74(3):
232-38.
Good, T. & Brophy, J. (1997). Looking in classrooms. Addison Wesley
Longman: New York.
Gross, N. Giacquinta, J. & Bernstein, M. (1971). Implementing Organizational
Innovations: A Sociological Analysis of Planned Education Change, New
York, Basic Books.
Guth, G., Holtzman, D., Schneider, S., Carlos, L. et al. (1999). Evaluation of
California’s standards-based accountability system. Research Report by
WestEd for California Department of Education.
Guthrie, J.W. (1993). Do America’s Schools Need a Dow Jones Index? Phi Delta
Kappan 74(1): 523.
Hall, G.E. & Hord, S.M. (1987). Change In Schools: Facilitating The Process, State
University of New York Press: Albany.
Hampton, S. (1999). Standards-Based Classrooms in High Schools: An
Illustration, in the New American High School. Marsh, D. (Ed.) ASCD:
Alexandria.
Hart, A.W. & Murphy, M.J. (1990). New teachers react to redesigned teacher
work. American Journal o f Education, 98: 224-50.
Heyman, G.D., & Dweck, C.S. (1992). Achievement goals and intrinsic
motivation. Motivation and Emotion, 16: 231-247.
Hill, P. & Bonan, J. (1991). Decentralization and Accountability in Public
Education, Santa Monica, CA: Rand.
Hodgkinson, H. (1991). Reform versus reality, Phi Delta Kappan, 72(1): 9-16.
Huberman, M. (1992). Teacher development and instructional mastery, in Hargreaves, A. & Fullan, M. (Eds.) Understanding Teacher Development, New York: Teachers College Press, 122-42.
Janis, I. (1972). Victims of groupthink, Boston, MA, Houghton Mifflin.
Kanter, R.M., Stein, B. & Jick, T. (1992). The challenge o f organizational change,
New York, The Free Press.
Khanna, R., Trousdale, D., Penuel, W., & Kell, J. (1999, April). Supporting data use among administrators: Results from a data planning model. Paper presented at the annual meeting of the American Educational Research Association.
Larson, K, Guidera, A.R., & Smith, N. (1998). Formula for success: A business
leader’s guide to supporting math and science achievement. Office of
Educational Improvement: Washington, D.C.
Leithwood, K. (1992). The move toward transformational leadership, Educational
Leadership, 49(5): 8-12.
Leonard, J.F. (1991). Applying Deming's principles to our schools. Reprinted in An Introduction to Total Quality for Schools. Arlington, VA: AASA.
Levin, H. (1988). Accelerating elementary education for disadvantaged students, in Chief State School Officers (Eds.) School Success for Students at Risk, Orlando, FL: Harcourt, Brace, Jovanovich, 209-25.
Lieberman, A. & McLaughlin, M. (1992). Networks for educational change: Powerful and problematic, Phi Delta Kappan, 75(9): 673-7.
Lipsitz, J., & Mizell, M.H. (1997, May). A manifesto for middle grades reform.
Phi Delta Kappan.
Marsh, D. (1988, April). Key Factors Associated with the Effective
Implementation and Impact of California Educational Reform, A paper
presented at the symposium on examining education reform implementation
in California: the PACE-ACE study at the Annual Meeting of the American
Educational Research Association in New Orleans.
Marsh, D. & Bowman, G. (1989). State-initiated top-down versus bottom-up
reform, Educational Policy, 3(3): 195-216.
Marzano, R.J., Pickering, D.J, & Pollock, J.E. (2001). Classroom Instruction That
Works. ASCD: Alexandria.
McLaughlin, M. (1990). The Rand change agent study revisited: Macro perspectives and micro realities, Educational Researcher, 19(9): 11-16.
McNeil, J. (1996). Curriculum: A comprehensive introduction (5th Ed.). Harper
Collins: New York.
Merrow, J. (2001, May). Understanding Standards. Phi Delta Kappan.
Murphy, J. (1991) Restructuring Schools, New York, NY, Teachers College Press.
National Center for Education Statistics, U.S. Department of Education. (2000).
Highlights from the Third International Mathematics and Science Study-
Repeat (TIMSS-R). Washington, D.C.: US. Government Printing Office.
New Brunswick Department of Education. (1992). Excellence in Education: The
Challenge, St. John, New Brunswick, Department of Education.
Noyce, P., Perda, D., & Traver, R. (2000). Creating data-driven schools.
Educational Leadership, 57(5): 52-56.
Odden, A. (2000). The cost of sustaining educational change through comprehensive school reform. Phi Delta Kappan, 81(6): 433-444.
Odden, A. & Marsh, D. (1988). How comprehensive reform legislation can
improve secondary schools. Phi Delta Kappan, 69(8): 593-598.
Offord, D., Boyle, M. & Racine, Y. (1991, March/April). Children at risk: Schools
reaching out, Education Today, 17-18.
Picus, L. (2000, May). Setting budget priorities. American School Board Journal,
1-7.
Prestine, N. (1993). Feeling the ripples, riding the waves: Making an essential
school, in Murphy, J. and Hallinger, P. (Eds) Restructuring schools:
Learning from Ongoing Efforts, Newbury Park, CA Corwin Press.
Ransom, K., Minnick-Santa, C., Williams, C., Farstrup, A., et al. (1999). High-stakes assessments in reading: A position statement of the International Reading Association. Journal of Adolescent and Adult Literacy, 43(3): 305-312.
Reeves, D. (1998). Making Standards Work. Center for Performance and
Assessment: Denver.
Reeves, D. (2001). 101 Questions and Answers About Standards, Assessment, and
Accountability. Advanced Learning Press: Denver.
Rhodes, L.A. (1990a). Why quality is within our grasp...if we reach. The School Administrator, 31-34.
Rhodes, L.A. (1990b). Beyond our beliefs: Quantum leaps toward quality schools.
The School Administrator, 23-26.
Sarason, S. (1990). The Predictable Failure of Educational Reform, San Francisco,
CA, Jossey-Bass.
Scherer, M. (2001, September). How and why standards can improve student
achievement: A conversation with Robert J. Marzano. Educational
Leadership, 14-18.
Schmoker, M. (1999). Results: The Key to Continuous School Improvement (2nd ed.). ASCD: Alexandria.
Schmoker, M. & Wilson, R.B. (1993). Transforming schools through quality
education. Phi Delta Kappan 74(5): 389-395.
Senge, P. (1990). The Fifth Discipline: The Art and Practice of the Learning
Organization. Doubleday: New York.
Shanker, A. (1990). Staff development and the restructured school, in Joyce, B.
(Ed.) Changing School Culture Through Staff Development, Alexandria,
VA Association for Supervision and Curriculum Development, 91-103.
Shepard, L. (1991). Will national tests improve student learning? Phi Delta Kappan, 72(3): 232-8.
Skrla, L., Scheurich, J., Johnson, J., et al. (2000, Sept.). Equity-Driven Achievement-Focused School Districts: A Report on Systemic School Success in Four Texas School Districts Serving Diverse Populations. University of Texas at Austin.
Slotnik, W.J. & Gratz, D.D. (1999). Guiding improvement: California accountability project. Thrust for Educational Leadership, 28(3):
Thomas, M., & Bainbridge, W. (2001, May). All children can learn: Facts and
fallacies. Phi Delta Kappan, 660-662.
Traub, J. (1999). Better By Design?: A Consumer Guide to Schoolwide Reform.
Thomas B. Fordham Foundation.
U.S. Department of Education. (2000). Highlights from the Third International
Mathematics and Science Study-Repeat (TIMSS-R). Washington, D.C.:
U.S. Government Printing Office.
U.S. Department of Education and Consortium for Policy Research in Education.
(1997). What the Third International Mathematics and Science Study
(TIMSS) Means for Systemic School Improvement.
Willis, S. (1993). Creating ‘Total Quality’ Schools. ASCD Update: 1: 4-5.
Winfield, L. (1990). School competency testing reforms and student achievement:
Exploring a National perspective. Educational Evaluation and Policy
Analysis, 12(2): 157-173.
APPENDICES
APPENDIX 1
Use of Data in School Study
Stages of Concern, Question #1
(Teachers)
Name (optional) _________________________________
In order to identify these data, please give us the last four digits of your Social Security number: _______________
This is a questionnaire about the district's design to use student data to improve student performance. The purpose of this questionnaire is to determine what teachers who are using or thinking about using the district's design to use data to improve student learning are concerned about at various times during the innovation adoption process. A good part of the items on this questionnaire may appear to be of little relevance or irrelevant to you at this time. For the completely irrelevant items, please circle "0" on the scale. Other items will represent those concerns you do have, in varying degrees of intensity, and should be marked higher on the scale.
Please respond to the items in terms of your present concerns, or how you feel about your involvement or potential involvement with the district's design to use data to improve student learning. We do not hold to any one definition of this innovation, so please think of it in terms of your perception of what it involves. Remember to respond to each item in terms of your present concerns about your involvement or potential involvement with the district's design to use data to improve student learning.
Thank you for taking time to complete this questionnaire.
Please circle the number that best reflects your response to each statement based on the following rating scale:
0 1 2 3 4 5 6 7
Irrelevant          Not true for me          Somewhat true for me          Very true for me now
1. I am concerned about student attitudes toward the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
2. I now know of some other approaches that might work better than the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
3. I don't even know what the district's design to use student data to improve student performance is.
0 1 2 3 4 5 6 7
4. I am concerned about not having enough time to organize myself each day because of the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
5. I would like to help other faculty in their use of the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
6. I have a very limited knowledge about the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
7. I would like to know how the implementation of the district's design to use student data to improve student performance would affect my classroom, my position at my school and my future professional status.
0 1 2 3 4 5 6 7
8. I am concerned about conflict between my interests and responsibilities with respect to implementation of the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
9. I am concerned about revising my use of the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
10. I would like to develop working relationships with both our faculty and outside faculty while implementing the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
11. I am concerned about how the district's design to use student data to improve student performance affects students.
0 1 2 3 4 5 6 7
12. I am not concerned about the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
13. I would like to know who will make the decisions in the district's new design to use student data to improve student performance.
0 1 2 3 4 5 6 7
14. I would like to discuss the possibility of using the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
15. I would like to know what resources are available to assist us in implementing the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
16. I am concerned about my inability to manage all that is required by the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
17. I would like to know how my teaching or administration is supposed to change with the implementation of the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
18. I would like to familiarize other departments or people with the progress of this new approach to use the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
19. I am concerned about evaluating my impact on students in relation to the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
20. I would like to revise the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
21. I am completely occupied with other things besides the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
22. I would like to modify our use of the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
23. Although I don't know about the district's design to use student data to improve student performance, I am concerned about aspects of the district's design.
0 1 2 3 4 5 6 7
24. I would like to excite my students about their part in the district's use of student data to improve student performance.
0 1 2 3 4 5 6 7
25. I am concerned about time spent working with nonacademic problems related to the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
26. I would like to know what the use of the district's design to use student data to improve student performance will require in the immediate future.
0 1 2 3 4 5 6 7
27. I would like to coordinate my effort with others to maximize the effects of the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
28. I would like to have more information on time and energy commitments required by the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
29. I would like to know what other faculty are doing in the area of implementing the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
30. At this time, I am not interested in learning about the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
31. I would like to determine how to supplement, enhance, or replace the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
32. I would like to use feedback from students to change the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
33. I would like to know how my role will change when I am using the district's design to use student data to improve student performance.
0 1 2 3 4 5 6 7
34. Coordination of tasks and people in relation to the district's design to use student data to improve student performance is taking too much of my time.
0 1 2 3 4 5 6 7
35. I would like to know how the district's design to use student data to improve student performance is better than what we have now.
0 1 2 3 4 5 6 7
36. I am concerned about how the district's design to use student data to improve student performance affects students.
0 1 2 3 4 5 6 7
APPENDIX 2
Data Study
Teacher Questionnaire
Thank you for agreeing to participate in this study. We ensure complete
confidentiality of your valued contribution. As part of this study, your
responses will provide educators and policy-makers in California with much
needed information. Accordingly, please take time to answer each question
carefully and completely by circling the number that best corresponds to
your view.
We would appreciate it if you would provide the following demographic
data for purposes of the study only. Again, complete confidentiality will be
maintained.
Credential(s)
(Indicate if it is an emergency
credential)
Years of Experience
Years in current position
Grade level(s) currently teaching
Courses Currently Teaching
(By department only)
Gender
Ethnicity
Don't Know = 0     Disagree Strongly = 1     Disagree Somewhat = 2     Agree Somewhat = 3     Agree Strongly = 4
Degree of design implementation of current data practices
1. I am aware of the design for using data.  0 1 2 3 4
2. I use data in my classes on a weekly basis.  0 1 2 3 4
3. I collect data on a weekly basis.  0 1 2 3 4
4. I use data to monitor student progress.  0 1 2 3 4
5. I use data to guide my instruction.  0 1 2 3 4
6. I use data to improve student outcomes.  0 1 2 3 4
7. I collect data on test scores.  0 1 2 3 4
8. I collect data on class participation.  0 1 2 3 4
9. I collect data on student attitudes.  0 1 2 3 4
10. I collect data.  0 1 2 3 4
11. My department head collects data.  0 1 2 3 4
12. I use data to compare the past and present performance of an individual student.  0 1 2 3 4
13. I use data to compare students within my class.  0 1 2 3 4
14. I use data to compare students across the school in the same grade.  0 1 2 3 4
15. Reports are sent to parents on a regular basis (about once a month).  0 1 2 3 4
16. The school completes reports of data implementation for district databases.  0 1 2 3 4
Don't Know     Disagree Strongly     Disagree Somewhat     Agree Somewhat     Agree Strongly
Accountability for data use at district, school, and individual level
24. I think that the state holds the district accountable for data utilization.  0 1 2 3 4
25. I think that the district holds the school accountable for data utilization.  0 1 2 3 4
26. My school holds me accountable for data utilization.  0 1 2 3 4
27. I hold my students accountable for improved performance through the use of data.  0 1 2 3 4
28. My salary is dependent upon utilization of data practices.  0 1 2 3 4
29. My promotion within the school is dependent upon utilization of data practices.  0 1 2 3 4
Improving student achievement through implementation of data use
30. I have seen student achievement visibly improve when data is used as a benchmark for students to reach.  0 1 2 3 4
31. I have seen student achievement visibly improve when I use data to inform my teaching.  0 1 2 3 4
32. Intervention is more easily employed through the utilization of data.  0 1 2 3 4
33. Student motivation increases when data is present in my classroom.  0 1 2 3 4
34. Student motivation increases through the dissemination of data to parents.  0 1 2 3 4
Don't Know     Disagree Strongly     Disagree Somewhat     Agree Somewhat     Agree Strongly
Degree of design implementation of emerging state data practices
17. The school offers frequent professional development to raise awareness of new data practices.  0 1 2 3 4
18. I have attended professional development training in the past six months related to new data practices.  0 1 2 3 4
19. I frequently discuss new data practices with teachers who are about as experienced as I.  0 1 2 3 4
20. I frequently discuss new data practices with teachers who are more or less experienced than I (mentor/mentee format).  0 1 2 3 4
21. I frequently discuss data practices with teachers in different disciplines from mine.  0 1 2 3 4
22. School administrators have assisted me in implementing new data practices.  0 1 2 3 4
23. School administrators have monitored my utilization of new data practices.  0 1 2 3 4
APPENDIX 3
Interview Questions (District Administrator)
1. How are the district and schools focusing on improving Stanford 9 scores and API
ratings?
2. Are the district and school using the California Content Standards, district-designed standards, or a combination of both? What have been the results of your district's strategy concerning the use of standards?
3. Is the district encouraging schools to incorporate authentic assessments into standards-based instruction? If so, what steps has the district taken to help schools link authentic assessment with standards-based instruction?
4. What role does state-of-the-art technology play in the district’s move toward
improving student achievement as measured by STAR results and as the schools
prepare students for the High School Exit Exam (HSEE)?
5. How does the district prepare schools for the emerging state assessments
(California Standards Test— CST, HSEE, Attendance Issues, Drop-out rates)?
6. How does the district monitor interim assessments at the school sites? What types of assessments are used by the district, and how do these assessments correlate to the SAT-9 and CST?
7. Once data have been collected from the schools, what methods are used by the district to analyze these data? How are the analyses shared with the schools?
8. What board rulings (regulations and procedures) directly support the district design for using data?
9. How has empirical evidence been used to support the district's design for the use of data?
10. How has the district financially supported the district's design for the use of data to improve student achievement?
11. What direct evidence is there that points to the fact that the district expects schools to use data to improve student achievement?
12. What data is supplied by the district to schools to help them improve instruction?
13. Does the district use multiple measures to increase performance and guide
instruction? What measures are used?
14. Has the district developed high performance goals, and how were these goals
established?
Interview Questions (Principal)
1. What is the school's commitment to using data to improve student achievement? How long has the school been using data as a tool for improving student achievement?
2. What is the school doing to use data (directly or indirectly) to promote student
learning?
3. Are there any school-wide plans/efforts to improve how data are gathered,
analyzed, and used? Please share these plans.
4. How does the school try to improve administrators' and teachers' ability to use data to increase student performance and guide instruction?
5. How have you tried to raise student performance expectations with teachers and
parents?
6. Would your staff agree that you have led them in school-wide implementation of a standards-based curriculum to improve student achievement? What evidence can you cite to support your position?
7. In what ways does the school regularly inform the students, parents, and community of student performance?
8. How has the district prepared administrators to use data to inform instruction and improve student achievement?
9. Was the effectiveness of instructional programs evaluated? If so, what was the process used in making the evaluation?
10. What roadblocks or challenges did you face in the development of a culture of data use?
11. Does the school (administrators and teachers) see the district as a partner in
developing a culture of data use for improving student achievement?
12. Did the school and district use empirical evidence to support their efforts in using
data to improve student achievement?
13. What mechanisms does the school use to support the teachers in data analysis?
14. How does the school receive data for analysis? What types of data are typically analyzed?
Interview Questions (Teachers)
1. How do teachers analyze data to inform instruction and improve student
achievement?
2. How has the school trained teachers to analyze data to inform their instructional
practices?
3. Are administrators and teachers using standards and rubrics to improve student
performance? How were the rubrics developed and how are the rubrics used?
What do you do to ensure that the standards are taught?
4. After having analyzed the data and developed a strategy for instructional change, how do you assess the effectiveness of the change? How do you assess the effectiveness of current instructional programs?
5. In your opinion, are teachers in this school placing a high priority on improving student achievement? What evidence can you cite to support your conclusion?
6. What are the expectations and goals for achievement for students at this school?
Do teachers truly believe that students in this school can achieve at high levels?
Why or why not?
7. How do teachers in this school inform students, parents, and the community of
student learning performance? How do teachers help parents assist their children
in improving their classroom performance?
8. What interventions has the school implemented to help struggling students achieve the established standards?
9. Are you familiar with any funding that is used to promote the use of data in this school? Is the funding distributed equally across grade levels and curricular areas? Why or why not?
10. What steps did the school administration take to ensure that the use of data became a priority at this school? How has the school gone about the task of lifting the expectations of teachers for the students and themselves?
11. Do you see the district as a partner with the school in using data for the
improvement o f student achievement? What has the district done to promote data
use in this school?
12. How have you felt about the changes that have taken place regarding the use of data to inform instruction? Explain why you feel the way you do.
13. What challenges have you faced in the process of implementing data use in your
school? How were these challenges overcome?
14. How do you keep track of student achievement growth? How do you use this information?