DISTRICT LEVEL PRACTICES IN
DATA DRIVEN DECISION MAKING
by
Hasmik J. Danielian
______________________________________________________
A Dissertation Presented to the
FACULTY OF THE ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
May 2009
Copyright 2009 Hasmik J. Danielian
DEDICATION
I dedicate this dissertation to the love of my life, Arsen, my lovely children,
Alfred and Nyree, as well as my wonderful son-in-law, Anto, for standing by me and
supporting me throughout the doctoral program. Their love, patience, and encouragement gave me the courage to follow my dream of pursuing a doctorate
from USC at the age of 52.
ACKNOWLEDGMENTS
This dissertation would not have been possible without the exceptionally
valuable guidance and support I have received from my exemplary chair, Dr.
Amanda Datnow, the quintessential educator whose knowledge in Data-Driven
Decision Making (DDDM) research and school reform is second-to-none. This
dissertation also has benefited enormously from the critical review of committee members Dr. Dominic Brewer and Dr. Edward Lee Vargas, who helped broaden and deepen my understanding of NCLB and its impact on DDDM. I owe a world of
gratitude to Dr. Vargas, my former superintendent at Hacienda La Puente USD, for
encouraging, supporting, and recommending me to the Ed. D. Program and for
guiding and mentoring me throughout the doctoral program. A word of gratitude
goes to Dr. Michael Escalante, my former superintendent at Glendale, for encouraging me and insisting that I get my doctorate from USC. Special thanks to Dr. Maffi, Dr. Kevin Colaner, Dr. Stu Gothhold, and Dr. Stowe for all their leadership and support in gaining admission to the program. Special thanks to our
class adviser, Nadine Singh, for all her guidance and support.
I would also like to acknowledge all 16 district leaders in the four districts for giving me the opportunity to conduct my qualitative research on their “best practices.”
The lessons I learned on DDDM and leadership are immeasurable and have set the
tone for my career as a future superintendent.
My deepest gratitude also goes to my wonderful editor, Shantanu, for his
wisdom and “magic touch” and for being my angel. I would like to thank all my
friends, colleagues, and classmates, particularly Miriam, my soul sister, for all her
love and support, for our study sessions and design and research dialogues that have
been important to this dissertation’s development and completion. Thanks to my
colleague and friend, Sue Kaiser, for our conversations and her encouragement to stay in the “fight” and complete the journey, and to Arpine, who was by my side from day one.
Last, but not least, I am deeply grateful to my administrative assistant, Nancy
Ruiz, for putting up with me for the past three years. I could not have completed my
dissertation without her patience, encouragement, and moral support.
TABLE OF CONTENTS
Dedication
Acknowledgments
List of Tables
Abstract
Chapter One: Introduction
Chapter Two: Literature Review
Chapter Three: Methodology
Chapter Four: Findings
Chapter Five: Analysis, Summary, Conclusion and Recommendations
Bibliography
Appendix: Interview Protocol
LIST OF TABLES
Table 1: Selection of School Districts
Table 2: Types of Data Used in Each District
Table 3: Summary of Findings on Culture of Data Use
Table 4: Summary of Structures/Practices in Each District
Table 5: Summary of Challenges to DDDM Faced by Each District
ABSTRACT
Forty years after the Elementary and Secondary Education Act, numerous
school reforms have attempted to tackle the same problems initially addressed by the
ESEA. Most recently, the No Child Left Behind (NCLB) Act of 2001, the latest in
the series of educational reform initiatives, distinguishes itself by its unparalleled
focus on accountability.
As mandated by NCLB, policy makers had hoped that publishing
disaggregated data on standardized test scores would result in increased awareness
for educators of existing achievement gaps—in short, create a “culture of inquiry,”
where the data creates an atmosphere that promotes awareness. However, what this
chain of assumptions failed to take into account were the challenges associated with attempting to increase student achievement for all students through data-driven decision making (DDDM). The extant research on DDDM suggests that
conditions at the district level (as well as at the school and state levels) clearly
impact the nature and extent of DDDM that is put in place at a given district.
This qualitative research study adds to the field literature by developing an
increased understanding of DDDM by district leaders in 4 districts. The study
focused specifically on superintendents, assistant superintendents, and assessment
and /or technology office leaders known for their effectiveness in the use of data-
driven decision making to improve and inform instruction that results in increased
student achievement.
Each of the primary sections in this study examines in careful detail the six key themes that were generated by the qualitative data; these six themes are: the No Child Left Behind Act and the impact of this law on data use, the processes of DDDM, the types of data utilized, the culture of data use, the district structures/practices with regard to data, and the challenges to DDDM in each
district.
Consequently, this study will be useful to district leaders who intend to utilize
data-driven decision making to ensure school/district improvement. For districts that
do not currently use data-driven decision making, it is hoped that the findings will be
both inspirational and practical in assisting districts to form a theory of action.
CHAPTER ONE
Introduction
With increased accountability, American schools and the people who work in them
are being asked to do something new: to engage in systematic, continuous
improvement in the quality of the educational experiences of students and to subject
themselves to the discipline of measuring their success by the metric of students’
academic performance. Most people who currently work in public schools weren’t
hired to do this work, nor have they been adequately prepared to do it either by their
professional education or their prior experience in schools.
Richard F. Elmore (2002b, p.8)
Background of the Problem
Since the initiation of a national standards-based reform effort by President George H. W. Bush in Charlottesville, VA, in 1989, the basic principles of standards-
based reform have served as the guiding framework for public education policy at
the federal, state, and local levels. In the spring of 1991 the Bush administration
announced America 2000 with two important but controversial provisions dealing
with standards. One provision called for the development of “voluntary national
standards” in core subjects. Another provision called for states to embrace the
National Education Goals and to adopt voluntary national standards or to develop
their own. While Bush’s defeat in the fall election of 1992 effectively ended
America 2000, the Clinton administration maintained the national focus on standards through its education initiative, Goals 2000 (Owens & Sunderman, 2006).
Spurred by Goals 2000 and the Improving America’s Schools Act of 1994,
the states moved forward in developing their own standards, creating corresponding
assessment systems and aligning their frameworks for curriculum and instruction.
The aim was to establish equitable and unified state systems for identifying problems, meeting the needs of children of color, and directing resources and solutions to those districts in greatest need of help in closing the achievement gap. The alignment
process proved to be far more complex than some initially anticipated and was
somewhat slowed by weakened accountability systems.
The passage of the No Child Left Behind Act (NCLB) was a third step in the
standards-based reform process, tying federal sanctions and awards to schools’
adequate yearly progress. State standards and tests continued to serve as the
foundation for instruction and assessment, but under NCLB accountability became a
federal matter. The law further extended the use of evidence in education by
mandating the use of scientifically based research in NCLB programs, effectively
mandating the use of data in making basic decisions about K-12 education (Kohlmoos, 2005). The Elementary and Secondary Education Act of 1965 (ESEA) was the first major effort by the federal government to address the achievement gap that existed for students of
color by providing resources to meet the needs of educationally disadvantaged
students. Forty years later, there have been numerous school reforms to tackle the
same problem, such as new instructional models, curriculum, assessment and new
leadership structures (Tyack & Cuban, 1995). The No Child Left Behind Act of
2001, the latest in the series of educational reform initiatives, distinguishes itself by
its unparalleled focus on accountability.
Recent educational reforms and NCLB were developed in response to the
widespread perception that students in the United States were not being prepared
adequately in comparison to their peers in other countries (Schmidt, McKnight, &
Raizen, 1997). Therefore, the goal of NCLB, at both the state and federal levels, has
been to draw attention to the discrepancies that exist in the achievement gaps among
specific groups of students based on language, socio-economic status, and/or their
race/ethnic status. The ostensible purpose of NCLB was to hold schools and districts
accountable for providing equity and access for all students through standards-based
instruction and a corresponding assessment program. The two key provisions that set
NCLB apart from earlier education laws were, first, the shift from a focus on inputs to a focus on outcomes and, second, the disaggregation of data by grade level, subject matter, school, and subgroup, thus rendering transparent what is happening inside classrooms (Weiss, 2007).
State-mandated, performance-based accountability systems largely depend on
theories of motivation that value rewards and recognitions. The proponents of such
an approach argue that schools will aspire to demonstrate achievement in order to avoid negative press, retain local autonomy, and avoid sanctions imposed by state and federal authorities (Weinbaum, 2005). The goals of the system—to increase
student achievement for all subgroups—are essentially embedded in pre-defined
content standards that educators must use in order to guide curriculum and
instruction. Correspondingly, standardized tests are used to measure the student’s
attainment of the standards. Improvement on the standardized tests for all students results in rewards to reinforce effective behavior, whereas poor performance results
in sanctions and implementation of strategies that will modify ineffective practice.
Currently, all fifty states have adopted policies that follow a performance-based
system (Goertz, Duffy, & Carlson-LeFloch, 2000). Through NCLB, the federal
government hopes to close the achievement gap by focusing on K-12 education to
ensure that all students, by the year 2014, will become proficient in reading and
mathematics.
NCLB and most state accountability systems share three common features: (1) goals, which establish common expectations for student performance; (2) assessments, a system that objectively measures the achievement of those goals; and (3) incentives, both financial and administrative strategies that motivate students and teachers to increase effectiveness by maximizing effort (Stecher, Hamilton, & Gonzalez, 2003). While federal and state policies have tended to emphasize the
value of standardized achievement test data, they have not necessarily encouraged
the use of multiple sources and types of data nor do they provide capacity building in
data use (Ikemoto & Marsh, 2006). In the next section, I will discuss the need for
data-driven decision making (DDDM) in light of increased accountability.
Accountability and Data-Driven Decision Making
Albert Einstein observed that “[p]roblems cannot be solved at the same level
of awareness that created them.” Policy makers had hoped that publishing
disaggregated data on standardized test scores would result in increased awareness
for educators of extant achievement gaps—in short, create a “culture of inquiry,” where the data create both a desire for awareness and an atmosphere that promotes it. In turn, it was further assumed that such knowledge would spark teachers and administrators to adjust their practices to better serve their students.
However, what this chain of assumptions failed to take into account were the challenges associated with attempting to increase student achievement for all students, such as timely access to data, analysis and use of data,
and opportunities to design and implement effective and targeted strategies to close
the achievement gap. In their study of the impact of NCLB in California, researchers O’Day, Bitter, Kirst, Carnoy, Woody, Buttles, Fuller, and Ruenzel (2004) suggest that the effective use of data to inform pedagogical practice is at best sporadic, since many educators lack the skills, expertise, or capacity to analyze, interpret, and use
data-driven decision making to bring about changes that will result in improvement
of student achievement. Moreover, a study by RAND researchers (Ikemoto, Pane,
& Hamilton, 2006), and numerous other studies (Choppin, 2002; Dembosky, Pane,
Barney, & Christina, 2005; Feldman & Tung, 2001; Mason, 2002) have found that in
the absence of appropriate data and data systems at the district level, educators often
do not have the capacity to formulate questions, analyze results, select benchmarks,
and develop solutions. Thus, the district role is critical in enabling school level
educators to use data.
While NCLB seeks to increase the role that districts play in monitoring
performance and providing technical assistance to the sites, there is limited research
that focuses on the role of the district leaders who effectively use data to assist
schools in improving performance (Datnow, Park, & Wohlstetter, 2007). However,
the important role of the districts in improving instruction has been stressed in the
recent literature on school improvement. Districts do so by providing vision, support,
focus, and policy coordination and by building commitment at the school level
(Corcoran, Fuhrman, & Belcher, 2001; RAND, 2006).
The research on data-driven decision making suggests that conditions at the
district level (as well as at the school and state levels) might have an effect on the
nature of the data-driven decision making. For example, if districts do not provide
technical support to the sites and do not report the results of district assessments in a
timely manner, that may affect the ability of the sites to use data to improve
instruction (Ikemoto & Marsh, 2006). Achieving success for all students requires not
only a systemic approach across the district, but also alignment and consistency
throughout the system (Agullard & Goughnour, 2006). Such consistency begins with
the superintendent and key central office personnel who act on a coherent theory of
action, i.e. a system of beliefs that guides behavior, about how schools should
improve. Research suggests that in districts where there is a more cohesive approach
to district-wide improvement, district office personnel have a clear understanding of
the role that each and every administrator in the district must play and the kind of
structures that are needed to be put in place, in order to drive school and district
improvement (Agullard & Goughnour, 2006).
The report Similar Students, Different Results: Why Do Some Schools Do Better? (EdSource, 2005) addresses practices associated with the high performance of
schools in California, including data-driven decision making by the district and the
principal in order to improve instruction and student performance. According to the
principals surveyed, the district role should be to expect that the schools within it
will strive to improve student achievement, provide support for site-level planning,
and evaluate the principals based on the improvement of student achievement results.
In addition, the report suggested that district leadership, accountability and support
appear to influence student achievement. District leadership sets clear and high expectations that schools meet the Academic Performance Index (API) and Adequate Yearly Progress (AYP) growth targets for all subgroups, provides the needed resources, including human and financial resources, and uses student achievement data to evaluate teacher and principal performance.
Although there is a small but growing body of research on how districts
support data use, several issues deserve more investigation, including (1) data quality (Thorn, Meyer, & Gamoran, 2007); (2) building capacity for educators to engage in “quality conversations around data” (Datnow et al., 2007, p. 69; Ikemoto et al., 2006); (3) managing and prioritizing data (Datnow et al., 2007); (4) sustaining a culture of improvement through the use of DDDM (Datnow et al., 2007); (5) school districts’ use or lack of use of different types of data, such as process data, perception data, and input data (Ikemoto et al., 2006); and (6) factors hindering the effective use of data (Ikemoto et al., 2006). As a result, district leaders can play a key role in helping build
capacity for the schools to use data-driven decision making (Datnow, Park, &
Wohlstetter, 2007). My study hopes to fill some of the gaps identified in the
research by focusing on the role of district leaders in using DDDM to drive school and district improvement. The next section will address the research questions.
Research Questions
This qualitative research study intends to add to the field literature and
develop an increased understanding of data-driven decision making by district
leaders in 4 school districts. The study focused on the actions of school district
leaders, specifically superintendents, assistant superintendents, assessment office and
technology leaders known for their effectiveness in the use of data-driven decision
making to improve and inform instruction that results in increased student
achievement. These districts were chosen on the basis of their commitment to equity
and their leadership in the use of DDDM to narrow the achievement gap over the
past several years. They were selected based on their work in the following areas: a) individual student data; b) technology infrastructure; c) professional development for data use and statistics; d) use of benchmark assessments; e) commitment to DDDM; and f) demographics, including the percentage of socioeconomically disadvantaged students (40% and higher), students receiving free and reduced-price lunch (44% and higher), Hispanic students (from 21% to 82%), and English Language Learners (12% to 46%). Although these school systems vary in size and history, they all have histories
of consistently improving student achievement since the advent of NCLB. All four
districts are composed primarily of schools in urban locations that serve large
numbers of low income students and students of color.
Specifically, this study addressed the following overarching question:
How do district leaders use Data-Driven Decision Making (DDDM) to drive
district/school improvement?
Aligned with the question stated above, this research study will also address
the following sub-questions:
a) What forms of data do districts rely upon most?
b) How do district leaders cultivate a culture of data use?
c) What structures/policies do district leaders put in place to assist with data
use?
In sum, I will be addressing the gaps in research on DDDM that include
district leaders’ use of the types and quality of data and the structures they put in
place in order to build capacity for educators to cultivate and support a “culture of
inquiry” that results in continuous improvement of instructional practices and
student performance.
Significance of the Study
This qualitative case study research on best practices aimed to develop an
understanding of data-driven decision making by district leaders to ensure school
improvement. The study focused on leaders in 4 districts and the structures that they
have put in place to build capacity for the schools to engage in a continuous cycle of improvement (Datnow et al., 2007). While there is some research in this area, the extant field literature is still limited; consequently, this study aimed to add to the
body of literature on the ways that district leaders effectively use data-driven
decision making. The study provides practical applications to using data-driven
decision making based on experiences of leaders in four school districts. The study
should be useful to district leaders who will utilize data-driven decision making to
ensure school/district improvement. For districts that do not currently use data-driven
decision making, it is hoped that the findings will be both inspirational and practical in assisting districts to form a theory of action.
Researcher’s Subjectivity
While engaged in this research, although I did not study my own district, my
own biases may still come into play. During the study, I tried to be cognizant of my
own subjectivity that stems from my experiences as an Associate Superintendent at a
K-12 school district. I work in a district that uses data to improve practices.
However, while state data are presented to the sites and interim assessment results
are provided to the schools, the administrators and teachers have not been trained in
data-driven decision making and do not seem to have the capacity to formulate
questions, analyze results, select benchmarks, and develop solutions (Choppin, 2002;
Dembosky, Pane, Barney, & Christina, 2005; Feldman & Tung, 2001; Mason, 2002).
My intent in choosing this topic was to gain a broader understanding of the best
practices on the role of the district leaders in supporting the sites to promote and
cultivate a data-driven decision making culture where it becomes an integral part of
the organizational routine to ensure equity in closing the achievement gap
(McLaughlin & Talbert, 2002; Bainbridge & Lasley, 2002). By reviewing the
literature and gathering empirical data on this topic, I not only hope to broaden my
knowledge of best practices in DDDM by district leaders, but also hope that the
lessons learned will be useful to researchers, educators and policy makers who are
interested in improving student performance and closing the achievement gap.
CHAPTER TWO
Literature Review
Introduction
The purpose of this literature review is to serve as a foundation for the study
by examining research on district level practices in Data Driven Decision Making
(DDDM) and to identify barriers and facilitators associated with data use. More
specifically, this review will include the following sections:
1. A review of process and context of DDDM through the lens of various
conceptual frameworks.
2. A review of data use and evidence-based practices by district leaders.
3. Facilitators and challenges to DDDM.
I will first present background information by discussing the effects of accountability and the specific impact of No Child Left Behind on DDDM.
Accountability and Impact of NCLB on Data-Driven Decision Making
The literature on strategic efforts to drive school and district improvement is
focused on the role of data to develop, guide, and sustain organizational change that
leads to improvements in student learning (Massell, 1998). Under NCLB, a new
catchphrase, “We are data-driven” has developed and is used by all levels of the
school system, from the central office to the classrooms (Marsh, Pane, & Hamilton,
2006, p.1). According to these researchers, Data-Driven Decision Making (DDDM)
refers to the systematic collection of multiple forms of data from a variety of
resources with the goal of improving performance at all levels of the organization.
Research findings suggest that inquiry-based DDDM is much more powerful when
multiple sources of data are used (Choppin, 2002; Feldman & Tung, 2001; Ingram et
al., 2004; Lachat, 2001).
Historically, decisions have been based on intuition and the implicit
knowledge of the district leaders in the context of politics, instead of data (Slavin,
2002). NCLB has created an unparalleled focus on accountability through the use of
data. The state and federal government are focused on evidence-based results
(Fullan, 2000) vis-à-vis data. Assessment and testing have shifted from being decision-making devices about students to being a vigorous force for holding schools
accountable for results (Firestone et al., 1998).
What does analyzing data mean? Various researchers have indicated that it
means different things to different people (Kerr, Marsh, Ikemoto, Darilek, & Barney,
2006; Marsh et al., 2005). Some use the disaggregated state data to identify areas of
weakness to develop targeted interventions, and some choose to gather and analyze
data from multiple sources in order to get to the root causes of the gaps and work
with outside consultants to receive support with interpreting data (Ikemoto & Marsh,
2006).
Effective use of data by district leaders and school personnel has been
identified as a key factor in school improvement processes (Chrispeels, 1992; Earl &
Katz, 2002; Protheroe, 2001; Wayman & Stringfield, 2003). Johnson (1996, 2002)
examined many different uses of data to build school and district capacity to
equitably educate the students and to close the achievement gap. Marzano (2003)
suggests that when schools and districts try to use data, they often use the wrong type
of data, using indirect measures of learning for which they have no model to
interpret.
Research suggests that there are two types of motivation in meeting the
accountability mandates (Deci and Ryan, 1985, 1987): (1) extrinsic and (2) intrinsic
motivation. For example, these researchers suggest that large scale accountability
systems like state testing and reporting provide external motivation for change for
school leaders. Therefore, when the pressure is gone, the associated behavior disappears. In addition, they contend that although accountability creates a sense of urgency, it should serve a larger purpose than merely identifying “culprits” (Earl, 2000, p. 8). The
challenge, for the leaders, then becomes to use data, not as a mandatory activity to
measure and monitor progress, but to improve practice (Earl, 2000). Earl and Katz
(2002) propose the use of data to inform decisions regarding current performance
and to devise plans for improvement.
According to numerous researchers (Fullan & Stiegelbauer, 1991; Schmoker,
1996), although initial interest was on data for accountability purposes, the debate
around measurement-driven instruction in the 1980s was an early attempt to use
assessment data to improve instructional decisions (Popham, Cruse, Rankin,
Sandifer, & Williams, 1985; Shepard, 1991). Recently, the education community has
once again become interested in DDDM, because a number of schools and districts
have the capacity to process and disseminate data in an efficient and timely manner
(Ackley, 2001; Thorn, 2002). NCLB has accelerated this trend by compelling the
leaders to use data to improve student performance (Hamilton, Stecher, & Klein,
2002).
The No Child Left Behind Act of 2001 holds states, districts and schools
accountable for student achievement. NCLB requires regular assessments to mark
progress and highlight weaknesses in core academic subjects. These assessment
results must be reported in aggregate as well as in disaggregated form for individual subgroups of students, including by low-income status, disability status, and race or ethnicity (U.S. Department of Education; Weiss, 2007). For example, the results of
the California Standards Tests are reported to the public and schools are held
accountable for making Adequate Yearly Progress (AYP) and meeting their Academic Performance Index (API) targets. One of the consequences of NCLB is that district
administrators are being asked to think very differently about decision making and to
use data for everything from instructional practices to resource allocation
(Mandinach, Honey, & Light, 2006). In reporting findings from a larger CPRE study
of 22 districts in eight states over two years, Massell (2000) emphasized that districts are progressively giving more attention to (1) interpreting and using data, (2) building teacher knowledge and skills, (3) aligning curriculum and instruction, and (4) targeting interventions at low-performing students and/or schools (p. 1). Massell
(2001) also identified challenges that the districts face when building capacity in all
these areas. The challenges included training teachers on using data to improve their
instructional practices. This study, however, did not address the effectiveness of the
central office strategies in improving student achievement. Meanwhile, a recent
study by Kerr et al. (2003) suggests that DDDM can have a positive effect on student
achievement. In addition, literature on effective schools includes many citations on
extensive use of DDDM as a common feature of high-performing school districts or districts “beating the odds” (e.g., Council of Great City Schools, 2002; Snipes, Doolittle, & Herlihy, 2002). Additionally, many studies link DDDM to changes in school culture
and teacher practices that result in improved student performance (Kerr et al., 2006).
Clearly, more information is needed in this area of research.
Policy Context for DDDM
“Situation” or “context” has been used as a generic term for everything from
classroom teachers’ experiences to student background. In addition, the word has
been used differently by different researchers (Spillane & Miele, 2007). Some use it
to imply how stakeholders make sense of new information in their environment.
Others go deeper into arguing that “context” is more than a backdrop for making
sense and it is the element of human practice combined with organizational practices
and norms (Greeno, 1998). Starbuck & Milliken (1988) argue that sense making
involves filtering stimuli through the lens of humans’ existing knowledge
frameworks.
Prior to NCLB, researchers often criticized educational leaders for making
decisions based on instinct, intuition, or trends (Slavin, 2002). However, NCLB
introduced new accountability systems that have mandated the use of data to
identify strengths and weaknesses and to assist in evaluation of program
effectiveness (Mason, 2002). Mason further states that because quantifiable measures are needed to evaluate the effectiveness of practices, the use of data-driven decision making has become crucial.
A study by RAND (Marsh, Pane, & Hamilton, 2006) suggests that high
stakes may help stimulate DDDM. Schools and districts with long standing state
accountability systems are more involved in data-driven decision making than those
with a more recent history for accountability. In addition, according to these
researchers, NCLB compels leaders to use data in their daily practice. Whether or not
leaders are convinced of the potential benefits of data, the need to use data for decision-making purposes is growing (Marsh, Pane, & Hamilton, 2006). In a
data-driven decision making world, leaders are in the position of “learning to live
with data and like it” (Earl & Katz, 2002, p. 2). In this context, data-driven leadership may serve as a bridge for building staff capacity (Knapp, Swinnerton, Copland, & Monpas-Huber, 2006). In other words, NCLB has set the tone
for collaborative DDDM and serves as an extrinsic motivation to drive school/district
improvement (Deci & Ryan, 1985, 1987).
However, the RAND research on the DDDM process suggests that because different decisions are made at different levels of the system, by classroom teachers, administrators, district leaders, and state and federal agencies, the context in which decisions are made at each of these levels might have an effect on
the nature of the data-driven decision making. For example, if districts do not
provide technical support to the sites and report the results of district assessments,
that may affect the ability of the sites to use data to improve instruction (Ikemoto &
Marsh, 2006).
In sum, after two decades of national movement towards standards-based reform and greater accountability, NCLB will inevitably have a significant impact on
how district offices collect data and interpret results in their pursuit of improved
student achievement (Linn, Baker, & Betebenner, 2002). After all, as suggested by
Earl and Katz (2002), in order for data to have a lasting effect on school improvement,
the motivation for its use must be intrinsic and the practice of using DDDM must be
embedded in the organizational culture. Inherent in NCLB is the assumption that educators know how to analyze, interpret, and use data to make informed decisions in all aspects of education, from student learning to professional development. However, the law fails to provide the capacity building needed to enhance data use. In the next section, I will review the literature on DDDM.
Types of Data and Uses
Earl and Katz (2002) define data as summaries of information about
something of interest that have been collected through the use of qualitative or
quantitative methods. They believe that when we measure abstract concepts such as
core values and beliefs, the results help us make some assumptions that give us some
insight into the issues at hand. Therefore, data are then nothing more than symbols
that are communicated in words or numbers (Earl & Katz, 2002). In order to make
sense of data and turn the information into meaningful action, data need to be
organized, placed in context, analyzed, interpreted, and synthesized (Senge et al., 1999). Spillane and Miele (2007) suggest that no single piece of information, such
as poor test results, entails a single conclusion, such as poor instruction or low
quality of teaching staff. They suggest that this is mostly due to the fact that
information usually is interpreted in the context of an individual’s or organization’s
values and beliefs (Cronbach et al., 1980; Knapp, Swinnerton, Copland, & Monpas-
Huber, 2006) and that becomes the filter for shaping the information and influencing
the evidence. In other words, data need to be organized and be put in a context that
makes most sense to the users within the organization.
Using data wisely entails more than simply gathering data and converting
them into numbers. Data need to be analyzed and interpreted into information that
will result in actionable knowledge (Earl & Katz, 2002). The raw data may be used
as tools for engaging in various inquiries, including indicators that may offer
“warnings and hints” about the organization’s performance (Celio & Harvey, 2005)
such as student achievement gaps, student attendance, student retention/promotion,
teacher retention, and funding equity.
According to Heritage and Yeagley (2003), there are five types of data that
are likely to be available to the districts, schools and teachers to drive school
improvement:
• Large-scale achievement tests: Standardized tests that measure student achievement of the standards on the state tests. Lapp, Fisher, Flood, and Cabello (2001) suggest that the main reasons for these kinds of assessments are to diagnose student progress, inform instruction, evaluate programs, and provide accountability information.
• Benchmark assessments: May be developed by the districts or sites and
are used multiple times to measure student progress toward mastery of the
standards.
• Formative assessments: Used by teachers to inform adjustments in
instruction. Marzano (2007) suggests that by looking at teachers’ pre-post
tests and common interim assessments, we know how students are
learning. Teacher teams can use assessment data to guide their
instructional interventions (Ainsworth & Viegut, 2006; Langer, Colton, &
Goff, 2003; Fisher & Frey, 2007).
• Grading: Researchers have found that while grading is the most common strategy to measure student learning, grades are highly subjective and unlikely to be valid indicators of student achievement (Marzano, 2000, 2007; Heritage & Yeagley, 2005; Reeves, 2004).
• Going beyond grading: Multiple measures such as attendance, behavior and citizenship, perception or survey data, and demographic data can be used to deepen educators’ understanding of the factors affecting student achievement (Heritage & Yeagley, 2005).
Like the recent study conducted by Heritage and Yeagley (2005), research conducted at the Education Development Center’s Center for Children and Technology (EDC/CCT) found that, for the most part, school administrators use data from standardized tests to understand general trends, such as the strengths and weaknesses in student performance, and to allocate resources for intervention programs. In contrast, teachers use multiple measures such as tests, grades, homework, and projects to inform their thinking regarding student learning (Brunner, Fasca, Heinze, Honey, Light, Mandinach, & Wexler, 2005;
Honey, Brunner, Light, Kim, McDermott, Heinze, Breiter, & Mandinach, 2002;
Light, Wexler, & Heinze, 2004).
Consistent with the above findings, Marsh, Pane, & Hamilton (2006) suggest
that certain types of data are more likely to be used for making decisions than other
types of data. For example, superintendents and principals use test data to identify
gaps and allocate resources for the intervention programs. However, in the same
study, more than 80 percent of superintendents in California, Georgia, and
Pennsylvania found results from local assessments to be more useful for decision
making than state test results. In another example, as part of developing School Improvement Plans (SIPs), one school district invested heavily in computer-based
programs and trained its staff to analyze data. In addition, districts used data to
monitor schools, teachers, and students in order to measure progress. No matter
what kinds of data are used, educators analyze and interpret data to make decisions
and to develop action plans (Marsh, Pane, & Hamilton, 2006).
Mandinach, Honey, and Light (2006) also contend that multiple forms of data must be used in order to make data-driven decision making meaningful. The
data that might be used are as follows:
• Input data, such as demographics and expenditures
• Process data, such as data on financial operations or the quality of
instruction
• Outcome data, such as drop-out rates or student test scores
• Satisfaction data, such as surveys and opinions from stakeholders
However, raw data by themselves do not mean anything unless they are organized,
analyzed, and interpreted within the context in which they are presented in order to
produce the information needed to turn into actionable knowledge. This means that
the leaders use this knowledge to prioritize their actions and analyze the options.
Decisions that are made using this framework fall into two categories: decisions that use data to identify goals or needs, and decisions that require using data to take action (RAND, 2006). Once decisions to act have been made, new data are collected to measure the effectiveness of the actions, leading to a continuous cycle of data collection, evaluation, and synthesis through feedback loops.
Similarly, Light et al. (2004) note that raw data by themselves do not mean anything; their meaning depends upon the knowledge of the individuals looking at the data. They further state that individuals looking at data must have a context for the data so that they can use that information to understand, organize, analyze, and interpret the data before they can turn the information into actionable knowledge. This framework identifies six cognitive skills for using data-driven decision making, grouped into three levels:
• Collecting and organizing at the data level
• Analyzing and summarizing at the information level
• Synthesizing and prioritizing at the knowledge level
The researchers, however, caution us that regardless of the depth and breadth
of data, the collected information needs to be summarized (Mandinach et al., 2006)
and narrowed down before it can be turned into actionable knowledge. In addition,
the same researchers contend that different decisions are made depending upon the
individuals and the positions they hold. For example, based on data, a
superintendent’s decision to allocate more funds to a school that is in dire need of
interventions may be very different from a principal’s decision that focuses on one
curriculum based on teacher and student responses. Moreover, a teacher may decide
that addressing gaps in student achievement may be more important than getting
involved in another district or school initiative.
On the other hand, the conceptual framework used by Datnow et al. (2007)
reflects the need to examine the relationships between the state, federal, and local
contexts as well as between different levels within a district such as schools,
classrooms, and district office. Their goal in using this framework was to examine
how effective the collaborative efforts between schools and the districts were in
building staff competence in the use of data.
In sum, data can guide school and district improvement in numerous ways,
e.g., by providing performance goals and quantifiable measures. Regardless of the
type of data used, data analysis should be a collaborative process (Schmoker, 1996;
Hannaway, 1989; Kennedy, 1982a, 1982b; Spillane et al., 2002) focused on clear,
attainable, and measurable outcomes. However, data use does not take place in a
vacuum. Rather, contextual factors may impact the capacity and motivation of data
use for decision making (Spillane & Miele, 2005). In the next section, I will review
the literature on the process of DDDM.
The Process of Data-Driven Decision Making
DDDM in education is not a new practice and it has been in existence for
around forty years. Different terms have been used to refer to this process, such as measurement-driven instruction in the 1980s (Popham, 1987; Popham, Cruse, Rankin, Sandifer, & Williams, 1985) and strategic planning in the 1980s and 1990s (Schmoker, 2004). In fact, education has adopted the concept of DDDM from industry approaches such as Total Quality Management (TQM), organizational learning, and continuous improvement, which focus on the use of multiple forms of data from multiple sources (Marsh, Pane, & Hamilton, 2006) to drive continuous improvement (RAND, 2006).
Several researchers (for example, Copland, 2003; Halverson, Grigg, Prichett, & Thomas, 2005) have used the term “inquiry-focused” data-driven decision making as a means of continuous organizational improvement and learning (Feldman & Tung, 2001). This process starts with a question; data are then consulted and other evidence is presented to answer the question. This process differs from using data to make discrete decisions, such as the introduction of new services or intervention programs (Murnane, Sharkey, & Boudett, 2005), in that it builds an understanding that improves the quality of services.
Regardless of the process of data analysis, educators have referred to this process as data-driven decision making. In the next section, I will review the literature on evidence-based practices by district leaders.
Factors Shaping the Role of Evidence Use by Districts
Evidence “is information selected from the available stock and introduced at a specific point in the argument in order to persuade a particular audience of the truth or falsity of a statement…” (Majone, 1989, p. 11). To state this differently, the implications that a set of facts holds for a problem are never self-evident unless we confirm or deny the evidence. In other words, when constructing evidence, the problem is identified first and then data are collected.
Three factors seem to influence the use of evidence by district leaders:
political pressures, organizational capacity, and public policy. Coburn, Honig, and Stein (in press) look at the overall role that evidence plays in how district office administrators use evidence to make decisions. The researchers identify four roles: an instrumental role, a conceptual role, a symbolic role, and no role (Bickel & Cooley,
1985; Weiss & Bucuvalas, 1980; Weiss, Murphy-Graham, & Birkeland, 2005). Carol
Weiss and her colleagues (Weiss et al., 2005) argue that there is a fifth kind of role
for evidence that has come about as a result of NCLB and that is the sanctioning role.
Organizational capacity plays a key role in the use of evidence by district
leaders (Coburn, Honig, & Stein, in press). However, according to these scholars,
there is limited research that suggests how districts foster the development of
capacity. These scholars have stated that in their case studies of school districts,
those districts that seem to use data throughout the system, including the district
office, use professional development on data use (for example, Snipes, Doolittle, &
Herlihy, 2002). However, the types of professional development and any subsequent
and corresponding evaluations are not specified in their studies.
Moreover, NCLB requires a shift from collecting data for compliance purposes (a symbolic role) to collecting data to inform decision making (an instrumental role) (David, 1981). Use of data and evidence also requires research
literacy and skills at data analysis and interpretation (Corcoran et al., 2001; Mac Iver
& Farley, 2003). Unfortunately, many district leaders lack these capacities,
hampering their abilities to use evidence in decision making (Burch & Thiem, 2004;
Corcoran et al., 2001; David, 1981; Honig, 2003, 2004; Mac Iver & Farley, 2003;
Reichardt, 2000). In addition, district leaders are more likely to use evidence in
instrumental or conceptual ways when the culture is conducive to data use (Corcoran
et al., 2001; Honig, 2002; Massell, 2001; Roberts & Smith, 1982). On the other hand, in districts where data use is not embedded in the norms and culture, evidence plays little role in decision making (Corcoran et al., 2001).
Last, but not least, public policy is playing a major role in the use of data and
research-based evidence in the districts (Coburn et al., in press). However, these
researchers suggest that the use of “scientifically-based” research does not
necessarily lead to the use of research or data in instrumental or conceptual ways.
Examples include, but are not limited to, the studies done by David (1981) on Title I program evaluations. David found that these requirements increased the number of evaluations done by district office personnel but did not increase the degree to which the evaluations actually resulted in program improvement.
Coburn and her colleagues (in preparation) found that increased policy pressures to
use research in decision making led to increased symbolic use rather than increased
instrumental use of social science research.
The district office can play a central role in helping build capacity for schools
in the use of data for educational decision making (Datnow et al., 2007). Districts
that have high performing schools make decisions based on data, not intuition
(Supovitz & Taylor, 2003; Togneri, 2003). A study done by Armstrong and Anthes (2001) associates effective data use with strong leadership and a district-wide culture that supports continuous improvement.
Kerr et al. (2003) conducted a thorough study in three urban districts and found that the factors affecting data use were as follows:
• Accessibility and timeliness of data
• Perception of data validity
• Training and support for teachers in regard to data analysis and interpretation
• The alignment of data strategies with other instructional initiatives
• Distributed leadership
This study suggests that the effective use of data may depend upon several factors, such as strong leadership, a proactive (up-front) approach to planning for data collection and use, and staff capacity for data-driven decision making. Similarly, Datnow et al. (2007) found that DDDM was used in school systems to build
capacity in schools and districts (Colby, Smith, & Shelton, 2005).
In Earl and Katz’s study (2003), outcomes associated with district actions to
promote DDDM were stronger in two out of the three districts that had invested
more time and energy in DDDM. In addition, in this study, school staff
reported that they used multiple forms of data from multiple sources to execute
instructional planning and decision making. Teachers also repeatedly used their
planning time to review assessment results and other data to design targeted
interventions. In addition, the accessibility of data in user-friendly formats further
facilitated data use for decision making by the sites. Moreover, principals in the districts who spent a minimum of five hours a week on data analysis with their staff made more extensive use of data to identify gaps in student achievement and to
develop action plans. Last, but not least, these districts provided technical assistance
to support schools in interpreting assessment results.
In this study, all three districts partnered with outside providers that aided the districts in building the capacity of school and district staff to drive instructional
improvement. One of the most revealing findings in this study was the lack of
capacity of districts to support and strengthen school level efforts in DDDM. This is
in addition to the sites’ limited skills.
In summary, under NCLB, the use of DDDM is becoming more prevalent in districts. Therefore, districts can take actions to promote and encourage the use of data in making decisions to drive school/district improvement. In addition, public policy fosters instrumental and conceptual use of evidence only if it helps build the capacity of district leaders to use data and research in this manner (Kerr et al., under review; Massell, 2001). If districts develop a culture where data use becomes instrumental in driving improvement, then the use of data and evidence does not become uneven, superficial, or symbolic (Coburn, Toure, & Yamashita, in preparation; David, 1981; Massell,
2001). Although numerous studies have documented that data are being used at all levels of a school and district system and serve a multitude of purposes, from goal setting to identifying low-performing students, aligning instruction to standards, and improving staff development (e.g., Bernhardt, 2003; Choppin, 2002; Feldman & Tung, 2001; Mason, 2002; Supovitz & Klein, 2003), the implementation is not always very successful. Other factors, such as effective leadership, a proactive approach to data collection and use, and capacity building for DDDM, need to be present in order to enable and facilitate a culture of inquiry. The next section
will review the literature on the role of district leaders and will address district-level
practice for supporting DDDM.
Role of District Leaders in DDDM
In this section, I will describe the role of district leaders in data-driven decision making. Twenty years ago, Murphy and Hallinger (1988) identified
distinctive characteristics of high performing districts in California. Their study
found that strong instructional leadership from the superintendent, a strong focus on
curriculum and instruction, consistency and alignment of instructional activities, as
well as focused monitoring of curriculum and instruction are instructionally effective
strategies to drive school district improvement. Other, more recent studies of districts identified as high performing, conducted in Texas (Ragland, Asera, & Johnson, 1999; Skrla, Scheurich, & Johnson, 2000) and North Carolina (Public Schools of North Carolina, 2000), echoed themes similar to those Murphy and Hallinger found, with special emphasis on the role of district leaders in improving student performance.
Similarly, the Consortium for Policy Research in Education (CPRE)
examined the roles that central office leaders play in shaping and supporting reforms in three large urban districts located in three different states, with enrollments ranging from 50,000 to 200,000. Literature on school improvement has
stressed the key role that the district office can play in improving instruction by
providing focus, coherence, and consistency and support to the sites (Corcoran,
Fuhrman, & Belcher, 2001). Agullard and Goughnour (2006) echo the same findings
by noting that central office personnel such as the Superintendent and other staff
members working on instruction espouse a theory of action, based on their beliefs, to
improve schools and student performance. Embedded in this theory of action is the
change process both at the sites and at the district level. These researchers suggested
that in order for the central office to effectively support the sites in improving student achievement results, there needs to be a consistent, coherent, and systemic approach among the leaders.
Several comparative district studies have also been identified and reviewed by Mac Iver and Farley (2003) to examine the processes that central office leaders use specifically to encourage DDDM (Armstrong & Anthes, 2001; Massell, 2000, 2001; Snipes, Doolittle, & Herlihy, 2002). Massell (2000, 2001) found a significant discrepancy in how districts are supporting data use to improve student performance and suggested that the variation was due more to administrators’ beliefs about the utility of DDDM than to the high-stakes accountability context. Also, the
need for continued support to help school leaders to analyze and interpret data has
been emphasized by various researchers (Massell, 2001; Armstrong & Anthes, 2001; Snipes et al., 2002). Research has also found that, unfortunately, too many districts and their leaders do not align their goals and strategies with the school sites; as a result, many competing agendas and initiatives go on at the same time, exhausting resources (Agullard & Goughnour, 2006).
Unlike the above researchers who focused on within-district activities,
Goertz’s (2001) analysis focused on how differences in state accountability systems
influence the district role. Although districts in states that had high-stakes
accountability systems put structures in place to help the schools focus on student achievement, this did not guarantee any changes in student achievement. The need for
districts to build human capacity among principals and teachers again surfaced in
this study as a critical factor.
Consistent with these findings, several research studies (e.g., Mac Iver & Balfanz, 2000; BASRC, 2000) have pointed out that some school districts are not capable of meeting schools’ needs for DDDM support because they lack the human resources to do so. Since the information flow does not generally
come in a manageable format, district administrators tend to make things simpler
(Hannaway, 1989; Honig, 2003; Spillane, 2000) by focusing on the pieces that they
feel comfortable with and that are aligned to their previous knowledge and beliefs
(Hannaway, 1989; Honig, 2003) before they make any decisions. In addition, Honig
(2003, 2004), in her study of school-community partnerships in Oakland, found that
district administrators were able to use evidence for decision making when they
converted complicated challenges into distinct and familiar actions (Honig, 2003, 2004). Similarly, Weick (1995) notes that it is “productive to look at
the filters people invoke, why they invoke them and what those filters include and
exclude” (p. 57). How districts filter information depends upon their focus, prior
knowledge, projects, and available time. Stated simply, the belief system and prior
experiences and knowledge of the district administrators frame the DDDM process.
Capacity building efforts are often needed to fully enable leaders to engage in
successful data use practices (Datnow et al., 2007).
In conclusion, effective use of data by district leaders has become a central
tenet in school/district improvement processes (Chrispeels, 1992; Earl & Katz,
2002; Protheroe, 2001; Wayman & Stringfield, 2003), not only to raise test scores
(Kennedy, 2003) but also to change school cultures and teacher attitudes (Feldman
& Tung, 2001), especially toward low-performing, at-risk students, and has become
a major force in building school and district capacity to educate students equitably
and reduce achievement gaps. Marzano (2003) suggests that "leadership could be the
single most important aspect of school reform" (p. 172). Numerous studies have also stressed the
importance of the role of the leaders in DDDM (Supovitz & Klein, 2003; Young,
2006; Lachat & Smith, 2005; Datnow et al., 2007; Kerr et al., 2006; Wayman, 2005).
Wayman (2005) has also suggested the importance of establishing norms and
conditions for data use. Datnow et al. (2007) suggest that leaders must make the use
of data “non-negotiable” and daily use of data must become part of the culture of the
leaders of organizations (Earl & Katz, 2002).
Creating a Culture of Inquiry and Continuous Improvement
As data become key in improving organizations, school and district leaders
must become active participants in creating data-rich environments to make
decisions (Earl, 1998). Earl and Katz (2003) suggest building capacity for schools
through the development of a culture of inquiry and development of data literacy, as
well as inquiry as a habit of mind. According to Senge (1990), a learning
organization is one that is “continually expanding its capacity to create its future”.
This is not a straightforward process; rather, it is an iterative process of "thinking in
circles" (O'Connor & McDermott, 1997) that results in decisions, actions, and
feedback that drive the process. Therefore, leaders must develop an "inquir[ing]
habit of mind” and treat data as one of many sources of information for decision
making (Earl & Katz, 2002).
According to research, increasing numbers of organizations are beginning to
use DDDM as they plan and implement programs (Knapp et al., 2006). In the literature,
these instances appear in research on "reforming districts" (McLaughlin & Talbert,
2002); schools and districts engaged in a "cycle of inquiry" (Copland, 2003);
schools responding to accountability systems (Spillane et al., 2002; Lemons,
Luschei, & Siskin, 2003); and schools in the middle of "self-renewal" (Streifer, 2002;
Portin, Beck, Knapp, & Murphy, 2003). In all these cases, leaders build capacity to
promote a culture that encourages inquiry into critical issues facing the organization.
Such practices and collaboration create trust, reduce the anxiety of asking and
answering questions about performance and practice (Copland, 2003), and
ultimately support collective learning (Scribner, Cockrell, Cockrell, & Valentine,
1999).
Data-use strategies that involve school staff in collaborative problem solving
encourage open dialogue that is needed to address the equity issues (Love, 2000). In
addition, inquiry-based schools and districts promote a culture of high standards and
the use of appropriate assessments for improving student learning (Rallis &
MacMullen, 2000).
Findings from a study by Knapp et al. (2006) suggest that since multiple
actors are involved in the inquiry, the data-driven decision making process becomes
distributed. Other researchers (Elmore, 2000; Spillane, 2006) suggest that in
distributed leadership, leaders find ways to motivate and sustain inquiry into
problems through inviting other players to participate in framing questions,
organizing data, and interpreting results for making decisions. Datnow et al. (2007),
as well as Feldman and Tung (2003), found that inquiry-based DDDM not only
resulted in improved student achievement, but it also led to a more professional
culture where teachers became more reflective and engaged in professional dialogue.
Wayman et al. (2005) also support the idea of inquiry-based dialogue in order to
develop a common language to define expectations and goals. This process helps
ensure teacher buy-in for data use (Choppin, 2002; Johnson, 1997; NEA Foundation,
2003). The emerging literature suggests that inquiry-based DDDM, and the better
decision making it fosters, should be treated as a means of capacity building for school
improvement (Copland, 2003). In the next section, I will review the literature on the
facilitators and challenges to DDDM use.
Facilitators and Challenges to DDDM
A review of the literature revealed that districts that believed in collaboration
and openness and used professional learning communities to create a culture of
collaboration (Chen, Heritage, & Lee, 2005; Holcom, 2001; Keeney, 1998; Lachat &
Smith, 2005; Symonds, 2003) greatly contributed to a culture of data-driven decision
making, whereas districts that believed instruction to be a private endeavor
constrained the inquiry process. Other studies have found that school leaders who are
able to effectively use data-driven decision making are committed to data use in their
schools and districts (Choppin, 2002; Copland, 2003; Feldman & Tung, 2001;
Herman & Gribbons, 2001; Lachat & Smith, 2005; Mason, 2002). In addition,
organizational cultures that viewed accountability as a useful vehicle to improve
practices, rather than a threat to teacher performance, encouraged complex data-
driven decision making.
The Education Commission of the States researchers (Armstrong & Anthes,
2001) conducted a study of six districts in five states that had used data to improve
student achievement. They found that a district culture focused on "service" to
principals and teachers in using student data, coupled with structural mechanisms
for training and assessment, greatly contributed to continuous improvement.
Similarly, Cawelti and Protheroe's (2001) study of six high-poverty school districts
(Sacramento, Houston, two smaller districts from Texas, one from Idaho, and one
from West Virginia), all of which made systemic improvements that increased student
achievement, found that focusing teachers on the use of research-based instructional
practices and aligning the curriculum with assessment, as well as decentralizing
budgeting and management at the building level greatly facilitated data-driven
decision making.
Multiple studies conducted in California (Murphy & Hallinger, 1988),
Texas (Ragland, Asera, & Johnson, 1999; Skrla, Scheurich, & Johnson, 2000), and
North Carolina (Public Schools of North Carolina, 2000) have found that in
relatively high improving or high performing districts the following characteristics
were evident:
• A climate of urgency for improving student achievement
• A shared sense of responsibility and “service” by central office staff to
the sites for improving student achievement
• A major focus on aligning resources to improve curriculum, instruction,
and assessment
• Special emphasis on staff development linked to research-based practices
for teachers
• Professional development for principals in analyzing and interpreting data
to improve instruction.
In the next sections, I will address the facilitators to DDDM.
Facilitators to DDDM. According to the American Association of School
Administrators (2002), schools that want to use data for school improvement often
do not know where to begin and what type of data to use. Whatever the level of
support or the model adopted by the districts to support the sites, educators need to
do several things in preparation for data inquiry (Heritage & Yeagley, 2005). First,
they have to take an inventory of data sources and locations in order to
determine which data will be useful for analysis and will be imported into the data
pool (Wayman et al., 2004). Another task is to make sure that data are “clean” and
accurate. Lastly, they need to identify the resources for any data cleaning and for
importing data into the system.
A new and improved model for professional development has been
developed by the Center for Research on Evaluation, Standards, and Student Testing
(CRESST) at UCLA, which helps improve educators' skills in using data more
effectively. Most recent models of staff development focus on a cycle of inquiry and
formulating research questions, as well as the use of protocols and action labs that
utilize real-life scenarios and challenges rather than hypothetical issues (Chen et al.,
2005; Copland, 2003; Love, 2004; Murnane et al., 2005). In addition, the RAND study
(2006) suggests that although the most common form of support for data-driven
decision making is training on how to use test data, the content and quality of these
trainings vary. The Edison schools in this study consistently focused on training
principals and teachers to interpret and translate data in order to turn it into
actionable knowledge. Most of Edison's professional development focused on
formulating questions and on interpreting and using data from multiple measures to
answer those questions.
Other researchers have cited the need for more data-system technology, as
well as the need to present data in formats that are meaningful to school leaders and
teachers (Rudner & Boston, 2003; Schwartz, 2002; Streifer, 2002; Thorn, 2001).
However, although technology may be available, school districts often lack the funds
or do not allocate the resources necessary to establish coherent and high-level data
system capability (Olson, 2002). Last, but not least, partnerships with external
organizations such as universities and consultants may be needed to provide
technical assistance and the needed resources (Feldman & Tung, 2001; Lachat, 2001;
Coburn, Honig, & Stein, in press).
Wayman and Stringfield (2006, p. 569) described leaders who used a "non-
threatening triangulation of data" approach, relying on multiple measures to ensure
that teachers felt supported and empowered by data instead of being threatened; in
these schools, principals were able to help teachers use data instead of being used by
data. Leaders who have succeeded in making data use non-threatening, and who have
helped their staffs view it as an attempt to drive school improvement, have been able
to remove barriers to data use (Choppin, 2002; Lachat, 2001).
Challenges to Using Evidence and DDDM. Numerous studies suggest that
lack of data use at schools is due to lack of training, cultural resistance, and fear of
retaliation. Not many administrators and teachers have had formal training or
experience analyzing and interpreting data or using assessment results for program
and instructional improvement (Bernhardt, 2000b; Cizek, 2000). Most schools do not
provide teachers with the ongoing, sustained training that they need to ask the right
questions in analyzing and interpreting data (Protheroe, 2001). Limited time to
gather, analyze, and interpret data is a recurring theme in multiple studies regarding
the DDDM process (Feldman & Tung, 2001; Ingram et al., 2004). In contrast, when
administrators make data use a priority by devoting time to it during staff
development, the process of DDDM moves forward. The concern is that this open
dialogue may be perceived as threatening or frightening because of conditions
beyond the educators' control. For
example, when the researchers talk about the educators’ fears of using data, they
mean that there is mistrust with data and that “data are being used against them and
their colleagues”, especially in light of high stakes accountability (Heritage &
Yeagley, 2005).
Even in districts where extensive data are maintained through linking
multiple types of data on student performance, as well as demographic data and
student educational experiences, the technology to integrate and manipulate different
types of data is often lacking (Wayman, Stringfield, & Yakimowski, 2004). In their
analysis of available technology, Wayman et al. (2004) recommend that schools
move towards the data-warehousing capability to support the analysis and use of
data.
Earl and Katz (2002), in studying data use by leaders in England and Canada,
reported that the leaders were concerned about not understanding data. While they
were positive about how data helped them to focus and help shape their thinking in
making decisions, translating data into useable knowledge and communicating it to
the stakeholders remained a challenge for them. Moreover, they admitted that they
had no training in collecting or summarizing data, and that data displays such as
bar charts and graphs often failed to provide them with the insight needed to
improve their practices. Earl (1995) suggested that we are "statistically illiterate" and
therefore have no idea what the numbers mean (p. 27). Schmoker (2003), Lachat
(2002), the National Foundation for the Improvement of Education (NFIE, 2003),
Rudner and Boston (2003), Schwartz (2002), Streifer (2002), and Thorn (2001) echo
this finding, stating that while there is an abundance of data, they are not always the
right kind of data, nor are they in a format that assists with understanding.
Unfortunately, district leaders, as well as school staffs, often lack the skills
for making effective use of data (Baker, 2003; Choppin, 2002; Cizek, 2000;
Cromney, 2000). A study done by the National Center for Research on Evaluation,
Standards, and Student Testing (CRESST) at UCLA on the implementation of a data
analysis tool, the Quality School Portfolio (QSP), reported that most site and district
administrators with advanced degrees were not experienced in data analysis (Chen,
Heritage, Danish, Choi, & Lee, 2003). These researchers suggest that engaging in a
strategic process that involves skills for defining questions, interpreting data,
turning that information into action, and evaluating the action can provide
the framework for data-driven decision making. However, while data are valuable in
informing decisions, common sense and thoughtful analysis of data are also needed.
Data can inform human decisions, but should not and cannot replace the
decision-making process (Jamentz, 2001; Secada, 2001). To
make intelligent decisions regarding data, it is important for district leaders to know
the difference between good data and bad data (Heritage & Yeagley, 2005).
Not only are district leaders often struggling with data literacy, numerous
studies have also found that educators often do not have the capacity to formulate
questions, analyze results, select benchmarks, and develop solutions (Choppin, 2002;
Dembosky, Pane, Barney, & Christina, 2005; Feldman & Tung, 2001; Mason, 2002).
The district office can play a major role in helping schools develop the needed skills,
as well as the capacity to use data effectively. Mason (2002) emphasizes that
although technology can help build capacity, it needs to be coupled with teacher
willingness and capacity to use data. Teachers need to learn how to obtain and
manage data and ask good questions, to accurately analyze data and apply data
results appropriately (Lachat and Smith, 2005).
Additional challenges to DDDM have been lack of staff buy-in and
organizational barriers (Feldman & Tung, 2001; Herman & Gribbons 2001; Ingram
et al., 2004). In addition, because teachers hold different views on data use, they
approach DDDM in different ways. For example, some teachers did not place a high
value on state assessments and instead developed their own rubrics for measuring
student achievement in order to get a more accurate picture of student learning. Also,
teachers who felt that their instruction was powerless to affect student learning were
more likely to avoid data use (Ingram et al., 2004).
Political pressures and context also pose challenges to DDDM. Corcoran,
Fuhrman, and Belcher (2001), in their study of three large urban districts in three
different states where enrollment ranged from 50,000 to 200,000, found that districts
have been challenged for not sticking with one reform long enough in order to see
results. Changes in leadership, funding, and state and federal mandates are some of
the barriers that have obstructed their progress. Some critics have even argued that
the political and bureaucratic nature of the decisions makes them inherently incapable
of promoting and sustaining meaningful reforms.
Similarly, Agullard and Goughnour (2006) in their longitudinal study of three
districts found that the superintendent’s role is fluid and influenced by social and
political pressures. For example, in light of budget cuts, the role of the
superintendent may change from instruction to going outward to deal with legislative
decisions. Thus, even as districts make progress in using DDDM, political and
social pressures from different stakeholders may get in the way. In summary,
district practices and the roles that district leaders play are crucial for
building a school's capacity for data-driven decision making (Massell, 2001).
Summary of the Literature Review
The review of the literature suggests that despite emerging data-informed
leadership at the district level, there is insufficient evidence about the capacity of
such activities to drive school and district improvement (Honig & Coburn, 2005;
Kerr et al., 2006). The evidence that is available comes from small numbers of
"high performing" schools or districts where there are well-established routines for
DDDM. The studies reviewed above strongly suggest that districts that promote a
DDDM environment do so by relentlessly pursuing attempts to create a culture of
inquiry. Others, studying the same phenomena, have come to the same conclusions
regarding the central role that leadership brings into play (for example, Kerr et al.,
2006; Supovitz & Klein, 2003; Mason, 2002). These researchers contend that the
absence of this strong advocacy for data use will hamper a culture of inquiry.
In sum, the literature on the role of district office leaders in driving
school/district improvement suggests that, despite the associated challenges,
district office leaders support the sites and build capacity for DDDM by:
• Providing leadership and support for curriculum and instructional
practices
• Training and supporting principals and teachers to become “data-literate”
• Providing administrative support to promote good teaching
• Developing a culture of inquiry where improving student performance
becomes the shared responsibility of every staff member at the district
office and the schools
• Focusing on instructional practice and allocation of resources to research-
based and coherent professional development
• Providing laser focus on data analysis and alignment of curriculum,
assessment and practices
• Establishing focused, coherent reform efforts aligned to the site initiatives
Using the literature review as the backdrop for my research, I hoped to
add to the body of knowledge on best practices in DDDM use by district leaders
by focusing on actions of district leaders to build capacity at the school level to
collect, analyze, and interpret data in a way that facilitates instructional improvement
and improved student achievement. Most prior studies are focused on district
policies and practices more generally, and less on the specific activities of district
leaders in supporting DDDM. This study sought to fill this gap.
CHAPTER THREE
Methodology
The purpose of this study was to research how district leaders effectively use
data driven decision making in order to drive school/district improvement. This
chapter describes the methodology used in this qualitative study by describing the
design, instrumentation, sample, data collection and data analysis. The study focused
on the actions of school district leaders, specifically superintendents, assistant
superintendents, as well as other district leaders, who are known for their commitment
to equity and their effectiveness in using data-driven decision making to improve
and inform instruction in ways that result in improved student achievement.
Specifically, this study addressed the following overarching question: How
do district leaders effectively use Data-Driven Decision Making (DDDM) to drive
district/school improvement?
In conducting this research, the following sub-questions were addressed:
a) What forms of data do districts rely upon most?
b) How do district leaders cultivate a culture of data use?
c) What structures/policies do district leaders put in place to assist with data
use?
A qualitative, descriptive case study of four districts was used to portray data-
driven decision making to drive instructional improvement. The districts were
selected based upon their commitment to equity and reputation in being leaders in
data-driven decision making which led to improved student performance over time.
The University of Southern California's Institutional Review Board (IRB)
informed the researcher that this particular research did not warrant IRB approval,
since it did not involve research on "human subjects." As a result, the
researcher was able to conduct visits to the district offices between July and
September of 2008. The interview protocol included 35 questions. A total of sixteen
district administrators were contacted, and 13 face-to-face interviews were
conducted; one interview was conducted on the phone, and two interviews were
conducted via e-mail. The interviewees included four superintendents, five assistant
superintendents (two elementary and three secondary), three directors of assessment
and technology, one director of assessment, technology and staff development, one
coordinator of assessment and technology, one director of secondary schools, and
one high school principal. Dexter (1970) has summarized when to use interviewing:
“Interviewing is the preferred tactic of data collection when… it will get better data
or more data or data at less cost than other tactics!” (p. 11). The most common form
of interview is the person-to-person encounter in which one person elicits
information from another (Merriam, 1998). The main purpose of the interview was
to find out what is “in and on someone else’s mind” (Patton, 1990, p. 278). All
interviews were tape recorded and transcribed and data were coded. Lastly, district
documents relevant to the study, such as minutes of meetings, superintendents’
presentations to the board of education and their annual reports, professional
development plans and calendars, districts’ vision, mission, policies and procedures,
district priorities, organizational charts, strategic plans, data reports, surveys, needs
assessments, district reform documents, and protocols to guide data dialogues were
examined. The results of the interviews, along with document review, and review of
notes were triangulated in order to analyze and report the results.
Case Study Method
“A qualitative case study is an intensive, holistic description and analysis of a
single instance, phenomenon, or social unit” (Merriam, 1988, p. 21). Smith’s (1978)
notion of the case as a “bounded system” comes closest to Merriam’s (1988)
definition. As Yin (1994) observes, case study is a design particularly suited to
situations in which it is impossible to separate the phenomenon's variables from their
context. The researcher selected this qualitative case study design because of the
nature of the research problem and the questions that were asked to capture the
detailed actions of district leaders who use data-driven decision making to improve
student performance. The case study approach allowed for an in-depth examination
of the Data Driven Decision Making process and the results yielded provided a
thick and rich description of the key role that district leaders play and the structures
that they put in place to support data use at schools. The researcher’s goal was to
offer insights and illuminate meanings that expand the educators’ understanding of
DDDM.
Sample and Population
This qualitative study focused on four school districts: three K-12 districts
and one high school district. As noted above, the criterion for selection of the
districts was the reputation of the district leaders for regularly and effectively
collecting and analyzing data and engaging in a culture of inquiry to promote
DDDM. Using purposeful sampling (Patton, 1990), the districts were selected based
on increasing test scores and meeting state and federal targets, i.e., the Academic
Performance Index (API), for the past three years (2006, 2007, and 2008), and also
because the selected districts attributed part of their success to using DDDM to
improve student achievement. This is referred to as a unique sample (Merriam, 1998),
meaning that the cases have rare or atypical attributes or occurrences.
The districts were chosen from multiple resources, such as the list of Schools
Moving Up and from other research studies such as those conducted by Springboard
Schools. In addition, the rubric shown in Table 1 was used to select the school
districts. The list was then narrowed down based on phone calls or e-mails to district
leaders and external consultants who had worked with those districts. The researcher
made contact with administrators in a few districts, including the Legend Unified
School District (K-12), White Union High School District (9-12), Worland Unified
School District (K-12), and Grovenor Unified School District (K-12). These districts
have varied student populations, some of them quite urban and diverse; all have had
successes in raising student achievement and in using data to inform decisions.
Table 1: Selection of School Districts

Criterion                                     Legend    Worland   White     Grovenor
                                              (K-12)    (K-12)    (9-12)    (K-12)
API: 2008                                     818       764       728       778
API: 2007                                     807       757       703       767
API: 2006                                     803       757       694       756
Percentage of ELL                             24.6%     34.4%     12.1%     45.7%
Percentage of Special Ed                      .087%     1%        .094%     .92%
Percentage of Free and Reduced Lunch          41.3%     48.8%     47.6%     65.1%
Socioeconomically disadvantaged               44%       62%       50.8%     65.5%
Individual Level Data                         Yes       Yes       Yes       Yes
Technology infrastructure                     Yes       Yes       Yes       Yes
Training/Professional Development             Yes       Yes       Yes       Yes
Use of Performance Indicators/Benchmarks      Yes       Yes       Yes       Yes
Commitment to DDDM                            Yes       Yes       Yes       Yes
Value added of DDDM as part of
principal evaluation                          Yes       Yes       Yes       Yes
Data Collection Procedures
Data collection began with semi-structured interviews of the district
administrators, conducted either in pairs or individually, and in one case by phone
and in another over e-mail, based on the preference of those district leaders. These
interviews focused on the process of DDDM and its history and current practices.
Each interview lasted anywhere from 45 minutes to sometimes two hours and was
tape recorded. The interviews focused on the actions of the district leaders
including, but not limited to, Superintendents, Assistant Superintendents, and those
in the research and evaluation office and technology departments to promote DDDM
and the practices that the district is engaged in to improve student performance.
Appendix A includes a list of questions in the interview protocol.
Document analysis was used as the second type of data. The documents
examined included superintendents' presentations to the board of education and
their annual reports, professional development plans and calendars, districts' vision,
mission, policies and procedures, district priorities, organizational charts, strategic
plans, data reports, surveys, needs assessments, district reform documents, and
protocols to guide data dialogues, as well as what the district offices used as
evidence, such as interim assessment data, CST scores, CELDT data, CAHSEE
results, AP results, Vital Signs, writing benchmarks, etc. Copies of these documents
were obtained, coded, and catalogued. The results of the interviews, along with the
document review and review of notes, were triangulated in order to analyze and
report the results.
Data Analysis Procedures
Essentially, all qualitative data analysis is content analysis in that it is the
content of interviews, field notes, and documents that is analyzed (Merriam, 1998).
The process involves the simultaneous coding of raw data and the construction of
categories that capture relevant characteristics of the document’s content (Altheide,
1987). All interviews were transcribed verbatim. Data were analyzed by coding
transcribed interviews, notes, and documents looking for common themes and cross
referencing findings. The data were first coded according to the general research
questions and conceptual frameworks guiding this study. Then, more detailed codes
were developed to help themes emerge. The researcher did not use any software in
sorting the themes; this was done manually. Charts and tables were used to assist the
data analysis process. Since data were gathered in a variety of ways, such as
interviews, documents, and notes, the researcher triangulated the findings from those sources.
Ethical Considerations
Data collection and interviewing in qualitative research entail their own
challenges. Stake (1994) states, "qualitative researchers are guests in the private
spaces of the world. Their manners should be good and their codes of ethics strict”
(p.244). Throughout this research, ethical considerations were adhered to strictly.
Names of participants were kept confidential. Prior to all interviews, consent forms
were obtained. In addition, all data gathered from the interviews, tapes, phone
conversations, or e-mails as well as transcripts and other documents are kept
confidential unless consent was given by the district leaders to disclose such
information. Thus, the report of findings does not include any names, and the
confidentiality of the participants was preserved. At all times during the data
gathering and analysis, the researcher complied with the University of Southern
California’s Institutional Review Board procedures, as well as procedures from the
four districts under study. Specifically, the researcher complied with all the
guidelines required for ethical conduct in research.
Limitations of the Study
The case studies were conducted at the selected school districts over a period
of two and a half months. One limitation is the small sample size of the research;
as a result, the findings may not be generalizable. Certain factors unique
to the districts may impact the transferability of this study to other districts. The
study gathered qualitative data from four Southern California school districts that
have been identified as districts that effectively use data to improve student
performance. The samples and units of analysis were purposefully selected in the
hopes of adding more to the field research concerning DDDM.
CHAPTER FOUR
Findings
In order to best describe how district leaders use data-driven decision making
to drive school/district improvement, this chapter is divided into six sections. Each
section addresses one of six key themes reflected in the qualitative data gathered in
this study: the No Child Left Behind Act (NCLB) and its impact on data use, the
process of DDDM, types of data, the culture of data use, district
structures/practices, and challenges to DDDM.
NCLB and Its Impact on Data Use serves as the backdrop, describing whether
NCLB and state accountability systems have influenced each district's student
performance goals.
The Process of DDDM describes how districts use data to form a theory of
action. Types of Data addresses research question #1 and describes the diverse array
of data that district leaders use.
The Culture of Data Use addresses research question #2 and shows the
impact of governance team’s/boards’ expectations in creating a culture that can result
in a cycle of continuous improvement and inquiry.
The subsequent section, District Structures/Practices, addresses research
question #3 and describes the practices and structures that district leaders have put in
place to facilitate data use, including but not limited to the use of technology and
other resources.
Finally, the last section addresses the challenges associated with DDDM.
Introduction: NCLB and Its Impact on Data Use
At the core of the federal No Child Left Behind Act of 2001 (NCLB) are the
relatively straightforward, but sometimes controversial, accountability provisions
aimed at ensuring that all schools and districts meet state standards by
2014. When the law was enacted, researchers and state education officials projected
that a high percentage of schools would fail to meet the law’s tough accountability
provisions, creating a crisis in public education and overwhelming the capacity of
state education agencies to help low performing schools to close the achievement
gap. Others believed that the law would be changed before the full effect of the
requirements were felt, reflecting the view that the requirements as written were not
sustainable and that the federal government would relax its enforcement and
Congress would amend the law in response to state and local preferences.
District leaders who were interviewed for this research project shared
different views of NCLB and its impact on closing the achievement gap through the
increased use of data driven decision making, as the following quotes reveal:
“NCLB has increased the depth of looking longitudinally at groups of
students and asking the right questions,” stated the superintendent in Worland
district. On the other hand, the Superintendent of the White district expressed the
following idea, “NCLB has created a laser-like focus on data.” Correspondingly, a
district leader from the Legend district made the following comment, “NCLB is not a
perfect system. However, our superintendent has stated publicly in front of the world
that 60% of our Latino students are not being successful on the Mathematics and
English Content Standards Test. When you are admitting 60% failure publicly, I
think that’s pretty awesome that you are willing to go out on the limb to make a
statement like that. This is the impact of NCLB.”
This research study revealed that all district leaders interviewed indicated that
NCLB had been influential in improving student achievement for all students by
creating a sense of urgency to use data and provide targeted interventions
particularly for specific subgroups (as indicated by the data). The assistant
superintendent of educational services at the Legend district stated, “NCLB I would
say has improved education through California, probably, especially with the special
education students, probably far more than any other subgroup. It’s raised the bar for
those kids and so I think that’s a huge positive, is it unrealistic, yes.” The
superintendent in the White district commented that despite all the progress which
had been initiated by, and made under NCLB, some schools and districts might still
go into “program improvement” status and that in turn causes undue stress.
However, the majority of district leaders indicated that they did not feel NCLB had
imposed additional burdens, because different structures, including assessments and
measurable targets, had already been
put in place, along with high expectations for student achievement, prior to the state
accountability system and NCLB. One administrator from the Grovenor district
stated that in their district “DDDM was in place ten years ago before NCLB.”
Correspondingly, the superintendent from the Worland district mentioned that the
standards-based movement in the 90’s had had a significant impact on what they are
currently doing as far as using data to improve instruction. The superintendent from
Grovenor indicated that while NCLB was seen as a positive framework through
which to measure success, meeting her district's own performance goals and targets
was considered far more important than meeting the state or federal targets. This
superintendent boldly stated, “By just looking at numbers, we lose perspective that
we work with the students and kids aren’t just numbers.”
In summary, all district leaders clearly felt that the high-stakes accountability
system created by NCLB had created a sense of urgency for data use for all districts
by disaggregating data for sub groups and providing a laser-like focus on data to
develop targeted interventions.
In the next section, I will discuss the findings on the four districts’ theories
and plans of action for Data-Driven Decision Making (DDDM).
The Theory and Plans of Action for Using Data
This section first addresses each of the four districts’ theories of action for
collecting, analyzing and using data and then I compare and contrast the specific
strategies utilized by the districts. Despite variations in districts’ theory and plans of
action for data use, all four districts were similar in their emphasis on building
capacity for data driven decision making (DDDM) by directing significant resources
to the sites, as well as restructuring principals’ and leadership teams’ meetings to
analyze data and use data to shape and improve instructional delivery. Although
there were no set district policies regarding data use in any of the four districts, I
found strong indications that the actions/practices of district leaders were grounded
in discussions of student achievement data in order to improve instructional practices
and student performance. As I will explain, the Legend district has formed its theory
of action based on state standardized test data through a salient and urgent message
from the superintendent and subsequently by the sites, whereas the White district has
used data from multiple measures to develop its plan of action based on “Rigor,
Relevance and Relationships”, as well as through a pyramid of interventions. On the
other hand, the Grovenor district has spent several years in a lengthy
collaborative process to establish two student performance goals that remain the
same through 2014, whereby every student must become proficient in Mathematics and
English Language Arts standards. Finally, the Worland district has
created a strategic plan that has formed the basis of its theory of action for DDDM to
improve and increase student performance. The first sub-section below addresses
the Grovenor District’s theory of action for DDDM in more detail.
Grovenor District
At the Grovenor district, while there is no set model or policy for data
collection and analysis, district leaders communicate high expectations for the use of
data to improve instruction, and they reinforce the importance of data use through
data warehousing, which gives the sites easy and timely access to data and provides
a variety of reports on student achievement. Getting results is a top priority for the
Board of Education. Data are reviewed continuously throughout the year to provide
direction and gain insight regarding the program strengths and weaknesses. The
superintendent stated, “DDDM has become a catchphrase. I am trying to stay away
from all those words, because everybody uses them. It’s whether or not you really
manifest that behavior, not just talk about it. Our theory of action is just do it.”
Data are an integral part of everything that the schools and the district leaders
do. As a result, schools are expected to examine all student achievement data and
make curriculum, strategy, and student placement decisions based on data. The
district sets clear goals and expectations, establishes benchmarks, monitors the
results by reviewing data, and modifies practices/instruction accordingly. The superintendent
commented, “I go out and sub once in a while. It reminds you, hey, this isn’t that
easy. Teaching is really hard and we can come up with these grandiose ideas and
systems, but it doesn’t go very deep because if it isn’t, if you can’t do it yourself in a
classroom as a teacher, why are you asking them to do it.” As a result of stakeholder
groups engaging in conversations around what the district needs to hold itself
accountable for, two district performance goals were developed that are as follows:
1) Each student who is at Far Below Basic (FBB), Below Basic (BB), Basic, or
Proficient will move up a band on the California Content Standards Test (CST), and
2) All English Language Learner (ELL) students in addition to meeting performance
goal #1, will show progress by moving to the next level on CELDT, a test that is
administered to ELL students to measure progress in meeting the standards for
language proficiency. The district leadership team is made up of cabinet members
and K-12 division heads, who spend approximately twenty hours a month on DDDM. In
addition, DDDM is used at the principal, Advancement Via Individual
Determination (AVID) and counselor meetings in order to improve practices.
The process of data collection begins in late July when the CST data are
released and it continues throughout the year following each quarterly benchmark
assessment. Schools have weekly collaboration time and they use data to engage in
continuous inquiry to improve student performance. For example, schools use a-g
(University of California requirements) completion rates to provide greater access
and support for students, and the district looks at commonalities and patterns in
student performance results, such as students who remain at the Basic level on the
California Standards Test while staying at the same level on the California English
Language Development Test (CELDT); district leaders then look for
correlation between the two. Administrators have access to data from other sites as
well. Schools meet in feeder patterns to analyze the data from elementary to high
school. Data are used to monitor progress, evaluate the interventions, and improve
instruction. District leaders meet with the principals to review data, discuss
implications, and establish goals. A Single Plan for Student Achievement is used to
develop action plans by using the “trailing data” (Datnow et al., 2007). The district
provides data protocols, rubrics, and monitoring checklists for supporting schools to
analyze the benchmarks and engage in “quality conversations” (Datnow et al., 2007).
The superintendent stated that, “Everyone is on the same page. Getting results is a
priority for the Board of Education. While we put structures in place for data use, we
don’t micromanage. Real change comes from teachers re-teaching or differentiating.
Data are to be used at teacher level to be effective.”
High school Advanced Placement courses are audited in order to ensure
alignment to the College Board standards. Student achievement data from these
courses are used to analyze student placement practices. This means disaggregating
student demographic data to determine equity and universal access to AP courses. As
a result of these kinds of data analyses, AP placement matrices are developed and
multiple measures are used instead of just teacher judgment and beliefs about
student achievement based on race. For example, after a thorough analysis of
placement practices, the district leaders learned that Asian students who are at the
Basic level were being placed in the honors program, whereas Latino students who
are Proficient were being placed in lower-level courses. The district looks at outcome data to predict
which students will be successful or not in order to provide support for teachers. “We
do an autopsy. We work backwards to see what needs to be placed now for the at-risk
8th graders who are in FBB or BB on the CST," stated a district-level
administrator. The superintendent commented, “We have removed artificial barriers
by placing students in courses that they can become successful.”
The district makes data accessible to teachers and principals at all sites.
District and site-based teams analyze the data. Data are used to ensure that students
are placed in the appropriate classes. The district trains teachers and administrators in
the use of data and helps provide a clear focus, common language and common goals
so that everyone is on the same page. Development of student performance goals is a
testimony to such a commitment. In addition, the district expects that schools will
provide multiple opportunities for students to retake benchmarks to show the mastery
of standards.
In summary, the Grovenor district’s theory of action is engagement of all
staff in a cycle of continuous inquiry to improve student performance. As the
superintendent stated, “Everyone understands what we are doing and why we are
here.” This is accomplished through providing professional development in data
analysis, restructured principals’ meetings, collaboration and articulation with feeder
patterns, creation of common language and common goals (i.e. district’s
performance goals), course audits, analysis of placement practices in different
courses, establishment of placement matrices, as well as data warehousing and
regularly held weekly meetings of district leaders focused on DDDM.
The Grovenor and White districts are very similar in their theory of action to
improve instructional practices; however, the process that each district goes through
to execute its plans of action differs. The main difference lies in the fact
that although the White District does not have officially established student
performance goals like the Grovenor district, it uses data from multiple sources,
draws its own conclusions and then develops plans of action for the district to
improve student achievement results.
In the next section, I will discuss the theory of action of the White District in
more detail.
White District
As discussed above, the White district’s theory of action is “Rigor, Relevance
and Relationships” at all levels, as well as the implementation of the Pyramid of
Intervention, a tiered intervention process to support student achievement. One top
district administrator stated, “John Wooden [the famed UCLA basketball coach]
never talked about winning. It wasn’t about we are going to win the game. It was
about we are going to improve; we are going to do the things we need to do; we are
going to focus on getting better and performing our role and the results will take care
of themselves.” Unlike the Grovenor district, the White district plants the seed for
any initiative and works through it slowly. There are measurable targets and data are
presented in user-friendly formats and are accessible across all sites. There is
transparency with data use. Data teams look at multiple sources of data, develop their
own conclusions and then develop their plans of action. Like the Grovenor district,
the White district gives full autonomy to the sites to implement programs as long as
they can support the effectiveness through DDDM. Analysis of data is best done,
according to the district leaders, in the context of each school community. Research-
based best practices dinner meetings are held four times a year for course leads,
which is another venue for engaging in the cycle of continuous inquiry to improve
student performance.
The district leaders avoid providing too much data in order to prevent analysis
paralysis for the staff. Instead, data are presented in user-friendly formats to
facilitate use. “Rigor, Relevance, and Relationships”, coupled with a pyramid of
intervention form the basis of the district’s theory of action best represented by the
phrase: “Whatever it Takes.” The White district administrator of the Assessment and
Technology Department stated, “We kind of know where we want to go, but gently
massage it in the background, kind of throw the spark into the tenders and down the
flame and then kind of do the fire break so that it goes along the right way.”
Improving student achievement is the Board of Education's priority. The district
education services administrators, along with the superintendent, cabinet and
principals meet weekly to review and analyze data, establish measurable targets and
share common assessments and resources. Due to the unique nature of this district
(the fact that it is a high school district), and due to the increased focus on improving
Algebra I results, articulation with the five feeder middle schools occurs: the
White district pays for substitutes to release the feeder school teachers so they can
meet with the high schools to align practices.
The interviews with district administrators, as well as document review of all
data reports provided to the schools, indicated that data are presented in user-friendly
format and that the district avoids an excessive data flow, which may be as bad as
not providing enough data. This means that instead of overwhelming teachers with
too much data without giving some guidance and training on how to make sense of
that data, the district leaders make every attempt to encourage and facilitate data use
for teachers that will be most helpful in supporting the teachers to improve their
practices. However, in this district, the leaders stated that an initial fund of data
always leads to examining additional, related data and subsequent questions. The
district also keeps track of the targets and student performance by using DDDM to
improve practices. School teams look at multiple sources and types of data, develop
their own conclusions and establish needed plan(s) of action. They analyze data in
context of their schools and by departments. The district and sites use data to
develop interventions, programs, or modify bell schedules. It has developed
measurable targets for A-G (University of California requirements), API (Academic
Performance Index), AYP (Adequate Yearly Progress), 4 year college, On-target
with graduation (graduation credit requirements), CAHSEE (California High School
Exit Exam) and more. Each site is implementing common assessments and a
pyramid of intervention; they always share resources. Additionally, there is
transparency with use and sharing of data, and the district is building capacity
through course leads. In summary, like the Grovenor district, the White district uses
the cycle of continuous inquiry to engage in quality conversations that lead to
improved student performance. This is done through dinner meetings to share
research-based best practices on DDDM, through transparency with data, weekly
meetings with principals and cabinet, professional development on data analysis, and
building capacity through course leads, as well as yearly articulation with five
middle schools from neighboring districts that feed into the White District. Unlike
the Grovenor district, although the White District has not conducted course audits or
used placement matrices to ensure universal access and equity for all students in the
college courses, it has nonetheless analyzed data in order to increase the numbers of
students taking rigorous courses.
Unlike the Grovenor and White districts, the Legend district's theory of action is
based on a salient and urgent message from the superintendent, a message that is
subsequently further developed by each site to best fit its specific needs. All three
districts, however, involve site leadership teams to analyze data from standardized
tests and assessments at the beginning of the year. The next section will detail
Legend district’s theory and plans of action for DDDM to improve student
performance.
Legend District
The Board of Education has high expectations for increasing student
achievement and directs the superintendent to develop strategies to do so. The
superintendent determines how to make that happen. As a result, the Legend
district’s theory of action is grounded in an urgent message from the superintendent
that draws attention to narrowing the achievement gap for Hispanic students. This
message is posted all over the district, on the website, in all district and community
documents, as well at all sites. The message is as follows:
The Legend school district is high performing, with a current Academic
Performance index of 807 (A score between 200 and 1000). 22 of our 30
schools are designated as California Distinguished Schools and 9 have been
awarded the National Blue Ribbon.
As proud as we are of these facts, we are equally concerned that 57.2%
of our Hispanic students are NOT proficient in English Language Arts, and
that 52.7% are NOT proficient in Math. As a district, we will target
resources to ensure that all students increase proficiency by 2% this year. In
an effort to close the achievement gap, we expect targeted resources to result
in no less than 5% proficiency among our Hispanic students this year.
The process that the district used to create this focus was as follows: it
partnered with an outside provider to deliver the "Focus on Results" training for the
entire district leadership team, along with the leadership teams from each site. This
training created a common language and framework for digging deeper into
instructional practices and evaluating the effectiveness of practices. The teams
reviewed trend data from standardized tests to identify strengths and weaknesses in
their programs in order to develop a plan of action to improve student performance
results. Over a period of four years, all schools’ leadership teams have been trained
to build capacity for implementation of the “Focus on Results”. Moreover, each site
has developed its own urgent message based on its local data.
The superintendent commented, “I don’t believe there are any mysteries
about our expectations for data use.” As a result, his expectations are clearly
communicated via the SEAM instructional reflection questions, which spell out the
following critical queries:
S- Is Instruction Standards-Based?
E- Are students actively Engaged in the lesson?
A- Are students Assessed regularly to inform and drive instruction?
M- Are students reaching Mastery of the standards?
In this way, site teams have defined what each element of
S.E.A.M. should look like in the classroom. They have developed a common
language and vocabulary for class observation and data analysis.
As for professional development, the district is currently using the "trainer
of trainers" model to build capacity for DDDM at a larger scale and will no longer
need to work with the outside providers, after doing so for almost four years. The
district leaders serve as coaches and each is assigned to a cohort of principals and
schools to support and monitor implementation. Additionally, the district uses Data
Director [a data management system] to analyze data in order to plan interventions,
such as the Apex online drop-out recovery program and the Dual Immersion
program. For example, prior to developing the Dual Immersion program, district
data teams visited districts that had a history of success with
implementing this program. They investigated the challenges and learned about
facilitators associated with the implementation of the new program. In addition, they
developed evaluation tools to measure effectiveness.
The data management personnel work collaboratively with the sites to
provide support through different reports that are user-friendly. Sometimes, they
make faculty presentations or work individually with each department to provide
data analysis. The same staff is also working with the principals to provide more
meaningful data by eliminating the voluminous data binders that contain too much
data and are not utilized by the sites.
The superintendent gives the sites autonomy to implement any new programs
supported by data. He also brings the right personnel, such as the director of
the multicultural department, to serve on the cabinet so that he can access both information
and resources from the categorical programs to better meet the site needs.
In summary, the theory of action of the Legend district is as follows: 1)
creation of an urgent message by the superintendent to close the achievement gap;
2) establishment of a common language and framework for reflecting upon
instructional practices to improve student achievement results through a standards-
based instructional program in which students are cognitively engaged in their own
learning and are regularly assessed for mastery of standards (S.E.A.M.); and
3) monitoring of the results. This is the district's cycle of continuous inquiry for
improvement of instructional practices. The process of data collection occurs through
regularly scheduled site visits in triads, debriefing, and analysis of the observations
based on established protocols by the leadership teams. All three district
administrators interviewed stated that the district is committed to maintaining an API
of over 800 and is therefore investing and aligning all of its resources at all three
levels: District, schools, and classrooms. They further stated that closing the
achievement gap was a “shared responsibility” and the district office was in the
“service” business to support the administrators and equip them with the tools they
needed to be able to assist the teachers to improve their instructional practices.
Although there were no set policies regarding data use in any of the three
districts, there was a strong expectation by the district leaders for sites to use DDDM
to improve instructional practices and student performance. All three districts
engaged in practices such as professional development for leadership teams and
directing resources, including technology and human resources, to build capacity.
Unlike the White district, the Legend district has formed its theory of action based on
state standardized test data, which has resulted in an urgent message from the
superintendent and, subsequently, from the sites, whereas the White district has used
data from multiple measures to develop its plan of action based on Rigor, Relevance
and Relationships, as well as through a pyramid of interventions. On the other hand,
the Grovenor district has spent several years and has gone through a lengthy
collaborative process to establish its student performance goals, which remain in
place through 2014, by which time every student must become proficient in the
Mathematics and English Language Arts standards. Next, we will see that the Worland district has just
developed a strategic plan that has formed the basis of its theory of action to use
DDDM to improve student achievement results.
Worland District
The Worland district had long been a highly decentralized district where
DDDM was used by the sites, but such usage was not centrally organized or
coordinated. However, as stated by a top administrator, the district has developed a
strong reputation for DDDM over about the past three years; the Worland district was
selected by the Fall Foundation as one of the only two districts in the nation that used
data to improve student performance. The Fall Foundation provided the resources
for the district to improve student literacy. Continuing its partnership with the Fall
Foundation, the district has taken a comprehensive look at the organization at
multiple levels and in July of 2008 developed a strategic plan that forms the foundation
for its theory of action. This is a decentralized district and the district leaders have
invested time and resources to build a solid foundation for centralizing the district to
use data in a meaningful way. The district has put in place structures, such as a data
management system and personnel, to support the sites with data use. Student
achievement data are provided in a user-friendly format and in a timely manner.
Teachers and administrators have direct access to data. Sites have considerable
autonomy to implement programs. Professional development is provided throughout
the year for teachers and principals, principals’ meetings have been restructured to
focus on data analysis, and collaboration time is provided either through early release
or late start at the sites to ensure quality conversations around data. The partnership
with the Fall Foundation has further facilitated data use and engagement in a cycle of
inquiry by providing financial resources for articulation across all levels, i.e. K-12
and the district.
While there was no formal theory of action for DDDM until July of
2008, schools are expected to use data to drive instructional improvements. DDDM
is a top priority for the Board of Education since there is a great commitment by the
Board to improve student performance and achievement results. One Assistant
Superintendent stated that the district’s theory of action is as follows, “Data use
should be the bread and butter of all decision making and you know what? We don’t
have to convince our staff. They are convinced!”
The cabinet and principals review data and teachers use data at both the grade
level and department level meetings. Data are made accessible to each of the school
sites through the use of Edusoft, a data management system. The district leaders
believe that teachers need meaningful data. The district has been a decentralized
district for a long time. The superintendent commented, “ The system pushes the
superintendent along sometimes, which is OK, because it’s that tension of central
and site working together in a mutually supportive way and it’s hard work.”
However, data warehousing at the district office has created a centralized data
system to better support the sites. Data are used to gain consensus around the ensuing
practices. Research-based best practices are used to scale up practices in other
schools within the district. Examples include, but are not limited to, the
implementation of Thinking Maps. Thinking Maps are visual teaching tools that
foster and encourage life-long learning, and the software utilizes technology to
enhance students’ thinking skills. Thinking Maps reportedly have helped the
pilot schools focus their efforts and significantly improve student performance. The district has also
taken a comprehensive look at the organization, at multiple levels, to develop a
strategic plan. It has looked at data that the organization values and has reviewed the
strengths and weaknesses of these data. The district leaders stated that use of data is
at the heart of all their decisions. Their theory of action is using data to implement
programs, monitoring them, and evaluating those data to improve practices.
During the interview, the superintendent wanted to make sure that everyone
knew that the strategic plan was a shared vision and not her vision. As she stated:
“The strategic plan is based on input from all stakeholders based on massive
gathering of data.” According to her, this process will change the decentralization of
data to centralization of data. While the district is trying to change practice by letting
the sites keep autonomy, it wants to establish the power of a shared set of beliefs
about teaching and learning and district practices. Communication and articulation
with K-12 is intended to create that shared responsibility. The superintendent further
stated, “The strategic plan has 8 components and the second strategy addresses what
good teaching is all about. Through this planning process, the district has tried to
create a system that is coherent across the board and helps the system focus on most
important things, not on everything. We are trying to develop a common language
among all levels. The strategic plan is connecting the systems together and it is very
powerful.” Moreover, the strategic plan has created a new mission statement and
belief system. A “word café” of conversations around the strategic plan will occur in
order to get feedback from all stakeholders (i.e., as a feedback loop). In other words,
according to the superintendent, the strategic plan will serve as a protocol for
framing conversations around DDDM. This will become the cycle of continuous
inquiry and improvement. “I think that again it takes a certain level of courage to
invite a whole organization into a journey and into a conversation. It’s sometimes a
little frightening, but at the same time, I know it’s the right work,” stated the
superintendent.
In summary, the Worland district has created a strategic plan that combines
the top-down and bottom-up approaches to building a platform for DDDM. Like the
other three districts, it uses data to implement programs, monitor progress and
evaluate their effectiveness by engaging the staff in a cycle of continuous inquiry
that occurs at the sites through collaboration time, professional development for both
teachers and administrators, restructured principal meetings, and most recently
articulation with the K-5 and 6-12 levels, sharing during the best practices dinners,
and early release time. A data management system and personnel facilitate data use.
Once a research-based program improves student outcomes at one site, the district
scales it up and implements it at other sites.
Summary of Key Points
Although there are no formal policies for data use at any of these four
districts and each has gone through a different process to establish its theory of
action, DDDM is at the core of improving student performance at all four districts. In
all instances, the boards of education hold high expectations for improving student
achievement results and therefore, the superintendents utilize different strategies in
order to build structures and establish practices to make that happen. Also, each
district has put in place structures such as data management systems and personnel to
support the sites with data use; has provided staff development; and has restructured
principals/leadership team meetings. The teams have developed a common language
and framework in order to engage in a cycle of continuous improvement.
Each site has time for collaboration and course leads assist with data use.
Sites also use research-based practices in order to implement new programs and use
data in order to monitor and measure progress.
The districts have all worked with external partners to build capacity for data
use and engage their staff in quality conversations that are geared toward learning
goals and targeted benchmarks. For example, working with outside providers, the
Grovenor district has established two student performance goals, whereas the Legend
district has published standardized test data that has resulted in urgent messages at
the district and at the sites to close the achievement gap. On the other hand, the
White district has worked with external providers to develop professional learning
communities focusing on “Rigor, Relevance and Relationships” that have resulted in
implementation of a pyramid of intervention to support student achievement. The
Worland district, however, has taken a comprehensive look at the organization at
multiple levels and has developed a strategic plan that forms the foundation for its
theory of action. This plan has rigorous standards, based on APA (American
Psychological Association) guidelines. DDDM at Worland is at the heart of all
decisions and the implementation of this strategic plan is expected to create
consensus on what the organization values and needs to focus on.
Types of Data
This section addresses research question number one: What types of data are
used by the districts? “Through data, we are looking for genius in each child,” stated
a top district-level administrator at the Worland district. As this quote implies, district
administrators use varied sources of data to create an accurate picture of student
achievement. In general, all four districts use demographic data and student outcome
data that include standardized test results such as the California Standards Test,
California English Language Development Test, and California High School Exit
Exam. They also use data from multiple other sources: grades; on-target with
graduation status; A-G completion rates; college attendance; Advanced Placement
(AP); district benchmarks; site common assessments; and recommendations from
the Western Association of Schools and Colleges (WASC) accreditation reports.
Also, trend data, as well as attendance and discipline data are used to get a more
comprehensive picture.
Although the Grovenor, White, and Worland districts use perception data
such as student, parent and teacher surveys to measure effectiveness of practices, the
Legend district generally prefers hard data over soft data, i.e., perception data, and
therefore does not administer surveys. It does, however, analyze data from teacher
evaluations done by the administrators in order to improve the quality of
observations that result in improvement of instructional practice. Data are also used
by students to know whether they are on track with graduation or are mastering the
standards. One administrator at White district stated, “[That] which is inspected, is
respected.” All districts use data at different levels in order to improve practices and
student performance. In fact, all four districts use “value-added” measures, based on
increases in student achievement data on standardized tests, to evaluate principals. However,
teacher evaluations are not impacted by the lack of student achievement on
standardized tests at any of the four districts, although some administrators believed
that they should be considered.
I will first review each district’s use of different types of data and then look
for similarities and differences among all four districts.
Grovenor District
The district uses data from multiple sources that include data from both state
standardized tests, as well as district and site-developed benchmarks. The types of
data used are as follows: California English Language Development Test
(CELDT), California High School Exit Exam (CAHSEE), and California Standards
Test (CST); Scholastic Aptitude Test (SAT); demographic data; Preliminary
Scholastic Aptitude Test (PSAT); ACT (formerly American College Testing); A-G (University
of California) admission criteria; English Language Learners’ redesignation rate;
district writing assessments and benchmarks; client surveys; site common
assessments and curriculum-embedded assessments; Comprehensive Literacy
Assessment (CLA); Dynamic Indicators of Basic Early Literacy Skills (DIBELS); value-added
data at the end of the 6th and 8th grades; affective data such as attendance, truancy,
and discipline; placement practices in Advanced Placement courses; correlation studies;
course audits; summer school data; research-based best practices; and data from the
Western Association of Schools and Colleges (WASC) accreditation reports and
recommendations. The superintendent, however, made the following comment
regarding the actual type of data that has the most impact on student learning: “Real
time data come from class observations.”
The district has created focus and coherence by aligning different types of
data to assist the sites in meeting the district performance goals. For example,
besides data from classroom observations, the district uses data from standardized
tests such as the CST, CAHSEE, SAT, and Advanced Placement exams; University of
California A-G requirement completion rates; and CELDT and reclassification data
for the English Language Learners. The PSAT is administered to all 10th graders, and
the data are used to identify potential candidates for Advanced Placement.
The ACT is administered to all 8th graders to identify potential candidates for honors
and AP classes. In addition, the district uses results of quarterly benchmark assessments in
grades 2-12 in English Language Arts, Math, Science, Social science, and World
languages. Writing samples are administered in K-6 and middle schools. Teachers
use curriculum embedded assessments. The district also administers Comprehensive
Literacy Assessment (CLA) for the early grades and the Dynamic Indicators of Basic Early
Literacy Skills (DIBELS). The district uses common assessments and benchmarks. Perception
data, such as surveys from the parents and the community are also used to improve
practices. One of the district administrators stated, “In addition to using data from
multiple resources, we also use the value-added data. This is new and is very helpful
for us to understand what the feeder schools have added to student learning.” This
means that the district is analyzing student data in order to determine the
effectiveness of the programs at the feeder schools or in previous grades and the
extent to which each program has contributed to students’ progress toward meeting the
district’s performance goals. The district has conducted an audit of all courses in the
K-12 system. The district is also looking at outcome data, demographic data, and
affective data such as attendance, truancy, and discipline data, as well as trend data, in
order to develop strategies to address areas of concern.
In the next section, I will discuss the types of data used by the Legend district,
which are very similar to Grovenor’s as far as student achievement data are
concerned, yet differ in that the Legend district additionally analyzes teacher
evaluations and summative reports to ensure alignment to the school wide goals
focused on increasing student achievement.
Legend District
Like the Grovenor district, the Legend district uses data from multiple
sources in order to modify instruction and improve student performance. The district
uses data from standardized tests such as California Standards Tests that include
Adequate Yearly Progress (AYP); Academic Performance Index (API); demographic
data; California English Language Development Test (CELDT) and Annual
Measurable Objectives (AMOs); Advanced Placement Tests; grades; California
High School Exit Exam; graduation rates; dropout data; trend data for the past five
years; cohort data; district’s benchmark assessments for all grades, including the
kindergarteners; affective data that include attendance and suspensions data; data
from California Distinguished Schools, Blue Ribbon Schools and High Achieving
Title I schools; financial data; enrollment of students by special needs; core
spending; community profile data; district and state revenue comparison data;
district and state return on spending index (ROSI); California K-12 school districts
and core spending on student performance; data on district’s strategic goals; end of
course exams in mathematics for middle schools; California Alternative Assessment
for Special Education (CAPA); data on homeless children; benchmark assessments
in each core subject; research-based best practices; data from Western Association
of Schools and Colleges recommendations; instructional walkthroughs; and teacher
evaluations.
A “vital signs” book is published annually that includes the numbers of
suspensions, involuntary transfers, homeless children, students attending the
district’s Magnet school, crisis intervention referrals, and crime reports. Data are
used to develop targeted interventions as well
as provide enrichment opportunities.
Something that is unique about the Legend district is the fact that the district
uses teacher observation evaluation data and summative reports from each
administrator, analyzes the content to ensure alignment to the school wide goals, and
then makes recommendations for improving the quality of the reports. One district
administrator commented:
Well, just like the district uses multiple measures to develop plans to close
the achievement gap, the human resources department and other district
leaders annually use data from multiple documents, such as the Professional
Growth Plans, classroom observation forms, teacher reflections, and
summary evaluations in order to determine the effectiveness of the evaluation
process in improving teacher performance, that in turn, results in increased
student achievement. This process of analyzing teacher performance
evaluation data is a major culture shift; certainly something that did not exist
four years ago.
As to the types of data used by the district office, an administrator stated:
We also keep close tabs and collect data during the year on the kinds of
issues that site administrators come across. For example, we know which site
administrators are having a hard time developing a quality improvement plan
or documenting a weak teacher. We also quickly learn based on the questions
that they ask which administrators still need training on the FRISK model or
other areas. Collecting data from these sources helps us with planning for
next year’s professional growth opportunities not only for the administrators,
but also for improving teacher quality and instructional practices.
Throughout this process, the committee provides examples of exemplary
evaluation write-ups done by the principals or co-administrators, as well as examples
of those evaluations that need improvement to better align to school wide goals. A
rubric is used to score the evaluations. All three administrators stated that ever since
the district initiated the process of reviewing and analyzing all teacher performance
evaluations, there have been significant changes and improvements to the quality of
observations in the classrooms and therefore, improvements to the instructional
practices. Consequently, the Legend district has indeed taken the use of data one
step further and has used data to improve the teacher evaluation process.
In the next section, I will review the types of data used at the White district.
While the district, like the Legend and Grovenor districts, uses data from multiple
sources, it places a special focus on the “F” grade, as expressed in Daggett’s research
(2006), to develop innovative intervention strategies that give students multiple
opportunities to demonstrate their mastery of standards. In addition, this district uses
data on homework that has been completed and turned in on time in order to assess
student mastery of standards. Last, but not least, the degree of implementation of
WASC recommendations is included in principal evaluations, along with meeting
other student achievement targets on the state assessments.
White District
Like the other two districts, the White district also uses data from multiple
sources. The types of data used are as follows: California Standards Tests; Academic
Performance Index (API); Adequate Yearly Progress; California English Language
Development Test (CELDT); California High School Exit Exam (CAHSEE);
CAHSEE reports with individual student history, along with summary data by class
and teacher; Similar Schools Rank; Advanced Placement (AP) test results that
include passing rates, numbers of tests taken and passed, and credits earned; grades,
particularly “F”s; Regional Occupational Programs (ROP) enrollment; participation
in extra- and co-curricular activities; inter- and intra-district transfers; 9th grade
Individual Education Plans; Algebra I data; feeder school data; 10th month
attendance trends; CBEDS; Professional Development; Home Language Survey;
percentage of highly qualified teachers under NCLB; district benchmarks and site
common assessments; trend data for the past 4-5 years; on-target with graduation; A-
G (University of California requirements); 4-year college rate; 4-year graduation
rate; English Learner data; redesignation data (RFEP); data related to homework
completion; class size figures; performance levels by course; Scholastic Aptitude
Test (SAT); ACT (formerly American College Testing); California State
University/University of California Early Assessment Placement Tests (EAP); senior
project completion rates; perception data (student, parent and staff); post graduate
surveys; Western Association of Schools and Colleges (WASC) recommendations;
school plans; affective data such as attendance, suspensions, discipline data; and
demographic data.
“Demographics do not determine destiny,” stated the superintendent,
meaning that we cannot judge the students based on their color, race, or national
origin. The district also reviews composite data, and checks for similar schools rank.
Since the district is a union high school district, it also uses feeder data to assess the
incoming students. It uses data from the 9th grade personal learning plans, data from
participation in co-curricular activities, participation in academics, inter- and intra-
district transfers, Regional Occupational Program (ROP) course enrollments, 10th
month attendance trends, CBEDS, home language surveys, low income measures,
professional development and percentage of highly qualified teachers by credential.
The superintendent’s annual report to the Board of Education includes all of the
above data. The district also provides protocols for data analysis. The superintendent
in her annual message to the Board has stated, “As the importance of accountability
has grown at the state and local level, the need for data analysis has grown as well.”
In summary, besides using data from multiple sources to improve the
academic program, the district also uses data from student participation in co and
extra curricular activities, as well as enrollment trends in career and technical
education programs such as Regional Occupational Programs. The district also uses
data related to homework that has been turned in on time. Principals are also
evaluated on the attainment of multiple measures that are established by the
superintendent and the Board of Education.
In the next section, I will discuss the types of data used at the Worland district.
Like the Legend and Grovenor districts, in addition to using data from multiple
sources, the district leaders also collect and use data from instructional walkthroughs.
They also use data from Blue Ribbon Schools and California Distinguished schools
in order to improve the quality of programs. Like the White district, it uses data
from completion rates on career paths, but due to the existence of special programs such
as the International Baccalaureate program (IB), data are also used from the numbers
of students completing the program. Although the White district uses data from 9th
grade students’ Individual Learning Plans to monitor student progress, there are two
unique features about the types of data that the Worland district leaders use: 1) under
the strategic plan, every student, and not just 9th graders, will have an Individual
Learning Plan (ILP) that includes data on affective issues, interests, career plans, and
artistic strengths; and 2) the district uses APA (American Psychological Association)
standards and principles for student learning.
Worland District
The Worland district uses the following data: Academic Performance Index
(API), Adequate Yearly Progress (AYP); California Standards Tests (CST); Cohort
information on CSTs; California High School Exit Exam (CAHSEE); California
English Language Development Test (CELDT); Advanced Placement (AP); A-G
(University of California requirements) completion rates and enrollment;
International Baccalaureate (IB); Scholastic Aptitude Test (SAT); Preliminary
Scholastic Aptitude Test (PSAT); ACT (formerly American College Testing); Western
Association of Schools and Colleges (WASC) recommendations; demographic data
in each program; classroom observations; course level data by department/ by
teacher; 9th grade transition data; capacity building; drop-out; Individual Learning
Plans (ILP-new); on-target with graduation; summer school data and comparison to
end of the year Diagnostic Reading Assessment (DRA) for grades K-2; pre-post tests
for Elementary Schools; Standards-based report cards and checklists for Elementary;
Advancement Via Individual Determination (AVID); Response to Intervention
(RTI); completion rates on the career paths; drop-out data; attendance; behavior;
participation in the arts; redesignation rates to maintain proficiency for three years;
district BM (a performance level on the English Language Development profile
record from Edusoft); on-target with graduation reports; the strategic plan; American
Psychological Association principles (APA) on student learning; benchmark
assessments; student work; writing samples; surveys; Similar Schools Rank; Schools
to Watch program data; California Distinguished Schools; Blue Ribbon Schools;
conversations with students; trend data; and data from intervention programs.
Similar to the comment that was made by the superintendent in the Grovenor
district, a district administrator at Worland mentioned, “Genuine data that makes the
most sense and is meaningful comes from classroom instruction and that needs to be
aligned to state assessments.” The district uses measurable targets and reviews
individual student data; they use the value-added data and look for performance
patterns. Under a new initiative, the “Family friendly school initiative,” parent surveys
are administered, and Gifted and Talented Education (GATE) and Title I surveys are used.
Student surveys at the sites are administered to build relationships. Asset surveys are
administered and exit surveys are used. Benchmark assessments in writing, reading,
and math at the elementary schools are used. Under the strategic plan,
individual learning plans (ILPs) and post graduation plans will be developed.
Affective data such as interest, career plans, and artistic strengths as well as
demographic data, attendance data, grades, course level data for each teacher by
department and by individual teacher and section are used. Quantitative and
qualitative data are gathered through interviews, surveys, and observations. Enrollment data,
9th grade transition, AVID, IB, Response to Intervention (RTI), and data from the
intervention programs are used. All the administrators interviewed indicated that
there is too much data and that they need to differentiate between good data and bad
data. Data from the Distinguished and Blue Ribbon schools and Schools to Watch
program are used. Data from administrator walkthroughs both at elementary and
secondary schools are used to determine implementation of the core programs that
include observing student work and engaging in conversations with students. The
district leaders also review the grade reports at the secondary level by teacher and by
section in order to identify strengths and weaknesses. Moreover, principals are
evaluated based on increasing achievement performance and meeting the state and
federal targets. All district departments use data that pertain to their departments in
order to support student achievement.
At the elementary school, data for at-risk students are posted by individual student
name in the principal’s office. Pre- and post-test data are used to measure student
achievement growth, especially at the end of summer school, on the Diagnostic
Reading Assessment (DRA). A checklist for kindergarten and 1st grade students lists
the series of skills that every kindergarten child should know. Individual assessment
tools for K-1 are collected three times a year; these are, in fact, their standards-based
report card. In addition, data from subgroups, including students with disabilities, are
used to better meet student needs.
In summary, according to the administrator in the Assessment and
Technology department, the district actually only uses two types of data: “The
district uses non-negotiables or external data such as state and federal requirements
that include the API, AYP, CAHSEE, etc. The second type of data is what I call
‘capacity building data’, i.e. qualitative as well as quantitative data in order to build
capacity within the system.”
Summary of Key Points
The analysis of the responses to this research question and review of the
district documents indicate that the district leaders are modeling what good teaching
is all about: High expectations, use of data from multiple resources, use of exemplars
such as well-written or poorly written evaluations, use of rubrics (Marzano, 2006)
and data protocols, in order to communicate expectations and establish clear,
measurable targets to improve performance. In addition, through regular and
frequent monitoring of the types of data that are used to assess student learning, the
districts have established a shared sense of responsibility and “service” to support the
administrators for improving student achievement results.
In general, all four districts use data from standardized test scores that are
required by the state and federal accountability systems. In addition, each district
uses internal measures to develop interventions, monitor student progress and
evaluate the effectiveness of its programs. Student achievement results from the
standardized tests, as well as benchmark assessments results for the most part are
used to monitor student learning. However, affective data such as attendance data
and student discipline data are also considered to be important factors that contribute
to improving teaching and learning. Listed below in Table 2 is a summary of the
types of data that each district uses to improve and increase student achievement
results.
Table 2: Types of Data Used in Each District
Grovenor Legend White Worland
State
Instructional Student Performance Data
California Standards
Test (CST)
California English
Language
Development Test
(CELDT)
California High
School Exit Exam
(CAHSEE)
California Standards
Test(CST)
California English
Language
Development Test
(CELDT)
California High
School Exit Exam
(CAHSEE)
California Standards
Test (CST)
California English
Language
Development Test
(CELDT)
California High
School Exit Exam
(CAHSEE)
California Standards
Test (CST)
California English
Language
Development Test
(CELDT)
California High
School Exit Exam
(CAHSEE)
District
1) Demographic data
2) Quarterly
benchmarks in core
subjects aligned to
content standards for
grades 2-12
3) Correlation studies
between tests,
common assessments,
district interims and
teacher grades
4) Surveys
5) Writing
Assessments (grades
K-6 and middle
schools)
6) Comprehensive
Literacy Assessments
(CLA) for early
grades
7) DIBELS
8) Course audit for K-
12
9) Value-added data
at the end of 6th-8th grades
10) Discipline data
11) Attendance, and
truancy data
12) Trend data
13) PSAT, SAT, ACT
14) Advanced
placement data and
enrollment trends
15) A-G completion
data
16) Summer school
data
17) WASC
1) Demographic data
2) Attendance charts
3) API history
4) Title I High
Achieving Schools
data
5) National Blue
Ribbon Schools and
California
Distinguished
Schools data
6) Financial and
demographics data
analysis charts
7) Enrollment of
students with special
needs
8) Core spending
9) Community profile
10) The district and
state revenue
comparison data
11) District and state
return on spending
index (ROSI)
12) California K-12
school districts and
core spending and
student performance
13) Data on District’s
strategic goals
14) Vital signs
(discipline data,
attendance data,
truancy, involuntary
transfers, and drop-
out data)
15) Annual
measurable objectives
(AMO’s)
1) Demographic data
2) Grades,
particularly “DFI”
3) Quarterly
benchmark
assessments
4) Customer
satisfaction surveys
5) Research-based
best practices within
and outside the
district
6) A-G completion
rates
7) 4 year college rate
8) AP enrollment and
passing rate, numbers
of tests taken
9) Redesignation
rates for ELL
10) WASC
11) School plans
12) Homework
completion rates
13) On target with
graduation
14) Class size
15) Highly qualified
teachers
16) CBEDS
17) California State
University (CSU)
early assessment
placement test results
and first time
freshmen proficiency
in Mathematics and
English
18) Composite data
19) Similar Schools
rank
1) Demographic data
2) Course level data
by department/ by
teacher
3) Surveys
4) Enrollment data
5) Redesignation data
for ELL students
6) 9th grade transition data
7) Capacity building
data
8) Benchmark
assessments for
Elementary schools in
math, reading, and
writing three times a
year
9) AVID data
10) International
Baccalaureate data
11) Response to
Intervention data
(RTI)
12) Data from
intervention programs
such as math and
language arts
13) WASC
(accreditation) data
14) PSAT, SAT,
ACT
15) Advanced
placement data and
trends
16) A-G completion
data
17) Attendance and
truancy data
18) Drop-out data
19) Discipline data
Table 2, Continued
16) End of course
exams in mathematics
for MS
17) High School
graduation rates
18) Advanced
Placement data
19) A-G completion
rates
20) CAPA (California
Alternative
Assessment for
Special Education)
21) Magnet high
school attendance
data
22) Data on homeless
children
23) Data from urgent
messages from the
superintendent and
each site
24) WASC
25) Benchmark
assessment in each
core subject area
26) Research based
best practices data
20) 4 year graduation
rates
21) Data from
intervention programs
22) Senior project
completion data
23) Feeder schools
data
24) Algebra I data
25) 9th grade transition plan data
26) Participation in co and extra curricular activities
27) 10th month attendance trends
28) Home language surveys
29) Low income measures
30) Professional development
20) Career pathways completion data
21) Cohort data on CST
22) APA standards data
23) Individual learning plans data
24) On-target with graduation
25) Blue Ribbon and California Distinguished schools data
26) Similar Schools rank
27) Summer school data and comparison to the end of the year Diagnostic Reading Assessment (DRA) for grades K-2
28) Diagnostic Reading Assessment (K-2)
29) Pre/post tests for elementary schools
30) Checklists on mastery of standards for elementary
31) Standards-based report card
School
1) Curriculum
embedded
assessments
1) Chapter tests /
common assessments
in core subjects
1) Common
assessments in core
subjects
1) Chapter/unit tests
Culture of Data Use
“At the beginning, you might see things because they are visible signs of things
when you are first learning how to use DDDM. However, as you get more and more
proficient, you don’t need that little “cheat sheet” in front of you all the time, it just
becomes part of what you do.”
Superintendent of Grovenor District
This section addresses research question number two: How do district leaders cultivate
a culture of data use? Use of data is an expectation of the district leaders at all four
districts. All have worked with external partners in order to build a culture of data
use. The Grovenor district has used goal setting as the guiding force for creating a
culture of data-driven decision making and engagement in continuous improvement,
whereas the Legend district has published a message by the Superintendent to create
a sense of urgency to narrow the achievement gap for the Hispanic students and has
also created a common language (S.E.A.M.) to improve instructional practices. On
the other hand, the Worland district has used a strategic planning process to nurture a
culture of data use. Last, but not least, the White district has given the sites complete
autonomy to use data to develop programs.
In summary, all four districts have created a culture of data use by creating
focus, coherence, and consistency at all three levels, i.e. district, school and
classroom in order to improve student performance.
Grovenor District
The Grovenor district has embraced a culture of data-driven decision making
in all aspects of its operation. According to the superintendent, all departments,
including the classified departments, use data to make decisions. Although there is no formal
policy for data use, it is an expectation and part of the culture of the district. “How
do you know that there is air in the room?” stated the superintendent. “I guess
because you can breathe.” The district has a culture of “no excuses” for improving
student achievement. In fact, with the assistance of an external partner from
WestEd, the district leadership team went through a multiyear process to establish
district performance goals that are posted everywhere in the district. The
performance goals sparked the process of continuous improvement and data-driven
decision making. There are two performance goals that are as follows: 1) All
students will progress through the bands of the California Standards Test (CST)
scores annually. This means that if a student is at Far Below Basic, the student will
move up to Below Basic; from Below Basic, the student will progress to Basic; and
so on. In this way, within five years, all students are expected to become proficient.
Also, those who are Proficient or Advanced will maintain those levels. 2) All English
Language Learners will progress one level on the California English Language
Development Test (CELDT). This means that if they were at the early intermediate
level, they will move to intermediate, and so on.
Use of data, according to the superintendent, is like using electricity. “How
do you know that you use electricity? You know, because you use it.” The
video at Studentsuccessproject.org best demonstrates the culture of data use at
the Grovenor district. Teachers and administrators interviewed in this video state
that “what teachers and principals need are data that come from tests given
throughout the current school year, that are given in a timely fashion and in an
environment that values collaboration and problem solving, and allow them to adjust
instructional strategies. It is only in this way that significant gains in student
achievement will be realized.” Interviews with teachers and the superintendent in this
video also attest to the culture of data use in the district. District performance goals
are posted in all classrooms, teachers use the CST data and benchmark results during
parent conferences; there are data charts in each classroom to motivate students;
counselors use data to place students in proper classrooms; teachers use the Data
Director to access data; and they use collaboration time to review student data and
improve instruction by engaging in a cycle of continuous inquiry. In addition,
according to two top administrators at the district level, students know their data and
can tell you exactly what their goal is to become proficient on the CSTs. Students
also know benchmark protocol analysis; they review their own results; and are
recognized for their performance on the CSTs. “You become data dizzy, i.e. data
rich, information poor,” stated the superintendent. She further stated, “Students are
more than a number; data become the voice of children telling us what they need.”
Moreover, all discussions at the cabinet and the instructional divisions, as
well as the principals’ meetings are focused on data. According to the
superintendent, “If you ask the following three questions, you should be able to get
answers from every one. 1) What are the students learning? 2) How do you know
that they do? 3) What do you do when they don’t?” Teachers use collaboration time
to review data and modify instruction. At the district office, all conversations begin
with data, data from all sources. There is a culture of continuous improvement and a
clear focus. “When you get into data use, it becomes something you wouldn’t see
because it is so much part of you,” commented the superintendent.
The alignment of district administrators’ duties to supervise instruction and
evaluate principals is another testament to the fact that there is a culture of data use
in the district. Principal evaluations are done with input from the directors, along
with the assistant superintendents. For example, since the directors work intimately
with the principals to support them in their endeavors to improve student
performance, they can be very instrumental in providing feedback to the Assistant
Superintendents, who have supervisory responsibility for principal
evaluations. The superintendent stated, “We manifest true behavior of using data to
improve learning. You know because you use it automatically. That’s how we use
data.”
In summary, the Grovenor district leadership team, assisted by an external
partner from WestEd, has used goal setting as the driving force for establishing and
maintaining a culture of DDDM and continuous improvement. In the next section, I
will discuss how the Legend District superintendent has worked with an outside
provider to establish a culture of data use by creating a common language, as well as
publishing a message that has created a sense of urgency for closing the achievement
gap through staff involvement in a cycle of continuous improvement and DDDM. In
addition, it has created a focus and coherence by aligning its practices at all three
levels: district, site, and classrooms.
Legend District
At the Legend district, improving student performance is a shared and
collective responsibility of the superintendent and the Board of Education. The
Superintendent’s Annual Report, the State of the District, addresses his vision and
reports the progress made. There is a culture of continuous improvement. Data are
used from best practices to develop programs such as the Dual Immersion program
or Data Management System such as Data Director. The superintendent has
established a culture of data use by publishing a critical message based on the CST
scores that has created an urgency to close the achievement gap. In addition, he has
created a common language and focus through S.E.A.M. (Standards, Engagement,
Assessment, and Mastery) to improve instructional practices. According to the
superintendent, “even the food service knows that improving student achievement is
the district’s focus.” The superintendent’s message and the S.E.A.M. are posted all
over the district and on the website. Every newsletter communicates this message
and expectations for improving student achievement. Data use has become an
integral part of the district culture. For example, principals and site leadership teams
are engaged in data analysis through monthly meetings; faculty meetings are spent
reviewing and analyzing data and schools have time for collaboration to review data
from common assessments or district benchmarks in order to engage in quality
conversations and improve practices based on data. However, the superintendent
stated that the culture of data use at the elementary schools is different from that at
the secondary schools, i.e., students have one teacher at the elementary school vs.
multiple teachers at the secondary schools. As a result, there is more “buy-in” for
data use at the elementary schools than at the secondary level. However, the district
provides resources to support teachers with data use.
Students know the instructional focus of their schools, since the district has
invested in “Focus on Results,” an instructional improvement framework introduced
by external partners. In light of the urgent message from the superintendent to
narrow the achievement gap for the Hispanic students, each site has been given what
the superintendent referred to as “structured autonomy” to choose a site-specific focus
area based on data from multiple sources to increase student achievement. For
example, one high school has decided to increase the number of students,
particularly the Hispanic students, who complete the A-G University of California
requirements. Another high school has created a focus on writing, and other schools
have created a focus on improving literacy, closing the gap in mathematics, or
increasing graduation rates. On the other hand, one elementary school has asked to
reduce the class size in the fourth grade where there was a dramatic decrease in
student achievement in the Mathematics and Language Arts assessment results.
According to the Assistant Superintendent of Educational Services, “When the
district leaders make presentations on data to the Board of Education, students that
participate in the presentation are able to tell the Board what their school’s best
practice is. In their mind, it is not a data decision, but they can go as far as telling
you how many are being successful at it and how many of them aren’t being
successful at it. We had one of the schools that students took a picture after the 1st
benchmark test on the number of students scoring proficient and then after the 2nd
benchmark took another picture and then took another picture to show in a visual
form the amount of kids and how it was increasing throughout the year.”
In summary, the Legend district has been assisted by an external partner to
create a culture of data use by communicating an urgent message from the
superintendent that reports the results of the California Standards Test and the gap
that exists between the Latino students and other students. In turn, sites have used
data to develop their own focus areas to improve student achievement. In addition,
the district has created a common framework such as S.E.A.M. to improve
instructional practices. In the next section, I will discuss how, although the use of
data is mandatory at the White district, unlike the Legend district, the White district
has created a culture of data use by planting the seed for data use and by allowing the
schools to think creatively and, according to an administrator, “outside the box,” in
order to improve student achievement. This can be accomplished through modified
bell schedules, providing interventions during the day, etc.
White District
The White district has had a history and culture of decentralization over a
very long period of time. However, in the era of standards-based instruction,
sanctions, increased accountability, and declining financial resources, the district
leaders have focused all of their resources on a common priority, i.e. improved
student achievement. This has caused a big shift to a collaborative culture for
sharing best practices in DDDM, as opposed to working in isolation. However,
valuing and embracing the uniqueness of each site and each staff member, the
district has given the sites a lot of autonomy to think, as one administrator stated,
“outside the box”, to develop programs that will improve student achievement. As a
result, the district has taken the mid-point between centralization and
decentralization.
As part of the culture of data use, schools and the district leaders review
research-based best practices to change the grading practices. For example,
currently, the district is working on the Case of the Zero (Daggett, 2007) to improve
its grading practices. According to the Director of Assessment, Technology and
Staff Development, “The district is also working with the sites to develop “Strategies
of hope”, which include swapping students in Algebra I who are not successful in
class at the quarter with another teacher”. In addition, the culture of data use is
evident in each student receiving individual data on the California High School Exit
Exam, strand data, past and present scores, and upcoming dates for the CAHSEE.
Transparency with data for all schools within the district has created a culture of
sharing data that is built on trust. There is a strong community support for data use
and the superintendent goes through student achievement data with all certificated
and classified staff during her opening day remarks in order to reinforce data use.
The critical message of the superintendent is communicated to all on day one.
All four administrators interviewed at the White district were unanimous in
stating that the culture of data use is also evident by looking at artifacts at each site and
at the district; observing what’s on teachers’ desks, in the classrooms, and on data
walls; reviewing results of common assessments; reading the e-mails sent to the
Assessment Office from teachers asking for more data; as well as witnessing the
excitement of teachers for accessing more data. “If you tour the classes, you will see
that the same conversations are being held regarding data,” stated an administrator.
He further commented that student conversations center on common assessments
such as, “Mrs. O.’s class beat us on the common assessments by 10 points. We can’t
stay behind.” In addition, the school cultivates a culture of data use by giving
students lunch privileges based on improved scores and grades in classes and on the
tests. This serves as a great incentive to encourage the students to constantly review
their own data in order to reach their targets. The superintendent shared a personal
story to demonstrate the culture of data use among the students. She stated that
when the CST results arrived last July, her daughter, who attends one of the high
schools in the district, called and asked, “Mom, did my school make its targets?”
Additionally, all schools engage in a cycle of continuous improvement in
order to ensure quality teaching and learning experiences for all the students by using
common instructional materials; administering district-wide quarterly common
assessments and essential standards; using site-based common assessments; and
implementing a mandatory pyramid of intervention with target goals. The pyramid
of intervention is a systematic organization of student support that results in targeted
and required interventions for each struggling student. Teachers have time for
collaboration through creative scheduling and/or through release days. Student
assemblies address the importance of passing the CAHSEE, and discuss effective
strategies. Also, all conversations at the sites and at the district center on the CAHSEE
from December to March.
Based on the interviews and review of documents and reports, it became very
clear that there is a culture of focus, intensity, and coherence around data. “There
are no road blocks for the principals,” stated an administrator. In other words, the
organization has provided focus and coherence by aligning its practices at all three
levels, i.e. at the district, school, and classroom levels, while giving autonomy to the
sites to design innovative programs to improve student achievement. For example,
while all 9
th
grade students must have an orientation to high school, each school can
develop a different strategy and structure for implementation. Some schools spend
three days on the orientation, and some may only do it on one day. However, all
students must be exposed to the same information and content and experience the
same activity. The district has a very supportive culture of data use; site leadership
is valued and trusted with the decisions to improve student achievement results.
“The system looks at itself and not at demographics in order to improve its
practices,” commented one of the administrators. Sites have the autonomy to
organize their schools, change bell schedules or think “outside the box” when doing
their master schedule. Offering daily incentives for using data to improve student
achievement results is part of the culture of the district and the sites. According to a
top district administrator, “students know what exactly they need in order to get into
universities in the UC or CSU system, since we are always talking about data and the
importance of data in student achievement”. In sum, the White district has created a
culture of data use by creating a common focus: increased student achievement. As
a result, it has worked in a more synchronized way by using all of its resources and
energies on supporting the common focus, while it has given the sites autonomy to
design programs to meet different student needs. Examples include, but are not
limited to, some schools implementing a block schedule while others continue with
the more traditional bell schedules; some offering longer lunches to reward student
achievement; and others creating friendly competitions by class and swapping students at
the quarter to give them a second chance at passing their class. However, all schools
hold assemblies that are built around improving performance on the standardized
tests. The district has made transparency with data a top priority by providing
opportunities for sharing out of best practices in a supportive culture that is built on
trust. In the next section, I will discuss the Worland district’s culture of data use that
has taken a different turn under the new strategic plan, approved by the Board in July
of 2008. The district leaders have worked with their long-time partner, the Fall
Foundation, to develop a strategic plan in anticipation of making a highly
decentralized district a more centralized one where DDDM is used for continuous
improvement.
Worland District
At the Worland district, the belief about the culture of data use stemmed from
the Board of Education and is therefore embedded into the way things are done at
this district. Periodic presentations on data are made to the board. The development
of the strategic plan in July of 2008 explicitly calls for DDDM. This is a highly
decentralized district and the hope of the superintendent is that the strategic plan will
shift a culture of decentralization to that of centralization. The culture of the system
at Worland has created urgency at all levels to use data in order to improve student
performance. Use of data is an expectation of the superintendent. Collaboration is
part of the culture and value system of the district. The superintendent stated, “The
strategic plan has the power of connecting people to what you know about yourself
in the form of data and thus making decisions about where you need to go by
analyzing all the things you know about the system.”
Three years ago, the Fall Foundation visited classrooms and interviewed
teachers, principals, the cabinet, and students and concluded that the district was a
DDDM district. They therefore selected the district and provided resources to
improve literacy in the district. In addition, the WASC visitors and Categorical
Program Monitoring visitors all refer to a culture of data use in this district. District
newsletter and the information on the website regarding data also attest to a culture
of data use. The interviewees indicated that there is a lot of joy around learning and
using data at the schools, as evidenced by student work that is displayed and data talks
that take place with high school students. There is a culture of family friendliness
and schools are required to share data with families. In addition, “quality
conversations” (Datnow et al., 2007) around data take place at each staff and
administrator meeting, which keeps pushing the staff to try new strategies to improve
student performance. Moreover, data use is part of the culture to inform instruction
and assess student progress. The Director of Assessment and Technology stated, “In
terms of culture of data use, we use data of who we are and what we want to be as a
district in order to meet the needs of community that is not measured by the
accountability piece.”
There are regularly scheduled dinners multiple times a year, focused on data, that are attended
by the classified staff, teachers, administrators, and community.
Professional development is ingrained in the culture. Edusoft training is given to all
teachers. The paraprofessionals working with the English Learners receive training
on data and the important role these employees play in supporting the EL students.
Decisions made by the principals with their site leadership team are based on data
use. Intervention programs are developed based on data use and analysis. The sites
communicate the quality of services and also pinpoint concerns and weaknesses.
There are roundtables with the principals regarding quality of progress at each site.
The district celebrates achievement and there is a culture of sharing challenges and
victories.
In summary, the district cultivates and encourages a culture of data use by
providing opportunities for staff to engage in quality conversations, training, best
practices dinners, and most recently, through the creation of a strategic plan.
Summary of Key Points
In sum, although none of the districts have a set policy for data use, all four
districts have a very strong culture of data use, and their expectations for improving
student achievement results are shaped by their Boards of Education’s expectations for
improving student outcomes. All four districts have established a culture of DDDM
to increase student achievement. However, each has used a different process and
strategy to accomplish that: Grovenor, a very centralized district, has established two
common performance goals, while the Legend district, a centralized district, has
published results from the CSTs and has created an urgent message from the
superintendent to close the achievement gap. In addition, it has created a common
language and framework, i.e. S.E.A.M., for improvement of instructional practices
and has asked the sites to create their own focus based on their data. On the other
hand, the White district, which has had a long history of decentralization since 2001, is at
the midpoint between centralization and decentralization and has created a common focus
on increasing student achievement by establishing four district priorities: 1) the
use of common instructional materials, 2) district-wide quarterly assessments, 3)
site-based common assessments, and 4) implementation of the Pyramid of Intervention.
In other words, it has focused on “Rigor,” “Relevance,” and “Relationships.” On the
other hand, the Worland district, a highly decentralized district, has just begun the
process of centralization through development of its strategic plan. All four districts
have been aided by outside partners in establishing a culture of data use. The
Grovenor district has worked with West Ed consultants, while Legend has worked
with external evaluators, creating the “Focus on Results” framework; and White has
worked with administrators from a high performing school district in Chicago to
create Professional Learning Communities. Last, but not least, the Worland district
has partnered with the Fall Foundation to create its strategic plan. All four districts
have also created time for collaboration through early release or late start days, as
well as minimum days to review and analyze student achievement data.
Conversations among school staffs, among district leaders and between the two,
provide opportunities for reflection and examination of the process and the results of
the actions for DDDM. District leaders create a culture of data use by engaging
themselves and the staffs in continuous inquiry by creating and sustaining open and
trusting conversations about improvement of practices. As a result, the
superintendents and other district leaders model high expectations and risk-taking
behavior through open communication to foster a climate of trust.
In all four cases, the interviews indicated that the culture of data use was
pervasive among the administrators and teachers to a large degree, although each
district, without exception, still experienced challenges associated with DDDM from
some teachers who did not “buy into” the process of data use. However, all districts
had structures in place in order to facilitate data use through professional
development, course leads, technology support personnel, or by avoiding giving too
much data to the sites. Overall, the administrators at the White district seemed to
have made the most effort in raising student awareness and use of data by providing
immediate incentives and rewards such as longer lunches, friendly competitions, or
giving students other privileges.
Table 3 below summarizes the culture of data use by each district.
Table 3: Summary of Findings on Culture of Data Use
Grovenor - Two performance goals displayed everywhere
Data charts in every classroom
Collaboration time
Students know their data
Student protocols
Data used to modify instruction
Every conversation at the district office begins with data
Teacher protocols for data analysis
Review of district and site benchmark assessments
Placement matrices
Rubrics
Professional development
Outside providers have helped establish the culture of DDDM
Legend - Urgent message from the superintendent to close the achievement gap
The superintendent’s message is posted on the website and in every
newsletter
Student awareness of data (benchmarks and results)
Student presentations to the board
Common language and focus (SEAM)
Site-specific urgent messages
Collaboration time
Review of site and district common assessments
Students know the instructional focus
Leadership team meetings focused on data
Professional development
Best practices
Outside providers have helped establish the culture of DDDM
White - Four district priorities
Professional learning communities
Collaboration time to review student assessment results to improve
instructional practices
Artifacts at each site, classroom, and the district on DDDM
Culture gives autonomy to think “outside the box” to improve student
achievement
Culture of focus, intensity, and cohesiveness around DDDM
Transparency with data
Individual student data on CAHSEE
Superintendent’s critical message based on data on the first day
“Strategies of hope” for improving student performance
Strong community support for DDDM
Best practices dinners
Student conversations around data
Class competitions
Student lunch privileges/incentives for improving performance
Teacher excitement with data use as evidenced by e-mails sent to
Assessment office
Weekly principal meetings with the superintendent
Data conferences
Data assemblies
Data protocols
Outside providers have helped establish the culture of DDDM
Worland - Board’s belief about the culture of data use
Strategic plan
Periodic presentations to the board on DDDM
Collaboration time
Culture of DDDM has created a sense of urgency to improve student
performance
Professional development ingrained in the culture
Review of results of site and district assessments
Data talks with students at the High Schools
District newsletters and info on web site based on data
Opportunities for staff to engage in quality conversations
Achievement celebrated
Culture of family friendliness with sharing data
Challenges and successes shared with all
Round Table with Principals regarding progress made on DDDM
Sites communicate quality of services and pinpoint weaknesses
All decisions made based on use of data
Intervention programs developed based on DDDM
WASC visitors and state review teams attest to the culture of DDDM
Joy and excitement around data use by teachers and administrators /Data
lead to more data
Data are used to meet community needs
Outside providers have helped establish the culture of DDDM
District Structures and Practices
This section addresses research question #3: What practices/structures do
leaders put in place in terms of data use to drive school/district improvement? The
research indicated that the structures that a district puts in place can be programs
(such as a writing or mathematics program), positions (such as course leads or
teachers on special assignment), or strategies (e.g., Thinking Maps) (Agullard &
Goughnoor, 2006). All four districts took a different approach in establishing and
identifying structures to increase student achievement. For example, the Legend
district chose to create a central message based on district-wide analysis of CST
trend data to focus all schools on the urgency of closing the achievement gap for the
Hispanic students, while it also required the schools to engage in site-based planning
that identified site-specific needs and strategies. The White district on the other
hand, a formerly decentralized district, created four district priorities focused on one
goal- improved student achievement, whereas, the Grovenor district went through a
multi-year process to establish two performance goals. Finally, the Worland district,
a decentralized district, used the strategic planning process to centralize the DDDM
process to improve student achievement. In all instances, the district leadership
teams have worked with outside providers and have involved the site leadership
teams in the process. The following section is divided into four themes:
Instructional practices, Professional Development, Outside providers, and Data
Management Systems. Each of these sections will describe how the establishment
of different structures by district leaders has contributed to DDDM to improve
student performance.
Grovenor
Instructional Practices
The district has set its vision and mission through collaborative efforts with
teachers, unions and administrators. The first step the district has taken in its journey
to DDDM is to ensure the alignment of instruction and assessment to the state
standards, followed by establishing measurable targets and benchmarks and
development of data protocols at all levels to engage in data analysis.
The district reviews student outcome data in order to predict which students
will be successful in certain programs such as the Advanced Placement (AP) and
consequently provides support for teachers and principals to better meet the needs of
the students. More specifically, it uses data from multiple sources instead of using
teacher judgment and belief regarding the subgroups when placing students in high
level courses such as AP or honors. In addition, in an effort to identify AP
potential and thereby increase enrollment in more rigorous courses, the district
provides the funds and tests all 10th grade students on the PSAT. It also offers
SAT prep courses to all juniors.
As a result of a program placement audit done at high schools, the district
identified gaps and therefore developed placement matrices and targeted
interventions. One administrator explained the process as follows: “We do an
autopsy. We do backward mapping in order to put interventions in place for students
who are in Far Below Basic in the 8th grade.” Along the same lines, the district also
looks at data for 7th graders who are not on target to graduate from high school.
Based on the data analysis, the district provides additional classes for intervention,
makes sure that these students attend summer school and after school tutoring, and
focuses on reading and writing instruction in the 9th and 10th grades for students
who were identified as “at-risk” in the 7th and 8th grades.
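To illustrate the kind of backward mapping described above, the following is a minimal, hypothetical sketch (in Python) of how prior-grade CST performance bands might be used to flag incoming 9th graders for reading and writing intervention. The band labels follow the CST's five performance levels; the student records and the flagging rule are illustrative assumptions, not the district's actual system.

    # Hypothetical sketch: flag incoming 9th graders for reading/writing intervention
    # based on their 7th and 8th grade CST ELA performance bands.
    # The band labels follow the CST's five levels; the student records are invented.

    AT_RISK_BANDS = {"Far Below Basic", "Below Basic"}

    students = [
        {"name": "Student A", "cst_ela_gr7": "Below Basic", "cst_ela_gr8": "Far Below Basic"},
        {"name": "Student B", "cst_ela_gr7": "Proficient", "cst_ela_gr8": "Basic"},
        {"name": "Student C", "cst_ela_gr7": "Basic", "cst_ela_gr8": "Below Basic"},
    ]

    def needs_intervention(record):
        """A student is flagged if either prior-year ELA band is in the at-risk set."""
        return (record["cst_ela_gr7"] in AT_RISK_BANDS
                or record["cst_ela_gr8"] in AT_RISK_BANDS)

    flagged = [s["name"] for s in students if needs_intervention(s)]
    print("Flag for 9th grade reading/writing intervention:", flagged)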
Principal meetings are dedicated to the review of data and implications. Data
teams regularly meet at the leadership/strategy academies to discuss next steps.
Content area meetings use the professional learning communities’ model to review
common assessments and engage in a cycle of continuous inquiry to improve
practices. Moreover, data teams made up of principals, counselors, AVID teachers,
special education teachers and representatives from each grade level meet quarterly
to review and analyze data. Additionally, the district’s instructional division that
includes K-12 directors including the categorical and special education program
directors, technology and assessment office personnel, as well as assistant
superintendents, meet every week, for a total of twenty hours a month, to support and
facilitate the work of K-6 and 7-12 by reviewing data from different programs. The
district gives differentiated support based on site needs. Every school has a Teacher
on Special Assignment (TOSA) to facilitate conversations, keep learning logs, and
assist with collaboration. There are teachers on special assignment to build capacity
for data use.
There is constant monitoring of the results by the district and the sites. The
staff continuously works on improving instructional delivery models and strategies.
The cycle of continuous improvement is alive and well in the Grovenor district. Principal
evaluations are based on value-added. In other words, did student achievement on
standardized tests improve as a result of the principal’s leadership and
the practices/programs that the school implemented? As a result of all
these structures, the district has continued using DDDM to establish district
performance goals, create common language, and look at trends, strengths and
weaknesses in order to further increase student performance.
Professional Development
Grovenor is a very centralized district. Although the relationship between the
district and the sites is not hierarchical, the district does not give too much individual
leeway to the sites in terms of data use. Sites have autonomy over staff development.
However, staff development must focus on areas of need based on student data, and sites must
establish site goals based on the district performance goals. The district has
instituted weekly collaboration time at the sites as a result of negotiations with the
teachers’ union. Professional development is provided for all principals and school
leadership teams to learn how to use the data warehouse and to build capacity by
training staff and providing support for accessing data and analyzing data to improve
instructional practices. In addition, there are over 20 teachers on special assignment
that provide support for classroom teachers by modeling lesson delivery, accessing
and analyzing data.
Although data analysis trainings are offered throughout the year for teachers,
participation can either be voluntary or mandatory. According to the superintendent,
“there is a super week before school starts for all teachers to review and analyze
data, and there is also a super super week (two weeks before school starts) to do so.
Despite the fact that these data trainings are voluntary, the response from our
teachers for participation has been overwhelmingly positive.” The district personnel
make site visits, observe instruction and give feedback to the administration. The
district also gives the schools a menu of best practices. After implementing best
practices, the district sustains them and scales them up and provides support.
District leaders provide data and protocols for analyzing the benchmarks.
Leadership institutes, data analysis, and use of multiple measures have been very
helpful practices for the district to establish structures for data use. More
importantly, according to the superintendent, “a clear focus” has greatly facilitated
data use, since it has helped everyone understand what the priorities of the district
are.
Outside Partners
The process of establishing district performance goals in conjunction with an
external partner from West Ed has been instrumental to building systems for data use
for teachers and principals. With the assistance of an external partner from West
Ed, the district leadership team spent three years developing two district performance
goals: 1) all students will move up a band on the CST, and those at the Proficient
level will maintain Proficient or Advanced status; and 2) all ELL students will move up
a proficiency level on the CELDT in addition to making progress on the CST. Similarly,
the district has worked with multiple external partners such as the Principals’
Exchange, the National Center for Educational Accountability to conduct data
queries for predictive analysis, Action Learning, San Diego State University, and
Learning 24/7 to build capacity for data use and improve instructional practices. The
partners have also helped develop common assessments, have audited the K-12
courses, and have provided training for the teachers in data use.
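As a simple illustration of how the first performance goal might be checked against two years of CST results, the following hypothetical Python sketch ranks the five CST bands and tests whether each student either moved up a band or, if already Proficient or Advanced, stayed there. The student records are invented for illustration and do not reflect district data.

    # Hypothetical sketch of checking Goal 1: every student moves up a CST band,
    # and Proficient students maintain Proficient or Advanced. Records are invented.

    CST_BANDS = ["Far Below Basic", "Below Basic", "Basic", "Proficient", "Advanced"]
    RANK = {band: i for i, band in enumerate(CST_BANDS)}

    def met_goal_one(prior_band, current_band):
        """Below-Proficient students must move up at least one band;
        Proficient/Advanced students must stay Proficient or Advanced."""
        if RANK[prior_band] >= RANK["Proficient"]:
            return RANK[current_band] >= RANK["Proficient"]
        return RANK[current_band] > RANK[prior_band]

    results = [("Basic", "Proficient"), ("Proficient", "Proficient"), ("Below Basic", "Below Basic")]
    met = sum(met_goal_one(prior, current) for prior, current in results)
    print(f"{met} of {len(results)} students met the district's first performance goal")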
Data Management System
The Assessment and Technology office personnel assist the sites by
providing the technology support and training for data analysis. These “assessment
dudes”, according to the superintendent, upload the data and support the sites with
considerable analysis of test data. They also make presentations at the faculty
meetings or meet individually with departments or teachers at each site, since some
teachers are still not comfortable with data use. Data Director, from Achieve
Data Systems, is used as the data warehouse, and reports are sent to the sites and to the instructional
division. The district website includes tutorials on the use of Data Director to further
assist the teachers with data use, and these have proven to be very useful.
Technology is also used to provide support for student placement in courses
such as the Advanced Placement. In addition, availability of site-based scanning
systems to import interim benchmarks into the data warehouse has been influential in
getting test results in a timely manner. The sharing of reports and use of data for
non-evaluative and diagnostic purposes have greatly facilitated data use at
the sites.
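To make the data flow concrete, here is a minimal, hypothetical sketch of the pattern described above: scanned interim benchmark results are loaded into a small warehouse table and summarized into a site-level, non-evaluative report. The table name, columns, and rows are invented for illustration; this is not the district's actual Data Director schema.

    # Hypothetical sketch: load scanned interim benchmark results into a small
    # warehouse table and produce a site-level, non-evaluative report.
    # Table, columns, and rows are invented for illustration.

    import sqlite3

    rows = [  # (site, student_id, standard, correct) - illustrative scanned results
        ("High School A", "S001", "ELA 9.2.1", 1),
        ("High School A", "S002", "ELA 9.2.1", 0),
        ("High School B", "S101", "ELA 9.2.1", 1),
    ]

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE benchmark (site TEXT, student_id TEXT, standard TEXT, correct INTEGER)")
    conn.executemany("INSERT INTO benchmark VALUES (?, ?, ?, ?)", rows)

    # Site-level percent correct by standard: diagnostic, not tied to any teacher.
    report = conn.execute(
        "SELECT site, standard, ROUND(100.0 * AVG(correct), 1) AS pct_correct "
        "FROM benchmark GROUP BY site, standard"
    ).fetchall()
    for site, standard, pct in report:
        print(f"{site} | {standard}: {pct}% correct")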
Summary
In summary, data from multiple sources are used to place students, and
placement matrices have been used for student placement instead of anecdotal
data. In addition, the district has aligned benchmarks to instruction, pacing guides,
placement tests and summative tests. It uses data warehousing and provides training
for teachers and administrators to facilitate data use. The district has also used
external consultants to develop quality assessments and benchmarks to ensure
reliability and validity; has created a common language, common goal, and a
common focus. It has also aligned the duties of district high level administrators
with directors at each level in order to more effectively supervise instruction and the
principals (focus, alignment, and coherence). The district has also established targets
and just like the Legend district, has hired the right people for the job. “We use data
as a tool to conduct an autopsy and do not use it for punitive or evaluative purposes,”
stated the superintendent. In the next section, I will discuss the structures that are
currently in place at the Legend district, which include the use of a data management
system to provide user-friendly reports, professional development for data use,
teacher specialists at each site to support data use, and restructured principal
meetings.
Legend District
Instructional Practices
Like the Grovenor district, the Legend district gives its staff access to
information and resources by bringing the right personnel on board. That has
resulted in the creation of an urgent message from the superintendent and the
development of the S.E.A.M. in each classroom that helps teachers reflect upon their
practices. Under the “Focus on Results” every district level administrator has been
trained and is responsible for coaching principals and leadership teams at each site.
Leadership teams at each site conduct walkthroughs and calibrate observations based
on the matrix that they have collaboratively developed. In addition, leadership team
meetings as well as principal meetings have been restructured to include data review
and analysis. As a result of data analysis, drop-out recovery programs such as APEX
and ADVANCED PATH have been developed. “The district has created a district
wide focus and built structures and created systems for monitoring results”, stated
the Assistant Superintendent of the Educational Services. “Each site has developed
SMART (i.e., Specific, Measurable, Attainable, Results-oriented, and Timely) goals,
and principals are required to visit each other at different sites and calibrate
classroom observations”, he continued.
There is shared responsibility for support by the district leadership. The
Cabinet looks at data; there is district-wide data analysis, but more site-
specific data analysis takes place. What also works at the Legend district is the
overall awareness of data and the use of the cycle of continuous improvement to
improve practices.
The assistant superintendent’s office, upon receiving the CELDT or AP
results, places the data in a user-friendly format before sending them to the sites. In
addition, when the AP results arrive, his office analyzes and interprets the results and
makes a presentation to the board. However, sites do their own analysis as well.
The presentation to the board simply gives an overall picture as far as the number of
tests taken at each site, and the percent passing. However, sites are required to do a
more in-depth analysis of the results in order to improve their practices. For
example, based on the “Focus on Results” framework at one of the high schools, the
school is working on the higher order thinking skills in order to increase the numbers
of students who are on the A-G program. The school is focusing on increasing the
numbers of students that meet the A-G requirements, and they have hard evidence on
how to do this. They have redesigned some of the courses in order to prepare more
students for the challenges of A-G. The school has therefore added some additional
courses and the district has supported the school by providing the resources to write
the curriculum and sending more teachers to the College Board for training.
Professional Development
The superintendent stated that going deeper into data in order to get deeper
knowledge is warranted in order to succeed, and he therefore recommended staying
the course and not jumping from one initiative to the next. He used the
following analogy to clarify his statement: “It isn’t really that complicated. I mean I
learned the best teaching strategies when I was in service. OK, because here you are
in a situation where you had to learn, let’s say a basic process like how to take apart
your rifle, how to clean it and put it back together. And you know they showed you,
they gave you hands-on examples. They told you and then they went through it in
the mud, made you take it apart, clean it and put it back together. I mean it’s like a
good football coach does, they do the exact same thing. Put the kids on the football
field, they tell them, they show them, they give them a chance to practice and look at
the results. I think it’s when you don’t accept anything other than total mastery,
that’s when kids are going to be successful.”
Professional development consists of trainer models to build capacity,
training of leadership teams by each site, videotaping class observations by
principals in order to create conversations around quality instructional strategies.
The Assistant Superintendent of Educational Services commented, “We have gone
past the data phobia by breaking down the barrier and breaking down the walls that if
constricted, education suffers.”
Teacher specialists provide support for the sites to build capacity for data use.
Leadership teams have defined what student engagement looks like and therefore
they go on instructional walkthroughs and look for quality engagements that the
leadership teams have defined. Sites are given autonomy for staff development and
initiation of new programs based on quantifiable measures. Principal evaluations are
based on value-added. Best practices are implemented and the fidelity of
implementation is being monitored closely by the site leadership teams.
The Legend district administrators consider transparency about data use as
evidenced in the superintendent’s urgent message a facilitator to DDDM. “The
coaching triads are very successful”, one district leader stated, “since these teams
communicate with each other and it’s broken down the barrier of being afraid to ask
questions and so I think we’ve had a rich dialogue on what data are you using, why
are you doing this. I don’t think principals are afraid to ask certain questions now
that maybe five years ago they were afraid to ask for fear of looking stupid.”
Data Management System
At the Legend district, Data Director has made a huge difference in the way
data have been used to improve instruction. The Superintendent stated,
“When I first got here, the district used the old Edusoft and it was a good tool
in its day, but time has passed Edusoft significantly so it really couldn’t provide the
data that we needed. So, the first thing I had to do in order to move the program
along was to determine who can provide us with a system to improve data from what
Edusoft had. As a result of doing various searches and in fact it just turned out, the
way we found it initially was my old district after I had left had found Data Director
and then, really, Data Director was developed in Garden Grove. So, I spent time with
the superintendent and really came to other conclusion that it was the right direction.
We went through a data committee to move it in that direction.”
The district uses on-line data warehousing, has a data committee, has on-line
tutorials for data use for teachers and it also offers on-site trainings to help with data
use. Depending upon what kind of support is needed at the sites, the district will
determine the type of training to be provided. For example, trainings have been
provided for staff for on-line CAHSEE programs such as the APEX and virtual HS
programs to help recover drop-outs. The Legend district considers having put data in
the hands of the principals and teachers through Data Director as a major
accomplishment.
The district leaders feel that training is extremely important since everyone
should be able to use the tools that the district makes available to them. The district
data committee is now recommending a change of its current student information
system, School Max, to Zangle in order to better monitor progress. School Max is an antiquated data
management system that no longer meets the needs of the district as far as providing
student data from multiple sources. Zangle, on the other hand, is an application that
allows users to access various aspects of student information, bringing the school's
data to the users’ fingertips with ease, whether over the Internet or the district’s intranet.
Teachers and faculty members can view online student profiles; enter classroom
news; submit attendance data; input gradebook data and student class marks; share
news and gradebook data with parents and students; track Special Education students
and health information.
Outside Partners
Unlike the Grovenor district with multiple outside partners, the Legend
district has partnered with only one outside partner, which has provided the “Focus on
Results” framework and has trained leadership teams in different cohorts to build
capacity for data use. After approximately four years of training and building
capacity for DDDM, the outside consultants will no longer be partnering with the
Legend district.
Summary
In sum, the district administrators credit the professional development “Focus
on Results” provided by the external consultants at all three levels (district,
sites, and classrooms) as very instrumental in developing capacity for data use.
Restructured principal meetings, as well as meetings in triads to discuss classroom
observations, transparency with data and technology support have created structures
and systems that have facilitated data use. Staying the course, according to the
superintendent, will allow the district to go deeper into understanding the root causes
of the achievement gap to improve student achievement.
The next section addresses programs, personnel, and strategies at the White
district that include, but are not limited to, 148 course leads to assist with data use, as
well as district-wide and site-specific common assessments and a Pyramid of
Intervention for struggling students.
White District
Instructional Practices
The White district has developed four district priorities, which are as follows: 1)
use of common instructional materials; 2) district-wide quarterly common
assessments and essential standards; 3) site-based interim assessments; and 4)
implementation of the Pyramid of Intervention. In the White district, the leadership teams
have measurable targets for student achievement and, like the other two districts, the
Board of Education holds high expectations for student achievement. The
superintendent’s annual report to the Board of Education is based on the attainment
of the measurable targets that the district has established.
The superintendent stated that the practice of data analysis is deeply rooted in
the district culture. Like the Grovenor district, this district’s leaders have also
worked closely with the teachers’ union to establish good working relationships.
The district has also established a good rapport with the five feeder middle schools
from other districts by releasing teachers to articulate their programs with the union
high school district. The superintendent stated that, “We have used data to build
financial solvency.” There are weekly principal meetings with the cabinet and the
superintendent to review instructional practices and use data to make modifications
to practices and programs. Data-driven principal and administrator evaluations are
conducted. The principals are given autonomy and resources to improve student
results and site leadership is valued and trusted by the district. There is time for
collaboration at each site either through weekly meetings, or full-day release time.
Data from research-based best practices, such as those for increasing Algebra I proficiency
rates, are used to train staff, and data are used for early intervention and for identifying
programs, such as AB 1802 supplemental counseling, that assist students with
passing the CAHSEE and staying on track for graduation.
The district leaders stated that they used data to identify the at-risk students and
therefore were one of the first districts to hire additional counselors through AB 1802
supplemental funds. In the next section, since the two themes are intimately
connected, I will discuss them together.
Outside Partners and Professional Development
The district has partnered with external partners such as Springboard to
provide principal coaching and build capacity for DDDM. The district made a great
investment by sending multiple teams to the Adlai Stevenson High School in
Chicago three years in a row to visit Rick DuFour’s former high school and district,
where structures were created to promote a collaborative culture through the
formation of the Professional Learning Communities. This high school in Chicago
had changed its focus from teaching to learning; teachers worked together to analyze
and improve their classroom practices by engaging in an ongoing cycle of questions
that promoted deep team learning; and they held themselves accountable for
increasing student achievement results. In fact, the district partnered with the former
assistant principal of the Adlai Stevenson High School to train and build capacity
for the staff at the White district. In addition, outside consultants have been hired to
evaluate programs and conduct focus groups in order to improve the quality of
intervention programs. The results of the focus groups have been sent back to the
sites to change interventions and practices (feedback loop).
The district has built capacity by distributing leadership across all levels. In fact, the
district has approximately 148 teacher leads at all sites combined in order to help
build capacity for data use, improvement of instructional practices, and technology
support. In fact, this district is unique in its use of funds to maintain a WASC
coordinator position for 6 years in order to ensure the implementation of WASC
action plans. All sites are engaged in the cycle of continuous improvement.
As in the other two districts, while there are no policies for data use at White,
there is an unwritten expectation at the district that all sites will use data to improve
instructional practices. The Cabinet looks at all kinds of data from multiple sources. Data
academies are established, and protocols are used to engage in dialogue; there are
collaborative assessment conferences, and data are handed out at the principal and
cabinet meetings that they refer to as “council.” The assistant superintendent
commented, “Principals are asking for more data, since data lead to more data and
questions”. As a matter of fact, there are four dinners a year for course leads,
directors, categorical office personnel, curriculum study teams, professional learning
communities, and interventionists (i.e., teachers on special assignment) to review
data, share best practices, and engage in identifying gaps and building on strengths.
While professional development is not mandatory, collaboration time is provided at
the sites for data analysis and engagement in quality conversations.
Data Management System
The White district officials feel that data warehousing, on-line data training
and tutorials, as well as reducing data volume, have greatly facilitated data
use at the sites. The district leaders were unanimous that data use must be done in the
context of each school and community. The Director of Assessment and
Instructional Technology commented, “Relationships and building trust allow you to
have the hard conversations without having anyone feel defensive. In addition,
working closely with the information technology to ensure successful roll-out of new
programs greatly facilitates data use”. One of the major facilitators to DDDM in this
district is the fact that the district gives the sites the autonomy to do “whatever it
takes” in order to improve student results. Another strategy that the district uses to
encourage data use is that it does not use site data from the common assessments for
evaluative purposes, in order not to intimidate teachers or discourage them from using the information to
improve their own practices. The district leaders have put safety nets in place to facilitate
and encourage data use, such as letting data users have fun and eliminating
the fear of data use by creating healthy competitions around DDDM. At the classroom
level, students are provided multiple opportunities to retake the exams in order to
redeem themselves and some schools have established mandatory tutorial times
during the day by restructuring their bell schedules.
The White district supports the sites by providing different reports generated
through the Instructional Technology and Assessment department. For example, the
district uses the EADMS data system and trains teachers to align questions to
standards and print reports. There are Tech Teacher specialists at each site; the
district is in the process of upgrading to the Zangle student information
system, since there are limitations with the current student information program
(SASI). The Technology and Assessment office disaggregates data in countless
ways, and the sites can access the data very easily.
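The following hypothetical Python sketch illustrates the general idea of aligning assessment items to standards and producing a per-standard report of the kind described above. The item map, standards, and student responses are invented for illustration; this is not EADMS's actual data model.

    # Hypothetical sketch: each assessment item is tagged with the standard it
    # measures, and responses are rolled up into a per-standard report.
    # Item map, standards, and responses are invented for illustration.

    item_to_standard = {"Q1": "Alg1 5.0", "Q2": "Alg1 5.0", "Q3": "Alg1 10.0"}

    # Student responses: 1 = correct, 0 = incorrect.
    responses = {
        "S001": {"Q1": 1, "Q2": 0, "Q3": 1},
        "S002": {"Q1": 1, "Q2": 1, "Q3": 0},
    }

    # Aggregate percent correct by standard across all students.
    totals = {}
    for answers in responses.values():
        for item, score in answers.items():
            standard = item_to_standard[item]
            correct, attempted = totals.get(standard, (0, 0))
            totals[standard] = (correct + score, attempted + 1)

    for standard, (correct, attempted) in sorted(totals.items()):
        print(f"{standard}: {100.0 * correct / attempted:.0f}% correct ({attempted} responses)")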
Summary
In conclusion, like the Legend and Grovenor districts, the White district has
built capacity for data use through collaboration with the principals, teachers and
leadership teams and through formation of the professional learning communities. It
has also reduced anxiety about data use and has broken down the barriers and walls
to data use by using student achievement data in a non-
evaluative way for the teachers and by producing user-friendly reports. In addition,
it has enhanced the accessibility and timeliness of data through the use of technology
and by providing resource teachers and leads to assist with data analysis and use. The
major difference between the White district and the other two districts is in the
number of course leads, a total of 148 teachers, to assist with data use. The next
section addresses the Worland district, which, in partnership with the Fall Foundation,
has developed a strategic plan in order to centralize activities and focus the resources
on DDDM.
Worland District
Instructional Practices
At the Worland district, the superintendent makes special presentations to the
board on DDDM. The cycle of continuous improvement occurs through the review of data
and the setting of goals and, most recently, through the strategic planning process that was
just completed in the summer of 2008. American Psychological Association (APA)
guidelines will be used for DDDM. The superintendent commented, “The strategic
plan will build capacity for non-accountability measures such as community
engagement and involvement, professional development, professional learning
communities, etc.”
Although the district does not have a set policy for data use, it expects data to
be used at all of its sites. In other words, data use is non-negotiable. Sites have
complete autonomy to determine staff development. Time for collaboration has been
created either through late start or early release days at the sites. According to a top
district administrator, “Time for collaboration has been helpful in using DDDM to
modify instruction and improve student performance.” There are ongoing
conversations with the administrators and teachers to use data for decision making.
The district has aligned the district assessments to state accountability system and
has used data to develop targeted interventions at the sites and has provided the
resources to do so. The administrators interviewed concurred that what works in
DDDM is the use of relevant data, benchmark tests, and on-going assessment throughout the
year; accessibility of data and direct access to data by administrators and teachers;
time for teachers to collaborate; training for teachers and principals in data analysis;
and lots of autonomy given to the schools due to the decentralization of the district.
According to district leaders, data protocols have been instrumental in facilitating
DDDM at the sites.
Professional Development
At the Worland district, providing as much access to data as quickly as possible, in
a meaningful way, facilitates data use. Training teachers on how to use
the information to modify instruction, as well as training departments and
paraprofessionals working with ELL students on the CELDT tests, has been
helpful. In other words, capacity building has worked well with DDDM in the Worland
district. Principals are the primary individuals who deliver data analysis. They
receive professional development by the district on data analysis, since principal
meetings have been restructured to focus on data analysis. Implementation of the
strategic plan and sustainability of the practices will further facilitate DDDM. The
instructional division supports the sites to move to the next level based on data. Due
to the decentralized nature of the district, most initiatives begin at the sites not at the
district. The best practices, such as Thinking Maps, are then scaled up, and
professional development is provided to support implementation.
There are three pre-service days for the secondary schools that are set aside for data
analysis. In addition, there are early release or late start dates when teachers
collaborate or are given training in the use of Edusoft to practice pulling
data. The administrator in charge of elementary schools stated that twice a year,
teachers are released at the beginning of the year to look at data and plan for the
year. However, for the first time last school year, K-12 met together to review data
and engage in developing plans. This process continued through the strategic plan
last summer.
The district has Data Days and it provides differentiated professional
development on data use. In addition, even classified personnel who deal with data
receive training. This is a highly decentralized district, and the principals have
complete autonomy. Principals’ meetings have been restructured and are focused on
data analysis. The Instructional Division creates a calendar on the focus areas and
sites might add to the list. Capacity building is done through professional
development on best practices and the trainer of trainer model; one example is the
trainer of trainer model for ELL support. There are lead teachers at
elementary and secondary schools to assist teachers with interventions, technology,
lesson modeling, and data analysis.
The district has started a new practice of holding K-12 meetings once a
month. Training for data analysis will be done by a network of all principals and the
external partner, the Fall Foundation. The goal of these trainings will be increased
K-12 articulation on DDDM and relationship building.
The Assessment/Technology Department Director and personnel meet with
faculty to review data and answer questions. Moreover, the district has four
articulation days a year for all secondary schools and teachers. Secondary
schools are dismissed at 12:30 p.m. on those four days in order to focus on data
analysis to improve instruction. This year, for the first time, there was K-12
articulation and cross-level data analysis. Additionally, every staff member has been trained
in the use of data.
Outside Partners
At Worland district, the partnership with the Fall Foundation has greatly
facilitated DDDM by providing increased resources to create a focus on literacy as
well as advance the strategic planning process. As a result of this partnership,
writing scores on CSTs went up to 72% proficiency and 69% in 7th grade.
The strategic plan is going to serve as “the plan” for district improvement and
focus. The superintendent stated, “This is one of the most exciting and unique
collaboration efforts embarked upon us. We are one of the only two districts in the
nation that were selected by the Fall, and that working together, this culture-
changing process engages all of us in a variety of learning and idea exchange forums
that will further our work with literacy and build upon our goal of increasing student
achievement.” However, the district has also partnered with outside consultants and
organizations such as the Cambridge group, the UCLA writing project, and the Costen
Foundation, and has worked with professors from prestigious universities in order to
build capacity and improve practices. Clerical personnel that deal with data have
been trained to assist the teachers with pulling data. Mandatory staff development
during the regular workday and hiring early retirees to assist in the
classroom while teachers are conducting benchmark assessments have also been
helpful. According to a top-level district administrator, “Once teachers see results
from their frequent assessments they get excited about using the data to further
improve their practices.” Simple strategies have also made a big difference. In
kindergarten, teaching students how to bubble in the tests has greatly helped
the teachers use the results.
Data Management System
The Director of Technology and Assessment commented, “Centralization of
data through CALPASS that most districts have joined will help facilitate data use.
The existence of good and clean data vs. raw data will have a great impact on data
use.” However, he commented, that “decentralization is both a challenge and a
facilitator to data use. Conversations around data use, i.e. timeliness of data, data
warehousing, good data vs. bad data, help change the culture and help us reinvent
ourselves in terms of how we are really going to take a look at data in a more
powerful manner to impact instruction.”
Edusoft is used for data warehousing. The district is considering changing to
a more teacher-friendly platform. The district also has a one-to-one laptop initiative
at one middle school, where every student has a laptop. The strategic
plan has many technology components in it. The Aries student information system is
used, and the assessment office provides new data to the principals and sites as needed.
Summary
In summary, the most important structure that Worland has currently put in
place is its strategic plan, which, according to the superintendent, “will help build
capacity for both accountability measures as well as non-accountability measures
such as community engagement and involvement, professional development and
professional learning communities.” The district has formed partnerships with CAL-
PASS (California Partnership For Achieving Student Success) in order to track
student progress after high school, has used data management systems to provide
easy access to data, and has built capacity through professional development on best
practices and the trainer of trainer model. It has also partnered with outside partners in
order to improve literacy and develop a strategic plan as a means of centralizing
DDDM.
Table 4 below summarizes the similarities and differences at each district
with respect to structures and practices.
Table 4: Summary of Structures/Practices in Each District
District | Instructional Practices | Professional Development | Outside Providers | Data Management System
Grovenor - Alignment of instruction
and assessment to the state
standards
- Establishment of
measurable targets and
benchmarks
- Development of data
protocols at all levels to
engage in data analysis
- Data from multiple
sources instead of using
teacher judgment and
belief
- Additional resources for
testing students in SAT and
PSAT
- Placement audit done at
high schools
- Identify gaps to develop
placement matrices and
targeted interventions
- Principal meetings are
dedicated to the review of
data and implications
- Data teams regularly
meet at the leadership/
strategy academies to
discuss next steps
- Content area meetings
use the professional
learning communities’
model to review common
assessments
- Data teams made up of
teachers, administrators
and support staff from each
grade level meets quarterly
to review and analyze data
- District’s instructional
division meeting (K-12)
and cabinet for a total of
twenty hours a month
- Teachers on Special
Assignment (TOSA) at
each site to facilitate
DDDM conversations and
collaboration
- Principal evaluations are
based on value-added
- Centralized district
- Relationship between
the district and the sites is
not hierarchical
- Sites have autonomy
over staff development
- Professional
development is provided
for all principals and
school leadership teams to
learn how to use data
warehouse and to build
capacity
- Training staff and
providing support for
accessing data and
analyzing data
- Data analysis trainings
are offered throughout the
year for teachers,
participation can either be
voluntary or mandatory
- A menu of best practices
- After implementing best
practices, the district
sustains them and scales
them up
- Leadership institutes
- Training on uploading
the data and support the
sites with considerable
analysis of test data
- On line tutorials
- Site and individual
training on use of
technology to access data
- Extensively
- West Ed
- Robert Linquanti (ELL)
- Multiple external
partners such as the
Principals’ Exchange, the
National Center for
Educational
Accountability to conduct
data queries for
predictive analysis,
Action Learning, San
Diego State University,
and Learning 24/7 to
build capacity for data
use and improve
instructional practices
- Common assessments,
K-12 course audits, and
training for the teachers
in data use also provided
by outside partners
- Data Director is
used as data
warehouse
- Student
information system
is antiquated and
needs upgrading
- Technology is also
used to provide
support for student
placement in courses
such as the
Advanced Placement
- Site-based
scanning systems to
import interim
benchmarks into
data warehouse
- Common language
- Looking at trends,
strengths and weaknesses
- District grants
“structured” autonomy to
the sites
- Weekly collaboration
time at the sites
- 20 teachers on special
assignment that provide
support for classroom
teachers by modeling
lesson delivery
- District personnel make
site visits, observe
instruction and give
feedback to the
administration
- District leaders provide
data and protocols for
analyzing the benchmarks
- Data analysis, and use of
multiple measures
- Data reports shared for
non-evaluative purpose
Legend - S.E.A.M
- Transparency with data
- Teacher specialists
provide support for the
sites to build capacity for
data use
- Leadership teams have
defined what student
engagement is
- Sites are given autonomy
for staff development and
initiation of new programs
based on quantifiable
measures
- Principal evaluations are
based on value-added
- Teacher evaluations are
analyzed to ensure
alignment to school wide
goals
- SMART goals at each
site
- Fidelity of
implementation of best
practices is monitored
closely by the site
leadership teams
- Trainer of trainer model
to build capacity
- Principal and leadership
team coaching by district
leadership teams
- Training of leadership
teams by each site
- Videotaping class
observations by principals
- Best practices are
implemented
- Coaching triads
- On-line tutorials for data
use for teachers
- On-site trainings for data
use
- Limited
- Focus on Results
consultants
- Data Director
- Switching from
School Max to Zangle
for student information
system
- On-line data
warehousing
- Data committee
- Instructional
walkthroughs to calibrate
observations based on the
matrix
- Restructured principal
meetings
- Right personnel on
cabinet
- Training of the entire
district management on
coaching
- Teacher specialists at
each site
- Cabinet reviews data
- District wide and site
specific data analysis
White - Four district priorities:
1) Common instructional
materials; 2) District wide
quarterly common
assessments and essential
standards;
3) Site-based interim
assessments; and
4) Implementing the
Pyramid of Intervention.
- Measurable targets for
student achievement
- 148 course leads to
provide technical and
instructional support
- WASC coordinator
position for 6 years
- Articulation meetings
with five feeder Middle
Schools
- Data analysis deeply
rooted in the district
culture
- Weekly principal
meetings with the cabinet
and the superintendent to
review instructional
practices
- Data-driven principal and
administrator evaluations
- Principals given
autonomy and resources to
improve student results
- Data from research-
based best practices/
Algebra I used for
training
- Principal coaching
through Springboard
Professional Learning
Communities
- Visits to Adlai Stevenson
HS in Chicago
- Four dinners a year for
all leads and staff to
review data, share best
practices, and engage in
identifying gaps
- Professional
development is not
mandatory, but
collaboration time is
provided at the sites for
data analysis and
engagement in quality
conversations
- On-line data training and
tutorials
- Extensively
- Springboard
- Outside consultants to
evaluate programs and
conduct focus groups in
order to improve the
quality of intervention
programs
- EADMS
- Switching from SASI
student information
system to Zangle
- Site leadership is valued
and trusted by the district
- Time for collaboration at
each site weekly meetings
or full-day release time
- Data are used for early
intervention and
identification of programs
- Additional counselors
through AB 1802
- Cabinet reviews data
from multiple sources
- Data academies and
protocols used to engage in
dialogue
- Collaborative assessment
conferences
- Principal/cabinet
meetings on data analysis
- Reduced data volume
- Working closely with the
Instructional technology to
roll-out programs
- Site data from the site
common assessments are
not used to evaluate
teachers
- Sites have autonomy to
do “whatever it takes” to
improve student
performance
- Healthy competitions for
DDDM
- Students are provided
multiple opportunities to
retake the exams
- Mandatory tutorial
times/restructured bell
schedules/Pyramid of
intervention
Worland - Strategic plan
- K-12 articulation
meetings a month (new)
- Data use is non-
negotiable
- Time for collaboration
- Ongoing conversations
with the administrators and
teachers regarding DDDM
- Best practices
- Mandatory staff
development on DDDM
- Clerical and certificated
staff training
- Trainer of trainer model
- Restructured principal
meetings
- Three pre-service days
for the secondary
- Extensively
- Fall Foundation
- Cambridge group
- UCLA Writing project
- Costen Foundation
- College professors from
prestigious universities
- Edusoft/switching
to a better system
- Switching to a new
Student information
system
- CALPASS
- Four articulation days a
year
- Early release days
- The district has aligned
the district assessments to
state accountability system
- Data used to develop
targeted interventions and
resources
- Use of relevant data,
benchmark tests, on-going
assessment throughout the
year
- Accessibility of
data/direct access to data
by the administrators and
teachers
- Autonomy given to the
schools due to
decentralization of the
district
- Data protocols
- Early releases or late
start dates training in the
use of Edusoft training to
practice pulling data
- Decentralized district,
best practices at the sites
are scaled up
Challenges to DDDM
All four districts have stated challenges to DDDM, in spite of the fact that
they have put into place numerous structures and practices in order to facilitate data-
driven decision making. This section describes how the district leaders have
addressed these challenges in order to further encourage DDDM in their respective
districts. At the end of each section, there will be a summary of the challenges, and
the discussion of challenges to DDDM will conclude with a summary of key points.
Worland
One of the challenges that the Worland district administrators stated was
related to a lack of coherence, collaboration, and communication between the K-5 and 6-12
levels, since elementary and secondary schools are siloed and not very collaborative.
In order to address this issue, the district leaders have just begun the joint meetings
between the K-5 and 6-12 to bridge this gap.
Another issue deals with data illiteracy among some teachers. Since these teachers do
not know how to use and analyze data, they are reluctant to access and use the information in
order to modify their practices. As a result, the instructional division, through
teacher and principal trainings and early release days, provides training for the
teachers on several occasions. In addition, the personnel from the Technology and
Assessment department provide one-on-one training on an as-needed basis. Moreover,
reports are presented in more user-friendly formats.
At the elementary level, on the other hand, a top-level elementary district
leader stated that at the beginning of DDDM, teachers were complaining about a lack
of time to conduct diagnostic assessments such as the DRA. The district therefore hired
early retirees to assist the teachers with the assessments. Once the teachers saw the
benefits of the assessments to improve student learning, they embraced the idea.
Some teachers at the secondary level do not find any utility in administering the district's benchmark assessments in order to improve instruction. They feel that the tests are for district use and not for teachers, since by the time the test results arrive, it is too late to go back and re-teach the skills and concepts that students have missed. To address this concern, the district leaders are currently evaluating the validity of those benchmark assessments and are engaging in dialogue to determine a more efficient way to administer the assessments and return the results to the sites. Options the district leaders are considering include scheduling and offering interventions during the school day and changing the district's computer platform to meet the needs of the sites more efficiently. Another challenge has to do with distinguishing between good data and bad data: there is too much data, and it is hard for teachers to identify the meaningful data that will improve student performance. To address this challenge, the Assessment and Technology department continues to provide support and training, site teams are trained in data use, and course leads assist with data use and analysis. Collaboration time is spent on reviewing and analyzing data. A major challenge for the district leaders is a limited ability to manipulate data, since the current data warehouse is insufficient and needs to be upgraded. The district is therefore reviewing best practices before it changes to a new platform.
In summary, the challenges that the Worland district is facing are as follows: lack of communication and articulation between K-5 and 6-12; data illiteracy among some teachers; lack of time to administer district benchmark assessments; perceived lack of utility of data; timeliness of data to modify instruction; distinguishing good and meaningful data from bad and useless data; and limited ability to manipulate data through the current technology platform. In the next section, I will discuss how the Legend district has decided to deal with the challenges associated with DDDM. Like the Worland district, its staff is faced with too much data, and some teachers, especially at the secondary level, do not see the value and utility of data and therefore lack ownership.
Legend
The challenges listed by the district leaders at Legend are as follows. First, not everyone is on board with DDDM. As a result, the district continues to systematically train leadership teams from all cohorts, which are made up of elementary, middle, and high school teachers in each quadrant. In addition, it uses the trainer-of-trainers model to build capacity for DDDM. According to the superintendent, due to the nature and size of the elementary schools, it is much easier for elementary teachers to buy in and use data than it is for secondary school teachers. In addition, like teachers in the Worland district, some teachers in the Legend district reject the validity of data, since they feel data are not important. The superintendent stated, "Some teachers are more focused on teaching, but do not care about student learning; therefore they do not find value in data use." The district keeps moving forward with its trainings and will continue providing support to individual sites and teachers for data use. Another challenge identified by the district leaders is engagement in too many initiatives and a lack of sustainability for data use. To address this challenge, the superintendent has assured his staff that the district will stay the course and will not introduce additional initiatives until it has closed the achievement gap. Another challenge at the Legend district is the existence of too much data, which creates analysis paralysis. Principals are overwhelmed by the amount of data in the data binders, which contain data from multiple sources. As a result, the district has started posting most of the data online and is in the process of either doing away with the binders or trimming them to only those data that the principals use most often to improve student performance. One district administrator commented, "We lose perspective that we work with students and not numbers. We need to look at data from multiple sources, not just test scores. However, too much data are not helpful."
In summary, not all teachers in the Legend district are on board with DDDM, especially at the secondary schools; teachers do not see much validity and utility in data use; too many initiatives hinder the sustainability of data use; and too much data becomes overwhelming. In the next section, I will discuss how the leaders at Grovenor have developed two performance goals in order to provide transparency with data use and alleviate teacher fears about DDDM.
Grovenor
The Grovenor district leaders named several challenges to DDDM as well. Initially, there was teacher discomfort with sharing student data, but transparency with data and the establishment of district performance goals have largely eliminated this challenge. "Data become the 'voices of children' telling us what they need," commented the superintendent. A second challenge, as in the Worland district, is that teachers feel a lack of time to analyze the data and re-teach the skills and concepts that students have not yet mastered. In order to address this issue, district leaders have worked with the teachers' unions to develop time for collaboration and have provided resources for the sites to offer targeted interventions. The superintendent commented, "Teachers feel threatened by data since they are afraid that data are used to punish or evaluate them. We therefore use data as a flashlight and not as a club." In addition, since some teachers lack the technological skills to access and use data, district leaders have provided human resources such as teachers on special assignment at each site to assist with accessing technology for data use and analysis, as well as experts from the assessment office, also known as the "assessment dudes," to train and support the teachers. Another challenge cited by district administrators is moving too fast with DDDM and assuming that people are comfortable with data use. To facilitate data use, in addition to the human resources mentioned above, teachers are provided additional training during their collaboration time. The superintendent
stated, “DDDM is like losing weight. It takes time; you have to weigh yourself
every day; watch what you eat; try not to overeat; stay away from what you like to eat as opposed to what you need to eat. You know, sometimes you get disappointed and give up, but then you go back on track. It is hard work and it takes time." Lastly, like the leaders at Worland, district leaders stated that the student data system is currently antiquated and has limited their ability to run multiple queries in order to track student progress. The system does not allow the district to look at longitudinal data in order to measure the growth of students over time, i.e., aggregate vs. disaggregated data. To address this challenge, the district is considering implementing a new student data system and has recently joined CALPASS to get longitudinal data. "I wish we had more time and money to look at individual student data at the sites," commented one administrator. She further stated, "The challenge for the schools is to find the actionable data, good data vs. bad data. Looking at data and not taking action is not going to work. We need to find the balance of using data to make a difference. That's going to be the key."
In sum, the Grovenor district's challenges to DDDM, like those of the other two districts, have to do with teacher discomfort with sharing student data; lack of time to analyze the data and re-teach the skills and concepts that students have not yet mastered; using and accessing technology for data use and analysis; feeling threatened by data use for evaluative purposes or punishment; moving too fast with DDDM and assuming that people are comfortable with data use; inability to look at longitudinal data; lack of time to look at individual student data; and finding the actionable data.
In the last section, I will address how the White district leaders have provided financial and human resources to the sites by assigning course leads to all of their sites to build capacity for data use; have given complete autonomy to the sites for data use; and have avoided using too much data, which can be overwhelming for teachers and principals. Additionally, as in the other three districts, lack of teacher buy-in continues to be a challenge at White.
White
Like their counterparts in the Grovenor and Worland districts, the White district administrators stated that one of the major challenges to DDDM was analysis paralysis caused by the existence of too much data. As a result, the district has put all reports into a user-friendly format and continues to provide professional development in DDDM for teachers and administrators. Moreover, since administrators and teachers feel that a top-down culture does not work for data use, the district has given complete autonomy to the sites to use data to develop innovative programs, such as modified and new bell schedules and new site-level programs. Since lack of teacher buy-in to get everyone on board with DDDM continues to be a challenge, the district leaders have provided 148 course leads and interventionists to assist with data use and analysis; have provided time for collaboration at each site; and have provided technology support and training for data use. The superintendent commented, "Continued rise of accountability, along with lack of quality subs and the declining state economy have posed challenges for DDDM."
Summary of Key Points
Table 5 summarizes the similarities and differences in the challenges that all four districts are facing.
Table 5: Summary of Challenges to DDDM Faced by Each District
Challenge                                            Worland  Grovenor  White  Legend
Data illiteracy/technology use                       Yes      Yes       Yes    Yes
Lack of time to re-teach or administer benchmarks    Yes      Yes       Yes    Yes
No utility for data use/teacher distrust             Yes      Yes       Yes    Yes
Lack of teacher buy-in                               Yes      Yes       Yes    Yes
Too much data/good data vs. bad data                 Yes      Yes       Yes    Yes
Technology platform limitations to manipulate data   Yes      Yes       No     Yes
Too many initiatives/moving too fast                 Yes      Yes       Yes    Yes
Lack of communication between K-5 and 6-12           Yes      No        No     No
Limited ability to use technology                    Yes      Yes       Yes    Yes
Declining resources/lack of quality subs             No       No        Yes    No
Summary of Chapter 4
Each of the primary sections in this study examined in careful detail the six key themes that were generated by the qualitative data gathered during the course of this study. As indicated previously in the chapter, these six themes are: the No Child Left Behind Act and its impact on data use, the processes of DDDM, the types of data, the culture of data use, the district structures/practices, and the challenges to DDDM in each district. NCLB and its impact on data use serves as the backdrop for examining how this law shaped state accountability systems and how that shift correspondingly influenced student performance.
NCLB and its Impact
During the course of this study, all district leaders interviewed indicated that NCLB had been influential in improving student achievement by creating a sense of urgency, which in turn led to the use of data to provide targeted interventions. The superintendent in the White district, however, made the critical observation that NCLB by itself was not a panacea for everything that ails a district, since despite the progress made under (or at least initiated by) NCLB, some schools and districts might still go into "program improvement" status. Even so, the majority of district leaders indicated that they did not perceive the enactment of NCLB as an additional burden, since a variety of assessment structures had already been put in place in their districts prior to NCLB.
Theory of Action
As observed in the detailed descriptions earlier, despite variances in the districts' theories and plans of action for data use, all four districts were similar in attempting to build greater capacity for data-driven decision making. Each of the districts directed significant resources to the sites and restructured its organization to better analyze data and enable its use to shape and improve instructional delivery. Although none of the four districts had formalized policies regarding data use, it was clear to me from having conducted the study that the theoretical and material praxis of the districts with regard to data use was aimed at improving instructional practices and student performance.
In all instances, the boards of education had high expectations for improving student achievement; correspondingly, all of the superintendents utilized different strategies to build structures and establish practices to make that happen. Each district also put in place the appropriate structures and personnel to facilitate data use, provided the necessary staff development, and restructured its principal/leadership team meetings. Consequently, these teams developed a common language and framework in order to engage in a cycle of continuous improvement. It is also noteworthy that all of the districts have teamed with external partners to build capacity for data use and to engage their staff in quality conversations geared toward learning goals and targeted benchmarks.
Types of Data
Given the findings of this study, generated from the interviews and document reviews at each district and elaborated in the previous sections, the sheer abundance of the types of data used in each of the individual districts was truly noteworthy. Given this variety, it would be unwise to overgeneralize about the types of data being used; however, it is fair to say that all four districts use demographic data and student outcome data that include standardized test results such as the California Standards Test, the California English Language Development Test, and the California High School Exit Exam. They also use data from multiple sources that range from grades to on-target-with-graduation status; a-g completion rates; college attendance; Advanced Placement (AP); district benchmarks; site common assessments; and recommendations from the Western Association of Schools and Colleges (WASC) accreditation reports. Trend data, as well as attendance and discipline data, are also used to get a more comprehensive picture. The Grovenor, White, and Worland districts, however, also use perception data, such as student, parent, and teacher surveys, to measure the effectiveness of practices.
Culture of Data Use
The "culture" of a district, at any level, is possibly the most difficult aspect to define and quantify, given the abstract nature of the term itself. For the purposes of this study, however, I did have to define and describe, as clearly as possible, the culture of data use at the districts and how district leaders cultivated it. At the outset, it would be fair to say that the use of data is an expectation of the
district leaders at all four districts, and all have worked with external partners in order to build a culture of data use. The Grovenor district has set specific goals to create a DDDM culture, whereas the superintendent at the Legend district has created a sense of urgency in the district, which further encourages DDDM to narrow the achievement gap (for Latino students specifically), and has also created a common language (S.E.A.M.) to improve instructional practices. In the Worland district, a strategic planning process is used to nurture a culture of data use, while the White district has given the individual school sites complete autonomy to use data to develop programs.
District Structures and Practices
The research indicated that the structures a district puts in place can be programs (such as a writing or mathematics program), positions (such as course leads or teachers on special assignment), or strategies (Thinking Maps, etc.). Each of the four districts took a different approach to establishing and identifying internal structures to increase student achievement. For example, the Legend district chose to create a centralized message based on district-wide analysis of CST trend data to focus all schools on the urgency of closing the achievement gap, especially for Latino students, while also requiring the schools to engage in site-based planning that identified site-specific needs and strategies. The White district created four district priorities focused on the overall goal of improving student achievement, whereas the Grovenor district went through a multi-year process to establish two performance goals. Worland, which is a decentralized district, utilized the strategic planning process to further centralize the DDDM process in order to improve student achievement.
Challenges to DDDM
In summary, as described in detail in the previous sections of this chapter, while a culture and practice of DDDM was not new to any of the districts, nor merely the result of NCLB mandates, all of the districts faced challenges in enhancing their facility for DDDM, as well as in creating specific strategies to correspondingly increase student achievement as a result of engaging in DDDM practices.
Specifically, the challenges that the districts are facing are as follows: in the Worland district there is a lack of communication and articulation between K-5 and 6-12; data illiteracy among some teachers; lack of time to administer district benchmark assessments; appropriate utilization of data; timeliness of data to modify instruction; distinguishing good and meaningful data from bad and useless data; and a limited ability to manipulate data through the current technology platform. Like the Worland district, staff at the Legend district is faced with too much data, and with some resistance from teachers, especially at the secondary level, who do not see the value and utility of data and therefore lack ownership over it; consequently, not all teachers at the Legend district are on board with DDDM. Perhaps anticipating a similar problem, leaders at Grovenor have developed two performance goals in order to provide as
much transparency as possible with data use and to alleviate teacher fears about using DDDM. The Grovenor district's challenges to DDDM, like those of the other two districts, have to do with teacher discomfort with sharing student data and a host of other related issues, such as: lack of time to analyze the data and consequently re-teach the skills and concepts that students have not yet mastered; using and accessing technology for data use and analysis; moving too fast with DDDM and assuming that people are comfortable with data use; inability to look at longitudinal data; lack of time to look at individual student data; and distinguishing between valuable and useless data. In accounting for these problems, we can see that similar challenges obtain in the White district's use of DDDM. In this district, some of the persistent problems include the analysis paralysis that can be caused by too much data; lack of teacher buy-in for DDDM; and declining resources to provide time and quality substitutes to release teachers from the classroom so that they can learn and re-learn, as necessary, the data-driven structures that are in place to enable student achievement.
CHAPTER FIVE
Analysis, Summary, Conclusion and Recommendations
Overview of the Problem
The No Child Left Behind Act of 2001 has created a heightened focus on increasing student achievement; consequently, based on current research, the use of data-driven decision making (DDDM) has become more important than ever before. This research study on best practices examined how district leaders at four school districts used DDDM to drive school/district improvement. This study also examined the challenges associated with DDDM and the strategies used to facilitate data use. Clearly, under NCLB, a new catchphrase, "We are Data Driven," has emerged and is used at all levels of the school system, from the central office to the classrooms, in order to drive continuous improvement (Marsh, Pane, & Hamilton, 2006, p. 1). Many questions therefore arise as DDDM becomes the driving force for school/district improvement. What forms of data do districts rely upon most? How do district leaders cultivate a culture of data use? What structures/policies do district leaders put in place to assist with data use? The findings in this study provide valuable information on best practices related to DDDM.
This study involved an analysis of district leaders’ actions related to the
utilization of DDDM to create a culture of inquiry that results in a cycle of
continuous improvement to improve student achievement (Datnow et al., 2007).
Most prior studies have focused on district policies and practices more generally, and less on the specific activities of district leaders in supporting DDDM; this study sought to fill that gap.
Data collection for this study was primarily qualitative: I conducted a total of 16 interviews with district administrators. The results of these tape-recorded and transcribed interviews were triangulated through document review and analysis and through a review of the interview notes. Among the major conclusions of this study were that all of the district leaders improved their districts by holding high expectations and making all adults accountable for student learning, beginning with the superintendent, assistant superintendents, the instructional division, principals, and site leadership teams, thereby providing a culture of continuous improvement. They worked with outside providers to create a common language and focus and ensured practice and program alignment and coherence at all three levels: district, site, and classroom. In addition, they used data from multiple sources to develop a theory of action, monitored results, and provided resources. Technology warehousing facilitated and provided timely data for teachers and administrators so that they could identify the information they needed to make decisions about improving programs, practices, and strategies (Mandinach, Honey, & Light, 2006). District leaders also sustained engagement in school/district improvement over time, "stayed the course," and did not take on new initiatives until they saw improved student results. In addition, professional development in DDDM was provided for principals, teachers, and site leadership teams to improve practices.
Among the common challenges that the district leaders faced were a lack of buy-in by some teachers (Feldman & Tung, 2001; Herman & Gribbons, 2001; Ingram et al., 2004); having too much data, or as district leaders referred to it, "analysis paralysis"; not knowing the difference between good data and bad data (Heritage & Yeagley, 2005); and data not being presented in a format that best assists understanding of that data (Schmoker, 2003; Lachat, 2002; National Foundation for the Improvement of Education [NFIE], 2003; Rudner & Boston, 2003; Schwartz, 2002; Streifer, 2002; Thorn, 2001).
In sum, the role that the district leaders from all four districts played in establishing practices and structures for DDDM had a profound impact on cultivating and supporting a culture of continuous improvement to increase student achievement. In the next section, I will connect this study's findings to the literature review, summarize new findings, introduce policy implications, and outline recommendations for further study. In general, it is fair to say that many of the findings in this study concur with the literature reviewed; however, there are also additional findings that nuance and add to the existing research.
Connections to Prior Research
The findings from this study can be connected to the extant literature on
DDDM in many ways. Chapter 2 examined DDDM in the following ways:
1) A review of the process and context of DDDM through the lens of various conceptual frameworks;
2) A review of data use and evidence-based practices by district leaders; and
3) Facilitators of and challenges to DDDM.
I will first briefly review the literature on the process and context of DDDM and then discuss and analyze the findings for each research question by incorporating the categories listed above.
Introduction: Process and context of DDDM
The subsection of the literature review on the process and context of DDDM identified that NCLB had created a policy context whereby schools and districts are to be held accountable for results (Firestone et al., 1998). Therefore, the state and federal governments focused on evidence-based results (Fullan, 2000) through the use of data instead of relying on the implicit knowledge, or even the intuition, of district leaders (Slavin, 2002); in short, very little was left to chance. The research presented in this section of the literature review laid the foundation for the justification of DDDM to improve student performance.
Multiple researchers, such as Kerr et al. (2003), have suggested that DDDM has a positive effect on student achievement, and they link DDDM to changes in school culture. These findings are consistent with those from this study, where a pervasive theme that emerged was the attempt by district leaders to create a culture of continuous improvement and inquiry by establishing practices/structures that build capacity for DDDM.
A RAND study (Marsh, Pane, & Hamilton, 2006) also suggested that districts with long-standing state accountability systems are more involved in DDDM. Since my study was conducted in California, where the state's accountability system dates back to the 1990s, all four districts in this study concurred that they had been involved in DDDM well before the enactment of NCLB. However, they all commented that NCLB had created a "laser-like focus" on specific subgroups in the student population, a focus that had not previously existed, at least certainly not to this degree. In the next section, I will connect the literature to the findings for research question #1. Generally, the findings in the next section are consistent with the literature review; however, the districts drew on data from multiple sources that are not cited in the literature review.
Research Question #1: What types of data do district leaders rely upon most?
The literature review laid the foundation for data use by district leaders and school personnel as a key factor in the school improvement process (Chrispeels, 1992; Earl & Katz, 2002; Protheroe, 2001; Wayman & Stringfield, 2003). The literature review analyzed the purpose and intentionality of using data and converting it into information that results in actionable knowledge (Earl & Katz, 2002). Celio and Harvey (2005) have suggested that data may offer "warnings and hints" about an organization's performance, such as student achievement gaps, student attendance, student retention/promotion, and funding equity. Heritage and Yeagley (2005) address five types of data that are most likely available to districts to drive improvement: 1) achievement tests from state standardized testing, 2) benchmark assessments, 3) formative or common assessments, 4) grading, and 5) going beyond grading, or the use of multiple measures (Heritage & Yeagley, 2005; Mandinach, Honey, & Light, 2006). According to this research, no matter what kinds of data are used, educators analyze and interpret data to develop action plans (Marsh, Pane, & Hamilton, 2006).
All of the above findings in the literature review fit closely with the actions of district leaders when using DDDM. A pervasive theme in this study was the use of data from multiple sources, instead of only one type of data, to improve student performance. However, districts in this study used additional data not listed in the literature review, such as recommendations from WASC accreditation; results of college entrance exams such as the SAT, PSAT, and ACT; school recognition programs such as California Distinguished Schools, Blue Ribbon Schools, and Schools to Watch; as well as data on completion of courses in Career Technical Education and Regional Occupational Programs (ROP), graduation rates, on-target-with-graduation status, college-going rates, placement in college courses, completion of senior projects, and the like, in order to develop action plans to drive improvement. In the next section, I will address the findings from research question #2, for which the literature review laid the foundation for the need to build school capacity through the development of a culture of inquiry and data literacy (Earl & Katz, 2003).
Research Question #2: How do district leaders cultivate a culture of data use?
The important role of the districts in improving instruction has been stressed
in the recent literature on school improvement. Districts do so by providing vision,
support, focus, and policy coordination and by building commitment at the school
level (Corcoran, Fuhrman, & Belcher, 2001; RAND, 2006). The research findings from the literature review indicated that when districts develop a culture of data use in order to improve practices, the use of data and evidence does not become uneven, superficial, or symbolic (Coburn, Toure, & Yamashita, in preparation; David, 1981; Massell, 2001). In addition, Armstrong and Anthes (2001) associated effective data use with strong leadership and a district-wide culture that supports continuous improvement.
Consistent with that research, Massell (2000, 2001) suggests that variations in DDDM across districts were due more to the leaders' beliefs about the utility of DDDM than to the high-stakes accountability context. Moreover, Marzano (2003) suggests that "leadership could be the single most important aspect of a school reform" (p. 172). Numerous studies have also stressed the importance of the role of leaders in DDDM (for example, Supovitz & Klein, 2003; Young, 2006; Lachat & Smith, 2005; Datnow et al., 2007; Kerr et al., 2006; Wayman, 2005). In this study, I found that the leadership at each district, starting with the Board of Education, held high expectations for improving student performance. As a result, all four superintendents and other district leaders established a culture of data use, where the leaders made the use of data "non-negotiable" (Datnow et al., 2007) and daily
use of data became part of the culture of the leaders in each of the organizations
(Earl & Katz, 2002).
Findings from a study by Knapp et al. (2006) suggest that since multiple actors are involved in the inquiry, the DDDM process becomes distributed. Other researchers (Elmore, 2000; Spillane, 2006) suggest that in distributed leadership, leaders find ways to motivate and sustain inquiry into problems by inviting other players to participate in framing questions, organizing data, and interpreting results for making decisions. Although the findings in this study are consistent with the research presented in Chapter Two, this study offers considerably more detail regarding the structures/practices that district leaders put in place to cultivate and promote a culture of data-driven decision making. For example, district leaders have created multiple opportunities for teachers, administrators, course leads, site leadership teams, and other school personnel, including classified personnel, to participate in "best practices" events in order to learn how others use DDDM to improve practices. In addition, transparency with data permeates the culture at each of the four districts; data lead to more data; and "quality conversations" (Datnow et al., 2007) take place in a culture of collaboration in order to ensure continuous improvement.
Other researchers have cited the need for more data-system technology, as
well as the need to present data in formats that are meaningful to school leaders and
teachers (Rudner & Boston, 2003; Schwartz, 2002; Streifer, 2002; Thorn, 2001).
The findings in this study are consistent with the literature review, since all district leaders stated that, based on input from teachers and principals, they were using technology platforms with the capability to produce more user-friendly formats for data use. One district went as far as reducing its voluminous data binders to only a few pages, based on input from site leadership teams. All four districts used technology warehousing to give the sites easy and timely access by making data available online. Although technology was available at all districts, some did not have high-level data system capability (Olson, 2002) and therefore planned to move to a new platform.
Feldman and Tung (2003), Chen, Heritage, and Lee (2005), Halcom (2001), Keeney (1998), Lachat and Smith (2005), and Symonds (2003) found that inquiry-based DDDM not only resulted in improved student achievement but also led to a more professional culture in which teachers became more reflective and engaged in professional dialogue. Each of the districts in this study has provided planning time or time for collaboration through weekly meetings, early release or late start days, and, in some cases, full-day retreats, and has provided time for sharing best practices. Additionally, according to the literature review, the use of data and evidence also requires research literacy and skills in data analysis and interpretation (Corcoran et al., 2001; Mac Iver & Farley, 2003). Unfortunately, many district leaders lack these capacities, hampering their abilities to use evidence in decision making (Burch & Thiem, 2004; Corcoran et al., 2001; David, 1981; Honig, 2003, 2004; Mac Iver & Farley, 2003; Reichardt, 2000). These prior research findings indicate that organizational capacity plays a key role in the use of evidence by district leaders (Coburn, Honig, & Stein, in press). However, according to these scholars, there is limited research on how districts foster the development of such capacity. In general, in these researchers' case studies of school districts, those that seem to use data throughout the system provide professional development on data use (for example, Snipes, Doolittle, & Herlihy, 2002). Additionally, Massell (2001) identified challenges that districts face when building capacity for DDDM; training teachers to use data to improve their instructional practices was one of the major challenges.
The findings in the literature review are not as detailed and comprehensive as those of the current study. In this study, I found that the structures district leaders put in place help promote a culture of DDDM. For example, district leaders used multiple strategies, which include providing course leads or teachers on special assignment to help with data use; providing online or personal tutorials on the use of technology to access data; restructuring leadership/principal meetings; training leadership teams; holding best-practices events; creating triads and conducting instructional walks; videotaping and critiquing classroom observations; creating a common language to improve instruction; developing district benchmarks and assessments to measure student progress; providing collaboration time or release days to analyze data; and changing grading practices, such as providing multiple opportunities for students to demonstrate mastery of standards as opposed to giving failing grades.
In addition, the RAND studies (2006) suggest that although the most common form of support for data-driven decision making is training on how to use test data, the content and quality of these trainings vary. The Edison schools in that study consistently focused the training of principals and teachers on how to interpret and translate data in order to turn it into actionable knowledge. The findings in my study are closely aligned with these findings. The district leaders have used different strategies for training principals in DDDM: through restructured principals' meetings; through creating a common language and protocols for observing instruction and defining "rigor"; through best-practices dinners; through weekly meetings with the district leaders; and through monitoring matrices, to name a few. They have also trained site leadership teams and have used the "trainer of trainers" model in order to build capacity for data use. Teachers have been trained and supported by working with teachers on special assignment and with district support personnel, such as technology and assessment staff, to manage data, ask the right questions, accurately analyze data, and apply data results appropriately (Lachat & Smith, 2005). Technology trainings and tutorials are provided either in person or online, and support personnel assist teachers with accessing data.
Love (2000) suggests that data-use strategies that involve school staff in collaborative problem solving encourage the open dialogue needed to address equity issues. At all four districts, transparency with data, restructured principal meetings, professional development, data warehousing, and the sharing of data across all schools have further promoted a culture of data use. The Education Commission of the States researchers (Armstrong & Anthes, 2001), who conducted a study of six districts in five states that had used DDDM to improve student achievement, found that a district culture focused on "service" to the principals and teachers in their use of student data, coupled with structural mechanisms for training and assessment, greatly contributed to continuous improvement. The findings in this study are aligned with the findings in the literature review: district leaders defined their role as one of "support" and "service" to the sites as they engage in a "cycle of inquiry" (Copland, 2003) to improve student achievement.
Wayman et al. (2005) support the idea of inquiry-based dialogue in order to develop a common language to define expectations and goals. Similarly, the findings in my study were closely aligned with those of the literature review. For example, the Legend district has created a common language through S.E.A.M. to improve instructional practices; the Grovenor district has created two performance goals for all students; the White district has created four district priorities that all schools must abide by; and the Worland district has created common ground through its strategic plan.
Additionally, and consistent with this study's findings, other studies have found that organizational cultures that viewed accountability as a useful vehicle for improving practices, rather than as a threat to teacher performance, encouraged complex data-driven decision making. Likewise, Wayman and Stringfield (2006) described how leaders who used a "non-threatening triangulations of data" approach, relying on multiple measures to ensure that teachers felt supported and empowered by data use instead of threatened, cultivated a culture of DDDM. As stated earlier, the superintendent at the Grovenor district commented that "data are used as a flashlight and not as a club." Additionally, none of the districts in this study used data in punitive ways to evaluate teacher performance, although the district leaders did use value-added student achievement in principal evaluations.
The review of the literature suggests that, despite emerging data-informed leadership at the district level, and except in a small number of "high performing" schools or districts, there is a lack of sufficient evidence to indicate the capacity of such activities to drive school district improvement (Honig & Coburn, 2005; Kerr et al., 2006). According to the district leaders in this study, however, the use of DDDM and a culture of continuous improvement at all levels of the organization, i.e., district, schools, and classroom, had a great impact on improving student performance.
was a daunting task to evaluate the extent of the impact of each of these practices or
programs on student achievement. For example, did student achievement increase
because the district implemented new textbooks or implemented a new intervention
program? Or, did it improve because of professional development in DDDM,
regularly scheduled time for teacher collaboration, or sharing of best practices?
Since the literature review does not address these issues, this may be a topic for
further study.
Research Question #3: What structures/policies do district leaders put in place to
assist with data use?
District practices/structures and the roles that district leaders play are crucial for building a school's capacity for DDDM. A district's structures can be positions, programs, strategies, or approaches (Agullard & Goughnour, 2006). Examples include, but are not limited to, adding course leads to a school, introducing a new writing program such as Thinking Maps, or implementing Explicit Direct Instruction (EDI), a framework for improving teaching and learning. Regardless of the specific improvement structures, the literature review found that district structures should be aligned and support each other at all three levels: district, schools, and classrooms (Agullard & Goughnour, 2006).
In general, the findings in my study are consistent with the literature review: all four districts have created systems or structures in order to build capacity for DDDM by aligning their practices at all three levels: district, schools, and classroom. In order to provide focus, coherence, and consistency (Agullard & Goughnour, 2006), district leaders stayed the course by refraining from taking on new initiatives, removed roadblocks, and used data warehousing in order to support continuous improvement. Last, but not least, although the literature review included limited citations about the role of outside providers in establishing a culture of DDDM or in developing structures/practices, my study found that outside providers at all four districts had indeed played a key role in establishing a theory of action for DDDM.
Literature on school improvement has stressed the key role that the district
office can play in improving instruction by providing focus, coherence, and
consistency and support to the sites (Corcoran, Fuhrman, & Belcher, 2001).
Agullard and Goughnour (2006) echo the same findings. These researchers
suggested that in order for the central office to effectively support the sites in improving student achievement results, there needs to be a consistent, coherent, and systemic approach among the leaders. Murphy and Hallinger's (1988) study found that strong instructional leadership from the superintendent, a strong focus on curriculum and instruction, consistency and alignment of instructional activities, and focused monitoring of curriculum and instruction are instructionally effective strategies for driving school district improvement. Other, more recent studies of districts identified as high performing, conducted in Texas (Ragland, Asera, & Johnson, 1999; Skrla, Scheurich, & Johnson, 2000) and North Carolina (Public Schools of North Carolina, 2000), echoed similar themes to those Murphy and Hallinger (1988) found, with special emphasis on the role of district leaders in improving student performance.
Similarly, the Consortium for Policy Research in Education (CPRE) examined the
roles that the central office leaders play in shaping and supporting reforms in three
large urban districts and found similar results.
My study's findings were aligned with the findings in the literature review. All four superintendents provided vision, focus, and leadership; removed distractions and roadblocks; created a common language; provided time for collaboration; established a strong focus on improving curriculum and instruction; developed consistency and alignment of instructional activities; and focused their efforts on monitoring curriculum and instruction to drive school district improvement.
Kerr et al. (2003) conducted a thorough study of three urban districts and found that the most influential factors affecting data use were: accessibility and timeliness of data; perception of data validity; training and support for teachers in data analysis and interpretation; the alignment of data strategies with other instructional initiatives; and distributed leadership. Consistent with the literature review, this study also found that district leaders expressed a concern that the results of the state standardized tests did not arrive in a timely manner to assist teachers with proactive planning, since teachers are supposed to develop action plans based on the "trailing data" (Datnow et al., 2007), i.e., data from past tests, and then use the "leading data" (Datnow et al., 2007), i.e., data from interim or formative assessments, in order to modify instruction. As a result, district leaders mentioned that there was a lack of teacher buy-in in terms of data use, since teachers did not perceive the data from the standardized assessments as valid. As mentioned earlier, this study also found that districts had created distributed leadership by providing multiple teachers on special assignment to support data use, as well as by training district leadership teams, as in the Legend district, to serve as coaches to the site principals and leadership teams in order to engage in a cycle of continuous improvement.
In Earl and Katz's (2003) study, outcomes associated with district actions to promote DDDM were stronger in two of the three districts that had invested more time and energy in DDDM. In addition, in that study, school staffs reported that they used multiple forms of data from multiple sources to carry out instructional planning and decision making. Teachers also repeatedly used their planning time to review assessment results and other data to design targeted interventions. In addition, the accessibility of data in user-friendly formats further facilitated data use for decision making by the sites. These districts also provided technical assistance to support schools in interpreting assessment results. Last, but not least, they partnered with outside providers in order to build capacity for DDDM.
This study's findings closely correlated with that prior research. All districts gave the sites planning time to review and analyze data in order to develop targeted interventions and engage in a cycle of inquiry to improve practice. In addition, district leaders started producing reports in a format that was easy to understand, avoiding analysis paralysis. District leaders also provided technical support and training for teachers in accessing multiple sources of data. Last, but not least, all four districts partnered with outside providers, which greatly contributed to creating a culture of continuous improvement.
On a final note, the literature review did not include any information
regarding the impact of DDDM in centralized vs. decentralized districts, although
this study references that issue in its findings, particularly in two districts: Worland
and White. Worland used the strategic planning process in order to create more "centralized" practices through technology warehousing for DDDM, whereas the White district considered "decentralization," i.e., removing all roadblocks and mandates from the district office, as a means of promoting DDDM. On the other hand, both the Grovenor and Legend districts claimed to be very centralized districts, while giving "structured" autonomy to the sites to proceed with caution when implementing innovative practices/programs to increase student achievement. This gap may be filled by future studies.
Summary of Findings in Relation to Existing Literature
The literature review suggests that districts that promote a DDDM environment do so by relentlessly pursuing a culture of inquiry and establishing practices/structures to support the improvement of practice. Others studying the same phenomena have come to the same conclusions regarding the central role that leadership plays (for example, Kerr et al., 2006; Supovitz & Klein, 2003; Mason, 2002). These researchers contend that the absence of such strong advocacy for data use will hamper a culture of inquiry. In sum, the literature review in Chapter 2 on the role of district office leaders in driving school/district improvement suggests that district office leaders support the sites in building capacity for DDDM by:
• Holding high expectations and making all adults accountable for
improved student performance
• Providing leadership and support for curriculum and instructional
practices
• Training and supporting principals and teachers to become “data-literate”
• Providing administrative support to promote good teaching
• Developing a culture of inquiry where improving student performance
becomes the shared responsibility of every staff member at the district
office and the schools
• Focusing on instructional practice and allocation of resources to research-
based and coherent professional development
• Providing laser focus on data analysis and alignment of curriculum,
assessment and practices
• Establishing focused, coherent reform efforts aligned to the site initiatives
The findings in this study confirmed the findings of the prior research. However, the new findings, summarized below, are anticipated to add to the existing body of literature:
• A "laser-like focus" on specific subgroups in the student population, intended to provide equity, had not existed to this degree prior to NCLB
• Additional data used by district leaders include: recommendations from the Western Association of Schools and Colleges (WASC), results of college entrance exams, school recognition programs, course completion data, on-target-with-graduation rates, college-going rates, placement in college courses, completion of senior projects, etc.
• More detailed information on structures and practices for cultivating a
culture of data use
• Not all districts have high-level data system capability (Olson, 2002)
• Transparency with data use promoted a culture of best practices
• Data warehousing is conducive to a more centralized culture for DDDM
• Evaluation of the impact of each of the structures/practices/programs on
student achievement remains unknown
• Key roles of outside providers in establishing a culture of DDDM
In the next section, I will discuss implications for future research. While this study attempted to fill some of the gaps in the research on DDDM, much research still needs to occur in the area of DDDM to improve student achievement. On a final note, the following comment made by two district leaders at the Worland district best summarizes the impact of this study on the "next steps" for DDDM at that district: "The types of questions you asked during the interview made us reflect upon our role as district leaders, the impact of our existing practices, and the next steps. As a result of your research project and our strategic plan, we have partnered with a renowned data 'guru' to take us deeper into our quest with DDDM to improve student performance."
Implications for Future Research
Under NCLB, the effectiveness of district-level practices in using DDDM to drive continuous improvement will ultimately be measured by students' mastery of content standards in mathematics and English Language Arts by the year 2014. The literature on school improvement has stressed the key role that the district office can play in improving instruction by providing focus, coherence, consistency, and support to the sites (Corcoran, Fuhrman, & Belcher, 2001). Also, Feldman and Tung (2003), Chen, Heritage, and Lee (2005), Halcom (2001), Keeney (1998), Lachat and Smith (2005), and Symonds (2003) found that inquiry-based DDDM not only resulted in improved student achievement but also led to a more professional culture in which teachers became more reflective and engaged in professional dialogue. While this study offered valuable insight into the perceptions of district leaders in implementing DDDM to drive school/district improvement, future research is needed to deepen understanding in the following areas:
• Evaluating the quality and effectiveness of professional development
offered to the teachers and principals in data analysis process
• Analyzing the perceptions of teachers who utilize technology
warehousing to modify instruction
• Analyzing student perceptions of improvement of teacher practices in
light of DDDM
• Analyzing student achievement results in centralized vs decentralized
districts
• Analyzing the effectiveness of different leadership styles in
promoting/cultivating a culture of DDDM
• Analyzing the financial impact of establishing a culture of DDDM in light of declining economic resources
The next section will address implications for policy and practice. Recommendations are geared toward the state, universities, and district leaders.
Implications for Policy and Practice
Based on the findings of this research as well as the findings in the literature review, departments of education, commissions on teacher credentialing, colleges and universities, and districts and schools should consider the following actions:
1) Colleges and Universities: It is recommended that institutions of higher
education develop courses in training teachers on the process of DDDM,
use of technology to access data from multiple sources, and use of
different protocols to engage in inquiry to improve practices.
2) California Commission on Teacher Credentialing: It is recommended
that the commission require all teachers to demonstrate mastery of the DDDM process by adding a test and developing rigorous passing criteria.
3) District Leaders:
A) It is recommended that district leaders include scenarios regarding
DDDM in every interview process in order to assess teachers’ and
prospective administrators’ ability to engage in a cycle of continuous
improvement.
B) It is recommended that districts require every teacher, counselor, administrator, and support staff member to become certified in DDDM. This requirement may be fulfilled by taking classes at colleges or universities, developing a portfolio, or completing a mandatory three-tiered district training for certification. The certificates would be subject to renewal. The training should include the following components: the process of DDDM, the context of DDDM, types of data, distinguishing between good data and bad data, accessing data through technology warehousing, turning data into actionable knowledge, monitoring progress, and evaluation.
C) It is recommended that the entire district leadership team be trained in improving instruction and engaging in a cycle of inquiry by coaching and working intimately with the principals and site leadership teams; among the districts studied, this was done only at the Legend district.
D) It is recommended that district leaders follow the model of the Legend
district by reviewing administrator observations on a regular basis in
order to ensure alignment of instruction to school-wide goals to close
the achievement gap.
E) It is recommended that districts develop transparency with data
through the use of technology as a platform to share best practices,
lesson plans, common formative assessments, and student data so that
all schools within the districts have access to the same information.
F) It is recommended that districts further develop and build out their capacity to collect, synthesize, and analyze data and information to determine or gauge the specific "added value" to instruction of decisions made in areas beyond instruction, such as finance, human resources, professional development, technology, support services, food services, transportation, and school safety, so as to advance DDDM best practices and improvements systemically. These findings can then be used to determine where to invest new resources, redirect existing resources, or reduce resources in a declining resource environment.
G) It is recommended that school district leadership teams systemically align all existing and emerging data collection systems across administrative units, divisions, and departments, and develop an integrated, holistic data-driven decision making system focused on student success. Such a system would allow for analysis of the impact of decisions in one administrative unit on each of the other units and would identify the coordination, collaboration, and communication needs that will advance student success and create increased effectiveness and efficiency system-wide.
H) It is recommended that district technology leaders develop
integrated technology infrastructures and integrated software systems
that facilitate the cross-fertilization and analysis of data and
information across administrative units and district-wide, so as to
allow for user-friendly analysis of decision making throughout the
system and of its relationship to, and impact on, classroom instruction.
I) It is recommended that providers of products and services to school
systems ensure that their offerings have the built-in capacity to
facilitate a systemic approach to data collection, analysis, and DDDM.
For example, district software and data management systems should
allow for the integration of all district functions beyond instruction and
should include built-in formulas that permit school districts to assess
"cost-benefit ratios" or "returns on investment" by calculating the
corollary effect of decisions made in finance, professional
development, human resources, and other areas on the amount of new
value added in the classroom, as measured by student progress and
achievement as well as other measures of success. (A brief illustrative
sketch of such a calculation appears at the end of this section.)
4) Site Administrators/Teachers/Support Staff:
A) It is recommended that principals and site administrators continually
improve not only their own instructional leadership skills, including
data analysis, access to and use of data, instructional coaching, and
observation of instruction, but also empower teacher leaders and
counselors to do the same in order to build capacity for distributed
leadership in DDDM. This can be accomplished through faculty
meetings, department meetings, release days, common planning time,
best practices dinners or lunches, instructional walkthroughs, and
meetings with colleagues within and across districts.
B) It is also recommended that site administrators designate staff
members who will provide data from multiple sources to facilitate
DDDM for teachers, so that teachers do not spend time mining the
data.
C) It is recommended that each site form its own data teams or
committees, with representation from classified staff and each
academic department, in order to be proactive in addressing any
challenges that may arise as a result of DDDM. Moreover, the
committee members can serve as "data dudes," as the superintendent at
the Grovenor district put it, in order to assist teachers.
D) It is recommended that sites and the district develop incentives to
recognize teachers who consistently improve student performance.
5) Students
A) It is recommended that the processes of DDDM and Professional
Learning Communities be incorporated into all aspects of students'
academic experience, such as orientations, counseling, elective or
academic courses, and attendance at faculty meetings, so that students
understand and appreciate the process that the educators working with
them go through in order to improve their instructional practices and
increase student achievement.
B) It is recommended that more individual student incentives be
developed in order to further motivate students to improve
performance.
6) Parent Education: It is important for parents to receive training and a
handbook that help them understand how to use different types of data,
learn the purpose of each type of data, and analyze and interpret data.
Training may include real-life scenarios in which a parent uses data from
multiple sources in order to get to the root causes of a child's problems.
The handbook can be designed in a simple, user-friendly format, with
pictures and with definitions and examples cited for the types of data of
which parents most need to be aware. In culturally diverse districts, the
handbooks may be translated into the home languages of the students.
7) Professional Networking through Technology: It is recommended that
collaborative efforts be established by developing a professional
organization that can help unite districts across the state. The use of
media such as the Internet, blogs, podcasts, social networking websites,
webinars, and video conferences can facilitate the sharing of best
practices.
In summary, in order to create a culture of DDDM to increase student
achievement, all stakeholders at different levels must engage in practices that are
focused on improving teaching and learning.
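To make recommendation 3(I) above concrete, the brief sketch below shows one way a district data team might compare the cost-effectiveness of two non-instructional decisions against measured student gains. It is a minimal illustration only, written in Python: the initiative names, dollar amounts, enrollment figures, and score gains are invented for the example, and the calculation is not drawn from the districts studied or from any particular vendor's software.

    # Hypothetical cost-effectiveness comparison, illustrating recommendation 3(I).
    # All figures and initiative names are invented; a real district system would
    # pull these values from its integrated data warehouse.

    def cost_effectiveness(total_cost, students_served, mean_score_gain):
        """Return cost per student and cost per point of score gain."""
        cost_per_student = total_cost / students_served
        cost_per_point = (cost_per_student / mean_score_gain
                          if mean_score_gain else float("inf"))
        return cost_per_student, cost_per_point

    # Two hypothetical initiatives funded outside the instructional budget.
    initiatives = {
        "Data-coaching release days": {"cost": 120000, "students": 2400, "gain": 6.0},
        "Benchmark-assessment software": {"cost": 90000, "students": 2400, "gain": 2.5},
    }

    for name, d in initiatives.items():
        per_student, per_point = cost_effectiveness(d["cost"], d["students"], d["gain"])
        print(f"{name}: ${per_student:.0f} per student, ${per_point:.0f} per point gained")

Run as written, the sketch simply prints each initiative's cost per student and cost per point of achievement gain. The design point it illustrates is that once spending data and assessment data live in the same system, such comparisons become routine queries rather than special studies.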
Conclusion
The goal of NCLB, at both the state and federal levels, has been to draw
attention to the achievement gaps that exist among groups of students based on
language, socioeconomic status, race, and ethnicity. While NCLB
seeks to increase the role that districts play in monitoring performance and providing
technical assistance to the sites, there is limited research that focuses on the role of
the district leaders who use DDDM to assist schools in improving performance.
Although this study provides timely and pertinent information on district leaders'
roles and the practices that they put in place to develop a culture of continuous
improvement, as well as on the challenges that they face in cultivating a culture of
data use, much more research needs to be done on the use of DDDM to improve
student performance.
Clearly, the benefits of DDDM far outweigh the challenges, as it creates a culture
of professionalism and a cycle of continuous inquiry in which "quality conversations"
(Datnow et al., 2007) can take place around the effectiveness of instructional and
organizational practices to improve student performance. Ultimately, this study will
be useful to district leaders who utilize DDDM to ensure school and district
improvement. For districts that do not currently use DDDM, it is hoped that the
findings will be both inspirational and instrumental in assisting district leaders to
adopt a theory of action.
In conclusion, Daggett’s (2008, p.7) words of wisdom best capture the
essence of my study: “Every day I am reminded about the many inspiring schools
across the nation that are making great strides in their efforts to increase student
achievement. Just as often, educators ask me what they can do to improve their
schools. My answer is that solutions don’t need to be invented; they need to be
shared so we can learn from one another.”
BIBLIOGRAPHY
Ackley, D. (2001). Data analysis demystified. Leadership, 31(2), 28-29, 37-38.
Ackoff, R.L. (1989).From data to wisdom, Journal of Applied Systems Analysis, 16,
3-9.
Agullard, K., Goughnour, D.(2006).Central Office Inquiry: Assessing Organization,
Roles, and Actions to Support School Improvement. West Ed.
Ainsworth, L., & Viegut, D.(2006). Common Formative Assessments: How to
Connect Standards-Based Instruction and Assessment. Corwin Press,
Thousand Oaks.
Altheide, D.L. “Ethnographic Content Analysis.” Qualitative Sociology, 1987, 10(1),
65-77.
Alwin, L. (2002). The will and the way of data use. School Administrator, 59(11), 11.
American Association of School Administrators, 2002
Armstrong, J., & Anthes, K. (2001). How data can help. American School Board
Journal, 188 (11), 38-41.
Bainbridge, W., & Lasley, T. (2002).Demographics, diversity, and K-12
accountability: The challenge of closing the achievement gap. Educational
and Urban Society, 34, 422-437.
Baker, E.L. (2003). From usable to useful assessment knowledge: A design problem
(CSE Technical Report 612). Los Angeles: University of California, National
Center for Research on Evaluation, Standards, and Student Testing.
Bay Area School Reform Collaborative (BASRC, 2001). Turning Point: Annual
report. San Francisco, CA: Author.
Bernhardt, V.L.(2003). No schools left behind. Educational Leadership, 60 (5), 26-
30.
Bickel, W. E., & Cooley, W. W. (1986). Decision-oriented educational research.
Evaluation in education and human services. Boston: Kluwer-Nijhoff Pub.
Bransford, J., Brown, A., & Cockling, R. (1999). How people learn: Brain, mind,
experience, and school. Washington, DC: National Academy Press.
Breiter, A. (2003). Information-knowledge-sense-making: A theoretical analysis
from Management/business literature. Unpublished manuscript, Bremen,
Germany.
Brunner, C., Fasca, C., Heinze, J., Honey, M., Light, D., Mandinach, E., & Wexler,
D (2005). Linking data and learning: The Grow Network Study. Journal of
Education for Students Placed at Risk. 10(3), 241-267.
Burch, P, & Thiem, CH (2004). Private Organizations, School Districts, and the
Enterprise of High Stakes Accountability. Unpublished manuscript.
California Department of Education. Data Quest. Retrieved from
www.cde.ca.gov on October 28, 2007.
California Student Success Project. Retrieved from http://studentsuccessproject.org/
on July 25, 2008.
Cawelti, G., & Protheroe, N. (2001). High student achievement: How six school
districts Changed into high-performance systems. Arlington, VA:
Educational Research Service.
Celio, M.B.& Harvey, J. (2005). Buried treasure: Developing a management guide
from mountains of school data. http://www.crpe.org/pubs.shtml#leadership
Chen, E., Heritage, M., Danish, J., Choi, S., & Lee, J. (2003, August). Evaluating the
web-based quality school portfolio: Year two research report (CSE
deliverable). Los Angeles: University of California, National Center for
Research and Evaluation, Standards and Testing.
Chen, E., Heritage, M., La Torre, D.,& Lee, J. ( 2005 January). Evaluating the web-
based quality school portfolio: Final Report (CSE deliverable). Los Angeles:
University of California, National Center for Research and Evaluation.
Choppin, J.( 2002).Data Use in Practice: Examples from the School Level”. Paper
presented at the Annual Conference of the American Educational Research
Association, New Orleans, La., April 2002.
Chrispeels, J.H. (1992). Purposeful Restructuring: Creating a Culture for Learning
and Achievement in Elementary Schools. Peter Mortimore Institute of
Education University of London.
Cizek, G. J. (2000). Pockets of resistance in the assessment revolution. Educational
measurement: Issues and Practices, 19(2), 16-23.
Coburn, C.E., Honig, M., & Stein, M.K. ( in press).What’s the evidence on districts’
use of evidence? In J. Bransford, L.Gomez, D.Lam, & N. Vye (Eds.).
Research and Practice: Towards reconciliation. Cambridge: Harvard
Educational Press.
Coburn, C., Toure, J., & Yamashita, M. (in press). Evidence, interpretation, and
persuasion: Instructional decision making at the district central office.
Teachers College Record.
Colby, S., Smith, K., & Shelton, J.(2005). Expanding the supply of high-quality
public schools. San Francisco, CA: The Bridge span Group.
Consortium for Policy Research in Education (CPRE). The District Role in Building
Capacity: Four Strategies. CPRE Policy Briefs. September 2000.
Copland, M.A. (2003). Leadership of inquiry: Building and sustaining capacity for
school improvement. Educational Evaluation and Policy Analysis, 25(4),
375-395.
Corcoran, T.B, Fuhrman, S.H, Belcher, C.L, (2001).The District Role in
Instructional Improvement. Reprinted from Phi Delta Kappan, Volume 83,
Issue 1, September 2001, Pages 78-84.
Council of Great City Schools (2002). Beating the odds II. Washington, DC: Council
of the Great City Schools.
Cromey, A. (2000). Using Student Assessment Data: What Can We Learn From
Schools? Oak Brook, IL: North Central Regional Educational Laboratory.
Cronbach, L.J., Ambron, S.R., Dornbusch, S.M., Hess, R.D., Hornik, R.C., &
Phillips, D.C., et al. (1980). Toward reform of program evaluation. San
Francisco: Jossey-Bass.
Daggett, W. (2004). Knowledge and Data Management. The Case Against Zero.
Phi Delta Kappa International, Inc.
Daggett, W. (2008). International Center for Leadership Education.
Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with Data: How high
Performing School Systems use Data to Improve Instruction for Elementary
Students.” Los Angeles: Center on Educational Governance, Rossier School
of Education, University of Southern California. Available at
http://www.newschools.org/viewpoints/AchievingwithData.pdf
Deci, E. & Ryan, R. (1985). Intrinsic motivation and self-determination in human
behavior. New York: Plenum.
Deci, E. & Ryan, R. (1987). The support of autonomy and control of behavior.
Journal of Personality and Social Psychology, 53, 1024-1037.
Dembosky, J.W., Pane, J.F., Barney, H., & Christina, R. (2005). Data driven
decision making in Southwestern Pennsylvania school districts. Santa
Monica, CA: RAND
Doyle, D. P. (2003). Data-driven decision making: Is it the mantra of the month or
does it Have staying power? T.H.E. Journal, 30(10), 19-21.
Drucker, P.F. (1989). The new realities: In government and politics/in economics
and business in society and world view. New York, NY: Harper & Row.
Earl, L. (2000). Accountability and assessment: improvement or surveillance?
Education Canada. Canadian Education Association.
Earl, L., Katz, S. (2002). Leading Schools in a Data-Rich World. Ontario Institute
for Studies in Education, University of Toronto
EdSource. (2006). Similar Students, Different Results: Why Do Some Schools Do
Better? Mountain View, CA: EdSource. Available at:
http://www.edsource.org/pdf/simstusumm06.pdf.
Elmore, R.F. (2002b). The limits of “change”, Harvard Education Letter Jan./Feb.
2002, 18(1), 7-8.
Englert, R.M., Kean, M.H., & Scribner, J.D. (1977). Politics of program
evaluation in large city school districts. Education and Urban Society, 9(4),
429-450.
Feldman, J., & Tung, R. (2001). Whole school reform: how schools use the data-
based inquiry and decision making process. Paper presented at the annual
meeting of the American Educational Research Association, Seattle, WA.
Fetler, M.(1989).School drop-out rates, academic performance, size, and poverty:
Correlates of educational reform. Educational Evaluation and Policy
analysis, 11(2), 109-116.
Fullan, M. (2000).The return of large-scale reform. The Journal of Educational
Change. OISE, University of Toronto: Klower Press.
Firestone, W., Mayrowetz, D., & Fairman, J. (1998). Performance-based assessment
and Instructional change: The effects of testing in Maine and Maryland.
Educational Evaluation and Policy Analysis, 20 (2), 95-113.
Fisher, D., & Frey, N. (2007). A tale of two middle schools: The role of structure and
instruction. Journal of Adolescent and Adult Literacy, 51, 204-211.
Greeno, J.G., & the Middle School Mathematics through Application Project Group
(MMAP). (1998). The situativity of knowing, learning, and research. American
Psychologist, 53(1), 5-26.
Goertz, M.E. (2001). Standards-based accountability: Horse trade or horse whip? In
Susan Fuhrman (Ed.), From the capitol to the classroom: Standards-based
reform in the States. One hundredth yearbook of the National Society for the
Study of Education Part II (pp. 39-59). Chicago, IL: The University of
Chicago Press.
Goertz, M.E., Duffy, M., & Carlson-LeFloch, K. (2000).Assessment and
accountability systems:50 state profiles. Philadelphia: Consortium for Policy
Research in Education.
Gross, B., & Goertz, M. E. (Eds.) (2005, March). Holding high hopes: How high
schools respond to state accountability policies. (CPRE Research Report
Series No. 56) (Chapter 5). Retrieved November 14, 2005 from
http://www.cpre.org/Publications/rr56.pdf
Halverson, R., Grigg, J., Prichett, R., & Thomas, C. (2005). The new instructional
leadership: Creating data-driven instructional systems in schools. Paper
presented at the annual meeting of the National Council of Professors of
Educational Administration, Washington, D.C.
Hamilton, L.S., Stecher, B.M., & Klein, S.P. (2002). Making sense of test-based
accountability in education. Santa Monica, CA: Rand.
Hannaway, J. (1989). Manager’s Managing. The Workings of an Administrative
System. New York. Oxford University Press.
Heritage, M., &Yeagley, R. (2003). Moving to Better Practice. Data Use and School
Improvement: Challenges and Prospects. Los Angeles: University of
California, National Center for Research and Evaluation, Standards, and
Student Testing (CRESST).
Heritage, M., & Yeagley, R. (2005.). Data Use and School Improvement:
Challenges and Prospects. Yearbook of the National Society for the Study of
Education 104 (2), 320–339
Herman, J., & Gribbons, B. (2001). Lessons Learned in Using Data to Support
School Inquiry and Continuous Improvement: Final Report to the Stuart
Foundation, Los Angeles, CA: National Center for Research on Evaluation,
Standards, and Student Testing.
Holcomb, E.L. (2001). Asking the Right Questions: Techniques for Collaboration
and School Change (2nd ed.). Thousand Oaks, CA: Corwin.
Honey, M., Brunner, C., Light, D., Kim, C., McDermott, M., Heinze, C., Breiter, A.,
& Mandinach, E. (2002). Linking data and learning: The Grow Network
Study. New York: EDC/center for Children and Technology.
Honig, M., I.( 2004). “The New Middle Management: Intermediary Organizations in
Education Policy Implementation.” Educational Evaluation and Policy
Analysis 26(1): 65-87.
Honig, M., & Coburn, C.E. (2005). When districts use evidence for instructional
improvement: What do we know and where do we go from here? Urban
Voices in Education (6), 22-26.
Howley, C., & Bickel, R. (2000). When it comes to schooling… Small works: School
size, poverty, and student achievement. Randolph, VT: Rural School and
Community Trust.
Ikemoto, G.S, Marsh, J.A.(2006).Cutting Through the “Data-Driven” Mantra:
Different Conceptions of Data-Driven Decision Making. RAND.
Ingram, D., Louis, K.S., & Shroeder, R.G. (2004). Accountability policies and
teacher decision making: Barriers to the use of data to improve practice.
Teacher College Record, 106(6), 1258-1287.
Jamentz, K. (2001). Accountability Dialogues: School Communities Creating
Demand from Within. WestEd, San Francisco, CA.
Johnson, S. M. (1996). Leading to change: The challenge of the new
superintendency. San Francisco: Jossey-Bass.
Johnson, J. H. (1997). “Data-Driven School Improvement.” ERIC Digest, no. 109.
Johnson, J. H. (1999). Educators as researchers. Schools in the Middle, 9(I), 38-41.
Johnson, R.S. (2002). Using data to close the achievement gap: How to measure
equity in our schools. Thousand Oaks, CA: Corwin Press.
Keeney, L. (1998). Using data for school improvement: Report on the second
practitioner’s Conference for Annenberg challenge sites. Providence, RI:
Annenberg Institute for School Reform at Brown University.
Kennedy, M.M. (1982a). Working knowledge. In M.M. Kennedy ( Ed.), Working
Knowledge and other essays ( pp. 1-28). Cambridge, MA: The Huron
Institute.
Kennedy, M.M. (1982b). Evidence and decision. In M.M. Kennedy (Ed.), Working
Knowledge And other essays ( pp. 59-103). Cambridge, MA: The Huron
Institute.
Kerr, K.A., Marsh, J.A., Ikemoto, G.S., Darilek, H., & Barney, H. (2006). District
wide strategies to promote data use for instructional improvement. American
Journal of Education, 112, 496-520.
Knapp, M. S., Swinnerton, J.A., Copland, M.A., & Monpas-Huber, J. (2006). Data-
informed Leadership in Education. Center for the Study of Teaching and
Policy. University of Washington.
Kohlmoos, J. (2005). At the Tipping Point: Knowledge-driven Reform. Published in
the December 5, 2005 issue of the “School Improvement Industry Weekly.”
Lachat, M.A., & Smith, S. (2005). Practices that support data use in urban high
schools. Journal of Education for Students Placed at Risk, 10(3), 333-349.
Lachat, M.A. (2001). Data-Driven high school reform: The Breaking Ranks model.
Providence, RI: LAB at Brown University.
Lafee, S.(2002).Data-driven districts. School Administrator, 59(II), 6-7, 9-10, 12, 14-
15.
Langer, J., Colton, A.B., & Goff, L.S. (2003).Collaborative analysis of student work:
Improving Teaching and Learning. Alexandria, VA: Association for
Supervision and Curriculum Development
Lapp, D., Fisher, D., Flood, J., & Cabello, A. ( 2001). An Integrated Approach.
Literacy Assessment of Second Language Learners - Allyn & Bacon.
Lemons, R., Luschei, T., & Siskin, L.S. (2003). Leadership and the demands for
standards-based Accountability. In M. Carnoy, R. Elmore, and L.S. Siskin
(Eds). The new accountability: High Schools and high stakes-testing (pp. 99-
128). New York: Routledge Falmer.
Light, D., Wexler, D., & Heinze, J. (2004, April). How Practitioners interpret and
link data to Instruction: Research findings on New York City Schools’
implementation of the Grow Network. Paper presented at the annual meeting
of the American Educational Research Association, San Diego, CA.
Linn, R.L., Baker, E.L., & Betebenner, D.W. ( 2002). Accountability systems:
Implications of requirements of the No Child Left Behind Act of 2001.
Educational Researcher, 31(6) 3-16.
Love, N. (2004). “Taking Data to New Depths.” Journal of Staff Development 25(4):
22-26.
MacIver, D.J., & Balfanz, R. (2000). The school district’s role in helping high –
poverty Schools become high performing, In B. Gaddy (Ed.), Including at-
risk students in standards-based reform ( pp.35-69). Aurora, CO: Mid-
continent Research for Education and Learning (McREL).
Mandinach, E. B., Honey, M., & Light, D. (2006). A Theoretical Framework for Data-
Driven Decision Making. EDC Center for Children and Technology. Paper
presented at the Annual meeting of AERA, San Francisco, April 9, 2006.
Marsh, J., Kerr, K., Ikemoto, G., Darilek, H., Sutttorp, M.J., Zimmer, R., et al.
(2005). The role of districts in fostering instructional improvement: Lessons
from three urban districts partnered with the Institute for Learning. MG-361-
WFHF. Santa Monica, CA: RAND Corporation. Retrieved December 18,
2006, from http://www.rand.org/pubs/monographs/MG361/
Marsh, J.A., Pane, J.F., Hamilton, L.S.( 2006). Making Sense of Data-Driven
Decision Making in Education. RAND Research.
Marzano, R.J. (2003). What Works in Schools: Translating research into action.
Alexandria, VA: Association for Supervision and Curriculum Development.
Marzano, R. J. (2007). The Art and Science of Teaching. A Comprehensive
Framework for Teaching. Association for Supervision and Curriculum
Development (ASCD). Alexandria, VA
Marzano, R.J. (2000). Transforming classroom grading. Alexandria, VA:
Association for Supervision and Curriculum Development (ASCD).
Mason, S. (2002). Turning data into knowledge: Lessons from six Milwaukee public
schools. Madison, WI: Wisconsin Center for Education Research. Paper
presented at the annual conference of the American Education Research
Association, New Orleans, LA.
Massell, D. (1998). The Theory and Practice of Using Data to Build Capacity: State
and Local Strategies and Their Effects.” In From the Capitol to the
Classroom: Standards-Based Reform in the States, ed.Susan H. Fuhrman.
Chicago: University of Chicago Press.
Massell, D. (2000). The district role in building capacity: Four strategies (CPRE
policy briefs). Philadelphia, PA: Consortium for Policy Research in
Education.
Massell, D. (2001). “The Theory and Practice of Using Data to Build Capacity: State
And Local Strategies and Their Effects.” In From the Capitol to the
Classroom: Standards-Based Reform in the States, ed. Susan H. Fuhrman.
Chicago: University of Chicago Press.
McIntire, T. (2002). The administrator’s guide to data-driven decision making.
Technology and Learning, 22(II), 18-33.
McLaughlin, M.W., & Talbert, J.E. ( 2002). Communities of practice and the work of
high school teaching. Chicago: University of Chicago Press.
Mac Iver, M.A., & Farley, E. (2003). Bringing the District Back In: The Role of the
Central Office in Improving Instruction and Student Achievement. Published
by the Center for Research on the Education of Students Placed At Risk
(CRESPAR).
Merriam S.B. Case Study Research in Education: A Qualitative Approach. San
Francisco: Jossey- Bass, 1988.
Merriam, S.B. Qualitative Research and Case Study Applications in Education. San
Francisco, Jossey-Bass, 1998.
Murphy, J., & Hallinger, P. (1988). The characteristics of instructionally effective
school districts. Journal of Educational Research, 81(3), 175-181.
Murnane, R.J., Sharkey, N.S., & Boudett, K.P. (2005). Using student-assessment
results to improve instruction: Lessons from a workshop. Journal of
Education for Students Placed At Risk, 10(3), 269- 280.
National Center for Research on Evaluation, Standards, and Student Testing
(CRESST). UCLA.
National Education Association for the Improvement of Education (NFIE), 2003
Natriello, G., McDill, E.L., & Pallas, A.M. (1990).Schooling disadvantaged
children: Racing against catastrophe. New York: Teachers College Press.
NEA Foundation for the Improvement of Education. 2003. Using Data about
Classroom Practice And Students Placed at Risk 10(3): 269-80.
O’Connor & McDermott (1997). Engineering Systems Thinking: A Multifunctional
Definition Engineering Systems Thinking. Springer Netherlands. Volume 14,
Number 3, June 2001.
O’Day, J., Bitter, C., Kirst, M., Camoy, M., Woody, E., Buttles, M., Fuller, B.,
Ruenzel, D. (2004). Assessing California’s Accountability System:
Successes, Challenges, and Opportunities for Improvement. PACE Policy
Brief, 4(2). Retrieved November 14, 2005 from http://www-
gse.berkeley.edu/research/PACE/policy brief.04-2.pdf
Olson, L. (1997). The data dialogue for moving school equity. California Tomorrow
Perspectives, 5.
Patton, M.Q. (1990). Qualitative Evaluation Methods (2nd ed.). Thousand Oaks,
CA: Sage.
Popham, W.J. (1987).The merits of measurement-driven instruction. Phi Delta
Kappan, 68, 679-682.
Popham, W.J., Cruse, K.I., Rankin, S.C., Sandifer, P.D., & Williams, P.L. (1985).
Measurement-driven instruction: It's on the road. Phi Delta Kappan, 66,
628-634.
Portin, B., Beck, L., Knapp, M., & Murphy, J. (In Press). The School and self-
reflective Renewal: Taking stock and moving on. In B. Portin, L. Beck, M.
Knapp, and J. Murphy (Eds), Self-reflective renewal in schools: Lessons from
a national initiative: Westport, CT: Greenwood.
Protheroe, N. (2001). Improving teaching and learning with data-based decisions:
Asking the right questions and acting on the answers. Educational Research
Service Spectrum, 19(3). Retrieved November 30, 2004, from
http://www.ers.org/spectrum/su
Ragaland, M.A., Asera, R., & Johnson, J.F. (1999). Urgency, responsibility, efficacy:
Preliminary findings of a study of high-performing Texas school districts.
Austin, TX: University of Texas at Austin, The Charles A. Dana Center.
Reeves, D. B. (2004). Accountability in action: A blueprint for learning
organizations. Englewood ,CO: Advanced Learning Press.
Roberts, W.J., & Smith, J.K. (1982). American Studies Seminar Research Papers.
Published by American Antiquarian Society.
Rudner, L.M, & Boston, C (2003). Data warehousing: Beyond disaggregation.
Educational Leadership, 60(5), 62-65.
Scribner, J.P., Cockrell, K.S., Cockrell, D.H., & Valentine, J.W. (1999). Creating
professional Communities in school through organizational learning: An
evaluation of a school improvement process. Educational Administration
Quarterly 35(1), 130-160.
Secada, W. (2001, Spring). From the director. Using data for educational decision
making. Newsletter of the Comprehensive Center-Region VI, 6, 1-2.
Senge, P. (1990). The Fifth Discipline: The Art and Practice of the Learning
Organization, New York: Doubleday, 1990.
Senge. P. (1999). The dance of change. New York: Doubleday.
Shepard, L. (1991). Will national tests improve student learning? (CSE Technical
Report 342). Los Angeles: Center for the Study of Evaluation, UCLA.
Schmidt, W.H., C. McKnight, and S. Raizen, A Splintered Vision: An Investigation
of US Science and Mathematics Education, Boston: Kluwer Academic
Publishers, 1997.
Schmoker, M. (2004). Tipping point: From feckless reform to substantive
instructional improvement. Phi Delta Kappan, 55(6), 424-432.Retrieved
March 24, 2006, from http://pdkindtl.org/kappan/k0404sch.htm
Schmoker, M. (1996). Results: The key to continuous school improvement.
Alexandria, VA: Association for Supervision and Curriculum Development.
Schmoker, M. (2003). “First Things First: Demystifying Data Analysis.”
Educational Leadership 60(5):22-24.
Skrla, L., Scheurich, J.J., & Johnson, J.F. Jr. (2000, September). Equity-driven
achievement-Focused school districts. Austin, TX: University of Texas. The
Charles A. Dana Center.
Slavin, R. (2002).Evidence-based education policies: Transforming educational
practice and Research. Educational Researcher, 31(7), 15-21.
Smith, L.M. (1978). “An Evolving Logic of Participant Observation, Educational
Ethnography and Other Case Studies.”. In L. Shulman ( ed.), Review of
Research in Education. Itasca, Ill.: Peacock, 1978.
Snipes, J., Doolittle, F., & Helihy, C. (2002).Foundations for Success: Case Studies
of How Urban Schools Improve Student Achievement. New York: MDRC.
Spillane, J.P. (2000). Policy Implementation and Cognition: Reframing and
Refocusing Implementation Research, Review of Educational Research, Vol.
72, No. 3, 387-431 (2002).
Spillane, J.P., Reiser, B.J., & Reiner, T. (2002). Policy implementation and
cognition: Reframing and refocusing implementation research. Review of
Educational Research, 72, 387-431.
Spillane, J.P. & Miele, D.B.( 2007). Evidence in Practice: A Framing of the Terrain.
Northwestern University’s School of Education and Social Policy, Kellogg
School of Management and Institute for Policy Research.
Starbuck, W.H, & Milliken, F.J. (1988). Executives’ perceptual filters: What they
notice and how they make sense. In D.Hambrick (Ed.). The executive effect:
Concepts and methods for studying top management ( pp. 35-66). Greenwich,
CT: JAI
Stecher, B, Hamilton, L. & Gonzalez, G. ( 2003).Working Smarter to Leave No
Child Left Behind. Retrieved November 14, 2005 from
http://www.rand.org/pubs/whitepapers/WP138/WP138.pdf
Streifer, P.A., & Schumann, J.A. (2005). Using data mining to identify actionable
information: Breaking new ground in data-driven decision making. Journal
of Education for Students Placed at Risk, 10(3), 281-293.
Stringfield, S. (1997). Underlying the chaos of factors explaining exemplary U.S.
elementary Schools: The case for High Reliability Organizations. In T.
Townsend ( Ed.), Restructuring and quality: Problems and possibilities for
tomorrow’s schools (pp. 143-160). London: Routledge.
Supovitz, J., & Taylor, B.S. (2003). The Impact of Standards-based Reform in Duval
County, Florida, 1999-2002. Philadelphia, PA: Consortium for Policy
Research in Education.
Supovitz, J., & Klein, V. (2003). Mapping a course for improved student
learning: How innovative schools use student performance data to guide
improvement. Philadelphia: Consortium for Policy Research in Education.
Symonds, K.W. (2003). After the Test: How schools are using data to close the
achievement gap. Springboard Schools. Available at
http://www.springboardschools.org/research/other_research.html.
Thorn, C. (2002). Data use in the classroom: The Challenges of implementing data-
based Decision-making at the school level. Madison, WI: University of
Wisconsin Center for Education Research
Togneri, W., & Anderson, S. (2003). Beyond Islands of Excellence: What districts
can do to improve instruction and achievement in all schools. Washington,
D.C.: Learning First Alliance.
Tyack, D., and L. Cuban, Tinkering Toward Utopia, Cambridge, Mass.: Harvard
University Press, 1995
Wayman, J.C., & Stringfield, S. (2006). Technology-supported involvement of entire
faculties in examination of student data for instructional improvement.
American Journal of Education, 112(4), 549-571.
Wayman, J.C., Stringfield, S., & Yakimowski, M. (2004). Software enabling school
improvement through analysis of student data (Report 67). Baltimore, MD:
Johns Hopkins University, Center for Research on the Education of Students
Placed at Risk (CRESPAR).
Weick, K.E. (1995). Sensemaking in organizations. Thousand Oaks, CA: SAGE
Publications.
Weiss, J. (2007). Conditions for Student Success: The Cycle of Continuous
Instructional Improvement. Working Paper 4. March 14, 2007.
Weiss, C.H., Murphy-Graham, E., & Birkeland, S. ( 2005). An Alternate Route to
Policy Influence. American Journal of Evaluation, Vol. 26, No. 1, 12-30.
Wheatley, M.J. (1999). Leadership and the new science: Discovering order in a
chaotic world (2nd ed.). San Francisco: Barrett-Koehler.
Weinbaum, E.H. (2005). Stuck in the Middle With You: District Response to State
Accountability. (CPRE Research Report Series No. 56) (Chapter 5).
Retrieved November 14, 2005 from
http://www.cpre.org/Publications/rr56.pdf
Yin, R.K. (1994). Case Study Research: Design and Methods (2nd ed.). Thousand
Oaks, CA: Sage.
APPENDIX
Interview Protocol
(Adapted from Datnow et al., 2006)
Data Driven Decision Making
District Administrator Interview Protocol
Participant’s Name: _______________________ Date: __________________
Position: _________________________________________________________
[Introduction: I will begin with a few minutes of explaining the study, introducing
myself, and the purpose of the study. I will explain that while the interview will be
taped, their responses are strictly confidential. I will let them know if there is
something they would like to say off tape, they can inform me and the recorder will
be shut off for their comment. Also, I will let them know the approximate length of
the interview and ask if they have any specific questions before beginning.]
I. Background- Laying the Foundation
Before I ask you specific questions about your role and district practices, I would
like to start by asking you some general questions about your district and its
surrounding neighborhood in order to gain a broader understanding of the context in
which you work.
1. To get a sense of your district, community, schools, and students, could you
tell me a bit about the history of your district, focusing on the last five years
(e.g., particular reform initiatives, strong partnerships with external groups,
major structural changes, etc.)?
2. Please tell me your title and your role in the district/system, particularly with
respect to data use.
3. How long have you been in this role? What is your prior experience and
training?
4. Describe the district's leadership group (e.g., cabinet). Could you tell me
about this group (e.g., members, function, organizational structure)?
5. Is data-driven decision making a priority of the school board/board of
directors?
II. Data-Driven Decision-Making Processes
A. Goals
1. What is your theory of action for collecting and using data? Is there an
overall model/plan that your district/system relies on to collect, analyze, and
use data? Could you describe the process? (Probing for organizational model
for how DDDM process should flow).
2. What are the student performance goals for this district/system and the
schools within it?
How were these goals established?
How do you know when the goals have been met? Probe:
benchmarks, indicators.
3. To what extent have NCLB and state accountability systems influenced your
performance goals for students? Has anything else influenced your goals for
students?
4. What are the expectations of the district/system about how schools should use
data?
B. Sources and Types of Data
I would like to ask you some questions about how groups and individuals work with
data, focusing on data that is used to improve student learning. This can include a
diverse array of data, such as state assessments, district-developed benchmark
assessments, portfolios, and teacher-developed tests.
1. Please describe the types of student achievement data the district/system
collects:
a. Are the data from standardized tests?
b. Are they from tests the district/system has developed?
c. Which of the tests are mandated by federal/state policy?
d. Are there tests which the district/system elects to administer on your
own?
e. Are there school-level assessments which you collect?
2. Are there other types of data related to school performance that you collect
(e.g., stakeholder surveys- parents, students, community members, teachers
& research from other institutions)?
3. Does your district/system work with an external partner in the area of data-
driven decision-making? What has the external partner contributed (i.e., tests,
surveys, software, professional development)?
4. Who is in charge of data collection and management at the district/system
level – number of full-time staff? Does the district analyze the data? Who is
in charge of analyzing the data and disseminating it to schools? What tools
are used (e.g. assessment, data analysis, data warehousing, role of
technology)?
C. Uses of Data (School and District/system Levels)
1. Please describe the types of data/reports the district provides to schools. How
often?
2. Can you provide an example of when the district/system used information
about student performance to make changes in the district/system
organization, instructional delivery, or performance goals? (Probing for
evidence of continuous feedback loop).
3. How are student performance data used to make decisions about:
a. Instructional programs?
b. Professional development?
c. District organization and staffing?
d. District budget?
e. Principal evaluation /value added?
Can you provide a recent example?
III. Structural Support Mechanisms
1. Has the district/system put in place any policies or practices to encourage the
use of data in schools?
2. Has your district/system sponsored professional development for schools that
focus on using data to make decisions? In what specific areas? Is the
professional development offered voluntary or mandatory?
3. At the district/system level, is there a person or team of people who support
schools in using performance data to improve decision-making?
4. Does the district/system support or provide for school-level staff who assist
teachers in using data?
5. To what extent do principals have decision-making authority over
professional development? Over curriculum and instruction? Over staffing?
Over budget? Can you think of an instance in which a principal used their
decision-making authority to make a change based on student performance
data?
6. What kinds of training do principals and site administrators receive for
analyzing data to drive school improvement?
IV. Culture of Data Use
1. Do you think this district/system has a culture of data-driven decision
making? How would visitors know that this was a data-driven district?
2. To what extent is the idea of continuous improvement discussed at the
district/system level? Can you share with us an example of an area identified
through data as in need of improvement?
3. What does the district/system do to encourage the culture of data use in
schools? How would you know such a culture if you saw it? What does it
imply for principals? Teachers? Students? What have been the challenges?
4. Are schools recognized publicly or rewarded in other ways for using
performance data to inform decisions?
5. How would you characterize the relationship between the district/system and
individual schools – collaborative, hierarchical?
6. How would you characterize the relationships among schools – collaborative,
cooperative, and competitive? Can you give some examples of how schools
share/collaborate/compete with one another?
V. Results, Outstanding Needs, and Sustainability
1. What problems have you seen schools run into in trying to use data for
decision making? How has the district/system tried to help the school?
2. Reflecting upon your data-driven decision making practices at the
district/system level…
What works?
What doesn’t? Why?
What do you wish you knew that you cannot find out today?
What do you wish you were able to do (if you had the information,
tools, practices, decision-making authority, etc.)?
3. What do you see as the next steps for the district to support schools’ use of
data?
4. Interviewer: Choose among the following final questions:
a. What do you consider to be your district/system’s top 3 major
accomplishments in the area of data use?
b. How have federal or state policies facilitated or impeded your work in
this area?
c. In education, we have a tendency to move from one reform to
another. How will you work to sustain the reforms in the area of data-
driven decision-making?
[Concluding Remarks/Questions: Is there anything else we should know? Thank
them for their cooperation and time. Inform them that I will share my report with
them once it is done and that I might need to contact them for follow-ups]
Document Request:
• Overview of performance system?
• Organization chart?
• Partnerships with external organizations?
• Strategic plan?
• Staff development plan?
• Superintendent’s report to the board?
Abstract
Forty years after the Elementary and Secondary Education Act, numerous school reforms have attempted to tackle the same problems initially addressed by the ESEA. Most recently, the No Child Left Behind (NCLB) Act of 2001, the latest in a series of educational reform initiatives, distinguishes itself by its unparalleled focus on accountability.