DATA-DRIVEN DECISION-MAKING PRACTICES THAT SECONDARY PRINCIPALS
USE TO IMPROVE STUDENT ACHIEVEMENT
by
Marco A. Sanchez
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
May 2015
Copyright 2015 Marco A. Sanchez
Dedication
This dissertation is dedicated to my beautiful wife, Maria Elena Sanchez, who has been
an unconditional supporter of each and every professional and personal goal of mine. Her love,
eternal optimism, and genuine kind spirit have guided me through this and every challenge from
day 1. I also dedicate this dissertation to my first-born son Max, whose unexpected arrival during
the writing of this dissertation gave it additional meaning that was not present when I chose to research data-driven decision-making practices to improve student achievement. I love them both.
Acknowledgments
This dissertation is evidence of the power of mentorship from numerous educators from
whom I have had the pleasure to learn and benefit through my academic and professional career.
Although it is impossible to name them all, naming a select group is warranted.
My high school mathematics teacher, Mr. Omer Hassan, initially nurtured my academic
resiliency. He took time to teach me that in the face of adversity, education is the greatest
equalizer. He influenced my choice of becoming a math educator and never gave up on me.
Dr. Patricia Hale and her husband Chuck, through their friendship and mentorship,
guided me not only through undergraduate and graduate math degrees but also through many of
life’s struggles.
Dr. Stephen Davis, pioneer and director of the innovative Great Leaders for Great
Schools Academy at Cal Poly Pomona, shaped and nurtured a leadership fervor within me that
drives how I lead. His passion for building leaders is inspiring, and I am very fortunate to have
benefitted from his work.
Mr. Roger Fasting gave me my first administrative job when he hired me as his assistant
principal. His “just-make-it-happen” leadership philosophy was a pleasure to experience and
taught me that a no-excuses mentality is necessary in this job.
Mrs. Suzanne Steinseifer-Ripley taught me that despite how chaotic being a high school
administrator can be, one can still take time to laugh and realize that "life is good." She is a dynamic,
humble leader who is not afraid to be innovative and refuses to accept anything less than 100%
from people.
The central office leadership of Pomona Unified—Superintendent Richard Martinez,
Deputy Superintendent Stephanie Baker, Assistant Superintendent Darren Knowles, Director
Fernando Meza, Director Monica Principe, and Director Cesar Casarrubias—all took time to
develop my leadership. They exemplify model leadership and work selflessly for the students of
Pomona.
Thank you to Dr. Michael Escalante for his review of this dissertation and his dynamic
lectures that gave my classmates and me leadership knowledge from his vast experience.
Words cannot describe the honor of having Dr. Pedro E. Garcia, the embodiment of the
Trojan Spirit, serve as a committee member. His story and leadership legacy are an inspiration to
me, and I am grateful for his captivating storytelling that made going to class after a long day at
work something to look forward to during my 1st year in the program. Fight on!
A special thank you to Dr. Rudy Max Castruita for his guidance and support as my dis-
sertation chair. It has been a privilege to learn as a student from his quintessential leadership
experience. Dr. Castruita's "Castruita-isms" and his core belief in the critical role that leaders
play in the fight to improve the educational experience of all students will drive me for years to
come.
Finally, I want to thank my parents, Humberto and Eustolia Sanchez, who, despite never having set foot in a secondary school, have been my best teachers. Their simple message of
perseverance and hard work led my siblings and me to self-actualize. I am eternally grateful for
their unconditional love and courage to embark on a 2-day hike to reach this amazing country.
Table of Contents
Dedication
Acknowledgments
List of Tables
List of Figures
Abstract
Chapter One: Overview of the Study
Background of the Problem
Types of Data
Statement of the Problem
Purpose of the Study
Importance of the Study
Assumptions
Limitations
Delimitations
Definition of Terms
Chapter Two: Literature Review
Overview of DDDM
Definitions
Misconceptions About DDDM
Components of DDDM
Leadership
Professional Development
Uses of Data
Challenges to DDDM
Timely Access to Data
Data Validity
Chapter Summary
Chapter Three: Research Methods
Purpose Restated
Methodology
Sampling Strategy
Data Collection
Consent and Data Security
Data Analysis
Assumptions
Validity and Reliability
Internal Validity
External Validity
Reliability
Reporting of the Findings
Chapter Four: Findings
Coding of Data
Research Questions
Themes Related to Research Question 1
Theme Related to Research Question 2: Commitment to Product-Driven PD Time
Themes Related to Research Question 3
Theme Related to Research Question 4: Staying the Course
Chapter Summary
Chapter Five: Summary, Implications, Recommendations, and Conclusion
Statement of the Problem Restated
Guiding Research Questions
Summary of Results and Findings by Research Question
Research Question 1
Research Question 2
Research Question 3
Research Question 4
Ancillary Findings
Summary of Findings in Relationship to Current Literature
Implications for Practice
Recommendations for Future Research
Conclusion
References
Appendix: Interview Questions
List of Tables
Table 1: California Standards Test Proficiency Rates for 2013 in English Language Arts (ELA) and Math
Table 2: Information Regarding Schools of Principals Participating in Study
Table 3: Relationship Between Research Questions and Associated Emerging Themes
Table 4: Summary of Building Leadership Capacity, by School
Table 5: Summary of Setting Clear Expectations, by School
Table 6: Summary of Commitment to Product-Driven Professional Development (PD), by School
Table 7: Summary of Rebranding Classroom Walkthroughs, by School
Table 8: Summary of Strategic Conversations, by School
Table 9: Summary of Staying the Course, by School
Table 10: Proficiency Rates on 2013 California Standards Test for Secondary Students (Grades 9–11)
List of Figures
Figure 1: No Child Left Behind proficiency targets in English language arts
Figure 2: No Child Left Behind proficiency targets in mathematics
Figure 3: Framework for describing the data-driven decision-making process
Figure 4: Data-wise improvement process
Figure 5: Framework for data transformation into knowledge
Abstract
The No Child Left Behind (NCLB) era has brought about many changes in education,
most notably cultivating a culture of accountability through yearly metrics. Meeting the growth targets set by NCLB had implications for practice, including meticulous analysis of standardized assessment results with the goal of improving them. A mechanism that leaders use to achieve improvements in standardized testing results is building a culture where stakeholders use assessment data to drive instructional decision making, commonly known as data-driven decision making (DDDM). The purpose of this qualitative study was to learn from
the DDDM processes in place at school sites that were able to use such practices to improve
student achievement.
The study focused on the actions that secondary principals undertook to transition their
school sites from a data-rich and action-poor culture to establishing procedures that converted
assessment data into actionable data that drove the instructional decision-making processes. The
study sample population was 6 secondary principals whose school sites met the following
criteria: (a) implementing a DDDM process; (b) 3 consecutive years of meeting Academic
Performance Index (API) growth targets set by NCLB, with an initial API under 800 during the
1st year included in the study; (c) a student testing population that included at least 25% English learners and 50% socioeconomically disadvantaged students; and (d) consistent leadership during the 3 years included in the study.
The findings of this qualitative study add to the understanding of DDDM and supplement previous literature on the topic by providing a contemporary view on DDDM from 6 secondary principals. The study focused on secondary principals and the DDDM strategies that they used to improve student achievement. The findings were grouped into six themes, each of which supported one of four research questions: (a) building leadership capacity, (b) setting clear expectations, (c) commitment to product-driven professional development, (d) rebranding classroom walkthroughs, (e) strategic conversations, and (f) staying the course. This study should be of specific use to leaders who are seeking sustainable DDDM strategies to improve student achievement at secondary sites.
CHAPTER ONE: OVERVIEW OF THE STUDY
The role of educational leaders has changed substantially over the past 30 years. Educa-
tional leaders have transitioned from operating in an environment where the value of public
schooling is assumed to one where it has to be demonstrated (Anderson, Leithwood, & Strauss,
2010). The impetus for this change began with the publication of A Nation at Risk (National
Commission on Excellence in Education, 1983), which brought assessment data to the forefront with the intention of persuading the American public that American students were falling behind their international counterparts in academic skill and ability. The report
gave traction to the public’s demand to improve education and ultimately became the genesis of
the standards reform movement that has led to the current high-stakes accountability educational
environment. In addition to A Nation at Risk making educational reform a public demand, the
report naturally led to the inclusion of educational reform as a political ticket item. Conse-
quently, in 1991 President George H. W. Bush introduced America 2000, which required report
cards on the progress of schools and supported national standards (U.S. Department of Education
[USDOE], 1991). President William J. Clinton followed with the Improving America’s Schools
Act of 1994, which required states to develop their own standards. Most noteworthy was Presi-
dent George W. Bush’s No Child Left Behind (NCLB) Act of 2002, which not only required
standards and assessments but also tied them to metric barometers that would lead to sanctions for schools and districts failing to meet growth targets (California Department of Education [CDE], 2013a). More recently, the American Recovery and Reinvestment Act (ARRA) of 2009 included, as one of its four pillars, a requirement that districts use data to drive decision making (Mandinach, 2012). Each of these policies and reports sought to achieve one fundamental goal: improving
student achievement. However, a 30-year analysis of National Assessment of Educational
Progress (NAEP) scores, the only cross-state metric available during this time frame, indicated
that student achievement has not undergone significant change or growth (Mintrop & Sunderman, 2009). Schlechty (2001) proposed an explanation for this lackluster growth, suggesting that "the reason schools have not improved is that they have changed so much and so often with so little effects that leaders seem baffled about what to do next" (p. 2). Educational leaders undoubtedly find themselves in an accountability-rich environment brought on by local, state, and federal policies whose laser focus provides a wide range of yearly metrics that take center stage for all conscientious educational leaders wishing to avoid sanctions.
Background of the Problem
The Public Schools Accountability Act (PSAA) of 1999 (CDE, n.d.), a comprehensive
reform policy that held all stakeholders accountable for improving student achievement, was
passed in California. The intent of the legislature was to have a “comprehensive and effective
school accountability system [whose] primarily focus [is] increasing academic achievement”
(52050.5[d]). The passage of PSAA laid a foundation for California to satisfy the provisions of
NCLB and ultimately led to the birth of the Standardized Testing and Reporting (STAR) pro-
gram, which assessed learning on statewide adopted academic content standards. For an average-sized high school of 1,900 students, this amounts to upwards of 4,000 standardized assessments each year
(CDE, 2013b). The accountability reform that started 30 years ago has led to an unparalleled
focus on standardized testing and has brought with it a slew of benefits and problems. The
assessment-driven culture nurtured under NCLB has cultivated a widely accepted notion of
focusing on "bubble kids," or students who are close to becoming proficient as defined by state standardized tests (Cawelti, 2006; Dee & Jacob, 2011; Hamilton et al., 2009; Lauen & Gaddis, 2012). Cawelti (2006) referred to this focus on borderline students as schools playing a "funny numbers game" (p. 65) to falsely imply that there are true gains in student achievement and suggested that this strategy of focusing on borderline students might explain the growth seen in some states.
Aside from a free-response essay, all other components of the STAR program consist of standardized testing in a four-answer, multiple-choice exam format (CDE, 2013b). Critics of the STAR program have challenged the validity of multiple-choice assessments as a means of measuring student learning, considering the diverse student population. For example, as of 2013, there were 60 different languages spoken by California students, although 85% of English learners (ELs) spoke Spanish as their primary language (CDE, 2013a). Language is only one of many subgroups for which educational leaders were accountable under NCLB; others included socioeconomically disadvantaged students, racial groups, and students with disabilities (SWD). On the other hand, prior to NCLB and ARRA, there was little to no legislation requiring schools to address achievement gaps among disadvantaged populations.
During the NCLB era, school site leaders needed to cope with the pressure of being con-
sidered a Program Improvement (PI) school if their schools failed to meet annual growth targets
for 2 consecutive years, thereby leading school site leaders to face two contradictory forces when
it came to improving student achievement. On the one hand, an educational leader needed to
invest time and energy to carefully design systems and protocols that improved learning experi-
ences that led to higher student achievement, as measured on standardized tests, while at the
same time coping with external pressures that mandated immediate results. This dichotomy
frustrated educational leaders because it is well established that sustainable long-lasting change
takes much longer than 2 years for leaders to achieve (K. M. Brown & Anfara, 2003; Earl &
Fullan, 2003; Fullan, 1985; Kotter, 1995; Kotter & Schlesinger, 2008).
Types of Data
The STAR program requires that secondary students take the California Standards Test
(CST) for students in Grades 9–11 in up to five subject areas (i.e., English language arts [ELA],
mathematics, science, social science, NCLB life science), the California Modified Assessment
(CMA) for students in Grades 9–11 in up to five subject areas (i.e., ELA, mathematics, science,
social science, NCLB life science), the California Alternate Performance Assessment (CAPA)
for students in Grades 9–11 in two subject areas (i.e., ELA, mathematics), or the California High
School Exit Exam (CAHSEE) for students in Grade 10 in two subject areas (i.e., ELA and
mathematics). The results from the latter exams are used to calculate an API that becomes the
component used to measure Adequate Yearly Progress (AYP) to meet the provisions of NCLB
(CDE, 2013b). The API metric has perhaps single-handedly brought about a reculturation of the public's perception of the quality of schools, allowing stakeholders within an educational community to have a snapshot of a school's quality. Specifically, it allows for public edu-
cation consumers to have a tangible manner to compare the quality of schools within a commu-
nity.
The API is calculated strictly using the assessment results from the CST, CMA, CAPA,
and CAHSEE. Further, it is calculated for a series of different groups of subpopulations within a
school. Each of these subgroups is considered significant if it composes 10% of the student population or includes 100 students (CDE, 2013b).
In order for secondary schools to meet AYP, they must annually meet four requirements:
(a) participation rate, (b) percentage proficient, (c) API, and (d) graduation rate (CDE, 2013c).
The participation rate requirement during the 2013–14 academic year was 95%, not only for schools but also for local education agencies (LEAs; typically districts) and significant subgroups. The
percentage proficient Annual Measurable Objectives (AMOs) were set on a linear upward trajectory, starting with the passage of NCLB in 2001 and reaching 100% in 2013–14 (CDE, 2013c; see Figures 1 and 2).
The third requirement, API growth, was established contingent upon a school's current API. The basic formula was 5% of the difference between 800 and the school's current API. For instance, if a school's API was 720, then 5% of (800 - 720), or 5% of 80, is 4 points, making 4 points the minimum growth target for that year. The graduation rate requirement was 90%. In addition to meeting these requirements schoolwide, the school had to meet all of the criteria for each significant subgroup. Keeping track of such a large number of metrics alone can present a challenge; however, given the requirement of also attending to all other aspects of the principalship, the task of disaggregating and understanding the data is very difficult for educators without specialized data training (CDE, 2013c).
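As a minimal illustration of the growth-target arithmetic described above, the following sketch (written in Python purely for illustration; the function name is an assumption, and it implements only the basic 5% formula stated here, not any additional point minimums or subgroup rules in the CDE guidance) computes a school's annual API growth target:

def api_growth_target(current_api, statewide_target=800):
    """Illustrative sketch of the basic AYP growth-target formula described
    above: 5% of the gap between the statewide target (800) and the school's
    current API. Schools at or above 800 have no positive gap to close."""
    gap = max(statewide_target - current_api, 0)
    return 0.05 * gap

# Worked example from the text: a school with an API of 720
print(api_growth_target(720))  # 5% of 80 = 4.0 points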
Despite 30 years of change brought on by the standards reform movement, a recent survey
of 1,039 districts, including 60 of the largest urban school districts across the country, indicated
that only half of the teachers had access to electronic data for their students, with an even smaller
portion (37%) having access to achievement data for their current students (USDOE, 2008).
Further, the survey indicated that an even smaller percentage of those who had access to the data were routinely using it in the instructional process.
Statement of the Problem
According to R. S. Brown, Wohlstetter, and Liu (2008), an informed decision requires
relevant and useful information. At the genesis of improving student achievement is a leader’s
decision making in a wide range of factors including budgets, discipline, curriculum, instruction,
and assessment—among others. Although there are many factors affecting student achievement,
unarguably one of the most positively correlated is instruction (Marzano, Marzano, & Pickering, 2003).
Figure 1. No Child Left Behind proficiency targets in English language arts. Taken from 2013 Adequate Yearly Progress Report: Information Guide, by California Department of Education, 2013c, p. 21, retrieved from http://www.cde.ca.gov/ta/ac/ay/
Figure 2. No Child Left Behind proficiency targets in mathematics. Taken from 2013 Adequate Yearly Progress Report: Information Guide, by California Department of Education, 2013c, p. 21, retrieved from http://www.cde.ca.gov/ta/ac/ay/
Therefore, one can suggest that improving student achievement begins with decision
making associated with instructional practice. Black and Wiliam (2004), in a comprehensive review of more than 250 studies, found that assessment could provide valuable information to guide instruction, which is paramount to improving student achievement. This information becomes crucial evidence to refute critics who are proponents of reducing the number of assessments that students take. A better-suited argument concerns the lack of skill in interpreting data from such assessments—a lack that has become a barrier to the frequent and systematic use of data despite the general belief in their importance (Gischlar, Hojnoski, & Missal, 2009). The problem therefore becomes how educators transform schools from a data-rich and action-poor culture to one where stakeholders use an inquiry-based model that transforms raw data into actionable information addressing the instructional learning gaps of students, thus leading to continuous improvement in student achievement.
Purpose of the Study
Despite the many challenges previously mentioned, there are school sites and leaders who
are able to use DDDM to improve student achievement. The purpose of this study was to add to
the growing body of literature on DDDM, with a focus on identifying effective DDDM practices
in which secondary principals engage to improve student achievement.
The following research questions were explored in this study:
1. What systems are considered and subsequently put in place for a data-driven school
culture?
2. How do school site principals and their stakeholders transform data from raw form to
actionable data used to drive decision making in instructional practice?
3. What are the barriers that a secondary principal faces when implementing DDDM?
4. What are the mechanisms that secondary principals use to evaluate the effectiveness
of protocols and systems put in place by DDDM processes?
Importance of the Study
As of the most recent available data, NCLB mandated proficiency rates of 89.2% in ELA and 89.5% in math for the 2012–13 school year; however, statewide proficiency rates on standardized tests in California fell radically short at 56.4% and 51.3% in ELA and math, respectively (CDE, 2013a), making it evident that current bubble-kid techniques were not working.
A closer look at proficiency rates for secondary students—the subpopulation that is the focus of this study—yielded even more alarming results.
proficiency rates for students in Grades 9–11 in ELA and math were 54% and 25%, respectively
(CDE, 2013a). For five of the largest urban districts in southern California, the results were even
more distressing (see Table 1). The combined number of students in Grades 9–12 in the five
districts listed in Table 1 was 905,000, or nearly 46% of all students statewide (CDE, 2013c). (It should be noted that this was a study of a setting using actual theories and research references; districts are identified with pseudonyms, but all study setting-related data are real.)
With a shortage of human and fiscal capital, the structures that schools have in place to
meet the academic needs of their students to improve the previously mentioned student achieve-
ment metrics are as important as ever. The aim of this study was to support these efforts by
shedding light on concrete DDDM protocols that allow for the transformation of raw data into
actionable steps for improving instructional experiences for students.
Table 1
California Standards Test Proficiency Rates for 2013 in English Language Arts (ELA) and Math
District^a    ELA Proficiency Rate (%)    Math Proficiency Rate (%)
1             42.9                        15.6
2             48.3                        23.8
3             40.3                        20.7
4             37.1                        13.6
5             57.9                        36.5
Note. Taken from DataQuest, by California Department of Education, 2013a, retrieved from http://data1.cde.ca.gov/dataquest/
^a Pseudonyms used.
Assumptions
The following assumptions were made for this study:
1. That the principal acted as instructional leader,
2. That principals would be able to identify and communicate practices used to imple-
ment DDDM, and
3. That the information gathered would sufficiently address the research questions.
Limitations
This study had the following limitations:
1. The validity of the data was based on the choice of instrumentation.
2. The validity was dependent on the willingness of participants to provide responses
that accurately described DDDM structures.
3. There were challenges in identifying and narrowing the practices used during the
implementation of DDDM policies.
Delimitations
The delimitations of the study were the following:
1. Data collection was limited to principals in Grades 5–8 southern California high
schools, thus limiting the sample size.
2. Interviews were limited to principals who had been a part of the implementation of
DDDM from its inception at the districts involved in the study.
3. Schools must have achieved the following:
a. Schools must have been currently implementing a DDDM process. For the
purpose of this study, a DDDM process was defined as stakeholders within a school site, includ-
ing administrators and teachers, having explicit professional development (PD) time to analyze
assessment data with the sole purpose of reflecting on and informing instructional practice and
relevant decision making.
b. Schools must have met API growth targets set by NCLB during a 3-year time
frame. These growth targets were defined to be 5% of the difference between the school’s API
and the statewide performance target of 800, with a 5-point minimum (CDE, 2013b).
c. The school site must have had a consistent leader during the time frame when API
growth was achieved. By consistent leadership, it was meant that there was no interruption in
service, replacement, or promotion of the leader of the school being studied.
Definition of Terms
For the purpose of this study, the following terms are defined:
Academic Performance Index: The API is a single number, ranging from a low of 200 to a
high of 1000, which reflects the performance level of a school, an LEA, or a student group based
on the results of statewide assessments.
Adequate Yearly Progress: AYP is a series of annual academic performance goals estab-
lished for each school, LEA, and the state as a whole.
A-G: A-G is an acronym used to abbreviate the minimum courses a student must com-
plete with a grade of C- or better in order to meet college entrance requirements for the California
State University and the University of California system.
California Alternate Performance Assessment: CAPA is an alternate assessment designed
for students with disabilities who cannot take part in general assessments.
California High School Exit Exam: The CAHSEE is an annual exam taken by all Califor-
nia high school students in 10th grade to determine their eligibility for receiving a high school
diploma after all other requirements are met.
California Modified Assessment: The CMA is a standardized assessment designed for
students who have an individualized education program (IEP).
California Standards Test: The CST is a standardized assessment taken by California
public school students in Grades 2–11. Depending on their grade level, students may take the
CST in ELA, mathematics, science, or social science.
Common Core State Standards (CCSS): These are educational standards adopted by the
California State Board of Education that describe what students should know and be able to do in
each subject in each grade.
Data-Driven Decision Making: DDDM is a process by which data are used to drive
decision making.
English learner: EL is a designation given to students who indicate having a native
language other than English.
National Assessment of Educational Progress: The NAEP is a Congressionally mandated
assessment that is intended to assess what American students know in core subjects.
Proficiency: This is a term used to describe two desirable results (proficient and ad-
vanced) of the possible five results (far below basic, below basic, basic, proficient, and advanced)
on the CST.
Specific, Measurable, Achievable, Realistic, and Timely (SMART) Goals: SMART goals
are a common tool used in action plans to assist in achieving objectives.
Standardized Testing and Reporting: STAR is an assessment program adopted under the
PSAA of 1999.
Students With Disabilities: SWD is a designation given to students who have an IEP or
504 plan.
CHAPTER TWO: LITERATURE REVIEW
The purpose of this literature review is to lay the groundwork for the study, which investigated how to increase student achievement by building practices at secondary school sites that transform raw data into actionable data that teachers can use to drive instruction. Investigating the promising practices that exemplary secondary principals have found effective in pursuing this ambitious goal is the lens of this literature review.
A recent report by the USDOE indicates that most teachers still lack the resources and
data analysis proficiency to navigate the abundant assessment data available to them (USDOE,
2008). This dilemma leaves school site principals burdened with developing a comprehensive plan that immediately begins addressing gaps in student learning and long-term goals that nurture data expertise in its human capital, thereby leading to sustainable growth in student achievement. It is advantageous to this venture to investigate strategies that previous studies have validated for leaders to consider.
Overview of DDDM
Definitions
The definitions of DDDM in the literature vary only slightly; however, contrary to common belief, they all strongly suggest that DDDM is not just what the name would indicate—that is, the process of a singular decision driven by data—but rather entails a much more comprehensive practice. Specifically, it involves a community of stakeholders who adopt a culture of inquiry to
continuously improve student achievement by collectively adjusting their instructional practice
guided by emerging patterns arising from the analysis of relevant data (Dunn, Airola, Lo, &
Garrison, 2013; Ikemoto & Marsh, 2007; Mandinach, 2012; Marsh, Pane, & Hamilton, 2006;
Rankin & Ricchiuti, 2007). Too often, educators focus on the learning deficits of students; however, in the DDDM process, both learning strengths and weaknesses become levers in the continuous improvement model (Dunn et al., 2013). When data are systemically collected and subsequently analyzed, deficits in learning emerging from data become problems that are remedied by the analysis of successful learning. Only by marrying the two can educators begin to capitalize on the benefits of DDDM (Dunn et al., 2013; Ikemoto & Marsh, 2007; Marsh et al., 2006; see Figure 3).
Figure 3. Framework for describing the data-driven decision-making process. Taken from "Cutting Through the 'Data-Driven' Mantra: Different Conceptions of Data Driven Decision Making," by G. S. Ikemoto and J. A. Marsh, Yearbook of the National Society for the Study of Education, 106(1), p. 109.
The framework for DDDM used in this study is the one suggested by Ikemoto and Marsh
(2007). The comprehensive study used 10 districts in four states, including 130 district leaders,
100 school site principals, 115 teacher focus groups, and survey data from over 2,900 teachers, to
gain insight on the different ways that educators use data to make decisions about teaching and
learning. The authors’ framework is rooted in the belief that data in raw form become useful
only when they are transformed into knowledge—a process achieved by providing educators with
the skills necessary to decipher germane data from the volume of data available to them. Once
deciphered, data are transformed into information relevant to practitioners. This information, in
turn, provides them with knowledge that sponsors an informed decision applicable to their
teaching practices (Ikemoto & Marsh, 2007).
Misconceptions About DDDM
Prior to moving forward, it is necessary to clarify any misconceptions about what DDDM
is or is not, according to the literature. Mandinach (2012) was clear when she indicated that
“DDDM is not just about the numbers or the data” (p. 73). Too often, data are synonymous with
numbers, creating instant anxiety for educators who have received little to no training on data
analysis (Dunn et al., 2013; Feldman & Tung, 2001; Gischlar et al., 2009; Ikemoto & Marsh,
2007; Mandinach, 2012; USDOE, 2008); however, as stated earlier, DDDM is a comprehensive
and complex process requiring numerical analysis only as a subprocess and not as the primary
objective. More important is the need for leaders to focus on establishing processes and practices around data analysis that are heuristically practiced by their organization's stakeholders—a focus that the literature on DDDM indicates is strongly correlated with long-term, sustainable growth in student achievement. What should be unequivocally clear to readers is that "DDDM is not a monolithic, straightforward activity" (Ikemoto & Marsh, 2007, p. 125). Any leader wishing to use data to drive decision making as a means to improve student achievement should be willing to make a long-term commitment in order to harvest the benefits of sustainable growth in student achievement (Armstrong & Anthes, 2001; Bernhardt, 2009; Ikemoto & Marsh, 2007; Mason, 2002).
Components of DDDM
Leadership
Marzano, Waters, and McNulty (2005), in their meta-analysis of school leadership
practices of principals, found that “principals can have a profound effect on the achievement of
students in their schools" (p. 38). This notion extends to the literature on DDDM. Literature on schools whose leadership has successfully used DDDM yields one inarguable conclusion—that leadership matters (Anderson et al., 2010; Armstrong & Anthes, 2001; Datnow, Park, &
Wohlstetter, 2007; Herman & Gribbons, 2001; Ikemoto & Marsh, 2007; Kerr, Marsh, Ikemoto,
Darilek, & Barney, 2006; Lachat & Smith, 2005; Marsh, McCombs, & Martorell, 2010; Marzano
et al., 2003; Park & Datnow, 2009; Psencik & Baldwin, 2012; Rankin & Ricchiuti, 2007). Prior
to embarking on a DDDM initiative, the literature indicates that leaders must understand one
basic principle: that the effectiveness of DDDM is contingent upon a leader's commitment (Armstrong & Anthes, 2001; Herman & Gribbons, 2001; Kerr et al., 2006). This commitment is best described by the mantra of one of the superintendents involved in a case study of six school districts conducted by Armstrong and Anthes (2001): "In God we trust, all others bring data" (p. 40). A leader's commitment to the systemic use of data to guide decisions and practice, therefore, precedes the impetus for DDDM processes within a school.
Following commitment to DDDM practices, according to the literature, is an array of other aspects of leadership that emerge as themes for consideration.
establishing rapport and an environment of trust, establishing a clear vision and a well-
communicated plan, setting the stage for change by establishing context, ensuring that all stake-
holders have the necessary resources, and being mindful of the environment are all elements that
the literature on DDDM has indicated to be essential for leaders to consider.
Relationships. The literature on DDDM identifies trust as an enabler of data-driven
cultures and a necessary starting point in the DDDM venture (Cosner, 2011; Duffy, Hannan,
O’Day, & Brown, 2012; Ikemoto & Marsh, 2007; Levin & Datnow, 2012; Thornton & Perreault,
Authors have cited the ingredient of rapport building as fundamental in the recipe for creating a culture of trust that promotes the honest dialogue among stakeholders that is essential in the DDDM process. Without the element of trust, it is impossible to create a true "ethos of learning
and continuous improvement” (Park & Datnow, 2009, p. 483). Marsh et al. (2010), in their case
study involving over 1,350 professionals including 113 principals, found that progress on DDDM
was hindered when leadership ignored rapport building. There is also evidence of leaders
breaking this trust, as Herman and Gribbons (2001) indicated when they suggested that “statistics
don’t lie but liars use statistics” (p. 25), referring to the phenomenon of school sites using data
manipulation to misrepresent rather than support change. The challenge for leaders, as Earl and Fullan (2003) suggested, is to promote data as informative to the improvement of teachers' instructional practice rather than as a surveillance tool. It is only by harnessing the power of trust that leaders convert data from "stress inducing" to an instructional tool (Dunn et al., 2013, p. 88).
Vision. Although establishing a clear vision is no surprise in any successful school, the
literature identifies vision alongside planning as a chronological next step in successful imple-
mentation of DDDM protocols (Anderson et al., 2010; Armstrong & Anthes, 2001; Bernhardt,
2009; Boudett, City, & Murnane, 2010; Childress, Elmore, & Grossman, 2006; Datnow et al.,
2007; Hamilton et al., 2009; Ikemoto & Marsh, 2007; Isaacs, 2003; Kerr et al., 2006; Lachat &
Smith, 2005; Levin & Datnow, 2012; Mason, 2002; Park & Datnow, 2009). What is specifically
highlighted in the literature is that visions should be built around a clear set of expected
outcomes (Anderson et al., 2010; Datnow et al., 2007; Isaacs, 2003; Kerr et al., 2006; Lachat &
Smith, 2005; Levin & Datnow, 2012) revolving around systemic data use (Armstrong & Anthes,
2001; Hamilton et al., 2009; Isaacs, 2003; Ikemoto & Marsh, 2007; Kerr et al., 2006; Park &
Datnow, 2009). The marriage of these two, a strong vision with clear, expected outcomes and a
lens centered on continuous data use, is fundamental to vision setting in DDDM processes,
according to previous successful implementations of DDDM. Emerging simultaneously within
vision is the importance of “systemic planning” (M. Golden, 2005, p. 3), which logically follows
in the execution of visions (Armstrong & Anthes, 2001; Boudett et al., 2010; Childress et al.,
2006; Cosner, 2011; Isaacs, 2003; Mason, 2002; Williams et al., 2007). Both Bernhardt (2009)
and Kerr et al. (2006) indicated the importance of committing to a vision once the latter elements are incorporated in its construction. When leaders failed to follow the practice of establishing a "clear and unified vision" (Cromey, 2000, p. 9), it was found that "resulting data [are] not only confusing, but conflicting" (p. 9), stifling any future gains in achieving objectives.
Setting context. Once a strong, data-centered vision is established and an execution plan
is designed to meet clearly defined objectives, the literature indicates that leaders must establish
context for their stakeholders in order to give traction to motivation essential for the execution of
said vision (Anderson et al., 2010; Cosner, 2011; Earl & Fullan, 2003; Earl & Katz, 2002; Levin
& Datnow, 2012; Park, Daly, & Guerra, 2012; Thornton & Perreault, 2002; Young, 2006). Park
et al. (2012) provided an eloquent description of this notion when they suggested that
a compliance orientation toward data use would not lead to authentic or sustained engage-
ment with data. Conceptualizing DDDM as a continuous improvement strategy that
directly informs teaching and learning processes appeared to be a strategic sense making
endeavor that leaders undertook during implementation to persuade teachers that data use
was relevant. (pp. 666-667)
It is this sense making that becomes critical for leaders to invest time in prior to moving forward
with executing any vision. By stressing the purpose of data, leaders were able to establish the
rationale that allowed stakeholders to give meaning to the data being used, ultimately being a
critical component in subsequent steps within the DDDM process (Anderson et al., 2010; Levin
& Datnow, 2012; Park et al., 2012). One of the elements that Earl and Katz (2002) identified as
an indicator of a good contextual setting is the notion of simultaneously creating a “sense of
urgency” for stakeholders (p. 2).
Resources. Ikemoto and Marsh (2007) found that “several users of complex DDDM
processes strongly emphasized the importance of tools” (p. 123). It is no surprise that tools or
adequate resources arise as an emerging theme within the literature for DDDM, just as any task
not having the correct resources can hinder a school site to successfully achieve any objective.
Resources, through the lens of DDDM, however, can encompass an array of different elements
ranging from human capital to physical capital, symbolic capital, and political capital, among
others. The emphasis in the literature for DDDM was specifically discussed in terms of physical
capital in the form of a data system that organizes and presents data in a user-friendly manner
(Anderson et al., 2010; Coburn, Toure, & Yamashita, 2009; Williams et al., 2007; Hamilton et al.,
2009; Herman & Gribbons, 2001; Ikemoto & Marsh, 2007; Isaacs, 2003). Herman and Gribbons
(2001) and Isaacs (2003) specifically highlighted the importance of the presentation of data as a
means to transform large volumes of complex numerical data to usable, easy-to-understand facts
that facilitate its use in practice. Some studies identified districts that have experienced chal-
lenges presented by the absence of appropriate resources (Datnow et al., 2007; Duffy et al.,
2012). Datnow et al. (2007) specifically found in their study that districts were data rich and
“grappled with organizing data in an accessible format and presenting it in a comprehensible
manner" (p. 30). Once a data system was enacted that allowed schools to resolve the underlying issue, it prompted them to move ahead with DDDM processes. Even when total proficiency in DDDM processes was not achieved, leaders within school systems who enacted resources that promoted deciphering data into more meaningful forms found that the investment of time and resources in such ventures was worthwhile and fruitful (Duffy et al., 2012).
Professional Development
Elmore (2002) eloquently wrote about a possible explanation for the crisis that public
education currently faces when he identified a gap in knowledge for school personnel:
With increased accountability, American schools and the people who work in them are
being asked to do something new—to engage in systematic, continuous improvement in
the quality of educational experiences of students and to subject themselves to the disci-
pline of measuring their success by the metric of students’ academic performance. Most
people who currently work in public schools were not hired to do this work, nor have they
been adequately prepared to do it either by their professional education or their prior
experience in schools. (p. 3)
As logic dictates, when there is a gap in knowledge, an organic step to resolve the problem is to fill the gap. Within the literature for DDDM, PD emerges as a pivotal theme whose elements are categorized into the critical role of data analysis proficiency, the paradigm shift of building a culture of inquiry, the power of collaboration, the presence of continuous improvement cycles, and the importance of purposeful meeting time focused on data analysis.
Data analysis proficiency. The most ubiquitous theme found in the literature for DDDM is the pivotal contribution that building data analysis proficiency makes to the success of
any DDDM initiative (Anderson et al., 2010; Boudett et al., 2010; Cosner, 2011; Datnow et al.,
2007; Down & Tong, 2007; Duffy et al., 2012; Dunn et al., 2013; Earl & Fullan, 2003; Earl &
Katz, 2002; A. Golden et al., 2008; Henning, 2006; Herman & Gribbons, 2001; Ikemoto &
Marsh, 2007; Kerr et al., 2006; Lachat & Smith, 2005; Luo, 2008; Mandinach, 2012; Marsh et
al., 2006; Mason, 2002; Park & Datnow, 2009; Rankin & Ricchiuti, 2007; Thornton & Perreault,
2002; Wohlstetter, Datnow, & Park, 2008; Young, 2006). The research literature is clear that
without the capacity to intelligently engage in dialogue regarding data and its implications,
teachers are unable to benefit from the DDDM process. The elements of data analysis proficiency can be further expanded into two subcategories: its presence or its absence.
Ikemoto and Marsh (2007) pointed out that “numerous studies have found that school and
personnel often lack adequate capacity to formulate questions, select indicators, interpret results,
and develop solutions” (p. 121). Other literature further supports this notion, as results of case
studies have indicated that very few administrators or teachers receive training on data analysis
(Dunn et al., 2013; Herman & Gribbons, 2001; Luo, 2008; Mandinach, 2012). Earl and Fullan
(2003), in their case study, found that teachers “expressed insecurities about their skills in
gathering, interpreting, and making sense of information about their school” (p. 388). The
dangers of not having adequate data analysis skills, Earl and Fullan (2003) suggested, could lead
to more problems than solutions due to a misuse of data. Earl (1995, as cited in Earl & Fullan,
2003) suggested that “we live in a culture that has come to value and depend on statistical infor-
mation to inform our decisions. At the same time, educators are likely to misunderstand and
misuse those statistics because we are ‘statistically illiterate’ and consequently have no idea what
the numbers mean” (p. 389).
The literature identifies numerous school sites that have found success with DDDM due
in large part to their commitment to building capacity among their stakeholders. Datnow et al. (2007) specifically found that school sites "provided support for staff in how to use data and
modeling data use and data discussions” (p. 7) as one of the essential steps that leaders took
during the DDDM process. During the planning stages of the DDDM processes, leaders also
were mindful to “consider the extent to which grade-level teams possess the requisite analytic
skills and deep pedagogical content knowledge that support data-based work” (Cosner, 2011, p.
585). The challenge in addressing this issue is determining the right type of data and delivery
format that facilitates proper examination of data (Marsh et al., 2006; Rankin & Ricchiuti, 2007).
An unexpected finding regarding leaders who possess high levels of data analysis profi-
ciency is that they themselves do not use their knowledge of data analysis as much as they use
their skill to broker capacity to stakeholders within their organization (Anderson et al., 2010;
Kerr et al., 2006; Park & Datnow, 2009). Kerr et al. (2006) noted that “several studies have
found that the most successful principals were able to act as catalysts for data inquiry but then
worked to create more distributed leadership around data use” (p. 498).
Culture of inquiry. There is a well-known Chinese proverb that says, “Give a man a
fish, and you have fed him once. Teach him how to fish and you have fed him for a lifetime”
(Tzu, n.d., para. 1). If there were a teach-a-man-to-fish equivalent for education practitioners
wishing to satisfy the hunger of student achievement through DDDM, it would be that of build-
ing a culture of inquiry within an organization. If a leader asks teachers a question about their practice, they might be offended. However, if a leader instructs teachers in how to ask the right
questions about their practice, then they can figure out how to improve for a lifetime. By definition, it becomes impossible, or at the very least difficult, to fail when one continues to think about ways to improve, thus making inquiry the glue that holds together all the pieces of DDDM in the literature (Anderson et al., 2010; Armstrong & Anthes, 2001; Datnow et al., 2007; Down &
Tong, 2007; Duffy et al., 2012; Earl & Katz, 2002; Feldman & Tung, 2001; M. Golden, 2005;
Halverson, Grigg, Prichett, & Thomas, 2005; Henning, 2006; Ikemoto & Marsh, 2007; Kerr et
al., 2006; Lachat & Smith, 2005; Mandinach, 2012; Mason, 2002; Rankin & Ricchiuti, 2007).
Rallis and MacMullen (2000) wrote, “[Inquiry-minded] schools recognize that improving
teaching and learning is an intentional ongoing process” (p. 1). Also important is how inquiry
allows leaders to "accurately assess the root causes for achievement patterns and make adjustments when warranted" (Duffy et al., 2012, p. 12). This can become one of the tools leaders can leverage to justify change and cope with the frustrations discussed by Schlechty (2001) in Chapter One. Under the NCLB Act of 2002, education leaders have often focused on the expertise of teachers; however, expertise and capacity are necessary but not sufficient conditions for success
(Wohlstetter et al., 2008). A culture of inquiry, according to the literature on DDDM, should
instead be the focus.
Changing organizations, however, is a lengthy, drawn-out process that requires time
(Bolman & Deal, 2008; K. M. Brown & Anfara, 2003; Earl & Fullan, 2003; Fullan, 1985; Kotter,
1995; Kotter & Schlesinger, 2008; Marzano et al., 2005; Schmoker, 1999). However, when
embarking on such ventures, leaders should consider what cultural direction to choose in order to reap the benefits of DDDM. Halverson et al. (2005)
summed this concept up well:
The press for data-driven decision making, then, is not a call for schools to begin to use
data, but a challenge for leaders to reshape the central practices and cultures of their
schools to react intentionally to the new kinds of data provided by external accountability
systems. (p. 4)
Henning (2006) suggested that when considering an inquiry process, leaders should
consider four approaches: comparing the norm, analyzing trends, correlating data, and disaggre-
gating data. Other literature indicated that leaders who are able to harness the power of inquiry
did so by initially making a commitment to data use by building a strong vision and a good
knowledge base of how data could be used to drive decision making (Kerr et al., 2006). Also
essential to the inquiry process is the importance of a technology tool that provides data in a
manner that supports data inquiry procedures (Mandinach, 2012). Rankin and Ricchiuti (2007)
suggested that leaders use data to help stakeholders develop strong focus questions around
student performance. Mason (2002) suggested that “planned and targeted data inquiry can help
to keep data analysis on track, as well as to ensure that information is fed back into the planning
process and that key decision makers get the answers they need” (p. 7).
Ikemoto and Marsh (2007) found that the inquiry process was specifically constrained
when an organization was entrenched in the belief that instruction was private, whereas DDDM
processes were enabled when leadership promoted openness and collaboration among stake-
holders, thus leading to the next emerging theme in the literature for DDDM.
Collaboration. It is well established that successful schools are places where teachers
meet regularly to discuss ways to improve instructional practice, as indicated by assessment data
that they have disaggregated (Fullan, 1985, 2000, 2001, 2002). In the literature on DDDM, this concept also emerged as a theme; however, there was an explicit identification of purposeful collaboration with a focus on well-defined and measurable goals (Anderson et al., 2010; Arm-
strong & Anthes, 2001; Bernhardt, 2009; Boudett et al., 2010; Coburn et al., 2009; Cosner, 2011;
Earl & Katz, 2002; Ikemoto & Marsh, 2007; Lachat & Smith, 2005; Levin & Datnow, 2012; Park
& Datnow, 2009; Park et al., 2012; Psencik & Baldwin, 2012; Schmoker & Wilson, 1995).
The literature on DDDM stressed the need for leadership to establish normative processes
that promoted deprivatization of individual data and created the openness necessary to have mean-
ingful discussions about data (Anderson et al., 2010; Ikemoto & Marsh, 2007; Park et al., 2012;
Park & Datnow, 2009). Park et al. (2012) found in their case study that leaders who were able to
establish a “sense of collective responsibility” (p. 658) allowed school sites to make substantial
progress toward modifying instructional practices during their biweekly meetings. The authors
also indicated that during this process, they observed leaders and support staff supporting data use as well as modeling the use themselves. On the other hand, Herman and Gribbons (2001) found
“undeniable tension between top down versus bottom up inquiry” (p. 30). This idea was sup-
ported by Park and Datnow (2009), who found that school sites that they observed showed
evidence of leaders taking purposeful action to broker their “authority in a manner that empow-
ered different staff members to utilize their expertise" (p. 491).
Continuous improvement. One of the principals involved in a case study found in the literature on DDDM stated that "a huge part of our vision is to just never get complacent" (Park et al.,
2012, p. 656). The notion of never reaching an objective and instead seeing improvement as a
cyclical process, where the objective keeps changing contingent upon the needs of a changing
student population, emerged as a common theme in DDDM processes (Anderson et al., 2010;
Armstrong & Anthes, 2001; Boudett et al., 2010; Childress et al., 2006; Elmore, 2002; Hamilton
et al., 2009; Park & Datnow, 2009; Park et al., 2012; Rankin & Ricchiuti, 2007; Schmoker,
2002). Park and Datnow (2009) summarized it well when they found that “leaders at all levels
co-constructed the vision and implementation of productive data-driven decision-making by
creating an ethos of learning and continuous improvement rather than one of blame” (p. 491).
Time. There is no doubt that administrators must be mindful of high-priority tasks such as ensuring student safety, as well as managerial responsibilities such as addressing stakeholder needs, school cleanliness, and stocked resource cabinets. However, Black and Wiliam (1998) suggested that desired outcomes are achieved only after desired goals are identified, evidence of present deficiencies is gathered, and there is an understanding of how to close the gap between the two. In
the literature for DDDM, time emerges as a theme directly correlated to PD—specifically as PD
time designated to analyze data (Anderson et al., 2010; Armstrong & Anthes, 2001; Earl & Katz,
2002; Feldman & Tung, 2001; Ikemoto & Marsh, 2007; Kerr et al., 2006; Lachat & Smith, 2005;
Park & Datnow, 2009; Rankin & Ricchiuti, 2007; Thornton & Perreault, 2002; Young, 2006).
There was little variation in the discussion of time in the literature, with the exception of flexibil-
ity (Armstrong & Anthes, 2001) and the time of day that teachers meet to discuss data (Feldman
& Tung, 2001).
Uses of Data
Boudett et al. (2010) presented a framework for school sites to use in their endeavor to
use data to improve student learning through a cyclical practice. They discussed at length how
each of the three phases, broken up into eight steps, could contribute to sustainable, long-term
increases in student achievement.
The detailed process shown in Figure 4, designed by Boudett et al. (2010), is also an outstanding example of the last cohort of elements emerging in the literature for DDDM: uses of data.
Figure 4. Data-wise improvement process. Taken from A Step-by-Step Guide to Using Assessment Results to Improve Teaching and Learning, by K. P. Boudett, E. A. City, and R. J. Murnane, 2010, San Francisco, CA: Jossey-Bass, p. 5.
This cohort is broken down further into converting data into actionable change, a progress-monitoring tool, an instructional tool, and an assessment tool.
Actionable change. As noted earlier, during the 2012–13 academic year, NCLB man-
dated that schools have a total of 89.2% and 89.5% of their student population proficient in ELA
and math, respectively; however, statewide proficiency rates in California during that time frame
were 56.4% and 51.3% in ELA and math, respectively (CDE, 2013a). Such data illustrate that, in the absence of action, data are more depressing than useful, thereby leading to the
most powerful emerging theme in the literature on DDDM—the process of transforming data
into actionable change (Datnow et al., 2007; Earl & Fullan, 2003; Ikemoto & Marsh, 2007; Light,
Wexler, & Heinze, 2005; Marsh et al., 2006; Mason, 2002; Rankin & Ricchiuti, 2007; Schmoker,
2002). Both Marsh et al. (2006) and Mason (2002) acknowledged that they found this step to be
an ambitious venture for leaders to undertake, as the process of transforming data into knowledge
and ultimately action requires cultivation and can present further challenges if personnel lack the
creativity to design appropriate action. Schmoker (2002) asserted that “the most arresting reason
for their exceptional improvement” (p. 12), referring to his research on school sites using
DDDM, was the process of “finding, creating, and continuously refining better ways to teach to
those skills using a baseline and measuring the number of students who actually learn the specific
targeted skills” (p. 12). Ikemoto and Marsh (2007) noted that “all forms of data use required
capacity to translate data into information and actionable knowledge” (p. 122)—alluding to the
fact that the process of transforming data into action is a capacity that leaders need to purpose-
fully build within their organizational culture. This factor was further supported by Rankin and Ricchiuti (2007), who highlighted the critical role that teaching data users how to develop questions around student performance based on data plays in the DDDM process.
Light et al. (2005), in their 2-year study on the intersection of decision making and the
process of transforming data into knowledge, presented a framework for educators to consider
when thinking about DDDM. They began by defining data as existing in one of three states: data in a raw state, data as information, and data as knowledge. Initially, they acknowledged that the usefulness of data in a raw state is directly proportional to the skills of the person looking at the data. They followed by drawing a connection between data and information only after meaning has been connected to context.

Figure 5. Framework for data transformation into knowledge. Taken from Keeping Teachers in the Center: A Framework of Data-Driven Decision Making, 2005, p. 3, retrieved from http://cct.edc.org/sites/cct.edc.org/files/publications/LightWexlerHeinze2005.pdf

They then finished by defining knowledge as a collection of informa-
tional data facts that have been deemed useful for a particular task and eventually used to guide
action. Knowledge, they suggested, “is created through a sequential process” (p. 3). The sequential process seen in Figure 5 starts with a collection of data, followed by the organization of those data. Once data have been collected and organized, educators summarize them for
analysis. The analysis then leads to the synthesis of information that is used as a decision-
making tool.
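To make this sequence concrete, the brief sketch below walks a handful of hypothetical benchmark records through the collection, organization, summarizing, analysis, and synthesis steps that Light et al. (2005) described. It is a minimal illustration written in Python for this review; the record layout, function names, and the proficiency target of 70 are assumptions made for the example and are not part of the authors' framework.

from collections import defaultdict
from statistics import mean

# Hypothetical raw records from a benchmark assessment: (student, standard, score).
raw_data = [
    ("s1", "algebra", 42), ("s2", "algebra", 55),
    ("s1", "geometry", 78), ("s2", "geometry", 81),
]

def organize(records):
    # Group raw scores by standard so that like items can be compared.
    grouped = defaultdict(list)
    for _student, standard, score in records:
        grouped[standard].append(score)
    return grouped

def summarize(grouped):
    # Reduce each group to an average, moving from raw data toward information.
    return {standard: mean(scores) for standard, scores in grouped.items()}

def analyze(summary, target=70):
    # Attach meaning by flagging standards that fall short of a proficiency target.
    return [standard for standard, avg in summary.items() if avg < target]

# Synthesis: the flagged standards become actionable knowledge for reteaching plans.
priorities = analyze(summarize(organize(raw_data)))
print(priorities)  # ['algebra']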
Progress monitoring. There is a common adage in education circles that suggests that
“what gets measured gets done” (Schmoker, 2002, p. 35). If this adage were true, under NCLB
all schools would be getting it done, but the data indicate otherwise. During the NCLB
era, federal and state policies mandated school districts to meet student achievement growth
targets dictated by metrics. The growth targets presented by NCLB in essence recast the measurement adage, turning data into a barometer of progress—which is exactly how
the literature on DDDM identifies data: as a progress-monitoring tool (Boudett et al., 2010;
Datnow et al., 2007; Elmore, 2002; Williams et al., 2007; Rankin & Ricchiuti, 2007; Isaacs,
2003; Kerr et al., 2006; Park et al., 2012; Schmoker & Wilson, 1995). Elmore (2002) found that
data feed the PD of staff when disaggregated data are used to determine not only progress toward student learning goals but also the learning priorities for teachers. Williams et al. (2007) added
that schools that have success with DDDM specifically use measurable objectives and monitor
their progress. Other literature not only suggests the continuous evaluation of team performance
but also stresses the use of the data arising from this progress monitoring to refine topics for profes-
sional development (Kerr et al., 2006).
Instructional tool. Marzano et al. (2003) claimed that “rather than prowling through
classrooms with checklists of ‘correct’ practices, administrators should be looking at interim
results with their teachers, identifying the most effective practices” (p. 167). The literature on
DDDM is clear that improvements in student achievement occur only when data use is synonymous with guiding instructional improvements (Bernhardt, 2009; Boudett et al., 2010; Cosner,
2011; Cromey, 2000; Dunn et al., 2013; M. Golden, 2005; Hamilton et al., 2009; Henning, 2006;
Kerr et al., 2006; Levin & Datnow, 2012; Marsh et al., 2006; Psencik & Baldwin, 2012; Wil-
liams et al., 2007).
Dunn et al. (2013) referred to DDDM as a “learner-centered teaching tool” (p. 88). This
notion of data being a teaching tool directly related to instructional practice was clearly evident in
the literature for DDDM, specifically with respect to how data can change or alter instruction
(Bernhardt, 2009; Levin & Datnow, 2012; Marsh et al., 2006). Other literature promotes
the notion of data serving as a means to discriminate effective from ineffective instructional
strategies (Boudett et al., 2010; Cromey, 2000; Hamilton et al., 2009; Psencik & Baldwin, 2012).
Williams et al. (2007) found a strong correlation between schools with a high API and the
extensive use of data to improve instruction. Cosner (2011) and Kerr et al. (2006) found
that data use was limited for school sites whose cultures did not adopt a practice of altering
instructional methods.
Belief system change agent. An organization changes only when members within the
organization change their behaviors (Fullan, 2001). As Heath and Heath (2010) suggested,
changing human behavior is a difficult process dense with human emotion and deficient in
rational thought. An associated theme arising in the literature for DDDM was how data related
to highly emotional belief systems within a school (Armstrong & Anthes, 2001; Dunn et al.,
2013; Marsh et al., 2006; Park et al., 2012; Wayman, Cho, Jimerson, & Spikes, 2012). Both
Marsh et al. (2006) and Park et al. (2012) found that teachers had difficulty marrying long-
standing belief systems with DDDM practices. Marsh et al. (2006) cited an example of teachers
resisting the urge of “staying on pace with curriculum” (p. 9), which did not allow them to
interrupt their practice to reflect on data, much less reteach material. Down and Tong (2007) and
Kerr et al. (2006) found that the success of DDDM was highly contingent upon the staff buy-in at
a school. Armstrong and Anthes (2001) suggested a way to cope with this challenge: “One of the
most intriguing ways schools use data, we found, is to change teachers’ attitudes toward the
potential success of previously low-performing students” (p. 30).
Challenges to DDDM
Within the literature on DDDM, there were few authors who directly addressed chal-
lenges; however, the few who did identified timely access to data and data validity as the two
primary challenges.
Timely Access to Data
School sites that have successfully used DDDM to improve achievement, the literature
indicates, found effective methods of delivering data to stakeholders in a timely and accessible
format (Anderson et al., 2010; Datnow et al., 2007; Duffy et al., 2012; Lachat & Smith, 2005).
Ikemoto and Marsh (2007) found that across two studies, the “access to and timeliness of receiv-
ing data greatly influenced individual use” (p. 120). Kerr et al. (2006) added that a major chal-
lenge for their case study participants was finding ways for school sites to receive data both in a
timely manner and in a user-friendly format. Without these two elements, they found it became
impossible for daily instruction to be influenced by data. Marsh et al. (2006) pointed out that
delays associated with receiving state assessment data also hindered school sites’ ability to
use data for planning purposes. Other literature pointed out that the absence of data in a timely
fashion created more of a challenge for leaders than it did for teachers, in part due to planning
purposes (Luo, 2008).
Data Validity
Marsh et al. (2006), in their case study of 41 schools and 9 school districts, found that
“many educators questioned the validity of data, such as whether test scores accurately reflect
students’ knowledge, whether students take tests seriously, whether test are aligned with curricu-
lum” (p. 8). Ikemoto and Marsh (2007) found that even at sites where DDDM was being used
with fidelity, many of the staff members often questioned the accuracy and validity of data.
Other literature also supported the notion of questioning data validity, including the validity of
state assessments (Cromey, 2000; Kerr et al., 2006) and the accuracy of data (Lachat & Smith,
2005; Luo, 2008). Earl and Fullan (2003) pointed out that, due to the statistical illiteracy of school personnel, school sites often feel that they have no idea what the data mean.
Both Anderson et al. (2010) and Marsh et al. (2006) addressed data validity through a slightly
different lens, referring to the skepticism of stakeholders as either real or perceived.
Chapter Summary
Collins (2001) highlighted the importance of an organization having personnel assigned
to positions that maximize the company’s ability to reach its objectives. He wrote that “great
vision without great people is irrelevant” (p. 42). However, the lens through which the author
wrote was that of the private sector, where eliminating or transferring personnel within an
organization often comes with ease, given the absence of restrictive bargaining contracts such as those present in public education. Such contracts present an immense burden for public school
leaders as they work towards building great people through a comprehensive, purposeful PD plan
with the staff they currently have.
This review has identified a series of different themes that emerge in the literature on
successful data-driven practices. These were categorized into three major groups: leadership, PD,
and explicit uses of data. The literature on DDDM indicates that leaders must first establish a
rapport with their stakeholders, as trust becomes an essential component to deprivatize practice.
Once this rapport is established, the literature suggests that leaders should communicate a strong
vision alongside a robust plan containing clearly identified, measurable goals; however, prior to
releasing this plan, a leader must be conscious of setting the stage for DDDM through contextual
factors. Finally, the literature makes it clear that a leader must make sure to provide stake-
holders with adequate resources to guarantee the success of school site DDDM objectives.
Within the PD theme, the literature on DDDM points to leaders being mindful of their
own data proficiency as well as their teachers’, thus suggesting that leaders make building capacity to address gaps in data proficiency an initial consideration. Building this capacity can take place only through a leader’s commitment to provide stakeholders with the time necessary to build it.
Two of the strongest elements that the literature identifies as PD objectives for DDDM are
building a culture of inquiry and continuous improvement.
Data are used for a myriad of different reasons within the DDDM process. The first
reason is to produce actionable change from raw data that are transformed into actionable knowledge. Data are also used to monitor progress and provide the information necessary to inform
teachers of possible instructional deficiencies. Finally, data are used to address beliefs within an
organization.
The challenges in DDDM implementation that emerged from the literature are stake-
holders having timely access to data and stakeholders’ skepticism regarding the validity of data.
CHAPTER THREE: RESEARCH METHODS
There is a well-established, long-standing practice in the U.S. educational system that
cyclically demands change through policy. Schlechty (2001) claimed that “there is in fact so
much change occurring in schools that teachers and school administrators rightly feel over-
whelmed by it. However, this change is seldom accompanied by clear improvements in perfor-
mance” (p. 39). Most recently, NCLB has undoubtedly brought about a focus on data metrics,
and the ARRA will further expand on this trend. Fullan (2001) suggested that “change is a
leader’s friend, but it has split personality: its nonlinear messiness gets us in trouble” (p. 107).
Change is inevitable; however, what a leader does to cope with change or, more significantly,
how a leader can modify a strategy over time to improve student achievement during changing times is a question worth investigating. One such strategy is addressing the lack of skill in interpreting the element common to all such policies, assessment data, a lack that has been a barrier to frequent and systematic use of data (Gischlar et al., 2009).
Purpose Restated
Chapter Three discusses the research methodology that allowed the researcher to address the gap in knowledge about DDDM practices. This task was accomplished by extracting knowledge from leaders at schools that have undergone a transformation from a data-rich and action-poor culture to one where stakeholders were using an inquiry-based model that transforms raw data into actionable information that addresses the instructional learning gaps of students, thus leading to continuous improvements in student achievement. The research methodology encompasses the research questions, sampling strategy, data collection, data analysis, assumptions, and validity and reliability.
Any study begins with clearly identified goals (Maxwell, 2013; Merriam, 2009); there-
fore, it is necessary to refer to the guiding research questions posed in Chapter One:
1. What systems are considered and subsequently put in place for a data-driven school
culture?
2. How do school site principals and their stakeholders transform data from raw form to
actionable data used to drive decision making in instructional practice?
3. What are barriers that a secondary principal faces when implementing DDDM?
4. What are the mechanisms that secondary principals use to evaluate the effectiveness of protocols and systems put in place by DDDM processes?
Methodology
Merriam (2009) suggested that methodology should include “at the minimum, how the
sample was selected, how data were collected and analyzed, and what measures were taken to
ensure validity and reliability” (p. 246). To capture the information and knowledge necessary to
answer the research questions, a methodology with a focus on qualitative data was used. Selecting
sites that had met the study’s criteria for participation was the first step in ensuring that the data
necessary to answer the research questions were collected. Collecting qualitative data from
leaders through an interview process becomes fundamental to “enter into the other person’s
perspective” (Merriam, 2009, p. 88) so as to understand through the interviewee’s experience
how he or she interpreted and assigned specific value to the actions that had already occurred.
Quantitative data gathered in the form of assessment data through the CDE website further
supported the data collected through interviews and ensured the necessary triangulation to
support the findings of this study (Creswell, 2009; Maxwell, 2013; Merriam, 2009). Merriam suggested that by collecting data from multiple sources and analyzing them carefully, the gap in knowledge could be suitably addressed. The research design suggestions found in Creswell
(2009), Maxwell (2013), and Merriam (2009) were the resources used to ensure validity and
reliability.
Sampling Strategy
Purposeful sampling was the sampling strategy used for this research study, as it allowed
for the researcher to “discover, understand, and gain insight” (Merriam, 2009, p. 77) about case
study participants and their DDDM experiences that can only be achieved if the researcher
“selects a sample from which the most can be learned” (p. 77). For the purpose of this study, a
DDDM process was defined as stakeholders within a school site, including administrators and
teachers, having explicit PD time to analyze assessment data for the sole purpose of reflecting on
and informing instructional practice and relevant decision making. Participants had to be schools that met the following selection criteria (a brief sketch of the growth-target calculation in criterion 2 appears after the list):
1. Schools must be in the process of implementing a DDDM protocol, as described
above.
2. Schools must have met API growth targets set by NCLB during a 3-year time frame
with a schoolwide API under 800 for all years being studied. These growth targets were defined
to be 5% of the difference between the school’s API and the statewide performance target of 800,
with a 5-point minimum (CDE, 2013b).
3. While achieving the latter growth targets for 3 consecutive years, the student population included in the API calculation must have been at least 25% EL and 50% socioeconomically disadvantaged.
4. The school site must have had a consistent leader during the time frame where API
growth was achieved. By consistent leadership, it was meant that there was no interruption in
service, replacement, or promotion of the leader at the school being studied.
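To illustrate the growth-target calculation referenced in criterion 2, the short sketch below computes a target as 5% of the gap between a school's API and the statewide target of 800, with a 5-point floor. This is a simplified illustration in Python; the function name and rounding choice are assumptions made for the example, and the CDE's official calculation included additional rules not modeled here.

def api_growth_target(school_api, statewide_target=800, rate=0.05, minimum=5):
    # 5% of the gap to the statewide target, with a 5-point floor; schools at or
    # above 800 are treated here as having no additional growth requirement.
    gap = statewide_target - school_api
    if gap <= 0:
        return 0
    return max(round(rate * gap), minimum)

# A school with an API of 680 (a 120-point gap) would need 6 points of growth,
# while a school at 760 (a 40-point gap) would fall back to the 5-point minimum.
print(api_growth_target(680))  # 6
print(api_growth_target(760))  # 5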
Data Collection
During the data collection process, it was anticipated that it would be difficult to “sepa-
rate detail from trivia” (Merriam, 2009, p. 118). To remedy this anticipated problem, during the
entire data collection process there was a mindfulness to be attentive only to data that were
directly relevant to the purpose of their collection: to provide data to answer the research ques-
tions. The formats for data collection were interviews (see Appendix), document analysis, and
quantitative assessment and demographic data collected from DataQuest (CDE, 2013a).
Interviews. Of the six types of questions suggested by Merriam (2009), experience and
behavior, opinion and values, feeling, knowledge, sensory, and background-demographic ques-
tions, all were considered; however, it was anticipated that the most relevant type of questions to
elicit the richest data required to answer research questions would be experience-behavior and
knowledge (Merriam, 2009). During this study, interview data were collected from six princi-
pals.
Follow-up interviews. Merriam (2009) suggested that subsequent interviews are improved after researchers reflect on earlier ones, especially when considering where the interviewer could have followed up but did not. In the present study, no follow-up interviews were neces-
sary.
Assessment data. In addition to using assessment data to help establish selection criteria, 3 years of standardized assessment data were collected to help establish the credibility of the findings from all the qualitative and quantitative data collected. The data that were used
encompassed 3 consecutive years of the most recent assessment data available for each partici-
pating site.
Consent and Data Security
All participants of this case study were notified verbally and in writing that their partici-
pation in this research project was voluntary. It was made very clear that their identities would
be kept confidential and that at any given time during data collection, they had the right to termi-
nate their participation. It was made clear to all participants that they would be identified only by
pseudonyms and that their respective site data would be made available at their request. All data
including interview guides, interview notes, reflective notes, observation notes, documents, and
all other relevant data used in this research study will be kept in file folders in a locked file
cabinet. Digital files will also be secured on a password-protected hard drive.
Data Analysis
Data analysis was perhaps the most important component of this study, because this was where the information unveiled about DDDM processes was transformed into working knowledge that could be used by practitioners wishing to use DDDM as a means to improve student achievement. After all data were collected, they were grouped, coded, interpreted, and then assigned meaning (Merriam, 2009).
After reading and rereading the transcriptions and observation notes, the researcher began
highlighting portions of the notes that directly related to the research questions—a process that
Merriam (2009) called “open coding” (p. 178).
Assumptions
Several assumptions were made during the course of this study. The first assumption was
that improvements in assessment data used during each case study participant’s DDDM procedures were synonymous with improvements in student achievement. For instance, if a school used CSTs in its DDDM protocols, then improvements in year-over-year CST results were assumed to demonstrate improvements in student achievement. The same was
assumed about local and teacher assessments. Secondly, it was assumed that the
assessment barometer for each corresponding school’s DDDM process was a
valid form of assessment. The third assumption was that each individual leader
interviewed possessed an in-depth understanding about his or her organization.
The last assumption was that the findings of this case study would be applicable
and transferable to any school site wishing to obtain improvements in student achievement.
Validity and Reliability
Internal Validity
Maxwell (2013) suggested that validity is not something that a researcher can prove but
rather becomes a goal. This investigator’s goal was to use triangulation to ensure validity (Mer-
riam, 2009). In this research study, internal validity was supported through mindful interpretation of the respondents’ perspectives about DDDM strategies that they determined contributed to
improved student achievement patterns at their school sites. Internal validity was established by
including language in interview and survey questions that specifically addressed participants’
perceptions of the effectiveness of DDDM strategies.
External Validity
External validity was established through the guidance and direction of the faculty members advising this research and the participants from the school sites being studied, all of whom
possess a working knowledge of the inner workings of DDDM processes. Each individual
member provided meaningful input that contributed to the feasibility of how DDDM practices
added to student achievement gains.
Reliability
Reliability in this research study was ensured by keeping an audit trail that described in
detail how data were collected, how emerging themes were determined, and how decisions were
made throughout the research study (Merriam, 2009).
Reporting the Findings
The findings of this study are presented in Chapter Four in narrative form, beginning
with a presentation of the relevant data that correspond to each of the four guiding research
questions, followed by identifying both parallel elements from the literature review and new
discoveries made as a result of this study. Chapter Five presents a framework for school sites
wishing to pursue DDDM as a means to improve student achievement.
CHAPTER FOUR: FINDINGS
This study was designed to provide secondary practitioners with insight into the DDDM
practices that they use to improve student achievement. Specifically, it was intended to assist in
providing a blueprint of practice for principals who wish to improve student achievement at schools that have at least 50% of their testing students on free or reduced-price lunch, at least 25% of their testing students categorized as ELs, and an API below 800, and who desire to meet the API growth targets set by the NCLB Act of 2002. This study
included the qualitative data of six secondary school sites whose principals provided perspectives
on time-proven strategies that they attributed to improvements in student achievement. Of the
six principals, all had 5 or more years of experience as principals; five were male and one was female.
Their schools had 3-year student enrollment averages ranging from 1,435 to 2,835 and all were
located geographically in southern California. The data collected for this study were gathered
from four principals in May of 2014 and from two principals in October of 2014. All interviews were conducted in the principals’ offices, and each interview was recorded in its entirety (see Table 2).
The purpose of this chapter is to report the data that were collected to answer each of the
four research questions. As Patton (2002) suggested, an interviewer needs to “enter into the other person’s perspective” (p. 88) in order to understand through his or her experience how he or she interprets and assigns specific value to the actions that have already occurred. An interview
protocol was designed to guide a semistructured interview process. Although there were two to
three interview questions used to support each of the four research questions, additional ques-
tions were asked on an as-needed basis to help expand on or probe into
Table 2
Information Regarding Schools of Principals Participating in Study

School site (a)        FARL (b)    3-year API growth    EL population (b)    Enrollment (b)
Alpha High School      84.5%       41                   46.9%                1,730
Beta High School       54.6%       59                   27.1%                1,759
Gamma High School      85.4%       38                   37.0%                1,437
Delta High School      75.8%       38                   27.2%                2,834
Epsilon High School    62.9%       35                   40.9%                2,438
Zeta High School       84.3%       42                   52.9%                2,833

Note. FARL = free and reduced-price lunch program; EL = English learner. Taken from DataQuest, by California Department of Education, 2013a, retrieved from http://data1.cde.ca.gov/dataquest/
(a) Names are identified with pseudonyms, but case study-related data are real.
(b) Percentages represent a 3-year average.
the interviewee’s answer. A protocol for the interview in addition to the interview questions is
provided in the Appendix.
Coding of Data
Upon completion of all recorded interviews, the electronic files were submitted for tran-
scription in verbatim form to fully capture what each interviewee intended to communicate.
Each transcription was read in its entirety, during which highlighting was used to identify ele-
ments that directly related to one of four research questions—a process that Merriam (2009)
called “open coding” (p. 178). After open coding was done, all highlighted text was grouped by
color and subsequently analyzed for emerging themes. To help track emerging themes, both Microsoft Excel© and Microsoft Word© were used to facilitate analysis. Once data were grouped,
analysis began and emerging themes that supported research questions were identified. This
chapter reports on themes in narrative form using direct quotations from participants to highlight
findings for each research question.
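As a rough illustration of the grouping step just described, the sketch below tags hypothetical highlighted excerpts with the research question they support and buckets them for theme analysis. The excerpt text, dictionary keys, and function name are placeholders chosen for the example; they are not artifacts from the actual coding, which was performed with highlighting in Microsoft Word and Excel.

from collections import defaultdict

# Hypothetical coded excerpts; "rq" marks the research question each highlight supports.
highlighted_excerpts = [
    {"school": "Alpha", "rq": 1, "text": "we built leadership capacity first"},
    {"school": "Beta", "rq": 2, "text": "SMART goals structured our collaboration time"},
    {"school": "Alpha", "rq": 2, "text": "teachers turned interim results into reteach plans"},
]

def group_by_research_question(excerpts):
    # Bucket coded excerpts by research question, mirroring the color-based grouping.
    buckets = defaultdict(list)
    for excerpt in excerpts:
        buckets[excerpt["rq"]].append(excerpt)
    return buckets

for rq, items in sorted(group_by_research_question(highlighted_excerpts).items()):
    print(f"RQ{rq}: {len(items)} excerpt(s)")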
Research Questions
To facilitate comprehension of the findings, this chapter is divided into six sections, each
of which represents a theme emerging from the qualitative study: (a) building leadership capac-
ity, (b) setting clear expectations, (c) commitment to product-driven PD time, (d) rebranding of
walkthrough process, (e) strategic conversations, and (f) staying the course. The correlation between the six themes and the guiding research questions is shown in Table 3.
Table 3
Relationship Between Research Questions and Associated Emerging Themes

Research Question 1: What systems are considered and subsequently put in place for a data-driven school culture?
Emerging themes: Building leadership capacity; Setting clear expectations

Research Question 2: How do school site principals and their stakeholders transform data from raw form to actionable data used to drive decision making in instructional practice?
Emerging theme: Commitment to PD time

Research Question 3: What are the barriers that a secondary principal faces when implementing data-driven decision making?
Emerging themes: Rebranding walkthrough; Strategic conversations

Research Question 4: What are the mechanisms that secondary principals use to evaluate the effectiveness of protocols and systems put in place by data-driven decision-making processes?
Emerging theme: Staying the course

Note. PD = professional development.
The section on building leadership capacity describes how leaders began building support for DDDM protocols by developing teacher leaders at their sites. In all cases, the teacher leaders
became the driving force of the initiative. In some cases this took the form of revamping or
creating a personnel position to redefine the role of a previous leadership position, such as
department chairperson.
The section on setting clear expectations touches on how leaders went about defining the
parameters of DDDM procedures at their sites. This definition surfaced through a variety of methods,
including ingraining the process as part of the core practices of the school and embedding
language in the vision. In all cases, however, the strategy of repetition and continuous communi-
cation played a role in its success.
The section on commitment to product-driven PD time describes how the principals all
made a conscious effort to dedicate time within their PD calendars to provide teachers with the
requisite skills to effectively use DDDM to drive instruction. In nearly all cases, instead of using supervision as a means to keep teachers accountable, the principals used rapport and trust to keep teachers accountable for PD time, in addition to asking for products that assisted in monitoring progress.
The section on rebranding walkthroughs describes how principals made a conscious effort
to rebrand their walkthrough procedures to directly support DDDM processes versus being
perceived as compliance or a checklist type of process.
The section on strategic conversations describes how principals used face-to-face meet-
ings with individuals who they felt were slow in adopting DDDM practices. Communication
was typically very forward, respectful, and supportive.
The section on staying the course deals with how principals understood that both consis-
tency and time were essential factors in how sustainable and successful their DDDM processes
were in improving student achievement, as measured by assessments.
Themes Related to Research Question 1
Research Question 1 asked, “What systems are considered and subsequently put in place
for a data-driven school culture?”
Building leadership capacity. In the literature review of this study, it was noted that
prior to embarking on a DDDM initiative, a leader must understand that its effectiveness is con-
tingent upon his or her commitment to the process (Armstrong & Anthes, 2001; Herman &
Gribbons, 2001; Kerr et al., 2006), specifically the need to avoid a fire-ready-aim tactic but
instead lead people to discover the importance of the process. Herman and Gribbons (2001)
supported this grassroots approach to leading, saying that there is a “strong and undeniable
tension between top-down accountability requirements and authentic, bottom-up school inquiry
process directed at improvement student learning” (p. 30). The principals who participated in
this study all supported the latter by explicitly citing the need to build leadership capacity that made it possible for a DDDM conversation to begin. The teacher leaders all
became lobbyists who played a critical role in selling the initiative to their colleagues.
Alpha High School (AHS). The principal of AHS supported the latter finding when he
acknowledged that upon his arrival at AHS, he was mindful of how top-down approaches to
leadership were ineffective and immediately moved to change that perception:
Initially when I started, it was very . . . sort of top-down, you know, traditional hierarchy
of a large system, a large organization, and everything just kind of got done and, and
dropped down on everybody. Whether it would be, you know, the data, the summative
data, or the interim-type of assessments, and then having it become more from the grass
roots and be built up through . . . the teacher ranks, and letting them have more of the
design of that. And then with the design of it, I think that sparks a natural curiosity for
them to see what the results are, and then they take a lot more interest . . . in, you know,
trying to influence what the results are versus, you know, just something being placed
there, and “Do this” and then, . . . “These are the results,” . . . It also impacts motivation,
too, I think.
He further indicated that a critical area was ensuring that the theory of distributed leadership was
in place prior to embarking on the DDDM initiative.
Beta High School (BHS). The principal of BHS began his DDDM journey through a
different avenue; he sat down with teacher leadership, specifically his association’s president:
I had monthly meetings with [her] and reviewed the new initiatives with her [and] al-
lowed her to ask any follow-up questions, clarifying questions, so that she had a clear
grasp of why we were actually doing it—the process by which we were going to imple-
ment it and the timeline over the course of time of what that process would actually look
like. And so by doing so, any time she received pushback without my presence, she was
able to clearly articulate what the reason and rationale for doing it, and if she got hung up
on topics, then she was able to come back to me, and so it was that running dialogue,
open communication, and rapport that was established—what the association that allowed
me to effectively implement [the DDDM] process.
After sitting with her, he then moved to completely revamping job descriptions to help formalize
the leadership expectations for the position. He shared:
We created a job description because the job description didn’t, hadn’t previously existed,
and based on that job description, it basically took the role of the department chair from
being a pencils and paper clip, supplies counter, to a person who ultimately was moving
into an instructional leadership.
Gamma High School (GHS). The principal of GHS left nothing to interpretation when
she made her core belief on this topic clear: “It really is based in and it’s district wide on the
premise that building capacity of teachers is the most important thing that we can do.” The latter
belief, she continued to expand, is applicable to all teachers—not just those teaching core content areas (i.e., ELA, math, science, social science). Teachers in visual and performing arts, physical edu-
cation, and world languages all had the same support. In addition to the core belief of teachers
being the most valued resource in improving student achievement, there were other protocols in the DDDM process that helped empower teachers, specifically teacher leaders, to organically promote the valued teacher culture that allowed GHS to capitalize on DDDM processes to drive student
learning. Feeding this notion was a faith-based approach to PD that fully trusted teachers to do
what it took to drive the DDDM process without direct administrative oversight. The principal described this situation by saying:
Our feeling is you’re professionals, you know, you bring us your minutes, your agendas,
you know, products from those meetings, but we have faith and confidence that you are
going to follow your agendas and make improvement. So that’s a [Gamma Unified]
feeling that we trust, and that’s part of why we have a good relationship with [our teacher
association] is we genuinely believe that teachers are the most important part of every-
thing we do and that they are professionals . . . and we’re gonna treat them as profession-
als.
The practice articulated by the principal was one that Ikemoto and Marsh (2007) supported when
they indicated that trust was “a particularly important enabler of complex DDDM” practices (p.
120).
Delta High School (DHS). The principal of DHS acknowledged that the success of
school initiatives, including the DDDM process, was contingent on consistently building teacher
leaders:
It required us . . . to invest more time and developing leadership skills in those positions,
and we continue to do that . . . and invest time in . . . the course leads to further define
what their role is and how they’re going to facilitate those meetings. But at the heart of
the meetings is the data, you know, and it’s kind of continuous training where I think our
courses need to be continually minded of what are the priorities.
Although he felt that this was an important process, he acknowledged that staying the course with
building teacher leaders is a process that takes time and requires attention:
Getting the right people to apply and the right people to serve in those roles, I think, was a
challenge early on, and it’s a challenge as people step down. Okay, how do you find the
next person that’s going to move that next group forward and be good at looking at data
and making some—and drawing some conclusion to the data, and then having them
stimulate the right discussion to stimulate instructional change.
Once teacher leaders were selected, however, as at GHS, there was a faith- and trust-based system that delivered the work not through top-down administrator directives to teachers but rather by leveraging teacher leadership as middle management. He noted that
we rely on our . . . teacher leaders as well to support the change that we’ve discussed, and
. . . we try to monitor it through classroom observation, and then make recommendations
back when we’re not seeing it to course leads and department chairs. So this is what
we’re seeing great or this is what we’re not seeing and would like to see more of, and
trying to communicate that through our course leads and our department chairs.
The process was not completely autonomous, however, as administration socially negotiated with
teacher leaders to use DDDM processes to improve an aspect of student learning. He shared that
“we charged each department in coming up with a strategy or, on the way we addressed it and as
a school, it was nice to come to consensus.”
Epsilon High School (EHS) and Zeta High School (ZHS). The principals of EHS and
ZHS took a slightly different approach to lobby support for their DDDM processes: They exclu-
sively worked with leadership teams composed of department chairs and/or content team leads.
Nevertheless, the commitment was present in the form of continuously meeting with their lead-
ership teams to drive the work. The principal of EHS commented:
Our site-based PD has to do with our instructional leadership team, which is a group of
about eight teachers that meet quarterly, actually at least quarterly on a release day, and
during our 4th-quarter release day, together we come up with our PD plan for the follow-
ing year.
The principal of ZHS made it clear that he enjoyed meeting with his team:
I liked meeting . . . with teachers and with the leadership. We had course leads. In other
words, there’s a department chair, okay, and then we broke the department into subjects,
and each subject had a lead. So I would meet individually with the lead, and they would
have like four to five teachers in each subject, or course. And so like, let’s take science.
So the biology lead would have four or five teachers in biology.
The ZHS principal communicated the latter message and other core beliefs with conviction. It is
this passion that builds the trust necessary to move the DDDM process and in the long term
promotes the “ethos of learning and continuous improvement” that Park and Datnow (2009, p.
483) identified.
The principal of ZHS had the latter in mind when describing his long-term goals but was
aware of the time and energy that was necessary to invest early on in the DDDM process to
support its success.
Summary of key points. What became abundantly clear in the message from all six
school leaders who participated in the study was that they firmly believed that systemic uses of data to guide decision making about instructional practices were preceded by the need to cultivate teacher leaders who become proponents of the DDDM initiative and ultimately the
impetus for change. The principal of DHS articulated this well:
I think that was a major step, and probably the most innovative piece was creating the
level position to help run that and ultimately generating quadruple the number of depart-
ment chairs on the campus . . . you know, multiply that number of people on campus by
four and you got buy-in already with . . . a third of your campus because serving in these
leadership roles and that helps things, right? You get close to that tipping point with
more people on board, driving those discussions.
It is this tipping point that becomes essential to pursue through aggressive and purposeful lobby-
ing with teacher groups prior to even discussing DDDM processes with stakeholders. In this leadership-building stage, all principals lobbied for their initiative by promoting data as informative to the improvement of instructional practice rather than as a surveillance tool (Earl &
Fullan, 2003). What was absent in the evidence was the phenomenon that Herman and Gribbons
(2001) described as leaders using data manipulation to misrepresent change initiatives. (See Table 4 for a summary of building leadership capacity by school.)
Table 4
Summary of Building Leadership Capacity, by School
School site Procedures utilized to build leadership capacity
Alpha High School Restructured perception of leadership from a top-down model to using a
leadership team comprised of department chairs and administration.
Beta High School Used teacher association president as a springboard, then capitalized on
vetted initiatives working along with department chairs.
Gamma High School Adopted and promoted district-adopted culture of putting teachers first to
drive initiatives through department heads and content team leaders.
Delta High School Invested time to develop leadership skills through the redefinition of
department chairs by revising job expectations and leadership coaching.
Epsilon High School Used leadership team comprised of administration and department chairs
to drive professional development and collaborative work.
Zeta High School Used leadership team comprised of administration and department chairs
to drive professional development and collaborative work.
Setting clear expectations. Once a mindful approach was determined for building
leadership capacity to assist in making DDDM palatable for teachers, the findings of this study
indicated that leaders felt a need to crystallize their expectations for stakeholders. Having a clear
set of expected outcomes coincides with what the literature supports (Anderson et al., 2010;
Datnow et al., 2007; Isaacs, 2003; Kerr et al., 2006; Lachat & Smith, 2005; Levin & Datnow,
2012).
AHS. The principal of AHS acknowledged that he understood that the number one
priority of any school should be instructional practices; however, he realized that he had an uphill
battle with respect to setting clear expectations with DDDM processes to improve instructional
practices, as there were other elements in need of alignment prior to setting instructional
direction: “What I noticed within the first couple of weeks when I started working here was that
there was a gross misalignment of curriculum instruction and assessment.” He made it clear that his message on instruction had been consistently simple: “It’s all just the basic concept of
backwards mapping and, you know, seeing what types of things students need to know, and then
planning backwards, and then measuring incrementally their progress towards kind of the large
summative assessments.” He later expanded on his foundational core instructional beliefs in
reference to establishing expectations from teachers: “You have the four questions: What do you
want students to know? How do you know that they’ve learned it? What do you do with those
who haven’t? And what do you do with those who have?” At the same time, he was clear that
he was not going to stifle teacher creativity and that he respected teacher professionalism: “[I am]
a believer in that you want—that you define what the outcome—what you’re hoping the outcome
can be and, you know, maybe suggest some variety of approaches or ways to get that done.”
BHS. The principal at BHS took a more formalized approach to setting expectations
when he shared: “Getting people to understand how we were going to use this SMART goals
process as a tool to use the data that drive and adjust our instruction”—thereby acknowledging
that this understanding was essential to the success of the DDDM process. He later indicated that
this preparation was extensive, including training his own administrative personnel to have
concise communication and then moving on to teacher leaders prior to going school wide with
the process. His expectations were rooted in one basic premise, “to get teachers to stop working
in isolation and start working in a collaborative manner.”
GHS. The principal of GHS went about articulating her views on expectations by starting
with describing the expectations that her superintendent has of her. She shared that “the annual
report that I have to do to our superintendent and the school board and all of the measurable
targets that fit within there” are essentially what drive her perspective on DDDM, noting that her
perspective was, “What does get measured is what gets done.” However, prior to measuring
anything, she acknowledged that informing people of what she expected to see was crucial. She
noted that
the more you can be up front, and these are the things I look for in the classroom. You
tell them that up front, and then if you don’t see it, well, they knew that that’s what you
were looking for.
After communication had been made, however, she described herself as a charger when it came
to expectations. She made it clear during her interview that “it’s nonnegotiable that you’re gonna
use good instructional practices, that you’re gonna hold kids to high expectations, that you’re
going to do bell-to-bell instruction.” During her interview she clarified that her expectations
were driven by her core beliefs of putting students first while simultaneously respecting the pro-
fessional autonomy of teachers: “We want student-centered learning, student engagement to be
high, but you can use a variety of different practices to do so.” Unique to this interview, how-
ever, was how the culture of expectations extended to student groups as well, albeit through the
expectations of adults. She shared that she believed that it is “all about building relational
capacity with kids but also helping them to become self-managing in their behavior and create
high-functioning, self-managing classrooms.”
DHS. One of the most streamlined descriptions of expectations came from the principal
of DHS. He shared that the array of decisions affecting DDDM practices that were ingrained at DHS at the time of the study began with bringing in all stakeholders, from the administrative and teaching ranks up to the school board level, a process that he credited with having assisted in making expectations clear. He stated that the entire process “crystallized what our focus is and
what we value and has created the structures that I think will help us continue to address those
areas we need to focus on.” Despite this fact, he acknowledged that “probably the hardest thing to do is get the entire organization to believe and value the same things.” He indicated that
the school achieves this clarity at the teacher level by having well-established systems for teach-
ers that clearly communicate expectations. He shared: “For us, we have a thing called teacher
power where the expectations are clear on how teachers function within the structure that I’ve
discussed with the collaboration groups.” Despite having well-established clarity to teacher
expectations, he acknowledged that at times it was difficult to find an area of focus, given so
many data barometers. His answer to this ambiguity was leadership. He shared that he was
mindful to
keeping the staff focused on the areas that we feel are . . . prioritized and the . . . data can
be so confusing at times, or it can provide you with so much information that you have 25
things that are priorities, and I think you don’t get anything done that way but so to help
narrow the data down to things that are—that we can affect and make an impact on and in
keeping that as a focus for our staff.
The latter mindfulness to leading teachers to important areas of focus, he acknowledged, could be
a difficult task due to the “gray data” that is so “personal to teachers.”
EHS. The principal of EHS had a very concise and basic message when it came to con-
tinuously communicating his expectations. During his interview, he succinctly noted that his
expectations when starting his DDDM were rooted in three core beliefs: “I’ve got three ideas:
effective teachers, district instruction, and teams.” He would eventually go on a journey to define what effective teaching looked like, and he made sure to bring his teacher leaders along when he did so. In terms of district instruction, he was referring to district instructional initiatives in which teachers were professionally developed through teacher pull-out days. Finally, he
valued collaborative teams. He later expressed that for the latter three ventures, he took the
approach of being consistent. He shared that “in terms of . . . what I wanted to do with the teach-
ers here was to, in a repetitive kind of way, really attend to a limited number of achievement,
goals, and did not swamp them with data.” It was these three fundamental elements that were the
groundwork of his expectations as principal that he repeated over and over, making it clear to
teachers that he expected them to come through for students.
ZHS. The principal of ZHS had a radically unique way to communicate his expectations
as principal. He began with acknowledging that because the tenure of principals is very short,
there is very little time, in his opinion, for principals to bring about change. He referred to taking
risks immediately: “I think that you need to make bold moves, bold strokes, and then refine it as
you go through your career. So you come in as a principal, you need to make changes and make
major changes.” Despite this perspective, he was aware that he could take such an approach
because his site was the lowest performing school in the district. He leveraged this fact to rally his teachers behind him, as expressed in this comment:
So the courses that were at the very bottom, I had a little bit of an edge because I’d say,
“Okay, we’re—there’s no way we should be last in the district in anything.” You know
what I mean? And so that started a conversation there, with picking that up.
Although his approach was untraditional, his underlying message to his teachers was “to be the
best” or most improved. How that materialized was similar to EHS, which was focusing on a
small number of items that included improving A-G rates and improving standardized assessment results on the CAHSEE and CST.
Summary of key points. All accredited public schools in California have a vision and
mission. It is part of the process of accreditation of the Western Association of Schools and
Colleges; however, in the case of the principals who participated in this study, it was clear that their instructional goals were not only directly tied to the core beliefs of the principal and teachers but also ingrained in their school vision. The messaging was concise and continuous, and principals made sure to ingrain DDDM as part of their everyday practices. Despite slight variations among the six principals, they all spoke of instructional priorities with conviction and were able to clearly articulate where their schools had been, where they were, where they were going, and how they were going to get there. (See Table 5 for a summary of setting clear expectations by school.)
Theme Related to Research Question 2: Commitment to Product-Driven PD Time
Research Question 2 asked, “How do school site principals and their stakeholders trans-
form data from raw form to actionable data used to drive decision making in instructional prac-
tice?”
As previously mentioned in the literature review of this study, the most ubiquitous theme
that emerged from the literature on DDDM was the pivotal role that building data analysis proficiency plays in a successful implementation of DDDM initiatives. Although the latter theme
Table 5
Summary of Setting Clear Expectations, by School
School site Procedures utilized to set clear expectations
Alpha High School Had to invest time in realigning a focus on curriculum, instruction, and
assessment. Collaboration was a central focus of the DDDM process.
Beta High School Used the SMART goal process to set clear instructional expectations.
Gamma High School Adopted a mantra of “what gets measured gets done.” Used the latter to
drive professional development. District assisted in setting expectations.
Delta High School Used a core belief system redefinition to gain interest on behalf of teach-
ers. Used the latter to drive professional development. District assisted
in setting expectations.
Epsilon High School Relayed a clear expectation of doing what was best for students and
using DDDM to accomplish this task.
Zeta High School Used low achievement to drive excitement to improve achievement
through DDDM practices.
Note. DDDM = data-driven decision making. SMART = specific, measurable, achievable,
realistic, and timely.
arose, albeit to a very small extent, in the data collection of this study, it did not emerge with the
same tenor as the PD theme in the data collected from the six participating principals. Leaders
who participated in this study made explicit reference to their commitment to pursue additional
time for teachers to consistently participate in PD with the sole purpose of collaborating with one
another on best practices guided by data. The latter emerging theme of being committed to PD
coincides with what Elmore (2002) eloquently expressed on the topic of contemporary gaps in
student achievement: “American schools and the people who work in them are being asked to do
something new—to engage in systematic, continuous improvement in the quality of the educa-
tional experiences of students" (p. 3). The participating principals reflected the systematic
approach to which Elmore referred: all of them, in one way or another, strove to build an
inquiry-based, continuously improving DDDM process that they identified as playing a key role
in improving student achievement.
AHS. The principal of AHS validated the research when he acknowledged the impor-
tance of using data to determine the effectiveness of instruction:
Whatever time is allocated for professional development . . . with your teaching staff, you
definitely want to make sure that you’re on target with that. And the only way to see if
you’re on target is to see, you know, . . . what the needs are and . . . what’s working and
what needs—what areas could be improved in different curriculums.
He added, however, that having PD for the sake of having it is not advantageous to any principal.
He referred to PD as “planned and intentional activities” where, despite allowing some autonomy
for teachers to deliver curriculum, there is an expectation that they use the best practices acquired
in PD meetings. The objective, he indicated, becomes conditioning teachers to revisit “the cycle
of inquiry, and the teachers working together, and it’s just the natural progression throughout the
year once you have some of those other elements in place.” To ensure that the cycle of inquiry
was revisited, he collected artifacts that become the product that teachers create when they meet;
then he said that “as long as you’re collecting the evidence and the documents and seeing the
work, you can see the work move forward.” When discussing this product-based PD to which he
was committed, he revisited the importance of having teacher leaders take the reign and lead the
work. He achieved this task by making sure that the paid teacher leader positions had data
analysis proficiency:
Ensuring that the people that are in the leadership positions are able to articulate and
understand what—you know, what they’re working with, and how to interpret it, . . . and
then working with the leaders to, you know, influence their areas in the best way possible.
So . . . it’s the monitoring of the leadership of the school, is probably—probably the
number one factor that can influence what’s happening.
BHS. The principal of BHS took a unique route to building his PD plan. He began
reviewing a book with his administrative team and then moved to his leadership team. The
message of the book helped mold his core beliefs about PD: "The basic premise of Results Now
was that you get teachers to stop working in isolation and start working in a collaborative man-
ner.” He then noted that he used the message of the book to “set up structures similar to profes-
sional learning communities where you allocate time.” Although the concept of collaboration
among teachers in a professional learning community may not be foreign to school sites, the
manner in which he carefully crafted a plan to roll out structures that, at their root, would pro-
mote meaningful collaboration among teachers in a sustainable, continuous fashion was the
overarching message that he communicated:
So it was essentially a cycle of inquiry that we established for the course of the year, and I
think we saw a significant improvement—one, because teachers were working collabor-
atively together to discuss the data results, and two, they worked together collaboratively
to determine what the best instructional strategies to utilize to cover particular topics but
still giving teachers some level of control within their own classroom.
He discussed his mindful approach to confront the barriers that he knew he would encounter:
The focus of this data process was based on collaboration. It wasn't based on timing.
We spent a significant amount of time breaking down the barriers of autonomy and
teachers being willing to share their student results and, based on the sharing of student
results, foster the development of sharing instructional strategies.
All in all, a total of 18 months was invested in simply training administrative and leader-
ship teams at BHS—a factor that highlighted the commitment of the principal to invest time and
energy on his DDDM initiative. Once administrative and leadership teams were trained, the
principal moved to training teachers to use the SMART goal process to guide their instruction
while giving him an unintended monitoring tool:
As a department, . . . those SMART goals were . . . submitted back to the administrator
staff, but you might have had for our overall goals for the district for the year, but teachers
then could use those SMART goals to address what they actually needed to adjust in their
classroom.
GHS. The principal of GHS described the need to get down to basics with her staff. She
acknowledged that teachers, despite providing students with instruction, were not using the
instructional practices that she felt would improve student achievement. She immediately began
with a focus on students: “[A] big need I saw was student engagement was not really high.” She
also referred to historical trend data to guide her perspective:
Five years ago had a lot of teachers that were doing a lot of lecture based, or the students
were very compliant but there wasn’t a lot of affiliation and choice and novelty and
variety happening instructionally. And so again, that helps then to determine what profes-
sional learning that we will do across the system.
Because she had made her expectations about student engagement known to teachers, they knew
that it was a “non-negotiable” that they would engage students in rigorous instruction. The prin-
cipal described how her opinions on the lack of engagement were not subjective but instead were
guided by straightforward data. She informed her staff that she and her administrative team
would collect “checklist” data from their classrooms, including the number of times a student
was engaged or asked questions. This information then served as data to justify PD content.
Despite her firm core beliefs, the principal of GHS was clear when she acknowledged her
inability to truly gauge whether the PD she valued so much was having
the impact she desired:
What was the one thing that really made the difference? I don’t know. We . . . don’t
operate in a vacuum so . . . I don’t know if it was the PD or it was the improved achiev—
you know, discipline, or what it might have been. My gut tells me it’s what happens
instructionally. If you have a good teacher in front of kids who is, you know, very skilled,
. . . and have a good relationship with kids that it will make a difference for kids. I
believe in that research, so that’s why I focus so much on making sure that they have
good professional learning.
It was this “gut” feeling that drove her belief that if she professionally developed her teachers,
there would be improvements in student achievement. This principal made sure to highlight that
PD did not necessarily have to take place at GHS. In fact, she monitored which teachers had
been to district and/or third-party organization trainings such as Advancement Via Individual
Determination (AVID) or Advanced Placement (AP) College Board trainings:
So if you look at my boards, this board tells you all the staff development to date that our
teachers have been trained in. So at—in AVID strategy, 60 of the 75 teachers on this
campus have been gone to summer institute or . . . training. The only teachers who have
not are typically your severely handicapped teachers, your ROP [Regional Occupation
Program] teachers and maybe your PE [physical education] teachers. Everybody else has
had an opportunity to go. Highly qualified . . . is an annual measure, but I want to make
sure that our teachers and they are 100% highly qualified in what they’re teaching. All of
our AP teachers have been to training; but if I add new AP teacher next year or AP class
and I have to make . . . sure that they’re going this summer.
DHS. The principal of DHS made it clear that the journey to building the systems that
would promote collaboration through PD came about only after strategizing over many obstacles.
He alluded to change within organizations as being difficult even when it came to changing
something as simple as a bell schedule; nevertheless, there was time and energy invested in
making sure that those systems were developed:
I think . . . that was an obstacle early on as well. I mean, just, with logistical planning for
how do you create time in the school day for people to analyze data, you know, and
realizing if you ask them all the old traditional models—we’ll ask them all to come after
school and just stay for department meeting after school. . . . that didn’t always work, you
know, and so the rest of them off school have time. Where do you build the time in to
meet, to discuss data and make data a priority school wide? So we changed our bell
schedule . . . to build in collaborative time into every week, and we have two even sched-
ules built into developed schedule every week.
He navigated the challenges by drawing on the teacher leaders whom he had mindfully
cultivated. It was these positions that played a crucial role in DHS teachers adopting his vision of
collaborative teams working together in purposeful PD:
It required us . . . to invest more time and developing leadership skills in those positions,
and we continue to do that . . . and invest time . . . in the course leads to further define
what their role is and how they’re going to facilitate those meetings, but at the heart of the
meetings is the data.
It is important to note that the principal of DHS was very mindful of the autonomy that he
provided teachers during PD time. Also important to note is the high level of trust that he placed
in those course leads to make sure that they moved forward with the work required to take place:
They meet without us and their course elect teams; they discuss the data . . . in their own
course elect teams. They do so in their meetings to discuss what best practices were
shared, what items were changed, and what was the discussion centered around; but
beyond the minutes, we don’t sit in those meetings and dictate how it goes.
Like the principal of GHS, the principal of DHS also pointed to offsite PD as playing a
role in the success of DDDM. Unique among all of the responses on PD, however, was his
emphasis on prioritizing funds to dovetail with values and objectives:
And I think that’s another piece, how you spend your money is an indicator of what you
value so, you know, having that summer professional development and paying everybody
to attend for as many days as they need to in the summer.
In addition to allocating funds for teachers to attend third-party PD, the principal of DHS made
sure to highlight how he provided additional pay for PD hours outside of contractual pay:
[During] specific weeks in the summer for each subject area, and they meet intensely that
week, but in addition to that, each teacher is approved to work collaboratively for as many
hours as they want as long as they are working on things that are priorities for our district.
EHS. The principal of EHS began his discussion of PD by acknowledging the support of
his district: “There was a tremendous amount of really valuable staff development that was
provided by the district to teachers during the summer and after school on instructional practice.”
Despite that support, like other principals he realized that during the initial implementation of
DDDM practices, the PD was contingent on his navigating the personalities of his staff, including
teachers who did not value collaborating with one another as much as he did:
They’re not—they’re not obstinate, difficult, argumentative people. They just would
prefer to work alone. . . . Then it shows how important it is to visit classes, shows how
important it is to attend departmental meetings, and how important it is to group teachers
and have them talk about things, because that’s how we draw the autonomous teachers
back into the group.
A unique perspective on collaboration was shared by the principal when he highlighted some of
the logistical factors he faced when collaborative meetings were held:
Even though I think the trend is first to work together more and more and more, sharing
ideas and have common practice and more cohesiveness around pacing and instruction
and assessments and reassessments, there’s always some sort of an argument going on.
Always. And professional debate is, you have to have it. It’s actually desirable.
Despite this challenge, the principal made sure that teachers understood his position and did not
deviate from his expectations:
My steps are that there’s an expectation that teachers meet, and there’s an expectation that
they talk about effective instruction together. There’s an expectation that they talk about
what Common Core is together, because we don’t want to get away from that, and that
people are encouraged to take risks—that the innovators are encouraged to innovate, and
then I will share what they learn with the faculty.
An interesting technique that the principal identified to help navigate the latter problems
with the initial rollout of collaborative PD was seeing himself as the eyes and ears of his school.
He provided an example of dealing with an individual who valued autonomy over collaboration:
she indicated to him in a conversation that she could not possibly use all the data for English
language learners as an instructional guiding tool because there were so many data points.
Instead of giving her an example of how he thought she could resolve the problem, he sent her
model examples that other teachers had used. He referenced this technique as a way to promote
collaboration in an "organic" fashion, albeit one indirectly influenced by his actions.
Like every other principal who participated in this study, the principal of EHS used
teacher leaders to drive PD:
Our site-based PD has to do with our instructional leadership team, which is a group of
about eight teachers that meet quarterly, actually at least quarterly on a release day, and
during our 4th-quarter release day—together we come up with our PD plan for next year.
The site instructional leadership team, however, had a unique function at EHS: to focus
specifically on utilizing the cycle of inquiry to refine lesson plans as a group instead of
strictly sharing best practices as other sites did:
A group of eight teachers, they work in pairs, design a lesson together, teach the lesson,
refine the lesson and re-teach the lesson all in one day. Um, and this year’s theme is going
to be around interacting with complex text.
ZHS. The principal of ZHS had a more hands-on approach to setting objectives for PD.
He utilized face-to-face meetings with each content team to identify areas of growth: “I would
meet with the whole team and do a pullout with them, and they would start designing their
lessons based on where we found that we were struggling in.” The amount of time that teachers
had to collaborate was much more limited at ZHS because teams met only monthly instead of
weekly as at other school sites:
So the last Friday of every month was just to have data teams, and they would go over the
data from . . . their classrooms for that month, . . . because they were—we were getting
pretty common assessments and seeing which class was having a lot of success and
everything.
The principal of ZHS did not dwell as much on PD as a factor in the success of DDDM.
PD had been so dysfunctional upon his arrival at ZHS that he was able to leverage the school's
significant underperformance as a factor to drive PD.
Summary of key points. Although there was variation in the personnel used to drive
PD, the time of day and length of time that teachers met, the topics discussed during PD, and
whether PD was provided by the site or the district, one theme was evident: all principals understood that
PD played a crucial role in the success of the DDDM initiative. More important, they were
aware that objectives for PD had to be clearly communicated to teachers and that time had to be not only
committed but also consistently provided to teachers with the sole purpose of revisiting DDDM
procedures. Also evident, albeit more strongly at some sites than others, was the expectation of a
product during PD. In other words, there was no evidence of sit-and-get types of PD. When
teachers attended PD, they were required to create lessons, bring artifacts to meetings that would
drive the work, fill out SMART goal templates that administration would use to drive
discussions, or bring a best practice to a meeting to share with staff. (See Table 6
for summary of commitment to product-driven PD.)
Table 6
Summary of Commitment to Product-Driven Professional Development (PD), by School
School site Procedures regarding product-driven PD
Alpha High School Primarily used department chairs to drive conversations with teaching
staff. Objectives were set by an understanding of instructional needs guided by four basic
questions. PD took place during the day and after school.
Beta High School Used both department chairs and content leads to drive PD. Objectives
were well defined through book study. PD took place during the school day,
during pullouts, and during summer.
Gamma High School Course leads, in addition to department chairpersons, collaborated
with administration to drive PD. The district primarily set objectives,
with some site administrative autonomy for implementation. PD took
place during the school day, during pullout days, and during summer.
Delta High School Course leads, in addition to department chairpersons, collaborated
with administration to drive PD. The district primarily set objectives,
with some site administrative autonomy for implementation. PD took
place during the school day, during pullout days, and during summer.
Epsilon High School Used instructional leadership team as a means to generate model lessons
that were continuously improved over time. Objectives set by both prin-
cipal and district office leadership. PD took place during pullout days,
after school at the district office, and during full-day PD days.
Zeta High School Used both department chairs and content leads to drive PD. Principal
primarily set objectives. PD took place during the school day, during
pullouts, and during summer.
Themes Related to Research Question 3
Research Question 3 asks, “What are barriers that a secondary principal faces when
implementing DDDM?”
Rebranding classroom walkthrough. Marzano et al. (2003) suggested that school site
administrators need to look beyond compliance in the walkthrough process when they noted that
“rather than prowling through classrooms with checklists of ‘correct’ practices, administrators
should be looking at interim results with their teachers, identifying the most effective practices”
(p. 167). The latter statement, although descriptive of a theme in the DDDM literature, suggests
that walkthroughs should advance work upon which collaborative groups have agreed rather than
function strictly as a means of monitoring teaching practices about which there might be subjective disagreement.
One of the slight deviations that emerged from the data collected was the purposeful intent of principals to
rebrand their walkthrough processes.
AHS. The principal of AHS began rebranding the walkthrough process by acknowledg-
ing that he had a substantial number of teachers who were working hard but with little or no
yield. He capitalized on the latter fact to shift the teacher mindset from one of checklist walk-
throughs that were compliance based at their core to a walkthrough process that focused more on
instructional practices that stemmed from organic discussions of practice in collaborative groups,
hence making it more palatable to teachers:
There was so much frustration from the teachers because . . . you know, it’s like—it’s just
efficiency, you know? You know, everybody only has so much time and energy to
expend while they’re at their workplace, and they want to use it the right way, and then
they want to see results, so there was a lot of hard work and a lot of people putting in
great efforts and seeing very little or no results.
At the same time, however, the principal recognized that it was also a method to get “a sense of
who is preparing which tea—which teachers are preparing the right way to create the . . . appro-
priate lessons.” Despite most teachers being on board with changing the walkthrough process
after the reasoning was provided, the principal acknowledged that approval of the initiative grew
at a slow pace because teachers were naturally reluctant to change:
There’s a lot of, you know, people that are creatures of habit, and . . . they’ve done things
a certain way for a long period of time, and . . . change is hard, you know, and regardless
of who it is, change takes thought, and some . . . you know—there’s a term I remember,
cognitive economy. It’s easiest not to think. It’s easiest not to change, so just the fact that
you’re changing is a barrier. But then once, you know—so then you have to talk about
the rationale: What’s the reason for the change? If it’s just something—you know, it’s
because it’s a whim or something, then people—nobody’s ever going to really get into
that. But if there’s a rationale that goes with the change, then people are more accepting
and willing to try.
The principal also discussed how a cycle of inquiry was used to keep the system accountable
for why specific steps were taken and how the work was being checked, refined, and
acted upon:
I mean, that’s . . . the cycle of inquiry, and, you know, we’ve developed a few different
you know—we call them evidence, and, you know, . . . what’s the plan? You know, what
are you doing, and how are you checking it, and then what’s your next plan, or how . . .
you’re act[ing] on that?
Essential for the principal of AHS was the core belief that walkthroughs could sustainably
move the DDDM process forward only if they were used to constantly monitor practices that
grew directly out of the dialogue of collaborative groups:
So those are . . . things that we know and we believe, and we’ve really tried to tighten
those up and improve those, and it’s just the same thing as the constant monitoring—the
constant checking of practice and adjusting the practice and trying to become more effi-
cient, and as proactive as possible with . . . what you see as coming up next.
BHS. The rebranding of the walkthrough process took on a slightly different form at BHS.
The principal there had to take mindful steps to ensure that teachers did not think that the data
from common assessments were going to be used in the evaluative process:
The major challenge was encouraging teachers to share the data of their student results,
become comfortable knowing that site administration was not going to use their results in
an evaluative manner which would then attach to their yearly evaluation to potentially
evaluate them out of a particular position if they—if their students actually weren’t
demonstrating mastery.
The principal also directed his attention to branding the walkthrough process as a means
to justify monitoring to the central office while simultaneously shifting the mindset of teachers
from compliance to functionality:
The tool that we used as an administrative team was going to be shared with district office
administration as to justification of how we were monitoring or utilizing data—moni-
toring the adjustment of a construction based on . . . that use of the data. So that would be
one, and then the second aspect was because it was a cultural paradigm shift in the sense
that teachers in the past had not used data truly to adjust their instruction, as well as they
hadn’t really worked collaboratively.
As at other schools involved in this study, this "paradigm shift" occurred by building a culture of
trust with staff members and allowing teachers to collaborate without an administrator present
during their meetings:
It was them seeing that the administrator wasn’t in the room sometimes to allow them to
have open conversations about what transpired and then . . . as a department, . . . those
SMART goals were submitted—submitted back to the administrator staff, but you might
have had for our overall goals for the district for the year but teachers then could use
those SMART goals to address what they actually needed to adjust in their classroom.
GHS. GHS leadership used a different process to rebrand the walkthrough process; they
simplified it by focusing on a few instructional items, as the principal noted: “What are the key
things we need to focus on? What two or three things max is all we can focus on, and knowing
the instruction piece was huge, that was a huge amount of our time and support.” This process
did not come as easily as it might seem; rather, it came about through much social negotiation with
teachers who were skeptical of the few instructional initiatives being presented. The principal
carefully crafted how she would generate validation of the process by collecting qualitative data
and then presenting the data to staff without teacher names attached. The focus of her work revolved around instruc-
tion, specifically student engagement through higher level questioning that was supported by
research:
With respect to teacher buy-in, I think being open and transparent about sharing what
you’re seeing, especially when you’re walking classrooms, there’s a process that we used
called—that I used called the data walks process. So you’re in every classroom for about
5 minutes a time, 4-6 minutes, and you’re looking for seven different components of
instruction—so you’re looking at the standards that the teacher’s teaching. You’re
looking at the level of thinking that’s happening. You know whether you use Blooms [a
taxonomy that is a six-tier system used to describe different learning domains] or DOK
[Webb’s Depth of Knowledge, a four-tier system used to describe different learning
domains] or whatever. You’re . . . looking at the instructional strategies, and you’re
basing that on Marzano’s work, or you’re looking at student engagement. And you’re
looking at Schlechty and what he says about what student engagement should look like.
You’re looking at the types of assessments that teachers are giving. Is it multiple choice,
are they asking open-ended questions, or [are] they asking higher level questions, or are
they just asking a question to the whole class and nobody responds?
Once a basic message was established, she communicated it consistently and passionately,
which allowed teachers to refine the process over time:
So . . . it’s nonnegotiable that you’re gonna use good instructional practices, that you’re
gonna hold kids to high expectations, that you’re going to do bell-to-bell instruction . . .
that you’re gonna . . . Students—we want student-centered learning, student engagement
to be high, but you can use a variety of different practices to do so.
DHS. The principal of DHS dovetailed the rebranding of his walkthrough process with
the collaboration for which he had previously lobbied staff members. As noted previously, he
had taken a mindful approach to ensuring that staff members collaborated in a timely way; hence,
the momentum of that process could be leveraged:
Some people were stuck in the way they did things and had a hard time initially transi-
tioning to a more collaborative practice where again, I talked about in the earlier question:
How do we monitor it? We’re in classrooms every week.
Being in classrooms, however, was done specifically for the purpose of supporting what the
teacher leaders communicated back to administrators through the minutes of those collaborative
meetings:
You know, I think somebody told me the saying that you’re . . . the campus will respect
what you inspect, you know, and so what are we looking at, you know? Are we sending
notes back on those minutes when they’re submitted to us and saying, “Great. . . . I like
that. You guys talked about this. Maybe next time, you might want to consider this in
your discussions.” Because if they’re just sending in minutes and nobody’s responding to
them, at some point those minutes become garbage, you know, and useless and—but
looking at them and saying, “You know what? I appreciate this. I appreciate that. I look
for those instructional practices in our next visit to your classrooms.”
At this point it is worth noting that the specific purpose of the walkthrough process was
the observation of best practices discussed at the collaborative meetings. In rebranding the
observation process, however, the principal of DHS had to contend with teachers who had
previously been doing great work in isolation. The principal acknowledged that the objective was
to create school-wide success, not islands of excellence:
For years prior to this, . . . if you were a great teacher individually, you’re fine. There was
no requirement to share what you were doing. There was . . . no requirement that you
would meet frequently to analyze data with your team. We realize that that requires a
certain amount of skill—a certain amount of social collaborative skills—that just because
people were great teachers in the past didn’t necessarily mean they were going to be the
greatest collaborators.
While simultaneously supporting teacher leaders and the fruits of their collaborative
labors, the principal was mindful to avoid a finish-line mentality where compliance was the
focus. Instead, he ingrained continuous improvement into the process to make it acceptable for
administration to always be in the classroom:
We want to be completely transparent with data. It’s a way to know this is just the infor-
mation for us to have, to help us improve every year. That’s kind of our philosophy is we
just want to continually improve—continuous improvement is something we talk about
all the time. We know it doesn’t happen in leaps and bounds and sometimes it’s, you
know, a step backwards at times, but we want to see a trend—how we continue to im-
prove what we’re doing and what drives that is measuring how we did, you know. So in
talking about data simply just information for us to make decisions about, how we’re
going to do again next time. So it’s not as I think intimidating as it once was.
Further supporting his stance on continuous improvement, the principal of DHS embedded
annual practices to ensure that staff were conditioned to use data holistically as a conversation
piece in the collaborative DDDM process that drove the walkthroughs:
So, you know, . . . we’re pretty responsive to data. I think that way and, you know, that
continual monitoring where our staff knows they’re going to get this data every single
year, you know, and they’re going to get—and we work really hard to get them the same
data in the same format so that it’s not overwhelming, so they [are] used to seeing the
same charge every single year and . . . with the same data on there so they can see
longitudinally how things are improving or how—if they’re getting worse. And really
have them become very familiar with the data that we value and we measure, so that they
can see the improvement.
EHS. The principal of EHS took a completely different approach to rebranding his walk-
through process. He noted that he used the walkthrough process to drive collaboration by trying
“to be the eyes and ears to get people to collaborate.” He achieved the latter by drawing attention
away from himself as the principal and evaluator and shining a light on the great work his teachers did
on a day-to-day basis:
And one of the things that I said to our faculty, again, many times over and over, is that
when I went from the classroom to be an administrator, I felt like I had a gift that I never
had as a teacher, where I got to see different teachers in their classrooms every single day.
And I said, “None of you get to do that.” And I said, “I get to see ideas that people could
share, and I could see cases where people can learn from one another’s strength.
One basic example that he provided of a practice to drive the achievement of English language
learners was the use of a seating chart:
And, you know, out of a school of 90 people, I have four to eight people whose . . .
seating charts would look like that. So I’d shoot pictures of them—we’re getting back to
the interpret part—and then I would share the seating chart with the faculty and again say,
“Gosh, I feel like I’m the luckiest person in the world, that I’m the only person here who
gets to see all these things, and none of you get to see it, and I get to see it.”
At the root of DDDM effectiveness for EHS's principal was the person-to-person contact
he achieved in the walkthrough process. He acknowledged that no modern 21st-century silver
bullet could provide him with the qualitative data that the walkthrough process provided, data
that, in turn, drove his decision making about PD with teachers and his conversations with staff
members:
There’s no . . . office-based, electronic, 21st-century way . . . of doing that, . . . and ulti-
mately, I mean, we haven’t even talked about visiting classes. We did a little when I told
you I took pictures of the teachers’ seating charts, but if you don’t know your faculty, . . .
none of this other stuff even matters at all. It doesn’t.
ZHS. The principal of ZHS rebranded his walkthrough process through the lens of sup-
portive leadership. When he observed teachers not engaged in practices that supported growth in
student learning, he would ask for ways in which he could support them. He offered an example
of working with a teacher who had clearly fallen behind: he provided support to get the
teacher caught up, which led to the professional growth he desired:
Well, it starts with a, you know—if I . . . walk into your classroom and I’m just going
like, “Oh my gosh, we’ve got problems here.” You know what I mean? Or “You have
35% of your students are Ds or Fs. You’re going to meet with me, but it’s not going to be
like a meeting where you’re in trouble. It’s going to be like, ‘What can I do to help you?
What . . . do you need?’” One teacher goes like, “Look, I apologize to you. I can’t keep
up. I just need a day.” And I go like, “I’ll pay for your sub. I’ll pay for your sub—you
can work in my office, because I have this big conference room in there, or you can go
somewhere where you're comfortable, you know, the library. Take a day." And the re-
sponse from that teacher after that day was incredible growth.
The walkthrough process took on significantly more meaning for the principal of ZHS, as
it was a perfect opportunity not only to monitor the DDDM practices that teachers had agreed upon
in collaborative meetings but also to leverage his contact with teachers and students to build the
rapport essential to driving growth in instruction:
So I’m taking 5 minutes to do that. I can’t tell you how far that goes for a principal and
for the educator that’s in that classroom going like, “Oh my gosh, those are one of my
kids.” I’ll walk into the classroom and be like, “Oh man, I love the students in your
class.” And so when we got to the start of the school year, I’d done such a good job of
getting to know as many kids as I could all summer, all last year, and into the first quarter
that I can ask the students on this campus, “Look, this is what we’re going to do,” and
they’re going to follow me.
Summary of key points. The principal of DHS noted that "people respect what you
inspect"—a slight variation of the adage "expect what you inspect." No matter what the princi-
pals' positions were, there was a clear understanding that the walkthrough process served, as the
research would indicate, as a progress-monitoring tool. What the data in this study did indicate
was that the principals not only took a thoughtful approach to articulating to staff what was
expected in the walkthrough process but also made sure to transform its perception into the
tool they envisioned as a critical part of their DDDM initiatives.
(See Table 7 for summary of rebranding classroom walkthroughs.)
Strategic conversations. A leader's ability to harness the power of rapport and
relationships to leverage DDDM in driving leadership expectations, PD, and collaborative
meetings was an element that emerged in the literature review. However, one element that was
not reflected explicitly in the literature review was the pragmatic conversations in which leaders
must engage during the operational, day-to-day implementation of DDDM to ensure that all
stakeholders implement the protocol with fidelity. Simply put, principals relied on strategic,
well-timed conversations to promote school-wide commitment to DDDM protocols.
AHS. The principal of AHS took on a “truth-telling” approach to ensure that his staff
members were implementing the DDDM procedures adopted at AHS. He did so by telling staff
what was working and what was not while simultaneously "keep[ing] positive, professional
relationships." He acknowledged that this process was possible because he had put in place a commu-
nication system that had promoted stakeholder input and included their voice in the initiative.
Nevertheless, there was clear evidence that he was strategic in how he engaged staff members in
dialogue, as indicated when he shared his opinions on his own protocols.
Table 7
Summary of Rebranding Classroom Walkthroughs, by School
School site Procedures regarding classroom walkthroughs
Alpha High School Used teacher frustration with the previously ineffective, compliance-based
walkthrough process to fuel rebranding.
Beta High School Took a meticulous approach to introducing the walkthrough process as a
means to support the SMART goal process, taking purposeful steps to ensure
common assessment data were not going to be used in the evaluation
process.
Gamma High School Rebranded the walkthrough process by utilizing a back-to-the-basics ap-
proach. Focused on a few items that became the central focus of the
walkthrough process.
Delta High School Transformed the walkthrough process by capitalizing on previous work that
focused on empowering teacher leaders to drive collaborative work. The
walkthrough served as a means to support that teacher-led work.
Epsilon High School Principal consistently presented himself as a steward for teachers, using
the walkthrough process and his administrative vantage point to highlight
good practices in classrooms and thereby rebrand the classroom
walkthrough.
Zeta High School Principal branded the walkthrough process strictly as a means to identify
teachers whom he needed to support.
He explained, "What I've found is, and I've heard it—it's been reinforced—is people appreciate
probably more just talking to them in the hallway, or coming to their classroom when nobody's there."
BHS. The principal of BHS took a different approach to engaging staff members in
dialogue to ensure that DDDM procedures were followed; he embedded the dialogue in the process
by reviewing the SMART goals of each department. Those conversations, however,
would take place with teacher leaders, who would then bring those concerns back to the teachers
—in essence using a middle person to communicate with teachers:
[SMART] goals would then be submitted to site administrators, and then the site adminis-
trators would sit with the department chairs to, one, just ask program questions as to how
they actually arrived at the goals that were set for the coming year. If the goals did not
necessarily have justification, then the goals were—the site administrators asked the
department chairs to revisit them with their department members to revise those particular
goals.
The latter procedure, however, did not come without some initial guidance from administration:
We identified what we thought were some areas that were strengths and then some areas
of growth, and then we set up some preliminary goals that we thought the departments
could possibly set, and . . . we used those observations to assist us in driving our initial
conversations with the teachers.
GHS. The principal of GHS was very explicit about the actions that she took and the
conversations that she would hold with teachers if they were unable to
implement the DDDM procedures:
“I’m going to be in your classroom, and if I don’t like what I see instructionally, I’m
going to let you know, ‘Hey, this is what I saw—this is what I wanna see. Here’s the
support to make you successful.’ And if you fight me on that, then I’m not gonna change
[because] this is what I know is good for kids and so . . . your achievement data is show-
ing that what you’re doing is not working, so why are you going to continue to do what’s
not working when I’m giving you an option and support?”
Although she referred to what she saw instructionally, the DDDM procedures at GHS incor-
porated shared instructional best practices as a staple of the process. The principal made
it clear that she was passionate about instruction and about the way in which she communicated
that passion to staff members:
So it’s . . . nonnegotiable that you’re gonna use good instructional practices, that you’re
gonna hold kids to high expectations, that you’re going to do bell-to-bell instruction . . .
that you’re gonna [have] student-centered learning, student engagement to be high, but
you can use a variety of different practices to do so.
DHS. Like the principal of BHS, the principal of DHS used a middle person to communicate
undesirable walkthrough findings and ensure that teachers were implementing DDDM
practices. In addition to using a middle person, there was also open conversation during full
staff meetings. The principal described what happened when this practice did not yield desirable
outcomes:
We rely on . . . our teacher leaders as well to support the change that we’ve discussed, and
we . . . try to monitor it through classroom observation and then make recommendations
back when we’re not seeing it to course leads and department chairs—so this is what
we’re seeing great or this is what we’re not seeing and would like to see more of, and
trying to communicate that through our course leads and our department chairs, and in
addition to what we’re saying, and through either formal meetings like a general staff
meeting or department meetings where we stop in and talk about things that are—we need
to address or even our informal conversations we have with individual teachers.
The principal closed the latter statement by sharing that the conversation might even extend into
the evaluation process “through our—an observation process and evaluation process on
recommendations we’re making . . . that we think would be in line with what the data’s telling
us." The principal of DHS also noted that the conversation piece was particularly difficult
because there was an initial fear of change, in addition to veteran staff members at his site
who were being asked to let go of 20 years of autonomous and, at times, successful teaching practices:
There’s a lot of fear. I think fear was a huge barrier. The traditional veteran teachers that
were in place that were working in isolation for 20 years, and we’re used to kind of the
autonomy of their own classrooms—the autonomy of making decisions on a regular basis
or just the feel like, “Hey, . . . I already have the way I’m going to do this—the way I’m
going to teach this year mapped out. I’ve been doing it this way for 20 years. It’s fine.
Why do I need change?”
Despite this issue, the principal made sure to take on those challenges and engage those teachers
in dialogue to ensure that there was school-wide implementation of DDDM practices. The
principal acknowledged that only by having school-wide implementation could long-term
changes in instructional practices occur that would lead to sustainable growth in student
achievement. This expectation seemed to be systemic and was ingrained in the tenure process at
DHS:
We train them to work collaboration. We set that expectation that they’re going to meet
with their peers frequently. We evaluate that way, and that is a major priority for the
granting of tenure for teachers [that adds an] additional layer [because] it's not only
about great instruction in the classroom.
EHS. The principal of EHS, like the principal of GHS, took a very direct approach to
engaging teachers in conversation whenever needed, as he pointed out: "I would say, for me,
where I am now at the start of my 5th year here—and . . . we’re—for lack of a better word,
forcing people to work together—it's the interpersonal stuff." The "interpersonal stuff" that the
principal mentioned referred to the intermittent cases where teachers were unable to work together
during the collaborative part of the DDDM process. He engaged them in conversation but in a
very “noble” manner:
I always dignify it with this, because it almost is like this. It . . . comes from a good place.
It’s “You care about kids, you care about what you do, but let’s work on finding some
common ground with . . . your colleagues to where we don’t feel like we’re being hurt by
one another or hurting one another and not personalizing things as much.” So those con-
versations happen a lot.
At times the conversations in which he engaged addressed a crisis in the belief system—for
example, “she said she didn’t have time and the kids didn’t have the vocabulary skills to do it.
So I brought her in and we talked about it.” He also provided examples of conversations that
were detailed and specific:
So once I identify the kids and intervene . . . services that will help kids, and that’s every
kid—that’s a kid whose English CST is 450, and that’s a student whose English CST is
260 or lower—let’s go ahead and meet the needs of all of those kids.
ZHS. The principal of ZHS, like the principal of EHS, took a more detailed approach
to conversing with teachers, specifically regarding the disaggregated student
achievement data that he and his team generated. However, he selected individuals for strategic
conversations based on who appeared to be implementing DDDM processes unsuccessfully,
as measured by gaps in student achievement data:
Then, of course the . . . achievement rate inside the classroom. So all summer long that’s
all we did was break it down with them, give them the data—this is where we’re at, we
grew here or we dropped here, let’s take a look at this, what we’re doing wrong, do we
need to move people around a little bit. Okay, then bring them in, and so we redefine our
curriculum to make sure we grow in those areas.
His approach, however, took on a positive, supportive tone, as indicated when he mentioned that
teachers would have face-to-face time with him to discuss the gaps in practice: “[Teachers are]
going to meet with me, but it’s not going to be like a meeting where you’re in trouble. It’s going
to be like, ‘What can I do to help you?’” He provided other examples of how he met with
teachers who failed to meet the appropriate or desired growth targets:
And then . . . we start—that’s the first thing I look at. Just like what area, where are we
struggling across our curriculum where students aren’t getting the material? And then I
would have individual meetings with teachers who seem to have the most struggles in
those areas—large numbers of students not achieving.
Summary of key points. Although multiple barometers were used for selecting
individuals to engage in conversation, a common theme that emerged from the data in this study
was the principals' strategic use of conversations to promote school-wide
commitment to DDDM protocols. Some principals opted to use conversations to drive positive
cultures, to identify gaps in practice, to clarify expectations, and/or to identify interventions to
drive support. Nevertheless, they were all consistent in following through with the conversation
piece that was conditioned into their DDDM processes. (See Table 8 for summary of strategic
conversations.)
Table 8
Summary of Strategic Conversations, by School
School site Procedures regarding strategic conversations
Alpha High School Used personal conversation that took place in neutral teacher spaces to
communicate shortcomings in a very positive and professional manner.
Beta High School Used a middle person as an initial method of dialogue to inform teacher
groups of the need to take corrective action.
Gamma High School Principal was very forward and clear in communication with teachers so
that if the walkthrough process identified the need to speak to teachers,
she would meet to help them determine any obstacles. Also used the
process to identify future professional development topics.
Delta High School For low-level and non-urgent communication, principal used teacher
leaders to act as liaisons between administration and teachers. For higher
level or more urgent communication, person-to-person contact was used
and included tenure discussions when applicable.
Epsilon High School Built a culture of consistently engaging teachers in face-to-face commu-
nication whenever necessary to clarify expectations and build a plan for
improvement.
Zeta High School Conversations took place on a consistent basis and revolved around the
need to change instructional practices that did not yield desirable growth
in student achievement.
Theme Related to Research Question 4: Staying the Course
Research Question 4 asked, “What are the mechanisms that secondary principals use to
evaluate the effectiveness of protocols and systems put in place by DDDM?”
Although the last theme to emerge from the data collected from all six principals partici-
pating in the study did arise in the literature review, it occurred in a slightly different context.
The literature review was rich in discussions of continuous improvement as a means of using a
defined DDDM protocol to increase student achievement. The distinction between the latter
and the final emerging theme was that principals did not articulate a conscious effort to improve
their DDDM procedures; rather, there was a consistent message of staying the course with those
procedures once a co-constructed vision had been created with staff. The evolution of
DDDM practices, the principals all indicated, took time.
AHS. The principal of AHS focused on consistently pursuing “incremental growths” as a
means to stay the course with his DDDM procedures:
The motivation of the teachers to continually, you know, try to do some, you know— . . .
how do you get the next incremental growth, in a way, and keep them on that, in that
mind frame, you know, so that they don’t become stagnant or, you know, comfortable
with, “Okay, this worked great last year, and we’ve got it figured out—we’re done.” So
it’s just, you know, continually trying to motivate and keep people innovating and con-
stantly just trying to get a little bit more, a little bit better.
The principal maintained a slow and steady approach to improving student achievement
and was never deterred from using the DDDM processes that he had developed to drive what
he did, despite some personnel issues:
Once, you know, one of them is whether a new person comes into the system, whether it’s
an administrator or a teacher, adjusting . . . the chemistry, essentially, is definitely an
obstacle, because none . . . —all schools, you know, they’re . . . constantly changing. So
there’s always going to be new people that come in, and new dynamics, and new person-
alities that have either positive or negative influences on the existing groups.
BHS. The principal of BHS described dealing with the many "things" that
arise in the day-to-day operations of schools and how easily these issues can derail the work that
has to take place, yet he found a way to make the time and find the protocols that would allow
him to tend to DDDM practices:
So many things come up in the course [of a] week, and so it was staying the course and
not letting the distractions derail an early-release Wednesday that was set to focus . . . on
data, and basically finding out other ways to address the issues . . . that had arisen, you
know, over the course of the day’s proceeding.
It is important to note that the principal of BHS was very mindful of time and its role in build-
ing the structures that would ensure the success of DDDM practices. He discussed preliminary
steps evolving over years, which points to a ready-aim-fire rather than a ready-fire-aim approach:
It was at that point that—and this was over the course of a year, year and a half—adminis-
trators about 6 months—the department chairs were another, approximately about a year,
so it was about a year-and-a-half process before we really . . . rode it out. We did . . .
some pilot type of practices, but we didn’t have full implementation of it, probably about
. . . 2 years in. But after department chairs actually went through the process, then they
actually were able to then roll it out to their department. And as a part of the rollout, each
administrator was assigned to a department to ensure that it was implemented with
fidelity. And if there were questions that need to be answered that our department chairs
could not answer, we were there to support the process; and based on that structure, . . . it
then turned out to be quite effective.
GHS. Like the principal of BHS, the principal of GHS took years to build the initiatives
used to prepare the site for DDDM practices:
And then there was another professional learning that we brought to … I had been very
well received at [previous school site], example for example and across [Gamma Unified]
actually, and that was a training called Capturing Kids Hearts. And so that’s all about
building relational capacity with kids but also helping them to become self-managing in
their behavior and create high-functioning, self-managing classrooms. And so we created
a plan where every teacher would participate . . . in that training over the course of about
2½ years.
Although she selected only a few items on which to focus in terms of PD, her overall objective
was to choose items that would support the DDDM procedures set in place as a means to drive
improvements in student achievement:
And so we started looking at, okay, if we could only bring in a few types of professional
learning experiences so to make sure that everybody has a baseline of good practice, what
would they be? And so the curriculum, the assistant principal of curriculum, and I
worked on what that would look like. So each of them, making sure that we focused on
their strengths, were really able to—and then be consistent about that—were able to really
affect the change, I think, pretty quickly.
DHS. The principal of DHS relayed his timelines for developing and implementing
DDDM practices in terms of years as well:
I talked about the summer professional development. That’s been something that’s been
in place for 4 years now where this specific week in the summer for each subject area, and
they meet intensely that week. But in addition to that, each teacher is approved to work
collaboratively for as many hours as they want as long as they are working on things that
are priorities for our district.
At times it was necessary to refine processes that were previously agreed upon. The fol-
lowing comment provides a detailed example of such refinement when the principal discussed
how when collaborative meetings were set to discuss data, they were perhaps not yielding the
desired results. However, this problem did not deter the principal from staying the course:
There was just on a calendar—okay, we got the schedule now, now we have time to meet,
and sitting around with the chemistry teachers and looking at each other, and you got your
data that your administrator put in your mailbox but not really knowing, so now what?
You know, we looked at it, said, “Oh, it looks like you did pretty well on the test, and you
did well on the first half of the test on these standards, and you did really bad”—you
know, but no outcome, no change was coming out of it, and it was really only those
groups that kind of self-motivated. Or maybe a leader emerged in that group, without
prompting to kind of drive—okay, to know, “What, so now what are we going to do to
reteach this so now, what are we going to do next year to . . . do this better than getting
those leaders in place?”
He further discussed how the human element was taken into account in the DDDM process yet
did not deter him from staying the course:
I remember years ago when we first put together our collaborative teams and developed
the . . . course lead position and, you know, early on, I think a lot of schools went through
identifying central standards and then development of common assessment. I think
through those early processes, we began to understand more about the collaborative
dynamic that required us to invest more time and teaching collaboration and teaching
group leadership because they do function—those little data analysis teams do function
somewhat independently.
Staying the course was an area where the principal of DHS acknowledged that the district office had, to some degree, impacted his decision making with respect to DDDM practices:
Our district has developed a—what we call an instructional guide for the entire district
with the set of values on what is important, how do we operate within these collaborative
teams, what is the expectation of . . . data analysis and the common assessment. That’s
been developed—we’ve now in the third edition of that— . . . that’s what we would call
our plan, has been edited and we’ve had input from, not just from administrators but from
faculty groups as well.
The principal articulated that he identified himself as a steward of keeping DHS focused
on DDDM initiatives:
So, I think, that’s my job is to stay focused on areas that are important to continue to
value, the collaboration and data-centered focus, . . . and the words I say and the consis-
tency that I have in providing them data, presenting them data.
Among the data provided by the principals participating in this study, few statements were as clearly articulated as the following from the principal of DHS. The message epitomizes the theme of staying the course and its impact in dealing with challenges:
It’s the secret of our success—our academic success has been the history of these struc-
tures, and these structures that focus on data and sharing the practices based on the
common assessments and, you know, the more that we say that, the more it becomes
ingrained, and I think over time we won over the veteran teachers.
At the end of the day, the principal acknowledged that time was essential in seeing the DDDM
processes evolve: “It takes time where through the data and through also, you know, additional
supplemental professional development. We’re hoping to see instructional change that empha-
sizes the skills that we’ve demonstrated we need to improve.”
EHS. The principal of EHS echoed the same messages as did other principals involved
in this study. The language of years became common currency used to describe the amount of
time it took to develop the skill sets that were fundamental in the DDDM practice at EHS:
Now, . . . it’s not as—it’s not stated as clearly as that. It doesn’t happen in one day—it
happens over the course of a few years. But by having those kinds of conversations
where teachers participated in the English CST data and have the English CST data
applied to their content, then they started to kind of discover together what they needed to
do in the classroom . . . to meet the needs of that kind of a heterogeneous group of kids.
And the same is true in history.
The principal of EHS acknowledged that prior to venturing into DDDM processes, it was
clear to him that the journey on which he was embarking was a project that was ambitious and
would take time to evolve:
We also . . . refer to teacher-level data, in terms of grading and in terms of assessments
given, with the idea being that within course teams, there was an expectation that there
would be some kind of cohesiveness around grade distributions, around grading policies,
and around what kind of assessments and reassessments were offered during the grading
term. And that was actually a big project.
The principal also provided examples of how early wins gave him the leadership efficacy to continue pursuing refinement of the DDDM practices:
Over that period . . . of building effective strategies, seeing those effective classes taught
me that it was important to keep pursuing. And then as I would see teachers develop a
skill set and develop classrooms where they were reaching more kids, I would know that
it was working. It just takes a few years of the process I described to roll itself out, but I
need teachers.
Ingrained in his philosophy was the notion of repetition, as indicated in the following comment:
So . . . I’m a believer in repetition, and so, I didn’t just regurgitate the data to them, but
they were given the data, and then they were asked to talk about it and how it relates to
their classes and what they teach.
He was aware of the human element and of the need to nourish skill set development and to clarify expectations from EHS leadership:
I mean the pushback is, they’ll do it if they’re asked, but they’re not going to work
through it frequently enough to remember how to do it. So again, that’s coming back to
that couple-of-year theme, too, where the 1st year of requesting artifacts at meetings
would kind of force people to go into Data Director [a data management system].
ZHS. The principal of ZHS provided a slightly different perspective on staying the course. Despite his opinion that average principal tenure is short and limits a leader's window to act, the only difference between him and the other principals was that he stayed the course on a select number of DDDM practices without first rolling out preliminary processes over an extended period, instead rolling them out relatively more quickly than the other principals did:
I’ll say it like this as a principal—leadership’s a funny kind of thing. You know, if you
looked at the average expectancy of a principal on a campus, because a lot of people say,
“Well, it’s going to take me 4 years to get here”—well you really don’t have 4 years.
Although he said, “You don’t have the time to, or the luxury of time that you think when you take
a [principal job] on,” during his time as principal of ZHS, he consistently worked on the same
initiatives that included collaborative teachers working on analyzing assessment data to drive
instructional practices.
Summary of key points. All principals who participated in this study messaged one clear point: DDDM practices take time to frame, develop, introduce, and refine. The metric of years appeared often in describing the evolution of DDDM practices, and all principals were comfortable communicating in those terms. Each principal made it clear that, aside from minor site-specific variations, DDDM practices at their core rested on a basic notion: providing teachers the time to collaborate and discuss assessment results. This collaboration occurred alongside the expectation of change, monitored through walkthrough processes paired with conversations as needed, and this was all it took to improve student achievement. (See Table 9 for a summary of staying the course.)
Chapter Summary
This study began with a pursuit of the DDDM practices that secondary principals used to improve student achievement. What ultimately resulted was a uniform consensus around six key themes whose chronology was supported by the guiding research questions. The principals all
indicated that DDDM practices at their site began with building leadership capacity. They all
adopted a belief and practice that placed teacher leaders at the fulcrum that turned previously
stagnant and/or nonexistent collaboration into productive and diplomatic processes that shed
light on effective teacher practices.
Once leadership capacity was built, there was a consensus on a clear set of expectations
by all stakeholders. This process unfolded in diverse manners; however, all principals made
certain that all members of their school site understood their role in the DDDM process.
Table 9

Summary of Staying the Course, by School

Alpha High School: Restructured school focus by developing a DDDM plan that aligned curriculum, instruction, and assessment. Plan was implemented over time while keeping in mind relationships and rapport building.

Beta High School: DDDM plan focused on utilizing SMART goals as a primary focus on instruction and purposefully developed teacher leaders to support successful plan implementation.

Gamma High School: DDDM plan framework was provided by central office, and site autonomy allowed for flexibility on focus. Plan focused on training teachers to refine and acquire skills that ensured implementation of DDDM practices with fidelity.

Delta High School: DDDM plan framework was provided by central office, and site autonomy allowed for flexibility on focus. Plan focused on teacher collaboration centered on identifying best instructional practices that would correct gaps in student learning.

Epsilon High School: DDDM plan focused on best instructional practices identified by the principal during walkthrough procedures and subsequently shared during teacher collaboration.

Zeta High School: DDDM plan was centered on a continuous growth model. Principal identified gaps in achievement data that drove the annual professional development plan but focused on utilizing data as a belief-system change agent.

Note. DDDM = data-driven decision making. SMART = specific, measurable, achievable, realistic, and timely.
With clear expectations and a group of teacher leaders at the ready, each principal who
participated in this study made a commitment to product-driven PD time. Products created
during PD varied from the creation of a SMART goal sheet to an analysis of nonsummative
assessment data or the identification of effective teacher practices. All principals understood that
providing teachers with consistent PD time was essential to initially cultivate DDDM processes and ultimately to allow teachers to enter a cycle of continuous improvement.
Once PD time was provided and teachers were given ample time to discuss best practices,
all principals indicated that the walkthrough process was essential in providing teachers with the
checks and balances on the DDDM process that allowed them to see how they could further
promote the success of previously agreed-upon instructional practices.
Upon completing the collection of qualitative data through the walkthrough process, all principals shared the need to engage in selective conversations with members who opted out of previously agreed-upon adoptions, as a means to keep themselves and their teachers accountable to the collective good.
All principals later mentioned that, from their perspective, the themes emerging from the DDDM practices at their high schools would to some degree be rendered moot without staying the course and refining them over time. Investing years into determining the effectiveness of practices was a common theme that arose from the gathering of data on DDDM.
CHAPTER FIVE: SUMMARY, IMPLICATIONS, RECOMMENDATIONS, AND
CONCLUSION
The NCLB Act of 2001 mandated that 100% of students be proficient on standardized exams by 2014. In 2013, the last year that the CST was administered, California secondary schools (Grades 9–12) had proficiency rates of 54.1% in ELA and 25.0% in mathematics. For secondary students classified as low socioeconomic and for ELs, the results were far lower. Students classified as low socioeconomic had a proficiency rate of 41.8% in ELA and 19.5% in math. More troubling were the results for ELs, who had a proficiency rate of 7.1% in ELA and 8.2% in math. By that metric, secondary schools in California fell short, very short, of meeting NCLB targets for all students, and even more so for low-socioeconomic and EL students (see Table 10).
Elmore (2002) succinctly described a possible explanation for the crisis captured in Table 10 by describing the gaps in knowledge for school personnel. The author wrote that people who work in schools are being asked "to engage in systematic, continuous improvement in the quality of the educational experiences of students" (p. 3). The problem, he explained, is that "most people who currently work in public schools weren't hired to do this work, nor have they been adequately prepared to do it either by their professional education or their prior experience in schools" (p. 3). The fact that educational institutions are also burdened with noninstructional social factors makes the achievement gaps identified above that much harder to close.
One of the many byproducts of the sunsetting NCLB era is that it mandated disaggre-
gation of school data by diverse groups and brought a heightened sense of urgency to meet the
growth targets for each of those groups. In order to meet these growth targets, some principals chose to focus on the bubble kids, students who were close to moving to a higher or lower proficiency level (Cawelti, 2006; Dee & Jacob, 2011; Hamilton et al., 2009; Lauen & Gaddis, 2012). Other principals have chosen to develop systemic DDDM approaches to improving achievement by focusing on student learning, which was the focus of this dissertation study.

Table 10

Proficiency Rates on 2013 California Standards Test for Secondary Students (Grades 9–11)

Category                        ELA proficiency rate (%)    Math proficiency rate (%)
No Child Left Behind mandate            100.0                       100.0
All students                             54.1                        25.0
Low socioeconomic                        41.8                        19.5
English learners                          7.1                         8.2

Note. ELA = English language arts. Data taken from DataQuest, by California Department of Education, 2013a, retrieved from http://data1.cde.ca.gov/dataquest/
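To make the shortfall explicit, the gap between each group's 2013 proficiency rate and the 100% NCLB target can be computed directly from Table 10. The short sketch below is purely illustrative; the rates are those reported in the table, and the code itself is not part of the study.

```python
# 2013 CST proficiency rates (%) from Table 10, compared with the 100% NCLB target.
NCLB_TARGET = 100.0
RATES = {
    "All students": {"ELA": 54.1, "Math": 25.0},
    "Low socioeconomic": {"ELA": 41.8, "Math": 19.5},
    "English learners": {"ELA": 7.1, "Math": 8.2},
}

for group, by_subject in RATES.items():
    for subject, rate in by_subject.items():
        gap = NCLB_TARGET - rate  # percentage points short of the mandate
        print(f"{group} / {subject}: {gap:.1f} percentage points below the NCLB target")
# e.g., "English learners / ELA: 92.9 percentage points below the NCLB target"
```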
Statement of the Problem Restated
As California public schools move toward a new high-stakes accountability era under the
newly adopted California CCSS, the decisions that principals make on curriculum, instruction,
and assessment practices are critical to ensure growth in student achievement for traditionally
poorer performing groups such as low-socioeconomic and ELs. Although there is ample research
on DDDM practices that school leaders use in a data-rich educational landscape to transform raw
data into actionable information that drives instructional pedagogy, there has been scarce re-
search on successful DDDM practices by secondary principals of school sites serving a target
student population, specifically one having at least 50% of students categorized as low socioeco-
nomic and 25% ELs. Therefore, the details of practice that school principals use to transform
voluminous data into actionable information at these target sites constitute an investigation that is necessary and desired.
Guiding Research Questions
As a means to assist in developing a better understanding of the DDDM practices that
secondary principals use to improve student achievement, the data transformation framework
presented by Ikemoto and Marsh (2007) and the data improvement process framework presented
by Boudett et al. (2010) were used to develop the four research questions for this dissertation
study. Ikemoto and Marsh (2007) proposed a six-stage framework comprising (a) data, (b) information, (c) knowledge, (d) decision making, (e) implementation, and (f) measurement. At the end of the cycle, feedback is provided that can lead one back to any of the first three stages: data, information, or knowledge. Boudett et al. developed an eight-stage framework: (a) organize for collaborative work, (b) build assessment literacy, (c) create data overview, (d) dig into student data, (e) examine instruction, (f) develop action plan, (g) plan to assess progress, and (h) act and assess. These two frameworks were the guiding lenses in the development of the following four research questions (a brief illustrative sketch of the Ikemoto and Marsh cycle follows the list of questions):
1. What systems are considered and subsequently put in place for a data-driven school
culture?
2. How do school site principals and their stakeholders transform data from raw form to
actionable data used to drive decision making in instructional practice?
3. What are the barriers that a secondary principal faces when implementing DDDM?
4. What are the mechanisms that secondary principals use to evaluate the effectiveness
of protocols and systems put in place by DDDM processes?
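The cyclical structure of the Ikemoto and Marsh (2007) framework can be summarized in a short sketch. The stage names below come from the framework itself; the function and type names are illustrative assumptions and are not part of the framework or of this study.

```python
from enum import Enum
from typing import Optional


class Stage(Enum):
    """The six stages of the Ikemoto and Marsh (2007) data transformation cycle."""
    DATA = "data"
    INFORMATION = "information"
    KNOWLEDGE = "knowledge"
    DECISION_MAKING = "decision making"
    IMPLEMENTATION = "implementation"
    MEASUREMENT = "measurement"


# Forward movement through the cycle, in the order the framework lists the stages.
NEXT_STAGE = {
    Stage.DATA: Stage.INFORMATION,
    Stage.INFORMATION: Stage.KNOWLEDGE,
    Stage.KNOWLEDGE: Stage.DECISION_MAKING,
    Stage.DECISION_MAKING: Stage.IMPLEMENTATION,
    Stage.IMPLEMENTATION: Stage.MEASUREMENT,
}

# After measurement, feedback can return the cycle to any of the first three stages.
FEEDBACK_TARGETS = {Stage.DATA, Stage.INFORMATION, Stage.KNOWLEDGE}


def advance(current: Stage, feedback_to: Optional[Stage] = None) -> Stage:
    """Return the next stage; at measurement, feedback chooses the re-entry point."""
    if current is Stage.MEASUREMENT:
        if feedback_to not in FEEDBACK_TARGETS:
            raise ValueError("Feedback must return to data, information, or knowledge.")
        return feedback_to
    return NEXT_STAGE[current]


# One full pass through the cycle, followed by feedback into the information stage.
stage = Stage.DATA
for _ in range(5):
    stage = advance(stage)
print(stage)                                          # Stage.MEASUREMENT
print(advance(stage, feedback_to=Stage.INFORMATION))  # Stage.INFORMATION
```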
Summary of Results and Findings by Research Question
Research Question 1
Research Question 1 asked, "What systems are considered and subsequently put in place for a data-driven school culture?" The findings emerging from this question were clear for some elements and more ambiguous for others. For the elements that principals considered, it was clear that all six principals were mindful of the leadership capacity of their teacher
leaders prior to embarking on their site’s DDDM initiative. Some sites chose to utilize the
traditional department chair position as the vehicle for teacher leaders, but most chose to imple-
ment a content teacher leader model in addition to leveraging existing department chairpersons.
Each embarked on the journey of building leadership capacity in an individual manner and over a varying period of time; however, all explicitly indicated that leadership capacity was something that they considered prior to beginning the journey.
A second item that was clearly considered by all six principals prior to embarking on
DDDM procedures was the need to clearly communicate expectations to stakeholders. Not only did all six principals take time to communicate, but they also developed the vision with stakeholder input and ensured that stakeholders understood the reasoning behind the procedures put in place. The central premise of all principals was the expectation that teachers would gauge how effectively they were meeting student needs and would follow intermittent procedures to continuously improve over time. In layman's terms: How do teachers determine what to do? What are teachers doing? How do teachers know that what they are doing is working? And what do teachers do when what was done has not worked? The ambiguity arose in the systems that principals subsequently put in
place for a data-driven culture. One opted to use the SMART goal process to identify measurable
objectives for each team of teachers to work toward improvement. Three chose to give teachers a
general framework of collaboration to identify effective teacher practices through the use of
common teacher assessments but provided professional autonomy for self-identification and selection of such practices. Finally, two opted to use teacher collaborative time
to identify items of focus from common assessments in addition to other metrics including
reading levels, CAHSEE preparation, and A-G rates, among others.
Research Question 2
Research Question 2 asked, “How do school site principals and their stakeholders trans-
form data from raw form to actionable data used to drive decision making in instructional prac-
tice?” The data collected from the interviews with all six principals unearthed a theme that
varied slightly from themes that emerged in the literature review of this dissertation study. Spe-
cifically, the literature on DDDM processes had ample references regarding the critical role that
PD plays in the success of a DDDM process. Less evident in the literature, however, was the
expectation of using or creating a product during PD. In other words, every participating principal expected PD time to be used to work with or create concrete items; none expected anything less. Less consistent were the types of items used or created during professional devel-
opment sessions. Some school sites expected teachers to bring the results of common formative
assessments provided by administration or downloaded from an assessment data management
system. Other school sites required teachers to create lesson plans or activities to correct student
learning gaps, as identified by assignments, assessments, or qualitative data.
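To illustrate the kind of transformation described above, from raw, item-level assessment results to an actionable agenda for a collaborative team, the following sketch aggregates a hypothetical item-level export by standard and flags standards that fall below a proficiency cutoff. The file layout, column names, and 70% threshold are assumptions made for illustration only; this is not the actual export format of Data Director or of any system the participating schools used.

```python
import csv
from collections import defaultdict

# Hypothetical export from a common formative assessment: one row per student
# per item, with the standard each item assesses and whether it was answered
# correctly (0 or 1). Column names and the threshold are illustrative assumptions.
RETEACH_THRESHOLD = 0.70


def standards_to_reteach(csv_path):
    """Return (standard, percent_correct) pairs falling below the reteach threshold."""
    correct = defaultdict(int)
    attempts = defaultdict(int)
    with open(csv_path, newline="") as handle:
        for row in csv.DictReader(handle):
            standard = row["standard"]
            attempts[standard] += 1
            correct[standard] += int(row["correct"])
    rates = {std: correct[std] / attempts[std] for std in attempts}
    flagged = [(std, rate) for std, rate in rates.items() if rate < RETEACH_THRESHOLD]
    return sorted(flagged, key=lambda pair: pair[1])  # weakest standards first


if __name__ == "__main__":
    # A collaborative team could open its data meeting with this list rather than
    # with an undigested score report (the file name here is hypothetical).
    for standard, rate in standards_to_reteach("algebra1_unit3_results.csv"):
        print(f"{standard}: {rate:.0%} correct -- candidate for reteaching")
```

Aggregation of this kind, performed by a principal or district support person before the meeting, is consistent with the finding discussed later in this chapter that teachers were rarely asked to decipher complex data forms themselves.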
Research Question 3
Research Question 3 asked, “What are the barriers that a secondary principal faces when
implementing DDDM?” In terms of the barriers that principals faced during implementation of
DDDM initiatives, the six principals provided a strong indication that they revolved around two
themes: establishing a meaningful walkthrough procedure and engaging in strategic conversation
with stakeholders who failed to implement site DDDM practices to fidelity. All principals
referred to rebranding the walkthrough process as a very personal undertaking because
there was a perception that administration was observing through a critical lens instead of
assisting in improving the instructional practice of teachers. All principals except one admitted
that the process took time to unfold. There was evidence of the walkthrough process serving as a
check and balance of the DDDM. The walkthrough process was eventually embraced as a means
to ensure that teachers were following through with best practices and corrective actions on
instructional techniques. A second challenge that emerged from the data collected was that all principals acknowledged the need to engage reticent teachers, those who hesitated to fully implement DDDM practices or to implement them with fidelity, in conversations whose central focus was the clarification of expectations.
Research Question 4
Research Question 4 asked, “What are the mechanisms that secondary principals use to
evaluate the effectiveness of protocols and systems put in place by DDDM processes?” The
data collected from all six principals yielded the weakest findings of the study. No principal described a specific
mechanism for evaluating the effectiveness of protocols. One principal went as far as articulat-
ing the uncertainty of effectiveness, given that so many strategies are used to improve student
achievement. This principal was honest in communicating that he really felt that there is no true
manner to determine the effectiveness of each strategy individually, as such strategies do not
occur in a vacuum. What did become evident from all six principals was that they felt that
staying the course was a critical factor in developing DDDM procedures; all felt at their core that
focusing on those DDDM practices would lead to improved student learning and subsequently
student achievement.
Ancillary Findings
The four research questions that guided this dissertation study provided insight into the
DDDM that secondary principals used to improve student achievement. Although six strong
themes emerged from the structured interview process with six secondary principals, there were
ancillary findings identified through probing questions that added additional insight that was
essential to capture an even deeper understanding of the perspectives of the six principals who
participated in this dissertation study.
One ancillary finding from the six principals involved in this dissertation study, supported by a subset of the literature on DDDM, was the impact that district leadership has on the success of DDDM initiatives (Anderson et al., 2010; Armstrong &
Anthes, 2001; Datnow et al., 2007; Duffy et al., 2012; Levin & Datnow, 2012; Marsh et al.,
2006; Psencik & Baldwin, 2012). Two of the six principals provided perspectives that showed
strong evidence of district office leadership assisting in the development of processes and expec-
tations that supported the implementation of DDDM protocols at their sites. One of the six
principals provided a perspective that showed moderate evidence of district office leadership
assisting with PD on instructional practices that supported the site’s DDDM initiative. Finally,
one principal provided a perspective that showed moderate evidence of the district assisting with
building data analysis proficiency that supported the site's DDDM initiative. The remaining two
principals who participated shared perspectives that showed strong evidence of their district
allowing full site autonomy in their sites’ DDDM initiative.
In their book Reframing Organizations, authors Bolman and Deal (2008) described four
leadership frames through which leaders operate: structural, human resource, political, and
symbolic. It became very apparent that all six principals who participated in this study showed a
strong bias toward the human resource frame. All understood well that they needed their teach-
ers to get the work done and therefore invested a significant amount of time in developing and
cultivating relationships with them. Even when describing the conversations that took place,
which undoubtedly contained corrective action undertones, the process occurred with finesse.
Their leadership styles showed strong evidence of a servant approach to leading, where they saw
themselves as advocates for teachers, supporting them as a priority and using teacher empow-
erment as a means to help move their organizations forward.
Summary of Findings in Relationship to Current Literature
The research on DDDM practices provided insight into how a community of stakeholders
can come together to adopt a culture of inquiry to continuously improve student achievement by
collectively adjusting their instructional practices, guided by the emerging patterns arising from
the analysis of relevant data. Central to the DDDM process is a synthesis of systematic data col-
lection and analysis along with correcting deficit learning through instructional practices identi-
fied by collaborative discussions. The findings of this dissertation study directly supported the literature; however, they were not inclusive of all elements found there. Themes from the literature that were, unexpectedly, completely absent or only barely present in this study's findings were building data analysis proficiency for teachers, timely access to data, and data validity.
Five of the six principals who participated in this study made little to no mention of the
importance of developing data analysis proficiency for their teachers. There was no evidence of
data having to be deciphered from complex forms to more pragmatic practitioner-friendly forms
as the literature on DDDM indicated. This disconnect between a theme present in the literature and its absence from the findings of this study may be explained by principals and/or district support personnel possessing that skill and performing the task for teachers before teachers came together to discuss data.
Completely absent in the findings of this study was the notion of timely access to data. In
the literature there was evidence of stakeholders pointing to timely access to data as an obstacle
in the DDDM process. Ikemoto and Marsh (2007), for instance, cited two studies where timely
access to data "greatly influenced individual use" (p. 120) by teachers. Timely access to data was
also cited as an obstacle to DDDM by other authors in the literature (e.g., Kerr et al., 2006; Luo,
2008; Marsh et al., 2006).
The most surprising missing theme from the findings of this study but present in the liter-
ature was the skepticism that stakeholders have toward the validity of data presented during the
DDDM process (Anderson et al., 2010; Luo, 2008). None of the principals made mention of
teachers questioning the validity of assessment data (Cromey, 2000; Kerr et al., 2006) in measur-
ing student learning or the accuracy of assessments capturing the effectiveness of their instruc-
tional approach (Lachat & Smith, 2005; Luo, 2008).
Implications for Practice
The results of this dissertation study provided insight into the time-proven, successful
DDDM practices used by six principals who served a population of students who had historically
been low achieving. Their hard work and the hard work of their teaching staff led to improve-
ments in student achievement that merit attention and can have implications in practice.
The current educational landscape is one of changing federal and state accountability, modernizing standards, and budget volatility. These factors have had a tremendous impact on principals' ability not only to manage but also to lead their school sites. What became evident from the data collected in this dissertation study was that, despite these factors, none of which emerged as impactful to DDDM initiatives, all principals who participated were able to devise a plan lacking in complexity and to dedicate a substantial amount of time to seeing that the plan was carried out. In short, less is more. The current educational landscape, shaped by the changing CCSS, demands that principals focus on four C's: communication, collaboration, critical thinking, and creativity. This dissertation study pointed to one additional C: consistency.
Even a simple plan cannot be carried out consistently if the principal is in the position for an insufficient amount of time. One of the selection
criteria for participating principals in this dissertation study was to have been in their positions
for at least 3 years; however, 100% of them had been in their positions at least 5 years. One of
the principals made it very clear that his leadership lens was conscious of the fact that principal
tenure is at an all-time low—an opinion supported by research (Waters, Marzano, & McNulty,
2003). Another shared that although he was in the position for longer than the average tenure,
there was a high turnover rate for his assistant principals, specifically 19 of them over 5 years.
This study pointed to the importance of lengthening average principal tenure to allow principals
the essential time it takes to develop, implement, and refine DDDM practices as a means to
improve student achievement for historically low-achieving student populations.
Recommendations for Future Research
Through the analysis of the data collected to identify the DDDM practices that secondary
principals used to improve student achievement, questions emerged that warrant further examina-
tion:
1. To what degree do teacher associations impact a principal’s ability to move DDDM
initiatives forward?
2. What impact does a principal’s average tenure within a district have on the principal’s
action plan for instructional initiatives such as DDDM processes?
3. How do budgetary restrictions impact a principal’s ability to support the success of
DDDM protocols?
4. What role does access to additional personnel such as teacher course leads, resource
teachers, and counselors play in helping to support the success of DDDM processes?
5. To what degree does autonomy provided by district leadership impact a principal’s
ability to identify and implement initiatives, including DDDM?
Conclusion
This study began by taking a historical look at data. It began with a brief review of A Nation at Risk (National Commission on Excellence in Education, 1983), which intentionally hit a chord by tying educational assessment data to the economic strength of the country. Since the publication of A Nation at Risk, the data presented in the report have led educational and political architects to engineer several rounds of extraordinarily high accountability standards that at their core have student achievement in mind. Most recently, with the adoption of the CCSS, an initiative birthed by the National Governors Association and the Business Roundtable, it seems that the
same objective is at hand: to preserve the economic strength of the country by providing Ameri-
can industries with graduates who have 21st-century skills.
Armstrong and Anthes (2001) cited a superintendent as saying, “In God we trust; all
others bring data!” (p. 40). The latter quotation captures, at its core, what drove the DDDM
initiatives for all principals who participated in this study. When one takes a bird's-eye view of the current educational environment, one has to wonder what else drives decision making. In the digital age, where data are not only immediately available but also can easily be interpreted en masse, education leaders must leverage this access to drive how they operate their
schools.
This study was rooted in the curious belief that an investigation into successful DDDM
practices would uncover innovation concentrated on collaborative and PD structures, numerical
disaggregation techniques, data presentation formats, or data analysis methods. From the find-
ings of this study, it is apparent that only a few of those elements emerged. As Mandinach
(2012) pointed out, “DDDM is not just about the numbers or the data” (p. 73). It is instead about
the human resource frame of leadership that Bolman and Deal (2008) discussed. As bells ring
and hallways clear, principals who participated in this study felt at their core that they were in the
business of people and that they operated under the mantra of never assuming that their teachers
needed them; they understood that they needed their teachers. Without teachers working together
in true dialogue, discussing instructional practices directly correlated to achievement data and
doing so in a cycle of continuous improvement using authentic inquiry, improving student
achievement for low-achieving students seems unlikely. Some feel that the digital revolution
will one day lead to the elimination of teachers. Until that day comes, after all budgets are cut, legislation is changed, and assessments are taken, public schools will have students, teachers, and
a principal leading them and hopefully utilizing DDDM practices.
References
Anderson, S., Leithwood, K., & Strauss, T. (2010). Leading data use in schools: Organizational
conditions and practices at the school and district levels. Leadership and Policy in Schools,
9, 292–327. doi:10.1080/15700761003731492
Armstrong, J., & Anthes, K. (2001). How data can help: Putting information to work to raise
student achievement. American School Board Journal, n.v., 38–41.
Bernhardt, V. L. (2009). Data use: Data-driven decision making takes a big picture view of the
needs of teachers and students. Journal of Staff Development, 30(1), 24–27.
Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80(2), 139–148.
Black, P., & Wiliam, D. (2004, June). The formative purpose: Assessment must first promote
learning. Yearbook of the National Society for the Study of Education, 103(2), 20–50.
Bolman, L. G., & Deal, T. E. (2008). Reframing organizations: Artistry, choice, and leadership.
San Francisco, CA: Jossey-Bass.
Boudett, K. P., City, E. A., & Murnane, R. J. (2010). Data wise: A step-by-step guide to using
assessment results to improve teaching and learning. Cambridge, MA: Harvard Education
Press.
Brown, K. M., & Anfara, Jr., V. A. (2003). Paving the way for change: Visionary leadership in
action at the middle level. National Association of Secondary Principals Bulletin, 87(635),
16–34. doi:10.1177/019263650308763503
Brown, R. S., Wohlstetter, P., & Liu, S. (2008). Developing an indicator system for schools of
choice: A balanced approach. Journal of School Choice, 2, 391–414. doi:10.1080/
15582150802618659
California Department of Education. (n.d.). California Education Code § 52050 Public Schools
Accountability Act of 1999. Retrieved from http://www.leginfo.ca.gov/cgi-bin/
displaycode?section=edc&group=52001-53000&file=52050-52050.5
California Department of Education. (2013a). DataQuest. Retrieved from http://data1.cde.ca
.gov/dataquest/
California Department of Education. (2013b). 2013 Academic Performance Index report: Infor-
mation guide. Retrieved from http://www.cde.ca.gov/ta/ac/ap/documents/infoguide13.pdf
California Department of Education. (2013c). 2013 Adequate Yearly Progress report: Informa-
tion guide. Retrieved from http://www.cde.ca.gov/ta/ac/ay/
California Education Code § 52050 Public Schools Accountability Act of 1999. Retrieved from
http://www.leginfo.ca.gov/cgi-bin/displaycode?section=edc&group=52001-53000&file=
52050-52050.5
Cawelti, G. (2006). The effects of No Child Left Behind. Educational Leadership, 11(1), 64–68.
Childress, S., Elmore, R., & Grossman, A. (2006). How to manage urban school districts. Har-
vard Business Review. Retrieved from https://hbr.org/2006/11/how-to-manage-urban-
school-districts/ar/1
Coburn, C. E., Toure, J., & Yamashita, M. (2009). Evidence, interpretation, and persuasion:
Instructional decision making at the central office. Teachers College Record, 111, 1115–
1161.
Collins, J. (2001). Good to great: Why some companies make the leap . . . and others don’t. New
York, NY: HarperCollins Books.
Cosner, S. (2011). Teacher learning, instructional considerations, and principal communication:
Lessons from a longitudinal study of collaborative data use by teachers. Educational Man-
agement Administration & Leadership, 39, 568–589.
Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods ap-
proaches (3rd ed.). Thousand Oaks, CA: Sage.
Cromey, A. (2000). Using student assessment data: What can we learn from schools? (Policy
Issues #6). Oak Brook, IL: North Central Regional Educational Lab.
Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-performing
school systems use data to improve instruction for elementary students. Retrieved from
http://www.newschools.org/files/AchievingWithData.pdf
Dee, T. S., & Jacob, B. (2011). The impact of No Child Left Behind on student achievement.
Journal of Policy Analysis and Management, 30, 418–446.
Down, A. C., & Tong, V. P. (2007). Accountability, assessment, and the scholarship of “best
practice.” In J. C. Smart (Ed.), Handbook of higher education (Vol. 22, pp. 57–119). New
York, NY: Springer.
Duffy, H., Hannan, S., O’Day, J., & Brown, J. (2012). Building district capacity for data-
informed leadership (Special Series on the Fresno–Long Beach Learning Partnership).
Retrieved from http://www.cacollaborative.org/sites/default/files/CA_Collaborative_
Fresno_LB_Brief4.pdf
Dunn, K. E., Airola, D. T., Lo, W., & Garrison, M. (2013). What teachers think about what they
can do with data: Development and validation of the data driven decision making efficacy
and anxiety inventory. Contemporary Educational Psychology, 38(1), 87–98.
Earl, L., & Fullan, M. (2003). Using data in leadership for learning. Cambridge Journal of
Education, 33(3), 383–394. doi:10.1080/0305764032000122023
Earl, L., & Katz, S. (2002). Leading schools in a data-rich world. In K. Leithwood & P. Hallinger (Eds.), Second international handbook of educational leadership and administration (pp.
1003–1022). Dordrecht, The Netherlands: Kluwer.
Elmore, R. F. (2002). Bridging the gap between standards and achievement: The imperative for
professional development in education. Washington, DC: Albert Shanker Institute.
Feldman, J., & Tung, R. (2001). How schools use the data-based inquiry and decision making
process. Washington, DC: American Educational Research Association.
Fullan, M. (1985). Change processes and strategies at the local level. The Elementary School
Journal, 85, 390–421.
Fullan, M. (2000). The three stories of education reform. Phi Delta Kappan, 81(8), 581–584.
Fullan, M. (2001). Leading in a culture of change. San Francisco, CA: Jossey-Bass.
Fullan, M. (2002). The change leader. Educational Leadership, 59(8), 16–21. Retrieved from
http://www.ascd.org/publications/educational-leadership/may02/vol59/num08/The-Change-
Leader.aspx
Gischlar, K. L., Hojnoski, R. L., & Missal, K. N. (2009). Improving child outcomes with data-
based decision making: Interpreting and using data. Young Exceptional Children, 13(1),
2–18. doi:10.1177/1096250609346249
Golden, A., Madsen, K., Pfeiffer-Hoyt, R., Butler-Wall, B., Collins, V., Johnson, C., . . . Field-
ing, L. (2008). Data dashboards for school directors: Using data for accountability and
student achievement. Bellevue, WA: Center for Educational Effectiveness.
Golden, M. (2005). Making strides with educational data. T.H.E. Journal, 32(12), 38–40.
Halverson, R., Grigg, J., Prichett, R., & Thomas, C. (2005, July). The new instructional leader-
ship: Creating data-driven instructional systems in schools. Paper presented at the annual
meeting of the National Council Professors of Educational Administration, Washington, DC.
Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J. (2009).
Using student achievement data to support instructional decision-making (NCEE
20094067). Retrieved from http://ies.ed.gov/ncee/wwc/pdf/practiceguides/
dddm_pg_092909.pdf
Heath, C., & Heath, D. (2010). Switch: How to change things when change is hard. New York,
NY: Broadway Books.
Henning, J. E. (2006). Teacher leaders at work: Analyzing standardized achievement data. Edu-
cation, 126, 729–737.
Herman, J., & Gribbons, B. (2001). Lessons learned in using data to support school inquiry and
continuous improvement: Final report to the Stuart Foundation (CSE Technical Report
535). Retrieved from http://www.cse.ucla.edu/products/Reports/TR535.pdf
Ikemoto, G. S., & Marsh, J. A. (2007). Cutting through the “data-driven” mantra: Different con-
ceptions of data driven decision making. Yearbook of the National Society for the Study of
Education, 106(1), 105–131.
Improving America's Schools Act of 1994. (1994). Retrieved from http://www2.ed.gov/legislation/ESEA/toc.html
Isaacs, M. L. (2003). Data-driven decision making: The engine of accountability. Professional
School Counseling, 6, 288–295.
Kerr, K. A., Marsh, J. A., Ikemoto, G. S., Darilek, H., & Barney, H. (2006). Strategies to promote
data use for instructional improvement: Action, outcomes, and lessons from three urban
districts. American Journal of Education, 112, 496–520.
Kotter, J. P. (1995). Leading change: Why transformation efforts fail. Harvard Business Review,
n.v., 59–67.
Kotter, J. P., & Schlesinger, L. A. (2008). Choosing strategies for change. Harvard Business
Review, n.v., 1–11.
Lachat, M. A., & Smith, S. (2005). Practices that support data use in urban high schools. Journal
of Education for Students Placed at Risk, 10, 333–349.
Lauen, D. L., & Gaddis, S. M. (2012). Shining a light or fumbling in the dark? The effects of
NCLB’s subgroup-specific accountability on student achievement. Educational Evaluation
and Policy Analysis, 34, 185–208.
Levin, J. A., & Datnow, A. (2012). The principal role in data driven decision making: Using case
study data to develop multi-mediator models of educational reform. School Effectiveness
and School Improvement, 23, 179–201.
Light, D., Wexler, D. H., & Heinze, J. (2005). Keeping teachers in the center: A framework of
data-driven decision-making. Retrieved from http://cct.edc.org/sites/cct.edc.org/files/
publications/LightWexlerHeinze2005.pdf
Luo, M. (2008). Structural equation modeling for high school principals’ data-driven decision
making: An analysis of information use environments. Educational Administration Quar-
terly, 44, 603–634. doi:10.1177/0013161X08321506
Mandinach, E. B. (2012). A perfect time for data use: Using data-driven decision making to
inform practice. Educational Psychologist, 47(2), 71–85. doi:10.1080/00461520.2012
.667064
Marsh, J. A., McCombs, J. S., & Martorell, F. (2010). How instructional coaches support data-
driven decision making: Policy implementation and effects in Florida middle schools.
Educational Policy, 24, 872–907. doi:10.1177/0895904809341467
Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision making
in education. Retrieved from http://www.rand.org/pubs/occasional_papers/OP170.html
Marzano, R. J., Marzano, J. S., & Pickering, D. J. (2003). Classroom management that works.
Denver, CO: Mid-continent Research for Education and Learning.
Marzano, R. J., Waters, T., & McNulty, B. A. (2005). School leadership that works: From
research to results. Denver, CO: Mid-continent Research for Education and Learning.
Mason, S. (2002, April). Turning data into knowledge: Lessons from six Milwaukee public
schools. Paper presented to the annual conference of the American Education Research
Association, New Orleans, LA.
Maxwell, J. A. (2013). Qualitative research design: An interactive approach (3rd ed.). Thou-
sand Oaks, CA: Sage.
Merriam, S. B. (2009). Qualitative research: A guide to design and implementation. San Fran-
cisco, CA: Jossey-Bass.
Mintrop, H., & Sunderman, G. L. (2009). Predictable failure of federal sanctions-driven account-
ability for school improvement and why we may retain it anyway. Educational Researcher,
38, 353–364. doi:10.3102/0013189X09339055
National Commission on Excellence in Education. (1983). A nation at risk: The imperative for
educational reform. Washington, DC: U.S. Department of Education.
No Child Left Behind (NCLB) Act of 2001, Pub. L. No. 107-110, § 115, Stat. 1425 (2002).
Park, V., Daly, A. J., & Guerra, A. W. (2012). Strategic framing: How leaders craft the meaning
of data use for equity and learning. Educational Policy, 27, 645–675. doi:10.1177/
0895904811429295
Park, V., & Datnow, A. (2009). Co-constructing distributed leadership: District and school
connections in data-driven decision-making. School Leadership & Management, 29, 477–
494. doi:10.1080/13632430903162541
Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks,
CA: Sage.
Psencik, K., & Baldwin, R. (2012). Link data to learning goals: Common district assessments
connect teaching effectiveness to student performance. Journal of Staff Development, 33(4),
30–33.
Rallis, S., & MacMullen, M. (2000). Inquiry-minded schools: Opening doors for accountability.
Phi Delta Kappan, 81, 766–773.
Rankin, L. D., & Ricchiuti, L. M. (2007, September). Data driven decision making. Connected
Newsletter, 14(1), 4–6.
Schlechty, P. C. (2001). Shaking up the schoolhouse: How to support and sustain educational
innovation. San Francisco, CA: Jossey-Bass.
Schmoker, M. (1999). Results: The key to continuous improvement. Alexandria, VA: ASCD.
Schmoker, M. (2002). Up and away. Journal of Staff Development, 23(2), 10–13.
Schmoker, M., & Wilson, R. B. (1995). Results: The key to renewal. Educational Leadership,
52(7), 62–64.
Thornton, B., & Perreault, G. (2002). Becoming a data-based leader: An introduction. Journal of the National Association of Secondary Principals, 86(630), 86–96.
Tzu, L. (n.d.). Who said give a man a fish you feed him for a day teach a man to fish you feed him
for life? Retrieved from http://www.answers.com/Q/Who_said_give_a_man_a_fish_you_
feed_him_for_a_day_teach_a_man_to_fish_you_feed_him_for_life
U.S. Department of Education. (1991). America 2000: An education strategy—sourcebook
(Report No. ED OS91-13). Retrieved from http://files.eric.ed.gov/fulltext/ED327985.pdf
U.S. Department of Education. (2008). Teachers' use of student data systems to improve instruc-
tion: 2005 to 2007. Retrieved from http://www2.ed.gov/rschstat/eval/tech/teachers-data-
use-2005-2007/teachers-data-use-2005-2007-intro.html
Waters, T., Marzano, R. J., & McNulty, B. (2003). Balanced leadership: What 30 years of
research tells us about the effects of leadership on student achievement (Working paper).
Retrieved from http://files.eric.ed.gov/fulltext/ED481972.pdf
Wayman, J. C., Cho, V., Jimerson, J. B., & Spikes, D. D. (2012). District-wide effects on data
use in the classroom. Education Policy Analysis Archives, 20(25), 1–27. Retrieved from
http://files.eric.ed.gov/fulltext/EJ982702.pdf
Williams, T., Perry, M., Oregón, I., Brazil, N., Hakuta, K., Haertel, E., . . . Levin, J. (2007).
Similar English learner students, different results: Why do some schools do better? A
follow-up analysis, based on a large-scale survey of California elementary schools serving
low-income and EL students. Mountain View, CA: EdSource.
Wohlstetter, P., Datnow, A., & Park, V. (2008). Creating a system for data-driven decision
making: Applying the principal-agent framework. School Effectiveness and School Improve-
ment, 19, 239–259.
Young, V. M. (2006). Teachers’ use of data: Loose coupling, agenda setting, and team norms.
American Journal of Education, 112, 521–548.
Appendix:
Interview Questions
Abstract
The No Child Left Behind (NCLB) era has brought about many changes in education, most notably cultivating a culture of accountability through a yearly metric measure. Meeting the metric growth targets set by NCLB had implications for practice, including meticulous analysis of standardized assessment results with the goal of improving them. A mechanism that leaders use to achieve improvements in standardized testing results is building a culture in which stakeholders use assessment data to drive instructional decision making, commonly known as data-driven decision making (DDDM). The purpose of this qualitative study was to learn from the DDDM processes in place at school sites that were able to use such practices to improve student achievement. The study focused on the actions that secondary principals undertook to transition their school sites from a data-rich and action-poor culture to establishing procedures that converted assessment data into actionable data that drove instructional decision-making processes. The study sample population was 6 secondary principals whose school sites met the following criteria: (a) implementing a DDDM process