DO WE REALLY UNDERSTAND WHAT WE ARE TALKING ABOUT?
A STUDY EXAMINING THE DATA LITERACY CAPACITIES AND
NEEDS OF SCHOOL LEADERS
by
Patricia Wu
A Dissertation Presented to the
FACULTY OF THE ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
May 2009
Copyright 2009 Patricia Wu
TABLE OF CONTENTS
LIST OF TABLES............................................................................................................. iv
LIST OF FIGURES ............................................................................................................ v
ABSTRACT....................................................................................................................... vi
CHAPTER ONE ................................................................................................................. 1
Overview of the Study .................................................................................................... 1
Data Literacy in Educational Leadership.................................................................... 3
Research Questions..................................................................................................... 5
Significance of Study.................................................................................................. 6
Assumptions and Limitations of Study....................................................................... 6
CHAPTER TWO ................................................................................................................ 8
Review of Literature ....................................................................................................... 8
What is Data Literacy?................................................................................................ 9
School Leader Data Literacy .................................................................................... 12
Increasing Data Literacy........................................................................................... 16
Data Literacy Supports ......................................................................................... 16
Data Literacy Resources ....................................................................................... 20
Data Literacy and Data Driven Decision Making..................................................... 24
Summary................................................................................................................... 28
CHAPTER THREE .......................................................................................................... 30
Methodology................................................................................................................. 30
Approach................................................................................................................... 31
Sample and Population ............................................................................................. 31
Data Collection ......................................................................................................... 35
Study Limitations...................................................................................................... 37
Ethical Considerations .............................................................................................. 38
Summary................................................................................................................... 39
CHAPTER FOUR............................................................................................................. 40
Analysis of Data............................................................................................................ 40
Self-Perception of Data Literacy .............................................................................. 41
Purpose.................................................................................................................. 41
Quality................................................................................................................... 43
Statistics ................................................................................................................ 44
Interpretation......................................................................................................... 46
Communication..................................................................................................... 49
Summary............................................................................................................... 52
Supports and Structures of Data Literacy ................................................................. 53
Data Literacy Supports ......................................................................................... 53
Data Literacy Structures ....................................................................................... 56
Summary............................................................................................................... 59
Data Literacy’s Influence on Data Driven Decision Making ................................... 60
Data Use at School Sites....................................................................................... 60
High Implementation Schools............................................................................... 65
Delegating Data Driven Decision Making............................................................ 68
Summary............................................................................................................... 70
Increasing Data Literacy........................................................................................... 70
Time ...................................................................................................................... 71
Collaboration......................................................................................................... 72
Technology ........................................................................................................... 73
Communication..................................................................................................... 74
Summary............................................................................................................... 77
Conclusion ................................................................................................................ 77
CHAPTER FIVE .............................................................................................................. 80
Summary and Implications ........................................................................................... 80
Connections to Prior Research.................................................................................. 84
Implications for Policy and Practice......................................................................... 90
Recommendations for Future Research.................................................................... 92
Conclusion ................................................................................................................ 94
REFERENCES ................................................................................................................. 96
APPENDIX A................................................................................................................. 101
APPENDIX B ................................................................................................................. 107
LIST OF TABLES
Table 1: Overview of participants..................................................................................... 33
Table 2: Educational experiences of participants ............................................................. 34
LIST OF FIGURES
Figure 1 Data Driven Decision Making Conceptual Framework ..................................... 25
Figure 2 Data Driven Decision Making Conceptual Framework ..................................... 27
ABSTRACT
The implementation of No Child Left Behind has created a push to use data
driven decision making at all school sites. Often this move forward is based upon an
assumption that a foundation of data literacy has been built. This study questions this
assumption and aims to examine the data literacy capacities and needs of school leaders
during this era of accountability.
First, school leaders’ perception of their own data literacy was explored. This
was followed by an examination of what supports and resources are in place to increase
data literacy, as well as what supports and resources needed to be added. Finally, the
effect of data literacy on data driven decision making was analyzed.
This in depth study of school leader data literacy was accomplished through
qualitative research methods. Twelve administrators were interviewed to explore their
comfort with data and to determine how to best support their data literacy. Documents
supporting their work with data were gathered and analyzed. Together, interview transcripts and documents were coded and analyzed for themes and patterns.
The participants in this study demonstrated their data literacy knowledge and
skills. All participants used data at their school sites. However, the level of data driven
decision making varied between participants. Participants also discussed their challenges
in engaging different stakeholders in using data. This was the area of data literacy in which participants expressed the most need for further development. Participants wanted to
develop data literacy through collaboration with other colleagues and suggested the need
to dedicate time to focus on data literacy.
This study has evidenced the growth in data literacy but also shows the need to
further increase data literacy in all educators. Data driven decision making is not
conducted in isolation. To further implement data driven decision making successfully,
all educators must learn how to use data appropriately. Also, technology resources must
be available to support data driven decision making. This study adds to the growing body
of research on data driven decision making by studying how to increase data literacy and
build capacity for data driven decision making.
CHAPTER ONE
Overview of the Study
There was a time in education when decisions were based on the best
judgments of the people in authority. It was assumed that school and
district leaders, as professionals in the field, had both the responsibility
and the right to make decisions about students, schools and even about
education more broadly. They did so using a combination of intimate and
privileged knowledge of the context, political savvy, professional training
and logical analysis. Data played almost no part in decisions.
(Earl & Katz, 2002, p.3)
Today, educational decisions at all levels are expected to be made on the basis of
data. In January 2002, President George W. Bush signed into law the No Child Left
Behind Act of 2001 (NCLB). NCLB greatly changed the landscape of educational
decision making by increasing accountability and pushing for improved student
achievement (Datnow, Park, & Wohlstetter, 2007). The increase in accountability was
accompanied with a focus on data. “NCLB requires that educators know how to analyze,
interpret, and use data so that they can make informed decisions in all areas of education, ranging from professional development to student learning" (Datnow et al., 2007, p. 10).
The time when “data played almost no part in decisions” (Earl & Katz, 2002, p. 3) has
clearly ended.
NCLB began the age of accountability and it seemed that data was to play a role
in every good decision being made. Yet data alone do not provide answers or solutions
(Earl & Katz, 2002). "As Secada (2001) notes, data should be used to inform educators'
decisions, not to replace them” (Mandinach, Honey, & Light, 2006, p. 6). “Relying on
data alone, without the wisdom of experience and the caution of thoughtfulness, can lead
to disaster” (Heritage & Yeagley, 2005, p. 333). It seems that it is still “a time when
decisions [are] based on the best judgments of people in authority” (Earl & Katz, 2002, p.
3). The difference is that people in authority, educational leaders, are now being asked to
use data as a foundation for their best judgments and decisions.
Educational leaders are asked to synthesize, interpret, and analyze data when
making decisions. They are asked to engage in data driven decision making (DDDM).
“DDDM in education typically refers to teachers, principals, and administrators
systematically collecting and analyzing data to guide a range of decisions to help improve
the success of students and schools” (Ikemoto & Marsh, 2007, p. 108). This is a daunting
task. A small, but growing body of research suggests that DDDM has the potential to
increase student performance (Datnow, Park, & Wohlstetter, 2007; Alwin, 2002; Doyle,
2003; Johnson, 1999, 2000; Lafee, 2002; McIntire, 2002). However, individual schools
often lack the capacity to implement what research suggests: data driven decision making
(Datnow, Park, & Wohlstetter, 2007; Diamond & Spillane, 2004; Ingram, Louis, &
Schroeder, 2004; Mason, 2000; Petrides & Nodine, 2005; Wohlstetter, Van Kirk,
Robertson, & Mohrman, 1997).
Datnow, Park, and Wohlstetter (2007) state that in order to sustain a culture of continuous improvement through the use of data driven decision making, a continual investment in data management resources, including both human and social capital, is needed. The push to achieve data driven decision making at schools has often been characterized by investment in data management software, outside experts, and time to
analyze data. Often this push forward to guide instruction and make decisions based on
data makes the false assumption that teachers and administrators are data literate.
Ikemoto and Marsh (2007) reference Confrey and Makar (2005) and Streifer
(2002) to state that educators lack statistical knowledge for interpreting and analyzing
data. This could be a result of the lack of formal training for administrators and teachers
on assessment information and data (Heritage & Yeagley, 2005). The lack of education on
data literacy occurs during preservice and inservice training (Mandinach, Honey, & Light,
2006). Recently, workshops on using data have become available; however, teachers report that the sessions are not helpful (Marsh, Pane, & Hamilton, 2006). Therefore, it
should be no surprise that when given data, educators “often have no idea what the data
mean or how to use it” (Earl & Katz, 2002, p. 17). Perhaps one of the reasons that lead
educators to feel that they are “data rich, but information poor” (Mandinach, Honey, &
Light, 2006, p. 12) is a knowledge gap in data literacy.
Data Literacy in Educational Leadership
Earl and Katz (2002) define a data literate leader as being able to (1) Think about
purposes, (2) Recognize sound and unsound data, (3) Possess knowledge about statistical
and measurement concepts, (4) Make interpretation paramount, and (5) Pay attention to
reporting and to audiences. These five characteristics can be summarized into three steps
a data literate leader would take in using data driven decision making. First, a data
literate leader must appropriately identify which data are needed to make decisions and
determine the quality of that data. Next, a sound foundation of statistical and
measurement concepts must be applied to correctly interpret the data. Lastly, a data
literate leader must communicate information and decisions to audiences clearly and
accessibly.
Earl and Katz (2002) provide an argument that successful educational leaders
must be able to create, understand, analyze, and interpret data to make decisions and plan
improvement initiatives. However, educational leaders are not required to receive
training in statistics, research design, data management, interpretation, or reporting. The
result is a major concern that most educators are not trained in creating,
understanding, analyzing, and interpreting data; and as a result, educators lack data
literacy (Earl & Katz, 2002; Mandinach, Honey, & Light, 2006; Popham, 1999).
This concern becomes most prominent when Mandinach, Honey, and Light (2006)
note that educational leaders are instrumental to facilitating or impeding the use of data at
their schools. Research has consistently found that educational leaders who are
successful in utilizing data driven decision making are data literate and committed to
continuous data use (Marsh, Pane, & Hamilton, 2006; Detert, Kopel, Mauriel, & Jenni, 2000; Mason, 2002; Lachat & Smith, 2005; Mieles & Foley, 2005). "If data are to
become part of the fabric of school improvement, leaders in schools and districts must
become active players in the data-rich environment that surrounds them” (Earl & Katz,
2002, p. 12).
Ikemoto and Marsh (2007), as well as Marsh, Pane, and Hamilton (2006),
recommend providing professional development aimed at building educators' capacity to analyze data, find solutions, and conduct research. Heritage and Yeagley (2005) reference
Stiggins’ (2002) recommendation to not only provide professional development but
require data literacy as a licensing requirement for educators. Whether or not it is a
licensing requirement, NCLB has created an environment in which data literacy is a
necessary tool for school leaders. There is an urgency to encourage educational leaders
to become data literate leaders.
Research Questions
The goal of this qualitative study is to respond to the urgent need to increase data
literacy in educational leaders. This qualitative study will add to the growing body of
literature on data driven decision making by providing a better understanding of the data
literacy capacities and needs of school leaders. This qualitative study will focus on
school site administrators and seek to answer the following research question:
How do administrators increase their data literacy to build capacity for data driven
decision making?
In this qualitative study, the following sub-questions will also be addressed:
How data literate do administrators perceive themselves to be?
What supports and resources are in place to increase administrators' data literacy?
How do administrators use their data literacy to influence DDDM at their school
sites?
These sub-questions will create a picture of the current level of data literacy in school leaders and the supports and resources in place to increase data literacy, as well as explore the potential influence data literacy will have on data driven decision making.
Significance of Study
As previously stated, there is a lack of data literacy in educators. There is also
little to no formal education in data literacy during preservice or inservice training. This
gap in data literacy needs to be closed to allow educators to successfully engage in data
driven decision making. “Helping all schools and students achieve, regardless of ethnic
and socioeconomic background, requires that we identify and develop processes and
practices that support teachers’ deep and sustained examination of data in ways that are
aligned to local instructional goals” (Mandinach, Honey, & Light, 2006, p. 5). This study
aims to identify what supports and resources are needed to increase educators'
understanding of data. In turn, the increased knowledge base will build capacity to help
all schools and students achieve.
Assumptions and Limitations of Study
This qualitative study is based on the assumption that data literacy is a necessary
prerequisite to successful data driven decision making. While research supports the need to increase data literacy, the direct connection between an increase in
data literacy and successful data driven decision making has yet to be proven. Therefore,
it remains an assumption in this qualitative study.
It should be noted that successful data driven decision making must be conducted
on a broad foundation of resources (Datnow et al., 2007). This foundation includes
leadership, time, data management software, quality data, and other resources. This
qualitative study will focus on data literacy not because it is the most important piece, but
because it is often a piece of the foundation that is ignored. I have found that in the
desire to allow data to drive instruction, educators often focus their resources on training
educators to use data before ever diagnosing the educators' data literacy. For long term
successful data driven decision making, educators must pause in the race to use data and
first seek to understand the data.
CHAPTER TWO
Review of Literature
NCLB has brought about the age of accountability in schools. The hallmark of
accountability has become data driven decision making: the need to gather, analyze, and
use data to guide decisions aimed at improving schools (Ikemoto & Marsh, 2007).
However, schools continue to struggle as their leaders chart a new course in data driven
decision making. Often this is due to a lack of capacity to implement data driven decision making (Datnow, Park, & Wohlstetter, 2007). While it is not the only challenge, Earl
and Katz (2002) emphasize the lack of data literacy in school leaders as a contributing
factor to the delay of successful implementation of data driven decision making. Several
researchers note the importance of increasing school leaders' data literacy in building
capacity for data driven decision making (Mandinach, Honey, & Light, 2006; Ikemoto &
Marsh, 2007; Marsh, Pane, & Hamilton, 2006). This qualitative study aims to respond to
this need and examine school leaders’ capacities and needs in data literacy.
This chapter will serve to provide a review of the current literature on data
literacy and its role in developing capacity to conduct data driven decision making. First,
a definition of data literacy will be provided to determine the knowledge and skills
needed for data driven decision making. Next, literature will be reviewed to create a
picture of the current data literacy capabilities among administrators. Supports that are in
place to strengthen and maintain administrators' data literacy, as well as any resources that may reinforce an administrator's data literacy, are also reviewed. Lastly, the role that
data literacy has been shown to have in data driven decision making will be presented.
The pieces of literature brought together here will formulate the foundation upon which
to build this qualitative study of data literacy.
What is Data Literacy?
Within the name data literacy, it is clearly suggested that this skill requires the
ability to read and understand data. However, with the flood of complicated data that has
arrived at school sites, the ability to read and understand data has become more
complicated as well. Drawing on the work of Earl and Katz (2002), a comprehensive
definition of data literacy will be presented. First, a prerequisite to data literacy will be
described. This will be followed by a detailed discussion of the five characteristics of
data literacy. A brief presentation of the merits of data literacy will be shared in this
section, and will be elaborated on at the conclusion of the literature review.
Earl and Katz (2002) suggest that an inquiry habit of mind is the prerequisite to
data literacy. Both an inquiry habit of mind and data literacy increase an educator's capacity to use data driven decision making. Educators must see the process of data driven decision making as a nonlinear process that requires individuals to think in circles
and maintain an inquiry habit of mind. Inquiry seeks an advanced understanding while a
habit of mind suggests this process is used deliberately and not by chance. An inquiry
habit of mind is defined as “a way of thinking that is a dynamic iterative system with
feedback loops that organizes ideas towards clearer directions and decisions and draws on
or seeks out information as the participants move closer and closer to understanding some
phenomenon” (p.14). Earl and Katz emphasize that an inquiry habit of mind is difficult
to develop and requires perseverance in a continuous search for clarity and understanding.
An inquiry habit of mind lays a foundation to support the knowledge and skills of
data literacy. Earl and Katz (2002) describe five knowledge and skills a data literate
leader should have: (1) Think about purposes, (2) Recognize sound and unsound data, (3)
Possess knowledge about statistical and measurement concepts, (4) Make interpretation
paramount, and (5) Pay attention to reporting and to audiences. Each of these five
characteristics will be discussed in detail, as they are the characteristics that together are
termed data literacy and will be studied and researched in this dissertation.
Earl and Katz (2002) state that the first step in data driven decision making is to
define a purpose for looking at the data. To simply look at data without a goal in mind is
like a car spinning its wheels and not moving forward. This needs to be done in order for
educators to gather and collect the best sources of data for the purpose. Setting a purpose
is often done in the form of posing a question or problem. Lachat, Williams, and Smith
(2006) describe the use of essential questions as one of the three most important practices in
increasing data literacy. The ability to focus data examination on a purposeful question is
a key factor in increasing data literacy (Love, 2004; Holcomb, 2001). Lachat and
Smith’s (2004) case study even found the use of essential questions provided a lens to
look at data purposefully, as well as, increased confidence for educators who felt
challenged with data driven decision making.
Once a purpose has been determined, the next step is to recognize sound and
unsound data (Earl & Katz, 2002). Earl and Katz suggest that “not all data are created
equal and consumers must be able to discern the difference between legitimate data and
data that are suspect” (p. 19). To assume that all data are valid and accurate can easily
lead to wrong conclusions. Furthermore, it must be remembered that data often represent
an estimate at best, and all data are subject to human error. For successful data driven
decision making, the presence of high quality data is imperative (Marsh, Pane, and
Hamilton, 2006; Heritage & Yeagley, 2005).
With a purpose established and sound data to examine, statistical and
measurement concepts must be applied to seek a greater understanding of the data (Earl
& Katz, 2002). Despite what is often a dislike for statistics, it cannot be denied that
statistical knowledge is needed to properly analyze data. Mandinach, Honey, and Light
(2006) cite the experience of others (Confrey & Makar, 2002, 2004, 2005; Hammerman
& Rubin, 2002, 2003) to conclude that there is a need for educators to understand
concepts of variation, distribution, mean, sampling, and aggregation.
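To make these statistical concepts concrete, the brief sketch below applies them to a handful of hypothetical assessment scores; it is an illustrative example only, not drawn from the studies cited, and the scores and groupings are invented.

    import random
    import statistics

    # Hypothetical scale scores for one grade level (invented for illustration).
    scores = [612, 645, 598, 701, 688, 590, 655, 630, 677, 642]

    # Central tendency and variation: the mean alone hides how spread out scores are.
    mean_score = statistics.mean(scores)
    spread = statistics.stdev(scores)

    # Sampling: the mean of a small sample can differ noticeably from the group mean.
    sample_mean = statistics.mean(random.sample(scores, 4))

    # Aggregation: pooling subgroups can mask differences between them.
    subgroup_a, subgroup_b = scores[:5], scores[5:]

    print(f"Overall mean {mean_score:.1f}, standard deviation {spread:.1f}")
    print(f"Sample of four students: mean {sample_mean:.1f}")
    print(f"Subgroup means: {statistics.mean(subgroup_a):.1f} vs {statistics.mean(subgroup_b):.1f}")

Even this small example shows why an administrator who reads only an overall mean can miss both the spread of scores and differences between subgroups.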
Earl and Katz (2002) suggest that next educators need to make interpretations of
their data. Earl and Katz define interpretation as “formulating possibilities, developing
convincing arguments, locating logical flaws and establishing a feasible and defensible
notion of what the data represent" (p. 19). Datnow, Park, and Wohlstetter (2007) suggest
the use of structured protocols to facilitate discussions to make interpretations of the data.
These protocols asked individuals to first identify trends and patterns and next to make
interpretations and draw conclusions.
Lastly, the fifth characteristic of data literacy is the ability to report and
communicate conclusions to different audiences (Earl & Katz, 2002). This fifth
characteristic is of greater importance for administrators than teachers. Administrators
must have the ability to report and communicate data successfully to a wide variety of
audiences. Earl and Fullan (2003) describe the implications of making data public or
selectively releasing data. There is merit in providing data for all constituents. It
provides transparency and prevents misreporting; however, selectively releasing data
after analysis can provide clearer communication. Regardless of how much data is
presented, the need to communicate data clearly is imperative.
Once educators have developed an inquiry habit of mind and strengthened their
data literacy in all five characteristics, Earl and Katz (2002) remind educators that data
driven decision making is not an individual effort. For true data driven decision making
to succeed, educators have the challenge of developing a culture of inquiry. A culture of
inquiry is a community where data is valued and seen as an integral part of decision
making. While this study focuses on the development of data literacy in
administrators, it is done within the overarching goal of increasing capacity for data
driven decision making.
School Leader Data Literacy
In order to develop administrator data literacy as described above, it is necessary
to first seek a greater understanding of administrators' current data literacy. First,
research from Earl and Fullan (2003) will provide information regarding administrators’
self-perceived data literacy. Next, this will be supported by research on the general data
literacy of educators. Due to a lack of research focusing exclusively on the data
literacy of school leaders, literature on the data literacy of administrators as well as teachers
has been included.
Earl and Fullan (2003) bring together several research studies over the past
decade to draw conclusions for leaders in the era of accountability and data driven
decision making. The data being brought together includes many interviews and surveys from three different contexts: (1) The National Literacy and Numeracy Strategies in England, (2) The Manitoba School Improvement Program (MSIP) in Manitoba, Canada, and (3) Secondary School Reform in Ontario, Canada. These three contexts differ in location, nature of reform, and availability and use of data.
Common at all three sites were school leaders who worried about not being able
to gain understanding from data and not being able to clearly communicate their
understanding to constituents. Their anxieties about using data were candidly expressed,
even when school leaders had a positive orientation toward data driven decision making.
Insecurities about their own skills to gather and interpret data were commonly shared.
When we talk to principals and even district leaders, they tell us that the
data are sometimes impenetrable to them. Tables and graphs that are
supposed to be self-evident fail to provide them with the insights that they
feel ought to accrue. Some of them even admit that they do not really
know how to use numbers from large scale assessments and indicator
systems in the school planning. This is not an unusual phenomenon (Earl
and Fullan, 2003, p. 388).
Of course there were exceptions to this insecurity and anxiety. In the MSIP in
Manitoba, educators gathered and interpreted data they selected as important. This led to
more comfort and confidence in working with the data. In England, school leaders seemed
more sophisticated in their use of data. This sophistication came with the use of
curriculum targets rather than numerical targets. The numerical targets were a series of
targets set by the government; the curriculum targets described the actual learning of
particular children and were based on frameworks for literacy and math. This would
suggest that data literacy increased when data was tailored for use by each school site. Despite these notable exceptions, Earl and Fullan's research provides evidence that school
leader data literacy should be increased.
Ikemoto and Marsh (2007) describe the need to increase data literacy for all
school personnel. School personnel lack adequate skills in formulating questions,
choosing indicators, interpreting data, and developing conclusions (Ikemoto & Marsh,
2007; Choppin, 2002; Dembosky, Pane, Barney, & Christina, 2005; Feldman &
Tung, 2001; Mason, 2002; Fusarelli, 2008; Gordon, 2007). Ikemoto and Marsh (2007)
studied one district and analyzed survey results. They found that only 23% of teachers
perceived themselves as “feeling moderately or very prepared to interpret and use reports
of student test results” (p. 121). If data driven decision making is to occur successfully,
more than 23% of teachers need to feel confident in data literacy. These various research
reports evidence an urgent need for school leaders to increase their data literacy.
Increasing school leader data literacy will greatly increase capacity for data driven
decision making. The message that administration communicates to their staff can make
a big difference. “A principal who is data-driven or technically savvy can exert
substantial influence on the faculty, communicating the importance and thereby
stimulating use” of data and technology (Mandinach, Rivas, Light, Heinze, Light, 2006, p.
6). Numerous studies have shown that having a data literate administrator will positively
impact a school’s engagement in data driven decision making (Marsh, Pane, & Hamilton,
2006; Detert, Kopel, Mauriel, & Jenni, 2000; Mason, 2002; Lachat & Smith, 2005; Mieles & Foley, 2005; Ikemoto & Marsh, 2007; Feldman & Tung, 2001; Choppin, 2002; Fusarelli, 2008).
It is important to note that many studies on school leader data literacy are
conducted using the participants’ perception of their own data literacy and the
interpretation of the researcher. The studies are often based on the participants' self-efficacy. Therefore, it becomes necessary to connect a school leader's self-efficacy with
data literacy. Bandura (1977) discusses the effect a leader’s self-efficacy has on their
own behavior and the behavior of others. More recently, Leithwood and Jantzi’s (2008)
study of teachers and principals supports Bandura’s research and finds a weak but
significant effect of self-efficacy on behavior.
Research studies on the effect of a school leader’s self efficacy on student
achievement have shown a significant effect (Hallinger & Heck, 1996; Leithwood &
Jantzi, 2005, 2008). Often the effect of the leader’s self efficacy is an indirect effect on
student achievement through the school leader’s effect on the school and classroom
conditions. From these studies it can be suggested that a school leader's self-efficacy will impact the leader's data literacy and capacity for data driven decision making. The impact of data literacy on data driven decision making will be explored in the final
section of this chapter.
Increasing Data Literacy
The literature presented thus far has defined what data literacy entails and
evidenced the need to focus on increasing data literacy in school leaders. This study aims
increase understanding of school leaders’ capacities and needs in data literacy. In order
to do so, literature on existing supports and resources to increase school leaders’ data
literacy will be examined here. Existing supports and resources must first be discussed to
explore what further needs are unmet.
That literature has been divided into two sections, the support of data literacy and
the resources of data literacy. The support of data literacy includes any programs that
will directly increase a school leaders’ data literacy. This will include professional
development, pre-service and in-service classes. The resources of data literacy refer to
technology or personnel that will reinforce the data literacy a school leader has, but not
necessarily increase it. This will include the use of data teams, data warehouses, and
other technology. These supports and resources work together to increase data
literacy and build support for data driven decision making.
Data Literacy Supports
With the flood of data arriving at schools, there is a need to learn how to analyze
and interpret the data available. Despite this urgent need, there is little training or
professional development available to help school leaders increase their data literacy.
Training and professional development on data literacy are just beginning to emerge.
Research on the effectiveness of such programs is just becoming available (Marsh, Pane,
& Hamilton, 2006). The literature presented in this section will evidence what little
training and professional development is available. It will also highlight research that provides a better understanding of what professional development and training currently exists. Research on programs that foster data use has been included. It is assumed that
general professional development or training on data use would influence the
participant’s data literacy.
Professional development on understanding and using data appears to lead to
increased data literacy and therefore capacity for data driven decision making. Burgess
(2006) described one Ohio school district that created data classes for all teachers and
administrators to build capacity for data driven decision making. After administering a
Data Needs Assessment, Lakota School District developed three data classes: Data 101,
102, and 103. The first class (Data 101) contained basic statistics and data use; followed
by further training in disaggregated data (Data 102) and district data processes and
practices (Data 103). The trainings were given over two to three years. Lakota uses a
train-the-trainer model, where administrators and school leaders are trained with the
expectation that they will then train others at the school site. The results of this training
series have been an increased interest in data as educators see the value that data has.
Educators have also gained a much better understanding of what the data means and how
to use it. Lakota school district has also benefited from a common vocabulary and
improved communication due to these trainings. Even with all the positive effects of this
training, Lakota knows that there needs to be a continuous focus on improving and hopes
to advance their data use further.
Another research study that showed the positive effects of professional
development was conducted by Datnow, Park, and Wohlstetter (2007) at four different
school systems. The goal was to gain a deeper understanding of data driven decision
making. Specifically, their research sites targeted schools that had a high level of engagement in data driven decision making. They found that at all four research sites professional development was provided to principals and was made available to
anyone who asked. It was also the expectation that any information received at training
would be brought back and shared with the entire staff. Administration set the tone by
acting as instructional leaders and modeling effective data use. This research at high data
driven decision making sites would suggest the effectiveness of professional development
in building capacity for data driven decision making.
Despite the availability of training in these school systems, teachers in both studies described above still requested more training on how to read data more carefully and what to do with data (Burgess, 2006; Datnow et al., 2007). Principals also noted the need
to develop their skills and capacity to have quality conversations about data. Even being
well ahead of other schools and districts, these educators still felt inadequately trained for
data driven decision making.
Similar sentiments of inadequate training were found by Marsh, Pane, and
Hamilton (2006) when they conducted qualitative studies using surveys, observations,
documentary review, interviews and focus groups from four projects. Even though
teachers and administrators reported having access to training and workshops on data
literacy and data use, they generally did not find these sessions helpful. While training
on using data for instructional planning was rated as more useful, training on
understanding and interpreting data was rated as not useful.
Other literature found that teachers and administrators received no training. Earl
and Fullan’s (2003) research in Canada and England was highlighted earlier. They found
school leaders to be lacking in data literacy. These school leaders had received no
“training or experience in research, data collection, data management or data
interpretation” (p. 388). Several pieces of literature have also found that administrators
and teachers have not received formal training in data literacy (Mandinach, Honey, & Light, 2006; Popham, 1999; Stiggins, 1999, 2006; Crooks, 1989; Shepard, Hammerness,
Darling-Hammond, Rust, Snowden, Gordon, Gutierrez, & Pacheco, 2005). The lack of
training provided, coupled with the low effectiveness of available training, would provide
evidence to hypothesize a low level of data literacy in administrators.
Mandinach, Honey, and Light (2006) suggest two types of training are needed,
one on data literacy and one on the use of technology tools. Heritage and Yeagley (2005)
suggest that data literacy needs to be a primary consideration for pre-service and in-service training.
Ikemoto and Marsh (2007) suggest that policymakers need to provide professional
development to educators if they want to encourage the pursuit of data driven decision
making. They suggest that the professional development would need to cover data analysis skills and how to pose questions, collect data, and determine appropriate action. Earl and
Katz (2003) add training in statistics, research design, data management, and reporting to
the list of skills. Earl and Katz also suggest that not only does instruction need to be
provided but opportunities to practice, discuss, and reflect are needed. In sum, there
appears to be a general lack of data literacy professional development opportunities for
school leaders and teachers.
Data Literacy Resources
Literature will be presented here to show the positive effect that data literacy resources can have on school leaders' data literacy. These resources include
different technology or personnel additions to a school site. While the research will show
the value of putting these resources in place, it is important to note that these
reinforcements alone do not increase the data literacy of school leaders. They can only reinforce the data literacy that has already been developed. Simply investing in personnel or software will not make an impact if these resources are not integrated and
implemented appropriately. When implemented correctly, these resources can provide
the support that is needed to build confidence and implement data driven decision making.
Lachat, Williams, and Smith’s (2006) research on the three important practices of
developing data literacy will be used to guide this presentation of literature. This
qualitative study drew on five years of working with five urban high schools and a case
study for the Northeast and Islands Educational Laboratory. First, a discussion of
technology tools will be presented followed by a discussion of personnel resources.
Lachat, Williams, and Smith (2006) describe the importance of technology in
developing data literacy. Specifically, Lachat, Williams, and Smith suggest the
importance of technology in providing capacity for purposeful data disaggregation.
Purposeful data disaggregation refers to the capability of disaggregating data by multiple
indicators, such as race, gender, interventions, grades, etc. Once the data is disaggregated
it allows users to make connections between these indicators and develop conclusions.
The importance of data disaggregation in reinforcing data literacy and building capacity
for data driven decision making has been evidenced by many researchers (Lachat,
Williams, & Smith, 2006; Holcomb, 1999; Love, 2004).
Datnow, Park, and Wohlstetter (2007) found that having data disaggregated by
teacher and classroom helped to shift the culture to value data. Teachers also expressed a
desire to see longitudinal data that could track student progress over time. Johnson (2002)
also notes the importance in disaggregating data to diagnose and address the achievement
gap that is prevalent in many urban schools. The disaggregation of data allows trends
and patterns to emerge and questions to be formulated regarding student achievement. The
benefit of this technology in reinforcing data literacy is clear.
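As a concrete illustration of what purposeful disaggregation involves, the short sketch below groups a set of hypothetical student records by a demographic indicator and grade level; the field names and values are invented for illustration and are not taken from the cited studies.

    from collections import defaultdict
    from statistics import mean

    # Hypothetical records: (student_id, subgroup, grade_level, scale_score).
    records = [
        ("s01", "English learner", 7, 598),
        ("s02", "English learner", 7, 612),
        ("s03", "Fluent English", 7, 671),
        ("s04", "Fluent English", 7, 688),
        ("s05", "English learner", 8, 605),
        ("s06", "Fluent English", 8, 700),
    ]

    # Disaggregate: collect scores by (subgroup, grade level) instead of pooling them.
    groups = defaultdict(list)
    for _, subgroup, grade, score in records:
        groups[(subgroup, grade)].append(score)

    # Report each subgroup's size and mean; a single pooled mean would hide these gaps.
    for (subgroup, grade), group_scores in sorted(groups.items()):
        print(f"Grade {grade}, {subgroup}: n={len(group_scores)}, mean={mean(group_scores):.1f}")

A data warehouse or reporting tool performs the same kind of grouping on a much larger scale; the point of the sketch is simply that disaggregation means examining indicators side by side rather than relying on a single schoolwide average.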
Mandinach, Rivas, Light, Heinze, and Light (2006) researched different
technology tools in order to gain a deeper understanding of which technology tools were
the most valuable. They studied three different technology tools at six school sites across
two years. The three tools were (1) a test reporting system, (2) data warehouses, and (3) handheld web-based reporting. The GROW reports were useful, but the five month delay in results caused problems. The TUSDStats were accessible and encouraged teacher use; they also provided good tools for disaggregating data. The handhelds seemed to have the best response, and teachers needed minimal training to access the information.
Mandinach et al.’s (2006) research provide a comprehensive picture as to the
characteristics effective technological tools have. Overall, the results of two year study
22
highlighted the importance several features in technology tools. The tools needed to
disaggregate data well and provide data in a format that was easily to read and understand.
The data processing also needed the capability of linking several different data systems in
a timely manner. The staff also needed to be able to access the data easily with minimal
training. For data to be used effectively, the technology tools needed to present data in a
comprehensible and accessible manner (Datnow et al., 2007).
In addition to technology tools, Lachat, Williams, and Smith (2006) next suggest
the importance of a data team or a data coach in fostering data literacy. A data team is a
group of individuals who collaborate over data. Membership for the data team should
have a broad representation of the school and include school leaders, teachers, counselors,
and instructional coaches. Data teams are mandated in Connecticut to ensure data driven
decision making (Fusarelli, 2008). A superintendent found that the data teams
encouraged creativity and innovation. The team would work together to develop
essential questions, analyze and interpret data, and communicate and disseminate data.
Team work in the form of a data team or professional learning community both aid in
increasing data literacy (Gordon, 2007; Ikemoto & Marsh, 2007; Chen, Heritage, & Lee, 2005; Holcomb, 2001; Keeney, 1998; Lachat & Smith, 2005; Symonds, 2003). The
collaboration would provide support for the team members to increase their data literacy
together.
Lachat, Williams, and Smith also suggest using a data coach. The data coach is a
mentor who models various uses of data. The data coach was very useful in facilitating
data driven decision making and developing data literacy. Datnow, Park, and Wohlstetter
(2007) found dedicated individuals or teams responsible for data analysis and use at most
of the schools they researched. They also noted that the data coach, or data manager, was
an experienced educator, not a statistician. This allowed the data coach to develop the
relationships with teachers necessary for mentoring.
Outside organizations can also provide mentoring to teachers. Ikemoto and
Marsh (2007) suggest that external organizations, such as universities, consultants,
information technology companies, and state departments of education can assist schools
in facilitating data driven decision making. They found external organizations
particularly helpful in facilitating more complex forms of data driven decision making.
Datnow, Park, and Wohlstetter (2007) also found partnerships between high data driven
decision making schools and external organizations. These partnerships mainly focused
on providing the school with data systems.
Other tools to reinforce data literacy are set protocols, rubrics, worksheets, and
procedures (Ikemoto & Marsh, 2007; Datnow et al., 2007). These tools can help educators who are struggling to master their data literacy. They provide a structure for
groups to engage in data driven decision making. Johnson (2002) breaks down the path a
school must take to “transition from being data providers to being data users” (p. 35).
Throughout her book, assessments, guides, and protocols are provided to assist a school
to make this transition. Johnson emphasizes the need to proceed in a linear process in
order to maintain the goal of improving student achievement.
In sum, research has shown the importance of providing resources to enhance
educators’ data literacy. However, this research is just emerging and number of schools
24
using these resources is still limited. The research that is presented here shows great
potential in the ability of technology tools and expert personnel to impact data literacy
and data driven decision making.
Data Literacy and Data Driven Decision Making
Earlier in this literature review, Earl and Katz's (2002) research was presented to
lay a foundation for understanding data literacy. Earl and Katz described the inquiry
habit of mind and the five characteristics of data literacy. Literature has been presented
to evidence the need to further increase school leader data literacy despite current
supports and resources in place. In this chapter, the need to increase data literacy has
been discussed in the larger context of building capacity for data driven decision making.
Data literacy is one of the many pieces needed for successful data driven decision making.
This chapter will conclude by illustrating the role data literacy has in data driven
decision making. This provides further evidence to the urgency that exists to develop and
maintain data literacy in school leaders. It also informs the potential influence data
literacy can have on data driven decision making. Two data driven decision making
frameworks will be used to demonstrate this potential influence. Frameworks will be
presented to show the influence of data literacy from the federal context to the school level
context.
Datnow, Park, and Wohlstetter (2007) develop a conceptual framework of data
driven decision making to reflect the “importance of examining the relationships between
different levels of the broader public education context” (p. 16). The framework shows
the system and school context situated in the context of federal and state accountability
policies. This framework also emphasizes student performance as the center and shared
goal for each level. Figure 1 below illustrates the conceptual framework of data driven
decision making adapted from Datnow, Park, and Wohlstetter.
Figure 1 Data Driven Decision Making Conceptual Framework
Data literacy is needed to support every level of this framework. In the federal
context, data literacy is needed to support understanding of federal and state
accountability policies. In the system context, data literacy is integral to a successful data
system. Earl and Katz’s (2002) five characteristics of data literacy each contribute to the
data system. (1) Having purpose and (2) recognizing sound data are integral in
developing an assessment and evaluation system. Skills to (3) understand and analyze, as
well as (4) interpret the data are foundations for translating data into action. Lastly, (5)
communicating data effectively is imperative in data sharing. These same characteristics
are also reflected at the school context under data systems. Datnow et al.’s (2007)
conceptual framework provides evidence to support the importance and impact that data
literacy can have on data driven decision making in all different contexts.
Mandinach, Honey, and Light (2006) created a conceptual framework to show
data driven decision making with a focus on the educator and the classroom. This
framework does not dismiss the broader educational context, but provides a framework of
how data can be transformed to actions in the classroom. Data driven decision making is
depicted as a “six-step process, moving from data to information to knowledge” (p. 9).
Data is then used to make decisions that will be implemented and its impact evaluated.
The impact then feeds back to provide new data, information, and knowledge. This
entire system is aided by technology tools, data warehouses, and reports. Figure 2 below
shows a framework for data driven decision making adapted from Mandinach, Honey,
and Light.
Figure 2 Data Driven Decision Making Conceptual Framework
Earl and Katz’s (2002) five characteristics are integral in Mandinach, Honey, and
Light’s (2006) data driven decision making conceptual framework as well. (1) Having
purpose and (2) recognizing sound data are reflected here as organizing and collecting
data. (3) Understanding and analyzing data and (4) making interpretations set the
foundation for summarizing, analyzing, synthesizing, and prioritizing data. (5)
Communicating data is an important skill in the impact and feedback of decisions made
on data. Both of the conceptual frameworks shown here reflect the importance of data
literacy and the potential impact increased data literacy has on successful data driven decision
making. These frameworks will help inform my study.
Summary
The current push toward accountability is focused on implementing data driven
decision making. Conceptual frameworks of data driven decision making have evidenced
the vital role that data literacy has in successful data driven decision making (Datnow et
al., 2007; Mandinach et al., 2006). Specifically, the data literacy of school leaders will
greatly impact the success of data driven decision making at school sites. This study
hopes to contribute to the growing body of literature on data driven decision making by
examining the data literacy capacities and needs of school leaders.
Data literacy has been defined by Earl and Katz (2002) and is built upon a
foundation of an inquiry habit of mind. Earl and Katz identify five characteristics of data literate
leaders. They must be able to define purpose in looking at data and recognize quality
data to begin. Next, data literate leaders must use statistics and measurement skills to
analyze and interpret the data. Lastly, data literate leaders must have the skills to
communicate data effectively to all stakeholders. With a concrete understanding of data
literacy, the question remains: how do we increase data literacy?
The literature existing in this area has evidenced the potential professional
development, technology tools, and expert personnel can have on increasing and building data literacy. These supports and resources are greatly needed, as many educators lack data literacy (Earl & Fullan, 2003). Currently, the literature presented here has
evidenced the lack of effective professional development for all educators. The use of
technology tools and expert personnel show promise, but the use of these resources is still
29
limited. This study hopes to examine in greater depth what currently exists to build data
literacy and what remains to be done to foster data literate school leaders.
30
CHAPTER THREE
Methodology
This chapter will describe the research design for this study. The sample and
population, as well as the data collection and analysis procedures, will be discussed. This
will be followed by a discussion of the study limitations and ethical considerations.
As previously stated, there is a lack of data literacy among educators and this
study hopes to respond to the urgent need to encourage educational leaders to become
data literate leaders. This qualitative study aims to identify what supports and structures
are needed to increase educator’s understanding of data and provide a better
understanding of the data literacy capacities and needs of school leaders. The following
research question and sub-questions are addressed. The over-arching research question is:
(1) How do administrators increase their data literacy to build capacity for data
driven decision making?
This over-arching research question is supported by the following sub-questions:
(2) How data literate do administrators perceive themselves to be?
(3) What supports and structures are in place to increase administrators’ data
literacy?
(4) How do administrators use their data literacy to influence DDDM at their
school sites?
Approach
A qualitative, descriptive-analytic study method was used to conduct this research.
Merriam (1998) identifies qualitative studies as the primary design when researchers are
interested in insight, discovery, and interpretation, rather than hypothesis testing. Patton
(2002) adds that qualitative methods “facilitate study of issues in depth and in details” (p.
14). This study used qualitative data collection in the form of interviews and document
analysis.
The study aimed to provide an in-depth understanding of administrators’ data
literacy capacities and needs. Specifically, this study was designed to uncover the
process by which school leaders increase their data literacy. Patton (2002) notes that
“qualitative inquiry is highly appropriate for studying process” (p. 159). The qualitative
method allowed the study to go beyond simply identifying increases in data literacy
and to explore in depth and in detail the process by which gains were made. The goal of
this qualitative study is to gain practical insight into increasing school leader data literacy.
Depth and detail were needed to allow understanding of this process to emerge.
Sample and Population
This study focused on individual K-12 school leaders. Twelve administrators
were interviewed. The administrators were purposefully selected according to several
criteria. First, all of the administrators were site leaders of traditional K-12 public
schools in southern California. Both principals and assistant principals were included, as
they both represent school leaders. It is the assumption of this study that data literacy is
necessary at all schools regardless of school level or setting. Therefore, administrators
from all school levels and communities were included. Second, because the aim was to
understand the range of data literacy among administrators, I sought to include
administrators who at least initially described themselves in varied ways with respect to
their use and understanding of data. As Patton (2002) suggests, my goal was to include a
variety of perspectives from people who are dissimilar in terms of how they used data,
but similar in that they were all public school leaders.
I used professional contacts to find administrators who were willing to participate.
The administrators were contacted through email and phone. They were first informed of
the nature of this study and then asked to volunteer to participate in one 45-minute
interview and possible follow-up interviews on the telephone. The participants were also
asked to prepare any documents that they may want to share in relation to data use at
their school site, and specifically were asked to prepare samples of a student achievement
data report used at their school site.
All participants were administrators who used data regularly in their positions.
Nine of the participants were principals and the other three participants were assistant
principals. The participants represented eight different districts across three counties.
They also represented different school levels. Four participants were from elementary
schools, three from middle schools, and five from high schools. Only one of the twelve
participants represented a Program Improvement school. A Program Improvement
school is a school that has not met the Adequate Yearly Progress (AYP) goals set by the state.
Table 1 lists the participants and the districts and types of schools they represented.
Table 1: Overview of participants
DISTRICT    NAME           SCHOOL    POSITION
District A  Ms. Lewis      MS        Principal
District A  Mr. Steele     HS        Assistant Principal
District B  Ms. Norris     ES        Principal
District B  Mr. Lucas      HS        Principal
District C  Ms. Hernandez  ES        Assistant Principal
District C  Mr. Genetti    HS        Principal
District D  Ms. Mao        ES        Principal
District D  Ms. Dorsey     MS        Assistant Principal
District E  Mr. Finer      ES        Principal
District F  Mr. Saravia    MS        Principal
District G  Mr. Taft       HS        Principal
District H  Ms. McNiel     HS        Principal
The participants also varied in their experience in education. Some participants
had between 6 and 10 years of experience in education while others had more than 20 years of
experience. A smaller range was seen for experience in administration. Participants
ranged from their first year to more than 10 years in administration. The participants
were very similar in their experiences as a principal. All participants had less than 5
years of experience as a principal; some had not yet gained principalship experience. Another
similarity is in the level of education. Eleven of the 12 participants had a doctorate in
education or were in the process of completing one. This skewed the data toward
participants with a higher level of education. Overall, the participants had varied
experiences in education, but all were new (less than 5 years) to their current positions
and in pursuit of higher education.
Table 2: Educational experiences of participants
DISTRICT    NAME           YRS IN EDUC    YRS IN ADMIN    YRS AS PRINCIPAL    LEVEL OF EDUC
District A  Ms. Lewis      11-15          6-10            1-5                 Ed.D.
District A  Mr. Steele     11-15          6-10            N/A                 Ed.D.
District B  Ms. Norris     20-25          11-15           1-5                 Ed.D.
District B  Mr. Lucas      16-20          1-5             1-5                 Ed.D.
District C  Ms. Hernandez  6-10           1-5             N/A                 Ed.D.
District C  Mr. Genetti    20-25          6-10            1-5                 Ed.D.
District D  Ms. Mao        25+            1-5             1-5                 Ed.D.
District D  Ms. Dorsey     20-25          11-15           N/A                 M.A.
District E  Mr. Finer      11-15          1-5             1-5                 Ed.D.
District F  Mr. Saravia    6-10           1-5             1-5                 Ed.D.
District G  Mr. Taft       6-10           6-10            1-5                 Ed.D.
District H  Ms. McNiel     16-20          1-5             1-5                 Ed.D.
Data Collection
All participants were asked to participate in an audio-recorded interview
following a semi-structured protocol (included as Appendix A). Patton (2002) describes
how interview responses allow researchers to “understand the world as seen by the
respondents” (p. 21). The goal of the interviews was to elicit responses that would allow
insight into the world of the participant. I wanted to leave with a better understanding of
how data driven decision making occurred at the participant’s site and what would help to
increase the participant’s data literacy.
The twelve interviews took place over a span of three months. All but two of
the interviews took place at the end of the school day after the students had gone home.
The first interview with each participant was digitally recorded and transcribed. The
interviews lasted approximately one hour and fifteen minutes, of which about forty-five minutes
were recorded. The time that was not recorded was used at the
beginning to introduce myself, provide information about the study, answer questions,
and sign the consent form. Usually at the end of the interview participants would
continue to share their thoughts on the interview questions for another fifteen to thirty
minutes, often elaborating on experiences mentioned earlier. It seemed
they were more comfortable once they noticed the tape recorder had been turned off.
When arrangements were being made to conduct the interview, participants were
asked to be prepared to share a sample student achievement data report from their
school site as well as any other documents that were used to facilitate data driven
decision making at the school site. Only five of the participants had prepared documents
before the start of the interview. The participants introduced each document as it
pertained to their responses. If there were documents left unmentioned, I inquired about
them at the end of the interview. In all interviews, if a protocol or data document was
mentioned, I would request a copy. All participants would show me the relevant
documents, usually at the end of the interview. However, some of the participants did not
let me take the document with me because they were not comfortable sharing a document
that contained teacher or student names.
Once all of the interviews had been collected, a follow-up interview was requested
from each participant. The request was made through email, and most of the participants
answered the questions via email. The follow-up interview mainly consisted of
biographical questions, such as years as an educator. Three of the participants were called
to get points of clarification from the first interview if, for example, they had
described a data system or protocol that was unclear in the transcript. This
follow-up interview lasted about ten minutes and was documented with handwritten
notes.
Once all the interviews were conducted, they were fully transcribed,
resulting in approximately 150 pages of transcribed interview data. The transcriptions, my
interview notes, and any documents collected were analyzed together to find emerging
patterns or trends. Each source of data was initially coded based upon my research
questions. Then I developed more detailed coding systems to further analyze the data.
The sub-codes were developed by identifying trends and patterns in the participants’
responses as well as by using the questions from the interview protocol as a foundation. A
list of codes is provided in Appendix B.
Data analysis software was used to assist in this procedure. All interview
transcripts were loaded into HyperResearch, which was used to highlight and
tag the transcripts as well as to sort the data according to the codes. The documents were
kept separately and referred to as needed alongside the original transcripts. Using the list of codes,
an outline for chapter four was developed. HyperResearch reports that sorted the data
according to codes were used to provide evidence for the findings of this dissertation.
The findings for this dissertation have been triangulated for validity. Findings
presented in chapter four have been validated by multiple participants. Data
provided by participants that deviated from the norm were also presented if the
unique experience helped to provide understanding for the research question. The
experiences of each participant were also triangulated: participants described
their experiences and shared relevant documents to provide supporting evidence.
This triangulation of data helped to assure accurate interpretation of
trends and patterns in the data.
Study Limitations
As stated previously, the unit of analysis is the individual. Individual school
leaders vary by their experiences, knowledge, personalities, and values. These factors
undoubtedly limit the applicability and transferability of this study to other individuals.
However, the participants for this study were purposefully selected to represent a variety
of school leaders, with both more and less statistical knowledge. Hopefully, the insights
gained from this study can still provide some general knowledge about school leader data
literacy capacities and needs that is relevant to others more broadly.
Ethical Considerations
Ethical considerations include such issues as informed consent,
confidentiality, and protection of participants’ anonymity. First, approval for the study
was obtained through the IRB. After receiving IRB approval, participants were secured.
Informed consent was obtained from participants prior to any interviews. This ensured
that participants understood the nature of the study and voluntarily chose to participate.
Also, participants were informed of their right to withdraw participation at any time. Any
questions the participants had were answered clearly. The participants mainly asked
questions about confidentiality and anonymity. I described how the documents
and media would be kept secure in my home and how pseudonyms would be used at all
times. A few of the participants asked about the dissertation process, and I responded by
briefly describing my experiences in doing research.
Confidentiality and anonymity will be protected as much as possible. Since there
was only one participant per school, confidentiality and anonymity are less likely to be
compromised. That is, it is unlikely that participants previously knew the other participants of
the study. Names of participants will be guarded from other participants
during the course of the study.
All data collected such as taped interviews, interview transcriptions, interview
notes, or documents will be kept confidential. Digital evidence is kept on my personal
laptop and all paper evidence is being stored together and protected in a locked file
cabinet. This dissertation also protects confidentiality and anonymity by masking all
participant names with pseudonyms. At all times of research, I will adhere to the policies
and procedures set forth by the University of Southern California.
Summary
This chapter has described how this qualitative study was conducted. The
subsequent chapters will present the results of the analysis of the data.
CHAPTER FOUR
Analysis of Data
This dissertation aims to gain a deeper understanding of how to increase school
leaders’ data literacy and build capacity for data driven decision making. This qualitative
study used interview data and document analysis to answer the research questions posed
earlier. In this chapter, the data collected for this study will be presented and analyzed.
The four research questions will be answered through an analysis of the data. The
overarching question is:
(1) How do administrators increase their data literacy to build capacity for data
driven decision making?
This question is supported by the following sub-questions.
(2) How data literate do administrators perceive themselves to be?
(3) What supports and structures are in place to increase administrator’s data
literacy?
(4) How do administrators use their data literacy to influence DDDM at their
school sites?
The answers to these four research questions will be presented cohesively to provide a
comprehensive look at school leader data literacy. First, a discussion of participants’
self-perceptions of data literacy will provide a foundation. Next, findings will be presented on
the supports and structures in place to support data literacy. Then, the influence of the
participants’ data literacy on successful data driven decision making will be discussed.
Last, the overarching research question will be answered describing what participants
shared about increasing their data literacy.
Self-Perception of Data Literacy
To begin a discussion of data literacy it is helpful to draw upon Earl and Katz’s
definition of data literacy. Earl and Katz (2002) define a data literate leader as being able
to (1) Think about purposes, (2) Recognize sound and unsound data, (3) Possess
knowledge about statistical and measurement concepts, (4) Make interpretation
paramount, and (5) Pay attention to reporting and to audiences. Following this definition,
the participants’ self-perceptions of data literacy will be presented in this order.
Purpose
Earl and Katz (2002) believe that a successful approach to data begins by setting a
purpose for examination of data. The participants all perceived themselves to be
purposeful in their work and many felt they set the goals to drive their school forward.
Eight of the twelve participants described the goals they set and held the perspective that
it was their role to set a focus for the school. Mr. Genetti summed up his view of a
principal’s role when he said, “Whereas I set the goal, they are in full support of it.” This
sentiment of providing direction for the staff was shared by all the participants. Ms.
Lewis shares a similar view as she states that, “I set the goals, I share them with the
teachers, and then asked their opinions if they liked it, if they thought it would work.”
Mr. Finer reflects the same sentiment by saying, “I think it’s my role to make sure
everyone understands how important it is to have goals to be working toward something
as a school.” The other five participants echoed the same opinions.
One participant not only shared the need to set goals but also the need to frame those
goals around the use of data. Mr. Lucas defines his role more specifically as
establishing “a framework of expectations that focus around data, that focus on
collaboration, that constantly reinforce the idea of our vision which is to focus on student
learning.” When asked to break this down into more concrete goals, Mr. Lucas articulates
the need to “reduce our failure rate… better align our curriculum… improve test scores…
and better communicate with all our stakeholders.” While eight participants all spoke of
setting their own goals and focus, Mr. Lucas best articulated his focus as well as
measurable goals.
While most of the participants were able to share more specific goals they set in
looking at data, four participants were unable to do so. Of these four, two participants
were Assistant Principals who described goals set but did not take any ownership for
creating them. Ms. Hernandez and Mr. Steele both described the goals as having been
there before they started in their positions. The other two participants were principals
who spoke of improving student learning and instruction, but did not describe personally
creating these goals for their faculty. One participant, Ms. Norris, referred to using the
state goals to drive her, while Ms. McNiel described goals that were created by the
district before she started as principal. Overall, the participants understood the need for a
clear purpose and were able to articulate a concrete direction when looking at data. The
difference was that eight participants created the goals themselves while the other
four had those goals created for them by others.
Quality
Earl and Katz (2002) described the need to recognize quality data and use data
when appropriate. In discussing this component of data literacy with participants, it
became clear that all participants worked with state assessments that they were not privy
to seeing before administration and had no part in creating. Thus,
Mr. Genetti says, “There is a certain amount of faith you have to put in the state in
development of those assessments and there’s a certain amount of faith that you have to
place in the hands of the teacher that administers it.” Five participants shared this
sentiment regarding district-level or state-level tests.
Despite these concerns, all participants spoke about the need to find
quality data. Five participants felt they determined the quality of their data by using
multiple measures for each student or each purpose. “I think we have to look at multiple
measures, we just can’t look at that end of trimester test” (Ms. Hernandez). Mr. Genetti
used multiple measures such as test scores from previous years, end of course exams, and
district exams. Ms. Norris looks at her data and “couples [the data] with other pieces of
data; I would call it things like informal walk-throughs, classroom observations, bell to
bell observations, even individual conversations with students about their grades.” Ms.
Norris continued to describe how she also uses class grades, different assessments, and
student reflection.
Two participants indicated their need for quality by sharing experiences when
they received data that was not of high quality. The understanding of the importance of
quality data was well demonstrated when Mr. Finer described how
We’ve already had multiple errors on the [math] benchmark assessments. It’s
created a number of problems, so I don’t make as many decisions based upon the
math assessments because I want to understand that they’re valid before I start
holding people to making instructional changes based upon those tests.
Mr. Taft had a similar problem with Latin roots assessments. They found students all
scoring above 90% and realized that the data was not assessing what the students knew but
rather how much the teachers had taught the topic. Mr. Taft realized, “We must have
taught this WAY too well.” Mr. Taft and his teachers decided to try making a new
assessment to gather higher quality data.
Overall, all participants clearly articulated a need to find quality data that could be
trusted. However, given the mandated state and district testing, the participants felt
restricted in their ability to gather higher quality data. When participants found the data
to be of lower quality, the only course of action was to dismiss the data. Having so
little control over mandated standardized assessments resulted in participants either having
complete faith in the data and moving forward, or having no faith at all and ignoring the
data.
Statistics
Once a purpose has been established and quality data has been found, Earl and
Katz (2002) describe the need to apply statistics and measurement concepts to
understanding the data. In this component of data literacy, the participants ranged in their
perceptions of their abilities. One participant, Mr. Steele, stood out as being very
comfortable with statistics and data. Mr. Steele described taking the reins with data and
how he uses Excel to manipulate the data into meaningful presentations for his staff.
While Mr. Steele’s school has computer programs to do this, he prefers to tackle the data
himself in Excel. “Excel is just my friend,” Mr. Steele states. Mr. Steele continued by
saying, “I have a reasonably good statistical knowledge, to kind of look at things.” On
the other end of the spectrum, Ms. Norris admitted that, “I would love to be faster at
being able to see patterns. It takes me a while. I have to study my data.” The other ten
participants fell in the range between Mr. Steele and Ms. Norris.
Statistical understanding varied greatly, and there was no trend in the
participants’ responses. A trend that did appear was the ability to find assistance with
statistical understanding when it was needed. As Ms. Norris states, “I’m not very good at it –
it’s easier for me to pick up the phone.” Ms. Norris as well as four other participants
regularly used resources at the district office to assist with statistical analysis of data. All
participants described working with others to look at data. This made it difficult to gauge
participants’ individual comfort levels with statistics. Participants either seemed
comfortable with statistics or at least comfortable with getting help.
Most principals delegated at least one part of the statistical analysis to an assistant
principal or someone at the district level who provided this assistance. The assistant
principals, such as Mr. Steele, validated this trend by describing situations when they
were given data and asked to present it to the principal and staff. Mr. Steele sums up the
situation by saying “I have become kind of the guy who gathers all the data and presents
it to the teachers. I spend most of August as soon as the CST scores come in and like
24/7 crunching numbers”. When asked about support for data analysis Ms. Norris tells
me “my TOSA (teacher on special assignment) is really resourceful. If I need data or if I
need information I go to my--she's not really a TOSA (teacher on special assignment),
she's my reading coach.” As long as the principal was able to understand a summary or
report from someone else, having less than excellent statistical understanding was
surprisingly not a problem.
Interpretation
After quality data has been statistically analyzed, it becomes necessary to make
interpretations that will often lead to action. Earl and Katz (2002) describe making
interpretation paramount as the next data literacy component. Participants all felt
they were able to see patterns and trends and make interpretations. Despite differences
in their comfort levels, no participant expressed frustration with interpreting their data.
Many participants gave examples of how they have interpreted their data. The
examples illustrate the thought process they went through in interpreting the data. Ms.
Lewis shares that
My role is also to remind them that while we are a high performing school of 912,
where 80 or 82% of our kids are proficient on CST’s, that 20% of our kids are not
and we have an achievement gap, and that we need to focus on those students –
that 20% and help bring them up.
Ms. Lewis continues by describing how teachers identified students they taught from this
20% and discussed interventions and supports that should be provided for these students.
Ms. Lewis shows depth in her interpretation of data by focusing not just on the high
achievement but seeing the disparity and the achievement gap.
Another example comes from Ms. Mao who shares
I also know that with the fourth grade writing test fourth grade teachers in this
school…consider it their test. And when children don’t do well it’s like the
failure of the fourth grade. I would say then excuse me, children have been
writing since pre-school, ok? You cannot screw this up all on your own. So, I
tend to, well, not sugar-coat the data, but make them realize it’s a snapshot and
not necessarily the whole picture.
Ms. Mao continued to describe how her fourth grade teachers continue the conversation
by discussing what can be done to improve the students’ writing abilities. Ms. Mao’s and
Ms. Lewis’s experiences provide examples of how participants interacted with their
data to bring about instructional change with their staffs.
The participants shared many examples of how data guided instructional decisions,
and only two participants described experiences when data failed to yield instructional
decisions. These two instances occurred because the data was interpreted and the conclusion
was not to use it for instructional change. As described earlier in the section on
quality, Mr. Finer and Mr. Taft describe experiences where they lacked confidence in
their data and chose to discard it. It seemed that unless there was a solid reason not to, data
that was interpreted led to instructional change.
When asked to share examples, participants primarily described experiences relating to
student achievement data, particularly CST data. While this was the focal point for
this study, other data clearly exists, relating to budget, attendance, and other
measures. In terms of student achievement, participants focused on the CST with limited
discussion of classroom-level assessments and data. It remains unclear if all types of data
are being analyzed and interpreted. Although principals gave examples of where data did
lead to instructional change, they also acknowledged that much data had not been
analyzed and interpreted and thus the potential of using data to guide instruction had not
been reached.
Another trend is that most participants looked at data with others. None of the
participants looked at data in isolation. Some participants only shared data with the
administrative team; others used a data team. Three participants described the regular use
of a data team at their school sites. Often the data team was also the leadership team or
instructional team. Mr. Finer’s data team “looks at grade level data, so in addition to the
grade level sitting down and looking at their own data-their classroom data--we also sit
down as a team and look at the grade level and school wide.” The data team works
together to make interpretations.
There were many others who described looking at data as part of a leadership or
administrative team meeting. As with the data team, these participants described
looking at data with others; however, the difference was that data was only a part of
the agenda and not the focus of the meeting. Mr. Lucas describes his leadership
team meeting by saying
We meet with our leadership team every Wednesday. We discuss issues that are
of a school wide nature. We discuss procedural and logistical, but then we also
focus on learning. We always have our leadership team agendas… broken down
into communication and our academics. We kind of focus on those things.
Mr. Lucas describes the use of data at these meetings, but unlike a true data team, they do
not meet to focus on data.
There were only four participants who did not describe the use of a data team or
other small group. Of these four, two indicated a desire for a data team
but did not have the resources to create one. Ms. Hernandez states,
“No! No coaches, no coordinators, it’s just the Principal and I. We run the whole school
-- everything… The funds are very limited.” Ms. Hernandez described how she and the
principal present the data to the faculty for discussion. The other two participants did not
provide any experiences about looking at data in a smaller group.
Like statistical understanding, interpretation is another component that is shared with
others. As described above, participants shared their data with others to make
interpretations. Different groups were used to share data. Some participants used
data teams to focus on data; other participants brought the data straight to the teachers for
discussion. Whatever the method, interpretation of data was shared with others to
incorporate different perspectives.
Communication
Moving toward the end of the data process, one of the last and perhaps most vital
steps is to communicate the data to all stakeholders. Paying attention to reporting and
to audiences is Earl and Katz’s (2002) last data literacy component. This is also the
component of data literacy where the participants spent the most time and energy. The
participants mentioned both struggles and accomplishments in communicating data to
stakeholders numerous times in the interviews. The stakeholders discussed included teachers,
staff, parents, students, and the community.
There were many examples offered about communicating data. Mr. Finer met
one-on-one with every 4th and 5th grade student to share data from previous years and to
set a goal for how the student should score next time. In the course of the goal setting,
they discussed the bands and the test scores. “We sign a contract together for each 4th
and 5th grade student and that seems to have had some success.”
Ms. Lewis had regular meetings by department where they discuss their data. On
these staff development days, Ms. Lewis uses all the time to talk about data.
We did some activities early on this year with all their class rosters. They all
received their current students and how they did on English and Math CST last
year and they had to highlight every kid who wasn't proficient. So they knew
who their kids were in their classes.
Ms. Lewis chose this activity because she wanted to remind her teachers that despite
being a high achieving school, 20% of their students are not achieving and they need to
focus on this 20% to close the achievement gap.
While it is not possible for Ms. Lewis to attend each meeting in its entirety, she
had her department chairs keep notebooks containing the agenda and notes.
The notebook is then turned in to her, and Ms. Lewis returns it with her comments. While
this may not be a substitute for her presence, she communicates with each department by
exchanging this notebook with the faculty’s comments and hers in return.
Mr. Taft hosted events at night for parents, students, and the community to come
on campus and hear him share the school’s data.
We do tend to look at things like the fact that a lot of our ELD (English Language
Development) kids don’t pass the exam the first, second, or even third time they
take it. We do want to give those parents that information and show them that
trend, and say, ‘Ok, now's the time to get the tutoring because we’re going to need
to pass this before you get out of here’ or just give them a realistic expectation if
they're a Level 1 student they’re registering in as a Senior. It’s unlikely that
they’ll pass CAHSEE (California high school exit exam) by the end of their
experience.
This conversation with parents continues as Mr. Taft shares about ECT (district level
assessment) data, SAT scores, performance data, CST and CAHSEE, A through G
requirements (for entrance into the University of California), and graduation rates.
Mr. Taft reports a good turnout, sharing that “We give away a computer and we
also baby sit their kids while they’re there and feed them just for coming in and listening
to our data.” For parents who miss this event, information is posted online where parents
can log in at any time, and parents receive newsletters in different languages sharing data.
The website and the newsletters provide the same information on data given by Mr. Taft
at the parent night. Mr. Taft wants to offer his parents and community multiple chances
to become aware of the school’s data. Mr. Finer, Ms. Lewis, and Mr. Taft provide
examples of different ways the participants have found to share data with stakeholders.
These three examples were chosen because they were unique.
Despite all this success, all participants seem to continue to struggle to get all
stakeholders on board in looking at data. Communicating data is always a challenge
because there is always a need to have new and creative ways to engage different
audiences. Participants described finding it particularly challenging to engage teachers in
looking at data. When asked about challenges he faces, Mr. Saravia states, “it's
presenting the data in a clear manner that [the teachers] understand and how it relates to
them and also how it relates to their students and getting them to actually use data to
drive their instruction in the classroom.” Finding a successful way to communicate data
to motivate teachers to change instruction remains a challenge for many participants.
Summary
The following are key findings on the participants’ self-perceptions of data literacy.
These key findings summarize the answer to research question two.
• Purpose: Participants often saw it as their role as administrators to set a purpose
and a direction for the school. All but one participant was able to articulate a
purpose and goals for their school.
• Quality: Participants were constrained by the mandated state and district testing.
However, participants did recognize the need for quality data and the importance
of using quality data for decisions.
• Statistical Understanding: Participants varied greatly in their statistical
understanding of data. However, participants often outsourced this component
and had others analyze the data for them.
• Interpretation: All participants recognized the need to understand how to make
interpretations of data at their school sites. Also, participants shared data with
others to get varied perspectives in interpreting data.
• Communication: All participants communicated data to their stakeholders.
Communicating data is an ongoing struggle because new ideas are always
needed to maintain engagement and meet the needs of various audiences.
Supports and Structures of Data Literacy
This section has been divided into two parts: the supports of data literacy and
the structures of data literacy. The supports of data literacy include any programs that
directly increase a school leader’s data literacy, such as district professional
development and outside classes. The structures of data literacy refer to technology tools or
personnel that reinforce the data literacy a school leader already has, but do not necessarily
increase it. Together, these structures and supports work to increase data literacy and
build support for data driven decision making.
Data Literacy Supports
Participants were asked about two types of data literacy supports that were
available to them. The first was any professional development on data literacy offered by
the school district. The other was any classes or trainings participants pursued
independently. When asked about data literacy training and professional development,
participants were encouraged to share any professional development or training
that provided information on how to use data at their schools. Participants were asked
about both technical training on how to access data and training on how to use
data with staff or communicate data to constituents.
When asked about district-sponsored professional development, only four
participants clearly described having district-sponsored professional development on data
literacy. However, when pressed, three of the four described training given by Data
Director or Edusoft. Mr. Finer shares that “a majority of what we get is the technical
support from the district office… we’ve gotten what we asked for and we usually ask for
technical support.” Mr. Lucas was the one participant who received more than technical
training in data literacy; he also described professional development workshops. Mr.
Lucas states that his district has “the focus on data, the focus on the reality of the
conversation, and the idea that we need to focus on the ‘main thing’ and that ‘main thing’
being student learning.”
Three of the participants described remembering some time spent on data literacy,
such as “one discussion about data and that was the beginning of the year when all the CST
results came back. That’s about it” (Ms. McNiel). Similarly, Ms. Hernandez recalls “one
little workshop on using Data Director. It wasn’t enough for a person to be proficient.”
While these three participants did not deny professional development on data literacy was
offered, it was clear the professional development was doing little to increase their data
literacy.
The remaining five participants clearly answered “no” when asked if they were
offered professional development on data literacy. Mr. Saravia candidly says
As far as analyzing the data itself and what it means, I don’t think that support
from the district office is strong whatsoever. I think basically looking at the data
myself; I was able to teach myself what it meant and what I was looking at. As
far as support from the district office in terms of doing that, we didn’t receive
much support at all.
For these five participants little was done to increase their data literacy.
However, that did not mean that their data literacy stagnated. As mentioned
earlier, 11 out of 12 participants had either earned an Ed.D. or were in the process of
earning their degree. Ms. Norris, who received no district professional development on
data literacy, described her increase in data literacy by saying
I learned everything kind of simultaneously. Everything was making sense.
Everything I was hearing in my classes; everything that I was being shown and
taught by my teachers. It was just layers and layers of things that would happen
all simultaneously. Things I was hearing in class I was able to put into practice
right away, so it was kind of cool.
Many participants described similarly positive educational experiences.
Ms. Mao says “Well, I’m working on my Doctorate as well, so I’ve gone a lot to
my professors and gotten help that way. For example, two years ago I was taking a
statistics class as part of my program” when asked about her data literacy exposure. Mr.
Lucas shares about the importance of collaboration in his graduate program saying, “I
think that being in a doctoral program myself and being in a cohort with 25 other
administrators really helps a lot.” Mr. Saravia echoes the same sentiments saying
My own knowledge has been received from working at USC with Dr. Datnow,
looking at her studies, analyzing best practices using data. Also looking at
research for my own dissertation and reading how other schools use data, that's
been a big help to me. Also just my networking with other site principals and
other people in education has been a value to me. So I would say that's mainly
where my information has come from.
Despite the positive experiences described, participants were not asked to share details of
the data literacy training provided in their graduate programs. While it was clear that the
participants learned how to use data in their programs, it remains unclear how much time
was spent on data literacy and what specific topics were covered.
Three participants described attending trainings or professional development sessions on
their own. Mr. Finer described a weekend summit as “a real eye-opener with regard to
data.” Other participants described obtaining data literacy in their own ways. Mr. Steele
shares that “I was a math teacher, and my wife’s an accountant.” Mr. Saravia finds he
increased his data literacy by “networking with other site principals and other people in
education.” It seems that whatever the method, the participants were increasing their data
literacy.
Every participant was dedicated to learning what they needed to be successful in
their jobs. It seemed that the lack of district professional development did not prevent
administrators from gaining data literacy. Ms. Hernandez shares that “Everything I have
learned, I’ve learned on my own.” Mr. Taft went further, saying that
what really separates people who are leading in schools from people who are
working in schools is your willingness to go out and get it yourself. I think that I
have always been a person that, if there is something out there to know, I would
like to go and get it.
Data Literacy Structures
Participants were asked about two general categories of structures in data literacy.
The first was about technology that reinforced their data literacy. The second was about
the personnel that were available to also reinforce their data literacy, such as a data team
or data coach. When asked about available technology, 10 of 12 participants reported
using a computer program to analyze data. The computer program was provided by the
district. In reflecting on Data Director, Mr. Finer stated, “I think it’s adequate to more
than adequate. I haven't wanted to look at data in any way that I couldn't use Data
Director for.” Many participants described using Data Director or Edusoft and described
them as “powerful” and “useful” tools. Of the ten participants who used a computer
program for data, only Mr. Steele felt that the computer program was unable to give him
data in a format that he wanted. Mr. Steele described Data Director by saying, “It was very pretty
and lots of bells and whistles… I haven’t found it to be super super helpful.” Mr. Steele
continued to describe how he puts his data into Excel. Overall, the participants still felt
the computer programs were very useful in using data at the school sites.
Two participants from the same district reported not having a computer system for
data analysis. Ms. Mao stated,
I’m the computer program! You are looking at the computer program! I wish we
had one, but no, I just do it myself…I know it’s easier and faster for me to do this
by hand, so I do it by hand, just to say I did it, and then I put it into some sort of
format to send out to parents and teachers.
Ms. Dorsey, from the same district, added, “We don’t have a data collecting, analyzing
and sorting program-- that’s a real hindrance both for our teachers as well as for us.” It
was clear that Ms. Mao and Ms. Dorsey did not use computer programs to analyze data;
however, it is still unclear how they did analyze data. While Ms. Mao describes doing
everything by hand, it is unclear exactly how she analyzes that amount of data by hand.
For schools without a computer program to support data use, fewer opportunities were
available for data analysis, and thus less practice for developing data literacy occurred.
When it comes to having personnel to assist in data driven decision making, every
participant reported working with others. As mentioned earlier, data driven decision
making was not done in isolation; it was clearly a team effort. Nine
participants reported having district personnel they could contact for support in using data.
Participants shared about how they had access to individuals at the district level who
specialized in data use and instruction. These participants all have confidence in having
strong support from the district. Mr. Genetti shares that “they are virtually at my beck
and call. If I had the opportunity right now to call them, I'd certainly get them on the
phone and I could have them in my office very, very quickly.” It does seem, however, that
the district often responds only to requests from the school site: “all you have to
do is ask for it” (Ms. Hernandez). Mr. Lucas cautions that “what one gets from the
Office of Assessment for their site might not be the same thing that I would get for my
site if I don't ask for it.”
In addition to district support personnel, support personnel were often found at
the school sites. Participants reported working with other administrators, coaches,
coordinators, counselors, RSP (Resource Specialist Program) teachers, and other
out-of-classroom teachers in data driven decision making. At many schools it seemed that
participants were resourceful and used all available resources. It was unclear if the
support personnel at these schools were required to participate in data driven decision
making or if the individuals chose to participate. Mr. Steele described how his math
teacher contributed, saying he was “just an amazing statistician.” Again, while the
support personnel varied, all participants reported working with others on data driven
decision making.
While not directly asked about it during the interview, three participants described
the use of protocols in guiding their schools’ data driven decision making. Mr. Finer, whose
school was further along in data driven decision making, has data analysis sheets that
teachers fill out before meetings and then protocols that are followed during the meeting
to discuss data. One page asked teachers to identify students in each performance band.
This page was completed by the time the teacher arrived at the meeting. During the course of
the meeting, the teachers discussed reasons for the students’ placements and identified
instructional strategies that could be used to improve student achievement. Mr. Steele
and Mr. Saravia reported using forms or protocols, but did not describe them in detail.
They reported having inconsistent success in implementing them at the school site.
Summary
The following are key findings on the supports and structures of data literacy.
These key findings summarize the answer to research question three.
• Only four participants reported having district-sponsored professional
development on data literacy. The professional development that was offered was
mostly technical support.
• Five participants reported receiving no professional development on data literacy
from the district.
• Eleven participants gained data literacy through an Ed.D. program.
• Participants sought to increase their data literacy independently through a variety of
resources, taking responsibility for their own professional growth.
• Ten participants used a computer program to analyze data and all but one found
the computer program to be extremely helpful.
• The two participants from the only district without a computer program reported
on the need for the district to purchase one.
• All participants used personnel resources at the district office and school site to
support data driven decision making.
• Some participants found the use of protocols strengthened data use at their school
sites.
Data Literacy’s Influence on Data Driven Decision Making
Next, a discussion of how participants have used their data literacy, data literacy
supports, and data literacy structures to create a data driven decision making environment
at their school sites will be presented. The focus is on the effect of the participants’ data
literacy on data use at their school sites. First, an overview of how data is used at school
sites will be discussed. Next, a more detailed examination of two participants whose
school sites had a high level of engagement in data driven decision making will be shared.
Lastly, evidence will be presented to show the need for administrators to focus on delegating
data driven decision making and creating systems to build staff capacity.
Data Use at School Sites
While all the participants reported using data in their decision making, the data
use varied from participant to participant. Some participants described their schools as
very data driven; others felt they were just beginning to use data and had a long way to
grow in terms of data driven decision making. When asked how data driven their school
is, Mr. Finer responded by saying, “It is to the point where they are pretty much ready to
continue this process on their [teachers] own.” On the other hand, Ms. Hernandez
responded by saying “I’d say around the low end.” Ms. Lewis’s response was “I think
I'm getting better, I think I'm weak though.” Ms. McNiel described her school, saying,
“We've gotten better every year…But we’re still missing that final piece, you know, to
get us up to where we need to be.” Despite the variance in data use, all participants felt
that data was an integral part of their job and recognized the need to better understand
how to use it.
At all twelve schools, participants reported using data with many different
stakeholders. They shared data with teachers, parents, students, and other staff. Mr.
Saravia said, “our counselors meet for the AB 1802 (state requirement) meetings and go
over that with our far below basic and below basic parents, then they go over that
information.” Mr. Genetti describes how his school communicates data by saying,
Our website also shares information and then I share information at any time I
meet with parents, school site council, booster meetings, the various community
meetings that I have to attend and parent meetings we hold on campus. It's also
shared through counseling department when they meet with parents and students
so it's pretty well put out there.
Participants shared a variety of methods for communicating data to various stakeholders
in the community.
Data was shared in a large group setting at all schools. Usually this would occur
at the beginning of the year during a faculty meeting. After that, each school varied with
how they proceeded with their data. One of the participants only shared examples of
looking at data in large staff meetings. Eleven of the participants described using data in
smaller groups of teachers throughout the year. A more detailed discussion of their data
use follows.
At one of the schools, the participant only discussed sharing the data in large
group meetings and did not describe any set system or protocol for sharing and analyzing
data throughout the year. Ms. Mao described sharing data with her teachers by saying,
When we meet as a staff the day before school opens that is when I will talk about
the Star assessment from the year before; I’ll bring up a watch list. … So what I
say on the first day back is this is a snapshot of what was done last year, so even
though we have these kids on a watch list doesn’t mean that they necessarily need
to be there this year. … so I can pull data, and share grade levels, [and] staff
meetings. I’ll just shoot an email and let them look at it on their own.
It is unclear if there are other procedures for data use at this school site; however, this
was the only information given by the participant.
This was the same school that did not have access to a computer program to
analyze data. While their implementation of data driven decision making was more
limited, it was difficult to determine if the participant’s data literacy played a role in the
implementation. More than anything else, it seemed it was the lack of technology
resources that prevented them from having more opportunities to use data and from moving
forward in implementing data driven decision making. Ms. Mao seemed to apply her
data literacy as much as possible given the limited technology support.
The participants at the other eleven schools moved beyond the beginning-of-year
meeting with regularly scheduled meetings in smaller groups throughout the year. These groups would
then work to use the data to inform and guide instruction. The groups were called
“professional learning communities,” “grade level groups,” “departments,” or “content
clusters.” While the name of the group varied, the organization was very similar. In
elementary schools, teachers met in grade level groups, and in secondary schools, teachers
met by content groups. The meetings were held anywhere from once a month to four
times a month. The teachers would come together to look at student achievement data
and discuss instruction, implementing data driven decision making. The results of the
meetings were usually shared with an administrator who oversaw the group.
When asked how data was shared at her school, Ms. McNiel responded by saying,
A number of different ways. A lot of it is done through Department Chairs.
Again, we have 4 to 5 “Course of Life” days where we have substitutes in the
classrooms and the entire Department has the day to work together. A lot of that
is done in those venues as well as department meetings.
Mr. Saravia shared, “As far as our meeting structure is concerned, we have early out
Wednesdays and … our department and grade level meetings are important because
basically the department says, "Okay, this is what we are going to do as a department" or
their action plan. Then the grade level meetings where they can actually work together to
make sure they do what is needed from the department in achieving their goals. So those
are very important. They all are focused on looking at the data and making sure that they
are achieving their goal.”
Ms. Hernandez described her school’s implementation by saying, “This year the
District is implementing professional learning communities which is really a big deal for
the teachers because now they have to meet for 1 hour every week.” These professional
learning communities are focused on looking at data. Mr. Lucas’s school also had
professional learning communities focused on data. “My expectation is that our teachers
meet every week and dialog about data and dialog about student learning, and we go
around and make sure that's happening.” As these examples show, data is used at these
schools. Each school site finds a unique way to incorporate data driven decision making
into the school’s structure.
It is difficult to tell how much the data literacy of the participants influenced
the implementation of data driven decision making. There were two reasons for this.
First, as mentioned earlier, none of the participants worked toward data driven decision
making in isolation. Participants worked in groups to develop the systems and protocols
for data driven decision making. For example, when asked how the small group meetings
are planned, Mr. Lucas responds by saying,
The leadership team takes that information and they discuss it at their Friday
dialogue with their content area team. Then it just keeps going back and forth.
They'll bring stuff from the Friday meeting to the next Wednesday meeting. Then
they'll take stuff from the Wednesday meeting and discuss it at the Friday meeting.
Thus, it is difficult to separate Mr. Lucas’s data literacy from the data literacy of the
leadership team.
The second reason it is difficult to determine the influence of data literacy on data
driven decision making is that data driven decision making is now mandated at many
schools. Mr. Finer shared that his district mandates that
all data is to be posted in the classrooms. That's a given. Every district, I mean
every school in the district and every classroom will have their benchmark data
posted. Each teacher's lounge and office are also supposed to have the data posted
for parents and for teachers.
In addition to district mandates such as Mr. Finer’s, all participants are under the pressure
of NCLB (No Child Left Behind) to increase their API (Academic Performance Index)
and AYP (Adequate Yearly Progress). It is difficult to say if the data driven decision
making at school sites is just a response to these district and state mandates to use data
or if school leaders would pursue data driven decision making without these mandates.
In summary, it is clear that data is being used at school sites. The level of data
driven decision making varies between schools. What remains unclear is the effect of the
participants’ data literacy on data driven decision making implementation. Data driven
decision making at the school site can be a result of others’ data literacy or a result of
district and state mandates.
High Implementation Schools
Of the ten schools that had small group meetings for data driven decision making
throughout the year, two schools seemed to have brought data driven decision making to
the next level. At these schools it was clear that the data literacy of the participant was
key to moving the school to a more successful implementation of data driven decision
making. Both Mr. Finer and Mr. Steele have data teams at their school sites that work
together to achieve data driven decision making. Both school sites have also
implemented data driven decision making for at least two years and have had the
opportunity to reflect and learn from their own experiences.
While many schools used data, only Mr. Finer and Mr. Steele’s schools had set
personnel, schedules, and protocols to examine data. When asked, both participants also
described themselves as being very data driven and data literate. The detailed examples
as well as the confidence and clarity with which they spoke about data driven decision
making at their school sites demonstrated their high level of commitment to data driven decision making and the depth of its implementation.
What sets Mr. Finer’s school apart are the protocols and forms that are in place to guide the meetings and maximize efficiency. Even though the meetings are not mandatory for teachers, 90% of teachers come ready to work. “They definitely come to the meeting prepared. Nobody shows up at the meeting without their classroom data analysis sheet.” Mr. Finer shared several stories of how teachers have taken the work from these meetings back to their classrooms and changed instruction.
For example, Mr. Finer described a first grade meeting where the teachers were looking at data from a recent reading comprehension assessment. The teachers noticed that the students’ scores had dipped and discussed what could be done to boost the scores.
Teachers shared ideas on different vocabulary strategies that they have, and one
of the teachers had gone up and demonstrated a word sort that she had been using
as an activity in her classroom. As a grade level, we committed to saying that this
is something everybody is going to be doing because it had proven success based
on the data. When I come through classrooms during that time, that's something
that obviously...by being in the meeting and being there when they made that
commitment, it's pretty much a guarantee that it's going to be done in the
classroom. No one's going to sit there and say, ‘Okay, I'm going to say yes, yes, yes at the meeting and I'm going to get in the classroom and do something differently.’
It was clear from the examples Mr. Finer gave that his school was using data to impact instruction.
Mr. Steele has also brought his staff to the next level of data driven decision
making by giving them different ways to look at data. Mr. Steele describes himself as the
kind of the guy who gathers all the data and presents it to the teachers. I spend
most of August as soon as the CST scores come in and like 24/7 crunching
numbers so that, when the teachers come back in August, part of the time that
they spend in staff development and things like that is looking at how they did
compared with the rest of the department. It’s anonymous. I can show you how I
kind of do that, but they’re looking at the trends by the clusters and have these
cool little graphs to look at and they identify as a group where the weakness in the
department is as far as the clusters.
Mr. Steele shows me a sample packet of how this data is displayed. It is an
example from the science department. Along the right side is a series of numbers. He
explains to me that the teachers are all aware of their own codes. Names are not used in order to preserve the other teachers’ anonymity. Moving across, there is a list of clusters that
represent groups of standards on the assessment. The teacher can see how each class
performed on each cluster of the assessment as well as how all of the teacher’s students
combined performed. These performance measures can then be compared to other
teachers or the department as a whole. Mr. Steele keeps these reports for all subjects and
assessments. Using multiple reports, teachers can compare student performance across
different years or assessments. It is clear that the report has been tailored to meet
the needs and demands of his staff. Mr. Steele also mentions that he can tailor the report
to any specific needs the department may have.
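To make the structure of such a report concrete, the following is a minimal sketch of how an anonymized cluster report of this kind could be assembled. It is an illustration only, not Mr. Steele’s actual spreadsheet: the teacher codes, cluster names, and scores below are invented, and the layout simply mirrors the packet described above (one row per teacher code, one column per cluster, plus a department average).

# Illustrative sketch only: the teacher codes, clusters, and scores are hypothetical.
# One row per anonymous teacher code, one column per standards cluster, plus a
# department-average row, mirroring the packet described above.
scores = {
    "T01": {"Cell Biology": 62, "Genetics": 71, "Ecology": 55},
    "T02": {"Cell Biology": 58, "Genetics": 66, "Ecology": 60},
    "T03": {"Cell Biology": 70, "Genetics": 64, "Ecology": 52},
}
clusters = ["Cell Biology", "Genetics", "Ecology"]

def department_average(cluster):
    """Average one cluster's percent correct across every teacher code."""
    values = [row[cluster] for row in scores.values()]
    return sum(values) / len(values)

print(f"{'Code':<10}" + "".join(f"{c:>14}" for c in clusters))
for code in sorted(scores):
    print(f"{code:<10}" + "".join(f"{scores[code][c]:>14}" for c in clusters))
print(f"{'Dept avg':<10}" + "".join(f"{department_average(c):>14.1f}" for c in clusters))

A report built this way lets each teacher locate his or her own row by code and compare it to the department average without exposing any colleague’s name, which is the anonymity Mr. Steele describes.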
Both Mr. Finer and Mr. Steele express an avid interest in data driven decision
making. That interest propels them to learn more and explore data literacy. Mr. Finer
states, “I usually tend to gravitate toward the data driven workshops just because it is an
area I have a lot of interest in.” Mr. Steele’s interest in data is clear when he describes his own Excel programs and how he created them because Data Director was unable to offer him what he needed. While Mr. Steele and Mr. Finer also rely on data teams, their dedication to the process suggests that they are the driving force on those teams and that their data literacy is what pushes data driven decision making forward at their schools.
Delegating Data Driven Decision Making
In describing the influence of data literacy on data driven decision making, it is
important to point out that administrators use their data literacy on a larger scale to
oversee data driven decision making processes at the school. Administrators need to
perfect their data literacy skills to report to large audiences and to delegate data driven
decision making processes through protocols. The challenges faced by the administrators
are in creating systems and protocols that will build teacher capacity for data driven
decision making and in generating motivation for data driven decision making. This
became evident as participants described how rare it was to be able to discuss data with
their teachers one-on-one. There were two reasons for this situation.
First, administrators were unable to meet with teachers one-on-one due to time
constraints. Often meetings are occurring simultaneously and most administrators are
rotating through meetings. Mr. Taft states, “I would say if I make it on average twice a
year to a department meeting that would be about right. It's not even close to what I want
to be able to do; just time-wise, it's not going to work.” Mr. Genetti echoed Mr. Taft’s
concern with time by saying,
I have 87 staff members and I'm, for example, doing the routine annual
evaluations or semi-annual evaluations. I think I have 20 staff members this year,
and it's a challenge for me to get in those classrooms for honest evaluation and sit
down and meet with them as many times as I do with those 20. To try and do it
with 87--it would be virtually impossible.
Ms. Lewis described her situation by saying, “I want them to look at data but I don't have
a lot of individual conversations and even with my Department Chairs it's more ‘okay,
here are the results. Go and discuss them with your department.’”
Second, administrators moved cautiously in working with teachers one-on-one with their data. Administrators did not want to create tension or lose teacher motivation to work with data. Mr. Steele comments on how he does not speak to teachers about data one-on-one because it would appear too “big brotherish.”
Two participants discussed contractual issues in speaking with teachers about their data. Ms. Hernandez told me about grievances occurring in her district regarding data use. Mr. Saravia states,
As far as one-on-one, what I used to do is I would have meetings with my
teachers and sit down with them--the teachers that I was evaluating--and we
would go over that. Now some felt that--not the teachers themselves but in
discussion with other administrators--that might become a sticking point with the
union. The teachers felt threatened in that their data was being presented to them
during the goals and objectives meeting. So I backed off from that because I didn't
want to go down that path.
In addition to time constraints, union issues are another obstacle in meeting with teachers
one-on-one to discuss data. Therefore, participants have found that data must be
communicated to staff in large group meetings and with the help of other support
personnel.
In sum, time constraints and contractual limitations leave little opportunity for
administrators to practice or perfect skills in discussing data one-on-one with teachers.
Rather, administrators tend to focus their attention on communicating about data to large
audiences and delegating and overseeing protocols that promote discussion around data.
Summary
The following are key findings on the data literacy’s influence on data driven
decision making. These key findings summarize the answer to research question four.
• Student achievement data is used at all schools in this study to make instructional
decisions.
• A lack of technology resources can greatly limit a school’s ability to implement data
driven decision making.
• School leaders who are comfortable with data can influence data driven decision making and bring it to the next level.
• Administrators rarely discuss data with teachers one-on-one.
• Administrators need to be data literate themselves in order to successfully create systems for data driven decision making for their staff.
Increasing Data Literacy
The overarching question of this dissertation will be answered in this section.
Information will be presented about how the participants felt their data literacy could be
increased. Participants felt that a variety of elements would positively impact their data
literacy levels. These elements included time, collaboration, technology, and
communication. Each of these elements will be discussed in this section.
Time
When asked about increasing their data literacy, Ms. Norris responded by saying, “You know, sitting in this chair, time is the Number 1 issue.” Many other participants shared this sentiment of needing more time. Ms. Lewis says, “I need more time!... there's
just not enough time between making sure that the internal/external goals are set, and
making sure the writing focus is on, getting in to all the classrooms on a regular basis.”
Ms. Dorsey adds that “the thing which is always an issue is time.” Mr. Genetti agrees
saying, “Time is a huge challenge.” With all the demands being placed on administrators,
it is clear that more time can always be used. Other participants also mention the struggle
to find time to work with their staff and develop their staff’s data literacy.
Ms. McNiel, however, was referring specifically to the time devoted to data literacy.
It’s not so much that I need more hours in a day. I would say in order to move
forward it needs to be given more priority internally as well as externally… it’s all
of this (points to a pile of papers on her desk) that gets in the way; it’s the urgency
in a phone call or an e-mail from the District. ‘There's a report I need you to fill
out and, sorry, I need it by tomorrow morning. Get it done right now!’ I feel like
we're this island where we keep trying to move our teachers forward
professionally…We want to look at data and how to glean the most from it. But
what interferes is doing what I call “the crap”… You know, getting pulled in all
different directions…if this were more controlled all around us, we would have
more time because we wouldn’t have to deal with so many of what I think are
extraneous issues.
Ms. McNiel brings up an important point: more priority needs to be given to data driven decision making. If more priority were given to data driven decision making, then more time would be set aside to focus on data.
Collaboration
Participants not only wanted more time focused on data literacy; they also knew how they wanted to spend that time. Eight participants cited a need for
more time to collaborate with other professionals in the field. Mr. Lucas shared that “I
think the next thing is creating time for collaboration.” Ms. Hernandez specifically felt
administrators needed “more time to sit down and talk to one another. I don’t think that
sort of coaching and mentorship of colleagues – mentoring each other and coaching each
other and leaning on each other – I don’t think is there.” Many of the participants echoed
Ms. Hernandez’s perspectives and cited a need to discuss ideas with other educators in
different schools and districts.
Participants had different ideas of how that collaboration would be organized. Mr.
Finer and Ms. Dorsey were looking for a more “informal collaboration with principals”
where colleagues could be “sounding boards” to give feedback on ideas. Ms. McNiel
would guide this conversation by having administrators bring school site data to discuss
at a district level meeting each month – “like a tutoring session.” Mr. Taft and Mr. Steele
saw collaboration as visits to other schools.
I think you've got to show them something that's working well. Find a district
that we can look at a case study for or get us there... I would like to actually go
and see something done really, really well…I think that building belief that [it]
can be done… because right now we have a lot of things from a conceptual standpoint… but I haven’t seen it work yet. So those things would really help me to fire people up… if I could say ‘Yes, I’ve seen this and it’s great!’
Regardless of how they saw the collaboration working, these participants felt that a
chance to work with other school leaders would increase data literacy.
Technology
In addition to more time and more collaboration, participants also wanted
increased technology support. The type of technology support needed depended on
existing levels of technology resources available to the participants. Only two participants did not have access to a computer program to analyze data, and both expressed the need for such a system. Ms. Mao stated,
I would get a data system or I would have the district put one in, and I would get
some really serious training. I would love to be able to do that! I mean, not only
would it save me a lot of time, but I think that it would be so exciting and
interesting!
Ms. Dorsey said simply, “we all agree that [what] we need is some sort of computer tool.” The
district was unable to afford such a computer system. It was clear from these two
participants that not having this technology tool greatly limited their ability to proceed
with data driven decision making.
Of the remaining ten participants who had access to a computer program, only
three participants felt the need to get further technology training to increase data literacy.
The remaining seven did not feel a need for any increase in technology or technology
training. Although the three participants were satisfied with the computer program and were able to use it successfully, they felt the program held more potential than they had the skills to access.
These three participants all described very specific training they wished to receive.
It was evident that they had used the program enough to understand what skills they
needed to develop. For example, Ms. Norris wished to
get data broken down more by standards... I would like to see more focus on... say
these focus standards…That way they could see if the students have met their
goals as far as they should be at this time of year so that they are ready when the
big test comes.
Mr. Genetti shared that
I would like to be better informed on the use of Data Director and the
development of questions…The ability to seek an answer and pose a question to
the Data Director that will result in the answer that I am seeking.
He described how he had wanted to get certain pieces of information from Data Director but did not know how to enter the exact question into the program. It was clear that these requests went beyond basic use of the technology.
Communication
When asked what area of data literacy they felt they needed to improve upon, eight participants agreed that motivating their teachers was the priority. When asked what she needed more information on, Ms. Norris responded by saying, “Motivation-the key to motivation!” Ms. Hernandez also finds motivating her teachers a challenge. Ms. Hernandez describes the challenge of data use at her school as
the teachers getting tired and exhausted because they’re not used to that and saying ‘I’m just tired.’ They’re looking tired… we look tired because it’s like ‘Oh my goodness, there’s just so much to do!’
Many participants shared challenges with teacher motivation in data use.
Mr. Lucas described his challenge with keeping his staff focused on creating a culture of using data.
I think that probably the biggest challenge, the most ongoing challenge, is
continuing to be a cheerleader about the vision…you are constantly trying to
manage your resistors, as an administrator maintaining that constant vision is just
an ongoing daily deal for me.
Ms. Mao describes why motivation is a challenge for so many of the participants.
I probably think that elementary teaching has always been thought to have been
an art not a science…I think that it is a paradigm shift, to have people look at and
make decisions based on data. That is fairly new, particularly in the history of this
state, and I think, to many teachers, particularly those of my age who went to
teach in the seventies, that was a big no-no, and so it is hard to make that shift.
Several participants described the paradigm shift into data driven decision making as a
challenge at their sites.
Participants therefore want to learn about how they can better support their
teachers in using data and in making this paradigm shift. As discussed earlier,
administrators rarely meet with teachers to discuss data one-on-one; rather, administrators need to delegate and create systems to support data driven decision making. This presents yet another challenge in communicating about data and motivating teachers. Thus participants showed a need to learn more about motivating their teachers and to do so through strong communication and effective systems that delegate data driven decision making and build staff capacity.
This challenge falls under Earl and Katz’s (2002) fifth component of data literacy – paying attention to reporting and to audiences. Many participants describe the need to address this challenge. The fifth component is indicated as an area needing improvement both for participants who are just beginning data driven decision making and for participants whose schools are further along in using it.
For example, Mr. Steele states,
I think what I need to get better at is pushing teachers to use it more. I think I’m
comfortable with it; I think I know where to get things. I have a reasonably good
statistical knowledge, to kind of look at things, but getting teachers to use it and
kind of guiding them through that process that we talked about earlier which is
kind of really our next frontier.
Mr. Finer shared this sentiment and described how some of his teachers only participate out of peer pressure; he wanted the teachers to begin to “take this on” and take
ownership of data driven decision making. He shared that he works on “increasing the
awareness with the teachers of how important it is to use the data and just the incredible
things we have at our fingertips.” Mr. Saravia shared the need to get “the teachers to
believe in the data. Getting the teachers to believe that the CST is a valid assessment to
measure student achievement.” These are just three examples that evidence the need to develop data literacy skills for reporting and communicating.
A few participants were more detailed in their desire to focus on motivating teachers. One participant was able to describe specifically what was needed to push his teachers to take more ownership of data driven decision making. Mr. Saravia stated,
there isn't a clear message that this is needed to move us forward. If that came
from our district office and it was mandated by our district office, I think district-
wide that would help with principals pushing our teachers to create those
benchmark assessments.
Another participant, Mr. Genetti, described that he wanted the teachers to “have a better
understanding and an understanding deep enough that they can share that understanding
with whoever they come in contact with.” Mr. Genetti reiterates Earl and Katz’s definition of data literacy: the ability to report data to a variety of audiences.
Summary
The following are key findings on how to increase data literacy in school leaders.
These key findings summarize the answer to the overarching research question.
• More time is needed to develop data literacy skills.
• An increase in collaboration with other school leaders will increase data literacy.
• Technology resources are vital in creating a foundation to increase data literacy
and support implementation of data driven decision making.
• Participants wanted to increase data literacy in the fifth component – paying
attention to reporting and to audiences. This would help to motivate teachers to
pursue data driven decision making.
Conclusion
The participants shared their experiences in using data driven decision making at their school sites and reflected upon their data literacy. In reporting and analyzing their experiences, I found evidence that data driven decision making is being implemented at all schools. As expected, each school is at a different place in implementation, but all
schools are moving forward toward using data to guide decisions that will increase
student learning and improve teaching.
As each school is in a different place, each of the participants is also in a different
place in their journey toward becoming data literate school leaders. Most of the
participants placed value on data driven decision making and seemed eager to learn how
to effectively implement data driven decision making. However, the lack of resources
provided to two of the participants in one district was a challenging obstacle. As
mentioned earlier, these two participants were unable to move forward in data driven
decision making due to a lack of technology tools.
Participants also moved forward with an eye on collaboration. Participants were already embracing collaboration at the school site by discussing data in data teams, leadership teams, and teacher-led grade level groups and departments.
Encouraged by their successes, participants were eager to increase collaboration to
include other school leaders at different schools and different districts. Many participants
felt this was the venue to help increase their data literacy.
Participants also discussed what components of data literacy they needed the most
support in. Participants candidly reported that the reality in their school sites permitted
little to no opportunity to discuss data one-on-one with teachers. Rather, administrators had to become proficient at delegating discussion to others as well as reporting data to
large audiences. Following this understanding, it was clear why many participants shared
a need to learn more about how to communicate about data to foster motivation with
teachers.
Due to the high education level of the participants, as well as the effects of No Child Left Behind, it seems that the participants had come a long way in developing their data literacy. Yet, every participant clearly understood that there was still more to learn
to increase data literacy and successfully implement data driven decision making. The
relation of the study findings to existing research and the implications for policy and
practice will be discussed in the next chapter.
CHAPTER FIVE
Summary and Implications
The era of accountability in education has arrived and the time in which educators
made decisions based upon instincts alone has ended. Educators today must acquire new
knowledge and skills to successfully lead schools in data driven decision making.
Educational “leaders are aware that they need to become experts at moving from data to
information to knowledge but they are not at all confident that they know enough to be
able to do it well” (Earl & Fullan, 2003, p. 389). To use data successfully, educators must develop data literacy. Earl and Katz (2002) define a data literate leader as being able
to (1) Think about purposes, (2) Recognize sound and unsound data, (3) Possess
knowledge about statistical and measurement concepts, (4) Make interpretation
paramount, and (5) Pay attention to reporting and to audiences.
Educators often lack training and practice in these data literacy skills. In the push to use data to guide instruction, it is often forgotten that school leaders must first be able to clearly understand, analyze, and interpret the data being used. Without data literacy skills to set a foundation, data driven decision making cannot be successful. The push to use data creates an urgent need for school leaders to increase their data literacy.
This dissertation aimed to gain a deeper understanding of how to increase school
leaders’ data literacy and build their capacity for data driven decision making. The goal
is to add to the growing body of research surrounding data driven decision making.
Specifically, this study aimed to provide promising practices for increasing data literacy
in school leaders who implement data driven decision making.
This qualitative study used interview data and document analysis to answer the
following research questions:
(1) How do administrators increase their data literacy to build capacity for data
driven decision making?
This question is supported by the following sub-questions.
(2) How data literate do administrators perceive themselves to be?
(3) What supports and resources are in place to increase administrators’ data
literacy?
(4) How do administrators use their data literacy to influence DDDM at their
school sites?
Twelve administrators from traditional public schools participated in this
qualitative study. Their experiences and the documents they shared provided the data
that were analyzed for trends and patterns.
Data were first analyzed using Earl and Katz’s (2002) five components of data
literacy. In terms of the first component, purpose, the participants demonstrated their
commitment to setting goals and providing a direction for their school. When it came to
identifying quality data, participants again demonstrated the amount of thought they had
given to the quality of the data they used. Often participants did not use data they felt
was not of high quality. When it came to interpretation, participants again showed their
commitment to this component of data literacy. Participants dedicated time to understanding their interpretations and chose to work with others to make interpretations.
The findings varied more regarding the participants’ levels of statistical analysis. Some participants had high levels of statistical understanding while others had a minimal statistical background. The consistency was in the participants’ dependence on having others analyze their data for them. Communication was a component where
participants shared their struggles. The need to communicate different types of data to
different stakeholders kept communication about data a constant challenge. There is
always a need to seek new ideas to encourage engagement of the audience and help
audiences to understand the large amount of data available in education.
The data literacy skills that participants had were acquired from a variety of
sources. Participants often spoke of finding these resources through their own efforts. They frequently learned about data literacy independently because district-sponsored professional development on data literacy was limited. Participants engaged in
professional organizations and sought graduate degree programs. It was through these
programs that participants advanced their data literacy.
Other data literacy support came in the form of personnel at the district and school
site. Well-developed protocols also helped some schools to advance their data literacy. Most important to setting a foundation for building data literacy was the availability of a computer program. Participants reported on the value of their computer programs, while those without technical resources described needing them in order to move forward with data driven decision making.
Participants’ data literacy skills and their resources created capacity for
implementation of data driven decision making at their school sites. While all
participants described the use of student achievement data to guide instructional decisions,
each participant tailored their process to fit the needs of their individual school. The
findings evidenced the importance of data literacy skills for administrators to bring their
school further in the process of data driven decision making.
Administrators needed to create systems for data driven decision making that allowed their staff more success in using data. Findings showed that administrators rarely
discussed data one-on-one with teachers. This occurred in response to time limitations as
well as union concerns. This left administrators focused on seeing the big picture and
facilitating systems for data driven decision making.
To help school leaders build capacity for data driven decision making, school
leaders need to increase their data literacy skills. Specifically, participants shared their
need to develop their communication skills. Having to engage teachers, parents, students,
and other stakeholders in data and motivate them to use data requires data literacy
communication skills. This was the component identified as both the greatest challenge
and the area in which more training was needed.
To succeed in gaining data literacy skills, participants voiced the need to have
more time spent on data literacy development. Participants discussed the need to have
time that was specifically dedicated to data to increase the commitment to learning about
data. Also, participants expressed the need to learn through collaboration. Participants
wanted to exchange ideas with other colleagues as well as observe successful models to
gain insight.
In the following section, I will discuss the connections between these findings and
the extant literature. Next, implications for policy and practice will be discussed.
Recommendations for further research will follow.
Connections to Prior Research
The goal of this dissertation is to add to the growing body of literature on data driven decision making by focusing on data literacy and its role in building capacity for data driven decision making. It is necessary to see how these findings both validate and depart from the existing literature on data literacy and data driven decision making.
First, it is important to remember that the need for data literacy arises from its connection with data driven decision making. Literature from Datnow, Park, and Wohlstetter (2007) and Mandinach, Honey, and Light (2006) presents data driven
decision making frameworks that demonstrate the connection between data literacy and
data driven decision making. Furthermore, Datnow, Park, and Kennedy (2008) evidence
the need for data driven decision making to help educators understand student learning
needs and gauge instruction accordingly. When educators increase their data literacy, they build capacity to implement data driven decision making successfully, leading to increased student learning.
Earl and Fullan’s (2003) research brings together several studies to provide evidence that school leaders have anxiety about using data. School leaders showed insecurities regarding their own skills and knowledge in understanding data. Earl
and Fullan’s research also provided insight into England’s schools where school leaders
were more comfortable with using data. Even these school leaders who felt more
comfortable with data still reported a need to increase their data literacy. This need to
increase data literacy in educators is supported by many other studies (Ikemoto & Marsh,
2007; Choppin, 2002; Dembosky, Pane, Barney, & Christina, 2005; Feldman & Tung, 2001; Mason, 2002; Fusarelli, 2008; Gordon, 2007).
The participants ranged in their comfort level with data. Some participants
echoed Earl and Fullan’s (2003) research, speaking candidly about their insecurities regarding data. However, as school leaders pursue professional opportunities and practice with data, they describe more comfort with data. Some participants even reported a high level of comfort with data. As in the previous literature, all school leaders want to increase their data literacy, and even those with high comfort levels describe a willingness to learn more.
The two schools in this study that showed regular use of data driven decision
making and engagement in data had the two participants who were most comfortable
with data use. It can be suggested that the data literacy of these individuals is what
influences their schools to have commitment to data driven decision making. Just as the
literature described the potential influence a data literate leader can have on the school
(Marsh, Pane, & Hamilton, 2006; Detert, Kopel, Mauriel, & Jenni, 2000; Mason, 2002; Lachat & Smith, 2005; Mieles & Foley, 2005; Ikemoto & Marsh, 2007; Feldman & Tung, 2001; Choppin, 2002; Fusarelli, 2008), two participants demonstrated that influence.
Despite the need for increased data literacy, research has shown that professional development opportunities are limited (Mandinach, Honey, & Light, 2006; Popham, 1999; Stiggins, 1999, 2006; Crooks, 1988; Shepard, Hammerness, Darling-Hammond, Rust, Snowden, Gordon, Gutierrez, & Pacheco, 2005). The professional development
opportunities that are available often leave educators still seeking more training. Marsh,
Pane, and Hamilton (2006) researched existing professional development on data
literacy and found that the professional development available was not helpful. Datnow
et al. (2007) and Burgess (2006) researched data literacy professional development that
educators found helpful, yet both studies found that educators still felt inadequately
trained and needed more training in data literacy.
The findings of this study again validate the existing literature. Only one third of
participants reported any professional development offerings. When asked, these
participants described mostly technical training. Even after the professional
development, participants still desired further training on data literacy. It seems that the
situation has not deviated from the previous research. There is little professional
development for data literacy and school leaders continue to need more training in data
literacy.
This study did not add to the literature on successful district data literacy
professional development. Most of the respondents reported not being provided with
access to a professional development program. However, participants reported gaining
knowledge and skills through graduate school classes and professional organizations. Much like in the studies described above, the participants reported increases in knowledge and skills for data literacy when exposed to effective training. It seems that while the
participants did not report helpful district professional development, there is still evidence
that proper training builds data driven decision making capacity.
Research also shows the positive effect of technology and qualified personnel in
supporting data literacy. Lachat et al. (2006) studied five schools over five years and
describe the importance of technology in supporting data use. Technology that can
disaggregate data is needed at school sites. Disaggregated data allows educators to uncover emerging trends and patterns (Lachat et al., 2006; Holcomb, 1999; Love, 2004; Datnow et al., 2007; Johnson, 2002). Mandinach et al.’s (2006) two-year study at six different schools underscores the importance of technology tools.
The findings of this dissertation also support the importance of technology in data use. Nine of twelve participants relied heavily on technology tools to use data, describing the need for disaggregation of data and computer-generated reports. The two
participants who did not have access to technology tools described the hindrance it
caused and reiterated the desire to have technology tools. It seems that access to technology tools is becoming a basic foundation for data driven decision making.
Research also suggests the importance of working in teams to build data literacy
(Gordon, 2007; Ikemoto & Marsh, 2007; Chen, Heritage, & Lee, 2005; Holcomb, 2001; Keeney, 1998; Lachat & Smith, 2005; Symonds, 2003). Lachat, Williams, and Smith (2006) specifically refer to the need to have a data team or a data coach to facilitate data
driven decision making. Datnow et al. (2007) also found that data coaches were utilized
at high data driven decision making schools. Data coaches and data teams can facilitate
relationships to support data driven decision making. Another tool that is supportive in
increasing data literacy is the use of protocols and rubrics (Johnson, 2002).
Participants in this study described the use of data teams to move the school
further in the implementation of data driven decision making. The two participants who
had the highest level of data use used data teams to gain perspective in interpreting data.
They also used protocols to guide their discussions and focus conversation. One
participant even functioned as the data coach for his school. The schools implementing
data driven decision making most effectively depended on administrators with high data literacy who implemented data teams and protocols.
Researchers have different suggestions for how to increase school leader data
literacy. Mandinach et al. (2006) suggest the need for training in both technology tools
and data literacy separately. Ikemoto and Marsh (2007) suggest that professional
development would need to cover data analysis skills such as posing questions, collecting
data, and determining appropriate action. Earl and Katz (2002) also suggest these skills
but add the need to give opportunity for practice and reflection to gain data literacy skills
and knowledge.
While the findings of this dissertation agree with Earl and Katz’s (2002) suggestion for practice and reflection, the findings begin to deviate from the other research. The participants describe data literacy training as being too focused on technical skills. The findings here show that school leaders need training on communicating about
data. The challenge of reporting different types of data to different audiences is a
dynamic problem that school leaders continually face. Also, participants need to increase
their knowledge and skills in creating systems and protocols that will support data driven
decision making.
The reasons for these deviations may be due to the time that has elapsed since the
implementation of NCLB. Over this time period, administrators have had the opportunity
to increase their data literacy skills, and now they do not need as much training in technical skills and certain components of data literacy. Another reason is the evidence that school leaders are not isolated in their use of data. They often rely on others to help analyze, interpret, and make decisions regarding data and thus require less training. However,
when it comes to communication of data, principals are often asked to speak about the
data for their school. Thus, principals find they must rely on themselves to be effective
communicators about data.
School leaders have also found communication with large groups, such as faculty or parent meetings, to be their biggest challenge. Due to time and union constraints, there is little opportunity to meet with teachers individually regarding data. Often school leaders are faced with the challenge of communicating a message to a large group and then creating systems that will facilitate data driven decision making. As schools move toward data driven decision making, school leaders have found their role to be evolving and therefore in need of different data literacy skills.
Implications for Policy and Practice
The findings of this study yield numerous implications for policy and practice at
multiple levels of the educational system.
1. While school leaders exhibit data literacy, more needs to be done to increase
school leaders’ data literacy.
There needs to be an increase in data literacy training available to educators. School
leaders still need to learn more about data literacy and data driven decision making.
Districts need to offer training on data literacy throughout the year. Universities and
professional organizations need to offer more opportunities in data literacy. Not only
do learning opportunities need to be created, but opportunities for practice and reflection also need to be provided for school leaders.
2. Capacity needs to be built to support successful data driven decision making
in all educators.
The findings from this study provide evidence to suggest that data driven decision
making will not be done successfully in isolation. That being the case, data literacy
needs to be encouraged in all members of the staff. It is not sufficient to have a data
literate school leader. School leaders need to provide opportunities to train their staff
in data literacy skills. This can be done through district sponsored training or
professional organizations.
3. When increasing data literacy, collaboration needs to be fostered and
encouraged.
The use of data in schools is a dynamic process that is unique to the needs of the
school. Leaders need a forum to share challenges and ideas. Time should be
provided at district level meetings for administrators to discuss their experiences with
data. Professional organizations and universities should create forums to discuss data.
The collaboration and conversation will allow school leaders to develop data driven
decision making systems targeted to meet their school’s needs. The relationships
developed will also create more support for data driven decision making.
4. Data literacy professional development needs to focus on developing
communication skills.
School leaders need to practice and develop their skills for communicating data to all
stakeholders. Professional development needs to help school leaders to move from
data to discussion with their stakeholders. School leaders need to be given ideas and
models for communicating effectively about data. Also, opportunities to practice and
reflect need to be provided.
5. Technology structures must be in place for school leaders to increase their
data literacy.
Technology is essential to enable schools to use data effectively. Educators need to
be trained in using the technology to its fullest potential. Schools without technology
to support data driven decision making need to be provided with technology resources. The technology needs to be capable of disaggregating data and providing user-friendly reports.
Recommendations for Future Research
This study was conducted on a small scale with limitations in the sample
population. One limitation of this study was the small sample size. I would recommend
conducting this study on a larger scale to increase its validity. An increase in participants would allow a better understanding of data literacy capacities and needs.
Not only would I recommend having more participants, but also a greater representation
in the number of schools and districts. Also, there needs to be more than one participant
from each school site. This can help to validate the school leader’s level of data literacy
and data use at that school.
I would also recommend finding a larger range in the participants and their
schools. This study focused mainly on schools that were not designated as in “program
improvement” under NCLB guidelines. Also, the participants were heavily skewed
toward having Ed.D. degrees and were mostly new to the principal’s chair. This study needs to be conducted with participants who range in their level of education and their tenure as principal. Participants should also represent schools that are in program improvement. These
variances can help to create a more complete picture of administrators’ data literacy
needs.
These future studies should continue to be conducted using qualitative methods.
In trying to gain a deeper understanding of how to increase knowledge in individuals,
qualitative data provides a view into the individual’s perspective and offers an
opportunity to explore the best ways of providing support and building capacity.
There is also a need for further research into the impact of a principal’s leadership
and data literacy skills on student achievement. While it is assumed that data driven
decision making will lead to improved instruction and thus increased student
achievement, evidence needs to be found to support this assumption. It is necessary to
see how students and teachers are impacted by a school leader’s data literacy. It is
important to learn more about the actual impact a school leader is making in instruction
and student achievement over time.
Lastly, I would recommend more investigation into the role of district, state, and union regulations on data use and how they may affect the work at school sites. It came up more than once that participants felt limited in exercising their data literacy skills due to a rule or mandate. It is necessary to learn more about how much data driven
decision making would occur without state and district regulation for data use and testing.
A better understanding of how to best work with requirements from the state and district
could benefit other school leaders. Also, the effect of the union regulations that prevent
administrators from speaking with teachers about data in some districts is another factor.
More research needs to be done to understand the effects these regulations may or may
not have on data driven decision making.
Conclusion
In this era of accountability, data must be used to validate decisions, monitor
programs, and evaluate results. Using data successfully has become a necessary and
valuable skill for school leaders and educators. It should not be assumed that school leaders already possess these skills. Rather, data literacy knowledge and skills need to be developed and practiced. There is an urgent need to build capacity and implement data driven decision making successfully in today’s dynamic educational arena.
This study has shown evidence of the development of data literacy knowledge and
skills in school leaders. Despite this growth, it is still apparent that more training in data
literacy is needed to help school leaders build capacity to implement data driven decision
making successfully. This training needs to provide opportunities for school leaders to
learn how to create systems for data driven decision making. Also, participants want to
learn more about how to motivate and engage teachers in analyzing and using data.
While there seems to be a movement toward data driven decision making in schools,
there remains a gap in reaching the potential of using data to guide instructional decisions.
To fill this gap, school leaders need to make a commitment to data driven decision
making. More opportunities to learn about data use and to practice data driven decision
making need to be provided to school leaders. Technology and personnel resources need
to be available to support the efforts of school leaders. A focus on the skills needed for
effective communication of data with all stakeholders is greatly needed. School leaders
must engage other educators in the work of data driven decision making and foster their
knowledge and skills.
Data driven decision making is not conducted in isolation, and school leaders
must encourage other educators to build capacity. It is also important to note that
increases in data literacy skills will develop over time. As school leaders develop their
data literacy, they must also be mindful of how other educators have also developed their
own skills. Educators must work together to support one another’s data literacy to
develop needed skills.
This study has focused on and increased the understanding of data literacy
capacities and needs for school leaders. However, data literacy is only one of the many
pieces needed to successfully implement data driven decision making. As data alone
does not provide answers, data literacy alone does not provide data driven decision
making. Data literacy is one piece of a complete set of skills that must be developed for
data driven decision making. Only when data driven decision making is implemented
successfully can the final goal of improved instruction and increased student
achievement be reached.
REFERENCES
Alwin, L. (2002). The will and the way of data use. School Administrator, 59(11), 11.
Burgess, L. (2006). Data 101: Going back to school. Principal Leadership, 7(2), 22-25.
Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change.
Psychological Review, 84, 191-215.
Chen, E., Heritage, M., & Lee, J. (2005). Identifying and monitoring students’ learning needs with technology. Journal of Education for Students Placed at Risk, 10(3), 309-332.
Choppin, J. (2002). Data use in practice: Examples from the school level. Paper
presented at the annual meeting of the American Educational Research
Association, New Orleans, LA.
Crooks, T.J. (1988). The impact of classroom evaluations on students. Review of
Educational Research, 58(4), 438-481.
Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high
performing districts use data to improve instruction for elementary school
students. Los Angeles, CA: Center on Educational Governance, USC Rossier
School of Education.
http://www.usc.edu/dept/education/cagov/reform_publications.html#data
Datnow, A., Park, V., & Kennedy, B. (2008). Acting on data: How urban high schools use
data to improve instruction. Los Angeles, CA: Center on Educational Governance,
USC Rossier School of Education.
Dembosky, J.W., Pane, J.F., Barney, H., & Christina, R. (2005). Data driven decision making in southwestern Pennsylvania school districts. Santa Monica, CA: RAND.
Detert, J.R., Kopel, M.E.B., Mauriel, J.J., & Jenni, R.W. (2000). Quality management in
U.S. High Schools: Evidence from the field. Journal of School Leadership, 10,
158-187.
Diamond, J.B. & Spillane, J.P. (2004). High-stakes accountability in urban elementary
schools: Challenging or reproducing inequality? Teachers College Record, 106(6),
1145-1176.
Doyle, D.P. (2003). Data-driven decision making: Is it the mantra of the month or does it
have staying power? T.H.E. Journal, 30 (10), 19-21.
Earl, L., & Fullan, M. (2003). Using data in leadership for learning. Cambridge Journal
of Education, 33(3), 383-394. Retrieved November 4, 2007 from
http://education.vermont.gov/new/pdfdoc/pgm_assessment/summit/research_reso
urces/using_data_leadership_learning.pdf
Earl, L., & Katz, S. (2002). Leading schools in a data rich world. In K. Leithwood, P. Hallinger, G. Furman, P. Gronn, J. MacBeath, B. Mulford, & K. Riley (Eds.), The
second international handbook of educational leadership and administration.
Dordrecht, Netherlands: Kluwer.
Feldman, J. & Tung, R. (2001). Whole school reform: How schools use the data-based
inquiry and decision making process. Paper presented at the annual meeting of
the American Educational Research Association, Seattle, WA.
Fusarelli, L.D. (2008). Flying (Partially) Blind: School Leaders’ Use of Research In
Decision Making. Phi Delta Kappan. 89(5), 365-369.
Gordon, J. (2007). Developing Data Literacy. Outlook, 3(4), 9-10.
Hallinger, P., & Heck, R. (1996). Reassessing the principal’s role in school effectiveness:
A review of empirical research, 1980-1995. Educational Administration
Quarterly, 32, 5-44
Holcomb, E. L. (2001). Asking the right questions: Techniques for collaboration and school change (2nd ed.). Thousand Oaks, CA: Corwin.
Heritage, M., & Yeagley, R. (2005). Data use and school improvement: Challenges and
prospects. In J.L. Herman & E. Haertel (Eds.), Uses and misuses of data for
educational accountability and improvement (National Society for the Study of
Education Yearbook, Vol. 104, Issue 2, pp. 320-339 ). Chicago: National Society
for the Study of Education. Retrieved April 26, 2006 from
http://www.blackwell-synergy.com/doi/pdf/10.1111/j.1744-7984.2005.00035.x
Ingram, D., Louis, K.S., Schroeder, R.G. (2004). Accountability policies and teacher
decision making: Barriers to the use of data to improve practice. Teachers College
Record, 106(6), 1258-1287.
Ikemoto, G. S., & Marsh, J. A. (2007). Cutting through the “data driven” mantra:
Different conceptions of data-driven decision making. In P.A. Moss (Ed.),
Evidence and decision making (National Society for the Study of Education
Yearbook, Vol. 106, Issue 1, pp.105-131). Chicago: National Society for the
Study of Education.
Johnson, J.H. (1999). Educators as researchers. Schools in the Middle, 9(1), 38-41.
Johnson, J.H. (2000). Data-driven school improvement. Journal of School Improvement, 1(1). Retrieved from http://www.ncacasi.org/jsi/2000v1i1/data_driven.
Johnson, R. (2002). Using data to close the achievement gap: How to measure equity in
our schools. Thousand Oaks, CA: Corwin.
Keeney, L. (1998). Using data for school improvement: Report on the second
practitioners conference for Annenberg Challenge sites, Houston, May, 1998.
Providence, RI: Annenberg Institute for School Reform
Lachat, M.A., Williams, M., & Smith, S. (2006). Making Sense of ALL Your Data. Principal Leadership, 7(2), 16-24.
Lachat, M.A., & Smith, S. (2005). Practices that support data use in urban high schools.
Journal of Education for Students Placed at Risk, 10(3), 333-349.
Lafee, S. (2002). Data-driven districts. School Administrator, 59(11), 6-15
Leithwood, K. & Jantzi, D. (2008). Linking Leadership to Student Learning: The
Contributions of Leader Efficacy. Educational Administration Quarterly, 44, 496-
530
Leithwood, K. & Jantzi, D. (2005). A review of transformational school literature
research 1996-2005. Paper presented at the annual meeting of the American
Educational Research Association, Montreal, QC
Love, N. (2004). Taking data to new depths. Journal of Staff Development. 25(4), 22-26.
Mandinach, E. B., Honey, M., & Light, D. (2006). A theoretical framework for data-
driven decision making. Paper presented at the annual meeting of the American
Educational Research Association, San Francisco, CA. Retrieved April 6, 2007
from
http://cct.edc.org/admin/publications/speeches/DataFrame_AERA06.pdf
Mandinach, E., Rivas, L., Light, D., Heinze, C., & Light, M. (2006). The Impact of Data-
Driven Decision Making Tools on Educational Practice: A Systems Analysis of Six
School Districts. Paper presented at the annual meeting of the American
Educational Research Association, San Francisco, CA. Retrieved March 8, 2008
from
http://cct.edc.org/admin/publications/speeches/Data_AERA06.pdf
Mason, S. (2002). Turning data into knowledge: Lessons from six Milwaukee public
schools. A paper presented at the annual conference of AERA, New Orleans,
April 2002.
Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision
making in education. RAND Corporation. Retrieved December 14, 2006 from
http://www.rand.org/pubs/occasional_papers/2006/RAND_OP170.pdf
McIntire, T. (2002). The administrator’s guide to data-driven decision making.
Technology and Learning, 22(11), 18-33.
Merriam, S. B. (1998). Qualitative research and case study applications in education.
San Francisco, CA: Jossey-Bass Publishers.
Mieles, T., & Foley, E. (2005). From data to decisions: Lessons from school districts
using data warehousing. Providence, R.I.: Annenberg Institute for School Reform
at Brown University.
Patton, M.Q. (2002). Qualitative Research and Evaluation Methods. Thousand Oaks, CA:
Sage Publications, Inc.
Petrides, L., & Nodine, T. (2005). Anatomy of school system improvement: Performance-
driven practices in urban school districts. San Francisco, CA: Institute for the
Study of Knowledge Management in Education and NewSchools Venture Fund.
Popham, W. J., (1999). Why standararized tests don’t measure educational quality.
Educational Leadership, 56(6), 8-15.
Shepard, L., Hammerness, K., Darling-Hammond, L., Rust, F., Snowden, J.B., Gordon,
E., Gutierrez, C., & Pacheco, J. (2005) Chapter 8 in Darling-Hammond, L. &
Bransford, J. (Eds) Preparing teachers for a changing world: What teachers
should know and be able to do. San Francisco, CA: Jossey-Bass (note particularly
p. 282-284)
Stiggins, R. (2006). Balanced assessment systems: Redefining excellence in assessment.
Princeton, NJ: Educational Testing Service.
100
Stiggins, R. (1999). Evaluating classroom assessment training in teacher education.
Educational Measurement: Issues and Practice, 18(1), 23-27.
Symonds, K. W. (2003). After the test: How Schools are using data to colose the
achievement gap. San Francisco: Bay Area School Reform Collaborative.
Wohlstetter, P., Van Kirk, A. N., Robertson, P.J., & Mohrman, S.A. (1997). Organizing
for successful school-based management. Alexandria, VA: Association for
Supervision and Curriculum Development.
APPENDIX A
Data Literacy Study:
Administrator Interview Protocol
Participant’s Name: __________________________ Date: __________________
Position: _______________________________________________________________
[Introduction: Spend a few minutes explaining the study, who you are, and its purpose.
Explain that while the interview will be recorded, their responses are strictly
confidential. Let them know that if there is something they would like to say off the
record, they can inform you and the recorder will be turned off for that comment. Also,
let them know the approximate length of the interview and ask if they have any specific
questions before beginning.]
I. Background: Laying the Foundation
Before I ask you specific questions about data use, I will start by asking you some
general questions about your school, its surrounding neighborhood, and the district in
order to gain a broader understanding of the context in which you work.
1. Could you tell me a bit about the history of your school and district, focusing on
the last five years (e.g., particular reform initiatives, strong partnerships with
external groups, major structural changes, etc.)?
2. Could you tell me a little about the students and community that you serve?
3. How long have you been at this school? What is your prior experience and
training?
II. Demonstrating Data Literacy and Data Use (Research Questions 2 and 4)
1. Does your school/system have performance goals? Do teachers establish their
own goals for their classes? How were these goals established? How do you
know when the goals have been met? Probe: benchmarks, indicators. How
frequently do you re-assess the goals?
2. How do you use data in your role as a leader? Can you share an example with me?
When looking at data how do you determine your purpose? Is this your choice or
is it district mandated?
3. Can you show me a sample data report that would be regularly given to teachers?
(If not, request a sample report for a follow-up meeting and/or discuss why reports
are not available to participants at the school site.)
4. If I were a teacher at your school, how would you share the data with me?
[Encourage a role play.]
5. What kinds of data do administrators have access to and collect? (e.g.,
state/district assessments; school, classroom, standardized, teacher-created, etc.)
6. What types of data do you find most useful when making decisions about
instruction and curriculum?
7. Can you provide a recent example of when your school used information about student
performance to make decisions about instructional programs? Professional
development? School organization and staffing? School budget?
8. How do you recognize quality data? Can you share an example with me?
9. How do you communicate your data to staff, students, parents, and community?
Can you share an example with me?
10. What challenges have you run into trying to use data for decision making? How
have you dealt with these challenges?
11. Overall, do you think your school has a culture of data-driven decision making?
How would visitors know that this was a data-driven school? What would they
see and hear?
III. Supports and Resources for Data Literacy (Research Question 3)
1. How do administrators in your district use data? Can you provide a specific
example?
2. What are the expectations of the district/system about how schools should use
data? To what extent is the culture of data use prevalent in your district/system?
3. What types of support are most important in helping administrators look at and
use data for instructional improvement?
4. At the district/system level, is there a person or team of people who support your
school in using performance data to improve decision-making? What are their
roles and responsibilities?
5. Has your district/system sponsored professional development for principals and/or
teachers that focuses on using data to make decisions? In what specific areas? Is
the professional development voluntary or mandatory?
6. What has been most useful about this professional development? What has been
the least useful? How can the professional development be tailored to meet your
needs?
7. Where else have you acquired data literacy?
8. What tools and technology do you use to analyze data? How effective are these
tools at making the data more accessible? To what extent can these tools
disaggregate the data? Are there tools you would like to have access to but do not?
IV. Sustainability (Research Question 1)
1. Reflecting upon the data literacy support in your district:
What works?
What doesn’t? Why?
What do you wish you knew that you cannot find out today?
What do you wish you were able to do (if you had the information, tools,
practices, decision-making authority, etc.)?
2. What further types of training/support do you think will be needed to help
administrators use data effectively?
[Concluding Remarks/Questions: Is there anything else we should know? Thank them
for their cooperation and time.]
Research Questions
Cross-references of interview questions with the four research questions:
(1) How do administrators increase their data literacy to build capacity for data driven
decision making?
(2) How data literate do administrators perceive themselves to be?
(3) What supports and resources are in place to increase administrators' data literacy?
(4) How do administrators use their data literacy to influence DDDM at their school sites?
Document Request:
• Sample Data
• School schedule (meetings, etc.)
• Meeting minutes
• Professional development agendas and calendar
• Samples of work to demonstrate data-driven decision making
• Samples of work to demonstrate data literacy
• Sample reports from the district and school
• Sample action plans based on data
• Data-driven decision making protocols
• Anything else you think would be helpful for us to look at to get a sense of how
you use data?
APPENDIX B
List of Codes
The Master Code List for this Study is:
1 Increasing Data Literacy
1.1 Time
1.2 Collaboration
1.3 Technology
1.4 Other
1.5 Motivation
2 Self Perception of Data Literacy
2.1 Purpose
2.2 Quality
2.3 Statistics
2.4 Interpretation
2.5 Communication
3 Current Data Literacy Supports and Structures
3.1 Support Professional Development
3.2 Support Other
3.3 Structure Technology
3.4 Data Personnel District
3.5 Data Personnel School
3.6 Data Protocol and Rubrics
4 Data Use at School Sites
ABSTRACT
The implementation of No Child Left Behind has created a push to use data-driven decision making at all school sites. Often this push is based upon the assumption that a foundation of data literacy has already been built. This study questions that assumption and aims to examine the data literacy capacities and needs of school leaders during this era of accountability.