A TEACHER’S USE OF DATA TO SUPPORT CLASSROOM INSTRUCTION IN AN
URBAN ELEMENTARY SCHOOL
by
Genesis Guadalupe Aguirre
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
December 2022
Copyright 2022 Genesis Guadalupe Aguirre
Dedication
To my grandmother, Elvia, mother, Guadalupe, my stepfather, Thomas, and my siblings Hugo
and Alexia. Thank you for encouraging me every step of the way, even when I doubted myself.
To my aunts, Sylvia and Patty and cousins, Emily, Vianny, Marissa, and Bianca, you all gave me
the strength to keep pushing through this journey and reminded me to persevere.
To my partner, Milo, who stood beside me throughout my dissertation writing. I cannot thank
you enough for the love, support, and patience you provided during early mornings of writing,
holidays with a computer at hand, and meetings with my dissertation chair. To my best friend,
Alyson, thank you for keeping me on my toes and believing I could complete this journey when I
thought I could not. And to my dearest, loving dog, Honey, who has been with me throughout my
graduate studies. Thank you for keeping me calm when I felt the most anxious through
challenging moments as I worked tirelessly in this dissertation endeavor.
To my Esperanza community of teachers who provided continuous words of encouragement and
made me feel strong when I felt the weakest.
And, to all the members of my TEMS cohort. Together, we were able to share ideas and cheer
for one another as we all finished this journey at different times. Your positive affirmations and
community provided strength to endure the challenges of completing a dissertation.
Acknowledgements
This dissertation was made possible by the uplifting support and encouragement of my committee.
Dr. Julie M. Slayton, my dissertation chair. Mil gracias for the guidance and brilliance
throughout this dissertation journey. Thank you for always providing the support and expertise
both in class and through my dissertation. I appreciate you helping me see beyond what I thought
I knew. Your experience and depth of knowledge helped me understand and complete a study
that focused on something I truly believe in.
To Dr. Samkian, thank you for your support and expertise in helping me understand the methods
I would take to conduct my study. I am grateful for all the learning that I experienced as a
student in your methods course and feel very humbled and honored to build my capacity as a
novice researcher.
Dr. Ephraim, thank you for your expertise and feedback in helping me understand how to
develop this area of inquiry within the field of education. Your time and support were very
helpful in completing this dissertation.
To the teacher who allowed me to follow her throughout a 3-month period. Thank you for
allowing me to step into your classroom and school environment to help me understand how
teachers use data to inform classroom instruction. This dissertation would not be possible
without YOU!
Table of Contents
Dedication………………………………………………………………………………………iv
Acknowledgements………………………………………………………………………………v
List of Figures……………………………………………………………………………………vi
Abstract…………………………………………………………………………………………ix
Chapter 1: Overview of Study……………………………………………………………………1
Background of the Problem………………………………………………………………1
Statement of the Problem…………………………………………………………………3
Purpose of the Study………………………………………………………………………4
Significance of the Study…………………………………………………………………5
Chapter 2: Literature Review…………………………………………………………………..….7
Leadership………………………………………………………………………………….8
Learning Communities……………………………………………………………………30
Teacher Beliefs On Data Use……………………………………………………………..69
Conceptual Framework…………………………………………………………………...79
Chapter 3: Methods………………………………………………………………………………84
Research Design…………………………………………………………………………..84
Sample and Population……………………………………………………………………85
Site and Participation Selection…………………………………………………………..87
Participant Selection……………………………………………………………………...87
Data Collection and Instruments/Protocols………………………………………………87
Documents and Artifacts………………………………………………………………….90
Data Analysis……………………………………………………………………………..91
Limitations and Delimitations…………………………………………………………….95
Credibility and Trustworthiness…………………………………………………………..96
Ethics……………………………………………………………………………………...97
Chapter 4: Findings………………………………………………………………………………99
Finding 1………………………………………………………………………………...102
Finding 2………………………………………………………………………………...107
Finding 3………………………………………………………………………………...117
Conclusion of Findings………………………………………………………………….126
Revised Conceptual Framework………………………………………………………...126
Conclusion……………………………………………………………………………….130
Chapter 5: Executive Summary, Implications, and Future Research…………………………..131
Summary of Findings……………………………………………………………………132
Implications and Recommendations…………………………………………………….133
References………………………………………………………………………………………137
Appendices……………………………………………………………………………………...140
Appendix A: Screener………..…………………………………………………………...140
Appendix B: Observation Tool…..……………………………………………………….142
Appendix C: Teacher Interview…………………………………………………………..144
List of Figures
Figure 1: Conceptual Framework………………………………………………………………..80
Figure 2: Revised Conceptual Framework……………………………………………………..128
Abstract
Schools generate and collect different forms of data to identify and understand student
performance. Data is a tool for teachers to make sense of information and make instructional
shifts to support the learners in their classrooms. The context within which data is used matters.
To understand how a teacher was able to use data within her context, this study explored and
addressed the following questions: How does one elementary school teacher use data in
collaboration and professional development meetings? What role does leadership play in the way
that an elementary school teacher uses data in collaboration and professional development
meetings and in her classroom instruction? How are those data use activities reflected in her
instruction in her classroom? This qualitative single-case study examined the interactions
between a teacher, a Math TOSA and her grade level team during professional development and
collaboration meetings. The study was conducted at a K-5 public elementary school in Los
Angeles County with a population of 35% English Learners (ELs) and 70% socially
disadvantaged students. The data for this case study included observations, a teacher interview,
and reflective memos. The findings from this study revealed that the teacher was able to use data
in general ways; however, it was evident that the Math TOSA and Teacher Leader constrained
opportunities to examine and interrogate the teacher’s practice. To support teachers with tools to
use data effectively to inform their instruction, leadership needs to provide structures to analyze
and examine teacher practice.
Chapter 1: Overview of the Study
Schools generate and collect different forms of data to identify students’ performance in
various academic areas (Datnow et al., 2013; Farrell, 2012; Wayman, 2005). For teachers to
make meaning of data, leaders need to foster a data-driven culture in schools and support
teachers with opportunities to engage in conversation to analyze and make sense of the data
provided (Marsh, 2012; Young, 2006). Leaders can provide facilitation in small learning groups
to support active conversations among teachers to analyze and synthesize the information that
will be used in decision making (Kerr et al., 2006; Young, 2006). Teachers need to value the
importance of data-driven decision making (DDDM) in order to engage in professional development on
using data in their practice (Datnow & Hubbard, 2015). Professional development on DDDM can
support and enhance teachers’ understanding of how to interpret and analyze data and use the
information as actionable knowledge in the classroom (Datnow & Hubbard, 2016; Donnell &
Gutter, 2015; Dunn et al., 2013). The purpose of this study was to understand how elementary
teachers used data in their school to support their instruction in their classroom. In this chapter I
explain the background of the problem to place the study in context. I then provide a statement of
the problem to focus on my study, followed by the purpose of the study, and why it is significant.
Background of the Problem
In 2001, the reauthorization of the Elementary and Secondary Education Act (ESEA) of
1965 went into effect. Known as No Child Left Behind (NCLB), it shifted authority away from
states and districts and towards the federal government. The law mandated that schools report
different forms of data to the federal government for accountability purposes and use data for
instructional improvement purposes (Farrell, 2015). Prior to NCLB, there was no requirement
that schools disaggregate data to demonstrate whether they were effective in meeting the needs
of their students with disabilities, their language learners or their students from historically
marginalized groups. Part of the driver for NCLB was a desire to shine a light on students who
were not afforded high quality learning opportunities and ensure that they would no longer be
hidden in aggregated data (Borkowski & Sneed, 2006). This federal legislation identified failing
schools and created adequate yearly progress (AYP) targets towards state goals for all students to
reach proficiency in reading and mathematics (Dee & Jacob, 2011). In response to this
legislation, state educational agencies, school districts, and other educational entities began to
collect and store various forms of data (Farrell, 2015; Wayman, 2005). Inherent in this effort was
the assumption that the availability of data would inform and initiate change in teaching practice
(Wayman, 2005).
In 2015, the Every Student Succeeds Act (ESSA) became the new educational law in the
United States, replacing NCLB (Mandinach & Gummer, 2016; Sharp, 2016). The ESSA shifted
education authority from the federal government back to state and local education agencies,
where it had been prior to NCLB (Sharp, 2016). The ESSA requires states to set high college and
career ready standards, empowers states to use evidenced-based interventions that foster school
improvement, and encourages states to preserve annual assessments as an informing mechanism
(Sharp, 2016). Data continues to be a form of accountability for schools to show their students’
progress and foster educational growth (Mandinach & Gummer, 2016). The ESSA law expects
all levels of the education system (e.g., states, districts, schools, teachers) to use data (Mandinach
& Gummer, 2016).
The use of data has increased in educational settings and the idea of DDDM has emerged
as an approach to support the accountability measures of ESSA. Mandinach (2015) describes
DDDM as the systematic collection, analysis, and interpretation of data that informs the practice
and policy in educational settings. There are various stakeholders involved in this process
including teachers, administrators, superintendents, and other educational officials (Mandinach,
2015). DDDM has been important in education over the past two decades as a way to increase the
emphasis on rigor (Mandinach, 2015). This approach asks stakeholders to look for patterns of
performance to unveil the strengths and needs of students (Wayman, 2005). Once the patterns are
unveiled, the expectation is that teachers will do something with the information so that they can
improve their students’ access to meaningful learning opportunities, or at least
improve the instructional strategies that they are using with their students.
Accountability systems, both state and federal, have produced conflicting outcomes,
which is why there has been an urgent need for reform (EdSource, 2014). In California, the
accountability system has evolved to develop a more effective system to assess students in
meaningful ways and help drive academic improvement (EdSource, 2014). Many significant
reforms have driven this change, beginning with the Common Core State Standards (CCSS),
adopted by the State Board in August of 2012; the Next Generation Science Standards (NGSS),
adopted in September 2013; and Assembly Bill 484 and Senate Bill 1458 (EdSource, 2014).
These reforms have created an emphasis on how schools and districts are held
accountable for student performance and how students’ academic abilities are being evaluated
(EdSource, 2014). Additionally, the California Assessment of Student Performance and
Progress (CAASPP) has been implemented throughout schools to measure student achievement
using standardized assessments in grades 3 through 8 and grade 11 (EdSource, 2014).
Leadership plays an important role in DDDM because school leaders can support or
impede teachers’ learning about how to use data. A principal’s role is crucial in DDDM because
principals set the tone at the school site and can put structures in place that allow teachers to
engage in discussion about their instruction and whether those discussions carry over into the
classroom (Datnow et al., 2013; Farrell, 2012; Huguet et al., 2014; Young, 2006). In addition,
leadership can create norms and a positive school culture that support teachers in engaging
with and possibly using data in their practice, which can lead to a change in their instruction (Datnow et al.,
2013; Young, 2006).
Statement of the Problem
In the United States, schools and districts have organized teachers to work together in
DDDM to improve educational practice (Datnow et al., 2013; Wayman, 2005). Data consists of
various documents such as state assessments, observations, classroom assessments, and student
work (Young, 2006). Leadership is an important component in DDDM because school leaders
can create or hinder the conditions for teacher learning by creating norms and promoting a
positive data culture environment (Datnow et al., 2013; Farrell, 2012; Huguet et al., 2014).
Principals can provide teachers with professional development to support their understanding of
data use and develop discussion protocols that can support teachers’ conversations on data use
(Datnow et al., 2013; Mandinach & Gummer, 2016; Wayman, 2005). Teachers who currently
use data in their schools have encountered challenges, such as not having the necessary skills to
evaluate and interpret data and the belief that data is not an important piece for instructional
improvement (Dunn et al., 2013a). Data literacy is an important element for teachers as it
demonstrates how information derived from data is then transformed into actionable knowledge
in the classroom (Datnow & Hubbard, 2016; Mandinach & Gummer, 2016). In addition,
providing time and creating structures for teachers to engage in meaningful conversations
regarding data is another challenge that teachers face in schools (Marsh, 2012).
Teachers play a significant role in student learning as they provide daily instruction in the
classroom. However, the way in which teachers collaborate around data has not been well explored.
Moreover, there is not sufficient research that explains how teachers’ discussions in their grade
levels or professional learning communities (PLCs) carry over into the decisions teachers are
making in their instruction and the way those decisions are revealed throughout their interactions
with their students. In an effort to understand these issues, this study explores the connections
between leadership, learning communities, and teacher beliefs about data use.
Purpose of the Study
The purpose of this study was to examine the role leadership played in the way that an
elementary school teacher made sense of and used data to inform her instruction. The following
research questions informed my dissertation study:
1. How does one elementary school teacher use data in collaboration and professional
development meetings?
2. What role does the leadership play in the way that one elementary school teacher uses
data in collaboration and professional development meetings and in her classroom
instruction?
3. How are those data use activities reflected in her classroom instruction?
Significance of the Study
This study was important because it provided an opportunity to examine and understand
how leadership shaped the environment within which teachers were being asked to use data to
inform their instruction in addition to the way that teachers made sense of data in their learning
communities and used it in their approach to instruction. Many schools serving low-income
communities are held accountable for student achievement through data; however, some schools
are noticing limited growth (Young, 2006). The single-case study I conducted of a teacher
provided a rich portrait of how she engaged in collaboration and professional development to
discuss data and how this information then transferred to how she used the data to inform her
instruction. Moreover, this study informed my own instruction as a practicing elementary teacher
and supported my understanding of how to use data to inform my instruction in the classroom. In
addition, this study also supported my current work as I implement professional development
with practicing teachers in elementary school settings.
Organization of the Study
This study was organized in the following way. In chapter two, the literature focused on
three major concepts that frame this study: Leadership, Learning Communities, and Teacher
Beliefs On Data Use. The chapter continued with my conceptual framework, which guided my study. In chapter
chapter three, I described the qualitative methods used and data collection procedures. In chapter
four, I discuss the findings from my study. Chapter five concluded with an executive summary,
implications, and recommendations for future research.
Chapter 2: Literature Review
To gain a deeper understanding of the role of leadership in relation to how teachers use
data in their school setting and the way teachers use that data in relation to their practice, this
study focused on the way leadership and a group of teachers interacted in relation to the use of
data to inform one teacher’s practice in her classroom. Specifically, I answered the following
questions:
1. How does one elementary school teacher use data in collaboration and professional
development meetings?
2. What role does the leadership play in the way that one elementary school teacher uses
data in collaboration and professional development meetings and in her classroom
instruction?
3. How are those data use activities reflected in her classroom instruction?
I drew on three bodies of literature to answer these questions: leadership, learning
communities, and teachers’ beliefs about data use. I looked to leadership because it helped me
understand how different leadership roles in the school environment provided support for
teachers in understanding data. I drew on learning communities because this literature offers an
approach to teacher learning that is intended to provide opportunities for teachers to engage in
discussion with colleagues to analyze and make sense of data. Finally, I discussed teacher beliefs
on data use, as teachers’ beliefs, attitudes, and perspectives about data support or hinder their
approach to using data to inform their instruction. Two areas that I did not explicitly address in
the literature of teacher beliefs are culturally relevant pedagogy and critical reflection. It is
important to note that culturally relevant pedagogy and critical reflections are important as they
help teachers understand their students’ cultural identities and provide teachers opportunities to
make sense of their instructional practice. I first present the literature on leadership then turn to
learning communities, and finally I present the literature on teachers’ beliefs about data use. The
chapter concludes with my original conceptual framework, which guided my approach to my
research design, data collection strategy, and analysis.
Leadership
The literature on leadership was an important component to understand how leaders in
schools created conditions for teachers to understand data (Datnow et al., 2013; Huguet et al.,
2014). Leaders play an important role in schools as they can promote the use of data by creating
time and professional learning opportunities for teachers to collaborate in the schools (Huguet et
al., 2014). Datnow et al. (2013) explained how leadership can create specialized roles within
schools, such as grade-level chairs and instructional coaches, to support building systematic
structures that support teachers in understanding data.
Datnow et al.’s (2013) qualitative case study was conducted in two phases. The first
phase of research focused on examining elementary school systems that were identified as
leaders in supporting schools in data use, whereas the second phase of research
focused on data use at the high school level in order to complement the previous studies in
elementary schools. The goal of this case study research was to expand knowledge about the
processes and outcomes of data driven decision making (DDDM). Purposive sampling was used
to identify and select the research sites. Researchers found locations that were leaders in the use
of data with a diverse population of students in terms of race/ethnicity and social economic
status. To find the sites, internet searches were conducted in addition to searching schools and
districts that were award winners for improved achievement.
For the first phase of the research, there were 25 school districts or charter management
organizations fitting the criteria. After reviewing the district and school sites by speaking with
and interviewing school leaders and experts in the field, the research team narrowed down their
selection. In the second phase of the study, the focus was to find high schools that had
experience with implementation of data use. A total of 50 sites were identified and then
narrowed down by conducting phone interviews with principals to ascertain how data was being
used at their school and if there was support at the school and district.
The analysis of the study drew on data that were collected in six public schools located in
three districts. The schools were located in three states that had strong accountability systems,
state curricular standards linked to state assessments, and consequences for schools not showing
improvement over a multiple-year period (Datnow et al., 2013). Three schools were high school
level, and three were mid-to-large size elementary schools. All of the sites had a set of common
structures in place to support teachers with data use. All three districts used staff at the district
and school site to support teachers in managing and using data. In addition, all districts had
system-wide curricula aligned to state standards. Although all districts had curriculum
guidelines, schools had some flexibility to make instructional changes based on student needs,
particularly if the case could be made using data (Datnow et al., 2013). System-wide interim
assessments were administered in each district on a quarterly or more frequent basis and were
aligned to state curriculum standards. Leaders strongly encouraged teachers to analyze and use
various assessments such as benchmark assessments, chapter tests, homework, and quizzes for
formative purposes in planning their instruction (Datnow et al., 2013).
The study was conducted in 2006-2008 and used case study methodology. The research
team consisted of a university professor and several PhD students who conducted visits to each
of the schools and their corresponding central offices. The interviews in the central office were
primarily conducted in the first visit and the school interviews and observations were conducted
in the second visit. In each site, the team interviewed two or three administrators from the central
office, two to three administrators at each school site, and teachers across grade levels and
academic disciplines (Datnow et al., 2013). Interviews lasted 45 to 60 minutes with a
mixture of grade level teachers, school, and district leaders. The research team asked site
principals to identify a variety of teachers in terms of their engagement and interest in data use.
A wide range of teachers were interviewed, varying in experience, age, gender, racial
background, grade level/department, and engagement with data (Datnow et al., 2013).
Across the three districts and six schools, a total of nine interviews were conducted with
district-level staff, including the superintendent, assistant superintendent, and director of research
assessment, and 10 interviews with school site leaders, including principals and assistant
principals (Datnow et al., 2013). A total of 76 teachers across the schools
were interviewed, some individually and others in small focus groups. The interviews with
teachers focused on several key areas: how teachers used data and for what purposes; the
sources and types of data teachers had access to and used; expectations for how teachers should
use data and for what purposes; structural support mechanisms such as collaboration time,
training, and personnel to assist with data use; evidence of a culture of data use in the school and
the teachers’ own beliefs about the use of data; needs and areas of improvement in the use of data
at the school site; and how teachers used data to inform classroom instruction (Datnow et al., 2013).
Informal observations were also conducted in schools, classrooms, and relevant meetings. The
research team observed data discussion among teachers whenever possible to try and capture
data-informed decision making in action; however, this information was not captured at all sites.
Documents such as data discussion protocols and school improvement plans were gathered and
analyzed to support this study. Interviews were recorded and transcribed verbatim, and field notes of
classroom and meeting observations were typed. Transcripts were coded for evidence of data
use, structural scaffolds for data use, and evidence of the impact of data use. Coded data informed the
development of case reports at school sites which helped identify cross-case themes (Datnow et
al., 2013). Categories were refined by focusing on what was considered to be affordances or
constraints in collaborative use of data.
Findings were organized as affordances and constraints that existed when schools and
districts organized teacher collaboration time for the purpose of DDDM (Datnow et al., 2013).
Datnow et al. (2013) first described how the districts and schools structured time for teacher
collaboration and how various leadership activities and elements in the organizational context
constituted affordances and constraints to teachers’ collaborative work with data. Common
patterns found across sites included leaders’ focus on thoughtful use of data, the framing
of DDDM as a collective responsibility, norms for teachers’ collaboration,
implementation of data discussion protocols, and teacher groupings and subject matter subcultures
(Datnow et al., 2013).
After conducting the case study, one of the findings suggested that providing structured
collaboration with an agenda for instructional change was essential for DDDM to take hold among
teachers in all three districts (Datnow et al., 2013). Structured collaboration meant that leaders
created “sacred” time where teachers examined data with the intent of instructional
improvement. An affordance that was found in this study was that leadership focused on
thoughtful use of data. Leadership at all levels supported teachers in making sense of data by
defining the purpose of data use, stressing the importance of using evidence, and emphasizing
improvement efforts (Datnow et al., 2013). District leaders encouraged principals and teachers to
use multiple sources of data before making decisions. The framing of DDDM as a collective
responsibility was another affordance found in this study. Leaders encouraged teachers to share
their data in collaborative groups by framing the responsibility for student results in a
collective rather than individualistic way (Datnow et al., 2013). The establishment of norms for
data discussion served as an affordance to teacher collaboration around data use. Principals
created high expectations for professional accountability among staff by covering norms for
conversation for meetings, materials teachers were expected to bring for meetings, materials that
were not to be brought to meetings such as papers to grade, and how to compile data binders
(Datnow et al., 2013). Data discussion protocols also served as an affordance as they were
developed to ensure that discussions about classroom-level data occurred and that actions were
taken based on these conversations. Teacher groupings and subject matter subcultures were
seen as both affordances and constraints because groups had different personalities, dynamics,
interactional styles, and levels of trust. For example, teachers of English and social studies
observed different issues arising in their data discussion related to the content they were
teaching. Some teachers focused more on classroom practice; whereas, others focused more on
the assessment data and numbers. The focus on collaborative use of data might have been an
affordance in a department that saw itself as crucial to student achievement and state assessment
(Datnow et al., 2013).
Farrell (2012) conducted a qualitative comparative study to gain in-depth perspective on
data use in four school systems in the 2011-2012 school year. The four school systems that were
intentionally selected were identified as information-rich cases in order to understand the role of
organizational context on data use. All four school systems were located in one state and served
similar demographics: more than 75% Latino students, with more than 60% of students
eligible for free or reduced-price lunch (Farrell, 2012). The sample selection occurred in three
phases. In phase one, two school districts were selected as part of a larger, multisite study. The
districts and schools were purposefully chosen based on strong reputational factors and evidence
of district commitment to data use. In the second phase, two charter management organizations
(CMOs) were selected in order to bring organizational elements into relief. CMOs were
organizationally different from school districts in areas such as accountability pressures, structures and
distributions of decision-making rights, size and growth trajectory, available financial resources,
and degree of regulation (Farrell, 2012). In the final phase, two secondary schools were selected
to examine in depth how data use unfolded across the levels of each school system. Preliminary
interviews were conducted with each school’s leader to confirm their eligibility and interest.
A variety of sources were used for data collection including in-depth semi structured
interviews and focus groups, observational field notes, and document review. Each school was
visited five times throughout the 2010-2011 school year. Interviews were conducted with the
district/home office administrators and leaders on one to two site visits, which included the
superintendent, assistant superintendents, and other staff overseeing data initiatives. School-level
data were collected during separate visits, which included teacher and principal interviews.
Teachers were selected if they were regularly involved in data use initiatives at their schools.
A semi-structured protocol guided all interviews and focus groups, which ranged from 45 to 90
minutes and were recorded and transcribed. A total of 77 people were interviewed, either one-on-
one or within a focus group setting. Relevant documents were gathered during district/CMO-
level meetings and professional development trainings. Farrell (2012) used reflective memoranda
to further the thinking about possible relationships, and these ideas formed the basis of a case
study record for each school system.
Data analysis occurred concurrently with data collection in a continuous and interactive
manner (Farrell, 2012). A qualitative data analysis software package, ATLAS.ti, was used to
organize and code all transcripts, field notes, and case study records. An initial a priori coding list
was developed to encompass the three domains of the conceptual framework. Codes were
revised, expanded, and collapsed into more refined categories. Farrell (2012) approached data
analysis in three phases. First, cross-case patterns in data use were explored, and a stage model
was created, guided by other empirical models, to understand the investment in different
resources across cases at the organizational level. Similarities and dissimilarities were explored in the
kinds of contextual factors reported by individual cases and across system types. Accountability
pressures and other organizational factors emerged as two broad categories. These categories
were examined in relation to the patterns of data use and resource mobilization.
Patterns of data use were summarized and accountability pressures shaping the kinds of
data educators used in their practice were explored. Main patterns in resource mobilization were
outlined describing the organizational factors that enabled and constrained school systems as
they mobilized the resources to support data use. Educators in all four schools used multiple
sources of data for instructional improvement. Three notable patterns were identified, the first
being that high-stakes state assessment data were used extensively in all schools. Schools used other
forms of assessment data depending on other initiatives in their system. Classroom data
collection was used to document conversations with students, which provided teachers with the
opportunity to provide immediate feedback concerning students' understanding of a concept;
whereas, other schools used professional learning communities (PLCs) to develop common
grade assessments and analyze the results of the data. Educators from CMOs used college-ready
indicators such as the PSAT, SAT, and ACT because they were able to use these data along with
knowledge of students' reading levels to implement SAT vocabulary into lessons.
Federal and state accountability policies had strong influences on data use across all four
schools, particularly in relation to high-stakes standardized state assessment results (Farrell,
2012). State assessment results were used for classroom assignment and student scheduling.
Students who scored far below basic or below basic were placed in double periods of resource
classes; whereas, students who scored proficient or advanced were placed in Honors classes.
Other patterns in data use related back to internal literacy initiatives within a school system that
were prioritized, supported, and monitored (Farrell, 2012). One school had school and system
leaders focused on the common grade assessments that teachers co-developed in their PLCs,
which was a district-wide policy that was supported and monitored at the district and school
levels (Farrell, 2012). Similarly, a CMO school had performance management systems in place
that incorporated internal benchmark assessment results with consequences in teachers’
evaluations and compensation systems (Farrell, 2012).
State assessment results were crucial for CMO schools as educators expressed that their
schools faced a legitimate threat of closure if state assessment results were poor. Charter educators
prioritized state assessments during the times of charter monitoring or reauthorization. State
assessments were also critical to the network’s potential for expansion under the current
authorizer. For example, one leader believed the current charter landscape was crowded, and
their authorizer, which was the local school district, might be less likely to grant charters to new
schools in the future (Farrell, 2012). The pressure of market and community accountability meant
that CMOs paid greater attention to college-ready indicators than their peers in the school
districts. CMO educators used data to evaluate whether they were meeting their schools'
missions for college preparation, both internally and externally (Farrell, 2012). The data were publicized
by CMOs to attract parents and families, in addition to supporting the reputation of the school.
High state assessment scores established legitimacy relative to district schools and helped
distinguish the CMO-related schools as high quality when compared to the rest of the field of
charter schools (Farrell, 2012).
All schools made investments in resources to varying degrees. For example, one school
expected teachers to meet regularly in professional learning communities to plan and design
instruction, compare assessment data, and share best practices; whereas, another system focused
on providing instructional coaches to every school. However, CMOs made greater investments in
resources that supported data use than the school districts did. CMOs made six out of eight
sizeable investments which were classified as major initiatives. Evidence of human capital was
seen as CMOs hired and trained new employees and developed educators’ capacity to use data
early in their careers; whereas, the district allocated human capital resources to support teachers
once they were working in the classroom. CMOs also had a significant commitment to tools and
technology and developed sophisticated data management systems. Four out of the six
organizational policies were ranked major investments because CMOs regularly reserved time
for in-school data analysis and cross network collaboration to collectively analyze data and share
instructional strategies (Farrell, 2012).
The systems' ability to mobilize resources for data use was due to the structure of the
two types of organizations. The two school districts were hierarchical with compartmentalized
departments. CMOs, on the other hand, did not have the same departments as the districts and so
had fewer levels on their organizational charts (Farrell, 2012). A greater level of autonomy and
decision-making rights were distributed to the school sites in the CMOs, which included
budgeting, staffing, and curriculum development. The governance structure largely shaped the
flexibility with system resources. In one school, departments were distinct because they were
overseen by separate supervisors with individual budgets. However, in another school, the
department purchased video cameras to give teachers the opportunity to record their instruction
and share it as a form of data of instructional practice with their colleagues. In contrast, another
school used a professional learning community initiative, led by the superintendent and
involving all departments in the district, to support collaboration around the results of common
assessments; this led to a greater level of commitment (Farrell, 2012).
Size largely impacted how the school systems allocated their resources to support data use.
The size and complexity of the districts compelled them to use top-down strategies for data use
initiatives. The CMOs' smaller size enabled their flexibility and responsiveness when mobilizing
or reallocating resources (Farrell, 2012). Another factor related to size was where the organizations
stood in terms of their developmental trajectory. The two districts were of relatively fixed size;
whereas, the CMOs had spent the past 10 years growing their networks and schools.
Statewide budget cuts were identified as a critical constraint for investing in data use
initiatives for all four systems. This caused educators to consider where they would find “the
biggest bang for the buck” (Farrell, 2012). The superintendent at a district school saw the
investment in training for PLCs to be a front-end cost because the PLC model would be self-
sustained at schools by providing professional development (PD) which meant little cost to the
system. However, for both CMOs, financial resources were limited by the self-reported per-
pupil funding differential between charters and district schools. These circumstances led to
heavy involvement by philanthropic partners to support data use initiatives. Working closely
with the foundations came with restrictions and limitations. For example, one CMO leader
reported that engaging in a new project with a foundation led to work concentrated in the
foundation's areas of interest.
Projects driven by grants also created internal pressures based on the life of the grant (Farrell,
2012).
The presence or absence of a collective bargaining agreement (CBA) was an important
factor pointed out by educators that helped shape their ability to mobilize their human capital
resources to support data use. The two CMOs did not have a CBA with a local teachers’
association; however, the two districts did. The absence of a CBA allowed charter schools to
have more flexibility to allocate their human capital resources. Changes to the organizational
routines in response to data were done quickly by the CMOs because of the absence of a CBA.
In one CMO school, the ELA department analyzed winter benchmark results and decided, as its
instructional response, to change prep periods to have more time to spend in each other's
classes, supporting students one-to-one (Farrell, 2012). In contrast, the president of the district
teachers’ association reported that the union and district had spent over 2 years negotiating the
language around teacher meetings, specifically whether the time could be used for collaboration
around common assessment results (Farrell, 2012). Educators expressed that the CBA was an important
mechanism for protecting the rights of teachers. For example, one union representative at a
district expressed that in some schools, completing data analysis protocols and other paperwork
had put a strain on teachers’ time. One district’s union president noted that the demand to use
data to inform their instruction had increased; however, professional trainings from the district to
effectively do this work had not kept pace (Farrell, 2012). CMO teachers also expressed similar
feelings of being overwhelmed, but without a CBA in place, balancing these pressures could be
challenging (Farrell, 2012).
Huguet et al.'s (2014) mixed-methods study drew from a larger study investigating data use
through comparative case studies of six low-performing middle schools during the 2011-2012
academic year. The focus of this study was on the role of coaches and PLCs in improving teachers'
capacity to use data to improve language arts instruction (Huguet et al., 2014). Huguet et al.
(2014) focused on four schools that utilized data coaches or instructional coaches. These schools
consisted of low-income students of color and/or English language learners and had failed to
meet state accountability targets for more than 5 years.
Huguet et al. (2014) collected their data by visiting each school three times during the
academic year. During the school visits, principals, coaches, and case study teachers were
interviewed. Three teachers were selected from each site who primarily taught English language
arts and worked with a coach throughout the year. A monthly online survey was administered to
teachers and coaches along with observations of meetings and shadowing of coaches. Interview
protocols and surveys elicited information about interactions between coaches and teachers such
as the frequency and focus of meetings, strategies employed, and perceptions about influence
(Huguet et al., 2014).
Huguet et al. (2014) used the qualitative data analysis software NVivo to organize and
code all open-ended survey responses, transcripts, field notes, documents, and case study
records. An initial a priori coding list was developed to encompass the domains of the conceptual
framework. Codes were revised, expanded, and refined into categories such as coach practices
(e.g., modeling, assessment of teacher needs), artifacts, and organizational and environmental
factors (Huguet et al., 2014).
In the analysis, two coaches (one data coach, one instructional coach) had a greater
impact on teacher practices; whereas, two other coaches (one data coach, one instructional
coach) were less effective (Huguet et al., 2014). These four coaches were chosen to be the focus
to explore factors that contributed to differences between each group. Several data sources were
analyzed to confirm and disconfirm initial categories of coaches, triangulating across sources
such as teacher survey responses, coded interview transcripts, and observations of teacher and
coach interaction (Huguet et al., 2014). Patterns were identified in the perceptions of coaches’
influence on data-use practices, literacy instruction, artifacts, in addition to organizational and
environmental conditions. Case study memoranda were created to record notes about strengths
and areas of growth for the coaches, which served as a basis for the identification of two “strong”
and two “developing” coaches (Huguet et al., 2014).
Findings of this study were organized around practices, artifacts, and contextual
conditions that differentiated strong and developing coaches in the study. Coach actions and
practices that appeared to contribute to strong coaching for improving teacher data-use capacity
included utilizing a variety of practices in the role of a coach, employing artifacts as teaching
tools, and co-constructing group norms (Huguet et al., 2014). Coaches in the study who
influenced teacher practice had content expertise and well-developed interpersonal skills.
Principals mediated coach-teacher interactions and influenced the political climate in each school
(Huguet et al., 2014).
A core set of coaching practices appeared to contribute to capacity building for data use
which included assessing teachers’ needs, modeling, providing feedback and sharing expertise,
dialogue and questioning, and brokering. Assessing teachers' needs provided coaches with the
opportunity to create specific goals with teachers for data use. Modeling data consisted of
coaches explaining and demonstrating ways to interpret, respond to, and act upon data. Providing
feedback and sharing expertise involved suggesting next steps; whereas, dialogue and
questioning about data and instruction was a key practice that provided teachers with the
opportunity to make changes in delivery of their practice. Coaches engaged in brokering, which
was the act of overcoming the divide between data and application through their ability to
connect teachers to expertise and resources that supported the data process (Huguet et al., 2014).
Huguet et al. (2014) found that strong coaches provided scaffolding that allowed teachers
to access tools on their own. Two coaches in the study taught teachers how to utilize data
software to sort and analyze their own results, rather than providing teachers with pre-sorted
printouts. Coaches may have fostered understanding and capabilities that teachers, unassisted,
could apply to data analysis and practice in the future (Huguet et al., 2014). Developing coaches
in the study also utilized the school’s data management system differently as a tool in their
coaching practice. Developing coaches disaggregated the data using the data system and printed
out graphs and charts to share with their teachers. This approach helped some teachers; whereas,
other teachers thought that it was not assisting them in their own learning. Group work norms
were prevalent in coach-teacher interactions; however, strong coaches were able to establish
these norms in a different way than developing coaches (Huguet et al., 2014). Strong coaches
developed work norms in a collaborative way, which brought considerable buy-in from
teachers, as opposed to developing coaches, who imposed their own ideals on their groups.
Huguet et al.'s (2014) analysis of the four cases indicated that interpersonal and
organizational factors played an important role in mediating coach interactions with teachers
around data use. Content data expertise was seen in all four case study coaches; however, the
level of coach experience in language arts clearly related to teacher perceptions of coach
credibility (Huguet et al., 2014). In the study, the two strong coaches had strong backgrounds in
language arts; whereas, the developing coaches came from math and science backgrounds. This
was significant because the data-use cycle began with a focus on data and later drew on content
knowledge and skills in order to translate information into actionable knowledge and
instructional responses. Coaches who did not have strong backgrounds in the content areas had
less influence on language arts teachers' data-use practice at the end of the cycle. Interpersonal
knowledge and skills were a mediating factor in coaches' efforts to build teachers' data-use
capacity (Huguet et al., 2014). The interpersonal skills demonstrated by strong coaches included
being highly accessible and approachable, providing feedback and strategies, coaching one-to-
one, and engaging in frequent in-person interactions with teachers.
Two key contextual factors that influenced coach-teacher interactions were political
forces and administrators' support for coaches (Huguet et al., 2014). Principals' support at the
school sites mediated school politics in support of coach-teacher relationships. Huguet et al.
(2014) found that school administrators either facilitated or constrained the interactions between
coaches and teachers. School administrators were able to support and structure coaches'
politically sensitive roles within the school's political environment and power dynamics (Huguet
et al., 2014). This allowed coaches to develop their support and interactions with teachers;
however, when administrative support was not present, coaches encountered power struggles
with teachers and other staff. Huguet et al. (2014) found that the political environment could
impact coaching practice.
Breiter and Light (2006) conducted a case study of an assessment information system
in New York City. The project Linking Data and Learning was funded by the Carnegie
Corporation. The New York City Department of Education (NYCDOE) contracted with the
Grow Network in 2001 to provide a data-driven decision-making tool to support grades 3
through 8. There was a total of 30,000 teachers, 5,000 district and school instructional leaders,
and 1,200 schools serving approximately 400,000 students (Breiter & Light, 2006). There were
numerous perceived obstacles to improving student achievement in New York City due to high
levels of concentrated poverty, homelessness, isolation, and addiction. There were also high rates
of teacher turn over which created a constant cycle of novice teachers in need of training and
mentoring. There were also cultural disconnect, as schools are composed of diverse learners and
the challenge to effectively communicate philosophies and addressing learning needs of many
different students and families.
In 2001, the NYCDOE introduced a system-wide data-support tool for its schools with
the support of the Grow Network Company. Students were grouped by their current class, grade,
school and their New York State standards which ranged from far below standards (1) to far
above standards (4) (Breiter & Light, 2006). In addition, the Grow Report provided educators
with student scores on sub-sections of the test. The Grow Report also provided data for
administrators, presenting an overview of the school along with class- and teacher-level data.
Parents also received reports that explained the test their child took, their child's
performance, and how they could support their child at home. Educators were given a paper report
with disaggregated data about their students and on-line access to additional data such as the
rank and grouping of each student based on sub-test and skill items (Breiter & Light, 2006). The
Grow Network website supported teachers’ analysis and instructional decision making with two
additional features. One was the information on the state standards, definition of the tested skills
and concepts in addition to the reported data linked to instructional materials and resources for
teachers and administrators that suggested activities and teaching strategies to promote standard-
based learning in the classroom (Breiter & Light, 2006). The system relied on existing data from
24
other resources, the analytical component offers aggregated data and suggests different
classroom organization (Breiter & Light, 2006).
Breiter and Light (2006) used a mixture of qualitative and quantitative methodology as
one component focused on understanding how educators (from classroom teachers to district
administrators) understood the data presented in the Grow Reports and how they used that
information to generate knowledge that could inform their decision making. Structured
interviews were conducted with 47 educational leaders, including central office stakeholders,
superintendents, deputy superintendents, math coordinators, English Language Arts (ELA)
coordinators, staff developers, district liaisons, technology coordinators, and directors of research
and curriculum. Interviews were also conducted with representatives from non-governmental
organizations working closely with the New York City schools on issues of educational reform
and professional development (Breiter & Light, 2006). Ethnographic research was conducted in
15 schools across four school districts in New York City that represented various neighborhoods,
student populations, and overall performance levels. There were 45 semi-structured and open-
ended interviews with principals, assistant principals, staff developers, and teachers. Observations were
conducted in 10 grade-wide meetings and/or professional development workshops. Structured
interviews were conducted with 31 teachers to further explore the way in which teachers think
about using assessment information. Two separate surveys were designed for teachers and for
administrators to explore how educators conceptualized the use of the Grow Report for
instructional planning and the types of support needed to fully leverage the use of data to
improve instruction (Breiter & Light, 2006).
Breiter and Light (2006) found that administrator use of the Grow Reports could be
grouped into four categories which were identifying areas of need and targeting resources,
planning, supporting conversations, and shaping professional development. Additionally,
administrators played an important role as they generated knowledge from the data organized by
the Grow Reports and implemented their decisions into the school or district in different ways.
Administrators stated that the Grow Reports helped them identify class-, grade-, and school-wide
strengths and weaknesses, which supported making decisions about planning, designing
professional development activities, and determining student performance and demographics
(Breiter & Light, 2006). Administrators explained that the Grow Report data also supported
planning for school and district priorities and for instructional programs. The report and
test data also supported administrators in framing important conversations across the school
system about student learning, professional development needs and school wide challenges
(Breiter & Light, 2006). The Grow Reports were used by administrators to focus on PD activities
and to make decisions about other professional activities that supported teachers in developing
differentiated instructional activities or learning about school/district-wide standards and goals
in close alignment with the Grow Reports (Breiter & Light, 2006).
Breiter and Light (2006) found that the Grow Reports were easy to read and were
informative for teachers. Teachers reported that the Grow Report tool was a substantial
improvement from their previous data tool. Interviews uncovered a core of practitioner
knowledge, which consisted of using the data to inform their instructional decisions. Some
teachers raised concerns about the ways the data might have been transformed as they were
turned into the Grow Report (Breiter & Light, 2006). Other teacher concerns were about
reliability and validity, the level at which decisions were being made, and other information
available. Interviews revealed that teachers used the Grow Reports in myriad ways to meet their
own varied instructional and pedagogical needs, as well as their diverse students' academic needs (Breiter &
Light, 2006). The areas of instructional practice were grouped into five categories which
consisted of the following: targeting instruction, meeting the needs of diverse learners,
supporting conversations, shaping teachers’ PD, and encouraging self-directed learning.
Teachers reported that the Grow Reports helped them decide what standards to address and
which skills to teach in daily lesson plans, mini-lessons, and even year-long pacing calendars
(Breiter & Light, 2006). Most teachers agreed that the data represented on the Grow Reports
revealed individual students' performance levels, which served as a tool to help them differentiate
instruction. In addition, the Grow Reports helped teachers bridge conversations with colleagues,
parents, administrators, and students about their learning. Interviews and surveys indicated that
teachers had the opportunity to analyze the Grow Reports and their classes’ strength and
weaknesses to also reflect upon their own teaching practice (Breiter & Light, 2006). Another
finding reported in this case study was the encouragement of self-directed learning. Teachers
described how they shared the reports with their students to encourage them to take
responsibility for their own academic progress (Breiter & Light, 2006).
The Grow Reports were a useful data tool, as many teachers reported that the system was
clear and easy to use. Teachers and administrators who were using these data incorporated additional
data from other sources to inform their decisions, such as daily interactions with students. Seven
factors were highlighted from this case study as to how data systems can support educators in
making informed decisions. Building from the real needs of classroom teachers and building-level
educators was one factor identified. The Grow Network worked closely with its target audience,
which was teachers (Breiter & Light, 2006). Recognizing teachers' wealth of tacit knowledge as
a starting point was another factor, as it was important to have teachers talk about their holistic
understanding of students' needs and abilities. Selecting appropriate data to include in the
information system was another factor identified, as many information systems are often flooded
with data. The Grow Report, however, provided enough data to make decisions (Breiter & Light,
2006). The fourth factor identified was how effective testing required close alignment between
state standards, teaching, and testing. The Grow Report could be used as a tool to support a
discussion about alignment between state standards, teaching, and testing (Breiter & Light,
2006). A fifth factor was that educators needed PD on instructional decision-making that
considered the role of data (Breiter & Light, 2006). Teachers needed PD in working with such
systems and understanding how
the system worked. In addition to professional development, educators needed to expand their
repertoire of instructional strategies to be able to respond to the needs of individual children or
groups of students (Breiter & Light, 2006). The last factor identified was further research on
effective instructional decision-making as the context was very important when making an
effective instructional decision (Breiter & Light, 2006).
Wayman (2005) used prior research on data use to help inform suggestions for three
important areas of support for teachers using data systems, which were PD, leadership for a
supportive data climate, and opportunities for collaboration. Accountability policies implicitly
assumed that teachers understood how to turn data into information; however, this was not an easy
task. Herman and Gibson (2001) stated that most educators were not prepared to view teaching
and students’ learning through the information lens (as cited in Wayman, 2005, p. 301).
Schmoker (2004) asserted that professional development for data use in schools was often
implemented without expectations and on a large scale; however, recommendations by a host of
experts suggested that professional development on a smaller, more personal scale supported
educators' learning (as cited in Wayman, 2005, p. 302). Zhao and Frank's (2003) study found that
teacher-to-teacher interaction had a strong positive impact on teacher use of technology, whereas
training provided by the district did not, which suggested that successful technology
implementation was not strongly impacted by large-scale professional development (as cited in
Wayman, 2005, p.302). Wellman and Lipton (2002) found that resources and guides for specific
activities and topics to include in professional development for data use did exist. One way to
support the delivery of PD was through an in-house expert, implemented in various forms.
Symonds (2003) advocated for classroom coaches to support data use in addition to larger-scale
PD (as cited in Wayman, 2005, p. 302). These supports should be augmented with regular,
ongoing PD to promote teacher involvement in data use.
The second area presented by Wayman (2005) was the role of leadership in teachers’ use
of data. He argued that in order for data initiatives to be successful, the support of strong
leadership needed to be present. In his review, he determined that strong leaders needed to
establish conditions that supported and encouraged teachers to grow in their use of the system.
Massell (2001) found that strong leadership and a supportive culture were characteristics of the
schools in the study that were more involved in data use (as cited in Wayman, 2005, p.
303). A change in school culture was needed for inquiry into data use, and the involvement of
teachers was necessary (as cited in Wayman, 2005, p. 303). Copland (2003) noted that the
structure for efficient inquiry must be allowed to evolve, within the school context, with diverse
individuals assuming varied leadership roles (as cited in Wayman, 2005, p. 303). Ingram et al.
(2004) highlighted the importance of leadership to engage entire faculties in conversations to
establish common goals and definitions concerning meaningful data. School leaders needed to
consider teachers’ professional judgment as a component of the information process (Wayman,
2005). Leadership support consisted of allowing time for educators to engage in daily inquiry
into their classroom practice. Teachers participated more fully in a data initiative when they were
provided time. This time allowed teachers to explore newly implemented technology and
determine whether it was supportive of their teaching tasks (Wayman, 2005).
The third category addressed by Wayman (2005) was the role of collaboration. Wayman
(2005) stated that teachers would best progress as reflective practitioners through various forms
of collaboration with other educators. There were benefits beyond one-to-one relationships
when collaborating because responsibilities were shared between administrators and teachers in
these contexts. Nichols and Singer (2000) claimed that interdepartmental collaboration increased
due to data initiatives (as cited in Wayman, 2005, p. 305). Massell (2001) stated that there was
more interschool and interdistrict communication because of data use, which formed common
ground (as cited in Wayman, 2005, p. 305). Collaboration would occur
when using data; however, it was important to establish structures for collaboration such as
distributed leadership, schoolwide data workgroups, and data committees that support individual
data exploration (Wayman, 2005). Engaging in substantive discussions of teaching and learning,
establishing a collective understanding of goals and engagement in professional staff inquiry
could support in avoiding pitfalls.
In conclusion, leadership supports data use in schools because leaders can create learning
conditions that sustain this practice (Wayman, 2005). Organizational factors are supported by a
principal's leadership, as principals can provide norms, discussion protocols, professional
development, trust, and positive relationships with teachers. There is a connection between
leadership and data use because leaders are the people who can create time and develop supports
that enhance teachers' use of data during meetings (e.g., grade level, professional development).
Leadership is a critical component of teacher data use because leaders set expectations and create
a school culture that supports the initiative of DDDM (Datnow et al., 2013). What is missing
from this literature is how leadership interacts with teachers in collaboration around data use. It
is evident that conditions need to be in place to create expectations and discussion of data use;
however, there is a lack of understanding of the collaboration between teachers and leadership.
Learning Communities
To understand learning communities, I needed to understand how learning communities
had been explored and studied. Research has shown that collaboration amongst teachers can
support understanding of data (Horn & Little, 2010; Young, 2006). In addition, as suggested
above, the support of specialized roles (e.g., instructional coach, grade-level chair) has helped
facilitate teachers' understanding of data, as these individuals disaggregate the information so
teachers can understand what it means (Horn et al., 2015; Ikemoto & Marsh, 2007; Kerr et al.,
2006). Learning communities are an important component of this study, as they serve as a way to
engage teachers in meaningful conversations with colleagues to understand data and to identify
ways to change their practice (Ikemoto & Marsh, 2007).
Horn et al.'s (2015) study focused on answering the question: How do middle school
mathematics teachers' data use conversations shape professional learning opportunities? District
B was a partner in this study because of its high-quality mathematics curriculum and intensive
teacher professional development. District B served approximately 80,000 diverse students; 50%
were identified as Latino, 25% African American, and 15% European American (Horn et al.,
2015). In addition, 25% of all students were classified as English Language Learners. On District
B's annual state assessment during the study, 50% of African American students met
mathematics standards, compared to 70% of Latino students and 80% of European American
students (Horn et al., 2015). In addition, 50% of ELL students met standards in mathematics.
Interim assessment instruments and a curriculum framework were developed by District B to
improve mathematics achievement (Horn et al., 2015). The testing instrument, called the ABC,
was linked to the curriculum framework and provided accountability for the curricular
framework by evaluating students on standards that corresponded with instruction (Horn et al.,
2015). The interim assessments were written by mathematics specialists in District B's central
office. The ABC lacked both field testing and psychometric validation but was highly
consequential. The district provided collaborative time for teachers to use the ABC results to
inform their instruction (Horn et al., 2015).
Horn et al.'s (2015) data collection aimed to support a close analysis of teachers' data use
conversations while also capturing the context of the teacher workgroups. Four meetings from
each workgroup across the 2011-2012 academic year were sampled. Data from the MIST project
were used to serve the second goal. Broad data collection was used for the MIST project, which
required team members to record meetings, document classroom instruction, and interview
participants over time across 10 visits made to two sites. Data such as annual structured
interviews with teachers, school leaders, and district personnel were used for the comparative
analysis (Horn et al., 2015). Horn et al. (2015) incorporated one of the project's quantitative
measures, the Visions of High-Quality Mathematics Instruction (VHQMI) instrument. The
VHQMI instrument provided a way to classify participants' views about the nature of good
mathematics teaching, such as the role of the teacher, instructional discourse, and mathematical
tasks (Horn et al., 2015). An internal sampling technique was used to select participants for the
study, which consisted of asking key informants in the district to nominate teacher teams who
collaborated well and used the ABC data to inform their instruction (Horn et al., 2015). After
interviewing participants and visiting sites, a total of four teacher workgroups were selected from
District B.
The primary unit of analysis for the study was data use conversations; two
conversations were compared for similarities and differences to inform an analysis of teachers'
learning opportunities (Horn et al., 2015). The conversations occurred among two seventh-grade
mathematics teacher workgroups at Creekside and Park Falls Middle Schools, who used data
from the same ABC test. The differing ways the same data were used demonstrated how specific practices shaped
teachers' learning opportunities (Horn et al., 2015). Creekside's student demographic consisted
of roughly equal thirds of students identifying as African American, European American, and
Latino. About 60% of students qualified for free or reduced-price lunch. Among the mathematics
teachers, about 65% identified as European American, with the remainder identifying as African
American, Latino, or Pacific Islander (Horn et al., 2015). The seventh-grade mathematics team
averaged 5.6 years of experience, more than the department average of 4.4 years. The focal
meeting was made up of six seventh-grade mathematics teachers, a special education assistant, a
mathematics instructional coach, and the principal, who led the meeting (Horn et al., 2015). Park
Falls' student demographic consisted of 50% Latino, 30% African American, and 10% European
American students. In addition, 75% of students qualified for free or reduced-price lunch. Eight
of Park Falls' nine teachers self-identified as European American, and they were less
experienced than Creekside teachers, averaging 2.4 years of experience; the seventh-grade
teachers at Park Falls averaged 2 years. A total of three seventh-grade teachers attended the focal
meeting.
Horn et al. (2015) offered a partially mixed-methods study with an emphasis on case-
oriented qualitative research logic. In the study, qualitative logic dominated the inquiry,
incorporating some quantitative measures that captured aspects of the phenomenon and
allowed direct comparison (Horn et al., 2015). Two focal meetings took place in January 2012
and were coded using constructs from the conceptual framework in order to understand how the
different conversational features shaped the work-groups’ talk and in turn teachers’ learning
opportunities (Horn et al., 2015). Segments were re-transcribed to conduct fine-grained discourse
analysis of talk in interaction, which supported the interpretation of meaning making
(Horn et al., 2015). To understand the workgroup cultures at Creekside and Park Falls, additional
MIST data were used for the video analysis, which focused primarily on the 2011-2012 school
year. The VHQMI instrument was also used to understand participants' views on good
mathematics teaching, which was an important component of individual epistemic stance (Horn
et al., 2015). Eight meetings were sampled and coded to describe the activities and topics that
occupied the work-groups’ time. Horn et al. (2015) were able to calculate how the work-groups
allocated time within each meeting and across the sample. This data was used to compare the
distribution of activities in the sampled meetings to participants’ reports of how they used work-
group time (Horn et al., 2015). In addition, conversations were closely analyzed to understand
the relation to typical work-group activities. Each meeting was divided into episodes based on
shifts in either topic or activity. Problem frames were identified to understand interpretation of
the topics at hand and epistemic claims were marked to understand data conversations (Horn et
al., 2015). In order to understand how different data use practices mobilized teachers’ future
work and provided conceptual tools that would support this work, the two groups’ conversational
features and teacher sensemaking were juxtaposed (Horn et al., 2015).
The findings showed that the norms and values of Creekside's seventh-grade workgroup were
shaped by a strong administrative charge to improve on AYP proficiency targets (Horn et
al., 2015). To monitor reports of both the ABC and the state test, Creekside employed a
full-time data manager. Teachers reported in interviews that formal observations, accompanied
by written feedback, provided accountability. Formal observations helped identify mathematics
teachers who needed support, which became a central focus for the instructional coach; however,
as part of the MIST project, the instructional coach was not able to work with all mathematics
teachers due to their own resistance, perceived need, or limited time (Horn et al., 2015). The
sampled meetings, facilitated by both the principal and instructional coach, lasted about 35
minutes and took place during the school day. During these workgroup discussions, teachers
shared that they focused on planning and reflecting on lessons, sharing resources, and looking at
ABC data to talk about improving scores (Horn et al., 2015). The findings showed that the time
allocated in the four sampled meetings reflected less planning and more administrative tasks
(Horn et al., 2015). In the sampled meetings, reviewing achievement data accounted for 27% of
the time, 26% was devoted to the school's improvement plan (an entire meeting), 17% to lesson
planning, and 8% to planning tutoring sessions. The VHQMI made it possible to measure and
classify ideas about mathematics, teaching, and learning. Horn et al. (2015) found that the
instructional coach had a vision of mathematics teaching more ambitious than that of the
majority of Creekside's teachers, which provided greater engagement with instructional ideas
aligned with documents such as the Principles and Standards. Horn et al. (2015) also found that
the principal had a less ambitious vision of good mathematics instruction than the instructional
coach, which might have played a part in the instructional emphasis of the meetings.
Conversational sensemaking in data use activities was scaffolded by the data manager at the
school site, who color coded and disaggregated the data for each individual teacher in order to
identify their students' proficiency levels (Horn et al., 2015). The leader's agenda and time
allocation contributed to the sensemaking of data, as teachers engaged in an introduction of new
standardized testing procedures facilitated by the coach, data use, and looking at the next unit of
instruction (Horn et al., 2015). Horn et al. (2015) described two types of frames used throughout
the professional development on data use: compass frames and monitoring frames. The compass
frame oriented conversation toward questions about what to do in response to the information in
data reports, whereas the monitoring frame used data to establish whether or not previous
decisions were working (Horn et al., 2015). During the time teachers engaged in data use, the
leader invoked a monitoring frame by asking, "how are we doing?" (Horn et al., 2015). Teachers
engaged in both compass and monitoring frames as they annotated the data during the PD (Horn
et al., 2015).
Teacher conversations at Park Falls centered on student thinking, as teachers analyzed
frequently missed items and student errors (Horn et al., 2015). These conversations created
learning opportunities about which topics needed revisiting and why. Through replays and
rehearsals of instructional interactions and a distribution of student responses, teachers
exemplified data use (Horn et al., 2015). The teacher conversations at Creekside surfaced
information about students' mathematical thinking and limited grasp of certain topics; however,
teachers did not spend sufficient time extrapolating next steps for instruction (Horn et al., 2015).
Horn and Little's (2010) qualitative case study focused on two teacher groups, one in
mathematics and the other in English. The study took place at East High School, which had an
enrollment of 1,800 ethnically diverse students; one third of the students came from low-income
families. The 12th-grade cohort was 25% smaller than the 9th-grade cohort. The research team
focused particularly on teachers' relationships and interactions within the subject domains of
English and mathematics (Horn & Little, 2010).
The departments differed in teaching experience: the 14 English teachers' experience
ranged from 1 to 9 years, with an average of 3.5 years (Horn & Little, 2010). The English
department chair was in his fourth year of teaching at the school. The math department consisted
of 10 members whose experience ranged from 1 to 19 years, averaging 8.7 years. The
department's co-chairs had a combined 15 years of experience. This study focused on the
Academic Literacy Group, which was formed to tackle challenges students were facing with
reading comprehension, and the Algebra Group, which was created to open access to college-
preparatory mathematics for all students (Horn & Little, 2010). The Academic Literacy Group
consisted of four full-time members of the English department, the department chair, and a
member of the research team who was a former English teacher (Horn & Little, 2010). The
Algebra Group consisted of six full-time members of the math department, including the
department co-chair, a special education teacher, a student intern, and one member of the
research team, who was a former math teacher (Horn & Little, 2010).
Teachers in each group worked collaboratively within grade levels and specific courses.
Both groups also met voluntarily after school for 90 minutes once a week. During these
meetings, teachers worked together and concentrated their activity and interactions on teaching
and learning (Horn & Little, 2010). Data were collected through interviews and observations,
which were videotaped and audiotaped. During the fall semester, approximately 26 hours of the
Academic Literacy Group's meetings were recorded, along with 12 hours of their interactions
with other members of the English department. Approximately 26 hours of the Algebra Group's
weekly meetings were also recorded, in addition to eight full days of algebra week activities.
The interviews with teachers in each group confirmed the importance of the weekly meetings
and how each group aimed to support professional learning and improvement in classrooms
(Horn & Little, 2010).
Horn and Little (2010) were interested in understanding how talk supplies opportunities
for professional learning; therefore, they focused specifically on conversational moments that
entail accounts of classroom experience and signal problems of professional practice.
Conversational moments were located by means of a systematic mapping of episodes of teacher
to teacher talk within line-numbered transcripts, marking episode boundaries by shifts in topic
and/or participation structure (Horn & Little, 2010). Episodes enabled the research team to
identify when problems arose and whether they were pursued. Problems of practice were
identified with the use of linguistic and paralinguistic cues, which included expressions of
emotional distress, change in intonation and emphasis, and direct appeals for feedback or
assistance (Horn & Little, 2010). Two extended episodes were selected, one from each group as
they typified each group's discourse patterns. In the Academic Literacy Group, problems arose
during curriculum planning and lesson walkthroughs. In the Algebra Group, problems arose
during check-ins, when teachers reported on developments in their classrooms.
Horn and Little (2010) found that in each group studied, teachers who expressed
problems were greeted with normalizing responses. These normalizing moves supplied
reassurance and established solidarity. When engaging in normalizing responses, teachers either
turned the conversation toward teaching or away from it (Horn & Little, 2010). When teachers
turned the conversation toward teaching, they were able to start a detailed discussion of specific
classroom instances (Horn & Little, 2010). Conversations turned away from in-depth discussion
when responses were limited to expressions of sympathy and reassurance (Horn & Little, 2010).
Horn and Little (2010) found that both groups expressed ambitious aims for their students
and embraced responsibility for improvements in curriculum and teaching. The groups differed
in how they justified their discussions and interpreted classroom-based problems of practice.
The Algebra Group had a common set of conceptual tools and principles, rooted in shared
professional development experiences. Teachers were able to deepen their expertise through
outside professional development and networking with colleagues (Horn & Little, 2010).
In the Academic Literacy Group, there was less evidence of shared ideas, principles, and
terminology (Horn & Little, 2010). The Academic Literacy Group teachers conveyed interest in
improving curriculum but expressed various views regarding the concept, materials, and specific
premises associated with the program. Teachers in the Academic Literacy Group had fewer
collective connections with external sources of ideas and did not have a shared language to
identify and work on problems of practice (Horn & Little, 2010).
The two groups also differed substantially in the extent to which they taught from a
coherent curriculum (Horn & Little, 2010). The Academic Literacy Group teachers did not take
the time to consider how well students in different classes had responded to lessons already
taught, although one teacher in the group explicitly suggested doing so (Horn & Little, 2010).
For the Academic Literacy Group, the curriculum represented an urgent task rather than an
existing, robust resource that could be exploited for learning (Horn & Little, 2010). The Algebra
Group teachers were able to use an extensive set of curricular resources that all had agreed on
based on shared criteria and principles (Horn & Little, 2010). These resources had been selected,
revised, or designed over several years. The Academic Literacy Group, by contrast, was in the
early stage of working out instructional goals and curriculum development (Horn & Little,
2010). Its teachers had a common goal of providing students with metacognitive reading
strategies and the confidence to read academic text.
In the area of leadership, the groups differed in the expectations that were observed. In
the Academic Literacy Group, there was a lack of ambitious and consistent practices (Horn &
Little, 2010). There was a division of labor among grade levels that focused on improving
curriculum, and teachers who took the lead in drafting lessons tended toward telling and showing
(Horn & Little, 2010). In the Algebra Group, the current co-chair of the math department took a
visible role in posing questions, eliciting specific accounts of classroom practice, and focusing
on both student and teacher learning (Horn & Little, 2010). These leadership patterns were
significant in directing the group's attention to problems of practice (Horn & Little, 2010).
Overall, Horn and Little (2010) found that the Algebra Group had more internal and external
resources for engaging problems of practice in ways that supported teachers' sensemaking.
Young's (2006) study consisted of four case studies that each followed one grade-level
team of teachers, in four schools across two districts. The study explored what constitutes data in
teachers' eyes and how organizational conditions shape teachers' use of specific data. Sites were
purposefully selected to ensure an opportunity to study teachers' data use and to ask practitioners
questions about using data. One high-performing and one low-performing school were selected
from each district because of an explicit accountability-based rationale underlying data-driven
decision-making rhetoric (Young, 2006). Upper-primary grade-level teams were chosen as the
case study teams in each school. Data collection ran from June 2003 through February 2005 and
consisted of 90 repeated interviews with district administrators, school principals, teachers, and
reform and literacy coaches, along with 73 observations of grade-level meetings, staff meetings,
and school- and districtwide professional development sessions (Young, 2006). A grounded
theory approach was used to analyze and interpret themes from interviews. Semi-structured,
role-specific interview protocols guided all interviews, which were audiotaped and transcribed.
The transcripts and observation notes were coded first for descriptive codes and then for analytic
codes as themes emerged. Both descriptive and analytic codes were entered into the N6
qualitative database software, which was used to identify supporting and disconfirming evidence
for each descriptive and analytic theme. Brief case study summaries were created reviewing the
districts' literacy and assessment policies and grade-level team data practices (Young, 2006).
The first case focused on Oak Park Unified School District, a midsize, urban, K-12
district in the San Francisco Bay Area (Young, 2006). No single racial/ethnic group accounted
for more than one-third of student enrollment. Fulton, an alternative K-8 program, had
historically drawn its 400 students from across the city; however, student demographics were
changing because more neighborhood children were attending the school. The number of English
learners had doubled to 25% of the school enrollment, and the number of students eligible for
free or reduced-price lunch had risen from 25% to 35% (Young, 2006). For a quarter century,
Hilltop Elementary had been one of the poorest schools in Oak Park. African American students
(44%) were the school's largest single racial/ethnic group, followed by Asian (19%), Latino
(13%), and Filipino (11%) students. One third of the population at Hilltop Elementary were
English Learners, and 87% of students were eligible for free or reduced-price lunch. Oak Park
adopted Houghton-Mifflin as its English Language Arts program in the 2003-2004 school year.
The district directed kindergarten through second-grade teachers to implement the program with
fidelity, which meant that teachers must complete
virtually all of the activities in sequence and according to a districtwide pacing guide (Young,
2006). The district assessment coordinator reported that there had been no accountability for
whether anyone ever used earlier assessments, and after 12 years many seniors could not read.
The district created a K-2 assessment to support the new literacy program, which featured
frequent and varied testing. Fulton was an alternative school that followed the philosophy of
developing the whole child through cross-curricular themes, building its program on school
traditions of teacher leadership and collaboration. Fulton participated in the initial 1996 cohort of
a regional inquiry-based reform network and continued through 2003-2004. This engaged
teachers in inquiry language during grade-level meetings and staff discussions, which created
confidence in critiquing and using multiple forms of data (Young, 2006). Six teachers made up
the second/third-grade team, and they had worked together for 4 years. Each of the teachers had
long-term teaching experience, and each team member took on a leadership role by serving on
committees and supporting fellow teachers. This grade-level team felt ownership and had strong
roots across the district. Individually, teachers reviewed their students' results after each
assessment; however, their most extensive uses of data occurred during full-day grade-level team
meetings every six weeks. The district assessment results constituted only one source of data for
the second/third-grade Fulton teachers. To stay consistent with their philosophy of teaching the
whole child, teachers used collective observations of student work, oral responses, and
independence as data. Fulton teachers shared students for reading and math, which motivated
them to use each other as resources in making sense of various data. The second/third-grade
team primarily used assessment data and their observations of student work and behavior to
group students for literacy instruction (Young, 2006). Situated within ongoing discourse,
second/third-grade teachers made decisions collaboratively.
Hilltop Elementary was a historically low-performing Title I school. In 2001, a new
principal arrived with a clear agenda of creating an adult learning community. The principal's
instructional expertise and professional development skills helped guide teachers in
implementing new strategies. This cultural shift motivated Hilltop teachers to work with data.
Three teachers belonged to the second-grade case study team, which consisted of one novice and
two veteran teachers with more than 25 years of experience (Young, 2006). Explicit professional
learning opportunities were present in this team, which met for three full days during the year
and for 90 minutes of collaboration time twice a month. Teachers reviewed various types of data,
such as student writing samples, journal responses, and district-required formative assessments,
during structured team meetings. These activities gave teachers experiences that built
collaborative norms and knowledge of how to integrate data analysis into instructional concerns
(Young, 2006). The range in experience among the second-grade Hilltop team revealed
differences between expert and novice literacy teachers. Novice teachers struggled with making
sense of abundant district assessment data.
Bosque Union was a small, suburban, K-8 district in the Bay Area that served a mixed
community of professional and working-class families. One of the largest elementary schools in
Bosque was Cascade Elementary, with over 700 students in the 2003-2004 school year.
Changing student demographics presented teachers with new social and instructional needs. The
English learner population had doubled to 32% between 1997-1998 and 2003-2004. In addition,
the percentage of poor students had increased from 17% to 31% by 2001-2002. Olmo Elementary
was less than two miles from Cascade and served students who were predominantly Latino (72%
in 2003-2004). Eighty percent of the school's students were English learners, and 99% of
students were eligible for free or reduced-price lunch (Young, 2006). The literacy program in
Bosque was supported by both balanced literacy and differentiated instruction. This literacy
approach relied on deep teacher knowledge about how students learned to read and write and on
classroom management. The vision of literacy instruction required teachers to know how to
assess student reading and diagnose the causes of reading failure, create lessons that met the
diagnosed student needs, and integrate specific reading and writing skills (Young, 2006).
The population at Cascade included a large proportion of English language learners
(ELLs) and children in poverty; however, teachers did not seek alternative instructional
strategies to support students' learning (Young, 2006). The third-grade team at Cascade
consisted of six teachers, of whom five had more than 10 years of experience and three had been
at Cascade for over 10 years. During the 2003-2004 school year, the team regrouped students
across six classes according to their ELD needs during the ELD block (Young, 2006). Teachers
at Cascade used their own individual curricula, which varied among Readers' and Writers'
Workshop, literature circles, guided reading, and instruction in writing conventions and phonics.
Evidence of data practices among the Cascade third-grade team included student work products
and day-to-day interactions such as teacher conferences.
Olmo served a large population of ELLs, and the school's primary focus was literacy and
English language development (ELD) (Young, 2006). The school improvement plan
incorporated extensive training in literacy and ELD strategies. The third-grade team at Olmo
consisted of four teachers who varied in both experience and expertise. Teachers combined
Readers' Workshop with specific comprehension strategies for their literacy instruction. The
third-grade team worked with portfolios of data collection presented by a professional
development partner. The school also implemented walk-throughs during the ELD instructional
block and monitored Reading and Oral Language Assessment (ROLA) scores for targeted
students at each ELD level in each class. Olmo had various sources of data available, such as
standard district assessments, district-led walk-throughs for literacy, and state testing reports;
however, teachers reported that school- and district-led data collection held little instructional
value for them (Young, 2006). Teachers at Olmo met weekly and voluntarily to plan but rarely
discussed how they used data such as ROLA scores, state testing information, and student work
for instruction. Together, these case study teams portrayed the range of value teachers placed on
different forms of data and how they used them.
There were different factors that influenced the use of teacher data use. Young (2006)
identified agenda setting, building collaboration norms, and dedicating roles for data-related
functions as supports to increase teachers use of data for instructional purposes. Agenda setting
sets the stage for teachers to engage with various and specific kinds of data. Leaderships function
in school encompassed articulating the rationale for and expectations of how teachers used
particular forms of data, modeling data use, planning and scaffolding teachers’ learning about
using data and structuring time to allow teachers to do so collaboratively (Young, 2006). The
findings of this study identified that both Oak Park and Bosque administrators expect local
assessments to constitute the primary data principals and teacher use; however, each school used
different strategies when analyzing the data. The Bosque strategy promoted principals’ use of
local assessment data to monitor their school performance; whereas, Oak Park promoted grade-
level teams’ use of local assessment data to identify instructional problems and to understand
whether specific instructional strategies improved student learning. Oak Park’s district and
school administrators reinforced agenda setting activities to convey the logic of using data for
improvement through a school planning process. First, the school plan template required school
teams to evaluate their state achievement and formative assessment data to identify instructional
areas for improvement (Young, 2006). In addition, the district administrators provided time for
school teams to assess collaboratively and publicly the effectiveness of their improvement
strategies.
The Fulton and Hilltop principals also led practices that promoted the value of being data
driven (Young, 2006). The Fulton principal's approach aligned with teachers' expectations,
modeling data use in schoolwide decisions. Fulton had a history with a regional inquiry-based
network, which had supported teachers' need for data at the school and grade levels. Hilltop's
inquiry practices were less mature and depended on the principal's leadership strength.
strength. The principal at Hilltop articulated the need for analyzing behavior and performance
data to understand the potential instructional problems. This led to charting the progress on
behavioral changes of teachers or students. The principal held learning sessions instead of staff
meetings and modeled data use by providing teachers with feedback from informal classroom
observations, sharing research-supported instructional strategies, and presenting data to frame
problems (Young, 2006). Moreover, the principal provided clear learning objectives for teachers
with respect to using student work as data and structured grade-level meetings where teachers
could collaborate on instruction-related tasks that involved sharing data.
Bosque exemplified loose coupling at multiple levels of the district and school
organization (Young, 2006). For example, at the district level the superintendent met three times
a year with principals to review each round of formative assessments. The superintendent
expected these reviews to inform the principals' plans for instructional change. However, the
principal’s monitoring at Cascade and Olmo was decoupled from teachers’ data use for
instruction. Although the superintendent conferenced with principals, they did not communicate
how teachers would use the data collaboratively (Young, 2006). The Cascade and Olmo teams
used data individually, and neither team had a consistent pattern of collaboration for sharing
assessment data. The Bosque district leaders attempted to model the importance of using data in
their own work by integrating assessments and other data into the district's strategic planning
process. In contrast, the principal at Olmo placed low priority on actively leading the staff
in using data. Young (2006) stated that the principal demonstrated outward compliance with the
logic of using data as manifested in the specific district initiative, while protecting the technical
core of the school. This led teachers at Cascade not to engage in team-based inquiry into
assessments and student work. Similarly, the principal at Olmo had taken the initiative to focus
on ELD by engaging in walk throughs during the ELD block period and had targeted ELD
students' performance by evaluating the implementation of strategies (Young, 2006). Loose
coupling was identified in the Bosque district case studies due to contradictory agenda setting
between school and district administrators; in contrast, the agenda-setting schools demonstrated
connections between professed organizational goals for using data and teachers' activities.
Young (2006) found that leadership specifically focused on data use could move teacher
teams to greater collaboration around data analysis than they demonstrated in other
instruction-related activities. In the case study, Cascade teachers had the highest discord and the
weakest level of collaboration. The grade-level team members differed most in their views of
students and of whether the increasing numbers of English learners and disadvantaged students
at the school required new instructional practices (Young, 2006). Teachers also had varying
levels of discomfort in letting their peers teach their students during the ELD block and
disagreed with some curricular choices
their peers made. However, the teachers at Hilltop exhibited relatively high cohesion. Teachers
also faced common struggles in terms of helping mainly poor and minority students (Young,
2006). Yet, on their own, teachers did not collaborate to a high degree due to one outspoken
veteran teacher.
At Olmo, the third-grade level team had a high level of cohesion similar to other case
study teams. The teachers worked together and faced adversity to make improvements, as their
school had historically been identified as underperforming. The Olmo team considered planning
and coordination their primary group activity; teachers worked four hours weekly to share
specific lessons and give each other suggestions and feedback. Fulton teachers also had a strong
sense of joint work as the second/third grade teachers switched students for literacy and teachers
who served non-struggling readers agreed to take more students than those teaching the most
struggling readers (Young, 2006). At Fulton, teachers felt interdependent and saw their
colleagues as integral to their own personal success in the classroom (Young, 2006).
Young (2006) identified Hilltop as a high-leadership, high-collaboration school. The
principal’s vision centered on teachers’ learning about instruction as it was seen in classroom
practices and in artifacts. It was supported by a community that held members accountable for
their learning. Teachers at Hilltop identified data as classroom observations, student work
samples, and assessment results. The principal's agenda, however, drove the changes seen at
Hilltop, such as collaborative data practices. Young (2006) identified the third-grade Olmo team
as a low-collaboration, low-leadership school. The principal did not follow up with teachers on various
data collection efforts; however, the teachers had also shared that the various demands felt like
accountability pressures and did not benefit their classroom practice (Young, 2006). Although
teachers engaged in collaborative norms that supported learning among colleagues, there was a
disconnect between the data collection activities of school leadership and teachers' lack of
interest in them.
Fulton and Cascade were also both identified as high-leadership and high-collaboration
schools. Both principals and teachers expected to discuss significant issues collegially and to
make joint decisions through a process that incorporated collecting appropriate data. However,
the position of the third-grade teachers at Cascade reflected loose coupling between data use for
monitoring the school and the lack of an agenda for teachers to use data in instruction (Young,
2006). The principal did not set expectations for teachers to work collaboratively with data in
their instructional context and did not enact any strategies to develop collegial learning norms
with grade-level teams. At Cascade, the school culture did not encourage teachers to work with
data and did not promote specific common practices. These four teams demonstrated how norms
of interaction and agenda setting helped shape whether and how team members used data for
instruction (Young, 2006).
Young (2006) identified key functions that facilitated new data practices: dealing with
data reporting, interpreting data and teaching teachers about data, furnishing instructional
resources for issues arising from data analyses, facilitating meetings in which teachers could
answer the question of "so what," and following up with teachers on responses to data analyses.
Some of the case studies illustrated these functions across different roles, and not all of the
functions appeared explicitly in every school. At Fulton, the literacy lead teacher assisted with
inputting data and reporting problems during the first year of the Houghton Mifflin adoption. In
addition, the literacy teacher facilitated a portion of a meeting, leading teachers in analyzing their
winter assessments and identifying conventions as a common focus (Young, 2006). Hilltop
covered more of the data-related functions with specialized roles, for example, the literacy
coach, local collaborative coach, Title I teacher, and principal. Each person was assigned
multiple, explicit roles, which reflected the principal's goal of creating a coherent system to facilitate
teachers’ learning about their practice and having consistent instructional strategies throughout
the school. However, at Cascade, the data-related functions were least explicit and incorporated into
fewer roles. Only the peer coach offered resources such as writing strategies during districtwide
professional development (Young, 2006). None of the data-related responsibilities fell to
non-classroom-teacher roles; instead, teachers privately analyzed and interpreted student work
data for their individual practice. Olmo teachers also held private responsibility for interpreting
ROLA and student work data. Moreover, teachers provided one another with resources such as
materials, and extra professional development was offered to individuals and to the whole
faculty. There was a plentiful amount of professional development as well as walk throughs to
hold teachers accountable for certain aspects of the instructional environment, yet there was no
accountability for analyzing data and acting on the analysis (Young, 2006). Young (2006) found
that the data-related functions indicated the depth of a school's capacity, represented by multiple
roles, for fostering teachers' learning about data and institutionalizing norms that supported
collaborative data use for instruction.
Schifter et al. (2014) presented a process used in a National Science Foundation (NSF)
funded project to help middle-grade science teachers use elaborate and diverse data from virtual
environment game modules designed for assessment of science inquiry. The research questions
guiding this study were as follows: How can we help teachers understand and make sense of
their student performance data from this project? How do teachers use these data to identify
student scientific misunderstandings? Once scientific misunderstandings are identified, how do
teachers plan to change their instructional strategies, if at all?
The study used immersive virtual environments to design new types of inquiry-based
assessment modules to elicit middle school students’ understanding of science content and
inquiry skills (Schifter et al., 2014). The game used nonplayer characters to introduce the
mystery/problem of the module and provide evidence that might be useful. Students gathered and
recorded data, explored the environment, and observed visual signals and tacit clues about the
problem they were investigating. There were two introductory modules, which acclimated
students to the virtual world, related science tools, and problem-solving processes. Four
assessment modules asked students to solve problems, contextualized in a narrative, that
assessed students' understanding of various science concepts such as adaptation, weather fronts,
gas laws, and force vectors. When students finished solving the posed problem, they responded
to a nonplayer character's questions about the problem, supporting their inferences with the data
they collected and ranking the data in importance for solving the
problem (Schifter et al., 2014). To score students’ performance, all actions and interactions by
students within the world were recorded. All student performance data were collected in the
NSF-funded project's database and accessed through the project dashboard. The dashboard was
designed to serve administrative, researcher, teacher, and student needs simultaneously. It
automated collecting, organizing, and summarizing data, in addition to analyzing and
synthesizing the data into information to be used in decision making. Immediately after a student
completed a module, teachers were able to view the student's multiple-choice answers, the
characters and objects with which the student interacted in the virtual world, the measurements
the student took with virtual scitools, and the information the student graphed
(Schifter et al., 2014).
From 2009 to 2012, two different types of PD were held: one focused on training to use
the project modules and another on instructional sessions based around science and data. There
were a total of nine sessions, including a summer institute
each year that lasted 1 to 3 days. The sessions were split between understanding the science
content and inquiry skills presented in the modules and understanding the data output from the
modules (Schifter et al., 2014). In the first PD, experts in the content areas of one of the modules
created inquiry-based instruction on the scientific construct within the modules. This led to
opportunities for teachers to work in small groups to develop new on-topic lesson plans for their
own students, which they shared with peers at the end of the PD. These opportunities supported
teachers' understanding of the sources of data in relation to the modules. After the first year, teachers
shared their interest in receiving both analyses and raw data for their students, which gave rise to
the project dashboard. This initiated the concept behind visual representation, which evolved
over the next 2 years (Schifter et al., 2014). The project dashboard was introduced the third year,
and an emphasis was placed on supporting teachers' use of the dashboard for administrative
purposes such as enrolling students in the dashboard and indicating receipt of permission forms.
Some teachers experienced a learning curve when engaging in this task. One of the purposes of the
dashboard was to collect, organize, and summarize the data from the virtual models. Once
teachers were comfortable with these steps, they worked in collaborative groups to analyze,
synthesize, and make decisions using the data from the dashboard. Teachers learned how to
access whole-class reports showing how each student answered the objective-type and the
summative questions (Schifter et al., 2014). The whole set of data provided teachers with
additional objective information about each student and how students solved problems step-by-
step.
During the last summer institute, an attempt was made to help teachers interpret and use
the data from the project dashboard using a DDDM PD workshop. Schifter et al. (2014) stated
that the goal was to share an analysis of the student performance data, engage teachers in active
conversations around that data, and develop a collaborative teacher working group using the data
from the dashboard to create lesson plans incorporating student information in a manner
responsive to the needs of students. During this session, the goal was to move teachers to analyze
and synthesize the data into information to be used in decision making.
Teachers were introduced to the principles of DDDM and were reminded of the types of
data that were available through the dashboard. Teachers were able to identify strengths and
weaknesses in students' understanding of a concept through careful analysis of item-level data.
After analyzing the data, teachers were able to generate solutions for improving their own
instruction. Teachers discussed their views on why students performed in certain ways within
the modules and assessed student misconceptions about specific concepts (Schifter et
al., 2014). Teachers worked in small groups of three and discussed lesson plan options to address
their instruction and specific misunderstandings.
The culminating activity for the PD was to create lesson plans in which teachers focused on
how they were addressing the apparent student misunderstandings identified by the project data
analyses. Three lessons were included to indicate the purposes and timing of addressing the noted
misunderstandings. In lesson one, two teachers presented a lesson they had used to deepen
inquiry skills and approach problem solving effectively. Two techniques teachers used to reach
this goal were developing a 10-question verbal summary survey with questions culled from those
used in a module, and a question-and-answer and discussion-chaining methodology. Teachers
asked the whole class each question from the summary survey aloud after the modules were
completed and asked supplemental "why" questions about the correct answer, probing for more
supporting evidence. Another example was when teachers displayed one student’s work in front
of the classroom and discussed how and why this example exemplified correct and thorough
work. In the second lesson, teachers created a lesson they had used which aided students in
making the leap from basic comprehension to generating hypotheses through building skills in
note taking. Teachers used a stepwise visual note-taking method in which one line was drawn on
the board, on which students wrote their basic ideas about a concept; then a line was drawn
above it, on which students wrote ideas that connected to or stemmed from those basic concepts
(Schifter et al., 2014). Finally, a third line was drawn above, on which students noted conclusions
or hypotheses reached based on the previous steps. Teachers noted that this technique was
particularly effective for students who had challenges with combining and analyzing ideas.
A third lesson was developed after findings from the performance data showed that students
lacked an understanding of causality. Teachers presented a lesson that focused on improving
students' ability to use evidence to note valid cause-and-effect relationships. The lesson
addressed the concept that data can prove an assertion to be incorrect. Teachers used a weather
lore website to explore myths about weather. Teachers had students collect an actual data set on
weather and compare these data to the sayings (Schifter et al., 2014). Students were then able to
discuss the purposes of those sayings and myths. Through this lesson, reviewing data and causal
relationships helped students think more clearly about the application of knowledge
(Schifter et al., 2014).
Kerr et al. (2006) used a comparative case study design and mixed methods to
examine district efforts to promote instructional improvement. The study examined the following
questions: What strategies did districts employ to promote instructional improvement through
data-based decision making? How did these strategies work? What constrained or enabled
district efforts to promote data use for instructional decision making? The study used a purposive
sample of three school districts: Monroe, Roosevelt, and Jefferson (Kerr et al., 2006). Each
district was located in an urban area and had a significant percentage of low-income and
minority students. Roosevelt and Jefferson had approximately 30,000 students; whereas, Monroe
had nearly 80,000 students. These districts were selected because they had made districtwide
instructional improvement and varied in size, union environment and state context. Each school
had also worked with the Institute for Learning (IFL), which was an organization central to the
broader study (Kerr et al., 2006).
Kerr et al. (2006) collected both qualitative and quantitative data from multiple sources
over a 2-year period. Researchers visited each district multiple times during the 2002-2003
school years and interviewed central office leaders and staff as well as community leaders.
District-level interviews included the superintendent, associate superintendents, and
administrators in the areas of curriculum, instruction, and PD (Kerr et al., 2006). A total of 85
interviews were conducted with district and community leaders. Schools in each district were
visited during spring 2003 and winter-spring 2004, and researchers observed district meetings
and other related activities. The principal and/or instructional specialist at each school were
interviewed. A total of 72 schools were visited, accounting for 118 teacher focus groups and 73
principal, 30 assistant principal, and 50 instructional specialist interviews (Kerr et al., 2006). In
spring 2004, all principals in the three districts were surveyed along with all teachers in
Roosevelt and Jefferson, and a sample of teachers in Monroe. Response rates across districts
ranged from 68 to 78% for principals, and 31 to 48% for teachers. Each of the study districts
made significant investments in promoting data-based decision making at the district and school levels,
while each emphasized different actions to varying degrees and levels of success (Kerr et al.,
2006).
The study’s districts invested to varying degrees in strategies promoting the use of data to
guide instruction and instructional decisions (Kerr et al., 2006). Strategies included the
development of interim assessments and technology/systems for housing, analyzing and
reporting data, the provision of professional development and technical assistance on how to
interpret and use student test results, the revamping of school improvement planning processes,
the encouragement of structured review of student work, and the use of an IFL-developed
classroom observation protocol, the Learning Walk, to assess the quality of classroom instruction
(Kerr et al., 2006). The data revealed that promoting data-based decision making was much more
of a focus in Jefferson and Monroe than in Roosevelt. For example, Jefferson invested in
implementing a new data-driven school improvement planning process. In addition, Monroe
focused on implementing a system of interim assessment tests linked to a data processing and
analysis system. Roosevelt leaders did not invest in some strategies to promote data use and
acknowledged that future efforts would include a more prominent focus in this area (Kerr et al.,
2006). Each district provided professional development and technical assistance to teachers and
principals to support their knowledge about and use of data to guide instructional decisions. A
large percentage of teacher and principal survey respondents in Jefferson and Monroe reported
an emphasis on the interpretation and use of student test results in the training and support they
received from their school and/or district. Technical assistance was provided to school-level staff
to assist with data analysis and interpretation. District staff members assisted low-performing
schools with data analysis; this occurred primarily for school improvement planning (Kerr et
al., 2006). Through the analysis of student achievement data, the principals received assistance
with selecting and implementing appropriate instructional strategies to meet student needs. In
addition, the study districts also implemented strategies to promote data use that included
evidence other than student score data. For example, systematic observations of classroom
instructional practice and systematic review of student work were used as data. Roosevelt
teachers used student work samples to develop grade-level expectations. Jefferson used a
different strategy that promoted review of student work as a means of measuring and assessing
student performance and desired changes in instructional practice. Moreover, the district trained
school-level instructional specialists through a process of reviewing student work and tying
results to instructional practice. All three study districts also implemented Learning Walks to
varying degrees, which were developed by IFL for observing classroom instruction across a
school and benchmarking and assessing teachers’ instructional practice (Kerr et al., 2006).
Two initiatives in Jefferson and Monroe were examined during the period of the study:
school improvement planning and interim assessments linked to data systems. Although school
improvement planning (SIP) occurred in all three school districts, it was more central and
supported at Jefferson. Jefferson administrators encouraged school faculties to examine
assessment results by grade level to identify areas of needed improvement in math and English
language arts and to identify a realistic, narrow set of strategies to address those needs (Kerr et
al., 2006). A detailed SIP template guide was provided by district administrators; however,
limited training was provided on how to use it. School coaches were expected to assist with data
analysis and implementation of the plans. School-level staff in Jefferson identified SIP as a
districtwide reform priority and a focus of professional development. A total of 45% of teacher
survey respondents in Jefferson reported that they had read their school’s SIP and had a thorough
understanding of it. The staff at Jefferson consistently described school improvement planning as
useful yet labor intensive (Kerr et al., 2006). Principals reported that the SIP process was more
labor intensive than it needed to be. Interviews showed that principals and teachers described the
process as one that helped them identify school and classroom needs. The process also allowed
for the collective identification of school goals and drew on the in-house expertise of school staff. All
three districts regularly administered formative assessments; however, Monroe administered a
comprehensive set of standards-aligned assessments in all grades and core subjects linked to a
sophisticated data management system. Monroe purchased the data management system to
provide quick access to results, to facilitate detailed analysis of data, and to allow for the
development of additional assessments customized to a particular class, group, or student (Kerr
et al., 2006). Principals and district staff interviewed found interim assessment data valid and
useful and reported using the systems regularly. The information was used to identify students,
teachers, and schools needing additional support and to decide how to design this support.
Teachers' responses were mixed, as 59% found interim assessment data moderately or very
useful for guiding instruction in the classroom. Kerr et al. (2006) found that, in interviews,
teachers described looking at item analyses to break down student needs by objective, to identify
topics that required reteaching and new ways of teaching, and to identify and talk with
colleagues who succeeded in teaching a particular objective. In addition, teachers noted that
classroom assessments provided more thorough and timely information or that district
assessments simply duplicated what teachers already knew based on classroom assessments and
review of student work (Kerr et al., 2006).
In all three districts, efforts were generally recognized and valued by staff. Monroe and
Jefferson promoted the use of data for instructional matters because they invested more time and
attention into data use strategies (Kerr et al., 2006). Monroe and Jefferson were more consistent
and successful in making data available for instructional decision making. Teachers and
principals were more likely to perceive the data as useful for making these decisions. In the
study, teachers and principals reported having multiple sources and forms of data available to be
used for instructional planning and decisions. In addition, nearly all principals and the majority
of teacher survey respondents reported having access to state assessment results for their school
as a whole and for students disaggregated by student groups and by subtopics or skills (Kerr et al.,
2006). Roosevelt's data were available to teachers and principals; however, the data were in a less
sophisticated and usable format, which limited the types of analyses that could be used by the
school staff to guide their work. Teachers in Jefferson and Monroe were most likely to find data,
when available, useful for guiding instruction in their classrooms. Jefferson and Monroe staff
reported more extensive and frequent use of data to identify areas of weakness and to guide
instructional decisions and spent at least five hours a week reviewing student achievement data
or reviewing student work with teachers. In interviews, principals described reviewing test
scores to identify student, classroom, and school deficiencies and regularly using this
information to change curriculum sequencing and target resources to students and teachers (Kerr
et al., 2006). Principals played a major role in supporting data analysis, as 79% of teachers
in Jefferson and 72% of teachers in Monroe reported that their principals regularly helped them
with data analysis and with adapting their teaching practices according to analysis of state or
district assessments in comparison to 56% in Roosevelt. Teachers in Jefferson and Monroe
repeatedly reported spending time in school/grade level meetings or professional development
sessions reviewing student assessment results and other data to group students, develop targeted
interventions, and identify student weaknesses and areas that required reteaching or
reinforcement (Kerr et al., 2006). District administrators in Jefferson and Monroe were more
likely to cite examples of data-driven decisions around instruction.
Several factors influenced districts' efforts to use data for instructional improvement
purposes, including the history of state accountability incentives; access to and timeliness of
data; perceived validity of data; flexibility to alter instruction; and staff capacity and support.
Roosevelt, Monroe, and Jefferson experienced added pressures from long-lasting
state accountability systems which aimed at developing individual school and student measures
of achievement (Kerr et al., 2006). The two districts operated for years in an environment with
strong incentives to carefully analyze students learning and test scores at individual students and
classroom levels, this may have contributed to a stronger motivation and capacity to analyze
data. Access to and timeliness of receiving data greatly influenced individual use for all three
districts. Most schools had the ability to see a variety of student data, disaggregated it, run item
analysis, and display results in multiple formats; however, individuals in all three districts
commonly complained that data was not timely (Kerr et al., 2006). School staff in each site often
questioned the accuracy and validity of measures, which affected individual buy-in for various
data sources. Principals and teachers also questioned the validity and reliability of the interim
assessments, as some tests had changed in quality from the first administration to the second.
Although assessment data revealed problem areas that required reteaching, teachers in Monroe
and Roosevelt opted to follow the curriculum instead of the data because of the perceived
pressure to stay on pace. Data-use skills and expertise in all three districts were observed, yet
capacity gaps were most visible in Roosevelt as teachers reported feeling less prepared to use
data. Based on interviews with district leaders in Roosevelt, the priority for professional
development on data use was not high due to unavailable data systems. In contrast, Monroe and
Jefferson made data analysis a high priority as they used school staff to complete initial analysis
and summarize results in easy-to-understand tables and graphs. In addition, school-based
coaches often took the first step of analyzing test results and presenting them in useable forms to
school faculties (Kerr et al., 2006).
This study investigated the role of one intermediary organization, the IFL, in influencing
district work around data use. The IFL made two key contributions to district efforts in the area
of data-based decision making. The first was that districts gained important concepts from the
IFL around notions of accountability and the importance of benchmarking progress. An example
of this contribution was seen at Jefferson, where these concepts were applied to the design of the
SIP efforts as leaders attempted to evaluate the implementation of SIPs in the lower performing
schools (Kerr et al., 2006). The second was that the IFL implemented one prevalent data
strategy in all three districts: the use of Learning Walks to assess the quality of
instruction in the classroom and schools (Kerr et al., 2006). The IFL provided protocols, tools,
and professional development for staff on how to conduct these walks, record observations,
analyze the evidence gathered, and make judgements about the quality of instruction as it related
to best teaching practices (Kerr et al., 2006).
Ikemoto and Marsh (2007) used data from two RAND studies of the Institute for
Learning (IFL). The data gathered from the two studies covered 10 districts in four states and included
interviews with more than 130 school district leaders (central office administrators and board
members), 100 school principals, and 80 other school leaders (assistant principals and coaches).
Interview data was collected from 115 teacher focus groups and survey data from 2,917 teachers
and 146 principals in three districts that partnered with IFL (Ikemoto & Marsh, 2007). The
analysis of this data was done in two stages. The first consisted of scanning all project documents,
interview notes, and transcripts to identify the broad types of data and analysis that educators
reported using, the dimensions on which these types of data and analyses varied, and the factors
that appeared to enable or hinder educators using data (Ikemoto & Marsh, 2007). The second
stage of data analysis consisted of refining an emerging typology of DDDM, which led to
identifying a sample of 36 DDDM examples from seven districts across the two studies. Because
the framework presented in the study was not used for the original data collection, the various
types of data and analyses in all interviews were not probed to generate the details necessary to
categorize all of the examples of DDDM in the full data set. As a limited number of examples
were included in the second stage of analysis, Ikemoto and Marsh (2007) cautioned against
generalizing the findings.
Ikemoto and Marsh (2007) defined DDDM in education as teachers, principals, and
administrators systematically collecting and analyzing data to guide a range of decisions to help
improve the success of students and schools. Mandinach et al. (2006) suggested that multiple
forms of data are first turned into information via analysis and then combined with stakeholders'
understanding and expertise to create actionable knowledge (as cited in Ikemoto & Marsh, 2007,
p. 108). First, raw data needed to be collected and organized. Educators used multiple types of
data: input data, such as school expenditures or the demographics of the student population;
process data, such as data on financial operations or the quality of instruction; outcome data,
such as dropout rates or student test scores; and satisfaction data, such as opinions from teachers,
students, parents, and the community (Ikemoto & Marsh, 2007). During the second step of the
process, the raw data was combined with an understanding of the situation through analysis or
summarization of information. Next, data users converted information into actionable knowledge
by using their judgment to prioritize information and weigh the relative merit of possible
solutions. Different types of decisions were
supported by actionable knowledge such as setting and assessing progress towards goals,
addressing individual or group needs, evaluating effectiveness of practices, assessing whether
client needs are being met, reallocating resources, or improving processes to improve outcomes.
Ikemoto and Marsh (2007) explained that the framework also recognized that DDDM could be
understood within a larger context. Data collection, analyses of performance, and decisions
varied across levels of the educational system, such as the classroom, school, and district. These
levels could also influence the nature of the DDDM process, which could affect educators'
ability to turn data into valid information and actionable knowledge. DDDM in practice was not
necessarily linear, as educators could skip a step in the process, draw on one data source, or
engage in the process alone (Ikemoto & Marsh, 2007).
DDDM could vary along two continuums: the type of data used and the nature of data
analysis and decision making. All forms of DDDM could be appropriate and useful, depending
on the purpose and the resources that were available. Educators utilized a range of simple and
complex data. Simple forms of data were less complicated and comprehensive, whereas
complex data was often composed of two or more interwoven parts and was more
multidimensional (Ikemoto & Marsh, 2007). Educators interpreted data and decided how to take
action in various ways. These actions also varied from simple to complex along the following
dimensions: basis of interpretation, reliance on knowledge, types of analysis, extent of
participation, and frequency. A DDDM process fell within one of four quadrants depending on
its level of complexity along the two continuums: quadrant I, basic DDDM; quadrant II,
analysis-focused; quadrant III, data-focused; and quadrant IV, inquiry-focused (Ikemoto &
Marsh, 2007). Basic DDDM entailed using simple data and simple analysis procedures, whereas
inquiry-focused DDDM involved complex data and analysis. Feldman and Tung (2001) described inquiry-focused
DDDM as purposefully utilizing the process as a means of continuous improvement and
organizational learning (as cited in Ikemoto & Marsh, 2007).
Ikemoto and Marsh (2007) relied on a sample of 36 examples that had adequate
information to place them in one of the quadrants of the framework. These particular examples
were categorized because they were relevant and because the authors were able to triangulate the
reports across various respondents. To some extent, educators employed all four types of DDDM, but their
efforts tended to reflect simpler models (Ikemoto & Marsh, 2007). Fifteen of the 36 examples of
DDDM analyzed resembled basic models of DDDM, where the majority of the examples
involved using state test results. One elementary school principal reported looking at state test
scores and noticed students were performing poorly in mathematics. The principal then decided
to focus teacher professional development time on strategies for teaching math. This example
showed DDDM relying on one type of data, from one point in time, and from a source that was
readily available. Probing revealed that the principal acted alone in interpreting data and
determining a solution (Ikemoto & Marsh, 2007). Another example of the basic model was of a
central leader encouraging secondary schools in the district to enroll students in two class periods
for subjects in which school-wide state results revealed weaknesses or in which student
subgroups experienced difficulty (Ikemoto & Marsh, 2007). Further discussion revealed that the schools
were using both achievement data and demographic data to design master schedules; however,
the response to the data was based upon one district leader’s intuition.
Nine of the 36 examples used the analysis-focused model of DDDM. In this model, state
test data were examined by groups such as school leadership teams and grade-level teams.
Iterative examinations of data, such as interim test scores, were also used. These examples were
less likely to take advantage of expert knowledge, empirical
evidence, and sophisticated analysis techniques to interpret and explain the data. One central
office leader reported regularly visiting schools to discuss individual student results on the
district interim assessments. The visits consisted of meeting with the principal, assistant
principal, and school coaches to identify through the data areas of weakness as well as potential
services and supports to address individual student needs. After these meetings, curriculum staff
from the central office held follow-up meetings with teachers and principals to help develop a
pacing calendar of curricular objectives and conducted further analyses of the interim test results
and state release test results to identify objectives that needed additional attention. Another
example of the analysis-focused model was of an elementary school principal who guided her
teachers in a process of disaggregating state test results to find patterns that might explain low
literacy scores. Through a series of meetings, teachers offered hypotheses, and the principal ran further
analyses of test scores to examine the merit of the hypotheses (Ikemoto & Marsh, 2007). Using
this process, the school discovered that students transferring into the school in kindergarten
through second grade were outperforming other students. Knowing that the other schools in the
district were using a different curriculum, teachers began to explore how each curriculum
addressed the skill areas that were posing problems for their students. Through this process, staff
discovered that the school’s curriculum was not thorough in covering skills such as letter
recognition, which were emphasized by the state standards and assessments. The staff decided to
draw from the expertise of the school’s literacy coach and develop supplementary materials for
these skill areas and made the decision to adopt the curriculum that was being used by other
schools in the district for the following year (Ikemoto & Marsh, 2007). This example
demonstrated that teachers and administrators began with existing data and engaged in further
data collection and analysis before ultimately deciding how to respond to the data. Educators
combined evidence with expertise through a collective process to develop actionable knowledge
and identify a solution.
Seven of the 36 examples used data-focused models of DDDM. Educators drew on
complex forms of data, often as a group, but often did not draw on expert knowledge or
empirical data. The first example was of a principal who turned to data for direction when the
district awarded the elementary school extra financial resources mid-year. His staff drew on
multiple types of data which included input, outcome, process, and satisfaction data. The
principal and his leadership team examined school-wide test scores and discipline data to
determine where these financial resources would make the most difference (Ikemoto & Marsh,
2007). In addition, the school leaders held 41 parent meetings to ask parents how the school
could improve in supporting their children. The leadership team then analyzed the results,
determined that reading was the school's number one problem, and hired two additional reading
specialists. In this example, the principal used complex data; however, the analysis process was
much simpler. A second example of the data-focused model of DDDM was of a district facing a
budget deficit that relied heavily on survey data to determine budget cuts that would minimize
the direct impact on students (Ikemoto & Marsh, 2007). In order to gauge
the needs and priorities for district-wide investments, the central office staff administered
surveys to principals, teachers, parents, and community members using an online service. Prior
to administering the surveys, a budget committee created a preliminary list of potential areas for
trimming based on its own preferences. Afterwards, the surveys asked respondents to select from
a final list of services, programs, and staff positions in order to achieve the potential reduction
goals (Ikemoto & Marsh, 2007). District leaders shared that without the process of data
collection, the school staff might not have been willing to accept the budget decisions. In this
example, staff relied on data from multiple stakeholders to make budget decisions; however, the
analysis and action did not utilize empirical or expert knowledge to interpret or explain the data.
Ikemoto and Marsh (2007) found five instances of inquiry-focused models of DDDM in
the sample of 36. These examples represented a significant investment of time and resources to
probe a particular problem of practice. Most often, they took place in formal settings such as
principal meetings, school faculty meetings, or professional development time. The first example focused on
improving the capacity to support ELLs. Leaders in one district noticed the low scores of ELLs
were jeopardizing the district’s ability to meet Adequate Yearly Progress (AYP) under No Child
Left Behind (NCLB) guidelines. The district received support from IFL to examine the
underlying causes of poor ELL performance. Learning Walks, a protocol developed by the IFL,
were used to walk through classrooms to collect evidence on current teaching practices. In
addition, school and district administrators began a series of observations in ELL and non-ELL
classrooms across the district. Through the questioning of students, examination of student work,
and observation of instruction and classroom materials, these observations systematically
collected information on, among other things, the nature and quality of student dialogue and the
clarity of instructional expectations (Ikemoto & Marsh, 2007). Based on the qualitative data and
on IFL and district expertise regarding best practices for ELL instruction, district leaders
concluded that teachers were not instructing ELL students with the same level of rigor
observed in the non-ELL classrooms. The IFL and district language development experts explored
research knowledge regarding rigorous instruction for ELLs and crafted professional development
opportunities for teachers and administrators (Ikemoto & Marsh, 2007). Prominent researchers
were invited to district-wide staff meetings, books and articles were disseminated, and study
groups with master ELL teachers were created. These expert teachers were expected to promote
rigorous instruction across the district and engaged in another process of data collection and
inquiry in monthly meetings. Participating teachers watched videos of instruction, observed
each other demonstrating lessons, and discussed ways to inspire ELL students to excel (Ikemoto &
Marsh, 2007). Ikemoto and Marsh (2007) expressed that in this example, educators drew on
multiple types of data, engaged in a collective effort to examine evidence, and considered expertise
as part of an ongoing process of improvement.
Ikemoto and Marsh (2007) found a common set of factors that explained why educators
engaged in DDDM and why some did so with greater levels of complexity than others. One
factor was access to and timeliness of receiving data, which greatly influenced individual use.
Based on the IFL study, findings showed that educators were much more likely to use data in a
district that enabled access through an online data system. Accessibility of multiple forms of data
was particularly important to support educators in pursuing complex DDDM processes. Educators who
were more likely to examine, analyze, and triangulate multiple forms of evidence tended to be in
states or districts that collected and published data beyond typical achievement, attendance, and
demographic summaries (Ikemoto & Marsh, 2007). Perceived validity of data was another
common factor in DDDM. Schools and staff often questioned the accuracy and validity of
measures, which affected individual buy-in; however, validity was less of a concern
to educators engaging in complex DDDM because they used multiple data sources and were able
to engage in their own data collection to address missing data or data perceived to be invalid
(Ikemoto & Marsh, 2007). Choppin (2002) asserted that school personnel often lack adequate
capacity to formulate questions, select indicators, interpret results, and develop solutions (as cited
in Ikemoto & Marsh, 2007, p. 121).
In addition, the lack of time to analyze, synthesize, and interpret data also limited DDDM
in multiple study sites (Ikemoto & Marsh, 2007). However, when administrators made DDDM a
priority during professional development sessions and/or faculty, department, and grade-level
meetings, this supported the process. Districts and schools that engaged in complex DDDM
processes had to allocate valuable time to create new structures that enabled individuals to
collectively interpret data and develop actionable knowledge. Partnerships with external
organizations such as universities, consultants, and state departments of education were able to
support schools and districts by providing valuable technical assistance and needed resources
(Ikemoto & Marsh, 2007). External organizations were particularly helpful in facilitating more
complex forms of DDDM by assisting educators in the process of transforming raw data into
information and actionable knowledge. Tools were also important for several users of complex
DDDM, which came from external organizations to support in the inquiry process. One example
present in the study was through IFL and the implementation of a systematic observations of
instruction which consisted of protocols for recording information, rubrics for comparing these
data to notions of best practices, and worksheets and procedures to guide reflections and action
steps (Ikemoto & Marsh, 2007). Culture and leadership were also important within schools and
districts as it influenced patterns across data. Administrators with strong visions of DDDM who
promote norms of openness and collaboration greatly enabled data use in some places, yet a
trusting, data-driven culture was particularly important for complex DDDM in the district.
Ikemoto and Marsh (2007) stated that respondents described how complex the DDDM process
is, as it involves digging beneath the surface to develop deeper understandings of the underlying
causes of problems and asking questions. In data-driven cultures, colleagues were willing to
constructively challenge each other to provide evidence for claims made during an inquiry
process (Ikemoto & Marsh, 2007). Accountability systems such as the NCLB Act have put
pressure on schools to make progress in improving school and student achievement. Yet,
federal and state policies have not emphasized the value of using multiple sources and types of
data when examining student achievement (Ikemoto & Marsh, 2007). Short time frames for
school improvement planning had also prevented educators from being thorough and collective
in DDDM.
In conclusion, teachers who engage in learning communities are able to collaborate and
develop skills necessary to understand data use (Ikemoto & Marsh, 2007; Kerr et al., 2006).
Teachers who work in learning communities can engage in meaningful discussions with
colleagues regarding data use and instructional practice in the classroom. These opportunities
allow teachers to build confidence within a team to ask questions and gain greater depth of
understanding about the information presented and the strategies that can support their learning
in this approach (Marsh, 2012; Young, 2006). What is missing from this literature is evidence of the way that teachers'
discussions in their grade levels or PLCs carry over into the decisions they are making in their
instruction and the way those decisions are revealed through their interactions with their
students.
Teacher Beliefs on Data Use
To understand the role of teacher beliefs in data use, I needed to examine how teacher
beliefs impacted teachers' use of data in their schools. Teacher beliefs can either support or hinder
teachers’ use of data in the classroom (Dunn et al., 2013b). Datnow and Hubbard (2016)
explained that trust plays an important role when teachers engage in DDDM to create an
environment where teachers can explore the components of this approach. In addition, providing
teachers with opportunities to understand data and turn the information into actionable
knowledge can support DDDM (Dunn et al., 2013b; Schmidt & Datnow, 2005).
Datnow and Hubbard’s (2016) literature review examined teachers’ capacity for data use
and their beliefs about data use. Datnow and Hubbard (2016) found that the literature on teachers'
capacity for data use and the literature on their beliefs were not well connected. A framework was derived from extensive literature to
understand teachers’ capacity and beliefs of data use. The guiding questions for the investigation
were: What efforts have been undertaken to build teachers’ capacity for data use? and What is
the range of teacher beliefs about data use? Datnow and Hubbard (2016) categorized their
findings into six themes: structured collaboration, coaching, university and consultant
partnerships, and other forms of training and leadership.
Datnow and Hubbard (2016) used a literature review as their methodology. The two
criteria for inclusion in the review were that sources addressed building K-12 teachers' capacity
for data use or teachers' beliefs about data use and that they were published after 2001, when
NCLB made accountability policies a key initiative in the US. The authors searched databases
such as Google Scholar and ERIC using specific terms (e.g., teacher, data use, teacher
professional development, teacher training). The
review of the research drew upon the topics of teacher capacity for and beliefs about data use in
qualitative and quantitative research studies, mixed method studies, literature reviews, and
conceptual/theoretical publications. A limitation in this literature review was that it did not
include every relevant publication on the topic (Datnow & Hubbard, 2016).
Datnow and Hubbard (2016) suggested that structured collaboration provided teachers
with the opportunity to engage in dialogue around data, teacher pedagogy, and student
achievement. This form of collaboration can be supported by creating simple, common
data analysis protocols for teachers to use productively and through the use of reflection (Datnow
& Hubbard, 2016). The quality of conversations was also important, as it supported teachers
when interpreting data in workgroups (Datnow & Hubbard, 2016). Professional learning
communities, data coaches, and content area coaches were popular interventions, and coaching
in particular showed long-term changes in developing teachers' capacity to use data (Datnow &
Hubbard, 2016).
The second finding to emerge was that coaches served as facilitators in building
teachers' capacity for data use (Datnow & Hubbard, 2016). For coaches to be successful
in this position, coaches needed to demonstrate strong interpersonal skills, content, and
pedagogical knowledge. Datnow and Hubbard (2016) explained that coaches who observed and
assessed teachers as they engaged in the data use process were better able to identify different
ways to assist teachers with data analysis. Coaches needed to model, provide feedback, and
share expertise to connect teachers with the necessary resources to build their understanding of
data (Datnow & Hubbard, 2016). In addition, the authors noted that teacher leaders served as
facilitators in supporting teachers to develop skills around data use. Partnerships between
principals and lead teachers were important because both needed to have a deep understanding of
content knowledge and facilitation skills to support teachers as they examined data. Partnerships
between coaches and principals were also important when building teachers' capacity for using
data. Characteristics of these partnerships included close collaboration, trust, and openness to
facilitate efforts of using data among teachers. Datnow and Hubbard (2016) expressed that
although coaches can be beneficial, they have at times hindered teachers' development in using
data effectively.
Dunn et al.’s (2013b) study assessed teacher efficacy constructs that either supported
or inhibited DDDM. The study also explored teachers’ efficacy for their
abilities to analyze and interpret data as well as their efficacy for connecting those findings to
classroom practice. A total of 1,728 K-12 teachers from 193 schools participated in a
state-wide professional development program which aimed to increase teacher DDDM in a state
in the Pacific Northwest. The participants were teachers from the convenience sample of
participating schools who varied in their experience implementing DDDM, as indicated by their
responses to the online 3D-MEA inventory, as well as in their training in DDDM (Dunn
et al., 2013b). Forty-five percent of participants indicated that they had implemented DDDM for
less than one year, 24% had implemented DDDM for one to two years, 24% had implemented
DDDM for three to five years, and 7% indicated they had implemented DDDM for five or more years
(Dunn et al., 2013b). Teachers participated in the state-wide DDDM program; however, participation varied
among schools. The professional development consisted of a 2-day seminar training in the prior
year, as well as a one-year job-embedded follow-up training. Half of the participants had not
participated in either the 2-day seminar training or the job-embedded training at the time
they completed the 3D-MEA inventory (Dunn et al., 2013b). The participants were
predominantly female, and their ages and work experience ranged along a wide continuum
(Dunn et al., 2013b). To fully protect the identity of the teachers who participated in this
research study, the union did not allow any other form of participant demographic data collection
(Dunn et al., 2013b). The state’s student population included 33.7% minorities and the state’s
teacher population included 8.4% minorities (Dunn et al., 2013b). The participants represented a
wide variety of schools ranging from extremely small rural schools to large suburban and large
urban schools (Dunn et al., 2013b). Dunn et al. (2013b) classified 44% of the schools in the study
as elementary schools, 18% as middle or junior high schools, and 18% as high
schools. The remaining 20% were classified as schools serving kindergarten through 12th grade
(Dunn et al., 2013b).
The 3D-MEA inventory was the tool used for this study and was created in collaboration
with two researchers who developed the DDDM professional development for teachers and an
educational psychologist who was an outside evaluator for the project (Dunn et al., 2013b).
There were 35 items that were developed to assess the four components of DDDM: efficacy for
data identification and access, efficacy for data technology use, anxiety for DDDM, and efficacy
for data analysis, data interpretation, and the application of data to instruction. A five-point
Likert scale was used, ranging from one (strongly disagree) to five (strongly agree). From the
original 35 items, 13 were removed by consensus of the three authors for the following reasons:
perceived conceptual overlap or redundancy, perceived questionable terminology, and/or
perceived ambiguity. The final
proposed inventory included 22 items and was used for data collection and statistical analysis.
Teachers who participated in the study were invited to respond to the 3D-MEA inventory online
after the first year of DDDM professional development, whether or not they had participated in
the trainings (Dunn et al., 2013b). A total of 1,855 inventories were completed; however, 127 of
these were deleted because they were missing information. Dunn et al. (2013b) divided the
remaining 1,728 inventories into split-half samples of 864 each. Sample 1 was used to perform an
exploratory factor analysis (EFA) to determine the factor structures, whereas sample 2 was used
to perform a confirmatory factor analysis (CFA) to confirm the factorial validity of the identified scales.
Initially the 3D-MEA was designed to assess four hypothesized components of DDDM.
The originally hypothesized component of efficacy for data analysis, interpretation, and
application to instruction was not supported by the data, and two distinct factors emerged: one for
data analysis and interpretation and one for the application of data findings to instruction (Dunn et
al., 2013b). The separation of these two components indicated that teachers perceived their abilities to
analyze and interpret data as separate from their abilities to connect their data-based interpretations
to instructional decision making. Interpretation was defined as teachers’ beliefs in their ability to
successfully analyze and interpret student data, whereas application was defined as teachers’
beliefs in their abilities to successfully connect or apply their interpretation of data findings to
classroom instruction to improve student learning.
Dunn et al. (2013b) hypothesized that added emphasis was needed on understanding the
basic skills required to evaluate data and interpret data-based findings because teachers clearly
believed that the skills needed to understand data were significantly different from the skills
needed to apply data to classroom decision-making. The findings indicated that DDDM anxiety
was inversely related to the other aspects of DDDM efficacy measured by the 3D-MEA: efficacy
for data technology use, efficacy for data identification and access, efficacy for data analysis and
interpretation, and efficacy for application of data to instruction. The development of sound
measures for these DDDM-specific constructs provided not only preliminary evidence to support
DDDM efficacy and anxiety, but also provided the required tool to further explore these
constructs and their impact on teacher practices (Dunn et al., 2013b). The 3D-MEA inventory
may be useful for school leaders seeking to develop professional development in exploring
teachers’ beliefs about their ability to successfully engage in DDDM to support positive changes
for student learning (Dunn et al., 2013b).
Schmidt and Datnow’s (2005) study examined teachers’ emotions in the process of
making sense of educational reforms. The study drew upon qualitative data that was gathered in
a longitudinal, 4-year case study of comprehensive school reform (CSR) in five schools located
in two US states, California and Florida. The schools were located in rural and urban areas and
served low-income and diverse student populations. Data was reported on three elementary
schools in California and two elementary schools in Florida. The CSR models implemented by
the schools were among the most popular in the US and included Accelerated Schools, the Comer
School Development Program (SDP), Direct Instruction (DI), the Edison Project, and Success for
All (SFA) (Schmidt & Datnow, 2005). Funding was provided to three of the five schools by
the federal Comprehensive School Reform Demonstration (CSRD) program. These schools were also selected because
they reflected a range of reform implementation levels. Approximately 75 teachers were
interviewed: nine at the SFA school, eight at the Direct Instruction school,
31 at the Comer SDP school, 15 at the Edison school, and 13 at the Accelerated School. The
interviews represented a range of opinions and levels of involvement in reform efforts. Most
teachers were interviewed once; however, in smaller schools some teachers were interviewed
more than once on consecutive visits. Most interviews lasted approximately 45 minutes and were
taped and transcribed verbatim. The interviews were guided by semi-structured protocols, and a
variety of questions were asked about how teachers defined the reform model in their school and
the emotions the reform elicited. The data analysis began by coding data that responded to
particular questions, and the data was grouped according to themes. First, teachers’
interviews were coded by school; the coded data were then merged into a file organized by code
rather than by school. Notes made during the coding process included information
regarding the description of codes and issues that emerged in the coding (Schmidt & Datnow,
2005).
Schmidt and Datnow (2005) defined three of the reform designs (SFA, Edison Project,
DI) as highly structured and specified; these models provided curriculum, lesson plans, school
organizational models, implementation plans, and professional development. SFA and DI
focused primarily on reading, whereas Edison addressed all subject areas. Edison was the most
comprehensive and contentious reform model, as it required schools to become charter schools
when they adopted the model and was the only reform model in the study operated by a
for-profit company (Schmidt & Datnow, 2005). Accelerated Schools and Comer
SDP were less structured and specified, as these models asked schools to commit to a guiding set
of principles and to engage in an inquiry-guided, locally driven process of self-renewal. The
structure of some reform models was deliberately left open for teachers to negotiate meaning
within them, whereas others offered more structure.
Many of the teachers in the SFA and DI schools had clear definitions of the reforms they
were engaged in, as they were able to describe the reforms in their schools in similar ways
(Schmidt & Datnow, 2005). DI school teachers described their reform consistently; however,
some noted a particular emphasis on drilling and repetition in the program. Schmidt and
Datnow (2005) found more uniformity of meaning in the schools implementing more specified
reform models, yet there was no certainty that teachers were always comfortable and enthusiastic
about the reforms. Edison, a more complex reform, raised more emotions and conflicts for
teachers; one teacher described the program as extremely stressful (Schmidt & Datnow,
2005). Less structured reform models led to more variety in definition as well as a range of
emotions. Comer SDP school teachers were able to construct a consistent definition of the
reform among themselves, yet most could only speak vaguely about the reform. Schmidt and
Datnow (2005) found more diversity than uniformity in
definition at the Accelerated School. Teachers at the Accelerated School admitted that they did
not receive any training at all, which led to misunderstandings as to what the reforms were
about. Some teachers had limited knowledge about the reforms, seeing them only as one
component rather than as an overall school improvement plan (Schmidt & Datnow, 2005).
Limited involvement with the reform led teachers to experience anxiety, frustration, and
uncertainty about how the reform fit into the school. Teachers experienced frustration with the
state of affairs at the school as the models had never been implemented. The most unusual
finding in the study was that for some teachers the reform had no meaning at all. The
teachers' inability to describe the reforms at their schools may have revealed the daunting
task of meaning-making itself and the difficulties teachers may have when attempting to put into
words what they think about something they are uncertain about (Schmidt & Datnow, 2005). The
lack of training and informed dialogue led to teachers' vague understanding of the reforms
they were implementing. Moreover, where reforms allowed teachers to negotiate their own
meaning, this led to frustration on the part of some teachers.
Teachers were influenced by their emotions, past experiences, ideologies, and
teaching styles when making meaning of the reforms in their classrooms. The structured reform
models (SFA, Edison, and DI) elicited a range of meanings and emotions from teachers, such as
contentment, joy, and satisfaction when the reforms supported building their skills and/or
aligned with their own teaching beliefs (Schmidt & Datnow, 2005). Some teachers found
Edison's structure comforting because the model had very set patterns to follow. In
contrast, other teachers felt that Edison was stressful at the beginning of implementation
because new instructional routines were introduced in all subject areas. Teachers described
the Edison reform model implementation as a steep learning
curve. All of the structured reforms brought comfort, and trust levels rose, when teachers were
able to see the benefits of the reform for their students. In addition, some teachers expressed
boredom implementing the Edison and DI reform models because they repeated the same
routine in their classroom instruction, and it became quite redundant. Schmidt and Datnow
(2005) found cases where teachers negotiated new meanings of reforms. For example, teachers
found ways to resolve aspects of the reform they found problematic and adapted its
meaning in the context of their own teaching practice to support their students. Teachers
expressed feelings of guilt and frustration when making modifications and adaptations to the
reform models to support the students in their classrooms. Reform models such as Comer SDP and
Accelerated Schools prompted a different set of emotions for teachers at the level of classroom
practice sense-making. Schmidt and Datnow (2005) found that some teachers felt they had
been embracing the reform all along by defining it in ways that were consistent with
their current practice. On the contrary, teachers who were not able to define the reform at all
were not able to see connections to their classroom practice. This was evident when teachers were
asked in interviews whether the Accelerated Schools reform was implemented in their classrooms;
most responded negatively. Positive and negative emotions were evident within the classroom
context, in particular feelings of ambiguity. Moreover, when reforms conflict with teachers' own
moral purposes or become too politically high stakes, harm can be done, as emotions do not
encourage actual change in the context of the classroom (Schmidt & Datnow, 2005).
In conclusion, teachers’ beliefs play a significant role in DDDM as they can support or
hinder their motivation to implement this approach. To engage in DDDM, teachers must value
the use of data in their practice (Ikemoto & Marsh, 2007). Teachers who do not believe in
DDDM need additional support from leadership. Creating opportunities for teachers to engage in
79
activities and feel successful in data use through professional development will increase
teachers’ motivation and buy in to engage in DDDM (Schmidt & Datnow, 2005). What is
missing from this literature is culturally relevant pedagogy and critical reflection. In addition,
looking at how teachers’ share their beliefs during discussions with colleagues about ways to
meet the needs of historically marginalized students and how the assets they bring can be
leveraged when teachers plan instruction based on the data analyzed.
Conceptual Framework
In this section, I present my original conceptual framework, which provided the overall
framework for this study. According to Maxwell (2013), the conceptual framework is a model
and a tentative theory of the phenomena that the researcher plans to study. This framework
contains elements that I believed operated in relation to each other to explain the phenomenon
I explored: how an elementary teacher uses data to inform her instruction. My conceptual
framework served as a guide that informed how I collected, analyzed, and interpreted data.
My conceptual framework was built on three concepts: leadership, learning communities,
and teacher beliefs about data driven decision making. I argued that the leader in a school created
conditions and expectations to support the learning during professional development. I believed
that the interaction between the leader and the professional learning conditions supported the
content of the professional development. The interaction between the teacher and the learning
conditions of the professional development then helped develop teachers’ understanding of
content to implement in their classroom. The teacher and the leader had a continuous relationship
to support the teacher's practice in the classroom. To foster the interactions between the leader
and the teacher outside of professional development, the leader would observe the teacher in
action and provide feedback. As a result, this model of leadership, professional development, and
teacher supported student learning (see Figure 1). I discuss each of these components in more
detail below.
Figure 1
Conceptual Framework: The relationship between the conditions for data discussions and
teacher practice.
The first concept of my conceptual framework was leadership. I asserted that the leader
in schools created expectations and conditions that either supported or impeded teacher learning
during professional development. For teacher learning to take place, the principal created and
maintained a positive school environment and power dynamics for teacher learning (Huguet et
al., 2014). One way the leader promoted a positive learning environment was by creating time to
hold learning sessions, as opposed to staff meetings, in which teachers and administrators built
trust and unity (Datnow et al., 2013; Young, 2016). Learning sessions would then
allow leaders to define what constituted meaningful data and allow teachers to voice their
professional judgment about data (Wayman, 2005).
professional judgement about data (Wayman, 2005). The leader presented norms that supported
the development or maintenance of a positive environment for teacher learning and set
expectations and established common goals to support the culture around data (Datnow et al., 2013;
Wayman, 2005). These expectations consisted of agenda setting and modeling best practices that
served to help process and build teachers’ understanding of the professional development
content. The leader used an agenda to remind teachers of his/her expectations during professional
development. To create conditions for learning, the principal also used facilitators (e.g.,
instructional coach) to engage in structured collaboration, which consisted of modeling how to
analyze data and provide questions to guide and increase engagement in conversation (Datnow et
al., 2013). The leader demonstrated content area and interpersonal expertise to support
conversations with teachers to identify areas of need (Breiter & Light, 2006). The teacher group
was expected to use discussion protocols to provide and ensure that they had the opportunity to
process information and understand content presented during professional development (Datnow
et al., 2013).
The second concept of my conceptual framework was professional learning communities.
I asserted that the interaction between the teacher and the learning conditions of the professional
development helped develop teachers' understanding of content to implement in their classrooms.
Establishing PLCs supported teacher learning through collaboration among colleagues, a focus on
student learning, and reflective dialogue about teacher practice (Horn & Little, 2010). To build
professional learning communities, I argued that the school leader provided time for teachers to
participate in grade-level meetings to collaborate, analyze and synthesize data, ask questions, and
build trust among colleagues (Kerr et al., 2006; Schifter et al., 2014; Young, 2006). Time would
be a contributing factor to the development of teachers' understanding of content, as it allowed
individuals to interpret data and make sense of the information they were analyzing (Ikemoto &
Marsh, 2007). The specialized roles (e.g., grade-level chairs, literacy coach, Title I coach)
created by the leader facilitated teachers’ learning in small groups to support active
conversations among teachers to analyze and synthesize the information that was used in
decision making (Kerr et al., 2006; Young, 2006). Teachers would explore multiple sources of
data (e.g., writing samples, journal responses, district formative assessments) to identify areas
of need and calibrate their interpretations of the data with one another (Kerr et al., 2006; Marsh,
2012; Young, 2006). Teachers would use a form of documentation (e.g., a school plan template) to
evaluate data and identify instructional areas for improvement (Young, 2006). This would
serve as a guide to support teachers in engaging in conversation about what they were seeing in
the data and in raising questions about areas of concern. When data were created and presented to
teachers in a form that was usable and safe, teachers were able to interpret and make sense of the
data (Kerr et al., 2006; Marsh, 2012). Usable and safe data would be important for building
mutual trust among teachers who were participating in data inquiry (Marsh, 2012).
The third concept of my conceptual framework was teacher beliefs on DDDM. I asserted
that teachers had to believe in the value of using data if they were going to engage in DDDM.
Teachers’ beliefs might come with different levels of data use experience. If teachers did not
believe that data was an important piece for instructional improvement, then the leader would
have to use professional development to support the use of data in their practice. For teachers
who already believed in DDDM, then they would actively participate in professional
development presented by the leader on the use of data to inform instruction. However, on the
other hand, teachers who did not believe in DDDM would need additional support from the
principal to embrace DDDM as an approach to instructional improvement. Teachers in both
cases would work closely with the leader and have a continuous relationship to support their
practice in the classroom. The leader was expected to create norms of openness and collaboration
to support the teacher in developing skills to analyze and interpret data (Datnow & Hubbard,
2016; Ikemoto & Marsh, 2007). This support would then help shape teachers' attitudes,
confidence, and beliefs about their ability to use data and help them develop the skills
necessary to interpret data and turn it into actionable knowledge in the classroom (Datnow &
Hubbard, 2016; Donnell & Gettinger, 2015). Professional development on DDDM would
support and enhance teachers' understanding of how to interpret and analyze data and use the
information as actionable knowledge in the classroom (Datnow & Hubbard, 2016; Donnell &
Gettinger, 2015; Dunn et al., 2013).
Chapter 3: Methods
This chapter describes the qualitative approach of this study, which included the
instrumentation, data collection, and data analysis procedures I used to conduct this study. The
purpose of this study was to examine how a teacher made sense of data in her school site to
support her instructional practice in the classroom. This qualitative single-case study was
informed by the following research questions:
1. How does one elementary school teacher use data in collaboration and professional
development meetings?
2. What role does leadership play in the way that one elementary school teacher uses
data in collaboration and professional development meetings and in her classroom
instruction?
3. How are those data use activities reflected in her classroom instruction?
Research Design
Merriam and Tisdell (2016) define qualitative research as an inquiry approach for
investigating something in a systematic manner. Qualitative researchers focus on how
people construct their worlds and interpret their experiences (Merriam & Tisdell, 2016). For this
study, I was interested in how a teacher constructed meaning from various forms of data to
support her instruction in the classroom. I used a qualitative single-case study approach that
positioned me as the primary instrument for data collection and analysis (Merriam & Tisdell,
2016). A case study is an in-depth description and analysis of a bounded system (Merriam &
Tisdell, 2016). The unit of analysis for this study was an elementary school teacher and her
working environment. I examined one first-grade teacher who had been using data in
her practice to inform her classroom instruction. I was able to observe and examine how this
teacher made sense of the data used at her school site and how the information she
gathered during collaboration and professional development meetings drove her instruction in
the classroom.
Sample and Population
In this qualitative single-case study, I used purposeful sampling to select a teacher who
would help me understand the phenomenon of interest. Merriam and Tisdell (2016) describe
purposeful sampling as selecting participants who know the most about the topic. For this dissertation
study, I purposefully selected a teacher who gave me insight as to how an urban elementary
school teacher made sense of data during collaboration and professional development meetings
to support her instruction in her classroom.
Setting
I began by seeking a district that actively participated in research. I
selected Wave District Schools because I knew they had participated in research with other
organizations of which I was a part. This led me to the district website, where I was able to
email the director of special projects to arrange a meeting to discuss my study. I used convenience
sampling (Merriam & Tisdell, 2016), as I knew the director of special projects from my previous
work experience. This connection allowed me to communicate with her and gain insight into Wave
District Schools. I then met with the director of special projects a second time to interview her,
gain an understanding of the district's instructional focus, and determine whether the elementary
school sites were actively using data to inform classroom instruction. She was
able to provide me with a list of three schools that would be potential candidates for my study.
Once permission to conduct my study was granted by both the institutional review board (IRB)
and Wave District, I emailed the principals at the three school sites to set up a phone meeting
where I used a screener to determine the school that best fit my study's selection criteria,
which, consistent with Ikemoto and Marsh (2007), included the school site being a public
elementary school implementing schoolwide data use in an urban community. Milner et al. (2015)
define an urban intensive community as a large city with a population in excess of one million
people where limited resources are often expected. In an urban intensive community, outside-school
factors such as housing, housing policy, poverty, and transportation are directly connected
to what happens inside of school. Wave District met this definition. Research indicates that
schools that serve low-income children (e.g., those receiving free or reduced-price lunch) are
using data-driven decision making to create meaningful learning opportunities for students
(Young, 2006). The screener allowed principals to engage in a short interview process in which
they answered questions regarding data use. For example, I asked, "How long have you been
implementing data use at your school site?" and "What conditions, if any, did you establish with
respect to the way you wanted teachers to use data?" After reviewing the screener responses for
all three schools, I
determined Sunshine Elementary to be the best fit for my qualitative single-case study because
the school site was actively using data and had the support of leadership to lead the work of
data use to inform teachers' classroom instruction. In addition, the demographics of Sunshine
Elementary consisted of 9.7% African American, 10.3% Asian, 0.7% Filipino, 68.4% Hispanic or
Latino, 0.5% Pacific Islander, 8.8% White, and 1.6% Two or More Races (Wave School District
website, n.d.). I followed up with the special projects director
once I had selected the school site and emailed the principal at Sunshine Elementary to begin my
selection process for a teacher participant.
Participants
I began the process of seeking a teacher for this study by speaking with the principal and
asking about a grade level that was using data and would be interested in participating in a
qualitative single-case study. The principal guided me to a group of first-grade teachers who had
a reputation for working with data as a grade-level team. After identifying the team of teachers, I
used the concepts of my conceptual framework, leadership, learning communities, and teachers'
beliefs about data use to drive decision making, to guide me in screening and selecting the
teacher I studied. Of the four first-grade teachers, one was very eager to participate in the
study and volunteered immediately. I selected her because of her willingness to participate
and her reputation for using data in her teaching practice (Dunn et al., 2013).
Data Collection and Instruments/Protocols
I conducted 11 observations during collaboration and professional development meetings
over the course of 3 months. Collaboration meetings were observed for approximately 50
minutes to 1 hour and professional development meetings were observed for 2 hours for a total
of 14 hours. In addition, I conducted three observations in the teacher's classroom to see how the
collaboration and professional development meetings about strategies and data supported and/or
influenced the teacher's instruction in the classroom. Each classroom observation lasted 1 hour.
I approached the classroom observations with flexibility because I wanted to see how the teacher
believed she was carrying over the discussions from collaboration and professional development
meetings into her lesson planning and classroom instruction. I conducted a 2-hour interview with
the teacher to understand how she made sense of data in the meetings through collaboration with
colleagues and how teacher conversations shaped the understanding that guided her instruction in
the classroom. I explained the purpose of the inquiry and the methods I used to the teacher who
participated in my study, along with the teachers present at any meetings; obtained informed
consent; and assured confidentiality (Merriam & Tisdell, 2016). Although my focus for this
study was on one teacher, I had to obtain consent from all teachers present to participate in their
collaboration meetings.
Interviews
In qualitative studies, the most common form of data collection is interviews (Merriam &
Tisdell, 2016). Merriam and Tisdell (2016) also add that the main purpose of an interview is to
obtain a specific kind of information. In qualitative research, conducting interviews allows the
researcher and the participant to engage in conversation that provides insight into the research
questions. In this qualitative single-case study, I conducted one formal interview, an initial
interview with the teacher to gain some background information about her teaching experience
and understand if she believed in and valued data. This led me to ask her questions about what
type of data she used to inform her classroom instruction and how often she engaged in data use.
I began the interview with the teacher by establishing rapport and explaining the purpose of the
interview. Audiotaping the interview allowed me to capture everything said by the teacher. I took
notes during the interview to record nonverbal mannerisms and any probing questions (e.g., tell
me more, can you provide an example?).
In addition, I conducted two informal interviews after collaboration meetings and one
informal interview after a classroom observation to ask clarifying questions and gain insight into
topics that were discussed that I did not fully understand. After collaboration meetings, I asked
the focus teacher about her planning of instruction for a particular iELD lesson and her structure
for a dot talk she was planning on implementing for math. Moreover, I was able to ask her how
she grouped students for different content areas (e.g., Math, ELA) after a classroom observation
which helped me understand her knowledge of her students.
The format of the interviews was semi-structured, which allowed for a mix of more and
less structured interview questions and flexibility (Merriam & Tisdell, 2016). I developed
questions for my semi-structured interview using the concepts of my conceptual framework,
leadership, learning communities, and teacher beliefs about data use, as a guide. I asked
questions that allowed me to understand how the teacher selected data in her school environment,
what structures were in place, if any, to understand the data being used, and how the teacher
used this data to inform her instruction in the classroom (see Appendix C). In addition, I asked
questions that provided insight into how collaboration in her school, if any, supported her
understanding of data use at her school site. The questions I asked examined who made decisions
about what data teachers used, what protocols were used for data discussion, what role the
principal played in the way that data was made available, how time was made available for data
discussions, and whether protocols, coaches, or other supports for data discussion were present
during meetings. My informal interviews, by contrast, were unstructured; I crafted the questions
based on what I saw. For example, I wanted to know more about her grouping of students, so I
asked her to share how she decided to group students and whether groupings stayed the same
across all content areas.
Observations
In qualitative research, interviews and observations are used to collect data. Merriam and
Tisdell (2016) explain that observations can be distinguished from interviews because observations
take place in the setting where the phenomenon of interest naturally occurs and because
observational data represent a firsthand encounter with the phenomenon of interest. For this
study, I conducted 11 in-person observations during collaboration and professional development
meetings. I also conducted a total of three in-person observations in the teacher's classroom to
observe how the teacher incorporated changes in her instruction based on discussions observed
in the collaboration and professional development meetings. Each observation was
approximately 1 hour long, and I observed a series of lessons in specific content areas (e.g.,
Math, iELD) as they were linked to the data conversations in collaboration and professional
development meetings. I was specifically looking at how the teacher made sense of the data
discussed in meetings with colleagues and how this then led to instructional shifts in the
classroom to support student learning. I conducted classroom observations to capture evidence
of the way the teacher was bringing data into the classroom. I remained flexible with observations
throughout my time in the field to be open to whichever approach the teacher took after data
discussions and to observe during designated times.
Documents and Artifacts
Merriam and Tisdell (2016) state that documents are the third major source of data in
qualitative research. Documents support researchers to uncover meaning, develop understanding,
and discover insights that are relevant to the research problem (Merriam & Tisdell, 2016).
Merriam and Tisdell (2016) explain that artifacts are used to represent some form of
communication that is meaningful to participants and the setting. I took pictures of artifacts
during collaboration meetings and in the teacher’s classroom that represented how she
understood data and used the information to inform her instruction in the classroom. I collected
documents such as district and school site professional development calendars, classroom
artifacts and parts of lesson planning. These documents were used to triangulate data sources
(e.g., professional development, collaboration meetings, classroom observations) to understand
the phenomenon (Patton, 2002). District and school site professional development calendars
helped me plan times to be present and observe these meetings. Classroom artifacts and
lesson plans helped me understand whether there was a relationship between the teacher's practice
and the data discussed in collaboration meetings. These documents and classroom artifacts helped
me understand how the teacher was making use of these tools to support her classroom
instruction.
Data Analysis
For this qualitative study, data consisted of transcriptions from the teacher interviews,
observation field notes from collaboration meetings, professional development, classroom
observations, reflective memos, and documents and artifacts. The data analysis was an
iterative process, as I continuously revisited the interview and observation data to develop my
codebook. I transcribed the recorded interview and began the process of open coding by
engaging with every line of the transcript and underlining key phrases, which helped me
identify initial categories (Harding, 2013). I followed this process for my observation data by
starting with collaboration meetings followed by professional development and ending with
classroom observations. As I explored observations and transcripts, I highlighted areas in the
data where the teacher discussed her belief and value of data use in addition to how she
interacted with data at her school site. In addition, structures of time, leadership responsibilities,
and data conversations were areas I highlighted in the data to understand conditions and
structures in place at a school site.
As part of the first cycle of data analysis, I engaged with both analytic tools and open
coding. Corbin and Strauss (2008) describe data analysis as a process in which the researcher
interacts with raw data and uses techniques to derive concepts. Reflective notes and memos were
gathered during my data collection of interviews and observations. Merriam and Tisdell (2016)
describe reflective memos to be a form of analysis. I used my reflective memos to capture ideas
that developed throughout my data analysis (Merriam & Tisdell, 2016). I read my reflective
memos to give myself insight into what I had been thinking about at the time of data collection.
Asking myself questions and summarizing my ideas in various reflective memos helped me
make connections and made me aware of what I wanted to clarify with the teacher in follow up
observations. In addition, the use of analytic tools helps analysts avoid standard ways of thinking
about a phenomenon and stimulates the inductive process (Corbin & Strauss, 2008). The
analytic tools I used during my data analysis included questioning, looking for words that
indicate time, thinking about the various meanings of a word, making comparisons, drawing upon
personal experience, and attending to the language and emotions expressed by the participants
(Corbin & Strauss, 2008).
I asked questions of the data I engaged with to help me think about the perspective of my
participants and help me gain a better understanding of my research questions. I asked myself
questions like, “What does this teacher consider data to be?” I asked, “How does the structure of
time during collaboration and professional development meetings impact teachers’ opportunities
to talk and use student data?” Similarly, I asked, “How does the use of a math TOSA support or
hinder conversations about data use?” Asking myself questions throughout every stage of
analysis was a tool I used to help me understand the data from the beginning to end of my
analysis (Corbin & Strauss, 2008).
I looked for words that indicated time as I wanted to understand how time was structured
and used throughout collaboration and professional development meetings. Time marks in my
observation transcripts allowed me to capture the time that was
spent in conversations and planning content lessons based on areas of focus (e.g., Math, iELD).
These time marks also provided insight into how time was spent in meetings, which helped
me understand the structure of time in different learning settings. I noticed and took into account
the language and emotions expressed by the participants. These observations helped me
understand how participants were feeling about decisions made and ideas expressed during
collaboration and professional development.
I used the interview data first to engage in cycles of coding that included summarizing
segments of data followed by identifying patterns (Harding, 2013; Miles et al., 2014). I
developed an initial a priori coding list tied to the three domains of my conceptual framework,
which consisted of leadership, professional learning settings, and teachers' belief in and value of data. I
engaged in the first cycle of coding by first looking at a priori codes and empirical codes.
Harding (2013) described a priori codes as reflecting categories that were already of interest before
the research began, whereas empirical codes were derived while reading through the data as
points of importance and commonality. Examples of empirical codes consisted of "looking away
from practice," "teacher moves," and "knowledge for practice." I organized and coded all
transcripts and field notes. I began by writing codes alongside the transcript (Harding, 2013).
I completed coding the data from the interview transcriptions and then used the same
approach, once again engaging in open coding with similar codes, when I read my
observation transcripts and documents/artifacts. I used codes from my conceptual framework as I coded
collaboration meetings, professional development sessions, and classroom observations. For
example, one a priori code I used was "time," as it related to how time was structured for
the various professional learning settings. I used "systems" to understand the structures that were
in place to support teachers in data use and analysis. I used “looking away from practice” to
understand the approach that was taken by leadership to support teacher practice during
collaboration and professional settings.
As I moved to analytic codes, I aggregated the a priori and empirical codes to identify
patterns, which led me to my findings. I engaged in meaning making from codes that
related to my research questions (Merriam & Tisdell, 2016). I used a priori and empirical codes
to label participants' precise words captured in exchanges between teachers and the Math TOSA.
This helped me identify and understand the interactions between teachers and leadership (e.g.,
Teacher Leader, Math TOSA). For example, questions asked by the Math TOSA, such as "What
I am asking you is what would be an acceptable equation?" and "Who would like to share their
strategy?" provided insight into the limitations and constraints that were present during
professional learning settings. This type of coding allowed me to deepen my understanding
and answer my research questions about how leadership played a role in the teacher's use of data to
inform her instruction.
Analytic memos were also part of my data analysis and helped me gain insight into how I
made sense of the data (Miles et al., 2014). The analytic memos throughout my
codebook provided opportunities to ask myself questions and allowed me to understand the
concepts that developed in my case study from different observational data. In these analytic
memos, I made connections across different data sources (e.g., interviews, observations, artifacts,
documents) to derive my findings. I was able to triangulate data (Patton, 2002) from multiple
data sources because I cross-checked information from interviews along with the artifacts and
documents I captured in the field. With the observations from collaboration meetings,
professional development, and classroom instruction, I was able to reference back and make
connections to what was shared in both formal and informal teacher interviews. As researchers, we
may share similar cultures and life experiences as the participants we study (Corbin & Strauss,
2008). As an elementary teacher, I was able to make connections with the content, curricula used
at the school site, and professional development. I had to separate my own ideas from what I
observed. For this reason, I had to keep my biases in check and use analytic memos to help me
consider any biases I had about the data.
Limitations and Delimitations
There were several limitations in my study, which are common to qualitative research.
One limitation to my dissertation study was truthfulness. The data collected
through interviews relied on the truthfulness of the focus teacher, which could not be controlled.
In addition, I could not control the way that the school planned or enacted data use. Another
limitation was the availability of the participating teacher. I was limited in the time I had for conducting
interviews and observations. In addition, the principal’s absence throughout my study was a
limitation because I was not able to conduct an interview or observe her. Moreover, a limitation
that impacted my study was that I was a novice researcher. As a novice researcher, I did not have
the experience to construct the best questions or to know the best time to probe an interviewee;
such judgment comes with experience.
A delimitation was my availability, which constrained how much data I collected and
limited my ability to capture essential data through observations and to collect all the right
documents/artifacts. In addition, as a new observer, I may have missed documenting information
that could have contributed to my ability to answer my research questions by not asking probes
during my informal interviews that would have helped me gain more insight into my focus teacher
and environment. Another delimitation that impacted my study was assuming that the principal
was going to be present at all professional development meetings when the principal was only
visible for two observations during professional development. Lastly, not asking to be part of the
leadership team meetings was a delimitation, as I was not able to gain insight into the leadership
team's planning and their approach to supporting teachers' understanding and use of data during
collaboration and professional development meetings.
Credibility and Trustworthiness
To ensure credibility and trustworthiness in this qualitative single-case study, I identified
any biases and monitored them through reflective memos (Merriam & Tisdell, 2016). As an
elementary school teacher, I had strong feelings about the importance of the use of data and the
need to inform instruction in order to provide students with high-quality instruction that fosters meaningful
learning. I wrote observer comments to capture my feelings and thoughts during observations
and interviews and bracket any biases during my time in the field. This accounted for conscious
and unconscious biases that I brought to the work of this study. Merriam and Tisdell (2016)
describe reflexivity as a way for researchers to explain their biases, dispositions, and
assumptions regarding the research. I kept a research journal that I used to reflect critically on
myself as the researcher. I asked and wrote questions that ensured I was unearthing my biases.
For example, “Why am I sure what teachers were doing was [good] or [bad]?” I reflected and
asked myself to go back and look for evidence to disconfirm things that I thought I knew from
the data I collected. I also used the research journal to describe in detail how I collected data,
how categories were derived, and decisions were made throughout my study. After my
interviews, I reflected on the quality of information received and wrote down any ideas or
interpretations that emerged after the interviews or observations (Patton, 2002). I wrote analytic
memos where I accounted for the way I was making sense of the data and any biases that I had
with respect to what I was thinking based on what I saw and heard in the field. In addition, I
engaged in multiple conversations during analysis and writing with my dissertation chair as a
form of peer review.
A qualitative researcher can use various strategies to increase credibility, one being
triangulation (Merriam & Tisdell, 2016). Triangulation can be done when using multiple sources
of data to compare and cross-check data that have been collected through observations or
interviews (Merriam & Tisdell, 2016). I collected more than one type of data which consisted of
observations, interviews, and documents/artifacts. These multiple sources allowed for cross-checking
the data to ensure validity and reliability (Merriam & Tisdell, 2016). Another strategy
that supported the credibility of my study was adequate engagement in data collection (Merriam &
Tisdell, 2016). I spent 3 months collecting rich and descriptive data in the field until I reached
saturation and had come to a point where there was no new learning in the field in relation to the
questions I was trying to answer. This increased the credibility and trustworthiness of my study
(Merriam & Tisdell, 2016).
Ethics
In this dissertation study, I collected data from people, which is why it was important to
conduct the study in an ethical manner. As I interacted with teachers in the field, I, as the
researcher, had the responsibility to behave ethically by treating interviewees with respect from
the first contact to the last (Rubin & Rubin, 2012). I asked for permission to record interviews
and to use certain answers from interviewees as quotes in my study (Rubin & Rubin,
2012). Before beginning my study, I applied to the IRB, which required me to sign a form indicating
that I had obtained informed consent and assessed the potential risks to participants in the study
(Creswell & Creswell, 2018). I did not pressure participants into signing consent forms, and I
disclosed the purpose of the study before obtaining consent from participants. I was very clear
with the participants that my focus was on the way that one person in the group took up the
conversation even though I was collecting data from the group. Most importantly, I assured the
teacher confidentiality in the study. In addition, I obtained permission from the site where I conducted
my study to ensure that the school selected was one without a vested interest (Creswell &
Creswell, 2018). During my data collection, I made sure to respect the site by being the least
disruptive when conducting observations and interviews at the site (Creswell & Creswell, 2018).
As I collected data, I made sure to avoid exploiting participants and collecting harmful
information (Creswell & Creswell, 2018). During the time I analyzed the data, I avoided
disclosing only positive results and respected the privacy of participants (Creswell & Creswell,
2018). In conclusion, as I reported and shared the data from the study, I made sure to communicate
information in clear and appropriate language and to share data with others so that readers could
determine the credibility of the study (Creswell, 2014). I made sure to disclose to the
participants that I was a teacher who was there to learn from them. I was clear about my
positionality in relation to what the participants were doing during this study, as I was not there
to evaluate their work but to learn from their interactions with colleagues about data use.
Chapter 4: Findings
The purpose of the study was to examine how an elementary school teacher used data to
inform her instruction in the classroom. The findings are representative of observation and
interview data and documents and artifacts. One formal teacher interview, two informal
interviews, seven grade level collaboration meetings, four professional development meetings,
and three classroom observations were conducted in the span of 3 months. The data from the
observations were triangulated with the participant interviews and the documents and artifacts
collected. To provide the context for the findings, the following section presents a brief
explanation of what I expected to see at the site before field work took place. This is followed by
a description of the research site and the participants. Lastly, this chapter presents the findings
supporting each research question.
This study involved a first-grade team at Sunshine Elementary with the expectation that
teachers used data to inform their instruction in the classroom with the support provided by the
school site. A district director identified Sunshine Elementary as a school site that used data to
inform their instructional practice. When I interviewed the principal, she identified the first-grade
team as a group of teachers who would be willing to participate in the study. Therefore, I
expected to observe teachers using data in their day-to-day interactions (e.g., collaboration time,
professional development) to examine their teaching practice and make changes to their
instruction.
Background of Sunshine Elementary
Located in Southern California, Sunshine Elementary was one of six elementary schools
for Wave School District. Sunshine Elementary served a total of 566 students in grades K-5. The
population at this school site included 68 percent socioeconomically disadvantaged students,
and 31 percent English learners (California Dashboard,
2017). Sunshine Elementary supported students in becoming bilingual, biliterate, and bicultural
through its dual language immersion program (Wave School District website, n.d.). The first
grade consisted of 99 students enrolled at Sunshine Elementary and four first-grade teachers
(Wave School District website, n.d.). In the 2018-2019 academic school year, 33
percent of kindergarteners were English Learners (Wave School District website, n.d.). Due to
school closures in March of 2020, demographic data were not generated for the 2019-2020
academic school year. Based on the data from 2018-2019 and the absence of data from 2019-
2020, I estimated that 33 percent of the incoming first graders were English Learners, as that was
the percentage of the kindergarteners the year prior (Wave School District website, n.d.).
Sunshine Elementary school’s demographics consisted of 8% African American, 9% Asian,
0.7% Filipino, 67% Hispanic/Latino, 0.03% Pacific Islander, and 8% White (Wave School
District website, n.d.). Sunshine Elementary’s instructional focus for the 2019-2020 academic
school year was on literacy and mathematics, which was reflected in the observation data
collected for this study. Sunshine Elementary was also active with the parent community, offering
monthly family nights, monthly parent spotlights, parent meetings, and weekly home-school
connection information (Wave School District website, n.d.).
Background of Participants
Teacher
Alyson, a first-grade Caucasian teacher at Sunshine Elementary, was the focus teacher for
this study. She had been a teacher for 14 years in general education and had experience teaching
multiple grade levels. Alyson had served as a teacher leader (e.g., grade level chair) the year
prior to this study. Teaching was a second career for Alyson.
Grade Level
The grade level consisted of four female teachers: one Caucasian, one Southeast Asian,
and two Latina. All four taught general education, and two were part of the dual immersion
program. The grade level was led by Lisa, the teacher leader for first grade. At Sunshine
Elementary, the teacher leader served as the grade level chair. The dynamic of this grade level
was one of respect and teamwork, as the teachers supported one another and met consistently
throughout the academic school year.
Out of the Classroom Support
During an initial screening interview, the principal at Sunshine Elementary shared
that Wave School District provided multiple supports outside of the classroom. At the
time of the study, the district had provided an English Learner Instructional Resource Teacher
(ELIRT) who focused on the content of English Language Development (ELD), a Teacher on
Special Assignment (TOSA) who focused on mathematics, and a new Language Arts Specialist
(LAS) who was in the process of starting at Sunshine Elementary. These out-of-classroom
supports facilitated professional development for teachers and met with them during their
collaboration time.
What follows are the research questions and the findings that emerged from the data. The
first research question asked, "What role does leadership play in the way that one elementary
school teacher uses data in collaboration and professional development meetings and in her
classroom instruction?" The aim of this question was to understand how leadership played a role
in supporting the teacher's use of data to inform her instruction. The findings revealed that
leadership lacked preparedness and constrained conversations that could have investigated
teachers' instructional practice, consistent with Horn and Little (2009). The second research
question asked, "How does one elementary school teacher use data in collaboration and
professional development meetings?" The aim of this question was to determine whether data
use was a common practice in the different environments that the teacher was part of at her
school site. The findings revealed that the teacher believed in and valued data and used it in her
practice to the best of her ability.
The last research question asked, “How are those data use activities reflected in her
classroom instruction?” The aim of this question was to determine whether learning from professional
development was reflected in the teacher’s classroom instruction. The data revealed that professional
development at Sunshine Elementary was structured in a knowledge for practice approach
(Cochran-Smith & Lytle, 1999) as teachers were receivers of the knowledge; however, the
learning from professional development was carried over to the classroom.
Research Question: How Does One Elementary School Teacher Use Data in
Collaboration/Professional Development Meetings?
The finding that emerged from this research question was that Alyson, the focus teacher,
believed in and valued data as an important tool to support students’ learning needs. She
described experiences where she used data on her own and described ways she made sense of the
information to support her instructional practice.
As the literature (cf. Datnow & Hubbard, 2016; Ikemoto & Marsh, 2007; Wayman,
2005) on teacher belief in and value of data suggests, teachers must value data in order to use it to inform
their practice. It is evident that Alyson believed in and valued data as an important tool to support
students’ learning needs. Through her interview, she shared multiple examples of the range of
ways she had the opportunity to engage with data on her own and with the support of leaders at
her school site. The way Alyson described her experiences was consistent with Datnow’s (2005)
assertion that providing teachers with opportunities to understand data supported their ability to
turn the information into actionable knowledge. The value Alyson placed on the importance of
data is also consistent with Ikemoto and Marsh’s (2007) assertion that, for teachers to use data to
inform their practice, they must value its use. Alyson saw data as something essential to her work
and did not believe she could operate as a teacher without it. She said,
I don’t know how people would do without it [data]. I don’t know how a teacher would
get through a school year without it [data].
In her statement, Alyson indicated that she could not imagine functioning without data to inform
her work with her students. She suggested that it would not be possible for her, or that she could
not relate to “people who would do without it [data].” She could not see getting “through a
school year without it [data],” demonstrating the value she placed in data as a tool to support her
teaching and her students’ learning.
Alyson’s belief in and value of data as an essential tool to support student learning could
also be seen in the way she described her approach to proposing an after-school intervention
program to the principal, as well as other decisions she made. She offered,
Oh… okay, I can give a recent…. She, we use data in several different ways. Here is a
good example for the use of data that something that just started a couple of months ago,
umm… she asked me to look at children to see if I thought there was a role for an after
school or before school intervention and who and why. So what I, I decided to do, my
preference was to do a reading intervention because in
first grade reading is so important.
So, I pulled data from several um… assessments and observations that I had used. I
looked for kids who had very similar skill sets, or very similar needs, like their next steps,
and so then I took to my boss umm… my data. Look, this group of five children has this
in common for sight words, they have this in common for phonological awareness, they
have this in common for phonics, they have this in common for fluency, I think I could
build a really cool curriculum for those five kids after-school, 20 sessions. She said,
“Bingo you are on, and I will pay you.” So, it’s the data is to help when, ya. And I find
myself using it in that way a lot umm...
Here, Alyson demonstrated her belief in and value of data as a part of her practice. As she was
given the task of developing an intervention, she defaulted to data when asked to “look at
children” to help her determine who would benefit from the intervention. Implied in this
statement was that “look[ing] at children” meant looking at student data. Alyson continued by
stating that she “pulled from several assessments and observations,” indicating that she used
qualitative (observations) and quantitative (assessments) data to help her unpack what the data
meant. Moreover, she aggregated the data by student skills (e.g., sight words) and similar needs
(e.g., next steps). Furthermore, she indicated that she would use the data to make curricular
decisions if she were going to provide the intervention. Altogether, her approach to determining
the value of an after-school intervention program revealed her belief in the importance and value
of data in making educational decisions (intervention, instruction, curriculum) to support the
students’ learning needs.
Consistent with Ikemoto and Marsh (2007), because Alyson saw the value in the use of
data, she indicated that she used it to inform her practice. The value she placed on data to inform her
practice was revealed as she spoke to the support she received from “teacher leaders” to use data
as part of her practice. Alyson’s belief that data mattered is reinforced in her description of how
she looked at district assessments. She described an experience with data as she stated,
Then the other would be, we have the district data, because the district mandates some
assessments, then what will happen there will be like a teacher leader, like an out of
classroom teacher whose specialty is either ELD or language arts or math and we will
look at the data, sort of apples to apples, based on umm… district required assessments.
This statement demonstrates that Alyson valued the leadership provided to look at data as she
referenced the presence of “like a teacher leader” who had time and the responsibilities to plan
out a way to approach data with teachers. Additionally, she saw this person as being the out of
the classroom expert (“teacher whose specialty is ELD, language arts, or math”) who brought
forward data to look at and analyze together. In addition, the language used by Alyson to
describe the out-of-classroom teacher leader as having a “specialty” reinforces the idea that Alyson
valued expertise that could support the way teachers engaged in analysis of district assessments.
Moreover, this evidence indicates that Alyson valued the presence of the teacher leader as she
believed this person led teachers in their investigation to understand students’ learning needs
when she stated, “we will look at data, sort of apples to apples.” Here, implied in Alyson’s
words, “look[ing] at the data,” was a focus to compare skills addressed in district assessments to
understand students’ learning needs.
Consistent with Datnow’s (2005) assertion that teachers need to have an understanding of
their data to turn that information into actionable knowledge, Alyson’s belief in and value of data
led her to address what she believed were student learning needs and “deficiencies.” She
described the value of data as she shared,
So then we are then looking at … what do we see in terms of … common … common
proficiencies, … common deficiencies, … we you know plan kind of those next steps
what will address what ideas for lessons that would address a particular deficiency. We
plan lessons around that and that is not intervention, that’s in the class.
This evidence reveals Alyson’s belief in data as a tool to help her understand her students’
learning needs. Here, she described how “looking at common proficiencies and common
deficiencies” provided her with insight into what students knew and did not know. In addition,
the data afforded her an understanding of how she could address these “deficiencies” in her
classroom. She expressed how she “plan[ned] the next steps” as an outcome of the data she
“look[ed]” at. Moreover, Alyson shared that the planned lessons were “not intervention, that’s in
the class.” This evidence revealed that Alyson was using data to differentiate instruction and knew how to
turn data into actionable knowledge in her classroom.
As the literature (cf. Kerr et al., 2006; Young, 2006) on leadership suggests, specialized
leadership roles support the facilitation of learning and active conversations amongst teachers. In
this first finding, although Alyson believed in data and believed she used it in actionable ways, it
was evident that the leaders (e.g., grade level chair, TOSA) who were present during
collaboration meetings did not provide conditions for learning and data use (Datnow et al., 2013;
Wayman, 2005). In my observations, Alyson made multiple attempts to examine her
practice; however, the discourse routines that took place during collaboration time were
constrained by the choices made by leadership. Inconsistent with Horn and Little (2009), the
discourse routines observed during collaboration meetings turned away from opportunities to
examine teacher practice.
Research Question: What Role Does Leadership Play in the Way that One
Elementary School Teacher Uses Data in Collaboration/Professional Development
Meetings and in Her Classroom Instruction?
The finding that emerged from this research question was that collaboration meetings did
not support the teacher’s use of data to inform her instruction. The first-grade teacher leader
abdicated her leadership responsibilities to other teachers in the grade level. The absence of the
principal led others in leadership roles (e.g., TOSA, ELIRT, Teacher Leader) to constrain
opportunities for teachers to examine and interrogate their practice.
Over the course of 3 months of observations, I observed seven “collaboration meetings.”
In those seven meetings, teachers, including Alyson, were asked to engage with data (e.g., student
work) twice. Consistent with Ikemoto and Marsh (2007), time is a contributing factor for
teachers to make sense of data to inform their practice. As explained by Alyson, the teachers
determined how they were going to use the time and the purpose of these meetings was not to
use data to inform the teachers’ approach to instruction. These meetings were more focused on
sharing ideas and strategies. Alyson offered,
… during our self-directed collaboration time we periodically talk about we did a little
social studies thing we were talking about sharing lesson ideas, tips and tricks.... and
then… once in a while we will do some collaborative work around science.
In her statement, Alyson indicated that collaboration time was “self-directed,” implying that
teachers made the decisions about the content and the way the time would be organized. In
contrast, the literature suggests (cf. Marsh, 2012) that structured leadership (e.g., the principal,
coach) would need to provide appropriate supports and accountability for time to be allocated to
interpreting and analyzing data. As indicated by Alyson, this was not the case when it came to
deciding how collaboration time would be used. Additionally, she shared content that was
covered during collaboration time, which was “a little social studies” and “work around science.”
This indicates that there was a content focus as teachers gathered during collaboration time.
Alyson continued by describing what was “talked” about during this time, which consisted of
“sharing lesson ideas, tips and tricks.” This evidence reveals that the time during collaboration
was used as an open forum rather than a space to use data as teachers voiced lesson ideas and
shared “tips and tricks” from their experience. These conversations limited their capacity to use
collaboration time as an opportunity to look at their instructional practice and discuss
opportunities for improvement (Horn et al., 2015).
Meetings without Data
In five of the seven collaboration meetings, the teachers did not use data to inform their
approach to planning, and leadership did not promote a turn towards practice (Horn & Little,
2009) when planning. As Kerr et al. (2006) and Young (2006) suggest, specialized leadership roles
(e.g., grade level chair, literacy coach, Title I coach) would need to support the facilitation of
teachers’ learning and active conversations to analyze and synthesize information that could be
used in decision making. That did not occur in the five meetings
without data. For example, one collaboration meeting focused on “Planning Unit 4/Math” and
was led by the grade level chair. The intention was to talk about the implementation of Unit 4 as
a grade level. Instead of beginning with planning to initiate a conversation about instructional
practice, the grade level chair, Lisa (L), began with a logistical item by asking the grade level
team to consider intervention times that would not interfere with their ELA block and then
identified topics for the next collaboration meeting. The conversation then led to teachers
Jennifer (J) and Alyson (A) identifying components that were part of Unit 4 and professional
learning that would take place in the upcoming month. The collaboration meeting concluded
with Alyson sharing a join change unknown problem type that would be implemented in each of
their classrooms. In the 50-minute collaboration meeting, 17 minutes were spent on logistics for
upcoming meetings, 30 minutes were spent in conversation about components of
Unit 4 along with dates of future professional development/assessments, and 4 minutes were
used by Alyson as she shared a math problem from the Children’s Mathematics book. The
following is an excerpt from the meeting:
10:55
L: Trying to figure out intervention times. When is our ELA time? Language arts
time.
OC: Alyson shared her language arts times (8:35-11:45).
A: Why are we doing this if the office has the schedule of our times?
11:02
J: I am trying to find topics we wanted to discuss but I cannot find it. (Jennifer is
looking at a Google Doc that had topics for discussion.)
A: I was looking for it as well and I also couldn’t find it before we went on break on
Friday. (Alyson is also looking at Google Drive and documents).
J: Do we want to spend a couple minutes to create topics?
A: I would like to make a list, so we use our time wisely.
Inconsistent with Kerr et al. (2006) and Young (2006), Lisa, the grade level chair and therefore a
leader in the setting, did not provide the members of the grade level with the support they needed
to engage in learning and active conversations related to instructional decision making. She
began by focusing the time on logistical inquiries when she said, "Trying to figure out
intervention times. When is our ELA time?" By starting the meeting this way, Lisa set the tone,
demonstrating a lack of preparedness in that the time would be used on logistics instead of
practice. In her response, "Why are we doing this if the office has the schedule of our times?"
Alyson suggested that logistics was not the best use of their time, as the information was already
available in the office. Implied in Alyson's statement was that the time might be used to focus on
instructional practice instead.
Lisa’s lack of preparedness was further demonstrated in this subsequent exchange
between Jennifer and Alyson’s effort to orient themselves when they say this, “I am trying to
find topics we wanted to discuss but I cannot find it. Do we want to spend a couple of minutes to
create topics?” Alyson responded, “I would like to make a list, so we use our time wisely.”
Jennifer was communicating “topics to discuss” and Alyson focusing on how she would like to
spend the “time wisely” was a demonstration of Lisa’s abdication of her leadership
responsibilities as both teachers were seeking to structure time towards practice during their
meeting.
Furthermore, Lisa’s abdication of her leadership responsibilities and lack of preparedness
were also seen towards the end of the meeting as she suggested that the grade level team look at
math content as it would take less time to discuss. This exchange spoke to these points:
11:39
A: We can use the other units. I have the Oak Tree, the cycle of the Oak Tree. The
closest we have, to compare and contrast two types of text.
L: Do we want to think about it a little more? We can maybe look at math?
OC: (Join change unknown) type of problem
J: I wanted to do the writing, but it is going to take more time.
L: We have 5 minutes, so we can maybe look at math since that could be faster.
11:42
OC: Alyson goes to her cabinet and grabs some CGI math documents and reads the
problem.
A: Jameson has _books and some in his _. How many books does Jameson have?
There are 3 boys playing soccer. Some girls are playing soccer. How many girls
came to play soccer? Number sets: (3, 7) (6, 14) I use names of students in our
classroom.
Lisa’s statement, “Do we want to think about it a little more? We can maybe look at math?”
demonstrates lack of leadership as she was guiding her team to shift conversations to a different
content area without resources and protocols that could lead to investigate practice. Instead, Lisa
gave up her leadership responsibilities to other teachers in the meeting as seen when Alyson
shared, “Jameson has [blank] books and some in his [blank]. How many books does Jameson
have?” Alyson stepped in to share with the grade level an example of a type of problem that
could be used for math content. In addition, Lisa prioritized completion over investigation of
practice as she said, “We have 5 minutes, we can maybe look at math since it is faster.” She
failed to attend to any form of data to make instructional decisions to support her colleagues.
Moreover, she did not create norms to support teachers in developing skills to analyze and
interpret data during collaboration meetings (Datnow & Hubbard, 2016; Ikemoto & Marsh,
2007).
Meetings with Data
Data were present in two of the seven collaboration meetings I observed. The instances
where data were brought to discussion were determined by Alyson's inquiry into her own
practice and by a district initiative focused on mathematics. While Datnow et al. (2013) and
Wayman (2005) suggest that agenda setting and discussion protocols create conditions for
learning, these conditions were lacking in the interactions observed between Teresa, the TOSA
at Sunshine Elementary, and the first-grade teachers. In addition, the way that Alyson engaged
with the data was constrained by the choices Teresa made in relation to the discourse routines
that took place during collaboration time. For example, in one of the collaboration meetings,
Alyson presented a challenge she was facing in the classroom. Teresa's dialogue with Alyson
turned away from practice (Horn & Little, 2010), steering the conversation from an investigation
of Alyson's teacher moves that could have helped her make the necessary changes to best
support her instruction. Instead, Teresa directed Alyson and the other teachers back to strategies
seen in student artifacts and dismissed her investigation of practice.
The following is an exchange from the meeting:
10:59
T: It looks like you have work to do.
A: Here is the deal, I am trying to show students that they are (missed the
following information) I am working with my guys process equation. I am really
struggling in terms of being, of being their scribe and note the process of their
equation. Did you guys all do the problem? What you did she come up with…an
equation that matches their process?
11:03
A: We have been working with three addends. Now, I want to be Megan Franke. I
want to learn how to scribe the process of the students’ thinking. How can I help
my students really annotate what they are doing as opposed to writing an adult
algorithm?
OC: Jennifer and Lisa looked at student artifacts that are on the circle table while
Alyson is pointing to student artifacts.
T: This is the same as Cindy. This is the same as [missed information].
OC: Same as refers to the type of problem students were working on
11:06
A: I know there is (missed the rest of this) I think what is going on is, I am struggling
with my teacher moves. How do I relate the thinking process to the equation?
OC: Alyson looks at Jennifer and Lisa.
A: And you can come in. [Alyson tells this to Jennifer and Lisa].
OC: Jennifer and Lisa looked at Teresa and the student artifacts.
In the above interaction, inconsistent with Datnow et al. (2013) and Wayman (2005), Teresa did
not create the conditions for learning during the collaboration time. She opened the meeting by
saying, "It looks like you have work to do." This statement framed the meeting as an open
forum, rather than giving it a focus with the use of an agenda. Alyson's statement, "I am really
struggling in terms of being, of being their scribe and note the process of their equation," implied
that she was seeking support from the group to examine an aspect of her practice with which she
was struggling. Teresa, an expert and leader, did not foster a conversation amongst teachers, as
she did not enact discussion protocols. Instead, she stated, "This is the same as Cindy, this is the
same as…" leading the team back to the artifacts Alyson had brought with her to the meeting.
For the second time, Alyson attempted to voice her challenge and seek support from her team as
she said, "I am struggling with my teacher moves. How do I relate the thinking process to the
equation?" Alyson made eye contact with her colleagues and asked for their thoughts when she
said, "and you can come in." However, Lisa and Jennifer looked to their leader, Teresa, for
guidance rather than unpacking and examining the challenge Alyson had posed to the team.
Teresa's silence failed to promote the necessary conditions for learning.
Moreover, the discourse routines that took place during this collaboration time were
constrained by the choices made by Teresa. This entire exchange spoke to turning away from
practice (Horn & Little, 2010), as Teresa guided teachers to examine the student artifact instead
of examining Alyson's practice. Teresa asked teachers to hypothetically discuss ways students
would create equations to show their understanding of a problem. In addition, there was evidence
of the teachers deferring to the leader and looking to her for guidance throughout collaboration
time. The following is a continuation of the above meeting:
A: So, for me the big discussion for us is what are the teacher moves to help them
understand in creating an adult algorithm. They are going to take the big number
and the little number and add them, whatever, I subtract the small number from
the bigger number, but wait until he gets to integers.
T: Ummhmm.
A: So yeah, what are my teacher moves that I can do so I can help my kids use
mathematical notation to accurately explain the process because once they start
doing that [OC: did not get the rest of this statement]
11:12
T: So let me ask you this question, say that they line up the 7 and then they line the 9
and they compare exactly like the way the problem is asking them to do, and then
they circle the last two. What would the equation look like?
A: I don’t know. That is why I am asking you. Well what nuggets can I make to
cause a child to think about process because right this point they are taking away,
so I am thinking if they are taking away, I should be seeing some crossing out.
T: So say they did not cross out, but they did exactly like this, put they are putting
J: Like this one, they are basically comparing.
T: Exactly, so what I am asking you is what would be an acceptable equation?
A: I don’t know, I don’t know, I don’t know. And my question is what teacher moves
I need to do to get them to make notations that makes sense to them. I have to go
back to the SMPs. Here, I don’t care what they write, I mean it is, it actually
makes sense, let me look…
In her statement, "So, for me the big discussion for us is what are the teacher moves to help them
understand in creating an adult algorithm," Alyson again pushed to examine her practice
with her colleagues. Teresa was not responsive to Alyson, replying only, "Ummhmm." Teresa
then precluded the opportunity for Alyson to engage in discussion to explore and examine her
practice, guiding everyone back to the student artifact when she said, "So let me ask you this
question, say that they line up the 7 and then they line the 9 and they compare exactly like the
way the problem is asking them to do, and then they circle the last two. What would the equation
look like?" Alyson's statement,
I don't know. That is why I am asking you. Well what nuggets can I make to cause a
child to think about process because right this point, they are taking away, so I am
thinking if they are taking away, I should be seeing some crossing out
demonstrates her frustration and her advocacy for support in her instruction, yet she was also
abiding by the leader's question regarding the student artifact. In this interaction, there was a
push and pull away from practice as Teresa continued to ask questions about students solving
math equations: "So say they did not cross out, but they did exactly like this, put they are
putting." Jennifer deferred to the leader by responding, "Like this one, they are basically
comparing." Teresa acknowledged her response by saying, "Exactly, so what I am asking you is
what would be an acceptable equation?" In the end, Alyson made an additional attempt to look at
her practice as she said, "I don't know, I don't know, I don't know. And my question is what
teacher moves I need to do to get them to make notations that makes sense to them." She was left
to figure out her challenge on her own when she said to her colleagues, "I have to go back to the
SMPs. Here, I don't care what they write, I mean it is, it actually makes sense, let me look."
Teresa's responses to Alyson constrained the discourse routines observed during collaboration
time. Teresa did not provide the support teachers needed to analyze and synthesize information
to make instructional decisions in their practice (Kerr et al., 2006; Young, 2006).
As the literature on professional development suggests (cf., Datnow & Hubbard, 2016;
Donnell & Gutter, 2015; Dunn et al., 2013), providing teachers opportunities to make sense of
information can lead to actionable knowledge in the classroom. It is evident that the professional
development opportunities at Sunshine Elementary took a knowledge for practice approach
(Cochran-Smith & Lytle, 1999), where outside leadership was the key holder of knowledge and
teachers were the receivers of knowledge. The interactions in professional development were
constrained by leadership, as professional development did not attend to student data and
strategies were learned in a context-neutral environment. Through observations, Alyson
demonstrated how she carried the knowledge received from professional development over into
the classroom. In addition, she demonstrated using data in general ways through choices she
made in her implementation of the strategies she learned.
Research Question: How Are Those Data Use Activities Reflected in Her Classroom
Instruction?
The finding that emerged in response to this research question was that strategies learned
during professional development opportunities carried over to the teacher's classroom
instruction. Professional development provided at the school site took a knowledge for practice
approach, where leaders were the knowledge holders and the teachers were the receivers of
knowledge (Cochran-Smith & Lytle, 1999). However, the learning from professional development
carried over to Alyson's classroom instruction as she leveraged data use in general ways based
on student knowledge from her classroom.
Teachers at Sunshine Elementary were provided with six professional development (PD)
sessions per semester. Professional development was held for teachers at the school site and
district levels and was led by leaders in the district (e.g., TOSA, ELIRT). The PD focused on the
following topics: integrated English Language Development (iELD), data analysis in mathematics,
and mathematics instruction. During PD, leaders modeled different strategies for teachers to use
in the classroom to support English learners' language development and math instruction. These
professional development sessions were structured as knowledge for practice (Cochran-Smith &
Lytle, 1999), as teachers were taught strategies as opposed to examining and interrogating their
practice by incorporating student data when learning about these strategies. At times, the PD
content carried over to collaboration time through the presence of the TOSA, as PD was held on
Thursdays and collaboration meetings were held on Fridays.
In all four of the professional development sessions that I observed, a district leader
facilitated the session. The professional development sessions were used as an opportunity to
teach and model strategies for teachers to implement in their classrooms as opposed to providing
time for teachers to examine their teacher moves and student data to make instructional shifts to
support their instruction in the classroom. For example, one professional development session
focused on a warm-up activity identified as a dot string. In this one-and-a-half-hour session,
where a group of teachers from various elementary school sites gathered, Toni, the math TOSA
from a different school site, began with an inclusion activity and continued with a TED Talk
video that focused on the performance versus learning zone. This led the TOSA to introduce the
dot string and
enact the strategy as she played the role of the teacher, and the teachers played the role of
students. The following is an excerpt from the meeting:
T: The standard—a kids should be able to add and subtract within 10 and add to 20.
We want you to have a lens when you did this warm-up.
OC: Listed in the PowerPoint slide that was visible to the participants. Lens ideas—
intentional choice numbers (thinking of your why), purposeful for partnership,
warm-up is a routine (15 minutes), questioning-on going because you want to
listen and ask a question and nudge, recording students’ responses, how do you
get all students to participate
OC: Teachers take a picture of “Lens ideas” slide.
T: If you want to, choose one when you do a warm-up.
2:17
T: What do you think? Number talks with using dot images. I am going to put the
images for 2-3 seconds. How many dots do you see? How did you see them?
OC: Projector is not displaying. Facilitator goes back to fix the projector.
T: How many do you see Cindy?
M: 11
T: Who would like to share how they saw 11?
OC: TOSA calls on Red #4
Red #4: I moved those over to make a 10.
T: You moved 2 over to make a 10. Does anyone see it a different way?
N: I saw that the second had 8 and the other 3 and that makes 11.
OC: Teachers continue to share out their answer about the dot image with the TOSA.
2:25
T: This is why I take the numbers first. As long as they are taking something. This
would be my question. It’s a quick image but students are now creating something
fancy.
OC: Teacher 10 and N share out loud to the group observations from their classroom.
T10: I see some kids counting 2, 4, 6, but I know they keep an image in their brain and
then can count by 2s.
N: I think I struggle with recording what students are saying.
T: How many did you see?
OC: All teachers, 13
T: I would usually have students talk to one another and share. Who would like to
share their strategy?
…
T14: I moved some over. 10 + 3 = 13
OC: TOSA projects a PowerPoint slide with the dot image:
• • • •
• • •
• •
• •
• •
2:27
T1: I know that 10 + 3= 13, 9 is 1 less than 10, or would I write it 9 + 1 = 10
T: A different way?
T22: Well, if you do the first one then 11 + 1 + 1= 13
2:29
T17: 10 + 4 is 14, but one is missing so 1 less is 13.
T: I was trying to show a property there. What you are actually doing is splitting the
3 and 1 and then you are taking away.
A: 9 + 3 + 1= 13
T: This time I am going to ask you to talk to your partner. As you are explaining this,
make sure you think of a word bank, vocab students could possibly use.
Teachers are sitting together based on school site and grade level.
…
T: So, this might be words that students use as they do quick images, you are using
the language and you want student to use… What frames can they use? I notice…
What frames can they use with a partner?
Teachers share out examples shared throughout the enactment of the dot image string.
T: Your ELIRTS created sentence frames so that you can use them across your grade
level. Okay, let’s move on. There is actually a planning template in your folder.
In this interaction, Toni’s statement, “The standard—a kids should be able to add and subtract
within 10 and add to 20. We want you to have a lens when you did this warm-up” reflected a
knowledge for practice (Cochran-Smith & Lytle, 1999) approach as Toni centered herself as an
121
expert by telling teachers to include a “lens idea” when implementing a warm-up. TOni modeled
and described the strategy of the dot string as she said, “What do you think? Number talks with
using dot images. I am going to put the images for 2-3 seconds. How many dots do you see?
How did you see them?” Here, Toni demonstrated ownership of knowledge as she walked
teachers through the implementation of the dot image string strategy. Toni role played the dot
image string when she asked teachers, “Who would like to share how they saw 11? A teacher,
playing the role of the student, responded, “I moved those over to make a 10.” This exchange
demonstrates how Toni was the owner of knowledge, and the teachers were the receivers of
knowledge as the TOSA took on the role of the teacher. Toni did not provide opportunities for
teachers to share their knowledge about the strategy. Teachers were not regarded as expertise and
knowledge generators (Cochran-Smith & Lytle, 1999). Toni’s statement, “This is why I take the
numbers first. As long as they are taking something. This would be my question. It’s a quick
image but students are now creating something fancy” demonstrates knowledge for practice
(Cochran-Smith & Lytle, 1999) as Toni explained her teacher moves and did not provide
opportunity for teachers to share their knowledge and contribute to their learning. When a
teacher attempted to contribute to the learning space and offered, “I think I struggle with
recording what students are saying” here, was a missed opportunity from Toni as she ignored the
teacher as having knowledge and qualitative data to contribute to her professional development.
Toni continued being the owner of knowledge when she shared, “I would usually have students
talk to one another and share.” The role play continued as teachers shared their responses about
the dot image; however, Toni reinforced her role as an expert when she referred back to her
teaching moves when she said, “I was trying to show a property there. What you are actually
doing is splitting the 3 and 1 and then you are taking away.” Consistent with Cochran-Smith and
122
Lytle (1999), Toni was demonstrating how knowledge resided outside of the classroom. Towards
the end of the meeting, when Toni stated, “This time I am going to ask you to talk to your
partner. As you are explaining this, make sure you think of a word bank, vocab students could
possibly use,” this is the first opportunity when teachers are referred to as having knowledge and
contributing to the PD. Toni reinforced teachers being the receivers of knowledge when she
stated, “Your ELIRTs created sentence frames so that you can use them across your grade level.
Okay, let’s move on. There is actually a planning template in your folder.” These professional
development sessions were structured as knowledge for practice (Cochran-Smith & Lytle, 1999)
as Toni took the stance of knowledge holder and teachers were the receivers of the knowledge.
Over the course of 3 months, I conducted three classroom observations. In two of the
three classroom observations, Alyson implemented her learning from professional development
in her teaching practice. Consistent with the literature (cf., Datnow & Hubbard, 2016; Donnell &
Gutter, 2015; Dunn et al., 2013), professional development can serve as an opportunity to make
sense of information and use it as actionable knowledge in the classroom. In one of the
classroom observations, Alyson implemented a dot image string, a strategy she learned about in
the professional development session referred to above. Alyson carried over her learning from
professional development and leveraged data use in general ways, based on student knowledge
from her classroom, to inform her instruction (Datnow & Hubbard, 2016; Donnell & Gutter,
2015; Dunn et al., 2013). The following is an observation from her classroom instruction:
12:32
OC: Alyson’s first grade students walk into the room. Students walk to their desk and
sit down. Most students look at Alyson after being seated.
12:35
A: First row with your theatre partners. Second row with your theatre partners. Third
row with your theatre partners. Can you move a little back, so your head does not
get bumped? How about third row? All right, hey we talked about this. I think that
is a perfect shirt for you. I am going to ask you to examine 3 images. Think about
what you are seeing, how many are you seeing, where are you seeing them?
OC: Alyson is in front of the room standing in front of the whiteboard. She calls on
rows to have students sit on the carpet. As Alyson is posing questions, she has a
thumb to her chest and has a look on her face as if she is thinking.
12:37
A: I am going to show you some images. This is not a test. Can I show them to you
more than once?
OC: Students nod their head up and down
A: Are we going to have think time?
OC: Students nod their heads up and down. Alyson gets the chart paper on the white
board and turns it over. She shows the dot images to students for about 5 seconds
and then turns it over so that the dots are not visible and leaves the chart paper on
the white board.
A: Did you need to point and count?
S: No
A: Do you need to think [did not capture this word]
A: [Alyson has her thumb to her chest and is stating the following] "I remember…. I
saw… hmmm… and how are they arranged? Ok. Let me think about that."
12:39
A: How many were there in all? We got some ideas. I am going to ask you to turn to
your partner. One person is going to listen, the other is going to think and talk.
One person is going to listen, the other is going to think and talk
OC: Students turn their body and face their partner. I move closer to two students to
listen to their partner conversation
S14: There are 7 dots. 4 on the top and 3 on the bottom.
OC: I did not capture the other student’s idea. Student 13 was looking at Student 14
when he was sharing his idea.
A: One question I want to ask you is [did not capture the word] way. Say how many
you saw.
S: [Echo] 7
A: Did anyone see anything different?
OC: Alyson turns that chart paper over to display the dot images again
A: So, I am going to need speakers, but I am going to pull some sticks.
OC: Alyson pulls a stick from a jar.
A: Israel.
In the above classroom observation, consistent with Datnow and Hubbard (2016), Donnell and
Gutter (2015), and Dunn et al. (2013), Alyson carried her learning from her professional
development sessions over into actionable knowledge in her teaching practice. The number sense
routine known as a dot image string, which she learned about in the learning session referred to
above, was observed when she had the three dot images visible on the whiteboard, next to her
chart paper. Her statement, "First row with your theatre partners. Second row with your theatre
partners. Third row with your theatre partners," reflected that she had partnered students together
based on their math skill levels. Alyson differentiated and exercised authority in her practice as
she grouped her students in homogenous ways, using data based on her students' math
knowledge in her classroom. She used the strategy of prompting and visualizing when she said to
her students, "I am going to ask you to examine three images. Think about what you are seeing,
how many are you seeing, where are you seeing them?" This is a representation of knowledge
for practice (Cochran-Smith & Lytle, 1999), as she incorporated language that was provided and
modeled to her in professional development. She continued by asking students, "Are we going to
have think time?" and students nodded their heads up and down to answer yes. This implied that
Alyson was explicitly teaching her students the act of stopping and processing the images seen
prior to sharing as a whole group, in order to lower students' affective filter in the content area of
mathematics, a reflection of data use, as her classroom consisted of English learners. Alyson
gave her students the opportunity to practice and share their thinking when she said, "I am going
to ask you to turn to your partner. One person is going to listen, the other is going to think and
talk." Here, Alyson implemented "turn[ing] and talk[ing]," an ELD strategy she acquired in a
previous professional development. She continued and asked the class, "Say how many you
saw" and "Did anyone see anything different?" as an opportunity to create trust and a safe
learning space for students to share their thinking. This demonstrated knowledge for practice
(Cochran-Smith & Lytle, 1999), as this was language used by the TOSA during the professional
development referred to above. When Alyson said, "So, I am going to need speakers, but I am
going to pull some sticks," the use of "sticks" was a method to give all of her students an
opportunity to participate in whole group share outs. In spite of the constraints faced in her
professional development, Alyson leveraged the use of data in general ways as she incorporated
additional teacher moves when she implemented the dot image string in her classroom.
Conclusion of Findings
Alyson believed in and valued data as part of her instructional practice. She was able to
carry her learning from professional development over into her classroom instruction (Dunn et
al., 2013b; Schmidt & Datnow, 2005). Although the learning in professional learning settings
took a knowledge for practice approach (Cochran-Smith & Lytle, 1999), she was able to make
instructional decisions to support the learners in her classroom because of her belief in and use
of data. The constraints from both the first-grade teacher leader and the Math TOSA did not
allow for opportunities to engage in data analysis throughout collaboration and professional
development (Farrell, 2012; Huguet et al., 2014). Moreover, throughout professional learning
settings, the lack of interpersonal skills and content knowledge among leadership (e.g., teacher
leader, Math TOSA; Huguet et al., 2014) did not provide Alyson with the necessary support to
engage in data-driven decision making to support her instruction.
Revised Conceptual Framework
In this section, I present my conceptual framework, which was revised in response to my
data collection and analysis. Maxwell (2013) explains that a conceptual framework is a model of
a tentative theory of the phenomena and is thus subject to change and revision as a result of new
information. My initial conceptual framework emerged from literature centered on leadership,
learning communities, and teachers' beliefs about data use. Originally, I presumed that leadership
would be demonstrated by the principal. In my revised conceptual framework, I argue that
leadership at a school site consists not only of the principal but also includes specialized
leadership roles (e.g., Math TOSA, Teacher Leader). Specialized leadership roles (Kerr et al.,
2006; Young, 2006; Datnow & Hubbard, 2016) can provide guidance and structures throughout
collaboration meetings and professional development to support teachers in examining and
interrogating their practice with the use of data (Horn & Little, 2010). Consistent with the
construction of my initial conceptual framework, I continue to believe that professional learning
settings provide opportunities for teachers to engage in conversations about data use and
instructional practice. In addition, teachers must value and use data as part of their teaching
practice in order to use data in their day-to-day instruction.
In my study, I discovered that teachers who believe in data will use data to the best of
their ability. The presence of sophisticated data use is contingent upon the conditions that exist at
a school site and the supports provided by leadership. In this study, the principal's absence was
palpable: no clear expectation was set, and no culture for data use was created. Only if structures
and conditions are fully present is it possible for teachers to carry data use over into their
instruction. When only some structures and/or conditions are present, data use is limited and
constrained. For this reason, "sacred" time for examining data, paired with an intent of
instructional improvement and thoughtfulness about the use of data, will support teachers in
using data to make instructional changes in the classroom (Datnow et al., 2013).
Figure 2
Revised Conceptual Framework: The relationship between the conditions for data discussions
and teacher practice.
In Figure 2, I present my revised conceptual framework model, with an outer circle that
represents the school site as the context of teachers' use of data. The inner circle represents
professional learning settings (e.g., professional development, collaboration meetings), where
teacher discourse takes place to make sense of data and examine teachers' instructional practice.
The model consists of three circles that represent leadership at the school site (e.g., Math TOSA,
Teacher Leader), peers, and the classroom teacher. I argue that for teachers to use data to inform
decision making, teachers must believe in data. Teachers must not only have time designated to
engage in data conversations but also have the necessary tools to engage in discourse regarding
data and instructional practice. Below, I discuss how the components within the circles interact
with one another to support teacher data use at school sites.
Leadership
Specialized roles (e.g., TOSA, Teacher Leader) at school sites contribute to leadership.
Leadership creates structures and conditions for learning to support the use of data in different
contexts (Datnow et al., 2013). Leaders in school settings can support the conditions for learning through learning sessions and conversation tools that help teachers analyze and interpret data (Datnow et al., 2013; Young, 2006). Time is a contributing factor in teachers' dialogue about data use (Ikemoto & Marsh, 2007); however, the way that time is used can either support or hinder teachers' conversations. Leadership must have the necessary training and discussion protocols to support teachers in examining and investigating their practice through discourse routines (Datnow et al., 2013; Horn & Little, 2016).
Professional Learning Settings
Professional learning settings can serve to support teachers’ instructional practice and
data use. By establishing different professional learning settings (e.g., professional development,
collaboration meetings), leaders make it possible for teachers to focus on data to inform their instructional decision making, understand student learning, and reflect on their practice (Horn & Little, 2010). Creating a knowledge of practice approach (Cochran-Smith & Lytle, 1999) in different learning settings allows teachers to collaborate, analyze and synthesize data, ask questions, and build trust among colleagues (Kerr et al., 2006; Schifter et al., 2014; Young, 2006). Specialized leadership roles (e.g., Math TOSA, Teacher Leader) foster teacher learning
by facilitating conversations to interrogate and examine teacher practice (Horn & Little, 2016).
Most importantly, specialized leadership roles (e.g., Math TOSA, Teacher Leader) must
demonstrate content area and interpersonal expertise to support conversations with teachers to
identify areas of need and support (Breiter & Light, 2006).
Teachers’ Beliefs on Data Use
Teachers must value data in order to use it as a tool in their instructional practice. Teachers' beliefs may be accompanied by different levels of data use experience. If teachers do not believe that data is an important lever for instructional improvement, then leadership will have to use professional development to support data use in their practice. In either case, teachers work closely and collaboratively with leadership to develop their data use skills (Datnow & Hubbard, 2016; Ikemoto & Marsh, 2007). This collaboration with leadership can either support or hinder teachers' opportunities to interrogate and examine their practice (Horn & Little, 2016). Leadership support helps shape teachers' confidence in and attitudes about data use in professional learning settings, which can transfer into actionable knowledge in the classroom (Datnow & Hubbard, 2016; Donnell & Gettinger, 2015).
Conclusion
The revised conceptual framework reflects my constructed theory of the phenomenon studied. Each concept was examined and refined. I argue that teachers' use of data to inform classroom instruction is contingent upon the conditions that exist at a school site. Only when structures and conditions are fully present is it possible for teachers to carry data use and professional learning over into actionable knowledge in their instruction (Datnow & Hubbard, 2016; Donnell & Gettinger, 2015). When only some structures and/or conditions are present, data use is limited and constrained by stakeholders at the school site.
Chapter 5: Executive Summary, Implications, and Future Research
The study examined how a teacher in an elementary setting used data to inform her
instruction in the classroom. A qualitative single case study was employed to answer the
following questions:
• How does an elementary school teacher use data in collaboration/professional
development meetings?
• What role does leadership play in the way that one elementary school teacher uses
data in collaboration/professional development meetings and in her classroom
instruction?
• How are those data use activities reflected in her classroom instruction?
To answer these questions, I purposively sampled an elementary school in an urban setting that was identified by school district officials as a site that implemented data use to inform instructional practice. Over the course of the study, I conducted one 60-minute interview with the first-grade teacher of focus. In addition, I conducted observations of seven collaboration meetings, four professional development sessions, and three classroom observations of the focus teacher. I was granted permission to record some collaboration meetings, which I then transcribed; observations conducted before that permission was granted were scripted by hand and later transcribed. Documents related to professional development, collaboration meetings, and classroom instruction were analyzed and triangulated with observations and interviews. Pseudonyms for the school site, the school district, the teachers, and the students referred to in the observations were created to ensure confidentiality. This final chapter presents an overview of the findings and conclusions drawn from this research. It is organized as follows: summary of findings, implications for practice, and recommendations for future research.
Summary of Findings
In response to the first research question, "How does one elementary school teacher use data in collaboration/professional development meetings?", the findings showed that Alyson, the focus teacher at Sunshine Elementary, used data in general ways to support her instruction in the classroom. Alyson demonstrated her ability to use data independently when creating interventions and grouping her students for classroom instruction (e.g., math groups). She believed in and valued data use; however, leadership (e.g., Math TOSA, Teacher Leader) did not provide her with the necessary supports to use data in various learning environments (e.g., collaboration time, professional development).
In response to the second research question, "What role does leadership play in the way that one elementary school teacher uses data in collaboration/professional development meetings and in her classroom instruction?", the findings showed that the Math TOSA and Teacher Leader constrained opportunities for Alyson to examine and interrogate her practice (Horn & Little, 2016). Learning during professional development followed a knowledge for practice approach (Cochran-Smith & Lytle, 1999), which positioned Alyson as the receiver of knowledge. Despite her efforts to voice the challenges she faced in the classroom in both collaboration and professional development meetings, Alyson did not receive the necessary supports from leadership at her school site (e.g., Math TOSA, Teacher Leader).
In response to the third research question, "How are those data use activities reflected in her classroom instruction?", the findings showed that Alyson carried over her learning from professional development sessions into actionable knowledge in her classroom practice (Datnow & Hubbard, 2016; Donnell & Gettinger, 2015; Dunn et al., 2013). In addition, she exercised authority and differentiated her instruction based on her knowledge of her students and their data. Alyson leveraged data in general ways and incorporated her knowledge of her students into her teaching practice.
Implications and Recommendations
This dissertation explored the way that an elementary school teacher used data to inform her practice. In this chapter, I discuss potential implications of the findings for practice and policy and make recommendations for research, all of which may be of use to future researchers, policy makers, and practitioners.
Practice
In this study, although leadership provided time for collaboration and professional
development, the way that teachers were supported during this time and the way that time was
used did not promote their ability to learn and use data to inform their instruction in the
classroom. Therefore, one implication that emerged from this study is that leadership (e.g.,
Principal, Math TOSA, Teacher Leaders) needs to provide more than just time for teachers to
collaborate and share ideas with one another. Conversation protocols around student work and
teachers’ instructional moves can provide opportunities to examine and interrogate teachers’
practice (Horn & Little, 2016; Huguet et al., 2014). A recommendation resulting from this implication is that, to support this kind of discourse (e.g., conversation protocols), the principal and outside classroom support (e.g., Math TOSA, ELIRT, Teacher Leaders) need to adopt these tools district-wide. In addition, they need to engage in professional development where they learn and practice conversation protocols to enhance their own practice before implementing these tools with teachers at their school sites (Marsh, 2012).
Professional development was implemented with teachers through a knowledge for practice approach in which the Math TOSA was the knowledge holder and teachers were the receivers of knowledge (Cochran-Smith & Lytle, 1999). This was not an effective approach for supporting teachers in using this knowledge to inform their classroom instruction. For this reason, another implication that emerged from this study is that leadership (e.g., Principal, ELIRT, Math TOSA, Teacher Leader) needs to provide a knowledge in practice and knowledge of practice approach during professional development to build autonomy among staff (Cochran-Smith & Lytle, 1999). A knowledge in practice approach validates teachers as knowers of practical knowledge and positions them as practitioners who can develop their artistry through exploration of problems of practice (Cochran-Smith & Lytle, 1999). A knowledge of practice approach emphasizes how teachers use the knowledge base to solve problems, reflect on their practice, and create rich learning opportunities in the classroom (Cochran-Smith & Lytle, 1999).
A recommendation resulting from this implication is that, to support a knowledge in practice approach, the principal and outside classroom support (e.g., Math TOSA, ELIRT, Teacher Leaders) should not position themselves as the sole knowledge holders; rather, all stakeholders should be able to contribute to their learning and explore problems of practice across various meeting times (e.g., collaboration time, professional development). In addition, the principal and outside classroom support (e.g., Math TOSA, ELIRT, Teacher Leaders) need to lead discourse among teachers to investigate and interrogate their practice (Horn & Little, 2016) in order to support instructional shifts in teachers' practice.
Policy
There are various barriers that can impact data use, which include lack of time, an
absence of focus on data-related discussions, and lack of adequate skills and knowledge to
formulate questions around data (Marsh, 2012). For this reason, policies need to be written so
that central office administrators, principals, outside classroom support, and teachers are
expected and afforded the support to build data literacy to effectively use data as part of their
practice. Central office administrators play an important role in providing evidence-based
practices in district systems that impact teachers at the school level as they use data to inform
their instruction (Coburn & Talbert, 2006). Additionally, policies need to be written so that there
is dedicated and protected time for data use throughout the school year.
To implement data use effectively at a school site, leadership (e.g., principals, TOSAs,
Teacher Leaders) and teachers need opportunities throughout the school year to partake in
professional development and training to build their data capacity. These learning opportunities
can support leadership to develop the skills needed to protect and structure time effectively
during data meetings to analyze data and engage in data discourse (Marsh, 2012). These opportunities can also provide leadership with the skills necessary to formulate questions together to make sense of data (Marsh, 2012). Moreover, they can increase opportunities for teachers to develop the efficacy to use data independently or in collaborative meetings with their colleagues.
Research
This qualitative single case study focused on one teacher and how she used data to inform
her instruction. During this study, the principal was absent during most of the professional
development meetings and outside classroom support (e.g., TOSA, ELIRT, Teacher Leader) took
on the role of leadership. From a practice perspective, one issue that emerged from the data and would need further investigation is the relationship between the principal and the TOSA.
Another area of inquiry might be to take a closer look at how a principal and outside classroom support (e.g., TOSA, ELIRT, LAS) engage in professional learning and training to understand how to use data effectively at a school site. Moreover, an area that would need further investigation is the way outside classroom support engages in coaching or training to support teachers in examining and interrogating their practice (Horn & Little, 2016). The following questions
are suggestions for further research that are based on leadership and data use:
• How does a principal use data to support teacher practice in an elementary school setting?
• How does leadership (e.g., principal, TOSA, Teacher Leader) build their data literacy
capacity to support teachers’ instructional practice?
• How does outside classroom support (e.g., TOSA, Teacher Leader) engage in coaching
with teachers to support data use?
References
Borkowski, J. W., & Sneed, M. (2006). Will NCLB improve or harm public education? Harvard Educational Review, 76(4), 503-525.
Breiter, A., & Light, D. (2006). Data for school improvement: Factors for designing effective
information systems to support decision-making in schools. Educational Technology &
Society, 9(3), 206-217.
Coburn, C. E., & Talbert, J. E. (2006). Conceptions of evidence use in school districts: Mapping the terrain. American Journal of Education, 112(4), 469-495.
Cooper, P. M. (2003). Effective white teachers of black children: Teaching within a community. Journal of Teacher Education, 54(5), 413-427.
Datnow, A., & Hubbard, L. (2016). Teacher capacity for and beliefs about data-driven decision making: A literature review of international research. Journal of Educational Change, 17, 7-28.
Datnow, A., Park, V., & Kennedy-Lewis, B. (2013). Affordances and constraints in the
collaboration for the purpose of data. Journal of Educational Administration, 51(3), 341-
362.
Dee, T. S., & Jacob, B. (2011). The impact of No Child Left Behind on student achievement. Journal of Policy Analysis and Management, 30(3), 418-446.
Dunn, K. E., Airola, D. T., Lo, W. J., & Garrison, M. (2013a). Becoming data driven: The
influence of teachers’ sense of efficacy on concerns related to data-driven decision
making. The Journal of Experimental Education, 81(2), 222-241.
Dunn, K. E., Airola, D. T., Lo, W. J., & Garrison, M. (2013b). What teachers think about what
they can do with data decision-making efficacy and anxiety inventory. Contemporary
Educational Psychology, 38, 87-98.
EdSource. (2014). Reforming testing and accountability: Essential principles for student success
in California. https://edsource.org/publications
Farrell, C. (2012). Designing school systems to encourage data uses and instructional
improvement: A comparison of school districts and charter management organizations.
Educational Administration Quarterly, 51(3), 438-471.
Gadsden, V. L., & Dixon-Roman, E. J. (2017). Urban schooling and urban families: The role of
context and place. Urban Education, 52(4), 431-459.
Harding, J. (2013). Qualitative data analysis from start to finish. Santa Monica, CA: SAGE
Publications.
Horn, I. S., & Little, J. W. (2010). Attending to problems of practice: Routines and resources for
professional learning in teachers’ workplace interactions. American Educational
Research Journal, 47(1), 181-217.
Horn, I. S., Kane, B. D., & Wilson, J. (2015). Making sense of student performance data: Data
use logics and mathematics teachers’ learning opportunities. American Educational
Research Journal, 52(2), 208-242.
Huguet, A., Marsh, J. A., & Farrell, C. C. (2014). Building teachers’ data-use capacity: Insights
from strong and developing coaches. Education Policy Analysis Archives, 22(52), 1-31.
Ikemoto, G. S., & Marsh, J. (2007). Cutting through the data-driven mantra: Different
conceptions of data driven decision making. Santa Monica, CA: RAND Corporation.
Kerr, K. A., Marsh, J. A., Ikemoto, G. S., Darilek, H., & Barney, H. (2006). Strategies to
promote data use for instructional improvement: Actions, outcomes, and lessons from
three urban districts. American Journal of Education, 112(4), 496-520.
Mandinach, E. B. (2012). A perfect time for data use: Using data-driven decision making to inform practice. Educational Psychologist, 47(2), 71-85.
Mandinach, E. B., & Gummer, E. S. (2016). Every teacher should succeed with data literacy. Phi Delta Kappan, 97(8), 43-46.
Marsh, J. (2012). Interventions promoting educators' use of data: Research insights and gaps. Teachers College Record, 114(11), 1-48.
Milner, H. R., IV, Murray, I. E., Farinde, A. A., & Delale-O'Connor, L. (2015). Outside of school matters: What we need to know in urban environments. Equity & Excellence in Education, 48(4), 529-548.
Schifter, C. C., Natarajan, U., Ketelhut, D. J., & Kirchgessner, A. (2014). Data-driven decision
making: Facilitating teacher use of student data to inform classroom instruction.
Contemporary Issues in Technology and Teacher Education, 14(4), 419-432.
Schmidt, M., & Datnow, A. (2005). Teachers’ sense-making about comprehensive school
reform: the influence of emotions. Teaching and Teacher Education, 21, 949-965.
Sharp, L. A. (2016). ESEA reauthorization: An overview of the Every Student Succeeds Act. Texas Journal of Literacy Education, 4(1), 9-13.
Wayman, J. C. (2005). Involving teachers in data-driven decision making: Using computer data
systems to support teacher inquiry and reflection. Journal of Education for Students
Placed at Risk, 10(3), 295-308.
Young, V. M. (2006). Teachers’ use of data: Loose coupling, agenda setting, and team norms.
American Journal of Education, 112, 521-548.
Appendix A: School Screener
1. How long have you been implementing data use at your school site?
2. What conditions, if any, did you establish with respect to the way you wanted
teachers to use data?
a. Describe how, if at all, you actively established conditions and
expectations at your school site.
3. Tell me about the way you use data, if at all, with your administration and
teachers at your school site. For example, in faculty meetings where all of the
teachers are meeting together versus in grade level meetings where they might
only be meeting with other members of their grade level.
4. Who makes decisions about what data is used during meetings (where all of the
teachers are present, not in grade levels)?
5. How often do you hold meetings with all of your teachers to engage in data
conversations?
6. Who, if anyone (e.g., instructional coach, grade level chair, coordinator), facilitates teachers' conversations during meetings?
7. What discussion tools and protocols, if any, do you use to engage in data use?
8. How often, if at all, do you meet with lead teachers (grade level chairs) to discuss
how they will engage in data conversations during their grade level meetings?
9. How often, if at all, do you engage in walk-throughs to observe teachers’
practice?
Appendix B: Observation Tool
Teacher Name: _____________________________
School: ___________________________________
Location: __________________________________
Observer: __________________________________
Date: ______________________________________
Type of Activity Observed:
Grade Level Meeting
Faculty Meeting
Professional Development
Classroom Observation
Activity Description:
Start Time: ________________________________ End Time: _________________________
Activity and Materials Used: ______________________________________________________
Front
Back
Time _________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
Time _________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
Time _________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
Time _________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
Appendix C: Teacher Interview Protocol
I. Introduction
I would like to thank you for agreeing to participate in my study and for taking the time to do so. I want to recap my study and answer any questions you might have before we begin. The study I am conducting focuses on a teacher's use of data. This entails looking at how leadership plays a role in supporting teachers' use of data in the school. I will look at how teachers interact and engage in conversations during grade level meetings and professional development to understand how a teacher makes sense of the data, and I will see whether this information is transferred into the teacher's practice in the classroom. I will be interviewing you along with _____________ (Principal). I will interview ____________ (Principal) to gain further information as to how they take on the role of the leader to create structures that support teachers in understanding how to use data.
I do want to assure you that, as we begin our interview, I am taking the role of a researcher, which means that I am here to listen actively to your responses. I am not evaluating your responses or observations; I am using this information strictly for my data analysis. I will not share this information with others, such as teachers or the principal.
The data for this study will be compiled into a report, and I plan on using some of what you say as direct quotes. You can choose your pseudonym; this will help keep your identity confidential. I am able to provide you with my final paper if you are interested. Do you have any questions before we begin the interview?
Most importantly, I would like to have your permission to begin the interview. I will be using my iPhone app as a recorder so that I can accurately capture what you share with me. Do I have your permission to record our conversation? Please note that I will delete this recording once I capture the necessary data for my analysis.
Interviewer:________________________
Interviewee:________________________
Location: __________________________
Start Time: _________________ End Time: __________
Focus of the Interview:
• Understand the role of the principal during professional development
• Understand how the professional learning communities interact and discuss data
• Understand how the teacher engages in data talks during meetings with colleagues
• Become familiar with the teacher’s belief and use of data use in his/her classroom
II. Setting the Stage (Developing Rapport and Priming the Mind, Demographics items of
interest):
I would like to start by asking you some background questions about you.
1. First, please state your name.
2. Tell me about your background in education?
3. How long have you been teaching?
4. What grade level do you currently teach?
5. Is there any other information you would like to share before moving on with the
interview?
III. Heart of the Interview (Interview Questions are directly tied to Research Questions):
IV. Now I would like to ask you questions about your experience with the use of data at
your school and the role the principal plays.
1. Tell me about the way your principal uses data, if at all, with you or your peers at
your school.
a. Describe a recent experience you had with your principal where they used
data with you and your peers.
b. What kind of data was used during this experience?
2. What role, if any, does the principal play in the way that data is available for you
and your peers?
3. Tell me about the way you and your peers use data at your school (for example, at faculty meetings or other times outside of your PLC).
4. What discussion tools and protocols, if any, are used by your principal to engage
in data use?
a. Describe an experience where the protocols were used.
5. What conditions and expectations, if any, has your principal created to support
learning during professional development?
a. Provide an example that you think demonstrates what those conditions and
expectations look and sound like in a recent meeting.
6. In what ways, if at all, does the principal support your and your peers' ability to carry the information discussed in meetings into your classroom(s)?
a. Describe an experience where you discussed this information with your
principal.
7. How often, if at all, does your principal engage in walk-throughs to observe your
practice?
a. Describe the last time they walked through.
b. What was the experience like?
8. How often, if at all, do you meet with your principal to discuss formal/informal
observations in your classroom, if any?
a. Describe your recent experience.
b. What was it like?
9. Is there any other information you would like to share before we move on with
the interview?
V. Now I would like to ask you questions about your experience during grade level,
professional development, and/or professional learning communities.
1. How often do you engage in meetings with your grade level?
a. Do you have an example of a calendar that is used for meetings with your
grade level?
2. What are some of the content areas of focus, if any, that your team discusses?
a. What made _________ the content area of focus?
b. How was this decided?
3. What discussion tools or protocols, if any, are used during your meetings to
support teachers understanding of data?
a. Describe a recent discussion. Tell me about what happened.
4. Do you have an example of a recent agenda that you had for your meeting?
a. May I see it?
b. Can you walk me through the agenda? How was this information presented during your meeting?
5. Who, if anyone (e.g., instructional coach, grade level chair, coordinator), facilitates your data meetings?
6. Who, if anyone, makes decisions about what data you use during your meetings?
7. What professional developments are offered to support teachers’ understanding of
data, if any?
8. Is there any other information you would like to share before we move on with
the interview?
VI. Now I would like to ask you questions about your experience and beliefs about data
use and instruction in your classroom.
1. How do you incorporate student data in your lesson planning, if at all?
a. Describe a recent experience using data in your planning.
2. Do you have a current lesson that you implemented?
a. May I see it?
b. Can you walk me through the lesson?
c. What kind of data did you use when creating the lesson?
d. Tell me more about the lesson.
3. What are your thoughts about using data to inform your instruction in the
classroom?
4. Who, if anyone (e.g., instructional coach, grade level chair, coordinator), supports your planning/instruction in the classroom based on data discussed in meetings?
5. Tell me about any additional professional development that you have attended to
support your understanding of data in the classroom, if any.
a. Tell me about a recent experience.
Some potential probes:
• You said “ _______________” could you elaborate more on that idea?
• That is great. Have you thought about ___________?
• You mentioned "______________"; could you tell me more about ______________?
• Can you walk me through _______ in further detail?
• What were your thoughts when ___________? Please tell me more.
• Can you tell me more about __________?
VII. Closing Question (Anything else to add)
Is there anything you would like to add to our conversation today that might not have
been covered?
VIII. Closing (Thank you and Follow-up Options)
Thank you so much for sharing your thoughts with me today. I know that it is a busy
time in the year and I really appreciate your time and willingness to share. Everything
you have shared is really helpful for my study. If at any point I find myself with
follow-up questions, may I contact you? If so, is e-mail or a phone call okay? Once
again, I would like to thank you for your participation.
IX. Post Interview Summary and Reflection
Abstract
Schools generate and collect different forms of data to identify and understand student performance. Data is a tool for teachers to make sense of information and make instructional shifts to support the learners in their classrooms. The context within which data is used matters. To understand how a teacher was able to use data within her context, this study explored and addressed the following questions: How does one elementary school teacher use data in collaboration and professional development meetings? What role does leadership play in the way that an elementary school teacher uses data in collaboration and professional development meetings and in her classroom instruction? How are those data use activities reflected in her classroom instruction? This qualitative single-case study examined the interactions among a teacher, a Math TOSA, and her grade level team during professional development and collaboration meetings. The study was conducted at a K-5 public elementary school in Los Angeles County with a population of 35% English Learners (ELs) and 70% socioeconomically disadvantaged students. The data for this case study included observations, a teacher interview, and reflective memos. The findings from this study revealed that the teacher was able to use data in general ways; however, it was evident that the Math TOSA and Teacher Leader constrained opportunities to examine and interrogate the teacher's practice. To support teachers with tools to use data effectively to inform their instruction, leadership needs to provide structures to analyze and examine teacher practice.
Asset Metadata
Creator: Aguirre, Genesis Guadalupe (author)
Core Title: A teacher's use of data to support classroom instruction in an urban elementary school
School: Rossier School of Education
Degree: Doctor of Education
Degree Program: Education (Leadership)
Degree Conferral Date: 2022-12
Publication Date: 09/17/2022
Defense Date: 03/24/2022
Publisher: University of Southern California (original), University of Southern California. Libraries (digital)
Tag: classroom, data, data use, Education, elementary, instruction, leadership, OAI-PMH Harvest
Format: application/pdf (imt)
Language: English
Contributor: Electronically uploaded by the author (provenance)
Advisor: Slayton, Julie (committee chair), Ephraim, Ronni (committee member), Samkian, Artineh (committee member)
Creator Email: genesis_aguirre@yahoo.com, ggaguirr@usc.edu
Permanent Link (DOI): https://doi.org/10.25549/usctheses-oUC111996060
Unique Identifier: UC111996060
Legacy Identifier: etd-AguirreGen-11210
Document Type: Dissertation
Rights: Aguirre, Genesis Guadalupe
Type: texts
Source: 20220917-usctheses-batch-981 (batch), University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection)
Access Conditions: The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the author, as the original true and official version of the work, but does not grant the reader permission to use the work if the desired use is covered by copyright. It is the author, as rights holder, who must provide use permission if such use is covered by copyright. The original signature page accompanying the original submission of the work to the USC Libraries is retained by the USC Libraries and a copy of it may be obtained by authorized requesters contacting the repository e-mail address given.
Repository Name: University of Southern California Digital Library
Repository Location: USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA
Repository Email: cisadmin@lib.usc.edu