THE ROLE OF LEADERSHIP IN USING DATA TO INFORM
INSTRUCTION: A CASE STUDY
by
Debra L. Coaloa
A Dissertation Proposal Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
Final Defense Date: June 7, 2016
Degree Conferral Date: December 14, 2016
Copyright 2016 Debra L. Coaloa
“The final conclusion is that we know very little, and yet it is astounding that we know so much,
and still more astounding that so little knowledge can give us so much power.”
– Bertrand Russell
Acknowledgements
Writing this dissertation was unlike anything I have ever done before. It required a
tremendous amount of stamina, dogged perseverance, belief in self, and sheer grit. It would not
have been possible without the support of many people in my life.
To Julie Slayton, my dissertation chair: I will never forget the countless hours we spent together
as you guided me through every step of this arduous process. In your role as both professor and
dissertation chair, you challenged our assumptions, pressed us to wrestle with the literature, and
always held us to high standards. As I plowed through the data, you helped me articulate hazy
ideas, confirmed my hunches, pushed my thinking, and somehow generated excitement about the
discoveries even when my own interest was waning. You helped me to realize how this
seemingly endless process could be valuable to me and others. I am deeply grateful for you.
To Julie Marsh: You inspired me as both a professor and a researcher. I never knew anyone
who could make the topic of accountability so fascinating. Thank you for offering valuable
feedback and for your support and positive thoughts in this demanding process.
To Katharine Strunk: I admire your work, and I appreciate your willingness to serve on my
committee. Thank you for your insights, thoughtfulness, and feedback on my dissertation.
To Jenn Wolfe: Thank you for taking this journey with me, for injecting much-needed comic
relief, for giving me technical support, for keeping me on track, for helping me break through the
blocks, for nurturing me with food and drink, and for continuously giving me much-needed
encouragement and advice. I will never forget this time or our project on reflective practice.
To Judy Bakenhus: After years in the field, I was thrilled to learn that we would be taking this
journey together late in our careers. Your presence in the program served as a constant reminder
of the close connection between theory and practice. I am happy we collaborated on a project.
To my father: Your influence has shaped me in lasting ways. I am so proud to be your daughter.
To my beloved late mother who lost her battle with breast cancer in the middle of this: Thank
you for your constant presence in my life even now, for always believing in me, and for
encouraging me to become an educator and to continuously pursue lofty goals.
To my husband: You are the love of my life. Thank you for your constant support,
encouragement, generosity, and love. Thank you for always pushing me to reach the highest
heights. You stood by me through the toughest moments, massaged my feet, cooked for me,
allowed me to bounce ideas off of you, and endured endless hours alone waiting for me to finish.
I bless the day I found you and cherish each and every moment that we spend together.
To my five adult children: I did this for you as much as for anyone with the intent of reminding
you that learning is a lifelong endeavor. Remember to be positive, kind, hardworking, and
dedicated. Rejoice in your successes, but develop resilience so that you can confront the
setbacks you will inevitably face. Join a scholarly community and engage in vibrant
conversations with likeminded people who strive to improve their minds and enrich the lives of
others. I love you all and want to see you thrive in whatever paths you choose to take.
To my friends: Brenda, for providing me with a safe harbor while I completed the study; Marilu,
for being a great listener, a dear friend, and godmother to my children; Ana, for being a caring,
supportive friend; and Brian, for providing me with a wonderful release from all the pressure
through the enchanting world of horseback riding.
To my family members: I love you all. I am proud to be the first woman doctor in the family.
To members of my cohort, the TEMs Concentration, and the dissertation group: You enriched
me with your ideas, lifted my spirits at some of the lowest moments, gave me endless amounts of
support, advice, and reassurance throughout the entire doctoral program, and continue to inspire
me with your projects, plans, and career choices. Fight on, my dear fellow Trojans!
Table of Contents
Acknowledgements ..........................................................................................................................3
List of Tables ...................................................................................................................................7
List of Figures ..................................................................................................................................8
Abstract ............................................................................................................................................9
Chapter One: Introduction .............................................................................................................10
Background of the Problem ...............................................................................................11
Why Teacher Evaluation Needed to Be Revamped ...........................................................12
Statement of the Problem ...................................................................................................14
Purpose of the Study ..........................................................................................................14
Research Question .............................................................................................................15
Importance of the Study .....................................................................................................15
Chapter Two: Literature Review ...................................................................................................17
Leadership ..........................................................................................................................17
Distributed Leadership ...........................................................................................18
Student-Centered Leadership .................................................................................20
Instructional Leadership.........................................................................................24
Transformative Leadership ....................................................................................28
Adaptive Leadership ..............................................................................................31
Professional Development .................................................................................................33
Models of Professional Development ....................................................................34
Reflective Practice .....................................................................................34
Knowledge for, in, and of Practice.............................................................38
Professional Learning Communities ..........................................................42
Essential Elements of PD that Support Practice Change .......................................47
Theoretical Work .......................................................................................47
Empirical Work ..........................................................................................53
Accountability ....................................................................................................................58
Theoretical Concepts of Reciprocal and Internal Accountability ..........................58
Empirical Work on Internal Accountability ..........................................................62
Lateral Accountability ...........................................................................................65
Empirical Work on Lateral Accountability............................................................67
External Mandates for Accountability ...................................................................68
Data-Driven Decision Making ...............................................................................72
Data Use Conversations .........................................................................................77
Teacher Evaluation ................................................................................................82
Contextual Factors that May Impact Principals’ Practice, Including Data Use ....87
Empirical Work on Teacher Evaluation ................................................................88
Empirical Work on Principals’ Use of Teacher Evaluation Data ..........................90
Conceptual Framework ......................................................................................................92
School Leadership ..................................................................................................94
Professional Development .....................................................................................95
Accountability ........................................................................................................97
Conclusion .............................................................................................................98
Chapter Three: Methodology .........................................................................................................99
Research Design.................................................................................................................99
Sample and Population ......................................................................................................99
Site Selection .......................................................................................................100
Participant Selection ............................................................................................101
Instrumentation and Data Collection ...............................................................................102
Process of Constructing the Interview Protocol ...................................................104
The Observation Protocol ....................................................................................104
Data Analysis ...................................................................................................................105
Limitations .......................................................................................................................107
Delimitations ....................................................................................................................107
Credibility and Trustworthiness ...........................................................................108
Chapter Four: Findings ................................................................................................................110
Context of Study and Findings ........................................................................................110
Background of Star High School .........................................................................111
Principal ...............................................................................................................111
Faculty and Staff ..................................................................................................113
Research Question ...............................................................................................114
Theme #1 .............................................................................................................115
Subtheme 1a .............................................................................................115
Subtheme 1b.............................................................................................148
Subtheme 1c .............................................................................................176
Theme #2 .............................................................................................................185
Theme #3 .............................................................................................................194
Theme #4 .............................................................................................................201
Conclusion .......................................................................................................................208
Chapter Five: Discussion, Implications, and Future Research ....................................................209
Summary of Findings .......................................................................................................210
Implications and Recommendations ................................................................................212
Practice .................................................................................................................212
Policy ...................................................................................................................213
Research ...............................................................................................................214
References ....................................................................................................................................215
List of Tables
Table 1: Key Processes and Core Components .............................................................................21
List of Figures
Figure 1: Framework to Determine the Impact of PD ...................................................................49
Figure 2: Conceptual Framework of Data-Driven Decision Making in Education .......................73
Figure 3: Conceptual Model ..........................................................................................................93
Abstract
Data use is proliferating in schools as a tool to inform instructional improvement. Teacher
evaluation is increasingly viewed as an important data source and mechanism in this effort. This
qualitative case study sought to examine how data generated from teacher evaluation and other
teacher learning experiences worked in conjunction to improve practice. More specifically, this
study examined the role of leadership in using data for the purpose of increasing teacher
knowledge and skills. Spanning a four-month period, the study focused on eight English
teachers, a principal, and two assistant principals in one high school involved in implementing a
new teacher evaluation process and immersed in data use for the purpose of improving practice.
Findings revealed that the principal was not well equipped to build the capacity of her staff to
use data to examine their pedagogy in a way that would foster instructional innovation. Her
efforts resulted in little more than minor tweaks to practice. Likewise, she did not have a clear
approach to improving instruction. Her emphasis was on initiating multiple, disconnected
learning experiences that were not consistently aligned; these experiences included neither an
explanation of why and how they would enhance instruction nor an expectation of follow-through
to ensure that new learning would take hold. Professional development was mostly
delivered in a top-down fashion that resulted in the exclusion of teacher voice. Finally, the
principal responded to external accountability demands by buffering her teachers from the
cumbersome, unpleasant aspects of them, while simultaneously using them as leverage to pursue
instructional improvement. Ultimately, despite good intentions, the principal was not well
positioned to promote the use of data as a tool for teacher learning.
CHAPTER ONE: INTRODUCTION
Data use in schools is rapidly expanding as many forms of performance data besides
testing proliferate at school sites (Horn, Kane, & Wilson, 2015; Mandinach, 2012; Marsh, 2012).
Anxiety over testing created by accountability measures such as No Child Left Behind (NCLB)
has increased the occurrence of data use in schools (Mandinach, 2015). In response to these
accountability measures and other policy shifts, data-driven decision making is becoming a
critical part of educational practice across all levels of the educational system, including state,
district, and school (Mandinach, 2012). As a result, educators face ongoing pressure to use data
in their practice (Horn et al., 2015; Mandinach, 2012; Marsh, 2012). Data use is viewed as a sine
qua non for school improvement (Mandinach, 2012). As former Secretary of Education
Margaret Spellings once stated, “What gets measured, gets done” (Mandinach, 2012, p. 72).
When I began this study, I set out to examine how a school was using data generated
specifically by teacher evaluation to inform instruction. I was initially motivated to engage in
research on this topic by my own personal and professional role in implementing a new teacher
evaluation system in a large school district with the aim of improving teacher effectiveness. In
my effort to contextualize teacher evaluation in a school setting, I expected to find a synergy
between the use of data in the enactment of teacher evaluation and data use in other professional
learning experiences such as professional development. Instead, as the study progressed, though
there was some overlap, I found more of a disconnection between these two types of data use.
This silo effect seemed to be about the principal’s leadership around data-driven practices more
than anything else. Thus, my study shifted from a focus exclusively on teacher evaluation to the
role of leadership in using data, including teacher evaluation data, to improve instruction.
In this chapter, I set the context for this study by presenting background information on
issues related to the use of data for improving teacher practice, in particular, teacher evaluation,
which was the initial impetus for this study. After that, I present the statement of the problem,
the purpose, and the importance of the study.
Background of the Problem
Recently, policy makers have pushed the use of data beyond compliance and
accountability measures. They emphasize its use to inform continuous improvement, setting the
stage for educators to examine multiple types of data (Mandinach, 2012). However, the presence
of data alone does not amount to much without knowledge about how to collect, organize, use,
and interpret data (Horn et al., 2015; Mandinach, 2012; Marsh, 2012). Furthermore, educators
are not well equipped with the skills and knowledge to use data effectively to inform practice.
Currently, there is a dearth of training to support the use of a broad range of data in ways that
contribute to instructional improvement (Mandinach, 2012).
Coinciding with the pressure to use data to drive improvement is an effort to improve
teacher quality. A mounting body of evidence confirms that teacher quality has an effect on
student outcomes, particularly with low-performing students (Aaronson, Barrow, & Sander,
2007; Darling-Hammond, 1999; Donaldson, 2011; Rockoff, 2004; Stronge, Ward, & Grant,
2011). Of all the variables that schools control, teachers play a pivotal role in the success or
failure of their students (Aaronson et al., 2007; Darling-Hammond, 1999; Rockoff, 2004;
Stronge et al., 2011). Indeed, knowledgeable, well-prepared teachers can have a greater
influence on student achievement than other factors, such as poverty, language history, and
minority status (Darling-Hammond, 1999). This growing realization has caused many to take a
critical look at policies and procedures designed to improve teacher effectiveness. Teacher
evaluation is increasingly viewed as an important mechanism for improving teacher quality.
Why Teacher Evaluation Needed to Be Revamped
In recent years, states and districts across the nation have made significant changes, some
dramatic and controversial, to their teacher evaluation systems. These changes have been
motivated in large part by incentives in federal programs such as Race to the Top (RTTT), NCLB,
and the Teacher Incentive Fund (Darling-Hammond, 2013; Hallgren, James-Burdumy, & Perez-
Johnson, 2014; Hallinger, Heck, & Murphy, 2014) and by a shifting political climate that
endorses holding teachers accountable for student performance (Baker, Oluwole, & Green,
2013). In an effort to respond rapidly to the requirements of these competitive federal programs,
many districts have developed and implemented major changes to their teacher evaluation
procedures often without taking a critical look at the policy logic behind these changes or at the
existing research (Darling-Hammond, 2013; Hallinger et al., 2014). Consistent with the
increased focus on teacher evaluation is a growing body of literature affirming that teacher
effectiveness is tied directly to student learning. Teacher evaluation is viewed as the most
promising means for improving teacher effectiveness (Darling-Hammond, 2013; Hallinger et al., 2014).
Historically, teacher evaluation systems have done little to help teachers improve their
practice or to distinguish effective teachers from those who need additional support (Darling-
Hammond, 2013). Based predominantly on casual, infrequent observations from site
administrators who bestow mostly satisfactory ratings, teacher evaluation has long been viewed
as an “essentially meaningless ritual” (Danielson & McGreal, 2000, p. 7). Citing a report on
teacher evaluation from a group of expert teachers, Accomplished California Teachers, Darling-
Hammond (2013) identified specific drawbacks of traditional teacher evaluation systems: (a) No
focus on improving practice; (b) Inadequate time and staff for effective evaluations; (c) Little or
no consideration of student outcomes; (d) One-size-fits-all procedures that do not consider
individual teacher needs; and (e) Disconnection of evaluations from professional development
(p. 4). Danielson and McGreal (2000) identified additional limitations of traditional teacher
evaluation systems, including limited administrative expertise, hierarchical one-way
communication from administrator to teacher, few shared values and assumptions about what
constitutes good teaching, and no differentiation between novice and experienced practitioners.
Apart from these problems, numerous studies have shown that existing teacher
evaluation, tenure, and dismissal policies have serious deficiencies and have actually impeded
efforts to improve teacher quality and student achievement (McGuinn, 2012). In most districts,
teachers are granted tenure after three years in the classroom with no meaningful evaluation of
their teaching effectiveness and with little risk of dismissal for the remainder of their careers
(McGuinn, 2011). Despite the growing consensus that a major overhaul of the evaluation system
urgently needs to take place, surprisingly, no state had made a serious, sustained effort to reform
its evaluation system until the advent of RTTT (McGuinn, 2011).
Along with major changes to teacher evaluation, federal and state policies have spurred
an increased use of data in schools. Educators urgently need support in figuring out how to use
the full range of data in the service of school improvement (Horn et al., 2015; Mandinach, 2012;
Marsh, 2012). New teacher evaluation systems that include the use of evidence-based practices
can be viewed as an important source of data, in conjunction with other forms of data, to
improve instruction.
Statement of the Problem
Data use is increasing across all levels of educational practice, and the pressure to use
data to inform instruction is growing as well (Mandinach, 2012; Marsh, 2012). Educators lack
sufficient training and support in knowing how to use different kinds of data effectively in ways
that contribute to instructional improvement (Mandinach, 2012; Marsh, 2012). Teacher
evaluation is viewed as one important mechanism for using data to improve teacher quality.
Currently, however, teacher evaluation is viewed in isolation. There is little empirical evidence to
support the assumption that teacher evaluation can be used successfully at a school site in ways
that lead to instructional improvement (Hallinger et al., 2014). Likewise, there is little evidence to show
the linkage between data generated from teacher evaluation and other forms of data that are
being used to inform teacher practice. It is not clear what happens when teacher evaluation data
are used at a school site in conjunction with other forms of data and other professional learning
experiences to foster teacher learning and instructional improvement.
Purpose of the Study
The purpose of this case study was to understand the way in which teacher evaluation
was used to contribute to an overall improvement in instruction in conjunction with other forms
of data and other teacher learning opportunities. The study attempted to capture a picture of
what teacher evaluation looked like in a secondary school with a historically disadvantaged
school population. This study also sought to examine how data generated from teacher
evaluation procedures might lead to practice change. In particular, I examined the role of
leadership in facilitating collaborative inquiry and discourse around the use of data with the
purpose of increasing teacher knowledge and skills. As this study unfolded, I discovered that
teacher evaluation data could not be viewed in isolation from other forms of data and other
USING DATA TO INFORM INSTRUCTION 15
teacher learning experiences. It was not possible to look at it separate from other data used at the
school. As such, the research question evolved into one that was more representative of what I
studied. Interview data were collected from teachers and administrators to explore their
perceptions of the teacher evaluation process, how it connected to professional development, and
whether or not they felt these professional learning experiences were impacting their practice.
Observation tools were used to capture the quality of conversations that took place during
teacher evaluation procedures and professional learning experiences. Lastly, data from
observations, interviews, and collected documents were analyzed to discover how data generated
from teacher evaluation and other professional learning experiences were used to increase
teacher learning.
Research Question
This qualitative case study was guided by the following research question:
What is the role of a high school principal in using data, inclusive of teacher evaluation
data, as a tool for teacher learning?
Importance of the Study
In the rush to meet the demands of accountability measures such as NCLB and RTTT,
schools are experiencing increasing pressure to use data to drive improvement (Mandinach, 2012;
Marsh, 2012). Teacher evaluation is considered a promising data source and an important
mechanism for improving instruction. This study provides insight into how one school used data
in the enactment of teacher evaluation and professional development for the purposes of
improving instruction. It might benefit those interested in learning how to maximize the use of
data as a tool for teacher learning, including data that are generated from teacher evaluation and
data from other professional learning opportunities. It also might benefit those interested in
learning how these two data sources can work in conjunction to contribute to overall
improvement in practice.
CHAPTER TWO: LITERATURE REVIEW
The research question for this study asked: What is the role of a high school principal in
using data, inclusive of teacher evaluation data, as a tool for teacher learning? To answer this
question and to inform my thinking, this chapter draws upon three relevant bodies of literature:
models of leadership, professional development, and accountability. Together, these bodies of
literature provide insight into different aspects of this question. To begin this chapter, I
examine the following models of leadership: distributed, student-centered, instructional,
transformative, and adaptive. Next, I review the literature on professional development, paying
particular attention to models and features of professional development that support practice
change. After that, I examine types of accountability, including internal, reciprocal, holistic, and
external accountability. I then specifically address three aspects of external accountability:
data-driven decision making, data use conversations, and teacher evaluation. Finally, I conclude this
review of the literature with the presentation of my conceptual framework that served as the
basis for my study’s methodology.
Leadership
Spillane, Halverson, and Diamond (2004) and other researchers (Hallinger, 2011;
Leithwood et al., 2004) have pointed out that leadership is critical to instructional innovation in
schools. A leader plays a central role in creating the conditions that support school improvement
and practice change (Goldring, Porter, Murphy, Elliot, & Cravens, 2009) in ways that increase
student learning (Leithwood et al., 2004). To underscore this impact, Leithwood et al. (2004)
emphasized, “There are virtually no documented instances of troubled schools being turned
around without intervention by a powerful leader” (p. 5). I begin this literature review by
examining the area of school leadership. This review provides insight into how leadership
influences the way teacher evaluation is taken up in a school setting to support teacher learning,
eventually leading to teacher practice change.
Spillane et al. (2001, 2004), along with other voices (Hallinger, 2011; Hord, 1997),
characterized the literature on leadership as shortsighted and limited, focusing predominantly on
the school principal, his/her traits, attributes, personality, and behaviors. A more complex
approach to studying leadership is needed, one that considers the organizational, cultural, and
political factors that impact the decisions and actions of school leaders (Spillane et al., 2004;
Spillane & Healey, 2010). For this reason, I have chosen to examine models of leadership rather
than traits or features, because this approach offers a more inclusive, holistic view of leadership.
In this section, I present various models of leadership that offer insight into how school leaders
influence teacher practice, including distributed, student-centered, instructional,
transformational, and adaptive leadership. To this end, I address both theoretical and empirical
literature.
Distributed Leadership
Spillane et al. (2004) presented an integrative, conceptual model of leadership, one that
explores the interaction of a leader’s thinking and behavior within the context of a vibrant,
complex, ever-changing school environment. In the distributed model of leadership, multiple
leaders work together in a community to improve instruction. Spillane et al. (2004) defined this
type of school leadership as “the identification, acquisition, allocation, co-ordination and use of
the social, material, and cultural resources necessary to establish the conditions for the possibility
of teaching and learning” (p. 11). Leadership, in this sense, centers on the abilities of an array of
school leaders working in unison to energize and mobilize members of the school community
toward improvement of instruction as well as to harness the resources needed to support that
effort (Spillane et al., 2004).
In the distributed model, the interactions that take place between multiple leaders, both
positional and informal, are characterized as “reciprocal interdependencies” (Spillane et al.,
2004, p. 18) where leaders work separately but interdependently to complete leadership tasks,
co-leading the school’s development, each bringing different skills, knowledge, and perspectives
to accomplish common goals. Spillane et al. offered an example of how a school could possibly
approach the task of teacher evaluation. The principal might take an authoritative role,
conducting formal, infrequent summative teacher observations while the assistant principal
assumes a more supportive, friendly role, informally visiting classrooms, engaging in reflective
conversations with teachers to support instructional improvement. Human activity at a school
engaged in distributed leadership “is not simply a function of individual skill and knowledge but
is spread across people and situations” (Spillane et al., 2004, p. 16). This style of leadership
contrasts the traditional task-oriented approach where the principal simply assigns duties to
subordinates. Spillane et al. described this prevalent task-oriented approach to leadership as
“disjointed, discretionary, and emergent” (p. 12), likening it to “firefighting” (Weick, 1996, as
cited in Spillane et al., 2004), which too often leads to short-term fixes, not long-term planning.
In contrast, they referred to distributed leadership as the person-plus versus person-solo
perspective (Spillane et al., 2004) to provide an image of leadership that extends beyond the
mind of a single, charismatic individual to one that spreads across a social and material terrain
that includes different leaders and is grounded in the interaction of people and the complex
situations in which they work. Spillane et al. argued that a distributed perspective can make the
dense black box of school leadership more transparent and “can help leaders identify the
dimensions of their practice, articulate the relations among these dimensions, and think about
changing their practice” (p. 29).
Student-Centered Leadership
Goldring et al. (2009) presented the model of student-centered leadership as a set of
processes and conditions that interact to create school environments that are conducive to student
learning and teacher practice change. The conditions or core components identified involve
characteristics of schools that support the learning of students and enhance the ability of teachers
to teach. They include such things as high expectations for all students, rigorous curriculum,
effective instructional practices, a healthy culture of learning and professional behavior,
connections to external communities, and systemic performance accountability. Key processes,
on the other hand, refer to those leadership behaviors, individually or collectively, that motivate
and raise the level of commitment of all members of the school community. They include
planning, implementing, supporting, advocating, communicating, and monitoring (Goldring et
al., 2009).
Effective student-centered leadership, thus, requires core components, the what, created
or enacted through key processes, the how. Table 1, below, provides a useful visual to
understand how these two dimensions intersect:
Table 1
Key Processes and Core Components

Key Processes (columns): Planning, Implementing, Supporting, Advocating, Communicating, Monitoring

Core Components (rows): High Expectations for Student Learning; Rigorous Curriculum (content); Quality Instruction (pedagogy); Culture of Learning & Professional Behavior; Connections to External Communities; Performance Accountability
Source. Goldring et al. (2009). The evaluation of principals: What and how do states and urban
districts assess leadership? The Elementary School Journal, 110, 19–39. Copyright 2009 by
University of Chicago Press.
Goldring et al. (2009) described these processes as “interconnected,” “recursive,” and
“highly reactive to one another” (p. 9). They offered an example of how school leaders might
approach instructional improvement, first by planning to collect key data, then by
communicating the value of examining the data to determine gaps in student achievement. Once
student needs were identified, leaders moved to implement changes in instruction, advocating for
diverse instructional approaches that met the needs of all students, and supporting teachers
throughout with resources and professional learning opportunities. Finally, these leaders
monitored the progress with ongoing classroom observations and corrective feedback (Goldring
et al., 2009). I reviewed these key leadership processes in more detail as each one illuminates
how learning-centered leaders can effectively motivate their organizations toward achieving the
core components (i.e., high expectations for student learning, rigorous curriculum, quality
instruction, etc.).
First, Goldring et al. (2009) defined planning as “articulating shared direction and
coherent policies, practices, and procedures for realizing high standards of student performance”
(p. 10). Planning provides leaders with a guide to mobilize resources, people, and tasks.
Goldring et al. referred to it as “an engine that builds common purpose and shared culture” (p.
10). Planning not only involves selecting highly qualified teachers whose values reflect the
shared vision of the school, but also includes thoughtful coordinated scheduling of the learning
activities they will engage in and acquiring the materials, technology, and other tools they will
need to provide quality instruction to students.
Secondly, when student-centered leaders implement, they “put into practice the activities
necessary to realize high standards for student performance” (Goldring et al., 2009, p. 10). To
ensure that policies and procedures are enacted that will further the core components, the leaders
establish such things as high quality instructional programs, community-building protocols,
programs to enhance communication with parents, structures to support professional learning,
and so forth.
Third, to demonstrate support, student-centered leaders secure “the financial, political,
technological, and human resources necessary to promote academic and social learning”
(Goldring et al., 2009, p. 11). They work diligently to support teachers in their efforts to
improve their practice. The leaders not only provide the teachers with the resources, technology,
and tools to be successful but also promote their efforts to grow and enhance their content
knowledge and pedagogy. Likewise, they create schedules that encourage the development of
professional learning communities.
Next, student-centered leaders advocate publicly on behalf of all students within the
school and beyond its boundaries. The leaders ensure that diverse students have equal access to
educational opportunities and that the cultural backgrounds of all students are valued and
respected. They reach out and develop “civic capacity” (Goldring et al., 2009, p. 12) with youth
development organizations and institutions in the nearby community and lobby on behalf of
parents and students with the school district bureaucracy.
After that, communicating is critical to developing a culture of learning and professional
behavior at a school (Goldring et al., 2009). Student-centered leaders are open, transparent,
highly visible, and accessible to students, parents, teachers, other staff, and members of the
community. To enhance communication with these constituents, they “develop, utilize, and
maintain systems of exchange” (Goldring et al., 2009, p. 12) in multiple formats, including in-
person, through email, by newsletter, and so forth. They also take frequent advantage of
opportunities to engage in reflective dialogue with stakeholders.
Finally, monitoring involves “systematically collecting and analyzing data to make
judgments that guide decisions and actions for continuous improvement” (Goldring et al., 2009,
p. 13). It can include an array of activities such as scheduling ongoing classroom observations,
assessing the effectiveness of professional development, reviewing student programs to ensure
access to rigorous coursework, and so forth. Monitoring also involves regularly reviewing
student achievement data and using it to identify gaps, provide extended learning opportunities to
students, or offer insights into ways for teachers to adjust their instruction.
Instructional Leadership
Strong instructional leadership from the principal is a hallmark of effective schools
(Hallinger, 2011). In the conceptual framework of instructional leadership, the school leader is
presented as someone who is “strong and directive” (Hallinger, 2003, p. 329), displaying a
combination of expertise and charisma that easily enables him/her to monitor, coordinate,
oversee, and shape curriculum and instruction in a school. In addition, instructional leaders are
characterized as goal-oriented, maintaining a vigilant and targeted focus on improving the
conditions for learning and creating coherence in values and actions across classrooms on a daily
basis (Hallinger, 2003, p. 337). Likewise, they are viewed as “culture builders,” ceaselessly
promoting high expectations and standards as well as a mindset of continuous learning for all
students and teachers (Hallinger, 2003). The instructional leader often has a narrow focus—
namely, the improvement of student academic outcomes—and is unafraid to enter the often
sacred, mysterious realm of the classroom, conversing courageously, candidly, and easily with
teachers regarding the improvement of instruction.
Hallinger (2003, 2005) presented a model of instructional leadership that includes the
following three dimensions: (a) Defining the school’s mission by communicating clear
measurable goals focused on raising student achievement; (b) Managing the instructional
program by observing and evaluating instruction, coordinating the curriculum, and monitoring
student progress; and (c) Promoting a positive school learning climate by protecting instructional
and professional learning time, maintaining a highly visible presence, and providing incentives to
both students and teachers for learning.
Hallinger (2011), likewise, identified three means through which leadership impacts
learning: (a) Vision and goals; (b) Academic structures and processes; and (c) People. The
research on the effects of school leadership conducted during the 1990s found that vision (the big
research on the effects of school leadership conducted during the 1990s found that vision (the big
picture or the direction in which the school seeks to move) and goals (the specific academic
targets that need to be achieved on the journey to realizing that vision) are the most significant
vehicles through which a leader can impact learning (Hallinger, 2011). The vision and goals are
invaluable in motivating people to participate and contribute their time, talent, and energies
(Hallinger, 2011). In addition, a leader’s influence has the potential to improve academic
structures and processes that enhance the practice of teachers. Citing a study conducted by
Robinson, Lloyd, and Rowe (2008) that set out to examine the impact of leadership on student
outcomes, Hallinger pointed out that a principal’s support for and participation in the
professional learning of staff results in the largest gain in learning outcomes. I describe this
study and present the findings in more detail at the end of this section.
After reviewing existing literature on the effectiveness of the role of the principal and the
impact of instructional leadership, Hallinger (2011) summarized several key findings derived
from this empirical work:
- Principals are value leaders. Values influence and guide decision making and problem
solving. Principals are “responsible for protecting what’s important” (Hallinger, 2011, p.
137). Their ability to explicitly articulate beliefs and values is viewed as one of several
“foundational competencies” (Hallinger, 2011, p. 137).
- The principal is important, but s/he can only achieve success with the cooperation of
others. Even with the presence of a strong school leader, there is a recognition by
researchers that the principal cannot act alone but achieves success only with the support
and cooperation of others, with his/her impact mediated continuously by the school’s
culture, work structures, processes, and people. Hallinger called this “mutual influence,”
referring to the impact that a school’s context has on both learning and leading (p. 137).
- Leadership should be aimed at building the school’s capacity for improvement.
Hallinger (2011) stated, “Both education and school improvement are about the
development of human capacity” (p. 137). The school must be viewed holistically as a
community of learners consisting of teachers, school leaders, students, and family
members who have a mutual impact on each other.
- Leaders should work to understand the school context first, then develop leadership
strategies. Strategies are selected based on the contextual needs of the school.
- Leaders should seek to share leadership and empower others. Citing Fullan (2011),
Hallinger (2011) argued that leadership at all levels is the key to school reform. Leaders
who focus on building capacity and developing human capital have the greatest potential
to impact student outcomes.
Adding to the body of literature on instructional leadership, Leithwood et al. (2004)
suggested that the term itself serves to keep the focus of all decision making at a school site on
teaching and learning. It likewise conveys a strong message that improving the classroom
practices of teachers will drive the work. Leithwood et al. pointed out that instructional
leadership is second only to classroom instruction in playing a hugely critical, if often
undervalued, role in student learning. They reaffirmed Hallinger’s (2003, 2005) model of
instructional leadership as laying out an important set of instructional leadership dimensions:
(a) Defining the school’s mission, (b) Managing the instructional program, and (c) Promoting a
positive learning climate.
Smith and Andrews (1989), likewise, identified several key behaviors of an instructional
leader that heighten his/her presence at a school site. According to them, an effective
instructional leader becomes adept at (a) Working cooperatively with the staff and the
community to develop clear goals that promote the school’s vision/mission, (b) Enhancing
his/her visibility by informally dropping in on classroom instruction, (c) Modeling behavior
consistent with the school vision, (d) Actively participating in professional development, (e)
Buffering the school from external pressures, and (f) Communicating clear expectations for
student learning. School effectiveness research affirms that a teacher’s perception of the school
principal as an instructional leader is the most powerful determinant of the satisfaction he or she
experiences with his or her professional role (Smith & Andrews, 1989).
Marks and Printy (2003), responding to critics who view the traditional conception of
instructional leadership as limited, “paternalistic, archaic, and dependent on docile followers” (p.
373), offered an expanded form that they referred to as shared instructional leadership, one that
is more inclusive and collaborative. In this new model, the leader empowers teachers to share
responsibility for professional and instructional improvement. S/he becomes less an inspector or
evaluator of instruction and more a facilitator of teacher growth. Collaborative inquiry replaces
principal-directed supervisory practices. The relationship is a reciprocal one.
Blase and Blase (1999) conducted a study to determine teachers’ perspectives on
instructional leadership. In this study, over 800 American teachers nationwide responded to an
open-ended questionnaire describing characteristics of principals that had a positive influence on
their classroom instruction. The researchers set out to investigate the following question: What
characteristics (e.g., strategies, behaviors, attitudes, goals) of school principals positively
influence teaching, and what effects do such characteristics have on classroom instruction?
The results showed that teachers reported two major themes as indicators of effective
instructional leadership: (a) Talking with teachers to promote reflection; and (b) Promoting
professional growth. Both of these had a positive, enhancing impact on teachers’ emotional,
cognitive, and behavioral dispositions. In addition, other principal behaviors, such as
demonstrating a firm belief in teacher choice, engaging in nonthreatening and growth-oriented
interactions with teachers, and putting forth sincere efforts to take an interest in teachers were
valued by teachers as evidence of strong instructional leadership. Blase and Blase (1999) found
that effective instructional leadership is an integral part of school culture, involving the use of
collaborative structures such as peer coaching, inquiry, collegial study groups, and reflective
discussion to promote professional dialogue. The researchers concluded that principals who
want to be instructional leaders should focus on promoting the use of reflection to shape a school
culture of individual and shared critical examination for instructional improvement.
Transformative Leadership
The transformative model of leadership focuses on the community’s capacity to innovate
(Hallinger, 2003). It is distributed in nature, highlighting the critical role that school leaders play
in developing a shared vision and shared commitment from school members to implement
change. The transformative leader enacts leadership similar to an instructional leader by
defining the school mission, articulating measurable goals, and creating a positive school climate
conducive to student and adult learning. Likewise, s/he oversees the instructional program by
observing and evaluating teachers, coordinating the curriculum, and monitoring student progress.
In this model, however, it is not assumed that the principal is the sole provider of leadership.
The transformative leader seeks to shape and create change by incorporating and synthesizing
the ideas and aspirations of all members of the school community.
In pointing out sharp differences between instructional and transformative leadership,
Hallinger (2003) provided insight into both models. He argued that the approach of the
instructional leader is top-down, inflexible, and directive, focusing on the control, coordination,
and management of the status quo in pursuit of predetermined school-wide goals that are tied
almost exclusively to student achievement. The instructional leader seeks to limit uncertainty
and ambiguity by implementing first order changes in schools—those that directly impact
instruction such as supervision of teaching or coordination of the curriculum.
The transformational leader, on the other hand, takes a bottom-up approach to change,
building the capacity of multiple leaders, encouraging their participation and commitment to
serve the greater needs of the organization (Hallinger, 2003). Recognizing that school
improvement is context-specific, the transformational leader generates second order changes—
those that tap into the personal aspirations and goals of individuals in the organization in ways
that motivate them to commit and empower them to make school-wide changes. The
transformational leader creates “shared influence settings,” which are collaborative, interactive,
dynamic and exploratory (Hallinger, 2003, p. 340). This type of leader is comfortable with
ambiguity and uncertainty, understanding that school improvement is a journey with many twists
and turns—not a destination. Furthermore, the transformational leader is interested in looking
beyond traditional student achievement outcomes to other indicators of school success such as
teacher perceptions, student engagement, and so forth.
Marks and Printy (2003) offered an additional perspective on the concept of
transformative leadership. According to them, the transformative leader works to improve school
performance by collaborating with stakeholders to engage in inquiry, problem finding, and
problem solving. The goal is to use his/her leverage to inspire commitment from “followers,”
encouraging them to reach their fullest potential individually through “idealized influence,
inspirational motivation, intellectual stimulation, and individualized consideration” (Marks &
Printy, 2003, p. 375), and then to set aside their own self-interests in support of the greater good.
Thus, a principal’s central role is to introduce innovation and shape an inclusive school culture
that supports school improvement. Marks and Printy cautioned that by focusing exclusively on
reform efforts and culture building, the transformative leader can sometimes lose sight of
instruction and what it takes to achieve high quality teaching and learning.
To address this concern, Marks and Printy (2003) presented an integrated model of
leadership, holding that an effective principal works simultaneously to innovate and improve
instruction. In an effort to investigate the potential of this new fused leadership model, Marks
and Printy conducted a study that examined the collaboration of principals and teachers around
instruction at 24 selected K–12 schools, all in the process of restructuring. Teachers at these
schools responded to a survey seeking information about their instructional practices,
professional activities, and perceptions of their school. The researchers also interviewed 25–30
staff members at each school, observed governance and professional meetings, reviewed
restructuring documents, and scrutinized the instruction and assessment practices of 144 core
teachers. Two of the research questions that were pertinent for my study follow: (a) What is the
relationship between transformational and shared instructional leadership in restructuring
elementary, middle, and high schools? and (b) What is the effect of transformational and shared
instructional leadership on school performance as measured by the quality of pedagogy and the
achievement of students?
Marks and Printy’s (2003) study confirmed that transformational leadership was a
necessary but insufficient form of leadership for instructional improvement. And yet, when
transformational and shared instructional leadership coexisted in an integrated form, the positive
influence on school performance, as measured by the quality of teaching and the academic
performance of students, was substantial.
Another research project produced similar findings. Robinson et al. (2008) conducted a
meta-analysis of 37 multinational studies to examine the impact of leadership on a wide range of
student outcomes. The first part of this meta-analysis involved a comparison between
instructional and transformational leadership. The researchers found that the impact of
instructional leadership on student outcomes is notably greater than that of transformational
leadership. Ultimately, however, they found that an integrated form of leadership (i.e., shared
instructional leadership combined with qualities of transformational leadership) was the best
predictor of the intellectual quality of student work.
Adaptive Leadership
Adaptive leadership has its roots in science, with a particular focus on the evolutionary
process where organisms continuously adapt to changing environments in ways that ensure they
not only survive but also flourish. It is defined as “the practice of mobilizing people to tackle
tough challenges and thrive” (Heifetz, Grashow, & Linsky, 2009a, p. 14). An adaptive leader
starts with what already exists, tapping into those with institutional memory, and then building
on that knowledge within an organization. Adaptive change involves determining what should
stay and what should be abandoned. As such, it is simultaneously “conservative and
progressive" (Heifetz et al., 2009a, p. 15). Heifetz et al. argued that the most effective
leadership makes change an enduring part of the organization by integrating it into the
organization's values, competencies, and strategic orientations. An adaptive leader understands that organizational
adaptation occurs through experimentation, separates the essential from the expendable, benefits
from diversity, and takes time.
Heifetz et al. (2009a) described adaptive leadership as an iterative process that involves
three activities: (a) Carefully observing events and patterns in the organization; (b) Interpreting
what is observed; and (c) Designing interventions to meet the challenges that have been
identified. Heifetz, Grashow, and Linsky (2009b) referred to the process of adaptation as “an
improvisational and experimental art” (p. 2) where leaders skillfully promote change while
preserving an organization’s identity. Likewise, these leaders become adept at delicately
balancing the urgency for change with the need to preserve aspects of the status quo, expertly
“orchestrating the inevitable conflict, chaos and confusion of change so that the disturbance is
productive rather than destructive” (Heifetz et al., 2009b, p. 3). They also work at replacing a
top-down, hierarchical orientation to authority with one that distributes leadership responsibility
across the organization, drawing upon the “collective intelligence” (Heifetz et al., 2009b, p. 4) of
diverse groups of people to assist with decision making and efforts at innovation.
From the literature reviewed in this section, it is clear that school leaders play a critical
role in influencing and creating learning environments that support teacher practice change.
Several different sources (Goldring et al., 2009; Hallinger, 2003, 2011; Heifetz et al., 2009a,
2009b; Marks & Printy, 2003; Spillane et al., 2004) are consistent with this finding.
Key ideas related to models of leadership helped me think about how to answer my research
question. However, much of this work is theoretical and not grounded in empirical studies.
Therefore, it is not certain that these ideas would be true in every case. To summarize some of
these key concepts: in the distributed perspective, a leader empowers multiple team players to
share responsibilities and work together to improve instruction, taking advantage of each
person’s skills, knowledge, experience, and perspectives (Spillane et al., 2004). From the
student-centered model of leadership, a leader establishes a set of conditions (i.e., high
expectations for all, a culture of learning and professional behavior, etc.) by skillfully enacting
key processes (i.e., planning, implementing, supporting, etc.), enabling him/her to effectively
support teachers as they strive to improve their practice (Goldring et al., 2009). In instructional
leadership, the principal plays a powerful role in keeping the focus on instruction by articulating
a clear vision, managing the instructional program, and promoting a positive school learning
climate (Hallinger, 2011). The transformative leader seeks to innovate by incorporating the
ideas and aspirations of teachers and all community members (Hallinger, 2003). Finally,
adaptive leaders continuously analyze current systems and structures while developing the
capacity in themselves and others to respond quickly and intelligently to change (Heifetz et al.,
2009a).
The next section explores ways that leaders and teachers can engage in promising
learning opportunities to improve their practice.
Professional Development
In this section, I present literature that helped me understand how professional
development should be designed and implemented to support teacher growth and development.
This literature offered insight into what it takes to improve instruction using evidence, data, and
evaluation. In the first part, I describe four current models of professional learning, including:
(a) Reflective practice, (b) Knowledge for, in, and of practice, (c) Inquiry as stance, and (d)
Professional learning communities. After that, I examine what the literature identifies as
effective elements of professional development.
Models of Professional Development
“Professional development,” Elmore (2002) has argued, “is at the center of the practice of
improvement” (p. 32). Wilson and Berne (1999) added, “Teacher learning ought not be bound
and delivered but rather activated” (p. 194). In this section, I present models of professional
development designed to provide teachers with the knowledge and skills they need to be
effective in the classroom. These models include the following: reflective practice; knowledge
for, in, and of practice; inquiry as stance; and professional learning communities.
Reflective practice. Rodgers (2002a) stated, “Thinking, particularly reflective thinking
or inquiry, is essential to both teachers' and students' learning" (p. 844). Increasingly, teachers
are asked to use reflective thinking to critically examine their practice, make use of feedback
from others, and draw on educational research to enhance their knowledge and adapt their
teaching to new findings and ideas (Rodgers, 2002a). Rodgers
(2002a) drew upon Dewey's concept of reflection, pointing out that reflection involves four
criteria:
1. Reflection is a meaning-making process that moves a learner from one experience
into the next with deeper understanding of its relationships with and connections to
other experiences and ideas.
2. Reflection is a systematic, rigorous, disciplined way of thinking, with its roots in
scientific inquiry.
3. Reflection needs to happen in community, in interaction with others.
4. Reflection requires attitudes that value the personal and intellectual growth of oneself
and of others.
Rodgers (2002b) described a four-step reflective cycle that clarifies and operationalizes Dewey’s
conceptions about reflection, one that she has used with teachers to sharpen their ability to
observe skillfully and think critically about students and their learning. It includes the following
cyclical phases: presence, description, analysis, and experimentation. As teachers go through
this process, they learn how to intentionally slow down, pay close attention to their surroundings,
shift the weight of their thinking to student learning, and eventually come to a realization of how
to take thoughtful, intelligent action based on the understandings that surface. Rodgers (2002b)
described the four steps of reflection as “seeing learning, differentiating its parts, giving it
meaning, and responding intelligently” (p. 235). Each one is examined for how it illuminates a
teacher’s ability to thoughtfully engage in reflective practice.
Step 1 Presence: Rodgers (2002b) described presence as “a way of encountering the
world of the classroom,” whereby “the action that one takes comes out of one’s sensitivity to the
flow of events” (p. 235). Being present means observing, being aware of, paying attention to,
and responding to the learner in a way that serves the continuity of his or her learning. Rodgers
pointed out that when teachers become present, they exhibit “a wide open acceptance…that is
free of judgment and filled with honor for [students’] capacities as learners” (p. 237), and they
demonstrate a passion not only for the subject matter but also for the awe-inspiring human
endeavor of learning. Presence, in this sense, means being awake, alert to one’s surroundings,
mindful of the present moment, and continuously focused on student learning.
Step 2 Description: Rodgers (2002b) referred to description as “telling the story of an
experience” (p. 237). It involves slowing down in the moment, describing an experience in as
much detail as possible while continuously considering alternative ways to view and respond to
it. It is difficult but imperative to do this without making judgments or interpretations about it.
As teachers learn how to see and describe what is occurring in the classroom, it is critically
important to view events and situations not only from their own perspective but through
students’ eyes as well. Actively and regularly seeking feedback from students helps teachers to
view learning from students’ perspectives, acknowledging that students, not teachers, are “the
authorities on their learning” (Rodgers, 2002b, p. 242). The following questions elicit this type
of feedback from students: What did you learn? How do you know you learned? What helped
you learn? What hindered your learning? Rodgers asserted that the use of feedback often
represents the turning point in a teacher’s awareness of the centrality of student learning, because
it affords the teacher a perspective that is separate from what the teacher is able to see on his/her
own.
Step 3 Analysis of Experience: Once description is complete, teachers can move into the
next phase: analysis. This is the meaning-making stage of reflection, where a teacher arrives at
different understandings about what has occurred and then devises a plan of action (Rodgers,
2002b). This stage involves generating a number of explanations about an event and then
shaping a theory or hypothesis that one is willing to test. It also involves unearthing and
examining assumptions and deeply held beliefs that teachers have about teaching and learning.
Rodgers (2002a), citing Dewey, argued that reflection, in contrast to the complacent act of
believing, constitutes “active, persistent and careful consideration of any belief or supposed form
of knowledge in the light of the grounds that support it and the further conclusions to which it
tends” (p. 850). Shining a critical light on these beliefs causes the learner to enter into a state of
disequilibrium, perplexity, and disconnectedness. The process of reflection pushes the learner
through curiosity, inquiry, and a genuine desire for resolution to “a harmonious state of
settledness” (Rodgers, 2002a, p. 850).
Step 4 Experimentation: The end result of reflective thinking involves action (Rodgers,
2002a). Rodgers explained that the terms experience and experiment come from the same Latin
root, experiri, which means to test or to try. Once the reflective thinker has fully described and
analyzed an experience, s/he formulates a theory and then moves to test that theory, what Dewey
referred to as the "reorganization and reconstruction of experience" (Rodgers, 2002b, p. 249).
The entire process of reflection, both cyclical and systematic, provides the learner with an
opportunity to "direct the course of future experiences" (Rodgers, 2002b, p. 249) in a way that
leads to intelligent action. A commitment to and curiosity about one’s growth and an attitude of
open-mindedness and wholeheartedness play an important role (Rodgers, 2002a).
Valli (1997) also approached the topic of reflective thought by acknowledging Dewey.
She explained that Dewey contrasts reflective thought with habits of thought that are
“unsystematic, lack evidence, are based on false beliefs or assumptions, or mindlessly conform
to tradition or authority” (p. 68). Valli pointed out that the term reflection means “to bend back”
(p. 68). A reflective person, according to her, is someone who looks back, thinking carefully,
consciously, critically, and systematically about ideas and experiences and who, in Dewey’s
words, “converts action that is merely appetitive, blind, and impulsive into intelligent action”
(Valli, 1997, p. 69). Also drawing upon Dewey, Valli explained that reflective thinkers develop
a disposition of “wholeheartedness, responsibility and open mindedness” (p. 69) as they examine
student behavior and classroom practices.
Valli (1997) identified five different types of reflection: (a) Technical, where teacher
thinking focuses solely on pre-determined teaching techniques and skills; (b) Reflection-in and
on-action, referring to the practical thinking teachers do during and after the act of teaching; (c)
Deliberative action, teacher thinking that responds to multiple voices and sources; (d)
Personalistic reflection, thinking that focuses on personal growth and relational issues; and (e)
Critical reflection, thinking that is political, aimed at changing unjust and inequitable policies,
practices and structures. Valli asserted that teachers should develop the capacity to engage in
each of these types of reflection through a variety of processes, including action research (a
systematic and in-depth inquiry into some aspect of one’s own teaching practice and context),
journal writing, case studies (detailed descriptions of concrete teaching episodes), supervision
(self-review of teaching practices with the guidance of peers or authority figures), and classroom
activities and discussions.
Knowledge for, in, and of practice. Cochran-Smith and Lytle (1999) pointed out that
teacher learning has become a major preoccupation of the educational establishment and is
“widely acknowledged as the sine qua non of every school change effort” (p. 249). Furthermore,
a new image of teacher learning has emerged, one that moves away from the still prevalent
episodic workshops where teachers passively receive the latest strategies and techniques toward
a more constructivist approach. It is now widely recognized that teacher learning takes place
over time and is more effective when previous experience is connected to new understandings.
Cochran-Smith and Lytle presented three distinct conceptions of how teachers learn that offer
insight into how the ongoing learning of teachers can be supported and deepened:
knowledge-for-practice, knowledge-in-practice, and knowledge-of-practice. Each one represents
a different, competing, sometimes overlapping, model of
teacher knowledge and professional practice.
In knowledge-for-practice, still the most prevalent model, teachers learn by receiving
information from outside experts and university-based researchers who arm them with an array
of content and pedagogical knowledge, educational theories, research-based strategies, best
practices, and proven instructional techniques generally devoid of context (Cochran-Smith & Lytle,
1999). It is based on the premise that teachers will improve if they accrue more and better state-
of-the-art strategies and enhance their ability to implement them effectively. In this conception
of teacher learning, “teachers are knowledge users not generators” (Cochran-Smith & Lytle,
1999, p. 257).
In contrast, knowledge-in-practice focuses on the practical knowledge teachers acquire
over time (Cochran-Smith & Lytle, 1999). From this perspective, teachers are the generators of
knowledge, bearing the responsibility for mediating ideas, finding meaning, constructing
knowledge, and using it to try out alternative approaches to instruction. They are the experts
who learn from their own experience in the classroom and from ongoing, systematic, and
deliberate reflection on that experience. Teacher learning comes as a result of “enhancing
teachers’ understanding of their own actions” (Cochran-Smith & Lytle, 1999, p. 267). Cochran-
Smith and Lytle argued that this type of knowledge is often undervalued, viewed as something
local, particular to a certain context or circumstance, and too often deemed unimportant or
inconsequential.
In the third model of teacher learning, knowledge-of-practice, Cochran-Smith and Lytle
(1999) stated that “both knowledge use and knowledge construction are regarded as problematic”
(p. 266), meaning that both are subject to scrutiny during the learning process. In this model,
teacher learning is cyclical, deprivatized, and ongoing. It begins by examining and critiquing
one’s own experiences, by continuously challenging deeply held personal beliefs and
assumptions about teaching and learning, by identifying and seeking alternatives to problems of
practice, by carefully studying students, classroom experiences, societal structures, and the
impact that teachers, themselves, have on classroom dynamics and student learning. In this
perspective, teachers work closely in local communities as co-constructors of knowledge with
students, other educators, administrators, parents, and outside support providers in an effort to
create equitable and optimal learning conditions for all students. Cochran-Smith and Lytle,
citing McDonald (1992), described teacher learning in this view as teachers "breaking
professional silence” (p. 280) about their classrooms, the school culture, and social inequities in
order to effect change. They described teacher learning as teachers “becoming part of a
community of researchers and learners who see questioning as part of the task of teaching across
the span” (Cochran-Smith & Lytle, 1999, p. 284).
In sum, none of these conceptions of teacher learning is exhaustive, mutually exclusive,
or fully intended to stand on its own (Cochran-Smith & Lytle, 1999). Each one presents a
different, competing idea about teacher knowledge and professional practice, while
acknowledging that teachers learn over time by building on prior experiences. Together, these
forms of teacher learning provide insight into how teacher education and professional
development can be approached to influence and improve teacher practice (Cochran-Smith &
Lytle, 1999).
Inquiry as stance. Drawing from the ideas presented in knowledge-of-practice, inquiry as stance extends the
view that teacher learning comes as a result of teachers gaining an increased understanding of
their own actions, including “their own assumptions, their own reasoning and decisions, and their
own inventions of new knowledge to fit unique and shifting classroom situations” (Cochran-
Smith & Lytle, 1999, p. 267). Inquiry as stance is described as a type of grounding or
commitment to teacher learning in inquiry communities where teachers gather regularly to
examine their work through a social and political lens. Teachers who take an inquiry stance
“generate local knowledge, envision and theorize their practice, and interpret and interrogate the
theory and research of others” (p. 289). From this perspective, teachers learn to live with
uncertainty, with presenting problems that may not have easy solutions, with generating
questions, with challenging structures and fundamental practices. Questions that are central to
teacher inquiry include the following:
Who am I as a teacher?
What am I assuming about this child, this group, this community?
What sense are my students making of what is going on in the classroom?
How do the frameworks and research of others inform my own understandings?
What are the underlying assumptions of these materials, texts, tests, curriculum
frameworks, and school reporting documents?
What am I trying to make happen here and why? (Cochran-Smith & Lytle, 1999, p.
292)
Cochran-Smith and Demers (2010) provided an example of a student teacher who is
learning to teach from an inquiry stance. This example illustrates the potential of the inquiry
stance in enhancing a teacher’s ability to challenge assumptions. In this particular case, David
Tashian, a history teacher in an urban Boston high school, challenged the assumptions of other
teachers at his school who reacted skeptically to his attempt to introduce nontraditional teaching
methods in his classroom. These off-putting attitudes led David to ask the following question:
“What happens when I introduce non-traditional teaching methods to students who have only
been exposed to traditional methods?” (Cochran-Smith & Demers, 2010, p. 24).
As this teacher began to introduce new methods such as small group work, role play,
panel discussions, and so forth, he soon realized that these methods increased student
engagement with learning. He also discovered that his students began to ask more complex
questions and challenged each other’s ideas and beliefs. This realization may seem simple, but
as Cochran-Smith and Demers (2010) have pointed out, it “is neither common sense nor
common practice, especially in urban schools and in areas where there are large numbers of poor
children and children of color” (p. 25). They continued, “When prospective teachers are learning
from an inquiry stance, they are learning to raise questions about well-entrenched practices,
challenge common expectations and conceptualize learning outcomes for all students in rich and
complex ways” (Cochran-Smith & Demers, 2010, p. 26).
Professional learning communities. One approach that seeks to create the conditions
that allow teacher inquiry to take root and flourish is the professional learning community model.
Stoll, Bolam, McMahon, Wallace, and Thomas (2006) pointed out that, while there is no
universal definition of this term, a growing international consensus describes it as “a group of
people sharing and critically interrogating their practice in an ongoing, reflective, collaborative,
inclusive, learning-oriented, growth-promoting way” (p. 223). The expansion of professional
learning communities in schools holds particular promise for ongoing teacher learning and
sustained school improvement (Desimone, 2009; Stoll et al., 2006). In this section, I provide an
overview of literature on professional learning communities. The following researchers’ work is
reviewed: Hord (1997), Grossman, Wineburg, and Woolworth (2001), and Vescio, Ross, and
Adams (2008). I address both empirical and theoretical literature related to professional learning
communities.
Hord (1997) referred to this model of professional learning as “communities of
continuous inquiry and development” (p. 6) and then defined them more specifically as protected
spaces where “teachers in a school and its administrators continuously seek and share learning,
and act on their learning" (p. 6), always with an unwavering focus on student learning. She
further argued that these types of communities intentionally interrupt the traditional pattern in
schools where teachers simply teach, administrators lead, and students alone are expected to
learn. She stated, “There is no longer a hierarchy of who knows more than someone else, but
rather the need for everyone to contribute” (Hord, 1997, p. 16). The role of leaders in this type
of setting, she asserted, is “to plant the seeds of community, nurture the fledgling community,
and protect the community once it emerges” (Hord, 1997, p. 18). School administrators lead by
following, serving, and inviting others to take an active role in the challenges of leadership
(Hord, 1997).
Having reviewed the extant literature available at the time, Hord (1997) identified the
following components of a professional learning community that are necessary for it to have an
impact on instructional practice:
1. A principal who acts as a facilitator and is willing to share power and authority by
including staff input in decision making.
2. A shared vision among all school stakeholders that is based on a steadfast
commitment to student learning.
3. Collective learning among staff and an application of that learning to solutions that
address student needs.
4. Peer review of each teacher’s classroom that provides teachers with feedback and
assistance in a way that supports individual and community reflection and
improvement.
5. Physical conditions and human capacities that support the community such as time to
meet and talk, small school size and physical proximity of staff to one another,
communication structures, and so forth.
With these conditions in place, Hord (1997) argued, “Teachers are more likely to be consistently
well informed, professionally renewed, and inspired so that they inspire students” (p. 32).
Grossman et al. (2001), likewise, presented a model of teacher community as an
alternative approach to professional learning for teachers. Over a period of two and a half years,
these authors studied a group of history and English teachers in a large urban high school who
came together regularly with the intent of creating an interdisciplinary curriculum. Grossman et al.
referred to their study as “an instructive case that sheds light on the birth pangs of teacher
community” (p. 942), giving them insight into the triumphs and challenges of forming teacher
community, including finding common language, and creating a collective vision for ongoing
professional learning at the school site. Grossman et al. relied on Bellah, Madsen, Sullivan,
Swidler, and Tipton's (1985) definition of community as "a group of people who are socially
interdependent, who participate together in discussion and decision making, and who share
certain practices that both define the community and are nurtured by it” (as cited in Grossman et
al., 2001, p. 946).
Grossman et al. (2001) presented teacher community as a contrasting vision of
professional development, one that is concerned above all with what teachers themselves “do and
think” (p. 997). It occurs at the work site, is ongoing, and offers the possibility for
transformation not only of individual teacher thinking but also of the structures for social
engagement that exist in schools. Teacher community provides many benefits to teachers. In
particular, it (a) Creates a venue for teacher learning and continuous intellectual development;
(b) Carves out a communal space where teachers are exposed to multiple perspectives and a
collective knowledge that exceeds that of any individual; (c) Provides support for new teachers
who might otherwise exit the profession early; (d) Cultivates leadership among teachers who
earn the right to represent the collective vision; and (e) Gives students a glimpse of what lifelong
learning might look like.
As stated previously, Grossman et al. (2001) conducted an empirical study that brought
together 22 English and social studies teachers from a large urban high school, including a
special education teacher and an ESL teacher, over a 2.5-year time span. The researchers participated both
as project organizers and project leaders. One question they wanted to answer follows: What
distinguishes a community of teachers from a group of teachers sitting in a room? They
approached the research with no set agenda other than the desire to bring teachers together to
provide them with an opportunity for continued learning and discussions around history and
English. In this effort, teachers met twice monthly with the researchers for an entire day to read
and discuss literary and historical texts and to create an interdisciplinary curriculum together.
Evidence for this study came primarily from transcripts of the group discussions but also
included information gathered in field notes, emails, journals, notes from phone conversations,
teacher interviews, written evaluations, and “think alouds.”
Grossman et al. (2001) discovered some intriguing insights about the way that teacher
community formed and developed. The researchers recognized distinct stages that a disparate
group of teachers moved through as they evolved into a mature community. At first, the teachers
identified with distinct factions or subgroups within a larger group, unaware of individual
differences or perspectives. As group identity began to take shape, it could initially take on the
characteristics of pseudocommunity, where members exhibited a surface friendliness, always
careful not to intrude in each other's space. Conflict was suppressed, and the most vocal
members of the group often took control of its direction. At this stage, everyone
behaved politely and silence went unquestioned. As teacher community reached full maturity,
the researchers found that a more cohesive group identity formed where contributions and
perspectives of all individual group members were honored and validated. Likewise, a sense of
communal responsibility for the norms, behavior, and direction of the group emerged.
Ultimately, Grossman et al. found that teachers became committed to both student learning and
teacher learning and assumed responsibility for the growth and development of their colleagues
as well as themselves.
After observing the development of one teacher community over time, Grossman et al.
(2001) came to the following conclusions: (a) Time and resources are necessary for teacher
community to flourish; (b) Structural changes alone do not teach people to interact differently;
(c) Privacy persists (i.e., it is difficult to change the culture of schooling); (d) Teacher
community works best when teachers self-select their groups; and (e) Community is difficult to
attain and even more challenging to sustain.
Another group of researchers, Vescio et al. (2008), presented an extensive review of
literature in an effort to examine the impact of professional learning communities on teaching
practices and student learning. They reviewed the literature specifically as it related to two basic
questions: In what ways does teacher practice change as a result of participation in a professional
learning community? and What aspects of professional learning communities support these
changes? The researchers were able to locate only 11 research studies, all of which
offered modest evidence supporting the idea that participation in a learning community leads to
positive changes in teaching practice (Vescio et al., 2008). Five of the studies mentioned
specific changes teachers made in their classrooms as a result of working in a teacher learning
community, including the incorporation of more student-centered teaching techniques and
flexible classroom arrangements sensitive to student needs. They, likewise, found modest
evidence to suggest that professional learning communities improved the teaching culture,
particularly by increasing collaboration, bringing a greater focus on student learning, enhancing
teacher authority, and supporting an atmosphere of continuous learning.
Vescio et al. (2008) argued that the PLC model represented a fundamental shift away from
the traditional model of PD. It is grounded in the knowledge-of-practice approach to teacher
learning in which:
it is assumed that the knowledge teachers need to teach well is generated when teachers
treat their own classrooms and schools as sites for intentional investigation at the same
time that they treat the knowledge and theory produced by others as generative material
for interrogation and interpretation. (p. 89)
Essential Elements of Professional Development That Support Practice Change
The literature has identified essential characteristics of professional development (PD)
that have been associated with increasing teacher knowledge and skills and improving classroom
practice. In this section, I focus on critical elements that must be in place at a school site to
enhance teacher learning. The work of researchers Desimone (2009), Little (1993), Elmore
(2002), and Webster-Wright (2009) is examined. Finally, I review the results of two
empirical studies conducted by Garet, Porter, Desimone, and Yoon (2001) and Brady (2009).
Theoretical work. Desimone (2009) proposed an approach to studying the impact of PD
on teacher learning in which she identified those elements she deemed to be essential for
influencing teacher practice change and student outcomes. She pointed out that instructional
improvement is viewed as critical to efforts to increase student learning and that “education
reform is often viewed as synonymous with teachers’ professional development” (Desimone,
2009, p. 181). Given this importance, Desimone articulated the way that PD should be studied to
effectively measure its impact on teachers and students. She presented the following challenge
to the research community: How can we best measure professional development and its effects
on teachers and students toward the end of improving PD programs and policies to foster better
instruction and student achievement? To answer this, she suggested that we first must agree on a
common definition of what constitutes PD. Citing Little (1993), Desimone defined PD broadly
as any activity that is designed to prepare staff members for improved performance in current or
future roles in their school districts. She argued that because teachers experience such a vast array
of formal and informal activities that contribute to their professional growth and development, it
is necessary to focus on a set of common critical elements of teachers’ learning experiences to
successfully measure their impact.
Desimone (2009) contended that there is empirical research that identifies a core set of
elements of effective PD that are critical for increasing teacher learning and expertise:
1. Content focus: Deepening teachers’ content and subject area knowledge and
providing them with methods to help students learn that knowledge.
2. Active learning: Providing teachers with opportunities to engage in active learning
where a spirit of continuous improvement and collaboration among colleagues is
facilitated. Giving teachers opportunities to lead, model, or observe practice,
followed by interactive feedback and discussion.
3. Coherence: Aligning school-wide goals and priorities with district initiatives and
policies, and connecting them to an overall strategy for school improvement in a way
that is consistent with teacher knowledge and beliefs.
4. Duration: Allowing for sufficient duration, including both span of time over which
the activity takes place and the number of hours involved in the activity.
5. Collective participation: Giving groups of teachers who share the same subject
matter, grade level, or who work in the same school the opportunity to participate in
professional learning activities together to build a collaborative, interactive
community.
Desimone (2009) proposed a framework to determine whether or not these elements are
present and leading to teacher practice change and student learning. It includes a core theory of
action that would work to measure the impact of any professional development activity:
1. Teachers experience professional development.
2. The PD increases teachers’ knowledge and skills and/or changes their attitudes and
beliefs.
3. Teachers use their new knowledge and skills, attitudes, and beliefs to improve the
content of their instruction or their approach to pedagogy, or both.
4. The instructional changes foster increased student learning.
Figure 1. Framework to determine the impact of PD. Reprinted from "Improving Impact Studies
of Teachers' Professional Development: Toward Better Conceptualizations and Measures," by
L. M. Desimone, 2009, Educational Researcher, 38, 181–199. Copyright 2009 by Sage
Publications.
Desimone (2009) claimed that this conceptual framework would create a shared
knowledge base and would provide valuable evidence on the impact of PD that could be useful
for practitioners. To date, however, Desimone has not published empirical work that enacts this
conceptual framework, so it is not known how it would play out if applied.
Little (1993) offered a similar set of characteristics that, if implemented, would foster
learning experiences for teachers that influence practice change. Responding to the pressure to
improve teaching and learning in a reform-driven era, Little argued that the dominant form of
PD—the training model, an approach that assumes that a clearly defined set of skills can be
easily transferred from trainers to teachers through a well-defined process—does not adequately
meet the needs of teachers and students. This model involves short-term skill-training
workshops where teachers act as passive “consumers of knowledge produced elsewhere” (Little,
1993, p. 142). The “workshop menu” of this form of PD, Little (1993) pointed out, is
“fragmented in content, form and continuity” (p. 142) and does little to advance teacher learning
at a time when teachers face intense scrutiny and pressure to perform.
Little (1993) explored promising alternatives to the training model, including teacher
collaboratives and other networks, subject matter associations, school-university partnerships,
and special institutes, all of which provide a setting where teachers engage “in the pursuit of
genuine questions, problems, and curiosities, over time, in ways that leave a mark on
perspectives, policy, and practice” (p. 133). In these settings, teachers are viewed as
knowledgeable, active participants in a larger professional community.
To create environments that foster teacher learning, Little (1993) proposed six principles
of professional development that provide an alternative to the dominant training model. She
suggested that, with the implementation of these principles, professional development should
accomplish the following: (a) Offer social, emotional, and intellectual engagement with ideas,
with materials, and with colleagues; (b) Focus on context and teacher experiences; (c)
Support informed dissent (Little explained, "To permit or even foster principled dissent places a
premium on the evaluation of alternatives and the close scrutiny of underlying assumptions" [p.
138]); (d) Be grounded in a big-picture or systemic perspective, putting classroom practice in the
larger contexts of school and district practice; (e) Encourage and cultivate the habit of inquiry,
providing space for teachers to "interrogate their individual beliefs and the institutional
patterns of practice" (Little, 1993, p. 139); and (f) Ensure bureaucratic restraint and balance the
interests of individuals with the interests of institutions. Little argued that any teacher
professional development activity should be tested against these principles to determine its
effectiveness.
Elmore (2002) also weighed in on essential practices of effective professional
development. He defined professional development as “the label we attach to activities that are
designed in some way to increase the skill and knowledge of educators” (p. 6). He similarly
pointed out that educational researchers have reached a broad consensus on the main elements of
effective professional development. In this consensus view, professional development is focused
on the improvement of student learning through the enhancement of the skill and knowledge of
educators. Professional development activities (a) Are derived from a well-articulated mission
or purpose tied to some statement about student learning, (b) Place an emphasis on teachers’
content knowledge and the pedagogical skills that lead to effective instruction, (c) Are based on a
clear model of adult learning theory, (d) Are designed to develop the capacity of teachers to
interact socially on problems of practice, (e) Involve the active participation of school leaders
and other staff, (f) Require a sustained commitment to consistency and focus over time, (g)
Move learning close to the point of practice, in schools and classrooms whenever possible, and,
finally, (h) Are evaluated continuously on the basis of the effect they have on student
achievement (Elmore, 2002).
Elmore (2002) pointed out that the problem is not so much knowing which effective PD
practices to implement as the more challenging issue of knowing how to get
them successfully rooted in the structures of schools. He further argued that professional
development in the service of student learning is not a “single-shot episode” (Elmore, 2002, p.
13). He stated, “Improvement is a discipline, a practice that requires focus, knowledge,
persistence and consistency over time” (p. 13). Elmore emphasized the importance of the school
leader in creating supportive structures for teacher collaboration that will lead to improvement in
instructional practice.
Finally, Webster-Wright (2009) articulated a perspective about authentic professional
learning (PL). She contended that while professional development experiences are becoming
more interactive and learner-centered, they predominantly “remain as episodic updates of
information delivered in a didactic manner, separated from engagement with authentic work
experiences” (Webster-Wright, 2009, p. 703). Much of PD is atomistic, decontextualized,
delivered in discrete packages by outside experts, disconnecting research from practice
(Webster-Wright, 2009). With a mounting body of empirical evidence, a consensus has
developed in the research community that effective PD is “continuing, active, social, and related
to practice” (Webster-Wright, 2009, p. 703). Despite this evidence, the practice of delivering PD
as isolated, decontextualized segments continues unabated partly because little is understood
about how teachers learn and continue to learn throughout their working lives. Webster-Wright
took issue with the term “professional development,” arguing that it treats the teacher as
deficient, in need of development and direction rather than as a professional engaged in self-
directed learning. She introduced an alternative conceptualization of PD, “authentic PL,” that
shifts the focus from passive development to active learning, from an atomistic perspective to a
holistic one (Webster-Wright, 2009).
Authentic PL honors and elevates the ongoing, lived experiences of teachers, recognizing
that they learn in a variety of ways, from casual conversations with peers in hallways to more
formal workshop settings, and that the full extent of this learning should be acknowledged,
voiced, and valued (Webster-Wright, 2009). Webster-Wright, likewise, argued that there is
increasing insistence on the centrality of the workplace for ongoing professional learning. PL is
at its optimum when it is situated in a community of learners that is engaged in active problem
solving. Critical reflection plays an important role in this process. For instance, it is through
challenging implicit assumptions and questioning deeply held beliefs and taken-for-granted
practices that PL can lead to changes in practice.
Empirical work. Garet et al. (2001) presented findings from the first large-scale
empirical study comparing the effects of different characteristics of PD on teacher learning. In
this study, they examined the relationship between features of effective PD that had been
identified in the literature and teacher-reported changes in knowledge and skills and classroom
practice (Garet et al., 2001). One thousand twenty-seven math and science teachers nationwide
participated in a survey that asked them to respond to questions about their professional
development experiences. The researchers analyzed the impact of three structural features—
characteristics of the structure or design of PD activities, including (a) the form (i.e., workshop,
study group, etc.); (b) the duration of the activity; and (c) the degree to which the activity
emphasizes the collective participation of groups of teachers. They also examined three core
features of PD—dimensions of the substance or core of the PD experience: (a) the degree to
which the activity had a content focus; (b) the extent to which the activity provided opportunities
for active learning; and (c) the degree to which the activity fostered coherence in teachers’
learning (aligned with teacher goals, state standards, district initiatives, etc.).
In the survey, teachers were asked to describe the type of activity in which they had
participated, specifying whether it was traditional or reform-oriented (Garet et al., 2001). The
researchers characterized the most common, traditional form of PD as “the workshop,” typically
involving a timed session that occurs outside the classroom and is led by experts. Alternative,
reform types of PD, on the other hand, included study groups, mentoring, coaching, peer
observation, subject area networks, and school-university partnerships, to name a few. They
often were more closely connected to the classroom, might be easier to sustain over time, and
might be more responsive to teacher needs and goals. Not surprisingly, 79% of teachers in the
study reported participating in traditional forms of PD.
The researchers presented the results by discussing the effects of both structural features
(form, duration, and collective participation) and core features (content focus, active learning
opportunities, and coherence) on teacher learning (Garet et al., 2001). With regard to structure,
they found that the form or activity type had an impact on duration. Reform activities tended to
span longer periods of time and involved greater numbers of contact hours than traditional
activities. Garet and his colleagues noticed a modest effect of activity type on enhanced teacher
knowledge and skills, indicating that reform activities resulted in more positive outcomes for
teacher learning. They also found that duration had a direct positive influence on the three core
features of PD (content focus, active learning, and coherence). With regard to the core features,
all three had a positive influence on teachers' knowledge, skills, and instructional
practice, supporting the idea that activities with a
focus on content and a coherent connection to other learning experiences of teachers were more
likely to yield enhanced skills and knowledge. Active learning was also related to improvement
in teacher practice but less so than content focus and coherence (Garet et al., 2001).
The researchers found that sustained and intensive PD was more likely to have an impact
on practice change (as reported by teachers) than PD that was shorter in duration (Garet et al.,
2001). They also found that PD that focused on subject matter (content), fostered active
participation (active learning), and was connected to a teacher’s daily professional life
(coherence) was more likely to produce positive changes in classroom practice (Garet et al.,
2001). Furthermore, reform activities tended to yield better outcomes due to the longer duration
of the activities (Garet et al., 2001). Thus, to improve PD experiences, they argued that it was
more important to focus on the duration, collective participation, and core features (content,
active learning, and coherence) rather than on the form of PD. The results also emphasized the
importance of collective participation for improvement in teacher skills and knowledge. Further,
they pointed to the profound importance of subject matter focus (Garet et al., 2001). Finally,
based on the results, Garet et al. concluded that if educational practitioners want to use PD as a
tool to improve instruction, they need to design activities that have the characteristics that
research shows lead to improvement in teaching (Garet et al., 2001).
Another study in which teachers engaged in an alternative form of PD produced equally
promising results. Brady (2009) used qualitative methods to examine the effects of one
innovative approach to teacher learning, the Shakespeare Reloaded Project, a collaborative
research and teaching initiative between the English Department at the University of Sydney and
Barker College, an independent K–12 school in Sydney. This project, described as the creation
and interaction of innovative communities, was set up to facilitate alternative approaches to
teaching Shakespeare in Australian secondary schools and universities. Brady (2009) began her
analysis by weighing in on the current debate on how best to provide effective PD for teachers,
acknowledging the limitations of traditional forms of PD that are based on the top-down
workshop model where experts convey information to teachers disconnected from their practice.
She argued that teachers learn best in supportive communities with colleagues and other
professionals in which knowledge is generated collectively in the context of their own
classrooms. Drawing upon her own experience working with the Shakespeare Reloaded project,
including observations of group interactions and classroom instruction as well as interviews and
document reviews for one case study of two teachers who participated in the project, Brady
(2009) set out to discover the effects of this innovative approach on teacher learning.
Brady's (2009) study focused on the following research questions: How are the
participating teachers able to develop their Shakespearean pedagogy and develop as
professionals within the context of an innovative learning community? How is the Shakespeare
Reloaded Project impacting the principles, perceptions, and practices of participating teachers?
The findings indicated that the Shakespeare Reloaded Project facilitated teacher learning in several
ways: (a) It created a space for multiple-voiced discourse, including teacher, academic, and
student voices; (b) Small group discussions took place on approaches to teaching Shakespeare,
involving conversational exchanges that were broadened and enriched by the diversity of the
group's expertise and experiences; (c) Teachers were able to exchange ideas about Shakespeare
and about classroom instruction, providing them with a forum for developing and improving
their practice; (d) Teachers benefited not only from immersing themselves in the world of
Shakespeare but also from co-constructing knowledge with colleagues and from engaging
intellectually with academic discourses and current scholarly debates; and (e) The seminar-style
group setting promoted a collaborative and energetic exchange of ideas that encouraged a shared
sense of leadership. In short, Brady explained, the dialogues encouraged “diversity, ongoing
experiences, and becomingness” (p. 345). She further pointed out that teachers became lifelong
learners in the process, valuing opportunities to engage academically with Shakespeare, making
use of student perspectives, and incorporating the pedagogic lessons of their experiences into
their practice.
A review of the theoretical and empirical literature in this section indicates that some
models of professional development have more potential to support teacher growth and
development than others. The literature clearly identifies the most prevalent form of PD,
knowledge-for-practice, where teachers passively receive information on the latest techniques
and strategies from experts disconnected from the classroom, as having the least potential for
teacher practice change (Cochran-Smith & Lytle, 1999; Elmore, 2002; Garet et al., 2001; Little,
1993; Webster-Wright, 2009). Finally, the literature suggests that teachers learn best when they
are engaged in active, ongoing problem solving with colleagues in supportive communities
where knowledge is generated collectively in the context of their own classrooms (Brady, 2009;
Elmore, 2002; Garet et al., 2001; Grossman et al., 2001; Hord, 1997; Little, 1993; Vescio et al.,
2008; Webster-Wright, 2009). Although this is a thought-provoking and important realization,
there is not a sufficient empirical base to know for certain if this would happen in every case.
The literature on professional development is substantial and clearly responsive to my
research question, offering particular insights into how to create optimal conditions in schools for
teacher learning. The literature on knowledge-for-practice models of PD, still so pervasive, is
helpful in understanding why teachers get stuck. Likewise, the literature on reflective practice,
knowledge-of-practice, and professional learning communities provides insight into how school
leaders can structure professional learning in a way that builds capacity leading to practice
change. Next, I examine accountability.
Accountability
The term accountability simultaneously inspires interest, fear, and confusion (Burke,
2004). For many, it holds negative connotations, conjuring up images of distressed people
receiving news of consequences for poor results or bad behavior (Burke, 2004). In a broader
sense, accountability simply refers to a willingness or desire to narrate one’s experience or give
an account of one’s actions as people relate to commonly understood commitments (Goldberg &
Morrison, 2003). Accountability in education has evolved over the years, moving from a focus
on compliance to a greater emphasis on student outcomes (Fuhrman, 2004; Stecher & Kirby,
2004). According to Stecher and Kirby, accountability in education currently “refers to the
practice of holding educational systems responsible for the quality of their products—students’
knowledge, skills, and behaviors” (p. 1). The roots of this new accountability stem from the
influence of business leaders, particularly the notions of setting explicit goals and facing
increased consequences for performance (Fuhrman, 2004).
Elmore (2007) argued that the increased external pressure for accountability is a driving
force behind educational reform. This remains true today (Fullan, 2011). Elmore (2007) defined
accountability as “the variety of formal and informal ways by which people in schools give an
account of their actions to someone in a position of authority” (p. 140). Researchers have
identified three types of accountability that play a role in whether or not teachers move in a
coherent direction toward improving their practice or engaging in behaviors that support school
and district initiatives. These include internal, reciprocal, and lateral accountability (City,
Elmore, Fiarman, & Teitel, 2009; Elmore, 2002, 2003, 2007). This section speaks to these forms
of accountability in terms of the theoretical and empirical work that exists. I then focus on
external accountability, paying particular attention to data use and teacher evaluation.
Theoretical Concepts of Reciprocal and Internal Accountability
Elmore (2002) tackled the concept of accountability by first challenging it as a means to a
productive end for teaching and learning. He pointed out that the dominant use of the term
currently refers to the “systems that hold students, schools, or districts responsible for academic
performance” (p. 3). Elmore (2007) took issue with the commonly held notion that
accountability in education only exists when it is imposed on schools by external authorities. He
argued that the existing focus on externally driven accountability is shortsighted and is unable,
on its own, to lead to effective improvement. He offered a more nuanced way of deconstructing
the concept of accountability by breaking it into smaller concepts of internal and reciprocal
accountability. Investment in both of these types of accountability, Elmore (2002, 2007) argued,
will lead to continuous instructional improvement and will ensure that schools respond
productively to external mandates.
Elmore (2002) pointed out that any successful accountability system must involve a
reciprocal process of give and take. He contended that the existing structure of schools is unable
to meet the demands of an external accountability system. In the current structure, teachers act
essentially as independent contractors who work mostly in isolation. They enter the profession
with the mistaken notion that they know most of what they need to know to teach. Elmore
pointed out that “it would be difficult to create a more dysfunctional organization for a
performance-based accountability system” (p. 5), further claiming that the structure and culture
of schools are more designed to inhibit learning than to encourage it. He denounced schools as
“hostile and inhospitable places to learning” (p. 5) because they are unsupportive of adult
learning and, as a consequence, equally unreceptive to student learning.
To create environments that support teacher learning, Elmore (2002) introduced “the
principle of reciprocity of accountability for capacity” (p. 5). With this principle, he reasoned
that for every increase in performance that is expected from a teacher or employee, there is an
equal responsibility to provide that person with the capacity to meet that expectation. Similarly,
for every investment that is made in a teacher or employee’s skill or knowledge, that person has
a reciprocal responsibility to demonstrate an increase in performance. Elmore acknowledged
that schools are woefully ill-equipped to provide this kind of support. What the public and policy
makers need to realize is that for performance-based accountability to achieve its goal of
increased student achievement, a strategy for consistently investing in the skills and knowledge
of educators is necessary.
Elmore (2002) posited that internal accountability holds more promise for enhancing
teacher learning and must necessarily precede external accountability, arguing that “knowledge
and skill are at the core of school improvement” (p. 19). He argued that while external
accountability systems—intended to help schools develop a practice of continuous
improvement—may create the conditions that make it advantageous for schools to improve,
whether by sanction or reward, they will never automatically result in an effective
improvement process for schools. Internal accountability, on the other hand, can lead to
continuous instructional improvement. Schools with strong internal accountability systems,
according to Elmore (2002), have a clear, unwavering focus on instruction, shared expectations
among school leaders, teachers, and students for student and teacher performance, and a well-
articulated process for determining if those expectations have been met. Staff members at these
schools tend to attribute their success or failure to themselves, recognizing the impact that their
own levels of skills and abilities have on student success. Schools with weak accountability
systems, on the other hand, lack this focus, attributing their success or failure to forces beyond
their control, such as the students, their families, the surrounding community, or a dysfunctional
system.
In another piece, Elmore (2003) reasserted the idea that schools with strong internal
accountability develop coherence and unity around expectations for teaching and learning in a
way that leads to measurable gains in student performance. Elmore noted that, conversely,
schools with low internal accountability lack agreement around instructional expectations and
are powerless to influence changes in practice that produce student learning gains.
Elmore (2003), in addition, confirmed that external accountability systems such as No
Child Left Behind not only do little to help schools improve, but actually push them to fail by
setting arbitrary, unrealistic performance targets without providing support or guidance for how
to accomplish them. He argued that students and teachers are more likely to improve under a
certain set of conditions: (a) When a new set of eyes helps them identify problems they may not
be aware of, (b) When they are presented with an obstacle and then provided with knowledge on
how to overcome it, (c) When they have access to knowledge and learn to use it in novel ways,
and (d) When there is coherence and agreement among teachers, students, and others in the
community around what constitutes good instruction—otherwise known as internal
accountability.
Elmore (2003) further reiterated that one way for schools to respond to performance-
based accountability systems and to create the conditions that promote the kind of improvement
in teaching and learning described above is through the principle of reciprocity. His thinking
evolved such that he articulated the principle in the following way:
I should be expected to perform at the limits of my capacity, but I should not be expected
to do those things for which I do not have the capacity unless you accept joint
responsibility with me to create that capacity. (p. 11)
Performance-based accountability systems, Elmore asserted, have a fundamental design flaw—
they fail to align capacity building with performance measures. He contended that schools build
capacity by strengthening internal accountability, creating shared expectations, norms, values,
and coherent structures around instruction for teachers and students, and then by consistently and
jointly identifying, confronting, and working through challenging problems of instructional
practice.
Empirical Work on Internal Accountability
Elmore (2007) tested his theory of internal accountability in a large-scale, five-year study
undertaken to understand more about how schools constructed their own conceptions of
accountability and how people at those schools viewed and responded to accountability measures
in their daily work lives. To engage in this work, Elmore conducted case studies in a diverse
sample of 20 schools located in metropolitan areas in the western and eastern regions of the
United States. The research team spent approximately two weeks in each school. Elmore characterized the
study as “exploratory and formative in nature” (p. 136), involving classroom observations,
interviews of teachers and other staff members, and focus groups of students and parents.
Seeking to understand how accountability played out at a school site, the researchers
posed the following question to each of the schools they studied: For what are you accountable,
to whom, and how? To structure the field research, they developed a working theory of
accountability that includes the following four key premises: (a) Schools have conceptions of
accountability already embedded in their daily routines and practice; (b) School site
accountability conceptions are “organic” (Elmore, 2007, p. 138), built around human interactions
that occur in the ongoing act of teaching and learning. Relying on Schein, Elmore (2007)
described this group culture as “a pattern of basic, shared assumptions that the group learned as it
[solved] its problems of external adaptation and internal integration” (p. 138); (c) Stakeholders in
schools were active participants in the accountability systems under which they exist and they
can be “active agents” (p. 138) in changing them as well; and (d) Formal, external accountability
systems were only one among many that influence a school’s sense of internal accountability.
Other influences such as teacher and administrator beliefs about teaching and learning,
shared perceptions about the students in the school, their daily routines, and pressures from
parents and outside agencies also have an impact on how a school site shapes accountability
conceptions (Elmore, 2007). Ultimately, Elmore concluded that a school’s conception of
accountability is based on a close, reciprocal relationship among three factors: individual
responsibility, shared expectations, and internal and external accountability mechanisms. He
posited that schools are more likely to have cohesive, powerful accountability systems when the
values and norms of these systems are aligned with the individual conceptions of responsibility
and the collective expectations in the school.
After the first phase of the study, the researchers discovered that schools fell into one of
three categories. In the first set of schools, accountability essentially existed at the individual
level (Elmore, 2007). Teachers acted independently as solo practitioners in isolated classrooms.
They had no clear unified expectations for themselves or their students, nor did they face
consequences for their performance (Elmore, 2007). Teachers in this group of schools
experienced very little collegiality or administrative oversight. They reported having autonomy
over their practice and content, but felt accountable solely to their students in a way that was
one-sided and unreciprocated. Elmore referred to these types of schools as having a “historical
lack of formal accountability” (p. 157). He noted that the second set of schools was
characterized by teachers who had a strong sense of shared expectations and similar philosophies
about teaching and learning. Elmore found that these schools developed a culture of shared
expectations that guided the actions of teachers and other staff members, and yet there were
seemingly no apparent consequences for those who failed to meet these expectations. The third
set of schools achieved a strong, highly interactive, coherent internal accountability system.
Teachers in these schools agreed to a set of shared expectations that complemented their own
belief systems. Likewise, these schools had visible accountability structures and clearly
articulated consequences for those who failed to meet the expectations. According to Elmore,
they existed against the backdrop of external accountability systems, and sometimes the internal and
external mechanisms were mutually reinforcing.
While the research of Elmore (2007) produced numerous findings, I addressed only those
that were pertinent to my study. These findings included the following:
1. School-level factors were extremely important in shaping a school’s conceptions of
accountability.
2. The baseline or default form of accountability observed at several schools in the study
was characterized by individual teacher responsibility where personal decisions took
precedence over any outside expectation or formal accountability mechanism.
3. The study confirmed widely prevalent views that schools develop their own
normative structures apart from external influences and that teaching is basically an
isolated endeavor where individuals are left alone to decide what and how to teach. A
strong leader capable of recruiting like-minded staff, shaping a culture of shared
expectations, and providing instructional leadership, was sometimes associated with
overcoming this isolation.
Elmore concluded that schools operating in the default mode—where most decisions related to
student learning are made by individual teachers—will not be capable of responding to external
accountability mechanisms. He pointed out that the schools in the study did not exhibit the
capacity necessary to improve instructional practice or overall performance due to a limited
capacity to engage in collective deliberation and action. Schools that are more likely to respond
effectively to external pressures are those that have strong internal accountability systems.
Elmore continued:
A strong normative environment inside the school, based on a belief in the capacity and
efficacy of teachers and principals to influence student learning, coupled with the
knowledge and skill necessary to act on those beliefs, are prior conditions necessary for
the success of strong external accountability systems. (p. 199)
Lateral Accountability
City et al. (2009) introduced the concept of lateral accountability as a key aspect of the
instructional rounds process, a systems-level school improvement strategy. In this form of
accountability, a group of role-alike or cross-role colleagues form a rounds network with the
intent of working together on common problems of practice. The authors characterized this type
of network as one that “moves away from the default mode of hierarchy, compliance and self-
protection and toward a culture that creates a space for individual and organizational learning”
(p. 63). When this culture of learning is fostered, members hold each other responsible for
meeting agreements through participation in the network (City et al., 2009).
To more fully understand lateral accountability in this context, it is helpful to review the
rounds process, an explicit practice that is designed to focus the work of school improvement on
instruction. It is grounded in the idea that transparency of teaching and collaboration are critical
to instructional improvement. Rounds involve an iterative four-step process: identifying a
problem of practice, observing, debriefing, and then adjusting instruction in an effort to improve
student learning (City et al., 2009). City and her colleagues argued that such a process is
needed because, while educators may have established individual routines and practices, they
lack “shared practices” and “agreed upon definitions of what good instruction looks like” (p. 3).
The rounds process was established to bridge this knowledge gap (City et al., 2009). It focuses
exclusively on the instructional core—what David Hawkins initially identified as the “I” (the
teacher), the “thou” (the student), and the “it” (the content) (City et al., 2009). Observation of
practice centers on three questions related to this instructional core: (a) What are teachers doing
and saying? (b) What are students doing and saying? and (c) What is the task?
Network facilitators create the conditions under which the rounds process can flourish by
fostering lateral accountability. They start by developing trust (i.e., using protocols to create safe
places for people to take risks, speaking candidly about problems of practice, and building social
relationships with each other). They also collectively develop group norms, referring to them
often and making them part of a living document that guides members’ behavior and
expectations. Norms serve as the foundation for lateral accountability, because they define the
commitment that individual network members expect from each other. Rather than acting as
punitive enforcers of behavior, facilitators in lateral accountability serve in a more supportive
role. They do not position themselves to solve problems but rather transfer issues and
responsibilities back to the group through a concept known as “transfer of agency” (City et al.,
2009, p. 149). In this way, the group is encouraged to take ownership of its own learning. In
this process, the rounds protocol is designed to help network members develop the highest level
of trust, relational trust, as opposed to a simpler form, transactional trust, defined by City et al.
as simply a commitment to behave in a certain way, “a procedural safety net that people can use
to sort out their relationships with one another” (p. 164). Relational trust, in contrast, grows and
deepens over time as participants work together and learn to depend on each other through the
delicate task of classroom observation. As the authors affirmed, “Collective efficacy requires
collective work and collective norms, not just individual understanding” (2009, p. 165).
Empirical Work on Lateral Accountability
Teitel (2013) conducted five case studies of schools that have implemented the practice
of school-based instructional rounds. As practitioners recognized the benefits of a close and
continuous focus on schools and classrooms over time, the rounds practice slowly evolved into a
single-school focus. The schools selected for this study represented a range of models, including
rural, urban, charter, district, and independent. Each case study detailed what the rounds protocol
looked like in the school and described how it connected to the school’s improvement efforts. Between July
2012 and February 2013, data from observations, interviews, and written correspondence were
collected and analyzed at each school.
Teitel (2013) found that when teachers at a school site focused on a common problem of
practice and made commitments to each other for improvement, lateral accountability was
strengthened. Teitel noticed three benefits of school-based rounds: shared context, immediacy,
and embeddedness. He discovered that school-based rounds created an “intimacy” (p. 145)
among teachers, administrators, and students, who all own the problem of practice and have a
vested interest in addressing it. Likewise, rounds that are conducted in a common context give
an immediacy and urgency to the work. Feedback from observations and subsequent
adjustments to instruction can take place right away. Finally, there is an aspect of embeddedness
to the school-based rounds protocol as it becomes a key part of the continuous cycle of growth
and improvement at each school.
External Mandates for Accountability
Finally, external accountability is a critical, if problematic, element of any accountability
system (Slayton, 2007). According to Slayton, external accountability is important because it
ensures that an entity outside of the classroom, school, local district, and district levels can
provide oversight to each level of the system and ensure consistent adherence to and
implementation of the system to drive improvement (p. 7). She noted that few, if any, districts
across the United States have accountability systems structured in ways that hold members at
every level accountable to a continuous cycle of improvement; some may even act in ways that
are counterproductive. In this section, I examine the literature on external
accountability in terms of the theory and empirical base—what it says about the way that
external pressure has been imposed to increase accountability. I am particularly interested in
reviewing the literature on how accountability systems or external mandates influence change in
teacher practice.
In 2001, No Child Left Behind (NCLB) established an accountability system for states,
school districts, and schools, requiring them to (a) Adopt challenging content and achievement
standards; (b) Make annual progress on students meeting those standards, including decreasing
gaps between students of certain subgroups; and (c) Test students and collect data to measure
their annual progress (Lohman, 2010). Race to the Top (RTTT), built on the same test-based
accountability structure as NCLB, is viewed as an attempt to respond to NCLB’s failure to
produce meaningful school improvement or to make progress on closing student achievement
gaps (McGuinn, 2011). According to McGuinn, RTTT provides a way for states to
opt out of NCLB’s tough, unrealistic expectations. RTTT’s competitive grant process, designed
to entice states with incentives rather than to hit them with sanctions, is meant to spur
educational reforms in four areas, including teacher evaluation. The literature provides
important background information on the underlying assumptions and limitations of these new
accountability mandates.
Fuhrman (2004) pointed out that accountability systems established by NCLB have
shifted away from holding schools accountable for policy compliance and are moving instead
toward a focus on learning outcomes or school-level academic performance that includes
consequences for that performance. According to Fuhrman, this new conception of
accountability relies on a set of underlying assumptions: (a) Performance, or student
achievement, is the key value or goal of schooling; (b) Performance is accurately measured by
the assessment instruments in use; (c) Consequences, or stakes, motivate educators and students;
and (d) Improved instruction and higher levels of performance will result. By way of
introduction to the book Redesigning Accountability Systems for Education, Fuhrman expressed
doubts about whether these assumptions would bear out in practice, providing a lead-in to
the works of authors who have addressed this concern.
O’Day (2004), one of these authors, articulated how accountability can best be employed
to promote organizational learning and improvement in schools. O’Day first took aim at the
NCLB focus on the school as the basic unit of accountability, arguing that changes in behavior
must occur at both the collective and the individual level. Next, she challenged the idea that
external forces can successfully change the internal workings of a school, stating “rules decreed
from on high often have little impact, especially when it comes to the core technology of
teaching and learning” (O’Day, 2004, p. 18). Third, she pointed out that how information is
understood and communicated in schools is critical to school improvement. She stated, “Indeed,
information is the lifeblood of all accountability mechanisms. One accounts to someone for
something, and this accounting gets done by conveying information” (O’Day, 2004, p. 18).
School improvement (adaptation), she argued, depends on the successful dissemination of
information through interaction (how members of the system obtain information) among
members of the system and interpretation (how members of the system make meaning of the
information and are able to act on it). Finally, O’Day argued that school accountability systems
under NCLB will have a greater chance to be successful if they can accomplish the following:
Generate and focus attention on information relevant to teaching and learning.
Motivate educators not only to attend to relevant information but also to expend effort to
augment or change strategies in response to this information.
Develop the knowledge and skills to promote valid interpretation of information in a way
that leads to continuous learning.
Allocate resources where they are most needed. (pp. 20–21)
Jones (2004) also took issue with many aspects of NCLB’s high-stakes testing accountability
model, arguing that we need to find a better, more balanced means of evaluating schools. He
contended that schools should answer to a wider range of constituents, including students,
parents, and local community members, not just state and federal authorities. Likewise,
accountability should be expanded to monitor other aspects of schools, seeking evidence that
ensures the physical and emotional well-being of students, student learning, teacher learning,
equity and access, and continuous learning and improvement. He further argued that schools
should be judged based on multiple measures, both qualitative and quantitative, taking into
account local contexts. He suggested adapting a business model, the “balanced scorecard,” to
provide a more comprehensive view of a school’s performance. It could potentially be used to
monitor four elements of school performance: (a) Student learning with a greater focus on
“applied learning and thinking skills” (p. 586) as determined by multiple measures; (b) Fair and
equitable opportunities to learn, particularly for students and teachers; (c) Responsiveness to
students, parents, and community members; and (d) Organizational capacity. Jones argued that
schools need to strengthen internal accountability by empowering teachers (i.e., providing
teachers with more meaningful opportunities for professional development and shared
leadership). Finally, Jones suggested that government agencies play a more supportive, less
punitive role for schools that are struggling. Establishing reciprocal accountability, he argued,
would help move schools in this direction.
This section presented external accountability mandates that emerged from NCLB. It
addressed questions raised in the literature about these mandates (e.g., whether the school is the
most appropriate unit of focus, how accountability can best be employed, and whether its
underlying assumptions bear out in practice).
We now have evidence that this high-stakes testing accountability model did not produce any
meaningful student achievement gains, nor did it close the achievement gap for disadvantaged
students as originally anticipated (McGuinn, 2011). It did, however, lead to an increase in data
use as schools scrambled to reach achievement targets. The next section examines the literature
on data use.
Data-Driven Decision Making
Accountability systems increasingly rely on an array of data to inform decisions. In this
section, I examine what the literature says about data use, including whether data are gathered
and examined for productive purposes or used solely to respond in a superficial way to externally
driven mandates.
Marsh, Pane, and Hamilton (2006) analyzed four RAND studies to explore how data were
used in a variety of educational contexts. They began by defining data-driven decision making
(DDDM) as the systematic collection and analysis of various types of data by teachers and
administrators to guide decisions intended to improve schools. These data, according to Marsh et
al., include the following: (a) input data (material costs, school expenditures, student
demographics, etc.); (b) process data (production rates, data on financial operations, quality of
instruction, etc.); (c) outcome data (dropout rates, student test scores, etc.); and (d) satisfaction
data (opinions from teachers, students, parents, or community members, etc.). According to
Marsh and her colleagues, once collected, the raw data were systematically organized, analyzed,
and synthesized in an effort to yield critical information for decision making. Decisions fell into
two categories: (a) decisions using data to inform, clarify, and identify, and (b) decisions using
data to act (Marsh et al., 2006). They offered a useful visual to
demonstrate the DDDM process:
Figure 2. Conceptual framework of data-driven decision making in education. Reprinted from
Marsh et al. (2006), Making sense of data-driven decision making in education (p. 3). Copyright
2006 by the RAND Corporation. Reprinted with permission.
In this conceptual framework, raw data must be collected, organized, analyzed, and
summarized to yield useful information. Once this takes place, information becomes
actionable knowledge when key staff members weigh its value and consider possible solutions.
Actionable knowledge, in turn, informs a range of decisions that might include setting goals,
addressing student needs, targeting support for student subgroups, evaluating teaching practices,
and so forth.
To begin their analysis of the four RAND studies, Marsh et al. (2006) examined the results
of interviews, surveys, focus groups, observations, and document reviews. They considered four
questions:
What types of data are administrators using?
How are administrators and teachers using these data?
What kinds of support are available to help with data use?
What factors influence the use of data for decision making?
The researchers’ analysis yielded some of the following findings and recommendations about
data use: (a) Outcome data, particularly state test scores, clearly receive the most systematic
attention, and school staff use these data most frequently to set improvement goals. They do not
appear to use much input, process, or satisfaction data. Practitioners and policy makers should
consider using other types of data at multiple points in time to provide a more balanced approach
to decision making; (b) Having access to data does not necessarily mean they will be used
effectively to inform decisions or that they will lead to improvements. School staff frequently
lack data analysis skills and guidance in identifying problems and determining solutions.
According to Marsh and her colleagues, more attention needs to be paid to developing capacity in
data analysis and decision making skills; (c) Most schools and districts in the studies view data
as useful to improve teaching and learning but express concerns about the negative consequences
of high-stakes testing and excessive reliance on test data, both of which result in a narrowing of
the curriculum and an overemphasis on teaching test-taking strategies.
Jennings (2012) also discussed data use but restricted her focus to how teachers use data
in the workplace. She pointed out that since the 1970s, American policymakers have enacted
test-based accountability policies in an effort to improve student achievement. Central to this
policy logic is the idea that test score data, in conjunction with sanctions, will improve schools
and that data are the fuel that will drive the reform. She indicated that scholars have identified
several positive effects that test score data have had on teacher practice, including helping them
to diagnose student needs, identify strengths and weaknesses in the curriculum, and determine
content not mastered by students; motivating them to work smarter; encouraging them to seek
professional development; and guiding them to allocate resources effectively.
Jennings (2012) argued that even with more than 20 years of accountability policies, we
still know little about “how features of accountability systems interact with organizational and
individual characteristics to influence teachers’ responses to accountability” (p. 2). She
identified ways that teachers can conceivably use test score data. First, the most passive way is
to use them as a lens, making inferences about school performance. Second, teachers can use test
score data for diagnosis, identifying strengths and weaknesses of students. Third, the data can be
used as a compass, pointing toward instructional or organizational changes. Fourth, they can be
used for monitoring, such as determining whether or not a curriculum is working. Finally, they can be
used as a legitimizer, providing support for decisions. Jennings stated:
Together, these types of data use capture how teachers know their schools, students, and
themselves (lens); how they determine what’s working, what’s going wrong and why
(diagnosis); what they should do in response (compass); how they establish whether it
worked (monitoring); and how they justify decisions to themselves or to others
(legitimizers). (p. 4)
Jennings (2012) explained that at the core of test score data use is the process that
educators, policy makers, and parents enact to make inferences about student and school-level
performance. Jennings pointed out that these data can be used in productive ways (practices that
improve student learning and validate inferences about student and school-level performance) or
distortive ways (instructional and organizational decisions made to produce score gains that do
not generalize to other measures of learning and, thus, lead to invalid inferences about student
and school-level performance).
Jennings (2012) reviewed the literature on the effects of accountability systems on
teachers’ data use. She proposed five features of accountability systems that affected how test
scores were used and examined how individuals and organizations interacted with these features
to influence teachers’ data use. First, accountability systems influence data use by applying
pressure that varies on a continuum from supportive to punitive. Second, the locus
of pressure varies across accountability systems, focusing in varying ways on districts, schools,
students, and/or teachers. Third, accountability systems differ in distributional goals (e.g.,
prioritizing growth vs. proficiency or emphasizing performance of racial or other subgroups,
etc.). Fourth, features of assessments vary across systems in ways that affect how teachers use
them. Fifth, the scope of accountability systems affects data use (e.g., incorporating multiple
measures, using process or outcome data, etc.).
Some of the findings of Jennings’s review of the literature that may be pertinent for my
study include the following:
Higher stakes increase data use.
The most common improvement strategies used by teachers who examine test results
include aligning curricula and instruction with standards, providing extra support for
lower performing students, planning instruction, adopting benchmark assessments, and
engaging in test preparation activities.
When educators are faced with goals they perceive to be extremely difficult or nearly
impossible to achieve, they have an increased tendency to narrow their focus, engage in
unethical behavior or risk taking, inhibit learning, increase competition, and experience
decreased intrinsic motivation.
There is evidence that schools with higher organizational capacity are likely to use data
more productively.
Most of the variation in data use exists within schools, not between schools.
Teachers in high- and low-performing schools are likely to respond differently to
accountability pressure. Teachers in the lowest performing schools may be more likely
to pursue distortive uses of data, targeting high-return content or students. Use of test
score data has increased targeting behavior.
School organizational characteristics, such as the availability of data systems,
instructional support staff, and professional development opportunities, may affect how
features of assessments are distilled for teachers.
Accountability systems based on multiple process and outcome measures may produce
more productive uses of data than those simply based on test scores.
Jennings (2012) concluded by pointing out that there is tremendous interest in understanding
how data can be used to transform teacher practice, acknowledging that data use is already
altering teachers’ work on a daily basis. More studies are needed to understand this impact. The
next section focuses on data use conversations that are proliferating in schools as a result of external
accountability measures.
Data Use Conversations
In an era of accountability, teachers and administrators feel an increasing amount of
pressure to focus on performance data, especially in low-performing schools. Given this
increased attention, it is important to consider how increased data use impacts instruction. Horn
and Little (2010) and Horn et al. (2015) have addressed pressing questions about teacher data use
conversations, in particular, and examined how they might potentially contribute to teacher
learning.
Horn and Little (2010) considered the impact that teacher talk has on professional
learning and instructional innovation. They pointed to research suggesting that certain
impediments make it difficult for teachers to engage in meaningful discourse with their
colleagues in ways that generate new insights into teaching. Some of these obstacles include the
following: (a) The challenge of making tacit knowledge explicit, (b) The difficulty of
confronting well-established norms of privacy and noninterference, (c) Insufficient structure and
social supports, (d) Language that reinforces assumptions about learners and learning, and (e)
The multitude of immediate tasks teachers confront on a daily basis.
Despite these challenges, creating opportunities to improve teacher discourse around
teaching and learning is widely recognized as an important step toward improvement in practice.
This growing realization is supported by a body of research spanning 25 years, suggesting that
teachers’ collegial relationships have an impact on school improvement (Horn & Little, 2010).
Horn and Little pointed out that some studies attributed improvement gains to the formation of
professional community where teachers interact with colleagues in reflective dialogue about
student learning. Horn and Little argued that formal workplace groups are more likely to foster
teacher learning if they focus on dilemmas and problems of practice. A further study found that
secondary school reforms at the classroom level were unlikely to succeed “without a means to
foster in-depth interaction, mutual support, and professional learning opportunity among subject-
domain teachers” (Horn & Little, 2010, p. 185).
With the goal of specifying discourse that is generative for teacher learning, Horn and
Little (2010) examined the impact of conversational routines of teachers in English and algebra
work groups at one urban high school. Their study centered on the following research questions:
What part do problems of teaching and learning play in teachers’ recorded talk? How often do
they arise, with regard to what aspects of teaching, and with what degree of specificity and
transparency? How do they get taken up, or not? As a point of departure, the researchers
focused on conversational routines, the patterned and recurring ways that conversations occur
within a group, and moves, turns of talk that shape the progress of discourse by setting up and
constraining the response of speakers.
Teachers in both work groups engaged in conversational exchanges involving
normalizing responses or moves that defined a problem as common to everyday classroom
experience (Horn & Little, 2010). Normalizing practices served as a starting point for digging
into problems of practice. They helped to establish solidarity, creating a sense of membership
and affiliation as well as legitimizing problems as worthy of attention. Normalizing responses
were used to turn the conversation toward or away from deeper examination of problems or
dilemmas. The conversation was likely to turn away from more in-depth exploration of
problems when teachers limited themselves to expressions of sympathy or reassurance, rendering
the teacher posing the problem helpless.
Normalizing responses could also be used to turn toward problems of practice (Horn &
Little, 2010). According to Horn and Little, the algebra group, in particular, successfully
established a conversational routine that opened up opportunities for teacher learning and created
an interactional space conducive to learning about teaching practice. It involved the following
elements: (a) Normalizing a problem of practice; (b) Further specifying the problem; (c)
Revising the account of the problem (its nature and possible causes); and (d) Generalizing to
principles of teaching. By using this routine, the algebra group positioned a novice teacher to
identify, examine, and reflect on her own teaching dilemma in a way that expanded the collective
knowledge of the group as well (Horn & Little, 2010).
Horn, Kane, and Wilson (2015) conducted a study focused on teachers’ data use
conversations with the intent of discovering how they potentially limit or expand teacher
learning opportunities. To set the stage for their study, they first presented two opposing camps
regarding the approaches schools are taking to use performance data to improve educational
outcomes: (a) Instructional management logic—where the focus is on reorganizing instructional
work with or without change in instructional practice to ensure improvement in test scores; and
(b) Instructional improvement logic—where the emphasis is on improving educational outcomes
by supporting teachers’ professional growth and the development of their pedagogical skills.
The tension between these two approaches arises as schools struggle to interpret and use
performance data.
Next, Horn et al. (2015) presented a theoretical perspective on teacher conversations.
They pointed to four conversational features that shape meaning in teachers’ work group talk: (a)
Epistemic stances—work group perspectives on what can be known, how to know it, and why it
is of value. Horn et al. referred to these as “bald declarations in single turns of talk” (p. 213)
providing an example to illustrate— “What matters here is motivating kids;” (b) Representations
of practice—renderings of classroom experiences for the group’s consideration. These take on
two forms: replays—detailed stories depicting classroom events, and rehearsals—classroom
interactions played out in more general or anticipatory ways; (c) Activity structures—
conversational routines or patterned ways of talking; and (d) Problem frames—descriptions of
how classroom issues are defined through interactions. Finally, Horn et al. pointed to Jennings’s
(2012) five frames as useful for analyzing teachers’ approaches to data use (data as a lens, a
diagnostic tool, a compass, a monitoring device, and/or as a legitimizer). All of these
conversational features and frames work together in dynamic ways to support teacher learning
opportunities (Horn et al., 2015).
With this theoretical base as a backdrop, Horn et al. (2015) observed two seventh-grade
mathematics teacher workgroups in nearby urban middle schools. The study sampled four meetings
from each workgroup, with the intent of answering the following research question: How do
middle school mathematics teachers’ data use conversations shape professional learning
opportunities? The researchers found that even though both groups were using the same interim
assessment data to inform their instruction, they used different data use logics in their
conversations, and this shaped the quality of their learning opportunities.
The first school, Creekside, had a principal with a strong administrative charge to
improve student outcomes (Horn et al., 2015). Because of his emphasis on proficiency targets,
the conversations of the teacher group were geared toward the instructional management logic
using mostly compass and monitoring frames. Teacher discussions centered on ways to
personalize and motivate individual students to achieve, focusing primarily on the issue of
improving test scores without rethinking teaching. The Park Falls teachers, in contrast, worked
largely without the support of a strong principal. Their conversations were also governed mostly
by an instructional management logic, but with an emphasis on the diagnostic frame as they worked
together to review and interpret frequently missed items and student errors. In this process, they
focused on student thinking which led them to learning opportunities about what topics needed
revisiting and why. Interestingly, time became a factor in their interactions. They spent so
much time in group conversations with peers reviewing the frequently missed test items that
they did not have sufficient time to deeply discuss instructional implications. Thus, even though
their conversations gave them insight into how their instructional choices had an impact on
student outcomes, the way the activity was structured limited the capacity of teachers to use
findings in their conversations to improve practice (Horn et al., 2015).
Teacher Evaluation
Recently, substantial efforts have been made to improve teacher quality through
performance evaluation measures (Hallinger et al., 2013). Darling-Hammond (2013) and Little,
Goe, and Bell (2009) pointed out important areas to consider for those involved in efforts to
restructure teacher evaluation systems.
Darling-Hammond (2013) argued that we need to carefully consider how these new
systems are conceived and implemented, offering that any fair system would not only develop
the skills of individual teachers but would also create the conditions under which those skills can
be used effectively. She emphasized that, likewise, a critical feature of any evaluation system
would require teachers to collect, examine, reflect on, and use evidence of student learning to
inform or adjust instruction. She noted that it would also include ongoing opportunities for
professional learning to ensure that teachers have the knowledge and skills needed to respond to
student learning. Further, Darling-Hammond cautioned against adopting any evaluation system
that might take an “individualistic, competitive approach to ranking and sorting teachers” (p. 3),
arguing that this would undermine collaboration and the growth of professional learning
communities—key aspects of a teaching and learning environment that supports continuous
improvement for all.
After examining teacher evaluation systems across the United States and in other parts of
the world, Darling-Hammond (2013) proposed key features necessary for an effective teacher
evaluation system:
1. Teacher evaluation should be based on professional teaching standards.
2. Evaluations should include multi-faceted evidence of teacher practice, student learning,
and professional contributions.
3. Evaluators should be knowledgeable about instruction and well-trained in the evaluation
process.
4. Evaluation should be accompanied by useful feedback, and connected to professional
development opportunities that are relevant to the teacher’s goals and needs.
5. The evaluation system should value and encourage teacher collaboration.
6. Expert teachers should be part of the assistance and review process for new teachers and
for teachers needing extra assistance.
7. Panels of teachers and administrators should oversee the evaluation process. (p. 153)
Little and her colleagues (2009) also cautioned that a more robust, comprehensive
definition of teacher effectiveness needs to be considered before determining the best way to
measure it. They argued that there are many different conceptions of teacher effectiveness, some
of which are complex and controversial. According to them, teacher effectiveness is often
narrowly defined as “the ability to produce gains in student achievement scores” (p. 1). Test
scores, they pointed out, provide limited information about what and how much students learn. As
such, they cannot be considered the sole indicator of a teacher’s impact on student learning.
Little et al. (2009) presented an important set of criteria to consider for those involved in
efforts to measure teacher effectiveness, A Five Point Definition of Teacher Effectiveness,
developed by Goe et al. (2008). In this comprehensive definition, effective teachers do all of the
following: (a) Maintain high expectations for all students and focus their attention on student
learning; (b) Contribute to positive outcomes for students, including promoting regular
attendance, positive behavior, and self-efficacy, among others; (c) Use a variety of resources to
plan, structure, monitor, and assess student progress; (d) Contribute to promoting classrooms that
value diversity and civic-mindedness; and (e) Collaborate with peers, school leaders, and outside
educators to ensure student success.
In light of this expanded definition, Little et al. (2009) examined the pros and cons of
several methods currently in use to evaluate teaching. They articulated seven methods of teacher
evaluation, including value-added models, classroom observations, principal evaluations,
analysis of classroom artifacts, portfolios, self-reports of practice, and student evaluations. For
the purposes of this literature review, I focused on the first three and briefly mentioned the other
four.
Value-added models—relatively new but growing in popularity—credit teachers for gains
in student achievement indicators (Little et al., 2009). According to Little and her colleagues,
supporters of these models argue that they are beneficial because they do not involve raters
making subjective judgments, are cost efficient and nonintrusive, and directly examine how
teachers contribute to student learning. Little et al. pointed out that detractors, on the other hand,
caution that value-added scores cannot be consistently or reliably tied to teacher actions and that
the value-added model assumes too much about test scores and what they mean about student
learning.
Classroom observations are the most common form of teacher evaluation though they
vary widely in who observes, what gets observed, and for how long and how often (Little et al.,
2009). According to Little et al., two observation protocols are widely used to conduct
classroom observations: Danielson’s Enhancing Professional Practice: A Framework for
Teaching (1996) and the University of Virginia’s Classroom Assessment Scoring System
(CLASS) (2009). Both protocols are general across subject matter and include formal
procedures for training raters and scoring. Little and her colleagues pointed out that this form of
teacher evaluation is generally viewed to be credible by multiple stakeholders. They argued that
it is the most direct way to measure teacher practice and that studies have linked it to gains in
student achievement. More work needs to be done to link teacher performance scores to student
outcomes and to address concerns about rater reliability (Little et al., 2009).
A third method of teacher evaluation, principal observations, focuses exclusively on
classroom observations conducted by principals or assistant principals (Little et al., 2009). Little
et al. found that this method of evaluation was considered valuable because principals are the
most knowledgeable about the context of their schools, student populations, and teacher groups,
and can potentially use this knowledge to improve instruction. The researchers argued that
classroom observations conducted by principals can be useful for both formative and summative
purposes, giving principals information to make decisions about dismissal and tenure, to identify
teachers in need of remediation, to determine professional development needs, and to provide
direct feedback to teachers to improve their practice, among others. Principals’ interpretations of
teaching behaviors, however, have the potential to introduce bias in the process. Furthermore,
many principals are not well trained in methods of evaluation or in helping teachers grow and
develop in their profession. In one study by Brandt et al. (2007) of several Midwestern districts,
only 8% of principals reported receiving training as a component of their teacher evaluation
systems (as cited in Little et al., 2009). Principals risk credibility with teachers if they are
viewed as “uninformed or unjust evaluators” (Little et al., 2009, p. 9).
Other methods of teacher evaluation include classroom artifacts, portfolios, self-report of
practice, and student evaluation (Little et al., 2009). Little and her colleagues pointed out that
classroom artifacts involved the analysis of teacher-created evidence (lesson plans, assessments,
scoring rubrics, student work samples, etc.) to determine the quality of instruction and the extent
to which a teacher creates and supports student learning opportunities. This method is thought to
be practical and feasible as the procedures do not place undue burdens on teachers, but accurate
scoring is essential and difficult to achieve (Little et al., 2009). According to Little et al.,
portfolios are defined as evidence (teaching practices, school activities, and student progress) of
classroom practice created and compiled by teachers. They pointed out that this evidence
includes such items as lesson plans, schedules, assignments, assessments, student work samples,
reflective writings, notes from parents, and so on that demonstrate exemplary work. This
procedure often requires teachers to reflect on the materials and to thoughtfully explain the
rationale for using them. Self-report of practice requires teachers to report on classroom practice
by completing such things as surveys, instructional logs, checklists, interviews, etc. Finally,
student evaluations ask students to rate teachers on various aspects of teaching. Little and her
colleagues emphasized that student reports should only be used in conjunction with other measures, as
students are not qualified to judge teachers in such areas as content knowledge and classroom
management, among others. These last four methods are considered to be promising measures of
teacher effectiveness, but many issues of reliability and validity still need to be addressed (Little
et al., 2009).
In sum, Little et al. (2009) suggested considering the following set of recommendations
to those in charge of conceptualizing and creating new measures of teacher effectiveness: (a) Use
multiple measures to capture important information about teacher practice that is not found by
solely examining evidence from observations or value-added scores; (b) Incorporate measures
that take into account specific teaching contexts related to subject matter, student background,
and school culture; and (c) Keep in mind that the ultimate goal of teacher evaluation is to
improve instruction and student learning.
Contextual Factors that May Impact Principals’ Practices, Including Data Use
Principals’ work exists within a larger context involving many ongoing and competing
demands from multiple sources, including parents, teachers, district offices, and external
mandates, among others (Goldring, Huff, May, & Camburn, 2008). Goldring and her colleagues
pointed out that, increasingly, principals have been asked to devote more time to instructional
leadership. Their ability to do this and their success in their leadership role may vary according
to the context in which they work and their skill at adjusting and adapting to the competing
pressures in their environments. A study conducted by Goldring et al. highlighted the role that
context plays in school site leadership.
In this study, Goldring et al. (2008) examined the daily work logs of 46 principals in one
urban, Southeastern district through a cluster analysis in an effort to understand how principals
allocated their time and attention across major realms of responsibility. They determined that
principals could be clustered into three broad categories based on the way that they prioritize
their work: (a) Eclectic principals divided their time across a variety of activities; (b)
Instructional principals focused more time on instruction; and (c) Student-centered principals
spent a large amount of time on student affairs. The researchers found that contextual factors
alone best explained principals’ time allocation, not their individual attributes. These results
emphasize the role that context plays in school principals’ practice.
Empirical Work on Teacher Evaluation
In 2008, Chicago Public Schools (CPS) launched the Excellence in Teaching Pilot, a
massive citywide effort to overhaul the way in which teachers are evaluated and how they
receive feedback (Sartain et al., 2011). According to Sartain et al., at the time, this
pilot was at the forefront of a national movement to restructure teacher evaluation systems. The
work in Chicago was motivated by dissatisfaction with the traditional evaluation model in place
that failed to give teachers meaningful guidance on how they could improve their instruction and
did not differentiate low- from high-performing teachers (Sartain et al., 2011). The researchers
reported that, in this system, 93% of teachers were identified as excellent or superior and only
0.3% were found to be unsatisfactory. At the same time, 66% of CPS schools were failing to
meet standards, suggesting a major disconnect between classroom results and teaching quality
(Sartain et al., 2011).
The report summarized findings from a two-year study of Chicago’s Excellence in
Teaching Pilot, involving 101 elementary schools (Sartain et al., 2011). Sartain and her
colleagues reported that this pilot was designed to drive instructional improvement by providing
teachers with evidence-based feedback on their strengths and weaknesses. The pilot consisted of
training and support for principals and teachers, principal observation of teaching practice
conducted twice a year using the Charlotte Danielson Framework for Teaching, and conferences
between the principal and the teacher to discuss evaluation results and teaching practice (Sartain
et al., 2011).
CPS and the Consortium on Chicago School Research worked closely to design the pilot
and the quantitative and qualitative research (Sartain et al., 2011). Data were collected from
multiple observations, administrator and teacher interviews, and teacher focus groups. The
researchers asked three overarching questions: (a) What are the characteristics of principal ratings
of teacher practice? (b) What are the principal and teacher perceptions of the evaluation tool and
conferences? (c) What factors facilitated or impeded implementation of the teacher evaluation
system? The study took place over two years, from 2008 to 2010.
Some of the key findings included the following:
• There was a strong relationship between classroom observation ratings and student learning. In the classrooms of highly rated teachers, students showed the most growth; conversely, in the classrooms of teachers with low observation ratings, students showed the least growth.
• Principals rated teaching reliably at the low and middle levels. However, principals were more likely to rate practice as highly effective when outside observers rated the same practice as only effective.
• Conferences, based on shared language from the Danielson Framework and on evidence from the observation, were reported to be more reflective. Positive attitudes about conferencing depended on principals' skills and buy-in (Sartain et al., 2011).
Overall, the new system was found to be an improvement (Sartain et al., 2011). According to the
report, classroom observations proved to be valid and reliable measures of teaching practice.
Principal-teacher conferences were more reflective and objective than under the former system
and were more focused on instructional practice and improvement. Further, over 50% of
principals were highly engaged with the process (Sartain et al., 2011).
To conclude the report, Sartain et al. (2011) asserted that the shift to evidence-based
teacher evaluation requires teachers “to conceptualize their instructional practice as constantly
evolving, open to scrutiny, and in need of tweaking and improvement” (p. 42). They also
contended that any successful, evidence-based teacher evaluation system requires the following
components: (a) An intentional, long-term commitment to the work; (b) A well-designed
instructional rubric to promote shared language and values about instruction; (c) A well-
conceived, carefully articulated system of accountability that details a clear evaluation process;
(d) A structure that connects the tools, ratings, and procedures with professional development
that supports teachers and administrators and helps them to re-conceptualize teacher evaluation
as a process to promote teacher growth and improvement in teacher practice (Sartain et al.,
2011, p. 42).
Empirical Work on Principals’ Use of Teacher Evaluation Data
Although much attention has been focused on value-added test score data as an indicator
of teacher effectiveness, teacher observation data, among other types of data, are emerging as the main source of data that principals use in making informed decisions about schools, teachers, and
student learning (Goldring et al., 2015). Goldring et al. (2015) conducted a study in which they
explored principals’ use of, and attitudes toward, teacher effectiveness data. The researchers
asked the following questions: How and why do principals use teacher effectiveness measures
for human capital decisions in practice? What are barriers to using these measures? To find
answers to these questions, they surveyed principals and conducted semi-structured interviews of
central office staff and principals in six urban high school districts over a two-year period.
Results indicated that principals tended to be wary of value-added measures as indicators of
teacher effectiveness. They were much more likely to rely on teacher observation data to make
informed decisions about teaching and learning on their campuses, despite the time-consuming nature of this process, which involved observing classroom instruction, scripting, uploading evidence, scoring, and providing feedback to teachers (Goldring et al., 2015).
Principals preferred observation data because, unlike value-added measures, they were collected at multiple points over the course of the school year, included formal and informal snapshots of practice, and were readily accessible (Goldring et al., 2015). Excellence in teaching was generally clearly defined, and when it came time to give feedback to teachers,
principals were well equipped with objective evidence on which to base their perceptions. In
sum, principals preferred observation data because they provided a big-picture view of a teacher's practice, could inform professional development plans, and could help principals provide
specific, ongoing feedback to teachers (Goldring et al., 2015).
The literature on accountability helps me respond to my research question. It suggests
that investments in internal accountability (unwavering focus on instruction, shared expectations
for student and teacher performance, and a clear process to ensure that those expectations have
been met) and reciprocal accountability (alignment of capacity building with performance
measures) will lead to continuous instructional improvement and will help schools respond
productively to external mandates (Elmore, 2007). Elmore illustrated this point with two case studies, but, as yet, there is no large body of research demonstrating that these outcomes occur in every case. The literature has also suggested that lateral
accountability empowers teachers to focus on common problems of practice and supports
improvement efforts through peer commitments, and that this is potentially more powerful than
other hierarchical forms of accountability (City et al., 2009). External accountability provides
necessary oversight, ensuring that all levels of the school system move toward improvement, and
yet it is doubtful that the current high-stakes testing model will lead to instructional improvement
(Fuhrman, 2004; Slayton, 2007). Productive uses of data have the potential to effectively inform
school accountability systems, particularly teacher evaluation, in ways that will lead to teacher
practice change (Jennings, 2012; Marsh et al., 2006). Finally, empirical evidence from states and
districts that have implemented new teacher evaluation systems such as Chicago's Excellence
in Teaching Pilot provides insight into how a system can be developed, implemented, and
supported in a way that leads to instructional improvement (Sartain et al., 2011). Much of this
research is theoretical. Most of the empirical research to date involves small case studies. More large-scale studies with randomized selection are needed to confirm these findings.
This concludes a substantive literature review regarding three areas of study: leadership,
professional development, and accountability. Now I turn my attention to my conceptual
framework.
Conceptual Framework
Maxwell (2012) defined the conceptual framework as “the system of concepts,
assumptions, expectations, beliefs, and theories that supports and informs” a research study (p.
39). He further described it as “a conception or model of what is out there that you plan to
study…a tentative theory of the phenomena that you are investigating” (p. 39). The literature
review was written “in close association with advisors who know the territory” (Maxwell, 2012,
p. 40). Drawing on the literature and the data, I propose the following conceptual framework for my study, one that describes what happens among its components. The goal of my
study was to explore how a school was using data, including teacher evaluation data, to improve
practice. I studied the principal’s approach to using data in the enactment of teacher evaluation
and professional development, exploring how the principal communicated with teachers and other school leaders and the context within which that communication took place to discover how and in what ways this interaction fostered teacher practice change. I constructed a conceptual
framework that drew upon the ideas presented in the literature review: (a) Leadership, (b)
Professional development, and (c) Accountability. In this section, I present a theory that shows
how these ideas and concepts connect and intersect. Toward this effort, I created a concept map
to visually present the design of my study:
Figure 3. Conceptual model.
In this model, federal, state, and district policy initiatives create external accountability
mandates, putting pressure on schools to comply in ways that are not always supportive or
productive and often result in punitive consequences. A school leader mediates or buffers the
school from these demands by creating internal accountability systems using a leadership style
that supports the use of data for the improvement of practice. I posited that the existence of
lateral accountability, where power is shared, would contribute to a better use of data. In this case, lateral accountability was absent and the use of data was limited. The school site leader also
enacts a certain set of strategies to create an enabling and empowering environment where
teachers are encouraged to work together using data productively to inform their practice. The
school site leader works to inspire and motivate teachers to become active participants in their
own learning and creates conditions in professional development that support practice change.
Next, I discuss each of the areas studied in the literature review so that it becomes clear how they all work together to create a school environment that can successfully meet the demands of an externally mandated accountability system, specifically teacher evaluation, in a way that supports instructional improvement and teacher practice change.
School Leadership
I drew on different models of leadership and on the data from my study to construct the following explanation of how a principal exercises leadership to create this type of environment. According to Hallinger (2004), the principal demonstrates instructional leadership,
where the school leader shapes his/her image as someone who is “strong and directive” (p. 329),
displaying a combination of expertise and charisma that easily enables him/her to monitor,
coordinate, oversee, and shape curriculum and instruction in a school. In the presence of
distributed leadership, a school leader would ostensibly strive to energize, empower, and
mobilize key members of the school community, including administrative staff, coaches,
coordinators, and lead teachers, to take on leadership responsibilities; together, they work in
unison toward improvement of instruction (Spillane et al., 2004). Spillane et al. reported that in
the absence of this type of leadership, the power resides exclusively with the school leader and
the leadership capacity of other administrators and/or teachers is not developed, weakening the
way that data is used to inform practice. Furthermore, a school leader exhibits aspects of
student-centered leadership by employing processes (planning, implementing, supporting,
advocating, communicating, and monitoring) to create the conditions to use data in productive
ways that result in practice change (Goldring et al., 2009). Goldring et al. pointed out that the
principal makes efforts to enact all of these behaviors by planning professional learning experiences, promoting the use of data, and supporting his/her staff with resources and strategies, but risks falling short at two critical aspects: implementing and monitoring. The quality of these
behaviors is as important as their presence.
As an instructional leader, the principal maintains a steadfast focus on instruction by
articulating a clear vision, managing the instructional program, and promoting a positive school
learning climate (Hallinger, 2011). However, a school leader who presents him/herself as the keeper of all the knowledge and expertise fails to build the capacity of teachers to critically
examine their pedagogy or to use data in ways that lead to substantive instructional
improvement.
Professional Development
The school leader creates protected professional learning time and space for teachers
where they can problem solve and examine their practice in a safe, supportive community with
their peers (Cochran-Smith & Lytle, 1999; Hord, 1997). Teachers take advantage of professional
development time to deepen content knowledge (Desimone, 2009; Elmore, 2002) and to review
data in the form of student work and administrative or peer feedback from observations. In the
presence of reflective practice, teachers engage in reflective conversations with their peers about
what is occurring in the classroom, what might need to change, and what instructional
approaches they will experiment with to adjust their instruction in an effort to improve student
learning. Teachers, likewise, engage in inquiry, identifying problems, asking questions, and
examining assumptions in learning communities with reciprocal accountability that includes
norms and clear expectations (Rodgers, 2002). They make an effort to use the four step
reflective cycle: observation, description, analysis, and experimentation (Rodgers, 2002) in these
conversations. In the absence of reflective practice, teachers focus mostly on student behaviors
and superficial activity-based changes to their practice. They do not focus on self-identified
problems of practice or issues related to their instructional approaches or pedagogy.
Likewise, in the presence of optimal professional learning, teachers and school leaders
participate in an evolved professional learning community, characterized by commitment and
trust, where multiple perspectives inform collective knowledge, and communal responsibility for
norms, behavior, and direction of the group emerges (Grossman et al., 2001). Grossman and colleagues noted that in this environment, members feel comfortable examining their practice and challenging deeply held beliefs and assumptions about teaching and learning. Teachers,
likewise, feel safe to take risks in identifying and seeking alternatives to problems of practice,
recognizing the impact that they, themselves, have on classroom dynamics and student learning
(Cochran-Smith & Lytle, 1999). According to Cochran-Smith and Lytle, for example, if a teacher encounters disruptive behavior from a student in the classroom, s/he will question his or her own role in how it might have been caused or exacerbated and in how it might be corrected or alleviated. Ultimately, members of the community experience a growing
commitment to student learning and assume responsibility for the growth and development of
their colleagues as well as themselves (Grossman et al., 2001).
In the absence of optimal professional learning, teachers are subjected to externally
imposed professional development that is delivered in a top-down fashion and is based on the
knowledge for practice model, resulting in the exclusion of teacher voice and the inability to
improve practice in substantive ways (Cochran-Smith & Lytle, 1999). Teachers also engage in
protocols that involve the use of data in limited conversations that mostly focus on student
behaviors and result in small tweaks to practice.
Accountability
In the presence of internal accountability, the school community demonstrates a
persistent focus on instruction, shared expectations for student and teacher performance, and a
well-defined process to measure improvement gains (Elmore, 2003). Likewise, reciprocal
accountability is key to this effort. According to Elmore, the school leader establishes a shared
vision with clearly articulated, measurable goals, and a strong support system for teachers and
others to meet those goals. This shared vision includes coaching and professional development
opportunities to work on areas of instruction that teachers identify as needs through reflective
conversations with peers and instructional leaders.
In the absence of a strong internal accountability system, the school leader uses external
accountability demands to his/her advantage, mediating with the larger district and other external
forces while simultaneously buffering the teachers from the harsher aspects of the demands. The
school leader uses the accountability demands to his/her advantage by allowing them to put
pressure on staff so that s/he can more easily pursue his/her agenda.
Conclusion
This chapter explored various bodies of theoretical and empirical literature that informed
the conceptual framework for this study. The literature and data revealed key concepts in
leadership, professional development, and accountability that need to be considered when
implementing an externally driven accountability system, particularly teacher evaluation, at a
school site in a way that supports teacher practice change. Chapter 3 presents the methodology used for this study.
CHAPTER THREE: METHODOLOGY
This chapter describes the qualitative approach, instrumentation, and data collection
methodologies that were used to conduct this study. The purpose of this study was to examine
the role of leadership in structuring opportunities for teachers to engage in collaborative inquiry
and discourse around the use of data in ways that increase their knowledge and skills.
Specifically, I wanted to explore how a school principal can create a collaborative environment,
putting policies and practices in place that would influence a teacher to move toward increased
professional growth. This qualitative case study was informed by the following research
question: What is the role of the high school principal in using data, inclusive of teacher
evaluation data, as a tool for teacher learning?
This chapter is divided into five sections. First, I discuss the rationale for selecting a
qualitative research design. Second, I describe the sample and population for this case study, including the criteria used for selection of the research site. Third, I describe the data collection procedures used in this research. Fourth, I describe data
analysis procedures and the limitations and delimitations of the study. Finally, I end with
remarks about the credibility and trustworthiness of the study.
Research Design
I chose a qualitative research design because I wanted to explore, discover, and possibly be surprised by whatever I might encounter, and to gain insight into how teacher evaluation data and other data were being used at a school site to enhance teacher learning so that I could answer my research question. I was also interested in keeping an open
mind and in learning. Merriam (2009) stated, “Qualitative researchers are interested in
understanding how people interpret their experiences, how they construct their worlds, and
what meaning they attribute to their experiences” (p. 5). I wanted to learn about the role of
leadership in using data, including teacher evaluation data, to improve instruction. I was
particularly curious to find out how teachers and school leaders experienced the use of data in
teacher evaluation and learning experiences at a school site, so that I could contribute to the
knowledge base in this area of research and offer ideas that might improve this process. This
particular case study attempted to capture what the teacher evaluation process enacted in
practice looked like, what the role of leadership was in that process and in other professional
learning settings, and whether or not teachers who were participating in these experiences
perceived them as likely to impact their practice.
Sample and Population
The focus of my study was to gain a sense of what an effective use of data in the
enactment of teacher evaluation and professional development looked like at a school site and
whether the identifiable elements of leadership, professional development, and accountability
played out in a way that was consistent with the literature. I also explored how these elements
worked together to create an environment conducive to professional learning that leads to
practice change.
Site Selection
I sampled purposefully, using a set of criteria to select a high school for this study.
Merriam (2009) pointed out that purposeful sampling is “based on the assumption that the
investigator wants to discover, understand, and gain insight and therefore must select a sample
from which the most can be learned” (p. 77). Purposeful sampling was considered likely to
yield the most valuable information to help me answer my research question. To find this
school, I sent out information letters to 10 high schools in the Southern California area. I chose
high schools that were known for trying innovative approaches. In this letter, I stated that I was
looking for a public secondary school with a historically underserved population of students
that had a reputation for successfully using teacher evaluation data to inform
teacher practice. I received favorable responses from three schools. I scheduled a visit to each
of these schools and met with the principals. I selected the school that had a willing, receptive
principal and that was geographically closest to me. The high school that I selected was
representative of low-performing schools and was led by a principal in her second year of
leadership who was implementing a new teacher evaluation system and who appeared to be
moving the school forward. Located in Southern California, Star High School opened in 2006,
led by a group of like-minded educators who had a vision of providing this struggling, inner-
city community with an alternative to underperforming high schools in the area. Of the
approximately 600 students in the school, 95% were economically disadvantaged, 100% were
minority, and the school had an API of 750 (the API is California's Academic Performance Index, which at the time of the study was on hiatus because of the transition in the testing system). Star High School was one of several high schools
in the Cascade School District, a district that was implementing a new teacher evaluation
system in all of its schools.
Participant Selection
After gaining the approval of the principal, I sent out a letter to the teachers at the site
explaining my study. I was introduced to the faculty at one of the professional development
meetings. I shared the scope of the study with them and explained that I would conduct
observations of several whole-group professional development meetings and more in-depth interviews and observations exclusively with the English teachers, as that content area reflected
my background. All of the staff members agreed to the study. I then identified nine English
teachers, the principal, and two assistant principals to participate in the study. I focused on a
specific content-alike group of English teachers who worked collaboratively in multiple types
of professional learning experiences, and included a school site principal who supported the use
of data as a tool to improve instruction in both teacher evaluation and professional
development. The teachers who were selected had allotted, protected time to collaborate with
peers on instruction. Years of experience for teachers were not factored into the study. The
principal was identified as a strong leader who had a clearly articulated process for evaluating
teachers and believed in the use of teacher evaluation and other data to improve instruction, and
who made efforts to create the conditions that support continuous learning and teacher practice
change. She also believed in facilitating teacher growth and development through collaborative
inquiry.
Instrumentation and Data Collection
The purpose of this case study was to gain insight into how a secondary school used
teacher evaluation and other data in ways that contributed to teacher learning and instructional
improvement. As is typical in qualitative studies, my role as the researcher was to serve as the primary
instrument for data collection and analysis (Merriam, 2009).
In this study, I collected data over a four-month period of time from multiple sources
related to the school’s teacher evaluation policies and procedures and to the school’s
professional development plan, specifically through interviews, observations, and documents.
Data in qualitative studies include nearly anything that you see, hear, or collect, or anything that
is communicated to you as you embark on your study (Maxwell, 2012). As Maxwell has
stated, “There is no such thing as inadmissible evidence in trying to understand the issues or
situations you are studying” (p. 87). To this end, I interviewed various staff members at the
selected school site, including the principal, two assistant principals, and eight out of nine
English teachers, one of whom served as a coach and a department chairperson. I actually
interviewed all nine of the English teachers, but the transcription of one interview was lost and the teacher refused to be re-interviewed. In addition, I conducted eight observations of
professional development meetings, and one teacher/administrator informal classroom
observation conference. I attempted to observe more of the informal conferences, but the principal cancelled all of them and rescheduled them at times when I could not attend.
Interviews were semi-structured to allow for flexibility of questions. They included questions that sought responses about leadership actions that promote productive
teacher evaluation procedures and that foster and encourage professional learning. Interviews
took place during teacher conference periods or after school in a private office on the campus or
in the teacher’s classroom. In addition, I engaged in informal data-gathering strategies,
including “hanging out, casual conversations, and incidental observations” (Maxwell, 2012, p.
88) to develop a rapport with staff members on the campus and to gather potentially valuable
information at unexpected moments. Maxwell explained that this type of informal data
gathering is particularly valuable, because “it can provide important contextual information, a
different perspective from your interviews, and a check on your interview data” (p. 88).
Because I happened to be working nearby, I was able to visit the site frequently, affording me
the opportunity to conduct many informal observations during the time of the study.
Evidence of professional learning was collected through the documents I obtained. I
examined documents related to the scheduling of professional development, teacher planning,
professional learning model protocols and expectations, and notes taken by the principal and
teachers during collaborative group time. I also collected documents related to the teacher
evaluation process, including evidence gathered from classroom observations and teacher-
administrator informal/formal conferences, and documents detailing the school district’s
teacher evaluation policies and procedures. All of the documents I collected were relevant to
my research question and were “grounded in the context under study” (Maxwell, 2012, p. 163).
Process of Constructing the Interview Protocol
In creating an interview protocol, I developed several clear, easy-to-understand
questions and probes that I felt would yield rich descriptive details about the teacher evaluation
process and professional learning opportunities at the school site, including the use of data and
how it enhanced or hindered teachers’ ability to learn about their practice. In developing this
protocol, I started with a question about the interviewee’s decision to become a teacher. I
thought this might be a good way to ground the participants and encourage them to tap into
their passion for the work they do. I then moved into questions that asked for more specifics
related to the teacher evaluation process and to opportunities for professional growth and
learning. After that, I asked questions about professional development and the principal’s
leadership role at the school.
The Observation Protocol
I conducted eight 90-minute observations of professional development sessions using a
semi-structured observation protocol. As Merriam (2009) explained, observations have an
advantage because they take place in the natural setting where the phenomenon of interest occurs, without the need to arrange or schedule anything. During the observations, I
assumed the role of observer as participant. This meant that my research purpose was known to
the staff members at the school. I worked to establish rapport with them by showing interest in
their activities, contributing to the refreshments, being friendly, and fitting into their routines, as Merriam (2009) suggested. For the most part, participants welcomed my presence and made
themselves available to me for questions or for clarification. They allowed me to look over their
shoulders while they were taking notes and often gave me copies of their notes. I observed the
principal's leadership role in professional development, noting her actions and the teachers'
responses. I also observed small group breakout sessions with the English Department teachers.
I described the physical setting, noting the seating arrangement of teachers, how space was
allocated, and the objects and resources in the environment. I paid particular attention to the
participants, identifying their roles and noting patterns and frequency of interactions. I
transcribed the conversations after being granted permission. I also noted subtle factors such as off-task behavior and nonverbal communication in dress and choice of physical space, paying special attention to how activities transpired and to what did or did not happen.
Data Analysis
Merriam (2009) described data analysis as the process used to answer your research
question, which begins by “identifying the segments in your data that are responsive to your
research questions” (p. 176). She also cautioned that this is the point in the research “at which
a tolerance for ambiguity is most critical” (Merriam, 2009, p. 175). Further, Corbin and Strauss
(2008) described data analysis as both an art and a science. The researcher must “learn to think
outside the box, be willing to take risks, and be able to spin straw into gold” until the raw data
becomes something that “promotes understanding and increases professional knowledge” (p.
47).
I began the process of analysis in this spirit. All of the data collected were coded to find patterns that yielded findings about how data from teacher evaluation and other professional learning experiences were used to support teacher practice change. I also found evidence that
demonstrated how the principal directed the use of data in professional learning experiences,
revealing how that was experienced by teachers and how teachers interacted with her and each
other to examine and improve their practice.
I began the process of data analysis by reading through the interview and observation
transcripts, identifying segments in my data that were responsive to my research question, and
then assigning both open and analytic codes to that data. Some of the open codes were drawn
from direct quotations in the data. Examples include the following: Leadership: “Putting in the
time to be the face of learning at your school,” PD: “a forum to show off cheeky practice,”
Teacher Evaluation: “nerve wracking,” “intimidating,” “invasive.” Other open codes emerged
from multiple unsolicited references to a particular experience or perception, for example, the value of one collaborative experience in which teachers worked with peers to create vertical alignment of writing scaffolds. Examples of analytic codes included the following: principal
as manager of instructional program, knowledge for practice, principal as buffer or mediator,
and so forth. I then organized these codes into categories including some of the following:
Instructional leadership, PD: Benefits and Negatives, TE: Benefits and Negatives, and so forth.
After several thorough readings of each piece of data, I created 17 categories and more than 300
codes. I used an Excel chart and an online coding system, Atlas.ti, to aid in the analysis
process. With these supports, I looked for patterns that emerged both within the data and across
the data. After completing the coding for the interviews and observations, I wrote analytic
memos for each one. After that, I reviewed the Excel and online charts to detect the recurrence
of codes. I noticed that the codes clustered together in ways that revealed patterns. Out of this
process emerged the themes that are represented in Chapter Four.
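To make this pattern-detection step concrete, the following sketch is purely illustrative and was not part of the original analysis, which relied on an Excel chart and Atlas.ti. It assumes a hypothetical export of coded segments to a file named coded_segments.csv with columns transcript and code, and shows one way the recurrence and spread of codes across transcripts might be tallied:

    import csv
    from collections import Counter

    # Hypothetical export of coded segments: one row per coded excerpt,
    # recording the source transcript and the code assigned to it.
    code_counts = Counter()      # how often each code was applied
    transcripts_per_code = {}    # how widely each code was spread

    with open("coded_segments.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            code = row["code"].strip()
            code_counts[code] += 1
            transcripts_per_code.setdefault(code, set()).add(row["transcript"])

    # Codes that recur often and appear across many transcripts are the
    # "clumps" most likely to suggest a theme.
    for code, count in code_counts.most_common(20):
        spread = len(transcripts_per_code[code])
        print(f"{code}: {count} occurrences across {spread} transcripts")

Frequency alone does not establish a theme, of course; in this study, the patterns were interpreted through analytic memos and repeated readings of the data.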
Limitations
There are six limitations identified with this study. First, as a qualitative study, its findings are not generalizable; some might consider that a limitation, although from a qualitative standpoint generalizability is not the aim. Second, the timeline of the data collection only
spanned four months (i.e., one semester), and may not have allowed for enough data to be
collected to determine whether or not the principal’s leadership role in data use effected a change
in practice. Third, this is a small-scale study and does not give a larger window into the
phenomenon studied. Fourth, I might not have been on site enough, or had sufficient availability in my schedule, to capture more of the evidence, particularly principal-teacher informal observation conferences. Possibly, my access to these types of conferences was
limited as well by the sensitive nature of the data I sought to collect. Fifth, data collected through interviews rely on the truthfulness of the staff members interviewed, which cannot be controlled. Finally, the accuracy, or lack thereof, of reporting students who qualify for free and reduced-price lunch may have affected this school's classification as an economically disadvantaged
school.
Delimitations
The following delimitations include factors within the researcher's control that may still affect the results of the study or how the results are interpreted. First, only one school site is
represented in this case study. Second, the study took place over four months. Interview and
observation times were determined by the availability and willingness of the staff. Third, the
instrumentation used for the observations and interviews conducted in this case study may not have yielded appropriate data to answer the specified research question.
Credibility and Trustworthiness
My own research bias acts as a limitation, as the inferences I made from the observation and interview notes were filtered through my own lens and might not always align with what participants were thinking when they provided their responses. As the primary instrument in
the data collection process and analysis, I had to be mindful of personal biases and
prejudices that might affect the results of my research. I came into this study with extensive
experience as a district and school site administrator. I also had personal experience
implementing a new teacher evaluation system across schools in a local school district. I had
certain expectations and feelings about what I might discover. They shaped the way that I
wrestled with the data. Likewise, I felt an immediate affinity with the school principal. I
recognized her struggles and the choices she might agonize about and most likely would need to
weigh. To lessen the possibility of bias, I had to take a step back to both acknowledge and
interrogate my biases.
Apart from these efforts, I engaged in specific practices to ensure the credibility and
trustworthiness of my findings. First, I wrote questions in my interview protocol that were
relevant and responsive to my research question and that addressed the three areas of research in
my conceptual framework. I also wrote analytic memos, asking myself questions about the data.
Second, to ensure internal validity and strengthen reliability, I triangulated the data by examining
evidence from multiple sources, including interviews, observations, and document analysis, and
by cross-checking that evidence. This process of triangulation enabled me to build a coherent
justification for themes. Third, I used rich, thick description to convey the findings, making an
effort to describe the setting in sufficient detail to help the reader feel part of the experience.
This attention to description provided a framework for comparison, allowing for transferability
and ensuring an objective representation of the data. Finally, all phases of this project were
subject to scrutiny by a dissertation chairperson experienced in qualitative research methods. My
thinking was continuously examined and questioned by her, which helped push back on my biases. She also challenged many of my assumptions about the data.
CHAPTER FOUR: FINDINGS
In this qualitative case study, I set out to discover how a secondary school was using
teacher evaluation data to improve teacher learning. More specifically, I sought to examine the
leadership role of the principal in using data generated from teacher evaluation with the aim of
improving teacher practice. In the process, I discovered that teacher evaluation data were not
used in isolation and could not be examined separately from other data used at the school. The
findings presented in this chapter demonstrate this. This recognition also caused my research
question to change to one that was more representative of my findings. It now reads as follows:
What is the role of a high school principal in using data, inclusive of teacher evaluation data, as a
tool for teacher learning? These findings were drawn from interviews, observations, and
document analysis. A total of 11 semi-structured interviews were conducted, eight with teachers and three with school administrators. The data from these interviews were
triangulated with observations of eight professional development sessions, one principal-teacher
informal observation conference, and related documents. Before presenting these findings, it is
important to acknowledge the possibility that there might be something beyond the scope of the data that I did not get to see before, during, or after data collection; I can speak only to what the data offered me. Next, a brief description of the research site is provided
along with background information on the interview participants. After that, the findings in
relation to the research question are presented and discussed.
Context of Study and Findings
In order to contextualize the findings, I provide a description of the school, the principal,
and the faculty and staff members that participated in the study. After that, I present the
findings.
Background of Star High School
Located in the downtown of a large Southern California city, Star High School opened in
2006, led by a group of like-minded educators interested in providing an alternative to the
underperforming high schools in the area. Star was one of many schools in the Cascade School
District, a district that had made a conscious decision to implement a new teacher evaluation
system in all of its schools. Star began with a class of 140 ninth grade students in 2006 and
enrolled close to 600 students in grades 9–12 at the time of this study in 2015. Ninety-five
percent of the students were economically disadvantaged, 100% were minority, and the school
had an Academic Performance Index (API) of 750. It shared a campus with an urban middle
school from another local school district from which it received the majority of its students. This
was an intentional coupling of two schools from different districts to encourage innovation. The
area surrounding the school was experiencing some urban renewal, but the fact that pornography was being filmed in an adjacent warehouse and sweatshops were situated on every corner around the campus served as a reminder that most families in the school were subject to urban blight and severe economic hardship.
Principal
Sofia was the principal and cofounder of Star High School. Prior to opening this school,
she worked as a classroom teacher for 19 years in a neighboring urban high school, during the last four of which she served as a literacy coach. She presented herself as an English teacher at heart and
an avid reader of great literature. She also presented herself as a lifelong learner, having earned
two master’s degrees, one in Educational Administration and one in English. With regard to
continuing her education, she stated:
“I know that in the next 5 years, hopefully, I’m going to get into an EdD program and get
my doctorate. Even though it’s sort of the end of my career, I kind of want to end
knowing that I accomplished my educational goals. I definitely want to get a doctorate.”
This statement was evidence of her desire to promote continuous learning in herself and of her
ambition and determination to achieve her goals. It demonstrated how important it was for her to
be perceived as an expert and a leader with far-reaching knowledge. Sofia came to Cascade
Public Schools with a small group of like-minded colleagues who wanted to start their own
school together. Frustrated with their efforts to break up a large, low-performing high school in
the eastern part of the city into smaller, more responsive schools, they sought out a place where
they could realize their dreams of personalization and autonomy. Sofia began her tenure at this
school as an assistant principal and had only served as principal for two years when I began this
study.
Sofia had a vision of creating a successful school in a challenging urban environment
with a historically disadvantaged student population. Success for her was measured
predominantly by test scores. The determination and effort that she put forth to establish this school were evidence that she was not somebody who got stuck on the structural impediments
around her. She did not bend to the system. In interview data, she expressed a willingness to
take the risk of doing what she thought was the right thing to do in the best interest of the
students. She portrayed herself as a leader who made an effort to get others to share her vision,
rallying her teachers and administrators around expectations for teaching and learning that she hoped would lead to improved student outcomes. This is consistent with Elmore's
(2003) description of a leader as someone who fosters internal accountability by maintaining a
steadfast focus on instruction and by promoting shared expectations for student performance.
Sofia, likewise, presented herself as a successful role model for students in the
community. She often shared her childhood experience of activism with her students, vividly
recalling the times that she walked alongside Cesar Chavez as he fought for justice for the farm
workers. Furthermore, she took pride in her ability to “connect” with students and lead them on
pathways to success. Posters of heroic, inspirational African Americans, Latinos, and others
who had confronted tremendous odds to achieve greatness adorned her office, classroom walls,
and hallways. A large poster in her office read as follows:
A path is only a path, and there is no affront, to oneself or to others, in dropping it if that
is what your heart tells you . . . Look at every path closely and deliberately. Try it as
many times as you think necessary. Then ask yourself alone, one question . . .Does this
path have a heart? If it does, the path is good; if it doesn’t, it is of no use.
– Carlos Castaneda
This quotation summed up the principal’s sense of herself as a coach and mentor, an educator
with an almost messianic calling to provide direction and opportunities to students from
historically disadvantaged backgrounds.
Faculty and Staff
Eight of nine members of the English Department agreed to be interviewed for this study,
along with the principal and two assistant principals. The classroom experience of these English
teachers ranged from five to nine years, except for one who had only three years of experience.
All of these teachers had earned master’s degrees in Education, were fully credentialed to teach,
and many had aspirations to one day become administrators or out-of-classroom support
personnel. Three of them started their teaching careers with Teach for America, and two of them
were former students of the principal. These two both professed a kind of familial allegiance to
her. One of the English teachers taught English to students with special needs. Finally, one of
them served in a dual role as both the literacy coach and English Department Chairperson. All of
these teachers expressed noble reasons for entering the teaching profession such as wanting to
have an impact on students in the community or desiring to join a long history of educators in
their families, among others. Apart from the English teachers and principal, I interviewed two
assistant principals, one in charge of history teachers and one overseeing math and science
teachers.
Research Question
What follows are the research question and the themes that emerged from the data: What is
the role of a high school principal in using data, inclusive of teacher evaluation data, as a tool for
teacher learning?
An examination of the data revealed that, on the surface, the principal appeared to be an
effective instructional leader, and yet she was not well positioned to promote the use of data as a
tool to inform instruction, nor did she have a deep understanding of how data use might lead to
teacher practice change. Likewise, she did not have a coherent and integrated approach to
improving instruction. Furthermore, professional development was delivered mostly in a top-
down fashion and was based on a knowledge for practice model (Cochran-Smith & Lytle, 1999)
that resulted in the exclusion of teacher voice and inhibited substantive improvement in practice.
Finally, the principal used external accountability demands, specifically teacher evaluation and
externally imposed PD, to her advantage.
I start with Theme #1, which is represented through a series of subthemes. I then present
each of these subthemes in order to demonstrate this overarching theme.
Theme #1
On the surface, the principal appeared to be an effective instructional leader, and yet she
was not well positioned to promote the use of data as a tool to inform instruction, nor did she
have a deep understanding of how data use might lead to teacher practice change.
Subtheme 1a. The principal managed and drove instruction and carried the
intellectual work rather than building her teachers’ capacity to reflect on their practice or to
critically examine their pedagogy in a way that would lead to instructional improvement.
According to Hallinger (2003), an instructional leader displays a combination of
expertise and charisma in an effort to focus school-wide attention on instruction, managing the
instructional program by supervising and evaluating instruction, coordinating the curriculum, and
monitoring student progress. The principal embodied some but not all of these attributes.
Consistent with Hallinger’s description, the principal embraced an image of herself as a
charismatic instructional leader focused on instruction. She went out of her way to present
herself and to be viewed by her staff as an instructional leader. She stated:
“I want to call myself a teacher’s coach and a teacher’s teacher. I’d rather be the kind of
school leader who my teachers want to invite into their room all the time, because they
want to show me that they’re learning, and how they’re better.”
However, the way this played itself out was that she became the holder and keeper of the
knowledge and expertise and her teachers the deficient recipients of that knowledge in need of
fixing. Moreover, she did not demonstrate the ability to help teachers engage in instructional
improvement. These different, often competing, aspects of her leadership were evident in her
one-on-one work with teachers in the evaluation process and in the way she positioned herself in
professional development. The data from interviews and observations revealed that teachers
held a perception of the principal as an accessible, knowledgeable leader and exceptional
teacher, but it also revealed that the principal had a problematic, ineffective approach to
instructional improvement.
Sofia worked hard to establish her credibility with her teachers on instruction. One of the
English teachers at the school, Steven, described her, above all, as “a teacher first.” She
maintained high visibility and presence on the campus and had a strong connection to students as
well as teachers. When asked where she was most likely to be seen, Eva responded:
“Everywhere. She’ll be in the classroom, outside at lunch, in the office. She’s always
around.”
Steven responded in a similar manner:
“Everywhere. She’s in her office. She’s in the hallways. She’s opening the gates,
closing the gates. She’s doing hallway sweeps. She’s always doing her observations
with whoever.”
In interview and observation data, it was evident that she visited classrooms frequently, often
assuming the role of co-teacher or student, offered resources and training opportunities to teachers, oversaw professional development, and continuously shared instructional
strategies. Eva stated:
“She’s not like a lot of administrators at schools that taught for like 5 years and
then became an administrator. She taught for a long time, I think over 20 years.
That, at schools, is not very common, so she has a lot of experience from just
being in the classroom herself. She loves being in the classroom. I feel like Sofia
would be in the classroom all day if she could. She likes that more than being in
the office. She has a lot of experience. She’s very present. The kids all know her
well. They love her. So it’s not weird for her to come into your class. It’s just
natural. She becomes like part of the instruction. She becomes like a student
there. She’ll raise her hand and ask questions during your class. She has a lot of
resources and information and wisdom to offer us when it comes to that.”
This quotation supported the idea that, on the surface, Sofia appeared to be an instructional
leader, managing instruction by offering resources, information, and wisdom, and by spending a
lot of time in the classroom. She also successfully created the perception with her teachers that
she was different and more capable as an administrator because of the amount of time she spent
as a teacher compared to most of her peers in administration.
By and large, the teachers at the school held Sofia in high esteem and, in particular,
valued her feedback. Interview data revealed that they viewed her as an accessible,
knowledgeable, experienced teacher, like themselves, who could give them a lot of tools and
helpful strategies to improve student behaviors in the classroom. Furthermore, Sofia was the
driver of instruction, modeling “everything we’re supposed to do in the classroom,” according to
Steven. Anne described her as someone who was “at the center of it all.” Likewise, Eva pointed
out that working with Sofia put her in the frame of mind of “constantly looking for ways to
improve.” Eva continued, “She has so many insights into things that I just don’t think about.”
While the principal successfully created the perception of herself as knowledgeable about
instruction, her efforts with her teachers did little to support or facilitate practice change. The
conversations that she set up with her teachers in the area of teacher evaluation, in particular,
were superficial and principal-centered. She did not know how to help teacher learners become
reflective practitioners by critically examining their instruction through the four-step process
described by Rodgers (2002b), including the four cyclical phases: presence, description, analysis,
and experimentation. Instead, she positioned herself as the knowledgeable other and the teacher
was expected to act as the recipient of that knowledge (Cochran-Smith & Lytle, 1999).
Furthermore, while she had no difficulty focusing on her own growth and development, she
seemed to lack the commitment to and curiosity about her teachers’ growth that Rodgers (2002a)
argued was so important for a reflective practitioner. Notably, another data point that reinforced
the idea that Sofia did not engage in reflective practice was that, when afforded the opportunity
to receive the findings from this study to potentially inform her practice, she expressed a lack of
interest and curiosity.
One example that demonstrated the principal’s approach to leading instructional
improvement was a principal-teacher informal conference. Anne, the teacher in the conference,
was a nine-year veteran. This particular conference lasted for little more than 20 minutes. The
informal conference provided the principal with an avenue to focus on teachers’ concerns about
their practice and to build their instructional capacity. As Hallinger (2011) noted, "Both
education and school improvement are about the development of human capacity” (p. 137).
Rather than focusing her interaction with this teacher on instruction, using the data gathered in
the observation to give the teacher an opportunity to examine and explore her thinking about
how her instructional choices shaped student learning opportunities, Sofia directed a shallow,
limited conversation where the teacher was given very little opportunity to reflect on her
teaching or to use the data gathered to improve her instruction. Though there was some evidence
to indicate that Anne’s use of data mirrored the principal’s approach, suggesting that she had
internalized some of the practices that the principal espoused, the bulk of the evidence from this
observation showed that the principal did little more with this potentially rich opportunity than
act out her role as expert and owner of the process.
Prior to talking with Anne, Sofia observed 15–20 minutes of her class and scripted
all aspects of the lesson, including what the teacher said and did, what the students said and did,
the appearance of the classroom, and so on. The principal then used this script to guide the
discussion during the conference. The informal conference was not scored. This particular
conference took place in the principal's office. The principal began the conference by saying,
“We are going to talk really fast just to do this informal and then,” Anne finished her sentence,
“finally schedule the formal?” The principal replied, “Yes, is that ok?” They paused for a
moment. During the pause, Anne looked at some of the multiple tasks that were posted on the
principal’s board next to her desk. This To Do chart listed pending items for March and April,
including such things as budget, first draft of matrix, order for teacher appreciation, planning for
SBAC, Junior Prom, Spirit Week, Sports Banquet, have hiring done, plan the museum trip,
prepare for CAHSEE, and so forth. Anne remarked, “Miss, the amount of stuff you have to do
and like the variety of it is just overwhelming.” The principal agreed, “It’s really
overwhelming.”
The principal then opened the conference by asking Anne about her progress with
teaching the play Romeo and Juliet.
Principal: So, you’re getting into Romeo and Juliet?
Anne: Yes.
Principal: How was it? How has it been? What’s going on?
Anne: Great, they love it.
Principal: They’re kind of loving it.
Anne: Yeah. They love it.
Principal: It’s weird.
Anne: They like watch it on their own at home.
Principal: Yeah?
Anne: Today we weren’t reading, because we’re ahead of schedule and they were
like, oh, man, really, no. They really like it.
Principal: I think one of the things is that students are engaged.
Anne: Yeah.
Principal: And I think that they’re also engaged in the content. They want to know
that they know Shakespeare …
Anne: Yes. Yeah.
Principal: … which I think is cool. I was sitting with Ruben and the other little girl.
Anne: I’m trying to think. You were in period five.
Principal: I was next to the window and it was Ruben and the skateboard kid. He has
a skateboard. He is really clean cut.
Anne: Gustavo, I’m sorry. Yeah, Gustavo, yeah.
Principal: The other little girl was next to him.
Anne: Yeah. Jasmine, maybe?
Principal: Yeah.
Anne: Yeah.
Principal: Then he was all like, oh, yeah, Jasmine.
Anne: Yes. Mm-hmm (affirmative).
Principal: Then they were looking up the diction and it was so funny because he was
like, I know about this, and I was like, oh, have you read it? He was like,
no. I go, but it’s like from everyday life. I was like, oh, like pop culture,
like cultural literacy or whatever? He was like, well, yeah, because, for
example, feuding families.
Anne: Right. Yeah.
Principal: Because they were talking about the prologue.
Anne: Mm-hmm (affirmative).
Principal: It was just funny that I said, yeah, Shakespeare is like an archetype within
our culture. So you’re going to know a lot of stuff that you didn’t know
you knew.
Anne: Cool.
Principal: He was pretty happy about that, but that’s when Mr. Shine came and got
me out. Remember?
Anne: Yes. I do.
Principal: He was like, oh, and he was very interested.
Anne: Yeah.
Principal: So, it’s really hard because you have some tough kids who like to be off
track, but also are very interested in learning.
Anne: Yeah. I just have to use strategies to get him …
Principal: Redirected?
Anne: Yeah and an occasional task. You know?
Principal: Mm-hmm (affirmative).
Anne: Do this for me over there and then come back, and it just helps, but
he’s …
Principal: Yeah.
Anne: … tough.
Principal: Well because he’s attention deficit.
Anne: Mm-hmm (affirmative).
Principal: I think one of the things when you think about it for him is this idea of
planning ahead those things.
Anne: Ok. Oh, ok. Like, building them in?
Principal: Even on a Post-It for him every day.
Anne: Yeah?
Principal: On the days that you have 4th, 5th, 6th.
Anne: So like what kind of stuff …
Principal: Even redirection in the class and out of the class …
Anne: Ok.
Principal: Because I saw him, for example. He was in Amber’s reading class. I was
watching him the other day. It was like really simple mental redirections
for him. Do you know what I mean? For example, he likes to talk out. He
doesn’t know how to raise his hand. It’s almost like, ok, Gustavo, today I
want to call on you, but I’m only going to call on you three times. So you
have these three cards. Pick them. Use it …
Anne: Got it.
Principal: … when you need to, when you really want to say something. I’ll never
ask the question.
Anne: I’ll only acknowledge it if you raise your hand.
Principal: Yeah.
Anne: Yeah.
Principal: Or just give me the little Post-It.
Anne: Ok.
Principal: Give me the little whatever, because then he focuses. It’s like Jose does
that, too.
Anne: Task completion?
Principal: Yeah.
Anne: Yeah.
Principal: He focuses on I have three so I’m going to do three, and then it helps him
to not just be all over the place. But I think that he’s also super creative so
with him it’s a balance of—what is it called—validation that he’s here. A
lot of it is just proximity and putting your hand on his shoulder and being
like, Ok. I’m here near you. I acknowledge you. I’m not going to call on
you. It’s those three impulse conversations before a class where like, dude,
I’m not going to call on you every time. So you need to pick your three
times and then when you earn it you get a reward. Do you know what I
mean?
From the outset, the setting for the conference and the principal’s opening remarks
undermined this potentially rich interaction with her teacher. The conference took place in the
principal’s office, where the power resided with the principal, rather than the teacher’s classroom
or a neutral location where the teacher might have felt more comfortable opening up about her
practice. The imposing To Do list, a constant image of the “overwhelming” and ongoing nature
of the principal’s work, served to remind the two participants of the urgency of this moment,
reinforced by the principal’s push to “talk really fast.” This environment was set up by the
principal, intentionally or not, in a way that sent a confusing, conflicting message to the teacher:
the principal showed by her actions that she did not truly value the use of data, because she did
not have time to give the teacher the attention needed to review the data. It also demonstrated a
recurring aspect of the principal’s leadership style: the tension between time and depth.
The principal created so much forward motion in her work that it made it nearly impossible for
her to settle on or pay close attention to any one thing. This sense of urgency played itself out in
the larger data set as well.
Apart from the setting, the principal’s initial questions had the effect of limiting the scope
of the conference, directing it toward a focus on student behaviors, not on instruction. “So, you’re getting into Romeo and Juliet?” she asked. This question immediately narrowed the focus of the discussion and set a frame for the teacher to talk about the students’ experience with the play, not what Anne might have been thinking about in relation to her instructional choices.
The principal made no reference to the observation, to prior goals or expectations, or to the
teacher’s thoughts about her instruction. Again, she diminished the importance of the data.
Instead, the principal’s questions, “How was it? How has it been? What’s going on?” triggered a response from the teacher about the students’ experience with the text. “Great, they love it,” responded Anne. Once the
conversation was set in this direction, the principal never redirected it toward instruction.
Instead, the principal continued in the same vein, focusing the conversation on one off-task
student in the class, Gustavo, and toward the strategies she identified for the teacher as the best
way to respond to this student. She said, “I was next to the window and it was Ruben and the
skateboard kid.” “Gustavo,” the teacher corrected her. Here was evidence again of how the
principal’s own experience and perspective trumped the data she had ostensibly collected. This
suggested that she did not really know what to do with the data.
Without picking up on the teacher’s initial comment, without probing or investigating it
in any way, the principal immediately delved into the students’ experiences and focused on
something that was of particular interest to her, not the teacher. She seemed to assert that she not
only had more knowledge and insight into Anne’s students than Anne herself, even if she did not
remember the particular student’s name, but also that she was capable of building instant rapport
with all students, inspiring “tough kids” like Gustavo, in particular, to make personal connections
with Shakespeare. She told him, “Shakespeare is like an archetype within our culture, so you’re
going to know a lot of stuff that you didn’t know you knew.” With this interaction, the principal
implied that she had successfully helped Gustavo to be “engaged” with Shakespeare by making
real world connections to this challenging piece of literature. She also used this opportunity to
reinforce her image as super teacher.
Next, she told Anne, “So, it’s really hard because you have some tough kids who like to
be off track, but also are very interested in learning.” According to Sofia, they just needed to be
“redirected” to keep them on task. Implied in this interaction was that Gustavo easily got off
task, and Anne did not know how to adequately redirect him. The principal gave her the
strategies that would work with a student who she, again, knew more about than Anne. She said,
“He was in Amber’s reading class. I was watching him the other day. It was like really simple
mental redirections for him.” The principal emphasized her ability to be uncommonly aware and
somewhat omniscient, able to observe students across classrooms and identify the best
instructional approaches for teachers because of this knowledge. Again, the principal reinforced
her image as an instructional leader with exceptional knowledge and expertise.
In spite of the principal’s assertions during her interviews that the “informals” provided
teachers with an opportunity to demonstrate progress in their selected areas of growth and also gave them an opportunity to use data to reflect on their practice, the evidence showed otherwise.
First, the principal constrained the amount of time that could be given to the teacher, which, in
and of itself, limited the possibility for reflection. Then, instead of focusing on instruction, the
principal focused on student engagement as it related exclusively to her interaction with one
student. From the outset of this brief conversation, the principal geared the conversation toward
student behaviors and away from instruction, identified the problem for the teacher, and
determined what the solution would be. The teacher was never given an opportunity to talk
about what she felt was important. The principal remained focused exclusively on her
perception of the behavior of one student and did not ask Anne what she thought was going on
with that student. Instead, she told Anne what the problem was and then told her how to
respond.
The principal then continued to narrow the focus of the conversation to possible remedies for the off-task behavior of one student:
Anne: What about the calling out?
Principal: Yeah. He has to control that.
Anne: Ok.
Principal: Do you know what I mean?
Anne: Yeah. Give him like concrete ways to …
Principal: Yeah. It’s almost like, if you don’t call out in the next half hour, I’m
giving also a ticket. I’m giving you a whatever.
Anne: Yeah.
Principal: At the end of class, give me all the ones I gave you, because that equals
this many points.
Anne: Ok.
Principal: It has to be associated with a reward. It has to be immediate today, not be
like end of the week. It has to be today. In this half hour you didn’t call
out once. You get this.
Anne: What about the chit chat with the people around him?
Principal: Yeah. That’s another thing.
Anne: Yeah?
Principal: We got to work on one thing at a time.
Anne: Ok.
Principal: I guess. What’s more important to you?
Anne: Probably actually chit chatting with people around him because that’s
when he’s off, because if he’s calling out, at least he’s engaged with what
we’re doing. Do you know what I mean? Even though it’s disruptive, but
he if is chatting with other people, then he is getting them off task, too,
and he’s not doing what we’re supposed to be doing.
Principal: Right. Let’s start with that one then.
Anne: Ok.
Principal: We’re going to be like focusing on the chatting. Right?
Anne: Mm-hmm (affirmative).
Principal: You can’t really put him with any partner that is not going to focus that
chit chat with or can you? Partners are good …
Anne: Yeah. No. I’m trying to think. I just moved seats yesterday. I’m trying to
think who I have him with. There are certain girls that he sort of bothers a
lot, so he’s far away from them and I think now he’s with some quiet kids.
He doesn’t chit chat with the people right around him, but he’ll sort of
like, can I go sharpen my pencil? Maybe it’s just like, you cannot get out
of your seat, but I’ll give him one task in a class period that allows him to
get up and move around, but other than that, he can’t get up out of his seat.
Principal: Yeah.
Anne: Ok.
Principal: It’s like a negotiation for a reward.
Anne: Ok.
Principal: Then what happens is over time …
Anne: Yeah?
Principal: … like over years, he’ll turn into Jamal where he doesn’t need the reward
anymore. The reward is intrinsic.
Anne: Right. Ok.
Principal: When he’s younger, it’s like he can’t control.
Anne: Right. I really like him. He’s a great kid.
Principal: I like him, too.
Anne: Yeah.
Principal: He needs to earn it.
Anne: Yeah. Ok.
Principal: I saw him earning with Mrs. Brown. He was earning his points.
Anne: Ok.
Principal: She has little stickers, little raffle tickets. Then she gives raffles at the end
of the week, and then that’s what they are, but that’s just all of them is like
that.
Anne: Right.
Principal: It’s like way harder.
Anne: Ok.
Principal: It’s a smaller class, but they’re seven kids who have attention deficit. It’s
crazy.
Anne: Yeah. When you get on, he’ll be up for it, I think.
Principal: Yeah. I think he likes the roles. He’s a good reader.
Anne: Yeah. He is a good reader.
Principal: When he is reading, if he has a role every time, he’s going to stay on task.
Anne: Ok.
Principal: If he doesn’t have a reading role, he’s not.
Anne: Ok. Give him stuff to read aloud?
Principal: Yeah.
In this excerpt, the teacher assumed the role of learner, looking to the principal, who had clearly established herself as the expert in the relationship. Anne neither took control of the conversation nor was she ever given the opportunity to do so. She placed herself in a subservient position, as someone who was knowledge seeking and comfortable being on the receiving end of the discussion. The principal did not position Anne to be proactive or reflective.
At this point of the conversation, Anne looked to the principal for advice on how to
respond to other off-task behaviors of this same student, Gustavo. “What about the calling out?”
she asked. Without hesitation, the principal jumped in once again to offer the solution: give him immediate rewards for on-task behavior. The teacher gently pressed again, “What about the chit chat with the people around him?” Here the principal gave her a small choice: “What’s more important to you?” The teacher asserted, “Probably actually chit chatting.” For the first time, the teacher was allowed to select and even offer a solution: give him one opportunity to get up during class if he remained on task the rest of the time. The principal validated this idea by rephrasing it, “It’s like a negotiation for a reward.”
Sofia then mentioned another student who was successful after repeated efforts with this
type of positive reinforcement and another teacher who established an effective reward system
with even tougher students, once again pointing out her successes with students and teachers, not
acknowledging Anne’s successes or building her capacity to address student behavior issues or
instructional dilemmas. In this exchange, Anne was given a little leeway to suggest small tweaks
in her practice that were still focused, in a very limited way, on one student’s off-task behavior.
In the next part of the conference, the principal asked Anne, “What else are you working
on with the kids?” Anne responded:
Anne: Well, I just started with our new seating chart. I have a different student
every day that’s keeping these tally marks about who is sharing.
Principal: Good.
Anne: Yeah and I told them this because, I’m trying to build in more pair shares.
So I told them that this week and next week we’re just gathering data.
We’re just researching and then I’m going to show them, so they can see
who is talking and how much, and who is not talking. And then after that
I said, it’ll start being a part of your grade. Now, we’re just practicing the
new system.
Principal: Good.
Anne: But it doesn’t count yet. And then once we look at it, then it will start to
count. And so I told the person who is keeping track to stop after five, but
then you can get a real clear picture.
There are kind of three groups. There are kids who get their five checks in
the first 5 minutes, and then there are kids who say nothing, and then there
are kids who maybe have one or two checks. They’ll raise their hand and
get in their answers so they’ve spoken, but then they don’t anymore.
Principal: Which is like the 70 percent kids (The C grade kids who are ok with doing
the minimum).
Anne: Yeah. I don’t know. They’re pretty talkative. A lot of kids talk a lot, but
…
Principal: They’re like… I love them.
Anne: Yeah.
Principal: They grow on you. Don’t they?
Anne: The 9th grade? Oh, they’re great.
Principal: I can’t believe I’m saying it, but they’re growing on me.
Anne: They really are. They’re fun and nice and they get excited. They’re nice
kids. I like them.
Principal: Yeah. I have for you … Look what I did. I’m so proud. I made this table
for every teacher… for your focus areas out of your goals for your IGC
(Interim Guiding Conference), and then what am I rerating for second
semester.
Anne: Got it. Very organized.
Principal: I’m trying. Look at my mess, but it’s organized. You talked about a new
seating chart after 10 weeks. (The principal reads a chart with teacher
goals listed on it.) Mix up the seating that students are used to …
Anne: Yeah.
Principal: … and encourage new and different students to participate more. That’s
what you’re doing; Awesome. (The principal continues reading from the
goal chart.) Then my second goal is to incorporate one pair share plus cold
call for every whole group discussion, roughly once per class period,
strategically plan for a question that is answerable, and then circulate to
preview answers and then call on students who are reluctant to share. That,
I think, is working to this. It’s awesome.
Anne: Yes.
Principal: Right now, we’re looking at 2.3B, which is Student to Student talk. We’re
going to take care of that. 3.2A, which is the whole lesson. Right? What’s
the pacing of the lesson? What’s the structure of the lesson? Do I have
enough pair share plus cold call, and then do I end with a proving behavior
that is incorporating the close reading or the writing? These ones are all in
the discourse so the questioning, the group structures and the discourses of
the group structures, and then the 3.4A is the feedback. I feel like you’re
doing everything that’s leading you towards moving those to the right.
Anne: Mm-hmm (affirmative).
Principal: The 3.4A is the feedback.
Anne: Feedback on student work?
Principal: Well, just two kids out loud.
Anne: Oh, ok.
Principal: We have, let me see. This thing is so big. Ok. (The principal continues to
read from the goal chart.) Checking for understanding and adjusting
instruction. The teacher checks for understanding using different
techniques throughout the lesson to yield actionable data on the students’
progress and the teacher adjusts whole class instruction based on that data.
I feel like this is still what you’re thinking about.
Anne: May I look at them?
Principal: Sure.
Anne: It’s this one, 3.4?
Principal: A.
Anne: A. Ok.
Principal: It’s not the feedback, but the checking for understanding.
Anne: Oh, checking for understanding. Oh, Ok. Sorry. I put 3.4A?
Principal: Yeah. Well, I think you didn’t put it. I think it was probably the two we
have to do.
Anne: No. I didn’t. I didn’t know I need to.
Principal: Yeah. You put it.
Anne: Ok.
Principal: Do you want me to look again?
Anne: Sure.
Principal: Sure?
Anne: No. I believe it. Wait. Yeah. No. That makes sense that I would put that.
Principal: Yeah because I think it is in line with what you’re doing.
Anne: Yeah. It is. Yes, because if I just listen to the first three kids that answer it,
I’m not …
Principal: You’re not …
Anne: … checking for understanding. I’m just letting someone answer. Yeah.
No. That makes sense that I have chosen that. Ok.
Principal: And I think when we talk about different techniques, you’re doing a lot of
it in the verbal …
Anne: Yeah?
Principal: … but I think also we need to think about: What about the exit slips? What
about the different ways I can put the Post-Its?
Anne: Or they’re like, “One through five, do you understand that kind of thing?”
Principal: Yeah and it’s also just what they know on their own, not just what they
know. So like when you’re asking a question and you have 10 people who
can answer it and there is 25 in the class, how do you know the other ones
know?
Anne: Mm-hmm (affirmative).
Principal: It’s like speaking about and you’ve done a lot of really cool things with
the close reading with the quotations, with the doing the charts, with
everybody finding the right either fact or piece or quotation to support the
topic sentence, or the inference. Right?
Anne: Yeah.
Principal: All of those are ways that you’re checking for understanding.
In this part of the conference, the principal asked, “What else are you working on?” After seven
minutes of the 20-minute conference had passed, she finally posed a question to Anne about
something she was doing to work on her instruction. This question did trigger Anne to talk about
her instruction if only in a superficial way, explaining that one of her goals was to increase
student participation. As Anne referenced her own attempt to engage in a self-styled form of
data-driven research by creating a chart to tally how frequently students respond to questions
during classroom discussions, the principal’s attention seemed to diminish. The principal
appeared to be pleased that Anne had initiated a data-driven experiment to hold her students
accountable for their participation in class, an act that may well have been inspired by the
principal’s own expectations regarding data use, and yet her interest seemed to wane. “Good,”
she said. Once again, she did not probe Anne’s thinking on this or any other topic.
Notably, as soon as the conversation shifted to Anne’s work, Anne’s voice was much
more prominent. Anne talked about her effort to use data to inform her instruction, albeit to
accomplish minor tweaks to practice regarding student behaviors. She seemed proud of
gathering data, mirroring the principal’s behavior in other settings. It seemed expected that
Anne would collect data and use that data as evidence that students were talking more. This was
consistent with how data were used at this school as an important aspect of instructional
conversations and was representative of the larger data set. Yet, the conversation remained
shallow and the principal contributed very little.
The data Anne gathered informed her about a dilemma she had with her students
regarding their participation in class discussions. Anne explained that there were three types of
students: those who frequently participated, those who did just enough to get the credit (i.e., the
70% kids), and those who did not participate at all. The principal only seemed to be half
listening at this point. She responded with a comment about the 70% kids, implying that she
once again had the right answer. Anne contradicted her by saying that those kids were actually
quite talkative. At that point, the principal changed the subject. She seemed to be disappointed
that she was not the one to solve the problem. She also only seemed to attend to part of Anne’s
concern. She then stated, “They’re like, I love them.” Her attention focused once again on the
students. She did not engage with the teacher’s problem of practice. When the instructional or
behavioral concern did not emanate from her, the principal’s level of investment shifted. As
soon as she became the recipient of information, she became very passive. She did not ask Anne
for clarification or elaboration. Nor did she probe for reflection, reinforcing the idea that if the
conversation was not driven by the principal, her investment in it decreased. The quality of the
interaction diminished when the teacher started from the position of what was important to her as
a teacher. Here the principal had a rich opportunity to engage in a potentially powerful
discussion about how to increase student engagement in Anne’s classroom. Anne was wrestling
with the implications of the data she collected, yet the principal’s response was brief and
superficial.
Instead of exploring Anne’s thoughts, she immediately shifted the attention to a data table
she was proud of making for the teachers. From an instructional leader’s standpoint, this
interaction demonstrated that Sofia was a principal-centered leader who was reluctant to release responsibility to the learner, or did not know how. If she was not driving the discussion, she did
not engage with the teacher at the same level, and the teacher was not able or willing to press her
to address her concerns. Their relationship was top-down, hierarchical, and lacking in substance.
Next, the conversation on goal-setting, ostensibly one of the most important parts of the
informal conference, was, instead, confusing, muddled, and unproductive. As the principal
reviewed Anne’s goals, she noted that one goal Anne purportedly was working on required her to
incorporate “one pair share plus cold call” for every whole group discussion she had in a class
period. She then reminded Anne of other goals that she presumably agreed to work on. The
principal stated, “Then my second goal is to incorporate…” At this point, it was unclear who
had set this goal: Anne or the principal. Furthermore, some of the goals mentioned seemed to be
compliance-driven, required by the school district, and others seemed to be selected by the
principal.
Anne, apparently, did not have much control over which goals she would work on. She
could not even clearly remember the specific goals she selected, particularly the one entitled “Checking for Understanding and Adjusting Instruction.” Anne was unsure. “Do you want me to
look again?” asked the principal. “No, I believe it. That makes sense that I would put that.”
Anne appeared to be confused and not in control of the process. It seemed that she selected at
least some of these goals at one point in the year, but there had been no concerted effort on the
part of the principal or others to encourage her to revisit her goals or recommit to working on
them.
In sum, this conference lasted for approximately 20 minutes, which seemed quite
extensive given the principal’s press for time. The first eight minutes were focused on the off-task behavior of one student and the principal’s strategies for how to address it. The next seven
minutes were focused on the teacher’s attempt to present a superficial problem of practice and to
discuss her self-selected goals, which were hazy for Anne. The last five minutes were taken up
in finding a date for the formal observation. In the end, very little was accomplished in this
conference. It was the principal who owned the intellectual work in this process. The principal
reinforced Anne for the goals she set, validating her for the progress she made, ultimately
owning the process of using data to drive learning and improvement. She did not enable Anne to
own it. She did not equip her or demand anything from her other than a few minor tweaks in her
practice. Likewise, there appeared to be little buy-in from the teacher. The principal
summarized the conference by saying, “I feel like you’re doing everything that’s leading you
towards moving those to the right,” implying that Anne would receive higher scores on the
indicators in her upcoming formal scored observation. It was unclear why Anne would deserve
or merit this increase. She was not set up to reflect on the quality of her instruction or to
demonstrate how she had made improvement in her practice.
Even though the principal clearly made an effort to demonstrate her understanding of
what it meant to be an instructional leader and attempted to display her ability to link data use to
teacher practice, particularly in the informal principal-teacher post observation conference, the
influence that she had on practice appeared to be superficial at best. As soon as the conversation
shifted to the teacher’s practice, the principal took a back seat. Her contribution to the
conversation focused more on the procedural aspects of the teacher evaluation process and less
on instruction. At key moments, she seemed to revert to the rubric script. She was not well
positioned to ask the teacher to engage in a reflective act and did not seem interested or invested
in doing that. In the end, the teacher gained some small insights, some small tweaks of her
practice, but it was the principal doing all the intellectual work. She scripted the evidence, she
analyzed the evidence, she positioned herself to be the only one to give feedback on the
evidence, as minimal as that was. She did not ask the teacher to think about her own
understanding of the evidence or how it might be used to inform her practice, and she did not
attend to the teacher’s concerns about her practice.
Just as she did in the context of the informal observation, the principal practiced the same
approach in the formal observation, portraying herself as the manager and driver of instruction,
the person with the most knowledge and expertise, and yet, she continued to carry the weight of
the intellectual work and did not build the capacity of her teachers to reflect on their practice or
to critically examine their pedagogy in a way that would lead to instructional improvement. In
her role as manager and driver of instruction, Sofia set up an extensive, thorough, demanding
evaluation process that required each teacher to prepare for two detailed formal observations
annually. Within the rigorous system of teacher evaluation mandated by the school district, each of the three administrators evaluated approximately 10 teachers annually. The
designated evaluator conducted one formal observation and four informal observations for each
teacher each semester, a requirement that exceeded the already-demanding expectations of the
school system in which they worked. She consistently presented herself as an efficient manager
of instruction, as a leader who went “above and beyond” others’ expectations, and the way she
structured the teacher evaluation process was evidence of this. This structure allowed her and
her fellow administrators to be in control of the process and of the knowledge. Further, as a way
of managing instruction, Sofia utilized the 29 indicators represented in the rubric to give teachers
an opportunity to select goals and identify areas of need and focus.
Ultimately, the data she gathered from the teacher evaluation process set teachers up to
make no more than superficial changes in instruction in a knowledge for practice approach
(Cochran-Smith & Lytle, 1999) that did not enable them to critically examine their practice or
their pedagogy and instead treated them more as the recipients of the principal’s knowledge and
experience. In this section, I examined the principal’s role in the formal evaluation process,
providing evidence to show how, despite good intentions, the principal undermined her ability to
build the capacity of both teachers and administrators to use data effectively to improve practice.
She also tended to focus on student behavior changes and not on helping teachers discover,
examine, and reflect upon problems with their practice. Furthermore, she tended to position
herself as the knower, while her teachers acted as grateful, passive recipients of her feedback.
The English teachers’ descriptions of their experiences in the evaluation process aligned
with these assertions, expressing that their knowledge was not particularly valued, respected, or
even probed. When asked about her experience with teacher evaluation, Miriam replied:
“This is one area where teachers, in my opinion, aren’t valued as experts per se or experts
in their class. It’s more like the administrator comes in for a total of 45 minutes and
suddenly knows everything.”
Another teacher, Sue, likewise, painted a picture of the all-knowing administrator offering advice
to the grateful, yet deficient teacher in dire need of resources and strategies. In her interactions
with the principal during the evaluation process, Sue described the principal as “nurturing” and
“encouraging,” someone who gave her constructive feedback, someone who made her feel
“renewed,” and in a frame of mind of wanting “to improve.” However, in these same
interactions, Sue was always the receiver of information. She was never asked to participate in
its construction or formation. She never contributed to her own learning, nor was she expected
to. When asked to talk about the feedback she received from the principal, she stated:
“Personally I look at her feedback for…just in the class, her observations in the moment
when we’re discussing that in the post-observation meeting. We’re also setting goals at
the same time, so it’s built into our evaluation system that we have to set goals on that
data. Based on what she’s telling me, we outline the different sections of the rubric that I
want to work on for next year. I set those goals and I write them down in our little portal
that we have, our online portal. Then we start to use that to plan our instruction.”
From this description, Sue communicated that the principal was keeper of the knowledge, and
the teacher’s position was to be the recipient of that knowledge rather than a participant who
constructed that knowledge. In reference to these post observation conferences, Sue further
stated:
“I look forward to them in the sense that I’m interested to see what she saw and what she
has to say. It’s like I do feel sometimes somewhat a little bit nervous. Just to see, oh, did I
think I did well and then she’s going to think I didn’t do very well.”
In this description, Sue portrayed the principal as someone who did not encourage her to reflect
on her pedagogy. Sue was very much positioned in a knowledge for practice mindset, one that is
the least likely to have an effect on teacher practice change (Cochran-Smith & Lytle, 1999). Sue
also indicated that she relied heavily on feedback from the principal to validate her instructional
choices, to help set her instructional goals, and to get advice on what kinds of tools or strategies
she should use to improve student behaviors in her classroom. She waited to learn what the
principal “saw” and what she had to “say” and how she would react. Improvement of practice resided at the principal’s level, not at the teacher’s.
Sergio described a similar experience working with the principal in the teacher evaluation
process. In describing the principal’s approach, he stated:
“She’s honest, and she’s thorough, and she’s knowledgeable, and she’s willing to help,
and she is very insightful, specifically with me. She talks about great English, so she
gives me her knowledge on her experiences. I have a very open relationship with her, and
honest.”
He recalled the interaction he had with her in a pre-planning conference for the formal
observation:
“She was able to help. For my formal observation I’m going to be doing a brand new
activity which is called the fishbowl, which is a discussion amongst five and six kids at
the center of the classroom. The rest of the class are active participants just by note
taking. They really can’t engage in conversation. I’ve never done that activity. She
basically explained the ground works of how to set it up, how to begin it with a round
robin question, then allowing the students to initiate their own conversations by allowing
them to start with their own questions as opposed to the questions that I give them. She
also helped me create questions that would lead to deeper and richer conversations. That
was basically the gist of it, the questions themselves that she helped me. The
conversations that I want them to focus on is on this concept of forgiveness. We’re
reaching a point in our Holocaust unit where a dying SS officer is asking a Jewish
prisoner for forgiveness on his death bed.”
In this description of the formal evaluation process, Sergio experienced the principal as the one
with all of the expertise and knowledge, first helping Sergio set up the fishbowl, and then
offering her insights and ideas about how he could do it better or more effectively. She told him
to start it with a round robin and then suggested that he continue with the students’ own
questions, not the questions he formulated. She did not ask him to reflect on his practice, to
think about how the students would respond to the fishbowl, or to come up with his own insights
about what issues he might face with this activity. Like Sue, Sergio was the recipient of the
principal’s knowledge. He was not asked to construct his own understanding of what might
occur in his classroom or to consider how his choices might influence student learning. When
asked if her feedback was helpful or relevant, Sergio responded:
“She gives me a heads up of possible drawbacks. The questions themselves. She saw them and some of them were pretty flat. She immediately addressed it and helped me change the questions.”
Again, Sergio experienced the principal as the one with the knowledge and himself as the
intended receiver. Consistent with Sergio’s representation of the interaction, the principal likewise
represented herself as the keeper of the knowledge and the teacher as the recipient of that
knowledge. She stated:
“Let’s say for Sergio, who I’ve been coaching for the whole year. He’s really working on
a couple of things, the student to student discourse, student facilitated conversation,
student questioning. Are they asking each other the right question? Am I asking them the
right questions to get them to think on a deeper level? I’ll be looking for that so that I can
give him information. I went in for his formal last week. He wanted to do a form of
Socratic seminar, but he knew that what he really wanted to do is have the kids observing
kids who are really good at discussion, to see what they saw. He set up a fishbowl
conversation, for the majority of the kids to be on the outer circle and then some kids to
be on the inner circle. I met with him ahead of time. He showed me his questions. I gave
him some input, like, I think this might be stronger. You might get at what you’re looking
for with a different kind of question. I kind of gave him feedback on that. He redid his
questions. We went in and he had the students do the seminar. He said, Well, I’m going
to create a hot seat for kids. If they want to tag-in and tag-out. I’m going to have an empty seat, so if kids are feeling really inspired and they want to talk, but they’re not in the circle, I’m going to give them the opportunity to go in and be able to either ask a question
or make a comment. He did that. Three times other kids from the outer circle got to do
that, which is cool because they wanted to talk. It was all about forgiveness and the
power of forgiveness. I have his script. This week is my post conference with him. I have
to go back in, have to look at his script, have to bucket the areas where he wants to be re-
rated to focus on his areas of growth. Then give him feedback on that.”
At the beginning of Sofia’s response, she stated that Sergio was working on enhancing student
discourse. Are the students and teacher asking the right questions “to get them to think on a
deeper level?” she wondered. She said, “I’ll be looking for that so that I can give him
information.” Once again, she carried the weight of the intellectual work. She was the dispenser
of knowledge. Next, she talked about helping Sergio set up a fishbowl activity on the power of
forgiveness. To begin, the principal zeroed in on the activity Sergio wanted to do without asking
him how he wanted to set it up or what concerns he might have had about it. Instead she
proceeded to tell him what she knew about how to strengthen his approach to this activity. She
rewrote his questions for the activity in a way that she believed might help the students think
more critically. There was no evidence in her representation of this particular interaction or in
any other that the principal made a concerted effort to help teachers tap into their own
knowledge, reflect on their practice, or consider their own thoughts about how to improve it.
Furthermore, there was a disconnect between the principal’s representation of this
process and the teachers’ perceptions. The principal depicted a smooth, trouble-free evaluation
process, but the teachers viewed it differently. The principal believed that she was providing her
teachers with valuable data to help them improve and perfect their practice. In contrast, the
teachers described a stressful, pressure-filled, time-consuming process, particularly related to
completing the lesson plan and other online forms. They recalled their interactions with the
principal as pleasant, “nurturing,” and informative. However, in virtually all of these
interactions, they described the principal as dominating the conversation, recognizing only her
own ideas and thoughts about the data that were collected, deciding by herself what needed to be
done to fix the instruction, and determining on her own the next steps for the teacher. The
teachers acquiesced, willingly defaulting to the principal’s ownership of the process. They,
likewise, did not seem to be aware of how to tap into their own deep progressive knowledge of
instruction, nor were they guided in any way to do this.
Consistent with the way that the principal portrayed herself in the evaluation process, she,
likewise, acted out her role as the manager and driver of instruction in the way that she structured
professional development. She became the dominant force in professional learning settings, the
person with the most knowledge and expertise, but also the one who carried the weight of the
intellectual work and did not build the capacity of her teachers to reflect on their practice or to
critically examine their pedagogy in a way that would lead to instructional improvement. Sofia
used the weekly PD time to establish herself as the person responsible for leading instruction in
the school and as the person who drove the instructional agenda. She positioned herself as the
leader of instruction in this forum by taking control of the 90-minute weekly prime time slot and
by putting herself at the center of the PD, setting the tone, dominating the air time, and
responding to very few questions or ideas that were not a part of the highly structured set of
activities. Likewise, she portrayed herself as integral to the instructional program by
participating actively in the PD, sitting alongside the teachers during the break-out protocol, and
contributing to their conversations. She wanted to be perceived as someone who “walked the
talk” and modeled the kind of commitment and dedication she wanted to see in her teachers with
respect to their behavior in PD.
The principal articulated what it meant for her to be an instructional leader during PD
time. She pointed out the importance of having credibility with her teachers as someone who not
only talked about what they should do but also could actually show them. It was clear that she
believed that she bore the responsibility to be the expert and the keeper of the knowledge. She
also mentioned the importance of continuing to seek ways to deepen her expertise, to model, in her words, “a growth mindset,” and to be “the face of learning in the school.” However, she never
mentioned the importance of building others’ capacity, nor did she talk about her responsibility
to help teachers construct their own knowledge about their practice. The principal reinforced
this understanding when she described her role in PD:
“We talk about your ability to interact, coach, respect, and elevate your teachers. I feel
like that is your authority as an instructional leader. It comes from, it’s not about
reputation. It is about walking your talk. I feel like you have to be able, a teacher has to
be able, to believe in your expertise as a teacher in order for you to be an instructional
leader. That comes from, and I really believe that professional development is your
vehicle to be like a college professor, to be able to teach methodology courses…All of
that is separate from the heart of the matter we call it, which is teaching and learning.
How do you prove to your teachers and your kids that that’s what you’re about? You
have to prove it by literally putting in the time and being the face of learning in your
school.”
In this excerpt, the principal compared her role in professional development to the stand-and-deliver model of a college professor. She articulated a knowledge for practice approach to
structuring PD. She also spoke about how important it was for her to be perceived by her
teachers as an expert, a leader with the knowledge to come in and take over the classroom at any
moment in time. Never did she mention the importance of helping her teachers tap into their
own knowledge. She further stated:
“If I were still in the classroom, and I think what makes me feel proud to be an
instructional leader, is put me in that room right now today, and I will do that for you. I’ll
show you. I think that they know that. Therefore, they listen to my feedback. Anyway,
my point is, I think that, as leaders, you have to continue to build your expertise in
instruction. That’s why I’m going to get my doctorate also. Because, otherwise, what
you have to say, your feedback is irrelevant. I want to always be relevant.”
Again, she emphasized the importance of establishing her credibility as a teacher in order to be
revered as an instructional leader. To be viewed as the expert, as the one who could provide the
most valuable feedback, she spoke about the need to continue learning so that she could remain
“relevant.” Furthermore, she spoke about continuously seeking ways to build her own expertise
and knowledge, but did not mention building the knowledge or capacity of others.
To conclude Subtheme 1a, I presented evidence from interviews and observations
affirming the principal’s portrayal of herself as the manager and driver of instruction, and as the
person with the most knowledge, expertise, and capacity to lead instructional improvement. In
her interactions with teachers in both the teacher evaluation process and PD, this ownership of
the work and of the knowledge never allowed her to make more than superficial, incremental
changes to teacher practice. I now move to Subtheme 1b.
Subtheme 1b. The principal believed in and promoted the use of data but did not
incorporate it effectively into professional learning experiences in ways that would improve
instruction. The principal believed in the use of data and was clear about promoting its use with
her staff. She expressed this belief through her words and her actions. In interview and
observation data, the principal reinforced her belief that data were an essential component of her
work by talking about the importance of “being data driven” and engaging in “data driven”
discussions. In fact, data were the focal point of nearly every interaction that the principal had
with her staff. Likewise, data-driven planning and data-driven decision making were considered
core beliefs promoted by the larger district and were key aspects of the district-created lesson
plan template that Sofia used with her staff. She advocated for the use of data not only to support
the message of the larger district but also to serve her own instructional agenda, impressing upon
her teachers the importance of using data to improve practice. In placing data front and center,
she highlighted its importance in the community, modeled its use, and showed that she expected
her teachers to use it as well. In particular, she engaged teachers in the use of data to drive the
teacher evaluation process and to structure conversations in professional development. It was
unclear how much training and support she received from the district to work with data
effectively. Below I present evidence to show how the principal used data in a complex,
ultimately ineffective, effort to improve instruction in both of these areas: teacher evaluation and
professional development.
The principal demonstrated her investment in the use of data as a vehicle for supporting
teachers’ improvement in practice through the way she described her approach to the teacher
evaluation process. She explained how she uploaded data to a website at multiple points in the
process to make it available to her and her teachers as a means for improving practice. She
stated the following:
“Every teacher is evaluated every year. We do two formals. According to the union
contract, it’s two informals and a formal, which is required, is minimum. We, at our
school, do four informals and a formal. We get to do two that we upload onto the website,
and then two that we just provide to the teachers out of support of opportunities to
observe and coach them.”
Here the principal explained the mechanisms for capturing data in the teacher evaluation process.
These included scripting four informal observations and two formal observations for each
teacher. She instituted a policy whereby her school went “above and beyond” the minimum that
the district required, doing four informals rather than two, so that they could have additional
opportunities to use data to “observe and coach” teachers, a major thrust of her instructional
agenda. Notably, her focus in this description was on data collection, not on improving practice.
The teacher evaluation system set up by the larger district did not position Sofia to be successful.
Likewise, the training she received from the district did not appear to prepare her adequately to
help teachers examine their problems of practice. She tried to go “above and beyond” the
district’s expectations, thinking that more would mean better, but the system became so
burdensome that she could not keep up with it or follow through with her own well-intentioned
expectations.
Despite these challenges, Sofia enthusiastically articulated her philosophy about the
importance of data and how data collection might be used in a powerful way to promote changes
in instruction. In her approach to teacher evaluation, she talked about how data were collected at
certain points, helping teachers identify their areas of growth and allowing her to support them in
the process. Data were critical to her approach to using the teacher evaluation process to
improve instruction. She stated:
“I have six scripts on my computer still that I have to bucket. Basically the process is this,
you observe them several times during the semester. During that time, it’s based on the
teacher. For second semester, each teacher has set up two areas. They call them their
areas of growth. Here’s where I want to grow. What I do is I write that down, and I
remember for each teacher. Before I go in, I go, okay, what is it they’re working on
again? I gotta remember by looking at past scripts and then reviewing progress made, etc.
You go in and you do an informal observation between 15 and 30 minutes, and you give
them feedback in those areas. At the bottom of the script for us, they have areas of
strength, areas of growth, next steps. What I do is I fill all that in. Then I email the
teacher on the same day so that they know immediately what I saw. Then, when they
come in ... Every observation has to have a post conference. They'll come in within that
week or five or six days. Sometimes it’s right away. Sometimes it’s in a few days. Then
we pull that script back up, and we talk about it.”
Here, the principal’s belief in the power of data to influence practice change was evident. She
described an in-depth cycle of pre-observation planning, observing, scripting, and post-observation feedback and discussions with each teacher. She also talked about specific data points where the data collected helped teachers identify and target areas of growth. These data
points allowed her to support the teacher in this process. Further, the principal articulated her
enthusiasm for data and her belief in its power, explaining that it was an inextricable part of
engaging teachers in discussions about instruction. The data empowered her to give teachers
objective feedback about their instruction.
The interview data substantiated this finding. For example, one teacher, Sue, talked
about how the principal collected data during the evaluation process:
“She does it every time she has an observation. Each time she comes in, observing
everything and writing things down. It’s mainly during the post-observation meeting that
all of that is put into play.”
The principal clearly demonstrated to her teachers that she valued the use of data to engage in
efforts to improve instruction. She wanted her teachers to know that data were valuable, and yet
she did not incorporate it effectively into professional learning experiences in ways that would
lead to practice change. Sue explained that the post observation conversations always took place
in the principal’s office, again an indication that the principal did not position teachers in the
optimal environment to open up about and examine and reflect on their practice. When asked to
describe her perception of how the principal used the observation data to conduct a post
observation conference, Sue stated:
“I feel pretty comfortable. She is very nurturing in terms of like as an evaluator. At least
from my experience. She’s very encouraging but then also gives me that constructive
feedback to help me. Yeah, I definitely don’t feel like I’m being, you know, torn down or
anything. If anything I feel kind of renewed after. Like, okay, this is what she sees. This
is what I want to improve…I mean I look forward to it in the sense that I am interested to
see what she saw and what she has to say. It’s like I do feel sometimes somewhat a little
bit nervous. Just to see, oh, I think I did well and then she’s going to think I didn’t do
very well.”
The principal’s use of data—or lack thereof—in this example did little to build the capacity of
this teacher to examine her own practice or to reflect on her instructional choices. What was most notable in this example was the lack of data use in principal-teacher interactions. The principal’s
behavior in Sue’s description indicated that she did not point to any evidence of what she
observed. Likewise, she did not ask the teacher to use any data to discuss what happened
during the lesson. The teacher was “comfortable” hearing what the principal had to say, and she seemed to feel that it was her role to rely exclusively on the principal’s knowledge of what she needed to do to improve, without concrete evidence. She was interested to see what the
principal “saw and what she has to say.” Hearing the principal’s perspective made her feel
“nervous,” just thinking about how the principal would react. This interaction was not about
improving the teacher’s instruction. It was more about living up to the principal’s expectations.
Neither the principal nor Sue used data in a productive way to improve Sue’s practice. Part of the principal’s failure in using data was demonstrated by the teacher’s failure to refer to it at all, much less take it up in any meaningful way to inform her thinking about teaching.
When asked to describe how the principal’s use of observation data had influenced her
practice, Sue stated the following:
“I think she’s influenced it in terms of like giving me strategies and stuff. Yeah, I think
that she’s helped me a lot just so far in just our meetings we’ve had. It’s almost like, in a
sense, she gives me too much to think about. You know? She has so much to say that by
the end of it I’m like, whoa. What do I need to focus on again? We just talked about
everything. She’s definitely influenced what I try to focus on just because I do want to, I
guess, to show her that I can improve. She does motivate me to want to be better. There’s
that. A specific example when she was in my classroom, just recently, the last time she
was in it, we were looking at the students’ responses to a writing prompt that I had and
she was giving me some advice on how to make, like, because she saw the activity that I
made, and she thought it was really good. But then she was giving me a specific way to
improve it, like to make it more tangible or something or to better organize it so that way
everyone’s participating rather than just a few. Yeah. For example, she just told me to
start giving, uh, this is just a small example. She’s like, yeah, to encourage more kids to
participate just start giving them extra credit points like at a certain moment in time. For
the next 5 minutes, those who are answering and giving me that feedback, I’ll start
putting in the extra credit points. Something like that. Things to get them motivated or
something. That was just one example that I needed to put into practice. I’m like, oh
yeah, that’s a good idea.”
In this description, Sue could not seem to clearly articulate how her interactions with the
principal laid the groundwork for her to use data to improve her instruction. She started out with
a general statement indicating that the principal gave her “strategies and stuff” in their meetings that had influenced what she might do to improve her practice. In the first part of her response,
there was no mention of the use of data. In the next part, she homed in on a specific example of
data use. They looked at student responses to a writing prompt. Implied in what was said was
that there were not many responses, that the students were relatively quiet. Therefore, the
principal gave her a specific way to increase their participation based on the data. The
principal’s recommendation was to have the teacher make the assignment more “tangible” or to
“better organize it,” but the principal’s final suggestion was simply to give the students “extra
credit points” to increase their participation. From this evidence, examining the data led to a
small tweak in practice, giving extra credit points to increase student participation. The teacher
was never asked and was not well positioned to identify, investigate, or use data to reflect on problems of practice in ways that might have profoundly influenced her teaching. Sue’s experience of this interaction with the principal was that it was overwhelming, poorly directed, and not targeted in a way that made clear how the examination of student work data could be helpful to her. She expressed a desire to improve, but she did not experience data use or data-driven conversations with the principal as useful for making significant changes in her practice.
In the same way that the principal relied on data to drive the teacher evaluation process,
she included it as an integral part of PD but not always in ways that would improve instruction.
In the observation data, the principal demonstrated a notable enthusiasm for using data in PD. In fact, PD time became a regular forum for her to use data in an effort to improve teacher practice. To do this, the principal created opportunities at each segment of PD to put data front and center in her agenda, starting with the introduction to the PD, carrying on through the core activity, and continuing into the wrap-up. By incorporating data in every aspect of the PD,
she showed that she valued and promoted its use to move the school forward, yet, she did not
know how to help teachers use it effectively and in a focused way to identify, expound upon, and
ultimately re-conceptualize problems of practice. In relation to Goldring et al.’s (2009) approach
to instructional improvement (planning to collect key data, communicating the value of the data,
implementing instructional changes based on analysis of the data, advocating new approaches to
address needs, supporting teachers with resources, and, finally, monitoring ongoing progress),
the principal of Star struggled the most in the areas of data analysis, implementation of
instructional changes, and ongoing monitoring. And though she took pride in her ability to be
visible and accessible and to communicate her expectations fluidly, she was not adept at
engaging teachers in reflective conversations, a critical part of communication (Goldring et al.,
2009). In this section, I present evidence from one particular observation to demonstrate the
principal’s unproductive approach to using data.
During one PD, teachers engaged in a Student Work Analysis (SWA) protocol where
they evaluated student work in departments. The objective of the PD read as follows: “We will
deepen our understanding of effective lessons to generate next steps in meeting our data driven
goals.” The SWA protocol was used to help them achieve this objective. This protocol required one teacher to present student work samples to a group of content-area peers, who were to ask clarifying questions and offer critical feedback that would, in theory, “improve instruction.” After an introductory whole-group gathering, teachers moved to different locations to meet with their departments and begin the SWA protocol. For the English Department, Tania was the presenter; Elizabeth, the literacy coach, was the facilitator; and approximately nine teachers were invited to be the analysts.
At the beginning of the PD, during the ritual shout-outs (quick acknowledgements of staff members for doing something special or extra for themselves or others during the week), the principal took the opportunity to inject her own shout-out. She presented data in the form of recently released CAHSEE (California High School Exit Examination) test results to acknowledge staff efforts at reaching goals that had been set earlier in the year.
She stated:
“So I want to do my shout out really quick, and I would like you to turn your attention to
the screen. I want to do some data analysis really, really fast. I want to do a shout out to
all the folks who have touched those kids.”
After presenting the visual data, she acknowledged the efforts of several teachers by name and
then said the following:
“All these teachers have some way affected them, because I set a goal last summer if you
remember, 12% increase. And we just about hit that in both English and Math. In
English, we came just a little bit under but that’s ok. It’s to be expected with all of our
English learners. But look at that jump in proficiency rates. I just really, really, really,
really want to thank you so much for all of the work that all of you put in.”
In this excerpt, the principal revealed that she valued data more as a culture and morale builder than as a tool for improving instruction. By sharing evidence of student achievement and progress toward schoolwide goals with her staff, she highlighted the
importance of using data in moving the school forward, but did not articulate what steps had
been taken instructionally that might have produced this increase in scores. Likewise, she did not
ask her staff to examine the CAHSEE data in a way that might have given them insights into
how they could change their practice, particularly in ways that would raise scores in English and
with English Learners. The data she presented were so general, and her analysis so shallow, that they did little to inform teachers’ thinking about practice.
In sum, the CAHSEE data she focused on were clearly important to her, yet she diminished their worth by presenting them as just another shout-out and by stating that she wanted to do it “really quick.” As shown previously, this rush to use the data was evident in the larger data set as well. It seemed to suggest that she was not comfortable with data use and did not know what to do with it. In this example, once again, the principal’s approach to using data was compromised. She intended to present data showing an increase in student proficiency rates and to thank her staff for all the work they did to accomplish this, but she spoke about it in vague, superficial terms. She did not articulate what teachers had done instructionally to raise the proficiency rates, and she did not ask them to consider what they might do differently instructionally to improve those rates further. In the end, it was also unclear how these data might be used to improve teacher learning. The principal’s vagueness about the data and her rush to present them turned this opportunity to use data productively into little more than a cheerleading act.
In the next segment of the PD, the principal missed another opportunity to talk about data
in a useful way as she prepared the teachers to engage in the breakout protocol. Instead of
setting teachers up with a clear understanding of how to use data productively to improve their
practice, she spent the next 30 minutes speaking in vague terms about their role in examining
student work data. She explained:
“In today’s work, we’re going to be working in departments and taking a look at student
work to build the team and also to really think about growing as individual learners, so
that growth mindset and being data driven. Again, the data being the student work today.
The authentic data of their work, what does it tell us and where’s it going to take us?”
The principal prefaced the SWA activity, a protocol she wholeheartedly believed could yield powerful insights into instruction, with a vague introduction. She then posed a broad rhetorical question without asking for or expecting a response. In this instance, the principal clearly communicated her belief in the use of data, emphasized her intention of making it part of the learning process, and strove to impart that message to her teachers, but she did not articulate a clear, specific approach for how she planned to accomplish that.
She then launched into an explanation of how teachers should engage in the SWA protocol, giving them an anticipatory set and asking them to respond to three questions, first in writing and then in pair/share: Where are you in terms of your content? What instructional strategy are you focusing on? What is at the forefront of your mind for the Common Core? After
a few minutes of sharing, she stated:
“We’re going to go ahead and stop here. I just wanted to give you guys a frame for where
you are right now in your teaching, also to put you in that space of just reminding
yourself of sharing with your colleagues.”
It was unclear how these questions or this pair/share activity would prepare teachers to think about using data with colleagues to improve their practice through the SWA protocol, the central focus of this PD. The questions were broad and undirected, making it hard for teachers to know how they were to operationalize the use of data to inform instruction.
At this point, she prepared to engage the faculty in the SWA protocol by giving them a
detailed description of it. She reviewed the steps of the protocol and the role of each participant,
explaining how the student work would be analyzed. She told the teachers:
“You are looking for–and for me it’s always good to remind myself, because sometimes
we skip through the steps–three things, and they’re all bolded. The strengths, the struggle,
and what are they ready to learn? Every work has strengths. What is it that the student is
showing you that they’re learning, that they’re doing well? It’s important to bring that up.
What is it that you see a kid is struggling with? You really try to be objective and you try
to say, in this work, this is what I see the student struggling with ...The last piece, what is
it that you are interpreting that this student is ready to learn next?”
With this direction from the principal, it was again not evident how teachers would
operationalize or make strides in improving their practice. The focus of the protocol was on
students, what they were learning, what they were doing well, what they were struggling with,
and so forth, not what the teacher was grappling with or doing. How the principal set the
teachers up, what she asked them to think about, and what she told them to do, again,
demonstrated the disjointed, unfocused nature of her approach. The principal sent the message
that she believed in the concept of using data to improve instruction, but she did not set up her
staff to successfully engage in using data to reflect on and re-conceptualize problems of practice.
In the next segment of the PD, teachers moved into their departments to put into play the
SWA protocol. In the English Department, Tania agreed to take the role of presenter. She
passed around three essays representative of high, medium, and low students that were written at
the end of a unit on Fahrenheit 451:
Tania: Ok. So what we have here is the final essays of the AP language students for
Fahrenheit 451. Unfortunately, I don’t have, I meant to make copies of everything
that led up to it, but I’ll tell you what it is right now. They read Fahrenheit 451
and then at the end of the unit, we had a two-day Socratic seminar. It was on
Monday and on Wednesday. At the end of the Socratic seminar on Wednesday, I
gave them their writing prompt based on one of the Socratic seminar questions.
So it was, they already had multiple perspectives on the writing prompt... (excerpt
shortened). Ok, so there aren’t many scaffolds. Unfortunately, I’m sorry, I didn’t
bring them, but at least you know what they are. If you don’t mind, I actually
don’t want to give you a focus question. I would like for you to take a look at it
and then maybe let me know what you see that maybe I don’t see, and I can go
from there.
Tania started by providing context for the student work she distributed to her colleagues. However, she did not tell them what the prompt was, nor did she give them a rubric or a question of interest to consider. She said nothing that gave them any real guidance about what the performance expectations were for the students, nor did she give them a problem of practice to consider. Furthermore, Tania was not fully prepared to share. She did not make copies of the scaffolds leading up to this draft of the essays for her colleagues. From what she explained, the students finished reading the novel, Fahrenheit 451, engaged in a Socratic Seminar that gave them “multiple perspectives on the writing prompt,” submitted drafts of an essay responding to a prompt that came from one of the questions in the seminar, and participated in structured peer editing. Tania expressed concerns about their topic sentences, thesis statements, and grammatical errors, among other issues. She did not have a focus question for her colleagues to consider.
In the way that the SWA activity was set up, it appeared to be acceptable for Tania not to
come prepared. The conditions that had been created around data use at the school by the
principal suggested that this was an appropriate way to come to an event where data would be
used and analyzed. Tania neither brought enough copies of the student work nor gave her peers
the writing prompt. Nobody said anything to her. Her lack of preparedness did not seem to be a
problem for her peers or the principal. Yet, as a result of her choices, no one was well positioned
to provide feedback to the presenter regarding her students’ work or her instructional choices.
The silence surrounding this aspect of the discussion may in part have been due to what
Grossman et al. (2001) have described as pseudo-community, where members exhibit a surface
friendliness with each other, always careful not to mention anything too critical or to intrude in
each other’s space. This politeness hinders candid talk that might lead to important insights.
Silence or awkwardness of this nature might also be an indication of how deeply personal and
sensitive it was for these teachers to talk about their teaching. It seemed safer to speak about
students. This dynamic suggested a larger challenge that principals and teachers have in
engaging in reflective conversations with each other around instruction.
In the next part of the protocol, the analysts were invited to ask brief clarifying questions
about the assignment. The teachers’ behavior was a reflection of the principal’s unclear approach to using data to re-conceptualize problems of practice. Just like hers, their interactions demonstrated that they had not been set up to use data in ways that helped them think about their practice:
Elizabeth: So this is meant for us to ask the questions.
Anne: I have a question about the Socratic seminar. Were they taking notes
during it or like marking their book or anything?
Tania: No, so …
Anne: But they … I’m sorry.
Tania: Oh no, no, go ahead.
Anne: But they used the questions that you guys talk about during the seminar?
Tania: Yeah. They are like, in the Socratic Seminar, I have the seminar divided
into three. I have warm up questions, core questions, and then cool down
questions. Again, the seminar took two days because they had such rich
conversations and then it also gave me an extra day to try to figure out
which prompt would be the best. Then, yeah, the end of the second day,
just based on the richness of their conversation, I chose a prompt from one
of the questions. They didn’t take any notes, so it was like, that’s also kind
of like testing the listening skill because if you weren’t listening, then you
don’t have much except your own ideas in there. That’s pretty much what
I did with it.
Francisco: From the moment you started to use Socratic Seminar up to now, what
was the span of that? You know, was it two weeks, two and a half weeks?
Tania: Yeah, good question. Three weeks for the first 12, it will be four weeks for
the second 12. I timed it. I don’t do this with every process essay, but I
timed it so that we would do that during this time because right after the
Fahrenheit 451 unit, I did a multiple choice boot camp. It was for two
weeks straight kids were focusing on the AP multiple choice test and at
home for homework, they were working on the writing of the essay. That
way, they were getting double work and I wasn’t doing double grading.
Anne: Did you have a page length suggestion or requirement for the essay?
Tania: Yes, I think I either said three to five or four to six. Then for the rough
assignments, for the rough draft, I didn’t give a minimum or maximum
page length.
Elizabeth: I just wondered if you’ve graded these yet.
Tania: No, I haven’t.
Anne: Then I just realized really quick. It looks like they were talking about
some articles?
Tania: Oh yeah, I’m sorry. During the Fahrenheit 451 unit, because I wanted it to
be slightly common core. So each week we had a specific focus.
Fahrenheit 451 is broken into three parts. For one week, I would focus on
part one, and then specifically in part one, I would figure out what was the
topic, what was the main topic that I wanted to focus on, and I would find
articles that supported that. And then at the end, they would have to create
a theme of part one based on our discussion and based on the article.
Part one was about, if I can remember correctly, it was censorship. I found
articles on censorship. Part two was about burning books, so I looked up
articles on burning history. Part three was kind of about I think dystopia
rebuilding into a civilization. Oh, no, no, I’m sorry. The part three was
about dependency on technology and TV. So I found articles on that. I got
this, this final was due on Friday, but I was out for the ASB conference
and then Mr. Shine collected the finals and then I received these essays on
Monday, so I haven’t graded them yet.
Elizabeth: Any other questions?
Julie: I’m sorry, I wasn’t there at the beginning. Did they represent a high,
medium and low or they are just three kids?
Tania: They are supposed to represent a high, medium, low based on what I’ve
known their ability to be. I’m not too sure based on …
The clarifying questions that the participants asked were mostly structural or directed at student behavior. The analysts never inquired about the presenter’s instructional choices, nor did the presenter herself question those choices, clarify her use of the scaffolds she had provided, explain the prompt she actually chose, or describe how she expected students to use the information from the Socratic seminar in their writing, among other omissions. Very few members of this group of English teachers asked any questions at all. There was a notable lack of curiosity in the room. The facilitator’s voice was missing; she did nothing to steer the conversation or provide guidance.
In addition, the teachers’ interactions were superficial. The group appeared merely to be following the structure dictated by the district-created protocol and, by the very execution of it, was positioned to solicit advice on fixing no more than minor, surface-level student problems.
At this point, participants took several minutes to read through the student work samples,
after which they began the process of analysis. The presenter was supposed to listen to the
conversation without responding. This part of the protocol, potentially the most powerful because of the profound insights teacher analysts might provide to the presenter about her instruction, once again reflected the culture of using data at this school, an approach that could best be
characterized as unclear, disjointed, vague, and loose. Likewise, the principal’s participation in
this part of the discussion contributed to this lack of structure and reinforced the finding that she
did not equip her teachers to use data in a productive way:
Elizabeth: So where are we? We are going to interpret the students’ work. Okay, so
this is when you take notes.
Tania: Yeah, I’m going to take notes. Let me just open it up, open up my Word
document.
Elizabeth: Okay. So we’ll talk about it, while you take notes.
Eva: So some trends or patterns I saw was I was just wondering about the
prompts, because I found that in all three essays, Fahrenheit 451, the main
text seemed not to be the focus. It was just maybe one paragraph or in a
certain essay, there were four words. I was just wondering about that
because it was a lot more article heavy.
Anne: Yeah, I think, and this is a difficult text, so it’s awesome that they had to
compare and contrast textual analysis with real world articles. But it got
too, it just wasn’t as balanced as I think it could have been.
Eva: I would say which one, Heidi’s, I guess. I think that, and I just read
through the first body paragraph. I think that it’s so clear that she makes
the effort to connect the book with the article, and that’s like a hard, I feel
like that’s a difficult task. And while in terms of its success, there are still
some things that are awkward and could flow better, but I feel like she
does the best job of making that connection and achieving that balance
when you compare it with Sharon and Vanessa.
Anne: I actually thought Vanessa, was she supposed to be the low one?
Tania: Heidi was supposed to be the low.
Anne: Oh, ok. I actually thought she did the best post-reading.
Eva: I know.
Anne: It was a little … Her explanations were a little long.
Tania: This was Heidi?
Anne: Heidi, yeah. She does, it seems like you must have taught them to analyze
the grammar. She does the dashes and the pauses and the ellipses, and she
is the only one who took that apart. And she did it twice and I think it was
successful.
Principal: There was also what you were saying with that first body paragraph. She
is really thinking about what the scene in the book and using that as an
example of this idea of losing track of your thoughts and being simple
minded where I think the other two more like touched on it and moved on
fast. She took some time to analyze it, at least in that paragraph.
Francisco: What I looked at when it came to structure, I saw that Vanessa’s and
Heidi’s had a little bit more of an Anne Schaeffer structure, which was
somewhat easier to follow. You can definitely see that they are trying to
integrate examples from Fahrenheit …
Principal: 451.
Francisco: 451 in it while Sharon seemed more of a, like she has a lot to say, but I
didn’t see her tied into the text.
Eva: Until way later.
Francisco: Yeah, until way later. For me, just from the very get go, I thought
Vanessa’s was the high one just from the intro paragraph and then Heidi
was the middle and then Sharon-
Julie: Sharon was the low.
Francisco: Yeah, but it was just different. It was different styles of writing. That’s
what I did see when I was reading about it… (excerpt shortened)
Principal: I think that one trend that I noticed in all three is a combination of a real
strength which is grasping at this thesis of technology and social media
affecting and dulling our minds. I feel like they all got that and I think they
must have talked about that a lot at the seminar and they really wanted to
stretch that idea. I really liked that all three took it beyond one sentence...
(excerpt shortened) … I feel like Sharon was really trying that. That was
really just a real strength for her, the desire. She put, for example, The
desire for more technology slowly begins to lurk into the minds of people
in society.
To start, the facilitator did not seem to know how to lead this part of the discussion. “So where
are we?” she asked. She then answered her own question. “We are going to interpret the
students’ work. Okay, so this is when you take notes,” she said, turning to Tania. She never made clear
what the teacher analysts should look for or pay attention to in the writing samples, and Tania
had not given them an area of focus or a question of interest. So they were left to venture out on
their own. It was also unclear why the presenter was supposed to take notes or what those notes
should entail. (During the observation, the only person who seemed to be taking notes was the
principal, who wrote extensive notes in a vigorous manner.) The facilitator simply said, “So
we’ll talk about it, while you take notes.”
The first teacher to offer her analysis wondered about the prompt because the essays
made little if any reference to Fahrenheit 451, even though it was the culminating assignment
after study of this novel. This teacher presented a potentially interesting instructional dilemma:
how did students write an essay about a novel they had been studying without including textual evidence from the novel to support their assertions, or even making any connection to the novel at all? How did the teacher set them up to write? How did she communicate her
expectations for the writing? Was there a rubric? If not, why not? What could she have done
differently? None of these questions were asked. Instead, Anne picked up on this comment and
seemed to apologize for the students. It was a “difficult task” to connect the book to the article,
she surmised. The comment was then dropped, and a different discussion took place about
which student was the low one and why. Again, there were no criteria on which to base this
determination, so teachers were left to wonder which essay was high, low, or medium and why.
At this point, one of the teachers shifted the discussion to mechanical issues in the essays,
particularly grammar. The principal entered the discussion focusing on something completely
different: how the students wrote the body paragraph. Her contribution to the discussion
mirrored and reinforced the loose, disconnected, unfocused conversation of the teachers. Their
interaction demonstrated that they did not know how to use data to help them think about their
practice. The principal and others continued to focus on student writing strengths and
weaknesses, not on instruction.
In the segment that followed, the teacher analysts discussed possible next steps for the
teacher presenter to “move forward.” Again, teachers expressed confusion about the protocol
and about the purpose of the activity. They were not well equipped to come to a consensus on next steps:
Elizabeth: So should we combine the next steps? Do we have any questions for Tania
now?
Principal: I think the next steps, jumping to the next steps.
Elizabeth: So next steps.
Tania: Is this me or …
Principal: No, us first. You get the last 5 minutes. We’re gonna cut…
Elizabeth: Next steps, this is …
Julie: Well next steps for… (Teacher reads from the handout explaining the
SWA process.) Possible next steps for moving forward. So analysts
discuss steps that the presenter could use with the student based on the
work in front of them. They may suggest strategies, scaffolds or other
ideas that can align instruction with their picture of how this student
learns.
Francisco: I think the kids would appreciate and you would appreciate later on if you
had an opportunity where you had them read it out loud to a partner. That
way they can catch whatever grammatical errors they have in their essay.
Julie: I think it will be cool to treat this as another rough draft and tell them to
write a final.
Tania: Now that I heard Sharon’s topic sentence that I told her to kick out.
Elizabeth: Because I know you’ve done close reading for this.
Anne: Yeah, it’s very clear.
Elizabeth: So maybe making it more explicit and telling them this is what a body
paragraph should look like.
Principal: Yeah, and I think it’s very explicit asking engaging questions for readers
of what you should be looking for.
Julie: Did you provide a model paragraph at all?
Tania: Yeah.
Principal: I think there is a lot of real strengths in all this writing. The thing is like
these kids are really moving forward, which is great. I think that it’s all
about the idea between the … The difference between developed. It’s so
hard for kids, because you see a developed paragraph, well developed and
then there’s precise and accurate writing. They are two hard things as you
know. A student has to think about how to make it more precise.
Julie: Like I said, I think there could be just taking samples from their essays
and doing more focused mini lessons on making the language more
precise or using more … To improve the flow of their sentences or
simplify … In the case of Sharon, simplifying the sentence structure or
something like that could work as well, just as you sprinkle it throughout
the re-reading of the essays.
Tania: All right. Can I ask questions now? You know how like in the beginning I
didn’t have a focused question, because I wanted to see what everybody
had to say. So thank you for all of that. There is only … There are just a
few issues because the things that you see are the things that I’ve seen and
that frustrate me. For instance, Sharon can get very worried, and I’ve seen
this in the past where our really intelligent students can get very worried
with what they have to say. It will be best as we … His nickname for me is
worried, like I call him worried. See, the thing is that how do you properly
get the kids to simplify without being completely simple?
Julie: Yeah.
In this part of the discussion, Elizabeth began by asking a procedural question. “So should we
combine the next steps? Do we have any questions for Tania now?” she asked. The principal
injected a sense of urgency in her directive to skip this part of the protocol. She shifted the
momentum away from a step that would have allowed them to investigate more before coming
up with solutions. By doing this, no one was given the opportunity to ask Tania questions that
might have illuminated and enriched the discussion they were about to have and might have
helped them formulate action steps. As a result, they did not engage in deep examination of the
issues presented, nor did they explore what was happening with the teacher, making it difficult to
take into consideration her concerns for the purposes of analysis. Thus, in this particular PD, the
principal demonstrated her approach to using data. She made it available and conveyed its
importance. She expressed belief in the protocol where data from student work would be
examined to shed light on a teacher’s instruction. Yet her participation in this protocol was
counterproductive. She actually directed the group away from doing the deeper work and from
focusing more intently on the teacher’s instructional choices.
A close examination of this protocol confirmed that the principal’s approach to using data
with her teachers was consistent with the direction provided to her by the district. One part of
the SWA protocol stated, “Analysts discuss steps that the presenter could use with the student,
based on the work in front of them. They may suggest strategies, scaffolds or other ideas that
can align instruction with their picture of how this student learns.” This part of the process came
after the teachers had ostensibly engaged in description and analysis together. It would seem to
be the most critical moment of the reflective process—the time when the teacher would articulate
new learning and action steps. Yet the focus was on “strategies, scaffolds or other ideas that can
align instruction with their picture of how this student learns.” The language of the district protocol mirrored the principal’s language and, like hers, focused on individual student needs, not on concerns about teachers’ pedagogy. There was nothing about this
language that asked the teachers to reflect or that provided guidance to the principal to create
opportunities for them to reflect. The principal’s behavior in this instance underscored the idea
that principals’ work exists within an environmental context and that contextual factors have a
powerful influence on the failure or success of their work (Goldring et al., 2008).
The teachers, likewise, were not set up to engage in this activity in any way that would
stimulate them to focus their questions specifically on the teacher’s behavior or on her
instructional choices. The teachers’ attempts to come up with action steps also demonstrated that
they did not know what to do with the data. They offered superficial, activity-driven solutions,
mostly suggestions about reorganizing the way students interacted with each other. One teacher,
for example, suggested doing “focused mini lessons on making the language more precise.” The
teachers somehow knew that Tania had done close reading with her students and that she had
provided them with a model paragraph, but it was not clear how they engaged in close reading or
what the model paragraph looked like or how it was used by students. They did not home in on
Tania’s instructional approach, considering how it worked and offering ideas about how Tania
might improve it or approach it differently to get a better result. The principal contributed to this
lack of understanding. Her comments, likewise, focused on superficial, student-level problems, not on instructional issues.
When Tania reached out to her colleagues near the end of the discussion, there appeared to be a brief opportunity for her to reflect upon and analyze her own pedagogy in light of all that she had heard, but she, too, did not know how to examine her own practice in light of the data. She, likewise, demonstrated an inability to use data in an effective way. Her response focused on student-level concerns. She thanked her peers for their suggestions and then expressed frustration that she had not received any real solutions. What does one do, she wondered, with a student who is “worried”? This question could have led to a discussion about what she might do instructionally to address the needs of this student. Instead, it remained a student-level concern, and the opportunity was missed.
After each department went through the SWA process, all of the teachers reconvened for
a 10-minute wrap up. The principal, once again, injected a sense of urgency. She stated the
following to the whole group:
“So if I can have your attention up here please, I’m going to go ahead and move us
forward because we have only 5 minutes left. Once again, this is going to be a
combination of the individual reflection question that I shared with you before we broke
up. Just to share that. Going back to that question, thinking about how, and this is almost
like how does this synthesize really quickly if you’re coming in from this department
because we didn’t really get to do this together? Those of you in the other departments
may have already done it. As far as your department, where are you in terms of moving
closer towards your goal?
“Then the second part is individually, what did you learn from today’s work? Even
if what we learned is that we have a lot of work to do. I think we all … I know we
learned a lot in our breakout session, so I’d like to hear from folks individually and also
as a department. Anything that you have to share and then we’ll move on to the exit slip
for the last 2 minutes. We’ve got 3 minutes for this. Yes?”
By rushing through this last step, the principal demonstrated that she devalued this important moment for group reflection, the critical chance for teachers to make sense of the data and the work they had just completed with their peers and to use this forum to articulate follow-up actions. Instead, she seemed more concerned with completing the task and moving on. “We learned a lot in our breakout session” became an oddly empty statement. This was consistent with the findings of a Horn et al. (2015) study in which the Park Falls teachers, who spent so much time in group conversations with peers reviewing frequently missed test items, did not have sufficient time to discuss instructional implications deeply. At both school sites, Park Falls and Star, the activity structure limited the capacity of teachers to use discoveries from their conversations to improve practice (Horn et al., 2015).
At this point, a representative from each department was given an opportunity to share out. Tania, representing the English Department, stated:
“I guess some of the key takeaways that we got from today’s SWA is I feel like no matter
what grade level or the ability of the students, the common threads are what we need to
focus on there, which is precision in writing and focus, ability to have clear writing styles
and stuff like that. General, general. A key takeaway is that we’re all struggling
similarly no matter what grade level or the ability of the students, which makes it very
clear for the English Department.”
This statement, which conveyed so little substance, was a reflection of how valueless the protocol proved to be for the teachers. Sofia, the principal, abruptly moved on to the next department spokesperson without comment, as she did with the comments from all of the other departments.
Here was evidence of one more presentation of data that seemed to go nowhere. The principal
did not set the teachers up to engage in discourse with each other around data in a way that
would push their practice to the next level.
The principal ended the PD by saying,
“We’re in a good space with that because the kids are doing well. Just keep in mind that
everything always needs work, right? Our work always need work.”
She seemed pleased with the quality of their interaction. Yet the work itself mostly focused on structural considerations of activities, not instruction. There was no discussion about how to help teachers make instruction better. It was all focused on what the students were doing or what the student need was, not on what instructional choices had been made to help them be successful. Questions that might have led teachers to examine their pedagogy were never asked: What could you do to improve? What could you work on as a teacher? How do you walk away from this recognizing where your work is and where you need to take it, not where the students’ work is? Finally, there was no action plan capturing what had been decided about the implications for instruction or what they should do next. This PD demonstrated how data had been used consistently in other interactions at this school: data were viewed as important, accessible, and always front and center in the activity or presentation, but it was unclear how they would be used to inform instruction.
In all of the teacher interview data, only one teacher, Francisco, recalled a time when a professional learning experience, specifically a Student Work Analysis, led him to genuine reflection. He stated:
“I remember one of my midterms was an essay and I wrote out the prompt and I had so
many students bomb that essay. I just took it as, okay, they didn’t understand it, oh well. I
gave them all the tools that were necessary to write the proper essay. Not until we started
our protocol and we did a Student Work Analysis and I brought that prompt along with
some student samples. I was asked some very basic questions, and one of the questions
was, ‘Have you ever taught your students how to do compare and contrast?’ I didn’t
realize at that moment, but my prompt was compare and contrast. While this whole time
they had been doing narratives or expository. Never compare and contrast. It was clear as
day, and I thought I had a great lesson or a great culminating task, but it was at that
moment that I realized that there’s a lot more that I need to learn and I definitely
appreciated that critical feedback that I received from both my peers and my evaluator in
regards to that.”
Francisco recalled a time when he assigned an essay prompt to students, and many of them
“bombed the essay.” In his recollection of this experience, Francisco was willing to hear from
his peers and evaluators that the decisions he made were not necessarily the ones his students
needed. He was receptive enough to hear that sometimes he might be the problem in his own
practice, not the students. His thinking was more than structural. At first, he saw his students, not himself, as the problem; in this case, however, he came to acknowledge that he was the impediment, because he recognized that the instructional decision he made had set his students up to fail. He thought he was doing one thing with the prompt he created and, as it turned out, had not been able to step back and see that, in fact, something else was going on with it.
He thought he was teaching students how to write narratives or expository writing, but what he
was asking them to do was a skill he did not teach them: comparison/contrast. So when they
were unable to be successful, he recognized that he had not given them the tools to write a proper
essay. Here he acknowledged that someone showed him that he created a prompt that did not
align with what he set out to accomplish. He understood that he did not give his students the
tools to be successful with it. He recognized that it was his fault, not the students’. Notably,
Francisco gained this insight into his instruction on his own in the process of engaging in the
Student Work Analysis protocol with his peers. The principal did not help him. His experience
demonstrated that teachers at Star High School had the ability to engage in more overt learning,
but they were not given much space or prompting in how to do that.
Subtheme 1c: The principal did not build her teachers’ or administrators’ instructional
or leadership capacity through her approach to evaluation or PD. The principal did not build
the capacity of her teachers to examine their practice during the teacher evaluation process, nor
did she support the growth of her administrators in this process. As shown previously, she
implemented and embellished an ambitious, rigorous evaluation system where every teacher was
evaluated every year, including multiple informal and formal observations and conferences. The
principal viewed this system as valuable and essential. The teachers viewed it as “stressful,”
excessive, and burdensome. They did too little of the intellectual work and too much of the
procedural (e.g., completing lesson plan templates, submitting online reflections, and attending post-observation conferences and coaching sessions). Teachers were set up to be passive
recipients of the principal’s knowledge and expertise. Despite high expectations and good
intentions, she undermined the power of the process by not relinquishing more of the
responsibility to the teachers. She gave them strategies and some insights into their instruction,
but she did not create a situation in which they could internalize new practices and focus on the aspects of their teaching that mattered from their own perspective. Anne
shared some of her frustration about the evaluation process:
“I mean, just the extent of it too was way more … the word that came to mind was
invasive. It’s been a real source of stress and it’s been a real cause of low morale around
here, because we have really strong administrators. In my first three years here, I felt like
I got really great feedback and I got really attentive feedback, and then they switched this
evaluation system. It’s more demanding of our time as teachers and it doesn’t always
seem to be the most, the best way to use the time.”
With regard to completing the lesson plan template, she stated:
“Yeah, you would never be able to do it on a daily basis. I mean, it’s just an unrealistic
expectation, and so if it’s not … I don’t know why they wouldn’t have us do a version. I
could see doing a little bit more elaborate lesson plan, but why wouldn’t we do something
similar to what we use on a daily basis? It seems to me that would be a more effective
way to coach us on lesson planning. It’s like, what are you actually going to do? Then do
that and we’ll make that better instead of writing this whole big, elaborate lesson plan
that you do twice a semester.”
In Anne’s experience, the additional procedural demands placed on teachers did not result in a
better, more valuable process as the principal envisioned. The extent of the technical demands
became “invasive” and “a real cause of low morale.” This seemed to undermine the one part of the process that Anne valued most: the opportunity to receive “attentive feedback.” Furthermore, she described the lesson plan that had to be completed twice a semester as “unrealistic” and unnatural, not something that would help a teacher improve practice on a “daily basis.” Anne expressed a
desire to use a lesson plan built on what she was already doing. In this way, the principal could
support the development and growth of her practice using something she already owned and was
actually doing in the classroom.
Steven articulated a similar sense of frustration with the failure of the evaluation system to build his capacity as a teacher:
“As far as the observation cycle as a whole, I still think it has its flaws. It’s very
structured in terms of they want to see key terms being used and specific processes. For
example, in the beginning of class they want me to read the objective, and then they want
me to say, Jerry, can you read the objective? After he reads it, Sally, can you read it one
more time? They are looking for all these kinds of key things, and if I don’t do that then
I am expected to get a lower score on that. For me, it takes out the organic-ness, I guess
you could say, of teaching. It loses some of, for me, my ownership of the classroom.
Sometimes I feel like it’s not the most effective way, but again sometimes I will play the
game, I guess, just to get those grades, those scores. Those are kind of some of the
negative aspects of it.”
For Steven, the evaluation process was little more than a compliance act. He viewed his role in it
as making an effort “to play the game,” almost like reading a script to ensure that he fulfilled
certain pre-determined expectations. He felt compelled to demonstrate parts of a model lesson
that were unnatural for him, doing things that he did not necessarily agree with or understand.
For him, it was “not the most effective way” to demonstrate or improve his practice. While this
teacher did not articulate what that better way could be, the principal clearly did not use the
process effectively to develop this teacher’s capacity to be a reflective practitioner and effective
teacher.
Apart from the work she engaged in with teachers, the principal showed weakness in
supporting her administrators with the evaluation process. While the principal perceived herself
as engaging in distributed leadership by having the other administrators participate in the
evaluation process along with her, she, in fact, did not build their capacity to participate in an
effective way. In the distributed model, leadership extends beyond the mind of a single, charismatic individual, spreading across a social terrain that includes multiple leaders and is grounded in the interactions of people. In this model, leaders work separately but
interdependently to accomplish common goals (Spillane et al., 2004). The principal presented
herself as a leader who distributed responsibility to other members of her staff by assigning
administrators the task of evaluating teachers based on their subject area expertise. However, it
was not really a distributed act, because she did not do it in a way that allowed them to own it.
In fact, the principal was skeptical about their ability to do this work, thus undermining their
capacity to really engage in distributed leadership. Likewise, she cast doubt on their ability to
fairly and effectively evaluate teachers. Sofia stated:
“I, as an admin, I want to call myself a teacher’s coach and a teacher’s teacher. I think I
spent so many years as a teacher. I’m not intimidated and I’m not insecure as an admin.
I think that a lot of times new administrators who didn’t teach for very long, they’re still,
they’re a little bit insecure about their ability to coach, so the position and hierarchy
matter to them. You know what I mean? In some ways, they tend more to score lower or
not really support teachers. They want teachers to prove their excellence.”
She continued:
“I’d rather be the kind of school leader who my teachers want to invite into their room all
the time, because they want to show me what they’re learning, and how they’re better.”
She elaborated:
“My two assistant principals are less like me than me. I think they’re in that second
realm of I didn’t teach for that long, so I have to be super strict. I don’t feel that pressure.
What are you going to do? I just feel like, you don’t know more than me, teacher.
You’re never going to know more than me.”
By setting herself up as the only one capable of working with teachers and building their capacity, she isolated herself and undermined her fellow administrators. She did not build their capacity, nor did she respond to their perceived lack of experience with any effort to help or support them in their evaluative role.
When asked about the principal’s approach to the evaluation process and how she involved him in it, one of the assistant principals described a limited, simplistic interaction between himself
and the principal. According to this administrator, the principal involved her leadership team
only in a structural way by evenly dividing up the staff to be evaluated at the beginning of the
year. When asked about the principal’s role as a coach in the process, his initial instinct was to
mention the district’s leader as the coaching support for the evaluation process, not the principal.
He stated:
“At the beginning of the year, we look at, we have a chart of administrative
responsibilities, who is in charge of what kind of thing, and we sort of split up by
departments or by teachers. We each have a third of the teaching staff. It’s sort of like we
focus on our third, because there’s so many teachers and there’s not enough time. So
with our boss from the district every couple of weeks, we will discuss our caseloads
together, and look at grade book data. With this cluster director, we do various forms of
data review. Not the same each time; sometimes we’ll go into a classroom and look at a
set teacher and give feedback to each other. Like what are we seeing? But really
throughout the year, I will go to my principal a lot with my caseload and say, here’s what
I’m seeing in these teachers. Do you agree based off your expertise and knowledge, and
would you give the same coaching?”
In this description, the assistant principal pointed out that, as a team, they determined the
teachers whom each one would evaluate. The support or coaching for the teacher evaluation
process was provided initially not by the principal but by the cluster director of the larger school district, who visited the school twice a month to review caseloads and data. This cluster leader
also conducted classroom observations where administrators shared thoughts with each other
about what they were seeing. Almost as an afterthought, he mentioned going to the principal on
his own initiative to hear her feedback on the teachers he was observing, deferring to her
expertise and knowledge about instruction. Even with her administrators, it appeared that she
was cast as the knower and the keeper of the knowledge. It was unclear whether or how she took
advantage of these moments with her administrators to build their capacity in the teacher
evaluation process.
The other assistant principal described a similar relationship with the principal that was
perhaps even more limited and simplistic. She stated:
“Well, when it comes to having to do with evaluation, we break down the number of
teachers we have. We all have the same number of teachers we evaluate, and based on
that as a school, we conduct four informals and one formal. As a district, we’re only
required to do two, but as a school we went ahead and we increased the number of
informals to make sure that we push on the amount of coaching that we have for our
teachers. That’s something that came from her, from her leadership, to making sure that
we’re in the classroom more and we’re providing our teachers more with coaching
instead of just the evaluation process, which happens at the end of a semester. We have
coaching throughout the semester. We perform one informal per month, and then at the
end we perform the formal. That’s all from her leadership. It came from her leadership,
making sure that we’re in the classroom a lot more.”
From this description, it was clear that the principal wanted her administrators in the classroom more, and her way of ensuring this was to require additional observations. By spending more
time in the classroom, according to this assistant principal, they could provide more coaching to
the teachers throughout the semester. This was evidence that, again, the principal expected the
knowledge and the intellectual work of instructional improvement to stay at the administrative
level and that she expected her administrators to already know how to do the work of evaluating
teachers. It did not seem that she provided them with any additional support for the teacher
evaluation process. At another point in her interview, this assistant principal, Marisela, remembered a time when a chemistry teacher had difficulty teaching the balancing of equations. She explained, “We
debriefed the observation, and I found a different way of doing it. He went back and taught the
next period. It was more effective.” She continued, “He felt very good, because now he is
teaching it the way that I taught it to him.” In this example, not surprisingly, the evaluator acted
as expert and knower, dispensing knowledge to an empty vessel. This brief exchange provided
evidence that this administrator clearly did not have a nuanced understanding of the evaluation
process and did not know how to invite reflective thinking in the teacher. The principal did not
build her capacity to engage in this process in a more thoughtful way with teachers.
There appeared to be a disconnect between the principal’s philosophy about teacher
evaluation and how it was enacted by both administrators, suggesting that the principal did not
ensure that her fellow administrators had a shared understanding of how the evaluation process
was supposed to be used to inform instruction at her school. Thus, while the principal
empowered her fellow administrators to participate equally in the evaluation process with her,
she fell short of equipping them with the capacity to use data in meaningful ways to improve
instruction. Unlike a leader who strives to improve the conditions for learning at his/her school
by engaging in reciprocal accountability (Elmore, 2002), giving staff members a responsibility
and then building their capacity to be successful at it, she failed to build her administrators’
capacity despite her awareness of their weaknesses, thus undermining their ability to be
effective.
In the same way that the principal did not develop the abilities of her staff in the area of
teacher evaluation, she did not foster the skills teachers needed to take on leadership roles in PD.
In the SWA protocol mentioned previously, the teachers were not well positioned to analyze data
in a way that would help one of their own colleagues with her practice. Likewise, Elizabeth, the
coach, was not prepared to facilitate the group analysis of student work in a way that might lead
to practice change. The principal did not set her up to skillfully lead the teachers in a discussion
on instruction. Elizabeth did not know how to provide a clear purpose for the teachers’
interactions, nor did she know how to lead an open exploration of ideas and perspectives,
culminating with a consensus about an agreed-upon set of actions to carry the work forward.
Except for providing simple prompts to move teachers to the next activity, Elizabeth’s presence
in the discussion was barely noticeable. Teachers were left to blindly muddle through the
process.
In addition to struggling with developing her staff to support teacher learning in the
evaluation process, the principal faced a challenge in recruiting staff to take on leadership
positions such as department chairperson, literacy coach, math coach, and so forth. When asked
about her decision to become the literacy coach or department chair, Elizabeth stated the
following:
“Department chair? Nobody else wanted it. That’s quite honestly how that happened, and
I didn’t want it either, just someone has to step up. I think part of the reason is because
this whole evaluation system, it takes so much out of our teachers, and we're all just tired.
In terms of coaching, I wanted to leave a few years ago because I wanted to explore other
avenues, and that was when I was getting my admin credential.”
Elizabeth seemed to take on both the position of literacy coach and department chair by default
as everyone else was too “tired” or unwilling to commit to these leadership roles. Despite
protestations about additional work, including multiple meetings for little pay, the teachers did
not feel empowered to act as leaders. The principal’s failure to build anyone’s capacity, combined with her need to hold on to all of the knowledge, left no one else positioned to take on
additional responsibilities. They did not feel that their knowledge was important or that their
participation would make a difference.
This section presented, in support of Theme #1, a set of subthemes and examples from the data illustrating how, despite the principal’s efforts to portray herself as a knowledgeable manager and driver of instruction in both teacher evaluation and PD, she fell short of using these skills effectively. She was not well positioned to use data or to facilitate her teachers’ use of data to inform instruction, nor was she able to build the capacity of her staff in ways that would lead to changes in teacher practice. Next, I move to Theme #2.
Theme #2
The principal did not have a coherent and integrated approach to improving instruction.
The principal’s emphasis was on initiating multiple disconnected learning experiences that were not consistently aligned; these experiences came with no explanation of why and how they would improve instruction and no expectation of follow-through to ensure that new learning would take hold. Her role, thus, wittingly or not, was to create a disconnect across PD, the teacher evaluation process, and other types of learning experiences rather than to engage teachers in questions related to their own practice and to value their knowledge in ways that allowed for a coherent approach to using data to improve instruction.
Little (1993) decried the “workshop menu” form of PD prevalent in so many school
settings, describing it as “fragmented in content, form and continuity” (p. 142). She pointed out
that teacher learning is best fostered when teachers engage in genuine questions around their
practice in settings where their knowledge is valued. In addition, according to Hallinger (2003),
instructional leaders maintain a vigilant and persistent focus on improving conditions for
learning and creating coherence in values and actions across classrooms. Sofia had every
intention of creating this kind of coherence to improve learning conditions for teachers in ways
that had an impact on student learning. Yet her forward push on multiple fronts affected her
ability to ensure that new learning was being implemented.
In this section, I present evidence to demonstrate how the principal’s approach to using
data in the context of teacher learning experiences—particularly in the way she constructed the
teacher evaluation process and organized the PD meetings—inevitably became a well-meaning
but disjointed effort, with little to no common understanding of purpose or follow up on
commitments made, resulting in no appreciable impact on instruction. Despite Sofia’s portrayal
of herself as a leader who successfully aligned teacher learning experiences in a three-prong
effort (TE/Coaching/PD), this approach came across, from the teachers’ perspective, as a set of inconsistently aligned and disconnected pieces. The principal stated the following:
“When I look at professional development at my school, I always see the prong. I see the
PD prong, which is the actual instruction of teachers during professional development
days, late start days. Then there’s the prong of individual coaching. I make a very
concerted effort to align both of those things.
“I want to be able to coach you individually on things that you set your own
individual goals on, based on what we’re learning in PD. That’s why it’s so important,
these next couple of weeks. Because the majority of what they say they want to learn,
that’s what I have to make the PD about, which should align with their own goals, so they
feel that all the prongs meet. You know what I mean? So they feel like they’re supported
individually and collectively with their department and their team, and then also through
their own observations and feedback.”
In this excerpt, Sofia presented herself as a leader who made “a very concerted effort to align”
teacher learning experiences at her school, particularly the TE process, the individual coaching,
and the PD. She believed that she structured these experiences based on the teachers’ own self-
determined goals and needs. The evidence from teacher interviews and observations countered
this claim in both the teacher evaluation process and PD.
As mentioned previously, the principal enacted an ambitious, rigorous online teacher
evaluation system where every teacher was evaluated every year, including multiple informal
and formal observations and conferences. In an effort to provide teachers more opportunities for
coaching, she even purposefully doubled the number of informal observations required by the
school district. By considering data from multiple observations, Sofia hoped to gain a
comprehensive picture of a teacher’s practice. This, in theory, would allow her to help teachers
reflect upon their instruction and identify areas of weakness that could then be reinforced
through coaching and PD in order to progressively improve their practice. Despite her
intentions, this system became unwieldy and difficult to manage and did not connect the data points in a coherent way. The principal presented a process that included multiple pieces
that never really worked together to inform practice. She stated:
“I have six scripts on my computer still that I have to bucket. Basically the process is this,
you observe them several times during the semester. During that time, it’s based on the
teacher. For second semester, each teacher has set up two areas. They call them their
areas of growth. Here’s where I want to grow.
“What I do is I write that down, and I remember for each teacher. Before I go in, I
think, okay, what is it they’re working on again? I need to remember. I look at past
scripts and then progress made, etc. You go in and you do an informal observation
between 15 and 30 minutes, and you give them feedback in those areas. At the bottom of
the script for us, they have areas of strength, areas of growth, next steps.
“What I do is I fill all that in. Then I email to the teacher on the same day so that
they know immediately what I saw. Then, when they come in ... Every observation has to
have a post ... They'll come in within that week or 5 or 6 days. Sometimes it’s right away.
Sometimes it’s in a few days. Then we pull that script back up, and we talk about it.
Let's say for Sergio, who I’ve been coaching for the whole year. He’s really working on a
few things, student to student discourse, student facilitated conversation, and student
questioning. Are they asking each other the right question? Am I asking them the right
questions to get them to think on a deeper level, etc., etc. I'll be looking for that so that I
can give him information.
“I’ll be like, Oh, my God, in this particular period I know you have four included
special day kids. I saw that Austin did this, this, and this. Jake did this. That was really
good. I saw that Jose, even when I asked him a question, he gave me a one-word answer.
So he’s still intimidated to share. We've been working on that.”
The principal described a labor-intensive system that was difficult to manage and did not allow
her to engage teachers in genuine questions around their practice in substantive ways. This
system involved scripting observations, bucketing scripts on the computer, remembering specific
teachers’ areas of growth, setting up post conferences, and so forth. The demands placed on her
and her staff caused what administrators and teachers identified as possibly the most important
aspect of the process to be devalued and put aside: the time for debriefing, reflecting on, and reconceptualizing practice. Again, this is consistent with findings in the study of Park Falls’
teachers, who spent so much time in group conversations with peers reviewing the frequently
missed test items that they did not have sufficient time to deeply discuss instructional
implications (Horn et al., 2015). The conference she described in this excerpt focused mostly on
superficial suggestions to change student behaviors, not pedagogy. Likewise, in the conference
previously mentioned, Anne could not clearly recall the goals she had presumably identified, and
certainly did not have them at the forefront of her mind in her conversations with the principal.
In this and other moments, it seemed unclear who had, in fact, selected the goals, the teacher or
the principal. This lack of alignment was manifested in PD experiences as well.
Just as the principal’s intentions with teacher evaluation did not match up with what
actually occurred, there was no evidence to show that she had created a coherent PD plan with
her staff despite her statements. Structurally, the principal took well-intentioned steps to provide
teachers with a meaningful PD experience. She carved out time on Wednesday mornings and
Monday and Thursday afternoons to provide a protected space for adult learning to take place.
She also did extensive planning of PD with her leadership team to ensure that teachers received
training on research-based instructional strategies and to incorporate potentially powerful
protocols such as Student Work Analysis (SWA) and Peer Observation Protocol (POP) to
promote the development of professional learning communities (PLC). Likewise, she took steps
to build community and to nurture trust and a sense of connectedness among faculty members. Teachers felt comfortable observing each other’s classrooms and offering cool and warm comments as feedback.
Despite this push to plan and structure meaningful learning experiences and despite the
principal’s well-meaning intentions, the PD had a disconnected, disjointed quality. Apart from
the overriding push for Thinking Maps, teachers were given superficial activities, protocols, and
strategies and did not seem to understand why they were doing them, how they were connected,
or how they would impact student learning; above all, the WHY was missing. In addition, most of
the teachers and administrators expressed concern about a demoralizing lack of follow up. PD
was delivered too fast with no effort to encourage reflection, no follow-up on commitments, and
no push for mastery of what had been learned.
For example, Anne painted a picture of PD at the school that was starkly different from
the principal’s description, one that involved multiple loosely linked activities that seemed to
have little connection to the teacher evaluation process or to individual coaching. She described
it in the following way:
“So we’ve had a couple full-day Cascade days. We did one that was a full day of thinking
maps. Then maybe three or four times a year, we do all Cascade teachers by subjects, so
like all the English teachers will meet. We meet at nine o’clock. Then we have our
Wednesday morning PD which is all kinds of stuff, a lot of content. They’ll do strategies.
They’ll do classroom management. They’ll do Common Core rollout. They’ll do peer
observation protocol. We do that at least twice a year and then Student Work Analysis.
We do that often.
“Then on Thursdays, we’ll do like departments. I guess that’s not official
professional development, but we’ll do department meetings, grade level meetings, and
that’s where we do target tool kits for the kids that are really struggling. We’ll work on
them. Then once a month that’ll be union. There is a fourth one. Maybe like a faculty
meeting. Wednesday is the stuff that you use in your classroom. That’s how I see it
anyway. Thursday is like housekeeping stuff.”
In this excerpt, Anne listed a litany of activities and initiatives determined both by the larger
school district and by the school’s PD planning committee that seemed to have little connection
with each other. Even the Thursday schedule that was set up to provide teachers with more
collaboration time was taken up predominantly with operational tasks or work focused on
student, not teacher, needs. Except for the Thursday grade-level meetings where Anne explained
“we do target tool kits for the kids that are struggling,” she made no attempt to articulate the
purpose for any of the activities.
Paul, one of the assistant principals, also lamented the lack of “continuity” with the PD
schedule. When asked to describe how PD could be improved, Paul stated:
“To me, it’s the continuity piece. Like, things come up and we have to adjust, and we get
mandates from our district and from teacher feedback. But I would like to have it
continue instead of like we’re studying this for a couple weeks and then we cycle to
something else. I’d like there to be less of that interruption, which is hard to control.
I think some schools in our district are terrible at having continuity, so I think we’re one
of the better ones. But I think that’s something we continue to focus on, and have
Wednesday be the input and then Thursday morning when we have the collaboration
time. So we’re working on changing those committees around so we won’t have as much
work to present next year. I really want to keep that Wednesday to give you material or
follow up, and Thursday you’re in departments or grade level teams.”
In this excerpt, Paul described an ongoing, relentless forward push in the PD schedule that was
“hard to control.” He also mentioned the different, competing pressures that weighed upon them
and commanded their time such as district mandates and teacher feedback. He expressed a
yearning for something meaningful that continues without “interruption” or “cycling to
something else.” He seemed to want to limit presentations or “input” to Wednesdays, protecting
the Thursday time for collaboration.
Steven’s remarks provided another voice reinforcing the idea that teachers experienced
an ongoing “struggle” with the push toward new things. He expressed a strong desire to slow
down so that he could stick with one thing. He stated the following:
“There’s commitments, but I think again the biggest struggle is the follow up to those
commitments. Like I said, some of these bigger things like thinking maps, they are
committed to that right now, but in all honesty it seems to always change every year.
There is a new technique or something new that is a kind of trending educational topic,
and when I feel like I am beginning just to master one of them we are pushed toward this
new thing. For me, that’s one of the biggest struggles, even with thinking maps now. I
like it. I like using it in my class, but I feel like I don’t need to utilize it as much as they
would like me to, or at least I don’t need to utilize all the different thinking maps. I think
there are 12 of them that they want me to use. For me and my class, I see maybe three or
four of them that I can use on a regular basis that will definitely help the students.
I think we are constantly moving forward. In regards to thinking maps, certainly those are
things that I’m going to use. I continue to use CATCH, Socratic Seminars. Those are
things for me at least that I really enjoy in the classroom.”
Anne explained how this nonstop, rushed, relentless forward push exacted a toll on teachers’
ability to implement the new learning. She spoke about the urgent need for follow up, stating:
“I think they give us a lot. They’ve slowed down, but I think that it might be nice if they
teach you to do something and then a week later they check to see if you’re doing it, and
then a week later they check if you’re still doing it. That would make you actually do it.
Otherwise, with the thinking maps, it’s like we have a whole PD on them. We all used
them for a couple of weeks and then it’s like they go away. They’re trying to hold us
accountable but you saw this morning when the trainer was like, who brought in student
work? I was like, I didn’t know we're supposed to bring student work.
“I think if it’s something that really matters, they need to check and build in a
follow-up or accountability. It’s that thing that's a little bit annoying as well. We’re adults
and we’re professional. Let us do what we want. But it’s like, if you guys decide
something’s important, then make us do it. And if not, then... You know what I mean?”
Elizabeth voiced this same concern. When asked how PD could be improved, she stated the
following:
“I think the biggest thing with our PD that I get frustrated about is the lack of follow up.
I think we have excellent PD up to the point where it’s like now what are we going to do
with it. We’re brainstorming, brainstorming, and then at the very end, I feel like there’s
not enough follow up, and I think that definitely, going back to data, that’s a part of our
data that we’re missing. It’s like we can do all of this work, but then the next PD is
sometimes something different, and then we forget about what we did before.”
This section provided evidence to demonstrate how the principal’s approach to organizing
teacher learning experiences—particularly the way she constructed the teacher evaluation
process and organized the PD meetings—ultimately became a well-meaning but fragmented,
disjointed effort. According to Elmore (2002), effective professional development involves
developing the capacity of teachers to interact socially on problems of practice and requires
sustained commitment to consistency and focus over time. The continuous forward thrust of pushing new strategies and initiatives left little to no time for understanding the purpose or for following up on commitments made to ensure new learning would be implemented. It certainly left
no time for teachers to interact around problems of practice in a sustained and consistent way, as
Elmore (2002) suggested. Next, I present Theme #3.
Theme #3
Professional development was delivered mostly in a top-down fashion and was based on
a knowledge for practice model (Cochran-Smith & Lytle, 1999) that resulted in the exclusion of
teacher voice and inhibited substantive improvement in practice. School environments that
foster teacher learning encourage and cultivate inquiry around problems of practice in settings
where teachers are viewed as knowledgeable, active participants (Little, 1993). The principal at
Star High School did not employ a distributive leadership approach and, as a result, excluded
teacher voice and devalued teacher knowledge. The way that this played out was that PD was
not driven by teachers’ interests or their perceived needs. Instead, while they expressed a desire
or yearning to take charge of their own learning, teachers continued to be subjected to the
principal’s decisions about what content or topics were worth engaging in on a regular basis.
Therefore, their knowledge seemed to be valued only in rare formalized spaces where they were put in charge of their own learning and where they, not the principal, decided what was important. However, neither approach translated into richer conversations about practice.
Furthermore, neither approach had the right structure in place to enable teachers to go deeper
into thinking about their practice. The principal’s hierarchical approach reinforced the tension between her tendency to be authoritative and the teachers’ desire for
autonomy, neither of which led teachers to think about improvement of practice in more
substantive ways.
In interview data, teachers expressed a desire to be the driver of their own learning, to
work in collaboration with each other around topics that they themselves decided were important
in a knowledge of practice approach (Cochran-Smith & Lytle, 1999). This included time to
collaborate informally with colleagues, time to study educational research, and time to work
independently. Likewise, they yearned for something more interconnected, something more
teacher-determined. They wanted to be treated like experts; they wanted to be more than just on
the receiving end. They wanted to construct their own knowledge. Steven stated the following:
“For me, the most enjoyable part of PDs is not when we are being talked to but when we
get to break up into our groups and really discuss as teachers what we want to see happen
in our classes. For me, yes. I like when we work with departments like that, grade level
teams. We will do that sometimes, or content areas. I really like just kind of working
within our department to see again what we need to focus on.”
This was an example of the desire that many teachers expressed to control the content of their
own learning. Steven stated twice that he wanted a formalized space in which he and his peers could decide what was most important to do, rather than being told by someone else what they needed to do. The tension between authority and autonomy was evident here. Implied in Steven’s statements was the idea that teachers used data to inform what they thought was important in their classrooms. Yet they approached this act with an unclear sense of how to use data in this context. It was evident that the structures were not in place to support their use of data in ways that fostered their ability to examine their practice. Their capacity to use data had not
been built. Steven continued:
“I think, symbolically, and this is in all honesty, we are told we have a voice. We are able
to make suggestions. In all honesty, I don’t really see those suggestions being played out
later or if they do, it’s very lightly treaded on or lightly discussed, and after that we don’t
go back to it. We push forward because, again, there is an agenda. Like you said earlier,
there is a plan that is in place that I don’t know exactly where it’s going yet, but they
have specifically, this is what we do now, next PD, next PD, next PD. I feel like they say
we have the opportunity to speak up, but it’s not really implemented and pushed at the
same time.”
In this excerpt, the teacher characterized the principal’s approach to PD as a top-down
experience where teachers’ voices were lost or devalued. Their suggestions were not taken
seriously, nor were they ever acted upon. They were swallowed up and forgotten in a relentless
forward push to satisfy an agenda that was not clear to this teacher. The principal paid lip-service to the idea that teacher voice was important when, in actuality, it was not.
In opposition to the principal’s approach, most of the English teachers made reference to
an experience they perceived to be very powerful, an indication of how infrequently they were
given an opportunity to engage in and control their own learning. In this experience, they
worked with peers to vertically align writing scaffolds across grade levels to improve student
writing. This was an example of a professional learning experience that nearly all the English
teachers felt positive about and valued as meaningful for their practice. Elizabeth articulated
what it was like to be involved with her colleagues in this PD experience:
“In our department teams, we did a really cool thing. Instead of a student work analysis,
we decided to do a teacher work analysis. Yeah, as a department I think it’s easier. Last
semester, we definitely had more say in what we wanted to do as a department. For
example, as the English department chair, I knew for sure what we’re missing were our
scaffolds for our students. We thought we were providing too many scaffolds to the point
where our 12th graders couldn’t write an essay on their own without the scaffolds. What
we did all semester whenever we got the time to meet was we would look at student work
and we would observe each other just on that one topic.
“Again, we were working on scaffolds. All of our teachers agreed to bring in an
essay assignment, but basically all of the lessons and all of the scaffolds that we provided
to students. We weren’t concerned about what the students could produce. We just
wanted to see from 9th through 12th grade what we were giving our students for
scaffolds. In our departments, we basically gave each other feedback and we looked at,
okay, here are the scaffolds for 9th grade. Does it look the same in 12th? Because it
shouldn’t. I teach 9th, 10th, and 12th grade. It was really helpful for me to see the
scaffolds that I was giving and that my colleagues were giving and whether or not it
flowed vertically from grade level to grade level.
“The comments that some of my colleagues gave to me were like, maybe you’re
not providing enough scaffolds for your 12th grade students, because I would just give
them a prompt, and it’s not like flowing from 11th to 12th grade. We came up with really
concrete ways to fix that.”
Teachers in the English Department proudly remembered a time when they engaged in a teacher-
directed learning experience where data was collected and reviewed for an instructional purpose,
helping students become increasingly independent as writers. The very act of changing the name
of the protocol from Student Work Analysis to Teacher Work Analysis symbolized their
ownership of the process. In multiple sessions over several months, these teachers brought data
to bear in the form of scaffolds, possibly handouts, to study how those scaffolds played out
across the grade levels, figuring out what they interpreted those scaffolds to mean in terms of
student opportunities to have support versus to develop independence. This effort suggested that
these teachers were capable of using data (the scaffolds) to analyze and make decisions about
their practice.
Likewise, this teacher-owned activity seemed to produce a richer instructional
conversation than anything else they had done together. It involved a self-directed act where
they applied a protocol to gain insight into the way that they were working within their own
classrooms and across classrooms to build independent writers. Furthermore, they committed to
taking certain actions and following through with them. Data seemed to drive the discussion and
outcome. Notably, this collaborative learning experience also demonstrated that teachers had
internalized many of the data-driven practices promoted by the principal. They applied the skill
set that she had introduced to them. Given the freedom to exercise their agency, they chose, of
their own volition, to do the kinds of data-driven activities she asked of them in PD.
Other voices chimed in about the impact that this same experience had on their approach
to teaching writing. Sue stated the following:
“We actually changed as a group after looking at what we were doing for essay writing.
We actually ended up changing our vertical plan in terms of how we teach writing,
because we were realizing that kids were receiving so much support throughout 10th and 11th grade that they weren’t really learning much about being independent writers. That was a really productive discussion we had.”
Sue described the experience of working with peers to collaboratively examine data in positive
terms. “We actually changed,” she stated. The words she used, including “changing,”
“realizing,” and “productive,” underscored the dynamic, creative nature of this experience for the
teachers and provided a clue about why it resonated so deeply with them.
Francisco, likewise, responded by sharing the same example when asked if he and his
colleagues ever held themselves accountable for coming up with or enacting joint agreements
made in PD. He stated the following:
“A couple years ago within the English department we’ve done our vertical plan as far as
what we want our students to know each grade level, and what we want them to know by
the time they leave here, by the time they graduate. We have had several meetings where
we do our own teacher work analysis and we will bring packets of everything down from
the handouts that we give out to the short stories or the texts that we use. We try to
annotate everything as much as we possibly can and just basically, we’re like an open
book and we’ve had our own fellow colleagues basically dissect everything that we do
and evaluate it and basically give us critical feedback as to what needs to be taken care of
in each grade level, what’s a concern, what’s a priority, what’s something that we
definitely need to take care of.”
All of these teachers viewed this experience as one way for them to be productive, a way for
them to have more control over themselves and their decisions. It seemed to be in contrast with
what they regularly experienced in PD. Everything they did in PD happened to them. They had
no control over what happened. Someone presented information to them and decided what they
would do with it. Someone else evaluated them. Someone else observed them and provided
feedback. This cycle was out of their control. The only thing that had the potential to be inside
their control was an experience like this where they could determine what was important. They
could position themselves as teachers in control of their own destiny. They could be viewed as
experts. In a knowledge of practice sense, this was the only time they saw themselves as being
the owners of the work they needed to do and as experts in their own right, as opposed to being viewed as teachers who needed to be under the control of someone else.
This experience, likewise, exposed the tension between the principal-driven learning
process and the teachers’ desire to initiate their own learning process. The teachers organized
their own learning around a problem of practice they were concerned about, the impact of
scaffolds across grade levels on student writing. They collaboratively decided when and how
students should have support as they increasingly move toward greater independence as writers.
The teachers never mentioned how the scaffolds were used or the pedagogy that went along with
the use of the scaffolds. Nonetheless, they conducted a data-driven process where they had a
richer conversation than at any other time in their PD experience. This experience signified a
teacher-directed act where teachers applied a data-driven protocol to gain insight into the way
that they were working within their own classrooms and across classrooms to build independent
writers. They used data to inform their progression from one grade to the next over time, and this
data use translated into a change in their approach to teaching writing, to fostering and creating
more space for their students to become independent writers. In addition, this experience
prompted them to commit to taking certain actions and following through with them. Thus, they
used a data-driven process to make a structural change.
A closer examination of this experience, however, demonstrated that, without the right
supports, teachers became stuck at the surface, focusing only on activity-based aspects of their
practice in ways that were only loosely connected to data. They never discussed the instruction behind the scaffolds or rubrics. There was no discourse on instruction, no discussion of what they said to
their students or how they interacted with them. The experience remained at the margins of their
teaching. Nonetheless, it was more thoughtful, more coherent. The content of their conversation
was an improvement over other PD experiences. In the end, the autonomy did not lead to a
particularly rich data-driven experience that had a profound impact on their pedagogy.
Theme #4
The principal used external accountability demands, specifically teacher evaluation and
externally imposed PD, to support her instructional agenda. Consistent with the literature
(Smith & Andrews, 1989), the principal acted as a buffer, shielding her staff from external
pressures. She also took advantage of the larger district’s directives by positioning herself as the
teachers’ champion and defender, someone who responded to an unpopular system on their
behalf, while wholeheartedly believing in it and supporting it. She internalized the idea that
using data improves instruction, and she used data as evidence to leverage or promote practice
change. In short, she buffered her staff from the harshest aspects of the external pressure even as she used that same pressure as leverage with them.
The principal cultivated an environment where her staff viewed her as positive and the
system as negative. Because her staff expected her to shield them from the harsher aspects of the
accountability demands, she undermined her own ability to tackle some of the harder things, like
expecting her teachers to critically examine and consider questions about their pedagogy and not
just pushing them to make superficial activity-based or structural changes in their classrooms or
changes in student behavior. In this approach, there was a notable lack of reciprocal
accountability (Elmore, 2003). The principal exerted energy trying to inspire her staff with her
charisma, offering ideas, resources, and strategies, assuming the role of exceptional teacher and
instructional leader, but never actively building their capacity to meet the outside demands.
Likewise, she did not know how to support teachers’ use of data to improve their practice. In
this section, I present evidence to demonstrate how the principal used external accountability
demands, specifically teacher evaluation and externally imposed PD, to support her instructional
agenda, simultaneously exerting pressure while buffering her teachers from the most unpleasant
aspects of both.
The principal constructed the way that her staff understood the teacher evaluation system
in an effort to further her agenda while making it the scapegoat, using it to send the message that
we are doing the best we can with what we have been told to do. She cushioned it, separated it,
and reduced the emphasis on the negative aspects of it, so that she could make it palatable for
teachers and so that she could use the data derived from it to leverage her work with the teachers.
The principal presented the teacher evaluation process as a mixed bag, something that
was excessive and demanding but that had benefits, and her teachers internalized this view of it.
On the one hand, they saw it as cumbersome and time consuming, creating unnecessary stress,
discomfort, and pressure. Some described it as little more than a compliance act, developed and
implemented without teacher voice, intended only to satisfy some distant high stakes
accountability measures. Others lamented a process that required each teacher to write a
grueling five-page lesson plan, setting them up to perform no more than a “dog and pony show”
on the day of the observation that was not “organic” and definitely not representative of their
practice as a whole.
At the same time, most of the teachers viewed the teacher evaluation process as valuable
and could articulate ways that it benefitted them. Many of them particularly valued the
conversations and feedback they received from their administrator/evaluator. In these
discussions, evaluators often pointed out things teachers might not otherwise notice, offered
helpful strategies, and validated their instructional choices. Likewise, the prospect of having
their instruction critiqued created a discomfort in some teachers, leading to a healthy
disequilibrium. Furthermore, many valued the process for encouraging the increased use of data
as a tool to examine practice. Anne stated that the teacher evaluation process made her want to
do better or “be on her A game.” She pointed out that it “forces you to look at the things you
should be doing” and “makes you less lazy.” She also acknowledged that the process kept her on
her toes, reminding her to put forth daily effort to demonstrate the best practices laid out in the
rubric so that students might participate more “organically” on the day of the observation. She
further stated that it kept her accountable.
Mindful of her teachers’ reactions to the teacher evaluation process, in the following
excerpt, Sofia described her role as mediator, explaining how she buffered her teachers from
these external demands while simultaneously ensuring that they complied with them. She stated
the following:
“We have to mediate because the district expects a certain level of quality. Our job is to
hold them and make them accountable for that level of performance. We also have to get
them to that level of performance. My job as a mediator is to always see every
observation from a teacher’s point of view first. From a kid’s point of view and from the
teacher’s point of view.
“From a teacher’s point of view, it’s like, Okay, they’re coming in. What am I
doing? I want to be natural. I don’t want to be fake. I want to do what I do. I want
feedback on what I do. I don’t want it to be unauthentic, but at the same time, they’re
going to catch me. They’re going to hold me to a certain level of expectation.
“Some things that are really simple but obvious points of mediation would be, like
for example, having a three-part objective on the board. When we taught, we didn’t do
that. What? Why does writing that show that I’m such a great teacher? I could have one
or not have one, and I could be a really good teacher. I have veteran teachers who are
coming from that paradigm. I have new teachers who are like, I just want to do
everything right.”
In this excerpt, the principal presented herself as a leader who empathizes with the teachers’
ambivalence about the evaluation process, understanding how they simultaneously want to be
“natural” and authentic in their presentation of a lesson while being expected to demonstrate
certain performance criteria that they might not own or agree with. She gave the example of the
“three-part objective” that each teacher needed to post on his/her board prior to the formal
observation. They might not have believed that doing this was necessary or would improve their
practice, but they did it to be compliant with the process. The principal positioned herself as the
person who needed to communicate empathy with the teachers while keeping them accountable
by holding them to “a certain level of expectation.”
The principal wholeheartedly promoted the teacher evaluation system mandated by the
larger district, believing that it produced the kind of objective data that would help teachers make
changes to their practice such as writing a three-part objective on the board or asking more
challenging questions to students or grouping students in more intentional ways. To ensure that
her teachers complied with the excessive, burdensome aspects of the process, she cushioned it for them, convincing them of its usefulness. Ultimately, what the teachers were
required to do positioned them to be recipients of information that produced no more than minor
tweaks of their practice at the margins. Evidence to support these assertions was presented
earlier in this chapter in the analysis of the informal conference and in teacher interview data.
The principal used external accountability demands, specifically teacher evaluation and
externally imposed PD, to support her instructional agenda. She agreed with these demands and
she was able to scapegoat them in pursuit of her agenda, while simultaneously undermining her
ability to foster improvement in teacher practice. The principal’s work around teacher evaluation
existed in a larger context. Her well-intentioned efforts to implement an externally imposed
system were potentially limited or enhanced by the training and supports she received. This is
consistent with Goldring et al.’s (2008) findings.
Just as she did with the teacher evaluation system, the principal used the district’s externally imposed PD initiative, Thinking Maps, to support her instructional agenda. However, by
allowing this training to dominate PD time over several weeks and contribute to the
fragmentation of her approach to PD, she failed to ensure that it would become a productive part
of her teachers’ work or that it would have an impact on practice. The principal described her
approach to PD as follows:
“The way that we work it is you build capacity and teacher leadership to learn how to do
professional development for each other, so that teachers get a sense that it’s not just
admin doing a workshop. It’s really us learning, and it’s coming from and growing out of
our needs, which they have communicated to us.
“They communicated to us last year they wanted to do Thinking Maps this year,
so that’s why we’re doing it. It’s all kind of answering what they want. Here’s the tricky
part, it also has to align. It has to come from that, from teachers, but it also has to align
with the Cascade’s initiatives for the year for professional development.
“We get two pieces. We get what the teachers want. We get what Cascade wants.
We’re in the middle. So we have to create something that makes sense to the teachers
who we work with, and that feels authentic. It also has to be what the district wants.”
The principal clearly understood that effective PD, as the literature confirms, occurs when
teachers are engaged in self-directed learning around problems of practice (Webster-Wright,
2009). She seemed to convince herself that her teachers wanted Thinking Maps, that they
requested it, when, in actuality, the training was not an option but a mandate from the school
district. She allowed her teachers to go through session after session of training based on a
knowledge for practice model even as teachers became increasingly disengaged and
disconnected from it. Teachers interviewed stated that the training was going “too fast.” When
asked if teachers were getting it, Anne responded:
“Nobody is, and that’s why we don’t use them. What [the trainer] did today was so
levelled beyond what I could do with my kids right now.”
The principal acknowledged this disconnect but argued that the top-down approach of the
training, consistent with the knowledge for practice model, was necessary at the inception. The
principal described the Thinking Maps training as follows:
“I feel like what’s different about it, and that’s an instinctive teacher-based response, is
this is still someone from the outside coming in. It’s an expert coming from the outside,
and we appreciate that, but it’s still not of us. I think next year, when we make it more of
our own teachers and our own work and our own practice, we’re going to get greater buy
in, and we’re going to get greater use.
“Then next year, we’re doing writing off of the maps. It’s going to be talking off
the maps and writing off the maps, which is what teachers want. I have to align it. I went
to a really cool PD in New York from Uncommon Schools. I don't know if you’ve heard
of them. Doug Lemov is from Uncommon Schools. They do all the Teach Like a
Champion strategies. (excerpt shortened)
“Yeah, so anyway, they do PDs. They did a really great one on Close Reading.
I’m going to take that, and I’m going to write a PD sequence, like a unit, so I know that
combined with Thinking Maps, I think the teachers are going to go, Oh, I see how I could
use this. I’m not going to stop, because what I know is this: one shot or a few shots here
and there with no deep learning and no peer observation and no student work analysis
doesn’t get ingrained into a school’s culture of instruction. This, one semester, there’s no
way that that’s going to be like, now everyone does Thinking Maps.
“It has to be, in my experience, two to three years of consistent implementation
and practice through professional development and expectation, through peer
observation, is when teachers become, I use this. I implement this to fidelity. Now, we’ve
just started. In my mind, we just started. (excerpt shortened) They are just getting their
feet wet. They are just getting exposure. They have to learn about Thinking Maps.”
The principal, once again, represented herself as someone who felt empathy for the teachers’
below-the-surface resistance to the district-mandated training. She articulated their lament. “It’s
an expert coming from the outside,” she stated. “It’s not of us.” But she also believed that
teachers would stay with her if she could just make it more pleasant for them and perhaps tie it to
a more exciting set of strategies from another PD source. She recognized the limited impact that
short-term skills training workshops had on teacher practice, but strongly believed that if she
could just get them to persist and work through this initial stage, soon enough they would own
Thinking Maps or any other district-mandated training and implement it “to fidelity.”
In this section, I provided evidence to demonstrate how the principal consistently
positioned herself as a mediator between outside external pressures and teacher responses to
those dictates. In both the teacher evaluation process and PD, the principal empathized with the
teachers’ complaints and concerns about outside demands while exerting pressure on them to
comply with these demands. This strategy undermined her ability to foster the kind of
improvement in practice that she desired.
Conclusion
Findings from this study revealed that the principal of Star High School, Sofia, presented
herself as a knowledgeable, effective instructional leader whose greatest asset was her ability to
be viewed by her teachers as one of them. She believed wholeheartedly that her energy,
charisma, forward thrust, and expertise were enough to inspire teachers to improve their practice.
However, in her effort to be viewed as knowledgeable and indispensable, she carried the weight
of the intellectual work and did not build the capacity of her teachers to reflect on their practice
or examine their pedagogy. Neither did she build the leadership skills of her staff to use data in a
way that would foster instructional innovation or facilitate the kind of discourse that might push
teachers to a deeper level of thinking. Her efforts in teacher evaluation and professional
development, in particular, resulted in little more than minor tweaks to practice. Finally, she
took advantage of external accountability demands in both TE and PD to buffer her teachers
from the unpleasant aspects of them, while simultaneously believing in them and using them as
leverage to push her instructional agenda with her teachers.
CHAPTER FIVE: DISCUSSION, IMPLICATIONS, AND FUTURE RESEARCH
This study examined the role of a school leader who, in the face of accountability pressures, used data, inclusive of teacher evaluation data, for the purpose of improving teacher practice. In the words of Bloomberg and Volpe (2015), the aim of this qualitative
research was “to tell a richly detailed story that takes into account and respects a context and that
connects participants, events, processes, activities, and experiences to larger issues or
phenomena” (p. 180). Guided by this spirit, a qualitative case study methodology was employed
to respond to the following question:
What is the role of a high school principal in using data, inclusive of teacher evaluation
data, as a tool for teacher learning?
To answer this question, I purposively sampled a high school in an urban setting that
included a principal actively using data, inclusive of teacher evaluation data, for the purpose of
improving practice and other criteria previously mentioned. I spent four months in the field,
collecting data for this study. During this time, I conducted 11 interviews of 45 to 60 minutes each, eight
with teachers in the English Department and three with school leaders, including the principal
and two assistant principals. In addition, observations of one principal-teacher informal
conference and eight professional development sessions were conducted, including whole group
meetings with teachers from all content areas and smaller break-out sessions exclusively with
English teachers. Because I had been granted permission to record these meetings, they were
also transcribed. Related documents, including notes taken by the principal and teachers during
PD meetings, teacher evaluation and PD protocols and policy documents, post-PD teacher
reflections, and classroom observation scripts, among other documents, were collected and
reviewed. Pseudonyms for the school site, the school district, the principal, the assistant
principals, the teachers, and students referred to in the interviews were created to ensure
anonymity. Observation data were scripted and transcribed, interview data were transcribed,
supporting documents were gathered and reviewed, and all were analyzed using an inductive
approach. This final chapter is a discussion of the major findings and conclusions drawn from
this research. Before summarizing these findings, I once again acknowledge the possibility that
there might be something beyond the scope of the data that I did not get to see before, during, or
after collection of the data. The findings represent only what the data offered to me. This
chapter is organized as follows: summary of findings, implications for practice, and
recommendations for future research.
Summary of Findings
The principal of Star High School, Sofia, worked diligently to present herself as a
knowledgeable, inspiring instructional leader. Further, she successfully crafted an image of
herself with her teachers as an exceptional teacher who intimately understood and empathized
with the frustrations, concerns, and needs of teachers. In fact, she believed that she, almost
singlehandedly, could give teachers the skills and knowledge necessary to improve their practice.
However, in her effort to be viewed as knowledgeable and indispensable, she carried the weight
of the intellectual work and did not build the capacity of her teachers to reflect on their practice
or to examine their pedagogy. Neither did she build the leadership skills of her staff or distribute
responsibility, enabling them to use data in ways that would foster instructional innovation
and/or facilitate the kind of discourse around data that might push teachers to a deeper level of
thinking. Her leadership role in both teacher evaluation and PD resulted in little more than
minor tweaks of instructional practice. She promoted the use of data, viewed it as important,
wanted to make it accessible, but did not know how to structure data-driven dialogues that would
engage teachers in collaborative inquiry, problem solving, and decision making.
Likewise, the principal did not have a clear approach to improving instruction in PD. Her
emphasis was on creating a sense of urgency, while simultaneously initiating multiple
disconnected learning experiences that were not consistently aligned. She also made an effort to
provide her teachers with a smorgasbord of trendy new strategies that they could choose to
implement or discard as they pleased. She did not take time to define the purpose of these
professional learning experiences or connect them to a larger plan or set up a mechanism to
ensure that new learning would take hold and be faithfully implemented. Furthermore, even
though she paid lip service to the idea that teachers were included in the planning and selection of these activities, they were continuously set up to be the recipients of others’ knowledge. They
were not put in a position to focus on their own self-selected problems of practice or to be
generators or co-constructors of knowledge.
Much of her approach to PD and TE seemed to be dictated by external accountability
demands. In meeting these expectations, the principal viewed her role as a buffer, protecting her
teachers from the unpleasant aspects of these demands while simultaneously using them to
leverage her work with teachers. She believed in the use of data to drive instruction, making it
accessible, presenting it regularly to her staff, and promoting its use as well. Nonetheless, her
focus was always on making superficial changes in student behavior, not on helping teachers
think deeply about their pedagogy. She was not well equipped or well positioned to promote the
use of data among her staff as a tool to inform instruction.
Implications and Recommendations
This dissertation explored the way a high school principal used data to inform practice.
In this part of the chapter, I discuss potential implications of the findings for practice and policy
and make recommendations for research, all of which may be useful for future researchers,
policy makers, and practitioners.
Practice
One implication that emerged from this study is that principals need to do more than just
share their knowledge and expertise about instruction to have an impact on practice. When
principals act as the keepers of the knowledge and owners of the learning process, teachers are
placed in powerless, subservient positions whereby their voices are silenced or devalued. School
leaders need training in how to draw out reflective thinking in teachers, help them tap into their
own knowledge, and rely on their own thinking to solve problems of practice.
Another implication that emerged from this study is that it is not enough for school
leaders to promote the use of data by presenting it and making it accessible to staff. Instead, there
is a need to focus on what happens to the data, how we shape the discourse around it, and how
we use it to make informed decisions that are actionable. This is consistent with the work of
Marsh (2012), who argued that “data alone does not guarantee use” (p. 3). According to Marsh,
school leaders often lack the capacity to interpret data and connect it to instructional practices in a way that leads to shifts in teacher thinking. No one had built Sofia’s capacity to do this. She was on the receiving end of a school district that did not seem to have a
system in place to build the capacity of its principals. She had been given the responsibility for
enacting policies that she was not equipped to manage.
Additionally, it is important to recognize that the work Sofia was engaged in did not occur
in a vacuum; it reflects a larger challenge in the field. Principals are not well
positioned to have reflective, complicated conversations with teachers about teacher evaluation
data—data that are sensitive and personal—in a way that will impact their practice. The systems
that are currently set up are time consuming and, quite simply, overwhelming. They do not take
into account the multiple demands placed on principals. This implication and the previous ones
led to the following recommendations for policy and research.
Policy
Policy should not simply expect principals to use data without an understanding of how
that use is meant to unfold. School districts need an explicit policy on data use, and such a
policy must recognize that effective data use requires more than making data accessible or
providing tools; it is critical to build the capacity of staff to do the work. Consistent with what
Marsh (2012) found in the literature, when drafting state or district policy around the use of
data, it is insufficient to simply tell teachers and staff members to use data or to distribute
materials for doing so. Therefore, the recommendation is that infrastructure be developed to
equip leaders with the knowledge, skills, and abilities necessary to use data more effectively.
Studies on data use in schools show that “principals and central office administrators
provide much-needed vision, modeling, and direction, enabling data use interventions to
flourish” (Marsh, 2012, p. 31). School leaders need training to provide this kind of support. It is
not enough to employ principals who are ostensibly knowledgeable or enthusiastic about data
use. Sofia, the principal in this study, immersed herself and her staff in data, but she fell short in
knowing how to engage teachers in discussions around data. She presented data to her faculty
without ever asking what the data were meant to accomplish, and she likewise did not know
where interpretation would lead, how to determine an action plan, or how to follow through to
ensure that new learning was implemented. Furthermore, she neither held her staff accountable
for the work nor equipped them to do it; she retained sole ownership of it. Teachers felt
disempowered by her, which led them to abdicate responsibility.
Research
Policy makers at all levels expect schools to put data front and center in the service of
instructional improvement, and yet the research on educators’ use of data is limited in quantity
and quality (Marsh, 2012). Marsh pointed out that school leaders and teachers lack adequate
skills and knowledge to formulate questions around the data, select indicators, analyze and
interpret results, and then respond to the new knowledge with solutions. Marsh indicated that
educators often get stuck at these later stages, particularly lacking skills to move from knowledge
to action. The research community would do well to focus on this area of research. Equally
important, research is needed on how the implementation of new teacher evaluation systems
affects the role of principals and whether it positions them to be effective instructional leaders.
In addition, there is a need to examine how principals’ choices regarding data use support or
impede teacher learning. Consistent with the work of Marsh (2012), more research examining
how data use is taught in teacher and administrator preparation programs could be beneficial.
Finally, more quantitative work in this area could yield more generalizable findings.
References
Aaronson, D., Barrow, L., & Sander, W. (2007). Teachers and student achievement in the
Chicago Public High Schools. Journal of Labor Economics, 25, 95–135. Retrieved from
http://www.jstor.org.libproxy2.usc.edu/stable/10.1086/508733
Baker, B. D., Oluwole, J. O., & Green, P. C., III. (2013). The legal consequences of mandating
high stakes decisions based on low quality information: Teacher evaluation in the race-
to-the-top era. Education Policy Analysis Archives, 21(5).
Blase, J., & Blase, J. (1999). Principals’ instructional leadership and teacher development:
Teachers’ perspectives. Educational Administration Quarterly, 35(3), 349–378.
Bloomberg, L. D., & Volpe, M. (2015). Completing your qualitative dissertation: A road map
from beginning to end. Thousand Oaks, CA: Sage Publications.
Brady, L. (2009). “Shakespeare Reloaded”: Teacher professional development within a
collaborative learning community. Teacher Development, 13, 335–348.
Brandt, C., Mathers, C., Oliva, M., Brown-Sims, M., & Hess, J. (2007). Examining district
guidance to schools on teacher evaluation policies in the Midwest Region (Issues &
Answers Report, REL 2007–No. 030). Regional Educational Laboratory Midwest.
Burke, J. C. (2004). Achieving accountability in higher education: Balancing public, academic,
and market demands. In J. C. Burke (Ed.), The many faces of accountability (pp. 1–24).
San Francisco, CA: Jossey-Bass.
City, E. A., Elmore, R. F., Fiarman, S. E., & Teitel, L. (2009). Instructional rounds in education:
A network approach to improving teaching and learning. Cambridge, MA: Harvard
Education Press.
Cochran-Smith, M., & Demers, K. (2010). Research and teacher learning: Taking an inquiry
stance. In Teachers as learners (pp. 13–43). Dordrecht, Netherlands: Springer
Netherlands.
Cochran-Smith, M., & Fries, K. (2005). Researching teacher education in changing times:
Politics and paradigms. In M. Cochran-Smith & K. Zeichner (Eds.), Studying teacher
education: The report of the AERA Panel on Research and Teacher Education
(pp. 37–68). Mahwah, NJ: Lawrence Erlbaum.
Cochran-Smith, M., & Lytle, S. L. (1999). Relationships of knowledge and practice: Teacher
learning in communities. Review of Research in Education, 24, 249–305.
Danielson, C., & McGreal, T. L. (2000). Teacher evaluation to enhance professional practice.
Alexandria, VA: ASCD.
Darling-Hammond, L. (1999). Teacher quality and student achievement: A review of state policy
evidence. Seattle, WA: Center for the Study of Teaching and Policy, University of
Washington.
Darling-Hammond, L. (2013). Getting teacher evaluation right: What really matters for
effectiveness and improvement. New York, NY: Teachers College Press.
Desimone, L. M. (2009). Improving impact studies of teachers’ professional development:
Toward better conceptualizations and measures. Educational Researcher, 38, 181–199.
Donaldson, M. L. (2011). Principals' approaches to developing teacher quality: Constraints and
opportunities in hiring, assigning, evaluating, and developing teachers. Washington, DC:
Center for American Progress.
Elmore, R. F. (2002). Bridging the gap between standards and achievement. Washington, DC:
Albert Shanker Institute. Retrieved from
http://www.nsdc.org/library/results/res11-02elmore.html
Elmore, R. F. (2003). Doing the right thing, knowing the right thing to do: The problem of failing
schools and performance-based accountability. Cambridge, MA: Harvard Graduate
School of Education and Consortium for Policy Research in Education.
Elmore, R. F. (2007). School reform from the inside out: Policy, practice, and performance.
Cambridge, MA: Harvard Education Press.
Fuhrman, S. H. (2004). Introduction. In S. H. Fuhrman & R. F. Elmore (Eds.), Redesigning
accountability systems for education (pp. 3–14). New York, NY: Teachers College Press.
Fullan, M. (2004). Leadership across the system. Insight, 61, 14–17.
Fullan, M. (2011). Choosing the wrong drivers for whole system reform. Melbourne, Australia:
Centre for Strategic Education.
Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001). What makes
professional development effective? Results from a national sample of
teachers. American Educational Research Journal, 38, 915–945.
Goldberg, B., & Morrison, D. M. (2003). Co-Nect: Purpose, accountability, and school
leadership. In J. Murphy & A. Datnow (Eds.), Leadership lessons from comprehensive
school reforms (pp. 57–82). Thousand Oaks, CA: Corwin Press.
Goldring, E., Cravens, X. C., Murphy, J., Porter, A. C., Elliott, S. N., & Carson, B. (2009). The
evaluation of principals: What and how do states and urban districts assess leadership?
The Elementary School Journal, 110(1), 19–39.
Goldring, E., Grissom, J. A., Rubin, M., Neumerski, C. M., Cannata, M., Drake, T., &
Schuermann, P. (2015). Make room value added: Principals’ human capital decisions and
the emergence of teacher observation data. Educational Researcher, 44, 96–104.
Goldring, E., Huff, J., May, H., & Camburn, E. (2008). School context and individual
characteristics: What influences principal practice? Journal of Educational
Administration, 46(3), 332–352.
Goldring, E., Porter, A., Murphy, J., Elliott, S. N., & Cravens, X. (2009). Assessing learning-
centered leadership: Connections to research, professional standards, and current
practices. Leadership and Policy in Schools, 8, 1–36.
Grossman, P., Wineburg, S., & Woolworth, S. (2001). Toward a theory of teacher community.
The Teachers College Record, 103, 942–1012.
Hallgren, K., James-Burdumy, S., & Perez-Stevenson, I. (2014). State requirements for teacher
evaluation policies promoted by Race to the Top (No. 8121). Cambridge, MA:
Mathematica Policy Research.
Hallinger, P. (2003). Leading educational change: Reflections on the practice of instructional and
transformational leadership. Cambridge Journal of Education, 33, 329–352.
Hallinger, P. (2005). Instructional leadership and the school principal: A passing fancy that
refuses to fade away. Leadership and Policy in Schools, 4, 221–239.
Hallinger, P. (2011). Leadership for learning: Lessons from 40 years of empirical research.
Journal of Educational Administration, 49, 125–142.
Hallinger, P., Heck, R. H., & Murphy, J. (2014). Teacher evaluation and school improvement:
An analysis of the evidence. Educational Assessment, Evaluation and Accountability, 26,
5–28.
Heifetz, R., Grashow, A., & Linsky, M. (2009a). Leadership in a (permanent) crisis. Harvard
Business Review, 87, 62–69.
Heifetz, R., Grashow, A., & Linsky, M. (2009b). The practice of adaptive leadership. Boston,
MA: Harvard Business School Publishing.
Hord, S. M. (1997). Professional learning communities: Communities of continuous inquiry and
improvement. Austin, TX: Southwest Educational Development Laboratory.
Horn, I. S., Kane, B. D., & Wilson, J. (2015). Making sense of student performance data:
Mathematics teachers’ learning through conversations with assessments. Paper presented
at the annual meeting of the American Educational Research Association, San Francisco, CA.
Horn, I. S., & Little, J. W. (2010). Attending to problems of practice: Routines and resources for
professional learning in teachers’ workplace interactions. American Educational
Research Journal, 47(1), 181–217.
Jennings, J. L. (2012). The effects of accountability system design on teachers’ use of test score
data. Teachers College Record, 114, 1–23.
Jones, K. (2004). A balanced school accountability model: An alternative to high-stakes
testing. Phi Delta Kappan, 85, 584.
Leithwood, K., Louis, K. S., Anderson, S., & Wahlstrom, K. (2004). How leadership influences
student learning: Review of research. Minneapolis, MN: University of Minnesota, Center
for Applied Research and Educational Improvement.
Little, J. W. (1993). Teachers’ professional development in a climate of educational reform.
Educational Evaluation and Policy Analysis, 15, 129–151.
Little, O., Goe, L., & Bell, C. (2009). A practical guide to evaluating teacher effectiveness.
Washington, DC: National Comprehensive Center for Teacher Quality.
Lohman, J. S. (2010). Comparing No Child Left Behind and Race to the Top. Connecticut
General Assembly, Hartford, CT: Office of Legislative Research.
Mandinach, E. B. (2012). A perfect time for data use: Using data-driven decision making to
inform practice. Educational Psychologist, 47, 71–85.
Marsh, J. A. (2012). Interventions promoting educators’ use of data: Research insights and
gaps. Teachers College Record, 114(11), 1–48.
Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision
making in education: Evidence from recent RAND research. Santa Monica, CA: RAND
Corporation. Retrieved from http://www.rand.org/pubs/occasional_papers/OP170.html
Maxwell, J. A. (2012). Qualitative research design: An interactive approach. Thousand Oaks,
CA: Sage.
McDonald, J. P. (1992). Teaching: Making sense of an uncertain craft. New York, NY:
Teachers College Press.
McGuinn, P. (2012). Stimulating reform: Race to the Top, competitive grants and the Obama
education agenda. Educational Policy, 26(1), 136–159.
Merriam, S. B. (2009). Qualitative research: A guide to design and implementation. San
Francisco, CA: Jossey-Bass.
Robinson, V. M., Lloyd, C. A., & Rowe, K. J. (2008). The impact of leadership on student
outcomes: An analysis of the differential effects of leadership types. Educational
Administration Quarterly, 44(5), 635–674.
Rockoff, J. (2004). The impact of individual teachers on student achievement: Evidence from
panel data. The American Economic Review, 94, 247–252. Retrieved from
http://www.jstor.org.libproxy2.usc.edu/stable/3592891
Rodgers, C. R. (2002a). Defining reflection: Another look at John Dewey and reflective
thinking. Teachers College Record, 104, 842–866. doi:10.1111/1467-9620.00181
Rodgers, C. R. (2002b). Seeing student learning: Teacher change and the role of reflection.
Harvard Educational Review, 72, 230–253.
Sartain, L., Stoelinga, S. R., Brown, E. R., Matsko, K. K., Miller, F. K., Durwood, C. E., ... &
Glazer, D. (2011). Rethinking teacher evaluation in Chicago. Chicago, IL: Consortium
on Chicago School Research.
Slayton, J. (2007). Resolution Regarding District Accountability Transformation Metrics:
Performance Measurement Plan. On file with author.
Smith, W. F., & Andrews, R. L. (1989). Instructional leadership: How principals make a
difference. Alexandria, VA: Association for Supervision and Curriculum Development.
Spillane, J. P., Halverson, R., & Diamond, J. B. (2001). Investigating school leadership practice:
A distributed perspective. Educational Researcher, 30(3), 23–28.
Spillane, J. P., Halverson, R., & Diamond, J. B. (2004). Towards a theory of leadership practice:
A distributed perspective. Journal of Curriculum Studies, 36, 3–34.
Spillane, J. P., & Healey, K. (2010). Conceptualizing school leadership and management from a
distributed perspective: An exploration of some study operations and measures. The
Elementary School Journal, 111, 253–281.
Stecher, B. M., & Kirby, S. N. (2004). Organizational improvement and accountability:
Lessons for education from other sectors (MG-136). Santa Monica, CA: RAND Corporation.
Stoll, L., Bolam, R., McMahon, A., Wallace, M., & Thomas, S. (2006). Professional learning
communities: A review of the literature. Journal of Educational Change, 7, 221–258.
Strauss, A., & Corbin, J. (1990). Basics of qualitative research (Vol. 15). Newbury Park, CA:
Sage.
Stronge, J. H., Ward, T. J., & Grant, L. W. (2011). What makes good teachers good? A cross-
case analysis of the connection between teacher effectiveness and student
achievement. Journal of Teacher Education, 62, 339–355.
Teitel, L. (2013). School-based instructional rounds: Improving teaching and learning across
classrooms. Cambridge, MA: Harvard Education Press.
Valli, L. (1997). Listening to other voices: A description of teacher reflection in the United
States. Peabody Journal of Education, 72, 67–88.
Vescio, V., Ross, D., & Adams, A. (2008). A review of research on the impact of professional
learning communities on teaching practice and student learning. Teaching and Teacher
Education, 24, 80–91.
Webster-Wright, A. (2009). Reframing professional development through understanding
authentic professional learning. Review of Educational Research, 79, 702–739.
Weick, K. E. (1996). Drop your tools: An allegory for organizational studies. Administrative
Science Quarterly, 41(2), 301–313.
Wilson, S. M., & Berne, J. (1999). Teacher learning and the acquisition of professional
knowledge: An examination of research on contemporary professional
development. Review of Research in Education, 24, 173–209.
Abstract
Data use is proliferating in schools as a tool to inform instructional improvement. Teacher evaluation is increasingly viewed as an important data source and mechanism in this effort. This qualitative case study sought to examine how data generated from teacher evaluation and other teacher learning experiences worked in conjunction to improve practice. More specifically, this study examined the role of leadership in using data for the purpose of increasing teacher knowledge and skills. Spanning a four-month period, the study focused on eight English teachers, a principal, and two assistant principals in one high school involved in implementing a new teacher evaluation process and immersed in data use for the purpose of improving practice. Findings revealed that the principal was not well equipped to build the capacity of her staff to use data to examine their pedagogy in a way that would foster instructional innovation. Her efforts resulted in little more than minor tweaks to practice. Likewise, she did not have a clear approach to improving instruction. Her emphasis was on initiating multiple disconnected learning experiences that were not consistently aligned, nor did they include an explanation of why and how these experiences would enhance instruction or an expectation for following through to ensure that new learning would take hold. Professional development was mostly delivered in a top-down fashion that resulted in the exclusion of teacher voice. Finally, the principal responded to external accountability demands by buffering her teachers from the cumbersome, unpleasant aspects of them, while simultaneously using them as leverage to pursue instructional improvement. Ultimately, despite good intentions, the principal was not well positioned to promote the use of data as a tool for teacher learning.