MAKING EQUITY & STUDENT SUCCESS WORK:
PRACTICE CHANGE IN HIGHER EDUCATION
by
Keith A. Witham
Dissertation Committee: Estela Mara Bensimon (chair),
Patricia Burch, Alicia C. Dowd, & Patricia Riley
A Dissertation Presented to the
FACULTY OF THE ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree of
DOCTOR OF PHILOSOPHY
(EDUCATION)
December 2014
© 2014 Keith A. Witham
Acknowledgements
This dissertation benefited immensely from the support and guidance of many
individuals. I am especially grateful to each member of my committee for their time and support.
Specifically, I would like to thank Tricia Burch for pushing me to refine the methodology and
helping me to articulate the goals of the study in relation to political and institutional contexts. I
am grateful to Alicia Dowd for allowing me the space to carve out my own interests and
questions within the larger study of data-use in PASSHE, for modeling a remarkable clarity of
thought and purpose in her work, and for teaching me to do qualitative research on-the-fly as we
explored Pennsylvania together. Patti Riley has been tremendously influential—not only to the
questions and methods of this study but to my intellectual development and identity as a scholar;
I am grateful to her for intuitively knowing what it was I hoped to learn and for pointing me in
exciting new directions to go about learning it. And of course, I am ever grateful to Estela
Bensimon, not only for serving as a guide, interlocutor, and mentor to me throughout my
graduate school career but also—and more importantly—for her lifetime of work and thought,
which has generated so many of the principles, ideas, and practices on which my own and many
others’ work depends.
I would also like to thank the staff of the Rossier PhD program office—Laura Romero,
Dianne Morris, and Katie Moulton—as well as Arlease Woods in the Center for Urban
Education for their endless behind-the-scenes work making the experience of being a doctoral
student at USC effortless and enjoyable. I was lucky to be a part of a program and a research
center so committed to doctoral students. I was also lucky to have so many bright and passionate
colleagues. In particular, I would like to thank Holly Kosiewicz, Raquel Rall, Kathryn Struthers,
Tracey Weinstein, Rudy Acosta, Linda Shieh, Megan Chase, Robin Bishop, Tiffany Jones,
Cheryl Ching, Maxine Roberts, and Jason Robinson—not only for being good friends and
providers of moral support but for also being utterly inspiring as practitioners, researchers, and
human beings. I am grateful to my parents for their love and support—in particular to my father
for making sure I always had a working laptop and to my mother who, though she may no longer
be able to understand what I’m doing, always remembers to tell me she’s proud of me. And last,
but certainly not least, I am grateful to Greg Steirer who is—as far as I’m concerned—the
world’s best thinker, writer, editor, teacher, kitten rescuer, and companion.
Table of Contents
Abstract ....................................................................................................................................................... vii
Foreword ....................................................................................................................................................... 1
CHAPTER I: Introduction ...................................................................................................................... 10
Higher Education Change in the Completion Era ................................................................................... 13
The Case Setting ..................................................................................................................................... 18
Aims, Premises, and Contribution of this Study ..................................................................................... 24
Aims. ................................................................................................................................................... 24
Premises. ............................................................................................................................................. 25
Research questions. ............................................................................................................................. 27
CHAPTER II: Literature Review ........................................................................................................... 28
Grounding the Grand Narratives in Practice ........................................................................................... 32
Change as a function of performance-based accountability policy. .................................................... 36
Change as a function of data use and organizational “learning.”........................................................ 40
Broader Lessons about Organizational Change from Higher Education and Beyond ............................ 43
Key Points from the Literature about Dynamics of Change ................................................................... 51
Key Points from the Literature about Approaches to Studying Change ................................................. 51
The Location of this Study in the Trajectory of Change Scholarship ..................................................... 52
CHAPTER III: A Practice Theory of Change ....................................................................................... 53
The Focus on Practices ........................................................................................................................... 54
Habitus and the Structure/Agency Relationship ..................................................................................... 57
Sources of Change within Practice Theory ............................................................................................. 59
Practice adaptation. ............................................................................................................................. 59
Judgment-based practice. .................................................................................................................... 60
Change as practice re-mediation. ........................................................................................................ 60
Motives of Practice: Institutional Logics & Moral Elements of Practice ............................................... 62
Moral elements of practice .................................................................................................................. 66
Summary: A Practice Theory of Change ................................................................................................ 69
Methodological Implications of a Practice Approach ............................................................................. 71
CHAPTER IV: Research Design & Methodology ................................................................................. 75
Ethnographic & Narrative Methods within a Case Study Design ........................................................... 76
Multi-Site Case Study Design ................................................................................................................. 78
Sampling strategy. ................................................................................................................................... 78
Access & Bias Concerns ......................................................................................................................... 83
Data Collection ....................................................................................................................................... 84
Interviews. ........................................................................................................................................... 84
Observations. ...................................................................................................................................... 85
Document analysis. ............................................................................................................................. 86
Data Analysis & Reporting ..................................................................................................................... 88
Thematic coding and analysis. ............................................................................................................ 88
Organizing and interpreting findings. ................................................................................................. 90
Validity ................................................................................................................................................... 91
CHAPTER V: Propositions about the Practice of Change in Higher Education ............................... 95
Analytic Questions .................................................................................................................................. 95
Propositions about the Practice of Change Focused on Equity & Student Success ................................ 97
CHAPTER VI: Case Narratives of Practice Re-Mediation ................................................................ 112
System History & Overview ................................................................................................................. 113
Office of Civil Rights-Mandated Desegregation and the Adams Case ................................................. 114
PASSHE Performance-Based Funding ................................................................................................. 121
Case Study Narratives ............................................................................................................................... 124
Hillside University ................................................................................................................................ 124
Overview. .......................................................................................................................................... 124
The context of change at Hillside University. ................................................................................... 125
Introduction of re-mediating artifacts. .............................................................................................. 127
Enactment of competing institutional logics. .................................................................................... 133
Challenges for on-going practice change. ......................................................................................... 135
Reflections on practice re-mediation at Hillside University. ............................................................ 135
Valley University .................................................................................................................................. 138
Overview. .......................................................................................................................................... 138
The context of change initiatives at Valley University. .................................................................... 139
Navigating and enacting competing institutional logics. .................................................................. 147
Data-use as an “inter-mediating” practice......................................................................................... 152
Resource constraints as a moderating discourse. .............................................................................. 156
The role of moral values and professional identity. .......................................................................... 158
Reflections on practice re-mediation at Valley University. .............................................................. 162
Old Main University ............................................................................................................................. 165
Overview. .......................................................................................................................................... 165
The context of change initiatives at Old Main University. ............................................................... 166
Accountability context as a legitimating factor in practice re-mediation. ........................................ 169
Data-use as a mediator of learning and communication practices. ................................................... 176
Structures promoting incremental re-mediation of practice. ............................................................. 180
Reflections on practice re-mediation at Old Main University. ......................................................... 188
CHAPTER VII: Discussions of Findings and Conclusion................................................................... 190
Discussion of Key Findings from the Case Study ................................................................................ 193
Towards a Practice-Based Understanding of a Change Bureaucracy ................................................... 200
Accountability as a driver of practice change. .................................................................................. 203
The role of data and data-use in practice change. ............................................................................. 204
Integrating accountability and data use in change practices. ............................................................ 205
Recognizing the potential of interacting elements of higher education practice. ............................. 207
Connecting to Research and Theory: Contributions and Limits of the Practice Approach .................. 209
Contributions of the practice approach. ............................................................................................ 211
Limitations of the practice approach. ................................................................................................ 212
Limitations of the Study Generally and Directions for Future Research .............................................. 213
Bibliography ............................................................................................................................................ 218
Appendix I: List of Data Collected at Each Site ...................................................................................... 232
Appendix II: Annotated List of Deductive and Inductive/Emergent Codes ............................................. 233
Appendix III: Sample Interview Protocol ................................................................................................. 235
Appendix IV: Informed Consent Document ............................................................................................. 236
List of Tables
Table 1: Sampling matrix for PASSHE universities…………………………………………….81
Table 2: Research Questions and Examples of Corresponding Analytic Questions…………….96
Table 3: Dominant institutional logics conveyed by Valley University
Academic Master Plan………………………………………………………………………….141
List of Figures
Figure 1: Timeline of PASSHE system policy and institutional activity………………………. 23
Figure 2: Diagram of the ‘activity setting’ of change initiative committees…………………….71
Abstract
This dissertation examines ways in which change occurs in higher education institutions.
Though building upon traditions of scholarship on organizational change both in higher
education and other settings, this study makes unique and timely contributions in two key ways.
First, the study focuses on what I argue is a genre of change initiative characteristic of the so-
called “completion agenda” that has driven much of public higher education policy and practice
over the last five to seven years. Such initiatives are anchored in organizational practice but
linked explicitly to state or system accountability policy, they share a common set of objectives
related to completion and equity in student achievement, and they tend to rely on team-based
data use as a structure and mechanism for organizational change processes.
Second, this study adopts a practice-theory approach to studying organizational change in
an attempt to (1) overcome what I argue are some conceptual limitations endemic to much of the
extant research on change in higher education and (2) demonstrate a theoretical approach better
suited to the current contexts of change in particular. Practice theory puts emphasis on the
significance of individuals’ daily work activities for understanding the nature of organizational
activity, social work settings, organizational history and culture, and institutional contexts. The
practice-theory lens enables me in this study to ascertain both the nature of change within the
specific settings of committees tasked with developing recommendations for improvements in
student success and equity, as well as the organizational implications of those committees’ work
over time. Practice theory also helps clarify the ways in which the specific accountability
contexts and data-use mechanisms characteristic of completion-agenda reform efforts may serve
to support (or inhibit) practice change within institutions.
Through a case study of the Pennsylvania State System of Higher Education (PASSHE),
this study investigates how a particular set of initiatives—the Equity Scorecard and
Deliverology—have brought about change within universities in the context of outcomes-based
funding policy focused on completion and equity. This study focuses in-depth on the practices
and changes in practice associated with those particular initiatives as they were implemented by
committees comprised of professionals from across roles (administrators, faculty, and
professional staff) at three purposefully sampled universities within PASSHE.
The findings of this study are presented in the form of eight propositions about the nature
of change related to these initiatives and the policy and institutional contexts in which they were
enacted. Those propositions are then elaborated through case narratives describing both patterns
and variations across the three universities in the nature of practice-change associated with the
initiatives. Both the propositions and the case narratives illustrate the complexity of interacting
factors that mediate practices directed at improving student success and equity. In most cases, the
practices associated with the specific change initiatives represented reconfigurations of existing
types of practices (e.g., retention committees): though the universities had engaged in similar
types of efforts previously, the accountability policy context of the system’s new outcomes-
based funding policy and the specific tools, language, artifacts, and technologies provided by the
initiatives (Equity Scorecard and Deliverology) served to significantly re-mediate those existing
practices. Those re-mediated practices then had organizational implications to the extent that
they created resources for ongoing learning and practice-change across the university,
particularly in the form of new structures for learning (e.g., data-use routines and venues for
collective inquiry).
Making Equity & Student Success Work: Practice Change in Higher Education
Foreword
On a rainy day in June, 30 professors and a handful of administrators filter slowly into a
cramped classroom in the building the Psychology department shares with the university police
department. Most of the professors are casually dressed, having just come from a summer meet-
and-greet with students and their families attending new student orientation. A modest lunch of
saran-wrapped sandwiches and chips is provided for them. The windowless room is stuffy
despite the effort of a wobbly pedestal fan buzzing by the door. In the corner, a graduate student
is assembling packets of Excel printouts showing data on retention and graduation rates, by
major, disaggregated by race and ethnicity. As they eat and talk, participants are given a folder
containing that data for their department as well as several worksheets, a memo and contract
from the Provost outlining tasks each chair will have to complete in order to receive a stipend of
$200, and a small, colorful rectangular sticker that reads “Equity Matters.”
The department chairs are gathered to attend one of a series of trainings being conducted
by the Dean of Academic Affairs as part of an initiative she calls the “Student Success
Network.” That Dean and a Psychology professor are co-leaders of a committee
assembled at the university to implement the “Equity Scorecard,” a project that uses tools and
methods of action research to help higher education practitioners develop recommendations for
making their institutions more welcoming and effective for students of color. The state system of
higher education has provided funding for the implementation of the Equity Scorecard at all
fourteen universities in the system in tandem with a new performance-based accountability
policy. The Dean also leads the implementation of another national project paid for by the
system office, “Deliverology,” which uses management consulting processes and tools to support
the implementation of reforms focused on improving access and success for students of color and
low-income students, as well as improving overall retention and graduation rates. In addition to
these two large initiatives, the Dean heads and/or sits on a number of internal committees tasked
with guiding university policy on enrollment, retention, academic probation policy, and other
issues. The Student Success Network is her attempt to integrate these efforts.
The Provost stops in briefly to welcome the faculty and introduce the purpose of the
workshop. “As you know, the University’s new strategic plan is titled ‘Building on Excellence.’
The work you’re doing today is part of reaching our goal of excellence.” She goes on to provide
context for the training, mentioning that the university has goals related to equity that it has to meet
as part of the state’s performance-based funding policy. She also mentions that the university is
undertaking a comprehensive effort of reforming general education, including investing in
Degree Compass – a course recommendation program that uses predictive analytics to help
students make choices about courses and majors based on their interests and abilities. “Equity,”
the Provost emphasizes, “is integral to all of these efforts.”
After the Provost leaves, the Psychology professor co-leading the Student Success
Network projects a PowerPoint slide showing data like those given to participants. “Before we
start, I’d like to ask that we not get bogged down in questioning whether the data are ‘right’,”
she says, using air quotes. “Sometimes we get derailed by all the things that could be wrong with
the data, but there’s always information in there. The demonstration today is about learning a
process that can be applied to other data later.” She goes on to describe the day’s training as “a
broad lesson on how to approach data.”
Despite her plea, several minutes into the presentation the chair of the theatre
department interrupts from the back of the room. “Wait, why are we looking at six-year
graduation rates? Don’t students graduate in four years?” The Chemistry chair jumps on the
opportunity provided by his colleague’s interruption. “And why are we even calculating this? I
know who my students are that are failing, they’re all from inner city schools…” The chair of
Accounting chimes in: “We only have one black student, how does this even apply to us?”
After patiently addressing their questions, the presenter gets back on track and walks the
participants through a process of identifying where there are gaps in retention and graduation
rates for students of color in their departments. “For some of you with only one or two students,
this may be an ‘access’ issue,” she suggests, again using air quotes. After a half hour, there
starts to be a change in the tone in the room. Side conversations grow louder and some of the
chairs start to express a kind of frustrated curiosity. The chair from Political Science speaks up
to ask why they never see these data as part of their enrollment reports. Another faculty member
answers: “You can, it’s in a dropdown menu under faculty, reports, profile, see profiles…” The
director of undecided major counseling interrupts. “No, you have to ask academic computing to
provide you a report template that has race as a variable.” Another person disagrees, “No, can’t
we get this information from our reports on majors?” As the meeting descends into chaotic
conversation and disagreement about the availability and whereabouts of the data, the Dean
calls the room to order and suggests that they take twenty minutes to show everyone in the room
where to find it. “One of our recommendations,” she adds with a smile, “is that department
chair training should include how to find this information.”
------
The vignette above is based on an observation I conducted at a large comprehensive four-
year university – one of the three universities sampled as sites for this case study. But the
scenario it presents, in a general sense, is common at every higher education institution in every
sector across the country. “The life of the academic organization,” a community college
president once told me, “is in the committee meeting.” Faculty and administrators spend hours
(many more than they would like) sitting in committee meetings rehearsing the very sorts of
activities described above – looking at information about an issue, questioning its validity,
arguing over interpretations of the problem, questioning the very premise of the questions being
asked, and sharing knowledge and opinions about the way things are done, have always been
done, or should be done. The committee meeting is, in many ways, a simulacrum of the higher
education organization writ large; it is the enactment of the ideals of shared governance,
democratic participation, and intellectual debate, though it may often also be an enactment of
tensions and conflicts borne from those same ideals. It is these and similar practices—because of
what they represent about the organizations in which they are situated and the potential they hold
for changing or constraining change in those organizations over time—that are the focus of this
study.
The activities I observed, as the theoretical discussion below will help make clear, have a
number of traits common to all work activity. First, they are situated – a theoretically loaded
term suggesting that everything we do happens within a specific social, cultural, and historical
location (e.g., a mid-tier public university in the eastern U.S. in 2014). Second, they are social –
obvious but significant, the activities involve physical and verbal interaction between people.
Third, they are mediated by physical and conceptual structures – that is, like most everything we
do, the activities described above involve people’s interactions with physical artifacts (e.g.,
printed data spreadsheets, PowerPoints), tools (databases), and language (“access,”
“excellence”) (Engeström, 2000; 2001). Moreover, as the provost pointed out, the activities are
directed towards a specific goal (improving equity in retention rates) that derives from larger
organizational histories and goals (equity, excellence, or revenue).
When we begin in this way to deconstruct these seemingly mundane activities with the
help of a particular theoretical lens, we thus begin to learn things about the organization. We may
see tensions arising between the organization’s goals and the beliefs of its members. We hear
expressions that may enact or contradict grand organizational narratives (e.g., “Building on
Excellence”). Layers of organizational practices and policies begin to be revealed like the strata
of sedimentary rock, remnants of past eras; no one is quite sure when or why they were
formed…that’s just the way we’ve always done it.
More importantly for this study, we also see in these mundane everyday activities the
potential for change – that is, potential for our work as practitioners to be re-mediated with new
tools, new artifacts, and new language such that new organizational outcomes are produced. In
the case of the institution described above and others explored in this study (and indeed, many
institutions elsewhere), pressures to be more equitable, more effective, more efficient, and/or
more “excellent” in terms of learning outcomes or research are driving reforms of varying scale
– from departments to universities and even system or statewide (Kezar, 2013). Researchers
from across disciplines, and organizational scholars in particular, have addressed questions of
when, how, why, and to what end colleges and universities change to respond to pressures from
the external environment. I situate this study through a review of this broader literature, which
ranges from large-scale cross-national examinations of university transformation (e.g., Clark,
1998; 2004; Sporn, 1999) to micro-ethnographic studies of particular university functions like
planning and strategic management (e.g., Jarzabkowski, 2004; 2008), and from focus on change
around particular issues like technology implementation (e.g., Blin & Munro, 2008) to broad
considerations of universities’ responses to wide-ranging environmental shifts.
I review this broad and diverse literature in order to make clear that this study fills a
specific niche that I believe is timely and relevant – that of changes being pursued within
colleges and universities (at both the level of departments and specific offices but also
university-wide) associated with the “completion agenda” in which U.S. higher education
currently finds itself and which I discuss further in Chapter I. These initiatives take many
different forms but are characterized by a common set of goals – to increase retention and
graduation rates (or in the current parlance to “increase student success”) and to simultaneously
reduce disparities in these outcomes between students from demographic groups who have
tended historically to succeed in higher education and those who have not. Additionally, these
initiatives are often, as in this study, driven by some combination of external pressure, such as
accountability and accreditation, as well as some internal impetus; though as Clark (1983a) and
many others have observed, the internal motivations for such work often get neglected in the
discourse of policymaking, and part of my goal with this study is to illustrate how in practice,
the distinction between internal and external drivers of change may be less relevant to the current
nature of higher education change than it was in the past.
In addition to contributing depth to our knowledge about these current initiatives for
change, this study also aims to make a theoretical contribution by demonstrating the efficacy of a
focus on practice as the locus of change in higher education. My focus on the mundane activities
of everyday work associated with specific change initiatives situates this study within a line of
developing research in organizational science, management, and sociology literatures that adopts
a practice-theory focus. As Feldman and Orlikowski (2011) suggest, the use of practice as a
focusing concept in organizational research can occur in one or all of three dimensions. First, a
focus on practice can be an empirical approach, simply in recognizing in data collection and
analysis the “centrality of people’s actions to organizational outcomes” regardless of whether the
researcher explicitly adopts practice theory as a guiding framework (Feldman & Orlikowski,
2011, p. 2). Indeed, within the extant literature on change efforts there are many embedded and
taken-for-granted descriptions of the actual work people do (or do not do) as part of change
efforts. My literature review draws particular attention to these often understated elements of
prior studies. Second, organizational research can adopt a formal practice-theory lens, thus
making more explicit “the particular theoretical relationships that explain the dynamics of
everyday activity, how these are generated, and how they operate within different contexts over
time” (Feldman & Orlikowski, 2011, pp. 2-3). And finally, attention to practice in organizational
research can also derive from a “practice ontology” that “sees the social world as brought into
being through everyday activity” and from which perspective practices are “the primary building
blocks of social reality” (Feldman & Orlikowski, 2011, p. 3; Schatzki, 2001).
This study adopts all three of these “levels” of focus on practice, and the theoretical
framework presented in Chapter III elaborates both the benefits and challenges associated with
that choice. Chapter III also provides clarity around the potentially confusing semantics of
practice theory, including distinguishing between terms such as “practices,” “practice,”
“activity,” and “activity settings.” As an introduction to this approach, however, I suggest here simply
that an empirical and theoretical focus on practice enables me to argue, with rich qualitative
evidence to support the argument, that when we pursue change in higher education we must
attend to the way our everyday work is mediated and, thus, how it can be re-mediated to achieve
new goals and/or existing ones more effectively. This is not to say that we should do so blindly,
or that all change is necessarily good. Indeed, I also argue that by examining our practices
carefully, recognizing in them the depth of organizational history and structure, we are also
better equipped to understand and critique those aims we pursue – new or old. The study thus
asks a seemingly simple question: Where in the activity associated with equity and student
success initiatives in higher education is there potential for different organizational practices
and routines to emerge?
This dissertation proceeds as follows. The introduction in Chapter I describes the nature
and context of the initiatives examined in the study and frames them as a specific form of
institutional change efforts. This chapter also describes the broad setting of the case examined in
the study – the Pennsylvania State System of Higher Education (PASSHE) – as well as the
specific accountability and institutional contexts of the case, including a description of the
initiatives about which data – in the form of interviews, observations, and document analysis –
were collected for this study. Chapter II is a literature review organized around two objectives:
(1) to situate this study within both a particular strand of organizational research interested in
processes of change and the limited research within higher education examining a specific kind
of institutional-level change, and (2) to synthesize and critique existing research on change in
higher education specifically in order to demonstrate the particular contributions of this study.
Chapter III provides a theoretical discussion emphasizing the generative potential of a
“practice approach” to studying change in higher education and outlining the key concepts,
terms, and theoretical propositions guiding the study’s design and data analysis. Chapter IV
describes the design and methodology of the study, with an emphasis on the processes and tools
of “organizational ethnography” (Nicolini, 2009a; Ybema, Yanow, Wels, & Kamsteeg, 2009)
that facilitate analytic movement between individual and group-level activities and
organizational-level phenomena or outcomes.
Following the model of Clark’s (1998) study of organizational transformation in
European universities, Chapter V then provides an introductory summation of key conceptual
findings of the study, which I have structured in the form of “propositions” following an
approach described by Neumann and Pallas (in press). These propositions precede in-depth
organizational narratives of each site within the case study, presented in Chapter VI, in order to
allow the reader to observe within the narratives how those propositions are (or are not) enacted
and substantiated by evidence presented in context of the “change story” at each university.
Finally, Chapter VII synthesizes and summarizes key lessons across sites within the case
study, discusses challenges and limitations of the study, and offers a reflective coda on the
discourse of change in higher education and the viability of the practice theory approach for
future research.
CHAPTER I: Introduction
In August of 2013, President Obama announced a new extension of his administration’s
college completion policy agenda that will involve ranking colleges based on the value they offer
students—defined as a function of graduation rates, cost, and the debt and earnings of
graduates—and tying federal financial aid to institutions’ performance on those metrics in order
to boost graduation rates and put downward pressure on tuition and fees (Lewin, 2013; Slack,
2013). Although the President did not use the word “accountability” in his remarks announcing
the plan, his intent to “create better incentives for colleges to do more with less, and deliver
better value for students and their families” and the metrics on which the planned rating system
would be based inspired a predictable backlash from college presidents and faculty associations
about being held accountable for costs. The official statement of the American Association of
University Professors (AAUP) suggested that “the President’s proposal will do little to solve the
problem and will likely result in a decline in the quality of education offered to working class
and middle class students, particularly students of color” (Fichtenbaum, 2013).
Four months after the President’s announcement, U.S. senators convened a hearing to
investigate the effectiveness of accreditation as a mechanism of accountability in higher
education. In discussions foreshadowing debates that will likely play out if and when the Higher
Education Act is reauthorized, members of the Senate education
committee questioned whether the current system of accreditation is undermined by conflicts of
interest and a lack of rigorous consequences for poor outcomes. Senator Elizabeth Warren—a
champion of consumer rights—asked whether poor performance on graduation rates and high
rates of students defaulting on their student loans should not constitute “a bright line beyond
which we say a school should not be accredited” (Warren, quoted in Chronicle of Higher
Education, 2013, Dec. 13).
These discussions and the President’s proposal are the latest iterations of a national
higher education policy discourse that has converged in recent years around an agenda to
improve the completion rates of public institutions and the efficiency with which institutions
educate and graduate students. While the President’s proposal relies on the limited but not
insignificant leverage the federal government has for influencing public higher education
through the market and distribution of federal financial aid, the targeting by U.S. senators of the
accreditation process as a tool for ensuring institutional effectiveness represents a potentially
significant new foray of political oversight into the affairs of higher education.
The shifting nature of policymakers’ interests in higher education and the concurrent
growth of influence of the education “megafoundations” on state and federal policymaking have
ushered in a range of investments totaling billions of dollars in higher education “innovation”
and reform (Hall, 2011; Russell, 2011). These initiatives reflect the enactment of the college
“completion agenda,” an umbrella term used discursively (pejoratively among some observers)
to describe the push for the nation’s higher education system to award significantly greater
numbers of college degrees and certificates in order to meet goals for US educational attainment
set by President Obama when he announced his administration’s “American Graduation
Initiative” in 2009. Consequently, government and private investments in both policy and
practice measures within the completion agenda have as their aim increasing rates of “student
success,” generally defined in terms of completion of a credential or degree and other key
indicators linked to policymakers’ political and economic interests in (particularly public) higher
education (Perna & Thomas, 2006; Russell, 2011).
Underlying the “completion” push in federal and state policy has been an attendant
concern about persistent gaps in educational attainment between white and upper class
populations with historically high success rates, and populations from minority racial and ethnic
background and from the middle and lower classes who have been historically underrepresented
both within higher education and, consequently, among college-educated adults in the US.
Though concerns about equity in college access and success were by no means new to the
discourse of the completion agenda, the pervasive and dominant focus on completion has created
a new urgency around both (1) closing gaps in success rates for underrepresented groups as a
strategy for raising the overall US attainment rate, particularly in light of shifting demographics
(Kelly, 2008) and (2) protecting equity in access for those groups in the rush to shift colleges’
focus to outcomes as opposed to enrollment (Bragg & Durham, 2012; Rhoades, 2012).
Policy and philanthropy strategies for advancing the completion agenda have thus given
rise to an array of initiatives focused on bringing about “change” in higher education defined by
a certain set of outcomes loosely and variably defined in terms of student success and equity.
Some of these initiatives (e.g., Complete College America, the National Governors Association’s
‘Complete to Compete’, Excelencia in Education’s ‘Ensuring America’s Future’, and ECS’s
‘Boosting College Completion for a New Economy’) focus on changing state policy to establish
incentives for institutions to produce more graduates, while others (Access to Success, Achieving
the Dream, and AACC’s College Completion Challenge) take aim directly at system or
institution-level policy and practice (see Russell, 2011, for a summary of major completion
initiatives). Moreover, because of the unprecedented level of synergy between major
philanthropies and state and federal policy around the objectives and metrics of the completion
agenda (Hall, 2011), many of these initiatives operate contiguously at multiple levels—for
example, states participating in Complete College America’s network have also received grants
(often from the same funders that support CCA) to engage in practice-level reforms like
Achieving the Dream or Access to Success. At the organizational level, then, the completion
agenda has engendered a new context for change; though public colleges and universities have
always been linked to their environments in complex ways, I argue that the change initiatives
emerging from the completion agenda differ from those of previous eras in at least two ways that
I explore next. The unique nature of these efforts and the urgency with which they are being
enacted and evaluated (not to mention the magnitude of the investments made in them) make it
vital to understand if and how change occurs and to offer criteria for measuring, critiquing, and
appreciating—but also perhaps problematizing—organizational change focused on improving
equity and student success as broad social and political priorities.
Higher Education Change in the Completion Era
Calls for change in higher education are by no measure new or unfamiliar. As Zemsky
(2013) observes, “[C]hange may be all around us, but for the nation’s colleges and universities
there really is precious little that is new under the sun. For thirty years, the very thing that each
wave of reformers has declared needed to be changed has remained all but impervious to change.
Even the language of reform has remained constant” (p. 15). It is certainly true that the archives
of our public universities contain boxes of faded, dusty, and yellowing pages of strategic plans,
accreditation reports, policy memos, and notes from trustee meetings from decades past all
dealing with pressures coming from a “changing environment” to be more efficient, more
entrepreneurial in raising revenue, more attentive to issues of diversity and access, more
rigorous in assessment of learning, and so on (indeed, I spent a lot of time reviewing these
documents in the course of my dissertation research and such histories inform my case
narratives).
But it may also be reasonable to suggest that, despite their similarity in language and
focus to past eras of reform, the calls for reform inherent to the current completion agenda have
given rise to a new type of change effort—one that I suggest defies easy categorization according
to existing models of organizational change in higher education (Kezar, 2001). Neave (2012)
locates the current push for “managed change” in higher education within what he suggests is a
new paradigm of change characterized by a quickening in the pace of change and a “prospective”
rather than “retrospective” reference point for the goals pursued by that change.
Haste to implement, the wish instantly to ‘embed’ what are held up as ‘good practices’
are, not surprisingly, further evidence of the very curious situation which higher
education finds itself engaged in. Certainly, this is change of a very extensive order. But
it is also change of a very different nature from the ‘organic change’ that characterized
the elite university. There is some evidence that the best descriptor to apply to it is that of
‘enforced change’ in which the determination of ‘le pays politique’ allied to the
instrumentality of control a posteriori, whether through funding incentives or through
retrospective monitoring, leaves very little latitude to ‘le pays réel’ other than a grudging
compliance. (p. 36)
Neave is suggesting here something subtle about both the changing nature of change and the way
we approach studying it as a function of a shifting relationship between public policy and higher
education—or more specifically, the “political state” and the “real state” of university operation.
The urgency associated with contemporary policy initiatives (e.g., the completion agenda) and
their private counterparts has, his theory would suggest, redefined the parameters within which
change can occur and also shifted the narrative for observing and studying change from one of
“distance travelled” to one of “what ought to be” (Neave, 2012, p. 34). Reform in higher
education was once, as Neave (2012) describes it, an “oscillation from steady state to the
instability of reform and back to steady state” (p. 37), whereby reform efforts would recede like a
tide leaving some worthwhile improvements in practice or washing away remnants of failed
prior interventions. In any case, the resulting balance would be based on judgments rooted in the
dominance of institutional autonomy and the legitimacy assumed to be inherent to its traditional
modes of production. In the current paradigm, by contrast, change is rather a “reiterative cycle of
continuous adjustment” reinforced by public policy to ensure that “the speed of institutional
responsiveness is set…by forces external to the university” (p. 37). In the current state, that is,
the locus of the impetus for institutional change has moved into a new, hybrid space between
institutional autonomy and political interest where change is pursued by institutions—still of
their own will—but in processes defined by metrics, objectives, and timeframes for change that
have been defined externally and reached a new, powerful level of consensus in the discourse
about higher education and its purposes.
In the current paradigm, Neave suggests, change is thus less a matter of internal
“adjustment”—that is, the institution “having detected and defined this situation on its own
initiative and then having taken measures to remedy it again on its own initiative.” Change is
now more a matter of “accommodation” where “the university [is] responding on its own
initiative…but to issues already identified by outside interests” (p. 30). Neave is writing from the
perspective of western Europe, but critics of the completion agenda in the US may be
sympathetic to his concerns about the relocation of the power to define the objectives and
metrics of change from institutional autonomy to a normative, prospectively oriented, and
“hastened” public policy regime (see, e.g., Chronicle, 2012, March 2, for an array of critiques
regarding the limitations of the completion agenda metrics and objectives; also Rhoades, 2012).
Certainly the re-emergence in states of performance-based funding policies organized around a
typically narrow and increasingly common set of completion-focused metrics (Jones, 2013)
suggests that the urgent, economically and politically defined “first in the world” discourse of the
completion agenda has been enacted through the same sorts of public policy measures as those
about which Neave is concerned.
Yet there is something interesting about Neave’s notion of change as “accommodation,”
particularly with respect to the organizational issues implied in the phenomena of student success
and equity. If we consider the current regime of institutional change efforts identified above
(including those operating in the case studied here, which I describe below) as the merging of a
hastened, prospective public policy agenda and the constant, iterative, and autonomous process
of change within institutions, such efforts appear as a somewhat new strand of organizational
change. They are not, as Neave notes, like the “organic change” of prior eras—which Zemsky
(2013) might suggest does not reflect any real change of substance; nor, however, do they
resemble the implementation of packaged reform models or direct regulatory intervention of the
sort we might find in K-12 education, health care, or other sectors. Such current change efforts
are rather like projects of process improvement, internally motivated and maintained (ostensibly)
but directed to externally defined objectives, metrics, and (though the findings of this study
problematize this) values. They are, in that regard, change processes unique to higher education
organizations in the current political and economic moment.
The goal of this introduction has been to locate and describe the particular kind of
organizational change in higher education that I am interested in better understanding through
this study. Now I will be more concrete in applying this conceptual discussion to the actual work
happening in institutions around the country. The institution-level initiatives I referenced
above—Achieving the Dream, Access to Success, and so on—are projects launched in the 2000s
with funding from major national philanthropies (in some cases with matching funds from
institutions or systems). Though designed according to varying theories of change, they share
certain features, for example: (1) a focus on data use and building capacity for “data-driven”
decision-making; (2) an emphasis on helping institutions improve their own policies and
practices rather than implementing “best practices” from the field; (3) a design involving
designated “teams” of administrators, faculty, and staff who engage in various specific activities
and processes; and (4) guiding objectives and metrics focused on increasing student success
overall and for underrepresented students in particular (see, e.g., Jenkins, 2011, Russell, 2011,
Zachry, 2008 for descriptions of some of these initiatives).
Moreover, though most of these initiatives pre-dated the completion agenda (it has no
real official start date but most accounts reference Obama’s 2009 AGI announcement as the
consolidating event), they reflect the growing cohesion in the early 2000s among major
philanthropies (and the policymakers and other influential leaders who informed their strategies)
around a certain kind of reform that culminated in the focus we see now on completion. And,
within the more visible networks and synergies of the completion agenda these initiatives have
been linked directly to policy strategies (for example, performance-based funding). These
initiatives thus constitute a specific genre of change effort—they are temporally confined
primarily to the last decade or so, they are “based” within institutions or in some cases within
systems, and they are—by design—models of “accommodation” (Neave, 2012) that privilege
institutional autonomy and locally defined interventions while locating the objectives of those
interventions within the broader economic and political interests of social policy.
To further clarify, the types of change effort I am interested in here are not just about
responding to accountability or about increasing capacity for data use; rather, I argue (with
support in my data) that they represent a reconfiguration of a particular kind of academic and
administrative work that occurs extensively and routinely in colleges and universities—a kind of
work done largely in committee form or in related practices such as leadership, professional
development, planning, budgeting, and so on. Whether and how “accountability” and “data use”
impact these practices is part of the empirical question of this study, though I place particular
attention on those dimensions given the specific contexts of the case setting—to which I now
turn.
The Case Setting
The preceding overview of the current moment in higher education reform allows me to
move towards answering the question germane to any case study—as Luker (2008) puts it, what
is this a case of? In 2011, the Board of Governors of the Pennsylvania State System of Higher
Education (PASSHE) implemented a revised performance-based model for appropriating funds
to the 14 comprehensive, regional universities that constitute the system. The new funding model
aligned the distribution of funding to the primary objectives outlined in the system’s recently
released strategic plan, which include creating effective and equitable learning environments,
graduating more students with high-quality credentials, and creating a highly qualified labor pool
to support the state’s economy (Marshall, 2011). The metrics structuring the new funding model
are organized in three buckets—student success, access, and stewardship. The first two of these
categories incorporate indicators related to the access and success (and gaps in access and
success) for low-income students and students of color (Cavanaugh & Garland, 2012).
At the same time PASSHE was revising its performance-based funding policy, the
system also joined the Complete College America Alliance of States, thereby committing to
implement an array of completion-focused policies and report on a set of common “student
success” metrics. And, PASSHE had previously joined a network of state public institutions
sponsored by Washington, D.C.-based advocacy organization Education Trust under the name
“Access to Success,” which asks institutions and systems to commit to reforming policy and
practice in order to eliminate gaps in higher education achievement for low-income students and
students of color. Notably, the formulation of PASSHE’s performance metrics related to “closing
gaps” in access and success derived directly from metrics used by Access to Success. There were
thus clear linkages between the system’s engagement with national initiatives, which were at the
forefront of enacting the policy goals of the completion agenda, and its own revision of policy.
Further illustrating the complexity of these linkages and the importance of the case for
studying change as a form of “accommodation” within the completion agenda, leaders at
PASSHE went a step further and invested millions of dollars in institution-level initiatives to
help its member universities build capacity for the kinds of change prioritized in the
performance-based funding. Specifically, PASSHE contracted with an organization called the
“Education Delivery Institute” (affiliated with the Education Trust) that uses a package of
management consulting principles (called “Deliverology”) to help universities implement policy
or practice interventions they themselves developed as part of their participation in Access to
Success or elsewhere.
At the same time, PASSHE also contracted with the Center for Urban Education at the
University of Southern California to provide all 14 universities access to a facilitated process
called the Equity Scorecard. The Equity Scorecard is a two-year action research-based project
that provides a structured process for practitioners to investigate the institutional policies and
practices that negatively impact the success of students of color (Bensimon & Malcom, 2012).
The process is supported by a range of tools (protocols to guide specific kinds of organization
inquiry) and other artifacts, including templates for collecting and interpreting data. The process
also enacts an intentional and theory-based lexicon designed to shift practitioners’ cognitive
framing of inequities that are pervasive in higher education. These words and phrases include,
for example, “equity-minded” and “deficit-minded,” “focal effort,” “hunches” and
“milestones”—all of which take on specific significance within the activities conducted by
Scorecard committees (Bensimon & Malcom, 2012).
The Equity Scorecard has not traditionally been directly linked to state or system policy
initiatives. It is thus because of the Scorecard’s general design features and the specific purposes
for which the PASSHE administrators chose to implement it (in relation to their policy
initiatives) that I associate it with other “completion agenda” initiatives, though it predates most
others I referenced here. Though varying in their history, level of involvement, and underlying
theories of change, both the Deliverology and Equity Scorecard initiatives as implemented in
PASSHE thus share features of the broader genre of change model I described above, including a
focus on data use and a team-based design emphasizing local, contextualized development of
change interventions. Within the PASSHE universities, furthermore, the two projects were often
implemented by committees with some level of overlap in membership and varying levels of
integration.
There are two important features of the PASSHE setting as a case for studying change
in higher education. The first is that PASSHE represents in many ways the prototype of, as
Neave (2012) says, “the very curious situation which higher education finds itself engaged in”
(p. 36) with respect to intertwining policy and practice dimensions of the completion agenda.
The system’s concurrent investment in policy strategies and what I have suggested represent
semi-autonomous process-improvement or capacity-building initiatives like the Equity Scorecard
and Deliverology reflects what may be the sine qua non of contemporary change efforts—a
blending of accountability and institutional autonomy, a merging of internal and external values
with respect to the purposes of public higher education, and prospective rather than retrospective
bases for the legitimacy of certain forms of higher education practice. PASSHE Executive Vice
Chancellor Peter Garland described the linking of these initiatives to state policy priorities in a
newspaper interview in 2013 following the Supreme Court decision in Fisher v. Texas:
"The Pennsylvania State System of Higher Education (PASSHE) does not have a policy
similar to the one reviewed by the U.S. Supreme Court. However, the leadership of
PASSHE and its fourteen member public universities strive to assure that our campuses
reflect the diversity of the Commonwealth’s citizens. To meet this goal we participate in
a number of initiatives specifically focused on reducing gaps in access and achievement
for under-represented and low income students such as the national Access to Success
program.
"It is important that these students not only earn admission to our universities but
also graduate at the same rate as all of our other students … The PASSHE performance
funding program—one of only a few in the nation—measures and tracks progress in
closing these gaps as well as efforts to ensure diversity in our faculty and other staff.”
(Wenner, 2013, June 24, n.p.)
The use of “national initiatives” to meet goals embedded in a performance-based accountability
system is thus a unique and timely opportunity to study a particular new policy innovation in
higher education.
The second important feature of the PASSHE case is that, in a pragmatic sense, it offers a
new context in which to study organizational change and to attempt to bridge a perplexing
paradox evident across the higher education change literature. Though I leave the exploration of
this paradox to the literature review, the short version has been summarized by Hearn (1996):
Colleges and universities are among the oldest forms of social organization, so it is not
particularly surprising that they often exhibit deep, institutionalized resistance to
fundamental change. It has often been quipped that ‘Trying to change an educational
system is like trying to move a cemetery: there’s not a lot of internal support for it.’ Yet
the centuries-old survival of postsecondary institutions may be due not only to their
ability to resist undesirable change but also to an equally remarkable ability to adapt
successfully to emerging circumstances. Change does indeed take place in colleges and
universities, and sometimes it is far from trivial. (p. 141)
In other words, higher education organizations appear simultaneously to never change and yet to
be remarkably adaptive. Though my study examines change over a relatively short period of
time (two to three years) during the course of these specific initiatives, in nearly every site these
initiatives are reconfigurations of pre-existing committees tasked with examining issues of
retention or equity (or diversity, more commonly). Figure 1 illustrates the location of the specific
data-use initiatives of interest here in the longer history of a typical PASSHE university. This
timeline of activity at the system policy and institutional practice levels demonstrates the
continuity of practices like strategic planning committees as well as the interruption or
emergence of new committees in relation to system policy. Figure 1 also indicates that the pre-
existing committees tasked with examining (for example) retention and transfer outcomes were
in most cases merged into a reconfigured committee (often retaining a core group of the same
members alongside a handful of new members) tasked by the provost to examine equity and student
success using the tools and processes of the external initiatives. Thus the PASSHE setting
represents an opportunity to examine the extent to which the appropriation of new tools and
processes into existing practices (e.g., retention committees and strategic planning) serves as a
form of adaptation and/or accommodation to the shifting policy environment while also
maintaining certain core forms and practices within the institution (Hearn, 1996; Neave, 2012).
Aims, Premises, and Contribution of this Study
Aims. Though relatively small in scope, this study follows in the vein of previous studies
(most closely, Clark, 1998, and Sporn, 1999) that examine the complex processes by which
change occurs in colleges and universities. Clark’s (1998) introduction to his study is compelling
and relevant to my own project:
University change in Europe at the end of the twentieth century, and arguably elsewhere
in the world, centers on the development of self-regulating universities, places that
actively seek out stronger means to become and remain competent societal institutions.
The specific elements identified in this study serve as such means. Relatively concrete,
the operational pathways developed here are more tangible to the touch than the grand
rhetorics that flow readily in and around universities in ambiguous terms of liberal and
professional education, accountability and assessment, bureaucracy and community, state
and market. Broad interest in purpose and implementation are brought down to specific
categories of university organization. We follow the trail of the simple maxim that
universities change their character as they change the specific ways in which they
operate. (p. xvi)
Like Clark (1998), I am interested in change as a function of specific, tangible activities; but I
push further than Clark did in this direction, relying on recent developments in practice and
activity theory to distill the changing nature of higher education practice. To be clear, then, the
goal of this study is not to attempt to determine whether some change initiative or another within
the participating universities has brought about a change in “outcomes” or has been otherwise
effectively implemented or successful according to some normative criteria. Rather, this study
aims to examine deeply a process of change through a distinctly focused theoretical lens that
attends to (1) how change occurs in the daily work of practitioners as the result of new tools,
language, and objectives, and (2) the relationship between that changing daily work and the
organizational structures within which it occurs.
Premises. The following premises also guide this investigation. The first premise is that
if policy or practice reform efforts are intended by their implementers to change the ways
institutions “do business” such that they produce some greater equity in, quality of (however
defined), or quantity of the things they produce (i.e., learning and scholarship), the mechanisms
of that reform necessarily imply a process of “learning” and change in daily practice.
Researchers therefore need to conceptualize change not functionally as mechanisms and
indicators but as an impetus of practices at the individual and organizational level that are both
durable over time and capable of changing in response to changing conditions. That is, how is
change enacted in the activities individuals and groups routinely carry out, whether in response
to shifting internal or external conditions, a sense of moral or professional obligation, or in the
course of adjusting that everyday activity to the pursuit of a new goal (de Certeau, 1984;
Nicolini, 2013; Thévenot, 2001)? The relationship between practices and learning in higher
education, particularly around data-use and accountability, has not yet been thoroughly
examined, either theoretically or empirically (Ewell, 2012; Stensaker, Henkel, Välimaa, &
Sarrico, 2012).
The second premise is, therefore, that theorizing about accountability in higher education
could benefit from a “turn to practices” (Schatzki, 2001) in order to move beyond constructs that
cannot fully capture the complexity of the enactment of change across organizations (Feldman &
Orlikowski, 2011), as demonstrated by the equivocal findings of existing empirical research.
There is yet little evidence or rich description of how change is actually understood or enacted in
practice by those working within higher education, nor of what the conditions are that have
supported the accomplishment of that change.
1
Finally, and importantly, this study is not naïve about the power, conflict, and politics
that shape the ecosystem of public higher education generally speaking or specifically within the
case studied here. Many of these dynamics will emerge as key elements in the case narratives. I
do, however, avoid implying judgment about which positions within these power-hierarchical
structures represent the “true” voice, purpose, or “good” of higher education. The theoretical
framework guiding this study locates power and conflict not as taken-for-granted structural or
macro-social facts but rather as phenomena of practices ordered and enacted in ways that enable or prohibit individuals’ pursuit of their objectives (Bourdieu, 1980; Nicolini, 2013; Schatzki,
2001), and which therefore have the quality of being mediated (by artifacts, routines, rules, and
so on)—and thus also the potential of being re-mediated and reformed.
In sum, this study begins from the broader question not just of whether change is
occurring in higher education institutions according to the reform regime of the completion
agenda, but how such change may enter into the work life of higher education faculty,
administrators, and staff. As Luker (2008) suggests, qualitative research is concerned not with
testing known categories in the social world but “trying to discern the very shape and elements of
the categories themselves” (p. 38). In this study I aim to discern the shape and elements of a
practice of change.
1
This study is part of a larger study conducted by the Center for Urban Education at the Rossier School of
Education at USC, funded by the Spencer Foundation, which aims to contribute to precisely this gap in the research
by examining the conditions under which data use around issues of inequity in postsecondary outcomes lead to the
creation of actionable knowledge and sustained, increased capacity for learning and acting on issues of equity.
Research questions.
Given these aims, premises, and intended contributions, the research questions guiding
this study are as follows. Within the specific confines of the case setting described above:
(1) What are the practices associated with the objectives of improving student success
and equity in universities?
(2) How are those practices re-mediated in the course of data-centered, team-based
change initiatives?
(3) What are the organizational/structural implications and extensions of such changes
in practice?
The evidence elicited by these research questions is not an inventory of practices, but
rather an analytic accounting for what an array of activities reveals about what Sporn (1999)
calls a university’s “adaptive capability” and what I will call “learning structures,” a term that emerged from my findings. My goal in studying these practices and their attendant structures
is to discern who is involved in change, what they are doing, how they go about doing it, what is
different about these practices compared to other or prior activities, and how these practices
change over time in relation to the university’s organizational structures, norms, policies,
histories, and objectives.
CHAPTER II: Literature Review
There is an apparent paradox between two grand narratives that emerge from the
literature on change in higher education. On one hand, colleges and universities are complex
organizations that are, depending on the perspective, well-suited to resist change or resistant to
needed change—at any rate, purposefully changing colleges and universities is like “moving a
cemetery” (Hearn, 1996). The predominance of faculty autonomy and shared governance,
multifarious and even contradictory goals and expectations vis-à-vis society, and the diffuse and
immeasurable nature of knowledge as an organizational “product” make higher education
institutions—depending on the perspective of the observer—immune to, resistant to, or incapable
of change in either organizational form or practice (Hearn, 1996; Clark, 1983a; Clark, 1983b;
Weick, 1976).
In keeping with the notion of university organizational structures as arbiters of unwanted
change, virtually every major inquiry into processes of change in higher education makes
reference to Weick’s (1976) suggestion that colleges and universities are “loosely coupled
systems.” Loose coupling suggests that two elements within an organization, or in an
organization and its environment, may have varying degrees of interdependence such that,
although the two elements may be linked structurally, changes in one element have little or
uncertain impact on the other (Glassman, 1973; Meyer & Rowan, 1977; Thompson, 1967;
Weick, 1976). If and when change occurs, then, its outcomes may be nonetheless unpredictable
and short-lived.
Birnbaum (1988) offered a somewhat more optimistic notion of higher education
institutions as “cybernetic systems” that manage complexity by dividing responsibility for
achieving their many goals among subunits and maintaining complex feedback loops that prompt
and regulate change at varying levels within the organization. The particular cybernetic
dynamics of academic organization can suppress or facilitate organizational efficiency and
effectiveness, and therefore facilitate or frustrate change efforts, depending upon the flow of
information and the capacity at any given level to make or delegate necessary decisions based on
valid information. Capacity for change is thus a function not only of the organizational form (and
the coupling or uncoupling of cause and effect it engenders) but also the dynamics of
information collection, dissemination, and communication within and across the subdivisions of
the organization.
And if the complexities of organizational form and control had not complicated the
picture enough, change in higher education must also be filtered through multiple and sometimes
competing cultural forces. Bergquist (1992) identified first four and later six (Bergquist &
Pawlak, 2008) distinct cultures within academic organizations that each create “contending
points of reference” that must be engaged for change to be successful. Culture, Bergquist and
Pawlak (2008) suggested, “provides guidelines for problem solving, and more generally, serves
an overarching purpose…the production of something valued by its members” (p. 10). Six (at
least) different “cultures” thus create the potential for at least six different frames of reference
through which higher education practitioners may value, prioritize, resist, protest, champion, or
subvert any given effort at change.
These texts have become canonical; one or all are routinely cited at the beginning of
much of the work examining organizational change, leadership, learning, or innovation in higher
education as a way of contextualizing the change process and foregrounding the entrenched
patterns of practice through which academic work occurs. Christensen and Eyring (2011) posited
such a view using the notably deterministic analogy of “DNA” to support their call for disruptive
innovation:
A university cannot be made more efficient by simply cutting its operating
budget, any more than a carnivore can be converted to an herbivore by
constraining its intake of meat…For example, requiring universities to admit
underprepared students is unlikely to produce a proportional number of new
college graduates. It is not in the typical university’s genetic makeup to remediate
such students, and neither regulation nor economic pressure will be enough,
alone, to change that. (p. 21, emphasis added)
In short, change in higher education is so complicated, it is no wonder we hardly ever see
it. If the way universities and individuals within them work is embedded at the molecular
level of their constitution, the only options for change are to evolve over millennia or, as
Christensen would prefer, to create drastic breaks in the whole system so as to engender
new educational forms altogether.
In support of this narrative is an extensive body of empirical work questioning whether
academic organizations have the capacity to generate and sustain real change. Levine (1980), for
example, illustrated how innovations in higher education may fail to be institutionalized because
of disagreements over those innovations’ profitability for the institution (or groups within it) or
their compatibility with existing needs and norms. Similarly, Clark (2004) revisited his
optimistic earlier study (1998) about transformation in European universities towards new
entrepreneurial practices and structures, finding that for nearly every instance of sustained
transformation there was another example where changes had been abandoned or resulted in
misalignment between new practice and entrenched organizational forms, goals, or values. And
in a smaller scale study, Blin and Munro (2008) puzzled over why they saw no fundamental
change in modes of instruction among faculty at their university following the introduction of
new technology (Moodle). Echoing Levine’s (1980) findings with respect to profitability and
compatibility, Blin and Munro’s colleagues generally reported that they simply saw no need for
changing their practices and felt comfortable with existing modes of preparing for and delivering
their courses. An array of PowerPoint, Word, and email functionality, their respondents noted,
served them just fine.
And yet there in Blin and Munro’s (2008) finding of “no change” is a small indication of
the paradox of change narratives in higher education. In form, higher education looks much the
same as it did 100 years ago. As former Harvard president and Treasury Secretary Larry
Summers observed,
[T]he world is changing very rapidly. Think social networking, gay marriage, stem cells
or the rise of China. Most companies look nothing like they did 50 years ago. Think
General Motors, AT&T or Goldman Sachs. Yet undergraduate education changes
remarkably little over time… With few exceptions, just as in the middle of the 20th
century, students take four courses a term, each meeting for about three hours a week,
usually with a teacher standing in front of the room. Students are evaluated on the basis
of examination essays handwritten in blue books and relatively short research papers.
Instructors are organized into departments, most of which bear the same names they did
when the grandparents of today’s students were undergraduates. A vast majority of
students still major in one or two disciplines centered on a particular department.
(Summers, New York Times, 2012, January 20, n.p.)
And yet, in practice, the delivery of higher education today—via PowerPoints, email, online
registration systems, and so on—would be unrecognizable and terrifying to college students of
the mid-20th century. Nor would those students recognize the elaborate student centers,
recreational facilities, or franchise-filled food courts of modern campuses. The second grand
narrative is thus that higher education institutions do in fact change, constantly and often
significantly.
This second narrative in the literature is constituted by arguments (sometimes
ideological) about whether and why higher education institutions change. Studies in this vein
demonstrate how dramatic shifts in organizational structure have occurred within higher
education in response to a shifting political and economic ecology. For example, Rhoades (2012)
cautioned that retrenchment in taxpayer support for higher education and a concurrent push
towards “completion” as an economically defined political goal have resulted in dramatic shifts
in policy and practice across sectors, whereby low-income and minority students are being
squeezed out of traditionally open-access institutions. Slaughter and Leslie (1997) and Slaughter
and Rhoades (2000; 2004) showed how privatization ideology in government and reductions in
state funding to higher education have prompted colleges and universities to restructure in order
to compete in a global knowledge-based economy, bringing about dramatic changes in both
internal organizational structures and the practices of faculty and administrators within them.
Thus, to finally get to the other hand, those who work in higher education have ample
anecdotal and empirical evidence that, in fact, change in colleges and universities is continuous,
cumulative and, as Clark (1983a) and Hearn (1996) have noted, often dramatic.
Grounding the Grand Narratives in Practice
The following discussion reviews a small selection of literature from the much broader
universe of conceptual and empirical work on change in higher education (borrowing from other
fields where helpful) in order to consider the ways in which the ontological and methodological
foundations of change research may contribute to this apparent paradox in change narratives,
substantiating the complexity of change in both generative and, perhaps, limiting ways. I also
look to the literature to help identify key patterns and variables that emerge (sometimes
explicitly, sometimes implicitly) from existing research and that can guide the present study.
A review of broader higher education change literature, both empirical and conceptual,
confirms that indeed there is no single categorization scheme or contingency calculus by which
we might be able to predict that “change occurs under these circumstances” or “these are the
factors necessary to successfully effect change.” Despite the difficulty in putting parameters
around what exactly is meant by change or how it occurs, however, we in fact know quite a bit
about many different overlapping and entangled dimensions of change in higher education
organizations, from intentional processes of innovation (e.g., Levine, 1980) and reform to less
intentional processes of evolution (Hearn, 1996; Kezar, 2001; Kezar, 2013; Stensaker, Henkel,
Välimaa, & Sarrico, 2012).
2
The literature addresses change in many contexts and dimensions:
as a function of accountability policy and reform (i.e., “top-down”); in terms of organizational
“learning” and professional development (“bottom-up”); at the scale of small department-level
structural changes; and at the level of institution-wide transformation (Kezar, 2001). Across this
varied research there is evidence of both intentional and unintentional change, change efforts that
2
Kezar (2001) suggests a conceptual distinction between models of change and specific “narrow” strategies for or
processes of change, such as innovation, adaptation, diffusion, implementation, and reform (pp. 13-15). For the
purposes of my study, however, I am concerned with understanding processes of change that I argue look similar, at
the level of practice I focus on, regardless of their specific character. I also find that existing literature does not
adhere cleanly to this schema. I thus use the term “change” generally to mean any of these types of intentional
change and in reviewing prior studies I maintain the words used by authors.
appeared to produce different outcomes as well as those with no apparent impact, and change
efforts that were sustained as well as those that were thwarted, distorted, or abandoned. The
apparent paradox with respect to whether higher education organizations change is thus not a
function of scale, nor of whether the prompt for change was top-down, bottom-up, external,
internal, political (e.g., accountability policy), or professional (accreditation). The puzzle seems
to span all of the boundaries, sectors, and categories researchers have imagined for organizing
their inquiries.
Nor do higher education scholars have any shortage of conceptual and analytic tools with
which to approach inquiries of change. Kezar (2001; 2013) provides an exhaustive review of
theoretical approaches and conceptual models for understanding organizational change in higher
education. She emphasizes, however, that much of the literature (and practice) regarding change
attends insufficiently to theories of change processes—an omission that limits our ability as a
field or as individuals to understand, predict, enact, or lead change (Kezar, 2013, pp. xiv-xv). Her
assessment of existing literature echoes that expressed by Stensaker and his colleagues, who in
the introduction to their edited volume on the topic (Stensaker, Henkel, Välimaa, & Sarrico,
2012) suggest that: “[T]he studies conducted so far provide mixed evidence with respect to the
impact of all these [change] initiatives at the shop-floor level of higher education…Hence, while
performance of the sector may have improved, one still faces some challenges in explaining how
‘change’ takes place at the institutional level…” (pp. 1-2).
At a basic level, the fact that we have only mixed evidence of or knowledge about change
processes may be simply a reflection of the complexity of the “organized anarchy” of higher
education organizations (Cohen & March, 1974) and the multiple, nested tiers of organization
that comprise universities (Clark, 1983b). Perhaps no body of research looking at one aspect or
level of change could thus ever be totally reconciled with another body of research focused at a
different level or unit of analysis.
At a deeper level, however, the puzzle of change research in higher education may also
stem from the ontological assumptions and discursive architecture within which it is conducted.
Stensaker (2003) suggests a problem with assumptions about change in his study of the
implementation of External Quality Monitoring (EQM) in European universities, where the
finding that EQM failed to bring about its intended changes in outcomes was—he argues—the
function of an overly “mechanistic view” of policy implementation and organizational change (p.
153) as opposed to a perspective that would look to change within a continuous process of
interaction between actors and features of the EQM system. In a more general sense, Tierney
(1998) expressed the same concern in the introduction to an edited volume addressing the need
for fundamental, internally guided change in higher education: “Colleges and universities have
been beset with suggestions for how to treat one or another illness rather than focusing on how to
ensure that we no longer get ill. We have been trapped by our assumptions” (p. 3). When
research on change focuses at the level of symptom, to extend Tierney’s metaphor, it may miss change (or the lack thereof) at the level of illness. Similarly, if we look for mechanisms of change assuming that imagined constructs such as “culture” or “capacity” will be evident as mechanisms or objects of change, we may fail to see that those constructs can appear to endure despite (or perhaps because of) subtle changes in the activities that collectively enact such macro-level artifices.
With this overview of tensions in approaches to studying organizational change in higher
education as a backdrop, I move now to reviewing literature on issues of change specifically
relevant to this study, noting where and how these tensions play out within existing research.
Indeed, assumptions about the nature of change, and in particular a tendency to view (and thus
measure) change from a functionalist perspective, recur throughout much of the existing literature
regarding specific issues in which organizational change is the primary concern, even if change
is not nominally or explicitly referenced in the research questions driving the studies.
Given the nature of the change initiatives on which this study focuses and their “hybrid”
location at the nexus of PASSHE’s performance-based funding policy and campus-based data-
use initiatives, two areas of literature in particular are relevant both as exemplars of common
approaches to studying change and as sources to inform the content and methods of this study
specifically: (1) institutional change as a function of performance-based accountability policy
and (2) the implementation of data-use initiatives.
Change as a function of performance-based accountability policy. This case study
situates questions about how change occurs in higher education within the context of both a
performance-based funding policy and campus- and system-wide initiatives in which groups of
individuals come together to assess, demonstrate, and find ways to improve upon measures of
institutional performance (for example, with respect to learning, students’ retention and
completion, and equity). The context of performance-based funding in particular provides a rich
case for study because of what prior research has shown about the tendency of this genre of
policy specifically to exacerbate an alleged “culture gap” between academia and policymakers
and to undermine institutional improvement (Dougherty & Reddy, 2013; Ewell, 2011; Shulock,
2003) and because of the ways in which such accountability policy contributes to the changing
“cadence” of change and the locus of goals, measures, and values associated with contemporary
reform (Neave, 2012).
Research examining the implementation and impact of performance-based funding
policies echoes the equivocal nature of broader literature on change in higher education. Many
studies suggest that performance-based accountability policies implemented in the 1980s and
1990s were generally ineffective. In part, scholars attribute their ineffectiveness to overreliance
on output indicators (of “political” interest) without attending to the processes of institutional
improvement (within the “professional” domain) that were needed to achieve their intended aims
(Burke & Modaressi, 2000; Dougherty & Hong, 2006; Ewell & Jones, 1994). In a series of
studies across states, Burke and his associates identified as a “common and fatal flaw” among all
types of performance accountability measures that the priorities and responsibilities
communicated to campus leaders through such policies did not filter down into the academic
departments ultimately responsible for making improvements in practice (Burke & Minassians,
2003, p. 20).
This finding was confirmed by other, smaller case studies of performance policy
implementation within specific institutions or systems (e.g., Bogue & Johnson, 2010). Harbour
and Nagy (2005) found, for example, that community college administrators dismissed the
accountability system in their state as irrelevant to their work, although the authors also found
evidence that the accountability system metrics had become somewhat embedded in planning
and other practices. Huisman and Currie (2004) similarly found that administrators and
faculty interpreted performance-based accountability policies as symbolic “window-dressing;”
the authors in turn interpreted practitioners’ responses as resistance to making any changes in
practice in accordance with the policies’ incentives (p. 548).
At the same time, however, many of these studies simultaneously observed numerous
ways in which the presence of the performance-funding systems had impacted the way
individuals went about their work within institutions. For example, Burke (2002) found that the
presence of a performance-based accountability system had led to greater use of data in
institutional planning and increased awareness among administrators of state priorities and of
their own institution’s performance. And in a comprehensive review of dissertations and other
unpublished studies done on specific institutions (particularly in Tennessee and Florida, where
performance-based funding policies have been in place the longest), Dougherty and Reddy
(2011) and Bogue and Johnson (2010) found extensive examples of program restructuring,
changes to curriculum and instruction, and improvements in student services (practices that the
authors could not causally link to the presence of a performance-based accountability system but
that nonetheless contributed to and coincided with the intended aims of that policy). Dougherty
and Reddy (2013) documented even more extensive examples of institutional responses they
attributed to the implementation of performance-based funding policy, including both structural
changes (e.g., altering graduation requirements and adding new programs for retention) and
changes in practice and procedure (e.g., new models of academic counseling and enhanced
career advising).
Consistent with classic policy implementation theories (e.g., Sabatier & Mazmanian,
1980) and similar perspectives that have informed much of this research (Saarinen & Välimaa,
2012), the premise behind the vast majority of the literature on performance-based funding has
thus been that its influence on change within institutions could be understood as a rational, linear
process of organizational response facilitated or inhibited by the interests or resistance of
enacting practitioners (e.g., faculty). The question as framed by this approach is thus whether or
not colleges and universities brought about the desired changes (without attending to how),
measured primarily in terms of changes in outcomes like retention and graduation rates. But like
the tension found in the broader literature, the equivocal nature of findings derived from these
rationalist implementation-oriented studies generates more questions than it answers. For
example, why do leaders perceive that performance-based funding has had no impact on the improvement of their institutions, yet researchers are able to document changes in practice and structures (e.g., Burke & Modarresi, 2000; Burke, 2002)? What might administrators mean when they say that an accountability system is “window dressing” (Harbour & Nagy, 2005), and how does that interpretation affect the use they make of the data and processes required by the policy?
How are policymakers themselves involved in the “learning” generated by the implementation of
the policies that contributed to changes observed over time (e.g., Burke, 2002)?
More recent studies of the impact of performance-based funding have tried to move
beyond the functionalist assumptions of prior approaches and have started instead from a
hypothesis that if accountability policy were to achieve its intended outcomes, such effects
would have to be mediated by and reflected in “intermediate outcomes”—specifically, processes
of organizational learning and practice change within institutions (Dougherty & Reddy, 2013;
Jones, Dougherty, Lahr, Natow, Pheatt, & Reddy, 2014). These studies not only start from a
different premise than prior work on organizational change vis-à-vis accountability policy but
also incorporate new forms of data; for example Dougherty and Reddy (2013) posit that,
“Performance funding programs can also stimulate institutional change by providing resources to
improve the capacity of colleges to engage in productive organizational learning” (p. 41). The
building of such capacity can thus itself be a measure of policy impact on organizational
change. Indeed, Boyce (2003) and others (e.g., Kezar, 2011) have suggested that organizational
learning is critical to the sustainability of change efforts broadly defined.
3
3
While there is clearly a vital link between organizational learning and organizational change, I am not here delving
into the immense literature on organizational learning except in the instances where the studies I review themselves
Change as a function of data use and organizational “learning.” The concept of
organizational learning also provides a link between theories of change underlying initiatives
promoting (and studies of) data use in higher education. A number of studies have examined data
use as an impetus of change, focusing primarily on the types and frequencies of data used
(specifically among Achieving the Dream community colleges, e.g. Morest & Jenkins, 2007;
Jenkins & Kerrigan, 2008). Again echoing the broader literature, many of these studies produce
confounding results. For example, despite the push among initiatives like the Washington State
“Student Achievement Initiative” and Achieving the Dream (nationally) to build a culture of
evidence, studies have shown that the frequency with which practitioners access and make use of
data is generally low other than among top administrators, institutional researchers, and a select
few others (Jenkins, Wachen, Moore, & Shulock, 2012; Morest & Jenkins, 2007). Moreover, the
data-use practices of these individual users appears to bear little relationship to the overall
institutional structures and practices around using data to facilitate change or improvement
(Jenkins & Kerrigan, 2008), bringing into question the potential for widespread or long-term
impact.
Initiatives to promote the use of data as a driver of organizational change—many of
which are directly or indirectly linked to performance-based accountability systems—all operate
on a theory of change suggesting that the injection of new forms of data into the work of
administrators, faculty, or staff will create a “culture of evidence” that normalizes the practice of
collecting and analyzing empirical data on student progress in order to make informed decisions
about when and how to change (Morest & Jenkins, 2007; Witham & Bensimon, 2012). Within
make connections between change and learning theories. My theoretical framework takes a different approach,
suggesting that learning is embedded within the practices associated with change; that is, from a practice perspective
change is equivalent to learning in the form of re-mediated activity directed at revised objectives (e.g., Engeström,
2000).
41
many such data-use initiatives—and approaches to studying them—there is thus a dominant
focus on data as the source of self-evident information about how to proceed or change course
and an assumption that practitioners exercise objective, rational appropriation of and action
based on the information gleaned from those data (Dowd, 2005). A growing body of research
(particularly in K-12 education), however, has demonstrated that mechanisms of data-driven
learning and change depend upon the capacity of practitioners to access and make sense of data
as well as the social contexts in which they do so (Colyvas, 2012; Dowd, 2005; Witham &
Bensimon, 2012). Dowd (2005) argued that the theory of change behind data-use as a
mechanism of change within accountability systems should hinge upon a “culture of inquiry”
rather than a “culture of evidence,” emphasizing in this semantic adjustment a shift “from the
data to the decision-maker as the locus of change” (p. 2). Scholars working on issues of policy
implementation and reform in K-12 schools have arrived at similar conclusions based on studies
guided by theories of sense-making (Weick, 1995) and communities of practice (Wenger, 1998).
Coburn, Touré, and Yamashita (2009), for example, conducted a longitudinal study of changing
decision-making processes in district offices using sense-making and framing theory. Their
theoretical perspective enabled them to see how data, while imagined in accountability policies
as providing “objective” information about performance (Ewell, 2012), actually served as a raw
text around which a complex process of interpretation, persuasion, and argumentation occurred
in order to produce a socially constructed form of “evidence” in support of decision making.
Coburn and Turner (2012) similarly emphasized the importance of situating the study of data use
analytically within contexts of practice and of taking into account how the values and norms of the
broader institutional environment come to shape and become embodied in practice.
Evaluating a range of data-driven decision-making interventions, Marsh, Pane, and
Hamilton (2006) made a similar conceptual distinction between raw data, information, and
actionable knowledge, finding that different kinds of information and forms of actionable
knowledge are generated from the same raw data through social processes of interpretation. This
research has shown that the kinds of actionable knowledge generated from accountability data
vary depending on the types of decisions and decision-makers involved in interpreting and
making use of that data (Marsh, Pane, & Hamilton, 2006) and the positions of those decision-
makers within the education system as well as their previous experiences with reforms (Coburn
& Talbert, 2006).
Much of Coburn’s other research (2001; 2004; 2005) has studied the interactive nature of
teachers’ implementation of new instructional models and reform-driven practices. Throughout
these studies Coburn has shown how teachers’ use of the information generated by
accountability or reform programs is significantly shaped through patterns of interaction with
other teachers. Using a similar theoretical approach, but examining the implementation of new
practices in hospital intensive care units, Tucker, Nembhard, and Edmondson (2007) found two
distinct patterns of learning associated with intended changes in practice: learning what, as in
learning the basic technology of best practices, and learning how, which reflected hospital staff
learning together through interaction and experimentation how to operationalize those practices
in their local settings. Social interaction within local work settings thus served to mediate the
influence of the values and priorities embedded in external practices or policies.
Other scholars have similarly emphasized that the social processes of interpreting
accountability data may result in significant divergence between the goals of accountability data
and its perceived value and use among practitioners. Ingram, Louis, and Schroeder (2004), for
example, found that accountability data and prescriptions for their use in guiding practice were
frequently perceived with mistrust by teachers, and that teacher communities maintained their
own preferences for the types of data that were useful to guide practice (a common theme in
data-use research generally; see e.g., the review by Marsh, 2012).
Across the range of empirical and conceptual work pertaining to these two specific change
issues—in relation to performance accountability policy and as a function of a push
for greater data-use in educational organizations—there is thus ample evidence to support both
of the grand narratives about change and no-change in higher education. But there are also, in
recent research, promising approaches to solving the puzzle, for example in theorizing
incremental and intermediate elements of the change process (as in Dougherty & Reddy, 2013)
and in attending to the social learning and situated work practices associated with enacting
change efforts (as in Coburn’s work).
Broader Lessons about Organizational Change from Higher Education and Beyond
Thus far I have summarized empirical scholarship on two particular issues of
organizational change in higher education (and K-12)—(1) as a function of performance-based
accountability policy and (2) within data-use initiatives—noting the conceptual approaches and,
in some cases, limitations of those approaches. I have traced throughout this literature where
there are iterations of the two grand narratives vis-à-vis whether and how change occurs. But I
have argued that the apparent paradox may be at least in part a function of conceptual limitations
of much of the change research.
I want now to demonstrate how varying existing conceptual approaches can be generative
of valuable propositions about change in higher education and at the same time, to illustrate how
embedded within many of the existing accounts of college and university change are rich and
often unexploited data about the practices through which change is mediated and enacted. I will
then take up the notion of “practice” as an ontological foundation in the theoretical discussion in
the next chapter.
In two of the most expansive existing studies of university change, Clark conducted in-
depth inquiry into the processes by which five European universities (1998) and later nine
additional universities in Africa, South America, and the US (2004) pursued fundamental change
efforts to become more entrepreneurial in response to shifting state, national, and global
economic and social forces. His goal was to identify what he called “pathways of
transformation”—the “means of entrepreneurial action” by which universities “go about
transforming themselves” (Clark, 1998, p. 5). Across the first five case narratives, for example,
Clark identified five propositions about “irreducible minimum” elements of change: (1) a
“strengthened steering core” within the university constituted by members of both administrative
and academic departments; (2) an “expanded development periphery” facilitated by new or
improved organizational structures linking academic departments to external audiences and
sources of potential revenue; (3) a diversified funding base including grant and contract revenue;
(4) a “stimulated academic heartland,” meaning that the academic core of faculty and their
departments are engaged in the managerial work of transformation, instilling academic values in
that work but also adopting a “modified belief system” in the process; and (5) an “integrated
entrepreneurial culture,” which Clark describes as the result of institutionalized new behaviors
and operationalized new beliefs (1998, pp. 5-8).
In suggesting these five common elements of transformation, Clark also emphasized their
interaction as a part of a holistic internal, structural capability for change. He referred to these
multiple strands of transformation as constituting an “interactive instrumentalism,” which he
described in terms of the iterative movement of change across domains of practice within the
university:
As we have seen by reviewing development over ten to 15 year periods, the building of
structured capability and cultural climate takes time and is incrementally fashioned.
Action taken at the center requires faculty involvement and approval. Change in new and
old units in the periphery and in the heartland is piecemeal, experimental, and adaptive.
(1998, p. 145)
In his update to the original five case studies, Clark (2004) elaborated these ideas, suggesting
that a sustained process of transformation requires a "bureaucracy of change," which he defined
as an infrastructure built of "interlocked forms and interests that insist on continuous change,
incremental and accumulative" (p. 5). Based on his additional case studies, Clark (2004) further
argued that sustained change requires "collective responses that build new sets of structures and
processes – accompanied by allied beliefs – that steadily express a determined institutional will"
(pp. 5-6). In many of the cases Clark revisited, notably, a failure of transformation processes to
generate such new sets of structures resulted in fractured or distorted revisions of prior university
organization and practice.
Though Clark made no explicit theoretical provisions at the outset of his case studies,
preferring to work “more from ‘practice’ to ‘theory’ than the other way around” (1998, p. xv),
the propositions he offered as findings from his research reflect many of the assumptions and
emphases of theories and models of organizational change common in the literature. Theories of
change as an evolutionary and systemic process, for example, emphasize how changes in the
interrelated structures of higher education organizations interact as a system to respond to
environmental changes (Kezar, 2001; Kezar, 2013; Morgan, 1986), just as Clark suggested with
his concept of “interactive instrumentalism.”
Similarly, neo-institutional theories—though typically concerned more with unplanned
change (Kezar, 2013)—offer the concept of institutional “fields” in which norms of
organizational form and behavior circulate, conferring legitimacy on organizations that mimic or
adapt to those norms. In order to maintain access to resources, organizations pursue legitimacy
by behaving like other organizations such that a form of “collective rationality” emerges across
organizations with similar purposes and resource pools (DiMaggio & Powell, 1983). The pursuit
of legitimacy as a means to secure revenue and other organizational resources (political
influence, prestige, and so on) within these fields thus explains similarities in structure and
behavior among organizations in the field (DiMaggio & Powell, 1983; Meyer & Rowan, 1977).
Clark’s (2004) proposition about the necessity for modern higher education organizations to
build a “bureaucracy of change” in order to ensure continued survival and legitimacy reflects a
similar logic.
A careful reading of Clark’s data and his analysis of them reveals another interesting
feature: his narratives pay a great deal of attention to the ways in which specific individuals went
about enacting the processes of transformation that Clark eventually distills into propositions
about organizational “pathways.” For example, he notes that “[T]ransformation occurs when a
number of individuals come together in university basic units and across a university over a
number of years to change, by means of organized initiative, how the institution is structured and
oriented” (Clark, 1998, p. 4). And later, “The operational units, departments and research
centers, remain the sites where research, teaching, and service are performed: what they do and
do not do becomes finally central” (p. 145). Though Clark makes no explicit mention of the
theoretical importance of such activities as part of an overall theory of change, his attention in
data collection and analyses hints in some ways, I argue, at an empirical and ontological
orientation to collective work activity as central to the change processes he observes (Feldman
& Orlikowski, 2011).
Around the same time as Clark’s work in Europe, Sporn (1999) conducted a similarly
expansive case study at six universities—three in the U.S. and three in Europe—to examine how
in each case the institutions had adapted to challenges and pressures emanating from the external
environment. These specific challenges included, for example: cut-backs in state funding among
most of the public universities in her sample; crises at one large public flagship in the US around
issues of diversity and affirmative action; opportunities for specialization or upward selectivity
and prestige at a large private institution; and crises with respect to organizational dysfunction
and mandated reform from trustees and external stakeholders. In examining the universities’
responses to these various crises and opportunities, Sporn framed the change she wanted to
understand as "adaptation,” which “requires structural modification or alterations in order to
respond to changes in the external environment” (p. 74). Conceptually linking organizational
structure and environment through the idea of adaptation, Sporn thus focused in particular on what
organizational structures and processes were involved in each university's responses to their
respective challenges and thereby contributed to what she called an “adaptive capability.”
Though many of Sporn’s findings of structural factors enabling universities' responses are
abstract and perhaps too poorly specified to be of use in categorizing or making sense of change
efforts in other sites (e.g., "clan culture of collegiality" and "externally focused mission"), there
is among her case narratives—as in Clark’s (1998 and 2004) data—a recurring illustration of the
specific practices and bounded work of academics and managers within institutions. For
example, when Sporn described the importance of what she abstractly characterized as “the
leadership and management structures” that facilitated the University of Michigan’s response to
crises around diversity, what she actually provided were rich data about the leadership practices
at the institution:
First, the president of the university issued the Mandate for the university committee,
thereby communicating the importance of the problem of diversity. In addition to this
personal commitment, funds have been allocated especially for Mandate projects.
Second, UM’s structure was changed and a new office for minority affairs has been
added reporting to the provost…Third, central and local administration endorsed the
Mandate through different programs and actions. Deans, for example, have been
reporting their achievements to the provost in their annual budget review. (p. 139,
emphasis added)
Despite her use of the passive voice in places, each of the italicized verbs represents actions
taken by someone in the course of their daily work but in interaction with the broader process of
change at the university.
Sporn’s case data also reveal a pattern of interaction between those specific practices and
diverging or competing institutional logics, though she does not explicate the specific theoretical
importance of those interactions (Thornton, Ocasio, & Lounsbury, 2012).⁴

⁴ Institutional logics are conceived of by institutional theory as a mechanism by which the
norms and values of institutional fields both constrain and enable individual and group action
within organizations; they delimit the array of approaches individuals take to organizational
problems or scenarios but can be manipulated or subverted through individual or group agency
(Thornton, Ocasio, & Lounsbury, 2012). Institutional logics have a conceptual relationship to
Bourdieu’s (1980/1990) concepts of the habitus and logic of practice; both theoretical traditions
are explored in depth in Chapter 3.

For example, activities like those described above are often illustrated as occurring in
tension with or contradiction to other university priorities: activities taken in pursuit of becoming more
entrepreneurial contradict the tradition of faculty autonomy with respect to maintaining core
academic functions; actions to embed a commitment to inclusion and diversity are confounded
by a lack of "ownership" of diversity as a campus-wide issue; and so on. Reinterpreting Sporn’s
data with the aid of institutional theories thus reveals different and potentially more concrete
propositions about the nature of change and adaptation in her cases.
There are many other examples of case studies rich in descriptions of changes enacted in
the daily practices of individuals within higher education (e.g., in Christensen & Eyring, 2011,
and Zemsky, 2013), though such descriptions are typically treated as ingredients in institutional
abstractions about the change process (e.g., “culture,” “DNA”) rather than theoretically
important elements in their own right. In each case, these existing studies of change in higher
education have, from their various theoretical (or atheoretical) perspectives, generated an array
of critical “sensitizing concepts” (Charmaz, 2006) for future research with respect to the
institutional, political, and organizational factors associated with local cases of change. Below I
distill a few of these key findings that have informed data analysis in this study.
Many of these studies also yield findings consistent with the enduring puzzle about how
higher education institutions seem to simultaneously change so much and so little (Hearn, 1996). I
have argued that, in part, the puzzle may derive from inadequate attention to the nature of the
practice of change within its local settings as conveyed in many of these case study data.
Feldman (in Feldman & Orlikowski, 2011), for example, describes the value in her work of
reinterpreting her data on organizational change through a theoretical lens that attends to the
nature of practice:
One explanation for change in routines was the existence of “exogenous shocks.” These
are things like budget crises and new technologies that considerably alter the context that
routines operate in. The context the organization operated in during this period, however,
was very stable. Moreover, change had been theorized as the opposite of stability. For
example, one common explanation of the process of routine change is provided by
punctuated equilibrium theory…This theory suggests that routines enable people to
ignore small changes in the context until they accumulate into really big problems and the
routines have to be abandoned or overhauled to reflect and respond to the new context.
This punctuated change, however, did not describe what I was seeing. I was not able to
understand what I was seeing using the theories I had available to me at the time.
(Feldman & Orlikowski, 2011, pp. 9-10).
After spending a sabbatical delving into practice theory, Feldman re-theorized the organizational
routines she was studying as practices and in doing so recognized that change and stability,
which she had previously understood as contradictory forces, were in fact “different outcomes of
the same dynamic rather than different dynamics”: “This stew of theories provided a foundation
for a new way of conceptualizing routines and a way of understanding the relationship between
stability and change as a result of the internal (or endogenous) dynamics of the routine”
(Feldman & Orlikowski, 2011, p. 10).
The literature reviewed above illustrates how recent efforts in higher education, K-12
education, and other fields to approach change through similar theoretical lenses have helped to
bridge apparent discrepancies in findings between, for example, the processes of change in local
settings and apparent stability when considering only structural or institutional levels of
analysis. Such studies have also helped to enhance our understanding of how efforts to change
outcomes of work may benefit from a theory of change and a research design that foreground the
mediated social interaction of practitioners around new tools and artifacts such as data within
their work settings. I thus also distill key points from the literature about approaches to studying
change before I move into discussion of a theoretical framework to support and guide such
approaches.
Key Points from the Literature about Dynamics of Change
• Whether from policy initiatives or from the “bottom-up,” processes of change are
mediated by practitioner learning, both with respect to new tools and methods for new
ways of doing (i.e., learning what) and in the application of those tools and methods
to practitioners’ own local activities (i.e., learning how).
• Though data are often intended to serve as a core artifact to facilitate changes in work
procedures and outcomes, the meaning of those data is not self-evident or self-acting;
rather, meaning is generated through a social learning process, and choices about courses
of action based on data are mediated by existing priorities, preferences, and ways of
doing work. Further, the organizational capacity for supporting that type of learning
process impacts the change process. Sustained change is associated with a strong
“adaptive capability” (Sporn, 1999) or change-supportive infrastructure (Clark, 2004).
• Change in universities occurs over time through an incremental process wherein changes
in practice in a given setting (e.g., administrative planning or budgeting) move to other
settings (e.g., faculty meetings or committees) and back again. In each setting, the trajectory
of change is shaped instrumentally by the practices in the other settings.
Key Points from the Literature about Approaches to Studying Change
• Theoretical perspectives that attend to the interactions of individual action, organizational
structure, and institutional environment may provide more concrete propositions about
the nature of change than those that treat one as dominant.
• Attending to the dialectical nature of practice within change processes may also help
bridge apparent discrepancies or shortfalls of explanation in findings generated
through linear, implementation-oriented theories and models of change.
• Inquiry into change processes would benefit from an approach that facilitates observing
the movement of the change impulse across settings within an organization through
specific communicative and organizational practices (leadership, budgeting, strategic
planning, professional development, academic committees, and so on).
The Location of this Study in the Trajectory of Change Scholarship
In foregrounding these particular empirical and theoretical considerations from existing
literature, I am situating this study in what I see as the emerging terrain of a promising approach
to change scholarship. I find in Clark’s (1998) and Sporn’s (1999) studies an impressively
comprehensive approach to examining change processes and have modeled the design of this
study very much on theirs and on similar studies (though of course my study is much more limited
in scope). I have also structured my findings in a form similar to those offered by Clark (1998,
2004) and others—that is, in the form of propositions about dynamics of change (also,
specifically, Neumann & Pallas, in press). In these ways my study is a continuation of a tradition
of prominent case study research on change in higher education. But I have also found in the
literature promising new theoretical tools with which to collect and analyze case study data. My
study is thus situated accordingly within a new strand of research across disciplines that draws
heavily on advances in organizational theory and is as a result, I think, notably different in tone
and focus from traditional change research in higher education.
CHAPTER III: A Practice Theory of Change
The contribution of a practice approach is to uncover that behind all the
apparently durable features of our world there is always the work and effort of
someone. It also highlights that the relation between practices and their material
conditions—between ‘structure and process’—is conceived recursively as two-
way traffic. (Nicolini, 2013, p. 3)
The life of the academic organization is in the committee meeting.
—Sandy Shugart, President, Valencia College
Out of the review of existing literature I extracted several key themes with respect to both
dimensions and dynamics of change processes and approaches to studying those processes.
Among those key themes there is a focus on two general ideas. First, change processes in higher
education implicate both structural factors – such as policy, organizational form, history,
institutionalized power relationships (bureaucracy) – and individual and group activities,
routines, preferences, and values (Clark, 1983b; Clark, 1998; Hearn, 1996; Kezar, 2013; Levine,
1980; Sporn, 1999). Fully capturing the complexity of the change process thus requires a
theoretical and methodological approach to seeing not just one dimension or the other or both
dimensions separately but the dialectical nature of individual and group action within social
structures (Feldman & Orlikowski, 2011; Kezar, 2013).
The second prominent theme in the literature is that the change process is incremental
and involves the actions taken by individuals across domains in social work settings where they
collectively interpret and evaluate information, interpret and evaluate new tools or methods for
doing work, and integrate those new resources into their own individual and collective work in
varying ways and to varying extents (Blin & Munro, 2008; Clark, 1998; Coburn, 2004; Coburn
& Turner, 2012; Tucker, Nembhard, & Edmondson, 2007; Weick, 1976). Thus, studying change
also requires a theoretical lens that supports the analytic work of mapping these interactive social
learning practices, and which can help trace analytically the “movement” of resources for change
(new tools, data, accountability metrics, etc.) through the daily work of practitioners across
settings within the university (Feldman & Orlikowski, 2011; Nicolini, 2009a).
Taking these ideas as directive, the analytic work of this study thus involves ascertaining
the meaning of individual and group actions within a specific context and discerning what
lessons they collectively offer about change as a phenomenon constituted by (and generative of)
activity across settings not only within each university but also across the larger PASSHE case
setting. Moving the analysis of change into the domain of practices thus requires synthesizing
analytically two levels of inquiry and data: (1) the structural forces that shape individual and
group activity within the universities (culture, rules, norms, and so on) as conveyed through
documents, strategic plans, historical analysis, and so on; and (2) the subjective narratives
of individuals’ own involvement in and actions associated with the particular change initiatives
(Nicolini, 2013; Thornton, Ocasio, & Lounsbury, 2012).
As Feldman and Orlikowski (2011) and other organizational researchers have argued
(Nicolini, 2009a; Nicolini, 2013; Schatzki, 2001), the array of theoretical traditions that share a
common ontological premise about “practice” as the primary unit of social reality are
particularly well suited as a foundation for the analytic focus on the interaction between
dialectical relationships of individual/social, agency/structure, and change/stability, particularly
within organizational studies (Nicolini, 2013). I thus primarily draw on practice theory to
provide a theoretical framework for this inquiry, though I also draw on the concept of
institutional logics—which has a relationship to practice theory but derives from neo-institutional
theory (Friedland & Alford, 1991; Thornton, Ocasio, & Lounsbury, 2012)—because I believe it
is consistent with and complements practice theory in ways important for the study of change.
The Focus on Practices
At the most rudimentary level, the diverse set of theoretical traditions grouped under the
umbrella of “practice theory” share the assertion that practices (as opposed to structures, norms,
cultures, or subjective belief or cognition) are the primary unit of social and organizational
phenomena; the practice lens directs researchers to examine the relationship between what
individuals do, why they do it, and the context in which they do it (Bourdieu, 1980/1990;
Nicolini, 2013; Polkinghorne, 2004; Schatzki, 2001). Within this body of theory, “practice” is
defined as simply “arrays of human activity” (Schatzki, 2001, p. 2) or, more specifically,
“activity directed toward accomplishing a goal” (Polkinghorne, 2004, p. 71).
Perhaps the most limiting feature of practice theory is its confusing vocabulary; the
theory gives special meaning to a notion of “Practice,” but also to the practices or “arrays of
activity” that constitute a Practice. Before delving into the main tenets of the theory, I will thus
try to provide some clarification about its terminology. I find that yoga is a helpful clarifying
example: Yoga is a Practice; we say, for example, “my yoga practice” or “I practice yoga” (in
fact, referring to “practicing” yoga could be considered part of the performative nature of the
Practice).⁵

⁵ I am not going to maintain the use of a capital P throughout the study to signal when I am
referring to such theorized practices; I think it is unnecessary to do so. I use the capitalized
form here only to help make clarifying points with respect to the theory.

The Practice of yoga consists of an array of specific learned practices or activities –
sitting or standing in rows facing forward, completing sequential poses or movements, saying
“Namaste” at the conclusion of the routine, and so on. Moreover, the Practice of yoga is
accomplished through interaction with specific artifacts and physical settings – rubber mats,
towels, mirrored walls. In doing the practices of yoga, I am repeating the basic form of the
activities done by many others before me, and yet each time I attempt a pose it is my own
iteration of it, and during my enacting of the Practice I can also attempt new variations of the
forms to better suit my body, or I can also simply sit down (Nicolini, 2013; Schatzki, 2001). I
thus have agency within the general structure governing the Practice of yoga.
What the yoga example lacks that is otherwise relevant to practice theory is the sense that
in each of my iterations of the activities that constitute the Practice, I may also be shaping the
broader structure of the Practice as it continues on to be done by others. In this sense, imagining
the Practice of Law, Medicine, Science, and so on may be better examples – in these Practices
we can better imagine the ways in which individual agency in carrying out the activities (treating
a patient, arguing a case) may also change the nature of the Practice over time as new techniques
are developed or as enactments in one instance (for example, a legal case) become part of the
sediment of resources upon which the practice draws (as in case law and legal precedent).
Another important element of practice theory is that those who engage in any given
Practice do so in the course of accomplishing some general or specific goal or objective that is
shared across practitioners (Polkinghorne, 2004). This point seems obvious but is helpful in
distinguishing the practices and activities that constitute a Practice from others; in other words,
which practices are theoretically relevant is defined at least in part by their ordered performance
as part of a specific larger array of activities in pursuit of an objective. Walking down the street
is not by itself a Practice within the apparatus of practice theory; however, if my project were to
examine the strategic navigation of city streets as part of Modern existence, as in the case of
Baudelaire’s flâneur or in the tactics of the situationists’ dérive (Debord, 1958), then the activity
of walking down the street (and in the course of doing so, interacting with the physical and social
constraints of the constructed urban surroundings) becomes relevant through a practice theory
lens.
In application to organizational research, practice theorists thus imagine organizational
phenomena (both concrete, such as production of goods or services, and abstract, such as the
reproduction of hierarchy or the control of symbolic resources like language) as the result of
ordered practices, which are arrays of human activity directed towards some objective.
According to Nicolini (2013), “All practice theories foreground the importance of activity,
performance, and work in the creation and perpetuation of all aspects of social life. Practice
approaches are fundamentally processual and tend to see the world as an ongoing routinized and
recurrent accomplishment” (p. 3). The practice also incorporates its material and communicative
manifestations—the mat in yoga, the hammer in carpentry, the case brief in law—as well as
those historical and social structures that support its continued reproduction—the Hippocratic
oath in medicine, the State Bar certification in law, or the patterns of urban development that
shape the tactics of taxi drivers moving quickly through city streets (de Certeau, 1984; Latour, 2005; Nicolini, 2013; Schatzki, 2001). Within organizations, practice theory thus draws attention
to the social and material elements interacting with human agency and at the same time draws
attention away from imagined categorical boundaries for such activities, such as “institutions” or
“cultures” (Latour, 2014; Leonardi, 2012; Schatzki, 2001).
Habitus and the Structure/Agency Relationship
To better understand how the theory of practice attempts to merge human agency and
social structure, it is helpful to trace its origins and the conventions in social theory to which its
progenitors were responding. One of the primary influences in the development of what is today
termed practice theory was Bourdieu’s theory of the “habitus”—a construct that Bourdieu
intended to bridge between and thus supplant both the cultural determinism of structural
anthropology (cf. Claude Lévi-Strauss) and the relativism of poststructural and phenomenological philosophy and anthropology (cf. Franz Boas). The habitus, as de Certeau
(1984) suggested in his critique of Bourdieu’s construct, was an attempted “passage from one
genre to another: from ethnology to sociology” (p. 51); that is, the habitus was meant to reassert
human agency into the universalizing and deterministic theories of “the social” or “the cultural”
that were dominant among the structuralist sociologists (Bourdieu, 1980/1990).
The habitus is thus a theoretical mediating device that tries to explain the durable
reproduction (i.e., over time and generations) of social and cultural structures (for example,
socioeconomic class) through their enactment by individuals within social groups, but without
either denying the agency of the actor or reducing all possible social “truths” to subjective
individual experiences. In Bourdieu’s words, “the habitus is a spontaneity without consciousness or will” (1980/1990, p. 56). The habitus is thus akin to a “disposition” to act in a certain way in a given
situation, but without meaning to preclude adaptation in circumstances of crisis or complexity
(Bourdieu himself uses the term “disposition” as an analogous concept: 1984, pp. 52 & 64; Polkinghorne, 2004). Though they emerged from different disciplines and carry some distinguishing principles, the strands of practice theory that developed in the latter half of the twentieth century
shared Bourdieu’s epistemological intent—as Schatzki (2001) described it, “to free activity from
the determining grasp of objectified social structures and systems, to question individual actions
and their status as the building-blocks of social phenomena, and to transcend rigid action-
structure oppositions” (p. 1).
Giddens’s structuration theory further specified the dialectical relationship of human
action and social structure in creating routines that constitute social order (Giddens, 1984) and
has been a similarly dominant influence on theories of practice. Social routines, Giddens
suggested, are the result of the skillful application by human beings of available resources or
courses of action to constantly shifting social situations. Giddens emphasized the notion of a
“duality” of structure and agency inherent in practices. Owing in large part to these two
theoretical traditions, then, the goal of theorists advancing an interest in practices as the primary
material of “the social” has been to take seriously the meaning of everyday activity as human
agency, but to theorize that agency as enacted within the continuously reproduced social
structures of a particular physical and historical location (Feldman & Orlikowski, 2011; Giddens,
1984; Nicolini, 2013; Schatzki, 2001).
Sources of Change within Practice Theory
While practice theories thus acknowledge the durability of certain social constructs, such
as socioeconomic class (the primary antagonist in Bourdieu’s work), they locate the source of
such phenomena in the continuously reoccurring actions of individuals over time. In doing so,
the theory (to varying extents depending on the theorist) thus makes space for the adaptation and
mutation of practices and their meanings as the result of changing contexts and/or the willful
(and sometimes skillful) inventiveness of individuals. Indeed, de Certeau (e.g., 1984) suggested
that even the mediating theory of the habitus was over-determinant and underestimated the
extent to which individuals engage in adaptive and even subversive practices in daily life.
Practice adaptation. For de Certeau, everyday practices (walking through a city,
consumption) constitute an interaction between “strategies” and “tactics,” where strategies⁶ are structures of physical and/or social space enacted by “subjects of will and power” (e.g., corporations or governments) and tactics are actions employed by “subjugated” individuals (customers, clients, citizens) to navigate—unconsciously and routinely—and even circumvent that “strategized” social space (de Certeau, 1984, p. xix). In de Certeau’s theory of practice, the tactical nature of everyday actions thus represents a subversive impulse; he illustrates this impulse through the notion of “la perruque” (literally, a wig), a French idiom referring to the use of time or resources during work for non-work-related activities. He suggests that human beings within capitalist and industrial societies engage in tactics like la perruque—exploiting the design of those institutions for their own aims.

⁶ In French, stratégie means the “art” (l’art: the processes and rules involved in carrying out an action) of coordinating resources in skillful pursuit of a goal, especially by economic, political, or military personnel (Larousse Dictionary; Larousse.com, n.d.).
Judgment-based practice. Other practice theorists have extended the notion to reflect a
transformative objective, as in cultural-historical activity theory (discussed below) and, in a more
specific context, Polkinghorne’s (2004) notion of a “judgment-based” practice of care in human
services: “Goals are accomplished in the human realm by practitioners, not by programs or
scripts. Highly motivated practitioners know that goals can be accomplished by various means,
and they will not hesitate to try an unconventional approach if they feel a particular situation
demands it” (p. 90). Polkinghorne, like de Certeau, sees practitioners as drawing on a range of
approaches that may or may not coincide with dominant structural constraints and which, based
on appeal to judgments that stem from professional or ethical positions, may even subvert those
dominant structures with “unconventional” methods in order to produce the best outcomes for
those under the practitioner’s care (Polkinghorne, 2004). The referencing within one’s practice of
judgments based on professional or moral principles outside the “conventions” of the specific
organizational setting in which they occur is particularly relevant for this study, and I thus explore it in more depth below.
Change as practice re-mediation. Cultural-historical activity theory (CHAT) is a
distinctly transformation-focused strand of practice theory with roots in Marxist philosophy and
Vygotsky’s (1978) work on sociocultural learning. CHAT conceives its primary unit of analysis
as the cultural “activity setting”—a physical and conceptual space of work defined by a set of
mutually constitutive relationships between individuals’ activities and the material, social, and
historical elements of those settings that mediate activity (Engeström, 2000; Engeström, 2001).
By drawing attention to the mediating aspects of spaces where individuals work (for example,
artifacts, history, rules, and language), CHAT thus provides a powerful tool for conceptualizing
organizational learning as the “re-mediation” of practices through the introduction of new (or
interruption of existing) processes and artifacts (Engeström, 1987; 2001; Cole & Engeström,
1993). The activity setting of a given practice is the integral unit of analysis for collecting
“practice data” and learning from them about the potential for practice to be re-mediated, and the
change/learning process of interest is thus observed by attending to the ways in which that
practice was (or was not) re-mediated and to what end. These ideas directly inform the change
framework I use in this study (see Figure 2).
Across the various traditions and developments of practice theory there are thus numerous ways in which change can be conceptualized as a practice dynamic. As Nicolini (2013)
describes:
[P]ractice theories carve a specific space for individual agency and agents. While homo economicus is conceived as a (semi) rational decision maker and homo sociologicus is depicted as a norm-following, role-performing individual, homo practicus is conceived as a carrier of practices, a body/mind who ‘carries’ but also ‘carries out’ social practices. … Practices come first, because it is only once we appreciate the set of practices involved in a scene of action that we can ask what sort of agency and ‘actor-ship’ is made possible by these specific conditions. (Nicolini, 2013, pp. 4, 7)
Change is made possible via the mediated nature of the practices that bring about organizational processes and outcomes; in any given “scene of action,” there is the potential for practices to change both as a function of new mediating artifacts, tools, language, and so on (as in CHAT) and/or through the adaptation of actors drawing on a range of motivating or interpretive schemes, as in Polkinghorne’s (2004) judgment-based practice of care.
Motives of Practice: Institutional Logics & Moral Elements of Practice
Bourdieu’s theory of the habitus derived from his larger emphasis on what he called the
“logic of practice”—his assertion of the primacy of practical (rather than rational) knowledge
and embodied social “know-how” (Bourdieu, 1980/1990; Nicolini, 2013). Despite the use of this
concept of a “logic” in Bourdieu’s work and residual references to it within practice theory,
however, most traditions and applications of practice theory tend to leave unspecified the process
by which such cultural or social knowledge comes to be embodied and routinized. Turner (1994),
who sees in practice theory the “vanishing point of twentieth-century philosophy” (p. 1), in fact
criticizes the “overly facile” theory of practice as lacking any plausible explanation of the
reproduction or learning of practices in the first place, much less one that allows for the kind of
intelligent tactics and innovations theorists such as de Certeau have attributed to the human
actor. “The idea that there are ‘shared’ practices requires some sort of notion of how they come
to be shared, and this notion in turn dictates how practices can be conceived…Practices without
sharing are habits—individual rather than shared” (Turner, 2001, p. 120). Thus while practice
theory helps imagine how change might occur in practice—through the adaptive strategies and
re-mediated activities of individuals—it is largely silent on why such changes might occur.
Moreover, while practice theory (particularly the structuration theory strands) emphasizes
the role of structure in constraining but also being shaped by human agency, how or in what
ways those specific structures—particularly within specific organizational contexts—actually
come to constrain or motivate human action is also not fully addressed in most accounts. As the
literature review above suggests, change in higher education often represents not just changes in
technical procedures or ways of doing things (often referred to as first-order or first-level
change) but fundamental shifts in the objectives and assumptions about the purposes and
processes of academic work (second-order change) (Kezar, 2001; Neave, 2012; Slaughter &
Leslie, 1997). As Clark (2004) noted in his follow-up to his 1998 study of university
transformation, for example, instances where transformation had failed often resulted in
imbalanced and even dysfunctional “hybrid” organizational forms caught between old and new
values and structures. Change in higher education may thus in many cases implicate
contradiction and competition between multiple systems of practice that seem to be embedded
within organizations.
Institutional logics. The theory of institutional logics (Friedland & Alford, 1991;
Thornton & Ocasio, 2008; Thornton, Ocasio, & Lounsbury, 2012) is particularly helpful for
filling in this gap in practice theory. The notion of institutional logics derives from but is also
distinct from neoinstitutional theory. In their contribution to Powell and DiMaggio’s (1991)
seminal anthology on neoinstitutionalism, Friedland and Alford argued for greater emphasis
within the mechanics of institutional theory on the symbolic and cultural dimensions of
organizational and individual behavior. Following institutionalism’s original proposition,
institutional “fields” come to define the norms of legitimacy (i.e., what constitutes a “university”
or “church”), resulting in the convergence of organizational forms over time (Meyer & Rowan,
1977; DiMaggio & Powell, 1983; DiMaggio & Powell, 1991). In order to maintain access to
resources, organizations pursue legitimacy by behaving like other organizations such that a form
of “collective rationality” emerges across organizations with similar purposes and resource pools
(DiMaggio & Powell, 1983). Friedland and Alford (1991) argued, however, that premises
underlying the theoretical mechanisms of institutionalism, such as those assuming rational
economic actors, mimicry under conditions of uncertainty, and organizational resource
dependency, neglected the extent to which institutional norms constrained but were also shaped
through individual agency: “Institutions are supraorganizational patterns of human activity by
which individuals and organizations produce and reproduce their material subsistence and
organize time and space” (p. 243). They emphasized that “each of the most important
institutional orders of contemporary Western societies has a central logic—a set of material
practices and symbolic constructions—which constitutes its organizing principles and which is
available to organizations and individuals to elaborate” (Friedland & Alford, 1991, p. 248). Such
logics, they argued, “provide individuals with vocabularies of motives and with a sense of self”
but at the same time, “individuals, groups, and organizations try to use institutional orders to
their own advantage” (p. 251). Logics are thus a mediating mechanism (like the habitus) that
create parameters for behavior and rationality, but which may serve instrumentally for
individuals and organizations within a given institution.
Thornton, Ocasio, and Lounsbury (2012) have further developed the theory of
institutional logics to better account for the “microfoundations” of their influence on individual
behavior within organizations. Their framework draws on a wide range of social and cognitive
theories of human action and interaction in order to explain how a set of embedded patterns of
symbols, values, and beliefs could both guide behavior and be reified and sustained through
practices (thus remaining consistent with the basic ontological premises of practice theory). In
their integrated model,
[I]nstitutional logics focus the attention of individual actors through cultural
embeddedness, activating a social actor’s situated identities, goals, and schemas and the
shared focus of attention shape social interaction. Social interactions generate
communication and resource flows, and interdependencies, resulting in social practices
and structures, including organizations and institutional work. (Thornton, Ocasio, &
Lounsbury, 2012, p. 85)
This model contains at least two important features with respect to practice theory and the study
of change. First, reminiscent of Bourdieu’s notion of logics of practice, it suggests that
institutional logics inform behavior by triggering frames of reference specific to the culture and
history of the organization. Thus while CHAT suggests that history mediates practice within
cultural activity settings, this model of institutional logics further suggests that a specific
history—carried through symbolic and discursive practices or “myths”—mediates activity in a
specific way, by activating shared understandings about the meaning of that history (for
example, in the way that histories of prestige and legacy are dominant in many universities and
may drive admissions practices) (Lawrence & Suddaby, 2006; Meyer & Rowan, 1977).
Second, Thornton, Ocasio, and Lounsbury’s (2012) model suggests that institutional
logics influence behavior by also activating, in relation to those shared patterns of meaning,
individuals’ identities and goals. Thus while elements of practice theory like de Certeau’s (1984)
notion of la perruque and Giddens’s (1984) structuration theory account for adaptation or
spontaneity, the institutional logics perspective further suggests that different embedded
symbolic and cultural/historical patterns of meaning within an organization may interact specifically
with individuals’ own self-identities in distinct ways. According to Thornton, Ocasio, and
Lounsbury (2012), “Relevant category identities include a social actor’s industry, occupation or
profession, employer, department, voluntary-organization affiliation, race, gender, ethnicity,
nationality, and geography” (p. 85), and such identities may be more “accessible” or relevant for
informing action in relation to specific institutional logics. Thus a logic of prestige within a
university, for example, may bear a particular relationship to a practitioner’s identity as an
academic professional, while a competing logic of efficiency or accountability may limit the
relevance of that professional identity as a motivator for action and make accessible instead other
aspects of the practitioner’s identity, for example that of an “employee” or a “taxpayer”—
identity categories that provide meaningful sources of interpretation and action given the
influence of a particular logic. This interactive relationship between institutional logics and
individuals’ many overlapping social and cultural identity positions thus provides the analytic
project of studying practice with a more specific guide for observation: it suggests that practices (and individuals’ narratives about their practices) contain references, implicit or explicit, to the various institutional logics that may be operating, diffusely, to mediate change processes by activating or suppressing practitioners’ goals and identities as motivations for new courses of practice.
Moral elements of practice. In drawing attention to aspects of practitioners’ self-
identities and goals in relation to organizational histories and structures, the theory of
institutional logics thus also opens space within the study of practice to consider, as in
Polkinghorne’s (2004) notion of a “judgment-based practice of care,” the ways in which those
professional, moral, or other psychological or cultural identity-based filters for interpreting
situations and choosing courses of action may mediate organizational work—especially within
volatile change processes where multiple institutional logics may be competing for attention.
Practice theory attends only partially to these dynamics. According to Nicolini (2013):
All practice-based approaches foreground the centrality of interest in all human
matters and therefore put emphasis on the importance of power, conflict, and
politics as constitutive elements of the social reality we experience. Practices, in
fact, literally put people (and things) in place, and they give (or deny) people the
power to do things and to think of themselves in certain ways. As a result,
practices and their temporal and spatial ordering (i.e., several practices combined
in a particular way) produce and reproduce differences and inequalities. In so
doing they serve certain interests at the expense of others. (p. 6, emphasis added)
Though practice theory attends to such “constitutive elements” of social reality, it does not consider as fully the moral tensions and dynamics of human behavior in relation to the “shared” and “durable” qualities of practice suggested by Thornton, Ocasio, and Lounsbury’s (2012) model of institutional logics.
Indeed, Thévenot (2001) also critiqued theories of practice for a “lack of realism” in
providing “good accounts of our dynamic confrontation with the world” (p. 56), and argued that
practice theory should take account of the “moral element in practice which shapes the
evaluative process governing any pragmatic engagement” (p. 57, emphasis added). In language
very similar to that used by Thornton, Ocasio, & Lounsbury (2012) to describe the micro-level
influence of institutional logics, Thévenot (2001) argued that, “It is not only the variety of
activities covered by the term ‘practice’ which poses a problem. In addition, one must also take
into account figures of action which, beyond showing habit and the body, point towards
intentions and plans, or towards forms of activity that require reflective argumentation” (p. 57).
And in asking what forces or “figures of action” drive individuals towards particular intentions,
Thévenot (2001) further suggested that:
The moral element is crucial…It drives both the agent in his conduct and
determines the way other agents take hold of or ‘seize’ this conduct. This element
might also be called ‘making sense of’ if we are clear that much more is at stake
than meaning, language, and understanding. It originates in a notion of the good
that grounds each regime. (p. 59, emphasis added)
With the notion of “pragmatic regimes,” Thévenot thus suggests a mode of human agency
guided by interaction between the given social structures of a situation and the “conceptions of
the good” inherent to individuals’ belonging to certain social or cultural groups. With respect to
specifically organizational practices and changes in those practices, such pragmatic regimes
could thus be seen—as components of individuals’ self-identities—to come into play in relation
to the varying and competing institutional logics driving change; and indeed such a mediating
role might significantly influence the course of change via the practices it engenders or
constrains.
MacIntyre (2007) further contributed to this line of thinking on morally driven practice in his work
developing a practice-ontological philosophy of virtue. In attempting to establish a viable role for
virtue within the institutions of modernity, MacIntyre relied on a theorizing of practice similar to
those described above (and is thus often included in the coterie of practice theorists, though he
does not cite or make reference to them or their broader theoretical traditions) to suggest that
virtue can be identified as a good internal to practice. In his words:
By a ‘practice’ I am going to mean any coherent and complex form of socially
established cooperative human activity through which goods internal to that form of
activity are realized in the course of trying to achieve those standards of excellence which
are appropriate to, and partially definitive of, that form of activity, with the result that
human powers to achieve excellence, and human conceptions of the ends and goods
involved, are systematically extended. (MacIntyre, 2007, p. 187)
That is, MacIntyre sees practice (he gives the examples of painting or chess) as defined by a set
of internal goods, the pursuit of which constitutes virtue; this is important to his broader work
because he is trying to assert the role of virtue despite the modernist rejection of morality defined
by an end goal (telos, in Aristotle’s ethics). In describing virtue in this sense, MacIntyre then
further suggests that a practice is driven by that virtue—a pursuit of excellence as defined by the
practitioners of it—rather than as an instrument to some specific goal.
Summary: A Practice Theory of Change
The preceding discussion outlined the elements of practice theory and its strengths in specifying observable elements of individual, organizational, and system-wide activity that can be coherently analyzed to understand both the meaning of an abstract social phenomenon like organizational change and the concrete processes that constitute it. The summary that follows applies these ideas to my subject matter—change—within the particular context of my case. It provides an overview (illustrated in Figure 2) of how I conceptualize “a practice of change” in higher education from the perspective of practice theory and institutional logics.
• The initiatives on which this study focuses are intended to improve equity or student
success; they engage higher education professionals in an array of activities devoted to
accomplishing those specific objectives (e.g., improving retention rates or closing equity
gaps). These activities include both discursive and bodily practices (“sayings and
doings”), and they are enacted in interaction with a wide array of mediating artifacts,
tools, and vocabularies that facilitate them. These initiatives thus constitute an activity
setting: an identifiable process of social activity mediated by tools, rules and policies,
language, history, power, and so on.
• The practices associated with change initiatives are structured by specific institutional
policies, histories, political arrangements, and may derive from an array of institutional
and organizational logics. At the same time, however, practitioners may also adopt
micro-strategies or “tactics” as adaptations or subversions of those dominant logics in
pursuit of the goals as they understand them.
• Moreover, practitioners’ micro-strategies vis-à-vis the change objectives may emerge
from impulses that are distinctly moral and that derive from their own conceptions of
professional ethics, personal or religious beliefs, “doing the good,” engaging in a
“practice of care” (Polkinghorne, 2004), or pursuing excellence as a virtue defined by and
internal to the practice itself (MacIntyre, 2007).
• The practice of change is conceived as an ongoing process of learning, wherein all of the
factors mediating practice may, through the course of the change initiative and beyond,
be altered or come to interact in reconfigured ways. In the terms of CHAT, specifically,
the initiatives being studied here can thus be seen to “re-mediate” practices via the
introduction of new tools, resources, language, processes, and so on. The practice of
change is thus a process of re-mediated activity taking into account all of the various
dimensions of practice as situated between institutional/organizational structures and
complex human agency.⁷

⁷ This conceptualization also aligns with the theory of change underlying the Equity Scorecard, which similarly draws on CHAT to support the notion of “re-mediated practice” with respect to practitioners’ attitudes and beliefs about students of color and the reproduction of institutionalized racism through unintentional organizational routines and policies (Bensimon & Malcom, 2012; Witham & Bensimon, 2012).
Methodological Implications of a Practice Approach
The outline above offers a number of critical implications for the methodology of this
study. One of the risks of approaching the study of organizational or sociological phenomena in
terms of their constituent “practices” is that the empirical project will become one of simply
inventorying activities without a coherent analytic thread tying them together to elucidate the
rich notion of “Practice” that the theoretical framework aspires to. As Nicolini (2013) cautions,
The mere a-theoretical cataloguing of what practitioners do…sheds little light on
the meaning of the work that goes into it, what makes it possible, why it is the
way it is, and how it contributes to, or interferes with, the production of
organizational life. In other words, listing and enumerating practices by taking
them at face-value constitutes a weak approach to practice. Such a descriptive and
a-theoretical way of addressing practice, which builds on the misleading
assumption that practice is self-explanatory, is scarcely capable of providing the
affordances [of the practice approach]. (p. 13)
Instead, Nicolini (2009a; 2013) argues for a “strong program” of studying practice, in which
observed practices are filtered continually through an analytic lens of “what does this mean? And
why does it matter?” (2013, p. 14).
The methodological challenge presented by practice theory is thus that practice, at both the individual and organizational levels, is evident only as it manifests in action, speech, or artifacts; the researcher must capture those manifestations through observation while also ascertaining the latent “how and why” behind them (Nicolini, 2009b; Yanow, 2000b). The same dilemma has been central to theories and methods for studying organizational
learning: Argyris and Schön (1974) argued, for example, “We cannot learn what someone’s
theory-in-use is simply by asking him. We must construct his theory-in-use from observations of
his behavior” (p. 7, emphasis added). In her model of interpretive policy analysis, Yanow
(2000b) similarly argued that actions, language, and the creation of artifacts are expressions of
how individuals and communities frame—that is, discuss, make sense of, and assign values to—
an issue, and the researcher’s job is thus to “map” the ways that different interpretive
communities use language and artifacts to convey their collective framing. Methodologically,
then, research into practice involves an analytic mapping—from the observable discourse and
“bodily choreography” (Nicolini, 2009a) to their underlying logic(s), intent(s), and/or moral
impulse (MacIntyre, 2007; Thévenot, 2001). Researching practice requires not just
“triangulation” of evidence from various tiers of analysis but the theoretically guided, iterative
cross-referencing of those different types of data in order to arrive at a rich narrative of meaning
and process rather than a mere sequence of events.
Another implication, however, is that—even while a focus on practice is potentially over-
encompassing (as in the critiques of Thévenot, 2001, and Turner, 2001)—the researcher must
find a protocol by which to attend not only to the practices that reveal themselves as directly
efficacious or transformative in some way but also to those to which practitioners themselves may
not draw attention or of which they may not be fully conscious. Weick (1976) argued that in
studying individuals’ processes of sensemaking, for example, researchers must be cautious of
human beings’ tendency to rationalize their actions in retrospect when in fact those actions
emerged from both conscious and unconscious choices and interpretations of the situation at
hand. That is, though part of the agenda of a practice approach is to ferret out the underlying
“how and why” of practice, human beings are rather poor reporters of their own intentions. As in
all modes of ethnographic research, this challenge thus points to the importance of “thick
description” (Geertz, 1973) coupled with analytic synthesis across cases and iterative,
theoretically guided triangulation between observations, interview accounts, documents, and
other forms of evidence. The approaches I have used to address these challenges are the topic of
Chapter IV.
CHAPTER IV: Research Design & Methodology
This study investigates the practices associated with change initiatives in a public system
of higher education in order to understand how such practices may either engender or constrain
broader organizational change and the nature of such potential change. Practice is understood
conceptually as the array of activities occurring within or associated with the specific activity
settings of participating individuals within the universities. These activities are further
conceptually understood as emerging from an interaction between the material, symbolic, and
historical patterns and structures of the universities (that is, institutional logics) and the agency of
those participating individuals who in turn may possess numerous self-identities and roles in
relation to the logic of change initiatives. The research design for this study thus facilitates not
only a rich description and interpretation of practices at the local level but also the synthesis and
interpretation of practices as meaningful conveyors of organizational history, continuity, and/or
change (Nicolini, 2009a).
This study uses a case study design framework, within which I have drawn on tools of
organizational ethnography (Ybema, Yanow, Wels, & Kamsteeg, 2009) and dynamic narrative
analysis (Daiute, 2014) to assemble narrative and artifactual accounts of practice into a coherent
case analysis. A case study design provides a rationale and framework for the assembly of data
from multiple sites within a bounded case (bounded in time and space; Stake, 1995), but is also
flexible enough to accommodate specific methods for incorporating both micro-level data
analysis and meso-level (organizational) synthesis and analysis. Nicolini (2009a) describes such
an approach as “zooming in and zooming out,” where zooming in means attending to practice
“as a publicly available accomplishment based on the situated assembling of a number of
discursive and non-discursive practices” (p. 1400) and zooming out to trace connections across
multiple settings or occurrences and to examine through what means and processes those
connections are sustained or challenged (Nicolini, 2009a, p. 1408).
Ethnographic & Narrative Methods within a Case Study Design
Case study research is the “study of an issue explored through one or more cases within a
bounded system” (Creswell, 2007, p. 73), where a bounded system represents a setting or object
(or person) defined by its temporal and physical boundaries—“a specific, complex, functioning
thing” (Stake, 1995, p. 2). Case study research designs share with ethnography the incorporation
of multiple sources of data including interviews, observation, and document analysis. Case study,
however, elevates the primary analytic question out of the specific case (e.g., faculty as a culture-
sharing group) to focus on the process or issue (e.g., the accountability field of practice), which
is then examined across settings that serve as contexts for understanding the “issue” rather than
objects of study themselves (Stake, 1995).
The ethnographic research paradigm provides a framework for assembling observation,
interviews, document analysis, and multiple layers of interpretation of meanings (by both the
participants and the researcher) into a “cultural portrait” (Creswell, 2007; Hammersley &
Atkinson, 2007; Yanow, 2000). In an approach particularly responsive to the theoretical agenda
of practice theory, “organizational ethnographers do not describe the complexities of everyday
organizational life in the abstract, but instead through reporting on their first-hand, field-based
observations and experiences” (Ybema, Yanow, Wels, & Kamsteeg, 2009, p. 6). The product of
ethnographic research is a set of rules or patterns that govern the meaning-making processes of
the cultural group (Creswell, 2007) conveyed through narratives and theories rather than
statistics or “outcomes” (Hammersley & Atkinson, 2007). Moreover, ethnographic research
explicates the role of the researcher as an interpreter of interpretations—a role that requires a
constant awareness by the researcher of his or her own social and cultural location as it shapes
the interpretations of observations in the field.
Case study research designs typically incorporate interviews or focus groups as data, and
ethnography of course relies to a great extent on narrative accounts (as well as the
researcher’s participation in or observation of events) with respect to the settings and activities of
interest. But the treatment of narrative as a source of “evidence” in social science is itself
fraught with complexity and deeply imbued with interpretive possibility. Daiute (2014) suggests
an approach to narrative analysis that is cognizant of the “use” value narrative has as a cultural
tool, emphasizing the need for researchers to understand that “narrating is an activity for creating
identity as well as for sharing experience, but more than that, it is an activity for figuring out
what is going on in the environment and how one fits” (p. 15). In her dynamic narrative
approach, the analytic task is thus to interpret narration as a discursive activity; much like the
way practice theory would conceive of narrating as a component of a socially mediated set of
practices, Daiute emphasizes the need to interrogate how narrative accounts convey the many,
potentially overlapping or even contradictory vectors of social experience that interact within the
act of storytelling. By parsing narrative accounts to recognize, for example, the role of
individuals’ discursively enacted values, the construction of micro-narrative “plots,” or
storytelling devices like first-person or third-person characterizations, researchers can create
through narrative analysis a rich picture not only of the topic or event being described but also
valuable information about individuals’ social and cultural locations and systems for interacting
in the given social setting (Daiute, 2014).
Reflecting on the strengths and intents of each methodological paradigm or approach
(case study, ethnography, narrative analysis) separately, the eclectic approach proposed here has
limitations by contrast. A purely ethnographic approach would devote all of the time and energy
resources of the study to illustrating in careful depth and richness the meaning-making methods
and practices of a single group, and this in-depth exposure would provide greater narrative
richness than is possible when trying to spread study resources and time across multiple sites or
groups. Narrative analysis would go even more deeply into the structuring and meaning-making
(and meaning-conveying elements) of individuals’ narrative accounts, taking time to study
semantic and rhetorical tactics and patterns, and thus offering a particularly textured account of
change processes. Alternatively, a purely case-study approach would focus more specifically on
the “issue” as the “conceptual structure for organizing the study” (Stake, 1995, p. 17) and thus
devote more time and energy to mapping out the macro-level phenomenon of change and its
various dimensions. While acknowledging the limiting effect of combining approaches, the
research design enacted here nonetheless responds to the compelling theoretical imperatives of
the practice perspective and to the research questions motivating the study: it devotes attention
to both micro- and macro-level processes and outcomes and to individual narratives and
strategies, attending in particular to analytic strategies for making sense of data across these
various dimensions of the case setting.
Multi-Site Case Study Design
Sampling strategy. The sampling strategy for this study involves (1) a sample of
universities within the system and (2) a sample of participants in the relevant change initiatives
nested within those universities.
Site sample. This study identifies a specific case—the Pennsylvania State System of
Higher Education—as the setting in which I have an intrinsic interest for the reasons outlined in
the introduction, including the many ways in which the current policy and practice environment
in PASSHE is representative of the broader completion agenda. Though all fourteen campuses of
PASSHE comprise the bounded system of the case for this research, it was neither feasible nor
necessarily beneficial to incorporate all fourteen universities into the case study design. The goal
of case study is to get an in-depth understanding of the complexity and uniqueness of the case
(Stake, 1995), and the time and resources needed to study any one site adequately made it
prohibitive to include all fourteen. Moreover, Stake (1995) argues that sampling in a
case study can serve to “maximize what we can learn” (p. 4) from the case, and site selection
should thus reflect an intentional strategy to capture any variations, extremes, or similarities that
have conceptual or theoretical salience for the case and the issues of interest.
Jones, Torres, and Arminio (2006) further emphasize the importance of sampling to
achieve coverage of the phenomenon, which extends to the “relationship between one’s
methodological approach, research questions, data collection, and participant selection
strategies” (p. 66). To strive for as much coverage as possible within PASSHE and to maximize
learning potential from the case, I have therefore used a purposeful site sampling strategy that
takes into account variation between sites on an array of politically or theoretically significant
characteristics (Creswell, 2007). The variables along which I sampled sites within the system are
listed in Table 1 along with the data for each of the selected universities. Examining data for
each university on each of these variables, I selected three universities across which I believe the
study achieves significant variation, allowing me to capture important differences as well as
identify patterns or similarities across sites that differ along dimensions that the theory and prior
literature suggest are relevant.
Characteristics of the sampling universe. The diversity among PASSHE universities
reflects the diversity of the large state they serve, which incorporates many distinctly different
cultural, geological, demographic, and economic regions. At least four of the PASSHE
institutions are located in areas that could be considered somewhat urban or in proximity to
populated urban centers. By contrast, at least four universities are located in distinctly rural or
remote settings far away from urban centers. The remaining six institutions are located in small
towns or cities, but are also relatively far from the state’s three major urban population centers
(Philadelphia, Harrisburg, and Pittsburgh). Reflecting these geographic locations, the universities’
campuses also vary culturally and aesthetically. Many of the campuses in rural or remote areas
have spacious campuses with views of farms or orchards and they have a dominating presence
within the local community. Other institutions in more suburban or urban settings have campuses
situated within residential neighborhoods, with taller, more densely built dorms and
classroom buildings, and they are more likely to be just one among many large public healthcare,
government, or educational institutions in the community. As I address in the introduction to
Chapter VI, the universities’ geographic location and proximity to urban centers have significant
implications for how they have historically conceived of their missions as well as their efforts to
address student retention and equity.
The PASSHE universities also vary in terms of organizational characteristics. One of the
characteristics most relevant to this case study is the institutions’ leadership history both before
and during their engagement with the change initiatives. As Table 1 illustrates, more than half of
the institutions had some turnover in top leadership (president or provost) during the period of
the initiatives. These changes in leadership varied in nature: in many cases, presidents retired or
provosts were promoted and succession was anticipated and occurred without significant impact
on the day-to-day workings of the university. In other cases, the leadership changes were abrupt
and unexpected, including in one case the forced departure of a controversial and long-standing
president whose departure effected significant organizational changes. In several institutions,
current presidents or provosts hold those positions as “interim” appointments until permanent
hires can be made. All of these leadership dynamics have implications for the institutions’
engagement and sustained involvement in change initiatives, including how the initiatives were
received, the political contexts into which they were introduced, and the relationship between the
institutions and the PASSHE system administrative apparatus.
Table 1
Sampling Matrix for PASSHE Universities

University    | Undergrad Enrollment (2012) | Size %ile w/in PASSHE | Accept Rate | Endowment %ile w/in PASSHE | Overall Grad Rate | Leadership Change | Region of PA | Urbanicity
University 1  | 8,413  | 54%  | 59% | 38%  | 51% | Y | West    | Suburb: Small
University 2  | 6,635  | 23%  | 47% | 62%  | 48% | N | West    | Town: Remote
University 3  | 7,234  | 31%  | 44% | 23%  | 42% | Y | West    | Town: Fringe
University 4  | 7,280  | 38%  | 81% | 31%  | 57% | N | East    | City: Small
University 5  | 14,398 | 92%  | 58% | 100% | 50% | Y | East    | Town: Distant
University 6  | 10,163 | 77%  | 68% | 46%  | 54% | Y | East    | Town: Fringe
University 7  | 8,575  | 62%  | 63% | 85%  | 64% | Y | East    | Suburb: Large
University 8  | 14,654 | 100% | 50% | 77%  | 69% | Y | East    | Suburb: Large
University 9  | 10,265 | 85%  | 63% | 69%  | 64% | N | Central | City: Small
University 10 | 5,553  | 29%  | 61% | 15%  | 47% | ? | Central | Town: Distant
University 11 | 3,206  | 8%   | 69% | 0%   | 48% | Y | Central | Town: Distant
University 12 | 7,632  | 46%  | 81% | 92%  | 57% | Y | Central | Town: Fringe

Notes. Leadership change includes a new president or provost during the time of the change
initiatives. One university was excluded from the sample pool because of its involvement in prior
studies and another withdrew from the study.
Participant sample. Within the PASSHE universities, the specific design of the change
initiatives on which this study is primarily focused calls for committees of approximately seven
to ten practitioners from across functions and roles to implement the various processes or tools.
Across the sampled universities and initiatives, there was thus a potential pool of 50 to 60
individuals (depending on overlap in participation between initiatives) from which to draw for
in-depth interviews. Here again, the principles of maximizing learning and achieving coverage
were useful guides for developing a sampling strategy. A typical committee in these initiatives
includes representatives of various campus constituents, such as deans of academic or student
affairs, institutional research, faculty, admissions, and staff who work in advising or in
administrative roles in multi-cultural student affairs or academic support centers.
Practice theory emphasizes that sequences of practices produce social ordering (i.e., power
differences), affording some individuals more influence than others over the means and methods
of pursuing a given objective. This idea is similar to the notion in organizational theory that there
are, for example, “rules and standards that are understood by members of a group” that yield
attributions of expertise or authority to some members while marginalizing others (Cialdini &
Trost, 1998, p. 152). In postsecondary institutions, an example of this might be administrative
staff without a PhD or junior faculty without tenure showing deference to those who hold a PhD
or tenure on the basis of their perceived authority and the legitimacy afforded to them by the
practices of tenuring, prioritizing service on particularly influential committees, giving voice to
senior faculty in meetings, and so on. Such levels of perceived or enacted authority may also
influence practitioners’ sense of efficacy in adapting or resisting certain change efforts in
contradiction to or preservation of competing institutional logics (Thornton, Ocasio, &
Lounsbury, 2012).
Additionally, practice theory (in particular, Thévenot, 2001, and MacIntyre, 2007)
emphasizes that practices are adapted and selected from in part based on evaluations that derive
from a conception of the good relative to any particular objective. While individuals’ moral
evaluative schemes may not vary systematically according to their role or function within the
university, it is possible that the range of an individual’s perceived courses of action in any given
scenario would bear some mark of (and/or co-variation with) the cumulative experiences of his
or her life and career choices and circumstances (e.g., becoming a college advisor or TRIO
program coordinator versus an economics professor, CFO, or institutional researcher). Moreover,
other aspects of individual self-identities, which Thornton, Ocasio, and Lounsbury (2012) have
conceptualized as effectual in relation to institutional logics, do vary systematically across
functions (for example, self-identities as faculty members, counselors, administrators, and
racial/ethnic, gender, or other subject-position identities).
In each site, I started with participants in the Equity Scorecard team because of my access
to them through the Center for Urban Education and the larger research project under which this
study is being conducted. I invited all participants to participate in interviews but took additional
efforts where needed to obtain variation to the extent possible along the roles and identity
variables identified above. In total, I ultimately incorporated sixteen formal interviews as data for
the study (see Appendix I for a list of interview participants from each site identified by role and
Appendix III for a sample interview protocol). Because a great deal of valuable “storytelling” happens
informally in conversation, I also wrote four field memos during the process of data collection in
which I captured my impressions and observations from the field sites, including conversations
outside of the formal interviews at each site.
Access & Bias Concerns
The data collection required for the in-depth understanding sought in a case study is
extensive and intrusive, including observations and interviews, as well as document review and
examination of other physical artifacts (Creswell, 2007). It is worth noting that although my
association with the Center for Urban Education, which provides the Equity Scorecard process,
facilitated my access to the sites, many participants strongly associated both the Scorecard
initiative and Deliverology with system-level politics, often in an antagonistic way (an
association that emerged clearly in the data).
There are two consequences of this affiliation in terms of bias in access to data. First, my
sample may introduce bias to the extent that the provost at each of my sites was supportive of
both the Scorecard and this study (not true at other sites not included in my study). Although
“provost support” was not a primary sampling variable, the implementation of the Scorecard and
Deliverology in response to performance-funding policy was received differently by different
universities; thus my findings about change may have varied considerably at universities where
key administrators may have been less receptive to my study, and I recognize this as a possible
limitation in the coverage of my study sample (Jones, Torres, & Arminio, 2006).
Second, within sites my affiliation with the Center for Urban Education may have biased
the narratives provided by participants if they felt like they were being “evaluated” for their use
of the initiative’s process and tools. I tried to attend to these potential biases within interviews
and in subsequent data analysis, in particular by asking clarifying or challenging questions
during interviews when I felt that responses demonstrated positive response bias (either
overstating positive outcomes or censoring negative experiences or attitudes).
Data Collection
Interviews. Data collection for this study included sixteen official interviews lasting
approximately one hour (fifteen of which were audiotaped and transcribed for analysis) and
additional “off-record” impressions from informal conversations documented in field memos.
All participants are kept confidential and referred to throughout the study using only general
descriptors based on position (faculty member, dean, student affairs professional, etc.).
Appendix III provides a sample interview protocol, and the Informed Consent Document for the study is included
as Appendix IV.
The participant sampling strategy I outlined previously presents a rationale for choosing
among the initiative participants to collect a diverse set of perspectives and viewpoints on the
processes by which the teams pursued their objectives—what ethnographers call “multivocality”
in construal of organizational life (Ybema, Yanow, Wels, & Kamsteeg, 2009). The interviews
conducted in this study aimed to elicit those diverse individual perspectives and interpretations
through semi-structured interviews with open-ended questions designed to elicit respondents’
narrative accounts of how they and the group went about accomplishing their objectives—a
focus critical for incorporating meaningful narrative data (Polkinghorne, personal
communication, January 9, 2014). Eliciting participants’ own narratives of their actions (e.g., in
what ways did you do x or y) is more important for the goals of the study than attaining answers
to specific questions (e.g., did you do x or y activity) because these narratives will demonstrate
both actions and perceived rationales for actions that convey practice meanings and will also
convey certain communicative strategies in relation to those actions (Daiute, 2014; Nicolini,
2009a).
Observations. I conducted one official observation of an event related to the change
initiatives and one informal observation of a Scorecard committee meeting. Impressions and key
points from these observations were also documented in field memos and thereby integrated into
the data analysis.
Observing practices as they unfold in a given setting is the primary mode of capturing
“practice.” In Nicolini’s (2009b) study of telemedicine practice, he created a protocol that
facilitated his observation of both what was being said (in one column, the literal transcript) and
done (in an adjacent column, the bodily “choreography” associated with that speech). In this
way, Nicolini was able to capture not only the discursive nature of the practice, analyzed later for
its linguistic properties (humor, chastising, instruction, humility, and so on) but also the tacit
knowledge and adaptation accompanying that discursive act (eye-rolling, nodding, looking at the
patient’s chart to double-check information the patient has relayed). He then compared these
discursive and bodily actions to the formal protocol for the practice (the inherited artifact) and
was thus able to trace not only the impact of that document across settings but the individual
adaptations given to it by the nurses in his observed settings.
Though my opportunities for observation in this study were limited, in the instances
where I was able to conduct formal (only one) and informal observations of activity settings
related to the change initiatives at each university, I used a similar strategy of attending to both
“sayings and doings” (Nicolini, 2009a; 2009b). For example, I noted the meeting facilitator’s use
of “air quotes” in trying to convey the instability of certain terms with respect to data; I noted
where participants verbally resisted the objectives of the meeting but simultaneously completed
the worksheets provided to them; I also noted where participants in the workshop were seated and
how their interactions (whispers, side-commentary, glances) changed over the course of the
meeting. These observations supported my analysis of this setting as a learning structure (Chapter VI) and
helped me better understand the nature of practice re-mediation in that type of activity setting.
Document analysis. I collected numerous artifacts from each site, including memos,
posters, meeting agendas, data reports, strategic plans, and other materials generated specifically
in the context of or used in association with the change initiatives. I also reviewed websites and
conducted extensive archival research about the sampled universities and the PASSHE system in
order to inform the case narratives from a broad historical perspective. A list of collected
artifacts is included in Appendix I.
Documents and other material artifacts (websites, policy texts, data) constitute a
theoretically rich mediating component of practice both as products generated in the course of
specific initiatives as teams pursue their objectives, and as inherited resources for practice, such
as guiding protocols or scripts (e.g., existing strategic plans, performance-based funding
goals, and so on; Nicolini, 2009a). The research questions and theoretical framework of this
study place a specific emphasis on communication and process, and I have thus drawn on
qualitative document analysis (Altheide, Coyle, DeVriese, & Schneider, 2008) as a mode of
sampling, organizing, and analyzing documents related to change practices both at the local and
system levels. Qualitative document analysis provides a systematic method for drawing attention
to “thematic emphasis and trends in communication patterns and discourse” (Altheide, Coyle,
DeVriese, & Schneider, 2008, p. 128), which supports the goal within a practice approach of
mapping patterns of interest and association across practices both within and across change-
focused activity settings.
Moreover, reflecting the focus within practice theory on tracing the “movement” of
artifacts across settings (Nicolini, 2009a), I have also drawn on Prior’s (2008) notion of studying
“documents in action,” which she argued “encourages a focus on how documents are used
(function) and how they are exchanged and circulated in various communities” and in a “web of
activity” (p. 117). I paid particular attention in sampling and analyzing documents and artifacts
to those that (either explicitly or implicitly) communicated the objectives of the various change
initiatives, and asked participants during interviews to provide their own accounts of how
artifacts were (or were not) used in the course of their shared activities. While a document
review by itself cannot trace the flow of documents around a community or capture its many
possible interpretations, this focus on “action” nonetheless helps attend to how the language of
the document instigates or directs future action or aims to shape possible interpretations (Prior,
2008).
The reciprocal relationship emphasized in practice theory between practices and their
mediating artifacts points to the importance of an ongoing and circular data collection strategy;
for example, document analysis early in the study provided contextual information and possible
themes and codes for observations and interviews, and those observations and interviews have in
turn revealed additional artifacts that are integral to each of the activity settings. The methods in
this research design are thus conceived as facilitating an analytic cross-referencing between
language (observed and elicited in interviews), observed settings and the bodily movements of
interactions within them, and artifacts that serve as manifestations of or inherited resources for
practice (Nicolini, 2013).
Data Analysis & Reporting
Thematic coding and analysis. Stake (1995) suggested that intrinsic case studies
represent a unique kind of qualitative research in which the primary task is to gain the in-depth
understanding the researcher needs to be able to make sense of and represent the complexity of
the case. Analytic approaches like coding and categorical sorting and organizing are helpful,
Stake argued, for detecting patterns: “The search for meaning often is a search for patterns,
for consistency, for consistency within certain conditions, which we call ‘correspondence’”
(Stake, 1995, p. 78). Stake’s tactical suggestions align with Nicolini’s (2009a) analytic strategy
of “zooming in and out” to achieve both in-depth interpretation of micro-ethnographic accounts
of practice and patterns of association across practices.
Following these strategies, I first generated an initial list of codes drawing from the
extant change literature and from my theoretical framework to serve as deductive “sensitizing
concepts” (Charmaz, 2006; Daiute, 2014) during the initial thematic coding of interview and
observation data as well as analysis of material artifacts. In the course of reading through my
transcripts and other case materials and applying these initial deductive analytic codes, additional
codes emerged inductively as comparisons across interviews and sites revealed unpredicted
patterns (Charmaz, 2006; Stake, 1995). A list of deductively and inductively generated codes is
provided in Appendix II, annotated with a brief description of how I defined or thought about
each code as I was analyzing my data as well as any specific references that informed them
(where applicable). I used Atlas.ti qualitative software to organize and code case data and to
facilitate synthesis and emergent coding.
As Stake (1995) suggested, however, coding is simply a way of organizing and
categorizing data; it is not in itself a mode of substantive analysis. Underlying the iterative
process of deductive and inductive coding, my analysis was instead driven by a process of “analytic
questioning” described by Neumann and Pallas (in press). Analytic questions, according to
Neumann and Pallas, are separate from the broader research questions that drive a study; they
emerge from the data, “are responsive both to the research questions and to the features of the
data that the research has gathered,” and serve as a “small shovel or scoop” the researcher uses to
“scoop out” elements of the data responsive to the larger research questions (Neumann & Pallas, in press, n.p.). Many analytic questions may guide data analysis, and they are refined iteratively as analysis progresses, resulting ultimately in a set of “propositions”:
The objective of an analytic question of this type is to generate claims about the
phenomenon at issue supported by the data, and possibly generalizable beyond the
specific cases studied. Such claims are typically phrased as propositions, carefully crafted
so as to communicate possibility though framed in light of a researcher's uncertainty,
stated in forthcoming ways. (Neumann & Pallas, in press, n.p.)
Following this analytic strategy, I first analyzed data according to analytic questions as they
emerged both during data collection and coding; many of these I noted in field memos or in
comments within Atlas.ti. Organizing and synthesizing my data around such analytic questions
was more an informal than a formal process; I provide in Chapter V some examples of the analytic
questions that facilitated my data analysis, but many more arose during the process.
Organizing and interpreting findings. I thus present the analysis and findings from this
study in two stages. After conducting several iterative rounds of data analysis, analytic
questioning, and re-analysis, I organized the evidence around key themes that I then formulated
into tentative propositions as answers to the analytic questions of this study (Neumann & Pallas,
in press). The use of propositions as a format for qualitative findings is particularly responsive to the goals of this study and of qualitative research generally, especially where the objective is to
build theory or enhance understanding about the nature of categories structuring empirical
phenomena (Luker, 2008). Following the model provided by Clark (1998) in his seminal study of
university transformation, I first articulate these propositions about practice change and provide
brief examples from the data to give them texture and depth (Chapter V).
Having outlined and substantiated the tentative propositions about practice change
emerging from the study, I then “zoom in” (Nicolini, 2009a) to provide rich ethnographic and
historical accounts of the practices and settings associated with the change initiatives in each
particular site. Adapting the “dynamic narrative approach” suggested by Daiute (2014) for
presenting individual stories, my narrative accounts are presented “in terms of relational
concepts,” emphasizing tensions between narrative accounts and the interactions of key
elements of practice across interviews and settings within each university (pp. 249-250). These
narratives thus provide detailed descriptions, drawn and synthesized directly from the data, of the
practices (incorporating “sayings and doings” but also the mediating artifacts and meanings of
activities) associated with the student success or equity change initiatives at each of the three
universities in the sample. Within these narrative accounts, again borrowing from Clark (1998), I
emphasize in bold-italicized text where the propositions in Chapter V and their various
components were involved in the processes of change within each setting. Notably, not all
propositions played out equally or in the same ways across settings, which is why they are
offered as tentative propositions about the nature of change rather than as statements of finding.
This strategy allows the reader to see in context how the synthesized findings from across the
case are relevant (or not) within the differing contexts of the three universities.
Validity
Within both ethnographic and case study research, tensions inherent to the nature of
qualitative research complicate the researcher’s role in gathering and representing the data.
These tensions derive from the logic of the constructivist epistemological positions that underlie
ethnographic and other culturally contextualized research—that is, the “data” of qualitative
research must be acknowledged as context-dependent, socially constructed (that is, as a
particular form of practice), and presented through the filter of both respondents’ and the
researcher’s own subjective interpretations and social positions. And yet, at the same time, the
data must be presented with assurances of “validity” as accurate representations of what was
observed.
Yanow (2000b), for example, argued that in interpretive models of policy analysis, the
researcher “accesses” rather than “collects” data; data “collection” involves accessing the tacit
knowledge of interpretive communities in order to understand how they are making sense of
policy (p. 27). While “accessing” versus “collecting” data may seem like a semantic nuance, the
difference poses real logistical concerns for the researcher who must maintain a delicate balance
between acknowledging the social constructions involved in both the creation of that tacit
knowledge and the researcher’s appropriation of it as “data.” The researcher cannot treat tacit
knowledge as “real” or self-evident, but must at the same time find a way to communicate
observed patterns and findings convincingly to an external audience (Yanow, 2000b).
A second tension lies in interpreting the “data” of tacit knowledge in the first place,
acknowledging the well-established tendency of human beings to rationalize our behavior after
the fact based on our own interpretive schema or dominant institutional or organizational logics
(Gioia & Poole, 1984; Nisbett & Wilson, 1977; Thornton, Ocasio, & Lounsbury, 2012; Weick,
1976; Weick, 1995). We can effectively rationalize our actions even when no such reasoning
took place (or took place subconsciously), and thus the researcher’s task of discerning “true”
underlying motivations for action is made especially difficult when relying on self-reports about
those subconscious actions. As Yanow (2000a) suggested, this methodological challenge
becomes particularly thorny because the researcher’s work in observing and participating in
others’ interpretations itself also involves subconscious reasoning; thus the entire process of
mapping, interpreting, and reporting practice involves multiple opportunities for bias and
misrepresentation to enter the study.
The methodological challenge is thus to accommodate the layers of context and
interpretation occurring within a qualitative study while ensuring that the researcher’s
representations are trustworthy and worthwhile. Smith and Deemer (2000) argued that the
criteria for the quality of qualitative research must be rooted in local and contextualized moral
standards applied pragmatically to actual modes and instances of inquiry rather than deriving
from abstract epistemological ethics. That is to say, the “practice” of research is itself a practice
driven by “pragmatic regimes” (Thévenot, 2001) or, potentially, by a “judgment-based practice
of care” (Polkinghorne, 2004) applied to the creation of useful and truthful knowledge. These
practice-rooted judgments can be carefully articulated to derive criteria for validity from the
contexts of the inquiry.
Other methodologists have offered more normative evaluative concepts and methods
aimed at providing validity to qualitative research. Altheide and Johnson (1994) referred to
principles of an “ethnographic ethic” that call for ethnographic researchers to “substantiate their
interpretations and findings with a reflexive account of themselves and the processes of their
research” (p. 489), to report the “multivocality” of the diverse social settings in which they work
(p. 490), and to make explicit both the process of research and the researcher’s own perspective
and position. Other methodologists (e.g., Creswell, 2007; Hammersley & Atkinson, 2007; Stake,
1995) rely on more specific and familiar techniques, including triangulation among multiple
forms of data, member-checking, peer review, and the use of thick description in order to achieve
validity in qualitative research. Yet others (e.g., Lincoln & Guba, 1994; Lincoln, Lynham, &
Guba, 2011) have argued that validity in constructivist-anchored research derives from a
consensus between the researcher, the participants, and the community about “what is ‘real’,
what is useful, and what has meaning” (1994, p. 167), where usefulness and meaningfulness
concern the effectiveness of the research as impetus for social action and change.
Ultimately, each of these perspectives suggests that—while there are some techniques
that are obligatory (triangulation, thick description, etc.)—the criteria of validity in qualitative
research must be negotiated and addressed explicitly within the context of the study. To this end,
I have used the following techniques to enhance validity. First, I plan to engage in a process of member-checking wherein I will provide narrative accounts to selected participants and ask for feedback, correction, or confirmation.[9] Second, I have attempted, to a reasonable extent, to
incorporate quotations from interviews or documents in direct support of the findings I propose
so that readers can see participants’ own narratives and assess themselves the validity of my
interpretations. Finally, the fact that this study is part of a larger study conducted by the Center
for Urban Education within the PASSHE system has given me opportunities to discuss and
compare my findings with other researchers who are working with the same or similar data.
[9] I did not have time to conduct member checking before submission of this dissertation; however, I will do so as part of the larger ongoing study and prior to any peer-reviewed publication of this research.
CHAPTER V: Propositions about the Practice of Change in Higher Education
The research questions motivating this study ask about the nature of a particular type of
change effort in higher education at a particular moment in history, using the PASSHE setting
and particular change initiatives within it as a case and using a theoretical focus on practice to
help interpret and formulate the knowledge gained. The goal of the study is thus to arrive at
generalizable findings—not about any specific instances or “co-variates” of change impact but in
terms (as in Clark, 1998, and Sporn, 1999) of propositions that are both valid relative to the
specific case and tentatively generalizable to settings outside the case. As Neumann and Pallas
(in press) suggest,
[T]he findings of any one study serve as a stepping-off point for where, substantively and
conceptually, a new study, in another site, can begin. One of the aims of that study would
be to verify the findings of the first study, revise misrepresentations or state their
inconclusiveness, extend earlier findings, and open up new ground. Generalization in
qualitative research means something other than it does in statistical studies. (n.p.)
The goal of my study, in this same sense, is thus to provide as findings a set of refined and
substantiated, while still tentative, propositions.
Analytic Questions
To do so, I have relied upon Neumann and Pallas’s (in press) approach (or my
interpretation of it) of linking the case study data to my research questions through a process of
iterative analytic questioning. These analytic questions are directly responsive to my research
questions but also derive from my experience collecting and analyzing the data. As in Clark’s
(1998) approach, I then constructed analytic narratives of the change processes at each of the
universities in my case study illustrating throughout how the propositions “play out” and are
substantiated by accounts of practice change (or lack thereof) in each case. In this chapter, I
provide the propositions along with examples of analytic questions and supporting evidence that
contributed to their development. In Chapter VI I then provide the analytic narratives.
Neumann and Pallas (in press) suggest that formulating analytic questions is “akin to
coming up with questions that allow one to move, systematically and carefully, through a
probing conversation with a person narrating a complicated experience, one that may be
incomplete and difficult to summarize or express” (n.p.), where the conversation in this case is
one between the researcher and the data. Analytic questions are thus probing, substantive
questions aimed at clarifying or elaborating “hints” or “markers” in the data. Table 2 provides
examples of analytic questions that emerged during my collection and analysis of the data (indeed, many are questions I asked as follow-up during interviews) and that informed the propositions described below.
Table 2
Research Questions and Examples of Corresponding Analytic Questions
Research Questions Analytic Questions
(1) What are the practices associated with the objectives of improving student success and equity in universities?
• What specific activities take (or have taken) place within the university, historically
and currently, with respect to issues of student success or equity?
• How are these activities “typically” carried out? In what settings? By whom?
• What are the objectives of these activities, as perceived by participants or as
conveyed in documents?
• What institutional logics are implicated in these practices? Or, conversely, what do
the practices convey about competing or contradictory institutional logics?
• What histories, policies, or other structures are evident in the data with respect to
issues of student success or equity?
(2) How are those practices re-mediated in the course of data-centered, team-based change initiatives?
• What did participants do during meetings? Who participated? Where did the
meetings take place?
• In what ways, if any, were these activities different from those of prior student
success or equity practices (e.g., retention committees)?
• What tools, language, processes or other features of the specific change initiatives
came “into play” during the course of change-related activities? How were they
used?
• What communicative practices and strategies were associated with these initiatives?
How were/are their objectives and processes discussed within or outside the specific
activity settings of the initiatives?
• How did participants relate to these practices? Were they encouraged? Skeptical?
What were participants’ own objectives with respect to the change practices?
• How did participants perceive the objectives of these initiatives?
• In what ways did the tools, language, processes or other features lead participants to
do things differently? How so?
(3) What are the organizational/structural implications and extensions of such changes in practice?
• In which cases did the practices of the initiatives generate new artifacts or structures
(i.e., new policies)? How are these used or perceived by participants?
• In what ways, if any, did participants describe integration of change-related
practices into other forms and settings of practice within the university not directly
related to the initiative (for example, in strategic planning, in other educational
reform activities or committees, etc.)?
• What sources of resistance, skepticism, or structural constraints or barriers did
participants describe in relation to the practices associated with the change
initiative?
• What strategies did participants describe or convey implicitly for navigating,
circumventing, or confronting structural constraints? How did they work? What was
the outcome, from the participants’ perspective?
Propositions about the Practice of Change Focused on Equity & Student Success
The following propositions reflect findings of this study proffered as tentative
generalizations that respond to my research questions via this process of analytic questioning. I
note in parentheses which research questions each proposition responds to; given the iterative
nature of change, some propositions overlap in responding to, for example, both questions about
the nature of specific practices related to equity and student success (RQ1) and the structural
contexts of those practices (RQ3). Below the discussion of each proposition I provide quotes
from interviews or documents to illustrate the kinds of narrative references I drew upon in
developing the proposition. The quotes are meant to provide depth and texture to the abstract
proposition, and for the sake of validity in allowing the reader to consider whether my
interpretations of the data relate clearly to the proposition I offer. However, the quotes I include
are not the only data informing the propositions; in each case the propositions reflect a broad
synthesis of analysis across interviews, observations, and documents.
1. Change may be reflected in reconfigurations and enhanced continuities of existing
practices rather than entirely new practices (RQ1/RQ2). In nearly every case, “change”
represented alterations of existing practices rather than entirely new ones. In each case there were elements of continuity, and in many cases the new practices associated with the change initiatives were literally reconfigurations of existing committees. “Committee work” was
a familiar, continuous form of practice in relation to both issues of equity or diversity and student
success more broadly; these practices incorporated activities that appeared similar over time but
took on different dimensions and meanings for practitioners within the change initiatives (for
example, looking at data, making presentations to senior administrators, conducting research and
reporting back to the committee, and so on).
Quotes illustrating proposition #1
Quote from a dean: [W]e decided that a good number of the people from the Student Retention
Committee—to make it short, SSRC—a number of these people would be part of the Equity Scorecard
because the Retention Committee already had a lot of experience over the two years prior to that… Of
course, our work was not as structured as the Equity Scorecard. Because the Equity Scorecard has a
specific process that we, you know, were supposed to follow. So what we did is we folded our work into
it…
Quote from a faculty member: I was very interested in retention... So we asked for a little money from the
Social Equity office. We started the study collecting data. We asked undergraduates to help us. It was
just the whole going to your dorms, give this to any African-American and Hispanic student. Let them
answer this. We just want to know what they… so, and then we interviewed too. So in that process I
went and interviewed the Multicultural Student Association Director and they said, “Oh, so you’re also
interested in this.” So they invited me to the rally just to encourage the students…and so on.
2. Though change may reflect reconfigurations of ongoing practices, new activity
settings may re-mediate those practices by forging new networks for communicative practices
and/or by changing the conditions of communicative practices across university “silos” and
functions (RQ2/RQ3). The theory of mediated activity settings suggested by CHAT and
modeled by Engeström (1987) includes “people” as one of the mediating elements of practice;
one question in this study is thus how and in what ways people, or the relationships among
people within the settings, may serve to mediate (or in this case re-mediate) practice. Clark
(1998) suggested that university transformation occurred through “incremental and
accumulative” processes of interactive change across departments and functions over time. From
a practice perspective, the question would then be what particular elements of a re-mediated
practice would contribute to such interactive change processes throughout the organization.
Though in two of the three universities in this study the committees formed to engage in
the change initiatives retained the structures and portions of committee membership from prior
committees focused on issues of retention or diversity, the specific initiatives (Scorecard and
Deliverology) prescribed participation by individuals from key functional areas across the
university. In the other case, the committees brought together individuals who had never
previously interacted (for example, faculty and the dean of admissions). In both cases, the new
setting structured by participation of professionals with different roles within the university thus
re-mediated practices related to student success and equity; it appeared to do so, in some cases, by creating new networks for communication and practice among members of the committee.
Externally, the activities conducted by the committee also generated new networks of
communication by (1) forging new relationships between professionals from various areas of the
university and (2) defining or re-defining the conditions for communicative practices between
those individuals. For example, where faculty had previously been unaware of the internal logics
and structures of admissions practices at the university, the Scorecard inquiry activities might
have brought professionals from both settings into interaction by creating a legitimated setting for
new communicative practices and a shared goal.
Quotes illustrating proposition #2
Quote from a faculty member: [A] lot of us, especially in faculty, didn't really know. We didn't know
what the hierarchy was. We didn't really know who made the decisions and when they were made and we
were getting lots of data about, okay, here's the percentage of white students admitted and here's the
percentage of offers made, here's this and this, and so we had the numbers and we could eyeball them, but
we wanted to know at what point is each decision made. What's the communication process like, are we
just using e-mail, when do those e-mails go out. A big thing we were working on was like incomplete
applications and who handles those, so we were just trying to get a better picture of what goes on in the
admissions office and with the counselors. It was eye opening, because like I said, I had no idea exactly
the process and how those decisions were made.
Quote from a department chair: I think also, in the beginning… like, with our campus, we’ve gotten so
large and we have so many different units and I think our units are used to working but not working
across. Units are used to working for, like, whoever is your VP. I think there was, especially for me
being academic coming into Admissions I think there was like, “Okay, who are these people. They are on
our turf.” But I thought after a couple of times… I think people were a little more open then realizing we
weren’t trying to come in and say everything you are doing is bad, we are just trying to see if there is any
way we can help you in your process… So I thought that was helpful to this project to realize we are
trying to work together. We are not trying to come in and condemn people and say we are going to – no
we just wanted to learn about the process.
3. Professional identity and personal moral values may be a significant mediator of the
change process by informing interpretations of the change objectives and motivating practice
“tactics” to align institutional and practitioner goals (RQ2). Both practice theory (MacIntyre,
2007; Polkinghorne, 2004) and Thornton, Ocasio, and Lounsbury’s (2012) model of the micro-
foundations of institutional logics suggest that practitioner identities inform the range of
approaches they take in their practices, particularly in responding to change or choosing a course
of action (or inaction) in relation to organizational goals. MacIntyre (2007) also emphasizes that
practice is driven by the “internal goods” it produces, and that “virtue” in practice is thus distinct
from technical skills (p. 193). That is, the moral or “virtuous” drive for excellence within a
profession may be a constant and necessary element to practice. The decision by a professor to
select new ways of doing her work, for example, is not simply a matter of rationally choosing
more effective means to a given end but part of a drive for excellence within the very practice of
being a professor itself (consistent with Bensimon, 2007, and Polkinghorne, 2004).
In some cases, practitioners in this study found that the new objectives, tools, and processes resonated with their own pursuits of excellence in their practices (MacIntyre,
2007) and provided new “tactics” for pursuing what they expressed as a professional or moral
obligation. In other cases professional identity provided a source of skepticism and caution. In
both cases, however, the interplay of practitioners’ professional identities and (in some cases)
personal moral or ethical beliefs prompted tactical and adaptive changes in practice to align the
objectives of the initiative with their own objectives.
Quotes illustrating proposition #3
Quote from a faculty member: [W]e have to know what all variables we're working with, and I think that's
probably the scientist part of me, that training that says that you can't just come in, and even from a youth
development point of view, from a [disciplinary] point of view, we always tell our students, don't go into
a community and fix a problem without knowing what the problem is. So it was like, I like what we're
trying to do but until you can prove to me that is a problem, I'm not sold on this. So, while everyone else
was busy trying to set goals, I was like, no, I want some more variables, give me something else to work
with…
Quote from a student affairs professional: I think they had it where [students] could go and schedule with
the facilitators and based on who the facilitators are, they're all lovely, personable and caring women that
would make sure that the student was listened to, but there wasn't a dedicated professional that did what I
did now. And that's why I did not thrive in the office for students with disabilities because the students
wanted to meet with me and they were overwhelmed and the person who ran the office said you're gonna
set precedence, because she was very legal and I was very “people,” and never the twain shall meet, and
she thought that I was trying to be their friend and I said, no, from a rehab counseling perspective you
need to meet them where they are, especially if they have disabilities, especially if they have cerebral
palsy and nobody's paying them two cents on the street as far as attention, you know, we had students
who were deaf, we had students who were blind just coming in, you know, you need to make sure that
those people are incorporated into the student population.
Quote from a faculty member: This diversity thing – I’m a minority so I think it’s very interesting to see
how that happens but I had done a program a few years back…Of course, as you go through this you start
to see the numbers. And we are talking about minority students coming to college. Then we wrote a
grant…We were doing programs, bringing minority students into the university to kind of show them
what happens here hoping that it will translate to coming to university. On that end I was trying to bring
minority students to my university so they can later come back when they finish high school. Then I
discovered that they come and they just leave again. Why? …[I]f I can help in any way for minority
students to come to college and be successful, because I understand the pain of coming to college and not
being successful. To me that’s painful. I think I have the ability to and if I don’t have the ability I can
direct them to resources. “This is why you need to stay.” I think it just comes from that fact. My
experience. I have children in college and it’s just a painful experience to think that your child is going to
drop out and education is the only way you have for them to survive in this economy. So I think that kind
of informs my interest doing it. I can do it every day.
Quote from a dean: To me… that was my first experience as ‘I am not the predominant white male.’
…As my wife and I reflected on that, we began to say, and I began to reflect that that’s how I looked at
people who didn’t look like me growing up. And I realized that there’s a problem here. And, you know,
I really wrestled with that in my own mind, in my own spirit. That experience in China really taught me
to say there’s no difference between … someone being Chinese, someone being this, someone being this,
someone being this. So when I got this position, and I got equated, or understood what was going on in
the Equity Scorecard, those things actually came back to me, and I began to realize that we as a
University have to be careful not to say, “Oh, well you’re this and you’re this,” but we need to make sure
we have opportunities for all students to be able to be successful…I began to say, “Why, why are there
gaps?” And some of the stuff we’ve done now with the Support for Success and the four-year grad plan,
we are beginning to analyze things differently to say there shouldn’t be differences.
4. The logics and specific objectives of accountability policy may mediate change-
related practices in unexpected ways, for example by creating discursive resources for practice
or intensifying tensions between institutional logics (RQ2). The literature suggests that the
origins of change initiatives within accountability policy may be counterproductive or even antithetical to “professional” change efforts within higher education practice (Shulock, 2003);
this suggests that the accountability context would in some way serve to mediate the change
initiatives in PASSHE. Accountability did appear, in some cases, to re-mediate existing practices
related to student success and equity by introducing specific new objectives (goals associated
with performance-based funding such as “cutting gaps in half”); but, also consistent with the
literature, that particular dimension of accountability policy’s role in the change processes was
limited primarily to deans or other top administrators (Burke & Minassians, 2003).
However, the accountability context appeared more notably and surprisingly to re-
mediate change practices, in some cases, through a legitimizing effect of specific communicative
and discursive practices associated with the initiatives. Specifically, members of the teams were
able to strategically use the initiatives’ association with the accountability system to get “buy-in”
and cooperation in carrying out their inquiry practices in multiple settings across the university
because they could cite the initiatives as a “provost’s initiative” or “something the president has
tasked us with doing.” In another case, several modes of accountability impetus (performance-
based funding but also accreditation) seemed to have led to a discursive attempt within strategic
planning and leadership practices to merge and integrate several competing institutional logics,
thereby constraining change practices by creating dichotomous frames of reference for action
and decision making. Operating “through” these discursive practices, accountability thus
appeared to have indirectly mediated practices related to the change efforts, interacting with an
array of other histories, logics, and practices within the universities.
Quotes illustrating proposition #4
Quote from a dean: [T]he Equity Scorecard gave us, I don’t wanna say credibility, but gave us an entry, it
gave us more of a standing in the community. One thing was the Retention Committee wanting to look at
things as a subcommittee of the enrollment management group, and it is something else as a group
charged by the president to look at this and making the direct connection to the key performance
indicators. So it gives a little bit more clout to go out there and get the data a little faster than we
normally would.
Quote from a dean: Yeah, everybody’s aware of our performance funding, yes. If you ask me the
question, did we engage in [Equity Scorecard] because of performance funding, my answer would be
“Yes, we did.” But that caused [the Scorecard] to be able to come to us. Did we engage in the process
because of performance funding as an end? No, we didn’t. I think we had a very frank discussion where,
you know, the reasons why we engaged in this process had to do with—we just felt it was the right thing
to do, that once you’re aware of those gaps, to not try to address those gaps is unconscionable. You
should—you cannot be aware of that and not want to do something about it, and I think that that’s what
kept people at the table. It was nice that if you, you know, if you fix a gap, then we’re gonna do well in
performance funding. Okay, that’s great. So it’s kind of like activation, you know, in chemistry.
5. The introduction of new tools, processes, artifacts, and language specifically
designed to address issues of equity and student success may re-mediate practices if they link
new “know-how” with practitioners’ own existing wisdom, goals, or professional judgment. It
is not clear that new technologies for practice in themselves serve to re-mediate practice
(RQ2). The change initiatives introduced new technologies that appeared to re-mediate existing
practices around student success and equity in significant ways. By technology I do not mean
computers or digital media; rather, I use the term in the original sense of technê, as in Aristotle’s
delineation of the five virtues of thought in the Nicomachean Ethics. Aristotle’s view of technê
was that of a craft or in some translations “know-how” or “technical thinking,” which is distinct
from but bears an intimate relationship to other virtues of thought including epistêmê as certain,
demonstrated, and in some translations “scientific” knowledge; phronêsis as wisdom or prudence
in action; and praxis as a form of knowledge embedded in practice or something akin to
excellence in practice (Aristotle, 1962, pp. 311-315; Bensimon, 2007; Kristjánsson, 2005;
MacIntyre, 2007; Polkinghorne, 2004). However, new forms of technê apart from these other
forms of knowledge may be incomplete and ineffective. The literature on data-use initiatives
reviewed above, for example, suggests that new forms of data collection and reporting do not
necessarily bring about changes in practice (Dowd, 2005; Morest & Jenkins, 2007).
In many cases in this study, practitioners described or revealed (during observations)
ways in which the change initiatives, particularly the Equity Scorecard, provided a structured
process, tools (data and inquiry protocols), and vocabulary that distinctly re-mediated practice
in interaction with other elements of practice, including practitioners’ own professional and
personal judgment and ethics (Bensimon, 2007). In other cases, practitioners described new tools
or artifacts (or I discovered them through document analysis) that did not appear to re-mediate
practices associated with the change initiatives but instead appeared to remain peripheral to
practice; in these cases the new technologies often appeared either incongruent with existing
practices or impossible for practitioners to link to their own objectives and/or phronetic wisdom.
Quotes illustrating proposition #5
Quote from a faculty member: I think we were all excited about that and so part of it is the data-rich – and
the process that has been developed by the Center for Urban Education. So the processes and then the
data-rich aspect of it, I think we all appreciated that…Also the kind of question of what is it to be “equity-
minded” versus “deficit-minded.” I think that as an intellectual project… to think of what does it mean,
how do we avoid being deficit-minded, I think that that was a really useful and interesting – and
something that excited us.
Quote from a dean: I mean the data tools that you have are data tools. And so you could disaggregate any
way that you want to disaggregate. And so, you know, you look at something like, you know, men and
women. There’s a strong sense of inequity in terms of how women access, for example, computer
science. Well, what is that gap? How big is it? What are the steps of the process, and is there a way to
take those principles through, for example, in a retention committee, to look at that to help close the gap
on how women access computer science as a discipline. Different question, same set of tools, right?
Quote from a student affairs professional: It was my boss tapped me and at that point too I’d been here
three years maybe, and she knew pretty well the kinds of questions and what I was interested in – which
again was, I struggled a lot getting information about my students. So I knew it was going to be a lot of
work, but that was okay, because again I felt it was the best opportunity for us to have an impact on this.
I had served on the Retention Committee. We had done a lot of studies. We had done a lot of
subcommittee kind of work and we really had almost hit a brick wall with, it’s not really going much
further. So this really seemed to be an opportunity to really have an impact on change… The thing that I
really liked, it was the first time I actually got hands on data about my students; my students; just my
students related to Retention and Completion as well as broken down race, ethnicity and gender. So that
was important to me because I had never had it before.
6. Competing institutional logics may moderate rather than mediate practice change by
creating dichotomous frames of reference for possible outcomes; in other words, they may
indirectly mediate the re-mediation process by circumscribing or broadening practitioners’
preferences or sense of efficacy, and may do so differently for people in different positions of
authority (RQ2). In several cases in this study, an institutional logic of “commitment to equity”
appeared to compete with a logic of “prestige and rising in the rankings”; in other cases a logic
of “commitment to equity” competed with a logic of “fiscal prudence and resource constraint.”
These various logics were expressed in practitioners’ explicit and implicit discursive practices
related to the change objectives—for example, in explicitly rejecting “unfeasible” ideas during
meetings or implicitly prioritizing certain activities over others based on a sense of broader
institutional values. In these ways, such discursive practices within the activity settings served to
reify logics in ways that were primarily conservative.
Perhaps not surprisingly, these discursive practices occurred in patterns related to the
division of labor in higher education. Specifically, deans and administrators were not more likely
to make references to one logic as dominant over another (though in some cases this was
implicit), but they were more likely to emphasize the competing nature of various logics—for
example, in framing the viability of recommendations in terms of those that cost money and
those that would not. Non-administrative practitioners (especially faculty) were more likely to
convey (implicitly or explicitly) competing logics as a point of frustration or motivation for new
forms of practice and to explicitly reference one set of institutional values or priorities as
dominant over others. This finding is consistent with Thornton, Ocasio, and Lounsbury’s (2012)
suggestion that professional identities interact with institutional logics in creating frames of
reference for practice.
Quotes illustrating proposition #6
Quote from a faculty member: Our gap in terms of just numbers is not that big, I mean we're talking about
keeping 20 more students or something like that, but you know in terms of this 7000 people we're talking
about retaining 20 more students, it's doable, it's doable and the small changes that we make can impact 1
or 2 here and there and then this change can impact 1 or 2 more there, that's why I'm hopeful. So, getting
that message out and getting some of these changes started, I think are important, but I think the buy in
from the people in power is where I'm just uncertain, I mean, I don't know. They have a lot on their plate.
They have other issues to worry about. How important is this in the grand scheme of things, of keeping
the university running and you know, our US news and world report rankings and all that stuff, which I
think drives them more than other concerns.
Quote from a student affairs professional: That was one of the things at that initial meeting we were
discussing was… What are some of the barriers that we may encounter in trying to lessen the gap? And
the admissions requirement was one of those barriers that we initially found out, it was because
primarily…the SAT score was the strongest indicator of whether or not they were going to be admitted or
not. And that was an institutional barrier because we didn't think that the administration at that time was
going to lower the SAT score for students to get in.
7. Re-mediated practices may create or reinforce “learning structures” in which those
practices are embedded. The creation or transformation of structures for continuous learning
practices may represent one way of measuring the impact of change initiatives (RQ3). Practice
theory, especially drawing on Giddens’s (1984) structuration theory, emphasizes that practice is
enacted in the course of reciprocal influence of human agency and structure—that even as
practices derive from structure they also reproduce and/or transform structures. Engeström
(1987; 2001) defines “expansive learning” within organizations as transformations of activity
settings such that “culturally new patterns of activity” are produced. From a practice theory
perspective, such structure/practice transformations may have a relationship to the “adaptive
capability” described by Sporn (1999) in her study of university adaptation, suggesting that the
extent to which re-mediated practices effect concurrent structural changes may enhance or
constrain the capacity for the kinds of iterative, organization-wide changes in practice described
by Clark (1998). I thus define learning structures here as contexts of practice in which the object
of shared activities is to “learn” with respect to a particular goal (for example, committee
meetings where new forms of practice are routinely enacted, e.g., conducting inquiry, collecting
and presenting evidence, interpreting data). I elaborate this idea in-depth in the final case
narrative in Chapter VI.
With respect to practices re-mediated through change initiatives, in this study such
practices in some cases altered the structures within which they were embedded in ways that
appear to enhance the potential for expansive learning outside the parameters of the initiatives
themselves. In other cases, the re-mediated practices of individuals within the team seemed
confined to the activity settings of the initiatives with no evidence of changes to the patterns of
activity within other structures across the organization (committee meetings, strategic planning
practices, etc.). That is, despite evidence of changes in individual practice, the extension of new
or revised practices into corresponding new or revised structural settings that would support
ongoing learning (i.e., new committees, new institutional routines with respect to data reporting
and monitoring, and so on) was not evident.
Quotes illustrating proposition #7
Quote from a dean: “I’m also a member of the Curriculum and Academic Policies Council, and I’m the
chair of the Undergraduate Programs Committee. So I was able to, as chair of the UGPC, change also the
way we were looking at changes in curriculum and changes to programs. So there were things that were
done behind the scenes that were not officially, but as people came to present changes to their programs,
then we started asking… “So where’s your assessment for this, how is this going to impact pre-major
students, how is this going to impact minority students?” So we started asking these questions that hadn’t
been asked before.”
Quote from a dean: “We gave each one of [the department chairs] data for their own department and said,
what we would like you to do is to begin to get in the habit of including ethnicity and race, equity data, in
your general assessment process, but also in your five-year review. As far as I know, none of them do that
now. And so they’re invited to this meeting—and at that time we will have all the data that we think is
available by department. Data on access, data on retention, and on graduation, and we might have a little
bit of the excellence data, data on probation, by ethnicity, and things like that. And what we will do
during that—I think it’s a 4-hour session—we will work through a couple of examples… And this is one
part where Equity Scorecard and [Deliverology] connect because we are using the faculty in Equity
Scorecard as the consultants for this because they have the background, and so the provost has given us
the funding to pay the faculty to be their consultants and to also pay the department chairs an itty bitty
stipend for attending the session and coming up with an equity plan by the end of the summer, by
department.”
8. Leadership practices may have an important but primarily indirect and resource-
based role in change-focused practice re-mediation (RQ1/RQ2). For this study I did not
incorporate a specific theoretical perspective on the role of leadership, nor did I review the
extensive literature on leadership or leadership strategies related specifically to change (e.g.,
Kezar, 2013) because my interest was in the particular nature of change rooted across types of
higher education practice but specifically within the settings of these committees. I also did not
have the opportunity to interview provosts or presidents as part of my data collection. I thus
provide this proposition last as the most tentative; however, references to leadership came up in
the data too often to neglect entirely.
Sporn (1999) says little of leadership other than that “committed leadership” was critical
for successful adaptation (p. 270), and Clark (1998) suggested that a “strengthened steering core”
including both leaders and faculty was vital to university transformation. Hearn (1996) suggested
that leaders play an important translational role in change by introducing new objectives or
exigencies from the outside world in such a way that change efforts fit within “the existing
institutional culture and history” (p. 148). Practice theory would emphasize that leadership is
also a practice and thus, like other practices, has the potential to create resources (not just
financial but also symbolic, legitimizing, discursive) for other areas of practice within the
organization (Feldman & Orlikowski, 2011; Nicolini, 2013). Leaders, moreover, have the
potential in their practice to create structures impacting the work of others throughout the
organization; that is, not through literal restructuring of the organization (e.g., reallocating funds,
hiring new positions, reorganizing lines of authority, and so on), which is an outcome of a
particular leadership decision, but rather by shifting structures of practice in subtle ways through
the enactment of leadership tactics (e.g., creating new informal patterns of communication and
authority or distributing authority by modeling delegation of decision-making, and so on).
I thus draw on these general ideas about the potential roles of leadership in change by
looking not at the practice of leadership in itself but at how resources or implications of
leadership practices appeared within the context of the change initiatives. Like the accountability
policy that initially generated the change initiatives in this case, the role of leaders (here I mean
presidents and provosts) and leadership practices appeared primarily to serve in the creation of
resources (discursive, symbolic, and fiscal) for practice (Feldman & Orlikowski, 2011). Any
direct impact of leadership practices in re-mediating practice was unclear from this study. In one
case the provost was actively involved in facilitating and structuring the use of change initiatives
on campus by making presentations, providing financial resources to support professional
development activities, and creating linkages between initiatives and other campus planning and
reform activities. In another case, the provost was supportive (as perceived by participants and in
informal conversations) but relatively hands-off, and in a third case participants made virtually
no mention of leaders engaging in the processes other than assigning a dean to lead the team.
Quotes illustrating proposition #8
Quote from a dean: Our approach is a little bit different because of [our university’s] distributed
leadership…This is something that our former president put in place. Meaning, giving people down to
the fourth or fifth level the authority to make changes among the programs as they see fit. So the
university is not open in general to the idea of having one person monitor everything. And even to this
day we have a different president, it’s still the campus culture that in the spirit of distributive leadership
you go and work at a different level. I’ll give you an example. Just today, this morning, one of the
recommendations from the retention completion of transfer students, you know, piece of the Equity
Scorecard, we found out that the English department…somebody reported that the English department
does not allow the students who have taken a remedial course, a remedial writing course at a community
college, to come directly into our freshman comp class, our writing 120. That if a student hasn’t taken the
writing 120 or its equivalent at the community college, they have to start with the remedial here. That’s a
policy. That’s a “policy.” It doesn’t live anywhere. It’s been a practice. And so that was bubbled up to
the final report. And the provost says to me, okay, well, you’re the [Dean], you go talk to the English
department.
Quote from a faculty member: [The Provost] already actually had her report that mirrored what we were
doing. She was very aware of what was going on even before we did that. So ours was in addition to
what was already taking place in her office. I think her awareness of this problem – and I don’t know
how they work up there – but I can almost say that her awareness may have been the strength of bringing
the Equity Scorecard in. So…she was very appreciative that we had taken the initiative and she was
going to read what we had said and that was going to inform her in this whole process.
Together, these eight propositions describe what appeared as patterns across sites with
respect to the nature of the practices related to the change initiatives. In other words, they
represent my analytic assessment of what the activities (“sayings and doings”) associated with
the specific initiatives conveyed about the conditions of practice change within the universities
and about the structures and contexts that both facilitate and constrain practices related to
change. To be clear, the propositions do not reflect statements about what was true in every case;
some were more present within any given case than others. Also, the propositions reflect
possibilities about the nature of change and the relationship between practices and structures in
both generative and conservative ways; in other words, the intent of the propositions is not to say
that the processes of changing practices could be improved by enacting x, y, or z mediating
factor. In this study I argue for replacing such positivist ways of thinking about
change with an understanding of change as a function of the interactions between structure and
action that are inherent to practice. The propositions are intended to describe ways in which the
factors mediating change processes are fundamentally relational components of an iterative
process rather than causal elements in a linear process.
The next chapter provides case narratives from each of the three universities in which the
dynamics suggested in these propositions play out in different ways, to varying extents, and in
varying configurations of interaction with each other. Certainly all of the dimensions of
organizational practice and change reflected across the propositions could be addressed at each
university through an extended analysis, but for the sake of concision and “maximizing learning”
about the case I have instead focused on the most prominent features of the practices associated
with change initiatives, and attempted to provide examples, where possible, of variation in the
posited relationships (i.e., to provide examples where the mediating relationship was both
generative of practice change and constraining of practice change).
CHAPTER VI: Case Narratives of Practice Re-Mediation
This chapter provides analytic case narratives about the practices associated with the
Equity Scorecard and Deliverology initiatives within the PASSHE universities. Throughout each
narrative I illustrate in bold how the propositions outlined in the previous chapter “played out” in
the process, describing the nature of that practice re-mediation and my interpretation of the
meaningful interactions between practices and contextual factors. I have tried, for the most part,
to resist throughout this analysis the instinct to evaluate qualitatively the “success” or “impact”
of practice change in each case. I have not, for example, tried to provide a matrix of variation in
the extent to which practices were or were not “re-mediated,” the organizational “effects” of that
re-mediation, or the conditions that may have contributed to or inhibited the effectiveness of
practice re-mediation in causing some set of organizational outcomes. Instead, with the
propositions above and the case narratives below I have tried to put emphasis on understanding
the nature of practice change by illustrating the contexts and processes of re-mediation as
conveyed in the data. The benefit of viewing the three case narratives as comparative is thus to
flesh out different dimensions of change given different organizational settings and practices, not
to comparatively evaluate what makes practice change more or less effective with respect to a
given set of outcomes or normative standards. That said, one of the potential contributions of a
practice approach and of studies like this one is to build, over time, an empirical knowledge-base
from which such evaluative criteria could be developed.
Though each PASSHE university has a unique history of development, and indeed the
three sampled for this study vary not only on the characteristics outlined in Chapter IV but also
on many others I have not identified, all three universities are nonetheless part of the public
system of higher education and thus some of the contextual and historical factors relevant to the
change processes studied here are common across sites. Before proceeding to the case narratives,
I therefore begin with an overview of the history and current contexts of the PASSHE system.
System History & Overview
The current governance structure known as the Pennsylvania State System of Higher
Education was formed through legislative action (Act 188) in 1982, which brought together, in
1983, 13 existing state colleges along with Indiana University of Pennsylvania (a state-owned
university) into one system under the control of a central governing board. Twelve of the
institutions began as normal (teacher training) colleges within each of the twelve independent
districts established by the Pennsylvania Normal School Act of 1857 and one, Cheyney
University (then State College), was founded by a Quaker philanthropist as the “Institute for
Colored Youth” in 1837 and later became a teachers college. In their early history, the normal
colleges received ad hoc state funding to purchase land or build facilities but were privately
developed and owned, operating with charters from the state to serve as regional teachers
colleges. In 1911, state law mandated that the Commonwealth of Pennsylvania purchase the
colleges and thereafter each received appropriations of public funds.
Over time, the state colleges expanded their curricular offerings and with the passage of
the 1982 legislation, all twelve state colleges along with Indiana University and Cheyney State
College became comprehensive universities (PASSHE, n.d.). Until recently, only one of the
universities (Indiana University of Pennsylvania) could offer the doctorate degree; in 2012 the
state passed the Higher Education Modernization Act, which allowed other PASSHE universities
to develop doctoral programs with approval from the PASSHE Board of Governors.
One of the findings of this study is that the practices associated with the student success
and equity initiatives were in most cases continuations or reconfigurations of existing practices,
many of which had been developed and iteratively evolved over time in response to various
external policy measures. Cultural-historical activity theory emphasizes that “history” is a critical
factor in determining the structures of on-going activity, as current structures and artifacts
represent the materialization of prior activities that can be traced back indefinitely (Cole, 1983).
Indeed, in-depth historical analysis of the PASSHE institutions and their social and political
contexts would surely reveal many sedimentary layers of policy and practice interacting to
transform the practices within the institutions over time. While I am cognizant of the
importance of these historical dimensions, this is not a primarily historical project; I highlight
just two historical factors that emerged explicitly in the data and that are important for
understanding the structures within which the current practices take place—that is, with respect
to history as a mediating factor of student success and equity-focused practice within the
universities. These two parallel historical contexts are (1) Pennsylvania’s history as one of the
states ordered in the 1970s by the federal government to desegregate its public education
institutions, and (2) the PASSHE system’s subsequent efforts to embed access and retention for
students of color within performance-based accountability policy.
Office of Civil Rights-Mandated Desegregation and the Adams Case
Title VI of the 1964 Civil Rights Act prohibited any agency or organization receiving
federal assistance from discriminating based on race, color, or national origin. To enforce the act,
the Department of Health, Education, and Welfare (HEW, now the Department of Education)
created an Office of Civil Rights (OCR) tasked in part with eliminating segregation in public
elementary, secondary, and postsecondary schools. In 1969 and 1970, the OCR notified 19 states
including Pennsylvania that they were in violation of Title VI in maintaining racially segregated
public postsecondary systems. States were required to submit desegregation plans and were to be
(and many continue to be) monitored by OCR for making progress in eliminating duplicate,
racially segregated programs (US Dept. of Education, 1991). In 1970, the NAACP Legal
Defense Fund brought suit on behalf of a number of private individuals against the OCR for
failing to adequately enforce Title VI. That suit, Adams v. Richardson, resulted in 1977 in OCR
having to develop more stringent criteria for desegregation planning and assessment (US Dept.
of Education, 1991).
In March of 1969, Pennsylvania’s Department of Education was notified by OCR that the
state was in violation of Title VI and was ordered to submit a plan to eliminate racial segregation
across its public education institutions or risk losing federal funding including its eligibility for
nearly $2.4 million (around $15.6 million in today’s dollars) in annual federal support for higher
education. In a memorandum to the chairmen of the faculty senates at the Pennsylvania state
colleges, the Department’s Vice President for Academic Affairs described the order:
You are aware of the fact that the Office of Health, Education, and Welfare has cited the
Pennsylvania state colleges accusing them of unacceptable practices in discrimination. In
effect, an ultimatum has been issued. Presumably, Pennsylvania must come up with a
plan for desegregation acceptable to the department in Washington. (Smay, 1969, n.p.)
The Pennsylvania Department of Education did submit a plan to HEW in 1969 titled the
“Pennsylvania Department of Education Plan for Equal Opportunity in the State Colleges and
University” (Friedman, 1975). HEW “accepted [the plan] in principle but found it insufficient in
detail on how the imbalance would be alleviated at the individual college level” (Stroh, 1969,
September 9, n.p.). Subsequently, the Pennsylvania DOE submitted a revised plan outlining
specific responsibilities at the Department and institutional levels with implementation target
dates (Friedman, 1975).
As part of its plan to address the OCR order, the commissioner of the Pennsylvania
Department of Education requested that presidents of each of the state colleges send academic
deans and admission directors to a series of statewide conferences in the fall of 1969 to develop
strategies for increasing the enrollment of black students. I quote from this letter at length to
provide a sense of the historical origin of many current structures within PASSHE universities:
Attached is a copy of a tentative program for the closed working session of
Academic Deans or Vice-Presidents scheduled in Harrisburg on September 16 and 17. As
you know, the genesis for this two-day conference is the HEW Desegregation Order and
the Pennsylvania Plan to meet this Order.
Discussions of PDE staff with HEW officials in both Washington and
Pennsylvania made clear that the HEW regards at least two aspects of major importance
in achieving a higher level of desegregation. One of these is the academic climate and
offerings of the State-owned institution; the other is admissions policies and procedures.
The conference for Academic Deans on September 16 and 17 will be complemented by a
conference for Admissions Officers on October 14 and 15.
…The conference can only have a high degree of success if you will have had
significant discussions of the role of your institution in regard to desegregation prior to
September 16. I assume this would include an area of agreement with your President,
perhaps your faculty Senate and/or other colleagues on the practical academic problems
of increased Black enrollment. These would include such topics as the sharing of course
responsibilities (Cheyney and West Chester, for instance), faculty exchange, student
exchange, your college as a ‘magnet’ for certain areas of study, consortiums, library
resources, faculty needs (especially Blacks), tutorial or remedial help, teacher education
for the ghetto, etc. (Mohlenhoff, 1969, n.p.)
Following these conferences, the Department and the state colleges developed a range of
“interventions” designed to enroll and support more black students at the state colleges (other
than Cheyney). The DOE commissioner sent periodic memoranda to the state colleges describing
such efforts by the colleges to increase black enrollment, and providing resources for “social
awareness” (e.g., Mohlenhoff, 1971, April). For example, one such memorandum provided
statistics from the 1970 census under the heading “1970 Census Figures Show an Increase of Blacks in
Pennsylvania” (Mohlenhoff, 1971, April). Another showed the number and percentage of
bachelor’s degrees awarded to black students at each of the state colleges in the 1970-71
academic year, which ranged (not including Cheyney) from a low of 0.12 percent (1 out of 795
degrees awarded that year went to a black student), to a high of 3 percent (34 out of 1,099
degrees were awarded to black students).
In each case these memoranda from the Commissioner’s office provided examples of
what today we might call “best practices” being undertaken at various colleges in response to the
state desegregation plan. Consider, for example, this account from California State College:
Dr. George Reed, associate professor, Department of Educational Foundations at
California State College, and Regis Serinko, assistant to the California president, are
making a statistical study of California’s success in the enrollment of disadvantaged
students. They are especially concerned with compensatory programs for minority
groups and are seeking data on the rate of attrition and identifiable causes for such
attrition. (Mohlenhoff, 1971, July, p. 2)
Another example described a recruitment initiative at Millersville State College:
OPPORTUNITIES DAY AT MILLERSVILLE: Approximately 250 students attended an
Opportunities Day for minority students at Millersville State College in February.
Students came from as far away as Pittsburgh and New Jersey. The all-day
program included presentations about many facets of college life, an introduction to
college officials, a luncheon, campus tours and question and answer sessions.
The students, most of them high school seniors, were told about the college’s
Special Program for Disadvantaged Students.
At the end of the day, 53 students filled out applications for admission to
Millersville. (Mohlenhoff, 1971, April, pp. 6-7)
These memoranda provide a wealth of information about what were then, much like the
programs still common at many institutions nationally today, a range of programmatic campus-level responses
to the external mandate to increase access for underrepresented students of color. Moreover,
these examples illustrate attempts to examine existing structures (admissions procedures) or
create new structures (compensatory programs) intended to address inequitable access for black
students, and vestiges of these efforts remain visible at the institutions today. Every PASSHE
institution has, for example, an Office of Social Equity; these offices were created as the
administrative home of desegregation plan initiatives, which subsequently became affirmative
action policies, and which are now tasked primarily with compliance with Titles VI, VII, and IX
of the Civil Rights Act and all of the other federal and state laws prohibiting discrimination on
various grounds. Interestingly, virtually no mention was made of these offices or their
responsibilities during the interviews for this study.
In addition to revealing the historical origins of many structures of current practice at
PASSHE universities, the historical material also contains the origins of
the same conflicting institutional logics and narratives as those that emerge in the data of this
study around student success and equity for students of color. One such dominant narrative was
that black students from urban areas simply did not want to attend or remain at rural
universities—a narrative that emerged in the data for this study as well. In a 1969 news article
titled “Small-Town Attitudes Keep Races Apart at State-Run Colleges,” the author offered an
implicit theory as to why Pennsylvania’s rural state colleges were likely to remain predominantly
white:
Last spring ugly rumors began to drift down Main Street into the bars and
barbershops of Slippery Rock, a small farming town 55 miles north of Pittsburgh.
The U.S. Department of Health, Education, and Welfare only weeks before had
ordered Pennsylvania’s state colleges to correct widespread racial imbalance and
desegregate.
The rumors in Slippery Rock were that some 500 blacks from Harlem would be
attending school there come September.
…State education officials see in the Slippery Rock anecdote the reasons why the
14 state colleges have traditionally been segregated and why they are likely to remain so
to a large degree for years to come.
The point made concerns life in small-town America – not only how it affects the
state colleges and the men who run them, but also how it affects the perspective [sic]
Negro student who most likely is from the city. (Stroh, 1969, September 9, n.p.)
The article later quotes a professor of Sociology from Slippery Rock State College: “’I don’t
believe the whispering campaign was necessarily white racism,’ he said. ‘Rather, it was a
projection of unnecessary fears resulting from ignorance. I found the rumors, which have since
subsided, rather amusing in one respect. None of our kids will be from Harlem this year. All are
from a 60-mile radius of the college’” (Stroh, 1969, September 9, n.p.). There is, of course, a
double irony in the professor’s comment: in denying (while also confirming) the racism
underlying the town’s overreaction to the “rumors,” he also reaffirms the assumption that it
would be absurd for any student from outside the immediate community of the rural college
to enroll there.
The universities themselves also attributed their difficulties in attracting additional black
students to their rural location and students’ assumed preferences. In a carefully worded section
of its Desegregation Information and Plan (Shippensburg State College, n.d., ca 1973), for
example, Shippensburg State College administrators provided statistics on the black enrollment
from each of its top ten “sending counties” for new freshmen and accordingly summarized their
outlook on future recruitment:
It is becoming apparent that the recruitment of blacks and other minorities is
highly competitive, particularly in the more metropolitan areas in and around Pittsburgh
and Philadelphia. The low input of blacks from the top sending counties listed in Table I
other than Philadelphia and perhaps Dauphin raises a number of questions about
recruitment possibilities within these counties, including counties within the immediate
geographical area of the college. There is a growing conviction that a higher retention
rate might result if such minority students were more geographically and ecologically
related to the College. Present studies show that the attrition rate is almost 80%, even
though many counseling and study skills activities are included in the program.
(Shippensburg State College, n.d., ca 1973, p. 2; emphasis added)
I emphasized in this quote two distinct but related logics, which reappear in the
present-day case narratives below. The first is about access and
suggests that black students are not “ecologically related” to the rural institutions; that is, “they
do not want to be here.” The second statement, offered in this report as proof of the first, reflects
a similar logic but with respect to retention: “they do not succeed despite programs we have in
place.” In both cases, emphasis is placed on students, and assurances are offered of the
legitimacy of the institution’s efforts—in recruitment and retention, for example—that are
simply inadequate given the geographic and “ecological” isolation of the institution.
PASSHE Performance-Based Funding
The second and parallel historical context relevant to the current inquiry is PASSHE’s
use of performance-based funding as a tool of accountability, which was first implemented in
2003 and is still in place today (as revised in 2010). Like the OCR mandate to desegregate,
PASSHE’s accountability policy required that institutions meet and set goals with respect to
student retention and completion, as well as “inclusiveness” and success for students of color
(among other outcomes; Cavanaugh & Garland, 2012). The implementation of performance-
based funding was also a component of PASSHE’s ongoing response to the desegregation
mandate. To repeat the quote from Chapter I for the sake of convenience, the PASSHE Executive
Vice Chancellor noted in 2013, in response to the Fisher decision, that "The Pennsylvania State
System of Higher Education (PASSHE) does not have a policy similar to the one reviewed by
the U.S. Supreme Court. However, the leadership of PASSHE and its fourteen member public
universities strive to assure that our campuses reflect the diversity of the Commonwealth’s
citizens. To meet this goal we participate in a number of initiatives specifically focused on
reducing gaps in access and achievement for under-represented and low-income students such as
the national Access to Success program” (Wenner, 2013, June 24, n.p.).
The current performance-based funding system thus reflects both a continuation, within
the policymaking practice of PASSHE administrators, of the same structures developed in
response to the OCR mandate of prior decades and, as of 2010, a departure from prior
practice: the revised policy design has institutions “compete” against their own
prior performance (rather than against each other) and adds capacity-building
initiatives.
There are two points I wish to make through this extensive historical foray. The first is
that each of the universities within PASSHE, as well as the system-level policy office, has a long
history of building and refining internal structures in an effort to become more effective in
enrolling and serving diverse students (i.e., in student success and equity, though those concepts
have been defined differently over the years). For at least four decades, moreover, the
universities have undergone specific attempts at change related to access and success for students
of color and other “disadvantaged” students. Vestiges of these histories—not only in terms of
structures and practices within the institutions but also common institutional logics and myths—
are clearly evident within the data of this study and relevant to the study of ongoing practice
change in the twenty-first century. Leonardi (2012), in his study of car crash simulation
technology, concisely describes the ways in which such structural changes become embedded as
sediments of practice:
People come to see particular changes as inevitable because routine work dynamics
obscure the processes through which those changes were decided upon. A consequence
of this obscurity is that the technologies we develop and use and the organizations in
which we develop and use them appear to evolve in fairly straightforward ways such that
when those managers and engineers reach the next choice point, they reflect upon the
apparent inevitability of the changes that have occurred previously and use those
perceptions of inevitability to justify their current decisions. (Leonardi, 2012, p. 5)
Those structures and practices developed in the 1970s—responding to the exigencies of the
time—were a function of the way the universities had “always” practiced just as the current
structures and practices are a function of those built in the 1970s (Clark, 1983a). This historical
context can thus be analytically recognized in narrative accounts of current change processes
even when respondents are not aware of them; indeed I suspect that many, particularly younger,
practitioners within the universities are not aware of the full history of these efforts and how they
continue to shape institutional policy and practice.
The second and corollary point, however, is that in comparison to the initiatives being
studied here, the earlier institutional efforts described in the Commissioner’s memoranda—
though similar in some respects, for example, in their genesis within external policy mandates
and their general objectives—may also have been distinctly different in practice. In many
ways, these earlier efforts of the 1970s may represent a genre of accountability logic and
institutional response focused on “best practice” models of intervention (Dowd & Tong, 2007) or
“single-loop” learning wherein the assumption underlying the perceived problem (students of
color not enrolling) remained untested (Argyris & Schön, 1978; Witham & Bensimon, 2012).
Solutions and discourse around them were distinctly student-centered or, to use the language of
the Equity Scorecard, “deficit-minded,” meaning they focused on the deficits of students rather
than the impact of institutional practice and policy (Bensimon, 2006). Indeed, it may be those
iterations of practice with respect to student success and equity that now require re-mediation,
with new objectives, new tools, and new vocabularies, in order to drive meaningful
organizational interrogation about underlying assumptions embedded in those sedimentary
structures.
Case Study Narratives
The following are narrative accounts of the change processes in the three sampled sites,
to which for the sake of confidentiality I have given the pseudonyms Hillside University, Valley
University, and Old Main University. In each case narrative, I begin with an overview of the case
that highlights key elements of the setting in relation to the broader PASSHE system, and then
describe any other general contextual factors that I found, based largely on my field memos,
were important for “setting the stage” of the change story. I then provide a narrative, drawing
heavily on quotes from participants and documents, analyzing descriptions they offer through the
lens of practice theory. I use sub-headings to organize the narratives around the most important
stages or elements of their process as I interpreted it. I conclude each narrative with a summary
and reflection on the nature of practice change in each setting.
Hillside University
Overview. Hillside University of Pennsylvania provides a compelling study of change in
the context of dramatic internal disruption and a powerful set of institutional logics that had
dominated institutional practices for decades. Hillside is one of several PASSHE universities
that experienced turnover in leadership during the course of the change initiatives. The
university’s former president had overseen a tremendous growth in enrollment, extensive new
capital projects, several multi-million dollar capital campaigns, and some of the university’s
sports teams’ rise to national standing. He also survived two faculty votes of no confidence.
Moreover, even those who viewed his efforts to expand the standing and prestige of the
university as visionary leadership reflected that his methods for doing so were autocratic and
heavy-handed.
The change in leadership at Hillside thus disrupted not only the structures of control
within the university but also the symbolic logic driving the way people within it had worked for
two decades or longer. From a practice theory perspective, it is clear how the former president’s
enactment of leadership had influenced the structural and symbolic elements of activity across
the university in complex and powerful ways. Interview respondents described, for example,
how all channels of communication went directly and unilaterally to the president’s office; how
no one was allowed to make decisions (even at the level of admissions, for example) without
consulting the president; and how the president’s “pet projects” and departments (such as
athletics) seemed to gain easier access to resources than others.
The context of change at Hillside University. Hillside University began its engagement
with the Equity Scorecard and Deliverology in 2012. The former president’s legacy, in all its
various forms, played significantly into the ways in which these initiatives functioned to re-
mediate practices around student success and equity. Like other PASSHE universities, Hillside
University had established programs for minority students, academic support centers, and other
initiatives to increase graduation rates. Compared to the other universities visited for this study,
however, the practices associated with the Scorecard and Deliverology at Hillside University
were less continuous with ongoing practice; in fact, in many ways participants struggled to
integrate the initiatives into their work at all because the university lacked an infrastructure for
committee-level work, decision-making, and data use. Respondents at Hillside described ways in
which the Scorecard and Deliverology enabled them to advance efforts that they would not have
had the tools or even the awareness to initiate previously. For example, one administrator noted
that:
I knew nothing about retention, success, leading indications, nothing, and I think that
worked to my advantage because I had no preconceived ideas of this is the way it should
be, because in my world there was nothing to say this is how it should be, because I
didn’t know anything… [W]hen I kind of took over [the Scorecard], the tone of the whole
university was changing, but our tone in our meetings changed from ‘what do we need to
do’ to ‘what does the data say we need to do.’ Because it was also one of the first times
we had good data to say this is why we need to do something.
In addition to centralizing decision-making across the institution, the prior president also
constrained the work of units within the university by tightly controlling the flow of
information. Institutional research, for example, had been carried out by one person who reported
directly to the president. No investments had been made in building data collection capacity, and
according to interview participants, no one really knew how that person collected the data or
whether the data were reliable. As one faculty member described,
One of things that we did spend a lot of time doing was trying to clear numbers, because
we had 3 or 4 different sets of numbers that we were working with and so the office over
at Office for Student Retention had one set of numbers, the admissions office had a
different set of numbers, the registrar's office had a different set of numbers…I think
that's something I know we can improve upon is having one set of numbers that when we
ask or you ask us, well, what is the percentage of different students, everyone is speaking
the same language…but [now] depending on which office you go to there's a different set
of numbers out there and you're just like, how do you do this, how do you ask the same
question at 3 different offices and they come up with 4 different sets of numbers?
As a result of the dispersed and ad hoc data collection across campus, there had been virtually no
systematic use of data in decision-making at any level; even at the outset of the
initiatives, the institution had very little reliable data with which to begin the process. In this
way, leadership practices from prior years constrained the re-mediating of practice by limiting
the resources available to those working in the institution.
Introduction of re-mediating artifacts. The data tools associated with the change
initiatives thus introduced practices for examining student success and equity that were, if not
entirely new, certainly extensively transformed. An administrator leading the change initiatives described
how one of the data tools (the Benchmarking Equity & Student Success Tool, or BESST)
provided at the outset of the Scorecard project defined for him one of the major tasks of his new
role as an administrator tasked with improving retention and completion rates and closing gaps
by race and income: “It was actually probably my first glimpse of the BESST thing… [That] was
probably my first connection with how do students matriculate and move through and leave?
And that then began to spring a lot of questions in my mind about what I do.” Such data tools,
including an extensive report prepared by consultants from Deliverology to help Hillside
University build student support programs based on robust evidence of student enrollment
patterns, thus have served as new tools re-mediating the practices of the new administrator and
his staff as they have tried to create new structures and policies around student success and
equity.
But because of the void in data-use practices prior to 2012, the Hillside University case
also provides an illustrative example of the importance of linking new mediating tools to
practitioners’ wisdom or judgment. Without prior institutional practices and structures to guide
practitioners’ engagement with data, for example, the experiences of data as a mediating tool for
practice varied widely and were moderated significantly by the relationship practitioners drew
between the objectives and values underlying the data use and their own professional judgment,
identity, and tacit knowledge. One faculty member, for example, expressed skepticism about the
validity of the data guiding the change initiatives—skepticism he attributed both to his scientific
training, which led him to question the data, and to his personal background, which
made him doubt claims about disparities in success for certain students:
It was frustrating, and the reason it was frustrating was because you had I think about 8 or
10 of us there sitting at the table and we were going through these different workshops
and I remember we had to set goals, but the problem for me was we cannot set a goal if
we don't know what the problem is … so we were trying to create equity and the
recruitment rate, the admission rate amongst African-American and Hispanics the same
level as the white students that were coming in and no-one was able to really show me
solid numbers to say that this is a problem…I think that's probably the scientist part of
me, that training that says that you can't just come in, and even from a [disciplinary] point
of view, we always tell our students, don't go into a community and fix a problem
without knowing what the problem is. So it was like, I like what we're trying to do but
until you can prove to me that is a problem, I'm not sold on this.
An administrator participating in the Scorecard and Deliverology teams had an entirely different
experience encountering the data because it resonated with his own personal experience working
abroad through a church-sponsored missionary project.
That experience in China really taught me to say there’s no difference between …
someone being Chinese, someone being this, someone being this, someone being this. So
when I got this position, and I equated, or understood what was going on in the Equity
Scorecard, those things actually came back to me, and I began to realize that we as a
University have to be careful not to say, “Oh, well you’re this and you’re this,” but we
need to make sure we have opportunities for all students to be able to be successful…I
began to say, “Why, why are there gaps?” And some of the stuff we’ve done now with
the Support for Success and the four-year grad plan, we are beginning to analyze things
differently to say there shouldn’t be differences.
The introduction of data into the practices of professionals within Hillside University thus did
not automatically point to a new set of solutions, structures, or ways of doing their work that
would yield better outcomes. Rather, the new re-mediating tools and objectives interacted with
(sometimes positively and sometimes negatively) existing professional judgment and wisdom
about students’ experiences at the university.
As the committee members at Hillside began slowly collecting data with the support of
the external initiatives and the system office, they also began developing new networks of
communication that had been impossible under the prior presidential regime. For example, two
administrators working in academic affairs and admissions described how they had previously
rarely communicated about work matters simply because they had no efficacy in decision-
making; all decisions went through the president. Now, as a function partly (though not solely)
of participation in the structured processes of the change initiatives, they “routinely” pick
up the phone and discuss ideas or develop proposals to submit to the provost. Other staff
members expressed a similar new sense of self-efficacy as a result of being part of a cross-
functional committee implementing the Equity Scorecard and Deliverology. For example, one
student affairs professional recalled her reaction to being asked to join the Scorecard committee:
‘Are you sure you want me?’ That was my reaction, was like, ‘you want me?’ … other
positions that were in the initial grouping, administrators, admissions officers, Provost, I
mean it just seemed like I was out of my league. I am staff, low staff and I just did not
see how I was going to be able to add anything of value to an echelon that was working at
a higher level than I was in the university. So, that was my, like, ‘oh my goodness, what
am I doing here,’ but then when we started to work together on some of the round table
discussions, then I could see where I could fit, where I could add and where I could see if
there were going to be any changes. I am out here in the trenches with those students,
those first-year, first-generation, not only the minority students… I have a closer
connection and a closer touch to the campus and to the students, whereas maybe some of
the administrators in that initial group may have been a little bit removed from the day-
to-day life of the student. So, then I was able to find my part and be more comfortable in
my position … being able to offer something of value to the [project].
This quote not only reflects this individual’s emotional experience of being asked to work with
people from different levels of power within the institution but also reveals the kinds of
dominant logics that had always structured her experience working in the university—that
decisions were made at the top, and that “low staff” (or indeed any staff) had nothing to
contribute to those decisions. It was not until entering into the activity setting of the Scorecard
committee meetings that her way of interacting with institutional decision-making was re-
mediated by a new activity setting (intentional within the Scorecard design) characterized by
diversity of participation from across functions and level of authority.
In addition to expressing her changed attitude about her self-efficacy vis-à-vis the
Scorecard committee, during our interview that staff member also described practices indicating
her involvement in new organizational routines around ensuring student success; notably, her use
of pronouns in describing her work shifted from the singular first and third person (“me/they”) to the
plural first person (“we”) as she recounted her narrative of her involvement in the Scorecard committee:
[I bring] the realism as to what are some of the struggles that the students share with us
when they come into the scheduling center, and not only do they come for scheduling
purposes, but sometimes there is a need for advising, so hearing some of their concerns,
seeing them in some of their struggles, you can see the difference or the disparaging
differences between those students who are prepared to be here against those students
who are struggling and you can see why there is the gap… So, now when we look and go
back to the admission process we are making sure that…we are bringing to [the Director
of Admissions] attention that ‘hey, we have students who are struggling with the ability
to read a text book and understanding what it is that they've read.’ So, when you are
considering admitting them, we have to also think about what are we gonna do with them
once they're in. How are we going to support them when they're here? Yes, you can
admit them, but how do we support them when they're in? So, we are having these
conversations with admissions now.
Here the individual is expressing a sense of efficacy in her and her colleagues’ new routines of
communicating with administrators in separate departments; notably, this reflects not only a new
mode of practice for her but, in the terms of practice theory, potentially a new structuring of her
practice (extending beyond her to those who do her job in the future) around new channels and
routines of communication.
Across interviews, participants expressed many similar examples of practices mediated
by new networks of communication across offices and functions as a result not only of job
restructuring but of their involvement in the change initiative committees. These new
communicative networks also interacted with and enhanced the mechanisms by which
practitioners’ own personal and professional goals and values could mediate their practices.
One student affairs professional, for example, pitched during a Scorecard team meeting the idea
of creating a new position for herself as a withdrawal counselor, a role that would meet a need she
saw for retaining students and draw on her prior professional experience. The dean agreed and moved
her from her prior role in disabilities services into a new position as the point person for all
students considering withdrawing from the university. Her professional identity is evident in the
way she described her job:
[P]retty much whenever somebody does make a request to withdraw, they a lot of times
will go to academic records, some of them just leave, so that's not the most responsible
way to go about it, but we are talking about people with differing levels of, you know,
coping and of, you know, decision making and that kind of thing… So, I meet with them
and just kind of sit back and this is where the rehab counseling comes in, just sit back and
listen. Why do you want to withdraw? Tell me what's going on. Pull their academic
history up. Get an idea of where they are financially. Get an idea of what's going on at
home, because, you know, I made this joke once, I said people don't just wander into my
office and go, "my life is awesome, I just need help at the math lab, everything else in my
life is perfect. Yeah, I just need a tutor for earth science." It never happens. There's
something going on with a roommate. There's things going on financially, things going
on at home, some are commuters, nontraditional. You know, they're on academic
warning, academic probation, financial aid-wise, which is a whole other thing, and I just
do student case management in terms of teasing out what are your, what's stopping you
from getting from point A to point B …
Notable, in addition to her explicit reference to her rehab counseling background, are her uses of
words like “coping” and “case management”; this language reflects how she now interprets her
work activities through the lens of her professional training, which she had not been able to
apply to her work previously. Through the restructuring of her position, her practice as a
professional (comprising all the activities she had formerly done, such as meeting with students
and linking them to resources) now takes place within a distinctly different activity setting such
that her own professional values and goals can now inform her work in new ways.
Enactment of competing institutional logics. Despite the flood of energy and
momentum unleashed following the leadership change at Hillside University and the many
changes it has brought, the dominant logics that circumscribed the work of the university’s
professionals for decades are clearly still present and in tension with the new logics emerging
from collaborative work, distributed leadership practices, and a focus on student success rather
than prestige and selectivity. An administrator described, for example, how his decision to
centralize academic scheduling within his office conflicted with enduring practices reflecting
prior logics that prioritized the university’s athletics:
[T]he other thing that we changed is that person that’s in athletics used to have the
registration ability that my staff has, which is, they can put anybody into any class any
time anywhere and override everything. They had that. And, um, I convinced the
president and the provost that we need to take that away from them because that is
inappropriate. Then because what they would do is we would build a schedule of a
schedule that students should have and look, “How did that get changed?” That didn’t
score me big points with the athletic department.
Though the administrator’s use of his authority to centralize scheduling (with the provost’s
approval) perhaps reflects a positive change of practice with respect to student success, it
nonetheless reflects a deep tension that may (if I had interviewed the director of athletics, for
example) serve to re-mediate others’ practices in ways that reduce their sense of efficacy or that
trigger resistance or frustration with respect to the goals of student success and equity. In a
similar example, one of the first actions taken by administrators at the outset of the Equity
Scorecard project was to change from an SAT-driven admissions process to a holistic admissions
process. As one of the student affairs professionals who was a member of the Scorecard team
described,
That was one of the things at that initial meeting we were discussing. What are some of
the barriers that we may encounter in trying to lessen the gap? And the admissions
requirement was one of those barriers that we initially found out, it was because
primarily…the SAT score was the strongest indicator of whether or not they were going
to be admitted or not. And that was an institutional barrier because we didn't think that
the administration at that time was going to lower the SAT score for students to get in.
But now that has all changed because they're no longer here and so now that institutional
barrier is no longer in place and so the SAT score is no longer just the primary indicator
on how to get in.
Here again, putting aside that the subsequent change to a holistic admissions process may be a
positive change in practice, it also conflicts with decades of practices and assumptions
throughout the university structured around the logic of increasing the university’s standing in
the rankings and its regional and national reputation. The incremental and iterative process of
change described by Clark (1998) and the general perspective of practice theory as to the
intricately linked networks of practice within organizations would suggest that such changes in
one department may continue to come into conflict with existing practices and the logics driving
them for years to come.
Challenges for on-going practice change. This necessarily incremental nature of
practice change also emphasizes the need for new practices structured around an ongoing
objective of learning; that is, creating activity settings specifically designed to support the re-
mediation of practices across the organization to support new objectives. Though there were
many positive changes within the specific units of those practitioners interviewed for this study
who were associated with the change initiatives, for example, it remains to be seen whether any
of these changes will in turn create learning structures in which new practices can be embedded
and thus potentially sustained and extended across the organization over time (Clark, 1998;
Sporn, 1999). This may or may not alter the potential for new practices to yield new outcomes
with respect to student success and equity, but this is one of the ways that the types of new
activity settings and structures generated around new practices (remembering that the activity-
structure relationship is dialectical and continuous) varied across settings in this study.
Reflections on practice re-mediation at Hillside University. Overall, Hillside
University is an illustrative case of changes in practice and the attendant transformation of
organizational structure in the wake of dramatic organizational changes associated with a change
in leadership. The unique context of the specific change initiatives studied here helped to draw
out, in particular, the powerful role of institutional logics in creating sediments of practice and
policy. The case also illustrates the potential for new tools and processes to harness the
professional and personal goals, values, and wisdom of individuals within the organization by re-
mediating the activity settings in which they work. By introducing new tools and objectives into
the work settings of a handful of individuals within the university, the Equity Scorecard and
Deliverology initiatives helped practitioners develop new practices better aligned to what they
believed were the important goals and values of the institution, thus enacting new logics to
compete with those characteristic of the prior leadership regime. The initiatives also supported
the creation of new networks and communication routines that may continue to re-mediate
practices associated with student success across different departments and settings of the
university.
The case also highlights some of the difficulties of practice change for the same reasons,
however. As the changes in practice at Hillside University, for example in admissions and
scheduling, reverberate throughout the organization, other counter-practices may emerge to
reinforce existing institutional logics that are equally strongly embedded in the ways others’
professional values and self-identity mediate their practices. Similarly, the isolated changes in
practice enacted by participants in the change initiatives may not extend into other settings
within the organization or create sustained new forums or learning structures for the kinds of
practice re-mediation that took place during the initiatives.
Interestingly, Hillside is also the only university out of the three where there was virtually
no mention during interviews of the accountability context of system performance-based
funding. In part, that notable absence may be a function of Hillside’s success under the former
president in creating entrepreneurial forms of revenue such as online degree programs, or it
could be a residual effect of the president having buffered the internal workings of the university
from all awareness of external demands. In any case, in comparison to the other universities in
this study where accountability mediated practice change by creating discursive legitimacy (Old
Main University) and creating discursive frames to inform the student success and equity
objectives (Valley University), the relative absence of an evident accountability logic operating
at Hillside could be either productive or limiting to ongoing change efforts.
Valley University
Overview. Valley University is one of several universities (the majority among PASSHE
institutions) located in remote and small-town settings relatively far from the state’s
major urban centers. Driving onto the campus of Valley University of Pennsylvania feels,
literally and figuratively, like arriving at the city upon a hill. The grand original main hall of the
university, which now houses most of the administrative functions of the university, sits atop a
hill covered in lush grass, brick paths, and hundred-year-old trees, and is clustered among an
array of other imposing buildings including a new state-of-the-art contemporary arts building.
Each time I was on campus over the spring and summer, there was also a bustle of new
construction that contrasted starkly with the quiet of the sleepy town around the university.
Several five-story dormitory buildings with expansive glass-walled lobbies lining the circular
drive around campus were being readied to house a new crop of students in the fall of 2014.
Beyond campus lie miles of rolling hills home to Amish and Old Order Mennonite farms;
occasionally, traffic down the main street is slowed by horse-drawn carriages headed to the
Home Depot at the edge of town.
Valley University’s idyllic rural setting has figured significantly into how the campus
community understands its challenges in enrolling and graduating students of color—in the
1970s and still today. As at Hillside University, Valley University provides a compelling
example of how dominant institutional logics come to be enacted and perpetuated through
practice and discourse; at Valley, new language was particularly important as an element of
practice re-mediation within the change initiatives because it provided committee members with
a strategy for re-framing discourse about students of color. Valley University is also a case, like
Hillside, that illustrates how practices re-mediated through new processes and tools have the
potential to harness and renew practitioners’ own commitments to equity and student success
rooted in their personal and professional identities and moral values.
Unlike Hillside, Valley University has had leaders—particularly at the provost level—
who have enacted a commitment to diversity and student success through an array of strategic
planning and other leadership practices (forming special committees and task forces, and so on).
The Valley University case is also distinct in that accountability pressures—both from PASSHE
performance-based funding and accreditation review—have played a significant role in shaping
and linking various competing institutional logics and objectives in ways that have directly
impacted practices at the university over the past six or seven years. In many ways, the change
efforts at Valley are part of a broad and coherent institutional strategy around student success
and equity that was developed in response both to performance-based funding and the
university’s most recent accreditation review. Practices developed or re-mediated within the
change initiatives have been both informed by and integrated into those larger structures in
ways that practitioners there experienced as both positive and potentially limiting.
The context of change initiatives at Valley University. Valley University had its most
recent accreditation review by the Middle States Commission in 2009. In their evaluation report
(Middle States Commission, 2009), the accreditation team noted that there was “considerable
evidence that the faculty and staff at [Valley] consistently demonstrate a deep sense of
commitment to students and their success” (p. 11) but that “recruitment and retention of a more
diverse faculty and student body should be a higher priority” (p. 4). Moreover, the accreditation
team noted that, while the university had a robust range of programs to support student success,
“The one area of concern was a significant gap of nearly 30% in the achievement of under-
represented students versus other students, as measured by both retention and graduation rates”
(p. 11).[10]
In the case of Valley University, accountability processes had thus prompted renewed
efforts within the university to address issues of student success and inequities in student success
in particular, and the accreditors’ assessment proved highly prescient of what committee
members would themselves discover as they later launched into the Equity Scorecard and
Deliverology initiatives.
Starting in 2009, the academic affairs division began a process of strategic planning that
resulted in an “Academic Master Plan” that now serves as a reference point for much of the work
done by committees and departments on campus (e.g., curriculum and assessment, program
review, budgeting, and so on). The authors of the Academic Master Plan (AMP) made explicit its
relationship to various dimensions of the university’s accountability systems: “The AMP goals,
objectives, and strategies are aligned with the core goals of PASSHE and the university,
suggestions and recommendations from [Valley’s] 2008-2009 Middle States self-study process
and visiting team report, and [Valley’s] PASSHE performance indicators” ([Valley University],
2013, Academic Master Plan, p. 2). The AMP also enacts discursively many of the dominant
institutional logics that emerged as mediating and moderating factors in this study, and in this
way it serves as a communicative link between the external and internal accountability objectives
and core university practices. Table 3 provides three examples of dominant logics that emerged
in complex ways throughout the data, with illustrative quotes from the Academic Master Plan.
[10] Indeed, at the outset of the Equity Scorecard project in 2012, Valley University was not only less likely to
graduate African American students (who had a six-year graduation rate of 46 percent versus 61 percent for white
students), it was also far less likely to accept and enroll black students than students from other groups in the first
place. Out of 1,265 African American applicants for the fall of 2010, only 405 were admitted (32 percent) and of
those, only 109 enrolled; by contrast, 80 percent of white applicants were admitted and 35 percent enrolled (data
were provided to the Center for Urban Education by the institutional researchers at the PASSHE System Office).
Table 3
Dominant institutional logics conveyed by Valley University Academic Master Plan

Institutional logic: Valley University is (or desires to be) prestigious and nationally recognized for academic excellence.
Illustrative quote: [Valley] University (VU) leads and serves its region as the foremost university in the Pennsylvania State System of Higher Education (PASSHE). The Academic Master Plan (AMP) aims to identify and articulate the goals and strategies that will allow [VU] to achieve its aspirations. The AMP envisions [Valley] as a university steeped in the liberal arts tradition, nationally recognized for its excellence in academic and professional programs that prepare students for engagement with a diverse, global society. (p. 2)

Institutional logic: The university is facing financial constraints and budget uncertainty.
Illustrative quote: Higher education is at a crossroads in Pennsylvania because of diminishing state funding. In this time of economic uncertainty, many argue that [VU] should not engage in academic planning, that the university does not have the money to dream. However, [VU] has a mission to provide an excellent education for its students and to prepare them for meaningful and fulfilling careers. The university must look ahead to the future and plan how it will maintain and improve its offerings, regardless of budget trends. Vision and improvement do not necessarily incur additional costs, and higher education will continue to evolve regardless of the economy. (p. 3)

Institutional logic: The university is committed to diversity.
Illustrative quote: Goal 2: Cultivate a learning-centered environment to facilitate students’ intellectual growth and success. Academic Affairs will: 1. Recruit a diverse and highly qualified student body. Goal 7: Cultivate an environment in which the curriculum and culture support accessibility for and inclusion of members of diverse and under-represented populations. Academic Affairs will: 1. Ensure equitable representation and treatment in all facets of university life. 2. Encourage open-mindedness and an appreciation of differences. (pp. 4, 6)
The Academic Master Plan, in serving as a key artifact in which accountability objectives
and concerns and the institution’s own history and system of values were merged, thus conveys a
series of dominant and often competing logics that figure centrally into current areas of practice,
particularly around admissions, retention, and improvements in student success and equity in
both access and completion. Indeed, partly as a function of the Academic Master Plan and the
agenda it embodies, Valley University had already begun addressing its access, retention, and
completion challenges among students of color by the time PASSHE brought in the Equity
Scorecard and Deliverology in 2012. In 2009, for example, the associate provost had tasked a
new retention committee, led by two of the division deans, with trying to get a handle on what
might be causing the gaps in retention and completion noted by Middle States. As a dean who
co-chaired that committee described:
[T]he interim provost at the time asked me to co-chair the Retention Committee. And it
was very ad hoc. We picked the membership basically trying to be as broad as possible,
but also people who were interested. And we just wanted people who were interested in
looking at retention issues. And we began to meet. And we really focused in on, you
know, are there any short term kinds of things we could bring to bear to just help increase
retention and improve retention for our students and are there any long-term things that
we could bring to bear…I mean we’re not experts in any sense of the word in terms of
what the issues are around minority versus majority. And so we really spent about a year
just studying. We felt like we needed to educate ourselves. We didn’t want to make
recommendations that would, you know, make the situation worse or would have no
impact or really just not make recommendation based on things that we just didn’t know
anything about. So we did things like establish reading lists. We did a lot of sharing at
the meetings, but we really didn’t come out with any recommendations that had any
impact because we really just kind of felt like we didn’t know what we wanted to do.
When the Scorecard and Deliverology were introduced a year or so later, that existing
retention committee was reconfigured into new committees dedicated to implementing those
projects’ structured processes. These committees also included faculty who had been conducting
research activities on their own related to improving the success of students of color, but such
activities had been dispersed across the university and not part of an organized change effort.
One faculty member, for example, described a study she had conducted because of her own
interests in the success of students of color, which then resulted in her being invited to join the
retention committee and, later, the Equity Scorecard team.
I thought, okay if retention is not happening what – I did not feel like I was hearing the
university do anything about it and it was just like this, you know, exodus of students
coming and then just leaving in droves. What’s going on? So I decided to find out what
our students themselves are saying because obviously the university collects data but I
didn’t even know what kind of data they were collecting. I was just really new in this
whole thing. It was just that I was very interested in retention being the fact that that was
happening. So we asked for a little money from the Social Equity office. We started the
study collecting data. We asked undergraduates to help us. It was just the whole going
to your dorms, give this to any African-American and Hispanic student. Let them answer
this. We just want to know what they thought, and then we interviewed [them] too. So
in that process I went and interviewed the Multicultural Student Association Director and
they … invited me to a rally just to encourage the students… At that point, one of the
retention committee members was there… and they said, “Oh, we want you to be a
member of the retention committee.”
This faculty member and others interviewed for this study had been engaged in a range of their
own service activities trying to understand the experiences of students of color or students in
particular segments (developmental education, undeclared majors, and so on), but their activities,
much like the retention committee, were largely ad hoc, loosely structured, and not integrated
into larger university structures.
The new change initiatives brought in by PASSHE thus represented in some ways
adaptation, integration, and reconfiguration of existing practices at Valley University rather
than wholly new structures or activities. There had been a flurry of planning and activity in
response to recent accountability measures, and the retention committee in particular had already
started to examine some of its probation policies, for example, and had drawn on a range of
expertise from across the university to begin—according to one administrator—to learn about the
university’s equity challenges. Those existing practices were transformed in significant ways as
they were channeled into the external change initiatives, however; not only were they brought
together into two linked initiatives under the charge of the provost, but the practices were then
also reconfigured into the particular structures and processes of the initiatives. For example,
where the prior retention committee had brought together people interested in issues of retention,
in assembling a new committee for the specific initiatives the provost included members of the
prior retention committee but focused specifically (with guidance from the Scorecard protocol)
on getting representation from key areas within the university. As an administrator described:
We really had representation from faculty, from administration, from various units
around the college, and so most of the people who were selected—or not selected,
actually asked—to be on the [Scorecard] group came out of that committee. I think the
provost worked with the faculty to pick others. I think that, you know, I think she was
pretty wise in the way she did that. It really is about pretty evenly mixed between faculty
and the administration. There are people who sat in key positions…Dean of Arts and
Sciences. We had our Dean of Admissions on the group. The financial aid director was
on the group. Representation from areas like Developmental Education. Representation
from the student affairs, representation from academic affairs – I mean it really was kind
of a pretty decently representative group, and I think that really helped move that group
forward in terms of its ability to get things done.
Another member of the team, a professor, expressed a similar sense of how interacting with
people from across functions helped him better understand the processes impacting access and
retention. Though he had previously served on the retention committee and done his own work
around issues pertaining to the retention of students of color, the Scorecard committee provided a
different context for learning:
[I]t was nice just sharing and hearing what other people … [B]ecause a lot of us,
especially in faculty, didn't really know. We didn't know what the hierarchy was. We
didn't really know who made the decisions and when they were made and we were
getting lots of data about, okay, here's the percentage of white students admitted and
here's the percentage of offers made… and so we had the numbers and we could eyeball
them, but we wanted to know at what point is each decision made. What's the
communication process like, are we just using e-mail, when do those e-mails go out? A
big thing we were working on was like incomplete applications and who handles those,
so we were just trying to get a better picture of what goes on in the admissions office and
with the counselors. It was eye opening, because like I said, I had no idea exactly the
process and how those decisions were made.
In these ways, the diversified participation from across university “silos” and functions served
to help re-mediate practices by bringing into interaction the perspectives and assumptions of
individuals embedded in separate communities of practice within the institution. In contrast to
typical committees, for example those examining curriculum or other “academic” affairs or those
dealing with “administrative” issues like admissions or student affairs, the composition of the
change initiative committees reflected a merging in communities of practice organized around
the holistic objectives outlined in the AMP and accountability metrics rather than organized by
function or division. The activities of these new “hybrid” committees were also distinctly
different from the isolated research activities conducted by various faculty members previously,
though those had had similar aims and methods, because of their links to university strategy and
the mediating effect of cross-function collaboration.
The second major way in which the structured change processes differed from those of
the prior retention committee as well as activities being conducted ad hoc across the institution
was that they provided a set of discursive tools that helped orient the committees around the
institutional role in advancing the access and success of students of color. In particular, the focus
within the Equity Scorecard on the concepts of “equity-mindedness” and “deficit-mindedness”
(Bensimon, 2005) became particularly important re-mediating elements as members of the team
at Valley University began to discuss issues of racial and ethnic inequities. Initially, many
members of the committee voiced assumptions based on dominant institutional logics about
student deficits as the primary reason for gaps in outcomes. As one of the faculty members on
the committee described:
That is something that we kept on going back into the [Equity Scorecard] manual and
saying, “Okay listen guys”…we kept on going back to, “No, it is from the angle of the
institution. What is the institution not doing. We know what the students are not doing.
But how is the institution…” So we just kept on going back to that and that really helped
the conversation. I think the more we strengthened that, the more people decided “Okay
I’ll give up. I’ll talk about the institution.” Because there was some reluctance… “It’s
not our problem. We’ve done everything we can.” And people used to come in and say,
“This is what my program is doing and I don’t know what else we can do.” And then the
question would go, “But is there something else you can do?” The student needs to be
here. Let’s not worry about what part they have to play but what part can you play. I
noticed as the weeks went by the moment people realized, “I am not going to be talking
about the student, I am going to be talking about the institution and my involvement and
what we can do,” then it started going really smoothly. The conversations just went
smoothly and I felt that that was the key to breaking those tense difficulties…So that was
how the conversation really changed and turned in that direction.
This quote illustrates how the emphasis within the Scorecard process on using communicative
strategies to reframe conversations about the issues of equity re-mediated the processes of
collecting data, interpreting evidence, and crafting institutional recommendations. In particular,
the new language brought to the surface an additional set of competing institutional logics
underlying the university’s largely unsuccessful efforts to enroll and retain students of color
historically – those of diversity and equity. I discuss the role of these logics below.
Navigating and enacting competing institutional logics. The shift in thinking aided by
a shift in language and discursive framing helped committee members as they began to identify
and challenge the dominant institutional logics driving the assumptions and practices of faculty
and staff across the campus with respect to the outcomes of students of color. As the campus had
started to confront the gaps in the rates of success of black students, especially in the wake of the
Middle States evaluation, for example, there was renewed attention on campus to the goals of
increasing diversity. Within the Academic Master Plan, one of the strategies identified as
“critical” (i.e., necessary to implement immediately) was to “Adopt a new set of recruitment,
tuition, and scholarship policies and programs to increase the academic profile and diversity of
the incoming student population, including traditional, transfer, non-traditional, under-
represented student populations, veterans, international, and graduate students” ([Valley
University] Academic Master Plan, 2013, p. 7). Notably, this goal reflects a logic of diversity—
that is, increasing representation of diverse populations—but not necessarily of equity in terms of
taking institutional responsibility for the experiences of the students who constitute that diversity
(Bensimon, 2005). In response to these imperatives, however, narratives had clearly emerged
across campus as to the causes of the university’s ongoing failure to recruit students of color and
the disparities those students experienced in retention and completion. As in the historical
responses many PASSHE universities had to desegregation planning in the 1970s, one narrative
in particular was commonly expressed with varying levels of skepticism across interviews
conducted for this study—not as respondents’ own views, necessarily, but as the dominant
explanation for gaps in access and retention: “they (black students) just don’t want to be here.”
As one member of the Equity Scorecard committee discovered when he began doing
inquiry with staff and administrators, that narrative was reinforced by the way faculty and staff
members enacted the institutional logic of diversity (i.e., enacted through their daily way of
doing their jobs). Many of the staff worked in programs supporting diversity initiatives, for
example, and many faculty members claimed to treat all students “equally.” But members of the
Scorecard team also talked to students about how they experienced the university and began to
recognize how their experiences of those practices diverged from faculty and administrators’
assumptions about them. As a faculty member from the Scorecard team described:
Well, I did 2 things, I did interviews with students, primarily African-American students
and then I had another faculty member do interviews with white faculty members on
campus, and we asked them similar questions and we were essentially looking for where
their interests converged and diverged…and they diverged clearly in a lot of different
areas. We started off very broad just about, you know, what are some positives and
negatives…and that was sort of fine. Then we asked them in broad terms “what's the
racial climate like at [Valley]” and that's where you first started to see faculty members
saying, “well, it's, you know it's fine…everyone seems to get along.” And the African-
American students were saying, “Well, it's okay, but it could be better.”
Through this type of inquiry activity, the faculty member found that perspectives diverged in
particular around the myth of students not “wanting” or being prepared for life and academics at
the rural university.
I mean, I think we recognized we're in a sort of small rural area, you know transportation
issues and that type of thing. And then when we got into some of the retention stuff and
in classroom climate stuff, that's where the faculty members were saying, “No,” you
know “it's fine” or “the students who come here, maybe they're underprepared” and blah,
blah, blah. And from the student's perspective, never once did we hear anything about
students being underprepared or not being able to do the work. It was more about the
climate at [Valley] and how they don't feel as welcome in the classroom and they feel
singled out in the classroom or they don't feel comfortable approaching the teachers and
the instructors. And the teacher said, “No, I treat everyone fairly,” you know, “I don't call
on anyone more than the other.” And I think that's where we first started to notice these
signs of, well, you know, this sort of color-blindness, like, you know, “it's not an issue
and I don't treat anyone differently,” …but then the students themselves were talking
about, you know, “I had to leave my major because of this,” you know, “I withdrew from
the class because of this and the teacher was just cold.” …And the faculty members did
not, not a single one perceived any of that, and in fact, instead of talking about race
issues, they wanted to go to preparation and low SES and first-generation stuff.
After starting to recognize these practices of “color-blindness” and the divergence between
students’ actual experiences and faculty’s assumptions about them, the team further interrogated
other institutional logics that seemed to be reinforcing the color-blind assumptions at the
university.
I can't remember the other place we went, but then we talked to some of the support staff,
some of the front office people and started hearing similar things, “not that big an issue”
or “it's just,” you know, “they just have a rough time transitioning to [rural] PA,” but it
wasn't anything particular about their office or the university. The university puts on tons
of programs. The university is welcoming and all this stuff and then another thing we did
is, we kept hearing about the programs and the cultural diversity programs and things like
that at [Valley]. And so I just went back really quick and looked at all the ones that fit
under cultural diversity programs for the previous year and found like 80-85% of those
programs were cultural diversity programs aimed at educating white students about other
cultures, so it was like, you know, Native American history seminar or something like
that. It was like cultural enrichment programs or cultural education programs, but it
wasn't programs focused on specific populations, so we're gonna have this poet come in,
because you know, this poet is sort of well-known in this community or we're gonna have
this musician or we're gonna show this film because it's popular with this group of people
and as part of our campaign to be inclusive, so it was that type of thing and so the
university I think has a different idea of what a diversity focus is. Their focus seems to
be on getting the white students educated as opposed to making minority groups feel
more welcome and comfortable.
In other words, the team began to recognize the difference in the way that a logic of diversity
rather than a logic of equity was being enacted within university practices.
These reflections portray a number of elements of the change processes around equity
and student success as they began to unfold at Valley University. First, in ways similar to but
also different from those observed at Hillside University, the competing institutional logics of
diversity, equity, and prestige, as well as others, moderated the process of practice re-
mediation through the change initiatives not only within the committee but also as those
individuals branched out and began addressing inequities with other members of the campus
community. Indeed, these logics and the tensions between them were evident as mediating
factors across the campus community. These logics were deeply informed by history (Valley
University’s administrators made similar claims about black students not wanting to be there in
the university’s reports to the state during desegregation planning and monitoring). They were enacted
discursively through explicit references within the Academic Master Plan, which served as a
reference point for many of the activities on campus, and they were reinforced through some
faculty and staff members’ expressions of a belief that the university was already addressing
“diversity” through an array of cultural awareness and student support programs as well as
“equal” treatment in academic settings.
Second, this particular faculty member’s narrative around faculty perception illustrates
how the new processes and language specifically designed to address learning around issues
of equity were critical to the re-mediation of practices at Valley. Specifically, the language and
process provided by the Equity Scorecard figured significantly into the re-mediation of practices
as they provided the committee members with a new way of framing or contradicting those
dominant institutional logics. In nearly every interview at Valley, participants used or made
reference to the idea of institutional responsibility and the importance of refocusing practice
change around addressing the institution’s role in perpetuating unequal outcomes for students of
color; that discursive tactic, gleaned from participation in the committee, provided tools for
exposing contradictory logics or enacting new logics around equity and change.
Data-use as an “inter-mediating” practice. Data had a significant role as an artifact
mediating the activities of the Scorecard committee meetings at Valley, particularly in helping
get all members of the committee oriented to the nature of the disparities in access and
completion. But in addition to directly mediating the groups’ activities (for example, by helping
focus attention on a particular equity concern or framing the objectives of the initiative), data
also served as a sort of intermediating tool between the activities of the committee and the
practices of various departments on campus with which the committee interacted.
Though Valley University (unlike Hillside) had a developed and robust data-collection
capacity and a skilled institutional researcher, the team members found that data necessary for
answering critical questions about the success of students of color were rarely if ever provided to
the campus community and were hard to define and interpret. As one administrator described:
We spent a couple meetings really honestly just number crunching. It really was just
sitting down…We had a projector in the room, and we put the numbers up on the screen.
And we just all stepped through to make sure we understood just exactly what we were
seeing and why we were seeing that. I think we had some meetings where we really
came in with an agenda, and then we never would do the agenda because there would be
something else that would come up, and we would just basically talk for 2 hours… I
mean, there were times when we weren’t really sure what data that we wanted. Or we
would ask him for interpretation. You know, one of the hard lessons that we learned is
that not all data is the same. You know, it really depends where you got the data from,
and it really depends upon the perspectives around that piece of data.
The administrator’s comments are consistent with research on data-use and change (e.g., Coburn
& Turner, 2012; Dowd, 2005) suggesting that organizational data are not only often confusing,
even at a superficial level, but also are not self-evident as tools of practice and must be
interpreted within situated settings of organizational values in order to become meaningful as a
form of evidence to inform action. Through the interaction between committee members and the
conversations with the institutional researcher working to tease out shared interpretations,
accurate definitions, and plausible meanings of the data, those data thus became effective as re-
mediating artifacts—directly within the settings of the committee meetings.
Data were also a particularly important artifact in the committee members’ learning about
the role of admissions policies and their impact on students of color. After having looked at data
on the acceptance rates of black students, the committee asked for additional data to help them
understand the driving factors behind admissions decisions; this led them to look at
data on the role of the SAT in the admissions process. As one administrator described, the
findings were surprising to him:
I think the admissions process surprised me. Uh, I learned a lot about SAT for example.
I mean, this really forced people to go out and look at SAT as a predictive tool. I learned
a lot in terms of how not to use SAT as a predictive tool. Because I think that, you know,
people—folks want to rely on that as the be-all end-all of academic success. And it’s not.
And the more you get into the data, the more it becomes clearer to you that there are far
more predictors of academic success than an SAT. So I think in some sense that was a
surprise. You know, intuitively you know that, but until you look at it, it’s like, “Whoa,
this really does correlate with social economic status.” This really, you know, you could
have a student who has a modest SAT score who comes into school and just excels, you
know, and find out that the student is very motivated, that they were number 3 in their
academic class and so on and so forth. They just weren’t a good test taker. Or they only
took the test once and that was the one that counted in their admission score. I think that
was a surprise.
A faculty member on the committee expressed similar frustration after having discovered the
link between the data on SAT scores and admissions rates for students of color, emphasizing the
role of the SAT-driven admissions process in reifying tension between dominant institutional
logics around revenue, prestige, equity, and student deficits:
I mean one part of it is that you know, students do have lower scores …but the big thing
is we have to meet our enrollment numbers because that's where the money comes from
and so, you know at times, they do make adjustments to like SATs and things like that if
we're sort of missing our enrollment numbers and I think the faculty think automatically
that, okay, they're letting in students who have lower SAT scores, which means those
students are less prepared…not necessarily, right? If SAT is the only metric, which it is
and that was the first part of our [report], like why are we still using this as the only
metric measure of a student's academic readiness when there are so many other
mechanisms out there. … We look at ACT, we look at SAT, we look at GPA, we look
at, you know, do they have these required courses and that's what goes into the file and
there's no other, there's nothing else there. There's no attempt to get anything else in a lot
of cases either, so.
After looking at the data on how well SATs predicted success for students of color and working
at length with the admissions office to understand their own incentives, accountability metrics,
goals, and professional judgments about their role in creating more equitable access, the
Scorecard committee made recommendations to the provost for changes in admissions practice.
But those recommendations reflected changes that had in fact been long desired by the
admissions department, with respect to moving towards indicators other than SATs, based on
their own professional judgments. Previously, however, these concerns had been largely ignored
given the dominance of the logic of prestige and “excellence,” which reinforced the objective of
striving for a high average freshman SAT score in order to boost the university’s ranking.
The structured process of the change initiative, which was rooted in data use as a
legitimating practice and tied closely to the objectives of the university’s Academic Master Plan
and accountability-based priorities, thus introduced a new communicative channel between the
admissions department and university leadership. In other words, the process re-routed some of
the interests of the admissions department through the legitimating practices of inquiry and
reporting conducted by the Scorecard committee. Data thus not only directly re-mediated the
committee practices but re-mediated the communication practices and influence networks of
the larger campus by providing a data-based link between the admissions practitioners’
professional judgments and the objectives of leadership with respect to closing equity gaps. A
similar finding emerged at Old Main University (described below), suggesting that the structured
process of data use afforded to the committees in these initiatives may serve to legitimate local
practitioner judgment or preferences within departments and thereby re-mediate practices in
those settings indirectly.
Resource constraints as a moderating discourse. Though the re-mediation of practice
through language and data in many ways helped the committee members identify and challenge
some of the dominant logics restricting change, in other ways contradictory institutional logics
moderated that practice re-mediation process by circumscribing or dichotomizing potential new
forms of practice. Particularly when the committee began to discuss the content and
communication of their recommendations, the process of prioritizing and agreeing upon those
recommendations was clearly mediated by an institutional logic concerning resource constraints,
funding cut-backs, and concerns about revenue. Indeed, these financial logics also informed the
framing of the change objectives—by the deans leading the committee and presumably by the
provost or president before them. They are also clearly expressed within the Academic Master
Plan, which noted that “Higher education is at a crossroads in Pennsylvania because of
diminishing state funding.” Drawing on Prior’s (2008) notion of “documents in action,” the
following description by a faculty member of the way the Scorecard project was introduced to
the committee suggests the movement of the language around revenue and planning from the
AMP into the practices of campus committees:
[T]he numbers were discussed… so that we could see the gravity of the situation. So that
we would know where are we coming from. So at the beginning it was what are we
looking at… Then we went – we talked about, how does that lead to money if we were
able to retain so many? I think that was a motivating factor, that we can say, there are
only 20, and that looks, like, okay that’s a few students. But what does 20 translate in
terms of how much they bring and if they stay throughout. But they also talked about
how that impacts the rest who are coming. If more students are staying then others would
say, “well I can make it and stay.” So it was money. It was how that translates as it goes
over the years… how it can get more students and that number increases.
The resource logic not only framed the objectives of the initiative in a particular way but also
framed the potential outcomes—notably, not in terms of new practices but in terms of potential
new expenses for programs or services. Two different administrators reiterated the role of
financial feasibility in prioritizing the actionable recommendations generated by the committee.
In the words of one administrator:
I think we looked to, you know, is this something that we’re gonna recommend and then
it’s gonna bankrupt the institution. I mean, can we really make a recommendation to do
X, Y, and Z and believe that there’s enough resources to do something about it. Uh, now
saying that, I would say after we went through the process, even if it would’ve cost some
dollars that would not have been a reason that we would’ve not made a recommendation.
I think that—I think most people were there to make a difference. I mean, it was sort of
the moral imperative that really drove people much more so than the bottom line
appearance. And everybody recognized that if we make changes, then certainly it helps
the bottom line of the institution, but everybody was able to say well if we make changes,
it’s because we should make these changes.
Reflecting the tension between the logics of equity and financial constraint in the work of the
committee, this administrator later suggested that financial concerns did not play into the framing
or enacting of the work, despite the fact that revenue and financial constraints were referenced
throughout interviews.
And, you know, we didn’t have a lot of discussion about dollars and cents. We had a lot
of discussion about what’s right and what’s not right. I think that that is very telling.
Administrators, in particular, seemed to be struggling to navigate these competing logics in the
ways they framed and communicated the institution’s practices around equity, and the switching
between those different frames of reference clearly impacted the practices of the group as they
worked towards recommendations. Though the administrator emphasized the role of “moral
imperative” (more on that below) in making choices about what to recommend, comments from
other members of the committee (and in informal conversations with others across campus)
clearly reflected the dominance of a university-wide narrative about fiscal constraint (as surely
exists at every public university) that almost pre-emptively restricts potential changes to “cost”
and “no cost.”
The role of moral values and professional identity. Despite the apparent tensions in
the administrators’ accounts between “moral imperatives” and cost concerns, there was evidence
across interviews that confirmed the Middle States accreditation team’s assessment that indeed
“the faculty and staff at [Valley] consistently demonstrate a deep sense of commitment to
students and their success” (Middle States Commission, 2009, p. 11). As was the case at all three
universities visited in this study, participants’ professional identities and moral values
significantly mediated their practices both within the activity settings of the Scorecard and
Deliverology committees and elsewhere in the activity settings of their regular work. The faculty
member quoted above, who initiated studies of students’ experiences at the university out of her
own professional interest and concern, expressed eloquently the role of such values in her work:
This diversity thing … I had done a program a few years back. I was involved in a
program that worked with high school students… Of course, as you go through this you
start to see the numbers. And we are talking about minority students coming to college.
… We were doing programs, bring minority students into the university to kind of show
them what happens here hoping that it will translate to coming to university. On that end
I was trying to bring minority students to my university so they can later come back when
they finish high school. Then I discovered that they come and they just leave again.
Why? So wanted to stop this whole coming and staying, because I’m interested in that. I
think that they should be in school. Being successful - I had to be in school and be
successful and if I can help I can help in any way for minority students to come to college
and be successful…because I understand the pain of coming to college and not being
successful. To me that’s painful. I think I have the ability to help and if I don’t have the
ability I can direct them to resources. “This is why you need to stay.” I think it just
comes from that fact. My experience. I have children in college and it’s just a painful
experience to think that your child is going to drop out and education is the only way you
have for them to survive in this economy. In this – whatever. So I think that kind of
informs my interest doing it. I can do it every day.
This faculty member’s moral values around ensuring that students of color succeed—anchored in
her professional identity as an employee of the university, in her own racial and ethnic identity,
and in the association she draws between her students and her own children—motivated her
tactics for examining institutional policies and practices. Those personal moral values mediated
her engagement with the work both inside and outside the Equity Scorecard committee and
specifically informed her own tactics in enacting the committee practices. Within the Scorecard
committee, specifically, her engagement was mediated by a sense of self-efficacy and self-
awareness rooted in those values, particularly as a woman of color:
I am generally comfortable because I have learned to speak out over the years... So my
discomfort comes when I’m not sure that the people I’m working with are genuine and
really mean what they say. But with the team I’ve been comfortable and I’ve spoken my
mind…I’m the passionate, emotional, do it yesterday, what’s so hard about that. Why are
you struggling with this thing? It’s obvious, you know. So I’ve kind of held myself back.
They know that when I raise my hand to say something it is out of passion. The data is
there but it’s common sense too.
Another faculty member who had served on the pre-existing retention committee described
explicitly how the re-mediating process of the Equity Scorecard project aligned with his own
professional values as a professor:
What I understood…[the Scorecard] was what we were trying to do with the retention
committee and now it was going to be a little more data driven, so we were going to,
instead of just thinking about and brainstorming processes, we were actually going to use
a methodology and look at some data that was going to help us make some more
informed decisions, which as an [academic], I was all for. I want to know what's been
done before and the evidence before we start making changes, so I was really interested
in the process overall.
Later, that same faculty member described how the process had helped to provide new
communicative strategies for him within his own practice as a professor and department chair.
I try and bring [equity] up at any chance if I notice something going on with decisions,
with meetings. I teach a class on prejudice and discrimination too, so I think it's more on
my mind than other people, but we have a good department who is aware of these issues.
… With our syllabus, we had a discussion about our syllabus…last semester we did like
a brown bag where this article I found that had to do with the culture of independence on
college campuses and how that may not fit well with interdependent individuals and it
talked about things like the syllabus where it said… you're responsible for this and it's on
you and we're focusing on this sort of independent way and we talked about how we can
structure our syllabi maybe so that it's a little a more interdependent or group focused or
it's “we're in this together” type thing... Also, [as faculty] we need to research…and so
we have teams of students and one faculty member said, “well,” you know, “I hand pick
my researchers and the students who work for me and they're the best of the best and I
think it's an honor type thing,” and I said…“that's good, but you know those students are
gonna have opportunities because they have the 4.0s and because they have that…but
what about a student who maybe doesn't excel in the classroom but you never know how
they're gonna be in a lab.” I try and pick students who I think may need that extra, you
know, if they're struggling in the classroom but I see them working really hard, they're
really motivated, well, what if I had them as a research assistant, right? …[A]nd I was
like maybe we should think about—maybe it's not a merit thing and maybe it's not we
just take the top tier students on as our research assistants, but think about how maybe
that research experience could help other students as well…And, oh, you know what? It
made it into our 5 year review… our department recently had to do a 5 year self-study
and I took the section about sort of equity type stuff and one of things I did is, I just
looked at all the students who are enrolled in our research class, who got selected to be in
our research class over the last like 3 years and just by eyeballing the names… I just sort
of went through and picked out the number of, from what I could tell were ethnic
minority students and it was like 2 out of 45 or something like that, and so that's the type
of things I sort of put in our report, you know, maybe we're doing a disservice, maybe
based on the proportion of minority students in our department, how come only 2 have
been research assistants? So I'm just trying to find little areas to point out to our
department about ways that we could do a better job in serving our students and you
know, ethnic minority students in particular...
What this faculty member described here is essentially a repetition of many of the processes and
inquiry techniques used within the Equity Scorecard committee process, along with the
mediating artifacts (data, syllabi review protocol, and so on) that had helped to re-mediate the
practices of the campus-wide Scorecard and Deliverology committees as they looked at equity
and student success issues at the university. By linking new “technê” or “know-how” with his
own existing wisdom, this faculty member could then potentially re-mediate other practices
within his department, applying those new tools, processes, artifacts, and language to
departmental practices impacting students of color. Through these adaptive tactics, he has re-
mediated departmental practices with new processes of data use and new discursive tactics in
dialogue with colleagues that have allowed him to re-frame the objectives of the faculty-student
research relationship. These re-mediations were then reified into a structural artifact of the
department—the five-year review study—which may serve as a resource that shapes practice in
the future.
Reflections on practice re-mediation at Valley University. Because of the clear
presence of an array of dominant institutional logics that sit in tension with each other and with
the objectives of increasing student success (in particular at Valley, success for students of
color), one of the most powerful re-mediating aspects of the change initiatives here was the
language and processes that allowed the committee members to begin enacting counter-logics
around institutional responsibility. Several members of the committees, both faculty and
administrators, noted examples of how they have changed their own practices as a result of
participating in the initiatives. The case is thus an interesting example of how new discursive
tactics may be effective in creating space within universities for practice change.
The case also reflects the difficulty of embedding the re-mediating process itself within
new structures. At Valley (as at Hillside, though in different ways), the re-mediated practices
enacted by the committee over time have yet to transform (in visible ways, at least) the structures
that shape activity across settings or university-wide. Instead, committee members seem to be
looking to senior leadership to adopt their recommendations and see them through. As one
administrator described:
If you make a recommendation to the president, the president as the chief executive
officer of the institution has the authority to charge his management team with the
implementation of those ideas. And, you know, is it the place of me or anybody else in
the committee to go, for example, into admissions and say, you must do this a particular
way? Probably not, right. We feel like, well, the admissions people are the experts in that
field. We should identify the problems for them. We should circle back. I mean, we
should re-look at the data, you know. I think [the Scorecard committee] feels like it’s off
the hook in terms of, you know, we made these recommendations and now were done.
No. You circle back, you look at the data periodically to see are the recommendations
that you’ve been making having an impact, positive, negative, or not. … And if they are
having an impact, you know, sort of letting the president know, well, you know, here you
go, these folks seemed to have done what we’ve asked them to do. We made these
recommendations. They don’t seem to be having any effect at all. We’ve made these
recommendations and, oh my god, this is awful, you know, the gap’s getting bigger.
What other kinds of things can you do? And so, you know, I think that’s the circle that
you bring. But it’s utilizing existing administrative structures.
This deference to top leadership and the reiteration of practices within existing administrative
structures is pragmatic, certainly, and a similar view was expressed in comments from others at
Valley and other universities. But what this perspective neglects, from a practice theory
standpoint, is what non-programmatic changes might be made in structures to create wellsprings
of ongoing practice re-mediation across campus—not to tell admissions officers what to do, for
example, but rather to embed the processes of collective learning through work re-mediation
within existing structures so as to transform them—as some members of the committees have
done within their own practices.
Aside from the instances described by faculty members in their own departments, there
was thus little evidence of the creation or reinforcement of “learning structures” in which new
practices could be embedded. Such learning structures may not be critical to change outcomes;
indeed, with the effective leadership evident at the university and the continuation of practices at
the administrative level with respect to accreditation review, performance-based funding, and the
other concerns that brought the change initiatives to campus, much could change as a result of
recommendations made by the committees. But the nature of the change process at Valley
(particularly in comparison to that of Old Main, below) is notable for the extent to which practice
re-mediation remained a function largely of the isolated activity settings of the committees,
despite the apparent integration of those initiatives into the larger strategic planning discourse
and practices guided by the Academic Master Plan that serves as a communicative link between
larger accountability goals and campus-level practices.
Old Main University
Overview. Old Main University is distinct from other members of the PASSHE system
in many ways. Though historically it shares its origins as a normal school with the other regional
universities, Old Main’s geographic location as one of the PASSHE institutions that sits in
proximity to one of the state’s urban centers has contributed to patterns of growth and cultural
development different from institutions in other regions. Old Main is among the largest
universities in PASSHE, in terms of enrollment, and is also one of the most selective in
admissions. Old Main also has among the highest graduation rates in the system, for all students
as well as for students of color in particular.
The campus of Old Main University reflects its suburban location and is thus also
notably different from many of the primarily rural or small-town PASSHE university campuses.
Old Main’s several city blocks within a relatively affluent neighborhood are densely packed with
ivy-covered buildings and stately old houses converted into administrative offices, student
activity buildings, and fraternity and sorority houses. Several new nine- or ten-story dormitories
have just opened on the center of campus. On the days when I visited, the campus’s main quad
was packed with students lining up to buy lunch from food trucks; the campus was both notably
more crowded, and notably more diverse, than the other universities I visited.
The same energy and intensity felt on Old Main’s campus was evident in its faculty and
department offices, and in the way the professionals I interviewed engaged in their work both
within the change committees and elsewhere. Not only do individuals within the university see
their institution as different from other PASSHE universities (as expressed in interviews),
they also enact their work differently from those in other universities in some key ways. In stark
contrast to Hillside University, in particular, the nature of work at Old Main University appeared
to be significantly guided by an institutional logic that faculty and administrators there
repeatedly referred to as “distributed leadership.” This logic, enacted through the leadership
practices of a prior president who delegated significant budgetary and decision-making authority
to divisions and department administrators, has noticeable implications for practice within the
university—particularly with respect to the change initiatives studied here. Old Main, in contrast
to the other two universities in this study, thus offers an example of practice change within a
social and cultural context characterized by urgency and collective energy around decision-making, and by the intensity of a relatively diverse, metropolitan state university.
The context of change initiatives at Old Main University. The energy and urgency felt
at Old Main University was evident in the level of attention the campus has given to concerns
about student success and equity historically. Perhaps reflecting the way that a logic of
distributed leadership is enacted in practice, faculty and administrators at Old Main each
recounted numerous efforts—both individual and department- or university-wide—in which they
had worked over many years on issues broadly related to improving retention, equity (not only
concerning race/ethnicity but also for LGBT students and others), general education reform, and
so on. During the timeframe of the change initiatives studied here, faculty and administrators on
the committees were participating in numerous other committees and activities and noted many
ways in which they had extended the work of the change initiative committees into those other
settings.
Thus at Old Main, the practices of the change initiatives were, more than at either of the
other universities, reconfigurations of existing practices targeted to similar objectives. As an
administrator who co-led the Scorecard committee suggested, a pre-existing Retention
Committee as well as a Curriculum and Academic Policies Committee and other efforts she had
participated in involved similar activities and objectives, with varying levels of success:
[W]e decided that a good number of the people from the Student Retention Committee—
to make it short, SSRC—a number of these people would be part of the Equity Scorecard
because the Retention Committee already had a lot of experience over the two years prior
to that. We had been looking at a lot of data, and we had developed recommendations.
We actually had a system in place of getting information. The director of institutional
research was a member of the retention committee… One important thing that we did that
is very similar to the Scorecard is we analyzed policies. Something that most Retention
Committees wouldn’t be doing, but we did do that… And so we have been able to change
a couple policies as a result of that… the probation policy was changed, and the practices
around probation were changed as a result of what we did. There were a few data
policies that also changed in how we were reporting things. So it was a very nice
preparation for doing the Equity Scorecard. And it might be the reason why we were
able to jump in and do this.
Perhaps because of the sheer volume of activity around such issues, however, that history
of practice (particularly in committees) appeared to have created a level of skepticism among
many participants. Though the administrator had been able to change a number of policies due to
her influence on various committees and a level of autonomy in making decisions about policies
and practices within her own division (structured by the distributed leadership logic), others
involved in similar initiatives felt that they had spent a lot of time doing committee work that
never produced change. One faculty member described the influence of that experience on the
early tenor of Scorecard committee meetings:
I think the interesting thing is people on our team had different levels of experience but
also they had a different longevity at the university and I think the biggest tension was
sometimes it would come back to, well…we’ve been at this point before. We made this
suggestion. It’s not going to go over because we have suggested something like this. I
think – it’s how much of the work that we are going to do is going to have impact at the
cabinet level.
Another faculty member, who co-led the Scorecard committee and was also a member of the
Deliverology committee, described her work trying to recruit other faculty members to
participate:
I was able to take the faculty members who we were asking to be on the committee and
try to convince them that this was worth their time, right [laughter]? And I did that by
promising that if we didn’t get anything done we would dismantle the committee and be
done with it. That we weren’t going to spin our wheels for two years and have nothing to
show for it.
The Scorecard and Deliverology initiatives were thus introduced at Old Main into an ongoing set
of practices, and practitioners’ experiences of those practices significantly mediated their
attitudes and expectations about the work of the initiatives. On one hand, this extensive historical
layer of practice and its attendant structures in many ways generatively mediated the work of the
initiatives—for example, as the administrator described, the committees had an existing
relationship with the institutional research office and there were existing networks on which to
draw for new members, as well as a familiarity among those members about the goals and values
that framed the committees’ work. On the other hand, however, the cumulative experience of
devoting time to committee work that never produced change had made participants skeptical.
This backdrop is important for understanding some of the key ways in which these
particular change initiatives served to re-mediate the practices of committee work devoted to
student success and equity at Old Main University. Based on the literature, the conditions at Old
Main were in many ways ideal for change; a combination of committed senior leadership and a
pervasive enactment of the logic of distributed leadership could be expected to facilitate broad
engagement among dedicated members of the university community and create a strong
“adaptive capacity” in terms of budgeting and decision-making structures that would promote
change (e.g., Clark, 1998; Hearn, 1996; Levine, 1980; Sporn, 1999; Zemsky, 2013). And yet,
while some members of the community had been able to realize change by enacting their own
authority within a distributed leadership logic, others experienced these institutional conditions
as creating a lot of activity with no real identifiable change as a result.
Accountability context as a legitimating factor in practice re-mediation. The
Scorecard and Deliverology projects were introduced at Old Main University with little clarity
around their purpose or intent. Most participants only understood that they were being
implemented by the PASSHE administrators and were linked to the universities’ performance-
based funding objectives. In fact, unlike other universities where the initiatives helped
reconfigure and advance nascent efforts, the introduction of the initiatives at Old Main
University actually disrupted recommendations generated by the prior retention committee to
develop a new set of interventions to support persistence among African-American men—a
proposal that had already been approved and was about to be funded when the new initiatives
were announced. A faculty member who had led the research and development of that program
and then became a co-leader of the Scorecard committee described her frustration:
So my subcommittee looked at success among our African American men, and we wrote
a report….and we developed a proposal for a multi-faceted, pretty comprehensive
program that would be focused on African American men on campus. And it got a lot of
attention, it was also about the time that we were going through Middle States evaluation
so it got some attention in that piece, and they had all this money that they were about to
put towards it to develop the proposal we had made into something that was more action
oriented and usable. And it got stalled because then, just as the money was about to be
handed over, essentially, we got word that we were going to embark on the Equity
Scorecard project instead… I think that the idea was if PASSHE was about to put all this
money into a real structured program that it was too premature to jump in and put money
in one place before we had seen what was going to come out of Equity Scorecard.
Conveying her frustration and referencing her experience in prior committees, she then also
signaled how the Scorecard was ultimately different:
That’s more typical, I think, of committee work on campus. There’s a lot of time and
effort and spinning and then…it rarely feels like something comes of it. And the Equity
Scorecard project was different in that way, and so, there’s a balance, right? It did usurp
this other thing [laughter] but on the other hand, I think a lot of action actually came out
of it, which is unusual, so.
Other members of the Scorecard committee who were equally skeptical initially expressed a
sense that the project offered an opportunity to advance their prior work around student success
and equity. A student affairs professional described her early impression:
So I knew it was going to be a lot of work, but that was okay, because again I felt it was
the best opportunity for us to have an impact on this. I had served on the Retention
Committee. We had done a lot of studies. We had done a lot of subcommittee kind of
work and we really had almost hit a brick wall with, it’s not really going much further.
So this really seemed to be an opportunity to really have an impact on change.
Despite the fact that the structured change initiatives at Old Main represented—to a greater
degree than at the other universities in this study—a continuation and reconfiguration of
existing practices, these initiatives created a different sense of potential and expectation among a group of professionals who had grown skeptical about the impact of those existing practices and weary of the work itself.
Two factors seemed to account in large part for this different set of expectations and the
ultimate sense that more was accomplished through the Scorecard than through prior efforts. The
first, I argue, had to do with the nature of data collection and use in the initiatives (I discuss that
at length below). The other factor I gather from the case was that the relationship between the
initiatives and the broader accountability context at Old Main University created a unique set of
conditions for the practices of change efforts there. As a point of comparison, I suggested above
that at Valley University the accountability context had indirectly mediated change practices by
linking various (often divergent) institutional logics, particularly through a strategic planning
process. At Old Main, the accountability context served a different mediating role; here, the fact
that the initiatives were linked to performance-based funding and thus sanctioned by the
president provided a discursive legitimacy that directly re-mediated the existing practices of the
retention committee. The interaction between this accountability context and the distributed
leadership logic at Old Main, I believe, may have facilitated this unique mediating function.
After the administrator had described many of the activities the prior committee had undertaken
that were similar in method and objective to the Equity Scorecard, for example, I asked her if the
Scorecard felt like a redundant effort:
No, it was not redundant…the Equity Scorecard gave us, I don’t want to say credibility,
but gave us an entry, it gave us more of a standing in the community. One thing was the
Retention Committee wanting to look at things as a subcommittee of the enrollment
management group, and something else is a group charged by the president to look at this
and making the direct connection to the key performance indicators. So it gives a little
bit more clout to go out there and get the data a little faster than we normally would.
A student affairs professional also expressed how she has used that legitimating tactic to get data
related to her practice as an advisor to undecided students; whereas previously the university
would not allow her to access reports containing data disaggregated by race and ethnicity, the
context of the Scorecard project had provided her a new discursive resource and authority within
her practice:
I went back to everybody who wouldn’t let me change my reports… started the process
all over again with can I have the changes… and I was able to say “Equity Scorecard”
and “our performance funding requirements from PASSHE”… “my Dean expects me to
be tracking this. Can I please have this?”
I also observed this legitimating discourse enacted as a strategy for engaging additional members
of the campus community in new data-use practices. Over the summer, the Scorecard committee
(some of whom have now transitioned into the Deliverology committee[11]) held a workshop for
department chairs in which they were trained, using the tools, data, processes, and language of
the Equity Scorecard, to examine equity within their departments (I discuss this event
extensively below with respect to “learning structures”).
[11] The line between these two committees is blurred because the same individual is in charge of both and has merged their activities into a single effort called the “Student Success Network.” The workshop I observed and several others held during 2014 are sponsored by the Student Success Network, which is officially the committee appointed to implement Deliverology, though these activities were led by the Scorecard committee members; for consistency I continue to refer to both initiatives.
At this workshop, both the division
administrator and the provost referenced the “performance metrics” related to closing gaps for
underrepresented students; indeed this was the only instance in the study where I witnessed a
provost enacting leadership practice directly within the activity settings related to the change
initiatives. Though many of the activities undertaken by the Scorecard committee and its
individual members were very similar to activities pursued by them previously, the
accountability context thus appeared to directly re-mediate those practices in the form of a
discursive resource for legitimacy.
The “entry” provided by the legitimacy associated with the Scorecard, along with the
existing relationships between the committee chairs and several critical offices around campus,
also contributed to buy-in and early impact that further distinguished the Scorecard practices
from those of other committees. One of the initial issues that the committee sought to address
was disparities in access for Black students; as at other universities in the study, the committee
members thus began by investigating how the university conducts admissions—both to the
university and to majors within the university. But as one of the team members described,
admissions practice at Old Main University is politically volatile:
We were actually very bitter that we had to start with the access. We felt that that was
sort of the easy pickings and that, that…we had just done all this work on retention and
why didn’t we…we should start with retention. And we also thought it was also going to
be really loaded because we were honing in on one office, right? And we were really
weary about how we were going to be received in that one office. And I actually think it
was a brilliant way to start, because we had to work really hard to develop a working
relationship with admissions so that they didn’t feel that we were causing trouble, that we
were going to be supportive, and we did that. And…but what it also allowed us to do,
because we had such lovely buy-in with the admissions office, that was really totally
collaborative, and we were able to get big recommendations through pretty easily,
including hiring new staff, doing a pilot study on getting the SATs to be optional…so it
helped our committee feel like okay this is worth it, which was, you know, a really
important piece to it…it really kept us moving, and it helped the university begin to see
us as—I think administrators were interested, were more on board, and then when we
started going to departments we had more support from administrators, we had a little bit
more clout. There began to be some articles written about us in the provost’s newsletter.
And I don’t know many people read that, but we became… it wasn’t just another
committee on campus, it was a group that was actually doing something… [I]t really
helped us, I think, develop some credibility, both within our committee and in the
university, and then that sort of propelled us and moved us forward in a way that the
other committee just didn’t have… that clout.
What this faculty member describes is a set of activities—primarily communicative—for which
the conditions within the accountability-associated context of the Scorecard were different from
those of prior committees. Through the enactment of those structurally different practices
over time, iteratively, the practices gained more and more “traction” with respect to a broader set
of practices (like admissions) within the university. In other words, the initial momentum
afforded to the committee because of the accountability context in which the initiative emerged
thus allowed the committee to enact practices of inquiry and recommendation more effectively
and get what committee members perceived as an “early win” that gave the committee work
even greater legitimacy.
Within this faculty member’s account there is also an indication of ways in which the
legitimacy of the Scorecard initiative helped to re-mediate larger institutional channels of
communication and influence. Recall that, because it was situated within the larger strategic planning and accountability contexts at Valley University, the Scorecard committee there was able to successfully advance recommendations regarding SAT-optional admissions that had previously gone unheeded given the prevailing logics of “excellence” and prestige. At Old Main University, a
similar dynamic unfolded wherein the director of admissions recognized the practices of the
Scorecard committee as a resource for advancing her professional judgment and goals around the
use of SATs as well as staffing related to fulfilling the goals of her office. One of the Scorecard
committee members described her impressions of how the Scorecard committee activities
interacted with but also expanded existing goals within the admissions office and how a
convergence of those goals led to change:
Some of the members of the Equity Scorecard team are on enrollment management
where admissions sits, so we had a sense of what they were struggling with
beforehand…I think all of that gave us an in but the biggest thing was that it was really
clear that the director of admissions who is a brilliant woman looked at our project and
thought, I need more staff and this is how I’m gonna get it. And so we were able to work,
I think they saw that they were going to get something that they wanted. What really
surprises me is how supportive the director of admissions was in thinking about SAT-
optional admissions, and she now has that ball with the pilot study data and is pushing it
forward in a way I never would have expected. So I think that, seeing us as a way to get
something that she needed was our wedge in, but I think we also broadened everyone’s
ideas about what could happen and that’s been a good thing.
The accountability context of the Scorecard and Deliverology—particularly the relationship to
performance-based funding and the president and provost’s explicit discursive practices with
respect to those initiatives—thus both directly and indirectly re-mediated the practices of the
committee(s) vis-à-vis the objectives of improving student success and equity; it also re-
mediated larger structural relationships within the university by facilitating convergence of
interests and enacting new channels of communication for local professional judgments (e.g., in
admissions) through a legitimated channel of practices (i.e., the Scorecard inquiry activities and
recommendations).
Data-use as a mediator of learning and communication practices. The other primary
way that the Scorecard and Deliverology initiatives appeared to re-mediate the practices of
retention and student success committees at Old Main University was to change the relationship
between various university practices and the collection, communication, and use of data on
student experiences and outcomes. Like Valley University, Old Main had a well-established,
robust institutional research office and a highly regarded institutional researcher (committee
members described her as “passionate” and a “guru”). But also like Valley University (and most
public universities), data-use practices were highly localized to institutional research and even
intentionally limited in other respects. The Scorecard committee member quoted above who used
the cover of the Scorecard to request additional data, for example, described how the process of
getting data to support her practice as a student affairs professional had previously been
hampered by an informal policy of restricting data (likely, though I did not confirm this, as a
structural reification of some set of practices within the university’s history around desegregation
or affirmative action). As she described:
The first year I was here I started asking about [disaggregated data]. Can I have
information about race, ethnicity and gender? And I was told literally, “Oh, we don’t
give that out because we’re concerned that you’ll act on it in a negative way.” And I
thought, “We’re educators. What do you mean you’re assuming we’re going to act on
something in a negative way?” So not having the data, when at the same time my boss
was saying, “Well how are you doing with students of color?” And it’s like, “Well I
don’t know.” I can see who is sitting in the chair but I don’t label because I feel it’s
important that the student identifies who they are. So I don’t make any assumptions, so I
don’t know because I don’t have any data.
Though data were thus limited in availability for areas of practice outside specific committee
work, data had been available to the retention committee in their activities prior to the
introduction of the Scorecard and indeed the institutional researcher had been a member of that
committee. Within those settings, data had—as in the Scorecard and Deliverology settings—
served as a mediating artifact for the committee’s learning about retention patterns.
This was only one part of how data served to re-mediate the activities of the Scorecard differently, however. In addition to directly mediating the committee’s own internal activities, the
Scorecard committee’s practices themselves—working with others across campus to assess and
understand policies and practices impacting student success and equity—created a new
relationship to the use of data as an institutional practice. Whereas before individuals like the
student affairs professional quoted above encountered entrenched informal policy limiting the
use of data (for example, student data coded for race, ethnicity, and gender), the Scorecard
committee practices explicated the role of data not only as instrumental to committee work but
also in changing the nature of work university-wide in order to meet the university’s
accountability objectives. As a faculty member and Scorecard committee member described:
One of the most important things – I hope one of the most important things that I think
we’ve been able to advocate for is getting done is that we want the various entities to
make – the various units to make as a part of their regular work, their regular reporting
structure to include issues of race and ethnicity. That they are actively monitoring the
gap between majority and under-represented minority populations in their programs at
the program level. So that’s just getting the data, but the data has a kind of power, right?
If people are monitoring it then it has to be at the forefront of their mind. I think we have
also gotten a commitment from the Deans and the Provost that they are going to be
making decisions based on that information. So collecting the information and using it as
part of the reporting structure I think is a super huge thing that feels like a victory even
though on some level it’s just getting the data.
What this faculty member describes is a distinctly different conceptualization of the objectives of
the change initiatives from those evident at other sites. As at the other universities in the study, data
served a direct re-mediating role in both committee and individual practices—the student affairs
professional who had been trying for years to get data coded by race, ethnicity, and gender, for
example, knew that her practice would be enhanced by having that information. But to a greater
extent than was evident at other universities, the committee members at Old Main understood
data-use as a critical catalyst for changing university practices more broadly, and they understood the enactment of structures to promote data-use practices university-wide as part of their committee’s objective.
One explanation of this difference (between Old Main and Valley, for example) is that
perhaps, given the extent of prior retention and equity efforts at Old Main and the extent to
which those efforts had been stalled or ineffective, the individuals involved in the Scorecard and
Deliverology initiatives there had a history of experiences informing their relationship to data
use. The individuals such as the administrator and the faculty member who talked about the
importance of embedding data-use in campus-wide practices had experienced change efforts that
otherwise had the conditions to be successful—a culture of distributed leadership, a clear set of
objectives, data available for understanding the problems, and so on. But in those prior instances,
the work within the committees had not led to sustained change in other domains of practice
within the university because the particular mediating elements that had led to committee
members’ increased knowledge and enhanced their own professional efficacy had not been
available elsewhere.
When the faculty member says that, “collecting the information and using it as part of the
reporting structure…feels like a victory even though on some level it’s just getting the data,”
what he is suggesting is thus that they have been able to change the structures of practice more
broadly than they were able to under the conditions of prior committee work. In interaction with
the legitimating context of accountability and given their unique history of experience with
practice-change efforts, the committee at Old Main thus re-interpreted their objectives (recall
that objectives are part of the mediating triangle in CHAT) as not just re-mediating their own
practice to produce recommendations as a committee but creating conditions for ongoing
practice re-mediation university-wide. The idea of creating conditions for ongoing practice change as an objective of change initiatives offers, I believe, a compelling practice-based analogue, anchored in the perspective of this study, to similar notions throughout the change literature, such as sustained change “capability” (Sporn, 1999) or the institutionalization of innovations (Levine, 1980). I thus explore in depth here how such conditions might be created at Old Main and, in the
terms of practice theory, how such conditions would change the nature of practices related to
equity and student success broadly across the university.
Structures promoting incremental re-mediation of practice. I have argued that the
Scorecard committee (overlapping with the Deliverology committee) at Old Main uniquely
conceptualized one of the objectives of their work as creating structures to promote greater use
of data by faculty and staff university-wide with respect to equity and student success. Indeed,
the workshops like the one I observed in which department chairs received training for using
Scorecard tools and processes to investigate patterns of access and retention of students of color
within their majors represent efforts by the committee members to establish routines around
practice re-mediation similar to what occurred within the committee.
At the same time, the administrator and other members of the committee have worked
with their colleagues in their departments or other settings (unrelated to the specific committees)
to enact practices that reflect the objectives of the committees (improving equity, for example)
and that have the potential to reinforce the training from the workshops. For example, a faculty
member participating in the Scorecard committee convinced the dean of her division (the
College of Arts and Sciences) to embed goals and technologies adapted from and developed
within the Scorecard process into the annual competitive grant process for distributing research
funds to faculty. The practices related to improving student success and equity could thus be
learned by faculty in workshops and reinforced by incentives within their annual funding
competition.
Through these examples, the Old Main University case thus provides the strongest
indication in this study of how re-mediated practices within the committee settings may serve to
create or reinforce “learning structures” in which those practices are embedded. It is here
where I find at least one component of what might be the practice version of what Clark (2004) calls a “bureaucracy of change” and Sporn (1999) refers to as the “adaptive capability.” In the
terms of the practice framework, a “learning structure” would create the conditions for ongoing
incremental practice re-mediation across the university with respect (in this case) to the
objective of student success and equity. In other words, I am suggesting that what the committee
members at Old Main were doing was enacting a new structure for a practice of learning; the key
is thus to understand the qualities of such a “structure” and how it may engender such practices
in a way that is enduring over time.
Elements of learning structures. Recall from the theoretical discussion in Chapter III
that practice theory views structure (particularly based on Giddens and Bourdieu) as a
mechanism by which human agency is regulated, but it is likewise reproduced by human action;
it is half of a “duality” constituting human social activity (Giddens, 1984). Structure here thus
means something different from other meanings in organizational research—for example to refer
to an organizational chart, hierarchical “structure” of authority, physical layout of buildings or
departments, policies, and so on. Rather, the practice theory concept of structure in relation to the
organization is more like the conditions that those organizational architectures create for human
agency, or the organizational social and cultural conditions within which human agency occurs
and which it reproduces. Thus it is, for example, the power dynamics established by the
hierarchy or the range of conceivable actions individuals enact given their place within an
organizational division of labor that constitute structure vis-à-vis practice. The phrase “learning
structure” is thus actually shorthand for “a structure for learning as a practice,” reflecting that the
structure sets the conditions for practices to be enacted (and is also reproduced and/or altered by
them).
I reiterate these theoretical notions of structure in order to then draw attention to
particular elements of the Old Main University committee members’ efforts to create a learning
structure in relation to student success and equity. First, however, I want to make clear that what
can be learned about such a structure is not just about the settings or forms through which it was
enacted (though those forms themselves may also be instructive) but rather how those forms
create or support the conditions for certain types of activities to occur. I thus explore two
occurrences at Old Main that I have referenced already, the “Equity Matters” training workshops
and new data reporting routines; but I explore here in particular how those forms engendered
new structures as the conditions for a practice of learning about student success and equity.
The series of “Equity Matters” workshops conducted by the Scorecard committee were
designed to introduce deans, department chairs, and eventually other groups of faculty and staff
to a process of using data as an entry-point for examining and interrogating practices and policies
that may negatively impact students of color. Within these workshops, participants are provided
with a set of data showing enrollment and retention of students in their divisions or departments,
disaggregated by race and ethnicity, and are then guided through a process (an adapted and
compressed version of processes embedded within the Scorecard) of interpreting the data,
identifying “gaps” between majority and non-majority students, and identifying potential
institutional practices and policies that may be contributing to those gaps. Through this process,
participants are also given a new vocabulary to re-mediate their interpretive practices around the
data and the student experiences those data represent—“equity gap,” “equity-minded” and
“deficit-minded,” “hunches” and “focal efforts” are all words belonging to the Scorecard lexicon
that I heard repeated and defined within the workshop I observed.
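The arithmetic behind this process is simple and worth making concrete. The sketch below (in Python, with entirely hypothetical records and group labels; the workshops themselves used pre-built data reports, not code) shows what “disaggregating” a retention rate and identifying an “equity gap” relative to the majority group amounts to.

```python
from collections import defaultdict

# Hypothetical records: (racial/ethnic group, retained-to-second-year flag).
# In the workshops such data arrived pre-tabulated; this only illustrates
# the arithmetic behind "disaggregation" and an "equity gap."
records = [
    ("White", True), ("White", True), ("White", True), ("White", False),
    ("Black", True), ("Black", False), ("Black", False), ("Black", False),
    ("Latino", True), ("Latino", True), ("Latino", False), ("Latino", False),
]

def retention_by_group(rows):
    """Disaggregate: retention rate computed separately for each group."""
    totals, retained = defaultdict(int), defaultdict(int)
    for group, kept in rows:
        totals[group] += 1
        retained[group] += kept  # bool counts as 0/1
    return {g: retained[g] / totals[g] for g in totals}

rates = retention_by_group(records)
# -> {"White": 0.75, "Black": 0.25, "Latino": 0.5}

# "Equity gap": each group's shortfall relative to the majority group's rate.
gaps = {g: rates["White"] - r for g, r in rates.items()}
# -> Black students trail by 0.5, Latino students by 0.25
```

The point of the workshops, of course, is not the computation itself but the interpretive practice built around its results.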
Following the training, participants are expected to complete a set of inquiry activities to
determine where within their departments inequities in access or retention occur for students of
color and to then make a report back to the committee (and the provost) regarding how they plan
to address the inequities and to continue monitoring them. Participants were given a small
stipend, which was funded by the provost’s office as a symbolic token of the importance of
completing the work. In order to receive the stipend, the provost required that participants sign a
“contract” outlining their responsibilities. After completing the training, they also received a
sticker that reads “Equity Matters,” similar to the “Safe Space” stickers associated with training
about sensitivity to LGBTQ issues; later in the fall the dean and the provost plan a larger
campaign to inform students that the “Equity Matters” stickers indicate that the faculty or staff
members displaying those stickers care about the success of students of color and are attuned to the
potential impact of race and racism on students’ experiences at the university.
As participants complete their reports over the summer and fall, the Equity Scorecard committee
members will serve as on-demand “consultants,” helping them examine data and conduct inquiry
within their own departments. The committee members have also embedded the specific
activities prescribed by the “Equity Matters” training and subsequent inquiry and reporting
within a larger communications campaign to create the expectation among students (or at least
the perception of that expectation among faculty and staff) that faculty and staff will be cognizant of race and
ethnicity as factors shaping students’ experiences. These training events thus, in various ways,
create new conditions for practice re-mediation among groups of practitioners across the
university, essentially replicating the re-mediated nature of practice within the Scorecard
committee through a structure of training, inquiry, and ongoing data-use incentivized by a
reporting function and a sort of public awareness campaign. In other words, the specific architecture
of the “Equity Matters” training—the actual workshop, the contract, the reporting function, and
so on—has the potential to create conditions (incentives, expectations, awareness) wherein
department chairs (or other participants) integrate learning about issues of equity into their
existing practice.
A second major organizational form that served (or may serve) as a component of a
learning structure at Old Main was the creation of new routines of data collection, reporting, and
use (these were embedded within the Equity Matters trainings but also occurred separately). As
the faculty member quoted above put it, the collection and monitoring of new data was a “victory”
because the “data have power.” I have discussed the re-mediating role of data
throughout these case narratives, but here I want to focus specifically on how routines of data use
are also a component of learning structures in order to deconstruct the ways in which data may
have or come to have “power” as tools of practice. In other words, I want to further elaborate the
distinction between data as mediating artifacts and data use as a kind of technology embedded in
a learning practice. Based on learning from the Old Main University setting, I suggest that the
distinction hinges in part on the different ways in which the artifact of data within individual or
group settings and the collection and use of data interact with other mediating factors of practice
such as professional identity, moral values, prior practices, other structural factors of work
activity like incentives and job objectives, and broader institutional logics with respect to equity,
student success, and so on; that is, that it hinges upon the relationship one has with data in
practice. I take a brief theoretical detour here to help clarify and deepen this particular finding
with respect to the Old Main University case.
I have thus far in this study avoided drawing extensively on organizational learning
theories (aside from reviewing their role in existing literature) because I have anchored this study
firmly in an epistemological preference for viewing practices (rather than internal theorized
cognitive processes or macro-level abstractions) as the constitutive material of observable social
phenomena. But here I find that the distinction in Argyris and Schön’s classic work on single-
and double-loop learning in organizations is metaphorically helpful for articulating the
distinction between the nature of data as a re-mediating artifact and data-use as a form of technê
supporting a learning practice. Argyris and Schön (1974, 1978) developed a model of
organizational learning constituted by changes in individuals’ “theories of action” or “theories in
use” with respect to the ways in which going about work will produce the desired outcomes.
Learning, in their model, is the process of changing “theories-in-use” such that individuals either
diagnose and respond to the environment (which they called “single-loop” learning, based on W.
Ross Ashby’s work on adaptive behavior) or examine their assumptions about the environment
and intervene in the underlying system in order to create stability (“double-loop” learning). A
thermometer, for example, measures temperature—it collects “data” so to speak, and is thus
“learning” about the temperature in the room. A thermostat, by contrast, first collects data about
temperature and then acts to change the conditions in the room, such that changing the
temperature becomes the object of the thermostat’s learning—a “second loop” of learning occurs
vis-à-vis the change in temperature.
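The thermometer/thermostat metaphor can be rendered as a toy sketch. The classes and numbers below are purely illustrative (nothing here comes from Argyris and Schön or from the study’s data); they only mark the difference between observing a system and intervening in it.

```python
class Thermometer:
    """Single-loop analog: it only 'learns' what the temperature is."""
    def read(self, room):
        return room["temp"]

class Thermostat:
    """Second loop: it acts on the conditions producing the reading."""
    def __init__(self, target):
        self.target = target

    def regulate(self, room):
        reading = room["temp"]
        if reading < self.target:
            room["temp"] += 1   # intervene in the system, not just observe it
        elif reading > self.target:
            room["temp"] -= 1
        return room["temp"]

room = {"temp": 65}
stat = Thermostat(target=68)
for _ in range(5):
    stat.regulate(room)
# room["temp"] settles at 68: each reading feeds back into action on the room.
```

The thermometer only ever produces readings; the thermostat’s readings feed back into action on the room itself, which is the “second loop” of the metaphor.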
The difference between single and double-loop learning is fairly obvious with respect to
data use. Institutional researchers, for example, can (and do) collect huge amounts of data on
enrollment, retention, completion, grades, credit accumulation, and so on. From those data
the university could be said to be “learning” about students’ outcomes. For a second loop of
learning to occur, the data then have to be used in such a way that changing the underlying
conditions producing what is evident in the data—students’ outcomes or disparities among
groups, for example—becomes the object of learning in a continuous cycle (Witham &
Bensimon, 2012).
The question that the Old Main University case prompted in the course of my analysis
was how to translate these abstract metaphors of organizational learning into practice terms, and
the distinction I am trying to draw here is thus a phenomenological one: what is the relationship
to data in a learning practice? My hypothesis, based on observations and narrative accounts at
Old Main University, is that the practical relationship between practitioner(s) and data changes
when the objective of learning from data is to create changes in the underlying systems
producing those outcomes, and when the data thus become the object conveying that agency. At
one level, an individual or group may use data to answer a question or to complete a job as
he/she/they understand it. The student affairs professional who argued with academic computing
about getting data disaggregated by race and gender needed data in order to answer questions
about her job as she understood it—essentially she suggested that “I can do my job better if I
have these data.” At a second level, committee members at Old Main want data collection,
dissemination, and reporting to become part of the routine practice of the academic departments
because they believe (based at least in part on the re-mediation of their own practice within the
Scorecard committee) that they have an agency relationship to what is evident in the data—that
students’ outcomes can be changed as a function of re-construed and restructured practices. The
data are the instrument of that relationship at this second level; the collection of data is the
technê through which practitioners might bring about changes in practice (note, I said changes in
practice, not changes in outcomes). The routine collection and reporting of data, as a component
of a learning structure, thus has the potential to change the nature of practitioner agency with
respect to the students’ educational experiences conveyed in the data.
Leonardi (2012) takes a similar approach to explaining how new technology produces
organizational change. Drawing on Taylor’s (2001) theory of communicative acts as
“imbrication” of agency and instrumentality, wherein agency is action for a purpose and
instrumentality is the “means by which something is accomplished” (Taylor, 2001, p. 279),
Leonardi (2012) suggests that organizational processes are similarly phenomena of
“sociomaterial imbrication,” such that human and material agencies “come to form an integrated
organizational structure” (p. 7). Viewing organizational processes in this way, in particular the
relationship between computer technology and human agency within organizations, Leonardi
argues that, “As material agencies enable people to do new things in new ways, people change
their goals and consequently bring changes to the technologies with which they work” (pp. 7-8).
What I am suggesting in extending Leonardi’s approach (or my interpretation of it) is that data-
use is a form of technology to which practitioners’ agency takes on a particular relationship
when the data have also enabled them to reimagine the objectives of their work. In a learning
structure, data become part of the texture of “sociomaterially” imbricated practice—not just a
tool for accomplishing a given task but a continually re-structuring element of ongoing practice
change. In this way, the faculty member’s statement that “data have power” could be interpreted
literally as “data have agency”; they reshape practitioners’ relationship to their own activities and
are in turn embedded in practice as the objectification of those practitioners’ goals.
This conceptualization of data use fits nicely with the semantics of accountability-driven
practice change. The goals outlined in the PASSHE performance-based funding, for example, are
to “close the gaps” in access or completion rates between majority and non-majority students.
Those “gaps” are properties of data, not of people. I cannot go out onto campus and see “gaps”
just as I cannot see institutionalized racism (though I can see enactments of it if I am looking).
What I can see are data, and I understand my relationship to the disparities in students’
experiences (and my role in making students’ experiences better) via the technology of
continually monitoring data and adjusting my practice to change those data.
Certainly the ultimate goal of accountability policy such as PASSHE’s performance-
based funding is to ensure that race, ethnicity, gender, income, or other factors are no longer
predictive of disparate rates of success (on average); in other words, that all students succeed at
equal rates and that if they do not, that is not because of a systematic set of practices that
disadvantage some individuals based on their belonging to a specific social group. But because
no individual practitioner can alone ensure that, for example, black and white students graduate
at the same rates, the objective of organizational change around equity or student success in
higher education has to be creating organizational conditions by which the sum total of students’
experiences changes. We might say, then, that the goal of accountability policy that creates
conditions for data-use and provides specific objectives with respect to data could thus be to
create such learning structures for continually re-mediated practices—that, adopting Leonardi’s
language, accountability could change the texture of practice-structure imbrication using the
technology of data use. Regardless of how effectively such structures become embedded or
sustained at Old Main University, the efforts to create these forms of new learning structure are, I
believe, highly instructive with respect to practice change.
Reflections on practice re-mediation at Old Main University. The practices directed
towards improving equity and student success at Old Main, and the ways in which they were re-
mediated within the Scorecard and Deliverology committees, confirm many of the same patterns
reflected in my broader propositions across settings. As at Valley University but to a greater
extent, the practices of the change initiatives at Old Main were reconfigurations of existing
practices; but the re-mediating effects of these initiatives were significant. Notably, as in the
other universities, there was a strong “interaction effect” (to use regression modeling language)
of practitioners’ professional identities and dominant institutional logics—particularly, here, one
of distributed leadership. It was also clear that leadership practices had created resources for
practice re-mediation, as in other settings, but the form those resources took was distinct in that
leaders communicatively translated the accountability context into a form of discursive
legitimacy resource for the committee members (as well as providing actual funding for
trainings). The Old Main University setting was distinct from the others, therefore, in the ways in
which accountability served as an indirect re-mediating factor.
A final compelling lesson from Old Main University has to do with the ways in which
committee members there re-conceptualized the objectives of the committees and how those re-
conceptualized objectives were enacted. Rather than seeing their objectives as only developing
solutions (though they did some of that as well, in recommendations for new staff or new
programs), the committee also began replicating new structures of practice embedding the data-
use processes and other tools of their own committee work. Drawing on CHAT and other
elements of practice theory with respect to how change in practice settings may incrementally
produce change in organizational practice more broadly, the attempts to create new learning
structures at Old Main University offer compelling examples of possible ways to reimagine the
goals and evaluations of accountability policy or other investments aimed at improvements in
equity and student success.
CHAPTER VII: Discussions of Findings and Conclusion
In the introduction to this study I suggested that the college completion agenda has
engendered a new form of change initiative within higher education institutions or, at the very
least, has resituated activities common to prior change efforts within a new kind of
organizational and political milieu. Most notably, a certain type of change initiative characteristic
of the completion agenda is both accountability (that is, state and federal policy) driven and
organizationally focused; its objectives are framed in terms of political and economic goals
related to postsecondary attainment but operationalized through methods and strategies familiar
within the professional practices of academic organizations.
On the accountability side, such change efforts are defined by metrics and goals
symptomatic of what Neave (2012) suggests is a new paradigm of change—one in which
postsecondary institutions are instrumentalized within an urgent, future-oriented political regime.
President Obama’s “2020 goal” for the US to be first in the world in postsecondary attainment –
the rallying cry of the completion agenda – serves as a clear signal of this contemporary change
paradigm. State policy measures enacted nationwide have converged around metrics and
mechanisms, disseminated by influential foundations and advocacy organizations, that enact this
future-oriented political and social urgency in the form of outcomes-based funding (Hall, 2011;
Jones, 2013) and related communicative policy tools like strategic plans and college outcomes
“dashboards.”
On the organizational side, the perpetual hunt for “best practices” to implement “at scale”
(Kezar, 2011)—though certainly still present—has given way, at least in some innovative state
policy contexts, to reliance upon national initiatives concerned more with building organizational
capacity to improve practices than implementing a given set of “best” practices. The Equity
Scorecard, Deliverology, Achieving the Dream, Completion by Design, and similar (primarily
foundation-funded) initiatives—though differing significantly in their scale, theories of change,
methods, and objectives—share a focus on improving organizational capacity for productive
learning and change rather than identifying and replicating interventions. These initiatives, as the
executives of the Pennsylvania State System of Higher Education have made clear with respect
to their investments in Equity Scorecard and Deliverology, are being linked intentionally to
accountability policy in order to support institutions in achieving policy goals. As the literature
review in Chapter II suggests, such policy innovation is an attempt to correct for prior iterations
of performance-based funding and other accountability measures (e.g., in PASSHE, mandated
desegregation) that had mixed, limited, or no apparent impact on outcomes.
I have argued that these specific contexts make the change processes associated with the
completion agenda qualitatively different from most prior change efforts in higher education. I
have also argued that these initiatives and the contexts in which they are being implemented are
different enough that understanding them may require new conceptual approaches;
conventional constructs that were useful for parsing academic, political, and market forces
(Clark, 1983b) and mechanisms of change may, in the current context, prove not only inadequate
for capturing the complexity of current change practices but misrepresentative of them. If we are
to evaluate, promote, or critique the current nature of change—as researchers, policymakers, or
practitioners—we need a clearer understanding than existing research provides as to the complex
array of relationships through which the political and social objectives of the day mediate
organizational practices in the immediate day-to-day of those working in higher education.
I have also argued that PASSHE is an exemplary case of the convergence of all of these
dimensions of political, social, and organizational change. Though I describe the linking of the
Equity Scorecard and Deliverology with performance-based funding policy as a type of policy
“innovation,” I use that word without an intended valence. I am not endorsing that strategy as a
self-evidently positive thing for those working in institutions, nor am I suggesting that academic
institutions necessarily should be building capacity to serve as instruments for the objectives of
state or federal policy. That said, I do think—and I believe there is general consensus in the
field—that achieving greater equity in the way public institutions serve students of color and
those from financially disadvantaged families should be an objective of ongoing efforts
nationwide as it has been in Pennsylvania. That is to say, while I happen to believe that the
objectives of the change initiatives I studied here are worthwhile as goals for institutional
change, the aim of this study has been to investigate the nature of current change efforts using
these initiatives as examples, not to argue that policy-driven change—even when anchored in
practice—is always a normatively good thing.
This chapter proceeds as follows. I begin by outlining some of the key lessons
learned from this case study. The eight propositions outlined in Chapter V were offered as
tentative generalizations about specific dimensions of the nature of change based on my analysis
of the data, and the case narratives in Chapter VI then put those propositions into action (or I
should say, into practice) by embedding them within descriptions of what has taken place at each
university over the past few years, constructed from participants’ accounts and enriched by the
historical and cultural texture of the sites. In this concluding chapter I “zoom out” even farther,
to use Nicolini’s (2009a) metaphor, to more succinctly summarize what the findings of this study
suggest about the nature of practice re-mediation associated with accountability-driven change
initiatives. I do so first by providing summarized discussions in the form of answers to the
research questions motivating the study.
Re-visiting Clark’s (2004) idea of a change “bureaucracy” and similar ideas elsewhere in
the literature, I then integrate the findings from this study within a brief discussion of what a
practice theory version of that idea would be in the context of the types of change initiatives
studied here. I discuss in particular the two defining features of the change initiatives according
to which I structured the literature review in Chapter II—the accountability context in which they
emerged and their focus on data-use as a core change mechanism. Within this discussion I
suggest some implications of my findings for future research as well as for policymakers and
practitioners who find themselves, in their own practices, trying to understand or enact (or resist)
change.
Finally, I then reflect further on how this study may contribute to theoretical approaches
to change in higher education. I have argued throughout (but particularly in Chapter III) that
practice theory is a potentially generative approach to capturing and integrating the many
interacting dimensions and dynamics of organizational change—both generally and, especially,
with respect to the current “change paradigm” of the completion-agenda era. I believe such an
approach has been generative in this study, and I thus conclude the chapter by offering some
reflections on the utility of practice theory, what I believe it offers and prescribes for future
research on change, and how its language and analytic focus contribute to the metadiscourse on
change in higher education. I also discuss the limitations of the practice approach as well as the
limitations of this study more generally.
Discussion of Key Findings from the Case Study
The following are the research questions for this study and a summary of findings
responding to each.
(1) What are the practices associated with the objectives of improving student success
and equity in universities?
In this study, the activities associated with efforts to improve student success and equity
took, in almost all cases, the same general form as the committee meetings that
take place every day on nearly every college and university campus: 8-10 individuals coming
together for an hour or two once a week or every two weeks to discuss, review reports or
other sources of information, and develop sets of shared actions intended to meet the group’s
objectives; individuals or groups of individuals then working outside the committee meetings,
interacting with various offices or individuals across campus to gather additional information and
then reporting back to the group; and, finally, drafting recommendations and presenting
the committee’s findings to various campus constituencies.
There are two dimensions of practice reflected in these activities. The first is that this
array of activities of the committee is, I argue, itself a kind of higher education practice in that it
takes a similar form across institutions without ever having been explicitly “learned” and that it
involves shared, mediated, structured activity in pursuit of a goal (Nicolini, 2013; Schatzki,
2001). The second dimension is that the activity setting of the committee, as demonstrated in this
study, integrates the various other practices of its members—the faculty member engages in the
committee as part of a service commitment; deans participate as part of their own practices in
developing the organization and pursuing organizational goals; staff members participate to
contribute their preferences and perspectives based on their professional practices supporting
students, in admissions or financial aid, and so on. Reflecting on the practices associated with the
committees tasked with pursuing the objectives of student success or equity, and thus viewing
the committee as a setting for a variety of practices, provides a conceptual foundation for
answering the second question of how specific change initiatives may change the nature of those
practices.
(2) How are those practices re-mediated in the course of data-centered, team-based
change initiatives?
The findings of this study suggest that these practices—in both dimensions I just described—
have the potential to be re-mediated by the change initiatives in at least the following five ways:
• Accountability. Through new objectives introduced into the activity settings by the
accountability impetus in which they emerged, and through new resources for the
committee practices generated (typically) by leadership practices associated with
accountability.
• Networks. Through new configurations of participation, that is, new people from across
positions and functions within the university (and thus across practices) coming together
and interacting to create new networks for communication within and outside the
committee.
• Data. Through the introduction of data, as new mediating artifacts that embody and
define the change objectives (i.e., “cutting the gaps in half”) and serve as a focal point for
shared interpretation and discourse.
• Language. Through the introduction of a new vocabulary that enables practitioners to
identify dominant institutional logics and discursively enact counter-logics or other
“tactics” aligned with their professional goals.
• Technology. Through new technologies, such as particular forms of data use, as new or
reconfigured ways-of-doing to reach the objectives (i.e., the action research process of
the Equity Scorecard and its associated tools, or the management systems approach of
Deliverology).
These five elements emerged as the most salient aspects of practice re-mediation with the
change initiatives, but variations in the ways in which they served to re-mediate practices across
settings revealed additional components of practice that interacted with the five elements above
in complex and interesting ways—in other words, the re-mediating impact of these five elements
occurred only in interaction with one or more of the following:
• Practitioners’ tacit knowledge, wisdom, and professional judgment interacted with the re-mediating elements in ways that triggered adaptive, subversive, or innovative new practices; for example, new technologies were more efficacious when linked to existing practitioner wisdom or goals (e.g., “using the BESST data tool was the first time I really understood what I was trying to accomplish as a dean”).12 Practice re-mediation occurred as reconfiguration of the relationship between technê and phronesis within participants’ practices.
• Practitioners’ professional and personal identities, values, and virtues interacted with the re-mediating elements in ways that enhanced the ability of those personal or professional values to inform their work; for example, new objectives and language re-mediated practice by changing practitioners’ personal/professional relationships to their own practice(s) (e.g., “I was surprised how much the concept of ‘equity-mindedness’ made me think more deeply about why I do my job”).
• Institutional logics, which are enduring structures of practice embedded in the institutions’ social and cultural histories (Thornton, Ocasio, & Lounsbury, 2012), interacted with the re-mediating elements in ways that both produced and constrained new possibilities for practice. For example, the re-mediation of practice around the objective of increasing equity for students of color confronted existing practices enacted through a logic of “diversity” as an institutional goal, and new language helped re-mediate practices at the nexus of those competing logics (e.g., “we asked people in various departments if they were aware of inequities by race and they said ‘yes, but we already have lots of programs in place,’ so we tried to get them to focus on institutional responsibility and being equity-minded instead of focusing on student deficits”).

12 The quotes here and in the following bulleted paragraphs are my paraphrasing of statements made during interviews.
Given these key re-mediating elements and the ways in which their interactions may alter
the nature of practices associated with the specific committees and their participants, the final
question is then how such initiatives may come to provide new resources for organizational
change—not just as re-mediated practices (learning) within a committee or for its individual
members, but as sources of ongoing, incremental, and iterative change across the university.
(3) What are the organizational/structural implications and extensions of such changes
in practice?
The findings of this study reveal a number of ways by which the re-mediated practices of the
committees and their individual members may potentially extend across organizational settings:
• Through re-mediation of participants’ practices within their own settings as faculty
members, administrators, or other staff (e.g., “now I show my colleagues data on
inequities in student research experiences in our department and challenge them to re-
examine how they choose students for those opportunities”).
• Through new structures to support learning practices across the organization, which
may be created through:
o Formal replication of practice re-mediation settings for other groups across
campus (e.g., “Equity Matters” training workshops).
o New communicative networks and practices in relation to the organizational
structure (“silos”) (e.g., “Now admissions calls me up when they have a student
who wants to major in Communications but who’s on the border of being
admitted; I never used to be involved in those decisions”).
o Reinforcing organizational routines such as data use that change practitioners’
goals and agency in relation to student experiences (e.g., all department chairs
now monitor data on enrollment and retention by race/ethnicity, or division deans
incorporate new equity or student success objectives into competitive grant
RFPs).
The first implication of these findings is that re-mediated practice in one setting will not
automatically change organizational practices more broadly. Ongoing learning as an
organizational practice requires incremental, assertive tactics by practitioners within their own
settings to facilitate re-mediation of others’ work (they become agents of learning in other
settings) and/or new structures across the organization that create conditions for practice re-
mediation to occur in other settings. In many cases in this study, the committees have made
recommendations and are waiting for administrators to “adopt” them, which, they acknowledge,
may or may not happen depending on how those recommendations fit into larger institutional
priorities (illustrating again the interaction of dominant institutional logics with new modes of
practice). Moreover, in some cases practitioners believed that enactment by leaders was the only
way for things to change, whereas others saw the potential of incremental change through their
own activities as faculty or staff members—further illustrating the important interaction effects
of institutional logics and the way leadership practices (for example, the autocratic practices of
the former president at Hillside University or the distributed leadership practices of the former
president at Old Main University) serve to create enduring structures that constrain or generate
potential for new modes of practice elsewhere within the university.
Nicolini (2009a) emphasized how practices in one setting may serve to create resources
of practice in other settings; from this standpoint, the organizational implications of the change
initiatives could be understood in terms of the ways in which the activities of the committees
created resources for learning practices in other settings within the university. I suggested in the
Old Main University case narrative, for example, that what committee members there were doing
was creating structures for learning practices similar to what they had experienced within the
committee. In other cases, the committee members created recommendations, which perhaps
served as a resource for decision-making or as an artifact in discursive practices elsewhere
(budget committee meetings, leadership councils, board meetings, etc.), but these did not
fundamentally change the nature of those practices. One way of evaluating how change
initiatives impact their broader organizations is thus to examine the ways in which they created
resources for practice change elsewhere, such as the data-use routines I discussed at length in the
Old Main case narrative, versus creating artifacts for existing practices.
Feldman (e.g., 2000; 2004; Feldman & Pentland, 2003; Feldman & Rafaeli, 2002) has
examined extensively how organizational routines make use of resources and the nature of those
resources as they become objects-in-use rather than static objects apart from their use; future
research on change in higher education could benefit from a similar exploration of how resources
generated in practice (for example, in committees) generate resources-in-use as components of
practice as opposed to unused artifacts (reports, recommendations), and what the conditions are
that result in those variations of resource production. In the same vein, future research could
explore in what ways, and to what effect, leadership practices in particular (as in
this study, though I did not delve as deeply here as I could have) generate resources for learning
practices across the university.
Towards a Practice-Based Understanding of a Change Bureaucracy
One of the most compelling findings from Clark’s (1998; 2004) and Sporn’s (1999)
comprehensive studies of large-scale organizational change in higher education—compelling in
each case, and in that both scholars referenced a version of the same idea—was that something
like an “infrastructure” of change is vital to effective and sustained change processes. Clark
(2004) described such an infrastructure as a “bureaucracy” constituted by “interlocked forms and
interests that insist on continuous change, incremental and accumulative” (p. 5). Sporn (1999), in
a more abstract rendering, characterized the idea as “adaptive capability”—what I interpret from
her case analyses as reflecting a general set of architectural and cultural factors that interact to
support the development of new organizational processes oriented to meet new demands or
respond to crises.
The notion of an organizational capacity for change provides an interesting frame through
which to reflect on the research questions that motivated this study and to synthesize the findings
that have emerged in answering them. What might a capacity or an infrastructure for change
mean in the terms and from the perspective of practice theory? How might such an infrastructure
(or lack thereof) be reflected within the practices of a specific set of change-focused settings like
the committees I studied here?
In the aggregate, the findings of this study suggest that there are at least two mechanisms
through which the re-mediation of a set of practices within the given setting of the committee
may potentially intersect with the array of other practices that constitute the “organizing” of the
university in order to produce the kinds of incremental change generated through “interlocked
forms and interests” as described by Clark (2004). These emerged most clearly in the Old Main
University case, though the contrast of events there to those at other sites is instructive as well.
As I described above, those two key mechanisms include:
• Through the networks and communicative practices in which individuals extend their
learning and act as agents in the re-mediation of their own practices shared with
colleagues (in admissions, in academic departments, and so on).
• Through structures (in the practice theory sense) that engender learning practices in other
settings by creating the conditions for practices to be re-mediated using new tools,
language, technology, objectives, and so on.
In both cases, organizational change as a function of practice is essentially a cycle of learning
(practice re-mediation) and communicating with respect to particular objectives: the learning of
new ways of doing things and/or new reasons for doing things differently (e.g., participating in
the Scorecard committee or similarly structured settings) and the communicating of that
knowledge within the other areas of one’s practice such that other new practices emerge (e.g., the
Scorecard participant re-applying learning within departmental meetings, bringing new questions
or tools to other committees, participants convincing their department chair to embed new goals
or metrics within faculty research grants, or the Scorecard committee creating new workshops
for training, and so on).
If these two features—structures of learning practices and a communicative network in
which those structures are embedded—form something like a core apparatus of an organizational
“change practice,” then the specific mediating dimensions outlined separately in the response to
research question #2 above essentially describe a range of interacting components that determine
the nature of change-focused practices within any given setting. A dean at Valley University (a
chemist by training) used the analogy of a “catalyst” to describe the role performance-based
funding played in bringing the Equity Scorecard to the university (presumably, then, causing a
change “reaction”). To extend his analogy, structures for learning practices and networks of
communicative practices might be imagined as key elements of an organizational change
reaction, whereas the specific inputs of accountability objectives and resources, institutional
logics, leadership practices, professional and moral identities, practices of data use and
associated artifacts, and specific technologies (e.g., action research) and language (e.g., equity-
mindedness) could be seen to serve as reactors changing the nature of the change reaction as a
function of the unique mix and quantity of them found in any given setting. The nature of the
organizational change practice will depend on the mix of these reactors.
All organizations have networks of communication, both formal (reporting chains) and
informal (water cooler chats), across which organizational communication routinely occurs. The
significance of these networks and communicative practices with respect to change in higher
education is, as the literature reviewed in Chapter II suggests, the ways in which they operate
within the unique forms of academic organizations—that is, in what ways they span the
organizational architecture (e.g., silos), bridge distinct “cultures” (Bergquist & Pawlak, 2008), or
facilitate the “cybernetics” or flows of information across the organization (Birnbaum, 1988).
Each university in this study has a set of (very different) communicative networks and
established communicative practices; all three also had (because of their participation in the
Equity Scorecard and Deliverology) at least two rudimentary settings for the enactment of
learning practices. But in each case, the ways in which these learning practices within the
committees extended into networks of communicative practice to change the structural contexts
of practice elsewhere in the university varied. In plain language: the extent to which the learning
of the committees resulted in the potential for ongoing learning in other settings across the
university varied depending not only on the nature of the learning that occurred within the
committees but also on the relationships those committees had to broader organizational
contexts.
Accountability as a driver of practice change. All three universities, by virtue of being
members of PASSHE, have as a common accountability context the system’s performance-based
funding policy, which ties a portion of their annual state appropriation to their
performance in meeting several goals outlined in a set of metrics (e.g., “closing gaps”) and
measured through annual data reporting on student outcomes. Despite this shared context, the
way that accountability figured into the practices associated with the change initiatives was
different at each university. In one setting, the objectives, metrics, and resources introduced by
the accountability context interacted with leadership practices and dominant institutional logics
around “distributed leadership” to create legitimizing discursive resources for change practices.
In another setting, those accountability properties interacted with the leadership practices and
logics of that setting to create a coherent but discursively limiting frame of reference for change
practices. In a third setting, the accountability context seemed to have no apparent interaction
with other elements of the change practices, either generative or conservative.
How accountability policy may be designed and implemented productively to generate
improvements in equity and/or student success is, as I described in Chapter I, one of the most
important questions for the current era and one of the underlying questions motivating this study.
The findings from this study suggest that one of the challenges to making accountability
effective as a tool for fundamentally re-mediating practices within higher education (which, I
argued above, may be one way to re-conceive of the goal of accountability policy) is to
understand how accountability can serve, as the dean suggested, as a “catalyst” for reorienting
and generating resources around a shared objective (e.g., equity). Perhaps the most compelling
role of accountability I observed in this study was at Old Main University, where leaders and
practitioners participating in the change initiatives used the legitimating quality of the
accountability context as a strategic discursive resource both for their own learning practices and
for generating new structures of learning across the organization. A promising project for future
research might thus be to examine more carefully and in additional settings how accountability
features in the discursive practices of higher education practitioners working towards improving
student success and equity or other objectives.
The role of data and data-use in practice change. The universities included in this
study had varying levels of institutional capacity for and extents of data use within existing
organizational routines. For the most part, consistent with the literature on data-use in higher
education (e.g., Ewell, 2012; Jenkins & Kerrigan, 2008), practitioners outside of a few key
leadership or institutional research positions did not have access to a wide range of institutional
data—particularly with respect to equity in admissions, retention, and completion. The data-use
element of the change initiatives thus served to re-mediate the committees’ collective activities
and to define the objectives and goals of those activities (for example, to reduce the gap
between the graduation rates of black and white students from 20 percentage points to 10 over five
years). This direct re-mediating role was evident in every case (as a function of the initiatives’
design). Beyond this direct re-mediating role within the committees, however, the relationship
between data use as a kind of technology (technê) for change and the broader array of change-
associated practices and structures varied across settings in provocative ways.
Perhaps the most surprising finding that emerged across settings was the extent to which
data as a mediating artifact of practice and data use as a form of practice technology interacted
with so many other dimensions of practice, including communicative and leadership practices,
institutional logics, and professional values. Most striking, the use of data to re-mediate the
activities of practitioners within the committees appeared to interact with the professional and/or
personal ethical dimensions of practice for many of those involved in the initiatives—and to do
so (though to different effects) across universities and for individuals in different professional
roles within universities. In other words, data as a re-mediating artifact seemed to activate what
MacIntyre (2007) would call the virtue of practice; practices re-mediated with data in many cases
also changed practitioners’ relationships to the organizational objectives of equity and student
success such that the technê of data use became integrated into the phronesis of professional
judgment. As I described in the Old Main University case narrative in Chapter VI, this re-
mediated relationship is akin to the notion of double-loop learning in that it brought about a new
form of practitioner agency with respect to the underlying conditions reflected in the learning
tools (data). In practice terms, data use formed a new structural context for professional activity.
Integrating accountability and data use in change practices. In pragmatic terms, what
does this interaction between data and professional virtues, and the variance of that interaction
across settings, suggest about the efficacy of the current accountability-driven change initiatives
that, like the Equity Scorecard and Deliverology, see data-use as a core (albeit differently
enacted) mechanism of change? One thing it suggests is that there is a clear and compelling case
to be made for additional research that examines specifically how data may serve as a tool for
advancing professional practice within higher education by examining not just who uses data,
how often, and for what purposes, but how data-use as a form of technology may fundamentally
change practitioners’ relationships to their practice.
A second and very tentative suggestion is that accountability policy might be made more
effective by design strategies that embed processes allowing practitioners within academic
organizations to find in the objectives of that policy some link to their own sense of the “good”
in their practice. Of course, the potential of such a linkage may be more likely when the
objectives have to do with equity or student success than those related to, say, “productivity,” for
obvious reasons. Moreover, such linkages may be generative of both change and resistance (as in
the case of the faculty member at Hillside University who, by challenging the validity of the
data, rejected the very framing of the equity objectives of the initiative). Indeed, this general set
of findings related to professional identity may be generalizable only with respect to particular
kinds of accountability contexts and goals, such as student success and equity, that are common
but by no means the only objectives embedded in current completion-agenda era policy.
Nonetheless, there are two points I would make with respect to data use and professional
“virtue” in practice. The first is that, while the equity and student success objectives at the core
of the PASSHE change initiatives may have a particular relationship to the personal and
professional values of those who choose to work in higher education, there are also many other
elements of what higher education institutions do—including learning assessment and research
and innovation—in which policymakers and other constituents have an interest. There is, I would
argue, a virtue in these practices that could be similarly enhanced or activated by data-use or
other technologies that change practitioners’ underlying relationships to the outcomes conveyed
by data, even when framed by external objectives (for example, by a federal college “ranking”
system or by accreditors’ calls for universities to create “cultures of assessment”).
The second point with respect to data use as a practice technology vis-à-vis
accountability is that it also has the potential to interact with professional virtues in ways that
engender resistance or subversive practices (as de Certeau might suggest)—and that this is not a
“failure” of policy or the change initiatives it sponsors. Rather, such a re-mediated form of
practice would, I argue, provide university practitioners themselves, as well as institutional
leaders or policymakers who must contend with their competing preferences, greater clarity with
respect to the “true” underlying tensions within policy objectives. This type of productive
resistance was evident, for example, in the way some of those working in the committees were
able to use data-use practices (along with new language about institutional responsibility for
equity) to challenge dominant institutional logics with respect to prestige and selectivity. In a
similar way, by activating the virtues inherent to practice, data use technologies could allow
those within academic institutions to better critique the alignment of policy objectives with
institutional values or values about the public purposes of higher education more broadly. Such
practice-anchored, data-mediated, and virtue-based critiques would be far more productive, in
my view, than much of the current critical discourse around the completion agenda.
Recognizing the potential of interacting elements of higher education practice. The
final lesson this study offers with respect to how the types of change initiatives occurring within
PASSHE have the potential to be generative of (or reflect) an organizational “bureaucracy” or
capacity for ongoing change is to illustrate the many complex ways in which the mediating
factors of practice overlap and intersect with respect to any given objective. None of the new
mediating elements outlined above operated separately from others, and none served to re-
mediate practices within the committees independent of the moral, professional, and institutional
components of practice both within the committee and among the practices of committee
members. The generative power of the practice framework is to recognize the interactive nature
of these mediating elements within higher education practice and how the “mix” of those
interactions will significantly determine the new modes of activity and structure that emerge.
Neither accountability policy nor data-use “best practices” will lead directly to
changes in student outcomes; rather, accountability-based objectives and resources interact with
institutional logics, and with the relationship between those logics and practitioners’ own personal
and professional goals and values with respect to their roles within the university, to reconfigure
practices in ways that may change outcomes. Similarly, data-use and other new forms of “know-
how” introduced by change initiatives may create new modes of practice but only in interaction
with practitioners’ tacit wisdom and the varying “pragmatic regimes” (Thévenot, 2001) through
which practitioners approach their work and pursue their goals. New language can be a powerful
element of re-mediating practice in the form of a resource of legitimating tactics, but that
language has to be adopted within practice as a discursive tactic vis-à-vis certain institutional
logics and communicative networks. Finally, how re-mediated practices in one setting may serve
to create resources for organizational change elsewhere (for example, in “learning structures”) is
a function of the mix of all these factors in a combination uniquely suited to the history and
context of the university. Understanding how to enact change in practice may not be about
“culture change” but rather about assessing the origins and logics of current structures of practice
within a university and then strategically injecting re-mediating elements to change the
conditions in which those practices occur. Future research could thus benefit significantly from
drawing on the institutional logics perspective to interrogate the sources of certain regimes of
practice within higher education (e.g., admissions, developmental education assessment and
modes of delivery, learning assessment, selection into majors, and so on). Bastedo (2009) took
this approach in a study examining the origins of state policy measures as the result of
convergent institutional logics in Massachusetts.
Connecting to Research and Theory: Contributions and Limits of the Practice Approach
In the foreword and at various points in the introduction and literature review, I argued
that conceptual approaches adopted in many prior studies of change were not only limited vis-à-
vis the nature of current change initiatives but also limited with respect to the objectives of those
studies themselves. In particular, I argued that functionalist models of change and many of the
other theoretical approaches (diligently outlined in Kezar, 2001) and atheoretical approaches
common in organizational studies do not attend fully to the interactive nature of human agency
and structure in enacting change within a confluence of many different social, political,
economic, and professional goals and values. I also illustrated how many existing studies of
change rely extensively on accounts of practice without benefiting from a robust theorizing of
what those practices may convey about change processes.
I came to practice theory after attempts in earlier work to understand the impact of
performance-based funding policy on organizational change as a function of “implementation”
and using the theoretical and analytic tools of various organizational and political theories on
implementation. What I was not able to figure out, using those theoretical perspectives, was how
to understand the organizational change process vis-à-vis policy if there was no actual thing
being implemented; in other words, unlike other types of policy that implementation theories
were designed to address (conceived of in terms of policy subjects and specific prescribed
actions those subjects were intended to take), performance-based funding policy is a form of
incentive tool (Schneider & Ingram, 1990) that gives inducements to produce certain outcomes
without prescribing certain activities for reaching them.
I then looked to organizational learning theories, thinking that perhaps performance-
based funding would change organizational outcomes through a mechanism of organizational
learning whereby those working within higher education would become aware of the new
requirements of policy and make organizational adjustments needed to respond effectively (Daft,
2004; Daft & Weick, 1984). Indeed, this approach is now being explored extensively by
Dougherty and his colleagues (Dougherty & Reddy, 2013; Jones, Dougherty, Lahr, Natow,
Pheatt, & Reddy, 2014) who are hoping to understand how performance-based funding brings
about or relates to processes of organizational learning within institutions.
Though models of organizational learning are helpful in developing heuristic
conceptualizations of how change might occur within individuals or organizations as a function
of accountability policy, however, they largely posit but leave unexplored the links
between cognitive (e.g., Argyris & Schön, 1978) or cultural (e.g., Cook & Yanow, 1993; Yanow,
2000a) processes of learning and explanations of actual individual changes in action that then
translate to organizational change. Moreover, models of organizational learning do not generally
provide convincing accounts of how changes in action might be expected to occur not as a result
of identifying an unambiguous “error” or external exigency but with respect to the conditions
that better characterize change in higher education—specifically, in response to the introduction
of policy objectives that interact with (and often compete with) a wide array of existing and
ostensibly legitimate organizational goals and practices within universities.
I thus needed a way to understand how the work individuals do every day within
academic institutions might come to change, how that work interacts with external objectives or
the processes of change initiatives, and how that work relates to the broader organization. In the
heavy-lifting construct of the “practice,” practice theory integrates many of the same abstract
conceptual elements that appear in institutional, cultural, and organizational theories of change
(such as cultures, logics, norms, networks, cognitive processes) but without relying on those
abstractions to do the “work” of change. Practice theory, owing largely to Bourdieu, Giddens,
and Latour, integrates these abstract elements as part of the structuring of human action while
allowing for human agency to simultaneously reproduce or alter them within any given setting.
Contributions of the practice approach. I thus anchored this study within a set of
ontological and theoretical assumptions about the nature of organizational activity as concrete,
day-to-day arrays of ordered activities (Schatzki, 2001) that reflect but also have the potential to
change many of the abstract elements sociological theory is concerned with generally, such as
norms and logics. I believe this approach has been particularly generative for this study of
change for the following reasons.
First, it has allowed me to understand how the change initiatives entered into the ongoing
work of individuals within the universities. Rather than assuming that they represented
something wholly new or drastic departures from existing practices, the practice approach
illustrates how investments in change can be reflected in reconfigurations of patterns of activity
already familiar to and legitimated within academic organizations.
Second, the approach (particularly CHAT) provides a model of learning anchored by
concrete elements of social activity: artifacts like reports and PowerPoint slides; words that are
used with special meaning in a given setting; tools like databases and the templates within them
(in programs like PeopleSoft or Banner) through which practitioners gain information about their
students, and so on. By emphasizing how all practices (arrays of activity directed at a goal) are
mediated by these elements, learning as a function of practice then becomes similarly concrete in
terms of re-mediating activity. This perspective allowed me to see how, specifically and in
observable concrete ways, the tools, processes, language, and artifacts of the change initiatives
served to potentially bring about learning in a social setting.
Finally, the practice approach’s emphasis on practice as having the potential to generate
resources—whether physical (new artifacts) or in terms of alterations of structures (Giddens,
1984)—that then impact practices elsewhere within a network of associated practices (for
example, all those that constitute an organization) has helped me in this study to begin
conceptualizing how the learning in one setting (practice re-mediation) as the result of a change
initiative may result in incremental learning across settings. This conceptualization, in turn, has
prompted a number of what I think are promising projects for future research.
Limitations of the practice approach. Putting aside the myriad ways in which my
adoption of it may have been flawed, for example in abusing or confusing its jargon or in the
methodological limitations of having few opportunities for observation, I believe the practice
approach also has some inherent limitations as a tool for studying change. First, though I tried in
Chapter III to be careful in distinguishing a “practice” from general everyday activities, theorists
working from this approach have generally not been consistent in their usage of the term, and
the principles that comprise the corpus of the theory across these works may create more uncertainty
than specificity (Thévenot, 2001; Turner, 1994). If nothing else, the jargon of practices,
activities, mediating and re-mediating elements, and structures is confusing and risks sending the
researcher into a never-ending tautology when trying to use those terms to explain a process of
change over time between individual actions and organizational processes (activity constitutes
structure and structure constitutes activity but which comes first?).
Second, though practice theory integrates sociological phenomena like culture, logics,
and norms in a specific way that maintains the practice ontology (they are reproduced through
ordered arrays of practice), I nonetheless found it difficult to apply practice theory and take
account of these various dimensions without reverting back to some of the implicit deterministic
assumptions about those dimensions that are more common within institutional and cultural
theories. A critical reader of the case narratives will likely recognize instances where I made such
errors, such as in glossing over how a logic of distributed leadership may have “caused” the
practice re-mediation to have effects at Old Main University different from those at the other
universities. In part this is my error, but at least in part I believe that practice theory also may
contribute to the likelihood of such muddled reasoning by attempting to overwork its primary
unit, the practice, as the carrier of so much social significance.
Limitations of the Study Generally and Directions for Future Research
Aside from the potential conceptual limitations inherent to the practice theory approach,
this study has methodological limitations with respect both to its intended aims and the
generalizability of its findings to wider settings. The first is that, like many studies of change, it
relies to a significant extent on self-reported data. Though merged with and compared against
observation, document analysis, and historical analysis, self-reported data—particularly
retrospective accounts of “what took place”—have the potential to distort findings simply
because human beings tend both to rationalize their behavior post hoc (Argyris & Schön, 1974;
Weick, 1976) and to alter their accounts in storytelling (Daiute, 2014).
Given the emphasis within the theoretical perspectives of this study on the meaning of variations
in arrays of activity, more observations in particular would have strengthened the methodological
approach to capturing the significance of practice to change.
Second, the study is also limited by the small number of universities in my sample.
Despite my attempts to capture variation within the PASSHE system, there are certainly some
dimensions of variation that I was not able to account for in my sampling strategy. For example,
I noted above that the provosts generally supported the Equity Scorecard projects in the three
institutions I studied but were less supportive at other institutions; this likely had significant
implications for my findings, but I was not able to tease those out because my sample did not
have variation on that dimension. Similarly, the generalizability of the findings would be more
robust if I had been able to include additional settings outside the system and in different types of
sectors or state policy and historical contexts. There are of course features of the PASSHE
universities that make them unique and which surely have implications for my findings (for
example, the significant role of unions in the universities, or their histories as regional normal
schools). Studies of change in community colleges (which are experiencing many of the same
types of change initiatives) or other types of public universities would help to refine the general
sets of relationships identified in Chapter V and provide a multiplicity of accounts by which to
better understand the nature of change-related practice today.
Future research could thus replicate studies of change using a cross-state or cross-system
sampling design. It could also adopt a similar approach to studying change but focus on
initiatives with objectives other than student success or equity (such as developing processes for
learning assessment). There are also specific questions that emerged from the
findings that could be explored more deeply through additional, focused research that maintains
a practice focus in order to flesh out a more comprehensive theory of the interrelationships
between different levels of practice within a higher education system (for example,
policymaking, leadership, faculty, and so on). I noted two such potentially lucrative projects for
future research above: specifically, (1) examining the ways in which leaders or other
administrators within universities enact the logics or objectives of accountability through
discursive practices and how those practices create resources for practice among faculty and
other staff; and (2) historical practice analysis of the sources of certain types of practices across
different strata of higher education (for example, in policymaking or in admissions) in order to
better understand the particular structures and mediating elements of those practices.
Additionally, future research could more deeply explore the nature of data use in higher
education by capitalizing on the practice theory approach to better understand how, as I have
tentatively discussed here, the technology of data use may serve to alter the “sociomaterial
imbrication” (Leonardi, 2012) of higher education practice by changing the agency relationship
practitioners have to outcomes conveyed in data.
Finally, I have argued at various points that the practice theory approach to understanding
change not only supports new ways of understanding and evaluating change initiatives (e.g., in
terms of the “learning structures” or new communicative networks they engender) but also
provides a precise and ontically grounded foundation for criticism of policy objectives (as opposed
to ideological criticisms rendered in abstractions about “public good” or “academic culture”). Thus,
future research could also draw on studies of practice to construct critical inquiries into the
nature of change objectives. In what ways, if any, might the objectives of the completion
agenda—as new elements re-mediating leadership practices, for example—fundamentally alter
the array of practices (that is, both activities and structures) within academic organizations?
What are the implications of those enacted objectives for faculty work? In what ways might the
enactment of a so-called equity agenda within the discourse of the completion agenda serve to
supplant or reproduce logics of diversity that guided much of university practice over the past
four decades but failed to produce equal outcomes for students of color (Bensimon, Rueda,
Dowd, & Harris, 2007)? In what ways do the policymaking and leadership practices associated
with such change initiatives reflect and reproduce the interests of advantaged groups, and where
is there thus potential for those practices to be re-mediated with resources generated by practices
of communities of color (e.g., entities other than the “megafoundations” or traditional policy
organizations)? These questions reflect criticisms common to the current discourse but anchor
them in the actual, observable, and remediable practices that, as practice theory suggests,
perpetuate enduring systems of power and oppression.
As a final thought, I offer a coda on the two competing “grand narratives” that appear
throughout the literature on change in higher education. On one hand, higher education
institutions are resistant to and, regardless, incapable of change. On the other hand, colleges and
universities are so remarkably adaptive that they have managed to remain legitimate as social
institutions in roughly the same form across centuries. I suggested that the practice theory lens
might provide a missing link between the two narratives; to the extent that it does, the findings
of this study suggest it is this: There is in the activities of each individual
practitioner and each individual organization as a group of practitioners a constantly adapting set
of practices that both reproduce and change the structures of higher education organizations,
their perceived political and ideological objectives, and the virtues inherent to their disciplinary
components. The practice ontology suggests that the social reality of the higher education
organization is constructed and constrained by the nature of practice – an array of activities
governed by a logic (or set of logics) but enacted through mediated human and material agency.
From this perspective, many of the apparent chasms and tensions in the literature on change
dissolve as mere creations of the way we conceive of and discuss academic work, institutional
administration, and policymaking as categorically separate domains. When we ask practitioners
to tell us “what they’re up to,” we get a description of activities that inevitably encompass the
resources generated across these intricately linked domains as well as the unique tactics and
personal stories of each individual (Latour, 2014). This may seem relativistic to the point of futility,
but in fact as we zoom out we begin to see patterns of common ways in which practice is re-
mediated over time and how those re-mediations constitute/are constituted by common
motivations and tactics, while at the same time the general form of higher education, its
structures and general modes of practice, appear enduring and immovable.
As Feldman argues (in Feldman and Orlikowski, 2011), practice theory provided her “a
new way of conceptualizing routines and a way of understanding the relationship between
stability and change as a result of the internal (or endogenous) dynamics of the routine” and an
understanding that, “Stability and change are different outcomes of the same dynamic rather than
different dynamics” (p. 10). Similarly, in this study the change initiatives served to re-mediate
and (in some cases significantly) reconfigure an array of enduring practices. Those practices of
the participants will go on after the completion of the Scorecard and Deliverology projects, and
will look very much like they did before—teaching, doing committee work, attending
departmental meetings, and so on. But within those practices, there is now the potential for new
language, technology, and objectives to interact with practitioners’ own relationships to their
work in such a way that changes the outcomes that work produces. In this way, change and
stability in higher education are part of the same dynamic practices.
Bibliography
Altheide, D. L., & Johnson, J. M. (1994). Criteria for assessing interpretive validity in qualitative
research. In Denzin, N. K., & Lincoln, Y. (Eds.), Handbook of qualitative research (pp.
485-499). Thousand Oaks, CA: Sage.
Altheide, D., Coyle, M., DeVriese, K., & Schneider, C. (2008). Emergent qualitative document
analysis. In Nagy Hesse-Biber, S., & P. Leavy (Eds.), Handbook of emergent methods
(pp. 127–151). New York, NY: Guilford Press.
Argyris, C., & Schön, D. A. (1974). Theory in practice: Increasing professional effectiveness.
San Francisco, CA: Jossey-Bass.
Argyris, C., & Schön, D. A. (1978). Organizational learning: A theory of action perspective.
Reading, MA: Addison-Wesley.
Aristotle (1962). Nicomachean Ethics. (M. Ostwald, Trans.). New York: Macmillan Publishing
Co., Library of Liberal Arts. (Original publication date unknown).
Bastedo, M. N. (2009). Convergent institutional logics in public higher education: State
policymaking and governing board activism. The Review of Higher Education, 32(2),
209–234. doi:10.1353/rhe.0.0045
Bensimon, E. M. (2005). Closing the achievement gap in higher education: An organizational
learning perspective. New Directions for Institutional Research, 131, 99-111.
Bensimon, E. M. (2006). Learning equity-mindedness: Equality in educational outcomes. The
Academic Workplace, 1(17), 2-21.
Bensimon, E. M. (2007). The underestimated significance of practitioner knowledge in the
scholarship of student success. The Review of Higher Education, 30(4), 441-469.
Bensimon, E. M., & Malcom, L. E. (2012). Confronting equity issues on campus: Implementing
the Equity Scorecard in theory and practice. Sterling, VA: Stylus Publishing.
Bensimon, E. M., Rueda, R., Dowd, A. C., & Harris III, F. (2007). Accountability, equity, and
practitioner learning and change. Metropolitan Universities Journal, 18(3), 28–45.
Bergquist, W. H. (1992). The four cultures of the academy. San Francisco: Jossey-Bass.
Bergquist, W. H., & Pawlak, K. (2008). Engaging the six cultures of the academy: Revised and
expanded edition of the four cultures of the academy. San Francisco: Jossey-Bass.
Birnbaum, R. (1988). How colleges work: The cybernetics of academic organization and
leadership. San Francisco: Jossey-Bass.
Blin, F., & Munro, M. (2008). Why hasn’t technology disrupted academics’ teaching practices?
Understanding resistance to change through the lens of activity theory. Computers &
Education, 50(2), 475–490. doi:10.1016/j.compedu.2007.09.017
Bogue, E. G., & Johnson, B. D. (2010). Performance incentives and public college accountability
in the United States: A quarter century policy audit. Higher Education Management and
Policy, 22(2), 1-22.
Bourdieu, P. (1980/1990). The logic of practice. (R. Nice, Trans.). Stanford, CA: Stanford
University Press. (Original work published 1980).
Boyce, M. (2003). Organizational learning is essential to achieving and sustaining change in
higher education. Innovative Higher Education, 28(2), 119–136.
Bragg, D. D., & Durham, B. (2012). Perspectives on access and equity in the era of (community)
college completion. Community College Review, 40(2), 106–125.
doi:10.1177/0091552112444724
Burke, J. C. (2002). Funding public colleges and universities for performance: Popularity,
problems, and prospects. Albany, NY: Rockefeller Institute Press.
Burke, J. C., & Minassians, H. (2003). Performance reporting: “Real” accountability or
accountability “lite”: Seventh annual survey. Albany, NY: Nelson A. Rockefeller
Institute of Government, State University of New York.
Burke, J. C., & Modarresi, S. (2000). To keep or not to keep performance funding: Signals from
stakeholders. The Journal of Higher Education, 71(4), 432-453.
Cavanaugh, J. C., & Garland, P. (2012, May-June). Performance funding in Pennsylvania.
Change: The Magazine of Higher Learning.
Charmaz, K. (2006). Constructing grounded theory. Thousand Oaks, CA: Sage.
Christensen, C. M., & Eyring, H. J. (2011). The innovative university: Changing the DNA of
higher education from the inside out. John Wiley & Sons.
Chronicle of Higher Education (2012, March 2). Do college-completion rates really measure
quality? Commentary. Retrieved from http://chronicle.com/article/Do-College-Completion-
Rates/131029/
Cialdini, R. B., & Trost, M. R. (1998). Social influence: Social norms, conformity, and
compliance. In Gilbert, D. T., Fiske, S. T., & Lindzey, G. (Eds.), The handbook of social
psychology (Vol. 2, pp. 151-192). New York, NY: McGraw Hill.
Clark, B. R. (1983a). The contradictions of change in academic systems. Higher Education,
12(1), 101–116. Retrieved from http://link.springer.com/article/10.1007/BF00140275
Clark, B. R. (1983b). The higher education system: Academic organization in cross-national
perspective. Berkeley, CA: University of California Press.
Clark, B. R. (1998). Creating entrepreneurial universities: Organizational pathways of
transformation. New York, NY: Pergamon/Elsevier Science.
Clark, B. R. (2004). Sustaining change in universities: Continuities in case studies and concepts.
New York, NY: Society for Research into Higher Education & Open University Press.
Coburn, C. E. (2001). Collective sensemaking about reading: How teachers mediate reading
policy in their professional communities. Educational Evaluation and Policy Analysis,
23(2), 145-170.
Coburn, C. E. (2004). Beyond decoupling: Rethinking the relationship between the institutional
environment and the classroom. Sociology of Education, 77(3), 211-244.
Coburn, C. E. (2005). Shaping teaching sensemaking: School leaders and the enactment of
reading policy. Educational Policy, 19(3), 476-509.
Coburn, C. E., & Talbert, J. E. (2006). Conceptions of evidence use in school districts: Mapping
the terrain. American Journal of Education, 112, 469-495.
Coburn, C. E., & Turner, E. (2012). The practice of data use: An introduction. American Journal
of Education, 118(2), 99–111.
Coburn, C. E., Toure’, J., & Yamashita, M. (2009). Evidence, interpretation, and persuasion:
Instructional decision making at the district central office. The Teachers College Record,
111(4), 1115–1161.
Cohen, M. D., & March, J. G. (1974). Leadership and ambiguity: The American college
president. Berkeley, CA: The Carnegie Commission on Higher Education; McGraw-Hill.
Cole, M., & Engeström, Y. (1993). A cultural-historical approach to distributed cognition. In G.
Salomon (Ed.), Distributed cognitions: Psychological and educational considerations, 1–
46. New York: Cambridge University Press.
Cole, M. (1983). A socio-cultural approach to the study of re-mediation. New Directions in
Studying Children: Speeches from the Conference of the Erikson Institute (pp. 51-68).
Retrieved from ERIC http://files.eric.ed.gov /fulltext/ED265955.pdf
Cook, S. D. N., & Yanow, D. (1993). Culture and organizational learning. Journal of
Management Inquiry, 2(4), 373-390.
Creswell, J. W. (2007). Qualitative inquiry & research design: Choosing among five approaches
(2nd ed.). Thousand Oaks, CA: Sage.
Daft, R. L. (2004). Organization theory and design (8th ed.). Mason, OH: Thomson South-
Western.
Daft, R. L., & Weick, K. (1984). Toward a model of organizations as interpretation systems.
Academy of Management Review, 9(2), 284–295.
Daiute, C. (2014). Narrative inquiry: A dynamic approach. Thousand Oaks, CA: Sage.
de Certeau, M. (1984). The practice of everyday life. (S. Rendall, Trans.). Berkeley: University
of California Press.
Debord, G. (1958). Theory of the dérive. Internationale Situationniste, 2, 50-54.
DiMaggio, P. J., & Powell, W. W. (1983). The iron cage revisited: Institutional isomorphism and
collective rationality in organizational fields. American sociological review, 48(2), 147–
160.
DiMaggio, P. J., & Powell, W. W. (1991). The iron cage revisited: Institutional isomorphism and
collective rationality in organizational fields (revised). In Powell, W. W., & DiMaggio, P.
J (Eds.), The new institutionalism in organizational analysis (pp. 63-82). Chicago:
University of Chicago Press.
Dougherty, K. J., & Hong, E. (2006). Performance accountability as imperfect panacea: The
community college experience. In T. Bailey, & V. S. Morest (Eds.), Defending the
community college equity agenda (51-86). Baltimore: The Johns Hopkins University
Press.
Dougherty, K. J., & Reddy, V. (2011). The impacts of state performance funding systems on
higher education institutions: Research literature review and policy recommendations
(CCRC Working Paper No. 37). New York, NY: Columbia University, Teachers College,
Community College Research Center. Retrieved from
http://ccrc.tc.columbia.edu/publications/impacts-state-performance-funding.html
Dougherty, K. J., & Reddy, V. (2013). Performance funding for higher education: What are the
mechanisms? What are the impacts? ASHE Higher Education Report, 39(2). John Wiley
& Sons.
Dowd, A. C. (2005). Data don’t drive: Building a practitioner-driven culture of inquiry to assess
community college performance. Indianapolis, IN: Lumina Foundation for Education.
Dowd, A. C., & Tong, V. P. (2007). Accountability, assessment, and the scholarship of "best
practice". In J. C. Smart (Ed.), Handbook of higher education, 22 (pp. 57-119). Springer
Publishing.
Engeström, Y. (1987). Learning by expanding: An activity-theoretic approach to developmental
research. Helsinki, Finland: Orienta-Konsultit.
Engeström, Y. (2000). Activity theory as a framework for analyzing and redesigning work.
Ergonomics, 43(7), 960-974.
Engeström, Y. (2001). Expansive learning at work: Toward an activity-theoretical
reconceptualization. Journal of Education and Work, 14(1), 133-156.
Ewell, P. T. (2011). Assessing student learning outcomes in college: The role of the states. In D.
E. Heller (Ed.), The states and public higher education policy: Affordability, access, and
accountability (2nd ed.) (pp. 151-172). Baltimore, MD: The Johns Hopkins University
Press.
Ewell, P. T. (2012). Using information in higher education decision-making: Modes, obstacles,
and remedies (Draft 2). Paper prepared for the Spencer Foundation Organizational
Learning Initiative. Available at www.spencer.org.
Ewell, P. T., & Jones, D. J. (1994). Pointing the way: Indicators as policy tools in higher
education. In Ruppert, S. S. (Ed.), Charting higher education accountability: A
sourcebook on state-level performance indicators (6-15). Denver, CO: Education
Commission for the States.
Feldman, M. (2000). Organizational routines as a source of continuous change. Organization
Science, 11(6), 611–629.
Feldman, M. (2004). Resources in emerging structures and processes of change. Organization
Science, 15(3), 295–309. doi:10.1287/orsc.
Feldman, M., & Orlikowski, W. (2011). Theorizing practice and practicing theory. Organization
Science, 22, 1240–1253.
Feldman, M., & Pentland, B. (2003). Reconceptualizing organizational routines as a source of
flexibility and change. Administrative Science Quarterly, 48(1), 94–118.
Feldman, M., & Rafaeli, A. (2002). Organizational routines as sources of connections and
understandings. Journal of Management Studies, 39(3), 309–331.
Fichtenbaum, R. (2013, August 24). Statement on the President’s proposal for performance
based funding. Retrieved from http://www.aaup.org/news/statement-president’s-proposal-
performance-based-funding
Friedland, R., & Alford, R. R. (1991). Bringing society back in: Symbols, practices, and
institutional contradictions. In Powell, W. W., & DiMaggio, P. J (Eds.), The new
institutionalism in organizational analysis (pp. 232-263). Chicago: University of Chicago
Press.
Friedman, H. R. (1975, May 2). “Desegregation Plan Implementation/Preparation of HEW
Progress Report.” Memorandum from the Commonwealth of Pennsylvania Chief of the
Affirmative Action/Desegregation Division to the Presidents of the State Colleges and
Universities. Shippensburg University Archives, Government (State) Administration, Box
2.
Geertz, C. (1973). The interpretation of cultures. New York, NY: Perseus/Basic Books.
Giddens, A. (1984). The constitution of society: Outline of the theory of structuration. Malden,
MA: Polity Press.
Gioia, D. A., & Poole, P. P. (1984). Scripts in organizational behavior. Academy of Management
Review, 9(3), 449-459.
Glassman, R. B. (1973). Persistence and loose coupling in living systems. Behavioral Science,
18, 83-98.
Hall, C. E. (2011). “Advocacy philanthropy” and the public policy agenda: The role of modern
foundations in American higher education. Unpublished dissertation. Retrieved from
http://gradworks.umi.com/15/00/1500712.html
Hammersley, M., & Atkinson, P. (2007). Ethnography: Principles in practice (3rd ed.). New
York: Routledge.
Harbour, C. P., & Nagy, P. (2005). Assessing a state-mandated institutional accountability
program: The perceptions of selected community college leaders. Community College
Journal of Research and Practice, 29(6), 445–461.
Hearn, J. (1996). Transforming US higher education: An organizational perspective. Innovative
Higher Education, 21(2), 141–154.
Huisman, J., & Currie, J. (2004). Accountability in higher education: Bridge over troubled
water? Higher Education, 48(4), 529–551.
Ingram, D., Louis Seashore, K., & Schroeder, R. G. (2004). Accountability policies and teacher
decision making: Barriers to the use of data to improve practice. Teachers College
Record, 106(6), 1258-1287.
Jarzabkowski, P. (2004). Strategy as practice: Recursiveness, adaptation, and practices-in-use.
Organization Studies, 25(4), 529–560. doi:10.1177/0170840604040675
Jarzabkowski, P. (2008). Shaping strategy as a structuration process. Academy of Management
Journal, 51(4), 621–650.
Jenkins, D., & Kerrigan, M. R. (2008). Evidence-based decision making in community colleges:
Findings from a survey of faculty and administrator data use at Achieving the Dream
colleges. New York: Community College Research Center, Teachers College, Columbia
University. Retrieved from http://ccrc.tc.columbia.edu/Publication.asp?UID=653
Jenkins, D. (2011). Redesigning community colleges for completion: Lessons from research on
high-performance organizations. CCRC Working Paper No. 24. New York, NY:
Columbia University Teachers College, Community College Research Center.
Jenkins, D., Wachen, J., Moore, C., & Shulock, N. (2012). Washington State Student
Achievement Initiative policy study: Final report. Community College Research Center
and Institute for Higher Education Leadership and Policy.
Jones, D. P. (2013). Outcomes-based funding: The wave of implementation. Indianapolis, IN:
Complete College America. Retrieved from http://completecollege.org/pdfs/Outcomes-
Based-Funding-Report-Final.pdf
Jones, S. M., Dougherty, K. J., Lahr, H., Natow, R. S., Pheatt, L., & Reddy, V. (2014).
“Organizational learning for improved student outcomes in community colleges:
Structures and processes.” Paper presented at the annual conference of the American
Educational Research Association in Philadelphia, Pennsylvania.
Jones, S. R., Torres, V., & Arminio, J. L. (2006). Negotiating the complexities of qualitative
research in higher education: Fundamental elements and issues. New York, NY:
Routledge.
Kelly, P. J. (2008). Beyond social justice: The threat of inequality to workforce development in
the Western United States. Boulder, CO: Western Interstate Commission for Higher
Education.
Kezar, A. J. (2001). Understanding and facilitating organizational change in the 21st century:
Recent research and conceptualizations. ASHE-ERIC Higher Education Report, 28(4).
San Francisco: Jossey-Bass.
Kezar, A. J. (2011). What is the best way to achieve broader reach of improved practices in
higher education? Innovative Higher Education, 36(4), 235–247. doi:10.1007/s10755-011-
9174-z
Kezar, A. J. (2013). How colleges change: Understanding, leading, and enacting change. New
York, NY: Routledge.
Kristjánsson, K. (2005). Smoothing it: Some Aristotelian misgivings about the phronesis-praxis
perspective on education. Educational Philosophy and Theory, 37(4), 455–473.
Latour, B. (2005). Reassembling the social: An introduction to Actor-Network-Theory. New
York: Oxford University Press.
Latour, B. (2014). An inquiry into modes of existence. (C. Porter, Trans.). Cambridge, MA:
Harvard University Press.
Lawrence, T. B., & Suddaby, R. (2006). Institutions and institutional work. In Clegg, S. R.,
Hardy, C., Lawrence, T. B., & Nord, W. R. (Eds.), The SAGE handbook of organization
studies (2nd ed.) (pp. 215-254). Thousand Oaks, CA: Sage.
Leonardi, P. M. (2012). Car crashes without cars: Lessons about simulation technology and
organizational change from automotive design. Cambridge, MA: MIT Press.
Levine, A. (1980). Why innovation fails. Albany, NY: SUNY Press.
Lewin, T. (2013, August 22). Obama’s plan aims to lower cost of college. The New York Times.
Retrieved from http://www.nytimes.com/2013/08/22/education/obamas-plan-aims-to-lower-
cost-of-college.html
Lincoln, Y. S., & Guba, E. G. (1994). Paradigmatic controversies, contradictions, and emerging
confluences. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative
research (pp. 163-186). Thousand Oaks, CA: Sage.
Lincoln, Y. S., Lynham, S. A., & Guba, E. G. (2011). Paradigmatic controversies, contradictions,
and emerging confluences, revisited. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage
handbook of qualitative research (4th ed.) (pp. 97-128). Thousand Oaks, CA: Sage.
Luker, K. (2008). Salsa dancing into the social sciences: Research in an age of info-glut. Boston,
MA: Harvard University Press.
MacIntyre, A. (2007). After virtue: A study in moral theory (3rd ed.). Notre Dame, IN:
University of Notre Dame Press.
Marsh, J. A. (2012). Interventions promoting educators’ use of data: Research insights and gaps.
Teachers College Record, 114(110303), 1-48.
Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision making
in education. Evidence from recent RAND research (No. OP-170-EDU). Santa Monica,
CA: RAND Corporation.
Marshall, K. (2011, January 20). PASSHE revises performance funding system: Focus on
student success highlights changes approved by Board (press release). Pennsylvania State
System of Higher Education. Retrieved from
www.passhe.edu/inside/ne/press/Lists/Press%20Releases/pageview2.aspx
Meyer, J., & Rowan, B. (1977). Institutionalized organizations: Formal structure as myth and
ceremony. American Journal of Sociology, 83(2), 340–363.
Middle States Commission on Higher Education (2009). Report to the faculty, administration,
trustees, students of [Valley University]. Retrieved August 31, 2014 from [Valley
University] website.
Mohlenhoff, G. B. (1969, August 21). Letter to Dr. Ralph E. Heiges regarding a two-day
conference for academic deans in response to HEW Desegregation Order. Shippensburg
University Archives, Government (State) Administration, Box 2-1-43.
Mohlenhoff, G. B. (1971, April). “Desegregation in Higher Education.” Memorandum to
Pennsylvania State Colleges & Universities. Shippensburg University Archives,
Government (State) Administration, Box 2-1-43.
Mohlenhoff, G. B. (1971, July). “Desegregation in Higher Education.” Memorandum to
Pennsylvania State Colleges & Universities. Shippensburg University Archives,
Government (State) Administration, Box 2-1-43.
227
Morest, V. S., & Jenkins, D. (2007). Institutional research and the culture of evidence at
community colleges: Report no. 1 in the culture of evidence series. New York:
Community College Research Center, Teachers College, Columbia University. Retrieved
from http://ccrc.tc.columbia.edu/Publication.asp?uid=515
Morgan, G. (1986). Images of organization. Thousand Oaks, CA: Sage.
Neave, G. (2012). Change, leverage, suasion and intent: An historical excursion across three
decades of change in higher education in Western Europe. In Stensaker, B., Välimaa, J.,
& Sarrico, C. S. (Eds.), Managing reform in universities: The dynamics of culture,
identity and organizational change (pp. 19-40). New York, NY: Palgrave-Macmillan.
Neumann, A. & Pallas, A. (in press). Critical policy analysis, the craft of qualitative research,
and analysis of data on the Texas Top 10% law. In Martinez-Aleman, A., Pusser, B., &
Bensimon, E. M. (Eds.), Critical approaches to the study of higher education (n.p.).
Baltimore, MD: The Johns Hopkins University Press.
Nicolini, D. (2009a). Zooming in and out: Studying practices by switching theoretical lenses and
trailing connections. Organization Studies, 30(12), 1391–1418.
doi:10.1177/0170840609349875
Nicolini, D. (2009b). Medical innovation as a process of translation: a case from the field of
telemedicine. British Journal of Management, 21(4), 1011-1026.
Nicolini, D. (2013). Practice theory, work, & organization: An introduction. Oxford: Oxford
University Press.
Nisbett, R. E., & Wilson, T. D. (1977). Telling more than we can know: Verbal reports on
mental processes. Psychological Review, 84(3), 231-259.
Pennsylvania State System of Higher Education (PASSHE) (n.d.). Pennsylvania State System of
Higher Education Fact Sheet. Retrieved August 30, 2014 from
http://www.passhe.edu/partners/Documents/Pennsylvania%20State%20System%20of%2
0Higher%20Education%20Fact%20Sheet.pdf
Perna, L. W., & Thomas, S. L. (2006). A framework for reducing the college success gap and
promoting success for all. Commissioned report for the National Symposium on
Postsecondary Student Success: Spearheading a Dialog on Student Success. Washington,
DC: National Center for Education Statistics, National Postsecondary Education
Cooperative.
Polkinghorne, D. E. (2004). Practice and the human sciences: The case for a judgment-based
practice of care. Albany, NY: SUNY Press.
Prior, L. (2008). Researching documents. In Nagy Hesse-Biber, S., & P. Leavy (Eds.), Handbook
of emergent methods (111-126). New York, NY: Guilford Press.
Rhoades, G. (2012). Closing the door, increasing the gap: Who’s not going to (community)
college? University of Arizona, Center for the Future of Higher Education, Policy Report
#1.
Russell, A. (2011). Higher education policy brief: A guide to major U.S. college completion
initiatives. Washington, DC: American Association of State Colleges and Universities.
Saarinen, T., & Välimaa, J. (2012). Change as an intellectual device and as an object of research.
In Stensaker, B., Välimaa, J., & Sarrico, C. S. (Eds.), Managing reform in universities:
The dynamics of culture, identity and organizational change (pp. 41-60). New York, NY:
Palgrave-Macmillan.
Sabatier, P., & Mazmanian, D. (1980). The implementation of public policy: A framework of
analysis. Policy Studies Journal, 8(4), 538-560.
Schatzki, T. R. (2001). Introduction: Practice theory. In Schatzki, T. R., Knorr Cetina, K., &
Savigny, E. V. (Eds.), The practice turn in contemporary theory (pp. 1-14). New York,
NY: Routledge.
Schneider, A., & Ingram, H. (1990). Behavioral assumptions of policy tools. Journal of Politics,
52(2), 510-529.
Shulock, N. (2003). A fundamentally new approach to accountability: Putting state policy issues
first. Paper presented at the 28th conference of the Association for the Study of Higher
Education in Portland, OR. Sacramento, CA: California State University, Sacramento,
Institute for Higher Education Leadership & Policy. Retrieved from
http://www.csus.edu/ihelp/PDFs/R_ASHE_Accountbility_10-03.pdf
Slack, M. (2013, August 22). President Obama explains his plan to combat rising college costs.
The White House Blog. Retrieved from
http://www.whitehouse.gov/blog/2013/08/22/president-obama-explains-his-plan-combat-
rising-college-costs
Slaughter, S. & Leslie, L. (1997). Academic capitalism: Politics, policies and the entrepreneurial
university. Baltimore, MD: The Johns Hopkins University Press.
Slaughter, S. & Rhoades, G. (2000). The neo-liberal university. New Labor Forum, 6, 73-79.
Slaughter, S. & Rhoades, G. (2004). Academic capitalism and the new economy: Markets, state,
and higher education. Baltimore, MD: The Johns Hopkins University Press.
Smay, P. (1969, September 18). Letter from Dr. Smay, Vice President for Academic Affairs, to
the Academic Council of the Pennsylvania State Colleges & Universities. Shippensburg
University Archives, RG 4 – Provost and Vice President for Academic Affairs, Series 1,
Academic Office, Box 2.
Smith, J., & Deemer, D. (2000). The problem of criteria in the age of relativism. In Denzin, N. K.,
& Lincoln, Y. S. (Eds.), Handbook of qualitative research (2nd ed., pp. 877–896). London:
Sage.
Sporn, B. (1999). Adaptive university structures: An analysis of adaptation to socioeconomic
environments of US and European universities. Philadelphia, PA: Jessica Kingsley
Publishers.
Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage.
Stensaker, B. (2003). Trance, transparency and transformation: The impact of external quality
monitoring on higher education. Quality in Higher Education, 9(2), 151–159.
doi:10.1080/13538320308158
Stensaker, B., Henkel, M., Välimaa, J., & Sarrico, C. S. (2012). Introduction: How is change in
higher education managed? In Stensaker, B., Välimaa, J., & Sarrico, C. S. (Eds.),
Managing reform in universities: The dynamics of culture, identity and organizational
change (pp. 1-16). New York, NY: Palgrave-Macmillan.
Stroh, N. W. (1969, September 9). Small-town attitudes keep races apart at state-run colleges.
The Philadelphia Evening Bulletin (page number illegible). Shippensburg University
Archives, Government (State) Administration, Box 2.
Summers, L. H. (2012, January 20). What you (really) need to know. The New York Times,
online edition (ED26). Retrieved August 8, 2014.
Taylor, J. (2001). Toward a theory of imbrication and organizational communication. The
American Journal of Semiotics, 17(2), 269–298.
Thévenot, L. (2001). Pragmatic regimes governing the engagement with the world. In Schatzki,
T. R., Knorr Cetina, K., & Savigny, E. V. (Eds.), The practice turn in contemporary
theory (pp. 56-73). New York, NY: Routledge.
Thompson, J. D. (1967). Organizations in action: Social science bases of administrative theory.
New York: McGraw-Hill.
Thornton, P. H., & Ocasio, W. (2008). Institutional logics. In Greenwood, R., Oliver, C., Sahlin,
K., & Suddaby, R. (Eds.), The SAGE handbook of organizational institutionalism (pp.
99-129). Thousand Oaks, CA: Sage.
Thornton, P. H., Ocasio, W., & Lounsbury, M. (2012). The institutional logics perspective: A
new approach to culture, structure, and process. Oxford, UK: Oxford University Press.
Tierney, W. G. (1998). On the road to recovery and renewal: Reinventing academe. In Tierney,
W. G. (Ed.), The responsive university: Restructuring for high performance (pp. 1-12).
Baltimore, MD: The Johns Hopkins University Press.
Tucker, A. L., Nembhard, I. M., & Edmondson, A. C. (2007). Implementing new practices: An
empirical study of organizational learning in hospital intensive care units. Management
Science, 53(6), 894–907.
Turner, S. (1994). The social theory of practices: Tradition, tacit knowledge, and
presuppositions. Chicago: University of Chicago Press.
Turner, S. (2001). Throwing out the tacit rule book: Learning and practices. In Schatzki, T. R.,
Knorr Cetina, K., & Savigny, E. V. (Eds.), The practice turn in contemporary theory (pp.
120-130). New York, NY: Routledge.
United States Department of Education, Office for Civil Rights (1991, March). Historically black
colleges and universities and higher education desegregation. Washington, DC: Author.
Retrieved August 30, 2014, from
http://www2.ed.gov/about/offices/list/ocr/docs/hq9511.html
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes.
Cambridge, MA: Harvard University Press.
Warren, E., quoted in Field, W. (2013, December 13). Senators say accreditors are ineffective
and beset by conflicts of interest. Chronicle of Higher Education. Retrieved from
http://chronicle.com/article/Senators-Say-Accreditors-Are/143589/
Weick, K. (1976). Educational organizations as loosely coupled systems. Administrative Science
Quarterly, 21(1), 1–19.
Weick, K. (1995). Sensemaking in organizations. Thousand Oaks, CA: Sage.
Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. New York:
Cambridge University Press.
Wenner, D. (2013, June 24). Pennsylvania universities say they don't admit students based on
race, so affirmative action decision won't have an impact. PennLive, The Patriot-News,
online edition. Retrieved August 30, 2014, from
http://blog.pennlive.com/midstate_impact/print.html?entry=/2013/06/affirmative_action_
supreme_cou.html
Witham, K. A., & Bensimon, E. M. (2012). Creating a culture of inquiry around equity and
student success. In S. D. Museus & U. M. Jayakumar (Eds.), Creating campus cultures:
Fostering success among racially diverse student populations (pp. 46-67). New York,
NY: Routledge.
Yanow, D. (2000a). Seeing organizational learning: A ‘cultural’ view. Organization, 7(2), 247–
268.
Yanow, D. (2000b). Conducting interpretive policy analysis. Qualitative research methods series
no. 47. Thousand Oaks, CA: Sage.
Ybema, S., Yanow, D., Wels, H., & Kamsteeg, F. (2009). Studying everyday organizational life.
In Ybema, S., Yanow, D., Wels, H., & Kamsteeg, F. (Eds.), Organizational ethnography:
Studying the complexities of everyday life (pp. 1-20). Thousand Oaks, CA: Sage.
Zachry, E. M. (2008). Promising instructional reforms in developmental education: A case study
of three Achieving the Dream colleges. New York, NY: MDRC.
Zemsky, R. (2013). Checklist for change: Making American higher education a sustainable
enterprise. New Brunswick, NJ: Rutgers University Press.
Appendix I: List of Data Collected at Each Site
Hillside University
Interviews: (1) student affairs staff member; (2) faculty member; (3) student affairs staff member; (4) administrator; (5) student affairs staff member; (6) student affairs staff member
Observations: informal observations/interviews
Documents collected/analyzed:* Equity Scorecard committee final reports; Deliverology data report; brochures for students/families; website

Valley University
Interviews: (1) faculty member; (2) faculty member; (3) administrator; (4) administrator; (5) faculty member
Observations: informal observations/interviews
Documents collected/analyzed:* Equity Scorecard committee final reports; university archival material;** Academic Master Plan; accreditation reports (various years, 1980-2009)

Old Main University
Interviews: (1) student affairs staff member; (2) faculty member; (3) faculty member; (4) administrator; (5) faculty member; (6) faculty member
Observations: formal observation of Equity Matters training
Documents collected/analyzed:* Equity Scorecard committee final reports; university archival material;** Strategic Plan; Equity Matters workshop agenda, data template, and associated documents; accreditation reports (various years, 1980-2009)
*Some artifacts I was not able to copy and retain; these I analyzed on-site and noted observations in field
memos.
**Citations for any documents referenced specifically are included in the bibliography.
Appendix II: Annotated List of Deductive and Inductive/Emergent Codes
Deductive codes & origins
Accountability: References to external or internal accountability measures, e.g., policy, accountability, funding, strategic planning, etc. (broad literature review)

Change: References (explicit or implicit) to, or indications of, specific changes having occurred in practice, policy, and so on. (broad literature review)

Character, first person: Descriptions of self/narratives told from a first-person perspective, or descriptions of self-identity. (Daiute, 2014)

Character, third person: Descriptions of others/narratives about events told from a third-person perspective, describing aspects of others’ personalities or roles. (Daiute, 2014)

Communicative practices: References (explicit or implicit) to practices involving written or spoken actions, performative documents (e.g., strategic plans), networks of communication, and so on. (Latour, 1984; Nicolini, 2013)

Data use: References to or narratives about the use of data as a component of practices within activity settings directed to specific objectives (e.g., committees), or describing values, preferences, attitudes, and experiences with respect to data as an artifact of practice. (broad literature review)

History: References (explicit or implicit) to prior practices, the “way things used to be,” specific historical moments or events, or narratives about self or organization in the distant past. (CHAT broadly)

Language: References (explicit or implicit) to specific words, phrases, or concepts that took on unique meanings within specific activity settings and/or that could be traced across settings. (Nicolini, 2009; CHAT broadly)

Inst. logic, equity/diversity: References (typically implicit) to institutional norms, histories, policies, values, or practices pertaining to the enrollment and success of students of color or low-income students; uses of particular words in relation to “underrepresented,” “URM,” etc. (Bensimon, 2005; Thornton, Ocasio, & Lounsbury, 2012)

Inst. logic, prestige/rankings: References (typically implicit) to institutional norms, histories, policies, values, or practices related to the university’s pursuit of “excellence” or institutional identity in terms of quality, national or regional rankings, awards, selectivity in admissions, etc. (Thornton, Ocasio, & Lounsbury, 2012)

Practices: References (explicit or implicit) to specific activities or sets of activities collectively or individually pursued in order to meet a stated objective; also references to settings in which activities were pursued and any explicit references to structures that shaped those activities. (Nicolini, 2013; Schatzki, 2001)

Professional identity: References (explicit or implicit) to one’s own identity as a professional, including references to disciplinary background or training, education, self-identity in relation to a professional role, descriptions of work duties or obligations, etc. (Thornton, Ocasio, & Lounsbury, 2012; Polkinghorne, 2004; Bensimon, 2007; Thévenot, 2001)

Resources: References (explicit or implicit) to funding or other resources that enable activities or relate to the potential for pursuing certain courses of action; other resources include physical space, materials, and time, as well as symbolic resources such as legitimacy, credibility, authority, and power in relationship to practice. (Nicolini, 2009; Feldman & Orlikowski, 2011)

Structures: References (explicit or implicit) to specific organizational structures (hierarchy, policy, division of labor, etc.) or to the conditions of practice created in relationship to those organizational structures (power, professional efficacy, patterns of interaction, informal routines, etc.). (Latour, 1984; Latour, 2014; Giddens, 1984; Nicolini, 2013; Schatzki, 2001)

Technology: References (explicit or implicit) to specific ways of doing work, including but not limited to technological mechanisms (i.e., computers), structured processes (e.g., the Scorecard), and the inanimate mediators of those processes. (Latour, 1984; Polkinghorne, 2004)

Tools/artifacts: References (explicit or implicit) to specific material objects used in the course of activities directed to a specific objective. (CHAT broadly; Nicolini, 2009)

Value statements: Explicit statements affirming a value with respect to some element of organizational activity, objectives of change initiatives, or larger goals (equity, student success, etc.); specifically, use of value-conferring adjectives when describing activities or people (e.g., brilliant, important, dangerous, corporate, unethical, painful, reaffirming, proud). (Daiute, 2014)
Inductive/emergent codes
Leadership: References (explicit or implicit) to specific practices of individuals in senior leadership positions (here defined as president or provost).

Learning structures:* References to or descriptions of specific forums, or instances of structures generative of practices, where the specific objective at hand is not to change student outcomes directly but to learn about or evaluate student outcomes as a matter of individual, group, or organizational routine. (Clark, 1998; Sporn, 1999; Levine, 1980; Nicolini, 2013)

Inst. logic, other: References (typically implicit) to other institutional norms, histories, policies, values, or practices that emerged consistently across interviews within a setting to convey a certain regime of practice; for example, pertaining to the scarcity of resources, distributed leadership, etc. (Thornton, Ocasio, & Lounsbury, 2012)

Personal/moral values:** Narrative accounts that conveyed a belief based on professional, religious, or other (i.e., non-organizational) values, particularly in relationship to practice. (Thévenot, 2001; Polkinghorne, 2004; Bensimon, 2007; MacIntyre, 2007; Thornton, Ocasio, & Lounsbury, 2012)

Self-efficacy: Narrative accounts involving first-person statements or indications (explicit or implicit) of one’s own sense of efficacy vis-à-vis a set of activities or a specific objective; references to individual potential in relation to organizational activity.

Skepticism: References to or indications of (explicit or implicit) skepticism specifically in relation to the change initiatives or similar committee work, or components of those activities (data use, inquiry, making recommendations, etc.); also references to or indications of resistance, frustration, hesitancy, or other negative emotional reactions.
Notes:
*This code emerged during my data analysis, at which point I went back to the literature and the theory to try to
better understand and conceptualize what I was seeing in the data; I then re-analyzed the data with a refined
conceptual lens to identify instances that were distinct from data elements coded under the more general “structures”
code.
**This code also emerged during my initial data collection and field memos, so I went back and revised my
theoretical framework to better account for it, drawing on the literature cited here in particular. Subsequently the
code functioned more deductively as I looked for confirmation of its distinct role in the narratives in my data.
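The codes above were applied interpretively by the researcher, not computationally. For readers who manage codebooks and coded transcripts digitally, though, the basic bookkeeping (a codebook, segments tagged with codes, and counts of code applications) can be sketched as below. This is purely illustrative: the code names are abbreviated from Appendix II, and all excerpts and counts are hypothetical, not drawn from the study's data.

```python
# Illustrative sketch only: a minimal digital codebook and tally of code
# applications across coded transcript segments. Excerpts are invented.
from collections import Counter

# Codebook: short code name -> abbreviated definition (cf. Appendix II).
codebook = {
    "accountability": "References to external or internal accountability measures",
    "data_use": "References to or narratives about the use of data in practice",
    "self_efficacy": "First-person indications of one's own sense of efficacy",
}

# Each coded segment pairs an excerpt with the codes applied to it.
coded_segments = [
    {"excerpt": "The scorecard made us look hard at our own numbers.",
     "codes": ["data_use", "accountability"]},
    {"excerpt": "I finally felt I could change something in my department.",
     "codes": ["self_efficacy"]},
]

# Tally how many segments each code was applied to.
code_counts = Counter(
    code for seg in coded_segments for code in seg["codes"]
)

for code, count in sorted(code_counts.items()):
    print(f"{code}: {count}")
```

A structure like this makes it easy to audit coverage of the codebook (e.g., codes never applied) as analysis proceeds, which is one reason qualitative-analysis software maintains essentially this mapping internally.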
Appendix III: Sample Interview Protocol
Introduction
Thank you for taking time to meet with me today. The purpose of this interview is to learn about your
experience participating in ___________[name of initiative/project]. This interview is part of a research
project funded by the Spencer Foundation exploring the effectiveness of data-use and inquiry within
higher education institutions. We are particularly interested in data-use around equity and student
outcomes.
We are interested in learning about your experiences participating in this initiative and, specifically,
what kinds of activities you have engaged in, what kinds of data or other information you used in the
course of your participation, and what you saw as the objectives and outcomes of this project/initiative.
It’s important to emphasize that your participation in this research is completely voluntary, and that all
of your responses to these questions will be completely confidential and anonymous. At any point, you
are welcome to say that you’d prefer not to answer any of my questions or that you wish to stop the
interview altogether.
With your permission, we will be digitally recording this interview so that we can accurately transcribe it
later on. Also, as we prepare our write-up of findings from the interviews, we will provide you with a
summary so that you can check to make sure we’ve captured your responses fairly and accurately.
Do you have any questions before we begin?
****
1. Can you tell me how you got involved in the equity scorecard/deliverology?
2. Do you recall how the project was introduced to you and what your sense was of the
purpose or objectives of the project?
3. Can you describe what sorts of activities in particular you engaged in during the course of this
project?
4. Can you describe any ways in which this committee and these activities were similar to or
different from prior committee work you’ve done focused on issues of student success,
retention, diversity, equity, and so on?
5. Can you describe how the focus on race and issues of racial equity felt for you personally and
how you perceived the quality of group discussion about race?
6. Do you recall the process your committee went through to arrive at recommendations? Can you
describe that process, including how the group prioritized or evaluated ideas?
7. Moving forward, in what ways if any do you see the work of the committee being extended or
adopted within the university?
Appendix IV: Informed Consent Document
University of Southern California
Rossier School of Education
Waite Phillips Hall
Suite 702
Los Angeles, CA 90089-4037
INFORMED CONSENT FOR NON-MEDICAL RESEARCH
CONSENT TO PARTICIPATE IN RESEARCH
You are invited to participate in a research study conducted by the Center for Urban Education
(CUE) at the University of Southern California; the principal investigator for this project is
Alicia C. Dowd, associate professor of higher education at the University of Southern California
and co-director of CUE.
You are eligible to participate in this study because you are a member of staff at the
Pennsylvania State System of Higher Education (PASSHE), or an administrator or member of
the faculty or staff at one of the PASSHE member institutions. Your participation is voluntary.
You may withdraw your participation at any time.
PURPOSE OF THE STUDY
The purpose of this study is to understand the factors and conditions at post-secondary
institutions that facilitate or hinder the use of data towards changing educational practice,
improving institutional effectiveness, and achieving equity in student outcomes.
PROCEDURES
If you volunteer to participate in this study, you will be asked to participate in one or more
interviews. The interview(s) will be tape recorded with your permission. The interviews are
anticipated to take an hour each. You may also be asked to allow the researchers to conduct
observations of meetings or institutional sites in which you may be present. Finally, you may be
asked to provide publicly available documents related to the study.
POTENTIAL RISKS AND DISCOMFORTS
There are no anticipated risks to your participation. However, you may feel uncomfortable
discussing your assessment of your institution’s culture towards the use of data, changes in
practice, and commitment to racial equity initiatives, as well as being audio-taped. If you feel
uncomfortable with any question or request, you may decline to respond.
POTENTIAL BENEFITS TO SUBJECTS AND/OR TO SOCIETY
You may not directly benefit from your participation in this study. It is hoped that the results of
this study will help develop better understanding about the ways in which data are used at post-
secondary institutions for changes in educational practices and improvements in institutional
effectiveness.
PAYMENT/COMPENSATION FOR PARTICIPATION
You will not be paid for participating in this research study.
CONFIDENTIALITY
We will keep your records for this study confidential as far as permitted by the law. However, if
we are required to do so by law, we will disclose confidential information about you. The
members of the research team and the University of Southern California’s Human Subjects
Protection Program (HSPP) may access the data. The HSPP reviews and monitors research
studies to protect the rights and welfare of research participants.
Data collected from individual participants in this study will be coded by the respondent’s
initials. The name of the individual participant’s university will be known and named in the data.
In reporting results, names will be represented with pseudonyms (false names). These
pseudonyms will be used in the transcriptions, as well as in all presentations and publications
that result from this study. However, PASSHE’s identity as the setting for this study will not be
anonymous. An individual’s pseudonym may be paired in some reports with the name of that
person’s university.
The data will be stored in password protected documents on the computers of participating
researchers and in a shared (password protected) Box.com file. Hard copies will be stored in the
on- and off-campus offices of the researchers. All data obtained for this study will be kept in files
in locked offices to prevent access by unauthorized personnel.
Audiotapes will be used to ensure that we correctly document the information you provide and
for writing reports based on the interview. The audiotapes will be transcribed and will be kept for
a period of five years before they are destroyed.
PARTICIPATION AND WITHDRAWAL
You can choose whether to be in this study or not. If you volunteer to be in this study, you may
withdraw at any time without consequences of any kind. If you withdraw from the study, your
identity will be removed from all written reports and other documents generated by this study.
You may also refuse to answer any questions you do not want to answer and still remain in the
study. The investigator may withdraw you from this research if circumstances arise that require
doing so.
ALTERNATIVES TO PARTICIPATION
Your alternative is to not participate in this study. Your relationship with PASSHE, your
institution, or USC will not be affected, whether or not you participate in this study.
IDENTIFICATION OF INVESTIGATORS
If you have any questions or concerns about the research, please feel free to contact Alicia C.
Dowd at 213/740-5202, or write to her at adowd@usc.edu or at the Center for Urban Education,
The University of Southern California, Rossier School of Education, Waite Phillips Hall, Suite
702, Los Angeles, CA 90089-4037.
RIGHTS OF RESEARCH SUBJECTS
You may withdraw your consent at any time and stop participation without penalty. You are not
waiving any legal claims, rights or remedies because of your participation in this research study.
If you have questions regarding your rights as a research subject, contact the University Park
Institutional Review Board (UPIRB), 3720 South Flower Street #301, Los Angeles, CA 90089-
0702, (213) 821-5272 or upirb@usc.edu.
SIGNATURE OF RESEARCH SUBJECT
I understand the procedures described above, and I understand fully the rights of a potential
subject in a research study involving people as subjects. My questions have been answered to my
satisfaction, and I agree to participate in this study. I have been given a copy of this form.
□ I agree to be audio-recorded
□ I do not want to be audio–recorded
Name of Subject
___
Signature of Subject Date
SIGNATURE OF INVESTIGATOR
I have explained the research to the subject and answered all of his/her questions. I believe that
he/she understands the information described in this document and freely consents to participate.
Name of Person Obtaining Consent
Signature of Person Obtaining Consent Date
Abstract
This dissertation examines ways in which change occurs in higher education institutions. Though building upon traditions of scholarship on organizational change both in higher education and other settings, this study makes unique and timely contributions in two key ways. First, the study focuses on what I argue is a genre of change initiative characteristic of the so-called “completion agenda” that has driven much of public higher education policy and practice over the last five to seven years. Such initiatives are anchored in organizational practice but linked explicitly to state or system accountability policy, they share a common set of objectives related to completion and equity in student achievement, and they tend to rely on team-based data use as a structure and mechanism for organizational change processes.

Second, this study adopts a practice-theory approach to studying organizational change in an attempt to (1) overcome what I argue are some conceptual limitations endemic to much of the extant research on change in higher education and (2) demonstrate a theoretical approach better suited to the current contexts of change in particular. Practice theory puts emphasis on the significance of individuals’ daily work activities for understanding the nature of organizational activity, social work settings, organizational history and culture, and institutional contexts. The practice-theory lens enables me in this study to ascertain both the nature of change within the specific settings of committees tasked with developing recommendations for improvements in student success and equity, as well as the organizational implications of those committees’ work over time. Practice theory also helps clarify the ways in which the specific accountability contexts and data-use mechanisms characteristic of completion-agenda reform efforts may serve to support (or inhibit) practice change within institutions.

Through a case study of the Pennsylvania State System of Higher Education (PASSHE), this study investigates how a particular set of initiatives—the Equity Scorecard and Deliverology—have brought about change within universities in the context of outcomes-based funding policy focused on completion and equity. This study focuses in-depth on the practices and changes in practice associated with those particular initiatives as they were implemented by committees comprised of professionals from across roles (administrators, faculty, and professional staff) at three purposefully sampled universities within PASSHE.

The findings of this study are presented in the form of eight propositions about the nature of change related to these initiatives and the policy and institutional contexts in which they were enacted. Those propositions are then elaborated through case narratives describing both patterns and variations across the three universities in the nature of practice change associated with the initiatives. Both the propositions and the case narratives illustrate the complexity of interacting factors that mediate practices directed at improving student success and equity. In most cases, the practices associated with the specific change initiatives represented reconfigurations of existing types of practices (e.g., retention committees): though the universities had engaged in similar types of efforts previously, the accountability policy context of the system’s new outcomes-based funding policy and the specific tools, language, artifacts, and technologies provided by the initiatives (Equity Scorecard and Deliverology) served in some cases to significantly re-mediate those existing practices. Those re-mediated practices then had organizational implications to the extent that they created resources for ongoing learning and practice-change across the university, particularly in the form of new structures for learning (e.g., data-use routines and venues for collective inquiry).
Asset Metadata
Creator: Witham, Keith Allen (author)
Core Title: Making equity & student success work: practice change in higher education
School: Rossier School of Education
Degree: Doctor of Philosophy (Education)
Publication Date: 10/29/2015
Defense Date: 10/06/2014
Publisher: University of Southern California
Advisors: Bensimon, Estela Mara (committee chair); Burch, Patricia, Dowd, Alicia C., and Riley, Patricia (committee members)
Language: English
Tags: accountability policy; data use; equity in higher education; higher education; organizational change; organizational learning; practice theory
Permanent Link (DOI): https://doi.org/10.25549/usctheses-c3-511756
Repository: USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA