AN EVALUATION OF INDIVIDUAL LEARNING AMONG MEMBERS
OF A DIVERSITY SCORECARD PROJECT EVIDENCE TEAM
by
Donna Marie Post
A Dissertation Presented to the
FACULTY OF THE ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
May 2007
Copyright 2007 Donna Marie Post
DEDICATION
I want to thank my family, advisor, dissertation committee members, members of the
Diversity Scorecard thematic group, coworkers, and friends for their patience and
support during the process of researching and writing this dissertation.
TABLE OF CONTENTS
DEDICATION ii
ABSTRACT vi
CHAPTER 1: INTRODUCTION 1
The Study 1
The Importance of an Evaluation 2
The Diversity Scorecard Project 2
The Four Perspectives of the Diversity Scorecard 6
Practitioner-as-Researcher 8
The Goals of the Diversity Scorecard 9
The Statement of the Problem 11
The Research Questions 12
CHAPTER 2: REVIEW OF LITERATURE 14
Transformative Learning 15
The Existence of Organizational Learning 17
Perspectives of Organizational Learning 18
Team Learning 20
Individual Learning 21
Problems Uncovered 24
Awareness 25
Interpretation 26
Learning Theories Used 29
Summary 30
CHAPTER 3: METHODOLOGY 32
Design Strategy 32
Research Questions 33
Becoming Knowledgeable 34
Design Decisions 34
Generalizability, Credibility, and Validity 36
Data Collection 37
First Data Source 38
Second Data Source 39
Third Data Source 40
Data Analysis 40
Analyzing Field Notes 41
Ethical Considerations 45
CHAPTER 4: RESULTS 47
The Research Questions 50
The First Research Question 50
The Second Research Question 51
Organization of the Five Case Studies 51
The Evidence Team 52
How the Evidence Team Made Use of the DSP 57
Case Study One: Theresa 59
Feelings and Attitudes 61
Awareness 63
Conclusion 66
Case Study Two: Katherine 66
Feelings and Attitudes 68
Awareness 71
Conclusion 72
Case Study Three: James 73
Feelings and Attitudes 76
Awareness 78
Interpretation 82
Conclusion 84
Case Study Four: Richard 85
Feelings and Attitudes 87
Awareness 94
Interpretation 96
Conclusion 97
Case Study Five: Robert 99
Feelings and Attitudes 100
Awareness 105
Interpretation 108
Conclusion 110
A Comparison of Learning between Evidence Team Members 111
CHAPTER 5: SUMMARY 125
A Memo to the President of My College 125
Institutionalization of the Equity Scorecard Project 126
Benefit of the Equity Scorecard Project to Other Programs 127
Importance of Existing Data 128
Criteria for the Selection of an Evidence Team 129
Cost of Implementation 130
What I Learned 131
Conclusion 134
REFERENCES 136
APPENDIX A: CHRONOLOGICAL ORDER OF MEETINGS 139
APPENDIX B: SAMPLE STRUCTURE OF FIELD NOTES 141
APPENDIX C: MAGNITUDE OF STATEMENTS 142
APPENDIX D: COMPARISON OF EVIDENCE TEAM MEMBERS 147
ABSTRACT
The Diversity Scorecard Project evaluated in this study was created by the
University of Southern California’s Center for Urban Education in order to foster an
institutional awareness of inequities in educational outcomes that exist for
underrepresented students. It was the hope of the creators of the Diversity Scorecard
Project that, through action research, the practitioners-as-researchers would develop
into equity oriented thinkers and become change agents at the institution. I conducted
this evaluation as a naturalistic inquiry to determine the impact of individual learning
among the evidence team members at one of fourteen institutions that participated in
the Diversity Scorecard Project over a span of four years. I present five nested case
studies, one story for each evidence team member, then present a comparison of the
impact the Diversity Scorecard Project had on all the individuals. As a result of this
study, I determined that, after the evidence team members examined institutional
data disaggregated by race and ethnicity, learning had taken place and they had
become change agents for their institution.
CHAPTER 1: INTRODUCTION
Education is undergoing many changes, of which public scrutiny is a key
concern (Kezar & Eckel, 1999). The topic of this dissertation, in keeping with
society’s need for accountability, is an evaluation of one aspect of a project whose
intent was to narrow the gap in postsecondary educational outcomes for
historically underrepresented students. This chapter consists of four sections: (a) the
study, which describes the type of study being implemented and why evaluation is
important; (b) the Diversity Scorecard Project (DSP), which describes aspects of the
project being evaluated, the four perspectives chosen for the project, how and why
practitioner-as-researcher was implemented, and the goals of the DSP; (c) the
statement of the problem; and (d) the questions that frame the evaluation.
The Study
The purpose of this qualitative study was to evaluate learning among the
members of 1 of the 14 institutional evidence teams that participated in the Diversity
Scorecard Project (DSP), a four-year project developed by researchers at the
University of Southern California’s Center for Urban Education (CUE). The CUE
creators designed a change-oriented intervention to address the incidence of
inequitable postsecondary educational outcomes for historically underrepresented
students. This study focused on the learning that happened among the individuals
that made up the evidence team at one of the participating colleges. The following
subsection delineates information regarding evaluations since the need for an
evaluation is not intuitively obvious.
The Importance of an Evaluation
The importance of an interpretive evaluation after the completion of a
program such as the Diversity Scorecard Project is well stated by Ernest T. Stringer
(1999):
Evaluation should, ultimately, assess the worth of a set of activities or a
project according to its impacts on the primary stakeholders. Many
evaluations focus on the activities in which project members engaged but fail
to provide any indication of the extent to which the process has made an
impact on the lives of the people for whom the project was formulated. (p. 158)
The intent of an evaluation is to “judge the overall effectiveness to inform
major decisions about whether a program should continue” (Patton, 2002, p. 218).
This evaluation of the DSP is interpretive and answers two research questions that
focus on one aspect of the DSP with the intent to help the creators of the project
assess the impact of the project in order to decide upon future implementations. This
study used typical case sampling, in which the sample is illustrative of a typical
program (Patton, 2002). The chosen institution is not in any way an extreme case in
that it is not unusual, poor, or excellent; it is merely 1 of 14 institutions that
participated in and completed the DSP, which means that the president of the
institution received the DSP evidence team members’ report. Since it is a single-site
typical sample, this evaluation will not be generalizable to other institutions.
The Diversity Scorecard Project
The Diversity Scorecard Project was a four-year project begun in 2001 as an
intervention to help lessen the gap in postsecondary educational outcomes for
historically underrepresented students in higher educational institutions. A bottom-up
philosophy was the foundation of the Diversity Scorecard Project (DSP) design. The
premise was that individual institutional actors must first become aware of inequities
in educational outcomes within their institution before institutional change can
occur. To create awareness that leads to
change within the institution, the DSP focused on effecting change in the mental
models of the individuals who participated in the DSP, where a mental model is “a
cognitive frame that shapes how one perceives something” (Boyce, 2003, p. 128) and
“a cognitive frame is the way in which an individual understands a situation”
(Bensimon, 2004a, p. 8). The intent of the DSP was to involve practitioners within
the already diverse institution in a collaborative inquiry in which they would review
existing data available within their institution, but this time disaggregated by race
and ethnicity.
It was the belief of the DSP creators that by disaggregating data by race and
ethnicity the participants would develop an awareness of any gaps that exist in the
institution. A key concept in the DSP was that if the participants as “practitioners-as-
researchers,” instead of the CUE researchers, collected and analyzed the
disaggregated data, the participants would develop a deeper awareness of differences
in educational outcomes among underrepresented students. The main principle of the
DSP was that “learning and change are socially constructed and facilitated by
engagement in a joint productive activity” (Bensimon, 2005). Within this principle is
the underlying concept of the DSP that individuals who are in positions that can
influence student outcomes need to interpret the problem from the standpoint of
equity in order to close the gap in educational outcomes where “equity is defined as
the point at which a particular ethnic group’s representation across all majors,
programs, honors, and so on at the institution is equal to the group’s representation in
the student body” (Bensimon, Polkinghorne, Bauman, & Vallejo, 2004, p. 110). To
begin a process in which individuals could become equity minded, individuals from
each institution joined to create small evidence teams. Each team had the task of
creating indicators within the perspectives of the scorecard to demonstrate the current
performance of minority students at their own institution. The expectation, based on
constructivist theories of learning, was that individuals, through their involvement in
a collaborative inquiry project, would develop a new awareness of inequities in
educational outcomes and be more likely to reflect on their own role in addressing
them.
Based on a business model, The Balanced Scorecard (Kaplan & Norton,
1996), and an academic model, The Academic Scorecard (O’Neil, Bensimon,
Diamond, & Moore, 1999), the DSP adapted the vision and strategy perspectives to
better complement the visions and strategies that are critical to the success of closing
the educational gap between minority students and other students within educational
institutions. The DSP is a framework that consists of the following four perspectives:
(a) access, (b) excellence, (c) institutional receptivity, and (d) retention. Using these
four perspectives with the understanding that “measuring student performance
involves choosing a set of indicators and instruments” (Fuhrman, 1999, p. 3), the task
of the practitioners-as-researchers was “to focus their attention on the accountability
side of diversity” (Bensimon et al., 2004, p. 112) by analyzing their institution’s
disaggregated data then creating indicators for each of the four perspectives. It was
the hope of the DSP creators that this would create awareness and understanding of
the inequities found within the evidence team members’ own institution in order to
shift their cognitive frame from one of diversity or deficit to one of equity. The
deficit cognitive frame “reflects a theory that blames unequal outcomes on
stereotypical characteristics attributed to racial and ethnic groups” (Bensimon,
2004b, p. 15); the diversity cognitive frame focuses on “whether the population of an
institution is diverse” (Bensimon, Peña, & Castillo, 2006, p. 18); and the equity
cognitive frame attributes unequal outcomes “to racialized institutional structures and
practices that have a cumulative effect of placing racial and ethnic minorities (as well
as women) at a disadvantage” (Bensimon, 2004b, p. 21). The practitioners-as-
researchers were chosen for each of the 14 institutions of higher education by the
president of the institution. Each institution was requested to appoint at least three
individuals including an institutional researcher who would all become the
practitioners-as-researchers with the understanding that the representatives from
CUE would only be involved as facilitators of the DSP. The hope of the CUE
developers was that the indicators for the four perspectives of the DSP and the two
goals of the DSP, discussed later, would reach fruition by using the action research
methodology of practitioner-as-researcher.
The Four Perspectives of the Diversity Scorecard
The DSP is a framework that consists of four perspectives. The task was to
analyze existing data and create indicators within each perspective. The suggestion of
the CUE facilitators was for evidence team members to use the Diversity Scorecard
as a tool to analyze existing institutional data, but this time in a way it may not
have been examined before: disaggregated by race and ethnicity. They were to view
these data from the four perspectives, (a) access, (b) excellence, (c) institutional
receptivity, and (d) retention, to find indicators that could reveal critical gaps in the
educational performance of underrepresented students.
The access perspective was developed to provide “the same degree of
diversity [that exists on campus] in those educational outcomes that indicate students
of color have an opportunity to gain access to opportunity and power” (Bensimon et
al., 2004, p. 112) through “access to institution’s programs and resources”
(Bensimon, 2004b, p. 46) and was defined to answer questions regarding access to
the institution, internships and fellowships, transfer to four-year colleges, academic
programs, etc. The excellence perspective refers to “excellent performance among
students from low-income and traditionally underserved racial/ethnic groups”
(Bensimon, 2005) and was developed to answer questions within two components,
access and achievement. When viewing excellence regarding access, the question of
whether a course created by divisions is a “gatekeeper” (preventing access) or
“gateway” (creating access) for the minority students arises. When viewing
excellence regarding achievement, questions arise regarding GPA, completion rates,
and eligibility for graduate school. The institutional receptivity perspective
“examines dimensions of institutional support that can help create a more
accommodating and responsive campus environment for students drawn from
underrepresented groups” (Bensimon, 2004b, p. 47) and responds to questions
regarding institutional support; racial and ethnic diversity of faculty, administrators,
and staff; and how employees of the institution compare to the diversity of the
student body. The retention perspective “refers to the completion of certificates and
degrees as well as term to term retention” (Bensimon, 2005) and answers questions
regarding the retention rate of and withdrawal rate from majors known to be
pathways to higher paying careers, successful course completion within the basic
skills, degree and certificate completion, etc.
Using these four perspectives as guides, the practitioners-as-researchers were
to develop indicators for each perspective with the understanding that the four
connected perspectives should not be isolated from one another, in keeping with
the balanced theme of the Academic Scorecard and the Balanced Scorecard
exemplified by Robert S. Kaplan and David P. Norton (1996) as follows:
The [four] measures represent a balance between external measures for
shareholders and customers, and internal measures of critical business
processes, innovation, and learning and growth. The measures are balanced
between the outcome measures—the results from past efforts—and the
measures that drive future performance. And the scorecard is balanced
between objective, easily quantified outcome measures and subjective,
somewhat judgmental performance drivers of the outcome measures. (p. 10)
To accomplish this, the practitioners-as-researchers were to decide what type of data
to use for each perspective, then select objectives and measures that followed the
principles of the DSP.
Practitioner-as-Researcher
To develop an awareness of any inequities that exist for historically
underrepresented students at their institution, the USC CUE creators developed an
alternative to the traditional research method. In the traditional research method, the
researcher controlled all phases of the study and produced the knowledge, but it was
the intent of the DSP to have the evidence team members produce the knowledge,
identify any problems, and take action. For CUE and the institution to achieve the
desired results of the DSP, CUE creators developed the practitioner-as-researcher
method where campus members working in a team became responsible for analyzing
and interpreting campus data. Lyle Yorks and Victoria J. Marsick (2000) cited Yorks,
O’Neil, and Marsick (1999) in describing this type of action learning as “an
approach to working with and developing people which uses work on a real project
or problem as the way to learn. Participants work in small groups to take action to
solve their project or problem, and learn how to learn from that action” (p. 255). The
expectation of the practitioner-as-researcher method was that engagement in the
analysis of data would deepen the participants’ understanding of differential
outcomes in their institution.
As Chris Argyris and Donald A. Schön (1996) elucidated, “We see practitioners not
as passive recipients of expertise, but as Deweyan inquirers” (p. 35) where inquiry
“combines mental reasoning and action. The Deweyan inquirer is not a spectator but
an actor who stands within a situation of action, seeking actively to understand and
change it” (p. 31). For the first time as inquirers, they saw the data disaggregated in a
new way, by race and ethnicity. They were no longer casually glancing at a report;
instead, they analyzed the data first hand, developed a deeper awareness of problems
that already existed, inquired into the magnitude of the problem, and ideally became
agents of change through their ownership of the research (Bensimon et al., 2004).
The Goals of the Diversity Scorecard Project
There were two main goals of the Diversity Scorecard Project. The first goal
was to make each institutional actor aware of inequities in educational outcomes in
hopes they would acquire an equity frame of reference. The second goal was for
institutional actors to take responsibility for the inequities found, reframe the
problem from the perspective of their institution, rather than view it as a student
problem, and ask, “What can I do to close the gap?”
Each institution formed a team that consisted of faculty members, staff, and
administration. The project leaders hypothesized that in these teams some individuals
would interpret inequalities in outcomes as having to do with student academic or
personal deficits; others might be blinded to the inequities by the diversity of the
student body; and still others might already be equity minded. The challenge was to
determine whether deficit and diversity oriented individuals could become equity
oriented. The practitioner-as-researcher methodology made the institutional actors
the researchers who worked directly with the data; by using and analyzing existing
data from their own institution, disaggregated by race and ethnicity, they developed
an awareness of the inequities in educational outcomes.
CUE researchers understood that “what gets measured is what gets attended
to by campus leaders” (Bensimon et al., 2004, p. 113), so the evidence team was
responsible for creating a report that demonstrated the inequitable outcomes found
within the institution and the indicators created by the team with the use of the
Diversity Scorecard tool. The completed report, including each of the four
perspectives’ objectives and measures, was presented to the president and sometimes
other administrators within the institution. Merely giving the written report to the
president of the institution was not enough since the reading of the report could not
be guaranteed. A formal presentation with compelling graphics would need to be
made first to at least the president of the institution and later to the administrators,
faculty, and staff of the institution since administrative accountability is necessary for
reform to be sustained (O’Day, 2002). After seeing the presentation, it was the hope
of the team and DSP creators that other individuals would take the next step, to
create benchmarks and regularly monitor the educational outcomes of minority
students so that the process would become institutionalized.
The Diversity Scorecard was merely a tool. To create an institutional
atmosphere of equity in educational outcomes among underrepresented students,
three things must exist: an awareness that a problem exists, an understanding of the
problem, and change agents willing to work from the bottom up to produce changes
in their own values as well as in institutional studies, policies, and practices. Such
was the intent of the CUE DSP developers.
The Statement of the Problem
The purpose of this study is to evaluate the impact of the Diversity Scorecard
Project (DSP) on individuals’ learning within 1 of 14 institutional inquiry teams that
participated in the DSP. Derived from the expectations of the DSP, this study
explored the experiences of the evidence team and the individuals within the team
during the implementation of the project at one institution by reviewing available
material, including the field notes of the CUE facilitators over the duration of the
DSP. This provided the opportunity for me to assess whether the individuals in the
team, by doing their own research, as well as the team itself, would shift from
diversity and deficit thinking to become more equity minded. To accomplish this, I
conducted a naturalistic inquiry (Patton, 2002) in which I researched the CUE
website, then searched the institution’s web pages using key terms found on the CUE
website to get a sense of how the institution portrays diversity and equity, and
whether there was mention of the institution’s participation in the Diversity
Scorecard Project and in what context. Last, I analyzed the CUE facilitators’ field
notes and data. The CUE facilitators maintained field notes over the entire process of
the DSP: the individuals in the team met multiple times while a CUE facilitator, a
Ph.D. student, took notes to capture the dialogues of those individuals as they looked
at the data. It was my intent that the design of this study remain
open and flexible to explore the findings of the DSP at this institution. According to
Michael Quinn Patton (2002), “when more than one object of study or unit of
analysis is included in fieldwork, case studies may be layered and nested within the
overall, primary case approach” (p. 298). Throughout the nested case studies of the
individuals in the team, I examined field notes and discussed what the evidence team
members addressed or did not address as I reviewed the DSP documents aggregated
and created by the CUE facilitators. I developed a case study for each individual
delineating how they appeared to have participated and what enabled their
participation as well as their experiences. Since each member of the evidence team
had a different role, I looked at his or her learning with respect to each distinctive
role within the team. This was valuable information as I evaluated the depth to
the DSP reached the individuals and as I determined whether individuals became
aware and equity minded at their institution.
The Research Questions
I organized the case study according to the following research questions that
are centered on two key elements of the intent of the Diversity Scorecard Project
(DSP): (a) How did the participants’ feelings and attitudes about the DSP overall,
and about their own participation in particular, change over the course of the project?
and (b) Did the participants’ awareness and interpretation of inequities in
educational outcomes for minority students change over the course of the project?
To answer these research questions, I considered how the evidence team
members experienced the DSP: statements that indicated a self-reported change as a
result of the DSP, new practices, a conscious understanding of the awareness of
others, individual learning, and feelings and attitudes toward the DSP. The thematic
group decided to include a question regarding the evidence team members’ feelings
and attitudes so that a determination could be made whether their learning might
have been influenced through the process of transformative learning (discussed in
Chapter 2). I based this study on the field notes and supplemented the field notes
using the DSP evidence team’s reports to the president and the institution’s web
pages. Since I did not collect the data, looking at the report and institution’s web
pages helped me to get a sense of the institution so that I was not totally disconnected
from it. My primary source for telling the story of the duration of the DSP at one
institution was the field notes kept by the CUE facilitators. I used these notes to
construct a case study for each evidence team member that described what went on at
a particular institution.
CHAPTER 2: REVIEW OF LITERATURE
During the literature review, I uncovered a common theme and used it as a
guide in introducing the various aspects of learning. The common theme was that,
once an individual understood there was a problem, he or she would use experiential
knowledge and become aware of new information through interpretation, reflecting
on reasons for the existence of the problem. As
I developed the literature review, I kept in mind that the intent of the study was
individual learning within the DSP team, but I also realized the intent of the DSP was
for organizations to learn. Therefore, I included information on transformative
learning in the first section to emphasize the importance of feelings and attitudes. I
then included organizational learning in the second section of this chapter to discuss
the various researchers’ perspectives found in organizational learning. I incorporated
information on team learning in the third section because the method used by the
DSP, practitioner-as-researcher, required the work of a team to create change within
the organization through the input they gleaned from the DSP. The next section built
upon the common theme found in learning by creating subsections within the
individual learning section. These subsections display how each researcher discussed
the importance of uncovering a problem, becoming aware of available information,
and interpreting available information. Since it was also important for the reader to
understand the distinction between such frames of thought as diversity, deficit, and
equity mindedness, the last section discussed two common theories of learning,
behavioral and cognitive theory.
Transformative Learning
The DSP has in its foundation a transformative learning aspect. Jack Mezirow
(2000) described transformative learning as
The process by which we transform our taken-for-granted frames of reference
(meaning perspectives, habits of mind, mind-sets) to make them more
inclusive, discriminating, open, emotionally capable of change, and reflective
so that they may generate beliefs and opinions that will prove more true or
justified to guide action (pp. 7-8).
He discussed the existence of two domains of learning, instrumental learning and
communicative learning, where the first deals with “learning to control and
manipulate the environment or other people, as in task-oriented problem solving to
improve performance” (p. 8) and the latter involves “what others mean when they
communicate with you. This often involves feelings, intentions, values, and moral
issues” (p. 8). Mezirow emphasized that, in transformative theory, reflective
discourse taps the experience of self and others to reach a logical conclusion in such
a way that “effective participation in discourse and in transformative learning
requires emotional maturity… knowing and managing one’s emotions, motivating
oneself, recognizing emotions in others and handling relationships—as well as clear
thinking” (p. 11). He further explained that “discourse is not based on winning
arguments; it centrally involves finding agreement, welcoming difference, ‘trying on’
other points of view, identifying the common in the contradictions, tolerating the
anxiety implicit in paradox, searching for synthesis, and reframing” (pp. 12-13). This
frame of reference for each evidence team member “shapes and delimits perception,
cognition, feelings, and disposition by predisposing our intentions, expectations, and
purposes” (p. 16). Within this frame of reference lie the habits of mind,
[which] becomes expressed as a point of view…. [Where] a point of view
comprises clusters of meaning schemes—sets of immediate specific
expectations, beliefs, feelings, attitudes, and judgments—that tacitly direct
and shape a specific interpretation and determine how we judge, typify
objects, and attribute causality. (p. 18)
According to Sharan B. Merriam (2004), “The goal of transformative learning is
independent thinking” (p. 61) and the product is achieving greater autonomy. She
maintained, “For transformational learning to occur, one must be able to critically
reflect and engage in rational discourse” (p. 60).
Another important aspect to transformative learning is the individual’s frame
of reference. Estela M. Bensimon, Edlyn V. Peña, and Carolina Castillo (2006)
realized that “cognitive frames determine the questions that are asked, what is
noticed, and how it is interpreted…. Cognitive frames reveal as well as conceal…
[and therefore] can be identified in the language individuals use to construct
meaning” (p. 8). They argued, “there are different degrees or dimensions to the
diversity and equity cognitive frames” (p. 22), then connected the need for double-
loop learning (discussed later) as a requirement for the attainment of an equity frame
of reference, paraphrasing Georgia Bauman’s explanation of double-loop learning as
one that “focuses attention on root causes of a problem and the changes that need to
be made in attitudes, values, beliefs, and practices of individuals to bring about
enduring results” (p. 29). Although this study focuses on the transformative learning
of individuals and uses the concept of the cognitive frame to determine whether
learning took place, the focus of the DSP was for organizations to learn; it is
therefore necessary to discuss organizational learning.
The Existence of Organizational Learning
The intent of the DSP was for organizational learning to take place. Any
review of organizational learning must begin with the assumption that organizations
can learn. The following assumptions by George P.
Huber (1991) also apply to this study:
With respect to the existence of organizational learning, let us assume that an
organization learns if any of its units acquires knowledge that it recognizes
as potentially useful to the organization. A corollary assumption is that an
organization learns something even if not every one of its components learns
that something. (p. 89)
There have been many definitions of organizational learning, but Chris Argyris
(1977) provided a succinct one: “organizational learning is a process
of detecting and correcting error. Error is for our purposes any feature of knowledge
or knowing that inhibits learning” (p. 116). The depth and method of detecting and
correcting errors help distinguish single-loop from double-loop learning. Bensimon,
Peña, and Castillo (2006) provided a more in-depth definition of organizational
learning with respect to the DSP:
Organizational learning is a process whereby individuals become more
conscious of racial and ethnic inequalities in educational outcomes, thus
increasing the likelihood that they and their institutions will assume personal
and collective responsibility for their elimination. (p. 6)
With this definition and the intent of the DSP in mind, another aspect of learning
employed by the DSP was team learning. Kasl, Marsick, and Dechant (1997) defined
team learning as “ ‘a process through which a group creates
knowledge for its members, for itself as a system, and for others’ (italics added)” (as
cited in Yorks & Marsick, 2000, p. 253). Another term that requires an agreed-upon
definition is learning itself. According to Daniel H. Kim (1993), “The dictionary definition
states that learning is ‘the acquiring of knowledge or skill.’… Both parts of the
definition are important: what people learn (know-how) and how they understand
and apply that learning (know-why)” (p. 38). This definition, combined with a
team’s ability to take action as an outcome of team learning, helps identify the
purpose of the DSP: for organizations to learn. The following section, therefore,
presents perspectives on organizational learning.
Perspectives of Organizational Learning
Throughout the literature on organizational learning, there are different
perspectives on how organizations learn. Argyris and Schön (1996); Bensimon, Peña,
and Castillo (2006); and Mary E. Boyce (2003) discussed first- and second-order
change, in which single- and double-loop learning occur: first-order change with
single-loop learning does not influence the framework of the institution, whereas
second-order change with double-loop learning does. Huber’s (1991)
perspective consisted of four processes that constitute organizational learning: (a)
knowledge acquisition, “the process by which knowledge is obtained” (p. 90) in
which the learner draws on current knowledge of the organization, learns vicariously
through other organizations, grafts needed knowledge to current knowledge, and
evaluates the organization’s environment and performance; (b) information
distribution in which communication of information is shared to develop a new
understanding; (c) information interpretation which “is the process by which
distributed information is given one or more commonly understood interpretations”
(p. 90); and (d) organizational memory, the storage of information. Kim reiterated
this when he stated that organizations learn when there is an influence on the
organization’s mental model.
Yorks and Marsick (2000) viewed organizational learning through a different
perspective: action learning combined with collaborative inquiry. In this perspective,
a small group, after discovering a problem or being given a project, learns by taking
an action and then reflecting on that action. This creates transformative learning that
changes habits of mind, producing an understanding of the choices made and of
their importance to the organization. Patricia A. Alexander and P. Karen
Murphy (1998) emphasized the importance of the action learning and collaborative
inquiry perspective when they stated, “the knowledge that learners possess becomes
an extremely powerful force, not only in what information they attend to … or how
that information is perceived … but also in what is judged by learners as relevant or
important” (p. 28).
Richard L. Daft and George P. Huber (1987) discussed two perspectives of
organizational learning that they felt must exist simultaneously in the
organization. The first was the systems-structured perspective, in which data
are distributed to institutional units that know how to use the data when
taking action. The second was the interpretive perspective, in which
information, defined as “that which can change a person’s understanding or mental
representation” (p. 13), is shared with members; through discussion and possible
trial and error, the participants attempt to make sense of it, creating a learning
situation in which they change their “assumptions, symbols, and
values” (p. 10). Since the concepts of individual and team learning are linked to
organizational learning, it was necessary to include both. The following section
discusses team learning.
Team Learning
A discussion of organizational learning cannot be complete without a
discussion of team learning. According to Yorks and Marsick (2000), the intention of
creating small groups of individuals who learn through participation in action
learning and collaborative inquiry is to enhance the organization’s performance
through transformative learning and thus organizational learning. Argyris and Schön
(1996) called attention to the fact that “it is increasingly common to find people
attributing to teams, departments, or whole organizations, such activities as thinking,
reasoning, remembering, or learning” (p. 6). Yorks and Marsick (2000) then
demonstrated the importance of team learning with respect to organizational learning
during the following discussion:
Senge (1990, p. 10) describes teams, not individuals, as ‘the fundamental
learning unit in modern organizations…. Unless teams can learn, [an]
organization cannot learn.’ Watkins and Marsick (1993, p. 14) assert that
‘teams, groups, and networks can become the medium for moving new
knowledge throughout the learning organization’ and that such collaborative
structures ‘enhance the organization’s ability to learn because they offer
avenues for exchange of new ways of working.’ (p. 254)
With a firm understanding of both organizational and team learning, we now turn to
elements of individual learning since embedded in many discussions of
organizational and team learning are discussions of individual learning. Kim (1993)
said it best when he stated, “Organizations learn via their individual members. Hence,
theories of individual learning are crucial for understanding organizational learning”
(p. 37). Therefore, the following section presents the concept of individual learning
including similar perspectives found in the literature.
Individual Learning
Organizational or team learning cannot be considered without addressing
individual learning (Kim, 1993; Huber, 1991; Yorks & Marsick, 2000; Daft &
Huber, 1987; Argyris & Schön, 1996; Boyce, 2003). Kim (1993) clearly established
this requirement:
If a distinction between organization and individual is not made explicit, a
model of organizational learning will either obscure the actual learning
process by ignoring the role of the individual (and anthropomorphizing
organizations) or become a simplistic extension of individual learning by
glossing over organizational complexities. (p. 42)
Kim linked individual learning to organizational learning through a mental model
based on two elements, learning and memory, which he labeled OADI-SMM. The
label combined two concepts: first, Kofman’s learning cycle,
observe-assess-design-implement (OADI), which captured the individual learning
portion but did not address the retention needed in organizational learning; and
second, a shared mental model (SMM), which explored two levels of learning related
to mental models: operational, in which one learns the procedures, the know-how, of
the organization, and conceptual, in which one learns by questioning the purpose of
procedures, the know-why of the organization. Yorks and Marsick (2000) also
discussed the need for individual
learning for learning to occur within the organization when they stated, “While
transformative learning occurs at the organizational level it also must occur at the
individual level” (p. 275). Daft and Huber (1987), discussing the interpretive
perspective of learning (explained earlier in this chapter), connected organizational
learning to individual learning when they clarified how learning happens for
individuals within an organization, “Data mean nothing until they are used by
organization participants. Information can be defined as data that have utility, reduce
uncertainty, or changes one’s understanding about the external world” (p. 8). Argyris
and Schön (1996), before beginning their discussion of organizational learning,
pointed out the following two aspects regarding individuals as a necessity in
organizational learning:
We should think of organizational learning in terms of the “organizational
environments” within which individuals think and act. Organizations have
been conceived as behavioral settings for human interaction, fields for the
exercise of power, systems of institutionalized incentives that govern
individual behavior, or socio-cultural contexts in which individuals engage in
symbolic interaction. From one or more of these perspectives, we may be able
to describe the conditions under which, within an organizational environment,
the thought and action of individuals yield organizational learning. (p. 7)
To clarify the importance and need of individual learning as it related to
organizational learning, Argyris and Schön pointed out the fact that “organizational
learning occurs when individuals within an organization experience a problematic
situation and inquire into it on the organization’s behalf” (p. 16). This emphasized
that individuals are considered agents who learn for the organization creating an
outcome of organizational learning. During her discussion of processes and theories
of learning, Boyce (2003) signified how individual learning ties to organizational
learning when she stated:
Individuals and groups provide explanations for events from their
perspectives and construct images of the organization rather than sharing one
unifying organizational reality. (p. 123)
During a discussion regarding learning, Huber (1991) pointed out three important
aspects. First, “Learning need not be conscious or intentional…. Further, learning
does not always increase the learner’s effectiveness, or even potential
effectiveness…. Finally, learning need not result in observable changes in behavior”
(p. 89). He then emphasized that a person’s choice may be to change their cognitive
map or their understanding rather than their behavior. Throughout this literature
review, a common theme emerged: once problems are discovered, individual learners
use current and new awareness to interpret and reflect upon the reasons the
problems exist so that learning can take place. The following subsection discusses
the aspects of learning common among the researchers.
Problems Uncovered
The first element common among the different researchers was the initial
need to uncover problems or find errors. Boyce (2003) explained that “first- and
second-order change and single- and double-loop learning identify the categories of
change or learning” (p. 125) and that first-order change with single-loop learning does
not result in questioning the underlying framework of the organization, including
values and culture, whereas second-order change with double-loop learning does.
Argyris and Schön (1996) discussed how, similar to transformative learning, the first
step of double-loop learning is questioning, through which problems are
uncovered or errors found. This is similar to the perspective of Huber’s four
processes in that the first process detects an error through questioning or noticing
organizational performance. Even Daft and Huber’s two simultaneous perspectives
can be molded into the previous two perspectives, although one does not necessarily
lead to the other, in that the systems-structured perspective monitors the organization
for problems. Individual learning was the main focus of the first part of the
OADI-SMM model. Kim realized the importance of observation in determining where
problems existed when he stated that “people experience concrete events and
actively observe what is happening” (p. 38); from there, they could assess and
reflect on that experience. Each researcher discussed the necessity of uncovering
problems or discovering errors to begin the learning process. Once a problem was
discovered, the individual could move to the second step of learning.
Awareness
After their discussion regarding questioning to find problems, Argyris and
Schön (1996) discussed how the next step in double-loop learning was
seeking information. This was similar to the perspective of Huber’s (1991) four
processes since, after the detection of an error or errors, information was gathered or
distributed which had the effect of creating awareness. Huber defined information
distribution as “the process by which information from different sources is shared
and thereby leads to new information or understanding” (p. 90). He then linked the
need for information distribution to individual learning when he stated, “When
information is widely distributed in an organization, so that more and more varied
sources for it exist, retrieval efforts are more likely to succeed and individuals and
units are more likely to be able to learn” (p. 100). Daft and Huber’s two simultaneous
perspectives can also be molded into the previous two perspectives. After monitoring
the organization for problems, the next step would be probing for information and
distributing it to units within the organization to be acted upon. Even Yorks and
Marsick (2000) emphasized the importance of awareness to learning when they
stated, “Increasing people’s awareness of the personal choices available to them is
the important outcome of transformative learning” (p. 275). Yorks and Marsick also
elucidated that transformative learning “emancipates individual learners through
making them aware of how psychological-socioeconomic-cultural forces might have
limited personal choice or have been the source for dysfunctionally constructed
habits of mind” (p. 254). Each of the researchers had made clear the importance of
awareness as an element of learning.
Interpretation
Interpretation of information provides an opportunity for reflection on
information or inquiry into available information. The premise of action learning
through the practitioner-as-researcher model explained in Chapter 1 was the
foundation of the DSP. Yorks and Marsick (2000) recognized the need for reflective
inquiry with this learning method when they stated, “Both action learning and
collaborative inquiry are highly participatory and designed to foster learning from
experience through cycles of action and subsequent reflection on that action” (p.
255). They later described collaborative inquiry as “a process consisting of repeated
episodes of reflection and action through which a group of peers strives to answer a
question of importance to them” (p. 266). Kim further explained the technique of
inquiry when he stated that as the individual learning cycle proceeds through the
OADI, the individuals actively experience and observe then assess the process
through discussion and reflection initiating a common understanding of individual
views. This concept was echoed by Yorks and Marsick when they stated, “Each
participant is a coinquirer—shaping the question, designing the inquiry process,
participating in the experience, making and communicating meaning” (p. 266). As
part of their discussion regarding double-loop learning, Argyris and Schön (1996)
explained that once a problem is found and information gathered, it must be
reflected upon to correct problems that may be a product of, or reinforced by,
the norms, values, or framework of the institution. This is similar to the
perspective of Huber’s four processes, in which errors were detected, information
gathered or distributed, and then interpreted or reflected upon, although Huber did
not speak about values. Even Daft and Huber’s two simultaneous perspectives can be
molded into the previous two perspectives, although one does not necessarily lead to
the other: the systems-structured perspective, after monitoring the organization
for problems and probing for information, distributed the information to units within
the organization to be acted upon, while the interpretive perspective referred to the
shared reflection on the information through discussion.
During her discussion of how individual learning is linked to organizational
learning, Boyce (2003) emphasized the importance of inquiry and dialogue within
single- or double-loop learning as individuals and organizations learn. According to
Boyce, single-loop learning showed change of a first-order in which there was a
reactive change whereas double-loop learning was associated with second-order
change in which a reflexive change of assumptions and values of an individual or
organization occurred. Argyris and Schön (1996) explained that when looking for
single- or double-loop learning, it is important to look not only at where the learning
began but also at where it led. They further explained that with double-loop learning
the question is turned back to the questioner, “exploring not only the objective facts
turned back to the questioner in that “exploring not only the objective facts
surrounding an instance of inefficiency, but also the reasons and motives behind
those facts…. Double-loop learning depends on questioning one’s own assumptions
and behavior” (p. 27). Huber (1991), however, stressed the cognitive aspects of
learning with a focus on learning within frames of reference as he discussed the
dichotomy of single-loop and double-loop learning. He explained how the frames of
reference related to the routine and immediate tasks found in single-loop learning
versus non-routine and long-term outcomes found with double-loop learning “where
the latter is related to an organization’s frame of reference…. This conceptual
distinction between learning within a frame of reference and learning a new frame of
reference seems critically important” (Huber, 1991, p. 93). He also discussed how
individuals in various units of the organization, guided by their previous cognitive
maps, interpreted information differently, and how learning within the organization
occurred because of these varied individual interpretations.
In each of the researchers’ perspectives discussed above, problems were uncovered,
new awareness was created through information gathering, and that information was
interpreted. Through this, learning took place and, as the
individuals learned, team learning and organizational learning occurred since, as Kim
(1993) explained, “an organization learns through its individual members and,
therefore, is affected either directly or indirectly by individual learning” (p. 41).
Determining single- or double-loop learning was important for each individual
because single-loop learners might have been more defensive and shown a more
superficial way of looking at issues in an organization whereas double-loop learners
might have shown a commitment to developing an understanding as well as
developing new values and norms of the organization (Argyris, 1991; Argyris &
Schön, 1996), which was the goal of the DSP. With the understanding that there can
be no team learning if the individuals within a team do not learn, that individual or
team learning is a prerequisite to organizational learning, and that not all individuals
or units are required to learn for organizational learning to occur (Huber’s corollary
assumption, discussed earlier in the chapter), a discussion of learning theories can
now be introduced.
Learning Theories Used
There are various theories of organizational learning. The most prevalent
discussed by the researchers above are behavioral and cognitive theories. The
distinction between the two is that behavioral theory emphasizes the surrounding
environment, in which there is continuous development throughout life, whereas
cognitive theory considers individuals’ thoughts, feelings, and beliefs and places an
emphasis on information processing as the basis of learning (Schunk, 2004). For
example, Huber (1991) as well as Argyris
and Schön (1996) discussed both behavioral and cognitive theories when discussing
organizational learning. When discussing behavioral theory, Huber described
behavior within organizations in terms of unlearning, the discarding of knowledge,
which leads to a decrease or increase in “potential” behaviors, while Argyris and
Schön discussed the “organization’s learning system [which] is made up of the
structures that channel organizational inquiry and the behavioral world of the
organization” (p. 28), where the behavioral world comprises the behaviors that
govern the relationships of those within the organization. When discussing cognitive
theory, Huber described the use of cognitive maps, defined as “belief structure or
mental representation or frame of reference” (p. 102), in interpreting and framing
information in an attempt to create a uniform interpretation after distribution to
various units. Argyris and Schön likewise invoked cognitive frames in their
discussion of single- and double-loop learning, in which a possible outcome of
single-loop learning is a change in the strategies used by the institution whereas a
possible outcome of double-loop learning is a change in the values of the institution.
Summary
“Innovations, whether or not achieving their goals, are not automatically
institutionalized; organizations are not naturally prepared to accommodate and
incorporate them” (Curry, 1992, p. 2). The DSP was a learning process that
attempted to bring about individual and organizational learning since “equality in
fact and equality as a result in higher education is imperative” (Bensimon, 2004, Fall,
p. 2). The creators of the DSP, based on the ways in which individuals made sense of
data that revealed racial inequities, proposed three cognitive frames: diversity,
deficit, and equity. Those with a diversity cognitive frame felt that the demographics
of their student body were already diverse and that a Diversity Scorecard was not
needed. Those with a deficit cognitive frame felt that the problem lay in the students’
backgrounds and hence was not solvable by an institution. Those with an equity
cognitive frame felt that there were inequities in educational outcomes, that these
constituted an internal institutional problem, and that any changes would need to be
made by individuals within the institution acting as change agents. In chapter three, I
discuss how I used these concepts to understand how individuals might have
unlearned deficit and diversity thinking and developed equity thinking, one of the
aims of the DSP creators.
CHAPTER 3: METHODOLOGY
Although the literature review included organizational and team learning, the
main purpose of this qualitative study is an evaluation of one aspect, individual
learning, of the Diversity Scorecard Project (DSP) at a single institution. In this
chapter, while keeping in mind the needs of the audience of this evaluation, the
Center for Urban Education (CUE) researchers, I will focus on three facets: (a) the
design strategy, (b) data collection methods, and (c) analysis strategies. The method
I decided to use for the case study of the individuals within a team is an
evaluation of a purposeful typical sample: one of the 14 institutions that
participated in the DSP. I chose this bounded system, a case bounded by time and
place (Creswell, 1998), as a purposeful sample because “studying
information-rich cases yields insights and in-depth understanding … [that] will
illuminate the questions under study” (Patton, 2002, p. 230). To determine the best
method, I needed to determine a proper design. In this chapter, I discuss in detail
how I chose this method and how I implemented the study.
Design Strategy
Before designing any research project, whether quantitative or qualitative, a
problem must be presented (Creswell, 1998). The original problem that the DSP
presented was one of inequitable outcomes for underrepresented students. The
problem of this case study was to determine whether learning took place among the
participants of the DSP evidence team within one institution. Once the problem of
the study was understood, Creswell (1998) and Patton (2002) both suggested, the
next phase of the design would be the creation of the research question or
questions.
Research Questions
Qualitative research is emergent in that originally chosen research questions
or research designs have the capacity to change throughout the life of the study
(Creswell, 2003). There are many aspects to consider when designing an evaluation.
First, I needed to frame the research questions that answer the purpose of this
evaluation while keeping in mind that the questions should be open-ended in order to
study the problem (Creswell, 1998). Therefore, the first decision to be made was on
the number of research questions that I could realistically respond to. At first, the
thematic group and I chose to limit the number of research questions to three, but as
the dissertation process progressed the number was reduced to two. Limiting the
study to two research questions allowed for more depth; adding more would have
severely hampered my ability to respond in depth to each one. Keeping in mind the
purpose of the study, the research questions decided upon for this study were: (a)
How did the participants’ feelings and attitudes about the DSP overall and about their
own participation in particular change over the course of the project? And (b) did the
participants’ awareness about and interpretations of inequities in educational
outcomes for minority students change over the course of the project? According to
Patton (2002), research questions developed directly from the stated purpose of the
DSP would be very useful as an evaluation inquiry. Using qualitative methods with
fewer questions directly related to the stated purpose of the DSP allowed me to
produce a wealth of information about the individuals.
Becoming Knowledgeable
As Creswell pointed out, becoming cognizant of the research questions
helped me determine the type of literature review required; the literature review, in
turn, helped me limit the scope of the study. Once I had an idea of the type of
information required, namely how learning happens, I began the literature review.
Reviewing literature
can be fraught with personal biases and values (Creswell, 1998; Patton 2002).
According to Patton (2002), “reviewing the literature can present a quandary in
qualitative inquiry because it may bias the researcher’s thinking and reduce openness
to whatever emerges in the field” (p. 226). Nevertheless, I determined that I needed
to review the literature before analyzing the institution’s web pages, looking at the
data, or reading the field notes, both to become more knowledgeable in the field of
learning and to better understand the stated purpose of the DSP, since a full
understanding of that purpose is necessary to any evaluation (Patton, 2002).
Design Decisions
After presenting a problem and creating research questions that would guide
the study, the next step was to determine the unit of analysis, sample strategy, and
sample size that would best focus the evaluation for the stakeholders of the project.
Patton (2002) discussed three types of units of analysis: (a) individual, which focuses on an
individual in a setting; (b) group, which focuses on one or more groups with
characteristics that bring individuals together; and (c) different parts of the program.
Although this evaluation dealt only with one aspect within a program, most of the
information I reviewed involved the CUE facilitators’ field notes regarding the
individuals and the team. Therefore, some aspects of this evaluation have also been
people focused. For this reason, I chose Patton’s different parts of the program as the
unit of analysis, with my major focus being to use the field notes to construct a story
about each individual that delineates their individual and possibly group progress.
Knowing the unit of analysis made the choice of sample strategy and
sample size an easy task. The director of the USC CUE chose the institution, West
Coast Community College, as a purposeful typical sample: an average institution that
she had determined was neither the best performer nor the worst, yet one rich in
information that could answer the research questions in depth. Patton also discussed
another aspect of a qualitative case study in
which the researcher could use nested or layered case studies to describe the
individuals within the umbrella of the case study, and Creswell (1998) discussed
designing a study around the ontological nature of the research questions, in which
“reality is constructed by individuals involved in the research situation” (p. 76).
Since the research questions address individuals and are of an ontological nature, I
created multiple nested case studies of the individuals on the evidence team within
this one case study of West Coast Community College, composing a story for each
individual.
Generalizability, Credibility, and Validity
Although typical case sampling was best for the stated purpose of the study, and
although this case illustrated what was average in the DSP, the findings of the case
study were limited to one institution, West Coast Community College, and cannot be
considered generalizable. The
findings of this evaluation answered the research questions to be used by the USC
CUE DSP so that they could make future decisions regarding the program. Since the
purposeful sample size and sample strategy both support the stated purpose of the
study, the typical sample of one institution is credible for this evaluation. During
this study, I also utilized the concept of member checking. According to Creswell
(2003), member checking, in which the researcher checks “the accuracy of the
qualitative findings through taking the final report or specific descriptions or themes
back to participants and determining whether these participants feel that they are
accurate” (p. 196), is a step toward validating the accuracy of the findings.
Therefore, the CUE director, who
has intimate knowledge of and private information regarding the project, and one of
the CUE facilitators, who wrote some of the field notes for the institution in this
study, both have read my analysis so that they could make judgments about my
interpretation and verify its accuracy. Since the institution was information-rich, with
information that was readily available and easily analyzed; since all data sources
were triangulated by themes found across the three different sources; and since
member checking ensured accuracy, this type of qualitative inquiry is also valid
(Patton, 2002; Creswell, 1998; Creswell, 2003).
Data Collection
Closely related to the design strategy of purposeful sampling was data
collection. The sampling strategy above used a pure naturalistic-qualitative approach
in which I performed a content analysis of qualitative data found in web pages and in
the CUE facilitators’ field notes about the individuals involved in the project. From
that text, I hypothesized what I expected to find from meeting to meeting regarding
each individual and the group. I then composed a story about each individual based
on what they and the facilitators said about constructing the DSP and the effect of
each individual’s participation in the DSP.
“There are four basic types of information to collect: observations …,
interviews …, documents …, and audio-visual” (Creswell, 1998, p. 120). As
Creswell suggested, the types of data used would be guided by the research
questions. Merriam (1998) noted one benefit of using documents when she stated,
“The presence of documents does not intrude upon or alter the setting in ways that
the presence of the investigator often does…. Documents are, in fact, a ready-made
source of data easily accessible to the imaginative and resourceful investigator” (p.
112). The data I used included background information from the USC CUE web site
regarding the DSP; information regarding the employees, minority students, and
equity from West Coast Community College’s web pages; and information from the
CUE facilitators’ field notes and reports. “Whether in fieldwork or library work, the
data collection is guided by questions, educated hunches, and emerging findings”
(Merriam, 1998, p. 120). To find all pertinent information in these documents, I
proceeded in a systematic manner while keeping an open mind and thinking
creatively (Merriam, 1998). Since these documents were secondary source data, not
created by me or for this study, when viewing them I needed to remain mindful of
their sources. As an external, neutral investigator of this qualitative inquiry, I kept in
mind to “carefully reflect on, deal with, and report potential sources of bias and
error” (Patton, 2002, p. 51).
Creswell (1998) pointed out that to analyze a qualitative study I must
“visualize data collection as a series of interrelated activities” (p. 110). First, I needed
to locate the web sites, record the information found, and store the data. The type of
information I searched for varied with the source and with what I found as the study
progressed. After viewing the web sites, I read and reread the CUE facilitators’
field notes to gain a full understanding of their content. The following provides a
brief description of the three sources of collected data.
First Data Source
The USC CUE’s web site is easily accessible and capable of providing a
thorough understanding of the DSP, key terms, concepts, and definitions that can
help identify the framework of this study (Merriam, 1998). While investigating the
site, I recorded information through note taking and, when necessary,
made copies of pertinent information for possible inclusion in the
appendix. With a firm understanding of the DSP creators’ concepts, I
could find information that would help my understanding of the institution I would
be studying.
Second Data Source
West Coast Community College’s web site was similar to the first source,
USC’s CUE web site, including the ease of accessibility; the distinction was in the
way I searched for information. Using West Coast Community College’s web site
search engine, I found information about minority students, equity, diversity, and
multiculturalism by typing in key terms or phrases that I found, while probing the
CUE web site, to be significant to the DSP. I then visited the web pages listed by the
various searches, collected the information that existed on them, and noted any
information missing from them that I considered necessary for the intent of the DSP.
I then evaluated the content of each web page by the importance of the type of
document it represented (e.g., a faculty handbook) and by the location of the
information on the page (e.g., near the beginning of the document or at the end). I
also made note of any key words that had no web
pages listed after searching for that key word, since the lack of information can be
just as revealing as information that is found (Patton, 2002). I also searched for
information on two of the DSP evidence team members because the CUE facilitator’s
field notes, described in the next section, had provided no information as to their
ethnicity.
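The logging of search results described above, including the noting of key words that returned no pages, can be sketched in a few lines of code. This is purely a hypothetical illustration; the terms and hit counts below are invented, not the study's actual search results.

```python
# Hypothetical key terms drawn from the CUE web site, mapped to the number
# of pages each search returned on the college's site (0 = no pages listed).
search_results = {
    "minority students": 4,
    "equity": 2,
    "diversity": 7,
    "multiculturalism": 0,
}

# Terms with no pages listed are noted separately, since the lack of
# information can be as revealing as information that is found (Patton, 2002).
missing_terms = [term for term, hits in search_results.items() if hits == 0]
print(missing_terms)
```

In this sketch, recording both the hits and the misses preserves the evidence of absence that the analysis relied on.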
Third Data Source
CUE facilitators created field notes for each on-site DSP meeting at West
Coast Community College. Using these notes required gaining the facilitators’
confidence and contacting the “gatekeeper” (Creswell, 1998), the person who could
provide me with the initial contact and the permission necessary to review the
information. Before gaining access to the documents, I read and then signed
confidentiality statements. Once I had signed, the USC CUE director handed me a
compact disc (CD) with the entire collection of field notes and reports that the
facilitators amassed from West Coast Community College. Using these field notes
and reports, I attempted to answer the two research questions during the analysis
phase.
Data Analysis
“Using documentary material as data is not much different from using
interviews or observations” (Merriam, 1998, p. 120). From the data collection, I
created a detailed description of the individuals as well as themes (Creswell, 1998).
The USC CUE web pages contained many documents and areas of interest. I
analyzed each major web page for categories, themes, or key words. The next step
was to use these categories, themes, or key words as search words on West Coast
Community College’s web pages; this assisted in an inductive analysis and creative
synthesis of the institution web pages. I analyzed the results of the searches to
determine the type of information found or not found and how it related to the stated
purpose of the DSP. The last and most influential phase of the study was the analysis
of the facilitators’ field notes from beginning to end of the multiple-year DSP.
Analyzing the Field Notes
Before I began the analysis of the facilitators’ field notes, the thematic
dissertation group and I verified the research questions. To do this, I reread the
articles, books, and web pages searching for key ideas while keeping in mind the
research questions and the information found in the literature review. Once we
verified the two research questions, I began my analysis of the field notes using an
individual contact form, one for each individual at each meeting, created by the
thematic dissertation group to help respond to the research questions. I also kept in
mind that, as I reviewed more literature and read and reread the field notes, the
research questions might change, since a qualitative study is ever-changing as new
aspects or ideas grow from previous ones.
Once the thematic group and I developed the final two research questions, I
began perusing the field notes. To gain a “holistic perspective” (Patton, 2002) of the
project, I needed to read the field notes multiple times. I first perused the field notes a
few times without referring to any research questions or analysis. This was a time of
reflection so that I would have an opportunity to develop a general sense of what had
occurred over the full 4-year implementation of the DSP. After a few readings of the
field notes, I responded to the research questions on each individual’s contact form
then hypothesized the actions of each individual during each subsequent meeting. As
I continued reading the field notes, I determined whether my hypothesis was correct
or incorrect, followed by an examination of why. After
those readings, I continued to reread the field notes multiple times keeping in mind
the continuous need for analyzing and interpreting. Each time I reread the field notes,
I refined the responses to the research questions as I began to recognize new
concepts and ideas as well as feelings and attitudes.
For the reader and me to better understand the DSP evidence team members, I
began by looking for any information on the team member’s background. I felt that
the reader and I would relate better to the story of the individual if their background
at West Coast Community College was known, and since this was a project regarding
diversity, I wanted to make sure that the reader and I understood the ethnic makeup
of the team. To do this, I looked for information on their ethnicity, standing within
the college, and any information that directly related to the person outside of the
DSP. Most of the information was found in the field notes, but two of the members
had not been given an ethnic identity. To determine their ethnic identity, I searched
West Coast Community College’s web pages. By looking at a picture of one of
them, I was able to infer a possible ethnicity, but the second person’s ethnicity could
not be determined from his picture, so I inferred his possible ethnicity
from his last name.
The first research question dealt with the feelings and attitudes of the DSP
evidence team members. In order to answer this question, I coded the field notes
using categories I had determined as I read and reread the field notes. The majority of
the categories found were expressly discussed by the CUE facilitators in their field
notes, but some of the feelings I discovered from implications of what was said by or
about the evidence team members. After reading the field notes the first time, I
realized that there was one overarching attitude shared by all the team members, a
positive attitude toward the DSP, and I treated all other feelings that I found as
evidence for or against that positive attitude. The types of categories that I found under
feelings were cautiousness, confidence, enthusiasm, excitement, and motivation.
After categorizing the feelings and attitudes, I coded explicit and implicit information
from the field notes that contained words, actions, or both that supported the feeling
or attitude being discussed. The words I used to code were the actual statements
made by or about the evidence team members and the actions were body movements
noted by the CUE facilitators, events that occurred as a result of an act by the
evidence team member, or measures that were taken by the evidence team member.
Since the reader would not have access to the actual field notes, I have
provided three items to help the reader better understand the CUE facilitators’ field
notes. The first is a listing in chronological order of each meeting that includes my
interpretation of the purpose for each meeting (see Appendix A); the second is a
sample of the structure of the CUE facilitator’s field notes for one of the meetings
(see Appendix B), which is representative of the field notes for all meetings. The
third provides the reader with a sense of the magnitude of statements made by each
individual regarding various aspects within the DSP (see Appendix C). To do this, I
coded their statements using categories that I had observed to be common and then
recorded the count of each category. Under many of the major categories, I found
specialized topics that were the focus of many of the individuals on the evidence
team. I also recorded the count of each of these specialized topics. The information I
recorded provided an opportunity to give the reader a better understanding of the
focus of each evidence team member and to create a brief comparison between
committee members.
The following is a listing of the categories and specialized topics that were
counted for each evidence team member. The five major categories were
conversations about: (a) students in general, (b) minority students, (c) Diversity
Scorecard Project (DSP), (d) a pipeline that tells the story of the student’s time at
West Coast Community College including barriers they may encounter, and (e)
programs other than the DSP at West Coast Community College. The specialized
topics were developed because there were some topics within the major categories
that were emphasized by various individuals. To give the reader an understanding of
the evidence team member’s focus, I counted the number of conversations that were
about the following specialized topics: (a) educational outcomes with respect to
students in general, taken from statements regarding students in general; (b)
educational outcomes with respect to minority students; (c) the creation of a cohort
which fell under the category of the DSP because it dealt with creating a DSP
database of representative students that match the demographics of the college; and
(d) the College Review Process which, though only one of many programs at West
Coast Community College, was the one program discussed most often by many of
the evidence team members.
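The tallying of conversations by member and by category described above can be sketched as follows. The coded records, names, and counts here are hypothetical illustrations only; the actual counts appear in Appendix C.

```python
from collections import Counter

# Hypothetical coded conversation records: one (member, category) pair
# per coded conversation from the field notes.
coded_conversations = [
    ("Katherine", "Minority Students"),
    ("Katherine", "DSP"),
    ("Robert", "Minority Students"),
    ("Robert", "Minority Students"),
    ("Richard", "DSP Cohort"),
]

# Count conversations per member and category, the same tallying used to
# build the per-member comparison in Appendix C.
counts = Counter(coded_conversations)
print(counts[("Robert", "Minority Students")])
```

Keeping the member and category together as one record is what makes both the per-member totals and the cross-member comparison possible from a single tally.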
Ethical Considerations
The analysis phase also brought to question ethical considerations (Creswell,
2003). One of prime importance was that I would have to keep the anonymity of
those involved with the DSP in mind when interpreting the data. This was a concern
not only because I had signed a confidentiality statement, but also because those
involved have a right to privacy. Therefore, I used pseudonyms for both the individuals and
institutions discussed in this study. Also, because of the prevalence of Internet
data theft, I viewed the CUE facilitators’ data CD, did my work, and saved my
information only on a computer that was never connected to the Internet.
When required to send information to my
dissertation committee, I first confirmed that the file being sent would not divulge the
individuals’ and institutions’ anonymity then saved the information to a flash drive to
transfer to a computer connected to the Internet. The next issue I encountered was
how long to keep the confidential data. The USC CUE director requested that, upon
completion of this study, I return the CD and any information printed from the CD to
USC CUE. The last issue of importance was the accurate interpretation of the data,
hence the validity of the study, since Creswell (2003) emphasized, “Validity … is
seen as the strength of qualitative research” (p. 196). To ensure accuracy, I intended
to find common themes throughout the three sources and to clarify any bias I brought to
this study. When hypothesizing about, analyzing, and interpreting the data collected,
I remained independent and neutral, which was important since the members of my
dissertation committee have direct ties to the USC CUE Diversity Scorecard Project.
CHAPTER FOUR: RESULTS
The majority of the information for this chapter has been gleaned from the
CUE facilitators’ notes that were written during 15 Diversity Scorecard Project
(DSP) meetings that took place over a 4-year span of time at West Coast Community
College. The remainder of the information was taken from West Coast Community
College’s website in order to complete some of the background information for the
institution or individuals. Two CUE facilitators were usually present at the DSP
meetings at West Coast Community College; on rare occasions only one was able to
attend, and at other times more than two were present as new CUE facilitators were
introduced to the team over the 4-year implementation of the DSP. At each meeting,
at least one of the CUE facilitators was responsible for documenting the meeting.
These notes were the main source of my data on learning and on the background of
the team, the individuals, and the institution involved.
DSP, the facilitators noted actual quotations of the participants or, as was mostly the
case, provided a synopsis of the conversations of those in attendance. In the majority
of the field notes, the CUE facilitators included debriefings consisting of
conversations that were held between the CUE facilitators after the DSP meeting was
concluded. This information encapsulated what they viewed as important events,
topics, or conversations that occurred during the meeting.
As explained in the previous chapter, I have provided a count of
conversations (see Appendix C) organized by topics that I had determined to be
major topics among the five evidence team members. Within the major topics, I have
included subtopics that I considered representative of a large proportion of the major
topic for at least one of the evidence team members. While perusing the notes, I
determined there were five major topics discussed during the DSP meetings: (a)
Students in General, (b) Minority Students, (c) The DSP, (d) The Pipeline, and (e)
Other Programs at West Coast Community College. The subtopics included are: (a)
Educational Outcomes, (b) DSP Cohort, and (c) The College Review Process (CRP).
The subtopic heading “Educational Outcomes” was counted as two separate
headings, each under a different major topic, “Students in General” and
“Minority Students.” I counted how many conversations dealt with educational
outcomes for students in general, counted how many dealt with only minority
students, and placed the counts under their
respective major topic. I also noted that many of the conversations under the major
heading of “DSP” dealt with the development of a cohort that would represent the
demographics of the college, so I created a subtopic titled “DSP Cohort.” The last
subtopic was created after I realized that a large proportion of the
conversations regarding other programs at West Coast Community College were
about one program, the College Review Process, so I provided a count of how many
of the conversations dealt with that topic. It is important to note that the title of the
program “College Review Process” is a fictitious name created to ensure the
anonymity of the college and individuals involved in this study. The name was
chosen to identify the fact that the program was a college-wide program, not just a
departmental program.
For reasons of confidentiality, names and titles have been changed. Therefore
the names of the college, members of the DSP evidence team, CUE program
implementers, others mentioned during the meetings, other institutions, and job titles
are fictitious and created in such a way as to hide the identity of those people or
institutions involved. The CUE program implementers and the CUE facilitators were
not mentioned by name; instead, they were referred to only as a CUE director or CUE
facilitator; the institution, a 2-year college with more than one campus, was renamed
West Coast Community College and its campuses referred to as other campuses since
providing information about the number of other campuses would give too much
information that could jeopardize the confidentiality of the college; and the
individuals within the DSP evidence team have been given the fictitious names of
Theresa, Katherine, James, Richard, and Robert. Any other individuals mentioned
who are members of West Coast Community College, but not part of the evidence
team have also been given fictitious names; for example, the president of the college
was given the name Ralph. Since other institutions were also mentioned during the
meetings, those institutions were referred to as other institutions and any members
within those institutions were given fictitious names if they were mentioned more
than once in this study; otherwise, they are only referred to as a person from another
institution. For the sake of anonymity, the names of West Coast Community College
divisions, units, departments, or the titles of non-DSP committee members that are
unique and therefore could identify the person or college have also been changed.
The Research Questions
The two research questions for this study are: (a) How did the participants’
feelings and attitudes about the DSP overall and about their own participation in
particular change over the course of the project? and (b) Did the participants’
awareness of and interpretation of inequities in educational outcomes for minority
students change over the course of the project? This case study, which was designed
to determine whether learning took place among the five evidence team members, was
constructed under the framework of the two research questions in such a way that the
story being told about each evidence team member was told through the information
gathered to answer the two research questions.
The First Research Question. The rationale for considering how the
participants felt about the DSP was that learning was more likely to happen when
individuals had positive feelings about the process and purposes of the project.
Individuals’ feelings and attitudes toward the DSP could facilitate or hinder any
learning that might have taken place regarding equity over the four-year
implementation of the project. By following each individual’s story it was my intent
to identify any feelings, attitudes, or reactions that may have been expressed
explicitly or implicitly. To do this I reviewed their comments and the comments
made by the CUE facilitators.
The Second Research Question. To answer the second research question, I
separated it into two parts. In the first part, the analysis
focused on the knowledge each evidence team member had about educational
outcomes of students in general and minorities in particular. In other words, I was
interested in determining what current knowledge the participants had about
students’ goals, success, and particular outcome patterns for specific populations then
about any new knowledge they may have obtained after looking at data
disaggregated by ethnicity. Simply put, I wanted to know whether individuals had
given thought to student outcomes and whether participating in the project intensified
their awareness.
The second research question also dealt with each evidence team member’s
interpretation of inequitable educational outcomes for minorities. In order to interpret
the data, the evidence team members needed to question the reason behind any gaps
in educational outcomes that existed for minority students taking classes at their
college. The CUE team then expected that this interpretation of the patterns would
instill in the participants a desire to take action and become agents of change.
Organization of the Five Case Studies
This study includes five separate case studies embedded within it. Each
member of the DSP evidence team at West Coast Community College is presented as
a full case study. The individuals who joined the DSP were called evidence team
members, not committee members, because the intent of the DSP was to have the
individuals do more than just participate in the project. They were to become the
researchers through a process of “practitioner-as-
researcher” (see Chapter 1).
There were 15 meetings throughout the four-year implementation of the DSP
at the college and the CUE facilitators took notes during each meeting. I have
analyzed the CUE facilitators’ field notes to determine if learning occurred during
the time each member of the evidence team was involved with the DSP. Before I
present the case studies for each member of the evidence team, I have provided a
brief synopsis of the evidence team as a whole.
The Evidence Team
Three of the evidence team members, Theresa, James, and Katherine, were
the original evidence team members, although Theresa and James left West Coast
Community College before the conclusion of the four-year project. In the first
meeting the CUE facilitator noted, “All members of the team are White.” I found it
strange that the evidence team selected to begin the DSP was entirely White,
considering the DSP’s emphasis on ethnicity. The other two evidence team
members, Richard (Hispanic) and Robert (African American), joined the DSP in the
second year (Meeting 5) and remained with the DSP until its conclusion. The only
evidence team member to be involved with the DSP from the beginning to the end
was Katherine.
The original evidence team members were all positive about the DSP, were
familiar with one another, were willing to invest their time in the project, and were
open to discussions regarding problems that existed at West Coast Community College. In
the very first DSP meeting the CUE facilitators made clear that the evidence team
worked well together and had a positive attitude toward the DSP, “The group seemed
very familiar with each other. They seemed positive, energetic, and comfortable.”
Their familiarity and their positive attitude were corroborated during the CUE
facilitators’ debriefing of the first meeting when they noted, “Chemistry: seemed
very positive, comfortable, have worked together before, assumed familiarity.” By
the second meeting, the CUE facilitators not only corroborated the evidence team
members’ positive attitude, but also the team’s openness and motivation when they
stated, “The group is open and positive in general. They still seem very motivated
about the project and Theresa told us at the end that they are enjoying it.” Their
feeling of familiarity allowed the team to work openly together, as two instances
from the first meeting demonstrated. The first was when Theresa commented that
“the structure [of the college] is always under development but that they [West Coast
Community College] never move,” to which the CUE facilitators noted, “all laughed
about this and made side comments.” The second was when the CUE facilitator
emphasized that the team members all laughed after Theresa explained that there was
a problem at the college in that, “we say ‘we’re gonna [sic] do good’ but it never
works.” These two
statements showed that they were comfortable with the fact that there was no need to
stifle any information regarding their feelings about the institution. After reading two
other statements made by the CUE facilitators, I determined that these original
evidence team members were willing to invest time in the DSP. The first was when
they noted that the “group members seemed interested in being at the meeting.
Eager,” and the second was when they observed that “the evidence team members
laughed about their busy schedules” without any notation that the DSP was a
hindrance. These two statements, combined with the fact that no one described the
DSP as a burden or a drain on their resources, implied that the members were
willing, even eager, to invest their time in the DSP.
In the second year, the remaining evidence team members, Richard and
Robert, joined the DSP. From their first meeting (Meeting 5) the CUE facilitators
believed that “they were great additions” to the evidence team and in one of their
later meetings during their debriefing, the CUE facilitator noted, “Both Richard and
Robert were excited.” Until the second year, the original evidence team members did
not truly begin looking at or discussing data disaggregated by ethnicity; instead,
they viewed the problem as one of prepared versus unprepared students. I determined
that there were three possible explanations for this, possibly in combination.
first explanation was a problem the original evidence team was having with their
existing data files because of the implementation of a new data collection system.
Since they did not have the data, they could not view it disaggregated by ethnicity,
and without such a view, their discussions would not
focus on ethnicity. This topic was broached during the very first DSP meeting when
the CUE facilitator noted that the team was not looking at ethnic inequalities, but
instead “looked at it as prepared versus under-prepared regardless of ethnicity. Those
who have defined goals versus those who do not have defined goals.” At that time
the CUE facilitators suggested that “maybe they [the evidence team] should
disaggregate this by ethnicity,” but the CUE facilitators also noted that it “may be
problematic that they [the college] are having such trouble with their data.” The
second explanation, partially related to the first, was that, because Richard did not
join the DSP until the second year, the evidence team was again not provided with data
disaggregated by ethnicity that might have initiated conversations regarding ethnicity. The
third explanation was that the topic of ethnicity was never fully discussed until
Robert, an African American, joined the team, as shown when the CUE facilitators
emphasized the importance of having an evidence team member of color because he
brought a focus on ethnicity to the conversations:
I think that a lot of the things that he [Robert] said would not have been said
had he not been there because these were issues that were obviously
important to him. And his comments lead the bulk of the discussion for a
while... We might have glossed over some of the things that he mentioned
like are the graduating seniors truly representative of the students that we’re
serving and can we use that for comparison purposes and if we do, are certain
groups over represented by ethnicity. He talked about a lot of things that I
don’t think would have come up. (Meeting 6)
The evidence team began the DSP with a positive attitude and strong
motivation, and near the end of the project they remained motivated, as the CUE
facilitators noted during their debriefings of
the last two DSP meetings: “They seem motivated and hopefully
will accomplish their goals this year…. They are on top of it [Link of the DSP to the
College Review Process, communicating, and the new president’s report] and are
motivated—good worker bees.” There were many instances that showed how well
the evidence team worked together, but the highlight was the discussions they held
among themselves to clarify what Richard needed in order to present the data
disaggregated by ethnicity and tell the story of West Coast Community College
students over time. The
following are examples of the moments in which the CUE facilitator noted there
were discussions between the evidence team members:
• There was some discussion between Theresa and Richard regarding
which students to include. (Meeting 7)
• Theresa gave Richard a chart she had drawn on the back of her paper.
They talked to each other about ‘right to know’ data…. Richard and
Theresa talked about how to break down the data. There seemed to be
some differences between what Theresa thought and what Richard
predicted the data would show. There was some discussion regarding
who to include in the data and they decided on the ‘expanded right to
know’ cohort data. (Meeting 7)
• Theresa asked about continuous enrollment in primary terms (fall,
spring) and suggested it could be an “excellence” indicator. There was
some discussion between Theresa and Richard about whether or how
to count the winter intersession and summer semester. Theresa told
him to only worry about fall and spring. (Meeting 8)
• Theresa and Richard had some discussion regarding defining part-
time and full-time cohorts. (Meeting 8)
• Richard looked on the computer for the IPEDS credit load column. He
did some work on the computer to find it. There was some discussion
between Theresa, James, and Richard. Richard made a few notes to
himself. They were basically talking in West Coast Community
College data code. (Meeting 8)
• Theresa, Richard, and James discussed different fields and what we
should look at next. They looked at “uninformed” versus “informed”
goal. (Meeting 8)
• There was some discussion about “undecideds,” and James suggested
that Richard could run them separately. (Meeting 8)
• There was some discussion amongst the three of them about clean and
unclean data. (Meeting 8)
• There was some discussion among James, Richard, and Theresa
regarding how to replicate this and then add in the other elements—
degree applicable units and total units. (Meeting 8)
The field notes do not detail what the discussions entailed, but this pattern of
evidence team members helping one another through discussion implies a strong
motivation and commitment by each to get the job done.
How the Evidence Team Made Use of the DSP
The evidence team saw the DSP as a tool that could benefit other programs at
their college. By the fourth meeting a difference between the West Coast Community
College DSP evidence team and other partner institution evidence teams became
apparent. The CUE facilitators noted that the West Coast Community College DSP
evidence team had begun using the DSP “as a tool and not a process in and of itself.”
They realized that West Coast Community College wanted “systemic linkages rather
than piecemeal linkages (like one goal about nursing and another one about calculus)
… [and] is working from their strategic plan to the DS rather than the other way
around.” The CUE facilitator illustrated that “in other institutions the DS is a process.
For West Coast Community College it is a tool to provide information for the
strategic planning process regarding diversity and the success of diverse students.”
The CUE facilitator then made the comment that “it’s interesting to see how different
institutions have interpreted and applied the DS.… [They then noted that] Theresa
asked if we [CUE] thought they [the evidence team] interpreted this way because the
whole team is made up of researchers.” This indicated that she believed the West
Coast Community College evidence team was different from the teams she had met
from other partner institutions. That was not the only difference that existed between
the partnering institutions. In the fifth meeting the question was asked if any other
college was creating a longitudinal study with a cohort, a sample of students
representing the demographics of the college, over time that reached across the
institution; the CUE facilitator responded, “No, more often partners focused on a
discipline or area, like math and/or science, and looked at those rather than following
a cohort all the way through the degree program, regardless of major…. You’re
looking for leaks… you may be the only ones doing it this way.”
With an understanding of the background of the evidence team, it is now time
to concentrate on the individuals within the team. The results for each individual
were organized by the two research questions as follows. Each case study opens with
a brief introduction to the evidence team member. This background is followed by an
analysis of the participant's experience with the DSP as recorded in the CUE
facilitator's field notes. These experiences were analyzed within the framework of
the two research questions under the headings Feelings and Attitudes, Awareness,
and, where interpretation took place, Interpretation. At the end of each evidence
team member's individual case study, I have included a Conclusion. The results for
each research question were presented through actual comments made, or situations
that occurred, during the time the evidence team member was involved with the
DSP. I begin with an analysis of Theresa because she
was responsible for putting the DSP evidence team together and because she was the
leader of the DSP evidence team at its onset.
Case Study One: Theresa
Theresa was the Vice President for an area that included college planning and
program assessment. Throughout her six years at West Coast Community College,
she had held several other positions that gave her the advantage of seeing the big
picture. As the CUE facilitator noted, "She understands all the pieces going on
within that part of the organization [planning].” At the first meeting of the scorecard
team, Theresa mentioned her familiarity with the balanced scorecard and its
adaptation to higher education institutions and that ever since hearing about the
scorecard she had become interested in implementing it. Consequently, Theresa saw
the DSP as making it possible for her to try out an approach that was appealing to
her. Thus, when the president of West Coast Community College received the
invitation to be part of the DSP and asked her if she thought it would be useful to the
college, she did not hesitate and quickly formed a team. At the first meeting of the
team, she told the CUE facilitators that “the president recruited her for the committee
and she ‘drafted’ the other two” members. “These two, James [the Director of
Research] and Katherine [Mathematics Department faculty member and coordinator
of the department's assessment committee], seemed like obvious choices." It became
apparent that Theresa had already worked with the others since she “was very
complimentary of both of them, their knowledge and skills.” Theresa’s comment
about having “drafted” them caused laughter and bantering, so I sensed from the
CUE facilitator’s notes that this was a collegial team and that they were comfortable
working with each other. These impressions were reinforced by the CUE facilitator’s
description of Theresa as “not dominant or directive. [She was] quick to share
knowledge, praised the others in front of us, not domineering at all…. [They] seemed
to regard each other as peers. Power was not an issue in the group.” Theresa
remained the leader of the DSP until her resignation from West Coast Community
College. Her last meeting was the twelfth of fifteen DSP meetings.
Through the field notes, Theresa emerges as someone whose attraction to the
scorecard was the framework and the structure provided. The scorecard filled an
administrative need. While she seemed to like the concept of the scorecard as a
method and often spent considerable time figuring out best ways to depict data, she
did not appear to be as engaged in making sense of the data. Almost three-quarters
(Appendix C, 70.9%) of her conversations dealt with the methods of the project or
other non-student topics and fewer than twenty percent (Appendix C, 19.2%) of the
statements dealt directly with students. For example, after being shown dismal data
on student persistence, the field notes show Theresa asking if “there [was] a
difference between full-time and part-time?” That is, rather than reacting to the
disparities shown by the data, which was the intent of the DSP, she appeared to be
more preoccupied by the ways in which the data were cut. In considering Theresa’s
comments as a whole, over the entire life of the project, my impression was that she
placed more importance on the data than on what it conveyed about the state of
equity on the campus. In fact, toward the very end of the project, just before leaving
the college, Theresa became preoccupied with which of the four perspectives in the
scorecard could be viewed as the equivalent of dependent and independent variables.
It should also be noted that this was the only team among the 14 partner
institutions that focused on a cohort of students and followed their progress through
various indicators. The decision to focus on a cohort and do a longitudinal analysis
was primarily Theresa’s idea.
Feelings and Attitudes
As I mentioned above, Theresa was predisposed to liking the scorecard. She
had been thinking of how to make use of it prior to being invited into the DSP.
Throughout the time that Theresa was in the DSP, she displayed a positive attitude.
One of the reasons why she liked the DSP was that it complemented and reinforced
strategic planning, which the college was about to undertake.
Even though Theresa felt uneasy about the reaction of the campus community
to data disaggregated by race and ethnicity, she recognized the benefits the DSP
could have for many of the college’s initiatives. The college administration valued
data and viewed themselves as data-oriented, but the college did not have a good data
system. Due to the lack of accurate data, the college had recently lost its designation
as a Hispanic Serving Institution because they were unable to document that they
enrolled 25% low-income Hispanic students as required by the US Department of
Education. Theresa complained, “The data problem really slowed down the planning
process. It has been hard to do factbooks without quality data.”
An important factor in Theresa’s positive attitude toward the DSP is that it fit
her agenda and it could be useful in advancing college-wide initiatives under her
purview. She pointed out, “The planning committee has long-term goals but no
metrics.” Theresa was aware that the college was good at producing lots of reports,
but not as good at using those reports to move into action. “We say ‘we’re gonna do
good’ but it never works,” but with the scorecard’s four perspectives they could
“identify progress on their [the college-wide action plan] strategic goals.” She felt “a
need to educate the campus on this type of information use” and the DSP provided
the means for doing so.
The challenge is linking things and the attraction of the DS was the
indicators. We do reports all the time. It’s another battle to get links and
action as a result. This is extremely difficult…. The DS helps us to articulate
access beyond who’s coming in the door.
She also saw the DSP as filling gaps. For example, she mentioned, “There is
an assessment committee, which is new. It is a small committee… [and] they do not
yet have a framework.” Similarly, in speaking about another committee she said,
“[We] are already working on it (the math problem), but now [with the DSP
framework] they can use data in a more sophisticated way. Bring data to bear on the
problem.”
Theresa’s positive feelings about the DSP could be surmised from the many
uses she found for it. She shared her feelings about the DSP. At the second meeting,
the CUE facilitator wrote, “Theresa told us at the end that they are enjoying it [the
project].” Theresa also felt that the DSP could help answer “questions the college
was struggling with.” However, the questions she had in mind were not about equity
and minority students. “We’re in a growth mode [and] this project could help us
think about marketing. Who are we attracting? Who are we keeping? How are they
doing?”
Awareness
As mentioned earlier, this team did not see actual data until 12 months after
they got started. Theresa focused more on unprepared students generally than on
ethnicity. “Our biggest problem is under-prepared students vs. prepared, regardless
of ethnicity. Those who survived high school vs. those who succeeded in high
school." At the first meeting, the CUE facilitator tried to get the group to commit to
looking at the outcomes for a particular minority group; however, Theresa was more
focused on the entire student population. The conversation went like this:
CUE facilitator: Let’s pick a target group to get started.
Theresa: We’re always concerned with gender, ethnicity, age…
CUE facilitator: Let’s take ethnicity. Is there one group we should focus on?
Theresa: No. We have heightened awareness about Latinos because the group
is growing and we lag behind the growth of the population. One of our goals
is, "What does 28,000 students mean?"
In the debriefing notes, the CUE facilitators referred to this exchange and noted that
the three members of the team had trouble focusing their attention on minority
groups and they noted that they might need to “steer them a little” in that direction.
Even when data were presented by race and ethnicity, Theresa tended to view
the patterns in terms of unprepared vs. prepared students. She interpreted the
problems they were seeing in the data as “remedial students not being socialized to
the non-remedial culture of the academy.” However, even though she was not likely
to focus on the outcomes of a particular group or to question what might be going on
with Hispanic or Black students, at one meeting she told a story that demonstrated
her understanding of the socio-cultural obstacles these students face in the classroom.
The story was about a student in her political science class who had a problem
understanding the concept of "checks and balances" because the student pictured a
checkbook. For Theresa, this event stood as an important moment in her own
education.
As the group became more immersed in the work of the project, Theresa
became more outspoken about institutional responses to the barriers faced by
students. “We tend to keep the curriculum the same and ask student services to help
the students get through it rather than adjust the curriculum to the students. We need
to worry about who we’re NOT serving.” But what she struggled with was “how to
engage the institution in a conversation about the barriers to those populations’
participation.” On the one hand, she was aware that the college was not serving all
students equitably. On the other hand, even when speaking about the problem of
equity, she did not use the language of equity or refer to the condition of specific
groups.
Theresa admitted that “race was an issue which is hard to talk about” and
knew that even if the DSP team raised the possibility of a misalignment between the
curriculum and student needs there would be “accusations of ‘dumbing down’ [the
curriculum].” Theresa was aware that “traditionally it is assumed that students don’t
succeed because they’re unprepared. We blame the victim too much.” One of her
hopes was that “This model [DSP] will hopefully trigger some thinking about this
[equity]. It’s hard for faculty to remember what it’s like to be 18 and a student. They
have overly high expectations.”
One of the few times that Theresa spoke specifically about inequities was
when the data showed big differences between White and African American students
getting into transfer level courses. She said, “If Whites can get in and African
Americans can’t that’s a problem.” She also started asking more equity-oriented
questions, such as “Among those who assessed into basic skills English, are there
different experiences by ethnicity?” And she also began to adopt the DSP’s
institutional responsibility standpoint. For example, anticipating that people on
campus would attribute differences in outcomes to differences in prior schooling, she
said, “You can’t fix preparation [so there is a need to also ask] are there other
barriers in place?”
Theresa was aware that the college lacked faculty diversity and that this lack
of diversity may have negative consequences for minority students. “We’d like to
add some diversity to the faculty and counseling staff. Some students may feel more
comfortable talking to a counselor who is like them.” She also pointed out that the
“affirmative action office needs benchmarks for hiring.”
At the meeting in which the team presented their findings to the president,
Theresa spoke very directly about African American students, “The African
American findings were the most shocking…. California takes its diversity for
granted…. We have one ethnic group not being addressed.”
Theresa became aware of the need for West Coast Community College to
maintain a vision of equity as it went through the process of creating an action plan,
What this project does is it gives it a frame to keep the initiative of equity on
the table…. The scorecard keeps equity on the table for action plan
measures…. We need to know which population we are working with…. We
have to target specific groups in a proactive way.
Conclusion
Through the DSP process, Theresa seemed to have become more aware of
disparities in student outcomes. She also learned to interpret these disparities from
the perspective of institutional responsibility, which was one of the goals of the DSP.
In the meetings she also began to reflect about how to engage the institution in a
conversation about the barriers that minority students may be experiencing.
In sum, at the start of the project, Theresa’s attraction to the scorecard seemed
to be primarily utilitarian in relation to the strategic plan, assessment, as well as
marketing, but by the end of the project, particularly once the team started looking at
the data, she began to ask questions and provide interpretations that exhibited the
equity standpoint advocated by the DSP.
Case Study Two: Katherine
Katherine was a White female and the only member of the Diversity
Scorecard Project at West Coast Community College who remained an evidence
team member from the beginning of the Diversity Scorecard Project to the end of its
four-year process. Katherine appeared to be a quiet person, which I surmised from
two observations. The first was that the fewest statements made by or about any
team member, an average of 5 conversations per DSP meeting (75 total
conversations; see Appendix C), were made by or about Katherine; the second was
the CUE facilitator's direct comment, "Katherine is quiet, but
not disengaged.” When Theresa first “drafted” the original members of the Diversity
Scorecard Project, Katherine had been with West Coast Community College for 10
years and was an Associate Professor of Mathematics and the coordinator of program
assessment in her department. She knew something about the Balanced Scorecard
(see Chapter 1) because the summer before the DSP started she attended a workshop
where it was discussed. One important detail about Katherine was that she served on
key institutional committees and because of her involvement in assessment she
seemed like a natural member for the DSP team. The CUE facilitator wrote in the
field notes that “it was revealed that Katherine is on almost all of the committees on
campus" and "everyone joked about Katherine being at every meeting." She served
on at least five critical college committees, including the College Review Process,
Partnership for Excellence (PFE) committee, and the Student Assistance Committee.
Theresa had mentioned that Katherine had helped lessen the fear about assessment.
Katherine's participation on governance and decision-making committees was
confirmed by the large proportion of statements made by or about her, over half
(Appendix C, 50.67%), regarding other institutional programs or
initiatives to which she had some connection. Katherine saw the scorecard as
permeating all other programs, and she was concerned with how to make use of the
scorecard data to advance the goals of the many campus initiatives in which she was
involved. Specific references to students were not evident in the field notes, so I
found it difficult to get a sense of how she felt about student outcomes. Only ten
percent (Appendix C, 10.67%) of the conversations regarded students; fewer than 5
percent (Appendix C, 4.0%) of her total conversations dealt with the outcomes for
all students, and none dealt with outcomes for minority students. She
appeared to focus mainly on the uses of the DSP for other college initiatives. As I
have pointed out for the other members of the evidence team, it is possible that
Katherine may have made comments that were not documented in the CUE
facilitator’s field notes.
Feelings and Attitudes
Katherine displayed a positive attitude toward the DSP from the beginning.
Evidence of Katherine's positive engagement with the DSP is based on the CUE
facilitator’s observations of her gestures and actions, the benefits Katherine saw in
the DSP for other programs, and her desire to disseminate the findings of the DSP.
Over one-quarter of the conversations by or about Katherine (Appendix C, 25.33%)
related directly to the DSP. In some of these conversations, the CUE facilitator
noted moments in which Katherine's actions or body language demonstrated her
engagement in the DSP, such as nodding her head. In one
meeting the CUE facilitator noted, “James and Katherine and Theresa were all
making notes and underlining on the page.” Even though Katherine was quiet, she
still came across as engaged in the process. Katherine did not display gestures that
might have indicated boredom or disinterest such as doodling or staring off into
space.
As early as the second meeting Katherine explained that she “thought the
measures on the vital signs document would be helpful for [the] College Review
Process [because] they want to track students by ethnicity, gender, and program.”
She explained how they could also use the College Review Process to communicate
the DSP to other groups at the college. On several occasions she provided
suggestions about ways the evidence team could link the DSP to other programs. For
example, during a conversation regarding the ESL (English as a Second Language)
Department’s experience going through the College Review Process, the CUE
facilitator discussed how another institution was focusing their scorecard on math. I
gathered that after listening to the conversation focusing only on ESL, Katherine did
not want the others to lose sight of the Math Department, so she suggested, "Should
we focus on ESL and math and link this to College Review Process?” Once the link
was established, Katherine wanted to disseminate the findings from the DSP to
others at the college. She wanted to communicate the findings of the DSP to the
Academic Senate, “I don’t know if there is a subcommittee for diversity issues but it
would be nice if the whole Academic Senate can hear it [a presentation on the DSP].”
Katherine was knowledgeable about the processes needed to disseminate information at
the college and mentioned the need to create a process that would “come from the
president.” She understood that for the DSP to be taken seriously, it too had to come
from the president because that is when people realize it is a priority. She pointed to
everyone being serious about the “Basic Skills Task Force” because it had come
from the president. She also suggested presenting the scorecard at the “required flex
day” in February. She mentioned there would be about fifty faculty members and that
the scorecard could be a session topic. Although Katherine, as a coordinator in
charge of an assessment study, appeared not to have been in the habit of
disaggregating data by race and ethnicity (see Awareness section below), as a result
of her participation in the DSP she saw the value of doing so for the disciplines going
through the College Review Process. This was noted above, in that the College
Review Process wanted to track students by ethnicity and gender, and from the fact
that she stated, “We [the DSP committee] brainstormed on these questions [to be
asked during the College Review Process]…. The College Review Process develops
the program guide and I’m going to implement the questions.”
In sum, I judged that she felt positive about the scorecard based on the CUE
researchers’ observations of her body language and gestures during the meetings.
Additionally, Katherine mentioned on repeated occasions the various ways in which
the scorecard could be linked to other major campus initiatives, which I interpreted
as a sign of positive attitude to the scorecard. Based on Katherine’s capacity to see
the utility of the scorecard more broadly and in connection to the most substantive
work of the college (e.g., College Review Process), I inferred that she was a strong
supporter and advocate. It is doubtful that she would have spent time trying to figure
out how to make it more than just an ad hoc project had she felt differently.
Awareness
At the start of the DSP, Katherine’s awareness was that all students were
performing equally poorly. Statements she made regarding students were normally
about all students, not ethnicity, and as I perused the field notes, I was not able to
find any statements made by Katherine about educational outcomes that were
specifically in regards to minority students. At the beginning of the project she spoke
about ethnicity as a descriptive demographic variable, and less so in the DSP’s
language of equity and fairness. For example, in the first meeting when she said, “If
we cut by ethnicity we would probably want to cut by gender also. There are a lot of
studies in math and science by gender, there are a few by ethnicity, but there are very
few that look at both gender and ethnicity.”
Katherine was attentive to the data provided by Richard but her comments
were typically about all students rather than about a particular group. Even though
she was in the math department, she was quite surprised with the results of the
placement data, “It doesn’t even seem possible but 40% [of all students] are placing
in basic arithmetic (the lowest level),” she said. While Katherine was concerned
about the large proportion of students who were placed into basic arithmetic she only
commented on the 40% figure for all students, and did not specifically focus her
attention on whether it was higher or lower for Hispanic or African American
students. The only sign of Katherine having developed awareness of why it is
important to disaggregate data by race and ethnicity was in comments she made at
one of the meetings close to the end of the project. At the twelfth meeting, the team
members discussed the link between the DSP and the College Review Program
(CRP) where Katherine observed,
The focus [of the CRP] has not been on diversity issues in English and Math.
So maybe that’s something we need to think about. I know in math, we are so
overwhelmed about remediation issue. We didn’t disaggregate it. We know
it’s bad across the board.
So, on the one hand, while Katherine recognized the value of disaggregating data as
part of the College Review Process, she did not declare that it needed to be done or
state why it was important to do so. She did not elaborate on how disaggregated data might
enhance the College Review Process. Additionally, her comments about “knowing
that it is bad across the board” seemed to suggest that despite her positive attitudes
toward the scorecard, she had not developed a deeper awareness about the need to
focus purposefully on the outcomes of specific groups and take note of racial patterns
of inequality.
Conclusion
Katherine had a positive attitude toward the benefits the DSP held for existing
programs at the college. She regarded the DSP as broadly applicable, believing the
scorecard could permeate all other programs, so her focus was on the utilitarian
aspect of the DSP as a useful instrument at the college. There was little to suggest
that while participating in the
DSP she developed greater awareness of inequalities in educational outcomes for
underrepresented students, although she learned that more students were in basic
arithmetic than she had previously thought. The field notes did not provide
information that would have allowed me to discuss how Katherine interpreted
inequalities, which was not surprising given that there was no evidence that
Katherine ever spoke about the outcomes of a specific group. It was also telling that,
at the same time she was involved with the DSP, Katherine conducted an analysis of
the performance of the Mathematics Department, but surprisingly, despite the
emphasis of the DSP on disaggregating data, she did not think of breaking up the
data by race and ethnicity. Her failure to disaggregate data for her own department,
the least one would expect of a DSP evidence team member, made it appear that
Katherine, despite valuing the scorecard, did not develop a deep awareness of racial
patterns of inequality. However, one glimmer of how participation in the DSP may
have changed Katherine's outlook was her comments
about the importance of looking at data disaggregated by race or ethnicity in the
College Review Process. So it is possible that, by the end of the project, Katherine
had begun to realize the importance of disaggregating data by race and ethnicity.
Case Study Three: James
James, a White male, was one of the two people originally chosen by Theresa
to help start the DSP and was a member of the evidence team for the first 13 of 15
meetings at which time he left West Coast Community College. He was with West
Coast Community College for five years as the Director of Research and was
involved with assessment activities at the college. As I read the field notes, I realized
that even though James spoke very little, an average of 18 statements per meeting
(Appendix C), his main focus was on both the process of the DSP and other
programs at West Coast Community College. I surmised this since at least 40% of
the statements by or about James dealt with issues regarding the DSP and 42% dealt
with various programs at the college. Some of his discussions about the DSP
involved clarifying terminology or understanding a concept. During the
debriefing for one of the first meetings of the evidence team, the CUE facilitator
noted, “James brought up questions about clarifying access, which was helpful….
James had the most questions.” He continuously clarified his understanding as shown
from the following list of statements drawn from different meetings:
• James went back to slide 9 and asked what a ‘responsive’ goal was.
(Meeting 1).
• James clarified his own understanding of what Richard was saying.
(Meeting 7)
• James took a moment to clarify his understanding of what was going
on. (Meeting 8)
James also clarified his understanding by summarizing decisions,
So I’m clear, we’ll look at some analyses to figure out which measures are
best. We’ll look at (1) over a two-year period, what happens; (2) look at
course patterns in English for problems. (Meeting 7)
As I read the statements made by or about James regarding programs at the college, I
realized that three-fifths (Appendix C, 60%) of these statements were about
one program in particular, the College Review Process. Since one of James’ concerns
was that the college tended to “keep going through cycles of data collection and
program implementation that don't stick," I inferred from his statements that he was
working toward implementing lasting programs. This was
shown by the fact that approximately three years prior to the start of the DSP, James
co-chaired a College Review Process committee that revamped program review by
making it less bureaucratic and more professional, e.g., placing greater emphasis on
peer review and dialogue. James never ended his involvement with the College
Review Process and he noted (Meeting 11) that the requirement of the DSP that data
be disaggregated by race and ethnicity was now incorporated into the self-study
process for the College Review Program:
We use the scorecard as a part of a self-study. We use it for the self-study of
programs to address how to change those things. (e.g., a certain math class
not working for African American students). [James also noted that] some
colleagues are reluctant to look at this. Some people say they don’t need this
[to disaggregate data by race and ethnicity].
In order to answer the second research question, my focus, as I read the field
notes, was on statements regarding minority students, but the field notes revealed that
only 16% of the total number of statements made by or about James discussed
students at West Coast Community College and only 40% of those statements
(6.36% of all statements) discussed minority students. As James went through the
DSP, it wasn’t until the fourth meeting that he specifically mentioned minority
students. This was the first of only four meetings in which James specifically
discussed minority students. Even when Theresa left West Coast Community College
and James became the new leader of the DSP committee (Meeting 12), there were no
comments in the field notes that suggested he discussed minority students. There was
no notice that James would be leaving West Coast Community College. Instead, his
leaving was announced in the 14th
meeting and the DSP was once again in need of a
new leader.
Feelings and Attitudes
James had a positive attitude from the very first meeting to the end of his time
with the DSP. His positive feelings about the DSP were primarily about its utility and
relevance to other campus initiatives as shown by the high percentage of comments
in the field notes (Appendix C, 42%) by or about James’ discussions of college
programs. He viewed the activities related to the scorecard not as a burden or extra
work but as very complementary to other initiatives that relied on data, “We hope to
clean the data and then put it in a data warehouse to create reports that could be put
on the web. The scorecard is a nice opportunity for that--an easy way to share info.”
From the comments that he made, it can be surmised that he saw the DSP as a useful
tool that came along at just the right time, “One of my motivations was to make data
accessible. Now, when faculty go through the College Review Process they ask for
learning outcomes, what their program is designed to do and how they will assess it.”
Based on the observations made by the CUE facilitators in the field notes, James
demonstrated his enthusiasm by giving “constant nods and uh-huhs” when they went
over the purposes of the project and generally by reacting affirmatively to the task.
For example, when the evidence team was told it was time to begin work on the
scorecard, it was noted that James responded, “All right! (Also enthusiastically.)”
James recognized the link between the DSP and the College Review Process,
which required all academic departments to conduct a self-assessment and future
planning. This was demonstrated by the high percentage of statements (Appendix C,
24% of all statements) he made regarding the College Review Process. James saw
the Diversity Scorecard as a good link for the data requirements of the newly revamped
College Review Process that was getting underway at the same time as the DSP, “It
[the DSP] may be a good fit with [the] College Review Process or college-wide
action planning.” Thus the disaggregated data for the scorecard could also be given
to the academic departments as part of the review process. Another link James saw
was the natural connection between the DSP and the state-required Student Equity
Plan, “Maybe we could combine the Scorecard with the Student Equity Plan [a report
mandated by the Chancellor of Community Colleges].”
James’ positive attitude was supported by his belief that the DSP could
facilitate necessary discussions about disaggregated data. James explained how the
DSP helped the College Review Process discussions interpret the data, “The
disciplines need support in analyzing and interpreting data. We [programs going
through the College Review Process] did sit down, but [we] need to spend more time
on interpreting rather than collecting and analyzing.” He was enthusiastic about the
scorecard’s capacity to help faculty members become aware of problems within their
own departments. He likened the scorecard to the dashboard of an automobile, “The
indicators are on the dashboard, which is the DS. The light [signifying equity]
indicates that there is a problem here. Then, that problem can be discovered and dealt
with through College Review Process.” He regarded the Diversity Scorecard as “a
different way of packaging measures” and as “a monitoring device” that the
administration could rely on “to describe what’s going on at the institution.” He felt
that “there wasn’t a whole lot of discussion in other places [within the college] about
disaggregated data.” James viewed the DSP as a useful method to call the attention
of faculty members to problems that they may not have been aware of or may not
have wanted to look at. James also liked the DSP because he hoped the approach
would increase the likelihood that the data would produce action. He admitted that,
in the past, he found it difficult to communicate the importance of data when he said,
“[My] past frustration has been turning data into action.” He hoped others on campus
would become involved in discussions about disaggregated data, “Once they get the
data they will begin to involve others on campus in the project.”
Awareness
James’ awareness of students and his new awareness as a result of being a
member of the DSP evidence team can be summarized as follows. At the start of the
project the awareness James had was very broad regarding students in general, but by
the end of the project, James no longer spoke about the problem of unpreparedness.
Instead, he became aware of multiple inequities in educational outcomes in very
specific ways. My analysis of the field notes showed his new awareness consisted of
problems that both African American and Hispanic students were encountering.
After looking at data disaggregated by ethnicity and race during different meetings,
James discovered three main problems faced by minority students: (a) GPAs for
Hispanic and African American students were lower than for White and Asian
students, (b) African American students didn’t take the recommended courses, and
(c) African American students were more likely to be part-time.
At the start of the project (Meeting 1), James thought that a large proportion
of students were unprepared for college and based on his perspective about the
problem, he suggested that the scorecard focus specifically on the “under-prepared,”
“We could pull out students with long-term goals and segment them—prepared for
college-level work versus under-prepared. We could talk about those who are under-
prepared, [that] would be a good-sized group.” He felt that there was a “need to
move students without goals to having educational goals and help them define their
program or career ideas.” I interpreted this statement as showing that James viewed
students’ poor outcomes as emanating from their lack of direction.
James’ first mention of minority students was in the fourth meeting, but it
was more speculation than fact, “Maybe the traditional model of curriculum fits with
some. Maybe it’s mostly Asian students.” Then in the same meeting he again
discussed minority students, but based on what others had said not on what he found
in the data,
James talked about College Review Process in ESL and English. ESL
concluded that there are lots of students who can speak English well but don’t
understand grammar and can’t write at the level they need to. But, they are
getting put into English courses and the curriculum is not working.
It wasn’t until the next meeting (Meeting 5) when Richard joined the evidence team
and brought his “data packets” that James began to look at the data and “noticed that
Hispanics and African Americans were higher in the part-time category.” In the sixth
meeting, the only time that the topic of minority students was discussed by James
was after Robert had mentioned that African American students’ high aspirations
didn’t match their low performance. In response to Robert’s statement, James said,
“It’s interesting that the African Americans don’t seem to take the recommended
placement levels.”
Over half (Appendix C, 51%) of the statements made by or about James
regarding students were made during the DSP report to the president (Meeting 9). Of
those statements, the number of statements made regarding minority students was
only one more (Appendix C, 10) than the number of statements made about students
in general (Appendix C, 9). I found it interesting that the number of statements made
regarding minority students at that one meeting represented two-thirds of the total
number of statements James made about minority students throughout the entire time
he was an evidence team member. As the person responsible for explaining the data
at the president’s report meeting, he presented the various data tables and charts
included in the final report that summarized the DSP findings. Although James never
said specifically that these findings represented new knowledge or that he was now
more aware of differences in educational outcomes, each of the illustrations provided
in his PowerPoint presentations showed critical differences across groups. James’
understanding of the disparities in outcomes and the ways in which he discussed
them with the president and other administrators suggested that he along with the
other members of the evidence team became very conscious of critical differences
across racial/ethnic groups. James, as a result of data disaggregation by race and
gender, now became much more aware of gender patterns of inequality. For example,
he pointed out differences in enrollment patterns by race and ethnicity, particularly
among Hispanic students where the increase in enrollment was primarily female,
The White population as a percentage of the entering cohort is declining over
time, both men and women. Hispanics are rising, but the bulk of the increase
is in women only. Asians declined slightly while African Americans stayed
pretty constant.
He then shared data on degree-seeking and transfer students stating that “we see the
same trends” except with respect to Hispanic females where “the same gender effect”
was not in evidence. Notably, James’ presentation demonstrated the importance of
looking at racial/ethnic groups by gender because their findings suggested that the
inequalities were not being experienced uniformly.
During the president’s meeting, James revealed data that showed disparities
between different ethnic groups, “Asians are the most degree-seeking. African
Americans are second then Whites then Hispanics…. Asians are the highest
[graduating by the sixth term]. Whites and Hispanics are almost identical. African
Americans are the lowest.” He pointed out that sizeable differences existed in
preparation and in the number of attempted versus completed units by racial/ethnic
groups where,
You see major differences in English placement. There are not many prepared
in math. African Americans and Hispanics are the lowest. One thing of note
is that African American’s preparation is not helping them access what they
need to reach their aspirations…. Asians attempted and completed the most
number of units, then Whites and Hispanics, and African Americans were the
lowest…. This includes basic skills courses.
He concluded the presentation of the DSP President’s Report by calling attention to
major inequities for two indicators representing academic success, GPA and
successful transfer to a four-year college in which “Whites and Asians have the
highest success, then Hispanics, and African Americans are the lowest…. Whites
have higher success rates even though they were so close to the Hispanic data all
along. Whites tend to transfer to 4 year [colleges] more than others.”
Interpretation
In order to determine changes in James’ interpretation of inequities in
educational outcomes, I looked for statements that showed reflection or questioning.
There was no indication in the field notes of James expressing specific reflection
about educational outcomes for minority students. Therefore, my findings are
provided in a chronological order to show a possible change in his frame of reference
for all students between the time he began the DSP and the time just before he left
West Coast Community College. I am using chronological
organization because the statements that represent interpretation were subtle and I
want to show the changes, even small ones. As discussed during the introduction for
James, only 6% of the statements made by or about James showed a mention of
minority students whereas an additional 9% of all his statements were about all
students at West Coast Community College. Since there were so few statements
regarding minority students, I am including his statements regarding all students in
this section. As I read the field notes, I found that James’ intent was to create an
atmosphere of interpretation for the college, which was shown while he was creating
the scorecard for the DSP. During many of the meetings, James discussed the types
of questions that could create reflection regarding inequitable outcomes for
minorities, which implied that any statements regarding all students that he made
would apply to minority students as well:
• We can learn more about barriers that students face if we separate
them out. (Meeting 4)
• We could have the total student population, their long-term goals, how
many were assessed with their different levels of placement, and how
many had successful completion. We need to get an overall sense of
what’s going on and then compare the groups. (Meeting 6)
• At the macro-level you can look at consistent enrollment, mean GPA,
units attempted…. You can get a sense of where they’re falling out.
And you could compare ethnicities on one page. (Meeting 7)
Although James made no specific interpretation regarding minority students, he did
illustrate the importance of interpretation with respect to institutional receptivity as
shown when the CUE facilitator noted, “James: Asking questions about faculty
issues is important. He posed hypothetical questions like: What are the institution’s
hiring practices? Are they hiring more adjunct faculty? These questions can drive a
meaningful conversation.” James knew the questions that needed to be asked when
looking at data disaggregated by ethnicity which showed that he was now reflecting
on problems students were having as institutional problems, “Could it be due to one
subject? Is there an ethnicity effect?”
The following are the reflections by James regarding all students presented in
a chronological order. In the first meeting James explicitly stated that a large group
of students were unprepared for college courses; this was an awareness presented in
the previous section. By the second meeting, he began to show a more open frame of
reference with respect to prepared versus unprepared students as he questioned its
importance, “[We] need to take a step back and see if academic preparation is really
an issue, according to the data.” By the third meeting James began to reflect on
student problems as institutionally centered not student centered, “There are different
assumptions or dispositions related to class and culture. It may be that the teacher
can’t explain concepts in a way that particular students would understand because
their linguistic codes are not matching.” He recognized that counselors may be
making assumptions about students’ capacities and reflected on problems that exist
for students whose first language is not English, “We need to identify ‘special needs’
populations. People have been talking about generation 1.5 [students who speak
English well outside their home, but whose main language at home is non-
English]…. The counselors advise them into the wrong courses because they speak
so well” (Meeting 4). James recognized that “students have different needs…. [We]
need to figure out who the student populations are and what their needs are”
(Meeting 9).
Conclusion
Based on what was provided in the field notes, James had a positive attitude
about working on the DSP. His drive for clarity and precision helped him to develop
an understanding of the context of the DSP and he was motivated and enthusiastic
about working on it. He saw how the DSP linked to other programs and the benefit
the DSP provided with respect to disseminating disaggregated data to others at the
college. James found the DSP useful as a tool to find inequities in educational
outcomes for minority students. After looking at the data disaggregated by race and
ethnicity, James changed his frame of reference from one of believing that a large
number of students were unprepared for college to a more open frame of reference
in which the college needed to meet students’ needs. He showed his awareness of
inequities in educational outcomes for minority students at only a few of the
meetings; the majority of his statements regarding minority students and students
in general were made during the president’s report meeting. He found that Hispanic and African
American students had lower GPAs than Whites and Asians and that both Hispanic
and African American students were less degree-seeking and had the lowest
placement, yet the African American students had the highest aspirations. He also
discovered that Hispanic students’ enrollment was rising, but mostly for females. He
found that African American students didn’t take the recommended courses, were the
fewest to graduate by the sixth term, and were more likely to be part-time students.
Although the field notes never specifically mentioned any interpretation by James
regarding minority students, he did show interpretation for all students with the
intent to examine ethnicity, and there were instances in which he modeled the types
of questions or reflections he wanted others to ask.
Case Study Four: Richard
Richard, a Hispanic male, was one of the two individuals at West Coast
Community College who joined the Diversity Scorecard Project (DSP) in the second
year (Meeting 5) prior to the presentation of the DSP president’s report. Although
there is no explicit mention of Richard’s ethnicity, I surmised he was Hispanic from his
last name. As the Director of Research, it was understood by the DSP evidence team
that Richard would be responsible for the collection of data needed by the group.
This was implicitly corroborated by the proportion of the conversations dealing
directly with the DSP (Appendix C, 59.8%), of which about one-quarter (Appendix
C, 26.7%) dealt with the evidence team choosing a cohort that would best represent
the demographics of the college and about one-quarter (Appendix C, 23.3%) dealt
with finding ways for the DSP evidence team to demonstrate a pipeline that would
tell the story of the students’ time at West Coast Community College.
corroboration was the fact that he always came prepared for the meetings and sought
to clarify his own or others’ understanding of the data, which was demonstrated in the
first three meetings. At his first meeting, referred to by the CUE facilitators as a
“data party,” Richard brought his laptop and “party favors which were fairly big
packets of data.” At that time, the data was not broken down by percentages so that it
was difficult for the others to understand what was being shown, but when he
brought his data packets to his second meeting he “presented the data in a more
‘reader-friendly’ manner,” so the evidence team no longer needed to break out their
calculators to understand what the data was showing. By his third meeting, I found
that Richard had become more familiar with the requirements of the DSP for West
Coast Community College, in that the evidence team was looking at data for the
entire institution, unlike some of the other DSP partners who were looking at
individual disciplines. To accomplish this, he developed a “score booklet” of several
pages where “each page reflected the same data for different groups—gender and
ethnicity groups” versus a scorecard where the results of the data could be shown on
one page. While reading the field notes, I observed that Richard showed his
professionalism in the way he deliberated over the problems involved with the
collection of data. He showed that he understood the complexities of collecting data
for studies or projects, “Richard discussed how data and information collection had
become more complex and comprehensive at West Coast Community College over
the last decade.” This was also shown in his explanation of past difficulties with
assessment because “people did not follow-through with ideas. It had different foci
and inconsistencies in its implementation…. We could not tease out what we needed
to study. There was not a clear vision of what the program should accomplish. And
our program did not resemble [it].”
Feelings and Attitudes
I found that Richard displayed a positive attitude that was supported by his
enthusiasm to find answers, his confidence when challenged to show the story of the
students at West Coast Community College, and his desire to communicate the DSP
findings that had been disaggregated by ethnicities and gender. In his first meeting
(Meeting 5) the CUE facilitator noted, “Richard displayed great enthusiasm and
curiosity about the data we wanted to explore. He brought his laptop and was eager
to find answers to any questions that came up through the conversation.” I found that
he was willing to start where the team was at so that when others stated that the work
he had brought was not clear or was not what the team wanted, he did not become
offended. This was shown when the CUE facilitator noted that Richard was trying
“to show where they [the students] placed, but [realized] maybe this isn’t the
friendliest way to do it because this way people will have to pull out their calculator.”
I found no comments that Richard had become offended or upset; instead, I found
that he was willing to keep trying and, when necessary, “go back to the original file and
tie back to that.”
I discovered that even though Richard ran into many challenging situations,
he was persistent in his efforts to find a way to make data easy to read, “The
challenge here is how to deal with all of these subgroups without this all getting
messy.” At another meeting, he again emphasized his willingness to face the
challenge. When asked to show the students’ persistence from semester to semester he
explained, “I used this same structure for another presentation and it worked. It’s a
good set of comparisons. So…I’ll put that together. The challenge will be to present
this in a graphical sense, so it’s understandable.” Richard was open to the challenge
of presenting new ways to show data, “Yes… we need to re-organize. I’m trying to
think of how to do it.” He never hesitated to challenge himself, “I’ll tell you what; I
won’t be bounded by physical structure. I’ll try to trace the pipeline.”
While reading the field notes, I perceived that Richard, when unsure of what
was being asked of him, showed no reluctance in saying to the others in the team that
he was having trouble understanding. The following are selected quotes that
demonstrate Richard’s confidence in his own abilities as a researcher that enabled
him to admit a lack of understanding of their needs.
• I’m still not quite clear. (Meeting 6)
• So, we do the analysis four times, but we make the group smaller by
looking at transfer. The rationale for this is that they are ‘traditional’
college students. It’s a diverse population though, so I’m still not sure.
(Meeting 6)
• I have some methodological questions… I need some basic places to
start. I’m not sure what the desire for everyone (the team) is…. I need
some guidance on what to specifically look for. (Meeting 7)
• This is just so I know what I’m doing when I’m on my own.
(Meeting 8)
He also clarified his own or others’ understanding by asking many questions,
• Will we just look at everything, but what should we focus on?
(Meeting 7)
• [Do we want] to compare one “flag” to another…. Is there value in
that? (Meeting 8)
• We’re looking at a trend? Or, a pipeline? (Meeting 8)
When the team settled on the idea of a pipeline to show where students were falling
out of the system, Richard wanted more focus and through statements and questions
explained,
I want ideas on… what level of courses are students taking? Are we only
interested in degree applicable units? Transfer? We need to get an idea of the
pipeline by ethnicity. By spring 1999 about half persist. We can see what it is
by ethnic group. We can look at different rates that students are taking and
passing courses like college English and college Math… If we mapped it out
and looked at the slope (decline in enrollment) for the different groups as
time goes on… What does that look like? The problem is we’re only
capturing those who are consecutively enrolled. Maybe we want to say, “This
many stopped out.”… How do I determine the main river? (Meeting 8)
I observed that, during a discussion about placement and success in “gatekeeper”
courses, Richard tested his understanding of what the others were trying to show by
introducing scenarios, “If it included placement level and success rate for enrolled
spaces it may indicate prior preparation, whether or not they took the course, and
their success rates. Does this tell us anything? I think so.” While reading the field
notes, I found one quote that personified the way Richard played out his role with the
DSP because it showed that Richard was confident enough in his work to take the
time to fully understand what was required,
Richard was taking quite a lot of time trying to figure out how exactly to
isolate the data terms that we wanted to look at. There was really too much
information available, it seemed. It was probably a good 20 or 30 minutes of
him playing on his computer with help and suggestions from James. They
were looking up codes for things in a binder Richard had, etc. It took a while,
but I think Richard ended up feeling more confident about what the group
wanted. (Meeting 8)
After reading the field notes, I realized that Richard had experienced the DSP
in the ways the researchers had envisioned. He had changed from a passive
researcher, who only looked at ways to present data, to being enthusiastic and eager
to disseminate the DSP findings that showed inequitable outcomes for minorities at
West Coast Community College. Even though Richard began the DSP as a data guru,
he had, by the end of the DSP, become a very outspoken advocate for the need to
communicate the findings of the DSP. It took time for Richard to begin discussing
the communication of the findings of the DSP to other programs (Meeting 11), but it
seemed that his delayed reaction was because he had joined the DSP after the
conversations regarding how to link the DSP with other programs on campus had
already occurred, and he had not been aware of the inequalities that existed at the
college before he saw the data. I was left with the impression that Richard began to
fully understand the purpose of the DSP as he worked with the data, and once he
understood the meaning behind his research, he became an activist who wanted to
communicate the findings of the DSP to other programs even if the issue was
considered sensitive to others. During a conversation about how to get others at the
college to “adopt looking at disaggregated data,” Richard expressed that the team
should take the risk,
It’s about getting people to care about it. We have to get people to look at it
[the data] more complexly. The other challenge is getting people to do
something about it. It requires a different level of engagement by the college
community. We have to convince people this is a priority. In the worst-case
scenario a faculty person could say, “What are you saying, I’m a racist? I
grade people differently?” So we have to discuss this in a non-accusatory
way. (Meeting 13)
I observed that by the last meeting of the DSP, Richard had developed a deeper
understanding about the purpose of the DSP that went beyond the technical aspects
of constructing the scorecard.
We can go on forever looking at issues of equity because there is so much out
there to do. Just presenting this information could be shocking. They have to
engage this conversation for the first time. And that in itself is a story because
it’s not an institutional priority because they don’t think about race and
equity. That’s huge. We don’t want to jump by that. We have to deal with
past trends and issues and discuss how these perceptions affect how you
teach, grade, and your expectations. If you don’t deal with these issues, you
can look at as much data as you want, but everything else is deemed
irrelevant. (Meeting 15)
Richard believed the information found through the DSP had to be introduced
throughout the college and felt the College Review Process could be used to
disseminate, campus wide, the findings and thus provide others at the college an
opportunity to view the inequitable educational outcomes that he had previously
found.
The glaring deficiency is not getting it out in this didactic way to our own
faculty. Maybe this College Review Process is the best way to do it. It’s not
forced upon them but seen as a tool to use—integrate it subtly into the
campus community.
Richard thought it was best to integrate the DSP into the College Review Process and
was persistent in trying to find the best way to involve the committee, “How do we
implement this [the DSP] into the College Review Process? Do we ask to be on their
agenda?” Richard, after hearing the CUE facilitator’s suggestion to create a
workshop where the College Review Process committee would become a fictitious
department in order for them to reflect on disaggregated data, said,
We should talk to Katherine about it [DSP workshop]. How much
information do they need? We should know how much they know about data.
That’s a great idea of treating them like a department. They can take that back
to their departments…. That’s a great idea, and would be a good way of
sustaining and integrating…. So that’s a good activity to work on.
Later, Richard reiterated the usefulness of holding a mock review process, “But not
just include the findings. You can have a fake College Review Process case study in
the training.”
I determined that Richard wanted more than just divisions to view the results
of the findings from the DSP as was shown by his desire to present to the Academic
Senate as well as integrate the DSP with the state mandated Student Equity Plan.
Even though others were showing caution regarding presenting the findings of the
DSP to the Academic Senate, Richard continually pushed the agenda, “Maybe in the
senate, if there is an appropriate subcommittee to do a long presentation, [then we
could] ask them if this can be presented at the general Academic Senate.” When the
discussion came down to asking the President of the Academic Senate how the DSP
should be presented to the Academic Senate, Richard took ownership, “So who is
going to do that?… I’ll do it…. So that’s our next step in disseminating this project.”
The discussion continued through to the following meeting where Richard again
expressed his understanding that caution was required when presenting to the whole
body, “Yes, I want to build a quiet steam before going to the senate. I want to
practice it before I go to senate.” He reiterated his cautiousness in the last meeting,
“I’d rather test drive [the presentation] on a sub committee.” He also wanted the
findings to be integrated with the Student Equity Plan, which was shown by his
persistence in promoting it and his ownership of starting the process. In meeting 13,
he first suggested that the team should “include the DSP into the Equity Plan.” In the
same meeting he emphasized that the evidence team should make this a priority
because time was a factor, “Student Equity is next because we want to get it ready
for summer.” After the CUE facilitator complimented the evidence team for having
“a lot more data than is listed in the Student Equity Plan,” Richard took ownership of
beginning the integration of the two projects (Meeting 14), “I haven’t heard any
activity on this campus for the Student Equity plan and it’s due in six months. Maybe
I should call whoever is putting that together. I can show him what we have for the
Diversity Scorecard.”
Awareness
I found that, at the start of his time with the DSP, Richard was not aware of
inequities that existed at the college. As I read the field notes, I realized that Richard
spoke about students far more often in his first four "data party" meetings (Meetings
5-8), in which over eighty percent (Appendix C, 80.6%) of his total conversations
dealt with students in general or minority students specifically. I did note that
although his last six meetings had fewer discussions regarding students, a large
proportion (Appendix C, 83.2%) of those conversations concerned minority students
or ethnicities.
After reading the field notes, I realized that there was no mention of
Richard’s current awareness of educational outcomes of minority students, but I did
find one comment by Richard that showed his knowledge of the demographics of the
college:
The characteristics of the cohort were: 1st time students, entered fall 1999,
ages 17-19, no college degree, not currently in high school, any educational
goal. It came out to be about 2,562 students. Richard commented that when
you break down the cohort by sex and ethnicity, it resembles the college at
large.
Although I felt that, at the beginning of his time with the DSP, Richard believed his
job on the DSP was only to present the data, I observed that as he looked at the data,
he began to recognize inequalities in educational outcomes for minority students.
One such finding was his realization that the African American male students were
least successful, “The differences in groups, we saw that Asians are best and Blacks
are doing worse. You look at gender differences and Asian females are doing best,
and Black men are doing worst.” As he continued to view data, Richard noted
differences by ethnicity and gender in vocational courses, “58% of Hispanics are
women…. [And] there are more Asian men than Asian women.” In the last meeting
he informed the others of his realization that, “Latinos are making this huge increase,
but they are mostly women. It determines all these other outcomes. Now you have to
look at a nontraditional minority—the male.” As he viewed the data he had been
working on, Richard continually found differences between ethnicities, “A real
disparity [exists] between Whites and every other group. The majority of students
overall place into one level below college level English, but two levels below
(remedial English) there are big disparities too.” The CUE facilitator made a note of
a conversation in which Richard discussed his recognition of differences by ethnicity,
“Again there were big disparities among ethnicities but there were no disparities for
gender. Whites were placing in the college level at higher rates than any other
group.” Looking at the data, disaggregated by ethnicity, Richard became aware of the
fact that African American students didn’t always take the courses they placed into,
“Of the sample for college level English, 166 African Americans placed in college
level English, but only 49 took the course.”
With respect to student placement for all students, Richard found that
students registered for courses lower than they assessed into, “In math, students tend
to take the course lower than was recommended.” He began to see anomalies and
soon realized that students who placed into college level reading courses did not
necessarily assess at the same placement level for English composition, “Of those
who placed in college level reading, about 93% placed in English composition
(college level).” Richard now recognized that placement into English composition
was a determinant in the decision to take or not take English the first semester.
Yes, we want to look at how many took English, where they placed, and how
many are in college level English. Have some students taken something
below college level or that is not transferable? It seems like those who place
in college English take it. If students don’t place in college English, it looks
like they won’t take English at all in the first semester. This data gives you an
idea about how they’re doing.
Interpretation
I noticed that as he proceeded through the DSP, Richard began to use the
language of one who had transitioned to an equity frame of mind, and he knew the
right questions to ask in order to create inquiry into what the data was showing, such
as when he asked, “How many are persisting versus dropping out?” When Richard
reflected on the findings for all students, he appeared to place an emphasis on student
placement, “There is a disproportionate number of students placing in and taking
college level English. Many are not taking it. We need to find out how many are
taking any English course.” In the same meeting, Richard reflected on a statement by
the CUE facilitator regarding African American students who, although they stated a
goal to transfer, rarely placed and took the college level English course, “This is
important because it may mean that students are accessing particular services
differently.” Richard wanted to know whether those students who did not place in
college English ever took the course, “Do we want to look at whether they took
college level English at any time? Through fall 2001?” Richard continued to try to
make sense out of the placement of students, “It’s reasonable to assume students
would at least attempt college level courses within the first 2 years. Is the argument
that better prepared students will succeed at higher rates? We would have to look at it
over time.” He later reflected on a problem that might exist in which students
couldn’t get into the college level English course the first time out,
As Katherine said, we don’t know if the students couldn’t get into the
course… If they can’t get in within 4 semesters, that’s a problem…. If some
are getting through the first time and others are not, that’s a problem.
I found that Richard didn’t stop his interpretation of the data there; he reflected on
the problem from all points of view, “But how do we know they want to get into that
course?”
Conclusion
Richard epitomized the purpose of the DSP. It seemed that he came into the
project without much awareness of student educational outcomes and by the end of
the project became very engaged in developing strategies to get the word out to the
rest of the campus. Richard had a positive attitude toward the DSP and he took the
project and his role very seriously. It was clear that he valued it highly and that he
was willing to invest the time needed to find the best ways to provide data for the
DSP. He pondered over alternative formats to present what the evidence team was
trying to show as they searched for leaks in the pipeline that represented student
progress at West Coast Community College. As Richard looked at the data that he
presented to the evidence team, he became aware of inequities that showed
differences by ethnicity, gender, and sometimes both ethnicity and gender. He began
to use language of one who had developed an equity frame of reference. Richard
became aware of the fact that African American male students were the least
successful. He also found that some students who assessed into the college level
would take a course lower than they placed, and if they didn’t assess into the college
level, they tended to avoid the course. He also noticed that White students placed into
college level courses more often than other ethnicities, and although African American
students placed into college level English composition, they rarely took the course.
Although Richard focused on problems for minority students, he also noticed
problems all students were having. He questioned why students were not taking
courses at the level they placed into and whether there might be an access problem that
kept them from taking the course. By the end of the DSP, Richard’s main goal was to
disseminate the findings of the DSP to other programs at the college so that others
could also become aware of inequities in educational outcomes for minority students.
Case Study Five: Robert
Robert, an African American male, was an assistant professor and a
coordinator of student services at one of the satellite campuses of West Coast
Community College when he and Richard joined the DSP in the fifth meeting. He
continued as a member of the DSP evidence team until the fifteenth meeting, when
the DSP ended. Throughout the entire course of the DSP, the CUE facilitators made
no comments as to Robert’s race or ethnicity, so I searched the West Coast
Community College web pages, found a picture of Robert, and surmised from that
picture that he was African American. I then determined he was from one of the
other campuses from the field notes when Theresa said, “West Coast Community
College second campus is the most diverse of the campuses. Robert is at that
campus.” I found it interesting that before Robert ever became a member of the
evidence team, the original three members had told the CUE facilitators that “Robert
is currently doing their environmental scan.” Since this was an all-White team, I
found it peculiar that during their second meeting, after mentioning the work Robert
was doing, they didn’t take the opportunity to discuss the importance of diversity on
their team. I found no statements in the field notes in which they considered inviting
Robert to join them and since I realized they knew of his work, I found myself
questioning why he hadn’t been invited to join the DSP evidence team from the start.
Another point I found interesting was a comment made by Theresa in the meeting
before she left West Coast Community College, in which she named Robert as her
successor, “[Robert] would be the new team leader.” Two meetings later, however,
there appeared to be confusion as to who had actually been the team leader after she
left, Robert or James. The confusion was brought to light during the fourteenth
meeting after James had left West Coast Community College, when it was noted that
“the CUE facilitator went over the goals of this phase because Richard asked for
some ‘leadership’ due to James (the team leader) leaving West Coast Community
College and the DSP team.” Apparently Richard, the CUE facilitator, or both had
thought that James was the leader. As it turned out, based on the notes, Robert
emerged as a very effective team leader who brought a lot of energy to the meetings,
“Robert is really a good team leader. He did the pre-work and got prepared for the
day.” The second CUE facilitator corroborated this statement,
Robert was very organized in that he prepared for our meeting. He talked to
Richard and Katherine before meeting with us. He is enthusiastic about
making changes on campus and is concerned with the best way to
communicate the DSP to others. He is also concerned with not having others
lose out on the process of reflecting on the data. There is a sense that there is
more commitment and empowerment on Robert’s end because of his new
position as team leader.
Feelings and Attitudes
After perusing the CUE facilitator’s field notes, I determined that Robert
demonstrated a positive attitude toward the DSP since he thought the DSP could be
useful to West Coast Community College both politically and strategically. This was
shown by his understanding of the purpose of the DSP and his motivation to
communicate the findings of the scorecard to other programs. The high proportion of
Robert’s total conversations, over forty percent (Appendix C, 41.0%), that dealt
positively with other programs at West Coast Community College, confirmed his
positive attitude regarding the political and strategic benefit of the DSP in generating
previously unspoken dialogues at the college. Another positive aspect was that
Robert took the time to study the materials, which I surmised was a sign of positive
feelings about the project: he cared enough to invest the time needed to do a good
job as a member of the DSP evidence team.
As I read the field notes, I found that Robert saw in the scorecard an
opportunity to do something that the college typically did not do, openly discuss
race. It was as if Robert felt that the DSP gave him and others at the college
permission to focus attention on and discuss inequalities that exist in educational
outcomes for minority students. He understood that through reflecting on the DSP
disaggregated data, he and the other evidence team members were now justified in
generating discussions regarding the sensitive issue of race, a topic that had always
been difficult to discuss at West Coast Community College. This provided
opportunities for Robert to be more intentional and more responsive to the students,
“We need to understand why each group is not learning even though this topic is
politically charged.” He realized the DSP presented an opportunity for the college to
become “pioneers” and do something that was unique in education: encourage
discussions regarding issues of race.
The focus should be on students. I’m interested in what institutional barriers
there are for students. Is it faculty? Counseling? Tutoring? We need to tease
out these barriers. We can be pioneers in how we are going to address this.
There is no incentive for community colleges to disaggregate data [as they
see it]. Disaggregating data is important for us to better address different
student groups.
I concluded that his positive attitude regarding the DSP grew from the fact that he
valued a process that would initiate dialogues on sensitive issues at an institutional
level, “It’s important to show where students are getting hung up…. One [aspect of
the DSP] is interpreting data and the second is generating a big picture on data. For
example, institutional patterns.”
As I read the CUE facilitator’s field notes, I found that Robert was motivated
to bring dialogue regarding ethnicities to the college by disseminating the findings
of the DSP to other programs. The following is a list of the programs he mentioned:
• We could communicate it to Student Services. (Meeting 10)
• We need to pick some things that are tangible and that we can on the
larger view. College Review Process seems like a good opportunity
because they have data disaggregated by ethnicity. We can work with
them on their data. We can extrapolate that to other services on
campus. (Meeting 10)
• There is a standing diversity committee on campus although I’m not
sure how to work with them…. (Meeting 11)
• I think we should go in front of Academic Senate just to make them
aware of it. The best-case scenario would be that someone within
would want to find out what is going on out of their own volition….
(Meeting 13)
Robert was not just saying the DSP was great; he was strategically finding ways to
make use of it. Of the programs listed above, I recognized that Robert was conscious
of two powerful groups that would assist with his agenda to initiate discussions about
inequitable outcomes for minority students: the Academic Senate and the College
Review Process. He was highly motivated to present to the Academic Senate in order
to open their eyes as to existing problems at the college,
We can present this to the academic senate. I think politically it would be
tough for them to say that they are against disaggregating data. We are trying
to undergo some sort of organizational change. When I had opportunity to
present the data, people are darn near shocked and they say, “Why isn’t
something being done?” A lot of times with the data, you can’t help but ask
the question of “why” and how to address it.
He realized the Academic Senate, after seeing the findings of the DSP, would help
provide a political climate supportive of dialogue regarding the barriers
minorities faced at West Coast Community College, “Maybe they [the Academic
Senate] can go to a committee after we make the presentation to the general
academic senate. They might become the natural allies to have this kind of
conversation.” Robert also recognized the strategic benefit of having the College
Review Process communicate the DSP findings throughout the college despite the
sensitivity of some of the topics, and he saw how the DSP could become a buffer
needed to focus the conversations on what the data, disaggregated by ethnicity,
revealed. This was shown by his grasp of the difficulties others would have
discussing disaggregated data and his intuitive understanding of the need for
guidance from the DSP evidence team:
College Review Process already disaggregated their data but they were not
always equipped to deal with outcomes if they were not familiar with
diversity and equity. They need to know how to facilitate the discussion.
Ninety-eight percent of White faculty are uncomfortable with this kind of
discussion. We need a buffer group like the Diversity Scorecard team to aid
in that process.
He understood that many would skirt the issue without the assistance that the DSP
evidence team could provide,
It really takes expertise to guide faculty through this. They are going to feel
uncomfortable. It’s an art to facilitate this conversation and keep them
engaged. We have to get them from point A to point B. One issue that
pressures departments is to publish. We cannot have a discussion about
learner outcomes without disaggregating data. We have to look at it in this
manner. Some folks are not willing to go there to say, “This is not ok, what
are we going to do about it?” Some people will want to move on and discuss
other things. We have to be willing to challenge status quo and put ourselves
on the line. This is systemic and we have to look at those issues.
Robert’s awareness that ethnicity was not currently being discussed in the College
Review Process motivated him to push the agenda for the DSP evidence team to
create questions to be used by the College Review Process that would initiate
discussions regarding race,
The disciplines have at their disposal information about DSP data. But they
bypass the figures of ethnicity and they don’t discuss it. Each of the College
Review Process gets a packet with questions to help think through their
CRP…. Robert went over the 10 guiding questions for the College Review
Process but none of them guide them to discuss outcomes by ethnicity. One
of the ways to include this in the discussion, Robert said, is to include a
question or two about ethnicity to spark thought and discussion.
Robert wanted the questions created by the DSP evidence team to become an integral
piece of the College Review Process and was aware of the fact that once the
questions were presented, the DSP evidence team would need to guide those going
through the College Review Process to create an atmosphere of reflection on the DSP
disaggregated data,
There are a couple of ways to approach it, but we want to infuse it into
College Review Process. So questions would infuse and then massage it so
that they can ask questions…. We can frame it in terms of questions of the
four frames.
In his last meeting, Robert told the CUE facilitators that he had begun unifying the
DSP and the College Review Process and explained what resulted:
We met with College Review Process (CRP) 2 weeks ago. We talked about
how they evaluate their particular disciplines. We talked about getting them
to disaggregate data by giving them three questions that would prompt that
kind of activity and discussion.
Katherine then passed out a sheet showing the three questions the DSP evidence
team had decided upon:
1. What trends exist in the data (gender, ethnicity, night vs. day,
traditional vs. non-traditional age, etc.)?
2. Why might these trends be occurring?
3. Are there additional data that would assist your discipline in analyzing
these trends?
When the CUE facilitator questioned the wording of the second question, Robert,
knowing issues of race were very sensitive at the college, explained the DSP
evidence team’s reasoning for that particular question. Since it was understood that
people either did not want to talk about these issues or did not have the opportunities
to do so, he felt strongly that it was very important for the DSP evidence team to
draw out the current perception of those going through the College Review Process:
We can see how their [faculty] perceptions affect student outcomes so we
want to gather that data on their reactions. Then we need to ask questions. So
we need to understand their present perceptions. And these people really
don’t know how to address the problems. So we can put down different
questions, but we still have to evaluate their perceptions.
Awareness
In the nine meetings Robert attended, just over seventeen percent (Appendix C,
17.2%) of the conversations dealt directly with students. From the beginning of
Robert’s time with the DSP I realized that he was already aware of inequities in
educational outcomes for minority students, and the language he used in discussions
was that of a person already within the equity cognitive frame. I
discovered that before Robert joined the DSP, he was aware of the importance of
institutional receptivity for all students, but for African American students in
particular, “Most of the literature shows that faculty contact impacts student
achievement. Regardless of ethnicity, contact is important. In my study on African
American males, informal contact with faculty is important.” Through his research
he also became aware of ethnic differences in counseling, “Literature shows that
there is a difference in the type of advising received by ethnicity.”
I found that, when discussing minority students, Robert focused his
comments on their aspirations, achievement, and placement. As Robert looked at the
disaggregated data, he developed new awareness when he found two disparities for
African American students, “It’s interesting that the African Americans seem to
have the highest aspirations, but their performance is the lowest…. There are
patterns for African Americans.” After viewing data that had been disaggregated by
ethnicity, he realized the data was demonstrating that a barrier existed within the
institution, “Well, it looks like African Americans start with high academic
aspirations, but something happens along the way…. It seems like their aspirations
don’t reflect the achievement indicators. There’s a huge disparity.” Robert discussed
how the data revealed that even with equal preparation there was still a problem for
minority students, “All the students in College English have similar preparation, and
we still see differences by ethnicity. Something is happening institutionally or in
class.” His new awareness led him to question the meaning behind the data being
presented. When he saw the high percentage, 70%, of African American students
who did not take the College English course they placed into, he asked the evidence
team members, “So, what would be the difference between those placing in a
particular course and those who actually took the course?”
Eleven percent (Appendix C, 11.48%) of Robert’s total conversations dealt
directly with all students, but his intent was to eventually look at problems by
ethnicity, “I almost think you have to do the ‘snapshot’ and then disaggregate by
ethnicity and then do a comparison.” After looking at data presented to the evidence
team, the team began discussing best methods to determine which measures would
be looked at for the DSP. It was at this time Robert showed his awareness that “there
are indications that there are problems at all levels.” As he explained how to present
data to show where students were falling out of the system, he emphasized that, as
the DSP intended, the evidence team’s emphasis should be on inequities in
placement, not in preparation, “We can’t do much about a student’s preparation
before they get here, but we can make sure to place them properly.” Robert then
showed his awareness that, although the problems students were encountering were
institutional problems, those problems were not unique to West Coast Community
College, “Aggregate numbers across the state show patterns that are similar to what
we showed you today. West Coast Community College is not an anomaly”
(Meeting 9).
In his very first meeting (Meeting 5), Robert displayed new awareness in the
form of shock after viewing some data,
Twelve out of 2500 got a degree or certificate within two years? Wow! (He
covered his face with his hands and was in disbelief.)… You would think
with the accessibility and low cost of our courses that numbers would be
higher for finishing in two years.
I found that Robert was trying to understand the types of barriers all students face,
“So few are transferring though. It’s very few. It tells one piece of the story. We need
to find out the story of the 30 percent that drop out after their first semester.” I also
realized that Robert’s development of new awareness was beneficial to the DSP and
to Robert because, as he looked at the data for all students, his new awareness led to
questions that needed to be asked:
• At the end of 3 years, what is the highest level of English achieved for
certain groups?
• We should look at the big picture first… Why is this population not
served?
• If you place in remedial English, you can’t finish in 2 years. But, why
aren’t those in college English finishing?
Interpretation
I found that unlike the other evidence team members, Robert emphasized that
student preparation was not the problem; he saw the problems as institutional.
Through clarification, justification, explanation, and rationalization, Robert
interpreted the data presented for both minority students and all students. After
viewing data that showed that African American students had high aspirations,
Robert suggested one explanation for barriers they may be facing institutionally,
“This seems to debunk the myth of African Americans having low academic
aspirations, which impacts teacher expectations. We could look at the difference
between aspirations and the impacts of advising in changing one’s aspirations.”
Robert clarified the needs of the DSP as they searched for ways to present the data,
We may want to break it up by pre-collegiate characteristics then look at
where students placed and what they took. Then you can focus on particular
treatments. This would help us target students early on and identify the
appropriate interventions based on the entering conditions of the students.
As Robert viewed the data and continued to determine the best use of assessment and
student goals, I discovered that he reflected on the possibility of institutional
disparities as he focused on the existence of institutional barriers being demonstrated
by the data,
The goals don’t say much except that they don’t match the patterns of
success. There is no correlation between placement and persistence. If
students come in with the goal of transferring, we need to look at what’s
happening institutionally that prevents that.
After looking at the data, Robert gave an example that explained problems African
American students were having, “College level English and college level math are
good indicators because they [African American students] don’t persist by semester,
they probably won’t move up.” Robert interpreted the data through his rationalization
of the institutional problems that were appearing at various places within the
institution. While helping the others to understand the concept of a pipeline to find
where students were dropping out, he grappled with the problem of showing these
barriers and exclaimed,
You’ll find differences at every point. If we identify differences everywhere,
what is that going to tell us? There have been differences at every point. It
looks like something is taking place in the classroom. Assessment should
control for preparation. But, they’re still not successful… It must be
something in the classroom.
At the end of the president’s meeting, Robert once again justified his conviction that
student preparation should not be an issue since his frame of reference was that the
inequitable educational outcomes shown by the DSP data represented institutional
problems, “The unit of analysis tends to be the student and we blame them for their
under-achievement. The institution plays a role in producing achievement or under-
achievement.”
Conclusion
Robert had a positive attitude toward the DSP. He was acutely aware that
others at the college avoided discussing race. His positive attitude was
shown by his understanding of how the DSP would justify these types of discussions
and his motivation to communicate the DSP to other programs at West Coast
Community College since it would be instrumental in institutionalizing dialogue
about inequities in educational outcomes for minority students. From the beginning
of his time with the Diversity Scorecard Project, he used the language of one who was
already within the equity cognitive frame. As he developed new awareness from the
data disaggregated by ethnicity, he began reflecting on specific problems in which
African American students had a huge disparity between their aspirations and their
outcomes as well as between their course placement and the courses they actually
took. His new awareness made him more conscious of the institutional responsibility
for correcting these problems, and he began the development of questions that
needed to be answered. He found that despite equivalent preparation, there were still
discrepancies for minority students, which showed something must have been
happening institutionally or in the classroom.
A Comparison of Learning between Evidence Team Members
Theresa saw the functionality of the framework of the DSP as a benefit to the
college as was shown by her initial use of the balanced scorecard before the onset of
the DSP. She began the DSP with a positive attitude and was able to convey that
attitude to the others so that they too showed a positive attitude from the beginning of
the DSP at West Coast Community College. She was open and willing to help the
others see the vision of the pipeline that could find institutional indicators to
determine where students were falling out of the system. Katherine, who was on just
about every committee at the college, was in a perfect position to help the others
understand the processes necessary to communicate the findings of the DSP to others
at West Coast Community College. Although Katherine was a quiet person who
spoke very little, she understood the “who, where, and when” of disseminating the
information. Katherine’s emphasis was on the process of creating indicators and best
ways to communicate the DSP, not what the data was showing. James’ perspicuous
attitude made it necessary for him to find clarity for himself and others and to be
precise about using the proper cohort data. Since he saw the DSP as the first stage
that was needed for the College Review Process, he was motivated to develop the
connection between the DSP to the College Review Process. To him the DSP was the
112
dashboard that had lights to indicate when there was a problem. It was then up to the
College Review Process to look at the problem and interpret its meaning. Richard
brought the awareness of ethnic and gender differences to the attention of the others.
He was responsible for putting together the data to be presented to various programs
inside and outside of West Coast Community College. As he did this, he discussed
his new awareness as he looked at the data disaggregated by ethnicity. Through his
search for the best ways to present the data, he recognized what the data was showing
and informed the others of differences he found between ethnicities. Of all the
members of the evidence team, Richard best encapsulated the meaning of the DSP:
he started the program intending only to present data on a college he already knew
to be diverse, but when he saw the inequities in educational
outcomes for minority students, he changed his mental model from one of diversity
to one of equity and became a change agent. Robert began the DSP with an equity
frame of mind and looked at best ways to depict and present the information to others
at West Coast Community College. He saw problems all students were having on a
statewide level and within his own institution, but he did not interpret the problems
as a matter of student preparation. He noted that African American students had high
aspirations and yet their performance was the lowest of all the ethnicities. He saw
student barriers from the story the DSP scorecard indicators were telling and found
that equal preparation did not relate to equal success. He appeared to be the catalyst
for the beginning of discussions that looked into the reasons behind these problems
minority students were facing and emphasized to the evidence team that something
was happening on the institutional level and in the classroom. Robert welcomed the
DSP because it provided a venue for discussion regarding inequalities in educational
outcomes. It should be noted that until Robert joined the group in the second year,
the original evidence team members engaged in very little discussion regarding
ethnicity. I was not able to determine whether the change came because Robert joined
the group and began interpreting the findings, or because both he and Richard joined
at the same time that Richard began presenting data disaggregated by
ethnicity, which opened up the discussions. The one point I was able to determine
was that Robert did begin the major discussions regarding the reasons for inequitable
educational outcomes for minority students that were found at West Coast
Community College.
With respect to the evidence team members' feelings and attitudes toward the
DSP, each one saw the Diversity Scorecard Project as a benefit to West Coast
Community College from their first day through their last. The three who began in
the first year of the DSP readily discussed links between the DSP and other
programs. After Richard and Robert joined the evidence team in the second year and
Richard began to show the data disaggregated by ethnicity, each member was
motivated to communicate the findings of the scorecard indicators to those linked
programs. The following shows the similarities and differences in the way each
discussed inequities in educational outcomes for minority students and looks at each
individual’s mental model. To do this, I have created tables (see Appendix D), which
show comparisons from individual to individual.
All the evidence team members had a positive attitude about the DSP, and I
was able to break down the demonstrations of their positive attitudes into seven
topics: (a) commenting that the DSP was beneficial to other programs, (b) sharing the
scorecard with other college constituencies, (c) realizing the usefulness of the
framework, (d) seeing the usefulness of the disaggregated data to emphasize
disparities by ethnicity, (e) seeing how the perspectives could mobilize others,
(f) seeing how the scorecard could facilitate difficult dialogue, and (g) showing
enjoyment, enthusiasm, or excitement while working on the DSP. Many of the reasons
within each of the topics allowed a comparison or contrast that helped determine how
each member felt about the Diversity Scorecard Project.
Each member of the evidence team saw a benefit the DSP had for other
programs. They each saw how it reinforced college-wide initiatives such as the
College Review Process. The differences were that Theresa mostly saw the DSP as a
tool to help answer her marketing questions, whereas Richard and Katherine began to
see the benefit of communicating the disaggregated data to the various programs to
enlighten them about the existence of educational gaps for minorities. James viewed
the scorecard as complementary to other initiatives that relied on data and wanted the
requirement of data disaggregated by race and ethnicity to become incorporated into
the College Review Process. Robert, in turn, saw the political and strategic benefit
of the DSP within many of the college-wide programs as a way to generate dialogue
that had once been unacceptable, and, like Richard, saw how the DSP could be beneficial
to the state-mandated Student Equity Plan.
Theresa left West Coast Community College before many of the discussions
began regarding sharing the scorecard with other college constituencies, but the other
four members wanted to do so. Katherine saw
how they could use the College Review Process to communicate the DSP to others
on campus. James wanted to make data accessible so that during the College Review
Process faculty could start asking about learning outcomes and how to assess them.
Richard saw the DSP as a tool that would get the information out to the faculty
through the College Review Process without forcing it upon them, so that it could
become integrated within the college. Robert understood the purpose of the DSP and
wanted to communicate the findings of the scorecard to others at the college to open
up long overdue dialogue about inequities in educational outcomes that were found
to exist at West Coast Community College for minority students.
Theresa was the only member who actually discussed the value of the DSP
framework itself. She was originally attracted to the DSP because of the
framework and its indicators. She became preoccupied with the perspectives,
including which perspective to consider a dependent or an independent variable. She
saw how it could fill gaps that existed in other programs such as the newly formed
assessment committee, which had no framework, and how they could “bring data to
bear on the problem.” She saw how the four perspectives of the DSP could identify
progress on their college-wide action plan strategic goals.
All but Theresa saw the usefulness of the requirement of the DSP to
disaggregate data by ethnicity in order to emphasize disparities. Theresa did not
appear engaged in looking at the data because she was preoccupied with the way the
data was to be cut. Her priority was how to present the data, not what the data
showed about the state of inequity at the college. Katherine had an opportunity to use
disaggregated data during a math study, but chose not to. By the end of the DSP she
may have begun to see its usefulness, since it was shown that she wanted to
disseminate the findings from the DSP to others at the college. James focused on
fostering interpretation among the departments going through the College Review
Process and viewed the DSP as a way to emphasize problems that faculty may not
have been aware of or wanted to look at. Richard, who had begun as a researcher,
began to see the results of the disaggregated data and quickly became an advocate
who realized the need to communicate the findings of the DSP so that others would
understand the disparities that existed at the college. Robert valued the process that
could initiate necessary dialogue. He wanted others to have the opportunity to reflect
on the disaggregated data and saw the scorecard as an opportunity to do what had
never been done before: discuss race. The DSP provided Robert and others the
permission to openly discuss sensitive topics after viewing the results of the
disaggregated data.
James, Richard, and Robert all hoped that the DSP perspectives could
mobilize others at the college. James wanted others at the college to become involved
in discussions about the disaggregated data. Richard wanted the information he
discovered to be disseminated throughout the college, not just the College Review
Process and the Academic Senate, but he wanted it to become integrated into the
Student Equity Plan so that all could view the inequitable educational outcomes for
minorities that he had found. Robert saw the college as becoming a "pioneer" that
could do something unique in education: encourage discussions regarding issues of
race. Through mobilization, he saw that organizational change could take place. He
recognized the benefit of having the Academic Senate supportive of the dialogue that
would result after seeing barriers minorities faced within the college and recognized
the strategic benefit of having the College Review Process communicate the findings
of the DSP to the various departments throughout West Coast Community College.
All but Katherine discussed how the DSP could facilitate difficult dialogue.
Theresa’s hope was that the DSP would trigger thought regarding the issue of equity.
James recognized that some colleagues were reluctant to look at these types of issues
and believed the DSP could facilitate necessary discussions and interpretations about
what was found by disaggregating the data. When Richard first joined the DSP, he
was not aware of the importance of disaggregating data, but after seeing the results,
he became an advocate and was constantly pushing the agenda to disseminate,
college-wide, the findings of the DSP so that others would become aware of the
problems that existed for minority students. Robert realized that through the DSP
they could reflect on the disaggregated data, which gave him and the others
justification for initiating discussions about sensitive issues of race. He knew
this topic was politically charged, but saw how the DSP could become a buffer as the
DSP evidence team guided others through their discussions. He understood that
people neither wanted to discuss these issues nor usually had the opportunity to do
so, but he knew such discussions were necessary for the college.
Throughout the entire DSP, comments were made about how much all the
team members enjoyed working on the DSP and about their enthusiasm and
excitement as they did so. Theresa was predisposed to liking the scorecard because
she had already been to a balanced scorecard seminar and had already begun using it
before the DSP was implemented. She told the DSP CUE facilitators that the team
was enjoying participating in the project. Katherine was quiet, but the CUE
facilitators noted body language and actions demonstrating that she was engaged in
the process. James showed his enthusiasm through his nods
and enthusiastic responses. Richard was very enthusiastic about exploring the data he
presented and was eager and persistent in finding answers to questions asked and
willing to face the challenge of making the data easy to read while showing the
longitudinal life of the student. Robert turned out to be a good team leader who was
always prepared and took the time to study the materials. He was very organized and
enthusiastic about making changes at the college. Once he became the leader, the
CUE facilitators noted that he seemed more committed and empowered.
It is now time to compare and contrast the evidence team members’
awareness and interpretation. I was able to identify nine topics that demonstrated
either awareness or interpretation or both: (a) prepared versus unprepared students in
general, (b) prepared versus unprepared students with respect to ethnicity, (c)
understanding of socio-cultural obstacles, (d) awareness of institutional barriers, (e)
awareness that the college was not serving all students equitably, (f) awareness of
inequalities in educational outcomes by ethnicity, (g) adoption of an institutional
responsibility standpoint, (h) awareness of the lack of faculty diversity and its
consequences for minority students, and (i) awareness of the need to maintain a
vision of equity. Each evidence team member discussed his or her awareness of at
least one of these topics. I will now present a
comparison and contrast of how each fit into the categories.
All but Richard at some time discussed the topic of prepared versus
unprepared students in general. When data was presented by race and ethnicity,
Theresa focused more on the unprepared student in general than on ethnicity. Her
interpretation was that remedial students were not being socialized to the culture of
academe, but later in the meetings she showed awareness of a mentality that
traditionally blamed the victim, the student, too much for being unprepared. This
showed that she was beginning to soften her stance that the problem belonged to the
students rather than to the institution. Katherine, on the other hand, believed that all
students were performing equally poorly; although data shown by ethnicity revealed
inequities, her comments focused on all
students. James, at the start of the process, thought that many of the students were
unprepared for college, but by the second meeting he began to change and said that,
after looking at what the data had to show, he thought they needed to determine if
preparation was really an issue. Robert, on the other hand, never believed the issue
was one of students being unprepared. He understood it was the college’s
responsibility to place the students into the proper courses.
Only James and Robert discussed prepared versus unprepared students with
respect to ethnicity. Theresa’s interest appeared to be for all students since she
seemed to focus on the marketing capabilities of the DSP. Katherine, on the other
hand, believed that things were bad for all students regardless of ethnicity and never
showed a need to focus on the outcomes of specific groups. Richard never discussed
prepared versus unprepared students, so I cannot speak to his awareness of this
topic. By the end of his time with the DSP, James no longer spoke about the problem
of prepared versus unprepared students; instead, he showed that he had become
aware of multiple inequities in educational outcomes for both African American and
Hispanic students (discussed later). Robert emphasized that preparation was not a
problem without a solution; instead, he saw it as an institutional problem and implied
that it could be solved after interpreting the inequitable outcomes shown by the DSP
disaggregated data.
Theresa, James, and Robert each discussed their understanding of the socio-
cultural obstacles faced by students. During a discussion, Theresa told a story about a
student’s misunderstanding about “checks and balances” because the student kept
picturing balancing a checkbook. James discussed how he found that students whose
second language was English, but who spoke English well, were being placed into the
wrong types of English classes, and he realized this was an institutional problem, not a student
problem. Robert, on the other hand, discussed how, through his research, he had
found ethnic differences in how students were counseled, and how the way students
were advised might even change their aspirations. Neither Katherine nor Richard
ever commented on their
awareness of socio-cultural obstacles students may face.
Each evidence team member except Katherine showed an awareness of
institutional barriers students face. Through the process of the DSP, Theresa became
more outspoken about the barriers faced by students, but rather than directly discuss
ways to tear down the barriers, her focus was on how to engage others in the
conversation. James, on the other hand, stated that he knew they could learn more
about the types of barriers minority students were facing if they looked at data
disaggregated by ethnicity. Richard began to recognize that student placement into
English composition determined whether students took English in the first semester,
and he was concerned about why some students were getting through college-level
English the first time while others were not. Robert discussed the fact that, with
similar preparation, there was still a problem for minority students. He was trying to
understand why so few students transferred, and he emphasized that African
American students showed high aspirations even though most people believed they
did not. He interpreted this as an institutional problem, since teacher expectations are
most likely shaped by their view of students' aspirations.
Theresa, James, Richard, and Robert all became aware that the college was
not serving all students equitably. Theresa discussed the state of African American
students when she said that their findings were the most shocking and that it was one
group not being addressed. James on the other hand, understood that by looking at
consistent enrollment, GPA, and units attempted, they could find where the students
were falling out and compare that by ethnicity, implying he was expecting to see a
difference. Richard reflected on the fact that African American students who stated a
transfer goal rarely showed that they were in college English courses and interpreted
it as meaning that students were accessing services differently. Robert saw the problems
students were having as occurring at all levels within the institution.
Katherine was the only member who did not discuss an awareness of
inequalities in educational outcomes by ethnicity. Theresa saw a big difference
between White and African American students entering college level courses and
began asking more equity-oriented questions, although she never sought the answers.
James was very conscious of critical differences across racial and ethnic groups. He
noticed that there were more part-time Hispanic and African American students and
noticed that the African American students didn’t take the recommended placement
level courses. He pointed out that there were disparities in preparation and in the
number of attempted versus completed units by racial and ethnic groups and
emphasized the tremendous disparities that existed for GPA and successful transfer.
At the beginning of his time with the DSP, Richard showed no awareness of
inequalities at the college. After viewing the data, he began to discover them; he
began to realize that African American male students were the least successful, and
he found that White students were placing into college-level courses at higher rates
than every other group. He also found that the African American students were not
taking the courses into which they had placed.
From the beginning of his time with the DSP, Robert was already aware of existing
inequalities in educational outcomes for minority students. After looking at the data
being presented, he also developed new awareness. He saw that African American
students had high aspirations yet low performance. He realized that college level
English and college-level math indicators showed that African American students
did not persist and therefore did not move up. Since these students were not
successful even though assessment controlled for preparation, he interpreted the
problem as something happening in the classroom.
Theresa, James, and Robert each adopted an institutional responsibility
standpoint. One statement by Theresa showed me that she was now viewing
problems as an institutional responsibility to solve: she told the others that they
could not fix student preparation and that they all needed to ask what barriers were
in place for these students. James also showed that he was reflecting on students'
problems from an institutional responsibility standpoint when he asked if the
problem could be due to one subject and then added that he wanted to know if there
was an ethnicity effect. Robert was already aware of the
importance of institutional receptivity for African American students and reflected on
what could be happening institutionally that could cause the barriers shown by the
data.
Only Theresa and James ever directly discussed a lack of faculty diversity
and the consequences for minority students. Theresa demonstrated that she was
aware that this lack might have negative consequences for minority students, while
James only hinted at his awareness, asking questions about hiring practices and the
proportion of adjunct professors in order to drive conversations about diversity. The
other evidence team members did not discuss this facet of institutional receptivity.
Theresa, Katherine, and Robert were the evidence team members who
displayed their awareness of the need to maintain a vision of equity. In her last
meeting, Theresa told the CUE facilitators and the other evidence team members
about the importance she placed on the DSP, since it kept the initiative of equity on
the table and could be used in the college-wide action plan to target specific groups
in a proactive way. Katherine, concerned with the large proportion of students in
remedial math courses, stated that diversity had not been an issue in English or math
and that this was something they needed to consider.
CHAPTER FIVE: SUMMARY
A Memo to the President of My College
TO: President
FROM: Donna M. Post
SUBJECT: The Equity Scorecard
I am writing this memo to request that you consider the adoption of the
Equity Scorecard (previously called Diversity Scorecard Project) at our college. I
recommend adopting the Equity Scorecard Project for our college because I have
seen firsthand, through my dissertation research, how the Diversity Scorecard Project (DSP)
made an impact at one college (see section below: What I Learned). The DSP was
successfully used to help a team of professionals assess student outcomes and
become change agents whose goal was to create equity in educational outcomes for
underrepresented students at their college. Although my study of individual learning
among the DSP evidence team at one college is not generalizable, the implications I
gleaned from the study were invaluable to me as an educator.
The DSP at West Coast Community College was a success and would be an
excellent program to implement at our college or any college with a diverse student
body. The Equity Scorecard Project framework, which is based on the original DSP,
would provide our college the opportunity to create indicators for many of the
programs that already exist or are just being implemented. I have determined that the
following programs could benefit from the implementation of the Equity Scorecard
Project at our college:
• Accreditation
• Student Equity Plan
• College-wide Strategic Planning
• Program Review
• Student Learning Outcomes Program
• College-wide Technology Committee
Implementing the Equity Scorecard Project at our college involves at
least four steps. First, you will need to select a team whose members understand that
the intent of the Equity Scorecard Project is to institutionalize the process. Second, at
least one member of the team will need to be from the research department so that
data, disaggregated by race and ethnicity, will be readily available to all evidence
team members from the first day on. Third, financial costs should be considered. To
ensure the institutionalization of the program, a line item should be added to the
baseline budget. This should also reflect the cost of time spent by the faculty, staff, or
administrators on the evidence team. Fourth, our college should consider entering
into a partnership with the USC CUE developers in order to work directly with their
facilitators and participate in their partner seminars.
Institutionalization of the Equity Scorecard Project
The goal of the Equity Scorecard Project is to foster institutional change by
developing institutional accountability. Institutionalizing the process of action
learning provides an opportunity for our college and its individuals to become aware
of the educational inequities that exist for minority students, reflect upon and
interpret their new awareness with a focus on equity for all, and begin using the
language of equity. In my study, West Coast Community College’s implementation
of the DSP was a success because the evidence team members saw the need to
institutionalize the project from the beginning. They immediately worked toward
creating links to programs that had not yet found a framework and other programs
that would benefit from the addition of the DSP framework. It is therefore vital that
when our college begins the Equity Scorecard Project, the evidence team members
begin discussions regarding links to other programs and find other ways to
institutionalize the process.
Benefit of the Equity Scorecard Project to Other Programs
Two such programs that currently exist at our college are the accreditation
process and the Student Equity Plan. Since the main function of the Equity Scorecard
Project is to assess institutional effectiveness, the framework presented by the Equity
Scorecard Project would benefit both programs. This goal matches the student
outcomes focus of the Accreditation Commission for Community and Junior
Colleges, Western Association of Schools and Colleges, as well as the goals of the
state-mandated Student Equity Plan.
Another advantage of the Equity Scorecard Project lies in the development of our
strategic plan and the implementation of the college-wide academic program reviews.
Recognizing the needs of the college, the evidence team members at West Coast
Community College were able to disseminate the findings of the DSP disaggregated
data. In conjunction with the college-wide strategic planning and program review,
the evidence team members created dialogue within various units at the college
regarding the inequities found. This made others realize that inequitable educational
outcomes existed for minority students. Through these
dialogues, it was the hope of the evidence team members that others at the college
would also begin using the language of equity when discussing student learning and
student outcomes. The results of my study show that this process had begun.
In addition to the above programs, our college is currently in the process of
implementing a college-wide student learning outcomes project and the college-wide
technology committee is looking for ways to bring technology to the forefront of
student learning. By using the framework of the Equity Scorecard Project, the student
learning outcomes project would have a way of creating measurable indicators that
could move the focus from educational inequalities being a student problem to their
being an institutional responsibility, as was the case in my study. The framework
would also be beneficial to the technology committee since they need to find ways to
measure results of various uses of technology on campus. By using the Equity
Scorecard, they would be given the opportunity to find any inequities that exist for
minority students at our college.
Importance of Existing Data
The evidence team members of the Diversity Scorecard Project at West
Coast Community College did not begin discussions regarding race and ethnicity
until the second year of its implementation. One important reason was that the
college did not have clean data; it took one year before the data was ready to be
presented to the evidence team. If our college provides the evidence team with data,
disaggregated by race and ethnicity, at its first meeting, then we will create an
environment where action learning can commence immediately.
Criteria for the Selection of an Evidence Team
From my study, I concluded that the success of the Equity Scorecard Project
was dependent on the selection of the evidence team. There are three important
aspects that must be considered when choosing an evidence team: (a) the members
represent the diversity of the institution, (b) the team has the backing of the
president, and (c) the members hold prominent positions in college-wide programs.
First, you would want the team to represent the diversity of the college. The
team at West Coast Community College was originally all White. The results showed
that until a new minority member joined the evidence team, there were very few
discussions regarding race or ethnicity. With the introduction of Robert, an African
American, to the evidence team, there was finally an internal champion who brought
race and ethnicity to the forefront of dialogues. I came to realize that by creating an
atmosphere of diversity within the evidence team, the members would encompass
multiple perspectives and be more open to discussions about race and ethnicity from
the beginning.
Second, members of the evidence team at West Coast Community College noted
that programs that have the backing of the administration, especially the president,
tend to be sustained, and that it was very important to have the backing of the
president of the college. If the team is selected by the president, others at the college
will realize that the college is placing high importance on the findings of the project,
which in turn will help ensure the project's success.
Third, the prominence of the evidence team members is also very important.
Having an evidence team made up of colleagues who are in key positions within
other programs was instrumental to the project’s success at West Coast Community
College. The evidence team members in this study consisted of vice presidents of the
college, chairs of college-wide committees, and one member from the research
department who was very knowledgeable on the best methods for presenting data.
Therefore, the members chosen for our college should be members of a variety of
college-wide programs and should engage colleagues from both the academic and
administrative sides of the college, with at least one member from the institutional
research area.
Cost of Implementation
The cost of implementing the Equity Scorecard at this college is minimal, yet
as noted above, the benefits are significant. The main cost would be the time spent by
the members of the project disseminating the findings of the Equity Scorecard to
others at the college and then, as new individuals come to realize the disparities that
exist for underrepresented students, the time those individuals spend finding
solutions that could reduce these gaps. This would be time well spent, since the
overall effect for the college and students would be the reduction of educational
inequalities that exist at the college
for minority students. For faculty, release time should be provided. For
administrators or staff, enough time should be allotted from their current position.
The actual financial cost to the college would be minor over the four year
implementation of the project. The only expense incurred would be travel
reimbursement costs as the college change agents meet with other institutions to
coordinate their efforts with the Equity Scorecard Project. USC CUE creators are
very supportive of institutional partners and provide opportunities for these
partnering institutions to meet and discuss how other institutions have implemented
the ESP.
Those involved from the research department will not be required to create
new data, which could become costly. A main tenet of the Equity Scorecard Project
is that it does not require new data to be developed; the project uses existing data.
Any new data collection through surveys would be conducted through the linked
programs. Each of these linked programs has already been provided with a budget
for surveys; therefore, there would be no added expense to the college for research.
What I Learned
At the beginning of this study I discussed the importance of an interpretive
evaluation to assess the worth of a program upon its conclusion (see Chapter 1). The
reason this type of evaluation was chosen was to enable the stakeholders, the USC
CUE DSP developers, to determine if their program impacted learning for the
evidence team members at 1 of 14 institutions that partnered in the four-year
implementation of the DSP. After analyzing all the field notes and reports, I
determined that the DSP's action learning process impacted learning for the
individual members of the evidence team as well as for the team as a whole.
This information corroborated what the CUE developers had hypothesized as they
developed the Diversity Scorecard Project. My determination that learning among
evidence team members was impacted by the DSP was accomplished by answering
two research questions: (a) “How did the participants’ feelings and attitudes about
the DSP overall and about their own participation in particular change over the
course of the project?” and (b) “Did the participants’ awareness of and interpretation
about inequities in educational outcomes for minority students change over the
course of the project?”
The first research question provided an opportunity to determine the attitude
of each participant regarding their involvement with the DSP. I found that each of the
evidence team members began the DSP with a positive attitude that lasted through
the end of their time with the project. This positive attitude facilitated their learning:
they were excited about working on the project, willing to put in the time it required,
and, from the first day, looking for ways to institutionalize the DSP. They found the
project beneficial for ongoing program reviews as well as for accreditation and the
state-mandated Student Equity Plan.
The second research question dealt directly with individual learning. From
the results, I was able to conclude that the four evidence team members, who had not
begun the DSP with an equity frame of mind, changed from single-loop learners,
who either saw the problem as a student problem or felt the college was already
diverse, to double-loop learners, who adopted the attitude that the problems that
existed for minority students were the responsibility of the institution. Through that
process, those four DSP evidence team members became equity-minded change
agents who saw the need to disseminate the findings of the disaggregated data to
other programs at the college to reduce the educational inequities that existed for
underrepresented students at their college. The fifth member did not show signs of
change until the last meeting, where her statement made it apparent that she had the
potential to become a change agent.
From this evaluative study, I was able to determine that individual and team
learning did take place at West Coast Community College. The participants
developed new awareness of the educational inequalities that existed for minorities at
their college. With this new awareness, the members began to reflect upon these
findings and, through the reflective discourse discussed in transformative learning
(see Chapter 2), interpreted the problems from the standpoint of equity. The
individual’s frame of reference was important to transformative learning. Each
nested case study presented in this study showed that either the evidence team
members maintained their equity frame of mind or changed their cognitive frame
from one of diversity or deficit thinking to one of equity thinking. As a team, the
evidence team members created questions that would be asked during the College
Review Process and combined the findings of the DSP with the state-mandated
Student Equity Plan. With the understanding that organizations can learn (see
Chapter 2), the individuals became more aware of educational inequalities for
minority students, and it appeared that the institution may have begun to assume
responsibility for eliminating them. Behavioral theories of organizations stress the
importance of unlearning as a way to expand the range of potential behaviors
associated with learning. As the evidence team members became more aware of
educational inequities at their college, they began to question their existing
assumptions and to unlearn behaviors that were detrimental to their ability to reduce
the gaps found within the institution. In the process they changed from single-loop
learners to double-loop learners focused on changing the values within the
institution.
Conclusion
The Equity Scorecard Project would be an excellent program to implement at
our college since the findings of this study show that the Diversity Scorecard Project
at West Coast Community College positively impacted the lives of the evidence team
members. Learning did take place and the members learned to think with an equity
frame of mind. Although this study is not generalizable, its implications for practice
provide information that would help our college implement a successful Equity
Scorecard Project. Like West Coast Community College, our college has many
college-wide programs that focus on student learning and student outcomes. Many of
these do not yet have a guiding framework, or their focus is on students in general.
general. The Equity Scorecard Project would be extremely beneficial to our diverse
college to bring a focus on discrepancies that exist for underrepresented students.
Through the implementation of the Equity Scorecard Project at our college,
administrators, staff, and faculty would be provided with the opportunity to learn
how to think with an equity cognitive frame.
It is my conclusion that the DSP impacted individual learning among the
evidence team members as a direct result of their involvement and that the project
was, therefore, a success. With that determination in mind, I highly recommend that my college
implement the Equity Scorecard Project as soon as possible with the intent of
institutionalizing it throughout the various college-wide programs in order to reduce
any inequalities that exist in educational outcomes for minority students.
REFERENCES
Alexander, P. A., & Murphy, P. K. (1998). The research base for APA’s learner-
centered psychological principles. In N. M. Lambert & B. L. McCombs
(Eds.). How students learn: Reforming schools through learner-centered
education (pp. 25-60). Washington, DC: American Psychological
Association.
Argyris, C. (1977). Double loop learning in organizations. Harvard Business Review,
55(5), 115-125.
Argyris, C. (1991). Teaching smart people how to learn. Reflections, 4(2), 4-15.
Argyris, C., & Schön, D. A. (1996). Organizational learning II: Theory, method, and
practice. Reading, MA: Addison-Wesley Publishing Company.
Bensimon, E. M., Peña, E. V., & Castillo, C. (2006). The cognitive frames through
which institutional actors interpret inequality in educational outcomes among
black and Hispanic college students. Unpublished manuscript.
Bensimon, E. M. (2004a). Closing the achievement gap in higher education: An
organizational learning perspective. Unpublished material.
Bensimon, E. M. (2004b). The diversity scorecard: A learning approach to
institutional change. Change, 36(1), 44-52.
Bensimon, E. M. (2004, Fall). Message from the Director. CUE, Center for Urban
Education: The newsletter, 2(1), 2.
Bensimon, E. M. (2005). [Thematic group lectures and Diversity Scorecard Project
material]. Unpublished raw data.
Bensimon, E. M., Polkinghorne, D. E., Bauman, G. L., & Vallejo, E. (2004). Doing
research that makes a difference. The Journal of Higher Education, 75(1),
104-126.
Boyce, M. E. (2003). Organizational learning is essential to achieving and sustaining
change in higher education. Innovative Higher Education, 28, 119-136.
Creswell, J. W. (1998). Qualitative inquiry and research design: Choosing from five
traditions. Thousand Oaks, CA: Sage.
Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed
methods approaches (2nd ed.). Thousand Oaks, CA: Sage.
Curry, B. K. (1992). Instituting enduring innovations: Achieving continuity and
change in higher education. ASHE-ERIC Higher Education Report, 7.
Daft, R. L., & Huber, G. P. (1987). How organizations learn: A communication
framework. Research in the Sociology of Organizations, 5, 1-36.
Fuhrman, S. H. (1999). The new accountability (CPRE Policy Brief RB-27).
University of Pennsylvania: CPRE.
Huber, G. P. (1991). Organizational learning: The contributing processes and the
literatures. Organization Science, 2(1), 88-115.
Kaplan, R. S., & Norton, D. P. (1996). The balanced scorecard: Translating strategy
into action. Boston: Harvard Business School Press.
Kezar, A., & Eckel, P. (1999, April). Balancing the core strategies of institutional
transformation: Toward a “mobile” model of change. Paper presented at the
Annual Meeting of the American Educational Research Association,
Montreal, Quebec, Canada.
Kim, D. H. (1993). The link between individual and organizational learning. Sloan
Management Review, 35, 37-50.
Lorenzo, A. L., & LeCroy, N. A. (1994). A framework for fundamental change in the
community college. American Association of Community Colleges Journal,
64(4), 14-19.
Merriam, S. B. (1998). Qualitative research and case study applications in education
(2nd ed.). San Francisco: Jossey-Bass.
Merriam, S. B. (2004). The role of cognitive development in Mezirow’s
transformational learning theory. Adult Education Quarterly, 55(1), 60-68.
Mezirow, J. (2000). Learning to think like an adult: Core concepts of transformation
theory. In J. Mezirow & Associates (Eds.), Learning as transformation (pp.
3-34). San Francisco: Jossey-Bass.
O’Day, J. A. (2002). Complexity, accountability, and school improvement. Harvard
Educational Review, 72(3), 293-329.
O’Neil, H. F., Jr., Bensimon, E. M., Diamond, M. A., & Moore, M. R. (1999).
Designing and implementing an academic scorecard. Change, 31(6), 32-40.
Patton, M. Q. (2002). Qualitative research & evaluation methods (3rd ed.).
Thousand Oaks, CA: Sage.
Schunk, D. H. (2004). Learning theories: An educational perspective (4th ed.).
Upper Saddle River, NJ: Pearson Education.
Stringer, E. T. (1999). Action research (2nd ed.). Thousand Oaks, CA: Sage.
Yorks, L., & Marsick, V. J. (2000). Organizational learning and transformation. In J.
Mezirow & Associates (Eds.), Learning as transformation: Critical
perspectives on a theory in progress (pp. 253-281). San Francisco: Jossey-Bass.
APPENDIX A: CHRONOLOGICAL ORDER OF MEETINGS
Meeting # Date Meeting Purpose Special Notes
1 3/1/01
Introductory - binder passed out - Minutes
show they were to develop goals, measures,
and select groups to be given to CUE
facilitator by April 20th (Submit ~3 goals
and measures to CUE facilitator via email).
Noted that West Coast
Community College is an HSI
(Hispanic-Serving Institution)
2 5/29/01
Team started linking DSP w/ program
review, strategic planning, assessment
committee, and discussed the mandatory
Student Equity Plan
3 11/19/01 Focused on the three goals - not receptivity
4 1/11/02
More about finding out what West Coast
Community College is doing than helping
them. They are using DSP as a tool, not a
process.
5 3/22/02 Looking at types of data available
Richard & Robert are new to the
evidence team. Richard brought
a data packet he had prepared
before his initial meeting.
6 4/22/02 Looked at ways to show the data Theresa absent
7 5/13/02
Showing Richard the type of data needed for
the Pipeline envisioned by the committee &
finalized which cohort would be used.
8 6/3/02 Data party with emphasis on Pipeline
Although the seating chart
shows that Robert was present,
the notes do not have any
comments by or about him.
9 3/10/03 President's Meeting
Theresa introduced James who
presented the slides showing
disaggregated data.
10 7/22/03
Determining good groups to communicate
DSP
New CUE facilitator - Katherine
and Richard were absent.
Discussed WASC = Western
Association of Schools and
Colleges (Accreditation)
11 9/15/03
This meeting gave the evidence team a
chance to reflect where they were and where
they were going.
New CUE facilitator - Theresa
absent, but announced she was
leaving West Coast Community
College. Katherine and Robert
both late. Katherine came in last
after her class finished.
12 11/12/03
New CUE facilitator was asked to get
information on the College Review Process.
The evidence team asked for direction on
Stage II of DSP, but no information given.
Theresa's last meeting. Only
discussion was about the
College Review Process at West
Coast Community College.
13 4/5/04
Discussed how to disseminate DSP, but
mainly focused on the College Review
Process at West Coast Community College.
James' last meeting.
14 6/21/04 Sustaining and Communicating
Announced James left the
college. Robert is the new
leader.
15 10/4/04
Discuss College Review Process, President's
Report, & Presentations to various areas of
the college
APPENDIX B: SAMPLE STRUCTURE OF FIELD NOTES
West Coast Community College
Meeting Notes
7-22-03
James
Theresa
Robert
CUE facilitator 1
CUE facilitator 2
CUE facilitator 1 Robert
CUE facilitator 2
James Theresa
CUE facilitator 1 began the meeting by directing the team to their binders. She
reviewed its items (i.e. Quick View, President’s Report, etc.).
CUE facilitator 1 mentioned to the team that one of the directors for CUE and
another CUE facilitator were working on the benchmark workbook. She noted that a
focus of Phase II will be developing benchmarks for the Phase I reports.
[Meeting notes were taken until the end and after it concluded, the CUE facilitators
debriefed.]
Debrief:
CUE facilitator 2: I think it went really well. They all seem to be motivated to start
developing their plans to address their issues. And they have their focus which is
nice. So I think…. And they all agree on their focus, at least the people who were
there. So that was good to see. It just sounds like they need to get more their feet on
the ground and talk to different people about getting that together.
CUE facilitator 1: Right. Take the steps rather than just talking about it. I think it’s
good… I thought it was kind of cool that their area of focus sort of came out of the
discussion as, I don’t know, sort of organic as opposed to, “Well, this is what the
report said so this is what we should do.” You know like another institution we
worked with is a little bit more like that.
CUE facilitator 1: And this new College Review Process is still sort of under
development. So it’s not bizarre to introduce something else into the process. Yeah,
it’s good. It’s promising.
APPENDIX C: MAGNITUDE OF STATEMENTS
Table 1: Theresa: Magnitude of Evidence Team Member’s Statements
Discussion Topics - Major Categories
Meeting #
Students in
General
Minority
Students
DSP Pipeline
College
Programs
Other Total
1 4 3 14 1 8 2 32
2 1 1 5 0 3 1 11
3 3 0 1 0 3 2 9
4 3 2 19 0 6 1 31
5 7 1 5 3 1 1 18
6 ABS ABS ABS ABS ABS ABS ABS
7 4 1 5 9 0 4 23
8 0 2 12 3 2 5 24
9 4 2 9 0 3 2 20
10 1 1 5 0 13 2 22
11 ABS ABS ABS ABS ABS ABS ABS
12 0 1 7 0 13 3 24
13 N/A N/A N/A N/A N/A N/A N/A
14 N/A N/A N/A N/A N/A N/A N/A
15 N/A N/A N/A N/A N/A N/A N/A
Total 27 14 82 16 52 23 214
Percentage 12.62% 6.54% 38.32% 7.48% 24.30% 10.75% 100.00%
Specialized Topics
Meeting # Student Outcomes
Minority Student
Outcomes
DSP Cohort
College Review
Process
1 0 0 3 0
2 0 1 1 0
3 0 0 0 0
4 0 0 3 1
5 3 1 1 0
6 ABS ABS ABS ABS
7 2 1 2 0
8 0 1 9 0
9 2 1 1 0
10 0 0 1 2
11 ABS ABS ABS ABS
12 0 0 0 12
13 N/A N/A N/A N/A
14 N/A N/A N/A N/A
15 N/A N/A N/A N/A
Total 7 5 21 15
Percentage 3.27% 2.34% 9.81% 7.01%
% All Stmnts 25.93% 35.71% 25.61% 28.85%
Table 2: Katherine: Magnitude of Evidence Team Member’s Statements
Discussion Topics - Major Categories
Meeting #
Students in
General
Minority
Students
DSP Pipeline
College
Programs
Other Total
1 1 1 4 0 4 5 15
2 0 0 2 0 2 0 4
3 0 0 0 0 3 0 3
4 0 0 1 0 1 0 2
5 2 0 0 0 2 0 4
6 1 0 3 0 0 0 4
7 1 0 1 0 0 0 2
8 0 0 1 0 0 1 2
9 0 0 0 0 3 0 3
10 ABS ABS ABS ABS ABS ABS ABS
11 0 0 2 0 5 3 10
12 0 1 1 0 7 0 9
13 1 0 2 0 4 0 7
14 ABS ABS ABS ABS ABS ABS ABS
15 0 0 2 0 7 1 10
Total 6 2 19 0 38 10 75
Percentage 8.00% 2.67% 25.33% 0.00% 50.67% 13.33% 100.00%
Specialized Topics
Meeting #
Student
Outcomes
Minority Students
Outcomes
DSP Cohort
College Review
Process
1 0 0 1 0
2 0 0 0 2
3 0 0 0 0
4 0 0 0 0
5 2 0 0 0
6 0 0 0 0
7 1 0 0 0
8 0 0 0 0
9 0 0 0 0
10 ABS ABS ABS ABS
11 0 0 0 1
12 0 0 0 2
13 0 0 0 0
14 ABS ABS ABS ABS
15 0 0 0 4
Total 3 0 1 9
Percentage 4.00% 0.00% 1.33%
% All Stmnts 50.00% 0.00% 5.26%
Table 3: James: Magnitude of Evidence Team Member’s Statements
Discussion Topics - Major Categories
Meeting #
Students in
General
Minority
Students
DSP Pipeline
College
Programs
Other Total
1 3 0 12 0 2 3 20
2 4 0 4 0 9 0 17
3 1 0 3 1 2 0 7
4 1 2 8 0 7 0 18
5 1 2 5 0 0 0 8
6 0 1 18 3 0 0 22
7 0 0 3 8 0 1 12
8 1 1 9 6 0 2 19
9 9 12 4 0 2 0 27
10 0 0 5 0 1 0 6
11 1 0 6 0 22 0 29
12 0 0 0 0 21 0 21
13 1 0 6 0 19 0 26
14 N/A N/A N/A N/A N/A N/A N/A
15 N/A N/A N/A N/A N/A N/A N/A
Total 22 18 83 18 85 6 232
Percentage 9.48% 7.76% 35.78% 7.76% 36.64% 2.59% 100.00%
Specialized Topics
Meeting #
Student
Outcomes
Minority Students
Outcomes
DSP Cohort
College Review
Process
1 0 0 1 0
2 0 0 0 4
3 0 0 1 0
4 0 1 1 3
5 1 1 0 0
6 0 1 13 0
7 0 0 1 0
8 1 0 2 0
9 5 10 0 1
10 0 0 2 0
11 0 0 0 10
12 0 0 0 18
13 0 0 0 6
14 N/A N/A N/A N/A
15 N/A N/A N/A N/A
Total 7 13 21 42
Percentage 3.02% 5.60% 9.05% 18.10%
% All Stmnts 31.82% 72.22% 25.30% 49.41%
Table 4: Richard: Magnitude of Evidence Team Member’s Statements
Discussion Topics - Major Categories
Meeting #
Students in
General
Minority
Students
DSP Pipeline
College
Programs
Other Total
1 N/A N/A N/A N/A N/A N/A N/A
2 N/A N/A N/A N/A N/A N/A N/A
3 N/A N/A N/A N/A N/A N/A N/A
4 N/A N/A N/A N/A N/A N/A N/A
5 5 6 8 0 0 2 21
6 3 3 18 6 0 0 30
7 4 0 5 13 0 0 22
8 4 0 12 8 0 1 25
9 0 0 3 0 0 0 3
10 ABS ABS ABS ABS ABS ABS ABS
11 1 0 3 0 13 0 17
12 0 0 1 0 1 0 2
13 0 0 6 0 15 0 21
14 0 1 15 0 13 0 29
15 0 4 18 0 2 0 24
Total 17 14 89 27 44 3 194
Percentage 8.76% 7.22% 45.88% 13.92% 22.68% 1.55% 100.00%
Specialized Topics
Meeting #
Student
Outcomes
Minority Students
Outcomes
DSP Cohort
College Review
Process
1 N/A N/A N/A
2 N/A N/A N/A
3 N/A N/A N/A
4 N/A N/A N/A
5 5 3 5
6 2 1 14
7 4 0 2
8 2 0 9
9 0 0 0
10 ABS ABS ABS
11 1 0 1
12 0 0 0
13 0 0 0
14 0 1 0
15 0 3 0
Total 14 8 31
Percentage 7.22% 4.12% 15.98%
% All Stmnts 82.35% 57.14% 34.83%
Table 5: Robert: Magnitude of Evidence Team Member’s Statements
Discussion Topics - Major Categories
Meeting #
Students in
General
Minority
Students
DSP Pipeline
College
Programs
Other Total
1 N/A N/A N/A N/A N/A N/A N/A
2 N/A N/A N/A N/A N/A N/A N/A
3 N/A N/A N/A N/A N/A N/A N/A
4 N/A N/A N/A N/A N/A N/A N/A
5 3 1 5 0 1 3 13
6 5 3 7 7 0 0 22
7 1 0 3 6 1 0 11
8 ABS ABS ABS ABS ABS ABS ABS
9 3 2 0 0 2 0 7
10 0 1 2 2 5 0 10
11 1 0 0 0 9 0 10
12 ABS ABS ABS ABS ABS ABS ABS
13 1 0 3 0 10 0 14
14 0 0 5 0 12 2 19
15 0 0 6 0 10 0 16
Total 14 7 31 15 50 5 122
Percentage 11.48% 5.74% 25.41% 12.30% 40.98% 4.10% 100.00%
Specialized Topics
Meeting #
Student
Outcomes
Minority Students
Outcomes
DSP Cohort
College Review
Process
1 N/A N/A N/A N/A
2 N/A N/A N/A N/A
3 N/A N/A N/A N/A
4 N/A N/A N/A N/A
5 2 0 4
6 5 3 7
7 1 0 0
8 ABS ABS ABS ABS
9
10 0 0 1
11 0 0 0
12 ABS ABS ABS ABS
13 1 0 0
14 0 0 0
15 0 0 0
Total 9 3 12
Percentage 7.38% 2.46% 9.84%
% All Stmnts 64.29% 42.86% 38.71%
APPENDIX D: COMPARISON OF EVIDENCE TEAM MEMBERS
Table 1: Evidence Team Members – Feelings & Attitudes
Positive Attitude
Theresa Katherine James Richard Robert
Beneficial to other
Programs
√ √ √ √ √
Shared scorecard with
other college constituencies
√ √ √ √
Usefulness of the
framework
√
Usefulness of the
disaggregated data to
emphasize disparities by
ethnicity
√ √ √ √ √
Perspectives could
mobilize the college into
action
√ √ √
Facilitate difficult dialogue
√ √ √ √
Enjoyment / Enthusiasm /
Excitement
√ √ √ √ √
Table 2: Evidence Team Members – Awareness
Theresa Katherine James Richard Robert
Prepared vs. Unprepared
Students in general
√ √ √ √
Prepared vs. Unprepared
Students with respect to
ethnicity
√ √
Understood socio-cultural
obstacles
√ √ √
Awareness of Institutional
Barriers
√ √ √ √
Aware college not serving all
students equitably
√ √ √ √
Aware of inequalities in
educational outcomes by
ethnicity
√ √ √ √
Adopted institutional
responsibility standpoint
√ √ √
Aware of lack of faculty
diversity & consequences for
minority students
√ √
Aware of the need to maintain
a vision of equity
√ √
Used language of equity
√ √ √
ABSTRACT
The Diversity Scorecard Project evaluated in this study was created by the University of Southern California's Center for Urban Education in order to foster an institutional awareness of inequities in educational outcomes that exist for underrepresented students. It was the hope of the creators of the Diversity Scorecard Project that, through action research, the practitioners-as-researchers would develop into equity oriented thinkers and become change agents at the institution. I conducted this evaluation as a naturalistic inquiry to determine the impact of individual learning among the evidence team members at one of fourteen institutions that participated in the Diversity Scorecard Project over a span of four years. I present five nested case studies, one story for each evidence team member, then present a comparison of the impact the Diversity Scorecard Project had for all the individuals. As a result of this study, I determined that, after looking at the institutional data that had been disaggregated by race and ethnicity, learning had taken place and the evidence team members had become change agents for their institution.