MINDING THE GAP:
AN EVALUATION OF FACULTY, STAFF AND ADMINISTRATOR READINESS
TO CLOSE EQUITY GAPS AT THE CALIFORNIA STATE UNIVERSITY
by
Jeff Gold
______________________________________________________________________________
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
December 2020
Copyright 2020 Jeff Gold
Acknowledgements
The academic journey of pursuing my doctoral degree has not been a solitary venture. I
have been fortunate to enjoy immense support from numerous professors, colleagues, friends and
family members to whom I owe a debt of gratitude. First, I would like to thank my wife, Tracy,
for her patience, understanding and good humor. Although you undoubtedly grew tired of my
weekly laments of having homework—for the next three years—you were always there with
words of encouragement and a smile. If it offers any consolation, after almost a decade of
marriage, you will finally be able to say you married a doctor. I love you very much!
My journey would never have begun were it not for the caring guidance of my boss and
mentor, Dr. Loren Blanchard. Although it took me a couple years to warm to the idea of pursuing
my doctorate, your gracious and prescient encouragement convinced me that this was a step I
needed to take. I cannot tell you how much I appreciate your continued support and mentorship
along with your consistent reminders to take time to “smell the roses” and appreciate life’s many
blessings. This is certainly one of those moments!
I am also extremely grateful to my dissertation committee. Given my study’s focus on
equity, having the opportunity to learn from a diverse group of talented, selfless leaders has been
invaluable. First, a big thank you to my Chair, Dr. Bryant Adibe, who served as captain of the
ship. I value the autonomy you granted me to navigate the process, and deeply appreciate the
encouragement you provided along the way. I am also indebted to Dr. Maria Ott, who saved the
day by helping me solve a few challenges with the IRB process. Your caring and supportive
nature exemplifies what it means to be a servant leader, and your commitment to student success
is a source of inspiration. And finally, I would like to acknowledge Dr. Desdemona Cardoza,
who has taught me so much. On the technical side, you showed me how to test for
multicollinearity and explained how to use the Bonferroni correction to mitigate errors of
multiple comparisons. More importantly, though, you are an amazing leader, friend and
colleague, and I am incredibly grateful for everything you do.
Finally, I would like to acknowledge how challenging it has been to muster the requisite
motivation to complete my dissertation during a prolonged global health crisis. Despite the
formidable amount of weekly reading, I do not recall any empirical references in the scholarly
literature to evidence-based strategies for persisting to a doctoral degree during a pandemic.
Fortunately, though, I learned this lesson from my amazing cohort 11 colleagues, who provided
camaraderie and words of encouragement to help me cross the finish line. I am grateful for your
support.
Table of Contents
Acknowledgements.........................................................................................................................ii
List of Tables.................................................................................................................................vii
List of Figures...............................................................................................................................viii
Abstract...........................................................................................................................................ix
CHAPTER ONE: INTRODUCTION ............................................................................................. 1
Introduction of the Problem of Practice ...................................................................................... 1
Organizational Context and Mission ........................................................................................... 2
Importance of Addressing the Problem ....................................................................................... 2
Organizational Performance Goal ............................................................................................... 3
Description of Stakeholder Groups ............................................................................................. 4
Stakeholder Group of Focus ........................................................................................................ 4
Stakeholder Goal ......................................................................................................................... 5
Equity Gaps in the CSU .............................................................................................................. 7
Purpose of the Project and Questions .......................................................................................... 9
Definitions ................................................................................................................................. 10
CHAPTER TWO: REVIEW OF THE LITERATURE ................................................................ 12
Equity Gaps in Public Universities ........................................................................................... 12
History and Prevalence .......................................................................................................... 13
Demographic Landscape ....................................................................................................... 14
Root Causes of Equity Gaps...................................................................................................... 15
First-Generation Status .......................................................................................................... 16
Socio-Economic Factors ........................................................................................................ 17
Academic Preparation ............................................................................................................ 18
Psychological Challenges ...................................................................................................... 19
Institutional Factors ............................................................................................................... 20
Student-Focused Approaches to Closing Equity Gaps ............................................................. 21
Self-Affirmation Interventions .............................................................................................. 21
Targeted and Enhanced Student Support Services ................................................................ 22
University-wide Approaches to Closing Equity Gaps .............................................................. 24
Implicit Bias Training ............................................................................................................ 24
Professional Development to Promote Equity ....................................................................... 25
Commitment to a Diverse Professoriate ................................................................................ 26
Data-Driven Decision Making for Equitable Student Outcomes .......................................... 27
The Clark and Estes (2008) Gap Analytic Conceptual Framework .......................................... 28
Knowledge, Motivation and Organizational (KMO) Influences .............................................. 29
Knowledge Influences ........................................................................................................... 29
Motivation Influences ............................................................................................................ 35
Organizational Influences ...................................................................................................... 40
Conceptual Framework ............................................................................................................. 46
Conclusion ................................................................................................................................. 49
CHAPTER THREE: RESEARCH METHODS ........................................................................... 51
Participating Stakeholders ......................................................................................................... 51
Survey Sampling Strategy and Rationale .................................................................................. 52
Survey Instrumentation and Statistical Analyses ...................................................................... 52
Survey Instrumentation.......................................................................................................... 54
Statistical Analyses ................................................................................................................ 55
Validity and Reliability ............................................................................................................. 57
Ethics ......................................................................................................................................... 59
CHAPTER FOUR: RESULTS AND FINDINGS ........................................................................ 62
Participating Stakeholders ......................................................................................................... 63
Construct Validity ..................................................................................................................... 65
Results and Discussion .............................................................................................................. 70
Knowledge Influences ........................................................................................................... 70
Motivation Influences ............................................................................................................ 72
Organizational Influences ...................................................................................................... 74
Research Question Findings and Discussion ............................................................................ 77
Research Question 1 .............................................................................................................. 78
Research Question 2 .............................................................................................................. 80
Research Question 3 .............................................................................................................. 83
Summary of Findings ................................................................................................................ 85
CHAPTER FIVE: SOLUTIONS AND RECOMMENDATIONS ............................................... 88
Knowledge Recommendations .............................................................................................. 88
Motivation Recommendations ............................................................................................... 93
Organization Recommendations............................................................................................ 96
Integrated Implementation and Evaluation Plan ..................................................................... 102
Organizational Purpose, Need and Expectations ................................................................. 103
Level 4: Results and Leading Indicators ............................................................................. 103
Level 3: Behavior ................................................................................................................ 104
Level 2: Learning ................................................................................................................. 107
Level 1: Reaction ................................................................................................................. 110
Evaluation Tools .................................................................................................................. 111
Data Analysis and Reporting ............................................................................................... 112
Summary .............................................................................................................................. 115
Limitations and Delimitations ................................................................................................. 115
Recommendations for Future Research .................................................................................. 117
Conclusion ............................................................................................................................... 118
REFERENCES ........................................................................................................................... 120
APPENDICES ............................................................................................................................ 142
Appendix A: CSU Certificate Program in Student Success Analytics Brochure ......................... 141
Appendix B: Student Success Analytics Pre-Program Survey .............................................. 142
Appendix C: Evaluation for Use Immediately Following the Opening Session .................... 149
Appendix D: Evaluation Instrument for Use Four Weeks after the Closing Session ............ 150
List of Tables
Table 1: Organizational Mission, Global Goal and Stakeholder Performance Goals .................... 6
Table 2: CSU Fall 2018 Enrollment by Ethnicity and Freshman 6-Year Graduation Rates ........... 7
Table 3: Knowledge Influences and Knowledge Types ............................................................... 34
Table 4: Motivational Influences .................................................................................................. 39
Table 5: Organizational Influences ............................................................................................... 45
Table 6: Survey Instruments and Methods ................................................................................... 53
Table 7: Factor Scores for Knowledge Survey Questions ............................................................ 67
Table 8: Factor Scores for Motivation Survey Questions ............................................................. 68
Table 9: Factor Scores for Organization Survey Questions.......................................................... 69
Table 10: Standardized Coefficients for Linear Regression of Knowledge and Motivation ........ 79
Table 11: Standardized Coefficients for Linear Regression of Cultural Models and Settings ..... 82
Table 12: Logistic Regression Output of Faculty Job Role and Higher Education Experience ... 84
Table 13: Prioritization of Knowledge, Motivation and Organizational Influences .................... 86
Table 14: Summary of Knowledge Influences and Recommendations ........................................ 89
Table 15: Summary of Motivation Influences and Recommendations ........................................ 93
Table 16: Summary of Organization Influences and Recommendations ..................................... 97
Table 17: Outcomes, Metrics and Methods for External and Internal Outcomes ...................... 103
Table 18: Critical Behaviors, Metrics, Methods and Timing for Evaluation ............................. 105
Table 19: Required Drivers to Support Critical Behaviors ......................................................... 106
Table 20: Evaluation of the Components of Learning for the Program ..................................... 109
Table 21: Components to Measure Reactions to the Program .................................................... 110
List of Figures
Figure 1: Conceptual Framework ................................................................................................. 47
Figure 2: Survey Respondent Demographic Information ............................................................. 64
Figure 3: Confirmatory Factor Analysis Question Mapping to KMO Constructs........................ 66
Figure 4: Knowledge Survey Question Responses ....................................................................... 71
Figure 5: Motivation Survey Question Responses........................................................................ 73
Figure 6: Organization Survey Question Responses .................................................................... 75
Figure 7: CSU Student Success Dashboard - Faculty Grading Pattern Trend Analysis ............. 114
Abstract
Equity gaps in the degree attainment rates of historically underserved college students and their
more privileged peers are prevalent across the U.S. higher education landscape. Addressing these
discrepancies requires an institution-wide commitment to data-driven, equity-minded practice.
As part of a larger effort to eliminate equity gaps by the year 2025, the California State
University (CSU) developed the Certificate Program in Student Success Analytics, an equity-
focused professional development program for faculty, staff and administrators. This study
evaluated the readiness of CSU program participants to interrogate student data and apply their
insights to improve equity in their practice. Guided by the Clark and Estes (2008) gap analysis
framework, linear and logistic regressions were performed to identify the knowledge, motivation
and organizational influences impeding stakeholders from becoming more data-driven and
equity-minded practitioners. The quantitative analyses revealed that participants’ propensity to
apply data to their practice was most strongly predicted by their level of procedural and
metacognitive knowledge. Campus resources were also shown to be critical factors in supporting
more equitable practices. Finally, faculty were determined to be less likely than their
administrator and staff colleagues to apply student equity data to improve their practice. These
findings laid the foundation for the development of recommended enhancements to the
certificate program with the goal of equipping all participants with the requisite skills,
motivation and support to regularly apply data-driven insights to improve equity in their practice.
Keywords: culture of evidence, data-informed decision making, diversity, equity gap, graduation
rates, higher education, inclusion, professional development, students of color.
CHAPTER ONE: INTRODUCTION
Introduction of the Problem of Practice
Higher education plays an essential role in nurturing the intellectual and professional
development of students while enriching the lives of those who earn a baccalaureate degree.
These positive outcomes are not distributed equally at universities across the nation; a
disproportionately low number of Black and Latinx students attain a college degree (de Brey et
al., 2019) compared to their White counterparts. This inequity is most evident at public
universities, which collectively educate more than two-thirds of all baccalaureate-seeking
students, including a vast majority of enrolled students of color (Bowen et al., 2009). At these
institutions, the discrepancy in college completion rates between White students and their Latinx
and Black peers is pronounced; in 2016, 64% of White students graduated within six years,
compared to 54% of Latinx and 40% of Black students (National Center for Education Statistics
Postsecondary Graduation Rates, 2019). This stark inequity is problematic because earning a
college degree confers a host of lasting benefits to students, their families and the communities in
which they live. College graduates earn more, are less likely to be unemployed, report higher
levels of happiness (Lumina Foundation, 2015), and live an average of five years longer than
non-college graduates (National Center for Health Statistics, 2012). The evidence suggests that
despite recent increases in completion rates for students of all backgrounds, public universities
have not made progress toward closing graduation rate gaps for Black and Latinx students
(Education Trust, 2015; Musu-Gillette et al., 2017; Libassi, 2018). Public four-year universities
need to improve college degree attainment rates for growing populations of Black and Latinx
students to ensure all collegegoers, irrespective of ethnic background or prior circumstance, have
an equitable opportunity to prosper intellectually, professionally and economically.
Organizational Context and Mission
The California State University (CSU) provides affordable, high-quality educational
opportunities for approximately 500,000 students on 23 campuses. As one of the largest and
most ethnically and racially diverse public university systems in the world, the CSU employs
over 52,000 faculty and staff throughout California (California State University, 2020). More
than half of CSU students are students of color, and over one-third are the first in their families
to attend college (California State University, 2020). With over 125,000 annual graduates, the
CSU is a leading contributor to economic and social mobility for the State of California,
empowering students to address tomorrow’s challenges and preparing them for an enriching and
prosperous future.
Importance of Addressing the Problem
Solving the challenge of disproportionately low college graduation rates for Black and
Latinx students would afford lasting benefits to these individuals and their communities.
Achieving equity in college degree attainment would increase the lifetime earnings of students of
color, improve the national economy by adding skilled employees to meet economic workforce
needs and promote greater social mobility for underserved populations across the nation
(Harmon, 2012). Over the coming decades, Black and Latinx students are projected to enroll in
postsecondary institutions in greater proportions than any other ethnic or racial group (U.S.
Census Bureau, 2017; The College Board, 2008). While attending college is a positive first step,
completion matters. From 2013 through 2015, if Black and Latinx students had earned a college
degree at the same rate as White students, universities would have conferred an additional one
million college degrees (Libassi, 2018). Failing to address the discrepancy in graduation rates
will severely limit social mobility for students of color, perpetuate racial inequality and
jeopardize the national economy (Koropeckyj et al., 2017). To reinforce democratic principles
and ensure a vibrant future, public universities need to solve the problem of unequal completion
rates for Black and Latinx students.
Organizational Performance Goal
The CSU seeks to ensure that all students have the opportunity to earn a high-quality
college degree in a timely manner and in accordance with their personal goals (CSU Graduation
Initiative 2025 System Plan, 2018). The institution is driven by the commitment to better serve
students by equipping them with the skills, knowledge and experiences to foster a future of
success and wellbeing (CSU Graduation Initiative 2025, 2018). Guided by this vision, in 2016
the CSU launched its signature strategic program, Graduation Initiative 2025, and established
ambitious graduation rate and equity targets for all campuses. These goals provide a framework
for the CSU to eliminate equity gaps, or discrepancies in graduation rates between students of
color and their White and Asian peers, and low-income students and their higher-income
counterparts. The equity gap goals were set in the summer of 2016, along with a series of
graduation rate targets for each campus and the system. A cross-representative advisory
committee consisting of faculty, students, staff and administrators developed the methodology
for the construction of the goals (CSU Graduation Initiative 2025 System Plan, 2018). The
committee reviewed data from both internal and external sources, consulted national research
and identified peer benchmark groups to establish baseline degree attainment rates for all
campuses (CSU Graduation Initiative 2025 System Plan, 2018). To measure annual progress, the
CSU developed reporting structures based on campus persistence and completion rate data,
including disaggregated performance metrics for underserved populations. Nationally, no public
comprehensive university system has closed equity gaps and achieved increases to completion
rates at levels consistent with the CSU’s new goals (CSU Graduation Initiative 2025 System
Plan, 2018). Attaining these outcomes would set unprecedented standards for equity and timely
degree completion at the nation’s largest and most diverse public four-year institution.
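To illustrate what the disaggregated persistence and completion metrics described above might look like in practice, the short Python sketch below groups a small, entirely hypothetical student-level extract by ethnicity and Pell eligibility and computes six-year graduation rates for each subgroup. The column names and values are illustrative placeholders, not the CSU's actual reporting structures.

    # Hypothetical student-level extract; illustrative only, not CSU data.
    import pandas as pd

    students = pd.DataFrame({
        "ethnicity": ["Latinx", "Latinx", "White", "White", "Black", "Black"],
        "pell_eligible": [True, False, False, True, True, False],
        "graduated_in_6_years": [1, 1, 1, 0, 0, 1],
    })

    # Disaggregated six-year graduation rates (percent) by ethnicity and Pell eligibility.
    rates = (
        students.groupby(["ethnicity", "pell_eligible"])["graduated_in_6_years"]
        .mean()
        .mul(100)
        .round(1)
    )
    print(rates)

The output is a small table of subgroup graduation rates, the basic form that disaggregated performance reporting takes.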
Description of Stakeholder Groups
Involving a variety of campus stakeholders in the equity imperative is critical to closing
achievement gaps (Bensimon, 2018). Within the CSU, faculty will make significant
contributions to eliminating equity gaps by renewing their focus on equity-minded pedagogy and
amending their practice to be more supportive of underserved populations. Campus deans will
provide leadership to meet the goals by motivating faculty and staff, identifying courses with
inequitable outcomes, devising interventions, measuring progress and holding campus
constituents accountable for achieving the desired outcomes. And finally, a more targeted group
of stakeholders, comprised of CSU campus leadership teams participating in a three-month
systemwide professional development program focused on promoting equitable student
outcomes, will contribute to meeting the goals by applying data to become more equity-minded
in their practice.
Stakeholder Group of Focus
The CSU’s goal of equity in degree attainment can only be achieved through the
combined efforts of multiple stakeholders. Given the practical constraints of conducting a
comprehensive analysis of all stakeholders, this study focuses exclusively on CSU campus
leadership teams participating in the CSU Certificate Program in Student Success Analytics, a
systemwide professional development course that imparts training for participants to utilize data
to identify and ameliorate equity challenges. The program, commencing in late January 2020,
consisted of 274 participants, including leadership teams of 10-15 faculty, staff and
administrators from twenty CSU campuses, and three teams from other universities across the
nation. The specific composition of each leadership team was determined at the discretion of the
campus president. A more comprehensive overview of the recruitment and selection process for
the program is provided in Chapter 3. For the purpose of this study, stakeholders were limited to
members of the CSU teams, enabling an evaluation of the program’s impact on eliciting equity-
minded learning transfer and behavioral changes aimed at ameliorating equity gaps across the
CSU (Kirkpatrick, 2006).
Stakeholder Goal
This study focused on the readiness of CSU program participants to analyze and apply
student equity data to improve their practice. The datasets introduced in the professional
development program were accessible via systemwide dashboards and campus institutional
research reports and websites (CSU Certificate Program in Student Success Analytics, 2020).
They included a variety of disaggregated analyses such as persistence and graduation rate gap
reports, course GPA gap studies and predictive models that identify when equity gaps are most
likely to occur. Evaluating the readiness of stakeholders to meet their goal of using data to
improve their equity-minded practice is critical. If the program fails to help participants leverage
data to promote equity in their work, the CSU will continue to experience unequal outcomes for
its diverse student body, jeopardizing the institution’s ability to meet its goal of eliminating
equity gaps by 2025.
Table 1 presents a summary of the CSU’s mission, global goal and stakeholder
performance goals.
Table 1
Organizational Mission, Global Goal and Stakeholder Performance Goals

Organizational Mission: The mission of the California State University is to provide high-quality and affordable undergraduate and graduate learning experiences to a diverse student body to meet the workforce needs of the State of California and prepare students for a lifetime of achievement.

Organizational Performance Goal: In support of the CSU mission, by the year 2025, all CSU campuses will eliminate equity gaps, or discrepancies in graduation rates between students of color and their White and Asian peers.

CSU Faculty: By December 2020 all CSU faculty will attend at least one equity-minded pedagogy professional development session.

CSU Deans: By December 2020 all CSU college deans will develop an action plan for eliminating GPA gaps in courses taught by faculty in their colleges.

Certificate Program Leadership Teams: By December 2020 all CSU participants in the Certificate Program will regularly analyze and apply student equity data to improve their practice.
Equity Gaps in the CSU
Within the CSU, discrepancies in degree attainment rates for students from different
ethnic groups are pronounced. Table 2 shows the ethnic distribution of the CSU student
population in fall 2018 and indicates the 6-year graduation rates by ethnicity for freshman
students who first enrolled at a CSU campus in 2012 (CSU Fact Book, 2019; CSU Institutional
Research and Analyses, 2020).
Table 2
CSU Fall 2018 Enrollment by Ethnicity and 2012-2018 Freshman 6-Year Graduation Rates

Ethnic Group           % of Students    6-Year Graduation Rate
Latinx                 41.5%            56.3%
White                  23.0%            68.7%
Asian                  15.9%            66.2%
Non-US Resident        6.4%             57.9%
Unknown                4.6%             60.5%
Two or More Races      4.4%             61.2%
African American       4.0%             48.4%
American Indian        0.2%             50.0%

As Table 2 indicates, Latinx students graduate at rates that are 12 percentage points lower than
their White peers, while the rates for African American students are approximately 20 percentage
points lower. These disparities have persisted over time (CSU Institutional Research and
Analyses, 2020), and have led the CSU to launch a variety of policies and programs focused on
closing the achievement gap.
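To make the gap arithmetic above explicit, the short Python sketch below recomputes each group's percentage-point gap relative to the White six-year graduation rate using the Table 2 values. It is an illustrative calculation only, not part of the study's analysis code.

    # Six-year graduation rates (percent) from Table 2.
    grad_rates = {
        "Latinx": 56.3, "White": 68.7, "Asian": 66.2, "Non-US Resident": 57.9,
        "Unknown": 60.5, "Two or More Races": 61.2, "African American": 48.4,
        "American Indian": 50.0,
    }

    white_rate = grad_rates["White"]
    for group, rate in grad_rates.items():
        gap = white_rate - rate  # percentage-point difference relative to White students
        print(f"{group}: {rate:.1f}% graduation rate, {gap:.1f}-point gap vs. White students")

Running the sketch reproduces the figures cited above: a gap of roughly 12 points for Latinx students and roughly 20 points for African American students.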
In fall 2018, the CSU implemented a new academic preparation policy, eliminating
remedial math and English courses throughout the system (CSU Academic Preparation, 2020).
As a result of this policy, incoming freshmen in need of additional academic support were no
longer required to complete non-credit courses and were instead provided opportunities to enroll
in baccalaureate-level classes with enhanced instructional support. Students of color were
disproportionately affected by this policy change, as Latinx and Black students typically
comprised approximately 60% of CSU remedial course enrollments (CSU Academic
Preparation, 2020). While it is too early to assess the policy’s impact on closing the degree
completion gap, one year after its implementation the policy has brought about an eightfold
increase in the number of students who successfully completed the requisite baccalaureate-level
math course in their first year (Watanabe, 2019).
In addition to pursuing more equitable academic policies, in 2018 the CSU developed a
professional development program focused on cultivating organizational improvement at the
“intersection of evidence and equity” called the Certificate Program in Student Success Analytics
(CSU Certificate Program in Student Success Analytics, 2020). Utilizing systemwide and
campus-specific data tools, the program consists of a learning community comprised of teams of
campus leaders who share the collective goal of adopting and continually assessing student
equity practices on their respective campuses. The curriculum includes eight sessions—two face-
to-face convenings and six interactive web conferences—that take place over a three-month
period during the spring. Throughout the program, participants interrogate student data, consult
with national experts and complete an action research project to apply their learning to solve a
salient equity challenge on their campus (CSU Certificate Program in Student Success
Analytics, 2020). A copy of the program brochure is provided in Appendix A.
Welcoming its third cohort of students in 2020, the certificate program has scaled
rapidly; participants increased from 41 in 2018 to 124 in 2019 to 274 in 2020 (CSU
Certificate Program in Student Success Analytics, 2020). To better understand the strengths and
weaknesses of the program, the researcher hired an external evaluator to assess program
outcomes for the 2018 and 2019 cohorts. The evaluations were largely derived from qualitative
analyses and included interviews and focus groups with a select number of participants. While
this information led to significant programmatic changes including a more structured action
research project and additional team planning time, the evaluations did not assess whether there were
statistically significant changes in participants’ readiness to apply insights from the program to increase
equity in their practice. Focusing on the 2020 cohort, this study employs quantitative methods to
improve upon previous qualitative evaluations of the certificate program’s impact on
stakeholders’ ability to meet their goals.
Purpose of the Project and Questions
The purpose of this project is to evaluate the readiness of CSU faculty, staff and
administrator participants in the Certificate Program in Student Success Analytics to derive
insight from student data and apply this new knowledge to improve equity in their practice. The
analysis focuses on knowledge, motivation and organizational influences related to achieving
this goal. While a complete evaluation would include all stakeholders, for practical purposes, this
study offers a narrower analysis of campus leadership teams that participated in the 2020
professional development program. The following questions guide the evaluation study:
1. What are participants’ knowledge and motivation influences related to using student
equity data to improve their practice?
2. What is the influence of the organization’s cultural models and cultural settings on
stakeholder knowledge and motivation?
3. How do stakeholders’ job role and level of experience relate to their stated propensity to
apply student data to improve their practice?
The answers to these questions, coupled with information culled from the scholarly literature, will
inform the development of a series of recommendations and an evaluation plan for increasing the
likelihood that certificate program participants apply equity data to improve their practice.
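As a rough sketch of the kind of model the third research question implies, the Python example below fits a logistic regression of a binary "regularly applies equity data" indicator on job role and years of experience, using simulated data. The variable names, the simulated values and the assumed effect direction (faculty somewhat less likely to apply data, mirroring the finding reported in the abstract) are illustrative assumptions, not the study's actual survey instrument or results.

    # Simulated stand-in for the survey data; illustrative only.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 200
    job_role = rng.choice(["faculty", "staff", "administrator"], size=n)
    years_experience = rng.integers(1, 30, size=n)

    # Assumed data-generating process: faculty slightly less likely to apply equity data.
    log_odds = -0.2 - 0.8 * (job_role == "faculty") + 0.02 * years_experience
    applies_data = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

    df = pd.DataFrame({
        "applies_data": applies_data,
        "job_role": job_role,
        "years_experience": years_experience,
    })

    # Logistic regression mirroring research question 3: propensity by role and experience.
    model = smf.logit(
        "applies_data ~ C(job_role, Treatment('staff')) + years_experience", data=df
    ).fit(disp=False)
    print(model.summary())

In the actual study, the model specification, sample and coding of the outcome follow the survey instrumentation and statistical analyses described in Chapter Three.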
Definitions
The following definitions apply to this study:
• Cultural Capital – the accumulation of skills, behaviors and cultural knowledge that
influence social mobility and power
• Equity – the educational principle of fairness through which the strategic
implementation of policies and practices promote equality of outcomes
• Equity Gap (Achievement Gap, Opportunity Gap) – the disparity in educational
outcomes between various groups of underserved students and their peers. For the
purposes of this study, the equity gap refers to differences in the six-year graduation
rates between first-time Latinx and Black students and their White and Asian peers.
• First-Generation Student – students who are the first in their family to attend college
• Implicit Bias – the unconscious association of negative stereotypes with a specific
group of people, often affecting the way a person thinks about and acts toward
individuals from that group
• Impostor Syndrome – a psychological condition in which individuals feel a sense of
inadequacy and unworthiness of their success despite their prior accomplishments.
• KMO – the knowledge, motivation and organizational influences underpinning Clark
and Estes’ (2008) gap analysis framework
• Latinx – gender-neutral term used to refer to students from Latino/Latina and/or
Hispanic backgrounds
• Pell Grant – federal subsidy available to low-income students to help pay for college
• Persistence Rate – the percentage of a given cohort of students who returned to
college the following term or graduated
• Remediation – non-baccalaureate level courses in English and/or mathematics
developed for college students who require additional academic preparation
• Stereotype Threat – a psychological condition in which an individual’s performance
is affected by negative stereotypes associated with his/her race, gender or ethnicity
• Students of Color – for the purpose of this study, “students of color” refers to Latinx
and Black students
CHAPTER TWO: REVIEW OF THE LITERATURE
This literature review examines the prevalence of equity gaps in graduation rates between
students of color and their White and Asian peers at public universities across the United States.
The review begins by defining equity gaps and providing a research-based overview of the
pervasiveness of these gaps in higher education. This section is followed by an overview of the
root causes of equity gaps, detailing how various student characteristics including first-
generation status, socio-economic background, academic preparation and psychological
challenges impose significant barriers that impede students of color from graduating. The review
then moves to solutions, synthesizing approaches to closing gaps through equity-minded
policies, equity-driven professional development programs, targeted student support services and
the adoption of data-driven decision making for improving student outcomes. The chapter
concludes with the application of Clark and Estes’ (2008) Gap Analytic Conceptual Framework
to address knowledge, motivation and organizational (KMO) influences associated with the
ability of participants in the CSU Certificate Program in Student Success Analytics to interrogate
data and apply their findings to increase equitable outcomes for their students.
Equity Gaps in Public Universities
Over the past 50 years, differences in college graduation rates, commonly referred to as
“equity gaps,” have widened in most public U.S. universities (Duncan & Murnane, 2011; Fiske
& Markus, 2012). While minority student enrollment has increased, notably fewer students of
color earn a college degree than their White peers (Carnevale & Strohl, 2013; Santos &
Haycock, 2016; Perna & Finney, 2014). These persistent equity gaps result from a variety of
interconnected influences, including (a) first-generation status (Cataldi et al., 2018; Pascarella et
al., 2004), (b) socio-economic factors (Hora, Bouwma-Gearhart, & Hyoung, 2016; Rothstein,
2004; Santos & Haycock, 2016), (c) poor academic preparation (Jimenez et al., 2017; Adelman,
2004; Conger et al., 1996) and (d) psychological challenges (Borman et al., 2016; Spitzer &
Aronson, 2015; Yeager et al., 2016; Stephens et al., 2015). Over the past two decades, an
increasing number of scholars have begun to analyze these influences, identifying the critical
role that equity-minded programs, policies and student support structures play in eliminating
ethnicity-based differences in college degree attainment rates (Dowd, 2005; Bensimon 2014;
Education Trust, 2015). A common thread among these analyses is a clear sense of purpose and
intentionality to ensure all students have an equitable opportunity to fulfill their academic goals.
History and Prevalence
Over the past half century, equity gaps have pervaded the U.S. higher education
landscape, leading to pronounced disparities in educational outcomes for students of different
races, ethnicities, genders and socio-economic statuses (Downey & Condron, 2016). In response
to the Civil Rights Act of 1964, the U.S. Department of Health, Education, and Welfare
documented the pernicious effects of equity gaps by commissioning researcher James Coleman
to write the Equality of Educational Opportunity report (Downey & Condron, 2016; Coleman, 1968). This
seminal, mixed methods study combined interviews of high school principals with an analysis of
student test scores, disaggregated by race, gender, ethnicity and socio-economic status (Coleman,
1968). The report provided a national stage for publicizing pronounced inequities in college
attendance and led to the establishment of affirmative action policies that stimulated a
proliferation of minority student enrollment in college (Massey et al., 2003). Although these
efforts succeeded in narrowing the access gap for college attendance, they had little impact on
alleviating equity gaps in college completion rates (Massey et al., 2003). From 1971 to 2000,
college graduation rates increased at a much slower rate for minorities than for their White peers
(Bauman et al., 2005). During this thirty-year period, students of color were far less likely to
complete a college degree than their White counterparts; Black and Latinx degree completion
rose by approximately seven percentage points, while for White students, the increase was
fifteen percentage points (Bauman et al., 2005). Despite the fact that universities were enrolling a
more ethnically diverse student body, once these students arrived on campus, they were far less
likely to graduate than their White counterparts.
In recent years, college degree completion rates have improved for all students (Eberle-
Sudre et al., 2015). At the same time, however, completion rates for students of color have not
risen faster than those of their White peers, and as a result, equity gaps have persisted
(Eberle-Sudre et al., 2015). An analysis of 2010-2016 college completion data revealed that
discrepancies in graduation rates by race and ethnicity remained constant during this six-year
period; Black and Latinx students were consistently one-third to half as likely to earn a college
degree as their White counterparts (Whistle & Hiler, 2018). These gaps are commonplace at
universities across the U.S. In 2016, roughly 50% of the White adult population held a
community college or university degree, whereas less than a third of Black adults and
approximately a quarter of Latinx adults held similar credentials (Jones & Berger, 2019).
These incongruities in completion rates have transpired during a period of significant
demographic change in communities across the country.
Demographic Landscape
The U.S. is becoming a more diverse nation with each passing year (U.S. Census Bureau,
2017). Growth in the U.S. college-going pipeline is forecasted to emanate primarily from Latinx
communities within the next thirty years (Cárdenas & Treuhaft, 2013), and by 2044, people of
color are estimated to be the majority (U.S. Census Bureau, 2017). In the State of California,
increases in diversity have occurred at an even more rapid pace (National Center for Educational
Statistics, 2019). Reber and Kalogrides (2018) noted that in California, Latinx students have
accounted for a rising majority of public-school enrollment since the mid-2000s, a trend they
forecast to continue well into the future. This growth is not limited to K-12 schools, as California
Latinx students are also attending college in greater numbers than ever (US Census Bureau
American Community Survey, 2017). During the 2017-2018 academic year, 1.3 million Latinx
students enrolled in California colleges and universities, representing over 40% of total
enrollments (US Census Bureau American Community Survey, 2017). The sharp increase in the
population of college going Latinx students requires public universities to better understand why
the degree attainment rates for these students have consistently lagged those of their White peers.
Root Causes of Equity Gaps
Equity gaps in completion rates arise from a variety of impediments and structural
barriers that undermine the ability of students of color to graduate from college. First-generation
students face challenges navigating the university landscape without the guidance of parents for
whom these experiences are familiar (Gibbons et al., 2019). Similarly, low-income students are
often required to balance work commitments with academic pursuits, commonly working
multiple jobs to pay for tuition and cover basic food and housing needs (Carnevale & Smith,
2018). Students of color are also more likely to attend under-resourced high schools, with
diminished opportunities to enroll in challenging courses that prepare them for the academic
rigors of a university education (Carnevale, et al., 2018). Additionally, minority students are
often exposed to negative psychological influences that deliver tacit and overt messages
signaling that members of their ethnic groups do not belong in college (Yeager et al., 2016).
Finally, the culture at higher education institutions is frequently unwelcoming for students of
color, making it difficult for them to foster meaningful connections to the campus (McNair et al.,
2016). Acknowledging that the limited scope of this review cannot fully address the complex and
multivariate root causes of equity gaps, the following section synthesizes relevant scholarly
literature in an exploration of the aforementioned recurring themes that contribute to inequitable
degree attainment rates.
First-Generation Status
Students of color are more likely than their White peers to be first-generation (Cataldi et
al., 2018). In 2016, while 28% of White students self-identified as first-generation, the
percentages for Black (42%) and Latinx (48%) students were notably higher (Cataldi et al.,
2018). Lacking familial knowledge of the protocols and structures associated with attending
college, first-generation students face challenges maneuvering through the college admissions,
financial aid and academic and co-curricular environments (Gibbons et al., 2019; Collier &
Morgan, 2008). First-generation students are unable to benefit from their parents’ college-going
wisdom to guide them through new experiences such as determining an appropriate course load,
declaring a major or attending office hours.
First-generation students comprise approximately one-third of public university
enrollments (National Center for Educational Statistics, 2017). These students are typically at
greater risk of leaving college before earning a degree because they are unfamiliar with
university norms, are more likely to be academically underprepared and have a greater tendency
to work full-time while enrolled (Adams et al., 2016; Chen & Carroll, 2005). Analyzing six-year
graduation rates at public universities, Cataldi et al. (2018) found that first-generation students
were 30-35% less likely to graduate than their continuing generation peers. The struggles these
students face are often compounded by financial challenges that require them to work long hours
and prevent them from enjoying a cohesive social and academic experience during their college-
going years.
Socio-Economic Factors
Socioeconomic influences including health, lifestyle and financial capacity deprive many
low-income students of the resources they need to succeed in college; however, scholars disagree
about the overall influence these factors have on closing the equity gap. In public universities,
race and economic status are often interconnected, as evidenced by the fact that Black and Latinx
students are more than twice as likely as their White and Asian peers to come from families in
the lowest 20% of income brackets (Hora et al., 2016). Studies have sought to disentangle
financial status from race to determine which of the two variables is more likely to impact the
equity gap (Rothstein, 2004; Santos & Haycock, 2016). In a quantitative analysis using logistic
regression to control for a variety of high school and college student characteristics, Rothstein
(2004) asserted that college completion differences across racial lines are primarily due to
economic hardships, leading him to conclude that financially-based interventions hold the most
promise for closing equity gaps. In a more narrowly focused quantitative analysis of public
university graduation rate data, Santos and Haycock (2016) examined college completion gaps
between both African American students and their White peers, and low-income students and
their higher-income counterparts, finding that racial gaps proved to be more salient than
economic disparities. Despite these opposing conclusions, both groups of scholars agree that
socioeconomic status has a marked impact on equity gaps. Students’ high school experiences,
and more specifically, the extent to which they are academically prepared for college, also
contribute to unequal college graduation rates.
Academic Preparation
Black and Latinx students are over-represented in remedial college courses, which has a
deleterious impact on their ability to earn a degree (Vandal, 2016). In a quantitative analysis of
Integrated Postsecondary Education Data System (IPEDS) data, Jimenez et al. (2016) determined
that 55% of African American students and 45% of Latinx students typically enroll in remedial
college courses, compared with 35% of White students. Remedial courses cover high school-
level academic concepts and do not confer college credit toward graduation. These courses are
also more likely to be taught by part-time faculty who are less experienced in the classroom and
less familiar with the university mission (Ott & Dippold, 2018). Unsurprisingly, students who
enroll in remedial courses graduate at much lower rates than their peers. In 2000, 39% of
remedial students earned a bachelor’s degree within six years, compared to 69% of their more
academically prepared colleagues (Adelman, 2004). A more recent report by Complete College
America indicated that remedial students attending community colleges graduated at about half
the rate of their more academically prepared peers (Jones, 2015). These outcomes suggest that
what happens in high school matters, and that students who enter college academically prepared
are much more likely to graduate.
Students of color are further challenged to attain the knowledge and skills they need to
succeed in college because the high schools they attend generally offer fewer college preparatory
courses than high schools with a predominantly White student body (Conger et al., 1996). In a
study of race, gender and poverty gaps at Florida high schools, White students were three times
more likely to enroll in an Advanced Placement (AP) math course than Black students. This
enrollment discrepancy was largely due to the dearth of AP courses at high schools serving large
populations of Black students (Conger et al., 1996). The racial inequities in high school AP
course-taking patterns mirror those that exist in college remedial courses, directly contributing to
the disproportionately low college graduation rates for Black and Latinx students. Beyond
academic preparation, psychological challenges also impede the ability of students of color to
earn a baccalaureate degree.
Psychological Challenges
The presence of negative social and psychological factors, especially those related to
stereotype threat, contribute to graduation equity gaps for many Black and Latinx students
(Steele & Aronson, 1995). For low-income and minority students, stereotype threat is the
internalization of negative biases associated with their financial and/or racial statuses, leading
them to underperform in college (Isenberg, 2016). Steele and Aronson (1995) determined that
even subtle reminders that a person belongs to a group stereotyped as academically inferior can
negatively impact test performance. Their study indicated that when Black students were
required to identify their race before taking the Graduate Record Examination, they performed
notably worse than when they were not asked to declare their racial identity (Steele & Aronson,
1995).
Similar to stereotype threat, impostor syndrome is a psychological phenomenon that leads
students to doubt that they have the aptitude to attend college and worry that they will be
exposed as a fraud (Langford & Clance, 1993). Several studies have analyzed the challenges that
impostor syndrome poses to the mental health and emotional wellbeing of college students of
color (Barnshaw & Dunietz, 2015; Clance, 1985; Cokley et al., 2013). Although incidents of
impostor syndrome can affect students of all races, Cokley et al. (2013) reported that Black
students were most likely to be afflicted and experience related feelings of stress, anxiety and
disengagement. These negative sentiments can be heightened by unwelcoming institutional
factors that students of color encounter on campus.
Institutional Factors
In addition to first-generation, socio-economic, academic and psychological challenges,
equity gaps are exacerbated by institutional cultural factors that impede the ability of public
universities to meet the needs of underserved students. McNair et al. (2016) averred that to
support students of color, universities must become “student-ready” by committing to diversity,
understanding the barriers that marginalized students face and delivering academic support such
as tutoring to all learners. “Student-ready” universities do not accentuate deficits, but rather
embrace the institution’s responsibility to promote inclusivity and improve academic excellence
and wellbeing for underserved students (McNair et al., 2016). To become “student-ready,”
universities need to dismantle cultural barriers that prevent minority students from succeeding on
their campus.
The literature also suggests that structural and institutionalized racism at universities
contributes to the prevalence of equity gaps on campus (Richards, 2018; McClain & Perry,
2017). Richards (2018) argued that universities must stop using numerical diversity to justify a
commitment to inclusivity, and instead assess the extent to which minority students, faculty and
staff feel welcome. McClain and Perry (2017) concurred with the need for colleges to holistically
assess institutional racism, asserting that to close equity gaps, universities must confront their
historical legacies of exclusion and promote inclusiveness through purposeful actions. The
researchers argued that coordinated interventions and programming must be targeted to help
students of color overcome the numerous cultural barriers to their success (McClain & Perry,
2017).
Student-Focused Approaches to Closing Equity Gaps
The process of promoting equity throughout the institution begins with a sustained
commitment to better serve students of color, both inside and outside of the classroom. By
identifying and analyzing the root causes of equity gaps in degree attainment, faculty, staff and
administrators can begin to implement solutions that provide underserved students with more
equitable opportunities to earn a college degree (Bensimon, 2018; Dowd & Liera, 2018). Among
solutions cited in the literature, self-affirmation interventions—including culturally uplifting
articles, writing assignments and videos—have been shown to promote a greater sense of
belonging for Black and Latinx college students (Yeager et al., 2016; Borman et al., 2016).
Delivering targeted and enhanced support services, such as tutoring and advising, has also
proven to promote more equitable academic outcomes (Association of American Colleges and
Universities, 2018; McNair et al., 2016). The following section will explore these two student-
focused approaches to eliminating equity gaps.
Self-Affirmation Interventions
Minority students’ negative perceptions of their ability to succeed in college are
malleable, affording university leaders the opportunity to mitigate psychological challenges
associated with stereotype threat and impostor syndrome (Steele, 1988; Steele & Aronson,
1995). Claude Steele (1988) pioneered self-affirmation intervention research, identifying
solutions that supply students of color with targeted messages that positively reinforce the way
they think about themselves and their ability to succeed in college. These interventions can be as
basic as an empowering writing assignment or a self-promoting video (Yeager et al., 2016). In a
series of double-blind experiments conducted at a public flagship university, researchers found
that first-year minority students who received affirming messages buttressing the notion that they
would be successful in college were 4-10% more likely to remain enrolled by the end of their
first year (Yeager et al., 2016). Borman et al. (2016) reported equally positive results in a
randomized field trial, demonstrating that middle school students who received self-affirmation
treatments had higher grade point averages than similar students who did not receive the
interventions. Additionally, Spitzer and Aronson (2015) chronicled several examples in which
extending self-affirmation messaging to minority college students helped improve their self-
confidence, decrease stereotype threat and increase their likelihood of graduating. These studies
impart convincing evidence that public universities can begin to narrow equity gaps by
implementing thoughtfully designed interventions that nurture minority students’ sense of
belonging and boost their self-confidence.
Targeted and Enhanced Student Support Services
While addressing psychological challenges can contribute to closing equity gaps by
strengthening a sense of belonging for students of color, offering tailored support services is
another strategy for assisting these students in their pursuit of a college degree. Recent research
affirms the value of faculty and administrator intentionality in the development of support
services targeted to specific populations of underserved students (Harackiewicz & Priniski, 2018;
McNair et al., 2016; Association of American Colleges and Universities, 2018). Harackiewicz
and Priniski (2018) put forth a conceptual model for developing interventions for underserved
populations of students as part of a strategic roadmap for closing equity gaps. Noting the limited
availability of college support services, the researchers argued that rather than giving assistance
to all students, administrators should strategically deploy their support interventions to meet the
needs of students who would benefit the most from heightened assistance (Harackiewicz &
Priniski, 2018).
Analyzing targeted interventions across the U.S., the Association of American Colleges
and Universities (2018) detailed several examples of how first-year experience programs with
intentional services for students of color can improve the persistence rates of these students.
Berríos-Allison (2011) conducted similar research on career counseling services tailored to
Latinx students, comparing retention rates between Latinx college students who participated in
culturally derived career counseling with retention rates of Latinx students who did not receive
these services. The results indicated that Latinx students who participated in the program were
retained to the second year at a rate of 93%, while the rate for non-participating Latinx students
was 77%.
Similar types of targeted student support services, such as minority mentorships, furnish
Black and Latinx students with opportunities to build the social and academic capital to
successfully navigate college. These programs can transform the academic trajectories of
students by creating affirming support structures that nurture a deeper connection to campus
(Forbes & Klevan, 2018). Campos et al. (2018) reported the positive impact that a University of
Texas peer mentoring program designed for Latinx high school students and incoming freshmen
had on these students’ self-confidence. Latinx students who participated in the peer mentor
program reported higher levels of emotional development and academic engagement than their
non-participating Latinx peers (Campos et al., 2018). Closing equity gaps necessitates that
degree attainment rates for students of color increase faster than those of their peers. This
outcome becomes more achievable when Black and Latinx students are afforded
tailored and enhanced support services (Association of American Colleges and Universities,
2018), and when their universities commit to adopting equity-minded practices throughout the
institution (Dowd & Bensimon, 2015).
University-wide Approaches to Closing Equity Gaps
Equity-minded practitioners take a systemic approach to improving educational outcomes
for minority students by questioning the university’s support structures and pledging to alter
programs, policies and practices to foster a more just college experience (Bensimon, 2018).
Research conducted by the Education Trust (2015) indicated that equity gaps at institutions with
a cross-campus commitment to supporting underserved students are often markedly lower than
gaps at peer universities serving similar types of students. This section of the review synthesizes
the scholarly literature and documents ways that implicit bias training, equity-minded
professional development, a diverse professoriate and data-driven decision making contribute to
more equitable degree attainment rates.
Implicit Bias Training
Eliminating equity gaps in college completion requires faculty, staff and administrators to
assess their implicit biases, or involuntary associations of stereotype-affirming thoughts related
to people of certain social groups (Moody, 2004). These hidden beliefs are pervasive and affect
individuals’ perceptions of others based solely on characteristics such as race, gender, ethnicity,
religion, age, and/or appearance. Implicit biases can be identified and addressed through a
variety of instruments including social-psychological self-assessments such as the Implicit
Association Test (IAT) (Greenwald et al., 1998; Kayes, 2006; Mandelbaum, 2016). The IAT is
an online association test that measures the extent to which a person holds unconscious, negative
perceptions along racial lines (Greenwald et al., 1998). The test requires participants to evaluate
photographic images of people with various skin tones, thereby uncovering preferences and
hidden assumptions based solely on skin color (Greenwald et al., 1998). At the conclusion of the
test, respondents are presented with their results along with a list of suggestions for addressing
their biases.
Recognizing the IAT’s value in promoting personal awareness and equity, several
universities have developed implicit bias training to help faculty, staff and administrators
overcome their prejudices (Ohio State University Implicit Bias Module Series, 2019; Vanderbilt
Office for Diversity, Equity and Inclusion, 2019; University of Oregon Implicit Bias Workshop,
2019). Employing a randomized-controlled study, Cox and Devine (2019) demonstrated that
implicit biases can be overcome through sustained training that assists university practitioners in
breaking the unconscious thinking patterns from which these biases emanate. Campuses can
further demonstrate their commitment to rectifying inequitable student outcomes by offering
equity-minded professional development programs that equip stakeholders with tools to better
support underserved students.
Professional Development to Promote Equity
Professional development programs designed to promote equity-mindedness can create a
supportive forum within which practitioners gain an understanding of existing inequities,
question their assumptions, promote an asset-based view of diversity and reflect on opportunities
to improve their practice (Costino, 2018; Rose & Issa, 2018). Costino (2018) asserted that the
success of equity-minded professional development hinges on the ability to engage participants
in reflection on their own power and privilege while channeling discussions toward an asset-
based view of the contributions of students of color. Similarly, Ngounou and Gutierrez (2017)
claimed that effective equity-based professional development necessitates that faculty and staff
feel uncomfortable as they question longstanding beliefs about their professional obligation to
better support equitable outcomes.
Equity-minded professional development can also be targeted to the classroom setting.
Siwatu et al. (2011) determined that faculty self-efficacy for teaching students of color was
highly correlated with the success of these students in the classroom. Their study exemplifies a
strategy for building faculty efficacy for instructing students of color through professional
development that models culturally responsive teaching and provides exposure to diverse role
models. Rose and Issa (2018) emphasized the value of a systemic commitment to equity-minded
professional development to empower practitioners with strategies for removing barriers to
graduation for students of color. While universities can promote equity-minded training through
professional development, a more sustained effort is required to diversify the faculty ranks.
Commitment to a Diverse Professoriate
An ethnically diverse professoriate has been tied to improved academic outcomes for
students of color (Stout et al., 2018; Campbell et al., 2013), but few universities have succeeded
in recruiting and retaining significant numbers of minority faculty (Finkelstein et al., 2016). In
2014, there were more than seven White instructors for every Latinx or African American
faculty member at U.S. public universities (U.S. Department of Education, 2016). These
numbers are consistent with a longitudinal pattern that points to a salient and longstanding gap
between the ethnic composition of faculty and the students they teach (Finkelstein et al., 2016).
The presence of a diverse professoriate has notable implications for the likelihood that
students of color will attain a college degree (Campbell et al., 2013). In a quantitative analysis of
Integrated Postsecondary Education Data System (IPEDS) data, Stout et al. (2018) found that
graduation rates of Latinx and African American students were most closely correlated with the
percentage of their university’s faculty who were of the same ethnicity. This discovery indicates
that minority students were more likely to earn a college degree if they attended universities with
faculty who shared their demographic background.
The pursuit of faculty diversity is not a new endeavor. Over the past twenty-five years,
college campuses have implemented a variety of initiatives, with the goal of hiring and retaining
increased numbers of faculty of color (Astin, 1993; Turner et al., 2008; Sgoutas-Emch et al.,
2016). Despite these efforts, a recent study of federal data from 2013 to 2017 found negligible
increases in the growth of tenured Black and Latinx faculty at public master’s universities
(Vásquez et al., 2019). Several studies have suggested that universities can hire and retain faculty
of color by making a public commitment to diversify the faculty ranks, implementing
recruitment strategies that produce diverse pools of candidates and promoting meaningful
opportunities to connect newly hired minority faculty to the university community (Kelly et al.,
2017; Abdul-Raheem, 2016; Turner et al., 2008). In addition to altering the racial and ethnic
composition of the professoriate, universities can further promote equitable degree attainment by
facilitating the strategic use of disaggregated data among campus stakeholders.
Data-Driven Decision Making for Equitable Student Outcomes
Analyzing student data offers an evidence-based mechanism for university stakeholders
to assess the prevalence and severity of equity gaps on campus. Disaggregating institutional data
by race is an effective mechanism for uncovering inequities in access, opportunity and success
that can be quantified and addressed by campus leaders (Dowd & Bensimon, 2015). Universities
cultivate evidence-based and equity-minded decision making by promoting the regular use of
disaggregated data throughout the institution. Colleges that do not use data to understand the
nature of student inequities often resort to implementing “off-the-shelf” solutions that attempt to
support all students, but ultimately fall short of meeting the needs of the most underserved
student populations (Dowd & Bensimon, 2015). Myers and Finnigan (2018) reinforced these
assertions, arguing that data related to student equity must be used to build trust by promoting
difficult conversations about improving outcomes for students of color.
Data-driven decision making can also contribute to the development of more equitable
university policies and practices. Dowd and Liera (2018) detailed a case study of data-driven
organizational change efforts to address inequitable practices at a predominantly white
institution. Using disaggregated data, the university amended longstanding policies related to
enrollment management, admissions, curriculum and assessment to increase opportunities for
underserved students to earn a college degree (Dowd & Liera, 2018). By embracing data-driven
decision making throughout the institution, university leaders can improve performance by better
aligning their strategies with the goal of eliminating equity gaps.
The Clark and Estes (2008) Gap Analytic Conceptual Framework
The process of identifying and addressing root causes for varying levels of organizational
performance can provide a foundation for driving improvement. Clark and Estes (2008)
established a structured framework for evaluating gaps between an institution’s current
performance and its stated goals through an analysis of three primary influences of performance
gaps: knowledge, motivation and organizational (KMO). In this model, knowledge influences are
explored by determining the extent to which an institution’s constituents possess the requisite
expertise to accomplish their goals (Clark & Estes, 2008). The framework also addresses
stakeholders’ inward drive, or the motivational disposition affecting their beliefs and attitudes
toward accomplishing their goals (Clark & Estes, 2008). These factors offer an indication of the
tendency of stakeholders to address a given task, persist as challenges arise and commit the
needed mental effort to attain their goal (Mayer, 2011; Rueda, 2011). Enriching the analysis of
stakeholder knowledge and motivation influences, the framework incorporates an organizational
focus, measuring the extent to which an institution’s structures, policies, resources and culture
impact stakeholders’ ability to achieve their goals (Clark & Estes, 2008).
Knowledge, Motivation and Organizational (KMO) Influences
Aligned with the Clark and Estes (2008) framework, this section identifies the KMO
influences that impact the ability of participants of the CSU Certificate Program in Student
Success Analytics to consult student data to guide them in mitigating equity gaps. The section
begins with an exploration of gaps in stakeholders’ conceptual, procedural and metacognitive
knowledge related to the strategic use of student-level data. The review then describes how
motivational influences, including goal orientation and self-efficacy, impact certificate program
participants. Finally, the study introduces organizational factors, including the value the
institution ascribes to data-driven decision-making and its commitment to sustained professional
development, and contextualizes these influences with respect to certificate program
participants’ ability to achieve their goal.
Knowledge Influences
Certificate program participants need to master a variety of skills and knowledge domains
to meet their goal. These knowledge types—declarative, procedural and metacognitive—vary by
application, complexity and structure, and facilitate stakeholder accomplishment of strategically-
critical tasks (Baker, 2006). Declarative knowledge, consisting of both factual and conceptual
topics, addresses information related to inherent structures and their interconnectedness (Aguinis
& Kraiger, 2009). Factual knowledge includes relevant definitions, terminology, taxonomies and
hierarchical structures that stakeholders need to understand to improve performance. Factual
knowledge can be structured to build conceptual knowledge, which requires an appreciation of
the interconnectedness of disparate areas of information through models that create a
representative framework for practical application (Krathwohl, 2002). Factual and conceptual
knowledge address questions related to what things are and how they fit together, but this type of
information fails to stipulate the specific steps that need to be taken in sequence to accomplish a
goal. Stakeholders must also attain procedural knowledge of the skills, methods and techniques
for how to accomplish a specific task in order to take action to meet their goal (Krathwohl, 2002;
Rueda, 2011). The final knowledge type, metacognitive knowledge, refers to the process by
which stakeholders internalize information and gain personal awareness of their learning along
with the ability to control their cognitive processing (Baker, 2006; Krathwohl, 2002; Mayer,
2011). Independent of the subject matter, metacognitive learners are adept at contemplating their
own thoughts, while engaging in reflective activities that increase their self-awareness and
augment their ability to regulate the learning process (Mayer, 2011). The facility with which
stakeholders acquire and apply declarative, procedural and metacognitive knowledge has a direct
impact on their ability to achieve their performance goals (Clark & Estes, 2008).
The following section identifies three specific knowledge influences that are required for
certificate program participants to successfully leverage data to improve their equity-minded
practice. Derived from a review of the literature, each knowledge type will be explored using
Clark and Estes’ (2008) gap analysis methodology and will include an indication of why the
knowledge is needed to promote goal attainment.
Understand how the strategic use of data can influence students’ educational
outcomes. The first knowledge influence that certificate program participants need to meet their
performance goal is an understanding of how the strategic use of data can influence educational
outcomes for students. This knowledge is conceptual, as it encompasses the relationship between
analyzing data and taking action to improve student outcomes (Krathwohl, 2002). With an
increased focus on accountability, colleges and universities have become more evidence-driven
in their analyses of student persistence and completion data (Marsh et al., 2006). For many years
administrators have routinely compiled graduation rate figures to comply with basic federal and
state reporting requirements, but campus data systems now have the capacity to engage a broader
cross-section of faculty, staff and administrators with the strategic use of data (Marsh et al.,
2006). Within the CSU system, many campus departments have employed “early warning”
software programs that rely on algorithms to identify students who are struggling in the early
weeks of an academic term, and alert advisors about the need to intervene (Cal State LA Math
Early Alert, 2019; CSU Fullerton Titanium Engagement, 2019; Fresno State Support Net, 2019).
These student performance indicators are actionable and strategic in that they provide an
opportunity for campuses to pinpoint when interventions are required and concentrate support on
students who need timely help. Mandinach (2012) underscored the necessity for educators to
openly explore opportunities for accessing actionable datasets to amend their pedagogy in ways
that measurably improve student learning. Brownell and Tanner (2012) concurred with this
assertion. In their study of life science faculty, they found that instructors who were able to
articulate the connection between data and improved student outcomes were far more willing to
make changes to their practice (Brownell & Tanner, 2012).
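To make the mechanics of these early-alert indicators concrete, the brief sketch below illustrates one way a simple flagging rule might be expressed in Python; the data file, column names and thresholds are hypothetical illustrations and do not describe the logic of the Cal State LA, Fullerton or Fresno State tools cited above.

```python
import pandas as pd

# Hypothetical week-three snapshot of course activity; all column names are illustrative.
# Expected columns: student_id, missed_sessions, assignments_submitted, current_score
snapshot = pd.read_csv("week3_snapshot.csv")

# A simple, transparent flagging rule: students who have missed several class sessions,
# submitted few assignments, or hold a low running score are flagged for advisor outreach
# before the first major assessment.
flagged = snapshot[
    (snapshot["missed_sessions"] >= 3)
    | (snapshot["assignments_submitted"] <= 1)
    | (snapshot["current_score"] < 70)
]

# Advisors receive a short, actionable list rather than a raw data extract.
print(flagged[["student_id", "missed_sessions", "assignments_submitted", "current_score"]])
```

The value of such indicators lies less in the sophistication of the rule than in its timing: flags surface early enough in the term for advisors to intervene.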
Use disaggregated data to identify equity issues. The second knowledge influence that
certificate program participants need to meet their performance goal is an understanding of how
to use disaggregated data to identify equity issues on their campus. This knowledge type is
procedural, in that it elucidates the explicit steps needed to accomplish a specific task
(Krathwohl, 2002). Culling large datasets along racial, ethnic and socioeconomic lines can
present equity information in a new way, empowering stakeholders to determine whether their
students are experiencing disparate outcomes (Bensimon & Malcolm, 2012). By breaking the
data into meaningful slices and analyzing the results, certificate program participants can assess
the performance of specific student subpopulations, identify relevant comparison groups and
formulate interventions such as pedagogical changes that seek to mitigate inequities. This
approach is consistent with the principles used in Bensimon’s (2004) Equity Scorecard, a
methodology for employing disaggregated data to highlight racial and ethnic disparities related
to college access, academic progress and degree completion. The purpose of the Equity
Scorecard is to offer a structure by which practitioners can interrogate their data to identify
inequities, analyze and interpret the sources of these disparities and ultimately act to remedy the
underlying challenges (Bensimon, 2004).
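As a minimal illustration of this procedural knowledge, the sketch below disaggregates a hypothetical student-level file by race/ethnicity and expresses graduation-rate gaps in percentage points relative to a comparison group; the file name, column names and group labels are assumptions for demonstration and are not drawn from the Equity Scorecard or CSU data systems.

```python
import pandas as pd

# Hypothetical student-level cohort extract; column names are illustrative only.
# Expected columns: student_id, race_ethnicity, graduated_in_6_years (0 or 1)
students = pd.read_csv("student_cohort.csv")

# Step 1: disaggregate -- compute the six-year graduation rate for each group.
rates = (
    students.groupby("race_ethnicity")["graduated_in_6_years"]
    .mean()
    .rename("grad_rate")
)

# Step 2: compute each group's equity gap as the percentage-point difference
# from a chosen comparison group (the comparison group's own gap is zero).
comparison_rate = rates["White"]
gaps = ((comparison_rate - rates) * 100).round(1).rename("gap_pct_points")

# Step 3: review the largest gaps to see where interventions are most needed.
summary = pd.concat([rates.round(3), gaps], axis=1)
print(summary.sort_values("gap_pct_points", ascending=False))
```

Framing gaps in percentage points keeps the output interpretable for practitioners who are new to working with disaggregated data.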
In a study of community college faculty, Dowd (2005) found the use of disaggregated
datasets to be instrumental in driving equity-minded action. To be successful in this endeavor,
Dowd (2005) asserted that faculty need to know where to find the data, how to select meaningful
subpopulations, how to assess equity challenges and how to interpret the findings to inform new
thinking. Marsh and Farrell (2015) reinforced this conclusion by determining that faculty
adherence to a data-driven equity framework was vital to improving educational interventions on
behalf of their most underserved students. In both studies, faculty knowledge of how to
disaggregate data to the appropriate level, compare outcomes among disparate groups and
identify discrepancies served as a pre-requisite to applying equity-minded solutions to their
practice.
Reflect on opportunities to leverage student data to become more equity-minded.
The third knowledge influence that certificate program participants need to meet their
performance goal is the ability to reflect on opportunities for leveraging student data to become
more equity-minded. This knowledge type is metacognitive, in that it promotes self-regulation
and self-reflection through an increased awareness of one’s thinking (Baker, 2006). By striving
to become more metacognitive, or committing to think about their thinking, faculty can self-
assess their performance, contemplate areas for improvement and identify opportunities for
personal and professional growth. The relationship between intentional reflective thought and
improved performance has been well documented (Dewey, 1933; Rodgers, 2002). The
metacognitive process is especially impactful when practitioners incorporate evidence into their
reflections. Svinicki et al. (2016) discovered that faculty who utilized data to assess successes
and failures in their instruction were more likely to take steps to improve their practice based on
these reflections. Through metacognitive activities, Svinicki et al. (2016) found that faculty were
able to internalize relevant datasets and contemplate scenarios for their meaningful application in
the classroom.
Despite the documented benefits of faculty incorporation of data into the reflective
process, Hora and Smolarek (2018) determined that metacognitive reflections are prone to
individual biases and institutional constraints. The researchers contended that faculty need to be
given the time, space and cultural context for meaningful reflection in a manner that is mindful
of the historic plight of underserved students (Hora & Smolarek, 2018). This finding is
particularly applicable to CSU certificate program participants, many of whom are convening for
the first time to learn about data-driven decision making, metacognition and equity-minded
practice. Through reflection, discussion board postings and small group dialogue, participants are
afforded opportunities to explore equity-mindedness in a safe and supportive environment.
Table 3 specifies the organizational performance goal and stakeholder goal aligned with
the CSU mission and summarizes the three aforementioned knowledge influences that are critical
to meeting these goals.
Table 3: Knowledge Influences and Knowledge Types
Organizational Mission
The mission of the California State University is to provide high-quality and affordable
undergraduate and graduate learning experiences to a diverse student body to meet the
workforce needs of the State of California and prepare students for a lifetime of achievement.
Organizational Global Goal
In support of the CSU mission, by the year 2025, all CSU campuses will eliminate equity gaps,
or discrepancies in graduation rates between students of color and their White and Asian peers.
Stakeholder Goal
Upon completion of the Certificate Program in Student Success Analytics, all team members
will regularly analyze and apply student equity data to improve their practice.
Assumed Knowledge Type | Application to Certificate Program Participants
Conceptual | Participants need to understand how the strategic use of data can influence educational outcomes for students.
Procedural | Participants need to know how to use disaggregated data to identify equity issues on their campus.
Metacognitive | Participants need to know how to reflect on opportunities for leveraging student equity data to become more equity-minded in their practice.
Motivation Influences
While contextualized knowledge is indispensable, stakeholders also need to possess
sufficient motivation to begin their data-driven analyses, stick with them over time and exert the
requisite mental effort to improve their practice. This section begins by briefly reviewing general
motivation literature before sharpening its focus on motivation-related influences that are closely
associated with certificate program participants’ ability to achieve their goal.
Motivation is an instrumental factor in the successful accomplishment of a stakeholder
goal. Often characterized as an individual or cognitive construct, motivation can also manifest
itself collectively in behavior that is aligned with a group’s social and cultural norms (Bandura,
2000). Understanding motivational influences can facilitate the development of a roadmap that
indicates where a person starts, and how they continue and ultimately complete a given task
(Rueda, 2011). Motivational drivers can be assessed in conjunction with their impact on active
choice, persistence and mental effort (Pajares, 2006). Throughout each of these three stages, an
analysis of motivational influences can explain stakeholder proclivity to take the needed steps to
accomplish their goal. Two motivational theories central to certificate program participants’
active choice, persistence and mental effort are goal orientation theory and self-efficacy theory.
Goal orientation. Goal orientation theory takes into account the importance of mastery,
self-improvement, learning and advancement while focusing on the underlying reasons why
stakeholders opt to engage in performance-related behavior (Yough & Anderman, 2006;
Pintrich, 2003). Goal orientation theory distinguishes between two types of goals: mastery and
performance. Those with a mastery goal orientation typically value the personal challenge of
improving and possess an inherent drive to learn new skills or build existing competencies
(Mayer, 2011). Individuals who set performance goals, on the other hand, tend to be more
competitive, and are guided by the desire to outperform their colleagues (Yough & Anderman,
2006). It is important to note that goal orientation is not a dichotomous attribute, as individuals
may exhibit characteristics consistent with both mastery and performance orientations when
attempting to accomplish a task. In the higher education sector, for example, a college
administrator may be motivated by both an inherent drive to improve student learning and a
competitive yearning to increase graduation rates to a level above peer institutions.
Mastery goal orientation for certificate program participants. Certificate program
participants need to possess a mastery goal orientation with regard to achieving greater equity in
their practice. In a study of teaching research assistants at the University of Minnesota at
Mankato, Camp (2017) found that instructors who set mastery goals for improving their
pedagogy reported higher levels of motivation and elevated self-perceptions of their teaching
performance compared to their peers. Kunst et al. (2018) drew comparable conclusions in their
study of community college faculty, finding that instructors who established challenging mastery
goals were much more likely to participate in professional development than their colleagues
who either set performance goals or had no goal orientation. The certificate program seeks to
encourage mastery goal orientation by establishing a supportive learning community that draws
on the moral imperative to utilize evidence to increase equity-minded practice. The program’s
curriculum emphasizes the notion that equity work is never complete, and that sustained
improvement is not achievable by the establishment of one or two metrics. Throughout the
professional development sessions, participants commit to a continuous cycle of data analysis,
application and measurement to guide improvement.
Self-efficacy. The second motivational influence impacting certificate program
participants’ ability to achieve their goals is self-efficacy. Self-efficacy is a measure of one’s
expectation to successfully accomplish a particular task (Pajares, 2006). Unlike self-confidence,
which is associated with an individual’s overall performance expectations, Bandura (2005)
described self-efficacy as an individual’s confidence to successfully complete a specific task or
accomplish a clearly defined goal. Self-efficacy has a significant impact on learning and
motivation (Clark & Estes, 2008). Self-efficacy can be enhanced through modeling that
demonstrates that a goal is attainable, and reinforced through supportive, formative feedback
(Pajares, 2006; Borgoni et al., 2011). This feedback is critical to performance, as it allows for the
immediate identification of successes and failures while affording learners a mechanism for
continually guiding their improvement. Pajares (2006) expanded upon Bandura’s (2000) self-
efficacy theory by connecting it to collective, or organizational, efficacy. This construct applies
to certificate program participants who build self-efficacy through individual data-driven
activities while gaining collective efficacy through collaborative contributions to their team’s
applied research project.
Certificate program participant self-efficacy. Certificate program participants need to
feel a strong sense of self-efficacy for their ability to analyze data to improve student outcomes.
In a study that analyzed teachers engaging in data-driven decision making, Dunn et al. (2013)
found that teachers who reported high levels of self-efficacy for engaging in evidence-based
practices had a high propensity to adopt innovative pedagogies in their classroom. Using
structural equation modeling, Dunn et al. (2013) found that self-efficacious practitioners reported
far lower levels of anxiety regarding data use in their classroom. These findings confirm the
notion that confidence in one’s ability to use data is a critical factor in improving practice.
Staman et al. (2014) further demonstrated that self-efficacy for data-driven decision making is a
malleable attribute that can be augmented through professional development. Analyzing a data-
informed professional development program for Dutch educators, the researchers identified
specific strategies to improve teacher self-efficacy for applying student data to improve their
pedagogy. Their mixed methods study indicated that advancing relationships where practitioners
collaborated to provide feedback on the strategic use of data greatly enhanced teachers’ self-
efficacy for regularly incorporating data into their practice (Staman et al., 2014).
The certificate program in student success analytics addresses the core findings of these
studies through a curriculum that scaffolds faculty opportunities to analyze data within a
community of supportive practitioners. This structure aims to build faculty self-efficacy for
making sense of their student data, while increasing their confidence in using this new
information to promote more equitable student outcomes in their classrooms.
Table 4 summarizes the salient stakeholder motivational influences of goal orientation
and self-efficacy that impact the ability of stakeholders to meet their goal.
Table 4: Motivational Influences
Organizational Mission
The mission of the California State University is to provide high-quality and affordable
undergraduate and graduate learning experiences to a diverse student body to meet the workforce
needs of the State of California and prepare students for a lifetime of achievement.
Organizational Global Goal
In support of the CSU mission, by the year 2025, all CSU campuses will eliminate equity gaps, or
discrepancies in graduation rates between students of color and their White and Asian peers.
Stakeholder Goal
Upon completion of the Certificate Program in Student Success Analytics, all team members will
regularly analyze and apply student equity data to improve their practice.
Assumed Motivation Influence | Application to Certificate Program Participants
Goal Orientation | Certificate program participants need to have a mastery goal orientation for achieving greater equity in their practice.
Self-efficacy | Certificate program participants need to feel a strong sense of self-efficacy in their ability to analyze data to improve student outcomes.
Organizational Influences
While assessing stakeholder knowledge and motivation influences creates a baseline for
understanding gaps in performance, organizational factors also impact stakeholders’ ability to
complete requisite tasks. As Clark and Estes (2008) noted, an organization’s policies and
procedures directly affect performance outcomes. The way in which an organization is
structured, what it values and how it allocates resources all offer indications of the underlying
culture and its alignment with the organizational goals. This section will begin with an overview
of the general theory of organizational culture, before turning to a literature review of
organizational factors that may impact the ability of certificate program participants to regularly
apply student data to improve their equity-minded practice.
General Theory. Culture permeates every aspect of an organization and establishes the
parameters for achieving an institution’s performance goals (Schein, 2004). Organizational
culture is manifested through the interrelationship between the various cultural models and
cultural settings that exist within an institution (Schein, 2004; Gallimore & Goldenberg, 2001).
Cultural models refer to the shared, normative understanding of what the organization values and
how it operates (Gallimore & Goldenberg, 2001). At a more granular level, cultural settings
include the policies, procedures and social dynamics that govern operational procedures and
collaborative activities in the workplace (Gallimore & Goldenberg, 2001). Organizations that
create alignment between cultural models and cultural settings promote consistency between
what they value and how they operate (Clark & Estes, 2008). By identifying how cultural models
and cultural settings within the CSU may be hindering stakeholders’ ability to effectively use
student data to improve their practice, campus leaders can begin to develop strategies to mitigate
negative impacts. The following section references relevant scholarly literature to frame two
cultural models and two cultural settings within the CSU that may contribute to stakeholders’
capacity to meet their performance goals.
Cultures of evidence that value data-informed analysis and action. The first cultural
model influence relates to the need to foster a culture of evidence throughout the institution. This
model emphasizes the fundamental value that the organization places on employing data to drive
improvement. Within the higher education sector, the National Association of Student Personnel
Administrators defines a culture of evidence as a campus’s commitment to use data to regularly
measure the efficacy of its programs, services and structures, while consistently relying on this
information to adjust practices to achieve its goals (Oburn, 2005). In a study of educational
accountability frameworks, Marsh (2012) concluded that a school’s organizational culture was
crucial to supporting teachers in the effective use of student data. Cardoza and Gold (2018)
further asserted that these structures are most effective when grounded in a student success
framework that provides campus leaders with a model for incorporating actionable data into their
daily practice.
Within the CSU, there are limited opportunities to employ data in actionable ways to
improve practice. Data collected centrally by the system office is commonly used to assemble
reports in response to state and federal compliance requirements (CSU Institutional Research and
Analyses, 2020). Consistent with the limitations that Dowd (2005) described at several
universities across the country, CSU administrators rarely reflect data back to stakeholders in
ways that can inform productive changes to practice (Cardoza & Gold, 2018). Instead, decisions
related to academic policy, student support and pedagogical effectiveness are often made based
on personal preference, anecdote or precedent (Cardoza & Gold, 2018).
Continuous professional development of faculty, staff and administrators. The
second cultural model influence relates to the CSU’s support for professional development that
promotes continuous learning for all employees. This model is rooted in an organization’s value
of a human resources frame, which emphasizes the significance of granting employees the power
and support needed to enhance performance (Bolman & Deal, 2008). Costino (2018) asserted
that universities are obligated to offer continuous professional development as a condition of
eliciting transformative change. Costino’s (2018) study of an equity-minded faculty development
program found that these learning opportunities not only provide a support structure for
pedagogical improvement but also underscore the organization’s commitment to professional
growth. Carney et al. (2016) further emphasized the need for a university-wide
commitment to ongoing professional development. Their study of a faculty learning community
at the University of North Georgia determined that the program’s effectiveness in enhancing
pedagogical innovation resulted largely from the broad support the program enjoyed throughout
the institution (Carney et al., 2016). This research suggests that by supplying sustained
professional development opportunities, higher education organizations demonstrate that
employee learning is a valued aspect of their culture.
Across the CSU, campuses’ commitment to continuous professional development has
varied over the past two decades (The CSU Faculty Development Center Survey, 2006).
Although each of the 23 CSU campuses has a faculty development center to support pedagogical
innovation, there are significant differences in the centers’ staffing levels, and inconsistencies in
the breadth of training opportunities (The CSU Faculty Development Center Survey, 2006; CSU
Northridge Faculty Development, 2020; Sonoma State University Faculty Center, 2020). Across
the CSU system, promoting sustained participation in professional development programs among
participants from campuses that do not value these endeavors presents a challenge for the
productive engagement of their faculty and staff.
Sufficient planning time to analyze student data. In addition to cultural model
influences related to evidence-based decision making and professional development, two
associated cultural setting influences also impact certificate program participant performance.
The first cultural setting influence is the need to furnish faculty and staff with ample time to
analyze student data. Numerous studies point to insufficient time as a significant impediment to
engaging university personnel in activities that support the critical use of data to improve
practice (Marsh, 2012; Costino, 2018; Coburn, 2010; Slavit & Nelson, 2010). Faculty and staff
cannot accomplish the tasks of analyzing data, drawing inferences and applying new knowledge
to improve their pedagogy if these duties are simply added to their workload (Gagliardi et al.,
2018). Releasing faculty from teaching duties and buying out their time have been shown to be
effective strategies for prioritizing the incorporation of analytics into their practice (Costino,
2018).
CSU campuses that participate in the Certificate Program in Student Success Analytics
are encouraged to afford their team members planning time to engage with the data (CSU
Certificate Program in Student Success Analytics, 2020). However, CSU campuses differ in the
perceived value they place on planning time; while some campuses seek to adjust program
participants’ workload by releasing them from a subset of their everyday duties, others offer no
such accommodation. Participants from campuses in the latter group need to navigate a cultural
setting that constrains the time they can dedicate to becoming more data-driven practitioners.
Support structures for applying data to practice. The second cultural setting influence
is the need to supply stakeholders with support structures for applying data to their practice.
Helping faculty disaggregate student data to identify and rectify inequities requires a profound
institutional commitment (Carter, 2006). Swing and Ross (2016) argued that it is incumbent
upon universities to model the effective use of data throughout the institution. The authors
contended that changing the primary role of campus institutional researchers from compliance
officers to data support coaches is required to elicit transformational change. Gagliardi et al.
(2018) expanded on this theme, declaring that higher education is undergoing an “analytics
revolution” that requires colleges to identify new ways to support the dissemination, analysis and
application of data to augment student success.
In their description of institutional strategies that nurture equity-minded and data-
informed improvement, Dowd and Bensimon (2006) emphasized the need for universities to
prioritize support that guides faculty in improving their evidence-based practice. Within the
CSU, campus support structures furnish varying levels of guidance for practitioner use of student
data. While three CSU campuses supply faculty and staff with dedicated learning communities to
facilitate the meaningful incorporation of data into their practice, the majority of campuses do
not offer this level of support (Long Beach State Data Fellows, 2020; CSU Northridge Data
Champions, 2020; CSU San Marcos Data Fellows, 2020). Participants from campuses that
provide limited guidance must contend with cultural settings that complicate the process of
integrating equity data into their practice.
Table 5 summarizes the four aforementioned organizational influences impacting the
ability of stakeholders to meet their goal.
Table 5: Organizational Influences
Organizational Mission
The mission of the California State University is to provide high-quality and affordable
undergraduate and graduate learning experiences to a diverse student body to meet the
workforce needs of the State of California and prepare students for a lifetime of achievement.
Organizational Global Goal
In support of the CSU mission, by the year 2025, all CSU campuses will eliminate equity gaps,
or discrepancies in graduation rates between students of color and their White and Asian peers.
Stakeholder Goal
Upon completion of the Certificate Program in Student Success Analytics, all team members
will regularly analyze and apply student equity data to improve their practice.
Assumed Organizational Influence | Application to Certificate Program Participants
Cultural Model Influence | The organization needs a culture of evidence that values data-driven analysis and action.
Cultural Model Influence | The organization needs a culture of continuous professional development to support faculty, staff and administrator growth.
Cultural Setting Influence | The organization needs to give participants enough planning time to have ample opportunities to analyze student data.
Cultural Setting Influence | The organization needs to provide participants with effective support structures.
Conceptual Framework: The Interaction of Stakeholders’ Knowledge and Motivation and
the Organizational Context
Previous sections of this chapter introduced several knowledge, motivation and
organizational influences, expounding their individual connections to stakeholders’ ability to
apply student data to improve their equity-minded practice. These seemingly independent
variables interact with one another and must be considered in combination to fully appreciate
how they affect stakeholders’ ability to achieve their goals. A conceptual framework offers a
visualization of the interactions among the principal ideas, concepts and theories of a research
study, graphically highlighting the inherent logic within these interrelationships (Maxwell,
2013). Highlighting a macro view of how research topics are connected, the conceptual
framework serves as a foundation for the research design and describes the scaffolding used to
inform the scholarly process (Merriam & Tisdell, 2016).
The conceptual framework for this study was derived from a review of extant literature
along with the researcher’s knowledge of the organization’s culture, customs and structure. From
the empirical research related to data-driven strategies for closing equity gaps in higher
education, the researcher culled a list of the most salient knowledge, motivation and
organizational factors influencing the ability of higher education faculty, staff and administrators
to apply student data to improve their equity-minded practice. Figure 1 depicts the interactions of
certificate program participants’ knowledge and motivation within the context of the CSU’s
cultural models and settings.
Figure 1: Conceptual Framework
Interaction of Stakeholder Knowledge and Motivation within Organizational Cultural Models
and Settings
The conceptual framework portrays the interrelationship between organizational and
stakeholder influences that affect the ability of certificate program participants to achieve their
goal. The outermost blue circle lists two types of organizational factors—cultural models and
cultural settings—that directly impact the institution’s values, policies, resources and priorities
(Gallimore & Goldenberg, 2001). These institutional influences are presented side-by-side within
the conceptual framework to emphasize the intrinsic connections among them, as supported by
the research (Marsh, 2012; Bensimon & Malcolm, 2012). In an analysis of strategies for
advancing cultures of evidence in school settings, Marsh (2012) identified the need for
organizations to demonstrate a commitment to data-driven decision making by providing ample
professional development and ensuring that teachers have sufficient time and support to identify
opportunities for change. Similarly, Bensimon and Malcolm (2012) described a method for
leveraging college data to increase equity that requires an organizational commitment to offer
sustained professional development, support for disaggregating data and planning time for
faculty and staff. Implicit within both studies is the notion that organizational influences are
interrelated, and not presented in isolation.
Nested within the organizational influences in Figure 1 are two types of stakeholder
influences—knowledge and motivation—displayed in the gray circle. Similar to the connection
between cultural models and cultural settings, research indicates that the listed knowledge and
motivation influences also interact with one another (Dowd, 2005; Dunn et al., 2013). In an
analysis of data usage at several community colleges, Dowd (2005) noted that knowledge of how
to disaggregate data with an equity lens motivated faculty to examine opportunities to increase
the success of underserved students. In this case, augmenting stakeholder knowledge helped to
build faculty self-efficacy for engaging with student data to implement equity-minded pedagogy
(Dowd, 2005).
Figure 1 illustrates the stakeholder influences as being enclosed within the organizational
influences, suggesting their connectedness. The tendency of faculty to actively engage in data-
driven and equity-minded programs is largely dependent on the university’s culture, and the
value it places on promoting professional learning (Van Schalkwyk et al., 2015). The concentric
circles in Figure 1 provide a visual representation of the close relationship between
organizational and stakeholder influences.
The red circle at the bottom of Figure 1 stipulates the stakeholder goal: that all certificate
program participants will regularly analyze and apply student data to improve their practice. The
goal is situated within the organizational and stakeholder influences, signifying the impact that
each of these sets of factors has on goal achievement. The framework implies that if the
knowledge and motivation influences of program participants are addressed within an
organizational culture that values ongoing professional development and a culture of inquiry, the
stakeholder goal will be more easily attained.
Conclusion
This study seeks to determine the readiness of certificate program participants to analyze
and apply student data to improve equity in their practice. The stakeholder goal is aligned with
the organization’s goal of eliminating equity gaps in graduation rates between underserved
students and their non-underserved peers by the year 2025. A review of the extant literature
contextualized the prevalence of equity gaps at public universities and explored historical trends,
root causes and research-based strategies for ameliorating the conditions that lead to inequitable
student outcomes. Through the literature review process, the researcher identified relevant KMO
influences that may impact the ability of stakeholders to achieve their goal. The study then
examined these influences using Clark and Estes’ (2008) performance gap analysis model,
which informed the development of a conceptual framework illustrating the interrelationships
among KMO influences. The following chapter will detail the methodological approach for
validating the assumed KMO influences.
CHAPTER THREE: RESEARCH METHODS
Research methods refer to the strategies, procedures and techniques guiding the
collection and analysis of data, supporting a structured approach to answering the research
questions (McEwan & McEwan, 2003). Chapter 3 provides an overview of the study design and
details the rationale behind the methodological choices that undergird the validity of the findings.
The chapter begins by revisiting salient characteristics of the participating stakeholders and
introducing the rationale for the selection of the survey instrument. Next, the chapter explains the
intentionality with which the survey questions were developed and presents the statistical
analyses that form the foundation of the quantitative study. The chapter concludes by examining
the validity and reliability of the survey instrument and addressing ethical considerations
employed to safeguard the veracity of the study.
The research questions that guide the study are as follows:
1. What are participants’ knowledge and motivation influences related to using student
equity data to improve their practice?
2. What is the influence of the organization’s cultural models and cultural settings on
stakeholder knowledge and motivation?
3. How do stakeholders’ job role and level of experience relate to their stated propensity to
apply student data to improve their practice?
Participating Stakeholders
The stakeholders for this study were CSU members of the 2020 cohort of the Certificate
Program in Student Success Analytics, a professional development program for campus
leadership teams. The identification of participating teams was led by CSU presidents, who were
invited to constitute groups of 10-15 faculty, staff and administrators to engage with the
program. The CSU system office suggested that presidents consider including faculty,
institutional researchers and representatives from Student Affairs in their recruitment efforts, but
the presidents ultimately had the prerogative to determine the level of heterogeneity among
participant roles. As a result of this discretion, the composition of participating teams varied, with
some groups predominantly derived from the faculty ranks and others largely represented by
administrators. For the 2020 cohort, the certificate program had a total enrollment of 274, with
237 participants from 20 CSU campus teams and 37 representatives from three external
university teams. Surveys were administered to all program participants, but for the purposes of
this study, the scope of the quantitative analysis was limited to the 237 CSU participants.
Survey Sampling Strategy and Rationale
To better understand the salient KMO influences impacting program participants’
readiness to improve their data-driven and equity-minded practice, the researcher implemented a
pre-program survey. The survey facilitated an analysis that employed quantitative methods to
determine the validity of the instrument and measure the extent to which the nine KMO
influences impacted participants’ likelihood of applying student equity data to improve their
practice. The program director administered the survey as a census to the entire population of
CSU stakeholders (n=237). To maximize the response rate, the researcher embedded the request
to complete the survey into two welcome e-mails, emphasizing the importance of gathering data
to facilitate a continuous improvement process for the program.
Survey Instrumentation and Statistical Analyses
Following recommended practices in research design (Merriam & Tisdell, 2016;
Robinson & Firth, 2019; Maxwell, 2013) the researcher purposefully aligned the study’s
methodological approach with the research questions. Table 6 provides an overview of the
survey instrumentation and statistical analyses that guided the data collection and investigation
processes.
Table 6: Survey Instruments and Methods
Survey Instrument
Pre-Program Survey
● Developed using Qualtrics software
● Online format
● Administered at the beginning of the
program to all CSU stakeholders
● Included four questions related to
participant demographics
● Included 28 KMO influence questions
with seven-point Likert scale
responses
● Included four concluding questions
with a five-point Likert response scale
to assess participants’ propensity to
apply data-informed insights to
improve equity in their practice
● Available in Appendix B
Statistical Analysis
Confirmatory Factor Analysis
● Data reduction strategy used to assess
the alignment of survey questions to the
underlying KMO constructs
● Poorly aligned questions were removed
from the analysis, thereby improving
construct validity
Linear Regression
● Statistical model used to predict
stakeholders’ likelihood to use student
data to improve equity in their practice
based on various combinations of
KMO influences
Logistic Regression
● Statistical model used to predict
stakeholders’ likelihood to use student
data to improve equity in their practice
based on job role and years of
experience in higher education.
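To make the two regression models summarized above concrete, the sketch below shows how analyses of this general shape could be specified with the statsmodels formula interface in Python; the variable names, outcome coding and predictor sets are assumptions for illustration only and do not reproduce the models estimated for this study in SPSS.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical cleaned survey export; all variable names are illustrative.
# propensity_score: composite of the concluding action-oriented items (continuous)
# knowledge, motivation, organization: mean scores on the KMO survey scales
# applies_data: 1 if the respondent reports regularly applying equity data, else 0
df = pd.read_csv("survey_clean.csv")

# Linear regression: predict propensity to act from the KMO composite scores.
linear_model = smf.ols(
    "propensity_score ~ knowledge + motivation + organization", data=df
).fit()
print(linear_model.summary())

# Binary logistic regression: predict reported data use from job role and experience.
# C(job_role) treats role (faculty, staff, administrator) as a categorical predictor.
logit_model = smf.logit(
    "applies_data ~ C(job_role) + years_experience", data=df
).fit()
print(logit_model.summary())
```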
Survey Instrumentation
The survey instrument, available in Appendix B, consisted of 36 questions; 28 were
purposefully mapped to a specific KMO influence, four gathered participant demographic
information and four assessed stakeholders’ tendency to apply insights from student data analysis
to improve equity in their practice. The survey included seven to 12 questions for each of the
three KMO constructs, promoting a more granular exploration of each factor. To increase the usability of the
questionnaire, the researcher structured the survey into four distinct categories: experiences with
data, feelings about data, campus support and actions.
Surveys are prevalent data collection instruments in social science research and, as such,
there are myriad examples of well-constructed questionnaires that can serve as exemplars for
crafting an effective question structure. Robinson and Firth (2019) extolled the value of
adapting existing surveys as a means of increasing the reliability of responses. In developing the
survey instrument, the researcher consulted the extant literature to identify survey structures in
studies that explored KMO influences related to cultures of evidence, professional development
and educational equity. The data-driven decision-making surveys in studies by Luo (2008) and
Kelly and Lueck (2011) offered a logical framework for constructing questions related to
educators’ application of data in a school setting. Using these surveys as a model, the researcher
created several pithy and direct statements about the extent to which stakeholders use data to
better understand student challenges, identify potential solutions and implement changes to their
practice.
To capture participant responses in an intuitive manner, the researcher opted to employ a
seven-point Likert scale. Boone and Boone (2012) asserted that Likert scales are among the most
effective quantitative instruments for capturing respondents’ attitudes, beliefs and feelings, while
simultaneously bestowing the researcher with a convenient and reliable mechanism for analyzing
variations in data. The researcher chose to employ a seven-point scale because studies suggest
that seven ordinal choices afford an ample response range, while scales with greater than seven
choices offer limited additional variance (Colman et al., 1997; Johns, 2010). To increase
reliability, the researcher also investigated the inherent limitations of Likert-scale response
structures. Colman et al. (1997) found that Likert-scale surveys can be prone to response bias, as
participants’ answers may be influenced by previous questions. To mitigate the potential for this
bias, the researcher conducted a factor analysis, eliminating three survey questions that were
shown to be misaligned with their intended KMO construct. The procedural details of the factor
analysis and corresponding results are provided in Chapter 4.
Statistical Analyses
Upon collection of the survey responses, the researcher exported the Qualtrics data set to
SPSS for cleaning. The researcher pre-programmed the Qualtrics survey to require responses to
each question, so there were no missing data. The researcher then used SPSS to conduct a series
of quantitative statistical analyses guided by the conceptual framework and research questions.
Johnson and Christensen (2015) averred that quantitative analyses can allow for greater
objectivity and reliability through the use of standardized, reproducible instruments. As
described in Table 6, the researcher performed a confirmatory factor analysis, multiple linear
regressions and a binary logistic regression on participants’ survey responses. The factor analysis
guided improvements to the survey’s validity, while the regression models identified salient
KMO influences impacting stakeholders’ ability to apply student data to improve equity in their
practice. The following sections offer a brief overview of the theory and purpose of the three
aforementioned quantitative methods, each of which will be explored further in Chapter 4.
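Although the cleaning and statistical work was performed in SPSS, the core preparation steps described above can be illustrated with a brief Python sketch. The example below is hypothetical: the file name and item-label prefixes are placeholders rather than artifacts of the actual Qualtrics export.

import pandas as pd

# Hypothetical export of the pre-program survey; the file name and the
# K./M./O. item-label prefixes are illustrative placeholders.
df = pd.read_csv("pre_program_survey.csv")

# The survey forced a response to every item, so no missing data are expected.
assert df.isna().sum().sum() == 0, "Unexpected missing responses"

# Convert the seven-point Likert labels to numeric codes for analysis.
likert_map = {
    "Strongly Disagree": 1, "Disagree": 2, "Somewhat Disagree": 3, "Neutral": 4,
    "Somewhat Agree": 5, "Agree": 6, "Strongly Agree": 7,
}
item_cols = [c for c in df.columns if c.startswith(("K.", "M.", "O."))]
df[item_cols] = df[item_cols].replace(likert_map)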
Confirmatory Factor Analysis. To assess and enhance the validity of the survey
instrument, the researcher employed a statistical method known as confirmatory factor analysis
(CFA) on all survey responses. Factor analysis examines the relationship among survey
questions, ultimately reducing a large subset of variables into a smaller number of constructs or
factors (Williams et al., 2010). After collecting data from the pre-program survey, the researcher
performed the CFA to assess the alignment between each survey question and the constructs of
knowledge, motivation and organization. As part of the process, the researcher calculated a
factor score for each question, indicating the strength of the correlation between the question and
underlying construct. For example, with respect to the construct of organizational influences, the
researcher computed factor scores for questions related to professional development, cultures of
evidence, organizational support and sufficiency of resources. Within the CFA model, each
question’s factor score was compared to an established minimum statistical threshold for
inclusion in the given construct. Consistent with this approach, the CFA served as a screening
tool to improve the validity of the survey instrument. As detailed in Chapter 4, the researcher
identified three questions with unacceptably low factor scores—indicating misalignment with
their intended KMO constructs—and eliminated these questions from subsequent statistical
analyses.
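The screening step itself reduces to a simple threshold rule. The sketch below applies the .50 cutoff programmatically; the loading values mirror a few entries reported later in Tables 7 through 9, but the item labels are placeholders rather than the survey’s internal identifiers.

import pandas as pd

# Illustrative factor scores; item labels are placeholders.
loadings = pd.DataFrame({
    "item": ["knowledge_reflect_2", "motivation_mastery_1", "motivation_mastery_3", "org_access_1"],
    "construct": ["Knowledge", "Motivation", "Motivation", "Organization"],
    "factor_score": [0.47, 0.38, 0.47, 0.54],
})

CUTOFF = 0.50  # minimum retained loading, per Hulland (1999) and Truong and McColl (2011)
loadings["delete"] = loadings["factor_score"] < CUTOFF

removed = loadings.loc[loadings["delete"], "item"].tolist()
retained = loadings.loc[~loadings["delete"], "item"].tolist()
print("Removed items:", removed)
print("Retained items:", retained)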
Linear Regression. The researcher also ran multiple linear regression models to
determine the extent to which changes in KMO influences predicted changes in the likelihood
that stakeholders would apply data-driven insight to their practice. Linear regression is a
mechanism for quantifying the strength of the relationship between an outcome variable and
independent variable (Montgomery et al., 2012). For this study, the researcher employed three
linear regression models to answer the first two research questions. In the first model, changes in
knowledge and motivation influences (independent variables) were employed as predictors of
participant use of data to improve their practice (dependent variable). The second model
regressed the knowledge construct (dependent variable) on the organization’s cultural model
(independent variable) and cultural setting (independent variable) influences. The third model
was identical to the second, except the researcher substituted motivational influences for
knowledge influences as the dependent variable.
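As a rough illustration of how these three models could be specified outside of SPSS, the sketch below uses the statsmodels formula interface with synthetic data; every column name is a placeholder for the composite KMO scales and outcome item described in Chapter 4, not the study’s actual variable names.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data; composite scale names are placeholders.
rng = np.random.default_rng(0)
n = 211
df = pd.DataFrame(rng.normal(size=(n, 7)),
                  columns=["k_benefits", "k_disaggregate", "k_reflect",
                           "m_selfefficacy", "m_mastery",
                           "cultural_models", "cultural_settings"])
df["knowledge"] = df[["k_benefits", "k_disaggregate", "k_reflect"]].mean(axis=1)
df["motivation"] = df[["m_selfefficacy", "m_mastery"]].mean(axis=1)
df["data_use"] = 3 + 0.4 * df["k_disaggregate"] + rng.normal(scale=0.5, size=n)

# Model 1: knowledge and motivation influences predicting use of data.
m1 = smf.ols("data_use ~ k_benefits + k_disaggregate + k_reflect + m_selfefficacy + m_mastery", data=df).fit()
# Model 2: cultural models and cultural settings predicting the knowledge composite.
m2 = smf.ols("knowledge ~ cultural_models + cultural_settings", data=df).fit()
# Model 3: the same predictors with motivation as the dependent variable.
m3 = smf.ols("motivation ~ cultural_models + cultural_settings", data=df).fit()

for label, model in [("Model 1", m1), ("Model 2", m2), ("Model 3", m3)]:
    print(label, "R-squared:", round(model.rsquared, 3))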
Logistic Regression. To answer the third research question, the researcher performed a
binomial logistic regression examining the extent to which participants’ job role and level of
higher education experience (independent variables) predicted the likelihood that they would use
data to improve equity in their practice (dependent variable). Chapter 4 further details the steps
the researcher took to conduct the logistic regression and why he chose this statistical method for
the analysis.
Validity and Reliability
The legitimacy of a quantitative study is dependent on the researcher’s ability to collect
valid and reliable data in a systematically repeatable manner (Creswell & Creswell, 2018). The
concept of validity in survey development refers to the degree to which an instrument accurately
measures what it is seeking to measure (Salkind, 2017). The researcher addressed validity
concerns for the survey instrument by employing concise language, avoiding jargon and posing a
variety of question types.
The researcher also improved the validity of the survey instrument by consulting with
colleagues experienced in quantitative research, seeking their feedback to improve question
clarity and increase alignment with the conceptual framework. Upon compiling a list of potential
questions, the researcher implemented a test run with two CSU professors and one CSU staff
member to assess the instrument’s intuitiveness. Based on feedback from this test, the researcher
added clarifying language and reworded several questions. Consistent with Robinson and Firth’s
(2019) recommendations, testing the survey prior to its official administration allowed the
researcher to address his preconceived notions of how participants would likely respond to the
instrument and afforded him an opportunity to edit the language to promote clarity.
In addition to explicitly incorporating validity safeguards into the study, the researcher
integrated strategies to promote reliability. In quantitative research, reliability refers to the extent
to which a study’s findings can be replicated using standardized statistical analyses (Merriam &
Tisdell, 2016). Robinson and Firth (2019) emphasized that survey reliability is predicated on
evidence that survey-takers answer similar types of questions in appropriately parallel ways. To
enhance the reliability of the survey, the researcher employed the quantitative methods described
in the previous section.
Given the importance of standardization in quantitative research, documenting the
statistical methodology is an essential tool for enhancing reliability and reproducibility. Johnson
and Christensen (2013) recommended that researchers maintain an audit trail as a means of
improving reliability by allowing readers to authenticate findings. The researcher enhanced the
study’s reliability by documenting the statistical methodology and compiling a step-by-step
account to facilitate reproducibility. The researcher also ran all statistical analyses more than
once to corroborate results and sought guidance on the methodological approach from colleagues
with substantial experience in quantitative methods. By engaging experts in a peer review of the
audit trail and statistical analyses, the researcher augmented the reliability of the study’s
findings.
Ethics
Research ethics are grounded in the principles of affording respect, maximizing benefits,
minimizing risk and ensuring fairness for all participants (Merriam & Tisdell, 2016). Glesne
(2011) also argued that ethical inquiry requires researchers to be attentive to the dignity, privacy
and wellbeing of the subjects of the study. With these principles in mind, this research study
promoted transparency by giving participants written information about its purpose and an
explanation about how survey responses would be analyzed and aggregated to facilitate a deeper
understanding of the impact of the professional development program. To mitigate perceptions
of pressure to participate in the study and minimize the power dynamic between the researcher
and participants, the survey was administered by the program director.
As part of the survey participant recruitment process, the program director obtained
informed consent from all potential subjects and emphasized the voluntary nature of their
participation. Although requiring informed consent does not equalize the power structure
between researcher and participant, the consent requirement increased the likelihood that survey-
takers would make autonomous decisions aligned with their best interest (Glesne, 2011). Survey
respondents were apprised of their right to withdraw from the study at any time without penalty,
and they received written assurances that their responses would be held in strict confidence.
In addition to promoting informed consent, the researcher was mindful of ethical
considerations interwoven into the relationship between the researcher and the subjects of the
study (Glesne, 2011). As an administrator at the CSU Office of the Chancellor, representing a
system of scholarly institutions, the researcher sought to elicit honest and candid responses while
abiding by ethical protocols that respect participants’ best interests. Although the researcher did
not maintain any official reporting relationships with the faculty, staff and administrators who
participated in the study, he remained cognizant of the implicit power dynamic between the CSU
campuses and the system office. To assuage undue influence that may be implicit in this
relationship, the researcher committed to presenting all results in de-identified, aggregate form.
The researcher also reminded participants before, during and after the survey that the purpose of
the study was to improve support for CSU faculty, staff and administrators in their use of student
data to improve their equity-minded practice.
The researcher was also aware that his attitudes toward the Certificate Program had the
potential for tacitly impacting participants. Merriam and Tisdell (2016) contended that ethical
dilemmas often arise in studies where the researcher’s opinions influence participants’ responses.
As the creator and director of the study’s professional development program, the researcher held
several preconceived notions about the program’s efficacy, and he had a vested interest in its
continued success. To counter these assumptions, the researcher employed the standardized
statistical instruments introduced in previous sections as a means of identifying potential biases.
The researcher also ensured that participants understood his role in conducting the study.
As a practitioner, the researcher did not encounter stakeholder confusion arising from his dual
role as investigator and administrator given the frequency with which this duality is presented
within the university setting. The majority of study participants held doctoral degrees and
possessed significant experience conducting scholarly research. Additionally, participants were
knowledgeable about the challenges of maintaining boundaries between the roles of educator and
scholar, and some had served on dissertation committees for campus Ed.D. students.
A critical measure of the quality of research is whether a study is conducted ethically and
with integrity (Merriam & Tisdell, 2016). Through the adoption of the aforementioned measures,
the researcher attenuated his implicit and explicit biases while valuing, respecting and protecting
program participants. By committing to an ethical approach, the researcher increased the
likelihood that the study’s outcomes accurately reflect the knowledge, motivation and
organizational influences that impact program participants’ ability to become more data-
informed, equity-minded practitioners.
CHAPTER FOUR: RESULTS AND FINDINGS
The CSU has the goal of eliminating equity gaps in degree attainment rates between
historically underserved students and their more privileged peers by the year 2025. In support of
this goal, the CSU has developed several programs and policies to promote equity-minded action
to help more students of color earn a high-quality college degree. One such effort, the Certificate
Program in Student Success Analytics, was launched to provide data-informed and equity-driven
professional development to teams of CSU faculty, staff and administrators, who collectively
comprise the stakeholder group of focus.
The purpose of this study was to evaluate the readiness of certificate program participants
to analyze and apply student data to improve equity in their practice. The researcher collected
data through the administration of an online survey to assess knowledge, motivation and
organizational influences impeding stakeholders’ ability to leverage data-informed insights to
become more equity-minded practitioners. The research questions guiding this study were as
follows:
1. What are participants’ knowledge and motivation influences related to using student
equity data to improve their practice?
2. What is the influence of the organization’s cultural models and cultural settings on
stakeholder knowledge and motivation?
3. How do stakeholders’ job role and level of experience relate to their stated propensity
to apply student data to improve their practice?
The chapter begins by highlighting salient demographic characteristics of the
participating stakeholders. The focus then turns to the concept of construct validity, detailing
how a confirmatory factor analysis facilitated the identification and removal of survey questions
that were misaligned with their intended knowledge, motivation, or organization construct. The
following section introduces the survey results, highlighting notable trends in the distribution of
responses and offering a contextualized discussion of the interpretation of their meaning. The
chapter then moves to the research questions, presenting a description of the inferential statistical
models employed to answer the questions, an overview of the results and a discussion of their
broader meaning. The chapter concludes with a summary of the study’s findings and a
prioritization of the knowledge, motivation and organizational influences that must be addressed
to help certificate program participants meet their goal of applying analytic insight to increase
equitable outcomes for their students.
Participating Stakeholders
The study focused on CSU faculty, staff and administrator participants in the 2020 cohort
of the Certificate Program in Student Success Analytics. On January 20, 2020, the 237 registered
CSU participants received an e-mail requesting that they complete the pre-program survey, a
copy of which is included in Appendix B. In the ensuing 11 days, 211 participants (89%)
submitted the survey before the deadline of January 31, 2020. A broad cross-section of CSU
constituents completed the survey. Figure 2 highlights relevant demographic information related
to respondents’ job role, years of experience in higher education, gender and ethnicity.
Figure 2: Survey Respondent Demographic Information
The results for the demographic section of the survey revealed a relatively even
distribution of job roles among participants, with 41% faculty, 36% administrators and 23%
staff. Respondents reported having an average of 14 years of experience in higher education with
10% stating they had less than six years of experience, and just under half (49%) indicating
experience greater than 15 years. Survey-takers were more than twice as likely to be female as
male (66% vs. 31%), indicating a higher level of female participation compared to the total
population of CSU employees (55% vs. 45%) (CSU Employee Distribution by Gender, Race,
Ethnicity and Time Base, 2020). With respect to ethnic background, 44% reported their ethnicity
as White, 18% Asian, 16% Hispanic and 11% Black. The ethnic distribution of program
participants was comparable to that found among the general population of CSU employees,
where 52% reported as White, 43% as minority and 6% declined to state (CSU Employee
Distribution by Gender, Race, Ethnicity and Time Base, 2020).
Construct Validity
In quantitative analyses, the validity and reliability of the survey instrument are
paramount to collecting valid data from which to draw meaningful inferences (Merriam &
Tisdell, 2016). Cronbach and Meehl (1955) demonstrated that statistical tests for construct
validity offer an unbiased assessment of the alignment between survey questions and the
underlying constructs they are seeking to measure. As illustrated in the conceptual framework
and detailed further in Chapter 3, the questions for this study were designed to assess gaps
related to nine influences within the broader constructs of knowledge, motivation and
organization. To evaluate the validity of the survey instrument, the researcher conducted a
confirmatory factor analysis (CFA) using SPSS AMOS software and built a model that mapped
each survey question to its intended KMO construct. A graphical representation of the
hypothesized relationships between the survey questions and KMO constructs is provided in
Figure 3.
Figure 3: Confirmatory Factor Analysis Question Mapping to KMO Constructs
Through CFA, the researcher generated factor scores for each survey item, depicted by
the vertical numbers shown in the middle of each arrow in Figure 3. The factor score represents a
standardized regression coefficient indicating the correlation between the survey question and
the larger KMO construct (Salkind, 2010). The higher the factor score, the more closely the
survey question is aligned with—and accurately measures—the underlying construct. While
there is some disagreement about a standardized cutoff index for interpreting factor scores,
Hulland (1999) and Truong and McColl (2011) stipulated that items with factor scores of less than
.50 should be eliminated from further statistical analysis.
Table 7 lists the seven knowledge survey questions and their respective factor scores.
Applying the threshold used in Hulland’s (1999) and Truong and McColl’s (2011) studies to
each item, the final two columns of Table 7 report each factor score and indicate whether the
question was deleted from further analysis.
Table 7: Factor Scores for Knowledge Survey Questions

Knowledge Question | Knowledge Influence | Factor Score | Delete Question?
I understand how the use of equity data can impact student outcomes. | Benefits of student data | .61 | No
I know how to identify equity gaps in student outcomes. | Disaggregating data | .79 | No
I understand how to analyze student data to get the information I need. | Benefits of student data | .78 | No
Given a spreadsheet with retention data organized by various student characteristics, I could identify salient equity issues. | Disaggregating data | .77 | No
I understand how student data are collected on my campus. | Disaggregating data | .64 | No
I am comfortable talking with my colleagues about student data. | Reflecting on data | .72 | No
I often reflect on opportunities to improve equity in my practice. | Reflecting on data | .47 | Yes
As shown in the final row of Table 7, the knowledge question measuring participants’
self-reflection about data had a factor score of .47, indicating an unacceptably low correlation
with the knowledge construct. Accordingly, the researcher deleted the question and excluded it
from further analysis.
Similarly, Table 8 provides an overview of the question mappings and factor scores for
the motivation items in the survey. As the table indicates, two questions related to mastery goal
orientation had factor scores of less than .50 and were eliminated from the data analysis.
Table 8: Factor Scores for Motivation Survey Questions

Motivation Question | Motivation Influence | Factor Score | Delete Question?
My knowledge of student data is stronger than my peers. | Self-efficacy | .58 | No
I am comfortable using web-based dashboards. | Self-efficacy | .59 | No
Improving equity in my practice is one of my top priorities. | Mastery goal orientation | .38 | Yes
I am committed to learning as much as possible about using data to improve my practice. | Mastery goal orientation | .55 | No
I feel confident in my ability to apply data-informed insights to improve student outcomes. | Self-efficacy | .66 | No
When I have questions about student data, I can typically figure out the answers on my own. | Self-efficacy | .64 | No
I am confident that I can become more equity-minded in my practice. | Self-efficacy | .57 | No
I feel confident in my ability to navigate data tables for meaning. | Self-efficacy | .67 | No
I would commit to improve equity in my practice even if it were not a priority of my campus. | Mastery goal orientation | .47 | Yes
Finally, Table 9 specifies the mappings and factor scores for the organizational influence
questions in the survey. As shown in the third column, the factor scores were all greater than .50,
and no questions were removed from the analysis.
Table 9: Factor Scores for Organizational Influence Survey Questions

Organization Survey Question | Organization Construct | Factor Score | Delete Question?
I have access to the data I need. | Support structures | .54 | No
I know how to access student data on my campus. | Support structures | .59 | No
My campus values data-informed decision making. | Culture of evidence | .66 | No
My campus leaders consult data to inform their decisions. | Culture of evidence | .55 | No
My campus is committed to ensuring that faculty, staff and administrators have access to quality professional development. | Professional development | .70 | No
I am satisfied with my opportunities to engage in professional development. | Professional development | .77 | No
Professional development is one of the top perks of my job. | Professional development | .63 | No
I have sufficient planning time to participate in professional development. | Planning time | .68 | No
My campus supports my participation in the Certificate Program. | Planning time | .59 | No
I know who to talk to on campus to help me understand how to interpret student data. | Support structures | .58 | No
My campus provides adequate support structures to help me apply student data to my practice. | Support structures | .69 | No
I feel supported in my efforts to use student data to improve equity in my practice. | Support structures | .74 | No
The researcher increased the construct validity by conducting a CFA on the survey
responses and eliminating three questions with unacceptably low factor scores, thereby
narrowing the focus of the quantitative analysis to 25 questions determined to be closely aligned
with their intended KMO constructs. The following section introduces the survey results and
includes a brief discussion with contextualized interpretations for understanding their meaning.
Results and Discussion
To assess knowledge, motivation and organizational influences affecting program
participants’ ability to apply data-driven insight to their practice, the first part of the survey
instrument posed 28 questions, 25 of which were included in the analysis. A Likert scale was
used to measure stakeholders’ feelings toward each statement, offering respondents seven
choices for answers: Strongly Agree, Agree, Somewhat Agree, Neutral, Somewhat Disagree,
Disagree and Strongly Disagree. This section presents an itemized aggregate summary of the
responses to each survey question organized by their categorization as knowledge, motivation, or
organizational influences. The presentation of survey results is followed by a brief discussion of
salient patterns and contextualized interpretations.
Knowledge Influences
The survey responses to the knowledge questions (K.1 – K.6) revealed broad agreement
with all six statements as depicted by the three bars of varying shades of blue in Figure 4. Over
50% of respondents indicated some level of concurrence (Strongly Agree, Agree, or Somewhat
Agree) with each item. Respondents expressed their most positive inclinations toward question
one: I understand how the use of equity data can impact student outcomes. Eighty-three percent
of participants concurred with this statement, with 28% reporting strong agreement, 35%
agreement and 20% somewhat agreement. A similar response pattern was recorded for question
four: Given a spreadsheet with retention data organized by various student characteristics, I
could identify equity issues, and question six: I am comfortable talking with my colleagues about student data.
Over 80% of participants expressed concurrence with each statement.
Figure 4: Knowledge Survey Question Responses
Among the six knowledge questions, the distribution of responses was most evenly
spread for item five: I understand how student data are collected on my campus. While half of
respondents indicated some form of agreement, 13% were neutral and 38% expressed varying
levels of disagreement with the statement.
The preponderance of affirmative responses to the knowledge statements suggests that
the majority of survey-takers were confident in their conceptual, procedural and metacognitive
knowledge related to applying data-informed insights to improve equity in their practice. These
self-affirming judgments may have been influenced by what Moore and Schatz (2017) described
as the overconfidence effect, a phenomenon by which people overestimate their strengths, beliefs
and abilities. This would explain why responses to questions related to tasks that survey-takers
perform themselves—such as disaggregating data—were more positive than responses to items
referencing actions taken by third parties—such as collecting student data.
Motivation Influences
The survey responses to the motivation questions (M.1 – M.7) demonstrated participants’
overwhelming agreement with the seven motivation-related statements presented in Figure 5.
Respondents expressed their most emphatic concurrence with question three: I feel confident in
my ability to apply data-informed insights to improve student outcomes, with 69% of participants
indicating strong agreement with the statement. Similar responses were recorded for question
six: I am comfortable talking with my colleagues about student data, with 56% of survey-takers
reporting strong agreement. For both of these questions, less than 3% of respondents expressed
any level of disagreement. The highest level of discord was registered for question five: I am
confident that I can become more equity-minded in my practice, where 19% of respondents
registered some level of disagreement.
Figure 5: Motivation Survey Question Responses
The responses to the motivation questions underscored participants’ highly efficacious
perceptions about using student data to improve equity in their practice. This is not surprising
given the advanced educational background of CSU faculty, 80% of whom hold a doctoral
degree and therefore have experience conducting data analyses (Employees of the California
State University, 2019). The CSU’s commitment to student success through its Graduation
Initiative 2025 program may have also contributed to strong perceptions of self-efficacy among
survey respondents. The program has provided opportunities for CSU constituents from all
campuses to engage in professional development that both builds knowledge and bolsters
motivation for eliminating equity gaps (CSU Institute for Teaching and Learning, 2020).
Organizational Influences
Consistent with participant responses to the knowledge and motivation survey items, the
answers to the twelve organizational influence questions (O.1 – O.12) shown in Figure 6 also
revealed substantial agreement among respondents. This expression of concurrence was most
prominently evidenced in question nine: My campus supports my participation in the Certificate
Program, where 92% of respondents indicated some level of agreement. High levels of
concurrence were also registered for questions ten: I know who to talk to on campus to help me
understand how to interpret student data, and question three: My campus values data-informed
decision making, with 85% and 79% of survey-takers indicating some level of agreement with
each respective statement.
Although expressions of disagreement with the organizational items were limited, there
was notable variation in participant submissions for two questions. Responses to question one: I
have access to the data I need, revealed that 27% of survey-takers disagreed in some form with
the statement. Likewise, 35% of survey-takers expressed some level of disagreement with
question 8: I have sufficient planning time to participate in professional development.
Figure 6: Organization Survey Question Responses
Consistent with response patterns tied to the knowledge and motivation items, survey
submissions for the organizational influence questions showed widespread agreement with
statements related to the quality and prevalence of campus support for leveraging data to
improve equity. These results were likely influenced by the non-random assignment of the
stakeholders who were selected to participate in the program. As detailed in Chapter 3, the
invitation to enroll in the certificate program was extended through the campus presidents, who
either designated participants or delegated their recruitment to a senior administrator. Given
participants’ connection to the president’s office, they may have had a greater tendency to feel
well-supported and may have been more familiar with various campus support structures than
their non-participating peers.
Research Question Findings and Discussion
The descriptive statistics of the knowledge, motivation and organizational influence
survey items provide an overview of the aggregate response patterns. While this information
establishes a baseline for better understanding participants’ perceptions of various KMO
influences, it does not afford predictions about likely future outcomes or assess noteworthy
relationships between variables. To explore salient inferential connections between predictor
influences and outcome variables, the researcher performed linear and logistic regressions. The
following section reintroduces the study’s research questions, clarifies the statistical approach
used to answer each question, presents and explains the results and contextualizes each set of
findings with a brief discussion and analysis.
Research Question 1: What Are Participants’ Knowledge and Motivation Influences
Related to Using Student Equity Data to Improve their Practice?
To determine the extent to which stakeholders’ knowledge and motivation predicted their
tendency to utilize student data to improve equity in their practice, a multiple linear regression
was performed. Three new variables were coded to represent the knowledge influences of (1)
appreciating the benefits of data, (2) understanding how to disaggregate data and (3) reflecting
on data. Two motivation variables were also created to assess the influences of (1) self-efficacy
for using data and (2) mastery goal orientation for leveraging data to improve equity. The
outcome variable was derived from responses to survey question A3: Describe the frequency
with which you apply data to improve equity in your practice. Aggregate survey responses were
coded on a continuous scale as follows: Not Interested (1), Want to Do and Not Ready (2), Want
to Do and Ready (3), Do Occasionally (4) and Do Regularly (5).
A stepwise multiple linear regression was conducted to explore the degree to which
knowledge and motivation predicted stakeholder application of data to improve equity in their
practice. The researcher chose a stepwise approach because there was no underlying theory or
hierarchy to guide variable input (Zhang, 2016). Each of the five independent variables (the three
knowledge and two motivation influences) were entered into the equation to regress the variables
simultaneously and remove those with the weakest contribution to the model.
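SPSS automates this selection procedure; a comparable backward-elimination loop can be sketched with statsmodels, as shown below on synthetic data with placeholder column names. The sketch drops the weakest predictor by p-value until only statistically significant contributors remain, which approximates, but does not exactly replicate, the stepwise routine used in the study.

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical predictors and outcome; names and values are placeholders.
rng = np.random.default_rng(1)
n = 211
X = pd.DataFrame(rng.normal(size=(n, 5)),
                 columns=["k_benefits", "k_disaggregate", "k_reflect",
                          "m_selfefficacy", "m_mastery"])
y = 3 + 0.5 * X["k_disaggregate"] + 0.2 * X["k_reflect"] + rng.normal(scale=0.6, size=n)

def backward_eliminate(X, y, alpha=0.05):
    """Drop the predictor with the largest p-value until all remaining p-values fall below alpha."""
    cols = list(X.columns)
    while cols:
        model = sm.OLS(y, sm.add_constant(X[cols])).fit()
        pvals = model.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] < alpha:
            return model, cols
        cols.remove(worst)
    return None, []

model, kept = backward_eliminate(X, y)
print("Retained predictors:", kept)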
The results determined that among the five variables, two were the strongest contributors
to the variance in the dependent variable: (1) procedural knowledge of how to disaggregate data
(p = .000) and (2) metacognitive knowledge of how to reflect on data (p = .035). Table 10
shows the standardized beta coefficients, the t-values and the p values, with double
asterisks indicating statistical significance where p < .05.
Table 10: Standardized Coefficients for Linear Regression of Knowledge and Motivation as Predictors of Applying Data to Improve Equity in One’s Practice

Predictor | Beta | t | p
Knowledge – Benefits of Data | .128 | 1.437 | .152
Knowledge – Disaggregate Data | .398 | 3.992 | .000**
Knowledge – Reflect on Data | .189 | 2.118 | .035**
Motivation – Self-efficacy | .096 | 1.386 | .167
Motivation – Mastery Goal Orientation | -.157 | -1.634 | .104
** p < .05
Stakeholders’ knowledge of how to disaggregate data was the strongest predictor, followed by
their understanding of how to reflect on data. The three other variables were removed from the
analysis because they did not contribute to improving the model in a statistically significant way.
A significant regression equation was found, F(2, 207) = 29.511, p = .000, with an R² of .293.
The model output indicated that 29.3% of the variance in stakeholders’ tendency to apply data to
their practice was predicted by perceptions of their ability to disaggregate data and reflect on the
use of data. Among these influences, disaggregating data contributed to the majority of the
variance (26.2%) while reflection added minimally (3.1%) to improving the prediction model.
The results indicated that knowledge influences, specifically those related to procedural
and metacognitive knowledge, were much stronger predictors of stakeholders’ likelihood to
apply data-driven insights to their practice than motivation influences. This might be explained
by the differing purposes of the four types of knowledge. Whereas factual and conceptual
knowledge supply learners with needed background information and classification schema, these
forms of knowledge are not directly connected to action (Johnson & Proctor, 2016). Procedural
knowledge, on the other hand, is inherently focused on task execution to promote action
(Johnson & Proctor, 2016). Similarly, metacognitive knowledge involves self-reflection that
augments awareness of one’s own learning and promotes more strategic and intentional action
(Rodgers, 2002). Since both procedural and metacognitive knowledge are linked to the practical
application of learning, it is not surprising that participants with high levels of these types of
knowledge were more likely to use data to improve their practice.
The findings also indicated that the motivation influences of self-efficacy and goal
orientation had minimal impact on stakeholders’ likelihood to use student equity data to improve
their practice. It is important to note that the data underlying these and all other survey results
were self-reported and likely influenced by respondents’ judgment bias. This limitation is
explored further at the end of Chapter 5. Notwithstanding the potential for response bias, further
research is needed to explore why these findings deviate from the scholarly literature that
identified high levels of self-efficacy (Siwatu et al., 2011; Dunn et al., 2013; Staman et al., 2014)
and mastery-goal orientation (Camp, 2017; Kunst, 2018) as consequential factors for
implementing more data-informed and equitable practices.
Research Question 2: What Is the Influence of the Organization’s Cultural Models and
Cultural Settings on Stakeholder Knowledge and Motivation?
Multiple linear regression was performed to develop two models: (1) an examination of
the relationship between organizational influences and stakeholder knowledge and (2) an
exploration of the impact organizational influences had on stakeholder motivation. Aligned with
Gallimore & Goldenberg’s (2001) classification, the organizational influences were coded into
two new variables: cultural models and cultural settings. The cultural models variable was
derived by combining questions that addressed the shared organizational values of data-informed
decision making and continuous learning through professional development. The cultural
settings variable was created by aggregating responses related to campus support structures and
the availability of planning time. The researcher performed two regressions; the first using
organizational influences to predict stakeholder knowledge, and the second combining
organizational influences to predict participant motivation.
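The construction of these composite variables amounts to averaging the relevant survey items. The sketch below is illustrative only: it generates synthetic responses, and the grouping of item numbers into the two cultural constructs follows the order of Table 9, which is an assumption rather than the survey’s official labeling.

import numpy as np
import pandas as pd

# Synthetic responses for the 12 organization items (1-7 Likert codes).
rng = np.random.default_rng(2)
org_items = [f"O.{i}" for i in range(1, 13)]
df = pd.DataFrame(rng.integers(1, 8, size=(211, 12)), columns=org_items)

# Assumed item groupings based on the ordering in Table 9.
cultural_model_items = ["O.3", "O.4", "O.5", "O.6", "O.7"]                     # culture of evidence + professional development
cultural_setting_items = ["O.1", "O.2", "O.8", "O.9", "O.10", "O.11", "O.12"]  # support structures + planning time

df["cultural_models"] = df[cultural_model_items].mean(axis=1)
df["cultural_settings"] = df[cultural_setting_items].mean(axis=1)
print(df[["cultural_models", "cultural_settings"]].describe())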
In model one, a stepwise multiple linear regression was employed to assess the degree to
which cultural models and settings predicted stakeholder knowledge. The model simultaneously
regressed the four independent variables (two cultural model influences and two cultural setting
influences) to determine their impact on stakeholder knowledge. The results, as presented in
Table 11, indicated that cultural settings were a much stronger predictor of knowledge than
cultural models. When predicting stakeholder knowledge influences, both cultural models and
cultural settings proved to be statistically significant, but the beta coefficient for cultural models
was -.271, indicating a negative relationship.
Model two also followed a stepwise structure, with cultural models and cultural settings
used to predict a different dependent variable: stakeholder motivation. The results paralleled
those found in model one. While cultural models were not determined to be statistically
significant predictors of stakeholder motivation, cultural settings had a statistically significant,
positive relationship. In other words, the more favorable respondents’ perceptions of their
campus cultural settings, the more likely stakeholders were to report high levels of motivation.
Table 11 provides the standardized beta coefficients, the t-value measures and the p values, with
double asterisks indicating statistical significance where p <.05.
Table 11: Standardized Coefficients for Linear Regression of Cultural Models and Cultural Settings as Predictors of Knowledge and Motivation

Predictor | Beta | t | p
Model One (Dependent Variable = Knowledge)
Organization – Cultural Models | -.271 | -3.497 | .001**
Organization – Cultural Settings | .684 | 8.840 | .000**
Model Two (Dependent Variable = Motivation)
Organization – Cultural Models | -.127 | -1.514 | .132
Organization – Cultural Settings | .397 | 6.261 | .000**
** p < .05
For the knowledge influences (model one), a significant regression equation was found,
F(2, 207) = 1604.125, p = .000, with an R² of .291, indicating that 29.1% of the variance in
stakeholders’ knowledge was predicted by the organization’s cultural models and cultural
settings. Cultural settings accounted for a notable majority of the overall variance (25.6%). A
significant regression equation was also found for the motivation model, F(1, 209) = 1207.612,
p = .000, with an R² of .154. These results indicated that 15.4% of the variance in stakeholder
motivation was predicted by participants’ perception of the organization’s cultural models and
cultural settings. As noted, organizational influences were far more predictive of changes to
stakeholder knowledge than motivation.
These findings suggest that CSU cultural settings, specifically campuses’ commitment to
offering needed support structures and sufficient planning time, significantly contributed to the
likelihood that stakeholders would use data to improve their practice. The differences between
cultural models and cultural settings may explain why cultural settings were shown to be more
powerful predictors of knowledge and motivation. Whereas cultural models refer to an unseen,
shared understanding of how the organization operates, cultural settings are visible, concrete
manifestations of the organization’s priorities (Gallimore & Goldenberg, 2001; Rueda, 2011).
The findings indicate that while it is important for CSU campuses to promote a culture of
evidence and value continuous learning, the tangibility of campus support structures is critical to
help stakeholders move their data-driven insights to equity-minded actions.
Research Question 3: How Do Stakeholders’ Job Role and Level of Experience Relate to
their Stated Propensity to Apply Student Data to Improve their Practice?
A binomial logistic regression was conducted to determine the extent to which
stakeholders’ job role and number of years of experience in higher education predicted their
likelihood to apply student data to improve their practice. A new dichotomous variable was created
to classify the dependent variable into two categories: (0) does not apply data and (1) applies
data. Aggregate survey responses were re-coded as follows: Not Interested (0), Want to Do and
Not Ready (0), Want to Do and Ready (0), Do Occasionally (1) and Do Regularly (1).
Converting the continuous responses into a binary outcome variable offered a clear distinction
between applying data and not applying data. The same strategy was used to create a new
dichotomous variable for job role where (0) = not faculty and (1) = faculty. The years of
experience data were entered as a continuous variable.
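A hypothetical sketch of this recoding and model is shown below, using statsmodels with synthetic data; the column names and values are placeholders, and exponentiating the fitted coefficients yields the odds ratios of the kind reported in Table 12.

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in data; column names and values are placeholders.
rng = np.random.default_rng(3)
n = 211
df = pd.DataFrame({
    "a3_response": rng.integers(1, 6, size=n),   # 1-5 scale from item A3
    "role": rng.choice(["faculty", "staff", "administrator"], size=n),
    "years_experience": rng.integers(1, 36, size=n),
})

# Dichotomize the outcome: Do Occasionally (4) and Do Regularly (5) = applies data.
df["applies_data"] = (df["a3_response"] >= 4).astype(int)
# Dummy-code job role: faculty = 1, all other roles = 0.
df["is_faculty"] = (df["role"] == "faculty").astype(int)

X = sm.add_constant(df[["is_faculty", "years_experience"]])
logit_model = sm.Logit(df["applies_data"], X).fit(disp=False)

# Exponentiating the coefficients converts log odds to odds ratios.
odds_ratios = np.exp(logit_model.params)
print(logit_model.summary())
print(odds_ratios)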
Through binary logistic regression, a dichotomous outcome is translated into a linear
model by comparing the independent variables to the log odds of the outcome occurring
(Osborne, 2017). Table 12 shows the results of the logistic regression listing the beta coefficient,
p value and odds ratio for each of the independent variables.
Table 12: Logistic Regression Output of Faculty Job Role and Higher Education Experience as Predictors of the Likelihood of Applying Data to Improve Equity in One’s Practice

Predictor | Beta | p | Odds Ratio
Faculty Job Role | -.537 | .063 | .585
Higher Education Experience | .049 | .012** | .015
** p < .05
Although marginally outside the threshold for statistical significance (p = .063), the odds that
faculty would apply data to their practice were .585 times the odds for their non-faculty
colleagues enrolled in the program, or roughly 41.5% lower. This may be due to the fact that for CSU faculty, the
tenure and promotion policies ascribe notable value to research and scholarly activities while
deeming quality teaching to be a lower priority (CFA - Preparing Your RTP File for Review,
2019). Faculty who publish in prestigious journals typically enjoy a smooth path to tenure, while
those who focus on the classroom and dedicate less attention to scholarly endeavors may face a
more arduous road to promotion. As such, early-career faculty are incentivized to engage in
research as opposed to pursuing pedagogical innovations that help them become more data-
driven and equity-minded teachers.
The model also found years of higher education experience to be a statistically significant
predictor of stakeholders’ likelihood to apply data insights to their practice. Put differently, the
longer time participants had worked in the higher education sector, the more likely they were to
apply data-informed insights to their practice. This finding may also be indirectly connected to
the CSU’s tenure and promotion policy. Upon achieving tenure, after several years of devoting
their time and effort to scholarly activities, faculty have more time and freedom
to contemplate strategies for improving their pedagogical practice.
Summary of Findings
This chapter presented the results of the quantitative analyses conducted on the 211
survey responses submitted by CSU faculty, staff and administrators participating in the 2020
cohort of the Certificate Program in Student Success Analytics. The findings indicated that the
knowledge, motivation and organizational influences culled from the literature review had
varying levels of impact on stakeholders’ ability to meet their goal. Among the salient findings
were the following insights:
• Knowledge influences had a statistically significant, positive impact on
participants’ stated tendency to apply student equity data to improve their
practice.
• Motivation influences did not have a statistically significant impact on
participants’ practical application of data.
• The procedural knowledge of how to disaggregate data was the strongest
predictor of whether or not stakeholders applied data to improve their equity-
minded practice.
• The metacognitive knowledge of how to reflect on data also helped to predict
participants’ likelihood of applying data to their practice.
• CSU campus cultural settings related to support structures and planning time had
a statistically significant, positive impact on stakeholder knowledge and
motivation.
• CSU campus cultural models related to cultures of evidence and value of
professional development had little impact on stakeholder knowledge and
motivation.
• CSU faculty participants were significantly less likely to apply student data to
improve equity in their practice than their staff and administrator peers in the
program.
• Participants with higher levels of experience were more likely to apply data-
driven insights to improve their practice than their less experienced colleagues.
Table 13 distills the findings within the context of the nine knowledge, motivation and
organizational influences, and assigns each a priority level (high/low) derived from the
quantitative analyses.
Table 13: Prioritization of Knowledge, Motivation and Organizational Influences

Category | Influence | Priority
Conceptual Knowledge | Participants need to understand how the strategic use of data can influence educational outcomes for students. | Low
Procedural Knowledge | Participants need to know how to use disaggregated data to identify equity issues on their campus. | High
Metacognitive Knowledge | Participants need to know how to reflect on opportunities for leveraging student equity data to become more equity-minded in their practice. | High
Goal Orientation Motivation | Participants need to have a mastery goal orientation for achieving greater equity in their practice. | Low
Self-efficacy Motivation | Participants need to feel a strong sense of self-efficacy for their ability to analyze data to improve student outcomes. | Low
Cultural Model Organization | The organization needs a culture of evidence that values data-driven analysis and action. | Low
Cultural Model Organization | The organization needs a culture of continuous professional development to support faculty, staff and administrator growth. | Low
Cultural Setting Organization | The organization needs to give participants enough planning time to have ample opportunities to analyze student data. | High
Cultural Setting Organization | The organization needs to provide participants with effective support structures. | High
Chapter 5 synthesizes insights from the literature review with findings from the
quantitative analyses to present a list of prioritized recommendations for better addressing the
influences listed in Table 13.
CHAPTER FIVE: SOLUTIONS AND RECOMMENDATIONS
Amplifying the study’s findings, Chapter 5 offers evidence-based recommendations for
addressing the validated needs influencing certificate program participants’ ability to leverage
data to improve equity in their practice. The recommendations, aligned with the conceptual
framework and findings, were bolstered by the identification of relevant evidence-based
principles for improving stakeholder performance. In support of these recommendations, the
chapter proposes an implementation plan rooted in the New World Kirkpatrick model for
organizational change (Kirkpatrick & Kirkpatrick, 2016). Central to the plan are a series of
recommended enhancements to the Certificate Program in Student Success Analytics organized
by the four levels of the Kirkpatrick model: Results, Behavior, Learning and Reaction. Next, the
chapter presents an evaluation plan for monitoring the impact of the newly revised training
program on stakeholder motivation, confidence, learning and likelihood to transfer their new
knowledge to their practice. The chapter concludes with a discussion of the limitations and
delimitations of the study and suggestions for future research.
Knowledge Recommendations
To meet their goal of applying equity data to improve their practice, stakeholders need to
have (a) conceptual knowledge of how the strategic use of data can influence educational
outcomes for students, (b) a procedural understanding of how to use disaggregated data to
identify equity issues on their campus and (c) the metacognitive skill to reflect on opportunities
for leveraging student equity data to become more equity-minded in their practice. The
quantitative analyses in Chapter 4 determined that stakeholder procedural and metacognitive
knowledge were the most influential predictors of applying data-driven insight to one’s practice.
Table 14 reintroduces the knowledge influences, assigning a higher level of priority to the
procedural and metacognitive influences. The final column of Table 14 includes a context-
specific recommendation for addressing each item based on relevant learning theory principles.
Table 14: Summary of Knowledge Influences and Recommendations

Knowledge Influence (Conceptual): Participants need to understand how the strategic use of data can influence educational outcomes for students.
Priority: Low
Principle and Citation: Modeled behavior is more likely to be adopted if the model is credible, similar (e.g., gender, culturally appropriate) and the behavior has functional value (Denler et al., 2014).
Context-Specific Recommendation: Provide information in the form of first-hand testimonials of how faculty and staff enhanced student outcomes by analyzing data and applying evidence-based insights to improve their practice.

Knowledge Influence (Procedural): Participants need to know how to use disaggregated data to identify equity issues on their campus.
Priority: High
Principle and Citation: When people indicate not knowing procedures to accomplish a goal, there is a clear example of a loss of known information to complete the task (Clark & Estes, 2008).
Context-Specific Recommendation: Provide a job aid that includes a flow chart illustrating the requisite steps to disaggregate student data for the purposes of illuminating equity issues.

Knowledge Influence (Metacognitive): Participants need to know how to reflect on opportunities to use student data to become more equity-minded in their practice.
Priority: High
Principle and Citation: Debriefing the thinking process related to the completion of new tasks facilitates the learning process (Baker, 2006).
Context-Specific Recommendation: Provide training that includes opportunities for learners to engage in guided self-monitoring and self-assessment.
Build Participants’ Conceptual Knowledge for Leveraging Data to Improve Student
Success. Although not a top priority, program participants need a conceptual understanding of
how the strategic use of data can influence educational outcomes for students. The principles of
social cognitive theory, which highlight the value of observational learning, include strategies for
effectively closing the conceptual knowledge gap (Bandura, 2000). Elaborating on social
cognitive theory, Denler et al. (2014) and Mayer (2011) demonstrated that modeled behavior is
more likely to be adopted if the model is deemed to be credible and relatable to the learner. In
line with this assertion, the recommendation is to recruit CSU faculty and staff who have
successfully applied data to improve outcomes for their students to serve as models. These
faculty and staff would supply learners with first-hand testimonials of how they used student data
to improve their practice, demonstrating the conceptual interconnectedness between their data-
driven actions and the subsequent improvement in student performance. This recommendation
would afford program participants enhanced opportunities to increase their conceptual
knowledge through interaction with relatable models with similar jobs and experiences.
Reinforcing conceptual knowledge requires learners to gain an understanding of relevant
principles, theories and relationships that work together to produce a specific outcome
(Krathwohl, 2002). Having faculty and staff with different job roles describe these
interconnections can serve as a catalyst not only for promoting conceptual learning, but also for
compelling data-informed action. Felix et al. (2015) described a professional development
program at the Community College of Aurora where teams of equity-minded professionals from
across the institution worked together to model essential concepts related to data-driven, equity-
minded improvement. Through an examination of syllabi and an analysis of student outcome
data disaggregated by race, gender and ethnicity, campus leaders demonstrated the relationship
between pedagogical practice and disparate student outcomes. In addition to clarifying the
benefits of using data to improve student outcomes, the program adopted an Equity Scorecard to
continuously monitor the academic progress of students of color (Felix et al., 2015). As CSU
stakeholders are exposed to testimonials from their colleagues about how data can be used to
nurture student success, their conceptual knowledge gaps will likely close.
Improve Participants’ Procedural Knowledge for Using Disaggregated Data to
Identify Equity Issues. The findings of this study indicated a significant need for stakeholders
to have a procedural understanding of how to disaggregate data to identify equity issues. As
presented in Chapter 2, procedural knowledge involves an understanding of the steps that must
be taken within a specific sequence to deliver a desired outcome (Krathwohl, 2002). In
accordance with information processing system theory, Clark and Estes (2008) contended that
when employees do not know the procedures to meet their objective, they should be given step-
by-step information to assist them in completing the task. In support of this principle, the
recommendation is to develop a job aid for stakeholders that includes a flowchart illustrating the
requisite steps to disaggregate student data for the purposes of illuminating equity issues. The
flowchart will serve as a graphic organizer enumerating the tasks that need to be completed to
identify appropriate datasets, interrogate student characteristic data and disaggregate salient
outcome measures by race, gender, ethnicity and socio-economic status. Serving as a reliable
source of self-help guidance, the job aid will remind stakeholders of the process for distilling
large datasets into smaller units, thereby enabling an evaluation of the equitable attainment of
student performance outcomes.
Clark and Estes (2008) stressed that job aids are most effectively implemented in
situations where stakeholders already possess ample experience, and when they do not require
extensive guided practice to complete the procedure. Given that the majority of certificate
program participants have previously engaged in scholarly data analysis, the job aid will offer a
timely reminder of how to accomplish the previously-learned task of disaggregating data. In their
analysis of successful equity-minded change efforts, Dowd and Bensimon (2015) emphasized the
criticality of disaggregating data by race and extolled the value of developing visual
representations of the underlying inequities. The inclusion of graphical flowcharts in the job aid
will capitalize on this finding and enhance the impact of the message. Relying on visually
intuitive job aids will augment stakeholders’ procedural knowledge of how to disaggregate data
to identify pertinent equity challenges. As the findings determined procedural knowledge to be a
statistically significant predictor of stakeholder likelihood to improve their equity-minded
practice, this recommendation is a high priority.
Build Participants’ Metacognitive Knowledge of Leveraging Student Data to
Become More Equity-minded. The study’s findings also underscored the importance of
advancing stakeholders’ metacognitive knowledge related to their self-reflection on becoming
more data-informed and equity-minded practitioners. Metacognitive knowledge is a central
component of strategic problem-solving, empowering stakeholders to internalize information and
gain personal awareness of their learning (Krathwohl, 2002; Mayer, 2011; Rueda, 2011). To
promote metacognition and facilitate learning, Baker (2006) emphasized the importance of
debriefing the thinking process related to the completion of new tasks. Based on these assertions
aligned with information processing system theory, the recommendation is to offer stakeholders
training that includes opportunities to engage in guided self-monitoring and self-assessment.
Within the certificate program, this recommendation will be implemented through a series of
discussion board prompts that ask stakeholders to reflect on their key takeaways from each
course session and think about how they intend to apply the new knowledge to improve their
practice.
Svinicki et al. (2016) discovered that higher education leaders who were encouraged to
think deeply about student equity data were more likely to take steps to improve their practice
based on these reflections. Central to the study was the notion that metacognitive reflection that
leads to impactful change must be purposeful and directed (Svinicki et al., 2016). By embedding
opportunities for reflection within the certificate program curriculum and asking stakeholders to
regularly post their thoughts to the discussion board, intentional metacognitive exercises will be
integrated into the learning experience. As the findings determined metacognitive knowledge to
be a statistically significant predictor of stakeholder tendency to improve their equity-minded
practice, this recommendation is a high priority.
Motivation Recommendations
In addition to the aforementioned knowledge influences, the motivation factors listed in
Table 15 need to be addressed to help stakeholders meet their goal of applying equity data to
improve their practice. Certificate program participants must possess a mastery goal orientation
toward achieving greater equity in their practice, and they need to feel more self-efficacious in
their ability to analyze and apply data to improve student outcomes. While the study’s findings
determined that motivational influences were not strong predictors of participants’ likelihood to
apply data-driven insight to their practice, the literature review summarized several examples to
the contrary. As such, Table 15 indicates the lower priority level of the motivational influences
and provides a context-specific recommendation for addressing each need based on relevant
learning theory principles.
Table 15
Summary of Motivation Influences and Recommendations

Influence: Participants need to have a mastery goal orientation for achieving greater equity in their practice. (Goal Orientation)
Priority: Low
Principle and Citation: Goal orientation theory suggests that cultivating a mastery orientation enhances learning, motivation and performance (Yough & Anderman, 2006).
Context-Specific Recommendation: Engage participants in goal-setting professional development activities that establish realistic, near-term metrics for applying student data to improve equity in their practice.

Influence: Participants need to feel a strong sense of self-efficacy for their ability to analyze data to improve student outcomes. (Self-efficacy)
Priority: Low
Principle and Citation: Feedback that is private, specific, and timely enhances performance (Shute, 2008).
Context-Specific Recommendation: Embed the certificate program with regular homework assignments that provide stakeholders with personalized feedback about their progress mastering data analysis tasks.
Foster Participants’ Mastery Goal Orientation for Achieving More Equity in Their
Practice. Although not a top priority, program participants need to possess a mastery goal
orientation for approaching the task of using data to promote more equitable educational
outcomes for their students. Goal orientation theory posits that forming a mastery orientation
augments motivation, learning and performance (Yough & Anderman, 2006). Unlike
performance goals, which involve measuring relative competence compared to others, mastery
goals are focused on learning and improvement derived from a set of self-developed standards
(Cerasoli & Ford, 2014). The principles of mastery goal orientation theory suggest that CSU
stakeholders will achieve greater results if they set goals that establish near-term targets for
applying data to improve their practice. In support of this theory, the recommendation is to
engage program participants with a goal-setting activity that facilitates their establishment of a
series of metrics related to applying equity data to their practice. Subsequently, participants will
be asked to reflect on their goals and post monthly updates to the discussion board detailing their
progress in becoming more equity-minded practitioners.
In a study measuring the impact of goal orientation theory on public school teacher
behavior, Butler and Shibaz (2014) found that teachers with mastery orientations were more
likely to deliver stimulating and engaging instruction than their performance-oriented peers.
Camp (2017) reported similar findings at Minnesota State University, Mankato, where
mastery-oriented teaching candidates demonstrated a higher propensity to commit to
improvement in the classroom than their colleagues with a performance-orientation or no goal
orientation. Both studies suggest that the autonomous, self-driven nature of the goal setting
activity is an important aspect of nurturing the drive for self-improvement (Butler & Shibaz,
2014; Camp, 2017). In line with this contention, certificate program participants will be
encouraged to set mastery goals and establish near-term improvement measures derived from
their analyses of student equity data.
Cultivate Participants’ Self-Efficacy for Analyzing and Applying Student Data. In
addition to having a mastery goal orientation, stakeholders also need self-efficacy for analyzing
and applying student data. Although a lower priority than addressing procedural and
metacognitive knowledge needs, the certificate program must promote efficacious feelings
toward improving one’s equity-minded practice. Self-efficacy theory asserts that motivation and
performance are heightened when learners form positive expectations for success (Pajares,
2006). Shute (2008) contended that furnishing learners with directed feedback that is private,
specific and timely is an effective strategy for enhancing self-efficacy. This would suggest that to
stimulate efficacious beliefs about their ability to become more data-informed practitioners,
stakeholders need ample opportunities to receive personalized feedback about their performance.
In support of this premise, the recommendation is to embed the certificate program with regular
homework assignments that allow opportunities for participants to apply newly learned concepts
to their practice. Program administrators will dispense personalized feedback on the homework
assignments and contribute reassuring, constructive comments to increase participants’ self-
efficacy for analyzing and applying data.
A central tenet of self-efficacy theory is the notion that learners can improve the belief in
their ability to succeed through constructive feedback that encourages further progress toward
meeting their goal (Pajares, 2006). Studies have demonstrated a variety of successful strategies
for offering feedback-rich professional development to increase educators’ self-efficacy in their
pedagogical practices. A mixed methods analysis of a professional development program for
medical faculty in Bhutan determined that frequent interactions with peer coaches greatly
enhanced program participants' self-efficacy for improving their pedagogy (Tenzin et
al., 2019). This finding suggests that implementing opportunities for meaningful feedback in the
certificate program will increase stakeholders’ self-efficacy for leveraging data to improve
student outcomes.
Organization Recommendations
In addition to mitigating performance challenges posed by knowledge and motivation
gaps, addressing organizational influences is also vital to supporting stakeholder goal attainment
(Clark & Estes, 2008). The organizational factors listed in Table 16 indicate that CSU campuses
need to a) advance a culture of evidence, b) encourage faculty and staff to engage in
professional development, c) offer ample planning time and d) ensure that stakeholders have
reliable support structures. Consistent with Gallimore and Goldenberg’s (2001) classification
structure introduced in Chapter 2, the organizational influences are identified as cultural models
or cultural settings. In accordance with the study’s findings, the two cultural setting influences
have been designated as high priorities because they were determined to be statistically
significant contributors to stakeholders’ knowledge and motivation. The final column in Table
16 provides recommendations derived from relevant learning theory principles for attenuating
each organizational influence gap.
Table 16
Summary of Organizational Influences and Recommendations

Influence: The organization needs a culture of evidence that values data-driven analysis and action. (Cultural Model)
Priority: Low
Principle and Citation: Effective change efforts use evidence-based solutions and adapt them, where necessary, to the organization's culture (Clark & Estes, 2008).
Context-Specific Recommendation: Develop a faculty portal that leverages statistical modeling to identify salient equity gaps in all CSU courses and provides the campus community with evidence-based solutions for improvement.

Influence: The organization needs a culture of continuous professional development to support faculty, staff and administrator growth. (Cultural Model)
Priority: Low
Principle and Citation: Organizational effectiveness is enhanced when leaders provide staff with the material and professional development needed for the successful execution of their jobs (Waters et al., 2003).
Context-Specific Recommendation: Implement a "train the trainers" model to build capacity and resource support for scaling the Certificate Program curriculum throughout the CSU.

Influence: The organization needs to give participants enough planning time to have ample opportunities to analyze student data. (Cultural Setting)
Priority: High
Principle and Citation: Ensuring that staff's resource needs are being met is correlated with increased employee learning outcomes (Waters et al., 2003).
Context-Specific Recommendation: Provide program participants with unstructured "team planning time" to analyze student data in a collaborative, solutions-oriented environment.

Influence: The organization needs to provide participants with effective support structures. (Cultural Setting)
Priority: High
Principle and Citation: Effective change efforts ensure that everyone has the resources (equipment, personnel, time, etc.) needed to do their job, and that if there are resource shortages, then resources are aligned with organizational priorities (Clark & Estes, 2008).
Context-Specific Recommendation: Create an artificial intelligence "chat-bot" that facilitates access to resources that support stakeholder efforts to become more equity-minded and data-driven.
Promote a Culture of Evidence that Values Data-Driven Analysis and Action. The
CSU needs to foster a culture of evidence and data-informed decision making to facilitate
participants’ application of student data to their practice. Although the study’s findings
determined this influence to be lower priority, a recommendation rooted in change management
theory has been selected to address the need. Clark and Estes (2008) maintained that effective
change efforts are grounded in evidence-based solutions that are adapted, where necessary, to the
organization’s culture. This suggests that faculty need to have access to student equity data in the
courses they teach, and that this information should be contextualized in alignment with the
CSU’s equity mission. The recommendation is to develop a faculty portal that uses statistical
modeling to identify salient equity gaps in all CSU courses and offer faculty personalized,
evidence-based solutions for improvement. The faculty portal will be linked to the Graduation
Initiative 2025 website, thereby connecting faculty change efforts to the CSU system-wide goal
of eliminating equity gaps in college graduation rates.
Effective cultures of evidence promote equity by leveraging data to uncover disparities
across the organization and by fostering the collaborative identification of solutions to rectify
inequities (Harris & Bensimon, 2007). Disaggregated equity gap data can serve as a powerful
resource for faculty, staff and administrators to pursue change efforts that lead to more equitable
outcomes. Dowd and Liera (2018) chronicled a case study detailing how faculty at Old Main
University interrogated course outcome data disaggregated by race and ethnicity to identify
opportunities for changing their practice to promote more equitable outcomes. Implementing
these purposeful efforts narrowed equity gaps in the grade distributions while increasing learning
for all students (Dowd & Liera, 2018). Building from this example, faculty clearly need access to
equity gap data for the courses they teach along with examples of course redesign models that
successfully closed GPA gaps in similar courses across the CSU. The recommendation to
develop a faculty portal is designed to meet this need by highlighting equity gaps in all courses
taught at the CSU and by including faculty testimonials of how they redesigned their courses and
adopted inclusive pedagogical strategies to promote more equitable outcomes.
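As a rough illustration of the kind of computation such a portal might run behind the scenes, the Python sketch below flags courses whose GPA gap exceeds a chosen threshold. The column names (course_id, student_group, grade_points), the two-group comparison and the 0.3 threshold are simplifying assumptions for illustration, not a specification of the actual portal.

# Illustrative sketch of a portal back-end check for course-level GPA gaps.
# Column names, group labels and the threshold are simplifying assumptions.
import pandas as pd

def flag_gpa_gaps(df: pd.DataFrame, threshold: float = 0.3) -> pd.DataFrame:
    """List courses whose mean GPA gap between comparison groups exceeds the threshold."""
    by_course = df.pivot_table(
        index="course_id",
        columns="student_group",          # e.g., "students_of_color" vs. "peer"
        values="grade_points",
        aggfunc="mean",
    )
    by_course["gpa_gap"] = by_course["peer"] - by_course["students_of_color"]
    flagged = by_course[by_course["gpa_gap"] > threshold]
    return flagged.sort_values("gpa_gap", ascending=False)

A production portal would layer the statistical modeling, faculty testimonials and evidence-based solutions described above on top of a ranking of this kind.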
Prioritize Professional Development Opportunities for Improving Data-Informed
Practice. Although a lower priority, the CSU also needs to promote professional development
opportunities broadly throughout the institution to support faculty and staff efforts to use data to
improve their practice. A recommendation derived from the concept of balanced leadership was
crafted to address the need. A critical feature of the balanced leadership model espoused by
Waters et al. (2003) stipulates that organizational effectiveness is enhanced when leaders provide
staff with the material and professional development needed for the successful execution of their
jobs. This suggests that scaling the certificate program’s curriculum and offering equity-minded
professional development to faculty and staff across the CSU would improve performance. In
line with this assertion, the recommendation is to implement a “train the trainers” model to build
capacity for the certificate program curriculum to be delivered across the CSU.
Senge (1990) asserted that cultivating a learning organization that is committed to
continuous data-driven improvement is a sustainable strategy for driving improved performance.
In the higher education sector, Costino (2018) and Carney et al. (2016) stressed the need for a
broad-reaching institutional commitment to equity-minded professional development to
demonstrate that the university values continuous learning and is committed to eliminating
inequities. These studies suggest that scaling the professional development program to reach a
larger audience of CSU practitioners would afford the CSU greater opportunities to meet its goal
of eliminating equity gaps in degree attainment rates by 2025. Acknowledging the significant
cost associated with increasing the number of participants, a blended "train the trainers" model
offers a cost-effective approach to scaling the program (Pearce et al., 2012). The
recommendation is to invite campus faculty development directors to participate in the 2021
cohort of the certificate program and engage them with additional equity-minded professional
development to enable them to lead similar sessions on their campuses.
Ensure Stakeholders Have Ample Planning Time to Analyze Student Data. With
respect to cultural settings, the study’s findings indicated that sufficient planning time was a
statistically significant predictor of the tendency for stakeholders to apply student data to their
practice. In their meta-analysis of the effects of leadership on achievement, Waters et al. (2003)
averred that supplying adequate planning time and resources is correlated with increased learning
outcomes. In line with this assertion, the recommendation is to incorporate unstructured “team
planning time” into the program to help program participants analyze student data in a
collaborative, solutions-oriented environment. Interrogating data, conducting analyses and
contemplating opportunities for improving equity in one's practice are all time-consuming
endeavors (Bensimon, 2004; Dowd & Liera, 2018). This recommendation affords program
participants the requisite time and support to become more data-informed and equity-
minded practitioners.
In a book describing various approaches to analytics-based transformational change,
Gagliardi et al. (2018) underscored that planning time is essential for university leaders to
immerse themselves in student data, understand their meaning and contemplate potential
applications to their practice. Planning time is also a central component of the Equity Scorecard
model in which participatory action research affords campus teams collaborative planning
time to conduct equity-based inquiry into their practice (Bensimon et al., 2016). The
recommendation to include unstructured team planning time as part of the certificate program
curriculum aligns with the findings of these two studies. In their campus groups, stakeholders
will have the opportunity to explore the practical application of new learning to their local
contexts.
Provide Stakeholders with Effective Support Structures for Improving Equity in
Their Practice. Lastly, the results of this study indicated that the CSU needs to furnish
participants with robust support structures to help them meet their goals. The findings indicated
that addressing this need is a high priority, as the availability of sufficient campus support was a
strong predictor of stakeholder likelihood to improve equity in their practice. A recommendation
rooted in change management theory has been selected to address this organizational need. Clark
and Estes (2008) contended that effective change efforts ensure that everyone has the resources
(equipment, personnel, time, etc.) needed to do their job. This suggests that faculty and staff need
guidance throughout the program to help them derive insights from student data and apply this
knowledge to their practice. In an effort to offer stakeholders continuous and sustainable support,
the recommendation is to develop an artificial intelligence “chat-bot” that facilitates access to
and an explanation of resources that support equity-minded and data-driven practices.
Research has shown that performance is heightened when leaders supply staff with the
resources needed to meet their professional goals (Clark & Estes, 2008; Waters et al., 2003). In
a faculty development study by the American Council on Education, Haras et al. (2017)
emphasized the importance of offering instructional faculty high-quality teaching and learning
resources to aid student-centered changes in their pedagogy. The report included examples of
how resources such as tutorials, videos, research studies, learning tools, websites and e-mail
listservs contributed to producing notable improvements in the level of faculty teaching
effectiveness. Although many of these resources are readily available to participants of the
certificate program, they are not organized and contextualized in a manner that makes them easy
to locate and intuitive to use. The recommendation to create an artificial intelligence "chat-bot"
will remedy this problem by answering participants’ resource-related questions and presenting
support solutions that most closely align with their needs.
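A production chat-bot would rely on a natural-language platform, but the minimal Python sketch below conveys the underlying idea of matching a participant's question to a curated resource catalog. The catalog entries, keywords and matching logic are invented here purely for illustration.

# Minimal keyword-matching sketch of the proposed resource chat-bot.
# The resource catalog and keywords below are invented for illustration.
RESOURCES = {
    "Job aid: flowchart for disaggregating student data by race": {"disaggregate", "race", "flowchart"},
    "Faculty development center: inclusive course redesign guide": {"pedagogy", "redesign", "syllabus"},
    "CSU Student Success Dashboard tutorial": {"dashboard", "gpa", "grades"},
}

def suggest_resources(question: str) -> list:
    """Return catalog entries whose keywords appear in the participant's question."""
    words = set(question.lower().replace("?", "").split())
    matches = [name for name, keywords in RESOURCES.items() if keywords & words]
    return matches or ["No match found; a program administrator will follow up."]

# Example:
# suggest_resources("How do I disaggregate course grades by race?")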
Integrated Implementation and Evaluation Plan
To guide the application of the aforementioned recommendations, the researcher
developed an implementation and evaluation plan grounded in the principles of the New World
Kirkpatrick Model for organizational change (Kirkpatrick & Kirkpatrick, 2016). The model
outlines four stages of evaluation, presented in reverse order beginning with Level 4, which
evaluates the extent to which improved performance is achieved as a result of the training
program (Kirkpatrick & Kirkpatrick, 2016). After focusing on results, the model calls for an
identification of specific actions in Level 3 that demonstrate how participants' behaviors reflect
the transfer of new knowledge to improve their practice. Level 2 then assesses participants’
learning by measuring various aspects of their skill, knowledge, attitude and commitment-level
following the program. Finally, Level 1 assesses reactions, gauging the extent to which
stakeholders found the training to be valuable, engaging and relevant to their practice. With the
New World Kirkpatrick Model as a foundation, the evaluation plan detailed in the following
sections puts forth a series of suggested enhancements to the CSU Certificate Program in Student
Success Analytics, creating an action plan for ameliorating salient KMO gaps that prevent
stakeholders from meeting their goal.
Organizational Purpose, Need and Expectations
The CSU has set the goal of eliminating equity gaps in graduation rates between
historically underserved students and their more privileged peers by 2025. As part of a larger
effort to meet this goal, the CSU developed the Certificate Program in Student Success Analytics
to supply faculty, staff and administrators with guided practice in conducting evidence-based
inquiry to identify opportunities for increasing equitable student outcomes. The 2020 cohort of
CSU program participants—the collective stakeholders for this study—will help the CSU meet
its goal by learning to analyze and apply student equity data to improve their practice, resulting
in more equity-minded programs, teaching strategies and policies. The recommendations
identified in this study will empower stakeholders with the knowledge, motivation and
organizational support to make fundamental changes to their practice that advance greater equity
throughout the CSU.
Level 4: Results and Leading Indicators
Table 17 indicates the proposed approach to evaluating the results (Level 4) with respect
to certificate program participant achievement of the external and internal desired outcomes,
metrics and methods. If the internal outcomes are accomplished through the proposed
enhancements to the training program and augmented organizational support for analyzing and
applying student data to practice, then the external outcomes should also be realized.
Table 17
Outcomes, Metrics and Methods for External and Internal Outcomes

External Outcomes

Outcome: Increased academic success for students of color
Metric(s): Improved course passage rates for students of color
Method(s): Conduct trend analysis of course passage rates disaggregated by race

Outcome: Increased continuous enrollment of students of color
Metric(s): Increased retention rates for students of color
Method(s): Conduct trend analysis of first, second, and third year retention rates for students of color

Outcome: Increased participation in the Student Success Analytics Certificate Program
Metric(s): Increased number of program participants (2021 vs. 2020)
Method(s): Monitor program registrations

Outcome: Increase in the number of faculty, staff and administrators of color
Metric(s): Increased number of CSU employees of color (2021 vs. 2020)
Method(s): Track annual HR employee status report

Internal Outcomes

Outcome: More frequent use of disaggregated data sets by faculty, staff and administrators
Metric(s): Increased visits to the CSU faculty dashboard
Method(s): Track dashboard usage statistics

Outcome: Greater sense of belonging for students of color
Metric(s): Improvement in students of color's responses to engagement-related questions on the National Survey of Student Engagement (NSSE)
Method(s): Conduct trend analysis of how students of color respond to NSSE engagement questions

Outcome: Increased understanding among CSU faculty, staff and administrators of how to overcome inherent prejudices to promote equity
Metric(s): Increased participation in system-wide implicit bias training
Method(s): Track participation in system-wide implicit bias training
Level 3: Behavior
Critical Behaviors. While defining the desired results clarifies the expected outcomes of the
change effort, stakeholders' critical behaviors will ultimately determine whether they
achieve the desired outcomes. Level 3 of the New World Kirkpatrick Model concentrates on
actions taken by stakeholders to transfer what they learned in the certificate program into on-the-
job behaviors (Kirkpatrick & Kirkpatrick, 2016). These newly implemented behaviors are
critical, in that they directly impact the organizational outcomes in ways that are both observable
and measurable. Table 18 lists four actions that certificate program participants need to take to
achieve their goal of using evidence to improve equity in their practice. Critical behaviors for
faculty include accessing equity gap grade data for the courses they teach and using these data to
make equity-minded changes to their pedagogy to better support students of color.
Administrators and staff need to implement the data action research projects they developed in
the program, and all participants need to share their reflections on improving equity in their
practice. Table 18 expands on these critical behaviors and provides a suggested metric and
method for tracking successful execution of the behavior along with an associated timeline.
Table 18
Critical Behaviors, Metrics, Methods and Timing for Evaluation

Critical Behavior 1: Participating faculty access equity gap grade data for the courses they teach.
Metric(s): Number of faculty who visit the faculty portal
Method(s): Faculty dashboard usage statistics
Timing: At the conclusion of every academic term

Critical Behavior 2: Participating faculty make data-informed changes to their pedagogy to increase equity in their courses.
Metric(s): Number of faculty who report improving their equity-minded pedagogy as a result of the program
Method(s): Survey
Timing: One year after conclusion of the program

Critical Behavior 3: Participating administrators and staff implement their team's action research project.
Metric(s): Number of action research projects that reach the implementation stage
Method(s): Team lead reports
Timing: Six months after conclusion of the program

Critical Behavior 4: All participants document their reflections on improving equity in their practice.
Metric(s): Number of participants who post their reflections
Method(s): Learning community discussion board postings
Timing: Monthly
Required Drivers. Moving from learning to action requires organizational drivers in the
form of both support and accountability (Kirkpatrick & Kirkpatrick, 2016). Certificate program
participants need the support of the campus community to reinforce their newly acquired knowledge
and ensure that their learning is transferred into equity-minded action. The drivers in Table 19 are
designed to reinforce, encourage, reward and monitor actions to help participants cultivate more
equitable outcomes for their students.
Table 19
Required Drivers to Support Critical Behaviors

Reinforcing

Method: Job aids for disaggregating student data by race
Timing: Provided in certificate program, available online
Critical Behaviors Supported: 1, 2

Method: Implicit bias training
Timing: Ongoing
Critical Behaviors Supported: 1, 2, 3, 4

Method: Posting reflections to learning community discussion board
Timing: After every semester
Critical Behaviors Supported: 1, 2, 3, 4

Encouraging

Method: Pedagogical support from campus faculty development centers
Timing: Ongoing
Critical Behaviors Supported: 1, 2

Method: Article in system-wide newsletter
Timing: Annual
Critical Behaviors Supported: 1, 2, 3

Method: Peer feedback delivered via learning community discussion board
Timing: Ongoing
Critical Behaviors Supported: 1, 2, 3, 4

Rewarding

Method: Recognition at the Graduation Initiative Symposium
Timing: Annual event (October)
Critical Behaviors Supported: 1, 2, 3

Method: Public acknowledgement from campus president, provost, and vice president for student affairs
Timing: Upon successful conclusion of Certificate Program (June)
Critical Behaviors Supported: 1, 2, 3

Monitoring

Method: Post-program survey
Timing: One month following program (July)
Critical Behaviors Supported: 1, 2, 3

Method: Program administrators conduct web meeting to review progress on action research projects
Timing: Biannually after program concludes
Critical Behaviors Supported: 3
Organizational Support. Kirkpatrick and Kirkpatrick (2016) averred that organizational
support serves as an essential measure of accountability, greatly increasing the likelihood that
new learning will result in productive action. To assist program participants’ transfer of insights
derived from their analysis of student data into more equitable behavior, the CSU needs to
implement the required drivers listed in Table 19. These drivers are grounded in the CSU’s moral
imperative to ensure that students from all backgrounds have equitable opportunities to attain a
high-quality college degree. Among other support services, the CSU will provide job aids for
disaggregating data, equity-minded professional development, coaching from faculty
development centers and public recognition of exemplary implementations. More importantly,
stakeholders will have ongoing access to the course discussion boards and chat tools, presenting
robust opportunities for continued collaboration with the learning community. Implementing
these tools and support structures will not require a large financial outlay of organizational
resources; collectively, the suggested enhancements offer the CSU a cost-effective means of
improving the certificate program with minimal additional investment.
Level 2: Learning
Level 2 focuses on the learning process by which participants gain the prerequisite
knowledge, skills, confidence and commitment to make consequential behavioral changes in
their practice to improve equity (Kirkpatrick & Kirkpatrick, 2016). Setting learning goals for the
program establishes a baseline from which to develop the curriculum and establish metrics to
evaluate the extent to which the goals are being met.
Learning Goals. Upon successful completion of the recommended enhancements to the
Certificate Program in Student Success Analytics, participants will be able to:
1. Explain how analyzing student data can produce insights that help improve equity in their
practice (Conceptual Knowledge).
2. Demonstrate how to disaggregate student outcome data by race (Procedural Knowledge).
3. Reflect on opportunities to leverage student data to promote more equitable student
outcomes (Metacognitive Knowledge).
4. Gain confidence in their ability to become more data-informed and equity-minded
practitioners (Self-efficacy).
5. Set goals for improving equity in their practice (Goal Orientation).
Program. The learning goals listed above will be achieved through stakeholder
engagement with the Certificate Program in Student Success Analytics, a four-month
professional development program. The program consists of two face-to-face convenings: a
kickoff meeting in late January and a closing session in early May. In the months in between,
campus teams of 10-15 faculty, staff and administrators participate in a series of biweekly hybrid
learning sessions featuring guest presenters from around the nation who lead interactive
discussions about equity-related topics. Following the presentations, leaders on each campus
team contextualize the discussion by identifying opportunities for applying the evidence-based
strategies to their local context.
Acknowledging the importance of the social aspect of learning, the certificate program
promotes cross-campus collaboration by advancing a learning community model. A web-based
discussion board serves as the central hub of the learning community, where participants can ask
questions, share progress, contribute feedback and post reflections. To improve their mastery of
data analytics skills, participants are encouraged to complete biweekly homework assignments
that offer additional practice for interrogating data and identifying salient patterns within the
CSU’s dashboards. Program administrators contribute their feedback to these assignments and
give suggestions for deeper exploration. Emphasizing the importance of transferring learning to
improve practice, at the culmination of the program, campus teams submit an action research
project proposal detailing how they intend to enhance a campus program or policy to support
more equitable outcomes for historically underserved students.
Evaluation of the components of learning. Clearly defining a program’s learning
objectives not only guides the curriculum development process, but also establishes guideposts
for evaluating the extent to which the desired learning is taking place. Kirkpatrick and
Kirkpatrick (2016) delineated five essential aspects of evaluating Level 2 learning: knowledge,
skills, attitude, confidence and commitment. The knowledge metric is focused on the information
and skills stakeholders need to meet their goal. Attitude can be measured by assessing the value
that participants ascribe to completing the newly learned tasks. Participants’ confidence levels
can be determined by evaluating beliefs about their ability to successfully accomplish a given
task. And finally, stakeholder commitment can be assessed by gauging stakeholder motivation to
follow through with performing the new learning. Table 20 lists the evaluation methods and
timing for each component of learning embedded within the certificate program.
Table 20
Evaluation of the Components of Learning for the Program

Declarative Knowledge: "I know it."

Method or Activity: Knowledge checks through paired activities and discussions with learning community colleagues and team leaders
Timing: Ongoing throughout the program

Method or Activity: Knowledge checks through homework assignments
Timing: Biweekly throughout the program

Method or Activity: Knowledge check through post-program survey
Timing: Upon completion of the program

Procedural Skills: "I can do it right now."

Method or Activity: Feedback received from peers during peer-to-peer activities
Timing: Ongoing throughout the program

Method or Activity: Utilization of job aids, in groups and individually, to successfully disaggregate student data by race
Timing: Ongoing throughout the program

Method or Activity: Scenario-based activities assessed through homework assignments
Timing: Biweekly throughout the program

Attitude: "I believe this is worthwhile."

Method or Activity: Team leaders' observations of participants' statements and actions demonstrating that they see the benefit of becoming more equity-minded
Timing: Ongoing throughout the program

Method or Activity: Team discussions of the value of leveraging data to uncover insights for more equitable practice
Timing: Ongoing throughout the program

Method or Activity: Retrospective feedback provided via survey assessment
Timing: Upon completion of the program

Confidence: "I think I can do it on the job."

Method or Activity: Self-reflection posts to learning community discussion board regarding participants' feelings about applying the learning to their practice
Timing: Bimonthly

Method or Activity: Goal-setting activities specifying near-term targets for improved practice
Timing: Ongoing throughout the program

Method or Activity: Retrospective feedback provided via survey assessment
Timing: Upon completion of the program

Commitment: "I will do it on the job."

Method or Activity: Self-reflection posts to learning community discussion board about progress implementing data-informed insights into their practice
Timing: Bimonthly

Method or Activity: Progress update on implementation of action research project
Timing: Two months after program completion
Level 1: Reaction
Stakeholders’ reactions to the certificate program training impact their likelihood to apply
newly acquired knowledge to their jobs to improve performance. Level 1 of the New World
model measures participants’ reactions to the program by assessing their engagement with the
content, their perceptions of the relevance of the curriculum to their jobs and their overall level
of satisfaction (Kirkpatrick & Kirkpatrick, 2016). Gathered throughout the program, this
formative feedback will guide program administrators to take mid-course corrections that
promote continuous improvement. Table 21 outlines the measurement strategies that will gauge
stakeholders’ reactions to the program.
Table 21
Components to Measure Reactions to the Program

Engagement

Method or Tool: Attendance
Timing: Ongoing during program

Method or Tool: Participation in course discussions
Timing: Ongoing during program

Method or Tool: Timely completion of homework assignments, reflective posts to discussion board and action research project
Timing: Ongoing during program

Relevance

Method or Tool: Discussion with learning community and team leaders
Timing: Ongoing during program

Method or Tool: Post-program survey
Timing: Upon program completion

Method or Tool: Implementation of action research project
Timing: Two months after program completion

Customer Satisfaction

Method or Tool: Discussion with learning community and team leaders
Timing: Ongoing during program

Method or Tool: Post-program survey
Timing: Upon program completion
Evaluation Tools
Immediately following the program implementation. The certificate program
curriculum will contain multiple opportunities for engaging stakeholders in formal and informal
evaluations consistent with the four levels of the New World Kirkpatrick Model (Kirkpatrick &
Kirkpatrick, 2016). During each of the hybrid sessions, the team leaders will facilitate group
discussions among campus participants to gauge reactions to the day’s presentation, check for
understanding, assess motivation and discuss the potential for application to participants’
practice. Every two weeks, the team leaders will share salient information from the informal
evaluations with the program administrators, who will make adjustments to the curriculum to
improve the learning experience. More formally, stakeholders will complete biweekly homework
assignments and post comments to the learning community discussion board documenting their
thoughts, reactions and questions as they progress through the program.
In addition to implementing a variety of recurring evaluation techniques, the certificate
program director will administer an online survey, a copy of which can be found in Appendix C.
Upon completion of the opening session, participants will be asked to fill out the survey to assess
their level of satisfaction and ascertain the degree to which the kickoff meeting met their
expectations (Level 1) and their perceptions of how much they learned (Level 2). The survey
items in this instrument will be aligned with the learning objectives of the opening session.
Delayed for a period after the program implementation. Two months after the
culmination of the certificate program, participants will be e-mailed a link to a summative
survey, included in Appendix D. This online instrument will not only assess participants’
reactions to (Level 1) and learning from (Level 2) the training program, but will also evaluate the
extent to which stakeholders have transferred learning to their practice (Level 3) and attained
results (Level 4) that indicate more equitable student outcomes. Participants will also be
encouraged to stay connected to the learning community by regularly posting progress updates to
the discussion board and sharing perceptions about their journey to become more data-informed
and equity-minded.
Data Analysis and Reporting
Following the conclusion of the certificate program, the program administrator will
develop two biannual reports synthesizing the results of the informal and formal evaluations. The
first report will provide campus administrators with an executive summary of participant
responses to survey questions and include examples of how stakeholders applied learning from
the program to improve equity in their practice. The report will also feature a case study
chronicling the process by which one program participant analyzed data, discovered inequitable
student outcomes and devised and implemented corrective actions to ameliorate the inequities.
Although the case study will be in narrative form, it will demonstrate an evidence-based need for
improvement and supply documentation of the results of the intervention.
The second report, delivered through a series of interactive visualizations in the CSU
Student Success Dashboard, will be targeted to faculty participants. Updated after every term, the
dashboard will include a trend analysis of grading patterns in all courses taught at the CSU. As
illustrated in the preliminary design mock-ups in Figure 7, the dashboard will indicate the
presence of GPA gaps between students of color and their peers along with the distribution of
course grades. The site will be password protected to ensure that access is restricted to faculty,
permitting them to see data only for the courses they teach.
Figure 7
CSU Student Success Dashboard - Faculty Grading Pattern Trend Analysis
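For illustration, the term-over-term gap and grade distribution shown in such a dashboard could be computed along the lines of the Python sketch below; the column names (term, course_id, student_group, grade, grade_points) and group labels are assumptions, not the dashboard's actual data model.

# Illustrative computation of a term-by-term GPA gap trend and grade distribution
# for a single course. Column names and group labels are assumptions.
import pandas as pd

def gpa_gap_trend(df: pd.DataFrame, course_id: str) -> pd.DataFrame:
    """Return the GPA gap between comparison groups for one course across terms."""
    course = df[df["course_id"] == course_id]
    by_term = course.pivot_table(
        index="term", columns="student_group", values="grade_points", aggfunc="mean"
    )
    by_term["gpa_gap"] = by_term["peer"] - by_term["students_of_color"]
    return by_term[["gpa_gap"]].sort_index()

def grade_distribution(df: pd.DataFrame, course_id: str) -> pd.DataFrame:
    """Share of each letter grade awarded in the course, by term."""
    course = df[df["course_id"] == course_id]
    counts = course.groupby(["term", "grade"]).size().unstack(fill_value=0)
    return counts.div(counts.sum(axis=1), axis=0)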
Summary
The recommended improvements, implementation strategies and evaluation plans for this
research study were rooted in the New World Kirkpatrick Model framework (Kirkpatrick &
Kirkpatrick, 2016). The model employs a tiered evaluation structure to help program
administrators maximize performance by focusing on results, and then working backwards to
determine the essential behaviors and learning outcomes needed to drive productive change. As
presented throughout the chapter, formal and informal evaluations were incorporated throughout
the program to measure the formative and summative impact of the changes and promote
frequent opportunities for data-informed improvements. The evidence-based plan increases the
likelihood that certificate program participants will not only find value in the training program,
but also apply new knowledge to improve equity in their practice.
Limitations and Delimitations
All research studies contain inherent limitations, or influences that are outside the
researcher’s control, potentially affecting the accuracy of the findings (Merriam & Tisdell,
2016). A central limitation of this study was the potential for bias due to the self-reported nature
of the survey responses and the power dynamic between the CSU system office and campus
participants. Although participant responses were anonymous, self-reported data are prone to
exaggeration and can be skewed based on social desirability bias, where participants are driven
to answer in ways that portray them favorably (Grimm, 2010). In addition to the potential for
bias from unduly favorable submissions, responses may also have been influenced by the fact
that the program was hosted by the CSU system office. At the outset of the program, participants
were required to provide informed consent before completing the survey, which was subsequently
analyzed by a researcher employed by the CSU Chancellor’s Office. This knowledge may have
led some to submit more positive answers to survey items, thereby threatening the veracity of the
data.
The global COVID-19 pandemic also presented a significant limitation to the
methodological design of the study. Originally, the researcher planned to conduct a repeated
measures evaluation of KMO influences by identifying statistically significant differences in
participants' responses to pre- and post-program surveys. In March 2020, when the health crisis
necessitated social distancing, the program administrator contacted all participants to express
concern for their health and safety and extend an offer for them to withdraw from the program
and reenroll the following year. In the ensuing two weeks, 72 participants from five CSU
campuses withdrew, substantially diminishing the potential response rate for a post-program
survey. Correspondingly, the researcher decided to forgo the second survey, and instead chose to
rely exclusively on the first instrument as a means of analyzing stakeholder readiness to apply
data-informed insights to improve equity in their practice.
The design and implementation of this study also had delimitations, or choices made by
the researcher that influenced its overall scope (Merriam & Tisdell, 2016). To promote clarity,
the researcher opted to center the literature review on the narrow topic of closing equity gaps in
public university graduation rates between students of color and their White and Asian peers.
This narrow focus precluded a more in-depth analysis of other salient degree-attainment gaps
related to student characteristics such as gender, socio-economic status and first-generation
status.
A second notable delimitation was associated with the researcher’s decision to conduct a
purely quantitative analysis and forego a mixed methods approach with participant interviews,
observations and document analysis. While the incorporation of these qualitative elements would
likely have contributed additional salient information, the researcher had already employed these
methods to evaluate outcomes for previous cohorts of program participants. The quantitative
approach not only facilitated a novel, in-depth analysis of KMO influences, it also afforded the
researcher the opportunity to gain mastery of several relevant statistical analyses.
Recommendations for Future Research
The scholarly literature is rife with research that analyzes strategies and evidence-based
approaches for ameliorating equity gaps in higher education. This study focused on the readiness
of CSU stakeholders to use data to identify inequities and change their practice to promote more
equitable outcomes. Although gauging readiness to improve equity is an important step, it
presupposes that the resulting actions will cultivate more equitable outcomes for students. To
evaluate the relationship between stakeholder actions and the elimination of equity gaps, a
longitudinal study is needed to identify statistically significant changes in the performance of
students enrolling in courses taught by faculty participants in the certificate program. Such
studies would compare historical GPA gaps between students of color and their peers in courses
taught by faculty participants both before and after their participation in the program.
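One simple form such a comparison could take is sketched below in Python; the tidy data layout (one row per course section with a computed gpa_gap and a before/after period flag) and the use of Welch's t-test are assumptions made purely for illustration, and an actual longitudinal study would likely require multilevel or difference-in-differences models that account for student and course covariates.

# Illustrative before/after comparison of course-level GPA gaps.
# The data layout (gpa_gap, period) and the choice of test are assumptions.
import pandas as pd
from scipy import stats

def compare_gap_periods(df: pd.DataFrame) -> tuple:
    """Test whether mean GPA gaps differ before vs. after faculty completed the program."""
    before = df.loc[df["period"] == "before", "gpa_gap"]
    after = df.loc[df["period"] == "after", "gpa_gap"]
    t_stat, p_value = stats.ttest_ind(before, after, equal_var=False)  # Welch's t-test
    return t_stat, p_value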
In addition to a longitudinal analysis of equity-related outcomes, future empirical
research is needed to explore the intersectionality of student identities with respect to equity
gaps. Current studies typically limit their focus to a single aspect of student identity, such as
race, ethnicity, gender, or socio-economic status. Analyzing the complex relationships between
intersectional dimensions of student identities would undoubtedly produce critical knowledge for
better understanding disparate student experiences and contribute to the development of more
precise strategies for better serving a diverse student body.
Conclusion
Earning a college degree is one of the best predictors of future social mobility, bestowing
historically underserved students with an opportunity to fundamentally alter their career
trajectory and enjoy a more prosperous and fulfilling life (Chetty et al., 2017). Despite these
benefits, students of color are far less likely to earn a college degree than their White and Asian
peers (National Center for Education Statistics Postsecondary Graduation Rates, 2019; Bowen, et
al, 2009). Aligned with the CSU’s goal of eliminating equity gaps in graduation rates by the year
2025, this study assessed the readiness of participants in the Certificate Program in Student
Success Analytics to apply data-driven insights to improve equity in their practice. Utilizing the
Clark and Estes (2008) Gap Framework Analysis, linear and logistic regressions were performed
to identify knowledge, motivation and organizational influences impeding program participants
from becoming more data-driven and equity-minded practitioners. The quantitative analyses
revealed that participants’ propensity to apply data to their practice was most strongly predicted
by their level of procedural and metacognitive knowledge. Campus resources were also shown to
be critical factors in supporting more equitable practices. Finally, faculty were determined to be
less likely than their administrator and staff colleagues to apply student equity data to improve
their practice. These findings informed a set of recommended programmatic
enhancements designed to equip all participants with the skills,
motivation and support needed to regularly apply data-driven insights to improve equity in their
practice.
The problem of inequitable outcomes for students of different races and ethnicities is
complex and multivariate, requiring the intentional efforts of the entire campus community. To
guide equity-minded actions, campus constituents must have the requisite knowledge, motivation
and organizational support to understand their role in promoting more just student outcomes.
Through an analysis of the readiness of CSU Certificate Program in Student Success Analytics
participants to apply data-driven insights to improve equity in their practice, this study offered
contextualized recommendations for enhancing the program and fostering more purposeful
actions to eliminate equity gaps at the California State University.
References
Abdul-Raheem, J. (2016). Faculty diversity and tenure in higher education. Journal of Cultural
Diversity, 23(2), 53-56.
Adams, D., Meyers, S., & Beidas, R. (2016). The relationship between financial strain, perceived
stress, psychological symptoms, and academic and social integration in undergraduate
students. Journal of American College Health, 64(5), 362-370.
Adelman, C. (2004). Principal indicators of student academic histories in postsecondary
education, 1972-2000. Washington, D.C.: Institute of Education Sciences, U.S.
Department of Education.
https://www2.ed.gov/rschstat/research/pubs/prinindicat/prinindicat.pdf
Aguinis, H., & Kraiger, K. (2009). Benefits of training and development for individuals and
teams, organizations, and society. Annual Review of Psychology, 60, 451–474.
https://doi.org/10.1146/annurev.psych.60.110707.163505
Association of American Colleges and Universities. (2018). A Vision for Equity: Committing to
Equity and Inclusive Excellence: Campus-Based Strategies for Student Success.
Washington D.C: AAC&U.
Astin, A. W. (1993). Diversity and multiculturalism on the campus: How are students
affected? Change: The Magazine of Higher Learning, 25(2), 44-49.
Baker, L. (2006). Metacognition. http://www.education.com/reference/article/metacognition
Bandura, A. (2000). Exercise of human agency through collective efficacy. Current Directions in
Psychological Science, 9, 75–78. https://doi.org/10.1111/1467-8721.00064
Bandura, A. (2005). The evolution of social cognitive theory. In K. G. Smith & M. A. Hitt
(Eds.), Great Minds in Management (pp. 9–35). Oxford: Oxford University.
Barnshaw, J., & Dunietz, S. (2015). Busting the myths. Academe, 101(2), 4-84.
Bensimon, E. M. (2018). Reclaiming Racial Justice in Equity. Change: The Magazine of Higher
Learning, 50(3-4), 95-98.
Bensimon, E. M. (2004). The diversity scorecard: A learning approach to institutional
change. Change: The magazine of higher learning, 36(1), 44-52.
Bensimon, E. M., Dowd, A. C., & Witham, K. (2016). Five principles for enacting equity by
design. Diversity and Democracy, 19(1), 1-8.
https://www.stephens.edu/files/resources/five-principles-for-enacting-equity-by-design-1-
4.pdf
Bensimon, E. M., Hao, L., & Bustillos, L. T. (2007). Measuring the state of equity in higher
education. In P. Gandara, G. Orfield, & C. Horn (Eds.), Expanding opportunity in higher
education: Leveraging promise (pp. 143–166). Albany, NY: State University of New
York Press.
Bensimon, E.M., & Malcom, L. (2012). Confronting equity issues on campus: Implementing the
equity scorecard in theory and practice. Sterling, VA: Stylus.
Berríos-Allison, A.C. (2011). Career support group for Latino/a college students. Journal of
College Counseling, 14(1), 80-95.
Bolman, L. G., & Deal, T. E. (2008). Reframing organizations: Artistry, choice, and
leadership (4th ed.). San Francisco: Jossey-Bass.
Boone, H. N., & Boone, D. A. (2012). Analyzing Likert data. Journal of Extension, 50(2), 1-5.
Borgogni, L., Russo S. D., & Latham, G. P. (2011). The relationship of employee perceptions of
the immediate supervisor and top management with collective efficacy. Journal of
Leadership and Organizational Studies, 18, 5–13.
https://doi.org/10.1177/1548051810379799
Borman, G. D., Grigg, J., & Hanselman, P. (2016). An effort to close achievement gaps at scale
through self-affirmation. Educational Evaluation and Policy Analysis, 38(1), 21-42.
https://doi.org/10.3102/0162373715581709
Bowen, W. G., Chingos, M., & Mcpherson, M. (2009). Crossing the finish line: Completing
college at America’s public universities. Princeton, N.J: Princeton University Press.
Brownell, S. E., & Tanner, K. D. (2012). Barriers to faculty pedagogical change: lack of training,
time, incentives, and… tensions with professional identity? CBE-Life Sciences
Education, 11(4), 339-346.
Butler, R., & Shibaz, L. (2014). Striving to connect and striving to learn: Influences of relational
and mastery goals for teaching on teacher behaviors and student interest and help
seeking. International journal of educational research, 65, 41-53.
California State University. (2020). CSU Fact Book 2019 [Brochure]. Long Beach, CA
Cal State LA Math Early Alert. (2020). http://www.calstatela.edu/smartstart/math-early-alert
Camp, H. (2017). Goal setting as teacher development practice. International Journal of
Teaching and Learning in Higher Education, 29(1), 61-72.
Campbell, L. G., Mehtani, S., Dozier, M. E., & Rinehart, J. (2013). Gender-heterogeneous
working groups produce higher quality science. PloS One, 8(10), e79147.
Campos, E.E., Van Ryn, R., & Davidson, T.J. (2018). Changing deficit narratives about young
Latino men in Texas through a research-based mentorship program. Voices in Urban
Education 2018(2), 53-59.
Cardoza, D., & Gold, J. (2018). Moving from Data to Action: Changing Institutional Culture and
Behavior. In The Analytics Revolution in Higher Education: Big Data, Organizational
Learning, and Student Success. (pp. 119-138). Stylus Publishing, LLC.
Carnevale, A. P., & Smith, N. (2018). Balancing Work and Learning: Implications for Low-
Income Students. https://1gyhoq479ufd3yna29x7ubjn-wpengine.netdna-ssl.com/wp-
content/uploads/Low-Income-Working-Learners-FR.pdf
Carnevale, A. P., Van Der Werf, M., Quinn, M. C., Strohl, J., & Repnikov, D. (2018). Our
Separate & Unequal Public Colleges: How Public Colleges Reinforce White Racial
Privilege and Marginalize Black and Latino Students. Georgetown University Center on
Education and the Workforce.
Carnevale, A., & Strohl, J. (2013). Separate & unequal: How higher education reinforces the
intergenerational reproduction of white racial privilege. Washington D.C.: Georgetown
University. https://1gyhoq479ufd3yna29x7ubjn-wpengine.netdna-ssl.com/wp-
content/uploads/SeparateUnequal.FR_.pdf
Carney, M.A., Ng, L., & Cooper, T. (2016). Professional development amid change: Fostering
academic excellence and faculty productivity at teaching-intensive universities. The
Journal of Faculty Development, 30(2), 27-38.
Carter, F.D. (2006). Key issues in the persistence of underrepresented minority students. New
Directions for Institutional Research, 2006 (130), 33-46. https://doi.org/10.1002/ir.178
Cataldi, E. F., Bennett, C. T., & Chen, X. (2018). First-Generation Students: College Access,
Persistence, and Postbachelor's Outcomes. Stats in Brief. NCES 2018-421. National
Center for Education Statistics.
Cerasoli, C. P., & Ford, M. T. (2014). Intrinsic motivation, performance, and the mediating role
of mastery goal orientation: A test of self-determination theory. The Journal of
psychology, 148(3), 267-286.
CFA - Preparing Your RTP File for Review. (2019). https://www.calfac.org/item/rtp-policies-
info
Chen, X., & Carroll, C.D. (2005). First-Generation Students in Postsecondary Education: A
Look at Their College Transcripts (NCES 2005-171). U.S. Department of Education.
Washington, DC: National Center for Education Statistics.
Chetty, R., Friedman, J. N., Saez, E., Turner, N., & Yagan, D. (2017). Mobility report cards: The
role of colleges in intergenerational mobility (No. w23618). National Bureau of
Economic Research.
Clance, P. R. (1985). The Impostor Phenomenon: Overcoming the fear that haunts your success.
Atlanta, GA: Peachtree
Clark, R. E. & Estes, F. (2008). Turning research into results: A guide to selecting the right
performance solutions. Charlotte, NC: Information Age Publishing, Inc.
Coburn, C. E. (2010). Partnership for district reform: The challenges of evidence use in a major
urban district. In C. E. Coburn & M. K. Stein (Eds.), Research and practice in education:
Building alliances, bridging the divide (pp. 167–182). New York, NY: Rowman &
Littlefield.
Cohen-Vogel, D.R. (2018). Higher Education Decision Support: Building Capacity, Adding
Value. In The Analytics Revolution in Higher Education: Big Data, Organizational
Learning, and Student Success. (pp. 15-30). Stylus Publishing, LLC.
Collier, P. J., & Morgan, D. L. (2008). “Is That Paper Really Due Today?” Differences in First-
Generation and Traditional College Students’ Understandings of Faculty Expectations.
Higher Education, 55(4): 425–446.
Cokley, K., McClain, S., Enciso, A., & Martinez, M., (2013). An examination of the impact of
minority status stress and impostor feelings on the mental health of diverse ethnic
minority students. Journal of Multicultural Counseling and Development, 41(4), 82-95.
Cokley, K., Smith, L., Bernard, D., Hurst, A., Jackson, S., Stone, S., & Roberts, D. (2017).
Impostor feelings as a moderator and mediator of the relationship between perceived
discrimination and mental health among racial/ethnic minority college students. Journal
of Counseling Psychology, 64(2), 141.
Colman, A. M., Norris, C. E., & Preston, C. C. (1997). Comparing rating scales of different
lengths: Equivalence of scores from 5-point and 7-point scales. Psychological
Reports, 80(2), 355-362.
Conger, D., Long, M. C., & Iatarola, P. (2009). Explaining race, poverty, and gender disparities
in advanced course-taking. https://doi.org/10.1002/pam.20455
Costino, K. A. (2018). Equity-minded faculty development: An intersectional identity-conscious
community of practice model for faculty learning. Metropolitan Universities, 29(1), 117-
136.
Cox, W. T., & Devine, P. G. (2019). The prejudice habit-breaking intervention: An
empowerment-based confrontation approach. In Confronting Prejudice and
Discrimination (pp. 249-274). Academic Press.
Creswell, J. W. & Creswell, J. D. (2018). Research design: Qualitative, quantitative, and mixed
methods approaches. Thousand Oaks, CA: Sage Publications.
Cronbach, L. J., & Meehl, P. E. (1955). Construct validity in psychological tests. Psychological
Bulletin, 52(4), 281–302.
CSU Academic Preparation. (2020). https://www2.calstate.edu/csu-system/why-the-csu-
matters/graduation-initiative-2025/academic-preparation
CSU Certificate Program in Student Success Analytics. (2020).
http://dashboard.csuprojects.org/ssa/certificate-program/index.html
CSU Employee Distribution by Gender, Race, Ethnicity and Time Base. (2020).
https://www2.calstate.edu/csu-system/faculty-staff/employee-profile/csu-
staff/Pages/employee-headcount-by-gender-and-ethnicity.aspx
CSU Fullerton Titanium Engagement. (2020).
https://www.fullerton.edu/it/services/atc/titanium_engagement.php
CSU Graduation Initiative 2025. (2018). https://www2.calstate.edu/csu-system/why-the-csu-
matters/graduation-initiative-2025
CSU Graduation Initiative 2025 System Plan. (2017). https://www2.calstate.edu/csu-
system/why-the-csu-matters/graduation-initiative-2025/Pages/campus-plans-and-
goals.aspx
CSU Faculty Development Center Survey. (2006).
https://www.calstate.edu/itl/fdc/council/documents/FDC_Survey_Rpt_2006-acc.pdf
CSU Institute for Teaching and Learning. (2020). https://www2.calstate.edu/csu-system/faculty-
staff/Institute-for-teaching-and-learning
CSU Institutional Research and Analyses. (2020). https://www.calstate.edu/as/
CSU Institutional Research Dashboard. (2020).
https://tableau.calstate.edu/views/GraduationRatesPopulationPyramidPrototype_liveversi
on/SummaryOverview
CSU Northridge Data Champions. (2020). https://www.csun.edu/mike-curb-arts-media-
communication/data-champions
CSU Northridge Faculty Development. (2020). https://www.csun.edu/undergraduate-
studies/faculty-development
CSU San Marcos Data Fellows. (2020). https://www.csusm.edu/ipa/datafellows/index.html
de Brey, C., Musu, L., McFarland, J., Wilkinson-Flicker, S., Diliberti, M., Zhang, A.,
Branstetter, C., & Wang, X. (2019). Status and Trends in the Education of Racial and
Ethnic Groups 2018. Washington D.C.: National Center for Education Statistics.
Denler, H., Wolters, C., & Benzon, M. (2014). Social cognitive theory.
Dewey, J. (1933). How we think, a restatement of the relation of reflective thinking to the
educative process. Boston: Heath.
Dimitrov, D. M., & Rumrill Jr, P. D. (2003). Pretest-posttest designs and measurement of
change. Work, 20(2), 159-165.
Dowd, A. C. (2005). Data don’t drive: Building a practitioner-driven culture of inquiry to assess
community college performance. Boston: University of Massachusetts, Lumina
Foundation for Education.
Dowd, A. C., & Bensimon, E. M. (2015). Engaging the "race question": Accountability and
equity in US higher education. Teachers College Press.
Dowd, A., & Liera, R. (2018). Sustaining change towards racial equity through cycles of
inquiry. Education policy analysis archives (26), 65.
https://doi.org/10.14507/epaa.26.3274
Downey, D. B., & Condron, D. J. (2016). Fifty years since the Coleman Report: Rethinking the
relationship between schools and inequality. Sociology of Education, 89(3), 207-220.
Dunn, K. E., Airola, D. T., Lo, W. J., & Garrison, M. (2013). Becoming data driven: The
influence of teachers’ sense of efficacy on concerns related to data-driven decision
making. The Journal of Experimental Education, 81(2), 222-241.
Duncan, G. J., & Murnane, R. J. (2011). Whither opportunity? Rising inequality, schools, and
children's life chances. New York: Russell Sage Foundation.
Education Trust. (2015). Rising tide: Do college grad rate gains benefit all students?
Washington DC: Education Trust.
Employees of the California State University. (2019). https://www2.calstate.edu/csu-
system/faculty-staff/employee-profile/Documents/Fall2019CSUProfiles.pdf
Felix, E., Bensimon, E., Hanson, D., Gray, J., & Klingsmith, L. (2015). Developing Agency for
Equity-Minded Change. New Directions for Community Colleges, 2015(172), 25–42.
https://doi.org/10.1002/cc.20161
Fink, A. (2013). How to conduct surveys: A step-by-step guide. (5th ed.). Thousand Oaks:
SAGE.
Finkelstein, M. J., Conley, V. M., & Schuster, J. H. (2016). Taking the measure of faculty
diversity. Advancing Higher Education, 4(2)
Fiske, S. T., & Markus, H. R. (Eds.). (2012). Facing social class: How societal rank influences
interaction. New York: Russell Sage Foundation.
Forbes, P., & Klevan, S. (2018). My brother’s keeper: Nurturing in-school relationships for
young men of color in New York City. Voices in Urban Education 2018(2), 32-37.
Fresno State SupportNet. (2019). http://www.fresnostate.edu/studentaffairs/lrc/supportnet/
Gagliardi, J., Parnell, A., & Carpenter-Hubin, J. (2018). The analytics revolution in higher
education. Change: The Magazine of Higher Learning, 50(2), 22-29.
Gallimore, R. & Goldenberg, C. (2001). Analyzing cultural models and settings to connect
minority achievement and school improvement research. Educational Psychologist,
31(1), 45-56.
Gibbons, M. M., Rhinehart, A., & Hardin, E. (2019). How first-generation college students
adjust to college. Journal of College Student Retention: Research, Theory &
Practice, 20(4), 488-510.
Glesne, C. (2011). Chapter 6: But is it ethical? Considering what is “right.” In Becoming
qualitative researchers: An introduction (4th ed.) (pp. 162-183). Boston, MA: Pearson.
Gorozidis, G., & Papaioannou, A. G. (2014). Teachers' motivation to participate in training and
to implement innovations. Teaching and Teacher Education, 39, 1-11.
Grimm, P. (2010). Social desirability bias. Wiley international encyclopedia of marketing.
Greenwald, A. G., McGhee, D. E., & Schwartz, J. L. (1998). Measuring individual differences in
implicit cognition: the implicit association test. Journal of personality and social
psychology, 74(6), 1464. https://doi.org/10.1037/0022-3514.74.6.1464
Harackiewicz, J. M., & Priniski, S. J. (2018). Improving student outcomes in higher education:
The science of targeted intervention. Annual Review of Psychology, 69, 409-435.
Harmon, N. (2012). The Role of Minority-Serving Institutions in National College Completion
Goals. Washington D.C: Institute for Higher Education Policy.
Haras, C., Taylor, S. C., Sorcinelli, M. D., & Von Hoene, L. (2017). Institutional commitment to
teaching excellence: Assessing the impacts and outcomes of faculty development.
Washington, DC: American Council on Education.
Harris III, F., & Bensimon, E. M. (2007). The equity scorecard: A collaborative approach to
assess and respond to racial/ethnic disparities in student outcomes. New directions for
student services, 2007(120), 77-84.
Higher education. (2015). https://obamawhitehouse.archives.gov/node/174466
Holland, B. A. (2016). Factors and Strategies That Influence Faculty Involvement in Public
Service. Journal of Higher Education Outreach and Engagement, 20(1), 63-71.
Hora, M. T., Bouwma-Gearhart, J., & Park, H. J. (2017). Advancing diversity and inclusion in
higher education: Key data highlights focusing on race and ethnicity and promising
practices. Washington DC: U.S. Department of Education.
Hora, M. T., & Smolarek, B. B. (2018). Examining faculty reflective practice: A call for critical
awareness and institutional support. The Journal of Higher Education, 89(4), 553-581.
https://doi.org/10.1080/00221546.2018.1437663
Hulland, J. (1999). Use of partial least squares (PLS) in strategic management research: A review
of four recent studies. Strategic Management Journal, 20, 195-204.
Isenberg, N. (2016). White trash: The 400-year untold history of class in America. Viking: New
York.
Jimenez, L., Sargard, S., Morales, J., & Thompson, M. (2016). Remedial education: The cost of
catching up. Washington DC: Center for American Progress.
https://www.scribd.com/document/323744409/Remedial-Education-The-Cost-of-
Catching-Up
Johns, R. (2005). One size doesn’t fit all: Selecting response scales for attitude items. Journal of
Elections, Public Opinion & Parties, 15(2), 237-264.
Johnson, R. B., & Christensen, L. B. (2015). Educational research: Quantitative, qualitative,
and mixed approaches. (5th ed.). Thousand Oaks: SAGE.
Johnson, A., & Proctor, R. W. (2016). Skill acquisition and training: Achieving expertise in
simple and complex tasks. Taylor & Francis.
Jones, S. (2015). The Game Changers: Strategies to Boost College Completion and Close
Attainment Gaps. Change: The Magazine of Higher Learning, 47(2), 24-29.
https://doi.org/10.1080/00091383.2015.1018085
Jones, T., & Berger, K. (2019). Aiming for Equity: A Guide to Statewide Attainment Goals for
Racial Equity Advocates. Education Trust.
Kelly, B. T., Gayles, J. G., & Williams, C. D. (2017). Recruitment without retention: A critical
case of black faculty unrest. The Journal of Negro Education, 86(3), 305-317.
Kelly, M. S., & Lueck, C. (2011). Adopting a data-driven public health framework in schools:
Results from a multi-disciplinary survey on school-based mental health
practice. Advances in School Mental Health Promotion, 4(4), 5-12.
Kirkpatrick, D. L. (2006). Seven keys to unlock the four levels of evaluation. Performance
Improvement, 45, 5–8.
Kirkpatrick, D.L. (1994). Evaluating training programs: the four levels. (1st ed.). San Francisco:
Berrett-Koehler.
Kirkpatrick, J. D., & Kirkpatrick, W. K. (2016). Kirkpatrick’s four levels of training evaluation.
Alexandria, VA: ATD Press.
Knapp, T. (2016). Why Is the One-Group Pretest–Posttest Design Still Used? Clinical Nursing
Research, 25(5), 467–472. https://doi.org/10.1177/1054773816666280
Koropeckyj, S., Lafakis, C., & Ozimek, A. (2017). The economic impact of increasing college
completion. Cambridge, MA: American Academy of Arts & Sciences.
https://www.amacad.org/multimedia/pdfs/publications/researchpapersmonographs/CFUE
_Economic-Impact/CFUE_Economic-Impact.pdf
Krathwohl, D. R. (2002). A revision of Bloom’s Taxonomy: An overview. Theory Into Practice,
41, 212–218. https://doi.org/10.1207/s15430421tip4104_2
Krueger, R. A., & Casey, M. A. (2009). Focus groups: A practical guide for applied research
(4th ed.). Thousand Oaks, CA: SAGE Publications.
Kunst, E. M., van Woerkom, M., & Poell, R. F. (2018). Teachers' goal orientation profiles and
participation in professional development activities. Vocations and Learning, 11(1), 91-
111. https://doi.org/10.1007/s12186-017-9182-y
Langford, J., & Clance, P. R. (1993). The imposter phenomenon: Recent research findings
regarding dynamics, personality and family patterns and their implications for
treatment. Psychotherapy: Theory, Research, Practice, Training, 30(3), 495-501.
Libassi, C. J. (2018). The neglected college race gap: Racial disparities among college
completers. Washington, D.C: Center for American Progress.
Long Beach State Data Fellows. (2020). https://www.csulb.edu/data-fellows
Lumina Foundation. (2015). It's not just the money: The benefits of college education to
individuals and to society. New York: Lumina Foundation.
https://www.luminafoundation.org/files/resources/its-not-just-the-money.pdf
Luo, M. (2008). Structural equation modeling for high school principals' data-driven decision
making: An analysis of information use environments. Educational Administration
Quarterly, 44(5), 603-634.
Mandelbaum, E. (2016). Attitude, inference, association: On the propositional structure of
implicit bias. Noûs, 50(3), 629-658.
Mandinach, E. B. (2012). A Perfect Time for Data Use: Using Data-Driven Decision Making to
Inform Practice. Educational Psychologist, 47(2), 71-85.
https://doi.org/10.1080/00461520.2012.667064
Makopoulou, K., Neville, R., Ntoumanis, N., & Thomas, G. (2019). An investigation into the
effects of short-course professional development on teachers’ and teaching assistants’
self-efficacy. Professional Development in Education.
https://doi.org/10.1080/19415257.2019.1665572
Marsh, J. A. (2012). Interventions promoting educators’ use of data: Research insights and
gaps. Teachers College Record, 114(11), 1–48.
Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision making
in education (RAND Education occasional paper). Santa Monica, CA: RAND
Corporation. http://www.rand.org/pubs/occasional_papers/OP170.html
Marsh, J. A., & Farrell, C. C. (2015). How leaders can support teachers with data-driven decision
making: A framework for understanding capacity building. Educational Management
Administration Leadership, 43(2), 269–289. https://doi.org/10.1177/1741143214537229
Maxwell, J. A. (2013). Qualitative research design: An interactive approach. (3rd ed.).
Thousand Oaks: SAGE.
Mayer, R. E. (2011). Applying the science of learning. Boston, MA: Pearson Education.
McClain, K. S., & Perry, A. (2017). Where did they go: Retention rates for students of color at
predominantly white institutions. College Student Affairs Leadership, 4(1), 3.
https://scholarworks.gvsu.edu/csal/vol4/iss1/3
McNair, T.B, Albertine, S., Cooper, M.A., McDonald, N. & Major, T. (2016). Becoming a
Student-Ready College: A New Culture of Leadership for Student Success. San Francisco,
CA: Jossey-Bass.
Merriam, S. B., & Tisdell, E. J. (2016). Qualitative research: A guide to design and
implementation. (4th ed.). San Francisco: Jossey-Bass.
Montgomery, D. C., Peck, E. A., & Vining, G. G. (2012). Introduction to linear regression
analysis (Vol. 821). John Wiley & Sons.
Moody, J. A. (2004). Faculty diversity: Problems and solutions. New York: Routledge.
https://doi.org/10.1111/j.1467-9647.2007.00333.x
Moore, D. A., & Schatz, D. (2017). The three faces of overconfidence. Social and Personality
Psychology Compass, 11(8), 1–12. https://doi.org/10.1111/spc3.12331
Myers, L.C., & Finnigan, K.S. (2018). Using data to guide difficult conversations around
structural racism. Voices in Urban Education 2018(2), 38-45.
Musu-Gillette, L., De Brey, C., McFarland, J., Hussar, W. & Sonnenberg, W. (2017). Status and
trends in the education of racial and ethnic groups. Washington D.C.: National Center
for Education Statistics.
National Center for Educational Statistics. (2019).
https://nces.ed.gov/programs/coe/indicator_coi.asp
National Center for Education Statistics (NCES) (2017).
http://nces.ed.gov/programs/digest/d15/tables/dt15_326.10.asp
National Center for Educational Statistics Postsecondary Graduation Rates. (2019).
https://nces.ed.gov/programs/raceindicators/indicator_red.asp
National Center for Health Statistics. (2012). Health, United States, 2011: With Special Feature
on Socioeconomic Status and Health. Hyattsville, MD.
https://www.cdc.gov/nchs/data/hus/hus11.pdf
Natkin, L. W., & Kolbe, T. (2016). Enhancing sustainability curricula through faculty learning
communities. International Journal of Sustainability in Higher Education, 17(4), 540-
558.
Ngounou, G., & Gutierrez, N. (2017). Learning to lead for racial equity. Phi Delta
Kappan, 99(3), 37-41.
Oburn, M. (2005). Building a culture of evidence in student affairs. New Directions for
Community Colleges, 2005(131), 19-32.
Ohio State University Implicit Bias Module Series. (2019).
http://kirwaninstitute.osu.edu/implicit-bias-training/
Osborne, J. W. (2017). Simple Linear Models with Categorical Dependent Variables: Binary
Logistic Regression.
Ott, M., & Dippold, L. (2018). Part-Time Faculty Involvement in Decision-Making. Community
College Journal of Research and Practice, 42(6), 452-455.
https://doi.org/10.1080/10668926.2017.1321057
Pajares, F. (2006). Self-efficacy theory. http://www.education.com/reference/article/self-
efficacytheory
Pascarella, E.T., Pierson, C.T., Wolniak, G.C., and Terenzini, P.T. (2004). First-Generation
College Students: Additional Evidence on College Experiences and Outcomes. The
Journal of Higher Education, 75(3): 249–284.
Pearce, J., Mann, M. K., Jones, C., van Buschbach, S., Olff, M., & Bisson, J. I. (2012). The most
effective way of delivering a train-the-trainers program: a systematic review. Journal of
Continuing Education in the Health Professions, 32(3), 215-226.
https://journals.lww.com/jcehp/Abstract/2012/32030/The_Most_Effective_Way_of_Deli
vering_a.10.aspx
Pintrich, P. R. (2003). A motivational science perspective on the role of student motivation in
learning and teaching contexts. Journal of Educational Psychology, 95, 667–686.
https://doi.org/10.1037/0022-0663.95.4.667
Reber, S., & Kalogrides, D. (2018). Setting the Stage: Trends in Student Demographics and
Enrollment in California. Technical Report. Getting Down to Facts II. Policy Analysis for
California Education, PACE.
Rethinking the Gap at the CSU. (2020). http://www.dashboard.csuprojects.org/rethinkingthegap/
Richards, B. N. (2018). Is Your University Racist? Inside Higher Ed.
Robinson, S.B. & Firth Leonard, K. (2019). Designing quality survey questions. Los Angeles:
SAGE.
Rodgers, C. (2002). Defining reflection: Another look at John Dewey and reflective
thinking. Teachers college record, 104(4), 842-866.
Rose, C., & Issa, M.D. (2018). A district-wide approach to culturally and linguistically
sustaining practices in the Boston public schools. Voices in Urban Education 2018(2),
13-18.
Rothstein, R. (2004). Class and schools: Using social, economic, and educational reform to
close the black-white achievement gap. Economic Policy Institute: New York, N.Y.
Rubin, H. J., & Rubin, I. S. (2012). Chapter 6: Conversational partnerships. In Qualitative
interviewing: The art of hearing data (3rd ed.) (pp. 85-92). Thousand Oaks, CA: SAGE
Publications.
Rueda, R. (2011). The 3 dimensions of improving student performance. New York: Teachers
College Press.
Salkind, N. J. (2010). Encyclopedia of research design. Thousand Oaks, CA: SAGE Publications,
Inc. http://dx.doi.org/10.4135/9781412961288
Salkind, N. J. (2017). Statistics for people who (think they) hate statistics: Using Microsoft Excel
2016 (4th ed.). Thousand Oaks, CA: SAGE.
Santos, J. L., & Haycock, K. (2016). Fixing America’s college attainment problems: It’s about
more than affordability. Washington D.C.: Education Trust.
https://files.eric.ed.gov/fulltext/ED570435.pdf
Schein, E.H. (2017). Organizational culture and leadership, 5th Edition. San Francisco: Jossey-
Bass.
Schneider, B., Brief, A., & Guzzo, R. (1996). Creating a culture and climate for sustainable
organizational change. Organizational Dynamics, 24(4), 7-19.
Senge, P. M. (1990). The fifth discipline: The art and practice of the learning organization.
Sgoutas-Emch, S., Baird, L., Myers, P., Camacho, M., & Lord, S. (2016). We're not all white
men: Using a cohort/cluster approach to diversify STEM faculty hiring. The National
Education Association Higher Education Journal, 32(1)
Shute, V. J. (2008). Focus on formative feedback. Review of educational research, 78(1), 153-
189.
Siwatu, K., Frazier, P., Osaghae, O., & Starker, T. (2011). From maybe I can to yes I can:
Developing preservice and inservice teachers' self-efficacy to teach African American
students. The Journal of Negro Education. 80(3), 209-222.
Slavit, D., & Nelson, T. H. (2010). Collaborative teacher inquiry as a tool for building theory on
the development and use of rich mathematical tasks. Journal of Mathematics Teacher
Education, 13(1), 201–221.
Smith, D. G., Turner, C. S., Osei-Kofi, N., & Richards, S. (2004). Interrupting the usual:
Successful strategies for hiring diverse faculty. The Journal of Higher Education, 75(2),
133-160. https://doi.org/10.1080/00221546.2004.11778900
Sonoma State University Faculty Center. (2020). https://facultycenter.sonoma.edu/
Spitzer, B., & Aronson, J. (2015). Minding and mending the gap: Social psychological
interventions to reduce educational disparities. British Journal of Educational
Psychology, 85, 1-18. https://doi.org/10.1111/bjep.12067
Staman, L., Visscher, A. J., & Luyten, H. (2014). The effects of professional development on the
attitudes, knowledge and skills for data-driven decision making. Studies in Educational
Evaluation, 42, 79-90.
Steele, C. M. (1988). The psychology of self-affirmation: Sustaining the integrity of the self.
In Advances in experimental social psychology (Vol. 21, pp. 261-302). Academic Press.
Steele, C., & Aronson, J. (1995). Stereotype threat and the intellectual test performance of
African Americans. Journal of Personality and Social Psychology, 69(5), 797-811.
Stephens, N. M., Brannon, T. N., Markus, H. R., & Nelson, J. E. (2015). Feeling at home in
college: Fortifying school-relevant selves to reduce social class disparities in higher
education. Social Issues and Policy Review, 9(1), 1-24.
https://doi.org/10.1111/sipr.12008
Stout, R., Archie, C., Cross, D., & Carman, C. A. (2018). The relationship between faculty
diversity and graduation rates in higher education. Intercultural Education, 29(3), 399-
417. https://doi.org/10.1080/14675986.2018.1437997
Svinicki, M. D., Williams, K., Rackley, K., Sanders, A. J. Z., Pine, L., & Stewart, J. (2016).
Factors associated with faculty use of student data for instructional
improvement. International Journal for the Scholarship of Teaching and Learning, 10(2).
https://doi.org/10.20429/ijsotl.2016.100205
Swing, R. L., & Ross, L. E. (2016). A new vision for institutional research. Change: The
Magazine of Higher Learning, 48(2), 6-13.
Tenzin, K., Dorji, T., Choeda, T., & Pongpirul, K. (2019). Impact of faculty development
programme on self-efficacy, competency and attitude towards medical education in
Bhutan: A mixed-methods study. BMC Medical Education, 19(1), 468.
https://doi.org/10.1186/s12909-019-1904-4
Truong, Y., & McColl, R. (2011). Intrinsic motivations, self-esteem, and luxury goods
consumption. Journal of Retailing and Consumer Services.
https://doi.org/10.1016/j.jretconser.2011.08.004
Turner, C. S., González, J. C., & Wood, J. L. (2008). Faculty of color in academe: What 20 years
of literature tells us. Journal of Diversity in Higher Education. 1(3), 139-168.
https://doi.org/10.1037/a0012837
University of Oregon Implicit Bias Workshops. (2019).
https://inclusion.uoregon.edu/implicitbias
U.S. Census Bureau. (2017). https://census.gov/programs-surveys/popproj.html
U.S. Census Bureau American Community Survey. (2017).
https://www.census.gov/acs/www/data/data-tables-and-tools/data-profiles/2017/
U.S. Department of Education. (2016). Advancing diversity and inclusion in higher education:
Key data highlights focusing on race and ethnicity and promising practices.
https://www2.ed.gov/rschstat/research/pubs/advancing-diversity-inclusion.pdf
Vandal, B., (2016). Remedial Education’s Role in Perpetuating Achievement Gaps. Washington
D.C.: Complete College America. https://completecollege.org/article/remedial-
educations-role-in-perpetuating-achievement-gaps/
Vanderbilt Office for Diversity, Equity, and Inclusion. (2019).
https://www.vanderbilt.edu/diversity/unconscious-bias/
Van Schalkwyk, S., Leibowitz, B., Herman, N., & Farmer, J. (2015). Reflections on professional
learning: Choices, context and culture. Studies in Educational Evaluation, 46, 4-10.
Watanabe, T. (2019, February 25). Cal State remedial education reforms help thousands more
students pass college-level math classes. Los Angeles Times.
https://www.latimes.com/local/education/la-me-edu-cal-state-remedial-education-
reforms-20190225-story.html
Waters, T., Marzano, R. J., & McNulty, B. (2003). Balanced Leadership: What 30 Years of
Research Tells Us about the Effect of Leadership on Student Achievement. A Working
Paper.
Williams, B., Onsman, A., & Brown, T. (2010). Exploratory factor analysis: A five-step guide
for novices. Australasian Journal of Paramedicine, 8(3).
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.475.8594&rep=rep1&type=pdf
Vásquez, J., Flores, I., Barros Souza, A., Barry, J., & Monroy, S. (2019). Considering the Ethno-
racial and Gender Diversity of Faculty in United States College and University
Intellectual Communities. Hispanic Journal of Law and Policy, 2, 1-31.
Yeager, D. S., Walton, G. M., Brady, S. T., Akcinar, E. N., Paunesku, D., Keane, L., & Dweck, C.
S. (2016). Teaching a lay theory before college narrows achievement gaps at
scale. Proceedings of the National Academy of Sciences, Washington D.C.
Yough, M., & Anderman, E. (2006). Goal orientation theory.
http://www.education.com/reference/article/goal-orientation-theory
Zhang, Z. (2016). Variable selection with stepwise and best subset approaches. Annals of
translational medicine, 4(7).
Zimmerman, B. J. (1998). Academic studying and the development of personal skill: A
self-regulatory perspective. Educational psychologist, 33(2-3), 73-86.
APPENDIX A
CSU Certificate Program in Student Success Analytics Brochure
APPENDIX B
Student Success Analytics Pre-Program Survey
1. My role on campus can best be described as:
Faculty
Staff
Administrator
Other __________________
2. I have worked in higher education for ____ years.
3. I identify my gender as:
Male
Female
Non-Binary
Prefer Not to State
4. I identify my ethnicity as:
Asian
Black/African American
Hispanic/Latinx
Native American
Pacific Islander
White
Other
Prefer Not to State
Below is a list of statements related to your experiences with student equity data. Please
indicate how strongly you agree or disagree with each statement.
EXPERIENCES WITH DATA
(Response scale: Strongly Agree, Agree, Somewhat Agree, Neutral, Somewhat Disagree, Disagree, Strongly Disagree)
I have access to the data I need.
I understand how the use of equity
data can impact student outcomes.
I know how to identify equity gaps in
student outcomes.
My knowledge of student data is
stronger than my peers.
I know how to access student data on
my campus.
I understand how to analyze student
data to get the information I need.
I am comfortable using web-based
dashboards.
Given a spreadsheet with retention
data organized by student
characteristics, I could identify
salient equity issues.
I understand how student data are
collected on my campus.
I am comfortable talking with my
colleagues about data.
I often reflect on opportunities to
improve equity in my practice.
Below is a list of statements related to your feelings about student equity data. Please indicate
how strongly you agree or disagree with each statement.
FEELINGS ABOUT DATA
(Response scale: Strongly Agree, Agree, Somewhat Agree, Neutral, Somewhat Disagree, Disagree, Strongly Disagree)
Improving equity in my practice is
one of my top priorities.
I am committed to learning as much
as possible about using data to
improve my practice.
I feel confident in my ability to apply
data-informed insights to improve
student outcomes.
When I have questions about student
data, I can typically figure out the
answers on my own.
I am confident that I can become
more equity-minded in my practice.
I feel confident in my ability to
navigate data tables for meaning.
I would commit to improve equity in
my practice even if it were not a
priority of my campus.
Below is a list of statements related to your campus support for data-enhanced decision
making and professional development. Please indicate how strongly you agree or disagree
with each statement.
CAMPUS SUPPORT
(Response scale: Strongly Agree, Agree, Somewhat Agree, Neutral, Somewhat Disagree, Disagree, Strongly Disagree)
My campus values data-informed
decision making.
My campus leaders consult data to
inform their decisions.
My campus is committed to ensuring
that faculty, staff, and administrators
have access to quality professional
development.
I am satisfied with my opportunities
to engage in professional
development.
Professional development is one of
the top perks of my job.
I have sufficient planning time to
participate in professional
development.
My campus supports my
participation in the Certificate
Program.
I know who to talk to on campus to
help me understand how to interpret
student data.
My campus provides adequate
support structures to help me apply
student data to my practice.
I feel supported in my efforts to
leverage student data to improve
equity in my practice.
For each item below, select the option which most accurately describes the frequency with
which you engage in the listed activity.
ACTIONS
(Response options: Do Regularly, Do Occasionally, Want to Do & Ready, Want to Do & Not Ready, Not Interested)
Analyzing student data
Thinking about equity in my
practice
Applying data to improve equity
in my practice
Discussing strategies with my
colleagues for leveraging student
data to improve equity
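For illustration only, the sketch below (which is not part of the survey instrument or the study's analysis) shows one way that agreement responses like those above could be coded numerically and averaged into composite scores; the column names, item groupings, and example responses are hypothetical.

```python
# Minimal sketch, assuming hypothetical item names and responses: codes 7-point
# Likert labels as 1-7 and averages related items into composite scores.
import pandas as pd

# Hypothetical raw responses: one row per participant, one column per survey item.
raw = pd.DataFrame({
    "access_to_data":       ["Agree", "Strongly Agree", "Neutral"],
    "identify_equity_gaps": ["Somewhat Agree", "Agree", "Disagree"],
    "equity_top_priority":  ["Strongly Agree", "Agree", "Somewhat Disagree"],
})

# Higher numbers indicate stronger agreement.
scale = {
    "Strongly Disagree": 1, "Disagree": 2, "Somewhat Disagree": 3, "Neutral": 4,
    "Somewhat Agree": 5, "Agree": 6, "Strongly Agree": 7,
}
coded = raw.apply(lambda col: col.map(scale))

# Average related items into composite scores (the groupings here are illustrative).
coded["knowledge_composite"] = coded[["access_to_data", "identify_equity_gaps"]].mean(axis=1)
coded["motivation_composite"] = coded[["equity_top_priority"]].mean(axis=1)
print(coded)
```

Averaging ordinal Likert responses into interval-like composites is a common but contested practice (see Boone & Boone, 2012, in the reference list).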
APPENDIX C
Evaluation Instrument for Use Immediately Following the Opening Session
Please rate the level to which you agree for each item by indicating a choice with an X.
(Response scale: Strongly Disagree, Disagree, Agree, Strongly Agree)
1. The opening session added to my learning
experience.
2. The program activities were aligned to the
objectives.
3. The presenters encouraged participation
during each session.
4. The program was relevant to my role on
campus.
5. I feel comfortable engaging in upcoming
learning community activities.
6. The content related to improving equity
on campus is important to me.
7. I know what I need to do to apply what I
learned to my practice.
8. The program increased my confidence in
my ability to apply student data to
improve equity in my practice.
9. What part of the opening session was most impactful for you?
_____________________________________________________________________
_____________________________________________________________________
_____________________________________________________________________
10. What part of the opening session can be improved, and how?
_____________________________________________________________________
_____________________________________________________________________
_____________________________________________________________________
APPENDIX D
Evaluation Instrument for Use Four Weeks after the Closing Session
Please rate the level to which you agree for each item by indicating a choice with an X.
(Response scale: Strongly Disagree, Disagree, Agree, Strongly Agree)
1. The program increased my confidence in
my ability to analyze student data through
an equity lens.
2. I consider myself to be a more equity-
minded practitioner because of the
certificate program.
3. I am committed to finding additional
opportunities for increasing equity in my
practice.
4. I have applied lessons learned in the
certificate program to my practice.
5. My campus supports my efforts to
improve equity in my practice.
6. I would recommend the certificate
program to my colleagues.
7. Please describe any specific ways that you have applied lessons learned from the
certificate program to improve equity in your practice.
_____________________________________________________________________
_____________________________________________________________________
_____________________________________________________________________
8. What (if any) metrics will you use in the future to measure your progress in becoming a
more equity-minded practitioner?
_____________________________________________________________________
_____________________________________________________________________
_____________________________________________________________________
Abstract
Equity gaps in the degree attainment rates of historically underserved college students and their more privileged peers are prevalent across the U.S. higher education landscape. Addressing these discrepancies requires an institution-wide commitment to data-driven, equity-minded practice. As part of a larger effort to eliminate equity gaps by the year 2025, the California State University (CSU) developed the Certificate Program in Student Success Analytics, an equity-focused professional development program for faculty, staff and administrators. This study evaluated the readiness of CSU program participants to interrogate student data and apply their insights to improve equity in their practice. Utilizing the Clark and Estes (2008) gap analysis framework, linear and logistic regressions were performed to identify the knowledge, motivation and organizational influences impeding stakeholders from becoming more data-driven and equity-minded practitioners. The quantitative analyses revealed that participants’ propensity to apply data to their practice was most strongly predicted by their level of procedural and metacognitive knowledge. Campus resources were also shown to be critical factors in supporting more equitable practices. Finally, faculty were determined to be less likely than their administrator and staff colleagues to apply student equity data to improve their practice. These findings laid the foundation for the development of recommended enhancements to the certificate program with the goal of equipping all participants with the requisite skills, motivation and support to regularly apply data-driven insights to improve equity in their practice.
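To make the modeling approach described in the abstract concrete, the following is a minimal sketch of a logistic regression of the same general kind; it uses simulated data, and the variable names, values, and coefficients are hypothetical rather than the study's dataset or findings.

```python
# Minimal sketch, assuming hypothetical composite scores: fits a logistic regression
# predicting whether a participant reports regularly applying data to practice.
# The data are simulated; nothing here reproduces the study's actual analysis.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "knowledge":   rng.normal(5.0, 1.0, n),   # e.g., mean of knowledge items
    "motivation":  rng.normal(5.5, 1.0, n),   # e.g., mean of motivation items
    "org_support": rng.normal(4.5, 1.0, n),   # e.g., mean of campus-support items
})

# Simulated binary outcome: 1 = reports regularly applying data, 0 = otherwise.
true_logit = -6 + 0.8 * df["knowledge"] + 0.3 * df["org_support"]
df["applies_data"] = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

X = sm.add_constant(df[["knowledge", "motivation", "org_support"]])
result = sm.Logit(df["applies_data"], X).fit(disp=False)
print(result.summary())
print(np.exp(result.params))  # odds ratios, for interpretation
```

In practice, one would also check model assumptions (for example, multicollinearity among the composite predictors) before interpreting the coefficients.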