SATISFACTORY ACADEMIC PROGRESS FOR DOCTORAL STUDENTS:
An Improvement Study
by
Lindsay Cahn
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
August 2020
Copyright 2020 Lindsay Cahn
Acknowledgments
It was never my goal to attempt a doctoral degree, but after 15 years of working with
doctoral students, some of their ambition (or crazy) must have rubbed off on me. Thank you to
my dissertation committee, Dr. Seli, Dr. Stowe, and my chair, Dr. Muraszewski. If my writing a
doctoral dissertation about doctoral student progress wasn’t meta enough, their reading a
dissertation about how faculty can improve advising for doctoral students must have been an
interesting experience. Thank you for persevering.
To my beloved colleagues in Cohort X, thank you. I never dreamed that I’d make so
many brilliant, entertaining, and generous new friends. The Tuesday/Wednesday night crew kept
me going and made this an unforgettable experience. Fight on! Special thanks to my school
“wife” Dr. Michelle Rippy, who kept me honest, sane, and
entertained throughout. Her endless support is the reason I’m finishing this degree; I literally
couldn’t have done it without her. Even a global pandemic couldn’t stop us, although it did throw
a huge wrench into our celebration plans.
Thank you to all the friends and colleagues who have supported me: the SB girls, Mimo,
Kimi, Briana, my fellow directors, and dear friends Andrea, Bridget, Erica, Hilary, Lillian, and
everyone at “PGU” who has encouraged me. Thank you for putting up with my
constant anxiety and distraction over the past three years!
Finally, thank you to my family. My parents went through this experience with me from
start to finish, listening to my complaints, cooking me meals, and contributing financially.
Special thanks to my Aunt Joan, my brother Andrew, and sister-in-law, Kim, my niece, Olive,
and nephew, Buster. I missed so many birthday parties and opportunities to hang out, and you
were all very patient with my constant stress. This dissertation is dedicated to the Cahns and the
Claytons, past, present, and future.
Table of Contents
Acknowledgments ...................................................................................................................................... ii
Table of Contents ....................................................................................................................................... iv
List of Tables ............................................................................................................................................ vii
List of Figures............................................................................................................................................ ix
Abstract ...................................................................................................................................................... x
Chapter One: Overview of the Study .......................................................................................................... 1
Introduction to the Problem of Practice ..................................................................................................... 1
Background of the Problem ........................................................................................................................ 1
Importance of Addressing the Problem ...................................................................................................... 5
Organizational Context and Mission .......................................................................................................... 6
Organizational Goal ................................................................................................................................... 7
Importance of the Evaluation ..................................................................................................................... 8
Description of Stakeholder Groups ............................................................................................................ 9
Stakeholder Group for the Study .............................................................................................................. 10
Purpose of the Project and Questions ...................................................................................................... 10
Methodological Framework ..................................................................................................................... 11
Definitions ................................................................................................................................................ 12
Organization of this Study ........................................................................................................................ 12
Chapter Two: Literature Review ............................................................................................................... 13
Historical Perspective on Doctoral Education ......................................................................................... 13
Current Methods of Assessment of Progress ............................................................................................ 14
Characteristics Related to Student Doctoral Progress ............................................................................. 17
The Clark and Estes (2008) Gap Analytic Conceptual Framework ......................................................... 25
Doctoral Faculty Advisor Knowledge and Motivation Influences ............................................................ 26
Motivation ................................................................................................................................................ 30
Conceptual Framework ............................................................................................................................ 37
Conclusion ................................................................................................................................................ 41
Chapter Three: Methodology ..................................................................................................................... 43
Participating Stakeholders ....................................................................................................................... 45
Data Collection and Instrumentation ....................................................................................................... 48
Validity and Reliability ............................................................................................................................. 50
Data Analysis............................................................................................................................................ 51
Credibility and Trustworthiness ............................................................................................................... 52
Ethics ........................................................................................................................................................ 53
Chapter Four: Results and Findings ......................................................................................................... 57
Participating Stakeholders ....................................................................................................................... 58
Determination of Validation ..................................................................................................................... 61
Results and Findings ................................................................................................................................ 61
Synthesis ................................................................................................................................................... 82
Chapter Five: Discussion ............................................................................................................................ 84
Introduction and Overview ....................................................................................................................... 84
Integrated Implementation and Evaluation Plan ...................................................................................... 95
Strengths and Weaknesses of the Approach ........................................................................................... 109
Limitations and Delimitations ................................................................................................................ 110
Future Research ..................................................................................................................................... 111
Conclusion .............................................................................................................................................. 112
References .................................................................................................................................................. 113
Appendix A ................................................................................................................................................ 127
Survey Protocol ...................................................................................................................................... 127
Appendix B ................................................................................................................................................ 133
Interview Protocol .................................................................................................................................. 133
Appendix C ................................................................................................................................................ 136
Informed Consent Form.......................................................................................................................... 136
Appendix D ................................................................................................................................................ 141
Immediate Evaluation Instrument ........................................................................................................... 141
Appendix E ................................................................................................................................................ 144
Delayed Evaluation Instrument .............................................................................................................. 144
List of Tables
Table 1: PGU Schools, Program codes, Type of Degree, Time To Completion Information, and SAP Rates…7
Table 2: Organizational Mission, Global Goal and Stakeholder Performance Goals………………………..10
Table 3: Knowledge Influence, Knowledge Type, and Knowledge Influence Assessment…………………….30
Table 4: Assumed Motivation, Influence, and Motivational Influence Assessments…………………………..33
Table 5: Organizational Influences and Organization Influence Assessments………………………………..37
Table 6: Influence type, subtype, and stakeholder assumed influences analyzed for this study……………….57
Table 7: Interview and Survey Participation by School Affiliation……………………………………………59
Table 8: Descriptive Statistics for Participants’ Self-Identified Years of Experience…………………………59
Table 9: Interview Participants by Pseudonym, School and Program Affiliation, Number of Advisees and
Years of Experience…………………………..……………………………………………………...60
Table 10: Question 6: Faculty Advisors Self-Identified Ability to Identify SAP Review Period…………...……62
Table 11: Question 7: Faculty Advisors Responses to Knowledge Question………………………………...……63
Table 12: Question 8: Faculty Advisors Answer which Terms are in a SAP Review……………………....…..64
Table 13: Descriptive Statistics for Faculty Advisors’ Self-Identified Number of Meetings with Advisees Each
Term…………………………………………………………………………………………..………67
Table 14: Faculty Advisor Themes in Response to Question About How They Track Student Progress………69
Table 15: Summarized Results for Question 12 - Motivation- Utility Value ……………………………………….71
Table 16: Summarized Results for Question 13 - Motivation- Utility Value ……………………….………………72
Table 17: Summarized Results for Question 14 - Motivation- Utility Value ……………………………….………73
Table 18: Combined Results from Survey Questions 15-17 - Motivation- Self-Efficacy………………….……76
Table 19: Combined Results from Survey Questions 18-19 - Organization- Cultural Setting…………………79
Table 20: Summarized Results from Survey Question 20 - Organization- Cultural Setting……………………80
Table 21: Combined Results from Survey Questions 21-23 - Organization- Cultural Model………………….81
Table 22: Stakeholder Assumed Influences and Summary of Validation as Need or Asset…………………….83
Table 23: Summary of Knowledge Influences and Recommendations……………………………………….………86
Table 24: Summary of Motivation Influences and Recommendations……………………………………………….89
Table 25: Summary of Organization Influences and Recommendations……………………………….……….92
Table 26: Outcomes, Metrics, and Methods for External and Internal Outcomes…………………………....97
Table 27: Critical Behaviors, Metrics, Methods, and Timing for Evaluation……………………………...…99
Table 28: Required Drivers to Support Critical Behaviors…………………………………………………..……101
Table 29: Evaluation of the Components of Learning for the Program………………………..……………105
Table 30: Components to Measure Reactions to the Program………………………………………………106
List of Figures
Figure 1: Conceptual Framework Drawing Showing the Interaction Between the Organization, Progressive
Graduate University, The Stakeholders, Doctoral Mentoring Faculty, and the Goal, Increasing SAP
Rates By 12%..............................................................................................................................................39
Figure 2: Sample Dashboard for SAP Training Program Data……………………………………….……………….108
Abstract
This study addresses the issue of slow rates of progress toward doctoral degree completion using
the annual satisfactory academic progress (SAP) review as an indicator of inadequate progress.
The Clark and Estes (2008) gap analysis was used to evaluate the assumed knowledge,
motivation, and organizational influences that are preventing Progressive Graduate University
(PGU, a pseudonym) from reaching its organizational goal to increase SAP rates for doctoral
students. A mixed-methods approach was used, with quantitative surveys and qualitative
interviews with faculty members. All six assumed influences were validated as gaps for the
organization, although the study was hampered by a low response rate. Detailed
recommendations are made to address the gaps, and an integrated implementation plan was
designed, including evaluation of outcomes.
CHAPTER ONE: OVERVIEW OF THE STUDY
Introduction to the Problem of Practice
This study addresses the problem of slow rates of progress toward degree completion for
doctoral students in the social sciences. The Council of Graduate Schools Ph.D. Completion
Project found that after ten years, only 56% of students starting a social science doctorate had
completed their degree (Sowell et al., 2008). Nationally, the median time-to-degree for the
doctorate across all fields of study was 8.8 years in 2017 (NSF/NCSES, 2018b). While there are
no universal standards for the length of
time a doctoral program should take to complete, many experts suggest four to five years as an
appropriate time frame for most disciplines (Berelson, 1960; Millett & Nettles, 2009). This
problem is important to solve because longer enrollment time is associated with a higher risk of
withdrawing without completing the degree (van der Haert et al., 2014) and higher overall levels
of student debt (Baum & Steele, 2017).
Background of the Problem
This section describes two main topic areas as background information for the problem of
slow doctoral student progress toward degree completion. Those two topics are: financial support
and resources, and programmatic and faculty criteria. Although the literature presented here has
been applied to a variety of educational contexts, this review focuses primarily on the literature’s
application to the problem of making satisfactory academic progress in a doctoral program.
Financial Support and Resources
There are several aspects to the financial support of a doctoral student. The first is
financial support provided by the doctoral program. Some programs offer multi-year support to
doctoral students in the form of fellowships, scholarships, and research or teaching
assistantships. The second is financial resources from outside the doctoral program, for instance,
the student’s savings, educational loans, family support, or employment outside of the
academic department.
There are indications that the type of institutional financial support has a significant
influence on the time-to-degree and completion rates of doctoral students. Chandler (2018) found
that students who received a large scholarship early in their doctoral studies were more likely to
complete their degrees than recipients of a smaller scholarship. This finding supports earlier studies
suggesting the timing of financial support is significant, with support provided early in the
doctoral student’s enrollment as more predictive of degree completion than later, such as in the
dissertation phase (Ehrenberg & Mavros, 1995). Perhaps not surprisingly, studies have found that
department financial support in the form of fellowships or scholarships, funding types that do not
require significant work on projects other than the student’s research, results in better student
outcomes than funding types that require the student to be employed as a teaching or research
assistant (Bolli et al., 2015; Horta et al., 2018).
Not all doctoral students receive funding from their institution. Researchers found that
students without department-based funding have the highest rates of attrition from their degree
program (van der Haert et al., 2014). This study also confirmed that the type of funding was
significant, with grants and fellowships resulting in higher completion rates than employment-
based funding like teaching assistantships. However, another study found that department
funding of any kind, whether fellowship- or employment-based, improved completion rates
compared with reliance on non-department funding like outside employment (Bair &
Haworth, 2005). These findings imply that time-to-degree is closely tied to the amount of time
and effort a student can put into their studies. Behr and Theune (2016) found significantly longer
times-to-degree for students who were employed outside of their academic department.
Students who must work, either as a TA or research assistant or outside of the academic
department, have less time to devote to their doctoral studies, and therefore take longer to finish
their degree.
Students who work full-time or rely entirely on loans to fund their doctoral studies have
additional barriers to degree completion. In a study of the effect of loans on doctoral student
completion, Kim and Otts (2010) found that students with loans took less time to complete than
students funding their degree by working. The authors suggest this is because loan-funded
students can focus solely on their educational objectives without the distraction of outside
employment. Other researchers posit that the debt incurred with educational loans encourages
fast completion because borrowers want to limit the total amount borrowed (Mendoza
et al., 2014). However, as loan borrowing for doctoral education increases, there is evidence that
financial stress is a significant cause of doctoral attrition (D’Andrea, 2002). Depending on the
employment prospects in their field of study, many students have legitimate concerns about their
ability to repay high loan balances (Baum & Steele, 2018), particularly in lower-paid fields
like the social sciences and education.
Programmatic and Faculty Criteria
Programmatic factors encompass the academic support provided by the program and
the university itself. Programmatic factors include the design of the program, flexibility of the
curriculum, and timing and sequencing of courses within the program. Programmatic factors also
include the resources available to students both within the department and widely through the
university, like the library and writing center. This category also includes what is generally
considered to be the most crucial factor for student success, faculty support and mentorship.
Skakni (2018), in a qualitative case study of a Canadian research university, found that
doctoral education is an initiation into a research career and that an element of academic hazing
takes place as part of the process. This hazing entails rituals of initiation that are purposely
difficult or opaque to ensure that only a subset of students succeed (Skakni, 2018). Skakni
interviewed students, faculty, and staff and evaluated documents as part of her research, finding
that while the quality of doctoral supervision (faculty advising) significantly influences student
progress, supervision is not a faculty member’s main focus (Skakni, 2018). Similarly, in a study
asking doctoral students whether they had considered leaving their program, issues with faculty
or advising were the most commonly cited reasons (Ruud et al., 2018). Caruth (2015) also found
that conflicts with faculty or program staff were a significant
contributor to the decision to withdraw from doctoral programs.
Craft, Augustine-Shaw, Fairbanks, and Adams-Wright (2016) also determined that the
advisor-advisee relationship contributes to doctoral student persistence. Using document
analysis, Craft et al. (2016) reviewed documents related to doctoral advising at 12 schools. They
found several themes relevant to this discussion, including higher levels of accountability for
students than for faculty, and that terms like “advisor” and “mentor” were used interchangeably
to mean the same thing. The resulting confusion from these inconsistencies creates a barrier for
students seeking to understand their responsibilities and roles within the department.
How a faculty advisor responds when their advisee encounters a dip in productivity
varies widely. Lepp, Remmik, and Leijen (2016) found that many faculty either do not feel that
they have time to reach out proactively to students or think that it is the student’s responsibility
to reach out if they are experiencing problems. These findings suggest that the student is held
entirely responsible for their own progress or lack thereof. Additionally, faculty tend to attribute
a student’s lack of progress to events in the student’s personal life, such as the need to work
outside of academia for financial reasons, or to a lack of motivation or academic preparation,
rather than to the adequacy of faculty support (Lepp et al., 2016).
Importance of Addressing the Problem
It is important to increase rates of doctoral progress for a variety of reasons. For the
United States, having a steady stream of new doctoral recipients fuels innovation in the fields of
science and research, in addition to providing trained faculty to work in higher education
(NSF/NCSES, 2018a). Colleges and universities rely on a constant infusion of new instructors
and researchers to maintain their ability to educate future generations and remain relevant in their
fields of study (NSF/NCSES, 2018a). If prospective doctoral students decide to attend foreign
universities, they may choose to stay and make their contributions to that country, causing the
U.S. to fall behind on research and educational advancements (NSF/NCSES, 2018a). For
doctoral-granting colleges and universities, each enrolled doctoral student represents a
significant investment of money and resources, which can be considered wasted when a student
does not complete their degree (Caruth, 2015).
Moreover, for doctoral students themselves, completing the doctorate in a reasonable
amount of time is paramount. Studies show that students who
graduate with higher levels of student loan debt will, in their lifetime, have higher levels of
financial distress, lower savings for retirement, and lower overall net worth (Gayardon et al.,
2018). Additionally, higher educational debt is associated with lower levels of overall health,
including mental health, after leaving higher education (Gayardon et al., 2018).
Organizational Context and Mission
Progressive Graduate University (PGU), a pseudonym, is a small, private, not-for-profit
university offering post-secondary (graduate-level) distance education programs with
administrative offices located in the western United States. According to PGU’s website,
Progressive’s mission is grounded in the values of social justice and sustainability, as well as
learner-centered adult education. Originally conceived as an alternative to traditional brick and
mortar graduate schools, Progressive has grown and evolved as technology has advanced over
the 40-plus years since its founding. Progressive’s focus, from its inception, has been to create
educational opportunities for adults who, for various reasons, cannot or do not want to relocate to
a college town to pursue graduate education.
Progressive averages 1,000 students, 80 staff members, and 185 faculty members, and has a
student-to-faculty ratio of 6:1 (About Us, University Website). Progressive’s population is predominantly
female, with only 24% of students identifying as men (About Us, University Website). The
ethnic composition of the university is 50% White, with 14% Black or African American, 12%
Hispanic or Latino, and 4% Asian (About Us, University Website). International students make
up 8%, and the remaining 12% identify as American Indian or Alaska Native, two or more races,
or race/ethnicity unknown (About Us, University Website).
Progressive is regionally accredited by the WASC Senior College and University
Commission (WSCUC). Within PGU, programs are organized into one of two schools, each with
its own leadership structure. Table 1 lists each school and the programs within it, the degrees
awarded, the advertised time to degree, the maximum length of study deadline, and the average
(mean) time to degree completion over the past 10 years (when available). All of the doctoral
programs at Progressive award social science (Ph.D.) or education (Ed.D.) degrees. For the
purposes of this study, both degree types (Ed.D. and Ph.D.) are referred to interchangeably as
research doctoral degrees.
Table 1

PGU Schools, Program codes, Type of Degree, Time To Completion Information, and SAP Rates

School     Program     Type of   Advertised time   Length of study   Mean time to     Mean positive SAP
                       degree    to complete,      deadline,         completion (c)   rates 2015-2019 (d)
                                 in years (a)      in years (b)
School 1   Program 1   PhD       5                 8                 7.9              83%
School 1   Program 2   PhD       4                 10                5.60             81%
School 1   Program 3   PhD       4                 10                (e)              90%
School 2   Program 4   PhD       4                 10                6.47             66%
School 2   Program 5   EdD       3                 10                5.43             80%

(a) source: university marketing brochures
(b) source: university catalog
(c) source: university website data from 2009-2019
(d) source: internal university data
(e) not enough data to calculate
Table 1 shows that all of the programs with ten years of data have mean times to
completion that exceed their advertised times to completion. Because doctoral
degrees take many years to complete, a good intermediary indication of progress is the annual
SAP review. The annual SAP review identifies students and programs that are experiencing slow
rates of progress while there is time and the opportunity for intervention.
Organizational Goal
Progressive University’s goal is to increase the number of doctoral students who make
satisfactory academic progress (SAP) by 12% by September 2020. This goal is detailed in the
university’s 2016-2020 strategic plan. Objective 1 describes the university’s goals and objectives
related to students, and objective 1.3 is to improve student retention, persistence, and time to
completion. Current SAP rates vary widely by school and program, and averages from the 2015-
2019 reviews range from 90% positive SAP reviews in Program 3 to 66% positive reviews in
Program 4. According to internal university data, School 2 has the lowest overall SAP rates, with
only 63% of School 2 students achieving a positive SAP review in 2018.
Importance of the Evaluation
It is important to evaluate the organization’s performance in relation to the performance
goal for many reasons. Maintaining SAP status is essential to the timely completion of a degree
program. Students enrolled in doctoral programs at PGU are required to complete a minimum of
18 new units each year to achieve a positive SAP review. Because PGU charges doctoral students
flat-rate annual tuition, failing to complete the minimum required units each year can lead to a
substantial increase in the overall cost of the degree due to the increased number of enrolled
terms. The 2019/2020 annual tuition for doctoral programs is $8,870-$9,700 per term, or
$26,610-$29,100 per year (university website).
Students who do not achieve a positive SAP review are disqualified from receiving
federal financial aid (loans). An appeal process allows students with extenuating circumstances,
such as medical or family difficulties, to regain probationary eligibility for financial aid.
However, students without qualifying circumstances, or whose appeals are denied for other
reasons, must pay the full tuition immediately or take a leave of absence from
the university. Some students who lose financial aid eligibility may need to withdraw from the
university without completing their degrees, while others choose to take out high-interest rate
private loans, mortgage their homes, or withdraw money from retirement accounts to pay their
tuition. In addition to the financial impact, the time spent creating probation plans and writing
appeal statements takes away from time the student could be applying to their studies and can
become a source of considerable stress and anxiety.
For the university, high average time-to-degree and attrition rates are detrimental to its
reputation and ability to recruit new students. The university’s accreditors also
review progress rates, and low rates of student progress and high rates of attrition can lead to
problems with continued accreditation. Additionally, the disqualification and appeal processes
require significant staff and faculty time and resources to complete. Internal PGU data estimates
an annual cost in staff and faculty salary of $42,000 for the time spent preparing for, completing,
and following up on financial aid appeals. Reducing the number of students who are disqualified
from financial aid would result in significant savings in university time and resources that could
be devoted to other projects.
Description of Stakeholder Groups
The primary stakeholders at PGU are students, faculty, and staff. PGU’s students are
distance learners and are based throughout the United States as well as internationally. Students
are the key stakeholder and are the beneficiaries of the organizational performance goal.
Doctoral faculty advisors at PGU are also geographically distributed throughout North America.
Each doctoral student chooses or is assigned a faculty advisor from the program’s core faculty.
The role of the faculty advisor is to socialize and mentor the student throughout their program,
guiding and assessing qualitative progress. Staff members are the administrative employees who
support the university’s operations—most staff work out of one of PGU’s administrative offices.
Student services staff are the main staff members who interact with students. Student services
include the Registrar, Financial Aid and Scholarships, Student Accounts, Advising, and
Admissions. The Advising Office has a dedicated Graduate Program Advisor for each degree
program, who assists with schedule planning, leaves, and financial aid eligibility, and administers
the annual SAP review process.
Stakeholder Group for the Study
While the joint efforts of all stakeholders will contribute to the achievement of higher
SAP review rates for doctoral students, for practical purposes, the stakeholders of interest will be
doctoral faculty advisors. This stakeholder group was chosen because the performance goals for staff
and students have been met, and yet there has not been significant progress toward achieving the
strategic goal. Table 2 details the organizational mission, performance goal, and stakeholder
goals, as described in the 2016-2020 PGU strategic plan.
Table 2

Organizational Mission, Global Goal, and Stakeholder Performance Goals

Organizational Mission
Dedicated to the education of the next generation of social justice practitioners and scholars

Organizational Performance Goal
Increase the number of doctoral students who make satisfactory academic progress (SAP) per program by 12% by September 2020.

Stakeholder One Goal
By September 2019, all core faculty will demonstrate an understanding of how to advise students regarding SAP reviews, their importance, and their relationship to time-to-degree completion.

Stakeholder Two Goal
By September 2017, Graduate Program Advisors will increase their advising about SAP through targeted messaging and a review of SAP progress during every advising conversation.

Stakeholder Three Goal
By September 2018, all students will demonstrate an understanding of the SAP review process through increased communication and messaging from other stakeholder groups.
Purpose of the Project and Questions
The purpose of this project is to conduct a gap analysis to examine the knowledge,
motivation, and organizational influences that interfere with increasing student progress toward
degree completion and therefore achieving higher positive SAP review decision rates for
doctoral students. While a complete gap analysis would focus on all stakeholders, for practical
purposes, the stakeholder of focus for this study is doctoral faculty advisors. The analysis will
begin by generating a list of assumed interfering influences that will be examined systematically
to focus on actual or validated interfering influences.
The research questions for this study will be:
1. What are the doctoral faculty advisors’ knowledge, motivation, and organizational
elements that interfere with Progressive Graduate University achieving a 12%
increase in satisfactory academic progress (SAP) review outcomes?
2. What is the interaction between Progressive Graduate University’s culture and context and doctoral faculty advisors’ knowledge and motivation toward the goal
of increasing SAP review outcomes by 12%?
Methodological Framework
This study is based on the gap analytic framework created by Clark and Estes (2008).
Clark and Estes describe a framework for identifying and analyzing gaps between a performance
goal and the current level of performance in an organization. Using this framework, the
stakeholder knowledge and skills, motivation, and organizational influences that affect
performance toward the goal are systematically identified and examined. The assumed
knowledge, motivation, and organizational influences were identified through the literature
review process and will be described in detail in Chapter 2. This study will employ a mixed-
methods design, using interviews and a survey with questions informed by the research questions
and the assumed knowledge, motivation, and organizational influences. All faculty members who
are assigned to advise one or more doctoral students at PGU will be invited to complete the
survey. This census design eliminates the need to sample the population. Concurrently, eight
doctoral faculty advisors will be recruited to participate in a 20-minute interview to gather
supplemental qualitative data. The quantitative and qualitative data collected will be analyzed,
and recommendations for future changes will be made.
Definitions
Doctoral Student: A student in a research doctoral program with the goal of earning a Ph.D. or
an Ed.D. degree.
Satisfactory Academic Progress (SAP): A regular review of student progress, as required by the
Department of Education. Each school can determine its own SAP policies, but the evaluation
must be completed at least once per year and must evaluate progress and pace toward degree
completion. The school's SAP standards must have a quantitative and a qualitative component.
Organization of this Study
This study will be organized into five chapters. This chapter introduced the problem of
practice, the organization of focus, and the framework for the study. Chapter 2 reviews the
literature related to doctoral degree completion as well as the conceptual framework for the
study. Chapter 3 reviews the methodology chosen for this study, who the participants are, and
how data will be collected and analyzed. Chapter 4 presents the data collected and the analysis of
that data. Chapter 5 provides recommendations based on the data analysis and supported by
research for solutions for the problem of practice.
CHAPTER TWO: LITERATURE REVIEW
The following section is a review of the literature related to doctoral student progress and
degree completion in the United States. First will be an overview of the history of doctoral
education in the U.S. and how progress is generally measured in doctoral programs. Next will be
a discussion of the financial and programmatic criteria that have been shown to predict doctoral
program completion, followed by a discussion of the conceptual framework for this study, the
Clark and Estes gap analysis. That section will include a description of the specific influences
related to the knowledge, motivation, and organizational influences that are assumed to affect
faculty advisors’ ability to increase progress in doctoral students at PGU.
Historical Perspective on Doctoral Education
From the beginnings of doctoral education in the United States in the late 1800s, there
has been debate among the academic community related to the purpose, length, and requirements
for the doctoral degree (Berelson, 1960; James, 1903). In 1903, William James published an
editorial in the Harvard Monthly magazine, decrying the end of the doctoral degree as a purely
academic pursuit of scholarship and research. In James’s opinion, the common practice of using
the doctoral degree as a requirement for university faculty employment was imprudent, as the
research-focused doctoral degree was never intended to produce great educators. This argument
illustrates a central issue that continues to be seen in modern universities. Faculty members are
trained in academic research during their doctoral studies, but graduate students are rarely given
formal instruction in methods of teaching and advising students (Austin, 2002). Nevertheless, the
doctoral degree continues to be a minimum requirement for most tenured university faculty
positions (Berelson, 1960).
As a result of the need for a doctorate to teach in the academy, the number of doctoral-granting institutions in the United States has grown from the first, established at Johns Hopkins University in 1887, to 286 institutions in 1973, and to 436 institutions conferring 54,904 degrees in 2016 (NSF/NCSES, 2017). This rapid growth in doctoral degree attainment brings with it several inherent problems, among them the length of time it takes to earn a doctorate. The first doctoral degrees took 2-3 years after the bachelor’s degree to
complete. However, as the number of degree-granting institutions and doctoral-level disciplines
expanded, an agreement about how long a doctoral degree should take to complete was never
formalized (Berelson, 1960). That lack of standardization of degree requirements has resulted in
the current system, where doctoral degrees are generally intended to take from 3-8 years to
complete, but in practice may take much longer (Berelson, 1960; NSF/NCSES, 2017). Without
an established baseline for how long doctoral degrees should take to complete, it is hard to
determine what constitutes adequate progress toward completion, although there are some
standard assessments of progression through the degree steps.
Current Methods of Assessment of Progress
While what constitutes making progress in a doctoral program varies by school and
program, faculty advisors generally consider progress to mean keeping the focus on the degree, staying productive, setting and meeting goals, and maintaining forward momentum (Barnes,
2009; Gardner, 2009). Because it is difficult to measure “staying productive” with a standardized
metric, programs and accreditors define progress using quantitative descriptions of a student’s
academic record, most commonly grade point average (GPA) and years enrolled (York et al.,
2015). However, longitudinal research has shown that GPA is not a good predictor of success in a
doctoral program, as the skills required to earn a high grade in a structured course are very
different from the type of independent research required to complete a doctoral dissertation
(Gardner, 2009; Williams et al., 1970).
Using years enrolled as a standard metric for progress is also problematic, as the maximum number of years a student is allowed for completing a doctoral degree varies widely between institutions, ranging from five to 10 years (National Research Council, 1996). Similarly,
because different disciplines have different styles of research, the expectation of how long it
takes to complete a dissertation varies widely depending on the type of research required, and
therefore it is difficult to compare across disciplines. Some degrees may require extensive travel
and months or years of fieldwork, while others can be completed in a local campus laboratory or
library (Ostriker et al., 2011).
As degree requirements and the length of time it takes to complete them vary, it is
challenging to develop a standard set of stages for doctoral completion. Many doctoral programs
require qualifying or comprehensive exams to mark the transition from coursework to research,
although some do not (National Research Council, 1996). Ampaw and Jaeger (2012) propose
three general stages of a doctoral degree: the first year of coursework, which they call transition;
the second stage, development, is the second year through advancement to candidacy; and the
third stage, research, ends at dissertation completion. Advancement to candidacy generally refers
to the point at which a student has completed the required coursework and passed any qualifying
or comprehensive exams and is working primarily on dissertation research and writing (National
Research Council, 1996).
In 1976, the Department of Education mandated that schools disbursing Title IV federal financial aid institute satisfactory academic progress (SAP) requirements for students (Bennett & Grothe, 1982). These regulations require that universities set a maximum time frame for each
program of study and regularly track student progress toward completion of that program. The
SAP requirements are designed to ensure that public funds are not being directed to students who
are not adequately focused on their academic goals (Bennett & Grothe, 1982). Many universities
use a calculation of minimum cumulative GPA and completion rate (units completed divided by units attempted) to determine satisfactory academic progress. However, there is almost no
research devoted to satisfactory academic progress and how it is measured, particularly at the
graduate level.
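The two-part test that many universities use can be sketched briefly in code (a hypothetical illustration only: the 3.0 GPA and 67% pace thresholds are invented examples, and under federal guidance the pace, or completion rate, is computed as units completed divided by units attempted):

```python
# Illustrative sketch of a common two-part SAP check. The 3.0 GPA and
# 67% pace cutoffs are hypothetical; each school sets its own standards.

def meets_sap(cumulative_gpa, units_completed, units_attempted,
              min_gpa=3.0, min_pace=0.67):
    """A positive review requires both the qualitative (GPA) and
    quantitative (pace) components; pace = completed / attempted."""
    if units_attempted == 0:
        return True  # nothing attempted yet, so no basis for review
    pace = units_completed / units_attempted
    return cumulative_gpa >= min_gpa and pace >= min_pace

print(meets_sap(3.4, 14, 18))  # pace ~ 0.78, GPA above cutoff -> True
print(meets_sap(3.4, 9, 18))   # pace = 0.50, below cutoff -> False
```

The function name and thresholds are illustrative assumptions, not a description of any particular institution's implementation.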
PGU uses a slightly different metric for measuring SAP. PGU’s SAP policy requires that
doctoral students register for and complete a minimum of 18 units per year during the
coursework stages of the doctoral program. Once a student reaches all but dissertation (ABD) status, registration for and successful completion of a dissertation research course each term is used to show progress (University academic catalog). Up to the point that students reach the maximum length-of-study limit (8-10 years, depending on program), this metric is used to determine adequate minimum progress toward the degree at PGU. However, it is important to note
that completing one of PGU’s 80-90-unit doctoral degrees (university website) should only take
four to five years if a student is completing the minimum of 18 units per year. Therefore, the
maximum length of study of 10 years is double what it should take to complete the degree at
minimum annual progress levels. This mismatch in minimum progress standards creates a
confusing expectation for students and faculty.
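The arithmetic behind this mismatch can be made concrete with a short sketch, using the unit figures cited above from the university website:

```python
import math

# PGU's stated minimum annual progress and degree sizes (per the passage above).
min_units_per_year = 18

for total_units in (80, 90):
    # At 18 units per year, a partial final year still occupies a full
    # academic year, hence the ceiling.
    years = math.ceil(total_units / min_units_per_year)
    print(f"{total_units}-unit degree: about {years} years at minimum pace")

# Both degree sizes work out to roughly five academic years at the minimum
# pace, yet the maximum length of study is 10 years, double that estimate.
```

This is simple arithmetic on the cited figures, not institutional data beyond what the passage reports.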
While there are many ways to measure progress in a doctoral program, including SAP, there is no standard definition or benchmark for measuring and tracking progress across schools and disciplines. The variability of standards for progress makes it difficult to compare
student rates of progress across institutions. This variability also makes it hard for students to
judge which doctoral programs may take longer to finish than others, an important decision-point
for many prospective students. Slower rates of progress toward degree completion are associated
with higher rates of attrition (Bair & Haworth, 2005). While what schools and accreditors consider progress is generally tied to the easily measured and quantified metrics of grade point average and years enrolled, what programs and faculty consider progress is often the harder-to-judge qualities of productivity and momentum. These “soft” criteria for success make
it particularly challenging to study and predict which students will be successful in a doctoral
program.
Characteristics Related to Student Doctoral Progress
There has been extensive research into the characteristics both of students and of doctoral
programs that predict doctoral student success. These studies can be divided into broad categories based on whether the characteristics are personal to the student or specific to the doctoral program itself. A third category is financial support, which combines resources that may be provided by the program or by the student. The following
section will describe the literature related to student progress in doctoral programs as it relates to
the program’s design and academic resources, financial support of students, and faculty advising
and mentoring.
Program Design, Academic Support, and Resources
Some researchers believe that doctoral attrition and extended time-to-degree are caused not by students’ demographics, academic preparation, or perseverance, but by the program or university failing to provide the structure, support, and resources needed to learn doctoral competencies. Lovitts (2001) suggests that while
faculty and program administrators attribute student attrition to the individual students, the real
causes are within the program’s control. These causes include dissatisfaction with the program of
study, the faculty instructors, or the student’s faculty advisor (Lovitts, 2001). Similarly, Bair and Haworth (2005) found that a student’s satisfaction with the program contributed to their
likelihood of degree completion. Bagaka’s, Badillo, Bransteter, and Rispinto (2015) also found
that department support factors like faculty support, program structure, and research engagement
were positively correlated with degree completion.
There are many theories about how a doctoral program can be structured to support
student progress. One strategy that has found success is developing an introductory doctoral
course or series of courses to be taken in the first year of study (L. A. Garcia, 2013; Salani et al.,
2016). An introductory course allows the program to set norms and instruct students in basic academic and research techniques, while giving students an opportunity to bond with classmates and meet program faculty (L. A. Garcia, 2013; Kuperminc et al., 2016). Therefore, the introductory
doctoral course can also be seen as a way to help level the playing field for students with
different levels of academic preparation and experience. Along similar lines, Benavides and
Keyes (2016) suggest that new student orientations may be a key factor in increasing student
retention, socialization, and satisfaction in doctoral programs.
Another strategy is to develop structured programs that follow a cohort model. In a
cohort model, students who enter at the same time continue through the program at the same pace, taking the same courses each term (Bagaka's et al., 2015). A similar strategy is
developing collaborative dissertation cohorts, where students who begin their dissertation
research at the same time form a group, supervised by a faculty member (Burnett, 1999). This
cohort strategy allows students to support each other throughout the dissertation process and
provides structure to a process that is typically completed independently (Burnett, 1999).
Research findings are mixed on the importance of other academic support systems to
completion rates. Strachan, Murray, and Grierson (2004) found that providing dissertation
resources like templates and worked examples may support students in completing their
dissertations. However, Zhou and Okahana (2019) found that the availability of departmental
academic resources, like writing and teaching instruction and graduate student associations, did
not influence degree completion rates. There are many ways that a university and doctoral
program can provide structure and assistance to doctoral students to ease the transition to
becoming doctoral-level scholars and support retention and completion rates. Along with
program and university level support and structures, adequate funding is an essential part of
supporting doctoral student progress.
Financial Criteria
Financial support for doctoral students can be broadly split into two categories: the student’s personal financial resources and resources provided by the institution. A student’s
personal financial resources include their savings and family contributions, their employment
income, and loans they take out to fund their education, including federal financial aid as well as
any personal loans or lines of credit. Loans require repayment at some point, either upon leaving
the institution (graduating or withdrawing) or according to a payment schedule (Hamel &
Furlong, 2012). Financial resources provided by the institution may include fellowship or
scholarship funding, department assistantships, tuition waivers, and federal work-study
programs (Hamel & Furlong, 2012). These institutional resources require varying amounts of
work by the student to maintain the funding. For example, fellowships may require only that the student remain in good standing to continue receiving the funding, and do not require the funding to be paid back at any point, while assistantships generally require either teaching for
the department or doing research for a faculty member in exchange for a tuition waiver and
salary (Hamel & Furlong, 2012). First will be a discussion of the completion issues related to
program or institutional funding, and then a discussion of issues related to self-funding or relying
on loans to fund the degree.
Program financial support. Institutional or programmatic financial support in the form
of fellowships, grants, tuition waivers, and teaching and research assistantships has been shown to have varying effects on student progress and degree completion. For the purposes of this discussion, there are two categories of institutional funding: those that require work within the program or department to maintain eligibility, and those intended solely to support the student’s research and scholarship. Generally, assistantships require work within the department. A teaching assistantship allows the student to learn teaching and classroom management skills but represents a considerable amount of work each week of the term (Hamel & Furlong, 2012). A research assistantship similarly gives a student experience working on a
research project with a faculty member, but that research project also requires a weekly time
commitment that reduces the student’s own study and research time (Hamel & Furlong, 2012).
Alternatively, fellowship and grant funding is generally meant to support a student’s own
research and scholarly interests and does not require any weekly time commitment (Hamel &
Furlong, 2012). Predictably, given those criteria, fellowship and grant funding is associated with higher completion rates and shorter time-to-degree than assistantships, with their additional required work (Caruth, 2015; van der Haert et al., 2014). However, several studies found that all types of institutional funding are associated with higher completion rates and shorter time-to-degree than for students without institutional funding (Bair & Haworth, 2005; Zhou &
Okahana, 2019).
To sum up, students with department or university funding to assist with financing their
degrees may face additional challenges, including an expectation of time working as teaching or
research assistants, which can detract from time spent on degree completion. However, not all
schools provide these types of funding to their graduate students. Increasingly, students are choosing to enroll in non-traditional, professional, or online doctoral degree programs, which expect students to self-fund their education entirely.
Student financial resources. Doctoral students are relying more on personal financial
resources, employment outside of the academic department, and federal financial aid borrowing
than ever before, illustrating the trend toward fewer institutional support structures for doctoral
students. Studies have found that for doctoral students, self-funding their degree programs has a
significant impact on student progress toward and completion of the degree. In a foundational
study, Abedi and Benkin (1987) found that doctoral students who work outside of the academic
department to support themselves and their families during their doctoral studies take nearly two
years longer to complete the degree (8.6 years as opposed to 6.6 for students on fellowship).
Gillingham, Seneca, and Taussig (1991) made a similar determination that spending more time
on academic study reduces time-to-degree while working more hours on an outside job increases
time-to-degree. These findings are supported by several more recent studies (Bolli et al., 2015;
Horta et al., 2018). Both findings make logical sense, as the more hours a student works in
outside employment, the fewer hours they can devote to their studies and research.
In addition to the time that employment can take from a doctoral student’s studies,
financial pressures are found to be disruptive to the timely completion of a doctoral degree. Berg
(2016), in a study of African-American and Latinx students in an online program, found that
53% of respondents agreed or strongly agreed that their progress toward the doctorate was
affected by outside financial pressures. These outside financial pressures may include the stress
of going into debt in order to earn the degree. Student borrowing to fund doctoral degrees is
increasing rapidly and causing more doctoral students to take on more substantial amounts of
debt. Schuh and Gansemer-Topf (2012) describe this issue, showing that more students are
taking out loans to fund their education, borrowing greater sums of money, and graduating with
higher levels of debt than ever before. Baum and Steele (2018) found that the percentage of
students borrowing more than $75,000 to fund advanced degrees increased from 7% to 15%
between the 2007-2008 academic year and the 2011-2012 academic year.
Similarly, the National Center for Education Statistics reported that while the number of
doctoral graduates who take out loans did not change significantly in the 15 years between 1999
and 2015, the average loan balances doubled, from $53,500 to $108,400, which represents a
103% increase (McFarland et al., 2018). It is important to keep in mind that those rates are only
for doctoral graduates, and do not include the educational debt incurred by the estimated 50% of
students who fail to complete their doctoral program. These increases in loan debt are significant
because research findings show that student loan debt has a significant negative impact on future
financial wellbeing (Gayardon et al., 2018).
Gayardon, Callender, Dean, and Desjardins (2018) describe the implications of student loan debt as long-lasting: it reduces future net worth, savings, and retirement balances, and increases overall levels of financial stress throughout borrowers’ lives.
That increased debt and financial stress might be worth it if taking out educational loans were
shown to increase rates of doctoral completion, but that correlation is not clear. Ampaw and
Jaeger (2012), in their study of the stages of doctoral education, found that loans increase the
chances of completing the first, transition stage of the doctoral program, but reduce completion
of the second, development stage. This finding is significant, as educational loans are only a
good investment if the degree is completed. As there are few, if any, ways to track doctoral non-
completers, it is hard to know what the impact of educational borrowing is for those who fail to
complete the degree.
Doctoral students are working and borrowing to fund their degrees more than ever before, which has implications for their future financial prospects and levels of stress related to financial
issues. While more students are turning to financial aid to fund their programs, there are still
students who are fully or partially funded by their institutions. Full or partial funding by
institutions has its benefits and drawbacks for doctoral completion but is generally better for
completion rates than no institutional funding (Bair & Haworth, 2005; Zhou & Okahana, 2019).
However, only traditional doctoral programs offer significant funding packages, and this type of funding is largely unavailable to students studying in professional fields like education or through online or otherwise non-traditional programs.
Faculty Advising and Mentoring
Multiple studies show that the most important factor for doctoral student progress and
degree completion is a quality relationship with faculty advisors. Terry and Ghosh (2015)
describe faculty mentoring as critical to the process of developing doctoral competencies.
Ideally, a faculty advisor would find a balance between providing structure and guidance to a
student while also supporting the student’s autonomy and independence (Delamont et al., 1998;
Erichsen et al., 2014). Godskesen and Kobayashi (2016) recommend that faculty advisors
provide coaching in self-regulation skills to give students the tools to persevere through their
programs. Lechuga (2011) defines the roles of a doctoral advisor as encompassing several types
of relationships: advisor, instructor, and agent of socialization. In the advisor role, the faculty member serves as a counselor for issues both academic and personal, someone the student can go to with questions and for advice (Lechuga, 2011). Another function of the advisor role is to be knowledgeable about
university and department policy in order to guide the student through the program’s
requirements (Cross, 2018). The instructor role creates formal learning opportunities for the student to develop research abilities and other skills required in the field (Lechuga, 2011).
Finally, the agent of socialization demonstrates and coaches the student on the norms of behavior
and professional skills required for success in the field of study (Lechuga, 2011). These multiple
roles, combined into a single advisor, help the student develop the skills required to finish the
degree and be successful in their chosen field of study.
Several studies show that the role of the faculty advisor is so vital to a doctoral student’s
success that without a good advisor, a student is much less likely to finish the degree. Caruth
(2015) suggests that conflicts within the department, either with faculty, staff, or advisors, may
be the main reason that students leave doctoral programs without completing their degrees.
Similarly, Bair and Haworth (2005) describe the positive correlation between the quality of the
relationship between student and faculty advisor and degree completion. Unfortunately, while
this relationship is shown to be essential to doctoral student success, most departments do not put
adequate resources into developing faculty advising skills. Skakni (2018) notes that while doctoral supervision is a significant influence on student success, faculty are rarely trained in how to advise effectively.
Craft, Augustine-Shaw, Fairbanks, and Adams-Wright (2016) describe a fundamental confusion between what constitutes faculty advising and what constitutes mentoring, and note that programs are held more accountable for students’ success than faculty are. This echoes what Lovitts
(2001) describes as a system that puts the entirety of the blame for failure on the student, but
allows programs and faculty advisors to take credit for the student’s successes. As an example,
Lepp, Remmik, and Leijen’s (2016) study investigates how doctoral supervisors respond to a
student’s stall in progress. They describe faculty as treating a lack of progress as a problem for the student to solve, or at least endeavor to solve by reaching out proactively to their faculty advisor for assistance (Lepp et al., 2016). In contrast, Cross (2018) found that graduate
students want their faculty advisors to show an interest in their studies by reaching out to them,
not waiting passively for the student to contact them. One can see how this could quickly
become a situation where days, weeks, and even months pass, while both the advisor and the
student wait for the other to initiate communication. However, another way of considering it is
that the advisor has not prepared the student adequately for how to do research independently,
including how to respond to the inevitable periods of low productivity (Lepp et al., 2016).
Faculty advising and mentoring are shown to be essential factors for increasing retention
and eventual degree completion for doctoral students. However, faculty advisors are rarely given
training or even guidelines for what their role as an advisor entails. Improving faculty advising
and mentoring of doctoral students is, therefore, a crucial factor for increasing student progress
and degree completion rates.
The Clark and Estes (2008) Gap Analytic Conceptual Framework
A key aspect of organizational problem-solving is determining the essential knowledge
and skills, motivation, and organizational influences related to achieving a specific performance
goal (Clark & Estes, 2008). Clark and Estes (2008) define a framework for identifying and
analyzing the gap between a performance goal and the current level of performance in an
organization. This framework systematically identifies and examines three elements of a problem
to determine where a gap exists, the stakeholder’s knowledge, motivation, as well as the
26
organizational influences that impact performance toward the goal (Clark & Estes, 2008). In this
context, knowledge can be divided into four types: factual, conceptual, procedural, and
metacognitive (Krathwohl, 2002; Rueda, 2011). Motivation has three essential elements; active
choice, persistence, and mental effort (Clark & Estes, 2008). Organizational influences include
the resources, work processes, and workplace culture of an organization (Clark & Estes, 2008)
In the following section, the Clark and Estes (2008) gap analysis will be applied to
determine the knowledge, motivation, and organizational influences faculty advisors need to
achieve the goal of improving satisfactory progress review rates by September 2020. The first
section will describe the assumed knowledge influences on the stakeholder goal. The second
section will describe the assumed motivational influences, followed thirdly by the assumed
organizational influences. Each of these assumed influences will then be tested through the
methodology described in Chapter 3.
Doctoral Faculty Advisor Knowledge and Motivation Influences
This section reviews the literature related to knowledge influences relevant to improving
satisfactory academic progress review rates at Progressive Graduate University. The first step
when diagnosing performance gaps is determining the essential knowledge and skills a
stakeholder needs that are related to achieving a specific performance goal (Clark & Estes,
2008). Once the relevant knowledge and skills are identified, the gap analysis will determine
whether the key stakeholders possess the requisite knowledge and skills to achieve the goal, and
if not, what is needed to bridge the gap (Clark & Estes, 2008).
Knowledge and Skills
Krathwohl (2002), in his revision of Bloom’s Taxonomy, describes four types of
knowledge: factual knowledge, conceptual knowledge, procedural knowledge, and
metacognitive knowledge. Factual knowledge is the basic understanding of a subject, including
the terminology, definitions, as well as detailed, specific facts (Krathwohl, 2002). Conceptual
knowledge involves understanding the relationships between those facts, and how to classify,
organize, and generalize that factual knowledge within a specific context (Krathwohl, 2002).
Procedural knowledge refers to how to use those facts and context to complete a task, including
understanding how and in what order to use specific methods and context-specific techniques
(Krathwohl, 2002). Metacognitive knowledge refers to knowing when and why to use the other
three kinds of knowledge to complete a task as well as the self-awareness to be cognizant of
one’s own knowledge and skills (Krathwohl, 2002). The following are assumed knowledge
influences that faculty need in order to address the problem of student progress at PGU. Each
knowledge influence will be described in the context of doctoral education and categorized by
type of knowledge.
Faculty advisors need to understand the SAP policy. To assist students in making
academic progress and maintaining satisfactory academic progress, faculty need to have factual
knowledge of the policies and procedures of the university. For a faculty member to be able to
advise students regarding their annual progress goals and the implications for their SAP review,
faculty need to have a factual understanding of the SAP review process. Strong knowledge of
university policies and procedures is also indicated as one of the features of a competent faculty
advisor (Barnes et al., 2010; Cross, 2018; Schroeder & Terras, 2015). Without the knowledge of
policy and procedure, the faculty advisor’s guidance related to course timing and progression
would be incomplete and unhelpful (Barnes et al., 2010).
For instance, if a student indicates their desire to concentrate on dissertation writing for a
term, but the faculty member is unaware of the SAP policy, the faculty member may agree to the
plan without knowing that it could put the student in jeopardy of not making SAP in their next
review. The transition between coursework and dissertation is particularly challenging when it
comes to meeting the SAP requirements. Without knowledge of the SAP process, timing, and
requirements, a faculty advisor could inadvertently misadvise their students, particularly during
this crucial transition period.
Faculty advisors need to know how to facilitate doctoral student progress. Many
studies have found that the faculty advisor-student relationship is essential to doctoral student
persistence, timely progress, and degree completion (Golde et al., 2009; Ivankova & Stick, 2007;
Offstein et al., 2004; Paglis et al., 2006; Rockinson-Szapkiw et al., 2016; Wao & Onwuegbuzie,
2011). There are several components to the supervisory relationship of a faculty advisor and a
student. Traditionally, the faculty advisor role is that of a mentor to a student apprentice (Golde
et al., 2009). The mentor has a role as an advisor, providing academic advice and guidance
(Lechuga, 2011). There is also a teaching role, where the faculty member instructs the student in
research skills and development of academic writing skills (Roumell & Bolliger, 2017). Finally,
there is the role of the socializing agent, where the faculty member brings the student into the
professional sphere and provides guidance about the cultural and professional norms of behavior
for the profession (Lechuga, 2011). However, these multiple roles are generally unspoken and
assumed, and faculty advisors may never have received training on how to fulfill the
requirements of each role (Devine & Hunter, 2017; Harding-DeKam et al., 2012; Roumell &
Bolliger, 2017).
Several strategies have been found to be effective in facilitating doctoral student progress.
One strategy is maintaining regular contact with advisees via email, phone calls, or meetings
(Roumell & Bolliger, 2017). When students do not have regular contact with their faculty
advisor, they report feeling “isolated and forgotten” (Erichsen et al., 2014, p. 335). Staying in
contact with students allows faculty advisors to keep track of student progress and recognize
when students are falling behind and may need more support (Devine & Hunter, 2017). These
meetings also provide the opportunity to set deadlines for expected deliverables, like dissertation
drafts, and hold students accountable for meeting those deadlines.
Another strategy is to provide timely and consistent feedback to advisees and to ensure
that the feedback is both constructive and specific (Kumar et al., 2013). Feedback is most useful
when students can implement it while the ideas and content are still fresh in their minds, so
timeliness is essential (Kumar et al., 2013). Additionally, improving academic and
research skills requires that faculty advisors provide honest and specific feedback to develop
competencies (Kumar et al., 2013). If a student’s writing skills are not well developed, providing
extensive feedback can be a drain on faculty advisors, taking a lot of time and energy to provide
(Godskesen & Kobayashi, 2016). In this case, it may be important to refer the student to
appropriate writing resources, like a university writing center, for additional support (Cate &
Miller, 2015).
Finally, establishing a professional yet collegial relationship with students is essential.
One of the most important parts of the faculty advisor-student relationship is building trust
(Roumell & Bolliger, 2017). Students should feel comfortable asking questions and being honest
about their concerns and limitations without repercussions from the faculty advisor (Kumar et
al., 2013). Similarly, students desire a faculty advisor that shows genuine care and concern for
them, as well as interest and commitment to helping them succeed (Offstein et al., 2004).
In summary, Table 3 shows the two assumed faculty advisor knowledge influences
related to the goal of increasing the number of doctoral students who make satisfactory academic
progress. These two influences were identified based on the literature presented in this chapter.
In the next section, the assumed motivational influences for faculty advisors related to the goal
will be described.
Table 3
Knowledge Influence, Knowledge Type, and Knowledge Influence Assessment

Organizational Mission: Dedicated to the education of the next generation of social justice
practitioners and scholars

Organizational Global Goal: Increase the number of doctoral students who make satisfactory
academic progress (SAP) by 12% by September 2020.

Stakeholder Goal: By September 2019, all core faculty will demonstrate an understanding of
how to advise students regarding SAP reviews, their importance, and their relationship to
time-to-degree completion.

Knowledge Influence 1: Faculty advisors need to understand the SAP policy.
Knowledge Type: Factual
Knowledge Influence Assessment: Survey questions testing the faculty advisor’s knowledge of
the SAP policy

Knowledge Influence 2: Faculty advisors need to know how to facilitate doctoral student
progress.
Knowledge Type: Procedural
Knowledge Influence Assessment: Survey questions asking how faculty members approach
different situations with their advisees

Note. Knowledge types are declarative (factual or conceptual), procedural, or metacognitive.
Motivation
This section reviews the literature related to the assumed motivational influences
pertinent to improving student progress at PGU. Motivation consists of three essential elements:
active choice, persistence, and mental effort (Clark & Estes, 2008). Active choice refers to the
decision to begin a task or process, persistence is continuing to work on it once started, and
mental effort refers to identifying and investing the appropriate amount of thought and effort into
completing the task or process (Clark & Estes, 2008). The assumed motivational influences for
faculty advisors related to student progress draw on utility value theory and self-efficacy theory.
Faculty advisors need to value the importance of SAP reviews to students and the
institution. Value refers to several ways that a person can determine the importance of a task or
process relative to other ways they could spend their time and energy (Clark & Estes, 2008).
Utility value describes a task or action undertaken not because it is enjoyable or
interesting, but because completing it will bring a desired benefit (Clark & Estes, 2008). In this
context, faculty advisors need to value the importance of monitoring student progress and
understand the importance of improving student progress to students and the institution.
According to a book documenting the origins of PGU, written by a longtime faculty member, a
core university belief is that students and faculty should have the freedom to determine the order
and timing of their courses and program timeframes. This core belief is in direct contrast to
policies like SAP and other ways of measuring and enforcing standards of progress. However,
given the regulatory requirements and highly competitive environment of higher education, it is
essential that faculty advisors understand the value of student completion and time-to-degree
rates to the sustainability of the university.
Additionally, for students, there is professional and financial value in completing degrees
on time. Many students work outside of the university while completing their degrees, but
students with higher levels of financial need may need to work full-time in addition to receiving
financial aid (Volkwein & Lorang, 1996). For students with financial barriers, slowing down
their studies at PGU is not a financially sustainable plan, because the university charges flat-rate
tuition for all doctoral programs (university website). Regularly taking fewer units per term
results in higher loan balances upon graduation and increases the likelihood that a student will
reach the maximum borrowing limit for financial aid, currently $20,500 per year, and a
maximum loan balance of $138,500 (Federal Student Aid, U.S. Department of Education).
Therefore, faculty must understand the importance, to both the student and the university, of
maintaining satisfactory progress and graduating on time.
Faculty advisors need to believe they can influence SAP review outcomes for their
advisees. Self-efficacy theory describes the belief that an action will produce the desired
outcome or the self-perception that it is possible to achieve the stated goal (Pajares, 2009). Self-
efficacy refers to the person’s confidence in their ability to influence an outcome, not the
consequences of the outcome (Pajares, 2009). For example, if a faculty advisor does not believe
they can influence SAP review outcomes, they will not have an incentive to take action to
improve progress rates. There are several origins of a person’s self-efficacy beliefs, most
importantly, their experience of success or failure performing the task, but also their observation
of others’ experiences, social messages received from others, and their own psychological state
(Pajares, 2009). If a faculty advisor has years of experience advising students, and their
interpretation of that experience is that student progress is not correlated with their advising, the
faculty advisor will likely lack self-efficacy beliefs about their ability to influence student
progress.
Similarly, if a faculty advisor observes their colleagues’ attempts to influence student
progress are not effective, they may develop the belief that such attempts are futile. Additionally,
a faculty advisor’s self-efficacy beliefs about student progress can be influenced by the social
messages they receive from their fellow faculty advisors regarding student progress, whether
those messages are intentional or not. A faculty advisor’s psychological state can also affect their
self-efficacy beliefs. A positive mood increases self-efficacy, and a depressed, anxious, or
unhappy mood can lead to decreased self-efficacy beliefs. Finally, a study of pre-law advisors
shows that time and resources are positively correlated with an advisor's sense of self-efficacy in
advising (Knotts & Wofford, 2017). A faculty member with insufficient time or resources to
devote to advising is less likely to feel that they are a successful advisor.
In summary, Table 4 shows the faculty advisor's assumed motivational influences related
to the goal of increasing the number of doctoral students who make satisfactory academic
progress. These influences were identified based on the literature described above. In the next
section, the organizational influences related to the goal will be described.
Table 4
Assumed Motivation Influences and Motivational Influence Assessments

Organizational Mission: Dedicated to the education of the next generation of social justice
practitioners and scholars

Organizational Global Goal: Increase the number of doctoral students who make satisfactory
academic progress (SAP) by 12% by September 2020.

Stakeholder Goal: By September 2019, all core faculty will demonstrate an understanding of
how to advise students regarding SAP reviews, their importance, and their relationship to
time-to-degree completion.

Motivational Indicator(s): Survey results that show faculty advisors recognize the value of their
mentees making satisfactory progress.

Assumed Motivation Influence 1 (Utility Value): Faculty advisors need to value the importance
of SAP reviews to students and the institution.
Motivational Influence Assessment: Survey questions asking how faculty advisors value the
SAP review

Assumed Motivation Influence 2 (Self-Efficacy): Faculty advisors need to believe they can
influence SAP review outcomes for their advisees.
Motivational Influence Assessment: Survey questions asking about the influence faculty
advisors believe they have on the outcomes of SAP reviews
Organization
The final aspect of the gap analysis is determining the organizational influences relevant
to the problem being analyzed (Clark & Estes, 2008). Organizational influences are defined as
the work processes, material resources, and value chains and streams within the organization
(Clark & Estes, 2008). Organizational barriers may include the policies, processes, and resource
levels within the organization that prevent an employee from being successful at achieving their
stated or assigned work goals (Clark & Estes, 2008). Organizational culture can support or be a
barrier to reaching an organizational goal, as an organization’s culture defines the ways that
policies are enforced and procedures are enacted (Clark & Estes, 2008; Schein, 2017).
General theory. Within any organization, there are both cultural models and cultural
settings at work which combine to create the organizational culture (Gallimore & Goldenberg,
2001). Cultural models represent the shared attitudes and values of the organization, which are
generally understood but invisible to the individual who works there (Gallimore & Goldenberg,
2001). Cultural settings are the visible manifestations of an organization’s cultural models, for
instance, stated goals and values, policies, procedures, and stated performance goals, or lack
thereof (Gallimore & Goldenberg, 2001). Within an organization, the cultural settings and
models work together to create the overall organizational culture for individuals, workgroups,
and the organization as a whole.
Assumed cultural settings. Where cultural models are generally the invisible norms and
values of an organization, the cultural settings are the tangible expressions of an organization’s
culture, its expressed values, policies, and procedures (Gallimore & Goldenberg, 2001). The
assumed organizational influence that represents cultural settings at PGU is that the organization
needs to provide resources and training related to SAP and facilitating doctoral student progress.
Accountability is what can take an organization from vague goals to a solid structure for
reaching those goals (Schein, 2017). One of the ways that an
organization holds employees accountable for their performance is by evaluating their
performance regularly (McClellan, 2016). While staff advisors are given performance reviews at
least three times annually (PGU staff handbook), faculty advisors are reviewed once every three
years (PGU faculty evaluation policy). Student advising and facilitating learning are aspects of
the faculty review, but metrics like student advisee progress rates and satisfactory academic
progress review rates are not evaluated as part of the faculty review.
One of the many differences between a staff advisor and a faculty advisor in the dual
advising model is the level of training and instruction they are given related to student advising
topics. Staff advisors are hired specifically for their skill or expertise at advising and trained
extensively, while faculty members advise students as a small part of their overall role at the
university (McClellan, 2016). Staff advisors are trained on policy and procedure intensively upon
being hired and on an ongoing basis throughout the year (McClellan, 2016). Faculty advisors, in
contrast, are given limited training on advising topics and have significant additional
responsibilities for the university, including teaching, committee work, and pursuing research
and publication (McClellan, 2016). Providing additional training and communication about the
importance of student progress could encourage faculty to put more time and effort into
promoting degree progress with their advisees.
Assumed cultural models. Cultural models are invisible but commonly understood
norms for an organization (Gallimore & Goldenberg, 2001). For an organization to make
progress toward its goals, in this case for PGU to improve student progress rates, it must change
the organizational culture related to student progress. The
assumed influence that represents cultural models at PGU is that the organization needs a culture
valuing the importance of monitoring and tracking student progress.
At PGU, there has long been an understood cultural model among faculty that students
should be allowed to take as long as they need to gain competency in a particular area of study.
This core value is expressed directly in a book about the origins of PGU, written by a longtime
PGU faculty member. However, this cultural model is in direct conflict with the expectation to
complete a specific number of credits each year to maintain satisfactory academic progress
(SAP). Faculty advisors may encourage their advisees to request or extend incomplete grades in
courses to focus on additional learning or career goals, which may be seen to represent a
qualitative measure of progress. In contrast, staff advisors complete and enforce SAP reviews
and encourage this quantitative measure of progress. Conflicts between students and program
staff and faculty have been found to be a leading cause of doctoral student attrition (Caruth,
2015). Ultimately, it is the students who face the implications of the unsatisfactory review by
losing financial aid and scholarship eligibility.
In summary, Table 5 shows the assumed organizational influences related to the goal of
increasing doctoral student satisfactory academic progress rates by 12% at PGU. These
influences were identified based on the literature described in this section.
Table 5
Organizational Influences and Organization Influence Assessments

Organizational Mission: Dedicated to the education of the next generation of social justice
practitioners and scholars

Organizational Global Goal: Increase the number of doctoral students who make satisfactory
academic progress (SAP) by 12% by September 2020.

Cultural Setting Influence: The organization needs to provide faculty with resources and
training related to SAP and facilitating doctoral student progress.
Organization Influence Assessment: Survey questions about the training faculty have received,
and the training they would like to receive, related to facilitating student progress

Cultural Model Influence: The organization needs a culture valuing the importance of
measuring and tracking student progress.
Organization Influence Assessment: Survey questions about the importance and validity of
quantitative progress indicators required by accreditors and the Department of Education

Conceptual Framework
A conceptual framework provides a visualization or conceptualization of the research
study as well as a way to posit a theory of the assumed interactions between the stakeholder and
the organization in relation to the strategic goals (Maxwell, 2013). A conceptual framework can
justify the need for a research study as well as provide a model to test assumed theories about the
interaction of the stakeholders (Maxwell, 2013). In this conceptual framework, while each of the
potential influences is presented independently, they do not exist in isolation. Their relationship
is both complex and nuanced, but Figure 1 represents an understanding of the general ways
they interact in relation to the strategic goal.
The conceptual framework for this study is based on the gap analytic framework
described by Clark and Estes (2008). Clark and Estes created a framework for identifying and
analyzing the gap between a performance goal and the current level of performance in an
organization. Using this framework, the stakeholder knowledge and skills, motivation, and
organizational influences that impact performance toward the goal can be systematically
identified and examined. Figure 1 demonstrates the assumed relationships between the
knowledge and motivational influences of the stakeholder, doctoral faculty advisors, and the
organizational influences of Progressive Graduate University, in relationship to the strategic goal
to increase students’ satisfactory academic progress rates by 12%.
The organization, Progressive Graduate University, is represented by a large blue circle,
with the assumed cultural model and setting influences shown within it. The assumed cultural
model influence is that the organization needs to value the importance of measuring and tracking
student progress. The cultural setting influence is that the organization needs to create a structure
of faculty accountability for student progress.
Figure 1
Conceptual Framework Drawing Showing the Interaction Between the Organization,
Progressive Graduate University, The Stakeholders, Doctoral Mentoring Faculty, and the Goal,
Increasing SAP Rates By 12%.
The stakeholder, PGU doctoral faculty advisors, is represented by a green circle that is
entirely enclosed within the blue circle of the organization. This represents the relationship of the
faculty as employees of the organization and therefore influenced by the organizational culture.
The assumed knowledge and motivational influences for faculty are listed within the green
circle. The knowledge influences are: faculty advisors need to understand the SAP policy, and
faculty advisors need procedural knowledge of how to facilitate doctoral student progress. The
motivational influences are: Faculty advisors need to believe they can influence SAP review
outcomes for their advisees, and faculty advisors need to value the importance of SAP reviews to
students and the institution.
There is a yellow arrow originating from the blue circle representing the organization that
points toward the strategic goal, shown within a red rectangle, representing the organization’s
desired progress toward its strategic goal. The strategic goal is to increase SAP rates by 12% by
September 2020. The faculty, as they exist within the organization, are therefore also moving
toward the strategic goal, through the influence of the organization.
The specific knowledge, motivation, and organizational influences for this study were
chosen based both on the research literature and nine years of personal experience working for
the organization. Research shows that the relationship between a faculty advisor and a student is
an essential part of the student’s success in a doctoral program (Golde et al., 2009; Rockinson-
Szapkiw et al., 2016; Wao & Onwuegbuzie, 2011). The literature also supports that an effective
faculty advisor should be knowledgeable about the program, its requirements, and deadlines that
affect students (Calabrese & Smith, 2010; King, 2008; Schroeder & Terras, 2015). However, in
this researcher’s experience, faculty advisors do not advise students about SAP requirements,
and some even denigrate the SAP review as unimportant. Thus, faculty advisors must have a
factual knowledge of the SAP policy as well as the procedural knowledge of how to facilitate
student progress. Equally important is that faculty advisors believe that they can influence SAP
decisions for their students as well as value the importance of the SAP reviews if the
organization is going to reach the strategic goal of increasing SAP rates. Without the
motivational influence of valuing the importance of SAP reviews, knowledge of the policy will
not lead to a change in advising practices.
Similarly, if the organization wants to see a change in faculty advising practices, they
need to create a structure of faculty accountability for their advisees’ progress (Craft et al., 2016).
Creating a culture of accountability provides additional motivation for faculty to devote more of
their time to advising students and tracking their progress (Craft et al., 2016). Without creating
the expectation of a faculty member’s advisees making SAP at specific rates and reinforcing that
expectation through the faculty review process, the organization cannot expect a sustained
change in faculty behavior. Without a sustained change in faculty behavior, the organization
cannot expect to see an improvement in SAP rates.
Conclusion
The purpose of this study is to understand the faculty knowledge, motivation, and
organizational influences that may affect the rates of satisfactory academic progress reviews
for doctoral students at Progressive Graduate University. This literature review examined the
research related to doctoral student progress in the United States, beginning with a history and
overview of the problem and then addressing the specific personal, financial, and programmatic
factors thought to influence doctoral student progress. These topics were then applied to Clark
and Estes’ (2008) gap analysis to determine the faculty knowledge, motivation, and
organizational influences that are assumed to affect doctoral student progress rates at Progressive
Graduate University. Chapter 3 will present the methodological approach for this study, as well
as how the influences will be validated.
CHAPTER THREE: METHODOLOGY
The purpose of this project was to conduct a gap analysis to examine the knowledge,
motivation, and organizational influences that create barriers for doctoral students, preventing
them from making satisfactory progress toward degree completion. This project is aligned with
PGU’s strategic plan, which calls for a 12% increase in satisfactory progress reviews by
September 2020. The analysis began by generating a list of possible or assumed interfering
influences, which were then examined systematically to identify actual or validated interfering
influences. While a complete gap analysis would focus on all stakeholders, for practical
purposes, the stakeholder of focus in this analysis is doctoral faculty advisors at Progressive
Graduate University.
The research questions for this project are:
1. What are the knowledge, motivation, and organizational elements that interfere
with Progressive Graduate University achieving a 12% increase in satisfactory
academic progress (SAP) review outcomes?
2. What is the interaction between Progressive Graduate University’s culture and
context and doctoral faculty advisors’ knowledge and motivation toward the goal
of increasing SAP review outcomes by 12%?
These are descriptive research questions that can be best answered using a convergent
mixed method research design (Creswell & Creswell, 2018). While descriptive research
questions can be answered by qualitative, quantitative, or mixed methods studies, for this study, a
mixed-method research design was indicated. The mixed-method approach was chosen because
of the ability to compare the qualitative and quantitative data to develop a rich understanding of
the perspectives of individual stakeholders (Creswell & Creswell, 2018); in this case, doctoral
faculty advisors. A convergent mixed methods design collects qualitative and quantitative data
concurrently, and then the results of each part of the study are analyzed and compared (Creswell
& Creswell, 2018).
In this study, the data collected attempt to determine the knowledge, motivation, and
organizational influences that interfere with faculty advisors' efforts to increase rates of student progress.
Mixed method research is best used when a researcher wants to take advantage of the strengths
of qualitative and quantitative data collection methods, while also minimizing the limitations of
each method when used individually (Creswell & Creswell, 2018). Quantitative methods are
indicated when there exists significant literature on a subject, as there is about doctoral student
progress, which has been written about extensively since the early 1900s (Berelson, 1960;
Caruth, 2015; James, 1903; Tinto, 1987). Quantitative research methods are also indicated when
a researcher is interested in investigating “performance data and attitude data” (Creswell &
Creswell, 2018, p. 16) as it relates to known theories in the field. With such extensive research
into the issue already published, it was valuable to determine PGU faculty members’ attitudes
and performance in relation to the established field of literature.
Qualitative research is indicated when discovering meaning is important (Creswell &
Creswell, 2018). A qualitative interview allows a researcher to probe and develop a deeper
understanding of the participants' thoughts and feelings about a subject (Creswell & Creswell,
2018). Finally, a researcher must take their audience into account when designing a study
(Creswell & Creswell, 2018), and given the number of competing projects and priorities at PGU,
and the limited time for completing this study, a convergent mixed methods design allowed for
the most in-depth results in the shortest amount of time.
Participating Stakeholders
Progressive Graduate University (PGU), a pseudonym, is a small, private, non-profit
graduate school that offers distance and hybrid graduate certificate, master’s, and doctoral degree
programs. The student population is approximately 1000 students, with 80% of the students
enrolled in doctoral programs. PGU stakeholders include faculty, students, staff, and
administrators, but for the purposes of this study, the stakeholder population of focus was
doctoral faculty advisors. PGU has approximately 130 core and adjunct faculty members for its
doctoral, master’s, and certificate programs, but the specific population of focus for this study
was the approximately 60 core faculty members who serve as advisors to doctoral students. At
PGU, where there is not a tenure system for faculty, the term “core faculty” refers to permanent
part- or full-time contract faculty members in an academic program. In addition to core faculty,
each program also employs adjunct faculty who are hired to teach a specific course for a specific
term but generally do not serve as faculty advisors or dissertation committee members.
Survey Sampling Criteria and Rationale
Faculty Criterion 1. Faculty participants must be current core doctoral faculty at
PGU. PGU hires both core and adjunct faculty to teach in graduate programs, but only core
faculty members are assigned as doctoral advisors/mentors. When a core faculty member retires
or resigns, they may be retained on a temporary adjunct basis to continue advising one or more
students close to completing their degree(s). Because a retired or resigned faculty member's
knowledge and motivation may be affected by their non-continuing status, these faculty were
excluded from the study. For instance, a retired faculty member may put less energy into
advising their few remaining students or, conversely, may put enormous energy into those
students' advising after being freed from the administrative tasks of being a full-time
faculty member. Either way, the inclusion of these adjunct faculty could skew the results of the
study.
Faculty Criterion 2. Faculty participants must be currently assigned as the faculty
advisor of one or more active students. To determine the knowledge, motivation, and
organizational influences of current doctoral faculty advisors, the faculty surveyed must
currently serve in the role of advisor to at least one active student. While other faculty members
may have previously been an advisor or may serve as an advisor in the future, this study seeks
only to determine the knowledge and motivation of current faculty advisors at PGU.
Survey Recruitment Strategy and Rationale
This study employed a survey design strategy. Survey design is a quantitative research
design using a questionnaire or survey as the primary method of data collection to determine the
attitudes and opinions of a population (Creswell & Creswell, 2018; Johnson & Christensen,
2014). This study employed a census rather than a sampling strategy. A sampling strategy is
indicated when a population is so large that individually contacting every member of a
population is unrealistic for a study in terms of time and resources (Johnson & Christensen,
2014). A census is a study that contacts every member of a population to request information
(Johnson & Christensen, 2014), which is feasible for this study with a population of
approximately 60 persons in the stakeholder group. This approach avoids the need to generalize
the results of a sample to the entire population, as the entire population was surveyed.
PGU restricts methods of soliciting research participants to public postings, either in
person, online, or on a university-controlled forum. Due to this restriction, invitations to
participate could not be mailed or emailed directly to faculty advisors, and follow-up messages
could not be sent to non-responders. Therefore, the invitation to complete the survey and
interview was distributed using the modalities allowed. A poster was displayed on a public
bulletin board during a faculty retreat in January 2020, an announcement introducing the study
and requesting participation was made at a face-to-face faculty meeting that also occurred in
January 2020, and postcards with the survey link were circulated at the same meeting.
Due to the busy schedules of faculty advisors, it may have been difficult for faculty
advisors to make time to complete the survey. In order to encourage participation, an incentive
was offered. For each survey completed, a $10 donation was made by the researcher to a PGU
student scholarship fund. In the request for participation, and on the first page of the survey,
information was provided regarding the purpose of the survey, the voluntary nature of
participation, and the confidentiality of responses.
Interview Recruitment Criteria and Rationale
Faculty interview criterion 1. Faculty participants must be eligible to complete the
survey portion of the study. Namely, to participate, faculty members must be both core PGU
faculty members and currently serving as faculty advisor to one or more students. As
previously described, this limited participants to those who are current doctoral faculty advisors
and ensures that data collected is reflective of current faculty advisor beliefs and attitudes.
Faculty interview criterion 2. Faculty respondents must be willing to participate in a
20-minute in-person or online face-to-face interview. Respondents were invited to participate
in interviews in the survey invitation posting and again upon completion of the online survey.
Interview Recruitment Strategy and Rationale
A purposeful sampling strategy was utilized to determine the interview respondents from
the census population. Purposeful sampling is a strategy whereby individuals are selected who
are particularly appropriate for answering the research questions (Maxwell, 2013). In this case,
interview participants from the PGU program with the lowest SAP rates (Program 2) were
prioritized in order to gain a better perspective on those faculty members’ attitudes and beliefs
about SAP. To encourage participation, an incentive was offered in the form of a $20 donation to
a PGU student scholarship fund for each interview completed.
Data Collection and Instrumentation
This study used anonymous surveys and confidential interviews to collect information
from and about how doctoral faculty address satisfactory academic progress with their doctoral
student advisees. The survey and interview questions addressed the assumed knowledge, skills,
motivation, and organizational resources faculty need to increase rates of doctoral student
progress. Faculty responsibility for students’ progress is a politically sensitive topic at PGU, so
anonymous surveys and confidential interviews are essential to collecting candid responses.
Because the researcher is internal to the organization and is asking about a politically sensitive
topic, respondents may be more likely to give detailed information if they are assured that their
responses will remain confidential (Robinson & Leonard, 2019).
Surveys
Survey instrument. The faculty survey, shown in Appendix A, included 27 questions,
with three questions related to each of the six assumed influences as well as five demographic
questions and one open-response summary question. In order to assess the knowledge influences,
multiple-choice, closed-ended questions were asked to assess the faculty advisors’ knowledge of
the factual aspects of those influences. Multiple choice questions are indicated when assessing a
respondent’s level of knowledge about a subject (Irwin & Stafford, 2016). Open response
questions were asked for procedural knowledge influences. Open response questions are
indicated when there are a wide range of ways to answer the question, and the researcher is
interested in collecting nuanced data (Robinson & Leonard, 2019). The motivational influences
were assessed with Likert or Likert-style rating scales to determine agreement or disagreement
with statements about the motivational influences. Rating scales are indicated when measuring
attitudes about a topic (Irwin & Stafford, 2016; Robinson & Leonard, 2019). Organizational
influences were assessed using Likert or Likert-style rating scales to measure the cultural
model influence. The organizational setting influence was measured using multiple-choice
questions with an option for an open-ended response. As these questions are related to resources
and training faculty feel they need the organization to provide, it is most appropriate to allow for
an open-ended response because of the wide range of possible answers (Robinson & Leonard,
2019).
Survey procedures. Surveys were administered anonymously using Qualtrics online
survey software. Online survey administration is time and cost-efficient, helps protect respondent
anonymity, and allows for built-in data analysis tools (Robinson & Leonard, 2019). Online
administration is particularly appropriate for PGU faculty members as they are distributed
throughout the country and are accustomed to email and online communication. The survey
remained open for six weeks, including several weeks in the middle of the term, a time that is
less busy for faculty than the start or end of a term (Irwin & Stafford, 2016).
Interviews
Eight one-time, semi-structured interviews were conducted via the online meeting
platform Zoom. Interviews were semi-structured to allow for flexibility in questioning and
follow-up on themes that emerge from the questions (Merriam & Tisdell, 2015). Questions were
focused on the assumed motivational influences, with a secondary focus on the assumed
organizational influences. These areas of inquiry best supplement the survey questions and allow
for collecting rich data from participants. The full interview protocol can be found in Appendix
B.
Validity and Reliability
Ensuring the validity and reliability of a survey instrument is essential for quantitative
research (Robinson & Leonard, 2019). This study used newly created survey questions rather
than using an existing instrument. In order to increase the content validity of the questions, it is
recommended that a researcher pretest the draft questions with content experts familiar with the
organization, the survey content, or with survey administration (Irwin & Stafford, 2016;
Robinson & Leonard, 2019). Cognitive interviewing is the process of asking a respondent to
complete the survey while the researcher observes (Irwin & Stafford, 2016). This process allows
the researcher to identify any confusing elements of the questions and to give feedback on the
order of the questions and the logic of the survey (Irwin & Stafford, 2016; Robinson & Leonard,
2019). In order to increase the validity of the questions, cognitive interviews were conducted
with content experts. The survey was tested with staff advisors from PGU who are familiar with
the SAP review process and terminology used by faculty advisors. These content experts
provided detailed feedback on the question content and helped ensure that the questions were
easy for respondents to understand and answer.
In addition to content validity, it is essential to establish the reliability of the survey
items assessing the assumed knowledge, motivation, and organizational influences. There are
several ways to increase reliability, including standardizing the survey administration, increasing
the number of questions, making the questions easy to understand and moderately hard (rather
than too easy or too hard), and avoiding distracting external events (Salkind, 2016). This survey
was administered in a standardized way, online, with detailed instructions. Additionally, multiple
questions were asked for each assumed influence to increase the survey's reliability. While it is
not possible to avoid external distractions altogether, the ability to complete the survey at the
respondent's own time and pace should help minimize distractions.
Another critical element for gathering quality data is ensuring an adequate response rate
to a survey. There are several ways to increase the response rate to a survey, including ensuring
that the respondents understand the survey’s relevance to their interests, keeping the survey as
short as possible in order not to take an excessive amount of the respondent’s time, and utilizing
reminders to encourage participation (Robinson & Leonard, 2019). One way to help respondents
understand the relevance and importance of the survey is to enlist the help of key stakeholders to
endorse and encourage participation (Pazzaglia et al., 2016). The university provost's approval
of the study served that purpose, as did the faculty chair's permission to announce the study
during a faculty meeting. Additionally, the need for additional questions to
ensure reliability was balanced with the need for the survey to take a reasonable amount of time
to complete. Working with content experts during the cognitive interviewing process helped
ensure that none of the questions were unintentionally offensive or upsetting to the respondents.
These measures were used to increase the response rate and reduce the bias that results from non-
responses.
Data Analysis
Quantitative data analysis was completed using Microsoft Excel. Survey responses were
downloaded from Qualtrics into Excel and analyzed, and descriptive statistics were generated for
each question. Responses to qualitative and open-response questions were standardized and
converted to numerical data when applicable. Longer responses were saved in a Word document
hand-coded using the assumed knowledge, motivation, and organizational influences as a priori
open codes. Interview transcripts were generated by downloading the automatically generated
Zoom cloud recording transcript, and corrections were made manually while listening to the
recorded interview. Transcripts were then hand-coded using the assumed knowledge, motivation,
and organizational influences as a priori open codes. Once categorized with a priori codes, the
transcripts were reviewed thoroughly to identify themes within the a priori categories, and those
themes were explored, cross-checked, and analyzed for frequency (Creswell & Creswell, 2018).
The coded transcripts were then reviewed for quotes and statements that illustrated the
quantitative findings in order to keep the research focused on the participants' meaning and lived
experience.
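As a sketch of the a priori coding tally described above, the following snippet counts how often each assumed-influence code appears across coded transcript segments. The code labels and example excerpts are illustrative placeholders, not actual study data; the study's coding was done by hand in Word, so this is only a conceptual model of the frequency analysis.

```python
from collections import Counter

# A priori codes drawn from the assumed influences (hypothetical labels).
A_PRIORI_CODES = {"knowledge", "motivation", "organization"}

def frequency_by_code(coded_segments):
    """Tally how often each a priori code was applied across coded segments.

    `coded_segments` pairs a code label with the transcript excerpt it was
    applied to; segments with unrecognized labels are ignored.
    """
    return Counter(code for code, _ in coded_segments if code in A_PRIORI_CODES)

# Illustrative, fabricated-for-example coded segments:
segments = [
    ("knowledge", "I wasn't sure which terms the review covered."),
    ("motivation", "I believe I can influence my advisees' progress."),
    ("motivation", "SAP reviews matter to students' funding."),
]
print(frequency_by_code(segments))  # Counter({'motivation': 2, 'knowledge': 1})
```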
Credibility and Trustworthiness
There are two threats to the credibility and trustworthiness of a qualitative study. The first
is researcher bias, and the second is reactivity (Maxwell, 2013). Researcher bias is the idea that
the researcher brings their own biases and preconceived ideas to a research project, which could
influence the conclusions they draw. In this study, researcher bias was mitigated through
triangulation of the survey and interview data, a strategy that can help offset the bias
inherent in qualitative research.
Reactivity refers to the ways in which the respondent is influenced by the presence of the
researcher, either by knowing that they are being observed and studied or by the position of the
researcher in relation to the research topic or the respondent (Maxwell, 2013). Because the
interviewer in this study is a member of the organization and is known to the respondents,
additional information was provided at the start of each interview to acknowledge the
researcher's relationship to the subject matter. Respondents were reminded that the results would
be kept confidential and would not be used against them in any way. Respondents were encouraged to be
forthright, and interview responses were compared to the anonymous survey results to provide a
point of comparison.
Ethics
Ethics are essential to address in any study, and this study focused on three methods of
ensuring ethical research: avoiding harm to participants, obtaining informed consent, and
treating participants with honesty and respect. The study was designed with the express purpose
of not harming participants in any way. Glesne (2011) describes the importance of empowering
research participants by providing them with informed consent before they participate. Informed
consent means explaining that participation in the study is voluntary and explaining any risks for
participation (Glesne, 2011). Further, Rubin and Rubin (2012) describe the researcher’s ethical
responsibility to be honest with participants, treat them with respect, and honor the promises
made to them.
Participants in this study were given information about informed consent and asked to
sign an informed consent form prior to participating in interviews. Appendix C shows a sample
informed consent form. Signed informed consent forms were stored on a password-protected
private computer in the researcher's home and will be securely disposed of at the conclusion of
the study as directed by IRB requirements. Survey participants were asked to confirm their
understanding of the informed consent by checking a box before beginning the survey. The
complete survey instrument, including the informed consent statement, can be found in Appendix
A. The statement explained that participation in the study was voluntary and that respondents'
anonymity (for the survey) or confidentiality (for the interview) would be protected. It also
explained that while the survey would ask for some general demographic information to
categorize responses, such as the number of years of advising experience and department
affiliation, it would not request or capture a record of their
name, IP address, or any other specific identifying information. The interview participants were
asked to choose a pseudonym (or were assigned a pseudonym from a list) to protect their
confidentiality, and their name and other identifying information were not recorded in field notes
or interview transcripts. Participants were also notified that this study was submitted to and
approved by the Institutional Review Board (IRB) at USC and PGU and follows the guidelines
of both universities’ IRB to protect the rights and welfare of study participants.
Another important aspect of ethical responsibility is disclosing the researcher’s
relationship with the participants and the organization being studied (Glesne, 2011). As a current
employee of the organization, the researcher’s role was disclosed to IRB and to study
participants. The researcher’s position as the director of the advising office is distinct from the
role of the researcher for this study, and the researcher is not in a supervisory relationship with
any of the participants of the study. The reporting structure of the organization shows the director
of advising as parallel to study participants but separated by two levels of authority, i.e., the
researcher’s supervisor, an associate provost, answers to the provost, as do the faculty advisors’
supervisors, the department chairs.
Survey data was collected via Qualtrics online survey software. The survey instrument
was designed to not collect any personally identifying data on participants, ensuring strict
anonymity for all respondents. The data collected was password-protected, and was only
downloaded to one personal computer, which is also password protected and kept locked in the
researcher’s home. However, as the data itself does not contain any identifying information, even
if the password and the security of the home were compromised, it would not be possible to
identify participants or their individual information from the data.
Interview data was kept on a password-protected personal computer belonging to the
researcher and did not contain any names or identifying information about participants. Signed
informed consent forms were kept separate from transcripts, and there is no connection between
those forms and the interview transcripts and data. Direct quotes were used only if they did not
pose a risk of identifying the participant. All reported data was de-identified and summarized to
protect respondent confidentiality.
The results of this study will be used to inform initiatives to improve student advising in
the organization by addressing any gaps in advising that may exist between the faculty advisors
and the staff advisors. As the supervisor of the staff advisors, this researcher’s role may result in
assumptions about the level of advising faculty members provide, or biases about faculty
advising. Care was taken to address these biases when designing the study and constructing the
survey and interview questions. Questions were designed to be neutral in tone and not make
assumptions about the knowledge and motivation of participants. Participants were informed that
the purpose of the study is to improve advising services for students, not to highlight deficiencies
in faculty advising or any other current university practices. There was an incentive offered for
participation in the survey, in the form of a $10 donation per survey and a $20 donation per
interview to a PGU doctoral student scholarship fund. The incentive was intended to reward
participation in a way that is particularly relevant to faculty advisors: supporting student
progress by providing scholarship funding to students.
Limitations and Delimitations
Every research study design has strengths as well as limitations. Mixed methods designs
are one way to maximize the strengths and minimize the limitations of each type of data
collection (Creswell & Creswell, 2018). Anonymous surveys have the limitation that responses
cannot be verified, particularly regarding a respondent’s behavior (Robinson & Leonard, 2019).
Interviews are limited by the honesty of the respondents and are subject to response bias due to
the presence of the interviewer (Creswell & Creswell, 2018). Additionally, a low response rate
was anticipated due to the university’s restrictions on soliciting research participation. This
factor may increase the possibility of non-response bias and lead to skewed data.
CHAPTER FOUR: RESULTS AND FINDINGS
This study explores the gap in faculty advisor knowledge and motivation as well as the
organizational influences related to improving satisfactory academic progress rates at
Progressive Graduate University. A review of the literature related to student progress was
conducted and assumed knowledge, motivation, and organizational influences were identified
and are summarized in Table 6. Results and findings will be organized by influence and category
of influence. A mixed methods study design was used to collect qualitative and quantitative data
from the stakeholder group, faculty advisors. The study included an anonymous survey that
collected quantitative and qualitative data, as well as eight short interviews to provide more
detailed qualitative data in support of the quantitative results. The survey and interviews were
conducted concurrently to maximize response rates within a limited time frame.
Table 6
Influence type, subtype, and stakeholder assumed influences analyzed for this study

Influence     | Influence Category | Stakeholder Assumed Influence
Knowledge     | Factual            | Faculty advisors need to understand the SAP policy.
Knowledge     | Procedural         | Faculty advisors need to know how to facilitate doctoral student progress.
Motivation    | Utility Value      | Faculty advisors need to value the importance of SAP reviews to students and the institution.
Motivation    | Self-Efficacy      | Faculty advisors need to believe they can influence SAP review outcomes for their advisees.
Organization  | Cultural Setting   | The organization needs to provide faculty with resources and training related to SAP and facilitating doctoral student progress.
Organization  | Cultural Model     | The organization needs a culture valuing the importance of measuring and tracking student progress.
The research questions guiding this study were:
1. What are the knowledge, motivation, and organizational elements that interfere
with Progressive Graduate University achieving a 12% increase in satisfactory
academic progress (SAP) review outcomes?
2. What is the interaction between Progressive Graduate University’s culture and
context and doctoral faculty advisors’ knowledge and motivation toward the goal
of increasing SAP review outcomes by 12%?
Participating Stakeholders
The stakeholders for this study are doctoral faculty advisors at Progressive Graduate
University. Doctoral faculty advisors are core faculty members at the university, qualified for a
faculty position by holding a doctoral-level degree in a field relevant to their department. There
are 63 faculty advisors across two schools at Progressive that fit the criteria for inclusion in this
study.
Survey Participants
Progressive University’s IRB restricts how researchers may solicit participation from the
university community to public postings of information (PGU IRB requirements). Direct
solicitation in the form of email messages to individual faculty advisors was not allowed. This
restricted the ability to follow up with individual participants and resulted in relatively low
participation rates. Participation was requested through the announcement of the study at a
faculty meeting, followed by passing out postcards containing the survey link. A poster
advertising the study was displayed in a shared office area near faculty mailboxes. An
announcement and a follow-up reminder were posted on the university’s public Facebook group
page. Of the 63 faculty advisors who fit the criteria for participation in this study, 22 participated
in the survey, representing a 35% response rate. School 1 represented 55% of respondents with a
total of 12, and School 2 had 10 respondents. Table 7 details the participation rates for each
school. The faculty participants ranged from newly hired (0 years of experience) to 40 years of
experience, with a mean of 15 years and a median of 11.5 years. Descriptive statistics for the
participating faculty advisors’ years of experience are detailed in Table 8.
Table 7
Interview and Survey Participation by School Affiliation

            Survey Participation    Interview Participation
            #        %              #        %
School 1    12       55%            3        38%
School 2    10       45%            5        63%
Total       22      100%            8       100%
Table 8
Descriptive Statistics for Participants’ Self-Identified Years of Experience
(Question 2: How many years have you been advising doctoral students?)

Mean                  15.23
Standard Error         2.69
Median                11.5
Mode                  15
Standard Deviation    12.60
Sample Variance      158.83
Kurtosis              -0.51
Skewness               0.82
Range                 40
Minimum                0
Maximum               40
Count                 22
Interview Participants
A total of eight follow up interviews with faculty advisors were conducted for this study.
The participants primarily represented School 2 (63%), as shown in Table 7. A list of the
interview participants, their chosen pseudonyms, and their self-identified number of years’
experience and number of current advisees is detailed in Table 9. Interview participants
volunteered to participate after completing the survey instrument by clicking a link and entering
their name and contact information on a Google form unconnected to the Qualtrics survey or by
emailing the researcher directly. The participants were contacted by email and invited to use an
online scheduling page set up using Acuity Scheduling to find a time for the interview to be held.
A private Zoom teleconferencing meeting was created and confirmed by email to occur at the
time chosen by the participant.
Table 9
Interview Participants by Pseudonym, School and Program Affiliation, Number of Advisees, and
Years of Experience

Participant #   Pseudonym   School   Program   # Advisees   Years of Experience
1               Zelda       1        3         39           5
2               Wanda       1        2         12           8-10
3               Victor      2        4         12-13        20-30
4               Tanya       2        4         14           5
5               Jean        2        4         20-25        27
6               Shaquita    2        4         5-7          2.5
7               Priscilla   1        3         39           4-5
8               Chantal     2        4         12           6
Determination of Validation
In this mixed methods study, both qualitative and quantitative data were collected to
validate the assumed influences. In a gap analysis, validation determines whether an influence is
an “asset,” meaning that the influence is present for a majority of stakeholders, or a “need,”
meaning that the influence is not present for a majority of stakeholders. The percentage of
responses, or “cut score,” needed to classify an influence as an asset or a need varies by study
(Zieky & Perie, 2006). For this
study, due to the importance of the goal to the university, an influence was determined to be
validated as a need if fewer than 75% of respondents were in agreement about an influence. An
influence was determined to be an asset if more than 75% of respondents were in agreement
about the presence of that influence. When survey responses to different questions about the
same influence did not align, or when survey responses and interview data about an influence did
not align, an evaluation was made to determine whether the results met the 75% threshold, and
the discrepancies and decision-making process are discussed in the results and findings section
for that influence.
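The cut-score rule described above can be expressed as a short decision function. This is a minimal sketch with illustrative names, not part of the study’s instruments; note also that the stated rule leaves exactly 75% agreement ambiguous, and the sketch treats that boundary case as a need.

```python
CUT_SCORE = 0.75  # the cut score chosen for this study

def classify_influence(agree: int, total: int) -> str:
    """Classify an assumed influence as an "asset" or a "need".

    More than 75% of respondents in agreement validates the influence as an
    asset; otherwise it is validated as a need. Exactly 75% is treated as a
    need here, since only "more than 75%" is defined as an asset.
    """
    if total <= 0:
        raise ValueError("total responses must be positive")
    return "asset" if agree / total > CUT_SCORE else "need"

# For example, 12 of 22 respondents (about 55%) in agreement validates a need:
print(classify_influence(12, 22))  # need
```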
Results and Findings
Knowledge Results and Findings
According to Krathwohl’s (2002) revision of Bloom’s Taxonomy, there are four types of
knowledge: factual knowledge; conceptual knowledge; procedural knowledge; and
metacognitive knowledge. This study investigated two of those types of knowledge, factual and
procedural. Factual knowledge includes terminology, definitions, and detailed, specific
facts about a subject (Krathwohl, 2002). Procedural knowledge refers to the facts that are used
to complete a task, for instance, how and in what order to complete the steps in the task
(Krathwohl, 2002).
Factual Knowledge. The survey instrument asked three questions to determine the
factual knowledge of participants related to the assumed knowledge influence: faculty need to
understand the SAP policy. The SAP review for doctoral students at PGU is an annual review. If
a student starts their doctoral program in May, their annual review is completed at the end of the
following April. The tracking sheet is an internal report, similar to an unofficial transcript, which
is a commonly used way that faculty advisors access information about a student’s record. The
tracking sheet lists the student’s program start date, which provides one way to determine their
SAP review period, and lists the student’s past SAP review dates and results. With those
two pieces of information, a doctoral faculty advisor who understands the SAP policy should be
able to determine a student’s SAP review period.
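The review-timing rule can be sketched in code. This is an illustrative reconstruction generalized from the single example given (a May start is reviewed at the end of the following April, i.e., the Spring review) and the trimester calendar described later in this chapter; the function names are hypothetical and this is not an actual PGU tool.

```python
# Terms follow the trimester calendar: Spring (Jan-Apr), Summer (May-Aug),
# Fall (Sep-Dec).
TERMS = ["Spring", "Summer", "Fall"]

def start_term(month):
    """Trimester containing a given start month (1-12)."""
    if 1 <= month <= 4:
        return "Spring"
    if 5 <= month <= 8:
        return "Summer"
    return "Fall"

def review_term(start_month):
    """The annual review closes the term just before the start-term anniversary."""
    idx = TERMS.index(start_term(start_month))
    return TERMS[(idx - 1) % 3]

def terms_in_review(review, review_year):
    """The three consecutive terms covered by a review, ending with the review term."""
    idx, year = TERMS.index(review), review_year
    covered = [f"{review} {review_year}"]
    for _ in range(2):
        idx -= 1
        if idx < 0:  # stepping back from Spring crosses into the prior calendar year
            idx, year = 2, year - 1
        covered.insert(0, f"{TERMS[idx]} {year}")
    return covered

print(review_term(5))                   # Spring
print(terms_in_review("Spring", 2020))  # ['Summer 2019', 'Fall 2019', 'Spring 2020']
```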
The first knowledge question (Question 6) asked the advisors to self-report their ability to
identify a student’s SAP review period from viewing the student’s tracking sheet. Fifty-five
percent of respondents self-reported that they know how to identify the student’s review period
using the tracking sheet. Table 10 shows the responses to Question 6 in responses and
percentages.
Table 10
Question 6: Faculty Advisors’ Self-Identified Ability to Identify SAP Review Period

When viewing a student’s tracking sheet, do you know how to identify that student’s SAP review
period (dates of annual review)?

Answers        Count     %
Yes              12     55%
No                4     18%
I'm not sure      6     27%
Total            22    100%
The next two knowledge questions tested the faculty advisors’ factual knowledge of the
SAP review process by asking for identification of the timing and terms included in a spring SAP
review. Question 7 asked, based on the student’s start month, when the student’s SAP review
would occur. Sixty-four percent of respondents correctly identified the review period as part of
the Spring review. Table 11 summarizes the responses to Question 7.
Table 11

Question 7: Faculty Advisors' Responses to Knowledge Question

If a student started their doctoral program in May, when are they reviewed for SAP?

Answers                                     Count     %
Spring Review (end of April) - correct         14    64%
Fall Review (end of August) - incorrect         1     5%
I'm not sure                                    7    32%
Total                                          22   100%
Question 8 asked for the same information as Question 7, but in a slightly different way. This
question asked which terms would be included in a Spring 2020 SAP review. Academic terms at
Progressive are four-month trimesters: Spring term (January-April), Summer term (May-
August), and Fall term (September-December). Fifty percent of respondents were able to
correctly identify the three terms included in the spring SAP review as Summer 2019, Fall 2019,
and Spring 2020. Without being able to identify the terms included in the review, the faculty
advisor will not be able to determine if a student has completed the minimum number of course
credits during the review period and will therefore not give accurate information to students
about their progress. Table 12 summarizes the responses to Question 8.
Table 12

Question 8

If a student is receiving a Spring 2020 SAP review, which
academic terms are included in that review period?

Answers                                           Count     %
Summer 2019, Fall 2019, Spring 2020 (correct)        11    50%
Spring 2019, Summer 2019, Fall 2019 (incorrect)       5    23%
Fall 2019, Spring 2020, Summer 2020 (incorrect)       0     0%
I'm not sure                                          5    23%
Blank                                                 1     5%
Total                                                22   100%
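The term-inclusion rule tested by Question 8 can be made concrete with a short sketch. This is an illustrative reconstruction of the rule as stated above, and the function name and trimester encoding are my assumptions rather than PGU artifacts: a review covers the review term itself plus the two preceding trimesters.

```python
TERMS = ["Spring", "Summer", "Fall"]  # trimesters: Jan-Apr, May-Aug, Sep-Dec

def terms_in_review(term, year):
    """Return the three trimesters covered by a SAP review ending in the
    given term: the two preceding terms plus the review term itself."""
    idx = TERMS.index(term)
    covered = []
    for back in (2, 1, 0):  # two terms back, one term back, the review term
        i = idx - back
        covered.append(f"{TERMS[i % 3]} {year - 1 if i < 0 else year}")
    return covered

print(terms_in_review("Spring", 2020))
# prints: ['Summer 2019', 'Fall 2019', 'Spring 2020']
```

Only half of the surveyed advisors could perform this reckoning correctly, which is the gap the question was designed to surface.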
When asked about her understanding of the SAP review in an interview, Zelda, a faculty
advisor with four years of experience, responded:
No, I think that I hadn't come to understand exactly [the SAP] process because it really
was never discussed directly with me. I see it when I get emails from advising that
students have either made [SAP] or not made it. So, when they haven't made it, I follow
up with my students or if they're not responding, I follow up, but it took me a while to get
on board with that as a process because I sort of had to learn how by going rather than
someone telling me upfront.
Zelda’s response suggests that faculty are not being adequately prepared to advise
students about the SAP policy and procedure, which is confirmed by the survey responses.
Similarly, in the open response section of the survey, several responses indicated the need for
more faculty advisor training about SAP. One respondent said, “If SAP is critical, academic
advisors need more info.” And another stated, “I cannot ever remember receiving any
information about advising students, particularly with regard to SAP. It might be helpful to
receive training in this area.” A third response stated, “Faculty who have been here for long
periods of time (or for shorter periods) do not understand the process, and therefore there can be
a disconnect between staff/admin, faculty, and students.” These responses indicate that there is a
systemic problem with faculty not understanding the SAP policy at PGU.
Based on the responses to the three factual knowledge questions, the open-ended survey
answers, and the interview responses, the factual knowledge influence that faculty advisors need
to understand the SAP policy is validated as a need at Progressive University. Only about half of
doctoral faculty advisors believe they can identify the SAP review period, and their answers to
the two questions testing that knowledge corroborate this self-assessment. None of the three
questions reached the 75% cut score needed to be validated as an asset. Therefore, the
knowledge influence that faculty advisors need to understand the SAP policy is validated as a
gap and categorized as an organizational need.
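The 75% cut score applied throughout this chapter reduces to a single comparison. As a sketch (the names here are mine, not part of the study's instrumentation), an influence is validated as an asset only when the share of correct or positive responses reaches the threshold; otherwise it is classified as a need:

```python
CUT_SCORE = 0.75  # threshold used in this study for validation as an asset

def validate(positive_responses, total_responses):
    """Classify an influence as an 'asset' or a 'need' based on the
    proportion of positive (or correct) responses."""
    share = positive_responses / total_responses
    return "asset" if share >= CUT_SCORE else "need"

# Question 6: 12 of 22 advisors (55%) said they could identify the review period.
print(validate(12, 22))  # prints: need
```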
Procedural Knowledge
The survey asked several questions related to the procedural knowledge influence that
faculty need to know how to facilitate doctoral student progress. Research has shown that there
are many ways for faculty advisors to facilitate student progress, but one particularly useful
strategy is for faculty to meet with their advisees regularly to provide support and encouragement
(Cross, 2018; Marshall et al., 2017; Roumell & Bolliger, 2017). While what constitutes “regular”
meetings seems to vary, several studies suggest monthly meetings to be appropriate for most
students (Cross, 2018; Roumell & Bolliger, 2017).
In order to encourage honest self-reporting of data that may be subject to social
desirability bias, the survey questions for procedural knowledge were constructed as open
response. Social desirability bias is the phenomenon whereby a respondent gives the answer
thought to portray them in the best light (Robinson & Leonard, 2019). If respondents were given
a multiple-choice range of answers, they could easily be influenced to choose an answer that
places them within the apparent normal or average range of responses. Making these questions
short-answer forced the faculty advisors to respond without any cues indicating the range of
answers that might be considered appropriate, making an accurate answer more likely.
Survey Question 9 asked faculty to report how many times per term (each term represents
four calendar months) they meet with their advisees. Many respondents indicated a range of
one-to-two or three-to-four meetings per term. In order to quantify the results, range answers
were converted to the mid-point of the range, i.e., the answer one-two was converted to 1.5.
Three responses were not converted to numbers as they could not be quantified. Two respondents
indicated “it varies” in the free-response field, and one respondent left the field blank. As shown
in Table 13, the most common response (mode) was four meetings per term, or once per month.
The median response was 3.75, and the mean was 3.47.
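The midpoint conversion and the descriptive statistics above can be reproduced in a few lines of code. The sketch below is illustrative only: the sample responses are invented, not the study's data, and `quantify` is a hypothetical helper.

```python
import statistics

def quantify(answer):
    """Convert a free-text meetings-per-term response to a number.
    Ranges become their midpoint (e.g., "1-2" -> 1.5); answers that
    cannot be quantified (blank, "it varies") become None."""
    text = answer.strip().lower()
    if not text or "varies" in text:
        return None
    if "-" in text:
        low, high = (float(part) for part in text.split("-"))
        return (low + high) / 2
    return float(text)

# Invented example responses, for illustration only.
responses = ["1-2", "4", "3-4", "it varies", "6", "4", ""]
values = [v for v in map(quantify, responses) if v is not None]
print(round(statistics.mean(values), 2), statistics.median(values), statistics.mode(values))
# prints: 3.8 4.0 4.0
```

The study's reported values (mean 3.47, median 3.75, mode 4, n = 18) come from the same procedure applied to the actual survey responses.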
Table 13

Descriptive Statistics for Faculty Advisors' Self-Identified Number of Meetings with Advisees
Each Term

Question 9: In a typical semester/term, how often do you meet with your
advisees, either individually or in groups?

Mean                   3.472222
Standard Error         0.301066
Median                 3.75
Mode                   4
Standard Deviation     1.277316
Sample Variance        1.631536
Kurtosis              -0.62981
Skewness               0.045623
Range                  4.5
Minimum                1.5
Maximum                6
Sum                   62.5
Count                 18
Question 10 asked participants to answer the question, “Do you check in with your
advisees if they have not been in touch with you as expected? Why or why not?” All respondents
(100%) indicated that they do check in with their advisees when they have not been in touch as
expected. However, several further indicated that they reach out to non-responsive students but
do not feel that students should rely on faculty to offer assistance proactively. One respondent
indicated, “as graduate students, they have the responsibility to take the initiative to ask for
help.” Another respondent answered:
Yes, with some reluctance, I do [reach out to non-responsive students]. I see this as part
of the role, especially in a relationally oriented institution such as {PGU}. At the same
time, we are an institution of adult learning. I am amazed at the extent to which adult
doctoral students do not stay in touch with mentors/advisors and need these check-ins,
prompts, and nudges.
Other respondents showed concern for the students’ personal and family situations in
their responses. One indicated that they reach out to non-responsive students “to determine if
they are experiencing difficulties or could benefit from guidance.” Another said that “sometimes
I'm just concerned about their well-being.” Additionally, several respondents clarified that they
reach out more often if the student is in a particularly challenging part of the program, such as in
their first year or during dissertation writing.
Question 11 asked faculty advisors, "How do you track your advisees' progress through
the program?" Twenty faculty advisors answered this question; their responses were analyzed
and categorized by theme and frequency, as detailed in Table 14. The majority
of respondents (80%) indicated that they track student progress by speaking to their students,
indicating that they primarily rely on student self-reporting of progress. The next most common
response (40%) was that the advisor reviews the student’s tracking sheet. This internal report is
similar to a transcript but includes additional information about the student’s dissertation
committee and stage, as well as information about other program requirements such as the
student’s accrued research and clinical hours if required for their program. Other common
responses included reviewing a student’s narrative grade reports, checking how many incomplete
courses they have on their record, and reviewing their progress against the student’s academic
plan.
Table 14

Faculty Advisor Themes in Response to Question About How They Track Student Progress

Question 11: How do you track your advisees' progress through the program?

Theme                                   #      %
Ask/talk to the student                16    80%
Review tracking sheet                   8    40%
Email from advising                     7    35%
Review narrative grade reports          4    20%
Check for incompletes                   3    15%
Review student's academic plan          2    10%
Therefore, based on the results and findings, the procedural knowledge influence that
faculty need to know how to facilitate doctoral student progress was found to be validated as a
need. The faculty advisor participants indicated through their responses that they are aware of
methods for facilitating doctoral student progress but are not in agreement about the importance
of using those methods in practice with their students.
Motivation Results and Findings
There are three essential elements of motivation: active choice, persistence, and mental
effort (Clark & Estes, 2008). Active choice refers to a person's decision to begin a task or
process, persistence denotes the decision to continue working on the task once started, and
mental effort refers to putting thought and effort into completing the task or process (Clark &
Estes, 2008). The assumed motivational influences for this study are that faculty advisors need
to value the importance of SAP to the student and the organization, and that faculty advisors
need to believe they can influence SAP review outcomes. The first influence reflects utility
value theory, and the second reflects self-efficacy theory. These motivational influences were
reflected in both the survey and interview questions for this study.
Utility value theory. Value indicates how a person determines the importance of a task or
process relative to other ways they could spend their time and energy (Clark & Estes, 2008).
Utility value describes a task or action that a person does, not because it is necessarily enjoyable
or interesting, but because completing it will bring a desired benefit (Clark & Estes, 2008). In
this context, faculty advisors need to value the importance of monitoring student progress and
understand the importance of improving student progress to students and the institution.
Questions 12, 13, and 14 of the survey addressed the utility value assumed influence, as well as
several questions from the interviews.
Question 12 of the survey asked faculty advisors to self-report how often they bring up
the annual SAP review when advising students. Introducing the SAP review as a topic of
conversation in advising sessions signals that the faculty advisor finds this topic important, and
therefore understands the utility value of the topic. For example, staff advisors at PGU are
instructed to always include SAP reviews as a topic of conversation in advising sessions with
students. In contrast, a striking 68% of faculty advisors answered negatively to this question,
indicating that the majority of faculty advisors do not prioritize SAP advising with their students.
The most commonly reported response (mode) was “rarely.” Table 15 shows the summarized
results, including combined results for the positive answers “always” and “sometimes” as well as
the combined negative answers “occasionally” and “rarely.” This result is important because
many students only go to their faculty advisor for advice on course registration and progress. If
their faculty advisor does not value the importance of the SAP review and doesn’t bring it up as a
topic of conversation, their students are in danger of not making SAP due to avoidable errors in
course planning.
Table 15

Summarized Results for Question 12 - Motivation - Utility Value

Question 12: When advising students, how often do you bring up the annual
SAP review as part of your regular topics of discussion?

Answers          #      %   Combined %
Always           1     5%    23%
Sometimes        4    18%
Occasionally     5    23%    68%
Rarely          10    45%
Never            0     0%     0%
No response      2     9%     9%
Total           22   100%   100%
Question 13 of the survey asked faculty advisors to self-report how important they feel
the SAP review is for tracking student progress. This question is important because if faculty
advisors do not think the SAP review is important to students and the university, they will not
engage with it. A majority of respondents (86%) gave a positive response to this question,
answering either “very important,” “important,” or “moderately important.” The results for
question 13 are summarized in Table 16. This indicates that faculty advisors do value the SAP
review for tracking student progress, even if they do not advise students about it, as shown by
their responses to Question 12.
Table 16

Summarized Results for Question 13 - Motivation - Utility Value

Question 13: How important is the SAP review
process for tracking student progress?

Answers                    #      %   Combined %
Very important             7    32%    86%
Important                  4    18%
Moderately important       8    36%
Of little importance       1     5%     5%
Unimportant                0     0%
No response                2     9%     9%
Total                     22   100%   100%
The final survey question related to utility value motivation was Question 14, which
asked faculty advisors to indicate the extent to which they believe the SAP review is a good
measure of student progress toward degree completion. This question addresses the faculty
advisors' sense of the SAP review's importance to student progress broadly, not only for
financial or administrative reasons. The most common response was "somewhat agree" at 50%,
and the combined positive responses of "strongly agree" and "somewhat agree" totaled 64%. Table 17
summarizes the results for Question 14. This result is encouraging, but the contrast between the
86% agreement on the importance of the review in Question 13 and this question's 64%
agreement is striking and indicates conflicting beliefs about the value of the SAP review. The faculty
appear to understand the importance of students receiving positive SAP reviews, but not
necessarily that the review measures appropriate minimum annual progress toward degree
completion.
Table 17

Summarized Results for Question 14 - Motivation - Utility Value

Question 14: Please indicate your level of agreement with this statement:
"I believe the SAP review is a good measure of student progress toward degree completion."

Answers                       #      %   Combined %
Strongly agree                3    14%    64%
Somewhat agree               11    50%
Neither agree nor disagree    1     5%     5%
Somewhat disagree             5    23%    23%
Strongly disagree             0     0%
No response                   2     9%     9%
Total                        22   100%   100%
In addition to the survey questions, each interview participant was asked if they think that
the SAP review is a valuable tool for tracking student progress and whether it was important for
students to make SAP. Of the eight interview participants, all respondents indicated positive
responses to one or both questions. However, several participants qualified their answer by
indicating that they understand it is important to the university, but that they have hesitations
about the SAP review’s method of quantifying progress. Wanda, a faculty advisor with 8-10
years of experience advising doctoral students, responded: “In theory, yes, but sometimes how
SAP is specifically measured seems a little disconnected with the student or, you know, the stage
the student is in.” Shaquita, a newer advisor, said, “I don’t know if it [the SAP review] is the only
sort of way to measure success.” Jean, an experienced advisor with nearly 30 years of
experience, described the evolution of her beliefs by saying:
When I first started advising students, I was very strongly opposed to the annual review. I
thought it was infringing on the faculty’s rights and responsibilities. And about ten years
ago, the academic review became much more rigorous and standardized. Before then, I
thought, okay, I don't like it, but it's flexible, and we can modify it to meet the different
paths of students. And then, when it became much more rigorous, I was initially very
opposed to it. However, one of the things about doctoral studies is it's very easy for
doctoral students to take much longer than they should as students. And when they take
too long... It's not only the dollars resources. It's also the resources out of their personal
life. And so the academic progress over the years has become for me, a really important
yardstick for self-management of progress.
Jean's response demonstrates an evolving understanding of SAP and student progress.
This helps to explain why faculty advisors accept the importance of the review for the
university but are ambivalent about, or opposed to, the review when it comes to individual
students. Jean's statement suggests that only after decades of experience do faculty advisors
fully appreciate the value of annual SAP reviews to students.
These findings, when combined with the survey results, indicate that while the majority
of faculty advisors appear to understand the importance of the SAP review process to the
university's ability to continue disbursing federal student aid, they do not necessarily understand
its value for tracking student progress. Moreover, while they agree that the SAP review is
important, they do not put that understanding into practice when advising students: as their
answers to Question 12 show, they rarely bring up SAP in advising conversations. This
indicates that the utility value influence represents a need at PGU.
Self-efficacy theory. Self-efficacy theory refers to the belief that a specific action will
produce the desired outcome or the self-perception that it is possible to achieve the stated goal
(Pajares, 2009). Self-efficacy describes the person’s confidence in their ability to influence an
outcome, although not necessarily the consequences of the outcome (Pajares, 2009). There are
several origins of a person’s self-efficacy beliefs, most importantly, their experience of success
or failure performing the task, but also their observation of others' experiences, social messages
received from others, and their own psychological state (Pajares, 2009).
Survey questions 15, 16, and 17 addressed the motivation self-efficacy influence for this
study; faculty advisors need to believe they can influence SAP review outcomes. The questions
for this influence were asked in the form of a statement that the participants indicated their level
of agreement with, using a Likert-style five-point scale. Table 18 shows the combined results of
each question, the number and percentage of responses for each of the five options, as well as a
combined percentage for the positive responses “strongly agree” and “somewhat agree,” as well
as the negative responses “somewhat disagree” and “strongly disagree.”
Questions 15 and 16 were asked as positive statements, and Question 17 was phrased as a
negative statement, to allow triangulation of responses. The responses to the negatively phrased
Question 17 had the most agreement among participants, with 73% responding in the negative to
the statement, “My advising of students has nothing to do with whether they make SAP.”
However, this result falls short of reaching the 75% cut score to be validated as an asset. The
interview responses to a similar question help further explain the variance in responses to these
questions about faculty advisor self-efficacy.
Table 18

Combined Results from Survey Questions 15-17 - Motivation - Self-Efficacy

Please indicate your level of agreement with this statement:

Question 15: "In my experience, if I advise students about annual SAP
requirements, they are more likely to make SAP."
Question 16: "I believe that as a faculty advisor, I have an influence on
whether or not my students make SAP."
Question 17: "My advising of students has nothing to do with whether they
make SAP."

                              Q15                Q16                Q17
Answers                 #    %   Comb.     #    %   Comb.     #    %   Comb.
Strongly agree          1   5%   37%       6  27%   68%       0   0%   14%
Somewhat agree          7  32%             9  41%             3  14%
Neither agree
  nor disagree         10  45%   45%       3  14%   14%       1   5%    5%
Somewhat disagree       2   9%    9%       2   9%    9%      12  55%   73%
Strongly disagree       0   0%             0   0%             4  18%
No response             2   9%    9%       2   9%    9%       2   9%    9%
Total                  22 100%  100%      22 100%  100%      22 100%  100%

Each interview participant was asked if, as a faculty advisor, they feel they have the
ability to influence whether or not an advisee makes SAP. The responses to this question were
generally positive, in that the faculty advisors believe they can have an influence on students,
but that it depends on both internal and external factors. Zelda's response was typical of the
other participants: "[I can influence my advisee's SAP reviews] to some degree, but not
entirely." Tanya suggested that this was a skill she believed faculty advisors need to work harder
to develop: "Well, I know that we could [influence advisees to make SAP], but whether we do
or not is the question, and I haven't yet." These statements suggest that faculty advisors
recognize that they are not doing enough to support students related to SAP reviews.
Several of the participants indicated that their influence was most useful in helping
students who were struggling with psychological barriers like shame or embarrassment about not
making progress or "imposter syndrome," a form of doubt in one's own intellectual ability in
comparison to one's peers. Research shows that imposter syndrome is common among
doctoral students (C. E. Garcia & Yao, 2019; Marshall et al., 2017). Encouraging advisees who
are struggling with these psychological barriers and helping challenge their doubts and negative
self-beliefs are believed to be ways that the faculty advisor can positively influence progress.
Jean described how she encourages self-confidence in students suffering from imposter
syndrome as “providing an environment in which the student can take stock of themselves and
move forward." She further described the imposter syndrome issue:
It's very common for doctoral students to talk about the imposter syndrome. And when
you start not making the progress you think you should then that imposter syndrome gets
really big, and you feel as though you want to, and you may be highly motivated, you put
a lot of time and resources into the degree, but that imposter syndrome [the belief that] “I
don't really belong here” starts getting bigger and bigger. And so, I think it's that self-
confidence and the ability to put self-blame aside and really start focusing on where
you're going and what you can still do and how you can turn it around.
Another way the participants indicated that they could help students make progress is by
providing advising when students encounter external barriers, which one participant described as
“life factors.” Priscilla described her influence by saying:
I think their life probably determines more whether or not they make progress. The extent
that they're willing to talk with me about what may be going on in their life, then I may
have an influence about how quickly they progress through.
Wanda described her influence when a student encountered external barriers: "I can add
that extra layer of information that is helpful in achieving [SAP]." The extra layer of information
she refers to includes options such as taking a leave of absence or otherwise adjusting the
academic schedule to achieve a positive SAP review.
Based on the results and findings of the quantitative and qualitative measures for this
motivational influence, the self-efficacy influence is validated as a need. The faculty advisors
participating in this study have differing beliefs about their ability to influence student progress
reviews, but the positive responses do not meet the 75% cut score to be validated as an asset.
Organizational Results and Findings
Within an organization, there are cultural models and cultural settings that combine to
create a distinct organizational culture (Gallimore & Goldenberg, 2001). The shared attitudes and
values of the organization, which are generally understood but invisible to the individual who
works there, represent cultural models (Gallimore & Goldenberg, 2001). The visible displays of
an organization’s cultural models, like their stated goals and values, policies, procedures, and
performance goals, or lack thereof, represent the cultural settings (Gallimore & Goldenberg,
2001). The cultural settings and models work together within an organization to create the
overall organizational culture.
Cultural settings. The organizational influence representing a cultural setting at PGU
was that the organization needs to provide faculty with resources and training related to SAP
and facilitating doctoral progress. Questions 18-20 on the survey addressed this organizational
influence. Questions 18 and 19 were framed as questions that faculty advisors could answer
"yes," "no," or "I don't know." The results for Questions 18 and 19 are detailed in Table 19.
In response to Question 18 ("Have you ever received training or instruction about how
SAP is calculated?"), 55% of faculty advisors answered "No." In response to Question 19 ("Do
you believe training related to facilitating doctoral student progress should be provided to
faculty advisors?"), 73% of respondents answered in the positive, suggesting that faculty
advisors would be willing to participate in training if it were offered.
Table 19

Combined Results from Survey Questions 18-19 - Organization - Cultural Setting

Question 18: Have you ever received training or instruction about how SAP is calculated?
Question 19: Do you believe training related to facilitating doctoral student progress
should be provided to faculty advisors?

                      Q18              Q19
Answers           #      %        #      %
Yes               7    32%       16    73%
No               12    55%        0     0%
I don't know      3    14%        5    23%
No response       0     0%        1     5%
Total            22   100%       22   100%
The final survey question related to this influence was Question 20, which asked faculty
advisors how much they agree with the following statement: "I feel I have been provided with
the developmental resources I need to facilitate doctoral student progress." Responses to this
question, as detailed in Table 20, were roughly evenly distributed between the positive and the
negative, with 46% answering in the negative and 41% in the positive. Within both groups, most
respondents answered "somewhat agree" or "somewhat disagree" rather than "strongly agree"
or "strongly disagree." This result appears to indicate that the faculty are unsure whether they
have the resources they need, or possibly are not aware of what resources might be helpful.
Responses to the open-answer survey questions and interviews help to illustrate these
responses. One survey respondent indicated, "As newer faculty, I have been introduced to SAP
and have talked with advisors about it a few times to make sure I understand it, and I have found
it a bit complicated to clearly advise students about this aspect of their progress…" Another
responded, "I've been repeatedly told that SAP is more related to the academic advising office
and is not something that I, as a faculty [advisor], am directly connected to." These responses
indicate that there is confusion among faculty about what their role is related to the SAP reviews,
and that clarification is needed.
Table 20

Summarized Results from Survey Question 20 - Organization - Cultural Setting

Question 20: How much do you agree with the following statement:
"I feel I've been provided with the developmental resources I need to facilitate doctoral
student progress."

Answers                       #      %   Combined %
Strongly agree                1     5%    41%
Somewhat agree                8    36%
Neither agree nor disagree    2     9%     9%
Somewhat disagree             7    32%    46%
Strongly disagree             3    14%
No response                   1     5%     5%
Total                        22   100%   100%
Overall, the responses to the questions related to the cultural setting organizational
influence indicate that the university is not providing enough training or resources related to
student progress and development for faculty advisors. While the majority of faculty advisors
surveyed believe that training and resources should be provided, most do not believe they have
ever received this training, and they are divided on whether they have been given the
necessary resources. Therefore, the organizational influence is validated as a need.
Cultural model. The organizational influence representing a cultural model at PGU was
that the organization needs a culture valuing the importance of measuring and tracking student
progress. This influence was addressed in Questions 21-23 of the survey, as well as in the
interviews. The survey questions were phrased as statements, with a Likert-style scale for
respondents to indicate their level of agreement with the statement. Table 21 shows the combined
results of all three questions.
Table 21

Combined Results from Survey Questions 21-23 - Organization - Cultural Model

Please indicate your level of agreement with this statement:

Question 21: "PGU values the importance of measuring student progress."
Question 22: "PGU could do more to strengthen its resources for measuring
student progress."
Question 23: "PGU holds me accountable for my students' SAP rates."

                              Q21                Q22                Q23
Answers                 #    %   Comb.     #    %   Comb.     #    %   Comb.
Strongly agree         11  50%   77%       5  23%   60%       0   0%   18%
Somewhat agree          6  27%             6  27%             4  18%
Neither agree
  nor disagree          1   5%    5%       6  27%   27%       5  23%   23%
Somewhat disagree       3  14%   14%       3  14%   14%       5  23%   60%
Strongly disagree       0   0%             0   0%             6  27%
No response             1   5%    5%       2   9%    9%       2   9%    9%
Total                  22 100%  100%      22 100%  100%      22 100%  100%

Question 21 asked participants to indicate their level of agreement with the statement
"[PGU] values the importance of measuring student progress." An impressive 77% of
respondents either strongly agreed (50%) or somewhat agreed (27%) with this statement. This is
in stark contrast to the interviews, where seven of eight interview participants indicated a mixed
or qualified response to a similar question.
Interview participants were also asked if they believe PGU has a culture that values
measuring and tracking student progress, and seven of the eight interview respondents indicated
that different university constituents value student progress differently. Tanya described the
culture as “inconsistent.” Chantal used the word “imbalance” to describe the staff and faculty
culture related to SAP. Priscilla commented, “It’s been my understanding that [SAP] is for the
[staff] advisor [to emphasize in advising].” Shaquita said: “No… I think it has a staff culture that
values it. I do not think that it has a faculty culture that values it.” Victor indicated he believes:
“…it's graduate school and the work should speak for itself, particularly the writing and in that
sense the grades are an unnecessary pressure, a managerial kind of pressure that the students
don't need." These responses typify the faculty advisors' resistance to quantifying student
progress and indicate that a unified organizational culture valuing student progress does not
yet exist.
Question 22 stated, "PGU could do more to strengthen its resources for measuring student
progress," and 60% of participants agreed with this statement. Question 23 stated, "PGU holds
me accountable for my students' SAP rates," and 60% of respondents disagreed with this
statement. These results, in concert with the interview findings, show that while
there is some disagreement about the culture of valuing the importance of progress reviews, there
still exists a gap in this organizational influence. Therefore, the cultural model influence is
validated as a need.
Synthesis
This study was designed to determine if there exists a gap between the necessary faculty
advisor knowledge and motivation as well as the organizational influences needed to increase
SAP rates at the university. These results and findings show that there exists a significant gap in
knowledge, motivation, and cultural models and settings, validating each of the assumed
influences as an organizational need. Table 22 shows each stakeholder assumed influence, its
category, and its validation as a need in the organization. While the degree of need varied by
influence, the combined quantitative results and qualitative findings showed that none of the
influences reached the level of agreement this study required for validation as an asset.
Table 22
Stakeholder Assumed Influences and Summary of Validation as Need or Asset
Influence     Category          Stakeholder Assumed Influence                   Validation
Knowledge     Factual           Faculty advisors need to understand the         Need
                                SAP policy.
              Procedural        Faculty advisors need to know how to            Need
                                facilitate doctoral student progress.
Motivation    Utility Value     Faculty advisors need to value the              Need
                                importance of SAP reviews to students
                                and the institution.
              Self-Efficacy     Faculty advisors need to believe they can       Need
                                influence SAP review outcomes for their
                                advisees.
Organization  Cultural Setting  The organization needs to provide faculty       Need
                                with resources and training related to SAP
                                and facilitating doctoral student progress.
              Cultural Model    The organization needs a culture valuing        Need
                                the importance of measuring and tracking
                                student progress.
Chapter 5 will introduce recommendations for solutions to close the gaps in the
validated influences in order to help the organization reach its goal. The recommendations
include an implementation plan and a strategy to evaluate the implementation based on
Kirkpatrick and Kirkpatrick's (2016) four levels of training evaluation. Finally, the limitations and delimitations
of this study and its methodology will be discussed.
CHAPTER FIVE: DISCUSSION
This study evaluated the gaps in the assumed knowledge, motivation, and
organizational influences interfering with Progressive Graduate University's goal to increase
SAP review outcomes by 12%. The assumed influences were identified through a literature
review in Chapter 2 and evaluated using the methodology described in Chapter 3. Chapter 4
analyzed the results and findings of the study and validated each of the influences as either an
organizational need or an asset.
In this chapter, recommendations will be made for addressing the validated needs
identified in Chapter 4. The recommendations will be organized by category, and by influence
within that category. Finally, the recommendations will be compiled into a comprehensive
training program recommendation, including methods for evaluating its success. The evaluation
system is based on the New World Kirkpatrick approach (Kirkpatrick & Kirkpatrick, 2016),
which includes four levels of evaluation to increase the chances of achieving the organizational
goal.
Introduction and Overview
This section of the dissertation will introduce recommendations for implementing
solutions to address each of the knowledge, motivation, and organizational influences from this
study, as well as provide an integrated plan for implementing and evaluating the
proposed program. The recommendations for each influence are guided by whether the influence
was validated as an asset or a need in the study’s results and findings. As all influences were
found to be organizational needs, each will be introduced, trained, and encouraged/reinforced as
part of the program.
The recommendation for the knowledge influences is to provide resources, training, and
job aids in support of faculty advisors' knowledge of the SAP policy and to support doctoral
student progress more holistically. The motivation recommendations are to continue to model and
reinforce the importance of SAP reviews, as well as to privately recognize faculty advisors who
are successful in helping their advisees achieve positive SAP reviews. The organizational
recommendations are to provide financial support for the creation of training programs as well as
an early alert reporting program and to encourage program leaders to incorporate SAP review
data as part of the faculty advisor review process.
The recommendations described above will be integrated into a comprehensive program
based on Kirkpatrick and Kirkpatrick’s (2016) model of training implementation and evaluation.
An asynchronous online training program will be created to provide immediate training for
faculty advisors on the SAP process. Concurrent with the training program, the university should
implement technology solutions to simplify and streamline information for faculty advisors, as
well as strategies to reinforce and encourage adoption of the new practices.
This program will be evaluated using the New World Kirkpatrick Model (Kirkpatrick &
Kirkpatrick, 2016) that includes four levels of training evaluation in order to strengthen the
program and maximize the chances of its success in reaching the organizational goal of increased
SAP rates. The Kirkpatrick model’s four levels evaluate: 1) the immediate reaction to the
training program; 2) the learning gained from the training program; 3) the resulting change in
behavior after the training program; and 4) the leading indicators that would signal success at
achieving the desired results.
Knowledge Recommendations
The assumed knowledge influences for this study, that faculty advisors need to
understand the SAP policy and that faculty advisors need to know how to facilitate doctoral
student progress, were both validated as gaps based on this study. The first influence, faculty
advisors need to understand the SAP policy, is declarative factual knowledge and is the highest
priority knowledge influence based on the goal to increase SAP rates at PGU. The second
influence represents procedural knowledge. Table 23 lists the assumed knowledge influences,
context-specific recommendations for action, and the educational principles upon which the
recommendations are based.
Table 23
Summary of Knowledge Influences and Recommendations
Assumed Knowledge       Validated   Priority   Principle and Citation       Context-Specific
Influence               as a Gap?                                           Recommendation
Faculty advisors need   Yes         Yes        Provide printed and visual   Faculty advisors should
to understand the SAP                          information about a topic    be provided with
policy. (D-F)                                  together to aid recall of    technology-assisted job
                                               spatial relationships        aids to assist in
                                               (Rueda, 2011).               developing knowledge
                                                                            about the SAP policy.
Faculty advisors need   Yes         Yes        Social interaction,          Provide information
to know how to                                 cooperative learning,        about best practices for
facilitate doctoral                            cognitive apprenticeships    facilitating doctoral
student progress. (P)                          such as reciprocal           student progress and
                                               teaching, and                create opportunities for
                                               personalization methods      faculty to discuss
                                               promote knowledge            advising practices with
                                               transfer (Rueda, 2011).      their colleagues.
Increasing faculty advisors’ knowledge about the SAP policy. The results and findings
for this study suggest that 50% of faculty advisors at PGU do not have an adequate factual
understanding of the satisfactory academic progress policy, despite having received previous
training. Information processing system theory provides a framework for creating a solution to
this knowledge gap. Rueda (2011) describes the use of visual information in the form of
diagrams or matrices to aid in the recall of spatial information and the enhancement of memory
and learning transfer. Visual representations of a student’s progress could, therefore, be useful to
facilitate faculty advisors’ understanding of the policy in the context of a specific student’s
situation. Therefore, faculty advisors should be provided with technology-assisted job aids, for
instance, a real-time summary of courses and units completed toward the upcoming SAP review
for each student advisee, to assist in developing knowledge about the SAP policy.
Technology, like a university’s student information system, can augment academic
advising by simplifying and extending access to information (Leonard, 2008; Steele, 2014;
Wilcox, 2017). The university’s student information system provides electronic access to real-
time student records, providing the information faculty advisors need to advise about a student’s
progress toward fulfilling the requirements for SAP. Currently, this information is available to
faculty advisors, but it requires them to determine the dates of a student's SAP review from the
student's start date in the program, dates that 50% of faculty advisors in this study identified
incorrectly. Rueda (2011) describes how a visual representation enhances memory for sequence and
meaning. Kirkpatrick and Kirkpatrick (2016) introduce the importance of reinforcing knowledge
using technology. Providing a report that consolidates this information into a single place, with a
visual representation of course completion over the time period of the student’s SAP review,
would, therefore, provide an appropriate and useful job aid to facilitate faculty advising about
SAP.
Increasing faculty advisors’ knowledge of how to facilitate student progress. The
results of this study confirm that faculty at PGU know how to facilitate student progress but do
not consistently use that knowledge when advising students. Sociocultural and sociohistorical
approaches to learning provide a basis for understanding how to approach this knowledge gap.
Rueda (2011) describes how social interaction and cooperative learning can increase
effort and persistence when performing a task, as well as increase the probability that learned
strategies will be used. Therefore, the recommendation to close this knowledge gap is to
provide information about best practices for facilitating doctoral student progress and create
opportunities for faculty to discuss advising practices with their colleagues. Evidence-based best
practices for supporting doctoral student progress will be provided in a shared forum for faculty,
facilitating discussions, and encouraging faculty advisors to recall and use that information when
they interact with their advisees.
Research shows that faculty advisors are expected to facilitate doctoral student progress
but are rarely given specific information or instruction on how that is meant to be done
(Bøgelund, 2015; Craft et al., 2016; Skakni, 2018). Skakni (2018) reports that in her qualitative
study of faculty advisors in the social sciences, almost half expressed a desire for both
institutional support for faculty advising and a forum for exchanging information with
more experienced faculty advisors. The research affirms that providing informational resources
in concert with a structure for the exchange of information, advice, and best practices in advising
is a solid strategy for closing this knowledge gap.
Motivation Recommendations
Introduction. The assumed motivational influences for this study are that faculty
members need to value the importance of SAP reviews to students and the institution, and that
faculty advisors need to believe they can influence SAP review outcomes for their advisees. The
results and findings of this study validate the assumed motivational influences stated above. It
will be important for the university to model and reinforce the importance of SAP reviews by
providing faculty advisors with feedback on their advisees’ progress and instructional support for
new faculty advisors at the university. Rueda (2011) provides a framework for understanding
motivational influences in educational settings. The assumed motivational influences, principles
of the motivational theory, and context-specific recommendations for future action are detailed
below in Table 24.
Table 24
Summary of Motivation Influences and Recommendations
Assumed Motivation       Validated   Priority   Principle and Citation   Context-Specific
Influence                as a Gap?                                       Recommendation
Faculty advisors need    Yes         Yes        Include rationales       Model and reinforce the
to value the importance                         about the importance     importance of SAP
of SAP reviews to                               and utility value of     reviews to students and
students and the                                the task (Pintrich,      the university community
institution.                                    2003).                   by providing training
(Expectancy Value                                                        and incentives.
Theory)
Faculty advisors need    Yes         Yes        Link rewards with        Provide incentives by
to believe they can                             progress (Pintrich,      privately recognizing
influence SAP review                            2003).                   the faculty advisors
outcomes for their                                                       with the highest
advisees. (Self-                                                         percentage of advisees
Efficacy Theory)                                                         who achieve SAP each
                                                                         year.
Reinforce the importance of SAP reviews for students and the institution. Eighty-six
percent of faculty surveyed for this study indicated that SAP reviews are important for students
and the institution. Expectancy value theory provides the basis for understanding the relative
importance that individuals attach to a task. Pintrich (2003) suggests that providing rationales for
the value of a task can reinforce the utility value of that task. For faculty advisors, who have many
competing priorities and limited time to complete them, this approach would further reinforce
the importance of advising about SAP reviews. The recommendation is for the
university to increase its communication of the importance of SAP reviews to students and the
university. This could be accomplished through a combination of regular communication about
the goals of the review as well as continuing to include the improvement of SAP rates as part of
the university’s long-term strategic plan goals.
Rueda (2011) describes utility value as one dimension of task value that describes the
usefulness of a task in achieving a goal. Utility value is extrinsic, in that the individual may not
enjoy the task, but they recognize the importance of engaging in it in order to achieve a higher
goal (Eccles & Wigfield, 2002). Hulleman et al. (2010) found that interventions aimed at
increasing the utility value of a task led to increased interest in performing the task. Therefore,
while faculty members may not enjoy advising students about SAP reviews, university and
program leadership must continue to emphasize the importance of SAP review rates to students
and the university, so that faculty are more likely to engage and continue engaging in SAP
advising.
Privately recognize high-achieving faculty advisors. The results of this study suggest
that faculty advisors have conflicting beliefs about their ability to influence SAP review
outcomes among their advisees; therefore, this represents an organizational need. Self-efficacy
theory explains the importance of having confidence in positive outcomes in order to support an
individual’s motivation to complete a task. Pintrich (2003) proposes that linking rewards to
progress can positively motivate an individual to engage in an activity. For faculty advisors,
being recognized for high performance may provide encouragement to support their confidence
and motivation to advise about SAP reviews. Therefore, the recommendation is that the
university privately recognize the faculty advisors in each program with the highest percentage
of advisees who achieve SAP each year or with the largest improvement in SAP rates.
This recognition will serve as a regular reminder to faculty that the university values SAP review
outcomes.
Bandura (1977) describes self-efficacy as an individual’s expectation of success at a
given task. Further, Bandura correlates a person’s self-efficacy beliefs with the amount of effort
and energy that person will expend on a task. Building on Bandura's theory, Cervone et al.
(1991) describe the importance of a specific goal to self-efficacy beliefs when a task is
complicated. The more complex the task, according to their findings, the more
important it is to have a well-defined goal, as ambiguous goals on complex tasks can lead to
lower performance. This suggests that the current system at Progressive, where there is no
defined goal for faculty advisors related to SAP review performance, may lead to reduced
motivation to pursue advising about SAP. Pintrich (2003) suggests that extrinsic rewards can
motivate performance. However, Malik et al. (2015) found that extrinsic rewards must be valued
by an individual to be effective, as an unimportant reward can negatively affect performance.
Thus, it will be essential for the organization to be thoughtful about how recognition of SAP
performance occurs, given the importance of the value of the reward to performance
improvement.
Organization Recommendations
Introduction. The assumed organizational influences for this study are that the
organization needs to provide faculty with resources and training related to SAP and facilitating
doctoral student progress, and the organization needs a culture valuing the importance of
measuring and tracking student progress. The results and findings of this research study validated
both organizational influences as gaps. Clark and Estes (2008) provide a framework for
understanding these organizational influences and how to address them. Table 25 shows these
assumed influences, context-specific recommendations for Progressive, and the principle on
which the recommendation is based.
Table 25
Summary of Organization Influences and Recommendations
Assumed Organization     Validated   Priority   Principle and Citation       Context-Specific
Influence                as a Gap?                                           Recommendation
The organization needs   Yes         Yes        Effective change efforts     The organization should
to provide faculty with                         ensure that everyone has     provide financial
resources and training                          the resources (equipment,    support to create
related to SAP and                              personnel, time, etc.)       training and an "early
facilitating doctoral                           needed to do their job and   alert" program to
student progress.                               that if there are resource   facilitate faculty
(Cultural setting)                              shortages, then resources    advisors' efforts to
                                                are aligned with             support doctoral
                                                organizational priorities    student progress.
                                                (Clark & Estes, 2008).
The organization needs   Yes         Yes        Effective organizations      The organization should
a culture valuing the                           ensure that organizational   encourage program
importance of                                   messages, rewards,           leaders to use SAP
measuring and tracking                          policies, and procedures     review data as part of
student progress.                               that govern the work of      the faculty review
(Cultural model)                                the organization are         process and hold
                                                aligned with or are          faculty accountable for
                                                supportive of                their students'
                                                organizational goals and     progress rates.
                                                values (Clark & Estes,
                                                2008).
Provide resources to create training and job aids to facilitate faculty advisors’
efforts to support doctoral student progress. The results and findings of this study indicate that
the organization is failing to provide faculty with adequate resources and training related to SAP
and facilitating doctoral student progress. Sixty-nine percent of respondents indicated they have
either never received or do not remember receiving training about the SAP process.
Organizational change theory provides a framework for developing recommendations for
solutions to this organizational problem. Clark and Estes (2008) suggest that effective change
efforts ensure that employees have the resources needed to do their job, and those resources
should be aligned with organizational priorities. Faculty advisors will be better equipped to
advise their students regarding the SAP review process if they have received training and
instruction about the SAP policy and procedure. Therefore, the recommendation is that the
organization provide resources to create training and job aids to facilitate faculty advisors’ efforts
to support doctoral student progress.
Similarly, faculty advisors cannot respond to problems of which they are not aware.
Therefore, the recommendation is that the university implements an “early alert” tracking
system, which would allow faculty instructors, staff, and faculty advisors to report student issues
quickly and easily. This system would be integrated with the existing student information system
and would facilitate the reporting of common issues, such as a student who has stopped
participating in class or has fallen behind in submitting assignments. With that information early
in the term, the faculty advisor and graduate program advisor could reach out proactively to the
student and provide support and resources to help the student get back on track before the term
ends and the SAP review is completed.
Clark and Estes (2008) describe the importance of aligning organizational processes and
structures with the goals of the organization and providing the appropriate knowledge, skills, and
organizational support to achieve the goals. For the university to increase SAP rates among
doctoral students, faculty advisors must have knowledge of the SAP policy, and the university
should provide organizational support to reinforce the acquisition of SAP knowledge. Studies
have shown that students expect their faculty advisor to be knowledgeable about university
policy and procedure (Cross, 2018; Lechuga, 2011). Inaccurate SAP advising threatens not only
a student's progress and eligibility for loan and scholarship funding but also the trust between
advisor and advisee. This is particularly important because the
relationship between faculty advisor and doctoral student is essential to progress toward degree
completion (Bair & Haworth, 2005; Caruth, 2015). Moreover, research shows that faculty
advisors are rarely trained on the skill-specific knowledge and abilities to be an effective advisor
(Skakni, 2018). Therefore, organizational support for faculty knowledge about SAP policy is
essential for reaching the organizational goal to increase SAP rates and also for the retention and
success of students overall.
Encourage program leaders to use SAP review data as part of the faculty review
process and hold faculty accountable for their students' progress rates. The results and
findings of this study indicate that 50% of respondents do not believe that the university holds
them responsible for their advisees’ progress. Organizational change theory provides a
framework for developing recommendations for organizational problems. Clark and Estes (2008)
describe the importance of organizational messaging and rewards being aligned with and
supportive of organizational goals and values. For the university, the organizational goal to
increase SAP review results will be more successful if the organization aligns that goal with the
faculty advisors’ job performance evaluations. The recommendation is for the organization to
encourage program leaders to hold faculty accountable for their students’ progress rates by
making SAP review data part of the faculty review process.
Clark and Estes (2008) describe the importance of clear communication from the
organization when implementing a change effort. Clear communication is essential for the
organization to demonstrate its commitment to the change effort by aligning its messaging and
reward structure with its stated goals and objectives. This means the organization needs to
provide clear and consistent corrective feedback to support and encourage the change in
organizational culture (Clark & Estes, 2008). It is essential to align goals, strategies, and reward
systems in order to successfully reach organizational goals (Gilley et al., 2009; Rueda, 2011).
Therefore, if the organization expects to reach its goal of improving SAP rates, it must align its
commitment to that goal with its communication, feedback, and reward structures for faculty
advisors.
Integrated Implementation and Evaluation Plan
Implementation and Evaluation Framework
This study uses the New World Kirkpatrick Model (Kirkpatrick & Kirkpatrick, 2016) as a
framework to understand the necessary steps for implementing and evaluating the recommended
solutions discussed in the previous section. The Kirkpatrick model is based on Donald
Kirkpatrick’s four levels of evaluation of a training program, updated in 2016 by James and
Wendy Kirkpatrick. The four levels of training evaluation are: reaction (Level 1); learning (Level
2); behavior (Level 3); and results (Level 4). The Kirkpatrick model starts with Level 4 by
defining the desired results of a training program, identifying the indicators that define success,
and building an implementation and evaluation plan based on the chosen indicators and results.
Organizational Purpose, Need, and Expectations
Progressive Graduate University has a mission to educate social justice
practitioners and scholars. The university has a strategic goal to increase the percentage of
students that achieve a positive SAP review by 12% by September 2020. The overall SAP rate
for doctoral students in 2015, when this goal was set, was 76%, with significant variation
between programs and schools. Not achieving a positive SAP review has serious implications for
students’ funding and progress toward degree completion.
The stakeholders for this study are doctoral faculty advisors. The stakeholder goal was
that by September 2019, all doctoral faculty advisors would demonstrate an understanding of the
SAP process and how to advise students about their annual SAP review. The organizational goal
is to increase SAP rates across all doctoral programs by 12% by September 2020. That goal is in
line with Progressive University’s strategic plan and mission, vision, and values. Achieving the
stakeholder goal will support the organizational goal to increase SAP rates by further supporting
student progress through accurate and timely faculty advising related to progress and SAP. The
expectation is that if faculty advisors better understand the SAP policy, they will be able to
advise students better and support their satisfactory progress through the program.
Level 4: Results and Leading Indicators
In order to implement the Kirkpatrick model, an organization needs to identify the
internal and external outcomes that the training plan will accomplish. External outcomes are
those that are demonstrated by the customer or client (Kirkpatrick & Kirkpatrick, 2016), in this
case, the student. The external outcomes for this study are increased doctoral student SAP rates
and increased student satisfaction with faculty advising. The metrics for evaluating these
outcomes involve achieving the goal rate of a 12% increase in SAP outcomes per program, as
demonstrated by the annual institutional research evaluation of SAP review results, and by
achieving higher levels of student satisfaction with faculty advisors as reported on the biennial
student satisfaction survey.
Internal outcomes are seen at the stakeholder level, or the team, department, and
organizational level (Kirkpatrick & Kirkpatrick, 2016). For this study, the internal outcomes are
increased faculty advisor understanding of the SAP process and policy and increased faculty
advising related to SAP reviews and student progress. The metrics for evaluating these outcomes
are increased rates of faculty understanding of SAP policy and increased rates of faculty self-
reports of proactive SAP advising. An annual faculty advisor survey will measure both internal
metrics. The internal and external outcomes, metrics, and methods are detailed in Table 26.
Table 26
Outcomes, Metrics, and Methods for External and Internal Outcomes
Outcome                       Metric(s)                        Method(s)

External Outcomes
Increased doctoral student    SAP review outcomes increase     Annual Institutional
SAP rates                     by 12% per program               Research evaluation of
                                                               SAP review results
Increased student             Students will report a higher    Biennial student
satisfaction with faculty     level of satisfaction with the   satisfaction survey
advising                      quality of their faculty
                              advisement

Internal Outcomes
Increased faculty advisor     Increased self-reporting by      Annual faculty advisor
understanding of SAP review   faculty that they understand     survey
policy and procedures         SAP policy and procedure on
                              internal surveys
Increased faculty advising    Increased faculty self-          Annual faculty advisor
related to SAP reviews and    reporting that they advise       survey
student progress              students proactively
                              regarding the SAP reviews
Level 3: Behavior
Critical behaviors. The Kirkpatrick model emphasizes the importance of tracking and
reinforcing the critical behaviors considered essential to achieving the desired outcomes
(Kirkpatrick & Kirkpatrick, 2016). Each critical behavior must have a metric and method for
evaluating the behavior as well as a defined timing for each metric and method. The critical
behaviors for this study are that faculty advisors include SAP review information and discussions
of progress in all of their advising meetings with students and that faculty advisors
comprehensively review student plans for adherence to the SAP review policy.
The first critical behavior, faculty advisors include discussions about SAP reviews and
progress in advising conversations, will be measured by faculty advisors self-reporting frequency
of discussion in advising meetings in an annual survey, as well as the number of referrals made
to graduate program advisors related to SAP issues. These referrals should be made throughout
the term, as soon as a faculty advisor becomes aware of an issue related to an individual
student's progress. The second critical behavior, comprehensively evaluating student progress
plans to ensure they meet SAP requirements, will be measured by a reduction in the number of
plans submitted each term that need to be revised and returned by graduate program advisors. Plans are
usually submitted at the end of one term or the start of the next term, as students are completing
courses and registering for the next term’s courses. The two critical behaviors and the metrics,
methods, and timing are detailed in Table 27.
Table 27
Critical Behaviors, Metrics, Methods, and Timing for Evaluation
Critical Behavior           Metric(s)                  Method(s)                  Timing
Doctoral faculty advisors   Self-report of student     Graduate program           Throughout
include discussion of SAP   progress issues to the     advisors receive early     the term
review and progress in      graduate program advisor   alert referrals from
advising conversations      through the early alert    faculty instructors and
                            system and self-reported   will follow up with
                            behavior on an annual      faculty advisors, the
                            survey                     student, and the program
Doctoral faculty advisors   Signed progress plans      Graduate program           3x per year
comprehensively evaluate    are submitted to           advisors make fewer        during each
student progress plans to   advising and meet SAP      corrections to signed      term
ensure they meet            requirements               progress plans
requirements for SAP
Required drivers. Kirkpatrick and Kirkpatrick (2016) also emphasize the importance of
identifying the required drivers needed to keep stakeholders focused on the critical behaviors
required to achieve the organizational goals. The required drivers consist of the ways the
organization will reinforce, encourage, reward, and monitor the identified critical behaviors. The
required drivers for this study were identified using the knowledge, motivation, and
organizational influences identified earlier in this chapter.
The knowledge influences are that faculty advisors need to understand the SAP policy
and know how to facilitate doctoral student progress. The associated drivers will reinforce the
skills learned in training by reminding faculty advisors via email about upcoming SAP dates and deadlines, and by creating a job aid in the form of an individual notation on each student's
planning page, which the faculty will be able to view, indicating the terms and dates of that
student’s review. These drivers will also reinforce the cultural setting organizational influence
that the organization needs to provide faculty advisors with resources and training related to SAP
and facilitating doctoral student progress.
The motivation influences are that faculty advisors need to value the importance of the
SAP reviews and to believe that they have the ability to influence SAP review outcomes. Those
influences will be supported by drivers that encourage and reward faculty advisor behaviors. The
encouraging driver will be to provide updates on SAP review rates at the monthly faculty
meetings to ensure that faculty are regularly reminded of the importance of the SAP reviews to
the organization. The rewarding driver will be to privately recognize faculty advisor(s) with high
rates of SAP or high SAP improvement rates among their advisees. This should both reward the highest achieving faculty members and encourage faculty members showing improvement.
Lastly, the monitoring drivers will be feedback from graduate program advisors regarding
the progress plans they approve for their students, as well as direct feedback from the faculty
advisors’ supervisors, the program directors, about their advisees’ progress rates. These drivers
represent the organizational model influence that the organization needs a culture valuing the
importance of measuring and tracking student progress. Table 28 lists the required drivers as well
as the methods, timing, and critical behaviors that each driver supports.
Table 28
Required Drivers to Support Critical Behaviors

Reinforcing
Email reminders from the advising office about upcoming SAP review dates. Timing: Spring and Fall, prior to reviews. Critical behavior(s) supported: 1.
The advising office will add and maintain an informational notation on the student planning dashboard of individual student review dates. Timing: Continuous. Critical behavior(s) supported: 1, 2.

Encouraging
Program directors/department chairs will give regular updates at faculty meetings reviewing SAP rates. Timing: Monthly. Critical behavior(s) supported: 1.

Rewarding
Department chairs or program directors will privately recognize faculty advisor(s) with high rates of SAP or high SAP improvement rates among their advisees. Timing: Spring and Fall, after the review occurs. Critical behavior(s) supported: 1, 2.

Monitoring
Feedback from graduate program advisors regarding plans that do not meet requirements. Timing: As needed throughout the term. Critical behavior(s) supported: 2.
Direct feedback from program directors (faculty supervisors) regarding individual faculty's advisee SAP rates. Timing: Spring and Fall, after the review occurs. Critical behavior(s) supported: 1.
Organizational support. It is important to build in accountability measures to ensure
that the organization continues to enforce the required drivers for as long as it takes to achieve
and maintain the desired goals. For Progressive, that will mean adding to the existing schema for
faculty feedback and progress reviews to include the monitoring and rewarding drivers for SAP
advising. These recommendations match the organizational recommendations to use SAP review
data as part of the faculty review process and hold faculty accountable for their students’
progress rates.
It will also be important to add accountability at the program director and department
chair levels so that those individuals have the motivation to continue reinforcing, rewarding,
encouraging, and monitoring the faculty advisors' behaviors and progress toward improving SAP
rates. These measures also address the organizational culture changes needed to provide faculty
advisors with the resources and training to facilitate doctoral progress as well as creating a
culture that values measuring and tracking student progress.
While the proposed implementation plan does not require substantial financial
investment, there will need to be some ability to invest in the creation and continued
enhancement of technology-enhanced job aids to support faculty advisors, as recommended
earlier. The organization will need to provide adequate financial support to create and run the
training programs, as well as to implement any future technology or procedural improvements
that support the organizational goal to increase SAP rates.
Level 2: Learning
Kirkpatrick and Kirkpatrick (2016) describe level two as the degree to which the
participants gain the knowledge, skills, attitudes, and other learning objectives intended in the
training plan. This level is where one gauges the success of the training program and evaluates
participants’ ability and confidence to use the content in their daily work. A training program
should always include detailed learning objectives for participants as well as an evaluation of
whether those specific objectives were achieved.
Learning goals. Once the training program is completed, faculty advisors should be able
to:
1. Differentiate between Spring and Fall SAP review students based on their tracking sheet.
2. Identify the terms included in the Spring and Fall SAP review.
3. Evaluate a student’s record to determine the number of units needed to make or regain
SAP status in their next review.
4. Calculate the number of units a student has achieved toward their next SAP review.
5. Recognize when a student has reached or is approaching the all but dissertation (ABD)
stage of their program.
6. Evaluate if a student will reach ABD in the review period.
7. Demonstrate an understanding of the changes in the SAP policy at the ABD stage.
Program. Progressive currently uses asynchronous training programs for other required
training topics. Because the institution is distributed throughout North America and the world, asynchronous training is the most convenient option for faculty advisors. As such, the recommendation
is to create an asynchronous, self-paced training program using the learning management system
(LMS) utilized by the university, Moodle. Moodle allows for the creation of a training module
that incorporates reading, videos, and quizzes to engage learners. The faculty advisors will be
able to access and complete the training in one sitting or short bursts throughout the workweek.
The entire training module should not take longer than 90 minutes to complete. Faculty advisors
would be given six to eight weeks to complete the module once released, with a short refresher
course once a year.
Evaluation of the components of learning. The self-paced Moodle training course
would incorporate short tests of knowledge throughout, to ensure understanding of each concept
before moving on to the next unit. After the instruction units, a summative evaluation using real-world examples will be given, with a minimum score of 80% correct answers required to complete the training module. Table 29 details the methods and activities for evaluating the
learning and timing of each.
Table 29
Evaluation of the Components of Learning for the Program

Declarative Knowledge "I know it."
Multiple choice formative knowledge checks. Timing: Throughout each module/unit of the training program.
Multiple choice summative knowledge evaluation. Timing: At the conclusion of the training program.

Procedural Skills "I can do it right now."
Self-reported assessment of ability to access live student progress screens and tracking sheets. Timing: At the conclusion of the training program.

Attitude "I believe this is worthwhile."
Self-reported assessment of the value of training materials using a Likert scale. Timing: At the conclusion of the training program.
Facilitated discussion during a live online faculty meeting. Timing: At the conclusion of the training program's allotted time period.

Confidence "I think I can do it on the job."
Self-reported assessment of confidence in SAP skills using Likert scales. Timing: At the conclusion of the training program.
Opportunity to schedule one-on-one follow-up training with graduate program advisors to review real student examples. Timing: On demand, as needed.

Commitment "I will do it on the job."
Self-reported assessment of commitment to use skills learned in training. Timing: At the conclusion of the training program.
Level 1: Reaction
Using an online, self-paced training module allows for constant adjustments to the
content in reaction to faculty advisor feedback. Therefore, one- or two-question reaction surveys
will be incorporated into the knowledge checks for each module/unit of the training. Facilitators
will be able to make changes to the training modules in real-time to respond to negative
feedback. Table 30 shows the methods and tools for measuring reactions to the training program,
as well as the timing of each.
Table 30
Components to Measure Reactions to the Program

Engagement
The training lead will evaluate faculty engagement with the training program (how long they spend on each module, how quickly they complete the training). Timing: Throughout the open training period.

Relevance
Self-reported relevance ratings. Timing: At the conclusion of each unit/module.

Customer Satisfaction
Self-reported satisfaction ratings. Timing: At the conclusion of each unit/module.

Evaluation Tools
Immediately following the program implementation. At the conclusion of the training
module, participants will be asked to complete an immediate evaluation survey. The survey,
shown in Appendix D, will contain ten questions designed to measure the participants' reaction
(level 1) and learning (level 2). The survey is brief to encourage high levels of participation and
uses mainly Likert scale reaction ratings for ease of completion. The survey will also include one
or two knowledge questions to assess the basic level two outcomes immediately. If, immediately
after completing the training, the faculty advisors cannot identify basic information about the
SAP review, the training program will need to be immediately reassessed and revised.
Delayed for a period after the program implementation. Sixty days following the
completion of the learning module, the participants will be sent a follow-up survey in order to
measure all four levels of the Kirkpatrick scale. This delayed evaluation survey, shown in
Appendix E, will measure the reaction and learning from the training, as well as the behavior,
drivers, and results identified as essential to achieving the performance goal. Waiting 60 days gives the participants an opportunity to use the skills from the training (most faculty advisors reported meeting with their advisees at least once a month) without allowing so much time that they forget about the training entirely. In this way, the follow-up survey can also function as a level 3
reminder to use the skills from the training.
Data Analysis and Reporting
The findings and results of the immediate and delayed instruments will be presented as a
dashboard for weekly distribution to program and university leadership. The dashboard will
provide summarized results of the survey instruments measuring Levels 1 through 4 as well as
the percentage of faculty advisors who have completed the training program. An example
dashboard is shown in Figure 2.
Figure 2
Sample Dashboard for SAP Training Program Data
Summary
The success of any initiative can be bolstered by carefully and strategically planning for
the implementation of the project as well as pre-planning the methods and metrics for evaluation.
Using the New World Kirkpatrick Model (Kirkpatrick & Kirkpatrick, 2016), the recommended
solutions to the knowledge, motivation, and organizational needs identified in Chapter 4 are
organized into a four-level training evaluation program. The initial training plan, intended to
support the most essential knowledge that faculty advisors need to advise students about SAP, is
supported by Kirkpatrick’s four-level evaluation plan. The four levels of evaluation measure the
reaction to the training (Level 1), the learning achieved at the training (Level 2), the changes in
behavior that are needed after the training (Level 3), and lastly, the results of the changes in
behavior on organizational outcomes (Level 4). Using these four measures of evaluation provides
a full picture of the short- and long-term outcomes from the training program and allows the program to be improved during implementation and throughout the training program's use. These measures, used together, allow an organization to achieve the best return on its training investment of time, money, and resources.
Strengths and Weaknesses of the Approach
All approaches have inherent strengths and weaknesses, and the Clark and Estes (2008)
gap analysis is no different. The highly structured approach of this gap analysis provides a very
clear path for problem-solving at a specific institution. However, the focus on a single
stakeholder in this study may be impractical for some organizations. A more comprehensive approach, one addressing multiple stakeholder groups, might be better for reaching a goal
when the problem has roots in many aspects of the organization. In this case, as the other
principal stakeholders had already addressed the problem when this study was initiated, the
single stakeholder focus was appropriate.
Similarly, the single stakeholder focus, as well as the focus on a single organization,
limits the generalizability of this study. PGU uses a different method of assessing student
progress than many institutions do. However, the issue of student progress rates is thought to be
universal among doctoral-granting institutions, and therefore while the approach and
recommendations of this study are specific to PGU, the concepts and methods may be useful to
other organizations.
Limitations and Delimitations
The limitations of this study are rooted in the small sample size and low response rate.
PGU’s restrictions on researchers’ ability to recruit participants, as well as the distributed nature
of the faculty at PGU, both contributed to the low response rate. Similarly, the validity of the
study may have been affected by the researcher being familiar to the participants. While the
researcher does not supervise any of the participants, their roles interact regularly. Having a person who is known to the participants conduct the interviews invites biased results, as the participants may be aware of the answers that would please the interviewer.
The delimitations of this study, looking at one stakeholder at one small university, limit
the generalizability of the results. PGU is a very small university, with only five doctoral
programs limited to the social sciences. PGU has unique demographics, with students who are older than average graduate students, and PGU lacks the types of financial support that many
universities offer. A more extensive study, which collected information from multiple
stakeholders at multiple universities, would improve the validity of the study, as well as improve
its generalizability.
Future Research
Considering the limitations and delimitations of this study, future research should be both larger in scale and broader in scope. Within the organization, continuing to closely track SAP rates, as well as completion and time-to-completion rates, is essential to identify trends and improvements. Additionally, further study into contributing factors that affect SAP rates, including gender identity, age, racial and ethnic group identity, prior academic preparation, and sources of funding, is indicated. Future studies should also engage
with students, alumni, and other stakeholders of interest to identify other areas of research and
contributing factors.
The organization should continue to track SAP rates closely, investigating trends related to funding type, prior academic preparation, and program of study. Similarly,
comparative studies of doctoral faculty advising practices between high performing and lower
performing programs would be useful to identify the factors that influence SAP rates. This study
investigated two knowledge influences assumed to affect SAP rates, but there are many more
knowledge influences that may be a factor in influencing SAP rates. Training and evaluation of
other policy and procedural knowledge influences may be indicated.
Future research should also include universities with larger populations and more diverse areas of study, which could significantly improve the
generalizability of the results. Similarly, investigating more aspects of the faculty advisor
demographics is recommended to determine whether age, racial and ethnic group identity, or gender identity has an impact on faculty advising and doctoral completion rates. Future researchers may
consider applying for research grants to provide increased incentives for faculty advisor
participation.
Conclusion
This study addresses the problem of slow rates of progress toward the completion of
doctoral degrees at Progressive Graduate University. Progressive has a goal of achieving a 12%
increase in doctoral student SAP rates, which has not been met after implementing recommended
improvements for and by the student and staff stakeholder groups. Doctoral faculty advisors
were chosen as the primary stakeholder group for this study because they have not implemented
any solutions related to this goal. This study has validated all of the assumed knowledge,
motivation, and organizational influences as needing improvement at PGU.
The recommended solutions are focused on PGU specifically, but many studies have
shown that slow rates of progress are a problem throughout doctoral education in the United
States. Traditional wisdom has held that this is a problem rooted entirely in the students themselves: that they lack either the preparation for doctoral study, the ability to finance their doctoral program, or the dedication to complete their doctoral degree. This study, however, contributes to the body of research showing that a lack of resources and support from the doctoral program also contributes to the problem. With
increased faculty training and accountability, universities can promote doctoral student progress
by providing the support that students need to overcome their deficits and achieve their goal of
completing a doctoral program.
References
Abedi, J., & Benkin, E. (1987). The effects of students’ academic, financial, and demographic
variables on time to the doctorate. Research in Higher Education, 27(1), 3–14.
https://doi.org/10.1007/BF00992302
Ampaw, F. D., & Jaeger, A. J. (2012). Completing the three stages of doctoral education: An
event history analysis. Research in Higher Education, 53(6), 640–660.
https://doi.org/10.1007/s11162-011-9250-3
Austin, A. E. (2002). Preparing the next generation of faculty. The Journal of Higher Education,
73(1), 94–122. https://doi.org/10.1080/00221546.2002.11777132
Bagaka’s, J. G., Badillo, N., Bransteter, I., & Rispinto, S. (2015). Exploring student success in a
doctoral program: The power of mentorship and research engagement. International Journal
of Doctoral Studies, 10, 323–342.
Bair, C. R., & Haworth, J. G. (2005). Doctoral student attrition and persistence: A meta-synthesis
of research. In J. C. Smart (Ed.), Higher Education: Vol. 19. Higher education: Handbook of
theory and research (pp. 481–534). Kluwer Academic Publishers.
Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological
Review, 84(2), 191–215.
Barnes, B. J. (2009). The nature of exemplary doctoral advisors’ expectations and the ways they
may influence doctoral persistence. Journal of College Student Retention, 11(3), 323–343.
https://doi.org/10.2190/CS.11.3.b
Barnes, B. J., Williams, E. A., & Archer, S. A. (2010). Characteristics that matter most: Doctoral
students’ perceptions of positive and negative advisor attributes. NACADA Journal, 30(1),
34–46. https://doi.org/10.12930/0271-9517-30.1.34
Baum, S., & Steele, P. (2017). Who goes to graduate school and who succeeds? Urban Institute; Access Group.
Baum, S., & Steele, P. (2018). Graduate and professional school debt: How much students borrow. Urban Institute; AccessLex Institute.
Behr, A., & Theune, K. (2016). The causal effect of off-campus work on time to degree.
Education Economics, 24(2), 189–209. https://doi.org/10.1080/09645292.2014.974509
Benavides, A. D., & Keyes, L. (2016). New-student orientations: Supporting success and
socialization in graduate programs. Journal of Public Affairs Education, 22(1), 107–124.
Bennett, W., & Grothe, B. (1982). Implementation of an academic progress policy at a public
urban university: A review after four years. Journal of Student Financial Aid, 12(1), Article 5,
33–39.
Berelson, B. (1960). Graduate education in the United States. McGraw-Hill Book Company.
Berg, G. A. (2016). The dissertation process and mentor relationships for African-American and
Latina/o students in an online program. American Journal of Distance Education, 30(4), 225–
236. https://doi.org/10.1080/08923647.2016.1227191
Bøgelund, P. (2015). How supervisors perceive PhD supervision: And how they practice it.
International Journal of Doctoral Studies, 10, 39–55.
Bolli, T., Agasisti, T., & Johnes, G. (2015). The impact of institutional student support on
graduation rates in US Ph.D. programmes. Education Economics, 23(4), 396–418.
https://doi.org/10.1080/09645292.2013.842541
Burnett, P. C. (1999). The supervision of doctoral dissertations using a collaborative cohort
model. Counselor Education and Supervision, 39(1), 46–52.
Calabrese, R. L., & Smith, P. A. (2010). The doctoral student’s advisor and mentor: Sage advice
from the experts. Rowman & Littlefield Education.
Caruth, G. D. (2015). Doctoral student attrition: A problem for higher education. The Journal of
Educational Thought, 48(3), 189–215.
Cate, P., & Miller, M. A. (2015). Academic advising within the academy: History, mission, and
role. In P. Folsom, F. L. Yoder, & J. Joslin (Eds.), The new advisor guidebook: Mastering the
art of academic advising (pp. 39–53). Jossey-Bass.
Cervone, D., Jiwani, N., & Wood, R. (1991). Goal setting and the differential influence of self-
regulatory processes on complex decision-making performance. Journal of Personality and
Social Psychology, 61(2), 257–266.
Chandler, V. (2018). Short and long-term impacts of an increase in graduate funding. Economics
of Education Review, 62, 104–112. https://doi.org/10.1016/j.econedurev.2017.11.007
Clark, R. E., & Estes, F. (2008). Turning research into results: A guide to selecting the right
performance solutions. Information Age Publishing.
Craft, C. M., Augustine-Shaw, D., Fairbanks, A., & Adams-Wright, G. (2016). Advising doctoral
students in education programs. NACADA Journal, 36(1), 54–65.
https://doi.org/10.12930/nacada-15-013
Creswell, J. W., & Creswell, J. D. (2018). Research design: Qualitative, quantitative, and mixed
methods approaches (Fifth edition). Sage Publications.
Cross, L. K. (2018). Graduate student perceptions of online advising. NACADA Journal, 38(2),
72–80. https://doi.org/10.12930/nacada-17-015
D’Andrea, L. M. (2002). Obstacles to completion of the doctoral degree in colleges of education:
The professor’s perspective. Educational Research Quarterly, 25(3), 42.
Delamont, S., Parry, O., & Atkinson, P. (1998). Creating a delicate balance: The doctoral
supervisor’s dilemmas. Teaching in Higher Education, 3(2), 157–172.
https://doi.org/10.1080/1356215980030203
Devine, K., & Hunter, K. H. (2017). PhD student emotional exhaustion: The role of supportive
supervision and self-presentation behaviours. Innovations in Education and Teaching
International, 54(4), 335–344. https://doi.org/10.1080/14703297.2016.1174143
Eccles, J. S., & Wigfield, A. (2002). Motivational beliefs, values, and goals. Annual Review of
Psychology, 53, 109–132.
Ehrenberg, R. G., & Mavros, P. G. (1995). Do doctoral students' financial support patterns affect their times-to-degree and completion probabilities? Journal of Human Resources,
30(3), 581–609.
Erichsen, E. A., Bolliger, D. U., & Halupa, C. (2014). Student satisfaction with graduate
supervision in doctoral programs primarily delivered in distance education settings. Studies in
Higher Education, 39(2), 321–338. https://doi.org/10.1080/03075079.2012.709496
Federal Student Aid. (n.d.). Subsidized and unsubsidized loans. U.S. Department of Education. https://studentaid.ed.gov/sa/types/loans/subsidized-unsubsidized
Gallimore, R., & Goldenberg, C. (2001). Analyzing cultural models and settings to connect minority achievement and school improvement research. Educational Psychologist, 36(1),
45–56. https://doi.org/10.1207/S15326985EP3601_5
Garcia, C. E., & Yao, C. W. (2019). The role of an online first-year seminar in higher education
doctoral students’ scholarly development. The Internet and Higher Education, 42, 44–52.
https://doi.org/10.1016/j.iheduc.2019.04.002
Garcia, L. A. (2013). Factors of attrition in cohort doctoral education: A self-determination theory perspective [PhD thesis, University of Texas at Arlington]. https://rc.library.uta.edu/uta-ir/bitstream/handle/10106/24173/Garcia_uta_2502D_12407.pdf?sequence=1
Gardner, S. K. (2009). Conceptualizing success in doctoral education: Perspectives of faculty in
seven disciplines. The Review of Higher Education, 32(3), 383–406.
https://doi.org/10.1353/rhe.0.0075
Gayardon, A. de, Callender, C., Deane, K., & Desjardins, S. (2018). Graduate indebtedness: Its
perceived effects on behaviour and life choices - a literature review. London, England: Centre
for Global Education.
Gilley, A., Gilley, J. W., & McMillan, H. S. (2009). Organizational change: Motivation,
communication, and leadership effectiveness. Performance Improvement Quarterly, 21(4),
75–94. https://doi.org/10.1002/piq.20039
Gillingham, L., Seneca, J. J., & Taussig, M. K. (1991). The determinants of progress to the
doctoral degree. Research in Higher Education, 32(4), 449–468.
https://doi.org/10.1007/BF00992186
Glesne, C. (2011). Becoming qualitative researchers: An introduction (4th ed.). Pearson Education.
Godskesen, M., & Kobayashi, S. (2016). Coaching doctoral students – a means to enhance
progress and support self-organisation in doctoral education. Studies in Continuing Education,
38(2), 145–161. https://doi.org/10.1080/0158037x.2015.1055464
Golde, C. M., Bueschel, A. C., Jones, L., & Walker, G. E. (2009). Advocating apprenticeship and
intellectual community: Lessons from the Carnegie Initiative on the Doctorate. In R. G.
Ehrenberg & C. V. Kuh (Eds.), Doctoral education and the faculty of the future (pp. 53–64).
Cornell University Press.
Hamel, A. V., & Furlong, J. S. (2012). The graduate school funding handbook (3rd ed.). University of Pennsylvania Press.
Harding-DeKam, J. L., Hamilton, B., & Loyd, S. (2012). The hidden curriculum of doctoral
advising. NACADA Journal, 32(2), 5–16.
Horta, H., Cattaneo, M., & Meoli, M. (2018). PhD funding as a determinant of PhD and career
research performance. Studies in Higher Education, 43(3), 542–570.
https://doi.org/10.1080/03075079.2016.1185406
Hulleman, C. S., Godes, O., Hendricks, B. L., & Harackiewicz, J. M. (2010). Enhancing interest
and performance with a utility value intervention. Journal of Educational Psychology, 102(4),
880–895. https://doi.org/10.1037/a0019506
Irwin, C. W., & Stafford, E. T. (2016). Survey methods for educators: Collaborative survey
development (part 1 of 3) (REL 2016–163). Washington, DC: U.S. Department of Education,
Institute of Education Sciences, National Center for Education Evaluation and Regional
Assistance, Regional Educational Laboratory Northeast & Islands.
Ivankova, N. V ., & Stick, S. L. (2007). Students’ persistence in a distributed doctoral program in
educational leadership in higher education: A mixed methods study. Research in Higher
Education, 48(1), 93–135. https://doi.org/10.1007/s11162-006-9025-4
James, W. (1903). The Ph.D. octopus. Harvard Monthly, 36(1), 1–9.
Johnson, B., & Christensen, L. (2014). Educational research: Quantitative, qualitative, and
mixed approaches (Fifth edition). Sage Publications.
Kim, D., & Otts, C. (2010). The effect of loans on time to doctorate degree: Differences by
race/ethnicity, field of study, and institutional characteristics. The Journal of Higher
Education, 81(1), 1–32. https://doi.org/10.1080/00221546.2010.11778968
King, M. C. (2008). Organization of academic advising services. In V. N. Gordon, W. R. Habley, & T. Grites (Eds.), Academic advising: A comprehensive handbook (2nd ed., pp. 242–252).
Kirkpatrick, J. D., & Kirkpatrick, W. K. (2016). Kirkpatrick’s four levels of training evaluation.
ATD Press.
Knotts, H. G., & Wofford, C. B. (2017). Perceptions of effectiveness and job satisfaction of pre-
law advisors. NACADA Journal, 37(2), 76–88.
Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory into Practice,
41(4), 212–218. https://doi.org/10.1207/s15430421tip4104_2
Kumar, S., Johnson, M., & Hardemon, T. (2013). Dissertations at a distance: Students’
perceptions of online mentoring in a doctoral program. The Journal of Distance Education,
27(1), 1–11.
Kuperminc, G. P., Chan, W. Y., Seitz, S., & Wilson, C. (2016). Infusing community psychology
practice competencies into doctoral training. Global Journal of Community Psychology
Practice, 7(4), 1–24.
Lechuga, V. M. (2011). Faculty-graduate student mentoring relationships: Mentors’ perceived
roles and responsibilities. Higher Education, 62(6), 757–771. https://doi.org/10.1007/s10734-
011-9416-0
Leonard, M. J. (2008). Advising delivery: Using technology. In V. N. Gordon, W. R. Habley, & T. Grites (Eds.), Academic advising: A comprehensive handbook (2nd ed., pp. 292–308). John Wiley & Sons.
Lepp, L., Remmik, M., & Leijen, D. A. J. (2016). Doctoral students’ research stall: Supervisors’
perceptions and intervention strategies. SAGE Open, 6(3), 1–12.
https://doi.org/10.1177/2158244016659116
Lovitts, B. E. (2001). Leaving the ivory tower: The causes and consequences of departure from
doctoral study. Rowman & Littlefield.
Malik, M. A. R., Butt, A. N., & Choi, J. N. (2015). Rewards and employee creative performance:
Moderating effects of creative self-efficacy, reward importance, and locus of control. Journal
of Organizational Behavior, 36(1), 59–74. https://doi.org/10.1002/job.1943
Marshall, S. M., Klocko, B., & Davidson, J. (2017). Dissertation completion: No longer higher
education’s invisible problem. Journal of Educational Research and Practice, 7(1), 74–90.
Maxwell, J. A. (2013). Qualitative research design: An interactive approach (3rd ed.). Applied social research methods series: Vol. 41. SAGE.
McClellan, E. E. (2016). Reward systems and career ladders for advisors. In T. J. Grites, M. A.
Miller, & J. Givans Voller (Eds.), Beyond foundations: Developing as a master academic
advisor (pp. 225–249). Jossey-Bass.
McFarland, J., Hussar, B., Wang, X., Zhang, J., Wang, K., Rathbun, A., Barmer, A., Forrest
Cataldi, E., & Bullock Mann, F. (2018). The condition of education 2018. (NCES 2018-144).
U.S. Department of Education. Washington, DC: National Center for Education Statistics.
Mendoza, P., Villarreal III, P., & Gunderson, A. (2014). Within-year retention among Ph.D.
students: The effect of debt, assistantships, and fellowships. Research in Higher Education,
55(7), 650–685.
Merriam, S. B., & Tisdell, E. J. (2015). Qualitative research: A guide to design and
implementation (Fourth edition). Jossey-Bass.
Millett, C. M., & Nettles, M. T. (2009). Three ways of winning doctoral education: Rate of progress, degree completion, and time to degree. In R. G. Ehrenberg & C. V. Kuh (Eds.), Doctoral education and the faculty of the future (pp. 65–89). Cornell University Press.
National Research Council. (1996). The Path to the Ph.D: Measuring graduate attrition in the
sciences and humanities. National Academies Press. https://doi.org/10.17226/5195
NSF/NCSES. (2017, December). Doctorate recipients from U.S. universities 2016: Table 2. Doctorate-granting institutions and doctorate recipients per institution: 1973–2016 (NSF 18-304). https://www.nsf.gov/statistics/2018/nsf18304/technical-notes.cfm
NSF/NCSES. (2018a). Doctorate recipients from U.S. universities: 2016 (NSF 18-304). Alexandria, VA. https://www.nsf.gov/statistics/2018/nsf18304/
NSF/NCSES. (2018b, December). Doctorate recipients from U.S. universities: 2017 (Table 32; NSF 19-301). Alexandria, VA: National Science Foundation, National Center for Science and Engineering Statistics.
Offstein, E. H., Larson, M. B., McNeill, A. L., & Mjoni Mwale, H. (2004). Are we doing enough
for today’s graduate student? International Journal of Educational Management, 18(7), 396–
407. https://doi.org/10.1108/09513540410563103
Ostriker, J. P., Kuh, C. V., & Voytuk, J. A. (Eds.). (2011). A data-based assessment of research-doctorate programs in the United States. National Academies Press. https://doi.org/10.17226/12994
Paglis, L. L., Green, S. G., & Bauer, T. N. (2006). Does adviser mentoring add value? A
longitudinal study of mentoring and doctoral student outcomes. Research in Higher
Education, 47(4), 451–476. https://doi.org/10.1007/s11162-005-9003-2
Pajares, F. (2009). Self-efficacy theory. Education.com.
http://www.education.com/reference/article/self-efficacy-theory/
Pazzaglia, A. M., Stafford, E. T., & Rodriguez, S. M. (2016). Survey methods for educators:
Selecting samples and administering surveys (part 2 of 3) (REL 2016–160). Washington, DC:
U.S. Department of Education, Institute of Education Sciences, National Center for Education
Evaluation and Regional Assistance, Regional Educational Laboratory Northeast & Islands.
Pintrich, P. R. (2003). A motivational science perspective on the role of student motivation in
learning and teaching contexts. Journal of Educational Psychology, 95(4), 667–686.
https://doi.org/10.1037/0022-0663.95.4.667
Robinson, S. B., & Leonard, K. F. (2019). Designing quality survey questions. Sage Publications.
Rockinson-Szapkiw, A. J., Spaulding, L. S., & Spaulding, M. T. (2016). Identifying significant
integration and institutional factors that predict online doctoral persistence. The Internet and
Higher Education, 31, 101–112. https://doi.org/10.1016/j.iheduc.2016.07.003
Roumell, E. A. L., & Bolliger, D. U. (2017). Experiences of faculty with doctoral student
supervision in programs delivered via distance. The Journal of Continuing Higher Education,
65(2), 82–93. https://doi.org/10.1080/07377363.2017.1320179
Rubin, H. J., & Rubin, I. S. (2012). Qualitative interviewing: The art of hearing data (3rd ed.). Sage Publications.
Rueda, R. (2011). The three dimensions of improving student performance. Teachers College
Press.
Ruud, C. M., Saclarides, E. S., George-Jackson, C. E., & Lubienski, S. T. (2018). Tipping points:
Doctoral students and consideration of departure. Journal of College Student Retention:
Research, Theory & Practice, 20(3), 286–307. https://doi.org/10.1177/1521025116666082
Salani, D., Albuja, L. D., & Azaiza, K. (2016). The keys to success in doctoral studies: A
preimmersion course. Journal of Professional Nursing : Official Journal of the American
Association of Colleges of Nursing, 32(5), 358–363.
https://doi.org/10.1016/j.profnurs.2016.01.005
Salkind, N. J. (2016). Statistics for people who (think they) hate statistics (International student
edition). Sage Publications.
Schein, E. H. (2017). Organizational culture and leadership (5th ed.). Wiley.
Schroeder, S. M., & Terras, K. L. (2015). Advising experiences and needs of online, cohort, and
classroom adult graduate learners. NACADA Journal, 35(1), 42–55.
https://doi.org/10.12930/nacada-13-044
Schuh, J. H., & Gansemer-Topf, A. M. (2012). Finances and retention. In A. Seidman (Ed.), American Council of Education, Series on Higher Education. College student retention: Formula for student success (2nd ed.). Rowman & Littlefield.
Skakni, I. (2018). Doctoral studies as an initiatory trial: Expected and taken-for-granted practices
that impede PhD students’ progress. Teaching in Higher Education, 23(8), 927–944.
https://doi.org/10.1080/13562517.2018.1449742
Sowell, R., Zhang, T., Redd, K. E., & King, M. F. (2008). Ph.D. completion and attrition:
Analysis of baseline program data from the Ph.D. completion project. Council of Graduate
Schools.
Steele, G. E. (2014). Intentional use of technology for academic advising. NACADA
Clearinghouse Resource. https://nacada.ksu.edu/Resources/Clearinghouse/View-
Articles/Intentional-use-of-technology-for-academic-advising.aspx
Strachan, R., Murray, R., & Grierson, H. (2004). A web-based tool for dissertation writing.
British Journal of Educational Technology, 35(3), 369–375.
http://doi.wiley.com/10.1111/j.0007-1013.2004.00395.x
Terry, T., & Ghosh, R. (2015). Mentoring from different social spheres: How can multiple
mentors help in doctoral student success in Ed.D programs? Mentoring and Tutoring:
Partnership in Learning, 23(3), 187–212. https://doi.org/10.1080/13611267.2015.1072396
Tinto, V. (1987). Leaving college: Rethinking the causes and cures of student attrition. University of Chicago Press.
van der Haert, M., Arias, E., Emplit, P., Halloin, V., & Dehon, C. (2014). Dropout and degree completion in doctoral study: A competing risks survival analysis. Studies in Higher Education, 39(10), 1885–1909. https://doi.org/10.1080/03075079.2013.806458
Volkwein, J. F., & Lorang, W. G. (1996). Characteristics of extenders: Full-time students who take light credit loads and graduate in more than four years. Research in Higher Education, 37(1), 43–67.
Wao, H. O., & Onwuegbuzie, A. J. (2011). A mixed research investigation of factors related to
time to the doctorate in education. International Journal of Doctoral Studies, 6, 115–134.
https://doi.org/10.28945/1505
Wilcox, E. (2017). The technologist’s advising curriculum. EDUCAUSE.
https://er.educause.edu/blogs/2017/7/the-technologists-advising-curriculum
Williams, J. D., Harlow, S. D., & Gab, D. (1970). A longitudinal study examining prediction of
doctoral success: Grade point average as criterion, or graduation vs. non-graduation as
criterion. The Journal of Educational Research, 64(4), 161–165.
https://doi.org/10.1080/00220671.1970.10884126
York, T. T., Gibson, C., & Rankin, S. (2015). Defining and measuring academic success.
Practical Assessment, Research & Evaluation, 20(5), 1–20.
Zhou, E., & Okahana, H. (2019). The role of department supports on doctoral completion and
time-to-degree. Journal of College Student Retention, 20(4), 511–529.
https://doi.org/10.1177/1521025116682036
Zieky, M., & Perie, M. (2006). A primer on setting cut scores on tests of educational
achievement. Educational Testing Service.
APPENDIX A
Survey Protocol
Each entry below lists the research question/data type and KMO influence code, the survey item (question and response options), its scale of measurement, the potential analyses, and the planned visual representation.
Informed consent statement (N/A)
I am interested in understanding faculty advisors' knowledge and motivation, as well as organizational influences, related to SAP and facilitating doctoral student progress. You will be presented with information relevant to SAP and student advising and asked to answer some questions about it. Please be assured that your responses will be kept completely confidential.
The study should take you around 5–7 minutes to complete. For each completed survey, I will donate $10 to a PGU student scholarship fund.
Your participation in this research is voluntary. You have the right to withdraw at any point during the study, for any reason, and without any prejudice. If you would like to contact the Principal Investigator to discuss this research, please e-mail Lindsay Cahn, lcahn@usc.edu. If you have questions or concerns about your rights as a research participant, contact the USC IRB by email at IRB@usc.edu or by telephone at (323) 442-0114.
By clicking the button below, you acknowledge that your participation in the study is voluntary, that you are 18 years of age, and that you are aware that you may choose to terminate your participation in the study at any time and for any reason.
Please note that this survey is best displayed on a laptop or desktop computer. Some features may be less compatible with use on a mobile device.
The USC Institutional Review Board retains the right to access the signed informed consent forms and other study documents.
Demographics, sample description (N/A)
Item: How many years have you been advising doctoral students? (single-line number entry)
Scale: Ratio. Analyses: Percentage, frequency, mode, median, mean, standard deviation, range. Visual: Table

Demographics, sample description (N/A)
Item: In the 2018-19 academic year, approximately how many students did you advise as the official faculty advisor/mentor of record? (single-line number entry)
Scale: Ratio. Analyses: Percentage, frequency, mode, median, mean, standard deviation, range. Visual: Table

Demographics, sample description (N/A)
Item: What school are you primarily affiliated with at PGU? (choose one: School 1, School 2)
Scale: Nominal. Analyses: Percentage, frequency. Visual: Table

Demographics, sample description (N/A)
Item: What program(s) do you advise doctoral students for? (select multiple; options depend on the previous question: School 1 = Programs 1, 2, 3; School 2 = Programs 4, 5)
Scale: Nominal. Analyses: Percentage, frequency. Visual: Table
Faculty advisors need to understand the SAP policy (K-F)
Item: When viewing a student's tracking sheet, do you know how to identify the student's SAP review period (dates of review)? (Yes, No, I'm not sure)
Scale: Nominal. Analyses: Percentage, frequency, mode, median, range. Visual: Table

Faculty advisors need to understand the SAP policy (K-F)
Item: If a student started their doctoral program in May, when are they reviewed for SAP? (choose one: Spring Review (end of April), Fall Review (end of August), I'm not sure)
Scale: Nominal. Analyses: Percentage, frequency. Visual: Table, pie chart

Faculty advisors need to understand the SAP policy (K-F)
Item: If a student is receiving a Spring 2020 SAP review, which academic terms are included in that review period? (choose one, randomized order: Fall 2019, Spring 2020, and Summer 2020; Summer 2019, Fall 2019, and Spring 2020; Spring 2019, Summer 2019, and Fall 2019; I'm not sure)
Scale: Nominal. Analyses: Percentage, frequency. Visual: Table, pie chart
Faculty advisors need to know how to facilitate doctoral student progress (K-P)
Item: In a typical semester/term, how often do you meet with your advisees, either individually or in groups? (open response)

Faculty advisors need to know how to facilitate doctoral student progress (K-P)
Item: Do you check in with your advisees if they have not been in touch with you as expected? Why or why not? (open response)

Faculty advisors need to know how to facilitate doctoral student progress (K-P)
Item: How do you track your advisees' progress through the program? (open response)
Faculty advisors need to value the importance of SAP reviews to students and the institution (M-UV)
Item: When advising students, do you bring up the annual SAP review as part of your regular topics of discussion? (Always, sometimes, occasionally, rarely, never; if rarely or never: "Why not?" open response)
Scale: Ordinal. Analyses: Percentage, frequency, mode, median, range. Visual: Table, stacked bar chart

Faculty advisors need to value the importance of SAP reviews to students and the institution (M-UV)
Item: How important is the SAP review process for tracking student progress? (Very important, important, moderately important, of little importance, unimportant; if of little importance or unimportant: "Why not?" open response)
Scale: Ordinal. Analyses: Percentage, frequency, mode, median, range. Visual: Table, stacked bar chart

Faculty advisors need to value the importance of SAP reviews to students and the institution (M-UV)
Item: Please indicate your level of agreement with this statement: "I believe the SAP review is a good measure of student progress toward degree completion." (Strongly agree, agree, undecided, disagree, strongly disagree; if disagree or strongly disagree: "Why not?" open response)
Scale: Ordinal. Analyses: Percentage, frequency, mode, median, range. Visual: Table, stacked bar chart
Faculty advisors need to believe they can influence SAP review outcomes for their advisees (M-SET)
Item: Please indicate your level of agreement with this statement: "In my experience, if I advise students about annual SAP requirements, they are more likely to make SAP." (Strongly agree, agree, undecided, disagree, strongly disagree; if disagree or strongly disagree: "Why not?" open response)
Scale: Ordinal. Analyses: Percentage, frequency, mode, median, range. Visual: Table, stacked bar chart

Faculty advisors need to believe they can influence SAP review outcomes for their advisees (M-SET)
Item: Please indicate your level of agreement with this statement: "I believe that as a faculty advisor, I have an influence on whether or not my students make SAP." (Strongly agree, agree, undecided, disagree, strongly disagree; if disagree or strongly disagree: "Why not?" open response)
Scale: Ordinal. Analyses: Percentage, frequency, mode, median, range. Visual: Table, stacked bar chart

Faculty advisors need to believe they can influence SAP review outcomes for their advisees (M-SET)
Item: Please indicate your level of agreement with this statement: "My advising of students has no impact on whether or not they make SAP." (Strongly agree, agree, undecided, disagree, strongly disagree; if agree or strongly agree: "Why?" open response)
Scale: Ordinal. Analyses: Percentage, frequency, mode, median, range. Visual: Table, stacked bar chart
The organization needs to provide faculty with resources and training related to SAP and facilitating doctoral student progress (O-CS)
Item: Have you ever received training or instruction about how SAP is calculated? (Yes, No, I'm not sure)
Scale: Nominal. Analyses: Percentage, mode. Visual: Table, stacked bar chart

The organization needs to provide faculty with resources and training related to SAP and facilitating doctoral student progress (O-CS)
Item: Do you believe training related to facilitating doctoral student progress should be provided to faculty advisors? (Yes, no, I don't know)
Scale: Nominal. Analyses: Percentage, frequency. Visual: Table

The organization needs to provide faculty with resources and training related to SAP and facilitating doctoral student progress (O-CS)
Item: To what extent do you agree with the following statement: "I feel I've been provided with the developmental resources I need to facilitate doctoral student progress." (Strongly agree, agree, undecided, disagree, strongly disagree)
Scale: Ordinal. Analyses: Percentage, frequency, mode, median, range. Visual: Table
The organization needs a culture valuing the importance of measuring and tracking student progress (O-CM)
Item: To what extent do you agree with the following statement: "PGU values the importance of measuring and tracking student progress." (Strongly agree, agree, undecided, disagree, strongly disagree; if disagree or strongly disagree: "Why not?" open response)
Scale: Ordinal. Analyses: Percentage, frequency, mode, median, range. Visual: Table

The organization needs a culture valuing the importance of measuring and tracking student progress (O-CM)
Item: To what extent do you agree with the following statement: "PGU could do more to strengthen its resources for measuring and tracking student progress." (Strongly agree, agree, undecided, disagree, strongly disagree)
Scale: Ordinal. Analyses: Percentage, frequency, mode, median, range. Visual: Table, stacked bar chart

The organization needs a culture valuing the importance of measuring and tracking student progress (O-CM)
Item: To what extent do you agree with the following statement: "PGU holds me accountable for my students' SAP rates." (Strongly agree, agree, undecided, disagree, strongly disagree)
Scale: Ordinal. Analyses: Percentage, frequency, mode, median, range. Visual: Table, stacked bar chart
Legend: K-F = Knowledge-Factual, K-P = Knowledge-Procedural, M-UV = Utility Value, M-SET = Self-Efficacy Theory, O-CM = Cultural Model, O-CS = Cultural Setting
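The "Potential Analyses" entries above pair each scale of measurement with the statistics appropriate for it. The short Python sketch below illustrates those pairings using only the standard library; the item names and sample responses are hypothetical, not data from this study:

```python
from collections import Counter
from statistics import mean, median, mode, stdev

# Hypothetical responses to the ratio-scale item
# "How many years have you been advising doctoral students?"
years_advising = [2, 5, 5, 8, 12, 3]

# Ratio scale: mean, standard deviation, median, mode, and range all apply.
ratio_summary = {
    "mean": round(mean(years_advising), 2),
    "median": median(years_advising),
    "mode": mode(years_advising),
    "stdev": round(stdev(years_advising), 2),
    "range": max(years_advising) - min(years_advising),
}

# Nominal scale (e.g., school affiliation): only frequencies and
# percentages are meaningful.
school = ["School 1", "School 1", "School 2", "School 1"]
counts = Counter(school)
percentages = {k: 100 * v / len(school) for k, v in counts.items()}

# Ordinal scale (Likert items): mode, median, and range apply, but not the mean.
AGREEMENT = ["strongly disagree", "disagree", "undecided", "agree", "strongly agree"]
responses = ["agree", "agree", "undecided", "strongly agree", "agree"]
codes = [AGREEMENT.index(r) for r in responses]  # 0 (SD) .. 4 (SA)
ordinal_summary = {
    "mode": AGREEMENT[mode(codes)],
    "median": AGREEMENT[int(median(codes))],
}

print(ratio_summary, percentages, ordinal_summary)
```

The design point the protocol encodes is that the scale of measurement constrains the statistics: means and standard deviations are computed only for the ratio-scale item, while the Likert item is reduced to its mode and median.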
APPENDIX B
Interview Protocol
Thank you for agreeing to participate in this research study and signing the informed
consent form. Do you mind if I record our interview? As I’ve mentioned, I will be asking you
some questions about your experience as a faculty advisor at PGU. While you know me as the
director of student advising at PGU, for the purposes of this interview please try to think of me
as a student researcher and not a PGU staff member. I will not hold anything you say in this interview against you, and I truly value your honesty about this topic.
I will use a pseudonym to refer to you in my notes and in the published study to protect
your identity. I will not identify you by name to anyone in relation to this study. Is there a name
you’d like me to use for you? If you don’t have a preference, I’ll assign you a name from this
list. I’d also like to remind you that you can withdraw from participation in the study at any time,
and you are not obligated to answer any of my questions. Please let me know if you have any
questions or concerns either during or after the interview.
Background information:
1. To start out, how many years have you been advising doctoral students?
2. In a given year, about how many students do you advise?
3. What program do you advise students for?

Now I'm going to ask you some questions about the annual satisfactory academic progress (SAP) review.

Utility value (faculty advisors need to value the importance of SAP reviews to students and the institution):
4. Would you say that the SAP review is a valuable tool for tracking student progress?
5. Do you think it is important for students to make SAP?
6. Is there a better method you think could be used to assess progress?

Self-efficacy (faculty advisors need to believe they can influence SAP review outcomes for their advisees):
7. Do you feel like you have the ability to influence whether or not your advisees make SAP?
Probes:
• Can you describe the methods you have used to help students make SAP?
• Follow-up: Were those interventions successful?
• Follow-up: How could you tell?

Cultural model influence (the organization needs a culture valuing the importance of measuring and tracking student progress):
8. Do you believe that PGU has a culture that values measuring and tracking student progress?

Cultural setting influence (the organization needs to provide faculty with resources and training related to SAP and facilitating student progress):
9. Have you received any specific training about how and when SAP reviews are conducted?
Probes:
• Who provided the training?
• When was that training provided?

Final question:
10. As we finish up, is there anything else you'd like to add?
Thank you for taking the time to speak with me today. I really appreciate your time and
the thought that you put into your answers. I’ll leave you with my contact information in case
you have any questions later.
APPENDIX C
Informed Consent Form
INFORMED CONSENT FOR RESEARCH
Study Title: Satisfactory Academic Progress for Doctoral Students: An Improvement
Study
Principal Investigator: Lindsay Cahn
Supervisor: Dr. Alison Muraszewski
Department: Rossier School of Education
---------------------------------------------------------------------------------------------------------------------
INTRODUCTION
We invite you to take part in a research study. Please take as much time as you need to
read the consent form. You may want to discuss it with your family, friends, or your personal
doctor. If you find any of the language difficult to understand, please ask questions. If you decide
to participate, you will be asked to sign this form. A copy of the signed form will be provided to
you for your records.
PURPOSE
The purpose of this study is to determine whether there are gaps in faculty advisors' knowledge and motivation, or in organizational influences, that interfere with the organization's goal of increasing SAP rates by 12%. We hope to learn more about the faculty knowledge and motivation influences related to SAP and student progress. You are invited as a possible participant because you are a doctoral faculty advisor at PGU. About 60 participants will be invited to take part in the study.
PROCEDURES
If you decide to take part, this is what will happen: you will be interviewed for approximately 20 minutes and/or asked to fill out a 5-10 minute survey related to your experience as a doctoral faculty advisor.
RISKS AND DISCOMFORTS
Possible risks and discomforts you could experience during this study include:
• Some of the questions may make you feel uneasy or embarrassed. You can choose to skip
or stop answering any questions you don’t want to.
• There is a small risk that people who are not connected with this study will learn your
identity or your personal information. All possible care will be taken to prevent this.
BENEFITS
There are no direct benefits to you from taking part in this study. However, your
participation in this study may help us learn what barriers faculty advisors may have that are
preventing an increase in SAP rates for doctoral students.
PRIVACY/CONFIDENTIALITY
We will keep your records for this study confidential as far as permitted by law. However,
if we are required to do so by law, we will disclose confidential information about you. Efforts
will be made to limit the use and disclosure of your personal information, including research
study and medical records, to people who are required to review this information. We may
publish the information from this study in journals or present it at meetings. If we do, we will not
use your name.
The University of Southern California’s Institutional Review Board (IRB) may review
your records. The Institutional Review Board of Progressive Graduate University retains the
right to access the signed informed consent forms and study documents.
Your data or specimens will be securely stored at the principal investigator's home on a password-protected computer.
The security of data transmitted over the Internet cannot be guaranteed; therefore, there is
a slight risk that the information you send to me via email will not be secure. The collection of
such data is not expected to present any greater risk than you would encounter in everyday life
when sending and/or receiving information over the Internet.
The results of this research will be published in my dissertation and possibly published in
subsequent journals or books or presentations. Direct quotes from your interview may be used in
publications or presentations. You will be identified by pseudonym only and any identifying
information will be removed.
ALTERNATIVES
An alternative would be to not participate in this study.
PAYMENTS
You will not be compensated for your participation in this research.
VOLUNTARY PARTICIPATION
It is your choice whether or not to participate. If you choose to participate, you may
change your mind and leave the study at any time. Refusal to participate or stopping your
participation will involve no penalty or loss of benefits to which you are otherwise entitled.
If you stop being in the research, already collected data may not be removed from the
study database. You will be asked whether the investigator can continue to collect data from your
records. If you agree, this data will be handled the same as the research data. No new information
or samples will be collected about you or from you by the study team without your permission.
The study site may still, after your withdrawal, need to report any safety event that you
may have experienced due to your participation to all entities involved in the study. Your
personal information, including any identifiable information, that has already been collected up
to the time of your withdrawal will be kept and used to guarantee the integrity of the study, to
determine the safety effects, and to satisfy any legal or regulatory requirements.
CONTACT INFORMATION
If you have questions, concerns, complaints, or think the research has hurt you, talk to the
study investigator at 805-898-4052.
This research has been reviewed by the USC Institutional Review Board (IRB). The IRB
is a research review board that reviews and monitors research studies to protect the rights and
welfare of research participants. Contact the IRB if you have questions about your rights as a
research participant or you have complaints about the research. You may contact the IRB at (323)
442-0114 or by email at irb@usc.edu.
If you have any questions about any aspect of this study or your involvement, please tell
the researcher before signing this form. You may also contact the supervising faculty if you have
questions or concerns about your participation in this study. The supervising faculty has provided
contact information at the bottom of this form.
You may also ask questions at any time during your participation in this study.
STATEMENT OF CONSENT
I have read (or someone has read to me) the information provided above. I have been
given a chance to ask questions. All my questions have been answered. By signing this form, I
am agreeing to take part in this study.
Name of Research Participant Signature Date Signed
(and Time*)
Person Obtaining Consent
I have personally explained the research to the participant using non-technical language. I
have answered all the participant’s questions. I believe that the participant understands the
information described in this informed consent and freely consents to participate.
Name of Person Obtaining Signature Date Signed
Informed Consent (and Time*)
Supervising faculty:
Alison K. Muraszewski, Ed.D.
Lecturer
USC Rossier School of Education
3470 Trousdale Parkway
Waite Phillip Hall (WPH)
Los Angeles, CA 90089
alkeller@usc.edu
APPENDIX D
Immediate Evaluation Instrument
Communication:
Congratulations on successfully completing the SAP training module. To help us improve this training, we would appreciate your feedback via the short survey below. Please contact us with any questions or concerns.
Best,
(Training Coordinator)
Each item below lists its evaluation level and category, the question, and its rating scale.

2 - Knowledge
How would you rate your understanding of the SAP process prior to completing this training? (1 = very high, 2 = high, 3 = average, 4 = low, 5 = very low)

2 - Knowledge
How would you rate your understanding of the SAP process after completing this training? (1 = very high, 2 = high, 3 = average, 4 = low, 5 = very low)

How much do you agree with the following statements:

1 - Satisfaction
This training module was easy to navigate. (1 = strongly agree, 2 = agree, 3 = undecided, 4 = disagree, 5 = strongly disagree)

1 - Engagement
This program held my interest. (1 = strongly agree, 2 = agree, 3 = undecided, 4 = disagree, 5 = strongly disagree)

1 - Relevance
I will use the information covered in this training in my daily work. (1 = strongly agree, 2 = agree, 3 = undecided, 4 = disagree, 5 = strongly disagree)

2 - Declarative knowledge
If a student started their doctoral program in May, when are they reviewed for SAP? (1 = Spring Review (end of April), 2 = Fall Review (end of August), 3 = I'm not sure)

2 - Declarative knowledge
If a student is receiving a Spring 2020 SAP review, which academic terms are included in that review period? (1 = Fall 2019, Spring 2020, and Summer 2020; 2 = Summer 2019, Fall 2019, and Spring 2020; 3 = Spring 2019, Summer 2019, and Fall 2019; 4 = I'm not sure)

2 - Attitude
I believe this training was relevant to my role as a faculty advisor. (1 = strongly agree, 2 = agree, 3 = undecided, 4 = disagree, 5 = strongly disagree)

2 - Confidence
I feel confident I can use the information from this training in my advising of students. (1 = strongly agree, 2 = agree, 3 = undecided, 4 = disagree, 5 = strongly disagree)

2 - Commitment
I am committed to using this information in my advising of students. (1 = strongly agree, 2 = agree, 3 = undecided, 4 = disagree, 5 = strongly disagree)

Please share any additional comments or feedback related to this training module: (free-text short-answer response)
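One simple way to summarize the two pre/post knowledge items above is to code each rating numerically (1 = very high through 5 = very low, as in the instrument) and count how many respondents rated their understanding higher after the training than before. The sketch below is a minimal illustration; the sample responses are hypothetical, not data from this study:

```python
# Rating scale from the instrument: a lower code means higher
# self-rated understanding (1 = very high ... 5 = very low).
SCALE = {"very high": 1, "high": 2, "average": 3, "low": 4, "very low": 5}

# Hypothetical (pre-training rating, post-training rating) pairs,
# one tuple per respondent.
responses = [
    ("low", "high"),
    ("average", "high"),
    ("high", "high"),
    ("very low", "average"),
]

# A respondent "improved" if their post-training code is lower (better).
improved = sum(1 for pre, post in responses if SCALE[post] < SCALE[pre])
unchanged = sum(1 for pre, post in responses if SCALE[post] == SCALE[pre])

print(f"{improved}/{len(responses)} improved, {unchanged} unchanged")
```

Because the scale is inverted (1 is best), the comparison uses "post code less than pre code" to detect improvement; with self-report Likert codes, only this ordinal before/after comparison is defensible, not a difference of means.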
APPENDIX E
Delayed Evaluation Instrument
Communication:
Dear (participant name),
You completed the SAP training module 30 days ago. To help us improve the training module, and in support of the goal of increasing SAP rates, we are asking for a few minutes of your time to provide additional feedback. Please fill out and submit the survey below within the next 30 days. If you have any questions or concerns, please contact me.
Best,
(Training coordinator)
Level 1 - Reaction
I have used the information from the SAP training manual while performing my job. (1 = strongly agree, 2 = agree, 3 = undecided, 4 = disagree, 5 = strongly disagree)

Level 1 - Customer Satisfaction
Completing the SAP training was a good use of my time. (1 = strongly agree, 2 = agree, 3 = undecided, 4 = disagree, 5 = strongly disagree)

Level 2 - Learning
If a student started their doctoral program in September, when are they reviewed for SAP? (1 = Spring Review (end of April), 2 = Fall Review (end of August), 3 = I'm not sure)

Level 3 - Behavior
I have been able to successfully advise students about their upcoming SAP review as a result of this training. (1 = strongly agree, 2 = agree, 3 = undecided, 4 = disagree, 5 = strongly disagree)

Level 3 - Behavior
I feel more comfortable reviewing students' academic plans (PIPs & PCPs) since completing this training. (1 = strongly agree, 2 = agree, 3 = undecided, 4 = disagree, 5 = strongly disagree)

Level 3 - Drivers
Please indicate if any of the following statements are true (select all that apply):
1) I do not have the necessary knowledge or skills to advise about SAP
2) I do not understand how to advise students about SAP
3) I have more important priorities than advising about SAP
4) I don't feel confident advising students about SAP
5) I don't feel confident reviewing student plans (PIPs and PCPs) for accuracy
6) I don't believe advising students about SAP is important
7) None of the above statements are true for me

Level 3 - Drivers
I have the support from the university that I need to advise students about SAP. (1 = strongly agree, 2 = agree, 3 = undecided, 4 = disagree, 5 = strongly disagree)

Level 4 - Leading Indicators
Since completing the training, I have done the following (check all that apply):
1) Referred student(s) to their graduate program advisor to address a SAP issue.
2) Helped a student create an academic plan (learning plan, PIP, or PCP) that meets SAP requirements.
3) Identified students who are in danger of not making SAP in their next review.
4) I have not had an opportunity to do any of the above.

Please share any additional comments or feedback related to this training module or the goal to increase SAP rates: (free-text short-answer response)
Abstract
This study addresses the issue of slow rates of progress toward doctoral degree completion using the annual satisfactory academic progress (SAP) review as an indicator of inadequate progress. The Clark and Estes (2008) gap analysis was used to evaluate the assumed knowledge, motivation, and organizational influences that are preventing Progressive Graduate University (PGU, a pseudonym) from reaching its organizational goal to increase SAP rates for doctoral students. A mixed-method approach was used, with quantitative surveys and qualitative interviews with faculty members. All six assumed influences were validated as gaps for the organization, although the study was hampered by a low response rate. Detailed recommendations are made to address the gaps, and an integrated implementation plan was designed, including evaluation of outcomes.
Asset Metadata
Creator: Cahn, Lindsay (author)
Core Title: Satisfactory academic progress for doctoral students: an improvement study
School: Rossier School of Education
Degree: Doctor of Education
Degree Program: Organizational Change and Leadership (On Line)
Publication Date: 07/26/2020
Defense Date: 07/22/2020
Publisher: University of Southern California (original), University of Southern California. Libraries (digital)
Tags: doctoral students, graduate students, satisfactory academic progress, time to degree
Language: English
Contributor: Electronically uploaded by the author (provenance)
Advisor: Muraszewski, Alison Keller (committee chair), Seli, Helena (committee member), Stowe, Kathy (committee member)
Creator Email: lcahn@usc.edu, lindsay.cahn@gmail.com
Permanent Link (DOI): https://doi.org/10.25549/usctheses-c89-344258
Document Type: Dissertation
Rights: Cahn, Lindsay
Type: texts
Source: University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection)