Running head: CREATING “EXCELLENT” LEARNING EXPERIENCES
1
CREATING “EXCELLENT” LEARNING EXPERIENCES: A GAP ANALYSIS OF
A UNIVERSITY EXTENSION PROGRAM
by
Ing Phansavath
_____________________________________________________________________
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
August 2015
Copyright 2015 Ing Phansavath
ACKNOWLEDGEMENTS
The journey of completing this doctoral program, and writing this dissertation, would not
have been possible without the support, understanding, and guidance of key individuals in my
life. I am blessed to have family, friends, and mentors who constantly encouraged me and
believed in me even when work, school, and life in general presented challenges that seemed
insurmountable, and required more hours than the day could give.
First, to my family and friends for understanding that I would be a hermit for the duration
of this program. You were my constant cheerleaders, and I thank you for believing in me. I
would also like to acknowledge my work family, a great team of people who stepped up and
helped me to manage the workload. Thank you for allowing me to retreat and focus on school.
To my advisor, Dr. Rob Filback, who gently pushed and prodded, especially when I
would get frustrated at not being able to work through a writing block. His upbeat attitude, and
can-do approach, provided the motivation I needed to continue on the journey. To all the faculty
members who taught in the Global EdD program – thank you for imparting your knowledge, and
challenging us to think beyond what we thought we knew. I learned more than I could have
imagined, and appreciated the care that was taken in designing this program and ensuring that we
had all that we needed to succeed.
Finally, a big thank you to my fellow cohort 2 members, aka the KMO-fos, for just being
awesome. The friendships that developed are lifelong, and without these wonderful, funny, and
brilliant people, the experience would not have been as great. Special thanks to Dulcie for the
constant check-ins and for being my partner in crime for the last two years. “What’s the point?”
was a running joke, but the point is where we are now, one step closer to being the educational
leaders we imagined ourselves to be.
TABLE OF CONTENTS
Acknowledgements 2
List of Tables 5
List of Figures 6
Abstract 7
Chapter 1: Overview of the Study 8
Introduction of Problem 8
Organizational Context and Mission 9
Organizational Problem 10
Related Literature 11
Importance of Problem 14
Organizational Goal 15
Stakeholders and Stakeholders’ Goals 16
Stakeholder for the Study and Stakeholder Performance Gap 18
Purpose of the Project 19
Methodological Framework 20
Organization of Dissertation 20
Chapter 2: Review of the Literature 21
Factors Impacting Demand for Postsecondary Education 22
Continuing Education 28
Quality and Excellence in Postsecondary Education 32
Understanding Learning and Designing Learning Experiences 36
Barriers to Achieving Excellent Learning Experiences 40
Summary and Conclusion 43
Chapter 3: Methodology 45
Purpose of the Project and Questions 45
Framework for the Study 45
Assumed Causes of the Performance Gap 46
Validation of the Causes of the Performance Gap 50
Participants 62
Procedures 62
Data Collection 63
Data Analysis 64
Chapter 4: Results and Findings 66
Results and Findings for Knowledge Causes 66
Results and Findings for Motivation Causes 82
Results and Findings for Organization Causes 89
Summary 98
Chapter 5: Solutions, Implementation and Evaluation 103
Validated Causes Selection and Rationale 103
Solutions 106
Implementation Plan 118
Evaluation Plan 123
Strengths and Weaknesses of the Approach 127
Limitations 128
Future Research 129
Conclusion 130
References 132
Appendices 141
Appendix A: Survey 141
Appendix B: Interview Protocol for University Extension BLP Instructors 144
LIST OF TABLES
Table 1. Organizational Mission, Global Goal, and Stakeholder Goals 18
Table 2. Summary of Sources about Assumed Causes for Knowledge, Motivation, and Organizational Issues 49
Table 3. Summary of Assumed Knowledge Causes and Validation 53
Table 4. Summary of Assumed Motivation Causes and Validation 57
Table 5. Summary of Assumed Organizational/Culture/Context Causes and Validation 60
Table 6. Comparison of Syllabus Components from Survey Results to Document Analysis 73
Table 7. Summary of Results and Findings for Assumed Knowledge Causes 81
Table 8. Summary of Results and Findings for Assumed Motivation Causes 88
Table 9. Summary of Results and Findings for Assumed Organizational Causes 97
Table 10. Assumed Causes and Finding Summary Table 101
Table 11. Validated Causes Summary Table 105
Table 12. Findings, Solutions, Implementation, and Learning Organization Activity 121
LIST OF FIGURES
Figure 1. Gap analysis process 46
Figure 2. Responses to survey question assessing instructor knowledge about syllabus components 68
Figure 3. Results of document analysis of syllabus components 71
Figure 4. Responses to survey question about defining the relationship between learning objectives, assessments, and teaching strategies 76
Figure 5. Responses to survey question about methods to assess effectiveness of course design 79
Figure 6. Responses to survey questions regarding self-efficacy 83
Figure 7. Self-reported number of hours instructors spend per quarter updating courses 84
Figure 8. Responses to survey question regarding incentives for instructors to improve course quality 91
ABSTRACT
This study utilized the Clark and Estes gap analysis to investigate the knowledge, motivation,
and organizational barriers that 330 Business and Legal Programs Department instructors face in
developing courses that create “excellent” learning experiences at University Extension, a
nonprofit, university continuing education institution. Assumed causes of the knowledge,
motivation, and organizational barriers were generated from related literature, learning and
motivation theories, and personal knowledge. The analysis of this qualitative case study
validated nine causes that led to five recommended solutions. To support instructors in their
efforts to produce courses that create “excellent” learning experiences, University Extension
should define excellence for the University Extension context, allocate resources to establish a
comprehensive instructor development program, establish instructor learning communities,
establish measures of excellence, and establish a clear system of rewards and incentives. The
solutions introduce activities that move University Extension closer to becoming a learning
organization that supports professional learning. An implementation and evaluation plan is also
proposed.
CHAPTER 1
OVERVIEW OF THE STUDY
Introduction of Problem
Postsecondary education is facing an environment of change. A variety of economic,
political, and social trends are impacting postsecondary education, and these factors are
increasing the pressure on postsecondary education institutions to ensure the quality of their
programs (Huisman & Currie, 2004; Torres & Schugurensky, 2002). The world economy has
increasingly become more knowledge-based, with 90% of the fastest-growing jobs requiring
highly skilled workers who have high levels of educational attainment (U.S. Department of Education
[USDOE], 2006). Because the workforce requires more specialized skills, the role of
postsecondary education in workforce development has grown in importance (Huisman &
Currie, 2004; Torres & Schugurensky, 2002). This shift to a knowledge-based economy is
increasing the demand for postsecondary education, creating a competitive marketplace that
requires institutions to be accountable in providing quality education (USDOE, 2006; Torres &
Schugurensky, 2002). Many of the students associated with this growing demand are
nontraditional students who turn to continuing education institutions that align courses and
curriculum with the needs of local industry, to develop the skills that the current economy seeks
(Pusser, Gansneder, Gallaway, & Pope, 2005; Röbken, 2009).
Continuing education institutions are revenue-driven, market-oriented businesses, so they
must meet these students’ expectations of quality and ensure student satisfaction in order to build
and maintain a reputation that will continually drive student enrollment, especially with
increased competition in the continuing education space (Pusser et al., 2005; Röbken, 2009;
Lauzon, 2013). This context concerns the study site of this research, henceforth referred to as
“University Extension,” because as a financially self-sustaining continuing education institution,
programs that do not create learning experiences meeting students’ expectations translate
into a potential loss of revenue. Consistent enrollment revenue is crucial for continued
operation at current levels.
Organizational Context and Mission
Located in an affluent region of Southern California, University Extension was founded
in the early 20th century, and is the continuing education school of a large, public research
university. Though the main campus is a public university that receives state funding, University
Extension operates as a self-sustaining institution that receives no financial support from the
main campus, and depends entirely on student tuition for financial sustainability. At the time of
writing, University Extension was undergoing a strategic planning process, and the new mission
of the organization was being determined. The former mission centered on providing opportunities
for adult learners to achieve their personal and professional goals. The new mission reflects the
changing environment of higher education, and refocuses University Extension on creating
learning experiences that are memorable and impactful to students. The learning experiences
should embody the University Extension value of academic excellence, though the standards for
excellence were not clearly defined at the time of writing.
University Extension strives to achieve its current mission by offering a large variety of
courses in both in-person and online formats. University Extension has six programming
departments: the English Language Center, Arts, Education, Engineering, Humanities
and Sciences, and Business and Legal Programs (BLP). The subject of this research is BLP
because it is one of the largest departments in terms of number of courses offered, total
enrollments, and total revenue.
According to internal documents and records, in the 2012-2013 academic year, University
Extension conducted approximately 5,300 classes, served 38,500 students, and recorded over
90,000 total enrollments. Eighty-eight percent of University Extension students have earned a
baccalaureate or higher degree, and take courses at University Extension for career enhancement
and growth purposes. Students range in age from 18 to 80, with a mean age of 36 and a modal age of 24. Of
the total University Extension numbers, BLP accounted for 22% of classes conducted, 23% of
total enrollments, and approximately 22% of total revenue for the 2012-2013 academic year.
BLP instructors are full-time, working professionals who teach on a part-time, contract basis.
They are hired for their content expertise and not expected to be trained teachers. University
Extension offers an Instructor Development Program (IDP) to prepare instructors to teach.
Organizational Problem
The Dean of University Extension, during a town hall presentation in early 2014,
identified a number of changes affecting higher education, including the growth of the
knowledge-based economy, the increased competition in the postsecondary education space, and
demands for performance and accountability that are leading institutions to move from being
faculty-centered to student-centered. Amidst all this change, the Dean called on University
Extension to meet the forthcoming challenges by refocusing on ensuring a student learning
experience that embodies the institutional value of excellence.
“Excellence” is a University Extension value that is articulated in the vision, mission, and
values statement. BLP strives to achieve “excellence” in all courses, however the institution
lacked clear, guiding principles of excellence as well as robust measures to evaluate the quality
of student learning experiences. The only institutional measure used to gauge “excellence” was
the summative student evaluation administered at the end of each course. The evaluation has
fifteen questions, asking about course quality, instructional quality, course materials, and guest
lecturers. In addition to the nine-point Likert scale score, the evaluation provides students an
opportunity to leave comments. Of a possible 9.0 on the University Extension end-of-course
summative survey, the BLP average score was 7.5. From the administrative standpoint, any score
that falls below a 7.0 is a cause for concern, and that score is considered when making hiring
decisions for future quarters. From a brief scan of Fall 2013 BLP evaluations, some
comments expressed disappointment with the quality of instruction and the overall course. Brief
interviews with instructors also indicated that some of them had difficulty teaching in a
diverse, adult learning environment, and may have had challenges engaging all students.
Additionally, the department lacked a comprehensive process to measure quality and ensure
excellence. The student evaluation scores, student and instructor comments, and the
lack of monitoring and evaluation processes represented a gap in meeting the
institutional value of, and students’ expectations for, excellence in learning experiences. The
discrepancy gap analysis was used to frame this problem.
Related Literature
The knowledge-based economy is demanding more specialized skills that require higher
levels of educational attainment than in previous generations (USDOE, 2006). Technological
advancements have changed the nature of modern work, requiring workers to be skilled at
integrating academic and technical concepts for solving real-world problems (Bragg, 2001). As
the global economy becomes more integrated and competitive, governments are pressured to
focus on their higher education systems to ensure that education is aligned with the needs of the
highly specialized job market (Torres & Schugurensky, 2002).
Education is a growing priority for governments and policymakers because of the
belief that a highly skilled, well-educated workforce will enhance economic
growth (Liu, 2009). This perspective puts the onus of workforce development on postsecondary
institutions, and students are compelled to invest in their education to increase their marketability
in the job market (Torres & Schugurensky, 2002). Education is the competitive edge for career
growth, so students are looking for assurances that the time and money they spend on higher
education is a worthwhile investment (Huisman & Currie, 2004). Additionally, because the job
market requires higher skills, more and more people are seeking additional education. In the
2012-2013 academic year, enrollment at University Extension increased by about 5% over the
previous year, and 2013-2014 enrollment is forecast to increase by about the same amount.
The National Center for Education Statistics (NCES, 2013) predicts that participation in
postsecondary education in the United States will increase by about 20% in the next ten years.
To meet this demand, more institutions are joining the market, turning the postsecondary
education landscape into a competitive marketplace where students are consumers of
education. As consumers, the students expect that the investments they make in their education
will generate returns in terms of quality education that prepares them for employment (Davis,
2003; Huisman & Currie, 2004; White & Glickman, 2007).
In the United States, there has been an increase in the number of nontraditional students
who are returning to school in order to gain more specialized skills that fit the needs of the
knowledge-based workforce (NCES, 2013). Continuing education and professional development
institutions, like University Extension, offer opportunities for adult learners to gain relevant,
practical skills and knowledge that can be immediately applied to the job (Cervero, 2001).
Continuing education institutions are typically aligned with local industry needs, so they fill
some of the space for workforce development within the post-secondary education landscape
(Jarvis, 1996). However, because of the growing number of institutions offering continuing
education and professional development programs, the market has become extremely
competitive, and all institutions need to ensure that students are satisfied with the quality of
education and outcomes (Breneman, 2005).
As higher education’s role in workforce development grows and students are
compelled to invest in education, they expect high quality and returns on their investment.
Postsecondary education institutions are instituting accountability measures in order to assure
students that they are investing in a quality product (Huisman & Currie, 2004; Torres &
Schugurensky, 2002). For market-driven institutions, building a reputation on quality helps to
attract potential students, thus ensuring high quality programs is a priority (Liu, 2009).
Continuing education institutions are especially sensitive to students’ needs because many are
structured to be revenue-generating institutions, so student satisfaction becomes a high priority
(Breneman, 2005).
As students shift from being learners to customers, their perception of excellence is the
determinant of quality (Browne, Kaldenberg, Browne, & Brown, 1998). Consumer-oriented
approaches to quality assurance are led by market-driven systems, so attaining the highest level
of student satisfaction is imperative for post-secondary institutions that have market-based
structures that are dependent on student tuition for financial viability (Liu, 2009).
Defining “excellence” and “quality” is a difficult task in higher education because the
context and purpose of the institution influence how the terms of “excellence” and “quality” are
understood (Krause, 2012). Measuring excellence within an educational setting is difficult
because of the many inputs mixed with individual students’ expectations, but ensuring quality
will more likely influence satisfaction positively (Browne et al., 1998). Students identify
academic dimensions such as faculty teaching as the best indicators of quality; therefore, when
institutions focus on customer satisfaction, the dimensions that the students identify as most
important must be the main focus for improvement. Emphasizing customer service “reflects the
principle that the customer is the final arbiter of quality” (Browne et al., 1998, p. 2).
Student evaluations are one tool used to measure student satisfaction and
teaching quality (Marsh, 2007). The use of students’ evaluations of teaching is a common method
of determining quality, and is an increasingly important factor in feedback to faculty, personnel
decisions, quality assurance, and information to other students for course and instructor selection
(Marsh, 2007). One of the main factors that students cite as important to their satisfaction is
instructional effectiveness (Elliott & Healy, 2001). Instructors have a major impact on students’
learning experiences and they must be knowledgeable about how to design and deliver courses
that produce learning experiences that meet students’ needs (Fink, 2013).
The adult learning theory of andragogy can be used as a guide to structure learning
experiences that are appropriate and effective for continuing education students (Knowles,
1980). Additionally, Merrill’s (2002) first principles of instruction also inform the design
of learning experiences that are impactful to students. Fink (2013) offers a framework on how to
create significant learning experiences that go beyond merely laying foundational knowledge.
Through a significant learning experience, students should gain not only content knowledge, but
also personal development that will positively impact their lives (Fink, 2013).
Importance of Problem
Higher education has become a competitive marketplace, with students seen as
“customers” who are expecting returns on their investment (Hanushek & Rivkin, 2006; Huisman
& Currie, 2004). Due to the growth of the knowledge-based economy, there has been an
emergence of nontraditional learners, prompting a growth in institutions that provide continuing
education and professional development programs (Pusser et al., 2005). This growth has flooded
the postsecondary education marketplace, and institutions must figure out how to be competitive.
Providing evidence of instructional and course quality helps to show students that they
are making a worthy investment in the institution for their postsecondary educational needs
(Hanushek & Rivkin, 2006). In the context of adult continuing education institutions, a student
learning experience that embodies the institutional value of excellence is important to ensure that
students will return to take additional classes. Satisfied students also provide vital
word-of-mouth marketing that is important for the institution to gain new students.
University Extension needs to continually provide “excellent” learning experiences to
remain the leader in continuing education and professional development. Though University
Extension actively recruits new students, internal University Extension marketing research
studies show that many students indicate that they take courses at University Extension because
of recommendations from someone they know. Positive word-of-mouth marketing is vital to
University Extension’s business, so it is imperative to be accountable to students for quality
education, and to assure them that they are receiving the expected returns on their investment in
education.
Organizational Goal
The strategic planning process that University Extension went through yielded a new
vision for the organization. University Extension’s new vision directs the institution to be a
driver of change in businesses, communities, and individuals’ lives by being a collaborative
partner and utilizing new educational technologies to expand access and reach. In order to
achieve this vision, University Extension must ensure programs and course offerings are of the
highest quality, and that the new institutional mission of creating “excellent” learning
experiences is met. The Dean expects each programming department, including BLP, to ensure
quality and excellence. BLP used only end-of-term student evaluations to measure
course quality, student satisfaction, and teaching effectiveness. For the 2014-2015 five-year
program reviews, BLP submitted an average student evaluation score of 7.5. By the next round of
five-year reviews in 2019-2020, the goal is to improve the quality of course design in order to
set the foundation for “excellent” learning experiences. An increase in the average student
evaluation score would be an indicator of improvement signifying students feel that University
Extension delivered high quality programs. Additionally, new indicators are to be created to
better measure instructional quality and the achievement of “excellent” learning experiences for
students.
Stakeholders and Stakeholders’ Goals
The stakeholders at University Extension include students, instructors, and staff. Each
stakeholder group plays a distinct role in University Extension’s efforts to achieve its goal of
creating “excellent” learning experiences. Students are the stakeholders who ultimately judge the
ability of University Extension to reach its organizational goal, so their input is critical in
directing the institution on how to achieve the goal. However, instructors and staff also play
critical roles in achieving the organizational goal because they are responsible for designing,
implementing, and delivering the product to the students.
In BLP, the instructors are working professionals who are experts in their field. Most of
them have graduate degrees and at least 10 years of professional experience. Those without a
graduate degree usually have over 20 years of professional experience and are senior level
executives in their field. However, BLP instructors are not required to be trained teachers. The
main criterion in hiring is expertise in the field of study. University Extension offers an Instructor
Development Program (IDP) that provides workshops and seminars for new instructors, so not
having teaching experience would not automatically eliminate a potential instructor from the
hiring pool.
The BLP staff stakeholders include senior management and administrative personnel who
are responsible for working with instructors to design and implement courses and programs.
Senior management includes one department director, three program directors, and four program
managers. Administrative personnel include six administrative assistants, ten program
representatives, and two student advisors. The directors work directly with instructors to design
courses and programs, but the program representatives, under the direction of the directors,
implement the delivery of courses in coordination with the instructors. The student advisors, also
under the purview of the directors, interface with students and provide guidance for program and
class selection, as well as counseling for any academic related issues.
Table 1 identifies the key stakeholders and outlines their goals as related to the
overarching organizational goal.
Table 1
Organizational Mission, Global Goal, and Stakeholder Goals
Organizational Mission
University Extension’s mission is to ensure students have excellent learning experiences.
Organizational Global Goal
By the 2019-2020 program review date, the University Extension Department of Business and
Legal Programs will improve quality of course design to strengthen the foundation for
excellent learning experiences as indicated by an increase in average student evaluation scores,
and additional new tools and measures to evaluate students’ learning experiences.
Instructors
By June 2016, instructors will
generate course curricula that
are driven by learning
outcomes and aligned key
assessments.
Program Directors
By May 2016, the Program
Directors will design and
implement an appropriate
course monitoring and
evaluation process that
measures student learning
experiences.
Student Advisors
By May 2016, the Student
Advisors will ensure that
100% of new certificate
program students have study
plans that articulate student
learning outcomes and
strategies for self-regulation.
Stakeholder for the Study and Stakeholder Performance Gap
Though each stakeholder group contributes in different ways to the organizational goal of
achieving excellence, the critical group for this study is the instructors. Instructors design their
courses, deliver the instruction, and interact directly with students, and the feedback provided
from students stems from their experience in the classroom. Since students indicate that class
instruction is a main indicator of course quality (Browne et al., 1998), it is critical to work with
instructors to improve the quality of instruction in order to raise the student evaluation scores,
and move toward creating an excellent student learning experience. If instructors do not
meet students’ expectations in the classroom, students will not have a memorable learning
experience, and they may choose to attend another continuing education institution, and will not
recommend University Extension to other potential students.
From a brief scan of course syllabi, only about 40% of instructors had created course
syllabi that were driven by learning outcomes and aligned with key assessments. The goal was to
have all instructors proficient in designing curricula with clear learning goals, appropriate
teaching and learning activities, and aligned feedback and assessment (Fink, 2013), so that
represented a gap of 60%.
Purpose of the Project
The purpose of this study was to conduct a gap analysis to examine the root causes of
why instructors were not designing courses that created “excellent” learning experiences, as
evidenced by student evaluations scores for University Extension BLP courses that average 7.5
out of a possible 9. The analysis focused on causes for this problem due to gaps in the areas of
knowledge and skill, motivation, and organizational issues. The analysis began with generating a
list of possible or assumed causes and then by examining these systematically to focus on actual
or validated causes. While a complete gap analysis would focus on all stakeholders, for practical
purposes the stakeholder group focused on in this analysis was the instructors.
The following questions guided this study:
1. What are the knowledge, motivation, and organizational causes that are barriers to
instructors achieving their goal of creating curriculum that produce “excellent”
learning experiences and increasing student satisfaction?
2. What are the knowledge, motivation, and organizational solutions to those barriers?
Methodological Framework
Clark and Estes’ (2008) gap analysis methodology was implemented in order to research
the potential causes of and explore potential solutions for the lower than desired average student
evaluation scores. Clark and Estes’ (2008) gap analysis is a systematic, analytical method that
helps to clarify organizational goals and identify the gap between the actual performance level
and the preferred performance level within an organization. Personal knowledge and related
literature formed the base used to generate assumed causes for the performance gap. These
causes were validated using surveys, interviews, literature review, and document analysis.
Research-based solutions were recommended and evaluated in a comprehensive manner.
Organization of Dissertation
Chapter 1 introduced the problem statement and described the specific problem that this
study will address. It provided a background of the study, discussed the organization, introduced
the stakeholders, and identified the goals of the study. Chapter 2 will review relevant literature
and present theories and research related to the problem associated with the study. The research
methodology and data collection and analysis procedures will be discussed in Chapter 3.
Chapter 4 will contain an analysis of the research results and identify validated causes of the
problem. Chapter 5 will synthesize the findings, discuss solutions and implementation plans, and
propose an evaluation plan for measuring the effectiveness of the proposed solutions.
CHAPTER 2
REVIEW OF THE LITERATURE
Postsecondary education is being impacted by a number of different factors that are
creating conditions in which increased demand, competition, and accountability are pushing
institutions to examine the quality of their programs and courses. These factors include the
growth of the knowledge economy, the increasing value of education, the growth of the non-
traditional student population, and the increased pressures of accountability from stakeholders
that postsecondary education institutions must face. The combination of these factors is pushing
postsecondary education institutions to ensure that their educational offerings meet the needs of
the larger community and the quality expectations of students.
For financially self-sustaining continuing education institutions, student satisfaction is
of paramount importance for retention and for positive word-of-mouth marketing. If students are
unhappy with their experience, they can easily take their business elsewhere. Thus, for
institutions that are tuition dependent, providing an excellent educational experience becomes a
high priority.
The following literature review will discuss how the factors impacting postsecondary
education are increasing demand and accountability. After setting the context of the forces
creating change in the postsecondary education landscape, the continuing education space will be
explored, specifically looking at the role of continuing education institutions, their marketplace
orientation, and the imperative for creating learning experiences that meet the needs and
expectations of students. Then, a discussion about learning and designing learning experiences
will follow. The chapter concludes with a discussion of the gap analysis and the elements that
frame this study of creating excellent learning experiences.
Factors Impacting Demand for Postsecondary Education
The postsecondary education space has become crowded as demand for, and access to,
postsecondary education increases. According to the NCES, in 2011 there were 4,076 degree-granting
institutions. Enrollment in degree-granting institutions increased by 32% between 2001
and 2011, with enrollments by students under age 25 increasing by 35% and enrollments by students
age 25 and older increasing by 41% (NCES, 2013). A number of factors contribute to this growth in
demand for postsecondary education.
The increasing demands for higher levels of education derive from the changing world
economy, which is increasingly knowledge-based. The globalizing nature of the world economy
fuels competition amongst nations, and governments and industry are looking to their higher
education systems to produce skilled workers who will help economies compete. As more jobs
require more education, the value of education is increasing, and more nontraditional students
are returning to school in hopes of increasing their earnings potential. This influx of students has
spurred the growth of postsecondary education institutions, creating a competitive market.
Tierney and Hentschke (2007) aptly summarize the situation, stating that “In a knowledge
economy, education is a growth industry” (p. 19). However, as the industry grows, the measures
for accountability are also increasing, putting pressure on postsecondary institutions to ensure
that their educational offerings are of the highest quality.
Growth of the Knowledge-based Economy
There are a number of interconnected forces that are driving the growth of the
postsecondary education market. One main factor in the increased demand is the evolution of the
world economy into a knowledge-based economy, with 90% of jobs requiring at least some
postsecondary education (USDOE, 2006). The world is entering the Information Age, and as
technology permeates almost every sector, jobs become more knowledge-based,
requiring higher levels of education than previous generations (Bragg, 2001; Carnevale &
Desrochers, 2002).
The new, knowledge-based economy is demanding a workforce that is educated and
more skilled in technology and higher-level critical thinking (Bragg, 2001; Carnevale &
Desrochers, 2002; Torres & Schugurensky, 2002; USDOE, 2006). Bragg (2001) explains that a
new vocationalism developed from the growth of the economic sectors in business, health care,
and technological fields, with jobs in those areas all requiring completion of some postsecondary
education. Technology is one main reason for the growth of the knowledge-based economy. The
proliferation of technology across all economic sectors is producing an information age that
necessitates workers with specialized technological skills, which require at least some
postsecondary education (Carnevale & Desrochers, 2002; Torres & Schugurensky, 2002).
This education requirement increases the value of education, as those with even some
postsecondary education earn more than those without. Additionally, nations recognize that their
economic competitiveness is contingent upon having an educated workforce, so governments are
leaning on postsecondary institutions to produce graduates who are prepared to contribute to the
knowledge-based economy (Liu, 2009; USDOE, 2006).
The Increasing Economic Value of Education
In order to remain competitive in the global economy, governments and employers are
looking to postsecondary education institutions to help develop the workforce. Liu (2009) asserts
that education is a growing priority for policy makers because “it is also believed that economic
growth and performance will be enhanced by producing more highly skilled human capital and a
well educated workforce” (p. 213). A report from the U.S. Department of Education commission
on the future of higher education (USDOE, 2006) acknowledges that though the US is home to a
large share of the world’s best universities, other countries are following suit and investing in
the education of their citizenry in order to increase their human capital and gain a competitive
edge. The report contends that higher education is a key factor in the human capital needed to
sustain economic growth and increase workforce productivity.
In addition to governments leaning on postsecondary education for human capital
development, employers are also looking to external sources for workforce training. As the labor
market shifts and workers are less likely to develop their careers with one employer, instead
moving between employers within the same industry, employers are seeking employees who
already have some training or certification (Carnevale & Desrochers, 2002). The increasingly
transient nature of employees makes the investment in internal training less valuable, so the
value of education from reputable external sources, such as postsecondary education institutions,
increases (Carnevale & Desrochers, 2002; Jarvis, 1996).
The economic value of education has increased due to shifts in the labor market.
Carnevale and Desrochers (2002) identify two types of shifts, which include the creation of new
knowledge-based jobs, and the increasing requirement for postsecondary education across all
jobs, even those that previously did not require advanced education. As jobs in the white-collar,
knowledge-based sectors grow, low-wage service jobs are not growing and blue-collar
factory jobs are decreasing, indicating that fewer jobs are available for those with only a
secondary-level education (Carnevale & Desrochers, 2002). This has led to higher demand for
postsecondary education, and increased numbers of educational institutions trying to meet that
need.
Educational qualifications are an increasingly valuable currency in the job market (Jarvis,
1996). The growing demand for more educated workers is also increasing the wage differential
between workers who have postsecondary education and those who do not (Carnevale &
Desrochers, 2002; Kena et al., 2014). Carnevale and Desrochers (2002) succinctly sum up the
situation by asserting that “Good pay and benefits are linked to postsecondary educational
attainment, achievement, training, and technology” (p. 17). Kena et al. (2014), in their NCES
report, compared young adults across educational attainment levels and found that annual
income increased with each level of educational attainment. The report also illustrates the
growing gap in earnings, with those holding a bachelor’s degree earning twice as much as those
without (Kena et al., 2014). The knowledge economy has increased the value of education, and workers
must achieve these higher levels in order to be relevant to the workforce. This increased value of
education is increasing the demand for education, particularly among nontraditional students.
Growth of Nontraditional Student Population
Nontraditional students represent the fastest growing sector of the student population, and
according to recent trends, this student population will continue to grow (NCES, 2013). The US
Department of Education defines nontraditional learners by one or more of the following
characteristics: delayed enrollment in postsecondary education after high school, part-time
enrollment in a postsecondary program, classification as financially independent for financial aid
purposes, having dependents other than a spouse, being a single parent, or lacking a high school
diploma (Choy, 2002). A majority of enrollees in continuing education programs are nontraditional students per
the US Department of Education definition (Pusser et al., 2005).
Many of the nontraditional students are current working professionals looking for courses
and programs to update their skills. As the ratio of older people increases in the American
workforce, there is a need for this population to continually update their skillsets in a
technological environment that is rapidly changing (Röbken, 2009). This increased demand by
the nontraditional student population has led to an increase in the number of postsecondary
institutions. Specifically, this growth of nontraditional learners is a boon for continuing
education institutions whose demographic is the working professional and adult learner (Pusser
et al., 2005).
Growing Demand for Accountability
The growing accountability from external stakeholders is another factor greatly impacting
postsecondary education institutions. Huisman and Currie (2004), in their review of
accountability in higher education, summarize the literature and conclude that
accountability is an obligation of institutions to justify their performance. Accountability is also a
mechanism that can be used as a regulatory device, to legitimize institutions, and as a method of
quality assurance (Huisman & Currie, 2004). Of the types of accountability that exist, Huisman
and Currie believe that the two types most applicable to higher education are professional and
political accountability. The difference between the two types is the source for the standard of
performance. Professional accountability allows for autonomous individuals to make decisions
based on internalized professional norms, and political accountability includes an external body
to which the institution must be accountable (Huisman & Currie, 2004).
Huisman and Currie (2004) identify four trends that have impacted accountability in
higher education. The first trend is the changing relationship between governments and
universities, with market forces exerting more influence, and government moving in the direction
of regulating higher education (Huisman & Currie, 2004). In the United States, Orosz (2012)
points to the economic downturn and shifting of politics to the conservative right during the
1980s as the point where higher education budgets were cut, and tuition at public institutions
started to rise. Concomitant with the rise of tuition was the rise of demand from students,
parents, and taxpayers for increased productivity from higher education institutions, leading to
the second trend that was impacting accountability practices (Orosz, 2012).
The second trend that Huisman and Currie (2004) identify is the expectation that
students, parents, and government have for increased efficiency and value for money. The
demands for higher efficiency and value stem from the massification of higher education and the
constraints this puts on government budgets, as well as students being more critical of quality
because of the increasing cost of higher education (Huisman & Currie, 2004). Austin and
Sorcinelli (2013) state, “across the country, both public and private higher education institutions
are facing increasing expenses, coupled with concerns from parents, students, legislatures, and
the general public about rising costs and greater demand for accountability” (p. 86). The authors
cite the growing demand for accountability as a cause for administration to focus on strategic
initiatives that will demonstrate the benefits and quality of education, including initiatives for
faculty development that will help faculty become more effective in the classroom.
Third is the internationalization of higher education and globalization, which is allowing
foreign institutions and businesses to enter the local market and raising questions as to where
accountability lies (Huisman & Currie, 2004). As education internationalizes and becomes more
available, the business of education is growing and programs and courses are seen as
commodities that students can buy (Merriam & Bierema, 2013). Merriam and Bierema (2013)
identify globalization as a force in making education “a commodity of the marketplace” (p. 3).
Finally, the rapid development of information and communications technologies is facilitating
globalization and blurring the traditional boundaries of higher education because institutions are
no longer bound by geographic location in their ability to expand and reach audiences across
borders (Huisman & Currie, 2004). Continuing education is one sector of the postsecondary
education landscape that has embraced the use of information and communications technologies
to make education more accessible (English & Mayo, 2012), and though the ability to
instantaneously transmit information allows education to be easily accessed, it also raises
questions about the quality of these programs (Huisman & Currie, 2004).
Continuing Education
Continuing education is a term that encompasses a variety of programs. Continuing
education can be offered through the workplace, through professional associations, or through
institutes of higher learning (Cervero, 2001). A majority of continuing education programs are
non-credit or certificate courses that do not lead to an academic degree (Röbken, 2009). For the
purpose of this study, the following discussion on continuing education will focus on nonprofit
higher education continuing education programs, also referred to as university continuing
education programs or university extension. According to English and Mayo (2012),
University continuing education is the term used, mainly in North America and other
parts of the world, to refer to the administrative division within many universities which
offers courses and programs, usually to persons at a distance from the institution. In
North America, the use of the term “continuing education” implies that many of the
students are casual and enrolled in part time courses, though entire degrees may be
offered through continuing education departments. (p. 143)
Though there are many types of continuing education programs providing both vocational and
non-vocational adult education, the common element is that continuing education programs are
accessible to students who are not part of the traditional student cohort (English & Mayo, 2012).
The role of continuing education has evolved and changed over time. The current role of
continuing education is to provide professionals with educational opportunities to grow and
evolve in their field to remain relevant and current (Cervero, 2001; Lauzon, 2013). Continuing
education is seen to play a major role in developing the US labor force (Röbken, 2009).
Continuing education programs traditionally have strong linkages to the local labor markets
because their focus is on building vocational skills and increasing the credentialing of working
adults (Jarvis, 1996; Pusser et al., 2005). Additionally, as public universities are under greater
pressure to play a larger role in economic development, more university continuing education
programs are collaborating with business to design programs (Cervero, 2001). Jarvis (1996)
argues that continuing education programs, due to their market and workforce development
orientation, are more responsive to labor market needs than traditional colleges and universities.
Technological advances are also changing rapidly, requiring workers to constantly update
their skills and companies are outsourcing that training to continuing education institutions
(Jarvis, 1996). Employers are turning to continuing education institutions to provide workforce
training because workers are not building their career with one employer, but rather, moving
between employers over the course of their career (Carnevale & Desrochers, 2002; Jarvis, 1996).
This leads to a decrease in the value of in-house training because employers do not expect
employees to stay long enough to make the return on investment worthwhile, often
expecting that new employees will have had the relevant education and training (Carnevale &
Desrochers, 2002). This supports the idea that educational qualifications serve as currency in the
job market, making education a marketable commodity (Jarvis, 1996). Continuing education
institutions are poised to offer that commodity.
Marketplace Orientation of Continuing Education Institutions
University extension programs were originally conceived as community education units
that served to fulfill the university’s commitment to public service (Lauzon, 2013). With the
emergence of the global economy and growth of knowledge society, these university extension
programs assumed a significant role in the training and development of participants in the
increasingly competitive economy, thus ushering in an adoption of a business orientation and
market model to operate university extension programs (English & Mayo, 2012; Lauzon, 2013).
Continuing education institutions are typically not financially supported by the main campus,
and are required to be revenue-generating, profit-centered organizations (Breneman, 2005;
English & Mayo, 2012; Pusser et al., 2005; Röbken, 2009). For American higher education
institutions, continuing education programs are a source of additional revenue, and are generally
geared towards the competitive market (Röbken, 2009). The financial viability of continuing
education institutions is dependent on student tuition and fees (English & Mayo, 2012; Lauzon,
2013; Pusser et al., 2005).
There are many institutions that offer continuing education programs, providing students
with opportunities to pick and choose an education service provider. The competitive
environment that postsecondary education institutions face makes constant improvement of
educational quality critical because quality is a major factor in student satisfaction and in the
decision to continue taking classes (Browne et al., 1998). Among the main competitors of nonprofit,
public university-based continuing education programs are for-profit universities and colleges,
such as the University of Phoenix (Breneman, 2005). These for-profit institutions target the same
demographic of students as the nonprofit continuing education institutions (Röbken, 2009). The
for-profit institutions are considered efficient, responsive to demand, and entrepreneurial, and
they directly compete for students (Röbken, 2009).
Students as Customers
Students in continuing education programs are typically nontraditional students. Most are
older, part-time, and unable to assume full-time student status (English & Mayo, 2012; Pusser et
al., 2005). According to Röbken (2009), 80% of continuing education programs are intended for
work-related training, and increasingly, companies are not reimbursing employees for the
training, so individuals must bear the expense in order to acquire the necessary training to
advance in their careers. Given the cost of education, and the market orientation of continuing
education programs, students are increasingly seen as consumers of postsecondary education,
especially as costs and competition increase. As education is increasingly viewed as a
commodity (Jarvis, 1996), students, as customers, are starting to expect more from their
investment. Because students are bearing higher costs for education, they are becoming more
critical of the services that are exchanged (Huisman & Currie, 2004). They want to be assured
that they are making a worthy investment. To be competitive, continuing education providers
need to generate positive buzz about their programs, and ensure that students’ expectations of
excellence are met (Breneman, 2005; Browne et al., 1998).
The repeat business of students is important to the financial stability of continuing
education institutions. In their survey of empirical studies on student satisfaction, Elliott and
Healy (2001) conclude that there is a strong relationship between customer satisfaction and the
intent to repurchase. This repurchase intent is crucial for tuition dependent continuing education
institutions as satisfied students will be more inclined to keep paying for additional classes.
Röbken (2009) and Breneman (2005) also found that word-of-mouth marketing is an important
factor in ensuring constant demand for continuing education programs. Browne et al. (1998)
found that student willingness to recommend a program was predictive of overall student
satisfaction. Postsecondary institutions must identify what students value, propose to deliver
that value, and follow through on that intent in order to remain competitive (Elliott & Healy,
2001).
Quality and Excellence in Postsecondary Education
The Dean of University Extension, during an internal meeting he held in early 2014 to
reveal University Extension’s new vision and mission, emphasized that 21st-century
postsecondary education institutions need to be student-centered. This means that the student
experience is paramount, and faculty and staff need to take steps to ensure that students are
having excellent learning experiences. Student satisfaction is also critical for financially self-
sustaining institutions that depend on student tuition and fees, because the retention and
recommendation of current students helps to maintain enrollment levels. This section will first
discuss what quality and excellence mean, then the discussion will turn to how excellence can be
measured, and finally there will be a discussion about the factors that students find important
when they are evaluating a learning experience.
Understanding Quality and Excellence
The literature offers no clear, agreed-upon definition of “quality” or
“excellence.” Hill, Lomas, and MacGregor (2003) concede that, “the notion of quality is not a
simple one; rather it is problematic, contested and multidimensional and requires examination at
institutional, departmental and individual levels” (p. 17). The understanding of quality and
excellence will depend on the context in which those terms are being defined, and the
manifestations will also be different per each context.
CREATING “EXCELLENT” LEARNING EXPERIENCES
33
Krause (2012) describes how defining quality in higher education is an “ill-defined
problem that is under-theorized yet associated with high stakes policy-making and funding,
particularly at the macro national level” (p. 285). The author notes that though there is an
increasing interest in quality, demonstrated by the breadth of literature about quality across all
functions of higher education, there is no consensus around what quality means. The
interpretation of quality is underscored by political undertones, so the purpose and context of
quality must be understood in terms of the larger political and economic environment (Krause,
2012). Often, the discourses on quality, both internal and external to higher education, are
contradictory, leading to policy tensions between stakeholders (Krause, 2012).
Little and Locke (2011) also acknowledge a tension in the discussion about excellence,
specifically around the various notions of excellence, explaining that excellence can be defined
in terms of “positional good for students, as aspirational target for continuous quality
enhancement, a form of reputational advantage for higher education institutions or a means of
achieving governmental and economic goals” (p. 120). These wide-ranging views on excellence
require that institutions determine for themselves how they need to define excellence according
to their particular context. Though the concept of excellence is elusive in a general sense, the
characteristics of excellence can be reflected in defining the terms of quality teaching and
learning (Little & Locke, 2011).
In terms of teaching, Hénard and Roseveare (2012) define quality teaching as “the use of
pedagogical techniques to produce learning outcomes for students” (p. 7). The process of
producing the learning outcomes involves several dimensions, including the effective design of
curriculum, utilizing various learning contexts, integrating feedback, and appropriate assessment
of learning outcomes (Hénard & Roseveare, 2012). Ensuring quality teaching requires
CREATING “EXCELLENT” LEARNING EXPERIENCES
34
coordination at the institutional level with policies and programs that support teaching, at the
program level with efforts to enhance design, content, and delivery of programs, and at the
individual level with initiatives to help teachers become more student-centered and work toward
student improvement (Hénard & Roseveare, 2012).
The challenge in defining quality and excellence can lead to difficulty in determining
measures for quality and excellence (Krause, 2012). Gunn and Fisk (2013) acknowledge that as
the discussions about teaching excellence have grown, the ability to measure impact on student
learning becomes harder. Krause (2012) suggests that there is a need to utilize sensitizing
concepts as lenses through which to analyze and evaluate the concept of quality because the
construct of quality is complex and multi-layered.
Measuring Excellence
Though continuing education institutions are implementing quality assurance tools, the
concept of quality is not universally agreed upon. Tam (2001) explains that perceptions of
quality are dependent upon the stakeholder’s interest in higher education. She contends that
many institutions have quality measurements that analyze inputs and outputs, but the
performance indicators do not capture the true quality of the student learning experience.
Ramsden (1991) asserts that the economic perspective on education largely influenced the usage
of performance indicators that only measure inputs and outputs. Though performance indicators
may be useful for some administrative purposes, Ramsden does not view them as refined enough
to measure teaching performance.
Students are key stakeholders in continuing education; therefore, their opinions and
satisfaction with the learning experience are valuable to continuing education institutions. The
competition for students leads to a variety of quality assurance methods aimed at ensuring
student satisfaction (Röbken, 2009). In addition to student evaluations, other measures that
continuing education institutions have used include consultative committees, benchmarking, self-
evaluations, student focus groups, and external quality consultation (Röbken, 2009).
In his study of continuing education programs in the United States, Röbken (2009) finds
that the most popular quality assurance instrument is the evaluation form completed by the
student at the end of the course. The results of these evaluations are taken seriously by
instructors and administrative staff, often serving as the basis for decisions on whether
to offer the class again or to extend the instructor’s contract (Röbken, 2009). However, questions
have been raised about the validity and usefulness of student evaluations for measuring teaching
quality and effectiveness. Elliott and Shin (2002) suggest that the traditional method of
assessing satisfaction with a single-item score does not fully capture the student experience.
They propose an evaluation that produces a composite score that incorporates multi-item
attributes, which provides a more accurate and objective reflection of the overall experience.
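Elliott and Shin’s multi-item composite can be illustrated with a small sketch: several attribute ratings are weighted by their importance and combined into a single score. The attribute names and weights below are hypothetical, chosen only to show the arithmetic, not items from any actual evaluation instrument.

```python
# Hypothetical multi-attribute composite satisfaction score, in the
# spirit of Elliott and Shin (2002): each attribute rating is weighted
# by its importance, then the weighted sum is normalized.

def composite_score(ratings: dict[str, float],
                    weights: dict[str, float]) -> float:
    """Weighted mean of attribute ratings (weights need not sum to 1)."""
    total_weight = sum(weights.values())
    weighted = sum(ratings[a] * weights[a] for a in ratings)
    return weighted / total_weight

# Hypothetical attributes and 9-point ratings from one student
ratings = {"instruction": 8.0, "course_materials": 6.0, "feedback": 7.0}
weights = {"instruction": 0.5, "course_materials": 0.2, "feedback": 0.3}
print(round(composite_score(ratings, weights), 2))  # 7.3
```

A single-item overall rating would collapse these distinctions, whereas the composite preserves the relative importance of each attribute to the student.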
Zerihun, Beishuizen, and Van Os (2012) suggest using an evaluation that goes beyond
measuring teaching quality, and instead asks students to reflect upon their learning experience.
The learning experience is created by the teacher’s facilitation and the engagement of students,
so evaluations should include questions that provide opportunities for student self-evaluation of
their learning experience (Zerihun et al., 2012). The proposed evaluation includes four factors:
the quality of assessment and feedback, course organization and presentation, student self-
evaluation, and students’ level of engagement (Zerihun et al., 2012). These factors of the
evaluation reflect the idea that students need to be active in the learning process in order to have
a valuable learning experience (Zerihun et al., 2012).
Understanding Learning and Designing Learning Experiences
The general consensus about learning is that it is a process of change that occurs when a
learner interacts with his or her environment (Ambrose, Bridges, DiPietro, Lovett, & Norman,
2010; Knowles, 1980; Mayer, 2011; Shuell, 2013). Knowles (1980) defines a “learning
experience” as the quality and amount of interaction between learners and their environment.
Knowles specifies that the main role of teachers is to manage the two key variables of
environment and interaction in the learning process. According to Knowles, “the critical function
of the teacher, therefore, is to create a rich environment from which students can extract learning
and then guide their interaction with it so as to optimize their learning from it” (p. 56). To better
understand how learning environments should be created, the concept of learning will first be
discussed, followed by an exploration of andragogy, the leading theory on adult learning, which
provides a starting point to discuss how learning environments should be designed to achieve
excellent learning experiences.
Learning
Ambrose et al. (2010) define learning as “a process that leads to change, which occurs as
a result of experience and increases the potential for improved performance and future learning”
(p. 3). This definition, and variations of it, appear in other conceptualizations of
learning (Knowles, 1980; Mayer, 2011; Merriam & Bierema, 2013; Shuell, 2013). The key
components of learning involve a long-lasting change in a learner’s knowledge, beliefs, behaviors,
and attitudes (Ambrose et al., 2010; Mayer, 2011). The cause of the learning is the learner’s
interpretation and response to their experiences and interaction with their environments
(Ambrose et al., 2010; Mayer, 2011).
Ambrose et al. (2010) also recognize that learning is a developmental process in which
students’ skills, knowledge, abilities, and social and emotional experiences influence how the
students will engage with the learning process. Based on this developmental and holistic
perspective, Ambrose et al. (2010) propose seven principles of learning. These principles of
learning are (1) students’ prior knowledge, (2) students’ organization and application of
knowledge, (3) students’ motivation, (4) acquisition, integration, and application of component
skills, (5) goal-directed practice with targeted feedback, (6) the interaction of student
development and course climate, and (7) self-monitoring and adjusting approaches to learning to
become self-directed learners. These principles can be useful in helping educators
structure learning activities that will motivate and engage students.
Adult Learning Theory
Learning theories provide a framework to understand how people learn, but the various
theories of learning rest on different assumptions, making each suitable for explaining some
learning situations but not others (Shuell, 2013). Students in continuing education programs are adult
learners, so understanding how adults learn is important when considering how learning
experiences should be structured in order to meet the needs of continuing education students
(Cervero, 2001). The most prominent adult learning theory is andragogy, which was introduced
in the early 1970s by Malcolm Knowles. In his book examining adult learners, Knowles (1973)
presented andragogy as a theory to emphasize the concept that adults learn differently than
children. Andragogy is based on the assumptions that adults are self-directed learners, life
experiences enhance learning, readiness to learn is oriented towards the developmental tasks of
social roles, and adults have a problem-centered orientation to learning (Knowles, 1973). Each of
these assumptions has implications for how a teacher should create the learning environment and
guide interaction (Knowles, 1980).
Though Knowles’ introduction of andragogy as a theory of adult learning is significant
for the field of adult education, Merriam (2001) discusses the debates about andragogy as a
theory, and questions if the characteristics of learners as described by andragogy are limited to
adults. Merriam points out that not all adults are self-directed; some depend on the teacher for
structure, while some children are highly independent. Similarly, the motivation to learn is
sometimes extrinsic for adults, whereas some children possess intrinsic motivation to pursue
learning (Merriam, 2001). For each of the assumptions about adult learners, there exist examples
that some adults do not possess the characteristics, while some children do exhibit the
characteristics (Merriam, 2001). Merriam suggests that andragogy is better understood as a set of
guiding principles than as a theory of adult learning, and that its greatest impact is as a guide for
practice and for designing programs for adults.
Designing Learning Experiences
Knowles’ (1980) outline of andragogy includes a number of implications for practice that
are useful to consider when designing learning experiences for adult learners, such as
emphasizing experiential learning and practical application, activating prior learning, and
organizing problem-centered curriculum. Many of these implications are reflected in Merrill’s
first principles of instruction. Merrill (2002) identifies four distinct phases of learning that
align with some of the assumptions of andragogy: (1) activation of prior experience, (2)
demonstration of skills, (3) application of skills, and (4) integration of skills into real-world
activities. The principles of instruction that Merrill presents address all four phases of
learning.
Merrill (2002) describes five principles of instruction that are intended to be prescriptive
in terms of creating learning environments. These principles set the foundation for effective and
efficient instruction (Merrill, 2002): (1) learning is promoted when learners are engaged in
solving real-world problems, (2) learning is promoted when existing knowledge is activated as a
foundation for new knowledge, (3) learning is promoted when new knowledge is demonstrated to
the learner, (4) learning is promoted when new knowledge is applied by the learner, and (5)
learning is promoted when new knowledge is integrated into the learner’s world (Merrill, 2002).
These five
principles promote an approach that incorporates aspects of andragogy, and can be applicable to
designing learning experiences for adult learners.
There are three essential elements of designing programs that Mayer (2011) describes for
effective learning. These elements are understanding how people learn, instructional methods,
and assessments (Mayer, 2011). The relationship among these elements is important in
designing effective learning experiences (Mayer, 2011). Gagné and Merrill
(1990) identify goals as the starting point for instructional design. Sometimes goals are
conceived as learning outcomes or objectives, but regardless of nomenclature, the purpose of the
goals is to have a projected result of learning that will inform the design of the course (Gagné &
Merrill, 1990).
The quality of learning is a key factor in the overall educational experience. In addition to
ensuring that students are prepared for the labor market, excellent learning experiences should
impact the individual. Fink (2013) designed a framework to create significant learning
experiences that defines this type of learning as containing six outcomes, including foundational
knowledge, application, integration, human dimension, caring, and learning how to learn. Going
beyond meeting immediate human capital needs, significant learning experiences allow
instructors to reclaim education as a means to address social issues that go beyond economics
(Gouthro, 2002). Fink (2013) defines learning in terms of change, and significant learning should
leave a lasting change in the learner that is important for the learner’s life.
Fink’s (2013) framework for creating significant learning experiences focuses on
instructional design. Fink presents four components of teaching. These include knowledge of the
subject matter, design of instruction, teacher-student interactions, and course management (Fink,
2013). Of these factors, he identifies instructional design as the bottleneck in the process of
creating an impactful learning experience for students because most instructors do not have
extensive training in designing instruction. They do not possess the requisite knowledge to
understand how to rethink teaching and redesign courses for significant learning outcomes.
In their study, Levine et al. (2008) applied Fink’s taxonomy of significant learning to
redesign courses. They found that Fink’s (2013) model of
instructional design provides a sound and comprehensive model for teachers looking to create
significant learning experiences that go beyond building foundational knowledge. The results
also confirmed that the learning outcome that instructors emphasized created the most noticeable
impact in learning.
Barriers to Achieving Excellent Learning Experiences
In most university continuing education programs, the teaching staff work on a part-time,
contract basis, and are typically not well compensated (English & Mayo, 2012; Röbken, 2009).
As part-time, adjunct faculty, the teachers fall into one of the four categories used to describe
adjuncts. These categories are (1) career enders; (2) specialists, experts, and professionals; (3)
aspiring academics; and (4) freelancers (Tarr, 2010). Most instructors are recruited externally and
have many years of professional experience that is actively used to connect the class content to
real-world working practice (Röbken, 2009). Many instructors teach at more than one institution,
and the financial compensation for teaching is a prime motivation for becoming an instructor
(Röbken, 2009; Tarr, 2010).
The goal for University Extension BLP instructors is to ensure that they are designing
courses that create excellent learning experiences for students. Students should be positively
impacted and changed by the learning experience.
Learning and Motivation Theory
The assumed causes of the gap in achieving the highest evaluation scores can be analyzed
through knowledge, motivation, and organizational frameworks. Knowledge can be categorized
into four dimensions: factual, conceptual, procedural, and metacognitive (Anderson &
Krathwohl, 2001). Motivation can be identified and manifested through active choice,
persistence, and mental effort (Clark & Estes, 2008; Pintrich, 2003). Organizational gaps can be
analyzed through cultural models and settings (Gallimore & Goldenberg, 2001), as well as
considering other factors such as organizational goals and structures (Clark & Estes, 2008).
Knowledge and skills. Instructors may not possess basic, factual knowledge of
curriculum design. They may not conceptually understand the relationships between course
objectives and key assessments, which relate back to understanding the basics of curriculum
design (Gagné & Merrill, 1990). Procedurally, they may not know how to develop course
objectives, nor implement teaching and learning strategies. Gunn and Fisk (2013) identify the
teachers’ abilities to reflect upon their own teaching effectiveness to be demonstrative of
excellence. University Extension instructors may lack metacognitive knowledge of how to reflect
upon their own teaching effectiveness.
Motivation. Instructors potentially do not value the task of improving their teaching
skills. There are no consistent consequences for low evaluation scores, which may decrease the
value of investing in becoming a better instructor (Pintrich, 2003). Instructors are neither making
the choice nor expending the mental effort necessary to consistently update and improve their
courses. Some instructors have completed training and learned about new strategies, but not all
are choosing to implement what they learn. This lack of motivation to update materials and
integrate new engagement strategies is reflected in their lack of choice, persistence, and mental
effort.
Organization. University Extension offers workshops and training through the
Instructor Development Program (IDP) for instructors to learn and improve their teaching.
However, the institution provides no incentives for participation in IDP programs and no
consequences for nonparticipation. There is also weak internal structure and support to foster an
instructor community that promotes a culture of continual assessment and improvement of
teaching. Moreover, there is a limited budget for instructional support, so the Office of
Instructional Enhancement (OIE), which oversees the IDP, is understaffed, and program
representatives, who are the main point of contact for instructors, are underprepared to provide
the guidance instructors may seek.
Assumed Causes from the Review of the Literature
Existing research and literature on instructional quality and instructor engagement reveal
potential causes of the gap in creating engaging curricula that will garner higher evaluation
scores from students.
Knowledge and skills. In university continuing education programs, complaints about
instructors not being prepared to teach adult learners are not uncommon (English & Mayo, 2012).
Instructors may want to achieve higher-level learning goals such as critical thinking, but they
may be unfamiliar with strategies to reach those goals (Fink, 2013). They may be utilizing
techniques that do not align with the end goal, and they are not creating significant learning
experiences for students because they do not know how to create these experiences (Fink, 2013).
Motivation. The lack of accountability that instructors may have to low evaluation scores
may weaken any sense of obligation that instructors feel towards improving their instructional
quality (Burke & Rau, 2007). Huisman and Currie (2004) assert that improved quality depends
on instructors being motivated by intrinsic rewards. Not having clear performance goals that
reinforce the need for instructors to persist and put forth effort can cause instructors to feel
less motivated to invest time and effort in updating course materials and learning new
teaching strategies (Burke & Rau, 2007).
Organization. Though “excellence” in teaching and learning is an institutional value at
University Extension, the organization may not embody the value in its structure and processes
(Fink, 2013). Because the value is not inculcated into the organizational culture, instructors may
not understand what “excellence” means, and therefore the importance of creating and
maintaining “excellence” may be diminished (Schein, 2004). Additionally, Tarr (2010) indicates that
the most serious threat to academic quality is insufficient institutional support for adjunct faculty
members. Without an investment into the support structures for instructors, the effectiveness of
the instructors can be limited (Tarr, 2010).
Summary and Conclusion
The focus on ensuring students have an excellent learning experience is motivated by the need to be
competitive in the post-secondary education market, but the utilization of the significant learning
framework demonstrates that education is more than just meeting the demands of the economy.
As education in the 21st century becomes less faculty-centered and more student-centered,
institutions must take stock of their ability to meet the needs and expectations of students.
However, in order to make the change and become student-centered, instructors need to be
responsible for being teachers who can create excellent learning experiences.
The direct effect of achieving quality learning would be a change in the learner that
positively impacts the learner’s life (Fink, 2013). The indirect impact would be increasing
student satisfaction and the likelihood that the student will take additional courses and contribute
to positive word-of-mouth marketing. It is important to ascertain and address the knowledge,
motivation, and organizational barriers that prevent instructors from being successful in
achieving the goal of creating significant learning. The next chapter will more closely examine
these factors as they pertain to this study.
CHAPTER 3
METHODOLOGY
Purpose of the Project and Questions
The purpose of this study was to conduct a gap analysis to understand the underlying
causes of why University Extension BLP instructors are not creating courses that produce
excellent learning experiences as measured by course evaluations that average a 7.5 out of 9. The
evaluation scores are utilized by administration as an indicator of student satisfaction, and in the
competitive market of continuing education, ensuring student satisfaction is a priority for
continued financial viability. The gap analysis examined the knowledge, motivation, and
organizational gaps that were potential causes of this problem. A list of assumed causes was
generated, and then those possible causes were systematically narrowed to focus on validated
causes. Though a complete gap analysis would study all stakeholders, for practical purposes
instructors were identified as the key stakeholder group, and they were the focus of analysis.
The following questions guided this study:
1. What are the knowledge, motivation, and organizational causes that are barriers to
instructors achieving their goal of creating curricula that produce excellent learning
experiences and increase student satisfaction?
2. What are the recommended knowledge, motivation, and organizational solutions to
those barriers?
Framework for the Study
The Clark and Estes (2008) gap analysis process is a problem-solving framework that
was used to research the assumed causes and explore potential solutions to the challenge of
increasing the average student evaluation scores. This framework provided a systematic,
analytical method to clarify organizational goals and identify the gaps between the actual
performance level and the preferred performance level within an organization. Assumed causes for
the performance gap were generated from personal knowledge, learning theories, and related
literature. Gap analysis utilizes various data collection and data analysis methods. A mixed-
methods approach was used to validate the assumed causes, including surveys, follow-up
interviews, and document analysis. Recommended solutions were research-based and evaluated
in a comprehensive manner.
Figure 1. Gap analysis process
Assumed Causes of the Performance Gap
Clark and Estes (2008) suggest that due to the rapid pace of business in the current
economy, organizations choose and apply change programs quickly, without fully diagnosing the
problem. Solutions are often based on assumed causes, and problems are deemed addressed
when the solution is implemented, but often the solution is a mismatch for the problem (Rueda,
2011). Due to time and social pressures, the time and mental effort required to examine a
problem are not expended, which often leads to overlooking underlying causes of problems
(Clark, 2012). In order to be thorough in analyzing performance gaps, the assumed causes were
drawn from scanning interviews; learning, motivation, and organizational/culture theory; and
review of literature on the specific topic of study.
The assumed causes from learning, motivation, and organizational culture theory and
review of literature were discussed in the previous chapter. The following knowledge and skills,
motivation, and organizational assumed causes for less than desired evaluation scores were
elicited from brief document scans and informal interviews with BLP instructors.
Knowledge and Skills
A brief scan of existing curricula revealed that not all courses had clearly stated learning
objectives and goals, and that syllabi did not show alignment between assessments and the goals
of the course. Scanning interviews with a few instructors also indicated that they were not
familiar with principles of course design or with teaching and learning strategies for adult
learners, and that they underestimated the time they needed to spend on developing and
delivering course materials.
This lack of knowledge was reflected in the course evaluations, where students had commented
on the lack of coherence in class and the instructors’ inability to keep students engaged despite
students’ desire to learn the topic.
Motivation
Instructors were not spending the time necessary to become better
teachers. Informal conversations revealed that they overestimated their abilities and did not
allocate adequate time for course preparations. Instructors are also full-time working
professionals, and spending time to become better teachers is not the highest priority. Some
instructors also indicated a belief that it is not possible to provide a satisfactory learning
experience to all students because of the diversity of the University Extension classroom. The
failure to actively choose to expend time, to persist with their initial interest in teaching, and to
apply effort, stemming from a high sense of self-efficacy and from how they attributed the causes
of student dissatisfaction, all indicated that motivation was a potential problem (Clark & Estes, 2008;
Pintrich, 2003).
Organization
The informal conversations with instructors revealed that they did not have a sense of
community with University Extension. They had a vague idea of the organizational vision and
mission, but the guiding principles were not reinforced because the instructors did not see the
values embodied in their interactions with University Extension. Also, instructors did not have a
strong sense of community with each other. For example, instructors who taught in the same
discipline were not all acquainted with one another, and they did not share with each other their
teaching experiences and best practices. The sense of disconnect from the larger organization
potentially stemmed from feeling relegated to roles confined to the classroom, without
involvement with University Extension beyond it.
Summary
Table 2 provides a summary of the sources of assumed causes categorized as Knowledge,
Motivation, and Organization, determined through the scanning interviews discussed above, the
related literature, and the learning and motivation theories discussed in Chapter 2.
Table 2
Summary of Sources about Assumed Causes for Knowledge, Motivation, and Organizational
Issues

Source: Scanning interviews, personal knowledge
Knowledge: Instructors do not know the basics/components of course design. Instructors are
unable to identify appropriate teaching and learning strategies for adult learners.
Motivation: Instructors are content-area experts and have a high sense of self-efficacy, leading to
underestimation of needed time and ability. Instructors attribute students’ dissatisfaction with
class to the challenges instructors face with teaching in a diverse adult learning setting.
Organizational Processes: The relationship between University Extension and instructors is more
transactional than relational, so there is a lack of instructor community for instructors to discuss
and share their understandings of teaching and learning excellence.

Source: Learning and motivation theory
Knowledge: Instructors do not know how to align learning goals with assessments and teaching
and learning strategies. Instructors do not know how to assess the effectiveness of their course
design.
Motivation: Instructors are full-time working professionals and only teach part-time.
Compensation for teaching is not as high as professional work, so the cost value of spending time
to develop/update curriculum is low. There is little utility value in implementing new teaching
strategies because there are no opportunities for advancement and little recognition for being an
excellent instructor.
Organizational Processes: There is no consistent accountability for low evaluation scores. There
are no consequences or incentives for IDP participation and/or improved instructional quality.
The IDP does not offer workshops in a variety of formats that are adaptable to instructor
schedules. Program reps (point of contact for instructors) are not adequately trained to provide
instructional support despite being expected to provide it. The Office of Instructional
Enhancement (OIE), which houses the Instructor Development Program, is understaffed and, due
to understaffing, is focused mainly on the technical aspects of online course delivery, not on
overall instructional support.

Source: Background and review of the literature
Knowledge: Instructors do not understand the relationship between learning objectives, key
assessments, and teaching and learning strategies.
Motivation: Lack of consequences for low evaluation scores decreases task value.
Organizational Processes: The organizational vision, mission, and values are not effectively
embedded in the organizational culture.
Validation of the Causes of the Performance Gap
The remaining sections of this chapter will discuss how the assumed causes of the
performance gap were assessed and validated in order to determine which of the causes were, in
fact, problems that required solutions. Not all assumed causes were validated as problems; causes
that were not validated did not require solutions.
Validation of the Causes of the Performance Gap: Knowledge
The following discussion on assessing and validating assumed knowledge causes will be
guided by Anderson and Krathwohl’s (2001) expansion on and revision of Bloom’s taxonomy.
Anderson and Krathwohl (2001) reorganized and added to the original taxonomy, creating four
knowledge dimensions that include factual, conceptual, procedural, and metacognitive
knowledge. These knowledge dimensions are accompanied by six dimensions of the cognitive
process, which include remember, understand, apply, analyze, evaluate, and create. The four
knowledge dimensions help to identify and classify the specific gaps in knowledge, which
inform the dimension of cognitive processes necessary to target when exploring potential
solutions.
Factual knowledge causes’ validation. Factual knowledge refers to the basic, elemental
knowledge that an individual requires in order to know about a subject area or discipline
(Anderson & Krathwohl, 2001). It is the knowledge of terms and facts in isolation, and not the
connection between those terms and facts.
One assumed factual knowledge gap was that instructors do not know the
basics/component parts of course design. Most instructors are not trained teachers, so they lack
the foundational knowledge about curriculum design. They do not possess the vocabulary, nor do
they know the components of a curriculum. To assess this assumed cause, a survey was
administered to instructors asking them to provide specific information about whether they have
prior teaching knowledge and what they know of curriculum design. Survey questions included
“What are some components of course design?” A document analysis of existing syllabi was
also conducted to see if syllabi contained components of a curriculum, such as course objectives,
grading rubrics, and course outline.
Conceptual knowledge causes’ validation. Whereas factual knowledge provides
individuals with the basic terminology and facts, conceptual knowledge is the dimension that
connects those isolated terms and facts (Anderson & Krathwohl, 2001). Conceptual knowledge
organizes information, creating schemas that individuals use to understand the information that is
not overt in factual knowledge. It is a deeper dimension that allows individuals to understand
concepts, principles, and theories (Anderson & Krathwohl, 2001).
BLP instructors potentially lacked conceptual knowledge about teaching and learning
strategies for adult learners, and the relationship between learning objectives, assessments, and
teaching and learning strategies. They may have been familiar with the terminology, but may not
have had a full grasp of what those terms mean and how they translate into teaching in an adult
learner classroom. In order to validate these assumed causes, a survey was administered to probe
the instructors’ understanding of course design, appropriate adult learner strategies, and the
relationship between learning objectives, assessments, and strategies. Interview questions further
probed instructors’ conceptual knowledge. Similar to the questions that probed factual
knowledge, questions to probe conceptual knowledge asked instructors to identify principles of
curriculum design that they know, identify appropriate teaching and learning strategies
appropriate for adult learners, and explain how learning objectives relate (or not) to assessments
and teaching strategies.
Procedural knowledge causes’ validation. Procedural knowledge is the application of
the knowledge in different contexts (Anderson & Krathwohl, 2001). The assumed procedural
knowledge gaps were that instructors did not know how to align learning goals with assessments
and teaching and learning strategies. University Extension offers a one-day workshop that
introduces these concepts, so the instructors may be familiar with the concepts, but applying the
factual and conceptual knowledge may be a challenge because they may not have had the
opportunity to practice, nor were the lessons modeled.
In order to assess the procedural knowledge gaps, instructors were asked to describe the
relationship between learning goals, assessments, and teaching and learning strategies. A
document analysis of syllabi was also conducted to review how instructors incorporated those
three components in their course design.
Metacognitive knowledge causes’ validation. Metacognitive knowledge is the
knowledge and awareness of one’s own cognitive process and the ability to self-regulate and
reflect (Anderson & Krathwohl, 2001). The potential metacognitive knowledge gaps for
instructors include being unable to appropriately assess their own teaching effectiveness.
Personal interviews with instructors revealed that regardless of how long they have been
teaching, instructors lack the ability to reflect on their effectiveness in the classroom.
The assumed metacognitive knowledge gaps were assessed with survey and interview
questions that asked the instructors how they know that they have created excellent learning
experiences for their students. They were also asked to judge their effectiveness based on stated
objectives, student learning outcomes, and course evaluations. A summary of the assumed
knowledge causes and how the causes were validated is found in Table 3.
Table 3
Summary of Assumed Knowledge Causes and Validation

Assumed cause: Instructors do not know the basics/components of course design (F)
Survey: What are some components of a curriculum?
Interview: Would you please briefly describe your teaching background and experience,
specifically any experiences you’ve had in learning about course design.
Document analysis: Review course syllabi for sections about learning goals, appropriate teaching
and learning activities, and appropriate assessments.

Assumed cause: Instructors are unable to identify appropriate teaching and learning strategies for
adult learners (C)
Survey: Which of the following engagement strategies would you use when teaching adult
learners? (Provide list of 5 strategies with 6-point Likert scale)
Interview: Would you please describe teaching and learning strategies you use with diverse adult
learners.
Document analysis: Review course syllabi for descriptions of course activities.

Assumed cause: Instructors do not understand the integrated relationship between learning
objectives, key assessments, and teaching and learning strategies (C)
Survey: What do you think is the relationship between learning objectives, assessments, and
teaching and learning strategies?
Document analysis: Review course syllabi for integration of learning objectives, key
assessments, and teaching and learning strategies.

Assumed cause: Instructors do not know how to align learning goals with assessments and
teaching and learning strategies (P)
Document analysis: Review course syllabi for integration and alignment of learning objectives,
key assessments, and teaching and learning strategies.

Assumed cause: Instructors do not know how to assess the effectiveness of their course design (M)
Survey: What are some methods you use to assess the effectiveness of your course design?
Interview: How do you know that the course that you have designed is effective in student
learning?

Note. Knowledge type for each assumed cause is indicated with the following abbreviations:
(F)actual; (C)onceptual; (P)rocedural; (M)etacognitive.
CREATING “EXCELLENT” LEARNING EXPERIENCES
54
Validation of the Causes of the Performance Gap: Motivation
Clark and Estes (2008) describe motivation as what gets people going, keeps people
moving, and signals how much effort to expend. Motivation is thus seen as the active choice
individuals make to engage in a task, the persistence they display in continuing with the task, and
the effort they put into doing the task (Clark & Estes, 2008). Gaps in achievement that are linked
to choice, persistence, or effort indicate that motivation is a problem (Clark & Estes, 2008).
Pintrich (2003) provides a framework of motivational science that will be the main guide
for this discussion on assumed motivation causes and solutions. Motivation principles presented
by Clark and Estes (2008) provide additional support to the Pintrich (2003) framework.
According to Pintrich (2003), there are five basic social-cognitive constructs that underlie motivation.
These include beliefs about self-efficacy, attributions, interests, values, and goal orientation.
These beliefs affect the motivation indicators of choice, persistence, and effort.
From scanning interviews, one potential motivation problem identified was that instructors were not
spending adequate time developing courses and preparing to teach, indicating a lack of choice,
persistence, and effort. An assumed cause was that because instructors are content-area experts,
they are overconfident in their ability to translate their professional knowledge into the classroom,
thus underestimating the amount of time they need to properly prepare courses (Clark & Estes,
2008; Pintrich, 2003). Overconfidence stems from a belief of high self-efficacy. To assess this
assumed cause, two written survey items were used. The first asked
how confident instructors felt about their ability to develop and prepare course materials. The
second asked how much time they think they need to spend on course development and
preparation.
Instructors were also potentially not spending enough time on course development and
preparation because of low cost value. Most instructors are also full-time working professionals;
they teach on a part-time basis and are usually initially motivated by personal interest. However,
because many underestimate the amount of time required to prepare course materials, they must
weigh the value of teaching against their other personal and professional obligations (Eccles,
2009). In personal conversations, instructors mentioned that the compensation they received for
teaching was not adequate for the amount of time they spent outside of class preparing materials
and corresponding with students. This assumed cause was assessed with a written survey item
asking whether instructors think it is important to set aside dedicated time for course
development and preparation despite mounting personal and/or professional obligations.
Another potential motivation problem was that instructors were not applying the
strategies they learned in the Instructor Development Program (IDP). The IDP is an optional
training program composed of a series of workshops that teach instructors how to develop
courses and how to teach. Course offerings include a curriculum development course that
reviews learning objectives, key assessments, and teaching strategies. The courses are
complimentary for University Extension instructors and are offered quarterly. Instructors may
have chosen to attend the program but were not persisting and making the mental effort to
utilize the information they gained: they were not using the job aids, nor attempting to
incorporate new strategies. One potential cause for the lack of persistence was that there are no
hard consequences for low evaluation scores. This lack of consequences decreases the value of
integrating anything new into the course curriculum (Ambrose et al., 2010; Eccles, 2009) and
does not motivate instructors to make changes or adjustments. This
assumed cause was assessed with a written survey item that asked if instructors utilize the
evaluations to reflect on and update their curriculum.
Instructors could also be failing to utilize the information from the IDP because they see
low utility value in it. Implementing new strategies does not result in a tangible reward for
instructors, so they see little value in adjusting their curriculum (Clark & Estes, 2008). This
assumed cause was assessed with a written survey item that asked whether instructors thought it
important to incorporate the information they gain in the IDP into their courses.
Another potential cause of the lack of effort in incorporating IDP information was that
instructors attributed student dissatisfaction to the overall challenge of teaching in a diverse,
adult learning environment. University Extension students range in age, educational and
professional background, and English-language proficiency. This diverse environment presents
a set of interesting teaching challenges, and from personal conversations, instructors have
indicated that they believe this environment is why students are not happy with their courses.
Because instructors attribute the problem to an external cause that is beyond their control, they
are not motivated to incorporate any new strategies because regardless of what they do, the
classroom environment is the problem (Pintrich, 2003). This assumed cause was assessed with a
written survey item that asked if instructors believe that student satisfaction is influenced by
instructor effort.
Table 4

Summary of Assumed Motivation Causes and Validation

Motivational problem: Instructors are not investing time in becoming better teachers.
Type of indicator: Active choice, persistence, effort
Possible cause(s):
• High sense of self-efficacy, leading to underestimation of needed time and ability
• Cost value of spending time to develop/update curriculum is low
How will it be validated?
Survey
• I am certain that I can design a course that creates significant learning experiences.
• Compared to other instructors, I think I know a great deal about course design. (6-point Likert scale)
• On average, I spend _____ time on (a) course development (b) course updating (provide continuous range of time)
• I often choose to spend my time improving course design even if it requires more work. (6-point Likert scale)
Interview
• How often do you review and update your course design? Why do you choose to invest the time on course updating? Or why not? How much time do you usually spend updating materials?

Motivational problem: Instructors are not utilizing or applying the information provided in Instructor Development Programs.
Type of indicator: Persistence, effort
Possible cause(s):
• Lack of consequences for low evaluation scores decreases the value of applying new teaching information
• There is little utility value in implementing new teaching strategies
• Instructors attribute students’ dissatisfaction with class to the challenges instructors face with teaching in a diverse adult learning setting
How will it be validated?
Survey
• I think that course evaluations are useful for me to improve course design.
• I think that what I learn in IDP workshops is useful for me to improve my course design.
• The amount of effort I put into preparing course materials influences student satisfaction.
Interview
• What is the impact of course evaluations on your decision to update curriculum?
• What factors do you think contribute to student dissatisfaction?
Validation of the Causes of the Performance Gap: Organization/Culture/Context
The following discussion on assumed causes and solutions is guided by Gallimore and
Goldenberg’s (2001) framework of cultural models and settings. Culture is a construct that is
understood in many ways. Gallimore and Goldenberg (2001) define cultural models as the
unspoken shared understanding of how interactions and processes occur within a group of
people, whereas cultural settings are the physical, visible spaces in which those interactions and
processes occur.
In reviewing the cultural models and settings that exist at University Extension, there is
an apparent lack of shared understanding of the value of “excellence.” Excellence is stated as an
institutional value on University Extension collateral materials, but scanning interviews with
instructors illustrated that “excellence,” especially in regard to teaching and learning, is
understood in very different ways. Some instructors indicated that excellence means high
evaluation scores, whereas others identified excellence as the quality of final projects, indicating
how much students have learned. Even staff, through informal conversations, indicated that they
understand “excellence” in different ways. This inconsistency illustrates that the cultural model
lacks a shared construct of “excellence,” a deficiency likely caused by a variety of factors
(Gallimore & Goldenberg, 2001).
First, from personal observation, the stated vision, mission, and values of University
Extension are not embodied in the organizational culture. Though written on the website and on
various brochures and marketing materials, the statements which are meant to guide the
organization are not well known to all staff and certainly not to all instructors, and are not all
inherent in the processes of the organization (Clark & Estes, 2008; Gallimore & Goldenberg,
2001). To validate this assumed cause, instructors were asked if they know the stated vision,
mission, and values of University Extension.
Another potential cause of the lack of a shared understanding of the “excellence” value is
that University Extension does not foster a culture of community with instructors (Fink, 2013;
Schein, 2004). The relationship with instructors operates at a transactional level: teaching
assignments are offered and accepted (or not), and then scheduling logistics are set up. Aside
from these transactions, there are few opportunities for personal interaction with staff, and even
fewer for interaction with other instructors, creating a cultural setting that does not foster the
kind of connection with University Extension that the institution desires to have with its
instructors (Clark & Estes, 2008; Gallimore & Goldenberg, 2001). Ideally, instructors would
interact more, having discussions about teaching and learning that help them reflect on their
own teaching and course design. This assumed cause was assessed with two survey questions
asking instructors how connected they feel to the institution and how connected they feel to
other instructors.
Closely connected to the lack of shared understanding of “excellence,” another
organizational problem was that instructors were not participating in the IDP workshops and
were not proactive in improving course quality. This was potentially due to a lack of incentives
to attend IDP workshops and to work on improving courses to raise evaluation scores. To assess
this, survey and interview questions probed instructors about their understanding of incentives or
consequences as they relate to evaluation scores and IDP workshops.
Table 5

Summary of Assumed Organizational/Culture/Context Causes and Validation

Organizational problem: Instructors are not participating in Instructor Development Program (IDP) workshops and are not actively working to improve instructional quality.
Possible organizational cause(s):
• There is no consistent accountability for low evaluation scores.
• There are no consequences or incentives for IDP participation and/or improved instructional quality.
• The IDP does not offer workshops in a variety of formats that are adaptable to instructor schedules.
How will it be validated?
Survey
• There are clear consequences for low evaluation scores.
• There are institutional incentives for me to improve the quality of my courses.
• I can easily fit IDP workshops into my schedule.
Interview
• What are the reasons why you participate (or not) in IDP programs?
• Are IDP workshops conveniently scheduled for you? If not, what suggestions do you have to make them more accessible for you?

Organizational problem: Instructors do not have a shared understanding of what teaching and learning “excellence” means at University Extension.
Possible organizational cause(s):
• The organizational vision, mission, and values are not effectively embedded in the organizational culture.
• The relationship between University Extension and instructors is more transactional than relational, so there is a lack of an instructor community in which instructors can discuss and share their understandings of teaching and learning excellence.
How will it be validated?
Survey
• I am familiar with University Extension’s mission.
• I feel like I am part of the University Extension community.
• I have opportunities to discuss and share teaching and learning strategies with other University Extension instructors.
Interview
• How is University Extension’s mission reflected in your course design?
• Can you describe the nature of the relationship you have with University Extension? What about with other instructors? How does this affect your understanding of teaching and learning excellence at University Extension?

Organizational problem: University Extension provides limited instructional support.
Possible organizational cause(s):
• UNEX program reps (point of contact for instructors) are not adequately trained to provide instructional support despite being expected to provide it.
• The Office of Instructional Enhancement (OIE), which houses the Instructor Development Program, is understaffed and, due to understaffing, is focused mainly on the technical aspects of online course delivery, not on overall instructional support.
• Institutionally, there is a limited budget for instructional support.
How will it be validated?
Survey
• The instructional support I receive from my program representative helps me improve the quality of my design.
• The Office of Instructional Enhancement/Instructor Development Program has been helpful in providing me instructional support, beyond online technical support, that helps me improve the quality of my course design.
Interview
• Can you tell me about your experiences with receiving instructional support from University Extension, specifically as it relates to course design?
• What types of instructional support would you like to see offered by University Extension that is not currently available?
The assumed knowledge, motivation, and organizational causes of the performance gap in
achieving higher evaluation scores described above were generated from personal knowledge
gained through scanning interviews, learning theories, and related literature. Each assumed
cause was then examined to determine whether it was an actual root cause of the performance
problem. These causes were assessed and validated using a mixed methods approach:
quantitative and qualitative data were gathered through a document analysis, a survey sent to all
Business and Legal Programs (BLP) instructors, and interviews with a subset of instructors.
Participants
To assess the gap in achievement of higher evaluation scores for University Extension
Department of Business and Legal Programs (BLP) instructors, the key stakeholder group of 330
BLP instructors served as the survey population. Instructor data was requested from and
provided by the department administrative assistant. All instructors who taught at least one class
within the last year were sent the survey for completion. A total of 75 instructors responded to
the survey. Of the 75 instructors, 12% had been teaching with University Extension for less than
one year, 29% for five years or less, 21% for ten years or less, and 38% for more than ten years.
On average, these instructors were teaching 4 courses per year, with 46% teaching solely in the
face-to-face format, 24% solely in the online format, and 30% teaching in both formats.
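The sample description above reduces to simple proportions. A minimal sketch of the arithmetic follows, using the figures reported in this section and assuming the four tenure percentages are mutually exclusive bands:

```python
# Figures reported for the BLP survey sample.
population = 330   # BLP instructors who received the survey
respondents = 75   # instructors who completed it

print(f"Response rate: {respondents / population:.1%}")

# Teaching tenure of respondents, read as mutually exclusive bands (an assumption).
tenure_pct = {"<1 year": 12, "1-5 years": 29, "6-10 years": 21, ">10 years": 38}
assert sum(tenure_pct.values()) == 100  # the bands cover all respondents
for band, pct in tenure_pct.items():
    print(f"{band:>10}: about {round(pct / 100 * respondents)} instructors")
```

This works out to a response rate of roughly 23%, a modest but usable return for an anonymous instructor survey.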
A purposefully selected subset of instructors was identified for interviews. Instructors in
this subset were recommended by program directors because they had previously demonstrated
a willingness to provide feedback that was direct and candid from their perspective, even if it
reflected a negative view of University Extension. Nine instructors were selected and
interviewed.
Procedures
In order to validate the assumed knowledge, motivation and organizational causes, this
study utilized an online survey, interview (either in-person or via video software), and document
analysis of related course and organizational materials. The survey contained 25 items and was
developed to ascertain the validity of the above outlined assumed causes. The survey items were
created based on existing research and knowledge about the context of the study. The items
assessing the motivation causes were developed from the Motivated Strategies for Learning
Questionnaire, a valid and reliable motivation-related instrument (Pintrich, Smith, Garcia, &
McKeachie, 1993). The survey contained Likert scale, multiple choice, closed-ended, and open-
ended items. Brief demographic questions were also included in the survey. An interview
protocol of nine questions was generated to guide the interviews with a purposefully selected
subset of instructors. The interview was designed to be semi-structured in order to further probe
responses from the selected interviewees.
Data Collection
This study utilized a mixed methods approach to data collection. Data was collected in Fall
2014. The following sections overview the data collection methods of survey, interview, and
document analysis.
Surveys
The survey was distributed in October 2014 using Qualtrics, an online survey tool. The
survey was open for six weeks, and was distributed to 330 BLP instructors, as defined in this
study. The survey comprised 25 items, a mix of six-point Likert scale questions and
open-ended questions. Instructors received an email request to complete the survey, and a link
was provided for them to access the survey. The survey was collected anonymously, and
responses were tabulated through survey software. Results were kept confidential. The survey
instrument can be found in Appendix A.
Interviews
BLP instructors were interviewed individually, either in person if they lived locally or
online via video software if they did not. Three interviews were conducted in person in a private
room at University Extension; these were audio recorded with a digital voice
recorder. Six online interviews were conducted and recorded via Skype, a video
conferencing software. Forty-five minutes was allocated for each interview, with allowance
for additional time if the respondent was willing to continue beyond the allotted time.
Each interview was conducted utilizing the interview protocol. The interview was semi-
structured, and started with background information about the instructors’ teaching experience.
Questions then transitioned to focus on the assumed knowledge, motivation, and organizational
causes. These questions expanded upon the survey questions, probing further to explore
conceptual understanding of instructional design, expectancy value and goal attribution in
teaching, and the extent of organizational support or hindrance in achieving instructional goals. The
interview protocol can be found in Appendix B.
Document Analysis
University Extension and BLP documentation was analyzed in order to triangulate
results, and achieve a more comprehensive view of assumed gaps. At the larger, institutional
level, documents and artifacts that were analyzed include the course catalog and institutional
website. In BLP, documents that were reviewed included course syllabi.
Data Analysis
This study utilized both quantitative and qualitative approaches to analyze survey and
interview results. Once the survey closed and responses were submitted, a descriptive statistical
analysis was conducted: the average level of response to each item was identified by the mean
and standard deviation. Utilizing the Eight Step Coding Process developed by Tesch (as cited in
Creswell, 2014), the recorded interviews were transcribed and coded into themes within the
categories of knowledge, motivation, and organization, then further refined into subcategories
by knowledge, motivation, and organization type. The results were analyzed for relevant
information and reviewed for emergent causes of the performance gap. The frequency of themes
indicated their significance and was compared to
the survey data. Also using the Eight Step Coding Process, documents were reviewed for themes
coded into the knowledge, motivation, or organizational categories and, as with the interview
coding, further refined into knowledge, motivation, and organizational types. The results of the
document analysis provided a means of triangulation, as they were compared with the survey
and interview results.
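The descriptive statistical step described above — item means and standard deviations over six-point Likert responses — can be sketched as follows. The items and response values here are hypothetical stand-ins; the actual instrument appears in Appendix A:

```python
from statistics import mean, stdev

# Hypothetical six-point Likert responses (1 = strongly disagree, 6 = strongly agree),
# keyed by survey item; the real items appear in Appendix A.
responses = {
    "I am certain that I can design a course that creates "
    "significant learning experiences.": [5, 6, 4, 5, 6, 5, 4, 6],
    "Course evaluations are useful for me to improve course design.":
        [4, 3, 5, 4, 4, 5, 3, 4],
}

for item, scores in responses.items():
    # Average level of response and its spread, as in the analysis above.
    print(f"{item[:45]:<45} mean={mean(scores):.2f} sd={stdev(scores):.2f}")
```

The sample standard deviation (`stdev`) is the conventional choice here, since the 75 respondents are a sample of the 330-instructor population.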
CHAPTER 4
RESULTS AND FINDINGS
The purpose of this study was to uncover the barriers that challenge University Extension
instructors in creating courses that deliver “excellent” learning experiences for students. A gap
analysis was conducted to identify the underlying knowledge, motivation, and organizational
causes that may be negatively affecting instructor performance in this area. Assumed knowledge,
motivation, and organizational causes were generated from a comprehensive review of literature,
learning and motivation theories, and personal knowledge. The gap analysis framework allowed
for the possible causes to be systematically narrowed and validated based on the findings from a
survey, interviews, and document analysis. This chapter presents the results in the categories of
knowledge, motivation, and organization, with key findings synthesized. The final section of the
chapter will summarize the overall findings in preparation for the development of solutions in the
next chapter.
Results and Findings for Knowledge Causes
A survey, interviews, and document analysis were used to determine whether the
assumed knowledge causes could be validated. Using Anderson and Krathwohl’s (2001) expansion on Bloom’s
taxonomy, knowledge is categorized into four dimensions, including factual, conceptual,
procedural, and metacognitive. Knowledge barriers were identified and classified per each
knowledge dimension. Following is a description of the findings related to the assumed
knowledge causes, based on the results of the survey, interviews, and document analysis.
Sufficient Factual Knowledge About the Components of Curriculum Design
Factual knowledge refers to the basic, elemental knowledge that an individual must know
about a subject area or discipline in order to adequately perform (Anderson & Krathwohl, 2001).
It is the knowledge of terms and facts in isolation, and not the connection between those terms
and facts. The assumed factual knowledge gap in this study was that University Extension
instructors do not know the basic component parts of course design. Most instructors are not
trained teachers; therefore, it was assumed that they lack the foundational declarative knowledge
about curriculum design, including the vocabulary to describe basic components of a curriculum.
To assess instructors’ factual knowledge about the components of course design,
instructors were asked to identify the components of curriculum that they include in their course
syllabus. Twenty-one items were listed, and instructors were asked to rate whether they agreed
that each component belonged in their syllabus. The components included items such as
course description, course objectives, method of instruction, instructor expectations, grading
criteria, and overview of exams/quizzes/assessments. On a Likert scale of one through six, with
six indicating strong agreement, the mean score for the components fell between 4.33 and 5.69,
indicating agreement or strong agreement. The results suggest that instructors are familiar with
the components and recognize that they belong in their syllabi. Scores were higher for
components that are included in a syllabus template that University Extension provides for
instructors, such as course description, instructor contact information, required course materials,
course objectives, grading rubric, grading policies, and weekly schedule. Figure 2 shows the
results of the survey, listing the components from highest to lowest mean agreement that each
should be part of the syllabus.
The survey results suggest that instructors are familiar with the terminology of
curriculum components and recognize that these components should be included. While these
results confirm that instructors possess factual knowledge about the components of course
design, the question of their ability to apply this factual knowledge remains and will be
addressed in the next section.
Figure 2. Responses to survey question assessing instructor knowledge about syllabus
components. Mean agreement on the six-point scale, from highest to lowest: Course Description
(5.69), Instructor Contact Information (5.62), Required Course Materials (5.58), Overall Course
Objectives (5.55), Course Logistics Information (5.40), Grading Policies and Procedures (5.35),
Student Learning Objectives (5.31), Grading Criteria/Rubric (5.30), Instructor Expectations
(5.19), Weekly Schedule Overview (5.18), Instructor Availability/Office Hours (5.16), Student
Conduct Policies (5.15), Overview of Assignments (5.10), Standards for Academic Honesty and
Penalties for Infractions (5.10), Methods of Instruction (4.99), Supplemental or Optional
Materials (4.94), Overview of Exams/Quizzes/Assessments (4.79), Student Expectations (4.73),
Prerequisites (4.65), Student Resources (4.49), Student Feedback Strategies (4.33).
Interviews were used to probe further into instructors’ factual knowledge and to
understand the extent of formal training and education they had received on teaching and course
design. Instructors were asked to describe their teaching background and experience and,
specifically, their experience with course design. Eight of nine instructors said they had prior
teaching experience, either formally at another educational institution, or in their profession as
trainers. Six of the eight instructors with prior teaching experience mentioned that they had also
been through training on course and curriculum development either at another institution or with
University Extension. Though the instructors described various training programs, the manner in
which they spoke about curriculum affirmed that they possessed foundational, factual knowledge
about course design, with one instructor describing that she learned about “instructor training,
building a syllabus, classroom management, sparking creativity and curiosity in students,
requiring rigor, and so forth.” One instructor stated that the formal training program he
experienced was where he “really learned more formal methods of teaching and assessment and
outcomes.” Another instructor also said that at the other institution where he teaches, the
institution “really stressed instructor education and so I’ve gotten some experience there.”
The results from the survey and interviews did not validate the assumed cause that
instructors do not possess factual knowledge about curriculum design. However, though the
survey and interviews affirmed that the instructors know the vocabulary, a possible gap revealed
itself through document analysis.
Factual Knowledge Not as Evident in Application
Though the data from the survey and interviews suggest that instructors possess adequate
factual knowledge about the vocabulary of course design, the document analysis of course
syllabi revealed that the knowledge was cursory and not properly applied. For the document
analysis, 30 syllabi were reviewed for the inclusion of the components that instructors were
asked about in the survey. Additionally, the dates and course descriptions in the syllabi were
compared to course listings as promoted on the institutional website and course catalog, and the
depth of detail about assignments and grading was reviewed.
In reviewing for the same components listed in the survey, the document analysis showed
that not all components that instructors agreed or strongly agreed should be in syllabi were
actually in the syllabi reviewed. Some components, such as the course description and required
materials, appeared in nearly all reviewed syllabi; however, no syllabus contained every
component. The course description appeared in all but one
syllabus. It should also be noted that of the syllabi that
included a course description, 63% of those descriptions matched the catalog description, and
34% did not. The mismatch in course descriptions potentially reveals a source of misalignment
of student expectations for the class because students review the descriptions on the institutional
website and/or course catalog to learn about the class. Thirty percent of all syllabi analyzed did not
list course objectives or student learning objectives. Sixty-seven percent did not list course
prerequisites, and 50% did not clearly identify methods of instruction. Figure 3 displays the
results of the document analysis, showing how many syllabi contained each component.
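The tallying behind this document analysis amounts to counting, per component, how many syllabi include it fully, partially, or not at all. A minimal sketch with hypothetical ratings (the study reviewed 30 syllabi; only a few stand-in records are shown here):

```python
from collections import Counter

# Hypothetical presence ratings for each component across four stand-in syllabi;
# each syllabus is judged "yes", "some", or "none" for the component.
ratings = {
    "Course Description": ["yes", "yes", "yes", "none"],
    "Course Objectives":  ["yes", "none", "some", "yes"],
    "Prerequisites":      ["none", "none", "yes", "some"],
}

n_syllabi = 4  # the actual analysis used n = 30
for component, marks in ratings.items():
    counts = Counter(marks)
    missing = 100 * counts["none"] / n_syllabi
    print(f"{component:<20} yes={counts['yes']} some={counts['some']} "
          f"none={counts['none']} ({missing:.0f}% missing)")
```

A `Counter` keeps the tally logic explicit and makes per-component percentages, such as the 67% of syllabi missing prerequisites reported above, straightforward to compute.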
Figure 3. Results of document analysis of syllabus components (n = 30 syllabi), showing how
many syllabi contained each component in full (“Yes”), in part (“Some”), or not at all (“None”).
When comparing the survey data to the document analysis data, the discrepancy between
what instructors identified in the survey as necessary components of a syllabus and what
components were actually included in the analyzed syllabi points to a potential problem. Table 6
displays the results of the survey in comparison to the results of the document analysis, listing
the syllabus components from highest to lowest in order of what instructors agreed should be
included. Some notable discrepancies include the differences between the mean score for overall
course objectives and student learning objectives. From the survey, the component of overall
course objectives has a mean score of 5.55 while the document analysis score was 4.67. From
instructional design theories, objectives and goals are deemed to be an essential starting point for
course design (Fink, 2013; Gagné & Merrill, 1990). The survey data suggests that instructors
know that course objectives should be a part of a syllabus, but the document analysis does not
show evidence of instructors applying the knowledge. Also, there is a large discrepancy between
the scores for overview of assignments, with the survey score at 5.1 and document analysis score
at 3.5. Though instructors seem to understand that students should have an idea of the types of
assignments they should expect in the class, the instructors are not including the component that
provides the overview of assignments as often. It is not clear from the data if the cause of this
discrepancy is due to a knowledge, motivation, or organizational problem. It is possible that the
discrepancy is due to a lack of procedural knowledge about how to apply the components, but it
could also be due to lack of motivation to apply the knowledge. The questions from the survey,
interview, and document analysis did not further probe the discrepancies to yield data that
concretely confirmed a cause.
Table 6
Comparison of Syllabus Components from Survey Results to Document Analysis
Syllabus Component                                              Survey Results (Mean)    Document Analysis Score (Mean)
Course Description 5.69 5.83
Instructor Contact Information 5.62 4.67
Required Course Materials 5.58 5.83
Overall Course Objectives 5.55 4.67
Course Logistics Information 5.4 4.58
Grading Policies and Procedures 5.35 5
Student Learning Objectives 5.31 4.58
Grading Criteria/ Rubric 5.3 5.5
Instructor Expectations (what instructor expects from students) 5.19 3.42
Weekly Schedule Overview 5.18 5.75
Instructor Availability/ Office Hours 5.16 3.83
Student Conduct Policies 5.15 4.92
Overview of Assignments 5.1 3.5
Standards for Academic Honesty and Penalties for Infractions 5.1 2.67
Methods of Instruction 4.99 4.08
Supplemental or Optional Materials 4.94 4.12
Overview of exams/quizzes/assessments 4.79 2.33
Student Expectations (what students can expect from instructor) 4.73 2
Prerequisites 4.65 2.67
Student Resources 4.49 2.25
Student Feedback Strategies (other than tests/quizzes) 4.33 1.34
Insufficient Conceptual and Procedural Knowledge About Curriculum Components
Conceptual knowledge is the dimension of knowledge that bridges isolated terms and
facts. Factual knowledge lays the foundations by providing basic terminology and discrete
information, but conceptual knowledge builds upon that by connecting each individual item,
organizing the information and creating schemas that help individuals understand the
information beyond mere facts (Anderson & Krathwohl, 2001). Individuals possess conceptual
knowledge when they can understand concepts, principles, and theories (Anderson & Krathwohl,
2001). Procedural knowledge refers to the application of knowledge in varied contexts
(Anderson & Krathwohl, 2001). Beyond knowing the terms and understanding the relationship
between terms, procedural knowledge is demonstrated in the utilization of the knowledge in
context.
The assumed cause was that BLP instructors do not have the conceptual knowledge about
the relationship between learning objectives, teaching and learning strategies, and assessments.
The basis of an effective learning experience occurs when learning objectives inform teaching
and learning strategies, and assessments are aligned to measure the achievement of learning
objectives (Fink, 2013). They may possess the factual knowledge about vocabulary, but may not
understand the application of the terms and how those concepts translate into designing an
effective adult learning experience. Closely connected to this, the assumed cause for the
procedural knowledge gap was that instructors do not know how to align course objectives,
strategies, and assessments. Additionally, they may not possess conceptual knowledge about
teaching and learning strategies for adult learners.
The instructors demonstrated factual knowledge about the terms used in curriculum
design, but the survey and document analysis results show that there is some lack of conceptual
knowledge about how the terms relate to one another. Instructors were asked an open-ended
survey question about their thoughts on the relationship between learning objectives,
assessments, and teaching strategies. The qualitative responses were analyzed in terms of how
clearly they defined this relationship. Sixty-one instructors responded to the question. Of the
respondents, 46% of instructors provided a response that did not acknowledge a
relationship between the three components, with responses such as “I use all these elements in
my teaching” and “too technical of a question — I am a real world practitioner first, not an
academic.” Thirty percent acknowledged that there must be some relationship, but did not
provide a clear definition, such as “I think all are interwoven and if well designed and utilized,
the student experience should be very stimulating and educational.” Only 24% of the instructors
were able to clearly define the relationship between all three components, providing a definition
similar to one instructor’s response of:
learning objectives is what you expect students to grasp during and at the end of class,
assessments is the evaluation of whether or not the objectives were met and teaching and
learning strategies are the methods used to help students perform and grasp materials.
With 76% of instructors not providing a clear definition in the survey, the results indicate that
instructors lack conceptual knowledge about the connections between the discrete terms of
learning objectives, assessments, and teaching strategies. Because only 24% of instructors
demonstrated a conceptual grasp of the relationship between the components, the results also
validate the assumption that instructors do not possess procedural knowledge on aligning
learning objectives, assessments, and teaching strategies.
Figure 4. Response to survey question about defining the relationship between learning
objectives, assessments, and teaching strategies
[Pie chart: No relationship, 46%; Clearly defined all three relationships, 24%; Somewhat defined (at least one relationship defined), 21%; Acknowledged a relationship (no definition, but recognized a connection), 9%.]
The document analysis also affirmed that most instructors were not clearly connecting the
learning objectives, assessments, and teaching strategies. Seventy percent of the reviewed syllabi
had objectives listed. These objectives were either student learning outcomes or overall course
objectives. In the University Extension context, the two concepts are similar, but student learning
outcomes refer to the tangible, measurable skills and knowledge that the individual student
should gain at the completion of the class, whereas overall course objectives refer to the
overarching course goals and purpose, providing a direction for the class. The difference is
slight, and University Extension asks that one or the other be included in the syllabus. Though a
majority of the instructors listed some type of objectives, only 50% of the syllabi included any
mention of the method of instruction describing how the objectives would be met, and only 27%
provided an overview of class assessments, which would give students an opportunity to
demonstrate whether the learning objectives were achieved. Additionally, the finding in the
document analysis about the lack of application of the factual knowledge of curriculum
components in syllabi potentially supports the assumed cause that instructors lack conceptual and
procedural knowledge.
Sufficient Conceptual Knowledge About Appropriate Teaching Strategies
The assumed cause that instructors are unable to identify appropriate teaching and
learning strategies for adult learners was invalidated by the survey and interview results.
Instructors were asked an open-ended question to identify teaching and learning strategies that
they use for adult learners. Adult learning theories suggest that activating prior knowledge and
using a practical, applied approach is more effective for adult learners (Knowles, 1980; Merriam
& Bierema, 2013). Seventy-seven percent of the respondents utilize multi-modal content
delivery methods, ranging from traditional lectures and presentations to using film clips and live
simulations. Forty-four percent of the respondents use case studies or find opportunities for
practical application, recognizing the importance of applied knowledge in the continuing
education classroom. One instructor wrote, “I rely heavily on case studies from my own day job
where I work with Fortune 500 clients. The students over the past 11 years have remarked that
they value this real world experience and relevant case studies.” The use of case studies and
opportunities for practical application align with the research on effective teaching strategies for
adult learners (Knowles, 1980; Merriam & Bierema, 2013).
The interviews confirmed the survey results, with 7 of 9 instructors citing the use of
discussion and various modes of instruction to engage students. Five of the 9 interviewees
indicated that they use applied scenarios and case studies because from their experience, they
know that their students want to be able to immediately apply what they learn in their
professional lives. One instructor commented that
the courses that I teach are very applications driven, so I have sort of the benefit of thirty
years of experience in the work settings that create the environment for which the courses
that I am teaching occur… what I always try to do with a class is, what is it that you
learned tonight that you can use tomorrow.
Another interviewee acknowledged that her experience as an adult learner influenced her choice
in methods used to teach, stating, “when I was learning as an adult learner, I wanted to apply
what I was learning right away…. It’s what I wanted so it’s what I do for my students.”
Sufficient Metacognitive Knowledge to Gauge Teaching Effectiveness
Metacognitive knowledge refers to the understanding and awareness of one’s own
cognitive process and the ability to reflect and self-regulate (Anderson & Krathwohl, 2001). It is
the ability to gauge one’s own effectiveness. From initial scanning interviews, the assumed cause
was that instructors do not possess the metacognitive knowledge to appropriately reflect upon
and assess their teaching effectiveness. The survey results and interview findings did not validate
the assumed cause that instructors are unable to assess the effectiveness of their course design.
The instructors demonstrate that they possess the metacognitive knowledge needed to reflect on
how effectively they have designed their courses based on student learning.
In the survey, instructors were asked to identify methods they use to assess the
effectiveness of their course design. This was an open-ended question inviting respondents to
think about and identify their own responses. The main indicator that instructors identified for
determining their effectiveness is student performance. Of the 57 instructors who
provided a response, 60% cited student performance through assessments as the method they use
to determine the effectiveness of their course design. Forty-seven percent stated that they use
student evaluations, while 19% said they determined effectiveness with the types of questions
students asked and the quality of student discussion. Sixty-eight percent of the respondents only
cited one method of gauging effectiveness, but 32% named more than one method. Of those that
only responded with one method, 56% stated that they only use student performance, 10% cited
only student questions and discussions, 3% cited the amount of time they spent preparing
materials, and 31% cited using the student evaluation as the method to gauge effectiveness.
Figure 5. Responses to survey question about methods to assess effectiveness of course design
[Pie chart: Multiple methods, 32%; one method, 68%, broken down as Student Performance, 38%; Student Eval, 21%; Student Questions, 7%; Class Prep Time, 2%.]
The interview also asked instructors to describe how they know that the courses they
designed were effective. All interviewees provided responses that validated the survey results.
Six of the interviewees said they gauge effectiveness based on student performance in class. Six
of the interviewees also said that they use student evaluations and student feedback to get a sense
of whether or not the class was successful. One of the instructors stated about the evaluations
that “I usually read those very thoroughly. Some of them have a lot of comments and some have
number valuations, so I look at that.” The interviewees also identified multiple methods of
gauging effectiveness. For example, one instructor stated that she will
look at it across several platforms…the first platform I take a look at is at the level of
engagement by students in the classroom. The second thing that I take a look at is how
students perform in their assignments. And I look at their assignments and the caliber of
their assignments even more so than testing. And the third is course evaluation.
Five of the interviewees also said they conduct their own midpoint evaluation in order to get a
sense from students about whether or not the students are learning.
Synthesis of Results and Findings for Knowledge Causes
The findings from the collected data confirm that there is a knowledge gap that exists for
instructors. Instructors possess factual knowledge about curriculum design, but lack some
conceptual and procedural knowledge. The instructors are knowledgeable about the terms and
components of course design, but there is some disconnect in applying those terms. They do not
understand the relationship between components, and are not building courses that include all the
components. However, through their teaching experience, the instructors have found ways to
assess their effectiveness in the classroom, confirming that instructors possess metacognitive
knowledge. Through trial and error, instructors have gained knowledge about teaching, but some
conceptual and procedural knowledge gaps still exist.
Table 7
Summary of Results and Findings for Assumed Knowledge Causes

Factual Knowledge: Instructors do not know the basics/components of course design.
Result: Not validated, but application of factual knowledge not evident.
Explanation: Survey: range of mean scores = 4.33-5.69, indicating agree or strongly agree to inclusion of curriculum components. Interview: 8 of 9 interviewees indicated prior teaching experience or prior teaching training. Document Analysis: Application of factual knowledge not evident.

Conceptual Knowledge: Instructors do not understand the integrated relationship between learning objectives, key assessments, and teaching and learning strategies. Procedural Knowledge: Instructors do not know how to align learning goals with assessments and teaching and learning strategies.
Result: Validated.
Explanation: Survey: 46% did not acknowledge the relationship, and 29% acknowledged the relationship but did not provide a description of the relationship. Only 24% of instructors were able to clearly define the relationship between all three components. Document Analysis: 70% of reviewed syllabi had either course or student learning objectives, but only 50% mentioned methods of instruction and provided an overview of assignments. Only 27% provided an overview of class assessments.

Conceptual Knowledge: Instructors are unable to identify appropriate teaching and learning strategies for adult learners.
Result: Not validated.
Explanation: Survey: 77% of instructors use multi-modal content delivery methods and teaching strategies in order to engage adult learners, demonstrating conceptual knowledge about appropriate strategies for adult learners. Interviews: 7 of 9 interviewees use discussion and different modes to engage students, with 5 interviewees indicating that they utilized applied scenarios because adult learners want to be able to apply what they learn.

Metacognitive Knowledge: Instructors do not know how to assess the effectiveness of their course design.
Result: Not validated.
Explanation: Survey: 60% of respondents indicated that student performance was a key indicator, with 32% identifying only student performance as the indicator. 47% utilized student evaluations. 32% used more than one method. Interviews: 6 interviewees said they gauge effectiveness based on student performance in class. 6 also said they use student evaluations and feedback, and 5 conduct their own midpoint check to gauge effectiveness.
Results and Findings for Motivation Causes
According to Clark and Estes (2008), motivation is the force that drives people to
actively choose to engage in an activity, to persist in that task, and to invest the necessary
amount of mental effort. Motivation is a problem if gaps in achievement are related to choice,
persistence, or effort (Clark & Estes, 2008). Underlying the motivation indicators of choice,
persistence, and effort are five basic social-cognitive constructs: beliefs about self-efficacy,
attributions, interests, values, and goal orientation (Pintrich, 2003). Within these broad categories
are more specific levels of beliefs, all of which affect the motivation indicators of choice,
persistence, and effort. The assumed causes in this study probed instructors’ motivation as it
relates to self-efficacy, cost value, utility value, and attribution.
Instructors Choose to Invest Time to Develop and Improve Their Courses
One of the assumed motivational problems was that instructors are not choosing to spend
time improving their courses. There were two assumed causes of this problem. The first assumed
cause was that instructors have a high sense of self-efficacy, leading to overconfidence in their
skills and underestimation of time needed to improve their courses. The second assumed cause
behind the lack of choice to invest time to improve courses was that the task was of a low cost
value to instructors.
In general, instructors seem to have a sense of high self-efficacy in their ability to create
courses that lead to satisfactory learning experiences for students. Seventy-six percent of
instructors surveyed agreed or strongly agreed that they have the ability to design courses that
will lead to significant learning experiences for students. Instructors were also asked to compare
themselves to other instructors in terms of knowledge about course design, and 48% agree or
strongly agree, while another 23% somewhat agree, that they know a great deal about course
design. However, despite possessing a high sense of self-efficacy, the survey found that
instructors are motivated to spend time to improve their courses, invalidating the assumed cause
that instructors are not spending time improving courses because of high self-efficacy.
Figure 6. Responses to survey questions regarding self-efficacy
[Bar chart of response percentages from Strongly Disagree to Strongly Agree. “Compared to other instructors, I think I know a great deal about course design”: 3%, 5%, 18%, 26%, 34%, 14%. “I am certain that I can design a course that creates significant learning experiences”: 0%, 3%, 3%, 19%, 30%, 46%. “I often choose to spend my time improving course design even if it requires more work”: 0%, 4%, 3%, 10%, 36%, 48%.]
In the survey, instructors were asked to agree or disagree on a six-point Likert scale about
their willingness to spend time on improving course design even if it required more work.
Eighty-four percent of instructors agreed or strongly agreed that they do choose to spend time on
improving their courses. Another survey question asked an open-ended question about how
much time instructors estimated they spend on improving their courses. Forty-one percent said
they spent 10 hours or less updating their courses. Sixteen percent spend between 11 and 20
hours, and 34% said they spend more than 20 hours updating their courses. Nine percent did not
specify a number of hours, but wrote that they would spend “tons of time” on updating their
courses. The responses to the survey questions indicate that instructors are choosing to spend
time to update their courses despite their high sense of self-efficacy. The responses also indicate
that instructors find value in the task of updating their courses.
Figure 7. Self-reported number of hours instructors spend per quarter updating courses
[Bar chart of respondents (n = 68): Unspecified, 9%; 1-5 hours, 28%; 6-10 hours, 13%; 11-15 hours, 10%; 16-20 hours, 6%; 21-25 hours, 13%; more than 25 hours, 21%.]
The interviews probed the motivation behind why instructors would choose to invest time
in updating courses. Six of the instructors indicated that updated, relevant materials helped to
better engage students, leading them to decide to take the time to update their courses. Five of
the instructors cited a sense of responsibility to the students as one of the reasons why they
choose to invest the time to update materials. As stated by one of the instructors, “I really feel
like it’s my responsibility as an instructor to give the students as much as I can because I want
them to be successful.” Four of the instructors mentioned that their personal interest in the topic
was a factor in their decision to update materials, while three enthusiastically discussed their love
of teaching as the reason why they put time and effort into improving their courses. When asked
why she would invest the time to update her courses, one instructor simply stated, “I love it.”
The interview results affirm the survey findings, and did not validate the assumed cause that
instructors are not motivated to invest the time because of low cost value and high self-efficacy.
Instructors Find Utility Value in IDP Workshops and Course Evaluations
Another assumed cause of the motivation gap was that instructors do not find utility value
in the course evaluations and IDP workshops. Though the survey and interview questions did not
probe to confirm how instructors were actually using the feedback from evaluations and
knowledge gained from IDP workshops, the data suggest that instructors find value in the
evaluations and IDP workshops, and thus the assumed cause was not validated.
Currently, the only formal tool used by the institution to evaluate courses is the end-of-
term course evaluation. The course evaluation consists of Likert-scale questions that ask students
to rate their overall impression of the instructor and the course. The final questions on the
evaluation are open-ended questions that allow students to write comments. In the interviews,
instructors indicated that the written comments were the most useful part of the evaluation, as the
Likert-scale rankings do not carry much meaning. The survey results found that the end-of-term
course evaluation is a tool that instructors use to revise their courses. Seventy-three percent of
instructors agreed or strongly agreed that the course evaluation is a useful tool for them to
improve their course design.
Though the instructors indicate in the survey that the course evaluation is a useful tool, it
should be noted that only four of nine interviewees said that the written comments prompted
them to reassess the course and make adjustments. One instructor said
I look at both comments that they make in class and than also I really read those and I
keep them all in a book, like a binder of previous classes that I taught. I do go over it
from time to time to see if there’s some new trends that they’re suggesting. There are
certain subjects that they say they didn’t learn enough about in the class. So I do, over
time, eliminate some things that I think might be less important to them. I look at those
evaluations; I also monitor comments and take notes on it. I usually have an assistant
with me and I ask them to take notes on what the students’ comments are during the class
and after the class.
However, one interviewee admitted to not giving much attention to the course evaluations.
Out of the 74 instructors surveyed, 88% responded that they had
participated in instructor training through the IDP. Of the 88% who participated, 64% agreed or
strongly agreed that what they learned in the IDP workshop was useful to improve course design.
The assumed cause that instructors were not motivated to improve their courses because
they did not find the IDP workshops useful was not validated.
Instructors Attribute Student Satisfaction to Their Efforts
Eighty-five percent of instructors surveyed agreed or strongly agreed that the amount of
effort instructors put into preparing materials influences student satisfaction. They acknowledged
that instructor effort plays a direct role in the student experience. Based on the survey results, it
would appear that the assumption that instructors are attributing student dissatisfaction to other
factors is not valid.
However, during the interviews, instructors identified multiple reasons for student
dissatisfaction, nuancing the findings: though instructors indicated in the survey that their
efforts contribute to student satisfaction, they also attribute student satisfaction to
other factors beyond their control. In the interview responses, five of seven identified reasons
why students might be dissatisfied were factors that instructors have no control over, such as
misaligned student expectations and the student not being prepared. It should be noted that
during the document analysis of syllabi for knowledge gaps, 34% of the course descriptions in
the syllabi did not match the catalog description, potentially setting up the misalignment of
student expectations. Only two instructors identified their lack of ability to engage students as
one potential reason for dissatisfaction, and three instructors identified not following the
course outline as a source of student ire.
Overall, it seems that though instructors recognized the importance of their efforts in
affecting student satisfaction, further questioning revealed that they believe other factors are
also involved. Despite discussing other factors, a majority of interviewed instructors
acknowledged their role, thus supporting the findings of the survey, and invalidating the assumed
cause that instructors are not motivated to improve courses because they attribute student
dissatisfaction to factors not connected to their efforts.
Synthesis of Results and Findings for Motivation Causes
The results from the survey and interviews do not support the assumed causes for
motivation gaps in this study. In general, the results indicate that instructors are intrinsically
motivated. Instructors are choosing to invest the time in order to improve, they are using student
feedback to try to make improvements, and they recognize the role they play in student learning.
Table 8
Summary of Results and Findings for Assumed Motivation Causes

Motivational Problem: Instructors are not investing time to improve their courses.
Assumed Causes: Self-Efficacy: High sense of self-efficacy, leading to underestimation of needed time and ability. Cost Value: Cost value of spending time to develop/update curriculum is low.
Result: Not validated. Instructors have a high sense of self-efficacy, but they are still choosing to spend time updating their courses because they find value.
Explanation: Survey (Self-Efficacy): 76% agree or strongly agree that they can design learning experiences. 48% agree or strongly agree that they know a great deal about course design compared to other instructors, with another 23% somewhat agreeing. Survey (Cost Value): 84% choose to spend time improving courses even if it requires more work. Interviews: Instructors indicate that they want to better engage students by having updated, relevant materials. 4 instructors discussed their personal interest in the topic, and 3 enthusiastically mentioned that they do it because they love teaching.

Motivational Problem: Instructors are not using course evaluation results and IDP learning to improve courses.
Assumed Causes: Utility Value: Lack of consequences for low evaluation scores decreases the value of applying new teaching information; there is little utility value in implementing new teaching strategies.
Result: Mostly not validated. The results suggest that instructors find the course evaluations and IDP workshops useful, but actual application was not confirmed.
Explanation: Survey: 64% agree or strongly agree that the IDP workshops are useful. 73% agree or strongly agree that course evaluations help improve course design. Interviews: 4 of 9 interviewees use written comments on course evaluations to reassess the course and make adjustments.

Assumed Cause: Attribution: Instructors attribute students’ dissatisfaction with class to the challenges instructors face with teaching in a diverse adult learning setting.
Result: Mostly not validated. Though instructors acknowledge their role, the interviews nuanced the findings to suggest that other factors also contribute.
Explanation: Survey: 85% agree or strongly agree that the amount of effort instructors put into preparing materials influences satisfaction. Interviews: Interviewees identified multiple reasons for student dissatisfaction, with 5 of 7 identified reasons being factors that instructors have no control over, such as misaligned student expectations and the student not being prepared. Only 2 instructors identified their lack of ability to engage students as being one potential reason for dissatisfaction.
Results and Findings for Organization Causes
This study explores organizational gaps that are rooted in the culture and context of the
organization. Gallimore and Goldenberg (2001) provide the framework for cultural settings and
models that facilitate the examination of organizational gaps. The assumed causes of
organizational gaps include a lack of accountability of instructors to the institution, lack of
incentives to improve course quality, limited institutional resources for instructional support, and
a lack of shared community amongst instructors.
Organizational Culture Does Not Foster Accountability to the Institution
One assumed cause was that there is a lack of accountability in the cultural setting of the
organization as evidenced by the lack of consistent consequences when instructors receive low
evaluation scores. The survey results did not validate the assumed cause because most instructors
agree that there are clear consequences for low evaluation scores, but the interview revealed that
instructors felt accountable to the students, not the institution, if they received low evaluation
scores. This indicates that instructors have a sense of accountability, but it is not necessarily an
accountability that is fostered by the organizational cultural setting.
The survey asked instructors whether or not there are clear consequences for low
evaluation scores. Forty percent of the instructors agree or strongly agree that there are clear
consequences for low evaluation scores. Thirty-two percent somewhat agree, while the
remaining 28% of instructors fall within the disagree ranges. Though the survey results indicated
that the instructors believe there are clear consequences for low evaluations, the survey did not
ask instructors what those consequences are. However, the interview further explored this
question and revealed that the instructors ultimately felt responsibility to the students, thus using
the evaluations to make improvements in their courses to help student learning.
In the interviews, two instructors indicated that the administration never spoke with them
about their evaluations, and that they were uncertain how the administration used them. One
instructor talked about how he initiated discussion about his evaluation, but received no
responses, stating, “I’ve brought up my reviews with you guys but I haven’t heard back either
way.” This response demonstrates a lack of consistent follow-up on the part of the
institution and a lack of clarity and communication to instructors about how the administration
uses the evaluations.
However, the other interviewed instructors discussed using the course evaluations to
make improvements to their courses. Four of the instructors said that low evaluations would
cause them to make adjustments. Five of the instructors talked about a sense of responsibility to
the students, indicating that they felt accountable to the students because students are paying for
the classes. One instructor stated, “at the end of the day you know, they’re not paying
insignificant amount of money to come to the class. And I want to make sure they get a lot of
value out of it.” None of the instructors interviewed mentioned being accountable to the
institution. Rather, instructors discussed using the feedback from the evaluations to inform
course improvement so that the course would better engage future students. It was not to improve
their standing with the institution. When asked what happens when he receives low evaluations,
one instructor said, “I always try to figure out well, what did I do, and
how can I make it better?” This seems to indicate an intrinsic motivation to improve because of
desire to be better teachers and a sense of responsibility to the students, not a sense of
accountability driven by the organization’s cultural setting.
Unclear Institutional Incentives for Course Improvement
Organizationally, it was also assumed that there was a lack of incentives for instructors to
improve the quality of courses. The survey partially validated the assumed cause, with 49% of
instructors disagreeing to varying degrees that there are institutional incentives to improve course
quality. Though 51% of instructors fall in the somewhat agree to strongly agree categories, it is
worth noting that 27% of instructors disagree or strongly disagree, while 27% agree or strongly
agree. Responses to this survey question were evenly spread, indicating a mixed perception of
institutional incentives. This could point to a need to better define incentives and to better
communicate their existence, if any.
Figure 8. Responses to survey question regarding incentives for instructors to improve course
quality (bar chart; x-axis: Strongly Disagree through Strongly Agree; y-axis: number of
respondents, 0–70).
In the interviews, instructors were asked whether they had participated in IDP programs
and, if so, why they chose to participate. Of the seven instructors who had
participated in an IDP workshop, three mentioned that the opportunity to learn something new
was the reason why they participated in IDP programs. One instructor specifically recognized the
value of the IDP workshop, stating, “it’s a great value. It doesn’t cost a penny. You hear some
fantastic presenters on social marketing or new media. To take a Saturday to do that is just an
incredible valuable gift I like to receive.” Two of the instructors specifically mentioned that the
IDP workshops provided opportunities to meet other instructors, which they thought enhanced
their teaching experiences. One instructor said
you don’t have the opportunity to interact with other instructors but maybe once or twice
a year. But the instructor development courses, there are fifteen or twenty other
instructors that you can connect with and exchange ideas, become guest speakers for each
others classes and interact with each other. It’s great to have that accessibility to other
instructors so that if I have an issue or a question, I can call another instructor and get
their advice or opinion on what they would do in the situation.
The responses of the instructors seemed to indicate that they wanted to enhance their teaching
skills, and were not necessarily driven by institutional incentives. One instructor participated
because he thought it was mandatory, but overall, the motivation to participate in the
workshops seemed to stem from personal interest. The interview responses
reveal that instructors are not extrinsically motivated by reward, reinforcing the idea that
instructors are intrinsically motivated to teach and continually improve their craft.
Limited and Inconsistent Resources for Instructional Support
Another assumed organizational problem was that University Extension provides limited
instructional support, caused by a lack of guidance from program representatives and IDP staff
to help instructors improve the quality of course design, as well as by limited and
inconveniently scheduled IDP workshops. The results from the survey and interview findings
somewhat validated this assumption.
Forty percent of instructors agree or strongly agree that the support they received from
academic program staff helped to improve the quality of their course. Thirty percent of
instructors responded that they somewhat agree, while 10% somewhat disagree and 17%
disagree or strongly disagree. The results indicate a potential inconsistency in the support
provided to instructors. From the interviews, six instructors recognized that program
representatives provide administrative support, but the instructors did not mention that the
program representatives provide instructional support. One instructor said that staff seemed
overworked and had no time to do more than provide administrative support. The responses of
the instructors about program representatives indicate that the instructors understand that the role
of the program representative is more for administrative support, not classroom instruction
support, affirming that program representatives who are the main point of contact to instructors
are not adequately trained to provide support beyond classroom logistics.
In terms of support from the Office of Instructional Enhancement (OIE), which works
mainly to support instructors for online learning, the survey results were similar to those for
support from academic program staff. Thirty percent somewhat agree that the instructional
support they receive helps them improve, with 28% agreeing and 14% strongly agreeing.
Fifteen percent somewhat disagree
and 12% disagree or strongly disagree. The results suggest that there may be a lack of clarity
about the role of OIE, and that the support OIE provides to instructors may be inconsistent.
With regard to the scheduling of IDP workshops, 29% of instructors agreed or strongly
agreed that they are able to fit the IDP workshops into their schedules. Twenty-five percent
disagreed or strongly disagreed. IDP workshops are currently scheduled as in-person sessions on
Saturday mornings. The survey indicates that the scheduling is not the most convenient for
instructors and there is potential to consider alternative schedules and formats. Instructors were
also asked about the convenience of IDP programs during the interview. Five interviewees said
that the IDP programs were somewhat convenient, and would make the time to attend if needed.
However, four demonstrated a high degree of self-efficacy, stating that they did not feel they
needed the IDP. Two said that regardless of format and scheduling, they were too busy to
participate in IDP programs. Two also said that the topics offered were not new or interesting,
and that the offerings were too sporadic. The interview responses suggest that IDP offerings are
limited, and that the instructors unevenly value the IDP.
Organizational Culture Moderately Promotes Value of Excellence Through Shared
Instructor Community
One of the organizational assumed causes is that instructors do not have a shared
understanding of “excellence” at University Extension because the organizational mission is not
clearly communicated and the organization does not provide opportunities to develop instructor
community to help build a shared understanding of excellence. From the survey, a majority of
instructors agree to varying extents that the mission of the organization is reflected in the
organizational culture. Fifty-four percent agree or strongly agree while 29% somewhat agree.
The results suggest that instructors have a sense of the organizational mission because the
organization’s cultural setting communicates the mission. However, the survey did not ask
what instructors thought excellence meant for University Extension. Though instructors may
have a sense of excellence, the term itself was never explicitly defined.
In terms of community, 62% agree or strongly agree that they feel a part of the University
Extension community, but 17% fell into the disagree categories. However, fewer instructors felt
that they had opportunities to connect with other instructors. Only 24% agree or strongly agree
that they have opportunities to discuss and share teaching and learning strategies with other
UNEX instructors, while 19% disagree or strongly disagree. Roughly equal percentages of
instructors somewhat agreed and somewhat disagreed. Though instructors feel that the cultural setting
communicates the organizational mission of excellence, the cultural model does not seem to
communicate the same value because the lack of instructor community does not provide
opportunities for instructors to develop a shared sense of excellence.
From the interviews, four instructors stated that they had some type of relationship with
other instructors. Of those four, two expressed that they were friendly with other instructors
while the other two felt they had developed strong relationships with other instructors. However,
one instructor stated that there was no interaction with other instructors. Six of the instructors
felt that the opportunities to get to know other instructors were limited, but all
expressed a desire for more opportunities.
Synthesis of Results and Findings for Organization Causes
The results and findings of the survey and interview confirm that there are organizational
gaps. Though the organization does not have clear accountability or incentives to improve course
quality, the instructors possess a personal sense of accountability to students and intrinsic
motivation to improve. The cultural setting may not strongly reinforce accountability nor provide
incentives, but that does not seem as important to instructors because other factors are driving
their willingness and desire to improve course quality.
There also appears to be limited and inconsistent instructional support available to
instructors. Instructors seem to be able to find the administrative support they need, but the
instructional support is uneven and not consistently provided. Finally, though instructors feel
that the cultural setting conveys the institutional value of excellence, the value was not clearly
defined, and the organization does not strongly promote a collective sense of community where
instructors can discuss, share, and build mutual understandings of excellence. Though instructors
feel a connection to the organization, they do not have many opportunities to connect with one
another and build a shared community of instructors.
Table 9
Summary of Results and Findings for Assumed Organizational Causes

Organizational Problem: Instructors are not participating in Instructor Development Program (IDP) workshops and are not actively working to improve instructional quality.

Assumed Cause (Cultural Setting: Accountability): There is no consistent accountability for low evaluation scores.
Result: Partially Validated.
Explanation: Survey: 40% agree or strongly agree and 32% somewhat agree that there are clear consequences for low evaluation scores, but 28% are on the disagree side. Interviews: None of the interviewees mentioned accountability to the institution; they only discussed their accountability to students.

Assumed Cause (Cultural Setting: Incentives): There are no consequences or incentives for IDP participation and/or improved instructional quality.
Result: Partially Validated.
Explanation: Survey: 49% somewhat disagree to strongly disagree, while 51% somewhat to strongly agree, that there are institutional incentives to improve course quality. Interviews: 7 of 9 interviewees participated in IDP, none citing institutional incentives as reasons for participation.

Organizational Problem: University Extension provides limited instructional support.

Assumed Cause (Cultural Setting: Resources): UNEX program reps are not adequately trained to provide instructional support. The Office of Instructional Enhancement (OIE) is focused mainly on the technical aspects of online course delivery, not on overall instructional support. The IDP does not offer workshops in a variety of formats that are adaptable to instructor schedules.
Result: Partially Validated.
Explanation: Survey: 40% agree or strongly agree that program reps provide support that helps improve course quality; 42% agree or strongly agree that the instructional support from OIE helps them improve; 29% agree or strongly agree that IDP workshops are conveniently scheduled. Interviews: 6 of 9 instructors recognized that program reps provide administrative support, but did not cite much instructional support; 5 of 9 interviewees said the IDP programs were somewhat convenient, and would make the time to attend if needed.
Table 9, continued

Organizational Problem: Instructors do not have a shared understanding of what teaching and learning “excellence” means at University Extension.

Assumed Cause (Cultural Setting: Communication): The organizational vision, mission, and values are not effectively embedded in the organizational culture. (Cultural Model: Communication): There is a lack of instructor community for instructors to discuss and share their understandings of teaching and learning excellence.
Result: Partially Validated.
Explanation: Survey: 54% agree or strongly agree that the mission is reflected in the organizational culture; 62% agree or strongly agree that they feel a part of the UNEX community; only 24% agree or strongly agree that they have opportunities to connect with other instructors. Interviews: 6 interviewees felt there were limited opportunities to get to know other instructors and expressed desire for that opportunity.
Summary
The survey results, interview findings, and document analysis validated some of the
assumed causes of the knowledge, motivation, and organizational gaps that exist for instructors
in creating learning experiences for students, and did not validate others. The results and
findings also revealed nuances in some of the assumed causes, as evidenced by the number of
partially validated causes.
Through experience, the instructors have learned how to teach adult learners, and they
know how to gauge whether or not their efforts are successful, but the data revealed that there is
a conceptual and procedural knowledge gap for instructors. Though they know the language of
curriculum components, instructors are not conceptually knowledgeable about how those
components relate to one another. Furthermore, the knowledge about the course components is
not being applied as evidenced by the document analysis of course syllabi. The conceptual and
procedural knowledge gaps require solutions that will be explored in the next chapter.
The data did not validate any of the assumed motivation causes; however, the data did
reveal some nuances that are worthy of exploration. The survey results and interview findings
validated the assumption that instructors have a high sense of self-efficacy, but despite the high
self-efficacy, instructors choose to invest time to improve their courses. The desire to engage
students and the joy of teaching seem to be the factors that motivate instructors to invest time to
continually work to improve their courses. The data also affirms that instructors find value in the
course evaluations and the IDP workshops, but how the instructors actually use the information
from the course evaluations and IDP workshops was not confirmed. Overall, the data revealed
that instructors are actively choosing to invest time and are making the effort to utilize course
evaluations and IDP information to improve their courses. The data suggested that instructors
value the feedback they receive from students, and attribute student satisfaction to the effort they
put into preparing their courses. The challenge will be channeling their motivation to continually
work at improving their courses.
The data partially validated all of the assumed organizational gaps. The data suggested
that though instructors feel that they are accountable for low evaluation scores, it is not because
they are accountable to the institution, but rather because the instructors feel responsible to the
students. The instructors want students to be engaged and to succeed, so the accountability they
feel is to the students. The data also highlighted the lack of clear institutional incentives for
improving courses. The instructors were evenly split between those who thought they were
incentivized and those who did not. The organizational gaps
around the lack of clarity about accountability and incentives are opportunities for the institution
to implement changes to gain clarity for instructors.
Additionally, the assumed organizational gaps about institutional resources for
instructional support were also partially validated. The data affirmed that though the program
reps, OIE, and the IDP offer support, the support is uneven, and does not fully meet the needs of
instructors. Finally, the data suggested that though the cultural model of the organization
communicates a sense of community, there are not as many opportunities for instructors to
connect with one another to share understandings of excellence and build a vibrant instructor
community. Table 10 summarizes the findings.
The next chapter will present potential solutions for the validated causes of the
knowledge and organizational gaps. Though the data did not validate the assumed motivation
gaps, the solutions described in the next chapter have been designed to also encourage the
intrinsic motivation of instructors because the finding about lack of factual knowledge
application potentially points to an unconfirmed motivation gap. Also, instructor motivation is
important for the success of solution implementation. All of the solutions that will be offered in
the next chapter are evidence-based and derived from relevant research and case studies.
Table 10
Assumed Causes and Finding Summary Table

Knowledge (Factual): Instructors do not know the basics/components of course design.
Finding: Mostly Not Validated. Instructors possess sufficient factual knowledge about the components of curriculum design, but that factual knowledge is not evident in application.

Knowledge (Conceptual): Instructors do not understand the integrated relationship between learning objectives, key assessments, and teaching and learning strategies. (Procedural): Instructors do not know how to align learning goals with assessments and teaching and learning strategies.
Finding: Validated. The lack of evidence of factual knowledge application indicates insufficient conceptual and procedural knowledge about applying curriculum components in their course syllabi.

Knowledge (Conceptual): Instructors are unable to identify appropriate teaching and learning strategies for adult learners.
Finding: Not Validated. Instructors utilize appropriate teaching strategies for adult learners.

Knowledge (Metacognitive): Instructors do not know how to assess the effectiveness of their course design.
Finding: Not Validated. Instructors demonstrated that they possess the metacognitive knowledge needed to reflect on the effectiveness of courses based on student learning.

Motivation (Self-Efficacy): High sense of self-efficacy, leading to underestimation of needed time and ability. (Cost Value): Cost value of spending time to develop/update curriculum is low.
Finding: Not Validated. Despite having high self-efficacy, instructors were choosing to spend time because they found value in the process.

Motivation (Utility Value): Lack of consequences for low evaluation scores decreases the value of applying new teaching information. There is little utility value in implementing new teaching strategies.
Finding: Mostly Not Validated. Instructors find value in the feedback from course evaluations and IDP workshops, but actual use of the information to improve courses was not confirmed.
Table 10, continued

Motivation (Attribution): Instructors attribute students’ dissatisfaction with class to the challenges instructors face with teaching in a diverse adult learning setting.
Finding: Mostly Not Validated. Instructors take responsibility for their role in affecting student satisfaction, but nuanced interview findings suggest potential attribution to other factors.

Organization (Cultural Setting: Accountability): There is no consistent accountability for low evaluation scores.
Finding: Partially Validated. The organizational culture does not foster a sense of accountability to the institution, but instructors feel personal accountability to students.

Organization (Cultural Setting: Incentives): There are no consequences or incentives for IDP participation and/or improved instructional quality.
Finding: Partially Validated. Instructors are unclear about the existence of institutional incentives for course improvement.

Organization (Cultural Setting: Resources): UNEX program reps are not adequately trained to provide instructional support. The Office of Instructional Enhancement (OIE) is focused mainly on the technical aspects of online course delivery, not on overall instructional support. The IDP does not offer workshops in a variety of formats that are adaptable to instructor schedules.
Finding: Partially Validated. There are limited and inconsistent resources for instructional support.

Organization (Cultural Setting: Communication): The organizational vision, mission, and values are not effectively embedded in the organizational culture. (Cultural Model: Communication): There is a lack of instructor community for instructors to discuss and share their understandings of teaching and learning excellence.
Finding: Partially Validated. The organizational culture moderately promotes the value of excellence through shared instructor community.
CHAPTER 5
SOLUTIONS, IMPLEMENTATION AND EVALUATION
The new mission of University Extension directs the institution to focus on creating
excellent learning experiences. One of the first steps in operationalizing this mission requires that
courses be designed to produce learning experiences that are excellent for students. A total of 17
assumed causes thought to be contributing to the challenges instructors face in designing courses
that produce excellent learning experiences were identified and investigated through the gap
analysis framework (Clark & Estes, 2008). The previous chapter presented findings that
validated or partially validated nine of these causes. The purpose of this chapter is to offer
solutions that address all of these validated causes.
The gap analysis framework isolates the categories of knowledge, motivation, and
organization in order to clearly identify the root causes of a problem. However, the factors of
knowledge, motivation, and organization do not operate in isolation. The interconnected nature
of the categories allows for solutions that address multiple causes of performance gaps. The
proposed solutions are organized in cascading order of impact and importance, with each
proposed solution reinforced by the next. Though each solution will have specific, anticipated
outcomes, overlap in deployment is expected. Following the discussion of proposed solutions, an
implementation and evaluation plan will be proposed.
Validated Causes Selection and Rationale
The data confirmed that knowledge and organizational barriers exist, impeding
instructors’ abilities to create excellent learning experiences for students. The data validated or
partially validated nine of the assumed causes, which are summarized in Table 11. Five solutions
will be proposed in this chapter that will address all of these nine causes. The solutions offered
integrate the categories of knowledge, motivation, and organization; therefore, the
solutions will address all validated and partially validated knowledge and organizational causes,
as well as provide suggestions to maintain instructor motivation even though no motivation
barriers were confirmed. Also, all of the partially validated causes are organizational gaps, and
because the organization is central to all the proposed solutions, the partially validated causes
must be addressed.
Table 11
Validated Causes Summary Table

Knowledge
Conceptual Knowledge: Instructors do not understand the integrated relationship between learning objectives, key assessments, and teaching and learning strategies.
Procedural Knowledge: Instructors do not know how to align learning goals with assessments and teaching and learning strategies.

Organization
Cultural Setting: Accountability. There is no consistent accountability for low evaluation scores.
Cultural Setting: Incentives. There are no consequences or incentives for IDP participation and/or improved instructional quality.
Cultural Setting: Resources. UNEX program reps are not adequately trained to provide instructional support. The Office of Instructional Enhancement (OIE) is focused mainly on the technical aspects of online course delivery, not on overall instructional support. The IDP does not offer workshops in a variety of formats that are adaptable to instructor schedules.
Cultural Setting: Communication. The organizational vision, mission, and values are not effectively embedded in the organizational culture.
Cultural Model: Communication. There is a lack of instructor community for instructors to discuss and share their understandings of teaching and learning excellence.
Solutions
The underlying theme of the proposed solutions is that the institution should incorporate
activities of a learning organization that supports professional learning. A learning organization
is one that systematically problem solves by engaging in activities of learning and reflection for
the purpose of continuous improvement (Dill, 1999). Hoekstra and Crocker (2015) define
professional learning as “engaging in activities that lead to improved professional practice or the
capacity to behave in improved ways” (p. 62). They also point out that such learning can take
different forms in that it “may be intentional or unintentional and conscious or beyond the
learner’s awareness and involve formal and/or informal learning activities” (Hoekstra & Crocker,
2015, p. 62). Saroyan and Trigwell (2015) underscore the importance of professional learning by
describing it as “a fundamental cornerstone of dynamic learning organizations” (p. 92). They
further the argument for learning organizations by stating “organizational learning is an effective
way to address the need for effective teaching” (p. 92). By incorporating practices that support
the evolution of University Extension into a learning organization, the institution is signaling to
instructors its commitment to excellence and to supporting instructors in their
continuous effort to create excellent learning experiences for students (Dill, 1999; Saroyan &
Trigwell, 2015).
Therefore, at the center of the solutions is the role of the institution. Though instructors
are the key stakeholders, the cultural settings and models within the organization need to support
the ability of instructors to attain the knowledge they need in order to produce course curricula
that achieve the expected learning experiences. Additionally, the organizational conditions need
to exist to motivate instructors to actively choose, expend the effort, and persist in continually
improving their courses. Though none of the assumed motivation causes were validated, the
proposed solutions will impact motivation as well because of the interconnected nature of the
knowledge, motivation and organizational categories. The five proposed solutions are multi-
layered, with each affecting the other in some way and directly addressing some causes and
indirectly addressing other causes.
The proposed solutions are all, in some way, related to a facet of a learning organization.
There are six activities that provide the framework for a learning organization: exploring new
knowledge through systematic problem solving, utilizing personal experience for learning,
learning from others, experimenting with new processes, sharing knowledge within the
organization, and assessing learning (Dill, 1999).
Though the solutions do not fully follow the framework, they embody aspects of the six
activities, which move University Extension closer to behaving like a learning organization. The
solutions are organized in a cascading order of impact, with one predicated on the other: (1)
define and establish standards of excellence, (2) allocate resources to establish a comprehensive
instructor development program, (3) establish instructor learning communities, (4) establish
measures of excellence to emphasize value of instructional excellence, and (5) establish a clear
system of rewards and incentives.
Define and Establish Standards of Excellence
In order to ensure that instructors are creating excellent learning experiences, University
Extension needs to first define “excellence” for the University Extension context, and then set
standards of excellence. “Excellence” is an abstract term that does not have an agreed upon
definition in higher education, and the terms of excellence will vary depending on the purpose
and context of the specific situation (Hill et al., 2003; Krause, 2012; Little & Locke, 2011). In
order to establish standards of excellence, University Extension will need to consider its
stakeholders and overall purpose when determining what excellence means for the institution. As
a continuing education institution, University Extension provides courses and programs for
personal and professional development to an audience of nontraditional students. The notion of
excellence for University Extension should take into consideration that the role and purpose of
continuing education is to provide educational opportunities that will help students remain
relevant and current in their professional field (Cervero, 2001; Lauzon, 2013).
Though the term “excellence” can be used in an aspirational sense, it can also be used to
describe the experience of high quality programs (Little & Locke, 2011). In terms of producing
quality learning outcomes, Hénard and Roseveare (2012), in their report recommending policies
and practices to foster quality in higher education, identify teaching quality as the catalyst for
improving the quality of learning experiences for students. Because University Extension’s
mission is to ensure that students have excellent learning experiences, the concept of excellence
should focus on quality teaching. Gunn and Fisk (2013), in their exploration of the practical
elements of teacher excellence, provide a framework that categorizes the various aspects of
teaching which impact the learning experience. The categories include curriculum design,
knowledge of the subject, ability to inspire and motivate, respect and care for students as
individuals, active and group learning, critical and scholarly, and engagement in assessment
(Gunn & Fisk, 2013). University Extension can use these categories as a starting point to define excellence and establish standards for the University Extension context.
Allocate Resources to Establish a Comprehensive Instructor Development Program (IDP)
In order to achieve the institutional mission of creating learning experiences that are
impactful and meaningful to students, the institution needs to prioritize the value of excellence in
teaching and learning by allocating the necessary resources to support instructors’ professional
development. The findings from the survey and interviews affirmed that the limited and
inconsistent resources for instructional support affect instructors’ abilities to consistently create
courses that produce learning experiences that meet institutional expectations of excellence. The
limited and inconsistent investment in instructional support does not signal to instructors that the
institution values excellence in teaching (Austin & Sorcinelli, 2013; Dill, 1999; Tarr, 2010).
Austin and Sorcinelli (2013), in their exploration of faculty development in higher education,
note that a key, strategic element for institutional excellence is faculty development because
faculty directly delivers education to students and must be able to teach and adapt to the
expectations of the complex postsecondary environment. By establishing a more comprehensive
IDP, the institution can demonstrate its commitment to instructional excellence, and also provide
more opportunities for instructors to close the knowledge gaps.
In their analysis of literature about professional learning in higher education, Saroyan and
Trigwell (2015) identify the role of institutions in supporting efforts to improve teaching as a
common theme across many contexts. The efforts to improve teaching are bolstered when the
institution invests in professional learning for instructors. In his review of academic learning
organizations, Dill (1999) identifies a key relationship between academic quality and
organizational structure. If the organization is not designed to support professional learning, the
quality of teaching will not improve (Dill, 1999). A restructured IDP can support systemic
problem solving, which is the first activity of a learning organization, by being designed to work
directly with program units and instructors in order to collaboratively explore problems that arise as they relate to teaching and learning (Dill, 1999). Current staffing can be reorganized in order to provide the IDP with additional human resources to help facilitate workshops and instructor learning communities, and to provide individual assistance to instructors as needed. Additionally,
the IDP should offer program staff training on how to advise instructors on classroom management issues, as well as how to recognize when instructors should be referred to the IDP for further help.
The investment and allocation of resources is the first step in building a comprehensive
IDP. A comprehensive IDP would provide more robust, on-going support that meets the needs of
instructors. Most instructors at University Extension are part-time, contract instructors. Tarr
(2010) describes how institutions should work with part-time faculty, and affirms that investing
in the development of part-time faculty and demonstrating that the institution values their
contributions can increase teaching effectiveness. Without investment in instructor development, instructors do not sense the priority the institution places on academic excellence.
Lyons (2007) identifies five basic needs of part-time instructors, which include: (1) a
thorough orientation to the institution, its culture and practices, (2) adequate training in
fundamental teaching and classroom management skills, (3) a sense of belonging to the
institution, (4) both initial and ongoing professional development, (5) recognition for quality
work that is perceived as appropriate and adequate. The current IDP offers an in-depth
orientation, as well as foundational courses about course design, assessment and evaluation, and
integrating digital tools. However, there are no follow-up courses to these foundational
workshops, or consistent, on-going professional development.
In order to fully support instructors’ development, and address the conceptual and
procedural knowledge gaps, the IDP must go beyond offering a handful of workshops that are
catered only to new instructors. Additional professional development opportunities should be
developed to continually offer instructors opportunities to advance their teaching skills (Lyons,
2007; Tarr, 2010). Also, the current half-day workshops should be expanded into multi-session workshops that give instructors ample opportunities to practice and apply the knowledge they have learned, and they should be offered in various formats at multiple times so that instructors have more chances to participate (Tarr, 2010). Ambrose et al. (2010) suggest that, to develop mastery, learners must acquire component skills, practice integrating them, and know when to apply what they have learned. IDP workshops should therefore be informed by adult learning theories, such as andragogy, and should model the strategies that are most effective for adult learners, giving instructors multiple opportunities to practice and apply the acquired component skills so that they can develop mastery of the lesson (Ambrose et al., 2010; Merriam & Bierema, 2013).
In addition to providing workshops that reflect adult learning theory and implement best
practices for adult learners, the IDP should support professional development by reflecting
strategies that will engage instructors and encourage them to continually participate in IDP to
improve the quality of their teaching. These strategies include: (1) utilizing instructor input to develop new professional development opportunities; (2) using instructors' experiences as learning opportunities; (3) implementing a practical, applied approach to learning rather than one that is overly theoretical and philosophical; (4) encouraging and facilitating peer sharing for problem solving; and (5) increasing opportunities to participate in the IDP by offering alternative formats and options (Ambrose et al., 2010; Fink, 2013; Knowles, 1980; Merriam & Bierema, 2013). The investment in a comprehensive IDP will not only signal UNEX's commitment to excellence and address
the challenge of limited instructor support, but will also provide instructors opportunities to
bridge knowledge gaps.
Establish Instructor Learning Communities
The results and findings revealed that although instructors have a grasp of the vocabulary of course design, there appears to be a gap in conceptual and procedural knowledge when it comes to actually putting together their courses. In addition to establishing a more
comprehensive IDP and developing additional IDP workshops, learning communities for
instructors should be created to encourage peer networking, stimulate teaching scholarship,
develop mentorship opportunities, and promote community building (Cox, 2004; Dill, 1999;
Holmes & Kozlowski, 2014; Taylor & Znajda, 2015). Learning communities would address the
findings about instructors not possessing sufficient conceptual and procedural knowledge about
curriculum components, as well as the gap in an organizational culture that only moderately supports a shared instructor community at UNEX. Learning communities would also
promote the learning organization activities of learning from personal experience, learning from
others, and trying new approaches (Dill, 1999). Instructors will have an opportunity to reflect
upon their own teaching practices, share learning from their own practice with their peers, and
the groups could potentially be organized around a topic that explores new strategies and
practices for teaching and learning.
Learning communities should consist of six to fifteen members, and could be created as
cohort-based or topic-based groups (Cox, 2004). New instructors to University Extension can be
placed into cohort-based groups to help them integrate into the University Extension instructor
community, whereas current instructors could organize themselves into topic-based groups.
Instructors can propose topic-based groups to University Extension administration, who would
then provide the coordination effort to form the group. The topics could revolve around
discipline-specific issues, or could be broader to address wider, institutional challenges in
teaching and learning. The length of time that each group exists will depend on how long it takes
to satisfactorily address the topic or professional development need of the group. The IDP can
provide oversight and facilitation of learning community groups.
Forming learning communities can help to address knowledge barriers to creating
excellent learning experiences. Taylor and Znajda (2015), in their case study of a professional
learning initiative in a Canadian higher education institution, found that instructors experience a
gap between their espoused beliefs about how to be learning-centered teachers and their actual
practices of teaching. The data suggest that University Extension instructors may experience a similar type of gap between their factual knowledge and their application of that knowledge. In the case that Taylor and Znajda examined, the educational development
opportunity designed as a community of learning helped instructors to bridge that gap between
how they thought they should teach and how they actually taught. Additionally, the case study
found that the intentional use of alignment practices in the learning community helped teaching
effectiveness increase when instructors learned to better align course content, learning outcomes,
learning activities, and evaluation strategies (Taylor & Znajda, 2015). In their reflection about
the outcomes of the learning community, Taylor and Znajda found that the peer support and
feedback was key to individual learning, as was the opportunity to problem-solve with other
faculty from different disciplines.
Cox (2004) describes how faculty learning communities are a method to transform an
institution into a learning organization. Using his experience at Miami University, Cox describes
how faculty participation in learning communities positively impacted change in student
learning. Cox posits that learning communities are needed in higher education to help faculty
meet the challenges of teaching and to overcome the sense of isolation that faculty members feel
in higher education. Faculty learning communities at Miami University provided faculty
members opportunities to collaborate across disciplines and to become more effective at
increasing student learning (Cox, 2004).
The instructor learning communities will serve as a vehicle to help bridge the knowledge
gap for instructors, and it will also help bridge the organizational gap of not having a shared
instructor community (Cox, 2004; Dill, 1999; Holmes & Kozlowski, 2014; Taylor & Znajda,
2015). Holmes and Kozlowski (2014) explore the experiences of faculty who participated in a
research learning community and found that in addition to the expected outcomes found in other
studies, such as the faculty feeling more supported, having a better sense of professional identity,
and increasing their knowledge and skills base, the faculty in their study also felt the learning
community improved their collegiality and combated the problem of feeling like there was a lack
of community. Dill (1999) also found that quality improvements in curriculum development were less a function of individuals making adjustments in their own work than an outcome of the collective collaboration of the members of the learning community involved in a complex process. Holmes and Kozlowski (2014) also found that the learning community
instilled a sense of accountability amongst the participants, motivating them to work towards
meeting community and university expectations. Indirectly, the learning community can also
address the organizational gap of accountability by reinforcing the need for accountability to the
institution.
Establish Measures of Excellence to Reinforce Institutional Accountability
The last activity of a learning organization is to measure learning (Dill, 1999). Dill
describes measuring learning as “developing indicators of organizational performance, which
can be used to evaluate whether learning is actually occurring” (p. 143). University Extension
currently only uses end-of-term student evaluations to assess the quality of instructors and
courses, and it is the only tool used for instructor accountability. The results and findings from
the survey and interview revealed that though instructors are accountable to students, they do not
necessarily feel accountable to the institution because there is a lack of institutional enforcement
of accountability as evidenced by the inconsistent consequences for low evaluation scores. There
is also a lack of additional measures to provide administrators and instructors data about whether
or not the courses are producing excellent learning experiences for students. The lack of institutional consequences creates unclear expectations, and the lack of additional measures does not give instructors benchmarks to compare against and work from. In order to bridge the organizational
gap of accountability, the institution should emphasize the value of instructional excellence and
implement processes that reiterate and reinforce the expectations and value of the institution so
that it becomes embedded into the organizational culture.
To address the issue of unclear institutional consequences for low evaluation scores,
administrators should conduct annual reviews of each instructor, providing an opportunity to communicate and reiterate institutional expectations for instructional excellence (Hoekstra &
Crocker, 2015). The annual review should be an opportunity for reflective feedback, integrating
external feedback into the personal reflections of the instructor about their own teaching
practices (Hoekstra & Crocker, 2015; Saroyan & Trigwell, 2015). Hoekstra and Crocker (2015)
suggest using personal development plans to give instructors control over their own learning
goals and future growth. Having an annual review process will provide instructors with a sense
of accountability to institutional expectations.
An investment into establishing more robust measurement tools, as well as more
professional learning opportunities, will help to signal the institution’s commitment to ensuring
excellence by providing instructors with opportunities to continuously assess and improve
teaching. Assessments and evaluations are key components to professional learning (Dill, 1999).
UNEX currently utilizes the end-of-term student evaluation as the key measurement of academic
quality and student experience. The student evaluation produces an instructor score and a course
score, but the scores do not provide much insight into the actual student experience and teaching
effectiveness. Dill (1999) suggests that additional questions that explore the student experience
be included, such as asking how much time students spent on coursework, their class
participation, interest in learning, exam results and degree of personal responsibility. The results
of these student-related indicators provide a baseline to observe change when revisions to
curriculum and course design are implemented (Dill, 1999). Additionally, other measures and
indicators need to be used to assess the effectiveness of any educational development
intervention. Some additional measures that can be used include student demand for
courses/programs, certificate completion rates, time to completion and student graduate
placement/performance in the labor market (Dill, 1999).
By establishing clear measures, the institution not only devises a method to determine the
effectiveness of professional learning opportunities, but also gives instructors a clearer
understanding of the key performance indicators that they should be working towards. With
specific goals in mind, instructors know what the institution expects and can be held accountable
to those measures.
Establish System of Rewards and Incentives
In their analysis of professional learning, Saroyan and Trigwell (2015) found that
instructors need to have internal or external motivation to prompt them to choose to participate,
exert effort and persist in professional learning. With clear measures of excellence and enforced
accountability, University Extension should establish a system of rewards and incentives to
encourage instructors to continually improve the courses they teach. The survey and interview
findings revealed that instructors were unclear about institutional incentives to improve teaching.
Currently, UNEX does not have a consistent policy for incentivizing and rewarding instructors.
There is an annual distinguished instructor award, but it is not well publicized, and instructors
are not aware of the measures used to determine recipients of the award.
One of the needs of part-time instructors as described by Lyons (2007) is that they need
to be recognized for the quality of their work. By establishing a more robust teaching awards
program, the institution can signal to instructors that they value the contributions of the instructor
(Tarr, 2010). The current distinguished instructor award only recognizes two instructors per
academic department, for a total of twelve recipients annually. Each program director can
nominate an instructor, but aside from a required length of service, there are no other objective
measures that are used in the nomination. The current program should be expanded and tiered to
recognize instructors who have met a baseline expectation that includes additional objective
criteria, such as minimum evaluation scores. The program should build to the top prize of being
named a distinguished instructor, which would also include objective criteria such as earning top
evaluations for a set number of years or a number of student nominations. Currently, recognition of the distinguished instructors occurs at a small ceremony attended only by the recipients, their guests, and UNEX staff. This recognition should instead occur at an annual instructor event where the distinguished instructors can be recognized in front of a larger group of their peers.
Though the data suggest that instructors derive personal satisfaction from teaching and are intrinsically motivated, an institution demonstrates its commitment to professional learning by investing in resources, and recognizing instructors can provide the extra extrinsic motivation they need to continually improve (Saroyan & Trigwell, 2015). Saroyan and Trigwell also found that rewards and incentives help provide the extrinsic motivation for instructors to prioritize professional learning.
Restructuring the compensation model by providing bonuses or salary increases based on participation in specific professional development opportunities and on improvements in teaching is one strategy to incentivize instructors (Tarr, 2010). UNEX can reinforce intrinsic motivation with recognition and provide extrinsic motivation for instructors to continually participate in professional learning opportunities to improve the quality of their teaching. Having a clear rewards and incentive
system that recognizes effective and quality teaching will address the organizational gap of
unclear institutional incentives to improve teaching quality (Lyons, 2007; Tarr, 2010).
Implementation Plan
The gap analysis framework provided a methodology to delve into the root causes of
knowledge, motivation, and organizational gaps, and though each category was analyzed
separately, the solutions presented are integrated and linked. The solutions provided work
together to address the validated causes, and are connected by an overarching theme. The
proposed solutions all support the evolution of University Extension into a learning organization.
With the implementation of each solution, the organization will take one step closer to becoming
an institution that inculcates into its culture the practice of learning for continuous improvement.
Table 12 provides a summary of solutions with their implementation.
The first step in the implementation process is to define and establish standards of
excellence for University Extension. A committee representing the main stakeholders for
University Extension should be formed in order to explore what “excellence” means in the
University Extension context. After the definition and standards are established, they should be
widely shared with all institutional stakeholders and continually communicated so that the
definition and standards are incorporated into institutional practices, and embedded in
institutional culture.
In order to address the problem of limited and inconsistent resources for instructional
support, the solution is to allocate resources to establish a comprehensive IDP. Implementing this
solution will require the reorganization of existing staff in the Office of Instructional Enhancement in order to dedicate more staff to the IDP. After the IDP is more fully staffed, workshop
offerings need to be adapted to more accessible formats, and workshops should be redesigned to
allow for more opportunities for instructors to practice applying skills. The IDP also needs to
implement a process of surveying instructional development needs to inform the development of
new professional development opportunities. To ensure that all staff who support instructors are appropriately trained, the IDP will also need to develop training for program staff focused on how to provide individualized support to instructors and on recognizing when instructors should be referred to the IDP for more comprehensive support.
After the IDP is reorganized and fully staffed, the organization can then turn to
establishing instructor learning communities. IDP and program staff will need to be trained on
how to coordinate and facilitate learning groups, and guidelines should be developed to provide
structure to learning communities. Once guidelines are established, stakeholders should be
surveyed for potential learning group topics, and participants recruited to join the communities.
Each learning community should expect to report on its experiences to administration during an annual review process.
After excellence is defined for University Extension, and as the IDP and learning communities are being established, standards need to be developed to help inform the types of measures that should be used. Additionally, in order to embed a culture of accountability, the
standards of excellence need to be communicated to instructors and they should be encouraged to
use appropriate assessments to measure learning. Instructors should also receive consistent
communications about the expectation of participation in an annual review with their respective
program director.
Finally, after standards and measures of excellence are established, a new system of
rewards and incentives can be implemented. A committee should be formed to review the
current Distinguished Instructor Awards and restructure the program to be more inclusive, and to
have more objective criteria based on the new measures of excellence. University Extension also
needs to reexamine the pay structure and align it to the measures of excellence in order to incentivize participation in the IDP and reward excellent teaching. Table 12 provides a summary
of the solutions and implementation plan, and shows how the solutions support the activities of a
learning organization.
Table 12

Findings, Solutions, Implementation, and Learning Organization Activity

Finding (Organization): The organizational culture only moderately promotes the value of excellence through a shared instructor community
Solution: Define and establish standards of excellence
Implementation: Establish an institutional committee to clearly define excellence for University Extension; set standards of excellence and communicate the defined standards widely to institutional stakeholders; continually reinforce standards by incorporating them into institutional practices in order to embed them in organizational culture
Learning Organization Activity: n/a

Finding (Organization): Limited and inconsistent resources for instructional support
Solution: Allocate resources to establish a comprehensive instructor development program
Implementation: Reorganize the Office of Instructional Enhancement to provide more staff support to the IDP; adapt current IDP offerings into online and other alternative delivery formats, offering workshops on a monthly basis; revise the current IDP workshop curriculum to allow more opportunities to integrate and apply component skills; based on a survey of instructional needs, develop new IDP workshops with the goal of offering at least one new workshop per year; develop an IDP workshop for program reps and other staff who work with instructors
Learning Organization Activity: Systemic problem solving; trying new approaches; transferring knowledge

Findings (Knowledge; Organization): Insufficient conceptual and procedural knowledge about curriculum components; the organizational culture only moderately promotes the value of excellence through a shared instructor community
Solution: Establish instructor learning communities and mentorships
Implementation: Train IDP and program unit staff on coordinating and facilitating instructor learning communities; develop expectations and guidelines for how instructor learning communities should be formed and managed, and mandate regular meetings; identify potential topics and recruit for voluntary participation amongst current instructors; strongly encourage new instructors to join learning communities; conduct an annual review of learning communities by participants and administration
Learning Organization Activity: Learning from own experience; learning from others' experience; trying new approaches

Finding (Organization): Lack of consequences for low evaluation scores does not enforce accountability to the institution
Solution: Establish measures of excellence to reinforce institutional accountability
Implementation: Develop new measurement tools to assess and evaluate the student learning experience, using the defined standards of excellence; communicate to instructors about, and encourage instructors to use, appropriate assessments to measure learning in their classes; require Program Directors to file an annual review for instructors, and use learning analytics to discuss instructor performance
Learning Organization Activity: Measuring learning

Finding (Organization): Unclear institutional incentives to improve teaching quality
Solution: Establish a system of rewards and incentives
Implementation: Establish an institutional committee to restructure the current Distinguished Instructor Award to include more objective criteria for nomination and a tiered structure of recognition; restructure the pay model to incentivize participation in the IDP and reward excellent teaching
Learning Organization Activity: n/a
Evaluation Plan
After implementation, the proposed solutions should be evaluated to determine if they
were effective in closing the knowledge, motivation, and organizational performance gaps.
Kirkpatrick’s (2006) four levels of evaluation will form the basis of the evaluation plan. Level
one measures reactions, which is essentially how participants like a particular program
(Kirkpatrick, 2006). Level two measures learning, or the impact of the program on participants
and what principles, facts, and techniques they understood (Clark & Estes, 2008; Kirkpatrick,
2006). The third level measures behavior, going beyond reaction and learning to see if the
participant is utilizing and applying what they have learned in the training, and whether the gains
made are lasting (Clark & Estes, 2008; Kirkpatrick, 2006). The final level measures results,
which ultimately goes back to the original goal and gap, seeing if the solutions achieved the
desired results (Kirkpatrick, 2006).
Level 1: Reactions
The first level of evaluation will measure how instructors respond to change. If UNEX is
successful in implementing the solutions to develop into a learning organization that supports
professional learning, the expected reaction is that instructors will notice that instructional
support and professional learning opportunities are more available, and that there will be more
engagement with the IDP. To evaluate this change, specific elements need to be identified as
useful to measure, and feedback should be collected with a simple instrument (Kirkpatrick,
2006). Immediate, honest feedback is critical to this level of evaluation (Kirkpatrick, 2006), so
the surveys should be distributed directly following the professional learning opportunity.
After instructors experience new instructional support offered by the institution, they
should be asked to complete a simple survey. The survey should utilize questions that are
answered on a continuous scale, such as a 5-point Likert scale of strongly agree to strongly
disagree, so that the results can be quantified (Kirkpatrick, 2006). The questions should elicit feedback on the effectiveness of the instructional support or professional learning opportunity, whether it is a new IDP workshop or a learning community experience. Sample questions include: "I learned a
new skill to help me become a more effective instructor” and “This program helped me learn
how to create better learning experiences for my students.”
The results of the survey should be reviewed and a baseline score established for the
continuous scale questions. If scores do not meet the established baseline, then considerations
need to be made to change the next intervention (Kirkpatrick, 2006).
Level 2: Learning
The second level of evaluation will measure how much instructors have learned, and
whether or not there is an observable change in skill, knowledge, or attitude (Kirkpatrick, 2006).
For instructors, the level two evaluation of measuring learning would demonstrate the knowledge
instructors have gained during the professional learning opportunities.
A pre-post survey would be recommended as the level two measurement tool
(Kirkpatrick, 2006). Prior to the professional learning opportunity, a pre-survey should be
administered. The purpose of this pre-survey is to establish the baseline knowledge level that can
be compared to the knowledge gained through the learning opportunity. Questions should probe
instructors’ prior knowledge about the particular topic that will be discussed or presented during
the professional learning opportunity. The questions should be designed to provide instructors
opportunities to demonstrate knowledge, such as asking short-answer questions relevant to the
topic being covered in the learning opportunity. The questions might also be short scenarios that
ask instructors to solve a problem.
The post-meeting survey should mirror the questions asked in the pre-meeting survey
(Kirkpatrick, 2006). This will allow for direct comparison of the impact of the meeting on
instructors’ knowledge about effective methods to produce excellent learning experiences. If
scenario questions are asked in the pre-survey, similar scenarios should be included in the post-
survey so that learning can be measured. The post-meeting survey should also include an open
question asking how the meeting could be improved. If instructors appear not to have learned
during the meeting, then the responses to that open question should be carefully reviewed and
adjustments made based on the suggestions for improvement.
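The pre/post comparison at this level amounts to computing a gain score per instructor. A minimal sketch follows, with hypothetical instructor names and a hypothetical 0-10 scoring rubric; it is illustrative only and not part of the study.

```python
# Illustrative sketch: measuring Level 2 learning as the gain between
# mirrored pre- and post-meeting survey scores. Names and the 0-10
# scoring rubric are hypothetical.

pre_scores = {"instr_a": 4, "instr_b": 6, "instr_c": 5}
post_scores = {"instr_a": 7, "instr_b": 6, "instr_c": 9}

# Gain per instructor: post-meeting score minus pre-meeting score.
gains = {name: post_scores[name] - pre_scores[name] for name in pre_scores}
learned = [name for name, gain in gains.items() if gain > 0]

print("gains:", gains)
print("showed learning:", learned)
# If few instructors show gains, review the open-ended improvement
# suggestions and adjust the next session accordingly.
```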
Level 3: Behavior
The expected outcome at the behavior level is that instructors will develop a shared
understanding of “excellence” and incorporate the strategies to create courses that produce
excellent learning experiences into their classes. Assuming that the proposed solutions are
effective, the level three evaluations would indicate that instructors are actively engaging with
the institution and with one another about how learning experiences are achieved in their classes.
They should also be proactive in seeking resources to improve their instructional quality in order
to achieve “excellence.”
Level three evaluation measurements will need to occur over time. Behavior does not
always immediately change, so evaluations of behavior should be done in intervals over a period
of time (Kirkpatrick, 2006). A suggested time frame would be an assessment every three months
for a period of one year. The same questions asked in the level two evaluation can be used to
assess whether or not the program had lasting impact (Clark & Estes, 2008); however, additional
questions should include “What new strategies that you have learned in these meetings have you
incorporated in your courses?” These additional questions should elicit a more detailed picture of
how proactive or not the instructors have been in taking steps to improve their instructional
quality. Also, the instructors’ syllabi and course materials should be reviewed every quarter they
teach during the span of a year to see if changes are being made. Sustaining the evaluation over a
year will allow for comparisons over time to see if the behavior is lasting.
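The quarterly cadence suggested above can be scheduled mechanically. The following sketch, with a hypothetical program start date, generates the four behavior-assessment dates over the one-year window; it is illustrative only.

```python
# Illustrative sketch: generating quarterly Level 3 check-in dates over a
# one-year evaluation window. The start date is a hypothetical example.

from datetime import date, timedelta

program_start = date(2015, 9, 1)  # hypothetical start of the intervention

# Four behavior assessments, one roughly every three months (91 days).
checkins = [program_start + timedelta(days=91 * i) for i in range(1, 5)]
for when in checkins:
    print("behavior assessment due:", when.isoformat())
```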
Level 4: Impact
At the fourth, and final, level of evaluation, the expectation is that instructors who have
participated in the professional learning opportunities will have increased their average course
evaluation scores by at least one full point. If the proposed solution is effective, the course
evaluation scores will have risen, and the performance gap will have been closed. Additionally,
new tools should be developed to measure learning experiences, such as student placement in the
labor market, rate of word-of-mouth referrals, and persistence rate for certificate students.
Instructors will instinctively know the mission and values of UNEX, and they will know how
to design courses that achieve the institutional mission of creating courses that produce excellent
learning experiences.
The measurement tool for level four evaluation is the actual course evaluation. Because
course evaluations are administered to students at the end of every class, a baseline of
information exists for comparison. The evaluations of instructors pre-program can be compared
to their evaluations post-program to see if course evaluation scores improved. If the scores
improved, then the solution can be deemed effective. However, if improvements are not
achieved, then the gap and assumed causes should be revisited and revised.
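The Level 4 check reduces to comparing mean course evaluation scores before and after the program against the one-point improvement target. A minimal sketch, using hypothetical scores:

```python
# Illustrative sketch: checking whether an instructor's mean course
# evaluation score rose by at least one full point after the professional
# learning program. All scores are hypothetical.

from statistics import mean

pre_program = [3.1, 3.4, 3.0]   # mean evaluation score per course, before
post_program = [4.2, 4.5, 4.4]  # mean evaluation score per course, after

improvement = mean(post_program) - mean(pre_program)
gap_closed = improvement >= 1.0  # the study's one-point improvement target

print(f"improvement: {improvement:.2f}, gap closed: {gap_closed}")
```

If `gap_closed` is false, the assumed causes should be revisited and the solution revised, as described above.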
Strengths and Weaknesses of the Approach
The gap analysis framework provides a systematic method to diagnose and solve a
performance problem (Clark & Estes, 2008). The process requires a clearly articulated goal, and
provides an opportunity to uncover root causes of a problem by examining the knowledge,
motivation, and organizational factors that may be barriers to achieving the goals. The gap
analysis process is a holistic approach to problem-solving that encourages constant process-
improvement by investigating root causes, identifying appropriate solutions, creating an
implementation plan, and evaluating the solutions to assess whether or not the goal is being
reached. This systematic framework encourages an evidence-based approach, and does not allow
for reactive solutions to a performance problem. It encourages a more thoughtful, data-driven
process that will yield solutions that address root causes.
However, the gap analysis process necessitates a clear organizational goal as a starting
point for problem identification and generation of assumed causes. At the time of writing, new
leadership had recently transitioned into University Extension, and a strategic planning process
was underway to determine the new direction of the institution. In this study, the new mission of
University Extension formed the basis of the goal, but because the mission was still new, the
interpretation of the mission into a measurable goal was not as specifically articulated in the
performance problem. For organizations that do not have established guiding principles,
determining a clear performance problem can be challenging, and the potential to misdiagnose a
problem is greater. Not all organizations have the goal specificity required for a gap analysis, so
it is not a process that is optimal for all.
Limitations
The purpose of this study was to improve the performance of University Extension’s
Department of Business, Management and Legal Programs. Due to the context specificity of this
study, the primary limitation is the inability to generalize findings. However, the application of
Clark and Estes’s (2008) gap analysis as the problem-solving framework that informs this study
can be beneficial to other institutions as a process that can guide performance improvement.
The scope of this study bounded the analysis to the key stakeholder group of instructors.
The experience of instructors may or may not be reflective of the experiences of other
stakeholders, thus limiting the study. The design of this study also presented other limitations.
The survey, interview, and document analysis findings partially validated a number of causes,
leaving open potential questions that were not further probed. The data also introduced potential
causes that were not originally considered, but the limited time for data collection did not allow
for additional follow-up questioning to further probe and confirm causes. In assessing instructor
knowledge, syllabi were reviewed, but discussions with instructors about their syllabi did not
occur, and artifacts other than syllabi, such as online supplemental materials, may have existed
but were not examined. Additionally, observations were initially considered as part of the data
collection methodology, but time constraints did not allow for an additional method to
triangulate the data.
There was also potential for social desirability bias because the survey and interview
participants may have provided responses that were not truly reflective of their experiences
(Fisher, 1993). The questions revolved around instructors’ experiences with teaching and with
working with University Extension staff, and they may have provided responses they believed to
be more socially desirable. Also, the purposeful selection of interview participants limited the study to
instructors willing to speak with the investigator, and there could have been a potential sample
bias.
Future Research
The literature about instructional quality and faculty development in higher education is
broad and comprehensive, but literature specific to continuing education institutions is limited.
Non-traditional, adult learners are a growing segment of the student population, and the need for
continuing professional education will continue to grow. More research is needed to address the
expectations for continuing education institutions, and how these institutions can ensure that
students’ classroom experiences are meeting their needs and expectations.
Additionally, this study did not uncover any motivation barriers, but the discrepancy
between the survey and document analysis brought forward a potential problem caused by a
motivational barrier. Further research could uncover and confirm instructors’ motivation for
teaching, providing more insight into practices that continuing education institutions can use to
encourage continuous improvement.
Finally, this study limited research to one stakeholder group. A more comprehensive gap
analysis would have included all the stakeholders, including staff and students. The learning
experiences could be explored from a student perspective, which could better inform continuing
education institutions about what students need in order to have an excellent educational
experience. The classroom experience is only one aspect of an educational experience, so future
research that explored all aspects of the learning experience would lead to more comprehensive
solutions.
Conclusion
As the world economy becomes more knowledge based, more and more jobs are
requiring at least some postsecondary education, which is driving many non-traditional students
back to school and increasing the demand for postsecondary educational programs and courses
(USDOE, 2006). However, postsecondary education institutions are under pressure to
demonstrate their value to students, employers, and policy-makers because of the increasing cost
of education (Huisman & Currie, 2004). Students want to be assured that the outcomes of their
education will be worth the investment.
Continuing education programs, like University Extension, offer students opportunities to
take courses for personal and professional growth. As the postsecondary education market
becomes increasingly saturated with schools offering continuing education programs, University
Extension needs to ensure its program offerings provide excellent learning experiences that will
bring students back to take more classes, and make the word-of-mouth recommendations that are
important marketing tools for continuing education programs (Browne et al., 1998).
The cornerstone of an excellent learning experience is the actual experience of the course.
Courses need to be designed to create learning experiences that will be impactful and meaningful
to students (Fink, 2013). Instructors can possess great content knowledge, but if their course is
not designed to effectively deliver the content in a way that engages and challenges students, the
students’ learning experience will not be great (Fink, 2013).
At University Extension, the industry professionals who serve as part-time instructors
possess the motivation to teach, but lack some of the technical knowledge about effective course
design and teaching. The solutions offered in this study focus on building a structure and a
culture of organizational learning that fully supports professional learning. The solutions go
beyond suggesting additional technical training for instructors; rather, they focus on how the
institution can truly embrace its mission and reflect its commitment to teaching and learning. By
integrating the activities of a learning organization, University Extension can be the embodiment
of lifelong learning, and create a culture that drives staff and instructors to constantly strive to
create excellent learning experiences for students.
REFERENCES
Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How
learning works. San Francisco, CA: Jossey-Bass.
Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and
assessing: A revision of Bloom’s taxonomy of educational objectives. New York, NY:
Longman.
Austin, A. E., & Sorcinelli, M. D. (2013). The future of faculty development: Where are we
going? New Directions for Teaching and Learning, 2013(133), 85-97.
Bragg, D. D. (2001). Opportunities and challenges for the new vocationalism in American
community colleges. New Directions for Community Colleges, 2001(115), 5-15.
Breneman, D. W. (2005). Entrepreneurship in higher education. New Directions for Higher
Education, 2005(129), 3-9.
Browne, B., Kaldenberg, D. O., Browne, W. G., & Brown, D. J. (1998). Student as customer:
Factors affecting satisfaction and assessments of institutional quality. Journal of
Marketing for Higher Education, 8(3), 1-14. doi:10.1300/J050v08n03
Burke, L. A., & Rau, B. (2007). Managing chronic excuse-making behaviors of faculty applying
Schlenker’s responsibility triangle. Educational Management Administration &
Leadership, 35(3), 415-428.
Carnevale, A. P., & Desrochers, D. M. (2002). The missing middle: Aligning education and the
knowledge economy. Journal for Vocational Special Needs Education, 25(1), 3-23.
Cervero, R. M. (2001). Continuing professional education in transition, 1981-2000. International
Journal of Lifelong Education, 20(1-2), 16-30.
Choy, S. (2002). Nontraditional undergraduates: Findings from the Condition of Education 2002
(NCES 2002-012). Washington, D.C.: U.S. Department of Education, National Center for
Education Statistics.
Clark, R. E. (2012). On gap analysis [Video file]. Retrieved from
https://www.2sc.usc.edu/mod/page/view.php?id=32790
Clark, R. E., & Estes, F. (2008). Turning research into results: A guide to selecting the right
performance solutions. Atlanta, GA: CEP Press.
Cox, M. D. (2004). Introduction to faculty learning communities. New Directions for Teaching
and Learning, 2004(97), 5-23.
Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods
approaches. Thousand Oaks, CA: Sage Publications.
Davis, M. (2003). Barriers to reflective practice: The changing nature of higher education. Active
Learning in Higher Education, 4(3), 243-255.
Dill, D. D. (1999). Academic accountability and university adaptation: The architecture of an
academic learning organization. Higher Education, 38(2), 127-154.
Eccles, J. (2009). Expectancy value motivational theory. Retrieved from
http://www.education.com/reference/article/expectancy-value-motivational-theory/
Elliott, K. M., & Healy, M. A. (2001). Key factors influencing student satisfaction related to
recruitment and retention. Journal of Marketing for Higher Education, 10(4), 1-11.
Elliott, K. M., & Shin, D. (2002). Student satisfaction: An alternative approach to assessing this
important concept. Journal of Higher Education Policy and Management, 24(2), 197-
209.
English, L. M., & Mayo, P. (2012). Learning with adults: A critical pedagogical introduction.
Rotterdam, Netherlands: Sense Publishers.
Fink, L. D. (2013). Creating significant learning experiences: An integrated approach to
designing college courses. San Francisco, CA: Jossey-Bass.
Fisher, R. J. (1993). Social desirability bias and the validity of indirect questioning. Journal of
Consumer Research, 20, 303-315.
Gagné, R. M., & Merrill, M. D. (1990). Integrative goals for instructional design. Educational
Technology Research and Development, 38(1), 23-30.
Gallimore, R., & Goldenberg, C. (2001). Analyzing cultural models and settings to connect
minority achievement and school improvement research. Educational Psychologist,
31(1), 45-56.
Gouthro, P. A. (2002). Education for sale: At what cost? Lifelong learning and the marketplace.
International Journal of Lifelong Education, 21(4), 334-346.
Gunn, V., & Fisk, A. (2013). Considering teaching excellence in higher education: 2007-2013:
A literature review since the CHERI Report 2007. York, UK: Higher Education
Academy.
Hanushek, E. A., & Rivkin, S. G. (2006). Teacher quality. In E. A. Hanushek & F. Welch (Eds.),
Handbook of the economics of education (Vol. 2, pp. 1051-1078). Amsterdam: North
Holland.
Hénard, F., & Roseveare, D. (2012). Fostering quality teaching in higher education: Policies
and Practices (An IMHE guide for higher education institutions). Paris, France:
Organisation for Economic Co-operation and Development.
Hill, Y., Lomas, L., & MacGregor, J. (2003). Students’ perceptions of quality in higher
education. Quality Assurance in Education, 11(1), 15-20.
Hoekstra, A., & Crocker, J. R. (2015). Design, implementation, and evaluation of an ePortfolio
approach to support faculty development in vocational education. Studies in Educational
Evaluation, 46, 61-73.
Holmes, C., & Kozlowski, K. (2014). Faculty experiences in a research learning community. The
Journal of Faculty Development, 28(2), 35-42.
Huisman, J., & Currie, J. (2004). Accountability in higher education: Bridge over troubled
water? Higher Education, 48, 529-551.
Jarvis, P. (1996). Continuing education in a late-modern or global society: Towards a theoretical
framework for comparative analysis. Comparative Education, 32(2), 233-244.
doi:10.1080/03050069628867
Kena, G., Aud, S., Johnson, F., Wang, X., Zhang, J., Rathbun, A., . . . Kristapovich, P. (2014).
The condition of education 2014 (NCES 2014-083). Washington, D.C.: U.S. Department
of Education, National Center for Education Statistics.
Kirkpatrick, D. L. (2006). Seven keys to unlock the four levels of evaluation. Performance
Improvement, 45(7), 5-8.
Knowles, M. S. (1973). The adult learner: A neglected species. Houston, TX: Gulf.
Knowles, M. S. (1980). The modern practice of adult education. New York, NY: The Adult
Education Company.
Krause, K. L. (2012). Addressing the wicked problem of quality in higher education: Theoretical
approaches and implications. Higher Education Research & Development, 31(3), 285-
297.
Lauzon, A. C. (2013). University extension and public service in the age of economic
globalization: A response to Thompson and Lamble. Canadian Journal of University
Continuing Education, 26(1), 79-95.
Levine, L. E., Fallahi, C. R., Nicoll-Senft, J. M., Tessier, J. T., Watson, C. L., & Wood, R. M.
(2008). Creating significant learning experiences across disciplines. College Teaching,
56(4), 247-254.
Little, B., & Locke, W. (2011). Conceptions of excellence in teaching and learning and
implications for future policy and practice. In M. Rostan & M. Vaira (Eds.), Questioning
excellence in higher education (pp. 119-137). Dordrecht, Netherlands: SensePublishers.
Liu, N. R. (2009). Decentralisation and marketisation of adult and continuing education: A
Chinese case study. International Journal of Educational Development, 29(3), 212-218.
doi:10.1016/j.ijedudev.2008.02.007
Lyons, R. E. (2007). Deepening our understanding of adjunct faculty. In R. Lyons (Ed.), Best
practices for supporting adjunct faculty (pp. 1-13). Bolton, MA: Anker.
Marsh, H. W. (2007). Students’ evaluations of university teaching: Dimensionality, reliability,
validity, potential biases and usefulness. In R. P. Perry & J. C. Smart (Eds.), The
scholarship of teaching and learning in higher education: An evidence-based perspective
(pp. 319-383). Dordrecht, Netherlands: Springer.
Mayer, R. E. (2011). Applying the science of learning. Boston, MA: Pearson Education.
Merriam, S. B. (2001). Andragogy and self‐directed learning: Pillars of adult learning theory.
New Directions for Adult and Continuing Education, 2001(89), 3-14.
Merriam, S. B., & Bierema, L. L. (2013). Adult learning: Linking theory and practice. San
Francisco, CA: Jossey-Bass.
Merrill, M. D. (2002). First principles of instruction. Educational Technology Research and
Development, 50(3), 43-59.
National Center for Education Statistics. (2013). Digest of education statistics: 2012.
Washington, D.C.: U.S. Department of Education, National Center for Education
Statistics. Retrieved from http://nces.ed.gov/programs/digest/d12/
Orosz, K. (2012). Accountability and the public funding of higher education. In A. Curaj, P.
Scott, L. Vlasceanu, & L. Wilson (Eds.), European higher education at the crossroads:
Between the Bologna Process and national reforms (pp. 691-707). Dordrecht,
Netherlands: Springer.
Pintrich, P. R., Smith, D. A., García, T., & McKeachie, W. J. (1993). Reliability and predictive
validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educational
and Psychological Measurement, 53(3), 801-813.
Pintrich, P. R. (2003). A motivational science perspective on the role of student motivation in
learning and teaching contexts. Journal of Educational Psychology, 95(4), 667-686.
Pusser, B., Gansneder, B. M., Gallaway, N., & Pope, N. S. (2005). Entrepreneurial activity in
nonprofit institutions: A portrait of continuing education. New Directions for Higher
Education, 2005(129), 27-42.
Ramsden, P. (1991). A performance indicator of teaching quality in higher education: The
Course Experience Questionnaire. Studies in Higher Education, 16(2), 129-150.
Röbken, H. (2009). Continuing higher education in the United States of America (USA). In M.
Knust & A. Hanft (Eds.), Continuing higher education and lifelong learning (pp. 287-
322). Dordrecht, Netherlands: Springer.
Rueda, R. (2011). The 3 dimensions of improving student performance. New York, NY:
Teachers College Press.
Saroyan, A., & Trigwell, K. (2015). Higher education teachers’ professional learning: Process
and outcome. Studies in Educational Evaluation, 46, 92-101.
Schein, E. H. (2004). Organizational culture and leadership. San Francisco, CA: Jossey-Bass.
Shuell, T. (2013). Theories of learning. Retrieved from
http://www.education.com/reference/article/theories-of-learning/
Tam, M. (2001). Measuring quality and performance in higher education. Quality in Higher
Education, 7(1), 47-54.
Tarr, T. A. (2010). Working with adjunct faculty members. In K. J. Gillespie & D. L. Robertson
(Eds.), A guide to faculty development (2nd ed., pp. 347-362). San Francisco, CA: Jossey-
Bass.
Taylor, K. L., & Znajda, S. K. (2015). Demonstrating the impact of educational development:
The case of a course design collaborative. Studies in Educational Evaluation, 46, 39-46.
Tierney, W. G., & Hentschke, G. C. (2007). New players, different game: Understanding the rise
of for-profit colleges and universities. Baltimore, MD: Johns Hopkins University Press.
Torres, C. A., & Schugurensky, D. (2002). The political economy of higher education in the era
of neoliberal globalization: Latin America in comparative perspective. Higher Education,
43(4), 429-455.
U.S. Department of Education. (2006). A test of leadership: Charting the future of U.S. higher
education. Washington, D.C.: Author.
White, S. C., & Glickman, T. S. (2007). Innovation in higher education: Implications for the
future. New Directions for Higher Education, 2007(137), 97-105.
Zerihun, Z., Beishuizen, J., & Van Os, W. (2012). Student learning experience as indicator of
teaching quality. Educational Assessment, Evaluation and Accountability, 24(2), 99-111.
APPENDIX A
SURVEY
1. Length of time teaching at University Extension.
☐ Continuous dropdown menu of 1-100 years
2. How many classes do you teach annually?
☐ Continuous dropdown menu of 1-100
3. I have been trained to design courses.
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
4. Compared to other instructors, I think I know a great deal about course design. (MSE)
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
5. I am certain that I can design a course that creates significant learning experiences. (MSE)
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
6. What are some components of course design? (KF)
7. What are some appropriate teaching and learning strategies for adult learners? (KC)
8. What do you think is the relationship between learning objectives, assessments, and teaching
and learning strategies? (KC)
9. Read the following course description. What would be two potential course objectives? (KP)
10. Based on the following learning objective:
a. What assessment method would you utilize to measure achievement of the objective?
(KP)
b. What teaching and learning strategy would you use to achieve the objective? (KP)
11. What are some methods you use to assess the effectiveness of your course design? (KM)
12. On average, I spend _____ hours per week on
a. course development (provide continuous range of time) (MSE)
b. course updating (provide continuous range of time) (MSE)
13. I often choose to spend my time improving course design even if it requires more work.
(MCV)
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
14. The amount of effort I put into preparing course materials influences student satisfaction.
(MA)
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
15. I think that course evaluations are useful for me to improve course design. (MUV)
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
16. There are clear consequences for low evaluation scores. (OCSA)
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
17. There are institutional incentives for me to improve the quality of my courses. (OCSI)
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
18. Have you participated in instructor training at University Extension through the Instructor
Development Program? If no, skip to question 20.
☐ Yes
☐ No
19. I think that what I learn in IDP workshops is useful for me to improve my course design.
(MUV)
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
20. I can easily fit IDP workshops into my schedule. (OCSR)
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
21. The instructional support I receive from my program representative helps me improve the
quality of my design. (OCSR)
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
22. The Office of Instructional Enhancement/Instructor Development Program has been helpful
in providing me with instructional support, beyond online technical support, that helps me
improve the quality of my course design. (OCSR)
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
23. University Extension’s mission is clearly reflected in the organizational culture. (OCSC)
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
24. I feel like I am part of the University Extension community. (OCMC)
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
25. I have opportunities to discuss and share teaching and learning strategies with other
University Extension Instructors. (OCMC)
1 = Strongly Disagree   2 = Disagree   3 = Somewhat Disagree   4 = Somewhat Agree   5 = Agree   6 = Strongly Agree
APPENDIX B
INTERVIEW PROTOCOL FOR UNIVERSITY EXTENSION BLP INSTRUCTORS
Respondent (Name): _____________________________________________________
Number of years teaching:_______________________________________________
Location of Interview: ____________________________________________________
Time in / Time Out: _______________________________________________________
Introduction
Thanks for meeting with me today. I am conducting this exercise as part of my dissertation
research with my doctoral program at USC, exploring how University Extension can achieve its
new mission of creating extraordinary learning experiences. I anticipate taking no more than 45
minutes of your time, and I have nine questions for your consideration.
Your participation is completely voluntary. We can skip any question you want at any time, and
you may stop the interview at any time. Any identifiable information obtained in connection with
this study will remain confidential. Your responses will be coded with a false name (pseudonym)
and maintained separately. If you are comfortable with it, I would like to record our
conversation; the recording will be destroyed after it is transcribed. Do you have any questions? Ready to
begin?
1. Would you please briefly describe your teaching background and experience, specifically
any experiences you’ve had in learning about course design?
2. How do you know that the course you have designed is effective in supporting student learning?
Probing questions if it doesn’t come out on its own
a. How is University Extension’s new mission of creating extraordinary learning
experiences reflected in your course design?
3. Would you please describe teaching and learning strategies you use with diverse adult
learners?
Probing questions if it doesn’t come out on its own
a. How do you decide which teaching and learning strategies to employ?
b. How do you decide which assessment method to implement?
4. How often do you review and update your course design?
Probing questions if it doesn’t come out on its own
a. Why do you choose to invest the time on course updating? Or why not? How
much time do you usually spend updating materials?
5. What is the impact of course evaluations on your decision to update curriculum?
Probing questions if it doesn’t come out on its own
a. What happens when you receive low evaluation scores?
6. What factors do you think contribute to student dissatisfaction?
7. What are the reasons why you participate (or not) in IDP programs?
Probing questions if it doesn’t come out on its own
a. Are IDP workshops conveniently scheduled for you?
b. If not, what suggestions do you have to make them more accessible for you?
8. Can you describe the nature of the relationship you have with University Extension?
Probing questions if it doesn’t come out on its own
a. What about with other instructors?
b. How does this affect your understanding of teaching and learning excellence at
University Extension?
9. Can you tell me about your experiences with receiving instructional support from
University Extension, specifically as it relates to improving your course design?
Probing questions if it doesn’t come out on its own
a. What types of instructional support would you like to see offered by University
Extension that are not currently available?
ABSTRACT
This study utilized the Clark and Estes gap analysis to investigate the knowledge, motivation, and organizational barriers that 330 Business and Legal Programs Department instructors face in developing courses that create “excellent” learning experiences at University Extension, a nonprofit, university continuing education institution. Assumed causes of the knowledge, motivation, and organizational barriers were generated from related literature, learning and motivation theories, and personal knowledge. The analysis of this qualitative case study validated nine causes that led to five recommended solutions. To support instructors in their efforts to produce courses that create “excellent” learning experiences, University Extension should define excellence for the University Extension context, allocate resources to establish a comprehensive instructor development program, establish instructor learning communities, establish measures of excellence, and establish a clear system of rewards and incentives. The solutions introduce activities that move University Extension closer to becoming a learning organization that supports professional learning. An implementation and evaluation plan is also proposed.
Asset Metadata
Creator
Phansavath, Ing (author)
Core Title
Creating "excellent" learning experiences: a gap analysis of a university extension program
School
Rossier School of Education
Degree
Doctor of Education
Degree Program
Global Executive
Publication Date
11/23/2015
Defense Date
07/16/2015
Publisher
University of Southern California (original), University of Southern California. Libraries (digital)
Tag
adult learners,continuing education,defining excellence,instructional support,learning experiences,OAI-PMH Harvest
Format
application/pdf (imt)
Language
English
Contributor
Electronically uploaded by the author (provenance)
Advisor
Filback, Robert A. (committee chair), Seli, Helena (committee member), Tambascia, Tracy (committee member)
Creator Email
ingyphans@gmail.com,phansava@usc.edu
Permanent Link (DOI)
https://doi.org/10.25549/usctheses-c40-202443
Unique identifier
UC11278035
Identifier
etd-Phansavath-4054.pdf (filename),usctheses-c40-202443 (legacy record id)
Legacy Identifier
etd-Phansavath-4054.pdf
Dmrecord
202443
Document Type
Dissertation
Rights
Phansavath, Ing
Type
texts
Source
University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection)
Access Conditions
The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the a...
Repository Name
University of Southern California Digital Library
Repository Location
USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA