ONLINE PROFESSIONAL DEVELOPMENT: USING DATA TO EVALUATE
PROGRAM EFFECTIVENESS IN
PREPARING FACULTY TO TEACH ONLINE
by
Laure Sue Burke
__________________________________________________________________
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
May 2012
Copyright 2012 Laure Sue Burke
DEDICATION
I dedicate this dissertation to my resilient mother, the late Sue Morihara. You
taught me about the importance of hard work and undeniable faith. It is also
dedicated to my joyful sons, Joshua and Jacob. There is greatness in both of you.
And finally, I dedicate this dissertation to my caring husband, Joe. You are the
sunshine of my life.
ACKNOWLEDGEMENTS
My dissertation journey has been fueled by the support of the following
individuals. I am extremely grateful for their support and guidance through the
entire doctorate program.
Joe Burke, my encouraging husband, for serving as my primary cheerleader
during the past three years;
Joshua and Jacob, my extraordinary sons, for their patience and
understanding;
Dr. Melora Sundt, my brilliant dissertation chair, for guiding this dissertation
to its completion and challenging me to do my best work;
Dr. Dominic Brewer and Dr. Larry Picus, my committee members, for
providing me with thoughtful feedback;
Dr. Dennis Hocevar, USC’s statistical genius, for helping me make sense of
my data;
Dr. John Morton, my mentor, for encouraging me to pursue a doctorate
degree;
Dr. Leon Richards, Dr. Louise Pagotto, Mary Hattori, and my other
colleagues, for trusting me to embark on this study to help our college;
My fellow colleagues in the 2009 USC Cohort for teaching me so much
about the good work that they are doing to improve education in the state of Hawaii;
Dr. Dana Tomonari for being there at times when I needed to be rescued; and
My parents for instilling the value of education in me. I cannot thank the
Lord enough for blessing me with these amazing people.
TABLE OF CONTENTS
Dedication
Acknowledgements
List of Tables
List of Figures
Abstract
Chapter One: Introduction
Chapter Two: Review of the Literature
Chapter Three: Methodology
Chapter Four: Analysis of the Data
Chapter Five: Discussion
References
Appendices
Appendix A: Distance Learning Certification Program
Appendix B: Survey (Pre-Survey)
Appendix C: Survey (Post-Survey)
LIST OF TABLES
Table 1: Comparison of Pedagogical and Andragogical Assumptions
Table 2: Key Components of Effective Online Instructor Professional Development
Table 3: Key Components of Online Professional Development by National Governing Agencies
Table 4: Pre-Survey Respondents by Gender
Table 5: Post-Survey Respondents by Gender
Table 6: Pre-Survey Respondents by Age
Table 7: Post-Survey Respondents by Age
Table 8: Pre-Survey Respondents by Years of Teaching
Table 9: Post-Survey Respondents by Years of Teaching
Table 10: Pre-Survey Respondents by Years of Teaching Online
Table 11: Post-Survey Respondents by Years of Teaching Online
Table 12: Research Questions, Survey Items, and Statistical Analyses
Table 13: Cronbach’s Alpha for TPACK
Table 14: Response Rates
Table 15: Sample Demographics
Table 16: TPACK Categories with Brief Definition and Example Corresponding Questions
Table 17: Pre-Survey Descriptive Statistics for TPACK Subscales with Cronbach’s Alpha Scores
Table 18: Post-Survey Summary of Descriptive Statistics for TPACK Subscales
Table 19: Matched Mean for Pre- and Post-TPACK Survey Questions
Table 20: Professional Development Activity
Table 21: Overall Mean Pre-Survey and Post-Survey Scores for 13 Questions on “Participants’ Preparedness for Online Teaching Tasks”
Table 22: Mean Pre-Survey and Post-Survey Scores for Individual Questions on “Participant Preparedness for Online Teaching Tasks”
Table 23: Pre-Survey Question on “Participants’ Preparedness for Online Teaching Experience”
Table 24: Matched Mean for Pre- and Post-Survey Question on “Participants’ Preparedness for Online Teaching Experience”
Table 25: Level of Interest in “Teaching and Designing an Online Course”
LIST OF FIGURES
Figure 1. Andragogy in Practice (Knowles, Holton, & Swanson, 1998)
Figure 2. Pedagogical Technological Content Knowledge Framework (Mishra & Koehler, 2006, p. 1025)
Figure 3. Online Community College Distance Education Certification Program (2011)
Figure 4. Participants’ Age and Preparedness for Online Teaching
ABSTRACT
This study investigated a sample of community college faculty who
participated in an online professional development program. The purpose of the
study was to determine the relationship between participating in an online
professional development program and changes in participants’ self-assessed
knowledge about tasks associated with effective online teaching. The study
examined what professional development activities the participants found to be most
useful in preparing them to teach in an online environment. The study also aimed to
determine what professional development activities gave the participants the greatest
knowledge gains.
An evaluation was accomplished by first ensuring that the professional
development was well designed to meet the training goal of increasing faculty
knowledge with respect to technology, pedagogy, and content, and the combination
of each of these areas (TPACK). Second, the evaluation referred to principles of
adult learning and Kirkpatrick’s evaluation model as a framework for guiding the
data collection. A pre- and post-survey was conducted, and quantitative methods
were used to collect and analyze the data. The analysis showed statistically
significant learning gains. Faculty who participated in
the online professional development program improved their self-assessed
knowledge about tasks associated with effective online teaching, indicating that they
felt more confident about their technology, pedagogy, and content knowledge.
Participants also reported feeling better prepared for an online teaching experience
and ranked certain activities as more useful than others.
CHAPTER ONE
INTRODUCTION
Higher education institutions have been turning to online courses and
programs as a means to increase educational attainment. As a result, over the last
decade, distance education has been growing rapidly in the United States (Moloney
& Oakley II, 2006; Pagliari, Batts & McFadden, 2009). During the fall 2005 and fall
2006 time period, more than two thirds of all United States higher education
institutions provided students with an option to enroll in online education, with the
majority of them providing programs that were fully online (Allen & Seaman, 2007).
In 2006, nearly 20% of all United States students took a minimum of one online
class. According to a survey by Allen and Seaman (2007), almost 3.5 million
students were enrolled in at least one online course during the fall of 2006 compared
to 1.6 million in the fall of 2002, representing a compound annual growth rate of 21.5%.
United States higher education institutions with online course offerings expect their
online enrollments to continue on an increasing trend (Allen & Seaman, 2010).
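As a rough arithmetic check (illustrative only, using the rounded enrollment figures above), the reported counts imply a compound annual growth rate of about

\[ \left(\frac{3.5\ \text{million}}{1.6\ \text{million}}\right)^{1/4} - 1 \approx 0.216, \]

or roughly 21.6% per year over the four falls from 2002 to 2006, consistent with the 21.5% figure Allen and Seaman (2007) report from the unrounded enrollment counts.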
The increased demand for online course offerings in higher education
institutions has generated discussions about the need to prepare faculty to teach
online. Faculty members have experience with designing courses for a face-to-face
teaching environment; however, many are not familiar with how to apply traditional
teaching practices to the online environment. Introducing a new context such as the
virtual environment, “where the rules of face-to-face teaching do not apply,
challenges faculty to establish new ways of thinking about course design” (Koehler,
Mishra, Hershey, & Peruski, 2004, p. 35). According to Palloff and Pratt (2011),
faculty need new skills in effective online instruction, such as facilitating
online discussions and assessing student learning. Additionally, faculty members
who are new to online teaching need to acquire technological skills (Moon,
Michelich & McKinnon, 2005).
As rapid growth in online education continues, faculty have few pathways to
acquire training for online instruction, resulting in poorly constructed courses and
continued allegations that online education is not as rigorous as its face-to-face
counterpart (Palloff & Pratt, 2011). According to Palloff and Pratt (2011), faculty
are often left on their own to find appropriate training or create their own approach to
online teaching based on conversations with colleagues, or information they discover
online or in journals. Others seek assistance by attending on-campus professional
development to support successful technology integration (Grant, 2004).
Yang and Cornelious (2005) stress that it is essential for faculty to receive
training on how to deliver online courses prior to teaching online. Other educational
researchers have also documented the need for professional development in the area
of online teaching (Koehler et al., 2004; Pagliari et al., 2009; Shepherd, Alpert, &
Koeller, 2008; Simonson, 2007; Yang & Cornelious, 2005).
Professional development serves to guide faculty on what works best and
what does not work when teaching online (Pankowski, 2004). Current research
emphasizes the need for faculty training to help them in converting face-to-face
courses to online courses (Almala, 2006); however, the “literature primarily
describes existing training programs and best practices of preparing online
instructors based on theoretical stances and anecdotal evidence” (Roman, Kelsey, &
Lin, 2010, p. 2).
There is limited research addressing what constitutes effective professional
development from the perspectives of the instructors who receive training on how to
make a successful transition from face-to-face delivery to online delivery (Roman et
al., 2010). “Even scholars who support more complex perspectives on integrating
technology and education find it difficult to offer concrete models or frameworks to
guide educators” (Koehler et al., 2004, p. 27). According to Palloff and Pratt (2011),
the professional development programs that do exist vary widely in terms of content and
quality, lacking a uniform, accepted approach to preparing faculty to teach online.
Furthermore, Koehler, Mishra, Hershey, and Peruski (2004) suggest that faculty need
new forms of support and collaboration to develop new pedagogical methods and
tools to teach their course content on the web.
Although millions, if not billions, of dollars are being spent on professional
development each year (Borko, 2004), little evaluation is being done to measure how
the training improves teaching practices (Hill, 2009). New and exciting forms of
professional development guarantee neither high-quality delivery nor substantive
effects on instructors, teaching and student learning (Hill, 2009). Professional
development will be more effective and more efficient if instructors’ weaknesses can
be linked to the learning opportunities most likely to remedy those weaknesses.
Information about the quality of professional development can be gathered by
conducting small-scale but rigorous studies that measure training effectiveness
(Hill, 2009).
As more and more educational institutions look to professional development
programs to prepare their faculty to teach online, there will need to be systematic
methods of evaluating professional development programs that are being used to
educate faculty on preferred practices with online courses. When professional
development program results are positive, it is essential to know which components
worked most effectively. Documenting positive program outcomes of participants
can inform others of what works best in preparing faculty to teach online. This study
focuses on the evaluation of a professional development program to prepare faculty to teach
online and its relationship to changes in their self-assessed knowledge about tasks
associated with effective online teaching.
Background of the Problem
United States Postsecondary Educational Attainment
Even before the onset of online course offerings in universities, American
higher education has been the envy of the world for years, educating more people to
higher levels than any other nation. Although the United States continues to have
some of the world’s best universities, educational attainment generally is
proportionately increasing more slowly in the United States than in other
industrialized countries (United States Department of Education [USDOE], 2006).
Business leaders, policymakers, educators, and the public have been faced with
increasing global competitiveness while worker shortages in new jobs requiring
higher level skills in communication, math, and technology are on the rise
(Association for Career and Technical Education, 2010; Friedman, 2005). With
more than half of the fastest growing jobs in the United States requiring education
beyond high school and the first of the Baby Boomers retiring, millions of college
educated citizens are needed to fill new and existing jobs (Callan & Atwell, 2009).
In 2006, about 80% of all United States occupations required at least some
postsecondary education or training, with approximately one-third of the occupations
requiring a bachelor’s degree or higher, and the trend is projected to persist through
2014 (Holzer & Lerman, 2009; United States Bureau of Labor Statistics, 2010).
Despite the demand for a college-educated workforce (Organization for Economic
Cooperation and Development, 2011), during the past two decades the United States
failed to achieve college success for a much higher percentage of students,
threatening the future of the nation (Lucas, 2010).
For students who enroll in college, rates of educational attainment leading to
certificate, associate, and baccalaureate programs are poor and have improved only
slightly. According to the National Center for Education Statistics (NCES, 2010),
among a nationally representative sample of students who began postsecondary
education for the first time in the 2003-2004 academic year, only 49% had earned a
credential ranging from an educational certificate to a bachelor’s degree by June
2009. Another 15% remained enrolled but had not yet completed a program of study,
and about one-third (36%) left postsecondary education without a
credential of any kind (National Center for Education Statistics [NCES], 2010). It is
important to note that some of the technology fields provide students with on-the-job
training and may not require a college degree. Nonetheless, these low degree
attainment rates are depriving the United States of college-educated and trained
workers needed to keep the American workforce competitive globally (National
Center for Public Policy and Higher Education [NCPPHE], 2008).
Online Education to Increase Educational Attainment in Hawaii
Similar to other United States educational institutions, the University of Hawaii
Community Colleges’ (UHCC) educational leaders have committed to increasing
educational attainment rates in Hawaii (Johnsrud, 2010) by supporting alternate
methods of instruction such as online course offerings. In the UHCC’s Strategic
Plan Update (2008), the community college system documented its long-term
commitment to distance education by including a strategic goal of increasing
educational programs offered to students using distance learning technologies
(University of Hawaii Community Colleges [UHCC], 2008). Offering online
courses provides students with greater flexibility in obtaining an education, removing
traditional barriers of time, space, and place (Allen & Seaman, 2010).
Online Professional Development
“As higher education evolves in unexpected ways, this new landscape
demands innovation and flexibility from the institutions that serve the nation’s
learners” (USDOE, 2006, p. xi). In order to meet the strategic goals of the UHCC’s
Strategic Plan of expanding online courses to underserved regions of Hawaii, the
UHCC system recognized that there was a need to provide effective professional
development and support to faculty who wished to convert face-to-face courses to
online courses, and to create an instructor certification process to ensure a certain
level of standards is met. In 2007, one of the UHCC campuses, Online Community
College (a pseudonym used throughout this dissertation to protect the privacy of the
college), spearheaded a system-wide multi-year project to establish a professional
development program consisting of assets and services from across the UHCC
system so that there was a consistent, standards-driven professional development
program complete with technical and instructional design support (Career and
Technical Education Program Improvement and Leadership Grant, 2009). The
Chancellor of Online Community College agreed to allocate $100,000 toward faculty
professional development. The goals of the program included the following: 1)
encourage high-quality learning environments through appropriate training and
support on best practices in online teaching and the components of good course
websites using the college’s online course management system, and 2) increase the
number of courses offered online. The college’s initial 13-week face-to-face training
program, which enrolled 63 faculty members from 18 college units, ended in May 2008. The
program funding benefitted the faculty by providing them with financial support in
the form of stipends and technical support to create online instructional materials.
Additionally, each faculty cohort was assigned an Information Technology Specialist
and a highly-trained student assistant. The success of Online Community College’s
initial program was based on how many participants completed the thirteen-week
professional development program. Despite the lack of a formal program evaluation,
Online Community College’s Chancellor agreed to fund subsequent professional
development programs offered by the college’s Center for Excellence in Learning
and Teaching Technology (CELTT). In 2009, the second series of professional
development programs was enhanced with an online component. The most recent
professional development, conducted during summer 2010, was entirely online, with
face-to-face meetings with CELTT staff available for faculty who preferred
meeting in person.
Evaluation of Online Professional Development
According to Guskey (2009), there is very little good research on effective
professional development. Rigorous studies of professional development can
consume considerable time and resources, requiring significant cooperation from
practitioners at all levels to gather relevant data. Researchers shy away from
professional development studies; hence “sound, trustworthy, and scientifically valid
evidence on professional development characteristics that improve student learning
remains scarce” (Guskey, 2009, p. 226). Nonetheless, Guskey (2009) notes that
educational leaders who oversee planning of professional development need help in
clarifying goals for improving student learning and determining what best reflects
their achievement.
Although evaluation is essential to determine the effectiveness of a training
program (Kirkpatrick & Kirkpatrick, 2006), the trainers of Online Community
College’s instructor certification program did not survey program participants to
obtain feedback on how effective the professional development was in preparing
participants to teach online. When there are no formal evaluation measures, it is
difficult to conclude that any positive outcomes were related to participation in a
professional development program. For example, faculty who did not participate in
the instructor certification may have learned how to teach online from other training
sources enabling them to have the knowledge and skills necessary to teach a course
online. Similarly, some of the faculty participants came to the instructor certification
program with prior knowledge related to teaching online courses. In fact, 34 of the
64 program participants of the original instructor certification program designated
that they had experience teaching online prior to enrolling in the instructor
certification program (Hattori, 2011) so it is possible that an increase in online
course offerings would have taken place without their participation in the
professional development program.
Despite the lack of professional development evaluation to determine the
effectiveness of the instructor certification program, Online Community College has
experienced tremendous growth in online course offerings and student enrollment
(Hattori, 2011). The number of students who took a combination of online and face-
to-face courses increased from 17.6% of total college enrollment in fall 2006 to
33% in fall 2009. The number of students taking only online courses more than
doubled, from 7% to 15% of total student enrollment in 2009. In fall 2007, the
college offered 77 online courses compared to 160 courses during the spring 2011
semester. While there are positive trends in online education offerings and student
10
enrollment, the college needs clarity about which components of the Distance
Education Academy contributed to the successful outcomes since there were no
methods in place to evaluate the professional development program.
Statement of the Problem
Higher education institutions such as Online Community College are
challenged with developing “faculty who are ready, willing, and able to teach in the
online world” (Koehler et al., 2004). When faculty choose to teach online, courses
are typically content and faculty or facilitator driven, similar to the face-to-face
classroom (Palloff & Pratt, 2007). According to Simonson (2007), universities must
provide faculty with instructional design support that prepares them to teach their
face-to-face courses online.
The United States Department of Education’s (USDOE) Office of
Postsecondary Education’s 2006 Report titled “Evidence of Quality Distance
Education Programs” presented an indicator of quality related to faculty being
involved in a strong and active professional development process. According to the
Accrediting Commission for Community and Junior Colleges’ (ACCJC) Distance
Education Manual (2008), “faculty access to appropriate technology and software as
well as to support personnel is critical to a successful program” (p. 5). Researchers
also recommend that professional development for online teaching needs to provide
faculty with an understanding of how course content, pedagogy, and technology can
be integrated when designing online courses (Koehler et al., 2004; Mishra & Koehler,
2006).
Creating and maintaining online professional development for adult learners
requires a significant investment of resources on the part of the educational institution
(Yang & Cornelious, 2005). Despite the investment of financial and human resources,
many institutions do not have a comprehensive evaluation system to assess their
training against preferred practices. Such is the case with Online Community
College, which is the institution at the focal point of this research study.
Guskey (2000) emphasized that universities today live in an age of
accountability and that those who provide faculty with support via professional
development are asked to show that what they do really matters. In certain instances,
public colleges have established policies relating to program evaluation in response
to external mandates such as those required by accrediting bodies (University of
Hawaii, 2005). According to Guskey (2000), just as educators plan carefully and
make ongoing assessments of student learning an integral part of their instructional
process, professional developers need to make evaluation a key part of the
professional development process. Guskey (2000) and other educators have applied
Kirkpatrick’s (2006, 2007) four-level evaluation model, originally developed to
evaluate training programs in business and industry. Kirkpatrick’s four evaluation
levels are: 1) reactions, 2) learning, 3) transfer, and 4) results. According to the
model, evaluation should always start with level one and then, as time and budget
allow, move sequentially through levels two, three, and four (Kirkpatrick, 1994).
Therefore, “each successive level represents a more precise measure of the
effectiveness of the training program, but at the same time requires a more rigorous
and time-consuming analysis” (Kirkpatrick, 1994, p. 1).
The goal of level one evaluations is to measure participants’ perceptions of
learning experiences relative to the course, content, instructor, and relevance to their
jobs immediately following the training experience “in order to initiate continuous
improvement of training experiences” (Kirkpatrick & Kirkpatrick, 2007, p. 32).
Kirkpatrick notes that participants’ reactions have important consequences for
learning or level two of the four-level evaluation model. However, according to
Clark (2004), educators need to be cautious when assessing reactions to a program,
because although reactions can be positive and participants may enjoy doing new
things, the experience may not actually make any difference in student learning.
Evaluating at level two moves beyond learner satisfaction and focuses on
assessing the extent to which participants have advanced in skills, knowledge, or attitude
(Kirkpatrick, 1994). Measurement at level two is more laborious than at level one,
ranging from formal and informal testing to team assessment and self-assessment.
Kirkpatrick (1994) suggests that participants take a pre-test and post-test to
determine the amount of learning that has occurred.
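Kirkpatrick’s pre-test and post-test suggestion is the approach reflected in this study’s pre- and post-surveys. As a minimal sketch only, assuming matched self-assessment scores are available for each participant (the scores and variable names below are hypothetical, not data from this study), a level two comparison could be computed with a paired t-test:

from scipy import stats

# Hypothetical matched scores for six participants (Likert-style means);
# the actual study sample and instruments are described in Chapter Three.
pre_scores = [2.8, 3.1, 2.5, 3.4, 2.9, 3.0]
post_scores = [3.6, 3.9, 3.2, 4.1, 3.5, 3.8]

# Mean within-participant gain and a paired t-test on the matched scores.
mean_gain = sum(post - pre for post, pre in zip(post_scores, pre_scores)) / len(pre_scores)
result = stats.ttest_rel(post_scores, pre_scores)

print(f"Mean self-assessed gain: {mean_gain:.2f}")
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")

A statistically significant positive mean gain on such matched scores is one common way to interpret advancement in self-assessed knowledge at level two.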
Level three of Kirkpatrick’s model measures the transfer of learning
behaviors due to the professional development (Kirkpatrick, 1994; Kirkpatrick &
Kirkpatrick, 2006). For many trainers, level three represents the “truest assessment
of a program’s effectiveness” (Kirkpatrick, 1994, p. 2) because level three attempts
to answer the question: “Are the newly acquired skills, knowledge, or attitude
being used in the everyday environment of the learner?” (Kirkpatrick, 1994, p. 2).
Kirkpatrick further notes that measuring at this level is difficult as it is impossible to
predict when the change in behavior will occur; therefore, the trainer has to make
critical decisions in terms of when to evaluate, how often to evaluate, and how to
evaluate (Kirkpatrick, 1994).
Evaluating results or level four provides the greatest challenge to training
professionals (Kirkpatrick & Kirkpatrick, 2006). Level four of Kirkpatrick’s model
connects directly to the impact of the professional development on the organization
or the bottom line. From a business and organizational perspective, Kirkpatrick
states that “this is the overall reason for a training program, yet level four results are
typically not addressed” (Kirkpatrick, 1994, p. 2). In Kirkpatrick’s more recent work
(Kirkpatrick & Kirkpatrick, 2006), he suggests that trainers allow time for results to
develop — perhaps six months to a year.
Evaluating faculty participants’ self-assessed knowledge about tasks associated
with effective online teaching can provide Online Community College with level two
data (Kirkpatrick & Kirkpatrick, 2006). Pinpointing which professional development
activities associated with effective online teaching faculty participants find most
useful will assist the institution in replicating the program and avoid miscalculating
which aspects of the professional development are most beneficial. Although not
within the scope of this study because of the dissertation timeline, the researcher will
propose that Online Community College commit resources to evaluate the professional
development through level four (Kirkpatrick & Kirkpatrick, 2006b). A level four
(results) evaluation would provide evidence of professional development participants’
effectiveness as measured by student learning outcomes in the online learning
environment and by the strategic outcomes of Online Community College.
Essential areas to explore in this research study include adult learning
principles (Brookfield, 1986; Merriam, Caffarella, & Baumgartner, 2007) and the
degree to which the use of those principles supports the success of online
professional development programs such as Online Community College’s Distance
Education Academy. Additionally, there needs to be an examination of what the
literature supports as the preferred methods of evaluating professional development
that is specific to online learning. In any training or professional development
program, Kirkpatrick and Kirkpatrick (2007) note that it is essential to have an
evaluation method to ensure that the program meets the needs of the stakeholders
and is delivered in the most effective way possible for the group for which it is
intended.
Purpose of the Study
Palloff and Pratt (2011) emphasize that, in order to teach using a distance
education platform, faculty members must be equipped with knowledge and skills
relating to teaching online. Professional development should include technology for online
course delivery and the pedagogy used for online teaching (Accrediting Commission
for Community and Junior Colleges, 2008; Higher Education Program and Policy
Council of the American Federation of Teachers [AFT], 2000; Palloff & Pratt, 2007).
Unfortunately, according to Palloff and Pratt (2011), the training of online instructors
has not kept pace with the demand for excellence in the online environment. This
study aims to contribute to the body of knowledge on effective professional
development for online faculty by exploring the relationship between participating in
an online professional development program specifically focused on online teaching
and changes in faculty participants’ self-assessed knowledge about tasks associated
with effective online teaching. Evaluation of professional development serves to
better understand training so that it can be strengthened, and to determine what
effects the training has had in terms of intended outcomes (Guskey, 2000). Using a
quantitative methods approach, the researcher will refer to Kirkpatrick’s four-level
evaluation model to obtain evaluation data on the participants’ skills, knowledge, and
attitudes toward teaching online and on how the professional development provided by
the institution prepared them for online teaching.
Research Questions
The research questions that framed the study are:
1. What is the relationship between participating in the professional
development and changes in participants’ self-assessed knowledge about
tasks associated with effective online teaching?
2. What professional development activities did the participants find to be
most useful?
3. In what professional development activities did the participants have the
greatest knowledge gains?
Significance of the Study
Faculty are an essential component of the success of online learning at the
majority of educational institutions (Allen & Seaman, 2005); however, the
availability of trained faculty to teach online courses continues to be a critical issue
(Palloff & Pratt, 2011). As best practices for online teaching continue to emerge,
Pagliari, Batts, and McFadden (2009) note that faculty must keep abreast of the latest
developments in online teaching. A study documenting the effectiveness of
professional development to prepare faculty to teach online will contribute to the
literature in the following ways. First, this study will provide a community college
with knowledge about how effective the professional development is in preparing
faculty to teach online by documenting preferred practices at Online Community
College. The study will inform program administrators and trainers about how to
enhance and implement future online professional development for adult learners.
Studying the professional development’s effectiveness will provide useful
information on the degree to which the college is receiving a positive return on its
time and financial investment in future training initiatives. Implementing evidence-based
evaluation methods will help to sustain professional development for online
teaching while supporting the long-term commitment to expand distance education as
a means of increasing educational attainment in Hawaii. Second, research findings
will provide other faculty in similar situations with insights on what the faculty in
this study consider most useful in professional development to prepare them to teach
online. Lastly, the students who enroll in online courses will benefit from faculty
prepared to teach online.
Delimitations
The data that will be collected are delimited to a quantitative survey of
faculty teaching at a community college in Hawaii. The scope of the possible
candidates for this study is limited due to the small number of faculty who will
participate in the online professional development. Additionally, a selection bias
may arise because some of the faculty who attend will volunteer for the professional
development program suggesting that they may be more motivated than the general
faculty population. However, all participants who attend the online professional
development program preparing them to teach an online course will be included in
the study population. Each participant will have similar online assignment
expectations in order to successfully complete the online professional development
program requirements. The results of the study are dependent on the responses of
community college faculty participants and can be generalized only to participants
with similar demographics.
Limitations
A limitation of this research study is that the researcher is a faculty member
at the community college that serves as the research site. Being an insider carries the
possibility of becoming so immersed in the setting that the researcher may be
oblivious to existing patterns (Patton, 2002).
Additionally, the researcher will be a participant observer in order to develop an
insider’s view of what is happening in the online professional development. Patton
(2002) considers this a research limitation because the participant observer may
affect the situation being observed in unknown ways. Participants
and program staff may behave in an atypical manner when they know they are being
observed. The researcher will need to combine “participation and observation so as
to become capable of understanding the setting as an insider while describing it to
and for outsiders” (Patton, 2002, p. 268). Another limitation is the convenience
sampling procedure of this study, which decreases the generalizability of findings.
Definitions of Terms
Adult Learner: Malcolm Knowles developed the understanding of adult
learning, characterized by students outside of the K-12 educational system. Knowles
described the group as being self-directed, goal and relevance oriented, and as
having life experiences and practical knowledge (Knowles, Holton, & Swanson,
2005).
Andragogy: A set of core adult learning principles introduced by Malcolm
Knowles in 1968 (Merriam et al., 2007) that focuses on engaging adult learners as
opposed to children or pedagogy (Knowles et al., 2005).
Asynchronous: Communication that can occur at any time, meaning that
people can communicate online without a pattern of interaction. Examples include
e-mail and discussion boards (Palloff & Pratt, 2007).
Blended/Hybrid Course: A course that blends online and face-to-face
delivery 30 to 79% of the time. A substantial portion of the course content is
delivered online, typically using online discussions with some face-to-face meetings
(Allen & Seaman, 2007).
Distance Education: A broad description of distance learning via the Internet
that takes several forms, including “fully online courses, hybrid or blended courses
that contain some face-to-face contact time in combination with online delivery, and
technology-enhanced courses, which meet predominantly face-to-face but
incorporate elements of technology into the course” (Palloff & Pratt, 2007, p. 3).
Online Course/Teaching: Delivering 80% or more of course content online
with typically no face-to-face meetings (Allen & Seaman, 2007).
Program Evaluation: “A systematic, purposeful process of studying,
reviewing, and analyzing data gathered from multiple sources in order to make
informed decisions about a program” (Killion, 2008).
Professional Development: The process of learning among educators (Killion,
2008).
Synchronous: Communication that occurs real-time in a virtual classroom or
a chat room (Palloff & Pratt, 2007).
Teacher Presence: “The design, facilitation, and direction of cognitive and
social processes for the purpose of realizing personally meaningful and educationally
worthwhile learning outcomes” (Anderson, Rourke, Garrison, & Archer, 2001, p. 5).
Traditional or Face-to-Face Course: Conducting a course with no online
technology (Allen & Seaman, 2007).
Universal Design for Learning: A set of principles for curriculum
development that gives learners with diverse abilities and backgrounds equal
opportunities to learn. Universal design supports instructors’ efforts to meet the
challenge of diversity by providing flexible instructional materials, techniques, and
strategies that help instructors differentiate instruction to meet these varied needs
(CAST, 2008).
Web Facilitated Course: A course that uses web-based technology 1 to 29%
of the time to facilitate what is essentially a face-to-face course. Instructor uses a
course management system (CMS) or web pages to post the course syllabus and
assignments (Allen & Seaman, 2007).
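Taken together, the Allen and Seaman (2007) definitions above classify a course by the share of its content delivered online. The brief sketch below simply restates those published thresholds in code as an illustration; the function name and layout are illustrative and not part of the study instruments.

def classify_course_delivery(percent_online: float) -> str:
    """Illustrative restatement of the Allen and Seaman (2007) categories
    defined above; the function itself is not part of this study."""
    if percent_online <= 0:
        return "Traditional / Face-to-Face"   # no online technology used
    if percent_online < 30:
        return "Web Facilitated"              # 1-29% of content online
    if percent_online < 80:
        return "Blended/Hybrid"               # 30-79% of content online
    return "Online"                           # 80% or more of content online

# Example: a course delivering 85% of its content online is classified as "Online".
print(classify_course_delivery(85))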
Organization of the Study
This dissertation is organized into five chapters. Chapter One provides an
introduction, background of the problem, statement of the problem, purpose of the
study, research questions, significance of the study, limitations, delimitations, and
the definitions of terms.
Chapter Two presents a review of the literature relevant to this study. It
addresses the following general topic areas: adult learning principles, how adult
learning principles are applied to online professional development, effective
strategies for preparing faculty to teach online, and methods of evaluating
professional development that is specific to online learning and teaching.
Chapter Three discusses the methodology to be used in the study, including
the research design, population and sampling procedure, research instruments, data
collection methods, and data analysis.
Chapter Four is organized by research questions and reports the findings of
the study. Each section includes a thorough reflection on the research findings and
presents insights about what the findings mean.
Chapter Five includes a brief summary of the research findings based on the
dissertation survey results of Online Community College’s online professional
development certification program specifically focused on training faculty to teach
distance education courses. The chapter will also include implications for practice,
limitations of the study, and a culminating discussion of future research
recommendations.
CHAPTER TWO
REVIEW OF THE LITERATURE
Distance education is on the rise and is considered a critical component of
the long-term strategy of United States higher education institutions, according to a
survey aimed at documenting the extent of online education in the United States
(Allen & Seaman, 2010). Sixty-three percent of all reporting higher education
institutions said that online learning was a critical part of their institution’s long-term
strategy, an increase from fifty-nine percent in 2009 (Allen & Seaman, 2010). Since
technological currency is important to higher education institutions, the level of
support for faculty professional development is one indicator of educational
institutions’ commitment to distance learning.
Professional development prepares faculty for their central role of
establishing quality online courses (Accrediting Commission for Community and
Junior Colleges Western Association of Schools and Colleges [ACCJC-WASC],
2010). Accrediting bodies have required that education institutions provide faculty
with professional development programs on teaching methodologies in distance
education, viewing this support as a critical component of a successful distance
education program (ACCJC-WASC, 2010). Effective professional development is a
process that is intentional, ongoing, and systematic (Guskey, 2000). Equally
necessary is an evaluation component to systematically assess the professional
development programs. The results of these evaluations serve as the basis for
program improvement (ACCJC-WASC, 2002, 2006, 2010).
This study will focus on evaluating a professional development program to prepare
faculty to teach online and its relationship to changes in faculty participants’ self-
assessed knowledge about tasks associated with effective online teaching. The
findings will inform program administrators about how to implement future online
professional development. The purpose of this literature review is to examine
previous research associated with effective professional development to prepare
faculty to teach online. An analysis of prior research and findings will assist in
further development of this study.
This chapter will provide an analysis of adult learning, a discussion of
professional development for online teaching, an overview of effective components
of online professional development programs aimed at preparing faculty to teach
online, and a review of professional development evaluation models. Additionally,
this chapter will include a summary of the components of professional development
to prepare faculty to teach online, including the constructs that are often measured in
the literature. Identifying gaps and trends within the literature, the chapter ends with
a summary of issues and characteristics that ought to be considered when evaluating
a professional development program to prepare faculty to teach online.
Professional Development and Adult Learning
Professional development is a purposeful and intentional process that
includes evidence documenting if learning outcomes are met (Guskey, 2000).
According to Guskey (2000), the effects of professional development on student
learning vary widely as a function of differences in program content, the structure
and format of the experience, and the context in which implementation occurs. Garet,
Porter, Desimone, Birman, and Yoon (2001) suggest that understanding something
about how adults learn, or andragogy, can improve an instructor’s ability to impact
learning outcomes. Adult learners come to professional development programs with
unique characteristics such as being self-directed, independent, goal oriented, and
having a range of significant life experiences (Knowles et al., 2005; Newton, 1977;
Terehoff, 2002). Since the participants of the professional development program are
adult learners, an examination of adult learning is essential to understanding the
research questions.
Adult learning is flooded with multiple theories, models and principles on
how adults learn and the methods to best facilitate their learning (Brookfield, 1986;
Candy, 1991; Garrison, 1997; Grow, 1991; Knowles et al., 2005). Merriam (1993)
confirmed the complexity and present condition of adult learning, stating that:
...it is doubtful that a phenomenon as complex as adult learning will ever be
explained by a single theory, model or set of principles. Instead, we have a
case of the proverbial elephant being described differently depending on who
is talking and on which part of the animal is examined. (p. 1)
Each theory, model, and principle has its vocal supporters and detractors.
In identifying how a professional development program prepares participants
to teach online, there are benefits to understanding how adults learn best. This
section will first describe adult learning or andragogy, then, review related literature
(Knowles, 1988, 1989; Knowles et al., 2005). Knowles et al. (2005) and other
researchers such as Tough (1966) and Brookfield (1986) provide frameworks and
perspectives that have contributed to the literature on adult learner characteristics,
assisting professional development trainers in delivering effective adult education
using online instruction (Knowles et al., 2005). At the end of this section reviewing
perspectives and literature relating to adult learning, the researcher will summarize
the common trends and gaps that can help to address the research questions of this
study.
Andragogy
In the early 1970s, Malcolm Knowles coined the term andragogy, referring to
the differences between how adults learn and pedagogy, or how children learn
(Knowles et al., 2005). Unlike pedagogy, andragogy does not assign the teacher full
responsibility for all decisions about learning content, method, timing, and
evaluation. Adult learners play an active role in the educational dynamics.
Andragogy has been described as a set of guidelines (Merriam, 1993), a philosophy
(Pratt, 1993), a set of assumptions (Brookfield, 1986), a theory (Knowles, 1989), and
a set of core adult learning principles (Knowles et al., 2005). Many researchers
debate labeling andragogy as a true learning theory, particularly citing characteristics
such as self-directed learning that can be found in both children and adults (Davenport
& Davenport, 1985). Despite the criticisms, Brookfield (1984) stated that andragogy
“remains the single most important concept to adult theory of teaching and learning”
(p. 189). According to Merriam, Caffarella, and Baumgartner (2007), “practitioners
who work with adult learners continue to find Knowles’s andragogy, with its
characteristics of adult learners to be a helpful rubric for better understanding adult
learners” (p. 92).
Recognizing that adults possess interests and abilities that are different from
those of children, Knowles created the andragogical model to highlight his six core
assumptions of andragogy (Knowles et al., 2005, pp. 64-66). The first assumption of
andragogy is that adults need to know why they need to learn something prior to
undertaking to learn it. Tough (1979) supported Knowles’s first principle with his
research, finding that when adults undertake to learn something on their own, they
will invest considerable energy in questioning the benefits they will gain from
learning it and the negative consequences of not learning it. Knowles (Knowles et
al., 2005) comments that the first task of a facilitator of adult learning is to help the
learner become aware of why they need to know the subject matter, both before they
engage in the learning and during the learning process.
The second assumption of andragogy is that adults have a developed self-concept
of being responsible for their own decisions. Adult learners have a need to be seen
by others and treated by others as being capable of self-direction. Adult educators
such as professional development trainers are tasked with creating learning
experiences in which adult learners can transition from dependent to self-directed
learners, according to Knowles (Knowles et al., 2005).
The third assumption of andragogy is that adult learners bring a greater volume
and a different quality of experience than youths in the face-to-face or online
classroom. According to Knowles (Knowles et al., 2005), in any group of adults,
there will be a wider range of individual differences than is the case with a group of
youths; therefore, greater emphasis in adult education is placed on individualization
of teaching and learning strategies. Additionally, the richest resources for learning
may reside in the adult learners themselves. Knowles emphasizes that adult
educators use experiential teaching techniques that tap into the experience of the
learners, such as group discussions, simulation exercises, problem-solving activities,
case methods, laboratory methods, and peer-helping activities.
The fourth assumption of andragogy is that adult learners come to the face-to-face
or online classroom ready to learn those things they need to know and be able to do
in order to cope effectively with their real life situations. The critical implication of
this assumption is the importance of timing learning experiences to coincide with the
developmental stages of the adult learner. Adult educators can induce readiness
through exposing the learner to models of superior performance, career counseling,
simulation exercises, and other techniques.
The fifth assumption of andragogy relates to adult learners’ orientation to
learning. In contrast to children’s and youths’ subject-centered orientation to learning,
adults’ orientation to learning is life-centered or task/problem-centered. Adults are
motivated to learn to the extent that they perceive that learning will assist them in
performing tasks or dealing with problems that they confront in their life situations.
The sixth assumption of andragogy is that adults are responsive to some external
motivators such as better jobs, promotions, and higher salaries. However, Knowles
(Knowles et al., 2005) notes that the most potent motivators are internal pressures
such as the desire for increased job satisfaction, self-esteem, quality of life, and the
like.
Andragogy in Practice
In addition to the six assumptions of andragogy, the andragogy in practice model
provides an enhanced conceptual framework to more systematically apply
andragogical principles across multiple domains of adult learning practice (Knowles
et al., 2005, p. 148). The three dimensions of the andragogy in practice model
include 1) the goals and purposes for learning, 2) individual and situational differences,
and 3) andragogy, or the core adult learning principles. The three-dimensional
process for understanding adult learning recognizes the lack of homogeneity among
learners and learning situations, and illustrates that the learning transaction is a
multifaceted activity (Knowles et al., 2005, p. 151).
Knowles placed particular emphasis on the importance of context when
referring to self-directed learning. Upon reviewing 36 applications of andragogy,
Knowles intended andragogical principles to be viewed as flexible assumptions
depending on the situation (Knowles et al., 2005, p. 146). Knowles observed that
people moved toward self-directedness at differing rates and not necessarily in all
dimensions of life, and that in some situations, adults may need to be “temporarily
dependent” in learning situations. To assist the adult learner, Knowles (1975)
proposed his linear model, emphasizing that learning moves through a series of steps
to help the adult learner reach learning goals in a self-directed manner.
Figure 1. Andragogy in Practice (Knowles, Holton, & Swanson, 1998)
The andragogical teacher or trainer uses a process model preparing in
advance a set of procedures for involving learners in a process that involves these
elements:
1. preparing the learner;
2. establishing a climate conducive to learning;
3. creating a mechanism for mutual planning;
4. diagnosing the needs for learning;
5. formulating program objectives (which is content) that will satisfy these
needs;
6. designing a pattern of learning experiences;
7. conducting these learning experiences with suitable techniques and
materials;
8. evaluating the learning outcomes and rediagnosing learning needs.
Many teachers or trainers use a content design or model in traditional
education instead of a process model as described in Table 1. The trainer or teacher
decides in advance what knowledge or skill needs to be transmitted to the learner,
arranges this body of content into logical units, selects the most efficient means for
transmitting this content (lectures, readings, lab exercises, videos, etc.), and then
develops a plan for presenting these content units in some sort of sequence. The
difference between the two is that the “content model is concerned with transmitting
information and skills, whereas the process model is concerned with providing
procedures and resources for helping learners acquire information and skills”
(Knowles et al., 2005, p. 116).
Table 1
Comparison of Pedagogical and Andragogical Assumptions

Preparing Learners. Pedagogical approach (content model): minimal. Andragogical approach (process model): provide information; prepare for participation; help develop realistic expectations; begin thinking about content.
Climate. Pedagogical approach: authority-oriented, formal, competitive. Andragogical approach: relaxed, trusting, mutually respectful, informal, warm, collaborative, supportive; openness, authenticity, humanness.
Planning. Pedagogical approach: by teacher. Andragogical approach: mechanism for mutual planning by learners and facilitator.
Diagnosis of Needs. Pedagogical approach: by teacher. Andragogical approach: by mutual assessment.
Setting Objectives. Pedagogical approach: by teacher. Andragogical approach: by mutual negotiation.
Designing Learning Plans. Pedagogical approach: logic of subject matter; content units. Andragogical approach: sequenced by readiness; problem units.
Learning Activities. Pedagogical approach: transmittal techniques. Andragogical approach: experiential techniques (inquiry).
Evaluation. Pedagogical approach: by teacher. Andragogical approach: mutual re-diagnosis of needs; mutual measurement of program.
Note. Developed from Knowles (1992) and Knowles (1995)
The process elements of andragogy expect that the learner has a high degree
of responsibility for learning, since the entire system is built around the concept of
self-directed learning. Educators have noted that the characteristic of learner
responsibility can be a challenge, observing that “by and large, the adults we work with
have not learned to be self-directing inquirers; they have been conditioned to be
dependent on teachers to teach them” (Knowles et al., 2005, p. 117). Adult learners
often experience a form of culture shock when first exposed to truly adult
educational programs, according to Knowles, Holton, and Swanson (2005).
A brief experiential encounter with the concepts and skills of self-directed
learning (Brookfield, 1986) helps adults to feel more secure in entering into an adult
educational program. Knowles provides suggestions on how to prepare the learner;
however, the researcher did not find any empirical research to support the claims and
proposed activities. Even without empirical support, program designs increasingly
include a preparatory learning-how-to-learn activity for new participants.
Knowles et al. (2005) suggest that such activities may range from an hour to a day in
length, depending on the length and intensity of the total program, and consist of
elements such as a brief explanation of the differences between proactive and
reactive learning, and a group exercise to identify the resources of the participants
(Knowles et al., 2005).
Knowles also suggests that adult learning programs include learning contracts
for completing tasks and evaluations. Caffarella and Caffarella (1986) investigated
163 students from six universities enrolled in adult education where learning
contracts were employed and found that learning contracts had some impact on
developing competencies for self-directed learning, such as 1) translating learning
needs into learning objectives in a form that makes possible the accomplishment of
these objectives; 2) identifying human and material resources appropriate to different
kinds of learning objectives; and 3) selecting effective strategies for using learning
resources. Of the three competencies, translating learning needs into learning
objectives appeared to be the most consistent across subgroups within the study
sample.
Some critics have pointed out that, despite being the primary model of adult
learning for more than 40 years, andragogy has not been well tested empirically
(Grace, 1985; Pratt, 1993; Merriam et al., 2007) and further research is needed
within its theoretical frame (Knowles et al., 2005). According to Merriam et al.
(2007), a few studies have focused on the relationship between andragogical
assumptions and instruction. Beder and Darkenwald (1985) asked teachers who
taught preadults and adults whether their teaching behavior differed according to their
students’ age. Teachers reported viewing adult learners differently and using more
andragogical techniques with them. Gorham (1985), however, observed teachers who taught
both preadults and adults. She found no differences in how a particular teacher
instructed the different learners, although the teachers claimed that they did treat the
preadult and adult learners differently.
Research studies that elaborate on a few of Knowles’s core principles or assumptions have been conducted in business and industry settings. Although different from the educational setting, these studies included adult learners who attended corporate training programs comparable in length to professional development in a college setting. Tannenbaum, Mathieu, Salas, and Cannon-Bowers (1991) studied a group of new recruits to examine training fulfillment, the extent to which the training met the group’s expectations. The eight-week-long recruit training had similarities to training held in other contexts, such as professional development programs that include a strong motivation and value component in addition to skill building. The study focused primarily on how the training was conducted, which was somewhat consistent with adult learning principles. The results showed that training fulfillment was related to post-training organizational commitment, academic self-efficacy, and motivation to use the training, with organizational commitment and motivation to use the training showing the strongest relationships. These findings point to the importance of understanding trainees’ expectations and desires through needs assessment and mutual planning.
Hicks and Klimoski (1987) studied a large not-for-profit research and development organization’s management training on performance appraisals. The group that received a more realistic preview of the topics to be covered and the expected outcomes, and that was given a choice about whether to attend the training, was more likely to think the training was appropriate for them. These managers also believed they were better able to benefit from the training, showed more commitment to their decision to participate in the training, and were more satisfied with the learning. Adult learners with a high degree of choice were also more motivated to learn and learned more.
Baldwin, Magjuka, and Loher (1991) studied whether trainee involvement in planning their learning would enhance the learning process. Specifically, they explored whether being given a choice of training content, and actually receiving that choice, would enhance pre-training attitudes and training outcomes. Their findings reinforced the importance of providing the adult learner with choices about learning: trainees who had choices about attending training, and received their choice, had higher pretraining motivation and learning.
Clark, Dobbins, and Ladd (1993) studied 15 training groups across 12 different organizations representing a wide variety of organizational types and training topics. Their findings showed that career and job utility were significant predictors of training motivation and that, when employees had the opportunity to provide input into training decisions, they were more likely to perceive job and career utility.
Other claims for the adoption of andragogical principles in training have been documented (Holmes, 1980; Thorne & Marshal, 1984), yet these studies all focused on adult learning in organizational settings, so caution must be taken in generalizing to all adult learning situations. One primary message for professional developers of adult learners is to involve them in mutual planning and as learning partners in their professional development. However, the methods by which this effect works cannot be determined from current research. According to Knowles et al. (2005), “engaging adults in planning the learning process could enable people to decide not to participate in low-value learning, or could actually change their attitude toward the learning” (p. 185). Empirical research appears to point to adult learners’ needing information and involvement before learning: the how, the what, and the why of learning.
Tough’s Adult Learning Project and Self Teaching
One of the most significant early contributors to research on the self-directed learning characteristic of adult learners was Allen Tough (1967), who focused on the behavior of adults when they themselves had the primary responsibility for planning and conducting a fairly lengthy learning project. Tough completed his qualitative research study on teaching tasks performed by “adult self-teachers” for his doctoral dissertation at the University of Chicago in 1965 and published the findings a year later in The Adult Education Quarterly (Tough, 1966). Tough’s original research has contributed to modern-day andragogical and self-directed learning principles (Brookfield, 1986; Knowles et al., 2005). In fact, Knowles cited Tough’s 1971 research with adult learners as being instrumental in showing how adults learn, how they organize their learning, and how they seek out support.
Tough (1966) investigated 40 learning projects in which adult learners were
considered self-directed learners. He used the term self-teaching when referring to
examples of learning during which the adult learner clearly acted as his or her own
teacher. Tough (1966) stated that “the term self-teacher has been used to refer to any
person who is engaged in self-teaching” (p. 30). The self-teacher assumed the
responsibility for planning and directing the course of learning. According to Tough
(1966), similar behavior has been called self-instruction, self-education, and
independent study.
Each adult learning project in Tough’s study met three criteria:
1) The adult learner made a purposeful attempt to learn a specific skill or
definite knowledge; 2) the adult learner spent a minimum of eight hours
participating in self-directed learning the year preceding the study; and 3) the
adult learner, rather than any professional educator, assumed the majority of
the responsibility for planning, controlling, and supervising the entire project.
(Tough, 1966, p. 30)
There were several major questions guiding Tough’s research:
What sorts of difficulties and concerns arise during self-teaching? How much
assistance do self-teachers obtain with various aspects or tasks of self-
teaching? How many individuals provide the assistance? What types of
people are these assistants? How much additional assistance do self-teachers
want? (Tough, 1966, p. 31)
Tough (1966) interviewed the 40 adult learners individually for two hours. Each interview began with Tough presenting a definition of self-teaching to the research participant along with a subject matter list, or teaching task list, that Johnstone (1963) had found to be commonly used by self-taught learners. Based on their responses, participants were asked to complete a questionnaire indicating how much assistance they had obtained from other people during the self-teaching process. The questionnaire included the following response options: “I could not have performed the task successfully without assistance received; I received a large amount of assistance; I definitely received some assistance; No assistance” (Tough, 1966, p. 33). Tough (1966) also asked participants whether they would have liked more assistance than they actually received from people.
Ninety-five percent of the adults interviewed had conducted at least one self-
teaching project lasting a minimum of eight hours during the preceding year (Tough,
1966). All 40 of the self-teachers, however, obtained a very large amount of
assistance. They sought assistance from a variety of people including fellow learners.
Every research participant used at least two different types of helpers, and the
majority used three or four. Additionally, all but seven of the research participants
would have liked more assistance.
The findings informed Tough’s self-teaching theory and suggested that functioning as teacher and learner simultaneously is not easy, as “many of the subjects experienced some difficulty or concern while planning and carrying through their major self-teaching project” (Tough, 1966, p. 36). Tough’s study supported the notion that it is “erroneous to think of a typical self-teacher as a person who plans and conducts his learning alone and without human assistance” (Tough, 1966, p. 36). He extended his self-teaching theory to the context of an adult who is enrolled in a course, stating that the learner “may obtain more assistance from a greater variety of individuals than most educators realize” (Tough, 1966, p. 36). Tough’s (1966) research provided specific implications for educators, stating that “self-teachers obtain a variety of assistance from several people and it seems clear that educators usually should not expect a self-teacher to work alone without assistance from institutional staff or someone else” (Tough, 1966, p. 37). By understanding the behaviors of adolescents and adults during self-teaching, Tough (1966) emphasized, “educational institutions may become much more effective at preparing and assisting people who want to continue learning throughout their adult years” (p. 37).
Brookfield’s Perspective on Adult Learning
Since Tough’s original work on self-teaching, others have replicated his interview schedules, prompt sheets, and questionnaires in studies of a number of occupational groups (Brookfield, 1984). According to Brookfield (1984), who analyzed contemporary approaches to adult learning, Tough’s subsequent research in Intentional Changes (Tough, 1982) documented almost 50 follow-up studies since his original study of 40 Canadian college graduates (Tough, 1967). Brookfield noted that “we may safely assume that practically every graduate program in North America currently includes at least one dissertation or thesis replicating this research with a new target group” (Brookfield, 1984, p. 60). Brookfield acknowledged that Tough’s original work accomplished several things. Firstly, Brookfield commented that Tough’s research “helped to shift the focus of educators’ attention onto the phenomenon of adult learning” (Brookfield, 1984, p. 60). Secondly, Tough’s research “has challenged the assumption that adult learning can only occur in the presence of a fully accredited and certified professional teacher appropriately trained in techniques of instructional design and classroom management” (Brookfield, 1984, p. 60). Thirdly, his research has “helped to break down the false dichotomy in which institutionally sponsored learning is seen as purposeful and deliberate and learning occurring in non-institutional contexts is held to be serendipitous, ineffective and wholly experiential” (Brookfield, 1984, p. 60).
While Tough’s (1966) historical research and subsequent studies (Tough, 1982) have helped to inform contemporary research on self-directed learning, Brookfield critiqued several areas of his work. One major criticism by Brookfield (1984) is that Tough’s qualitative questioning techniques addressed the quantity of self-directed learning but not the quality of the learning. “The result of the understandable desire to estimate the number of hours adults spend in conducting their own learning has been a lack of any awareness regarding the quality or effectiveness of such learning” (Brookfield, 1986, p. 65). Additionally, Brookfield suggested that there needs to be “a congruence between adults’ own judgments regarding the effectiveness of their self-directed learning, and its effectiveness by some objective measure” (Brookfield, 1986, p. 65). Brookfield (1986) proposed that Tough should have included a means to compare the success of the learning against judgments made by experts in the areas that the adults studied. An example given by Brookfield was whether a “self-teaching project on fascism should be considered as of equivalent research significance as learning how to rewire a basement” (Brookfield, 1984, p. 66). He questioned the implicit manner in which the research design treated all learning projects as having the same significance to the adult learner; Brookfield (1986) viewed these broad generalizations as absurd.
The danger of emphasizing mechanical aspects of learning projects such as
number of hours spent in learning, the number of assistants used, or the
nonhuman resources most frequently adopted, is that of coming to regard all
self-directed learning as exhibiting some kind of conceptual or substantive
unity. (Brookfield, 1984, p. 67)
In contrast to Tough, Brookfield (1986) holds a broader perspective of self-directed learning, noting that it is not merely a matter of counting self-teaching hours or cataloging resource-locating techniques. It is a matter of understanding how the learner changes perspectives, shifts paradigms, and replaces one way of interpreting the world with another. The role of the teacher or facilitator is to prompt learners to consider alternative perspectives by challenging them to examine their previously held values, beliefs, and behaviors. Brookfield (1986) points out that the teacher or facilitator can assist learners by using an “analytical component” through the development of competence in “critical reflectivity” (Knowles et al., 2005, pp. 105-106). Challenging learners may be seen as creating learner anxiety, “...but such anxiety should be accepted as a normal component of adult learning and not something to be avoided at all costs for the fear that learners will leave the group” (Knowles et al., 2005, p. 106).
Brookfield also questioned how effectively the research participants could recall and reconstruct their learning experiences. “Relying solely on self-reported success, quality, and effectiveness of self-directed learning is clearly questionable” (Brookfield, 1986, p. 55). Candy (1991) agreed with Brookfield, noting that “there are many drawbacks to such an approach, not the least of which is people’s frequent inability to recall the details of what might have been, at the time, crucial events in the learning process” (p. 169). Both Candy and Brookfield agreed that one cannot ensure the reliability of retrospective recreations of the learning process. Despite the criticisms and the lack of empirical studies, self-directed learning and individual learning projects have flourished within the field of adult education (Merriam et al., 2007).
Other Adult Learning Models/Perspectives
Self-directed learning has surfaced as a major driving force of adult education research (Merriam et al., 2007). This self-directedness is usually defined in terms of externally observable learning activities or behaviors. It has recently been described in the education literature as “a process of learning in which people take the primary initiative for planning, carrying out, and evaluating their own learning experiences” (Merriam et al., 2007, p. 110). There is minimal empirical research on implementing self-directed learning, but many models and approaches have been described.
Researchers such as Candy (1991) provided in-depth conceptual models expanding on the descriptive nature of Tough’s and Knowles’s research. Grow (1991) viewed self-directed learning in terms of learner characteristics and the “degree of choice that learners have within an instructional situation” (p. 128). Grow’s (1991) model placed emphasis on the teacher/learner relationship, proposing that “learners advance through stages of increasing self-direction and that teachers can help or hinder that development” (p. 125). Garrison (1997) portrayed self-directed learning as a process that is not so well planned or linear in nature. Instead, Garrison proposed a multidimensional and interactive model of self-directed learning. Interactive models “...emphasize two or more factors, such as opportunities people find in their own environments, the personality characteristics of learners, cognitive processes, and the context of learning, which collectively interact to form episodes of self-directed learning” (Merriam et al., 2007, p. 111).
“Criticisms relating to self-directed learning have come from theorists operating from a critical philosophical perspective” (Knowles et al., 2005, p. 142). Most prominent among these are perspective transformation (Mezirow, 1991) and a critical paradigm of self-directed learning (Brookfield, 1984). Brookfield was the only self-directed learning theorist who emphasized the need for facilitation of critical thought, focusing solely on the individual learner.
In 1984, Brookfield noted the prevalence of self-directed learning in education research, commenting that “by almost every measure conceivable, research into self-directed adult learning constitutes the chief growth area in the field of adult education research in the last decade” (Brookfield, 1984, p. 59). The development of self-directed learning capacities continues to be “the most frequently articulated aim of educators and trainers of adults” (Brookfield, 1986, p. 40).
Self-Directed Learning — A Framework for Online Teaching
Knowles (1975) foresaw technology as one of the major forces shaping adult learning in the twenty-first century and a force that would be consistent with andragogical principles. Online teaching presents opportunities for adult learning (Gibbons & Wentworth, 2001; Knowles et al., 2005). First, it directly accommodates the adult learner’s desire to be self-directed in learning. Second, well-developed online instruction enables adults to tailor the learning experience to fit prior life experiences. Third, when properly designed, online instruction easily allows learners to tailor the learning to real-world problems.

Along with opportunities come special challenges, mainly in self-directed learning through online instruction. Using technology as the primary tool for self-directed learning requires that learners have very well-developed self-directed learning skills. “It is not uncommon for organizations implementing technology-based learning to discover that the intended learners do not have the metacognitive skills, motivation, or confidence to engage in the required level of self-directed learning” (Knowles et al., 2005, p. 237). Online learning allows for andragogy but requires that learners be ready for andragogy and for controlling their own learning. The first step of Knowles’s program planning model, “Preparing the Learners,” is therefore an important element in ensuring that adult online learners have the fundamental skill of learning how to learn and can take advantage of the opportunities that online learning offers.
Adult Learning Theory Discussion
This section presented the historical self-teaching research done by Tough (1966), Brookfield’s more contemporary self-directed learning perspective, and Knowles’s (1975) andragogical principles and model, which built on Tough’s original work and assumed that self-directed learners move through a series of linear steps to reach their learning goals. Other researchers such as Candy (1991), Grow (1991), and Garrison (1997) provided expanded perspectives; however, they did not provide empirical research to support their individual perspectives. Garrison’s model, in contrast to Knowles’s, offered a nonlinear and comprehensive model by integrating the contextual, cognitive, and motivational dimensions of the adult’s educational experience.
A common element of adult learning is the self-directed nature of the learner, including the learner’s desire to choose and control the learning pace, activities, and even content. Another common element is that educators must provide resources and assistance to help the learner, who assumes the primary responsibility for completing the learning tasks. A self-directed learner may work independently, but resources need to be available to help the learner complete or supplement learning. Support may also take the form of providing the learner with information about andragogical principles or with prerequisites related to the skill that is the subject matter of the training. Self-directed learning proponents also suggest that educators serve as guides facilitating the learning process.
Educators of adults wanting to expand their theoretical perspective and
practice of effectively facilitating learning are encouraged to reflect upon self-
directed learning models (Garrison, 1997). Self-directed learning models and
andragogical principles are important to this study since they are predominantly used
in teaching adults; however, research is needed in the area of online professional
development in the higher education distance education environment.
There currently is limited research examining how self-directed learning models can be applied in different contexts such as the distance education learning environment. Mishra and Koehler (2006) noted that the “advent of technology has dramatically changed routines and practices in most arenas of human work” (p. 1017), including how adult education is delivered. Based on the review of empirical research relating to adult learning, the following elements may also have a positive influence on learning outcomes for online professional development: learning contracts translating learning needs into learning objectives and emphasizing task completion and evaluation measures (Caffarella & Caffarella, 1986; Clark, Dobbins, & Ladd, 1993); adult learner needs assessments prior to the learning experience (Tannenbaum, Mathieu, Salas, & Cannon-Bowers, 1991); a preview of topics, expected outcomes, and a choice to participate in the learning experience (Baldwin, Magjuka, & Loher, 1991; Hicks & Klimoski, 1987); and adult learner resources and best practice models throughout the learning experience (Tough, 1966; Knowles et al., 2005).
Professional Development for Online Teaching
Technology, Pedagogy, and Content Knowledge — A Comprehensive
Framework
Koehler et al. (2004) proposed that educators today should possess a high degree of knowledge and preparedness in the areas of technology, pedagogy, and content knowledge, also referred to as the TPCK framework (Mishra & Koehler, 2006). The application of the TPCK framework provides professional development educators with a structure for meeting the challenges of developing instructors who are proficient not only in the use of technology, but also in the use of technology for education and learning. Koehler et al. (2004) argued that it is important for any successful faculty development program to have ways to let faculty experience the interaction of these three components as well as understand the ways they can co-constrain each other. Most traditional approaches to professional development fall short of this mark according to Koehler et al. (2004).
Koehler et al. (2004) pointed out the inadequacies of more traditional approaches to faculty development. First, workshops and tutorials tend to be ill-suited to developing a thorough understanding of the relationships between the technology and pedagogy that join together in effective practice. These approaches tend to treat technology as being separate from pedagogy: once instructors learn a particular piece of technology, trainers assume that the instructors will effortlessly know how to use the technology in their teaching. More importantly, content may not be considered in the professional development at all, as though it were irrelevant to course design. Koehler et al. (2004) added that short workshops do not offer instructors the opportunity to explore connections between technology, content, and pedagogy, which are essential elements for effective technology integration. Second, another standard approach, offering faculty members technical support, advocates creating groups of technical experts who are available to assist faculty. Koehler et al. (2004) saw problems with this approach because technical support personnel rarely have backgrounds in education or instructional design yet are charged with helping faculty to reconfigure their ideas to match the online tools available for their course. Since faculty members do not have the technical knowledge, they are constrained by what is offered to them by the technical experts. Koehler et al. (2004) viewed leaving design decisions to the technical experts as having a negative impact on pedagogy and online course quality.
Koehler et al. (2004) and Koehler and Mishra (2005) recommend a design team approach to faculty professional development, noting that many other approaches to faculty development are plagued by the aforementioned problems. According to Koehler et al. (2004), “there is no single technological solution that applies for every teacher, every course or every view of teaching” (p. 31). Instead, “quality teaching requires developing a nuanced understanding of the complex relationships between technology, content, and pedagogy and using this understanding to develop appropriate, context specific strategies and representations” (Koehler et al., 2004, p. 31).
The foundation of Koehler, Mishra, Hershey, and Peruski’s design team approach came out of research conducted at Michigan State University (MSU). Six tenured faculty members were enrolled as students in a face-to-face educational technology Master’s level course. The design teams consisted of one faculty member and three or four Master’s students working on designing an online course that would be taught by the faculty member in the upcoming school year. The purpose of the research was to study the outcomes of the students’ and faculty’s participation in the course, with particular interest in developing a better understanding of the elements that constrain and support the development of genuine technology integration. Sources of data, including progress reports, group postings, e-mail interviews with the students, in-depth interviews with the faculty members, student reflection papers, and other artifacts, were used to develop case studies of the design groups. A common finding for each group was “various episodes of grappling with issues surrounding the content, issues of pedagogy, and technology” (Koehler et al., 2004, p. 34).
Koehler et al. (2004) presented a portrait of the design approach from three perspectives: the educational technology graduate students, the faculty participants, and the group at large. This review, however, will discuss the faculty perspective since that is the focus of this dissertation. The faculty members gave up at least 10 hours of their week for the 15-week semester to be students in the design class. Instead of turning their courses over to a web designer to develop, the faculty worked together to design the courses themselves. Along the way, they not only learned new technology skills, but they also learned how the technology could be leveraged to accomplish higher-order learning goals for students in their content area. Despite the small number of faculty involved in the study, Koehler et al. (2004) noted that the experience was a very profitable one in terms of realizing the importance of wrestling with key features of online work before going online, namely content, technology, and pedagogy. Content refers to the actual subject matter that is to be taught or learned, such as the core ideas, procedures, resources, and representations that make up the course and subject matter. Technology encompasses different methods of presenting and interacting with information on the computer screen. Since technology will continually change over time, Mishra and Koehler (2006) highlighted the importance of the ability to learn and adapt to new technologies irrespective of what the specific technologies are. Pedagogy is viewed as the process, practice, or methods of teaching and learning; it encompasses overall educational purposes and techniques, as well as strategies for evaluating student understanding. According to Mishra and Koehler (2006), an instructor with deep pedagogical knowledge understands cognitive, social, and developmental theories of learning and how they apply to students in the learning environment. In an online course, all three components (content, pedagogy, and technology) are delivered on the web. Therefore, Koehler et al. (2004) concluded that the design of the online course is key to the success of the course.
In more recent work, Mishra and Koehler (2006), instead of treating content, pedagogy, and technology as three separate issues, presented a framework emphasizing the complex interplay of these three bodies of knowledge.
Figure 2. Pedagogical Technological Content Knowledge Framework: The three
circles, content, pedagogy, and technology, overlap to lead to four more kinds of
interrelated knowledge (Mishra & Koehler, 2006, p. 1025)
Using the Technology Pedagogy Content Knowledge (TPCK) framework involves engaging instructors in learning experiences built around real educational problems to be solved with technology, according to Mishra and Koehler (2006, p. 1034). The framework builds on Shulman’s construct of pedagogical content knowledge (PCK) to include technology knowledge (Koehler & Mishra, in press). In learning technology by design, Mishra and Koehler (2006) noted, learners are given the opportunity to transcend the passive learner role and to take control of their learning, which has been mentioned previously in this section as an adult learning principle. Additionally, the andragogical model holds that adult learners are problem solvers by nature, which coincides with the learning-by-design model.
In 2005, Koehler and Mishra focused on measuring TPCK through surveys
administered at different times during a learning experience. In their study on
professional development, four faculty members and thirteen students completed
surveys twice, once toward the start of the training and once at the end. Mishra and
Koehler (2006) noted that their data “clearly show that participants in the design
teams moved from considering technology, pedagogy, and content as independent
constructs toward a more transactional and codependent construction that indicated a
sensitivity to the nuances of technology integration” (p. 1043).
The TPCK framework will allow the researcher to make sense of the
“complex web of relationships that exist when teachers attempt to apply technology
to the teaching of a subject matter” (Mishra & Koehler, 2006, p. 1044). In particular,
“it argues against teaching technology skills in isolation and supports integrated and
design-based approaches as being appropriate techniques for teaching teachers to use
technology” (Mishra & Koehler, 2006, p. 1045). Furthermore, learning
environments that encourage learners to explore technologies in relationship to
subject matter in authentic contexts are often most useful. The TPCK framework,
according to Mishra and Koehler (2006), provides educational researchers with a
language to talk about the connections that are present or absent in
conceptualizations of educational technology. Mishra and Koehler (2006) encourage
further research in the area of professional development around technology, which is the focus of this dissertation.
Up until this point, the TPCK studies discussed have focused on creating online courses in a face-to-face classroom. Archambault and Crippen (2009) surveyed 1795 online teachers employed at virtual schools across the United States. A total of 596 K-12 online teachers from 25 states responded, representing a 33% response rate. The 24-item survey underwent numerous revisions over a two-year period and employed a guiding framework for what online teachers should know and be able to do. The survey measured the online teachers’ knowledge with respect to the three key domains described by the TPCK framework, which has since been renamed TPACK.
Although TPCK has been renamed TPACK, the framework continues to measure technology, pedagogy, content, and the combinations of each of these areas. Since this dissertation is concerned with evaluating professional development conducted online, Archambault and Crippen’s (2009) findings can provide insight on how to measure instructor knowledge of the TPACK domains. Their findings indicated that knowledge ratings were highest in the domains of pedagogy, content, and pedagogical content, indicating that the teachers who responded felt very good about their knowledge related to those domains. However, the teachers were less confident when it came to the technology domain. One area that displayed a high correlation was between pedagogy and content, calling into question the distinctiveness of these domains. Archambault and Crippen (2009) further noted this as a possible flaw in the TPACK model as a framework for measuring these groups of teachers. What was evident is that teachers felt strongly about their ability to deal with issues related to pedagogy and content and were more hesitant when it came to dealing with technology.
Archambault and Crippen (2009) also suggested that this result is likely related to the activities that traditional teachers do on a daily basis, such as planning lessons, using teaching strategies to convey content, mapping content to district standards, and assessing students’ knowledge of various topics, all of which are emphasized in teacher education programs. It was interesting to note, however, that teachers had difficulty differentiating between pedagogy and content, as their survey results demonstrated that the teachers viewed the two as intertwined.
Higher education faculty are not required to have a teacher education certification credential in order to teach a college-level course. Unlike K-12 educators, who are expected to successfully complete education and subject matter courses as well as a student teaching experience in order to receive a teacher education certification, higher education faculty usually are required to have an advanced degree in their subject area only. Despite the thoroughness of the TPACK framework, Archambault and Crippen (2009) discourage researchers from using the TPACK survey with K-12 educators because of the intricacies of several of the domain combinations. However, this may not be the case when evaluating professional development for higher education faculty, since they are not trained in the pedagogical domain. This study will help to better understand the relevance of the TPACK domains in a higher education environment. Furthermore, the study findings will assist instructional designers of professional development programs by highlighting what higher education online faculty need to know about technology, pedagogy, and content knowledge, and their interrelationship (Archambault & Crippen, 2009). This dissertation’s survey will refer to the TPACK survey questions for the three primary domains of pedagogy, technology, and content knowledge to obtain a better understanding of faculty perceptions of how the online professional development training prepared them to teach online.
Effective Components of Online Professional Development Programs
A comprehensive literature review revealed a limited amount of scholarly research relating to training faculty to teach online (Wolf, 2006), despite the growth in the number of online faculty. According to Varvel (2007), a conservative estimate of the number of online instructors in the United States is 50,000. The majority of these instructors have no formal online education training, relying mainly on their experience as students and as face-to-face instructors. Although instructors who are teaching online may be delivering quality courses, one is still left to ask just how ready these instructors are to teach online and how they are keeping up with rapidly changing technology.
Instructor expertise has been cited as one of the most important factors contributing to quality online instruction (Varvel, 2007). However, many instructors report that they are unprepared to teach online because they have been prepared mostly to instruct in a traditional classroom environment (Varvel, 2007; Wilson, 2001). Despite the need for quality online teaching professional development, many professional development programs are not of high quality, offering fragmented, intellectually superficial seminars (Borko, 2004) or lacking ongoing teaching support for instructors beyond the professional development (Barnett, 2002).
A survey of faculty who teach undergraduate mathematics courses online indicated that most faculty at two-year colleges have not received adequate training. Sixty-four faculty who teach online undergraduate mathematics courses throughout the United States were e-mailed a survey asking how they learned to teach online, and thirty-five faculty responded. Sixty percent of the faculty said they would have benefited from more training in facilitating online interaction before they began teaching online (Pankowski, 2004). Another finding was that only 19 of the responding faculty had ever taken an online course themselves. Although the original study sample was small, in subsequent focus groups the faculty were unanimous that, in order to teach online, faculty should first experience education from the student’s point of view (Pankowski, 2004). Pankowski’s study addressed the importance of effective delivery methods for professional development to prepare instructors for online teaching.
Barker (2003) and Grant (2004) noted that professional development to teach online is important and should be an ongoing process supported by continual training, as opposed to a one-time workshop. In a case study describing the process of creating a faculty development program for online instruction for nursing faculty at Sacred Heart University, Barker (2003) found that faculty development programs involved both instructional design and the use of technology. According to Barker (2003), to ensure quality and student learning, sound educational theory and principles must be incorporated in online course development and delivery. One of the key components of the professional development program that will also inform this dissertation’s research questions was having faculty members observe another online course so that they could learn how to manage discussion boards and see how online courses build a learning community and support networking and collaborative learning (Barker, 2003).
Wolf (2006) conducted a literature review to examine best practices in the training of faculty to teach wholly or mostly online, which revealed a scarcity of scholarly work in this area. The study aimed to determine best practices in training University of Maryland University College (UMUC) faculty to teach online. UMUC is a nontraditional higher education institution focused on educating working adults and active duty military personnel. The researcher used a case study method and interviewed experts in the field of distance education. The interview results were compared to suggest a set of best practices in training faculty to teach online.
UMUC was selected for the case study because the university has a demonstrated record of scalability, evaluation, and revision of its processes to improve its practices. According to Wolf (2006), “scalability is critical to any effort to define best practices” (p. 51). The first part of the study included a meta-analysis of more than 300 books, dissertations, periodicals, and Web sites on the subject of training faculty and trainers to teach online. Wolf (2006) found that most of the literature concerned helping faculty to integrate technology into teaching rather than teaching wholly online. The second part of the study involved interviewing 25 participants, including UMUC faculty, who had years of experience with numerous faculty members, training programs, and methods of distance education.

The result of Wolf’s (2006) research is a list of best practices for training higher education faculty who teach courses online. According to Wolf (2006), no universal definition of best practices exists, although multiple authors write about distance education. In her study, Wolf defined best practices as methods being used effectively by industry experts or by participants who identified themselves as having experiential knowledge based on years of experience.
Wolf (2006) noted that “faculty who will be teaching online are successful when they participate in formal training” (p. 55). She further defined success in online teaching as a faculty member completing at least one online class with student evaluation scores and student complaints within the institution’s accepted limits. Interestingly, there were no references to evaluation methods measuring any assessment of learning gains. Instead, faculty online teaching success was based on level one evaluations measuring students’ reactions to their learning experiences (Kirkpatrick & Kirkpatrick, 2007). Although these reactions may be positive and the students may have enjoyed their online learning experiences, level one evaluation results do not speak to any differences in student learning, according to Kirkpatrick and Kirkpatrick (2007).

Faculty at UMUC are required to complete a five-week-long baseline training course conducted wholly online if they will be using UMUC’s proprietary courseware; however, no reference to training was made for the non-UMUC faculty who were interviewed for the study. From the meta-analysis and the interview results, Wolf (2006) concluded the following (pp. 55-61):
1. Formal online training results in successful teaching.
2. Successful classroom teaching has no correlation with successful online teaching. Successful online faculty need not have had face-to-face teaching experience before teaching online.
3. Minimum computing skills including the use of a computer, the Internet,
and online applications are required for successful teaching online.
Faculty are asked to self-assess their technical skills before beginning the
training. Effective programs offer separate training for faculty who need
to improve their computing skills before training to teach online, and
assessment of those skills is conducted by means other than self-
assessment.
4. Effective training programs are designed so that faculty are trained to
teach online using the course delivery system with which they will be
teaching. Instructors learn how to create assignments, manage online
conferences, and provide student feedback.
5. Successful training encompasses pedagogy, although methods for
introducing pedagogy differ.
6. Effective distance education programs provide ongoing faculty support in
the form of mentoring, shadowing, continuing education workshops, or a
combination of these. Effective programs survey their faculty to
determine what types of support are most desired.
7. Motivation is the most important factor when choosing faculty to teach
online. Additional incentives may be required when a distance education
program is new and institutions are trying to encourage existing faculty to
move online; however, it is not clear which form of incentive is best.
Successful programs choose incentives that are meaningful to their
faculty.
8. Faculty should be recruited specifically to teach online. If the core value
resides in the quality of teaching, and the teaching experience is defined
as mentoring and dialogic learning, the primary criterion for hiring
faculty to teach in the online environment is online facilitation and
mentoring and not content expertise.
9. Faculty should be involved in course design. If faculty are expected to
design courses without the assistance of an instructional designer,
instructional design theory is included as part of the training to teach
online. Distance learning courses require working with instructional
designers, production technicians, evaluation experts, and support service
units.
10. Successful programs are supported by their institution (financial, human,
and infrastructure resources necessary to design, maintain, and support
the distance education training program).
Wolf’s (2006) research provides comprehensive, research-based findings on the development of best practices for training programs in educational institutions and corporations; however, there is minimal evidence that the best practices have been tested empirically. In fact, Wolf’s findings relating to the need for course content expertise contradict the TPACK framework empirically tested by Mishra and Koehler (2006).
Additionally, the 25 interviews included a range of professionals and not
specifically faculty who have taught courses online. Nonetheless, several themes
emerged from Wolf’s (2006) research that may be meaningful to the investigation of
Online Community College’s professional development for online teaching. When
developing online training programs, faculty members should 1) want to teach in an
online environment; 2) have a minimum technology competency prior to beginning
their training to teach online; 3) be required to engage in training in the course
delivery system before teaching online; 4) use the course delivery system with which
they will be teaching; 5) understand how to encompass appropriate pedagogy when
teaching online; and 6) be involved in the online course design. Finally, the
institution should “have a clearly articulated mission for its distance education
program and must commit to providing the necessary financial, human, and
infrastructure” (Wolf, 2006, p. 62).
The rapid growth of online education and the perceived difference between online and face-to-face instruction have required training and support for instructors transitioning to online delivery (Roman et al., 2010). Roman, Kelsey, and Lin (2010) conducted a study at a single university that examined which components of a six-week, fully online program, Preparing Online Instructors (POI), helped faculty to transition their face-to-face courses to online delivery. The goal of the professional development was to immerse faculty in a training course that required them to be online students. Additionally, the program aimed to help faculty experience typical distance education student behaviors and communication practices, and to develop positive online teaching practices and classroom management strategies. Professional development program participants were primarily faculty who had minimal online teaching experience. All 40 participants expressed great interest in exploring online teaching and developing online pedagogy skills, and they enrolled voluntarily in the intensive professional development. No incentives such as additional compensation were provided, except a certificate of participation at the end of the training.
The professional development delivered one module per week for six weeks, covering the following topics: online course design and syllabus; building an online classroom; online course activities and assignments; online instructional content and multimedia; copyright and fair use; and best practices in online classroom management. Each module had multiple assignments and discussion board postings to facilitate weekly discussions around the training content.
The researchers used qualitative data from the program participants’ open-ended question responses and discussion board reflections, as well as a survey. The survey items were developed with the view that professional development participants are adult learners and that professional development initiatives must present them with opportunities relevant to their personalities, life and educational experiences, and learning preferences. The framework for the survey items came from The Seven Principles of Effective Instruction in Undergraduate Courses proposed by Chickering and Gamson (1987) and modified by Chickering and Ehrmann (1996) for multimedia use. According to Chickering and Ehrmann, effective online instructors incorporate the following principles into their teaching:
1. facilitate contacts between students and instructors;
2. foster reciprocity and cooperation among students;
3. promote active learning;
4. provide prompt feedback;
5. emphasize time on task;
6. communicate high expectations;
7. respect diverse talents and ways of learning.
The professional development evaluation did not go beyond Kirkpatrick’s (2007) level one, which mainly measured the participants’ reactions to the learning experiences relative to the training content, the experience, and the instructor. There were no references to higher levels of evaluation, such as assessment of student learning outcomes. Eighty-six percent of the program participants found the professional development content enriching; seventy-one percent indicated that the course improved their technological skills for online teaching; and eighty-six percent indicated that the strong pedagogical training was beneficial in preparing them to teach online. Other areas of the professional development that were beneficial in preparing the participants to teach online included discussion board postings with the trainer and peer participants; discussion board reflection assignments; instructor-provided models and examples of best practices; the opportunity to learn online about teaching online; and prompt feedback from the trainer. Overall, the faculty evaluations suggested that effective online professional development programs need to emphasize technological and pedagogical skill development.
Findings relating to what was least beneficial in preparing faculty to teach online included domination of the discussion board by a few verbose participants; a couple of challenging assignments; occasionally overly positive feedback from the trainers; and overuse of the discussion board. In addition, some participants noted that they felt overwhelmed being grouped with other faculty who had more experience in online teaching, while others appreciated the exposure to the background of more experienced instructors.
Other key findings suggested that faculty need to be surveyed prior to the start of the professional development to identify their training needs, which is one of the key principles of andragogy. It is also recommended that professional development trainers consider formative evaluations, which can help to ensure that the training is modified and improved to better prepare online instructors to be effective in the online setting. Ongoing resources and support mechanisms after the training were also identified as essential professional development components. According to Roman et al. (2010), findings from the study inform professional development providers and university administrators on how to plan and implement instructor training programs to enhance online teaching skills. Roman et al.’s (2010) research findings will help to inform the research questions of this dissertation in considering variables and specific elements for the quantitative survey.
Taylor and McQuiggan (2008) conducted a research study at The Pennsylvania State University to learn more about the professional development experiences and needs of faculty who teach online. Invitations to complete an online survey were e-mailed to 260 faculty, and 68 usable surveys were returned, representing a 27.9% return rate. When asked about resources that they had used in preparing to teach online, faculty responded that instructional designers and colleagues experienced in teaching online were most helpful. However, they were interested in assistance in the following areas when designing and developing future online courses: 1) choosing appropriate technologies to enhance their online course (55.9%), 2) converting course materials for online use (35.3%), 3) creating effective online assessment instruments (35.3%), 4) creating video clips (33.8%), and 5) determining ways to assess student progress in an online course (33.8%). The faculty also identified the course delivery topics that held the most interest: 1) facilitating online discussion forums (47.1%), 2) building and enhancing professor-student relationships in the online classroom (39.7%), 3) facilitating web conferencing sessions (35.3%), 4) managing online teaching workloads (33.8%), and 5) providing meaningful feedback (32.4%). Overall, “while faculty have some interest in enhancing their skills in using technology tools to build solutions, most are concerned with the instructional design implications in developing their online courses” (Taylor & McQuiggan, 2008, p. 34). Faculty indicated that they wanted a better understanding of the types of online interactions and how to make better decisions about technology selections in order to achieve student learning goals. When experienced faculty were queried about how to advise faculty new to online teaching on the best methods of preparing to teach online, they suggested that new online faculty observe an online course, be an online student, work with an instructional designer, learn the university’s course management system, and locate technical assistance.
Other noteworthy findings about the most helpful aspects of the online professional development involved the opportunity for faculty to share real-life experiences with their colleagues, to use various technologies including the university’s course management system, and to access specific examples and strategies. Additionally, one-on-one development with a mentor or colleague was considered the most effective learning mode, closely followed by one-on-one interactions with an instructional designer. Faculty considered the face-to-face learning mode the least effective professional development method. Instead, faculty preferred having access to online resources and references, and online self-paced modules. This result might point to the time constraints that faculty face in the higher education teaching environment, or to their changing attitudes about the utility and power of technology. Lastly, unlike in previous studies, faculty release time to develop or deliver courses was cited infrequently (4.4 percent). The faculty did not have a common incentive to teach online; they indicated recognition toward tenure and promotion as the most prominent incentive (23.5 percent). Findings from this study can help to guide the researcher on what components, including online course content and resources, are essential in preparing faculty to teach online.
Pagliari et al. (2009) investigated preparation and best practices among faculty teaching technology-oriented coursework in North Carolina Community Colleges. Similar to Roman et al. (2010), Pagliari et al. (2009) referred to Chickering and Ehrmann’s (1996) Seven Principles as their foundation for determining best practices in online teaching. While such lists of best practices are very popular, it is unclear whether they actually help to improve faculty online teaching and student learning. Pagliari et al. (2009) used a survey instrument pertaining to faculty’s experiences with training/preparation and actual practices in online courses within the North Carolina Community College System. Out of 60 potential e-mail respondents, only 22 individuals completed the online survey tool, representing a response rate of 34%.
Two of the research questions asked the participants whether they were trained in best practices and what best practices they utilized in their online course(s). The faculty responses identified three major areas of training in best practices: timely feedback, 48.1%; setting rules for a friendly online environment, 40.7%; and using online assessment tools such as quizzes, 37% (Pagliari et al., 2009). The second set of faculty responses identified three major areas of best practices used in the classroom: timely feedback, 74.1%; supporting students through online communication, 66.7%; and using discussion boards to facilitate interaction, 63% (Pagliari et al., 2009). The majority of the faculty also reported using online assessment tools, 55.6%; setting rules for a friendly online environment, 55.6%; providing introduction activities, 55.6%; and providing detailed syllabus information such as learning modules, 59.3% (Pagliari et al., 2009).
The findings of a study conducted by Pagliari et al. (2009) emphasized a need
for further research in the areas of faculty training for online courses. Providing
faculty with examples of best practices during the online training was viewed as one
of the outcomes of the training for teaching online courses. Pagliari et al. (2009)
noted that best practices seemed to be used by faculty members who were surveyed;
however, development and training with technologies that assist synchronous
discussion is needed to enhance the educational experience. Pagliari et al. (2009)
concluded that “there is a need to evaluate the latest technologies and develop web-
based training modules that train faculty members on best practices to teach online
courses.” This research study will respond to Pagliari et al.’s (2009) recommendations
for future research. Furthermore, the variables that were considered best practices in
training will be treated as possible elements that online faculty professional
development should include.
Orr, Williams and Pennington (2009) researched faculty perceptions of
“...effective processes, practices and infrastructure that are considered essential
components of successful online teaching and learning efforts” (p. 257). In a
qualitative study, the researchers interviewed a total of ten faculty members from
two regional comprehensive universities. They examined factors that might
influence online teaching success including faculty member perceptions, tenure,
compensation, time to develop an online course, and institutional support. Two
suggestions emerged from the research. First, instructors need to have an
understanding of how their efforts connect to department, college, and institutional
efforts. Second, in the early phases of teaching online, the development of necessary
technology skill sets is essential to success. However, as instructors mature in their
online efforts, the faculty development focus must shift from technology skills to
pedagogical improvements. Examples provided by Orr et al. (2009) included a
pedagogical focus such as creating more active learning activities in online courses
or increasing student engagement.
“Faculty members are looking for answers to questions concerning course
improvement, fostering interaction and engagement, and constructive feedback in an
online environment” (Orr, Williams, & Pennington, 2009, p. 266). Orr et al. (2009)
recommended that “institutions increase staffing in the area of pedagogical support
and form faculty development support teams by using these increased resources” (p.
266). Using this type of faculty development model would give faculty who are
veterans of online teaching expanded resources and additional support for
improving course quality and student learning. By including pedagogical resources
in development support teams, faculty members would have additional staffing to
share in the online course development workload. Interestingly, Brookfield (1986)
viewed adults teaching and learning in one another’s company as a means of
engaging adult learners in challenging, passionate and creative activity.
Finally, Keengwe, Kidd, and Kyei-Blankson (2008) reviewed 25 narratives
of respondents to learn about higher education faculty’s experiences with adopting
an information and communication technology (ICT) system and the challenges they
faced. Four major themes were analyzed, one of which was training and development. Faculty
responses indicated that they would be more likely to use new technology if they
were provided guided practice, examples, and remedial support in using the available
tools. In addition to instructional training, the study indicated that faculty need
training in technology in order to adopt the online system. The researchers also
recommended that training be relevant and current to the needs of the faculty,
encouraging a needs assessment prior to the start of the training. These findings
suggest professional development components and content that should be considered
when conducting professional development for online teaching. Furthermore, the
research addresses the adult learning principle of relevant training content by
recommending a needs assessment prior to delivering online training. Contrary to
research discussed earlier by Wolf (2006),
Keengwe et al. (2008) alluded to a combination of face-to-face interaction and online
training.
As researchers concluded in their studies, professional development should
not focus solely on technology but should also address pedagogy and content knowledge.
Incorporating pedagogical or andragogical principles prepares faculty to teach online
by providing them with techniques to increase interactivity in online classes, deliver
course content in an innovative way, and empower learners. The preferred method
of online professional development is to have faculty members experience online
pedagogy and technology as a student before teaching their first online course. More
recent literature recommends that an array of technological communication tools and
supports be made available to professional development participants to enhance the
online educational experience, with an increasing need to include synchronous, live
web conferencing tools to engage the online learner. Additionally, common
components of online professional development include learner needs assessments
and collaborative faculty support/design teams serving to model/demonstrate best
practices. A summary of best practices for online professional development is in the
table below.
Table 2
Key Components of Effective Online Instructor Professional Development
Barker (2003)
• Instructional design of online course
• Technology tools
• Faculty observations of another online course to learn how to manage discussion boards, build a learning community, and support networking and collaborative learning
• Pedagogy: trainers must incorporate sound educational theory and principles in online course development and delivery

*Koehler et al. (2004)
• Technology tools
• Pedagogy
• Content knowledge
• Design team approach with three stages: 1) roles/goals of participants and construct first draft of course web site; 2) role consolidation and issues of course content and pedagogical strategies; 3) integration or work on problems

Pankowski (2004)
• Facilitating online discussions before teaching online
• Faculty should first experience online education as a student before teaching online

*Koehler and Mishra; Mishra and Koehler (2005, 2006)
• An understanding of technology and pedagogy and how both interplay with course content
• Ability to learn and adapt to new technologies
• Design team approach

Wolf (2006)
• Formal online training for faculty who want to teach online
• Faculty assessed for technical skills before training and should have minimum computing skills (online applications); tutorials required if these skills are lacking; assessment of these skills needed before beginning professional development
• Faculty trained online using course delivery system; learn how to create assignments, manage online conferences, and provide student feedback
• Appropriate pedagogy for online teaching
• Ongoing support (mentoring, shadowing, continuing education workshops); survey of faculty to determine what type of support is needed
• Incentives that are meaningful to faculty if distance education program is new
• Instructional design: professional development requires team approach (instructional designers, production technicians, evaluation experts, and support service units)
• Institutional support (financial, human, infrastructure)

Taylor & McQuiggin (2008)
• Most helpful in preparing to teach online: work with an instructional designer; help from colleagues experienced in teaching online
• When designing and developing online courses, faculty need help with: choosing appropriate technologies to enhance their online course; converting course materials for online use; creating effective online assessment instruments; creating video clips; determining ways to assess student progress in an online course
• When delivering an online course, faculty need help with: facilitating online discussion; building and enhancing professorship relationships in the online classroom; facilitating web conferencing sessions; increasing interactions in an online course; providing meaningful feedback on assignments
• Recommendations from faculty to new online instructors: observe an online course; be an online student; work with an instructional designer; talk to colleagues experienced in online teaching; learn the university’s course management system; locate technical assistance
• Other pedagogical suggestions that faculty should learn/use: establish online presence; give prompt and effective feedback; provide appropriate details and clarity; set student expectations; support interaction; play a facilitative role; be flexible

Keengwe et al. (2008)
• Instructional training: guided practice, examples, remedial support in using the available tools
• Technology
• Needs assessment to make training relevant
• Face-to-face component

Archambault and Crippen (2009); Archambault and Barnett (2010)
• Technology and pedagogy and how both interplay with course content

Orr et al. (2009)
• Technology skills
• Pedagogical skills
• Suggest faculty support teams to ease workload in creating online course

Pagliari et al. (2009)
• Best practices in online training for the following: timely feedback; setting rules for friendly online participation; using online assessment tools such as quizzes; online communication tools; introduction activities; detailed syllabus
• Professional development using technologies that assist synchronous discussion needed to enhance the educational experience; faculty need to evaluate latest technologies and develop web-based modules

Roman et al. (2010)
• Technological skills
• Pedagogical skills: discussion board postings and reflection assignments, instructor-provided models/examples of best practices, opportunity to learn online about online teaching, prompt feedback from the trainer
• Needs assessment of faculty before professional development begins
• Formative evaluations to respond to progress of online faculty
In addition to empirical studies alluding to components of effective online
instructor professional development, national educational organizations and
accrediting bodies have provided recommendations in the form of standards,
guidelines, and benchmarks.
National Standards, Guidelines, and Benchmarks
This section highlights accreditation standards, guidelines, and benchmarks
related to online course offerings in higher education providing insight into the
current climate of accountability in online learning and teaching. The extraordinary
growth of distance education in higher education has prompted national professional
organizations and accrediting bodies to develop principles, guidelines and
benchmarks to ensure quality distance education. Virtually all strategies included
the topic of faculty training (Institute of Higher Education Policy [IHEP], 2000). In
June 1998, The American Association of University Professors (AAUP) organized a
Special Committee on Distance Education and Intellectual Property Issues to address
faculty concerns about distance education (Council for Higher Education
Accreditation, 2000). A year later, AAUP policies addressing distance education
emphasized that faculty retain their normal responsibilities for selecting and
presenting courses; however, the association recommended that faculty be given
enough time to become familiar with the online teaching tools.
In May 2000, the Higher Education Program and Policy Council of the
American Federation of Teachers (AFT) prepared a report to respond to concerns
such as whether distance education faculty were receiving needed training and
technical support to provide successful educational experiences (AFT, 2000). The
AFT surveyed 200 members of its higher education locals who were practitioners of
distance education. These practitioners also taught distance education courses in
every major academic area. Drawing from the survey responses as well as
scholarship on distance education programs and the advice of AFT’s Higher
Education Program and Policy Council in the 1999-2000 academic year, AFT
developed a set of guidelines for good practices in distance education. One of the
guidelines emphasized that faculty be prepared to meet the special requirements of
teaching at a distance. The AFT Report emphasized that faculty teaching distance
education courses must become proficient in the communications technology
employed in their courses, and it encouraged faculty to work alone or in
teams to take full advantage of distance education technology. Survey respondents
almost uniformly commented that the preparation time for
distance learning courses was much greater than for a classroom-based course,
especially the first time a course is offered by a faculty member, with some
estimates ranging from 66% to 500% longer. To prepare faculty, the AFT report
suggested that “faculty must be provided adequate training and technical support—in
terms of hardware, software, and troubleshooting” (AFT, 2000, p. 8). The AFT
Report documented that support should also include special assistance in
instructional design.
To ensure effective online instruction, The Institute of Higher Education
Policy (IHEP, 2000), a nonprofit organization whose mission is to foster access and
quality in postsecondary education, was asked by the National Education Association
(NEA), the nation’s largest professional organization of higher education faculty,
and Blackboard, Inc., a leading Internet education company, to analyze issues related
to quality in distance education. The Institute of Higher Education Policy released
the report, Quality on the Line: Benchmarks for Success in Internet-Based Distance
Education that included 24 benchmarks for effective online education. According to
the report, “these benchmarks distilled the best strategies used by colleges and
universities that are engaged in online learning, ensuring quality for the students and
faculty who use it” (IHEP, 2000, p. vii). To obtain these benchmarks, a case study
approach was used consisting of three sequential phases. First, a comprehensive
literature search was conducted to compile benchmarks that were recommended by
other organizations and groups, including those suggested in various articles and
publications. This initial search resulted in a total of 45 benchmarks. Second,
institutions that have substantial experience in distance education and considered
leaders in online education were identified. Third, these institutions were visited by
Institute personnel to assess the degree to which the campuses incorporated the
benchmarks in their online courses and programs. Each of the six site visits included
in-depth interviews with faculty, administrators, and students. In addition to the
in-depth interviews, participants were given a survey using a Likert scale. Of the 147
respondents who were interviewed, 27 were faculty and 16 were individuals who
were both a faculty member and an administrator. Based on the survey results, the
following four faculty support benchmarks focusing specifically on essential
elements in preparing faculty to teach online were validated:
• Technical assistance in course development is available to faculty, who
are encouraged to use it.
• Faculty members are assisted in the transition from traditional teaching to
online instruction and assessed during the process.
• Instructor training and assistance, including peer mentoring, continues
through the progression of the online course.
• Faculty members are provided with written resources to deal with issues
arising from student use of electronically accessed data (IHEP, 2000).
Lastly, in 2001, another organization, the Western Association of Schools
and Colleges Accrediting Commission for Senior Colleges and Universities, created
Good Practices for Electronically Offered Degrees and Certificate Programs in
response to the rise of online education and the need to recognize it as an important
component of higher education. The Good Practices for Electronically Offered
Degrees and Certificate Programs are divided into five components, each of which
is meant to address a particular area of an educational institution’s activity relevant
to distance education. They are institutional context and commitment, curriculum
and instruction, faculty support, student support, and evaluation and assessment.
This study is most concerned with the faculty support component. With faculty
roles becoming increasingly diverse and reorganized, the Commission asserts that
good practices should address the following faculty support:
3a. In the development of an electronically offered program, the institution,
and its participating faculty have considered issues of workload,
compensation, ownership of intellectual property resulting from the program,
and the implications of program participation for the faculty member’s
professional evaluation processes. This mutual understanding is based on
policies and agreements adopted by the parties.
3b. The institution provides an ongoing program of appropriate technical,
design, and production support for participating faculty members.
3c. The institution provides those responsible for program development the
orientation and training to help them become proficient in the uses of the
program’s technologies, including potential changes in course design and
management.
3d. The institution provides to those responsible for working directly with
students the orientation and training to help them become proficient in the
uses of the technologies for these purposes, including strategies for effective
interaction. (WASC, 2006)
A review of the report found no references to empirical data to support
the Commission’s claims of best practices. Despite the lack of evidence, the report
documented that its contents constituted “a common understanding of those elements
which reflect quality distance education programming. As such, they are intended to
inform and facilitate the evaluation policies and processes of each region” (WASC,
2006, p. 1).
Most recently, in October 2010, the ACCJC-WASC developed an updated
Guide to Evaluating Distance Education and Correspondence Education. The
Guide is a tool to be used by institutions preparing their Self Evaluation Report of
Educational Quality and Institutional Effectiveness and other reports.
Table 3 highlights two of the faculty surveys addressing professional
development components for online teaching.
Table 3
Key Components of Online Professional Development by National Governing
Agencies
Higher Education Program and Policy Council of the American Federation of Teachers (2000)
• Instructional design of course
• Technical support (hardware, software, and troubleshooting)
• Communication technology employed in distance education courses
• More course preparation time for first-time online faculty

The Institute of Higher Education Policy (NEA) (2000)
• Technical assistance in course development
• Assistance in the transition from traditional teaching to online instruction, assessed during the process
• Instructor training and assistance, including peer mentoring, continues through the progression of the online course
• Written resources to deal with issues arising from student use of electronically accessed data
Professional Development Evaluation
Accrediting agencies such as the ACCJC are obliged by their purpose — to
assure education quality and effectiveness — and by federal regulations to review
the quality of distance education during its comprehensive accreditation reviews
(ACCJC/WASC, 2010). In 2010, the ACCJC/WASC published a Guide to
Evaluating Distance Education and Correspondence Education (ACCJC/WASC,
2010) serving as the primary resource to be used as institutions prepare self-
evaluation reports on the focused area of distance education. The main body of the
publication presents guiding questions that can be used in evaluating distance
education. One of the standards focuses on professional development of faculty
presenting multiple questions: 1) What professional development programs relevant
to distance education personnel does the institution support? 2) How does the
institution determine the professional development needs of its personnel involved in
distance education? 3) What development programs on teaching methodologies in
distance education does the institution provide? 4) What impact do professional
development activities related to distance education have on improvement of
teaching and learning, and how does the institution evaluate that improvement?
(ACCJC/WASC, 2010).
Based on the demands put upon Online Community College to document that
professional development programs for teaching online are meeting their goals, the
development of a comprehensive evaluation process is needed. Online Community
College has a professional development program in place for online faculty; however,
there are no methods of evaluating the effectiveness of the online professional
development in preparing faculty to teach online.
In the context of professional development or training, “evaluation is a
systematic investigation of merit or worth” (Guskey, 2000, p. 41). Patton (2002)
also describes other purposes for program evaluation including facilitating
improvements, generating knowledge, and providing accountability. Kirkpatrick’s
Model of Evaluation (Kirkpatrick & Kirkpatrick, 2006, 2007), although developed
out of research conducted with business and industry, provides an evaluation
framework relevant for professional development in the educational context (Guskey,
2000). Originally devised to judge the quality, effectiveness, and efficiency of
supervisory training programs, Kirkpatrick’s model continues to be used worldwide
(Guskey, 2000).
Evaluation Models
According to Guskey (2000), professional development evaluation can be a
very complex task requiring the use of theoretical and conceptual frameworks.
There are many professional development evaluation tools available that have
advantages and limitations. Guskey presents evaluation models that are most
applicable in evaluating professional development including Tyler’s Evaluation
Model, Metfessel and Michael’s Evaluation Model, Hammond’s Evaluation Model,
Scriven’s Goal-Free Evaluation Model, Stufflebeam’s CIPP Evaluation Model, and
Kirkpatrick’s Evaluation Model. A review of each model will help in determining
the most effective evaluative model to implement when evaluating the online
professional development offered by Online Community College. This section will
discuss professional development evaluation models and will conclude with an
evaluation framework that will be used for this research study.
Tyler’s Evaluation Model. One of the earliest and most influential models of
evaluation was developed by Ralph W. Tyler (Guskey, 2000). Tyler believed that
the essential first step in any evaluation is that the program or activity’s goals are
clear so that the evaluation can focus on the extent to which the program or activity’s
goals were achieved (Guskey, 2000). Tyler’s model included a series of steps that he
believed should be followed in a systematic evaluation. The seven steps included 1)
establish broad goals or objectives, 2) classify or order the goals or objectives, 3)
define the goals or objectives in observable terms, 4) find situations in which
achievement of the objectives is demonstrated, 5) develop or select measurement
techniques, 6) collect performance data, 7) compare the performance data with the
stated objectives (Guskey, 2000). According to Guskey (2000), “Tyler’s evaluation
model was relatively simple, easy to follow, easily understood, and produced
information that was directly relevant to educators” (p. 49).
Metfessel and Michael’s Evaluation Model. In 1967, Metfessel and
Michael added to Tyler’s evaluation model by greatly expanding the methods of data
collection to include multiple constituencies throughout the evaluation process
(Guskey, 2000). The eight steps outlined in Metfessel and Michael’s evaluation
model are listed below:
1. Involve the total school community as facilitators in the evaluation
process.
2. Formulate a cohesive model of goals and specific objectives.
3. Translate objectives into a communicable form applicable to facilitating
learning in the school environment.
4. Select or construct instruments to furnish measures allowing inferences
about program effectiveness.
5. Carry out periodic observations using content-valid tests, scales, and
other behavior measures.
6. Analyze data using appropriate statistical methods.
7. Interpret the data using standards of desired levels of performance over
all measures.
8. Develop recommendations for the further implementation, modification,
and revision of broad goals and specific objectives.
Metfessel and Michael’s primary contribution to the evaluation process was the
development of recommendations for further implementation expanding the
usefulness of evaluation results (Guskey, 2000).
Hammond’s Evaluation Model. Tyler’s evaluation model was further
extended 20 years later by Hammond, who presented a more detailed structure for
evaluation (Guskey, 2000). In addition to determining whether or not a program’s
goals were attained, Hammond proposed that evaluators place equal importance on
why those goals were attained or why they were not. Hammond created a three-
dimensional model to organize various factors which influence the attainment of the
program’s desired goals and objectives (Guskey, 2000). Compared to Tyler and
Metfessel and Michael’s evaluation models, Hammond’s model is more complex.
Scriven’s Goal Free Evaluation Model. Like Tyler’s and Metfessel and
Michael’s models, Scriven’s approach seeks to clarify evaluation processes and yield
relevant information for educators (Guskey, 2000). However, Scriven believed that
evaluators should not be guided solely by a program’s stated goals, in order to avoid
potential bias and improve objectivity. Therefore, Scriven’s goal-free evaluation
model focuses on the actual outcomes of a program, rather than solely on those that
are intended. Guskey (2000) noted that Scriven’s model supplements and complements
Tyler’s goal-oriented evaluation model, and the two are not mutually exclusive.
Stufflebeam’s CIPP Evaluation Model. Departing from Tyler’s goal-
oriented approach, Stufflebeam’s CIPP evaluation model is centered on decision
making processes. According to Guskey (2000), Stufflebeam’s model emphasizes
the decisions that policymakers and administrators must make and the information
they need to make those decisions. Stufflebeam’s evaluation model is based on four
different kinds of evaluation information: context, input, process, and product
information, creating the acronym CIPP (Guskey, 2000). First, context evaluation is
used in the planning stage to focus on identifying problems, needs, and opportunities
that exist in a specific educational setting. Second, input evaluation focuses on
structuring decisions to provide information about how best to allocate resources to
achieve program goals and objectives. Third, process evaluation provides
information for implementing decisions. The primary goal is to identify any defects
in the program design and to make suggestions on how those might be fixed. Lastly,
product evaluation focuses on determining and interpreting program outcomes. The
final evaluation compares program expectations and results enabling program
administrators to decide whether to continue, terminate, or modify the program
(Guskey, 2000).
Kirkpatrick’s Evaluation Model. “Kirkpatrick’s evaluation model’s
simplicity and practicality have made it the foundation of training program
evaluations in businesses around the world,” stated Guskey (2000). It aims to
measure quality, efficiency, and effectiveness of training through the following four
levels: 1) reaction, 2) learning, 3) behavior, and 4) results (Guskey, 2000; Kirkpatrick &
Kirkpatrick, 2006).
The first level of Kirkpatrick’s model focuses on evaluating participants’
reactions or how they felt about the program. Since a basic principle of adult
learning is that adults tend to be motivated to learn when they believe their needs are
being met, subjective forms of program evaluation at level one can assist training
professionals in determining whether or not current professional development is
meeting their needs.
The second level measures the participants’ learning, or the knowledge, skills,
and attitudes that the participants acquire as a result of the training. Trainers often
implement a paper-and-pencil instrument, such as a pre/post test, to measure changes at the
second level. The third level considers the extent to which the on-the-job behavior
of participants was impacted or changed because of the training. The primary focus
is on how much and what type of change actually occurred in the participants’ job
performance. The last level or results evaluation is designed to assess the bottom
line of a business or organization. According to Guskey (2000), evaluation at this
level assesses such things as improved productivity or better quality.
A number of researchers, such as Clark and Estes (2008) and Guskey (Killion,
2008), have made modifications to Kirkpatrick’s model. Nonetheless,
Kirkpatrick’s model has the benefit of nearly a half century of use in many different
organizational settings in various countries and cultures (Clark & Estes, 2008).
Clark and Estes (2008) stated that “while there have been many improvements
suggested about how to evaluate each level, the Kirkpatrick basic four-level model is
still the best and almost universally used system” (p. 128).
Online Professional Development Evaluation
Few studies have been found that examine online professional development
for faculty relative to levels two, three, and four of Kirkpatrick’s model: learning,
behavior, and results. A majority of the studies reviewed used level one evaluation,
such as self-report measures of faculty from interviews, surveys, and questionnaires.
According to Desimone (2009), self-report measures can provide valuable insight
into educators’ perspectives on their own teaching; however, Desimone also
argued that too much of the research examining the effects of professional
development on classroom behavior and student learning is based on self-report
measures, and this overemphasis needs to be balanced with studies using measures of
change that are more objective. Hence, more research is needed with regard to
Kirkpatrick’s evaluation levels two, three, and four to answer essential questions
such as 1) How are changes in knowledge, skills, and attitudes measured upon
completion of the professional development (learning)? 2) How does the actual
online professional development compare with regards to faculty implementation of
instructional practice in the online classroom (behavior)? and 3) How does the actual
professional development impact student learning in the online classroom (results)?
Summary
There are many issues and characteristics that ought to be considered when
evaluating a professional development program preparing faculty to teach online.
The literature presents multiple constructs that may have an impact on preparing
faculty to teach online. The constructs can be grouped into three categories:
trainers, professional development, and participants.
First, the nature of the online adult learner suggests that online professional
development be based on andragogical principles anchored in Knowles’ assumptive
differences between pedagogy and andragogy (Gibson & Wentworth, 2001). Online
education is widely accepted as student-centered education compared to the
professor-centered traditional education. The shift to online education has altered
the instructor’s role from a traditional lecturer to more of a facilitator of the self-
directed learner. Research also suggests that facilitators address the needs of the
adult educator by including professional development components such as learner
objectives and contracts, needs assessments, learner resources, and modeling of best
practices throughout the learning experience.
Second, according to the TPACK framework, the professional development
for online teaching should address not only technology, but also integrate the content
and pedagogy necessary for teaching online (Archambault & Barnett, 2010;
Archambault & Crippen, 2009; Mishra & Koehler, 2006). According to the
literature, most training for online learning focuses primarily on technology and fails to
integrate the content and pedagogy for teaching online (Archambault & Barnett,
2010; Archambault & Crippen, 2009; Mishra & Koehler, 2006). Furthermore, the
preferred method of online professional development is to have faculty members
experience online pedagogy and technology as a student before teaching their first
online course. More recent literature recommends that an array of technological
communication tools and supports be made available to professional development
participants to enhance the online educational experience, with an increasing need to
have synchronous, live web conferencing tools to engage the online learner.
Third, the format and duration of professional development may also impact
the outcomes of professional development. For example, collaborative faculty
support and design teams have been shown to have positive effects in preparing
faculty to teach online. Research also suggests that professional development
trainers provide online learners with a program or course orientation before getting
started.
Other variables of consideration related to the online learner include age,
gender, teaching experience, and online teaching experience. Prior teaching and
technology experience have been identified as factors that impact a faculty
member’s ability to teach online successfully.
Despite the evidence that confirms the components necessary for online
teaching preparation, there is minimal empirical research that specifically examines
the usefulness of online professional development in preparing faculty to teach
online. Professional development for learning to teach online should be evaluated
for its effectiveness (Guskey, 2000). Guskey (2000) recommends Kirkpatrick’s four
level model because of its practicality and simplicity. This study will examine the
degree to which the aforementioned variables contribute to effective professional
development to prepare faculty to teach online at Online Community College, and
the relationship of the professional development to changes in participants’ self-
assessed knowledge about tasks associated with effective online teaching.
CHAPTER THREE
METHODOLOGY
Allen and Seaman (2005, 2007, 2010) have documented the growth and
demand for online higher education course offerings. Current research emphasizes
the need for faculty training to ensure that faculty members who teach online are prepared
(Pagliari et al., 2009). It was found that online training programs “should emphasize
both technical and pedagogical skill development, evaluate training needs prior to
the training, and provide ongoing resources and support mechanisms after the
training” (Roman et al., 2010). Other researchers proposed a contemporary
framework for professional development of online teaching including pedagogy,
technology, and content knowledge (Koehler, Mishra & Yahya, 2007).
Dede, Ketelhut, Whitehouse, Breit, and McCloskey (2009) noted that
although educational institutions need to build instructor capacity for improvement
in online teaching, they also need to ensure that time, effort, and scarce resources are
expended only on quality programs that teach with and about best practices. The
literature included studies of faculty perceptions on training programs for teaching
online; however, there still is a need for more research to be conducted in the area of
best practices in training faculty to be successful online instructors (Pagliari et al.,
2009). The purpose of this study was to research the effectiveness of a college’s
professional development program, specifically focusing on how the training
prepared faculty to teach online. The outcome of this study provided knowledge on
preferred practices for effectively training faculty to teach online.
The study addressed the following research questions:
1. What is the relationship between participating in the professional
development and changes in participants’ self-assessed knowledge about
tasks associated with effective online teaching?
2. What professional development activities did the participants find to be
most useful?
3. In what professional development activities did the participants have the
greatest knowledge gains?
Research Method and Design
Research is conducted to discover new information and make a contribution
to a field of knowledge, or “illuminate a societal concern” (Patton, 2002, p. 213) and
then test a method, program, or policy for possible recommendation to practitioners
as a useful practice to solve a particular problem (McEwan & McEwan, 2003).
According to McEwan and McEwan (2003), many educational research studies
evaluate whether an intervention causes a particular change or outcome. To aid
educational researchers, McEwan and McEwan (2003, p. 4) present five questions
that should be answered when evaluating an intervention:
1. Does it work (causal)?
2. How does it work (process)?
3. Is it worthwhile (cost)?
4. Will it work for me (usability)?
5. Is it working for me (evaluation)?
Each of the five questions provides essential information pertaining to the
effectiveness of professional development. The causal question assists the college in
determining if there is a relationship between the professional development and
faculty obtaining new knowledge and skills to prepare them to teach online.
Secondly, the process question provides information about how the professional
development works. Thirdly, the cost question addresses whether or not the
professional development is worthwhile and cost effective for the college. The
fourth question on usability reveals whether the strategies used in this
professional development will work in future professional development to train
faculty to teach online. Additionally, the usability question provides insight on
which professional development components contributed to making the most impact
in preparing faculty to teach courses online. Finally, the evaluation question
indicates whether the online professional development was effective in preparing faculty to
teach online. Combined, the five questions assist the researcher in gathering
significant insight that can be used in future professional development to prepare
educators to teach online.
This study referred to Kirkpatrick’s four level model to evaluate how
effective the professional development program was in preparing the faculty to teach
online. A pre- and post-survey measured the first two levels of Kirkpatrick’s model:
1) the faculty’s reaction, or how they felt about the training; and 2) the faculty’s learning, or
what they learned (Kirkpatrick & Kirkpatrick, 2006). The third and fourth levels of
Kirkpatrick’s model evaluate faculty behavior beyond the training and the results that
occur because the faculty attended the professional development (Kirkpatrick &
Kirkpatrick, 2006). Kirkpatrick’s levels three and four are beyond the scope of this
research study. According to Kirkpatrick and Kirkpatrick (2007), it can take time for
the learner to create new on-the-job habits, and it takes a while longer for outcomes
to show up in full force — “sometimes as long as a year” (p. 111). A trainer must
consider all of the factors involved when allowing for this time lapse. However, in
mission-critical training programs such as the online professional development, it is
critical to conduct level two evaluations, as this will provide leading indicators that
will indicate whether or not the expected results are forthcoming (Kirkpatrick &
Kirkpatrick, 2007).
Evaluations of professional development programs can be conducted at
various points in the training program. Evaluations conducted at the end are
considered summative to determine the overall effectiveness of a program, method,
or policy, and whether or not the program being evaluated has the potential of being
generalized to other situations (Patton, 2002). On the contrary, formative
evaluations serve the purpose of improving a specific program, policy, or product.
No attempt is made in formative evaluations to generalize findings beyond the
setting in which the evaluation takes place. Formative evaluations rely primarily on
qualitative studies, while summative evaluations rely mainly on quantitative research
with statistical pre- and post-surveys. Qualitative data in summative evaluations can
be used to add depth and detail to quantitative findings (Patton, 2002).
This study was summative using primarily quantitative research to evaluate
the professional development at Kirkpatrick’s levels one and two. Quantitative
research methods provide a high level of measurement and statistical power
(Cresswell, 2003). Quantitative research provides a numeric description of trends,
attitudes, and opinions of a population, and serves as a means to examine the
relationship among variables (Creswell, 2003). Creswell noted that a quantitative
approach is one in which the researcher primarily uses “postpositivist claims for
developing new knowledge, employs strategies of inquiry such as experiments, and
surveys, and collects data on predetermined instruments that yield statistical data” (p.
18). Traditionally, postpositivist assumptions have governed claims about what
warrants knowledge. According to Creswell (2003), the postpositivist viewpoint is
sometimes called the “scientific method” or doing “science” research (p. 6).
Alternately, a qualitative approach to research is one in which the researcher
“often makes knowledge claims based primarily on constructivist perspectives or
advocacy/participatory perspectives” (Creswell, 2003, p. 18). The primary purpose
of qualitative research is to generate new knowledge through theory generation;
hence, the researcher collects open-ended, emerging data with the primary intent of
developing themes from the data. According to Patton (2002), qualitative data come
from three kinds of data collection: 1) in-depth open-ended interviews; 2) direct
observations; and 3) written documents. Qualitative findings may be presented alone
or in combination with quantitative data. Patton (2002) noted that research studies
employing multiple methods, including combinations of qualitative and quantitative
data, are common. For example, a survey that includes both fixed-choice (closed)
questions and open-ended questions is an example of how quantitative and
qualitative inquiry are combined.
The approach for this research study was mainly a quantitative design. It
employed strategies of inquiry that involved collecting quantitative data from the
faculty who participated in an online professional development program at a college.
The faculty participants who attended the professional development program
completed a survey prior to participating in the professional development. The
survey revealed faculty perceptions regarding their knowledge about tasks associated
with effective online teaching prior to the treatment. At the conclusion of the
professional development, faculty participants completed a post-survey to determine
the relationship between the professional development and changes in faculty
participants’ self-assessed knowledge about tasks associated with effective online
teaching. The post-survey included open-ended survey questions to assess faculty
participants’ perceptions on how the professional development provided by the
college prepared them for online teaching. The questions sought to identify overall
program effectiveness, and what was most useful in preparing them to teach online.
Due to the small sample size, disaggregation of data was not the main focus of this
investigation. However, the researcher explored trends relating to variables such as
age, gender, and prior experience with teaching and teaching online, and how they
may have influenced participants’ learning and online teaching preparation.
Population and Sample
This study was conducted at a college located in Honolulu, Hawaii. The
college will remain anonymous and referred to as “Online Community College.”
Online Community College is one of seven community colleges within a state
system. Online Community College was selected for this study because it has been
offering professional development programs for online teaching since 2008. The
online professional development program has been providing faculty with the
flexibility to interact with their colleagues and trainers to acquire knowledge and
skills to teach college level online courses.
The sample consisted of all faculty members who participated in the
professional development program. Regular faculty and lecturers who may or may
not have prior online teaching experience were invited to participate in the
professional development program. The college has over 250 regular faculty
members and over 150 lecturers. Faculty who participated in the professional
development program were asked to complete a pre- and post-survey to disclose
their knowledge and attitudes toward transitioning to the online teaching
environment. The college originally planned to have 30 faculty participants in the
online professional development program; however, the college accepted a total of 39
participants. All of the 39 participants who were accepted into the online
professional development program completed the pre-survey and 25 (64%)
participants completed the post-survey. Table 4 and Table 5 summarize the gender
of the participants.
Table 4
Pre-Survey Respondents by Gender
Gender Frequency Percentage
Female 24 61.54
Male 15 38.46
Total 39 100
Table 5
Post-Survey Respondents by Gender
Gender Frequency Percentage
Female 18 72.0
Male 7 28.0
Total 25 100
Table 6 and Table 7 describe the age of the participants. Forty-five was the
mean age for the pre-survey participants and 44 was the mean age for the post-
survey participants.
Table 6
Pre-Survey Respondents by Age
Age Frequency Percentage
20 to 29 2 5.13
30 to 39 9 23.08
40 to 49 16 41.03
50 to 59 9 23.08
60 to 69 3 7.69
Total 39 100
Table 7
Post-Survey Respondents by Age
Age Frequency Percentage
20 to 29 1 4.00
30 to 39 8 32.00
40 to 49 10 40.00
50 to 59 5 20.00
60 to 69 1 4.00
Total 25 100
Table 8 and Table 9 provide a breakdown of the faculty participants’ years of
teaching in higher education. Higher education teaching experience for the pre-
survey and post-survey participants ranged from less than a year to 30 years.
Table 8
Pre-Survey Respondents by Years of Teaching
Years of Teaching Frequency Percentage
0-5 11 28.21
6-10 10 25.64
10-15 9 23.08
16-20 2 5.13
21-25 5 12.82
26-30 2 5.13
Total 39 100
Table 9
Post-Survey Respondents by Years of Teaching
Years of Teaching Frequency Percentage
0 to 5 6 24.0
6 to 10 8 32.0
10 to 15 6 24.0
16 to 20 1 4.0
21 to 25 3 12.0
26 to 30 1 4.0
Total 25 100
Table 10 and Table 11 summarize the participants’ online teaching
experience. Roughly a third of the pre-survey (38.46%) and post-survey (32.00%)
participants had no online teaching experience.
Table 10
Pre-Survey Respondent by Years of Teaching Online
Years of Teaching Online Frequency Percentage
0 15 38.46
1-2 10 25.64
3-4 5 12.82
5 or more 9 23.08
Total 39 100
Table 11
Post-Survey Respondent by Years of Teaching Online
Years of Teaching Online Frequency Percentage
0 8 32.00
1-2 6 24.00
3-4 3 12.00
5 or more 8 32.00
Total 25 100
The online teaching environment included a course management system
named Laulima featuring synchronous and asynchronous communication tools. The
next section explains the online professional development program.
Intervention
The college provided faculty an opportunity to engage in an online
professional development to support the creation and delivery of an online or hybrid
course based on practices cited in the literature and accreditation guidelines for
distance learning (Appendix A). Most of the professional development was
delivered online so that faculty could complete training modules at their convenience
on the Laulima course management system that they would eventually use when they
instruct their online course at Online Community College. The major topics
included course management, content management, communication and
collaboration, and assessment. To ensure fidelity, faculty participants completed
assigned training modules and related assignments. Faculty participants read and
reviewed training modules, and practiced the skills, strategies, and tools by
completing assignments on the Laulima course management site.
Major themes that the professional development trainer intended to infuse
throughout the training modules included ACCJC guidelines, best practices in higher
education, creating a community of learners, establishing and maintaining a presence
in an online environment, time and stress management, and universal design for
learning. The professional development trainers were a team comprised of faculty
and staff with expertise in distance learning, instructional design, professional
development, multimedia, Laulima course management, and other learning
technologies.
Each faculty participant was required to build a topical section for a hybrid or
online course by the end of the training period. The tangible outcomes that all
faculty participants needed to meet by the end of the training period are listed below:
• A syllabus, including the following sections:
o Course Information;
o Academic and Course Calendars;
o Policies;
o Technical Requirements;
o Student Services.
• Demonstrate understanding of applicable ACCJC guidelines by
implementing them in a course through content, delivery, activities,
assessment, etc.
• A course plan listing topics and activities that contain references to the
specific student learning outcomes addressed by the activities.
• Use of the tools related to Laulima course administration, organization,
and maintenance.
• A course web site that includes tools in these categories:
o Content delivery (at least 1 tool);
o Communication (at least 1 tool);
o Assessment (at least 1 tool).
• Include tools and information that address student readiness for online
learning.
• Include at least one individual learning activity embedded into one
Laulima tool.
• Apply at least two universal design for learning principles in course
content or assessment activities.
• Integrate at least one activity that is interactive, collaborative, or fosters
connections among students (community building).
• Links to student engagement and success resources both on and off
campus.
• Identify methods to be used to assess the course and how the results will be used
to improve the course.
At the completion of the professional development, faculty participants
received a Distance Learning and Laulima Certification from Online Community
College and a Netbook computer to support their online teaching in subsequent
semesters.
Figure 3. Online Community College Distance Education Certification Program
(2011)
Instrumentation
Survey Method
According to Killion (2008), surveys are measures of personal opinion,
beliefs, knowledge, behaviors, or values and can be completed orally or in writing.
Oral surveys are often referred to as interviews. The difference between a survey
and an interview is that a survey tends to use multiple-choice questions with
predetermined responses, although it does not always. Online surveys are
increasingly being used because they do not require a budget for postage, paper, and
personnel, and they make tabulation of responses easier.
When choosing a data-collection method that involves a survey, the evaluator
decides whether to use an existing survey or construct a new one particular to the
professional development being evaluated (Killion, 2008). Existing instruments are
usually field- tested to be valid and reliable, however, they may not align with the
constructs of the evaluation at hand or appropriate for the context or population who
will be completing them.
The main instrument of data to inform the research questions was a survey
adapted from a research study completed by Dr. Leanna Archambault and Dr. Ken
Crippen (2009). The researcher was granted full permission to use the survey to
measure the perceived knowledge levels of those who teach in an online
environment specific to technical expertise, online pedagogy, and content area. The
participants responded to the following question, “How would you rate your own
knowledge in doing the following tasks associated with teaching in distance
education setting?” (Archambault & Crippen, 2009, p. 78). Twenty-four items along
the areas of technology, pedagogy, content knowledge, and the combination of these
areas were included, and the Likert scale for answering consisted of 1 (Poor), 2
(Fair), 3 (Good), 4 (Very good), and 5 (Excellent).
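For illustration only, the brief sketch below shows one way responses to such Likert items could be grouped and averaged into domain subscale scores. It is written in Python, the item codes and the item-to-domain mapping are hypothetical, and it does not reproduce the actual Archambault and Crippen (2009) instrument or the scoring procedure used in this study.

```python
# Illustrative sketch only: a hypothetical item-to-domain mapping for a
# 5-point Likert survey. The item codes and groupings are invented for
# illustration and are not the study's actual instrument.
from statistics import mean

# Hypothetical mapping of survey item codes to TPACK-style domains.
DOMAIN_ITEMS = {
    "Pedagogy": ["B1a", "B1b", "B1c"],
    "Technology": ["B1d", "B1e", "B1f"],
    "Content": ["B1g", "B1h", "B1i"],
}

def subscale_scores(responses):
    """Average a respondent's 1-5 Likert ratings within each domain."""
    return {
        domain: mean(responses[item] for item in items)
        for domain, items in DOMAIN_ITEMS.items()
    }

# Example respondent: item code -> rating (1 = Poor ... 5 = Excellent).
respondent = {"B1a": 3, "B1b": 4, "B1c": 4, "B1d": 2, "B1e": 3,
              "B1f": 3, "B1g": 5, "B1h": 4, "B1i": 4}
print(subscale_scores(respondent))
```

Averaging items within a domain in this way is simply one common convention for forming subscale scores; the study's own scoring may differ.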
The researcher modified the survey minimally to adapt to the higher
education environment since Archambault and Crippen (2009) originally
intended the survey to be used when studying K-12 educators (Appendix B;
Appendix C). The survey contained two main sections. The first section consisted
of survey items to measure the constructs related to teaching online knowledge and
the second section focused on teaching online preparedness, including open-ended
questions to inform one of the research questions. An open-ended question provided
data to answer the research question about what tasks associated with effective
online teaching the professional development participants found to be most useful.
Participants also responded to an open-ended question relating to the distance
education program’s effectiveness in preparing them to teach online. The survey
questions from the second section mirrored the professional development learning
outcomes established by the trainer. The survey also included demographic
information including age, gender, teaching experience, and online teaching
experience. The survey was field tested for construct validity.
Data Collection
The researcher used a pre- and post-survey to collect data for this study. Data
collected were used to evaluate the professional development intervention’s impact
on faculty online teaching preparation. The surveys were distributed to the
designated college in October 2011 and January 2012. In order to obtain approval to
field the survey, the researcher received IRB approval from the University of
Southern California (USC). Once the IRB had been finalized, the researcher
contacted the designer of the professional development program who manages the
college’s Center for Teaching and Learning Technology (CELTT) to embark on the
study.
The professional development was primarily online; however, faculty
participants were required to participate in two face-to-face sessions. The faculty
were asked to complete the pre-survey and post-survey during the face-to-face
orientation session and the face-to-face concluding session. The orientation session
provided faculty participants with information they needed to begin the online
professional development. The concluding session provided faculty participants an
opportunity to showcase how they applied the professional development knowledge
to their newly developed course site. The researcher designed the survey so that it
took approximately 10 to 15 minutes to complete. The survey was e-mailed to
faculty participants who were unable to attend the face-to-face orientation or final
session.
Data Analysis
For the data analysis phase of the research project, the researcher used SPSS
v19.0 software to run statistical analyses to answer the research questions.
Descriptive statistics were computed to organize and describe the data characteristics
(Kurpius & Stafford, 2006). Percentages and frequency distributions were calculated.
Variables such as gender, age, teaching and online teaching experience were
analyzed using descriptive statistics. Table 12 presents an overview of how each
research question is connected to the survey items, statistical analyses, and the
Kirkpatrick framework levels measured (Kirkpatrick & Kirkpatrick, 2006).
Table 12
Research Questions, Survey Items, and Statistical Analyses
Q1: What is the relationship between participating in the professional development and changes in participants’ knowledge about tasks associated with effective online teaching?
Survey items: Pre-: B1a - B1x; Post-: B1a - B1x
Statistical analysis: Descriptive statistics (mean, standard deviation); paired samples t-test
Kirkpatrick framework: Level 2 - Learning

Q2: What professional development activities did the participants find to be most useful?
Survey items: Post-: C3, C7, C8
Statistical analysis: Descriptive statistics (frequency, percentages); survey open-ended response
Kirkpatrick framework: Level 1 - Reaction

Q3: In what professional development activities did the participants have the greatest knowledge gains?
Survey items: Pre-: C2a - C2m; Post-: C2a - C2m; Pre-: C1; Post-: C1; Post-: C5; Post-: C6
Statistical analysis: Descriptive statistics (mean, standard deviation); paired samples t-test
Kirkpatrick framework: Level 2 - Learning
The primary purpose of the data analysis was to determine whether the differences between the pre- and post-survey results were statistically significant (p < .05). A t-test was done to compare the survey responses from the same faculty members. A t-test for dependent means is appropriate when a single group of the same participants is studied under two conditions (Salkind, 2008); here, the same faculty were surveyed at two points in time, before the start of the online professional development program and at the end of the program. The difference between the
faculty members’ scores on the pre-survey and post-survey was the main focus of
this study. Lastly, although not the main focus of this investigation, a repeated
measures analysis of variance was also run to explore the relationship between the
online professional development outcomes and participant demographics such as age
and online teaching experience.
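To illustrate the dependent-means comparison described above, the following minimal Python sketch runs a paired-samples t-test on fabricated pre- and post-scores; it is not the SPSS procedure or the data used in this study, and the arrays stand in for one composite score per matched respondent.

import numpy as np
from scipy import stats

# Fabricated pre/post composite scores for the same respondents, in the same order.
pre = np.array([2.4, 3.1, 2.8, 3.0, 2.2, 2.9])
post = np.array([3.0, 3.4, 3.1, 3.3, 2.8, 3.2])

# Dependent-means (paired-samples) t-test: each respondent serves as his or her own control.
t_stat, p_value = stats.ttest_rel(pre, post)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")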
Cronbach’s alpha was computed to measure internal consistency reliability.
Internal consistency reliability was used to determine whether the items on the
survey were consistent with one another to ensure they represented the constructs
(Salkind, 2008). According to Salkind (2008), the more consistently individual item
scores vary with the total score on the survey, the more confidence the researcher can
have that the survey is measuring one thing, and that one thing is the sum of what
each item evaluates. The values are presented in Table 13 and indicate that the subscales have alpha values ranging from .80 to .91, which reflect acceptable internal reliability (Salkind, 2011).
Table 13
Cronbach’s Alpha for TPACK
Domain N of Survey Items Cronbach’s Alpha
Pedagogy 3 .817
Technology 3 .856
Content 3 .831
Pedagogical Content 4 .909
Technological Content 3 .837
Technological Pedagogy 4 .887
Technological Content Pedagogy 4 .804
Cronbach's alpha was also computed for the 13 questions in section two of the survey, which asked participants how prepared they were for the teaching tasks related to the professional development learning outcomes. The resulting value of .92 reflected high internal reliability.
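As an illustration of the internal consistency check reported above, the following minimal Python sketch computes Cronbach's alpha from a respondents-by-items matrix of Likert ratings; the function and the fabricated ratings are illustrative only and are not the SPSS routine or the survey data used in the study.

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Fabricated ratings for five respondents on a three-item subscale (Likert 1-5).
ratings = np.array([[3, 4, 3],
                    [2, 2, 3],
                    [4, 5, 4],
                    [3, 3, 3],
                    [5, 4, 5]])
print(round(cronbach_alpha(ratings), 3))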
Limitations
In order for the study results to be useful, the researcher considered internal
and external validity limitations (McEwan & McEwan, 2003). In terms of internal
validity, the research findings may have been impacted by biases relating to self-
selection of the faculty participants. Additionally, over the course of the study, the
faculty participants may have been influenced by factors other than the professional
development intervention.
Although the survey provided faculty participants an opportunity to respond to open-ended questions, the data collection method was limited because it consisted primarily of quantitative items and scales. Furthermore, the survey was distributed to a number of faculty participants online, which limited personal contact during administration (Fowler, 2002). Lastly, all 39 participants completed the pre-survey; however, not all faculty completed the post-survey by the January 2012 professional development end date, reducing the post-survey count to 25.
CHAPTER FOUR
ANALYSIS OF THE DATA
This chapter presents the findings of the study in response to three research
questions. The research reviewed in Chapter Two indicated that effective
professional development should be designed and implemented to meet the needs of the
adult learner, and that effective online teaching involves having knowledge relating
to technology, pedagogy, and course content. The first research question, therefore,
sought to determine the relationship between participating in a professional
development program and changes in participants’ self-assessed knowledge about
tasks associated with effective online teaching. The second question examined what
professional development activities the participants found to be most useful in
preparing them to teach in an online environment. Finally, the third question aimed
to determine what professional development activities gave the participants the
greatest knowledge gains.
The data were analyzed using the SPSS v19.0 software. The researcher
utilized descriptive and inferential statistics to report the findings in this chapter.
The first section of the chapter presents participant characteristics including gender,
age, and years of teaching in higher education and online. The second section of the
chapter presents the results of the data analysis as they pertain to the three research
questions. Detailed discussion of these results will be presented in Chapter Five.
Description of the Sample
As noted in Chapter Three, the participants in the study were primarily
faculty from a community college on Oahu, Hawaii. The original sample consisted
of 39 faculty members who participated in the online professional development
program. Faculty completed a pre-survey during a mandatory orientation session
and a post-survey at a culminating session to showcase the application of their online
learning. Both sessions were also available to faculty via Blackboard Collaborate.
Participants who chose to attend the sessions virtually were provided a web address
for the pre- and post-surveys. Participants who did not attend the final session were
e-mailed the post-survey web address with a request to complete the online post-
survey. A total of 39 (N=39) pre-surveys were distributed and completed. At the
completion of the 12 week professional development, a total of 39 post-surveys were
distributed, however only 25 (N=25) were completed. The pre-survey response rate
was 100%; the post-survey response rate was 64.1%. Based on the final data, the
researcher looked at participants who completed the pre-survey and post-survey.
Both surveys were matched using participants’ birth date and telephone number
combinations. There were a total of 24 (N=24) matched surveys. The response rates
for the pre- and post-surveys and matched group are depicted in Table 14 below:
Table 14
Response Rates
Survey Type Response Rate (N = 39)
Pre-Survey 100%
Post-Survey 64.1%
Matched Pre- /Post-Survey 62.3%
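In principle, the matching of pre- and post-surveys on the birth date and telephone number combination could be reproduced with a key-based merge, as in the following minimal sketch; the column names ("birth_date", "phone") and the tiny inline records are hypothetical stand-ins rather than the study's actual survey files.

import pandas as pd

# Hypothetical pre- and post-survey extracts; real data would have one row per respondent.
pre = pd.DataFrame({
    "birth_date": ["1970-01-15", "1982-06-02", "1975-11-30"],
    "phone": ["555-0101", "555-0102", "555-0103"],
    "prepared_pre": [2, 3, 2],
})
post = pd.DataFrame({
    "birth_date": ["1970-01-15", "1975-11-30"],
    "phone": ["555-0101", "555-0103"],
    "prepared_post": [3, 3],
})

# Inner merge keeps only respondents who appear in both surveys (the matched group).
matched = pre.merge(post, on=["birth_date", "phone"])
print(len(pre), "pre-surveys,", len(post), "post-surveys,", len(matched), "matched")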
Matched Sample Demographics
The researcher ran a frequency distribution analysis for the entire sample to provide a concise representation of the data and an accurate breakdown of the sample population, as noted in Chapter Three. The most basic way to illustrate data is through a frequency distribution, which allows for further computation of other statistics (Salkind, 2011). Demographic characteristics of the pre-survey sample and the matched sample are presented in Table 15.
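As a brief illustration of this step, the sketch below produces the kind of frequency and percentage breakdown shown in Table 15 for a single demographic variable; the DataFrame and its column name are hypothetical, although the gender counts mirror the pre-survey sample.

import pandas as pd

# Hypothetical sample frame with one row per respondent (24 female, 15 male, as in the pre-survey).
sample = pd.DataFrame({"gender": ["Female"] * 24 + ["Male"] * 15})

counts = sample["gender"].value_counts()
percentages = (counts / len(sample) * 100).round(1)
print(pd.DataFrame({"Frequency": counts, "%": percentages}))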
As shown in Table 15, the pre-survey respondents had a range of higher
education teaching and online teaching experience. There was also a range of age
levels. The majority of the respondents were female.
Pre-survey respondents' years of teaching in higher education ranged from 1 to 30 years with a mean of 10.67 years (SD = 7.72). Sixty-four percent of the respondents had 0 to 2 years of online teaching experience (M = 2.44 years; SD = 3.07). Twenty-three percent had more than 5 years of online teaching experience. The respondents' ages ranged from 27 to 65 with a mean age of 45.31 (SD = 10.01). Lastly, the respondents were predominantly female, with 24 female respondents (61.5%).
Table 15
Sample Demographics

Type                                     Pre-Survey (N = 39)    %       Matched (N = 24)    %

Years of Experience in Higher Education
  0 – 5                                  11                     28.2    6                   25.0
  6 – 10                                 10                     25.6    7                   29.2
  11 – 15                                 9                     23.1    6                   25.0
  16 or more                              9                     23.1    5                   21.0

Years of Experience in Online Teaching
  0                                      15                     38.5    7                   29.2
  1 – 2                                  10                     25.6    6                   25.0
  3 – 4                                   5                     12.8    3                   12.5
  5 or more                               9                     23.1    8                   33.2

Age
  20 – 29                                 2                      5.1    1                    4.2
  30 – 39                                 9                     23.1    8                   33.4
  40 – 49                                16                     41.0    9                   37.6
  50 – 59                                 9                     23.1    5                   20.9
  60 – 69                                 3                      7.7    1                    4.2

Gender
  Female                                 24                     61.5   17                   71.8
  Male                                   15                     38.5    7                   29.2
The 24 matched survey respondents were demographically similar to the 39 pre-survey respondents. The matched survey respondents' years of experience in teaching in higher education also ranged from 1 to 30 years with a comparable mean of 10.8 years (SD = 8.07). More than half of the matched survey respondents had 0 to 2 years of online teaching experience (M = 3.0 years; SD = 3.37). Thirty-three percent had more than 5 years of online teaching experience. The respondents' ages ranged from 28 to 67 with a mean age of 43.8 (SD = 9.64). Lastly, 71.8% of the matched survey respondents were female.
The researcher ran independent-samples t-tests comparing the 24 matched respondents with the 15 participants who did not complete a post-survey. Results indicated that there were no statistically significant demographic differences between the two groups. Because of the small sample size, disaggregating the data by any of these variables was not a primary focus of this investigation.
Results
Aiming to evaluate the online professional development primarily at Kirkpatrick's level two (learning), the researcher used three questions to guide this investigation:
1. Kirkpatrick Level 2 (Learning) What is the relationship between
participating in the professional development and changes in participants’
self-assessed knowledge about tasks associated with effective online
teaching?
2. Kirkpatrick Level 1 (Reaction) What professional development activities
did the participants find to be most useful?
3. Kirkpatrick Level 2 (Learning) In what professional development
activities did the participants have the greatest knowledge gains?
The following sections will present the results of the analyses for the three
research questions based on the participant responses from the pre- and post-surveys.
The surveys included fixed choice (closed) questions and open-ended questions
requiring a written response.
Findings Related to Research Question 1
Research question one asked “What is the relationship between participating
in the professional development and changes in participants’ self-assessed
knowledge about tasks associated with effective online teaching?” Matched pre- and
post-surveys were used to evaluate the changes in participants’ understanding of
technology, pedagogy, and content knowledge (TPACK) at the second level of the
Kirkpatrick framework, learning.
The tasks associated with effective online teaching in research question one
were presented in the 24 TPACK questions in the “Teaching Online Knowledge”
section of the survey. To determine the level of TPACK knowledge of the
participants, survey questions B1a – B1x asked participants to rate their knowledge
in each of the seven areas as described by the TPACK framework. The participants
responded to the following question: “How would you rate your knowledge in doing
the following tasks associated with teaching in a distance education setting?” Table
16 provides a brief definition of the seven TPACK categories and example
corresponding questions that represent the TPACK subscales.
Table 16
TPACK Categories with Brief Definitions and Example Corresponding Questions

Content (C)
  Definition: Subject matter to be taught
  Example: My ability to decide on the scope of concepts taught within my class

Pedagogy (P)
  Definition: Process and methods of teaching and learning
  Example: My ability to adjust teaching methodology based on student performance/feedback

Technology (T)
  Definition: Tools and modalities to represent information
  Example: My ability to address various computer issues related to software

Pedagogical Content (PC)
  Definition: Process and methods of teaching that are applicable to teaching of a specific content
  Example: My ability to distinguish between correct and incorrect problem solving attempts by students within my discipline

Technological Content (TC)
  Definition: The manner in which technology and content are reciprocally related to each other
  Example: My ability to use technological representations to demonstrate specific concepts in my content area

Technological Pedagogy (TP)
  Definition: The manner in which technology is used in teaching and learning settings
  Example: My ability to moderate online interactivity among students

Technological Pedagogy Content (TPACK)
  Definition: The manner in which technology integration in teaching and learning represents the dynamic, transactional nature between the three components
  Example: My ability to use technology to predict students' skill/understanding of a particular topic
The TPACK framework suggests that online learning opportunities should address technology integration in comprehensive ways that cover all of its components. Participants responded to the 24 TPACK items that made up the 7 subscales using a Likert scale with the following responses: 1 (Poor), 2 (Fair), 3 (Good), 4 (Very Good), and 5 (Excellent). The researcher tested the reliability of the TPACK subscales, which resulted in reliable composites with Cronbach's alpha scores ranging from .804 to .909, as displayed in Table 17.
Table 17
Pre-Survey Descriptive Statistics for TPACK Subscales with Cronbach's Alpha Scores

TPACK Knowledge Subscale       Number of Items   Survey Items     Mean (pre)   SD (pre)   Cronbach's
                                                                  N = 39       N = 39     Alpha
Pedagogy                       3                 3, 10, 18        3.28         .843       .817
Technology                     3                 1, 7, 17         2.33         .958       .856
Content                        3                 2, 4, 13         3.28         .913       .831
Pedagogical content            4                 6, 9, 19, 21     3.35         .905       .909
Technological content          3                 15, 20, 22       2.53         .999       .837
Technological pedagogy         4                 8, 12, 14, 16    2.48         1.036      .887
Technology pedagogy content    4                 5, 11, 23, 24    2.56         .854       .804
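The subscale composites summarized in Table 17 are item means over the groups of survey items listed for each domain. The following minimal Python sketch shows one way such composites could be formed; the column naming (b1_1 through b1_24) and the fabricated demo ratings are assumptions for illustration, not the study's data files.

import numpy as np
import pandas as pd

# Item-to-subscale mapping taken from Table 17.
subscales = {
    "pedagogy": [3, 10, 18],
    "technology": [1, 7, 17],
    "content": [2, 4, 13],
    "pedagogical_content": [6, 9, 19, 21],
    "technological_content": [15, 20, 22],
    "technological_pedagogy": [8, 12, 14, 16],
    "technology_pedagogy_content": [5, 11, 23, 24],
}

def composite_scores(survey: pd.DataFrame) -> pd.DataFrame:
    # Each composite is the mean of a respondent's ratings on that subscale's items.
    return pd.DataFrame({
        name: survey[[f"b1_{i}" for i in items]].mean(axis=1)
        for name, items in subscales.items()
    })

# Demo with fabricated Likert ratings (1-5) for five respondents on all 24 items.
rng = np.random.default_rng(1)
survey = pd.DataFrame(rng.integers(1, 6, size=(5, 24)),
                      columns=[f"b1_{i}" for i in range(1, 25)])
print(composite_scores(survey).round(2))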
Means and standard deviations for each of the composite variables were
calculated for the post-survey population and are displayed in Table 18.
Table 18
Post-Survey Summary of Descriptive Statistics for TPACK Subscales

TPACK Knowledge Subscale       Number of Items   Survey Items     Mean (post)   SD (post)
                                                                  N = 25        N = 25
Pedagogy                       3                 3, 10, 18        3.67          .752
Technology                     3                 1, 7, 17         2.89          .927
Content                        3                 2, 4, 13         3.83          .688
Pedagogical content            4                 6, 9, 19, 21     3.60          .713
Technological content          3                 15, 20, 22       3.39          .896
Technological pedagogy         4                 8, 12, 14, 16    3.18          .914
Technology pedagogy content    4                 5, 11, 23, 24    3.15          .848
Paired-samples t-tests indicated that the differences between each of the matched pre- and post-survey TPACK composite variables were statistically significant, as shown in Table 19. The results indicated an overall rise in professional development participants' responses on all seven TPACK subscales.
Table 19
Matched Means for Pre- and Post-TPACK Survey Questions

TPACK Knowledge Subscale       Mean (pre)   Mean (post)   t       Sig.
                               N = 24       N = 24
Pedagogy                       3.19         3.65          3.304   .003
Technology                     2.36         2.83          3.593   .002
Content                        3.24         3.79          3.498   .002
Pedagogical content            3.26         3.55          3.685   .001
Technological content          2.65         3.33          5.204   .001
Technological pedagogy         2.64         3.17          4.123   .001
Technology pedagogy content    2.64         3.10          3.689   .001
The results suggested that there may be a relationship between participating
in the distance education professional development and participants’ self-assessed
knowledge about tasks associated with effective online teaching as noted in the
TPACK literature. The research results indicated that the distance learning
professional development may have supported the TPACK theoretical framework by
improving the faculty participants’ knowledge about how to design and teach courses
online, particularly in understanding the relationship between technology, pedagogy,
and course content. The TPACK framework “argues against teaching technology
skills in isolation and supports integrated and design-based approaches as being
appropriate techniques for teaching instructors to use technology” (Mishra &
Koehler, 2006, p. 1045). The domains that represented the largest gains among the
distance learning professional development participants when comparing the pre-
and post-survey included the use of technology as it relates to technological content,
technological pedagogy, and technology pedagogy content.
The researcher also conducted further analysis on whether the treatment effect was moderated by other variables such as age and online teaching experience. Learning gains appeared to vary with participants' age and years of experience, but the sample size was too small to draw any statistically significant conclusions.
Findings Related to Research Question 2
Research question two asked which three online professional development activities the participants found to be most useful. Online Community College has had positive results in previous online professional development programs; however, these practices have not been documented. The aim of research question two was to document preferred practices in professional development for online teaching, since there is limited research addressing what constitutes effective professional development from the perspectives of the instructors who receive training on how to make a successful transition from face-to-face delivery to online delivery (Roman et al., 2010).
The range of possible online professional development activities that the
participants could indicate as most useful in preparing them for online teaching were
presented in the “Teaching Online Preparedness” section of the post-survey. Survey
question C7 asked the participants to select three online professional development
activities they found most useful in preparing them to teach online. Participants had
an option to write in an activity or select from the 13 items presented in the post-
survey. The post-survey results demonstrated that the participants ranked the top
three professional development activities as “Apply Universal Design for Online
Learning”; “Utilize the Laulima Learning Platform”; and “Use Synchronous
Communication Tools Such As Blackboard Collaborate”. Table 20 presents the
ranking of the 13 items.
In Chapter Two, one of the studies reviewed (Wolf, 2006) noted that effective training programs teach online instruction using the same course delivery system that the participants will use when teaching. The findings suggest that the top three professional development activities were connected to practices cited in the literature, since Online Community College trainers incorporated universal design, synchronous communication tools, and the Laulima learning platform in the design and delivery of the online professional development program. Online Community College's professional development trainers followed effective training program design, enabling faculty participants to experience preferred practices so that they would have insight into how best to incorporate them into their own course design and delivery.
Table 20
Ranking of Professional Development Activities Selected as Most Useful

Professional Development Activity                                      Frequency    %
Apply Universal Design for Learning in Course Content                     13        18
Utilize the Laulima Learning Platform                                       9       13
Use Synchronous Communication Tools such as Blackboard
  Collaborate/Elluminate                                                    9       13
Facilitate/moderate an online classroom                                     7       10
Utilize online instructional pedagogy such as acknowledging
  students' perspective, varying instruction, and managing
  problem behavior                                                          6        8
Teaching strategies for hybrid courses                                      6        8
Use asynchronous communication tools such as Discussion Boards
  and Multimedia                                                            5        7
Create an online syllabus                                                   5        7
Create an online course orientation                                         4        6
Create an online assessment                                                 4        6
Manage online assignments such as grading and providing timely
  student feedback                                                          1        1
Prevent plagiarism or cheating in an online classroom                       1        1
Generate online reports                                                     0        0
Other                                                                       2        3
These findings must be interpreted with caution because the items that ranked
in the top three categories represented only 35% of the participant responses.
Participants selected a range of responses that included all but one of the professional development activities, demonstrating individual learner differences as noted in the core adult learning principles.
Interestingly, the higher ranked professional development activities were
learning activities that could be connected to experiential learning techniques used
throughout the training. Additionally, the online professional development program
trainers modeled the activities as they delivered the training to the faculty. The
higher rated learning activities were also demonstrated or featured in the mandatory
orientation session and the optional face-to-face sessions/webinars.
In contrast, the professional development activities that ranked at the bottom of the list were generally not presented to the participants in an experiential manner. Most of the information for the lower-rated activities was delivered to the participants in a text document format.
Optional webinars or in-person seminars. To further inform research
question two, post-survey question C3 asked participants about how many optional
webinars or in-person seminars they attended during the course of the professional
development. More specifically, the question read “How many optional webinars or
in-person seminars did you attend?" Participants responded by selecting from the following choices: 1) None; 2) One; 3) Two; and 4) Three or more. Seven of the participants attended three or more; nine of the participants attended two; four of the participants attended one; and only four of the participants did not attend any optional webinar or in-person seminar. In summary, 84% of the matched respondents reported that they attended at least one optional professional development activity.
Participants' open-ended survey responses. Patton (2002) noted that open-ended questions can provide further insight to supplement quantitative data. One of the open-ended questions included in the post-survey asked participants to provide comments regarding the distance education program's effectiveness in preparing them to teach online. Participant responses provided further insight into research question two by exploring which professional development elements they found to be most useful in preparing them to teach online.
The participants' most prevalent written comments related to the approach of the distance education professional development program. Two of the participants mentioned that the faculty sharing sessions were effective program elements. The first faculty member had been teaching online for a few years prior to attending the training:
The program encouraged me to explore tools that I have not attempted to
understand in the past three years of teaching online. The most helpful
element of the program was the sharing session. I picked up and was inspired
to try new, creative ideas.
The second faculty member also commented on the faculty sharing sessions stating
the following: “I learned a lot from the sharing sessions. I feel a little more
comfortable about my hybrid class now with the training from the certification
program.”
Other participants commented on the benefits of incorporating Web 2.0 tools
into the professional development: “I think the most useful thing for me is having
more exposure to Blackboard. Although I am not using it in my hybrid class, I know
that I’ll be using it more in online courses.” Similarly, a different participant stated
that “This professional development experience really invigorated me as a teacher
and helped me to rework some elements in my online classes to increase student
engagement. It also made me think about things I have not for a long time.”
Other faculty participants alluded to the quality of the resources that were
made available to them and the convenience of accessing them via the online
delivery platform: “The online reading materials and resources were very valuable
and will be useful for future use. Opened my eyes to new possibilities and
opportunities.” Another faculty member noted that “The resources provided online,
in webinars, and in person have been excellent.”
Lastly, five participants mentioned the support and expertise of the Center for
Excellence and Teaching Technologies’ (CELTT) support staff and design team.
One participant noted that “This has been an excellent exercise that has increased my
self-efficacy in designing and delivering an online course…the CELTT staff have
done a wonderful job of guiding and supporting my learning as always.”
The participants' open-ended responses about the professional development design components supported adult learning principles discussed in Chapter Two, including the principle that adults typically become ready to learn when they experience a need to cope with a life situation or perform a task. Having staff support available provided the self-directed learners with just-in-time support to assist them in fulfilling the learning outcomes for the distance education professional development program.
Findings Related to Research Question 3
Research question three asked “In what professional development activities
did the participants have the greatest knowledge gains?” Pre- and post-surveys were
used to evaluate the participants’ knowledge at the second level of the Kirkpatrick
framework, learning. The professional development activities in research question three were presented in questions C2a – C2m in the "Teaching Online Preparedness" section of the surveys. The 13 online preparation questions focused on the learning goals of the training. Participants were asked to respond to the
following question: “Please read the following statements, and then indicate how
prepared you are for the following online teaching tasks?” Participants had an option
to select 1) Not at all, 2) Somewhat, 3) Quite a bit, and 4) Completely. Participants’
responses to the individual questions were analyzed to determine learning gains on
the distance education professional development learning outcomes.
First, the researcher tested the 13 questions for reliability. The 13 questions formed a reliable composite, receiving a Cronbach's alpha of .925. Next, the researcher ran a paired-samples t-test to examine participants' overall pre-survey and post-survey scores on "preparedness for online teaching tasks". Mean scores indicated that
participants reported an overall improved level of “preparedness for online teaching
tasks” presented in the distance education professional development program. The
pre- and post-survey means are presented in Table 21.
Table 21
Overall Mean Pre-Survey and Post-Survey Scores for the 13 Questions on "Participants' Preparedness for Online Teaching Tasks"
Number of Participants Mean (Pre) Mean (Post) t Sig.
24 2.21 2.88 6.122 .001
A paired samples t-test conducted on the pre- and post-surveys revealed there
were statistically significant differences in 12 of the 13 scores. Based on the t-test
scores, the items denoting the greatest gains included: 1) Apply universal design for
learning in course content, 2) Prevent plagiarism or cheating in an online classroom,
3) Teaching strategies for hybrid courses, 4) Utilize online instructional pedagogy,
such as acknowledging students’ perspective, varying instruction, and managing
problem behavior, 5) Create an online course orientation. Table 22 specifically
highlights the knowledge gains by comparing the mean scores of the pre- and post-
survey items to inform research question three.
Table 22
Mean Pre-Survey and Post-Survey Scores for Individual Questions on "Participant Preparedness for Online Teaching Tasks"

Response to survey question prompt: How prepared              Mean (Pre)   Mean (Post)   t       Sig.
are you for the following teaching tasks:                     N = 24       N = 24
Create an online syllabus                                     2.63         3.25          3.021   .006
Create an online course orientation                           2.08         2.92          4.453   .001
Facilitate/moderate an online classroom                       2.38         2.96          2.807   .010
Use asynchronous communication tools such as
  discussion boards and multimedia                            2.58         2.92          1.781   .088
Use synchronous communication tools such as
  Blackboard Collaborate/Elluminate                           1.96         2.46          2.505   .020
Manage online assignments such as grading and
  providing timely student feedback                           2.67         3.17          2.627   .015
Utilize online instructional pedagogy such as
  acknowledging students' perspective, varying
  instruction, and managing problem behavior                  2.25         2.88          4.733   .001
Create an online assessment                                   2.21         2.92          4.303   .001
Apply universal design for learning in course content         1.92         2.83          5.412   .001
Prevent plagiarism or cheating in an online classroom         1.83         2.63          5.379   .001
Utilize the Laulima learning platform                         2.46         3.21          3.423   .002
Generate online statistical reports                           1.67         2.50          4.053   .001
Teaching strategies for hybrid courses                        2.13         2.83          5.027   .001
All items except “Use asynchronous communication tools such as discussion boards
and multimedia” had statistically significant learning gains. The participants’
response to the item “Use asynchronous communication tools such as discussion
boards and multimedia” was not statistically significant (p = .088). The survey
question responses suggested that participants demonstrated higher learning gains in
certain activities; however, there were learning gains in all professional development
activities.
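For illustration, the item-level pre- and post-survey comparisons summarized in Table 22 could be reproduced with a loop of paired-samples t-tests, as in the following minimal sketch; the item labels (c2a through c2m) follow the survey's numbering, but the ratings below are fabricated placeholders rather than the study's matched responses.

import numpy as np
import pandas as pd
from scipy import stats

# Fabricated matched responses: 24 respondents, 13 preparedness items rated 1-4.
rng = np.random.default_rng(0)
items = [f"c2{ch}" for ch in "abcdefghijklm"]
pre = pd.DataFrame(rng.integers(1, 4, size=(24, 13)), columns=items)
post = pd.DataFrame(rng.integers(2, 5, size=(24, 13)), columns=items)

# One paired-samples t-test per item, mirroring the rows of Table 22.
for item in items:
    t_stat, p_value = stats.ttest_rel(pre[item], post[item])
    print(f"{item}: t = {t_stat:.2f}, p = {p_value:.3f}")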
Readiness to teach online. In addition to examining the participants' online knowledge gains, the researcher asked participants in survey question C1 whether they felt ready for an online teaching experience to gauge their self-reported level of preparation at the start and at the end of the professional development. Possible answers were 1) Not at all; 2) Somewhat; 3) Quite a bit; and 4) Completely. First, the mean response for the pre-survey was reported to determine the confidence level of the 39 faculty participants. The mean response was "somewhat" prepared (M = 2.21, SD = .695), as shown in Table 23 below.
Table 23
Pre-Survey Question on "Participants' Preparedness for Online Teaching Experience"

Question                                                       Mean (pre)   Standard Deviation (pre)
                                                               N = 39       N = 39
How prepared do you feel for an online teaching experience?   2.21         .695
Then a paired-samples t-test compared the pre- and post-scores for the matched surveys. As displayed in Table 24, t-test results indicated an increase in participants' feelings of preparedness for an online teaching experience.
Table 24
Matched Means for Pre- and Post-Survey Question on "Participants' Preparedness for Online Teaching Experience"

Survey Question                                                Mean (Pre)   Mean (Post)   t       Sig.
                                                               N = 24       N = 24
How prepared do you feel for an online teaching experience?   2.33         2.88          3.406   .002
Interestingly, the initial 39 participants had a mean preparedness score of 2.21, compared to 2.33 for the 24 matched survey participants. Based on the t-test results, matched participants' feelings of preparedness had significantly improved by the end of the online professional development.
Despite the small sample size, the researcher also conducted further analysis on whether the matched participants' preparedness scores were moderated by other variables such as age and online teaching experience to see if there were any trends worth noting. It appears that participants' preparedness scores varied by age and years of experience. For example, the participants who were over the age of 50 had a better view of themselves at the start of the training, rating themselves higher on the "feeling prepared to teach online" question. However, the participants under the age of 50 appeared to get more out of the training based on the increase in their post-survey preparedness scores. Figure 4 depicts the differences between the younger (under 50 years of age) and the older (over 50 years of age) participants. Because the sample size was too small, the researcher was unable to come to any statistically significant conclusions; a larger sample might yield statistically significant results.
Figure 4. Participants’ Age and Preparedness for Online Teaching
Level of interest in teaching and designing an online course. The researcher asked the matched participants in post-survey questions C5 and C6 about their level of interest in teaching and designing an online course to gauge their self-reported level of interest at the end of the professional development experience. Possible responses were 1) Less Interested; 2) Same; and 3) More Interested. Table 25 provides a summary of the responses, with 50% or more of the participants more interested in teaching and designing an online course at the end of their distance education professional development experience.
Table 25
Level of Interest in "Teaching and Designing an Online Course"

Type                                      Matched (N = 24)    %

Interest in Teaching an Online Course
  More Interested                         12                  50
  Same                                    12                  50
  Less Interested                          0                   0

Interest in Designing an Online Course
  More Interested                         14                  58
  Same                                    10                  42
  Less Interested                          0                   0
Summary of Findings
Through descriptive statistics and various statistical analyses of the survey items, the researcher aimed to answer three research questions. The researcher determined that the surveys were reliable and that there were statistically significant differences in learning gains between the pre-survey and post-survey data. Statistically significant differences were evident in the distance learning professional development knowledge gains as measured by the TPACK scale. Participants in the online professional development program improved their self-assessed knowledge about tasks associated with effective online teaching (research question one). Participants also demonstrated improvement in their attitudes about preparation for an online teaching experience and ranked certain online activities as more useful than others (research question two). Lastly, participants experienced learning gains in the majority of online teaching tasks presented in the professional development program. Certain teaching tasks, such as applying universal design for learning in course content and teaching strategies for hybrid courses, had higher learning gains (research question three).
The next chapter discusses the findings of the study in the context of the literature on adult learning, professional development for online teaching, and evaluation. Finally, recommendations are made for the organization's online professional development program and for future research on online professional development.
CHAPTER FIVE
DISCUSSION
Higher education institutions have been offering students online course
options as a means to increase educational attainment. These institutions are
challenged with developing faculty who are ready, willing, and able to teach in the
online world (Koehler, Mishra, Hershey, & Peruski, 2004). When faculty opt to
teach online, courses are typically content-, faculty-, or facilitator-driven (Palloff &
Pratt, 2007). Many faculty report they are unprepared to teach online because they
have been mostly prepared to instruct in a traditional classroom environment (Varvel,
2007; Wilson, 2001). As a result, institutions are encouraged to provide faculty with
instructional design support that prepares them to teach their courses online
(Simonson, 2007).
Designing a course for the online environment and then teaching it there require slightly different skill sets than face-to-face instruction does, depending on the online elements being used. It is not a matter of simply "uploading" one's
course and then teaching. And just as higher education historically has provided
little guidance for faculty on either effective course design or effective pedagogy in
the face-to-face classroom environment, until recently, little attention was given to
coaching faculty who ventured to deliver a course online. Researchers proposed that
professional development for online teaching needs to provide faculty with an
understanding of how course content, pedagogy, and technology can be integrated
when designing online courses (Koehler, Mishra, Hershey, & Peruski, 2004; Mishra
& Koehler, 2006). In this approach, faculty develop technological solutions to
pedagogical challenges in their subject matter content.
Creating and maintaining online professional development for adult learners
requires significant investment of resources on behalf of the educational institution
(Yang & Cornelius, 2005). Despite the investment of financial and human resources,
many institutions do not have an evaluation system to assess their training against
preferred practices as cited in the literature. When evaluation does occur, it seldom
extends beyond participants’ satisfaction and reaction to the professional
development activities, making it difficult to accurately assess the professional
development’s impact on creating changes in learner outcomes. Furthermore, there
is limited research addressing what constitutes effective professional development
from the perspectives of the instructors who receive training on how to make a
successful transition from face-to-face delivery to online delivery (Roman et al.,
2010).
The purpose of this study was to evaluate a professional development program designed to prepare faculty to teach online and its relationship to changes in their self-assessed knowledge about tasks associated with effective online teaching. The evaluation was accomplished by first ensuring that the training was well designed and implemented to meet the training goal of increasing faculty knowledge with respect to the three key domains described by the TPACK framework: technology, pedagogy, content, and the combination of these areas. Second, the researcher referred
to principles of adult learning (Knowles et al., 2005) and Kirkpatrick’s evaluation
model (2006) as a framework guiding data collection about how effective the
distance education professional development was in helping faculty to improve their
knowledge relating to online teaching.
Information about quality professional development can be gathered by conducting small-scale but rigorous studies that provide measures of training effectiveness (Hill, 2009). Similarly, the findings of this small-scale study
provided information that may help program administrators and trainers enhance and
implement future online professional development for adult learners. This chapter
will consolidate the research by summarizing the study results by research question.
Finally, it will discuss implications for practice, suggestions for further research, and
conclusions.
Summary of Findings
Professional Development for Online Teaching
A comprehensive literature review revealed limited scholarly research
relating to training faculty to teach online (Wolf, 2006). In Chapter Two, the
researcher summarized what was cited in the literature as issues and characteristics
that ought to be considered in professional development programs to prepare faculty
to teach online. One of the primary theoretical frameworks that stood out in the
literature was the TPACK framework. Using the TPACK framework as the
theoretical grounding to interpret the results of research question one, it appears that
this professional development may have been designed and implemented to improve
the participants' understanding of online teaching tasks as represented by the TPACK
framework.
Research Question 1 (Kirkpatrick Level 2, Learning). Research question
one specifically focused on the changes in participants’ self-assessed knowledge as
defined by TPACK literature. Participants who participated in the online
professional development program improved their self-assessed knowledge about
tasks associated with effective online teaching. The pre- and post-survey results
provided evidence of statistically significant results for all of the seven TPACK
composite variables. The TPACK domains that represented the largest gains among
the online professional development participants when comparing the pre- and post-
survey results included the use of technology as it relates to technological content,
technological pedagogy, and technology pedagogy content. Most likely the gains are
related directly to the design of the training since it was designed to meet the training
goal of increasing the knowledge of faculty with respect to the three key domains as
described by the TPACK framework: technology, pedagogy, content, and the
combination of each of these areas.
Program developers in this study provided participants with numerous
opportunities to be exposed to the TPACK framework. The course website integrated
TPACK standards with links to relevant resources. Additionally, the participants had
an opportunity to attend a TPACK framework session and were required to complete
course planning activities in their content area, which served as the basis for making
decisions about how and which technologies would be integrated into their course
site. The participants were also required to select appropriate technological tools that
supported course plans. The design of the online professional development enabled
faculty to “move beyond oversimplified approaches that treat technology as an ‘add
on’ instead to focus in a more ecological way, upon connections among technology,
content, and pedagogy” (Koehler & Mishra, 2009, p. 67) as they play out in the
online classroom context. The design team approach (Koehler et al., 2004; Koehler
& Mishra, 2005) was incorporated into the professional development’s design to
provide pedagogical and technical support to the participants. Throughout the
professional development, the team of faculty and instructional designers was available to provide customized help to the participants as they completed the goal of the training: a final project. The final project involved showcasing how they implemented the online tools in their course.
Research Question 2 (Kirkpatrick Level 1, Reaction). Seeking to
document preferred online professional development practices at Online Community
College, the researcher used research question two, which asked which online
professional development activities the participants found to be most useful, and by
extrapolation, most effective.
From among 13 possible online learning activities, faculty identified
“applying universal design for online learning”, “utilizing the Laulima learning
platform”, and “using synchronous tools such as Blackboard Collaborate” as the top
three most useful online professional development activities. These topics are in line
with recent literature that recommended an array of technological communication
tools and supports be made available to professional development participants to
enhance the online learning experience, with an increasing need to have synchronous,
live web conferencing tools to engage the online learner (Pagliari et al., 2009;
Roman, 2010; Taylor & McQuiggin, 2008; Wolf, 2006).
It is worth noting that the professional development activities that the participants ranked at the bottom of the list were generally not presented in an experiential manner. Adult learning literature suggested that adults
will find experiential strategies most helpful, and indeed, the ones the participants
did not pick were not experiential.
Another common professional development component mentioned by the
participants in their open-ended responses was the appreciation of having the support
and expertise of the training center’s support staff and design team. The literature
noted that adults are self-directed learners (Tough, 1967; Brookfield, 1986); therefore,
having the training staff available may have provided participants with just-in-time
support to assist them in meeting the professional development outcomes. Koehler
et al.’s (2004) initial TPACK research also supported the use of design teams as a
strategy to facilitate TPACK. Online Community College’s faculty design team
included teaching faculty and technical experts which were also encouraged by the
TPACK literature. Koehler et al. (2004) viewed leaving design decisions to
technical experts only as having a negative impact on pedagogy and online course
quality.
Research Question 3 (Kirkpatrick Level 2, Learning). Moving beyond
Kirkpatrick's level one, research question three measured participants' preparedness and identified the professional development activities in which participants had the greatest knowledge gains.
Mean scores indicated that there were statistically significant gains in all items
except “use asynchronous communication tools such as discussion boards and
multimedia." One possible explanation for the lack of statistical significance is that
Online Community College’s trainers did not use discussion boards during the
training. Literature recommended that faculty learn how to manage discussion
boards (Barker, 2003) and facilitate online discussions before teaching online
(Pankowski, 2004; Roman et al., 2010; Taylor & McQuiggin, 2008). Instead of using
discussion boards, communication was primarily delivered synchronously via
Blackboard Collaborate during scheduled sessions. Others attended in-person
sessions to discuss relevant topics based on their training needs. Communication
with the trainers and design support team relating to course design questions or
turning in assignments was handled primarily via e-mail.
Professional Development and Adult Learning
Based on the statistical significance of the research results, it seems that the
participants were exposed to effective online training. Using the Andragogy in
Practice Model as a lens to analyze the overall outcomes and effectiveness of the
online professional development, it appears that the training design and
implementation were based on andragogical assumptions. The online delivery
design of the professional development, which included self-paced modules,
encouraged the participants to take increasing responsibility for their own learning.
Andragogy in Practice Model. The success of the distance education
program based on the participants’ responses to the three research questions may
directly relate to how the professional development incorporated the elements of the
transactional model of andragogy. Andragogy in practice focuses on the
characteristics of the situation and the learning transaction when teaching adult
learners. Similarly, the online professional development took into consideration the
individual and situational differences of the learners. Additionally, the trainers
communicated clearly the goals and purposes for the learning. By applying the core
principles and the process elements of adult learning when designing and conducting
the professional development, the trainers may have created effective learning
processes for the participants.
Learner’s need to know. The first adult learning principle is that adult
learners need to know why they need to learn something before undertaking to learn
it. During the mandatory orientation session, the trainer clearly described the
reasons why the participants needed to learn the professional development course
content. Additionally, the following goals of the professional development
(Appendix A) were shared with the participants: 1) support faculty as they prepare
for hybrid or online courses; 2) promote development of courses that employ high-impact practices; 3) support advancement of tactical and strategic plan goals; and 4) use the TPACK framework of teacher knowledge. Faculty participants also committed
to the training by signing a learning contract, which is also supported by previous
research (Cafarella & Cafarella, 1996; Clark, Dobbins, & Ladd, 1993).
Self-concept of the learner. The second adult learning principle is that adult learners have a self-concept of being responsible for their own decisions. Their self-concept is heavily dependent upon a move toward self-direction. Providing the
professional development in an online platform was congruent with treating
participants as autonomous and being capable of self-direction. Faculty support took
the form of an instructional design team that included a faculty trainer, two
instructional designers, and a part-time student helper. Since participants came to
the training with a range of teaching online and teaching in higher education
experience, they were able to initiate requests for assistance from the design team
based on their individual needs. Similarly, the design team approach for professional
development was presented in Koehler et al.'s (2004) TPACK research as a
recommended approach to support learners.
Prior experience of the learner. The third adult learning principle is that
adult learners come into an educational activity with a greater volume and different
quality of experiences than youth. Their prior adult learner experiences provide a
rich resource for learning. The online professional development allowed for a
greater emphasis on individualized teaching and learning strategies taking into
consideration the participants’ prior experiences. Participants were also given an
opportunity to share their range of experiences at the culminating sessions.
Demonstrating the online course work the participants developed was a method of
teaching, enabling participants to obtain ideas from each other.
Readiness to learn. The fourth principle is that adult learners become ready to learn when they experience a need to cope with a life situation or perform a task. The distance learning certification program had admission guidelines that required participants to commit to designing a course that is part of their academic program, making the training relevant. Furthermore, the applicants were required to have the
support of a department chair or program dean validating that the participants were
teaching or planning to teach an online course at the college.
Orientation to learning. The fifth principle is that adults' orientation to learning
is problem centered and contextual. The distance learning certification program was
designed to be problem centered and contextual as participants were charged with
deciding on the most effective methods of teaching their course work in an online
learning environment. The participants who completed the training may have
perceived the training as being useful since it provided them with resources to design
a new course or improve an existing course.
Motivation to learn. The motivation for adult learners is internal rather than
external. The adult learning literature notes that adult learners can be motivated by
their own internal pressures such as the desire for increased job satisfaction or self
esteem. This study did not explore faculty motivations; however, faculty may have
been motivated to participate to simply become better online instructors by
improving their online teaching knowledge. As part of the faculty review
process, faculty are also required to document professional development. Therefore,
participants may have viewed the online program as a means of documenting
professional development efforts. The trainer suggested, at the orientation session,
that faculty use the professional development completion documentation as evidence
of their participation for future tenure and promotion documents. In addition to a
distance learning program certification, the college provided faculty an external
reward which may have been a factor impacting whether faculty participated in the
training. Participants received a Netbook computer upon completion of the
professional development. Since only 24 of the 39 faculty actually completed the
survey, further qualitative research may be needed to learn more about why faculty
did not complete the online professional development.
Andragogical Trainer and the Process Model. Did the training prepare
faculty? Faculty preparedness for teaching online and designing an online course
had statistically significant gains. Faculty levels of interest in teaching and designing
an online course also supported their overall online preparedness rating, with 50% or
more of the participants more interested in engaging in teaching and designing an
online course at the end of the distance education experience. The online
professional development’s effectiveness can be summarized by benchmarking
Online Community College’s program against the process model used to involve
adults in their learning process. Online Community College applied adult learning
process elements to the distance education learning context in the following ways:
1) participants were provided a comprehensive orientation session preparing them
for the scope of the professional development requirements; 2) trainers provided the
participants with a collaborative working environment including optional workshops
and assistance to support individual learning needs; 3) trainers provided a
mechanism for mutual planning with the participants and the course designer via e-
mail correspondence and relevant pre-course planning assignments; 4) trainers
diagnosed the needs of the learners; 5) mutual training objectives were developed
based on the needs of the participants; 6) trainers sequenced the training by
presenting materials in manageable modules; 7) training activities were experiential
using suitable techniques and materials; and 8) there was an evaluation mechanism
included in the professional development’s design.
Online Professional Development Evaluation
Did the training improve knowledge? After using levels one and two of the
Kirkpatrick framework to evaluate the online professional development program,
the findings suggested that the faculty had positive reactions to the training.
Furthermore, the findings suggested that the professional development was a viable
and effective way to help faculty improve their TPACK knowledge. A key element
of success based on TPACK literature was that professional development program
participants were provided with technological and pedagogical tools to design their
courses while using the online learning platform, which was similar to how the
training was designed and implemented at Online Community College. Using the
online platform supported experiential learning; hence, it gave the participants a means to experience an online course from the perspective of a student becoming an online instructor (Pankowski, 2004; Roman et al., 2010; Taylor & McQuiggin, 2008; Wolf, 2006). The findings also implied that the faculty demonstrated higher learning gains in certain activities; however, there were learning gains in most of the
professional development activities.
Recommendations and Next Steps
This study demonstrated that there may be a relationship between
participating in a professional development program and changes in participants’
self-assessed knowledge about tasks associated with effective online teaching. Since
the literature review found limited TPACK research in the higher education
environment, one of the unique aspects of this study is that it incorporated a TPACK
framework in an online, higher education environment. This study also confirmed
that there were elements of the design and implementation of the professional development that community college faculty preferred, leading to a statistically significant increase in their confidence to teach online. Although this study produced some interesting and useful findings about preferred methods for preparing faculty to teach online, there are other factors that the researcher proposes for
future qualitative and quantitative research.
First, it is not known if all faculty completed the online professional
development program since only 24 of the 39 participants were accounted for at the
culminating event. The purpose of gathering the participants at the end of the
training was to provide faculty a forum to share their final product. Based on the
research findings, perhaps the training needed to have more opportunities for
asynchronous communication via discussion boards. Asynchronous communication
could serve as a formative assessment measure. Another suggestion is to encourage faculty to participate in the optional webinars. Eighty-four percent of the 24 faculty who completed the post-survey noted that they attended at least one optional webinar.
Second, it is not known if there was a need for more faculty trainers on the
instructional design team. The team consisted of only one teaching faculty member
who had to review participant course planning documents in a timely manner. Possible delays in
response time to participants’ course planning documents may have impacted their
progress in completing the online modules and other professional development
course work.
Third, it is unclear whether participants needed more time to complete
the online professional development and how other college committee work may
have interfered with professional development completion. The college was in the
midst of completing a five year review and gearing up for a 2012 accreditation visit
during the same period as the online professional development. The issue of time
needed to prepare for distance learning courses has been documented in previous
research (AFT, 2000). It was found that the preparation for distance learning courses
as being much greater than for a classroom-based course, especially for first-time
online faculty. Some estimates ranged from 66% to 500% longer (AFT, 2000).
Fourth, individual differences such as age, gender, years of online teaching,
and years of teaching in higher education were not examined in this study because of
the small sample size. It is important to study how best to support faculty
development in order to accommodate different learning needs. With a larger sample
size, data could be disaggregated to uncover possible statistically significant findings
related to participants' demographics.
Fifth, further research could identify faculty motivations and examine the
impact of the netbook computer as an external motivator for participating in online
professional development. Previous research highlighted that motivation is the most
important factor when choosing faculty to teach online (Wolf, 2006). Additional
incentives may be required to encourage existing faculty; however, it is not clear which form of
incentive is best. Successful programs choose incentives that are meaningful to their
faculty.
Finally, to obtain a full evaluation of the professional development program
at Kirkpatrick levels three and four, the evaluation process must continue. Although
the distance education professional development program incorporated Kirkpatrick
level three evaluation measures, time constraints on the study did not allow for the
collection and analysis of those data. Therefore, this research study stopped at a
Kirkpatrick level two evaluation. Higher education accrediting agencies have set
guidelines regarding the relevancy of distance learning professional development and
its impact on institutional goals. The college needs to complete the evaluation
process at Kirkpatrick levels three and four.
Conclusions
The effects of professional development on faculty knowledge vary widely as
a function of adult learner characteristics, differences in program content, the
structure and format of the experience, and the context in which implementation
occurs (Guskey, 2000). The findings of this study on effective online professional
development may be most useful to college administrators and trainers who are
charged with implementing online professional development. First, the findings
supported the relevance of incorporating the TPACK domains in a higher education
environment. Second, the research related the principles of the adult learning
framework to the success of the professional development in improving participants'
self-assessed knowledge and online teaching preparedness. Third, the study
provided a systematic evaluation of the professional development outcomes based on
Kirkpatrick level two, learning.
Despite the small sample size, the study documented preferred practices for
community college faculty. Further qualitative research is recommended to uncover
more detailed information about variables that may have impacted the research
results. It is also recommended that the college continue to support the professional
development needs of its online faculty. Evaluation at Kirkpatrick levels one and
two provided evidence that the professional development worked and that there were
preferred processes and content. Staff, time, and money should be allocated to
conduct evaluations at Kirkpatrick levels three and four to ensure that participants'
learning is applied to their online courses and that the organizational results
expected by the college are measured. With more qualitative research, the college
will gain further insight into the professional development's return on investment.
The results of this study will be shared with the college’s administrators and
the design team to put further evaluation methods into place and to enhance future
online professional development programs. The results showed statistically
significant gains in self-assessed knowledge related to online teaching, supporting
the effectiveness of the professional development. The results of this study can
provide similar educational institutions with a framework for evaluating online
professional development programs. Furthermore, higher education institutions
interested in online training grounded in current TPACK and adult learning
literature may want to consider the professional development's content and teaching
processes.
The researcher began this study because of a quest for knowledge about
which professional development components work most effectively in preparing
faculty to teach online. The impetus behind the study was the need to prepare
faculty to increase distance education course offerings as a means to increase
educational opportunities at Online Community College and, in turn, educational
attainment rates in Hawaii. In doing so, the study aimed to document preferred
online professional development practices at the college. The research findings may
have policy implications if the college is serious about its commitment to increasing
educational programs offered to students using distance technologies as a means to
improve educational attainment. Community college faculty are typically required to
have subject matter mastery; however, they are not expected to complete courses on
how to teach adult learners or how to use technology in the context of an online
classroom. Therefore, this study may be useful to leaders of community colleges and
other higher education institutions who are interested in implementing professional
development programs to certify online faculty so that they are prepared to teach in
an online environment.
Finally, although the context of this study was professional development in
the online environment, implications of the statistically significant research results
have the potential to go beyond the online classroom, opening the way to greater
pedagogical training opportunities for all faculty who teach in higher education.
Traditionally, unlike teachers in elementary and secondary education, the majority of
faculty in higher education rarely receive any pedagogical training. Based on the
statistically significant results of this study, there could be substantial benefits if
professional development efforts are expanded and sustained. The outcomes of
Online Community College's distance education professional development program
serve as a pivotal starting point for other forms of professional development that
could foster a long-term commitment to improving instructional practices.
REFERENCES
Accrediting Commission for Community Colleges and Junior Colleges Western
Association of Schools and Colleges (2002). Accreditation standards.
Retrieved from
http://www.accjc.org/wpcontent/uploads/2011/01/ACCJC_WASC_ACCRED
ITATION_STANDARDS2011.pdf
Accrediting Commission for Community Colleges and Junior Colleges Western
Association of Schools and Colleges (2006). Accreditation standards
annotated for continuous quality improvement and slos. Retrieved from
http://www.accjc.org/wpcontent/uploads/2011/01/Standards_Annotated_for_
Boards_CQI_and_SLOs2011.pdf
Accrediting Commission for Community and Junior Colleges, Western Association
of Schools and Colleges (2008). Distance learning manual. Retrieved from
http://www.collegeofthedesert.edu/Accreditation/Documents/Self-
Study%20Resources/Accreditation/Distance%20Learning%20Manual%20Au
gust%202008.pdf
Accrediting Commission for Community Colleges and Junior Colleges Western
Association of Schools and Colleges (2010). Guide to evaluating distance
education and correspondence education. Retrieved from
http://www.accjc.org/wp-content/uploads/2010/09/Guide-to-Evaluating-
Distance-Education.pdf
Allen, E., & Seaman, J. (2005). Growing by degrees: Online education in the United
States, 2005. The Sloan Consortium. Retrieved from
http://sloanconsortium.org/publications/survey/pdf/growing_by_degrees.pdf
Allen, E. & Seaman, J. (2007). Online nation: Five years of growth in online
learning. The Sloan Consortium.
Allen, E. & Seaman, J. (2010). Class differences—online education in the United
States, 2010. The Sloan Consortium. Retrieved from
http://sloanconsortium.org/publications/survey/class_differences
Almala, A. (2006). The community college leadership perspectives of quality e-
learning. Distance Learning, 3, 9-14.
Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teacher
presence in a computer conferencing context. Journal of Asynchronous
Learning Networks, 5(2), 1-15.
Archambault, L. M. & Barnett, J. H. (2010). Revisiting technological pedagogical
knowledge: Exploring the TPACK framework. Computers & Education, 55
(4), 1656-1662.
Archambault, L. M., & Crippen, K. (2009). Examining TPACK among K-12 online
distance educators in the United States. Contemporary Issues in Technology
and Teacher Education, 9 (1), 71-88.
Association for Career and Technical Education (2010). Career and technical
education’s role in American competitiveness. Retrieved from
http://www.acteonline.org/uploadedFiles/Publications_and_Online_Media/fil
es/Guidance_issuebrief.pdf
Baldwin, T. T., Magjuka, R. J., & Loher, B.T. (1991). The perils of participation:
Effects of choice of training on trainee motivation and learning. Personnel
Psychology, 44(1), 51-65.
Barker, A. (2003). Faculty development for teaching online: Educational and
technological issues. Journal of Continuing Education in Nursing, 34 (6),
273-278.
Barnett, M. (2002, April) Issues and trends concerning electronic networking
technologies for teacher professional development: A critical review of the
literature. Paper presented at the American Educational Research
Association, New Orleans, LA.
Beder, H. & Darkenwald, G. (1982). Differences between teaching adults and pre-
adults: Some propositions and findings. Adult Education, 32 (3), 142-155.
Borko, H. (2004). Professional development and teacher learning: Mapping the
terrain. Educational Researcher, 33(8), 1-49.
Brookfield, S. (1984). Self-directed adult learning: A critical paradigm. Adult
Education Quarterly, 35 (3), 59-71.
Brookfield, S. (1986). Understanding and facilitating adult learning. San Francisco,
CA: John Wiley and Sons.
Caffarella, R. S. & Caffarella, E. P. (1986). Self-directedness and learning contracts
in adult education. Adult Education Quarterly, 36 (4), 226-234.
Callan, P. M., & Atwell, R. H. (2009, April 7). History we can’t afford to repeat.
Inside Higher Ed. Retrieved from
http://www.insidehighered.com/views/2009/04/07callan
Candy, P. C. (1991). Self-direction for lifelong learning. San Francisco, CA: Jossey-Bass.
CAST (2008). Universal design for learning guidelines version 1.0. Wakefield, MA:
Author.
Chickering, A., & Gamson, Z. (1987). Seven principles for good practice in
undergraduate education. AAHE Bulletin, 39 (7), 3-6.
Chickering, A. & Ehrmann, S. (1996). Implementing the seven principles:
Technology as lever. AAHE Bulletin, 49(2), 3-6. Retrieved from
http://www.tltgroup.org/programs/seven.html
Clark, C. S., Dobbins, G. H., & Ladd, R. T. (1993). Exploratory field study of
training motivation. Group and Organization Management, 18(3), 292-307.
Clark, R. E. (2004). See the forest, tend the trees: Analyzing and solving
accountability problems. UrbanEd, 20-22.
Clark, R. E. & Estes, F. (2008). Turning research into results: A guide to selecting
the right performance solutions. Charlotte, NC: Information Age Publishing,
Inc.
Council for Higher Education Accreditation (2000). Distance learning in higher
education. CHEA Update #3. Retrieved from
http://www.eric.edu.gov/PDFS/ED446560.pdf
Creswell, J. W. (2003). Research design-qualitative, quantitative, and mixed
methods approaches (2nd ed.). Thousand Oaks: Sage Publications.
Creswell, J. W. (2008). Research design-qualitative, quantitative, and mixed
methods approaches (3rd ed.). Thousand Oaks: Sage Publications.
Davenport, J., & Davenport, J. (1985). A chronology and analysis of the andragogy
debate. Adult Education Quarterly, 35(3), 152–159.
Dede, C., Jass Ketelhut, D., Whitehouse, P., Breit, L., & McCloskey, E. M. (2009). A
research agenda for online teacher professional development. Journal of
Teacher Education, 60, 8-19.
Desimone, L. M. (2009). Improving impact studies of teachers’ professional
development: Toward better conceptualizations and measures. Educational
Researcher, 38(3), 181-199.
Fowler, F. J. (2002). Survey research methods. Thousand Oaks, CA: Sage
Publications.
Friedman, T. (2005). The world is flat: A brief history of the twenty-first century.
New York: Farrar, Straus, and Giroux.
Garet, M., Porter, A., Desimone, L., Birman, B., & Yoon, K. (2001). What makes
professional development effective? Results from a national sample of
teachers. American Educational Research Journal, 38(4), 915-945.
Garrison, D. R. (1997). Self-directed learning: Toward a comprehensive model.
Adult Education Quarterly, 48 (18), 18-33.
Gibbons, H. & Wentworth, G. (2001). Andragogical and pedagogical training
differences for online instructors. Online Journal of Distance Learning
Administration, 4 (3). Retrieved from
http://www.westga.edu/~distance/ojdla/fall43/gibbons_wentworth43.html
Gorham, J. (1985). Differences between teaching adults and pre-adults: A closer
look. Adult Education Quarterly, 35 (4), 194-209.
Grace, A. P. (1985). Striking a critical pose: Andragogy-missing links, missing
values. International Journal of Lifelong Education, 15, 382-392.
Grant, M. M. (2004). Learning to teach with the web: Factors influencing teacher
education faculty. The Internet and Higher Education, 7 (4), 329-341.
Grow, G. O. (1991). Teaching learners to be self-directed. Adult Education
Quarterly, 41 (3), 125-149.
Guskey, T. (2000). Evaluating professional development. Thousand Oaks, CA:
Corwin Press, Inc.
Guskey, T. (2009). Closing the knowledge gap on effective professional
development. Educational Horizons, 89 (4), 224-233.
Hattori, M. (2011). Distance education Developments 2007-2010--Center for
Excellence in Learning, Teaching and Technology. Unpublished manuscript.
Hicks, W. D., & Klimoski, R. J. (1987). Entry into training programs and its effects
on training outcomes: A field experiment. Academy of Management Journal,
30(3), 542-552.
Higher Education Program and Policy Council of the American Federation of
Teachers. (2000). Distance education--Guidelines for good practice (No. 36-0693).
Washington, DC. Retrieved January 23, 2011, from
https://laulima.hawaii.edu/access/content/group/MAN.1248.201113/Readings
/AFTGuidelinesForGoodPractice.pdf.
Hill, H. C. (2009). Fixing teacher professional development. Phi Delta Kappan,
90(7), 470-476. Retrieved from
http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=36801406
&site=ehost-live
Holzer, H., & Lerman, R. (2009). The future of middle-skill jobs. Washington, DC:
Brookings Institution. Retrieved from
http://www.brookings.edu/~/media/Files/rc/papers/2009/02_middle_skill_job
s_holzer/02_middle_skill_jobs_holzer.pdf.
Johnsrud, L. K., VP for academic planning & policy. Paper presented at the Planning
for Hawai'i's Future--Raising Hawai'i's Competitive Edge, Windward
Community College. Retrieved from
http://www.hawaii.edu/offices/app/aa/systemwide_committees/counselors/fin
alaatnlinda101510_r3.pdf
Johnstone, J. W. C. (1963). The educational pursuits of American adults. Adult
Education Quarterly, 13 (4), 217-221.
Keengwe, J., Kidd, T., & Kyei-Blankson, L. (2009). Faculty and technology:
Implications for faculty training and technology leadership. Journal of
Science Educational Technology, 18, 23-28.
Killion, J. (2008). Assessing impact. Thousand Oaks: Corwin Press.
Kirkpatrick, D. L. (1994). Evaluating training programs—the four levels. San
Francisco, CA: Berrett-Koehler Publishers, Inc.
Kirkpatrick, D. L. & Kirkpatrick, J.D. (2006). Evaluating training programs—the
four levels. San Francisco: Berrett-Koehler Publishers, Inc.
Kirkpatrick, D. L. & Kirkpatrick, J. D. (2007). Implementing the four levels—a
practical guide for effective evaluation of training programs. San Francisco:
Berrett-Koehler Publishers, Inc.
Knowles, M. S. (1975). Self-directed learning: A guide for learners and teachers.
Chicago: Follett.
Knowles, M. S. (1988). The adult learner: A neglected species. Houston: Gulf.
Knowles, M. S. (1989). The making of an adult educator. San Francisco: Jossey-Bass.
Knowles, M. S., Holton III, E. F., & Swanson, S. A. (2005). The adult learner—the
definitive classic in adult education and human resource development.
Massachusetts: Elsevier.
Koehler, M. J., & Mishra, P. (2005). What happens when teachers design educational
technology? The development of technological pedagogical content
knowledge. Journal of Educational Computing Research, 32(2), 131-152.
Koehler, M. J., & Mishra, P. (2008). Introducing Technological Pedagogical
Knowledge. In AACTE (Eds.). The Handbook of Technological Pedagogical
Content Knowledge for Educators. Routledge/Taylor & Francis Group for the
American Association of Colleges of Teacher Education.
Koehler, M. J. & Mishra, P. (2009). What is technological pedagogical content
knowledge? Contemporary Issues in Technology and Teacher Education, 9
(1), 60-70.
Koehler, M. J., Mishra, P., Hershey, K., & Peruski, L. (2004). With a little help from
your students: A new model for faculty development and online course
design. Journal of Technology and Teacher Education, 12(1), 25-55.
Koehler, M. J., Mishra, P., & Yahya, K. (2007). Tracing the development of teacher
knowledge in a design seminar: Integrating content, pedagogy and
technology. Computers & Education, 49, 740-762.
Kosak, Manning, Dobson, Rogerson, Contnam, Colaric, & McFadden (2004).
Prepared to teach online? Perspectives of faculty in the University of North
Carolina System. Online Journal of Distance Learning Administration, 7 (3).
Kurpius, S. E. R. & Stafford, M. E. (2006). Testing and measurement—A user
friendly guide. Thousand Oaks: Sage Publications.
Lowenthal, P. R., & Leech, N. (in press). Mixed research and online learning:
Strategies for improvement. To appear in T.T. Kidd (Ed.), Online education
and adult learning: New frontiers for teaching practices. Hershey, P.A: IGI
Global.
Lucas, G. (2010). Education nation. San Francisco: Jossey-Bass.
Mayer, R. (2008). Learning and instruction. New Jersey: Pearson Prentice Hall.
McEwan, E. K. & McEwan P. J. (2003). Making sense of research, what’s good,
what’s not, and how to tell the difference. Thousand Oaks: Corwin Press.
Merriam, S. B. (1993). Adult learning: Where have we come from? Where are we
headed? New Directions for Adult and Continuing Education, 57, 5-14.
Merriam, S. (2001). Andragogy and self-directed learning: Pillars of adult learning
theory. New Directions for Adult and Continuing Education, 89, 3-13.
Merriam, S., Caffarella, R., & Baumgartner, L. (2007). Learning in adulthood: A
comprehensive guide. San Francisco, CA: Jossey-Bass.
Mezirow, J. (1991). Transformative dimensions of adult learning. San Francisco:
Jossey-Bass.
Mishra, P. & Koehler, M. J. (2006). Technological pedagogical content knowledge:
A framework for teacher knowledge. Teachers College Record, 108(6), 1017-
1054.
Moller, L., Foshay, W. R., & Huett, J. (2008). The evolution of distance education:
Implications for instructional design on the potential of the web. TechTrends,
52(4), 66-70.
Moloney, J. & Oakley II, B. (2006). Scaling online education: Increasing access to
higher education. Journal of Asynchronous Learning Networks, 10(3).
Moon, D., Michelich, V., & McKinnon, S. (2005). Blow away the competition:
Explosive best practices for cost-effective excellence in distance education.
Community College Journal of Research and Practice, 29 (8).
National Center for Education Statistics (2010). Persistence and attainment of 2003–
04 beginning postsecondary students: After 6 years-- first look. Retrieved
January 23, 2011 from http://nces.ed.gov/pubs2011/2011151.pdf
National Center for Public Policy and Higher Education (2008). Measuring up 2008,
the national report card on higher education. San Jose, CA.
Newton, E. S. (1977). Andragogy: Understanding the adult learner. International
Reading Association, 20 (5), 361-363.
Online Community College Center for Excellence in Learning, Teaching and
Technology. (November 17, 2009). Career & technical education program
improvement and leadership grant. Unpublished manuscript.
Online Community College Distance Learning Certification Program (2011).
Retrieved from https://laulima.hawaii.edu/portal/site/456da144-6216-437d-
80de-b1a4ade3b8dd
Organization for Economic Cooperation and Development. (2011). The case for
21st-century learning. Retrieved from
http://www.oecd.org/document/2/0,3746,en_2649_201185_46846594_1_1_1
_1,00.html
Orr, R., Williams, M. R., & Pennington, K. (2009). Institutional efforts to support
faculty in online teaching. Innovative Higher Education, 34, 257-268.
Pagliari, L., Batts, D., & McFadden, C. (2009). Desired versus actual training for
online instructors in community colleges. Online Journal of Distance
Learning Administration, 12 (4). Retrieved from
http://www.westga.edu/~distance/ojdla124.html
Palloff, R. M. & Pratt, K. (2007). Building online learning communities. San
Francisco: Jossey Bass.
Palloff, R. M. & Pratt, K. (2011). The excellent online instructor-strategies for
professional development. San Francisco, CA: John Wiley & Sons, Inc.
Pankowski, P. (2004). Faculty training for online teaching. The Journal. Retrieved from
http://thejournal.com/Articles/2004/09/01/Faculty-Training-for-Online-
Teaching.aspx?p=1
Patton, M. Q. (2002). Qualitative research and methods. Thousand Oaks: Sage
Publications.
Pratt, D. D. (1993). Andragogy after twenty-five years. New Directions for Adult and
Continuing Education, 57, 15-23.
Roman, T., Kelsey, K., & Lin, H. (2010). Enhancing online education through
instructor skill development in higher education. Online Journal of Distance
Learning Administration, 13(4). Retrieved from
http://www.westga.edu/~distance/ojdla/winter134/roman_kelsey134.html
Salkind, N. J. (2011). Statistics for people who think they hate statistics. Thousand
Oaks, CA: Sage Publications.
Schmidt, D. A., Baran, E., Thompson, A. D., Mishra, P., Koehler, M. J., & Shin, T. S.
(2009). Technological pedagogical content knowledge (TPACK): The
development and validation of an assessment instrument for preservice
teachers. Journal of Research on Technology in Education, 42(2), 123-149.
Shepherd, C., Alpert, M., & Koeller, M. (2008). Increasing the efficacy of educators
teaching online. International Journal of Social Sciences, 2 (3), 173-179.
Sherry, L., Billig, S. H., Tavalin, F., & Gibson, D. (2000). New insights on
technology adoption in schools. The Journal. Retrieved from
http://thejournal.com/articles/2000/02/01/new-insights-on-technology-
adoption-in-schools.aspx?sc_lang=en
Simonson, M. (2007). Accreditation and quality in distance education. Distance
Learning, 4 (3), 87-88.
Tannenbaum, S. I., Mathieu, J. E., Salas, E., & Cannon-Bowers, J. A. (1991).
Meeting trainees' expectations: The influence of training fulfillment on the
development of commitment, self-efficacy, and motivation. Journal of
Applied Psychology, 76(6), 759-769.
Taylor, A., & McQuiggan, C. (2008). Faculty development programming: If we build
it, will they come? Educause Quarterly, 31 (3), 29-37.
Terehoff, I. I. (2002). Elements of adult learning in teacher professional development.
National Association of Secondary School Principals, NASSP Bulletin, 82(632),
64-77.
The Higher Learning Commission (2001). Best practices for electronically offered
degree and certificate programs. Retrieved from
http://www.ncahlc.org/information-for-institutions/publications.html
The Institute for Higher Education Policy (2000). Quality on the line: Benchmarks
for success in Internet-based distance education. Retrieved from
http://www.ihep.org/assets/files/publications/m-r/QualityOnTheLine.pdf
Thorne, E. H. & Marshall, J. L. (1976). Managerial skills development: An
experience in program design. Personnel Journal, 55, (15).
Tough, A. M. (1966). The assistance obtained by adult self-teachers. Adult
Education Quarterly, 17 (30), 30-37.
Tough, A. M. (1967). Learning without a teacher: A study of tasks and assistance
during adult self-teaching projects. Ontario: The Ontario Institute for Studies
in Education.
Tough, A. M. (1971, 1979). The adult’s learning projects: A fresh approach to
theory and practice in adult learning. Ontario: The Ontario Institute for
Studies in Education.
Tough, A. M. (1982). Intentional changes: a fresh approach to helping people
change. Chicago: Follett.
United States Bureau of Labor Statistics (2010). Occupational outlook handbook,
2010-11 edition, overview of the 2008-18 projections. Retrieved from
http://www.bls.gov/oco/oco2003.htm
United States Department of Education (2006). A test of leadership: Charting the
future of U.S. higher education. Washington, D.C.
University of Hawaii Community College (2005). University of Hawaii Community
Colleges’ Policy UHCCP #5.202. Retrieved from
http://ofie.kcc.hawaii.edu/index.php?option=com_content&view=article&id=
53&Itemid=67
University of Hawaii Community Colleges (2008). Strategic outcomes and
performance measures 2008-2015. Retrieved from
http://www.hawaii.edu/offices/cc/strategicplan/Appendix_B_UHCC_Strategi
c_Outcomes_and_Performance_Measures_2008_2015%20.pdf
Varvel, V. (2007). Master online teacher competences. Online Journal of Distance
Learning Administration, 10 (1). Retrieved from
http://www.westga.edu/~distance/ojdla/spring101/varvel101.htm
Western Association of Schools and Colleges Accrediting Commission for Senior
Colleges and Universities (2006). Guidelines for the evaluation of distance
education. Retrieved from
http://www.wascsenior.org/findit/files/forms/C_RAC_Distance_ed_guideline
s_7_31_2009.doc
Wilson, C. (2001). Faculty attitudes about distance learning. Educause Quarterly, 24
(2), 10-71. Retrieved from http://net.educause.edu/ir/library/pdf/eqm0128.pdf
Wolf, P. D. (2006). Best practices in the training of faculty to teach online. Journal
of Computing in Higher Education, 17(2), 47-78.
Yang, Y. & Cornelius, L. F. (2005). Preparing instructors for quality online
instruction. Online Journal of Distance Learning Administration, 8 (1).
Retrieved from http://www.westga.edu/~distance/ojdla/spring81/yang81.htm
APPENDIX A
DISTANCE LEARNING CERTIFICATION PROGRAM
APPENDIX B
SURVEY (PRE-SURVEY)
SURVEY INSTRUMENT: Learning to Teach Online
Introduction
Thank you in advance for sharing your views and experiences regarding teaching
online.
Research suggests that effective professional development is a key piece in providing
faculty with the tools and strategies for improving outcomes for students. We are
interested in learning more about your experience with the training to prepare you to
teach online.
We appreciate hearing your views. Your responses are anonymous. The
information below will help us to understand how faculty’s perceptions may vary.
Knowing more about your views and experience can help strengthen training and
resources for faculty on this very important subject.
By completing this survey, you agree to be a participant. Thank you for
participating!
A. Participant Information
A1. In the box below please enter the year of your birth followed by the last four
digits of your phone number (YYYYPPPP).
For example, if your birth year is 1976 and last four digits of your phone
number are 1234, please enter 19761234.
A2. Gender
o Female
o Male
A3. Age
(in years)
A4. How many years have you taught in higher education?
A5. How many years have you taught online?
A6. At which University of Hawaii Community College campus are you
affiliated?
o Hawaii Community College
o Honolulu Community College
o Kapi’olani Community College
o Kauai Community College
o Leeward Community College
o Maui College
o Windward Community College
B. Teaching Online Knowledge
B1. How would you rate your own knowledge in doing the following tasks
associated with teaching in a distance education setting?
For each of the statements below, please indicate your level of knowledge in the
following areas. If you feel your knowledge is poor in a particular area, please
indicate (1). If you feel your knowledge in a particular area is fair, please indicate
(2). If you feel your knowledge in a particular area is good, please indicate (3). If you
feel your knowledge in a particular area is very good, please indicate (4) and if you
feel it is excellent, please indicate (5).
Item #          Poor (1)   Fair (2)   Good (3)   Very Good (4)   Excellent (5)
a. My ability to troubleshoot technical problems associated with
hardware (e.g., network connections).
1 2 3 4 5
b. My ability to create materials that map to specific course
objectives and professional competencies.
1 2 3 4 5
c. My ability to use a variety of teaching strategies to relate various
concepts to students.
1 2 3 4 5
d. My ability to decide on the scope of concepts taught within my
class.
1 2 3 4 5
e. My ability to use online student assessment to modify instruction. 1 2 3 4 5
f. My ability to distinguish between correct and incorrect problem
solving attempts by students within my discipline.
1 2 3 4 5
g. My ability to address various computer issues related to software
(e.g., downloading appropriate plug-ins, installing programs).
1 2 3 4 5
h. My ability to create an online environment which allows students
to build new knowledge and skills.
1 2 3 4 5
i. My ability to anticipate likely student misconceptions within a
particular topic.
1 2 3 4 5
j. My ability to determine a particular strategy best suited to teach a
specific concept.
1 2 3 4 5
k. My ability to use technology to predict students'
skill/understanding of a particular topic.
1 2 3 4 5
l. My ability to implement different methods of teaching online. 1 2 3 4 5
m. My ability to plan the sequence of concepts taught within my
class.
1 2 3 4 5
n. My ability to moderate online interactivity among students. 1 2 3 4 5
o. My ability to use technological representations (i.e. multimedia,
visual demonstrations, etc) to demonstrate specific concepts in
my content area.
1 2 3 4 5
p. My ability to encourage online interactivity among students. 1 2 3 4 5
q. My ability to assist students with troubleshooting technical
problems with their personal computers.
1 2 3 4 5
r. My ability to adjust teaching methodology based on student
performance/feedback.
1 2 3 4 5
s. My ability to comfortably produce lesson plans to engage
students in the topic.
1 2 3 4 5
t. My ability to implement curriculum in an online environment. 1 2 3 4 5
u. My ability to assist students in noticing connections between
various concepts in a curriculum.
1 2 3 4 5
v. My ability to use various courseware programs to deliver
instruction (e.g., Laulima, Blackboard Collaborate/Elluminate).
1 2 3 4 5
w. My ability to use technology to create effective representations of
content that depart from textbook knowledge.
1 2 3 4 5
x. My ability to meet the overall demands of online teaching. 1 2 3 4 5
C. Teaching Online Preparedness
C1. How prepared do you feel you are for an online teaching experience?
o Not at all          o Somewhat          o Quite a Bit          o Completely
C2. Please read the following statements, and then indicate how prepared you
are for the following online teaching tasks.
Item #          Not at all (1)   Somewhat (2)   Quite a bit (3)   Completely (4)
a. Create an online syllabus 1 2 3 4
b. Create an online course orientation 1 2 3 4
c. Facilitate/moderate an online classroom 1 2 3 4
d. Use asynchronous communication tools such as discussion
boards and multimedia
1 2 3 4
e. Use synchronous communication tools such as Blackboard
Collaborate/Elluminate
1 2 3 4
f. Manage online assignments such as grading and providing
timely student feedback
1 2 3 4
g. Utilize online instructional pedagogy such as acknowledging
students' perspective, varying instruction, and managing
problem behavior
1 2 3 4
h. Create an online assessment 1 2 3 4
i. Apply universal design for learning in course content 1 2 3 4
j. Prevent plagiarism or cheating in an online classroom 1 2 3 4
k. Utilize the Laulima learning platform 1 2 3 4
l. Generate online statistical reports 1 2 3 4
m. Teaching strategies for hybrid courses 1 2 3 4
APPENDIX C
SURVEY (POST-SURVEY)
SURVEY INSTRUMENT: Learning to Teach Online
Introduction
Thank you in advance for sharing your views and experiences regarding teaching
online.
Research suggests that effective professional development is a key piece in providing
faculty with the tools and strategies for improving outcomes for students. We are
interested in learning more about your experience with the training to prepare you to
teach online.
We appreciate hearing your views. Your responses are anonymous. The
information below will help us to understand how faculty’s perceptions may vary.
Knowing more about your views and experience can help strengthen training and
resources for faculty on this very important subject.
By completing this survey, you agree to be a participant. Thank you for
participating!
A. Participant Information
A1. In the box below please enter the year of your birth followed by the last four
digits of your phone number (YYYYPPPP).
For example, if your birth year is 1976 and last four digits of your phone
number are 1234, please enter 19761234.
A2. Gender
o Female
o Male
A3. Age
(in years)
A4. How many years have you taught in higher education?
A5. How many years have you taught online?
A6. At which University of Hawaii Community College campus are you
affiliated?
o Hawaii Community College
o Honolulu Community College
o Kapi’olani Community College
o Kauai Community College
o Leeward Community College
o Maui College
o Windward Community College
A7. I acknowledge that I completed the distance learning certification program.
o Yes
o No
A8. If you answered “No” in question A7, please explain why you were unable
to complete the program.
B. Teaching Online Knowledge
B1. How would you rate your own knowledge in doing the following tasks
associated with teaching in a distance education setting?
For each of the statements below, please indicate your level of knowledge in the
following areas. If you feel your knowledge is poor in a particular area, please
indicate (1). If you feel your knowledge in a particular area is fair, please indicate
(2). If you feel your knowledge in a particular area is good, please indicate (3). If you
feel your knowledge in a particular area is very good, please indicate (4) and if you
feel it is excellent, please indicate (5).
Item #          Poor (1)   Fair (2)   Good (3)   Very Good (4)   Excellent (5)
a. My ability to troubleshoot technical problems associated with
hardware (e.g., network connections).
1 2 3 4 5
b. My ability to create materials that map to specific course
objectives and professional competencies.
1 2 3 4 5
c. My ability to use a variety of teaching strategies to relate
various concepts to students.
1 2 3 4 5
d. My ability to decide on the scope of concepts taught within
my class.
1 2 3 4 5
e. My ability to use online student assessment to modify
instruction.
1 2 3 4 5
f. My ability to distinguish between correct and incorrect
problem solving attempts by students within my discipline.
1 2 3 4 5
g. My ability to address various computer issues related to
software (e.g., downloading appropriate plug-ins, installing
programs).
1 2 3 4 5
h. My ability to create an online environment which allows
students to build new knowledge and skills.
1 2 3 4 5
i. My ability to anticipate likely student misconceptions within a
particular topic.
1 2 3 4 5
j. My ability to determine a particular strategy best suited to
teach a specific concept.
1 2 3 4 5
k. My ability to use technology to predict students'
skill/understanding of a particular topic.
1 2 3 4 5
l. My ability to implement different methods of teaching online. 1 2 3 4 5
m. My ability to plan the sequence of concepts taught within my
class.
1 2 3 4 5
n. My ability to moderate online interactivity among students. 1 2 3 4 5
o. My ability to use technological representations (i.e. multimedia,
visual demonstrations, etc) to demonstrate specific concepts in
my content area.
1 2 3 4 5
p. My ability to encourage online interactivity among students. 1 2 3 4 5
q. My ability to assist students with troubleshooting technical
problems with their personal computers.
1 2 3 4 5
r. My ability to adjust teaching methodology based on student
performance/feedback.
1 2 3 4 5
s. My ability to comfortably produce lesson plans to engage
students in the topic.
1 2 3 4 5
t. My ability to implement curriculum in an online environment. 1 2 3 4 5
u. My ability to assist students in noticing connections between
various concepts in a curriculum.
1 2 3 4 5
v. My ability to use various courseware programs to deliver
instruction (e.g., Laulima, Blackboard
Collaborate/Elluminate).
1 2 3 4 5
w. My ability to use technology to create effective representations
of content that depart from textbook knowledge.
1 2 3 4 5
x. My ability to meet the overall demands of online teaching. 1 2 3 4 5
C. Teaching Online Preparedness
C1. How prepared do you feel you are for an online teaching experience?
(1) Not at all (2) Somewhat (3) Quite a Bit (4) Completely
C2. Please read the following statements, and then indicate how prepared you
are for the following online teaching tasks.
Item #          Not at all (1)   Somewhat (2)   Quite a bit (3)   Completely (4)
a. Create an online syllabus 1 2 3 4
b. Create an online course orientation 1 2 3 4
c. Facilitate/moderate an online classroom 1 2 3 4
d. Use asynchronous communication tools such as discussion boards
and multimedia
1 2 3 4
e. Use synchronous communication tools such as Blackboard
Collaborate/Elluminate
1 2 3 4
f. Manage online assignments such as grading and providing timely
student feedback
1 2 3 4
g. Utilize online instructional pedagogy such as acknowledging
students' perspective, varying instruction, and managing problem
behavior
1 2 3 4
h. Create an online assessment 1 2 3 4
i. Apply universal design for learning in course content 1 2 3 4
j. Prevent plagiarism or cheating in an online classroom 1 2 3 4
k. Utilize the Laulima learning platform 1 2 3 4
l. Generate online statistical reports 1 2 3 4
m. Teaching strategies for hybrid courses 1 2 3 4
C3. How many optional webinars or in-person seminars did you attend?
(1) None (2) One (3) Two (4) Three or more
C4. How many times did you require face-to-face assistance from CELTT?
(1) None (2) One (3) Two (4) Three or more
C5. Now that you have participated in the distance learning certification
program, please indicate your level of interest in teaching an online course?
(1) Less interested (2) Same (3) More interested
C6. Now that you have participated in the distance learning certification
program, please indicate your level of interest in designing an online course?
(1) Less interested (2) Same (3) More interested
C7. Which three online professional development activities did you find most
useful in preparing you for online teaching?
o Create an online syllabus
o Create an online course orientation
o Facilitate/moderate an online classroom
o Use asynchronous communication tools such as discussion boards and
multimedia
o Use synchronous communication tools such as Blackboard
Collaborate/Elluminate
o Manage online assignments such as grading and providing timely student
feedback
o Utilize online instructional pedagogy such as acknowledging students’
perspective, varying instruction, and managing problem behavior
o Create an online assessment
o Apply universal design for learning in course content
o Prevent plagiarism or cheating in an online class
o Utilize the Laulima learning platform
o Generate online statistical reports
o Teaching strategies for hybrid courses
o Other:
C8. Additional comments: We invite you to use the space below to comment on
the distance education certification program’s effectiveness in preparing you to
teach online.