THE RELATIONSHIP BETWEEN STUDENTS’ COMPUTER SELF-EFFICACY,
SELF-REGULATION, AND ENGAGEMENT IN DISTANCE LEARNING
by
Brandon David Martinez
A Dissertation Presented to the
FACULTY OF THE ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
August 2009
Copyright 2009 Brandon David Martinez
DEDICATION
For my wife Lisa, who is the reason I’ve gotten this far; my three sons, Dylan,
Jack, and Cruz for their patience, understanding, and love; and my parents who have
supported me in my every endeavor.
ACKNOWLEDGEMENTS
To my dissertation chair, Dr. Robert Rueda, my committee members, Dr. Harry
O’Neil and Dr. Albert “Skip” Rizzo, my wife, sons, family, and friends, I give my utmost
thanks for your guidance and encouragement through this experience.
TABLE OF CONTENTS
Dedication ...........................................................................................................................ii
Acknowledgements ............................................................................................................iii
List of Tables...................................................................................................................... vi
Abstract .............................................................................................................................vii
Chapter 1: Introduction ....................................................................................................... 1
Research Questions .......................................................................................................... 5
Definition of Terms.......................................................................................................... 6
Delimitations of the Study................................................................................................ 8
Assumptions..................................................................................................................... 8
Limitations of the Study................................................................................................... 8
Significance of the Study ................................................................................................. 9
Chapter 2: Literature Review ............................................................................................ 10
Introduction .................................................................................................................... 10
Organization of Literature Review................................................................................. 10
Sources Searched............................................................................................................ 11
Review of the Literature................................................................................................. 11
Self-efficacy and Computer Self-efficacy............................................................. 11
Self-regulation and Distance Learning.................................................................. 24
Student Engagement and Distance Learning ........................................................ 35
Expectancy Outcome............................................................................................. 39
Chapter 3: Methodology.................................................................................................... 41
Research Questions ........................................................................................................ 41
Design Summary............................................................................................................ 42
Participants..................................................................................................................... 43
Measures......................................................................................................................... 46
Student Self-Regulation Measure.......................................................................... 46
Student Computer Self-Efficacy Measure............................................................. 51
Student Engagement Measure............................................................................... 52
Student Expected Achievement Measure.............................................................. 56
Procedure........................................................................................................................ 58
Data Analysis ................................................................................................................. 59
Chapter 4: Results ............................................................................................................. 61
Correlation...................................................................................................................... 61
Summary ........................................................................................................................ 77
Chapter 5: Discussion........................................................................................................ 79
Limitations of the Study................................................................................................. 79
Synthesizing the Results................................................................................................. 81
Implications.................................................................................................................... 88
Future Research.............................................................................................................. 89
Conclusions .................................................................................................................... 91
References ......................................................................................................................... 93
Appendix A ....................................................................................................................... 97
Appendix B ....................................................................................................................... 98
Appendix C ..................................................................................................................... 104
Appendix D ..................................................................................................................... 108
Appendix E...................................................................................................................... 112
LIST OF TABLES
Table 1: Grade Levels of Students in Online and Blended Courses ................................... 3
Table 2: Description of Sample......................................................................................... 45
Table 3: Factor Analysis for Student Self-Regulation Survey.......................................... 49
Table 4: Factor Analysis for Student Engagement Survey ............................................... 54
Table 5: Correlation between Computer Self-Efficacy, Self-Regulation, Engagement
and Expected Achievement............................................................................................... 62
Table 6: Effect Decompositions on English Achievement ............................................... 67
Table 7: Effect Decompositions on Social Science Achievement .................................... 70
Table 8: Effect Decompositions on Math Achievement ................................................... 73
Table 9: Effect Decompositions on Science Achievement ............................................... 76
ABSTRACT
This study examines the relationship between computer self-efficacy, self-
regulation, engagement, and expected achievement for students in distance learning. Prior
research on these variables is robust; however, studies specifically on distance learning at
the high school level are sparse. While the bulk of research on distance learning has
focused on higher education and has mostly looked at the role of self-efficacy and self-
regulation, it was predicted that there would be a significant relationship between
computer self-efficacy, self-regulation, and engagement at the high school level. To
determine whether such a relationship existed, students (N=194) were given an online survey
that consisted of items from the Web Users Self-Efficacy scale (Eachus & Cassidy,
2006), designed to measure their computer self-efficacy; a modified version of the self-
regulation subscale from the Motivated Strategies for Learning Questionnaire (Pintrich &
De Groot, 1990), designed to measure their self-regulation; and a modified version of the
Feelings About School Inventory (Fredricks, Blumenfeld, Friedel, & Paris, 2005),
designed to measure their behavioral, emotional, and cognitive engagement. The results
of the analysis indicated that there was a significant relationship between students’ self-
regulation and their engagement and expected achievement in distance learning.
CHAPTER 1: OVERVIEW OF THE STUDY
Introduction and Statement of the Research Problem
It is now possible for an individual to complete every school grade from
kindergarten through a doctorate without ever setting foot on the physical campus of
the sponsoring institution. Moreover, it is now common for a student to participate in this
type of learning without ever meeting the teacher or classmates in person. Distance
learning has become one of the fastest growing sectors in the education field. Whereas
traditional “brick and mortar” institutions require students to physically attend class on a
campus, the Internet has made it possible for anyone with access to a computer to enroll
in and complete courses without ever visiting the campus of the school in which they are
enrolled.
Distance learning has been particularly beneficial for students who live great
distances from the nearest institution or who, because of work constraints, cannot attend classes
at the scheduled times. Someone who lives in a rural part of the country may reside a few
hundred miles from the nearest campus, and driving this distance on a weekly basis would
not be practical for completing coursework. As well, a person who works from 9:00 a.m.
until 5:00 p.m., five days a week, would not be able to attend a class scheduled from
11:00 a.m. until 1:00 p.m. three days a week. Currently, the majority of research in the
literature focuses upon higher education learners, leaving scant information about the K-
12 setting.
One of the areas of focus for this dissertation is to understand how distance
learners at the K-12 setting are motivated to engage in their learning. Hasan (2003), Lee
and Lee (2008), and Tuckman (2007) have examined the role of motivation in the higher
education learner and how self-regulation affects engagement. However, there is little
research about K-12 distance learning in general; thus, many opportunities for research
exist.
The number of institutions that offer distance learning increased by 72% between
1995 and 1998 (Wood, 2001) and continues to grow according to a Sloan Consortium
report (Picciano & Seaman, 2007). The number of students enrolled in distance learning
ranged from 328,000 in a 2002-03 estimate by the U.S. Department of Education to as
many as one million based on reports by The Peak Group and WestEd (Boria, 2006).
Regardless of the exact enrollment figure, one of the problems in the literature is that
there is a lack of base data with regard to distance learning enrollment at all grade levels,
making the analysis of enrollment, course completion, and retention difficult. As
enrollment in distance learning at the K-12 levels continues to grow, the need to
understand the motivational characteristics of students and how they engage is of utmost
importance, as these schools are as accountable under No Child Left Behind as
traditional schools. Table 1 shows the number of students in K-12 who are taking at least
one fully online or blended/hybrid class, based on a survey of school district
administrators conducted during the 2005-06 school year. The data is based on the reports
of 366 school districts out of a total of 16,000 school districts in the U.S.
Table 1. The grade levels of students taking either fully online or blended courses.

                 Fully Online        Blended/Hybrid       Total
                 N        %          N        %           N        %
Grades K-5       2733     16%        583      5%          3271     12%
Grades 6-8       1793     10%        3980     36%         5773     20%
Grades 9-12      12625    73%        6519     59%         19144    67%
Other            198      1%         56       1%          254      1%
Total            17349    100%       11093    100%        28442    100%
Source: “K-12 online learning: A survey of U.S. school district administrators,”
by Anthony G. Picciano and Jeff Seaman, 2007, p.8. Copyright 2007 by the Sloan
Consortium.
Despite its perceived novelty, distance learning at the K-12 levels is not immune
to the same problems encountered by traditional K-12 schools. It can be assumed that a
student participating in higher education distance learning has already established a
certain level of ability and self-efficacy. These are students who have decided to
continue their education, have met the minimum entrance requirements to the higher
education institute, and have actively chosen to participate in the selected distance
learning class or classes.
However, the K-12 student is compelled to attend school whether in a
traditional or a virtual setting. Every state in the United States has a compulsory attendance
law, ranging from age five to 18. States such as Arizona and New York require students
to attend from ages six to 16, whereas states such as California and Texas require
attendance between ages six and 18. Since these students—distance learners in grades K-
12—are required to attend, it is important to understand how these students’ self-efficacy
for technology and self-regulation relate to their overall engagement in accessing the
distance learning tools. Two factors compound this issue: the lack of a real-time teacher and
real-time classmates.
The absence of a real-time teacher and real-time classmates makes engagement
difficult for students with low levels of self-regulation (Lee & Lee, 2008). One of the
challenges of research in this area is the fact that students who participate in “virtual
schools” are geographically displaced. Studies conducted at a traditional high school, for
example, allow a researcher a central location for accessing students, teachers, and other
stakeholders at the school. In the distance learning model, it is difficult for a researcher to
meet with students or teachers for the purposes of surveying or interviewing because the
subjects are not bound by time or place. Access to parents, counselors, and administrators
is difficult for the same aforementioned reasons.
Modeling plays a significant role in the development of a learner’s self-efficacy
and self-regulation via the instructor or fellow classmates (Schunk, Pintrich, & Meece,
2008); however, the distance learner is left without an in-person teacher or classmates.
This raises questions about the current design and delivery of curriculum and how a
student might access the instructor, fellow students, or the overall distance learning tools
in order to learn.
Kawachi (2003) examined the features of collaborative and cooperative learning
and how each may affect the intrinsic motivation of a distance learner. While both are
critical to the success of the distance learner, he argues that the distance learning
environment requires an opportunity for students to engage socially with other students.
Without a teacher to monitor or fellow students with whom a learner can interact, self-
handicapping issues like procrastination can not only hinder a distance learner's
achievement, but serve as a devastating blow to the overall development of the learner's
self-regulation levels. Tuckman's (2007) research showed that the levels of procrastination
demonstrated by distance learners can be remedied or controlled by having a teacher
monitor the high-level procrastinator as well as by grouping the student with low-level
procrastinators in order to increase motivation and engagement.
In summary, there are current empirical studies in the distance learning field as
related to motivation, specifically self-efficacy and self-regulation. However, they are
limited to the higher education setting. The current gap in the literature calls for research
in the area of K-12 distance learning as related to the same aforementioned motivational
variables, including engagement. Another significant problem is that with its rapid
growth, so too grow the problems of motivation, engagement, and achievement for the
distance learner, as seen with traditional K-12 learners. The focus of this study will be to
determine what the relationship is between features of the distance learner such as self-
efficacy in technology and self-regulation, and engagement in the virtual school setting.
Research Questions
The purpose of this study is to examine motivational and learning influences on
engagement in distance learning at the high school setting, specifically in regards to self-
efficacy and self-regulation. The overarching research question is: What is the
relationship between students’ computer self-efficacy and self-regulation, and
engagement in distance learning on expected achievement?
The following sub-questions will be addressed as well:
1. What is the relationship between distance education learners’ computer self-
efficacy, self-regulation, and their behavioral, emotional, and cognitive
engagement and expected achievement in English?
2. What is the relationship between distance education learners’ computer self-
efficacy, self-regulation, and their behavioral, emotional, and cognitive
engagement and expected achievement in social science?
3. What is the relationship between distance education learners’ computer self-
efficacy, self-regulation, and their behavioral, emotional, and cognitive
engagement and expected achievement in math?
4. What is the relationship between distance education learners’ computer self-
efficacy, self-regulation, and their behavioral, emotional, and cognitive
engagement and expected achievement in science?
Definition of Terms
For the purposes of this study, distance learning is defined as schooling where
100% of learning takes place via online activity in which students are not bound by time
and place, and one in which non-online activity time with teachers, counselors, or other
school officials is limited to phone calls and once-a-year face-to-face counseling
appointments. This does not include the completion of state-required tests such as the
California High School Exit Exam or the California Subject Tests, which are
administered in real time, or synchronously, by a school official and with other students.
Asynchronous refers to learning in which students and teachers are not
bound by time or place.
Computer self-efficacy refers to an individual’s beliefs about his or her ability to
use a computer to perform a computing task successfully (Compeau & Higgins, 1995;
Hasan, 2003; Karsten & Roth, 1998; Stone & Henry, 2003).
Expected achievement is analogous to expectancy outcome and refers to a
learner’s beliefs about how well he will do on an immediate task or expectations for
future outcomes (Eccles & Wigfield, 2002; Wigfield, 1994); herein it refers to the school
given grade a learner expects to receive in either distance learning English, social
science, math, or science for the Fall 2008 semester.
Engagement comprises three components (J. Fredricks, P. Blumenfeld, & A.
Paris, 2004): behavioral engagement refers to an individual’s participation in school
(Finn, 1993); emotional engagement refers to an individual’s positive or negative feelings
towards school (Finn & Voelkl, 1993; J. Fredricks, et al., 2004); and cognitive
engagement refers to an individual voluntarily exerting effort in order to understand and
master challenging tasks (J. Fredricks, et al., 2004).
Self-efficacy is defined as an individual’s beliefs about his or her abilities to
accomplish a task; it is not concerned with the amount or quality of skills one possesses,
but rather what a person believes he or she can achieve with the skills he or she possesses
(Bandura, 1977).
Self-regulation (Bandura, 1986) is defined as the “process of setting goals for
oneself and engaging in behaviors and cognitive processes that lead to goal completion”
(Ormrod, 2006, p. 347).
Delimitations
This study only focused on the relationship between students’ computer self-
efficacy, self-regulation, engagement and expected achievement in distance learning.
Assumptions
The researcher assumed that the measures for computer self-efficacy, self-
regulation, and engagement were valid and reliable.
Limitations
The study was limited in that the data are based on self-reported, forced-answer
surveys. Some respondents may feel compelled to answer survey questions according to
how the researcher may want them to respond, therefore clouding the participant’s true
beliefs or feelings about a question. This study was limited to studying students enrolled
in grades 9-12 in two virtual schools in the Western and Southern United States. Also,
administrators at the study sites cited stipulations of the Family Educational Rights and
Privacy Act as the reason for not releasing actual school-issued grades; therefore, the study
was limited to self-reported expected achievement.
Moreover, the study was limited by variables such as the diverse physical learning
environment of the distance learners (i.e., wherever the student uses a computer to
complete coursework), socio-economic status, the time of day a student logs on to complete
work, or other significant life circumstances such as being a teen parent, living in a
single-parent home, or other situations. The current body of research on distance learning
has been based on correlational studies and quasi-experiments. However, there is a lack of
true experimental research. Therefore, there is no evidence to support a causal relationship
between self-efficacy and self-regulation, and engagement and expected achievement in
distance learners in general.
Significance of the Study
The purpose of this study was to better understand the motivational factors that
affect the engagement of distance learners at the K-12 setting. The results will benefit the
body of literature that is lacking in base data as well as other studies specifically aimed at
distance learning in the K-12 setting. It is hoped that at the conclusion of the study,
administrators, curriculum specialists, teachers, counselors, parents, and, most
importantly, students will better understand the relationship between their self-efficacy for
technology and self-regulation and their engagement in distance learning.
CHAPTER TWO: LITERATURE REVIEW
Introduction
There are six general problems addressed by this study. First, distance learning is the fastest
growing sector in the field of education, particularly at the 9-12 or high school levels.
The rate of growth—total student enrollment in distance learning—is occurring faster
than the rate of studies designed to examine the general concept of distance learning
(Picciano & Seaman, 2007). Second, as distance learning is the wave of the future in
education, it is important to understand the impact it may have on traditional high
schools. For example, how will traditional schools address loss of attendance revenue,
possible decline in enrollment, and other related issues? Third, there is a need to know
which student factors, such as self-efficacy and self-regulation, make distance learning
workable. Fourth, there is a need to understand differences between distance and
traditional learning and to understand the emotional and social impact distance learning
has on students. Fifth, it is necessary to add to the body of literature that focuses
exclusively on high school distance learning as the body of literature currently focuses on
higher education. Lastly, it is the hope of the researcher to contribute findings that may
lead to a better understanding of the causes of academically engaged and disengaged
distance learners at the high school level.
Organization of the Literature Review
This literature review is organized into the following sections: First, the review
addresses the major literature regarding self-efficacy, followed by an
examination of literature regarding computer self-efficacy. Second, the major literature
regarding self-regulation is presented, followed by a review of literature about self-
regulation of distance learners. Next, engagement in general and engagement in distance
learning are examined. Finally, expectancy outcome is briefly addressed.
Sources Searched
The strategy used to search for relevant studies included a search
of the following databases: PsycINFO, Google Scholar, ERIC, and the University of
Southern California Libraries. The following keywords and keyword combinations were
used for the literature search: distance education, distance learning, online learning, web-
based learning, self-efficacy, computer self-efficacy, self-regulation, and engagement.
Additional articles and texts were found via the resources listed in articles found via the
aforementioned method. The key journals that were searched included: American
Educational Research Journal, Computers in Human Behavior, Computers and
Education, Contemporary Educational Psychology, Educational Psychologist,
Educational Psychology Review, Educational Researcher, Journal of Applied
Psychology, Journal of Educational Computing Research,
Journal of Educational Psychology, Review of Educational Research, and Psychological
Review. A total of 48 studies were found that include both qualitative and quantitative
research designs. The time range of these articles and texts spans 1977 to 2008, and the
articles were searched between 2006 and 2009.
Review of the Literature
Self-efficacy and Computer Self-Efficacy
Self-efficacy for technology, also referred to as computer self-efficacy (CSE), is
rooted in the social cognitive theory framework of Bandura (1977, 1982, 1986). Before
close scrutiny can be given to computer self-efficacy, a cursory summation of self-
efficacy—a key component in social cognitive theory (discussed in the self-regulation
and distance learning section)—as related to Bandura's framework is necessary.
Bandura (1986) defined self-efficacy as a person’s assessment of their own ability
to carry out a course of action in order to complete specific types of tasks. More
importantly, self-efficacy is not about the amount or quality of one’s skills, but, rather,
what one believes he can achieve with those skills. Schunk et al. (2008) explained self-
efficacy as “one’s perceived capabilities for learning or performing actions at designated
levels” (p. 379), and Ormrod (2006) stated that self-efficacy is a person’s “self-
constructed judgment” about her abilities in completing tasks successfully.
Schunk (2008) suggested that self-efficacy affects the three indexes of motivation:
active choice, mental effort, and persistence. Active choice is determined by an
individual’s willingness to engage in difficult tasks or avoid a task. Highly efficacious
learners actively choose difficult tasks. Mental effort is determined by the rate of
performance and how much energy is required for said performance, especially with
challenging tasks (Schunk, et al., 2008; Zimmerman, 2000). Persistence is evidenced by
a learner working for extended periods of time toward the completion of a task,
especially when faced with challenges. Typically, learners with some moderate level of
self-efficacy will demonstrate motivation via one of the three aforementioned indices.
The degree to which one is most evident is a direct result of the learner’s expectations for
succeeding at the task.
Perceived self-efficacy expectations may differ based on the specific activity and
the specific context in which the activity is to be performed (Bandura, 1977;
Zimmerman, 2000). As well, self-efficacy can be viewed as a three-dimensional construct
in terms of its level or magnitude, generality, and strength (Bandura, 1977; Zimmerman,
2000). Level is determined by how difficult a task is to the person. For example, a person
may demonstrate high levels of self-efficacy for a low-level or easy task. Generality
refers to how much a certain level of self-efficacy transfers to other activities within a
domain, such as from square dancing to ballroom dancing. Lastly, strength of efficacy is
how confident an individual is about performing a given task: a person with high
self-efficacy strength will persevere through challenges, whereas one with low strength
will quit or not engage at all. However, before any assessment of one's self-efficacy—its
level, generality, or strength—can be made, it is first necessary to examine the sources of
one's perceived self-efficacy.
To further understand the construct of self-efficacy, Bandura (1977, 1982)
proposed four sources of information for examining expectations of self-efficacy:
performance accomplishments (1977) or enactive attainments (1982); vicarious
experiences; verbal persuasion; and emotional arousal (1977) or physiological state
(1982). In this model, enactive attainments refer to what successes or achievements a
learner actually experiences. The more a person achieves, the higher one's perceived self-
efficacy. For example, a beginning golfer may experience success at sinking two-foot
putts and, therefore, may feel a sense of heightened self-efficacy and future success in
golf. Vicarious experiences are when a person observes another person engaging in and
completing a task successfully. Self-efficacy can increase if the person feels that she
could perform the same tasks she witnessed another experience. Verbal persuasion is
simply when an expert coaches another through a task or encourages a person to engage
in behaviors towards achieving a task. Finally, physiological state is when a person feels
negative or positive feelings in a physical and emotional sense towards certain activities.
For example, a person who loathes public speaking may actually feel sick to his stomach
at the prospect of giving a speech in English class. Conversely, a person may feel elated
at the opportunity to lead a small group in an activity because she has a high sense of
self-efficacy in being able to accomplish the task.
In summary, self-efficacy is a critical aspect of motivation and can predict
behavior between people as well as the degree to which an individual's self-efficacy
changes over time (Bandura & Locke, 2003). Moreover, self-efficacy beliefs are not a
single-dimension aspect of a learner, but, rather, multidimensional in form, depending on
the domain of behavior (Zimmerman, 2000). The following section will address the
major literature and studies on CSE as well as some measures of CSE.
This section will address the following areas related to computer self-efficacy:
first, a comprehensive definition of computer self-efficacy; second, an examination of the
researched antecedents of CSE; third, a cursory note on the role of gender in CSE, and
will then close with an historical perspective on the development and use of computer
self-efficacy measures.
Before the domain was dubbed “computer self-efficacy,” Kinzie, Delcourt, &
Powers (1994) referred to the concept as an individual having self-efficacy for the use of
technology and argued that an individual needed a somewhat high level of self-efficacy
in order to successfully use technology. Moos and Azevedo (2009) examined several of the
studies mentioned herein in their literature review of computer self-efficacy and should be
consulted for other aspects of computer self-efficacy not addressed in this study.
Compeau and Higgins (1995) proposed one of the most detailed definitions often
referenced in other computer self-efficacy research. They maintain that CSE is closely
aligned with the traditional “Bandurian” definition of self-efficacy in that it is not about
specific computer skills such as using a word processor, navigating the Internet, or
uploading files, but rather a person’s judgment of his capability to use a computer to
complete a task. Karsten & Roth (1998) asserted that computer self-efficacy is rooted in
the heavily researched variable of self-efficacy.
Paralleling Bandura's (1977) dimensions of self-efficacy, Compeau &
Higgins (1995) provided insight into the same dimensions—magnitude, strength, and
generalizability—but as related to computer self-efficacy. They state that magnitude is
simply the level of skill one should expect to possess and demonstrate in completing a
task or, if said skill level is lacking in the individual, the amount of support that person
would need to complete the task. For example, the task of writing an essay using a word
processor would not be of great magnitude. However, creating a complex webpage with
multiple links, JavaScript, and other bells and whistles would be of a greater
magnitude, requiring more skill. Strength refers to the confidence level an individual has
in his or her skills for completing a task using a computer. As with the aforementioned
example, an average user may not think twice about opening a word processor and begin
typing the essay, whereas the same user may not feel his skills are as strong in using a
hypertext markup language (HTML) editor and writing JavaScript in order to create a
webpage. Finally, generalizability is how a person’s CSE in one domain such as using
spreadsheets is maintained in another domain such as using an Internet browser. It is
possible for a person to have a high level of CSE in using a word processor but have low
CSE in uploading files onto a website.
As with the three dimensions of self-efficacy and the four sources of self-efficacy
(Bandura, 1977, 1982; Zimmerman, 2000) described earlier, Compeau & Higgins (1995)
and Karsten & Roth (1998) explained both of these concepts as specifically related to
computer self-efficacy. These researchers argued that students best obtain enactive
information by actually engaging in computer use that requires frequent demonstration of
hardware and software knowledge; vicarious experiences—witnessing both success and
failure by others—provide standards by which a student may judge himself; instructors
and fellow students are the best means for verbal persuasion, specifically in terms of
supporting skill development and becoming computer competent; and physiological
indices may be evidenced by nervousness, sweating, or visceral arousal (getting
butterflies) prior to engagement in a computing task or assessment completed via
computer. Conversely, the absence of these signs may be an indicator of a computer user
with high computer self-efficacy.
Hasan (2003) offered a more recent take on CSE as he defined computer self-
efficacy (CSE) as an individual’s perceptions about his or her own ability to use a
computer for a computing task. Because of the varied levels of computer use and
experience, it is possible for a distance learner to have varied levels of CSE. For example,
a distance learner may possess and demonstrate expert levels using a word processor or
spreadsheet, but may only have minimal skills in website creation. Therefore, depending
on the task of the learning content, the learner may experience varied levels of self-
efficacy and need to draw upon different strategies based on strength of self-regulation.
The next section focuses upon research that examines antecedents of computer self-
efficacy.
Theoretical research models by Stone & Henry (2003) and Bates & Khasawneh
(2007) provided examples of what the bulk of research in CSE has sought: what are the
antecedents to computer self-efficacy?
Figure 1: Truncated version of Stone & Henry's Research Model on CSE. [The figure depicts four antecedents—degree of computer use, past computer experience, computer support, and ease of system use—leading to computer self-efficacy, which in turn leads to outcome expectancy.]
As with each of the studies that follow, Stone & Henry (2003) created a model that
asserts four key antecedents to predicting a person’s level of computer self-efficacy.
Degree of computer use is defined as the frequency and purpose of computer use. For
example, a person who uses a computer daily to check email but nothing else would have
a high frequency of use, but a low degree of use because they would be limiting the use
to a single task compared to a person who uses a computer daily as part of her
employment such as a computer programmer. Past computer experience is the strongest
predictor of CSE (specific studies will follow). This antecedent is again user reported;
however, there are some suggestions that pre-tests for specific computer software
packages may be given in future research to correlate with self-reported CSE. Computer
support is defined as the help a user may access, such as a teacher or software instructor,
while completing a task. The ease of system use is not as prevalent in the literature, but
has been considered in some studies, especially when the sample studied is using a
uniform system such as a patient information database program at a hospital. Bates and
Khasawneh (2007) provided a research model that is more general in terms of defining
the antecedents and includes outcome variables, such as engagement, with CSE
functioning as a mediator between the antecedent variables and outcome variables.
Figure 2: Bates & Khasawneh Research Model. [The figure depicts antecedents of self-efficacy leading to online learning self-efficacy (computer self-efficacy), which in turn predicts outcome expectations, mastery perceptions, and hours per week (engagement).]
In this model, Bates & Khasawneh (2007) examined six antecedent variables that have
been included in previous research models: fixed ability (Kinzie, et al., 1994), acquired
skill (Karsten & Roth, 1998), online learning anxiety (DeLoughry, 1993; Wilfong, 2006),
instructor feedback (Cassidy & Eachus, 2002; Compeau & Higgins, 1995), training
(Potosky, 2002), and previous success (Hasan, 2003; Zhang & Espinoza, 1998). Bates &
Khasawneh (2007) found that four of the six antecedent variables—fixed ability, acquired
skill, anxiety, and previous success—had significant correlations with self-efficacy: r = -.32,
.38, -.56, and .48, respectively, p < .01 for each. The following section is an historical sampling of
research and subsequent findings on computer self-efficacy.
Kinzie, Delcourt, & Powers (1993) conducted a study of 359 undergraduates and
found that experience (frequency of use) in using technology and positive attitudes
predicted high levels of computer self-efficacy. Zhang & Espinoza (1998) found that
attitude was 1) a predictor of computer self-efficacy; 2) a predictor of desirability to use
computer skills; and 3) a predictor of desirability of learning computer skills, each
correlating with statistical significance at the p<.05 level. Furthermore, a person’s
comfort or anxiety in using a computer also predicted CSE. An early study reported by
DeLoughry (1993) addressed the issue of “technophobia” defined as the degree to which
an individual experiences physical discomfort at the notion of using technology. This
supports the notion of physiological indicators of self-efficacy as the findings of this
study showed that of 1,617 subjects, 40% were at risk for technophobia.
Karsten & Roth (1998) examined the effects of computer training on CSE in a
pre- post-test design. In their study, 148 students took pre- and post-training CSE surveys
based on a four-point Likert scale wherein higher scores indicate higher levels of CSE.
Results indicated a pre-training M=3.35 (based on 1-4) and a post-training M=4.36
(based on 1-4). This positive correlation of computer training with CSE was statistically
significant, with r = .391, p < .001.
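To make the statistic just reported concrete, the short sketch below shows how a Pearson correlation of this kind is typically computed. The scores are entirely hypothetical (Karsten & Roth's data are not reproduced here), and pairing pre- and post-training scores is only one plausible reading of the analysis; the sketch illustrates the mechanics, not the original study.

    # Hypothetical illustration only; these scores are invented, not Karsten & Roth's data.
    from scipy.stats import pearsonr

    # Pre- and post-training CSE scores for ten imaginary students (1-4 Likert scale).
    pre_cse = [2.8, 3.1, 3.5, 2.5, 3.0, 3.6, 2.9, 3.3, 3.4, 2.7]
    post_cse = [3.4, 3.6, 3.9, 3.0, 3.5, 3.8, 3.2, 3.7, 3.8, 3.1]

    # Pearson r indexes how strongly the two sets of scores move together;
    # the p-value tests whether r differs reliably from zero.
    r, p = pearsonr(pre_cse, post_cse)
    print(f"r = {r:.3f}, p = {p:.4f}")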
Cassidy & Eachus (2002) found positive correlations in predicting CSE based on
experience with computers, r=.55, p<.001, and based on familiarity with software
packages, r=.53, p<.001. Their conclusion was that positive experiences with computers
and software packages contribute to higher levels of CSE. One consideration, however,
is that the independent variable of software packages is not generalizable in that the
measure used did not specify what type or types of software packages were implied. As
well, this is a confounding factor in that as software packages develop, their facility often
increases. For example, in the early 1990s, a person would have to know how to
write formulas into a spreadsheet program in order to use it for the purposes of creating a
statistics chart, whereas nowadays there are statistics software programs that do not
require any writing of formulas by the user.
As indicated in Fig. 1, Stone & Henry (2003) found positive correlations between
computer experience, computer support, and ease of use with computer self-efficacy,
with all paths significant at p < .01. Likewise, both Hasan (2003) and Wilfong (2006)
reported, in their studies of 151 and 242 students respectively, a positive correlation
between computer experience and CSE, with both studies reporting a
significant correlation at the .001 level.
Other research has focused upon slightly different variables associated with CSE.
DeTure (2004), in a study of 73 students, found that students who were field
independent had higher levels of CSE and higher grades as well. In this case, field
independent refers to students who are able to extract a figure from a more complex
visual field. In terms of web-based courses, Kerka (1998) argued that field independents
are able to search and navigate complex and information rich visual fields on the Internet
as compared to field dependents who rely upon others and the external environment in
cognitive restructuring tasks.
Related to web-based learning, Eachus and Cassidy (2006) developed a follow-up
to their Computer User Self-Efficacy (CUSE) measurement in the form of the Web-User
Self-Efficacy (WUSE) measurement. Using a sample of 141 subjects, 68 from a Usenet
group and 73 from the general population of a large university in the United Kingdom,
they tested levels of web-user self-efficacy in four domains—information retrieval,
information provision, communications, and internet technology. The reliability of each
domain was demonstrated with alphas for each at .80 or greater. The scale was also
validated via ANOVA, where F = 72.60, p < .001, supporting the notion that high self-
reports in the four domains, based on a scale of novice, intermediate, and advanced,
indicate that high efficacy in a specific domain correlates with high overall web-user self-
efficacy.
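The reliability and validity checks described above can be sketched briefly in code. Everything below is a hypothetical, simplified illustration—invented item responses and arbitrary group splits, not the WUSE data—meant only to show the general mechanics of computing Cronbach's alpha for a subscale and running a one-way ANOVA across self-rated expertise groups.

    # Hypothetical illustration; invented responses, not the WUSE dataset.
    import numpy as np
    from scipy.stats import f_oneway

    def cronbach_alpha(items):
        """Cronbach's alpha for a (respondents x items) score matrix."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances / total_variance)

    rng = np.random.default_rng(0)
    # Five imaginary 1-5 Likert items for one WUSE-style domain, 30 respondents.
    base = rng.integers(2, 6, size=(30, 1))
    domain_items = np.clip(base + rng.integers(-1, 2, size=(30, 5)), 1, 5)
    print("alpha =", round(float(cronbach_alpha(domain_items)), 2))

    # Discriminant-validity check: do total scores differ across groups? Here the
    # groups are arbitrary splits standing in for self-rated novice, intermediate,
    # and advanced users, so the F test on this simulated data is shown only for
    # the mechanics, not as a meaningful result.
    totals = domain_items.sum(axis=1)
    F, p = f_oneway(totals[:10], totals[10:20], totals[20:])
    print(f"F = {F:.2f}, p = {p:.4f}")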
Finally, Bates & Khasawneh (2007) stated that, as shown in Figure 2, computer
self-efficacy is a strong predictor of online learning self-efficacy. However, their research
includes a third variable, which they refer to as “outcome variables.” This includes the
notion of outcome expectancy and hours spent per week using the online tools, which can
be viewed as engagement. They differ from other research in that they don’t stop at how
previous experience correlates to higher CSE, but rather, focus upon general self-efficacy
leading to higher online or computer self-efficacy, which then becomes a predictor for
levels of engagement and performance outcomes. The antecedents, then, coupled with
the mediator (See Figure 2) become significant predictors for outcome expectancy
(correlated at p<.05), mastery perceptions (correlated at p<.05), and hours per week
(correlated at p<.05). One of the most critical factors in the studies examined thus far is
the specific measurement(s) used. In the next section, the researcher will briefly describe
the development and validation of several instruments aimed at measuring computer self-
efficacy.
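Before turning to those instruments, it may help to sketch what treating CSE as a mediator, as in the model just described, involves in practice. The sketch below runs the classic regression-based mediation steps on simulated data; the variable names, effect sizes, and data are all invented and are not Bates & Khasawneh's analysis (a modern treatment would typically also bootstrap the indirect effect).

    # Hypothetical illustration of a regression-based mediation check:
    # antecedent (prior experience) -> mediator (CSE) -> outcome (engagement).
    # All data are simulated; nothing here reproduces Bates & Khasawneh (2007).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 200
    prior_experience = rng.normal(size=n)
    cse = 0.6 * prior_experience + rng.normal(scale=0.8, size=n)
    engagement = 0.5 * cse + 0.1 * prior_experience + rng.normal(scale=0.8, size=n)

    def fit(y, *predictors):
        X = sm.add_constant(np.column_stack(predictors))
        return sm.OLS(y, X).fit()

    # Step 1: the antecedent predicts the outcome (total effect).
    print(fit(engagement, prior_experience).params)
    # Step 2: the antecedent predicts the mediator.
    print(fit(cse, prior_experience).params)
    # Step 3: with the mediator in the model, the antecedent's direct effect
    # should shrink if CSE truly mediates the relationship.
    print(fit(engagement, prior_experience, cse).params)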
The pursuit of a consistent, validated, and reliable measure of computer self-
efficacy is the “holy grail” of CSE research. Early work by Kinzie, Delcourt, and Powers
(1993) saw the creation of the Attitudes Toward Computer Technologies (ACT) and the
Self-Efficacy for Computer Technologies (SCT). The ACT was reliable at the .89 level
for the entire measure using Cronbach’s alpha and the SCT was reliable on four subscales
at alphas between .95 and .99. Zhang & Espinoza (1998) used a survey called the
Computer Technologies Survey (CTS), which was a combination of Kinzie’s ACT and
the Confidence and Desired Knowledge with Computers (CDK) measurement.
Compeau & Higgins (1995) developed the Computer Self-Efficacy scale that was further
developed and tested by Hasan (2003), Wilfong (2006), and Bates & Khasawneh (2007).
These researchers found correlations of prior computer experience with computer self-
efficacy at p < .001, with the exception of Bates & Khasawneh, where p < .05.
Two other key researchers and measurement developers, Cassidy & Eachus (2002,
2006) developed the Computer User Self-Efficacy (CUSE) scale and the Web-User Self-
Efficacy (WUSE) scale. The CUSE was field tested and shown to be reliable with an
alpha of .94 and validity of p < .0005. The analysis of the WUSE resulted in an alpha
value of .80, and in order to establish discriminant validity, the researchers asked
participants to rate their level of expertise as novice, intermediate, or advanced. They
predicted that there would be differences in WUSE scores based on these three groups;
therefore, an ANOVA was used to confirm this and resulted in F = 72.60, p < .001.
At least four other measures have been developed based on the literature reviewed
by the researcher. However, the researchers listed above have been active in the
development of CSE measures or their respective measures have been used and
developed by subsequent researchers. The information about measure development and
use as related to CSE is important in that it reveals a gap for the existence of a consistent
measure. Two issues have confounded the development of a long-term universal
measure. First, as technology develops and becomes more facile, scales aimed at
gathering a user’s self-efficacy no longer measure a variable that has the same meaning
although the label remains the same. For example, Compeau & Higgins (1995) used
items aimed at gathering data about email use. In 1995, using email was a multi-step task.
Nowadays, using email is one of the most facile actions based on field research.
Therefore, it is necessary to update scale items to match how difficult or facile current
software is. Second, of the 12 researchers reviewed in this section, a total of 11 different
measures were used for the purposes of understanding CSE. Part of this is due to the
specific researcher’s goal and the variables upon which he or she is focused. As a result,
measurements may need to be developed to assess CSE for aspects of computer use that
have yet to be studied or have yet to be created.
In summation, computer self-efficacy is the belief a person has about her ability—
not the quality of the abilities—to use a computer to complete a task. Many of the key
studies have focused upon the antecedents of computer self-efficacy, with most findings
indicating that prior computer experience is a significant predictor of computer self-
efficacy. Finally, the development of computer self-efficacy measures is critical for the
overall growth of the field and contribution to the literature. As technology continues to
advance, it is imperative that researchers develop measures that are in sync with said
technological advances. The next section will focus upon self-regulation and self-
regulation as related to distance learning.
Self-Regulation and Distance Learning
Bandura (1977) initially suggested that self-efficacy and the resultant behavior be
examined at “significant junctures” so as to better understand how one influenced the
other. This model was later expanded to include a third element, environment, which
resulted in the triadic reciprocality model (Bandura, 1986) wherein at any given time,
personal factors, behavior, and environment of a learner affect the level of self-
regulation.
Zimmerman (1989) concurred and expanded upon Bandura’s notion to include
refinements and qualifications of self-regulation. He purported that a learner must use
specific strategies aimed at academic achievement based on one's self-efficacy. The two
prerequisites for self-regulation are that the learner know the specific learning
goal(s) and understand his or her level of self-efficacy. In other words, in order for a
learner to demonstrate desired levels of self-regulation, she must clearly know what her
learning goals are, such as writing a persuasive essay, and know her level of self-efficacy
for writing. Although she may have never written a persuasive essay before, she may
have had other experiences in writing and, therefore, possess a relatively high level of
self-efficacy for writing in general.
Once a person has identified the learning goal and understands his level of self-
efficacy, he may then utilize self-regulation strategies with effectiveness. These may
include organizing information, transforming information, researching, or using memory
aids (Zimmerman, 1989). Again, this concept is contingent upon the forces that work
within the triadic reciprocality model.
Figure 3: Bandura's Model of Triadic Reciprocality. [The figure depicts personal influences, behavioral influences, and environmental influences as three mutually interacting elements.]
In this model, none of the three elements is fixed nor necessarily dominant over the
others. The specific goal(s) sought, the context, the age of the learner, and a slew of other
factors affect how the three elements interact. In short, self-regulation occurs when a
learner uses personal processes to purposefully regulate behavior and attempt to maintain
or change the learning environment. The next section will closely examine the major
literature related to these three abovementioned dimensions of self-regulation.
Zimmerman (2002) maintained that four key factors constitute personal
influences in student self-regulation. They are 1) students' knowledge, defined as their
knowledge of regulatory strategies and the types of knowledge possessed, such as
procedural and conditional knowledge (Anderson, Krathwohl, & Bloom, 2001); 2) metacognitive processes, defined
as one’s ability to plan, organize, manage time, as well as use learning processes such as
mnemonic devices or other memory aids; 3) goals, defined as a precursor to metacognition
as one must know what one desires to learn before he or she can plan, execute, etc.; and
4) affect, which is basically emotion or, in the case of encumbering forces, anxiety about
engaging in the behavior required.
In line with these processes, Schunk, et al. (2008) asserted that humans are not
simply controlled vessels as influenced by internal or external forces, but rather, react to
said forces via choice. An example of personal influence on behaviors would be a student
possessing high self-efficacy (personal) for playing the violin and then actively choosing,
persisting, and putting forth effort (behavior) to learn the cello.
Behavioral influences can be simply viewed as how a student responds to the
other two influences in the triadic model. This is how a student overtly acts in a given
situation to either maintain or change the environment or engage in or react to cognitive
influences (Bandura, 1986; Zimmerman, 1989, 1990, 2002). Students may engage in one
of three classes of responding: self-observation, self-judgment, and self-reaction.
Self-observation is the willful monitoring of one’s own performance, successes,
failures, etc. and how one stores this information (journal, grade reports, or verbal
feedback). Self-judgment is how a learner compares her performance to a standard, be it
another student, a criterion provided by the teacher, or a self-set standard. Finally, self-
reaction is the setting of goals, perceptions of self-efficacy, planning for goal attainment,
and performance outcomes. Again, these are related to the reciprocal nature of the
model.
An example of behavioral influences on the environment would be when a student
completes a project incorrectly (behavior) and the teacher re-teaches or re-explains the
desired outcomes (environment) rather than moving on to a new project (Schunk, et al.,
2008).
Zimmerman (1989) stated that environmental influences can also be considered
social or enactive experiences; feedback, modeling, verbal persuasion, and direct
assistance are all factors of the environment. Of course, the physical setting is a significant
environmental influence and may be one that is most challenging for a student to
manipulate. An example of the environment on personal influence would be a coach
giving praise (environment) to a person new to the sport of basketball after he performed
a task successfully, raising the person’s self-efficacy (personal).
As stated before, the direction of influence (e.g. behavior to personal) is not
always the same nor is it constant. However, it should be noted that most research in this
area concludes that usually two of the three factors will dominate in a given situation.
The three influences of the triadic model have a profound influence on the self-regulation
of a learner.
To understand the complexity of self-regulation, it may be appropriate to first
examine what it is not. According to Zimmerman (2002), “self-regulation is not a mental
ability or an academic performance skill, but rather, the self-directive process by which
learners transform mental abilities into academic skills” (p. 65). Therefore, it is a relief to
know that any learner is capable of being taught self-regulatory strategies. McMahon &
Oliver (2001) likened the learner to the protagonist of a story in that she proactively seeks
new knowledge and takes the required steps in order to master it—this, of course, describing a
highly self-regulated learner. The highly self-regulated learner, then, is different from a
novice or intermediate self-regulator in that the former demonstrates higher levels of
awareness and use of strategy in learning.
The key emphasis of self-regulation is on how students choose, organize, create,
and/or maintain learning environments (behavioral and environmental influences) in
which they engage in learning and how they manage their time and control their own
instruction (behavioral and cognitive influences) (Zimmerman, 1990). Zimmerman
(1990) has studied the influence of self-regulation on achievement and found that in a
study of 80 subjects, the level of self-regulation (high or low) predicted the academic
track in which the student was enrolled.
Zimmerman suggested six dimensions of self-regulation that can serve as a
framework for research (Schunk, et al., 2008). This framework, albeit in a modified form,
will be discussed later in this chapter as related to distance learning. However, it is
prudent to discuss Zimmerman’s (2002) notion of cyclical phases of self-regulation.
Figure 4: Three cyclical phases of Self-Regulated Learning
1. Forethought phase: goal setting, planning, self-efficacy, outcome expectancy, goal orientation
2. Performance phase: self-control, imagery, self-instruction, self-observation
3. Self-reflection phase: self-judgment, self-evaluation, self-reaction, self-satisfaction
Although the model of triadic reciprocality is nebulous, the sub-processes of self-
regulation are cyclical and a group of processes explicitly precedes another group. For
example, a student must first decide on a learning goal before he can engage in attention
or self-observation. As well, a student must—once a goal has been selected and a plan for
achieving it developed—perform the required behaviors in order to achieve said goal,
before any sort of self-reflection can occur, in the form of self-evaluation, attribution
of success or failure, and so on. Then, in its cyclical nature, the learner uses the
information from the self-reflection phase to develop new learning goals and plans, modify
perceptions of self-efficacy, and adjust goal orientation.
As mentioned above, Zimmerman developed six underlying psychological
dimensions that students can self-regulate. These have been modified by Dembo, Junge,
& Lynch (2006) to provide a framework for analyzing these dimensions in the
context of web-based learning. This will serve as a research framework for the present
study.
Zimmerman (1990, 1998) suggested six dimensions of self-regulation that include
understanding the why, how, when, what, where, and with whom in relation to a learner’s
engagement in self-regulation. Dembo et al. (2006) reworded these notions as tangible ideas that can be manipulated, rather than the question form (inserted in parentheses) proposed by Zimmerman. They are: motive (why), method (how), time
(when), behavior (what), physical environment (where), and social environment (with
whom).
Although it has been asserted that self-regulation is not a fixed trait in a learner
(Bandura, 1986; Zimmerman, 1989, 1990, 2002; Schunk et al., 2008), Dembo et al. (2008) argued that the component skills related to the underlying dimensions can be taught
before or as part of a web-based distance learning class. A brief discussion of each
dimension follows.
Motive can best be assessed via the indicators of motivation—active choice,
persistence, and mental effort (Schunk, et al., 2008) as related to a learning task or goal.
For distance learning, as defined for this study, more research is needed regarding dropout rates, which would be an indicator of persistence. Motive is also related to goal
setting in that goals are an antecedent to self-efficacy which antecedes motivation (Locke
& Latham, 1990). For this study, the subjects studied are required by state law to
regularly attend school. However, there are incidents of truancy, failing classes, and
dropping out; therefore, students must still possess a motive for learning.
The method by which one learns is dependent upon the task itself. However, three
basic strategies for learning are rehearsal, elaboration, and organizational structures.
These are progressive in terms of complexity. For example, learning a list of vocabulary
words and their definitions could be achieved by rote memorization. A more complex
method would be to use the vocabulary list to create sentences that demonstrate
understanding of their meaning. Distance learning is a highly independent act that
requires the student to know how to learn before enrolling in an online course.
Continuing this thought of independence, time management is important as there
is no bell system signaling the beginning or ending of classes. There is not a teacher to
take roll each period or campus supervisors to sweep the hallways for tardy or truant
students. The distance learner must decide when and for how long to study, log on to the
learning delivery system regularly, and adjust to the asynchronous environment. Dembo
et al. (2006) suggested six ways to manage time: be aware of deadlines, know how long it will take to complete a task, be aware of one’s learning process, prioritize, evaluate study time, and then reprioritize at the end of the task. The distance learner (DL) usually
has access to assignments and the respective due dates well in advance. One of the
drawbacks to weak time management skills is the propensity for students to procrastinate.
In his influential study, Tuckman (2007) used a measurement of procrastination
that he developed to assess the levels of procrastination of students in an online distance
learning course. The results indicated two general groups, high-procrastinators and low-
procrastinators. He found that by creating peer support groups that offered time
management suggestions and to-do lists, he was able to lower the procrastination of the
high-procrastinators. This is a major concern for DLs in grades 9-12 as they are often
home alone.
Unlike the physical environment of a traditional school, the virtual high school student has a two-fold physical environment to contend with. First, there is the physical setting where the
student uses his or her computer to access the distance learning tools. This could be at
home in a bedroom, office, or living room, or in a public setting like a library, Internet
café, or coffee house. Second, the actual learning delivery system (like Blackboard) has
its own dimensions that affect student self-regulation. Self-regulated learners actively choose the learning environment that is most conducive to their learning. While asynchronous
learning frees the student from the binds of time and place, there are limitations to where
a student may access a computer and the Internet.
The issue is creating ecological validity in the learning environment. Rizzo and
his colleagues (2006) maintain that when subjects are removed from the natural or real
world setting, the validity of the test [or learning] environment does not allow for real
world replication. As well, the real world is unpredictable and can create distractions; for example, a student may not always be able to discriminate among multiple stimuli. Rizzo and Kim (2005) argued that virtual environments allow students to be in an ecology that is more real than a traditional learning setting because it may be easier for a learner to control his or her learning environment compared to a traditional setting, where the learner does not have control over other students, the teacher, maintenance taking place on the campus, and so on. Greenhow and colleagues (2009) asserted that the idea of a learning ecology as related to web-based instruction encompasses an array of simultaneous settings for a learner, with boundaries being malleable and created by the learner as well as those that are pre-existing. This aspect will be of particular interest to the researcher as related to student engagement.
Although there is the physical absence of classmates, the social environment of
the virtual school can be thriving and supportive for the DL. Recent studies (Dembo et al., 2006; Lee & Lee, 2008; Tuckman, 2007; Wang & Lin, 2007; Whipp & Chiarelli, 2004) support the notion of collaboration in the distance learning model. These researchers found that a high level of peer support increases computer self-efficacy and self-regulation, resulting in a higher degree of overall engagement and subsequent achievement. Furthermore, Dembo argued that the factors most relevant to web-based education are goal orientation, self-efficacy, and a desire for challenging tasks with an orientation towards mastery. The latter is natural to the virtual setting in that it is not
feasible to post the work of students or for a teacher to read scores aloud.
Lastly, performance is most evident in the shift from teacher-centered to student-centered learning. Self-monitoring and self-evaluation become paramount to the overall success of a DL due to the nature of the school. The physical absence of peers and teachers in the virtual realm, and often the absence of an adult in the environment where the student accesses the DL tools, require at least a minimal level of self-efficacy. The development of metacognitive skills creates a natural segue to autonomy and life-long learning.
There is little research to date in regards to self-regulation in web-based distance
learning, especially for grades 9-12. Extant studies, although sparse and geared towards
higher learning, do reveal consistent findings, especially as related to the virtual
environment and context. McMahon & Oliver (2001) found that the learner in a web-
based/online environment must become the protagonist in his learning unlike the old
passive model wherein the teacher delivered information that the student supposedly
learned. This idea implies the presence of an antagonist, perhaps, in the form of low
computer self-efficacy, low self-regulation in terms of accessing distance learning tools
or simply changing the learning environment, or the lack of social skills in accessing the
support networks in the virtual environment. As well, students with technophobia or
anxiety about technology or who procrastinate need to be taught basic self-regulation
strategies prior to taking DL courses or as part of the DL curriculum. McMahon and Oliver also assert that “a well-designed learning environment should be able to operate on both affective
and cognitive processes to activate self-regulated learning” (p.1303). This is also
supported by Whipp & Chiarelli (2004) who found that self-regulation in cyberspace is
highly contingent on the course design, especially for learners new to the DL format.
In support of Zimmerman’s three-phase cyclical model of self-regulation, Whipp
& Chiarelli (2004) compared traditional and online adaptations of self-regulation
strategies. For example, whereas the traditional mode of goal setting and planning is
achieved via calendars, self-imposed deadlines, and chunking of work, the online
adaptation would include daily logons, coordination of online and offline work (since
students do not have to access the Internet for all phases of DL learning), and planning
for technical problems that may occur (this can be viewed as an element of time
management as well as planning, especially if incidents of technical issues are prevalent).
Again, in support of the importance of context, findings from Whipp & Chiarelli
(2004) indicated that students new to online delivery increased their self-efficacy through
early successes, encouragement from the instructor, and effective modeling in the form of
model student work available online. This contrasts with the social cognitive theory notion of an individual focusing on a single model; most current research encourages collective learning processes and shows them to be more effective
(see Tuckman, 2007; McMahon & Oliver, 2001). As well, Wang & Lin (2007) found that
peer critiques raised student scores as compared to students who did not receive peer
assessment. Lee & Lee (2008) determined that high levels of self-regulatory efficacy are
exclusively focused upon environmental influences. These environmental, contextual, and social aspects are imperative to student success. The implication is that the design of the course should support social interaction, both student to student and student to teacher.
In summary, research on distance learning via the Internet indicates that students
must first be taught basic self-regulation skills in order to engage in the six dimensions of
self-regulation (Dembo et al., 2006)—they must learn how to learn. Next, course design
is a critical factor in raising self-efficacy and fostering self-regulation, particularly the
opportunity for distance learners to interact in collective learning through peer evaluation,
feedback, and problem solving. It is the hope of the researcher to garner more
information about high school-level distance learners and their levels of self-regulation,
especially as evidence of engagement in learning. The next section will address the
concept of school engagement.
Student Engagement and Distance Learning
This section begins with a definition of engagement and then follows with an
examination of each of the facets of engagement: behavioral, emotional, and cognitive.
Engagement can best be viewed as a multidimensional construct similar to the
triadic reciprocality model of self-regulation (Bandura, 1977, 1986; Zimmerman, 1989,
2002). Specifically, it is a malleable or nebulous construct, depending on the context in
which the student is exposed to learning. Engagement has been researched as behavioral,
emotional, and cognitive. As stated above, engagement shares at least two facets with
self-regulation—behavioral and cognitive. Therefore, self-regulation is itself an act of
engagement. The comprehensive work of Fredricks and colleagues (2004) purported that
engagement should be defined three ways, based on the three types of engagement. As
well, they argued that each type has a broad range and that the levels of each may vary
depending on the learner and the learning context. In short, they argue that it is an
adaptable or tractable idea and is dependent upon the interaction between an individual
and the environment. For the purposes of this study, engagement will be defined as the
extent to which a learner is involved in or committed to a learning activity or goal. A
brief discussion of the three facets of engagement follows.
Behavioral Engagement. According to Fredricks and colleagues (2005),
behavioral engagement is evidenced by observable actions such as positive conduct,
effort, persistence, attention, or participation in extracurricular activities such as athletics,
clubs, or student government. Finn & Voelkl (1993) asserted that behavioral engagement
is participation of an individual in a range of actions. For example, a student can attend
school regularly, take notes, or participate in a group activity in order to demonstrate a
minimal level of participation. On the other hand, a more advanced-level might be a
student who initiates a higher-level synthesis question in a science class or participates in
activities related to a given subject outside of required school time. For example, a
student enrolled in a physics class at the high school may also be a part of a physics club.
Finn & Voelkl (1993) also proposed six indicators of behavioral engagement that
include absences and tardiness (habitual truancy or habitually arriving late to class), not-
engaged (failure to complete homework), attendance (regular for all classes), preparation
(having pen, paper, other materials), behavior (not disrupting the learning environment),
and student-teacher relationships (positive feedback and response). Their research
indicated that at-risk students raised their levels of engagement in smaller schools.
For the distance learner, these six indicators may not be as evident in the virtual
setting: absences are only recorded if a student does not log on (for the K-12 setting)
within a 24-hour period and in an asynchronous environment it is not possible to be
tardy; not-engaged is evident to the extent that a student fails to upload assignments, but
attention is not possible to observe in this setting except with video monitoring;
attendance is not logged by period per se, students merely need to log on to the learning
delivery system every 24-hour period in compliance with state-specific attendance laws;
preparation is indicated by the student having a computer and Internet access; behavior is
only a concern in a chat room where a student could possibly post disruptive comments
or images; and student-teacher relations are somewhat limited to emails and occasional
phone calls, again due to the asynchronous environment.
Emotional Engagement. Fredricks and colleagues (2004) defined emotional
engagement as an individual’s affective reactions; reactions to the teacher; and reactions
to the class and homework. For an Internet-based course, Arbaugh (2000) found that
previous use of technology was a predictor for levels of engagement and elements of
interactive teaching style were strongly correlated with learning. Herrington and
colleagues (2003) stated that students demonstrate two patterns of engagement—willing
acceptance and delayed engagement. These two are more indicative of attitudes or
emotional engagement in that, in the first, learners in a virtual environment are willing to
“suspend their disbelief” that they are actually in a learning environment and not hold
critical opinions about the ecological validity of an environment. Second, they argue that
students who are used to a teacher-centered environment may not initially value a more
autonomous learner-centered environment.
Cognitive Engagement. Cognitive engagement is defined as the level of
psychological investment required in a learning goal as well as the preference for
demanding work that calls for higher levels of self-regulation (Fredricks, et al., 2004).
Herrington and colleagues (2003) submitted that authentic environments led to higher
levels of self-efficacy and self-regulation, resulting in higher levels of engagement. Their
findings also indicate that support (cognitive based) was a key factor in alleviating initial
reluctance in some students. Along with support, they found that one effective strategy
was to use scaffolding of support as a means to engage reluctant students. Supporting this
notion, Greene and colleagues (2004) and Patrick et al. (2007) argued that autonomy
support correlated positively to grades and strategy use (self-regulation) and that
autonomy predicted levels of self-efficacy (cognitive engagement). Conversely,
Wolters’s (2005) study suggests that low levels of self-efficacy and task-avoidance lead
to procrastination (disengagement). He averred that students who demonstrated higher
levels of cognitive engagement were mastery goal oriented and had high levels of self-
efficacy. In addressing the notion of disengagement, Tuckman (2007) found that high
procrastinators increased their level of cognitive and behavioral engagement through
support and scaffolding, ultimately increasing their levels of self-regulation.
In summary, engagement is a critical factor in distance learning, yet it is one of the most difficult variables to measure. Although behavioral engagement is relatively easy to detect in a traditional school setting, it is much more difficult to assess
in a virtual setting. However, technology does allow for some indicators as suggested by
Finn & Voelkl (1993) that can be observed. As with many motivational variables related
to distance learning, the literature is scant particularly for engagement of high school
distance learners. It is the intent of the researcher to ascertain the behavioral, emotional,
and cognitive engagement of distance learning students in grades 9-12.
Expectancy Outcome
Although access to actual school-assigned grades was not feasible for this study, the
researcher obtained students’ expected achievement outcomes in English, social science,
math, and science. Rather than have students report their general expected achievement,
these four subjects were selected because it is assumed that the more specific the domain,
the more accurate the measurement of expectancy. Therefore, students would be more apt
to report their expected achievement for specific subjects, especially where they may
have differences in past performance. For example, a student who took four classes may
have expected to do extremely well in two of them and not so well in two of them. It can
be assumed, then, that this student might report an overall average expected achievement.
However, by narrowing the expected achievement to specific subjects, the participant is
more likely to report expected achievement specific to a domain versus their overall
expected achievement. As purported by Wigfield (1994), summarizing Eccles, when
individuals have high expectancies for success, they typically have higher motivation to
perform achievement tasks. Furthermore, Eccles and her colleagues (Eccles & Wigfield,
2002) posited that expectancies for success are best defined as a learner’s beliefs about
how well he will do on an upcoming task. As well, Eccles suggests that within the
expectancy-value model, a learner’s ability beliefs are seen as broad beliefs about
competence in a specific domain, much akin to self-efficacy. In the context of this study,
expectancies are viewed as both the overall expected achievement in upcoming tasks
throughout the course and also the overall beliefs about how one has performed in a
specific domain (i.e. English, social science, etc.).
Conclusions
The research literature is abundant on the general notions of self-efficacy, self-
regulation, and engagement. The review of literature also revealed that although research
on these variables as related to distance learning is growing, it is still limited to higher
education. There is a need to refine and clarify the notion of distance learning and its
synonyms—web-based, Internet-based, online-based, blended contexts, and so on. The
review also pointed out that a significant pattern in distance learning at the higher
education level is that prior computer experience is a significant predictor of computer
self-efficacy, which in turn is positively correlated—in the studies reviewed—to high
levels of self-regulation and engagement. The review indicated that there are not to date
any specific distance learning studies for the grades 9-12 that examine the relationship
between computer self-efficacy, self-regulation, and engagement.
The researcher identified the motivational variables of computer self-efficacy and
self regulation and their relationship to engagement in distance learning as an appropriate
study to better understand the high school level distance learner.
CHAPTER 3: METHODOLOGY
This descriptive study was designed to assess the motivational beliefs,
specifically, self-efficacy for technology—herein referred to as Computer Self-
Efficacy (Compeau & Higgins, 1995; Hasan, 2003), self-regulation, and engagement of
students enrolled in distance learning. Participants completed one online survey as the
method for data collection. These measurement tools were self-reported. Although
empirical observations would have been the preferred method of data gathering, it was
neither feasible nor practical for the researcher to do this as the sample was spread
geographically across two states in the Southern and Western United States, and, due to
the nature of distance learning, the participants did not meet in a central locale, but rather,
participated in distance learning from their place of residence. While self-report
instruments can reflect discrepancies due to blatant or tacit misreporting—over- and under-reporting—on the part of the participant, these tools remain the best option when unobtrusive
observation is not possible.
Research Questions
The purpose of this study was to examine motivational influences on engagement
in distance learning at the K-12 setting, specifically in regards to computer self-efficacy
and self-regulation. The overarching research question was: What is the relationship
between students’ computer self-efficacy and self-regulation, and engagement in distance
learning education on expected achievement?
The following sub-questions were addressed as well:
1. What is the relationship between distance education learners’ computer self-
efficacy, self-regulation, and their behavioral, emotional, and cognitive
engagement and expected achievement in English?
2. What is the relationship between distance education learners’ computer self-
efficacy, self-regulation, and their behavioral, emotional, and cognitive
engagement and expected achievement in social science?
3. What is the relationship between distance education learners’ computer self-
efficacy, self-regulation, and their behavioral, emotional, and cognitive
engagement and expected achievement in math?
4. What is the relationship between distance education learners’ computer self-
efficacy, self-regulation, and their behavioral, emotional, and cognitive
engagement and expected achievement in science?
Design Summary
This descriptive/correlational study was designed to assess the motivational
beliefs, specifically, self-efficacy for technology—herein referred to as Computer Self-
Efficacy (Compeau & Higgins, 1995; Hasan, 2003), self-regulation, and engagement of
students enrolled in distance learning. In this case, the independent variables were
students’ computer self-efficacy and self-regulation, and the dependent variables were
students’ behavioral, emotional, and cognitive engagement, and English, Social Science,
Math, or Science achievement in distance learning.
Participants
The sample of this study consisted of 194 students at one of two virtual schools
located in the Southern and Western United States. Of this total, 183 attended Southern
Virtual School (name changed for study). At the time of this study, the researcher was not
able to ascertain the number of students from Southern Virtual School who were invited
to participate since an administrator from this school disseminated the electronic
invitation to participate (the researcher is still working to obtain the number of total
students invited). The number of students enrolled in grades 9-12 in the Southern Virtual
School was not known. 160 students from Western Virtual School (name changed for
study) were invited to participate and of this total, 11 volunteered to participate. The
gender of the sample was 35.1% (n=68) male and 64.4% (n=125) female. One participant
did not indicate gender. The average age was 16.12 years with the youngest student aged
13 and the oldest student aged 18. As the focus of the study was on high school students,
the grade level breakdown of participants was as follows: 9th grade students comprised 14.4% (n=28), 10th grade students comprised 26.8% (n=52), 11th grade students comprised 32.5% (n=63), and 12th grade students comprised 25.3% (n=49). Two (2) students did not indicate a grade level. (See Table 2)
The years enrolled in distance learning ranged from 109 first-year students,
accounting for 56.2% of the sample to four (4) students who had been enrolled in
distance learning for five (5) or more years, resulting in 2.1% of the sample. This
indicated that over half of the sample had no prior experience in distance learning. The
average number of years enrolled in distance learning for the sample was 1.61. The
number of online classes in which each participant was enrolled ranged from 110 students (56.7%)
taking one class to 11 students (5.7%) taking six (6) or more classes. This revealed that
over half of the sample enrolled in distance learning as a supplement to their traditional
school enrollment. The average number of classes taken by the sample was 2.08.
According to the participants’ self-report, the number of hours each participant spent per
week in the online class ranged from a low of six (6) participants indicating less than one
(1) hour per week to a high of four (4) students indicating they spent 36 or more hours
per week. The average range of hours spent per week in the online classes was 3.31, accounting for 36.1% (n=70) of the sample. One participant did not indicate years
enrolled, number of classes, or hours spent per week. (See Table 2)
Table 2.
Description of the Sample

                                    n       %
Gender*
  Female                            125     64.4
  Male                              68      35.1
Age (in years)
  13                                1       .5
  14                                14      7.2
  15                                45      23.2
  16                                55      28.4
  17                                58      29.9
  18                                21      10.8
School
  Southern Virtual                  183     94.3
  Western Virtual                   11      5.7
Grade Level**
  9th                               28      14.4
  10th                              52      26.8
  11th                              63      32.5
  12th                              49      25.3
Years in Distance Learning
  First Time                        109     56.2
  1-2 years                         55      28.4
  3-4 years                         25      12.9
  5 or more years                   4       2.1
Subject Enrollment***
  English                           79      40.7
  Social Science                    70      36.1
  Mathematics                       76      39.2
  Science                           61      31.4

Note. *1 student did not indicate gender. **2 students did not indicate grade level. ***These are the maximum number of students in the sample who enrolled in these courses and may have enrolled in other courses as well.
For the purposes of this study, enrollment in one of four core academic subjects (English, social science, mathematics, and/or science) was considered for the achievement variable, as the range of non-academic subjects was varied and inconsistent between the schools. For example, what one school calls and considers “Life Skills” may not be equivalent in scope and sequence to the course of the same name at the other school. On the other hand, it is reasonable to assume that a course called “Algebra 1”
by one school is called the same by the other and that both will be more similar in scope
and sequence based on criterion for proficiency on state tests, preparation for SAT and
ACT tests, and college/university admission requirements. Therefore, preliminary
analysis indicated the following enrollment in the aforementioned courses: English n=79,
Social Science n=70, Mathematics n=76, and Science n=61.
Measures
Three separate surveys were used to gather data about students’ computer self-
efficacy, self-regulation, and engagement. These assessments were scored using a five-point Likert-type scale, with respondents selecting a 1 to indicate that they “Strongly
Disagree” with the statement, and selecting a 5 to indicate that they “Strongly Agree”
with the statement. It should be noted that the original MSLQ uses a 7-point Likert scale;
however, the version used in this study was changed to a 5-point Likert scale so as to be
consistent with the Feelings About School Inventory (Fredricks et al., 2005) and the Web
Users Self-Efficacy (Eachus & Cassidy, 2006) measurements, both of which use 5-point
Likert scales. It should also be noted that the order of the scales was reversed on the
student survey, going from 5 to 1 instead of 1 to 5.
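Because the response options were displayed in reverse order on the student survey, and because some items (e.g., E6 and B2 in Table 4) are negatively worded and recoded, responses are typically mapped back onto a common 1-to-5 metric before scale scores are computed. The following is a minimal sketch of that recoding step, assuming responses are held in a pandas DataFrame with hypothetical column names; it is illustrative only and not the actual scoring procedure used in this study.

```python
import pandas as pd

# Hypothetical raw responses on a 5-point Likert scale (1-5).
responses = pd.DataFrame({
    "E6": [5, 4, 2],   # "I feel bored in online class." (negatively worded)
    "B2": [5, 5, 3],   # "I get in trouble in the online class." (negatively worded)
    "E1": [4, 5, 3],   # "I like taking the online class." (positively worded)
})

# Items that require reverse coding so that higher scores always
# indicate higher engagement.
reverse_items = ["E6", "B2"]

# On a 1-5 scale, reverse coding maps 1->5, 2->4, 3->3, 4->2, 5->1.
responses[reverse_items] = 6 - responses[reverse_items]

print(responses)
```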
Survey Instrument to Assess Students’ Self-Regulation
A 56-item scale called the Motivated Strategies for Learning Questionnaire
(MSLQ), designed to assess motivation and use of learning strategies by college students,
was developed and field tested by Pintrich & De Groot (1990). The sample size of
N=380 self-reported on motivational beliefs and self-regulated learning strategies on a
seven-point Likert scale, with responses ranging from 1 to 7. The overall measurement is divided into motivational
scales and learning strategy scales. Alpha coefficients for the motivational scales were
strong with a range of .62 to .93, with the six subscales collectively demonstrating good
internal consistency. The Cronbach’s alpha for the original eight items on the self-
regulation subscale was an acceptable .74.
Participants were given the self-regulation subscale of the Motivated Strategies
for Learning Questionnaire (MSLQ) survey instrument that included nine items (Pintrich
et al., 1993). Seven of the nine items were modified to be domain specific in order to
reflect student self-regulation in the online learning environment. For example, an item
that originally read “I often find that I have been reading for class but don’t know what it
is all about” was changed to read “I often find that I have been reading the online class
content but don’t know what it is all about.”
Since seven of the original nine items were modified to be domain specific,
exploratory factor analysis was conducted to determine what, if any, underlying structure
exists for the modified items in relation to the original subscale. An initial internal
consistency coefficient of the modified subscale revealed an acceptable Cronbach’s alpha
of .77. The exploratory factor analysis resulted in two factors with eigenvalues greater
than 1, confirmed by the scree plot (Figure 5).
Figure 5: Scree Plot of Self-Regulation Scale
These two factors accounted for 50.87% of the variance. A Varimax rotation
(Table 3) was used in the factor analysis. The first factor accounted for 35.98% of the
variance (eigenvalue=3.238). This factor consisted of five items from the modified
MSLQ self-regulation subscale (SR 1, 3, 5, 8, and 9). The second factor accounted for
14.88 % of the variance (eigenvalue= 1.339). This second factor was comprised of four
items from the modified MSLQ self-regulation subscale (SR 2, 4, 6, and 7). Factor 3
accounted for 10.02% of the variance (eigenvalue=.902), and the fourth factor accounted
for 8.44% of the variance (eigenvalue=.759); both of these factors were dropped as they
did not meet the minimum eigenvalue of 1.
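The extraction logic described above—retain components with eigenvalues greater than 1, inspect a scree plot, and apply a Varimax rotation—can be illustrated with a short sketch. The snippet below is a generic illustration, not the SPSS procedure used in this study; it assumes a respondents-by-items matrix for the nine modified self-regulation items and uses scikit-learn's FactorAnalysis with a varimax rotation as a stand-in for the principal component extraction reported in Table 3.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# Placeholder data: 194 respondents x 9 self-regulation items (1-5 Likert).
X = rng.integers(1, 6, size=(194, 9)).astype(float)

# Kaiser criterion: eigenvalues of the item correlation matrix > 1.
corr = np.corrcoef(X, rowvar=False)
eigenvalues = np.linalg.eigvalsh(corr)[::-1]        # descending order
n_factors = int((eigenvalues > 1).sum())
print("Eigenvalues:", np.round(eigenvalues, 3), "-> retain", n_factors, "factors")

# Extract the retained factors with a Varimax rotation and inspect loadings.
fa = FactorAnalysis(n_components=n_factors, rotation="varimax", random_state=0)
fa.fit(X)
loadings = fa.components_.T                          # items x factors
print(np.round(loadings, 3))
```

With real data, items would be assigned to the factor on which they show their unique loading above .40, as in Tables 3 and 4.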
Table 3
Factor Analysis for Student Self-Regulation Survey

Factor 1: Metacognitive Strategies (factor loadings in parentheses)
  SR5 Before I start each online session, I think about the things I will need to do in order to learn. (.765)
  SR1 I ask myself questions to make sure I know the material I have been studying. (.713)
  SR8 When I’m learning online, I stop once in a while and go over what I have learned. (.675)
  SR3 I work on optional assignments and exercises in the online class even when I don’t have to. (.622)
  SR9 I work hard to get a good grade even when I don’t like this class. (.432)

Factor 2: Effort Management Strategies (factor loadings in parentheses)
  SR6 I often find that I have been reading the online class content but don’t know what it is all about. (.814)
  SR7 I find that when I read email or correspondence from the online teacher or other students I don’t really pay attention. (.757)
  SR2 I am more likely to give up when I don’t understand something in an online setting (compared to a traditional classroom). (.757)
  SR4 Even when the online study materials are dull and uninteresting, I keep working until I finish. (.523)

Method: Principal Component Analysis.
A relation among the component items emerged that explained the two factors.
Items that loaded on Factor 1 were related to metacognitive strategy use such as thinking
about what needs to be done in order to learn and was thus labeled “metacognitive
strategies.” Items that loaded on Factor 2 appeared to focus on effort management
behaviors, particularly when addressing difficult or boring tasks such as not
understanding materials or not paying attention during a learning session. Moreover, an
initial internal consistency coefficient revealed a Cronbach’s alpha of .689 for factor 1
(SR items 1, 3, 5, 8, and 9) and a Cronbach’s alpha of .734 for factor 2 (SR items 2, 4, 6,
and 7). Since removal of items based on the factor loadings would reduce the Cronbach’s
alpha when compared to the scale as a whole, all items were included. Furthermore, it
should be noted that Pintrich & De Groot (1990) conducted factor analysis on the 22 items that comprised the self-regulated learning strategies scale and found that two cognitive scales emerged—cognitive strategy use and self-regulation. The latter, they noted, was comprised of metacognitive items adapted from Weinstein, Schulte, & Palmer (1987) and Zimmerman & Pons (1986) and effort management items adapted from Zimmerman & Pons (1986). They did not conduct factor analysis on the final self-regulation scale (the 9 items used in this study) and used it as a single scale. With this precedent, the researcher maintained the self-regulation scale as a single scale, and it is used as such in the path analysis discussed in the Results. Moreover, the factor loadings in Table 3 confirm the
underlying constructs that comprised the original self-regulation scale as used on the
MSLQ. To further examine the relationship between these two factors, a Pearson’s
correlation indicated a significant relationship between the two factors (r=.422, p<.001).
Psychometric analysis of this measure is presented in the Results.
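The internal consistency figures reported throughout this chapter are Cronbach's alpha coefficients. For readers unfamiliar with the computation, the following is a minimal sketch of the standard formula, alpha = (k / (k - 1)) * (1 - sum of item variances / variance of the total score); the item data shown are placeholders, not responses from this study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Placeholder: 194 respondents x 5 items (e.g., the metacognitive-strategies factor).
rng = np.random.default_rng(1)
scores = rng.integers(1, 6, size=(194, 5)).astype(float)
print(round(cronbach_alpha(scores), 3))
```

Subscale reliabilities, such as the .689 and .734 reported above, would be obtained by passing only the items that load on each factor.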
Survey Instrument to Assess Students’ Computer Self-Efficacy
A 40-item scale designed to assess web-users self-efficacy was developed and
field tested by Eachus and Cassidy (2006). The sample size of N=141 self-reported on
the domains of information retrieval, information provision, communications, and
Internet technology on a five-point Likert scale. Reliability for each of the
aforementioned domains was indicated by alphas of 0.88 (information retrieval), 0.937
(information provision), 0.747 (communications), 0.876 (Internet technology), and 0.96 for the entire instrument.
Eachus and Cassidy (2006) used a three point scale (novice, intermediate, and
advanced) to assess the discriminant validity of the WUSE scale. An ANOVA confirmed
the differences among the WUSE scores (F=72.60, p<0.001). As well, each domain
was scored based on level of expertise with information retrieval (F=59.84, p<0.001),
information provision (F=80.41, p<0.001), communications (F=24.48, p<0.001), and
Internet technology (F=72.60, p<0.001). In summary, internal reliability for both the
WUSE and the four sub-domains was reported with an alpha coefficient of .96 for the
entire measure and a range of alphas between .75 and .94 for the four sub-domains. These
findings demonstrate acceptable psychometric properties.
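The discriminant-validity check described by Eachus and Cassidy—comparing WUSE scores across self-rated novice, intermediate, and advanced web users with a one-way ANOVA—follows a standard pattern. A hedged sketch using scipy is shown below; the group scores are fabricated placeholders, not Eachus and Cassidy's data.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(2)
# Placeholder WUSE total scores for three self-rated expertise groups.
novice = rng.normal(loc=95, scale=12, size=45)
intermediate = rng.normal(loc=115, scale=12, size=60)
advanced = rng.normal(loc=135, scale=12, size=36)

# One-way ANOVA testing whether mean WUSE scores differ by expertise group.
f_stat, p_value = f_oneway(novice, intermediate, advanced)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```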
Two subscales from the Web-User’s Self-Efficacy scale, “information retrieval”
and “communication,” comprised of 10 items each, were used to measure participants’
computer self-efficacy. The other two subscales were not used in the study as they
addressed computer-use tasks not typically associated with distance learning. For
example, one subscale that was not used is labeled “information provision” and includes
items like “I wouldn’t have any problems creating a simple web page” or “Using ftp to
upload web pages to a server is too complicated for me.” These types of questions
related to skills that are not a part of what a distance learner must be able to do as part of
day to day learning activities. As well, the other subscale that was not used is labeled
“technology” and included items such as “I am not really sure what a modem does” or “I
have no security worries when it comes to buying things over the Internet.” These items
address hardware knowledge such as modems or the use of the Internet for non-
educational purposes such as purchasing products. For the two subscales that were used,
the items were left in their original form; therefore an exploratory factor analysis was not
conducted. However, reliability was calculated with an acceptable Cronbach’s alpha of
.774. Psychometric analysis of the measure is presented in the Results section.
Survey Instrument to Assess Students’ Engagement
Fredricks and her colleagues (2004) developed a 19-item five-point Likert-scale
instrument (1= strongly disagree, 5=strongly agree) measuring three types of
engagement: 1) behavioral engagement, 2) emotional engagement, and 3) cognitive
engagement. The scale was field tested in two waves with wave 1 N=661 and wave 2
N=294, of students in grades 3-5. The internal consistency coefficients (Cronbach’s α)
were found to be high for the emotional engagement scale (α=.83 [Wave 1]; α=.86 [Wave 2]) and the cognitive engagement scale (α=.82 [Wave 1]) and acceptable for the behavioral engagement
scale (α=.67). These findings demonstrate acceptable psychometric properties.
Thirteen of the 19 items were modified in order to be domain specific and reflect
student engagement while in the online learning environment. Specifically, 4 of 5 items
on the behavioral subscale; 6 of 6 items on the emotional subscale; and 3 of 8 items on
the cognitive subscale were modified to fit the online learning domain. For example, one
item that originally read “My classroom is a fun place to be” was changed to read “The
online classroom is a fun place to be”.
Exploratory factor analysis was conducted to determine what, if any, underlying
structure exists for the modified items in relation to the original scale. An initial internal
consistency coefficient of the modified scale revealed an acceptable Cronbach’s alpha of
.77. The exploratory factor analysis resulted in four factors with eigenvalues greater than
1. This factor analysis indicated that items loaded on the particular factor for which they
were intended, including a secondary cognitive engagement factor. These four factors
accounted for 62.02% of the total variance. Table 4 shows the results of the factor
analysis.
Table 4
Factor Analysis for Student Engagement Survey

Factor 1: Emotional Engagement (factor loadings in parentheses)
  E1 I like taking the online class. (.835)
  E2 I feel excited about my work in the online class. (.843)
  E3 The online classroom is a fun place to be. (.864)
  E4 I am interested in the work at the online class. (.853)
  E5 I feel happy in online class. (.804)
  E6 I feel bored in online class. (recoded) (.728)

Factor 2: Cognitive Engagement
  C1 I check my math schoolwork for mistakes. (.576)
  C4 When I read a book, I ask myself questions to make sure I understand what it is about. (.577)
  C6 If I don’t know what a word means when I am reading, I do something to figure it out. (.820)
  C7 If I don’t understand what I read, I go back and read it over again. (.788)
  C8 I talk with people outside of school about what I am learning in the online class. (.627)

Factor 3: Behavioral Engagement
  B1 I follow the rules of the online class. (.650)
  B2 I get in trouble in the online class (missing work, misuse of chat rooms). (recoded) (.755)
  B4 I am able to consistently pay attention when I am taking the online class. (.540)
  B5 I complete my homework on time. (.579)

Factor 4: Metacognitive Engagement
  C2 I study at home even when I don’t have a test. (.650)
  C3 I try to watch TV shows about things we do in the online/virtual school. (.716)
  C5 I read extra books to learn more about things we do in the online/virtual school. (.801)

Note. B = Behavioral Engagement Item; E = Emotional Engagement Item; C = Cognitive Engagement Item. Values in parentheses are unique factor loadings > .40. Factor 1 = Emotional Engagement; Factor 2 = Cognitive Engagement Direct; Factor 3 = Behavioral Engagement; Factor 4 = Metacognitive Engagement.
The first factor accounted for 38.89% of the variance (eigenvalue=7.388). This
factor consisted of all six items on the original emotional engagement subscale. This
factor was labeled emotional engagement as all of the items were related to emotional
engagement.
The second factor accounted for 9.64% of the variance (eigenvalue=1.832) and
consisted of five of the eight items (C1, C4, C6, C7, and C8) on the original cognitive
engagement scale. This factor was labeled cognitive engagement as all of the items
related to cognitive engagement, specifically as related to strategy use versus
metacognitive aspects of learning as will be addressed in factor 4. This factor had a
Cronbach’s alpha of .767 with items 2, 3, and 5 excluded and an alpha of .774 with all
items included. Since removal of items C2, C3, and C5 decreased the reliability of the
scale, all items were included in the cognitive engagement scale.
The third factor accounted for 7.31% of the variance (eigenvalue=1.388) and
consisted of four of the five items (B1, B2, B4, and B5) on the original behavioral
engagement scale. Item B3 (not in table) did not meet the >.40 unique factor loading on
any of the four factors and was dropped (Mertler & Vannatta, 2002). The reliability of the
behavioral engagement scale had a Cronbach’s alpha of .688 with all items and a .699
with item B3 removed.
The fourth factor accounted for 6.18% of the variance (eigenvalue=1.174) and
consisted of three of the eight items (C2, C3, and C5) on the original cognitive
engagement scale. As noted in the discussion of the second factor, this factor was labeled
meta-cognitive engagement as the items in this factor appeared to relate to cognitive
behaviors related to metacognition. When Fredricks and colleagues first developed the
Feelings About School Inventory (2005) the cognitive engagement scale was comprised
of three items and had a low reliability (Cronbach’s alpha=.55). They decided to bolster
the three self-regulation based items by adding five measures of strategy use, resulting in
a higher reliability (Cronbach’s alpha=.82) for the new 8-item scale. Following the
precedent set by Fredricks et al., the cognitive engagement scale was used as a single
scale. Furthermore, reliability analysis resulted in a Cronbach’s alpha of .774 with all
items included and .669 when just comprised of items C2, C3, and C5. Finally, a
Pearson’s correlation revealed a significant relationship between factor 1 and factor 4
(r=.397, p<.01).
In summary, the entire Feelings About School Inventory scale was left intact with
all items included, except item B3, which did not load on any factor and was dropped,
resulting in an overall Cronbach’s alpha of .905.
Survey Instrument to Assess Students’ Expected Achievement
Students were asked to indicate what they expected their Fall 2008 achievement
to be for English, social science, math or science by selecting the traditional letter grade
they expected to receive for that semester (A, B, C, D, F, or incomplete). Since these
were neither self-reported grades nor official school-reported grades, no reliability was
conducted. Moreover, due to logistical issues related to the Family Educational Rights and Privacy Act and the geographic location of the participants, it was not feasible for the
researcher to obtain official grades for participants’ coursework. Additionally, without
official grades, it was not possible to compare self-reported or, in this case, expected
grades to actual grades. In this vein, the dependent variables of English, social science,
math or science expected achievement were treated as indicators of student achievement
expectancy. As purported by Eccles and Wigfield (2002), in expectancy-value theory, both expectancies and values are assumed to directly influence performance. Therefore,
it is assumed that students’ self-reports of expected achievement should indicate some
level of influence on their actual performance.
In terms of the subject content, the researcher was not able to obtain lesson plans,
homework samples, online activities, assessments, and the like. However, a cursory
glimpse of the standards for the two schools studied is offered here. In order to maintain
the anonymity of the school sites, the names of the states have been omitted.
Furthermore, the content standards are in the public domain and are, therefore, not listed in the references section. The list of standards for the four subjects—English, social
science, math, and science—are quite similar. The full description of the content
standards is exhaustive and lengthy. See Appendix C for a truncated list of the standards.
An example of what the content standards contain is given here. Both states have standards that address the following for English: reading fluency, vocabulary development, reading comprehension, literary analysis, writing applications, and so on. For
social science, in this example World History, the content standards call for students to
utilize historical inquiry skills and analysis; recognize significant events, figures, and
contributions of medieval civilizations (separate standards for Islamic, Meso and South
American, and Sub-Saharan African civilizations); causes of The Great War; and so on.
Both states’ standards were similarly aligned for math and science. For example, the
Algebra I standards, again virtually identical, call for students to understand and use the
rules for exponents; to graph linear equations and compute x- and y-intercepts; solve quadratic equations by factoring or completing the square; and so on. The standards content
is similar for science as well, including Life Science, Biology, Chemistry, and Physics.
It should be noted, however, that although both of these states require schools to
align curriculum to the content standards, actual curriculum and delivery thereof is not
known and may differ. Moreover, neither of the states has a prescriptive curriculum. For
example, there is no evidence that all teachers must teach the same content in the exact
same manner, utilizing the same guided and independent practice activities and
assessments. Instead, these are descriptive curricula wherein the teacher has the
discretion to design specific learning activities, tasks, and assessments that may differ
from what another teacher from the same state and teaching the same class may
incorporate.
Procedure
This study was implemented with 194 students enrolled in two virtual high
schools. These instruments were implemented at the end of the Fall 2008 school semester
in order to give students ample time to reflect upon their computer self-efficacy, self-
regulation habits, and to estimate what they believed their grade would be for the Fall
2008 semester. In order to recruit for this study, administrators at each virtual school sent
an email to each parent/guardian of students in the sample explaining the purpose of the
survey, its confidentiality, and that it was not a course requirement. Additionally, students
who completed the online survey had the option of entering a drawing for one iPod
Shuffle ($49) or one of two iTunes gift certificates ($15 each) by clicking a button that
was a part of the online survey. Parents/guardians who granted consent for their student to participate then provided their student with a link to the online survey, which was administered via Qualtrics.
Students who were provided with the link, and who subsequently assented to participate,
completed the online questionnaire that garnered the gender, age, grade level, years
enrolled in distance learning, number of online courses enrolled, hours spent per week in
the online environment and responses to the three measurements described above.
Parents/guardians, students and school administrators were informed that the
responses on the surveys were confidential and that nobody at the school sites—teachers,
counselors, or administrators—would see the responses. Furthermore, parents/guardians
and students were informed that there were no known risks to participating in the survey
and that the data would be kept for three years and then destroyed.
Data Analysis
The data gathered from these surveys were analyzed via SPSS 17.0 in various ways
in order to better understand the relationship between the variables studied: computer
self-efficacy, self-regulation, engagement, and achievement in distance learning. The
researcher used participants’ email as the identifier in the study. Initially, descriptive
analysis was used to fully describe the sample in terms of items such as gender, grade
level, years enrolled in distance learning, number of courses enrolled, subjects taken
online, and estimated grade in English, social science, math, or science. Reliability data
was calculated for the self-regulation and engagement surveys. Factor analysis was used
to determine if the participant surveys showed at least one self-regulation factor and three
engagement factors. As well, a Pearson’s correlation was created to establish if any
significant correlations existed between students’ computer self-efficacy, self-regulation,
engagement, and achievement. Using rules-of-thumb and Cohen’s power index, it was
determined that the sample size was large enough to pick up a medium to large effect size
but provided only marginal power to detect a small effect (Cohen, 1988; Green, 1991; Kline, 1998; Van Voorhis & Morgan, 2007). Next, entry regressions were conducted between each
variable to establish beta values for the purpose of determining direct effects of
endogenous variables on exogenous variables in single path analysis (one for each type of
achievement). Finally, effect decomposition equations were calculated to determine the
indirect effects of computer self-efficacy and self-regulation on achievement.
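Although the analyses were run in SPSS 17.0, the same sequence—scale scores, a Pearson correlation matrix, and regressions whose standardized betas populate the path model—can be sketched in a few lines. The snippet below is a schematic illustration with placeholder data and hypothetical variable names, not the study's syntax or results.

```python
import numpy as np
import pandas as pd
from scipy.stats import pearsonr
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 79  # e.g., participants reporting an expected English grade
df = pd.DataFrame({
    "cse": rng.normal(4.0, 0.5, n),   # computer self-efficacy scale score
    "sr":  rng.normal(3.7, 0.6, n),   # self-regulation scale score
    "emo": rng.normal(3.5, 1.0, n),   # emotional engagement scale score
    "ela": rng.integers(0, 2, n),     # hypothetical expected-achievement indicator
})

# Pearson correlations with p-values (cf. Table 5).
r, p = pearsonr(df["cse"], df["sr"])
print(f"CSE-SR: r = {r:.3f}, p = {p:.3f}")

# Standardized (beta) coefficients: z-score the variables, then run OLS.
z = (df - df.mean()) / df.std(ddof=1)
model = sm.OLS(z["ela"], sm.add_constant(z[["emo", "cse", "sr"]])).fit()
print(model.params.round(3))  # betas used as direct effects in the path model
```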
CHAPTER FOUR: RESULTS
The results of the research described in chapter three are presented here. This
chapter will present the statistical outcomes for the following research questions posed in
chapter three:
1) What is the relationship between distance education learners’ computer self-efficacy, self-regulation, and their behavioral, emotional, and cognitive engagement and expected achievement in English?
2) What is the relationship between distance education learners’ computer self-efficacy, self-regulation, and their behavioral, emotional, and cognitive engagement and expected achievement in social science?
3) What is the relationship between distance education learners’ computer self-efficacy, self-regulation, and their behavioral, emotional, and cognitive engagement and expected achievement in math?
4) What is the relationship between distance education learners’ computer self-efficacy, self-regulation, and their behavioral, emotional, and cognitive engagement and expected achievement in science?
Correlation
A Pearson correlation matrix was created to determine if there is any relationship
between students’ self-regulation, computer self-efficacy, and their engagement and
subsequent expected achievement in distance learning (See Table 5). The results
indicate that there is a strong relationship between self-regulation, computer self-efficacy,
and type of engagement. The motivational variables self-regulation and computer self-efficacy were significantly correlated with each other (r=.236, p<.001), suggesting that
participants with high levels of self-regulation were more likely to exhibit high levels of
computer self-efficacy in the distance learning environment. The engagement variables—
behavioral, emotional, and cognitive—were significantly correlated with each other. Self-
regulation was significantly correlated with all three engagement variables (behavioral,
r=.730, p<.01, emotional, r=.752, p<.01, and cognitive, r=.698, p<.01, respectively),
demonstrating that those participants who had high levels of self-regulation were more
likely to demonstrate higher levels of engagement in the distance learning setting.
Computer self-efficacy correlated significantly with all three engagement variables
(behavioral, r=.205, p<.01, emotional, r=.157, p<.05, and cognitive, r=.203, p<.01,
respectively), revealing that those participants who had high levels of computer self-
efficacy were more likely to exhibit higher levels of engagement in the distance learning
environment setting.
Table 5
Correlation between Computer Self-Efficacy, Self-Regulation, Engagement, and Expected Achievement

Variable        Mean   SD     1       2       3       4       5       6       7       8      9
1. CSE          4.07   .461   -
2. SR           3.69   .639   .236**  -
3. Behavioral   4.06   .689   .205**  .730**  -
4. Emotional    3.49   1.06   .157*   .752**  .641**  -
5. Cognitive    3.38   .665   .203**  .698**  .549**  .537**  -
6. English      .937   .245   .145    .065    .117    .236*   -.059   -
7. Soc. Sci.    .900   .302   .311**  .182    .166    .221    .156    .271    -
8. Math         .711   .457   .221    .269*   .350**  .296**  .080    .248    .247    -
9. Science      .853   .358   .398**  .328**  .349**  .269**  .308*   .537**  .543**  .287   -

**. Correlation is significant at the .01 level (2-tailed).
*. Correlation is significant at the .05 level (2-tailed).
Note: English Achievement n=79, Social Science Achievement n=70, Math Achievement n=76, Science Achievement n=61, all other variables n=194.
With the exception of emotional engagement, there were not any significant
correlations between English achievement and self-regulation, computer self-efficacy,
behavioral engagement or cognitive engagement. The correlation between emotional
engagement and English achievement was significant (r=.236, p<.05), implying that the 79 participants who reported a grade for an English class in the distance learning environment were more likely to report a high level of achievement when they demonstrated a high level of emotional engagement.
Social science achievement did not correlate significantly with any of the
variables except computer self-efficacy (r=.311, p<.01), suggesting that the 70 participants who reported a grade for a social science class were more likely to report a
high level of achievement in this class when they also reported a high level of computer
self-efficacy.
There was a significant correlation between math achievement and self-regulation
(r=.269, p<.05), behavioral engagement (r=.350, p<.01) and emotional engagement
(r=.296, p<.01). This indicates that the 76 students who reported a grade for math class in
the distance learning environment were more likely to express a high level of
achievement when they also reported high levels of self-regulation, behavioral engagement,
and emotional engagement.
Among the 61 students who reported a grade for science class in the distance
learning environment, strong correlations were found between their achievement in this
class and computer self-efficacy (r=.398, p<.001), self-regulation (r=.328, p<.01),
behavioral engagement (r=.349, p<.01), emotional engagement (r=.269, p<.05), and cognitive engagement (r=.308, p<.05). Unlike English, social science, and math, science was the
only subject variable that indicated a significant correlation with other subjects. Of the 61
students who reported science achievement, 47 also reported English achievement
(r=.537, p<.001), and 40 also reported social science achievement (r=.543, p<.001). This
suggests that these students were more likely to indicate a high level of achievement in
science as well as English and social studies when they also reported high levels of
computer self-efficacy, self-regulation, behavioral engagement, emotional engagement,
and cognitive engagement.
The following research questions were analyzed using single path analysis. A
review of the literature revealed that there is not a high degree of agreement on how to
conduct a power analysis or consensus on the exact number of subjects required per
variable (Green, 1991; Van Voorhis & Morgan, 2007). However, there are rules of thumb
and formulas that suggest a reasonable range of subjects per variable. For example,
Green (1991) suggests a formula of N ≥ 50 + 8m, where m is the number of variables in the model. In this study, the formula was N ≥ 50 + 8(5) = 90. Van Voorhis & Morgan (2007) support Green’s formula, but also suggest Harris’s (1985) as an alternative, which is N ≥ m + 50, where m is the total number of predictor variables. In this study, the formula was N ≥ 5 + 50 = 55. Similarly, Kline (1998) suggests that the sample size should be 10(m) or, ideally, 20(m), where m = number of predictors. Again, for this study, the formulas were
10(5)=50 and 20(5)=100, respectively. Finally, Cohen (1988) outlined three types of
sample sizes based on the effect sizes of small, medium, and large based on power
65
analysis. The sample sizes are similar to Green’s formula for medium and large effect
sizes, but begin to separate for small effect once the number of predictors exceeds five.
Using both Cohen’s (1988) power analysis and both Kline’s (1998) Green’s (1991)
formula as guides, it was determined that the sample sizes for the four path analysis that
follow were large enough to determine a large effect size. However, the sample sizes did
not have enough power to warrant factor analysis as Comrey and Lee (Comrey & Lee,
1992) suggest 300 as “good” and the minimum and 50 as very poor; 100 as poor; and 200
as fair. The sample sizes used in the following path analyses ranged between 61 and 79.
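As a purely illustrative note (not part of the original analysis), these rules of thumb are simple arithmetic and can be reproduced directly for the five predictors used here; the function names below are invented for the sketch.

# Rule-of-thumb minimum sample sizes for a regression with m predictors,
# reproducing the thresholds quoted above for m = 5.
def green_1991(m):            # N >= 50 + 8m
    return 50 + 8 * m

def harris_1985(m):           # N >= m + 50
    return m + 50

def kline_1998(m, per=10):    # 10 cases per predictor, or ideally 20
    return per * m

m = 5
print(green_1991(m))          # 90
print(harris_1985(m))         # 55
print(kline_1998(m))          # 50
print(kline_1998(m, per=20))  # 100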
Research Question 1
What is the relationship between distance education learners’ computer self-
efficacy, self-regulation, and their behavioral, emotional, and cognitive engagement and
expected achievement in English?
In order to further explore these relationships, a single path analysis was
conducted to determine the direct and indirect effects of computer self-efficacy (CSE),
self-regulation (SR), behavioral engagement (BEH), emotional engagement (EMO), and
cognitive engagement (COG) on expected achievement in English (ELA) for the 79
participants who reported an expected grade for a distance learning English class (i.e.,
English 1 or 9th grade, English 2 or 10th grade, etc.). An entry regression was run between
each variable on the path model (see Figure 6), and the resulting beta values were then used
to establish the direct effects between the variables and, subsequently, to calculate effect
decompositions that estimate the indirect effects of computer self-efficacy and self-regulation
on expected achievement in English (see Table 6).
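A minimal sketch of how such standardized betas can be obtained is shown below; it assumes the scale scores are already held in NumPy arrays with the hypothetical names cse, sr, and beh, and it is an illustration under those assumptions rather than the original analysis code.

import numpy as np

def standardized_betas(X, y):
    """OLS on z-scored predictors and outcome, so the fitted coefficients
    are standardized betas like those reported on the path models."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    yz = (y - y.mean()) / y.std(ddof=1)
    Xz = np.column_stack([np.ones(len(yz)), Xz])      # add intercept column
    coefs, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
    return coefs[1:]                                  # drop the intercept

# Hypothetical usage for one equation of the model: behavioral engagement
# regressed on computer self-efficacy and self-regulation.
# betas = standardized_betas(np.column_stack([cse, sr]), beh)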
Figure 6: Path Model for Expected Achievement in English
Note. *=p<.05; **=p<.01; ***=p<.001
[Figure 6 displays the following standardized path coefficients: CSE-SR = .254; CSE->Behavioral = -.035; CSE->Emotional = .105; CSE->Cognitive = .314**; SR->Behavioral = .66***; SR->Emotional = .777***; SR->Cognitive = .761***; Behavioral->English = .098; Emotional->English = .236*; Cognitive->English = -.059]
The results indicate that there was not a significant correlation between computer
self-efficacy and self-regulation. However, there was a significant direct effect of computer
self-efficacy on cognitive engagement, suggesting that students with high levels of CSE
were more likely to engage on a cognitive level. There were also significant direct effects of
self-regulation on behavioral, emotional, and cognitive engagement, implying that students
who self-regulate were more likely to engage in the distance learning environment on all
three levels. Lastly, a significant direct effect was found between emotional engagement and
expected achievement in English, implying that students who reported positive feelings
about English class were more likely to report higher achievement. There were no
significant direct effects of computer self-efficacy on behavioral or emotional engagement,
or of behavioral and cognitive engagement on English achievement.
Table 6 Effect Decompositions on English Achievement
Computer Self-Efficacy on English Achievement
CSE->BEH->ELA is -.035*.098 = -.003
CSE->EMO->ELA is .105*.236 = .025
CSE->COG->ELA is .314*-.059 = -.019
CSE->SR->BEH->ELA is .254*.660*.098 =.016
CSE->SR->EMO->ELA is .254*.777*.236 =.047
CSE->SR->COG->ELA is .254*.761*-.059 =-.011
Total indirect effect of CSE on ELA= .055
------------------------------------------------------------
Self-Regulation on English Achievement
SR->BEH->ELA is .66*.098 = .065
SR ->EMO->ELA is .777*.236 = .183
SR ->COG->ELA is .761*-.059 = -.045
SR -> CSE ->BEH->ELA is .254*-.035*.098 = -.0009
SR -> CSE ->EMO->ELA is .254*.105*.236 =.006
SR -> CSE ->COG->ELA is .254*.314*-.059 =-.006
Total indirect effect of SR on ELA= .202
As evident in Table 6, computer self-efficacy did not have a strong indirect
effect on English achievement through any of the engagement variables taken separately;
when the paths through self-regulation were added, the total indirect effect of CSE on
English achievement rose only marginally, to .055. Conversely, self-regulation had a much
stronger indirect effect on English achievement, especially through the path with emotional
engagement as the mediator variable. However, the paths that also passed through CSE
were considerably weaker than those involving self-regulation alone. The overall indirect
effect of self-regulation on English achievement was .202, suggesting that self-regulation
was not a significant predictor of achievement in distance learning English.
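To make the arithmetic behind Table 6 explicit, the sketch below multiplies the Figure 6 path coefficients along each route from computer self-efficacy to English achievement; the dictionary and variable names are invented for this illustration and are not part of the original analysis.

# Standardized path coefficients read from Figure 6.
coef = {
    ("CSE", "SR"): .254, ("CSE", "BEH"): -.035, ("CSE", "EMO"): .105, ("CSE", "COG"): .314,
    ("SR", "BEH"): .66, ("SR", "EMO"): .777, ("SR", "COG"): .761,
    ("BEH", "ELA"): .098, ("EMO", "ELA"): .236, ("COG", "ELA"): -.059,
}

def indirect(*nodes):
    """Indirect effect along a path = product of the coefficients on its links."""
    effect = 1.0
    for a, b in zip(nodes, nodes[1:]):
        effect *= coef[(a, b)]
    return effect

paths = [("CSE", m, "ELA") for m in ("BEH", "EMO", "COG")]
paths += [("CSE", "SR", m, "ELA") for m in ("BEH", "EMO", "COG")]
total = sum(indirect(*p) for p in paths)
print(round(total, 3))  # about .054; Table 6 reports .055 because each term is rounded first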
Research Question 2
What is the relationship between distance education learners’ computer self-
efficacy, self-regulation, and their behavioral, emotional, and cognitive engagement and
expected achievement in social science?
In order to further examine these relationships, a single path analysis was
conducted to determine the direct and indirect effects of computer self-efficacy (CSE),
self-regulation (SR), behavioral engagement (BEH), emotional engagement (EMO), and
cognitive engagement (COG) on expected achievement in social science (SSA) for the 70
participants who reported a grade for a distance learning social science class (i.e., World
History, U.S. History, Economics, etc.). An entry regression was run between each variable
on the path model (see Figure 7), and the resulting beta values were then used to determine
the direct effects between variables and to calculate effect decompositions that explore the
indirect effects of both computer self-efficacy and self-regulation on social science
achievement (see Table 7).
Figure 7: Path Model for Expected Achievement in Social Science
Note. *=p<.05; **=p<.01; ***=p<.001
[Figure 7 displays the following standardized path coefficients: CSE-SR = .208; CSE->Behavioral = .165; CSE->Emotional = .138; CSE->Cognitive = .259**; SR->Behavioral = .681***; SR->Emotional = .811***; SR->Cognitive = .77***; Behavioral->Social Science = .139; Emotional->Social Science = .221; Cognitive->Social Science = .156]
The results indicate that a significant correlation did not exist between computer
self-efficacy and self-regulation. Nevertheless, there were significant direct effects between
computer self-efficacy and cognitive engagement and between self-regulation and all three
engagement types; however, there was not a significant relationship between any of the
engagement types and social science achievement. Overall, this suggests that students'
levels of engagement did not directly affect their achievement. Effect decompositions were
then calculated to further examine the indirect effects computer self-efficacy and
self-regulation had on social science achievement.
Table 7 Effect Decompositions on Social Science Achievement
Effect decompositions CSE on SSA
CSE->BEH-> SSA is .165*.139 = .023
CSE->EMO-> SSA is .138*.221 = .03
CSE->COG-> SSA is .259*.156 = .04
CSE->SR->BEH-> SSA is .208*.681*.139 =.02
CSE->SR->EMO-> SSA is .208*.811*.221 =.04
CSE->SR->COG-> SSA is .208*.77*.156 =.025
Total indirect effect = .178
------------------------------------------------------------
Effect decompositions SR on SSA
SR ->BEH-> SSA is .681*.139 = .095
SR ->EMO-> SSA is .811*.221 = .18
SR ->COG-> SSA is .77*.156 = .12
SR -> CSE ->BEH-> SSA is .208*.165*.139 =.005
SR -> CSE ->EMO-> SSA is .208*.138*.221 =.006
SR -> CSE ->COG-> SSA is .208*.259*.156 =.008
Total indirect effect = .414
Table 7 shows that computer self-efficacy did not have any significant indirect
effect via any of the paths examined nor did the overall indirect effect suggest that CSE
significantly predicts achievement in social science, as the total indirect effect was .178.
As well, self-regulation was not a significant predictor of social science achievement and
showed a decrease in effect when CSE was included in the decomposition for a total
indirect effect of .414. The results for this path model suggest that self-regulation
significantly related to students’ levels of engagement—all three types—but did not
relate to achievement.
Research Question 3
What is the relationship between distance education learners’ computer self-
efficacy, self-regulation, and their behavioral, emotional, and cognitive engagement and
expected achievement in math?
A single path analysis was conducted in order to determine the direct and indirect
effects of computer self-efficacy (CSE), self-regulation (SR), behavioral engagement
(BEH), emotional engagement (EMO), and cognitive engagement (COG) on expected
achievement in math (MAA) for the 76 participants who reported a grade for a distance
learning math class (i.e., Algebra 1, Geometry, Algebra 2, etc.). An entry regression was
run between each variable on the path model (see Figure 8), and the resulting beta values
were then used to calculate effect decompositions (see Table 8).
Figure 8: Path Model for Expected Achievement Math
Note. *=p<.05; **=p<.01; ***=p<.001
[Figure 8 displays the following standardized path coefficients: CSE-SR = .248*; CSE->Behavioral = .205; CSE->Emotional = .193; CSE->Cognitive = .208; SR->Behavioral = .689***; SR->Emotional = .775***; SR->Cognitive = .708***; Behavioral->Math = .337**; Emotional->Math = .296**; Cognitive->Math = .08]
As indicated in Figure 8, computer self-efficacy did not have a significant direct
effect on any of the types of engagement. On the other hand, self-regulation had strong
direct effects on all three engagement types (p<.001). Both behavioral and emotional
engagement had strong direct effects on math achievement (p<.01). This implies that
students with high levels of self-regulation were more likely to report high levels of
engagement and, in the case of behavioral and emotional engagement, higher levels of
achievement. In order to further explore the indirect effects of computer self-efficacy and
self-regulation on math achievement, effect decompositions were calculated.
Table 8 Effect Decompositions on Math Achievement
Effect decompositions CSE on MAA
CSE->BEH-> MAA is .205*.337 = .069
CSE->EMO-> MAA is .193*.296 = .057
CSE->COG-> MAA is .208*.08 = .017
CSE->SR->BEH-> MAA is .248*.689*.337 =.058
CSE->SR->EMO-> MAA is .248*.775*.296 =.057
CSE->SR->COG-> MAA is .248*.708*.08 =.014
Total indirect effect = .272
---------------------------------------------------------------
Effect decompositions SR on MAA
SR ->BEH-> MAA is .689*.337 = .232
SR ->EMO-> MAA is .775*.296 = .23
SR ->COG-> MAA is .708*.08 = .057
SR -> CSE ->BEH-> MAA is .248*.205*.337 =.017
SR -> CSE ->EMO-> MAA is .248*.193*.296 =.014
SR -> CSE ->COG-> MAA is .248*.208*.08 =.004
Total indirect effect = .554
As indicated in Table 8, computer self-efficacy did not have a strong indirect
effect on math achievement either as the sole exogenous variable or when coupled with
self-regulation. However, self-regulation had a much stronger total indirect effect. Still, none
of the individual paths showed a significant indirect effect. With overall indirect effects
of .272 and .554, respectively, neither computer self-efficacy nor self-regulation was a
strong indirect predictor of math achievement.
Research Question 4
What is the relationship between distance education learners’ computer self-
efficacy, self-regulation, and their behavioral, emotional, and cognitive engagement and
expected achievement in science?
A single path analysis was conducted in order to determine the direct and indirect
effects of computer self-efficacy (CSE), self-regulation (SR), behavioral engagement
(BEH), emotional engagement (EMO), and cognitive engagement (COG) on science
achievement (SCA) for the 61 participants who reported a grade for a distance learning
science class (i.e., Biology, Physics, Chemistry, etc.). An entry regression was run between
each variable on the path model (see Figure 9), and the resulting beta values were then used
to calculate effect decompositions (see Table 9).
Figure 9: Path Model for Expected Achievement in Science
Note. *=p<.05; **=p<.01; ***=p<.001
[Figure 9 displays the following standardized path coefficients: CSE-SR = .42**; CSE->Behavioral = .309*; CSE->Emotional = .272*; CSE->Cognitive = .346**; SR->Behavioral = .685***; SR->Emotional = .848***; SR->Cognitive = .832***; Behavioral->Science = .326**; Emotional->Science = .269**; Cognitive->Science = .308*]
As revealed in Figure 9, significant direct relationships existed between all
variables in the model. The strongest were between self-regulation and the three engagement
types, where p<.001. Also, unlike in the preceding path models, the relationship between
computer self-efficacy and self-regulation was significant (p<.01). These results suggest
that the students enrolled in distance learning science were more likely to report a high
level of achievement when they also reported high levels of behavioral, emotional, and
cognitive engagement, self-regulation, and computer self-efficacy. In order to further
explore the indirect effects of computer self-efficacy and self-regulation on science
achievement, effect decompositions were calculated.
Table 9 Effect Decompositions on Science Achievement
Effect decompositions CSE on SCA
CSE->BEH-> SCA is .309*.326 = .101
CSE->EMO-> SCA is .272*.296 = .08
CSE->COG-> SCA is .346*.308 = .107
CSE->SR->BEH-> SCA is .42*.685*.326 =.094
CSE->SR->EMO-> SCA is .42*.848*.269 =.096
CSE->SR->COG-> SCA is .42*.832*.308 =.108
Total indirect effect = .586
------------------------------------------------------------
Effect decompositions SR on SCA
SR ->BEH-> SCA is .685*.326 = .223
SR ->EMO-> SCA is .848*.296 = .251
SR ->COG-> SCA is .832*.308 = .256
SR -> CSE ->BEH-> SCA is .42*.309*.326 =.042
SR -> CSE ->EMO-> SCA is .42*.272*.269 =.031
SR -> CSE ->COG-> SCA is .42*.346*.308 =.045
Total indirect effect = .848
The effect decompositions for computer self-efficacy on science achievement
suggest that CSE was not a significant predictor of science achievement when the three
types of engagement functioned as mediator variables. In fact, once CSE was added to the
decomposition, the three types of engagement showed a considerable decrease in their effect
on science achievement as compared to their direct effects. Similarly, when self-regulation
was added to the effect decomposition, the indirect effects showed an even greater decrease.
However, when the total indirect effect was calculated, .586, computer self-efficacy showed
a reasonable indirect effect on science achievement. As well, the total indirect effect of
self-regulation on science achievement, .848, was considerably larger than any of the other
total indirect effects. This suggests that both computer self-efficacy and self-regulation
were significant predictors of science achievement when all three types of engagement
functioned as mediator variables.
Summary
Three different measurements were used in this study to examine the relationship
between students’ computer self-efficacy and self-regulation and their engagement in
distance learning and their achievement in distance learning English, social science,
math, or science. Participants completed an online survey that contained a modified
version of the self-regulation subscale from the Motivated
Strategies for Learning Questionnaire (MSLQ) survey instrument that included nine
items (Pintrich et al., 1993); a modified version of the Feelings About School Inventory
(Fredricks et al., 2005) that measured behavioral, emotional, and cognitive engagement;
and two unmodified subscales from the Web-Users Self-Efficacy survey that measured
student computer self-efficacy (Eachus & Cassidy, 2006).
Exploratory factor analysis and reliability data were used to ensure that the survey
questions aligned with the factors being studied and that the instruments were reliable. A
Pearson's correlation was conducted to establish the observed correlations between the
variables being examined. Path analyses were used to examine the relationships between
the variables. In order to conduct the path analyses, entry regression was used to create
path coefficients for the purpose of writing effect decompositions and calculating the
indirect effects of computer self-efficacy and self-regulation on English, social science,
math, and science achievement.
Multiple regression analysis indicated that there was a significant relationship
between computer self-efficacy and self-regulation in the math and science achievement
models, but not in the English and social science models. Self-regulation was significantly
related to the three types of engagement in all path models. Emotional engagement was the
only engagement type that related to English achievement; none of the engagement types
significantly related to social science achievement; behavioral and emotional engagement
significantly related to math achievement; and all three types of engagement significantly
predicted science achievement.
Path analysis indicated that computer self-efficacy showed a significant indirect
effect on science achievement, and self-regulation had significant indirect effects on
English, math, and science achievement when it was the primary exogenous variable.
CHAPTER 5: DISCUSSION
This chapter discusses the limitations of this study, the results, the conclusions
and implications, and suggestions for future research. The purpose of this study was to
examine possible relationships between the motivational and learning variables of computer
self-efficacy, self-regulation, and engagement (behavioral, emotional, and cognitive) of
students enrolled in distance learning. Additionally, this study was intended to gain
insight into how the above-mentioned variables relate to student achievement in distance
learning, specifically in the subject areas of English, social science, math, and science.
The results revealed that computer self-efficacy was not significantly related to any type
of engagement or achievement. Self-regulation was significantly related to all three types
of engagement and to achievement for some of the subjects. Finally, all three types of
engagement were significantly correlated with achievement in at least one of the subjects.
Limitations of the Study
There are four limitations of the study that should be mentioned. First, selection
bias is a concern as participation was voluntary, suggesting that those who chose to
participate might have had higher levels of motivation than other students; therefore, the
study would not capture the effects of low motivation. This can be prevented in future
studies by recruiting the sample on a random basis.
As well, the expected achievement responses used in the study are based on
self-reported expected grades for the four subject areas examined (English, social science,
math, and science) and may not match the final, official grades given by the participants'
respective distance learning teachers. The limits of self-reported grades should therefore be
considered. Kuncel and colleagues (2005) assert that actual grades are difficult to obtain
and that researchers should consider the effect of error in self-reported grades. Their
meta-analysis attributes the error in self-reported grades primarily to random error, to
differences between students' perceptions of what they have learned and the grade-reporting
system used by a school, and to deliberate misrepresentation of grades by students of low
ability. Ultimately, they conclude that students who possess high ability report grades that
are commensurate with their actual grades and with scores on well-known tests such as the
ACT; conversely, they purport, students who have low skill levels are more apt to
over-report their grades (Kuncel et al., 2005). However, it was not feasible for the
researcher to obtain actual grades for this study. It is therefore suggested that future studies
seek to obtain official transcripts or teacher-given coursework grades, which could be used
to compare student-reported grades against official grades or to eliminate entirely the
possibility of students reporting inaccurate grades.
Chapter three described, based on the state standards for the two states in which the
two virtual schools reside, the specific content of the subjects used for the expected
achievement outcomes in this study. However, because of the lack of specific supporting
documents (i.e., lesson plans, learning activities, assessments, etc.), it is not possible to
analyze content differences as related to computer self-efficacy, self-regulation, and
engagement except in general terms. This is addressed later in this section.
Furthermore, the original Motivated Strategies for Learning Questionnaire
(MSLQ) used a seven-point Likert scale (Pintrich et al., 1993) and was modified to a
five-point scale in order to remain consistent with the five-point Likert scales used on the
original Web Users Self-Efficacy scale (Eachus & Cassidy, 2006) and the Feelings About
School Inventory (Fredricks et al., 2005). However, the internal consistency of the modified
MSLQ remained acceptable, with a Cronbach's alpha of .77.
It should also be noted that non-significant results may have been due to the small
sample size and the possibility of a Type II error, wherein the correlation between variables
may actually be significant, but the calculated results indicated that there was no
significant correlation. Again, the rule-of-thumb calculations indicate that there was
sufficient power, or sample size, to detect large effects but not small ones.
Synthesizing the Results
This study sought to determine if a relationship existed between the motivational
and learning variables of computer self-efficacy and self-regulation and behavioral,
emotional, and cognitive engagement as well as subsequent achievement in the distance
learning setting at the high school level. Prior research has not examined the
aforementioned variables in a single study at any grade level (i.e. higher education, high
school, etc.). However, prior research has examined the role of computer self-efficacy
and self-regulation in the distance learning environment, mainly at the higher education
setting. Previous studies have indicated that computer self-efficacy is related to self-
regulation and engagement in distance learning (Bates & Khasawneh, 2007; Eachus &
Cassidy, 2006; Hasan, 2003; Stone & Henry, 2003). As well, previous research has
indicated that self-regulation plays a vital role in the success of students enrolled in
distance learning (Dembo, et al., 2008; Lee & Lee, 2008; Tuckman, 2007; Wang & Lin,
2007; and Whipp & Chiarelli, 2004). It was predicted that high levels of computer self-
efficacy and self-regulation would predict levels of engagement and subsequent
achievement of students enrolled in distance learning.
In the literature review section of this study, it was concluded that high levels of
computer self-efficacy are determined by antecedent variables such as
degree of computer use, acquired skill, training, and ease of system (Bates & Khasawneh,
2007; Eachus & Cassidy, 2006; Hasan, 2003; and Stone & Henry, 2003). Additionally,
attitude and training were also significant predictors of computer self-efficacy (Karsten &
Roth, 1998; and Zhang & Espinoza, 1993). In alignment with Bates and Khasawneh
(2007), computer self-efficacy related significantly to the three types of engagement. This
suggests that students who have higher levels of computer self-efficacy were more likely
to engage as related to the three types of engagement. It would make sense, then, that if
students felt they could access and use the learning software, they engaged behaviorally,
leading to positive feelings about the experience (emotional engagement) and could then
engage with the curriculum on a cognitive level, thus, using the software tools to
complete learning tasks. However, one consideration may be that as the software used in
distance learning becomes easier to use, the demands on computer self-efficacy may not
be as great as they were in previous studies, in which the software used in the distance
learning setting was more challenging.
A second consideration may be that the computer self-efficacy measure needs to
be updated so as to include survey items that are aligned with current software and
computing practices. For example, earlier computer self-efficacy studies (Cassidy &
Eachus, 2002; Compeau & Higgins, 1995) looked at computing tasks such as using email
or creating a webpage, which were complicated and involved tasks when these studies
were conducted. Since the time of these studies, the ease of use for email and making a
webpage has increased significantly. Whereas in the past, a computer user would need to
83
know how to use hyper-text mark-up language (HTML) in order to make a webpage,
nowadays, users can use a webpage editor in order to create a sophisticated webpage and
literally know nothing about HTML. In this sense, the computer self-efficacy measure
may contain items that address computing tasks that were more difficult at the time the
measurement was created, but have since become more facile, thus giving an unclear
picture about users’ computer self-efficacy.
Along these lines, this study did not find computer self-efficacy to be a
significant predictor of achievement when examined via path analysis. This relationship
has not been examined in prior research, as the study of computer self-efficacy has
centered on antecedent variables such as prior experience, attitudes, and training. One
possible reason is that although a student may feel highly confident about her ability to use
technology in the distance learning setting, it does not necessarily mean she will learn new
material. Furthermore, the subject being studied may not be any easier to learn in an online
setting and, in fact, may be more difficult than in a
decrease the level of computer self-efficacy needed to access the content and may
facilitate engagement with the content, but not translate into learning. It is suggested that
future studies examine achievement in a given subject area by using a control group to
determine if the online environment makes learning a subject more, less, or equally
difficult as compared to a traditional setting.
As well, previous research has indicated that in the absence of a real time teacher,
self-regulation plays a vital role in the success of students enrolled in distance learning
(Dembo, Junge, & Lynch, 2008; Tuckman, 2007) and that collaboration is a crucial
component in building one's self-regulation in the distance learning setting (Lee & Lee,
2008; Wang & Lin, 2007; and Whipp & Chiarelli, 2004). This study found that self-
regulation related to computer self-efficacy as these two were used as exogenous
variables in the path analysis. Like computer self-efficacy, self-regulation, by way of path
analysis, significantly predicted all three types of engagement, supporting previous
research that examined this relationship (Dembo et al., 2008; Tuckman, 2007; Whipp &
Chiarelli, 2004). These previous studies indicate that the social environment of distance
learning has a profound effect on students’ level of self-regulation. In this study, the role
of the teacher is one of high visibility and students’ access to other students in the online
environment is encouraged. It would make sense, then, that having a strong, respected adult
model who provides timely feedback and encourages collaboration would positively
influence student self-regulation. Moreover, the learning management systems
used in distance learning incorporate weekly announcements and calendars, which might
decrease the likelihood of a student procrastinating (Tuckman, 2007).
As stated previously, the majority of distance learning research has focused on the
antecedents to and components of computer self-efficacy and on the role of self-regulation
in this setting. It is curious that there are not any specific studies that examine the role of
engagement in the distance learning setting. However, it should be noted that some
previous studies of self-efficacy and self-regulation imply behavioral, emotional, and
cognitive engagement as underlying variables at work (Eachus & Cassidy, 2006; Hasan,
2003; McMahon & Oliver, 2001; Tuckman, 2007). Specifically, self-regulation can be seen
as an act of engagement in itself: a learner engages in observable behaviors, indicates
positive, negative, or neutral emotions about the learning experience, and must be
cognitively engaged in order to plan and monitor a learning task.
The results of this study are consistent with previous studies in that self-
regulation positively correlated with all types of engagement. Future studies could
examine the role of engagement in a more overt way in order to shed some light on the
role of engagement in distance learning. However, in relation to achievement, and unlike
previous studies, self-regulation did not correlate with achievement in path analysis, with
the exception of achievement in math. It is unclear why self-regulation did not correlate
with achievement as was the case in the Pearson’s correlation matrix and multiple
regression analysis. Even with engagement removed as a mediator variable, self-
regulation only correlated with math achievement. One reason for this might be the
nature of the subject being studied. For example, math is based highly on procedural
knowledge. Daily or weekly tasks are uniform in nature—students view the
lesson, complete practice problems, do the homework problems, and take unit quizzes or
tests. Conversely, in a class like English, the day-to-day activities may range from
reading to writing essays to learning vocabulary, thus, creating a somewhat inconsistent
task routine. It can be argued that the relative routines of math may facilitate the learner’s
ability to self-regulate when compared to a more nebulous schedule as seen in an English
class. Subsequent studies could examine the relationship between self-regulation and the
subject being studied.
Engagement has not been examined exclusively in prior studies, but has been
examined inclusively in the recent work of Bates and Khasawneh (2007), where they
found that high levels of computer self-efficacy lead to high levels of engagement.
Additionally, they purported that students who spend more time online were more likely
to be engaged in the learning task. Consistent with their findings, levels of all types of
engagement were higher when preceded by higher levels of computer self-efficacy.
Moreover, engagement was found to be a significant predictor of achievement
in each of the four subject areas examined in multiple regression analyses. Specifically,
behavioral engagement was revealed to be a considerable predictor of math and science
achievement; emotional engagement was shown to be a key predictor of English, math,
and science achievement; and cognitive engagement was a strong predictor of science
achievement.
One reason that behavioral engagement correlated with math and science and not
English and social studies achievement may be that the presentation of and access to
humanities curriculum are not drastically different from how it is presented in a traditional setting.
instance, reading a work of literature in the online setting is not drastically different from
reading it in the traditional setting—in many cases, students read literature in physical
books regardless of setting. Furthermore, reading text online may not be significantly
different so as to impact the level of achievement. On the other hand, science classes
typically have a laboratory or hands-on component to the learning process. The online
learning software utilizes virtual laboratories for science and animated concepts for math;
therefore, students are required to engage, behaviorally speaking, in a different way than
they would in a traditional setting. Prospective studies could examine the role that
software plays in engaging students in certain subjects.
As noted, emotional engagement was a strong predictor of achievement in all
subjects but social science. This finding is in line with previous research showing that the
more positively a student feels about a learning situation, the more both self-efficacy and
self-regulation are elevated (Bandura, 1986; Zimmerman, 1991), and students who are
highly self-regulated tend to accomplish learning tasks. The same reasons offered for
behavioral engagement predicting achievement in math and science may apply to emotional
engagement, as students who are able to participate in laboratory or hands-on activities via
online learning may feel good about the experience.
Finally, cognitive engagement was positively correlated with only science
achievement. Previous research has not examined the relationship between cognitive
engagement and achievement in distance learning. However, it may be that of the four
subjects examined in this study, science may require higher degrees of cognitive
engagement. Whereas English and social science are heavily text-based, reading-oriented
classes, the science curriculum requires the synthesis of reading, writing, and problem
solving. Therefore, students may have had to, or actively chose to, go above and beyond
the content of the class and access supplemental materials via television, Internet, reading
materials, and the like to further enhance their learning. This would imply two
assumptions: first, highly self-regulated students would be more likely to do this, and
second, students would execute these behaviors in the form of behavioral engagement
(i.e. watching the program, reading the book, or viewing the website) and emotional
engagement because they would feel happy and excited, and be interested in the subject
via distance learning. This is supported in the path analysis as both behavioral and
emotional engagement significantly predicted science achievement. Future studies could
examine the specific way curriculum is delivered within the online environment and how
it relates to engagement and subsequent achievement.
Implications
This study supports previous research indicating that computer self-efficacy is
positively connected to self-regulation and engagement, suggesting that prior experience,
ease of use of software, and training are vital in preparing students to use distance
learning software and tools. However, this study also suggests that as the ease of use of
software increases, the demands on computer self-efficacy may decrease, thus making it
more likely that a learner will be successful in completing distance learning tasks. This
implies that distance learning software developers should consider how much prior
computer knowledge, experience, and training is required in order to use the software
they create. As well, distance learning educators should continue to provide students with
opportunities for performance accomplishments in the form of simple learning tasks early
in the learning process; vicarious experiences (i.e., modeling by the teacher or a fellow
student); verbal persuasion (i.e., live chat, video conferencing, or podcasts); and a positive
physiological state. The last of these may be the most difficult to control, as each distance
learner's environment is unique: home, library, or café.
Self-regulation was significantly correlated with all types of engagement. This
suggests that teachers should model self-regulation techniques as a way to assuage
procrastination (Tuckman, 2007) and teach students how to actively change elements in
their learning environment a la triadic reciprocality (Bandura 1986, 1988, 2001) wherein
he suggests that behavioral, personal, and environmental influences are malleable and
that the learner can actively change one or more of these influences to, in turn, influence
one or more of the others. The implication is that distance educators should teach
students these aspects of self-regulation, especially as related to environment since, as
mentioned above, learning environments tend to be highly unique and individual; thus, an
array of possible learning environments exists within the distance learning model.
Each of the three types of engagement strongly correlated with achievement in at
least one subject area. This implies that distance educators should continue to foster
computer self-efficacy and self-regulation, which, as indicated here and in prior research,
significantly predict levels of engagement. However, this study also showed that none of
the engagement types was significantly linked with achievement in the two humanities
courses, with the exception of emotional engagement correlating with English
achievement. This implies that distance educators and software developers should further
examine how curriculum is delivered for humanities-based courses such as English and
social science.
Conversely, the findings of this study indicate that behavioral and emotional
engagement were significant predictors of achievement in both math and science,
implying that the manner in which students are engaged in these content areas (i.e.
features of the software, novelty, etc.) is crucial to their achievement in these courses. To
date, no study has exclusively examined content delivery and this may be an important
area for future research.
Future Research
This study examined the relationship between the motivational and learning
variables of computer self-efficacy and self-regulation; the variables of behavioral,
emotional, and cognitive engagement; and achievement of students enrolled in distance
learning. The results of this study showed significant relationships among several of these
variables via Pearson’s correlation and path analysis.
In order to further understand why these variables do or do not correlate, future
research can examine how important computer self-efficacy is in relation to the ease of
use of distance learning software. Future studies can also look at the role of computer
self-efficacy and self-regulation as direct influences on achievement as neither
significantly correlated with achievement when examined via path analysis, but did when
examined a la Pearson’s correlation matrix.
One critical area to examine in future studies is the differences in achievement
between traditional and distance learning contexts, particularly at the high school level. A
larger study with a control group may shed light on the overall effectiveness of distance
learning at the high school level. As distance learning continues to grow at the K-12
level, educators can greatly benefit from this knowledge in terms of course design and
curriculum development.
Another crucial area to explore in future studies is the differences in achievement
between subjects in distance learning. One key roadblock that must be overcome is to
convince virtual school administrators and parents to release official student records,
particularly scores on state tests, as these standardized norm- and criterion-based
assessments even the playing field as compared to teacher- or school-specific grades and
assessments.
Though beyond the scope of this study, future studies can scrutinize the role of
the teacher in the distance learning environment, particularly as related to facilitating
self-efficacy, self-regulation, and incorporating sociocultural elements into the learning
environment. Connected to this, it is important to examine the structural model of the
distance learning institution and how fully online versus blended models compare in the
areas of student achievement, self-regulation, and teacher effectiveness.
Conclusion
The aforementioned implications of this study are that computer self-efficacy,
self-regulation, and engagement in distance learning are critical factors in the continued
growth and development of distance learning, particularly at the K-12 level.
Since the bulk of research is focused at the higher education level, more research is
needed at the K-12 level in order to understand how to meet the learning needs of these
students, what factors are considered in software and curriculum development and
design, and how teachers are trained to teach in this setting. Without further research, it
will be difficult for educators to ascertain how effective distance learning is when
compared to traditional education. It would also be useful to know how teacher
preparation at the higher education level compares to that at the K-12 level and if there
are differences in students’ levels of self-regulation, engagement, and achievement based
on the type of training the teacher received.
Down the road, it would be useful to replicate this study on a larger scale,
specifically, based on the number of schools, students, and subjects. Other important
variables to consider in a future study would be the number of years enrolled in distance
learning, the structure of the school, and the number of minutes spent in the distance
learning environment. Although it would be difficult to observe, information about
students’ individual learning environments would also expose the influences of
environment and context on learning.
Distance learning is growing at a rapid pace. As the race to offer distance learning
outpaces the understanding of how it works, it will be imperative for future researchers to
keep up: to solve problems and embrace opportunities as they arise, to identify best
practices, and to recognize that the malleability of technology has changed how future
learning will occur. It is important to realize that the transformation technology is making
in education is significant. Whereas in the past technology in education has been met with
apprehension, disdain, or treated as only a cursory element in the grand scheme, its current
influence on how curriculum is developed, delivered, and accessed, and on how teachers
are trained, is paramount to the success of future learning. By ardently exploring and
understanding both the obvious and the obscure aspects of distance learning, educators can
better serve the learning needs of all students and discover new pedagogical approaches to
teaching.
References
Anderson, L. W., Krathwohl, D. R., & Bloom, B. S. (2001). A taxonomy for learning,
teaching, and assessing : a revision of Bloom's taxonomy of educational
objectives (Complete ed.). New York: Longman.
Arbaugh, J. B. (2000). How classroom environment and student engagement affect
learning in Internet-based MBA courses. Business Communication Quarterly,
63(4), 9-26.
Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change.
Psychological Review, 84(2), 191-215.
Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist,
37(2), 122-147.
Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory.
Englewood Cliffs, N.J.: Prentice-Hall.
Bandura, A., & Locke, E. (2003). Negative self-efficacy and goal effects revisited.
Journal of Applied Psychology, 88(1), 87-99.
Bates, R., & Khasawneh, S. (2007). Self-efficacy and college students' perceptions and
use of online learning systems. Computers in Human Behavior, 23, 175-191.
Boria, R. (2006). States given guidance on online teaching, e-schools costs. Education
Week, September 20, 2006. Available from
http://www.edweek.org/ew/articles/2006/09/20/04virtual.h26.html
Cassidy, S., & Eachus, P. (2002). Developing the computer user self-efficacy (CUSE)
scale: Investigating the relationship between computer self-efficacy, gender and
experience with computers. Journal of Educational Computing Research, 26(2),
133-153.
Compeau, D. R., & Higgins, C. A. (1995). Computer self-efficacy: Development of a
measure and initial test. MIS Quarterly, 19(2), 189-211.
Comrey, A., & Lee, H. (1992). A first course in factor analysis (2nd ed.). Hillsdale:
Erlbaum.
DeLoughry, T. (1993). 2 researchers say "technophobia" may affect millions of students.
Chronicle of Higher Education, 39(34), 25-26.
Dembo, M. H., Junge, L. G., & Lynch, R. (2006). Becoming a self-regulated learner:
Implications for web-based education. In H. O'Neil (Ed.), Web-based learning :
theory, research, and practice. Mahwah, N.J.: Lawrence Erlbaum Associates.
Eachus, P., & Cassidy, S. (2006). Development of the web users self-efficacy scale
(WUSE). In E. Cohen (Ed.), The Information Universe : Issues in Informing
Science and Information Technology (pp. 199-210). Santa Rosa, CA: Informing
Science.
Eccles, J. S., & Wigfield, A. (2002). Motivational beliefs, values, and goals. Annual
Review of Psychology, 53(1), 109-132.
Finn, J. D. (1993). School engagement and students at risk. Washington, D.C.: National
Center for Education Statistics.
Finn, J. D., & Voelkl, K. E. (1993). School characteristics related to student engagement.
The Journal of Negro Education, 62(3), 249-268.
Fredricks, J. A., Blumenfeld, P., Friedel, J., & Paris, A. (2005). School engagement. In K.
A. Moore & L. Lippman (Eds.), What do children need to flourish?
Conceptualizing and measuring indicators of positive development (pp. 305-321).
New York: Springer.
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential
of the concept, state of the evidence. Review of Educational Research, 74(1), 59-
109.
Green, S. (1991). How many subjects does it take to do a regression analysis?
Multivariate Behavioral Research, 26(3), 499-510.
Greene, B. A., Miller, R. B., Crowson, H. M., Duke, B. L., & Akey, K. L. (2004).
Predicting high school students' cognitive engagement and achievement:
Contributions of classroom perceptions and motivation. Contemporary
Educational Psychology, 29(4), 462-482.
Greenhow, C., Robelia, B., & Hughes, J. (2009). Learning, teaching, and scholarship in a
digital age. Educational Researcher, 38(4), 246-259.
Harris, R. (1985). A primer of multivariate statistics (2nd ed.). New York: Academic
Press.
Hasan, B. (2003). The influence of specific computer experiences on computer self-
efficacy beliefs. Computers in Human Behavior, 19(4), 443-450.
Herrington, J., Oliver, R., & Reeves, T. C. (2003). Patterns of engagement in authentic
online learning environments. Australian Journal of Educational Technology,
19(1), 59-71.
Karsten, R., & Roth, R. (1998). Computer self-efficacy: A practical indicator of student
computer competency in introductory IS courses. Informing Science, 1(3), 61-68.
Kawachi, P. (2003). Initiating intrinsic motivation in online education: Review of the
current state of the art. Interactive Learning Environments, 11(1), 59-81.
Kerka, S. (1998). Learning Styles and Electronic Information. Trends and Issues Alerts.
Columbus, OH: ERIC Clearinghouse on Adult Career and Vocational Education.
Kinzie, M. B., Delcourt, M. A. B., & Powers, S. M. (1994). Computer technologies:
Attitudes and self-efficacy across undergraduate disciplines. Research in Higher
Education, 35(6), 745-768.
Kline, R. (1998). Principles and practice of structural equation modeling. New York:
Guilford Press.
Kuncel, N., Crede, M., & Thomas, L. (2005). The validity of self-reported grade point
averages, class ranks, and test scores: A meta-analysis and review of the
literature. Review of Educational Research, 75(1), 63-82.
Lee, J.-K., & Lee, W.-K. (2008). The relationship of e-learner's self-regulatory efficacy
and perception of e-Learning environmental quality. Computers in Human
Behavior, 24(1), 32-47.
Locke, E. A., & Latham, G. P. (1990). A theory of goal setting & task performance.
Englewood Cliffs, N.J.: Prentice Hall.
McMahon, M., & Oliver, R. (2001). Promoting self-regulated learning in an on-line
environment. Charlottesville, VA: Association for the Advancement of
Computing in Education (AACE).
Mertler, C. A., & Vannatta, R. A. (2002). Advanced and multivariate statistical methods:
Practical application and interpretation (2nd ed.). Los Angeles: Pyrczak.
Moos, D. C., & Azevedo, R. (2009). Learning with computer-based learning
environments: A literature review of computer self-efficacy. Review of
Educational Research, 79(2), 576-600.
Ormrod, J. E. (2006). Educational psychology: Developing learners (5th ed.). Upper
Saddle River, N.J.: Pearson/Merrill Prentice Hall.
Patrick, H., Allison, R., & Kaplan, A. (2007). Early adolescents' perceptions of classroom
social environment, motivational beliefs, and engagement. Journal of Educational
Psychology, 99(1), 83-98.
Picciano, A., & Seaman, J. (2007). K-12 online learning: A survey of U.S. school district
administrators.
Pintrich, P. R., & De Groot, E. (1990). Motivational and self-regulated learning
components of classroom academic performance. Journal of Educational
Psychology, 82(1), 33-40.
Potosky, D. (2002). A field study of computer efficacy beliefs as an outcome of training:
The role of computer playfulness, computer knowledge, and performance during
training. Computers in Human Behavior, 18(3), 241-255.
Rizzo, A., Bowerly, T., Buckwalter, J., Klimchuk, D., Mitura, R., & Parsons, T. D.
(2006). A virtual reality scenario for all seasons: The virtual classroom. CNS
Spectrums, 11(1), 35-44.
Rizzo, A., & Kim, G. (2005). A SWOT analysis of the field of virtual reality
rehabilitation and therapy. Presence, 14(2), 119-146.
Schunk, D. H., Pintrich, P. R., & Meece, J. L. (2008). Motivation in education : theory,
research, and applications (3rd ed.). Upper Saddle River, N.J.: Pearson/Merrill
Prentice Hall.
Stone, R. W., & Henry, J. W. (2003). The roles of computer self-efficacy and outcome
expectancy in influencing the computer end-user's organizational commitment.
Journal of End User Computing, 15(1), 38-53.
Tuckman, B. W. (2007). The effect of motivational scaffolding on procrastinators'
distance learning outcomes. Computers & Education 49(2), 414-422.
Van Voorhis, C., & Morgan, B. (2007). Understanding power and rules of thumb for
determining sample sizes. Tutorials in Quantitative Methods for Psychology, 3(2),
43-50.
Weinstein, C., Schulte, A., & Palmer, D. (1987). The learning and study strategies
inventory. Clearwater: H&H Publishing.
Wigfield, A. (1994). Expectancy-value theory of achievement motivation: A
developmental perspective. Educational Psychology Review, 6(1), 49-78.
Wilfong, J. D. (2006). Computer anxiety and anger: The impact of computer use,
computer experience, and self-efficacy beliefs. Computers in Human Behavior,
22(6), 1001-1011.
Wolters, C. A., Pintrich, P. R., & Karabenick, S. A. (2005). Assessing academic self-
regulated learning. In K. A. Moore & L. Lippman (Eds.), What do children need
to flourish? Conceptualizing and measuring indicators of positive development
(pp. 251-270). New York: Springer.
Wood, P. A. (2001). The U.S. Department of Education and student financial aid for
distance education: An update. Washington, DC: ERIC Clearinghouse on Higher
Education.
Zimmerman, B. J. (1989). A social cognitive view of self-regulated academic learning.
Journal of Educational Psychology, 81(3), 329-339.
Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: an
overview. Educational Psychologist, 25(1), 3-17.
Zimmerman, B. J. (2000). Self-efficacy: An essential motive to learn. Contemporary
Educational Psychology, 25, 82-91.
Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory into
Practice, 41(2), 64-72.
Zimmerman, B. J., & Pons, M. (1986). Development of a structured interview for
assessing student use of self-regulated learning strategies. American Educational
Research Journal, 23, 614-628.
Appendix A
STUDENT DEMOGRAPHIC INFORMATION
1. What is your gender?
a. female b. male
2. What is your age?
a. 12 years-old or younger b. 13-14 years-old c. 15-16 years-old d. 17 years-old
or older
3. Which virtual high school do you attend?
a. CHOICE OMITTED TO MAINTAIN ANONYMITY
b. CHOICE OMITTED TO MAINTAIN ANONYMITY
4. In which grade are you enrolled?
a. 9th b. 10th c. 11th d. 12th
5. How many years have you been enrolled in distance/online learning?
a. This is my first year b. 1-2 years c. 3-4 years d. 5 or more years
6. How many online classes are you taking?
a. 1 b. 2 c. 3 d. 4 e. 5 f. 6 or more
7. How many total hours per week do you log on to your classes?
1. <1 hour 2. 1-5 hours 3. 6-10 hours 4. 11-15 hours 5. 16-20 hours
6. 21-25 hours 7. 26-30 hours 8. 31-35 hours 9. 36 or > hours
Appendix B
ONLINE STUDENT SURVEY
INSTRUCTIONS: Read the statements below and indicate the strength of your
agreement or disagreement. Please answer statements based on your experience and
perceptions of the distance learning course or courses you have taken or are taking this
year. There are no right or wrong answers, so don’t spend a lot of time on any one
answer. We are looking for your overall impression regarding each statement.
1. I ask myself questions to make sure I know the material I have been studying.
Strongly Agree 5 4 3 2 1 Strongly Disagree
2. I am more likely to give up when I don’t understand something in an online setting
(compared to traditional classroom). (R)
Strongly Agree 5 4 3 2 1 Strongly Disagree
3. I work on optional assignments and exercises in the online class even when I don’t
have to.
Strongly Agree 5 4 3 2 1 Strongly Disagree
4. Even when the online study materials are dull and uninteresting, I keep working until I
finish.
Strongly Agree 5 4 3 2 1 Strongly Disagree
5. Before I begin each online session, I think about the things I will need to do to learn.
Strongly Agree 5 4 3 2 1 Strongly Disagree
6. I often find that I have been reading for the online class but don’t know what it is all
about. (R)
Strongly Agree 5 4 3 2 1 Strongly Disagree
7. I find that when I read email or correspondence from the online teacher or other
students I don’t really pay attention. (R)
Strongly Agree 5 4 3 2 1 Strongly Disagree
8. When I’m learning online, I stop once in a while and go over what I have read.
Strongly Agree 5 4 3 2 1 Strongly Disagree
9. I work hard to get a good grade even when I don’t like a class.
Strongly Agree 5 4 3 2 1 Strongly Disagree
INSTRUCTIONS: Below are short statements concerning your thoughts and feelings
about participating in school. Read each statement and indicate the strength of your
agreement or disagreement. There are no right or wrong answers, so don’t spend a lot of
time on any one item. We are looking for your overall impression regarding each
statement.
10. I follow the rules of the online class.
Strongly Agree 5 4 3 2 1 Strongly Disagree
11. I get in trouble at school (ex. inappropriate email or chat, cheating, etc).
Strongly Agree 5 4 3 2 1 Strongly Disagree
*12. When I am in the online class, I just act as if I am working. (Dropped from scale)
Strongly Agree 5 4 3 2 1 Strongly Disagree
13. I am able to consistently pay attention when I am in the online class.
Strongly Agree 5 4 3 2 1 Strongly Disagree
14. I complete my work on time.
Strongly Agree 5 4 3 2 1 Strongly Disagree
15. I like being in the virtual high school.
Strongly Agree 5 4 3 2 1 Strongly Disagree
16. I feel excited by my work at school.
Strongly Agree 5 4 3 2 1 Strongly Disagree
17. I enjoy the virtual classroom.
Strongly Agree 5 4 3 2 1 Strongly Disagree
18. I am interested in the work in my online class.
Strongly Agree 5 4 3 2 1 Strongly Disagree
19. I feel happy in the online class.
Strongly Agree 5 4 3 2 1 Strongly Disagree
20. I feel bored in the online class. (REVERSED)
Strongly Agree 5 4 3 2 1 Strongly Disagree
21. I check my schoolwork for mistakes.
Strongly Agree 5 4 3 2 1 Strongly Disagree
22. I study even when I don’t have a test.
Strongly Agree 5 4 3 2 1 Strongly Disagree
23. I try to watch TV shows about things we do in the online/virtual school.
Strongly Agree 5 4 3 2 1 Strongly Disagree
24. When I read a book, I ask myself questions to make sure I understand what it is
about.
Strongly Agree 5 4 3 2 1 Strongly Disagree
25. I read extra books to learn more about things we do in school.
Strongly Agree 5 4 3 2 1 Strongly Disagree
26. If I don’t know what a word means when I am reading, I do something to figure it
out.
Strongly Agree 5 4 3 2 1 Strongly Disagree
27. If I don’t understand what I read, I go back and read it over again.
Strongly Agree 5 4 3 2 1 Strongly Disagree
28. I talk with people outside of school about what I am learning in the online class.
Strongly Agree 5 4 3 2 1 Strongly Disagree
INSTRUCTIONS: Below are short statements concerning your thoughts and feelings
about Internet use. Read each statement and indicate the strength of your agreement or
disagreement. There are no right or wrong answers, so don’t spend a lot of time on any
one item. We are looking for your overall impression regarding each statement.
29. I rarely have problems finding what I am looking for on the Internet
Strongly Agree 5 4 3 2 1 Strongly Disagree
30. I sometimes find using search engines like Google or Yahoo can be difficult
Strongly Agree 5 4 3 2 1 Strongly Disagree
31. Finding my way around web sites is usually easy for me
Strongly Agree 5 4 3 2 1 Strongly Disagree
32. I would never try to download files from the Internet that would be too complicated.
Strongly Agree 5 4 3 2 1 Strongly Disagree
33. I wouldn't know how to capture pictures from the Internet
Strongly Agree 5 4 3 2 1 Strongly Disagree
34. Downloading music from the Internet is something I feel competent to do
Strongly Agree 5 4 3 2 1 Strongly Disagree
35. I feel confident about using most types of browsers
Strongly Agree 5 4 3 2 1 Strongly Disagree
36. I sometimes "get lost" when trying to navigate through the Internet
Strongly Agree 5 4 3 2 1 Strongly Disagree
37. I am not very confident about my ability to use the Internet
Strongly Agree 5 4 3 2 1 Strongly Disagree
38. I know how to deal with annoying advertisements that appear while I'm using the
Internet
Strongly Agree 5 4 3 2 1 Strongly Disagree
39. I find using email easy
Strongly Agree 5 4 3 2 1 Strongly Disagree
40. Using messenger software like MSN, Yahoo, or AOL always causes me some
problems
Strongly Agree 5 4 3 2 1 Strongly Disagree
41. I would have few problems setting up a web cam
Strongly Agree 5 4 3 2 1 Strongly Disagree
42. I much prefer using letters or the telephone to communicate with people, rather than
the Internet
Strongly Agree 5 4 3 2 1 Strongly Disagree
43. Social network sites like MySpace or Facebook are a good way of reaching people
Strongly Agree 5 4 3 2 1 Strongly Disagree
44. I have little real idea what peer to peer (p2p) software is for
Strongly Agree 5 4 3 2 1 Strongly Disagree
45. I regularly exchange music and/or video files with friends
Strongly Agree 5 4 3 2 1 Strongly Disagree
46. Using the Internet makes it much easier to keep in contact with people
Strongly Agree 5 4 3 2 1 Strongly Disagree
47. I regularly use the Internet for playing games
Strongly Agree 5 4 3 2 1 Strongly Disagree
48. I'm not sure how to communicate with people using chat rooms
Strongly Agree 5 4 3 2 1 Strongly Disagree
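For readers tabulating responses: items marked “(R)” or “(REVERSED)” above are reverse-scored before a scale score is computed, and the item marked “Dropped from scale” is excluded. The short Python sketch below illustrates that convention for the 1-to-5 response format shown with each item, replacing a reversed response with 6 minus the response; the function and variable names are illustrative only and are not part of the instrument or of the study’s analysis procedures.

# Illustrative sketch: reverse-code flagged items and drop excluded items
# before averaging responses into a scale score (1 = Strongly Disagree,
# 5 = Strongly Agree, as in the response format above).

REVERSED_ITEMS = {6, 7, 20}   # items marked (R) or (REVERSED) above
DROPPED_ITEMS = {12}          # item marked "Dropped from scale"

def scale_score(responses):
    """responses: dict mapping item number to a rating from 1 to 5."""
    kept = []
    for item, rating in responses.items():
        if item in DROPPED_ITEMS:
            continue
        kept.append(6 - rating if item in REVERSED_ITEMS else rating)
    return sum(kept) / len(kept)

# Example with hypothetical ratings for items 5-8 and 12:
example = {5: 4, 6: 2, 7: 1, 8: 5, 12: 3}
print(scale_score(example))   # items 6 and 7 become 4 and 5; item 12 is dropped -> 4.5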
Appendix C
TRUNCATED VERSION OF CONTENT STANDARDS
ENGLISH
Reading
1.0 Word Analysis, Fluency, and Systematic Vocabulary Development
Students apply their knowledge of word origins to determine the meaning of new words encountered in reading
materials and use those words accurately.
2.0 Reading Comprehension (Focus on Informational Materials)
Students read and understand grade-level-appropriate material. They analyze the organizational patterns, arguments,
and positions advanced. The selections in Recommended Literature, Kindergarten Through Grade Twelve illustrate the
quality and complexity of the materials to be read by students. In addition, by grade twelve, students read two million
words annually on their own, including a wide variety of classic and contemporary literature, magazines, newspapers,
and online information. In grades nine and ten, students make substantial progress toward this goal.
3.0 Literary Response and Analysis
Students read and respond to historically or culturally significant works of literature that reflect and enhance their
studies of history and social science. They conduct in-depth analyses of recurrent patterns and themes. The selections in
Recommended Literature, Kindergarten Through Grade Twelve illustrate the quality and complexity of the materials to
be read by students.
Writing
1.0 Writing Strategies
Students write coherent and focused essays that convey a well-defined perspective and tightly reasoned argument. The
writing demonstrates students' awareness of the audience and purpose. Students progress through the stages of the
writing process as needed.
2.0 Writing Applications (Genres and Their Characteristics)
Students combine the rhetorical strategies of narration, exposition, persuasion, and description to produce texts of at
least 1,500 words each. Student writing demonstrates a command of standard American English and the research,
organizational, and drafting strategies outlined in Writing Standard 1.0.
Written and Oral English Language Conventions
The standards for written and oral English language conventions have been placed between those for writing and for
listening and speaking because these conventions are essential to both sets of skills.
1.0 Written and Oral English Language Conventions
Students write and speak with a command of standard English conventions.
Listening and Speaking
1.0 Listening and Speaking Strategies
Students formulate adroit judgments about oral communication. They deliver focused and coherent presentations of
their own that convey clear and distinct perspectives and solid reasoning. They use gestures, tone, and vocabulary
tailored to the audience and purpose.
SOCIAL SCIENCE STANDARDS FOR WORLD HISTORY
10.1 Students relate the moral and ethical principles in ancient Greek and Roman
philosophy, in Judaism, and in Christianity to the development of Western political
thought.
10.2 Students compare and contrast the Glorious Revolution of England, the American
Revolution, and the French Revolution and their enduring effects worldwide on the
political expectations for self-government and individual liberty.
10.3 Students analyze the effects of the Industrial Revolution in England, France,
Germany, Japan, and the United States.
10.4 Students analyze patterns of global change in the era of New Imperialism in at least
two of the following regions or countries: Africa, Southeast Asia, China, India, Latin
America, and the Philippines.
10.5 Students analyze the causes and course of the First World War.
10.6 Students analyze the effects of the First World War.
10.7 Students analyze the rise of totalitarian governments after World War I.
10.8 Students analyze the causes and consequences of World War II.
10.9 Students analyze the international developments in the post-World War II
world.
10.10 Students analyze instances of nation-building in the contemporary world in at least
two of the following regions or countries: the Middle East, Africa, Mexico and other parts
of Latin America, and China.
10.11 Students analyze the integration of countries into the world economy and the
information, technological, and communications revolutions (e.g., television, satellites,
computers).
MATH STANDARDS FOR ALGEBRA I
1.0 Students identify and use the arithmetic properties of subsets of integers and rational,
irrational, and real numbers, including closure properties for the four basic arithmetic operations
where applicable:
2.0 Students understand and use such operations as taking the opposite, finding the reciprocal,
taking a root, and raising to a fractional power. They understand and use the rules of exponents.
3.0 Students solve equations and inequalities involving absolute values.
4.0 Students simplify expressions before solving linear equations and inequalities in one variable, such as
3(2x-5) + 4(x-2) = 12.
5.0 Students solve multistep problems, including word problems, involving linear equations and linear
inequalities in one variable and provide justification for each step.
6.0 Students graph a linear equation and compute the x- and y-intercepts (e.g., graph 2x + 6y = 4). They are
also able to sketch the region defined by linear inequality (e.g., they sketch the region defined by 2x + 6y <
4).
7.0 Students verify that a point lies on a line, given an equation of the line. Students are able to derive
linear equations by using the point-slope formula.
8.0 Students understand the concepts of parallel lines and perpendicular lines and how those slopes are
related. Students are able to find the equation of a line perpendicular to a given line that passes through a
given point.
9.0 Students solve a system of two linear equations in two variables algebraically and are able to interpret
the answer graphically. Students are able to solve a system of two linear inequalities in two variables and to
sketch the solution sets.
10.0 Students add, subtract, multiply, and divide monomials and polynomials. Students solve multistep
problems, including word problems, by using these techniques.
11.0 Students apply basic factoring techniques to second- and simple third-degree polynomials. These
techniques include finding a common factor for all terms in a polynomial, recognizing the difference of two
squares, and recognizing perfect squares of binomials.
12.0 Students simplify fractions with polynomials in the numerator and denominator by factoring both and
reducing them to the lowest terms.
13.0 Students add, subtract, multiply, and divide rational expressions and functions. Students solve both
computationally and conceptually challenging problems by using these techniques.
14.0 Students solve a quadratic equation by factoring or completing the square.
15.0 Students apply algebraic techniques to solve rate problems, work problems, and percent mixture
problems.
16.0 Students understand the concepts of a relation and a function, determine whether a given relation
defines a function, and give pertinent information about given relations and functions.
17.0 Students determine the domain of independent variables and the range of dependent variables defined
by a graph, a set of ordered pairs, or a symbolic expression.
18.0 Students determine whether a relation defined by a graph, a set of ordered pairs, or a symbolic
expression is a function and justify the conclusion.
19.0 Students know the quadratic formula and are familiar with its proof by completing the square.
20.0 Students use the quadratic formula to find the roots of a second-degree polynomial and to solve
quadratic equations.
21.0 Students graph quadratic functions and know that their roots are the x-intercepts.
22.0 Students use the quadratic formula or factoring techniques or both to determine whether the graph of a
quadratic function will intersect the x-axis in zero, one, or two points.
23.0 Students apply quadratic equations to physical problems, such as the motion of an object under the
force of gravity.
24.0 Students use and know simple aspects of a logical argument:
25.0 Students use properties of the number system to judge the validity of results, to justify each step of a
procedure, and to prove or disprove statements:
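As a point of reference for the sample expressions quoted above, the following worked lines illustrate the equation cited in Standard 4.0 and the quadratic formula named in Standards 19.0 and 20.0. They are provided for illustration only and are not part of the adopted standards text:
\begin{align*}
3(2x-5) + 4(x-2) &= 12 \\
6x - 15 + 4x - 8 &= 12 \\
10x - 23 &= 12 \\
x &= \tfrac{35}{10} = 3.5
\end{align*}
\[
x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a} \qquad \text{for } ax^2 + bx + c = 0,\ a \neq 0
\]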
SCIENCE STANDARDS FOR EARTH SCIENCE
1. Astronomy and planetary exploration reveal the solar system's
structure, scale, and change over time. As a basis for understanding this concept:
2. Earth-based and space-based astronomy reveal the structure, scale, and changes in
stars, galaxies, and the universe over time. As a basis for understanding this
concept:
3. Plate tectonics operating over geologic time has changed the patterns of land, sea,
and mountains on Earth's surface. As the basis for understanding this concept:
4. Energy enters the Earth system primarily as solar radiation and eventually escapes
as heat. As a basis for understanding this concept:
5. Heating of Earth's surface and atmosphere by the sun drives convection within the
atmosphere and oceans, producing winds and ocean currents. As a basis for
understanding this concept:
6. Climate is the long-term average of a region's weather and depends on many
factors. As a basis for understanding this concept:
7. Each element on Earth moves among reservoirs, which exist in the solid earth, in
oceans, in the atmosphere, and within and among organisms as part of
biogeochemical cycles. As a basis for understanding this concept:
8. Life has changed Earth's atmosphere, and changes in the atmosphere affect
conditions for life. As a basis for understanding this concept:
Appendix D
University of Southern California
Rossier School of Education
Waite Phillips Hall 600 C
Los Angeles, CA 90089-4036
INFORMATION SHEET FOR NON-MEDICAL RESEARCH
PARENTAL PERMISSION
******************************************************************
CONSENT TO PARTICIPATE IN RESEARCH
The relationship between students’ confidence in using technology, self-
regulation and their engagement in distance learning education
Your child is invited to participate in a research study conducted by Brandon Martinez
(Principal Investigator) and Robert Rueda, Ph.D., faculty advisor, from the Rossier
School of Education at the University of Southern California because s/he is a student at
one of the virtual high schools being studied. A total of up to 500 subjects will be
selected to participate. Your child’s participation is voluntary. You should read the
information below, and ask questions about anything you do not understand, before
deciding whether or not to allow your child to participate. Your child will also be given
an opportunity to decide whether or not to participate. Even if you agree to allow your
child to participate, the final decision lies with your child.
Please take as much time as you need to read the consent form. You and/or your child
may also decide to discuss it with your family or friends. If you and/or your child decide
to participate, you will be asked to sign this form. You will be given a copy of this form.
PURPOSE OF THE STUDY
The purpose of this study is to understand if and how your child’s confidence level with
computer technology and the way he/she regulates his/her own learning contribute to
his/her level of engagement and, potentially, learning.
PROCEDURES
If your child agrees to volunteer to participate in this study, s/he will be asked to do the
following things:
Complete a 48-item online survey, which will take about 10 minutes. The
survey will ask your child questions about how he/she evaluates his/her level of
confidence in using computer technology, self-regulation of learning, and engagement in
the online/virtual high school classes. For example, a sample survey item asks your child
to rate his/her opinion on a scale of 1 to 5 of the following statement: “I pay attention in
school.” Another item on the survey asks your child to rate his/her opinion on a scale of
1 to 5 about the following statement: “I feel happy at school.” If you would like to see a
list of the questions asked of your child, please contact the researcher at
brandodm@usc.edu.
This study will also be looking at your child’s estimated grades for the current grading
period and demographic data (i.e., age, gender). This information will be reported to the
researchers by your child. If you do not want your child to answer the demographic
questions, your child can skip them, or any other questions.
The data will only be viewed by the principal investigator administering this study and
your child’s responses will be held in the strictest professional confidence.
If you agree to allow your child to participate, you will be asked to forward the email to
your child. Your child will then be asked to review an assent form, informing him/her of
the research study, and if s/he agrees, will be asked to complete the questionnaires.
POTENTIAL RISKS AND DISCOMFORTS
There are no anticipated risks to your child’s participation. Your child may be
uncomfortable due to the time taken away from regular class instruction while filling out
the survey. If your child feels discomfort, s/he may choose not to participate, stop or
withdraw from the study at any time. Your child may also choose not to answer any
questions that make him/her uncomfortable.
POTENTIAL BENEFITS TO SUBJECTS AND/OR TO SOCIETY
There will be no direct benefit to your child for participating in this study. However, the
information from this study may be used to help inform decisions and improve services
for students.
PAYMENT/COMPENSATION FOR PARTICIPATION
Your child will not be paid for participating in this research study. However, s/he is
eligible to be entered into a drawing for one iPod Shuffle or one of two $15 iTunes
certificates. Your child does not have to complete the survey in order to be eligible for a
prize. You can email the link to your child and s/he can enter his/her email into the
appropriate place or you can click on the link and enter your child’s e-mail address. The
drawing will take place approximately one month after the close of the online survey.
The odds of winning are dependent upon the number of entries. If 500 students enter, the
odds will be one in 167. The winner will be contacted via email and will be asked to
provide a mailing address in order to receive the prize. Your child can win only one prize.
POTENTIAL CONFLICTS OF INTEREST
The principal investigator of this research study is also a teacher at one of the virtual
high schools. However, your child will not be asked to include his/her name on any
forms; therefore, there is no way for the researcher to know whether any students
taking his classes have participated.
CONFIDENTIALITY
No information is being collected that can be linked back to your child, so your
child’s privacy will be ensured. After the completion of the study, the data will be stored
for three years in a locked file cabinet belonging to the principal investigator, at which
time it will be destroyed.
Teachers will not have access to the information your child provides on this survey and
your child’s answers will not influence the grade s/he receives in his/her courses.
When the results of the research are published or discussed in conferences, no
information will be included that would reveal your child’s identity.
PARTICIPATION AND WITHDRAWAL
Your child can choose whether to be in this study or not. If your child volunteers to be in
this study, your child may withdraw at any time without consequences of any kind. Your
child may also refuse to answer any questions s/he doesn’t want to answer and still
remain in the study.
ALTERNATIVES TO PARTICIPATION
Your child’s alternative is to not participate; s/he will be asked to participate in the
regular classroom activities, but will not be asked to complete the questionnaire.
IDENTIFICATION OF INVESTIGATORS
If you have any questions or concerns about the research, please feel free to contact the
Principal Investigator, Brandon Martinez, via email at bmartinez@fjuhsd.net. You may
also contact the Faculty Advisor, Robert Rueda, via mail at WPH 600A, Los Angeles,
CA 90089-4036; email at rueda@usc.edu; or phone at (213) 740-2371.
RIGHTS OF RESEARCH SUBJECTS
You may withdraw your consent at any time and discontinue participation without
penalty. You are not waiving any legal claims, rights or remedies because of your
participation in this research study. If you have questions regarding your rights as a
research subject, contact the University Park IRB, Office of the Vice Provost for
Research, Stonier Hall, Room 224a, Los Angeles, CA 90089-1146, (213) 821-5272 or
upirb@usc.edu.
VIRTUAL SIGNATURE OF RESEARCH SUBJECT
I have read (or someone has read to me) the information provided above. I can make a
copy of this form for my records.
□ Yes, I agree to allow my child to participate in the research study, and by entering
his/her email address, the assent form and questionnaires will be forwarded to his/her email.
S/he will also be entered into the drawing.
□ No, I do not want my child to participate; however, I would like him/her to be
eligible for entry into the drawing.
If you do not want your child to be entered into the drawing or you do not want to allow
your child to participate, simply close out this document.
______________________________________________
Child’s email address:
Appendix E
University of Southern California
Rossier School of Education
Waite Phillips Hall 600 C
Los Angeles, CA 90089-4036
INFORMATION SHEET FOR NON-MEDICAL RESEARCH
******************************************************************
CONSENT TO PARTICIPATE IN RESEARCH
The relationship between students’ confidence in using technology, self-
regulation and their engagement in distance learning education
You are invited to participate in a research study conducted by Brandon Martinez
(Principal Investigator) and Robert Rueda, Ph.D., faculty advisor, from the Rossier
School of Education at the University of Southern California because you are a student at
one of the virtual high schools being studied. Up to 500 subjects will be selected to
participate. Your participation is voluntary. You should read the information below, and
ask questions about anything you do not understand, before deciding whether or not to
participate. Your parent’s permission will be sought; however, the final decision is
yours. Even if your parents agree to your participation, you don’t have to participate if
you don’t want to. Please take as much time as you need to read this form. You may also
decide to discuss it with your family or friends. You will be given a copy of this form.
PURPOSE OF THE STUDY
The purpose of this study is to understand if and how your confidence in using computer
technology and the way you study contribute to your level of participation in school and,
potentially, your learning.
PROCEDURES
If you volunteer to participate in this study, we would ask you to do the following things:
Complete a 48-item survey in class, which will take about 10 minutes. The
survey will ask you questions about how you evaluate your level of involvement in your
classes and how you rate your confidence in using computers and how you study for
class. For example, a sample survey item asks you to rate your opinion on a scale of 1 to
5, where a 1 means “never” and a 5 means “all of the time” of the following statement: “I
pay attention in school.” Another item on the survey asks you to rate your opinion on a
scale of 1 to 5 about the following statement: “I feel happy at school.”
This study will also be looking at your estimated grades and demographic data (i.e., age
and gender). You will be asked to self-report this information. If you, or your parents,
do not want to answer any questions, you can skip the question(s).
Your responses will be held in the strictest professional confidence. Your teachers will
not have access to the information you provide on this survey and your answers will not
influence the grade you receive in your course(s).
POTENTIAL RISKS AND DISCOMFORTS
This study does not pose any identifiable risks beyond minor discomfort. You may be
uncomfortable due to the time taken away from regular class instruction while filling out
the survey, from your demographic information and self-reported grades being reviewed,
or from concern about the confidentiality of your answers on the survey. If you feel
discomfort, you may choose not to participate, stop or withdraw from the study at any
time. You may also choose not to answer any questions that make you uncomfortable.
POTENTIAL BENEFITS TO SUBJECTS AND/OR TO SOCIETY
There will be no direct benefit to you for participating in this study. However, the
information from this study may be used to help inform decisions and improve services
for students.
PAYMENT/COMPENSATION FOR PARTICIPATION
You will not be paid for participating in this research study. However, you will be
entered into a drawing for one iPod Shuffle or one of two $15 iTunes certificates. You do
not have to complete the survey in order to be eligible for a prize. Your parent can either
enter your email address or forward you the link and you can include your email address
in the appropriate place. The drawing will take place approximately one month after the
close of the online survey. The odds of winning are dependent upon the number of
entries. If 500 students enter, the odds will be one in 167. The winner will be contacted
via email and will be asked to provide a mailing address in order to receive the prize. You
can win only one prize.
POTENTIAL CONFLICTS OF INTEREST
The principal investigator of this research study is also a teacher at one of the virtual
high schools. Your email address will be the only personal information collected by the
researcher. You will not be asked to include your name on any of the questionnaires;
therefore, there is no way for the researcher to know whether any students taking his
classes have participated.
CONFIDENTIALITY
No information is being collected that can be linked back to you, so your privacy
will be ensured. After the completion of the study, the data will be stored for three years
in a locked file cabinet belonging to the principal investigator, at which time it will
be destroyed.
Teachers will not have access to the information you provide on this survey and your
answers will not influence the grade you receive in your course(s).
When the results of the research are published or discussed in conferences, no
information will be included that would reveal your identity.
PARTICIPATION AND WITHDRAWAL
You can choose whether to be in this study or not. If you volunteer to be in this study,
you may withdraw at any time without consequences of any kind. You may also refuse
to answer any questions you don’t want to answer and still remain in the study.
ALTERNATIVES TO PARTICIPATION
Your alternative is to not participate. If you choose not to participate, you will be asked to
continue with your regular classroom activities, but you will not be asked to complete the
survey.
IDENTIFICATION OF INVESTIGATORS
If you have any questions or concerns about the research, please feel free to contact the
Principal Investigator, Brandon Martinez, via email at brandodm@usc.edu. You may also
contact the Faculty Advisor, Robert Rueda, via mail at WPH 600A, Los Angeles, CA
90089-4036; email at rueda@usc.edu; or phone at (213) 740-2371.
RIGHTS OF RESEARCH SUBJECTS
You may withdraw your consent at any time and discontinue participation without
penalty. You are not waiving any legal claims, rights or remedies because of your
participation in this research study. If you have questions regarding your rights as a
research subject, contact the University Park IRB, Office of the Vice Provost for
Research, Stonier Hall, Room 224a, Los Angeles, CA 90089-1146, (213) 821-5272 or
upirb@usc.edu.
VIRTUAL SIGNATURE OF RESEARCH SUBJECT
I understand the procedures described above. My questions have been answered to my
satisfaction, and I agree to participate in this study. I can print a copy of this form.
□ Yes, I agree to participate in the research study
□ No, I do not want to participate; however, I would like to be eligible for entry into
the drawing.
______________________________________________
Your email address: