A COMPARISON OF TRADITIONAL FACE-TO-FACE SIMULATION VERSUS VIRTUAL
SIMULATION IN THE DEVELOPMENT OF CRITICAL THINKING SKILLS,
SATISFACTION, AND SELF-CONFIDENCE IN UNDERGRADUATE NURSING
STUDENTS
by
Chia-Yen (Cathy) Li
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
May 2016
Copyright 2016 Chia-Yen (Cathy) Li
Dedication
This dissertation is dedicated to my grandparents, Mr. Cheng-Kang Lu and Mrs. Yun-Ying Change Lu. I would not have become who I am, or the first doctor in the family, without their unconditional support. They always gave me the freedom and support to choose my career and make my own decisions because they had faith in me and believed I could be successful. Whenever I got frustrated, they listened and showed me their sympathy. I knew I could always go back home, where they would be there for me. Even though my grandparents have passed away, I know they would be very happy to see what I have accomplished today.
Acknowledgements
During the three-year journey toward my Doctor of Education degree, I received a great deal of support, guidance, and assistance from USC professors and academic advisors, my family, colleagues, and classmates. It is impossible to list all of their names and express my appreciation to each of them. However, I must express my sincere appreciation to my committee chair and members, my husband, my daughter, and my uncle.
First, I want to thank my committee chair, Dr. Robert Keim, for his expertise in quantitative research and data analysis. Thank you for being flexible and adapting to my learning needs, and for providing guidance when I encountered challenges during data collection. I also want to thank Dr. Monique Datta for her constructive and thorough feedback on the content, writing, and APA style. Finally, I want to thank Dr. David Dunham for serving as the nursing content expert on my committee and for helping me obtain support from Lippincott and Laerdal. I truly appreciate all the time and effort my committee chair and members spent reading my dissertation and providing feedback. I would not have been able to complete my dissertation or have had this wonderful journey without their help.
Second, I want to express my deepest appreciation to my husband, Mr. Ming-Chung Hung. Thank you for taking care of me and tolerating my irritability when I was pregnant. I also want to thank you for taking care of our daughter so I could continue working toward my doctoral degree and graduate on time. Finally, I want to thank my lovely daughter, Verina, for being one of the best toddlers in the world by sitting next to me and letting me do my coursework. I cannot express how lucky I feel to have both of you in my life.

Finally, I want to express my appreciation to my uncle, Mr. Chang-Chin Lu. Thank you for being so supportive when I decided to quit my job in Taiwan and come to the US when other members of our family did not support my decision. I would not have been able to achieve my goals without your encouragement and support. Thank you for believing in my strengths and providing support to help me overcome my weaknesses.
Table of Contents
Dedication
Acknowledgements
List of Tables
List of Figures
Abstract
Chapter One: Overview of the Study
Background of the Problem
Statement of the Problem
Purpose of the Study
Research Questions
Significance of the Study
Limitations
Delimitations
Definition of Terms
Organization of the Study
Chapter Two: Literature Review
A Brief History of Simulation in Nursing Education
Typology of Simulation
Platforms of Virtual Simulation
Critical Thinking in Nursing Education
Definition of Critical Thinking
The Effects of Simulation on Critical Thinking Development
Human Patient Simulation
Virtual Simulation
Analysis of Studies on Simulations and Critical Thinking
The Effects of Simulation on Satisfaction and Self-Confidence in Learning
Analysis of Studies on Simulations and Satisfaction and Self-Confidence in Learning
Theoretical Framework
Kolb’s Experiential Learning Theory
The NLN/Jeffries Simulation Model
Chapter Summary
Chapter Three: Methodology
Sample and Population
Instrumentation
Data Collection
Data Analysis
Chapter Four: Results
Sample Demographics
Face-to-Face Simulation Group
Virtual Simulation Group
Results
Critical Thinking Skills
Research Question One
Research Question Two
Research Question Three
Satisfaction and Self-Confidence in Learning
Research Question Four
Research Question Five
Chapter Summary
Chapter Five: Discussions of Findings
Discussion of Findings
Implications for Practice
Recommendations for Future Research
Conclusion
References
Appendix A: Mrs. Chase
Appendix B: Demographic Questionnaire
Appendix C: NLN Simulation Tools Permission
Appendix D: Satisfaction and Self-Confidence in Learning Survey - Face-to-Face Simulation Group
Appendix E: Satisfaction and Self-Confidence in Learning Survey - Virtual Simulation Group
Appendix F: Recruitment Letter
Appendix G: Information Sheet and Consent
List of Tables
Table 1: Demographic Characteristics for All Participants
Table 2: Descriptive Statistics for HSRT Overall Scores and Minutes Spent on Tests by Group
Table 3: Test of Normality and Overall HSRT Scores by Group
Table 4: Independent t-test Results of HSRT Overall Scores (Pre-Test)
Table 5: Independent t-test Results of HSRT Overall Scores (Post-Test)
Table 6: Paired-Samples t-test Results by Group
Table 7: Descriptive Statistics for Individual Survey Question of Satisfaction and Self-Confidence in Learning by Group
Table 8: Test of Normality for Satisfaction and Self-Confidence in Learning Survey Questions by Group
Table 9: Mann-Whitney U Test Results for Total Scores, Test Statistics, and Ranks by Group
Table 10: Mann-Whitney U Test Results for Total Scores and Test Statistics by Question and Group
Table 11: Mann-Whitney U Test Results for Ranks of Each Survey Question by Group
List of Figures
Figure 1: The NLN/Jeffries Simulation Framework
Figure 2: Face-to-Face Simulation Group HSRT Overall Scores (Pre-Test)
Figure 3: Face-to-Face Simulation Group HSRT Overall Scores (Post-Test)
Figure 4: Virtual Simulation Group HSRT Overall Scores (Pre-Test)
Figure 5: Virtual Simulation Group HSRT Overall Scores (Post-Test)
Figure 6: Virtual Simulation Group Survey Question One Scores
Figure 7: Face-to-Face Simulation Group Survey Question One Scores
Figure 8: Virtual Simulation Group Survey Question Two Scores
Figure 9: Face-to-Face Simulation Group Survey Question Two Scores
Figure 10: Virtual Simulation Group Survey Question Three Scores
Figure 11: Face-to-Face Group Survey Question Three Scores
Figure 12: Virtual Simulation Group Survey Question Four Scores
Figure 13: Face-to-Face Simulation Group Survey Question Four Scores
Figure 14: Virtual Simulation Group Survey Question Five Scores
Figure 15: Face-to-Face Simulation Survey Question Five Scores
Figure 16: Virtual Simulation Group Survey Question Six Scores
Figure 17: Face-to-Face Simulation Group Survey Question Six Scores
Figure 18: Virtual Simulation Group Survey Question Seven Scores
Figure 19: Face-to-Face Simulation Group Survey Question Seven Scores
Figure 20: Virtual Simulation Group Survey Question Eight Scores
Figure 21: Face-to-Face Simulation Group Survey Question Eight Scores
Figure 22: Virtual Simulation Group Survey Question Nine Scores
Figure 23: Face-to-Face Simulation Group Survey Question Nine Scores
Figure 24: Virtual Simulation Group Survey Question 10 Scores
Figure 25: Virtual Simulation Group Survey Question 11 Scores
Figure 26: Face-to-Face Simulation Group Survey Question Ten Scores
Figure 27: Face-to-Face Simulation Group Survey Question 11 Scores
Figure 28: Virtual Simulation Group Survey Question 12 Scores
Figure 29: Face-to-Face Simulation Group Survey Question 12 Scores
Figure 30: Virtual Simulation Group Survey Question 13 Scores
Figure 31: Face-to-Face Simulation Group Survey Question 13 Scores
Abstract
Healthcare organizations demand that nurses have sufficient critical thinking skills to
deal with complex clinical situations, satisfy the needs of patients and family members, and
provide high-quality and safe patient care. However, limited clinical training sites and nursing
faculty shortages drive nurse educators to conduct high-fidelity simulations in educating future
nurses. Yet high-fidelity simulation is costly, and its effectiveness in increasing nursing students’ critical thinking skills remains inconclusive. Virtual simulation may serve as an alternative due to its lower cost and accessibility. The purpose of this quantitative study was to
investigate and compare the effects of virtual simulation and traditional face-to-face simulation
on the acquisition of critical thinking skills, satisfaction, and self-confidence in learning among
undergraduate nursing students. The research utilized an experimental, two-group, pre-test and
post-test design. Convenience sampling was used, and the final sample size consisted of 49
undergraduate nursing students who were in their fourth semester of the nursing program. The
Health Science Reasoning Test (HSRT) was used to measure participants’ critical thinking skills
and a Likert scale survey developed by the National League for Nursing (NLN) was used to
evaluate participants’ satisfaction and self-confidence in learning. The data were analyzed using
an independent-samples t-test, a paired-samples t-test, and a Mann-Whitney U test. The results
suggested that virtual simulation may be as effective as face-to-face simulation in improving critical thinking skills; however, students were more satisfied with face-to-face simulation, and no difference in self-confidence was found between groups. The results of the study
support the use of virtual simulation by nurse educators. Future research should be conducted to
include a larger sample and to examine the effectiveness of different virtual simulation programs
or virtual simulation combined with other instructional methods.
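For readers who want to see the kind of analysis summarized above in concrete form, the sketch below shows how an independent-samples t-test, a paired-samples t-test, and a Mann-Whitney U test might be computed in Python with pandas and scipy.stats. It is a minimal illustration only: the file names and column names (pre_hsrt, post_hsrt, satisfaction_total) are hypothetical placeholders and are not the data files, variables, or software actually used in this dissertation.

import pandas as pd
from scipy import stats

# Hypothetical data files: one row per participant, with HSRT scores
# (pre_hsrt, post_hsrt) and a Likert survey total (satisfaction_total).
f2f = pd.read_csv("face_to_face_group.csv")
virtual = pd.read_csv("virtual_group.csv")

# Between-group comparison of HSRT post-test scores (independent-samples t-test).
t_ind, p_ind = stats.ttest_ind(f2f["post_hsrt"], virtual["post_hsrt"])

# Within-group pre/post comparison for one group (paired-samples t-test).
t_rel, p_rel = stats.ttest_rel(virtual["pre_hsrt"], virtual["post_hsrt"])

# Between-group comparison of ordinal survey totals (Mann-Whitney U test).
u_stat, p_u = stats.mannwhitneyu(f2f["satisfaction_total"],
                                 virtual["satisfaction_total"],
                                 alternative="two-sided")

print(f"Independent-samples t-test: t = {t_ind:.2f}, p = {p_ind:.3f}")
print(f"Paired-samples t-test:      t = {t_rel:.2f}, p = {p_rel:.3f}")
print(f"Mann-Whitney U test:        U = {u_stat:.1f}, p = {p_u:.3f}")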
CHAPTER ONE: OVERVIEW OF THE STUDY
Advanced technologies and improved economic status have resulted in patients and family members who are better informed and expect high-quality health care (Hovancsek et al., 2009).
In order to meet the growing needs of patients, family members, and healthcare organizations’
entry expectations, nurses must have effective critical thinking and clinical reasoning skills to
both recognize fluctuations in patients’ conditions in a timely manner and provide safe patient
care that leads to positive clinical outcomes (Lapkin, Levett-Jones, Bellchambers, & Fernandez,
2010). However, limited availability of clinical sites and faculty shortages in the United States
present a challenge to nursing schools’ current efforts to teach these essential skills (Dutile,
Wright, & Beauchesne, 2011). As a result, innovative teaching strategies must be implemented
to educate future nurses. Simulations can be used to improve and evaluate the competencies of
nursing students, new graduates, and experienced health care professionals by allowing them to
experience real-life situations and patients’ responses, and to apply knowledge in practice
without posing harm to patients (Cioffi, 2001; Decker, Sportsman, Puetz, & Billings, 2008; Jeffries, 2005; Simpson, 2002; Weaver, 2011).
In nursing education, simulations are used in various forms and settings (Weatherspoon
& Wyatt, 2012), and extensive research focuses on the use of human patient simulators in
relation to students’ satisfaction, anxiety, self-confidence, clinical judgment (Norman, 2012),
knowledge, skills, and attitudes (Shin, Park, & Kim, 2015). Face-to-face simulation training
using high-fidelity mannequins is currently the primary teaching method in nursing schools
(Goodstone et al., 2013; Howard, 2013). However, high-fidelity mannequins and laboratories
are extremely expensive and may not be the most cost-effective way to educate nursing students
(Lapkin & Levett-Jones, 2011; Shin et al., 2015).
Computer-based simulation, such as virtual simulation, costs relatively less and allows
students to learn and practice independently at their own pace (Kilmon, Brown, Ghosh, &
Mikitiuk, 2010; Tschannen, Aebersold, McLaughlin, Bowen, & Fairchild, 2012). With the rapid
growth of online education, virtual simulation may be an effective alternative for nursing
educators if proven to be as useful as other methods in bridging the gap between theory and
clinical practice; however, studies on the effectiveness of virtual simulation in educating nursing
students remain limited (Foronda, Godsall, & Trybulski, 2013; Tschannen et al., 2012). Nurses
need to have sufficient critical thinking skills to provide safe patient care and have appropriate
clinical judgment while dealing with unpredictable situations. Due to the importance of critical
thinking in nursing education, as well as the limited clinical sites and current shortage of nursing
faculty, the efficacy of virtual simulation in improving critical thinking among nursing students
should be investigated. This dissertation examined the effectiveness of virtual simulation in
increasing baccalaureate nursing students’ critical thinking skills.
Background of the Problem
Nurses make up the largest professional group of the healthcare workforce (Alameddine,
Baumann, Laporte, & Deber, 2012), with more than 3.1 million registered nurses across the
United States (American Association of Colleges of Nursing [AACN], 2011). In both hospitals
and long term care facilities, the majority of patient care is delivered by nurses (AACN, 2011).
Despite there being many registered nurses, rising needs caused by national healthcare reform
and an increasingly aging population mean the United States is expected to experience a nursing
shortage (AACN, 2014b). Juraschek, Zhang, Ranganathan, and Lin (2012) anticipated a national
nursing shortage to occur between 2009 and 2030, with the largest RN job deficits to be found in
California, Florida, and Texas. Healthcare facilities may require nurses to take care of more
patients or work overtime due to a lack of adequate staffing on each shift. Researchers found
that higher patient-to-nurse ratios or heavier patient loads contribute to increases in hospital readmissions, infections, and patient deaths (Cimiotti, Aiken, Sloane, & Wu, 2012; Needleman et al., 2011; Tubbs-Cooley, Cimiotti, Silber, Sloane, & Aiken, 2013). Acknowledging the increased demand for safe and high-quality health care, the Institute of Medicine released a landmark report in 2010 recommending that the percentage of nurses with a bachelor’s degree be increased to 80% and that the number of doctorally prepared nurses be doubled by 2020. Nursing schools need to supply increasing numbers of competent nurses to ease the shortage and meet the nation’s demands.
While increasing the number of new nurses may help the anticipated nationwide deficit,
nursing schools currently struggle with faculty shortages. According to the AACN (2015a),
68,938 qualified applicants were not admitted to nursing schools in 2014 due to faculty
shortages, lack of clinical sites, limited classroom space, an inadequate number of clinical
preceptors, and insufficient budgets. In addition, nurse executives have a great deal of concern
with regard to the ability of new nurses to provide patients with safe and effective care. Berkow,
Virkstis, Stewart, and Conway (2009) pointed out that only 10% of healthcare organizations’
leaders believe new graduate nurses are capable of delivering care safely and effectively. In
addition, Del Bueno (2005) pointed out that only 35% of graduating nurses have sufficient clinical judgment to meet entry-level requirements, a finding that has not changed since the 1990s. Nurse educators must, therefore, adopt innovative teaching strategies to
overcome existing barriers and help students develop essential skills for safe practice.
The healthcare environment and problems that nurses encounter on a daily basis are
increasingly complex due to the greater demands of patients and families, requirements of
evidence-based practice, and emphasis on patient satisfaction (Chan, 2013). Leaders of
healthcare organizations expect nurse educators to fully prepare students to work in a rapidly
changing environment while successfully meeting the needs of patients and families. However,
due to limited time in both clinical and theory-based courses, it is difficult to teach all of the
diseases or issues students may face in practice. Therefore, critical thinking is crucial in
providing safe and effective patient care (Chang, Chang, Kuo, Yang, & Chou, 2011).
Nationally and internationally, nursing schools use high-fidelity simulation (HFS) widely
(Gates, Parr, & Hughen, 2012; Levett-Jones, Lapkin, Hoffman, Arthur, & Roche, 2011; Norman,
2012), especially since, in 2000, the Institute of Medicine suggested it be used to promote patient
safety (Vincent, Sheriff, & Mellott, 2015). The significant increase in simulation training is due
to difficulties in finding clinical placements (Alinier, Hunt, Gordon, & Horwood, 2006; Foronda
& Bauman, 2014; Jeffries, 2009; Lasater, 2007; Nehring, 2008; Seropian, Brown, Gavilanes, &
Driggers, 2004). According to Terry and Whitman (2011), 68.2% of program deans and
directors reported having difficulty finding students clinical placements due to the economic
downturn. Other causes include a limited number of clinical supervisors and faculty members
(Alinier et al., 2006; Jeffries, 2009; Lasater, 2007; McNeal, 2010; Nehring, 2008), concerns of
patient safety (Jeffries, 2009; Nehring, 2008), shortened hospital stays (Lasater, 2007; Nehring,
2008), and lack of clinical opportunities concerning all medical diagnoses taught in classrooms
(Norman, 2012). The use of simulation can increase students’ ability to manage a crisis by
offering opportunities to learn and reflect on scenarios before encountering them in a real setting
(Sanford, 2010; Weaver, 2011).
While simulations can be used to complement what students are unable to experience in
actual clinical settings, the cost of HFS is very high. Nursing schools must have sufficient
financial resources, physical space, and staff to operate a simulation laboratory that includes
proper equipment as well as the time the faculty may have to spend creating scenarios (Sanford,
2010). According to McIntosh, Macario, Flanagan, and Gaba (2006), the cost of creating a
simulation laboratory is $876,485, with maintenance costs of $361,245 annually. Schools facing budget constraints might, thus, find it difficult to utilize HFS in their curricula. In addition, faculty perceptions of the technological knowledge required to teach HFS, as well as a lack of faculty development related to HFS, contribute to its underuse (Leighton, 2013). Finally, even though the use of HFS to teach clinical skills is widely accepted as effective in improving knowledge and critical thinking skills and in enabling students to identify fluctuations in patient conditions (Lapkin et al., 2010; McCallum, Ness, & Price, 2011), current research on the impact of HFS on critical thinking is still inconclusive (Shinnick & Woo, 2013). As a result,
alternative cost-effective methods of teaching critical thinking and other required skills must also
be investigated.
Statement of the Problem
The trend in pedagogy recently moved to online education and innovative teaching
(Foronda et al., 2013). According to Allen and Seaman (2013), in the United States, 6.7 million
(32%) students in higher educational institutions took at least one online course in 2011. The use
of virtual world simulation and avatars for training has also grown rapidly (Kalisch, Aebersold,
McLaughlin, Tschannen, & Lane, 2015; Miller & Jensen, 2014) due to increasingly advanced
technologies, easy access to these technologies, fast Internet speeds, and lower costs (Bryne,
Heavey, & Bryne, 2010). Virtual simulation has the potential to be an effective teaching method
in the future (Tschannen et al., 2012) as new generations of students are more familiar with
computer-based learning and social networking (Aebersold, Tschannen, & Bathish, 2012a).
Extensive research has examined the effectiveness of simulations conducted in the laboratory using high-fidelity human patient simulators. Few studies have examined computer-based simulation and virtual simulation in relation to the critical thinking skills, satisfaction, and self-confidence in learning of nurses and nursing students. Furthermore, a majority of the virtual simulations
reviewed in current studies were held in Second Life, which was a very popular virtual platform
used by educators to conduct virtual simulations, but which “has not lived up to the hype that
peaked in 2007” (Young, 2010, para. 2). It is imperative to explore the effectiveness of another
platform for virtual simulations. Schools may choose virtual simulation over the costlier HFS if
the former’s effectiveness in developing and promoting students’ critical thinking skills,
satisfaction and self-confidence in learning can be better determined.
Purpose of the Study
Due to the rising demand for safe and high-quality care from patients, families, and the
leaders of health care agencies, educators need to fully prepare students to work in complex
healthcare environments. Face-to-face HFS currently exists as the standard method of
simulation training; however, space requirements and high costs can prevent its implementation.
Therefore, as an alternative, virtual simulation may be a more cost-effective tool to develop
students’ critical thinking skills. The purpose of this dissertation was to investigate and compare the effects of virtual simulation and traditional face-to-face simulation on nursing students’ acquisition of critical thinking skills as well as on their satisfaction and self-confidence in learning.
Research Questions
1. Is there a significant difference in the critical thinking skills of students who are trained
in face-to-face simulation and those trained in virtual simulation?
2. Is there a significant difference in the critical thinking skills of students before and after
participating in virtual simulation?
3. Is there a significant difference in the critical thinking skills of students before and after
receiving traditional face-to-face simulation?
4. Is there a significant difference in the satisfaction with learning of students who are
trained via face-to-face simulation and those trained via virtual simulation?
5. Is there a significant difference in the self-confidence in learning of students who are
trained via face-to-face simulation and those trained in virtual simulation?
Significance of the Study
The findings of this dissertation may contribute to current literature regarding the
effectiveness of virtual simulations built on a platform other than Second Life in increasing
nursing students’ critical thinking skills, satisfaction, and self-confidence in learning. In
addition, the results can also assist educators in determining whether traditional face-to-face
simulation is a better instructional method and worth the costs to build and maintain a simulation
laboratory as well as the expenses to train faculty members on how to conduct these simulations.
The findings may also provide evidence regarding whether virtual simulation may be an effective alternative to traditional face-to-face simulation as a learning tool for students who
have difficulties in commuting to campuses or following a fixed class schedule. Finally, the
results can help educators make informed decisions when choosing a more cost-effective
teaching strategy to improve students’ critical thinking skills, which are critical to ensuring the
safety and satisfaction of patients, while taking their students’ satisfaction and self-confidence in
learning into consideration.
Limitations
This study has several identified limitations. First, all participants were also taking other
classes and practicums during the period of data collection. As a result, their exposure to different situations in real clinical settings, case studies completed for didactic classes, and healthcare-related work experience outside school may have contributed to the improvement in their critical thinking skills, apart from the effect of either the face-to-face simulation or the
virtual simulation. Second, participants in both groups may have communicated with each other,
and students in the virtual simulation group may have shared the virtual simulation program they
used with participants in the face-to-face simulation group. Third, this study was conducted over a five-week period, and the increase in participants’ critical thinking skills may not have been large enough to produce a statistically significant difference. Fourth, the face-to-face simulations were taught by
four different lab instructors in four classes in order to accommodate all students. Although all
simulation lab instructors were provided the same transcript of each simulation scenario weekly,
the content delivered and the instructional methods used may have differed due to academic freedom and students’ responses to the simulated scenarios. Therefore, the results may have been affected. Finally, due to small class sizes, participants may have believed their simulation lab instructor would learn of the survey results and, therefore, may have provided more positive feedback, even though they were asked to complete the satisfaction and self-confidence in learning survey anonymously.
Delimitations
One of the delimitations of this study is the convenience sampling method. The study
took place in a private non-profit university, and Bachelor of Science in Nursing (BSN)
students in their fourth semester of the program were recruited to participate in this study.
Therefore, the results of the study cannot be generalized to students in associate nursing
programs, licensed nurses, or BSN students who have just started their programs. In addition, the small sample size limits the generalizability of the results.
Definition of Terms
Debriefing. An activity conducted by a faculty member after a simulated patient care
scenario during which “the case is deconstructed and analyzed, and feedback is given to the
participants by faculty and other students” (Cato, 2012, p. 3).
High Fidelity Simulation (HFS). Simulation conducted “using full scale computerized patient simulator, virtual reality, or standardized patients that are extremely realistic and provide a high level of interactivity and realism for the learner” (Simulation Innovation Resource Center, n.d., p. 1).
Human Patient Simulators. The “whole body mannequins (adult, child, or infant) that are
capable of responding to certain medications, chest compressions, needle decompression, chest
tube placement, and other physiologic interventions and subsequent responses” (Galloway, 2009,
p. 3).
Partial Task Trainers. Low-tech mannequins that “are designed to replicate part of a
system or process” (Galloway, 2009, p. 3). The mannequins present a specific part of the body
and are used for practicing particular tasks (Decker et al., 2008).
Simulation. “Activities that mimic the reality of a clinical environment and are designed
to demonstrate procedures, decision-making, and critical thinking through techniques such as
role playing and the use of devices such as interactive videos or mannequins” (Jeffries, 2005, p.
97). A typical simulation consists of one student or a group of students and a simulated patient
who requires care (Cato, 2012). A faculty member usually observes how students provide care.
A simulation can be video recorded for students to review and reflect on their performance after
completing each scenario. Distance learners can also be involved by watching recorded
simulations (Cato, 2012).
Standardized Patients. Paid actors or volunteers who are “instructed on how to act as if
they have a particular disease or condition in a given patient simulation in a given health care
setting” (Nehring & Lashley, 2009, p. 14).
Virtual Simulation. The word virtual means “offered on a computer, web-based, or
offered online” and it “has been linked to two-dimensional (2D) computerized case studies, CD-
ROMs, videos, games, or study programs” or “more commonly associated with 3D virtual
communities” (Foronda et al., 2013, p. 281). Virtual simulations occur in a virtual world, which
“resembles the real world, and has real world rules, such as distance, gravity, and the ability to
move within the world” (Dev, Youngblood, Heinrichs, & Kusumoto, 2007, p. 322). The
characters in a virtual world are called avatars and represent users from the real world. Users
control their avatars from their computers (Dev et al., 2007). The virtual simulation program
used in this study is called vSim for Nursing Medical/Surgical. This program was selected for this
study because it includes the same NLN simulation scenarios used for conducting face-to-face
simulations in the selected nursing program.
Organization of the Study
This dissertation is presented in five chapters. Chapter One delineates the anticipated
nursing shortage and need for educators to develop competent nurses to meet the demands of
patients, family members, and healthcare organizations. This chapter also discusses challenges
educators encountered in meeting the needs of the population and ensuring patient safety. The
purpose of the study and its potential contributions are also presented. Finally, limitations,
delimitations, and definitions of key terms used in this study are also included. Chapter Two
provides a review of research pertaining to the use of HFS and virtual simulation in improving
critical thinking skills, satisfaction, and self-confidence in learning among prelicensure nursing
students. A synthesis of current literature and the theoretical simulation framework are also
presented. Chapter Three describes the research design of the quantitative study, procedures and
methods for data collection, and statistical techniques used for data analysis. Chapter Four
presents the characteristics of the sample, results of the descriptive analysis, and the statistical
techniques used in answering the research questions. Chapter Five discusses the study’s
findings, strengths, weaknesses, and implications. Recommendations for future research are also
presented in the last chapter.
CHAPTER TWO: LITERATURE REVIEW
Simulation as a training tool is widely and successfully used in military, aviation, and
nuclear power contexts (Aebersold & Tschannen, 2013). Simulation was introduced into nursing
education in the 1990s (Leighton, 2013) and has grown in popularity over the past decade
(Aebersold & Tschannen, 2013; Gates et al., 2011; Levett-Jones et al., 2011; Norman, 2012).
Face-to-face simulation using high-fidelity human patient simulators is the most widely used
and studied method in nursing education and research today (Aebersold et al., 2012a; McCallum
et al., 2011). Virtual simulation has also grown in popularity, though its concepts and potential
impact on nursing education are not yet sufficiently understood (De Gagne, Oh, Kang,
Vorderstrasse, & Johnson, 2013; Foronda & Bauman, 2014; Foronda et al., 2013; Tschannen et
al., 2012). This chapter presents a brief history of simulation training that covers the typology of
simulation and virtual simulation, common platforms of virtual simulation, and critical thinking
in nursing education as well as discusses current research on human patient simulation and
virtual simulation in improving critical thinking, the limitations of current studies, and a
theoretical framework for simulated learning.
A Brief History of Simulation in Nursing Education
Simulation has historically been used predominantly in the aviation industry and in
medical education (Hotchkiss & Mendoza, 2001; Ward-Smith, 2008). According to Rolfe
and Staples (1986), the use of high level simulation was first implemented for pilot training
during World War II (as cited in Ward-Smith, 2008). In nursing education, educators
historically used simulation to teach psychomotor skills, such as injections and catheterizations
(Lasater, 2007). As recently as 1949, turkey and chicken bones, a lamb’s jaw, and doll beds
SIMULATION AND CRITICAL THINKING 23
were used to teach students about fractures and the application of traction (Slalas & Neber,
1949).
According to Nehring and Lashley (2009), the first life-size, low-fidelity static
mannequin, “Mrs. Chase”, was conceived by Lauder Sutherland, the principal of the Hartford
Hospital Training School, and produced by the M.J. Chase company in 1910 based on the real
Mrs. Chase (Appendix A). Mrs. Chase was first used in classrooms in 1911 (Hyland &
Hawkins, 2009) and a baby model was developed in 1913 (Nehring & Lashley, 2009). By 1914,
Mrs. Chase was further modified with a site in her arm for needle injection and an internal device
with which users could practice rectal, urethral, and vaginal procedures (Nehring & Lashley,
2009). Mrs. Chase became very popular in nursing education in the 1950s (Hyland & Hawkins,
2009) and was manufactured until the 1970s (Nehring & Lashley, 2009).
The Harvey model was used for the next decade (Peteani, 2004). The Harvey model is a
life-sized simulator designed with the capability of simulating the physical characteristics of 20
cardiovascular diseases that could be found at the bedside (Jones, Hunt, Carlson, & Seamon,
1997). The Harvey model was equipped with physical features that related to the cardiovascular
system, such as blood pressure and pulse and allowed nursing students to practice distinguishing
normal and abnormal lung and heart sounds (Cooper & Taqueti, 2004). The Harvey model was
commonly used to augment teaching in cardiac units (Peteani, 2004).
The static mannequins used in the 1950s slowly progressed over the years (Decker et al.,
2008). The use of the Resusci-Anne mannequin in cardiopulmonary resuscitation classes
represents the first mannequin-based simulation implemented for the training of contemporary
nurses (Stokowski, 2013). This mannequin was developed in the early 1960s by the Norwegian
plastic-toy company Asmund Laerdal (Cooper & Taqueti, 2004). This mannequin was equipped
with a realistic airway and a chest wall that included an internal spring to allow users to perform
a full cycle of cardiopulmonary resuscitation, consisting of airway, breathing, and circulation
(Cooper & Taqueti, 2004).
Due to the wide use of simulation by surgeons, anesthesiologists, and emergency
medicine doctors, simulation technologies have advanced since the 1960s (Wilford & Doyle, 2006).
In 1969, the first fully computerized simulator, Sim-One, was developed and primarily used to
train students in the schools of anesthesia for endotracheal intubation (Peteani, 2004). Sim-One
“was a remarkably lifelike mannequin, controlled by a hybrid digital (with ‘4096 words of
memory’) and analogue computer” (Cooper & Taqueti, 2004, p. 12). This mannequin was also equipped with an anatomically shaped chest wall, blinking eyes, reactive pupils, and a movable jaw (Cooper & Taqueti, 2004, p. 12). However, Sim-One was not commercialized because of the high cost of the computer technology of the time (Cooper & Taqueti, 2004, p. 12).
In the mid-1990s, the Laerdal Company merged with the Medical Plastics Corporation of
Texas and developed a mannequin with more anatomical features, named SimMan.
SimMan is an example of a high-fidelity simulator. The standardized features of SimMan enable
users to practice endotracheal intubation, chest tube insertion, and bladder and intravenous
catheter insertions (Peteani, 2004). SimMan is also built with physical features, such as a mouth,
airway, and chest that allow users to practice physical assessment and other skills (Peteani,
2004). Finally, heart, lung, and bowel sounds as well as electrocardiogram, pulse oximetry, and
arterial pressure can be programmed to simulate scenarios users may encounter in a clinical
setting (Peteani, 2004). SimMan is manufactured in different versions with an age range of six to 100 years, and additional modules are available for simulating the management of wound care, trauma, and bleeding control (Peteani, 2004). Advanced simulation technology promoted the
development of the HFS mannequins used in the 21st century. Different generations of SimMan,
such as SimMan Essential and SimMan 3G, include wireless capability and a higher level of
functionality and fidelity, which can be used to enhance practices in respiratory, cardiac, and
neurological management.
Efforts toward web-based simulation for the general public began in 1996 (Foronda et al.,
2013). Along with this movement, virtual simulation became a common teaching method used
to teach physicians and medical students how to deal with various situations and perform
surgeries (Foronda et al., 2013). Virtual reality simulation is also common in other industries,
such as the airline industry and military medicine (Jenson & Forsyth, 2012). Due to the
advancement of 3D and video-game technologies, the use of virtual simulation and avatars in
nursing education grew rapidly (Miller & Jensen, 2014). However, these technologies remain in
the early stages of development (Kilmon et al., 2010). In Europe, the European Commission funded an electronic virtual patients project (eViP) to create 320 virtual patients, which can be accessed under a Creative Commons license (eViP, 2016). This project serves as an example of the decreasing barriers to conducting virtual simulations; it makes the use of virtual patients accessible, interoperable, and reusable by providing standardized designs and implementations (McAfooes, Childress, Jeffries, & Feken, 2012).
Typology of Simulation
A simulation can be as simple as peer-to-peer learning that enables students to practice
skills with each other or as complex as full-scale simulation that requires students to think
critically and make appropriate clinical decisions (Decker et al., 2008). According to Jeffries
(2005), simulations are:
Activities that mimic the reality of a clinical environment and are designed to
demonstrate procedures, decision-making, and critical thinking through techniques such
as role-playing and the use of devices such as interactive videos or mannequins. A
simulation may be very detailed and closely simulate reality, or it can be a grouping of components that are combined to provide some semblance of reality. (p. 97)
Case study scenarios can also be considered simulation in the broadest sense of the word (Sanford et al., 2010). The different types of simulation can be distinguished in terms of their complexity and fidelity. Fidelity indicates “the degree that the object mimics reality” (Nehring & Lashley, 2009, p. 536). The higher the fidelity of a simulation, the more closely it approximates a real setting. Low-fidelity patient simulation, which is
often used in the teaching of psychomotor skills, uses low-tech mannequins such as task trainers
to simply represent certain parts of the human body (Weaver, 2011). Full-scale simulation refers
to the use of full-body, medium to high-fidelity mannequins to provide more realistic patient
responses in lifelike medical environments (Decker et al., 2008). Medium-fidelity, or
intermediate-fidelity simulations have more realistic mannequins with lung and heart sounds
(Seropian et al., 2003). Medium-fidelity mannequins can be computerized, but cannot be
utilized in authentic clinical scenarios (Weaver, 2011). Mannequins used in HFS have realistic
physical features, such as chest movements and functional eyes (Seropian et al., 2003). These
mannequins can also react to students’ interventions, and, therefore, reflect the true responses of
patients in clinical settings (Seropian et al., 2003).
Standardized patients are used frequently in medical education (Nehring & Lashley,
2009). Paid actors or volunteers are trained to act as patients consistently in a realistic manner
and researchers recommend that they be used to evaluate students’ communication skills, physical
assessment competencies, and interview techniques (Decker et al., 2008; Nehring & Lashley,
2009). A haptic system refers to a simulator that utilizes both virtual reality technology and real
world exercises to teach and evaluate specific skills while tracking student performance (Decker
et al., 2008).
The advantages of using simulation in nursing education are well established in the
literature. According to Shin et al. (2015), simulation has maximum benefit for senior
undergraduate nursing students. Simulation offers opportunities to implement skills and
knowledge acquired in school as well as to increase abilities to manage a crisis before
encountering it in a clinical setting, thus allowing students to see the consequences of certain
interventions without harming patients (Gates et al., 2011; Sanford et al., 2010; Weaver, 2011).
In addition, simulation fosters the development of students’ knowledge, communication skills,
and safety competencies (Norman, 2012) as well as allows students to practice holistic care from
patient admission to death and nursing care tailored specifically to disease progression (Gates et
al., 2011). In terms of students’ perceptions, simulations are known to increase their sense of
self-efficacy, self-confidence, and clinical judgment (Norman, 2012). Finally, simulation can
also be used as an evaluation tool to determine a student’s knowledge, psychomotor skills, and
attitudes, which are important for ensuring consistent patient safety across similar circumstances
and the provision of remediation as needed (Ironside, Jeffries, & Martin, 2009; Kardong-Edgren, Adamson, & Fitzgerald, 2010).
The high cost of face-to-face HFS is its primary disadvantage. According to Johnson et
al. (2014), the cost of one high-fidelity mannequin is greater than $60,000. In addition, the fees
for a physical laboratory, medical equipment, laboratory staff, and the time required for
instructors to create scenarios make the simulation a very expensive teaching strategy (Johnson
et al., 2014; Sanford et al., 2010). In addition, class size can be another drawback of HFS. Face-
to-face HFS is usually offered in groups of 8 to 10 students; therefore, educators must spend
more time in a laboratory to offer HFS, and students are usually only allowed one attempt at each
scenario (Weatherspoon & Wyatt, 2012). Access to a physical laboratory at a specific time may
also be a challenge for students who live in geographically isolated areas. Finally, Lapkin and
Levett-Jones (2011) found that HFS is not the most cost-effective teaching strategy in terms of
students’ knowledge acquisition, satisfaction, and clinical reasoning skills. Based on these
findings, identifying another innovative method to educate future nurses is, therefore, vital.
Web-based simulation emerged as a potential alternative in recent years. According to
Cant and Cooper (2014), web-based simulation can be classified into three types based on differing levels of fidelity: interactive case scenarios in which students can communicate with their patients using a synchronous texting feature; filmed actors portraying patients in real medical situations and environments; and animated patients in a virtual world that allows students to interact with them via multimedia features. Because students of
this century are technologically savvy, virtual reality simulations can be a cost-effective teaching
strategy (Tiala, 2006).
Virtual reality simulation utilizes 3D objects and stereo images to create a simulated
environment that mimics a real clinical setting and allows students to interact with virtual
patients in real time (Cant & Cooper, 2014; Jenson & Forsyth, 2012; Tiala, 2006). Despite their
differences in meaning, the term virtual reality is often used interchangeably with the term virtual world in the literature (De Gagne et al., 2013). A virtual world is an online environment that allows teaching and learning activities to take place (De Gagne et al., 2013). On the other hand, virtual reality typically involves a closed-off space in which realistic projections are cast
onto three to six walls that form a cube, or users otherwise wear a head-mounted device, such as
goggles, which projects a lifelike scene (Davis, 2009; De Gagne et al., 2013). As individuals
become fully immersed in these simulated environments, such virtual reality is also called
immersive virtual reality (Simpson, 2002). Non-immersive virtual reality, however, is the more
common form of simulation used in nursing education (Simpson, 2002). In non-immersive
virtual reality simulation, a realistic 3D environment is displayed on a desktop or movie screen that students navigate using a mouse or joystick (Davis, 2009).
There are several benefits to using virtual simulation. First, it provides students greater
exposure to practice in a simulated online environment at a relatively lower cost as compared to
HFS (Aebersold, Tschannen, Stephens, Anderson, & Lei, 2012b; Cant & Cooper, 2014;
Tschannen et al., 2012). In addition, multiple students can participate in simulations
simultaneously from their homes as long as they have Internet access (Aebersold et al., 2012b; Cant & Cooper, 2014; De Gagne et al., 2013; Jenson & Forsyth, 2012; Miller & Jensen, 2014;
Youngblood et al., 2008). Moreover, students can repeatedly practice their skills in an
interactive and nonthreatening environment, which reduces their anxiety and faculty concerns
about patient safety (Aebersold & Tschannen, 2013; Jenson & Forsyth, 2012). Virtual
simulation also enables students to respond to patients’ requests and communicate with
colleagues, which promotes the use of critical thinking skills, therapeutic communication
techniques, teamwork, and leadership skills, all of which are required to make appropriate
decisions in a real clinical setting (Aebersold et al., 2012a; Foronda & Bauman, 2014; Foronda et
al., 2013; Tschannen et al., 2012). Finally, student performance can be monitored remotely on a
computer, making a faculty member’s physical presence unnecessary (Jenson & Forsyth, 2012).
Some disadvantages of virtual simulation were also reported in the literature. First, a
major disadvantage of virtual simulation that may impede student learning is technical
difficulties (Aebersold et al., 2012b; Broom, Lynch, & Preece, 2009; De Gagne et al., 2013;
Dutile et al., 2010), such as poor audio and video quality (Dev et al., 2007), server issues,
Internet lag, and difficulties navigating avatars in a virtual world (Warburton, 2009). Moreover,
the absence of non-verbal cues can make using virtual simulation difficult (Jenson & Forsyth,
2012). Finally, many students reported having negative perceptions toward virtual simulation
because of “concerns about learning in a game-like environment” (Aebersold et al., 2012b, p.
473).
Platforms of Virtual Simulation
Second Life is the most popular online, open-access, multiuser virtual environment used
in academic settings (De Gagne et al., 2013; Foronda et al., 2013; Warburton, 2009). Second
Life was developed and launched by Linden Lab in 2003 (Aebersold et al., 2012b). Participants
create their own avatars to enable interactions with other users and may receive prompts and
instructions from facilitators, all with the feeling of being in a real-life setting (Aebersold et al.,
2012b). The costs of using Second Life to create a virtual world include purchasing and
maintaining land to build a virtual hospital ward, furniture, and other pre-built items to be placed
in the virtual environment, as well as hiring developers to create a virtual environment if needed
(Kalisch et al., 2015).
Colleges have expressed frustration with Second Life due to difficulties moving in a clunky environment, problems with managing non-academic activities, and concerns that already built virtual classrooms and laboratories could disappear if Linden Lab were to go out of business or change servers (Young, 2010). Finally, it may take considerable time and
resources to build a virtual environment and virtual patients (McAfooes et al., 2012). Regardless,
with the educational capabilities of the virtual world still considered valuable, more education
friendly virtual world software emerged, such as “Archie MD, ClinicalCare, Clini-Space,
OLIVE, Open Cobalt, OpenSimulator, TINA, Virtual Heroes, and vSim” (Foronda & Bauman,
2014, p. 412). Some of the virtual simulation programs, such as Clini-Space, Open Cobalt, and
Virtual Heroes, are multiuser virtual environments, and users are allowed to interact with
multiple individuals in real time.
The vSim for Nursing is a series of virtual simulations co-developed by Laerdal Medical
and Wolters Kluwer Health (Forneris & Scroggs, 2014), and it was released in 2014. vSim for Nursing is a single-user virtual simulation that gives students access to a virtual world in which they can interact with simulated patients via computer-animated avatars and practice skills in a realistic virtual environment at a convenient time and location. The vSim for Nursing program is
available for different specialty subjects, including medical-surgical, pediatric, maternity,
gerontology, nursing fundamentals, and pharmacology. The workflow of the vSim for Nursing
includes suggested readings linked to Lippincott’s online textbooks, pre-simulation quizzes, vSim
simulations based on the NLN’s simulation scenarios, post-simulation quizzes, documentation
assignments using Lippincott’s DocuCare, which is an educational electronic medical record,
and guided reflection questions (NLN, 2016). According to Forneris and Scroggs (2014), faculty
members perceive the vSim for Nursing as a valuable teaching method that is “very effective as a
targeted critical thinking activity to enhance prioritization and decision-making” (p. 348). The
cost of vSim for Nursing per student is $99.95 and includes access for up to 24 months for the
medical-surgical subject, and 12 months for the rest of the subjects.
Critical Thinking in Nursing Education
Advances in technology, rising demands for high-quality patient care, increased patient
acuity, shortened stays in acute care facilities, the growing elderly population, and pressure to contain costs have resulted in dramatic changes in healthcare organizations over the past 30 years (Simpson & Courtney, 2002). Nurses have greater accountability in meeting patients’
needs within an ever-changing environment that demands a higher level of thinking and
reasoning skills (Simpson & Courtney, 2002). Nurses in both academic and clinical settings
agree that critical thinking is important in complex healthcare environments (Fowler, 1998).
Nurses must use critical thinking skills to examine situations they encounter in daily practice in
order to make appropriate clinical judgments. Critical thinking also promotes the development
of evidence-based practices essential to ensuring patient safety and improving clinical outcomes
(Profetto-McGrath, 2005). The AACN (2008) identified critical thinking as one of the core
competencies in a baccalaureate education for professional nursing. Furthermore, the
Commission on Collegiate Nursing Education, which is an accrediting agency for nursing
programs, has adopted the AACN’s guidelines for its accreditation standards.
Definition of Critical Thinking
The intellectual roots of critical thinking can be traced to when Socrates “discovered by a
method of probing questions that people could not rationally justify their confident claims to
knowledge” 2,500 years ago (Critical Thinking Community, 2013, para. 1). While critical
thinking was initially discussed as a method of interpretation and evaluation in the 1940s, it
gained significance in the educational literature in 1962 (Roth, 2010). The meanings and terms used for critical thinking have been inconsistent, and a universal definition of critical thinking remains undetermined (Flores, Matkin, Burbach, Quinn, & Harding, 2012; Simpson &
Courtney, 2002). Richard Paul, Robert Ennis, and Matthew Lipman are the three leading
researchers of critical thinking (Pearson Higher Education, 2010); their definitions of critical
thinking are included here.
According to Staib (2003), Richard Paul, the director of the Center for Critical Thinking
and the chair of the National Council for Excellence in Critical Thinking in California, is
commonly quoted in the literature on his definition of critical thinking. Paul (1990) stated that
critical thinking requires “metacognition”, and defined critical thinking as “disciplined, self-
directed thinking which exemplifies the perfections of thinking appropriate to a particular mode
or domain of thought” (p. 4). Ennis (1985), a critical thinking measurement expert, defined
critical thinking as “reflective thinking focused on deciding what to believe or do” (p. 46).
Finally, Lipman (1988) defined critical thinking as “skillful, responsible thinking that facilitates
good judgment because it (1) relies upon criteria, (2) is self-correcting, and (3) is sensitive to
context” (p. 39).
Due to confusion as to the definition of critical thinking, the critical thinking Delphi
project, sponsored by the American Philosophical Association, was conducted in 1990 (Scheffer
& Rubenfeld, 2000; Simpson & Courtney, 2002). Facione (1990) and other experts in the fields
of arts and sciences reached a consensus statement describing critical thinking “to be
purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and
inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or
contextual considerations upon which that judgment is based” (p. 3).
With respect to the field of nursing, researchers also defined critical thinking differently
(Daly, 1998; Scheffer & Rubenfeld, 2000; Worrell & Profetto-McGrath, 2007). Brigham (1993)
claimed that it is difficult to define critical thinking “as there are as many definitions as there are
experts” (p. 49). The definition of critical thinking specific to nursing was initially focused
solely on intellectual or cognitive ability (Scheffer & Rubenfeld, 2000). However, Tanner
(1997) addressed the importance of including affective components. Critical thinking in nursing
was narrowly defined as “a rational-linear problem solving activity which reflected the nursing
process” (Jones & Brown, 1990, p. 529), though it is variably defined as an attitude, thought
process, and reflective thinking (Staib, 2003). Matthews and Gaul (1979) stated that critical
thinking “is an attitude of inquiry which involves the use of facts, principles, theories,
abstractions, deductions, interpretations and evaluation of arguments”, as well as “involves the
cognitive skills of comprehension, application, analysis, synthesis and evaluation” (p. 19). In
addition, Kataoka-Yahiro and Saylor (1994) defined critical thinking as a process that involves
“reflective and reasonable thinking about nursing problems without a single solution that is
focused on deciding what to believe and do” (p. 352). Finally, Oermann (1997) viewed critical
thinking as foundational for clinical judgments and reasoning about patients.
Scheffer and Rubenfeld (2000) conducted a landmark study that utilized the Delphi
technique to define and reach consensus as to the habits of mind and skills required for critical
thinking in nursing:
Critical thinking in nursing is an essential component of professional accountability and
quality nursing care. Critical thinkers in nursing exhibit these habits of the mind:
confidence, contextual perspective, creativity, flexibility, inquisitiveness, intellectual
integrity, intuition, open-mindedness, perseverance, and reflection. Critical thinkers in
nursing practice the cognitive skills of analyzing, applying standards, discriminating,
information seeking, logical reasoning, predicting and transforming knowledge. (p. 358)
The term critical thinking is often “used interchangeably” with “clinical judgment”,
“problem solving”, and decision-making (Tanner, 2006, p. 204), which causes confusion among
nurse educators, students, and nurses (Simpson & Courtney, 2002). Simpson and Courtney
(2002) pointed out that problem-solving aims to find solutions and answers to problems, whereas
critical thinking focuses on asking questions to fully understand a situation and critique all
possible solutions. Critical thinking is necessary in order to make appropriate clinical decisions
(Lipman & Deatrick, 1997). Lastly, the AACN (2008) pointed out that, in practice, clinical
judgment is the final outcome of utilizing critical thinking.
The Effects of Simulation on Critical Thinking Development
Nurse educators use various teaching methods to promote students’ critical
thinking skills. Critical thinking skills are cultivated over time through exposure to various
experiences, and using only one teaching strategy and one clinical experience may not
effectively develop students’ critical thinking skills (Oermann, 1997). The use of concept maps
(Hicks-Moore & Pastirik, 2006) and preceptorship programs (Myrick, 2002) can effectively
foster students’ critical thinking skills in clinical settings. In addition, Simpson and Courtney
(2002) pointed out the majority of the research supports that students’ critical thinking skills can
be improved using “questioning, small-group activities, role playing, debate, the use of case
studies, journals, simulations, jigsaws, problem-solving and writing assignments” (p. 96).
Human Patient Simulation
Human Patient Simulation (HPS) is widely used in nursing curriculum. Current studies
focus on the effectiveness of instructor-led simulations, especially high-fidelity patient
simulations, in promoting knowledge gain and transfer, self-confidence, satisfaction, perceptions
as to the values of high-fidelity patient simulation, realism, and stress (Weaver, 2011;
Weatherspoon & Wyatt, 2012), whereas only a few studies have investigated the use of HPS and its impact on critical thinking skills. This may be due to a lack of commonly accepted measurement tools for evaluating critical thinking (Brown & Chronister, 2009). While a review of the current
literature indicates that the impact of HPS on the development of critical thinking skills remains
inconclusive, the findings of certain studies support HPS as part of an effective teaching strategy
to improve the critical thinking skills of undergraduate nursing students (Fong, 2013; Howard, 2007; Schumacher, 2004; Sullivan-Mann, Perron, & Fellner, 2009) and of newly graduated ICU nurses (Kaddoura, 2010).
Fong (2013) conducted a two-phase, mixed-methods study to evaluate the effect of HFS
in the improvement of critical thinking skills, satisfaction, and self-confidence in learning among
90 Chinese higher diploma nursing students in Hong Kong. In phase one, students were pre-
tested using the Critical Thinking Survey. After receiving two hours of simulation tutorials and two
hours of HFS and debriefing in a laboratory, the same instrument was again administered to
measure participants’ critical thinking skills. The results demonstrated that participants scored
higher on the post-test than on the pre-test, and the difference was statistically significant.
Howard (2007) compared the effectiveness of the use of HPS and interactive case studies
on knowledge gain and critical thinking ability among 49 senior-level nursing students enrolled
in either a BSN or diploma program. The study was quasi-experimental and included two
groups with a pre-test and post-test design. The measurement tool for critical thinking abilities
and knowledge acquisition was the Health Education System Incorporated exam. The results
indicated that students who received HFS scored higher on the post-test than did the interactive
case studies group. Howard (2007) concluded that HFS is more effective than interactive case
studies in promoting knowledge acquisition and critical thinking ability.
Schumacher (2004) also compared differences in critical thinking abilities and learning
outcomes among beginner BSN students in classroom instruction, simulation, and a combination
of simulation and classroom instruction. Thirty-six participants were pre-tested using the Health Education System Incorporated exam and then randomly assigned to the three instructional
groups based on the participants’ critical thinking scores at the pre-test. All participants were
rotated through three learning activities that focused on providing care for patients with emergent
cardiovascular and respiratory problems while receiving one of the three teaching strategies.
After completing the learning activities, the Health Education System Incorporated exam was
administered as a post-test. The results showed that both simulation and the combination of
simulation and classroom instruction effectively improved participants’ critical thinking abilities
and learning outcomes (Schumacher, 2004).
Sullivan-Mann et al. (2009) conducted an experimental quantitative study to evaluate the
critical thinking skills of 53 students in an associate degree nursing program both before and
after their exposure to simulation using a HPS. The participants were randomly assigned to
either a control group or treatment group. Both groups followed the same curriculum schedule
and received two simulation scenarios, but the treatment group received three extra simulation
scenarios. The instrument used to evaluate the participants’ critical thinking skills was the
Health Sciences Reasoning Test (HSRT). The experimental group scored higher than the
control group did on the HSRT exam, which supported simulation as an effective teaching
strategy.
Goodstone et al. (2013) conducted a two-group, quasi-experimental study to investigate
critical thinking skills among first-year nursing students enrolled in an associate program using
HFS and case studies. Students in the HFS group received only weekly simulations, while the
case study group participated in weekly case studies. All participants were pre- and post-tested
with the HSRT. The results showed that both groups had improved mean HSRT scores on the
post-test; however, this improvement was not statistically significant.
In terms of licensed nurses, Kaddoura (2010) conducted a qualitative study to investigate
the perceptions of ten newly graduated BSN nurses soon to be working in an intensive care unit
with regard to the effectiveness of clinical simulation in developing critical thinking skills,
learning, and self-confidence. The study used an exploratory, qualitative, descriptive design, and data collection methods included a demographic questionnaire and semi-structured
interviews. The results showed that clinical simulation was an effective teaching strategy in
fostering critical thinking skills, self-confidence, and learning (Kaddoura, 2010).
On the other hand, some studies do not support the notion of human patient simulation as
an effective teaching strategy for critical thinking improvement. Ravert (2008) examined
differences in critical thinking skills and dispositions among three groups: a regular education
group, a group receiving a combination of regular education and five one-hour weekly small-
group discussions on assigned patients, and a group receiving a combination of regular education
with five one-hour weekly simulations using HPS. The sample consisted of 40 BSN students
and used a pre-test and post-test research design. Instruments used to evaluate critical thinking
were the California Critical Thinking Disposition Inventory and the California Critical Thinking
Skills Test. Ravert (2008) pointed out that all three groups scored higher on both tests and that
the effect size was moderate to large; however, statistically significant differences were not
detected among the three groups.
In addition, Shinnick and Woo (2013) conducted a quantitative study using a one-group,
quasi-experimental design to examine critical thinking improvement after exposure to HFS
among 154 prelicensure nursing students recruited from three schools. All participants were pre-
and post-tested with the HSRT. The results demonstrated no statistically significant
improvement in critical thinking skills.
Maneval et al. (2012) also investigated the critical thinking and clinical decision-making
skills of 26 newly graduated nurses working in adult healthcare areas after exposure to HFS
using a quasi-experimental, pre- and post-test research design. Participants in the control group
received only a regular new nurse orientation, while six human simulator scenarios were added
in addition to the regular orientation for the treatment group. The HSRT and the Clinical
Decision-Making in Nursing Scale were used as the data collection instruments. Maneval et al. (2012) found no statistically significant differences in scores on either measure
between the control and treatment groups.
Finally, Brown and Chronister (2009) conducted a quantitative study to investigate
critical thinking skills and self-confidence among 140 senior-level BSN students enrolled in an
electrocardiogram course. This study used a comparative and correlational design. The
participants in the treatment group received, on a weekly basis, both lecture and simulation,
while the control group only received didactic instruction. Elsevier’s computerized Evolve
Electrocardiogram Custom Exam was used to measure the participants’ critical thinking gains
(Brown & Chronister, 2009). The treatment group did not score higher than the control group
did on the exam. As a result, high-fidelity human simulation could not be determined to be an effective teaching method for fostering critical thinking skills.
Virtual Simulation
There are no published studies on the effects of virtual simulation using vSim for Nursing
on improving prelicensure students’ critical thinking skills. The articles reviewed in this
dissertation cover the effects of computer-based simulation and virtual simulation on improving
essential skills required for safe patient care among healthcare professionals. Current studies
related to computer-based simulation, including virtual simulation, focused on clinical judgment
skills (Howard, 2013; Weatherspoon & Wyatt, 2012), knowledge transfer (Tschannen et al.,
2012), decision-making skills (Forsberg, Georg, Ziegert, & Fors, 2011; McCallum et al., 2011),
students and faculty members’ perceptions of virtual simulation and learning (Aebersold et al.,
2012b; Jenson & Forsyth, 2012), improvement in teamwork (Kalisch et al., 2015), leadership
skills (Youngblood et al., 2008), and diagnostic reasoning skills (Wilson, Klein, & Hagler,
2014). Only Johnson and Johnson (2014) compared the effects of HFS and CD-ROM teaching
on the critical thinking skills and clinical performance of nurse anesthetists.
Weatherspoon and Wyatt (2012) conducted a quantitative study using a pre-test and post-
test design to investigate the clinical judgment skills of 23 senior BSN students in an emergency
triage setting. An online learning product called the “Emergency Room Triage Software” was
used as the experimental intervention. Participants were randomly assigned into either a control
group or an experimental group. The students in the control group received only classroom
instruction, while the experimental group received both a lecture on triage and the computer-
based simulations. The participants’ clinical judgment skills were evaluated using the triage
acuity instrument. The experimental group showed significant improvement, while the control
group only showed marginal improvement in terms of the pre-test and post-test scores.
Weatherspoon and Wyatt (2012) concluded that computer-based simulation can be an innovative
teaching strategy for developing clinical judgment skills.
Howard (2013) compared differences in clinical judgment skills among 47 associate
degree nursing students randomly assigned to either the human patient simulation group that
used high-fidelity mannequins or the computer-based simulation group. A computer-based
simulation program called the “MicroSim-Inhospital” was used in this study. Students in the
HFS group received mannequin-based simulation while the computer-based simulation group
received computer-based simulation on day one. After a four-week washout period, all
participants attended both HFS and computer-based simulation on day two for evaluation.
Students’ clinical judgment skills were evaluated via inputs from a faculty facilitator and the
computer-based simulation software. The study found no differences in clinical judgment skills
between the two groups.
Tschannen et al. (2012) examined knowledge transfer of communication skills,
problem-solving, and prioritization among 115 BSN students using a quasi-experimental design.
Students in the experimental group received traditional classroom instruction and three virtual simulations
held in Second Life. All students were evaluated in a mannequin-based simulation using the
modified Capacity to Rescue Instrument. Tschannen et al. (2012) found that students who
received virtual simulations scored better than students who only received didactic instruction,
indicating virtual simulations to be a better teaching strategy in fostering knowledge transfer than
traditional classroom instruction.
Aebersold et al. (2012a) conducted a quantitative study using a repeated-measure design
to examine the effects of virtual simulation on the technological skills of 61 senior nursing
students. The virtual simulation was conducted in Second Life and the instrument used for
evaluation was the Emergency Medicine Crisis Resource Management tool. The results showed
that overall scores improved, yet no statistical significance was detected. However, scores in two sub-categories, communication and professional behaviors, were found to be significantly improved between scenarios one and two.
Wilson et al. (2014) investigated whether 54 BSN students performed differently
following the use of computer-based simulations (built using the NurseSquared software) and
human patient simulation in a case presentation. This study used a quasi-experimental crossover
design. All participants were evaluated following the cases using a rubric designed to evaluate
the accuracy and completeness of the Situation, Background, Assessment, Recommendation
format report, which includes all essential components of diagnostic reasoning skills. The results
demonstrated that participants performed slightly better in the human patient simulation cases.
Wilson et al. (2014), thus, concluded that human patient simulation is an effective teaching
strategy for improving students’ diagnostic reasoning skills.
In terms of licensed personnel, Kalisch et al. (2015) investigated the effectiveness of
virtual simulation in promoting teamwork among 16 registered nurses and nurse aides using a
quasi-experimental, pre-test and post-test research design. The Nursing Teamwork Survey, the
Teamwork Knowledge Survey, and a questionnaire were administered three weeks before and
after exposure to virtual simulation in Second Life to evaluate the participants’ teamwork
behaviors, teamwork knowledge, and the effects of the virtual simulation experience in terms of
its overall effectiveness. The results demonstrated that overall teamwork was statistically
improved; however, teamwork knowledge was not. The effect of previous experience with
virtual simulation and computer proficiency was also found to not significantly influence
teamwork behaviors and knowledge (Kalisch et al., 2015).
Johnson and Johnson (2014) investigated the differences between HFS and CD-ROMs in
improving 49 nurse anesthetists’ critical thinking skills and combat performance. The study
utilized a pre-test and post-test, experimental, and mixed design. The research participants were
randomly assigned to a HFS group, a CD-ROM group, or a control group. The participants in
the control group received no instruction until the completion of data collection. The
participants’ critical thinking skills and combat performance were evaluated using instruments
developed by the researchers. The results demonstrated that differences between the HFS and
CD-ROM groups, as well as the CD-ROM and control groups, were not statistically significant
in terms of critical thinking skills. However, the combat performance of the participants in the
HFS group was significantly improved as compared to that of the participants in CD-ROM and
control groups.
Johnson et al. (2014) also conducted a quasi-experimental, pre-test and post-test study to
evaluate 32 advanced-practice nursing students’ knowledge, skills, and attitudes related to caring for acutely ill patients. Participants were randomly assigned to either an HFS group or a web-based simulation group.
The web-based simulation was conducted using the DxR Clinician software. A self-assessment
questionnaire and a performance checklist scored by an evaluator were used to measure the
participants’ performance and competencies. Both groups showed significant improvement after
training. However, the score of the HFS group was significantly higher than that of the web-
based simulation group. Johnson et al. (2014) concluded that the web-based simulation was an
effective teaching tool for use in addition to traditional mannequin simulation.
Finally, Youngblood et al. (2008) compared the team leadership skills of 30 medical
students and recent medical graduates after exposure to either HFS or virtual simulation using an
experimental, pre-test and post-test design. A virtual emergency department was created using Adobe Atmosphere software. The performance of all subjects was evaluated by
three raters using the Emergency Medicine Crisis Resource Management scale. Youngblood et
al. (2008) found that subjects’ performance in both groups did not differ, suggesting that
virtual simulation may be as effective as HFS in learning leadership skills.
In summary, current research on the effectiveness of HFS in improving the critical
thinking skills of prelicensure nursing students and licensed nurses varies. Research on the
differences in critical thinking skills among students who experience computer-based simulation,
virtual simulation and traditional HFS is also lacking. Due to increased emphasis on innovations
in education and the importance of discovering cost-effective instructional methods, it is
imperative to investigate the effects of virtual simulation in fostering critical thinking skills as
compared to traditional face-to-face simulation.
Analysis of Studies on Simulations and Critical Thinking
Based on a review of research on the effects of human patient simulation and virtual
simulation on critical thinking and other related aspects, all studies were conducted in U.S.
nursing schools, except those of McCallum et al. (2011), Forsberg et al. (2011) and Fong (2013),
which took place in Scotland, Sweden, and Hong Kong, respectively. All studies used
convenience sampling and included prelicensure undergraduate nursing students, advanced-
practice graduate nursing students, medical students, and licensed nurses and nurse aides. The
sample sizes of the quantitative studies ranged from 15 (Aebersold et al., 2012a) to 154 (Shinnick & Woo, 2013), whereas, in qualitative studies, sample sizes ranged from five (McCallum et al.,
2011) to ten (Kaddoura, 2010). Small sample sizes and convenience sampling are identified as
limitations in the majority of the reviewed research (Aebersold et al., 2012b; Brown &
Chronister, 2009; Goodstone et al., 2013; Howard, 2007; Howard, 2013; Johnson et al., 2014; Kalisch et al., 2015; Maneval et al., 2012; Ravert, 2008; Tschannen et al., 2012; Weatherspoon &
Wyatt, 2012).
The majority of the reviewed quantitative studies used quasi-experimental, pre-test and
post-test designs save for those of Brown and Chronister (2009), Sullivan-Mann et al. (2009), and
Aebersold et al. (2012a), which used a comparative-correlational design, experimental design,
and repeated-measure design, respectively. The instruments used to evaluate participants’
critical thinking skills varied throughout the reviewed articles. According to Fesler-Birch
(2005), the tests most commonly used to evaluate these skills were the Watson-Glaser Critical
Thinking Appraisal, the California Critical Thinking Skills Test, and the California Critical
Thinking Disposition Inventory; however, these instruments are not specific to nursing. Based
on the reviewed studies, the majority of the researchers reported the instruments were reliable.
The HSRT was used in four studies (Goodstone et al., 2013; Maneval et al., 2012;
Shinnick & Woo, 2013; Sullivan-Mann et al., 2009). The HSRT is a multiple-choice exam that
consists of 33 questions and is specifically designed to quantitatively assess health care
professionals’ critical thinking skills (Insight Assessment, 2013; Maneval et al., 2012; Sullivan-
Mann et al., 2009). The test can be administered in 50 minutes and measures test-takers’
reasoning, analysis, inference, evaluation, induction, and deduction skills (Insight Assessment,
2013). Both the reliability and validity of the tool are well established (Shinnick & Woo, 2013). Scores on the HSRT reflect only individuals’ reflective thinking and reasoned judgments, not memorized knowledge (Maneval et al., 2012; Shinnick &
Woo, 2013).
The California Critical Thinking Disposition Inventory and California Critical Thinking
Skills Test were used in Ravert’s (2008) study. The California Critical Thinking Skills Test is a
34-item multiple-choice exam that test-takers can usually complete in 40 to 45 minutes (Insight
Assessment, 2013; Ravert, 2008). The measures of the test include overall reasoning, analysis,
inference, evaluation, deduction, and induction skills (Insight Assessment, 2013). According to
Ravert (2008), the reliability of this test is established in terms of the Kuder-Richardson Formula 20 (KR20), with values ranging from 0.68 to 0.8. In addition, the California Critical Thinking Disposition
Inventory, which consists of 75 6-point Likert scale questions, seeks to measure individual
dispositional aspects of critical thinking and can, usually, be completed in 20 to 30 minutes
(Insight Assessment, 2013). The Cronbach’s alpha for the total scale of California Critical
Thinking Disposition Inventory was 0.91 (Ravert, 2008).
Two studies used the Health Education System Incorporated exam to evaluate critical
thinking skills (Howard, 2007; Schumacher, 2004). Howard (2007) reported the reliability of the
exam as having a KR20 of 0.93 for the pre-test exam and 0.94 for the post-test exam. The
Point Biserial Correlation Coefficient was 0.14 for pre-test and 0.16 for post-test. In addition,
Brown and Chronister (2009) used Elsevier’s computerized Evolve Electrocardiogram Custom
Exam as the instrument in their study. The exam included 30 multiple-choice questions and
was selected from Elsevier’s test bank by expert nurse educators and then approved by faculty
(Brown & Chronister, 2009). Brown and Chronister reported a KR20 of 0.72 for the exam,
with the Point Biserial Correlation Coefficient for each test item being 0.22.
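Because KR20, Cronbach’s alpha, and the point biserial correlation coefficient recur throughout this section, their standard definitions are restated here as a reader’s aid; these are general psychometric formulas and are not drawn from the reviewed studies. For a test with $k$ dichotomously scored items, where $p_i$ is the proportion of examinees answering item $i$ correctly, $q_i = 1 - p_i$, $\sigma_i^2$ is the variance of item $i$, and $\sigma_X^2$ is the variance of the total scores,
\[ \mathrm{KR20} = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i q_i}{\sigma_X^2}\right), \qquad \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^2}{\sigma_X^2}\right). \]
Cronbach’s alpha generalizes KR20 to items that are not dichotomously scored, which is why the reviewed studies report KR20 for multiple-choice exams and alpha for Likert-type scales. The point biserial correlation coefficient reported alongside KR20 is the Pearson correlation between a single dichotomous item score and the total test score and serves as an index of item discrimination.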
The critical thinking instrument developed by Johnson and Johnson (2014) included 45
multiple-choice questions. Content validity was established by having a panel of experts review
the tool, and 100% content validity was reported for the 35 questions (Johnson & Johnson,
2014). Reliability was also supported: the Pearson’s r was 0.89, the KR20 was 0.85, and the mean discrimination index across questions was 0.37 (Johnson &
Johnson, 2014).
Fong (2013) used a Critical Thinking Survey, which was a subscale of the Motivated
Strategies for Learning Questionnaire, to measure the development of participants’ critical
thinking. This survey is a 7-point Likert-type scale with five questions. According to Fong
(2013), the survey’s Cronbach’s alpha was .8 for the pre-test and .87 for the post-test,
establishing internal consistency reliability.
The instruments used to measure clinical judgment skills in the reviewed articles were
specific to particular clinical settings, such as emergency rooms. Weatherspoon and Wyatt
(2012) used the triage acuity instrument in their study and reported that validity and reliability were established by two expert nurse educators with test and measurement expertise; no quantitative estimate of the tool’s reliability was reported. In addition, Howard (2013) mentioned only that the study was designed and reviewed by nursing and medical experts and did not provide detailed information about the reliability of the research instruments.
With regard to a tool for evaluating knowledge transfer, Tschannen et al. (2012)
established the reliability of a modified 17-item Capacity to Rescue Instrument (CRI), reporting
that the “correlations between the CRI and the simulation patient outcome were 0.78” (p. 20).
Inter-rater reliability for this instrument was established, with agreement on each independently scored scenario reaching 90% (Tschannen et al., 2012).
The Emergency Medicine Crisis Resource Management scale, which has 11 items and
evaluates participant performance based on a five-point Likert scale, was used in two studies
(Aebersold et al., 2012a; Youngblood et al., 2008). Aebersold et al. (2012a) modified the
original scale and reported a Cronbach’s alpha for the revised version of 0.9. Youngblood et al. (2008) reported a Cronbach’s alpha for the scale of 0.96 and an inter-rater reliability of 0.71.
Wilson et al. (2014) used a rubric developed to evaluate the accuracy and completeness
of the Situation, Background, Assessment, Recommendation format, which had a reported Cronbach’s alpha of 0.77. The Nursing Teamwork Survey is a 33-item questionnaire using a 5-point Likert scale (Kalisch et al., 2005). The Cronbach’s alpha was 0.94, test-retest reliability was r = .92,
and the concurrent, convergent, and contrast validities were all well-established; however, the
reliability and validity of the Teamwork Knowledge Survey were not reported (Kalisch et al., 2005). Finally, Johnson et al. (2014) reported only an inter-rater reliability of 0.86, with no description of the reliability and validity of the tool used.
The Effects of Simulation on Satisfaction and Self-Confidence in Learning
Research has focused on examining and comparing the effectiveness of HFS and low-
fidelity simulation on nursing students’ perceptions of satisfaction and self-confidence in
learning; however, findings are inconsistent (Ma, 2008; Norman, 2012). There are no studies
directly comparing satisfaction and self-confidence in learning between undergraduate nursing
students who participate in face-to-face simulations and those who receive virtual simulation
training. Research on their perceptions regarding virtual simulations is also limited. The
research included in this review mainly concerns HFS and students’ perceptions about learning.
Several studies support the use of simulation in improving students’ satisfaction and self-
confidence in learning. Ma (2008) conducted a descriptive study to examine 50 senior BSN
students’ satisfaction and self-confidence in learning after a simulated mock code experience.
This study used a convenience sampling method and the instrument used was the NLN’s
satisfaction and self-confidence in learning survey. The results demonstrated that students were
satisfied with this simulated experience and felt more confident in dealing with code situations.
Blum, Borglund and Parcells (2010) examined the effects of simulation on students’ self-
confidence level and clinical performance by conducting a quasi-experimental, quantitative
study. A total of 59 entry-level BSN students who enrolled in three different health assessment
and skills courses participated in this study. The participants in the control group received
traditional training involving the use of task trainers and student volunteers, while the students in
the experimental group experienced simulations using Laerdal’s SimMan. Selected items from
Laster’s clinical competence rubric were administered to measure participants’ clinical
competence and self-confidence at the midterm and during final week. The results indicated that
participants in both groups reported improvement in self-confidence overtime; however, the
increases in self-confidence at the midterm and the final assessment did not differ between
groups.
Abdo and Ravert’s (2009) quantitative study examined 48 BSN students’ perceptions of
simulations using HPS. All participants attended five one-hour classes, and simulation scenarios
were related to the context of medical-surgical nursing. A 19-item, 4-point Likert scale survey
was used to measure students’ perceptions of simulation. The results showed that students
reported the simulation reflected real-life scenarios, evaluated their clinical decision-making
skills, helped them to be better prepared for dealing with real-life situations, and gave them more
confidence in providing care to their patients in real clinical settings (Abdo & Ravert, 2009).
Bambini, Washburn, and Perkins (2009) conducted an integrated, quasi-experimental, and
repeated-measure study to investigate the effects of simulations on increasing nursing students’
self-efficacy in the context of maternal and infant nursing. This study utilized a convenience
sampling method, and the final sample size consisted of 112 first-semester BSN students. A
researcher-developed 6-item, 10-point scale pre-test survey was used to measure students’ self-efficacy before simulation training. Students were then asked to attend a three-hour simulation
that had eight stations presenting mannequins of various levels of fidelity for students to practice
a variety of skills. A researcher-developed post-test survey that included three extra open-ended
questions was administered after the simulated experience. The results showed that students
reported increased confidence in performing certain skills, including checking vital signs,
assessing breasts, fundus, and lochia, and providing education to patients. The participants also
reported increased confidence in patient interactions, communication, and clinical judgment
(Bambini et al., 2009).
The aforementioned study conducted by Fong (2013) used the NLN survey to measure
participants’ satisfaction and self-confidence in learning in phase I. In phase II, focus group
interviews were conducted with 24 purposely selected students to further examine factors that
may influence their satisfaction, self-confidence, and development of critical thinking. The
results of phase I demonstrated that students’ self-confidence increased and that they were
satisfied with their simulation experiences. “A mimic clinical environment”, “holistic care
experience”, “information and reflective thinking”, and “dosage of the HFS” were the four main
themes identified in phase II of the study (Fong, 2013, pp. 106-107). Fong concluded that HFS
is an effective teaching strategy that can be used to improve nursing students’ critical thinking
skills, satisfaction, and self-confidence in learning.
Sinclair and Ferguson (2009) conducted a mixed-methods study to investigate the effects
of simulation on self-efficacy. This study included 174 second-year BSN students on two
campuses. One hundred students at one campus (control group) had two-hour lectures on five
topics, including adult health, mental health, and child health. Seventy-four students at the other
campus (experimental group) received one-hour lectures and one-hour simulations using
medium-fidelity mannequins and role play in the school’s laboratory. Participants’ self-efficacy
was measured with a pre-test and a post-test using the modified Baccalaureate Nursing Student
Teaching-Learning Self-Efficacy questionnaire, which has 16 Likert scale-type questions. All
students were also asked to complete a researcher-developed satisfaction survey including Likert
scale-type question at post-test, while the experimental group of students was also asked to
complete a reflective review with open-ended questions. The results showed that students in the
experimental group had significantly higher self-efficacy than did those in the control group after
all topics, except for mental health. Only 12 students in the experimental group finished the
reflective review, and major themes identified were “peer learning opportunities, reinforcement
of knowledge, and improved confidence” (Sinclair & Ferguson, 2009, p. 7).
Another quantitative, descriptive, and correlational study conducted by Smith and Roehrs
(2009) also examined HFS and BSN students’ satisfaction and self-confidence in learning. Sixty-
eight junior-level BSN students registered in medical/surgical classes participated in the study.
All participants were asked to complete a simulation scenario related to chronic obstructive
pulmonary disease, and the instrument used to measure their satisfaction and self-confidence in
learning was the NLN survey. The results indicated that students were satisfied with the HFS and
felt more confident after the simulated experience. Clearly defined learning objectives and
scenarios that are appropriately challenging to students were correlated with satisfaction and self-
confidence.
Schoening, Sittner, and Todd (2006) investigated students’ perceptions of HFS. This pilot
study utilized a non-experimental design and consisted of a convenience sample of 60 BSN
junior year students. Participants were required to complete six-hour simulations with an
emphasis on preterm labor. Students were requested to complete a researcher-developed 10-item,
4-point Likert scale evaluation to measure their perceptions about the simulated clinical
experience. Students’ weekly reflective journals were also used for data analysis. The results
showed that students viewed simulation as an effective tool to increase their confidence level
because they had hands-on practice and developed communication, teamwork, and decision-
making skills in a safe environment.
Jeffries and Rizzolo (2006) conducted a national, multi-site, and multi-method study,
sponsored by the NLN, examining the effects of different types of simulation on nursing
students’ knowledge, satisfaction, self-confidence, and clinical judgment. This study consisted
of 403 nursing students who were taking their first medical/surgical nursing class. The
participants were randomly assigned to three groups: a paper-and-pencil case study group, a
static mannequin group, and an HFS group. All participants received a 20-minute simulation
based on their assigned method, followed by a 20-minute reflective activity. A five-item
satisfaction survey and an eight-item self-confidence in learning questionnaire, which has
become known as the NLN’s satisfaction and self-confidence in learning survey, were designed
by the researchers and used in this study. The results showed that the confidence levels of
participants who received HFS training or used static mannequins were significantly higher than
those of students who experienced only the paper-and-pencil case study. The students in the
HFS group also reported greater satisfaction with their learning.
Hall (2013) conducted a quantitative study using a Solomon four group design to
evaluate the effects of HFS on 43 first-semester senior-level BSN students’ knowledge gain,
satisfaction, and self-confidence in learning. A convenience sampling method was also used.
Students in the control group received only traditional lectures, while those in the HFS group
experienced a formatted simulation scenario using an HPS. The 30-item Health Education
System Incorporated exam was used to measure improvement in respiratory knowledge and the
NLN survey was administered to evaluate participants’ satisfaction and self-confidence in
learning. The results demonstrated that students were satisfied with simulation, and their self-
confidence also improved, but the knowledge gain between groups did not differ.
In terms of associate program nursing students, Butler, Veltre, and Brady (2009) also
examined whether there was a difference between perceptions of students who had low-fidelity
simulation and those who had HFS. This study used a two-group experimental research design
and a convenience sampling method. Thirty-one associate program students participated in this
study. All participants received a 20-minute pediatric simulation scenario that focused on fluids
and electrolytes, followed by a 10-minute debriefing, but the HFS group used a pediatric HPS
and the low-fidelity simulation group used a static mannequin. Students’ perceptions were
evaluated by using the student versions of NLN simulation design scale, educational practice
questionnaire, and student satisfaction and self-confidence in learning survey. The results,
specific to satisfaction and self-confidence in learning, indicated that students in the HFS group
had significantly higher satisfaction and self-confidence than did the students who received low-
fidelity simulation (Butler et al., 2009). However, Butler et al. argued that students’ satisfaction
and self-confidence may be influenced by allowing them to self-select the roles they would like
to perform during simulation.
Kuznar (2007) examined the effects of high-fidelity simulation on 37 associate program
students’ perceptions of learning using a descriptive study design. The participants all completed
HFS in three different courses. The researcher developed a 21-item, 5-point Likert-type scale to
measure the perceptions of HFS. The study found that students were highly satisfied with HFS.
On the other hand, some research found the use of HFS did not significantly improve
students’ satisfaction and confidence when compared with other instructional methods.
Tosterud, Hedelin, and Hall-Lord (2013) conducted a quantitative, evaluative, and comparative
study to compare the effects of simulation, both low- and high-fidelity, on nursing students’
satisfaction and self-confidence in learning. This study consisted of 86 Norwegian BSN students
in different academic years who were randomly assigned to a HFS group, a static mannequin
group, or a paper-and-pencil case study group. The instruments used in this study were the NLN
satisfaction and self-confidence in learning survey, an educational practices questionnaire, and a
simulation design scale. The results demonstrated all participants were satisfied with their
simulation experiences regardless of the level of fidelity; however, participants who received the
paper-and-pencil case study reported the most satisfaction among the three groups and the
differences were statistically significant. Tosterud et al. (2013) argued that this finding may be explained by the paper-and-pencil case study being the teaching strategy most familiar to students in the nursing program.
Hoadley (2009) conducted a quantitative, experimental, two-group study to investigate
whether simulations of different fidelity levels affect knowledge gain and resuscitation skills.
This study consisted of 53 licensed healthcare professionals who attended a three-day Advanced
Cardiac Life Support class. Participants in the control group attended a traditional instructor-led,
low-fidelity simulation, while those in the experimental group experienced HFS training. The
instruments used included two American Heart Association tests and the NLN satisfaction and
self-confidence survey, as well as a simulation design questionnaire. The results demonstrated
no detectable differences in knowledge gain or improvement in resuscitation skills between
groups. In addition, all participants expressed satisfaction with their respective simulated
experiences; however, there was no difference in satisfaction and self-confidence in learning
between groups. Hoadley pointed out that the same debriefing was provided to both groups;
therefore, the learning may be facilitated by the debriefing rather than by the fidelity level of the
simulation.
Tiffen, Corbridge, Shen, and Robinson (2011) examined the effects of simulation on the
knowledge, confidence, and satisfaction in learning among 28 advanced-practice nursing
students enrolled in an advanced physical assessment class. This study used a convenience
sampling method and participants were randomly assigned to either the control group or the
experimental group. All participants attended the didactic course conducted using the usual
teaching and learning strategies, while the students in the experimental group also received a
one-hour simulation using an intermediate-fidelity simulator. All participants’ level of
confidence was measured using a 6-item, 5-point Likert-type scale, and knowledge gain was
evaluated by a researcher-developed survey consisting of ten questions. Students in the
experimental group were also asked to complete a 4-item, 5-point Likert-type scale to evaluate
their satisfaction with the simulation experience. The results demonstrated that the students in
the experimental group were satisfied with the simulation experience and had significantly more
heart and lung assessment knowledge gain than did the participants in the control group;
however, the confidence levels did not differ between groups.
Finally, Brannan, White, and Bezanson (2008) conducted a study to examine differences
in students’ cognitive skills and confidence levels between traditional classroom instruction and
HFS. This study used a prospective, quasi-experimental, pre-test and post-test design and a total
of 107 junior-level BSN students participated in the study. The 53 students who enrolled in an
adult health class in fall semester received a two-hour traditional lecture on acute myocardial
infarction, while the other 54 students who took the class in spring semester received a two-hour
simulation on the same topic. A researcher-adapted confidence-level tool including a 34-item, 4-
point Likert-type scale was used to evaluate participants’ confidence levels between groups. The
results showed that students in the high-fidelity group had significantly more knowledge gain
than those who received traditional classroom instruction; however, there was no difference in
confidence level detected between groups.
Students and faculty members’ perceptions toward virtual simulation have also been
investigated in both quantitative and qualitative designs. Forsberg et al. (2011) used
questionnaires to assess 77 participants’ opinions on the use of web-based virtual patients to evaluate their clinical reasoning skills. The results indicated that the use of web-based virtual
patients as an assessment tool was widely accepted. Aebersold et al. (2012b) used a six-question
survey to evaluate 15 students’ opinions on whether Second Life is a feasible method to conduct
simulations, and found their comments to be positive overall. Some students reported that the
virtual simulation experience was as good as or better than HFS (Aebersold et al., 2012b).
Jenson and Forsyth (2012) used an evaluation survey to obtain feedback from eight faculty
members as to whether virtual simulation, along with a haptic arm device, could improve student
knowledge regarding intravenous catheter insertion. All faculty members either strongly agreed
or agreed that virtual simulation was effective in improving students’ knowledge. Furthermore,
McCallum et al. (2011) conducted a qualitative study to investigate five students’ experiences
with regard to learning decision-making skills using Second Life. The students in this study
reported that the virtual simulation improved their learning and made them feel like they were
making decisions in a real setting (McCallum et al., 2011).
Analysis of Studies on Simulations and Satisfaction and Self-Confidence in Learning
The review of research related to simulations and satisfaction and self-confidence in
learning revealed that all studies took place in the United States, except for those of Fong (2013)
and Tosterud et al. (2013), which were conducted in Hong Kong and Norway, respectively.
Most studies used convenience sampling and consisted of BSN students. However, two studies
involved associate program students (Butler et al., 2009; Kuznar, 2007), one study included advanced-practice students (Tiffen et al., 2011), and one study recruited licensed healthcare professionals (Hoadley, 2009). The sample size of the quantitative studies ranged from 28 (Tiffen et al., 2011) to 403 (Jeffries & Rizzolo, 2006). The research designs utilized in the reviewed studies included quasi-experimental design (Blum et al., 2010; Bambini et al., 2009; Brannan et al., 2008; Hall, 2013), experimental design (Butler et al., 2009; Hoadley, 2009; Tiffen et al., 2011), mixed-methods design (Jeffries & Rizzolo, 2006; Kuznar, 2007; Sinclair & Ferguson, 2009), descriptive and correlational design (Smith & Roehrs, 2009), and evaluative and comparative design (Tosterud et al., 2013). Insufficient generalizability due to a small sample size and/or convenience sampling was identified as the most common limitation in the studies (Abdo & Ravert, 2009; Blum et al., 2010; Butler et al., 2009; Fong, 2013; Hall, 2013; Hoadley, 2009; Kuznar, 2007; Ma, 2008; Schoening et al., 2006; Sinclair & Ferguson, 2009; Smith & Roehrs, 2009; Tiffen et al., 2011; Tosterud et al., 2013). Sinclair and Ferguson (2009) also
reported that low response rate was another limitation in their study.
The instrument most commonly used by researchers was the NLN satisfaction and self-
confidence in learning survey (Butler et al., 2009; Fong, 2013; Hall, 2013; Hoadley, 2009;
Jeffries & Rizzolo, 2006; Ma, 2008; Smith & Roehrs, 2009; Tosterud et al., 2013). This
instrument was designed by Jeffries and Rizzolo to be used in their study. According to Jeffries
and Rizzolo (2006), the Cronbach’s alphas were .87 for self-confidence in learning and .94 for
satisfaction. The Cronbach’s alpha of this instrument was reported by other researchers as
greater than .80, except for the Hall (2013) study, which reported an alpha of .72. However,
Tosterud et al. (2013) reported that, even though the value of Cronbach’s alpha was acceptable,
the translation of this instrument to Norwegian may still affect the study’s results due to
ambiguous language and meaning.
The researcher-developed or modified/adapted instruments that were used to measure
participants’ satisfaction and self-confidence in learning in other studies were all Likert-type
scales, and the Cronbach’s alpha of the developed instruments were reported to be greater than
.80, except in the studies of Kuznar (2007), Schoening et al. (2006), and Sinclair and Ferguson
(2009). Nevertheless, Abdo and Ravert (2009) reported that, even though a coefficient alpha of
.86 was determined, the small sample size may affect the results of psychometric testing, and the
strength of the instrument should be further examined. Content validity of the instruments
developed or modified by researchers was also reported (Abdo & Ravert, 2009; Bambini et al.,
2009; Blum et al., 2010; Brannan et al., 2008; Kuznar, 2007; Sinclair & Ferguson, 2009;
Schoening et al., 2006; Tiffen et al., 2011).
In summary, the review of literature found that students viewed simulations as an
effective teaching strategy that should be used to facilitate their learning, regardless of the level
of fidelity utilized. The effects of HFS on students’ satisfaction and self-confidence in learning
were compared with other teaching methods, such as classroom instruction, case study, and static
mannequins. No study was found that directly compared virtual simulation and HFS in terms of
students’ satisfaction and self-confidence in learning. The results of the literature further
validate the need to conduct this dissertation study.
Theoretical Framework
Hands-on experience is an important element of the nursing curriculum because it helps
students develop a better understanding of the information they acquire in lecture classes.
Simulations provide different learning experiences students may not obtain during practicums.
Kolb’s experiential learning theory explains the relationships between experience and learning
and was used to guide this study. The NLN/Jeffries simulation model was also used as a
theoretical framework to highlight the characteristics of simulation design that promote critical
thinking skills and other desired outcomes.
Kolb’s Experiential Learning Theory
According to Kolb (2014), learning is defined as “the process whereby knowledge is
created through the transformation of experience” (p. 49). In this theory, a learning cycle has
four stages, and each stage builds the foundation for the next stage (Kolb, 2014). Concrete
experience is a “feeling dimension” in which learners are fully and unbiasedly engaged in a
learning experience (Evans, Forney, Guido, Patton, & Renn, 2010, p. 140). Reflective
observation is a “watching dimension” in which learners reflect on what they observed via
various lenses (Evans et al., 2010). Based on observations and reflections, learners may form
theories, considered a thinking dimension and defined by Kolb as abstract conceptualization
(Evans et al., 2010, p. 140). Finally, learners who are in the active experimentation stage, or a
“doing dimension”, apply the theories in action to make decisions and solve problems (Evans et
al., 2010, p. 140).
The organization of a typical simulation mirrors the four-stage learning cycle of Kolb’s
experiential learning theory. Simulations provide learning opportunities for students through
exposure to a clinical situation they may not yet have encountered in a real clinical setting.
During simulations, students observe what happens, the interventions provided, and the
consequences of these interventions. Students also reflect on their simulation experiences during
a debriefing. Students may then be able to think critically, formulate appropriate interventions
for similar situations based on their observations and reflections, and apply these in a real setting
to solve problems and ensure patient safety and optimal clinical outcomes.
The NLN/Jeffries Simulation Model
The NLN/Jeffries simulation model is a well-known simulation framework (Figure 1).
Jeffries (2005) indicated that the NLN built partnerships with the Laerdal Company and
organized a national group to develop a simulation model to guide the design, implementation,
and evaluation of simulation. The NLN/Jeffries simulation model was created on the basis of the
findings of theoretical and empirical studies and was proposed to be suitable for all kinds of
simulations incorporated in current nursing curriculum (Jeffries, 2005). The simulation model
has five main components: the facilitator, the participant, educational practices, design
characteristics, and the outcomes. Each component has its own related variables (Jeffries, 2012).
The outcomes of the simulation are affected by the levels of educational practices utilized in
designing and implementing simulations.
Unlike traditional classroom teaching, simulations are student-centered, and teachers who
teach simulations act as facilitators to provide assistance and guidance (Jeffries, 2012). Students
are held responsible for their own learning at a certain level and are expected to be self-directed
and motivated (Jeffries, 2012). Competition is not recommended because it can impede
students’ learning through anxiety and stress (Jeffries, 2005). In addition, “active learning,
feedback, student/faculty interaction, collaboration, high expectations, collaborative learning,
and time on task” are the principles included in the educational practices (Jeffries, 2012, p. 37).
Effective learning and high satisfaction will result if students actively participate in their learning,
students and faculty provide informed feedback to each other, students interact with the faculty
to discuss their goals and the content of simulation, students work together in problem-solving
and decision-making, the faculty holds students to high expectations, the faculty establishes
clear and reasonable deadlines for assignments, and faculty uses teaching strategies that can
accommodate and meet the needs of its diverse student population (Jeffries, 2012). In terms of
the simulation design, Jeffries (2012) emphasized the importance of well-written learning
objectives that correspond to students’ current knowledge and experience, providing scenarios
that truly reflect real clinical settings and match the participants’ knowledge and skill levels,
giving cues to participants as needed, and conducting a debriefing after each simulation scenario.
Finally, well-designed simulations are expected to increase students’ knowledge, skills,
satisfaction of learning, self-confidence, and critical thinking (Jeffries, 2012).
Figure 1. The NLN/Jeffries Simulation Framework (Jeffries & Rogers, 2012, p. 37).
Chapter Summary
Critical thinking is an important skill that nurses must acquire to effectively deal with
patients’ complex conditions and the ever-changing healthcare environment. Nurse educators
are held accountable for preparing nurses who can think critically and can provide safe patient
care. Practicum is an important element in nursing education and is used to help students apply
the knowledge they learned in class to real clinical settings. However, increased difficulties in
finding clinical placements and faculty shortages have led to the emergence of simulation. Simulations
allow students to make clinical decisions and see the consequences of the interventions they
provide without causing real harm to patients. Because of the wide acceptance of web-based
instruction, virtual simulation may be a more cost-effective teaching strategy.
Current research strongly emphasizes the effects of human patient simulation using high-
fidelity mannequins on nursing students’ knowledge gain and transfer, satisfaction, self-
confidence, critical thinking skills, and perceptions of the values of this type of simulation. The
results regarding the use of human patient simulation to improve students' critical
thinking skills remain inconclusive and may not justify the high cost of HFS. Moreover, no
research has directly examined the effects of virtual simulation on students' critical thinking skills
through quantitative measurement. Finally, small sample sizes and convenience sampling in
studies may lead to results that are not generalizable to the entire population. Additionally, it is
necessary to explore the effectiveness of virtual simulation in fostering critical thinking skills
and to compare the findings with HFS in order to determine the most appropriate instructional
method.
CHAPTER THREE: METHODOLOGY
Virtual simulation has gained popularity recently due to advances in technology, cost
savings for nursing schools, and a new generation of tech-savvy students. Second Life, a
multi-user virtual environment, used to be the most commonly used and studied platform for
conducting virtual simulation; however, technical difficulties and issues with managing the
program for academic use frustrated educators. As a result, alternative virtual simulation
software was created to meet students' needs and to improve their learning experiences. Wolters
Kluwer Health and Laerdal Medical co-developed a virtual simulation program called vSim
for Nursing, which was released in February 2014. The vSim for Nursing Medical-Surgical
program includes ten NLN scenarios commonly used in traditional face-to-face simulations. It is
worth investigating whether this new virtual simulation product can help students think more
critically and whether students who receive simulations virtually differ from those trained in
traditional laboratories in terms of critical thinking skills.
The purpose of this quantitative study was to compare the effects of virtual simulation in
fostering undergraduate nursing students’ critical thinking skills, satisfaction, and self-
confidence in learning to traditional face-to-face simulations. The research questions of this
dissertation study are as follows:
1. Is there a significant difference in the critical thinking skills of students who are trained
in face-to-face simulation and those trained in virtual simulation?
2. Is there a significant difference in the critical thinking skills of students before and after
participating in virtual simulation?
3. Is there a significant difference in the critical thinking skills of students before and after
receiving traditional face-to-face simulation?
4. Is there a significant difference in the satisfaction with learning of students who are
trained via face-to-face simulation and those trained via virtual simulation?
5. Is there a significant difference in the self-confidence in learning of students who are
trained via face-to-face simulation and those trained in virtual simulation?
In order to provide objective data and quantitatively measure and examine the differences
in critical thinking skills, satisfaction and self-confidence in learning of BSN students before and
after exposure to either virtual simulations or face-to-face simulations, a quantitative,
experimental, two-group, pre-test/post-test research study was conducted. This study
was approved by both the University of Southern California and the selected university’s
institutional review boards.
Sample and Population
The sampling method used in this study is convenience sampling. The selected
institution is a private, non-profit university located in the western region of the United States.
The BSN program of this selected institution consists of five academic semesters, and all
students are required to complete 120 credits to earn a BSN degree. Students in the first semester
of the BSN program are called level one students, and students in their last semester of the
program are called level five students. The nursing department recently trialed the vSim for
Nursing Medical-Surgical program as a substitute for 24 clinical hours among its level five
students due to limited clinical placements for them. Therefore, it is important to determine the
benefits of using this program.
In this study, students who were in the fourth semester of the BSN program (level four
students) were chosen because they were all required to take a one-credit intervention lab course
(NUR 3957) that includes weekly simulation labs. The NUR 3957 course was offered on
Monday, Tuesday, Wednesday, and Friday. Each class lasted three hours, which consisted of 1.5
hours for a skills lab and 1.5 hours for a simulation lab. Each class could have a maximum of 16
students, and the students were equally divided into two groups. The first group of students
completed the face-to-face simulation lab, while the second group of students completed the
skills lab. After the first 1.5 hours of class time, both groups switched to the other lab. The skills
lab and simulation lab were taught by different instructors because both labs were offered
simultaneously. In addition to NUR 3957, the level four students were required to concurrently
take a three-credit Adult Health II lecture (NUR 3963), a four-credit Adult Health II practicum
that requires a total of 165 clinical hours (NUR 3965), a two-credit online leadership class (NUR
3900), and a three-credit nursing research proposal development class (NUR 4700) during the
same semester.
The recruitment criteria for this study included being in the fourth semester of the nursing
program and being currently enrolled in the one-credit NUR 3957 intervention lab in the fall 2015
semester. During this semester, there were a total of 57 level four students who met the
recruiting criteria and were invited to participate in the study. A total of 52 level four students
agreed to participate in the study and took the pre-test; however, the final sample size consisted
of only 49 students because three students did not complete the post-test.
Instrumentation
A five-item questionnaire was used to collect the participants’ demographic information,
including their age, gender, cumulative nursing GPA, as well as the types and years of their
healthcare-related experience (Appendix B). In addition, the HSRT was used to evaluate the
participants’ critical thinking skills and addressed the first, second and third research questions.
The HSRT is sold by Insight Assessment and is specifically designed to evaluate critical thinking
in terms of the inductive reasoning, deductive reasoning, inference skills, analytical reasoning,
and evaluative reasoning skills of healthcare professionals. The HSRT is widely used in nursing
and other healthcare-related studies. It can be administered either as a paper-and-pencil test or as
an online test. In this study, the online HSRT was selected due to instantly available results and
the ability to include five customized demographic questions in it. The HSRT consists of 33
multiple-choice questions, and test-takers can have up to 50 minutes to complete the test. An
overall critical thinking score and subscale scores are provided at the end of the test. According
to the 2016 HSRT manual, an overall score of 26 to 33 indicates that test-takers have superior
critical thinking skills that may allow for more advanced learning and leadership (Insight
Assessment, 2016). An overall score of 21 to 25 indicates strong critical thinking skills that may
be needed for academic success and career development (Insight Assessment, 2016).
Furthermore, an overall score of 15 to 20 indicates moderate critical thinking skills and "the
potential for skills-related challenges when engaged in reflective problem-solving and reflective
decision-making associated with learning or employee development" (Insight Assessment, 2016,
p. 49). Finally, an overall score of 14 or below may reflect a lack of test-taker effort, fatigue, or
difficulties with reading and comprehension.
The validity and reliability of the HSRT is well established (Allaire, 2015; Shinnick &
Woo, 2012). The content and construct validity of the HSRT were established because it was
developed based on the consensus definition of critical thinking in an American Philosophical
Association Delphi study. The HSRT also presents a brief scenario in each question that reflects
an adequate range of difficulty to challenge test-takers’ reasoning skills and promote accurate
scoring (Allaire, 2015; Insight Assessment, 2016). Construct validity was further established by
the improved scores of students after taking a critical thinking course (Allaire, 2015; Insight
Assessment, 2016). Finally, studies also show the ability of the HSRT to predict criterion
behavior; therefore, criterion (predictive) validity was also established (Insight Assessment,
2016). With respect to the reliability of the HSRT, a KR-20 ranging from .77 to .83 demonstrated
internal consistency (Insight Assessment, 2013; Maneval et al., 2012). Test-retest reliability was
also established with a coefficient equal to or greater than .80 when the post-test was
administered two weeks after the pre-test (Insight Assessment, 2016).
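For reference, the KR-20 coefficient cited above is conventionally computed with the standard Kuder-Richardson formula shown below; this is a general psychometric formula, not one taken from the HSRT manual:

\mathrm{KR\text{-}20} = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i\,(1 - p_i)}{\sigma_X^{2}}\right)

where k is the number of dichotomously scored items, p_i is the proportion of test-takers answering item i correctly, and \sigma_X^{2} is the variance of the total test scores.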
In order to measure and compare the participants’ satisfaction and self-confidence and
answer the fourth and fifth research questions, the NLN’s student satisfaction and self-
confidence in learning survey was used in this study. Permission to use this instrument was also
granted by the NLN (Appendix C). The NLN’s survey has 13 five-point scale questions
(Appendix D). The first five questions were developed to evaluate students’ satisfaction with
current learning, while the remaining eight questions were designed to measure students’ self-
confidence in learning. According to the NLN (2015), the Cronbach’s alpha was 0.94 for the
satisfaction section and 0.87 for the self-confidence section. In this current study, the Cronbach
alpha coefficient was .699 and the mean inter-item correlation was .316 for the satisfaction
section. In addition, the Cronbach’s alpha was .734 and the mean inter-item correlation was .346
for the self-confidence in learning section.
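For context, the Cronbach's alpha values reported for these scales follow the standard definition below; this is a general formula rather than anything specific to the NLN instrument, and the mean inter-item correlation is simply the average of the pairwise correlations among the items:

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^{2}}{\sigma_X^{2}}\right)

where k is the number of items, \sigma_i^{2} is the variance of item i, and \sigma_X^{2} is the variance of the total scale score.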
To more appropriately evaluate the participants’ experiences with virtual simulation, a
modified version of the NLN’s original survey was used (Appendix E). Similar to the original
survey, the modified instrument has 13 five-point scale questions, with the first five questions
evaluating students’ satisfaction and the other eight questions measuring students’ self-
confidence in learning. The Cronbach’s alpha was .862 and the mean inter-item correlation was
.582 for the satisfaction section. For the self-confidence in learning section, the Cronbach alpha
coefficient was .883 and the mean inter-item correlation was .511.
Data Collection
A list of the names and email addresses of all eligible level four students was obtained
from the department’s administrative office. A recruitment letter was emailed to all eligible
level four students one week before the beginning of the fall 2015 semester (Appendix F). The
researcher also attended all four NUR 3957 classes during the first week of the semester to meet
with all the eligible level four students and explain this study to them. An information sheet with
consent form was provided to all eligible level four nursing students during their first class
(Appendix G). All level four students were then randomly assigned to either the face-to-face
simulation group or the virtual simulation group by drawing an assignment card from a bag.
After making the group assignment, the level four students were asked to submit the signed
consent form if they chose to participate in the study. A total of 52 students signed the consent
form and agreed to participate in the study: nine in the Monday class, 16 in the Tuesday class, 11
in the Wednesday class, and 16 in the Friday class. All participants were given the online HSRT
pre-test during their first NUR 3957 class in the university’s computer lab. A brief orientation to
the vSim for Nursing Medical-Surgical program was also provided. The student user's guide for
vSim for Nursing Medical-Surgical was given to students to help them access and navigate the
virtual simulation program.
The participants were required to complete five consecutive weeks of simulations either
in the simulation lab or in a virtual environment before taking the HSRT post-test, except for the
participants in the Monday class due to a national holiday. The weekly scenarios were assigned
in an attempt to match the content covered in the lecture class in order to help students apply and
practice what they learned in their lecture class. The simulation scenarios used in both the face-
to-face simulation group and the virtual simulation group were the same NLN simulation
scenarios; however, one scenario used in the face-to-face simulation group differed from the
corresponding scenario used in the virtual simulation group because that particular scenario is
not included in the vSim Medical-Surgical program. As a result, that simulation scenario in the
face-to-face simulation group was a patient with congestive heart failure, while the corresponding
scenario in the virtual simulation group was a patient with acute myocardial infarction. The other
simulation scenarios used in both groups were the same, including patients with pneumonia,
chronic obstructive pulmonary disease, bowel obstruction, and postoperative hysterectomy with
opioid intoxication.
Students in the face-to-face group were provided with open-ended pre-simulation
questions about a week prior to their next class and were expected to answer these questions and
bring the answers to their simulation classes. The number of pre-simulation questions depended
on each simulation scenario. The weekly, 1.5-hour face-to-face simulation class was taught by
one simulation instructor with six to eight students. There were four different simulation
instructors who taught the NUR 3957 classes that were offered on four different days in the fall
of 2015, and they were provided with transcripts consisting of simulation scenario information
about one week before their next class. The patient simulator used in the face-to-face simulation
lab was Laerdal’s SimMan. The face-to-face simulation class usually began with an
instructor-led pre-briefing about the upcoming simulation scenario. After the discussion and pre-
briefing, two students were asked to serve either as a primary nurse or a secondary nurse to take
care of the simulated patient in the scenario, while the rest of the students were asked to observe
and identify the strengths of and areas for improvement in their peers’ performance in dealing with
the scenario. After each simulation scenario, an instructor-led debriefing was conducted to allow
students to further discuss the scenario, including the interventions they applied and observed, as
well as what they would do differently next time to promote better patient outcomes. The
students were also required to complete a documentation assignment specific to the simulation
scenario in DocuCare, which is an electronic medical record program developed by Lippincott,
no later than three days before their next class.
Participants in the virtual simulation group were provided with free three-month access to the
vSim for Nursing Medical-Surgical program. All participants in the virtual simulation group
were notified about the weekly assigned scenario about one week before their class and were
asked to open and complete only the assigned scenarios. Prior to the weekly NUR 3957 class, all
participants were asked to complete the suggested readings listed under the assigned scenario
and the multiple-choice pre-simulation quiz. Students could use the online textbooks built into
the vSim for Nursing Medical-Surgical program as a reference if needed. During the scheduled
class time, students were required to start working on the assigned virtual simulation scenario
and to stay on campus for the entire 1.5 hours because the students in the face-to-face simulation
group were conducting simulations in the lab. After completing each simulation scenario,
students were also required to complete a post-simulation quiz and the documentation
assignments in DocuCare. Finally, all participants were required to answer the guided reflection
questions listed at the end of the scenario and to email their reflections to the researcher who
oversaw the vSim class at least three days before their next class. The researcher provided
weekly feedback to students based on their responses to the debriefing questions. Students’
performance on each simulation scenario was recorded and scored by the vSim for Nursing
Medical-Surgical program, and result reports were available for students and the researcher to
review. Students were required to redo each scenario until they earned at least 80% on both the
virtual simulation scenario and the post-simulation quiz in order to move to the next scenario.
By the midterm of the fall 2015 semester, all students were switched to different
simulation groups so all level four students had a chance to experience both face-to-face and
virtual simulation. Prior to switching students to different groups, the HSRT was used again to
test the participants’ critical thinking skills after receiving either the face-to-face simulation or
virtual simulation. The HSRT post-test was administered based on the participants’
availabilities. Some participants scheduled an appointment with the researcher to take the HSRT
post-test in the researcher’s office using a laptop, while some other students took the test before
their scheduled classes in the school’s computer lab. All participants were required to complete
all simulation scenarios and other simulation-related assignments, and to have not yet experienced
the other type of simulation, before taking the post-test. All participants were also asked to complete
the satisfaction and self-confidence in learning survey after they completed the HSRT post-test.
A $10 Starbucks gift card was provided to the participants who completed the survey and all the
required tests.
Data Analysis
The overall scores for the HSRT pre-test and post-test were analyzed to compare the use
of face-to-face simulation and the vSim for Nursing Medical-Surgical program in terms of
critical thinking skills gain. In addition, the difference between the participants in either the
face-to-face simulation group or the virtual simulation group was examined in terms of their
satisfaction and self-confidence in learning. As a result, the independent variables in this study
are face-to-face simulation and the virtual simulation program, and the dependent variables are
the research participants’ critical thinking skills, satisfaction, and self-confidence.
Descriptive statistics were used to document the characteristics of participants in both
groups. They were also used to check the distributions of scores on the dependent variables in
order to examine if the assumptions of selected parametric statistical techniques were violated.
IBM SPSS Statistics version 23 was used to analyze the collected data. A paired-samples t-test
was used to compare the HSRT overall scores at the pre-test and the post-test within the face-to-
face simulation group and the virtual simulation group. In addition, an independent t-test was
utilized to compare the pre-test and post-test overall scores of the HSRT to examine if the critical
thinking skills differ between the face-to-face simulation group and the virtual simulation group.
Finally, a Mann-Whitney U test was used to examine if there was any difference between the
face-to-face simulation group and the virtual simulation group in terms of the participants’
satisfaction and self-confidence in learning.
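As an illustrative sketch only, the analysis plan described above could be reproduced outside of SPSS roughly as follows; the data values and variable names here are placeholders, not the study data, and IBM SPSS Statistics version 23 was the tool actually used:

    # Illustrative analysis plan with placeholder data (not the study data).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Placeholder HSRT overall scores standing in for the two groups (n = 27 and n = 22).
    f2f_pre, f2f_post = rng.normal(18, 4.4, 27), rng.normal(18.6, 4.5, 27)
    vs_pre, vs_post = rng.normal(18.5, 5.6, 22), rng.normal(19.7, 4.4, 22)

    # Paired-samples t-tests: pre-test vs. post-test within each group (research questions 2 and 3).
    print(stats.ttest_rel(f2f_pre, f2f_post))
    print(stats.ttest_rel(vs_pre, vs_post))

    # Independent-samples t-test: post-test scores between the two groups (research question 1).
    print(stats.ttest_ind(f2f_post, vs_post))

    # Mann-Whitney U test: satisfaction (or self-confidence) totals between groups (questions 4 and 5).
    f2f_sat, vs_sat = rng.integers(15, 26, 27), rng.integers(15, 26, 22)
    print(stats.mannwhitneyu(f2f_sat, vs_sat, alternative="two-sided"))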
CHAPTER FOUR: RESULTS
The purpose of this study was to compare the effects of virtual simulation to face-to-face
simulation on the acquisition of critical thinking skills, satisfaction, and self-confidence in
learning among undergraduate nursing students. An experimental quantitative research study
with a pre-test and post-test design was conducted. This chapter presents the results that
correspond to the following research questions:
1. Is there a significant difference in the critical thinking skills of students who are trained
in face-to-face simulation and those trained in virtual simulation?
2. Is there a significant difference in the critical thinking skills of students before and after
participating in virtual simulation?
3. Is there a significant difference in the critical thinking skills of students before and after
receiving traditional face-to-face simulation?
4. Is there a significant difference in the satisfaction with learning of students who are
trained via face-to-face simulation and those trained via virtual simulation?
5. Is there a significant difference in the self-confidence in learning of students who are
trained via face-to-face simulation and those trained in virtual simulation?
Sample Demographics
A total of 52 students completed the pre-test; however, three did not take the post-test.
As a result, the sample size included in the data analysis consisted of 49 undergraduate junior
nursing students. Among the participants, 42 were female (85.7%) and seven were male
(14.3%). In terms of ethnicity, 30 participants identified themselves as Asian/Asian American
and Pacific Islander (61.2%), 15 participants reported themselves as white (30.6%), three
participants described themselves as Hispanic/Latino/Mexican American (6.1%), and only one
student (2.1 %) was black/African American. The mean age and the mean self-reported nursing
cumulative grade point average for participants in both groups were 25.65 years old and 3.34,
respectively. In regard to healthcare-related work experience, 20 participants reported only
having practicum experience (40.8%), 18 participants (36.7%) reported working as a Certified
Nurse Aide, two participants (4.1%) reported working as a Licensed Practical Nurse, and nine
participants (18.4%) had experience working as a medical clerk, care/monitor technician, or
medical assistant. Finally, 26 participants (53%) reported having healthcare-related experience
of less than a year, nine (18.4%) participants indicated they had one to two years of experience,
and 14 participants (28.6%) had more than two years of experience.
Face-to-Face Simulation Group
There were 27 students in the face-to-face simulation group. The majority of the
participants in this group were females, and they accounted for 77.8% of the group. In addition,
55.6% of the participants identified themselves as Asian/Asian American and Pacific Islander,
followed by 33.3% as white, 7.4% as Hispanic/Latino/Mexican American, and 3.7% as
black/African American. In terms of healthcare-related work experience and years of work,
40.7% of the participants reported working as a Certified Nurse Aide and more than half of the
participants claimed to have healthcare-related experience of less than a year. The mean age of
this group was 25.63 years with a range of 21 to 34 years. Finally, the mean self-reported
cumulative nursing grade point average was 3.32, ranging from 3.0 to 4.0 (Table 1).
Virtual Simulation Group
This group consisted of 22 participants, of which 95.5% were female. In terms of
ethnicity, about 68.2% of the participants were Asian/Asian American and Pacific Islander,
27.3% were white, and 4.5% were Hispanic/Latino/Mexican American. In addition,
more than half of the participants reported practicum as the only healthcare experience they had,
which meant that their years of experience were less than a year. The mean age of the
participants was 25.68 years, ranging from 21 to 49 years. The mean self-reported cumulative
nursing GPA was 3.37, with a range between 3.0 and 4.0 (Table 1).
Table 1
Demographic Characteristics for All Participants

                                              Face-to-Face Simulation (n = 27)   Virtual Simulation (n = 22)
Gender
  Female                                      21 (77.8%)                         21 (95.5%)
  Male                                        6 (22.2%)                          1 (4.5%)
Ethnicity
  Asian/Asian American and Pacific Islander   15 (55.6%)                         15 (68.2%)
  White                                       9 (33.3%)                          6 (27.3%)
  Hispanic/Latino/Mexican American            2 (7.4%)                           1 (4.5%)
  Black/African American                      1 (3.7%)                           0 (0%)
Healthcare-Related Work Experience
  Certified Nurse Aide                        11 (40.7%)                         7 (31.8%)
  Practicum                                   8 (29.6%)                          12 (54.5%)
  Other                                       6 (22.2%)                          3 (13.6%)
  Licensed Practical Nurse                    2 (7.4%)                           0 (0%)
Years of Healthcare-Related Work
  Less than One Year                          14 (51.9%)                         12 (54.5%)
  One to Two Years                            5 (18.5%)                          4 (18.2%)
  More Than Two Years                         8 (29.6%)                          6 (27.3%)
Self-Reported Nursing GPA
  Mean                                        3.32                               3.37
  Std. Deviation                              .277                               .331
  Minimum                                     3.0                                3.0
  Maximum                                     4.0                                4.0
Age (years)
  Mean                                        25.63                              25.68
  Std. Deviation                              4.134                              6.799
  Minimum                                     21                                 21
  Maximum                                     34                                 49
Results
Prior to conducting inferential statistical analysis, descriptive statistics were used to
examine continuous variables to detect violation of the assumptions of parametric tests that were
chosen for data analysis. The distribution of scores on the dependent variables was examined by
assessing the mean, median, standard deviation, range, skewness, kurtosis, results of normality
tests, and histograms. The results of the inferential statistics that address each research question
were then presented.
Critical Thinking Skills
Participants’ overall HSRT scores were used to assess their critical thinking skills before
and after receiving either the face-to-face simulation or virtual simulation. The mean time for
participants in the face-to-face group to complete the HSRT was 34.37 minutes for the pre-test
and 27.78 minutes for the post-test. The mean overall score on the HSRT at the pre-test for the
face-to-face simulation group was 18.04. After receiving face-to-face simulation classes, the
participants performed better on the HSRT post-test, as the overall score increased to 18.63. On
the other hand, the mean time the participants in the virtual simulation group spent on the HSRT
was 33.64 minutes for the pre-test and 29.68 minutes for the post-test. Participants in the virtual
simulation group demonstrated increased critical thinking skills after completing simulation
scenarios virtually, as indicated by the increase in the mean HSRT overall score from 18.55 at the
pre-test to 19.68 at the post-test. Table 2 summarizes the descriptive statistics for the HSRT
overall scores by group.
Table 2
Descriptive Statistics for HSRT Overall Scores and Minutes Spent on Tests by Group

Face-to-Face Simulation (n = 27)
  Pre-Test Overall:     Min = 10, Max = 26, M = 18.04, SD = 4.381, Skewness = -.042 (SE = .448), Kurtosis = -.823 (SE = .872)
  Post-Test Overall:    Min = 11, Max = 29, M = 18.63, SD = 4.542, Skewness = .236 (SE = .448), Kurtosis = -.623 (SE = .872)
  Minutes on Pre-Test:  Min = 23, Max = 48, M = 34.37, SD = 6.721, Skewness = .195 (SE = .448), Kurtosis = -.728 (SE = .872)
  Minutes on Post-Test: Min = 19, Max = 38, M = 27.78, SD = 5.480, Skewness = .286 (SE = .448), Kurtosis = -1.039 (SE = .872)

Virtual Simulation (n = 22)
  Pre-Test Overall:     Min = 10, Max = 29, M = 18.55, SD = 5.578, Skewness = .071 (SE = .491), Kurtosis = -1.155 (SE = .953)
  Post-Test Overall:    Min = 10, Max = 29, M = 19.68, SD = 4.401, Skewness = -.005 (SE = .491), Kurtosis = .273 (SE = .953)
  Minutes on Pre-Test:  Min = 15, Max = 48, M = 33.64, SD = 7.712, Skewness = .091 (SE = .491), Kurtosis = .792 (SE = .953)
  Minutes on Post-Test: Min = 17, Max = 42, M = 29.86, SD = 6.840, Skewness = -.004 (SE = .491), Kurtosis = -.572 (SE = .953)
In terms of the normality of the HSRT overall scores, the results of the Kolmogorov-
Smirnov statistic demonstrated a significance level of .200 for both the face-to-face simulation
group and the virtual simulation group. As a result, it was assumed that both groups’ overall
HSRT scores on the pre-test and post-test are normally distributed. Table 3 summarizes the
results of the Kolmogorov-Smirnov statistic. Figures 2 to 5 demonstrate the actual shapes of the
score distributions for each group.
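The Kolmogorov-Smirnov significance values above come from SPSS, which typically applies the Lilliefors correction when the distribution parameters are estimated from the sample. A rough equivalent check, shown here only as a sketch with placeholder scores rather than the study data, could look like this:

    # Sketch: Lilliefors-corrected Kolmogorov-Smirnov normality check (placeholder scores).
    import numpy as np
    from statsmodels.stats.diagnostic import lilliefors

    scores = np.array([10, 12, 14, 15, 16, 17, 18, 18, 19, 20, 21, 22, 23, 24, 26])
    ks_stat, p_value = lilliefors(scores, dist="norm")
    print(f"K-S statistic = {ks_stat:.3f}, p = {p_value:.3f}")
    # A p-value above .05 suggests no significant departure from normality,
    # which supports using the parametric t-tests described earlier.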
Table 3
Test of Normality (Kolmogorov-Smirnov) for Overall HSRT Scores by Group

Face-to-Face Simulation Group
  Pre-test HSRT Overall Score:  Statistic = .089, df = 27, Sig. = .200
  Post-test HSRT Overall Score: Statistic = .126, df = 27, Sig. = .200
Virtual Simulation Group
  Pre-test HSRT Overall Score:  Statistic = .111, df = 22, Sig. = .200
  Post-test HSRT Overall Score: Statistic = .092, df = 22, Sig. = .200
Figure 2. Face-to-Face Simulation Group HSRT Overall Scores (Pre-Test)
Figure 3. Face-to-Face Simulation Group HSRT Overall Scores (Post-Test)
Figure 4. Virtual Simulation Group HSRT Overall Scores (Pre-Test)
Figure 5. Virtual Simulation Group HSRT Overall Scores (Post-Test)
Because the results of the descriptive statistics indicated there was no violation
of the assumptions of parametric techniques, the independent-samples t-test was used to examine
if there was a difference in critical thinking skills for participants in the face-to-face simulation
group and the virtual simulation group. The paired-samples t-test was also used to compare the
differences in the critical thinking skills of students before and after receiving either the face-to-
face simulation or virtual simulation.
In order to determine if there was a significant difference in participants’ critical thinking
skills between the face-to-face simulation group and virtual simulation group before the
interventions, an independent-samples t-test was conducted to compare the mean pre-test HSRT
overall scores for both groups. As shown in Table 4, the results indicated that there was no
significant difference in the overall HSRT scores for the face-to-face simulation group (M =
18.04, SD = 4.381) and the virtual simulation group (M = 18.55, SD = 5.578; t (47) = -.357, p =
.722 two-tailed). The magnitude of the differences in the means (mean difference = -.508, 95%
CI: -3.37 to 2.353) was very small (Cohen’s d = 0.10).
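The reported effect size is consistent with the conventional pooled-variance form of Cohen's d; the exact variant used by the author is not stated in the text, so the calculation below is offered only as a check:

d = \frac{\lvert \bar{X}_1 - \bar{X}_2 \rvert}{s_p},
\qquad
s_p = \sqrt{\frac{(n_1 - 1)s_1^{2} + (n_2 - 1)s_2^{2}}{n_1 + n_2 - 2}}
    = \sqrt{\frac{26(4.381)^{2} + 21(5.578)^{2}}{47}} \approx 4.95,
\qquad
d \approx \frac{0.508}{4.95} \approx 0.10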
Table 4
Independent t-test Results of HSRT Overall Scores (Pre-Test)

Group Statistics
  Face-to-Face Simulation Overall Score (Pre-Test): n = 27, M = 18.04, SD = 4.381, SE = .843
  Virtual Simulation Overall Score (Pre-Test):      n = 22, M = 18.55, SD = 5.578, SE = 1.189

Independent Samples Test (Overall Score)
  Equal variances assumed:     Levene's F = 2.915, Sig. = .094; t = -.357, df = 47, Sig. (2-tailed) = .722,
                               Mean Difference = -.508, SE of Difference = 1.422, 95% CI [-3.370, 2.353]
  Equal variances not assumed: t = -.349, df = 39.377, Sig. (2-tailed) = .729,
                               Mean Difference = -.508, SE of Difference = 1.458, 95% CI [-3.456, 2.439]
Research Question One
An independent-samples t-test was performed to compare the difference in the mean
HSRT overall scores at post-test in order to answer this research question. As shown in Table 5,
the results indicated there was no significant differences in critical thinking skills for students
who received face-to-face simulation (M = 18.63, SD = 4.542) and those who were trained in
virtual simulation (M = 19.68, SD = 4.401; t (47) = -.818, p = .418 two-tailed). The magnitude of
the differences in the means (mean difference = -1.052, 95% CI: -3.64 to 1.536) was very small
(Cohen’s d = 0.235).
Table 5
Independent t-test Results of HSRT Overall Scores (Post-Test)

Group Statistics
  Face-to-Face Simulation Overall Score (Post-Test): n = 27, M = 18.63, SD = 4.542, SE = .874
  Virtual Simulation Overall Score (Post-Test):      n = 22, M = 19.68, SD = 4.401, SE = .938

Independent Samples Test (Overall Score)
  Equal variances assumed:     Levene's F = .413, Sig. = .524; t = -.818, df = 47, Sig. (2-tailed) = .418,
                               Mean Difference = -1.052, SE of Difference = 1.287, 95% CI [-3.640, 1.536]
  Equal variances not assumed: t = -.821, df = 45.553, Sig. (2-tailed) = .416,
                               Mean Difference = -1.052, SE of Difference = 1.282, 95% CI [-3.634, 1.530]
Research Question Two
This research question was addressed by conducting a paired-samples t-test to compare
the mean HSRT overall scores at pre-test and post-test for participants in the virtual simulation
group. As shown in Table 6, a significant difference in participants’ overall HSRT scores before
(M = 18.55, SD = 5.578) and after (M = 19.68, SD = 4.401) receiving virtual simulation was not
found; t (21) = -.747, p = .463 (two-tailed). The mean increase in the HSRT overall scores was
1.136 with a 95% confidence interval ranging from -4.299 to 2.026. The Cohen’s d (0.225)
indicated a very small effect size.
Research Question Three
A paired-samples t-test was performed to address this research question. The results
demonstrated that participants’ critical thinking skills did not differ before (M = 18.04, SD =
4.381) or after (M = 18.63, SD = 4.542) having face-to-face simulation; t (26) = -.503, p = .619
(two-tailed). As shown in Table 6, the mean increase in the HSRT overall scores from pre-test to
post-test was .593 with a 95% confidence interval ranging from -3.014 to 1.829. The Cohen’s d
(0.132) indicated a very small effect size.
Table 6
Paired-Samples t-test Results by Group

Paired Samples Statistics
  Face-to-Face Simulation: Overall Score Pre-Test  M = 18.04, n = 27, SD = 4.381, SE = .843
                           Overall Score Post-Test M = 18.63, n = 27, SD = 4.542, SE = .874
  Virtual Simulation:      Overall Score Pre-Test  M = 18.55, n = 22, SD = 5.578, SE = 1.189
                           Overall Score Post-Test M = 19.68, n = 22, SD = 4.401, SE = .938

Paired Samples Correlations
  Face-to-Face Simulation, Pre-Test & Post-Test Overall Scores: n = 27, r = .059, Sig. = .771
  Virtual Simulation, Pre-Test & Post-Test Overall Scores:      n = 22, r = -.008, Sig. = .971

Paired Samples Test (Pre-Test minus Post-Test)
  Face-to-Face Simulation: Mean = -.593, SD = 6.122, SE = 1.178, 95% CI [-3.014, 1.829],
                           t = -.503, df = 26, Sig. (2-tailed) = .619
  Virtual Simulation:      Mean = -1.136, SD = 7.133, SE = 1.521, 95% CI [-4.299, 2.026],
                           t = -.747, df = 21, Sig. (2-tailed) = .463
Satisfaction and Self-Confidence in Learning
The total score of the five items in the Likert scale survey was used to assess participants’
satisfaction with either face-to-face simulation or virtual simulation. In addition, the difference
in participants’ self-confidence in learning between the groups was assessed by examining the
total scores for the eight survey items. Descriptive analysis was conducted to examine the
normality of the distributions of scores.
All participants answered all 13 survey questions. Table 7 summarizes the descriptive
statistics for all survey questions by group. When assessing the distributions of scores by group,
the results of the Kolmogorov-Smirnov test indicated the score for each survey question was not
normally distributed and, therefore, violated the assumptions of parametric statistical techniques.
Table 8 summarizes the results of Kolmogorov-Smirnov statistics. Figures 6 to 31 illustrate the
actual shapes of score distributions for each survey question by group.
Table 7
Descriptive Statistics for Individual Survey Questions of Satisfaction and Self-Confidence in
Learning by Group

Virtual Simulation Group
Question   Valid N   Minimum   Maximum   Mean   Std. Deviation   Skewness   Std. Error   Kurtosis   Std. Error
Q1 22 3 5 4.41 .590 -.379 .491 -.626 .953
Q2 22 3 5 4.50 .598 -.736 .491 -.312 .953
Q3 22 2 5 4.00 .873 -.945 .491 .888 .953
Q4 22 3 5 4.32 .646 -.404 .491 -.540 .953
Q5 22 2 5 4.09 .921 -.977 .491 .616 .953
Q6 22 3 5 4.23 .528 .264 .491 .136 .953
Q7 22 3 5 4.32 .568 -.050 .491 -.506 .953
Q8 22 3 5 4.18 .733 -.304 .491 -.973 .953
Q9 22 3 5 4.41 .590 -.379 .491 -.626 .953
Q10 22 3 5 4.45 .596 -.533 .491 -.524 .953
Q11 22 2 5 4.32 .780 -1.314 .491 2.374 .953
Q12 22 4 5 4.36 .492 .609 .491 -1.802 .953
Q13 22 3 5 4.32 .646 -.404 .491 -.540 .953
Face-to-Face Simulation Group
Question   Valid N   Minimum   Maximum   Mean   Std. Deviation   Skewness   Std. Error   Kurtosis   Std. Error
Q1 27 4 5 4.70 .465 -.946 .448 -1.201 .872
Q2 27 4 5 4.63 .492 -.569 .448 -1.817 .872
Q3 27 4 5 4.96 .192 -5.196 .448 27.000 .872
Q4 27 3 5 4.52 .643 -1.012 .448 .069 .872
Q5 27 4 5 4.89 .320 -2.623 .448 5.265 .872
Q6 27 3 5 4.37 .565 -.136 .448 -.744 .872
Q7 27 3 5 4.48 .580 -.562 .448 -.604 .872
Q8 27 4 5 4.52 .509 -.079 .448 -2.160 .872
Q9 27 3 5 4.74 .526 -1.985 .448 3.462 .872
Q10 27 3 5 4.63 .565 -1.247 .448 .736 .872
Q11 27 4 5 4.78 .424 -1.416 .448 .000 .872
Q12 27 3 5 4.52 .580 .716 .448 -.413 .872
Q13 27 1 5 3.78 1.155 -.664 .448 -.332 .872
Note. Question 1-5 = Satisfaction; Question 6-13 = Self-Confidence in Learning
Table 8
Test of Normality for Satisfaction and Self-Confidence in Learning Survey Questions by Group
Kolmogorov-Smirnov
Group Question Statistic df Sig.
Virtual Simulation 1 .301 22 .000
2 .344 22 .000
3 .318 22 .000
4 .280 22 .000
5 .279 22 .000
6 .394 22 .000
7 .349 22 .000
8 .234 22 .003
9 .301 22 .000
10 .320 22 .000
11 .264 22 .000
12 .406 22 .000
13 .280 22 .000
Face-to-Face Simulation 1 .442 27 .000
2 .404 27 .000
3 .539 27 .000
4 .366 27 .000
5 .525 27 .000
6 .337 27 .000
7 .333 27 .000
8 .346 27 .000
9 .467 27 .000
10 .411 27 .000
11 .478 27 .000
12 .352 27 .000
13 .206 27 .005
Figure 6. Virtual Simulation Group Survey Question One Scores
Figure 7. Face-to-Face Simulation Group Survey Question One Scores
Figure 8. Virtual Simulation Group Survey Question Two Scores
Figure 9. Face-to-Face Simulation Group Survey Question Two Scores
Figure 10. Virtual Simulation Group Survey Question Three Scores
Figure 11. Face-to-Face Group Survey Question Three Scores
Figure 12. Virtual Simulation Group Survey Question Four Scores
Figure 13. Face-to-Face Simulation Group Survey Question Four Scores
Figure 14. Virtual Simulation Group Survey Question Five Scores
Figure 15. Face-to-Face Simulation Survey Question Five Scores
Figure 16. Virtual Simulation Group Survey Question Six Scores
Figure 17. Face-to-Face Simulation Group Survey Question Six Scores
Figure 18. Virtual Simulation Group Survey Question Seven Scores
Figure 19. Face-to-Face Simulation Group Survey Question Seven Scores
Figure 20. Virtual Simulation Group Survey Question Eight Scores
Figure 21. Face-to-Face Simulation Group Survey Question Eight Scores
Figure 22. Virtual Simulation Group Survey Question Nine Scores
Figure 23. Face-to-Face Simulation Group Survey Question Nine Scores
Figure 24. Virtual Simulation Group Survey Question 10 Scores
Figure 25. Virtual Simulation Group Survey Question 11 Scores
Figure 26. Face-to-Face Simulation Group Survey Question Ten Scores
Figure 27. Face-to-Face Simulation Group Survey Question 11 Scores
Figure 28. Virtual Simulation Group Survey Question 12 Scores
Figure 29. Face-to-Face Simulation Group Survey Question 12 Scores
Figure 30. Virtual Simulation Group Survey Question 13 Scores
Figure 31. Face-to-Face Simulation Group Survey Question 13 Scores
Research Question Four
Because the results of descriptive analysis revealed that the assumptions of parametric
tests were violated, a Mann-Whitney U test was used to examine if the participants in the face-
to-face simulation group and the virtual simulation group differed in terms of their satisfaction
with current learning. When comparing the total satisfaction with current learning scores for
both groups, the results of the Mann-Whitney U test revealed a significant difference for
participants in the virtual simulation group (Mdn = 21.5, n = 22) and the face-to-face simulation
group (Mdn = 25, n = 27), U = 145, z = -3.145, p = .002 (Table 9). The magnitude of the
differences in satisfaction with current learning between these two groups was medium (r = .45).
As a result, the participants were more satisfied with conducting simulation in the classrooms
than in a virtual environment.
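The reported effect size is consistent with the common approximation for Mann-Whitney comparisons, r = |z| / √N, where N is the total number of cases; this is offered only as a check, assuming that this is the formula that was applied:

r = \frac{\lvert z \rvert}{\sqrt{N}} = \frac{3.145}{\sqrt{49}} \approx .45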
When comparing the difference in scores for each individual survey question about
satisfaction with learning, a significant difference was found in question three, worded as “I
enjoyed how my instructor taught the simulation” for the face-to-face simulation group (Mdn = 5,
n = 27) and as “I enjoyed the vSim” for the virtual simulation group (Mdn = 4, n = 22), U = 90,
z = -4.963, p < .001, r = .71 (Table 10). The participants in the face-to-face simulation group
were more satisfied with how their instructors taught the simulation in classrooms than those
students who participated in simulations in a virtual environment. In addition, there was a
significant difference found in question five, worded as “The way my instructor(s) taught the
simulation was suitable to the way I learn” for the face-to-face simulation group (Mdn = 5,
n = 27) and as “The vSim was suitable to the way I learn” for the virtual simulation group
(Mdn = 4, n = 22), U = 135, z = -3.884, p < .001, r = .55 (Table 10). This finding suggested that
face-to-face simulation was more suitable to the way these students learn than virtual simulation.
Research Question Five
A Mann-Whitney U test was also used to address the last research question. The results
of the comparison of the total scores suggested that there was no significant difference in self-
confidence in learning for participants in the virtual simulation group (Mdn = 33, n = 22) and the
face-to-face simulation group (Mdn = 36, n = 27), U = 231.5, z = -1.326, p = .185, r = .19 (Table
9).
However, when examining the score for each survey question, a significant difference in
the participants’ self-confidence in learning was found between the face-to-face group and the
virtual simulation group based on question 9 and question 11. The former asked participants
about the helpfulness of resources included in the vSim program or used by their simulation
instructors. The results revealed that participants in the face-to-face group (Mdn = 5, n = 27) felt
more confident about the resources that were used to conduct the simulation and promote their
learning than did those in the virtual simulation group (Mdn = 4, n = 22), U = 204, z = -2.215, p
= .027, r = .32 (Table 10). In addition, a significant difference was found in question 11, “I know
how to get help when I do not understand the concepts covered in the simulation.” The
participants in the face-to-face simulation group (Mdn = 5, n = 27) had higher self-confidence in
knowing how to get help than those in the virtual simulation group (Mdn = 4, n = 22), U = 195, z
= -2.429, p = .015, r = .35 (Table 10).
Table 9
Mann-Whitney U Test Results for Total Scores, Test Statistics, and Ranks by Group

Report (Medians)
  Virtual Simulation:      n = 22, Total Satisfaction = 21.50, Total Self-Confidence in Learning = 33.00
  Face-to-Face Simulation: n = 27, Total Satisfaction = 25.00, Total Self-Confidence in Learning = 36.00
  Total:                   n = 49, Total Satisfaction = 23.00, Total Self-Confidence in Learning = 35.00

Test Statistics (Grouping Variable: Group)
  Total Satisfaction:                Mann-Whitney U = 145.000, Wilcoxon W = 398.000, Z = -3.145, Asymp. Sig. (2-tailed) = .002
  Total Self-Confidence in Learning: Mann-Whitney U = 231.500, Wilcoxon W = 484.500, Z = -1.326, Asymp. Sig. (2-tailed) = .185

Ranks
  Total Satisfaction:                Virtual Simulation n = 22, Mean Rank = 18.09, Sum of Ranks = 398.00;
                                     Face-to-Face Simulation n = 27, Mean Rank = 30.63, Sum of Ranks = 827.00
  Total Self-Confidence in Learning: Virtual Simulation n = 22, Mean Rank = 22.02, Sum of Ranks = 484.50;
                                     Face-to-Face Simulation n = 27, Mean Rank = 27.43, Sum of Ranks = 740.50
Table 10
Mann-Whitney U Test Results for Total Scores and Test Statistics by Question and Group
Report
Virtual Simulation (Q, N, Median)   Face-to-Face Simulation (Q, N, Median)   Total (Q, N, Median)
Q1 22 4.00 Q1 27 5.00 Q1 49 5.00
Q2 22 5.00 Q2 27 5.00 Q2 49 5.00
Q3 22 4.00 Q3 27 5.00 Q3 49 5.00
Q4 22 4.00 Q4 27 5.00 Q4 49 5.00
Q5 22 4.00 Q5 27 5.00 Q5 49 5.00
Q6 22 4.00 Q6 27 4.00 Q6 49 4.00
Q7 22 4.00 Q7 27 5.00 Q7 49 4.00
Q8 22 4.00 Q8 27 5.00 Q8 49 4.00
Q9 22 4.00 Q9 27 5.00 Q9 49 5.00
Q10 22 4.50 Q10 27 5.00 Q10 49 5.00
Q11 22 4.00 Q11 27 5.00 Q11 49 5.00
Q12 22 4.00 Q12 27 5.00 Q12 49 4.00
Q13 22 4.00 Q13 27 4.00 Q13 49 4.00
Test Statistics
Question (Q)   Mann-Whitney U   Wilcoxon W   Z   Asymp. Sig. (2-tailed)
Q1 219.000 472.000 -1.829 .067
Q2 267.000 520.000 -.704 .482
Q3 90.000 343.000 -4.963 .000
Q4 244.500 497.500 -1.181 .238
Q5 135.000 388.000 -3.884 .000
Q6 257.000 510.000 -.942 .346
Q7 251.500 504.500 -1.038 .299
Q8 225.000 478.000 -1.612 .107
Q9 204.000 457.000 -2.215 .027
Q10 248.500 501.500 -1.131 .258
Q11 195.000 448.000 -2.429 .015
Q12 247.000 500.000 -1.150 .250
Q13 223.500 601.500 -1.568 .117
Notes. Grouping Variable: Group
Table 11
Mann-Whitney U Test Results for Ranks of Each Survey Question by Group
Ranks
Question (Q)   Group   Sample Size (N)   Mean Rank   Sum of Ranks
Q1
Virtual Simulation 22 21.45 472.00
Face-to-Face Simulation 27 27.89 753.00
Q2
Virtual Simulation 22 23.64 520.00
Face-to-Face Simulation 27 26.11 705.00
Q3
Virtual Simulation 22 15.59 343.00
Face-to-Face Simulation 27 32.67 882.00
Q4
Virtual Simulation 22 22.61 497.50
Face-to-Face Simulation 27 26.94 727.50
Q5
Virtual Simulation 22 17.64 388.00
Face-to-Face Simulation 27 31.00 837.00
Q6
Virtual Simulation 22 23.18 510.00
Face-to-Face Simulation 27 26.48 715.00
Q7
Virtual Simulation 22 22.93 504.50
Face-to-Face Simulation 27 26.69 720.50
Q8
Virtual Simulation 22 21.73 478.00
Face-to-Face Simulation 27 27.67 747.00
Q9
Virtual Simulation 22 20.77 457.00
Face-to-Face Simulation 27 28.44 768.00
Q10
Virtual Simulation 22 22.80 501.50
Face-to-Face Simulation 27 26.80 723.50
Q11
Virtual Simulation 22 20.36 448.00
Face-to-Face Simulation 27 28.78 777.00
Q12
Virtual Simulation 22 22.73 500.00
Face-to-Face Simulation 27 26.85 725.00
Q13
Virtual Simulation 22 28.34 623.50
Face-to-Face Simulation 27 22.28 601.50
Chapter Summary
This quantitative study was designed to examine the effects of face-to-face simulation
and virtual simulation on undergraduate nursing students’ critical thinking skills, as well as
satisfaction and self-confidence in learning. An independent-samples t-test was used to compare
the differences in critical thinking skills between groups. The results of applying the selected
statistical techniques showed that the critical thinking skills of the participants in both groups did
not differ before receiving either the face-to-face simulation or the virtual simulation, meaning
that all participants began on an equal footing. After receiving different types of simulation, the
participants in the virtual simulation group scored higher on the post-test than those in the face-
to-face simulation group; however, the difference was not statistically significant. When
comparing the difference in critical thinking skills at pre-test and post-test points within groups,
no significant difference was found. Thus, the study results indicated that there was no
significant difference in critical thinking skills for participants within and between groups.
In addition, the difference in participants’ satisfaction and self-confidence in learning was
compared using the Mann-Whitney U test. The results demonstrated that participants in the
face-to-face group were more satisfied with their current learning than those in the virtual
simulation group. Participants enjoyed the face-to-face simulation more than the virtual
simulation and believed the manner in which their instructors taught the simulation was more
suitable to their learning. In terms of their self-confidence in learning, participants in the face-to-
face simulation group were more confident about the helpfulness of resources used by their
instructors in teaching the simulation and knew how to get help if needed when they had
questions about the concepts covered in the simulation.
CHAPTER FIVE: DISCUSSION OF FINDINGS
Nurses are expected to have sufficient critical thinking skills in order to solve problems
and provide safe patient care to meet the demand of the growing aging population. However,
anticipated faculty and nurse shortages, and a decrease in available clinical sites imposed
challenges on educators to cultivate competent nurses. Simulations using high-fidelity human
patient simulators are widely accepted and used, as these allow students to apply knowledge and
clinical skills in a safe environment without causing harm to real patients. However, the costs of
building and maintaining a simulation laboratory are very high, and current research on their
effectiveness in increasing students’ critical thinking skills remains inconclusive. Due to the
advancement of technology and a new generation of tech-savvy students, the popularity of
virtual simulations has grown. However, current research examining the effect of virtual
simulation in promoting the development of critical thinking skills is lacking. To help educators
understand whether the expenses of conducting simulations in a laboratory are worthwhile and to
make informed decisions in selecting the most cost-effective method to deliver simulations, it is
imperative to compare the effectiveness of face-to-face simulations using HPS with virtual
simulation in increasing critical thinking skills as well as satisfaction and self-confidence in
learning among students.
The purpose of this quantitative study was to investigate and compare nursing students’
critical thinking skills, satisfaction, and self-confidence in learning when receiving either face-to-
face simulation or virtual simulation training. An experimental, two-group, pre-test/post-test
research design was utilized to answer five research questions addressing the differences in
critical thinking skills between and within these two simulation groups as well as participants’
satisfaction and self-confidence in learning between groups. This study took place in a private,
non-profit university located in the western region of the United States during the fall of 2015. A
total of 57 undergraduate nursing students in their fourth semester of the nursing program were
invited and 52 students agreed to participate in the study. However, three students did not
complete the post-test, which led to the final sample size of 49 participants. All participants
were randomly assigned to either the face-to-face simulation group or the virtual simulation
group and were asked to complete the HSRT to measure their critical thinking skills before
receiving simulations. By the midterm of fall 2015, the HSRT post-test, as well as the
satisfaction and self-confidence in learning survey were given to all participants. An
independent-samples t-test was used to compare the mean HSRT overall scores between groups.
Moreover, the mean HSRT overall scores within groups were compared using a paired-samples
t-test. Finally, a Mann-Whitney U test was used to determine whether there was a difference in
satisfaction and self-confidence in learning between groups.
Discussion of Findings
The first research question asked if there was a difference in critical thinking skills of
participants who received face-to-face simulation as compared to those who participated in the
virtual simulation. The results of the independent-samples t-test indicated that there was no
significant difference in critical thinking skills between groups, despite the fact that the mean
HSRT post-test scores of participants in the virtual simulation group were higher than those of
the students in the face-to-face simulation group.
Research directly comparing the differences in critical thinking skills among
undergraduate nursing students who receive either face-to-face simulation or virtual simulation
training is limited. The review of literature indicated that virtual simulation was a better teaching
strategy than classroom instruction in improving clinical judgment skills (Weatherspoon &
Wyatt, 2012), knowledge transfer of communication skills, problem-solving skills, and
prioritization (Tschannen et al., 2012), but no differences were demonstrated when compared
with human patient simulations in increasing nursing students’ clinical judgment skills (Howard,
2013), competency caring for acute patients (Johnson et al., 2014), and leadership skills
(Youngblood et al., 2008). In addition, face-to-face simulation was also found to be effective in
improving critical thinking skills when compared with interactive case studies (Howard, 2007)
and classroom instruction (Schumacher, 2004), but the results of some studies demonstrated no
significant differences when compared with different teaching strategies that were used alone or
in combinations with simulation or other methods (Brown & Chronister, 2009; Maneval et al.,
2012; Ravert, 2008). The study’s results were consistent with one of the reviewed studies
(Johnson & Johnson, 2014) in that the improvement in critical thinking skills did not differ when
simulations were conducted using either HPS in a laboratory or CD-ROMs. The other notable
finding in this study was that participants in the virtual simulation group actually had better
critical thinking skills than those in the face-to-face simulation group as evidenced by the higher
mean overall HSRT scores on the post-test. As a result, virtual simulation may be viewed as an
instructional strategy that is at least as effective as, and potentially more cost-effective than,
traditional face-to-face simulation. This result contributes to the emerging literature in support of the use of
virtual simulation.
The second research question asked whether a significant difference in critical thinking
skills exists before and after participants receive virtual simulation. Even though the mean
overall HSRT scores of the participants in the virtual simulation group at the post-test were
higher than at the pre-test, a statistically significant difference was not detected. The current
studies demonstrated that the use of virtual simulation was statistically significant in improving
teamwork (Kalisch et al., 2015), but not in increasing technological skills (Aebersold et al.,
2012b). There are no studies that directly examine the gain in critical thinking skills before and
after receiving virtual simulation.
The third research question asked whether participants in the face-to-face simulation
group developed better critical thinking skills after attending instructor-led simulations using
HPS. The study found that the mean HSRT overall score increased on the post-test, but no
statistically significant improvement in critical thinking skills was determined. This finding was
consistent with the study conducted by Shinnick and Woo (2013). The result of the study also
reinforces the inconclusiveness of the current literature on the effect of human patient
simulations in promoting the development of nursing students’ critical thinking skills.
The small sample size and short duration of data collection in this study may have contributed to
the nonsignificant differences in the improvement of critical thinking skills between and within the
face-to-face simulation group and the virtual simulation group, even though both groups had
better mean HSRT overall scores on the post-test. Some may also argue that the improved mean
overall HSRT scores for both groups on the post-test might be caused by other factors, such as
work experience, practicums, and attending lecture classes rather than by the simulations. These
factors are not controllable in educational settings, and randomization of group assignment was used to
minimize the effects of these confounding factors.
The fourth research question asked if satisfaction with current learning differs between
the participants who were in the face-to-face simulation group and those in the virtual simulation
group. The results of the study indicated the students were more satisfied with face-to-face
simulation than they were with virtual simulation. There was no study directly comparing
satisfaction between these two types of simulation; however, positive results were reported in
research evaluating satisfaction among students who received either virtual or face-to-face
simulation. When considering satisfaction alongside students’ critical thinking skills, it is interesting
to see that participants in the virtual simulation group who were less satisfied with their learning
actually had more improvement in their critical thinking skills. One possible explanation is that
the participants receiving virtual simulation were required to complete guided debriefing
questions weekly in writing, while the students in the face-to-face group did a verbal debriefing
during class. Petranek (2000) pointed out that written debriefing facilitates better learning, which
may explain why students in the virtual simulation group had better critical thinking skills as
measured by the HSRT. In addition, students were not familiar with the layout of the vSim for Nursing
program at the beginning of the semester and, therefore, needed to spend extra time to navigate
the program. Finally, the students who received virtual simulation were required to achieve at
least 80% on each simulation scenario and its corresponding post-simulation quiz. The majority
of the students reported that they had to repeat the scenarios several times in order to meet the
expectations, while the students in the face-to-face simulation group were not required to redo
each scenario after class.
The last research question asked if there was a significant difference in the self-
confidence in learning of students who are trained via face-to-face simulation and those trained
in virtual simulation. There was no significant difference in participants’ self-confidence in
learning between these two groups when the median overall scores were compared. However,
participants in the face-to-face simulation group were more confident that they knew how to obtain
assistance when they were unclear about concepts taught in class and that their instructors provided
helpful resources to facilitate their learning. This finding is consistent with current
literature in which students reported having improved confidence after attending simulations, but
no difference was found when comparing teaching methods. According to Smith and Roehrs
(2009), clearly defined learning objectives and scenarios that are appropriate but challenging to
students’ current level of knowledge and skills promoted higher satisfaction and self-confidence.
All participants in both groups were provided with a clearly written syllabus with course
outcomes, as well as written learning objectives for the weekly simulation scenarios. The weekly
scenario was also based on the content already taught in the didactic classes. A preference for
weekly personal contact with an instructor may explain why the students in the face-to-face
simulation group felt more confident about obtaining the help they needed.
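Because satisfaction and self-confidence were measured with Likert-scale items and compared by medians, the non-parametric Mann-Whitney U test named in the study's analysis plan is the relevant procedure for these group comparisons. The sketch below uses hypothetical self-confidence totals to show the mechanics only.

```python
# Minimal sketch of a Mann-Whitney U test comparing self-confidence totals
# between groups. The Likert-based totals below are hypothetical.
from scipy import stats

face_to_face_conf = [36, 34, 38, 35, 37, 33, 36, 39]
virtual_conf = [34, 33, 36, 35, 34, 32, 35, 36]

u_stat, p_value = stats.mannwhitneyu(face_to_face_conf, virtual_conf, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
# A p-value of .05 or greater would correspond to the reported absence of a
# significant overall difference in self-confidence between the two groups.
```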
Implications for Practice
There are several implications for practice in nurse education. First, the results of the
study provide nurse educators with an alternative instructional method for conducting simulations
that is less expensive but may be as effective as traditional face-to-face simulation.
Virtual simulation can easily be incorporated into online nursing programs and offered to
distance learners. As the demand for online education has grown dramatically, the results of the
study may help nurse educators in designing online curricula. This approach is mutually
beneficial to nursing schools that need to increase revenues and reduce expenses as well as to
online learners who demand flexible class schedules due to other obligations or difficulties
commuting to campuses. Virtual simulation is also a more suitable learning tool for students
who prefer to learn at their own pace.
Second, the results indicated that the participants had improved critical thinking skills
after receiving virtual simulation, as demonstrated by the better HSRT scores on the post-test.
Therefore, virtual simulation can be introduced at the beginning of a nursing program, in addition
to regular classroom instruction or clinical assignments, to foster the development of critical
thinking skills over time. Moreover, because virtual simulation may be considered as effective as face-to-face
simulation using HPS, it may be used to substitute for real clinical hours due to the limited
availability of clinical sites or to help students make up clinical hours missed due to unavoidable
circumstances. Finally, it can also be used as a remediation tool to refine students’ learning and
provide them with more opportunities to practice thinking critically to make appropriate clinical
judgments.
Third, the results of the study may also stimulate vendors or leading organizations in
nurse education to develop more innovative and higher-fidelity virtual simulation programs, given
the effectiveness of virtual simulation and the potential profits in this market. Developers of
future virtual simulation programs should build partnerships with
healthcare professionals and organizations in order to create authentic scenarios that truly reflect
the clinical situations and patient responses encountered in real practice. The Electronic Virtual
Patients project, funded by the European Commission to create 320 standardized virtual
simulation scenarios, serves as an example that can be followed in the United States.
Finally, the results of the study indicated that virtual simulation was as effective as
traditional face-to-face simulation in increasing participants’ self-confidence. Students may feel
more confident in providing care to their patients after participating in simulations. Therefore,
students may feel less anxious and be able to think critically about what to do while solving
problems in real clinical settings. Patients and their family members may also trust their nurses
and the care provided to them because nurses demonstrate assurance. A professional relationship
can then be built among nurses, patients, and family members.
Recommendations for Future Research
The first recommendation for future research is to use a larger sample size and a variety
of settings and populations in order to improve generalizability, such as students in associate
degree nursing programs or licensed nurses working in healthcare. Moreover, most of the existing studies
utilized medical/surgical scenarios when conducting simulations; therefore, future research
should investigate improvement in critical thinking skills, satisfaction, and self-confidence in
learning using scenarios in different clinical specialties, such as nursing fundamentals,
pharmacology, and community health nursing. Finally, a large scale study involving students
from national and international schools and with different cultural backgrounds should also be
conducted to examine and compare the effects of virtual and face-to-face simulation on students’
critical thinking skills, satisfaction, and self-confidence in learning.
Second, the development of critical thinking skills does not occur over a short period of
time. Future studies should be conducted over an extended time frame to determine whether
there is any change in critical thinking skills over a longer term. Future studies should also focus
on examining whether the number of simulation scenarios influences nursing students' critical
thinking skills, satisfaction, and self-confidence in learning. Sullivan-Mann et al. (2009)
compared the critical thinking skills of students who enrolled in an associate nursing program
and found that students who completed five simulation scenarios using human patient simulators
had better critical thinking skills than those who finished just two scenarios. Their study could be
repeated in the context of virtual simulation to determine the learning value of this innovative
instructional method.
Third, researchers can also conduct studies to examine whether a combination of virtual
simulation and other teaching strategies causes any differences in the development of critical
thinking skills, satisfaction, and self-confidence in learning among nursing students.
Schumacher (2004) compared the differences in critical thinking skills and learning outcomes of
nursing students who were randomly assigned to receive classroom instruction, face-to-face
simulation, or a combination of the two. The results showed that both simulation alone and the
combination of simulation and classroom teaching increased students’ critical thinking skills and
learning outcomes. A modification of Schumacher’s study could be conducted to investigate the
effectiveness of a combination of virtual simulation and face-to-face human patient simulation in
increasing critical thinking abilities.
Fourth, more research may be conducted to examine the development of critical thinking,
satisfaction, and self-confidence in learning using different virtual simulation programs available
to nurse educators. While Second Life is the most commonly used virtual simulation platform in
current studies, this dissertation studied the use of the vSim for Nursing Medical/Surgical program.
There are more virtual simulation programs, such as CliniSpace and Real Life, which educators
can choose from; however, the current literature has not provided sufficient evidence to support
the effectiveness of various virtual simulation programs. More studies should be conducted to
examine the effectiveness of various virtual simulation programs in increasing critical thinking
skills to help educators make informed decisions.
Finally, the results of this study showed participants in the virtual simulation group
were less satisfied with their current learning than were students in the face-to-face group.
However, this study did not examine the reasons behind the lower satisfaction. Qualitative studies
may be conducted in the future to identify the causes of this lower satisfaction in order to provide
constructive recommendations for educators to incorporate virtual simulation into nursing
curricula and for developers to create more innovative virtual simulation programs.
Conclusion
Based on the results of this quantitative study, virtual simulations may be as effective as
traditional face-to-face simulations in helping nursing students develop critical thinking skills.
However, participants in the face-to-face simulation group felt more satisfied than did those in
the virtual simulation group, while self-confidence in learning did not differ between the groups.
These findings add to the emerging literature supporting the use of virtual simulation. Due to its
effectiveness in promoting the development of critical thinking skills, lower cost, and
accessibility, nurse educators may incorporate virtual simulation in online programs for online
learners. Educators should also start using virtual simulation to teach freshman nursing students
and as a substitute for missed clinical hours. Nursing education organizations should work with
vendors and other healthcare professionals in developing higher-fidelity virtual patients and
scenarios that truly reflect clinical situations. Better professional relationships with patients and
their family members may then be built due to increased competence and self-confidence.
Nonetheless, despite the promising results, the generalizability of this study is affected by its
limitations. Future research should be conducted over a longer period of time and include a
larger sample size from a variety of contexts. More studies should also focus on investigating
the effectiveness of a combination of virtual simulation with other teaching strategies or the use
of other virtual simulation programs in improving critical thinking skills, satisfaction, and self-
confidence in learning. Finally, studies should be conducted to identify why students who receive
virtual simulation report lower satisfaction than those who receive face-to-face simulation.
References
Abdo, A., & Ravert, P. (2006). Student satisfaction with simulation experiences. Clinical
Simulation in Nursing Education, 2(1), e13-e16.
Aebersold, M., Tschannen, D., & Bathish, M. (2012a). Innovative simulation strategies in
education. Nursing Research and Practice, 2012, 765212-7. doi:10.1155/2012/765212
Aebersold, M., Tschannen, D., Stephens, M., Anderson, P., & Lei, X. (2012b). Second Life: A
new strategy in educating nursing students. Clinical Simulation in Nursing, 8(9), e467-
e475.
Aebersold, M., & Tschannen, D. (2013). Simulation in nursing practice: The impact on patient
care. Online Journal of Issues in Nursing, 18(2), 83-94.
Alameddine, M., Baumann, A., Laporte, A., & Deber, R. (2012). A narrative review on the effect
of economic downturns on the nursing labour market: Implications for policy and
planning. Human Resources for Health, 10(1), 23-23. doi:10.1186/1478-4491-10-23
Alinier, G., Hunt, B., Gordon, R., & Harwood, C. (2006). Effectiveness of intermediate-fidelity
simulation training technology in undergraduate nursing education. Journal of Advanced
Nursing, 54(3), 359-369. doi:10.1111/j.1365-2648.2006.03810.x
Allaire, J. L. (2015). Assessing critical thinking outcomes of dental hygiene students utilizing
virtual patient simulation: A mixed methods study. Journal of Dental Education, 79(9),
1082-1092.
Allen, L. E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in
the United States. Retrieved from
http://www.onlinelearningsurvey.com/reports/changingcourse.pdf
American Association of Colleges of Nursing (2008). The essentials of baccalaureate education
for professional nursing practice. Retrieved from http://www.aacn.nche.edu/education-
resources/BaccEssentials08.pdf
American Association of Colleges of Nursing (2011). Nursing fact sheet. Retrieved from
http://www.aacn.nche.edu/media-relations/fact-sheets/nursing-fact-sheet
American Association of Colleges of Nursing (2015a). Nursing faculty shortage. Retrieved from
http://www.aacn.nche.edu/media-relations/fact-sheets/nursing-faculty-shortage
Bambini, D., Washburn, J., & Perkins, R. (2009). Outcomes of clinical simulation for novice
nursing students: Communication, confidence, clinical judgment. Nursing Education
Perspectives, 30(2), 79-82.
Berkow, S., Virkstis, K., Stewart, J., & Conway, L. (2009). Assessing new graduate nurse
performance. Nurse Educator, 34(1), 17-22. doi:10.1097/01.NNE.0000343405.90362.15
Blum, C. A., Borglund, S., & Parcells, D. (2010). High-fidelity nursing simulation: Impact on
student self-confidence and clinical competence. International Journal of Nursing
Education Scholarship, 7(1), 1-14. doi:10.2202/1548-923X.2035
Brannan, J. D., White, A., & Bezanson, J. L. (2008). Simulator effects on cognitive skills and
confidence levels. The Journal of Nursing Education, 47(11), 495-500.
doi:10.3928/01484834-20081101-01
Brigham, C. (1993). Nursing education and critical thinking: Interplay of content and thinking.
Holistic Nursing Practice, 7(3), 48-54.
Broom, M., Lynch, M., & Preece, W. (2009). Using online simulation in child health nurse
education. Paediatric Nursing, 21(8), 32-36.
Brown, D., & Chronister, C. (2009). The effect of simulation learning on critical thinking and
self-confidence when incorporated into an electrocardiogram nursing course. Clinical
Simulation in Nursing, 5(1), e45-e52. doi:10.1016/j.ecns.2008.11.001
Byrne, J., Heavey, C., & Byrne, P. J. (2010). A review of web-based simulation and supporting
tools. Simulation Modelling Practice and Theory, 18(3), 253-276.
doi:10.1016/j.simpat.2009.09.013
Butler, K. W., Veltre, D. E., & Brady, D. (2009). Implementation of active learning pedagogy
comparing low-fidelity simulation versus high-fidelity simulation in pediatric nursing
education. Clinical Simulation in Nursing, 5(4), e129-e136.
doi:10.1016/j.ecns.2009.03.118
Cant, R. P., & Cooper, S. J. (2014). Simulation in the internet age: The place of web-based
simulation in nursing education. An integrative review. Nurse Education Today, 34(12),
1435-1442. doi:10.1016/j.nedt.2014.08.001
Cato, M. L. (2012). Chapter 1: Using simulation in nursing education. In P. R. Jeffries (Ed.),
Simulation in nursing education: From conceptualization to evaluation second edition
(pp. 1-11). New York, NY: National League for Nursing.
Chan, Z. C. Y. (2013). A systematic review of critical thinking in nursing education. Nurse
Education Today, 33(3), 236-240. doi: 10.1016/j.nedt.2013.01.007
Chang, M. J., Chang, Y., Kuo, S., Yang, Y., & Chou, F. (2011). Relationships between critical
thinking ability and nursing competence in clinical nurses. Journal of Clinical Nursing,
20(21-22), 3224-3232. doi:10.1111/j.1365-2702.2010.03593.x
Cimiotti, J. P., Aiken, L. H., Sloane, D. M., & Wu, E. S. (2012). Nurse staffing, burnout, and
health care-associated infection. American Journal of Infection Control, 40(6), 486-490.
doi:10.1016/j.ajic.2012.02.029
Cioffi, J. (2001). Clinical simulations: Development and validation. Nurse Education Today,
21(6), 477-486. doi:10.1054/nedt.2001.0584
Cooper, J., & Taqueti, V. (2004). A brief history of the development of mannequin simulators
for clinical education and training. Quality & Safety in Health Care, 13(Suppl 1), i11-i18.
doi:10.1136/qshc.2004.009886
Critical Thinking Community (2013). A Brief History of the Idea of Critical Thinking. Retrieved
from http://www.criticalthinking.org/pages/a-brief-history-of-the-idea-of-critical-
thinking/408
Daly, W. M. (1998). Critical thinking as an outcome of nursing education. What is it? Why is it
important to nursing practice? Journal of Advanced Nursing, 28(2), 323-331.
doi:10.1046/j.1365-2648.1998.00783.x
Davis, R. L. (2009). Exploring possibilities: Virtual reality in nursing research. Research and
Theory for Nursing Practice, 23(2), 133-147.
Decker, S., Sportsman, S., Puetz, L., & Billings, L. (2008). The evolution of simulation and its
contribution to competency. Journal of Continuing Education in Nursing, 39(2), 74-80.
doi:10.3928/00220124-20080201-06
De Gagne, J. C., Oh, J., Kang, J., Vorderstrasse, A. A., & Johnson, C. M. (2013). Virtual worlds
in nursing education: A synthesis of the literature. The Journal of Nursing Education,
52(7), 391-396. doi:10.3928/01484834-20130610-03
Del Bueno, D. (2005). A crisis in critical thinking. Nursing Education Perspectives, 26(5), 278-
282.
Dutile, C., Wright, N., & Beauchesne, M. (2011). Virtual clinical education: Going the full
distance in nursing education. Newborn and Infant Nursing Reviews, 11(1), 43-48.
doi:10.1053/j.nainr.2010.12.008
Dev, P., Youngblood, P., Heinrichs, W. L., & Kusumoto, L. (2007). Virtual worlds and team
training. Anesthesiology Clinics, 25(2), 321-336. doi:10.1016/j.anclin.2007.03.001
Electronic Virtual Patients (2016). Welcome to eViP website. Retrieved from
http://www.virtualpatients.eu/
Ennis, R. H. (1985). A logical basis for measuring critical thinking skills. Educational
Leadership, 43(2), 44-48.
Evans, N. J., Forney, D. S., Guido, F. M., Patton, L. D. & Renn, K. A. (2010). Student
Development in College: Theory, research, and practice 2nd edition. San Francisco, CA:
Jossey-Bass, Inc.
Facione, P. A. (1990). Critical Thinking: A Statement of Expert Consensus for Purposes of
Educational Assessment and Instruction. Research Findings and Recommendations.
Retrieved from http://files.eric.ed.gov/fulltext/ED315423.pdf
Flores, K. L., Matkin, G. S., Burbach, M. E., Quinn, C. E., & Harding, H. (2012). Deficient
critical thinking skills among college graduates: Implications for leadership. Educational
Philosophy and Theory, 44(2), 212-230. doi:10.1111/j.1469-5812.2010.00672.x
Fong, W. C. K. (2013). Nursing students' satisfaction and self-confidence towards high-fidelity
simulation and its relationship with the development of critical thinking in Hong Kong
(Doctoral dissertation). Available from ProQuest Dissertations & Theses Full Text: The
Humanities and Social Sciences Collection.
Forneris, S. G., & Scroggs, N. (2014). NLN scholars in residence conduct research on virtual
simulation and the clinical faculty role. Nursing Education Perspectives, 35(5), 348.
Foronda, C., Godsall, L., & Trybulski, J. (2013). Virtual clinical simulation: The state of the
science. Clinical Simulation in Nursing, 9(8), e279-e286.
Foronda, C., & Bauman, E. B. (2014). Strategies to incorporate virtual simulation in nursing
education. Clinical Simulation in Nursing, 10(8), 412-418.
Forsberg, E., Georg, C., Ziegert, K., & Fors, U. (2011). Virtual patients for assessment of clinical
reasoning in nursing -- a pilot study. Nurse Education Today, 31(8), 757-762.
doi:10.1016/j.nedt.2010.11.015
Fowler, L. P. (1998). Improving critical thinking in nursing practice. Journal for Nurses in Staff
Development, 14(4), 183-187. doi:10.1097/00124645-199807000-00004
Galloway, S. (2009). Simulation techniques to bridge the gap between novice and competent
healthcare professionals. Online Journal of Issues in Nursing, 14(2), 1-10.
Gates, N. G., Parr, M. B., & Hughen, J. E. (2012). Enhancing nursing knowledge using high-
fidelity simulation. Journal of Nursing Education, 51(1), 9-15. doi: 10.3928/01484834-
20111116-01
Goodstone, L., Goodstone, M. S., Cino, K., Glaser, C. A., Kupferman, K., & Dember-Neal, T.
(2013). Effect of simulation on the development of critical thinking in associate degree
nursing students. Nursing Education Perspectives, 34(3), 159-162.
Hall, R. M. (2013). Effects of high fidelity simulation on knowledge acquisition, self-confidence,
and satisfaction with baccalaureate nursing students using the Solomon-four research
design (Doctoral Dissertation). Available from ProQuest Dissertations & Theses Full
Text: The Humanities and Social Sciences Collection. (UMI No. 3577823)
Hicks-Moore, S. L., & Pastirik, P. J. (2006). Evaluating critical thinking in clinical concept
maps: A pilot study. International Journal of Nursing Education Scholarship, 3(1), 27-
15. doi:10.2202/1548-923X.1314
Hoadley, T. A. (2009). Learning advanced cardiac life support: A comparison study of the
effects of low- and high-fidelity simulation. Nursing Education Perspectives, 30(2), 91-
95.
Hotchkiss, M. A., & Mendoza, S. N. (2001). Update for nurse anesthetists. Part 6. Full-body
patient simulation technology: Gaining experience using a malignant hyperthermia
model. AANA Journal, 69(1), 59-65.
Howard, B. J. (2013). Computer-based versus high-fidelity mannequin simulation in developing
clinical judgment in nursing education (Doctoral dissertation). Available from ProQuest
Dissertations & Theses Global. (UMI No. 3558176)
Howard, V. M. (2007). A comparison of educational strategies for the acquisition of medical-
surgical nursing knowledge and critical thinking skills: Human patient simulator vs. the
interactive care study approach (Doctoral dissertation). Available from ProQuest
Dissertations & Theses Global. (UMI No. 3)
Hovancsek, M., Jeffries, P. A., Escudero, E., Foulds, B. J., Husebø, S. E., Iwamoto, Y., Kelly,
M., Petrini, M., & Wang, A. (2009). Creating simulation communities of practice: An
international perspective. Nursing Education Perspectives, 30(2), 121.
Hyland, J. R., & Hawkins, M. C. (2009). High-fidelity human simulation in nursing education: A
review of literature and guide for implementation. Teaching and Learning in Nursing,
4(1), 14-21. doi:10.1016/j.teln.2008.07.004
Insight Assessment (2013). Health science reasoning test (HSRT). Retrieved from
http://www.insightassessment.com/Products/Products-Summary/Critical-Thinking-Skills-
Tests/Health-Sciences-Reasoning-Test-HSRT
Insight Assessment (2016). Health sciences reasoning test: User manual and resource guide. San
Jose, CA: California Academic Press.
Institute of Medicine (2010). The future of nursing: Focus on education. Retrieved from
http://iom.nationalacademies.org/~/media/Files/Report%20Files/2010/The-Future-of-
Nursing/Nursing%20Education%202010%20Brief.pdf
Ironside, P. M., Jeffries, P. R., & Martin, A. (2009). Fostering patient safety competencies using
multiple-patient simulation experiences. Nursing Outlook, 57(6), 332-337.
doi:10.1016/j.outlook.2009.07.010
Jeffries, P. A. (2005). A framework for designing, implementing, and evaluating simulations
used as teaching strategies in nursing. Nursing Education Perspectives, 26(2), 96-103.
Jeffries, P. A., & Rizzolo, M. A. (2006). NLN/Laerdal project summary report. Designing and
implementing models for the innovative use of simulation to teach nursing care of ill
adults and children: A national, multi-site, multi-method study. Retrieved from
http://www.nln.org/docs/default-source/professional-development-programs/read-the-
nln-laerdal-project-summary-report-pdf.pdf?sfvrsn=0
Jeffries, P. A. (2009). Dreams for the future for clinical simulation. Nursing Education
Perspectives, 30(2), 71.
Jeffries, P. A, & Rogers, K. L. (2012). Chapter 3: Theoretical framework for simulation design.
In P. R. Jeffries (Ed.), Simulation in nursing education: From conceptualization to
evaluation second edition (pp. 25-42). New York, NY: National League for Nursing.
Jenson, C. E., & Forsyth, D. M. (2012). Virtual reality simulation: Using three-dimensional
technology to teach nursing students. Computers, Informatics, Nursing, 30(6), 312-318.
doi:10.1097/NXN.0b013e31824af6ae
Johnson, M. P., Hickey, K. T., Scopa-Goldman, J., Andrews, T., Boerem, P., Covec, M., &
Larson, E. (2014). Manikin versus web-based simulation for advanced practice nursing
students. Clinical Simulation in Nursing, 10(6), e317-e323.
doi:10.1016/j.ecns.2014.02.004
Johnson, D., & Johnson, S. (2014). The effects of using a human patient simulator compared to a
CD-ROM in teaching critical thinking and performance. U.S. Army Medical Department
Journal, 59-64.
Jones, S. A., & Brown, L. N. (1991). Critical thinking: Impact on nursing education. Journal of
Advanced Nursing, 16(5), 529-533. doi:10.1111/j.1365-2648.1991.tb01687.x
Jones, J. S., Hunt, S. J., Carlson, S. A., & Seamon, J. P. (1997). Assessing bedside cardiologic
examination skills using "Harvey," a cardiology patient simulator. Academic Emergency
Medicine, 4(10), 980-985. doi:10.1111/j.1553-2712.1997.tb03664.x
Juraschek, S. P., Zhang, X., Ranganathan, V., & Lin, V. W. (2012). United States registered
nurse workforce report card and shortage forecast. American Journal of Medical Quality,
27(3), 241-249. doi:10.1177/1062860611416634
Kaddoura, M. A. (2010). New graduate nurses’ perceptions of the effects of clinical simulation
on their critical thinking, learning, and confidence. The Journal of Continuing Education
in Nursing, 41(11), 506-516. doi:10.3928/00220124-20100701-02
Kalisch, B. J., Aebersold, M., McLaughlin, M., Tschannen, D., & Lane, S. (2015). An
intervention to improve nursing teamwork using virtual simulation. Western Journal of
Nursing Research, 37(2), 164-179.
Kardong-Edgren, S., Adamson, K. A., & Fitzgerald, C. (2010). A review of currently published
evaluation instruments for human patient simulation. Clinical Simulation in Nursing,
6(1), e25-e35. doi:10.1016/j.ecns.2009.08.004
Kataoka-Yahiro, M., & Saylor, C. (1994). A critical thinking model for nursing judgment. The
Journal of Nursing Education, 33(8), 351-356.
Kilmon, C. A., Brown, L., Ghosh, S., & Mikitiuk, A. (2010). Immersive virtual reality
simulations in nursing education. Nursing Education Perspectives, 31(5), 314-317.
Kolb, D. A. (2014). Experiential learning: Experience as the source of learning and development
(2nd ed.). Upper Saddle River, NJ: Pearson Education.
Kuznar, K. A. (2007). Associate degree nursing students' perceptions of learning using a high-
fidelity human patient simulator. Teaching and Learning in Nursing, 2(2), 46-52.
doi:10.1016/j.teln.2007.01.009
Lapkin, S., Levett-Jones, T., Bellchambers, H., & Fernandez, R. (2010). Effectiveness of patient
simulation manikins in teaching clinical reasoning skills to undergraduate nursing
students: A systematic review. Clinical Simulation in Nursing, 6(6), e207-e222.
doi:10.1016/j.ecns.2010.05.005
Lapkin, S., & Levett-Jones, T. (2011). A cost–utility analysis of medium vs. high-fidelity human
patient simulation manikins in nursing education. Journal of Clinical Nursing, 20(23-24),
3543-3552. doi:10.1111/j.1365-2702.2011.03843.x
Lasater, K. (2007). High-fidelity simulation and the development of clinical judgment: Students'
experiences. Journal of Nursing Education, 46(6), 269-276.
Leighton, K. (2013). Chapter 29: Simulation in nursing. In A. I. Adam, S. DeMaria, A. D.
Schwartz, & A. J. Sim (Eds.), The comprehensive textbook of healthcare simulation (pp.
425-436). New York, NY: Springer.
Levett-Jones, T., Lapkin, S., Hoffman, K., Arthur, C., & Roche, J. (2011). Examining the impact
of high and medium fidelity simulation experiences on nursing students’ knowledge
acquisition. Nurse Education in Practice, 11(6), 380-383. doi:10.1016/j.nepr.2011.03.014
Lipman, M. (1988). Critical thinking--what can it be? Educational Leadership, 46(1), 38-43.
Lipman, T. H., & Deatrick, J. A. (1997). Preparing advanced practice nurses for clinical decision
making in specialty practice. Nurse Educator, 22(2), 47-50. doi:10.1097/00006223-
199703000-00018
Ma, X. (2013). BSN students’ perception of satisfaction and self-confidence after a simulated
mock code experience: A descriptive study (Master's thesis). Retrieved from
http://digitalcommons.cedarville.edu/nursing_theses/2/
Maneval, R., Fowler, K. A., Kays, J. A., Boyd, T. M., Shuey, J., Harne-Britner, S., & Mastrine,
C. (2012). The effect of high-fidelity patient simulation on the critical thinking and
clinical decision-making skills of new graduate nurses. The Journal of Continuing
Education in Nursing, 43(3), 125-134. doi:10.3928/00220124-20111101-02
Matthews, C. A., & Gaul, A. L. (1979). Nursing diagnosis from the perspective of concept
attainment and critical thinking. Advances in Nursing Science, 2(1), 17-26.
McAfooes, J., Childress, R. M., Jeffries, P. R., & Feken, C. (2012). Chapter 10: Using
collaboration to enhance the effectiveness of simulated learning in nursing education. In
P. R. Jeffries (Ed.), Simulation in nursing education: From conceptualization to
evaluation second edition (pp. 197-215). New York, NY: National League for Nursing.
McCallum, J., Ness, V., & Price, T. (2011). Exploring nursing students' decision-making skills
whilst in a second life clinical simulation laboratory. Nurse Education Today, 31(7), 699-
704. doi:10.1016/j.nedt.2010.03.010
McIntosh, C., Macario, A., Flanagan, B., & Gaba, D. M. (2006). Simulation: What does it really
cost? Simulation in Healthcare: Journal of the Society for Simulation in Healthcare, 1(2),
109. doi:10.1097/01266021-200600120-00041
McNeal, G. J. (2010). Simulation and nursing education. The ABNF Journal: Official Journal of
the Association of Black Nursing Faculty in Higher Education, Inc, 21(4), 78-78.
Miller, M., & Jensen, R. (2014). Avatars in nursing: An integrative review. Nurse Educator,
39(1), 38-41. doi:10.1097/01.NNE.0000437367.03842.63
Myrick, F. (2002). Preceptorship and critical thinking in nursing education. The Journal of
Nursing Education, 41(4), 154-164.
National League for Nursing (2016). vSim for nursing. Retrieved from
http://www.nln.org/centers-for-nursing-education/nln-center-for-innovation-in-
simulation-and-technology/vsim-for-nursing-medical-surgical
Needleman, J., Buerhaus, P., Pankratz, V. S., Leibson, C. L., Stevens, S. R., & Harris, M. (2011).
Nurse staffing and inpatient hospital mortality. The New England Journal of Medicine,
364(11), 1037-1045. doi:10.1056/NEJMsa1001025
Nehring, W. M. (2008). U.S. boards of nursing and the use of high-fidelity patient simulators in
nursing education. Journal of Professional Nursing, 24(2), 109-117.
doi:10.1016/j.profnurs.2007.06.027
Nehring, W. M., & Lashley, F. R. (2009). High-fidelity patient simulation in nursing education.
Burlington, MA: Jones & Bartlett Learning.
Norman, J. (2012). Systematic review of the literature on simulation in nursing education. The
ABNF Journal: Official Journal of the Association of Black Nursing Faculty in Higher
Education, Inc, 23(2), 24-28.
Oermann, M. H. (1997). Evaluating critical thinking in clinical practice. Nurse Educator, 22(5),
25-28. doi:10.1097/00006223-199709000-00011
Paul, R. (1990). Critical thinking: What every person needs to survive in a rapidly changing
world. Retrieved from https://d3jc3ahdjad7x7.cloudfront.net/W1zUdGRxQbu1ulF9wlL
7il2eOYBWE5CqbxvS7Bxc0fv29IS0.pdf
Pearson Higher Education (2010). Chapter 1: What is critical thinking? Retrieved from
https://www.pearsonhighered.com/assets/hip/us/hip_us_pearsonhighered/samplechapter/0
134019466.pdf
Peteani, L. A. (2004). Enhancing clinical practice and education with high-fidelity human patient
simulators. Nurse Educator, 29(1), 25-30. doi:10.1097/00006223-200401000-00008
Petranek, C. F. (2000). Written debriefing: The next vital step in learning with simulations.
Simulation & Gaming, 31(1), 108-118.
Profetto-McGrath, J. (2005). Critical thinking and evidence-based practice. Journal of
Professional Nursing, 21(6), 364-371. doi:10.1016/j.profnurs.2005.10.002
Ravert, P. (2008). Patient simulator sessions and critical thinking. The Journal of Nursing
Education, 47(12), 557-562. doi:10.3928/01484834-20081201-06
Roth, M. S. (2010). Beyond critical thinking. Retrieved from
http://acad.erskine.edu/facultyweb/gore/Critical%20Thinking.pdf
Sanford, P. G. (2010). Simulation in nursing education: A review of the research. The
Qualitative Report, 15(4), 1006-1011.
Scheffer, B. K., & Rubenfeld, M. G. (2000). A consensus statement on critical thinking in
nursing. The Journal of Nursing Education, 39(8), 352-359.
Schoening, A. M., Sittner, B. J., & Todd, M. J. (2006). Simulated clinical experience: Nursing
students' perceptions and the educators' role. Nurse Educator, 31(6), 253-258.
doi:10.1097/00006223-200611000-00008
Schumacher, L. B. (2004). The impact of utilizing high-fidelity computer simulation on critical
thinking abilities and learning outcomes in undergraduate nursing students: Abstract
(Doctoral Dissertation). Available from ProQuest Dissertations & Theses Global
Seropian, M. A., Brown, K., Gavilanes, J. S., & Driggers, B. (2004). Simulation: Not just a
manikin. Journal of Nursing Education, 43(4), 164-169. doi:10.3928/01484834-
20040401-04
Shin, S., Park, J., & Kim, J. (2015). Effectiveness of patient simulation in nursing education:
Meta-analysis. Nurse Education Today, 35(1), 176-182. doi:10.1016/j.nedt.2014.09.009
Shinnick, M. A., & Woo, M. A. (2013). The effect of human patient simulation on critical
thinking and its predictors in prelicensure nursing students. Nurse Education Today,
33(9), 1062-1067. doi:10.1016/j.nedt.2012.04.004
Simpson, R. L. (2002). The virtual reality revolution: Technology changes nursing education.
Nursing Management (Springhouse), 33(9), 14-15. doi:10.1097/00006247-200209000-
00007
Simpson, E., & Courtney, M. (2002). Critical thinking in nursing education: Literature review.
International Journal of Nursing Practice, 8(2), 89-98. doi:10.1046/j.1440-
172x.2002.00340.x
Simulation Innovative Resource Center (n.d.). SIRC Glossary. Retrieved from
http://sirc.nln.org/mod/glossary/view.php?id=183&mode=letter&hook=S&sortkey=&sort
order=
Sinclair, B., & Ferguson, K. (2009). Integrating simulated teaching/learning strategies in
undergraduate nursing education. International Journal of Nursing Education
Scholarship, 6(1), 7-11. doi:10.2202/1548-923X.1676
Slalas, M., & Neber, B. (1949). Demonstrations in Miniature: With chicken bones and doll beds,
nursing students provide realistic teaching aids for the study of fractures and traction.
American Journal of Nursing, 49(3), 172-173.
Smith, S. J., & Roehrs, C. J. (2009). High-fidelity simulation: Factors correlated with nursing
student satisfaction and self-confidence. Nursing Education Perspectives, 30(2), 74-78.
Staib, S. (2003). Teaching and measuring critical thinking: 1. Journal of Nursing Education,
42(11), 498-508.
Stokowski, L. A. (2013). A digital revolution: Games, simulations, and virtual worlds in nursing
education. Retrieved from http://www.medscape.com/viewarticle/780819
Sullivan-Mann, J., Perron, C. A., & Fellner, A. N. (2009). The effects of simulation on nursing
students' critical thinking scores: A quantitative study. Newborn and Infant Nursing
Reviews, 9(2), 111-116. doi:10.1053/j.nainr.2009.03.006
Tanner, C. A. (1997). Spock would have been a terrible nurse (and other issues related to critical
thinking in nursing). The Journal of Nursing Education, 36(1), 3-4.
Tanner, C. A. (2006). Thinking like a nurse: A research-based model of clinical judgment in
nursing. The Journal of Nursing Education, 45(6), 204-211.
Terry, J. A., & Whitman, M. V. (2011). Impact of the economic downturn on nursing schools.
Nursing Economics, 29(5), 252-256.
Tiala, S. (2006). Integrating virtual reality into technology education labs. The Technology
Teacher, 66(4), 9-13.
Tiffen, J., Corbridge, S., Shen, B. C., & Robinson, P. (2011). Patient simulator for teaching heart
and lung assessment skills to advanced practice nursing students. Clinical Simulation in
Nursing, 7(3), e91-e97. doi:10.1016/j.ecns.2009.10.003
Tosterud, R., Hedelin, B., & Hall-Lord, M. L. (2013). Nursing students' perceptions of high- and
low-fidelity simulation used as learning methods. Nurse Education in Practice, 13(4),
262-270. doi:10.1016/j.nepr.2013.02.002
Tschannen, D., Aebersold, M., McLaughlin, E., Bowen, J., & Fairchild, J. (2012). Use of virtual
simulations for improving knowledge transfer among baccalaureate nursing students.
Journal of Nursing Education and Practice, 2(3), 15-24.
Tubbs-Cooley, H. L., Cimiotti, J. P., Silber, J. H., Sloane, D. M., & Aiken, L. H. (2013). An
observational study of nurse staffing ratios and hospital readmission among children
admitted for common conditions. BMJ Quality & Safety, 22(9), 735-742.
Vincent, M. A., Sheriff, S., & Mellott, S. (2015). The efficacy of high-fidelity simulation on
psychomotor clinical performance improvement of undergraduate nursing students.
Computers, Informatics, Nursing, 33(2), 78-84.
Warburton, S. (2009). Second life in higher education: Assessing the potential for and the
barriers to deploying virtual worlds in learning and teaching. British Journal of
Educational Technology, 40(3), 414-426. doi:10.1111/j.1467-8535.2009.00952.x
Ward-Smith, P. (2008). The effect of simulation learning as a quality initiative. Urologic
Nursing, 28(6), 471-473.
Weatherspoon, D. L., & Wyatt, T. H. (2012). Testing computer-based simulation to enhance
clinical judgment skills in senior nursing students. Nursing Clinics of North America,
47(4), 481-491.
Weaver, A. (2011). High-fidelity patient simulation in nursing education: An integrative review.
Nursing Education Perspectives, 32(1), 37-40. doi:10.5480/1536-5026-32.1.37
Wilford, A., & Doyle, T. J. (2006). Integrating simulation training into the nursing curriculum.
British Journal of Nursing, 15(17), 926-930. doi:10.12968/bjon.2006.15.17.21907
Wilson, R. D., Klein, J. D., & Hagler, D. (2014). Computer-based or human patient simulation-
based case analysis: Which works better for teaching diagnostic reasoning skills? Nursing
Education Perspectives, 35(1), 14-18. doi:10.5480/11-515.1
Worrell, J. A., & Profetto-McGrath, J. (2007). Critical thinking as an outcome of context-based
learning among post RN students: A literature review. Nurse Education Today, 27(5),
420-426. doi:10.1016/j.nedt.2006.07.004
Young, J. R. (2010). After frustrations in second life, colleges look to new virtual worlds; the
hype is gone, but not the interest, and professors think some emerging projects may have
instructional staying power. The Chronicle of Higher Education, 56(23).
Youngblood, P., Harter, P. M., Srivastava, S., Moffett, S., Heinrichs, W. L., & Dev, P. (2008).
Design, development, and evaluation of an online virtual emergency department for
training trauma teams. Simulation in Healthcare, 3(3), 146-153.
doi:10.1097/SIH.0b013e31817bedf7
Appendix A
Mrs. Chase
Nehring & Lashley, 2009, p. 530
Appendix B
Demographic Questionnaire
Dear Participant:
Please complete the questionnaire. This questionnaire will only be viewed by the principal
investigator and will be destroyed upon the completion of the study. Thank you for your
cooperation.
1. Age:
□ Less than 25 years old
□ 26-35 years old
□ Greater than 36 years old
2. Sex: _______
3. Cumulative Nursing GPA: _______
4. Healthcare Experience (Please circle one that best describes you).
□ Only program Practicums
Worked in a healthcare setting as a:
□ Certified Nurse Aide (CNA)
□ Licensed Practical Nurse (LPN)
□ Ward Clerk
□ Other (Please specify): __________
5. Years of experience in healthcare (excluding program practicums).
□ Less than one year
□ One to two years
□ More than two years
Appendix C
NLN Simulation Tools Permission
Appendix D
Satisfaction and Self-Confidence in Learning Survey-Face-to Face Simulation Group
Appendix E
Satisfaction and Self-Confidence in Learning Survey-Virtual Simulation Group
Appendix F
Recruitment Letter
Appendix G
Information Sheet and Consent
Abstract
Healthcare organizations demand that nurses have sufficient critical thinking skills to deal with complex clinical situations, satisfy the needs of patients and family members, and provide high-quality and safe patient care. However, limited clinical training sites and nursing faculty shortages drive nurse educators to conduct high-fidelity simulations in educating future nurses. High-fidelity simulation is costly, however, and its effectiveness in increasing nursing students’ critical thinking skills remains inconclusive. Virtual simulation may serve as an alternative due to its lower cost and accessibility. The purpose of this quantitative study was to investigate and compare the effects of virtual simulation and traditional face-to-face simulation on the acquisition of critical thinking skills, satisfaction, and self-confidence in learning among undergraduate nursing students. The research utilized an experimental, two-group, pre-test and post-test design. Convenience sampling was used, and the final sample size consisted of 49 undergraduate nursing students who were in their fourth semester of the nursing program. The Health Science Reasoning Test (HSRT) was used to measure participants’ critical thinking skills and a Likert-scale survey developed by the National League for Nursing (NLN) was used to evaluate participants’ satisfaction and self-confidence in learning. The data were analyzed using an independent-samples t-test, a paired-samples t-test, and a Mann-Whitney U test. The results suggested that virtual simulation may be as effective as face-to-face simulation in improving critical thinking skills