UNIVERSITY OF SOUTHERN CALIFORNIA
Using Innovative Field Model and Competency-Based Student Learning Outcomes to Level the
Playing Field in Social Work Distance Learning Education
By
Suh Chen Hsiao
A dissertation submitted in partial fulfillment
of the requirements for the degree of
DOCTOR OF POLICY, PLANNING, AND DEVELOPMENT
PRICE SCHOOL OF PUBLIC POLICY
May 2018
Acknowledgments
The completion of this capstone project represents a milestone in my professional
growth and development as well as the final step on a long educational journey that began in Fall
2012. Pursuing a doctoral degree was a dream deeply influenced by my dearest parents,
Kung Shuang and Hong Cheng Hsiao. Without their trust in me, a first-generation immigrant,
to pursue my MSW degree at USC in 1987, I would not have completed my DPPD at USC. I
also want to extend my gratitude to my husband, George P. Lin, and my daughter, Jacqueline P.
Lin, for their unwavering support and patience with my educational endeavors over the last five
years. I hope they have learned from my adventures that education is important, that hard work
brings joy, satisfaction, and rewards, and that age does not matter.
I wish to express my appreciation to my dissertation committee. It has been an honor to
have three exceptional educators. Dr. Deborah Natoli, my Committee Chair, has provided
guidance and challenged me to excel and grow. I am thankful for her commitment to supporting
my work and her belief in my ability. Dr. Peter Robertson, as an instructor and dissertation
committee member, provided valuable feedback, encouragement, and support; without him, this
would not have been possible. And finally, Dr. Marleen Wong has been my supervisor from
LAUSD to USC. For the past seventeen years, she has exemplified all the qualities one could
hope for in a lifelong mentor. Furthermore, the leadership roles entrusted to me by Dean Flynn
and Dr. Paul Maiden have been invaluable for my professional growth and development.
I am extremely fortunate to have had colleagues by my side who contributed along this
journey: Drs. Betsy Phillips, Gary Wood, Jane Yoo, and Kristin Ward (and the Clarus Research
team), as well as the field faculty and administrative team. I have learned from each of them
through the enthusiasm and the great gifts they shared so openly. I am ready and better equipped
to spread my wings toward my next professional endeavors.
Table of Contents
Abstract
Chapter 1: Introduction
    Statement of Problem
    Purpose of the Study: Distance Learning and Field Education
    Theoretical Framework: Competency-Based Training
    An Innovative Field Model: Virtual Field Practicum
    Curriculum Implementation
    Faculty Recruitment and Development
    Curriculum Alignment
    Hypothesis
    Research Design
    Summary
Chapter 2: Review of the Literature
    Introduction
    Background and Structure of Social Work Field Education
    Seminar Instructor (Faculty Liaison)
    Agency-Based MSW Field Instructor
    Field Instructor Evaluations of MSW Students
    First-Year MSW Students’ Reflection and Self-Assessment
    Radical Changes in Social Work Field Models
    Evidence-Based Intervention Trainings
    Studies on Student Competency Assessment and Learning Outcomes
    Conclusion
Chapter 3: Methodology
    Introduction
    Overview of Purpose
    Philosophical Orientation
    Results from 2015 Exploratory Study
    Hypothesis
    Research Design
    Study Setting and Participants
    Procedures
    Measurement and Instrument
    Data Analysis
    Summary
Chapter 4: Results and Data Analysis
    Introduction
    Results
        Competency 1: Demonstrate Ethical and Professional Behavior
        Competency 2: Engage Diversity and Differences in Practice
        Competency 3: Advance Human Rights and Social, Economic, and Environmental Justice
        Competency 4: Engage in Practice-Informed Research and Research-Informed Practice
        Competency 5: Engage in Policy Practice
        Competency 6: Engage with Individuals, Families, Groups, Organizations, and Communities
        Competency 7: Assess Individuals, Families, Groups, Organizations, and Communities
        Competency 8: Intervene with Individuals, Families, Groups, Organizations, and Communities
        Competency 9: Evaluate Practice with Individuals, Families, Groups, Organizations, and Communities
        Similarities and Differences: OTG and VAC Students
        Similarities and Differences: Field Instructor Ratings
    Summary of Findings
Chapter 5: Innovation to Practice
    Introduction
    Outcomes
    Discussion
        Standardized Student Comprehensive Skills Evaluation Tool
        Student Self-Assessment
        Field Instructor Evaluations
    Limitations
    Implications for Practice
    Contributions to Practice
    Conclusion, Recommendations, and Future Studies
References
Appendix A: First-Year Comprehensive Skills Evaluation (CSWE 2015 EPAS)
Appendix B: Student Comprehensive Skills Evaluation Tool (CSWE 2008 EPAS)
Abstract
Purpose: This study aimed to address the lack of research on competency-based student learning
outcomes for social work distance learning. It addressed the following three objectives: (a)
develop an empirically based online assessment tool adapted from the 2015 Council on Social
Work Education’s Educational Policy and Accreditation Standards with specific examples for
students and field instructors; (b) implement this online assessment tool for both on-ground and
Virtual Academic Center Master of Social Work students; and (c) assess Master of Social Work
students’ learning outcomes in their first-year field practicum as the basis of the self-study and
reaffirmation portion of the 2017 Council on Social Work Education reaccreditation.
Theoretical Framework: The theoretical foundation included competency-based training,
accreditation standards for field education set by the Council on Social Work Education, use of
simulation technology, and evidence-based skill trainings for the growing number of master of
social work distance learning field programs.
Methodology: The study featured a quasi-experimental research design with two groups:
campus-based (n = 544) and web-based (n = 710) students. Students enrolled in first-year field
practicum courses during academic year 2015–2016 had one baseline assessment at the end of
the first semester and one follow-up measurement at the end of the second semester via an online
evaluation tool.
Findings: Results show that web-based students completing the innovative Virtual Field
Practicum field model achieved the same student learning outcomes as their campus-based
cohort at the end of their first-year field practicum.
Implications for Social Work Distance Learning and Field Education: The contributions of this
study include providing evidence through student learning outcomes of the effectiveness of the
distance learning field program and a comprehensive field model that could best prepare MSW
students entering their first-year generalist field placements.
Conclusions and Recommendations: This research generated knowledge for the social work
profession that can inform further improvements to field education programs needed to support
field instructors, students, and faculty liaisons in terms of core competencies. Future studies are
crucial to validate the generalist practice measurement and to explore advanced practice
instruments guided by student learning outcomes.
Chapter 1: Introduction
Since its founding in 1880, the University of Southern California (USC) has instilled
in students a moral responsibility to minister to the broader community, including the
various ethnic groups from Europe and Asia that had come to Los Angeles seeking a better life,
along with Mexican Americans who were already in the university’s neighborhood. USC’s
fourth president, Emma Bovard, a member of the Friday Morning Club, created the Women’s
Home Missionary Society to provide services to help the growing foreign populations. In 1904,
Jane Addams came to Los Angeles and met with Bovard to extend professional training related
to settlement houses to students. Under the auspices of Rev. Dana Bartlett and Rockwell Hunt,
the head of the Department of Economics and Sociology, students were assigned to volunteer as
part of the Bethlehem Institute, which later created the first school of social work on the West
Coast in 1906 (Fertig & Rose, 2007).
One hundred and eleven years later, the USC Suzanne Dworak-Peck School of Social
Work is one of the top-ranked master of social work (MSW) programs in the United States. It is
the largest U.S. graduate social work and interdisciplinary clinical research program. The
school’s field department is known for developing nontraditional social work placements, having
a strong military component, preparing students to practice evidence-based interventions (EBIs),
and providing state-of-the-art distance learning education.
According to information provided in a faculty meeting in May 2016 by Marilyn Flynn,
dean of the USC Suzanne Dworak-Peck School of Social Work, the school had 25 full-time
faculty members, 375 students, and 500 field placements in 1997. The students volunteered more
than 400,000 hours in the community during their internships. In 2013, the numbers had
increased to more than 100 full-time faculty members, 280 adjunct professors, 2,800 students,
and more than 5,000 affiliated field placements in 46 states and international territories such as
Guam, Canada, England, Germany, India, and Japan. The students volunteered more than 11.2
million hours during their internships in 2013. In 2017, six years after the school launched a
web-based program, the Virtual Academic Center (VAC), it featured more than 120 full-time
faculty members, more than 350 adjunct professors, and 3,200 students spanning 50 states, along
with more than 10,000 agency partnerships. More than 1,300 students received their MSW
degree at commencement.
In recent years, the school has created a telehealth program that provides mental health
services to Californians; in 2016, the USC Cohen Military Family Clinic opened its doors to
provide free mental health services to veterans and their families transitioning to civilian life in
Los Angeles (USC Suzanne Dworak-Peck School of Social Work, n.d.). Both programs provide
innovative field placements primarily for MSW interns, who are supervised by licensed clinical
social workers (field instructors). These programs have provided extensive learning opportunities
and clinical settings for MSW interns to integrate research-informed practice into a real-world
setting; the intent of both programs is to align with the school’s mission “to improve the well-
being of vulnerable individuals and communities, advance social and economic justice, and
eradicate pressing societal problems in complex and culturally diverse urban environments
throughout Southern California, the nation and the world” (USC Suzanne Dworak-Peck School
of Social Work, n.d.).
The 12 Grand Challenges for Social Work, as defined by the American Academy of
Social Work & Social Welfare (2017), focus on the following global issues: ensuring healthy
development for all youth, closing the health gap, stopping family violence, advancing long and
productive lives, eradicating social isolation, ending homelessness, creating social responses to a
changing environment, harnessing technology for social good, promoting smart decarceration,
reducing extreme economic inequality, building financial capability for all, and achieving equal
opportunity and justice. According to an occupational outlook handbook from the U.S.
Department of Labor (2016), the projected overall social work employment will increase 12%
between 2014 and 2024, which is faster than the average projected 7% growth of employment
across disciplines. Social work employment is expected to grow the most in the fields of health
care (19%); mental health and substance abuse (19%); and children, family, and schools (6%).
This growth projection is a remarkable recognition of the social work profession and presents an
opportunity for administrators and faculty members in the USC MSW program to rethink an
innovative MSW curriculum to prepare students for important developments in the workforce.
USC MSW students devote 63% of their program time to field placements (internships).
Field education is the heart and signature pedagogy of social work education. USC MSW
students are required to complete a minimum of 1,000 hours with two field placements to earn
their degree. In addition to completing field hours, students are expected to meet the nine core
competencies outlined in the Educational Policy and Accreditation Standards (EPAS) set by the
Council on Social Work Education (CSWE), an accreditation governing body for social work
education. Field education provides MSW students with apprenticeships and a laboratory to
incorporate theories into practice and develop a professional identity, knowledge, and skills.
Field instructors (agency-based clinical supervisors), who are often MSW graduates with
a minimum of 2 years of job tenure and a great interest in mentoring future social workers,
provide weekly 1-hour supervision as part of their teaching plans. In addition, the field
instructors also provide support through reflective learning tools, observe student progress in
field placements, and coordinate and evaluate various learning opportunities for interns. Field
faculty members (liaisons) with extensive field experience serve as integrated seminar instructors
who provide guidance and support to both agency-based field instructors and students to meet
the EPAS requirements (CSWE, 2015) and the Code of Ethics of the National Association of
Social Workers (NASW), a national association that outlines professional standards and values.
MSW students can select various program options to complete their MSW degree: 2
years, 3 years, or 4 years (4, 6, or 8 semesters, respectively). The students may also choose
which campus they attend: University Park Campus, Orange County Academic Center, or VAC.
Statement of Problem
Since the 2008 economic crisis, the USC field program has lost between 20% and 30% of its
established field placements because of a lack of funding sources, agency closures, and the loss
of field instructors due to retirement or changes in their employment. Field faculty members are
constantly contacting potential field instructors (mostly alumni) to develop new field placements.
To prepare these new placement agencies to host MSW students, faculty members follow
specific protocols to vet a field agency by meeting with potential field instructors or agency
representatives to review field and curriculum requirements before giving final approval. This
process may occur months, weeks, or days before the first day of the semester or even after the
semester has started. Some agencies might not be fully equipped to have students involved
during the first day of the internship. This situation can be frustrating for both the agencies and
students. These issues put field instructors in a dilemma when determining how to
manage their commitments to their students; that is, allocating time for uninterrupted, consistent
1-hour weekly supervision while maintaining their responsibilities at their employment agencies.
Annually, the Suzanne Dworak-Peck School of Social Work trains more than 500 new
field instructors throughout its various academic centers. After the training, the field faculty
continues to build and foster partnerships with field instructors and agencies through
consultations and technical support to develop quality internship programs and enhance their
capacities. The hope is that these efforts will create stimulating learning experiences that enable
students to meet the EPAS and to receive the best training from their internships (field
placements).
Traditional field models required the placement of students in community-based field
experiences. With the expansion of the web-based program at USC, placing the students in a
timely manner became increasingly difficult because students were able to enroll in the program
up to the start of the semester. This option created a challenge that VAC faculty members and
administrators had not anticipated. Field faculty members can take up to 4 to 6 weeks to develop
the affiliation process and a memorandum of understanding between the university and
community partners, not to mention the time required for field faculty members to conduct site
visits for approval. The length of the affiliation process and last-minute student enrollments
caused significant delays for students and prevented them from successfully completing their
field practicum requirements in alignment with other academic courses.
The last CSWE accreditation site visit to USC was completed in the 2010–2011 academic
year, when the VAC program was first launched; the VAC was not included in the reaffirmation
and reaccreditation review based on 2008 CSWE EPAS. USC successfully received accreditation
in 2011. This capstone project sought to gain knowledge and insight by exploring learning
outcomes among all MSW students at the end of their first-year field practicum based on the
2015 CSWE EPAS. The findings served as the student learning outcomes for the 2016–2017
reaffirmation (self-study) and were used for a site visit completed in September 2017 for the
school’s reaccreditation.
Purpose of the Study: Distance Learning and Field Education
In 2011, a free massive open online course offered by Stanford University on artificial
intelligence enrolled more than 160,000 students from 192 countries with access to computers
and the internet. Kurzman (2013) described the phenomenon as the evolution of distance
learning and online education and stated that it could make social work education and direct
practice more effective and less costly. Vernon, Vakalahi, Pierce, Pittman-Munke, and Adkins
(2009) identified, through a CSWE survey in 2005, that 52% of MSW programs already
delivered courses using some form of technology, which might consist only of a recorded lesson
plan and/or a PowerPoint presentation. Another 2008 CSWE survey revealed that 83% of MSW
programs would consider offering their MSW program online (Vernon et al., 2009). In 2013,
CSWE accredited 22 online MSW programs. As of 2017, more than 67 universities are now
accredited by CSWE to offer MSW distance learning programs, compared to 40 universities in
2015 (CSWE, 2017).
Even with the rapid growth of online MSW programs, most social work educators lack
confidence in and hold a bias regarding the effectiveness of these programs (Kurzman, 2013).
Limited literature and data are available to validate the growth of and changes in social work
distance learning education. Kurzman (2013) reported that online MSW degree programs tend to
be driven by private universities and business models and do not focus on providing evidence
about how they are helping students meet social work core competencies. Therefore, despite the
rising trend of online education programs for social work, little is known about the value of the
program structures or student learning outcomes.
Bogo (2006) reviewed 40 studies on field education between 1999 and 2004 relating to
trends and shifts—including the processes of field instruction, assessments of student learning
and competencies, and training for field instructors—and eventually identified evidence-based
field education practices. In another article, Wayne, Raskin, and Bogo (2006) pointed out the
structural changes in agencies and universities, along with the student population demographics,
in recent years. These articles described how the historical role of field faculty liaisons was to
maintain the delicate relationship between universities and agencies and to encourage field
instructors to continue their mentorships with students. However, the role of the faculty has been
shifting from one of monitoring the progress of students to more of a teaching-focused duty.
With increased expectations for field faculty members to engage in more scholarly activities,
universities have started hiring adjuncts to take on a liaison role (Wayne et al., 2006). Wayne,
Bogo, and Raskin (2010) suggested how social work field education might be structured in
comparison to the fields of law, education, clergy, engineering, medicine, and nursing. The
authors found that common themes among these disciplines (fields) included the use of
competency-based training, problem-based learning, technology-based teaching programs, and
simulated patients (Wayne et al., 2010).
On June 25, 2014, per a request by the dean of the Suzanne Dworak-Peck School of
Social Work, an information survey was sent to 5,000 community agencies to solicit perspectives
on the type of professional skills and knowledge required for an MSW graduate to gain
employment in a health and human services agency (Wood, 2014). The survey featured 18
questions (seven substantive and 11 demographic) and was divided into two sections: (a) “What
are the three most vitally important skills that a new MSW graduate must have to be successful
working in your organization?” and (b) “What are the three most important areas of knowledge
about policy, research, or professional practice that a new MSW graduate should have to
successfully work at your organization?” (Wood, 2014). Among 335 responses, EBIs were
identified as the top priority for social work programs among other important social work
knowledge and skills, such as assessment, communication, clinical skills, collaboration and
documentation, and writing skills. The survey results led to a shift in the core curriculum
objectives by further incorporating EBIs into the development of intensive skills training in the
seminar course (Phillips, Woods, Yoo, Ward, Hsiao, Singh, & Morris, in press). This
standardized training, known as the Virtual Field Practicum (VFP), was developed specifically to
build students’ core competencies and aimed to relieve the burden on overworked field
instructors, who may or may not have the time, knowledge, and support to provide this important
field instruction.
Since the inception of the VAC in 2010, the student population has grown from 80 to
more than 2,200 students in six years and has reached 50 states and some U.S. territories
(overseas military bases). The VAC allows students, especially military personnel or their
spouses, to take part in a rigorous curriculum from within their own communities. Students
access asynchronous materials before participating in live sessions on a weekly basis as
facilitated by the faculty. The technology of the VAC changed how students think, behave, and
learn through a new way of delivering education.
Theoretical Framework: Competency-Based Training
The competency-based approach is not a new concept. This approach can be traced back
60 years, when competency-based medical education using outcome-based education was
adopted among policy makers and educators in health care professions (Carraccio, Wolfsthal,
Englander, Ferentz, & Martin, 2002; Frank, Snell, Cate, Holmboe, Carraccio, Swing, & Harris,
2010). Frank et al. (2010) emphasized that the competency-based approach places much greater
accountability on students to acquire competencies and emphasizes curricular outcomes. This
approach helped professional training programs and universities align their curriculum
development and achieve student learning outcomes. This approach also emphasized learner-
centered education through multidimensional, developmental, and contextual frameworks in
professional training, rather than merely focusing on the number of training hours completed
(Frank et al., 2010). Multiple professions have widely adopted this approach, including
chiropractic, social work (Menefee & Thompson, 1994), teacher education, and pharmacology,
along with other medical disciplines (Frank et al., 2010).
The 2008 CSWE Core Competencies for Social Workers prescribed a curriculum that
enables students to:
1. Identify as a professional social worker and conduct oneself accordingly;
2. Apply social work ethical principles to guide professional practice;
3. Apply critical thinking to inform and communicate professional judgment;
4. Engage diversity and difference in practice;
5. Advance human rights and social and economic justice;
6. Engage in research-informed practice and practice-informed research;
7. Apply knowledge of human behavior and the social environment;
8. Engage in policy practice to advance social and economic well-being and to deliver
effective social services;
9. Respond to contexts that shape practice; and
10. Engage, assess, intervene, and evaluate with individuals, families, groups,
organizations, and communities.
The 2015 CSWE Core Competencies for Social Workers included the following standards,
indicating that students should be able to:
1. Demonstrate ethical and professional behavior;
2. Engage diversity and difference in practice;
3. Advance human rights and social, economic, and environmental justice;
4. Engage in practice-informed research and research-informed practice;
5. Engage in policy practice;
6. Engage with individuals, families, groups, organizations, and communities;
7. Assess individuals, families, groups, organizations, and communities;
8. Intervene with individuals, families, groups, organizations, and communities; and
9. Evaluate practice with individuals, families, groups, organizations, and communities.
The 2015 EPAS kept the same themes as the 2008 EPAS for Competencies 4, 5, 6, and 8. The
remaining professional, ethical, and practice-related competencies from 2008 were integrated
into Competencies 1, 6, 7, 8, and 9.
Prior to fall 2015, VAC field instructors submitted all their students’ evaluations in hard copy to their faculty liaisons. Computer-literate students might scan the signed copies of their evaluations and email them to their faculty liaisons at the end of each semester. The student evaluation data for the 2011 reaffirmation and reaccreditation were collected via paper submission and manually tabulated. Due to the increasing number of MSW students, the USC
Suzanne Dworak-Peck School of Social Work field education program recognized the need to
move forward with a new electronic student comprehensive skill evaluation system for the
purpose of self-study (reaffirmation) and reaccreditation site visit in 2017. The 2015–2016
online student comprehensive skills evaluation tool was developed based on the 2015 EPAS.
The system enables staff to generate reports that offer an overview of student learning outcomes
between campus-based and web-based program options.
An Innovative Field Model: Virtual Field Practicum
During the first three years of the VAC program, Dr. Elizabeth Phillips and Dr. Gary Woods created the innovative VFP to address the nationwide shortage of field placements caused by six admission cycles per year and to retain students in the program (Flynn, Maiden, Smith, Wiley, & Wood, 2013). This 15-week curriculum focuses on competency-based skills training for VAC students, replacing the traditional internship as their first-semester field placement (Phillips et al., in press).
The VFP, taught by the field faculty, replaced the traditional field model in which
students received instruction from agency-based MSW field instructors. The VFP is a
comprehensive skills-training program that focuses on experiential learning (Rep, 2012; Wilson,
Brown, Wood, & Farkas, 2013) using twelve hours of structured asynchronous material and four
hours of live sessions weekly. The program includes training in assessment and diagnosis using
the 5th edition of the Diagnostic and Statistical Manual of Mental Disorders, clinical case
management and documentation, social work legal and ethics case vignettes, and evidence-based
practices (EBPs), including motivational interviewing, cognitive behavioral therapy, and
problem-solving therapy, along with active participation from students in the simulated clinical
setting via the VAC online classroom. Upon completion of the VFP, students enter a three-
semester community-based internship program. No other school of social work in the country
prepares students as comprehensively for their field placements or future employment as USC.
The concept of a simulated client is not new to medical educational programs, but is not
yet fully developed in social work education (Logie, Bogo, Regehr, & Power, 2013). In the VFP, an actor attends the weekly virtual live class sessions for thirteen weeks, giving students a realistic scenario in which to take turns role-playing with the client. This interaction allows the students to apply
the skills learned from their faculty instructor and the asynchronous material. Reflections and
rich discussions after the simulated sessions enhance experiential learning without judgment,
increase confidence in practice skills, and decrease performance anxiety, which could lead to
success in field placements. The three-semester field placement that follows VFP completion
provides extended time for students to advance their knowledge and skills. A new concept
proposed by Colby (2013) was to have one semester of foundational field social work curriculum
followed by three semesters of advanced social work field practice, compared to the traditional
field model that consisted of two foundational semesters and two specialized semesters.
Social work educators were skeptical about the VFP’s effectiveness when the USC
Suzanne Dworak-Peck School of Social Work faculty first introduced this innovative field model
at the 2013 CSWE annual conference. The workshop was well-attended by more than 200 social
work educators with questions and doubts. The VFP’s alignment with Colby’s (2013) concept of one foundation semester followed by three semesters of advanced field practice generated several controversial discussions during the workshop. Before participating, social work educators were unfamiliar with the VFP course curriculum and firmly believed that field training could only occur in community-based placements with real clients, not online with a simulated client in a virtual environment. The VFP challenged the
traditional model of field education, in which students could only be placed in community-based
internships and receive field instruction and evaluation from agency field instructors. This
paradigm shift supported the need to provide students with comprehensive skills training in core
competencies before entering field placement. Because very little research has examined social work distance learning programs using a competency-based evaluation tool, this capstone project addresses this gap by testing whether the VFP can be as effective as the traditional field model.
Since the implementation of the VFP curriculum in January 2014, more than 3,000 online
students have been trained using this model and earned their MSW degree. Agencies across the
nation provided feedback to faculty liaisons during their site visits and acknowledged the VFP’s
influence, noting how the social work students were much more prepared and confident with
their skills and as a result improved service delivery to their clients. This study explored learning
outcomes from both VAC and on-the-ground (OTG) students during the 2015–2016 academic
year. These data are particularly important for the USC Suzanne Dworak-Peck School of Social
Work because the VAC was not a part of the school’s accreditation review in 2011. This study
would be the first to measure compliance with the new 2015 EPAS, with which all students are
evaluated.
Curriculum Implementation
Faculty Recruitment and Development
The initial VFP courses were offered in Spring 2014. It was not a difficult task to recruit
the initial 10 VAC instructors. In consultation with the VFP creator, Dr. Phillips, school officials
concluded that all VAC instructors must attend weekly consultation group meetings to ensure
fidelity during initial implementation of the new curriculum. Dr. Phillips was instrumental and
committed to elucidating the VFP via weekly 2-hour consultation meetings for several purposes:
to train instructors in EBIs; review upcoming synchronous and asynchronous materials; identify
common themes when role-playing with the simulated client, Mario; and celebrate successes and
address challenges in both classroom and content. The success of the VFP implementation can be attributed to Dr. Phillips’ tireless support and continuous effort since its inception.
By May 2014, the target date for full VFP implementation for more than 300 VAC students, the need for instructors grew as the number of sections increased from 10 to approximately 32, given that each section should contain only eight to ten students to ensure optimal learning outcomes. Faculty recruitment and development were crucial program components and
would help determine the success of the VFP curriculum.
To fully implement the VFP in May 2014, more than 25 new instructors were hired to
instruct this course. The goal was to recruit seminar instructors who had been trained or were willing to be trained in EBIs. Instructors needed to be confident in their knowledge, clinical
skills, and ability to use technology in a virtual classroom setting when instructing students
through real-time experiential role-playing with a simulated client. Instructors would need to
create a safe and supportive environment that allowed students to provide peer support and
feedback to one another, in addition to modeling and facilitating timely productive discussions
during live sessions. Through a large professional network and recruitment within the current faculty pool, this was successfully accomplished.
Curriculum Alignment
The Suzanne Dworak-Peck School of Social Work underwent a major curriculum revision in 2015. While the VAC had successfully implemented the VFP since 2014, delivering VFP content in the OTG program took careful planning and preparation. To align the curriculum
between two programs and maintain fidelity to the curriculum, OTG faculty instructors received
access to online course content, including videos and additional teaching supplement materials.
Instead of having a live actor as the client during in-person classes, OTG students practiced their
skills with real clients in their assigned agency settings. They participated in an 8-hour EBI
training and continued to role-play skills learned in their weekly 2-hour seminar courses with
support from the instructors. Seminar instructors also served as circle trainers during the 8-hour
EBI trainings as they prepared their courses and observed skills development among their
students.
Mirroring the faculty support and development model from the VAC program, OTG lead faculty members also began with weekly consultation meetings, allowing the instructors to
identify challenges in delivering the newly developed curriculum and helping refine the content
in the intended course curriculum and specific contexts relevant to the 2015 EPAS. The stability
of the weekly consultation meetings, mentorship from senior faculty members, and collegial
support from other experienced instructors created a supportive learning environment for the
instructors, which translated to a safe and engaging environment for their assigned students.
VAC students interacted with their field instructors (VAC instructors) for a minimum of 4 hours a week, compared to the 1 hour of weekly supervision OTG students received in an agency setting. The positive relationship between the VAC instructors and students accounted
for many successes in the development of their practice skills through effective field supervision
and meaningful learning processes (Fortune et al., 2001; Knight, 2001). During weekly
consultation meetings, a parallel process mirrored the classroom instruction, wherein VAC
instructors discussed challenges in teaching the new curriculum and brainstormed strategies and
the best teaching practices to continue refining the delivery of the VFP curriculum.
Hypothesis
The hypothesis is that both VAC and OTG students can reach the established semester benchmarks at the end of their first-year field practicum, as measured by their field instructors using the 2015 CSWE EPAS competency-based comprehensive skill evaluation tool.
Students can now assess their own social work knowledge, values, and skills and their
progress made toward the acquisition of the core competencies during their first and second
semester using the student comprehensive skills evaluation tool, which was not included as part
of the previous evaluation process. Gaining insight into how VAC and OTG students view their
own learning outcomes will be an important factor for social work educators.
Prior to the 2015–2016 academic year, the faculty of the USC Suzanne Dworak-Peck
School of Social Work labored to redesign and replace its 40-year-old curriculum. Both VAC
and OTG students now receive the same curriculum content, but VAC students practice skills via
a simulated client, which replaces the traditional field experience of an internship.
Research Design
This study employs a quasi-experimental design (Campbell & Stanley, 1966), using an online evaluation collected at two points during the students’ first year in the
MSW program. Both VAC and OTG students were evaluated using the same instrument that
meets the program objective (see Appendix A). The rationale behind following up with all
students for a second-semester evaluation is to allow for additional comparative analyses to
monitor student learning outcomes and provide a feedback system to improve the quality of the
field curriculum for both the VAC and OTG programs.
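The comparative analyses planned for the two groups can be illustrated with a minimal sketch. The ratings, group sizes, and scale below are hypothetical, not study data; the sketch simply shows how mean competency ratings for the VAC and OTG groups at a single time point could be compared using Welch's t statistic.

```python
# Illustrative sketch only (hypothetical data): comparing mean competency
# ratings between the VAC (experimental) and OTG (control) groups at one
# evaluation time point, using Welch's t statistic for independent samples
# with possibly unequal variances.
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    na, nb = len(a), len(b)
    # Sample variances (n - 1 denominator) divided by group sizes give the
    # standard error of the difference in means.
    se = sqrt(variance(a) / na + variance(b) / nb)
    return (mean(a) - mean(b)) / se

# Hypothetical end-of-semester ratings on a 1-5 competency scale.
vac_t1 = [3.8, 4.1, 3.9, 4.0, 3.7, 4.2, 3.9, 4.0]
otg_t1 = [3.9, 3.8, 4.0, 4.1, 3.6, 4.0, 3.8, 4.1]

t = welch_t(vac_t1, otg_t1)
print(f"VAC mean={mean(vac_t1):.2f}  OTG mean={mean(otg_t1):.2f}  t={t:.3f}")
```

A t statistic near zero at both T1 and T2 would be consistent with the hypothesis that the two groups achieve similar learning outcomes; an actual analysis would also report degrees of freedom and a significance level.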
Figure 1: Student Learning Outcomes Research Design Process
Note. Adapted from Creswell (2014; see p. 172).
As shown in Figure 1, the design compares an experimental group of VAC students completing the Virtual Field Practicum with a control group of OTG students in the traditional field model. The experimental group uses a structured simulated client (X) through the VFP, whereas the control group works with real-life clients at their field placements. At the end of the first semester (T1), both groups complete an online assessment, evaluated by the students themselves and by either VFP faculty (VAC) or agency-based instructors (OTG). At the end of the second semester (T2), both groups are again assessed online by the students and their agency-based instructors, as all students are then in field placements. One instrument is used at both Time 1 and Time 2, regardless of students’ chosen departments.
The overarching goals of this capstone project study are twofold: to generate knowledge for social work online education and to test a field model that could potentially improve the MSW distance learning field education program. With no standardized evaluation tool provided by the CSWE for MSW programs, and given that each MSW program is designed with a different focus and specialty, the development of an empirical evaluation tool is essential. With the implementation of an evaluation tool, social work educators and field instructors can collaborate, coordinate, and evaluate students using expected professional standards. The hope is that this capstone project will provide affirmative evidence to confirm that the VFP, a best-practice model, is as effective as traditional field training. If the data confirm the hypothesis that both groups of students can achieve similar learning outcomes by the end of their first-year field practicum, a significant shift in the perception of social work distance learning education would occur. This finding would be a landmark and significantly contribute to social work field education
at the national and international levels. Furthermore, USC will have strong evidence to bolster its
2016 reaffirmation and 2017 accreditation review.
Summary
This capstone project sought to achieve the following three objectives: (a) develop an
online assessment tool adapted from the 2015 CSWE EPAS with specific examples for students
and field instructors; (b) implement this online assessment tool for both OTG and VAC students;
and (c) assess student learning outcomes in their first-year field practicum as the basis of the self-
study and reaffirmation process. The following chapter provides a literature review and research
studies conducted on social work field education, setting the stage to accomplish the
aforementioned objectives.
Chapter 2: Review of the Literature
This chapter provides a literature review regarding the background and program structure
of social work field education, radical changes in social work field models, and studies on
student competency assessments and learning outcomes.
Introduction
The CSWE accreditation standards for social work programs have changed four times in
the past 20 years. Authorized by the Council for Higher Education Accreditation, the
Commission on Accreditation of the CSWE is the official accreditation body that monitors and
accredits social work baccalaureate and master’s degree programs in the United States. The
commission sets professional accreditation standards and reviews and confirms that social work
programs are complying with expectations through program self-studies and site visits (CSWE,
2015). Through an explicit curriculum using the competency-based framework, social work
programs prepare students for generalist and specialized professional practice. CSWE
Educational Policy 2.0 outlines the accreditation standard M2.0 (MSW) and expectations for
generalist practice, whereas Educational Policy 2.1 provides details regarding the expectations
for specialized practice. Educational Policy 2.2 (Signature Pedagogy: Field Education) provides
specific structures on how to “teach future practitioners the fundamental dimensions of
professional work in their discipline—to think, to perform, and to act ethically and with
integrity” (CSWE, 2015, p. 12).
Social work field education, often referred to as field practicum or field work, is the
signature pedagogy of social work education. Traditionally, students apply theories learned from
their academic courses to practice, an integral component of the curriculum, in community-based
agencies. Students develop their competencies during two years of practicum by performing
social work-related activities with clients, families, groups, organizations, and communities;
tasks span the gamut, from administration and planning to advocacy and policy development.
Students are required to receive a minimum of one-hour weekly supervision from qualified field
instructors who graduated at least two years earlier from a CSWE-accredited MSW program.
These experienced field instructors assist students with achieving specific educational learning
objectives in their respective areas of generalist and specialized practice as professional social
workers. The students are expected to demonstrate academic knowledge, skills, critical thinking,
professional behavior, ethics, and values learned in the classroom; apply them in their direct-
practice work; and achieve the nine CSWE EPAS core competencies. This study focused on
generalist practice behavior and assessed student learning outcomes in their first year of the
MSW field practicum.
Background and Structure of Social Work Field Education
Seminar Instructor (Faculty Liaison)
Field faculty members (also known as integrative seminar instructors) possess a
minimum qualification of five years post-MSW experience, including two years as a field
instructor. Integrative seminars serve as the setting for integrating classroom content with social
work practice in field settings.
Seminar instructors (also student field liaisons) collaborate with agency field instructors
to foster comprehensive and high-quality social work education and training at placement sites.
The collaboration between the seminar instructors and field instructors teaches and guides
students to practice with sensitivity to cultural and ethnic diversity and abide by professional
social work behaviors, values, and codes of ethics. Field seminars provide weekly opportunities
for students to process their field experiences and reinforce their practice skills and values by
connecting competency-based knowledge gained in the classroom with their field placement
experiences.
Field education prepares students to enter the social work profession by meeting the
following learning objectives (USC Suzanne Dworak-Peck School of Social Work, 2017):
• Integration of academic learning with all levels of social work skills (micro, mezzo, and macro) that include direct and indirect practices;
• Achievement of proficiency outlined in the 2015 CSWE EPAS nine core competencies of social work practice;
• Development of the knowledge, skills, and ability to understand and use a broad range of assessments, treatment modalities, and interventions relevant to diverse populations through relevant social work learning activities; and
• Development of knowledge for generalist and specialized practices.
In addition to teaching field seminar courses, the seminar instructors also serve as student
liaisons who take on the responsibilities of advising, monitoring, and guiding students and
offering timely support to agency field instructors regarding university policies, curriculum
requirements, and professional educational standards. Furthermore, the field education program
is also responsible for setting policies, criteria, and procedures “for approving and selecting field
settings; placing and monitoring students; supporting student safety; and evaluating student
learning and field setting effectiveness congruent with the social work competencies” (CSWE,
2015, p. 13). Field seminar instructors (faculty liaisons) are the essential key to implementing
and monitoring these policies and protocols as part of their administrative responsibilities.
Agency-Based MSW Field Instructor
Field instructors, who host MSW students at their places of employment with the agency
administration’s approval, are required to hold an MSW degree from a CSWE-accredited
program and have two years of post-MSW social work practice experience. CSWE sets “the
parameters and expectations for field instructors and agencies based on educational policies,
criteria, and procedures” (CSWE, 2015, p. 13). All new field instructors are required to take new
field instructor training from a CSWE-accredited school of social work. The new field instructor
training prepares field instructors by providing specific orientation to field practicum
requirements—as well as providing an overview of the social work practice core competencies,
roles, and tasks of the field instructors and agency preceptors; field-specific learning objectives
and activities appropriate for first- and second-year students; best practices for effective
supervision; and field policies, protocol, and procedures for students to achieve the core
professional competencies (CSWE, 2015).
Field instructors are responsible for setting up the overall educational program in the field
setting following the school’s guidelines in consultation with the field liaisons and agency
preceptors (if appropriate). An orientation plan is developed to introduce students to the agency
and the community it serves. The field instructors also provide regular weekly individual field
instruction (teaching plan), including case assignments, review of agency policies and
requirements, review of student educational goals, and evaluation of students’ ongoing
performance. Field instructors and students use the reflective learning tools (educational process
recordings) each semester to guide weekly supervision, in which field instructors offer timely
constructive feedback to maximize students’ learning experiences and outcomes. Additional
observation, documentations, and feedback from the agency staff may contribute to students’
professional growth and development, which the designated field instructor coordinates and
monitors.
Homonoff (2008) identified several successful characteristics of good field instructors:
individuals who demonstrate their passion and energy as professional social workers, use various
learning opportunities and teaching styles, and commit to providing orientation and internship
structure for students. When in doubt, field instructors can consult with university liaisons
regarding how to align with university curriculum and CSWE field requirements. Bogo (2006)
identified the following crucial reasons for field instructors to volunteer their mentorship and
supervision to MSW students: (a) commitment to the social work profession and education, (b)
provision of organizational resources and support to create a positive culture and environment
for students and agencies, (c) effective interpersonal relationships with university liaisons, and
(d) reciprocal activities between the universities and field agencies. Field instructors are most
likely to engage or be greatly motivated when they are satisfied with the university liaison and
have access to consultations and university trainings.
In a qualitative study, Homonoff (2008) invited ten field instructors, recipients of the
CSWE Heart of Social Work Award, and university social work educators to brainstorm about
creative ways to impart knowledge, values, and skills through various teaching methods and
supervision models to uphold the mission of field education in the face of fiscal constraints and
pressures for accountability. Agency support was vital in promoting their roles as field
instructors and creating a positive learning environment for both students and agencies
(workforce development). Additional professional growth for field instructors, through their
clinical supervision, manifested as personal rewards for their own practice, gaining deeper self-
awareness through teaching, and further advancing their supervision skills (Tsang, 2013). Field
instructors used their clinical assessment to identify student strengths and learning styles and
provide individualized teaching plans and relevant learning opportunities to motivate students to
achieve their ultimate positive learning outcomes (Christenson Delong-Hamilton, Panos, Krase,
Buchan, Farrel, & Rodenhiser, 2015). Additionally, agencies benefited greatly from students’
ideas, work, and contributions and the advanced training opportunities offered by the
universities.
Field Instructor Evaluations of MSW Students
The 2015 EPAS focuses on the multidimensional assessment of the holistic competencies necessary to gain essential social work knowledge, values, and skills (Drisko, 2015; Kuhlmann, 2009), linking social work educators and professional practitioners in working collaboratively to prepare students for professional practice. This focus is consistent with the 2008 educational standards and the practice-behaviors approach to assessment.
Bogo (2006) found that field instructors preferred to facilitate student learning rather than
evaluate student skills. Field instructors’ leniency in assessing students resulted from their
background, years of experiences, and different teaching methods and supervision styles that
greatly affected their dual roles as mentors and evaluators (Regehr, Bogo, Regehr, & Power,
2007; Ringstad, 2013). Ringstad (2013) found that students scored higher than expected on
competency development measures, possibly because of an intentional or unintentional inflation
of scores by field instructors and the lack of specific scoring rubrics for the evaluation.
Furthermore, Regehr et al. (2007) indicated that field instructors were reluctant to use a tool that expected them to evaluate and categorize students into different skill levels. The instructors preferred to provide individualized student evaluations. However, this preference raised the issue of how
social dimensions of the relationship between the field instructor and the student might interfere
with the evaluation process. As a default, field instructors often measured students based on their
personal qualities, including personality traits and learning style, versus evaluating students
based on the core social work competencies (Bogo, 2006; Bogo et al., 2007).
Additionally, a study by Vinton and Wilke (2011) of 33 field instructors with 90 social
work interns examined leniency bias among supervisors and found that student ratings of
themselves were often lower than field instructor ratings of the students. Lower mean ratings and greater variance occurred when supervisors completed evaluations on their own, without any in-person discussion with their students, than when evaluations were completed face to face. Most of the time, field instructors had no conversations with students when completing the skill evaluations.
Ornstein and Moses (2010) mentioned the necessity for field instructors to maintain a
delicate balance of “teach versus treat” (p. 108), a term adopted from Frawley-O’Dea and Sarnat
(2001), between a therapeutic and an educational stance in their work and supervision with
students. Often, students may experience personal and family issues during the vigorous and
academically challenging MSW program. Being in field placements 16 hours a week, students
develop strong personal and professional alliances with their field instructors. This relationship
might potentially interfere with the objectivity required for field instructors to guide and
facilitate students’ professional growth and development. When instructors used a competency-
based evaluation, they faced a lack of specific guidelines with which to evaluate students’ moral
standards, self-awareness, self-management, and self-regulation (Ornstein & Moses, 2010). The
critical roles that field instructors play mean they must balance a strength-based approach with managing their own ambiguity to provide fair and effective evaluations through
the ethical decisions made in the context of social work practice (Ornstein & Moses, 2010; Bogo,
Regehr, Power & Regehr, 2007).
Sowbel (2011) summarized that field instructors, as volunteer professional educators
working without compensation, are responsible for identifying, confronting, and addressing
student performance issues, behaviors, attitudes, and professionalism. Complex dynamics (the supervisory relationship) exist between students and their field instructors, and field instructors have great difficulty evaluating student competencies due to the lack of a reliable and valid measurement (evaluation tool). Sowbel’s (2011) study indicated that more than 50% of the 154
participating MSW students were rated by their field instructors as exceptional. This again
confirmed the long-standing issue of inflated grades given by field instructors. Sowbel’s (2012)
article reported how social work educators struggle with their professional obligation to serve as
the gatekeepers for the profession to ensure that MSW students are a good fit for the profession
and would protect clients and field agencies from any harm. She mentioned several factors
contributing to this challenge, including fear of litigation, unclear suitability and admission
criteria, conflicting educator roles, and lack of measures or protocols.
This finding suggests that field instructors are not necessarily skilled at evaluating
students and may lack supervisory skills or experience. As a result, field instructors might avoid
providing students with direct feedback about their performance. In the new field instructor training, faculty present various supervision models and experiential case studies to brainstorm tips and strategies for how best to give students relevant feedback about their performance and identify areas for continuous professional growth and development. Furthermore, faculty liaisons serve as consultants, continuing to support field instructors in evaluating their students objectively based on the core competencies.
First-Year MSW Students’ Reflection and Self-Assessment
The number of prospective students interested in the field of social work has increased, particularly in several areas of advanced practice, such as child welfare, schools (trauma), integrated care and wellness, homelessness, substance abuse, and aging. MSW distance learning
programs offer a platform of study for these fields and encourage students to be part of a
growing social work profession. Many of these students come from diverse populations and
backgrounds and often have had personal or professional experiences with disabilities and health
and mental health issues, which present different learning objectives and levels of commitment
to their future social work careers. Students younger than 25 years old appear to lack skills, experience, and preparedness and have more anxiety than older students (Bogo, 2006; Gelman, 2004), which manifests as greater performance anxiety that might either motivate or hinder their learning. Nontraditional older students, who have had work experience, tend to be more self-
directed and engaged in self-assessment of their own learning because of their multiple roles as a
student, employee, and family member (Gelman, 2004). Gelman and Lloyd (2008) explored
preplacement anxiety in a qualitative survey of 360 first-year MSW students and revealed that
70% of students feared not having the necessary skills or experience and 64% worried that they
would make mistakes. The results from the open-ended questions indicated that 47% of the
students felt overwhelmed and 25% experienced anxiety. Students excel best in a supportive
relationship and climate with an expert who is honest, reliable, sociable, prepared, sincere, warm,
skillful, and trustworthy (Bogo, 2015; Strozier, Barnett-Queen, & Bennett, 2000). However,
these relationships are more challenging when agency-based field instructors must not only
perform demanding jobs but also take on the role of field instructor on their own, either with or
without agency support.
Fortune, McCarty, and Abramson (2001) explored activities associated with MSW
students’ performance in the field, perceptions of the quality of field placement, and satisfaction
with field experiences. The results demonstrated that students needed to distinguish their own
perceptions from their performance by collaboratively embracing learning opportunities. The
more students participated in developing the learning activities, the greater the satisfaction they
gained from their field experience. Knight (2001) focused on how field instructors could clearly
introduce the agency and learning assignments to MSW students as an important orientation for
the students. Such an approach would contribute to a positive supervisory relationship between
students and field instructors. Furthermore, it could motivate students to connect and bond with
their field instructors, who were responsible and played a significant role in their first practicum
experiences in their MSW program. Calderon (2013) studied student perceptions of satisfaction
in the field and self-assessments of social work skills and learning activities. The researcher
found that student perceptions of learning did not directly reflect content and skill mastery but
rather satisfaction with their educational experiences. Lee and Fortune (2013b) confirmed that
students had greater satisfaction in the field when they were more engaged in participatory,
conceptual, and linkage learning activities. Over time, as a result of active participation in
consistent learning activities during the field practicum, students were likely to report higher
self-rated social work skills at the final point of evaluation. Swank (2014) compared supervisor
ratings and student self-ratings of counseling competencies from the midterm point to those from
the end of the practicum course. The results indicated that students rated themselves higher than
their supervisors did at the beginning of the supervised experience, but supervisors reported greater
changes and progress at the midterm and final evaluations.
Student self-assessment of competencies can be an extremely meaningful and important
aspect of professional growth and development, specifically regarding autonomy, self-
awareness, reflection, monitoring, and improvement throughout the lifelong learning process
(Lee & Fortune, 2013a). It can encourage discussions regarding assessments between students
and field instructors and promote student self-efficacy, which enhances critical thinking,
improves performance, and helps them achieve competencies through a self-fulfilling prophecy.
Radical Changes in Social Work Field Models
Each online program has its unique curriculum design and geographic reach. Distance
learning MSW programs vary between hybrid and fully online formats. The hybrid (blended)
model allows students to receive an educational program delivered partly through an off-site
technological platform and partly on campus. Some online programs may offer only
asynchronous instruction that does not require students to attend live, synchronous
(face-to-face) sessions with instructors via the internet, whereas others may offer both
asynchronous and synchronous instruction. Some programs are accredited across the 50 states, whereas some may
be restricted to geographic locations or require students to secure their own internships as one of
the admission criteria. Regardless of the program structure, the expectation is that all MSW
students be placed in qualified community agency settings for internships where they reside.
Distance learning programs require students to be well-disciplined, organized, motivated, and
tech savvy.
In the 1960s, problem-based learning (PBL) was developed at McMaster University in
Canada and originated from the field of medicine. PBL increased students’ ability to link theory
and practice and improved problem-solving skills and higher-level analytic skills in
communication, collaboration, and critical thinking (Wong & Lam, 2007). Hmelo-Silver (2004)
provided a detailed description of PBL:
Students work in collaborative groups to identify what they need to learn in order to solve
a problem. They engage in self-directed learning (SDL) and then apply their new
knowledge to the problem and reflect on what they learned and the effectiveness of the
strategies employed. The teacher acts to facilitate the learning process rather than to
provide knowledge. (p. 235)
This PBL approach seeks to increase the intrinsic motivation of students while they work
through a real-life problem through interdisciplinary collaborations or projects in which students
demonstrate a higher level of analytic understanding (Hmelo-Silver, 2004). The PBL approach
has been widely used in social work education to bolster practice and field seminar courses in
recent years.
Several universities have developed various supplemental training models that
complement students’ field experiences. For example, Vodde and Giddings (2000) created a field
system ecomap to help students conceptualize practicum experiences. Based on their three-year
study evaluating the benefits and challenges of teaching field seminars in both in-class and
online courses, Wolfson, Marsom, and Magnuson (2005) created a fourth-year practicum
seminar offered only online. Carey and McCardle (2011) developed a faculty-guided, on-site,
experiential observation field model to reduce students’ anxiety before they enter the field. Most
universities offer seminar classes (curriculum) to help students integrate theories into practice in
the field. Logie et al. (2013) provided a critical appraisal of how social work education could
expand field trainings through standardized client simulations using reliable and valid methods to
evaluate student competence. These field models were attempts to address interns’ lack of
consistent field experiences and provide essential program components to ensure students meet
all CSWE core competencies.
Furthermore, Colby (2013) proposed a six-week practice lab during one semester of the
foundation year before entering the three-semester specialized model. His concept created a
paradigm shift for social work education and aligned with the 2008 EPAS to prepare graduate
students for advanced practice. The use of a standardized (simulated) client, validated by Logie et
al. (2013), has been promoted by Carter, Bornais, and Bilodeau (2011) as an effective
teaching tool for students to gain confidence and specific counseling skills. In the Objective
Structured Clinical Examination study (Katz, Tufford, Bogo, & Regehr, 2014), social work
students were required to role-play their interviewing skills for 15 minutes with a peer in a
practice laboratory before entering the field. The role-play exercise was observed by faculty
members, who then provided immediate feedback. Although similar to peer role-playing, client
simulation using an actor has more distinct advantages. The actors can present a life-changing
crisis in a simulated environment, allowing the students to take the practice more seriously and
not worry about causing harm to the clients. With great emphasis on teaching students the
essential engagement and interviewing skills in a practice lab, Hohman, Pierce, and Barnett
(2015) trained students in a semester-long practice course of motivational interviewing and
found through pretest and posttest surveys that students gained confidence in their
communication and counseling skills during the course.
The Virtual Field Practicum (discussed in Chapter 1) integrated all of the above best
practices and delivered social work field education entirely online through technology. The
combined VFP course content maximized various teaching methods, from standardized client
simulations to problem-based learning, to help reduce students’ anxiety and build their
confidence through specific skill building. By offering a one-semester, skills-training field
seminar course through technology, this trailblazing approach not only challenged traditional
social work educators’ perceptions but also took social work field education to another level.
Evidence-Based Intervention Trainings
EBI trainings are not foreign to most disciplines (medical, psychological, and education
systems) that strive to align more closely with research-informed clinical practices from a scientific
perspective. The evidence-based medicine concept came from Sackett, Richardson, Rosenberg,
and Haynes (1997) and has influenced many other clinical disciplines. Many benefits were
derived from having a standardized scientific practice, including the ability to (a) assess a
patient’s problem with a list of structured clinical questions, (b) search and identify the relevant
and best EBPs, (c) critically assess and analyze the evidence, (d) fully understand the search
results, and (e) fully integrate the best practices into the client’s care with his or her
acknowledgement and consent (Hatala, Keitz, Wilson, & Guyatt, 2006).
The EBI planning and implementation movement started in California, when voters
passed Proposition 63, also known as the Mental Health Services Act, in the 2004 general
election (Los Angeles County Department of Mental Health, 2017). The act promised to improve
the delivery of mental health services to all California residents. The law requires each county
mental health department to develop a three-year plan with annual updates specifically focused
on the development of prevention and early intervention services for underserved (or unserved)
vulnerable populations. It includes long-term information systems and mental health facilities to
provide cost-effective services and support to consumers with recovery and wellness in mind
(Los Angeles County Department of Mental Health, 2017). Furthermore, the act allowed each
county to address fundamental concepts on how to train culturally competent practitioners
through the support of workforce development and training programs for the purpose of providing
quality mental health services to consumers and their families. For 18 months in Los Angeles
County, each service area advisory planning committee took the responsibility among all
stakeholders to select and approve specific lists of EBPs that were most relevant to their
communities. Based on data from the Los Angeles County Department of Mental Health client
information system, which features mental health service records, summaries, and
recommendations, all final reports were posted for public comment in November 2009 (Los
Angeles County Department of Mental Health, 2017).
A few studies identified the importance of providing MSW students with EBPs and
related training to enhance student knowledge and application to meet Competency 4 (Engage in
Practice-Informed Research and Research-Informed Practice), which was relevant for both the
2008 and 2015 EPAS. Evidence-based social work instruction could be found in most MSW
research methods courses to teach MSW students to inquire, evaluate, and critique the principles
and effectiveness of EBPs as one of the CSWE core competencies (Traube, Pohle & Barley,
2012). Several textbooks on EBPs emerged, and a generalist practice course reviewed the
essential EBP concepts and theoretical framework. For years, social work educators reflected on
which EBPs should be taught, who could teach them, and the multiple challenges to integrate
EBPs throughout the two-year MSW curriculum. MSW students, however, did not have
opportunities to learn specific EBPs. Most EBPs require extensive initial weeklong training
followed by weekly supervision from the curriculum developer for up to 12 to 24 months in
order for practitioners to be certified in and practice the specific EBP. Due to limited resources,
the USC Suzanne Dworak-Peck School of Social Work has been able to offer only 8-hour
evidence-based intervention skills trainings.
Barwick (2011) surveyed more than 589 North American behavioral health
administrators and clinical supervisors who were responsible for hiring MSW graduates. The
respondents reported that MSW graduates did not have the basic ability to search for and analyze
EBPs. Only 25% of the respondents indicated that they had expressed their concerns to academic
programs regarding curriculum recommendations, whereas 36% of them never had such a
conversation and 39% were uncertain about how to address this curriculum gap. Of the contacts
between organizations and graduate programs, 45% focused primarily on field placements
(Barwick, 2011). Bertram, King, Pederson, and Nutt (2014) queried 215 deans and directors of
North American MSW programs; 58 responded (n = 55 from U.S. programs) and confirmed
teaching at least one EBP in their curricula. However, the major challenge was a faculty-
related barrier due to a lack of EBP knowledge and practice experiences, followed by limited
opportunities to support MSW students’ application of EBPs in placement settings (Bertram et
al., 2014).
Traube et al. (2012) recommended teaching EBPs at the beginning of the foundational
courses. By learning and practicing EBPs, students would be exposed to EBP principles and a
culture of accountability for the patients’ best outcome, bridging the gap between research and
practice. Traube et al. (2012) cited the five steps of the EBP process proposed by Shlonsky and
Stern (2007): 1) converting information need (prevention, assessment, treatment, risk) into an
answerable question; 2) tracking down current best evidence; 3) critically appraising the
evidence; 4) integrating critical appraisal with practice experience, client’s strengths, values, and
circumstances; and 5) evaluating effectiveness and efficiency in exercising steps 1-4 and seeking
ways to improve them next time (pp. 254–255).
Mathiesen and Hohman (2013) created an EBP questionnaire to assess knowledge,
attitudes, and behavior regarding EBP and monitor the EBP curriculum training, whereas
Bender, Altschul, Yoder, Parrish, and Nickels (2014) discussed the effectiveness of EBP in
research curricula as an important development component to align students with the core
competency concepts and build their critical thinking and evaluation skills. EBP is considered a
crucial component of social work education that aims to best prepare students entering field
placements, advance their professional development by practicing the science of social work, and
provide best practices to maximize clients’ ultimate treatment outcomes.
Studies on Student Competency Assessment and Learning Outcomes
Assessment is “the systematic collection, review, and use of information about
educational programs undertaken for the purpose of improving student learning and
development” (Palomba & Banta, 1999, p. 4). Palomba and Banta (2001) further stated that an
assessment provides evidence of student competence that should be used to improve learning.
Palomba and Banta (2001) reported that “to ensure that a student has attained competency, three
items of information are needed: a description of the competence, a means of measuring the
competency, and a standard by which the student may be judged as competent” (p. 2).
Not surprisingly, measuring student educational outcomes in the field setting has been
complicated and current related literature is lacking. Regehr et al. (2007) examined three field
evaluation tools and found that each tool was broad and lacked a specific rubric for outcome
interpretation. Various objectives and competencies existed among social work education
programs that prevented a standardized, consistent evaluation tool with validity and reliability
(Christenson et al., 2015).
The best-known and most widely quoted educational outcome model was developed by
Kirkpatrick and cited by Carpenter (2011) as addressing four levels of outcomes when
evaluating social work education: “learners’ reactions to the educational experience;
modification in attitudes and perceptions and the acquisition of knowledge and skills; behavior
change, including the application of learning to the work setting; and results, assessed in relation
to intended outcomes benefits to service users and carers” (pp. 123–124). Self-efficacy scales
were used quite often to measure student confidence in aspects of placement learning. Holden
and colleagues (Holden, Barker, Rosenberg, & Onghena, 2007; Holden, Meenaghan, Anastas, &
Metrey, 2002) developed the Research Self-Efficacy Scale and measured self-ratings as a
learning outcome in three areas: medical social work, research, and evaluation. A follow-up
study by Rishel and Majewski (2009) used the construct of self-efficacy as an evaluation tool in
pilot pretests and posttests with 117 advanced MSW students that showed significant gains
related to all 17 program objectives. Notably, their findings also indicated no significant
difference between campus-based and off-campus students.
After Florida State University launched its first asynchronous, online, advanced standing
MSW program in fall 2002, Wilke and Vinton (2006) conducted a preliminary study in 2005
comparing the first two web-based advanced-standing MSW cohorts (n = 35) with OTG students
(n = 96). Both groups of students completed instruments via a pretest–posttest research design
assessing social work knowledge, skills, and values at the beginning of their MSW program and
just prior to their graduation. The field evaluation of student performance involved 110 items
with responses ranked on a 5-point scale (1 = low to 5 = outstanding) based on the 2001 EPAS.
The following sections were included: supervision and professional identification; conceptual
frameworks; relationship skills; professional communication skills; case management; work with
individuals, groups, and families; and ethical and legal issues. The results indicated that online
students rated themselves significantly lower at admission to the program than OTG students. By
the end of the program, ratings by field instructors from both online and OTG groups were
positive and comparable (Wilke & Vinton, 2006).
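The pretest–posttest logic above can be sketched in a few lines of Python. This is only an illustration of the design, not the authors' analysis; the ratings, cohort sizes, and variable names below are hypothetical:

```python
from statistics import mean

# Hypothetical field-evaluation ratings on the study's 5-point scale
# (1 = low, 5 = outstanding), averaged across items for each student:
# pretest at program admission, posttest just prior to graduation.
online_pre, online_post = [2.1, 2.4, 2.0, 2.3], [4.2, 4.0, 4.3, 4.1]
otg_pre, otg_post = [2.9, 3.1, 2.8, 3.0], [4.1, 4.3, 4.0, 4.2]

def gain(pre, post):
    """Mean change in ratings from pretest to posttest for one cohort."""
    return round(mean(post) - mean(pre), 2)

# Online students start with lower self-ratings but both cohorts end with
# comparable ratings, mirroring the pattern Wilke and Vinton describe.
print(gain(online_pre, online_post), gain(otg_pre, otg_post))
```

A real analysis would, of course, test whether the end-of-program difference between cohorts is statistically significant rather than merely comparing cohort means.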
A dissertation by Martha T. Early (2007) studied learning outcomes of foundation-year
field instruction comparing on-campus and off-campus students. This study collected data from
three perspectives: students, field instructors, and clients. MSW students at East Carolina
University’s MSW program, established in 1984, assessed their professional growth as social
work interns, ability to perform basic social work tasks, and overall confidence levels. This study
found no significant differences between the two groups, and both groups received similar scores
from their field instructors at the end of their field practicum. Client data could not be
collected by on-campus MSW students, most of whom lacked work experience and had difficulty
engaging clients to complete the pretest and posttest. The off-campus students, most of whom
were placed at their current work sites for their internships, showed greater comfort engaging with new
clients (Early, 2007). Similar results came from Cummings, Chaffin, and Cockerham (2015), who
conducted a study of 183 MSW graduates of University of Tennessee’s distance education
program compared to the traditional cohort graduates. The researchers found no significant
differences in 10 core competencies based on the revised self-efficacy scale administered to all
students admitted to the MSW program in fall 2010 and again a few weeks before they graduated
in May 2012.
Hamilton (2009) conducted a secondary analysis of data collected from 16,996 bachelor
of social work students across 203 programs from 2000–2007 who rated their own social work
skills and advising experience. Participants primarily were female (90%) and had a mean age of
28 years and overall GPA of 3.3. The findings indicated that bachelor of social work students in
private universities rated their social work skills higher than students from public programs.
From a programmatic perspective, these data provided valuable information to track students
from entrance into bachelor’s degree programs until two years after graduation.
Meyer-Adams, Potts, Koob, Dorsey, and Rosales (2011) presented a systematic program-
level assessment plan that served as the baseline for MSW program educational outcomes and
the intended process to meet the expected 2008 EPAS. The comprehensive skills evaluation for
field education was included with all other assessment tools, including the Multicultural
Counseling Inventory, Self-Appraisal Inventory, Knowledge Inventory, Student Perceptions
Inventory, and an alumni survey. MSW field instructors completed ratings aggregated across ten
practice behaviors with a 4-point Likert scale based on the 2008 EPAS (Appendix B). Paired t-
tests were used to compare students’ first-year evaluation to their second year. The researchers
concluded that changes were significant and demonstrated a consistently high level of skills
development. The instrument was developed in collaboration with six other MSW programs in
Los Angeles County; however, its reliability and validity were unknown. This instrument was
used for a 2015 exploratory study by this author and results are included in Chapter 3 in
conjunction with a hypothesis for this quasi-experimental research study.
Christenson and colleagues (2015) reported development of the Field Placement and
Practicum Assessment Instrument based on the 2008 EPAS and measured undergraduate and
foundation-year MSW students’ educational outcomes in field settings. This instrument features
55 questions and assesses 41 practice behaviors using a 10-point Likert scale. In two initial pilot
cohorts, feedback was collected and questions were revised twice prior to the full
implementation (either via mail or online). A focus group was used to clarify the focus, validity,
and consistent interpretation of the outcomes.
The initial sample, collected beginning in fall 2011, included 304 BSW students from 19 social
work programs across 18 states. This instrument has been accepted as one of the six tools in the
Social Work Education Assessment Package used to measure EPAS competencies (Christenson
et al., 2015). In 2015, the authors reported that approximately 92 CSWE-accredited social work
programs had used the Field Placement and Practicum Assessment Instrument; however, they
reported no specific information on how this standardized instrument has been used by MSW
programs for the purpose of reaccreditation.
Conclusion
Competence assessments by both students and instructors are viable and equally
important in the learning process (Lee & Fortune, 2013a). Although field instructors’ ratings are
expected to be objective, they might be easily influenced by the supervisory relationship and
working alliance between students and field instructors. The compounded variables of student
demographics, characteristics, readiness for professional training, learning styles, commitment to
the profession, and career aspirations could significantly contribute to and affect the acquisition
and attainment of social work skills, values, knowledge, and application.
Additionally, training and consultations involving university faculty members would
support positive student learning outcomes based on the individual field instructor’s training,
experiences, communication, commitment, and teaching and supervision styles. Several field
models have addressed the need to provide essential skill training for MSW students prior to placing
them in the field. These important lessons served as the fundamental curriculum content when the VFP
was developed. Most studies on student learning outcomes have used self-efficacy scales
through which students assessed their own social work values, skills, and knowledge. Most
studies show no differences between online and campus students (Wilke & Vinton, 2006;
Early, 2007; Rishel & Majewski, 2009), and these findings informed the hypothesis of this capstone project.
In the next chapter, the author reviews the methodology in terms of how the USC
Suzanne Dworak-Peck School of Social Work developed and implemented an instrument to
collect data on student learning outcomes.
Chapter 3: Methodology
Introduction
This chapter features an overview of a 2015 exploratory study that described student
learning outcomes based on the 2008 EPAS, using data collected from an initial 100 VAC
students and compared to 519 OTG students. The results of the exploratory study served as the
hypothesis for research conducted for this capstone project. The philosophical orientation,
hypothesis, research questions, research design, setting, participants, procedures, instrument, and
data analysis for this research are described in this chapter.
Overview of Purpose
For its 2011 reaccreditation, USC Suzanne Dworak-Peck School of Social Work did not
implement an online comprehensive skills evaluation tool to compile students’ field evaluation
data. In addition, the VAC was not included as a part of the reaccreditation because the program
had just launched in October 2010 and did not have any student learning outcomes available for
review. During the 2011 accreditation review, field education presented the only available
comprehensive data on student learning outcomes for all 10 core competencies. These crucial
data were affirmed by the CSWE site reviewer, allowing the school to continue offering its
MSW program for 8 years. The VAC program has grown significantly since its inception, from
the initial 86 students enrolled in 2010 to approximately 2,200 students from across the nation in
2017. Without a doubt, the CSWE reaccreditation committee will pay very close attention to the
VAC program, the impacts of the VFP curriculum, and student learning outcomes, among other
measurements, in its evaluation of the USC MSW program.
Philosophical Orientation
The pragmatism-based philosophical world view that underpinned this research approach,
method, and design focuses on problem-centered, real-world practice orientation and
consequences of the proposed intervention, using a pluralistic approach to gain true knowledge
regarding what worked and to confirm the hypothesis (research question) based on the intended
intervention (Creswell, 2014). The research involved simply collecting and analyzing data
through a quantitative research design.
In addition to the competency-based framework mentioned in Chapter 1, this study relied
on systems theory, which describes student learning as “a combination of interrelated and
interconnected elements and activities that form an identifiable, organized and functioning
whole” (Sheafor & Horejsi, 2006, p. 89). This theoretical and conceptual framework: focuses on
external factors that affect how students in a complex environment observe the interactive
patterns among people, groups, and institutions; fosters their competence in applying knowledge
to understand other individuals and their environments; and informs their social work practice in
terms of engagement, assessment, intervention, and evaluation (Greenfield, 2011).
Social cognitive theory, developed by Bandura (1986), identifies both self-efficacy and
self-regulated learning as notable cognitive factors that operate through a self-fulfilling prophecy
in higher education. As discussed in Chapter 2, a self-efficacy scale has been the most widely
used method to assess social work student learning. Zimmerman’s (2000) model of self-
regulated learning also has been used to further assess student learning outcomes. According to
system and social cognitive theories, students with or without previous knowledge of volunteer
or work experiences in the social services can gain self-awareness of their own learning styles
and can develop critical thinking skills for self-reflection and assessment of their own learning.
This critical self-assessment serves as the foundation for the students’ advanced professional
growth and development, leading to successful career paths and job satisfaction in support of the
competency-based training approach (Baartman & Ruijs, 2011).
Kolb (1984) developed an experiential learning theory that incorporated two frameworks:
a four-stage cycle of learning and an inventory of four separate learning styles. Much of this
theory focused on the learner’s internal cognitive processes. Kolb stated that learning involved
the acquisition of abstract concepts that could be applied in a range of settings as well as the
development of new concepts derived from experience. “Learning is the process whereby
knowledge is created through the transformation of experience” (Kolb, 1984, p. 38).
Figure 2. Kolb’s (1984) Four-Stage Cycle of Learning
Kolb’s experiential learning model served as the philosophical orientation for this study
and grounded the hypothesis that both VAC and OTG students can reach the established
semester benchmark measured by their field instructors using the 2015 CSWE EPAS
competency-based comprehensive skill evaluation tool. This model outlines the complex
learning process and how social work educators and field instructors can best facilitate,
supervise, and guide students’ motivation, learning, and self-efficacy to achieve their
professional competencies and reach their ultimate learning outcomes in the field setting.
To elaborate, social work students engage in field placement through multidimensional
interactions with university faculty members (instructors), agency field instructors, and agency
staff members, which serve as the crucial platform for their learning and opportunities to further
identify needs and gaps regarding their skill development. The coordination and communication
among all involved stakeholders contribute to the success of social work education when both
implicit and explicit curricula are aligned to support student learning and achieve academic
excellence and professional competencies. Figure 3, adapted from Kolb’s (1984) experiential
learning cycle model to apply to social work field education, contends that MSW students
(learner-centered) engage in the learning environment and interact with faculty members, agency
field instructors, and other stakeholders (individuals, groups, organizations, and communities) to
achieve the best student learning outcome.
Figure 3. Collaborative Learning Framework (the MSW student, the university social work program, and the field placement collaborating toward mastery of the CSWE 2015 EPAS competencies)
Results from 2015 Exploratory Study
An exploratory longitudinal study was conducted with the first cohort of 100 VAC
students in spring 2014 to assess their learning outcomes compared to 519 campus-based (OTG)
students using the 2008 competency evaluation tool. The study aimed to gather knowledge on
the trajectories of students’ skills development and learning outcomes comparing VFP and OTG
students. The hypothesis posited that VAC students could achieve similar or better competencies
as OTG students (Phillips et al., in press). Subsequently, this author in collaboration with Dr.
Phillips and colleagues submitted a paper describing the results of this study to the Journal of
Social Work Education for peer review in 2016, and the article was accepted in August 2017.
The results of the longitudinal study are included in this capstone project as the anchor of the
broader study.
The exploratory study used a longitudinal, quasi-experimental design with nonequivalent
groups (i.e., VFP and OTG students) and four measurement points. All students were assessed at
the end of each of their four semesters of field practicum as part of their standard academic
assessment by their assigned field instructors. However, students did not assess their own
learning or rate their competencies. VAC students started with the VFP in Semester 1, then
completed three consecutive semesters at a single agency. OTG students spent the first two
semesters at one agency and the last two semesters at another agency. All VAC student
evaluation information was collected manually and tabulated in an Excel sheet, whereas
information about the OTG student learning outcomes was generated electronically by staff.
Seventy percent of VAC students were part-time students who struggled to balance work and
life, making student retention a challenge. The sample size of VAC students for this exploratory
study was significantly reduced throughout the four semesters (down to 57 of the original 100
students). The sample included 100 VAC students and 520 OTG students in field placements
during the 2013–2014 and 2014–2015 academic years. Table 2 presents the demographics of the
VAC and OTG students regarding gender, racial and ethnic backgrounds, and average GPA.
Students’ demographic data revealed only two statistically significant differences between the
cohorts regarding race/ethnicity and GPA. The VAC students had a 3.6 GPA compared to a 3.7
GPA for OTG students. The OTG cohort also had a greater percentage of Asian and Latino
students and a smaller proportion of Caucasian and African American students than the VAC
cohort.
A comprehensive skills evaluation form was developed by six universities in Los Angeles
County (Meyer-Adams et al., 2011) and used by field instructors to rate both VAC and OTG
students in ten core competencies: professionalism, ethics, thinking and judgment, cultural
competence, social justice, EBPs, person in environment, policy, current trends, and practice
skills. These competencies align with the 2008 EPAS (Appendix B). The same form was used by
all field instructors each semester, resulting in four measures of the ten competencies for each
student. For the VAC model, students were rated by their university field practicum instructors in
the first semester. In the second, third, and fourth semesters, the same students were rated by a
different field instructor due to their placement in an agency. For the OTG model, students were
rated by one field instructor for the first two semesters at one agency and by another field
instructor for the remaining semesters at another agency. The rating scores on the form ranged
from 1 to 4, with a higher score denoting greater competence. Cronbach’s alpha for each set of
items per competency ranged from .90 to .98, indicating good internal consistency.
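For illustration, the internal-consistency statistic reported above can be computed directly from item-level ratings. The sketch below uses the standard Cronbach's alpha formula with hypothetical ratings, not study data:

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a (students x items) array of ratings.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]                         # number of items in the competency
    item_vars = items.var(axis=0, ddof=1)      # variance of each item across students
    total_var = items.sum(axis=1).var(ddof=1)  # variance of each student's total score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings: five students, three items for one competency
ratings = [[3, 3, 4],
           [2, 2, 2],
           [4, 4, 4],
           [3, 4, 3],
           [1, 2, 1]]
alpha = cronbach_alpha(ratings)
```

Values approaching 1 indicate that the items within a competency measure the same underlying skill consistently.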
Data for the VAC and OTG cohorts were analyzed by the Clarus Research Consultant
Group using two software programs: SPSS version 22.0 for descriptive analyses and bivariate
statistics (e.g., t-tests) and Hierarchical Linear and Nonlinear Modeling version 7.01 to test group
differences in competency ratings over time.
Significant differences emerged in mean competency scores between the VAC and OTG
groups at each measurement point. At Time 1 (Table 3-A), the VAC students were rated
significantly higher on every competency by their field
instructors. By Time 2 (Table 3-B), the VAC students had completed the first semester of their
agency-based placement and the OTG students had completed the second semester of their
agency-based placement. Understandably, the OTG group had significantly higher ratings from
their field instructors than the VAC students, except for two competencies: policy and current trends.
At Time 3 (Table 3-C), the VAC group had completed the second semester of agency-based
placement, whereas the OTG group entered and completed its third placement semester at a
second agency-based placement. VAC students showed higher mean scores from their field
instructors for all competencies compared to the OTG group. By Time 4 (Table 3-D), at which
point both groups continued their respective placements from the previous semester, the mean
differences diminished. The VAC group, on average, received higher mean scores in all
competencies, especially social justice, EBPs, and policy.
In summary, the results of this longitudinal study show that across all competencies, the
VAC group had higher mean scores at Time 1 (Table 3-A) and ended with mean scores similar to
or slightly higher than those of the OTG group at Time 4 (Table 3-D). This result could be explained by
and attributed to the different field placements and pedagogical models between groups.
Unfortunately, these 2015 exploratory study data could not be used for the 2017
reaccreditation due to the shift to the 2015 EPAS. Yet the 2015 study informed the hypothesis of
this capstone project, i.e., that both OTG and VAC students could achieve the expected
benchmarks by the end of their first year of practicum as evidenced by their learning outcomes
rated by their agency-based field instructors. This research study tested a larger sample using a
different instrument (2015 student comprehensive skills evaluation) to generate knowledge of
student learning outcomes comparing VAC and campus-based students.
In spring 2016, the CSWE reaccreditation committee notified every MSW program that
each school needed to develop a specific advanced practice evaluation tool that differed from
generalist practice evaluation. Therefore, the student comprehensive skills evaluation could not
be used across all four semesters of field practicum. A separate competency-based evaluation
tool needed to be developed to assess students in their advanced practice field placements.
Therefore, this capstone project was unable to replicate the same longitudinal quasi-experimental
design with the nonequivalent groups used in the 2015 study, but instead primarily focused on
student learning outcomes from first-year field experiences.
Hypothesis
Based on the literature reviewed in Chapter 2, the hypothesis for this study is that VAC and
OTG students can reach the established semester benchmarks at the end of their first-year field
practicum, as measured by their field instructors using the 2015 CSWE EPAS competency-based
comprehensive skills evaluation tool.
Research Design
This capstone project used a quasi-experimental quantitative design with two
nonequivalent groups of students (VAC and OTG). Each student had one baseline assessment at
the end of the first semester (pretest) and one follow-up measurement at the end of the second
semester (posttest) through an online evaluation tool. Based on the work of Andrew Isserman in
the 1980s and 1990s, quasi-experimental comparison group designs have been frequently used
during the last three decades, especially in fields such as education and psychology (see Feser,
2013). The use of quasi-experimental research allows assessment of student learning outcomes
between two groups (Creswell, 2014). When using a quasi-experimental design without random
group selection, one of the benefits is an increased likelihood of identifying general trends based
on the results (Shadish, Cook, & Campbell, 2002). By forgoing preselection and random
assignment of subjects, this study reduced the logistical difficulty and ethical concerns involved
in the study design and setting.
In this study, students received the same curriculum content and were assessed by the
same evaluation tool at the same time point by their field instructors. The independent variable
was membership in either the VAC (experimental group) or the OTG (control group) cohort. The
dependent variables for the first research question were the ratings on the nine core competencies (completed
by students and their field instructors using the student comprehensive skills evaluation tool),
which indicated student learning outcomes in the experimental group (VAC students in the VFP
setting) and the control group (OTG students in traditional field placements).
Study Setting and Participants
The study setting was all USC Suzanne Dworak-Peck School of Social Work-approved
field placements. Each group participated in the assigned field courses. OTG students enrolled in
courses titled SOWK 589a and 589b for the academic year 2015–2016 throughout Southern
California (including Los Angeles, San Diego, and Orange counties). VAC students enrolled in
SOWK 586a (VFP) and SOWK 586b (community field placements) during the 2015–2016
academic year. Whereas OTG students received their evaluation from the same agency field
instructors for both semesters, VAC students were evaluated by their VAC instructors in the first
semester and agency field instructors in the second semester (Table 1).
Traditionally, the VAC has three semester starts each academic year. In this study, VAC
students enrolled in SOWK 586a in the following semesters were included: summer 2015, fall
2015, and spring 2016 (n = 710). OTG students enrolled in SOWK 589a in fall 2015 (n = 544)
were included in this study. As VAC students entered their assigned three-semester field
placements, they enrolled in SOWK 586b in fall 2015 and spring and summer 2016 (n = 721),
an increase in enrollment. OTG students enrolled in SOWK 589b in spring 2016
(n = 538). VAC enrollment in the 586b classes was larger than in 586a because students had the
flexibility to return to the program from a leave of absence at any time. Each
cohort had an enrollment goal based on the budgetary projection.
Students missing an evaluation were excluded from the analysis. Students who partially
completed an evaluation were included in the dataset. Some competencies were not rated
for all students early in their first semester because neither the VFP nor the traditional field
placement offered opportunities to practice them. A "not applicable" rating was allowed for
Competencies 3, 4, 5, and 9, which were not required of all students in their first-semester ratings. In
Tables 5-A through 5-H, the sample sizes for competency areas 3, 4, 5, and 9 are therefore lower than the
sample sizes for the other competency areas. The overall mean score across all competency
areas, reported in the last row, is the mean across students of all available ratings.
In other words, if a student had no ratings for competency areas 3, 4, 5, and 9, the value included
in the final row is the average of competency areas 1, 2, 6, 7, and 8. The sample size for this last
row is larger than the sample sizes for some of the individual competency areas because all
students with any ratings are represented in this row: it is an overall mean using all available data
for each student. In all cases, the most complete set of data was used for each student.
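The averaging rule described above can be sketched as follows. The ratings and the `None` markers for "not applicable" scores are hypothetical; the actual tabulation was done in Excel and the electronic tracking systems:

```python
# One student's semester ratings for the nine competency areas;
# None marks a "not applicable" rating (allowed for areas 3, 4, 5, and 9).
ratings = {1: 6.0, 2: 5.5, 3: None, 4: None, 5: None,
           6: 5.0, 7: 4.5, 8: 6.5, 9: None}

# The overall mean uses only the ratings available for that student,
# so a student with no ratings for areas 3, 4, 5, and 9 contributes
# the average of areas 1, 2, 6, 7, and 8.
available = [score for score in ratings.values() if score is not None]
overall_mean = sum(available) / len(available)
```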
The final dataset showed that of 1,254 students enrolled (both OTG and VAC), 1,199
students were assessed for SOWK 589a and 586a in fall 2015. In spring 2016, 1,259 students
enrolled and 1,211 students were assessed for SOWK 589b and 586b. Demographic data for
VAC and OTG students are presented in Table 4.
Based on the sample data, differences existed between the groups in terms of
demographics, including age, race and ethnicity, primary language, and admission GPA. OTG
students were almost 7 years younger on average than VAC students. The OTG cohort had more
Latino/a students (37.1%), with Caucasians (24.4%) and African Americans (12.9%) representing the
next two largest groups, whereas the largest share of VAC students were Caucasian (42%), with African
Americans (26.2%) and Latino/a students (15.9%) representing the next two largest groups.
Primary-language data were unknown for 20.7% of VAC students, whereas nearly a quarter of OTG
students (23.0%) identified as speaking English as a second language.
OTG students had a higher average GPA (3.77) compared to VAC students (3.68) upon
admission; however, VAC students made good progress academically in their GPA (3.73)
compared to OTG students (3.79) at the end of the second semester.
Procedures
Upon confirmation of each student’s enrollment and assigned field placement, the field
administrative team at each academic center (i.e., University Park Campus, Orange County
Academic Center, San Diego Academic Center, and VAC) created individual login and
password credentials for students and field instructors separately and linked them to the
appropriate evaluation form based on each student's department (advanced practice). Regardless
of department, students were assigned the same comprehensive skills evaluation instrument
for Time 1 and Time 2. OTG students completed their evaluation through an intern placement
tracking program, whereas VAC students assessed their competencies through an online field
evaluation program. Students and field instructors could access their evaluation form remotely
either at school or at their field placements, through a secure login via the internet.
Students completed self-ratings via an online evaluation form before the end of each
semester and submitted them to the field instructor for review. Field instructors were encouraged
to meet in person with their students and provide feedback before submitting their final ratings,
which included a semester summary that captured the students’ progress and areas for further
professional growth and development. Both students and instructors signed and acknowledged
the final ratings before the university field liaison reviewed them. If any of the competency
ratings fell below standards, liaisons reached out to the field instructors for clarification. If there
was no concern, liaisons acknowledged and submitted final grades to the university with one of
the following categories: credit, in progress, or no credit.
Measurement and Instrument
As previously mentioned, the CSWE does not provide specific instruments beyond the
description of the competencies. Each school is tasked with creating and operationalizing a
measurement instrument. After the development of the 2015 student comprehensive skills
evaluation tool, the instrument was piloted with a small group of field seminar instructors (VFP)
and agency field instructors (n = 35), who completed the evaluation tool based on their current
students’ performance. Results were shared with a focus group that provided specific feedback to
improve the clarity of the evaluation tool and guide its careful revision. The goal of
the revision was to provide clear and specific direction for field instructors through rubrics and
examples for each item in each core competency.
Appendix A (2015 EPAS) presents the instrument used to evaluate student competencies.
All items are evaluated with a Likert scale that includes rating scores ranging from 0 to 10, with
a higher score denoting greater competency. Benchmarks were set across four field practicums
with a gradual progression in skill development. The student comprehensive skills evaluation
instrument (Appendix A) provides the instructions. Scores for each competency were calculated
as the average score on the various items assessing that competency. The expected benchmark for
each semester was color-coded. Students whose ratings fell in the red range (0-1 for the first
semester and 0-3 for the second semester) were at risk of not passing the specific competency for
that semester; students whose ratings fell in the green range (2-6 for the first semester and 4-7
for the second semester) would receive a passing grade; and students whose ratings fell in the
blue range (7-10 for the first semester and 8-10 for the second semester) were performing beyond
the expected benchmark. For example, a student might receive a rating of 2 on Competency 3 in
the first semester, as expected, and would be expected to reach a rating of 4-7 by the second
semester. The system automatically provided an instant visual report on the students'
self-assessments and the ratings of their field instructors (Appendix
A). Cronbach’s alpha for the nine competencies ranged from .90 to .98, indicating good internal
consistency and reliability.
Figure 4: Expected Semester Benchmark (Appendix A)
First semester: 0-1 (red, below benchmark), 2-6 (green, at benchmark), 7-10 (blue, beyond benchmark)
Second semester: 0-3 (red, below benchmark), 4-7 (green, at benchmark), 8-10 (blue, beyond benchmark)
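The color-coded benchmarks in Figure 4 amount to a simple banding rule. A minimal sketch follows; the function name and returned labels are illustrative, not part of the online evaluation system:

```python
# Expected-benchmark bands from Figure 4, keyed by semester:
# (upper bound of the red band, upper bound of the green band)
BANDS = {1: (1, 6),   # first semester:  red 0-1, green 2-6, blue 7-10
         2: (3, 7)}   # second semester: red 0-3, green 4-7, blue 8-10

def benchmark_band(score, semester):
    """Classify a 0-10 competency rating against the semester benchmark."""
    red_max, green_max = BANDS[semester]
    if score <= red_max:
        return "red"     # at risk of not passing the competency
    if score <= green_max:
        return "green"   # meets the expected benchmark
    return "blue"        # performing beyond the expected benchmark
```

For example, a first-semester rating of 2 falls in the green band, while the same rating in the second semester falls in the red band, reflecting the expected progression in skill development.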
After 15 months of collaborative discussions in the focus group and during monthly field
faculty meetings, both VAC and OTG field faculty members collectively selected Competencies
1, 2, 6, 7, and 8 as the most important core competencies for first-year MSW students to achieve
by the end of their first semester of field practicum. The expectations were that students should
receive ratings for these competencies from their assigned field instructor that reach the
designated benchmarks. The other competencies (3, 4, 5, and 9) offered an option for a rating of
“not applicable” given the limited field experiences that students might have in their first
semester of field practicum. However, all students were expected to receive ratings on all nine
competencies at their second, third, and fourth semesters of field practicum.
Data Analysis
The data from the comprehensive skills evaluation form (both VAC and OTG groups)
were analyzed using both descriptive and statistical procedures. SPSS (version 22.0) was used
for statistical analyses from descriptive statistics (means) to inferential statistics (t-tests). Heavey
(2015) reported that the t-test is appropriate if three conditions are met: the assessment of the
variable is continuous, two groups are being compared, and the two groups are independent.
Furthermore, this capstone project attempted to compare similarities and differences between
the two student groups; thus, the paired t-test was used to identify significant
differences (Munro, 2005). Rayens, Svavarsdottir, and Burkhart (2011) reported that a minimum
of 30 dyads in a paired t-test is a statistically appropriate method for comparing differences
between two related groups.
A p-value less than .05 traditionally indicates rejection of the null hypothesis. The p-
value threshold was set to .001 in this study to reduce the risk of Type I error, given that analyses
were repeated for nine competencies separately and for all competencies combined, resulting in
several comparisons. This rigorous p-value standard was used to demonstrate the strength of the
evidence. Munro (2005) reported that the acceptable level of power is about 80%, corresponding
to a 20% probability of failing to reject a false null hypothesis (Type II error). Hedges' g was
used to assess effect sizes for the t-tests and differences between the experimental and control groups.
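The analyses were run in SPSS and HLM; equivalent computations can be sketched in Python with SciPy. The ratings below are hypothetical, not study data, and the `hedges_g` helper is an illustrative implementation of the standard formula (Cohen's d with the small-sample bias correction):

```python
import numpy as np
from scipy import stats

def hedges_g(x, y):
    """Hedges' g: Cohen's d scaled by the small-sample correction J."""
    nx, ny = len(x), len(y)
    pooled_sd = np.sqrt(((nx - 1) * np.var(x, ddof=1) +
                         (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2))
    d = (np.mean(x) - np.mean(y)) / pooled_sd
    j = 1 - 3 / (4 * (nx + ny) - 9)   # bias-correction factor
    return d * j

# Hypothetical competency ratings for the same students at two time points
time1 = np.array([5.8, 6.1, 5.5, 6.4, 5.9, 6.2])
time2 = np.array([7.2, 7.5, 6.9, 7.8, 7.1, 7.4])
t_stat, p_value = stats.ttest_rel(time1, time2)  # paired-sample t-test

# Hypothetical Time 2 ratings for two independent cohorts
vac = np.array([7.0, 6.5, 7.2, 6.8, 7.1])
otg = np.array([6.9, 7.3, 7.0, 7.5, 7.2])
effect = hedges_g(vac, otg)  # between-group effect size
```

Note that `ttest_rel` pairs each student's Time 1 and Time 2 ratings, matching the within-group comparisons reported in Chapter 4, while Hedges' g compares the two independent cohorts.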
This capstone project involved human subjects but posed no ethical concern and
collected no personally identifying information. The USC Institutional Review
Board (IRB) was consulted regarding the use of the comprehensive student skills evaluation
form (instrument). The study was classified as exempt from IRB review given the nature of the
periodically embedded educational assessment required of all MSW students enrolled in the field
practicum. Therefore, IRB approval was not required and the IRB did not review the instrument.
However, IRB approval would be important for future research and publication consideration.
Summary
The outcome of the 2015 exploratory study set the theoretical framework and grounded
the hypothesis for this capstone project. This capstone project included students' self-assessment
and instructors’ evaluation of student learning. The creation of the online competency-based
evaluation tool was a major step forward, providing students and field instructors an
instantaneous visual report and a specific rubric and Likert scale to guide the objectivity of the
evaluation in relation to learning outcomes. The students' evaluation data, including
self-assessments and ratings by the field instructors at Time 1 and Time 2, are presented
in the next chapter.
Chapter 4: Results and Data Analysis
Introduction
This chapter outlines the results of the data analyses. These results are organized in tables
that include students’ self-assessment and ratings of their field instructors. The relevant statistics
are used to examine the hypothesis. This study explored the trajectories of students’ skills
development through self-ratings over time and comparisons between VAC (experimental group)
and OTG (control group) students in each semester. Data from the field instructors’ ratings were
also analyzed to gain deeper understanding of differences and similarities in comparison to their
students’ self-ratings.
Results
To test the hypothesis of whether by the end of the first-year field practicum, both VAC
and OTG students could achieve the expected semester benchmarks, this study used data from
the 2015 CSWE competency-based student comprehensive skills evaluation tool.
Competency 1: Demonstrate Ethical and Professional Behavior
OTG student learning outcomes. At Time 1, paired-sample t-test results (Table 5-A)
showed OTG students rated themselves higher than the expected benchmark of 6 in the first
semester (M = 6.16, SD = 1.73). At Time 2, OTG students’ self-ratings in their second semester
of field practicum again was higher than the expected benchmark of 7 (M = 7.48, SD = 1.31).
OTG students made significant progress in competency 1 from Time 1 to Time 2 (t = -20.73, df
= 525, p < .001).
OTG field instructor ratings. At Time 1, paired t-test results (Table 5-B) indicated the
agency-based field instructors reported an average mean score of 5.90 (SD = 1.50) for this
competency. At Time 2, the same field instructors rated their students with a higher mean score
(M = 7.68, SD = 1.28). These data confirmed the significant progress made by OTG students as
rated by their field instructors from Time 1 to Time 2, exceeding the expected highest semester
benchmark of 7 (t = -36.25, df = 522, p < .001).
VAC student learning outcomes. Based on paired-sample t-test results (Table 5-C),
VAC students rated themselves at Time 1 with an average score of 6.64 (SD = 2.09), higher
than the expected highest benchmark of 6. At Time
2, VAC students completed their evaluations based on experiences in their assigned community
field placements. The average score increased (M = 7.03, SD = 1.49), demonstrating statistically
significant progress (t = -3.79, df = 421, p < .001) and exceeding the expected highest semester
benchmark of 7 in this competency.
VAC field instructor ratings. According to paired-sample t-test results (Table 5-D),
VAC instructors rated their students at Time 1 higher than the expected benchmark of 6 (M =
6.49, SD = 1.70). At Time 2, agency field instructors rated the VAC students at the expected
benchmark (M = 6.96, SD = 1.63), confirming some progress made in skills development (t = -
5.02, df = 496, p < .001).
Competency 2: Engage Diversity and Differences in Practice
OTG student learning outcomes. Paired-sample t-test results (Table 5-A) for OTG
students at Time 1 for this competency were higher than the expected benchmark of 5 (M = 5.57,
SD = 1.88). At Time 2, OTG students’ self-ratings were higher than the expected benchmark
score of 6 (M = 7.03, SD = 1.49). OTG students made significant progress in this competency
from Time 1 to Time 2 (t = -20.99, df = 524, p < .001).
OTG field instructor ratings. Agency-based field instructors provided lower ratings at
Time 1 (Table 5-B) regarding student learning outcomes (M = 5.24, SD = 1.60) for this
competency compared to the average self-reported score from students. At Time 2, the field
instructors rated their students with a higher mean score (M = 7.29, SD = 1.43) than the students’
self-ratings. Again, these data confirmed the significant progress that OTG students made from
Time 1 to Time 2, exceeding the expected semester benchmark of 6 (t = -38.94, df = 523, p <
.001).
VAC student learning outcomes. At Time 1 (Table 5-C), VAC students rated
themselves higher than the expected benchmark of 6 (M = 6.18, SD = 2.22). At Time 2, 420 VAC
students reported a higher mean score (M = 6.62, SD = 1.66) compared to Time 1, demonstrating
statistically significant progress (t = -3.94, df = 419, p < .001) and meeting the highest semester
benchmark of 7.
VAC field instructor ratings. At Time 1 (Table 5-D), VAC instructors rated their
students at the expected benchmark (M = 5.84, SD = 1.67). At Time 2, agency field instructors
rated them near the expected benchmark (M = 6.45, SD = 1.79), confirming increased skills
development (t = -6.09, df = 494, p < .001) and meeting the expected semester benchmark of 7.
Competency 3: Advance Human Rights and Social, Economic, and Environmental Justice
OTG student learning outcomes. At Time 1 (Table 5-A), paired-sample t-test results
showed that OTG students rated themselves within the expected benchmark of between 2 and 6
(M = 4.83, SD = 2.20). Paired-sample t-test results showed that OTG students’ self-ratings in
their second semester of field practicum met the expected benchmark of 4 to 7 (M = 6.28, SD =
1.87). OTG students made progress in this competency from Time 1 to Time 2 (t = -18.11, df =
490, p < .001).
OTG field instructor ratings. Agency-based field instructors provided evaluations at
Time 1 (Table 5-B) on student learning outcomes that were within the expected semester
benchmark (M = 4.54, SD = 1.78) for this competency. At Time 2, the field instructors rated their
students with a higher mean score (M = 6.70, SD = 1.72). These data confirmed the significant
progress that OTG students made from Time 1 to Time 2, meeting the expected semester
benchmark (t = -34.16, df = 500, p < .001).
VAC student learning outcomes. At Time 1 (Table 5-C), VAC students reported a
mean score meeting the expected semester benchmark (M = 5.70, SD = 2.40). At Time 2, their
average score increased (M = 6.18, SD = 1.98), demonstrating professional growth in this
competency and meeting the semester benchmark (t = -2.99, df = 263, p = .003).
Competency 3 only had VAC data for 264 individuals, so the mean difference between
586A and 586B for that competency needed to be weighted less in the overall assessment of
mean difference than a competency with a sample size of 420. A weighted average was therefore
used when computing mean scores across the nine competency areas.
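The weighting described above can be illustrated with the Time 1 to Time 2 gains in VAC self-ratings reported in this chapter for Competencies 1 through 3; the sample sizes are approximations inferred from the reported degrees of freedom, so the numbers below are illustrative:

```python
# Time 1-to-Time 2 gains in VAC self-rating means and approximate sample sizes
gains = {"Competency 1": (0.39, 422),   # 7.03 - 6.64
         "Competency 2": (0.44, 420),   # 6.62 - 6.18
         "Competency 3": (0.48, 264)}   # 6.18 - 5.70

# Sample-size-weighted mean gain: a competency rated by fewer students
# (e.g., Competency 3, n = 264) counts proportionally less in the overall mean.
weighted_gain = (sum(g * n for g, n in gains.values()) /
                 sum(n for _, n in gains.values()))
```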
VAC field instructor ratings. At Time 1 (Table 5-D), VAC instructors rated their
students within expected semester benchmark (M = 5.37, SD = 1.88). At Time 2, agency field
instructors rated VAC students again within the expected semester benchmark of 4 to 7 (M = 6.13,
SD = 2.00), similar to the students’ self-ratings. VAC students achieved progress in this
competency by the end of their first-year field practicum (t = -5.41, df = 338, p < .001).
Competency 4: Engage in Practice-Informed Research and Research-Informed Practice
OTG student learning outcomes. Paired-sample t-test results (Table 5-A) showed that
students reached the expected semester benchmark at Time 1 (M = 4.86, SD = 1.94). OTG
students’ self-ratings in their second semester of field practicum reflected that they met the
expected targeted benchmark (M = 6.48, SD = 1.64). OTG students made significant progress in
this competency from Time 1 to Time 2 (t = -22.00, df = 524, p < .001).
OTG field instructor ratings. Agency-based field instructors evaluated student learning
outcomes at Time 1 with a mean score within the expected semester benchmark (M = 4.62,
SD = 1.60; see Table 5-B). At Time 2, the field instructors rated their students within the
expected semester benchmark (M = 6.76, SD = 1.54). These data confirmed the significant
progress that OTG students made from Time 1 to Time 2, meeting the expected semester
benchmark (t = -36.39, df = 523, p < .001).
VAC student learning outcomes. At Time 1 (Table 5-C), VAC students rated
themselves within the expected benchmark (M = 5.55, SD = 2.24). At Time 2, VAC students
reported a score within the expected semester benchmark (M = 5.95, SD = 1.78), demonstrating
progress in this competency (t = -3.23, df = 419, p = .001).
VAC field instructor ratings. At Time 1 (Table 5-D), VAC field instructors rated their students
within the expected semester benchmark (M = 5.55, SD = 1.73). At Time 2, agency field
instructors rated VAC students with a mean score (M = 5.76, SD = 2.07) only slightly higher than
in the first semester; this change was not statistically significant (t = -1.83, df = 494, p = .067),
although ratings remained within the expected semester benchmark for this competency.
Competency 5: Engage in Policy Practice
OTG student learning outcomes. Paired-sample t-test results (Table 5-A) showed that
OTG students at Time 1 rated themselves within the expected semester benchmark (M = 4.54,
SD = 2.03). At Time 2, OTG students reported meeting the expected semester benchmark for this
competency (M = 6.00, SD = 1.80). OTG students made significant progress in this competency
from Time 1 to Time 2 (t = -19.02, df = 500, p < .001).
OTG field instructor ratings. At Time 1 (Table 5-B), field instructors gave a mean
score within the expected semester benchmark (M = 4.41, SD = 1.69) for this competency. At
Time 2, the field instructors rated their students with a higher mean score (M = 6.63, SD = 1.68).
These data confirmed the significant progress that OTG students made from Time 1 to Time 2,
meeting the expected semester benchmark of 7 at Time 2 (t = -32.44, df = 507, p < .001).
VAC student learning outcomes. At Time 1 (Table 5-C), VAC students provided an
average self-rating for this competency of 5.45 (SD = 2.18). VAC students rated themselves
within the expected benchmark. At Time 2, their average score increased (M = 5.96, SD = 1.88),
demonstrating progress made from Time 1 to Time 2 (t = -3.47, df = 288, p = .001).
VAC field instructor ratings. At Time 1 (Table 5-D), VAC instructors rated their
students within the expected semester benchmark (M = 5.36, SD = 1.78). At Time 2, agency field
instructors rated VAC students higher (M = 5.89, SD = 1.88) than at Time 1, confirming
increased skills development during the semester (t = -4.12, df = 358, p < .001).
Competency 6: Engage with Individuals, Families, Groups, Organizations, and
Communities
OTG student learning outcomes. Paired-sample t-test results (Table 5-A) showed OTG
students rated themselves at Time 1 within the expected semester benchmark (M = 5.45, SD =
1.89) based on self-ratings for five subitems. At Time 2, OTG students again reported meeting
the expected semester benchmark (M = 6.97, SD = 1.56). OTG students made significant
progress in this competency from Time 1 to Time 2 (t = -21.95, df = 524, p < .001).
OTG field instructor ratings. Agency-based field instructors provided student
evaluations at Time 1 (Table 5-B) with an average score of 5.27 (SD = 1.60) for this
competency, meeting the semester benchmark. At Time 2, the field instructors rated their
students with a higher mean score (M = 7.29, SD = 1.45) than the first semester and exceeding
the expected semester benchmark of 7. This confirmed the significant progress made by OTG
students from Time 1 to Time 2 (t = -37.96, df = 523, p < .001).
VAC student learning outcomes. VAC students (Table 5-C) rated themselves higher
than the expected semester benchmark at Time 1 (M = 5.97, SD = 2.27). At Time 2, VAC
students reported progress made in this competency (M = 6.57, SD = 1.65; t = -5.25, df = 421, p
< .001).
VAC field instructor ratings. VAC instructors (Table 5-D) rated their students with an
average score of 5.79 (SD = 1.68), meeting the expected semester benchmark. At Time 2, agency
field instructors rated VAC students with an increased mean score of 6.52 (SD = 1.73),
confirming increased skills development from Time 1 to Time 2 that met the expected semester
benchmark (t = -7.16, df = 495, p < .001).
Competency 7: Assess Individuals, Families, Groups, Organizations, and Communities
OTG student learning outcomes. Paired-sample t-test results (Table 5-A) showed that
OTG students at Time 1 rated themselves within the expected semester benchmark (M = 4.81,
SD = 1.99) for three subitems. OTG students’ self-ratings in their second semester of field
practicum also fell within the expected semester benchmark (M = 6.58, SD = 1.63). OTG
students made significant progress in this competency from Time 1 to Time 2 (t = -24.64, df =
524, p < .001).
OTG field instructor ratings. Agency-based field instructors provided evaluations at
Time 1 (Table 5-B) and indicated a lower average score (M = 4.66, SD = 1.65) compared to their
students’ self-ratings. However, at Time 2, the field instructors rated their students with a higher
mean score (M = 6.92, SD = 1.49). This confirmed the significant progress that OTG students made from Time 1 to Time 2, meeting the expected semester benchmark (t = -39.47,
df = 523, p < .001).
VAC student learning outcomes. VAC students (Table 5-C) rated themselves at Time 1 with an average score (M = 5.66, SD = 2.26) for three subitems, falling within the expected semester benchmark. At Time 2, VAC students reported progress in this competency (M = 6.11, SD = 1.85; t = -3.55, df = 420, p < .001).
VAC field instructor ratings. At Time 1, VAC instructors (Table 5-D) rated their
students with an average score of 5.48 (SD = 1.75), which was within the expected semester
benchmark. At Time 2, agency field instructors rated the VAC students with an average score of
6.12 (SD = 1.85), similar to the students’ self-ratings and within the expected semester
benchmark in this competency (t = -5.98, df = 495, p < .001).
Competency 8: Intervene with Individuals, Families, Groups, Organizations, and
Communities
OTG student learning outcomes. Paired-sample t-test results (Table 5-A) showed that
OTG students rated themselves at Time 1 with an average score within the expected benchmark
(M = 4.42, SD = 2.04) for five subitems. OTG students in their second semester of field
practicum reported an average score of 6.25 (SD = 1.72), again within the expected semester
benchmark (t = -25.12, df = 524, p < .001).
OTG field instructor ratings. Agency-based field instructors (Table 5-B) rated student
learning outcomes at Time 1 with an average score of 4.36 (SD = 1.70), meeting the expected
semester benchmark. At Time 2, the field instructors rated their students with a higher mean
score of 6.71 (SD = 1.60) that confirmed the significant progress that OTG students made from
Time 1 to Time 2 and met the expected semester benchmark of 6 (t = -39.91, df = 523, p < .001).
VAC student learning outcomes. At Time 1, VAC students (Table 5-C) rated
themselves within the expected benchmark of 5 (M = 5.06, SD = 2.39). At Time 2, VAC students
showed growth (M = 5.72, SD = 2.00), demonstrating progress and meeting the semester
benchmark (t = -5.11, df = 418, p < .001).
VAC field instructor ratings. VAC instructors (Table 5-D) rated their students at Time
1 with an average score of 5.30 (SD = 1.65), which was within the expected semester benchmark.
At Time 2, agency field instructors rated the VAC students with the same mean score as the
students’ self-ratings (M = 5.72, SD = 2.00), meeting the semester benchmark (t = -3.77,
df = 493, p < .001).
Competency 9: Evaluate Practice with Individuals, Families, Groups, Organizations, and
Communities
OTG student learning outcomes. Paired-sample t-test results (Table 5-A) showed that OTG students at Time 1 reported an average score (M = 4.27, SD = 2.16), meeting the expected semester benchmark. At Time 2, OTG students’ self-ratings had an average score (M = 5.87, SD = 1.84), within the semester benchmark. OTG students made significant progress in this
competency from Time 1 to Time 2 (t = -19.05, df = 509, p < .001).
OTG field instructor ratings. At Time 1, agency-based field instructors (Table 5-B)
gave a mean score of 4.05 (SD = 1.82), meeting the expected semester benchmark. At Time 2,
the field instructors rated their students with a higher mean score (M = 6.24, SD = 1.73) that
confirmed the progress that OTG students made from Time 1 to Time 2 and met the expected
semester benchmark (t = -31.99, df = 506, p < .001).
VAC student learning outcomes. VAC students self-assessed their own learning at
Time 1 (Table 5-C) with an average score of 5.18 (SD = 2.38), within the expected semester
benchmark. At Time 2, VAC students reported a higher mean score (M = 5.72, SD = 2.02),
demonstrating progress in skills development (t = -3.28, df = 265, p = .001) and meeting the
semester benchmark.
VAC field instructor ratings. At Time 1, VAC instructors (Table 5-D) rated their
students with a lower average score (M = 4.79, SD = 1.91) compared to students’ average rating.
At Time 2, agency field instructors reported an increased mean score (M = 5.56, SD = 2.05) that
met the expected semester benchmark and confirmed progress in this competency (t = -5.95, df =
398, p < .001).
Similarities and Differences: OTG and VAC Students
In the independent t-test results (Table 5-E) at Time 1, VAC students rated themselves higher overall (M = 5.96, SD = 2.05) than OTG students (M = 5.26, SD = 1.77) across the nine competencies, a statistically significant difference (t = -5.56, df = 836.67, p < .001).
At Time 2 (Table 5-F), both groups of students made progress in all competencies. OTG students rated themselves higher overall (M = 6.75, SD = 1.45) than VAC students (M = 6.37, SD = 1.54; t = 3.98, df = 946, p < .001), although the differences for Competencies 3, 5, and 9 were not statistically significant.
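The fractional degrees of freedom reported at Time 1 (df = 836.67) suggest Welch's unequal-variance form of the independent t-test. The sketch below computes Welch's t from group summary statistics; the group sizes are illustrative assumptions, not figures taken from the tables:

```python
# Welch's independent-samples t-test computed from summary statistics only.
import math

def welch_t(m1, sd1, n1, m2, sd2, n2):
    """Return Welch's t and its Welch-Satterthwaite degrees of freedom."""
    v1, v2 = sd1**2 / n1, sd2**2 / n2          # per-group variance of the mean
    t = (m1 - m2) / math.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1**2 / (n1 - 1) + v2**2 / (n2 - 1))
    return t, df

# Time 1 overall self-rating summaries from the text (OTG vs. VAC);
# the group sizes 525 and 420 are assumptions for illustration.
t, df = welch_t(5.26, 1.77, 525, 5.96, 2.05, 420)
print(f"t = {t:.2f}, df = {df:.2f}")
```

With these assumed group sizes the sketch lands close to the reported values, but it is not a reconstruction of the study's actual computation.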
Similarities and Differences: Field Instructor Ratings
At Time 1, OTG field instructors (Table 5-G) rated their students across all competencies with a lower mean score (M = 5.06, SD = 1.48) compared to VAC instructors (M = 5.78, SD = 1.58); VAC students thus received higher mean scores in the first semester than OTG students. The difference between the groups was greatest in Competencies 4, 5, and 8, whereas the remaining competencies were not statistically significantly different (t = -7.57, df = 1,019, p < .001).
At Time 2, OTG field instructors (Table 5-H) rated their students with a higher mean
score (M = 7.07, SD = 1.37) compared to VAC agency-based field instructors (M = 6.32, SD =
1.63) across the nine competencies (t = 7.96, df = 969.73, p < .001). OTG students received
higher mean scores beyond the expected semester benchmark of 7 in Competencies 1, 2, and 6.
Summary of Findings
Based on the data collected, the results were significantly positive and confirmed the
hypothesis. That is, both OTG and VAC students reached the expected semester benchmarks at
the end of their first-year field practicum and demonstrated a capacity to meet all CSWE core
competencies, not only through self-assessments but also confirmed by their assigned agency
field instructors, who were independent from the university.
Although VAC students rated themselves higher in all nine competencies upon completion of the VFP (Time 1), their progress in the second semester was significantly smaller than that of their OTG cohort. Despite the change in placement setting at Time 2, VAC
students continued to assess their own learning outcomes higher than their assigned agency-
based field instructors, except for Competency 8. VAC students did make progress in all
competencies, with an average score of 6.32 (Table 5-D) as rated by their agency-based field
instructors, compared to OTG field instructors’ rating of 7.07 (Table 5-B).
In contrast, OTG students clearly rated themselves lower than their field instructors but
higher than their VAC cohort at Time 2. Agency-based field instructors reported significant
growth in their OTG students after two semesters of field experience, similar to findings reported
by Vinton and Wilke (2011). This finding confirmed the results of the 2015 exploratory study
based on the 2008 EPAS and the hypothesis that both VAC and OTG students could achieve the
expected learning benchmarks.
According to ratings by OTG field instructors for both semesters, students improved the
most in Competencies 2–4 and 6–8 by the end of the second semester (Table 5-B). Field faculty
members, through collaborative discussion, identified that Competencies 1, 2, and 6–8
represented the most crucial skills for first-year students to develop. It was very reassuring that
Competency 4 was among the most improved student learning outcomes for OTG students,
given the curriculum’s emphasis on EBI core trainings.
Surprisingly, ratings for Competency 4 for VAC students did not show a similar result
compared to OTG students. Although the VFP course emphasized Competency 4 (engage in practice-informed research and research-informed practice; Table 5-D), VAC agency field
instructors did not observe growth in this competency at Time 2 with their VAC students. This
needs to be further explored in future studies.
In the second semester, field instructors’ ratings for OTG and VAC students (Table 5-G) were very close for Competencies 3, 5, and 9, whereas field instructors
identified that OTG students performed slightly higher than VAC students in Competencies 2, 4,
and 8.
Figures in the tables (5-A through 5-H) have been double-checked and confirmed as
correct. It may appear that the overall mean difference (reported in the final pair of rows in each
table) is smaller than the average of all the separate mean differences reported elsewhere in the
table. However, because the different competency areas have different sample sizes, their mean
differences need to be weighted to ensure a fair comparison. In Table 5-C, for example,
Competency 3 only had data for 264 individuals, so the mean difference between 586A and
586B for that competency area (0.48) needed to be weighted less in the overall assessment of
mean difference than a competency area with more ratings. The weighted average of the
differences in mean scores across the nine competency areas for Table 5-C was 0.44, which is
larger than the overall mean difference of 0.41.
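The weighting logic described above can be sketched briefly. Competency 3's figures (n = 264, mean difference 0.48) come from the text; the other two entries are hypothetical placeholders used only to show the mechanics:

```python
# Sample-size-weighted vs. unweighted average of per-competency mean
# differences. Only Competency 3's numbers come from the text; the rest
# are hypothetical.
ratings = {
    "Competency 3": (264, 0.48),   # (n ratings, mean difference) — from text
    "Competency 6": (422, 0.60),   # hypothetical
    "Competency 8": (419, 0.66),   # hypothetical
}

total_n = sum(n for n, _ in ratings.values())
weighted = sum(n * d for n, d in ratings.values()) / total_n
unweighted = sum(d for _, d in ratings.values()) / len(ratings)

print(f"unweighted = {unweighted:.2f}")  # → unweighted = 0.58
print(f"weighted   = {weighted:.2f}")    # → weighted   = 0.59
```

Because the smallest sample (n = 264) also carries the smallest difference, weighting pulls the combined estimate toward the larger groups, analogous to the Table 5-C case (weighted 0.44 versus overall 0.41).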
Conclusion
At Time 1, based on the independent t-test results, a significant difference existed in
evaluation scores for all nine competencies. VAC students rated themselves higher compared to
the OTG cohort’s self-ratings across all nine competencies. A similar result was reflected in the
field instructors’ ratings at Time 1: VAC students received a higher average score than OTG
students across all nine competencies. However, mean scores from both groups fell into the
expected semester range of 2–6.
At Time 2, based on the independent t-test results, significant differences also existed in
evaluation scores. VAC students rated themselves lower compared to the OTG cohort across all
nine competencies. Field instructors reported similar results: VAC students received a lower
mean score than OTG students across all nine competencies at the end of the second semester.
Overall, the results show that OTG students received higher final evaluations, beyond the
expected benchmark of 7, whereas VAC students fell within the expected range of 4–7, as
evidenced by ratings from their assigned field instructors.
The next chapter features discussion of the findings, limitations of this capstone study,
implications for practice and future research, and recommendations for program evaluation and
implementation.
Chapter 5: Innovation to Practice
Introduction
This chapter discusses how VAC students, without being placed in a traditional field
placement site, received standardized EBI skills training through their VFP program and
achieved core competencies. With no relevant literature available based on the 2015 EPAS, the
results of this study represent important knowledge about how to evaluate MSW students in their
first-year field practicum. These student learning outcomes data served as a significant
foundation for the school’s reaffirmation in 2016 and were used as a part of the 2017
reaccreditation site visit—the first reaccreditation since the inception of the VAC program in
2010. This chapter also discusses the development of the standardized student evaluation tool,
limitations, implications for practice, recommendations, and future studies.
Outcomes
The outcomes were significantly positive and confirmed the hypothesis. That is, both
OTG and VAC students reached the expected semester benchmarks at the end of their first-year
field practicum and demonstrated a capacity to meet CSWE’s nine core competencies. This was
confirmed not only through their own assessments but also by their assigned agency field
instructors, who were independent from the university.
Figure 5: Student Learning Outcomes (Students’ Self-Ratings and Field Instructors’ Evaluations)
Note. Blue = OTG students; orange = VAC students; semester benchmark = 2–6 (first semester) and 4–7 (second
semester).
Discussion
Standardized Student Comprehensive Skills Evaluation Tool
The USC Suzanne Dworak-Peck School of Social Work is part of the Southern California
Field Director Consortium, which consists of 11 MSW programs. In the past, all schools shared
one evaluation tool based on the 2008 EPAS. However, since the introduction of the 2015 EPAS,
USC developed an instrument based on the nine core competencies for both generalist and
advanced practice behaviors, given the school’s new curriculum and departmental structure.
These instruments can be shared with other MSW programs if appropriate. Some agency field
instructors experienced difficulty completing the electronic evaluation tool and needed more
technical support and guidance.
To further support field instructors’ use of competency-based evaluations (Bogo, 2006;
Ornstein & Moses, 2010) and serve as gatekeepers of the social work profession, USC leaders
sought to make this tool more user friendly, encouraging the field instructors to review the
evaluation with their students through an online platform throughout the academic year. The use
of practical and relevant language for placement settings with specific examples (rubrics)
enabled students and field instructors to complete the evaluation with clearly defined
instructions. This implementation was a huge step forward, specifically for the VAC field
program, because field education officials can now draw electronic data from both OTG and
VAC programs to make data-driven decisions and program improvements.
OTG field instructors had used an online evaluation for 2 years prior to the new
evaluation tool. The response rates for completed evaluations by OTG students and field
instructors were much higher than those of their VAC counterparts. The online field evaluation platform
(used by the VAC) was built by the USC Suzanne Dworak-Peck School of Social Work’s
Information Technology Department in spring 2015 and was implemented in fall 2015. Despite
great efforts to disseminate training videos to field faculty liaisons, students, and field
instructors, the implementation encountered some technical difficulties. These difficulties might
account for the lower response rate and increased missing data from VAC students and field
instructors as compared to OTG rates.
VAC students and field instructors might have had difficulty with some ratings and chose
to leave some subitems blank, which would result in a default rating of 0. For example,
Competency 3 only had 264 ratings, so the mean difference between 586A and 586B for that
competency needed to be weighted less in the overall assessment of mean difference than a
competency with 420 ratings. The simple (unweighted) average of the separate mean differences can therefore differ from the overall mean difference; a weighted average resolves this apparent discrepancy by capturing all available data.
Beyond being unfamiliar with the electronic evaluation tool, VAC students and field
instructors were also unfamiliar with the 2015 core competencies regarding how they apply to
the VFP setting and standard community-based field placements. Several trainings were held to
help the VAC instructors gain a better understanding of the core competencies and how to rate
students based on the expected semester benchmark range.
Compared to OTG students, VAC students rated themselves much higher in their first semester. This could be because VAC students were older, had prior work experience, and had less performance anxiety working with a simulated client rather than with real clients in field placements: they worried less about making mistakes or causing harm because they knew the client was an actor.
When students and field instructors identify any competency as not meeting the semester
benchmark, this tool is intended to promote early identification and intervention to assist students
who might be struggling in the field. In 2013, the USC Suzanne Dworak-Peck School of Social
Work developed a field education manual that has been updated annually (USC Suzanne
Dworak-Peck School of Social Work, 2017). The manual outlines specific policies and
protocols, including roles and responsibilities in the field program, specifically in Section IV:
Failure to Make Satisfactory Progress in Field Education. In consultation and collaboration with
agency field instructors, faculty liaisons are tasked with identifying and intervening when
concerns about student performance arise, as reported by Sowbel (2012). Field liaisons take an
active role to ensure student performance improvement (USC Suzanne Dworak-Peck School of
Social Work, 2017) through active participation with agency field instructors and students to
discuss specific expectations in relevant competencies. Field liaisons and agency field instructors
are expected to help students achieve the expected competencies through three levels of intervention. The ultimate goal of gatekeeping the social work profession and fully implementing a fair, realistic, and collaborative measurement of students’ competencies in the field practicum (Sowbel, 2011) is to train competent future social workers and prevent harm to clients and field
agencies. It will be a continuous collective effort to examine the admission criteria, curriculum,
and field education (Bogo et al., 2007; Colby, 2013; Gelman, 2004; Gelman et al., 2008).
Student Self-Assessment
Despite clear and specific instructions regarding the expected semester benchmark, VAC
students were adamant about receiving a perfect score for each subitem in their first semester and
would seek a higher rating if the instructors gave a lower rating than their own ratings. Many
students felt they should receive a 10 during their first field practicum, which demonstrated their
expectations and confidence in performing at a higher standard. Another possible explanation is that VAC students spent 4 hours a week interacting with faculty members and received instant feedback while practicing and role-playing with a simulated client. By contrast, OTG students might have seen their field instructors for only 1 hour a week and could not receive consultation and feedback as timely as their VAC cohort did in the VFP.
Despite some missing data from VAC students and field instructors at Time 1 (resulting in a default score of 0), mean scores from VAC students and field instructors still met the expected semester benchmark, as did those reported by OTG students and field instructors. The reasons for these students’ expectations were unclear, but it could be that the VAC students were older (average age of 34) or already had experience in a social services agency or a different professional career. VAC students also made progress after transitioning to community-based field internships in the second semester, as evaluated by different field instructors. VAC students might not have had much exposure in their VFP courses to Competencies 3 (advance human rights and social, economic, and environmental justice), 5 (engage in policy practice), and 9 (evaluate practice with individuals, families, organizations, and communities), given the limitations of a virtual field practicum without interaction with agency staff and diverse client populations. Nevertheless, they reached the semester benchmark in their community agency placements at Time 2.
Because of the emphasis on EBI skills training during the first semester, VAC students
were expected to gain more confidence with Competency 8, but that was not the case. VAC
students rated themselves at 5.06, lower than the VAC instructors’ rating of
5.30. In the second semester, students and field instructors reported the same rating of 5.72, but it
was still one of the lowest ratings among the nine competencies. The reason for this is unclear
and requires further exploration with students and field instructors in future studies. Many
agency field instructors might not be familiar with EBI trainings or aware of whether and how
students have been trained in these interventions. The USC Suzanne Dworak-Peck School of
Social Work has been at the forefront of integrating EBIs in field course content to promote
students’ professional growth and development. In the second semester, VAC students continued
to rate themselves higher than agency-based field instructors for all competencies except
Competency 8.
Conversely, OTG students rated themselves higher than their field instructors in the first
semester but lower in their second semester. As they gained more field experience, they seemed
to need more support and supervision to develop professional skills. In general, both OTG and
VAC students appeared to be confident in most of their skills, knowledge, and application in the
first semester. As they entered the second semester, they seemed to face a learning curve in terms
of reaching a more advanced practice level.
Field Instructor Evaluations
Information shared in the literature review described how field instructors prefer
mentorship and supervision when guiding students’ professional growth in the social work field.
However, these instructors have not received specific training for how to evaluate MSW interns
given the competency-based evaluation established in 2008 by the CSWE. Social work educators
might have assumed that field instructors understood the core competencies and could provide
objective feedback and assessments regarding student performance. Each field instructor
possesses unique training, knowledge, and experiences in social work; therefore, their teaching
plans, supervision styles, and personalities are also varied. Some field instructors might have a
more administrative role and responsibilities in their daily function in their agencies, whereas
others might be clinicians who provide clinical interventions to clients, families, and
communities. Thus, the working relationship between the field instructor and student could differ
among dyads and potentially influence the instructors’ perceptions and evaluations of their
students.
VAC data indicated that VAC instructors gave high ratings in the first semester beyond
the designated benchmark for all competencies except Competency 9. As mentioned in the
literature review, field instructors often are lenient when evaluating their students due to their
frequent interaction and working relationship. VAC instructors met with students 4 hours a week
and had constant contact with their students. One of the instructors admitted that she provided a
higher score to encourage and incentivize her students to participate in live role-play sessions
and other learning activities. USC officials have emphasized the importance of student
evaluations of their instructors as a way of maintaining a high quality of instruction. VAC
students were asked to provide feedback on not only the course, but also their instructor at the
end of each semester. These evaluations could create an awkward situation for VAC instructors,
in that they were evaluating their students while being evaluated by their students.
Having received high ratings at Time 1 from their VFP instructors, students had very few
opportunities to demonstrate significant progress after they were placed in their community-
based internships in their second semester. This change in field placement setting might have
contributed to the relatively lower ratings that VAC students received from their agency-based
field instructors compared to OTG students, who spent two semesters in one assigned field
placement. OTG field instructors gave much higher ratings, particularly for Competencies 2, 4,
and 6–8.
Limitations
One factor that could not be controlled was the change in instructors for VAC students.
VAC students were evaluated by their assigned VAC instructors in their first semester and by
agency field instructors at the end of their second semester due to their transition to community-
based field placement settings. This might have influenced the validity of the data, given two
different raters compared to only one evaluator for OTG students.
Other variables might not have been captured in this capstone project and could possibly
influence student learning outcomes, such as students’ gender, age, prior social work knowledge,
experience, training, educational background, characteristics, learning styles, motivations, and
work ethic. On the other hand, field instructors represent another variable to explore,
because they might have unique ways of interpreting the 2015 core competencies based on their
educational background, professional experiences, and training that could determine how they
evaluate their students’ performance and competencies.
Due to the new 2015 EPAS and changes in the reaccreditation protocol, the CSWE
required different measurements to evaluate students on generalist and advanced practice
behaviors specific to each competency in their chosen specialized study. Longitudinal research
completed in the 2015 exploratory study could not be replicated using the same measurement to
assess the trajectory of the students’ skills development during the four semesters of the field
practicum.
Recommendations for future research include adding qualitative methodology through
surveys or focus groups and further exploring students’ perspectives of their field experiences
and field instructors’ experiences and perspectives regarding teaching, instructing, and
evaluating student performance. These additional data would be very helpful to gain more in-
depth understanding of student learning outcomes and inform specific strategies and trainings to
enhance field education programs for students, faculty instructors, and field instructors.
Despite the increase in online MSW programs in the United States, very few studies have
assessed student learning outcomes in the field. Because each school uses different evaluation
tools based on its unique curriculum design and focus, it might not be possible to have a uniform
evaluation tool across MSW programs. Creating a validated evaluation tool fit for individual
unique programs could be difficult and daunting.
In a quasi-experimental design, the researcher lacks control over assignment to conditions
and does not manipulate the causal variable of interest. Reliability can further be discussed when
the instrument used in this study has been piloted with repeated measures and validated over
time via various cohorts of students to generate sufficient data. With replication and repeated
measurement, this instrument could eventually be validated and produce data for program
evaluation that could lead to further improvement in curriculum and trainings for field instructors
regarding how to objectively evaluate students.
Implications for Practice
The 2015 exploratory study (based on the 2008 EPAS) showed promising results. This
capstone project reported that VAC students achieved higher competency scores than OTG
students at Time 1. In their second semester, OTG students demonstrated higher skills
development compared to VAC students. This confirmed that VFP is as effective as the
traditional field model. Students can learn the essential skills via competency-based trainings.
The original plan for this capstone project was to replicate the longitudinal quasi-
experimental design using nonequivalent groups and the same comprehensive skills evaluation
tool across four semesters of field practicum based on the 2015 EPAS requirements.
Subsequently, the USC social work faculty, under direction from the accreditation review
committee, developed a department-specific evaluation tool during summer 2016 that was
implemented in the 2016–2017 academic year based on the new departmental curriculum.
Although this study focused only on the first two semesters of student learning outcomes for the
purposes of program evaluation, it is the goal to continue to assess student trajectories in skills
development in nine competencies across the four semesters of field practicum.
Further research is needed to compare data compiled from the 2015–2016 and 2016–2017
academic years with the first two semesters of field practicum and to examine advanced-practice
student learning outcomes at the second, third, and fourth semesters. The results of the three-
semester evaluations in the two student groups will be collected and analyzed by summer 2018.
Contributions to Practice
An innovative program requires specific strategies to successfully implement and achieve
expected positive outcomes. The contributions of this study include providing not only solid
evidence of positive student learning outcomes for the distance learning field education program,
but also validating the effectiveness of an innovative field model, the VFP. The use of weekly
consultation groups for VAC instructors as a voluntary self-organizing governance mechanism
and a collaborative learning format ensured the success of this VFP implementation. These
groups offered an authentic and organic forum to promote problem-based learning and teaching
among all instructors and created a feedback system to refine the VFP curriculum. The
instructors, thus, could model these learning styles and integrate them into their live sessions
with students. In this case, faculty members served as the most valuable human capital in the
educational system.
These 2015–2016 student learning outcomes served as valuable data not only during the
reaffirmation self-study but also for the accreditor during the September 2017 site review. The
reviewer gave a favorable evaluation during her visit and recommended another 8 years of
accreditation. She used the term “golden egg” when she met with the USC provost after her
review. These positive student learning outcomes will forever change the perception of social
work distance learning field education. Through innovation and technology advancement, the
VFP model can be viewed as a best practice and replicated by other online and on-campus MSW
programs.
Sharing these data with the faculty of the USC Suzanne Dworak-Peck School of Social
Work is important. The results will also be shared at the third CSWE Social Work Distance
Education Conference in 2017 and the 2018 CSWE annual conference to generate further
discussion. Furthermore, the results might interest other Southern California MSW program
directors, who could adapt this evaluation tool for their own programs with appropriate
modifications to fit their curriculum designs and needs.
Conclusion, Recommendations, and Future Studies
The VFP model created a ripple effect in the USC Suzanne Dworak-Peck School of
Social Work. Since its full implementation in May 2014, responses from students, agency field
instructors, and faculty members have been overwhelmingly positive. Prospective employers
demand and expect MSW graduates to be ready to enter the workplace sooner than ever, an
expectation validated by employer survey results completed during summer 2014 and provided
by Dr. Wood (2014). Dr. Phillips mentioned during a field retreat meeting in May 2017 that the
VFP is among the best-rated courses according to student evaluations. This outcome changed the
fabric of social work education and initiated the 2015 curriculum revision. Instead of completing
two semesters of generalist practice courses and field placement, USC MSW students now
complete a one-semester generalist course and field placement. They then enter the
department's specialized practice curriculum and field placement in their second semester and
engage in a much higher level of integrated learning and skills application. However, this new
curriculum requires commitment, determination, and perseverance from students, as well as
faculty readiness to deliver and refine it. The aim is to prepare MSW graduates
to enter the global job market and be future social work leaders to address the 12 Grand
Challenges for Social Work.
The results of this capstone project indicate the need to further explore semester-based
learning benchmarks, given that both OTG and VAC students rated themselves higher than
expected. When the 2016–2017 student learning outcomes data become available, a small faculty
committee should review and modify the benchmarks. To refine the evaluation tool, surveys
should be created to gather student and field instructor feedback, especially on its relevance
and user-friendliness. Focus groups with students and field instructors, guided by structured
questions to collect data for qualitative analysis, would also be informative.
References
American Academy of Social Work & Social Welfare. (2017). 12 grand challenges for social
work. Retrieved from http://aaswsw.org/grand-challenges-initiative/12-challenges/
Baartman, L., & Ruijs, L. (2011). Comparing students’ perceived and actual competence in
higher vocational education. Assessment & Evaluation in Higher Education, 36, 385–
398. doi:10.1080/02602938.2011.553274
Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory.
Englewood Cliffs, NJ: Prentice-Hall.
Barwick, M. (2011). Master's level clinician competencies in child and youth behavioral
healthcare. Emotional & Behavioral Disorders in Youth, 11, 29–58.
Bender, K., Altschul, I., Yoder, J., Parrish, D., & Nickels, S. J. (2014). Training social work
graduate students in the evidence-based practice process. Research on Social Work
Practice, 24, 339–348. doi:10.1177/1049731513506614
Bertram, R. M., King, K., Pederson, R., & Nutt, J. (2014). Implementation frameworks and
MSW curricula: Encouraging pursuit and use of model pertinent data. Journal of
Evidence-Based Social Work, 11, 193–207. doi:10.1080/15433714.2013.850324
Bogo, M. (2006). Field instruction in social work: A review of the research literature. Clinical
Supervisor, 24, 163–193. doi:10.1300/J001v24n01_09
Bogo, M. (2015). Field education for clinical social work practice: Best practices and
contemporary challenges. Clinical Social Work Journal, 43, 317–324.
doi:10.1007/s10615-015-0526-5
Bogo, M., Regehr, C., Power, R., & Regehr, G. (2007). When values collide: Field instructors'
experiences of providing feedback and evaluating competence. Clinical Supervisor, 26,
99–117. doi:10.1300/J001v26n01_08
Calderon, O. (2013). Direct and indirect measures of learning outcomes in an MSW program:
What do we actually measure? Journal of Social Work Education, 49, 408–419.
doi:10.1080/10437797.2013.796767
Campbell, D. T., & Stanley, J. C. (1966). Experimental and quasi-experimental designs for
research. Chicago, IL: Rand McNally.
Carey, M. E., & McCardle, M. (2011). Can an observational field model enhance critical
thinking and generalist practice skills? Journal of Social Work Education, 47, 357–366.
doi:10.5175/JSWE.2011.200900117
Carpenter, J. (2011). Evaluating social work education: A review of outcomes, measures,
research designs and practicalities. Social Work Education, 30, 122–140.
doi:10.1080/02615479.2011.540375
Carraccio, C., Wolfsthal, S. D., Englander, R., Ferentz, K., & Martin, C. (2002). Shifting
paradigms: From Flexner to competencies. Academic Medicine, 77, 361–367.
doi:10.1097/00001888-200205000-00003
Carter, I., Bornais, J., & Bilodeau, D. (2011). Considering the use of standardized clients in
professional social work education. Collected Essays on Learning and Teaching, 4, 95–
102. doi:10.22329/celt.v4i0.3279
Christenson, B., Delong-Hamilton, T., Panos, P., Krase, K., Buchan, V., Farrel, D., …
Rodenhiser, R. (2015). Evaluating social work education outcomes: The SWEAP Field
Practicum Placement Assessment Instrument (FPPAI). Field Educator, 5, 1–13.
Colby, I. C. (2013). Rethinking the MSW curriculum. Journal of Social Work Education, 49, 4–
15. doi:10.1080/10437797.2013.755404
Council on Social Work Education. (2008). 2008 educational policy and accreditation standards.
Retrieved from https://www.cswe.org/Accreditation/Standards-and-Policies/2008-EPAS
Council on Social Work Education. (2015). 2015 educational policy and accreditation standards.
Retrieved from http://www.cswe.org/Accreditation/EPASRevision.aspx
Council on Social Work Education. (2017). Online and distance learning. Retrieved from
https://www.cswe.org/Accreditation/Distance-Education.aspx
Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods
approaches (4th ed.). Thousand Oaks, CA: Sage.
Cummings, S. M., Chaffin, K. M., & Cockerham, C. (2015). Comparative analysis of an online
and a traditional MSW program: Educational outcomes. Journal of Social Work
Education, 51, 109–120. doi:10.1080/10437797.2015.977170
Drisko, J. W. (2015). Holistic competence and its assessment. Smith College Studies in Social
Work, 85, 110–127. doi:10.1080/00377317.2015.1017396
Early, M. T. (2007). Foundation year field instruction in a master of social work program: A
comparison study of learning outcomes for on-campus and off-campus students (Doctoral
dissertation). Old Dominion University, Norfolk, VA.
Fertig, R. D., & Rose, J. S. (2007). 100 years of social work at USC, 1906-2006: A history in
words and pictures. Los Angeles, CA: University of Southern California, Suzanne
Dworak-Peck School of Social Work.
Feser, E. (2013). Isserman’s impact: Quasi-experimental comparison group designs in regional
research. International Regional Science Review, 36, 44–68.
doi:10.1177/0160017612464051
Flynn, M., Maiden, R. P., Smith, W., Wiley, J., & Wood, G. (2013). Launching the Virtual
Academic Center: Issues and challenges in innovation. Journal of Teaching in Social
Work, 33, 339–356. doi:10.1080/08841233.2013.843364
Fortune, A. E., McCarty, M., & Abramson, J. S. (2001). Student learning processes in field
education: Relationship of learning activities to quality of field instruction, satisfaction,
and performance among MSW students. Journal of Social Work Education, 37, 111–124.
Frank, J. R., Snell, L. S., Cate, O. T., Holmboe, E. S., Carraccio, C., Swing, S. R., … Harris, K.
A. (2010). Competency-based medical education: Theory to practice. Medical Teacher,
32, 638–645. doi:10.3109/0142159X.2010.501190
Frawley-O'Dea, M. G., & Sarnat, J. E. (2001). The supervisory relationship: A contemporary
psychodynamic approach. New York, NY: Guilford Press.
Gelman, C. R. (2004). Anxiety experienced by foundation-year MSW students entering field
placement: Implications for admissions, curriculum, and field education. Journal of
Social Work Education, 40, 39–54. doi:10.1080/10437797.2004.10778478
Gelman, C. R., & Lloyd, C. M. (2008). Pre-placement anxiety among foundation-year MSW
students: A follow up study. Journal of Social Work Education, 44, 173–183.
doi:10.5175/JSWE.2008.200600102
Greenfield, E. A. (2011). Developmental systems theory as a conceptual anchor for generalist
curriculum on human behavior and the social environment. Social Work Education, 30,
529–540. doi:10.1080/02615479.2010.503237
Hamilton, T. A. D. (2009). Bachelor social work students’ ratings of social work skills and
advising experience: An analysis of the national Baccalaureate Educational Assessment
Package (BEAP) exit survey data (Doctoral dissertation). Available from ProQuest
Dissertations and Theses database. (UMI No. 3374640).
Hatala, R., Keitz, S. A., Wilson, M. C., & Guyatt, G. (2006). Beyond journal clubs: Moving
toward an integrated evidence-based medicine curriculum. Journal of General Internal
Medicine, 21, 538–541. doi:10.1111/j.1525-1497.2006.00445.x
Heavey, E. (2015). Statistics for nursing: A practical approach (2nd ed.). Burlington, MA:
Jones & Bartlett Learning.
Hmelo-Silver, C. E. (2004). Problem-based learning: What and how do students learn?
Educational Psychology Review, 16, 235–266.
doi:10.1023/B:EDPR.0000034022.16470.f3
Hohman, M., Pierce, P., & Barnett, E. (2015). Motivational interviewing: An evidence-based
practice for improving student practice skills. Journal of Social Work Education, 51,
287–297. doi:10.1080/10437797.2015.1012925
Holden, G., Barker, K., Rosenberg, G., & Onghena, P. (2007). Assessing progress toward
accreditation related objectives: Evidence regarding the use of self-efficacy as an
outcome in the advanced concentration research curriculum. Research on Social Work
Practice, 17, 456–465. doi:10.1177/1049731506297474
Holden, G., Meenaghan, T., Anastas, J., & Metrey, G. (2002). Outcomes of social work
education: The case for social work self-efficacy. Journal of Social Work Education, 38,
115–133. doi:10.1080/10437797.2002.10779086
Homonoff, E. (2008). The heart of social work: Best practitioners rise to challenges in field
instruction. Clinical Supervisor, 27, 135–169. doi:10.1080/07325220802490828
Katz, E., Tufford, L., Bogo, M., & Regehr, C. (2014). Illuminating students' pre-practicum
conceptual and emotional states: Implications for field education. Journal of Teaching
in Social Work, 34, 96–108. doi:10.1080/08841233.2013.868391
Knight, C. (2001). The process of field instruction: BSW and MSW students' views of
effective field supervision. Journal of Social Work Education, 37, 357–379.
Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and
development. Upper Saddle River, NJ: Prentice Hall.
Kuhlmann, E. G. (2009). Behavioral objectives in education for social work: A philosophic
analysis. Social Work & Christianity, 36, 77–00.
Kurzman, P. A. (2013). The evolution of distance learning and online education. Journal of
Teaching in Social Work, 33, 331–338. doi:10.1080/08841233.2013.843346
Lee, M., & Fortune, A. E. (2013a). Patterns of field learning activities and their relation to
learning outcome. Journal of Social Work Education, 49, 420–438.
doi:10.1080/10437797.2013.796786
Lee, M., & Fortune, A. E. (2013b). Do we need more “doing” activities or “thinking” activities
in the field practicum? Journal of Social Work Education, 49, 646–660.
doi:10.1080/10437797.2013.812851
Logie, C., Bogo, M., Regehr, C., & Regehr, G. (2013). A critical appraisal of the use of
standardized client simulations in social work education. Journal of Social Work
Education, 49, 66–80. doi:10.1080/10437797.2013.755377
Los Angeles County Department of Mental Health. (2017). Mental Health Services Act
(MHSA). Retrieved from http://dmh.lacounty.gov/wps/portal/dmh/mhsa
Mathiesen, S. G., & Hohman, M. (2013). Revalidation of an evidence-based practice scale for
social work. Journal of Social Work Education, 49, 451–460.
doi:10.1080/10437797.2013.796793
Menefee, D. T., & Thompson, J. J. (1994). Identifying and comparing competences for social
work management: A practice driven approach. Administration in Social Work, 18, 1–25.
doi:10.1300/J147v18n03_01
Meyer-Adams, N., Potts, M. K., Koob, J. J., Dorsey, C. J., & Rosales, A. M. (2011). How to
tackle the shift of educational assessment from learning outcomes to competencies: One
program’s transition. Journal of Social Work Education, 47, 489–507.
doi:10.5175/JSWE.2011.201000017
Munro, B. H. (2005). Statistical methods for health care research (5th ed.). Philadelphia, PA:
Lippincott Williams & Wilkins.
Ornstein, E. D., & Moses, H. (2010). Goodness of fit: A relational approach to field instruction.
Journal of Teaching in Social Work, 30, 101–114. doi:10.1080/08841230903479615
Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, and
improving assessment in higher education. San Francisco, CA: Jossey-Bass.
Palomba, C. A., & Banta, T. W. (Eds.). (2001). Assessing student competence in accredited
disciplines: Pioneering approaches to assessment in higher education. Sterling, VA:
Stylus.
Phillips, B., Woods, G., Yoo, J., Ward, K., Hsiao, S. C., Singh, M., & Morris, B. (in press).
Virtual field practicum: Core competencies. Journal of Social Work Education.
Rayens, M. K., Svavarsdottir, E. K., & Burkhart, P. V. (2011). Cultural differences in parent-
adolescent agreement on the adolescent’s asthma-related quality of life. Pediatric
Nursing, 37, 311–319.
Regehr, G., Bogo, M., Regehr, C., & Power, R. (2007). Can we build a better mousetrap?
Improving the measures of practice performance in the field practicum. Journal of Social
Work Education, 43, 327–344. doi:10.5175/JSWE.2007.200600607
Rep, M. A. (2012). Standardized patients in education. Radiologic Technology, 83, 503–506.
Ringstad, R. L. (2013). Competency level versus level of competency: The field evaluation
dilemma. Field Educator, 3.2.
Rishel, C. W., & Majewski, V. (2009). Student gains in self-efficacy in an advanced MSW
curriculum: A customized model for outcomes assessment. Journal of Social Work
Education, 45, 365–383. doi:10.5175/JSWE.2009.200800101
Sackett, D. L., Richardson, W. S., Rosenberg, W., & Haynes, R. B. (1997). Evidence-based
medicine: How to practice and teach EBM. New York, NY: Churchill Livingstone.
Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental
designs for generalized causal inference. Boston, MA: Houghton Mifflin.
Sheafor, B. W., & Horejsi, C. R. (2006). Techniques and guidelines for social work practice (7th
ed.). Boston, MA: Allyn and Bacon.
Shlonsky, A., & Stern, S. B. (2007). Reflections on the teaching of evidence-based practice.
Research on Social Work Practice, 17, 603.
Sowbel, L. R. (2011). Gatekeeping in field performance: Is grade inflation a given? Journal of
Social Work Education, 47, 367–377. doi:10.5175/JSWE.2011.201000006
Sowbel, L. R. (2012). Gatekeeping: Why shouldn’t we be ambivalent? Journal of Social Work
Education, 48, 27–44. doi:10.5175/JSWE.2012.201000027
Strozier, A. L., Barnett-Queen, T., & Bennett, C. K. (2000). Supervision: Critical process and
outcome variables. Clinical Supervisor, 19, 21–39. doi:10.1300/J001v19n01_02
Swank, J. M. (2014). Assessing counseling competencies: A comparison of supervisors’ ratings
and student supervisees’ self-ratings. Counseling Outcome Research and Evaluation, 5,
17–27. doi:10.1177/2150137814529147
Traube, D. E., Pohle, C. E., & Barley, M. (2012). Teaching evidence-based social work in
foundation practice courses: Learning from pedagogical choices of allied fields. Journal
of Evidence-Based Social Work, 9, 241–259. doi:10.1080/15433714.2010.525417
Tsang, N. M. (2013). A poetic appreciation of social work field instruction. Practice Digest, 3.2.
U.S. Department of Labor. (2016). Occupational outlook handbook: Social workers. Retrieved
from Bureau of Labor Statistics website: http://www.bls.gov/ooh/community-and-social-
service/social-workers.htm#tab-6
USC Suzanne Dworak-Peck School of Social Work. (n.d.). Our mission. Retrieved from
http://sowkweb.usc.edu/about/mission
USC Suzanne Dworak-Peck School of Social Work. (2017). Field education manual. Retrieved
from https://sowkweb.usc.edu/download/msw/field-education/field-education-manual
Vernon, R., Vakalahi, H., Pierce, D., Pittman-Munke, P., & Adkins, L. F. (2009). Distance
education programs in social work: Current and emerging trends. Journal of Social Work
Education, 45, 263–276. doi:10.5175/JSWE.2009.200700081
Vinton, L., & Wilke, D. J. (2011). Leniency bias in evaluating clinical social work student
interns. Clinical Social Work Journal, 39, 288–295. doi:10.1007/s10615-009-0221-5
Vodde, R., & Giddings, M. M. (2000). The field system eco-map: A tool for conceptualizing
practicum experiences. Journal of Teaching in Social Work, 20, 41–61.
doi:10.1300/J067v20n03_05
Wayne, J., Bogo, M., & Raskin, M. (2010). Field education as the signature pedagogy of social
work education. Journal of Social Work Education, 46, 327–339.
doi:10.5175/JSWE.2010.200900043
Wayne, J., Raskin, M., & Bogo, M. (2006). The need for radical change in field education.
Journal of Social Work Education, 42, 161–169.
Wilke, D., & Vinton, L. (2006). Evaluation of the first web-based advanced standing MSW
program. Journal of Social Work Education, 42, 607–620.
doi:10.5175/JSWE.2006.200500501
Wilson, A. B., Brown, S., Wood, Z. B., & Farkas, K. J. (2013). Teaching direct practice skills
using web-based simulations: Home visiting in the virtual world. Journal of Teaching in
Social Work, 33, 421–437. doi:10.1080/08841233.2013.833578
Wolfson, G. K., Marsom, G., & Magnuson, C. W. (2005). Changing the nature of the discourse:
Teaching field seminars online. Journal of Social Work Education, 41, 355–361.
Wong, D. K. P., & Lam, D. O. B. (2007). Problem-based learning in social work: A study of
student learning outcomes. Research on Social Work Practice, 17, 55–65.
doi:10.1177/1049731506293364
Wood, G. (2014). MSW professional skills survey results. Lanham, MD: 2U.
Zimmerman, B. J. (2000). Self-efficacy: An essential motive to learn. Contemporary
Educational Psychology, 25, 82–91. doi:10.1006/ceps.1999.1016
Table 1: Field Curriculum in Four Semesters

Field Seminar (class time)
First semester: 589a Applied Learning in Field Education (3 units)
Second semester: 588 Integrative Learning for Social Work Practice (2 units); 589b Applied Learning in Field Education (3 units)
Third semester: 698b Integrative Learning for Advanced Social Work Practice (1 unit); 699b Advanced Applied Learning in Field Education (4 units)
Fourth semester: 698b Integrative Learning for Advanced Social Work Practice (1 unit); 699b Advanced Applied Learning in Field Education (4 units)

Field Practicum (campus-based)
First semester: Campus-based students placed in community-based field setting with agency-based MSW field instructor
Second semester: Campus-based students remain in same field setting
Third semester: Campus-based students start second field placement
Fourth semester: Campus-based students continue second-year field placement

Field Practicum (VAC)
First semester: VAC students enrolled in VFP taught by field faculty as their field instructor; students participate in 4 hours of live sessions with faculty instructors and 12 hours of synchronous assignments
Second semester: VAC students enter community-based field placement (second field placement)
Third semester: VAC students continue second-year field placement in same agency
Fourth semester: VAC students continue second-year field placement in same agency
Table 2. Demographic Characteristics of VAC and OTG Students
VAC (n = 100) OTG (n = 520)
n (%) n (%)
Race
African American 27 (27%) 61 (12%)
Asian or Pacific Islander 8 (8%) 68 (13%)
Caucasian 44 (44%) 152 (29%)
Latino/a 12 (12%) 173 (33%)
Native American 5 (5%) 6 (1%)
Unknown 4 (4%) 60 (12%)
Gender
Male 16 (16%) 93 (18%)
Female 84 (84%) 425 (82%)
Unknown 0 (0%) 2 (<1%)
GPA a 3.72 (.25) 3.64 (.31)
a Values reflect M (SD); GPA was not available for 48 students in the VAC
group.
Table 3-A. Independent T-Test Results for Time 1
Competency Group M (SD) t (df) Bonferroni-adjusted p
1 VAC (n = 98) 2.84 (0.66) -7.42 (615) < .001
OTG (n = 519) 2.37 (0.57)
2 VAC (n = 98) 2.67 (0.55) -9.92 (611) < .001
OTG (n = 515) 2.08 (0.55)
3 VAC (n = 98) 2.63 (0.58) -9.12 (613) < .001
OTG (n = 517) 2.09 (0.53)
4 VAC (n = 98) 2.84 (0.57) -9.34 (614) < .001
OTG (n = 518) 2.28 (0.54)
5 VAC (n = 89) 2.68 (0.54) -8.86 (598) < .001
OTG (n = 511) 2.12 (0.56)
6 VAC (n = 98) 2.53 (0.55) -10.79 (609) < .001
OTG (n = 513) 1.90 (0.53)
7 VAC (n = 98) 2.39 (0.50) -8.03 (133.62) < .001
OTG (n = 518) 1.95 (0.48)
8 VAC (n = 78) 2.28 (0.67) -4.89 (93.23) < .001
OTG (n = 484) 1.89 (0.53)
9 VAC (n = 84) 2.35 (0.73) -5.21 (103.51) < .001
OTG (n = 486) 1.91 (0.60)
10 VAC (n = 94) 2.65 (0.53) -11.34 (577) < .001
OTG (n = 485) 2.03 (0.48)
Table 3-B. Independent T-Test Results for Time 2
Competency Group M (SD) t (df) Bonferroni-adjusted p
1 VAC (n = 93) 2.92 (0.69) 4.31 (113.29) < .001
OTG (n = 494) 3.25 (0.53)
2 VAC (n = 89) 2.74 (0.64) 3.92 (105.89) .002
OTG (n = 492) 3.02 (0.47)
3 VAC (n = 92) 2.74 (0.68) 4.25 (108.50) < .001
OTG (n = 494) 3.05 (0.48)
4 VAC (n = 91) 2.88 (0.64) 4.70 (107.07) < .001
OTG (n = 494) 3.21 (0.45)
5 VAC (n = 90) 2.74 (0.70) 3.85 (103.14) .002
OTG (n = 486) 3.04 (0.45)
6 VAC (n = 90) 2.59 (0.76) 3.46 (104.19) .008
OTG (n = 494) 2.88 (0.51)
7 VAC (n = 91) 2.63 (0.66) 4.79 (104.34) < .001
OTG (n = 494) 2.96 (0.43)
8 VAC (n = 87) 2.71 (0.72) 2.22 (102.04) .290
OTG (n = 477) 2.89 (0.51)
9 VAC (n = 82) 2.69 (0.71) 2.21 (97.47) .293
OTG (n = 476) 2.87 (0.54)
10 VAC (n = 73) 2.60 (0.63) 5.84 (82.21) < .001
OTG (n = 490) 3.05 (0.43)
Table 3-C. Independent T-Test Results for Time 3
Competency Group M (SD) t (df) Bonferroni-adjusted p
1 VAC (n = 40) 3.33 (0.53) -5.30 (458) < .001
OTG (n = 420) 2.87 (0.52)
2 VAC (n = 40) 3.16 (0.59) -5.90 (455) < .001
OTG (n = 417) 2.66 (0.51)
3 VAC (n = 42) 3.10 (0.61) -5.09 (460) < .001
OTG (n = 420) 2.67 (0.50)
4 VAC (n = 41) 3.27 (0.56) -4.83 (46.28) < .001
OTG (n = 419) 2.83 (0.50)
5 VAC (n = 38) 3.21 (0.61) -5.77 (448) < .001
OTG (n = 412) 2.69 (0.53)
6 VAC (n = 42) 3.08 (0.62) -6.69 (459) < .001
OTG (n = 419) 2.51 (0.52)
7 VAC (n = 41) 3.09 (0.59) -5.76 (459) < .001
OTG (n = 420) 2.60 (0.51)
8 VAC (n = 40) 3.05 (0.60) -5.92 (445) < .001
OTG (n = 407) 2.52 (0.53)
9 VAC (n = 37) 3.14 (0.55) -5.48 (451) < .001
OTG (n = 416) 2.58 (0.60)
10 VAC (n = 38) 3.12 (0.57) -5.93 (438) < .001
OTG (n = 402) 2.63 (0.48)
Table 3-D. Independent T-Test Results for Time 4
Competency Group M (SD) t (df) Bonferroni-adjusted p
1 VAC (n = 52) 3.74 (0.37) -1.52 (465) 1.00
OTG (n = 415) 3.65 (0.41)
2 VAC (n = 53) 3.65 (0.43) -2.75 (465) .075
OTG (n = 414) 3.47 (0.43)
3 VAC (n = 53) 3.61 (0.46) -1.47 (467) 1.00
OTG (n = 416) 3.52 (0.42)
4 VAC (n = 52) 3.72 (0.42) -1.50 (464) 1.00
OTG (n = 414) 3.63 (0.41)
5 VAC (n = 50) 3.69 (0.44) -3.07 (457) .046
OTG (n = 409) 3.50 (0.41)
6 VAC (n = 53) 3.60 (0.49) -3.31 (467) .023
OTG (n = 416) 3.38 (0.47)
7 VAC (n = 51) 3.63 (0.46) -3.03 (465) .059
OTG (n = 416) 3.44 (0.43)
8 VAC (n = 50) 3.58 (0.48) -3.45 (455) .014
OTG (n = 407) 3.34 (0.46)
9 VAC (n = 49) 3.59 (0.51) -2.61 (457) .117
OTG (n = 410) 3.39 (0.50)
10 VAC (n = 49) 3.61 (0.43) -1.66 (450) 1.00
OTG (n = 403) 3.51 (0.40)
Table 4. 2015–2016 Student Enrollment and Demographic Information
Course A Course B
OTG VAC OTG VAC
n (%) n (%) n (%) n (%)
Age a 27.4 (6.24) 34.0 (9.05) 27.9 (6.29) 33.9 (9.17)
Gender
Female 446 (82.0) 603 (84.9) 440 (81.8) 620 (86.0)
Male 98 (18.0) 107 (15.1) 98 (18.2) 101 (14.0)
Race and ethnicity
African American 70 (12.9) 186 (26.2) 70 (13.0) 176 (24.4)
Asian or Pacific Islander 72 (13.2) 50 (7.0) 69 (12.8) 55 (7.6)
Caucasian 133 (24.4) 298 (42.0) 134 (24.9) 309 (42.9)
Latino/a 202 (37.1) 113 (15.9) 198 (36.8) 111 (15.4)
Native American 7 (1.3) 13 (1.8) 8 (1.5) 16 (2.2)
Unknown 60 (11.0) 50 (7.0) 59 (11.0) 54 (7.5)
Primary language
English 416 (76.5) 516 (72.7) 414 (77.0) 505 (70.0)
Other 125 (23.0) 47 (6.6) 119 (22.1) 47 (6.5)
Unknown 3 (0.6) 147 (20.7) 5 (0.9) 169 (23.4)
GPA a 3.77 (0.23) 3.68 (0.34) 3.79 (0.21) 3.73 (0.25)
a Values reflect M (SD).
Table 5-A. Paired-Samples T-Test Results for OTG 589A and OTG 589B (Student Self-Ratings)
Competency Group M (SD) t (df) Bonferroni-adjusted p Hedges' g
1 589A (n = 526) 6.16 (1.73) -20.73 (525) < .001 -.86
589B (n = 526) 7.48 (1.31)
2 589A (n = 525) 5.57 (1.88) -20.99 (524) < .001 -.86
589B (n = 525) 7.03 (1.49)
3 589A (n = 491) 4.83 (2.20) -18.11 (490) < .001 -.71
589B (n = 491) 6.28 (1.87)
4 589A (n = 525) 4.86 (1.94) -22.00 (524) < .001 -.90
589B (n = 525) 6.48 (1.64)
5 589A (n = 501) 4.54 (2.03) -19.02 (500) < .001 -.76
589B (n = 501) 6.00 (1.80)
6 589A (n = 525) 5.45 (1.89) -21.95 (524) < .001 -.88
589B (n = 525) 6.97 (1.56)
7 589A (n = 525) 4.81 (1.99) -24.64 (524) < .001 -.97
589B (n = 525) 6.58 (1.63)
8 589A (n = 525) 4.42 (2.04) -25.12 (524) < .001 -.97
589B (n = 525) 6.25 (1.72)
9 589A (n = 510) 4.27 (2.16) -19.05 (509) < .001 -.80
589B (n = 510) 5.87 (1.84)
1–9 589A (n = 526) 5.26 (1.77) -23.32 (525) < .001 -.92
589B (n = 526) 6.75 (1.45)
Table 5-B. Paired-Samples T-Test Results for OTG 589A and OTG 589B (Field Instructor Ratings)
Competency Group M (SD) t (df) Bonferroni-adjusted p Hedges' g
1 589A (n = 523) 5.90 (1.50) -36.25 (522) < .001 -1.28
589B (n = 523) 7.68 (1.28)
2 589A (n = 524) 5.24 (1.60) -38.94 (523) < .001 -1.35
589B (n = 524) 7.29 (1.43)
3 589A (n = 501) 4.54 (1.78) -34.16 (500) < .001 -1.23
589B (n = 501) 6.70 (1.72)
4 589A (n = 524) 4.62 (1.60) -36.39 (523) < .001 -1.36
589B (n = 524) 6.76 (1.54)
5 589A (n = 508) 4.41 (1.69) -32.44 (507) < .001 -1.16
589B (n = 508) 6.36 (1.68)
6 589A (n = 524) 5.27 (1.60) -37.96 (523) < .001 -1.32
589B (n = 524) 7.29 (1.45)
7 589A (n = 524) 4.66 (1.65) -39.47 (523) < .001 -1.44
589B (n = 524) 6.92 (1.49)
8 589A (n = 524) 4.36 (1.70) -39.91 (523) < .001 -1.42
589B (n = 524) 6.71 (1.60)
9 589A (n = 507) 4.05 (1.82) -31.99 (506) < .001 -1.23
589B (n = 507) 6.24 (1.73)
1–9 589A (n = 524) 5.06 (1.48) -41.17 (523) < .001 -1.41
589B (n = 524) 7.07 (1.37)
Table 5-C. Paired-Samples T-Test Results for VAC 586A and VAC 586B (Student Self-Ratings)
Competency Group M (SD) t (df) Bonferroni-adjusted p Hedges' g
1 586A (n = 422) 6.64 (2.09) -3.79 (421) < .001 -.21
586B (n = 422) 7.03 (1.49)
2 586A (n = 420) 6.18 (2.22) -3.94 (419) < .001 -.22
586B (n = 420) 6.62 (1.66)
3 586A (n = 264) 5.70 (2.40) -2.99 (263) .003 -.22
586B (n = 264) 6.18 (1.98)
4 586A (n = 420) 5.55 (2.24) -3.23 (419) .001 -.20
586B (n = 420) 5.95 (1.78)
5 586A (n = 289) 5.45 (2.18) -3.47 (288) .001 -.25
586B (n = 289) 5.96 (1.88)
6 586A (n = 422) 5.97 (2.27) -5.25 (421) < .001 -.30
586B (n = 422) 6.57 (1.65)
7 586A (n = 421) 5.66 (2.26) -3.55 (420) < .001 -.22
586B (n = 421) 6.11 (1.85)
8 586A (n = 419) 5.06 (2.39) -5.11 (418) < .001 -.30
586B (n = 419) 5.72 (2.00)
9 586A (n = 266) 5.18 (2.38) -3.28 (265) .001 -.24
586B (n = 266) 5.72 (2.02)
1–9 586A (n = 422) 5.96 (2.05) -3.86 (421) < .001 -.23
586B (n = 422) 6.37 (1.54)
Table 5-D. Paired-Samples T-Test Results for VAC 586A and VAC 586B (Field Instructor Ratings)
Competency Group M (SD) t (df) Bonferroni-adjusted p Hedges' g
1 586A (n = 497) 6.49 (1.70) -5.02 (496) < .001 -.28
586B (n = 497) 6.96 (1.63)
2 586A (n = 495) 5.84 (1.67) -6.09 (494) < .001 -.35
586B (n = 495) 6.45 (1.79)
3 586A (n = 339) 5.37 (1.88) -5.41 (338) < .001 -.39
586B (n = 339) 6.13 (2.00)
4 586A (n = 495) 5.55 (1.73) -1.83 (494) .067 -.11
586B (n = 495) 5.76 (2.07)
5 586A (n = 359) 5.36 (1.78) -4.12 (358) < .001 -.29
586B (n = 359) 5.89 (1.88)
6 586A (n = 496) 5.79 (1.68) -7.16 (495) < .001 -.43
586B (n = 496) 6.52 (1.73)
7 586A (n = 496) 5.48 (1.75) -5.98 (495) < .001 -.35
586B (n = 496) 6.12 (1.87)
8 586A (n = 494) 5.30 (1.65) -3.77 (493) < .001 -.23
586B (n = 494) 5.72 (2.00)
9 586A (n = 399) 4.79 (1.91) -5.95 (398) < .001 -.39
586B (n = 399) 5.56 (2.05)
1–9 586A (n = 497) 5.78 (1.58) -5.75 (496) < .001 -.34
586B (n = 497) 6.32 (1.63)
Table 5-E. Independent T-Test Results for OTG 589A and VAC 586A (Student Self-Ratings)
Competency Group M (SD) t (df) Bonferroni-adjusted p Hedges' g
1 OTG (n = 526) 6.16 (1.73) -3.83 (813.67) < .001 -.25
VAC (n = 422) 6.64 (2.09)
2 OTG (n = 525) 5.57 (1.88) -4.48 (822.33) < .001 -.30
VAC (n = 420) 6.18 (2.22)
3 OTG (n = 491) 4.83 (2.20) -4.88 (499.08) < .001 -.38
VAC (n = 264) 5.70 (2.40)
4 OTG (n = 525) 4.86 (1.94) -4.99 (831.28) < .001 -.33
VAC (n = 420) 5.55 (2.24)
5 OTG (n = 501) 4.54 (2.03) -5.90 (788.00) < .001 -.44
VAC (n = 289) 5.45 (2.18)
6 OTG (n = 525) 5.45 (1.89) -3.78 (817.58) < .001 -.25
VAC (n = 422) 5.97 (2.27)
7 OTG (n = 525) 4.81 (1.99) -6.03 (843.65) < .001 -.40
VAC (n = 421) 5.66 (2.26)
8 OTG (n = 525) 4.42 (2.04) -4.38 (823.60) < .001 -.29
VAC (n = 419) 5.06 (2.39)
9 OTG (n = 510) 4.27 (2.16) -5.22 (493.84) < .001 -.41
VAC (n = 266) 5.18 (2.38)
1–9 OTG (n = 526) 5.26 (1.77) -5.56 (836.67) < .001 -.37
VAC (n = 422) 5.96 (2.05)
Table 5-F. Independent T-Test Results for OTG 589B and VAC 586B (Student Self-Ratings)
Competency Group M (SD) t (df) Bonferroni-adjusted p Hedges' g
1 OTG (n = 526) 7.48 (1.31) 4.85 (845) < .001 .32
VAC (n = 422) 7.03 (1.49)
2 OTG (n = 525) 7.03 (1.49) 3.97 (943) < .001 .28
VAC (n = 420) 6.62 (1.66)
3 OTG (n = 491) 6.28 (1.87) 0.65 (753) .519 .05
VAC (n = 264) 6.18 (1.98)
4 OTG (n = 525) 6.48 (1.64) 4.82 (943) < .001 .31
VAC (n = 420) 5.95 (1.78)
5 OTG (n = 501) 6.00 (1.80) 0.35 (788) .727 .02
VAC (n = 289) 5.96 (1.88)
6 OTG (n = 525) 6.97 (1.56) 3.82 (945) < .001 .25
VAC (n = 422) 6.57 (1.65)
7 OTG (n = 525) 6.58 (1.63) 4.12 (944) < .001 .27
VAC (n = 421) 6.11 (1.85)
8 OTG (n = 525) 6.25 (1.72) 4.32 (825) < .001 .28
VAC (n = 419) 5.72 (2.00)
9 OTG (n = 510) 5.87 (1.84) 1.05 (774) .295 .08
VAC (n = 266) 5.72 (2.02)
1–9 OTG (n = 526) 6.75 (1.45) 3.98 (946) < .001 .25
VAC (n = 422) 6.37 (1.54)
Table 5-G. Independent T-Test Results for OTG 589A and VAC 586A (Field Instructor Ratings)
Competency Group M (SD) t (df) Bonferroni-adjusted p Hedges' g
1 OTG (n = 523) 5.90 (1.50) -5.92 (1,018) < .001 -.37
VAC (n = 497) 6.49 (1.70)
2 OTG (n = 524) 5.24 (1.60) -5.89 (1,017) < .001 -.37
VAC (n = 495) 5.84 (1.67)
3 OTG (n = 501) 4.54 (1.78) -6.48 (838) < .001 -.46
VAC (n = 339) 5.37 (1.88)
4 OTG (n = 524) 4.62 (1.60) -8.87 (1,017) < .001 -.56
VAC (n = 495) 5.55 (1.73)
5 OTG (n = 508) 4.41 (1.69) -7.97 (865) < .001 -.55
VAC (n = 359) 5.36 (1.78)
6 OTG (n = 524) 5.27 (1.60) -5.06 (1,018) < .001 -.32
VAC (n = 496) 5.79 (1.68)
7 OTG (n = 524) 4.66 (1.65) -7.67 (1,018) < .001 -.48
VAC (n = 496) 5.48 (1.75)
8 OTG (n = 524) 4.36 (1.70) -9.00 (1,016) < .001 -.56
VAC (n = 494) 5.30 (1.65)
9 OTG (n = 507) 4.05 (1.82) -5.92 (904) < .001 -.40
VAC (n = 399) 4.79 (1.91)
1–9 OTG (n = 524) 5.06 (1.48) -7.57 (1,019) < .001 -.47
VAC (n = 497) 5.78 (1.58)
Table 3-H. Independent T-Test Results for OTG 589B and VAC 586B (Field Instructor Ratings)
Competency   Group           M (SD)        t (df)          Bonferroni-adjusted p   Hedges' g
1            OTG (n = 523)   7.68 (1.28)   7.86 (942.12)   < .001                  .49
             VAC (n = 497)   6.96 (1.63)
2            OTG (n = 524)   7.29 (1.43)   8.29 (943.11)   < .001                  .52
             VAC (n = 495)   6.45 (1.79)
3            OTG (n = 501)   6.70 (1.72)   4.37 (838.00)   < .001                  .31
             VAC (n = 339)   6.13 (2.00)
4            OTG (n = 524)   6.76 (1.54)   8.73 (911.86)   < .001                  .55
             VAC (n = 495)   5.76 (2.07)
5            OTG (n = 508)   6.36 (1.68)   3.87 (865.00)   < .001                  .27
             VAC (n = 359)   5.89 (1.88)
6            OTG (n = 524)   7.29 (1.45)   7.74 (965.09)   < .001                  .48
             VAC (n = 496)   6.52 (1.73)
7            OTG (n = 524)   6.92 (1.49)   7.57 (947.08)   < .001                  .47
             VAC (n = 496)   6.12 (1.87)
8            OTG (n = 524)   6.71 (1.60)   8.66 (942.87)   < .001                  .55
             VAC (n = 494)   5.72 (2.00)
9            OTG (n = 507)   6.24 (1.73)   5.30 (775.53)   < .001                  .36
             VAC (n = 399)   5.56 (2.05)
1–9          OTG (n = 524)   7.07 (1.37)   7.96 (969.73)   < .001                  .50
             VAC (n = 497)   6.32 (1.63)
Appendix A: First-Year Comprehensive Skills Evaluation (CSWE 2015 EPAS)
COMPREHENSIVE SKILLS EVALUATION
GENERALIST PRACTICE
INSTRUCTIONS
Please use the 10-point scale above to rate the skill level (and degree of consistency, as applicable) for each
learning activity item. Use the anchors in the scale to guide your rating of the student’s skill level for each
item on a continuum from “0” (skill is not developed) to “10” (skill is mastered). Ratings on items for each
semester can range from 0 to 10 depending on the student’s skill level. Please use the full scale from 0 to 10,
as appropriate, to rate the skill level of the student regardless of the semester. For example, a student in the
first semester can be rated an “8” on any item if that skill is fully developed and consistently demonstrated
in field at that time. Conversely, a student in the fourth semester can be rated a “2” on any item if that skill
is only beginning to develop at that time. Please note that for only the first semester (589a), some learning
activity items can be rated an N/A; all other items require a rating from 0 to 10. All items require a rating
from 0 to 10 for the second, third, and fourth semesters (589b and 699a/b).
Rating Scale:
0 = Skill is not developed
2 = Skill is beginning to develop
4 = Skill is still developing and is not consistent
6 = Skill is developed and is mostly consistent
8 = Skill is fully developed and consistent
10 = Skill is mastered; exceeds all standards
Skill is not developed (0) Skill is mastered (10)
N/A 0 1 2 3 4 5 6 7 8 9 10
N/A = There was no opportunity for the student to demonstrate skills in this area.
Competency 1 – Demonstrate Ethical and Professional Behavior
Social workers understand the value base of the profession and its ethical standards, as well as relevant laws
and regulations that may impact practice at the micro, mezzo, and macro levels. Social workers understand
frameworks of ethical decision-making and how to apply principles of critical thinking to those frameworks in
practice, research, and policy arenas. Social workers recognize personal values and the distinction between
personal and professional values. They also understand how their personal experiences and affective reactions
influence their professional judgment and behavior. Social workers understand the profession’s history, its
mission, and the roles and responsibilities of the profession. Social Workers also understand the role of other
professions when engaged in inter-professional teams. Social workers recognize the importance of life-long
learning and are committed to continually updating their skills to ensure they are relevant and effective. Social
workers also understand emerging forms of technology and the ethical use of technology in social work
practice.
Generalist Practice Behavior Evaluation
FI/ST (1st semester)   FI/ST (2nd semester)
1a. Applies strategies of ethical reasoning to arrive at principled decisions by applying the
NASW Code of Ethics and relevant laws and regulations.
EXAMPLE: The student respects and promotes the right of clients to self-determination and assists
clients in their efforts to identify and clarify their goals.
1b. Uses self-regulation and self-management to maintain professional roles and boundaries
with clients.
EXAMPLE: The student puts aside her personal values about drug use when helping clients who abuse
drugs.
1c. Uses self-regulation and self-management to maintain professional roles and boundaries
with co-workers, field instructors, and/or colleagues/classmates.
EXAMPLE: Despite personal differences with a co-worker, the student displays respectful behavior.
1d. Tolerates ambiguity in resolving ethical conflicts.
EXAMPLE: The student works in a clinic and navigates the differing views on abortion.
1e. Demonstrates professional appearance.
EXAMPLE: The student understands that the organization has a dress code and follows it accordingly.
1f. Demonstrates professionalism in oral communication.
EXAMPLE: The student avoids professional jargon when she communicates with her client.
1g. Demonstrates professionalism in written communication/documentation.
EXAMPLE: The student provides quality written notes in case records.
1h. Demonstrates professionalism in electronic communication.
EXAMPLE: The student presents himself with professional salutations and good grammar when
communicating via E-mail.
1i. Demonstrates accountability in meeting field placement requirements in a timely manner (i.e.,
attendance, paperwork, and assigned casework or projects).
EXAMPLE: The student is punctual in submitting paperwork and assignments.
1j. Uses technology ethically and appropriately to facilitate practice outcomes.
EXAMPLE: In accordance with agency policies and procedures, the student uses text messaging to send
appointment reminders.
1k. Uses supervision/field instruction and/or consultation to guide professional judgment and
behavior.
EXAMPLE: The student takes feedback from field instruction and employs it in the field to improve
communication skills.
Competency 1: Scores in red are not passing; scores in green are passing and within the
expected range; scores in blue are passing but exceed the expected range
First Semester:  0 1 2 3 4 5 6 7 8 9 10
Second Semester: 0 1 2 3 4 5 6 7 8 9 10
Comments (required for any rating in red & blue zones):
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
Competency 2 – Engage Diversity and Difference in Practice
Social workers understand how diversity and difference characterize and shape the human experience and are
critical to the formation of identity. The dimensions of diversity are understood as the intersectionality of
multiple factors including but not limited to age, class, color, culture, disability and ability, ethnicity, gender,
gender identity and expression, immigration status, marital status, political ideology, race, religion/spirituality,
sex, sexual orientation, and tribal sovereign status. Social workers understand that, as a consequence of
difference, a person’s life experiences may include oppression, poverty, marginalization, and alienation as well
as privilege, power, and acclaim. Social workers also understand the forms and mechanisms of oppression and
discrimination and recognize the extent to which a culture’s structures and values, including social, economic,
political, and cultural exclusions, may oppress, marginalize, alienate, or create privilege and power.
Generalist Practice Behavior Evaluation
FI/ST (1st semester)   FI/ST (2nd semester)
2a. Communicates her/his understanding of the importance of diversity and differences in
shaping life experiences in practice.
EXAMPLE: The student respects and encourages involvement from family members to develop a case
plan that reflects their cultural values.
2b. Engages clients and constituencies as experts of their own experiences. Constituencies
include individuals, families, groups, organizations, and/or communities.
EXAMPLE: The client has a mental illness. The student respectfully asks the client to describe her
experience in a home environment where mental illness is stigmatized.
2c. Applies self-regulation and/or self-management to eliminate the influence of personal biases
in working with diverse clients and constituencies.
EXAMPLE: Despite the student’s religious beliefs, she is able to support the work of an organization that
helps gay men access services.
Competency 2: Scores in red are not passing; scores in green are passing and within the
expected range; scores in blue are passing but exceed the expected range
First Semester:  0 1 2 3 4 5 6 7 8 9 10
Second Semester: 0 1 2 3 4 5 6 7 8 9 10
Comments (required for any rating in the red & blue zones):
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
Competency 3 – Advance Human Rights and Social, Economic, and Environmental Justice
Social workers understand that every person regardless of position in society has fundamental human rights
such as freedom, safety, privacy, an adequate standard of living, health care, and education. Social workers
understand the global interconnections of oppression and human rights violations, and are knowledgeable about
theories of human need and social justice and strategies to promote social and economic justice and human
rights. Social workers understand strategies designed to eliminate oppressive structural barriers to ensure that
social goods, rights, and responsibilities are distributed equitably and that civil, political, environmental,
economic, social, and cultural human rights are protected.
Generalist Practice Behavior Evaluation
FI/ST (1st semester)   FI/ST (2nd semester)
3a. Applies principles of social, economic, and environmental justice to advocate for human
rights within the scope of the organization’s mission.
EXAMPLE: The student advocates for her transgender client to be allowed to use the women’s
restroom at her school.
3b. Engages in practices that advance social, economic, and environmental justice within the
scope of the organization’s mission.
EXAMPLE: The student informs her colleagues of social and economic injustices in her client’s
community that adversely affect the client’s access to quality medical care.
Competency 3: Scores in red are not passing; scores in green are passing and within the
expected range; scores in blue are passing but exceed the expected range
Core Competencies that can be N/A for the first semester:
3a, 3b
First Semester:  0 1 2 3 4 5 6 7 8 9 10
Second Semester: 0 1 2 3 4 5 6 7 8 9 10
Comments (required for any rating in the red and blue zones):
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
____________________________________________________________________________________
Competency 4 – Engage in Practice-informed Research and Research-Informed Practice
Social workers understand quantitative and qualitative research methods and their respective roles in advancing
a science of social work and in evaluating their practice. Social workers know the principles of logic, scientific
inquiry, and culturally informed and ethical approaches to building knowledge. Social workers understand that
evidence that informs practice derives from multi-disciplinary sources and multiple ways of knowing. They also
understand the processes of translating research findings into effective practice.
Generalist Practice Behavior Evaluation
FI/ST (1st semester)   FI/ST (2nd semester)
4a. Implements evidence-based interventions.
EXAMPLE: The student uses Motivational Interviewing techniques with her clients.
4b. Translates and integrates research findings with professional judgment to inform and improve
practice.
EXAMPLE: The student demonstrates her professional judgment in the rationale section of the Evidence-
Based Intervention Search assignments.
Competency 4: Scores in red are not passing; scores in green are passing and within the
expected range; scores in blue are passing but exceed the expected range
Core Competencies that can be N/A for the first semester:
4b
First Semester:  0 1 2 3 4 5 6 7 8 9 10
Second Semester: 0 1 2 3 4 5 6 7 8 9 10
Comments (required for any rating in the red and blue zones):
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
Competency 5 – Engage in Policy Practice
Social workers understand that human rights and social justice, as well as social welfare and services, are
mediated by policy and its implementation at the federal, state, and local levels. Social workers understand the
history and current structures of social policies and services, the role of policy in service delivery, and the role
of practice in policy development. Social workers understand their role in policy development and
implementation within their practice settings at the micro, mezzo, and macro levels and they actively engage in
policy practice to effect change within those settings. Social workers recognize and understand the historical,
social, cultural, economic, organizational, environmental, and global influences that affect social policy. They
are also knowledgeable about policy formulation, analysis, implementation, and evaluation.
Generalist Practice Behavior Evaluation
FI/ST (1st semester)   FI/ST (2nd semester)
5a. Demonstrates an understanding of how social welfare and/or agency policy affects the
delivery of and access to social services.
EXAMPLE: The student understands the organization’s policies on Tele-Health service delivery to clients.
5b. Applies policies that advance social well-being for individuals, groups, and/or communities.
EXAMPLE: Understanding the limits of the Affordable Care Act in helping his client afford costly
medical procedures, the student finds community-based organizations to financially help his client in
getting a life-saving procedure.
5c. Collaborates across disciplines for effective policy action.
EXAMPLE: The student attends Neighborhood Council meetings to collaborate with law enforcement to
address gun violence in her clients’ community.
Competency 5: Scores in red are not passing; scores in green are passing and within the
expected range; scores in blue are passing but exceed the expected range
Core Competencies that can be N/A for the first semester:
5a, 5b, 5c
First Semester:  0 1 2 3 4 5 6 7 8 9 10
Second Semester: 0 1 2 3 4 5 6 7 8 9 10
Comments (required for any ratings in the red and blue zones):
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
Competency 6 – Engage with Individuals, Families, Groups, Organizations, and Communities
Social workers understand that engagement is an ongoing component of the dynamic and interactive process of
social work practice with, and on behalf of, diverse individuals, families, groups, organizations, and communities.
Social workers value the importance of human relationships. Social workers understand theories of human
behavior and the social environment, and critically evaluate and apply this knowledge to facilitate engagement
with clients and constituencies, including individuals, families, groups, organizations, and communities. Social
workers understand strategies to engage diverse clients and constituencies to specialized practice effectiveness.
Social workers understand how their personal experiences and affective reactions may impact their ability to
effectively engage with diverse clients and constituencies. Social workers value principles of relationship-building
and inter-professional collaboration to facilitate engagement with clients, constituencies, and other professionals
as appropriate.
Generalist Practice Behavior Evaluation
FI/ST (1st semester)   FI/ST (2nd semester)
6a. Applies knowledge of human behavior and the social environment to engage clients and
constituencies. Constituencies include individuals, families, groups, organizations, and/or
communities.
EXAMPLE: Understanding the community's mistrust of outsiders, the student takes time to build
relationships and trust with community members.
6b. Uses knowledge of practice context to inform engagement with clients and constituencies.
EXAMPLE: The student identifies the school setting as a forum for student peer support groups to prevent
school dropout.
6c. Uses empathy to engage diverse clients and constituencies.
EXAMPLE: The client is a recent refugee with mistrust of the government system. The student
understands her client’s reluctance to provide personal information required to obtain government
assistance.
6d. Uses interpersonal skills to engage diverse clients and constituencies.
EXAMPLE: The student uses age-appropriate media (e.g., music, games, movies) to help facilitate social
interaction among older adults in a care facility.
Competency 6: Scores in red are not passing; scores in green are passing and within the
expected range; scores in blue are passing but exceed the expected range
First Semester:  0 1 2 3 4 5 6 7 8 9 10
Second Semester: 0 1 2 3 4 5 6 7 8 9 10
Comments (required for any ratings in the red and blue zones):
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
Competency 7 – Assess Individuals, Families, Groups, Organizations, and Communities
Social workers understand that assessment is an ongoing component of the dynamic and interactive process of
social work practice with, and on behalf of, diverse individuals, families, groups, organizations, and
communities. Social workers understand theories of human behavior and the social environment, and critically
evaluate and apply this knowledge in the assessment of diverse clients and constituencies, including
individuals, families, groups, organizations, and communities. Social workers understand methods of
assessment with diverse clients and constituencies to specialized practice effectiveness. Social workers
recognize the implications of the larger practice context in the assessment process and value the importance of
interprofessional collaboration in this process. Social workers understand how their personal experiences and
affective reactions may affect their assessment and decision making.
Generalist Practice Behavior Evaluation
FI/ST (1st semester)   FI/ST (2nd semester)
7a. Applies knowledge of multi-disciplinary theoretical frameworks (i.e., human behavior and
the social environment, person-and-environment, among others) in assessing information from
clients and constituencies. Constituencies include individuals, families, groups, organizations,
and/or communities.
EXAMPLE: In working with a pregnant teen who is abusing substances, the student assesses her client’s
relationships in and outside of the home environment.
7b. Applies critical thinking in assessing information (e.g., client strengths, needs, and
challenges) from clients and constituencies.
EXAMPLE: During the intake process at a domestic violence shelter, the student gathers comprehensive
information about the survivor and her children to get a full picture of their circumstances.
7c. Develops mutually agreed-on intervention goals and objectives.
EXAMPLE: The student and client together develop a treatment plan.
Competency 7: Scores in red are not passing; scores in green are passing and within the
expected range; scores in blue are passing but exceed the expected range
First Semester:  0 1 2 3 4 5 6 7 8 9 10
Second Semester: 0 1 2 3 4 5 6 7 8 9 10
Comments (required for any rating in the red and blue zones):
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
Competency 8 – Intervene with Individuals, Families, Groups, Organizations, and Communities
Social workers understand that intervention is an ongoing component of the dynamic and interactive process of
social work practice with, and on behalf of, diverse individuals, families, groups, organizations, and communities.
Social workers are knowledgeable about evidence-informed interventions to achieve the goals of clients and
constituencies, including individuals, families, groups, organizations, and communities. Social workers understand
theories of human behavior and the social environment, and critically evaluate and apply this knowledge to
effectively intervene with clients and constituencies. Social workers understand methods of identifying, analyzing
and implementing evidence-informed interventions to achieve client and constituency goals. Social workers value
the importance of inter-professional teamwork and communication in interventions, recognizing that beneficial
outcomes may require interdisciplinary, inter-professional, and inter-organizational collaboration.
Generalist Practice Behavior Evaluation
FI/ST (1st semester)   FI/ST (2nd semester)
8a. Selects appropriate intervention strategies based on the assessment, research knowledge, and
values and preferences of clients and constituencies. Constituencies include individuals, families,
groups, organizations, and/or communities.
EXAMPLE: The student’s youth client is using marijuana and alcohol. The student selects an evidence-
based substance use intervention that is effective for the youth’s age group and culture.
8b. Implements interventions to achieve practice goals of clients and constituencies.
EXAMPLE: The student’s client is an older adult who lives alone, has health problems, and exhibits signs
of depression. The student reaches out to the client’s natural supports to increase opportunities for social
interaction.
8c. Uses multidisciplinary collaboration as appropriate to achieve beneficial practice outcomes.
EXAMPLE: The student’s client has been removed from preschool due to behavioral problems. The
student facilitates meetings with the client’s parents, preschool teacher, and primary care physician.
8d. Intervenes on behalf of clients and constituencies through, for example, negotiation,
mediation, and/or advocacy.
EXAMPLE: The student develops a strong working relationship with the local housing authority to secure
stable housing for his clients.
Competency 8: Scores in red are not passing; scores in green are passing and within the
expected range; scores in blue are passing but exceed the expected range
Core Competencies that can be N/A for the first semester:
8c
First Semester:  0 1 2 3 4 5 6 7 8 9 10
Second Semester: 0 1 2 3 4 5 6 7 8 9 10
Comments (required for any rating in the red and blue zones):
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
Competency 9 – Evaluate Practice with Individuals, Families, Groups, Organizations, and Communities
Social workers understand that evaluation is an ongoing component of the dynamic and interactive process of
social work practice with, and on behalf of, diverse individuals, families, groups, organizations and
communities. Social workers recognize the importance of evaluating processes and outcomes to advance
practice, policy, and service delivery effectiveness. Social workers understand theories of human behavior and
the social environment, and critically evaluate and apply this knowledge in evaluating outcomes. Social
workers understand qualitative and quantitative methods for evaluating outcomes and practice effectiveness.
Generalist Practice Behavior Evaluation
FI/ST (1st semester)   FI/ST (2nd semester)
9a. Understands appropriate methods for evaluation of outcomes within the context of the
agency.
EXAMPLE: The student understands that administering a standardized instrument in English for the
organization’s primarily Spanish-speaking clients is not an appropriate method for evaluation.
9b. Evaluates (e.g., monitors and critically analyzes) intervention processes and outcomes.
EXAMPLE: The student uses an individual treatment plan to monitor, analyze, and evaluate the processes
and outcomes of an intervention to improve coping skills.
9c. Applies evaluation findings to improve practice effectiveness.
EXAMPLE: The student’s field organization conducted an evaluation study of its parent education
program. The student used the study findings (e.g., parenting skills increased) to inform her clients about
effective parent stress management.
Competency 9: Scores in red are not passing; scores in green are passing and within the
expected range; scores in blue are passing but exceed the expected range
Core Competencies that can be N/A for the first semester:
9a, 9b, 9c
First Semester:  0 1 2 3 4 5 6 7 8 9 10
Second Semester: 0 1 2 3 4 5 6 7 8 9 10
Comments (required for any rating in red and blue zones):
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
Appendix B: Student Comprehensive Skills Evaluation Tool (CSWE 2008 EPAS)
COMPETENCY #1 – PROFESSIONALISM:
INTERN IDENTIFIES AS A PROFESSIONAL SOCIAL WORKER AND
CONDUCTS HER/HIMSELF ACCORDINGLY
F S
1. Demonstrates professional social work roles and boundaries.
2. Demonstrates professional demeanor in behavior, appearance and
communication.
3. Demonstrates professional time management skills and accountability:
punctuality, attendance, paperwork and assignments.
4. Uses supervision and consultation effectively.
Comments (required for ratings of N/A, 0 and 1):
_____________________________________________________________________________________
COMPETENCY #2 – ETHICS:
INTERN APPLIES SOCIAL WORK ETHICAL PRINCIPLES TO GUIDE
HER/HIS PROFESSIONAL PRACTICE
F S
1. Is knowledgeable about the value base of the profession and makes ethical
decisions by applying standards consistent with the NASW Code of Ethics
and other guidelines/principles of the profession.
2. Recognizes and manages personal values and biases in ways that allow
professional values to guide practice.
3. Recognizes and tolerates ambiguity in resolving ethical conflicts.
4. Applies strategies of ethical reasoning to arrive at principled decisions.
Comments (required for ratings of N/A, 0 and 1):
_____________________________________________________________________________________
COMPETENCY #3 – THINKING & JUDGMENT:
INTERN APPLIES CRITICAL THINKING TO INFORM AND
COMMUNICATE PROFESSIONAL JUDGMENTS
LEARNING OPPORTUNITIES AND STUDENT ACTIVITIES TO
ACHIEVE COMPETENCY #3
F S
1. Uses critical thinking augmented by creativity and curiosity.
2. Is able to comprehend, appraise and integrate multiple sources of knowledge
including research-based knowledge and practice wisdom.
3. Utilizes appropriate models of assessment, prevention, intervention and
evaluation.
4. Demonstrates effective oral communication in working with individuals,
families, groups, organizations, communities and colleagues.
5. Demonstrates effective written communication in working with individuals,
families, groups, organizations, communities and colleagues.
Comments (required for ratings of N/A, 0 and 1):
_____________________________________________________________________________________
COMPETENCY #4 – CULTURAL COMPETENCY:
INTERN ENGAGES DIVERSITY AND DIFFERENCE IN PRACTICE
F S
1. Is knowledgeable about and respectful of clients who differ by factors such as
age, class, color, culture, disability, ethnicity, gender identity and expression,
immigration status, political ideology, race, religion, sex, and sexual
orientation.
2. Recognizes the extent to which a culture’s structures and values may oppress,
marginalize, alienate, or create or enhance privilege and power in shaping life
experience.
3. Demonstrates self-awareness in eliminating the influence of personal biases
and values in working with diverse groups, including treating clients with
dignity and respect.
4. Views self as a student of cultural differences and those s/he works with as
cultural experts.
Comments (required for ratings of N/A, 0 and 1):
_____________________________________________________________________________
COMPETENCY #5 – SOCIAL JUSTICE:
INTERN ADVANCES HUMAN RIGHTS AND SOCIAL AND
ECONOMIC JUSTICE
F S
1. Recognizes that each person, regardless of position in society, has basic
human rights, such as freedom, safety, privacy, an adequate standard of
living, health care and education.
2. Identifies the forms, mechanisms and interconnections of oppression and
discrimination and is knowledgeable about theories of justice and strategies to
promote human and civil rights.
3. Is skilled at advocating for and engaging in practices that promote social and
economic justice.
Comments (required for ratings of N/A, 0 and 1):
______________________________________________________________________________
COMPETENCY #6 – EVIDENCE BASED PRACTICE:
INTERN ENGAGES IN RESEARCH-INFORMED PRACTICE AND
PRACTICE-INFORMED RESEARCH
LEARNING OPPORTUNITIES AND STUDENT ACTIVITIES TO
ACHIEVE COMPETENCY #6
F S
1. Employs evidence-based interventions and policies.
2. Integrates research findings and professional judgment to improve practice,
policy and social service delivery.
3. Evaluates her/his own practice for effectiveness and improvement.
Comments (required for ratings of N/A, 0 and 1):
______________________________________________________________________________
COMPETENCY #7 – PERSON IN ENVIRONMENT:
INTERN APPLIES KNOWLEDGE OF HUMAN BEHAVIOR
AND THE SOCIAL ENVIRONMENT
F S
1. Demonstrates knowledge of human behavior across the life course.
2. Understands social systems and how they promote or inhibit people in
maintaining or achieving health and well-being.
3. Demonstrates knowledge of person-in-environment, including biological,
social, cultural, psychological and spiritual development of clients/client
systems.
4. Utilizes a range of theoretical frameworks to guide the processes of
assessment, intervention and evaluation.
Comments (required for ratings of N/A, 0 and 1):
______________________________________________________________________________
COMPETENCY #8 – POLICY:
INTERN ENGAGES IN POLICY PRACTICE TO ADVANCE SOCIAL
AND ECONOMIC WELL BEING AND DELIVER EFFECTIVE
SOCIAL WORK SERVICES
F S
1. Demonstrates understanding of the role of policy in service delivery and the
role of practice in policy development.
2. Analyzes and advocates for policies that promote social well-being for
individuals, families, groups and communities.
3. Recognizes the importance of collaboration with colleagues and clients for
effective policy action.
Comments (required for ratings of N/A, 0 and 1):
______________________________________________________________________________
COMPETENCY #9 – CURRENT TRENDS:
INTERN RESPONDS TO CONTEXTS THAT SHAPE PRACTICE
F S
1. Seeks information and resources, and is proactive in responding to evolving
organizational, community and societal contexts of practice.
2. Continuously discovers, appraises and attends to changing locales, populations,
scientific and technological developments and emerging societal trends to
provide relevant services.
Comments (required for ratings of N/A, 0 and 1):
______________________________________________________________________________
COMPETENCY #10 – PRACTICE SKILLS:
INTERN ENGAGES, ASSESSES, INTERVENES AND EVALUATES
INDIVIDUALS, FAMILIES, GROUPS, ORGANIZATIONS, AND
COMMUNITIES
F S
(A): ENGAGEMENT
1. Develops rapport and addresses confidentiality appropriately with
individuals, families, groups, organizations and/or communities.
2. Uses empathy and other interpersonal skills (e.g. attending behaviors and
basic interviewing skills) to engage colleagues, stakeholders, agencies,
communities, and policy makers in the change process.
3. Develops a mutually agreed-on focus of work and desired
outcomes/expectations.
(B): ASSESSMENT
1. Collects, organizes, analyzes, and interprets system, policy, community, and
organizational data.
2. Assesses strengths and limitations of organizations, communities, policies,
and individuals.
3. Develops mutually agreed-upon intervention goals and objectives, and selects
appropriate intervention strategies.
(C): INTERVENTION
1. Initiates actions to achieve goals within the context of the organization and
the community.
2. Incorporates interventions to enhance organizational, community, and policy
practices (e.g. program development, community organizing, training,
resource development, policy analysis, etc.).
3. Assists groups, communities, and organizations in problem resolution and
desired outcome definition.
4. Negotiates, mediates, and advocates for individuals, groups, agencies, and
communities, keeping in mind city, state, agency, and national policies and
practices.
5. Facilitates change transitions and endings during the engagement and work
process.
(D): EVALUATION
1. Critically analyzes, monitors, and evaluates interventions for effectiveness
and sustainability.
2. Utilizes evaluation data to revise interventions in relation to needs, impact,
goal, objective, and outcome achievement.
Comments (required for ratings of N/A, 0 and 1):
______________________________________________________________________________
Abstract
Purpose: This study aimed to address the lack of research on competency-based student learning outcomes for social work distance learning. It addressed the following three objectives: (a) develop an empirically based online assessment tool adapted from the 2015 Council on Social Work Education’s Educational Policy and Accreditation Standards with specific examples for students and field instructors
Asset Metadata
Creator: Hsiao, Shu Chen (author)
Core Title: Using innovative field model and competency-based student learning outcomes to level the playing field in social work distance learning education
School: School of Policy, Planning and Development
Degree: Doctor of Policy, Planning & Development
Degree Program: Policy, Planning, and Development
Publication Date: 02/23/2018
Defense Date: 10/23/2017
Publisher: University of Southern California (original), University of Southern California. Libraries (digital)
Tag: 2015 CSWE competencies, evidence-based skills training, MSW distance learning program, OAI-PMH Harvest, student comprehensive skills evaluation, student learning outcome, Virtual Field Practicum
Language: English
Contributor: Electronically uploaded by the author (provenance)
Advisor: Natoli, Deborah (committee chair), Robertson, Peter (committee member), Wong, Marleen (committee member)
Creator Email: shuhsiao@usc.edu, suhchen.hsiao@gmail.com
Permanent Link (DOI): https://doi.org/10.25549/usctheses-c40-477619
Unique Identifier: UC11266901
Identifier: etd-HsiaoShuCh-6060.pdf (filename), usctheses-c40-477619 (legacy record id)
Legacy Identifier: etd-HsiaoShuCh-6060.pdf
Dmrecord: 477619
Document Type: Dissertation
Rights: Hsiao, Shu Chen
Type: texts
Source: University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection)
Access Conditions: The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the a...
Repository Name: University of Southern California Digital Library
Repository Location: USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA