Running head: ASSESSMENT AND ACCREDITATION 1
ASSESSMENT AND ACCREDITATION OF UNDERGRADUATE STUDY ABROAD
PROGRAMS
by
Lauren Michelle Kim
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
August 2015
Copyright 2015 Lauren Michelle Kim
Dedication
This dissertation is dedicated to my mother, Dae Won Kashiwagi, whose unwavering love
made me who I am today. This is also dedicated to my sister, Chiae Byun-Kitayama, who
inspired me to embark on this journey and encouraged me to finish. Finally, I dedicate this to
God because all things are possible with Him.
Acknowledgments
First, I would like to express my deepest appreciation to my dissertation chair, Dr. Ruth
Chung, for being supportive, encouraging, and passionate.
Second, I also would like to thank my committee members, Dr. Patricia Tobey and Dr.
Janice Schafrik. Without their expertise, time, and dedication, this dissertation would not have
been possible.
Third, I would also like to thank the other faculty and staff at USC who were sources of
inspiration and knowledge. I would like to provide special recognition to Dr. Pedro Garcia, Dr.
Kenneth Yates, Dr. Kim Hirabayashi, and Dr. Linda Fisher.
Fourth, I would like to recognize my colleague and friend, Mark Ashley. Without his
support, I would not have been able to complete this dissertation.
Fifth, I would like to thank Hae Sook Kim for comforting and encouraging me whenever
I doubted myself and Dr. Gary McCoy and Dr. Leroy Gainey for being my mentors and always
believing in me.
Finally, I would like to thank Alex Okuda of Soka University of America for the
interview and study abroad program professionals who helped me with the survey.
Table of Contents
Dedication
Acknowledgments
List of Tables
List of Figures
Abstract
Chapter One: Overview of the Study
Statement of the Problem
Purpose of the Study
Significance of the Study
Definitions and Terms
Chapter Two: Literature Review
Study Abroad Programs
Origins of Study Abroad Program
Diversification of Study Abroad Programs
Assessment of Study Abroad Programs
Summary of Study Abroad Programs
Intercultural Competence
Definition of Intercultural Competence
Increasing Intercultural Competence
Measurement of Intercultural Competence
Summary of Intercultural Competence
Accreditation
Summary of Literature Review
Chapter Three: Methodology
Participants
Accrediting Agencies
Instrumentation and Procedure
Document Analysis
Survey
Interview
Data Analytics Strategy
Chapter Four: Results
Institutional and Study Abroad Program Information
Program Objectives
Program Assessments
Accreditation Standards
Summary
Chapter Five: Conclusion and Recommendations
Gaps in Assessment and Accountability
Recommendations for Practice
Regional: Adaptation of Study Abroad Policy
Institutional: Program Review
Student: Assessment Before, During, and After
Recommendations for Research
Conclusion
References
Appendix A: Survey Questions for Study Abroad Program Administrator
Appendix B: Interview Questions
Appendix C: Study Abroad Policy
List of Tables
Table 1: The Forum on Education Abroad 2011 State of Field Survey on the Types of Analyses of Student Data
Table 2: The Forum on Education Abroad 2011 State of Field Survey on Transcripts
Table 3: Institutions with an Estimated Undergraduate Study Abroad Participation Rate of over 70%, Compiled from Open Doors, 2013
Table 4: Accrediting Agencies and Areas Covered by the Agencies
Table 5: An Example of a Table to Organize Information Found Through Document Analysis
Table 6: Survey Questions for Study Abroad Program Administrator
Table 7: Alignment of Survey Protocols and Research Questions
Table 8: Interview Questions
Table 9: List of 36 Colleges and Universities and Corresponding Regional Accrediting Agencies
Table 10: Regional Accrediting Agencies and Study Abroad Program Self-Evaluation Study Guidelines
Table 11: Summary of Program Objectives and Regional Accrediting Agency’s Study Abroad Program Standards
List of Figures
Figure 1: Study Abroad Program Student Participation from 2002/03 to 2011/12 (Farrugia & Bhandari, 2013)
Figure 2: Study Abroad Program Student Participation by Program Length, 2011-12 (Farrugia & Bhandari, 2013)
Figure 3: Study Abroad Program Student Participation by Race, 2011-12 (Farrugia & Bhandari, 2013)
Figure 4: Three Levels Addressed by the Recommendation
Abstract
In an increasingly global society, students are expected to achieve intercultural competence,
and studying abroad is one way to achieve this. Study abroad programs have a long history in
the United States but, until recently, participation was limited to economic and intellectual elites.
In less than two decades, study abroad program participation grew nearly fourfold to more than
280,000 students. Today, programs are provided by over 400 institutions, organizations, and
program providers. This study examined selected colleges and universities with high study abroad participation rates to explore the current state of study abroad program objectives and
assessments. Data were collected in two phases. The first phase consisted of document analysis,
which provided characteristic information about the study abroad programs and helped to
formulate questions for the survey and interview. The second phase consisted of the survey and interview. There was a rich assortment of information available to the public about study abroad programs.
However, based on this study, there are five gaps in the assessment and accountability of study
abroad programs. First, study abroad programs provided diverse information about the programs
and the availability was not consistent across different institutions. Second, there was limited
information available about program objectives. Third, program effectiveness was not clearly
defined. Fourth, assessments did not happen consistently. Fifth, there were not enough resources available to study abroad program offices to conduct assessments. This study recommended implementing a study abroad policy at all regional accrediting agencies to improve the assessment and accountability of study abroad programs. Such a policy should require programs to clearly define their objectives, to undergo periodic review, and to make that information available to students.
CHAPTER ONE: OVERVIEW OF THE STUDY
As the world becomes smaller, interacting with people from other cultures is necessary, and it becomes essential to have a greater understanding of other cultures, not only in the workplace but in everyday life. In the twenty-first century, students need to understand the interdependent nature of the world, be able to work in a global environment, and know they are part of a global citizenry. This global literacy has become the responsibility of the educational system in preparing students for the twenty-first century, and international education, including study abroad, is seen as one important way to increase intercultural competence in students (Emert & Pearson, 2007). One way to gain this greater understanding is through studying abroad.
“Experience in another nation, such as that gained through a study abroad experience, has
received growing attention as a method to help students develop their intercultural knowledge
and adaptability” (Mapp, 2012, p. 727). US educators reported that students returning from
studying abroad showed increased concerns about international affairs and more appreciation of
different cultures than they did before their participation (Hadis, 2005). Studying abroad has
become an established part of the higher education landscape in the United States (Farrugia &
Bhandari, 2013), and 85% of US colleges and universities offer some type of study abroad
programs (Whalen, 2008). By participating in study abroad programs (SAPs), students are
thought to increase their intercultural competence.
Given such emphasis in higher education, it is not surprising that more students
participate in study abroad than ever before. During the 2011-2012 academic year, 283,332 US
students participated in SAPs for academic credits whereas only 172,629 students participated in
SAPs during the 2002-2003 academic year (Farrugia & Bhandari, 2013). Between the 2002-2003
and the 2007-2008 academic years, there was steady growth in student participation (Figure
1). In less than 10 years, participation increased by 64%. This number roughly represents 1% of
all US students enrolled at post-secondary institutions in the United States, and 86% of the
students are undergraduates (Farrugia & Bhandari, 2013). Of US undergraduates, 9.4% studied abroad, and 14.2% of students pursuing a bachelor’s degree studied abroad during their degree program.
Figure 1. Study Abroad Program Student Participation from 2002/03 to 2011/12 (Farrugia &
Bhandari, 2013).
There are many advantages for students studying abroad. A primary reason is to improve
foreign language skills. This is based on the belief that, by immersing oneself in the target culture, one
will improve target language skills (Association for the Study of Higher Education [ASHE],
2012). Another reason is to gain cultural understanding. For example, a student may be a
heritage seeker wanting to immerse him/herself in the host culture to understand his/her heritage
or may be interested in diversity or learning about other cultures (ASHE, 2012). An additional
benefit is career-related, which means students participate in order to improve future job
prospects. Finally, students participate in study abroad for pleasure. Tourism and social contact
were two of the most important factors in influencing students’ decision to participate in SAPs
(ASHE, 2012). Even though there are benefits to participating in SAPs, academic support and
program costs had a significant impact on students’ intent to study abroad (Goel, de Jong &
Schnusenberg, 2010). Therefore, students participate in SAPs for many reasons.
One of the reasons for students to participate in SAPs is to gain intercultural competence.
Intercultural competence, an ability to appreciate differences among cultures and to
communicate, work, and be supportive of people in cross-cultural settings is becoming a
necessity (Anderson, Lawton, Rexeisen, & Hubbard, 2006; Behrnd & Porzelt, 2012; Clarke,
Flaherty, Wright, & McMillen, 2009). SAPs provide an opportunity to become immersed in
another culture. Furthermore, when students are abroad, not all learning occurs inside a
classroom, and most intercultural learning is accomplished through experiential learning
(Anderson et al., 2006; Behrnd & Porzelt, 2012; Clarke et al., 2009). Co-curricular programs become just as important as the classroom in improving students’ intercultural competence.
Given the intended benefits as well as costs of SAPs, it is imperative to achieve optimal
student learning outcomes. However, optimal student learning outcomes cannot happen in a vacuum. Student learning outcomes such as improvement in intercultural competence must be
clearly defined and measured. Once they are clearly defined, they must be assessed to ensure
students achieve optimal student learning outcomes. The assessment may also be used to
improve the program, and assessment needs to be practiced across different programs and
institutions. Therefore, the aim of this study was to explore SAPs in terms of intercultural
competence and to recommend a solution to improve accountability and consistency of these
programs.
Statement of the Problem
Although the increase in participation in SAPs is a positive trend, there are ongoing issues
of consistency and quality control across programs. First, significant questions remain as to
whether SAPs adequately meet one of their main objectives of increasing students’ intercultural
competence. Second, the selection and implementation of appropriate assessment instruments to
measure the objective is critical in understanding SAPs for program improvements. Appropriate
assessments can benefit both the institutions, in achieving their objectives, and the student experience, and yet many programs are not assessed at all or are not assessed appropriately.
SAPs aim to enhance the overall academic experience for both undergraduate and
graduate students and to help them gain cultural competence (Anderson et al., 2006; Hadis,
2005; Hallows, Wolf, & Marks, 2011). Intercultural competence improvement is a substantial
part of the program’s overall objectives, but it is seldom clearly identified in comparison to
academic objectives (Mills, Deviney, & Ball, 2010; Pedersen, 2010; Wang, Peyvandi, &
Moghaddam, 2011). Intercultural competence does not happen automatically by simply living or
traveling overseas (Anderson et al., 2006; Behrnd & Porzelt, 2012; Clarke et al., 2009). Clear
goals and objectives are needed.
Currently, SAP effectiveness has been assessed using simplistic methods (Hadis, 2005;
McLeod & Wainwright, 2009) and often measured by an increase in participation and the positive
feedback received from the participants (McLeod & Wainwright, 2009). Based on a 2011 survey
by The Forum on Education Abroad (2012), 72% of study abroad program offices conducted
analysis on student satisfaction, but only 42% analyzed data for student learning and 22% for
gains in language acquisition (Table 1). The offices are more interested in assessing student
satisfaction of the program than student learning outcomes. Thus, evaluating the effectiveness of
SAPs regarding learning outcomes is important.
Table 1
The Forum on Education Abroad 2011 State of Field Survey on the Types of Analyses of Student
Data
Question: What types of analyses of student data does your office conduct?
Type of data Percent
Number of students abroad 94%
Study Abroad destinations 89%
Student demographics 73%
Assessment of student satisfaction 72%
Assessment of student learning 42%
GPA 30%
Gains in language acquisition 22%
Relationship between learning outcomes and study abroad program type 20%
Impact on the major 8%
We do NOT conduct data analysis 2%
Other 8%
Measuring the improvement of students’ intercultural competence without the right tools is challenging. There are some studies focused on the effectiveness of the study
abroad program by measuring the increase of the students’ intercultural competence, but only a
limited number of programs employ pre-post assessments in an attempt to quantify changes in
intercultural competence (Anderson et al., 2006; Hadis, 2005). Behrnd and Porzelt (2012)
pointed out that the challenge in measuring intercultural competence is assessing not only
knowledge and skills but attitudes and awareness as well. The program requires rigorous
scientific assessments that include both quantitative and qualitative components, and using pre-
and post-assessment is necessary in considering the impact of the program (Anderson et al.,
2006; Clarke et al., 2009; Pedersen, 2010). By using a more comprehensive approach, SAPs
may be able to analyze students’ intercultural competence development. SAP assessment data
are sometimes used for institutional accreditation by an external accrediting body (The Forum on
Education Abroad, 2014).
SAPs need to clearly state their goals and objectives, and since one of the goals is
intercultural competence, SAPs should select appropriate instruments to measure effectiveness.
With the data, SAPs would be able to conduct evaluations and make necessary recommendations
for improvement. Despite the potential value of SAPs, the effectiveness of varying programs in
achieving their intended goals is questionable. The accreditation process may provide the best
framework at this point for evaluating and ensuring quality of SAPs.
Purpose of the Study
Increasingly, colleges and universities are subject to quality assurance of SAPs (Brewer
& Brockington, 2013). The purpose of this descriptive study was to explore the current
objectives and assessment practice of SAPs, the accreditation standards for SAPs, and ideas to
improve accountability of SAPs. As a result of such an understanding, specific
recommendations were offered to improve assessment and accountability of SAPs.
The following questions guided this study:
1. What are the objectives of selected colleges and universities’ undergraduate study
abroad programs?
a. To what extent do institutions clearly state their objectives?
b. Is there a common theme in the stated objectives?
2. What are the stated criteria for measuring the effectiveness of undergraduate study
abroad programs of selected colleges and universities and how clearly and readily
stated are the criteria?
3. What are the undergraduate study abroad program standards as published by the six
regional accrediting agencies’ websites?
4. What are the notable gaps in the assessment and accountability of study abroad
programs?
5. What recommendation would help to improve assessment and accountability of study
abroad programs?
The first three questions, covering the descriptive findings of this study, were answered in chapter four. They then provided the basis for questions four and five, which addressed the gaps and improvements discussed in chapter five.
Significance of the Study
This study examined different undergraduate SAPs in the United States to provide an
overview of the current state of the field in evaluation and assessment of SAPs, with the
intention of identifying approaches for greater accountability and consistency for evaluating
effectiveness of these programs. In recent years, SAPs have received more attention and, in turn, there is a plethora of research on student learning outcomes. Even though much of the research
showed positive outcomes, “the outcomes may not be as overwhelmingly positive as educators
wish to believe or as warranted by the substantial investment of time and resources required by
institutions and individuals” (Twombly, Salisbury, Tumanut, & Klute, 2012, p. x). With limited
resources available, it is imperative to have assurance of a consistent level of quality across SAPs.
“This can be done by explicitly designing and delivering each study abroad program around
clearly identified educational outcomes” (Twombly et al., 2012, p. x). Greater accountability
will result from clear learning outcomes and assessments.
This study also sought to inform administrators of the need to strategically implement
effective assessment of SAPs. According to McLeod and Wainwright (2009), most SAPs have
done well in terms of evaluation using simplistic research methods, but “more needs to be done
to fully evaluate the study abroad experience” (p. 66). The recommendations offered higher education administrators guidance in developing a self-evaluation study for SAPs. This review
process would help to
Engage in a process that will allow the institution to gauge the importance of
internationalization to an institution’s mission and identity, involve members of the
institution in a discussion of what the institution’s internationalization is and should be,
and pay attention to the kinds of support that exist or are lacking. (Brewer &
Brockington, 2013, p. 3)
This process would aid in identifying approaches for greater accountability and consistency for
evaluating the effectiveness of these programs.
Definitions and Terms
SAPs are defined as all educational programs that occur outside the student’s country of
origin (Kitsantas, 2004). For the purpose of the study, the definition of SAP will only include
formal educational programs (Zhang, 2011). There are three types of SAP in terms of duration:
Short-term programs are 2 to 8 weeks in length (e.g., Summer Study Abroad); mid-length programs are a semester or one to two quarters in length; and year-long programs span an academic year, not including summer (Farrugia & Bhandari, 2013). There are many different types of
SAPs: Faculty-led Study Tour is one in which students are accompanied by a faculty member,
and coursework is offered in English as students visit many cities or countries (Hoffa, 2007);
Integrated university study or direct enrollment program is participation in regular courses
provided by the host university where host-university students attend as well (Vande Berg, n.d.);
Island program is usually taught in English and entails students’ taking courses with other
students from home institution (Vande Berg, n.d.); Service-Learning Trip is a type of
experiential study involving volunteer work abroad (Sachau, Brasher, & Fee, 2010); Travel
seminar or Study Tour refers to students’ visiting different sites traveling from country to
country or city to city (Sachau, Brasher, & Fee, 2010); University Hybrid program is a program
where students take both specially arranged courses for study abroad students and regular courses
offered by the host institutions (Vande Berg, n.d.); and University Institute program is an
institute created by the host institution for international students. Students take courses with
other international students rather than with regular students from the host institutions (Vande Berg, n.d.).
There are two main definitions to describe a person when evaluating intercultural
competence. A person is ethnocentric when he or she has a “natural tendency to look at the
world primarily from the perspective of one’s own culture and to evaluate all other groups from
that viewpoint” (Engle & Martin, 2010, p. 8). A person is ethnorelative when he or she has
“developed ability to adapt one’s behaviour and judgments to a variety of standards and customs,
to perceive and experience other cultures empathetically, on their own terms” (Engle & Martin,
2010, p. 8).
The following terms are often used in assessment. Assessment is a qualitative or
quantitative “collection of statistical data on individual performance or from individual
activities” (Engle & Martin, 2010, p. 8); Evaluation is an “interpretation or analysis of scores and
statistics, along with other types of information, in order to formulate a judgment or conclusion
about the value, quality, merit, etc. of whatever is being evaluated” (Engle & Martin, 2010, p. 8);
Outcomes is “Specific abilities, knowledge, values, attitudes developed through study abroad.
Examples: listening comprehension, the number and richness of friendships formed abroad”
(Engle & Martin, 2010, p. 8); and Student Learning Outcomes is “A means for clearly stating the
expected outcomes of instruction, or more broadly, the expected outcomes of a program”
(Moskal, Ellis, & Keon, 2008, p. 272).
The following terms are found in accreditation: Higher Education Accreditation is “A peer
review process coordinated by accreditation commissions and member institutions, with the goal
of ensuring that the education provided by colleges and universities meets certain measures of
quality” (American Council on Education, n.d.) and Self-evaluation Study measures the
program’s “performance against the standards established by the accrediting agency” (US
Department of Education, n.d.).
CHAPTER TWO: LITERATURE REVIEW
What are the non-academic goals of SAPs? How are institutions assessing these programs? What is being done to ensure that assessments are conducted on a regular basis? This literature review
attempts to answer some of these questions through three sections: SAPs, intercultural
competence, and accreditation. The section on SAPs has four parts. First, a brief origin of SAPs
is discussed. Second, the classifications created by The Forum on Education Abroad and also by
Engle and Engle (2003) are presented in an attempt to provide much-needed structure, which will
help in assessing SAPs across different program types and length. Third, diversification of SAPs
in program length, providers, and methods is discussed. Fourth, patterns and trends in increasing
participation are presented. Because studying abroad has been thought to increase intercultural
competence (ASHE, 2012; Clarke et al., 2009; Emert & Pearson, 2007; Shaftel, Shaftel, &
Ahluwalia, 2007), the intercultural competence section discusses the definition of intercultural
competence, the impact of the duration of the SAP on intercultural competence, and, finally,
different assessment practices of intercultural competence. The final section on accreditation
provides a brief overview on accreditation in the United States, and three reasons for using
accreditation as a framework of SAPs.
Study Abroad Programs
Origins of Study Abroad Program
Study abroad has a long history and, until recently, only the economic or intellectual elite
participated. According to Hoffa (2007), the history of study abroad in the United States began as early as the colonial period. Often sons and, more rarely, daughters of wealthy landowners
did the Grand Tour. The Grand Tour was “a purposeful exposure to foreign ways and values”
(Hoffa, 2007, p.14) for aristocratic families throughout Europe. American elites on the Grand
Tour “pursued social, diplomatic, familiar and pragmatic ends much more than they sought
anything resembling academic knowledge” (Hoffa, 2007, p.15). According to Hoffa (2007),
success of the Grand Tour provided the model for later SAPs. Formal SAPs began in 1880 with
a summer study tour run by Indiana University followed by a volunteer program in Asia by
Princeton University in 1892 (Bolen, 2001). There was a limited number of programs available
as well as a low number of participants at that time. Similar to colonial times, participants were
mostly economic or intellectual elites.
American approaches to higher education helped to foster the growth of study abroad
participation. According to Hoffa (2007), in the 1920s, study abroad participation became
possible as part of undergraduate studies due to the American modular credit system. “With the
modular credit system in place, students could take course not just in another domestic
institution, but also in an accredited program or directly from an overseas university, without
impeding their progress toward their degree” (Hoffa, 2007, p.58). According to Bolen (2001),
even with the creation of Fulbright scholarships after World War II, studying abroad was still focused on academic elites, serving only 90,000 American participants in the 50 years after World War II. Study abroad has a long history, but earning credits toward a degree did not start until the 1920s (Hoffa, 2007).
Overall, early SAPs focused on non-academic experiences. The programs exposed participants to other cultures and peoples. However, it was not until the 1920s that students earned credits toward a degree. Even with this long history and the ability to earn credit toward a degree, participation was still limited to academic elites.
Diversification of Study Abroad Programs
In the past twenty years, changes in SAPs were shaped by increasing efforts to internationalize campuses and student bodies. This expansion in the types of SAPs offered was a
contributing factor in increased study abroad participation. Colleges and universities diversified
SAP offerings by accommodating majors previously not supported, offering more locations
including developing nations, and varying duration of the programs. To date, there are more
than 400 institutions, organizations, and other service providers represented at the National
Association of Foreign Student Advisers (NAFSA) conferences to distribute information about
their services and programs (ASHE, 2012).
There are a variety of study abroad providers through the students’ home institution,
another college or university, or through a third-party provider. However, the SAP must first be approved by the student’s home institution in order for the student to receive academic credits, and there are
some common models of study abroad providers. The institutionally administered program is
developed and run by the sponsoring US college or university in all aspects. There is another
model where consortia of colleges share study abroad offerings, such as those provided by the
California State University International Programs. Some colleges and universities provide SAPs by outsourcing them to programs established and run by other colleges and universities, while others work directly with foreign institutions to grant credit for work done.
Third party providers are organizations that develop and run the programs in all aspects,
including control of instruction, housing, recording of grades, and providing other supports. In
the 2011-2012 academic term, 25.9% of study abroad students participated through third party
providers (Institute of International Education, 2013).
Figure 2. Study Abroad Program Student Participation by Program Length, 2011-12 (Farrugia &
Bhandari, 2013).
Moreover, junior year abroad is no longer the only or even most popular form of study
abroad program. During the 2001-2002 term, 8.3% of undergraduate degree seeking students
participated in academic year or calendar year programs. However, only 3.3% of
undergraduate degree-seeking students participated in a year-long program in the 2011-2012
term (Institute of International Education, 2013). The Institute of International Education (2013)
reports the participation rate broken down by length of program: summer term, one semester,
eight weeks or less during an academic year, January term, academic year, quarter, two quarters,
calendar year, and others (Figure 2). A short-term program is defined as between 2 and 8 weeks
in length or as a summer SAP regardless of length. Mid-length programs take place over a
semester or one or two quarters. During the 2011-2012 term, 58.5% of students participated in a short-term program, and 37.9% of students participated in a mid-length program (Farrugia & Bhandari, 2013).
[Figure 2 data, 2011/12: Summer Term, 37.1%; One Semester, 35.0%; 8 Weeks or Less During Academic Year, 14.4%; January Term, 7.0%; Academic Year, 3.2%; One Quarter, 2.5%; Two Quarters, 0.4%; Calendar Year, 0.1%; Other, 0.3%.]
Colleges and universities also sought to diversify not only the program offerings but also
participating students. When federal financial aid became available for study abroad students in
1992, it helped to increase the population of study abroad students. In the 1991-1992 term, there
were 71,154 students participating in SAP. Two decades later, during the 2011-2012 term, there
were 283,332 students. SAPs grew almost fourfold in two decades. Campuses were also able to
make small changes to diversify student participation. In 2001-2002, 17.1% of students
participating in study abroad program were non-White, and that number increased to 23.6% in
the 2011-2012 (Figure 3). Underrepresentation of minorities in higher education overall is
mirrored in study abroad participation. As a percentage of the total study abroad population in
the 2011-2012 term, racial/ethnic minority group participation was as follows: Asian American,
Native Hawaiian and other Pacific Islander (7.7%), Hispanic/Latino(a) (7.6%), African
American (5.3%), multi-racial (2.5%), and Native American/Alaskan Native students (0.5%).
Racial breakdown has not changed much since the 2001-2002 term. Over the decade, Asian
American, Native Hawaiian and other Pacific Islander participation has increased from 5.8% to
7.7%, Hispanic/Latino(a) participation from 5.4% to 7.6%, and African American participation
from 3.5% to 5.3% (Farrugia & Bhandari, 2013).
In the past twenty years, SAPs witnessed explosive growth, and three contributing factors
helped to increase student participation. First, students were able to choose from a variety of
programs offered by the home institution, host institution, or a third-party provider. Second,
students were able to choose from different program lengths; no longer locked into Junior Year
Abroad, they could choose a short-term, mid-length, or year-long program. Lastly, the
availability of federal financial aid beginning in 1992 coincided with a steady increase in
student participation. Since 1992, more minority students have participated in SAPs; SAPs are
no longer reserved for the academic elite, though participants remain predominantly White.
Figure 3. Study Abroad Program Student Participation by Race, 2011-12 (Farrugia & Bhandari,
2013).
Assessment of Study Abroad Programs
The Forum on Education Abroad is a non-profit organization with over 650 members that
focuses on “developing and implementing standards of good practice, encouraging and
supporting research initiatives, and offering educational programs and resources to its members”
(The Forum on Education Abroad, n.d.). The Forum was started by a group of education abroad
professionals in May 2000 and incorporated in July 2001. In 2005, the US Department of Justice
and the Federal Trade Commission recognized the Forum as the Standards Development
Organization (SDO) for the field of education abroad:
(1) To develop and present voluntary consensus standards for education abroad
programs, for domestic colleges and universities and entities in other nations that
provide or partner in providing education abroad programs for students from US
colleges and universities; and (2) to present standards and methods for assessing
performance against the standards that can be used by the smallest and simplest
organizations interested in self-improvement, through to the largest and most
complex organizations in the education abroad field. (Fountain, 2004)
The Forum on Education Abroad created a classification of program types based on a descriptive
approach, and programs fall under one of these classifications (ASHE, 2012). The Forum also
published the second edition of its Education Abroad Glossary in 2011. According to The Forum
on Education Abroad (2011), field study programs consist primarily of experiential study outside
the classroom; integrated university study means participating in regular courses at the host
university alongside host-university students; an overseas branch campus is a branch of a US
college or university in a foreign country; a study abroad center offers classroom courses
designed specifically for study abroad students; and a travel seminar involves students traveling
from country to country. The Forum attempts to categorize the various programs and subtypes
into one of 24 descriptors (ASHE, 2012). According to ASHE (2012), some of the most
commonly used descriptors are: direct enrollment programs, in which students study at an
overseas university without the formal assistance of a program provider; faculty-led programs,
led and directed by a faculty member from the home institution and usually short term in
duration; island programs, in which US students typically live and study together; and
immersion programs, which emphasize the integration of US students with the host culture.
There are criticisms of these kinds of classifications. Descriptive labeling of SAPs may not
capture the complex nature of a program. Classification alone does not provide a full picture of
the program or what students are expected to learn.
Engle and Engle (2003) developed a “level-based classification system” based on seven
program components: duration; entry target-language competence; language used in course
work; academic work context; housing; provisions for cultural interaction and experiential
learning; and guided reflection on the cultural experience. Level one is a study tour led by
home-institution faculty and lasting from several days to a few weeks. Level two is short-term
study, 3 to 8 weeks in duration or a summer program, in which students usually take courses
together with other SAP participants or at an institute for foreign students. Level three is a
cross-cultural contact program; it is a semester long, and its orientation program provides guided
reflection on the cultural experience. Courses are taken with other study abroad participants or
other international students. Level four is a cross-cultural encounter program of a semester or a
year. This level requires pre-advanced to advanced knowledge of the target language,
occasionally adds cultural integration activities, and provides ongoing guided reflection on the
cultural experience. Level five is a cross-cultural immersion program, in which students take
courses with target-culture students. At this level, regular participation in a cultural integration
program that promotes extensive direct cultural contact is required, and mentoring and other
ongoing orientation programs or courses in cross-cultural perspectives are provided.
Assessing SAPs does not appear to be a high priority for US institutions. According to The
Forum on Education Abroad (2014), “assessing education abroad remains a challenge for forum
member institutions. More institutions report that they identify learning outcomes for their
programs than report having an assessment plan to measure learning outcomes” (p. 2). Indeed,
57% of respondents reported that learning outcomes were identified, but only 39% of
respondents reported having an assessment plan to measure those outcomes (The Forum on
Education Abroad, 2014). According to Emert and Pearson (2007), one way to demonstrate the
legitimacy of a SAP is by clearly demonstrating its educational outcomes.
The Forum on Education Abroad provides basic definitions of the most commonly used
terminology and classifications for categorizing different programs. The Forum’s approach
provides a basic introduction to SAPs; however, a multi-dimensional approach to classifying a
program is preferable to descriptive classification. Engle and Engle’s (2003) approach
recognizes the diversity of twenty-first-century SAPs, has the potential to provide a more
sophisticated and complete description of programs, and can help in developing a program
outcomes assessment process.
Summary of Study Abroad Programs
SAPs have a long and rich history. In the beginning, only the economic or intellectual elite
participated. That is no longer the case, especially since financial aid became available for
SAPs in 1992. Programs grew substantially within the past twenty years in the number of
participating students, program providers, and program lengths. Such growth in SAPs is
remarkable, but it is uncertain whether the quality of programs offered by different providers is
consistent or whether program length has any impact on student learning outcomes. To make
sense of the myriad information on SAPs, The Forum on Education Abroad and Engle and
Engle (2003) created two separate classification systems, which further underscore the complex
nature of SAPs. Assessing SAPs is not simple, and, according to the field survey completed by
The Forum on Education Abroad in 2013, only 39% of institutions had plans to measure
learning outcomes.
Intercultural Competence
Definition of Intercultural Competence
One of the reasons students participate in SAPs is to gain intercultural competence, but
intercultural competence is not clearly defined. Deardorff (2006) argued that there has been no
agreement on the definition of intercultural competence and that the specific items used must be
defined in order to assess students in this area. She surveyed 24 higher education administrators
and a panel of 23 intercultural scholars to compile a list of intercultural competence definitions
and elements. Only items with 80% or higher agreement made the list; ultimately, nine
definitions reached consensus. The study also identified 22 specific components of intercultural
competence, among them “understanding others’ worldview”, “adaptability and adjustment to
new cultural environment”, and “understanding the value of cultural diversity” (Deardorff, 2006,
pp. 249-250). Findings showed that both the administrators and the scholars preferred
definitions of intercultural competence that are general and broad over specific ones. Although
it is difficult to find consensus on a definition of intercultural competence (Emert & Pearson,
2007; Deardorff, 2006), it can be defined as “the ability to communicate effectively and
appropriately in intercultural situations based on one’s intercultural knowledge, skills, and
attitudes” (Deardorff, 2006, pp. 247-248). This definition encompasses three dimensions of
intercultural competence.
The three dimensions of intercultural competence are cognitive, affective, and behavioral
(Emert & Pearson, 2007; Martin, 1987; Williams, 2009). According to Williams (2009), the
cognitive dimension includes knowledge of “cultural norms, values, behaviors, and issues”
(p. 290); the affective dimension relates to “the flexibility to adapt to new situations and open-
mindedness to encounter new values” (p. 290); and the behavioral dimension includes “critical
skills such as resourcefulness, problem-solving skills, and culturally-appropriate people skills”
(p. 290). Behrnd and Porzelt (2012) define intercultural competence with three aspects:
cognitive, affective, and conative. They share Williams’s (2009) definitions of the cognitive and
affective aspects but differ slightly on the third. According to Behrnd and Porzelt (2012), “The
conative aspects reflect consciousness and knowledge of different communication styles and
non-verbal communication. This includes the identification and the effective appliance of
different communication styles” (p. 215). Although there are variations in its definition,
intercultural competence involves change to the whole person: it affects brain, heart, and action.
Increasing Intercultural Competence
The duration of time spent abroad is positively correlated with intercultural competence.
Martin (1987) studied “the relationship between previous intercultural experience and perceived
ability on selected intercultural competencies” (p. 342). A sample of 175 students answered an
18-item questionnaire measuring the three dimensions of intercultural competence: cognitive,
affective, and behavioral. The study found that students’ self-reported abilities were related to
the duration of their intercultural experience. Students with 3 months or more of experience
showed the greatest difference in their awareness of self and culture; no significant difference
was found between students with less than 3 months of experience and those with none.
A study by Behrnd and Porzelt (2012) investigated whether students who had study abroad
experience but no prior intercultural training showed higher intercultural competence than
students without study abroad experience. The study administered a 31-item self-report
questionnaire measuring the three aspects of intercultural competence to 72 German students.
Initial findings showed no significant difference between students with and without the
experience; however, after further examination of one variable, the length of the abroad
experience, a significant relationship emerged between stays of more than 10 months and
intercultural competence.
Studying abroad alone does not increase intercultural competence. Pederson (2010)
used the Intercultural Development Inventory to compare three groups of students. Group 1
students, who participated in a year-long study abroad program, received an intervention through
“a Psychology of Group Dynamic course which integrated intercultural effectiveness and
diversity training pedagogy including cultural immersion, guided reflection, and intercultural
training” (p. 70). Group 2 students participated in a year-long SAP but did not receive the
intervention. Group 3 students did not participate in a study abroad program. A statistically
significant
difference was found between group 1 and groups 2 and 3. No statistically significant difference
was found between groups 2 and 3. The study revealed that studying abroad alone does not
contribute to a statistically significant increase in intercultural competence.
Receiving intercultural training prior to studying abroad, like taking an intervention course,
positively influenced gains in intercultural competence. A duration of ten months or longer also
correlated positively with an increase in intercultural competence. Therefore, participating in
study abroad alone does not necessarily translate into increased intercultural competence;
other factors, such as training, intervention courses, and duration, do affect intercultural
competence.
Measurement of Intercultural Competence
Different instruments exist to measure the effects of studying abroad, and each measures
different aspects of intercultural competence. Emert and Pearson (2007) studied 88 community
college students enrolled in a Costa Rica program between 2004 and 2006 and 43 students
enrolled in an Oxford University program between 2003 and 2005. The Intercultural
Development Inventory was administered as part of a mandatory pre-departure orientation and
was used for program assessment rather than assessment of individual students’ intercultural
learning. On the post-test after the program, they found small decreases in ethnocentric
tendencies. The Oxford program produced greater development than the Costa Rica program;
because program requirements, course content, and student characteristics differed, it is difficult
to identify the contributing factors. The study revealed that students’ experience in a SAP does
foster positive growth in worldview.
Students who studied abroad showed higher intercultural competence than students
who did not. Clarke et al. (2009) studied two groups of undergraduate business students to
determine whether students who study abroad are more globally minded, report greater
intercultural communication skills, are more open to diversity, and are more interculturally
sensitive than students who have not studied abroad. One group of 70 students took four
courses on campus, and another group of 87 students completed the same coursework taught by
host-country instructors in a faculty-led study abroad program in Belgium. Students in the
Belgium program also completed a course on “European Business Environment” while visiting
businesses, governmental institutions, and cultural sites in six Western European countries.
Based on a survey completed at the conclusion of the semester, the study abroad participants
were more globally minded, more open to intercultural communication, and more open to
diversity than the non-SAP group. Study abroad students also had significantly higher scores on
the last two stages of the intercultural sensitivity index, adaptation and integration; students in
these stages are able to empathize with and experience life through the values of a different
culture.
Shaftel et al. (2007) conducted a large-scale study of the impact of overseas experience
on undergraduate professional school students. The study used the Cross-Cultural Adaptability
Inventory, which consists of 50 items rated on a six-point scale from “not at all like me” to
“very much like me”; additional student attitude survey questions were developed by the
researchers. The study included 660 students in two groups: 352 summer students and 118
regular-semester students participating in a study abroad program in Italy, and 118 summer
students and 72 regular-semester students on campus. Data were collected at the beginning and
end of the term for four consecutive terms: two summers, fall, and spring. The study found
statistically significant positive change in Cross-Cultural Adaptability Inventory total scores for
both summer and semester program students, with the summer Italy travel group showing the
most positive change. On the pre-departure measure, the semester study abroad students scored
higher on all factors than the campus group, implying that students who choose semester
programs already have higher intercultural competence. A post-only assessment therefore does
not accurately capture the change that occurred.
Summary of Intercultural Competence
Intercultural competence is an important learning outcome for SAPs. However,
participation in a SAP does not guarantee gains in intercultural competence. Many factors
contribute to an increase in intercultural competence; duration is one of them, as the length of a
SAP has a positive effect on intercultural competence. Another factor is receiving intercultural
training or courses prior to study abroad participation. There are three dimensions of
intercultural competence that need to be measured: knowledge, adaptability and open-
mindedness, and people skills. Different instruments measure intercultural competence, and the
choice of instrument depends on the definition of intercultural competence, which in turn
requires a clearly stated learning outcome. At least two instruments are widely used: the
Intercultural Development Inventory and the Cross-Cultural Adaptability Inventory. Both can be
administered pre- and post-departure to measure change.
Accreditation
Although SAPs have been part of the higher education landscape for a long time, it is
difficult to understand how they are assessed and reviewed. Most institutions that offer SAPs
are themselves assessed, reviewed, and accredited by an accrediting agency. Accreditation in the
United States is one of the oldest systems in the world that promotes self-regulation through peer
and professional review for quality assurance and improvement (Eaton, 2010). The US
Department of Education defines accreditation as “the status of public recognition that an
accrediting agency grants to an educational institution or program that meets the agency's
standards and requirements” (US Department of Education, 2012, p. 4), and the Council for
Higher Education Accreditation defines it as “a process of external quality review created
and used by higher education to scrutinize colleges, universities and programs for quality
assurance and quality improvement” (CHEA, 2014, p. 1).
Today, 19 institutional and 61 programmatic accreditors provide accreditation to more than
seven thousand colleges and universities and twenty thousand programs (Eaton, 2010). An
accrediting agency is “a legal entity, or that part of a legal entity, that conducts accrediting
activities through voluntary, non-Federal peer review and makes decisions concerning the
accreditation or pre-accreditation status of institutions, programs, or both” (US Department of
Education, 2012, p. 4). Currently, most accreditors do not provide guidelines for quality
standards in reviewing SAPs, and there is a lack of comprehensive assessment of SAPs.
McLeod and Wainwright (2009) proposed that a fuller evaluation of the study abroad experience
is needed, and Engle and Engle (2003) likewise called for focusing more on the quality of
participants’ experience than on the number of participants.
There are three reasons for using accreditation as a framework for SAPs. First, SAPs are
complex and diverse, and there is no single right way to measure their effectiveness. Accrediting
agencies provide guidelines for quality standards but allow flexibility that accommodates
institutional and programmatic differences (Moskal, Ellis, & Keon, 2008). A self-evaluation
study process gives institutions an opportunity to assess programs and implement improvements,
and peer review focuses on institutional quality against the institutional mission. Second,
according to a 2011 survey by The Forum on Education Abroad (Table 2), 82% of those
surveyed stated that their schools accept an official transcript from an accredited host institution.
Only 21% of receiving institutions have courses evaluated by academic units such as academic
departments, deans’ offices, or academic advising offices; the rest rely heavily on course
evaluation by an accredited host institution. Accreditation is seen as a stamp of approval, and
colleges and universities often rely on an accrediting body to have done due diligence. Third,
resources are scarce and rarely find their way into non-required assessment efforts; assessments
required by accreditation standards often become a key focus in higher education (Wehlburg,
2013). Accreditation would thus prompt colleges and universities to allocate resources toward
program assessment.
Table 2
The Forum on Education Abroad 2011 State of Field Survey on Transcripts
Question: In order to accept credit from an education abroad provider, what kind of
transcript or record of courses does your institution accept? (select all that apply)
Responses Percent
an official transcript from an accredited host institution 82%
an official transcript from a U.S. school of record 66%
a record of courses from the program provider 36%
a jointly issued transcript from the host institution and program provider 30%
Other 4%
Accreditation is trusted by the public to provide assurance of a program’s or institution’s
quality. It is also a process that promotes continuous improvement. It has standards but is
flexible enough to accommodate differences among institutions and programs. It is not a perfect
system, but it provides a good starting point for SAPs to follow.
Summary of Literature Review
Study abroad has existed for a long time. Formal SAPs began in 1880 with a summer
study tour, and the current format emerged in the 1920s, with students earning credit toward
degree completion. SAPs are complex and diverse; there is no one type of program, and they
differ in duration, location, and provider. There is no easy way to determine the quality of one
program relative to another without evaluating each one, and that evaluation requires resources
many institutions do not have. An easier way to determine program quality is needed.
One of the objectives of SAPs is to increase intercultural competence. Since intercultural
competence is a complicated construct, it must be defined before the right instrument can be
selected to measure it. Many instruments are available to measure intercultural competence, but
two are widely used: the Intercultural Development Inventory and the Cross-Cultural
Adaptability Inventory. According to the survey by The Forum on Education Abroad, 57% of
institutions identified study abroad learning outcomes, but only 39% have plans to measure
those outcomes. Institutions would need to identify student learning outcomes precisely so that
an appropriate instrument can be used to measure them.
Accreditation provides solid guidelines and gives the public a sense of quality assurance.
However, no accrediting agency oversees SAPs. Academic effectiveness is often addressed
through the academic credit standards required by accrediting agencies, but since SAPs provide
both academic and non-academic experiences, they require more than academic review. It is
unclear how the real benefits of SAPs, which occur largely outside the classroom, are measured.
With the information currently available, students cannot easily determine the effectiveness of
SAPs. Therefore, accountability and assessment of SAPs need to be addressed.
CHAPTER THREE: METHODOLOGY
The pursuit of accountability in higher education is not a new phenomenon, but a widely
accepted definition of quality is lacking (Blanco-Ramirez & Berger, 2014). When SAPs were
small and served a limited number of participants, they were seen as programs for an elite few.
As more colleges and universities offered diverse SAPs and participation increased, establishing
accountability for SAPs became necessary. This chapter reviews the methods used in this
descriptive study to understand SAPs in terms of program objectives, assessment, and
accreditation. The study was completed in two phases: first, publicly available information
regarding SAPs was explored and compiled; second, surveys and interviews were conducted to
provide fuller context for the data. The combined data collection methods of document analysis,
surveys, and interviews provided answers to the first three research questions:
1. What are the objectives of selected colleges and universities’ undergraduate study
abroad programs?
a. To what extent do institutions clearly state their objectives?
b. Is there a common theme in the stated objectives?
2. What are the stated criteria for measuring the effectiveness of undergraduate study
abroad programs of selected colleges and universities and how clearly and readily
stated are the criteria?
3. What are the undergraduate study abroad program standards as published by the six
regional accrediting agencies’ websites?
The resulting data formed the basis for addressing the subsequent research questions:
4. What are the notable gaps in the assessment and accountability of study abroad
programs?
5. What recommendations would help to improve the assessment and accountability of
study abroad programs?
This chapter first discusses relevant demographic characteristics of the participating
institutions. Second, relevant guidelines stated by the accrediting agencies are discussed. Third,
the instrumentation, including protocols, is discussed. Finally, the data collection procedures of
document review, survey, and interview are presented, followed by a description of the data
analysis. After document analysis was completed, a second phase of data collection was
undertaken: a survey was developed and administered to provide additional detail on the
document analysis findings, and interviews complemented the document analysis and surveys
by adding further context.
Participants
A total of 283,332 US students studied abroad for credit during the 2011-2012 term; of
that number, 245,649 were undergraduates, representing roughly 9.4% of all students pursuing
an undergraduate degree (Farrugia & Bhandari, 2013). Large research universities send large
numbers of students abroad; however, baccalaureate institutions have high study abroad
participation rates. Each year, the Institute of International Education publishes a list of colleges
and universities with an estimated undergraduate study abroad participation rate of over 70%.
This participation rate is calculated by “dividing the total number of undergraduate study abroad
students by the number of undergraduate degrees conferred as reported in Integrated
Postsecondary Education Data System (IPEDS)” (Farrugia & Bhandari, 2013, p. 20). This study
examined the list of colleges and universities reported in the Open Doors 2013 publication
(Table 3). The list consisted of 36 colleges and universities: based on Carnegie classification,
two were doctorate-granting universities, six were master’s colleges and universities, and 28
were baccalaureate colleges (Farrugia & Bhandari, 2013). These 36 institutions were selected
for their significant study abroad participation rates and comprised the population for this study.
Table 3 provides an overview of the type and participation rates of these institutions.
Table 3
Institutions with an estimated undergraduate study abroad participation rate of over 70%
compiled from Open Doors, 2013
Name of Institution    Number of Participants    Participation Rate*    Highest Degree**
American University 1,175 72.4 Doctorate
Arcadia University 646 138.0 Master
Austin College 295 107.7
Babson College 260 N/A
Boston College 1,110 74.5 Doctorate
Bryn Mawr College 203 72.8
Calvin College 633 86.8
Carleton College 390 87.4
Centenary College of Louisiana 117 81.3
Centre College 372 132.4
Colorado College 500 91.9
Davidson College 328 73.7
DePauw University 507 113.4
Dickinson College 370 72.1
Earlham College 174 75.0
Elon University 1,092 96.5 Master
Goshen College 160 80.4
Goucher College 340 125.0
Hamline University 375 93.3 Master
Hartwick College 205 70.7
Kalamazoo College 244 80.8
Lee University 629 92.0 Master
Lewis and Clark College 337 78.6
Macalester College 308 77.8
Oberlin College 532 73.8
Saint Mary’s College of Maryland 349 79.7
Saint Olaf College 678 91.6
Soka University of America 65 138.3
Susquehanna University 380 75.7
Taylor University 518 128.9
University of Dallas 261 89.4 Master
University of Richmond 261 82.6
Washington and Lee University 599 79.3
Webber International University 182 134.8
Whitworth University 347 73.8 Master
Wofford College 339 109.7
*The estimates of undergraduate participation rates may exceed 100% due to factors such as
students studying abroad more than once, student attrition, and varying cohort sizes from year to
year.
**Highest degree offered is Bachelor’s degree unless otherwise noted
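As an illustrative sketch (not part of the original study), the Open Doors participation-rate calculation quoted above can be expressed as a short function; the degrees-conferred figure in the example is hypothetical and chosen only to show how rates can exceed 100%:

```python
def participation_rate(study_abroad_students: int, degrees_conferred: int) -> float:
    """Estimated undergraduate study abroad participation rate (percent):
    study abroad students divided by undergraduate degrees conferred
    (as reported in IPEDS), per the Open Doors method."""
    return round(study_abroad_students / degrees_conferred * 100, 1)

# Hypothetical example: 646 participants against an assumed 468 degrees
# conferred yields a rate above 100%, which the table note attributes to
# repeat participation, attrition, and varying cohort sizes.
print(participation_rate(646, 468))  # 138.0
```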
Accrediting Agencies
There are four types of accrediting bodies: regional, national, religious, and program.
Regional accrediting agencies were selected for this study. Currently, there is no program
accrediting agency that reviews SAPs. A regional accreditor is one of 19 institutional
accreditors, is made up of six regional accrediting agencies, and is responsible for accrediting
colleges and universities selected for this study. The six accrediting agencies are Middle States
Commission on Higher Education (MSCHE), New England Association of Schools and Colleges
– Commission on Institutions of Higher Education (NEASC-CIHE), North Central Association
of Colleges and Schools – The Higher Leaning Commission (NCACS-HLC), Southern
Association of Colleges and Schools (SACS), Northwest Commission on Colleges and
Universities (NWCCU), Western Association of Schools and Colleges (WASC) Senior College
and University Commission and WASC Accrediting Commission for Community And Junior
Colleges (ACCJC-WASC). Table 4 provides an overview of regional accrediting agencies and
the areas covered by the agencies.
Table 4
Accrediting Agencies and Areas Covered by the Agencies
Middle States Commission on Higher Education (MSCHE): DE, MD, NJ, NY, PA (5), plus DC, PR, and the USVI
New England Association of Schools and Colleges Commission on Institutions of Higher Education (NEASC-CIHE): CT, ME, MA, NH, RI, and VT (6), plus international schools
North Central Association of Colleges and Schools The Higher Learning Commission (NCACS-HLC): AZ, AR, CO, IL, IN, IA, KS, MI, MN, MO, NE, NM, ND, OH, OK, SD, WV, WI, and WY (19)
Southern Association of Colleges and Schools (SACS): TX, LA, MS, AL, KY, TN, VA, NC, SC, GA, and FL (11)
Northwest Commission on Colleges and Universities (NWCCU): AK, ID, MT, NV, OR, UT, and WA (7)
WASC Senior College and University Commission & Western Association of Schools and Colleges Accrediting Commission for Community and Junior Colleges (ACCJC-WASC): CA and HI (2), plus the US Pacific Islands
Instrumentation and Procedure
In keeping with the inquiry practice of gathering “multiple forms of data, such as
interviews, observations, and documents, rather than rely on a single data source” (Creswell,
2009, p. 175), document analysis, surveys, and an interview were conducted to collect data. The
three data sources strengthened the study through data triangulation. Triangulation uses “several
kinds of methods or data, including using both quantitative and qualitative approaches” (Patton,
2002, p. 247) and data triangulation uses a variety of data sources. Different data sources help
test for consistency and provide “deeper insight into the relationship between inquiry approach
and the phenomenon under study” (Patton, 2002, p. 248). Research was completed in two
phases. Document analysis was completed during the first phase, and the results from the
document analysis provided the foundations for creating survey and interview questions.
Surveys and the interview were completed during the second phase. Surveys provided
information that was not available through document analysis, whereas the interview added
context to the survey results.
Document Analysis
Document data was collected between November 19, 2014, and December 17, 2014.
Information was collected by reviewing each institution’s website for mission statements, study
abroad program office website address, contact information, program objectives, program flyers,
number of approved programs, program design, program approval process, program support,
program policies, program advising information, application process, participant information,
program assessment, and program review. Research results were copied directly from their
websites and were organized (Table 5). Once the table was complete, contents were reviewed
for similar themes under each column. Another table was created to record key words and
themes.
Table 5
An Example of a Table to Organize Information Found Through Document Analysis
Columns: Accrediting Agency; Mission Statements; Program Objectives; Evaluation
Rows: College 1, College 2, …, College 36
The six regional accrediting agencies’ websites were researched to understand how undergraduate
SAPs are reviewed. Information was collected by reviewing each agency’s guidelines on
co-curricular programmatic review and its guidelines for the study abroad
program. Particular attention was given to co-curricular program review guidelines since the
focus of this study is on non-academic program objectives. This documentation revealed
current standards set forth for undergraduate SAPs. The regional accrediting agencies’ websites were
searched using keywords such as SAP, off-campus program, international program, academic
credit, and transfer credit.
Survey
A short online survey was distributed to SAP administrators at 35 institutions. Soka
University of America was exempted from the survey since the same questions were asked in the
interview. The survey was open from March 2 to March 20, 2015. The Qualtrics survey tool was
used for the survey, and it was sent via e-mail. E-mail addresses of study abroad program
administrators were gathered from the institutions’ SAP contact information webpages. When
contact information was missing, an additional search was done using the institution’s directory,
Frequently Asked Questions page, and other webpages. The survey was created specifically to
address each research question in this study. The survey comprised eight questions with two
questions having two subset questions each (Appendix A). There were 12 questions altogether,
and four questions were optional (Table 6). At the end of the survey, participants were also
given an opportunity to add an open-ended comment.
Table 6
Survey Questions for Study Abroad Program Administrator
Survey Questions
1 Do you have student learning outcome or program objectives?
1A Do you publish this information online?
1B Is intercultural competence one of your student learning outcomes or program
objectives?
2 Does your regional accrediting agency have a guideline/policy on study abroad
programs?
2A How are you addressing your regional accrediting agency’s policy guideline?
2B How do you feel about assessment being required by the accrediting agency?
3 If your regional accrediting agency has a guideline/policy on study abroad programs,
would you be more likely to assess study abroad programs?
4 How does your institution define effectiveness?
5 What type of assessments/evaluations do you do?
Table 6, continued
6 What instruments are used to assess your study abroad programs?
7 What kind of data do you currently track related to your study abroad programs?
8 What kind of data would you like to track or would be helpful for you to have in order
to assess effectiveness of study abroad programs?
All survey questions were designed to align with the research questions (Table 7).
Survey question one and its optional subsets 1A and 1B aligned with research question one, “What are
the objectives of selected colleges and universities’ undergraduate study abroad programs?” It
was answered by 17 respondents. Questions 1A and 1B were only visible to administrators who
answered yes to survey question one. Survey question two followed the same logic:
questions 2A and 2B were only visible to respondents who answered yes to survey question two. Survey
question two and its subset questions aligned with the research question, “What are the
undergraduate study abroad program’s standards as published by the six regional accrediting
agencies’ websites?” It was answered by 14 institutions. Survey questions four to seven aligned
with the research question, “What are the stated criteria for measuring the effectiveness of
undergraduate SAPs of selected colleges and universities?” These short-answer survey questions
aimed to provide a deeper understanding of how the effectiveness of undergraduate SAPs is measured.
They were answered by seven institutions. Survey questions three and eight indirectly aligned with
the research question, “What recommendations would help to improve assessment and
accountability of study abroad programs?” Answers to survey question three helped to show
whether accreditation policy improves administrators’ assessment practice; data were
provided by seven institutions.
Table 7
Alignment of Survey Protocols and Research Questions
Administrator Survey
Survey question 1 (and subsets 1A, 1B): aligned with the research question on program objectives
Survey question 2 (and subsets 2A, 2B): aligned with the research question on accrediting agencies’ published standards
Survey questions 4, 5, 6, and 7: aligned with the research question on criteria for measuring effectiveness
Survey questions 3 and 8: aligned with the research question on recommendations for improving assessment and accountability
Interview
After the document analysis, an interview was conducted, along with the surveys, as an additional data
source. “The purpose of interviewing is to allow us to enter into the other person’s perspective”
(Patton, 2002, p. 341). For the interview, Alex Okuda, Director of Study Abroad Programs at
Soka University of America, was selected since the institution was accredited by WASC,
and WASC had an SAP policy. Once permission was granted, the interview was recorded and
transcribed. The recording was kept in password-protected storage and will be destroyed after 3
years. Mr. Okuda also allowed his name to be disclosed. The interview was about an hour in length.
It was conducted on March 13, 2015, in Aliso Viejo, California. The interview protocol used a
semi-structured approach and an interview guide. “The advantage of an interview guide is that it
makes sure that the interviewer/evaluator has carefully decided how best to use the limited time
available in an interview situation” (Patton, 2002, p. 343). The interview questions were
compiled ahead of time (Table 8). Many of the interview questions overlap with the survey
questions since the interviewee works at one of the 36 institutions and did not participate in the
survey.
Table 8
Interview Questions
Section A: Background
How long have you been in your present position?
How long have you been in this field?
How long have you been at this institution?
What is your highest degree?
What is your field of study?
Section B: Study Abroad Programs
How many approved Study Abroad Programs do you have?
Do you allow students from other institutions to participate in your program?
What distinguishes your study abroad program from others? OR What makes your
SAP distinct from others?
What kind of student learning outcome or program objectives do you have?
If you do, where is it available? Do you publish it online?
Section C: Assessment
What type of assessments/evaluations do you do (please list all)?
How does your institution define effectiveness?
How do you assess or evaluate your study abroad program?
What kind of data do you currently track related to your SAP?
What kind of data would you like to track or would be helpful for you to have in
order to assess effectiveness of SAPs?
What instruments are used to assess study abroad program?
Are your assessment/evaluation results available to the public? If yes, online?
How do you feel about the current assessment practice of your study abroad
program?
Section D: Accreditation
Has there been a study abroad program review as a part of accreditation
review/self-study?
If yes, do you publish the review?
Do you know about WASC’s new SA policy, and what have you done or are you doing
to meet the requirement?
How would you feel about assessment being required by WASC?
Interview questions were divided into four sections: background, SAPs, assessment, and
accreditation. Background questions helped to provide context for the other questions answered
by the interviewee. The section on SAPs provided answers to the research questions, “What are
the objectives of selected colleges and universities’ undergraduate study abroad programs? (a)
To what extent do institutions clearly state their objectives? (b) Is there a common theme in the
stated objectives?” The accreditation section aligns with the research question, “what are the
undergraduate study abroad program’s standards as published by the six regional accrediting
agencies’ websites?” The assessment section aligns with the research question, “What are the
stated criteria for measuring the effectiveness of undergraduate study abroad programs of
selected colleges and universities and how clearly and readily stated are the criteria?” Interview
questions from the assessment and accreditation sections indirectly provide answers to
research questions four and five, “What are the notable gaps in the assessment and accountability
of study abroad programs?” and “What recommendations would help to improve assessment and
accountability practice for study abroad programs?” The interview also added depth to the
document analysis and surveys.
Data Analytics Strategy
Analysis of the different forms of data followed Creswell’s (2009) six steps: organizing
and preparing the data, reading through all the data, coding the data, generating descriptions or themes for
analysis, interrelating themes and descriptions, and interpreting the data. As a first step,
documents were reviewed. Document analysis provided valuable information that was a
“stimulus for paths of inquiry that can be pursued only through direct observation and
interviewing” (Patton, 2002, p. 294). Short answers in the survey provided rich narrative, and no
additional preparation was required at this step. The interview was recorded with permission,
and the interview was transcribed using Express Scribe Transcription software. Steps two and
three involved reviewing the transcript and coding the data; this comprised highlighting key
words related to intercultural competence, assessment, and accreditation. For the
document analysis, a spreadsheet was created to organize the findings. Coding resulted in
generating a small number of themes or categories. The themes or categories created “headings
in the findings sections of studies” (Creswell, 2009, p. 189). Step five is the process whereby
findings are conveyed through narrative. The themes that emerged from the document analysis,
surveys, and the interview provided a rich narrative for the results section. Finally, step six led to
an interpretation of the data. Findings from this process validated the literature reviewed and
informed the conclusions and recommendations chapter.
CHAPTER FOUR: RESULTS
The purpose of this study was to uncover common themes among SAP objectives and
evaluation practices of selected colleges and universities identified by their high SAP participation
rates. Furthermore, this study aimed to make suggestions for improving the programs through
assessment and accountability. This chapter reviews each research question and reports the
findings while integrating the combined data sources of document analysis, survey, and
interview. Results are organized by themes according to the research questions, beginning with
institutional and SAP information, followed by program objectives, program assessment, and
standards.
Institutional and Study Abroad Program Information
Before examining specific aspects of assessment and accountability in SAPs, it was
necessary to begin with institutional information. The mission statement provides direction and
vision for an institution and, often, each program’s objectives align with that statement. Since
this study examined SAP objectives, it was crucial to review the statements for certain key words
to determine whether they would have a global education focus. Although it was not surprising
to find that 20 out of 36 colleges and universities included words such as diversity, changing
world, complex world, global citizen, global perspective, international engagement, and leaders
of the world in their mission statements, no single concept appeared frequently across institutions.
In terms of availability and presentation of program information, all 36 institutions had a
study abroad program webpage, but not all provided the same level of program details on said
webpage. For some institutions, details were found on the sponsoring department’s website
instead. For example, Macalester College’s study abroad webpage had pictures of different SAP
locations and a brief statement about the SAP. Program specific information was available on
the sponsoring department’s website. The department’s website provided program details such
as location, credit hours earned, staff, excursions, and accommodation. One of the challenges of
comparing one SAP to another is that different institutions use different SAP terminology
even when they offer similar programs. The most notable differences in terminology concerned
short-term programs: many institutions offered them but variously called them
J-term, May term, interim, winter term, or summer term. Overall, the most common features of
the programs were that they were a semester long and faculty-led.
Different types of programs were offered by the institutions and classified according to
the institutional definition. When institutions allow students to participate in programs other than
their own, they sometimes differentiate the programs into groups or levels. Group 1 at
Kalamazoo College comprises the institution’s own programs. Group 2 and Group 3 SAPs are non-
Kalamazoo programs approved for transfer of academic credit; however, Group 2 programs are eligible for
financial aid, while Group 3 programs are not. Each institution may offer a combination of different
types of SAPs. Colorado College offers seven of its own programs, six exchange programs, and
14 Associated Colleges of the Midwest programs. Arcadia University offers over 130 programs
and also has 30 study abroad centers around the world. Goshen College had a unique program
called study-service term program. It is a 13-week program that combines learning about the
local culture through intensive study and language instruction and serving the community
through a volunteer service project.
A systematic review of the publicly available information from SAPs’ websites provided
a glimpse into the different ways SAPs are packaged and presented. First, most institutions’ mission
statements clearly recognized the importance of global education. Second, institutions differed
in who provided SAP information (the sponsoring department versus the SAP office), and
different institutions used different terminology. Third, different types of programs were offered
and classified differently by each institution. Even with a plethora of information available to the
public, it was impossible to determine the quality or effectiveness of the programs.
Program Objectives
Research question 1: What are the objectives of selected colleges and universities’
undergraduate study abroad programs? (a) To what extent do institutions clearly state their
objectives? (b) Is there a common theme in the stated objectives?
The aim was to determine whether the SAP websites provided information such as
program objectives and assessment information, not just academic, logistical, or financial
aspects of the program. From the document analysis, only ten institutions listed program objectives.
Of those ten institutions, all but one listed global citizenship or intercultural
competence. Almost all SAP websites provided information such as academic policies,
transferability of courses, application deadlines, student testimonials, program location details,
and costs of the program.
Out of 18 institutions responding to the survey and interview, 15 institutions indicated
they had student learning outcomes or program objectives, and 11 published this information
online. For example, Soka University of America (SUA), which requires the entire student body
to study abroad, provided goals and objectives on the frequently asked questions page of its
study abroad website. “SUA’s goal is to provide students with the opportunity to develop the
knowledge, skills, experiences and attitudes in order to prepare them for the global society of the
21st century. We believe that study abroad is one of the most effective ways to accomplish these
goals” (Soka University of America, n.d.). SUA’s five objectives for study abroad are
achieve competence in a foreign language by immersing in the life of the host
country; sharpen students’ interpersonal and communication skills through
interacting with people from backgrounds different than their own; equip students
with new skills, broader perspectives, and an appreciation of cultural differences;
foster a sense of global awareness by exploring the role students can play in the
global community; experience the increasing interdependence of the US and other
countries with an international dimension and a global perspective. (Soka
University of America, n.d.)
Another example of a program objective is that of Saint Mary’s College of Maryland (SMCM).
Its Office of International Education’s website lists study abroad student learning outcomes.
Students at SMCM are expected to achieve the outcomes below.
[R]ecognize cultural differences within and between cultures; display intercultural
communication skills when responding to members of other cultures; articulate
their personal growth in response to experiences in another culture that challenges
or deepens their world views; increase independence and self-reliance through
learning to successfully navigate travel logistics, new communities, and host
culture; comprehend and speak a foreign language more proficiently and in
culturally appropriate ways, when studying in a country where the language of the
university and/or program differs from the student’s native language(s). (Saint
Mary’s College of Maryland, n.d.)
Both institutions include intercultural competence, such as intercultural communication skills, as an
objective in their student learning outcomes. The survey did not ask what the student
learning outcomes are. Furthermore, not all institutions listed program objectives. This finding
suggests that institutions may have program objectives but that not all institutions share that
information publicly. It is also clear that a high participation rate does not mean a program is more
likely to have objectives or student learning outcomes or to make them publicly available.
Program Assessments
Research question 2: What are the stated criteria for measuring the effectiveness of
undergraduate study abroad programs of selected colleges and universities and how clearly and
readily stated are the criteria?
The aim of this question was to determine how each institution defined the effectiveness
of SAPs. Defining effectiveness of the SAPs is important for the following reasons. First,
without clarifying the definition of effectiveness, the study abroad program will be difficult to
assess. Second, the type of assessment used by the program reveals what is considered important
by the institution. Third, the selection of instruments is important in measuring the effectiveness
of the SAPs. Fourth, the data currently being tracked do not always measure the
effectiveness of the SAP.
Document analysis yielded no findings; no information was available on the websites.
Surveys and the interview explored the question, “How does your institution define
effectiveness?” Of the eight institutions responding to this question, three defined effectiveness
as meeting intended student learning outcomes. One institution defined effectiveness only in
terms of achieving a grade of C or above. Another institution’s leadership said they are not there
yet but would like to be in the future:
The previous dean of international education did not focus on general assessment or
learning outcomes. Recently, as interim Dean, I have been using the Global Learning
Rubric from AAC&U to define the learning outcomes of study abroad. When a new
Dean is hired defining the outcomes will be part of the new strategic plan.
Another institution’s representative said, “It is hard to define it. It is not about numbers.
Students have been reporting that study abroad program is life-changing.” The respondent felt
that when a student has a life-changing experience, the program is effective. Mr. Okuda at Soka
University of America said, “Empathizing with others. That’s true outcome” (personal
communication, March 13, 2015). The respondent felt that when the students gain empathy
towards other people after participating in the study abroad program, the program is effective.
Even though not explicitly stated, the type of assessment used in itself reveals what the
institution considers to be the desired goals and outcomes of its SAP. The surveys and the
interview asked, “What type of assessments or evaluations do you do?” Of the eight institutions
whose representatives answered that question, one expressed that, although the president is a
strong advocate for assessment, they do not do any type of program assessment at this time.
However, they do course evaluation using SUMMA. SUMMA refers to a survey of student
opinion of instruction provided by SUMMA Information Systems, Inc. They would like to have
program assessment in the future. Another responded that assessments are handled by each
academic department. Some institutions reported having a regular program evaluation as a part
of the accreditation process. Four institutions had self-designed satisfaction surveys immediately
after the program, and one out of the four institutions also had “self-designed learning
assessment for off-campus study approximately 3-4 months after program.” One institution’s
representative answered by saying, “currently a final report.” This answer was not clear; it is
possible that the question was too broad and easily misunderstood, or that the institution does not conduct assessments. Also,
open-ended short answer survey questions did not yield as high a response as did yes-no
questions. Survey questions with limited answer choices may elicit greater participation;
therefore, questions such as this could have been multiple choice questions.
Additional survey and interview questions were asked to ascertain the assessment
instruments being used and the types of data collected. The response rate for these questions was also very low.
Representatives of the same eight institutions answered these questions. The type of instruments
used did not necessarily measure SAP effectiveness. When asked what instruments were used to
assess study abroad programs, representatives from two institutions said they were not using any.
Two institutions used the Intercultural Development Inventory. One institution in particular used
a variety of instruments: “Our own post study abroad program survey, international student
orientation survey, international student senior survey, Global Perspective Inventory (GPI), study
abroad inferences from NESSE and CLA, forum incident reporting.” One institution did not use
any instrument for program assessment but used SUMMA for academic evaluation. Three institutions used self-
designed instruments. There is nothing wrong with using self-designed instruments; however,
there is no indication of whether the instruments’ validity and reliability were
evaluated.
Data tracked did not always result in measuring the effectiveness of the study abroad
program. Two questions were asked about the data being tracked. The first question asked,
“What kind of data do you currently track related to your study abroad programs?” Of the six
institutions whose representatives answered the question, five tracked participation
information, such as who participated in which programs and how many. One institution said it
is not really data driven and is not interested in numbers. A second related question was asked:
“What kind of data would you like to track or would be helpful for you to have in order to assess
effectiveness of study abroad programs?” A wide variety of answers was provided by six
institutions. One institution’s representative said, “Although we already track this data, as a
campus we are increasingly focused on accessibility of off-campus study for all sub-groups of
our student population.” Another institution’s representative said, “Without a dedicated
assessment person in the department, we’re maxed out with what we currently track. If we had
time, I’d like to do more with GPI, IDI, even portfolios.” This sentiment was echoed by a third
institution. Representatives of all six institutions were interested in obtaining more data.
The predominant theme in response to the research question was the lack of a clearly defined
notion of study abroad program effectiveness. It is difficult to determine the type of data needed or
the appropriate instruments to use when program effectiveness is not clearly articulated. The finding
is that, without a definition of effectiveness in line with the institution’s mission, it would be
difficult to select the right instruments to assess the effectiveness of the program.
Accreditation Standards
Research question 3: What are the undergraduate study abroad program’s standards as
published by the six regional accrediting agencies’ websites?
The aim of this research question was to explore whether SAPs were considered special
programs by regional accrediting agencies and, therefore, warranted a separate policy or standard
by the agencies. There are six regional accrediting agencies. Based on document analysis,
MSCHE is represented by eight institutions, NEASC-CIHE is represented by two institutions,
NCACS-HLC is represented by 12 institutions, SACS is represented by 11 institutions, NWCCU
is represented by two institutions, and WASC Senior College and University Commission is
represented by Soka University of America. Institutions are not represented evenly by region in
this study (Table 9).
Table 9
List of 36 Colleges and Universities and Corresponding Regional Accrediting Agencies
Accrediting Agency Total Colleges and Universities
Middle States Commission on Higher
Education (MSCHE)
8 American University
Arcadia University
Bryn Mawr College
Dickinson College
Goucher College
Hartwick College
Saint Mary’s College of Maryland
Susquehanna University
New England Association of Schools and
Colleges Commission on Institutions of
Higher Education (NEASC-CIHE)
2 Babson College
Boston College
North Central Association of Colleges and
Schools The Higher Learning Commission
(NCACS-HLC)
12 Calvin College
Carleton College
Colorado College
DePauw University
Earlham College
Goshen College
Hamline University
Kalamazoo College
Macalester College
Oberlin College
Saint Olaf College
Taylor University
Southern Association of Colleges and Schools
(SACS)
11 Austin College
Centenary College of Louisiana
Centre College
Davidson College
Elon University
Lee University
University of Dallas
University of Richmond
Washington and Lee University
Webber International University
Wofford College
Northwest Commission on Colleges and
Universities (NWCCU)
2 Lewis and Clark College
Whitworth University
WASC Senior College and University
Commission
1 Soka University of America
Not all accrediting agencies had a specific policy or standard on SAPs. Only MSCHE
and WASC Senior College and University Commission had a separate policy addressing SAPs
(Table 10). NEASC-CIHE and SACS did not have a separate policy on SAPs; however, they do
mention them in their standards. For NEASC-CIHE, study abroad is briefly mentioned under
standard 4.34. This standard addresses how academic credits are awarded and mentions
study abroad credits briefly: “The award of credit is based on policies developed and overseen by
the faculty and academic administration. There is demonstrable academic content for all
experiences for which credit is awarded, including study abroad, internships, independent study,
and service learning” (NEASC-CIHE, n.d.). For SACS, study abroad is addressed in the quality and
integrity of undergraduate degrees policy statement, which specifies how a course must be
evaluated if it appears on the institution’s transcript for the undergraduate degree. It states that courses,
whether taught by the institution, transferred in from a domestic or an
international institution, or taught elsewhere and transcripted as the institution’s
own (e.g., dual admissions, study-abroad, cross-registration, consortia) should be
evaluated to ensure that the courses meet (1) the requirements for the degree the
institution intends to award and (2) applicable accreditation standards. This
evaluation must be carried out by persons academically qualified to make the
necessary judgments (SACS, n.d.).
Both NEASC-CIHE and SACS address academic credits but do not mention any standards or
policies on co-curricular programs or program learning objectives. Both agencies appear to be
concerned chiefly with the appropriate awarding of academic credits.
Table 10
Regional Accrediting Agencies and Study Abroad Program Self-Evaluation Study Guidelines

Is there a separate policy addressing study abroad programs?
    MSCHE: Yes. NEASC-CIHE: No. NCACS-HLC: No. SACS: No. NWCCU: No. WASC: Yes.
Is study abroad mentioned? If so, where?
    MSCHE: Yes, under International Education. NEASC-CIHE: Yes.* NCACS-HLC: No.
    SACS: Yes.** NWCCU: No; study abroad was there in 2004 but not anymore.
    WASC: Yes, in the Study Abroad Policy.
Does the agency require a separate evaluation of SAPs?
    MSCHE: Yes. NEASC-CIHE: No. NCACS-HLC: No. SACS: No. NWCCU: No. WASC: Maybe.***
Does it require a periodic program review?
    MSCHE: Yes. NEASC-CIHE: No. NCACS-HLC: No. SACS: No. NWCCU: No. WASC: Maybe.***
Does it require assessment reports?
    MSCHE: Yes. NEASC-CIHE: No. NCACS-HLC: No. SACS: No. NWCCU: No. WASC: Maybe.***

* One line under Integrity in the Award of Academic Credit for recording credits.
** One line under the quality and integrity of undergraduate degrees for recording credits.
*** The study abroad policy first appeared in 2014.
For MSCHE, the policy governing International Programs Offered by Accredited Institutions
states that "All such activities must be addressed within the self-study and periodic review
report, and in certain cases locations abroad must be visited by the Commission's representatives
as part of decennial, substantive change, or other reviews" (MSCHE, n.d.). The policy also
recommends the institution share the following information with prospective and enrolled
students:
[T]he learning goals of the program; the relationship, if any, to a foreign
institution; grading practices and policies for assigning credit, especially if several
institutions are involved with a single overseas institution or consortium;
significant differences between the home campus experience and what can be
expected abroad; the extent of responsibility assumed by the program for housing
participants; what services will and will not be provided. (MSCHE, n.d.)
The commission also recommends that all international programs should "meet all accreditation
standards and meet standards for quality of instruction, academic rigor, educational
effectiveness, and student achievement comparable to those of other institutional offerings.
Resources such as student services should be appropriate to the culture and mores of the
international setting" (MSCHE, n.d.). This policy is applicable to "education abroad through
branch campuses or additional locations or by contractual relationships with other non-accredited
providers (including consortia of which it is part). Such offerings may include limited study
abroad activities as well as the provision of entire degree programs" (MSCHE, n.d.). This policy
does not apply to faculty-led SAPs. For the policy to apply, an institution must have branch
campuses or centers; therefore, it would not apply to institutions without international
locations, branch campuses, or consortia agreements.
The WASC Senior College and University Commission has a separate study abroad policy,
last modified on July 2, 2014. The policy clearly states that a SAP should
have a well-defined rationale that states the specific nature and purposes of the
program and is accurately represented in the institution's catalog and all
promotional literature; include provisions for ongoing institutional oversight,
including assessment of the educational effectiveness of programs; [p]rovide
opportunity, at the conclusion of the student's program or upon return, to process
and reflect on the experience in ways that may contribute to the student's and
others' learning. (WASC Senior College and University Commission, 2014)
The WASC policy requires that the purposes of the program be made public. It also requires
assessment of the educational effectiveness of programs but does not require that this
information be made public. Since the policy was updated in 2014, more research is
needed to understand its impact on WASC institutions. Currently, only two
accrediting agencies have separate policies on SAPs, and only ten institutions publish
their program objectives online (Table 11).
Table 11
Summary of Program Objectives and Regional Accrediting Agency's Study Abroad Program
Standards
(Each agency is marked for whether it has study abroad standards; each institution is marked
for whether it publishes program objectives.)

Middle States Commission on Higher Education (MSCHE): study abroad standards, Yes
    American University: No
    Arcadia University: No
    Bryn Mawr College: Yes
    Dickinson College: No
    Goucher College: No
    Hartwick College: No
    Saint Mary's College of Maryland: Yes
    Susquehanna University: Yes
New England Association of Schools and Colleges Commission on Institutions of Higher
Education (NEASC-CIHE): study abroad standards, No
    Babson College: Yes
    Boston College: No
North Central Association of Colleges and Schools The Higher Learning Commission
(NCACS-HLC): study abroad standards, No
    Calvin College: Yes
    Carleton College: No
    Colorado College: No
    DePauw University: No
    Earlham College: No
    Goshen College: Yes
    Hamline University: No
    Kalamazoo College: No
    Macalester College: No
    Oberlin College: No
    Saint Olaf College: No
    Taylor University: No
Southern Association of Colleges and Schools (SACS): study abroad standards, No
    Austin College: Yes
    Centenary College of Louisiana: Yes
    Centre College: No
    Davidson College: No
    Elon University: No
    Lee University: No
    University of Dallas: No
    University of Richmond: No
    Washington and Lee University: Yes
    Webber International University: No
    Wofford College: No
Northwest Commission on Colleges and Universities (NWCCU): study abroad standards, No
    Lewis and Clark College: No
    Whitworth University: No
WASC Senior College and University Commission: study abroad standards, Yes
    Soka University of America: Yes
Summary
Accrediting agencies require their members to clearly state and publish a mission
statement, whereas only one accrediting agency has a clear, separate policy on SAPs. Every
institution reviewed had a published mission statement; however, as the document analysis
revealed, not all institutions with high SAP participation rates defined program objectives and
published them online. This result was confirmed through surveys and an interview. Among the
institutions that published objectives online, the objectives aligned with the institution's
mission statement and either directly or indirectly addressed intercultural competence.
Although data pertaining to the research question on effectiveness yielded fewer findings than
the questions on objectives, a common theme was a lack of clearly defined criteria for program
effectiveness. The results suggest that, without clear guidance or requirements from the
accrediting agencies, institutions are less likely to define or publish program objectives.
CHAPTER FIVE: CONCLUSION AND RECOMMENDATIONS
SAPs in the United States have a long history dating back to the colonial period. The
first formal SAP did not start until 1880 (Bolen, 2001), and it was not until the 1920s that
students earned credits toward degree completion by participating in SAPs. SAPs were thought
to be for economic and intellectual elites. For over 50 years after World War II, SAPs served
only 90,000 students. This participation rate has changed in the last twenty years: during the
2011-2012 academic year, over 280,000 students participated in SAPs (Farrugia & Bhandari,
2013). This increase is also reflected in the number of US institutions offering some type of
SAP, whether offered by the institution or by SAP providers. Currently, there are more than
400 institutions, organizations, and other service providers of SAPs (ASHE, 2012). Today,
more students participate in a wider variety of SAPs.
Students participate in SAPs for many different reasons: some participate to
improve foreign language skills, some are interested in learning about other cultures,
some participate for pleasure, and some participate to improve future job
prospects. Institutions, for their part, offer SAPs because students need global literacy in the
twenty-first century, and SAPs are one way for students to increase intercultural competence
(Emert & Pearson, 2007; Mapp, 2012). However, participating in a SAP alone does not increase
intercultural competence (Pedersen, 2010).
The purpose of this study was to recommend ways to improve SAPs through assessment
and accreditation for greater quality and consistency of the programs. This study set out to
answer the following questions:
1. What are the objectives of selected colleges and universities’ undergraduate study
abroad programs?
a. To what extent do institutions clearly state their objectives?
b. Is there a common theme in the stated objectives?
2. What are the stated criteria for measuring the effectiveness of undergraduate study
abroad programs of selected colleges and universities and how clearly and readily
stated are the criteria?
3. What are the undergraduate study abroad program standards as published by the six
regional accrediting agencies’ websites?
4. What are the notable gaps in the assessment and accountability of study abroad
programs?
5. What recommendation would help to improve assessment and accountability of study
abroad programs?
Document analysis, a survey, and an interview were used to triangulate the findings.
Institutions with high SAP participation rates and the six regional accrediting agencies were
selected for this study.
Gaps in Assessment and Accountability
There were notable gaps in the assessment and accountability of SAPs. This study found
five: inconsistent availability of SAP information, ill-defined or undefined SAP
objectives, a lack of information on the effectiveness of SAPs, almost non-existent assessment
practices, and insufficient resources to assess SAPs.
First, the information institutions provided about their programs was very diverse.
Both the amount and the types of information available to the public diverged greatly from one
institution to another. Most institutions provided information regarding program costs, credit
hours earned, staff, and accommodations. All provided testimonials from past attendees and
pictures from different locations. Not all institutions had a centralized location for obtaining
SAP information; sometimes, SAP information was provided by the sponsoring department (for
example, the Italian department may house information for a SAP based in Italy). Some
institutions provided program objectives or student learning outcomes. None of the institutions
provided any information on their assessment practices.
Second, less than half of the institutions had defined program objectives, and only ten
institutions had program objectives available on their websites. This contrasted with
institutional mission statements, which are required by accrediting agencies; accordingly, all
institutions had clearly defined mission statements. This reveals that program objectives are not
as well defined as mission statements among institutions. Anderson et al. (2006) also stated that
"the intercultural goals of study abroad programs remain ill defined and unmeasured" (p. 458).
Currently, only one accrediting agency, WASC, has a policy that requires clearly defined
program objectives to be publicly available, and this policy only became effective in July 2014.
It would be interesting to see what changes occur as a result of this policy in SAPs at
institutions under WASC jurisdiction.
Third, much of SAPs' effectiveness remains a mystery. There was no readily accessible
information on the effectiveness of these programs. This was not surprising, since the
effectiveness of a program cannot be measured without clearly defined program objectives, and
less than half the institutions defined program objectives. Based on the survey, some programs
were assessed as a part of the accreditation process, and some were evaluated by their
departments. Most conducted student satisfaction surveys, but no program assessments were
completed. As stated by Wang et al. (2011), the challenge of SAPs was "ensuring that these
programs meet relevant content, rigor, and quality" (p. 19). Without proper assessment,
accountability becomes a challenge.
Fourth, only one accrediting agency had a policy specifically for SAPs, and the
relationship between accreditation policy and assessment practice was questionable. Since the
policy was only implemented in July 2014, it is difficult to conclude whether the policy will
have any impact on how institutions publish SAP information and assess programs. Also, only
a little over half of survey respondents said they would implement assessment if required by an
accrediting agency. This finding may be due to the fact that "accreditation is often viewed as
onerous or as a burdensome external requirement" (Head & Johnson, 2011, p. 37). More
research is needed to understand the relationship between policy and assessment practice.
Fifth, limited resources were cited as a challenge in implementing assessment practices.
"Study abroad programming requires substantial institutional resources and infrastructure"
(Emert & Pearson, 2007, p. 70), and Hadis (2005) believes that "program evaluation is an
afterthought to an ongoing program undertaken by extremely busy program administrators" (p.
5). Resources were not the focus of this study, but institutions' representatives commented
that, without additional resources, it would be difficult to conduct more assessment.
Recommendations for Practice
Accreditation provides assurance to the public. When SAPs are administered by an
accredited institution, an assumption is made that the SAPs were reviewed by peers as a part of
the accreditation review process. However, the problem with this assumption is that not all
accrediting bodies have a policy or standard regarding SAPs; often, SAPs were not reviewed as
a part of the accreditation review process. This study recommends improving assessment and
accountability by requiring SAPs to be a part of the accreditation review process. To ensure
that SAPs become a part of the accreditation review process, changes must be made within
regional accrediting agencies, institutional culture, and the ways in which students are
assessed. This recommendation needs to be implemented at three different levels: regional,
institutional, and student (Figure 4).
Figure 4. Three levels addressed by the recommendation: policy at the regional accrediting agency, institutional program review, and student learning outcomes.
Regional: Adaptation of Study Abroad Policy
First, regional accrediting agencies need to establish a separate policy specifically
addressing SAPs, similar to WASC's study abroad policy (Appendix C). MSCHE has SAP
standards, but, out of its eight institutions, only three published program objectives (Table 11).
WASC has SAP standards, but only one institution is listed under WASC; even though that
institution published program objectives, the interview revealed that they were not posted to
comply with the WASC study abroad policy. NCACS-HLC and SACS do not have SAP
standards, and two out of 12 institutions and three out of 11 institutions, respectively,
published program objectives. This is not surprising, since only WASC recently required the
publishing of program objectives for SAPs.
An assumption was made that, if accrediting agencies had a separate policy on SAPs,
then institutions would be more likely to comply. When asked in the survey and interview
whether they would be more likely to assess SAPs if their regional accrediting agency had a
guideline or policy on SAPs, ten respondents answered yes. Alex Okuda from Soka University
of America said that, even though he is assessing the program now, he would assess even more
and publish results online if WASC had a policy that required the institution to do so (personal
communication, March 13, 2015).
A study abroad policy or standards would need to include requirements to clearly define
program objectives and to perform assessments regularly. WASC's study abroad policy
addresses both. One criterion requires SAPs to "have a well-defined rationale that states the
specific nature and purposes of the program and is accurately represented in the institution's
catalog and all promotional literature" (WASC Senior College and University Commission, July
2, 2014). The other criterion requires SAPs to "include provisions for ongoing institutional
oversight, including assessment of the educational effectiveness of programs" (WASC Senior
College and University Commission, July 2, 2014). Policies and standards of accrediting
agencies are not prescriptive; they are broad enough to allow different institutions to define
their own effectiveness and student learning outcomes and to select appropriate instruments to
assess effectiveness.
Many institutions already evaluate their programs, but focused assessment is needed.
This focus begins with having clearly defined program outcomes to gauge the effectiveness of
the SAP. It is not enough to say the program was successful or that students had life-changing
experiences. With finite resources available in higher education, there is a need to discern
effective programs from ineffective ones in order to allocate resources well. Having a
successful SAP is not about having a high participation rate or a high number of programs
available; it is about making available programs that are effective. This may mean not relying
on assessments conducted by others, such as program providers. Effectiveness is defined by
each institution, and what is considered effective at one institution may not be effective at
another. Promoting a culture of assessment would be a first step in ensuring the effectiveness
of SAPs. Accrediting agencies with a policy on SAPs would provide guidance in program
assessment.
Institutional: Program Review
Institutions need to make a commitment to review SAPs regularly. One suggestion is to
utilize the services provided by the Forum on Education Abroad Quality Improvement Program
(QUIP). The purpose of QUIP is "to help an organization improve the quality of its offerings
and to think strategically about its future involvement in education abroad" (The Forum on
Education Abroad, 2007, p. 8). The Forum offers three types of review: comprehensive program
review, review of evaluation processes, and guided strategic planning. Comprehensive program
review is for institutions or third-party providers with their own programs. Review of evaluation
processes is for "providers and institutions with elaborate education abroad evaluation systems as
well as institutions that do not offer their own programs and seek to evaluate their mechanisms
for sending students on programs sponsored by providers and other institutions" (The Forum on
Education Abroad, 2007, p. 9). Guided strategic planning is for "institutions and organizations
that are either just beginning the process of developing Education Abroad Programs, or those
that want assistance in moving in a new direction" (The Forum on Education Abroad, 2007, p.
9). Depending on the type of SAPs offered, an institution can implement a review similar to one
of QUIP's review types.
Another suggestion is to modify current academic or co-curricular program review
processes established for accreditation to include SAPs, since the institution is already familiar
with the process. By adapting the accreditation program review, future accreditation
self-evaluation studies will benefit greatly. Each regional accrediting agency has program
review resources. For example, WASC offers the WASC Resource Guide for 'Good Practices' in
Academic Program Review. This resource guide is not an instructional manual; rather, "it
describes some of the key concepts and good practices implicit in an outcomes-based program
review process in an effort to assist institutions with understanding the new WASC expectations"
(WASC Senior College and University Commission, 2009, p. 1). The self-evaluation study
should address program objectives as well as the definition of effectiveness. The program's
overall goals should align with the institution's mission statement, but the program should have
its own goals as well. Since SAPs are more than just academic, their effectiveness needs to be
measured as more than academic accomplishment. A program is not effective just because
students reach certain grade thresholds. For example, if students are supposed to become more
globally minded as a result of the SAP, then the program cannot be considered effective just
because all students achieved a grade of C or higher in their academic courses. To truly know
whether students become more globally minded as a result of participating in SAPs, assessment
of students would need to show an increase in global mindedness. This assessment requires
careful planning before, during, and after the student's participation.
Student: Assessment Before, During, and After
Third, students' learning outcomes need to be assessed before, during, and after
participation in a SAP. This assessment requires careful planning. Most institutions have
student learning outcomes for courses taken on campus, but not all institutions have student
learning outcomes for their SAPs. As a part of program review, student learning outcomes need
to be defined; these should align with program objectives and be measurable. It is also
important to choose appropriate instrumentation to assess student learning outcomes. For SAPs,
an increase in intercultural competence would be a suitable student learning outcome, as there
are instruments available to measure different aspects of intercultural competence.
With student learning outcomes defined and instruments selected, implementing
assessment becomes a challenge for most institutions, especially with limited resources
available. One suggestion is to create a SAP course that is required for SAP participation. This
course can be divided into three small courses so that students take them before, during, and
after participation in the SAP itself. With technology, this course can be delivered online and
does not require a set meeting time. The pre-program course can cover orientation material and
establish the baseline for the instruments chosen to measure student learning outcomes. The
course during the program could center on journaling, which would provide rich qualitative
assessment data. After the program, the course can cover debriefing as well as the post-test; it
can also help students with re-entry. It is not necessary to implement such a course to achieve
assessment goals; however, assessment of student learning outcomes cannot be an afterthought.
It needs to be an integral part of SAPs.
To summarize, this study recommends adopting a study abroad policy at regional
accrediting agencies, conducting program reviews at each institution, and implementing
assessment plans for students. By implementing all or part of this recommendation, the
assessment and accountability of study abroad programs would improve.
Recommendations for Research
This study reviewed a small number of participants. This group of institutions, made up
mostly of small, private liberal arts institutions, shared many similar characteristics. Future
research would benefit from a pool of participants more diverse in size and type, which may be
accomplished by changing the participant selection criteria. One option would be to select one
regional accrediting agency, such as WASC, and study its member institutions; this would allow
the study to focus on one policy and a diverse group of institutions. Another would be to select
institutions based on participation number instead of participation rate. For example, Goucher
College has a participation rate of 125% but only 340 participants, in contrast to American
University's 1,175 participants (Table 3). Selection based on participation number may provide
a more diverse group of institutions than selection based on participation rate. Another
consideration for future research is to interview someone from an accrediting agency such as
WASC. This person may provide insight on SAPs from the accrediting agency's point of view.
Conclusion
SAPs are complex and diverse. From the number of programs offered at each institution
to the program objectives, SAPs vary greatly from one institution to another. This richness in
diversity is preferable to the limited choices students used to have; however, more programs do
not always translate into the availability of more effective programs. How can a SAP policy
help the public distinguish effective programs? There are three benefits to a SAP policy. First,
students will be better informed when choosing a program, since a SAP policy would require
program assessment and this assessment information would be made available to the public.
Second, assessment would be used by institutions to improve SAPs. Third, institutions would
be more likely to provide the necessary resources to assess SAPs, since the SAP policy would
require assessment. The aim of these recommendations is to help institutions improve the
effectiveness of SAPs and to help the public become better informed of that effectiveness.
References
American Council on Education (n.d.). Accreditation and Standards. Retrieved from
http://www.acenet.edu/advocacy-news/Pages/Accreditation-and-Standards.aspx.
Association for the Study of Higher Education (2012). The Association for the Study of Higher
Education (ASHE) Higher Education Report, 38(4).
Altshuler, L., Sussman, N. M., & Kachur, E. (2003). Assessing changes in intercultural
sensitivity among physician trainees using the intercultural development inventory.
International Journal of Intercultural Relations, 27, 387-401.
Amuzie, G. L., & Winke, P. (2009). Change in language learning beliefs as a result of study
abroad. System, 37, 366-379.
Anderson, P. H., Lawton, L., Rexeisen, R. J., & Hubbard, A. C. (2006). Short-term study abroad
and intercultural sensitivity: a pilot study. International Journal of Intercultural
Relations, 30, 457-469.
Ballestas, H. C. & Roller, M. C. (2013). The effectiveness of a study abroad program for
increasing students’ cultural competence. Journal of Nursing Education and Practice,
3(6), 125-133.
Behrnd, V., & Porzelt, S. (2012). Intercultural competence and training outcomes of students
with experiences abroad. International Journal of Intercultural Relations, 36, 213-223.
Blanco-Ramirez, G. & Berger, J. (2014). Rankings, accreditation, and the international quest for
quality. Quality Assurance in Education, 22(1), 88-104.
Bolen, M. (2001). Consumerism and US study abroad. Journal of Studies in International
Education, 5, 182-200.
Brewer, E. & Brockington, J. (2013). International education self-studies and external reviews.
AIEA Issue Brief.
Clarke, I., Flaherty, T. B., Wright, N. D., & McMillen, R. M. (2009). Student intercultural
proficiency from study abroad program. Journal of Marketing Education, 31(2), 173-181.
Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods
approaches (3rd ed.). Thousand Oaks, CA: Sage Publications.
Deardorff, D. K. (2006). Identification and assessment of intercultural competence as a student
outcome of internationalization. Journal of Studies in International Education, 10, 241-
266.
Dolby, N. (2004). Encountering an American self: study abroad and national identity.
Comparative Education Review, 48(2), 150-173.
Eaton, J. S. (2010). Accreditation and the federal future of higher education. Academe, 96(5),
21-24.
Emert, H. A., & Pearson, D. L. (2007). Expanding the vision of international education:
collaboration, assessment, and intercultural development. New Directions for Community
Colleges, 138, 67-75.
Engle, L., & Engle, J. (2004). Assessing language acquisition and intercultural sensitivity
development in relation to study abroad program design. Frontiers: the Interdisciplinary
Journal of Study Abroad, X, 219-236.
Engle, L., & Martin, P. (2010). Alignment and Accountability in Education Abroad: A
Handbook for the Effective Design and Implementation of Qualitative Assessment Based
on Student Evaluations. The Forum on Education Abroad. Retrieved from
http://www.regis.edu/~/media/Files/RC/Academic%20Programs/Travel-
Learning/ForumEA%20AlignmentandAccountabilityEdAbroad_Assessment%20Guideli
nes.ashx
Farrugia, C.A. & Bhandari, R. (2013). Open Doors 2013 Report on International Educational
Exchange. New York: Institute of International Education.
The Forum on Education Abroad (n.d.). About Us. Retrieved from
http://www.forumea.org/about-us/who-we-are.
The Forum on Education Abroad (2007). Guidebook for The Forum Quality Improvement
Program (QUIP) for Comprehensive Review Participants and Peer Reviewers.
The Forum on Education Abroad (2011). Education Abroad Glossary. Retrieved from
http://www.forumea.org/wp-content/uploads/2014/10/Forum-2011-Glossary-v2.pdf.
The Forum on Education Abroad (2012). The forum state of the field survey report 2011.
http://www.forumea.org/wp-content/uploads/2014/10/ForumEA-StateofFieldSurvey-
2012.pdf.
The Forum on Education Abroad (2014). The forum state of the field survey report 2013.
The Forum on Education Abroad (2015). The Forum on Education Abroad Quality
Improvement Program (QUIP). Retrieved from http://www.forumea.org/wp-
content/uploads/2015/03/QUIP-Instructions-Web-Version-2015.pdf
Fountain, D. (November 29, 2004). Notice Pursuant to the National Cooperative Research and
Production Act of 1993-Forum on Education Abroad, Inc. Retrieved from
https://www.federalregister.gov/articles/2004/11/29/04-26221/notice-pursuant-to-the-
national-cooperative-research-and-production-act-of-1993-forum-on-education.
Goel, L., de Jong, P., & Schnusenberg, O. (2010). Toward a comprehensive framework of
study abroad intentions and behaviors. Journal of Teaching in International Business,
21(4), 248-265.
Hadis, B. F. (2005). Gauging the impact of study abroad: how to overcome the limitations of a
single-cell design. Assessment & Evaluation in Higher Education, 30(1), 3-19.
Hallows, K., Wolf, P. P., & Marks, M. A. (2011). Short-term study abroad: a transformational
approach to global business education. Journal of International Education in Business,
4(2), 88-111.
Hammer, M. R. (2011). Additional cross-cultural validity testing of the intercultural
development inventory. International Journal of Intercultural Relations, 35, 474-487.
Head, R. B. & Johnson, M. S. (2011). Accreditation and its influence on institutional
effectiveness. New Directions for Community Colleges, 153, 37-52.
Hoffa, W. (2007). A history of US study abroad: beginnings to 1965. A Special Publication of
Frontiers: The Interdisciplinary Journal of Study Abroad and The Forum on Education
Abroad.
Institute of International Education. (2013). Profile of U.S. study abroad students, 2001/02-
2011/12. Open Doors Report on International Educational Exchange. Retrieved from
http://www.iie.org/opendoors.
Kitsantas, A. (2004). Studying abroad: the role of college students' goals on the development of
cross-cultural skills and global understanding. College Student Journal, 38(3), 441-452.
Macalester College. (n.d.). Study Abroad. Retrieved from
http://www.macalester.edu/about/whymacalester/studyabroad/.
Martin, J. (1987). The relationship between student sojourner perceptions of intercultural
competencies and previous sojourn experience. International Journal of Intercultural
Relations, 11, 337-355.
McLeod, M., & Wainwright, P. (2009). Researching the study abroad experience. Journal of
Studies in International Education, 13(1), 66-71.
Middle States Commission on Higher Education. (n.d.). International Programs Offered by
Accredited Institutions. Retrieved from
http://www.msche.org/?Nav1=Policies&Nav2=INDEX
Mills, L. H., Deviney, D., & Ball, B. (2010). Short-term study abroad programs: a diversity of
options. The Journal of Human Resource and Adult Learning, 6(2), 1-13.
Moskal, P., Ellis, T., & Keon, T. (2008). Summary of assessment in higher education and the
management of student-learning data. Academy of Management Learning and
Education, 7(2), 269-278.
New England Association of Schools and Colleges Commission on Institutions of Higher
Education (n.d.). Standards and Policies. Retrieved from
https://cihe.neasc.org/sites/cihe.neasc.org/files/downloads/Standards/Standards_for_Accr
editation.pdf
Patton, M. Q. (2002). Qualitative Research & Evaluation Methods (3rd ed.). Thousand Oaks,
CA: Sage Publications.
Pedersen, P. J. (2010). Assessing intercultural effectiveness outcomes in a year-long study
abroad program. International Journal of Intercultural Relations, 34, 70-80.
Pitts, M. J. (2009). Identity and the role of expectations, stress, and talk in short-term student
sojourner adjustment: an application of the integrative theory of communication and
cross-cultural adaptation. International Journal of Intercultural Relations, 33, 450-462.
Ryan, M. E., & Twibell, R. S. (2000). Concerns, values, stress, coping, health and educational
outcomes of college students who studied abroad. International Journal of Intercultural
Relations, 24, 409-435.
Sachau, D., Brasher, N., & Fee, S. (2010). Three models for short-term study abroad. Journal of
Management Education, 34(5), 645-670.
Saint Mary’s College of Maryland. (n.d.). Office of International Education. Retrieved from
http://www.smcm.edu/ie/.
Salisbury, M. H., An, B. P., & Pascarella, E. T. (2013). The effect of study abroad on
intercultural competence among undergraduate college students. Journal of Student
Affairs Research and Practice, 50(1), 1-20.
Shaftel, J., Shaftel, T., & Ahluwalia, R. (2007). International educational experience and
intercultural competence. International Journal of Business and Economics, 6(1), 25-34.
Shavelson, R. J., & Huang, L. (2003). Responding responsibly. Change, 35(1), 10-20.
Soka University of America. (n.d.). Frequently Asked Questions. Retrieved from
http://www.soka.edu/academics/studyabroad/frequently-asked-questions.aspx.
Southern Association of Colleges and Schools (n.d.). Policies and Publications. Retrieved from
http://www.sacscoc.org/pdf/081705/Quality%20and%20Integrity%20of%20Undergradua
te%20Degrees.pdf
Stier, J. (2003). Internationalism, ethnic diversity and the acquisition of intercultural
competence. Intercultural Education, 14(1), 77-91.
Taguchi, N. (2008). Cognition, language contact, and the development of pragmatic
comprehension in a study-abroad context. Language Learning, 58(1), 33-71.
Twombly, S., Salisbury, M., Tumanut, S., & Klute, P. (2012). Study Abroad in a New Global
Century: Renewing the promise, refining the purpose, ASHE higher education report.
Hoboken, NJ: John Wiley & Sons.
U.S. Department of Education (n.d.). Accreditation in the United States. Retrieved from
http://www2.ed.gov/admins/finaid/accred/accreditation_pg2.html
Vande Berg, M. (n.d.). A Research-Based Approach to Education Abroad Classification.
Retrieved from http://www.forumea.org/wp-
content/uploads/2014/10/researchapproach.pdf
Vande Berg, M. (2009). Intervening in student learning abroad: a research-based inquiry.
Intercultural Education, 20(S1-2), S15-27.
WASC Senior College and University Commission. (September 2009). Program Review
Resource Guide. Retrieved from http://www.wascsenior.org/document-list
WASC Senior College and University Commission. (July 2, 2014). Study Abroad Policy.
Retrieved from http://www.wascsenior.org/document-list
Wang, J., Peyvandi, A., & Moghaddam, J. (2011). An empirical investigation of the factors
influencing the effectiveness of short study abroad programs. International Journal of
Education Research, 6(2), 10-22.
Wehlburg, C. (2013). “Just right” outcome assessment: a fable for higher education. Assessment
Update, 25(2), 1-2, 15-16.
Whalen, B. (2008). The management and funding of US study abroad. International Higher
Education, 50, 15-16.
Williams, T. (2009). The reflective model of intercultural competency: a multidimensional,
qualitative approach to study abroad assessment. Frontiers: The Interdisciplinary
Journal of Study Abroad, 18, 289-306.
Zhang, Y. (2011). CSCC review series essay: education abroad in the U.S. community
colleges. Community College Review, 39(2), 181-200.
Appendix A
Survey Questions for Study Abroad Program Administrator
Survey Questions
1 Do you have student learning outcomes or program objectives?
1A Do you publish this information online?
1B Is intercultural competence one of your student learning outcomes or program
objectives?
2 Does your regional accrediting agency have a guideline/policy on study abroad
programs?
2A How are you addressing your regional accrediting agency’s policy guideline?
2B How do you feel about assessment being required by the accrediting agency?
3 If your regional accrediting agency has a guideline/policy on study abroad programs,
would you be more likely to assess study abroad programs?
4 How does your institution define effectiveness?
5 What type of assessments/evaluations do you do?
6 What instruments are used to assess your study abroad programs?
7 What kind of data do you currently track related to your study abroad programs?
8 What kind of data would you like to track, or would be helpful for you to have, in
order to assess the effectiveness of study abroad programs?
Appendix B
Interview Questions
Section A: Background
How long have you been in your present position?
How long have you been in this field?
How long have you been at this institution?
What is your highest degree?
What is your field of study?
Section B: Study Abroad Programs
How many approved Study Abroad Programs do you have?
Do you allow students from other institutions to participate in your program?
What distinguishes your study abroad program from others? OR What makes your
SAP distinct from others?
What student learning outcomes or program objectives do you have?
If you have them, where are they available? Do you publish them online?
Section C: Assessment
What type of assessments/evaluations do you do (please list all)?
How does your institution define effectiveness?
How do you assess or evaluate your study abroad program?
What kind of data do you currently track related to your SAP?
What kind of data would you like to track, or would be helpful for you to have, in
order to assess the effectiveness of SAPs?
What instruments are used to assess your study abroad program?
Are your assessment/evaluation results available to the public? If yes, online?
How do you feel about the current assessment practice of your study abroad
program?
Section D: Accreditation
Has there been a study abroad program review as a part of accreditation review/self-
study?
If yes, do you publish the review?
Do you know about WASC’s new study abroad policy, and what have you done, or are
you doing, to meet the requirement?
How would you feel about assessment being required by WASC?
Appendix C
Study Abroad Policy
Study abroad can be an important phase of undergraduate and graduate programs in American
colleges and universities. Carefully planned and administered foreign study can add significant
dimensions to a student’s educational experience. A study abroad program should:
a. Be clearly related to the objectives of the sponsoring or participating institution;
b. Have a well defined rationale that states the specific nature and purposes of the program
and is accurately represented in the institution’s catalog and all promotional literature;
c. Provide educational experiences related to the institution’s curriculum;
d. Be available to students who are carefully selected according to ability and interest;
e. Have a carefully articulated policy regarding the availability of financial assistance to
students for programs required by the institution;
f. Have clearly specified language proficiency requirements, when appropriate to the
program and place of study, and clearly defined methods of testing language proficiency
prior to acceptance into the program;
g. Provide intended participants with accurate and current information, specifically
describing the following:
h. program opportunities and limitations; how and where instruction will be given and the
relationship to the foreign institution; grading practices; significant differences between a
home campus experience and what can be expected abroad, including information about
local attitudes and mores; and a
i. description of local living conditions and the extent of responsibility assumed by the
program for housing participants;
j. Provide extensive orientation for participants prior to departure for, and on arrival in, the
foreign country with respect to the matters in (g) above, and augmented with more
detailed information and instruction related to the specific program;
k. Provide counseling and supervisory services at the foreign center, with special attention
to problems peculiar to the location and nature of the program;
l. Guarantee adequate basic reference materials to offset any limitations of local libraries or
inaccessibility to them;
m. Include clearly defined criteria and policies for judging performance and assigning credit,
in accordance with prevailing standards and practices at the home institution. A common
basis for determining grade equivalents is established when several institutions are
involved with a single overseas institution or in a consortium;
n. Stipulate that students will not ordinarily receive credit for foreign study that is
undertaken without prior planning or approval on the student’s home campus;
o. Include provisions for ongoing institutional oversight, including assessment of the
educational effectiveness of programs;
p. Assure fair reimbursement to participants if the program is not delivered as promised, for
any reason within the sponsor’s control; and
q. Provide opportunity, at the conclusion of the student’s program or upon return, to process
and reflect on the experience in ways that may contribute to the student’s and others’
learning.
Cooperative arrangements are urged among American institutions seeking to provide foreign
study opportunities for their students. In many cases, resident directors, faculty, and facilities
could be shared with significant improvement in the efficiency and economy of the operation.
One basic reference collection, for example, supported and used by students from several
programs, is likely to be more satisfactory than several separate ones.
Credit is not awarded for travel alone. Commercially sponsored “study/travel programs” should
be thoroughly investigated by an institution before it grants degree credit for these activities.
Travel/study courses sponsored by the institution must meet the same academic standards, award
similar credit, and be subject to the same institutional control as other courses and programs
offered by that institution.
Credit for travel/study courses is limited to a maximum of one semester unit of credit per week
of full-time travel/study (or the equivalent in quarter system units), with one additional unit of
credit for additional readings, papers and class meetings that are required before or after the
course.
Abstract
In an increasingly global society, students are expected to achieve intercultural competence, and studying abroad is one way to do so. Study abroad programs have a long history in the United States, but until recently participation was limited to economic and intellectual elites. In less than two decades, study abroad participation grew by 400% to more than 280,000 students. Today, programs are provided by over 400 institutions, organizations, and program providers. This study examined selected colleges and universities with high study abroad participation rates to explore the current state of study abroad program objectives and assessments. Data were collected in two phases. The first phase consisted of document analysis, which provided characteristic information about the study abroad programs and helped to formulate questions for the survey and interviews. The second phase consisted of the survey and interviews. A rich assortment of information about study abroad programs was available to the public. However, based on this study, there are five gaps in the assessment and accountability of study abroad programs. First, study abroad programs provided diverse information about their offerings, and its availability was not consistent across institutions. Second, limited information was available about program objectives. Third, program effectiveness was not clearly defined. Fourth, assessments did not happen consistently. Fifth, study abroad program offices did not have enough resources to conduct assessment. This study recommends that all regional accrediting agencies implement a study abroad policy to improve the assessment and accountability of study abroad programs. Such a policy should require programs to clearly define their objectives, undergo periodic review, and make that information available to students.
Asset Metadata
Creator: Kim, Lauren Michelle (author)
Core Title: Assessment and accreditation of undergraduate study abroad programs
School: Rossier School of Education
Degree: Doctor of Education
Degree Program: Education
Publication Date: 06/23/2015
Defense Date: 05/26/2015
Publisher: University of Southern California (original), University of Southern California. Libraries (digital)
Tag: accreditation, assessment, intercultural competence, OAI-PMH Harvest, study abroad
Format: application/pdf (imt)
Language: English
Contributor: Electronically uploaded by the author (provenance)
Advisor: Chung, Ruth (committee chair), Schafrik, Janice (committee member), Tobey, Patricia Elaine (committee member)
Creator Email: kimlaure@usc.edu, lmknewyork@gmail.com
Permanent Link (DOI): https://doi.org/10.25549/usctheses-c3-576434
Unique identifier: UC11300459
Identifier: etd-KimLaurenM-3504.pdf (filename), usctheses-c3-576434 (legacy record id)
Legacy Identifier: etd-KimLaurenM-3504.pdf
Dmrecord: 576434
Document Type: Dissertation
Rights: Kim, Lauren Michelle
Type: texts
Source: University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection)
Access Conditions: The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the a...
Repository Name: University of Southern California Digital Library
Repository Location: USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA