CIVIC LEARNING PROGRAM POLICY COMPLIANCE BY A STATE DEPARTMENT OF
HIGHER EDUCATION: AN EVALUATION STUDY
by
Amy L. Carmack
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
May 2019
Copyright 2019 Amy L. Carmack
Acknowledgements
Success is never complete without the support of others. I have been incredibly fortunate
to be surrounded by supportive friends, family, and colleagues. First, I would like to thank my
dissertation committee: to Dr. Monique Datta for her tireless feedback that made me a better
writer, to Dr. Leanne Dunsmore for her compelling questions, and to Dr. Kimberly Ferrario for
encouraging me from the very first class. I appreciate all the time and effort you dedicated in
helping me become a better scholar, writer, and change agent.
To my USC colleagues: being in a cohort with other fearless change agents may be one
of the best things that ever happened to me. I am so thankful for the constant support that we
gave each other and cannot wait to see what you accomplish.
To James McGhee and Carly Cooper: above all, I know that I was meant to join this
program so I could meet you. I never would have thought I would meet two people who would
become some of my dearest friends, my USC family. Your humor, compassion, and connections
make me a better person.
To my parents, Ken and Deborah Carmack: your constant support throughout this process
has kept me focused and grounded. Growing up, you always encouraged me to do what my heart
said, whether it was moving across the country or changing careers. I will be forever grateful.
Finally, to my sister, Heather, the original Dr. Carmack: your support throughout this
process meant more to me than anything else. You encouraged me, applauded me, and helped
challenge me throughout this program. You helped me find who I was supposed to be.
I am encouraged and ready to become a leader in this changing world, led by the words
of the late Senator Edward M. Kennedy: “The work goes on, the cause endures, the hope still
lives, and the dreams shall never die.”
Table of Contents
Acknowledgements
List of Tables
List of Figures
Abstract
Introduction to the Problem of Practice
Organizational Context and Mission
Importance of Addressing the Problem
Organizational Performance Goal
Stakeholder Group of Focus and Stakeholder Goal
Purpose of the Project and Questions
Methodological Approach
Review of the Literature
Components of Civic Learning
Assessment Models and Tools of Civic Learning
Challenges of Differing Assessments
Policy Challenges for Civic Learning in Higher Education
Knowledge, Motivation, and Organizational Influences
Knowledge Influences
Motivational Influences
Organizational Influences
Conceptual Framework: The Interaction of Stakeholders’ Knowledge, Motivation, and the Organizational Context
Participating Stakeholders: Sampling and Recruitment
Interview Sampling Criteria and Rationale
Interview Sampling (Recruitment) Strategy and Rationale
Document Analysis Sampling Criteria and Rationale
Document Analysis Review Strategy and Rationale
Data Collection and Instrumentation
Interviews
Documents and Artifacts
Data Analysis
Results and Findings
Theme 1: Compliance is Complicated and Incomplete
Theme 2: Interest is the Key to Motivation, Not Goal Setting
Theme 3: Institutional Mission was Considered, but Aligning Public Policy was Not
Theme 4: Diverting and De-investing in Resources Will Not Sustain a State Policy
Theme 5: Civic Learning Must Evolve and be Prioritized to Succeed
Summary
Recommendations for Practice to Address KMO Influences
Knowledge Recommendations
Organization Recommendations
Integrated Implementation and Evaluation Plan
Implementation and Evaluation Framework
Organizational Purpose, Need, and Expectations
Level 4: Results and Leading Indicators
Level 3: Behavior
Level 2: Learning
Level 1: Reaction
Evaluation Tools
Data Analysis and Reporting
Summary
Limitations and Delimitations
Recommendations for Future Research
Conclusion
References
Appendices
Appendix A: Interview Protocols
Appendix B: Document Analysis Protocol
Appendix C: Credibility and Trustworthiness
Appendix D: Ethics
Appendix E: Post-Meeting Evaluation
Appendix F: Post-Follow-up Meeting Evaluation
Appendix G: Dashboard Example
List of Tables
Table 1. Organizational Mission, Global Goals, and Stakeholder Performance Goals
Table 2. Knowledge Influences, Types, and Assessments for Gap Analysis
Table 3. Motivational Influences and Assessments for Gap Analysis
Table 4. Organizational Influences and Assessments for Gap Analysis
Table 5. Summary of Knowledge Influences and Recommendations
Table 6. Summary of Organizational Influences and Recommendations
Table 7. Outcomes, Metrics, and Methods for External and Internal Outcomes
Table 8. Critical Behaviors, Metrics, Methods, and Timing for Policymakers
Table 9. Required Drivers to Support Policymakers’ Critical Behaviors
Table 10. Components of Learning for the Program
Table 11. Components to Measure Reactions to the Program
List of Figures
Figure 1. An integrated conceptual framework illustrating the interactions of knowledge, motivation, and organizational influences
Abstract
Since 2010, numerous reports have shown an increasing lack of civic knowledge among college
students. To mitigate this growing national problem, the Commonwealth of
Massachusetts passed a state-wide Policy on Civic Learning that requires all undergraduate
students graduating from a public 2- or 4-year institution to be civic-minded upon commencement.
The purpose of this study was to evaluate the degree to which the Massachusetts Department of
Higher Education was meeting its civic learning policy compliance goal. Using the Clark and
Estes (2008) gap analysis as a framework to identify the knowledge, motivation, and
organizational influences that impact successful completion of the identified goal, this study
interviewed Policymakers associated with creating the state public policy. Five themes emerged
from interviews and document analysis, focusing on one identified knowledge influence gap and
three identified organizational influence gaps. This study also created a recommended
implementation and evaluation plan for a learning program that addresses the identified gaps and
that can be used by other states seeking to create a similar state policy. Following Kirkpatrick
and Kirkpatrick’s (2016) New World Kirkpatrick Model, four levels of evaluation were completed.
Evaluation tools were created to effectively assess successful policy implementation.
Keywords: civic learning, civic programs, civic policies, higher education
Introduction to the Problem of Practice
The most recent election cycle illustrated one of the most shocking secrets of American
society: the lack of civic learning by its citizens (Cole, 2016). The Annenberg Public Policy
Center (2017) reports that many Americans are misinformed or poorly informed about basic
civic information. College students and their civic learning are included in this commentary. The
National Task Force on Civic Learning and Democratic Engagement (2012) defines civic
learning as both civic knowledge and civic engagement, and each aspect contains multiple
elements. Civic learning centers on the idea that one cannot engage in one’s community without
a thorough knowledge of the situation, context, stakeholders, and impacts of public service
(National Task Force, 2012). A declining focus on civic education is not a new problem in the
United States; many educators, scholars, and policymakers have stressed concerns over the lack
of civic knowledge demonstrated by citizens, particularly adolescents and young adults, for years
(ACTA, 2016; Coley & Sum, 2012; Galston, 2007; Hatcher, 2011; National Task Force, 2012).
However, a recent report from the American Council on Trustees and Alumni (2016)
argues that the most recent decline, starting in 2010, depicts a looming crisis for the United
States. Numerous reports show that college students and recent college graduates are
increasingly ignorant about American history and civics (ACTA, 2016; Forestal, 2016; National
Task Force, 2012). This decline has led to decreased voting activity (ACTA, 2016; Galston,
2004), growing distrust of public figures (Coley & Sum, 2012), a lack of public service
participation after graduation (ACTA, 2016), and a general civic apathy (Davis & Mello, 2012).
Coley and Sum (2012) note that the lack of a civically engaged population can also lead to a
decrease in the economic and psychological well-being of a society. The role of higher education
in civic education has changed substantially over the years; however, most postsecondary
institutions purport that they are focused on producing civic-minded graduates. Bolstered by the
2012 watershed report, A Crucible Moment, the Massachusetts Board of Higher Education adopted
the nation’s only policy on civic learning for all public higher education graduates in an effort to
increase civic learning, responsibility, and practice.
Organizational Context and Mission
Originally created in 1837, the Massachusetts Board of Education is the governing body
for the state’s Department of Elementary and Secondary Education and the Department of
Higher Education (SNAC, n.d.). Reorganized in 1919, the Massachusetts Board of Higher
Education (MBHE) is staffed by the Massachusetts Department of Higher Education (MDHE). The
MBHE is “responsible for defining the mission of and coordinating the Commonwealth’s system
of public higher education and its institutions” (About, 2017, para. 1). The Department of Higher
Education works with the Massachusetts Public Higher Education System, which includes 15
community colleges, nine state universities, and five University of Massachusetts campuses
(Brennan, 2017). The state public system serves over 300,000 students and 40,000 faculty and
staff (Brennan, 2017; DataCenter, 2017). In 2016-2017, the system issued 25,960 degrees and
certificates (DataCenter, 2017).
Work conducted by the MDHE primarily focuses on The Vision Project. In 2010, the
MBHE and MDHE created The Vision Project, an aggressive state-driven initiative, to “produce
the best-educated citizenry and workforce in the world” (MDHE, 2017). As the public agenda for
all institutions of higher education in the state, the Vision Project identified five key educational
outcomes: college participation, college completion, student learning, workforce alignment, and
closing achievement gaps (Final Report, 2013). However, in response to A Crucible Moment and
the state’s own assessment of civic learning decline among college graduates, the Board adopted
a sixth outcome: preparing citizens. The Board understood that in order to achieve its goal of
producing a well-educated citizenry and improving the state and national workforce, the public
postsecondary education system must provide students “with the knowledge, skills, and
dispositions to be engaged, informed citizens” (Special Commission on Civic Engagement and
Learning, 2012).
Importance of Addressing the Problem
It is important to address the decline in civic learning in higher education for several
reasons. The effects of a decline in civic education at all educational levels are evident in
decreased voting activity (ACTA, 2016; Galston, 2004) and general civic apathy (Davis & Mello,
2012). In fact, the Center for Information and Research on Civic Learning and Engagement
(CIRCLE) (2010) found that the decline in civic learning is associated with lower income and
overall wealth. As budget cuts continue for civic education at the K-12 level (Department of
Education, 2017), colleges and universities can bridge the gap in civic learning by offering
targeted curricular and co-curricular opportunities for students (Coley & Sum, 2012; National
Task Force, 2012). Higher education can provide a level of civic knowledge that helps young
citizens create what Galston (2007) calls enlightened self-interest. Moreover, employers report
that they want college graduates to have strong civic knowledge and a dedication to civic
engagement (National Task Force, 2012; Spiezio, 2009). Like many other soft skills, having a
strong civic education aligns with employers’ expectations of how college students can help
create a global workforce (ACTA, 2016; National Task Force, 2012). However, change cannot
just be implemented at the institutional level; rather, institutions must also be supported at state
and national levels. A Crucible Moment (National Task Force, 2012) calls on state and national
policymakers to commit to increasing civic learning opportunities for college students. While no
state initiatives were outlined in the report, the MBHE saw an opportunity for a state-level action
that would contribute to the development of Massachusetts citizens and work toward improving
civic education conditions (Policy, 2014). Although it could not have known that many government
departments would undergo severe budget cuts under a new presidential administration (Department
of Education, 2017), the MBHE’s efforts at the state level may prove to positively impact civic learning and
promote a more active citizenry as college students learn to engage with government officials
and public policy at every level (Mettler, 2007).
Organizational Performance Goal
Recognizing the importance of civic learning for the development of well-rounded
students and citizens in the state, in 2014, the Massachusetts Board of Higher Education adopted
the first-in-the-nation state-level Policy on Civic Learning. In addition to defining civic learning
and outlining four action items needed for success, the Policy on Civic Learning plans to provide
civic learning opportunities, both curricular and co-curricular, to all students attending any of the
24 institutions in the Massachusetts Public Higher Education System, except the University of
Massachusetts campuses. A Crucible Moment (National Task Force, 2012) called on
postsecondary institutions to provide civic learning opportunities to at least two-thirds of their
student populations by 2020, thus setting a national benchmark. The goal of MDHE is to provide
guidance to all 24 institutions in four broad areas – strategic planning, support, data collection,
and coordination – to achieve 100% compliance with the Policy by January 2019. The goals were
set by the MBHE, which is composed of higher education professionals and policymakers.
Compliance in each area will be tracked by the MDHE Director of Civic Learning and
Engagement and his staff.
The MDHE outlined four broad areas of compliance for the Policy. The first area,
strategic planning, focuses on how each institution will incorporate civic learning curricular
and/or co-curricular initiatives. Each institution considers its mission and academic structure,
with the goal of incorporating civic learning as an outcome for undergraduate education by the
end of the 2014-2015 academic year. The second area, support, tackles how the MDHE can
provide support to each institution as civic learning is incorporated into an educational outcome
for students. Beginning in the 2014-2015 academic year, the MDHE was charged by the MBHE
with 1) organizing and meeting with all identified stakeholders and 2) identifying additional
financial opportunities, such as grants and fellowships for curriculum development, faculty
training, and research (Policy, 2014). The third area, data collection, asks the MDHE to utilize
current resources, such as the Integrated Postsecondary Education Data System (IPEDS) and the
Department’s Higher Education Information Resource System (HEIRS) to house data collected
from each institution beginning in 2014. Moreover, the MDHE will use this data to develop a
comprehensive civic learning assessment tool to determine any impacts of a state-wide policy
initiative on a specific educational outcome (Policy, 2014). Finally, the MDHE will coordinate
with all institutions to determine future policy initiatives, changes, and educational gaps.
Stakeholder Group of Focus and Stakeholder Goal
This study identified three stakeholder groups impacted by the organization’s global goal:
students, institutional subcommittees, and policymakers. Although a complete analysis would
involve all identified stakeholder groups, for the purpose of this work, this study will focus on
the policymakers. This stakeholder group is the focus of analysis for several reasons. First, and
most important, this stakeholder group drafted the recommendations that comprise the Policy on
Civic Learning, including elements under the four broad areas in which they provide guidance to
the participating higher education institutions. It is through their work that compliance can be
achieved, and learning measured to show the effectiveness of a state-wide policy initiative on
civic learning for college graduates. Since the group is responsible for developing policy
implementation, they can ensure institutions are staying within the project metrics and goals
outlined and approved by MDHE. Second, the related problem of practice corresponds most
closely to this stakeholder group. When civic learning or civic education programs are examined,
it is often through the lens of educators and students (National Task Force, 2012). Rarely is the
process or a policy addressed. While this is an obvious gap in the larger problem of practice,
policymakers need to understand that the learning environment must also include them. The
Policy on Civic Learning provides both the guidance to produce systematic change and support
at the state-level so educators can focus on learning. Finally, activities surrounding the
completion of other performance goals are impacted by the successful completion of this
stakeholder’s goal. If the policymakers do not provide guidance and support through the
previously identified four areas (strategic planning, support, data collection, and coordination),
participating institutions may not comply with the state-adopted policy, and the goal of
producing a well-educated citizenry may not be achieved.
The performance goals for each stakeholder group are as varied as the groups themselves.
Each performance goal focuses on a needed element of compliance and the organization’s
global goal, as well as the larger problem of practice.
Table 1
Organizational Mission, Global Goal, and Stakeholder Performance Goals
Organizational Mission
The mission of the Board of Higher Education is to ensure that Massachusetts residents have the
opportunity to benefit from a higher education that enriches their lives and advances their
contributions to the civic life, economic development, and social progress of the Commonwealth.
To that end, the programs and services of Massachusetts higher education must meet standards of
quality commensurate with the benefits it promises and must be truly accessible to the people of
the Commonwealth in all their diversity.
Organizational Performance Goal
By May 2019, the Massachusetts Department of Higher Education will graduate civic-minded
students from all public 2- and 4-year institutions.
Stakeholder 1 Goal: Policymakers
By January 2019, Policymakers will achieve 100% compliance with the Policy on Civic Learning at all participating institutions.
Stakeholder 2 Goal: Institution Subcommittees
By August 2018, all participating institutions will upload baseline student data into the shared database system and successfully pilot the use of the civic learning flag system.
Stakeholder 3 Goal: Students
By August 2018, all students attending one of the participating institutions will enroll in and complete at least one civic learning identified course before graduating.
Purpose of the Project and Questions
The purpose of this project was to evaluate the degree to which the Massachusetts
Department of Higher Education is meeting its civic learning policy compliance goal of
graduating civic-minded students by January 2019. The analysis focused on knowledge,
motivation, and organizational influences related to achieving the organizational goals. While a
complete performance evaluation under the Clark and Estes (2008) gap analysis model would
focus on all stakeholders, for practical purposes, the stakeholders of focus for this analysis are
the policymakers.
The following questions will guide this study:
1. To what extent is the Massachusetts Department of Higher Education meeting its goal of
graduating civic-minded students through compliance with the Policy on Civic Learning by
all 2- and 4-year state public institutions of higher education by January 2019?
2. What is the Policymakers’ knowledge and motivation related to achieving the MDHE
goal?
3. What is the interaction between MDHE’s organizational culture and context and
policymakers’ knowledge and motivation in relation to achieving the MDHE goal?
4. What are the recommendations for MDHE organizational practice in the areas of
knowledge, motivation, and organizational resources?
Methodological Approach
The 2014 initiative from the Massachusetts Department of Higher Education seeks to
accomplish two goals: successfully implement a state-wide policy on civic learning by
graduating civic-minded college students and increase civic learning of all college graduates. For
the purposes of this study, though, only the first goal was examined. Therefore, a qualitative
approach was the best method to gather data and interpret results to evaluate whether the state-
wide policy on civic learning was successfully implemented. McEwan and McEwan (2003)
explain that when assessing whether a process works, a qualitative approach is appropriate as
information gathering techniques, such as interviews and document analysis, allow for an intense
focus on meaning and explanation. This creates a holistic view of the problem of practice
(Creswell, 2014).
The Clark and Estes gap analysis conceptual framework was also used for this study. The
Clark and Estes (2008) gap analysis provides a process that captures the human and
organizational influences that may impact performance and success. The gap analysis framework
addresses three critical causes of performance gaps: knowledge and skills, motivation, and
organizational barriers (Clark & Estes, 2008). The authors argue that it is important to identify
the root causes of gaps in each area as all three elements are important to the development of a
more effective and efficient workforce (Clark & Estes, 2008). Starting with knowledge, it is
important to determine if people know how to achieve a performance goal (Clark & Estes, 2008;
Rueda, 2011). Krathwohl (2002) identifies four types of knowledge – factual, conceptual,
procedural, and metacognitive – and how they impact the achievement of a performance goal.
Once knowledge gaps are identified, an analysis of motivation is vital. Without proper
motivation, an individual will not actively choose to engage in an activity, persist with the
activity to completion, or dedicate enough mental effort to complete the activity (Clark & Estes,
2008; Rueda, 2011). For instance, understanding a person’s level of self-efficacy or the value
they place on an activity will help understand how a performance goal can be met (Rueda, 2011).
Finally, Clark and Estes’ framework examines the organizational barriers that exist for
successful completion. Exploring the resources, policies and procedures, organizational culture,
and organizational practices will uncover gaps in support, which make performance goal
achievement difficult (Clark & Estes, 2008).
Review of the Literature
Creating an active and engaged citizenry is a long and arduous process (National Task
Force, 2012; Forestal, 2016). Researchers agree that this development is often stunted by the
lack of civic learning in higher education (Coley & Sum, 2012; Damon, 2011; Galston, 2004;
National Task Force, 2012; Van Camp & Baugh, 2016). For the purposes of this study, civic
learning is defined as the acquisition of knowledge, intellectual skills, and applied competencies
citizens need to be active participants in their communities and pursue a democratic life (Policy,
2014). Inspired by the seminal 2012 work A Crucible Moment, many postsecondary institutions
elected to adopt curricular changes in an effort to graduate more civic-minded students. The state of
Massachusetts, however, adopted a state-wide policy initiative, making civic learning and
education part of the public agenda (Brennan, 2017; Policy, 2014; Reiff, 2016).
Components of Civic Learning
Civic learning is important to creating an active and engaged citizenry. The missions of
most colleges and universities, since the time of colonial colleges, have focused on instilling a
strong sense of morality and civic-mindedness (Bryant & Gaston Gayles, 2012; Thelin, 2004).
Bok (2006) notes that “civic responsibility must be learned, for it is neither natural nor
effortless” (p. 172). As colleges and universities evolved and became more diverse, the role of
civic learning became a collaborative activity. Teaching college students about civic
responsibility requires participation from all educators and administrators. That collaboration
starts with the definition of civic learning.
Civic learning and education provides students with skills to become active citizens
through civic curriculum and activities. Van Camp and Baugh (2016) argue that civic learning is
both civic knowledge and civic engagement. The partnership between knowing and enacting
civic responsibility impacts future civic attitudes and actions, so it is important that both
elements be learned (Hillygus, 2005; Van Camp & Baugh, 2016). Many researchers argue that
even introducing a student to civic learning once can significantly increase civic knowledge,
engagement, multicultural sensitivity, and applied critical thinking skills (Bryant & Gaston
Gayles, 2012; Forestal, 2016; Jacoby, 2009; Van Camp & Baugh, 2016).
However, the lack of common language has caused confusion on what constitutes civic
learning. Jacoby (2009) mentions that there is no common definition of civic learning, civic
knowledge, or civic engagement. Many institutions define civic learning per their institutional
needs or initiatives, which appeals to many educators and administrators who do not want
political elements included (Jacoby, 2009). Forestal (2016) argues, though, that without a clear
picture of civic learning, it can be difficult to determine what requisite skills are needed for a
contemporary active citizenry. Torney-Purta and colleagues (2015) posit that not defining civic
learning or using common language can make it difficult for academic disciplines, such as
biology and mathematics, to know how their curricula can contribute to civic learning. The lack
of common language can also make it difficult to create or implement assessments to measure
civic learning (Forestal, 2016; Jacoby, 2009; Torney-Purta et al., 2015). For the purposes of this
study, both civic knowledge and civic engagement will be defined using contemporary research.
Civic knowledge. Civic knowledge provides both basic facts and conceptual skill
building opportunities. Coley and Sum (2012) describe civic knowledge as the “cornerstone of
democracy” (p. 3). Without civic knowledge, one cannot actively participate in public service
because one does not understand public service. But, as Reason and Hemer (2015) assert, civic
knowledge is, itself, actionable. Without it, one cannot make change or understand implications
of change within different types of organizations. Civic knowledge contributes to the
development of specific skills, including dialogue, interpersonal perspective taking, and critical
systematic thought (Hatcher, 2011). Many institutions, including Massachusetts policymakers,
utilize the eight civic skills defined in A Crucible Moment: 1) critical inquiry, analysis, and
reasoning; 2) quantitative reasoning; 3) gathering and evaluation of multiple sources of evidence;
4) seeking, engaging, and being informed by multiple perspectives; 5) written, oral, and
multimedia communication; 6) deliberation and bridge building across differences; 7)
collaborative decision making; and 8) ability to communicate in multiple languages (National
Task Force, 2012, p. 4). While these skills are commonly assessed when gauging civic
knowledge programs, civic knowledge is the most disagreed upon component of civic learning
(Reason & Hemer, 2015).
Civic engagement. Civic engagement allows students to put civic knowledge into
practice. Just as important as civic knowledge, civic engagement asks that a person have a
strong sense of self before engaging in civic practices. Musil (2009) maintains “civic
engagement is acting on a heightened sense of responsibility to one’s own communities that
encompasses the notions of global citizenship and interdependence, participation in building
civic society, and empowering individuals as agents of positive social change to promote social
justice, locally and globally” (p. 59). Civic engagement can lead to the development of one’s
civic identity, intellectual and ethical developments, empathy, and critical thinking (Knefelkamp,
2008).
Assessment Models and Tools of Civic Learning
Currently, civic learning assessments are determined by the component of civic
knowledge or civic engagement. Reason and Hemer (2015) and Torney-Purta and colleagues
(2015) conducted in-depth literature reviews on civic learning assessments and instruments.
Tools such as the Cooperative Institutional Research Program Freshman and Senior (CIRP)
survey, the National Survey of Student Engagement (NSSE), Civic Literacy Assessment, IEA
Civic Education Study Test and Survey, and NAEP Civics only look at specific aspects of either
civic knowledge or civic engagement. For instance, the CIRP assesses civic values of college students
and the NSSE explores student attitudes, whereas NAEP Civics gauges levels of civic
knowledge (Reason & Hemer, 2015). Torney-Purta and colleagues (2015) offer the AAC&U
Learning Value Rubric as an alternative, which seeks to assess both civic knowledge and
engagement through a series of assessments examining knowledge, skills, attitudes, values,
motivations, and efficacies. The researchers argue that using a more comprehensive tool, instead
of multiple tools, can help overcome the challenges of different assessments (Torney-Purta et al.,
2015).
Challenges of Differing Assessments
Institutional rubric inconsistencies make it difficult to create a reliable and valid
assessment that examines both components of civic learning. Forestal (2016) found that faculty
academic autonomy can make it difficult to create a university-wide rubric that can be utilized by
all faculty in all departments. Faculty inexperience and lack of civic knowledge can also make it
difficult to effectively integrate a university-wide civic assessment (Forestal, 2016; National
Task Force, 2012). The National Task Force on Civic Learning and Democratic Engagement
(2012) also found that since many institutions base their civic education opportunities on their
mission statements, it can be challenging to assess what is happening in and out of the classroom.
Combining civic learning via curricular and co-curricular activities requires a longitudinal and
complex analysis that many institutions may not be prepared to implement (Forestal, 2016;
National Task Force, 2012).
Moreover, data in all assessments are self-reported, leaving many assessments
with gaps in reliability. In a comprehensive literature review, Torney-Purta and colleagues
(2015) found that almost all civic learning assessments rely solely on self-reported data. Their
research found that most students report a positive assessment of themselves and overreport their
level of participation in civic engagement opportunities (Torney-Purta et al., 2015). More troubling,
though, researchers found that underrepresented student populations may not be accurately
represented due to access (Reason & Hemer, 2015; Torney-Purta et al., 2015). Reason and
Hemer (2015) discovered that since much of the data is aggregated, it can be difficult to truly
assess civic learning between groups and many demographic variables are only used for vague or
contradictory reports.
Policy Challenges for Civic Learning in Higher Education
The 1947 President’s Commission on Higher Education report was the first time the
federal government articulated a goal for all institutions of higher education. When surveyed,
hundreds of veterans who received and used the GI Bill of 1944 were significantly more active in
civic organizations following graduation (Mettler, 2007). Jacoby (2009) detailed few national
initiatives since the National Defense Education Act of 1958. While there was a wave of
federal government educational partnerships in the 1980s, there were few federal
recommendations until the 2012 A Crucible Moment report (Jacoby, 2009). To date, there are
still no national civic education graduation requirements for college students.
State requirements. In most instances, civic learning is managed and implemented at the
state level. However, most state requirements focus on K-12 education, ignoring the importance
state policy can play in civic education for college students (Education Commission of the
States, 2016; Jacoby, 2009). In higher education, many state legislatures leave civic education
programming or requirements to the discretion of the institution (Education Commission of the
States, 2016).
Some researchers argue that changes to civic education should come at the state level
given the unique relationships legislatures have with higher education institutions. Mettler (2007)
notes, “policymaking and civic engagement are deeply related, joined through a complex set of
mechanisms that emerge through the seemingly arcane minutia of policy design” (p. 649).
However, policymakers are shifting the responsibility to private businesses and sponsors due to
other societal problems. Rose (2017) found that feedback from higher education institutions and
public citizens can shape state legislatures, alter civic values of state representatives, and
encourage state governments to become active participants in the civic renewal process.
Massachusetts became the first state to require public higher education institutions to
graduate civic-minded students. Using data collected from all 2- and 4-year public higher
education institutions, the state’s Board of Higher Education elected to incorporate a state-wide
graduation requirement to increase the civic learning of all graduates (Reiff, 2016). The
first-in-the-nation state Policy on Civic Learning focuses on providing state support through
implementation and data collection to all public institutions (Brennan, 2017; Reiff, 2016). The
state government believes this policy initiative will help students create a civic identity that will
impact the economic and social well-being of local, state, and national communities (Reiff,
2016). The state also seeks to act as a model for other states to adopt civic education policies
through their state legislatures, creating strong partnerships between higher education institutions
and state governing bodies (Brennan, 2017; Reiff, 2016). More important, by taking an
active role in crafting, passing, and implementing a civic learning policy, the state changed the
narrative about civic education for multiple stakeholders, including students, businesses, and other
policymakers (Brennan, 2017).
Institutional curricular requirements and co-curricular programming. Realigning
civic learning with mission statements may prove difficult if it was not already embedded.
Almost every college and university claims civic engagement as a core value of the
institution (Thelin, 2004). However, few institutions use “civic learning” language to denote
both civic knowledge and civic engagement (Hatcher, 2011). This discrepancy may be explained
by past mission statements or current values, as well as campus climate, administrative support,
faculty perspective, student leadership, political climate, and community context (Hatcher,
2011). In the 1990s, civic engagement programming overtook civic knowledge requirements as
many institutions shifted focus to community involvement (Jacoby, 2009). Few institutions
include the completion of a civic knowledge course in their graduation requirements. Forestal
(2016) found that institutions that include a civic knowledge requirement fulfill the intent of the
requirement through a general education civic class or United States history course. The National
Task Force on Civic Learning and Democratic Engagement (2012) affirms that making civic
literacy a requirement for all students to graduate is one of the most effective practices to
increase civic learning. Many, if not all, higher education institutions have civic, economic, or
US history courses included in their curriculum. Adding a requirement, approved through a
curriculum review process, may be the fastest action an institution can take. This simple action
would create a comprehensive civic learning program at the institution, allowing students to
learn and apply their knowledge to become active citizens (Jacoby, 2009; National Task Force,
2012).
Knowledge, Motivation, and Organizational Influences
Knowledge Influences
Understanding what stakeholders know or do not know is a vital component of assessing
a performance gap. In their analogy comparing the three critical factors of the analysis process to a car,
Clark and Estes (2008) explain that the knowledge element of the analysis is like the engine and
transmission system. Without the central element of knowledge, the other factors cannot be
determined or investigated, or in the case of the car analogy, run effectively. More important,
analysis conducted through an examination of knowledge influences encourages identification of
what is missing from practice. Rueda (2011) argues that a central question surrounds the
knowledge element: “What does one need to know in order to achieve his or her goals?” (p. 27).
With that simple premise, a gap analysis can begin.
Within the selected problem of practice, the lack of civic learning programs in higher
education, a basic understanding of what constitutes civic learning is paramount. However, it is
not just the process of learning new information. Mayer (2011) maintains that this first step also
denotes a change in knowledge due to experience and that the change must alter the learner in a
fundamental way. Policymakers must know how to accurately define civic learning in order to
effectively implement programmatic and curricular changes. This fundamental knowledge
directly impacts the other identified knowledge influences since a weak or incomplete
understanding of civic learning will not allow for a holistic assessment of civic learning or
alignment of civic learning to institutional missions. While there is considerable research
available on the importance of defining civic learning before starting an initiative and assessing
civic learning, there is limited research on aligning state higher education policies with
institutional missions. This lack of consistency illustrates knowledge gaps that currently exist for
successful civic learning program integration into higher education.
Knowledge types. There are four main types of knowledge that highlight the cognitive
approach to performance analysis. The first type of knowledge, factual, focuses on the gathering
of fact-based information, such as terminology (Rueda, 2011). Factual knowledge can be quickly
recalled from short-term memory and easily applied when asked about basic information
(Krathwohl, 2002). Being able to define the four principles of democratic citizenship is a prime
example of this dimension. The second knowledge dimension is conceptual knowledge, which
focuses on knowledge of concepts, theories, and classifications (Krathwohl, 2002). Conceptual
knowledge requires a higher level of analysis and application by the learner because the
knowledge type requires that the learner make meaning of the situation through the knowledge
gained (Krathwohl, 2002; Rueda, 2011). Understanding how a state-mandated civic learning
policy aligns with an institution’s mission statement highlights this dimension. The third type of
knowledge, procedural, focuses on how a learner knows how to do something (Rueda, 2011).
However, procedural knowledge also focuses on how well a learner can perform the task, not
just if they can perform the task (Krathwohl, 2002). For instance, knowing how to assess civic
learning through a variety of curricular experiences is different than issuing a civic skills survey
to gauge the level of civic knowledge.
The final type of knowledge, metacognition, may be the most significant for knowledge
development. Rueda (2011) states that this dimension focuses on the awareness of one’s own
cognition and role in the cognitive process. Metacognition is “a key aspect of strategic behavior
in solving problems, and allows one to consider contextual and conditional aspects of a given
activity or problem” (Rueda, 2011, p. 29). For policymakers, understanding how they are
developing civic learning through their policy work, using self-regulation strategies to change
their attitudes about civic engagement, and understanding their role in collegiate learning
illustrate metacognitive knowledge in practice. While all four knowledge types are present in the
identified problem of practice, for the purposes of this study, only two dimensions will be
examined: conceptual and procedural.
Knowledge of civic learning. Before Policymakers can ask higher education
administrators and faculty to put civic skills and principles into practice, they must first
determine definitions and terminology surrounding the topic. However, since there are multiple
definitions of civic learning and civic education (Jacoby, 2009), it may be unclear what elements
constitute this definition and how that terminology is applicable to the policy. Melville, Dedrick,
and Gish (2013) argue that multiple definitions of civic learning make the concept too vague, and
policymakers cannot expect to increase civic knowledge and skill without a clear definition. In
their work on democratic citizenship education, the authors found that current higher education
curriculum does not present students with all aspects of civic learning, making it difficult for them to truly grasp
civic knowledge and how that knowledge contributes to skill development and engagement
(Melville et al., 2013). Without clear definitions and an understanding of all the elements of civic
learning, students will not know how to increase their knowledge and skills, faculty will not
know what to teach in the classroom, and policymakers will not know how to help institutions
develop opportunities to develop civic-minded graduates.
Aligning institutional mission with public policy. A university’s mission statement
illustrates the values and goals of the institution (Cuthill, 2012). Over the past twenty years,
many institutions have updated their mission statements to include learning goals, aspirational
goals, and new endeavors or professional foci, such as graduate programs (Gaff & Meacham,
2006). Many institutions are also updating their mission statements to align with social issues,
public policy initiatives, and workforce development (Holosko, Winkel, Crandall, & Briggs,
2015). Historically, most institutions have included a focus or mention of civic development in
their mission statements; however, few institutions allot resources to contribute to the success of
that mission (Cuthill, 2012). For successful implementation, Policymakers must know the
mission statements of each participating institution and how the state-mandated Policy on Civic
Learning will fit within that mission. In some cases, a re-alignment may be necessary. Bawa and
Munck (2012) explain that as institutions are changing to meet the demands of new markets and
globalization, many institutions may need to re-evaluate their mission statements. The scholars
explain that since civic engagement is already internalized at many institutions, this re-alignment
would need to consist of a more rigorous framework for implementation (Bawa & Munck, 2012).
In his work analyzing the national civic policy implementation at public higher education
institutions in Ireland, Boland (2012) found that mission drift – a move away from the
sustainability of a practice due to a shift in priorities – was a primary concern for implementation.
His research also found that by realigning a mission statement to include a policy initiative, an
institution was able to more easily identify activities that met the intent of the policy, and the
policy contributed to all institutional goals and created stronger shared institutional values
(Boland, 2012). Policymakers must understand how to help institutions re-align their institutional
missions, if needed, to successfully implement this state-mandated initiative.
Assessing civic learning. Policymakers must also know how to accurately assess civic
learning holistically, and within the context of the policy implementation. Rueda (2011)
maintains that it is important that what is being learned is accurately being assessed. While there
are several civic skills, civic values, and civic engagement rubrics and assessments available
(Forestal, 2016; National Task Force, 2012; Reason & Hemer, 2015; Torney-Purta et al., 2015),
there are few tools that look at specific segments of civic learning. Since Policymakers defined
civic learning with specific metrics, they must know how to assess the operationalized products
of each institution and determine if they are meeting the prescribed goal (Brennan, 2017). It is
not enough for students to improve their civic knowledge; they must also “cultivate habits and
skills” that impact their public service (Melville et al., 2013, p. 262). Moreover, there is a glaring
gap in assessment due to self-reported data (Torney-Purta et al., 2015; Van Camp & Baugh,
2016) and a lack of data available from the Integrated Postsecondary Education Data System
(IPEDS) and MDHE (Brennan, 2017; Policy, 2014). Policymakers need to know how to set
benchmarks and collect baseline data to assess civic learning and show any impacts on student
learning (Brennan, 2017; Reiff, 2016).
Table 2 outlines the organizational mission, goal, and stakeholder goal specific to the
problem of practice. Knowledge influences, types, and assessments are included.
Table 2
Knowledge Influences, Types, and Assessments for Gap Analysis
Organizational Mission
The mission of the Board of Higher Education is to ensure that Massachusetts residents have
the opportunity to benefit from a higher education that enriches their lives and advances their
contributions to the civic life, economic development, and social progress of the
Commonwealth. To that end, the programs and services of Massachusetts higher education
must meet standards of quality commensurate with the benefits it promises and must be truly
accessible to the people of the Commonwealth in all their diversity.
Organizational Global Goal
By May 2019, the Massachusetts Department of Higher Education will graduate civic-minded
students from all public 2- and 4-year institutions.
Stakeholder Goal
By January 2019, Policymakers will achieve 100% compliance with the Policy on Civic Learning at all participating institutions.
Knowledge types are declarative (factual or conceptual), procedural, or metacognitive.

Knowledge Influence 1: Policymakers need to know definitions and understand elements of civic learning.
Knowledge Type: Declarative (conceptual)
General Literature: Forestal, 2016; Jacoby, 2009; Melville et al., 2013; Torney-Purta et al., 2015; Van Camp & Baugh, 2016

Knowledge Influence 2: Policymakers need to understand the position of institutional missions at postsecondary colleges and universities and how to integrate the policy within these missions.
Knowledge Type: Declarative/Procedural
General Literature: Bawa & Munck, 2012; Boland, 2012; Cuthill, 2012; Holosko et al., 2015

Knowledge Influence 3: Policymakers need to know how to assess civic learning.
Knowledge Type: Procedural
General Literature: Bok, 2006; Galston, 2007; Melville et al., 2013; Spiezio, 2009; Van Camp & Baugh, 2016
Motivational Influences
Clark and Estes (2008) maintain that humans are made of two complementary bases –
knowledge and motivation. Motivation answers the substantial question of what an individual
will, or will not, do with knowledge once it is acquired (Rueda, 2011). Scholars have determined that
motivation is culturally inspired and what motivates one person may not motivate another
(Rueda, 2011). Motivation is as personal as DNA and just as complex. Clark and Estes (2008)
describe three indicators of motivation: active choice, persistence, and mental effort. The first
indicator, active choice, reflects the decision one makes to start an activity (Rueda, 2011). However,
as Clark and Estes (2008) argue, the decision not to pursue an activity can also reflect choice. The
second indicator, persistence, represents the continued progress an individual makes on an activity (Clark &
Estes, 2008; Rueda, 2011). Persistence may be one of the more difficult motivational indicators
to maintain as one must invest a considerable amount of energy into the activity (Clark & Estes,
2008). The final indicator, mental effort, refers to the amount of mental energy one expends to
complete an activity (Clark & Estes, 2008; Rueda, 2011). However, as Clark and Estes (2008)
explain, mental effort is determined by our confidence more than by the amount of effort exerted.
This makes successful motivation personal and intrinsic (Mayer, 2011). Of the specific
motivational theories and variables, this problem of practice centers around goal orientation
theory and goal setting theory.
Goal orientation theory and goal setting. Goal orientation theory examines how an
individual engages in goal achievement and why they make those decisions (Yough &
Anderman, 2006). According to Yough and Anderman (2006), goals fall into two categories:
mastery goals or performance goals. Mastery goals are used when an individual wants to master
the task at hand and is truly interested in self-improvement (Yough & Anderman, 2006).
Conversely, performance goals are used when an individual wants to demonstrate a competence
in an area to complete a specific task (Yough & Anderman, 2006). After identifying areas for
mastery or performance enhancement, an individual must also produce a specific set of metrics
to meet those goal markers. Locke’s goal setting theory of motivation explains the use of
SMART goals to produce effective results and meet goals (Rueda, 2011). In order to be
considered SMART, goals must be specific, measurable, attainable, relevant, and time-bound.
Moreover, Lunenburg (2011) found that goals must also be used to evaluate specific aspects of
performance, linked to feedback, and allow for commitment and acceptance of results. Goal
orientation and goal setting have been thoroughly explored in the education and management
literature, but there is no identified literature connecting these theories to public policy, as agenda
setting is the preferred method of implementation.
Goals and policy implementation. Since most public policy initiatives use agenda
setting as their driving force for implementation (Hillman, Tandberg, & Sponsler, 2015), there is
limited identified research on goal setting or goal orientation and public policy implementation.
Xue, Murthy, Tran, and Ghaffar (2014) found success using goal setting in health reform policy
implementation at 71 sites, while Leach, Pelkey, and Sabatier (2002) used the theory to examine
44 collaborative partnerships in Washington and California. Information about the policy
initiative and the stakeholder group revealed that the organizational global goal and stakeholder
group are the driving forces for implementation (MDHE, 2017; Policy, 2014). Before
implementation, the Policymakers created smaller goals for implementation for each phase of the
policy initiative and continue to use that framework for their work (Brennan, 2017; Reiff, 2016).
The goals are specifically outlined for each section, and while there is some room for flexibility,
the stakeholders reference the goals for continued progression on the initiative (Brennan, 2017;
Reiff, 2016).
Table 3 outlines the organizational mission, goal, and stakeholder goal specific to the
problem of practice. One motivational influence and its assessment suggestions are included.
Table 3
Motivational Influences and Assessments for Gap Analysis
Organizational Mission
The mission of the Board of Higher Education is to ensure that Massachusetts residents have
the opportunity to benefit from a higher education that enriches their lives and advances their
contributions to the civic life, economic development, and social progress of the
Commonwealth. To that end, the programs and services of Massachusetts higher education
must meet standards of quality commensurate with the benefits it promises and must be truly
accessible to the people of the Commonwealth in all their diversity.
Organizational Global Goal
By May 2019, the Massachusetts Department of Higher Education will graduate civic-minded
students from all public 2- and 4-year institutions.
Stakeholder Goal
By January 2019, Policymakers will achieve 100% compliance with the Policy on Civic Learning at all participating institutions.
Assumed Motivational Influence (Goal Setting): Policymakers need to set specific goals through the initiative to help successfully implement the policy through multiple contexts and institutions.
General Literature: Brennan, 2017; Hillman et al., 2015; Reiff, 2016
Organizational Influences
The final component of the gap analysis is the organizational influences that illustrate a
deficiency in the processes, materials, or culture that inhibit successful completion of a project or
goal. Clark and Estes (2008) explain that organizational barriers, such as a lack of resources,
informal organizational policies, and missing materials, should be examined with the same rigor
as knowledge and motivation gaps. Often, organizational barriers may contribute to knowledge
and motivation gaps, exacerbating the gap that may already exist (Clark & Estes, 2008; Rueda,
2011). Rueda (2011) notes that researchers should consider culture, structure, and policies and
procedures of organizations to determine organizational barriers. These barriers often comprise
the larger organizational culture that must be examined.
Researchers identify organizational culture as the most important entity in an
organization as it can dictate how work gets done, who is responsible for work, and general
norms of activity (Clark & Estes, 2008; Rueda, 2011). Clark and Estes (2008) define culture as
“a way to describe the core values, goals, beliefs, emotions, and processes learned as people
develop over time” (p. 108). Culture can be examined through three approaches in organizations: 1)
culture in the environment, 2) culture in groups, and 3) culture in individuals (Clark & Estes,
2008). Culture in the environment examines how changing the culture can change performance,
culture in groups explores cultural patterns among groups of people within an organization, and
culture in individuals digs into a specific individual's work processes and motivations (Clark &
Estes, 2008). Another way to break down these approaches is to examine the cultural models and
cultural settings within organizations and stakeholder groups.
Cultural models. Understanding the cultural models that exist within a stakeholder
group or larger organization is important to determining what gaps are present for analysis.
Rueda (2011) explains that cultural models “are the shared mental schema or normative
understandings of how the world works or ought to work” (p. 55). Cultural models are dynamic
and expressed through cultural practices (Rueda, 2011). In some organizations, cultural models
may help shape the organization’s structure, values, or policies (Rueda, 2011). For this study,
there are two outstanding cultural model issues at play: the lack of a culture of civic learning at
the organizational level and the need to account for the institutional culture at each participating
college and university.
Cultural model influences and creating a culture of civic learning. Changing the
culture of an organization or institution is difficult, time-consuming, and requires significant
resources to sustain the practice (Schein, 2004). Some large-scale organizations have
successfully changed their cultures to address customer or client need. For instance, when the
Department of Veterans Affairs elected to change to a patient safety culture in the late 1990s to
address increases in patient errors and mortality, significant structural and policy changes were
implemented to sustain the change (Schein, 2004). Schneider, Brief, and Guzzo (1996) argue that
to effectively change the culture of an organization, the organization must first change the
climate. However, such total organizational change (TOC) would mean changing multiple
policies, practices, and procedures, which may exceed the type of culture change required of
the MDHE (Schneider et al., 1996). Kezar's work on cultural change models,
though, would be helpful for the culture change needed at the MDHE level for successful policy
implementation by the stakeholder group. Through a cultural change model, Kezar (2001)
explains that change occurs as a result of the current human environment, and it serves as a good
analogy for political changes, such as the Policy on Civic Learning. Cultural changes may be fast
or slow, and account for the complex and diverse systems that function within an organization
(Kezar, 2001).
While Kezar's cultural model for organizational change would be an effective tool to help
the MDHE support the stakeholders in the implementation, the larger problem is that the MDHE
is not planning to adopt any kind of cultural change for civic learning. Jacoby (2009) argues
that for civic learning programs to thrive, organizations must become civic-minded and adopt a
civic mission, thus creating a culture of civic learning. This top-down approach shows that
implementing a significant learning change, such as a state-mandated learning policy, is
important and can be sustained. If the MDHE is not planning to make any cultural changes at the
organizational level, even in response to the state-mandated policy initiative, the stakeholder
group may have a difficult time sustaining implementation efforts at participating institutions.
Cultural model influences and accounting for institutional culture. Higher education
scholar Thelin (2004) argues that colleges and universities can take on a life of their own. Many
institutions have missions and visions that dictate certain policy or learning directives, and those
tools can also create a specific institutional culture (Thelin, 2004). Introducing a new policy,
procedure, or initiative that runs counter to an already established institutional culture can create
resistance and anxiety among workers, causing those initiatives to fail (Agocs, 1997; Schein,
2004; Schneider et al., 1996). Considering an already existing institutional culture and how a
state-mandated policy will impact that culture is important. Examining how the MDHE and
stakeholder group are prepared to align those cultures with policy implementation requires an
assessment of current human resources, materials, and other policies and procedures. If the
MDHE has not considered how the policy will be embedded within current institutional activity,
it may be difficult for the stakeholders to succeed during implementation.
Cultural settings. Whereas cultural models show how people and activities exist within
organizations, cultural settings illustrate how these models or processes are enacted within the
organization (Rueda, 2011). Rueda (2011) posits that cultural settings are the “who, what, when,
where, why, and how” of organizations (p. 57). Like cultural models, cultural settings are
dynamic, and researchers argue that there is a reciprocal relationship between the two
phenomena (Rueda, 2011). Given the various moving parts of an organization, including the
people, materials/resources, policies and procedures, and overall organizational culture, viewing
these elements as core components of the gap analysis is vital to ensure a comprehensive
understanding of the problem of practice and to evaluate the stakeholder's goal. In relation to
the cultural models outlined above, two cultural setting influences that will be examined are
decreased resource allocation due to competing state-wide initiatives and a lack of standardized
implementation strategies.
Cultural setting influences and resource allocation due to competing state-wide
initiatives. One of the key components to successfully transitioning the culture of an
organization is resource allocation and the alignment of resources with goals and priorities. From
an organizational change perspective, resource allocation includes human resources, monetary
resources, and materials or equipment (Clark & Estes, 2008). An organization must evaluate and
align those resource allocations to meet the needs of the culture change, while still recognizing
competing projects or initiatives that require some of the resources that may be reallocated
(Kezar, 2001). A culture change must be disruptive enough to create and sustain the change
needed within the organization while not disrupting the routine activities that allow the
organization to operate (Schein, 2004). The MDHE must balance these competing ideas while
addressing the competing projects within the Vision Project and other state projects.
Under the Vision Project, there are five additional statewide initiatives the organization is
overseeing including reducing six-year graduation rates and offering more workforce
development opportunities to students (MDHE, 2017). While none of the other projects under
this umbrella initiative contain a state-wide policy mandate or require a change in learning at
multiple levels within a postsecondary institution, they are all still important initiatives directed
to meet the Governor's goal of graduating students who can contribute to the economic
development of the Commonwealth (MDHE, 2017). These competing initiatives mean the
MDHE must constantly assess how they are using their resources to meet the requirements of the
policy. However, recent decisions illustrate a decline in commitment from the MDHE. In 2014,
the MDHE hired a Director of Civic Learning and Engagement to oversee the implementation, as
well as a staff of three additional workers (Brennan, 2017; Reiff, 2016). By summer 2017,
though, the staff had been reduced to one student researcher and the Director position had been
cut to a part-time designation (J. Reiff, personal communication, September 8, 2017). With
various aspects of the initiative still being rolled out for implementation, decreasing needed
resources and not allocating resources across competing initiatives may prove detrimental to the
policy program.
Cultural settings influences and a lack of standardized implementation strategies.
Implementing a significant policy or initiative, especially at the state or federal government
level, requires a strong plan for rollout of the implementation (King, Kohsin, & Salisbury, 2005).
Not only is a strong implementation plan important for organizations, it can directly impact
multiple goals of a project, such as data collection. If a larger implementation is being introduced
to multiple locations, teams, or small organizations, standardized implementation may be
important to successful completion.
In this project, MDHE must support the identified stakeholders in their implementation
efforts, including standardized implementation of the policy. Currently, the policy could be
interpreted for implementation in multiple ways, which could cause confusion among
postsecondary institutional stakeholders. While the Policymakers outline some key standardized
elements for implementation, such as initial data collection, the MDHE allowed flexibility in
how institutions could meet the civic-mindedness requirement for their students to accommodate
the institutional cultures at each campus (Brennan, 2017; Reiff, 2016). That flexibility
means that some institutions may require a civic-inspired course, such as Introduction to Political
Science, while others may allow students to complete service learning projects to meet the intent.
This flexibility may impact data collection, as the tools and rubrics used may not be measuring the
same learning area. Coordinating implementation across a state can be difficult, but allowing too
much flexibility in implementation could cause the policy to fail.
Table 4 outlines the organizational mission, goal, and stakeholder goal specific to the
problem of practice. Two cultural model and two cultural setting influences, along with
assessment suggestions, are included.
Table 4
Organizational Influences and Assessments for Gap Analysis
Organizational Mission
The mission of the Board of Higher Education is to ensure that Massachusetts residents have
the opportunity to benefit from a higher education that enriches their lives and advances their
contributions to the civic life, economic development, and social progress of the
Commonwealth. To that end, the programs and services of Massachusetts higher education
must meet standards of quality commensurate with the benefits it promises and must be truly
accessible to the people of the Commonwealth in all their diversity.
Organizational Global Goal
By May 2019, the Massachusetts Department of Higher Education will graduate civic-minded
students from all public 2- and 4-year institutions.
Stakeholder Goal
By January 2019, Policymakers will achieve 100% compliance of the Policy on Civic
Learning at all participating institutions.
Cultural Model Influence 1: There is a need to foster a culture in which civic learning is
important to the development of students, both in and out of the classroom, while respecting the
mission of each institution. (General Literature: Brennan, 2017; Jacoby, 2009; National Task
Force, 2012; Mettler, 2007; Reiff, 2016)
Cultural Model Influence 2: There is a need to align public policy with institutional cultures to
effectively embed a civic learning culture into postsecondary institutions. (General Literature:
Agocs, 1997; Brennan, 2017; Thelin, 2004; Reiff, 2016)
Cultural Setting Influence 1: Resources are not aligned with policy goals and priorities, and
competing goals and priorities will jeopardize success. (General Literature: Brennan, 2017;
Reiff, 2016)
Cultural Setting Influence 2: Unclear direction regarding implementation of the civic learning
policy does not allow for a standardized curriculum, making it difficult to assess the impact of
the policy on civic learning. (General Literature: Brennan, 2017; King et al., 2005; Reiff, 2016)
Conceptual Framework: The Interaction of Stakeholders’ Knowledge, Motivation
and the Organizational Context
Being able to visually or graphically represent a complex issue provides another lens
through which to understand the larger problem of practice. The conceptual framework, or theoretical
framework, is the “underlying structure” of a study (Merriam & Tisdell, 2016, p. 85). This
structure allows the theories and concepts explored in the study to be captured by the design of
the study through a visual representation (Maxwell, 2013). While the conceptual framework can
be used for both quantitative and qualitative research projects, the technique is especially
important for a qualitative research study as it can be used to justify the research and position the
work within a larger research argument or conversation (Maxwell, 2013; Merriam & Tisdell,
2016). Merriam and Tisdell (2016) argue that theory underlies all research because “no study can
be designed without some question being asked” (p. 85). The lens provided from a conceptual
framework explores those questions in a way that connects all aspects of the larger research
design.
The conceptual framework is composed of concepts, terms, and models drawn from the
larger literature base. Maxwell (2013) notes that there are four sources of this information that
are used to construct the framework: experiential knowledge, existing theory and research, pilot
and exploratory research, and thought experiments. Merriam and Tisdell (2016) explain that the
information gathered through the literature review will directly impact the problem of practice
and purpose of the study, and the conceptual framework provides the frame for that review. It is
through these thought activities that the research is constructed, which Maxwell (2013) argues is
the most important feature of the conceptual framework. Through visual representation, the
conceptual framework shows a reader how and why the research is being conducted and orients
them to its importance and utility.
For this research study, a broader literature review identified key elements necessary for
the successful completion of the stakeholder and organizational goal. Through the Clark and
Estes (2008) gap analysis, knowledge and motivational influences of the stakeholder group were
examined in relationship to the organizational influences of the overseeing institution. Using the
conceptual framework, the relationship between those three factors is illustrated. The Policy
on Civic Learning is positioned to answer two questions: what is the state's role in civic
learning and can a state-wide policy improve civic learning? Only the first question is being
answered in this analysis, so the knowledge, motivation, and organizational influences identified
address that inquiry. To be successful, the policymakers must know how to define civic learning
(Jacoby, 2009), understand how to align a state policy within an institution’s mission (Bawa &
Munck, 2012; Boland, 2012), and know how to assess civic learning (Brennan, 2017; Reiff,
2016). These knowledge influences are directly impacted by the group's motivation toward
completion of the organizational goal. Through the conceptual framework, goal orientation is visualized as
the driving force for the stakeholder group and shows how this motivational theory interacts with
the knowledge influences.
While the policymaker stakeholder group is technically housed under the larger
organization (MDHE), this group acts as an autonomous arm of the organization, responsible for
the creation, development, and implementation of the program. While the conceptual framework
illustrates this division, it also shows the organizational influences that the stakeholder group
must address throughout policy implementation. Even though the stakeholder group operates in a
different public sphere from the larger organization, it is still impacted by the organization’s
identified cultural model and setting influences. Specifically, the culture of civic learning
fostered by the MDHE and the value it places on institutional culture connect with continued
motivation to complete the goal, while the conflicting allocation of resources and priorities and
standardization directly influence the overall goal of successful implementation. The conceptual
framework depicts the interaction and relationships between each influence and the goals
identified for this study.
[Figure 1 diagram: the organizational influences (cultural settings and models: a culture of civic
learning, regard for institutional culture, lack of resource allocation, and lack of clear
implementation at multiple sites) surround the knowledge influences (declarative/conceptual:
concepts of civic learning; procedural: alignment with institutional missions and civic learning
assessment) and the motivational influences (goal orientation theory and goal setting theory),
with arrows showing the direction of interaction and the connection to the stakeholder goal (by
January 2019, Policymakers will achieve 100% compliance of the Policy on Civic Learning at all
participating institutions) and the organizational performance goal (by May 2019, the
Massachusetts Department of Higher Education will graduate civic-minded students from all
public 2- and 4-year institutions).]
Figure 1. An integrated conceptual framework illustrating the interactions of knowledge,
motivation, and organizational influences.
Participating Stakeholders: Sampling and Recruitment
While there are numerous stakeholders working to achieve compliance of the civic
learning policy, the policymakers are the focus for this study since they wrote the Policy on
Civic Learning and are charged with overseeing implementation and data collection. The
policymaker stakeholder group is defined as the participants from the Study Group on Civic
Learning and Engagement and MDHE staff members who are involved in the policy initiative.
Between the two groups, 17 professionals were contacted for this study; therefore, nonrandom,
convenience, purposive sampling was used. As there have been no additions to the make-up of
the group since its creation, the defining characteristic of this sample was their membership in
the group. With such a specific pool of stakeholders, the sample was nonrandom and convenient
because it was easy to identify and assemble participants (Fink, 2013; Johnson & Christensen,
2015) and purposive as a specific characteristic defines the group (Johnson & Christensen, 2015;
Merriam & Tisdell, 2016). Moreover, the sample was also a census as it was the entire
stakeholder group within a case study (Fink, 2013; Merriam & Tisdell, 2016). The group, which
included state lawmakers, educators, and higher education administrators, represented all aspects
of the larger stakeholder organization, including the colleges and universities that fall under the
purview of the larger organization.
In addition to interviews with identified stakeholders, a document analysis was conducted
on public documents associated with the policy implementation. Documents housed in the
project’s corresponding Civic Learning Open Resource Repository were examined for
compliance. However, participating institutions are not required to upload documents to this
repository. In some instances, documents are housed on institutional websites. Therefore, those
websites were also analyzed for this study.
Interview Sampling Criteria and Rationale
Criterion 1. The only criterion used for this sample was that professionals are members
of the original policy team, which includes MDHE staff members and members from the Study
Group on Civic Learning and Engagement. There have been no changes to the composition of
this group since its inception. The rationale for this sample selection criterion was that since
these professionals wrote the policy and are charged with implementation, they would be able to
directly address the research questions, which are targeted to this stakeholder group.
Interview Sampling (Recruitment) Strategy and Rationale
Purposive sampling was utilized in the interview portion of the study. Maxwell (2013)
explains that this sampling technique is best utilized in qualitative research as the stakeholder
population was deliberately selected to answer specific questions. Interviews allow researchers to
gather insight into the perspective of the participant (Maxwell, 2013). While the nonrandom
sampling technique will not allow for generalization, the use of purposive sampling for the
interviews will provide a specific description of the stakeholder’s activities (Fink, 2013;
Maxwell, 2013). The interview portion of the study was also used to understand how
stakeholders interpret their experiences and make meaning of their roles in the policy initiative
(Merriam & Tisdell, 2016).
Participants were recruited for individual interviews via emails to their work email
addresses. Participants were asked to confirm or decline participation in the interview. If they
confirmed, participants received a follow-up email to arrange the interview. If they declined,
participants received a follow-up email thanking them for their time. To incentivize participation,
the researcher provided free beverages during the interview (coffee, tea, soda, non-alcoholic
beverages, etc.), as well as offered face-to-face or online meeting options (Skype, GoToMeeting,
Zoom, etc.).
Document Analysis Sampling Criteria and Rationale
Criterion 1. All curricular materials, projects, and assignments posted in the Civic
Learning Open Resource Repository.
Criterion 2. Review of institutional websites to determine compliance with the state
policy.
Document Analysis Review Strategy and Rationale
Document analysis and review were also used to determine compliance with the
stakeholder goal. An invaluable and often overlooked method of analysis, document review can
give voice and meaning to a given topic (Bowen, 2009). While there are three types of
documents that can be reviewed (public records, personal documents, and physical evidence)
(Bowen, 2009), only public records were used for this study. Participating higher education
institutions were given two options to house documentation showing compliance with the state-
mandated policy: the Civic Learning Open Resource Repository and/or their institutional
website. This researcher was granted access to the repository and reviewed all curricular
materials, projects, and assignments posted. An analysis of institutional websites yielded
additional information regarding curricular and co-curricular options for students.
Since there are many documents in the repository and on the institutional websites to
consider, an organized coding management system was used. Bowen (2009) notes that
organizing data into pre-constructed categories will allow a researcher to target information
needed for the project. It will also allow the documents to be evaluated in a way that “empirical
knowledge is produced and understanding is developed” (Bowen, 2009, p. 33). The coding
protocol is included in Appendix B.
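As a purely illustrative sketch (not the Appendix B protocol itself), the pre-constructed
categories Bowen (2009) describes could be expressed in a short script such as the following;
the category names, keywords, and function below are hypothetical placeholders rather than the
coding scheme actually used in this study.

# Illustrative sketch only: hypothetical pre-constructed categories for sorting
# repository documents before deeper analysis. These names and keywords are
# placeholders, not the coding protocol in Appendix B.
CATEGORIES = {
    "curricular_material": ["syllabus", "course", "curriculum"],
    "assignment_or_project": ["assignment", "project", "rubric"],
    "strategic_planning": ["master plan", "learning outcome"],
}

def code_document(title, text):
    """Return every category whose keywords appear in the document."""
    combined = (title + " " + text).lower()
    return [name for name, keywords in CATEGORIES.items()
            if any(keyword in combined for keyword in keywords)]

# Example: a repository item describing a service learning rubric would be
# tagged as an assignment- or project-type document.
print(code_document("Service Learning Rubric", "Rubric for a civic engagement project"))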
Data Collection and Instrumentation
Two specific types of data collection were used for this qualitative case study: interviews
and document analysis. Both data collection methods seek to gain understanding related to the
research questions (Maxwell, 2013). Using two data collection methods allows for triangulation,
which was used to establish validity, reduce potential bias, and provide comprehensive
findings (Maxwell, 2013; Merriam & Tisdell, 2016). The following outlines the rationale and
explanation for the instrumentation used in this study. Both data collection methods
occurred concurrently; however, interviews are listed first in this section.
Interviews
Interview protocol. Semi-structured qualitative interviews were used for this study. This
kind of interview allows the policymakers to describe their experience, capturing their thoughts
and feelings during the process toward compliance through open-ended questions (Merriam &
Tisdell, 2016). Moreover, this kind of interview format will allow for flexibility in questioning
and the opportunity to explore areas in more detail (Merriam & Tisdell, 2016). Interviewing this
census sample provided data to create a complete picture of the policymaker experience.
Questions ranged from introductory inquiries about the policymaker’s professional history and
how he or she became part of this collective, to knowledge of civic learning programs and how a
state-mandated policy can be implemented. Each question in the interview protocol (Appendix
A) connects to both the research questions and the proposed conceptual framework.
Interview procedures. One interview was conducted with each of the policymakers,
lasting between 45 minutes and one hour. One hour was scheduled for each interview, but an
additional hour was planned in the event of extended conversation. A formal interview protocol
using standardized, open-ended questions was used, and the order of the questions remained
consistent for all interviews to ensure complete data collection (Patton, 2002). However, a semi-
structured approach allowed for an exploration of the interviewee’s worldview (Merriam &
Tisdell, 2016).
Interviews were conducted in the location of choice of the interviewee. Interviewee
convenience and geographical location dictated the location of the interview, as well as the time
of the interview. Accommodating the interviewees, as they are spread across the state, was
important to capturing all perspectives. Each interview was conducted one-on-one. Each
interview was recorded on a hand-held recording device and handwritten notes were taken in the
event of a recording failure. Interviews were conducted concurrently with document analysis since
neither is reliant on the other for completion (Creswell, 2014). Each interview was transcribed
within 48 hours of its conclusion.
Documents and Artifacts
Document analysis was also used to collect data to address the organizational goal. Two
types of documents, both publicly available, were sought for this study. The first analysis
involved documents uploaded to the project’s Civic Learning Open Resource Repository. While
the repository houses a multitude of documents related to the project, only curricular materials,
assignments, and projects related to the policy were examined. Participating institutions are not
required to upload documentation into the repository, so a second document analysis was used to
ensure a systematic review (Merriam & Tisdell, 2016). All participating institutions are required
to list classes and activities related to the policy on their institutional websites, marked by an
agreed upon insignia (Brennan, 2017). A review of these institutional websites uncovered any
additional documents for review. A coding system was used for the content analysis, cataloguing
findings to support analysis of the research questions and conceptual framework (Merriam &
Tisdell, 2016).
Data Analysis
Data collected from both interviews and documents was analyzed concurrently in the
analysis phase of the study. Analysis of the interview data began as the interviews were being
conducted. Interviews were transcribed within 48 hours from the conclusion of the interview.
Analysis and interpretation were done concurrent with collection (Bogdan & Biklen, 2007).
Coding of the interview data was conducted in three phases, as suggested by various researchers
(Harding, 2013; Miles, Huberman, & Saldana, 2014). In the first phase, open coding was used to
identify pieces of data that connected to the conceptual framework and research questions. Each
interview question was previously coded to a particular piece of the conceptual framework.
Empirical codes were derived from those open coding schemas. Second, the empirical codes
transitioned into axial or analytic codes as the data was filtered through the conceptual
framework lens. Special attention was paid to the knowledge, motivation, and organizational
influences of the stakeholder group. Finally, the axial codes were used to identify patterns or
themes for the findings discussion. A codebook was used for the interview data to manage data
and codes.
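To make the three coding phases concrete, the following is a minimal, hypothetical sketch of
how one codebook entry might carry an excerpt from open codes to an axial code to a theme; the
field names and sample values are invented for illustration and are not drawn from the study's
actual codebook.

# Illustrative sketch only: a hypothetical codebook entry tracking an interview
# excerpt through the three phases described above (open coding, axial coding,
# and theme identification). Values are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class CodebookEntry:
    interview_id: str                                 # e.g., "Policymaker 3"
    excerpt: str                                      # verbatim interview text
    open_codes: list = field(default_factory=list)    # phase 1: open codes
    axial_code: str = ""                              # phase 2: filtered through the framework
    theme: str = ""                                   # phase 3: pattern reported in the findings

entry = CodebookEntry(
    interview_id="Policymaker 3",
    excerpt="Implementation has been uneven across the state universities.",
    open_codes=["uneven implementation", "institutional variation"],
    axial_code="organizational influence: standardization",
    theme="Compliance is complicated and incomplete",
)
print(entry.theme)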
A research journal was also used to keep track of the researcher’s thoughts during the
process. The research journal detailed questions posed by the researcher during data analysis, as
well as areas for further inquiry in the study. The researcher also used the research journal to
keep any biases in check during the process. Managing and recognizing biases during data
analysis was paramount to ensuring the values, expectations, and background of the researcher
did not impact the results found and reported.
The researcher also analyzed data collected through document analysis. Analysis of each
participating institution’s website for markers and policy information was conducted
concurrently with the interviews. A spreadsheet detailing all the participating institutions was
used. This spreadsheet also included a timeline for analysis of each site (i.e., dates for analysis).
Documentation detailing the findings of each institution was used to analyze compliance of each
institution (see Appendix B).
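As a minimal sketch of how such a spreadsheet might be organized, each participating institution
could occupy one row recording the review date and the compliance markers found on its website;
the column names and the sample row below are hypothetical and do not reproduce the Appendix B
instrument.

# Illustrative sketch only: a hypothetical layout for the website-review
# spreadsheet described above. Column names and the sample row are placeholders.
import csv

FIELDNAMES = ["institution", "review_date", "master_plan_language",
              "service_learning_course", "civic_education_course", "insignia_found"]

rows = [
    {"institution": "Institution A (pseudonym)", "review_date": "2018-10-01",
     "master_plan_language": "yes", "service_learning_course": "yes",
     "civic_education_course": "no", "insignia_found": "yes"},
]

with open("compliance_review.csv", "w", newline="") as handle:
    writer = csv.DictWriter(handle, fieldnames=FIELDNAMES)
    writer.writeheader()
    writer.writerows(rows)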
Results and Findings
The purpose of this project was to evaluate the degree to which the Massachusetts
Department of Higher Education was meeting its civic learning policy compliance goal of
graduating civic-minded students by May 2019. Moreover, the study focused on whether the
identified stakeholder group could achieve 100% policy compliance through four areas –
strategic planning, support, data collection, and coordination – by all participating institutions by
January 2019. Guided by the following questions, the study examined the knowledge,
motivation, and organizational influences of the stakeholder group and how those influences
could assist the MDHE in achieving its organizational goal:
1. To what extent is the Massachusetts Department of Higher Education meeting its goal of
graduating civic-minded students through compliance of the Policy on Civic Learning by
all 2- and 4-year state public institutions of higher education by January 2019?
2. What is the Policymakers’ knowledge and motivation related to achieving the MDHE
goal?
3. What is the interaction between MDHE’s organizational culture and context and
policymakers’ knowledge and motivation in relation to achieving the MDHE goal?
4. What are the recommendations for MDHE organizational practice in the areas of
knowledge, motivation, and organizational resources?
Initially, 17 stakeholders were identified for participation in the study. Upon initial
contact, more than half of the stakeholders reported that they were members “in name only” and
had not participated in the policy creation and compliance work. Of the remaining eight
stakeholders, five agreed to be interviewed. However, only four interviews were
used for this study as one participant did not finish the interview. The stakeholders will be
identified as Policymaker 1, Policymaker 2, Policymaker 3, and Policymaker 4 throughout the
analysis. During the interviews, many participating institutions were specifically referenced.
However, to further protect the identities of the interviewees, any institutions included in the
findings were ascribed pseudonyms.
Five themes were identified through data analysis. Using the Clark and Estes (2008)
framework to examine the knowledge, motivation, and organizational influences of the
stakeholder group, the themes represented gaps in each influence, as well as a connection to each
posed research question. Moreover, themes were connected back to the conceptual framework
outlined earlier in this study. It is important to connect the findings back to the conceptual
framework because the conceptual framework informed the design of the study and helps explain
the phenomenon explored in the analysis (Maxwell, 2013).
Theme 1: Compliance is Complicated and Incomplete
The first identified theme focused on the first research question used for this study. The
first research question asked whether the larger organization met its goal of achieving
compliance of the state-mandated policy by January 2019. Document analysis and interviews
with the identified stakeholders revealed that the organization did meet its compliance goal, but
there were some identified concerns that impact the extent to which these goals were achieved
and whether they will be sustained.
Compliance achieved only by participating institutions. There are 29 institutions
within the state public higher education system: 15 community colleges, nine state universities,
and five institutions within the University of Massachusetts system. Previous findings reported
that 24 of those institutions were already collecting and analyzing data related to student civic
learning and engagement (Final Report, 2013). Prior to policy implementation, 67% (16) of
participating institutions offered service learning courses (to meet the civic engagement
standard). Only 34% (8) of institutions offered civic education courses through core curriculum
(Final Report, 2013). An analysis of participating institutional websites revealed that, four years
post-policy implementation, 92% (22) of institutions offer a service learning course, and 75%
(18) of institutions offer some combination of civic education courses through core curriculum
based on major. All participating institutions are offering either a civic-minded class or a civic
engagement opportunity to students before graduation; however, not all participating institutions
are offering both options. In their final report (2013), the Policymakers noted that at least six
participating institutions expressed concern about institutionalizing civic learning and engagement
for all students at their institution. While each participating institution is offering some form of
civic learning to students, this may still be a concern for some institutions.
A document analysis of the stakeholder group reports and participating institutions’
websites revealed that all 24 participating institutions are complying with the state-mandated
policy. The stakeholders successfully completed all four elements of the compliance plan with all
participating institutions. Strategic planning was achieved through a recommendation made by
the stakeholder group to the MBHE: requiring all participating institutions to “include a
description of how they will include civic learning and engagement as an expected and
measurable learning outcome for all students in their five-year master plans” (Final Report,
2013, p. 11). An analysis of each institutional website uncovered master plans with this
language. Moreover, many institutions also included the information on their civic engagement
center webpages, making it easily accessible to different consumer groups, such as students and
community members. For instance, one institution, Leader Community College, included all
their work product on implementing the policy on their website.
Second, all participating institutions completed the second criterion for compliance by
participating in annual meetings to discuss progress made to achieving the larger organizational
goal (support). Since 2014, four annual meetings have been hosted by the MBHE and MDHE,
the first of which was organized by the identified stakeholder group. These meetings allowed the
participating institutions to share information and ideas for student learning opportunities, as
well as discuss challenges to implementation and sustainability. The fifth annual meeting will
occur in spring 2019. Similarly, these meetings were also used to achieve the fourth compliance
criterion, coordination. Ideas discussed during these meetings offered the MBHE and MDHE the
opportunity to tackle implementation challenges, financial issues, and ideas for future projects.
Finally, compliance was achieved in the third criterion, data collection. The stakeholder
group spent a majority of their time on this step during the implementation process as it was
critical to determine if the larger organization would meet its goal. All four policymakers
mentioned the importance of the data collection process during their interviews, including the
creation of the tools used to assess student learning and civic development. Policymaker 4 stated,
“We created the rubrics to authentically assess student learning by looking at what students
produce.” The final rubrics created by the stakeholders and used by participating institutions
gauge student development in civic and democratic knowledge, skills, values, and action (Final
Report, 2013). To date, all participating institutions have uploaded first round data and will
continue to upload data each year.
However, it was recommended to the MBHE that the five institutions comprising the
University of Massachusetts system be excluded from participating in the state mandated policy
because they had already been awarded the Carnegie Engagement Classification for their work
on civic learning and engagement (Final Report, 2013). This decision may have impacted the
sustainability of the policy, as well as the level of continued compliance by other participating
institutions. Arguably, these five institutions would have easily been able to comply with the
policy, given they all have dedicated civic engagement centers and general education
requirements that include civic-minded courses. Why were these institutions allowed to be
excluded from a state-wide policy? Can the MBHE and MDHE claim success of a state-wide
policy on civic learning if some institutions are allowed to be exempt? If other states use this as a
model, would they be expected to achieve a true compliance rate or allow some of
their institutions to be excluded from the policy? If so, the question then becomes: is a state-wide
policy actually necessary for civic learning in higher education?
Incomplete compliance leads to questions of sustainability. As previously mentioned,
five of the state’s public four-year institutions are not participating in the state-mandated policy
due to a national recognition for their work in civic education. However, the exclusion of these
five institutions may impact future participation by other institutions and calls into question
whether a state-wide policy can be successful. Policymaker 3 recalled that implementation and
compliance have been “uneven across the state universities and community colleges.”
Specifically, the assessment methods created to identify what initiatives institutions were
using to meet the policy goal are being applied inconsistently across the state due to a lack of
participation from all state institutions. Policymaker 2 noted:
I know that Emerson State is reporting CL and we are not reporting CLEO or
CLER…Just between you, me, and the water cooler, some people said they are not going
to do that [report using the assessment method].
Policymaker 3 contends that the stakeholder group did everything they could to make
compliance expectations clear by keeping language simple and allowing “enough movement
within the language to determine how it would happen on their individual campuses” to mitigate
the impact of a state-mandated policy. However, questions still remain about whether a state
policy is effective if 1) not all state institutions are required to participate and 2) there is no
enforcement of exactly what information is reported. If there is no movement or growth at
institutions, can they still claim compliance following the January date? Is the expectation that
institutions continue to develop civic learning opportunities after they have achieved initial
compliance? If so, has that been communicated to participating institutions and what standards
have been established for future compliance goals?
Theme 2: Interest is the Key to Motivation, Not Goal Setting
Motivation is a key component of the Clark and Estes (2008) framework for a gap
analysis. In fact, the authors argue that motivation is what keeps professionals learning, doing,
and accomplishing tasks (Clark & Estes, 2008). Previously, goal setting was identified as the
motivating factor for the identified stakeholder group. Goal setting was selected as the central
motivation for this group as it was the closest theory to agenda setting, which is generally used in
public policy implementation and research (Hillman et al., 2015). It was previously suggested
that goal setting as a motivational influence was the driving force of the stakeholder group in
achieving their goal as outlined in the conceptual framework (Figure 1). While the argument that
motivation is the driving force that impacts the identified knowledge influences and movement
toward overall goal completion still stands, the study found that goal setting is not the motivation
that drives the stakeholder group.
Addressing Research Question 2’s focus on motivation, the study found that the proposed
motivation gap was not validated. In fact, goal setting did not play a role in how the
stakeholders’ active choice, persistence, or mental effort impacted their ability to successfully
accomplish their goal. Policymaker 2 articulated that it was easy to achieve their group goals
because they were committed to the process: “We went to the meetings, we did our homework,
and the experience in the room was tremendous.” Moreover, Policymaker 3 noted that the group
developed the goals, but the implementation timeline was actually determined by the MBHE, not the
stakeholder group, after they submitted the project's goals: “I can't like the timeline piece
because I believe most of that happened outside the study group.” While the stakeholder group
created implementation and compliance goals that were used by the project, they had only some
influence on the timeline that was proposed. Ultimately, a key piece of goal setting, the
implementation timeline, was not created by them and had no influence on their work.
Rather, every interviewed stakeholder definitively declared that their genuine interest in
civic learning and the state policy was the real motivating factor for their work. Policymaker 1
revealed, “I become convinced that we need better civic education to get a better democracy.”
Policymaker 3 concurred, emphasizing that working on this policy “was the right thing to do.”
Inspired by the work of John Dewey, Policymaker 4 argued, “Well, we couldn’t imagine
graduating students without trying to teach them to write and how can we imagine trying to
graduate students without trying to teach them how to be civically engaged.” For every
interviewed stakeholder, participation in developing a state-wide policy on civic learning in
higher education was personal. Many of the stakeholders had been working in some aspect of
civic learning or education for more than 10 years, most dedicating their entire professional lives
to engaging students with democracy and civics. This kind of personal interest is important in
sustaining engagement, learning, and change (Schraw & Lehman, 2006). Ainley, Hidi, and
Berndorff (2002) found that personal interest directly contributes to persistence in task
completion. More important, Schraw and Lehman (2006) argue that personal interest impacts
learning and information processing. This kind of deeper information processing will impact the
proposed knowledge influences. Personal interest not only assists in goal completion; it is
clearly the motivational factor that drives this stakeholder group.
Theme 3: Institutional Mission was Considered, but Aligning Public Policy was Not
Of the three knowledge influences identified through the relevant literature, only one
influence had a gap that was validated. In Research Question 2, three knowledge influences were
identified related to stakeholder goal completion: 1) knowing definitions and understanding
elements of civic learning, 2) understanding institutional missions and how to integrate public
policy, and 3) knowing how to assess civic learning. Interviews with stakeholders revealed that
they knew how to effectively define civic learning for policy implementation, and create and use
assessment tools. However, the stakeholders did not learn the institutional missions of each
participating institution; therefore, they could not understand how best to help institutions align
the policy with their missions during implementation.
While most of the interviewed stakeholders contend that they considered the importance
of institutional missions, allowing the institutions freedom to implement the policy
overshadowed consideration of how to implement it consistently across institutions. Since most institutions
have missions that are grounded in civic education (Thelin, 2004), Policymaker 2 noted that the
stakeholder group approached creating the policy with that assumption in mind:
That [institutional mission] was a constant theme of the discussions…the group was
constantly aware and constantly discussed the fact that civic engagement and civic
learning will look different at each school, so there was no mandate as to what this would
look like or how it would be assessed or what student outcomes would be. Needless to say,
there was never a prescripted understanding of what that means.
Policymaker 3 agreed, reporting that allowing colleges the freedom to implement the policy was
important to successful implementation, but admitted that it was difficult when institutions
started at very different points:
Some institutions literally had zero, literally had nothing in place in terms of being able to
identify what courses are being taught that might include civic learning or even civic
engagement. There are some institutions that had no tracking mechanisms, no
institutional support, no center, not even a single person.
Keeping the policy and guiding documents fluid and loose to allow institutions to implement
with no guidance specific to their institutions created a quandary for stakeholders: it allowed
institutions to buy in if they already had a program in place, as compliance would be easy to
complete, but created more barriers for institutions that did not have resources dedicated to
creating programming to meet the requirement.
Incidentally, this gap also contributed to a gap in an identified organizational influence
concerning aligning public policy with institutional cultures, which are influenced by their
missions. The validated gap in this organizational influence, as examined in Research Question
3, revealed that the policy needed to be aligned with several different groups within the
institution that comprised the culture: students, faculty, and the community/constituents.
Policymaker 2 revealed that the policy implementation is not “reaching as much of the student
body as it should.” The Policymaker continued, contending:
What do we want our institutions to be doing in supporting this? What do we want
faculty to be doing and how do we incentive that? Most importantly, how do we want
students and community to understand and value the work?
Policymaker 2 also commented on the lack of faculty inclusion, noting that working
with unions and collective bargaining units was not a priority at the time of the policy
development and implementation process. Policymaker 3 reported on the lack of faculty
involvement, positing that while institutions are required to adhere to the state policy, it will not
be sustainable without faculty buy-in. The policymakers all mentioned the important role faculty
would play in the success of the policy, so why did the MDHE and MBHE not elect to engage
this stakeholder population during the development process?
Finally, the policymakers remarked on the lack of alignment with larger MDHE
outcomes and community/constituent expectations. Policymaker 3 argued:
The impacts of this work are not always immediately visible to folks, so in terms of how
it changes the institution, it doesn’t unless they do the work to track data outcomes, like
graduation rates, student success, and retention, and longer-term community relations
outcomes.
Policymaker 4 also affirmed that the policy could align with larger MDHE goals (and other
Vision Project initiatives), such as college completion rates, closing the achievement gaps, and
improving college access. This suggestion was included in the final report recommendations
from the stakeholder group, but as Policymaker 4 notes, not everything included in the final
report was adopted by the MBHE for the policy. Policymaker 4 argued that the policy needs to
be “woven” throughout the student experience, from major curriculum to general education
requirements to co-curricular activities. However, interviewed policymakers all agreed that
allowing each participating institution to adapt the policy as they needed on their campuses may
not allow for that kind of integration.
Moreover, all the policymakers conceded that they had not been contacted by any
participating institution asking for assistance in aligning the policy with the institutional mission
and culture. Taking the time to establish how the policy would align with other stakeholder
groups at each individual campus would be important for long-term sustainability and growth
after compliance has been attained. Sandfort and Moulton (2015) report that not including
appropriate stakeholder groups underemphasizes their role in the process, and assumes that their
participation is not needed to achieve program outcomes. This directly connects to the
conceptual framework's suggestion that the larger organization's (MBHE and MDHE) decision-
making would impact how the stakeholder group produced and recommended solutions for
implementation.
Theme 4: Diverting and De-investing in Resources Will Not Sustain a State Policy
Informed by relevant literature, four organizational influences were outlined in this study.
Of the four influences, only three influence gaps were validated. One of the most significant
organizational influence gaps validated by this study concerned how, if organizational resources
are not aligned with policy goals and priorities, competing organizational goals and priorities
will jeopardize success of the initiative. Three resource areas were identified during the interview
process that illustrate the gap: 1) funding issues, 2) institutional infrastructure issues, and 3)
faculty support.
Other project initiatives take center stage and funding allocations. The lack of
monetary support from the MDHE was a large concern discussed by every interviewee. All four
policymakers expressed concern over the starting budget to implement a state-wide initiative, as
well as the continued budget cuts. Policymaker 4 stated:
I think it’s already shifted its centrality for the MBHE and MDHE. And that has to do
with partly changes in budget. There was a small budget to support the initiative when it
was first rolled out, and that budget has been cut, and cut, and cut again. Now it’s maybe
one-fourth what it was in the beginning.
Policymaker 4 confirmed that while many institutions try to find external funding to offset the
lack of monetary support from the larger organization, doing so is difficult when staff is already
limited at many participating institutions.
Moreover, leadership changes at the MBHE and MDHE have contributed to shifting
priorities for the larger higher education project. Policymaker 4 continued:
But there has also been a change in personnel; a new chair was appointed by a new
governor, a new commissioner was appointed, and the original project, which had this
broad list of goals for higher ed, has kind of receded from the people at the top.
Instead, three other project goals, called “The Big Three,” have taken center stage for the MDHE
because those goals align with national higher education outcomes. In fact, the MDHE website
prominently features those goals as the primary initiatives of the project, relegating the other
three goals, including the civic learning goal, to the background. Top-down directives, like this
state-mandated policy, require that organizations meet the financial demands of the
implementation, including providing monetary resources to support institutions (Sandfort
& Moulton, 2015). The researchers found that organizations that provide appropriate monetary
support to institutions can have significant influence on how the policy is implemented over the
long term, as well as on how the policy is aligned with the larger organization. Since a
financial needs assessment was not completed by the stakeholder group or the larger
organization, consideration was not given to how to support the implementation of the policy in
both the short and long term.
Lack of infrastructure analysis and attention will not support sustainability. One of
the most important aspects of policy implementation is the assessment and support of
infrastructure needed to develop and sustain an initiative (Sandfort & Moulton, 2015).
Policymaker 2, who is involved in the implementation of the state-mandated policy at their
institution, expressed concerns about institutional infrastructure. In addition to a lack of support
from the MDHE for student work contributing to civic learning development, Policymaker 2 remarked:
I don’t see it [sustainability] happening at the higher education level because it wouldn’t
get through governance without an act of god. You know, if we had built this culture
more readily… I mean, we have questions about resources and infrastructure and all
those things that go along with project-based learning. So, for those reasons and more, I
don’t see it being…I don’t see fighting for a mandate as a priority and I think, at this
moment, it is a losing battle.
Policymaker 3 concurred, noting that even though the policy “still stands” regardless of funding
or institutional infrastructure provided, it will be difficult for some institutions to develop and
grow the program without the necessary supports. Sandfort and Moulton (2015) argue that policy
programs must be “operationalized with defined activities” to be successful (p. 24). The lack of
resources and infrastructure assigned to assist implementation can cause a policy to fail to be
sustained beyond the initial rollout.
Not focusing on faculty buy-in and incentives harmed implementation. In addition to
significant budget cuts, the MDHE failed to devote resources and support to faculty at each
participating institution. All policymakers disclosed that while the final report from the group
addressed the importance of obtaining faculty buy-in early in the process, the larger organization
did not take any steps to include faculty in the policy development process. Both Policymakers 2
and 3 revealed that while working on the policy, they acknowledged the important role faculty
would play in successful implementation, but elected to let each institution work with their own
faculty since they also needed to obtain executive leadership buy-in. Policymaker 3 reported that
getting “buy-in from faculty who want to do this work must be intentional.” Unfortunately, all
policymakers acknowledged that this was not done during initial work on the policy and that has
caused the state to take a late, but bold, approach to get the policy solidified at each participating
institution.
In 2017, the MBHE and MDHE began working with the faculty collective bargaining
unions for the state to add an incentive to participate in the state-mandated policy. All four
policymakers referenced the steps taken to encourage faculty to participate in the initiative. In
May 2018, a ratified contract for all faculty teaching at state institutions included new language
about community engaged scholarship. Policymaker 2, who followed the process closely, stated
that the new contract incentivizes faculty to participate in the policy as their work can now be
considered for tenure and promotion.
However, the lack of any built-in incentives for faculty may prove to be a bigger issue.
The policymakers recognized that no incentives were built into the policy implementation
process for any other stakeholder group (faculty or administrative staff). Sandfort and Moulton
(2015) highlight the importance of incentivizing staff who are responsible for implementing
policy programs. They contend that by offering incentives, organizations can hold institutions
more accountable for achieving outcomes and “more discretion is granted to how the work is
actually performed” (p. 153). The lack of incentives may have also played an important role in
the motivation of faculty members. Policymaker 2 stated, “The pushback was always ‘this is
extra work.’ It’s extra work to design programs. It’s extra work to do service. It’s extra work to
assess service. There’s no incentive to do this extra work.” The lack of incentives and attention
paid to such an important stakeholder group directly impacts the MDHE’s larger goal to graduate
civic-minded students. More importantly, disinvesting in the policy and diverting resources away from it does not allow it to succeed on a multi-organizational level.
Theme 5: Civic Learning Must Evolve and be Prioritized to Succeed
In the civic learning literature, advocates for the movement often reference their personal
interest in the field and how they have seen their work impact learning (National Task Force,
2012). Scholars report that institutional support must be present for civic learning initiatives to
succeed, including at the state-level (Jacoby, 2009; National Task Force, 2012). When asked
whether a culture of civic learning is fostered by various stakeholders within the state,
policymakers hesitantly agreed that a culture exists, but were not confident that it existed at all
levels. Policymaker 4 posited that institutions all have “the goal of creating citizens” and that
was why the state needed the policy. The policy would help foster a culture of civic learning
within the state by its very presence. However, Policymaker 4 conceded that if the policy was not
implemented thoughtfully and intentionally throughout the university system, it would not
succeed. Policymaker 3 agreed, arguing that the work must be intentional to be successful.
Policymaker 3 elaborated, “It will sustain if we have that culture. We must make it an
expectation. But I don’t think that without funding that sustainability is a realistic goal statement.
It comes down to intentional priorities.” The idea of intentionality as it relates to the creation and sustainability of a culture of civic learning directly relates to relevant literature on culture change
(Schein, 2004). Being intentional with policy development can help groups define and identify
success and failure, as well as assign public value to its actions (Sandfort & Moulton, 2015).
Identifying the policy as important and intentionally making it a priority at all levels of the
MBHE, MDHE, and each participating institution will help foster a culture of civic learning.
Moreover, the policymakers all referenced that a culture of civic learning could help
institutions and higher education at-large evolve for new generations. Sandfort and Moulton
(2015) explain that culture is the “symbolic dimension of social action, the way a group makes
sense of what is taking place” (p. 18). Policymaker 3 strongly believed that a culture of civic
learning could assist in that evolution and help make sense of the impending change:
As an institution of higher education, we have to morph. We have to change in order to
continue to be relevant to communities, to society, and to our students. As a part of that
morph, we have to ensure we’re providing students with what they need in an education.
Civic learning is a key part of that education. If we are not changing who we are in higher
education in a way that meets our students’ needs, we are going to die.
Policymaker 2 agreed, calling the need for a culture of civic learning at all levels within higher
education a “very watershed moment.” Policymaker 2 continued,
That means we need to define it, to build the tools, to build the resources and
infrastructure, to support the work. We need to embrace the work to be civic minded. The
culture exists, but it’s evolving and needs to continue to evolve.
Policymaker 2 also remarked that the evolution of a culture of civic learning means not
pushing everything to the institutional level. Developing culture within the policy context means
“creating a shared understanding of what to do and how it should be done” (Sandfort & Moulton, 2015, p. 91). Policymaker 2 argued that it has become the participating institutions’
responsibility to create and develop a civic learning culture. Coupled with the recent faculty
contract changes and budget cuts, the MDHE has abdicated its role in creating and fostering this
culture, moving the responsibilities of implementation to the participating institutions without
much thought for sustainability. Policymaker 3 acknowledged that the issue of competing priorities
has impacted the MDHE’s focus on fostering a culture of civic learning. Policymaker 3 stated, “I
think the MDHE fully supports our work and why this work is so essential, but we have yet to
see any kind of funding allocation that supports it and funding talks in terms of priorities.”
Policymaker 4 agreed, revealing that not everything suggested by the stakeholder group was used
in the final policy. The policy, Policymaker 4 disclosed, “was a combination of what they [the
MDHE and MBHE] thought was appropriate and what they thought was feasible in a state
system of public higher education.” Sandfort and Moulton (2015) contend that creating
integrated solutions that advance a policy aimed at strengthening a public good is better than
acting solely out of organizational interest. What if some of the changes made by the larger
organization impacted the development of the culture of civic learning? How can the larger
organization address the need to continue to foster a culture of civic learning and keep its
importance visible to all stakeholders?
Summary
The results uncover several notable findings related to the evaluation of a state-wide policy compliance effort. While the identified stakeholder group helped the MDHE achieve compliance with all participating institutions, the larger question of whether policy growth can be sustained remains open. The results also explored each of the identified knowledge,
motivation, and organizational influences, and examined gaps that impacted the success of both
the stakeholder goal and the larger organizational goal.
There were three identified knowledge influences for study: 1) Policymakers need to
know definitions and understand elements of civic learning, 2) Policymakers need to understand
the position of institutional missions at postsecondary colleges and universities and how to
integrate the policy within these missions, and 3) Policymakers need to know how to assess civic
learning. Knowledge influences 1 and 3 were not validated for this study as the stakeholder
group had a firm grasp of civic learning and how to assess it within the state-mandated policy.
However, interviews with the stakeholders revealed a gap in the second knowledge influence,
which also impacted an identified organizational influence. The findings indicate that policy
alignment with institutional missions cannot just be discussed by policymakers; instead, they
must integrate it into the implementation plan of the policy.
There was only one identified motivational influence for this study. Previous literature
(Brennan, 2017; MDHE, 2017; Policy, 2014; Reiff, 2016) indicated that goal setting and
orientation would be the motivating factor for identified stakeholders to achieve their compliance
goal. However, throughout the course of the interview process, it became clear that personal
interest was the driving force behind the stakeholders’ work. While aspects of goal setting helped
them create the framework for their implementation plan, it was not the motivating factor for
their success. While this influence was not validated, findings still uncovered a driving force for
this stakeholder group.
Finally, the study examined four different organizational influences: 1) There is a need to
foster a culture in which civic learning is important to the development of students, both in and
out of the classroom, while respecting the mission of each institution, 2) There is a need to align
public policy with institutional cultures to effectively embed civic learning culture into
postsecondary institutions, 3) Resources are not aligned with policy goals and priorities, and
competing goals and priorities will jeopardize success, and 4) Unclear direction regarding
implementation of civic learning policy does not allow for standardized curriculum, making it
difficult to assess impact of policy on civic learning. Of the four organizational influences, only
the first three influences had validated gaps that needed to be addressed. The fourth influence
was not validated as the assessment tool created by the identified stakeholder group allowed for
diversity in implementation. Gaps in the first three influences, however, revealed the need to
focus on continued development of a culture of civic learning, reassessing how resources were
being used for continued implementation and to define success, and focusing on how to support
institutions with policy implementation. Questions about continued civic learning growth still
need to be considered, as well as whether this is a reliable and valid template for other states to
use for similar policy creation and implementation. The following section explores recommended solutions to those questions.
Recommendations for Practice to Address KMO Influences
The findings and results examined in the previous section highlight the need for programmatic attention to support a more successful policy implementation. To answer the
final research question posed in this study, the creation of a year-long learning program is
recommended to address organizational practice areas for knowledge and organizational
resources, and assist stakeholders in the creation of a sustainable policy on civic learning in
higher education. Motivational resources will not be addressed as no gap was validated for this
study. The learning program will allow stakeholders in other states the opportunity to create an
implementation plan for a state-wide policy including strategies for institutional implementation,
strategic communication planning, and data collection and assessments. The program will be
evaluated using the Kirkpatrick and Kirkpatrick (2016) New World model of training evaluation.
Using Kirkpatrick’s four levels in reverse order, the New World model will collect data through
the four phases of learning program development: identifying desired outcomes (Level 4),
critical behaviors (Level 3), learning in different areas (Level 2), and initial reactions (Level 1).
Moreover, the use of the Blended Evaluation as a guide for survey questions will assist in the
evaluation process (Kirkpatrick & Kirkpatrick, 2016).
Knowledge Recommendations
Introduction. Three knowledge influences were identified using the relevant literature
and framework provided by Krathwohl (2002) and Rueda (2011): 1) Policymakers needed to
know definitions and understand elements of civic learning, 2) Policymakers need to understand
the position of institutional missions and how to integrate the policy within these missions, and
3) Policymakers need to know how to assess civic learning. Both declarative and procedural knowledge were considered for this stakeholder group. Of the three knowledge influences, only
one knowledge influence was validated as having a gap, as outlined in the following table.
Table 5
Summary of Knowledge Influences and Recommendations
Assumed Knowledge Influence: Policymakers need to understand the position of institutional missions at postsecondary colleges and universities and how to integrate the policy within these missions. (D)

Principle and Citation: Individuals must organize and prioritize prior learning into schemas to produce new knowledge (Krathwohl, 2002; Rueda, 2011). Individuals must acquire knowledge, practice integrating knowledge into practice, and apply that new knowledge (Krathwohl, 2002; Schraw & McCrudden, 2006).

Context-Specific Recommendation: Provide a job aid that lists participating institutional missions and examples of institutions that have successfully implemented public policy.
Declarative knowledge solutions. Declarative knowledge is the combination of factual and conceptual knowledge, and focuses both on gathering fact-based information and on applying that information to make meaning of complex situations (Krathwohl, 2002; Rueda,
2011). Policymakers need to understand the position of institutional missions and how to
integrate the policy within these missions (D). Krathwohl (2002) and Rueda (2011) argue that
individuals must organize and prioritize prior learning into schemas to produce new knowledge.
The data showed that policymakers did not know the institutional missions of participating
institutions, which impacted their ability to understand how the policy can be integrated at each
institution. According to Clark and Estes (2008), job aids could help craft workable definitions,
including the establishment of terminology that would be used. Policymakers need tools that will
allow them to make sense of new material and organize complex knowledge in strategic ways.
The recommendation is to provide a job aid that lists the institutional mission of each
participating institution, as well as examples of institutions that integrated state policy with their
missions.
Researchers contend that many institutions are changing to meet the demands of new
markets and economies, which includes re-evaluating their mission statements to align with
public policy initiatives (Bawa & Munck, 2012; Holosko et al., 2015). Policymakers must avoid
mission drift (Boland, 2012) while creating implementation plans, targeting possible activities
for consideration, and developing assessment tools within the framework of the institutional
mission. Schraw and McCrudden (2006) note that it is important to connect individuals to
learning to make meaningful moments, and those connections will help learners facilitate the
transfer of information into practice (Mayer, 2011). Clark and Estes (2008) suggest that job aids
help with both the connection to materials and transfer of knowledge because they allow learners
to create knowledge on their own.
In alignment with Clark and Estes (2008), Spaulding and Dwyer (2001) conducted a post-test study with 465 students to determine whether job aids could effectively support task achievement through knowledge development. While they found job aids can assist in knowledge
transfer, the type and design of the job aid is also important. The use of job aids, instead of
another learning tool, could allow policymakers the freedom to create as they process
information, leading to the integration of the policy with each participating institutional mission.
Organization Recommendations
Introduction. The study explored four organizational influences as they relate to the
stakeholder and larger organizational goals. Each organizational influence was further classified
as a cultural model or cultural setting. Rueda (2011) argues that not all organizational barriers are visible to groups, so it is important to identify both the cultural settings and cultural models of each problem of practice. The four organizational influences examined in this study include: 1)
There is a need to foster a culture in which civic learning is important to the development of
students, both in and out of the classroom, while respecting the mission of each institution, 2)
There is a need to align public policy with institutional cultures to effectively embed civic
learning culture into postsecondary institutions, 3) Resources are not aligned with policy goals
and priorities, and competing goals and priorities will jeopardize success, and 4) Unclear
direction regarding implementation of civic learning policy does not allow for standardized
curriculum, making it difficult to assess impact of policy on civic learning. Of the four identified
organizational influences, only the first three influences had validated gaps. Table 6 outlines the
organizational influences and recommendations from this study.
Table 6
Summary of Organizational Influences and Recommendations
Assumed Organizational Influence: Policymakers need to foster a culture in which civic learning is important to the development of students, both in and out of the classroom, while respecting the mission of each institution. (CM)

Principle and Citation: Cultural models are shared mental schema that help shape the organization, including structure, values, practices, and policies (Gallimore & Goldenberg, 2001).

Context-Specific Recommendation: Provide policymakers with models of high-profile public officials that place importance on creating a culture of civic learning within government and within their organizations.

Assumed Organizational Influence: There is a need to align public policy with institutional cultures to effectively embed civic learning culture into postsecondary institutions. (CM)

Principle and Citation: Introducing a new policy that runs counter to an already established culture can create resistance and anxiety, causing initiatives to fail (Agocs, 1997; Schein, 2004; Schneider et al., 1996).

Context-Specific Recommendation: Provide policymakers with researched concepts and case studies on postsecondary institutions that have successfully aligned public policy with institutional culture to create a civic learning culture within their institutions.

Assumed Organizational Influence: Resources are not aligned with policy goals and priorities, and competing goals and priorities will jeopardize success. (CS)

Principle and Citation: An organization must evaluate and align resource allocations to meet the needs of the culture change (Clark & Estes, 2008; Kezar, 2001). Organizational effectiveness increases when leaders ensure that employees have the resources needed to achieve the organization’s goal (Clark & Estes, 2008).

Context-Specific Recommendation: Provide policymakers with decision-making guidance on resource allocations that adequately supports employees and facilitates organizational change.
Cultural models. Policymakers need to foster a culture in which civic learning is
important to the development of students, both in and out of the classroom, while respecting the
mission of each institution. Gallimore and Goldenberg (2001) explain that cultural models are
shared mental schema that help shape the organization, including structure, values, practices, and
policies. Policymakers need to be provided with examples of other officials that have placed
importance on changing culture at an institution or within an industry. The recommendation is to
provide policymakers with models of high-profile public officials that place importance on
creating a culture of civic learning within government and within their organizations.
For civic learning programs to thrive, institutions must become civic-minded and adopt a
civic mission (Jacoby, 2009). While most institutions will argue that civic-mindedness is already a critical part of their institutional mission, the idea is often not embedded in their structures, policies, and procedures. No research exists on creating a culture of civic learning within higher education institutions, but similar organizations can be used for comparison. For instance, when
the Department of Veterans Affairs elected to improve patient safety throughout their healthcare
system, it was a cultural change within the organization and the industry. Dunn et al. (2007)
studied the effects of one culture changing initiative at 43 VA Medical Centers across the
country, finding that creating new policies and procedures must be connected to both the
organization’s mission and clinical practice to be successful. Bringing together the mission and
values of an institution to change culture is difficult, but all of the pieces must align for a true
culture change to occur. Providing detailed examples of other leaders who have successfully
placed significant value on such a culture change will help policymakers in their endeavors as
well.
To successfully create a culture of civic learning, there is a need to align public policy
with institutional cultures to effectively embed civic learning culture into postsecondary
institutions. Researchers argue that introducing a new policy that runs counter to already
established culture can create resistance and anxiety, causing initiatives to fail (Agocs, 1997;
Schein, 2004; Schneider et al., 1996). Therefore, it is recommended that policymakers are
provided with researched concepts and case studies on other postsecondary institutions that have
successfully aligned public policy with institutional culture.
Thelin (2004) notes that many institutions have established visions and missions that
dictate policy and learning directives, as well as tools to create a specific institutional culture.
Higher education institutions are built on tradition and branding, and changing that structure can
be difficult. There is very limited literature available on this kind of change within national
institutions of higher education, but Boland (2012) studied the effects of a national civic policy
implementation at Irish public higher education institutions. His analysis found that Irish institutions elected to re-align their institutional missions and values to include the policy initiative, thus meeting the intent of the policy (Boland, 2012). Policymakers could also turn to Kezar and
Eckel’s (2000) study of how institutional culture influenced the change process at six colleges
and universities. They found that institutions should conduct an institutional culture audit before
initiating any changes within the higher education structure (Kezar & Eckel, 2000). Being able to
understand the creation of a culture of civic learning is just as important to policymakers as
understanding how it can be implemented or integrated into existing cultures at institutions.
Cultural settings. Policymakers must contend with resources not being aligned with policy goals and priorities, and with competing goals and priorities that will jeopardize the success of the civic learning policy. An organization must evaluate and align resource allocations to meet
the needs of the culture change (Clark & Estes, 2008; Kezar, 2001). Clark and Estes (2008)
maintain that organizational effectiveness increases when leaders ensure that employees have the
resources needed to achieve the organization’s goals. One recommendation is to provide
policymakers with decision-making guidance on resource allocations that will adequately
support employees undertaking the implementation and that facilitates organizational change.
Balancing conflicting projects within a larger public policy initiative can be difficult. The
Vision Project contains six different policy foci for higher education institutions, but only the
civic learning project is a state-mandated policy (Brennan, 2017; MDHE, 2017; Reiff, 2016).
Conflict arises between initiatives in the Vision Project each year as the Governor redirects the MDHE to focus on a new project without completing another (MDHE, 2017). In a multi-state
study of over 200 administrators, faculty, staff, and state policymakers, Richardson, Bracco,
Callan, and Finney (1998) found that there must be specific government strategies provided to
higher education institutions to manage shifting priorities and resources, and that accountability
must be established at the macro, system, and operational levels. Mandating a vague policy has left participating institutions vulnerable to failure, as a comprehensive plan to mitigate the loss or misallocation of resources was never created. Policymakers will need guidance on
how to effectively manage resource allocations at the state level, which will help them balance
implementation and support.
Integrated Implementation and Evaluation Plan
Implementation and Evaluation Framework
This study used the New World Kirkpatrick Model for evaluation and creation of the
implementation plan. Updated from Kirkpatrick’s original four levels of training, the New World Model reorients the model’s four levels to maximize the transfer of learning for stronger organizational change results (Kirkpatrick & Kirkpatrick, 2016). Starting with the fourth level, Results, the authors argue that identifying the leading internal and external indicators will set an
organization up for success for two main reasons (Kirkpatrick & Kirkpatrick, 2016). First, this
analysis will allow leaders and educators to produce targeted outcomes that can be used to
measure success and progress through implementation. Second, it refocuses attention on
providing the appropriate supports and accountability, which the authors argue is often ignored
during the final analysis (Kirkpatrick & Kirkpatrick, 2016). Next, Kirkpatrick and Kirkpatrick
(2016) note that the third level, Behavior, should align with the previous level through the
identification of critical behavior and required drivers. The authors also argue that this level is
the most important of the cycle because the transfer of learning to practical application shows a
successful implementation has taken place (Kirkpatrick & Kirkpatrick, 2016). After establishing these two levels, Kirkpatrick and Kirkpatrick (2016) encourage leaders to return to the second
level, Learning, which will focus on determining if participants acquired the appropriate
knowledge, skills, and attitude. The results of this level will reflect a strong implementation plan
crafted in the previous two levels. Finally, Kirkpatrick and Kirkpatrick (2016) recommend
leaders create the evaluation sheets used in the first level, Reaction. The easiest of the levels, the
first level gauges how much participants enjoyed a training. While the authors posit that the New
World Model is not a feedback loop, it is vitally important that all the levels align to create a
strong evaluation and implementation plan.
Organizational Purpose, Need, and Expectations
In 2010, the Massachusetts Board of Higher Education (MBHE) and Massachusetts
Department of Higher Education (MDHE) began work on the Vision Project, a state-driven
initiative to “produce the best-educated citizenry and workforce in the world” (MDHE, 2017).
The Vision Project aligns with the MBHE’s mission to ensure that Massachusetts residents have
the opportunity to benefit from a higher education that enriches their lives and advances their
contributions to civic life, economic development, and social progress of the Commonwealth.
Focusing on the sixth initiative in the project, the state legislature passed the Policy on Civic
Learning in 2014, which maintains that all public 2- and 4-year institutions will graduate civic-
minded students. Looking to meet the ambitious national goal of graduating at least half of all college students as civic-minded by 2020, the MDHE elected to graduate all students from its public institutions as civic-minded by May 2019. While this initiative impacts three specific stakeholder groups,
this study only focused on the policymaker stakeholder goal of achieving 100% compliance of
the policy by all participating institutions by January 2019. The policymakers are the individuals
who crafted the initial state-mandated policy and established mechanisms for its implementation,
making them the appropriate stakeholder group to study for compliance efforts. Findings indicate
that while the policymakers technically achieved compliance of the policy within the established
timeline, there were some identified knowledge and organizational influence gaps that impacted
the degree to which compliance was achieved. In an effort to assist other policymakers with
similar compliance implementation, several recommendations are explored below.
Level 4: Results and Leading Indicators
Table 7 highlights the proposed Level 4 plan, including defined outcomes, metrics, and
methods. Both internal and external outcomes are included. Internal leading indicators show
changes within the organization that will occur to achieve the organizational goal, while external
leading indicators relate to how an external group will respond to successful implementation
within the organization (Kirkpatrick & Kirkpatrick, 2016).
Table 7
Outcomes, Metrics, and Methods for External and Internal Outcomes
Outcome Metric(s) Method(s)
External Outcomes
1. Increased number of
students graduating with
civic-minded exposure.
1a. Number of students who
completed a civic learning class.
1b. Number of students who
completed a civic engagement
opportunity.
1c. Number of students who
completed both opportunities.
1d. Increased civic knowledge of
graduating students.
1e. Increased civic awareness of
graduating students.
1a. Solicit degree completion information
from MDHE.
1b. Solicit degree completion information
from MDHE.
1c. Solicit degree completion information
from MDHE.
1d. Compare pre- and post-survey results.
1e. Compare pre- and post-survey results.
2. Increased public
recognition of the policy.
2a. Increased number of
references to the policy in local
media.
2b. Increased number of social
media references.
2a. Content analysis of local media.
2b. Content analysis of social media and
digital analytics.
3. Increased involvement in
their communities.
3. Number of students involved
in civic engagement in the
community per year.
3. Solicit data from universities on activity
rates; compare annual reports.
Internal Outcomes
4. All participating
institutions included civic
learning in academic
planning.
4. The number of civic courses
and civic engagement
opportunities at each
participating institution.
4. Gather data from the websites of each
participating institution.
5. Increased support of
participating institutions.
5a. The number of
representatives from each
participating institution at yearly
meetings.
5b. The number of participating
institutions represented at each
meeting.
5a. Collect attendance data from quarterly
meetings used to monitor progress on goal
completion and integrate feedback
opportunities from institutional partners.
5b. Collect attendance data from quarterly
meetings used to monitor progress on goal
completion and integrate feedback
opportunities from institutional partners.
5c. The number of follow-up
emails sent to participating
institutions.
5d. The number of participating
institutions receiving grant
funding for additional work.
5c. Create communication plan to keep
institutions updated on goal achievement and
implementation.
5d. Establish check points throughout the
process to re-evaluate budgeting needs and
communicate those needs through an
identified channel to decision makers.
6. Increased data uploads by
participating institutions.
6a. The number of participating
institutions uploading data.
6b. The types of data collected
for the initiative.
6a. Access data collection database to
determine institutions uploading information.
6b. Access data collection database to
determine types of data collected (curricular,
programmatic, etc.).
7. Increased coordination
and support during
implementation.
7a. Reporting of other
institutional changes inspired by
the policy.
7b. Reported curricular changes
inspired by the policy.
7a. Map the organization to determine current
and future civic culture and implementation
impacts.
7b. Organize a change committee at each
institution, comprised of students, faculty,
and staff, to determine curricular changes.
8. Improved relationships
between MDHE and
participating institutions.
8a. The number of internal
communications sent to
participating institutions (as
client stakeholders).
8b. Increased communication
satisfaction by participating
institutions.
8a. Create a comprehensive communication
strategy that tracks and collects internal
communications.
8b. Compare pre- and post-communication
satisfaction survey results.
9. Improved faculty
relationships with MDHE.
9. Increased faculty satisfaction
at participating institutions.
9a. Arrange focus groups throughout the
process to document perceptions of faculty.
9b. Compare pre- and post-satisfaction
survey results.
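To illustrate how one of the external outcomes above might be tracked, the sketch below shows a simple way to count references to the policy across locally saved media texts (Outcome 2, content analysis of local media). It is a hypothetical aid rather than part of the study's methods; the search phrases, folder name, and file format are all assumptions.

```python
# A minimal sketch, not part of the study's methods: counts references to the
# policy across a folder of locally saved media texts. The folder name, file
# format, and search phrases are hypothetical placeholders.
import re
from pathlib import Path

# Phrases a coder might treat as references to the policy (assumed, not exhaustive).
POLICY_PHRASES = [
    r"policy on civic learning",
    r"civic learning (?:initiative|requirement|policy)",
]

def count_references(folder):
    """Return {file_name: number_of_policy_references} for each .txt file in folder."""
    pattern = re.compile("|".join(POLICY_PHRASES), flags=re.IGNORECASE)
    counts = {}
    for path in Path(folder).glob("*.txt"):
        counts[path.name] = len(pattern.findall(path.read_text(encoding="utf-8")))
    return counts

if __name__ == "__main__":
    for name, n in sorted(count_references("media_clips").items()):
        print(f"{name}: {n} reference(s)")
```

In practice, a coder would refine the phrase list iteratively and verify a sample of matches by hand before reporting counts.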
Level 3: Behaviors
Critical behaviors. The policymakers are the stakeholder group of focus for education and training to help achieve compliance with the state-mandated policy. The first critical behavior
is to educate participating institutions about developing civic learning opportunities and the data
collection procedure. The second critical behavior is for the policymakers to identify gaps in
engagement through the implementation process in an effort to provide additional support and
coordination to the participating institutions. Third, the policymakers must also identify gaps in
communication and engagement between participating institutions and the MDHE. The final
critical behavior is for the policymakers to identify gaps in communication and engagement for
faculty during this process. The table below outlines the metrics, methods, and timing of each
identified critical behavior.
Table 8
Critical Behaviors, Metrics, Methods, and Timing for Policymakers
Critical Behavior Metric(s) Method(s) Timing
1. Policymakers
educate
participating
institutions about
civic learning
opportunity
development and
data collection
procedures.
1a. Number of civic
learning opportunities
created.
1b. Number of downloads
per institution.
1. Policymakers
will track
submissions in
database during
opportunity
creation.
1. Monthly reports
2. Policymakers
identify gaps in
institutional
engagement
throughout policy
development
process.
2a. Number of outreach
attempts (and different
modes of communication).
2b. Number of program
forums.
2c. Attendance at program
forums.
2d. Number of opportunities
for additional participation.
2. Policymakers
will track outreach
through established
communication
plan, which will
include forum
schedule and
calendar for
additional
opportunities.
2. Bi-weekly data
analytic reports
3. Policymakers
identify gaps in
communication and
engagement
between
participating
institutions and
MDHE during
implementation
process.
3a. Number of
communications/outreaches.
3b. Number of engagement
opportunities.
3c. Types of engagement
opportunities.
3. Policymakers
will track
outreaches through
established
communication
plan and
engagement
opportunities
through strategic
plan.
3. Quarterly report
4. Policymakers
identify gaps in
communication and
engagement with
faculty throughout
implementation and
data collection.
4a. Number of
communications/outreaches.
4b. Number of participating
faculty at each institution.
4c. Number of engagement
opportunities offered to
faculty.
4. Policymakers
will track through
the established
strategic plan.
4. Quarterly report
Required drivers. Policymakers will require assistance through job aids and training
programs from civic learning professionals and educators to help participating institutions
achieve compliance. Moreover, they will require support from the MDHE on identifying and
providing collaborative learning and engagement opportunities with participating institutions.
While policymakers cannot receive additional compensation for their work on policy compliance, they can receive public recognition for their work and for how they help participating institutions achieve compliance. Table 9 highlights required drivers for policymakers.
Table 9
Required Drivers to Support Policymakers’ Critical Behaviors
Method(s) Timing Critical Behavior Supported (1, 2,
3, etc.)
Reinforcing
Job aid on crafting civic learning
terminology to help with civic
learning opportunity development.
Ongoing 1
Job aid on educational assessment
and civic learning assessment.
Ongoing 1
Training on steps to take to create
assessment tools and upload data
into database.
At start of initiative; revisit each
quarter
1
Encouraging
Collaborate during the creation,
implementation, and data collection
process.
Ongoing 2, 3, 4
Rewarding
Public acknowledgement of
participation and creation of civic
learning opportunities.
Ongoing 2, 3, 4
Public acknowledgement of policy
involvement and compliance.
Quarterly 2, 3, 4
Monitoring
Conduct interviews with faculty
and members of participating
institution teams to determine
progress and need gaps.
Yearly 1, 2, 3, 4
Survey faculty and participating
institution teams to determine
satisfaction.
Every six months 2, 3, 4
Assess strategic plan for goal
completion and implementation
milestones.
Quarterly 1
Host yearly learning forum with
faculty and participating
institutions.
Yearly 2, 3, 4
Organizational support. To support policymakers’ critical behaviors, the MDHE should
take several steps. First, they should provide the stakeholder group with additional information
needed to successfully guide participating institutions to compliance achievement. For instance,
providing models of high-profile public officials who placed importance on civic education or
similar programs will help them foster a culture of civic learning. The organization should also
provide the policymakers with researched concepts and case studies on other postsecondary
institutions that have successfully implemented civic learning programs or other state-mandated
educational policy initiatives. Finally, the organization needs to provide policymakers with
information about resource allocation and budgeting to adequately support their work and the
participating institutions.
Level 2: Learning
Learning goals. Following completion of the learning program and outlined
recommendations, policymakers will be able to:
1. Define key words and terminology related to civic learning and civic learning
programs, (D)
2. Determine how elements of civic learning can be used to create related programming,
(D)
3. Generate a working definition of civic learning for public policy and implementation,
(D)
4. Clarify variables for civic learning assessment, (P)
5. Design a civic learning assessment tool and plan, (P)
6. Integrate a civic learning assessment plan into policy implementation, (P)
7. Create C3 goals for all aspects of policy development, implementation, and assessment, (M)
8. Organize opportunities for collaboration during goal setting processes, (M)
9. Value the need for civic learning policy at the state level, (Value) and
10. Indicate confidence in the creation of public policy and implementation plans for civic learning. (Confidence)
Program. The aforementioned learning goals will be achieved through a yearlong face-
to-face learning program, designed to help policymakers create and implement a state civic
learning policy. The yearlong learning program will be divided into four sections: one three-day
workshop dedicated to pre-work and three follow-up meetings every quarter to finalize different
elements of the implementation. The follow-up meetings will last for two consecutive days. In
total, policymakers will spend approximately 80 hours engaged in in-person learning
environments.
During the initial workshop, policymakers will receive an overview of civic learning in
higher education, an introduction to what other states are focusing on, and an in-depth review of
what their state institutions are currently offering to students (i.e., civic knowledge and/or civic
engagement). Policymakers will also be given job aids on civic learning definitions, assessment
tools, and questions for future implementation portions. During the three-day workshop,
policymakers will engage in discussions, breakout sessions (based on interests), and action
planning for future quarterly meetings. By the end of the initial workshop, policymakers should
be able to craft a working definition of civic learning for their policy. Each policymaker will also
be assigned to one of three implementation teams, which will be highlighted in the quarterly
meetings.
The focus of each quarterly meeting will address a specific element of implementation:
institutional implementation, strategic communication planning, and data collection and
assessment. Once assigned to an implementation team, policymakers will work in their groups
between meetings to create action plans for policy implementation to be reviewed by the team.
At the quarterly meetings, policymakers will engage in discussions, teach other policymakers about their team’s specific area, and finalize action plans for that area. By the end of the last quarterly meeting, the policymakers should have created the entire implementation plan for the state policy.
In addition to face-to-face learning, the policymakers will also have access to a shared
system that will house documents and allow for continued conversations. The system will house
all the job aids and learning materials provided in the initial workshop for policymakers to
reference while working with their teams, as well as documents created by the teams for the
quarterly meetings. The policymakers can provide feedback to each other in real time on the
documentation, as well as pose questions and items for consideration at the quarterly meetings.
This system will allow the policymakers to continue to work collaboratively even if they are
separated geographically.
Components of learning. Kirkpatrick and Kirkpatrick (2016) argue that learning is vital to our success so we can both complete work tasks and contribute to the development of our organizations. Evaluating the knowledge, skills, attitude, confidence, and commitment of learners will help organizations determine whether learning has taken place and taken hold, allowing the learner to apply that new knowledge on the job. In the policymakers’ case, it is important to determine if the group has attained the knowledge needed to create and implement a
state-mandated civic learning policy successfully. Table 10 includes the evaluation methods and
timing for the knowledge, skills, attitude, confidence, and commitment of the stakeholders.
Table 10
Components of Learning for the Program
Method(s) or Activity(ies) Timing
Declarative Knowledge “I know it.”
Knowledge checks through discussions and
breakout groups
Periodically during the workshop
Create a civic learning working definition By the end of the initial workshop
Procedural Skills “I can do it right now.”
Knowledge check through discussions Periodically during the workshop
Breakout assessment team discussion At end of initial workshop and throughout the
corresponding quarterly meeting
Create individual and group action plans By end of all in-person meetings
Attitude “I believe this is worthwhile.”
Instructor’s observations During all in-person workshops and meetings
Discussion of value of what they are being
asked to create/do
During all in-person workshops and meetings
Reflections and feedback forms At end of all in-person meetings
Confidence “I think I can do it on the job.”
Large group and breakout group discussions During all in-person workshops and meetings
Check-ins with instructor Periodically during workshops and meetings
Reflections and feedback forms At end of all in-person meetings
Commitment “I will do it on the job.”
Large group and breakout group discussions During all in-person workshops and meetings
Create individual and group action plans By end of all in-person meetings
Level 1: Reaction
Measuring reactions. In order to effectively measure learner reactions to a specific
program, Kirkpatrick and Kirkpatrick (2016) maintain that both formative and summative
evaluation methods must be included in the first level of program development. Table 11
includes these evaluation methods through engagement, relevance, and customer satisfaction
lenses.
Table 11
Components to Measure Reactions to the Program
Method(s) or Tool(s) Timing
Engagement
Attendance at initial workshop During workshop
Attendance at follow-up meetings During follow-up meetings
Use of shared system for follow-up work Ongoing throughout the learning program
Observations by facilitator During workshop and follow-up meetings
Course evaluations One week after initial workshop and final
follow-up meeting
Relevance
Pulse check during discussions After every module in initial workshop and at
dedicated points in the follow-up meetings
Course evaluations One week after initial workshop and final
follow-up meeting
Customer Satisfaction
Course evaluations One week after initial workshop and final
follow-up meeting
Evaluation Tools
Immediately following the program implementation. Data will be collected following
both the initial learning workshop, as well as the quarterly follow-up meetings. Both qualitative
data, through observations by the facilitator, and quantitative data, through survey methods, will
be used in order to capture diverse aspects of the learning process and impacts of the
programming. Moreover, supplemental qualitative and quantitative data may be collected
through the online sharing document system in an effort to further evaluate the program.
Level 1 Reaction data will be collected through four different methods across the
different learning opportunities. The facilitator will collect observational data during the initial learning workshop, followed by a course evaluation issued approximately two weeks following the workshop (Appendix E). A course evaluation will also be used following the final quarterly
follow-up meeting. Pulse checks will also be initiated after every module during the initial
workshop, as well as at predetermined points during the follow-up meetings. Finally, information
may be gathered in the shared document system used during the follow-up meetings.
Similarly, Level 2 Learning data will be collected using various methods throughout the
learning opportunity. The facilitator will continue to collect observational data during participant
discussions, large group discussions, and breakout discussions. Moreover, reflections and
feedback forms collected throughout the process will provide additional insights into
participants’ attitudes about the process. Finally, individual and group action plans will be used
to determine knowledge and commitment to the program and policy implementation.
Delayed for a period after the program implementation. Given the length of policy
development and implementation, the policymakers will also be assessed during specific time
periods following the initial learning workshop. Using Kirkpatrick and Kirkpatrick’s (2016)
Blended Evaluation as a guide, the brief survey will be sent electronically to all participating
policymakers 30 days following each of the follow-up meetings (Appendix F). This survey will
assess all four aspects of the New World Model: whether the policymakers are making progress
toward their identified outcomes, whether they have identified and implemented critical
behaviors, whether they have experienced an increase in their knowledge, skills, confidence, and
commitment for the initiative, and whether they believe the learning program helped them
achieve their previously identified learning goals. The policy implementation timeline created
during the initial learning workshop will also be used to determine goal completion.
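As a hypothetical illustration of how the delayed Blended Evaluation data could be summarized, the sketch below compares mean scores on the four assessed learning components before and after the program. It assumes responses are exported to CSV files with one row per policymaker and columns named for each component on a 1-5 scale; the file paths and column names are placeholders, not part of the study's instruments.

```python
# A minimal sketch, not one of the study's instruments: compares mean pre- and
# post-program scores on the four learning components assessed by the delayed
# survey. File paths and column names below are hypothetical placeholders.
import csv
from statistics import mean

COMPONENTS = ["knowledge", "skills", "confidence", "commitment"]

def load_scores(path):
    """Read one survey export and return {component: [scores]} (1-5 scale assumed)."""
    scores = {c: [] for c in COMPONENTS}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for c in COMPONENTS:
                scores[c].append(float(row[c]))
    return scores

def report_change(pre_path, post_path):
    """Print the mean change on each component between the pre- and post-surveys."""
    pre, post = load_scores(pre_path), load_scores(post_path)
    for c in COMPONENTS:
        change = mean(post[c]) - mean(pre[c])
        print(f"{c}: pre={mean(pre[c]):.2f} post={mean(post[c]):.2f} change={change:+.2f}")

if __name__ == "__main__":
    report_change("blended_eval_pre.csv", "blended_eval_post.csv")
```

Reporting the change per component keeps the summary aligned with the knowledge, skills, confidence, and commitment constructs the survey is designed to measure.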
Data Analysis and Reporting
Following the completion of the learning program for Policymakers, Level 4 outcomes
must be tracked and measured to determine if stakeholder compliance goals and organizational
global goals are being met. Using the shared system described in Level 2, a project dashboard
will be housed on the internal site to display progress being made toward all goals. The
dashboard will allow the stakeholders and larger organization to track all goals and outcomes
outlined in the four levels in real-time and adjust practices accordingly. Moreover, the dashboard will allow the larger organization to assess when additional resources may be needed. Appendix G illustrates a hypothetical dashboard, broken down by metric and year.
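A minimal sketch of the kind of roll-up the dashboard could perform is shown below, aggregating uploaded outcome records by metric and year. The record structure, metric names, and figures are hypothetical placeholders, not data from the MDHE system.

```python
# A minimal sketch, not the MDHE's reporting system: rolls uploaded outcome
# records up by metric and year for display on a shared project dashboard.
# The record structure, metric names, and figures are hypothetical.
from collections import defaultdict

# Each record: (year, metric, value) as participating institutions upload data.
uploads = [
    (2019, "students_completing_civic_course", 1200),
    (2019, "students_completing_civic_engagement", 950),
    (2020, "students_completing_civic_course", 1450),
    (2020, "students_completing_civic_engagement", 1100),
]

def build_dashboard(records):
    """Aggregate records into {metric: {year: total}} for dashboard display."""
    dashboard = defaultdict(lambda: defaultdict(int))
    for year, metric, value in records:
        dashboard[metric][year] += value
    return dashboard

if __name__ == "__main__":
    for metric, by_year in sorted(build_dashboard(uploads).items()):
        trend = ", ".join(f"{year}: {total}" for year, total in sorted(by_year.items()))
        print(f"{metric} -> {trend}")
```

Grouping by metric and year mirrors the hypothetical dashboard layout in Appendix G and makes year-over-year progress toward each Level 4 outcome easy to read.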
Summary
Using the Kirkpatrick and Kirkpatrick (2016) New World model for learning program
evaluation allows leaders and educators to focus on developing metrics that align to the overall
goals of the program or project. Through the use of the four levels in reverse, the authors teach
leaders how to embed data analysis into the learning program (Kirkpatrick & Kirkpatrick, 2016).
Often an overlooked feature of learning and development, Kirkpatrick and Kirkpatrick (2016)
argue that being able to show internal benchmarking and a return on expectations will highlight
the success of the program. For this study, using this model to illustrate the importance of
building an implementation plan that can help policymakers determine the success of their
compliance, as well as the sustainability of a state-wide policy, is crucial to its continued success.
In Level 4, Results, the desired outcomes directly connect to both the stakeholder goal of
compliance and the organizational global goal of graduating a civic-minded student body. Work
conducted in this level includes the identification of both internal and external outcomes and
corresponding metrics. In the next level, Behavior, critical behaviors are identified that the
policymakers must focus on to help achieve the larger outcomes outlined in the previous level. For
this stakeholder group, Level 3 behaviors range from identifying gaps in engagement and
communication with other stakeholder groups to educating participating institutions about data
collection. A comprehensive analysis of Level 3 behaviors allows the policymakers to determine what required drivers need to be implemented to successfully change behaviors and achieve lasting change.
These behaviors are directly influenced by the transfer of learning illustrated in Level 2.
In this level, learning is the primary focus, including the “intended knowledge, skills, attitude,
confidence, and commitment” derived through the year-long learning program outlined above
(Kirkpatrick & Kirkpatrick, 2016, p. 42). Creating a comprehensive learning program that allows the policymakers to develop meaning through their shared experiences and mental models is paramount to its success. Moreover, the structure of the program will help the
policymakers focus the transfer of learning needed for the other levels and to better communicate
with other stakeholders. Finally, Level 1 evaluates the learning program, including proposed
formative and summative methods. Addressing engagement, relevance, and customer satisfaction
through simple means allows the policymakers to determine their overall satisfaction with the
policy development process.
Limitations and Delimitations
There are several limitations to consider for this study. First, and most important, this
study only used a qualitative design. While the design is appropriate as it relates to the research
questions (Creswell, 2014), a qualitative design does not allow the researcher to answer whether
students do graduate as civic-minded individuals. A quantitative or mixed methods design would
allow for a deeper analysis into all aspects of the program (Creswell, 2014) and complement the
exploratory nature of qualitative design (Maxwell, 2013; Merriam & Tisdell, 2016). Second, the
sample population was very specific and small. Merriam and Tisdell (2016) argue that small
sample populations do not allow the findings to be generalized among other populations.
Moreover, the small sample population may not accurately reflect all aspects of the study. Merriam
and Tisdell (2016) note that the sample size will “always depend on the questions being asked,
the data being gathered, the analysis in progress, and the resources you have to support the
study” (p. 101). While an adequate number was needed to address those concerns, the small
sample size impacted any attempts to replicate the study.
It is also important to consider that the choice of stakeholder group may limit the study for two reasons. First, could the same research questions be asked of a related stakeholder
group? For instance, if the study asked the work teams at each participating institution the same
questions about compliance, would those interviews reveal more compelling data? Second, does
the geographical location of the stakeholder group impact their experience? If another state used
the same format for implementation, would the policymakers report the same experiences? How
much of the state legislature’s make-up and culture impact the success of a policy like this?
Finally, the lack of research on goal setting and goal orientation in public policy is
concerning. Most literature about public policy focuses on agenda setting. How well will this
motivational frame explain the influence by the stakeholders? Can goal setting and/or goal
orientation complement agenda setting during implementation of state-wide initiatives?
Additional research in this area is needed and future studies could explore the use of this
motivational influence for change.
Recommendations for Future Research
Given the limitations of this study and the unexplored nature of the research topic, there
are several recommendations for future research. First, more exploration surrounding the
identified stakeholder group is needed. Additional research could include examining meaning
making of policymakers, how policymakers learn while writing policies, and how policymakers
adapt during the implementation process. Moreover, future research could also explore how
policymakers interact with other stakeholders to develop, implement, and sustain policy in
communities. This research should also include an investigation into how policymakers create
policy that is sustainable in higher education after compliance is achieved.
Additional research that explores motivational factors, such as interest, in this stakeholder group is desperately needed. No literature was identified on how interest is developed or maintained by this stakeholder group. A deeper exploration of how interest impacts their work is needed to understand how policymakers in other areas could influence
policy success. Furthermore, additional research is needed in the exploration of how civic
learning public policy impacts institutions of higher education. Since Massachusetts is the only
state in the country that has this kind of state policy, more research is needed to explore how
postsecondary institutions integrate this kind of public policy into their existing structures and
organizational functions. Finally, a significant research project could be undertaken to examine
how a learning program would help policymakers create a comprehensive and effective system
to implement change, including assessment and evaluation.
Conclusion
The decline in civic learning and civic education by college students was illustrated
during the most recent election cycles, and it has prompted institutions of higher education to
address the issue through the use of civic engagement and civic general education curriculum.
Inspired by a 2012 watershed report, one state elected to push civic learning programs even further by implementing a state-wide policy on civic learning for all public 2- and
4-year postsecondary institutions. The purpose of this study was to determine whether a state
department of higher education could graduate civic-minded students from those institutions by
achieving 100% compliance with the state policy. Using the Clark and Estes (2008) gap analysis
conceptual framework as the foundation for this qualitative study, interviews with identified
stakeholders known as policymakers and document analysis explored the knowledge, motivation,
and organizational influences that impacted the organization’s goals. Of the identified influences,
one knowledge gap influence and three organizational gap influences were validated through
analysis. Findings revealed that while compliance was achieved, it was only achieved by
participating institutions and the exclusion of some institutions may impact policy sustainability
and growth. Moreover, policy alignment with institutional missions and cultures needs more
attention, as does aligning resources with the policy goal. Third, the organization needs to
continue efforts to foster a culture of civic learning for the policy to be sustainable. Interestingly,
while not a validated gap in the study, interest was identified as the motivating driver for
the stakeholder group, which may impact aspects of other knowledge influences. Based on those
gaps, the Kirkpatrick and Kirkpatrick (2016) New World model was used to create a learning
program and recommendations to help other policymakers create policies and implementation
processes that will lead to successfully compliant and sustainable programs. Creating a
comprehensive learning program, using recommendations derived from the identified gaps, will
allow policymakers to focus on meeting the strategic goals of their group and the larger
organization. More important, implementing these recommendations will allow policymakers,
state legislators, educators, administrators, and students to develop a well-informed electorate of
active citizens, equipped with the knowledge, skills, and values to develop and foster lasting
change in their communities, illustrating the real public value of civic learning.
References
Agocs, C. (1997). Institutionalized resistance to organizational change: Denial, inaction, and
repression. Journal of Business Ethics, 16, 917-931.
Ainley, M., Hidi, S., & Berndorff, D. (2002). Interest, learning, and the psychological processes
that mediate their relationship. Journal of Educational Psychology, 94, 545-561.
American Council of Trustees and Alumni. (2016). A crisis in civic education. Retrieved from
https://www.goacta.org/images/download/A_Crisis_in_Civic_Education.pdf
Annenberg Public Policy Center. (2017, September 12). Americans are poorly informed about
basic constitutional provisions. Retrieved from
https://www.annenbergpublicpolicycenter.org/americans-are-poorly-informed-about-
basic-constitutional-provisions/
Bawa, A. C., & Munck, R. (2012). Foreword: Globalizing civic engagement. In L. McIllrath, A.
Lyons, & R. Munck (Eds.), Higher education and civic engagement: Comparative
perspectives (pp. xi-xix). New York: Palgrave Macmillan.
Bogdan, R. C., & Biklen, S. K. (2007). Chapter 5: Qualitative data. Qualitative research for
education: An introduction to theories and methods (5th ed., pp. 160-172). Boston, MA:
Allyn and Bacon.
Bok, D. (2006). Our underachieving colleges: A candid look at how much students learn and
why they should be learning more. Princeton, NJ: Princeton University Press.
Boland, J. A. (2012). Strategies for enhancing sustainability of civic engagement: Opportunities,
risks, and untapped potential. In L. McIllrath, A. Lyons, & R. Munck (Eds.), Higher
education and civic engagement: Comparative perspectives (pp. 41-59). New York:
Palgrave Macmillan.
Bowen, G. A. (2009). Document analysis as a qualitative research method. Qualitative Research
Journal, 9(2), 27-40.
Brennan, J. (2017). Higher education civic learning and engagement: A Massachusetts study.
Education Commission of the States. Retrieved from https://www.ecs.org/higher-
education-civic-learning-and-engagement-a-massachusetts-case-study/
Bryant, A. N., & Gaston Gayles, J. (2012). The relationship between civic behavior and civic
values: A conceptual model. Journal of Research in Higher Education, 53, 76-93.
Center for Information and Research on Civic Learning and Engagement. (2010). Fact sheet:
Civic skills and federal policy. Retrieved from http://civicyouth.org/wp-
content/uploads/2010/12/Civic-Skills-and-Federal-Policy_final.pdf
Clark, R. E., & Estes, F. (2008). Turning research into results. Charlotte, NC: Information Age
Publishing.
Cole, J. R. (2016, November 8). Ignorance does not lead to election bliss. The Atlantic. Retrieved
from https://www.theatlantic.com/education/archive/2016/11/ignorance-does-not-lead-to-
election-bliss/506894/
Coley, R. J., & Sum, A. (2012). Fault lines in our democracy: Civic knowledge, voting behavior,
and civic engagement in the United States. Educational Testing Service. Retrieved from
http://www.ets.org/s/research/19386/rsc/pdf/18719_fault_lines_report.pdf
Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods
approaches. Thousand Oaks, CA: Sage.
Cuthill, M. (2012). A “civic mission” for the university: Engaged scholarship and community-
based participatory research. In L. McIllrath, A. Lyons, & R. Munck (Eds.), Higher
education and civic engagement: Comparative perspectives (pp. 81-99). New York:
Palgrave Macmillan.
Damon, W. (2011). The core of civic virtue. Hoover Digest, 3. Retrieved from
https://www.hoover.org/research/core-civic-virtue
Davis, A., & Mello, B. (2012). Preaching to the apathetic and uninterested: Teaching civic
engagement to freshman and non-majors. Journal for Civic Commitment. Retrieved from
http://ccncce.org/articles/preaching-to-the-apathetic-and-uninterested-teaching-civic-
engagement-to-freshmen-and-non-majors/
Department of Education. (2017). Fiscal year 2017 budget: Summary and background
information. Retrieved from
https://www2.ed.gov/about/overview/budget/budget17/summary/17summary.pdf
Dunn, E. J., Mills, P. D., Neily, J., Crittenden, M. D., Carmack, A. L., & Bagian, J. P. (2007).
Medical team training: Applying crew resource management in the Veterans Health
Administration. The Joint Commission Journal on Quality and Patient Safety, 33(6), 317-
325.
Education Commission of the States. (2016, January). State civic education policy. Retrieved
from https://www.ecs.org/ec-content/uploads/01202016_ECS_CEPGA_Report-1.pdf
Fink, A. (2013). Sampling. In A. Fink (Ed.), How to conduct surveys: A step-by-step guide
(5th ed., pp. 79-98). Thousand Oaks, CA: Sage.
Forestal, J. (2016). “Midwife to democracy”: Civic learning in higher education. Retrieved from
https://intraweb.stockton.edu/eyos/hughescenter/content/docs/Research/Forestal%20Octo
ber%202016.pdf
Gaff, J., & Meacham, J. (2006). Learning goals in mission statements: Implications for
educational leadership. Liberal Education, 92(1). Retrieved from
https://www.aacu.org/publications-research/periodicals/learning-goals-mission-
statements-implications-educational
Gallimore, R., & Goldenberg, C. (2001). Analyzing cultural models and settings to connect
minority achievement and school improvement research. Educational Psychologist,
36(1), 45-56.
Galston, W. (2004). Civic education and political participation. Political Science and Politics,
37, 263-266.
Galston, W. (2007). Civic knowledge, civic education, and civic engagement: A summary of
recent research. International Journal of Public Administration, 30(6-7), 623-642.
Glesne, C. (2011). Chapter 6: But is it ethical? Considering what is “right.” In Becoming
qualitative researchers: An introduction (4th ed., pp. 162-183). Boston, MA: Pearson.
Harding, J. (2013). Chapter 5: Using codes to analyze an illustrative issue. Qualitative data
analysis from start to finish (pp. 81-106). Thousand Oaks, CA: SAGE.
Hatcher, J. A. (2011). Assessing civic knowledge and engagement. New Directions for
Institutional Research, 149, 81-92. doi: 10.1002/ir.382
Hillman, N. W., Tandberg, D. A., & Sponsler, B. A. (2015). Public policy and higher education:
Strategies for framing a research agenda. ASHE Higher Education Report, 41(2).
Hillygus, D. S. (2005). The missing link: Exploring the relationship between higher education
and political engagement. Political Behavior, 27(1), 25-47.
Holosko, M. J., Winkel, M., Crandall, C., & Briggs, H. (2015). A content analysis of mission
statements of our top 5 schools of social work. Journal of Social Work Education, 51(2),
222-236.
Jacoby, B. (2009). Civic engagement in today’s higher education: An overview. In B. Jacoby and
Associates (Eds.), Civic engagement in higher education (pp. 1-30). San Francisco, CA:
Jossey-Bass.
Johnson, R. B., & Christensen, L. B. (2015). Sampling in quantitative, qualitative, and mixed
research. In R. B. Johnson & L. B. Christensen (Eds.), Educational research:
Quantitative, qualitative, and mixed approaches (5th ed., pp. 247-276). Thousand Oaks,
CA: Sage.
Kezar, A. (2001). Theories and models of organizational change. Understanding and facilitating
organizational change in the 21st century: Recent research and conceptualizations.
ASHE-ERIC Higher Education Report, 28(4), 25-58.
Kezar, A., & Eckel, P. (2000). The effect of institutional culture on change strategies in higher
education: Universal principles or culturally responsive concepts? ERIC Clearinghouse
on Higher Education, Office of Educational Research and Improvement. Retrieved from
https://files.eric.ed.gov/fulltext/ED446719.pdf
King, H. B., Kohsin, B., & Salisbury, M. (2005). Systemwide deployment of medical team
training: Lessons learned in the Department of Defense. In K. Henriksen, J. B. Battles, E.
S. Marks, et al. (Eds.), Advances in patient safety: From research to implementation (Vol. 3).
Rockville, MD: Agency for Healthcare Research and Quality. Retrieved from
https://www.ncbi.nlm.nih.gov/books/NBK20537/
Kirkpatrick, J. D., & Kirkpatrick, W. K. (2016). Kirkpatrick’s four levels of training evaluation.
Alexandria, VA: ATD Press.
Knefelkamp, L. (2008). Civic identity: Locating self in community. Diversity and Democracy,
11, 1-3.
Krathwohl, D. R. (2002). A revision of Bloom’s Taxonomy: An overview. Theory into Practice,
41, 212-218.
Leach, W. D., Pelkey, N. W., & Sabatier, P. A. (2002). Stakeholder partnerships as collaborative
policymaking: Evaluation criteria applied to watershed management in California and
Washington. Journal of Policy Analysis and Management, 21(4), 645-670.
Lunenburg, F. C. (2011). Goal-setting theory of motivation. International Journal of
Management, Business, and Administration, 15(1), 1-6.
Massachusetts Department of Higher Education. (2017). About the Department of Higher
Education. Retrieved from http://www.mass.edu/about/aboutdhe.asp
Massachusetts Department of Higher Education. (2017). DHE DataCenter: 2016 degrees
conferred. Retrieved from http://www.mass.edu/datacenter/success/SUDegreesConferred.asp
Massachusetts Department of Higher Education. (2017). The Vision Project: Preparing citizens.
Retrieved from http://www.mass.edu/visionproject/preparingcitizens.asp
Maxwell, J. A. (2013). Qualitative research design: An interactive approach (3rd ed.). Thousand
Oaks, CA: Sage.
Mayer, R. E. (2011). Applying the science of learning. Boston: Pearson.
McEwan, E. K., & McEwan, P. J. (2003). Making sense of research. Thousand Oaks, CA: Sage.
Melville, K., Dedrick, J., & Gish, E. (2013). Preparing students for democratic life: The
rediscovery of education’s civic purpose. The Journal of General Education, 62(4), 258-
276.
Merriam, S. B., & Tisdell, E. J. (2016). Qualitative research: A guide to design and
implementation (4th ed.). San Francisco, CA: Jossey-Bass.
Mettler, S. (2007). Bringing government back into civic engagement: Considering the role of
public policy. International Journal of Public Administration, 30, 645-650.
Miles, M. B., Huberman, A. M., & Saldana, J. (2014). Chapter 4: Fundamentals of qualitative
data analysis. Qualitative data analysis: A methods sourcebook (3rd ed., pp. 71-86).
Thousand Oaks, CA: Sage.
Musil, C. (2009). Educating students for personal and social responsibility. In B. Jacoby and
Associates (Eds.), Civic engagement in higher education (pp. 49-68). San Francisco, CA:
Jossey-Bass.
Patton, M. Q. (2002). Chapter 7: Qualitative interview. In Qualitative research and evaluation
methods (3rd ed., pp. 339-348). Thousand Oaks, CA: Sage.
Reason, R. D., & Hemer, K. (2015). Civic learning and engagement: A review of the literature
on civic learning, assessment, and instruments. Washington, DC: Association of
American Colleges and Universities, Degree Qualifications Profile, Civic Learning Task
Force. Retrieved from
https://www.aacu.org/sites/default/files/files/qc/CivicLearningLiteratureReviewRev1-26-
15.pdf
Reiff, J. D. (2016). What if a state required civic learning for all its undergraduates? Higher
Learning Research Communications, 6(2).
Richardson, R. C., Bracco, K. R., Callan, P. M., & Finney, J. E. (1998). Higher education
governance: Balancing institutional and market influences. The National Center for
Public Policy and Higher Education. Retrieved from
https://files.eric.ed.gov/fulltext/ED426641.pdf
Rose, D. (2017). Higher education and the transformation of American citizenship. American
Political Science Association, 403-407.
Rueda, R. (2011). The three dimensions of improving student performance. New York: Teachers
College Press.
Sandfort, J., & Moulton, S. (2015). Effective implementation in practice: Integrating public
policy and management. San Francisco, CA: Jossey-Bass.
Schein, E. H. (2004). Organizational culture and leadership. San Francisco: Jossey-Bass.
Schneider, B., Brief, A., & Guzzo, R. (1996). Creating a culture and climate for sustainable
organizational change. Organizational Dynamics, 24(4), 7-19.
Schraw, G., & Lehman, S. (2006). Interest. Retrieved from
http://www.education.com/reference/article/interest
Schraw, G., & McCrudden, M. (2006). Information processing theory. Retrieved from
http://www.education.com/reference/article/information-processing-theory/
Social Networks and Archival Contexts (n.d.). Massachusetts Board of Education. Retrieved
from http://snaccooperative.org/ark:/99166/w6zp8k82
Spaulding, K., & Dwyer, F. (2001). The effect of time-on-task when using job aids as an
instructional strategy. International Journal of Instructional Media, 28(4), 437.
Special Commission on Civic Engagement and Learning. (2012). Renewing the social compact.
Boston, MA: Massachusetts Board of Higher Education.
Spiezio, K. (2009). Engaging general education. In B. Jacoby & Associates (Eds.), Civic
engagement in higher education (pp. 69-84). San Francisco, CA: Jossey-Bass.
Study Group on Civic Learning and Engagement. (2013). The Vision Project: Final report from
the Study Group on Civic Learning and Engagement – Preparing citizens. Boston, MA:
Massachusetts Department of Higher Education. Retrieved from
https://www.capecod.edu/files/service/vision-project.pdf
Study Group on Civic Learning and Engagement. (2014). Policy on civic learning. Boston, MA:
Massachusetts Board of Higher Education.
The National Task Force on Civic Learning and Democratic Engagement. (2012). A crucible
moment: College learning and democracy’s future. Washington, DC: Association of
American Colleges and Universities.
Thelin, J. R. (2004). A history of American higher education. Baltimore, MD: The Johns
Hopkins University Press.
Torney-Purta, J., Cabrera, J. C., Crotts Roohr, K., Liu, O. L., & Rios, J. A. (2015). Assessing
civic competency and engagement in higher education: Research background,
frameworks, and directions for next generation assessments (ETS research report no. RR-
15-34). Princeton, NJ: Educational Testing Service.
Van Camp, D., & Baugh, S-A. (2016). Encouraging civic knowledge and engagement: Exploring
current events through a psychological lens. Journal of the Scholarship of Teaching and
Learning, 16(2), 14-28.
Xue, J., Murthy, B., Tran, N. T., & Ghaffar, A. (2014). Goal setting and knowledge generation
through health policy and systems research in low- and middle-income countries. Health
Research Policy and Systems, 12, 39.
Yough, M., & Anderman, E. (2006). Goal orientation theory. Retrieved from
http://www.education.com/reference/article/goal-orientation-theory/
Appendix A: Interview Protocols
1. I’m interested in learning more about your background. Please tell me about your
professional journey.
2. How did you become a part of this group?
3. Before this policy initiative, what did civic learning in higher education mean to you? (K)
4. For you, what motivated you to develop and implement this policy? (M)
5. Describe the process of creating a definition for civic learning and its elements for this
policy. (K)
a. What influenced you to select the terms and definitions you did for this policy?
(K)
b. How did you decide what civic skills college students needed upon graduation?
(K)
c. How does this definition of civic learning assist in the assessment of the policy?
(K)
6. In your opinion, how does the policy’s definition of civic learning help meet the
organization’s goal to graduate more civic-minded students? (K)
7. How did you determine which implementation goals to set and when to set them? (M)
a. Tell me about the process used to decide which project goals would build upon
the larger organizational goal. (M)
b. How did you prioritize project goals? (M)
8. Tell me about the influence of institutional goals on the implementation of policy goals.
(M)
a. Describe how you helped institutions re-align their missions with the policy.
(K/O)
b. How did you help institutions that already had a significant civic mission align
this policy with their mission? (K)
c. If an institution re-aligns its mission for this policy, how could it impact future
state-mandated projects at the institution? (O)
9. The policy allows for unique implementation at any participating institution. How was
that decision made? (O)
10. How will the policy account for the variations in implementation? (O)
11. Some of the research on civic learning discusses the need for a culture of civic learning
for an institution to be successful. How would you define a culture of civic learning? (O)
a. Describe the culture of civic learning at the MDHE. (O)
12. How does the MDHE plan to sustain any cultural changes made in response to the
policy? (O)
a. What sustainability metrics were included in the policy? (O)
13. With deep budget cuts and resource reallocations for other Vision initiatives, how can
project support continue at the organizational level? (O)
a. How do you prioritize implementation work with reallocations? (O)
b. How does the group plan to keep this policy initiative at the forefront of the
Vision Project? (O)
14. How does the policy assist participating institutions in continued implementation and
practice after compliance has been achieved? (O)
Appendix B: Document Analysis Protocol
Name of Institution:_________________________________________________________
Date Examined: ___________________________________________________________
Number of clicks to find information: _________
Compliance Area: Strategic Planning
- Civic class
- Civic engagement
- Both class and engagement

Compliance Area: Support
- Regular meetings
- Financial opportunities

Compliance Area: Data Collection
- Types of data collected
- How data is being used

Compliance Area: Coordination
- Relationship between MDHE and institution
- Future institutional changes
- Future learning policy changes
- Identified educational gaps
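For readers who would capture these observations electronically rather than on paper, the minimal Python sketch below illustrates one way the protocol's fields could be recorded. It is an illustration only, not part of the study's instruments; the class and field names (DocumentReview, ComplianceArea, clicks_to_find) and the sample institution are hypothetical.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ComplianceArea:
    """One compliance area from the protocol, with notes per indicator."""
    name: str
    indicators: Dict[str, str] = field(default_factory=dict)  # indicator label -> analyst notes


@dataclass
class DocumentReview:
    """A single institution's document-analysis record."""
    institution: str
    date_examined: str
    clicks_to_find: int  # number of clicks needed to locate the information
    areas: List[ComplianceArea] = field(default_factory=list)


# Illustrative entry mirroring the protocol's four compliance areas (fictional data).
review = DocumentReview(
    institution="Example State College",
    date_examined="2018-10-01",
    clicks_to_find=3,
    areas=[
        ComplianceArea("Strategic Planning",
                       {"Civic class": "", "Civic engagement": "", "Both class and engagement": ""}),
        ComplianceArea("Support",
                       {"Regular meetings": "", "Financial opportunities": ""}),
        ComplianceArea("Data Collection",
                       {"Types of data collected": "", "How data is being used": ""}),
        ComplianceArea("Coordination",
                       {"Relationship between MDHE and institution": "",
                        "Future institutional changes": "",
                        "Future learning policy changes": "",
                        "Identified educational gaps": ""}),
    ],
)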
Appendix C: Credibility and Trustworthiness
Because humans are both the sources of data and the collectors of data in qualitative
research, reviewing credibility and trustworthiness and establishing mechanisms to ensure these
aspects is vital (Merriam & Tisdell, 2016). As Merriam and Tisdell (2016) note, internal validity
[or credibility] is relative, and researchers must take precautions to ensure that threats to
credibility are countered. There are two main threats to credibility: researcher bias and reactivity
(Maxwell, 2013). Researcher bias occurs when a researcher’s values or perspectives influence the
study and its findings (Maxwell, 2013). Researchers must also develop strategies to acknowledge
and counteract reactivity, which Maxwell (2013) defines as the influence of the researcher on
the setting or participants.
To ensure credibility and trustworthiness in this study, the researcher is taking specific
steps to minimize these threats. First, triangulation is built into the research design: interviews
and document analysis allow for a diverse range of data from multiple sources and reduce the
risk of bias that comes from relying on a single method of data collection (Maxwell, 2013; Merriam
& Tisdell, 2016). Second, during data collection, the researcher will use a research journal to
examine how the researcher may affect or be affected by the study. Reflexivity is an important
part of eliminating or minimizing validity threats as it allows the researcher to account for his or
her “biases, dispositions, and assumptions regarding the research” (Merriam & Tisdell, 2016, p.
249). This information will be shared in the final analysis to show readers that the researcher’s
values and expectations did not influence the data collection or findings (Merriam & Tisdell,
2016). It will also allow the researcher to address any biases that may develop during the study.
Finally, the researcher will ensure adequate engagement with the data during collection. Merriam
and Tisdell (2016) maintain that it is important that the data and emergent findings feel saturated,
meaning the researcher must interview and analyze a significant number of policymakers and
documents to see certain data repeatedly. Interviewing seventeen policymakers and examining
hundreds of documents will allow for a deep dive into the research questions (Merriam &
Tisdell, 2016). It may also produce alternative explanations for the phenomena being
discovered, allowing for a true emergent design (Merriam & Tisdell, 2016).
Appendix D: Ethics
While conducting any research project, a researcher needs to take several ethical
considerations into account. Merriam and Tisdell (2016) contend that a researcher’s ethical
practice is dictated by his or her values and ethics. To that end, numerous ethical guidelines were
used for this study. Institutional Review Board (IRB) approval was obtained from the University
of Southern California’s Office for the Protection of Research Subjects. The submission included
a detailed informed consent form. Signed consent forms were obtained from all participants prior to
the start of interviews, and included required statements concerning the voluntary nature of the
study, the participant’s ability to terminate involvement at any time, and a listing of potential
risks (Glesne, 2011). The informed consent form also included information about the role of
protecting the confidentiality and privacy of the participant’s involvement and storage of data
(Glesne, 2011).
All interviews were audio recorded, with written permission forms provided to participants
before the start of the interview. Since the study population was small, several additional steps
were taken to ensure the privacy and confidentiality of the participants during the qualitative
interviews. First, all participants were issued a pseudonym that does not include any identifiers
that could be used to ascertain the real identity of the interviewee. Second, no identifying
information was used when describing the participants. While demographic information was
collected, it was aggregated and not ascribed to a specific participant in the study.
Ensuring the confidentiality and privacy of the participants also extends to the collection
and storage of data. Since interviews were audio recorded, digital data files were kept on a
secure, password-protected computer in a locked office. Moreover, transcriptions of the digital
files were password protected, with only the researcher having access to the files. Transcripts
were double-checked before recordings were destroyed.
Researchers must also consider their roles in the development and execution of research
projects. This researcher is not a member of the study organization and holds no conflicting or
material interest in the findings. Moreover, the identified stakeholders are all employed by
different organizations, none of which conflict with the researcher’s professional position. Since
there is no overlap with any organizations or the larger organization of study, describing the role
of the researcher to participants will be clear and concise. However, there are some biases that
must be considered. Civic learning is of personal interest to this researcher, so ensuring interview
questions are as unbiased as possible is incredibly important (Merriam & Tisdell, 2016).
Guidance from the researcher’s dissertation committee will ensure questions are not slanted
toward a particular viewpoint and that the researcher is not taking an advocate role based on bias
(Glesne, 2011). Collecting data using both interviews and document analysis will also aid in
reducing researcher bias (Merriam & Tisdell, 2016). Finally, this researcher kept a research
journal to record any biases and reflections that arose during data collection.
Appendix E: Post-Meeting Evaluation
Thank you for taking this brief electronic survey evaluating the initial workshop and final
follow-up meeting, using a 4-point Likert scale. Space is provided at the end of the survey for
additional comments and concerns that the items may not cover. The following scale is used for
this evaluation: 1 = Strongly Disagree, 2 = Disagree, 3 = Agree, 4 = Strongly Agree.
Engagement (Level 1)
- I took responsibility for being involved in this learning program. (1  2  3  4)
- This learning program held my interest. (1  2  3  4)
- I was encouraged to participate during the learning program. (1  2  3  4)

Relevance (Level 1)
- The information in this learning program is applicable to my work developing civic learning policy. (1  2  3  4)
- The activities and discussions aided me in learning the concepts for policy development. (1  2  3  4)

Customer Satisfaction (Level 1)
- I would recommend this learning program to others who are developing civic learning policy. (1  2  3  4)

Knowledge (Level 2)
- I understand how to create a civic learning definition. (1  2  3  4)
- I understand how to create civic learning assessment plans. (1  2  3  4)

Attitude (Level 2)
- I believe civic learning is worthwhile to postsecondary education. (1  2  3  4)
- I believe my work will contribute to the development of a civic learning policy. (1  2  3  4)

Confidence (Level 2)
- I am clear about how to create a civic learning definition for a policy. (1  2  3  4)
- I am clear about how to create civic learning assessment tools. (1  2  3  4)
- I am clear about how to create goals for policy implementation. (1  2  3  4)

Commitment (Level 2)
- I will use the knowledge I gained from this learning program to create a civic learning policy. (1  2  3  4)
1. How can this learning program be improved?
2. Please share any additional comments you would like the facilitator to know.
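Because each item in this instrument maps to a Kirkpatrick evaluation level, a facilitator may wish to tabulate responses by level. The brief Python sketch below is a hypothetical illustration of such a tabulation; it assumes each response is stored as a mapping from item text to a (level, rating) pair, and none of the names or data shown come from the study itself.

from collections import defaultdict
from statistics import mean

# Hypothetical responses from one participant: item text -> (Kirkpatrick level, rating 1-4).
responses = {
    "This learning program held my interest.": (1, 4),
    "I understand how to create a civic learning definition.": (2, 3),
    "I believe civic learning is worthwhile to postsecondary education.": (2, 4),
}

def average_by_level(resp):
    """Group ratings by Kirkpatrick level and return the mean rating for each level."""
    by_level = defaultdict(list)
    for level, rating in resp.values():
        by_level[level].append(rating)
    return {level: round(mean(ratings), 2) for level, ratings in sorted(by_level.items())}

print(average_by_level(responses))  # prints the mean rating per level, e.g. {1: 4, 2: 3.5}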
Appendix F: Post-Follow-up Meeting Evaluation
Thank you for taking this brief electronic survey evaluating the follow-up meetings, using a
4-point Likert scale. Space is provided at the end of the survey for additional comments and
concerns that the items may not cover. The following scale is used for this evaluation:
1 = Strongly Disagree, 2 = Disagree, 3 = Agree, 4 = Strongly Agree.
L1: Reaction
- Information in the follow-up meeting is applicable to my work on the policy. (1  2  3  4)
- This follow-up meeting was a good use of my time. (1  2  3  4)

L2: Learning
- Information in the follow-up meeting allowed me to continue developing definitions. (1  2  3  4)
- Information in the follow-up meeting allowed me to continue developing assessment tools. (1  2  3  4)

L3: Behavior
- I received support to apply what I learned during policy development. (1  2  3  4)
- I have successfully applied what I learned in the follow-up meeting to my work on the policy. (1  2  3  4)

L4: Results
- My efforts have contributed to the development of policy and mission of the initiative. (1  2  3  4)
1. Please share any additional comments you would like the facilitator to know.
Appendix G: Dashboard Example
Abstract
Since 2010, numerous reports have shown an increasing lack of knowledge surrounding civic learning among college students. To mitigate this growing national problem, the Commonwealth of Massachusetts passed a state-wide Policy on Civic Learning that requires all undergraduate students graduating from a public 2- or 4-year institution to be civic-minded upon commencement. The purpose of this study was to evaluate the degree to which the Massachusetts Department of Higher Education was meeting its civic learning policy compliance goal. Using the Clark and Estes (2008) gap analysis as a framework to identify the knowledge, motivation, and organizational influences that impact successful completion of the identified goal, this study interviewed policymakers associated with creating the state public policy. Five themes emerged from interviews and document analysis, focusing on one identified knowledge influence gap and three identified organizational influence gaps. This study also created a recommended implementation and evaluation plan for a learning program that addresses the identified gaps and that can be used by other states seeking to create a similar policy. Following Kirkpatrick and Kirkpatrick's (2016) New World model, four levels of evaluation were completed, and evaluation tools were created to assess successful policy implementation.