A LEARNING MODEL OF POLICY IMPLEMENTATION:
IMPLEMENTING TECHNOLOGY IN RESPONSE TO POLICY REQUIREMENTS
by
Ryan Edward Alcántara
A Dissertation Presented to the
FACULTY OF THE GRADUATE SCHOOL
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF PHILOSOPHY
(PUBLIC ADMINISTRATION)
Copyright 2009 Ryan Edward Alcántara
Dedication
I dedicate this dissertation to my wife and two sons. Suzanne, thank you for your
support, patience, and encouragement throughout this process. And to my boys, Vincent
and Steven, may this be a reminder to you both. You can achieve anything with hard
work, dedication, and the support of those who love you.
Table of Contents
Dedication...................................................................................................................... ii
List of Tables................................................................................................................ iv
List of Figures................................................................................................................ v
Abstract......................................................................................................................... vi
Chapter 1: Overview of the Study................................................................................. 1
Chapter 2: Literature Review......................................................................................... 7
Chapter 3: Research Design and Methods................................................................... 52
Chapter 4: Case Study of SEVIS................................................................................. 85
Chapter 5: Case Study of Online Education.............................................................. 137
Chapter 6: Cross Case Analysis................................................................................. 201
Bibliography.............................................................................................................. 247
Appendix A: Homeland Security and SEVIS Implementation.................................. 256
Appendix B: Semi-Structured Interview Protocol..................................................... 259
Appendix C: Initial Listing of Codes......................................................................... 260
Appendix D: Final Coding Protocol with Reference Count...................................... 262
List of Tables
Table 2-1. Information Age Reform............................................................................ 35
Table 2-2. Propositions and their Relation to the OLSM Model................................ 38
Table 2-3. Understanding Achievement of Implementation Objectives..................... 48
Table 3-1. Policy Cases in Contrast............................................................................ 68
Table 2-2. Propositions and their Relation to the OLSM Model................................. 72
Table 3-2. Measures for Achieving Implementation Objectives................................. 78
Table 4-1. SEVIS Phases of Implementation.............................................................. 92
Table 4-2. Propositions in Practice for SEVIS.......................................................... 107
Table 5-1. Online Education Phases of Implementation........................................... 151
Table 5-2. Expansion of Department Online Offerings by Course........................... 162
Table 5-3. Expansion of Department Online Offerings by Section........................... 163
Table 5-4. Propositions in Practice for Online Education......................................... 168
Table 6-1. Summary of Evidence of Organizational Learning................................. 205
Table 6-2. Implications for Policy Implementation.................................................. 228
Table 3-1. Policy Cases in Contrast........................................................................... 238
List of Figures
Figure 1-1. Dynamic Social Model of Organizational Learning Subsystems............. 26
Figure 6-1. Policy Implementation Process................................................................ 202
Figure 6-2. Integrated Dynamic Social Model of Organizational Learning.............. 234
Abstract
This study integrates organizational learning and change theory into a learning
model of policy implementation. This is an exploratory study of technology
implementation with descriptive and normative objectives, utilizing organizational
learning and change theory to identify organizational conditions that promote learning
and contribute to achieving implementation objectives. Learning in policy
implementation is studied in the context of a pair of technology-related policies set in a
higher education context, one serving an administrative function [1] and the other an
instructional function. [2]
Schwandt and Marquardt’s (2000) Organizational Learning
Systems Model (OLSM) is used to evaluate the presence and development of
organizational learning in these policy contexts, and serves to develop an enriched model
of policy implementation.
Data were collected via interviews, documentation review, and direct observation
in order to identify and examine conditions that support learning at the individual and
organizational levels, and advance the achievement of implementation objectives. The
conditions include: inter-organizational networks; communication with policymakers;
policy and goal congruency; experimentation; roles and responsibilities; internal
communication; and evaluation and routinization. Implementation objectives reflect the
demands of the policy, use of the technology and aims of the organization. Evidence of
[1] The implementation of the Student and Exchange Visitor Information System (SEVIS) will be studied from a campus perspective. SEVIS is a federal system to monitor international students studying in the United States.

[2] The development and expansion of online course offerings by an academic department serves as a contrasting case. The policy direction here is much more diffuse and is coupled with the department's efforts to grow enrollment.
organizational learning was observed as changes in organizational membership and
culture, as well as the development of new operational systems. Further, this learning
contributed to meeting implementation objectives. Implications for policy
implementation include the need for policymakers to consider policy development from
an organizational perspective; the findings also suggest strategies for managers to
advance organizational learning in the implementation process.
Chapter 1: Overview of the Study
This research stems from an interest in how policy is translated into
administrative action. With a focus on policy implementation, the study argues the need
to approach implementation with an organizational perspective. Specifically, it seeks to
integrate organizational learning into the study of implementation and advances a model
for conceptualizing learning in this context. The cases studied relate to technology policy
in a higher education setting and the findings suggest strategies for both policymakers
and organizational managers to promote organizational learning through the
implementation process.
Policy implementation has become an important concept in understanding and
analyzing policy. While the early studies centered on the tensions between top down
(Pressman and Wildavsky, 1984; Mazmanian and Sabatier, 1989 and 1983; Bardach,
1977) and bottom up frameworks (Maynard-Moody, et al. 1990; Lipsky, 1980; Elmore,
1978), more recent literature has called for a more complex understanding of policy
implementation (Mead 2001; True, Jones, and Baumgartner 1999; Yanow 1996; Matland
1995; Bishop 2003). This often integrates both approaches and applies a situational
perspective to understanding effective implementation. While learning has been
recognized as an important feature of effective policy implementation (Fiorino, 2001;
Pressman and Wildavsky, 1984; True, Jones, and Baumgartner, 1999), little has been
done to examine the specific organizational conditions that support learning in the
practice of policy implementation. This study addresses this gap in the literature,
integrating current organizational learning and change theory into an implementation
framework and applying it in a comparative case study to analyze how different
organizations with varying capacities for transferring individual knowledge to
organizational knowledge are able to implement policy. Such an integration contributes to
the development of a new approach for implementation studies and a new generation of
policy implementation theory.
This learning model of policy implementation is directed towards policies that
require fundamental changes in how the organization operates and have a broad impact
on individual organizational members, with an emphasis on changes that have a
technological foundation. Such examples of policy implementation benefit from planned
change in the operation and functions of the implementing agencies. They also require
the acquisition of new knowledge at the individual level, which, if implementation is to
be successful, must be shared at an organizational level.
In addition to the importance of culture and values in managing large-scale
change (Cole, 1991), change is seen as a learning process that takes place within the
individual and across organizations (Langley, Mintzberg, Pitcher, Posada, and Saint-
Macary, 1995; Lundberg, 1983). Further, organizational structure is also recognized as a
valuable feature in furthering learning and promoting change (Nonaka, 1994; Lundberg,
1983; Van Maanen and Schein, 1979), particularly double-loop systems that provide
feedback throughout the organization (Cole, 1991; Argyris and Schön, 1978) and
generative learning organizations, where change and continuous experimentation are
embraced (Marshak, 1993; Senge, 1990). Moreover, organizational change can be
conceptualized in a variety of manners, each with its own set of assumptions, which
provide a lens to better understand the process by which change occurs (Kezar, 2001;
Morgan, 1997; Van de Ven and Poole, 1995). The lens adopted here combines both
adaptive and social cognitive change models, in which planned change is driven by forces
outside of the organization and is directed by learning processes that begin at the
individual level.
Learning is more easily understood at the individual than organizational level.
Individuals take information and experience and develop knowledge that is used to shape
behavior, habits, and attitudes, allowing for learning, in many instances, to be observable.
Learning at the organizational level, in contrast, is much more complex. In the best case,
a learning organization is adaptable and amenable to internal and external challenges. In
the worst case, it may merely serve as a metaphor for observed, but not well understood,
changes in an organization’s activity, or as a mere aggregation of learning that takes
place at an individual level. Schwandt and Marquardt (2000) offer a useful
conceptualization of organizational learning, defining it as “a system of actions, actors,
symbols, and processes that enable an organization to transform information into valued
knowledge which in turn increases its long-run adaptive capacity” (p. 43). Schwandt and
Marquardt (2000) observe organizational change as a function of learning and
performance. Both of these are requirements for change and support the adaptive activity
of an organization, but through different means. They conceptualize learning through the
Organizational Learning Systems Model (OLSM), which is based on four interrelated
processes, the health of each of which is needed for organizational learning to take place.
include: environmental interface; action/reflection; meaning and memory; and
dissemination and diffusion. This model creates a framework for organizational learning
by which the organizations in this study are examined.
Studying policy implementation in the context of learning guides one to
understand the organizational conditions that are most relevant for effective
implementation. These conditions for learning incorporate organizational features and
management qualities. They specifically include: networking with other implementing
organizations; communication with policymakers; alignment of policy with
organizational goals; promotion of a culture of experimentation; clarity of roles and
responsibilities; internal communication; and evaluation and routinization. This paper
examines policy implementation at this level and within the context of implementing
technology. The implementation of technology provides a unique set of challenges
within the policymaking arena (Fountain, 2001; Heeks and Davies, 1999; Lessig, 1999),
while also affording the opportunity to examine learning at an organizational as well as at
an individual level, indeed where the organizational learning process begins.
The relevance of this study is based on the premise that policy demands placed on
organizations can introduce new stresses to the organization, which can be overcome to
an extent by the ability of the organization to adapt. This adaptation is most successful if
it is supported by learning processes. While a case study methodology places limits on
the nature of the findings that the study can generate, such an approach is particularly
helpful in filling the current void between the policy implementation and organizational
learning literatures. This is an exploratory and descriptive study seeking to better
understand how learning takes place in the policy implementation process, and, more
specifically, what conditions of the organization are most relevant to supporting effective
implementation. OLSM is used as a framework to understand and study learning activity
in the policy implementation process. Achievement of policy implementation objectives
in the study was seen in the context of the technological tool being adopted. This
understanding is drawn from both policy documents and from organizational members in
the data collection process, and thus provides a rich perspective on policy objectives,
representing both the top down (policy documents) and bottom up (organizational
members).
Organizational learning was indeed observed over the course of implementation,
and was largely supported by the identified conditions. Further, learning was found to
contribute to the overall achievement of implementation objectives. The observed
learning included the development of new operational systems and directed changes in
organizational membership and culture, both of which contributed to the achievement of
implementation objectives. The model and its conditions served as a useful tool to
conceptualize and study learning in the context of implementation. The focus on
technology-related policy in a higher education setting, however, suggests certain
limitations on the study and may explain such findings as the limited
applicability of experimentation in these cases. Ultimately, this suggests an important
area for future research, testing the applicability of these conditions to other settings.
The findings, moreover, offer guidance to both policymakers and the managers
overseeing the implementation as to organizational conditions that can both facilitate
implementation, and improve the overall achievement of implementation objectives.
Findings suggest the need for policymakers to consider policy development from an
organizational perspective and to support organizational learning through the
maintenance of communication systems with implementing organizations and the
development of richer evaluation processes, which better capture implementation process
related issues. For managers, the model argues that resources and interest be directed to
the learning process and not just implementation activity, and that the development of
learning support systems be put into place early in implementation. Through these
efforts, learning can be a more integrated part of the policy implementation process
leading to more successful implementation, and ultimately policies that better meet their
intent and the needs of society.
Chapter 2: Literature Review
Introduction
This study seeks to integrate two related but distinct literatures, policy
implementation and organizational learning, to develop a learning model of policy
implementation. The goal is to illuminate how policy is translated into administrative
action, and to understand the role that learning plays in that process. This literature
review argues that while there is implicit reference to learning in a substantial part of
policy implementation literature, the organizational features of learning have not been
addressed systematically. The discussion then turns to the more focused perspective of
organizational learning that contributes to the model of learning through policy
implementation, or Policy Implementation Enhanced by Learning (PIEL). This
discussion seeks to not only identify the current gaps existing within these sets of
literature and justify the value of integrating them, but also to help identify organizational
conditions that promote learning and achievement of implementation objectives. Policy
related to information technology will serve as the focal area to explore this integrated
model, and so some discussion of that literature also follows.
Policy Implementation
Policy implementation theory rose to prominence in the 1970s and 1980s in an
effort to better understand how and why the implementation of government programs so
often fail and to provide tools to guide policymakers and implementers as they craft and
carry out policy (Mazmanian and Sabatier, 1983; Lipsky, 1980; Elmore, 1978; Bardach,
1977; Pressman and Wildavsky, 1973). Whether it is the development of a law by the
legislature, a policy by the executive, or a mandate by the courts, the resulting program or
other action often fails to meet initial expectations and sometimes results in strikingly
different outcomes. Pressman and Wildavsky (1984) describe implementation as
requiring an “understanding that apparently simple sequences of events depend on
complex chains of reciprocal interactions” (p. xxv); much can go wrong throughout that
process.
Bardach (1977) provides a pessimistic perspective of the process. The difficulty
of policy development, according to Bardach, pales in comparison to the challenges of
implementation for several reasons, the paramount one being that it is much easier to
compromise in the formulation phase than to commit during implementation. The
implementation process provides many opportunities to negotiate better terms or renege
on commitments altogether through a series of “implementation games.” Bardach
recognizes the importance of an implementation “fixer” who can exercise personal and
positional power over the implementation process. That power can then channel
legislative authority to amend and modify the original policy as needed in order to
expedite the implementation process and ensure that its intentionality is maintained as
well. Central to the classical model is the need for the policy originators to shepherd the
implementation process in order to ensure its effectiveness. Though Bardach recognizes
the role that different agencies and different parts of an organization can play in the
process, he generally supports a classical "top-down" model of policy implementation. [1]
This is a paternalistic perspective on policy implementation, which does not preclude
learning but largely ignores it, and certainly marginalizes the influence of the
organization in the implementation process.

[1] I borrow the term from Bishop (2003), who differentiates between a classical and adaptive approach to policy implementation theory. Both terms are used here.
This top-down perspective of policy implementation (Mazmanian and Sabatier,
1983; Bardach, 1977; Pressman and Wildavsky, 1973) is characterized by an interest in
minimizing ambiguity of both goals and means. [2] The perspective values the statutory
intent of a policy, as policymakers, and not bureaucrats, are held accountable to their
constituencies. Mazmanian and Sabatier (1983) suggest six criteria for effective policy
implementation; chief among these is an unambiguous set of policy goals, as well as
clearly articulated statutory policy aimed to achieve them. Indeed, it is argued, if there is
no clarity from the originators of the legislation, there is no way to measure the
effectiveness of the policy. Mazmanian and Sabatier (1989) define policy
implementation as understanding “what actually happens after a program is enacted and
formulated” (p. 4; emphasis is theirs). Their hierarchical orientation to policy
implementation emphasizes concerns for meeting “legal objectives” and errs on the side
of control. Such a perspective is rooted in the viewpoint that policy outcomes will not be
met if the bureaucracy is not adequately managed through the process. This principal-agent
problem [3] is overcome by planning implementation clearly at the outset, including
specificity and enforceability, in order to prevent self-interested and uncommitted
bureaucrats from failing to properly implement policy.
[2] Ambiguity of goals and ambiguity of means is a conception of policy implementation offered by Matland (1995), which is discussed in more detail below.

[3] The principal-agent theory, or problem, relates to the adverse selection and moral hazard that arise when a "principal" must rely on an "agent" to implement or carry out an interested activity (Miller, 2000; McCubbins, Noll, and Weingast, 1987; Stevens, 1993).
Conversely, the bottom-up perspective (Lipsky, 1980; Elmore, 1978) asserts that
effective implementation demands ambiguity of goals. The street level bureaucrat,
responsible for the implementation process, will be best able to understand the needs and
interests of those whom the government is serving and so must have the flexibility to
make adjustment. Where control of the bureaucracy is an abiding interest of the top-
downers, administrative discretion is essential for the bottom-uppers. In terms of
ambiguity of means, again bureaucrats need to have the discretion to make adjustments
so that the policy can be most effective. Measures of effectiveness demand adjustments
and redefinition throughout the implementation process.
While accountability and clarity in policy evaluation, as offered from the top-down
perspective, are indeed important elements within a democratic
society and in the conception of policy implementation, these qualities are difficult to
attain given non-rational policymaking behavior. Policies are often unclear—whether
due to political negotiations, incremental introduction, or the meddling of special
interests (Kingdon, 2003; Lindblom, 1973). Indeed, political dealings can result in
policies that are, in their formation, untenable from an implementation perspective. As
Sharkansky (1996) suggests, “[t]he force of ideological commitments may lead
policymakers to pursue goals that are contradictory to one another, or that are so
ambiguous as to foil implementation” (p. 517). Still, a bottom-up style, while valuing the
importance of adaptation and discretion, may place too much authority with street level
bureaucrats, which ignores or underestimates the political goals articulated in the policy
formulation stage, and, at the very least, would be inappropriate for some types of
policies (Sharkansky, 1996; Bardach, 1977; Lindblom, 1973).
As policy implementation theory has matured, there have been attempts to
balance the two countering perspectives. Pressman and Wildavsky (1984) stress the
value of the top-down conception of policy implementation, describing it as “the ability
to forge subsequent links in the causal chain so as to obtain the desired results” (p. xxiii).
While these “desired results” are delineated by policymakers, the process is understood
as a series of decision points and clearances, where individuals throughout the
implementation process, from various agencies, and at all levels of an organization can
play a substantial role. Here, the complexity of the process is related to the number of
actors involved, as well as the likelihood of unanticipated events arising throughout the
process. While Pressman and Wildavsky's approach to implementation clearly values
legislative intent over bureaucratic discretion, it also places value on the input of
implementing agencies into the process.
In the 1984 edition of their classic work on implementation, Wildavsky made
further attempts to bridge this divide. What he stressed is permitting the necessary
latitude to alter and adjust as necessary. By incorporating flexibility in the
implementation process, learning can take place. Also important is ongoing evaluation
that “should engender not a summary but a continuous learning process” (p. 227).
Further, effectiveness needs to be evaluated against both pre-set and evolving objectives,
which contributes to the complexity of evaluation. Administrators cannot be “literal
executioners and successful implementers” (p. 175), indeed “literal implementation is
literally impossible” unless policy is “narrow and uninteresting” (p. 179). Accountability
lies at one end of the evaluation continuum, with flexibility and learning at the other. That
flexibility and opportunity to learn has grown to be an essential value in their construct of
policy implementation, and indeed essential to the understanding of policy
implementation as a whole. A further discussion of Wildavsky’s linkage between policy
implementation and learning will follow.
More contemporary approaches provide alternative perspectives to bridging the
divide between top-down and bottom-up. Mead (2001), in studying welfare reform in
Wisconsin, challenges both the top-down approach to policy implementation
(Mazmanian and Sabatier, 1989; Bardach, 1977; Pressman and Wildavsky, 1973) and the
bottom-up approach (Lipsky, 1980; Elmore, 1978) with an “interactive” model of
implementation. He found that although the initial push toward reform was initiated from
the top (i.e., the state), several counties were leading the charge to reform. [4] These
counties experimented with reform plans, many of which became models for other
counties and mandates adopted by the state. Mead cedes much control to the top of the
organization, [5] which sets the agenda and monitors progress. The authority over
implementation, however, is shared by all levels of the organization, and a strong value is
placed on relationships, communication, and networks, connecting not just the top of the
organization to the bottom, but developing connections throughout the organization. His
work clearly supports the need for flexibility at the local level in order to promote
innovation and creativity, while also stressing the need for a central authority, not to stifle
local efforts, but to maintain progress towards implementation objectives. “The governor
and legislators set the agenda by defining the welfare problem and calling for change. But
then they left administrators the leeway to craft actual programs and trusted them to do
so" (p. 526).

[4] The state was also responsible for ensuring minimum compliance across all counties.

[5] In this case the top of the organization is represented by the governor and the legislature.
Matland (1995) integrates the two perspectives into a situational approach to
policy implementation in a very useful way. Acknowledging both ambiguity and conflict
as important factors shaping the implementation process, he describes four ideal type
approaches to implementation: (1) administrative (low ambiguity, low conflict), where
resources are important to implementation; (2) political (low ambiguity, high conflict),
where power is important to implementation; (3) experimental (high ambiguity, low
conflict), where contextual conditions are important to implementation; and (4) symbolic
(high ambiguity, high conflict), where coalitional strength is important to
implementation. This model assists in creating a better understanding of the
implementation process and how different types of social problems, which policies seek
to address, are solved by different approaches, and, in particular, how varying degrees of
ambiguity shape how policymakers might choose to respond.
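Matland's typology reduces to a simple two-dimensional lookup. The sketch below is purely illustrative, a reader's aid restating the four ideal types from the paragraph above; it is not part of the original study, and the string labels are paraphrases of the text.

```python
# Illustrative lookup-table restatement of Matland's (1995) ambiguity-conflict
# typology. Keys pair the degree of ambiguity with the degree of conflict;
# values name the ideal-type approach and the factor central to implementation.
MATLAND_TYPOLOGY = {
    ("low", "low"): ("administrative", "resources"),
    ("low", "high"): ("political", "power"),
    ("high", "low"): ("experimental", "contextual conditions"),
    ("high", "high"): ("symbolic", "coalitional strength"),
}

def classify(ambiguity, conflict):
    """Return (ideal-type approach, central factor) for a policy setting."""
    return MATLAND_TYPOLOGY[(ambiguity, conflict)]

print(classify("high", "low"))  # ('experimental', 'contextual conditions')
```

Framed this way, the situational point is explicit: the same policy instrument calls for a different implementation approach depending on where it falls on the two dimensions.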
Bishop (2003) explores this middle ground further, finding that in addition to
flexibility, structure plays a key role in effective policy implementation. Using the four
frameworks of understanding policy implementation as offered by Elmore (1982), she
addresses welfare reform in a single agency. She observes that beyond sound policy
development (possessing a solid understanding of the problem and how the policy will
address it), there are also structural preparations that need to be made, specifically to
reorganize “internal organizational structures” (p. 621), so that individuals and the
organization as a whole are better able to respond to likely alterations in roles and
responsibilities. This is particularly the case in implementing policy that will bring about
significant changes. Cooperation among individuals and groups within the organization
including a commitment to the policy and open communication channels are essential. A
major pitfall is the failure of managers to translate “goals into specific tasks” (p. 597).
Ultimately Bishop concludes, “the process can undoubtedly be improved by strategically
addressing organizational needs prior to major policy changes” (p. 625).
Although dealt with independently, the concepts of flexibility, organizational
structure, and learning have been found to play an important role in policy
implementation. An examination of organizational structures and processes in the
context of learning has yet, however, to be integrated into a theory of policy
implementation. Organizational learning will be more thoroughly examined as it is
integrated into a model for studying policy implementation.
The Relevance of Learning to Policy Implementation
The connection between policy implementation and learning is not new. Learning
has been linked to policy implementation in useful and meaningful ways. Learning has
been used to characterize the means by which policy ought to be adjusted and changed
through the implementation process and used as a lens through which to understand
changes in approaches of policy implementation over time. Three particular approaches
will be discussed here, all of which employ an evolutionary metaphor in connecting
learning to policy implementation. These include the work of: True, Jones, and
Baumgartner; Fiorino; and Wildavsky. The latter goes the furthest, combining some
elements of organizational learning theory and opening the door to building stronger
connections between the two literatures as this study suggests.
True, Jones, and Baumgartner (1999) offer a framework of the policy process of
which implementation is a part. They argue that policymaking in the United States can
be characterized by long periods of slow incremental change with intermittent bursts of
change that offer dramatic changes in public policy. This punctuated-equilibrium theory
for public policy is based on evolutionary theory. While it does not explicitly address
learning, it does seek to provide a means for understanding how policy change takes
place over time. As will be discussed later, however, change, while part of the learning
process, is not sufficient for understanding learning. Such change must be directed and
must involve the conversion of information inputs to knowledge.
Another approach that offers a look at policy over time is that of Fiorino (2001),
who applies a learning model to how environmental policy has evolved over time in both
the United States and Europe. He shows that environmental policy has been shaped by
learning processes that influence how policy changes and policy actors interact over the
course of decades. Technical learning in the 1970s was characterized by a narrow
definition of problems and tension among policy actors, and begot a more conceptual
learning in the 1980s. Conceptual learning opened the door to new goals, strategies, and
tools to solve environmental problems. Finally, the 1990s brought social learning, which
emphasizes communication and cooperation among policy actors and possesses a broader
ambition of sustainability. Applying a learning model suggests a more purposeful
process to policymaking, one that contrasts with more conflict-based interactions and
“stresses knowledge acquisition and use” (p. 322). Activities and individuals grow more
informed over time; however, they continue to be constrained by elements of technical
learning which remain as part of the institutional and legal framework of the policy.
In the third edition of Implementation (Pressman and Wildavsky, 1984),
Wildavsky dedicates new chapters to discussing the importance of learning to the policy
implementation process. Wildavsky’s focus on learning relates largely to policy
evaluation. In revisiting implementation, he finds it to be a mutual adaptation process,
partially retrospective—informed by learning and evaluations—and partially
prospective—guided by a solid policy “theory” and clarity of policy objectives. Further,
he critiques implementation scholars for being distracted from what was and should be
their objective—the study of the implementation process. The tendency, instead, is to
create preconditions for effective implementation in an effort to prevent failure.
Implementation is and should be fraught with failure, for it is through mistakes that
corrections can be made.
The real failure, as Wildavsky observes, is failing to learn through
implementation and thus the challenge is to help develop a “capacity to learn during
implementation” (Pressman and Wildavsky, 1984, p. 230). He further finds that learning
is generated largely through policy evaluation, which informs and should be informed by
implementation. Evaluation needs experience (generated through implementation), and
implementation requires evaluation (to learn). There is too often a disconnect between
these two processes, and, while having them work too closely can result in an inability to
do either well, failing to coordinate their activities can curtail successful implementation.
Wildavsky frames his discussion of implementation in evolutionary terms, and most often
in the context of program development. [Footnote 6: Implementation is a process that turns policies into
programs, and so evolving implementation results in a refinement of the resulting program.] In the end he
does bring to the discussion issues
of organizations and organizational learning, suggesting the value of learning to learn
(i.e., double-loop learning). Organizations, however, are discussed in the context of
conflicting organizational structures—market, hierarchy, and collegial. Ultimately,
Wildavsky stresses the importance of combining the different characterizations of
organizations to better understand how and why organizations are motivated to engage in
learning. Left absent are strategies for employing learning within organizations and an
examination of how organizations learn to adapt when policy demands are placed on
them.
Policy Implementation and Organizational Learning
While these approaches have proved useful in conceptualizing policy
implementation theory and linking this theory to learning, few, if any, attempts have been
made to connect policy implementation systematically with what is a vast organizational
behavior literature. This study seeks to fill that gap by approaching learning at a different
level—from the organizational perspective. Policies and their implementation place
demands on organizations. The nature of these demands varies with the sector in which
each organization falls and with the particular policy, but successful policy implementation rests
largely on the ability of organizations to respond to those demands. These demands may
require additional effort and energy from organizational members, but in instances where
the demands are substantial, particularly novel and/or complex, or resources are limited,
organizations face considerable challenges. It is in this area that organizational learning
becomes a helpful tool for organizations to manage the demands of implementation.
Organizational learning, under ideal conditions, is theorized to enable an organization to
become adept at change; more importantly, it enables information and
knowledge to be effectively communicated throughout the organization. Under such
circumstances, as organizational members deal with novel and complex problems, they
are able to share best practices and influence structural and informational processes to
advance organizational objectives.
While organizational learning is well established in the organizational
management literature, there has been little attempt to apply it in a policy context. The
value of approaching learning from this standpoint is to recognize the power that the
organization has in influencing the policy implementation process. Policymakers rely on
organizations to move policy initiatives forward. As can often be the case, policy may
require changes in organizational operations, processes, or even cultures. Understanding
the conditions in which organizations learn can help to facilitate those changes and
ultimately promote the effectiveness of the policies they are called to enact.
In developing the relationship between policy implementation and organizational
learning, first the connection between change and learning will be examined. The
literature on organizational learning and learning organizations will then be explored to
provide a context for how learning will be approached in this study. Finally, Schwandt
and Marquardt’s (2000) Organizational Learning Systems Model (OLSM) will be
introduced as a framework for studying organizational learning in the context of policy
implementation.
Change and Learning
Organizations are called upon to change and adapt constantly. Whether in
response to forces outside or inside the organization, change is inevitable and indeed
necessary for the survival of any organization. Yet, the ability of an organization to
change in ways that further or improve organizational functions is by no means a
certainty. The dynamic relationship between an organization and its individuals adds to
the complexity of understanding the nature of change within organizations, and
ultimately the learning that may ensue.
Decision-making underlies the change process, particularly planned change. Chin
and Benne (1976) define planned change as a “conscious” decision to apply “knowledge
as an instrument or tool” to create change—“modifying patterns and institutions of
practice” (p. 88). Organizational change, however, can be characterized in many ways,
each having its own set of assumptions regarding the impetus for change and how change
unfolds (Kezar, 2001). Van de Ven and Poole (1995) offers four overarching theoretical
models: life-cycle, teleological, dialectic, and evolutionary.
7
Morgan (1997) suggests
eight different theoretical perspectives: machines, organisms, brains, cultures, political
systems, psychic prisons, flux and transformation, and instruments of domination. The
six primary categories observed by Kezar include: evolutionary (change in response to
external factors), teleological (planned change model following a rational and linear
process), life cycle (organizational change occurs through a series of stages), dialectic
(political model where change results from conflict between values and beliefs), social
7
The teleological model is where purpose or goals guide movements. Dialectic is where the balance of
power between entities affects change. Evolutionary describes the cumulative change across society at
large.
20
cognition (change results from sensemaking and individual interests in growth and
learning), and cultural (change requires an alteration of values and belief systems).
Based on these categories, the change process described in this context is a
teleological (adaptive) and social cognitive change model. In terms of the former, it
represents planned change that, while resulting from external forces, is largely shaped and
directed in a top-down manner—either from management or policymakers beyond the
organizational level. Still, as represented by the social cognitive perspective, learning
occurs and knowledge structures are constructed at the individual level. Management
also serves to facilitate change and plays an important role in shaping the learning that
occurs via organizational structure (i.e., organizational arrangements, evaluation, and
incentives) and in communicating policymaker intent. The environment can be
considered as socially constructed and interpreted by management or other organizational
members, distinguishing this approach from an environmental perspective.
Within this context, learning is important because it enables thoughtfulness in the
change process, and, more importantly, the creation of new knowledge (Schwandt and
Marquardt, 2000). And so, organizational learning goes beyond planned change. It seeks
to create an organization that is readily able to adjust to shifts in the landscape resulting
from external and internal demands or from an interest in experimentation (Marshak,
1993; Senge, 1990). Huber (1991) recognizes organizational learning as a result of a
change in the range of potential behavior. In dealing with organizational cultural change,
Lundberg (1983) finds that “real organizational learning” is a transitional event that
includes a change across all levels of an organization. Such processes are described in
organizational terms, but rooted in the activity of individual actors. Ideally, learning
becomes the impetus for change that can improve organizational performance.
Learning in an Organizational Context
Learning is not merely the processing of information, but the creation and use of
new knowledge (Fiorino, 2001; Nonaka, 1994). This process begins at the individual
level and then spreads to the organizational level through the communication and
exchange of information or ideas. Through the lens of learning, policy implementation is
seen as valuing the role of the street-level bureaucrat and also the need for either
flexibility in policy outcomes or the opportunity to permit adjustments to policies as they
are implemented.
Learning is best understood at an individual level. Individuals take information
and experience and develop knowledge that is used to shape behavior, habits, and
attitudes, allowing for learning, in many instances, to be observable. Indeed, while the
process itself differs greatly, the characterization of organizational learning very much
reflects how individuals learn (Crossan, Lane and White, 1999; Huber, 1991). In his
evaluation of the literature, Huber (1991) suggests four constructs related to
organizational learning: knowledge acquisition, information distribution, information
interpretation, and organizational memory. In developing a framework for organizational
learning, Crossan, Lane and White (1999) characterize the process as including: intuiting,
interpreting, integrating, and institutionalizing. The difference between individual and
organizational learning, therefore, lies in the need to communicate learning across the
organization and to develop a means to manage the knowledge that has been created so
that organization members are best able to access and further contribute to it. So,
despite initial similarities, learning at the organizational level is much more complex. In
the best case, it contributes to a learning organization, one that is adaptable and responsive
to the internal and external challenges an organization must face. In the worst case, it may
merely serve as a metaphor for observed, but not well understood changes in an
organization’s activity, or as merely a collection of learning that takes place at an
individual level. Still, while individual learning is necessary for organizational learning,
it is not sufficient. Further, the issue of describing the purposefulness of the learning
process in the organizational context remains.
Argyris and Schön (1978) suggest a very simple and clear-cut means of describing
organizational learning: the process of detecting and correcting errors (p. 2). Learning
occurs when conditions change and the understandings held by organizations are tested.
Individuals maintain their own views of organizations, or images, which are always
incomplete. Maps constitute “shared descriptions” that are jointly constructed and
manifest themselves as organizational charts, operating procedures, and physical
structures. Both are modified and adjusted in response to changing
conditions, and through discourse among organizational members. Organizational
learning is also described in terms of the impact it has on the organization as a whole.
Single-loop learning enables error detection and correction to take place without
changes in the policies or objectives of the organization. Double-loop learning, on the
other hand, requires a “modification of an organization’s underlying norms, policies, and
objectives” (p. 3). Here the use of a feedback loop is crucial to the learning process.
Another perspective on learning in the organizational context is the learning
organization. While learning organizations suggest an ideal form for organizations,
organizational learning seeks to describe a process, a process which may or may not
result in a learning organization (Ortenblad, 2001; Schwandt and Marquardt, 2000;
Senge, 1994). Although the focus in this research is on organizational learning, the
literature on learning organizations is helpful in suggesting qualities and characteristics of
organizations that advance learning. Ortenblad (2002) describes elements of the learning
organization, which include: organizational learning, learning at work, learning climate,
and learning structure. Senge (1994), who emphasizes individual learning within the
learning organization context, suggests five disciplines: systems thinking, personal
mastery, mental models, building shared vision, and team learning. Nonaka and
Takeuchi (1995) develop the concept of a knowledge creating company, and emphasize
the importance of moving tacit knowledge to explicit knowledge. Among the common
characteristics of learning organization models, particularly those of Senge and Nonaka
and Takeuchi, is a strong emphasis on the role of management in facilitating the learning
process.
All these issues contribute to the complexity of understanding and studying
learning in an organizational context. While the definition of learning as offered by
Argyris and Schön is helpful, it remains quite limited and is unable to position
organizations in a place where they are able to manage change not only as a response to
different conditions, but as a result of self-reflection and with the aim of continuous
improvement. As the literature on learning organizations suggests, organizations need to
be better equipped to respond to environments and conditions that are ever changing.
Too often change is directed at immediate problems. Though this may address problems
in the short term, it establishes a poor precedent for future action and merely results in an
adaptive organization that reacts to environmental change. For a learning organization to
develop, generative learning is necessary: an organization that is proactive, with an
“emphasis on continuous experimentation and feedback” (Senge, 1990, p. 8). Further,
change should no longer be approached under a Lewinian unfreeze-movement-refreeze
pattern, where the status quo is valued more than the change process. Indeed
there is much to be learned from a more cyclical pattern that emphasizes process,
journey, and equilibrium (Marshak, 1993). This requires a change in orientation for
everyone in the organization, and management must become more comfortable with the
levels of uncertainty that will likely come with developing such a culture.
Schwandt and Marquardt (2000) offer a more useful conceptualization of
organizational learning for the purpose of this study, tying it to organizational outcomes.
They define it as “a system of actions, actors, symbols, and processes that enable an
organization to transform information into valued knowledge which in turn increases its
long-run adaptive capacity” (p. 8). This definition serves as a basis for the
Organizational Learning Systems Model (OLSM) developed by Schwandt and Marquardt
(2000), which will be used as a framework for assessing organizational learning in this
study. The OLSM is described in the next section.
The Organizational Learning Systems Model
Schwandt and Marquardt (2000) observe organizational change as a function of
learning and performance. [Footnote 8: The underlying assumption is that the change described in the
OLSM is forward moving, and so learning in the process of performing (in whatever activity the
organization engages) can lead to organizational change.] The Organizational Learning Systems
Model (OLSM) developed by Schwandt and Marquardt builds on and explicates the
General Theory of Social Action posed by Parsons. Parsons sought to build stronger
theory in the social sciences that could help to predict human action. The theory sought
to provide a way “to analyze and organize knowledge” of human interaction through
empirical investigation and a more precise understanding of the components of action
and the way they are interrelated (Parsons and Shils, 1959, p. 27). An initial premise of
the theory is that there are four processes that support the survival function of
organizations. These processes, and more specifically the activities associated with
them, can be characterized as: adaptation, goal attainment, integration, and pattern
maintenance. Further, Parsons suggests that action, and more specifically change, within
an organization is directed through two interrelated systems: performance and learning.
[Footnote 9: Performance describes the act of doing. This action describes functions that support the
purpose of the organization in measurable terms, while the learning function can inform organizations
how to improve that performance.] Schwandt and Marquardt find Parsons’ theory to be “one of
the most thoroughly developed integrative theories that emanates from sociological
perspective and incorporates biological, psychological, cultural, and sociological
concepts associated with human action” (Schwandt and Marquardt, 2000, p. 45), and so
it provides a solid foundation for the OLSM.
The learning model developed by Schwandt and Marquardt is made up of four
processes, or subsystems, based on the work of Parsons. These subsystems serve either
as an end or a means of the learning process and focus either internally or externally to
the organization. Specifically, they include: environmental interface; action/reflection;
meaning and memory; and dissemination and diffusion. Further, each of these processes
has an “interchange medium” associated with it. These media constitute activities and/or
resources that serve as outputs from their respective subsystems, as well as inputs, which
are the means by which each subsystem communicates with the others. Figure 2-1
shows each of these subsystems as a function of their focus and purpose, along with the
interchange medium (i.e., outputs and inputs) and the Parsons process associated with
each subsystem. Far from linear, each subsystem
feeds its outputs to and collects its inputs from all other subsystems, and so they influence
and reinforce one another. Each is essential in supporting the learning process within an
organization. A discussion of each of these subsystems and their respective “interchange
media” follows.
Figure 2-1. Dynamic Social Model of Organizational Learning Subsystems. Each subsystem
and medium in the learning model is identified as a function of its purpose and focus. Each cell
lists the subsystem, its interchange medium, and the corresponding Parsons process.

                     Purpose: Means                 Purpose: Ends
Focus: External      ENVIRONMENTAL INTERFACE       ACTION/REFLECTION
                     New Information               Goal Reference Knowledge
                     Adaptation                    Goal Attainment
Focus: Internal      MEANING AND MEMORY            DISSEMINATION AND DIFFUSION
                     Sensemaking                   Structuring
                     Pattern Maintenance           Integration

Adapted from Learning Subsystems and Functional Prerequisites, Figure 4.5, p. 64, and Learning Systems
and Interchange Media, Figure 4.6, p. 69 (Schwandt and Marquardt, 2000).
The Environmental Interface provides new information to the organization, which
serves as the source of energy for the organization. In terms of this subsystem, three
issues frame how information is managed: collection, environmental perceptions, and
challenges in processing it into knowledge. Organizations employ a variety of methods
in the collection of information. An organization must evaluate the quality of the
information it collects, as well as the process by which that information is collected. Daft and Weick
(1984) suggest important environmental perceptions that shape the type and process by
which organizations collect information. These include how an organization perceives its
environment (whether or not it can be analyzed) and the influence it believes it can have
on that environment (intrusive versus accepting). Finally, while new information can be
extremely valuable to an organization, information as it enters the organization has no
initial value. That value is supplied by the processing of that information in the other
subsystems. Therefore, identifying and anticipating challenges that limit that process is
essential to supporting the learning process.
Action/Reflection serves as the “nucleus of [the] organizational learning
system…assigning meaning to new information, and in so doing creates goal reference
knowledge” (Schwandt and Marquardt, 2000, p. 152). Through this subsystem,
information from within and outside the organization is transformed into valued
knowledge corresponding to organizational goals. While knowledge is important, there
needs to be a link between knowledge and action. This is emphasized here with a link
between knowledge and goals. Schwandt and Marquardt are clear to distinguish that
there is a difference between goals of the performance subsystem and those of the
learning subsystem. While both seek to promote an organization’s ability to survive and
adapt, they each seek that out in different ways. The message is that organizations need
to reflect and act on learning goals, and so such behavior needs to be articulated and
valued. Organizations need to reflect on how things are done, why things are done, and
what is done. Further, focused efforts need to be made to convert tacit knowledge to
explicit knowledge and individual knowledge to social knowledge. Other activities that
promote the function of this subsystem include the promotion of genuine dialogue,
teaching individuals to learn and reflect in the context of the organization and their job
responsibilities, and valuing the process of testing theories-in-use against espoused
theories within the organization.
Dissemination and Diffusion enables the processing and ordering of information
and knowledge within the organization, manifested in such artifacts as defined roles,
policies, and organization structure. It moves, retrieves, and captures both information
and knowledge, and so serves an important communicative function in the organization.
This subsystem is characterized as the most concrete and observable, and the structuring
medium helps to enable activity in the other subsystems. Schwandt and Marquardt
distinguish between the dissemination function, which they see as governed by formal
procedures, and diffusion, which is more informal. There is a struggle between the
formal and informal, as intended changes can be undermined by old behaviors which
reinforce values held within informal channels in the organization. So, organizational
learning requires structuring that promotes and encourages learning activities. While
strategy is an important first step, it is not just about managing knowledge; it is about
“nurturing people with knowledge” (Schwandt and Marquardt, 2000, p. 159). Further,
while organizations may approach structuring differently, promoting either a tightly
coupled or loosely coupled approach to the management of information and knowledge,
Schwandt and Marquardt suggest the value of loosely coupled systems, as they invite more
information to flow in and circulate around the organization, enabling it to better
recognize and respond to the need for change. Finally, the definition of roles is also
important to the structuring process. Here, Schwandt and Marquardt stress the
importance of the roles that leaders and managers play in facilitating and maintaining the
learning system.
Meaning and Memory, through sensemaking, enables much of the activity in
other subsystems. It is through sensemaking that goal reference knowledge is stored in
organizational memory, and, further, it enables the generation of appropriate structuring
in the Dissemination and Diffusion subsystem. Language and symbols serve as a means
for this subsystem to communicate with the others, and it is here where understanding is
reached and knowledge is stored. Walsh and Ungson (1991) suggest six means by which
organizations store memory. These include: organizational members, who are limited by
their ability to recall and articulate knowledge; culture, which includes language, shared
assumptions, symbols, stories, and values, the details of which can be altered in the
retelling; transformation processes, in which inputs are fashioned into outputs, including
routines and standard operating procedures; structures, whose roles provide social
routines that store knowledge; ecology, the physical space that organizations occupy;
and external archives, which include former employees, competitors, and such things as
reports required and maintained by outside entities. A challenge identified by
Schwandt and Marquardt is the existence of different subcultures within an organization,
and the limitation that this can pose to the ability of an organization to have free
communication and exchange.
Studying policy implementation in the context of learning guides us to understand
the organizational conditions that are most relevant and the components of the decision-
making process that are central to achieving implementation objectives. This study
seeks to explore policy implementation at this level and within the context of
implementing technology. The implementation of technology provides a unique set of
challenges within the policymaking arena, while also affording the opportunity to
examine learning at an organizational as well as at an individual level, indeed where the
organizational learning process begins. The relationship between policy implementation,
learning, and technology will now be explored in more detail.
The Case for Technology
Technology-related policies provide an interesting context for exploring the policy
implementation process, particularly at the organizational level. The interest in
technology here relates more specifically to information technology (IT) and information
and communication technology (ICT). IT can be characterized as “computing and
telecommunications technologies that provide automatic means of handling information”
(Heeks, 1999, p. 15), and is contrasted with information systems (IS), which encompass
the interplay of technology and people in the management of information (Heeks, 1999).
The more contemporary label of information and communication technology (ICT)
captures this broader understanding of technology and its use in organizations. The
development of technology from the ICT perspective will be examined, as will the
opportunities and challenges that technology introduces to the study of policy
implementation.
Technology and Organizational Change and Learning
Studying the link between technology and organizational change is not new.
Initial research in this area developed from an emphasis on the individual, which was
then built up to understand how the individual might affect structural change. Such
studies related to themes including the examinations of performance and satisfaction, as
well as user participation in development (Barki and Hartwick, 1994; Barki and
Hartwick, 1989; Foster and Franz, 1986; Franz and Robey, 1986). Further studies
attempted a more systems level analysis that looked at how technology affected structural
characteristics of an organization (Markus, 2004; Heintze and Bretschneider, 2000; Fulk
and DeSanctis, 1995; Pickering and King, 1995; Burkhardt and Brass, 1990; Jin and
Franz, 1986), but these often offered contradictory findings. Barley (1990) suggested
that technology serves as an enabler of change in organizations influenced by the roles
and role relationships of the organization. Technology, depending on the model, would
either modify or reinforce those role relationships; the former resulting in change and the
latter not.
More contemporary analyses of technology adoption and its effects on organizations,
as reviewed by Fountain (2001), suggest evolutionary mechanisms or a form of
technological determinism, both of which undervalue the role of people in the process, as
well as the rational-actor perspective, which overly emphasizes efficiency objectives.
The approach forwarded by Fountain is founded on an institutional perspective, in which
culture, norms, and roles must be understood in terms of how they interplay with new
technology. In this context, organizations are resistant to more substantive changes that
challenge their structure, thus limiting the effectiveness and efficiency gains available
through the introduction of new technologies and utilization of network systems.
The purpose here is to examine a slightly different relationship between
technology and organizational change. It is to understand how an organization and its
organizational actors manage and adopt externally directed, technologically-based change
resulting from a policy directive. In the tradition of Fountain, organizational factors are
considered relevant to how organizations respond to such demands.
Technology as a Policy Tool
Technology has served and continues to serve as a powerful tool to meet policy
objectives. Much of the importance of technology is rooted in efficiency and
effectiveness gains, which have spurred research focused on facilitating the
implementation of technology—first in the private sector (Markus, 2004; Orlikowski,
2000; Tyre and Orlikowski, 1993; Jin and Franz, 1986; Mankin, Bikson, and Gutek,
1984/85), but more recently in the public sector (Berleur and Avgerou, 2005; Garson,
2005; Haque, 2005; Vann, 2004; Alshuwaikhat and Nkwenti, 2003; Bovens and Zouridis,
2002; Haque, 2001; Heeks, 1999; Berry, Berry, and Foster, 1998; Bugler and
Bretschneider, 1993). Improvements and reform in government operations result from
such activities as increased availability and exchange of information and services to
citizen-customers and other private and public partners, automation of processes,
improved resource management, and the replacement of paper-based systems. A further
33
value is the application of technology to improve government accountability and
transparency (Kudo, 2004; Jones, Schedler, and Mussari, 2004; Heeks, 1999). These
opportunities have proven to be an attractive option for policymakers at all levels of
government, as they seek to commit resources to proven and emerging technologies.
While such a tendency seems only natural, a commitment to and utilization of technology
also provides substantial challenges. Before discussing the implementation of technology
and outlining its associated challenges, an exploration of a major federal initiative that
helped to push technology investment to the forefront of the government reform will be
offered.
In 1993, the Clinton administration pursued a grand endeavor to reinvent
government. A core commitment of this initiative included improving government’s
utilization of technology. While the reinventing movement held such bold ambitions as
to truly transform the federal government and bureaucracy, the end products were not
nearly as grandiose, resulting in, for the most part, only surface level changes to
organizations (Fountain, 2001; Thompson, 2000; Wolfe, 1999). Consistent with
conclusions drawn by Barley (1990), these changes tended to result in a reinforcement of
existing roles and structural characteristics of the organizations and bureaucracies that
adopted them. While these changes may not have been deeply felt, they were indeed
broadly felt, introducing all levels of the federal government to emergent technologies,
including electronic mail and the internet.
As Fountain (2001) suggests, this initial investment in technology is merely the
first step in the process of a more meaningful transformation. She maintains that greater
demands will be placed on government to adopt technologies that will greatly challenge
34
structural and institutional assumptions, and that those demands will come from
policymakers such as Congress. In many ways this prediction has shown true. Such
federal mandates include the REAL ID Act, the US VISIT Program, and SEVIS, a
program that will be examined further in this research. These programs are placing
implementation responsibilities on the federal bureaucracy, state and local government,
educational institutions, and even private sector entities. An emerging interest in such
technologies goes beyond efficiency and coordination to include security measures as
well (Amoore and Goede, 2005). While technology is proving to serve as a useful policy
tool, it is important to provide a theoretical basis for understanding the process of
implementing reform through technology and the challenges that are associated with it.
Implementing Technology
Not all technology reform is the same. Heeks and Davies (1999) conceptualize
four phases of reform,
10
each of which differs in what kind of outcome is sought, in how
the organization is affected, and the role of both management and IT in supporting the
process. Table 2-1 outlines these phases.
10
Heeks and Davies use of the term phases suggests a sequence. While in actuality these are more types
rather than steps in reform objectives, the use of phases is intended to suggest an evolutionary process in
technology use, and so depth of reform sought.
35
Table 2-1. Information Age Reform. Indicates phases of management strategies for
information age reform outcomes.
Phase 1 Phase 2 Phase 3 Phase 4
Reform
objective
Automation Optimization Reengineering Transformation
Change sought Changing the
technology from
manual to IT via
automation
Changing
applications by
rationalizing data
structures and
work processes
Changing the
organization by
redesigning data
structures and
work processes
Changing the
organization by
completely
transforming data
structures and
work processes
Typical IS
management
issue
Getting the
information systems
to work and stay
working
Controlling the
information
systems costs and
staff
Coordinating
information
systems across the
whole organization
Harnessing
information
systems to meet
the needs of
organizational
customers
IT’s role Supplant Support Innovate Innovate
In lay terms Efficiency: doing
the same things in
the same way but
faster or cheaper
Incremental
effectiveness:
doing the same
things in
somewhat better
ways
Radical
effectiveness:
doing the same
things in radically
better ways
Transformation:
doing new things
Taken from Heeks and Davies (1999, Table 2.1, p. 41).
This is a helpful approach for framing how organizations can be affected by the
introduction of technology. As discussed, much of the utilization of technology in the
public sector has centered around the earlier phases of reform. It is important in
considering technology in the context of organizational learning that attention be placed
on the later phases of reform as they place the most demands on organizations. While
much of the promise of technology has been to establish more networks and
collaborations within and between organizations, to facilitate the flattening of
organizational structures, and to empower, there are conditions that both support and
inhibit their adoption.
As discussed, a substantial motivation is seeking efficiency and effectiveness
gains. Factors inhibiting or slowing adoption include availability of resources and
36
underutilization by organizational members. Underutilization may be a result of a lack of
trust not only in the technology but in the implementers of the technology. The
introduction of technology can also serve as a control function, and thus can garner from
organizational members a reaction to resist or at least approach it with caution and
trepidation. For all the potential that technology and networking systems provide in
sharing information and flattening organizations, it has also been used to limit discretion
and increase the centralization of decision making (Bovens and Zouridis, 2002; Fountain,
2001; Lessig, 1999). One way in which this is carried out is through the embedding of
rules in code, and as Lessig (1999) observed, rules embedded in code govern “invisibly
and powerfully”. Further, technology delivered by new hardware or software, may lead
to the deskilling workers, where technology adopts some of the responsibilities of
employees leading to not only the potential for job loss, but a lack of understanding of
certain job functions and underlying processes. This can modify the sorts of activities
and responsibilities distributed across the organization, including the loss of discretionary
authority at the operational level and increased responsibility for authority and oversight
at the top of the organization. The ability to move forward while being conscious of the
concerns of employees, and certainly the promotion of trust, facilitates the
implementation process.
Constructing the Learning Model
This is an exploratory study, which seeks to construct a learning model of policy
implementation. It is particularly interested in the process of implementation at the
organizational level, specifically related to managers and front line employees, but also, it
37
seeks to inform policymakers at the policy design stage on strategies that can help
support effective implementation. Broadly, it asks how learning takes place through the
policy implementation process. Further, what organizational features and management
activities support organizational learning and effective policy implementation? What
aspects of organizations and their learning processes support the achievement of
implementation objectives? These research questions provide the framework for creating
the learning model of policy implementation.
Seven propositions follow that serve as the foundation to be explored in the
development of the model, and they are based on seven general organizational
conditions: inter-organizational networks; communication with policymakers; policy-goal
congruence; promotion of a culture of experimentation; clarity of roles and
responsibilities; internal communication; and evaluation and routinization. Measures of
effectiveness are also discussed. Drawn from the literatures of policy implementation,
organizational learning, and information and communication technology, it is suggested
that these conditions support the effective adoption of policy. Each of these propositions
is related to one or more of the subsystems of OLSM as offered by Schwandt and
Marquardt. Further, each proposition is advanced either by organizational features or the
activity of the organization’s management (or a combination of the two), which furthers
the ability of an organization to learn through the process of implementing technology
related policy. The propositions are summarized in Table 2-1.
In their work studying factors that support and inhibit e-government reforms,
Schedler and Schmidt (2004) identify organizational features that facilitate
implementation, as well as “soft factors”, largely related to the qualities of management
38
in an organization, which are central to supporting the adoption of technology.
Organizational features include incentives to support change, structures and processes
that enable change, and the articulation of clear strategies. So while later phases of
reform may result in changes to organizational structures and processes, the initial state
of an organization is relevant to how amenable an organization is to begin moving
forward with implementation. Management qualities include inspiring leadership, trust,
and motivation, as well as taking ownership in the implementation process. Both these
perspectives are factored into the development of these propositions
Table 2-2. Propositions and their Relation to the OLSM Model. Also suggests focus driver
of each proposition..
Proposition OLSM Subsystems Focus
Proposition 1
Network with other
Implementing Organizations
Environmental Interface Organizational Features
Proposition 2
Communication with
Policymakers
Environmental Interface Management
Proposition 3 Policy and Goal Congruency Action/Reflection
Organizational Features
and Management
Proposition 4 Culture of Experimentation Action/Reflection
Organizational Features
and Management
Proposition 5
Clarity of Roles and
Responsibilities
Dissemination and
Diffusion
Organizational Features
and Management
Proposition 6 Internal Communication
Dissemination and
Diffusion and Meaning
and Memory
Organizational Features
Proposition 7 Evaluation and Routinization Meaning and Memory Organizational Features
Proposition 1
Organizations can take advantage of the learning that occurs in other
organizations by establishing networks and exchanging information and resources with
other implementing organizations.
39
Interorganizational communication is also a valuable resource in organizational
policy learning. While not widely discussed in organizational learning literature, it is
commonly referenced as crucial to developing innovations, particularly in the literatures
of New Public Management and reinventing government (Osborne and Gaebler, 1993;
Thompson, 1999), as well as a key principle of the efforts behind the National
Performance Review. Much of this is offered in terms of best practices and
benchmarking, which is valuable, but not clearly linked to the learning process. Drawing
from the policy literature, the value of the interface among implementing agencies is
recognized. Examples include diffusion models of policy development (Berry and Berry,
1999; Lindbolm, 1973) and evolutionary policy processes (Fiorino, 2001; True, Jones,
and Baumgartner, 1999). An interest in interorganizational exchange is also popular in
the ICT literature, both in its value as noted by Robey, Boudreau, and Rose (2000) who
suggest “organizational knowledge barriers may be overcome by learning from other
organizations”, as well as in its limitations, which are discussed often in identifying the
cultural and environmental limitations of technology transfer, particularly in the context
of adopting technology in developing countries (Heeks, 1999; Petroni and Cloete, 2005;
Snellen, 2005).
Therefore, it is valuable when a network of implementing agencies emerges and is
utilized as a means to share strategies and best practices. These may develop through
professional associations, policy forums orchestrated by those overseeing policy
implementation, or just networks of individuals facing similar challenges. Further, they
may include informal exchanges or formal and structured means of discourse.
40
Proposition 2
Organizational learning requires that members are well informed of policy and
policy changes. Management should serve as a conduit of information between
policymakers and organization members.
Private sector motivation for adopting new technologies is to enhance the bottom
line. While the public sector is similarly motivated in many instances, it is a much more
cumbersome process, particularly when change is the result of a dictate from
policymakers. Efficiency and effectiveness, while they may be valued, are not de facto
products of new technology. The additional measure of effectiveness in this context is
one of compliance with the regulation or law. This perspective, of course, is supported
by the top-down approach in policy implementation (Mazmanian and Sabatier, 1983;
Bardach, 1977; Pressman and Wildavsky, 1973). An ideal, the clarity of policymaker
interest can be clouded by a variety of factors including incrementalism and political
gamesmanship (Lindbolm, 1973). Moreover, the normal conditions of uncertainty
associated with implementation at the policymaking level are heightened when dealing
with technology policy. As Fountain suggests, the implementation of technology can be
characterized as “decision making under uncertainty” (p. 89). For policymakers who are
not only several steps removed from organizational issues related to implementation, but
may be less familiar with the technical processes and the challenges that arise with its
introduction. Factoring in conditions where new and possibly unproven technologies are
implemented that uncertainty is only exacerbated further. Therefore, a necessity in
managing the implementation of technology in the public sector is maintaining active and
regular communication between policymakers or the agency charged with overseeing
41
implementation and the organizations themselves. Such communication systems ought to
be coordinated through management, and the information regularly disseminated through
the organization and its members. Such communication can keep the organization well
informed in terms of changes or modifications relating to the technology, and,
conversely, can serve to draw information back to policymakers as to adjustments or
clarifications that may be required.
Proposition 3
Organizations can better direct organization learning by maintaining congruency
between organizational goals and policy objectives.
Interest in employee motivation has long been a concern for both the public and
private sectors. Ways to induce that motivation are many, most often including such
approaches as utilizing incentives, engendering organizational commitment, increasing
job satisfaction and participation in decision making, and promoting trust. The problem
that can arise, conversely, is described as the principal agent theory. Various
mechanisms, many discussed in terms of motivation, serve to minimize this resulting
problem. Much of this relates to the promotion of congruence between the goals and
interests of individual employees and of the organization. In relation to policy
implementation, one further layer is added, which is the promotion of a clear connection
between the interests represented by a policy (and policymakers) and those of the
organization. Management, therefore, needs to clearly align organizational goals and
activities to the implementation of the policy. Those goals ought to reference specific
tasks and timelines, and ideally linked to means to motivate employee support. The
42
literature clearly speaks to the importance of articulating the organizational mission and
goals (Ruiz-Mercader, Merono-Cerdan, and Sabater-Sanchez, 2006; Dessler, 1999),
aligning the organizational culture to support objectives (Davenport, De Long, and Beers,
1998; Lundberg, 1983), building shared vision (Senge, 1994), provide clarity in purpose
and in language (Ruiz-Mercader, Merono-Cerdan, and Sabater-Sanchez, 2006;
Davenport, De Long, and Beers, 1998; Bishop, 2003), as well as establishing a rewards
system that fosters learning and cooperation (Ruiz-Mercader, Merono-Cerdan, and
Sabater-Sanchez, 2006; Chen, Chen, and Meindl, 1998; DeMatteo, Eby, and Sundstrom,
1998; Petitt, 1996; Wilson, 1995; Ferris, and Tang, 1993; Krogh, 1998; Lundberg, 1983;
Perry and Porter, 1982).
Proposition 4
Learning is enhanced when the organization possesses and management promotes
a culture of experimentation as it relates to the use and implementation of policy driven
technology.
In the learning process, experimentation has been deemed an essential tool. And
this is particularly valuable in the context of technology, which, as discussed, often is
accompanied by a high degree of uncertainty. Etzioni (1989) described “[f]ocused trial
and error” as “the most widely used procedure for adapting to partial knowledge” (p.
122). Characteristics of experimentation that have been noted include the importance of
learning by doing (Robey, Boudreau, and Rose, 2000; Hippel and Tyre, 1995; Marshak,
1993; Senge, 1990), maintaining an organizational culture that values knowledge and
innovation over hierarchy (Davenport, De Long, and Beers, 1998), importance of
43
disclosing mistakes as part of the learning process, particularly among management
(Ruiz-Mercader, Merono-Cerdan, and Sabater-Sanchez, 2006; Krogh 1998; Imai, 1986),
and the practice of testing of new techniques before introducing them for general use
(Ruiz-Mercader, Merono-Cerdan, and Sabater-Sanchez, 2006).
As discussed, these changes can increase concerns among employees. Not only in
terms of organizational uncertainty, but concerns related to performance and failure.
Therefore, it is essential that management demonstrate a clear commitment to
experimentation and foster a climate of trust and a credible commitment to such an
environment. Management provides an opportunity for experimentation and creativity in
implementing the policy. A place where room for error is permitted, and the organization
engages in an examination of practices to determine necessary adjustments or changes.
Proposition 5
The learning process is more easily achieved when the organization provides
clarity of roles and responsibilities through the implementation process.
The clarity of roles and responsibilities is important not only to the individual in
understanding her place within an organization, but also understanding what others do so
that they can serve as resources. This is particularly important as organizations manage
change (Damanpour, 1991; Porras and Robertson, 1987; Van Maanen and Schein, 1979).
In many ways this is an extension of the need for solid intra-organizational
communication. When technology is introduced that results in additional demands and
changes in organizational structure, changes in individual responsibilities can also be
anticipated. Anticipating and articulating how to deal with those changes is part of
44
maintaining the necessary “technical and organizational infrastructure” (Davenport, De
Long, and Beers, 1998, p. 51), a responsibility which rests with management. While the
initial period of implementation may result in times of uncertainty and adjustments, these
changes are articulated, documented, and distributed. Further, insomuch as a clear
delineation is necessary, it is also necessary to anticipate the need for further change,
whether due to a need to make organizational corrections or adjust to further policy
changes or clarifications. Therefore, it is necessary to construct a structure that is
flexible, indeed fluid, while still ensuring that those engaged in the change have a sense
of the role they play in the management of the implementation process.
Proposition 6
Organizations must support a well developed communication system within the
organization and among organization members to facilitate the learning process.
As discussed, communication is a key to facilitating organizational learning
(Nonaka, 1994; Damanpour, 1991; Levy, 1986; Porras and Robertson, 1987; Elmore,
1982). The importance of communication in learning suggest the need to develop
“systems that support communication and discourse” (Robey, Boudreau, and Rose, 2000,
p.125) and “multiple channels for knowledge transfer” (Davenport, De Long, and Beers,
1998, p. 54), and is further supported by a foundational components of Senge’s (1994)
model for learning organizations, what he calls “team learning”, which begins with
dialogue. As Crossan, Lane, and White (1999) suggest organizational learning involves
interpreting and integrating. A conception furthered by Nonaka (1994) who describes
45
organizational knowledge as “created through a continuous dialogue between tacit and
explicit knowledge” (p. 14).
Therefore, lines of communication within the organization are open and flow
between management to frontline workers, as well as with others in the organizational
structure that support the implementation process. Additionally, this exchange within the
organization can provide the means to create opportunities for sensemaking and
furthering organizational memory. Further, the organization ought to be amenable to
feedback that not only questions what is being done (single-loop), but why it is being
done (double-loop) (Argyris and Schön, 1978). This includes both formal and informal
modes of communication, as well as communication with departments in the broader
organization, which are supporting the implementation process.
Proposition 7
Processes of evaluation and routinization of practices are essential for
documenting and adjusting change and contributing to the learning process.
The organization engages in a regular process of evaluation and reflection
reviewing the progress in implementing the policy and the impact that it has on the
organization. The evaluation should include feedback from the various areas within the
organization and detail steps needed for moving forward. Evaluation should be
purposeful with a value placed on improvement (Hackman and Wageman, 1995; Tyre
and Orlikowski, 1993) and a willingness to change (Crossan, Lane, and White, 1999;
Senge, 1994; Cole, 1991). The work of Tyre and Orlikowski (1993) provide particular
insights for technological changes. Their cross cultural study suggested that periods of
46
intense change follow the adoption of new technologies. They find that organizations are
more successful if they are able to capitalize on that natural period of flux to further
refine and improve the adopted technology, rather than implement it and wait for a
cooling off period before modifying the technology. The episode of change, however,
ought to be followed by a period of stability where routines and standards of practice are
introduced. This routinization process is very important, and is an integral part of
implementing innovations as determined by other research as well (Berry, Berry, and
Foster, 1998), as is the process by which it is shared within an organization.
Understanding Achievement of Objectives
In considering the introduction of technology in response to a policy mandate, it
becomes necessary to determine a means of conceptualizing the achievement of
implementation objectives. Turning to policy implementation literature would provide at
least two divergent paths. From the top-down perspective, implementation success is
realized when objectives set by policymakers are met, while the bottom-up perspective
may see value in policy objectives, but call for flexibility in interpretation, and in some
cases redirection, by street level bureaucrats charged with implementation. While the
fluidity of the latter is appreciated, particularly in the context of a learning perspective,
the value of a more structured approach is also acknowledged. From the top-down
perspective, Mazmanian and Sabatier (1983) offer three “broad categories” of factors that
facilitate effective policy: “(1) the tractability of the problem(s) being addressed, (2) the
ability of the statute to structure favorably the implementation process and (3) the net
effect of a variety of political variables on the balance of support for the statutory
47
objectives” (p. 21). Such a perspective places much of the weight on the policy itself,
and policymakers, and virtually ignores the role of organizations in supporting
implementation. Approaching the issue from the organizational learning perspective
might suggest the importance of that ability of an organization to adopt a new
technology, integrate it into the operation of the organization with the ultimate value
relating to an organization’s ability to not only adapt and respond in immediate terms, but
how the process might enable successful adaptation and response in the future as well. A
further complication, however, is the challenge of technology itself. While policymaker
intentions may be well informed, the technology itself may not be well formed at all. The
technology literature, however, does provide some helpful strategies in conceptualizing
effectiveness.
The technology literature suggests the concepts of acceptance and use as barriers
to implementation (Fountain, 2001; Berry, Berry, and Foster, 1998; Kwon and Zmud,
1987). Kwon and Zmud (1987) distinguish acceptance of an information system, which
is attitudinally based, with the actual utilization of technology. The relationship between
acceptance and use is understood when combined with the key implementation problems
identified by Van Meter and Van Horn (1976). These are: dispositional problems, where
implementers elect to not respond to policy demands; communication problems, where
implementers are unclear what policy requires of them; and capability problems, where
implementers lack the ability to do what the policy demands. Acceptance relates to the
dispositional problem, while use relates to communication and capability problems.
Considering these barriers helps to construct an understanding of achievement.
These barriers fall into two categories: perception and execution. While acceptance of
48
the policy as legitimate is a component of perception, it also relates to what organizations
and organizational members perceive to be the objectives and functions of a particular
policy and the technology associated with it. Execution, on the other hand, relates to
activity undertaken by the organization that directly interfaces with the technology and
activity that can be measured as contributing to meeting policy objectives. Table 2-3
provides a summary of these measures.
Table 2-3. Understanding Achievement of Implementation Objectives.
Themes General Measures Specific Activities
Perception Identified Objectives
Acceptance of Policy as
Legitimate
Articulated objectives of policy and/or technology
Observed value in policy objectives
Long term value of technology
Execution Use of Technology
Measurable Outputs
Compliance with Policy
Objectives
Goal attainment (from a policymakers perspective)
Efficiency
Dedication of resources
Able to gain or has gained clarity on policy
Managed culture shift to accommodate technology
policy
Perception of Policy and Technology
Perceived usefulness has been recognized as a limitation on acceptance and
adoption of technology (Venkatesh and Davis, 2000; Davis, Bagozzi, and Warshaw,
1989). Part of the issue is that organizational technological capacity is limited by
individual actors, and thus “[o]rganizations rarely use the full capacity of their
information systems” (Fountain, 2001, p. 88). Therefore different organizations may
adopt and use similar technology in different ways (Markus, 2004; Fountain, 2001). The
perceived understanding of the policy and its associated technology is a key to seeing the
basis from which organizational members view a particular policy, and can strongly
shape their commitment to it.
49
Ultimately, while acceptance is not itself a sufficient measure of policy
achievement, those charged with the responsibility of implementing policy must find
some level of value in the policy objectives, if not the technological means by which
those objectives will be met. Further, as can be the case with technology, short term
challenges must be overcome to realize long term gains. Legitimacy enables those
charged with implementation to tolerate that short term. Barring acceptance, the
implementation process is likely to encounter increased levels of resistance.
Execution of Policy and Technology
A more practical measure is that the technology is being utilized appropriately
and regularly as required by statute or policy. This theme captures more traditional
understanding of success, including goal attainment from the perspective of
policymakers, as well as efficiency. While it might be considered straightforward, use
and adoption of technology can be elusive measures. Beyond the training that might be
required for utilization, several other factors emerge. Issues of resources, particularly
when policies are unfunded or underfunded, can be quite complicated, specifically in
dealing with emerging technology. Policymakers may not be aware of the resources
required and the implications that the policy might have on organizations. When
resources are not available or are limited, the utilization of the technology may be
limited. Clarity of policy is also relevant and relates to the communication problem
noted by Van Meter and Van Horn (1976). Organizations must have some sense of
clarity and be provided with the means by which to clarify the objectives of policies and
to share challenges encountered through the adoption of the technology or that are due to
50
the technology itself. Finally, the issue of capability suggests the need for organizations
to overcome the disruptions and changes that may result from technology adoption. This
is the most compelling measure of use, and success for technologies may require
organizations to undergo dramatic cultural shifts and require organizations to re-examine
the process and means by which it operates.
Conclusion
While previous attempts to examine policy implementation have included
learning-related interests, this study goes further, seeking to integrate these literatures at
the organizational level, a key unit of analysis in understanding the learning process. It
seeks to do so by identifying organizational conditions which support learning as
organizations implement new policy-driven technologies, and by examining whether the
presence of learning facilitates the achievement of implementation objectives. Also
considered is how the management of organizations, or even the policy itself, can support
more effective implementation of policy. Seven propositions have been offered, which
will be used to analyze whether a learning process has unfolded at the organizational
level. These include: inter-organizational networks; communication with policymakers;
policy-goal congruence; promotion of a culture of experimentation; clarity of roles and
responsibilities; internal communication; and evaluation and routinization. The initial
assumption supports the need for flexibility and values the important role that street-level
bureaucrats play in informing effective policy implementation, as well as the overall
importance of supporting learning in order to manage change. The study will look for
these organizational conditions in two nested cases comparing two distinctly different
kinds of technology-based policy, as a means of working towards the development of a
learning model of policy implementation.
Chapter 3: Research Design and Methods
Introduction
This section will outline the research design and methods employed for the study,
beginning with a discussion of the site and cases that provide a framework for the study
and the unit of analysis that was utilized. Thumbnail sketches of the two policy areas
under study will then be provided, as will distinctions between the selected policies.
Each of the propositions and implementation objectives will then be operationalized, and
the various means by which evidence was collected and analyzed will be articulated. The
chapter will conclude with a discussion of the limitations and challenges of the design
and methodology.
The approach that was employed for this research is a case study with embedded
units of analysis, which focuses on implementation at the organizational level. The case
study examined two technology related policies being implemented at the same
university, exploring the utility of the model developed as articulated through the seven
propositions. The case study approach not only provides the richness necessary for an
exploratory study, but is also an appropriate approach for both policy implementation and
organizational learning research.
Williams (1982) stresses the importance of making policy implementation
research useful for policymakers and implementers, a sentiment echoed often in the
policy implementation literature (Mazmanian and Sabatier, 1989; Pressman and
Wildavsky, 1984; Bardach, 1977). Although analysis of implementation often focuses on
actors operating on several different layers, involving the legislature, administration,
agencies, and other relevant participants (Forinio, 2001; Mazmanian and Sabatier, 1989;
Pressman and Wildavsky, 1984; Bardach, 1977), other approaches emphasize the need to
focus on the street-level bureaucrat (Lipsky, 1980; Elmore, 1978). Elmore (1982), in
describing backward mapping, advocates starting with “the lowest level of the
implementation process,” and then as the analysis works back up the system the
researcher asks: “What is the ability of this unit to affect the behavior that is the target of
the policy? And what resources does this unit require in order to have that effect?” (p.
21). While his interest in prescribing resource allocation advice to policymakers led him
to subsequent levels within the system structure, the interest of this study is to examine
more closely implementation at the organizational level to better understand
organizational conditions which serve to implement policy.
Both policy implementation and organizational learning research provide a rich
tradition of case studies. Case studies are useful because of the complex and procedural
nature of organizational phenomena. A study of policy implementation must relate not
only the policy, but also the actors involved, their motivation, and the progress (or lack
thereof)
achieved over time. It is this story that is captured through the case. Similarly,
organizational learning seeks to relate a process. This process involves a variety of
inputs, which result in outcomes that suggest learning. Case studies provide the
opportunity to capture the richness and depth necessary to further the understanding of
the learning that takes place. Thus, case studies provide the contextual basis necessary
for a study of this type, a study that is largely exploratory in nature.
Williams (1982) offers a thorough examination of “methodological and
administrative issues” for implementation studies. The chapter by Yin is particularly
useful for its comparative analysis of “exemplary” implementation research. His study
provides several “methodological lessons” which suggest best practices for evidence
collection and analysis. Yin found highly structured discussions to be an incompatible
strategy for collecting evidence, while unstructured discussions were much more
common and useful. Direct observation and reviews of documents and news reports were
also frequently used strategies, but the best approaches attempted to triangulate findings
by seeking
two or more sources. In terms of analysis, exemplary implementation studies suggested
the importance of pre-analysis work to ensure that the study is well focused, particularly
for multi-case studies. There is also a tendency to construct an implementation story by
linking the facts of the implementation process. Further, Yin suggests that the most
important step in the analysis is explaining why implementation occurred at all, through
the process of testing alternative explanations.
In the terms offered by Yin (2003), the methodology of this study involves a
single site with nested cases, addressing multiple service areas. Further, this study is both
“exploratory” and “descriptive” (Yin, 2003, p. 15) in its attempt to better understand how
learning takes place through the policy implementation process, as well as to identify
conditions of an organization that are most relevant to facilitating the ability of an
organization to meet implementation objectives. Still, it also seeks to serve a normative
end in providing lessons as to what should be done within the context of organizational
structures and processes to promote better implementation. Such insights will prove
useful to both policymakers and implementers.
Site, Cases, and Unit of Analysis
While policy implementation can encompass a broad range of fields and sectors,
this study narrowed the parameters of the analysis in two specific ways. First, the study
examines the implementation of policy-driven technology (PDT). This certainly can limit
the applicability of the findings to other policy areas, but technology is an important
policymaking tool and, as has been discussed, is quite relevant for implementation studies
of this nature. Second, the case centers on policy within a higher education setting.
Again, such a setting does create limitations on the broader application of the findings,
but the area is an interest of the researcher and the variance in policies does provide a
broader perspective on policy learning.
The single site for the study is a large public university. The nested cases are two
policies: the federal monitoring of international students through the development of the
Student and Exchange Visitor Information System (SEVIS); and the integration of online
education into an academic department’s curriculum. The units of analysis for the study
are the departments charged with the implementation of policy. In the case of SEVIS, the
implementing organization is the international education office (IE).[11] In the case of
the online course technology, the academic department (AD) integrating online education
into its curriculum will serve as the unit. Both policies are examined as single units.
Further, because the interest was in studying the learning process over the course of
implementation, the study design is post hoc. Organizations, therefore, have completed
most of the implementation of the policy, and have done so with some measurable
successes.

[11] The IE coordinates the campus response to SEVIS in cooperation with other campus
units, as will be discussed later. The IE, though, serves as the lead department.
The two different policies provided for an interesting contrast. While they are
related in their reliance on technology as the policy tool and in their implementation at
the same university, the policies themselves are distinct in function and in the mandating
bodies directing the implementation. Further, the departments, while both inside the
same institution, are quite distinct in form and character. What was sought
through this study is analytic replication. By comparing two structurally unique policies
and the manner in which organizations approach and learn to implement them, the study
hopes to provide a broader perspective on how organizations learn and what mechanisms
help drive more effective implementation of policy. This perspective, while reached
within the confines of an institution of higher education, should ideally have bearing
beyond the ivory tower.
In selecting an appropriate site for study, the researcher sought to provide not a
unique case, but a representative institution. As such, it was a fairly typical setting which
shared commonalities with a broad range of universities that are similarly situated. It
must be acknowledged that the university that was selected is one with which the
researcher has had an affiliation. Such a selection was made cautiously in an effort to
limit both the potential for researcher bias and the perception of its selection as a
convenience sample.
The justification for the selection was twofold: the characteristics of the university and
the access afforded to the researcher. In terms of the latter, in selecting this institution the
researcher was afforded a greater degree of access to both individuals involved in
implementation and supporting documentation, which served to enhance the quality of
data generated and the overall findings from the study. Such access is particularly
valuable for single case study designs (Yin, 2003). It is important to note that the
researcher is not, nor has ever been, a member of either department, nor was he directly
or indirectly involved in the implementation of these technologies or in the formulation
of the campus response to these policies.
More important, however, are the characteristics of the university that lend
themselves well to a representative case, the findings of which can have wide
applicability. These include the profile and demographics of the campus, the technology
infrastructure available and offered to departments at the university, and, finally, qualities
specific to each of the policy cases. In terms of the profile of the institution, it is a large,
diverse public university in the southwestern United States. The campus awards degrees
at the bachelor’s and master’s levels, as well as a doctorate in education, and has a
headcount which exceeds 30,000 students (full-time equivalent enrollment is around
25,000). In terms of ethnicity, the student body is a majority minority, where no single
ethnicity maintains a majority of the enrollment.[12] The average age of students is 24
and nearly 60% of the students are female. The campus is situated in an urban/suburban
setting with a large commuter student population. The institution also has branch campus
facilities, but their contribution accounts for less than 10% of overall enrollment.
Further, the university is part of a large state system, which provides both support and
direction in managing institutional policy.

[12] Cumulatively, White, Latino, and Asian/Pacific Islander students make up over 80%
of the student population.
In regards to technology, the university has moderate to strong support for
technology and technological training, which was critical for this study. Institutional
support of technology is important because departments should not have been prevented
from meeting implementation objectives by a lack of institutional commitment or
resources. Any technological limitations experienced by a department should instead
have been the result of a lack of departmental interest or individual sophistication. While
such a commitment may not result in boundless institutional support, some resources
ought to be available. Since both policies are being implemented at the same institution,
technological infrastructure will be held constant.
Certain characteristics of the SEVIS case were also considered relevant in its
selection. The department implementing SEVIS also served as a representative case.
Relevant elements included a moderate population of international students,[13]
motivation and interest on the part of the university in meeting the federal policy, and
access, although not unlimited, to resources to meet the federal demand. The university
has also been working to meet SEVIS compliance since 2003, when the system became
mandatory.

[13] The university has maintained an international student population of between 4 and
5%.

The online education technology case is more complicated. In selecting this site,
important characteristics included a department that maintained an aggressive stance in
expanding its course offerings. While this included some support from the department
faculty responsible for implementation, it is important that elements of resistance among
implementers remained. This is necessary because the interest of this study is to
understand how organizations respond to policy demands placed upon them. Some level
of resistance in both the acceptance and use of the technology is needed. This is clearly
the case in dealing with federal mandates, such as that offered by the SEVIS case, and for
the sake of sound construction of this study, it needed to be an element of the online
education adoption as well. Moreover, the need for a department that has aggressively
pursued the expansion of online course offerings is also important, because the study
looked at a situation where some level of success has been reached. Success in the initial
selection was merely the ability to provide a substantial increase in online course
offerings. The study then provided a better understanding of the success in terms of how
effectively the department has done so, both in faculty management of the technology, as
well as in the student experience in the course. Selecting a department based on this
initial limited view of success may be criticized for introducing an inherent bias to the
case, but it is necessary because, as noted, this is a post hoc study and the implementation
of technology is what will be measured. Through the implementation process, a
department that has greatly expanded its repertoire of online courses will likely have
experienced failures, or at least challenges, through the process. Those instances offer
opportunities for learning to take place.
Policy Areas
Before further detailing the structure of this study, some background is offered on
the policies and organizations charged with their implementation. Both policies, as
noted, are being implemented at the same large public university in the southwestern
United States, and provide contrasting features—chiefly that one has an administrative
focus and the other an instructional focus—which serve the development of a robust
model of policy implementation. Each section will provide details regarding the
policy, as well as information on the department that was studied.
Policy Case 1: Monitoring of International Students
The first policy is a federal response to concerns regarding how international
students are monitored and tracked while studying at educational institutions in the
United States. Under a new federal reporting system called the Student and Exchange
Visitor Information System (SEVIS),[14] educational institutions are called to provide
detailed information and regular updates on the status of all enrolled international
students studying in the United States. The originating policy sought to utilize internet
technology and a centralized federal database to provide current data on international
students.

SEVIS is an online reporting system operated through the Student and Exchange
Visitor Program (SEVP), which is under the Bureau of Citizenship and Immigration
Services (USCIS), with the Bureau of Immigration and Customs Enforcement (ICE)[15]
serving in an active monitoring and enforcement role. The predecessor to SEVIS was
CIPRIS, a pilot program[16] authorized by the Illegal Immigration Reform and Immigrant
Responsibility Act of 1996 (IIRIRA). IIRIRA followed the first attack on the World
Trade Center in 1993, in which one international student was found guilty of participating
in the bombing (ICE Report, 2002). Although the initial legislation gave no deadline for
full implementation, the events of September 11, 2001 resulted in renewed interest and
commitment to the program. The USA PATRIOT Act of 2001 and the Enhanced Border
Security and Visa Entry Reform Act of 2002 modified IIRIRA and set SEVIS into
motion. Appendix A features a timeline of events and legislation relating to SEVIS.

[14] SEVIS was fully implemented on August 1, 2003.
[15] Both USCIS and ICE are part of the Department of Homeland Security. The
Immigration and Naturalization Service (INS) oversaw SEVIS implementation until INS
was dissolved in March 2003.
[16] The pilot study was conducted from June 1997 through October 1999.
The program enables the federal government to centrally maintain data on all
international students studying in the United States and to collect, on a regular basis, data
on their location and status. Most educational institutions must obtain SEVIS certification,
including universities, colleges, technical and trade schools, as well as language
programs. In order to have visas issued to international students admitted to their
respective campuses, institutions must participate in SEVIS. Participating in SEVIS
requires that schools apply to the federal government, a process that includes an
application fee,[17] a site visit, training on the SEVIS system, and adherence to the
reporting deadlines set forth. Although no special equipment is required to access or use
SEVIS beyond a computer and internet access, the process is quite demanding on
campuses, involving the entering of data and the challenge of regular monitoring of
students in accordance with nationally established schedules and deadlines. This
challenge is exacerbated as the number of international students increases. A market has
developed for software that interfaces between the federal reporting systems and the
databases at educational institutions in response to the needs of campuses to extract and
process large amounts of data on a regular basis. Such batch processing systems have
become a necessity for campuses with sizeable international student populations, but
they also increase the complexity of technology use, as they create the need to maintain
data integrity across multiple systems.

[17] Fees are currently $230 for the application and $350 per campus on the application.
Department Profile
The department under study includes a staff of over a dozen individuals. While
all are familiar with SEVIS, only about half work regularly with the program. The
international student population typically includes over 1,400 students—between four
and five percent of the overall student population, which exceeds 30,000. Most
international students enter the university as transfer students from community colleges.
Implementation of SEVIS has required the addition of staff, as well as the reassignment
of duties for existing staff. Although one person plays a lead role as SEVIS coordinator,
several staff in the office are involved in the week-to-week management of the system,
including collecting information from students and inputting information into the system.
The office must also coordinate with other offices that process separate components of
SEVIS (e.g., visiting faculty, students in short-term language study programs).
Collecting and disseminating information on a regular basis has been, and continues to
be, necessary. The department works closely with the campus technology unit to support the
campus link to the federal database, including the support of the batch software system
that aids in the interface of the campus student record database to the federal database.
Some types of interface are shared among various users, lending themselves more to the
learning process, while other responsibilities rest with one or two members of the
organization.
The department plays an active role in the association of international educators
(NAFSA),[18] which has emerged as an active and central participant in the
implementation of the policy and the development of the technology tool. This
organization provides expert testimony at congressional hearings, maintains valuable
information on its website, assists in the policy interpretation process, provides forums
for discussion at regional and international meetings, and provides additional training
opportunities for members and member institutions. Members of the department attend
these annual meetings, and seek assistance and support from NAFSA.

[18] NAFSA was formerly the National Association of Foreign Student Administrators.
As the phrase “foreign student” dropped out of use, NAFSA has held onto its acronym
but no longer uses its full name.
The department lends itself to a study on organizational learning for several
reasons. These include the process of adapting to new technology and systems; the
impact that the technology has had upon the overall operations of the organization; and
the evolution of the policy over time. First, this new technology requires users to not
only be able to navigate the system, but also be aware of new processes and procedures.
The adoption of SEVIS has required the department to move from a paper-based system
of maintaining international student records to an online system. The system requires
additional information and updates throughout each term, as a means of verifying that
students remain in good standing. Second, it has resulted in a fundamental change in the
nature of the work performed by the organization. With the onset of SEVIS, the
international education offices have undergone a shift in the type of activities that they
are engaged in with international students. There has been a shift from advising and
counseling of students to enrollment management and policy education and enforcement.
Such a shift in the focus of an organization can contribute to resistance to the adoption of
the technology, but such change can also require a change in the culture of an
organization. Finally, as noted, the policy has evolved over time, despite its
short life span. This reflects both the effects of unintended consequences, as well as the
organized pressure that educators placed on the federal government. These unintended
consequences are the result of unexpected policy outcomes and technical problems with
the technology. The organized pressure illustrates the presence of a feedback loop in the
implementation process, which has aided in the implementation process.
Policy Case 2: Online Education Policy
The second policy relates to the development and expansion of online course
offerings in an academic department, and provides an interesting contrast to the federal
program that was just discussed. Before examining the dynamics of the department being
studied, attention will be turned to the policy governing the implementation process.
There is increasing interest in the development of online courses and degree
programs. The motivations for colleges and universities to offer them include increased
enrollment and revenue, as well as convenience and student access (Larreamendy-Joerns
and Leinhardt, 2006; Tallent-Runnels, Thomas, et al., 2006). The notion of access can be
characterized as a movement towards the democratization of education, and, indeed, there
are even examples of open access to instruction.[19] While the expansion of online
course offerings remains largely an interest of administrators, evidence indicates that it is
gaining support among faculty. Academic freedom may permit much flexibility to
faculty in the development of their courses, but concerns remain that online education is
a diminished educational experience. As a form of distance education, it carries the
stigma of diploma mills, which lack academic rigor, so efforts to monitor and maintain
quality in online education are growing. Still, indications of the mainstreaming of online
courses abound, including the recent consideration by the US Senate of lifting
restrictions on federal aid to institutions that provide more than half of their instruction
through distance learning.

[19] Access is made available to course content and materials, but not to degrees.
Online course content and structure is managed and governed in a variety of
ways. Educational institutions may utilize it as a tool to meet an institutional priority
related to access. Thus, it may be a value communicated within an institution’s
mission. Policies may exist at the institutional and even departmental level, which speak
to what characterizes an online course, how such courses should be identified, the rights
and responsibilities of students and faculty as they pertain to online courses, as well as
the process for adopting new courses and degree programs. Regional accrediting agencies
also provide policy guidelines for adopting online courses.[20] Recent pushes for
increased accountability among institutions of higher education[21] suggest an increased
likelihood of interest in and attention to more explicitly stating how quality in instruction
can be maintained in an online teaching environment. This combination of increased
interest in online education and accountability for achieving measurable learning
outcomes suggests that online education policy will continue to grow in its
sophistication.

[20] An example of a policy statement from the Western Association of Schools and
Colleges, the accrediting body of which the University of Southern California is a
member, is Good Practices for Electronically Offered Degree and Certificate Programs.
[21] An example is the Secretary of Education's Commission on the Future of Higher
Education (2006).

Department Profile

The department selected for the study of the online education policy case has
experienced a tremendous growth in its online course offerings. Over the past two years
it has grown from one to 17 approved online courses with more than 13 additional course
proposals under review. The department chair has been a strong driving force in the
implementation, and his motivation has been largely to grow enrollment in the
department, both in terms of overall enrollment and increases in students in the major.
Full-time faculty members have been provided with release time to prepare online
versions of existing courses. Of the seven full-time faculty members, five have taught
online courses, with the majority of online courses being taught by part-time faculty.
Additional faculty members have participated in the generation of online syllabi, but have
elected not to teach the courses.
While the organizational structure for an academic department is rather flat, and
organizational members may develop courses rather independently, the development of
online courses operates differently. First, because courses rely on the use of emerging
technology, resources external to the organization are needed to support implementation.
In the case of this department, these include support primarily from the instructional
resource center (IRC) and the distance and extended education center (DEEC).
Additional support is available from the system-wide office,[22] as well as through
faculty outside of the department. Closer to home, all faculty involved in teaching online
courses meet regularly to share ideas, strategies, and best practices, and to discuss policy.
It is this venue in particular which can help to foster organizational learning.

[22] The public educational institution is part of a university system.

In terms of policy, the department has utilized a diffuse set of policies from the
campus and its university system to guide the development of its online courses. As the
body of courses has increased over the past semesters, the department faces the standards
set by the regional accreditation agency, the Western Association of Schools and
Colleges (WASC). The WASC policies are more developed than the campus policies,
but also continue to become more refined.[23] The department is also motivated to pursue
“transformational” change through WASC, a policy change to convert a regular degree
program to an online degree program, the requirements of which will likely be adopted
by the campus. Ultimately, the policy directive to pursue the advancement of online
education in the department comes from the chair, who has identified online education as
one of several strategies to address an enrollment issue within the department.
Policy Case Contrasts
While the university context and technological focus of these policies make them
similar, these two implementation cases have been selected to provide analytic contrasts.
At the onset of the study, these contrasts are drawn in two areas, those relating to the
organization and those relating to the policy. These contrasts are outlined in Table 3-1.
Organization elements include those charged with implementation, the policy
source of motivation for organizational members, and the function that the technology
serves within the organization. For SEVIS, implementers of the policy are staff
members, while for online education, implementers are primarily faculty. It was
expected that faculty might be more resistant to adopting the new technology, particularly
tenured faculty. Comparing cases that include both populations of implementers can
suggest strategies that may have wider applicability. While the motivation within these
populations may differ, the source of motivation based in the policy does so as well.
SEVIS provides an example of a compulsory policy, where compliance is required.
Failure to comply is a violation of federal law and, for organizations, can result in loss of
the ability to enroll international students. As to the function of the technology in the
organization, for SEVIS the activity is administrative in function, providing a system for
information and data management. For online education, the function is academic in
nature. The latter suggests a more complex environment, in which the technology is also
likely to be implemented independently, requiring less coordination. This also raises
concerns regarding compliance and the monitoring of performance.

[23] Current WASC policy applies only to departments that are half or more online, a
standard that the department does not yet reach, but is close to, depending on the precise
interpretation of the policy.
Table 3-1. Policy Cases in Contrast.

Organization/Policy Element | SEVIS | Online Education
Charged with implementation | Primarily staff | Primarily faculty
Motivation | Compliance; punitive based | Incentive based
Technology function | Administrative, information management | Academic, teaching
Interface with technology | Distributed use, shared | Individual use
Source of policy | External (federal) | Mixed sources, but driven by organization management
Specificity of policy | Very structured and specific | Clear goal, ambiguous means
Maturity of policy | Developed, but continues to be refined | Emerging policy with more specificity anticipated
Technology reform objective (Heeks and Davies, 1999) | Optimization | Reengineering
Ambiguity-Conflict Model for Policy Implementation (Matland, 1995) | Administrative | Political
As for the nature of the policy, the distinctions are even greater. SEVIS provides
a clear example of an external source of the policy, originating from the federal
government and articulated in several federal laws with great specificity. Online
education, by contrast, is a more general policy subject to greater interpretation and
originating from a mixed source, with the campus articulating some policy guidance and
a more detailed policy, though still not as specific as SEVIS, originating from its
accrediting body. The maturity of the policy in the context of its implementation also
varies.
While it continues to be refined, SEVIS is more developed and the organization has been
involved in its implementation since August of 2003, when institutions were first
required to have begun reporting to the federal government, and earlier as they prepared
for that compliance date. The online education policy dates back only to November
2006, and the department's efforts at expanding its implementation of the technology on
a broad scale date back only two years. Further, over that time, the policy has
continued to grow more refined, creating challenges for the organization as it needs to
anticipate, as well as seek to influence, the refinement process.
The policies can also be distinguished using models of technology reform and
policy implementation. Heeks and Davies (1999) provide a technology reform model,
within which they suggest several distinguishing processes by which technology reform
can take place. Using their framework, SEVIS is best
understood as an optimization reform, requiring change that seeks to do “the same thing”
previously done by the organization but in “somewhat better ways” (p. 41). That “thing”
in this instance is managing the records of international students and the “better ways”
means providing the information through shared systems with the federal government so
that it is instantly available. Online education, on the other hand, is an example of a
reengineering objective, where the reform interest is one that seeks to do “the same thing
in radically better ways" (p. 41). The "thing" here is of course instruction. "[R]adically
better ways" in this use of technology is better stated as radically different ways,
suggesting the dramatic difference in teaching in an online environment.
Finally, as discussed previously, Matland (1995) developed an ambiguity and
conflict model for policy implementation. These policies provide a great contrast in this
regard, with SEVIS serving as an example of "administrative" implementation, where both
ambiguity and conflict are low, and online education serving as "political" implementation,
where ambiguity is low but conflict is high. A technology policy provides an interesting
example to which to apply this perspective on implementation, for while the intent of the
SEVIS policy has presented little ambiguity, the details of the delivery system have
offered some room for ambiguity that has created challenges beyond merely resources, as
suggested by Matland. As for online education, the characterization of the policy as
political suggests a struggle within the department to truly accept online education as an
important means of delivering instruction, but one that has been overcome.
Defining Constructs
As noted, this study seeks to work towards the development of a learning model
of policy implementation. Such a model is explored in the context of three key research
questions:
- How does learning take place in the policy implementation process?
- What features of an organization support organizational learning, and how do managers support organizational learning?
- What aspects of organizations and their learning processes support the achievement of implementation objectives?
Learning
As stated earlier, organizational learning is defined as “a system of actions, actors,
symbols, and processes that enable an organization to transform information into valued
knowledge which in turn increases its long-run adaptive capacity” (Schwandt and
Marquardt, 2000, p. 61). Learning, in this context, is conceptualized through the
Organizational Learning Systems Model (Figure 1) and is understood as four interrelated
processes, or subsystems, all of which play a key role in facilitating the learning process.
Therefore, in examining the learning process through this study, the expectation is that
each of these subsystems must be active and healthy in order for the organization to learn
effectively. These subsystems thus serve as a framework for understanding the
learning process. Each of the propositions is seen as related to at least one of these
critical subsystems, and at least two propositions will serve to capture the presence of
activity in each subsystem.
Organizational Conditions
The following discussion articulates each proposition and the types of evidence
and issues that indicate activity in that regard. In beginning the discussion of collecting
evidence regarding these propositions, however, a few assumptions, which underlie this
research, should be stated. First, it was assumed that organizations that are equipped to
learn are better able to implement technology related policies. Second, it was assumed
that organizational member participation in decision making processes at the organizational
and policymaking levels improves overall implementation. These beliefs, while crucial in
conceptualizing the study, were challenged to the extent possible in its execution.[24]
Table 2-2 is provided once again for your reference.
Table 2-2. Propositions and their Relation to the OLSM Model. Also suggests focus driver
of each proposition.

Proposition | OLSM Subsystems | Focus
Proposition 1: Network with other Implementing Organizations | Environmental Interface | Organizational Features
Proposition 2: Communication with Policymakers | Environmental Interface | Management
Proposition 3: Policy and Goal Congruency | Action/Reflection | Organizational Features and Management
Proposition 4: Culture of Experimentation | Action/Reflection | Organizational Features and Management
Proposition 5: Clarity of Roles and Responsibilities | Dissemination and Diffusion | Organizational Features and Management
Proposition 6: Internal Communication | Dissemination and Diffusion; Meaning and Memory | Organizational Features
Proposition 7: Evaluation and Routinization | Meaning and Memory | Organizational Features
Proposition 1: Network with other Implementing Organizations
Here, the interest is in understanding what sources of information about
implementing the PDT were available from other implementing entities. These entities
included organizations and individuals at other universities or collections of individuals
managing implementation, such as professional organizations. The presence of such
entities, as well as the access and frequency of contact that organizational members have
to them are relevant. Collection of evidence took place through interviews and document
reviews. Again, relevant questions included how often such networks were accessed to
address problems or to provide direction in the general management of the technology.
Such information was disseminated within the organization itself, which is relevant to
the internal communication proposition.

[24] Indeed, the selection of the online case has challenged this notion, for in many ways it represents a top-down approach to implementation with no broad consultation with the faculty at the onset.
Proposition 2: Communication with Policymakers
This proposition addresses the need for information specifically pertaining to the
policy. That information flowed through a variety of sources, including policymakers,
agents of policymakers, or other external entities. Relevant issues relate to how quickly
information was received and relayed through the organization. Examples of this
information flow were most pertinent at the onset of the policy, but occurred throughout
the implementation process for the cases being examined here. Document review and
interviews served as a means to collect evidence with the latter providing for an
opportunity to conduct perception checks between organization management and
employees. Also relevant was the access that organizations, and management in
particular, had to policymakers and the extent to which they could effect change in
policy.
Proposition 3: Policy and Goal Congruency
The first step in determining consistency between organizational goals and policy
objectives is the clear identification of each. While implementation objectives should be
clear, it is important to verify that there is a consensus among organizational members of
what those objectives are. It was also useful to gather the perspective of
policymakers or their agents on those objectives, particularly in those instances where
policy ambiguities exist. Assuming that organizations had established goals, it was not
only important to examine the goals themselves, but also to understand the process by
which those goals were identified, shared, and assessed. This process was not only
relevant to this proposition, but to others as well, in particular that of internal
communication and evaluation and routinization. Issues of department and institutional
commitment to implementation were important, as was how the obligation of the PDT
matched and impacted the overall purpose and mission of the organization. Here again,
interviews and document reviews served as the means for collecting this data.
Proposition 4: Culture of Experimentation
Of particular interest here is how risk averse the organization is and whether
creative approaches are encouraged. Relevant issues include the general receptiveness of
the organization to novel approaches and ideas, as well as how it has reacted to failures
in the implementation process. The tolerance of and response to failures on the part of
management may suggest not only how they approach risk, but also how they support
experimentation. Interviews were the most direct manner by which evidence was collected.
Specifically, the study identified a number of situations where either novel approaches
were introduced or failures occurred, even outside of the implementation of the PDT, and
explored how those situations were perceived by individuals in various parts of the
organization.
Proposition 5: Clarity of Roles and Responsibilities
An initial consideration was the degree to which overall clarity in roles and
responsibilities existed within the organizations. As responsibilities related to PDT were
developed, important considerations were the extent to which they were formally
articulated and the extent to which there was a consistent understanding of them across
the organization, particularly those that were left informal. The process by which roles
and responsibilities were determined was also important, as was how they were
evaluated and how adjustments were made throughout the implementation process. This
contributes to the level of coordination and cooperation within the organization, and is
relevant to the internal communication processes as well. The ability of the organization
to identify and make adjustments as needed is essential to the learning process, as is
how well such adjustments are communicated to the organization as a whole. How
the organization responded to the initial introduction of the PDT, as well as to
reinterpretations of policies and other challenges that arose throughout the
implementation process, was also key. This evidence was collected through document
reviews and interviews.
Proposition 6: Internal Communication
The study, in this regard, was interested in the communication that took place
within the organization as it related to the implementation of the PDT. This included
factors such as who was involved, the nature of the communication, its frequency, and the
types of settings in which it occurred. Communication with other entities within the
university was also considered in this domain. The data was collected through document
reviews, interviews, and direct observations. Specific lines of inquiry included:

- What were the sources of information from within the organization on how to manage the technology? How was that information generally disseminated within the organization?
- How were decisions regarding the PDT made and shared within the organization?
- How did you deal with PDT problems that arose?
Proposition 7: Evaluation and Routinization
At issue here was the degree to which the organization was setting aside time to
establish and review operating procedures in relation to the PDT. This was very much
related to the clarification of roles and responsibilities, and these processes did indeed
intersect. It was also related to internal communication, where the efficiency gains
resulting from quick routinization were dependent upon the organization's quick
adoption of the new procedures. Ideally, these were well documented formal processes,
for which document review would be well suited. As anticipated, though, not all of these
processes were well articulated, nor were they always centrally located. In some cases,
the documents provided laid the groundwork for the creation of routinized procedures,
and in others the actual processes themselves were provided. Interviews and direct
observations provided additional mechanisms for collecting evidence, particularly in
suggesting the extent to which these processes were used and developed. Lines of
inquiry included how decisions were made, what value, if any, was placed on
routinization, and, again, how adjustments were made following the reinterpretation of
policy or when problems arose.
Achieving Implementation Objectives
As a policy study, this research was interested in contributing to improving the
ability of organizations to achieve implementation objectives. As discussed, within the
technology literature, acceptance and use are important ways to measure that
achievement, and use, in particular, is important within the implementation context.
Effectiveness, within the context of this study, will relate to the ability of the organization
to achieve its implementation objectives. The determination of implementation
objectives was shaped both by the policy governing it (top-down) and those responsible
for its implementation (bottom-up).
An analysis of the policies provided broad indications of what the policies sought
to achieve, but varied in clarity. The initial findings are indicated in Table 3-2. Through
the process of the study, feedback was collected from organizational members as to
objectives of implementation, and other data was collected to assess how organizations
have met these objectives over the course of implementation. It was not only important
to understand whether implementation objectives were being met and policy outcomes
reached, but equally important, in the context of this study, was understanding how
processes within the organizations, and learning in particular, have contributed to
meeting those objectives. Therefore, a line of inquiry did include the identification of
implementation objectives, indications of whether they had or had not been met, and
what, in general, has contributed to their being met.
Table 3-2. Measures for Achieving Implementation Objectives.

Implementation Objectives | SEVIS | Online Education
Policy Driven | All students in system; On time reporting to ICE; Maintain certification; Students in violation are identified | Enrollment; Academic performance
Policy & User Driven | Federal responsiveness | Quality of instruction; Student satisfaction; Improved resource management; Increased course offerings
User Driven | Ease of use; Availability of resources for implementation; Student satisfaction with service; Maintaining international student enrollment | Ease of use for faculty; Ease of use for students; Increased faculty participation
Collecting Evidence
In studying implementation and in conducting case studies, it is important to
maintain both an openness to the process and a clear framework for approaching the
study (Yin, 1982). This is particularly important because of the exploratory nature of this
study. Further, as Elmore (1978) notes, the conceptual framework with which one
approaches research affects how one understands the problem and limits the range of
solutions explored to address it. At the same time, the challenge is the abundance of
information that can be generated. Thus the concept of learning helped to frame the
inquiry, as did the seven propositions that have been identified and discussed. Data was
collected through interviews, documentation review, and direct observations.
Interviews
Yin (1982) differentiates between structured interviews and unstructured
discussions, emphasizing the latter as most important in implementation, particularly in
its ability to generate "'explanations' for an event or complex descriptions of events" (p.
45). While that may be the case, structured interviews are useful in maintaining a focus
on the specific components of organizational conditions, as outlined in the propositions,
with which the study is concerned. Both strategies were employed in the study, so the
form was a “focused interview” with a mix of open-ended and structured questions (Yin,
2003). All organization members who were technology users or potential users were
invited to participate in a brief interview, where interviewees were asked to share their
observations of how the implementation process unfolded. There were prompts and
follow up questions that helped to focus the discussion on elements relevant to the study
and also sought to identify additional resources and sources of information. These
interviews averaged about 45 minutes each. In depth interviews and follow up interviews
were conducted with key individuals as identified, and followed a slightly more
structured format.
With the exception of one new organizational member, all SEVIS users
participated in an interview, including one former organizational member who had been
active in the initial implementation. More than half of the faculty in the online education
case participated, including nearly all full-time faculty members as well as the
department chair. A total of 27 interviews were conducted. Follow up interviews were
conducted with five individuals. Interviews were also conducted with other
organizational members not directly involved with the implementation of the policy to
provide insight on the broader implementation narratives.
Interview questions were pre-tested to ensure the clarity of the questions and the
accuracy with which the researcher collected responses. Interview protocols and script
are provided in Appendix B. Interviewees were provided a general policy
implementation context for the study, but not specifically the organizational learning or
the learning factors that were being analyzed. This was done to minimize potential biasing
of interviewees.
Documentation Review
Documentation was collected and analyzed in an effort "to corroborate and
augment evidence from other sources" (Yin, 2003, p. 87). Such a review provided insights
into the nature of information flow from policymakers and/or their agents and
implementing organizations, as well as within and among implementing organizations.
How well information traveled from the policymakers to individuals throughout the
organization is instrumental in maintaining communication and a commitment to the
process. Key documents were identified through the interview process. The content
included correspondence between policymakers and/or their agents and implementing
organizations, internal emails and memoranda, meeting minutes, organizational charts,
annual reports, communiqués among implementing entities, and other pertinent
documentation that provided insights into the implementation process. For the online
course policy specifically, the review also included syllabi and course Blackboard sites.
Direct Observations
As noted by Yin (2003), the greatest disadvantage of direct observation is the
commitment of time required. In the case of studying organizational structures and
processes, however, it is difficult to imagine not observing the setting first hand. Very limited
direct observations took place, and they were generally exchanges that were witnessed
through the course of interviews and other visits to the departments. Such visits provided
opportunities for observing the interactions among organizational members and the nature
of their workspace.
Analytical Approach
As suggested by Yin (1982), the analysis of implementation studies begins in the
study’s planning stage and continues through the process of collecting evidence. As
discussed, the propositions helped to focus the study, but not to the exclusion of the
possibility that other phenomena might be present. In addition to collecting information
regarding the learning model, information was also collected to develop a narrative
around the implementation process (Yin, 1982). This narrative required the merging of
various sources of data, as well as the reconciliation of various perspectives on how the
implementation process unfolded. In terms of the process of analysis, there were two
major steps involved. First was the challenge inherent in qualitative research, which was
identifying findings out of all of the data that was collected. The second related to the
comparison of data and findings across the two policy cases. The approach for each will
be discussed in turn.
Identifying Findings
The study utilized four strategies in the analysis of data, which began in the data
collection phase. These included contact and document summary forms, coding, and
memoing drawn largely from Miles and Huberman (1994), as well as the development of
case reports for each of the policy cases.
Contact and document summary forms served to summarize and capture key
elements immediately following an interview, observation, or document review. Coding
was conducted at various points in the analysis. Interviews included at least three levels
of coding, including coding of contact summary forms, initial coding of transcriptions,
and a review of coded transcriptions as the coding was entered into the case study
database. Document review included coding of the document and of the document
summary forms. An initial listing of codes is provided in Appendix C and the final list of
codes is provided in Appendix D. Finally, Miles and Huberman's (1994) strategy for
memoing was also used. Brief write-ups were completed throughout the coding process
to begin generating ideas about codes, their relationships, and patterns. Direct
observations generated memoing data.
The study included a case report for each of the policy cases, which served as
their respective repository of information. The case report included case study notes,
case study documents, narratives, summary forms, as well as policy analyses and related
documents. It also included a case summary, which was developed near the end of the
data collection phase, but preceded final follow up interviews with key members of each
organization. This served to test the relevance of the propositions, identify gaps in the
evidence, and estimate the confidence in findings. Such a process is important so as to
ensure that the study is on track and allow for the opportunity to make adjustments to the
case study protocols (Miles and Huberman, 1994). A case report will also be generated
for the cross case comparison.
Comparison of Cases
The study seeks to provide analytic replication through the examination of two
different policies that share a common technological orientation. While unique cases, the
seven propositions will provide a framework for comparing and connecting the two cases
through the analysis process. Consistent coding schemes will be used across the two
cases in order to facilitate pattern matching, specifically through explanation building and
time-series analysis. In the case of the former, do the propositions effectively explain the
process of implementation in both cases, and if they do not, do they vary in similar ways?
For the latter, in creating implementation chronologies, do both cases suggest similar
causal relationships? Further, through the process of identifying commonalities and
differences among the narratives of the two policies and organizations, the researcher
may find ways to understand how the model, as proposed, supports organizational
learning, and whether that learning has supported the achievement of implementation
objectives. While these two cases both maintain a technological orientation, they are
sufficiently dissimilar to provide a broader understanding of how organizational learning
can relate to policy implementation.
Conclusion
This study looks at a small, but important, slice of the implementation process.
Although this study limits itself to investigating the organizational structure and
dynamics of frontline agencies in the implementation of policy directed technology
(PDT), it recognizes the larger setting in which this process takes place. If organizational
structure and its compatibility with organizational learning matter to the implementation
process, the study raises the questions: how well adapted are PDT initiatives to promoting
flexibility and learning through implementation, and what effect do the structures of
organizations charged with implementation have on how effectively policy is ultimately
carried out? These questions indeed go beyond implementation and challenge some
broadly held assumptions regarding the formulation of policy. While such considerations
are indeed important and demand attention, this study concerns itself with a more limited
focus in an effort to establish a sound justification for integrating organizational learning
into policy implementation theory. In this way, the model seeks to advance a new
direction for policy implementation study, and, ultimately, provide strategies and tools to
policymakers and managers to better meet implementation objectives and the societal
interests which these policies serve to address.
Chapter 4: Case Study of SEVIS
Introduction
This chapter considers the extent to which organizational learning occurred during
the implementation of the Student and Exchange Visitor Information System (SEVIS), a
federally mandated online reporting system. The study finds that learning has taken
place, reflected in changes to the organizational culture and membership and in the
development of new operational systems in response to the technology. The conditions
have largely proved useful in conceptualizing these learning processes. Further, the
achievement of implementation objectives has been advanced through organizational
learning, helping to reduce barriers to implementation as the organization navigated a
second related policy driven implementation shortly after the launch of SEVIS.
The SEVIS system is operated through the Student and Exchange Visitor Program
(SEVP) which is under the Department of Homeland Security (DHS) Bureau of
Citizenship and Immigration Services (USCIS), with certain enforcement responsibilities
coordinated through the Bureau of Immigration and Customs Enforcement (ICE), also
part of DHS. Educational institutions desiring to enroll international students and visiting
scholars are required to participate. Over 9,000 educational institutions participate in
SEVIS, including not only colleges and universities, but also K-12 schools, trade schools,
and language programs. This analysis explores the efforts of a single organization to
manage implementation over the course of a three year period, from May 2002 through
the summer of 2005, as well as reflections on the process of implementation over the
course of more recent changes to the organization’s technology systems. While the
selected organization provides a typical example of SEVIS implementation in a higher
education setting, the broader context is to understand how an individual organization
manages a federal mandate for policy driven technology.
The primary focus is an analysis of data on the extent to which organizational
learning took place during the implementation of SEVIS, but this will be preceded by a
brief summary of the policy and its objectives, a profile of the organization, and the
process of implementing this policy for this organization. The implementation narrative
suggests a four phase process that unfolded over the course of the three years under study,
by the end of which most policy objectives were met. These phases are marked by
periods of study and preparation; planning, development (of technology and procedures),
and resource coordination; execution and refinement; and finally, reflection,
continued refinement, and technology upgrades and changes. The analysis suggests that the
the organizational conditions as advanced by the model were important to the learning
processes, with the exception of a culture of experimentation. The chapter concludes
with a discussion of the extent to which organizational learning took place, finding
that there was learning in the form of changes in organizational membership and culture,
as well as the development of new operational systems, and that this learning supported
the achievement of implementation objectives.
Policy Background
The SEVIS policy has roots in the 1993 bombing of the World Trade Center in
New York and the piloting of a centrally maintained database system.[25] Following the
September 11, 2001 attacks, the USA PATRIOT Act of 2001 made mandatory the
reporting of enrollment and academic progress information for international students by
all educational institutions inviting students to study in the United States. Managed under
the Bureau of Citizenship and Immigration Services (USCIS),[26] USCIS and SEVIS are
governed and directed by an array of federal policies that include the Illegal Immigration
Reform and Immigrant Responsibility Act of 1996 (IIRIRA),[27] the USA PATRIOT Act of
2001,[28] and the Enhanced Border Security and Visa Entry Reform Act of 2002.[29] While
SEVIS was enacted as part of an initiative to prevent terrorist entry into the United States,
it can also be understood more broadly as an effort to modernize the record system of the
United States as it relates to foreign guests.[30] This modernization seeks to establish a
paperless environment so that the federal government may have direct updates and
reliable information regarding international students.[31] While such data may not
provide clear indications of terrorist activities, the centrally housed data enable the
tracking of problems and patterns, and help to identify other concerns such as
international students who may be working illegally in the United States.[32]

[25] The Immigration and Naturalization Service (INS), in partnership with the Department of State, the Department of Education, and experts from INS-authorized schools and exchange visitor programs, conducted a pilot program, CIPRIS, to test the concepts involved in an electronic reporting mechanism from June 1997 through October 1999.
[26] SEVIS is managed through the Student and Exchange Visitor Program (SEVP) within USCIS, which is housed in the Department of Homeland Security (DHS). The Bureau of Immigration and Customs Enforcement (ICE) also manages aspects of SEVIS related to enforcement. SEVIS was initially managed by the Immigration and Naturalization Service (INS), then moved under USCIS in March 2003 with the development of DHS.
[27] IIRIRA, Public Law 104-208, Section 641. c. 1. A-D. September 30, 1996.
[28] USA PATRIOT Act, Public Law 107-56, October 26, 2001. See also Appendix B.
[29] Enhanced Border Security and Visa Entry Reform Act, Public Law 107-173, May 14, 2002.
[30] Related modernization efforts have taken place within US Customs.
[31] More specifically, the system provides the federal government with current addresses and enrollment status of all international students and scholars (as well as their dependents).
[32] Government Accountability Office, GAO-04-690. Homeland Security: Performance of Information System to Monitor Foreign Students and Exchange Visitors Has Improved, but Issues Remain. Washington, D.C.: GAO, 2004.
This policy was in effect an unfunded mandate: institutions desiring to
invite international students or scholars to their campuses have been required to invest
substantial resources to manage the system. Further, the policy relied on technology that
still needed substantial development, and on training for the multitude of new users.
Moreover, while deeply detailed in terms of intent, the policy demanded a great deal of
interpretation and operational clarity of each institution. A fair degree of flexibility had
to be built into the system to permit the central database to accommodate the diverse
institutions that would report data to it. These conditions created interesting challenges
and opportunities as organizations responded to the call to implement.
Profile of the Organization
The subject of this study was an organization within a large public comprehensive
university in the southwest. As a comprehensive university, the organization had to
manage three fairly distinct populations of international visitors: regularly enrolled
students, students participating in an English language program, and visiting scholars and
professors. For the university under study, this amounted to the collection of information from at least three functionally distinct departments: the international education office (IE), which operates out of the division of student services and works with regularly enrolled international students; the language program office (LP), which operates out of extended education and provides short, intensive English language programs for international students; and academic programs, which works with the majority of visiting international scholars. International students at the university accounted for the largest portion of students in SEVIS, in excess of 1,400 annually, representing just over 4% of the overall student population.[33] A few hundred students participate in the language program, enrolling in one of five programs offered annually.
The determination was made early on that the institution would file a single SEVIS certification, and so all reporting would be managed through that single source.[34] Advisors of international students and scholars were based in each of the three departments. They would be users of SEVIS and so were registered as Designated School Officials (DSOs), with the director of IE serving as the Primary Designated School Official (PDSO).[35]
This authority was placed with the director, largely because IE served the largest population of students and had the largest staff to support implementation. The decision to file jointly, while suggesting some efficiency in cost and operations,[36] also required a high degree of coordination between the entities responsible for the student populations, which were in separate divisions.[37]
In managing the implementation of SEVIS, there were four departments that
served as participants in the implementation process. These included the international
office, the language program, the technology department, and student records. Perhaps
33. Transfer international students were the majority of the population, with new transfer students outnumbering freshmen by more than a factor of two each year.
34. Some institutions had multiple departments that were certified under SEVIS and were responsible for reporting on their distinct populations of students or scholars.
35. Each campus had one PDSO and up to nine additional Designated School Officials (DSOs) for each SEVIS certification, designations from USCIS. All nine were assigned on the campus, and all but two were housed in either IE or LP; those two were housed with academic programs. The PDSO served as the main point of contact for the institution with USCIS, with special responsibilities such as authorizing new users.
36. These efficiencies included a single certification and audit source at the institution and the need to support only one database interface.
37. Though academic programs represented the visiting scholars population and maintained access to SEVIS, their role in implementation was limited, and the international office assumed responsibility for processing visiting scholar information through SEVIS.
most important in implementation was the international office, which served as the lead department and functionally served as the key "organization" under study. It included a professional staff of five prior to SEVIS, which increased to seven over the period under study. Three of these staff members—the director, assistant director, and international student advisor—served as a key group in planning the implementation, and three to five staff members maintained access to SEVIS over the period studied. The language program, based in the larger extended education program, had a dedicated staff of three working with the program and with SEVIS, which expanded to five over the period studied.[38] The two additional departments, technology and student records, provided crucial support throughout the implementation process. Both departments had staff involved in implementation planning and execution throughout the period under study.
Organization Meets Policy: Implementing a Federal Mandate
The implementation objectives considered here reflect both the demands of the
policy and the goals and values of the organization. For the federal government,
compliance means maintaining certification, regular reporting of international students
(i.e., enrollment, contact information, and academic progress), as well as participation in
audits as they arise. The organization seeks to meet, but not exceed, what the policy requires. Further, the department has sought to integrate
SEVIS into its operations, while continuing to provide quality services and programming
38. Responsibilities of these individuals included work in addition to the language program, even at the onset of SEVIS.
to students. Advisors provide academic counseling, support in managing transition issues, and seek to promote student development in such areas as leadership, cross-cultural understanding, and general areas related to student maturation.[39]
Following the passage of laws authorizing a mandatory, centrally based federal system for tracking international students, development of regulations governing SEVIS began. Throughout 2002 regulations took shape that influenced the "final rule" to which all educational institutions seeking to recruit international students needed to respond. Over the course of just 14 and a half months,[40] the organization worked towards the initial implementation of the SEVIS system, which required the coordination of more than half a dozen departments, as well as hundreds of individual decisions and interpretations of policy, thousands of staff hours, and the investment of tens of thousands of dollars.[41]
The implementation process included a “study phase,” “planning and
implementation testing phase,” and “live implementation phase,” details of which are
summarized in Table 4-1. The fourth phase, which will be discussed only briefly, is
“implementation in practice.” The organization is currently in a fifth phase of
implementation, “study and practice,” where it is dealing with the conversion of its
student records system. While not discussed here, it will be referenced in the discussion
of the learning model as the department works to deal with similar issues in a new
context.
39. IE also serves students studying abroad.
40. This period begins with the publishing of the rule for the Enhanced Border Security and Visa Entry Reform Act on May 16, 2002.
41. The initial set-aside, made in December 2002, was over $70,000 to cover the first year of implementation; it did not include the staff time of existing department employees.
Table 4-1. SEVIS Phases of Implementation.
Study (5/16/2002 – 9/24/2002): Reviewed initial regulations and rules; began consulting external sources; began developing the initial planning process; contacted key departments.
Planning and Implementation Testing (9/25/2002 – 7/31/2003): Established roles in implementing SEVIS; coordinated participants in the implementation; interpreted policy for the campus; collected information from the external environment.
Live Implementation (8/1/2003 – 8/30/2004): Synchronized databases and batch processing; carried out initial implementation and refinement of the plan.
Implementation in Practice (9/1/2004 – 6/30/2005): Reflected on operational procedures; consulted with other campuses on best practices.
Study and Practice (7/1/2005 – present): Began preparation for the new conversion; continued activities of the implementation in practice phase.
The Study Phase
The study phase began with the publication of the rule for the Enhanced Border Security and Visa Entry Reform Act on May 16, 2002, ran until the publication of the final rule on September 25, 2002, and was marked by the initial review of the current rule.
This phase provided the organization with a first look at how the policy might be operationalized. The international education office assumed lead responsibility for overseeing campus compliance, and a core planning group, the SEVIS Implementation Team (SIT), formed; it included the director, assistant director, and the international student advisor, who together worked to develop initial plans for implementation. Some preliminary activity began, including early identification of the impact on operations and of who might need to be involved in the implementation process.
Additionally, SIT members began engaging colleagues in the system and in NAFSA, the professional association for international educators,[42] to discuss and understand the implications of the policy. Both these actions helped lay the groundwork for supporting learning in the policy implementation process. The former assessed the current organizational context and began engaging the necessary campus partners. The latter served to supplement information regarding the policy coming from the federal government.
The Planning and Implementation Testing Phase
The planning and implementation testing phase began with the publishing of the
final rule and ran through the amended deadline for full campus compliance on August 1,
2003, and was marked by the development of an implementation plan.
In developing its plan for implementation, the campus decided to file for SEVIS
certification jointly between the international office and the language program. Efforts
throughout this phase fell into four broad categories, which will be discussed in turn.
First is human resource management, which involves the establishment of roles in
implementing SEVIS, its impact on existing roles, and the coordination of participants in
the implementation process. Second is policy interpretation, which involved making
decisions on how the campus would interpret the policy and managing the conflict
between the policy directive and the commitment of the organization. Third is the
collection of information from the external environment, which included information
from the federal government, as well as other implementing organizations. And finally, data management, which served as the greatest implementation challenge, related to the need to establish a means of properly interfacing with SEVIS, as well as to make the necessary adjustments to the data then available to the organization.
42. NAFSA was formerly the National Association of Foreign Student Administrators. As the phrase "foreign student" dropped out of use, NAFSA has retained its acronym but no longer uses its full name.
Human Resource Management
A key component of this phase of implementation was the organization of human resources from the campus to support implementation, which included identifying the players and their roles, reexamining the roles of the users of the technology and providing them with the training and tools to support this new responsibility, and identifying the need for any additional staffing. Participants in the implementation process included student advisors and support staff from technology and student records. These individuals came to form expanded groups working to plan both the technology and data management side as well as the general use of the technology, which included policy interpretation and the formation of operating guidelines. This process also included developing a means of communicating progress among participants.
With the international education office having assumed leadership for the project,
an important initial step was defining roles within that department for moving forward
with implementation. There were very specific impacts on two members of SIT, the director and the advisor. The director continued his work in creating awareness and understanding of the policy and operational changes surrounding international students among campus entities external to the department. He also gathered the necessary support from departments as needed. Further, his role required pushing forward with compliance while balancing SEVIS against the overall mission of the department, which was serving students. This challenge was not always met, as he observed in retrospect.
[W]hat we saw at the time of SEVIS implementation was this full-on focus on
SEVIS, SEVIS, SEVIS and my responsibility as a leader to remind everyone
including myself that what we’re here is more than about SEVIS. We’re here for
students and we try to keep that in sight, but I think that sometimes it was very
easy to lose that focus on our students because SEVIS became our reason for
being, our reason for existence, and I think sometimes we lost focus on our
students which is unfortunate. (180)
While the director assumed a lead role in communicating with the campus community about the process of responding to the new regulation, securing resources, and meeting extensively with the leadership of the units whose support would be heavily required (i.e., technology and records), the student advisor assumed the role of project coordinator, a role that ultimately influenced the process greatly. The department initially sought to fill the coordinator role with a technical staff person, but that individual quickly encountered problems. While possessing strong technical expertise, she lacked institutional context and an understanding of immigration law. The position was eliminated after two months[43] and was functionally replaced with the student advisor, who had been relied on heavily by the compliance coordinator throughout her brief tenure. The student advisor came to formally serve as the SEVIS coordinator after this, splitting her time between coordinating implementation and continuing her role advising students.
Looking at how SEVIS implementation unfolded at a number of campuses, one interviewee remarked on "how incredibly influenced the whole process is by usually a single individual who has this unique combination of skills" (205). "This unique combination" included people skills, knowledge of the regulations, and comfort and sophistication with technology. The student advisor would serve in that role, though initially not in a formal capacity. The student advisor, who would later serve as the SEVIS coordinator, filled the role of implementation facilitator. She did so not by setting mandates, but by scheduling meetings, setting the timeline, assigning tasks, and keeping parties up to date with regular email communications. Indeed, the general approach of IE was to engender support by inviting participation, managing the project in a participative manner, and welcoming feedback on the process and approach.[44]
43. She left the department by the end of March 2003.
Still, the department was charged with managing compliance and indeed worked to keep the implementation on schedule.
Throughout this phase there were regular meetings among international student advisors, who were the group most concerned with determining policy interpretations for the campus and creating operating procedures. All advisors served as DSOs[45] and had direct access to SEVIS. Cooperation and coordination were generally good, as there were efforts to collect and share information. The SEVIS coordinator served as the key repository of this information, which was translated into the campus user manual for SEVIS.
A particular difficulty that advisors experienced was the need to manage the
additional responsibilities and changes to their roles associated with SEVIS. The added
responsibilities were of a technical character and posed a problem for most of the staff,
who had no technical background. Further, as a result of SEVIS, the department
44. Much of this was a function of not having chain-of-command authority over these other departments. The weight of the federal mandate also likely assisted in garnering the necessary support from other campus departments.
45. DSOs are Designated School Officials, a designation of USCIS.
operations and the role of the advisor changed fundamentally. “We’ve seen that the role
of an advisor has really changed perception from student, from the public, from anyone.
That we have now become reporting agents of DHS, because we're using SEVIS." Advisors generally resisted the notion of being agents of DHS, as another advisor commented:
So I think it made us feel like we’re the policemen all of a sudden instead of an
advisor that looks out for students in making sure that yes, we are compliant with
the government, but we also need to make sure that our students understand that
they need to be compliant with the rules, F-1 rules and regulations, and all of a
sudden we became just data trackers and enforcers, and it almost sometimes felt
like you’re really, you know working for the Immigration, especially when you
are reporting these students. (141)
Further, these reporting activities were added onto existing responsibilities. It
was difficult to advise students with experiential concerns while simultaneously
managing the technical reporting required by SEVIS. In the end, though, roles were refined and additional staff members were brought in to provide support in key areas.
Policy Interpretation
The SEVIS Implementation Team (SIT), in coordination with LP, worked to
translate the policy to fit the needs of the campus and its two populations of students. In
the process the organization encountered tensions between serving as compliance officers
for the federal government and advisors to students. These tensions, however, were
largely on marginal issues and did not prevent the organization from meeting its
obligation for compliance.
The campus had to establish the reporting calendar, define online courses,[46] create a means of distinguishing its student populations, and make countless other determinations that needed to be justified and documented in order to respond to any compliance requests from USCIS.
requests from USCIS. Indeed, the organization relied on the federal policy and the
implementing agency’s activities (i.e., guidelines, certification process, audits) to
structure both its implementation and its ongoing activities related to the policy. Further,
these detailed plans and schedules also served to coordinate the efforts of individuals
throughout the campus, an essential element of implementation. This process took place
throughout the planning and implementation testing phase, and, indeed, continued well
into the live implementation phase.
The organization sought to define its level of compliance as meeting, but not necessarily exceeding, regulatory standards. SEVIS as a database had the ability to collect, and campuses therefore had the ability to report, more than what was required by law. The department, from its leadership through its advisors, sought to be very clear on what information needed to be collected and reported. This was motivated by an interest in serving the students. There were concerns that they "wanted to respect as much as possible the confidentiality of our students." And so while committed to "comply with the law," they noted, "we certainly don't intend to become, you know, branch office of Homeland Security here" (125).
46. SEVIS restricted the number of online education courses in which international students could enroll in any academic term. The campus, however, needed to define what constituted an online course.
Information from the External Environment
Throughout this phase the department relied on information from external sources
to support its implementation planning efforts. These sources included the federal
government, other implementing campuses, and NAFSA. While all were important, the
latter was universally recognized as playing the greatest role, and continued to be
important in later phases of implementation.
Contact with the federal government came both directly, as well as mediated by
the efforts of NAFSA. Early on USCIS established a SEVIS help desk to provide support
on a day-to-day basis, as well as providing notices to designated users on general
information related to SEVIS. The help desk responded to technical issues, “data fixes”
(i.e., when inaccurate information was inputted by the user), and matters relating to the
interpretation of policy. The quality of service provided by the help desk was initially problematic,[47] marked by long wait times, inaccurate information, and limited reliability as a source of support. Another challenge related to the help desk was managing policy interpretation questions, which required a higher degree of sophistication regarding the policy. These matters were eventually referred to regular conference calls held between USCIS leadership and NAFSA policy experts. NAFSA served to funnel user issues directly to the federal government, and USCIS had a relatively small group of users to work with directly in resolving problems. The resulting information was very important to the organization; it was circulated among users and integrated into campus planning efforts.
47. These included insufficient staff, insufficient training of staff, and a high volume of new users, coupled with a system that required debugging.
The department also made efforts throughout the implementation to contact
similar campuses to identify best practices and learn from problems that they had
encountered. This included sister campuses within its university system, as well as other
institutions utilizing the same batch processing software. While these networks were
important, NAFSA provided a broader perspective on implementation issues and useful resources.[48]
Data Management
While the SEVIS system itself presented technical challenges that had to be
overcome through debugging and proper training, it also introduced broader problems of
data management. These were twofold. First, SEVIS, an internet-based program, needed
to interface with the campus student records system. Second, the campus database had
inaccurate or missing data in key fields. These issues represented the greatest
implementation challenges related to the technology and required considerable collective
effort to resolve. Indeed these problems to some extent continue as the organization now
converts to a new campus student records system.
Initially, the SEVIS system purported to require only an internet connection. It soon became evident that campuses with large international student populations would require batch processing to connect campus student record systems with the SEVIS database, eliminating the need for manual entry of student data.
In the process of matching records, it became clear that the student data were
flawed and incompatible with the reporting requirements of SEVIS. First, additional
48. These included web resources, user manuals, listservs, conferences, and training opportunities.
student information was required. Second, information in existing records was out of date, inaccurate, or erroneously entered.[49] The process of mapping data fields began, creating a system of accounting for how fields in each system (i.e., campus records, batch processing software, and SEVIS) would be operationalized and how fields in each database would be connected. With over 200 fields in the batch processing software alone, this was an arduous task that continued through the spring.
The Live Implementation Phase
The live implementation phase began in August 2003, as the campus began reporting to the federal government, and was characterized by ongoing efforts to refine the implementation plan. Though the organization was initially unable to utilize the batch system, the databases were synchronized, enabling its use in the spring semester.
The activity that dominated this phase of implementation was synchronization to ensure data consistency across three databases: the campus student records system, the batch processing system, and SEVIS. This synchronization was not completed by the fall, when the organization first reported information to the federal government.[50] Data was cleaned in time for spring reporting, but the organization faced an ongoing need to verify and test data. The international office staff gained the ability to enter data into the student records system so that it could be relied upon for accurate information, which could be fed into the batch process and ultimately reported to SEVIS.
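The three-way synchronization described here amounts to checking that each student's record agrees across the campus system, the batch software, and SEVIS. A minimal sketch, with hypothetical data and field names:

```python
# A minimal sketch of the kind of three-way consistency check described
# above: records keyed by a student ID are compared across the campus
# records system, the batch processing system, and SEVIS. The data and
# field names are hypothetical.

def find_mismatches(campus, batch, sevis, fields):
    """Return {student_id: [field, ...]} for every field whose value is
    not identical across the three systems (a record missing from one
    system counts as a mismatch on every checked field)."""
    mismatches = {}
    for sid in set(campus) | set(batch) | set(sevis):
        bad = []
        for f in fields:
            values = {src.get(sid, {}).get(f) for src in (campus, batch, sevis)}
            if len(values) > 1:          # not identical everywhere
                bad.append(f)
        if bad:
            mismatches[sid] = bad
    return mismatches

campus = {"A1": {"status": "enrolled", "address": "12 Elm St"}}
batch  = {"A1": {"status": "enrolled", "address": "12 Elm Street"}}
sevis  = {"A1": {"status": "enrolled", "address": "12 Elm St"}}

print(find_mismatches(campus, batch, sevis, ["status", "address"]))
# {'A1': ['address']}
```

Each flagged record is a candidate for the kind of manual verification and cleanup the office performed before spring reporting.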
49. International addresses posed a particular problem, as the format did not always translate neatly into the campus records system.
50. Data was entered into SEVIS manually using Real Time Interactive (RTI) in fall 2003.
The organization continued to fine-tune its plan, as well as implement improvements to the process, the organizational structure, and the technology itself.
Overall, the organization found that the structures that had been put into place functioned
well. Efforts to maintain communication among implementation participants and
advising staffs continued, including regular meetings, which became less frequent, but
more focused. Adjustments to advisement staffs were made, which included identifying
the need for administrative support to manage the reporting process to the federal
government, a process that became more technical as policy ambiguity was replaced by
clear business practices. Also, technical requirements became solidified in position
descriptions, and SEVIS became a part of organizational routines particularly as new
advisors came on board. On the technical side, throughout the implementation process
and particularly in this phase, the department continued to deal with new challenges. The
batch processing system released minor revisions, each accompanied by its obligatory
bugs and resulting in new errors in managing the transfer of data from the student records
to SEVIS. Further, there were additional updates to the student records database, some in
response to other mandates, others to facilitate tracking efforts.[51] All in all, from January 2003 through June of 2004, the international office logged approximately 9,800 hours toward SEVIS implementation, a figure that does not include time dedicated by other SEVIS-using offices and support departments (1303).[52]
51. Such projects included converting to campus identification numbers in the batch process system (to meet a new requirement of the university system), the addition of a tracking field in the student records database for SEVIS-monitored students, improved tracking of online courses for the campus and the capability to track international student enrollment in these courses, as well as the development of an alert notification system to better notify students when they encounter problems with compliance.
52. Numerical citations used throughout the remainder of the dissertation denote references in the case study database. These include both direct quotes and other information.
The campus, under the leadership of the international office, developed a very deliberate process to ensure that data reported to SEVIS was clean. During the live implementation phase, there were reports from other campuses that had sent flawed data to SEVIS "with disastrous results (for students and institutions) that were difficult or impossible to correct" (1356). For this campus, there have been no substantive audit findings or serious reporting errors. The sentiment of the campus and the direction it took is best captured in a progress report filed by the director of the international department.
The data management is overseen by international education professionals who
know regulations and SEVIS. This is the only sure way to ensure compliance
while protecting our students and scholars from unnecessary or excessive
reporting. Also, as SEVIS evolves and changes, we monitor and understand the
changes and the implications for our data-management and reporting. Our staff
undergoes regular training as well as monitors a host of electronic lists (including
SEVIS itself) to note changes. (1357)
The Implementation in Practice Phase
The implementation in practice phase began in September of 2004 with an established and tested interface between the three systems. In what could be considered a final implementation phase,[53] the period was marked by continuing attention to quality control of data and to system maintenance and upgrades, as well as a more reflective review of operational procedures. This enabled the solidification of routines and the opportunity to carry out adjustments to staffing.
53. Study and practice, the fifth phase, begins the implementation of a new technology, but is a continuation of these implementation activities as they pertain to SEVIS.
The organization’s review of its procedures involved both a reflection back on a
full cycle of executing its plan, as well as an evaluative activity of consulting with other
campuses, in particular other users of their batch processing system, to share and collect
best practices and challenges being encountered. This enabled a refinement to its
procedures and solidification of its routines. A key adjustment to staffing was made as a
data entry operator, who reported to the SEVIS coordinator and was responsible for data
updates and report preparation (these having become structured and defined tasks), joined
the international office staff. The department also engaged in evaluation and testing of
systems and data, ensuring that they continue to operate in compliance and were prepared
for audits and recertification with the federal government.
Efforts continued to test the integrity of data across its databases, as the
organization also continued to engage in system maintenance and upgrades. As the department moved into the fifth phase of implementation, study and practice, it not only dealt with preparing for a new conversion and implementation process in response to another mandate, this time directed by the university system, but also continued to manage the existing one, including a series of reporting errors on the heels of a batch processing upgrade.
An important revelation of this implementation process is that an organization may be faced not only with the implementation of a complex technology, but with a technology that is one of many technology tools, so that the implementation of one technology may happen on top of another. Further, these tools are often upgraded, which can demand that the organization undertake mini-implementations. It is therefore critical for organizations in this environment, particularly when dealing with technology, to be prepared not only for the implementation process, but for an ongoing implementation process—a state of constant flux. This calls for adaptation, a process that this study argues is best served in the context of learning.
Data Analysis: Applying the Model to SEVIS Implementation
The policy has several qualities that both facilitate the achievement of policy objectives and create barriers towards that end. As discussed, Matland (1995) describes four ideal-type approaches to policy implementation, with high and low levels of ambiguity and conflict as the key factors shaping the implementation process. SEVIS can be characterized as administrative, with both low ambiguity and low conflict, and as such, according to Matland, its implementation is mostly resource dependent. The analysis will begin, therefore, with a discussion of the fact that despite this being an "administrative" implementation, real challenges to the implementation process existed, challenges that may in part be overcome by the presence of learning systems at the organizational level. The analysis suggests the importance of most of the conditions as they relate to learning. A culture of experimentation and policy and goal congruency proved less relevant than expected in supporting the learning process, with the latter particularly limited.
There are several attributes of this policy that would seem to support successful
implementation. Specifically, these include: a clear and defined target population;
availability of resources for development at a local level; strong support for purported
aims of the policy (i.e., addressing terrorist threat), which existed at all levels from
policymakers through frontline implementers; development informed by an initial piloted
program; high punitive implications for failure; and utilization of established structures
for frontline implementation (i.e., educational institutions).[54]
Themes here relate to those
qualities of effective policy implementation as advanced by Mazmanian and Sabatier
(1989), as well as the notion of policy windows as offered by Kingdon (2003) and
incrementalism as suggested by Lindblom (1973).
At the same time, the organization confronted several classic implementation
obstacles, including: a typical principal-agent problem, which required efforts by the
federal government to set as specific a set of guidelines as possible and to invest heavily
in monitoring of compliance; lack of sophistication regarding the software, which
required refinement throughout the initial implementation;[55] a need for a very technical
orientation from an organization that had a much lower emphasis on technology; and a
wide breadth of types of users on a national level, which demanded flexibility for
individual interpretation and fit, a characteristic quite at odds with the needs of
overcoming the principal-agent problem. The complexity of the implementation
challenge is evidenced by the many examples of campuses that experienced a large
number of substantial errors that resulted in serious consequences for students and
institutions; errors that were avoided at the campus under study.
The question we now turn to is whether underlying learning processes served as a
means to support the achievement of policy objectives, and, if so, how the posited model
might serve as an effective tool in examining the learning that took place. To that end,
[54] While beyond the scope of the study here, since the focus is on implementation at the organizational level, numerical data suggest the relative success of the policy's implementation from a national perspective. In 2002, the year prior to compliance, just over 1,000 institutions were certified, which grew to almost 6,500 once the mandate went into effect. Nearly 8,000 institutions were certified by 2005 and over 9,000 by 2008.

[55] This continues even now as SEVP prepares for the release of SEVIS 2.0.
the seven propositions of the model will now be discussed in the context of this case.
General findings are summarized in Table 4-2.
Table 4-2. Propositions in Practice for SEVIS. An analysis of propositions within the
SEVIS case and their contributions to learning.
1. Network with Other Implementing Organizations: benchmarking and best practices; policy clarity.
2. Communication with Policymakers: compliance feedback (i.e., certification and audits); policy clarity.
3. Policy and Goal Congruency: structured implementation plan; defined objectives and milestones.
4. Culture of Experimentation: largely supported individual learning; characterized by a trial and error approach.
5. Clarity of Roles and Responsibilities: clarity to manage inter-departmental cooperation; enhanced technical skills among advisors; required reorientation of advisor responsibilities.
6. Internal Communication: clarity with regard to project leadership and facilitator; regular meetings with key individuals followed by documentation; documented processes very accessible.
7. Evaluation and Routinization: culture of delineating practices and making them available; work to expectations outlined by compliance reviews; willingness to adjust practices as needed.
Proposition 1
Organizations can take advantage of the learning that occurs in other
organizations by establishing networks and exchanging information and resources with
other implementing organizations.
“Nobody in this profession can survive without being a member of NAFSA.” (1066)
The value of networking across implementing organizations is reinforced
throughout the literature and is also one of the two sources of external environmental
information for the organization as posited in this model. Efforts here include such
activities as benchmarking and identifying best practices (Osborne and Gaebler, 1993;
Thompson, 1999), as well as interorganizational exchange as a means of learning from
other organizations, both generally and in the context of information and communication
technology (ICT) (Robey, Boudreau, and Rose, 2000).
Indeed, this condition served as an important conduit of information and
knowledge into the organization. In turn, the organization did well to circulate it among
members and use it to inform processes, which promoted the learning process. The
primary source of networking came through NAFSA, which itself evolved over time in
terms of its effectiveness.
This condition was universally recognized by users as critical to the
implementation of SEVIS. The department communicated regularly with other campuses
as a resource in implementation, and sharing experiences across implementing
organizations was a key to developing organizational practices and avoiding missteps.
As one user noted in regard to collecting experiences from other campuses, “I would
make sure I documented that, because then, if that ever happens to us, I’d know how to
do it. And I know that it’s been tried and tested already by another DSO,[56] at another
school. So, it did make us work a lot more collaboratively, with our colleagues” (130).
While these networks were varied and did form around common institutional
characteristics, such as sister campuses in the system and other campuses utilizing the
same batch processing software, one source clearly dominated—NAFSA.
NAFSA, the professional association of international educators, played a key role
in facilitating this networking and, in turn, the organization’s learning process.

[56] DSO is a Designated School Official, a term referring to an approved user of SEVIS.

From the outset, NAFSA was involved in SEVIS and served as a forum for international educators to
discuss and prepare for the new policy. As one respondent suggested, however, initially
international educators lacked astuteness in the policy arena, which showed as they
expressed concerns regarding some specifics of the policy.
I think, in the way that [international educators] responded, we just, came across
to the public, and we came across undoubtedly to Congress, as a bunch of well-
meaning mealy mouth kinds of liberals who just think, “Oh, aren’t all these
[international] students wonderful and they’re just here to make America a
wonderful place and, you know, none of them would ever do anything bad,” and I
think we’ve gotten so much more sophisticated in being able to articulate what
international students and international education and education abroad means to
people. [We] are talking a lot more about outcomes, and we’ve got data now and
all that sort of things. So I think it has really forced us to state our case in a much
more intelligent way. (199)
As noted, that sophistication grew, and within the association policy became an
integral focus. What has resulted is the development of a unified voice in advocating on
behalf of educators and their institutions led by those experts. This coordination has
enabled the pooling of resources and sharing of knowledge.[57]
Further, the NAFSA
training manual was referred to as the “bible” for using SEVIS, second only to the
campus’s internal resource manual. The infusion of information and knowledge into the
organization and its exchange among members has facilitated the learning process.
[57] As previously noted, NAFSA provides a wide array of support, including policy interpretation and general guidance in implementation, training resources, conferences, workshops, listservs, website and web resources, and direct and regular communication with the federal government.
Proposition 2
Organizational learning requires that members are well informed of policy and
policy changes. Management should serve as a conduit of information between
policymakers and organization members.
“We’ve been audited...they selected schools and we were selected...they gave us a list of
200-plus students that’s on the SEVIS database...go through the names and ensure that
the data is accurate...So things like that happen from time to time...You just participate in
it. That just means more work, you know, just data.” (1073)
Two forces drive the importance of this condition in the literature. First is the
need to base implementation success, at least in part, on policymaker intent and the
intentions as defined in the policy. While clearly a perspective maintained by the top-
down approach in policy implementation (Mazmanian and Sabatier, 1983; Bardach,
1977; Pressman and Wildavsky, 1973), it does not need to be at odds with a bottom-up
approach, except perhaps when taken to the extreme.[58]
This is particularly important
when implementing policy driven technology (PDT). For as Fountain suggests, the
implementation of technology can be characterized as “decision making under
uncertainty” (p. 89), and thus requires a return to policymakers (or at least the agents of
policymakers) to clarify and adjust as bugs are addressed and the technology itself is
utilized and better understood.
Similar to support from other implementing organizations, this condition provides
information regarding the use of the technology and clarification regarding policy
[58] The bottom-up perspective advocates for more flexibility for frontline implementers, and indeed a role for them to play in shaping policy objectives because of their better understanding of policy at the impact level. Advocating that the frontline have input on objectives is not necessarily at odds with policymakers having input as well.
interpretation, but also provides a third critical piece: necessary feedback on whether
policy objectives are being met. Throughout implementation, communication between
the organization and the federal government fed information to the organization in these
three ways. While these lines of communication were present throughout
implementation, they were not well developed at the outset, but evolved and improved
over the course of the policy’s implementation. This condition did support the learning process
for the organization and is critical for informing the policy implementation process. The
organization did not fully utilize this function, though, and the federal government,
namely USCIS, was slow (and continues to be) in fully developing the potential of this
learning process.
From the beginning, the SEVIS help desk and direct notices sent to users provided
assistance and direct contact with the federal government. However, problems with the help
desk reinforced the organizational commitment to be cautious, as errors could result in
serious consequences for students and the campus.[59] The service of the help desk did
improve over time, which allowed for less concern about errors.[60]
Another form of communication was related to campus reporting and SEVIS
compliance issues (i.e., audits and recertification processes). These lines of
communication enabled the development of a feedback loop so that the department could
adjust and improve its application of the policy, and has provided information that has
[59] This was not the case for all campuses. As noted by the experience of one new advisor from her previous institution, they relied heavily on the help desk and data fixes as a means of learning the system through trial and error.

[60] The quality of training materials provided by the federal government also improved throughout implementation. Where initially nonexistent, the federal government became adept at timing the release of upgrades to coincide with new training resources.
helped the organization develop routines.[61]
Key learning processes available to the
department were audit and recertification processes. “There’s a recertification every two
years...we were done two years ago already...they’ll tell you if you are chosen, they will
come. They don’t go to every school...The first time the certification had to involve [a]
site visit, and they came. Second time, they just asked for some files to be checked and
we just gave it to them, and it seems to be okay” (1070). This feedback, while critical,
was not robust, providing the organization with only the opportunity to confirm data
integrity. While this certainly helped them stay on track through the implementation, the
federal government appears to be missing the opportunity to provide more guidance in
terms of the organization’s ability to meet policy objectives.[62]
Proposition 3
Organizations can facilitate the learning process and effective implementation by
maintaining congruency between organizational goals and policy objectives.
“SEVIS is after all a tool that we use to report...it doesn’t interpret the regulations. The
interpretation still falls on the DSO attaché who interprets the rules.” (1029)
In order to root policy objectives in the activity of the organization, management
must provide clarity at the organizational level, and create a means to motivate
organizational members towards that activity. Management, therefore, must properly
[61] Such routines include how the organization collects and reports information, as well as details on the underlying logic related to policy interpretations.

[62] Also problematic is that not all organization members were clear about audit results, or even aware that an audit had taken place, which indicated some deficits in internal communication. Information generated from interactions with the federal government regarding needed changes in practices, though, was fed back to organizational members and discussed.
align incentive structures (Petitt, 1996; Wilson, 1995; Ferris and Tang, 1993; Lundberg,
1983; Perry and Porter, 1982), as well as provide clarity of purpose (Ruiz-Mercader,
Merono-Cerdan, and Sabater-Sanchez, 2006; Davenport, De Long, and Beers, 1998;
Bishop, 2003) and a shared vision (Senge, 1994). Clear congruency is needed between
organizational goals and activities and the policy objectives, or at least how the policy
objectives are understood by the organization and/or organization management.
The federal policy as operationalized by USCIS was very clear. The primary
focus from the federal perspective was the collection of enrollment information about
international students from educational institutions, which also involved initiating
enrollment in SEVIS (i.e., certification) on the front end of the process and the ability to
respond to audits once reported data was submitted. The organization used this policy
structure to guide its implementation process and ingrain compliance as part of the
structure of the organization. There was a tension, though, between the requirements of
the policy and a principal component of the organization’s mission. While this created a
challenge, it did not limit the ability of the organization to respond to the policy
requirements.
The clarity of the policy requirements enabled the organization to develop a clear
implementation plan and points of exploration where alternatives for addressing policy
objectives could be formulated and tested. What resulted was a structure within the
organization to manage the new data system, and relationships with appropriate entities
from around the institution to support them. The efforts of the department to establish
congruence between the policy and the activity within the organization are best displayed
in an August 2004 progress report from the director to the campus administration entitled
“SEVIS: Moving to routine.” The director outlines not only what was done to meet
compliance, but also the organizational changes that had been, and still needed to be,
undertaken to ensure sustainability in implementation, including the batch processing
system and changes in staffing. Even in those areas where flexibility in the policy existed to
enable the fit to various institutions, the policy required “documentation to show this is
the reason why I’m doing this and that you’re still in compliance with the rules” (43).
The department also made staffing changes throughout the implementation process
(some that worked and others that did not) in response to policy requirements.[63]
The organization continues to face challenges in integrating responsibility for
SEVIS into its normal operations. This, however, has not resulted in a failure to meet
policy demands, but is a problem because other priorities did not disappear. The
department head viewed one of his primary roles as maintaining a clear perspective on
the place of SEVIS in reference to the overall operations of the department. SEVIS
needed to be kept within the context of the organization’s mission and goals. So the
organization continues to struggle somewhat as a result, addressing the needs of SEVIS
while attempting to maintain the same expectation for serving students as existed pre-
SEVIS.
[63] One example was the creation of a position whose responsibilities included preparing the regular enrollment reports that are sent to the federal government.
Proposition 4
The organization possesses and management promotes a culture of
experimentation as it relates to the use and implementation of policy driven technology.
“So it was a lot of trial and errors...They had to develop it quickly, and it was designed in
a way where schools really had a hard time to make sure that all of the information got
captured the right way, and some of the features still don’t work” (140).
Experimentation has been deemed an essential tool in the learning process, and
particularly valuable in a situation where technology is implemented with a high degree
of uncertainty. Relevant characteristics include “trial and error” (Etzioni, 1989),
kinesthetic learning (Robey, Boudreau, and Rose, 2000; Hippel and Tyre, 1995; Marshak,
1993; Senge, 1990), valuing knowledge and innovation over hierarchy (Davenport, De
Long, and Beers, 1998), importance of disclosing mistakes as part of the learning
process, particularly among management (Ruiz-Mercader, Merono-Cerdan, and Sabater-
Sanchez, 2006; Krogh, 1998; Imai, 1986), and the practice of testing new techniques
before introducing them for general use (Ruiz-Mercader, Merono-Cerdan, and Sabater-
Sanchez, 2006). These activities ought not only to be in place; management ought also to
clearly support experimentation at the level of technology use.
While experimentation is theorized to be valuable in the learning process, within
this implementation context, its applicability was limited. This can likely be attributed to
the high risk associated with errors. Errors reported in SEVIS could result in loss of a
student’s visa and ability to continue studying in the United States. In those cases where
errors could be corrected, there were delays, particularly at the beginning, in processing
these data fixes with the SEVIS help desk. Even when they were processed, they
sometimes required that the international student report to a federal government service
center, which was associated with long waits. The department, therefore, placed a very
high value on accurate reporting, a commitment that ultimately created great challenges
since data had to be coordinated among three databases.
From the outset, there was a general understanding that both the policy and the
technology were very new, and because of this ambiguity, comment and discussion were
encouraged. This, however, manifested itself more in the early implementation and
planning, and often through a process of trial and error. To the extent it occurred with
respect to the technology, it was mostly confined to training environments, where
opportunities for experimentation with software were built into the planning.[64] So, it was
not that the process was devoid of experimentation; it simply was not strongly valued and
surfaced as a measure of last resort. There was a preference to enable learning through
the mistakes of others. NAFSA, as well as other institutions, provided that information,
and it was because of these strong networks that the organization was able to rely less
on experimentation.
Proposition 5
The learning process and effective implementation are more easily achieved when
the organization provides clarity of roles and responsibilities through the implementation
process.
[64] One notable exception was the innovation of capturing student information online using web forms; however, the measure really provided efficiency gains using established technology, so the practice was not truly experimental.
“Data management is overseen by international education professionals who know
regulations and SEVIS. This is the only way to ensure compliance while protecting our
students and scholars from unnecessary or excessive reporting.” (1308)
A clear understanding of membership roles and responsibilities benefits individuals both
in terms of their own performance and in their ability to identify where to
access other human resources in the organization. This becomes particularly important
under conditions of organizational change (Damanpour, 1991; Porras and Robertson,
1987; Van Maanen and Schein, 1979). Because technology evolves rapidly,
organizations need to anticipate change, which in turn requires flexibility and reflection.
Because of the variety of users and the reliance on supporting departments to
assist with the technology and student data, the definition of roles and responsibilities
was essential. Among users, the SEVIS system provided clarity regarding use of
technology, but clarification was required as new technical responsibilities were
integrated into the advisor role. The need for additional staffing was also an issue, the
approach to which has evolved over the course of implementation. The clarification of
roles has facilitated learning processes within the organization, ensuring coordinated
activities in both implementation planning and use of technology.
As discussed above, initially roles were not clear. IE, in leading the effort, worked
to develop a structured response, largely under the leadership of the SEVIS coordinator.
The process was iterative, not dictatorial. Regular meetings were held, initially for
planning and then through implementation, and followed up with minutes and task lists.
Efforts were made to involve participants in developing their roles, a process largely
orchestrated by the coordinator. This process of inclusion enabled the international
education department to engender sufficient support from other departments that had
technical expertise.
Slowly, the users learned their responsibilities in interfacing with the technology,
and the group of advisors also collaborated in establishing standards of practice.
The discussion helped to form consensus and provide buy-in to the decisions being
made.[65]
As articulated by the manager in the language program, “on a campus level we
work very closely with [IE staff] and have meetings where we talk about strategy or
procedures that we need to adopt in order to be compliant in this area or that area and
how we go about this as a campus” (155). This also helped to generate support among
these users when the situation grew critical, especially in preparing for the various
milestones. Implementation required all hands on deck and “we all became data-entry
clerks…Sometimes we just have to work late at night and hours and overtime” (1086).
Still, the fundamental changes to the advisor role introduced through SEVIS created
challenges as the organization and the advisors grappled with the additional
responsibilities, the need to acquire new technical skills, and their role as agents of DHS.
While a source of conflict, redefining roles and responsibilities in this way placed a
strong demand on the organization and its members to change.
Another struggle was how the organization balanced technical and advisor-related
skills in meeting the need for additional staff. While the failure of the initial compliance
coordinator[66] indicated the need for more than just technical ability, the need for staff
with technical skills reemerged as the organization reached clarity in how it would
[65] This was particularly important since users came from different divisions at the institution and so were not in the same organizational chain of command.

[66] She had a tenure of less than two months.
manage implementation. In the meantime, the SEVIS coordinator struggled to manage
both compliance and advisement.
Proposition 6
Organizations must support a well developed communication system within the
organization and among organization members to facilitate the learning process and
effective implementation.
“The more important test of my leadership and management capabilities is…number
one, to ensure that the communication was constant and that everyone was being kept in
the loop.” (216)
Internal communication as a means to facilitating organizational learning
(Nonaka, 1994; Damanpour, 1991; Levy, 1986; Porras and Robertson, 1987; Elmore,
1982) is critical to distributing information across the organization and transforming
information into knowledge. This involves interpreting and integrating (Crossan, Lane,
and White, 1999) and a constant exchange between tacit and explicit knowledge
(Nonaka, 1994). This is also the mechanism by which individual learning is
communicated throughout the organization, and it does so through a variety of
communication channels, not just along hierarchical structures.
Because of the complexity and coordination required for the implementation and
maintenance of SEVIS, internal communication, within the department, across users, and
among support departments, was expected to be absolutely critical in facilitating the
learning process, and, indeed, it was. From the start, the organization and management
approached the understanding of the policy and its implementation with a participative
perspective that engaged those involved in implementation. The implementation process
was characterized by regular communications among involved parties, as well as among
users of SEVIS, as advisors worked to determine best practices, maintain training and
resource guides, and continue to refine the interpretations of policy.
The various levels of implementation were characterized by participative decision
making that involved users and supporters of the implementation, and where parties
worked toward consensus. As stated by the manager, “decisions are made as a team”
(184) and, while SIT served as the core team that structured the campus response, that
team approach was reflected within the various groups of users and support entities as
they managed implementation and use of SEVIS. The communication between
supporting departments was crucial but created special challenges largely because there
was less common ground and shared investment in the project. Much of this was
overcome with the inclusion of both records and technology staff on the SEVIS
implementation team, where they participated in meetings and were included on regular
communications. This helped to establish a clear understanding of the policy, and
provided those departments with the necessary information when conflicts arose.
Still, the responsibility for compliance rested with the international office, and it
was through its actions that the collective efforts were brought together to produce
learning during implementation. There was clarity regarding who had leadership of the
project and who was ultimately responsible for making decisions. With that clarity,
meetings were held regularly and communication lines were generally good, much of this
under the direction of the coordinator. As implementation stretched on over months, she,
along with the IE director, maintained the institutional memory and made that readily
available to those participants.
Among advisors there was an effort to coordinate, both in the implementation and
initial policy interpretation process, but continuing into the use and refinement of SEVIS.
Within the international office itself, there was a very strong sense of coordination and
collaboration among the core SIT, which was further facilitated by the shared office
space. Ultimately, these lead users were seen as providing more than sufficient support,
training, and assistance, both within international education and among the advisors.
Because users were located in different office spaces and spanned different departments
and divisions, internal communication mechanisms between the two advising
departments were more difficult, but also essential. The organization addressed those
challenges as remarked by an advisor in language programs:
On a campus level we work very closely with [IE staff] and have meetings where
we talk about strategy or procedures that we need to adopt in order to be
compliant in this area or that area and how we go about this as a campus. And you
just go through a lot, not necessarily a lot of training because the system is not
that complicated. It’s just making sure that you can interpret the data properly and
report in the appropriate way. (155)
Regular meetings and articulated business guidelines helped to develop shared
practice and a commitment to the integrity of how SEVIS was used. Indeed,
collaborations between the departments extended beyond SEVIS to joint programming
and events, and that relationship grew largely as a result of their working together on
SEVIS.
Proposition 7
Processes of evaluation and routinization of practices are essential for
documenting and directing the learning process and enabling effective implementation.
“She is still helping us [with] the writing of the BPGs [business practice
guidelines]…And that is so critical, writing BPGs.” (1048)
This proposition advances the importance of the process of documenting
practices and learning efforts within the organization. Organizations need to engage in
evaluation with a willingness to change (Crossan, Lane, and White, 1999; Senge, 1994;
Cole, 1991), and that evaluation should be purposeful with a value placed on
improvement (Hackman and Wageman, 1995; Tyre and Orlikowski, 1993). The need to
collect information, and evaluate and adjust implementation and the use of technology, as
needed, is critical for effective use of policy driven technology in the long run.
There has been a clear and ongoing commitment to develop routines and integrate
the work of SEVIS into the normal business operations. These routines and the
evaluation of procedures have manifested themselves throughout the implementation
process and in the adoption and use of SEVIS by advisors. Much effort was required to
develop standard practices, but there was general acceptance of the need for routine and
the support for this continues. Indeed, an organizational culture committed to these
practices has emerged.
This process of evaluation/routinization has played a critical role in the learning
throughout both initial implementation planning and ongoing use. From the onset of
implementation, clear efforts were made to document operating procedures and articulate
interpretations of policy. This was particularly important for data management efforts
involving support entities, and is a common practice in data management, generally.
Structured learning as a means of developing best practices and routines was incorporated into training as well.[67]
Within the language program, the strategy of learning was routinization. New staff would learn basic data entry; as additional issues arose, they had an opportunity to process new types of information and so slowly build up a set of responsibilities. Further, the general practice of process adoption among advisors tended to follow four steps: Discuss, enDeavor, Document, and Distribute.[68]
When a
new process arose in one of the departments, there was a tendency to pull other users into
the room and discuss how best to proceed (or just to allow the others to see how it
worked), often in the midst of an advising session with an international student. From
that a best practice was determined, which was documented and shared among users.
This four Ds model of process adoption enabled the organization to routinize novel
practices quickly.
An inherent challenge in managing SEVIS is articulated in a point raised by the manager in a progress report submitted to the campus administration, which states that while SEVIS “is highly technical, the day to day human decisions are many and often complex” (1307). This suggests why defining practices was so important. Indeed, business practices are not only clearly defined and distributed; there is now an expectation that they will be developed as new issues arise and processes are put into place.
A distinct culture around this has emerged. Procedures are documented in a campus-based step-by-step manual that augments the training materials provided by NAFSA and USCIS. These resources, of course, are informed by lessons learned by users in the organization, as well as data collected from other implementing entities and NAFSA. The implementation team has also revisited some of those interpretations and procedures and made adjustments as needed.

[67] An example is when the batch processing system first became available. It was provided to users for about two weeks to acquaint themselves with it and submit questions and comments. Shortly thereafter, a meeting was scheduled to cover additional training, evaluation, and discussion of business practices.

[68] The Four Ds Model of Process Adoption.
The standard for much of this has been set by audits and the recertification
process, which have helped put this reporting culture into place. Not only have they
tested the integrity of data, but they have also required the organization to articulate the
various interpretations of policies in effect at the institution. As such, the various
decision points in the initial policy have been replaced by articulated procedures. One result, as respondents universally indicated, is that for staff hired after those initial implementation years, the whole of SEVIS is very straightforward. Similarly, evaluation
also is valued as a function of data management and assessing the integrity of data across
systems.
Organizational Learning
Evidence of learning in response to the policy-directed technology (PDT) has
been observed in two broad areas. These include substantial changes in the
organizational culture and membership and the development of new operational systems
in response to the technology. The roles of members and of the organization as a whole
have changed substantially as the advising function has adapted to the technical
responsibilities inherent in the new federal policy. This has involved the creation of new
technically focused positions, which further reinforce these changes in organizational
culture. In terms of these operational systems, the organization has created documented
procedures serving both the user and the more technical interfaces involved with the
various databases. These procedures have evolved through planning, trial and error, and
the coordinated efforts of the parties involved in the implementation.
Changes in Organizational Culture and Membership
Over the course of implementing the SEVIS system, the organization has
witnessed a change in the roles of members, a change in its membership, and a shift in its
culture. This has been in response to the demands of the policy, and as organizational
members and the leadership have come to better understand SEVIS and its related
technology. Ultimately, the effect has been the development of more technically oriented
functions and operations.
The change in employee roles and in the organization's own role has been dramatic. Staff are no longer simply advisors, but both advisors and technologists.
Throughout the implementation process there has been an uneasy balancing of these two
obligations, a tension well recognized by the leadership.
SEVIS has seen unprecedented level of resources directed into the whole
compliance area and fewer and fewer resources devoted to student programming,
leadership development…what we saw at the time of SEVIS implementation was
this full-on focus on SEVIS, SEVIS, SEVIS and my responsibility as a leader
[was] to remind everyone including myself that what we’re here for is more than
about SEVIS. We’re here for students and we try to keep that in sight, but I think
that sometimes it was very easy to lose that focus on our students because SEVIS
became our reason for being, our reason for existence, and I think sometimes we
lost focus on our students which is unfortunate. (179-180)
For the advisors, this demanded a substantial alteration in how they operated. Counseling sessions with students came to be mediated by SEVIS (and a computer screen) as student information was verified, and there was a need to be particularly careful about the advice provided to students, lest it take them out of compliance with federal policy.[69]
Indeed, at different points throughout implementation, advisors spent increasing
amounts of time in dealing with SEVIS, not just in use, as they advised students, but in
development or consideration of policy interpretation and in understanding databases and
how they interacted with one another. Technical skills were needed to manage the complex (e.g., data mapping and synchronization across databases) and the mundane (e.g., data entry, fact checking, and generating reports). Nonetheless, the process generated conflicts, as discussed and as indicated by the comments of another advisor: “We’ve
seen that the role of an advisor has really changed perception from student, from the
public, from anyone that we have now become reporting agents of DHS [Department of
Homeland Security], because we’re using SEVIS” (1035).
Beyond the role of the advisors, SEVIS has had an impact on the broader
organization, particularly in the international office. There was active participation in
implementation and technology use among the entire staff. This was particularly true in
managing the milestones in the initial implementation. Implementation required all
hands on deck: “We all became data-entry clerks…Sometimes we just have to work late
at night and hours and overtime.” (1086) Through this process organizational members
developed a common understanding of the regulations and the technology, and ultimately a shared knowledge base.

[69] An example of this is when students are having difficulty with courses. In the past, they may have been advised to drop the course so that it would not affect their GPA. Under SEVIS, dropping below full-time enrollment would necessitate that a student be deported, so advisors and students need to take more care and understand the larger consequences.
Further, as the implementation unfolded, new staff members were hired. While
these were only a pair of positions in each of the advising departments, these individuals
came on board with a clear understanding that SEVIS was part of the job and a high
regard for its contributions to their work. “I actually started working with it. I just
thought it was just part of my job...just basic data entry” (1). They also came on board
with a more technical or administrative background than previous advisors, and thus
approached student service from a different perspective. The following description by
one of the new advisors is indicative of a different perspective of this role than had been
the case prior to SEVIS.
I’m one of the international advisers, and as an adviser, I meet with students on a
day-to-day basis regarding their immigration status or their—just general
questions regarding their immigration status. I guess I'm in charge of updating
the records as far as the immigration records go, so we use SEVIS for that. And I
as the DSO, I'm the one inputting information into SEVIS, updating information
and data in SEVIS, issuing I-20s for students, as well as doing all the travel
signatures or authorizing their employment as far as SEVIS goes. (222)
This contributed to a shift in the culture of the organization, in which technical expertise has grown in currency as the organization has risen to the challenge of meeting these new, very technical requirements. This change has made the organization more adept at managing the technology, and has further emphasized the focus on compliance over other aspects of student advisement.
Development of New Operational Systems
An inherent challenge in managing SEVIS is articulated in a point raised by the
manager in a progress report submitted to the campus administration, which states that
while SEVIS “is highly technical, the day to day human decisions are many and often
complex” (1307). This suggests why defining practices was so important. Such practices were both technical in nature, relating to the interface of databases, and user- and program-driven, including such issues as deadline dates for enrollment
and changes to student registration. What resulted from a collaborative development
process is a clearly documented set of practices related to SEVIS, which reflected an
underlying organizational learning process. While the process was managed and
facilitated by the international office, it was highly participative in the decision making,
and committed to efforts of documenting processes and ensuring that this information
was readily available to users. Ultimately, individual learning and lessons from outside
entities were collected, discussed and disseminated as part of the operational systems for
managing the technology.
A key component of these practices was related to the users. The group of
advisors met regularly and deliberated decisions related to the use of SEVIS—its impact on their particular group of students or scholars, and the possible unanticipated consequences of a particular decision. While staff in the international office
formed the core group of planners, regular meetings were held among the advisors in the
different departments to ensure the procedures and interpretations of policy were well
informed. Such discussions were also informed by feedback collected from NAFSA and
other campuses, as well as by lessons learned by users in the organization. There was
also an interest in revisiting these determinations after a semester or two in practice. On
several occasions they were adjusted, in some instances to improve benefit to students by
providing them additional time. On other occasions they were adjusted to improve the
interface with other data systems, in particular the student records database. As stated by
the manager, “decisions are made as a team” (184).
Efforts were made to incorporate the policy-driven technology into the general operation of the organization, creating routines out of novel
processes, largely following the four Ds model as discussed, which enabled the quick
adoption of routines. Advisors consulted with one another, even in the midst of an
advising session. Any decisions were documented and shared among users. Further,
within the language program in particular, the strategy to train new users on the system
reflected a strong drive toward routinization. New staff would learn basic data entry; as additional issues arose, they had an opportunity to process new types of information and
so slowly build up sets of responsibilities. Such activity reinforced the processes as
adopted by the organization.
Further, documentation and operating guidelines served as important
communication tools throughout the implementation. A distinct culture around
documentation has emerged. These materials, which augment the training materials provided by NAFSA and USCIS, have been provided to all advisors and have given even new staff a clear understanding of how to navigate the system, enter information, and understand their role within the organization. Further, these efforts to document have been particularly helpful as processes have been revisited, since they have provided a clear and shared understanding of current practices, as well as a clear starting point from which to amend current operating procedures.
[O]n a campus level we work very closely with [staff in the international office]
and have, you know meetings where we talk about strategy or procedures that we
need to adopt in order to be compliant in this area or that area and how we go
about this as a campus. And you just go through a lot, not necessarily a lot of
training because the system is not that complicated. It’s just making sure that you
can interpret the data properly and report in the appropriate way. (155)
This has certainly been the case as the organization responded to a change in the student record system. The clarity that these materials provide is nowhere more evident than in the responses of more recent hires, who not only found the system very user friendly, but were also very clear about their role. They are not mired in the ambiguity of policy implementation and find that SEVIS “is just part of my job” (1).
Ultimately, the SEVIS coordinator, who also continued to serve as an advisor,
played a critical role in not only facilitating the policy implementation, but also in
supporting the development of these operational systems. As the department learned in
initially trying to fill this role with an individual with technical expertise, the
management of implementation, at least initially, required a knowledge base of not only
the technical aspects of implementation but a sophisticated understanding of immigration
laws and student needs. The role was critical in linking the technical implementation
process (a process coordinated with student records and the technology department) with
that of the users who interfaced with the system and with students. Much of this was
taken on by the coordinator, and she served a key role in the day-to-day management of the project and in facilitating these lines of communication (e.g., developing timelines and task lists, facilitating and following up from meetings, developing training, and leading development of business operating guidelines). This structure was crucial to keeping the participants well
aware of the progress in implementation and engaged in the process. As implementation
stretched on over months, she maintained the institutional memory and made that readily
available to those participants.
Achieving Implementation Objectives
The implementation objectives considered here reflect both the demands of the
policy and the goals and values of the organization. For the federal government, it is an
issue of compliance with the mandate, which includes such items as maintaining
certification, providing regular reporting, and responding to audits. Organizationally, the
department is interested in making SEVIS a sustainable activity in their department in
light of competing interests to provide students quality services and programming. As
noted, compliance has been substantiated by their recertification and positive audit
reports. While this aspect has been met, the organization has also sought to maintain its service level to students,[70] but it has had to rely increasingly on other departments to
supplement its programming. Although this reflects partnerships, which have enabled
international students to be better mainstreamed into the campus, it is an indication of the
loss that has occurred in the department, a loss which is ongoing even as SEVIS has been
integrated into the general operations of the organization.
The implementation of SEVIS has followed a traditional pattern of top-down implementation, where the organization has been charged with a new mandate. The motivation to comply was high for the organization, since failure to do so would mean that the campus could not enroll international students. Errors could also carry substantial consequences, lending to a generally risk-averse tendency in the organization.

[70] Services that the advisors have offered include academic counseling, support in managing transition issues, and promotion of student development in such areas as leadership, cross-cultural understanding, and general areas related to student maturation.
Organizational learning was of particular importance since the policy as operationalized
by the campus required a high level of cross departmental coordination. While learning
has contributed to the achievement of the implementation objectives in helping to direct
necessary changes in the membership and culture of the organization and in the
development of operational processes to sustain implementation, the contributions of
learning to the organization’s “long-run adaptive capacity” (Schwandt and Marquardt,
2000) is less clear. This lack of clarity is largely due to problems that the organization is
facing in managing a new technology mandate related to the new campus student records
system. While organizational learning has served in aiding the organization in
implementation, technical challenges continue to be substantial. The changes in the
organization resulting from the initial implementation, though, have enabled the
organization to engage the problem at a more sophisticated level. Conversely, as noted, the organization continues to struggle under the weight of implementation, having been unable to maintain many of the support services it provided prior to SEVIS. So while learning has advanced the ability of the organization to manage the technical aspects of its implementation objectives, not all objectives have been achieved.
In terms of the value of learning for supporting implementation objectives,
learning has been important to the implementation process, particularly in overcoming
the challenges associated with managing compliance among multiple user groups on the
campus. The need to coordinate implementation among various groups of students added
a level of complexity to implementation. Learning would have been less relevant in a context where such coordination was not necessary, since the organization would have dealt with only one population of students and fewer users. Coordination among the various advisors
was most important, and it was evident that those communication lines were strong and
well developed. Several advisors remarked on the “team” approach to decision making,
as well as the expertise and support provided by veterans of the organization. As further
evidence of the strength of these ties, the relationships across the departments, which
were developed in implementing SEVIS, spawned new partnerships and collaborations.
Still, a better measure of the value of learning is in assessing the ability of an
organization to manage a similar technological challenge following the initial
implementation process. Such an opportunity presented itself here in the form of the
system wide mandate to change the student record system.
The conversion of the student record system indeed took full advantage of the
organizational learning that had taken place in the organization. Having had the
experience of implementation, the membership had developed a new set of technical
expertise, a knowledge base that was shared throughout the organization and with support
entities at the university. And those relationships continued and have served the
organization throughout this second implementation process. The operational systems in
place have also continued in use and have been important instruments and reference
materials as the organization has managed this new conversion. Yet, the organization has
faced serious implementation problems with this conversion. The university has brought
on a consultant to assist directly with issues related to the new student records system, and the process has been complicated by the loss of the batch processing system,[71] which has required the development of new protocols by which SEVIS and the student records
system communicate directly. Through the process advisors have been as engaged in
reviewing the technical details of conversion as they had with the introduction of SEVIS,
with the need to be focused on issues related to immigration law—again as they had with
SEVIS. However, although this new implementation has been difficult, there is no
indication that the learning processes developed through the initial implementation have
failed. As noted, the organization has relied heavily on those processes. The same
process of testing data integrity as had been done when SEVIS came on board continues
to be used. Indeed, some of the difficulties that the organization is facing with this implementation are yielding process improvements, opportunities of which the organization would not have been aware without the previous experience. Still, this process raises questions about barriers to achieving implementation objectives, and to technology implementation specifically, namely limited resources and highly technical and cumbersome policy-directed technologies.
As suggested by Yin (1982) in his analysis of implementation studies, it is
essential to consider alternative explanations as to why implementation occurred
successfully. In reflecting back to Matland’s (1995) classifications of implementation, it
does hold that SEVIS is indeed an administrative policy, where resources play the key
element in implementation. However, while ambiguity may have in fact been low in
terms of identifying the path to implementation, the technology related aspect has made
the process of implementation much more difficult. Thus, this may suggest that, in dealing with technology-related policy, the level of complexity of the policy under study is of particular relevance. Certainly, as this suggests, organizational learning is not sufficient for achieving implementation objectives, but its presence, in advancing the long-run adaptability of an organization, can help facilitate the adoption of technology by lowering the barriers to implementation.

[71] The company discontinued support for the batch processing system, a move seemingly related to the difficulty of keeping up with changes to SEVIS. While other vendors are available, with the pending conversion, the university decided to rely on a direct interface between the campus system and SEVIS.
Conclusion
As observed in the case of SEVIS, the learning model of policy implementation
served as a useful tool in explaining the underlying organizational learning process.
Further, learning was found to contribute to the achievement of implementation
objectives. Networking with other implementing organizations and
evaluation/routinization (with an emphasis on routinization) were found to be highly
valuable in promoting learning for the organization. In responding to implementation
objectives, the organization clearly met the compliance and reporting requirements of the federal government; however, it continues to struggle with managing many elements of the student service functions to which the organization is committed and which played a more central role prior to SEVIS.
The organization’s adaptive capacity, an important element of organizational
learning, was tested as it dealt with a second mandated policy driven technology related
to its campus student records system. This implementation required the organization to
return to many of the initial technical and database interface issues raised in the SEVIS
implementation. While this new implementation has not passed without difficulty, the
organization has taken advantage of the learning that took place with SEVIS, reflecting
back on and updating operational systems put into place and utilizing the new technical
skill set of its staff.
While the learning model of policy implementation was indeed found to serve this case well, its application in other cases is important in advancing its relevancy. The model is applied to the case of online education in the following chapter. In the concluding chapter, a cross-case analysis is provided alongside the more important work of discussing this model and its implications for policy implementation.
Chapter 5: Case Study of Online Education
Introduction
In this chapter, the evidence of organizational learning is considered in the
context of an academic department’s rapid expansion of online courses. This study
identifies that organizational learning has taken place and has been largely supported by the conditions of the model. The learning took the form of directed changes in organizational culture and membership and the development of new operational systems, both of which have advanced the achievement of implementation objectives.
As discussed, expansion of online education represents an emerging trend, and is
a process that benefits greatly from the transfer of individual learning among faculty in
the use of technology tools to the organization as a whole. The department under study
has indeed participated in such a pattern of growth, as it has seen a nearly fivefold growth
in online sections in less than two years, representing more than half of its total enrollment. The efforts of this department will be examined in depth here with a focus
on the learning that has occurred.
While the focus is learning, this examination will begin with a discussion of the
online education literature and the various policies that have directed implementation in
this organization. A more detailed profile of the department will then be provided,
suggesting the importance of an opportunity window where online education serves as a
tool to address enrollment problems within the department. The implementation
narrative suggests that the department has undergone five phases over the course of its
implementation. These phases are marked by the development of an online component of
a course, which led to the development of a prototype online course in the department;
planning for expanded implementation of online; followed by the rapid introduction of
these courses; and finally a focus on instructional quality. Evident in this unfolding is a
tension with regards to organizational goals, namely between enrollment growth and
instructional quality. As the analysis will indicate, the model does well in characterizing
the underlying learning processes with the condition of a culture of experimentation
being the least relevant. Organizational learning is identified as taking place, in the form of directed changes in organizational membership and culture and the development of new operational systems. Further, this learning was found to advance the achievement of implementation objectives.
Policy Background
The implementation of this policy is informed by a variety of sources. It is based
on the broader context of online education, so this policy review will begin with a brief
overview of the literature. From a departmental perspective, much of the direction
towards online was informed by emerging interest in pursuing online education at the system and university level. A loose set of policies helped shape this agenda, coupled
with various policies in practice which took shape around directives, projects, and
operational incentives. These will be reviewed as a means of clarifying the underlying
motivation for the department pursuing this policy. A discussion of the regulatory and
accreditation requirements that help to operationalize and inform some of the details of
the implementation will then be provided. These are based on policy statements from the
university and its accreditation body. While this overall effort can be viewed as a
commitment driven by management, it is important to understand it in a broader context
in which online education became an effective and viable policy tool in overcoming
organizational challenges. This review seeks to provide that context.
Online Education Literature
Online education, which has its roots in both computer assisted instruction and
distance learning, has witnessed a great transformation in American higher education
over just the past 12 years. A form of education that once existed at the periphery of higher education has been growing in popularity, use, and acceptance. While this is surely the case, the growth has not occurred smoothly, and in many ways online education retains the liabilities associated with these roots. Still, the literature suggests many strategies by which to
approach implementation, and by association some pitfalls. Examples of both are seen in
use by the organization under study.
The growth in popularity of online education has come as the technology has
grown in sophistication and become more widely available and as it has been endorsed,
through its use, by prestigious institutions validating its legitimacy as a viable source of
quality instruction. As Larreamendy-Joerns and Leinhardt (2006) suggest,
The promise of all these undertakings is to deliver courses that possess the
signature of academic excellence and incorporate sound cognitive and
instructional principles. These initiatives have been launched against the backdrop
of longstanding endeavors such as the British Open University, almost prophetic
in its vision; Pennsylvania State University’s World Campus program, which
continues a vigorous tradition of more than a hundred years of distance education;
and the University of Wisconsin-Extension, one of the longstanding leaders of
distance education. Although some online ventures by top schools have failed
(for example, Columbia’s Fathom.com), these initiatives have succeeded in
capturing the attention of the public and have provided legitimacy to similar
ventures in less renowned institutions (570).
Further indication of this change was the consideration and eventual relaxing of
the federal government’s 50% rule, which required that in order to access federal student
aid no more than 50% of a college’s enrollment could come from distance learners, a
move that was largely motivated by the growth of legitimate online education programs, and which further spurred growth (Carnevale, 2003).
Larreamendy-Joerns and Leinhardt (2006) recognize that online education,
because of its roots, has inherited the rich scholarly pedagogy of instructional technology,
as well as the liabilities of distance education, which include the concern of instructional quality.[72]
This concern has been based on such issues as the limitations of the medium or technology used, restrictions on the interplay between instructor and student, and the quality of instructors. Indeed, the issue of instructional quality undergirds much of the
discussion related to online education in the literature. In particular, when coupled with
the observation that administration and not faculty tend to be the driver for expanding
online, there is a tension between the business model of online education and pedagogy
of its instruction (Larreamendy-Joerns and Leinhardt, 2006; Tallent-Runnels et al., 2006).
While this is the case, the interest in pursuing online education tends to be enrollment
growth, revenue generation, educational access, and convenience (Larreamendy-Joerns
and Leinhardt, 2006; Tallent-Runnels et al., 2006). Ultimately, what is purported is the
need to keep faculty well engaged in the design and development of online education,
which can help to minimize the divide between online and traditional academia.
[72] Two other historical themes which frame the development of distance education are “the promise of
democratization” and “the tension between professional education and liberal arts education” (572).
In their discussion of incentives for faculty participation in online instruction,
Rockwell et al. (1999) find that faculty support is based on an interest in developing new
teaching skills and on support for the educational access principles advanced by online
education. While economic rewards may be used, their research suggested that such
rewards provide neither an incentive nor a disincentive for support. Rather, continuing
participation and support are generated through recognition of faculty members’
contributions and the alignment of online instruction with underlying educational goals.
These findings will provide an interesting contrast to how the organization in this study
approached implementation.
Tallent-Runnels et al. (2006) offer a comprehensive review of the literature on
online education, which yields best practice strategies.[73]
Findings from the review suggested the importance of creating small learning
communities, providing timely feedback, and training both students and faculty. The
review also indicated the presence of a large number of learning styles, suggesting the
need for multimodal approaches in online courses that meet the needs of diverse learners.
Finally, in terms of learning outcomes, studies indicated that “both methods of delivery
were adequate. In some studies, students in the online classes outperformed students in
the traditional classes and vice versa” (109).
This latter finding is particularly important for, as discussed, critics of online
education charge that the loss of the traditional in-class experience diminishes the quality
of education. It is important to acknowledge that, while this tension exists, one must also
recognize that there are examples of both high-quality online education and poor-quality
traditional classroom education. The mode of delivery in and of itself is not the root of
the problem.

[73] The broad categories gleaned from the review were four general areas: course environment, learning
outcomes, learner characteristics, and institutional and administrative aspects.
Campus and System Motivation
Several factors have influenced the desire and pursuit of online education at both
the university and system-wide level, which mirror motivations as suggested in the
literature. In many ways these interests have supported each other, but they have also
operated independently. The focus of the system centers on a commitment to provide
educational access, but is also shaped by economic interests. Similar factors
influence the campus approach to online education, but it is nuanced by an interest in
dealing with more local concerns. Both these sets of interests are advanced by policies in
practice, which are a collection of programs, initiatives, and taskforces underway that
communicate the commitment and help shape the process by which organizations within
the institution approach implementation.
System interest is largely based around the commitment as a comprehensive
public system of higher education to deliver quality education to eligible students in the
state. It is thus focused on issues of educational access, while also seeking to bring
efficiencies to the enterprise and take advantage of economies of scale by developing
coordinated efforts that meet the needs of multiple campuses. Other economic factors
include use of online education as a tool to meet growing and evolving demands, through
a more flexible and cost effective means than the investment in new buildings and
additional campuses. Support for online initiatives includes efforts to develop special
programs, services, and collaboratives, which “support academic technology initiatives
that enable student success through effective collaboration” (1365). While these efforts
support a wide variety of projects, online education is a significant beneficiary.
As noted, the university drive includes the pursuit of educational access, but is
shaped by other interests, related and unrelated to economics. There are four additional
drivers directing energy to online course development. First, the university maintains a
desire to be innovative and sees online education as part of that innovation. Second, it
seeks to meet emerging student needs, not so much an access issue as a convenience one
that may influence student enrollment choice. Third, online education provides access to
populations with special needs (e.g., nurses, military personnel), which in many ways
mirrors system-wide efforts. And finally, it is a means to reduce the demand on campus
resources and infrastructure. This latter interest, while certainly seen as a useful means to
grow enrollment with only a marginal increase in costs, is not a strong motivation since
online programs are seen as attracting new populations of students, not replacing existing
traditional courses.[74] These interests, of course, are also informed by an interest in
educational quality. The campus has over eight years of experience providing online
courses, and currently offers four online degree programs.[75] University resources have
been provided to departments developing online degree programs,[76] but not for individual
online courses, and the development of such programs is encouraged in areas where
there are niche markets which will support their enrollment. The university, as noted,
[74] A further interest motivating the campus’s approach is to be thoughtful as it ventures into online
education, focusing efforts on niche markets with little competition.
[75] All these programs are at the graduate level and in technology-related fields.
[76] The support has generally been through the provision of course release time for faculty, which enables
them to develop online courses.
has standards in place which articulate basic requirements for online courses, and a
process to approve online course proposals.[77]
While issues of instructional quality are
important, such matters are regulated at the college level, so there is no central
monitoring of online education beyond course proposal approval.
So while economics plays a role in the pursuit of online education, at both levels,
it is not the driving force. There is no mandate. And while many programs are in place
to facilitate and accelerate the move online, it would not be at the expense of traditional
delivery or instructional quality. Despite a fair amount of resources being made available
and rhetoric being directed from the bully pulpit at both levels, the actual process has
been permitted to unfold at the department level, based on faculty interest and
commitment.
Regulations and Accreditation
Beyond those efforts that influenced the underlying goals and intent of online
education policy are sets of policy statements, regulations, and accreditation requirements
that have shaped and informed the implementation process. These are based largely on
the university and from the accreditation body, which itself is directed by the federal
government. While largely technical in nature, they remain an underdeveloped set of
regulations, particularly the former. This has enabled the department to, in some ways,
help influence their development on campus and to take certain liberties in its execution
of online education. These two sets of regulations will be discussed briefly.
[77] The review process for online course proposals for existing courses includes approval by the
department chair, the department curriculum committee, the department full-time faculty as a whole, and
then by the college and university curriculum committees, the faculty governing body, and the president of
the university.
The primary university policy governing online instruction is a policy statement
that establishes minimum requirements which must be met when teaching online
courses; it is also shaped by regional accreditation bodies, which in turn respond to
national accreditation standards established by the Department of Education (1366).
Specifically, this campus policy defines an online course, identifies requirements and
expectations for students[78] and faculty, and articulates the process of developing new
online courses.[79]
The federal government also has policy that directs how campuses approach
online education. Federal law, administered by the Department of Education through its
oversight of accrediting agencies, stipulates that accrediting agencies must review and
approve any substantive changes that occur within a previously approved academic
program.[80]
Included in the definition of substantive changes is the offering of at least 50% of an
educational program at a location away from the main campus. This regulation has been
used as the basis for accrediting agencies to review academic programs that move from
traditionally based instruction to an online environment at the degree program level, and
so it will apply to the case being discussed. The accrediting body under which this
university falls is the Western Association of Schools and Colleges (WASC), which is
charged with ensuring that institutions of higher education meet acceptable levels of
quality. Accreditation review is undertaken at the institutional as well as programmatic
level (which varies by institution).[81]

[78] These include interaction with faculty, availability of support, quality of instruction, and notice to
students as to whether a course is web-based.
[79] There are other policies related to curriculum and programs. While relevant as a matter of policy, they
lend little to the specific challenges of implementing online education and so will not be examined in this
analysis (1367-1369).
[80] Department of Education, The Secretary’s Recognition of Accrediting Agencies; Final Rule, Federal
Register, 64: 202, October 20, 1999. 56621. 602.22 Substantive Changes.
Substantive changes in a program, however, require a
new review, which includes the authority to offer an online degree. While the WASC
policy specifies a 50% threshold in the change of how the program is delivered, it is still
ambiguous as to what constitutes 50% of an online program (e.g., number of courses,
number of students, total enrollment). Similar to the campus perspective, WASC is not
concerned with individual course instruction, but with larger issues, and considers such
issues as the availability of advising, access to resources, and institutional and support
services, which need to be made available to students via an online medium (1370).
There were several efforts underway to expand online offerings at the university
and system wide, but there was no mandate or urgent call requiring action. This was also
tempered by commitments to quality instruction and student learning. The interest to
push hard towards online education came from the department leadership, rooted largely
in the need to improve enrollment numbers for the department. So the chair tapped into
this general interest from the administration to add legitimacy to the department’s efforts,
and in some ways urgency as well. In many ways this was enabled by the diffuse set of
policies mixed with a strong expression of interest in its pursuit as communicated at both
the system and university level. A more detailed look at how this unfolded at the campus
will now be explored.
[81] A comprehensive campus WASC review is undertaken once every seven to ten years, depending on
previous accreditation findings. Departments may also elect to undergo accreditation and, depending on
the discipline, may be reviewed by different accreditation bodies. This department had not elected to be
reviewed separately.
Organizational Profile
The academic department under study is within a large public comprehensive
university in the southwestern United States. The discipline is a social science mostly
based in the humanities, but it has some roots in the natural sciences; the balance varies
considerably across the department’s several concentrations, in each of which faculty
maintain a focus. It is important to note that the department is not in a technical field,
which would arguably lend itself better to delivery in a technical medium and whose
faculty would naturally have more experience in dealing with technology.
Prior to the implementation of online courses, the department included seven
full-time faculty members,[82] with the most recent hire joining the department in 2005.
An additional five full-time faculty members have been hired since the implementation
began.[83] The department has also maintained an extensive roster of part-time faculty,
exceeding 20 throughout the study.[84] Among full-time faculty, at least
Among full-time faculty, at least
at the onset of implementation, there was a mix of technical aptitude, as can be gleaned
from the research. There were a couple with strong interests in technology and
technology in the classroom, and a couple who were very wary of technology. The ones
with strong interest shared the same concentration, and it was the area that offered the
first example of online instruction in the department. The other faculty lay somewhere
between the two extremes.
[82] One retired recently but continues to teach online sections on a part-time basis.
[83] An additional four to five full-time faculty recruitments were planned over the next year as of spring
2008.
[84] The particular part-time faculty, and their number, vary from semester to semester, with some teaching
only one course per semester or year.
Other aspects of the department are also important to understand in the context of
this study. Prior to the implementation of online education, the department was dealing
with declining enrollment, and was grappling with some organizational problems which
were only alluded to over the course of the research. Manifestations of these problems
were the loss of around eight faculty members in the span of ten years, something very
much out of the norm for the campus as a whole. These problems also led to a transition
of leadership within the department in 2006. The department maintains a central
department space with exhibit facilities, which houses the entire full-time faculty and,
until recently, housed some part-time faculty as well. Most of the part-time faculty
members, however, maintain office space away from the department main office, on
different floors,
and even in different buildings. The department also maintains a presence at the
university’s branch campus with a few part-time faculty members sharing office space
there. Likely due to the large number of part-time faculty, the department has in place a
mentor program, where new faculty are assigned to a more seasoned faculty member in
an effort to facilitate their introduction to the department and the campus.
In terms of students, the department has been and continues to be a service-based
major,[85] with the majority of its enrollment generated through students taking general
education courses. Recent enrollment numbers include over 2,500 students each
semester, while the department maintains fewer than 200 undergraduate majors and
minors, and fewer than 100 graduate students pursuing a master’s degree. Over half of
this enrollment figure represents online enrollment, with fewer than 1,000 students
enrolled in traditional, non-distance courses.

[85] As a service-based major, the department generates the majority of its enrollment from general
education, as opposed to students enrolled as a major or minor.
As discussed, the university has established, in addition to a commitment to
technology in a broad sense, a commitment to advancing the development of online
education. Over the past decade the campus has invested considerably in infrastructure to
support technology. “Services include a standardized computing environment for all
faculty and staff, access to computing resources in numerous areas across campus as well
as access to a high-speed wireless network for all students, faculty, and staff in all areas
of campus, and laptops provided to all faculty” (1363). The campus also offers help desk
services, which are available to faculty, staff, and students who experience technical
problems. These overall efforts have served to support online and distance education
indirectly, and there have also been efforts to support them more directly. Such efforts
were initially led by the instructional resource center (IRC) and more recently have
involved the distance and extended education center (DEEC).[86] The IRC offers support
for technology-based teaching tools, and provides numerous seminars on Blackboard
functionality,[87] one-on-one consultation, a variety of web-based resources and
tutorials, and use of computers with additional software to produce course content. The
DEEC provides more comprehensive support of online education, although its support
for state-funded instruction only became available in 2007. It retained an expert on
online education, who has a strong understanding of best practices within online
education and has sought to help departments introduce increasing levels of
sophistication to their online offerings, such innovations including the use of video to
complement PowerPoint.

[86] Additional support is provided to faculty from the library for the use of copyrighted materials,
particularly video, in online courses.
[87] For more than the past seven years, Blackboard has served as the selected web-based course
management system for the campus, and is being used by this department.
Organization Meets Policy: Taking an Academic Department Online
While policy served to help shape the ultimate development of the online
education implementation case, the initial driver was organizational need, and the effort
was built around a single successful example of technology implemented to meet
enrollment needs.
As the implementation unfolded, four objectives became clear, reflecting the interests
of both the chair and the faculty: increasing overall enrollment, increasing majors and
minors, developing an online degree, and maintaining quality of instruction in an online
environment. The initial prototype provided a well established and universally well
regarded example from which to engage the entire department. As the implementation
unfolded, it did so under strong direction from the department leadership and within the
confines of an evolving and diffuse set of policies. Such a development enabled the
department to take some liberties in how it underwent the initial growth in online
offerings, which arguably sacrificed instructional quality, but ultimately was successful
and in many ways helped engender support among organizational members who at the
start very much resisted the online approach.
The implementation phases modeled in Chapter 4 with SEVIS are modified here
and shown in Table 5-1. Similar to SEVIS, the implementation also began with a “study
phase”, but then moved to a “planning & implementation of prototype phase”,
“planning expanded implementation phase”, “implementing success phase”, and, finally,
a “focus on quality phase”. Elements of each are summarized in the table.
Table 5-1. Online Education Phases of Implementation.

Phase | Dates | Activities
Study | 2000 – Summer 2003 | Integration of technology into course; development of web-based lab component
Planning & Implementation of Prototype | Fall 2003 – Spring 2006 | Approval to develop proposal for online course; research, development, and implementation of first online course in department; continued improvements and refinements
Planning Expanded Implementation | Summer – Fall 2006 | New leadership identifies online instruction as a tool to improve department enrollment; faculty developed and began submitting proposals for online courses; faculty began completing technology training
Implementing “Success” | Spring 2007 – Fall 2007 | Newly approved online courses are taught; part-time faculty primary online instructors; additional resources secured to support online; began development of online degree proposal
Focus on Quality | Spring 2008 – Present | Full-time faculty increase online teaching load; recruitment of new full-time faculty; development of standard practices for online; development of new assessment tools
The Study Phase
The study phase began as early as the 2000-2001 academic year and continued
until summer 2003, as an online component of a course was developed and
implemented.
This implementation immediately preceded the fall of 2003, when a full-time
faculty member in the department began developing an online course proposal for an
introductory class. An early adopter of technology, he provided many technology tools
to his students early on. “From the start all of my classes had web pages. I did a lot of the
web-based things for the students in terms of interaction, providing them with interaction,
providing them with a central resource, and then I got into Blackboard and I realized right
away” the potential offered by the technology (1358). He first developed a successful
online lab component for the course in an effort to improve the quality of instruction,[88]
utilizing web-based tools from other institutions. Having found a structure that worked
well, it became evident that if the lab could be delivered effectively in an online
environment, arguably the most difficult aspect to create online, then the full course
could certainly be delivered as well.
The Planning and Implementation of Prototype Phase
The planning and implementation of prototype phase began in fall 2003 and
continued through spring of 2006, and was marked by the development and offering of
the first online course in the department.
After consulting with the chair and with the other faculty in the department, the
initiator gained approval to move forward and submit an online course proposal. In
developing the course, he approached a faculty member from outside the department
who had already implemented a successful online course.[89] Through those
conversations, he collected materials and advice, which helped him substantially in
developing the course. “I would have done a lot of trial and error to get where he had
gotten because he had been doing it for two or three years” (1359). In terms of course
content, he turned mostly to the internet and similar courses being taught at other
universities, and was guided by his knowledge of the discipline and network of other
faculty at other campuses who were engaged in online teaching.
[88] The in-class labs were too large, and students did not have the hands-on experience that the lab
segment should provide. These web tools enabled him to provide that level of experience.
[89] This outside faculty member was among the very first, if not the first, in the academic school to have
developed an online course.
In fall 2004, he taught two of the eight sections of the course online, and from
the start they were very successful in terms of both student feedback and enrollment.
The course has continued to be offered with strong enrollment, with sections numbering
up to five.[90] The faculty member, in collaboration with other full-time faculty
in this concentration, has worked to continue to refine and improve the course structure
and format. They have worked with a book publisher to develop a textbook specifically
tailored to the online course, and the course serves as a model in the discipline. Even
among faculty in the department who resisted, and continue to resist, technology and
online education more specifically, the course has been observed to be a success. It has been
able to generate this support largely through the organic manner in which the
development took place and the pace at which it was implemented. The underlying focus
was and has continued to be improving the quality of instruction.
The Planning Expanded Implementation Phase
The planning expanded implementation phase began in the summer of 2006 and
continued through the fall semester, and was marked by the commitment to expand
online education within the department.
In this phase, the department managed a transition in leadership with the new
chair committing to the expansion of online education as a means of addressing the
department’s enrollment problems. The infrastructure and resources to support faculty
training were available on campus, but developing the commitment to both create online
course proposals and get faculty to teach them was a greater challenge. The chair
[90] Other full-time and part-time faculty members have participated in teaching the course as well.
introduced incentives to help generate the necessary faculty support, which worked but
raised other concerns, including questions about the quality of instruction, which would
grow as the instruction of the online courses was undertaken.
With the somewhat sudden departure of the department chair, a new chair from
outside the department took over the leadership in the summer of 2006. The
department was facing declining enrollments and some morale issues.[91] Very quickly,
the new chair encouraged the department to consider new modes of delivering courses
and strategies for marketing the major. Observing the success of the one existing online
course, the chair considered online instruction as one of several tools to improve
department enrollment.[92]
The goal was to increase overall enrollment in the department courses, as
well as increase the number of majors. Development of online courses was particularly
appealing, since campus resources were available and training would be available to
faculty at no cost to the department.
The department chair was mindful of many of the policies dealing with online
education[93] and tapped into the interest of both the university and system as a means of
promoting the department’s commitment. The implementation required support from
faculty to both develop course proposals and commit to instructing online courses. Full-
time faculty could not be forced, but were all encouraged to at least participate by
submitting a course proposal. Faculty members were not required to teach the courses
they proposed, and later the chair secured resources to provide release time for the
[91] Among the indicators was the departure of several full-time faculty members over the previous ten
years.
[92] Strategies also included expanding summer course offerings, expansion of courses available at the
university satellite facilities, partnerships with other disciplines, and use of televised instruction.
[93] At this point the idea of an online degree was not considered, so accreditation standards were not
among the policies referenced.
development of successful online course proposals. Part-time faculty had a different
motivation. They received no compensation for the development of course proposals, but
it was in their economic interest to participate since they would have the opportunity to
teach the online courses once approved. These sets of incentives for supporting the
initiatives raised several questions related to fairness, quality of instruction, and instructor
commitment, and contributed to some tensions between full-time and part-time faculty.
These will be addressed more fully as the next phase of implementation is discussed.
And so, with the encouragement of the chair, both full-time and part-time faculty
developed and began submitting online course proposals as early as the summer.[94]
The effort resulted in ten proposals being submitted for approval: four that summer and
an additional six in the fall.[95]
The Implementing “Success” Phase
The implementing “success” phase began in the spring of 2007 and continued
through the following fall, and was marked by the instruction of new online courses. By
spring of 2007, within a year of taking on the initiative to increase online courses and
within a semester of their having been proposed, five newly approved online courses
were taught for the first time.[96] This implementation was quick and raised several
concerns, as alluded to earlier, since in order to accommodate the growth the department
took actions that arguably sacrificed quality of instruction. This created some tension
between part-time and full-time faculty, and some resistance to the whole enterprise
among some full-time faculty. From the start, though, the department undertook efforts
to address and maintain quality in instruction, including monitoring of course websites
and, later, the provision of additional support and resources. The outcomes were
mixed, though largely favorable: enrollment growth was tremendous, but underlying
concerns over quality remained.

[94] All online courses developed in the department were modifications of existing courses and complied
with university standards for online courses.
[95] Seven of those ten were approved by the end of the fall semester (four in September), and four online
courses were slated to be taught in the spring. Five of the initial ten proposals came from three full-time
faculty members. Part-time faculty members were responsible for all four proposals submitted during the
first summer.
[96] Online courses were listed in the class schedule pending approval. The one canceled course was not
approved in time to be offered for the spring.
Throughout this early implementation, the department relied heavily on part-time
faculty to deliver instruction of these new online courses. Indeed for the first two
semesters, spring and summer, the only instructors of these new online courses were
part-time faculty.[97] With the exception of two courses in the summer, all of these had
also been developed by part-time faculty. The burden placed on part-time faculty created
questions of instructional quality because they had less teaching experience and most
did not hold terminal degrees. Further, these faculty members were less
connected to the department and campus, and often had other jobs that competed with
their time and ability to access training on campus. This does not suggest that part-time
faculty were not capable of quality online instruction—in fact most did well and
approached the obligation thoughtfully—only that the typical balance that may be sought
in a department’s management of instructional load was not being maintained in the
online environment.
[97] It was not until fall 2007 that full-time faculty began teaching any online courses outside of the
prototype course. In that semester, there were three, representing six upper-division sections.
The expansion of sections, from five in fall 2006 to 13 in the spring of 2007, was
the result of robust enrollment. Enrollment caps for a section were set at 89. Faculty
members with sections exceeding that cap were offered the opportunity either to split the
section or to count it as two classes. While class enrollment was generally
higher for online sections than for traditional sections of a particular course, most
sections were nowhere near that cap. Enrollments for online sections during these three
semesters averaged between 43 and 53.[98] There was a perception among some part-time
faculty that they maintained a higher enrollment cap than full-time faculty, and a
frustration at maintaining a commitment to focus on student writing and promote online
discussion when their class sizes were so large. Further, among some full-time
faculty there was also the question of quality. As one remarked, “So you get the lowest
paid people with the least investment in the university and give them the biggest
workload. It’s not rocket science, it’s not fair” (417).
As noted, due to high enrollments there were occasions when the decision to split a section arose close to the start of the semester. This left the department in the position of recruiting part-time faculty who were sufficiently experienced in terms of the course content, but not always experienced or trained in online instruction. While support was available, as has been discussed, there was not always sufficient time, particularly for part-time faculty, who spend limited time on campus. As one
instructor noted, “I got called in at the last minute, actually after the first week of the courses had begun, and was asked to take it over without really too much prior experience as far as online courses goes” (944). While this was not the norm, it occurred often enough and contributed at least to the perception among some in the department that the quality of instruction might be suffering, particularly when coupled with the reliance on part-time faculty and the establishment of higher enrollment caps for online sections.

[98] Only three sections exceeded the average in spring and summer, but nine did in the fall. However, in terms of sections with 100 students or more, there was one in the spring and summer, but three in the fall.
These practices illustrate that the motivation for this implementation was enrollment growth, and while controls were put into place to monitor quality, as will be discussed, in many ways enrollment growth trumped concerns over quality. Along
with the differing levels of compensation between full-time and part-time faculty, these
have contributed to frustrations and tensions within the organization. These are
exemplified by the remarks of some of the faculty. A part-time faculty member, asked about his motivation for supporting the effort, shared, “No release time...Am I being compensated? No. Nothing. Zero...What do I have? I have a choice? I’m a part-time guy, I don’t have--I mean, my voice is somewhat limited right?” (824). Another remarked on the resistance among full-time faculty and the availability of training:
There is plenty of training available on this campus. So if anybody says, “Well,
there’s no training,” full-time professors here, who are here on campus all the
time and aren’t working in other places, they just need to step out and take
advantage of those. And they repeat these workshops. (969)
Among some full-time faculty there was also resentment that the faculty had not been fully involved in the decision to grow online offerings.[99] This frustration is fueled by concerns over quality, as one full-time faculty member suggests.

All of a sudden, all these part time people were developing online classes for us and just getting really thrown into it. I saw some of them. They were awful. They were the canned programs, PowerPoint—like a PowerPoint presentation that the publisher gives you. It’s awful. There’s no class. (337)

[99] There is agreement that these discussions did take place in faculty meetings, but faculty were not engaged throughout the process, which is a point of some contention.
The issue of quality in instruction, though, is a complicated one, and the online
enterprise is difficult to assess. As one proponent of online observed, “I think this quality
issue is a very valid one. [But] I wish I had the same concern expressed by those people
[about courses with traditional] delivery as well because I think some of it is hypocrisy”
(1239). This sentiment has merit, since for many these concerns over quality masked
their resistance to technology and reluctance to change.
Several steps were taken to address quality of instruction, both in light of these concerns and in anticipation of the general concerns often held regarding online course quality. First, the chair, and later the vice chair, maintained “enrollment” in each online course and so had access to any course at any time.[100] Second, the chair organized special meetings for both part-time faculty and online instructors to share resources and provide additional support. Although attendance was not required, these sessions were noted by many faculty members as valuable and, as the chair observed, are not a typical practice in the academy.
All the part-time faculty get brought together, three times a semester. The web
people get brought together once a semester. So there is a group that’s brought
together on the basis of delivery. How often does that happen in a traditional
mode? All your lecturers get together? No. So we’re bringing them together on
that basis. So that’s a support system. And they share links, they share course
material and they share articles. (1180)
The third was the development of support for online instructors from the distance and extended education center (DEEC) on campus, a partnership established by the chair.

[100] This was a marked difference from the access maintained for traditional courses, where the chair would make one scheduled class visit to a part-time faculty member, usually in the first semester of teaching.
DEEC has on staff an expert on online education, who came to serve as a consultant to
faculty teaching or planning to teach online. He also came to be enrolled in each online section as a means of providing better assistance to faculty and offering suggestions for improving course sites. Most faculty teaching or preparing to teach online were well aware of him, as were many with no online instruction experience. Faculty overwhelmingly remarked that the support and responsiveness of this consultant service were invaluable. As remarked by one faculty member, the DEEC expert “has been excruciatingly
willing, and I say excruciating because he devotes a tremendous amount of time to
helping faculty make the best use of Blackboard” (490). DEEC also provided additional resources to faculty, including the acquisition and incorporation of copyrighted video into online course sites and the opportunity to complement online PowerPoint with a video narration by the faculty member. This innovation, currently utilized by only a couple of faculty, was very appealing and served as a means both of encouraging faculty to improve their online courses and of attracting other faculty to make the move to online instruction.
This partnership was strengthened as the department worked towards developing an
online degree proposal, a process that is still underway and would be the first online
undergraduate degree at the university.[101]
In terms of course offerings, enrollment growth, and at least the implicit support
of full-time faculty, it is easy to see how the department can claim success. The five
courses offered in the first semester (spring 2007) provided an additional eight sections of
online classes in the department, more than doubling the number of sections that had been offered the previous semester, when only the prototype course was being taught.

[101] One motivation for the support from DEEC was that online courses from an accredited department enabled additional enrollment opportunities for non-university students, and so can contribute to the DEEC enrollment.

Over
the next year, the number of courses would more than double and the number of online
sections would nearly double. Tables 5-2 and 5-3 display the growth of online offerings
by course and section, respectively. By fall 2007, more than 30 course proposals had been submitted, of which 17 had been approved. Student enrollment in online courses
from fall 2005 to the end of this phase in fall 2007 grew more than eightfold from 160
(two sections of the prototype were offered) to 1333 (1364).[102]
And so, when the chair
was re-appointed for a term of three years, an action affirmed by the department faculty,
online education was a firm component of the course offerings in the major. At this point
four full-time faculty members were teaching online courses, and all but one had
submitted an online course proposal. Because some of this was due to economic
incentives, though, it also raises concerns about the nature of commitment among the
faculty to online education. It did, however, draw in at least implicit support, support further cemented since the department clearly benefited from the enrollment growth in other ways as well.[103]
Policy objectives here, though, are broader. For one, the enrollment goal that had not materialized was growth in majors: there was no marked increase in students majoring or minoring in the department’s area of study. The online courses offered were for the most part general education courses, but the underlying theory was that students would take one as a general education course and be sold on the major. One could therefore suggest that the courses may not have been as engaging and did not represent the best of the discipline,[104] and so despite a great increase in exposure of the major, this has not been coupled with a growth in majors.[105] And so returns the nagging issue of quality; as the department leadership began to consider a move to develop an online undergraduate degree, such an issue would certainly need to be addressed further.

[102] Number varies slightly from Table 5-3. Figures here use enrollment at census versus enrollment at end of term.

[103] As it reduced enrollment concerns, it enabled the courses of specialized interest with lower enrollment to be offered and placed the department in a position to grow in resources and enable the recruitment of full-time faculty.
Table 5-2. Expansion of Department Online Offerings by Course. Indicates growth in
offerings from fall 2006 through spring 2008.
Course Fall 2006 Spring 2007 Summer 2007 Fall 2007 Spring 2008
Number Courses Courses Courses Courses Courses
100 0 1 1 1 1
*101 1 1 1 1 1
102 0 Canceled 0 1 1
103 0 1 0 1 1
300 0 0 1 1 1
301 0 0 0 0 1
304 0 0 1 1 1
305 0 1 1 1 1
315 0 0 0 1 1
321 0 1 1 1 1
340 0 0 0 1 0
342 0 0 0 1 1
344 0 0 0 1 1
347 0 0 0 0 1
350 0 1 1 1 1
424 0 0 0 0 1
Total 1 6 7 13 15
* Indicates prototype online course.
[104] This may be due to the quality of the courses and the overrepresentation of part-time faculty among instructors of online courses.

[105] An area of further research: as quality increases, will the department see conversions to the major increase?
Table 5-3. Expansion of Department Online Offerings by Section. Indicates growth in
offerings from fall 2006 through spring 2008.
Course Fall 2006 Spring 2007 Summer 2007 Fall 2007 Spring 2008
Number Sections Sections Sections Sections Sections
100 0 1 1 2 2
*101 5 5 3 4 2
102 0 Canceled 0 1 1
103 0 2 0 1 1
300 0 0 1 4 1
301 0 0 0 0 1
304 0 0 2 2 3
305 0 2 1 2 3
315 0 0 0 2 3
321 0 1 2 1 1
340 0 0 0 1 0
342 0 0 0 1 1
344 0 0 0 2 1
347 0 0 0 0 1
350 0 2 2 1 1
424 0 0 0 0 1
Total 5 13 12 24 23
* Indicates prototype online course.
The Focus on Quality Phase
The focus on quality phase began in spring 2008 and still continues as the
department turns to address quality of instruction in a comprehensive manner. While
enrollment remains important, there was the opportunity and need to direct additional
attention to the area of instructional quality. Quality is addressed in three predominant ways: (1) by implementing standard practices in using the technology; (2) by increasing the reliance on full-time faculty to serve as instructors for online courses; and (3) by developing new assessment tools for analyzing the quality of online instruction.
The development of standard practices in the use of the technology was generated from observations from the monitoring process, input from faculty who had taught online, and recommendations from the DEEC expert. The latter included standardizations to improve
students’ ability to use the sites, such as adding a Blackboard tutorial for students and
developing a common layout for course sites.[106]
This process also seeks to put into more widespread practice new technology tools that can improve both quality of instruction and efficiency.[107]
In terms of full-time faculty participation in instruction, this came through efforts to involve current faculty, but the enrollment growth also enabled the department to begin recruiting new full-time tenure-track faculty. While new hires would not be required to
teach online courses, the department’s emphasis on online instruction may attract faculty
interested in teaching online. Further, the prevailing interest in extending full-time
faculty participation has influenced the recruitment effort. As noted by the chair, “when
we recruit, they’re all going to be asked, ‘Are you willing to teach at [the branch
campus]? Are you willing to go online?’ And I can tell you both of those are going to
help to put people into the final spot” (1229). The department was positioned to nearly
triple its number of full-time faculty in two years.[108]
Finally, the way the issue of quality of instruction came to the forefront that is most closely aligned with policy learning is the development of new assessment tools for analyzing the quality of online instruction. These efforts were headed by the vice chair in the department as part of the annual evaluation process for part-time faculty,
but also as a way of better communicating expectations in the department as they apply to online instruction.

[106] The latter would help to ensure students taking more than one online class in the department would encounter consistency in navigation schemes.

[107] An example of the latter is GradeMark, an electronic grading tool which, among other functions, can help faculty to provide detailed feedback on student writing using a coding system where common remarks are added with a simple keystroke and modified as needed.

[108] It was authorized to hire six new full-time faculty members for fall 2008, and four more in fall 2009 (1362).

This built upon the previous practice of accessing online course
websites and periodically monitoring activity, moving to a comprehensive review of part-time faculty online course websites against a rubric developed by the department vice chair.
The evaluation was initially undertaken in the summer of 2008 and reviewed the online
courses of all part-time faculty members for the 2007-08 academic year. Through such
an effort, the emphasis shifted from monitoring for minimum levels of quality to systematically providing feedback and direction on how to produce high-quality online courses. These materials are being widely distributed in the department. The department
is also extending efforts to facilitate communication among online instructors.[109]
So the implementation story that unfolds is an interesting one with a key source of
underlying tension in relation to organizational goals. The implementation process was
informed from the “bottom up,” as an entrepreneurial faculty member developed an
innovative practice, the intention of which served the pedagogical goal of improving the
quality of instruction. This effort, secondarily, served to provide increased enrollment in
that one course. The process was adopted by the departmental leadership, where the
purpose of the policy became enrollment growth. While the department has witnessed a
return to a focus on quality, the initial emphasis on enrollment has created a host of
problems that diminish the department’s ability to achieve its overall policy objectives. It
is important to note that, while the department may not have been focused on quality, it
did ensure that a minimum expectation of quality was met. But this continues to be a work in progress.

[109] One important way is by increasing the frequency of online faculty meetings to monthly during the semester.

Yet, it is difficult to imagine that the support within the department
would have been generated or the additional resources would have come to be without
the swift implementation led by the chair. Indeed, it may have been necessary to sacrifice quality of instruction in the short term in order to deliver on both enrollment and quality in the long term.
Data Analysis: Applying the Model to Online Education
Several qualities of the policy serve to facilitate the achievement of policy objectives, while others serve as barriers to its implementation. Despite the rapid development of online
courses, this process was not inevitable and encountered several challenges, challenges
that may in part be overcome by the presence of learning systems at the organizational
level. An analysis of the learning model as it related to the implementation of this policy
in this organization is then provided. The model did prove useful in conceptualizing the
learning process, although the conditions of experimentation, networking with other
implementing organizations, and communication with policymakers appeared to have the
least relevance to this case.
There are several attributes of this policy that contribute to successful
implementation. Specifically, these include: provision of resources; existence of a
successful prototype, which preceded the implementation; the opening of a policy
window as the department sought to address declining enrollments; and the presence of
organizational structures to facilitate the support of organizational members. Themes here relate to the value of policy windows as offered by Kingdon (2003) and incrementalism as suggested by Lindblom (1973).
Several obstacles also presented themselves over the course of implementation.
These included: a diffuse set of originating policies; the presence of strong resistance to
the policy among some organizational members, who maintained both a fear of
technology and a concern over quality; the need to overcome concerns related to quality
of instruction and standards of quality for online courses; academic freedom, which limits
the ability of leadership to put in place requirements while also limiting authority over
course content and contributing to a principal-agent problem; and that the discipline was
not in a technical field, so faculty had varying experience with technology.
The question remains whether the underlying learning processes facilitated the achievement of policy objectives and, if so, how the posited model serves as an effective tool for describing the learning that took place. To that end, the seven
propositions of the model will now be discussed in the context of this case. General
findings are summarized in Table 5-4.
Table 5-4. Propositions in Practice for Online Education. Analysis of propositions within online education case and their contributions to learning.

1. Network with other Implementing Organizations: new ideas and tools for use in online courses.
2. Communication with Policymakers: clarification on process, but information flowed through department leadership.
3. Policy and Goal Congruency: incentives established to generate faculty participation; quality standards maintained through monitoring of courses; evaluation efforts articulated higher standards.
4. Culture of Experimentation: largely supported individual learning; characterized by trial-and-error approach.
5. Clarity of Roles and Responsibilities: chair served as driver of policy in department; monitoring and evaluation efforts served as vehicles to articulate faculty responsibilities in the online environment.
6. Internal Communication: formal efforts included notices of resources and regular meetings of online faculty; informal network, though, largely based on “proximity.”
7. Evaluation and Routinization: evaluation initially served individual needs, with department efforts developing later in implementation; routines introduced by DEEC expert.
Proposition 1
Organizations can take advantage of the learning that occurs in other
organizations by establishing networks and exchanging information and resources with
other implementing organizations.
“…we got permission to use a great web-based exercise from Berkeley. We got some
other—well, you need to apply for permission to use university-based web activities—and
all of a sudden, we realized, gosh, this whole lab experience is online.” (464)
The value of networking across implementing organizations is reinforced
throughout the literature and is also one of the two sources of external environmental
information for the organization as posited in this model. Efforts here include such
activities as benchmarking and identifying best practices (Osborne and Gaebler, 1992;
Thompson, 1999), as well as interorganizational exchange as a means of learning from other organizations, both generally and in the context of information and communication technology (ICT) (Robey, Boudreau, and Rose, 2000).
Networking provided information about the utilization of technology and a
resource for new ideas and tools to use in online instruction. Communication with other faculty outside the department and utilization of resources beyond the campus have been limited, occurring with few exceptions only by happenstance. While very useful for some, networking was not an activity that most faculty used to inform the development and improvement of their courses. It is also important to note that while the networking process did generate information to aid in implementation, by and large that information was not fed back to the organization as a whole. Further, as will be discussed later, the DEEC expert on
online education may have served as the surrogate for the networking function by
introducing best practices into the online instruction of the department.
As noted, in developing the initial prototype, the faculty member met several
times with faculty from another department who had a couple of years of experience
teaching online. As this was another discipline, the exchange provided information on technology use and strategies for employing it, rather than on course design in an online environment.[110]
Other faculty also indicated similar efforts to enhance their course offerings, though none in so comprehensive a manner, and these were often content related. “I never taught
this course before, so one of my colleagues at another college that used to teach here and taught this course, she gave me plenty of resources to help with the study material” (701).

[110] Initial formulation of this model sought to include communications with faculty in other departments at the same university under Internal Communication rather than Networking. In analysis of the data, though, it seemed more appropriate to place it here since the consulted faculty member did not have an ongoing relationship at the organization level.
While certainly a common practice in teaching, it served an important purpose in online instruction, as course materials for online courses differed from materials for a traditionally delivered course, in particular learning assessment tools. This only
manifested itself among part-time faculty, whose networking was often based at another
campus at which they taught.
Proposition 2
Organizational learning requires that members are well informed of policy and
policy changes. Management should serve as a conduit of information between
policymakers and organization members.
“I’m not exactly sure what the policies are of the university.” (706)
Two forces drive the importance of this condition in the literature. First is the
need to base implementation success, at least in part, on policymaker intent and the
intentions as defined in the policy. While clearly a perspective maintained by the top-
down approach in policy implementation (Mazmanian and Sabatier, 1983; Bardach,
1977; Pressman and Wildavsky, 1973), it does not need to be at odds with a bottom-up
approach,[111]
except perhaps when taken to the extreme. This is particularly important
when implementing policy driven technology (PDT). For as Fountain suggests, the
implementation of technology can be characterized as “decision making under uncertainty” (p. 89), and thus requires a return to policymakers (or at least the agents of policymakers) to clarify and adjust as bugs are addressed and the technology itself is utilized and better understood.

[111] The bottom-up perspective advocates for more flexibility for frontline implementers, and indeed a role for them to play in shaping policy objectives because of their better understanding of policy at the impact level. Advocating that the frontline have input on objectives is not necessarily at odds with policymakers having input as well.
Implementation seems to have flourished with limited communication from
policymakers at the university and system-wide level as well as from the accreditation
body. Indeed, there has been limited interplay between the department and any of the policymaking bodies. The challenge has been the diffuse and underdeveloped nature of the policy. Communication has been limited, and
policymaker intent has not been explicitly discussed, which has led to some
misunderstandings. The information, when it has been shared, has come predominantly
through the department chair.
There has been a fair amount of misunderstanding of policy and policy intent.
This has largely been fed by the administration’s support of online initiatives, while
lacking a comprehensive policy that articulates intent. The chair has used what is known
to push forward implementation, communicating his understanding of policy. There has
also been little interest on the part of faculty in verifying policy or policy intent, as the focus seems to be on one’s individual course and, for some, the body of online courses offered in the department. The result has been pockets of misinformation. This is reflected in
one part-time faculty member’s observation.
…the administration is trying to get a lot of [courses] turned into a web
class…they’re just dumping a lot of their classes out there, just dumping it out.
They’re still in the formative process of weeding things out--what doesn’t work
and what their role is supposed to be. I’m sure they think that they’ve thought it
through but unless they actually haven’t, you have all these multitude of web
courses out there, you don’t know what you’re dealing with. (782)
In actuality, there is no such comprehensive effort. And there was at least one example where a “policy” articulated by the chair and mentioned by several faculty was simply not true. This is not to suggest that the chair willfully misinformed the department faculty. Indeed, this may have been a point of discussion at the system-wide level, but it was never translated into a policy objective.[112]
Ultimately, while the
emerging nature of online education as a policy has created challenges to implementers, it
has also enabled a greater degree of flexibility throughout the process of implementation.
There have been few major developments of the policy over the course of the
implementation period under study, but some have been significant. One relates to the
policy requirements associated with the development of an online degree, as opposed to
just individual courses. Such a move requires the development of a proposal to the
accreditation body. Efforts to put that document together are underway, but involvement
in the process is limited to the leadership and strong proponents of online education in the
department. For other processes, like course proposal approvals, the chair has worked to
ensure involved parties are clear on the process.
Proposition 3
Organizations can facilitate the learning process and effective implementation by
maintaining congruency between organizational goals and policy objectives.
[112] This was in regard to the system’s interest in having no more than one campus offer an online degree in any particular major. The presence of such a policy would serve as an incentive to push forward efforts at implementing a full online program.
“So I think there was a combination of survival, survival of the department, believing in
what the discipline means, desire to have new colleagues, all the space—what are you
going to do with—I think it was an easy conversation.”(1172)
In order to root policy objectives in the activity of the organization, management
must provide clarity at the organizational level, and create a means to motivate
organizational members towards that activity. Management, therefore, must properly
align incentive structures (Petitt, 1996; Wilson, 1995; Ferris and Tang, 1993; Lundberg,
1983; Perry and Porter, 1982), as well as provide clarity of purpose (Ruiz-Mercader,
Merono-Cerdan, and Sabater-Sanchez, 2006; Davenport, De Long, and Beers, 1998;
Bishop, 2003) and a shared vision (Senge, 1994). Clear congruency is needed between
organizational goals and activities and the policy objectives, or at least how the policy
objectives are understood by the organization and/or organization management.
The department did well to translate the policy objectives of online education into
organizational goals. As discussed, even at the department level, compliance with the
policy was not required, but faculty members were strongly encouraged. Motivation for
part-time faculty was particularly compelling, as stated by the chair, “Do you want to
teach part-time? First, develop that course on ‘X’ online and get it through the Dean’s
Office. That’s a good employment opportunity” (1152). So, while this was not a mandate, support was quickly enlisted. That support, however, did not come with a high
degree of enthusiasm. As one faculty member remarked, “I’ve had more conversations
with the people who don’t like to do it, who aren’t going to do it...And I haven’t had my
other colleagues just sort of trying to convince me, ‘Hey, it really works.’ I haven’t had
those discussions” (560). Even among some supporters of online education there was
skepticism of the whole effort, with questions of whether online was appropriate for upper-division courses and whether this was too much too fast. Still, growth did occur, which
ensured the availability of more resources, drew the interest of extended education,[113]
enabled the hiring of more full-time faculty, and provided the critical mass of faculty
teaching online to enable more selectivity in hiring.
Still, the concern over whether the quality of learning opportunities offered to students
through online education is on par with the traditional delivery of education in a college
setting was alive and well within the department. Regardless of individual faculty
members’ personal stances on online education, some of the methods of implementation
did indeed raise concerns over quality. What also resulted was a tension and some
resentment among part-time faculty.
If you can't do Web, your value to us is limited. So it’s economic. And so if it’s
worth willing to do Web, to be very flexible in terms of the kinds of classes you're
doing such as televised courses, working on weekends, nights, you lose your
value as a candidate for employment…They would like to save money, save
resources on tangible facilities by offering Web courses. If they can accommodate
students without having to build new buildings and classrooms that saves them
money...they can double my load and just pay me one salary for a class. I can
teach up to 89 students in a class and just get paid one salary for that same course.
(776 & 784)
One source of that tension was a sense of powerlessness, but a major concern was the enrollment numbers. Campus trainings offered conflicting messages regarding
student engagement in online courses. As the research suggests, faculty should promote
discussion and written reflection, and those were messages from the IRC and from the
department itself, but doing so was very difficult with such large class sizes.[114]

[113] “[Extended education] saw the numbers and saw what our faculty were doing, saw what [vice chair] was doing, saw what [faculty early adopter] was doing and came to us” (1197).

[114] One part-time faculty member’s recounting of her exchange with the IRC suggests, “‘Oh, when we established these criteria for quality distance learning, we’re thinking of like a class of 15-16.’ I said, ‘Well, if it was a class of 15-16, I could do a lot of discussions’” (781).

Another source of concern was the perception that the rules for enrollment caps in online courses
were different for part-time faculty than for full-time faculty. “[T]he full-timers and their
online courses, they’re locked out at 35; as part-timers, [we] get the double loads” (644).
There is no such policy or guideline in the department, but it can be understood how such
a perception can arise with so few full-time faculty instructing online courses, and the
consequences further emphasize the sense of powerlessness.[115]
Such an environment created obstacles to the learning process, largely because part-
time faculty had less energy to devote to learning and improving their use of the
technology. It also provided fuel for the naysayers in the department. The presence of
this tension, though, coupled with the realization that the department is engaged in a high
profile implementation, has helped push the department to focus on improvement and the
issue of quality of instruction. Indeed, from the start, the department engaged in quality
control efforts.[116]
As indicated by the chair’s remark in reference to campus support of a
specific technology, “WASC wants to know what kind of support systems do you have
24/7 for these students? And so that policy, the WASC policy drives what we’re doing
and we flipped on out one this week” (1248). And it represents a bigger sentiment of
maintaining quality. Further, as noted, the rapid growth has provided the means to
acquire new resources and the department has worked to further align department goals
with policies in relation to quality.[117]
[115] While only a couple of part-time faculty members made specific mention of the enrollment cap difference, others noted a general sentiment that full-time faculty had less of an investment in online instruction.
[116] Department leadership enrolling in each online course and being able to observe it more than any traditional course is one major example.
[117] Such actions include mandatory training prior to teaching online for the first time, a reduction in online enrollment caps, clearer expectations as related to teaching online courses, and more meetings among implementers.

Further, the technology itself has enabled
department leadership to examine the quality of instruction in a more robust and
comprehensive manner than was ever accomplished in the traditional instructional
format, and to meaningfully link it to policy objectives as related to quality.
Proposition 4
The organization possesses and management promotes a culture of
experimentation as it relates to the use and implementation of policy driven technology.
“[I]t’s a venue for creativity if you are technically adept, and in terms of doing
PowerPoints, hyperlinks, filming yourself, lecturing, or demonstrating something.” (771)
Experimentation has been deemed an essential tool in the learning process, and
particularly valuable in a situation where technology is implemented with a high degree
of uncertainty. Characteristics of such a culture include “trial and error” (Etzioni, 1989), kinesthetic
learning (Robey, Boudreau, and Rose, 2000; Hippel and Tyre, 1995; Marshak, 1993;
Senge, 1990), valuing knowledge and innovation over hierarchy (Davenport, De Long,
and Beers, 1998), importance of disclosing mistakes as part of the learning process,
particularly among management (Ruiz-Mercader, Merono-Cerdan, and Sabater-Sanchez,
2006; Krogh 1998; Imai, 1986), and the practice of testing of new techniques before
introducing them for general use (Ruiz-Mercader, Merono-Cerdan, and Sabater-Sanchez,
2006). These activities ought not only to be in place; management ought also to clearly
support experimentation at the level of technology use.
The use of experimentation in individual learning of how to use the technology
was important and widespread, particularly for adopters who had a short amount of time
to prepare and limited training. Faculty typically engaged in trial and error themselves,
or learned from the trial-and-error process of a colleague or from advice at a training
session, which reduced the need to experiment in their own courses. Most users, though,
identified at least one practice that they had developed and refined through trial and error
or had adopted through the experimentation of a colleague in the department, sometimes
as an ongoing means of refinement: “I shared out my template with them, and then
they have been contributing” (275). This process was particularly useful for faculty with
limited technological abilities or interest in technology, or who do not have much time to
invest in learning the technology (i.e., part-time faculty), but was also used by those with
advanced skills. Further, it does not differ much from the process typically used in
developing a new traditional delivery course.
Faculty with strong backgrounds in technology and longer histories with online
education found means of being creative and innovative in teaching with technology.
The department has worked to ensure such practices were shared, and that resources are
provided to faculty to take advantage of these opportunities. Here the support of the
DEEC expert has been critical.
He takes that time. I don’t know how he does it because he must be so busy but he
sat down with me several times and helped design the thing...he sends his film
team over, and we’re going to shoot some [materials for course]. So...we can put
that online. I mean, they have been incredibly helpful. (883)
The best example of this was the streaming video, which complements the
PowerPoint. The department has promoted this technology extensively and generated
both awareness of possibilities and excitement, as observed by one hardened non-user:
And a number of the faculty are pretty enthusiastic about it and some were--if he
talks to them earlier, they would be like me and say; “I’ll never do it.” Then they
try it and say; “Oh that wasn’t so bad...So, I feel more encouraged than I did
before. Once I saw Carl’s link that he sent me showing him in the pictures just
teaching and I found it very engaging. (386)
To the degree that experimentation has been present, it has advanced individual
learning, which has led to sharing and shared learning opportunities. Such efforts have
not been systematic, though. Innovation has certainly been present, but mostly under the
direction of outside expertise and best practices rather than through a culture of
experimentation within the organization.
Proposition 5
The learning process and effective implementation are more easily achieved
when the organization provides clarity of roles and responsibilities through the
implementation process.
“Some of them said, ‘Hell, no, I’m not going to do it.’ And some said, ‘I can’t
have traditional quality.’ Some said, ‘I’ll try it.’ Some said, ‘Does it mean I
never have to come to campus?’ Some said, ‘What about my office hours?’ All
the motives were not always honorable.” (1143-7)
A clear understanding of membership roles and responsibilities benefits
individuals both in their own performance and in identifying where to access other
human resources in the organization. This becomes particularly important
under conditions of organizational change (Damanpour, 1991; Porras and Robertson,
1987; Van Maanen and Schein, 1979). Because technology evolves rapidly,
organizations need to anticipate change, which in turn requires flexibility and reflection.
As the driver of the policy, the chair made sure organizational members were
aware of his commitment to online education, as well as of his expectations of them in
this new role. While sometimes conflicting messages arose between growing online
education and quality of instruction, the chair attempted to ensure that the department
was committed to both, as observed by one part-time faculty member.
[The chair] has tried to keep a high caliber in the online classes. His little notes
of—they may be frustrating as hell but I understand his point. He doesn’t want to
lose the quality of education that we would have in the classroom, the importance
of students being able to interact with one another...So, he’s been very good at his
reminders, information that’s available to us, how things are changing, how we
can have our little talking head and implementing new technology into these
classes....I think he's done a wonderful job in informing the instructors, keeping us
updated and having the information available to us as it’s constantly changing.
(647)
Monitoring of online courses was the initial means of reinforcing faculty
responsibilities in the online environment. Faculty members, particularly part-time
faculty, were very much aware of his presence and did not express too much concern
about it. Largely, they welcomed it as a means of helping to improve their courses. This
is likely due to its presence from the outset, and its linkage early on to additional support,
specifically by the addition of the DEEC expert to courses.[118]
As the monitoring role was
taken on by the vice chair, she has used it as the basis to develop a comprehensive
evaluation system. Such efforts have advanced the understanding of faculty
responsibilities and the standards by which they are measured.
[118] Through this access, he was able to provide faculty with quick fixes. As one full-time faculty member remarked, “Don’t tell me every little possible thing I can do, just tell me what is the best way...And all the stuff he said has really been clear” (534).
Proposition 6
Organizations must support a well-developed communication system within the
organization and among organization members to facilitate the learning process and
effective implementation.
“I’m far less afraid of technology because of this unit, division that’s been set up with
[distance education center]. Otherwise, unless you’re very comfortable with technology
and computers, you’d feel very lost and unsupported.” (815)
Internal communication as a means of facilitating organizational learning
(Nonaka, 1994; Damanpour, 1991; Levy, 1986; Porras and Robertson, 1987; Elmore,
1982) is critical to distributing information across the organization and transforming
information into knowledge. This involves interpreting and integrating (Crossan, Lane,
and White, 1999) and a constant exchange between tacit and explicit knowledge
(Nonaka, 1994). This is also the mechanism by which individual learning is
communicated throughout the organization, and it does so through a variety of
communication channels, not just along hierarchical structures.
Communication within the department demonstrated several important elements.
Formal structures have included department meetings, departmental notices, and
meetings of online instructors. Faculty meetings have provided the opportunity for
discussing organizational objectives in terms of the policy, although there was not
agreement on the extent to which such discussions took place. One full-time faculty
member who is a non-user remarked, “This is all just happening, we didn’t even talk about it” (374).
All faculty, users and non-users alike, observed receiving regular email communications
from the chair regarding developments in online education including opportunities for
training in the IRC, and most were aware of the support of DEEC. The chair was
especially pleased with meetings held once per semester for online instructors in the
department,[119]
and saw it as unique and very beneficial. Faculty response to those
meetings, however, varied. Several enjoyed them and recognized them as opportunities
to network with each other and learn more about online instruction, and even desired to
meet more often. Others found these opportunities less beneficial, as noted here.
I know our department chair, he has had a couple of meetings as well regarding--
he calls us the “techies”—regarding the online classes, and so he’s been
supportive in that regard as well...But I haven’t really learned anything from the
other instructors. Maybe a little bit here, a little bit there, but nothing enough to--
if I didn’t have the [IRC], I wouldn’t be where I’m at. (841)
The important message to note in this comment, which seemed to ring true among
most if not all of the users, was not the value of all these support systems, but the need
for at least one. These formal means of communicating served the essential function of
delivering information to organizational members regarding the available support. So
long as faculty members were connected to one of these systems, they seemed to have
more confidence in managing implementation.
One important support system was an informal network among faculty, which
emerged out of formal arrangements, such as side conversations at meetings, as well as
through proximity. These networks provide guidance and opportunities to share resources. “I've got
a lot of good advice from colleagues…who have taught it. I learned a lot from their
mistakes” (778). Proximity is partly a matter of disciplinary focus. One
concentration in particular, from which the prototype came, had, at least for some of the
online courses, a sense of shared ownership and an openness to refinement and
improvement of the course.

[119] They later changed to once per month during the academic year.

Even in the age of technology, physical proximity surfaced
as very relevant. Some only conferred with others in their general office area. This was
nowhere more evident than with a faculty member new to online education that was
based off the main campus. She was the most secluded and demonstrated the greatest
challenges in utilizing the technology, and was the only one to observe flatly, “There is
no support from the department, none” (440).
The sharing of access to sites is also a common informal process, serving as a
means of showing new online faculty a model of how it is done. There are also
reciprocal effects, where new online faculty, particularly those with strong technology
backgrounds, learned to better use technology and feed that knowledge back. One
challenge related to these informal arrangements as a whole, though, is the selective
nature of their sharing. As noted by one faculty member who had been approached
for access by two colleagues, “so I logged the instructor in and the other one I didn’t do it
‘cause he was too critical...So I try to be as supportive as I could within our own faculty.
Given the right people I’ll be supportive of it” (842).
Proposition 7
Processes of evaluation and routinization of practices are essential for
documenting and directing the learning process and enabling effective implementation.
“I’m hyper vigilant about this on-line delivery because I know some people are even
more vigilant than I am with…bad motive.” (1253)
This proposition advances the importance of the process of documenting
practices and learning efforts within the organization. Organizations need to engage in
evaluation with a willingness to change (Crossan, Lane, and White, 1999; Senge, 1994;
Cole, 1991), and that evaluation should be purposeful with a value placed on
improvement (Hackman and Wageman, 1995; Tyre and Orlikowski, 1993). Collecting
information and evaluating and adjusting implementation and the use of technology as
needed is critical for effective use of policy driven technology in the long run.
Active evaluation and routinization activities have been in place and continue to
be developed to facilitate and improve the implementation of online education in the
department. At the individual level, evaluation is the same process that all faculty
members are engaged in as they refine their courses over time and respond to feedback
from students and to new resources and materials as they become available.[120]
An interesting
development has been that, as faculty members refine their approach to online
instruction, they develop tools that can improve their teaching in a traditional classroom
format. This has been observed among several faculty members, both full-time and part-
time, but typically those who have been engaged in online teaching for multiple
semesters. This has been true of managing discussions, as noted here.
I find far more expression, discussion, and exchange of ideas through discussion
board in the Blackboard system...Sometimes, you’re just squeezing rocks to get
water in class. They just--they can’t--this way, it’s easier also to grade them for
their discussion. In class, it’s hard to keep track of everyone. (770)
It has also been true in dealing with student assessment, as noted by a faculty member in
discussing his strategy of weekly assignments building toward a major course project.
[120] Examining new strategies, both for improving the amount of discussion online and for approaches to testing and student assessment, has been particularly important.
I think that [with] the sort of…two midterms, final term paper strategy… all
[students] do is cram and the term papers are usually not of high quality. And this
way they get feedback on a much more consistent basis. I’m able to early on say,
“Hey look, really you’ll need to work on this.” And so now I’m converting all
[my courses] to online [format]… and it’s really interesting…how [in using] the
online pedagogy or pedagogy that works for online, I really came to see it as
being superior to the way that I’ve been doing it for all of these years. (292)
These are the types of strategies, learned and refined at the individual level, that
have been circulated around the department via both formal and informal settings. And
while they are not strategies that all faculty members embrace universally, they were
noted by several other faculty members, suggesting a general interest in refining
teaching practices among technology users.
The department evaluation of online courses has largely related to the monitoring
conducted by the chair, but as discussed, current efforts are underway by management to
develop a more systematic approach. This access has enabled the chair or his surrogates
to be able to quickly address any concerns regarding the quality of online instruction and
occasions have arisen where that needed to happen. “There are some things going on in a
couple of my sites. When I saw them—when [the vice chair] brought them to my sight—I
was horrified. And we moved on it, essentially said, ‘You stop this or else you’re not
going to do this again’” (1237). While these aspects of the department’s efforts have been
helpful to the implementation process, they have largely been punitive, ensuring that
minimum standards are kept, and not so supportive of the learning process. The
evaluation efforts that the department is now engaged in more directly support the
learning process.
Using current online courses offered in the department and best practices related
to online instruction, the vice chair has been engaged in a full evaluation of each part-time
faculty member’s online course. This has included the development of a rubric on
which websites are evaluated, providing faculty with a clearer roadmap of the
expectations for online courses. The rubric has been made available to all faculty
members. This process is clearly outlining expectations for online classes and providing
direction on how to advance the quality of instruction.
In terms of efforts to routinize, the most significant are linked to the department’s
pursuit to standardize all online course websites, resulting from the DEEC
recommendations.[121]
These efforts are ongoing and serve to reinforce emerging
standards of quality in online course delivery. Such standards are likely not only to help
students in navigating department course websites, but also to make it easier for faculty
to learn to develop their course and share resources across courses.
Organizational Learning
Evidence of learning in response to the policy driven technology (PDT) has
been observed in two broad areas in this case, as it was with SEVIS. These include
substantial changes in the organizational culture and membership and the development of
new operational systems in response to the technology. The department has undergone a
significant transformation in organizational culture and membership that is related to its
acceptance and use of technology. This has evolved over time as increasing numbers of
faculty members have participated in teaching online and as more resources have come to
the department due to enrollment growth resulting from online courses.

[121] The department has also worked to build routines in various administrative functions related to the policy, streamlining such processes as online course proposal reviews.

The latter has
contributed to a general acceptance, if not embrace, of online education among full-time
faculty and has also enabled the department to recruit new full-time faculty, altering the
general disposition among the faculty ranks toward technology and online education
more specifically. As for operational practices, the department has developed new
evaluation tools and processes, which seek to advance the quality of instruction in the
online environment. Currently underway, this process is being informed by individual
learning and fed by external best practices; while present in the earlier phases of
implementation, it has gained prominence in the later phase. A discussion of these
learning processes follows.
Changes in Organizational Culture and Membership
Through the implementation of online education, the organization has undergone
changes in both its culture and membership. The research indicates that these changes
are a product, in part, of organizational learning. The cultural shift reflects elements of
an organization that has grown more accepting of technology and online education, for
some at least implicitly, as well as increased sophistication in the use of technology.
Evidence that these changes have taken place includes individuals’ increased confidence
in their own use of online education technology, the development of new networks of
technology users within the organization, and the impact that technology use has had on
teaching pedagogy for these faculty within traditional classroom settings.
This cultural change is also being facilitated by changes in the membership of the
organization, which include the reassessment of the role of technology in the lives of the
faculty in the department and an influx of new full-time faculty among whom technology
and online education are much more embraced.
Confidence in use of technology and delivery of online courses has come over
time for individual faculty members as they refine their online classes. It also tends to
generate an interest in increasing sophistication in the use of technology over time. That
confidence has come through individual learning, built on a process of trial and error
and experience in using the technology. Lessons were also learned through the
experience and trial and error process of a colleague. The process reflected both the use
of technology and the structuring of online courses. Much of this advanced individual
learning occurred over time, from semester to semester or even within the semester. “I
am learning myself how to use it, you know, kind of trial and error, what works and what
doesn’t work” (710). Indeed, most users identified at least one practice that they had
developed and refined through trial and error, or they had adopted through the
experimentation of a colleague in the department.[122]
A key additional resource in
advancing user confidence was the support of the distance and extended education center
(DEEC) and the online expert in that area who served as a consultant to the department.
“I’m far less afraid of technology because of this unit/division that’s been set up with
[DEEC]. Otherwise, unless you’re very comfortable with technology and computers,
you’d feel very lost and unsupported” (815). Additionally, the development and sharing
of these innovative practices generated excitement to use technology and support of the
medium among naysayers.
[122] Examples cited include student testing, structure of assigned papers, use of discussion board, and general communication with students.
Another factor has been the development of new organizational networks among
technology users, which have emerged in both formal and informal settings. These
organizational networks transform individual learning into organizational learning. “I've
got a lot of good advice from colleagues…who have taught it. I learned a lot from their
mistakes” (778). For many, this source of support was critical.
[S]o it’s not something you can learn from a book. I find that I’m relying more on
other people’s experience which is more actually like a traditional mode
of…information transmission. I can learn the basics, the technique of how to do
things [from a book]. But the actual implementation, again, is still on learning
from other people. (800)
Much of this networking occurred informally, and included such strategies as the
sharing of access to sites, which served as a means of showing new online faculty a
model of how it is done. There are also reciprocal effects, where new online faculty,
particularly those with strong technology backgrounds, learned how to use technology
effectively in the classroom, and then fed that knowledge back. These efforts toward
collaboration and exchange emerged among faculty who shared office space and even
among faculty in a shared concentration. For many, the support received from fellow
faculty was the most crucial. As noted by one part-time faculty member,
[Name], who teaches here full-time has been an incredible help to me, and she has
done a lot of the searches…we usually talk at least once a week and we share
information, and that’s been great. So the whole collaborative thing that can
happen out of this, I think, is quite wonderful. And I’d like to see more of that.
(889-890)
Learning has also been advanced as the department has made efforts to coordinate
this sharing by creating formal opportunities for faculty engaged in online instruction to
interact and discuss teaching online. Meetings among online instructors to discuss issues
and present additional training or showcasing of new technology have been held each
semester to which all department faculty members are invited.[123]
The chair was
especially pleased with this community built around online instructors in the department,
and saw it as unique and very beneficial, as he notes, “The web people get brought
together once a semester...How often does that happen in a traditional mode? All your
lecturers get together? No. So we’re bringing them together on that basis. So that’s a
support system. And they share links, they share course material and they share articles”
(1184). Faculty response to those meetings, however, varied. Many enjoyed them and
recognized them as opportunities to network with and learn from each other. As one
part-time faculty member notes, “I would like to see them maybe once a month…It helps to
unload what’s happening with the courses, what’s working, what’s not working” (891).
Although this sentiment may not have been universal,[124]
a community among technology
users has clearly emerged.
Further evidence of the change in organizational culture was the mark that
teaching online left on teaching pedagogy as individual experience with online courses
deepened. Teaching online required using different tools and strategies, particularly in
how faculty interact with students and assess their learning. Among faculty with more
than two semesters of experience teaching online, though, there has been a realization
that some of these strategies used online can indeed serve to enhance student learning in
traditional courses. One approach to student assessment is the development of
strategies that move away from a more traditional mechanism of exams in combination
with a single major paper, where no drafts are required.

[123] More recently these meetings are being held twice per month during the academic year.
[124] “I know our department chair, he has had a couple of meetings as well regarding—he calls us the ‘techies’—regarding the online classes, and so he’s been supportive in that regard as well...But I haven’t really learned anything from the other instructors.” (841)

As one faculty member notes in
comparing the traditional model with his online assessment structure, which involves a
major writing assignment, where components of the assignment are required each week
and then tied together at the end of the semester,
I think that [with] the sort of…two midterms, final term paper strategy… all
[students] do is cram and the term papers are usually not of high quality. And this
way they get feedback on a much more consistent basis. I’m able to early on say,
“Hey look, really you’ll need to work on this.” And so now I’m converting all
[my courses] to online [format]… and it’s really interesting…how [in using] the
online pedagogy or pedagogy that works for online, I really came to see it as
being superior to the way that I’ve been doing it for all of these years. (292)
And indeed this sentiment arose on several occasions. The model of keeping
students engaged on a week-to-week basis was important in a well-structured online
course, and as faculty had an opportunity to refine those assignments, they adopted them
in their traditional courses as well. This has also been true of managing discussions,
which some faculty have found more effective in a Blackboard environment than a
traditional classroom setting.
More coordinated efforts are underway from the department, as meetings among
online instructors are called twice per month during the semester and enhanced by a
Blackboard community site open to all faculty in the department that addresses online
issues and provides an additional source of support. More importantly, it provides a
new mechanism for the department to communicate the message of quality and standards.
Finally, there has been a clear change in the membership of the organization,
which is a direct result of this policy implementation and evidence of learning. This has
taken place in two ways. First, there has been an influx of new full-time faculty among
whom technology and online education are much more embraced. Second, the
implementation has yielded an institutionalization of online education within the
department, clearly impacting the nature of its membership. In terms of the new faculty,
the ability of the department to recruit new full-time faculty is a direct result of the increase in
enrollment which online courses brought during the mass development phase of
implementation. While new hires would not be required to teach online courses, the
prevailing interest in extending full-time faculty participation has influenced the
recruitment effort. The department was authorized to hire six new full-time faculty
members as of fall of 2008, and an additional four are slated for fall 2009 (1362), which
would nearly triple its current size. Such a dramatic change in membership is certainly
changing the culture of the organization in many ways, including its embrace of
technology. As noted by the chair, “the lion’s share of this was either developed and/or
delivered by part-timers. But the faculty we are hiring are much younger and in tune with
it as a delivery mode than some of the people whom they will be joining as colleagues”
(1232).
In terms of the institutionalization of online education within the department, this
has resulted in a clear acceptance that online education is a necessary component of how
the department delivers instruction. First, arguably as a result of the incentives set before
full-time faculty, most of them have participated in at least the development of an online
course proposal and all are growing accustomed to the benefits associated with the
enrollment growth it has generated.
[Footnote 125: Beyond the opportunity to grow the number of full-time faculty, enrollment growth has also enabled full-time faculty to teach smaller sections and niche subject areas with little concern over course enrollment size. They also benefit from more resources for such areas as research and travel.] Second, the recruitment of new full-time faculty
helps institutionalize online education as the department becomes reliant on online
education’s enrollment generation. The department must now commit to meeting the
aggressive enrollment targets in order to support its growth in full-time faculty. As these
full-time faculty positions are filled, it will become increasingly difficult to reduce its
online offerings without the presence of an alternative method for generating enrollment.
Development of New Operational Systems
Within the online education case, the development of new operational practices
has taken shape around the establishment of standards for online courses, which are
aimed at improving the quality of online instruction in the department. These have
emerged as a product of the underlying organizational learning in the department.
Represented by efforts toward standardization, monitoring of quality, and, ultimately, a
comprehensive evaluation program, these systems are given meaning by the organization
as they are communicated and articulated, and are reinforced by the changes taking place
with respect to organizational membership and culture, which, as discussed, has grown
and continues to grow more receptive to technology.
Elements of standardization relate to aspects of the online course design and
training requirements for new faculty. Standard elements of online courses in the
department have included online course format and efforts to improve student navigation
of course websites. Conceived largely through the efforts of the DEEC expert and his
knowledge of external best practices, they have also been developed in concert with
faculty experience and faculty members’ understanding of the discipline and their students. In terms of
the navigation elements, currently 60% of the faculty have adopted them for their
courses, and as the evaluation program is put into place the leadership hopes to generate
increasing support.
The issue of training is more complex as it has evolved over the course of
implementation. While the support has been in place throughout the implementation, and the
chair was, by several accounts, relentless in ensuring that faculty were aware of the
training opportunities available, nothing was required initially and the necessity of
meeting high student interest in online sections often resulted in the placement of faculty
with little or no preparation or training in online education or Blackboard. More recently
requirements are being put into place, and because so many faculty now have had the
opportunity to teach online, experience has become a condition to teach. As articulated
by the chair, “I am not interested in training. I am not interested in pioneering some of
the subjects anymore...They have to be familiar with the subject, first of all. That’s a
given. But if they’ve never taught online, I don’t want them teaching online. They don’t
have to teach that specific course online,” (1249). While partially a function of now
having a pool of online talent in the department, it is also the result of understanding the
challenges encountered by faculty in those positions and the quality concerns which have
emerged as the leadership engages in monitoring practices. Such standards will not only
help advance the quality of instruction, but also make it easier for faculty to learn to develop
their courses and share resources across courses.
The initial monitoring of course websites helped to maintain minimum standards
of quality, and it has also played a critical role in the learning process and in the
development of the current evaluation process underway. First, that access has generated
information on department practices related to online education, which has provided an
important source of information to develop standard practices for the department.
Second, it has contributed to the development of a culture that is receptive to evaluation
and assessment, in a very different way than is present in a traditional course. The vice
chair maintains access to every site and “so she can evaluate the contents of the site. So,
the analog is that we do a one-class visit a semester for people who have future personnel
action. We do one class a semester for part-time people. This means we have better
access to online classes everyday than we have to the traditional delivery” (1122).
This access has enabled the chair or his surrogates to address concerns of online
instruction quality, and, as discussed, opportunities have arisen to do so. [Footnote 126: One incident involved shared “lecture” materials among faculty teaching different courses, with the result being that a single student taking different courses would see nearly the same material.] And that has
largely been the threat: a loss of the opportunity to teach online, which to some means
a loss of the opportunity to teach at all in the department. While these aspects of the department’s efforts have
been largely punitive, ensuring that minimum standards are kept, they have provided
quick feedback to instructors and helped to shape expectations for online courses. This
has served as a critical context from which the department can advance quality, which,
complemented by more sophisticated assessment tools, is now possible.

While led by the vice chair, this effort of developing an evaluation system has
been a learning process. It has been informed by external resources, including a structure
used at another institution and guidance from the DEEC expert, but has also been adapted to the
department and the discipline. The system was created by the vice chair following a
thorough review of all current online courses offered in the department. This evaluation
process is based on six areas of online instruction. Within each area, specific details are
provided on actions that support that area, along with strategies for assessing whether each area is
delivered in a basic, effective, or exemplary manner. [Footnote 127: An example of one of these areas is instructional design and delivery.] Fitting the process to the
department, however, was essential, and was possible because of the access to online
courses which management maintained.

Ultimately, meaning is made out of this process in three important ways. First, it
is the basis of a formal evaluation process for all part-time faculty teaching in the
department, which includes both a self-assessment component as well as an individual
meeting with the chair to review the evaluation. This seeks to improve the teaching of
online courses by each part-time faculty member. Second, it is being provided to all
faculty in the department to set clear expectations for the advancement of quality in
online education. The development of such a system represents not only an interest in
addressing quality, but also reflects a process aimed at evaluating the implementation of
the policy as it has been managed by the department. Finally, this system serves as the
basis for the department to structure training and support for faculty in formal settings.
The presence of an evaluation process that feeds back to organizational members and the
implementation process is particularly important to learning and clearly reflects the basic
tenet of policy implementation learning as advanced by Wildavsky (Pressman and
Wildavsky, 1984).

Achieving Implementation Objectives

Implementation objectives are four-fold: an increase in overall enrollment, an
increase in majors and minors, development of an online degree, and maintaining quality of
instruction in an online environment. Learning has not been as vital to increasing
enrollment, as management has relied heavily on incentive structures and existing basic
technology training support for faculty. In later efforts, learning plays a pivotal role in
addressing issues of instructional quality, and it would likely serve efforts to increase
majors as well. Advancing quality requires not only providing additional support to
faculty in the development and execution of their courses, but also articulating standards
so that users can understand what is important and how to achieve it. This goes beyond
the initial monitoring activity, which provided feedback as to whether standards were
being met. The initial focus on enrollment enabled an implementation that relied less
on learning; more recent efforts related to quality are much more reliant on
organizational learning.
In the case of online learning, implementation followed neither a purely
bottom-up nor top-down pattern, but was a hybrid. An initiative by a faculty innovator
was adopted by the “top” of the organization, a new chair who saw online education as
one strategy for addressing the enrollment challenges for the department. He created a
combination of incentives to generate support in the development of online course
proposals (release time for full-time faculty with no requirement to teach them; implicit
commitment to teach the course for part-time faculty). So there is interaction between
top-down and bottom-up. Despite suspicion and some goal conflict, substantial growth in
online course offerings and enrollment took place over a short period of time. This
yielded a narrow measure of implementation success initially. Organizational learning,
while present in that early process, was limited. Certainly changes to organization
membership and culture began as individual faculty members engaged the technology,
and as efforts included some coordination among users. Learning, however, advanced
more slowly than the overall implementation process. Organizational learning has been an
integral part of the later phase of implementation as evident in changes in organizational
membership and culture, as discussed, as well as in the efforts to develop a sophisticated
evaluation process to increase standards of instructional quality for online courses.
The role that organizational learning has played in the implementation process
and in the achievement of implementation objectives has changed over the course of
implementation. The initial efforts, while yielding tremendous enrollment growth and
involving some coordination among faculty users of the technology, relied less on
organizational learning to account for that growth. Operational systems for the
technology were underdeveloped and focused on minimum expectations of quality (i.e.,
monitoring practice). Organizational culture, which slowly changed, did not engender
much support for the policy at the onset. Faculty in the department had little or no
experience teaching online, and in some cases no training. And in the cases of no
training, decisions were made to place those instructors to meet student demand and to
enable enrollment growth. [Footnote 128: Conversely, late decisions to split courses arguably also served to improve instructional quality by maintaining a lower faculty to student ratio.] Implementation for enrollment growth could have occurred
with little organizational learning, so long as the incentives to engender basic faculty
support [Footnote 129: Basic faculty support includes the development of online course proposals and willingness to teach online courses.] remained in place, training was made available, and efforts were in place to
monitor quality. While not the focus of these efforts, they have contributed to the
evolution of learning, generating changes in organizational culture as more faculty
members were exposed to and became familiar with the technology. The importance of
organizational learning has only grown as online education has become a critical aspect
of the department, and attention has been drawn to advancing quality. Key to this has
been evaluation efforts, which articulate management expectations of instructional quality,
coupled with processes of dispersing information, knowledge, and resources to enable
improvements in support of those standards. The support received from the subject
matter expert in DEEC, who helps ensure that training resources are available and also
works to provide benchmarking information on departmental efforts toward online
education, has also been crucial. Also critical have been the additional resources
generated by the increase in enrollment growth, which have enabled the department to
recruit new full-time faculty with a greater comfort with technology; this has further
contributed to learning and can contribute to the advancement of instructional quality.
[Footnote 130: These new full-time faculty members are more likely to teach online courses, thereby helping to balance the distribution of part-time and full-time faculty teaching online.]

It seems that while a focus on maintaining minimum standards of quality was
less reliant on organizational learning, learning has been very impactful in efforts to
advance that quality and establish higher standards. The individualized nature of the
technology likely contributed to the delay in organizational learning affecting
implementation. Faculty with sufficient training or previous experience could pursue
the development of their courses quite independently, with some, but limited, direction or
oversight. While given the opportunity to interact and share, there was no requirement to
do so. So, while the groundwork for building the capacity for these changes in organizational
membership and culture was underway, individual users were less reliant on it. Further,
while the policy objective was clear, there was a lack of policy clarity in terms of the use
of the technology (i.e., standards of use). With the development of this clarity,
organizational learning has grown to be a more critical aspect of the implementation
process. Indeed, it is difficult to propose such improvements in the absence of
organizational learning.
Conclusion
As was true with the SEVIS case, the model proved useful in conceptualizing and
analyzing evidence of learning processes in the online education case as well. That
learning has supported, and continues to support, the advancement of implementation objectives. As
to the model, policy and goal congruency and efforts towards evaluation and
routinization were particularly impactful on the learning process. Such conditions as
networking with other implementing organizations and communicating with policymakers, while still
present and relevant, manifested themselves in strikingly different ways than was the case
for SEVIS. The organization was engaged in collecting information from the external
environment, and also had strong internal communication systems, but much of this
external information was not shared with the organizational members—mostly issues
directly related to policies. Although such a system was not ideal from the perspective of
the research, it does not appear to have inhibited the implementation process or
organizational learning.
While much of the initial effort toward expanding online education may have been facilitated
through inducements to generate faculty support, with the burden placed heavily on
part-time faculty, it seems to have worked. It generated sufficient momentum and the
opportunity to direct meaningful change in the culture and membership of the
organization. Coupled with efforts to more systematically evaluate and advance the
quality of online instruction in the later phase of implementation, the organization has
been able to be more thoughtful in advancing organizational learning. So, despite the
heavy emphasis on enrollment growth at the onset, this more recent emphasis on quality
of education will likely help meet the other implementation objectives, namely increasing
majors.
Chapter 6: Cross Case Analysis
Introduction
The study seeks to explore a learning model of policy implementation, examining
policy implementation at the organizational level and suggesting that, in attending to
conditions that support organizational learning, the achievement of implementation
objectives can be advanced. Because this is a theory-based, exploratory model, it is important to
discuss the underlying logic model before proceeding with the cross case analysis. In
doing so it is important to consider first whether there was evidence of organizational
learning in these cases and whether the propositions and their underlying organizational
conditions indeed advanced that learning. What must then be considered is what
effect that learning, and by implication these conditions, had on the achievement of
implementation objectives.
Through this process of examining these two cases, implications for the study of
policy implementation more broadly can be advanced. In the selection of the two cases,
the study began with policies having been implemented with some measure of success.
[Footnote 131: For SEVIS, this meant they had met the federal reporting requirements. For online education, they had substantially grown their offerings of online courses. The objectives in the analysis, as has been discussed, are more complex.]
It is indeed plausible that while objectives had been achieved, no organizational learning
had taken place. Conversely, it is also possible that, while learning had taken place, the
learning had not advanced the achievement of implementation objectives, or had not had
a substantive impact on them. And, certainly, learning may have occurred, which
impacted achieving objectives, but the propositions used to focus the study of learning
may not have sufficiently explained the learning process.
As suggested by Figure 6-1, organizations undertake implementation within a
policy context. The implementation process directs that organization and policy towards
implementation objectives, as well as other outcomes (intended and unintended). This
study suggests that the implementation process is advanced by organizational learning.
The conditions, as presented, support both organizational learning and the
implementation process in general. Information from the organization, the policy and
policymakers, and the outcomes, are fed back through these conditions to further
facilitate the implementation process and organizational learning. Necessary adjustments
and adaptations are made to better achieve implementation objectives.
Figure 6-1. Policy Implementation Process. The implementation process is advanced through
underlying organizational learning. Both are supported through organizational conditions, which
also collect feedback to further inform progress towards implementation objectives. [Figure
elements: Policy; Organization; Implementation Process; Organizational Learning;
Organizational Conditions; Implementation Objectives; Other Outcomes.]

The desire in developing and exploring this model is to understand whether
organizational learning supports the implementation process in such a way as to further
promote the achievement of implementation objectives. Further, it seeks to identify those
organizational conditions most important for sustaining learning. Such a perspective
would provide a predictive tool to policymakers, who could better anticipate how
successfully organizations would be able to implement policy based on the presence or absence of
these conditions. Conversely, it could also offer managers a set of tools to prepare and
direct organizational changes necessary in responding to policy directives. Ultimately,
the findings suggest limitations of the model’s predictive power, largely because much of
the learning that takes place occurs in response to policy pressures placed on the organization,
and many of these conditions develop in response to the policy and so are not present
before implementation. In terms of providing strategies and tools for managers and
policymakers, the findings do suggest the importance of learning in advancing
implementation objectives, although to different degrees within each case.
The focus of this chapter is to delineate the findings of this research, which will
be discussed in the context of the research questions as they relate to the two cases. The
implications of this research for policy implementation theory are then presented, which
suggest strategies for policymakers and implementing organizations for better serving
implementation objectives. For policymakers, it means approaching the development of
policy in consideration of the organization, maintaining lines of communication with
implementing organizations, and providing organizations with tools to enhance learning
through the implementation process. For managers, the study suggests a variety of
strategies for enhancing the learning process through implementation, related in large part
to the conditions of the model. An ideal case of implementation, where the learning
model is considered from the onset, is then presented as a means of illustrating the value
of the learning model of policy implementation. Finally, concluding remarks provide
some general observations, limitations of the study, and implications for future research.
Research Questions Revisited
While relevant findings have been discussed within each of the case study
chapters, the discussion here will focus on the most salient points and also seek to more
clearly connect the earlier discussions with the research questions for the study. The research
questions are:
How did learning take place?
What features support organizational learning and how do managers
support organizational learning?
What aspects of organizations and the learning process support the
achievement of implementation objectives?
These will be addressed in turn. As the discussion suggests, the learning model as
conceptualized through the organizational conditions enhanced the understanding of how
implementation unfolded and provided insights into how to advance the achievement of
implementation objectives.
How Did Learning Take Place?
As discussed, organizational learning has been observed in two broad areas within
each of the cases. These include changes in the organizational culture and membership
and the development of new operational systems in response to the policy driven
technology. Table 6-1 provides a summary of the evidence of organizational learning in
each case for the discussed forms of learning that were observed. It also provides key
differences between the two cases that relate to organizational learning. The discussion
that follows illustrates each form of organizational learning by examining its
manifestation through a specific development in one of the organizations over the course
of implementation. Changes in organizational membership will be discussed in the
context of the changes in the role of advisors in the SEVIS case, and the development of
operational systems will be discussed in the context of the new evaluation process put
into place in the case of online education.
Table 6-1. Summary of Evidence of Organizational Learning. Provides a summary of
common characteristics of organizational learning along with key differences among the cases.

Organizational Learning: Changes in Organizational Membership and Culture
Manifestations of Learning:
- Acquisition of new technology-related skills
- Changed how work was done in the organizations
- Development of new organizational networks
- Hiring of new members who already possessed technical skills
- Guidance provided by sophisticated users
Key Differences:
- SEVIS: required participation among users; shared use
- Online Education: optional participation among potential users; individual use

Organizational Learning: Development of New Operational Systems
Manifestations of Learning:
- Developed practices in technology use
- Systems advanced use of technology and core missions of the organizations
- Required technical knowledge and subject matter expertise
- Relied on external sources of information for guidance in developing these systems
Key Differences:
- SEVIS: quickly developed; participative decision-making process
- Online Education: evolved with use; initial decision making was centralized; comprehensive evaluation process
Changes in Organizational Membership and Culture
Changes in the membership and culture of the organizations showed themselves
in many ways. Within both cases there was a dramatic increase in the value of a
technical skill set and a corresponding transformation which occurred in the
organizations. While the development of this learning has been discussed in a broad
sense for both cases, to better illustrate how it has developed, a single element in the
SEVIS case, the change in the role of advisor, will be examined in greater detail. The
introduction of SEVIS has shifted the manner in which advisors do their work and the
nature of their interaction with students. These changes include the nature of their work
with students, the need to develop advanced technical skills to guide the
implementation of SEVIS and ensure departmental needs are met, and the
establishment of new networks among advisors.
Over the course of the implementation of SEVIS, advisors have experienced
changes in how they advise and interact with students. This change has come as a result
of SEVIS now being part of their daily routines, a component of the advising process, but
is also a result of the additional demands that the implementation of SEVIS has placed on
the time of advisors. At different points of the implementation process, SEVIS has
crowded out other responsibilities, and necessitated that less time be dedicated to
working directly with students. Certainly this was the case with the SEVIS Coordinator,
who maintained a split commitment to both advising and SEVIS implementation for the
greater part of the implementation process, but it was the case with advisors as a whole.
The result of this, as discussed, has been less time for counseling of students beyond
academic and immigration related issues, as well as limitations to providing additional
student programming, the deficit of which has been filled by developing partnerships
with other departments on campus. Another critical component has been the verification
of data across databases, which consumed much attention initially. In terms of the actual
interaction with students in the advising process, as noted, it has become more
focused and is also mediated by a computer screen. Under the previous system, an
advisor would sit down with a student and discuss a variety of issues, including
academic progress and immigration status. Post-SEVIS, the process is more
mechanical, where the discussion involves the need to populate or verify student data.
Advisors have also had to develop a sophisticated understanding of technology,
specifically databases. This has been necessary to map the interface of the three
databases, so that advisors can guide implementation in a way that reflects the needs of
their departments and groups of students. Such an understanding has also been necessary
to implement the technology in a manner that meets immigration law, involving both the
interpretation and the operationalization of policy, and development of routines
consistent with those understandings. While much of this effort was led by the SEVIS
coordinator, individual advisors played a key role in defining and directing the
system.
The need to coordinate across departments has resulted in new networks among
advisors. While these networks were initially developed to plan the implementation and
direct the interpretation of policy, they have blossomed into new partnerships and
collaborative efforts among advisors. This has been helpful in managing issues related to
SEVIS on an ongoing basis since the relationships have remained intact and are valued
by the individual advisors. It has also helped with more recent policy-driven
technology, since again the networks have remained in place.
Development of New Operational Systems
In terms of the development of new operational systems, both organizations
sought to develop standard practices related to the policy driven technology to not only
advance the use of technology but their underlying missions as well—a process which
required more than just technical ability. The development of a comprehensive online
course evaluation process in the case of online education represented a new approach to
understanding what was being done in the virtual classroom. Even more critical here is
that it has led to the advancement of processes to improve operational systems which
guide both faculty use of technology and their overarching pedagogical approach to
online education.
The development of the online course evaluation process in the department began
with a comprehensive review of existing online courses and an understanding of best
practices in online education. The effort was led by the vice chair in the department who
had been charged with monitoring course content.
[Footnote 132: As discussed, it had been a standard practice since the expansion of online courses in the department that the chair or his designee maintains monitoring access to all online course websites. This process ensured that standards in quality education were met.] Through the process of monitoring
the online courses, she developed a solid understanding of practices in the department,
and used that experience to identify strategies that could be utilized across all department
courses. A template from a sister campus in its system was used to develop a rubric to
review and evaluate online courses. Part-time faculty who taught online courses were
then evaluated using these materials, a process that involved a self-assessment, a review
of their course site, and a meeting with the department chair. Some of the findings of this
process included the need to require the inclusion of writing assignments as part of online
courses, providing students with resources related to information literacy, and, more
generally, an interest in spreading the excitement of early adopters of the technology.
The materials were also shared with all online faculty members as a means to
direct course improvement. The structure of evaluation materials (i.e., categories of the
rubric) informed the development of additional faculty training, and this training was
integrated into the regular meetings held for online instructors in the department, which
were expanded from once per semester to twice per month. This process has also
informed the need for additional training to be offered at the university for all faculty
members, including the need to develop an online certification program. Until additional
resources are provided for campus wide training, though, the department will continue to
offer that support.
It is important to note that these standard practices have limited enforceability, as
faculty autonomy continues to be respected, even in an online environment. Certainly
this has been less of an issue with part-time faculty than with full-time faculty, but still
faculty members have been receptive to the feedback and the direction of these efforts. It
has not infringed on the nature of their instruction so much as having provided them with
strategies to more effectively use technology, including online organization and design,
assessment of learning, and the provision of support to students. There have been
ongoing discussions among users as this process has unfolded, and the department has
provided faculty with the support to update their sites, which is likely the largest hurdle to
adoption.
What Features Support Organizational Learning and How Do Managers Support
Organizational Learning?
As discussed, the conditions advanced through the learning model of policy
implementation were evident in the implementation process for both policies, though in
varying degrees and in different manifestations. Having reviewed these conditions in
detail within each case, a cross case analysis of them follows, which links each condition
to the evidence of organizational learning as presented—membership and cultural
change, and new operational systems. While expressing themselves in different ways for
each case, these seven conditions have demonstrated applicability in both instances.
Because these are very different cases, this does suggest that the conditions have promise in articulating the learning processes underlying policy implementation more broadly.
The discussion will begin with a view of evaluation and routinization, as it most
directly relates to these two learning processes. Policy and goal congruency, followed by
clarity of roles and responsibilities, are then discussed, both of which seek to establish
new arrangements in how the respective organizations operate. Internal communication is presented as a key means by which both pertinent information and changes in roles and operation are distributed within the organization. A discussion of networking with other implementing organizations and communication with policymakers follows; these are presented jointly because both serve to provide the organization and its membership with information pertinent to policy and to the use of technology. Finally, a culture of experimentation is, surprisingly, found to have had the least impact on these learning processes in both cases, so the discussion offers suggestions as to why this occurred, related to the higher education context in which the study took place.
Evaluation and Routinization
Evaluation and routinization played a central role in the learning process. This was not particularly surprising, as evaluation, in particular, served as the cornerstone of the policy learning process promoted by Wildavsky (1984). In terms of member and
cultural changes, routinization appeared to be particularly important, as it helped the
organization work towards the development of expectations and provided clarity in
individual roles. Further, it aided those less comfortable with technology, providing
additional structure and a shared knowledge base and language. Routinization also
served in both cases as a means by which new operational systems could be developed.
The presence of both evaluation and routinization was found in both cases, although they
manifested in slightly different ways. Part of this can be attributed to the nature of how
each organization engaged with the technology.
For SEVIS, because of the shared use of technology, coordination was needed
early in implementation. Such efforts helped to better direct how the roles of members of
the organization changed over time and also influenced the changes in culture.
Evaluation efforts have emerged following initial implementation as the organization has
had an opportunity to reflect back on how that implementation unfolded. Much of the
effort towards routinization was developed in concert with a variety of users, following a
“team” approach, as the organization grew to understand the technology and applied it to
the context of the campus, forming the basis for its new operational systems. What
developed was a fairly clear process by the time new members joined the organization.
As one noted, “There are rules and regulations for every single step that you take, so for
me everything’s pretty straightforward depending on the case” (21). The rules reflected
those from the federal government, as well as interpretations and procedures outlined by
the organization, a process that in and of itself was required for auditing purposes.
For online education, implementation was carried out in a much more individualized manner, as individual faculty, for the most part, worked to implement their individual courses. As discussed, the management of the organization established a mechanism to collect and share this information and has worked to develop a community among technology users,[133] both of which have facilitated the learning process, and to develop routines. Here, efforts to routinize have helped to boost confidence in use of the
technology and have helped to create a shared understanding in use as faculty interact and
share resources in both formal and informal settings. Evaluation has played a far greater
role in the development of new operational systems related to technology. It was in use
early on as the management monitored the individual courses of faculty, although efforts
have greatly expanded and grown more sophisticated as the department focuses on the
advancement of quality in online instruction. For online education, efforts toward
routinization were more closely directed by the management and the DEEC expert.
Much of the focus on both evaluation and routinization has been efforts to maintain and
improve the quality of instruction. The chair has been “hyper vigilant” in this regard
(1253), to stave off criticism by some faculty and ensure compliance with accreditation
requirements.
Evaluation and routinization have shown themselves to play a central role in helping the organizations to reflect on the implementation process. Still, as the cases suggest, organizations do not automatically engage in these activities. The focus at the onset with SEVIS was routinization, and both were a focus for online education, though they were engaged later in the implementation process. This condition can easily retreat in priority over the course of implementation, and because it is so important, it is necessary to ensure that it is well utilized.

Footnote 133: Efforts include regular meetings of online instructors, additional training, and the development of an online community.
Policy and Goal Congruency
Policy and goal congruency was critical to organizational learning, as management needed to adequately operationalize these policies for organizational members. In both cases and in both learning processes, efforts to establish a connection
between the policy and organization goals have served as the means by which the
organization has understood how to proceed with the implementation process. The
presence of a facilitator in the process played a key role in establishing those goals and
kept organizational members focused on the task. Another area where policy and goal
congruency has been shown to be important is in overcoming initial resistance and goal
conflicts. Incentives and disincentives, as might be expected, have served to help both
organizations overcome these issues.
For SEVIS, as discussed, the policy was developed and called for a well
structured response. The SEVIS coordinator served to facilitate the process, working
closely with implementation participants to develop the plan and then keep members to
their commitments. The federal mandate served as a driving force and demanded a change in how advisors, and indeed the organization as a whole, operated. In operationalizing the policy, the organization worked to operate within reporting requirements, but not beyond. In terms of the change in membership and culture, by
interpreting the policy conservatively and not seeking to provide any additional
information to the federal government, managers made the policy more palatable to
advisors, and that helped to engender their support in implementation and overcome their
general resistance.[134] As for operational processes, it helped in the establishment of
necessary milestones to implementation and directed the details of implementation.
Within online education, policy and goal congruency served management as a tool in generating faculty support. Such support was critical in the absence of external mandates. As discussed, this included incentives for individual faculty to develop and instruct courses, as well as, at the departmental level, the benefits that came with enrollment growth (i.e., more resources). Initially, the chair served to define these organizational
goals. However, as an evaluation system has unfolded, the vice chair has taken on the
lead in this regard. Such efforts have served to define both minimum expectations related
to quality education and more recent efforts at advancing the level of quality.
Clearly, the congruency between policy and organizational goals is important in directing organizational activity relative to those goals. The study suggests, however, the importance of considering not only organizational goals but also individual motivation in operationalizing policy at the organizational level. This was clearly important in implementing policy absent mandates, as was the case with online education.
Footnote 134: This included being seen as agents of DHS and "policing" students.
Clarity of Roles and Responsibilities
Defining and articulating roles and responsibilities were vital to the learning
process as the implementation required changes in the organization and its membership.
As the organization managed these changes, it needed to clarify the shift in expectations,
and organizational members needed to understand their role and contributions to the
implementation process. Much of this for both cases involved the acquisition of new
technical skills, but it also called for more collaboration with other users in order to
transform individual learning to organizational learning. It has been particularly
important to articulate the role of facilitator, as it served to manage the implementation
process from an organizational perspective. In terms of the development of new
operational systems, the identification of roles and responsibilities has been valuable,
particularly as a means of articulating the changes that were taking place in the
organization. The importance of this process related back to each case in unique ways.
For SEVIS, roles and responsibilities were more relevant as the implementation
was in many ways more technical and complex. As discussed, the complexity was
largely a product of the need to coordinate among users in different departments and
across various databases. Implementation required a great deal of structuring and detail.
This was managed through the coordinator in her facilitation role, and was served by the
development of strong internal communication systems. The evolution of the coordinator
role itself is indicative of the challenges related to the implementation of the policy,
which as explained required not just technical ability, but a strong understanding of
immigration law.
For online education, few faculty members were engaged in addressing implementation from a department perspective.[135] Communication regarding faculty roles and responsibilities was important in establishing minimum expectations in use of
technology and instructional quality. Coordination of this departmental effort was
facilitated by management and largely the vice chair. Her role served to direct those
expectations of faculty, and ultimately, served to pull together individual observations in
developing a higher set of standards for the department. Providing that feedback and
direction to faculty members has been vital to shaping the cultural changes within the
department and has formed the basis for its new operational systems.
Organizations need to articulate these changes as they occur, so individuals are well aware of their own responsibilities and know the changes to the responsibilities of others, since the expectation is increased coordination and collaboration. Further, roles and responsibilities need to be flexible, allowed to grow and evolve, and to permit the reconsideration of arrangements as needed. The latter is particularly important to an
organization’s capacity to adapt.
Internal Communication
In many ways, internal communication supported the underlying learning that
took place. Communication served to educate organizational members about the changes
in membership and culture, and served as the vehicle for expressing the new operational systems. In both cases, internal communication relied on both formal and informal systems, although the development of the formal was the most critical.

Footnote 135: While this was the prerogative of the full-time faculty, by all accounts they were notified of the progress, but not necessarily involved much with the overall planning.
Communication for SEVIS required a more complex array of networks in that
more departments were involved both as users and as participants in the implementation
process. Further, as discussed, the nature of the implementation demanded coordination
among users. This called for a more concerted effort on the part of both management and
the SEVIS coordinator to maintain communication systems. The latter in particular
worked diligently to ensure that roles and responsibilities, particularly coming out of
implementation meetings, were articulated and emailed to participants. Further,
operational systems and business guidelines were developed and readily available to
users.
For online education, communication was also very important, although it has
evolved over the course of implementation. Management made efforts from the onset to
maintain formal lines of communication among users. While helpful, their impact was
limited since initially online faculty meetings were only held once per semester and were
not required. Faculty also relied heavily on more informal communication, which itself
had its limitations. Faculty members were selective in whom they sought assistance from and whom they were willing to help. These choices mostly fell to whom they knew, with whom they shared office space, and with whom they were most comfortable.[136] However, as online usage has become more commonplace and the formal communication lines have been strengthened, its contributions to learning have also developed.
Footnote 136: As mentioned, in one instance a faculty member avoided sharing access to his course website with someone he deemed overly "critical."
Networking with Other Organizations and Communication with Policymakers[137]
Networking with other implementing organizations and communication with
policymakers served to supply external information to the organization regarding the
policy and use of technology. Specifically, it provided details and clarification regarding
policy, and served as a means for understanding best practices and how other
organizations generally used the technology. This information was necessary in helping
to direct learning and shape implementation in both cases. Within each case, however,
the sources of such information and how it was shared within the organization varied
widely.
SEVIS provided a well-articulated policy that lacked a well-developed source of technology, and relied on a new federal agency to direct the implementation.
Communication with the federal government was critical in both the interpretation of
policy and use of technology. The federal government had mechanisms in place to
provide this support (i.e., SEVIS helpdesk), but the organization also relied heavily on
support from a professional organization, NAFSA.[138] NAFSA engaged in direct dialog
with the federal government to generate information related to policy interpretation and
technology use. From the perspective of the organization, NAFSA supplied it with
knowledge regarding implementation. Further, through NAFSA and outside of NAFSA,
the organization maintained links with other campuses implementing SEVIS and
regularly engaged in sharing of strategies. All users had direct access to the SEVIS helpdesk and most were members of NAFSA. Information collected from both sources was regularly shared with advisors, and, as needed, with implementation participants. The availability of this information served the organization well, and helped to guide its members as they participated in the design of implementation and the use of SEVIS.

Footnote 137: These two conditions will be discussed jointly as they both function as a means for the organizations to collect information regarding implementation from their external environment.

Footnote 138: NAFSA is an association of international educators, which developed a policy community in response to SEVIS.
The collection and distribution of information operated much differently for
online education, which reflected both the nature of the policy and how the technology
was used. As discussed, the policy directive to develop online courses came mostly from
the department chair. Stipulations, however, in how to proceed were from policies
outside of the organization. There was less need for regular communication of
information from these outside sources. Information regarding policy was shared as
needed, mostly as individual faculty developed online course proposals.[139] The lack of availability of this information did not deter implementation; it just seemed that in this
case it was less necessary. Networking with other implementing organizations also
manifested itself differently. In many ways, and unlike SEVIS, this occurred on an individual basis, as individual faculty pursued and collected resources to augment
their individual online courses. In other ways, this was also centralized, as best practices
were generated both by the DEEC expert and the vice chair. In this regard, all efforts
have been and continue to be made to share these resources and it is this effort that has
mostly informed the learning process of developing new operational systems.
Footnote 139: Policies governing the development of an online degree, on the other hand, were even less known. The process was moving forward with input from the chair and a handful of faculty.
Culture of Experimentation
While, as discussed, the organizations did engage in experimentation in the use of
the technology, in both cases such experimentation was limited to efforts which can be
characterized as trial and error. The implication is that while experimentation is an
important component of the learning process, a culture of experimentation was not
evident. Both organizations made efforts to identify and utilize best practices, which
included communicating with other implementing organization and learning for the errors
other organizations made. Still this was not an indication of a culture of experimentation.
Indeed in the case of SEVIS, the policy itself created a circumstance that promoted risk
adverse behavior, but even in online education where the risks associated with failure
were less, outside of the initial prototype and its pioneering faculty member,
experimentation only seemed to be evident at the margins of the course.
While certainly even those marginal efforts have contributed to substantial
changes in both organizations, the implication is that learning can take place in the
absence of this culture of experimentation. Before ruling out the value of experimentation entirely, though, it is important to consider the context of the study.
The diminished role of experimentation may be less an issue of its value to learning in
policy implementation and more of an issue of the characteristics of the institution where
the implementation is occurring, namely higher education. This is an important reminder
of the limitations imposed by the selection of the cases, as well as the opportunity for
further research in applying the model to other institutions.
As discussed earlier, the selection of these conditions is well grounded in the organizational learning literature as contributing to organizational learning. The research
indicates that with the exception of a culture of experimentation, these conditions have
enhanced and supported the learning process. Still, it is important to note that by no
means is this an all inclusive list, nor is this the sole manner in which to conceptualize
these categories. Examples of other conditions that could also contribute to the
understanding of organizational learning processes include: distribution of power; intra-organizational networks;[140] and the history of the organization in managing change. These
do, however, serve this exploratory study well in advancing the understanding of the
organizational learning process. While these conditions may have supported learning, it is also important to discuss whether learning, and therefore these conditions, has supported the achievement of implementation objectives. This, after all, is the central purpose of policy implementation. So, in turning to the next section, the achievement of
implementation objectives will be discussed as it relates to both cases.
What Aspects of the Organizational Learning Process Support the Achievement of Implementation Objectives?
As noted, not only is it important to identify whether learning has taken place, but also whether that learning has advanced the achievement of implementation objectives, and, if it has, whether those objectives could have been served by other means. In beginning the
discussion, however, it is important to note that organizational learning is at best considered to help support the achievement of these objectives.

Footnote 140: Networks, in this regard, can be quantified through network analysis and assigned a strength.

The degree to which the findings indicate this will be discussed more fully; however, other elements are also at work in advancing these objectives. Such elements relate both to issues associated with
policy implementation and organizational behavior, including the availability of
resources, clarity of policy, provision of training, complexity of the objective, personal
relationships among those involved, competing interests of organizations and
organizational members, and the politics and history related both to the organization and
the policy at hand.
The study does indicate that organizational learning can indeed contribute to
achieving implementation objectives. The ability to thoughtfully change organizational
culture and membership, as well as to establish new operational systems, has supported
the achievement of these objectives. Yet, it is also necessary to consider what other
processes facilitate implementation, and to what extent would implementation have been
possible in the absence of organizational learning. The study also provides insights here,
where elements of each of the cases suggest conditions in which learning may have less
relevance. The first relates to policies that are highly technical in nature, as in SEVIS.
The second relates to technology that is operationalized in a highly individualized manner
with a lack of policy clarity,[141] as was the case initially with online education.
Organizational learning has been identified as a means for assisting in the
individual understanding and use of technology, as well as a process by which shared
practices can be advanced. Policies, as studied here, often require a reorientation of how
organizations operate and of individual roles within the organization.

Footnote 141: While the goal of enrollment was clear, the level of quality in instruction and expectations were not initially well articulated.

Learning is an important vehicle in directing the development of those changes toward the achievement of the implementation objectives. Part of this process is the development of incentives
(as with online education and the strategies used by the chair) or punitive action (as with
SEVIS and the repercussions from the government) to engender support. More
importantly, it is maintaining the direction of that change and working to create value
through the shifts in membership and culture that support the policy objectives. In both
cases, a major effort in this regard has been the development of technical skills to work in
concert with other skill sets required by individual roles. The process, as clearly
demonstrated with SEVIS, ought to provide the organization with a better adaptive
capacity to manage related future demands on the organization. And as shown by
SEVIS, organizational learning is not a panacea that will solve all difficulties in managing future challenges, but it is, by definition, a means to reduce the barriers in meeting those challenges. The degree to which it can do that measures both the organization's
propensity for learning and the impact of learning on the implementation of the policy.
As also indicated, the act of reflection contributes substantially to the learning
process, particularly when such findings are valued and shared, and they are utilized to
introduce process improvements. While such learning can take place through informal
means, management plays an important role in ensuring that this activity is coordinated,
and that the information and knowledge collected through it are maintained and fed back to
organizational members. This reflection does not always tie back to the immediate
implementation objectives, but managers need to pursue it for the benefits it provides to
the learning process. For online education, the evolution of monitoring to evaluate online
courses demonstrates efforts to improve the reflective abilities of individual members, so
they can improve in their use of technology as well as their overall pedagogical approach
to online instruction. The use of such evaluative processes to further advance overall
departmental standards also represents an advancement in the learning process.
Still, as mentioned, organizational learning is only one of many factors to
consider in the implementation of policy. As has been discussed, organizational learning
played a substantial part in the implementation of both policies. For SEVIS, despite the
contributions of learning to the overall process, resources continued to be a large concern, which prohibited the organization from maintaining many of its initial student service
functions following the federal mandate. Further, the technical challenges created by the management of these multiple databases, as demonstrated in the organization's management of a second systems conversion to a related database, can undercut the observable impact of organizational learning.[142] Much of what was initially achieved
for online education, on the other hand, largely grew out of individual learning practices.
Although management was actively engaged in setting the direction for implementation,
faculty members were able to manage the technology on an individual level, while
maintaining a minimum threshold of instructional quality. In many ways, there was a
lack of policy clarity (or at least consistency), which undermined organizational learning,
as it failed to provide clear direction to users. Further, due to the lack of coordination,
some of the initial changes that took place within the organization also lacked direction.
Such practices only took the department part of the way to achieving its implementation objectives (i.e., enrollment growth). In the later phase of implementation, as the organization became more committed to the online enterprise, more cognizant of the need to change, and better informed on the type of changes desired (i.e., organizational makeup and standards in online instruction), organizational learning played a more integral role.

Footnote 142: Its actual impact, though, is better measured through the reduction in barriers to the implementation process, which was certainly present in managing that second implementation.
Implications for Policy Implementation
This research was undertaken with the aim of offering a new direction for policy implementation studies: one that incorporated organizational learning and established a
learning model for policy implementation, or policy implementation enhanced by
learning (PIEL). The policy implementation literature suggests a variety of strategies for
understanding how implementation should or did unfold, and in many instances why it
has failed. Among the arsenal to address improving implementation effectiveness are
refinements to policy (Mazmanian and Sabatier, 1983), anticipating obstacles (Pressman
and Wildavsky, 1984) or opportunities (Matland, 1995), and managing improvements to
policy over time, both on the implementation side (Pressman and Wildavsky, 1984) and
the policy side (Bardach, 1977). The model goes beyond Bishop's consideration of
organizational structures (1993), as well as Wildavsky’s notion of learning (Pressman and
Wildavsky, 1984). Wildavsky advanced the notion of evaluation as the means by which
implementation evolves and improves over time. Evaluation is important, but needs to be
coupled with other organizational processes to ensure the information generated in the
evaluation process is converted into knowledge. Other changes at the organizational
level are also required in order to create that "long-run adaptive capacity" (Schwandt, 1993). As discussed, the learning model of policy implementation helps to identify the
mechanisms by which implementation objectives are better served.
It is important, though, to discuss more thoroughly the implications of this model
for the policy implementation literature. The model is useful in understanding how
implementation has unfolded and can provide insights to policymakers and managers on
strategies to help facilitate implementation. While the predictive power of the model is
limited, the implications of the study for policymakers are to promote a focus on the
organizational perspective in considering policy and to seek the advancement of the
learning process with organizations implementing policy by maintaining open
communication with them and promoting effective evaluation systems in the
implementation process. More valuable are the implications for managers and
organizations, which can help organizations in responding to policy driven changes.
These include operationalizing policy at an organizational level, providing the necessary support for not only the implementation process but also the underlying learning process, and,
finally, working to maintain the overall organizational learning process and its underlying
subsystems, as advanced by the learning model of policy implementation. These
implications will be illustrated through the discussion of an idealized case, where
technology is being implemented in a higher education setting, specifically, examining the introduction of an automated Institutional Review Board (IRB) process.
As an exploratory study examining two nested cases through the specific lens of technology policy in a higher education setting, this research certainly carries limitations on the applicability of its findings. Conversely, the cases differ in substantive ways, suggesting some opportunity for broader applicability. Both of these will be discussed. Ultimately,
pursuing this line of research can indeed prove to be very beneficial in furthering not only
the study of policy implementation, but efforts to more effectively implement policy,
providing a powerful tool to tackle societal concerns.
Applying the Model
The model has thus far been discussed in terms of the two cases. While this
has demonstrated the value of the model and generated important findings, the cases are
by no means ideal examples of the model, mostly because the organizations did not
approach implementation with learning, and more precisely the learning model, in mind.
There are several implications of the model for both policymakers and managers of
implementing organizations. These are summarized in Table 6-2 and suggest the
importance of a focus on organization and communication, as well as process strategies
for both policymakers and managers. To best illustrate how the learning model of policy
implementation can advance implementation objectives, though, an ideal case will be
presented. The example will be the introduction of a new technology, specifically an
automated Institutional Review Board (IRB) system.[143] Due to the nature of IRB, the
principal users of the technology would be faculty, particularly those engaged in
research with human subjects, but users also include students and staff who may be
engaged in research on the campus. The focus for this illustration, though, will be on
faculty users.
[143] IRB is the mechanism by which research that involves human subjects is reviewed and approved.
Different types of research, based on the extent to which they may be deemed intrusive on human
subjects, require different levels of review. An automated system would bring several advantages to an
institution. These include providing reliable, up-to-date information on research projects, alerting users and
administrators to studies in danger of falling out of compliance, and reporting mechanisms to assist in tracking
research activities and providing tools for long-term planning.
Table 6-2. Implications for Policy Implementation. The roles of policymakers and managers
in supporting the learning model of policy implementation.

Organization
  Policymakers: Maintain an organizational perspective to implementation
  Organizations/Managers: Operationalize implementation in organizational terms
Information
  Policymakers: Support communication systems with implementing organizations
  Organizations/Managers: Provide resources, support, and expert knowledge
Processes
  Policymakers: Develop meaningful evaluation systems
  Organizations/Managers: Maintain organizational learning system
Even before considerations for organizational learning can be offered, several
factors must first be considered. These include the availability of resources, the initial
objectives of implementation, the population that will be most affected, the available models
of the technology and its use, the timeline for implementation, and
the degree of support within the university administration.[144]
Successful implementation begins with good policy design, so initial
considerations of advancing the model start with the policymaker perspective. The first
steps are to invest in a developed, tested system, set clear policy objectives, including a
date for full implementation and a timeline leading up to it, which itself should include
testing and training, and perhaps piloting of the system. The expectation among
policymakers might be that since it is “better” than the current system it will be
used, and even if users do not agree they will need to comply if they wish to conduct
research. The learning model, however, suggests the need for more meaningful buy-in
among users. Policymakers influence not only the design of policy, but also
participate in continued efforts to shape and clarify policy. The model reminds
policymakers to consider policy not only from a macro-level, but to consider it from an
organizational perspective. It also suggests the importance of creating mechanisms to
communicate and provide policy clarification to implementing organizations. The latter
includes advancing evaluation systems that are not only directed at ensuring
compliance but also provide more valuable information to the implementation process at
the organizational level.

[144] As this is a policy discussion, the assumption is that the driver of the program is the
administration, although it may initiate with the Dean, Vice President/Provost, or President.
Maintaining an organizational perspective would mean several things. These
include being aware of policy impacts on the organization and considering broader
definitions of objectives. While the interest for policymakers may be overall efficiency
gains in the long term, they need to be cognizant of, and attempt to address, some of the
short term challenges that such a change would impose on organizations and
organizational members within the institution. Among users there will be resistance to
the technology, concerns about learning it, and worries about the delays that this might
create in moving forward with new research projects. Department chairs may see the value
of the new system, and so be possible policy advocates, but may choose to resist it
because they know, or believe they know, how their faculty would react to the new
system, and they fear that the change might have a negative effect on the research agenda
of the department in the short term.
Policymakers can mitigate these concerns from the organizational level by
considering broader definitions of achieving implementation objectives. Their definition
of efficiency, for example, would include the operation of IRB for the board members
themselves, who serve to review the research requests, and the operation of the research
office that coordinates the process. But it is necessary to weigh those
benefits against the challenges that might be encountered at the user level: the drop in
productivity as users learn to navigate the system; delays or losses in research (with
possible impact on grant and research dollars) as the new system is learned or faculty
decide not to engage in certain types of research to avoid having to deal with the change.
Such considerations may suggest the need for a longer soft implementation, where dual
systems are maintained. This would be more difficult for central users, but would help to ease
the transition to the new system while minimizing some of the negative impacts on the
organizational level.
A useful strategy to advance implementation, while also strengthening the
organizational perspective, is to find and cultivate advocates at the organizational level.
These are not only cheerleaders, individuals who articulate the value to other users, but
individuals who provide helpful feedback on the technology, the implementation process,
and approaches to generating organizational support (and foreseeing sources of resistance)
for the initiative at the department level. Such a process would include identifying heavy
IRB users and involving them in the process early to provide feedback on system
selection or in the identification of features that are most important. Indeed, attempting to
generate support through participation in some key decisions early in the implementation
process, with the intent of considering feedback from the implementing organizations, is
critical. Other advocates include newer members of the organization. Efforts should be
made early on to push technology to new faculty through faculty orientation, socializing
them so the technology is understood as an expectation. Among this group there will
likely be less resistance, and added value since tenure track faculty are expected to be
active researchers.[145]
Policymakers also need to establish communication lines with implementing
organizations to guide the implementation process. The perspectives of department
chairs and individual faculty must be heard early on to understand the impact of the
policy on users and their research, to provide an opportunity for feedback on the system,
and to anticipate sources of resistance. While policymakers would not need to respond
to all concerns, such a practice would go far to avoid negative unintended consequences
and contribute to the sense that individual concerns have been heard.[146]
As noted, this is
where cultivating early advocates for the initiative is important as they can continue to
provide a voice in whether the automated IRB system is meeting the needs of users,
whether it fits the diverse needs of disciplines, and whether it indeed is able to deliver
the improvements that were anticipated. Also necessary is to provide ongoing
support to users. Critical to this is anticipating questions and addressing them through training
and additional resources, as well as understanding patterns of use. While a 24-hour help
desk may not be critical, it is important to ensure that technology support staff are sufficient
to address inquiries in a timely fashion and that they are properly trained.
Finally, policymakers need to ensure institutional resources are provided to
evaluate not only implementation but also the learning associated with implementation.
[145] Further, it is also reasonable to expect that they will be more receptive to technology than faculty more
distant from their doctoral studies, as this population will likely include in its number more technology
“natives” and not solely technology “immigrants”—natives being those who were born into the presence of
technology in all facets of their lives.

[146] Such an approach can also slow policy design and implementation substantially. Policymakers and
administrators need to balance the need for collecting the organizational perspective with other institutional
needs, such as the consequences of a delayed or extended implementation.
Policymakers ought to make efforts to collect information regarding support systems
available at the department level. Such a process can help to capture various strategies of
support and learning among the various departments, offering opportunities to share
strategies and practices. As a means of meeting local needs, policymakers can work to
establish a network of experts at the department or school level. These experts would
receive additional compensation (e.g., course release time) and training, provide more
localized support to users, and help to structure more specific support programs in
coordination with the central research office.
Policy implementation enhanced by learning has more implications at the
organizational level as it offers managers tools to better address both implementation
objectives and the needs of organizations as they manage the organizational changes
related to the policy. Still, the responsibilities for managers in many ways dovetail with
those of policymakers. These include operationalizing implementation in organizational
terms, providing organizational members adequate support, and maintaining the learning
system within the organization.
The most important first step for managers is to provide clarity of
implementation objective(s) for the organization. Generating support can be a major
obstacle. Within an academic institution even mandates can be resisted, as faculty
maintain a special degree of authority granted by academic freedom and tenure, leading to
the delay, if not the derailing, of such initiatives. There need to be tangible benefits at the
organizational and organizational member level. The responsibility of managers is to
help draw those connections between the objectives of the policy and those of the
organization. For the manager, part of this is improving the ability to monitor and track
IRB submissions at the departmental or school level, as well as introducing a system that
generates assessment information. For the faculty member, it means largely long-term
gains, in such things as ease of system use, quicker processing of research
submissions, quicker response time from IRB, and an improved system (i.e., direct access)
for tracking and monitoring IRB submissions. So the first task is ensuring that organizational
members understand the value of the policy in organizational terms; the second is devising a plan to
facilitate implementation. Arriving at those long-term benefits is the critical component
here. This may begin with a discussion of the impact that the conversion would have on
the department, and the resource and support needs which organizational members
require to meet organizational objectives.
This discussion of resources leads to the next responsibility of managers.
While the provision of the necessary resources and support is obvious,
managers need to consider not only the needs of implementation, but also those of the
underlying learning process. That process ought to be valued as implementation is
valued, and includes opportunities for reflection and for making
adjustments as needed. For implementation, such resources are training in the use of the
IRB system and the identification of departmental resources that can support
implementation, such as faculty with experience in managing the technology or interest
in developing their sophistication. Additional assistance may include the provision of
departmental support for managing the new system, such as dedicating administrative
support to new or additional data processing that may be required (or needed to convert
existing records), particularly in the initial conversion. It may also include providing
one-on-one support to faculty in using the technology and the hiring of technical support
staff to provide assistance. For the learning process, such resources may include developing
new strategies for facilitating communication within the organization or sharing
resources.
Very much related to this issue of emphasis on process, managers must also
consciously seek to support the learning process, which means monitoring the presence
and health of the various components of organizational learning as advanced by the
model. As presented, the conditions of the learning model of policy implementation are
based on the Organizational Learning System Model (Schwandt and Marquardt, 2000),
which has four interrelated subsystems: environmental interface; action/reflection;
meaning and memory; and dissemination and diffusion. The relationship of each
subsystem to the organizational conditions is shown in Figure 6-2, a modified version of
Figure 2-1. Such a conceptualization of learning provides a means by which managers
can monitor organizational learning in the context of implementation.
Figure 6-2. Integrated Dynamic Social Model of Organizational Learning. Social model
subsystems are shown integrated with the learning model of policy implementation. Each
subsystem (in capitals) is identified with its corresponding organizational conditions. The
four subsystems are arranged along two dimensions, purpose (means vs. ends) and focus
(external vs. internal):

External focus, means: ENVIRONMENTAL INTERFACE
  Network with Other Implementing Organizations; Communication with Policymakers
External focus, ends: ACTION/REFLECTION
  Policy and Goal Congruency; Culture of Experimentation
Internal focus, means: MEANING AND MEMORY
  Internal Communication; Evaluation and Routinization
Internal focus, ends: DISSEMINATION AND DIFFUSION
  Clarity of Roles and Responsibilities; Internal Communication

Adapted from Learning Subsystems and Functional Prerequisites, Figure 4.5, p. 64, and Learning Systems
and Interchange Media, Figure 4.6, p. 69 (Schwandt and Marquardt, 2000).
In terms of the first quadrant, managers need to ensure that external
communication and information regarding the technology are available to the organization.
These include such items as notices about training, changes to policy or the
implementation process, problems or potential problems faced by outside departments,
and best practices in use. Collecting this information is important, as is considering what
needs to be shared with organizational members and the means by which this will be
achieved.
Much of this feeds into the action/reflection quadrant, as that external policy is
applied to organizational goals. Understanding the health of this subsystem means
ensuring that faculty members are appropriately aware of the conversion to an automated
IRB system and the associated timelines, how it fits the needs of the organization, and what
resources are available to support implementation, institutionally and in the department.
Also involved is ensuring that there is latitude for experimentation, and an opportunity to
learn from the mistakes of others.
In addressing the meaning and memory quadrant, the issue is whether
organizational users are accessing this information and are able to make sense of it now
and for future use. This involves documenting best practices and having these readily
available to faculty. It also means meeting users where they are currently—filling in the
gaps left by institutional training and offering more local training as needed. With a
focus on the learning process, this means collecting feedback on the accessibility of information,
resources, and support, understanding how users are learning to use the system, and promoting those
strategies. Managers also need to display an openness to the insights generated through
such a process and be open to making necessary adjustments to their implementation
process as might be indicated. This quadrant also involves the establishment of routines that offer
clarity and simplify use. If the department office is to provide administrative support for
updating or converting records, the routine is being sure that faculty know the support is
available, how long processing will take, and what information they need to provide.
Dissemination and diffusion supports all of these in facilitating communication
within the organization. Here a manager needs to ensure that there are mechanisms to
circulate and discuss relevant information and knowledge, developed through the
organization’s work with the policy and its implementation. A department committed to
this endeavor might institute monthly progress meetings to discuss new information that
has become available, share strategies and challenges, and revisit needs for resources and
support. Further, as changes are made, it means that managers are ensuring that they are
understood. For example, if there are identified expert users in the department, whether
due to their technical skills or familiarity with the IRB process, managers should ensure that
it is known that these users are available and that they are clear on how much of their time
ought to be dedicated to that supporting role.
Ultimately, though, what the learning model of policy implementation provides is
a more comprehensive way to understand the learning needs of organizations through the
implementation process. As advanced, the model directs managers to key areas as a
means of both enhancing the organizational learning taking place and supporting the
achievement of implementation objectives.
Limitations of the Model
Clearly, not all policies demand organizational learning in order for
implementation objectives to be met. Understanding when to apply the model is an
important first step in its use. Technology was selected as a policy focus as its
introduction would require some individual learning, which would likely benefit from
organizational learning. This proved useful, but suggests further limitations. An
important consideration in determining the appropriateness of a case is selecting cases in
which policies have far-reaching effects on an organization. Specifically, they should
require fundamental changes in how the organization operates and have a broad impact
on individual organizational members. Changes need to be required at the organizational
level. Supporting learning in that context can help ensure that the change is directed and
meaningful.
An additional consideration is that the study took place within the context of higher
education. There are certainly notable distinctions between private and public
institutions, which include but are not limited to reward and incentive structures,
motivations of employees, power and politics, and goal clarity (Dessler, 1999; Moon,
2000; Nyhan, 1999; Wittmer, 1991). Many of the characteristics of public institutions
have applicability to higher education, but they can manifest themselves in other ways as
well. There are different cultural issues, resource constraints, and political structures that
are evident in this context. What is important to note here, as has been discussed, is that
the conditions, and learning itself, may unfold or apply in different ways in other types of
organizations.
Still, the selection of the two cases also suggests an opportunity to consider the
implications of the study more broadly. These differences, as discussed and noted in
Table 3-1 (chapter 3), related to the policy itself and the technology, as well as to the
organizations and members charged with implementation.
Table 3-1. Policy Cases in Contrast. Elements of the cases, by organization/policy.

Charged with Implementation
  SEVIS: Primarily staff
  Online Education: Primarily faculty
Motivation
  SEVIS: Compliance; punitive based
  Online Education: Incentive based
Technology function
  SEVIS: Administrative, information management
  Online Education: Academic, teaching
Interface with Technology
  SEVIS: Distributed use, shared
  Online Education: Individual use
Source of policy
  SEVIS: External (federal)
  Online Education: Mixed sources, but driven by organization management
Specificity of policy
  SEVIS: Very structured and specific
  Online Education: Clear goal, ambiguous means
Maturity of policy
  SEVIS: Developed, but continues to be refined
  Online Education: Emerging policy with more specificity anticipated
Technology reform objective (Heeks and Davies, 1999)
  SEVIS: Optimization
  Online Education: Reengineering
Ambiguity-Conflict Model for Policy Implementation (Matland, 1995)
  SEVIS: Administrative
  Online Education: Political
Organizational conditions have been discussed as observed, and certainly there
was no concerted effort to express each of these conditions as the implementation process
unfolded. Perhaps, with focused attention to each of these areas from the onset of
implementation, management and organizations may have been better served. Indeed,
that is the underlying importance of the study, to provide a new focus for policymakers
and management in addressing the capacity of organizations to meet the objectives of
policy. While the exploratory study only provides some initial guidance in this regard, it
does suggest the value of coupling an organizational perspective, and organizational
learning in particular, to the study of policy implementation.
A final note regarding the applicability of the model to policymakers suggests a
limitation of the study. Similar to many policy implementation studies, this one began
with an interest in developing tools to craft better policy, and specifically to provide tools
to policymakers to identify preconditions in organizations charged with implementation
which would indicate the propensity of the organization towards learning. The findings
of the study indicate that such an approach is not appropriate for this model. While
existing organizational features suggested some capacity for the organization to develop
the conditions, when dealing with policies that require substantive changes in how the
organization operates or changes in member roles or organizational culture, much of the
work must start anew. In both cases, organizations developed new systems in response to
the policy. The international office and the language program forged a strong
relationship to manage implementation, which led to other collaborations. An evaluation
process was developed to examine the quality of online instruction and create new
standards. While this can be expected, there was a hope that learning would have relied
more on existing structures. Though some did—international educators had a
professional organization that pre-dated SEVIS, but its central role in implementation
was not preordained—by and large new systems were required. Ultimately, the need to
develop these systems requires additional time and energy from the organization, which
can delay implementation. Therefore, it is premature to expect that the model could
predict whether implementation objectives will be achieved given a particular policy and a
specific organization.
Concluding Thoughts
The study provides a new perspective for understanding the process of policy
implementation. In coupling policy implementation with organizational learning,
policymakers and organizations gain a new perspective and new tools to achieve
implementation objectives. The study offers a variety of insights and observations
related to both sets of literature, but at the heart of this model is the interest in making
policies work better. So, in that vein, the conclusion will consider the challenges to
getting policymakers to recognize an organizational perspective in the development of
policy, as well as getting managers to invest energy and resources in learning as they direct
implementation. Limitations and caveats of the study will also be discussed, as will
implications for future research. Such research can help further develop and refine the
model, and further strengthen the case for the coupling of these literatures.
Among policymakers, and within policy studies as well, there is a high value placed on the
impact of policies. Economic models and theories form the basis for cost-benefit analysis
on the policy development side and outcomes assessment on the implementation side.
Organizational learning has two disadvantages in this regard: the focus on outcomes and
the difficulty in assessing the impact of learning. In terms of the first, organizational
learning supports the overall implementation process and most often any assessment of
outcomes (or outputs) of a policy will not capture learning. A critical element to
advancing this notion among policymakers, then, is to draw attention to the process of
implementation and not just the product. Many issues may appear to prohibit the ability
of policymakers to do so—the breadth of implementing organizations and limited
resources being high on that list. However, as the study suggests, the value of promoting
and improving process-oriented aspects of implementation, learning specifically, can
have a substantial impact on the achievement of implementation objectives.
Incorporating process and learning oriented elements to evaluation and compliance
reports, and at the very least collecting information regarding their presence, will provide
some information to policymakers on the extent to which such processes are present
through implementation, and can also develop an awareness among organizations of such
processes. In terms of the difficulty of assessing the impact of learning, the greatest
obstacle relates directly to the emphasis of economics in valuing policy impact.
Assessing learning is not a quantitative process, but a qualitative process. It requires a
different set of skills and a different approach, and yields a different kind of result.
Policymakers and policy analysts need to develop a comfort with such approaches.
Fortunately, such assessment tools and approaches continue to develop in terms of
sophistication and use, which will assist in making it more palatable in policy circles.
For managers of implementing organizations, there are often similar issues
relating to the change in mindset required in considering the impact of learning. There
are two additional barriers that may be relevant: the difficulty of putting organizational
learning in place and the difficulty of focusing on learning when addressing
implementation demands. First, while organizational learning literature continues to
grow and has certainly entered the vernacular, operationalizing it is not as clearly
understood. It requires the transformation of organizations and institutionalizing it is
difficult.[147]

[147] A manager may advance it within her unit, but if the broader organization does not embrace it, the
relevance is diminished.

The hope, however, is that its popularity is growing and, as discussed, there
is a growing number of tools and approaches to advancing learning within organizations.
The second issue is a matter of resources. When faced with an impending
implementation demand, which in this case places substantial new demands on an
organization, as discussed, it is exceedingly difficult to devote time and energy to
promoting organizational learning. The advantage is that often there is a close
connection between supporting learning and achieving implementation objectives, as was
witnessed with the evaluation efforts in online education. At the start of implementation
when the development of organizational learning systems is most valuable, however,
those linkages are not as clear.
The development of a learning model of policy implementation, however, seeks
to advance the value of connecting implementation objectives with organizational
learning. The interrelation of learning systems and the implementation process can be
better understood and more easily put into practice within organizations. Policymakers
can also help to reinforce its value by placing an emphasis on not just the product of
implementation, but the process.
The Future Path
This study serves as an important next step in the evolution of policy
implementation studies. While this was an exploratory study examining technology-related
policy in two nested case studies at a single university, the two dissimilar cases of policy
implementation offered varying perspectives on this model. This helped support its
underlying principles and deliver a more robust perspective on the contributions of
organizational learning to policy implementation. However, it is just a first step. Before
discussing implications for future research, it is helpful to note some of the limitations of
the study, which can help to refine and improve future efforts.
- The discussion of achieving implementation objectives may benefit from the
perspectives of those being served by the policy—in these cases, students. Taken into
consideration were the intent of policy and the observations of organizational members, as
substantiated by the document analysis. While the analysis benefitted from
general observations collected from course evaluations, in the case of online
education, more direct assessment may benefit future studies.

- Routinization as a condition, while important, can be confused with an
outcome of learning—in these cases, new operational systems. The
distinction drawn in the analysis was that routinization, the condition, is
process oriented, as opposed to the products of that process, which are the
operational systems. Further clarification of this condition may be needed,
and it may be beneficial to separate it from evaluation.

- Testing of the model may benefit from controlling for other factors
influencing implementation. Assuming that these measures are valid and the
organizational conditions can be adequately studied and understood, there is
still a likelihood that other factors may influence effectiveness, whether
positively or negatively. Such issues as changes in financial resources,
turnover in staff during the implementation process, complications related to
campus technology, and changes in organizational priorities may all influence
the achievement of implementation objectives, despite the positive impact of
organizational learning. In terms of this study, the utilization of a case study
approach enabled a far richer collection of evidence than a quantitative
approach. Understanding these policies and organizations in context and
using a variety of methods helped to mitigate the impact of such factors.
In terms of future research, there are several directions that studies may take to
further develop and advance the learning model of policy implementation. These include
research efforts to:
- Further clarify the types of policies that would benefit from organizational
learning. As suggested, the model can apply to policies that require changes
in organizational operations, and which have a broad impact on individual
members. Such characteristics of policy are not always easily identifiable,
and there may be situations where learning benefits organizations undergoing
less dramatic changes. Additionally, while it has been argued that the model
has applicability beyond technology, this too ought to be tested.

- Apply the model to different types of organizations. Efforts to explore the
boundaries of the model and how the conditions support organizational
learning in settings beyond higher education would contribute to a more
robust model.

- Examine whether organizational “footprint” features can be identified. These
would be features that exist in the organization prior to implementation that
contribute to the development of organizational conditions that support
organizational learning related to the policy. The identification of footprint
features would enable the model to generate a predictive function to better
serve policymakers.

- Consider the impact of learning on other aspects of the organization and
organizational functions. This consideration of the side benefits of
organizational learning, if supported, would help to generate support among
managers to attend to the learning process during policy implementation.

- Identify more specific strategies for maintaining the health and proper
functioning of the conditions and subsystems as outlined in the model. This
would be important to help managers better operationalize the learning model
of policy implementation within their organizations.

- Advance the assessment of organizational learning and its impact within the
context of policy implementation. While much work is underway in this
regard within the organizational learning literature, placing it in the context of
achieving implementation objectives would help to make the model more
meaningful to policymakers.

- Finally, consider other conditions that may support learning. As discussed,
such examples may include: distribution of power; intra-organizational
networks; and the history of the organization in managing change.
Continuing to test the learning model of policy implementation can help to better
identify the impact of learning on policy implementation. As a new model, it is
particularly important to be open to other strategies for conceptualizing learning in a
policy implementation context. Certainly, given other policies and other organizations, a
culture of experimentation may be more important, existing conditions of the model may
be less so, and other conditions may also need to be considered. Efforts such as these
will help to further refine the model and ultimately serve to improve the achievement of
implementation objectives, the intent of both policymakers and organizations, and the
needs of society.
Bibliography
Alshuwaikhat, H. M. and D. I. Nkwenti (2003). “Collaborative planning and management
frameworks: Approaches to effective urban governance by adoption of emerging
technologies.” International Journal of Management 20(3): 294-305.
Amoore, L. and M. D. Goede (2005). “Governance, Risk and Dataveillance in the War on
Terror.” Crime, Law and Social Change 43: 149-173.
Argyris, C. and D. A. Schön (1978). Organizational Learning: A Theory of Action
Perspective. Reading, Mass., Addison-Wesley Publishing Co.
Bardach, E. (1977). The Implementation Game: What Happens After a Bill Becomes a
Law. Cambridge, MA, The MIT Press.
Barki, H. and J. Hartwick (1989). “Rethinking The Concept Of User Involvement.” MIS
Quarterly 13(1): 53-64.
Barki, H. and J. Hartwick (1994). “Measuring user participation, user involvement, and
user attitude.” MIS Quarterly 18(1): 59-82.
Barley, S. R. (1990). “The Alignment of Technology and Structure through Roles and
Networks.” Administrative Science Quarterly 35(1) (Special Issue: Technology,
Organizations, and Innovation): 61-103.
Berleur, J. and C. Avgerou, Eds. (2005). Perspectives and Policies on ICT in Society.
New York, Springer.
Berry, F. S. and W. D. Berry (1999). Innovation and Diffusion Models in Policy
Research. Theories of the Policy Process. P. A. Sabatier. Boulder, Colorado,
Westview Press: 169-200.
Berry, F. S., W. D. Berry, and S. K. Foster (1998). “The Determinants of Success in
Implementing an Expert System in State Government.” Public Administration
Review 58(4): 293-305.
Bishop, S. W. (2003). “Welfare Reform from the Inside Out: Implementing the Missouri
Families-Mutual Responsibility Demonstration Plan.” Administration & Society
35(5): 597-628.
Bovens, M. and S. Zouridis (2002). “From Street-Level to System-Level Bureaucracies:
How Information and Communication Technology is Transforming
Administrative Discretion and Constitutional Control.” Public Administration
Review 62(2): 174-184.
Bugler, D. T. and S. Bretschneider (1993). Technology Push or Program Pull: Interest in
New Information Technologies within Public Organizations. Public Management:
The State of the Art. B. Bozeman. San Francisco, Jossey-Bass: 275-293.
Burkhardt, M. E. and D. J. Brass (1990). “Changing Patterns or Patterns of Change: The
Effects of a Change in Technology on Social Network Structure and Power.”
Administrative Science Quarterly 35(1) (Special Issue: Technology,
Organizations, and Innovation): 104-127.
Carnevale, D. (2003a, February 7). Congress May End Distance-Education Limit.
Chronicle of Distance Education. Retrieved August 4, 2003, from
http://chronicle.com/prm/weekly/v49/i22/22a02701.htm
Chen, C. C., X.-P. Chen, et al. (1998). “How can cooperation be fostered? The cultural
effects of individualism-collectivism.” Academy of Management Review 23(2):
285-304.
Chin, R. and K. D. Benne (1976). General Strategies for Effecting Change in Human
Systems. The Planning of Change. W. G. Bennis, Holt, Rinehart and Winston.
Cole, R. E. (1991). Large-Scale Change and the Quality Revolution. Large-Scale
Organizational Change. A. M. Mohrman, Jr., S. A. Mohrman, G. E. Ledford, Jr.,
T. G. Cummings, E. E. Lawler, III, and Associates. San Francisco, Jossey-Bass:
229-254.
Crossan, M. M., H. W. Lane, and R. E. White (1999). “An organizational learning
framework: From intuition to institution.” Academy of Management Review
24(3): 522-537.
Daft, R. L. and K. E. Weick. (1984). “Toward a Model of Organizations as Interpretation
Systems.” The Academy of Management Review 9(2): 284-297.
Damanpour, F. (1991). “Organizational Innovation: A Meta-Analysis of Effects of
Determinants and Moderators.” Academy of Management Journal 35: 555-590.
Davenport, T. H., D. W. D. Long, and M. C. Beers (1998). “Successful Knowledge
Management Projects.” Sloan Management Review 39(2): 43-57.
Davis, F. D., R. P. Bagozzi, and P. R. Warshaw (1989). “User Acceptance of Computer
Technology: A Comparison of Two Theoretical Models.” Management Science 35(8): 982-1003.
DeMatteo, J. S., L. T. Eby, and E. Sundstrom (1998). Team-based rewards: Current
empirical evidence and directions for future research. Research in Organizational
Behavior. B. M. Staw and L. L. Cummings. 20: 141-183.
Dessler, G. (1999). “How to Earn Your Employees’ Commitment.” Academy of
Management Executives 13(2): 58-67.
Elmore, R. F. (1978). “Organizational Models of Social Program Implementation.”
Public Policy 26: 185-228.
Elmore, R. F. (1982). Backward Mapping: Implementation Research and Policy
Decisions. Studying Implementation: Methodological and Administrative Issues.
W. Williams. Chatham, New Jersey, Chatham House Publishers, Inc.: 18-35.
Etzioni, A. (1989). “Humble Decision Making.” Harvard Business Review 67(4): 122-
127.
Ferris, J. and S.-Y. Tang (1993). “The New Institutionalism and Public Administration:
An Overview.” Journal of Public Administration Research & Theory 3: 4-10.
Fiorino, D. J. (2001). “Environmental Policy As Learning: A New View of an Old
Landscape.” Public Administration Review 61(3): 322-334.
Foster, S. T. J. and C. R. Franz (1999). "User involvement during information systems
development: A comparison of analyst and user perceptions of system acceptance."
Journal of Engineering and Technology Management 16(3-4): 329-348.
Fountain, J. E. (2001). Building the Virtual State. Washington, D.C., Brookings
Institution Press.
Franz, C. R. and D. Robey (1986). “Organizational Context, User Involvement, and the
Usefulness of Information Systems.” Decision Sciences 17(3): 329-357.
Fulk, J. and G. DeSanctis (1995). “Electronic communication and changing
organizational forms.” Organization Science 6(4): 337-350.
Garson, G. D., Ed. (2005). Handbook of Public Information Systems. Public
Administration and Public Policy. New York, Taylor & Francis Group.
Hackman, J. R. and R. Wageman (1995). “Total Quality Management: Empirical,
Conceptual, and Practical Issues.” Administrative Science Quarterly 40: 309-342.
Haque, A. (2001). “GIS, public service, and the issue of democratic governance.” Public
Administration Review 61(3): 259-265.
Haque, A. (2005). “Implementation In A Wired World: Obstacles To Facilitating
Information-Sharing Procedures In North-Central Alabama.” Public
Administration Quarterly 29(1/2): 117-125.
Heeks, R. (1999). Reinventing Government in the Information Age. Reinventing
Government in the Information Age: International Practice in IT-Enabled Public
Sector Reform. R. Heeks. New York, Routledge: 9-21.
Heeks, R. and A. Davies (1999). Different Approaches to Information Age Reform.
Reinventing Government in the Information Age: International Practice in IT-
Enabled Public Sector Reform. R. Heeks. New York, Routledge: 22-48.
Heintze, T. and S. Bretschneider (2000). “Information Technology and Restructuring in
Public Organizations: Does Adoption of Information Technology Affect
Organizational Structures, Communications, and Decision Making?" Journal of
Public Administration Research and Theory 10(4): 801-830.
Huber, G. P. (1991). “Organizational learning: The contributing processes and the
literatures.” Organization Science 2(1): 88-115.
Imai, M. (1986). KAIZEN: The Key to Japan's Competitive Success. New York, NY,
Random House Business Division.
Jin, K. G. and C. R. Franz (1986). “Obstacle Coping During Systems Implementation.”
Information & Management 11(2): 65-76.
Jones, L. R., K. Schedler, and R. Mussari, Eds. (2004). Strategies for Public Sector
Reform. Research in Public Policy Analysis and Management. Boston, Elsevier.
Kezar, A. (2001). Understanding and Facilitating Organizational Change in the 21st
Century: Recent Research and Conceptualizations. San Francisco, Jossey-Bass.
Kezar, A. (2005). Organizational Learning in Higher Education. San Francisco,
California, Wiley Periodicals, Inc.
Kingdon, J. (2003). Agendas, Alternatives, and Public Policies. New York, N. Y.,
Longman.
Krogh, G. v. (1998). “Care in Knowledge Creation.” California Management Review
40(3): 133-153.
Kudo, H. (2004). Reform of Public Management through ICT: Interface, Accountability
and Transparency. Strategies for Public Sector Reform. L. R. Jones, K. Schedler
and R. Mussari. Boston, Elsevier. 13: 153-174.
Kwon, T. H. and R. W. Zmud (1987). Unifying the Fragmented Models of Information
Systems Implementation. Critical Issues in Information Systems Research. R. J.
Boland and R. A. Hirschheim. New York, John Wiley and Sons: 394.
Langley, A., H. Mintzberg, E. Posada, and J. Saint-Macary. (1995). “Opening Up
Decision Making: The View from the Black Stool.” Organization Science 6(3):
260-279.
Larreamendy-Joerns, J. and G. Leinhardt (2006). “Going the Distance with Online Education.”
Review of Educational Research. 76(4): 567–605.
Lessig, L. (1999). Code and Other Laws of Cyberspace. New York, N.Y., Basic Books.
Levy, A. (1986). “Second-order planned change: Definition and conceptualization.”
Organizational Dynamics 15(1): 5-20.
Lindblom, C. E. (1973). The Science of Muddling Through. American Politics and Public
Policy. M. P. Smith. New York, Random House: 167-177.
Lipsky, Michael. (1980). Street-Level Bureaucracy. New York, N.Y., Russell Sage
Foundation.
Lundberg, C. C. (1983). “On the feasibility of cultural intervention in organizations.”
Organizational Culture. P. J. Frost, L. F. Moore, M. Reis Louis, C. C. Lundberg, and
J. Martin. Beverly Hills, CA: Sage Publishing: 169-185.
Mankin, D., T. K. Bikson, and B. Gutek (1984/1985). “Factors in Successful
Implementation of Computer-Based Office Information Systems: A Review of the
Literature with Suggestions for OBM Research.” Journal of Organizational
Behavior Management 6(3,4): 1-21.
Markus, M. L. (2004). “Technochange management: using IT to drive organizational
change.” Journal of Information Technology 19(1): 4-20.
Marshak, R. J. (1993). “Lewin Meets Confucius: A Re-View of the OD Model of
Change.” Journal of Applied Behavioral Science 29(4): 393-415.
Matland, Richard (1995). “Synthesizing the Implementation Literature: The Ambiguity-
Conflict Model of Policy Implementation.” Journal of Public Administration
Research and Theory 5 (2): 145-174.
Maynard-Moody, S., M. Musheno, and D Palumbo (1990). “Street-Wise Social Policy:
Resolving the Dilemma of Street-Level Influence and Successful Implementation.”
The Western Political Quarterly 43(4): 833-848.
Mazmanian, D. A. and P. A. Sabatier (1983). Implementation and Public Policy.
Glenview, IL, Scott, Foresman.
Mazmanian, D. A. and P. A. Sabatier (1989). Implementation and Public Policy. Lanham,
MD, University Press of America.
McCubbins, M. D., R. G. Noll, and B. R. Weingast (1987). “Administrative Procedures
as Instruments of Political Control.” Journal of Law, Economics, and
Organization 3(2): 243-277.
Mead, L. M. (2001). “Welfare Reform in Wisconsin: The Local Role.” Administration &
Society 33(5): 523-554.
Miles, M. B. and A. M. Huberman (1994). Qualitative Data Analysis: An Expanded
Sourcebook. Thousand Oaks, CA, SAGE Publications, Inc.
Miller, G. (2000). “Above Politics: Credible Commitment and Efficiency in the Design
of Public Agencies.” Journal of Public Administration Research and Theory
10(2): 289-327.
Moon, M. J. (2000). “Organizational Commitment Revisited in New Public
Management—Motivation, Organizational Culture, Sector, and Managerial
Level.” Public Performance & Managerial Review 24(2): 177-194.
Morgan, G. (1997). Images of Organizations. Thousand Oaks, CA, SAGE Publications,
Inc.
Nonaka, I. (1994). “A Dynamic Theory of Organizational Knowledge Creation.”
Organization Science 5(1): 14-37.
Nonaka, I. and H. Takeuchi (1995). The Knowledge-Creating Company: How Japanese
Companies Create the Dynamics of Innovation. New York, Oxford University
Press.
Nyhan, R. C. (1999). “Increasing Affective Organizational Commitment in Public
Organizations.” Review of Public Personnel Administration XIX(3): 58-70.
Orlikowski, W. J. (2000). “Using technology and constituting structures: A practice lens
for studying technology in organizations.” Organization Science 11(4): 404-429.
Ortenblad, A. (2001). “On Differences between Organizational Learning and Learning
Organization.” The Learning Organization 8(3/4): 125.
Ortenblad, A. (2002). “A Typology of the Idea of Learning Organization.” Management
Learning 33(2): 213.
Osborne, D. and T. Gaebler (1993). Reinventing Government: How the Entrepreneurial
Spirit Is Transforming the Public Sector. New York, Addison Wesley Publishing
Company.
Parsons, T. and E. A. Shils (1959). Toward a General Theory of Action. Cambridge,
Massachusetts, Harvard University Press.
Parsons, T., E. A. Shils, and R. F. Bales (1951). A General Theory of Action. New York,
Free Press.
Perry, J. L. and L. W. Porter (1982). “Factors Affecting the Context for Motivation in
Public Organizations.” Academy of Management Review 7: 89-98.
Petroni, G. and F. Cloete, Eds. (2005). New Technologies in Public Administration.
International Institute of Administrative Sciences Monographs. Washington, DC,
IOS Press.
Pettit, P. (1996). Institutional Design and Rational Choice. The Theory of Institutional
Design. R. E. Goodin. New York, Cambridge University Press: 54-89.
Pickering, J. M. and J. L. King (1995). “Hardwiring weak ties: Interorganizational
computer-mediated communication, occupational communities, and
organizational change.” Organization Science 6(4): 479-487.
Porras, J. I. and P. J. Robertson (1987). Organization Development Theory: A Typology
and Evaluation. Research in Organizational Change and Development. W. A.
Pasmore and R. W. Woodman. Greenwich, CT, JAI Press. 1: 1-57.
Pressman, J. L. and A. Wildavsky (1973). Implementation. Berkeley, University of
California Press.
Pressman, J. L. and A. Wildavsky (1984). Implementation. Berkeley, University of
California Press.
Robey, D., M. C. Boudreau, and G. M. Rose. (2000). “Information technology and
organizational learning: a review and assessment of research.” Accounting,
Management and Information Technologies 10(2): 125-155.
Rockwell, S. K., J. Schauer, S. M. Fritz, and D. B. Marx (1999). “Incentives and
Obstacles Influencing Higher Education Faculty and Administrators to Teach Via
Distance.” Online Journal of Distance Learning Administration. 2(4).
Ruiz-Mercader, J., A. L. Merono-Cerdan, and R. Sabater-Sanchez (2006). “Information
Technology and Learning: Their Relationship and Impact on Organisational
Performance in Small Businesses.” International Journal of Information
Management 26(1): 16-29.
Schedler, K. and B. Schmidt (2004). Managing the E-Government Organization.
Strategies for Public Sector Reform. L. R. Jones, K. Schedler and R. Mussari.
Boston, Elsevier. 13: 133-152.
Schwandt, D. R. and M. J. Marquardt (2000). Organizational Learning: From World-Class
Theories to Global Best Practices. Boca Raton, Florida, St. Lucie Press.
Senge, P. M. (1990). “The Leader’s New Work: Building Learning Organizations.” Sloan
Management Review 32(1): 7-23.
Senge, P. M. (1994). The Fifth Discipline: The Art and Practice of the Learning
Organization. New York, Currency Doubleday.
Sharkansky, I. (1996). What a Political Scientist Can Tell a Policymaker about the
Likelihood of Success or Failure. Classics of Public Administration. J. M. Shafritz
and A. C. Hyde. New York, New York, Harcourt Brace College Publishers: 514-522.
Snellen, I. (2005). Technology and Public Administration: Conditions for Successful E-
Government Development. New Technologies in Public Administration. G.
Petroni and F. Cloete. Washington, DC, IOS Press. 28: 5-19.
Stevens, J. B. (1993). The Economics of Collective Choice. Boulder, CO, Westview
Press.
Tallent-Runnels, M. K., T. C. Ahern, S. M. Shaw, and X. Liu (2006). “Teaching Courses
Online: A Review of the Research.” Review of Educational Research. 76(1): 93-
135.
Thompson, J. R. (1999). “Devising Administrative Reform that Works: The Example of
the Reinvention Lab Program.” Public Administration Review 59(4): 283-292.
Thompson, J. R. (2000). “Reinvention as reform: Assessing the National Performance
Review.” Public Administration Review 60(6): 508-522.
True, J. L., B. D. Jones, and F. R. Baumgartner (1999). Punctuated Equilibrium Theory:
Explaining Stability and Change in American Policymaking. Theories of the
Policy Process. P. A. Sabatier. Boulder, CO, Westview Press: 97-116.
Tyre, M. J. and W. J. Orlikowski (1993). "Exploiting Opportunities for Technological
Improvement in Organizations." Sloan Management Review 35(1): 13-27.
Tyre, M. J. and W. J. Orlikowski (1996). “The Episodic Process of Learning by Using.”
International Journal of Technology Management 11(7-8): 790-799.
Van de Ven, A. H. and M. S. Poole (1995). “Explaining Development and Change in
Organizations.” Academy of Management Review 20(3): 510-540.
Van Maanen, J. and E. H. Schein (1979). Toward a Theory of Organizational
Socialization. Research in Organizational Behavior. B. M. Staw. Greenwich, CT,
JAI Press Inc. 1.
Van Meter, D. and C. E. Van Horn (1976). “The Policy Implementation Process: A
Conceptual Framework.” Administration & Society 6(4): 445-488.
Vann, J. L. (2004). “Resistance to Change and the Language of Public Organizations: A
Look at ‘Clashing Grammars’ in Large-Scale Information Technology Projects.”
Public Organization Review 4(1): 47-73.
Venkatesh, V. and F. D. Davis (2000). “A Theoretical Extension of the Technology
Acceptance Model: Four Longitudinal Field Studies.” Management Science
46(2): 186-204.
von Hippel, E. and M. J. Tyre (1995). “How Learning is Done: Problem Identification in
Novel Process Equipment.” Research Policy 24(1): 1-12.
Walsh, J. P. and G. R. Ungson (1991). “Organizational Memory.” The Academy of
Management Review 16(1): 57-91.
Williams, W. (1982). The Study of Implementation: An Overview. Studying
Implementation: Methodological and Administrative Issues. W. Williams. Chatham,
New Jersey, Chatham House Publishers, Inc.: 1-17.
Wilson, J. Q. (1974). The Politics of Regulation. Social Responsibility and the Business
Predicament. J. McKie. Washington DC, The Brookings Institution.
Wilson, J. Q. (1995). Organizational Maintenance and Incentives; Political Structure and
Organizations. Political Organizations. Princeton, Princeton University Press.
Wilson, W. (1887). "The Study of Administration." Political Science Quarterly 2: 197-
222.
Wittmer, D. (1991). “Serving the People or Serving for Pay: Reward Preferences Among
Government, Hybrid Sector & Business Managers.” Public Productivity &
Management Review XIV(4): 369-383.
Wolfe, L. (1999). Transforming Accountability for Government Information Technology
Projects: The Impact of New US Legislation. Reinventing Government in the
Information Age: International Practice in IT-Enabled Public Sector Reform. R.
Heeks. New York, Routledge: 230-252.
Yanow, D. (1996). “American Ethnogenesis and Public Administration.” Administration &
Society 27(4): 483-510.
Yin, R. K. (1982). Studying the Implementation of Public Programs. Studying
Implementation: Methodological and Administrative Issues. W. Williams. Chatham,
New Jersey, Chatham House Publishers, Inc.: 36-72.
Yin, R. K. (2003). Case Study Research: Design and Methods. Thousand Oaks, CA, Sage
Publications, Inc.
Appendix A
Homeland Security and SEVIS Implementation:
A Timeline of Relevant Events and Legislation
February 26, 1993 – Bomb explodes in the underground parking structure below Tower
One of the World Trade Center, killing 6 people. Among those found guilty of
carrying out the bombing was an international student.
http://en.wikipedia.org/wiki/World_Trade_Center_bombing. Retrieved on 3/28/04.
www.ice.gov/graphics/enforce/imm/sevis/sevp121801.pdf. Retrieved on 3/28/04.
June 1995 – Immigration and Naturalization Service (INS) established a task force to
conduct a comprehensive review and analysis of the current process for the
collection of information regarding foreign students and exchange visitors in the
United States.
1996 – Illegal Immigration Reform and Immigrant Responsibility Act (IIRIRA)
enacted, instituting changes similar to those recommended by the task force. INS is required to
collect information on an ongoing basis from schools and exchange programs
relating to nonimmigrant foreign students and exchange visitors during the course
of their stay in the United States, using electronic reporting technology to the
fullest extent practicable (Section 641). Statute
June 1997 – October 1999 – INS, in partnership with the Department of State, the
Department of Education, and experts from INS-authorized schools and exchange
visitor programs, conducted a pilot program, CIPRIS, to test the concepts
involved in an electronic reporting mechanism. Pilot involved the Atlanta
Hartsfield Airport and District Office, the Texas Service Center, and 21
institutions of higher learning in the states of Georgia, Alabama, North Carolina,
and South Carolina.
December 1999 – Recommendation for INS to collect a $95 fee to process student visas.
The fee is currently under review. Rule
2000 – Development begins for the Student and Exchange Visitor Program (SEVP), the
reengineered nonimmigrant student and exchange visitor process (covering the F,
J, and M nonimmigrant visa categories).
www.ice.gov/graphics/enforce/imm/sevis/sevp121801.pdf. Retrieved on 3/28/04.
September 11, 2001 – Terrorist attacks on the World Trade Center and Pentagon. All 19
hijackers had visas issued by the State Department and entered the country
legally. www.homelandsecurity.org/journal/Articles/Strickland.html. Retrieved on 3/28/04.
“6 months after 11 September, [the INS] approved Mohammed Atta’s and Marwan al-Shehhi’s request for
new student visas despite their deaths in suicide terrorist attacks.” Cited in “Hijackers Visa Fiasco
Points Up INS Woes,” Washington Post, 17 March 2002.
October 24, 2001 – USA Patriot Act: amends the IIRIRA to expand the scope of
educational institutions included in the original legislation (adds “other
educational institutions” to the higher education language in the original
legislature, i.e., air flight school, language training school, or vocational school);
mandates full implementation of reporting system prior to January 1, 2003,
appropriating $36,800,000 to the Department of Justice for such purposes. Statute
December 2001 – Student and Exchange Visitor Information System (SEVIS) developed
following the lessons learned from the CIPRIS pilot. Initial deployment at 11
institutions in the Boston area.
May 14, 2002 – Enhanced Border Security and Visa Entry Reform Act. H.R. 3525
became Public Law No. 107-173. It called for the implementation of interim
student monitoring measures (in Title 5 of the law) within 120 days. Amends
IIRIRA. Statute
May 16, 2002 – Rule proposed in accordance with Enhanced Border Security and Visa
Entry Reform Act amending INS regulation governing the reporting of
international students (F, J, and M nonimmigrants) and outlining the process to
implement SEVIS. Rule
2002 – Department of Defense appropriated $36,800,000 to fund the development and
start-up operations of SEVIS. Statute
July 1, 2002 – INS allows institutions meeting specific criteria to apply for voluntary
“preliminary enrollment” to begin using the SEVIS Real-Time Interactive (RTI)
option. Regulation
September 25, 2002 – Voluntary “preliminary enrollment period” ends. INS specifies
how institutions that had not participated in the preliminary enrollment can apply
for certification to enroll in SEVIS. Regulation
November 2002 – Homeland Security Act. Largest reorganization of the federal
government in more than half a century.
January 1, 2003 – Initial deadline for SEVIS to be fully functional and to require full
compliance.
February 2003 – Revised deadline for full implementation of SEVIS. Institutions not
enrolled are unable to have visas issued for admitted students.
March 1, 2003 – INS officially split. Immigration responsibilities transferred to U.S.
Citizenship and Immigration Services (USCIS), enforcement related
responsibilities transferred to the Bureau of Immigration and Customs
Enforcement (ICE), and border control to the Bureau of Customs and Border
258
Protection (CBP). All are bureaus within the Department of Homeland Security.
SEVIS is placed under ICE.
www.dhs.gov/dhspublic/theme_home4.jsp. Retrieved on 3/28/04.
August 1, 2003 – Deadline for all new and continuing students and exchange visitors to
be entered in SEVIS, per Congressional mandate.
www.ice.gov/graphics/enforce/imm/imm_sevis.htm. Retrieved on 3/28/04.
259
Appendix B
Semi-Structured Interview Protocol
Briefly introduce the purpose of the study and the interview process:
“Thank you again for your time and your willingness to share your expertise and insights
with me this [afternoon]. I am studying the implementation process as it relates to
this technology policy [SEVIS/Online Education] with the hope of understanding
ways that can improve the quality and effectiveness of implementation. My hope
through our conversation is to gain some insights from your firsthand experience in
its implementation. I have some questions that I may use to structure our talk, but
mostly I am interested in your perspective on the process of implementing this policy
in your organization. Please note that I plan with your permission to record our
discussion to assist me in analyzing my research, but that all we discuss here will be
held in the strictest of confidence. Is that a concern? This should only take about a
half hour.”
Begin recording.
Please start by stating your role in the implementation process and how
this policy has been implemented in the department.
Prompt Questions
What challenges have you/others/the department as a whole faced through
the implementation of this policy and have they been overcome? How?
Who and what resources have been available to assist in implementation?
How are decisions about this policy made?
What information/communication takes place with policymakers?
What are the objectives of this policy? Have they or are they being met?
How do you know that?
How have organizational members and management reacted to setbacks or
challenges in the implementation process?
What were some of the situations where either novel approaches were
introduced, even outside of the implementation of PDT? How did
members of the organization respond?
Closing Questions:
Are there any key documents that might be helpful in this study?
Are there other people with whom you recommend I speak?
Are there any places that it would be helpful that I observe?
Would you be available to speak with me should I have any follow up
questions?
260
Appendix C
Initial Listing of Codes
SHORT DESCRIPTION CODE
Internal Communication
IC: With Peer IC-PEER
IC: With Management IC-MGT
IC: With IT Department IC-IT
IC: With Faculty Development Center IC-FDC
IC: With Extended Education IC-EE
IC: With Language Program IC-LP
IC: With Visiting Faculty Program IC-FP
IC: Conflict IC-CON
IC: Providing Information/Resources IC-INF-PRO
IC: Receiving Information/Resources IC-INF-REC
IC: Policy Update IC-POL-UP
IC: Policy Interpretation IC-POL-INT
IC: Sharing External Information IC-EXT
IC: New Process IC-PROC
IC: Problem with Technology IC-TECHPROB
IC: Addressing Technology Problem IC-TECHSOLU
IC: Sharing Evaluation Results IC-EV-RES
IC: Sharing Goal Information IC-GOALS
Network with other Implementing Organizations
NT: Network with Int’l Education Office NT-IEO
NT: Network with Faculty NT-FAC
NT: Network with Professional Assoc. NT-PRO
NT: Network with Other NT-OTH
NT: Regarding Policy NT-POL
NT: Dealing with Technology NT-TECH
NT: Problem with Technology NT-TECHPROB
NT: Addressing Technology Problem NT-TECHSOLU
Policy and Goal Congruency
PG: Goal Setting PG-GSET
PG: Evaluation of Goals PG-GEVAL
PG: Operationalizing Policy PG-OPER
Clarity of Roles and Responsibilities
RR: Uncertainty RR-UNCER
RR: Change in RR-
RR: Conflicts with Understanding of Others RR-CONFL
RR: Process of How Determined RR-PROC
Evaluation and Routinization
EV: Process of EV-PROC
EV: Sharing Results EV-RES
EV: Changes in Technology Use Resulting from EV-TECH
EV: Changes in Policy Interpretation Resulting from EV-POL
EV: Sharing Results Internally EV-IC
EV: Sharing Results Externally EV-EXCOM
RZ: Developing RZ-DEV
RZ: Change in RZ-
Communication with Policymakers
CP: Direct with Policymakers CP-DIR
CP: With Agents of Policymakers CP-APM
CP: Seeking Policy Clarity CP-POL-SKG
CP: Receiving Policy Clarity CP-POL-REC
CP: Sharing Problems with Technology CP-TECHPROB
CP: Sharing Evaluation Results CP-EV-RES
CP: Communication Problem with Policymakers CP-PMPROB
CP: Feedback on Goals CP-GOALS
Culture of Experimentation
EX: Performance Improvement EX-PERF
EX: Improved Learning of Technology EX-LRNG
EX: Performance Loss EX-PERF
EX: Inhibited Learning of Technology EX-LRNG
EX: Management Initiated EX-MGT
EX: Management Response EX-RESP-MGT
EX: Policymaker(s) Response EX-RESP-PM
EX: Non-Technology Related EX-NONTECH
Achieving Implementation Objectives
AIO: Policy Driven AIO-POLDR
AIO: User Driven AIO-USEDR
AIO: Policy and User Driven AIO-POLUSEDR
AIO: Goal(s) Achieved AIO-GOALS
Miscellaneous
IP: Implementation Process IP
DEM: Demographics DEM
LRNG: Learning LRNG
Appendix D
Final Coding Protocol with Reference Count
SHORT DESCRIPTION CODE COUNT
Network with other Implementing Organizations 59
NT: Network with Departments NT-DEPT 21
NT: Network with Professional Assoc. NT-PRO 30
NT: Other NT-OTH 8
Communication with Policymakers 47
CP: Direct with Policymakers CP-DIR 5
CP: With Agents of Policymakers CP-APM 32
CP: Other CP-OTH 10
Policy and Goal Congruency 119
PG: Operationalizing Policy PG-OPER 112
PG: Other PG-OTH 7
Clarity of Roles and Responsibilities 55
RR: Certainty RR-CERT 19
RR: Uncertainty RR-UNCER 2
RR: Change in RR-CHNG 6
RR: Conflicts with Understanding of Others RR-CONFL 1
RR: Process of How Determined RR-PROC 7
RR: Other RR-OTH 20
Culture of Experimentation 47
EX: Performance Improvement EX-PERF-I 24
EX: Improved Learning of Technology EX-LRNG-I 1
EX: Management Initiated EX-MGT 1
EX: Other EX-OTH 21
Internal Communication 443
IC: With Peer IC-PEER 104
IC: With Management IC-MGT 91
IC: With IT Department IC-IT 23
IC: With Faculty Development Center IC-FDC 45
IC: With Extended Education (ALP-IEE) IC-EE 29
IC: With Extended Education (DR) IC-EXED 35
IC: With Faculty (other departments) IC-FAC 13
IC: Conflict IC-CON 2
IC: Policy Update IC-POL-UP 2
IC: New Process IC-PROC 6
IC: Problem with Technology IC-TECHPROB 1
IC: Decision Making IC-DEC 33
IC: Other IC-OTH 59
Evaluation and Routinization 80
EV: Process of EV-PROC 9
EV: Other EV-OTH 27
RZ: Developing RZ-DEV 30
RZ: Other RZ-OTH 14
Achieving Implementation Objectives 311
AIO: Goal(s) Achieved AIO-GOALS 75
AIO: Objectives AIO-OBJ 97
AIO: Other AIO-OTH 139
Miscellaneous 193
MC: Implementation Process MC-IP 42
MC: Learning MC-LRNG 29
MC: Technology MC-TECH 56
MC: Problems MC-PROB 61
MC: Solutions MC-SOLU 1
MC: Other MC-OTH 4
Abstract
This study integrates organizational learning and change theory into a learning model of policy implementation. It is an exploratory study of technology implementation with descriptive and normative objectives, using organizational learning and change theory to identify organizational conditions that promote learning and contribute to achieving implementation objectives. Learning in policy implementation is studied in the context of two technology-related policies set in higher education, one serving an administrative function and the other an instructional function. Schwandt and Marquardt's (2000) Organizational Learning System Model (OLSM) is used to evaluate the presence and development of organizational learning in these policy contexts and serves as the basis for an enriched model of policy implementation.
Asset Metadata
Creator
Alcántara, Ryan Edward (author)
Core Title
A learning model of policy implementation: implementing technology in response to policy requirements
School
School of Policy, Planning, and Development
Degree
Doctor of Philosophy
Degree Program
Public Administration
Publication Date
04/28/2009
Defense Date
02/23/2009
Publisher
University of Southern California (original), University of Southern California. Libraries (digital)
Tag
Higher education,implementation studies,OAI-PMH Harvest,organizational learning,policy implementation,Public Policy,technology policy
Language
English
Contributor
Electronically uploaded by the author (provenance)
Advisor
Musso, Juliet A. (committee chair), Kezar, Adrianna (committee member), Robertson, Peter John (committee member)
Creator Email
ralcantara@fullerton.edu,realcant@usc.edu
Permanent Link (DOI)
https://doi.org/10.25549/usctheses-m2149
Unique identifier
UC1206621
Identifier
etd-Alcantara-2683 (filename),usctheses-m40 (legacy collection record id),usctheses-c127-230323 (legacy record id),usctheses-m2149 (legacy record id)
Legacy Identifier
etd-Alcantara-2683.pdf
Dmrecord
230323
Document Type
Dissertation
Rights
Alcántara, Ryan Edward
Type
texts
Source
University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection)
Repository Name
Libraries, University of Southern California
Repository Location
Los Angeles, California
Repository Email
cisadmin@lib.usc.edu