Multilevel Influences on Organizational Adoption of Innovative Practices in
Children’s Mental Health Services
by
Karissa M. Fenwick, MSW, LCSW
Dissertation
Doctor of Philosophy (Social Work)
Degree Conferral Date August 2019
Faculty of the USC Graduate School
University of Southern California
Dissertation Guidance Committee
Michael Hurlburt, Ph.D. (Co-chair)
Lawrence Palinkas, Ph.D. (Co-chair)
Peer Fiss, Ph.D.
Table of Contents

Acknowledgements
List of Tables
Abstract
Chapter One: Overview of the Three Studies
    Introduction and Background
    The Three Studies
    References
Chapter Two (Study One): Acquisition and Assimilation of External Sources of Information about Innovative Practices in Children’s Mental Health Organizations
    Abstract
    Introduction
    Methods
    Results
    Discussion
    References
Chapter Three (Study Two): Configurations of Clinic Characteristics and Leader-Reported Drivers Associated with Innovation Adoption in Children’s Mental Health Services: A Qualitative Comparative Analysis
    Abstract
    Introduction
    Methods
    Results
    Discussion
    References
Chapter Four (Study Three): State Executive and Clinic Leader Perspectives on Implementation Success and Failure in Children’s Mental Health Services
    Abstract
    Introduction
    Methods
    Results
    Discussion
    References
Chapter Five: Conclusions, Implications, and Future Directions
    Introduction
    Major Findings
    Practical Implications
    Theoretical Implications
    Methodological Implications
    Limitations and Future Directions
    References
Appendix A: List of CTAC Trainings from September 2011 to August 2013
Appendix B: Outcomes and Conditions by Case
Appendix C: Truth Table for Clinical Innovation Adoption
Appendix D: Truth Table for Business Innovation Adoption
Acknowledgements
Thank you to the many people who supported me throughout this dissertation.
To my dissertation committee members for their invaluable guidance:
Dr. Michael Hurlburt
Dr. Lawrence Palinkas
Dr. Peer Fiss
To the New York University faculty who generously shared their data and expertise:
Dr. Kimberly Hoagwood
Dr. Sarah Horwitz
To the Suzanne Dworak-Peck School of Social Work, especially:
Dr. John Brekke
Dr. Julie Cederbaum
Dr. Benjamin Henwood
Dr. Mónica Pérez-Jolles
To my colleagues and friends in the Ph.D. program, especially Rebecca Lengnick-Hall
To my family, especially my parents, John and Carmen Fenwick
To my fiancé, Chris Ortiz
And finally, to the mental health service providers and leaders who gave their time
to make this research possible.
List of Tables
Table 1.1 Overview of the Three Studies
Table 2.1 Sample Clinic Characteristics
Table 2.2 Percent of Clinic Leaders Endorsing External Sources of Information about
Innovations
Table 3.1 Descriptive Statistics for Clinic-level Conditions and Outcomes from Administrative
Data
Table 3.2 Percent of Clinics Endorsing Qualitative Drivers of Innovation Adoption
Table 3.3 Solution for Adoption of Clinical Innovations
Table 3.4 Solution for Adoption of Business Innovations
Table 4.1 Percent of Clinic Leaders and State Executives Endorsing Study Themes
Abstract
Evidence-based clinical and business innovations are underutilized in the U.S. children’s
mental health system. The delay in moving innovations from research to practice has especially
serious implications in public sector clinics, where clients have less access to services and
greater risk for adverse outcomes if they do not receive high-quality care. To mitigate this delay,
many states have encouraged the use of innovations through initiatives, legislation, and/or
mandates. However, evaluations of these efforts show mixed results, indicating the need for
further research on how and why clinics adopt and implement innovations.
This three-study dissertation investigates three aspects of innovation adoption and
implementation in the context of a large-scale innovation rollout in New York State. Clinic
leader perspectives were explored using qualitative interviews conducted in a 10% stratified
sample of all clinics in the state children’s public mental health system. Training participation
was used as a proxy for clinic adoption, and administrative data related to clinic characteristics
and interviews with policy makers were used to incorporate factors in the organizational and
external contexts.
Chapter One synthesizes the literature on adoption and implementation of innovations in
children’s mental health services, summarizes recent system-level efforts to roll out innovations,
and presents an overview of the three studies. Chapter Two (Study One) investigates the external
sources that clinics rely on for information about innovations, and how differences in
organizational absorptive capacity are associated with clinic level of adoption. Chapter Three
(Study Two) uses Qualitative Comparative Analysis to examine combinations of clinic leader-
reported drivers of adoption decisions and clinic characteristics associated with adoption and
non-adoption of clinical and business innovations. Chapter Four (Study Three) qualitatively
explores clinic leader and state executive attitudes, beliefs, and perspectives around defining
implementation success and deciding to stop implementing a failed innovation. Finally, Chapter
Five presents practical implications of study findings for mental health system administrators and
clinic leaders, as well as theoretical and methodological implications for implementation science.
Chapter One: Overview of the Three Studies
Introduction and Background
Innovations in Children’s Mental Health Services
One out of every five U.S. children and adolescents will experience at least one mental,
emotional, or behavioral health disorder in a given year (O’Connell, Boat, & Warner, 2009).
These disorders can result in greater risk of medical comorbidities, criminal justice system
involvement, suicide completion, and problems with family and peers (Copeland et al., 2007;
Gould, Greenberg, Velting, & Shaffer, 2003; Johnston & Mash, 2001; Perou et al., 2013; Roy-
Byrne et al., 2008). In addition, they are associated with higher risk of experiencing mental
disorders in adulthood, leading to further costs to individuals, communities, and society (Reeves
et al., 2011; Smit et al., 2006). It is therefore critical that mental health systems and organizations
provide access to high quality, evidence-based mental health services for children and their
families.
In the past 30 years, researchers have identified a range of effective clinical interventions
for child and adolescent mental, emotional, and behavioral disorders (Chorpita et al., 2011;
Silverman & Hinshaw, 2008; Kazdin & Weisz, 2003). These include cognitive-behavioral
therapies, parent management therapies, and medication management (Chan, Fogler, &
Hammerness, 2016; Kendall, Hudson, Gosch, Flannery-Schroeder, & Suveg, 2008; Sofronoff &
Farbotko, 2002). Researchers have also developed effective business practices and quality
improvement strategies to support and optimize mental health service delivery. Examples include
standardized assessments, electronic health records, and measurement feedback systems
(Bickman, Lyon, & Wolpert, 2016; Gleacher et al., 2016; Hoagwood et al., 2014). However,
there is a research-to-practice gap of a decade or longer before innovations (practices or sets of behaviors and routines that are new to an organization) are adopted into community mental health organizations, where they can in turn influence patient quality of life (Balas & Boren, 2000; Boren & Balas, 1999; Green, 2008). Low rates of innovation adoption and poor or partial implementation confirm that simply developing effective innovations is not enough to change practice (Balas & Boren, 2000; Green, 2008).
Adoption and Implementation of Innovations
Implementation frameworks such as the EPIS (Exploration, Adoption/Preparation,
Implementation, and Sustainment), PRISM (Practical, Robust Implementation and Sustainability Model), and CFIR (Consolidated Framework for Implementation Research) catalog the
predictors that influence success of implementation efforts (Aarons, Hurlburt, & Horwitz, 2011;
Damschroder et al., 2009; Feldstein & Glasgow, 2008). They illustrate that implementation
requires simultaneous coordination of multiple factors at different organizational levels, and is
often characterized by setbacks and unanticipated challenges (Fixsen, Blase, Naoom, & Wallace,
2009; Greenhalgh, Robert, Macfarlane, Bate, & Kyriakidou, 2004; Proctor et al., 2009). Barriers
to implementation may include inadequate financial or other resources, lack of access to training,
unsupportive leadership or organizational culture, competing priorities, and resistance to change
(Proctor et al., 2009; Rapp et al., 2010).
As the first step in the implementation process, adoption (the decision to proceed with full or partial implementation of an innovation) is a critical precursor to successful implementation (Fixsen et al., 2005; Panzano & Roth, 2006; Proctor et al., 2011).
Mirroring the implementation process as a whole, adoption is complex, dynamic, multilevel, and
multiphasic (Wisdom et al., 2014). Ideally, in the pre-adoption phase, organizational staff gain
awareness of an innovation and access relevant information with which to make an adoption
decision (Mendel, Meredith, Schoenbaum, Sherbourne, & Wells, 2008; Wisdom et al., 2014). In
the established adoption phase, organizational leaders weigh the risks and benefits of adoption
and decide whether to commit to the innovation (Damanpour & Schneider, 2006; Panzano &
Roth, 2006). Considerations include the match between the innovation and the client population
needs, available resources, and the innovation’s key elements and activities (Bertram, Blase, &
Fixsen, 2015).
Wisdom and colleagues (2014) integrated and synthesized constructs from 20 frameworks and generated a middle-range theory of adoption. They identified 27 predictors of adoption operating across five levels: 1) external system (e.g. policies, financial incentives); 2) organizational (e.g. leadership, resources); 3) innovation (e.g. innovation-values fit); 4) individual provider (e.g. level of education, job tenure); and 5) client (e.g. readiness for change). This framework aligns with
the constructs identified within well-established implementation frameworks (e.g. EPIS, PRISM,
CFIR) while diving deeper into the factors most salient to adoption.
System Innovation Rollouts
To increase adoption of evidence-based innovations in the public sector children’s mental
health system, many states and counties have encouraged the dissemination of innovations
through initiatives, legislation, and/or mandates (Bruns & Hoagwood 2008; Dorsey, Berliner,
Lyon, Pullmann, & Murray, 2016; Hoagwood et al., 2014; Lau & Brookman-Frazee, 2016;
McHugh & Barlow, 2010). Some focus on rolling out specific interventions (Chaffin, Hecht,
Bard, Silovsky, & Beasley, 2012; Dorsey et al., 2016), while others focus on more generalized
quality improvement (Saldana & Chamberlain, 2012). Rollout approaches vary, but all involve
substantial investments in terms of funding, time, and expertise (Chamberlain et al., 2012;
McHugh & Barlow, 2010). For example, in 2006, the California State Department of Mental
Health formed Community Development Teams to assist counties in developing peer networks
focused on enhancing implementation (Bruns & Hoagwood, 2008; Saldana & Chamberlain,
2012). Los Angeles County Department of Mental Health, the nation’s largest county mental
health department, launched a Prevention and Early Intervention initiative in 2010 that offered
providers free training in six evidence-based practices and reimbursement for using 32
innovative clinical practices (Brookman-Frazee et al., 2016; Park, Tsai, Guan, & Chorpita, 2018;
Regan et al., 2017). The New York State Office of Mental Health formed the Evidence-Based
Treatment Dissemination Center in 2006 through a state-academic partnership, and later founded
the Community Technical Assistance Center (CTAC) to provide training, consultation, and
education in clinical and business skills (Bruns & Hoagwood, 2008; Hoagwood et al., 2014;
McHugh & Barlow, 2010). Other states, including Hawaii, Michigan, Ohio, and Washington
have also led large-scale efforts to increase the use of innovations within their children’s mental
health systems during the past two decades (Carstens, Panzano, Massatti, Roth, & Sweeney,
2009; Hodges & Wotring, 2004; Massatti, Sweeney, Panzano, & Roth, 2008; Nakamura et al.,
2011; Trupin & Kerns, 2017).
Evaluations of these and other rollouts show mixed results, with many targeted
organizations or service areas adopting innovations but others choosing not to adopt or sustain
them (Brookman-Frazee et al., 2016; Chor et al., 2014; Massatti et al., 2008; Saldana &
Chamberlain, 2012). However, despite the resources involved, some evaluations of these rollouts
do not go beyond basic outcomes data. Others are confined to reports to states or other funders
(Dorsey et al., 2016). As a result, even though over 20 states have actively implemented
evidence-based practices, in-depth studies of these rollouts are limited, restricting opportunities
for dialogue that can inform the quality and reach of future efforts (Beidas & Kendall, 2014;
Dorsey et al., 2016).
Studies examining the perspectives of service providers are needed to identify factors
associated with how and why clinics initially decide to adopt or not to adopt innovations (Bruns
et al., 2008). Clinic leaders are often at the center of adoption and implementation, influencing
use of research evidence, identification of problems or needs that may be targeted with
innovations, decisions to send staff for training, and creation of organizational cultures and
climates that support implementation (Aarons, Ehrhart, Moullin, Torres, & Green, 2017; Proctor,
2004; Proctor et al., 2007; Simpson, 2002). Interviews with clinic leaders can shed light on the
drivers and processes related to innovation adoption decisions and offer insight into strategies
that states may use to create buy-in to future rollouts (Palinkas et al., 2017; Rodriguez, Lau,
Wright, Regan, & Brookman-Frazee, 2018). This dissertation explores clinic leader perspectives
in combination with data on clinic training participation, clinic administrative characteristics, and
state executive perspectives in the context of a large-scale innovation rollout.
The Three Studies
Dissertation Context
This dissertation is set in New York, one of the leading states in efforts to disseminate
and implement evidence-based practices within the children’s public mental health system
(Gleacher et al., 2011; Olin et al., 2014). The New York State Office of Mental Health founded
the Children’s Technical Assistance Center (now known as the Community Technical Assistance
Center) in 2011 as part of a large-scale effort to increase use of clinical and business innovations
in its child and adolescent mental health system. The CTAC offers training in three modalities (webinars, in-person seminars, and learning collaboratives) that represent escalating levels of intensity and clinic commitment. Hour-long webinars are the least intensive modality; all-day in-person seminars are of mid-level intensity; and 6- to 18-month learning collaboratives are the most intensive. Each training topic is delivered via a single modality. Technical assistance is provided on a voluntary, no-charge basis to all outpatient mental health clinics licensed by New York State (www.ctacny.org).
At the time of data collection in the present study, the CTAC offered 33 trainings, listed
in Appendix A (Chor et al., 2014). Eighteen trainings targeted evidence-based clinical practices,
including fourteen webinars, one in-person session, and three learning collaboratives. Twelve
trainings targeted improvement of business and organizational practices, including nine
webinars, one in-person session, and two learning collaboratives. Finally, three hybrid trainings
targeted improvement of both clinical and business practices, including two webinars and one in-
person session (Chor et al., 2014). Although the CTAC’s offerings have since evolved, at the
time of the rollout its clinical trainings included topics such as cognitive-behavior therapy,
strengthening families, and motivational interviewing. Business trainings included topics such as
collaborative documentation, quality assurance and risk management, and financial self-
assessment (Chor et al., 2014).
Chor and colleagues (2014) examined the naturalistic participation in CTAC trainings of
all 346 clinics licensed to treat children, adolescents, and their families in New York State
between 2011 and 2013. Twenty-three percent of clinics were classified as non-adopters (did not
access any trainings); 41% were classified as low adopters (accessed webinars only); 25% were
classified as medium adopters (accessed at least one in-person training but no learning
collaboratives); 22% were classified as high adopters (accessed one learning collaborative); and
12% were classified as super-high adopters (accessed at least two learning collaboratives).
Eighty-two percent of adopting clinics participated in business trainings, 78% accessed clinical
trainings, and 45% participated in hybrid trainings. Not only did low adopters choose the lowest-intensity trainings, but the majority also accessed a small number of trainings (between one and four). In contrast, the majority of super-high adopters accessed nine or more trainings. The present study uses Chor et al.’s (2014) categorization of clinic participation in CTAC trainings as a proxy measure for level of adoption.
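The tiers described above amount to a simple decision rule keyed to the most intensive training modality a clinic accessed. A minimal sketch of that rule in Python (the function and argument names are ours, for illustration only, not from Chor et al.'s study materials):

```python
def classify_adopter(webinars: int, in_person: int, collaboratives: int) -> str:
    """Classify a clinic's adoption level from counts of CTAC trainings
    accessed, following the tiers described in Chor et al. (2014):
    the category is set by the most intensive modality the clinic used."""
    if collaboratives >= 2:
        return "super-high"   # at least two learning collaboratives
    if collaboratives == 1:
        return "high"         # exactly one learning collaborative
    if in_person >= 1:
        return "medium"       # in-person training(s), no collaboratives
    if webinars >= 1:
        return "low"          # webinars only
    return "non"              # accessed no trainings

# Example: a clinic that accessed three webinars and one in-person
# seminar is a medium adopter.
level = classify_adopter(webinars=3, in_person=1, collaboratives=0)
```

Note that under this rule the number of trainings accessed does not affect the category, only the highest-intensity modality does; counts of trainings are reported separately in the text above.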
Study Goals and Methods
This three-study dissertation investigates three key aspects of innovation adoption and
implementation in the context of New York State’s innovation rollout. These include how clinics
first learn about innovations; factors associated with clinic adoption decisions; and stakeholder
perceptions about implementation success and failure. First, clinic leader interviews were
analyzed to explore how clinics first learn about innovations (e.g. professional associations,
consultants) and how processes related to acquiring information differ depending on clinic level
of adoption. Next, qualitative comparative analysis (QCA) was used to examine how
configurations of leader-reported drivers of adoption (e.g. culture, innovation-organization fit)
and clinic characteristics from administrative data (e.g. size, client population) are associated
with innovation adoption. Finally, state policy maker and clinic leader interviews were used to
explore stakeholder attitudes and beliefs about implementation success and failure. Table 1.1
presents a summary of the three studies. The aims of each study are as follows:
Study One: Identify commonly accessed external sources of information on innovative
clinical and business practices and explore differences in acquisition of information between
clinics with higher versus lower levels of adoption.
Study Two: Identify combinations of clinic leader-reported drivers of adoption decisions
and clinic characteristics associated with adoption of clinical innovations and adoption of
business innovations.
Study Three: Examine clinic leader and state executive perspectives around defining
implementation success and deciding when to stop implementing a failed innovation.
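Study Two's qualitative comparative analysis rests on truth tables like those in Appendices C and D. As a toy crisp-set illustration (the condition names and case values below are invented for exposition, not study data), a truth table groups cases by their configuration of binary conditions and scores each configuration's consistency with the outcome:

```python
from collections import defaultdict

def truth_table(cases):
    """Build a crisp-set QCA truth table: group cases by their
    configuration of binary conditions and compute each configuration's
    consistency (share of its cases that show the outcome).
    Each case is (conditions_tuple, outcome), with 0/1 values."""
    rows = defaultdict(lambda: [0, 0])  # config -> [n_cases, n_with_outcome]
    for conditions, outcome in cases:
        rows[conditions][0] += 1
        rows[conditions][1] += outcome
    return {cfg: (n, n_out / n) for cfg, (n, n_out) in rows.items()}

# Toy data: (supportive_culture, innovation_fit) -> adopted?
cases = [((1, 1), 1), ((1, 1), 1), ((1, 0), 0), ((0, 1), 1), ((0, 0), 0)]
table = truth_table(cases)
print(table[(1, 1)])  # -> (2, 1.0): both supportive-culture, good-fit cases adopted
```

High-consistency configurations are then minimized into a solution formula; full QCA software additionally handles calibration, logical remainders, and coverage, which this sketch omits.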
This dissertation is grounded in a multilevel, multiphasic approach, guided by Wisdom et
al.’s (2014) innovation adoption framework and Aarons et al.’s (2011) implementation
framework. Although clinic leader perceptions are at the center, the three studies incorporate
factors from the external system, inner organizational context, and innovation levels. The first
study focuses on key tasks related to the initial, pre-adoption phase, including gaining awareness
of an innovation and processing information related to the innovation. The second study focuses
on the second phase of adoption, the adoption decision itself. Finally, the third study examines
stakeholder views about implementation success and failure that may underlie decisions across
all implementation phases.
This dissertation uses data collected from 34 clinics that were randomly selected from
within Chor et al.’s (2014) adoption categories to represent a 10% stratified sample of the 346
clinics in Chor et al.’s study. Based on administrative training data, these clinics were classified
as non- (n = 11), low (n = 5), medium (n = 6), high (n = 9), or super-high (n = 3) adopters using
Chor et al.’s (2014) criteria. Qualitative interviews about innovations, adoption, and
implementation were conducted with 68 clinic CEOs and program directors from these 34 clinics
between 2013 and 2014. The leader interview data were used in all three studies. Quantitative data
related to clinic characteristics were extracted from state administrative records (Olin et al.,
2015) and used in Study Two. Additional qualitative interviews conducted with nine New York
State executives were used in Study Three.
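The stratified design described above (roughly 10% of clinics drawn from within each adoption category) can be sketched generically; the stratum contents below are placeholders, and this is an illustration of the sampling logic only, not the study's actual procedure:

```python
import random

def stratified_sample(clinics_by_stratum, fraction=0.10, seed=0):
    """Draw a simple random sample of roughly the given fraction
    (rounded) within each stratum, mirroring a 10% stratified design."""
    rng = random.Random(seed)
    return {
        stratum: rng.sample(clinics, round(len(clinics) * fraction))
        for stratum, clinics in clinics_by_stratum.items()
    }

# Placeholder strata of arbitrary sizes; each yields ~10% of its clinics.
strata = {"non": list(range(80)), "low": list(range(140)),
          "medium": list(range(50)), "high": list(range(40)),
          "super-high": list(range(36))}
picked = stratified_sample(strata)
sizes = {s: len(v) for s, v in picked.items()}
# -> {'non': 8, 'low': 14, 'medium': 5, 'high': 4, 'super-high': 4}
```

Sampling within strata rather than from the pooled list guarantees that every adoption level is represented in the final sample, including small strata such as super-high adopters.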
Improving adoption and implementation of innovations in children’s mental health
services is a public health priority. Results from this dissertation will inform current models of
implementation and offer recommendations to assist funders, policy makers, oversight agencies,
and organizational leaders in future efforts to scale up innovative clinical and business practices.
In turn, this will improve quality of care in child and adolescent mental health services.
References
Aarons, G. A., Ehrhart, M. G., Moullin, J. C., Torres, E. M., & Green, A. E. (2017). Testing the
leadership and organizational change for implementation (LOCI) intervention in
substance abuse treatment: A cluster randomized trial study protocol. Implementation
Science, 12, 29-39.
Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of
evidence-based practice implementation in public service sectors. Administration and
Policy in Mental Health and Mental Health Services Research, 38, 4-23.
Balas, E. A., & Boren, S. A. (2000). Managing clinical knowledge for health care
improvement. In J. Bemmel & A. T. McCray (Eds.), Yearbook of medical informatics 2000: Patient-centered systems (pp. 65-70). Stuttgart, Germany: Schattauer.
Beidas, R. S., & Kendall, P. C. (Eds.). (2014). Dissemination and implementation of evidence-
based practices in child and adolescent mental health. New York, NY: Oxford
University Press.
Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes:
Implementation frameworks and organization change. Research on Social Work
Practice, 25(4), 477-487.
Bickman, L., Lyon, A. R., & Wolpert, M. (2016). Achieving precision mental health through
effective assessment, monitoring, and feedback processes: Introduction to the special
issue. Administration and Policy in Mental Health and Mental Health Services Research,
43, 271-276.
Brookman-Frazee, L., Stadnick, N., Roesch, S., Regan, J., Barnett, M., Bando, L., ... & Lau, A.
(2016). Measuring sustainment of multiple practices fiscally mandated in children’s
mental health services. Administration and Policy in Mental Health and Mental Health
Services Research, 43(6), 1009-1022.
Bruns, E. J., & Hoagwood, K. E. (2008). State implementation of evidence-based practice for
youths, part I. Responses to the state of the evidence. Journal of the American Academy
of Child and Adolescent Psychiatry, 47, 369–373.
Bruns, E. J., Hoagwood, K. E., Rivard, J. C., Wotring, J., Marsenich, L., Carter, B., & Hamilton,
J. D. (2008). State implementation of evidence-based practice for youths, Part II:
Recommendations for research and policy. Journal of the American Academy of Child &
Adolescent Psychiatry, 47(5), 499-504.
Boren, S. A., & Balas, E. A. (1999). Evidence-based quality measurement. Journal of
Ambulatory Care Management, 22(3), 17–23.
Carstens, C. A., Panzano, P. C., Massatti, R., Roth, D., & Sweeney, H. A. (2009). A naturalistic
study of MST dissemination in 13 Ohio communities. Journal of Behavioral Health
Services & Research, 36(3), 344-360.
Chaffin, M., Hecht, D., Bard, D., Silovsky, J. F., & Beasley, W. H. (2012). A statewide trial of
the SafeCare home-based services model with parents in Child Protective
Services. Pediatrics, 129(3), 509-515.
Chamberlain, P., Roberts, R., Jones, H., Marsenich, L., Sosna, T., & Price, J. M. (2012). Three
collaborative models for scaling up evidence-based practices. Administration and Policy
in Mental Health and Mental Health Services Research, 39(4), 278-290.
Chan, E., Fogler, J. M., & Hammerness, P. G. (2016). Treatment of attention-
deficit/hyperactivity disorder in adolescents: A systematic review. JAMA, 315(18), 1997-
2008.
Chor, K. H. B., Olin, S. C. S., Weaver, J., Cleek, A. F., McKay, M. M., Hoagwood, K. E., &
Horwitz, S. M. (2014). Adoption of clinical and business trainings by child mental health
clinics in New York State. Psychiatric Services, 65(12), 1439-1444.
Chorpita, B. F., Daleiden, E. L., Ebesutani, C., Young, J., Becker, K. D., Nakamura, B. J., …
Starace, N. (2011). Evidence-based treatments for children and adolescents: An updated
review of indicators of efficacy and effectiveness. Clinical Psychology Science and
Practice, 18(2), 154–172.
Copeland, W. E., Miller-Johnson, S., Keeler, G., Angold, A., & Costello, E. J. (2007). Childhood
psychiatric disorders and young adult crime: A prospective, population-based study.
American Journal of Psychiatry, 164, 1668–1675.
Damanpour, F., & Schneider, M. (2006). Phases of the adoption of innovation in organizations:
Effects of environment, organization and top managers. British Journal of
Management, 17(3), 215-236.
Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C.
(2009). Fostering implementation of health services research findings into practice: A
consolidated framework for advancing implementation science. Implementation Science,
4, 50-65.
Dorsey, S., Berliner, L., Lyon, A. R., Pullmann, M. D., & Murray, L. K. (2016). A statewide
common elements initiative for children’s mental health. The Journal of Behavioral
Health Services & Research, 43(2), 246-261.
Feldstein, A. C., & Glasgow, R. E. (2008). A Practical, Robust, Implementation and
Sustainability Model (PRISM) for integrating research findings into practice. The Joint
Commission Journal on Quality and Patient Safety, 34(4), 228-243.
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005).
Implementation research: A synthesis of the literature. Tampa, FL: University of South
Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation
Research Network.
Fixsen, D. L., Blase, K. A., Naoom, S. F., & Wallace, F. (2009). Core implementation
components. Research on Social Work Practice, 19(5), 531-540.
Gleacher, A. A., Olin, S. S., Nadeem, E., Pollock, M., Ringle, V., Bickman, L., ... & Hoagwood,
K. (2016). Implementing a measurement feedback system in community mental health
clinics: A case study of multilevel barriers and facilitators. Administration and Policy in
Mental Health and Mental Health Services Research, 43(3), 426-440.
Green, L. W. (2008). Making research relevant: If it is an evidence-based practice, where's the
practice-based evidence? Family Practice, 25(supp. 1), i20-i24.
Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of
innovations in service organizations: Systematic review and recommendations. Milbank
Quarterly, 82(4), 581–629.
Hoagwood, K. E., Olin, S. S., Horwitz, S., McKay, M., Cleek, A., Gleacher, A., ... & Kuppinger,
A. (2014). Scaling up evidence-based practices for children and families in New York
State: Toward evidence-based policies on implementation for state mental health
systems. Journal of Clinical Child & Adolescent Psychology, 43(2), 145-157.
Hodges, K., & Wotring, J. (2004). The role of monitoring outcomes in initiating implementation
of evidence-based treatments at the state level. Psychiatric Services, 55(4), 396-400.
Kazdin, A. E., & Weisz, J. R. (eds). (2003). Evidence-based psychotherapies for children and
adolescents. New York, NY: Guilford Press.
Kendall, P. C., Hudson, J. L., Gosch, E., Flannery-Schroeder, E., & Suveg, C. (2008). Cognitive-
behavioral therapy for anxiety disordered youth: A randomized clinical trial evaluating
child and family modalities. Journal of Consulting and Clinical Psychology, 76(2), 282-
297.
Lau, A. S., & Brookman-Frazee, L. (2016). The 4KEEPS study: identifying predictors of
sustainment of multiple practices fiscally mandated in children’s mental health
services. Implementation Science, 11, 31-38.
Massatti, R. R., Sweeney, H. A., Panzano, P. C., & Roth, D. (2008). The de-adoption of
innovative mental health practices (IMHP): Why organizations choose not to sustain an
IMHP. Administration and Policy in Mental Health and Mental Health Services
Research, 35(1-2), 50-65.
McHugh, R., & Barlow, D. (2010). The dissemination and implementation of evidence-based
psychological treatments: A review of current efforts. American Psychologist, 65(2), 73-84.
Mendel, P., Meredith, L. S., Schoenbaum, M., Sherbourne, C. D., & Wells, K. B. (2008).
Interventions in organizational and community context: A framework for building
evidence on dissemination and implementation in health services
research. Administration and Policy in Mental Health and Mental Health Services
Research, 35, 21-37.
Nakamura, B. J., Chorpita, B. F., Hirsch, M., Daleiden, E., Slavin, L., Amundson, M. J., ... &
Stern, K. (2011). Large‐scale implementation of evidence‐based treatments for children
10 years later: Hawaii’s evidence‐based services initiative in children’s mental
health. Clinical Psychology: Science and Practice, 18, 24-35.
O'Connell, M. E., Boat, T., Warner, K. E., (eds). (2009). Preventing mental, emotional, and
behavioral disorders among young people: Progress and possibilities. Washington, DC:
National Academies Press.
Olin, S. C. S., Chor, K. H. B., Weaver, J., Duan, N., Kerker, B. D., Clark, L. J., ... & Horwitz, S.
M. (2015). Multilevel predictors of clinic adoption of state-supported trainings in
children’s services. Psychiatric Services, 66(5), 484-490.
Palinkas, L. A., Um, M. Y., Jeong, C. H., Chor, K. H. B., Olin, S., Horwitz, S. M., & Hoagwood,
K. E. (2017). Adoption of innovative and evidence-based practices for children and
adolescents in state-supported mental health clinics: A qualitative study. Health Research
Policy and Systems, 15, 27-35.
Panzano, P. C., & Roth, D. (2006). The decision to adopt evidence-based and other innovative
mental health practices: Risky business? Psychiatric Services, 57(8), 1153-1161.
Park, A. L., Tsai, K. H., Guan, K., & Chorpita, B. F. (2018). Unintended consequences of
evidence-based treatment policy reform: Is implementation the goal or the strategy for
higher quality care? Administration and Policy in Mental Health and Mental Health
Services Research, 45(4), 649-660.
Perou, R., Bitsko, R. H., Blumberg, S. J., Pastor, P., Ghandour, R. M., Gfroerer, J. C., ... &
Parks, S. E. (2013). Mental health surveillance among children—United States, 2005–
2011. Centers for Disease Control and Prevention Morbidity and Mortality Weekly
Report, Supplement, Vol 62, No. 2.
Proctor, E. K. (2004). Leverage points for the implementation of evidence-based practice. Brief
Treatment and Crisis Intervention, 4(3), 227-242.
Proctor, E. K., Knudsen, K. J., Fedoravicius, N., Hovmand, P., Rosen, A., & Perron, B. (2007).
Implementation of evidence-based practice in community behavioral health: Agency
director perspectives. Administration and Policy in Mental Health and Mental Health
Services Research, 34(5), 479-488.
Proctor, E. K., Lansverk, J., Aarons, G., Chambers, D., Glisson, C., & Mittman, B. (2009).
Implementation research in mental health services: An emerging science with conceptual,
methodological, and training challenges. Administration and Policy in Mental Health, 36,
24-34.
Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., … & Hensley, M.
(2011). Outcomes for implementation research: Conceptual distinctions, measurement
challenges, and research agenda. Administration and Policy in Mental Health, 38, 65-76.
Rapp, C. A., Etzel-Wise, D., Marty, D., Coffman, M., Carlson, L., Asher, D., ... & Holter, M.
(2010). Barriers to evidence-based practice implementation: Results of a qualitative
study. Community Mental Health Journal, 46(2), 112-118.
Reeves, W. C., Strine, T. W., Pratt, L. A., Thompson, W., Ahluwalia, I., Dhingra, S. S., ... &
Morrow, B. (2011). Mental illness surveillance among adults in the United
States. Centers for Disease Control and Prevention Morbidity and Mortality Weekly
Report, Supplement, Vol. 60, No. 3.
Regan, J., Lau, A. S., Barnett, M., Stadnick, N., Hamilton, A., Pesanti, K., ... & Brookman-
Frazee, L. (2017). Agency responses to a system-driven implementation of multiple
evidence-based practices in children’s mental health services. BMC Health Services
Research, 17, 671-684.
Rodriguez, A., Lau, A. S., Wright, B., Regan, J., & Brookman-Frazee, L. (2018). Mixed-method
analysis of program leader perspectives on the sustainment of multiple child evidence-
based practices in a system-driven implementation. Implementation Science, 13, 44-57.
Roy-Byrne, P. P., Davidson, K. W., Kessler, R. C., Asmundson, G. J., Goodwin, R. D.,
Kubzansky, L., ... & Stein, M. B. (2008). Anxiety disorders and comorbid medical
illness. General Hospital Psychiatry, 30(3), 208-225.
Saldana, L., & Chamberlain, P. (2012). Supporting implementation: the role of community
development teams to build infrastructure. American Journal of Community
Psychology, 50(3-4), 334-346.
Silverman, W. K., & Hinshaw, S. P. (2008). The second special issue on evidence-based
psychosocial treatments for children and adolescents: A 10-year update. Journal of
Clinical Child & Adolescent Psychology, 37, 1-7.
Simpson, D. D. (2002). A conceptual framework for transferring research to practice. Journal of
Substance Abuse Treatment, 22(4), 171-182.
Smit, F., Cuijpers, P., Oostenbrink, J., Batelaan, N., de Graaf, R., & Beekman, A. (2006). Costs
of nine common mental disorders: Implications for curative and preventive
psychiatry. Journal of Mental Health Policy and Economics, 9(4), 193-200.
Sofronoff, K., & Farbotko, M. (2002). The effectiveness of parent management training to
increase self-efficacy in parents of children with Asperger syndrome. Autism, 6(3), 271-
286.
Trupin, E., & Kerns, S. (2017). Introduction to the special issue: Legislation related to children’s
evidence-based practice. Administration and Policy in Mental Health and Mental Health
Services Research, 44, 1-5.
Wisdom, J. P., Chor, K. H. B., Hoagwood, K. E., & Horwitz, S. M. (2014). Innovation adoption:
a review of theories and constructs. Administration and Policy in Mental Health and
Mental Health Services Research, 41(4), 480-502.
Table 1.1
Overview of the Three Studies

Study One
Aim: Identify external sources of information about innovations and explore differences in acquisition of information between clinics with higher versus lower levels of adoption.
Data sources: Clinic leader interviews; CTAC training participation.
Factors/variables: External system (sources of information about innovations); Organization (leader perceptions, absorptive capacity).
Implementation phase: Pre-adoption; Adoption.

Study Two
Aim: Identify combinations of clinic leader-reported drivers of adoption decisions and clinic characteristics associated with adoption.
Data sources: Clinic leader interviews; CTAC training participation; Clinic administrative data.
Factors/variables: Organization (leader perceptions, culture, size, fiscal efficiency, client population); Innovation (innovation-organization fit).
Implementation phase: Adoption.

Study Three
Aim: Examine clinic leader and state executive perspectives around defining implementation success and deciding when to stop implementing a failed adoption.
Data sources: Clinic leader interviews; State executive interviews.
Factors/variables: External system (state executive perceptions); Organization (leader perceptions).
Implementation phase: Implementation; Exnovation.
Chapter Two: Study One
Acquisition and Assimilation of External Sources of Information about Innovative Practices
in Children’s Mental Health Organizations
Abstract
Introduction
Demands for access to effective mental health services for children, adolescents, and their
families have led to calls for increased adoption of evidence-based clinical and business practices
in public mental health clinics. In order to initiate adoption, organizational staff must first gain
awareness of and acquire knowledge about an innovation. However, little is known about how
organizations first learn about innovations, which external sources of information about
innovations they access most, or how organizational processes influence information acquisition.
This study uses qualitative interviews with clinic leaders to: 1) identify commonly accessed
external sources of information on innovative clinical and business practices; 2) examine how
organizational absorptive capacity influences how clinics access and use information; and 3)
explore differences in acquisition of information between clinics with higher levels of innovation
adoption versus clinics with lower levels of innovation adoption.
Methods
This study was conducted in the context of a New York State initiative to scale up
clinical and business innovations via technical assistance and trainings. Previous research
categorized clinic training participation during the early years of this initiative into five levels
(none, low, medium, high, and super-high) that served as proxies for level of adoption.
Qualitative interviews were conducted with 66 clinic leaders from 34 clinics that were randomly
selected from within these adoption levels to represent a 10% stratified sample of all clinics in
the state children’s mental health system. “Coding Consensus, Co-occurrence, and Comparison”,
a thematic content analysis method rooted in grounded theory, was used to code transcripts, and
a template organizing style was used to compare theme application between super/high-adopting
clinics and low/non-adopting clinics.
Results
Clinic leaders accessed information about innovations most frequently from the state
Office of Mental Health, followed by other government agencies, professional associations, peer
organizations, and, to a lesser extent, research literature and universities. There were no clear
differences in the types of external information sources endorsed by clinics with higher levels of
adoption compared to those with lower levels of adoption. However, super/high-adopting clinics
exhibited greater absorptive capacity than low/non-adopting clinics. Leaders in super/high-
adopting clinics were more likely to obtain information through in-person conversations,
collaborative relationships and partnerships, and active search processes. Super/high-adopting
clinics were also more likely to have both top-down and bottom-up organizational channels for
communicating information about innovations and values and norms that prioritize innovations.
Discussion
Results of this study offer insight into the sources mental health organizations rely on for
information about innovations. Findings align with other research suggesting that developing
absorptive capacity may strengthen an organization’s ability to identify and apply information
about innovations, and subsequently lead to higher rates of adoption. Oversight agencies and
technical assistance centers aiming to increase adoption should establish collaborative
relationships with clinic leaders and support them in developing new networks and partnerships.
Additional strategies for building absorptive capacity may include training leaders to effectively
29
search for, evaluate, and process information, developing organizational channels for knowledge
sharing, and promoting cultures that value capturing and sharing new knowledge.
Acquisition and Assimilation of External Sources of Information about Innovative Practices
in Children’s Mental Health Organizations
Researchers have identified a range of effective clinical interventions for child and
adolescent mental, emotional, and behavioral disorders, such as cognitive-behavioral therapies
and parent management therapies (Chorpita et al., 2011; Kazdin & Weisz, 2003; Kendall,
Hudson, Gosch, Flannery-Schroeder, & Suveg, 2008; Silverman & Hinshaw, 2008; Sofronoff &
Farbotko, 2002). They have also developed effective business and quality improvement practices
to optimize mental health service delivery, including standardized assessments and measurement
feedback systems (Bickman, Lyon & Wolpert, 2016; Gleacher et al., 2016; Hoagwood et al.,
2014). However, low rates of innovation adoption and poor or partial implementation indicate
that simply developing effective innovations is not enough to change practice in community-
based mental health organizations (Balas & Boren, 2000; Green, 2008). Further research is
needed to identify factors and processes that support implementation of innovations and in turn
improve clinical outcomes for children, adolescents and their families (Chor, Wisdom, Olin,
Hoagwood, & Horwitz, 2015).
Adoption is the first step in the implementation process during which organizational staff
decide whether to proceed with full or partial implementation of an innovation (Wisdom, Chor,
Hoagwood, & Horwitz, 2014). Exposure to credible sources of information about innovations
facilitates adoption by introducing organizational staff to innovations, assisting them in
identifying innovations to address specific needs, helping them weigh the adoption decision, and
encouraging them to direct adequate resources to the adoption effort (Aarons, Hurlburt, &
Horwitz, 2011; Bradley et al., 2004; Glisson & Schoenwald, 2005; Mendel, Meredith,
Schoenbaum, Sherbourne, & Wells, 2008; Panzano & Roth, 2006; Wisdom et al., 2014). In
contrast, lack of access to and familiarity with innovations hinders adoption (Bradley et al.,
2004; Feldstein & Glasgow, 2008; Solomons & Spross, 2011). Beyond adoption, acquisition and
use of information about innovations is associated with implementation and sustainment
(Palinkas, Saldana, Chou, & Chamberlain, 2017). Given that procurement of information about
an innovation is necessary for organizational adoption and implementation to proceed, it is
important to examine how organizational leaders learn about and use information about
innovations (Palinkas et al., 2015).
Leaders may learn about innovations from both impersonal communication channels and
social networks. Impersonal channels, including emails and mass media, spread innovations to
large audiences and play an important role during the early phase of adoption (Greenhalgh,
Robert, Macfarlane, Bate, & Kyriakidou, 2004; Rogers, 2003; Valente, Chou, & Pentz, 2007).
However, social networks may have greater influence on the overall success of adoption
(Greenhalgh et al., 2004; Mendel et al., 2008; Wisdom et al., 2014). Social networks facilitate
adoption by creating access to information and support, and by offering opportunities for
collaboration around adoption (Palinkas et al., 2011; Palinkas et al., 2017; Valente, Chou, &
Pentz, 2007; Valente & Davis, 1999). Organizations may adopt innovations in response to
external sources in their networks, such as other community-based organizations or state and
county colleagues (Horwitz et al., 2014; Palinkas et al., 2015). Additional potential outside
sources of information include university researchers, program developers, consultants,
conferences, or professional associations, although some literature suggests that these sources
may be relied upon less often (Horwitz et al., 2014).
In addition to identifying the external information sources that are most relevant to
innovation adoption, it is also necessary to examine the internal organizational factors that may
influence how information is accessed. An organization’s absorptive capacity, or its ability to
access and apply external knowledge, influences how organizational staff identify, assimilate,
and exploit information about innovations (Cohen & Levinthal, 1990; Zahra & George, 2002).
Absorptive capacity is associated with both pre-adoption and adoption of innovations in mental
health services, and consists of four elements that progress chronologically: acquisition,
assimilation, transformation, and exploitation (Aarons et al., 2011; Greenhalgh et al. 2004;
Wisdom et al., 2014). This study focuses on the first two elements: acquisition, defined as an
organization’s ability to recognize, value, and acquire external knowledge; and assimilation,
defined as organizational routines and processes that facilitate understanding and analysis of
external information (Zahra & George, 2002). Although generally analyzed at the organization
level, absorptive capacity is a complex construct that also encompasses the individual level since
an organization’s absorptive capacity depends on that of its staff (Cohen & Levinthal, 1990). In
the context of this study, absorptive capacity is indicated by: organizational staff that proactively
explore external information; cultures that value capturing and sharing innovations; knowledge-
sharing processes; and boundary-spanning roles with other organizations (Cohen & Levinthal,
1990; Greenhalgh et al., 2004; Horwitz et al., 2014; Palinkas et al., 2015; Rogers, 2003; Zahra &
George, 2002).
The Current Study
In order to initiate adoption, organizational staff must first gain awareness of and acquire
knowledge about an innovation. However, little is known about how organizations first learn
about innovations, which external sources of information about innovations they access most, or
whether differences in acquisition of information are associated with variation in adoption. The
current study uses qualitative interviews with community mental health agency/clinic leaders to
provide real world examples of the external sources and processes that influence information
acquisition. The objectives of this study are to: 1) identify commonly accessed external sources of
information on innovative clinical and business practices in community mental health clinics that
serve children and adolescents; 2) examine how absorptive capacity influences how clinics
access and use information; and 3) explore differences in acquisition of information between
clinics with high levels of innovation adoption versus clinics with low levels of innovation
adoption. Findings related to these objectives can inform dissemination of information during
future innovation rollouts, and guide implementers in cultivating conditions that facilitate
information uptake and innovation adoption.
Methods
Data and Sample
This study examines innovation adoption in the context of a New York State initiative to
scale up clinical and business innovations in child and adolescent mental health services
(Hoagwood et al., 2014). The Community Technical Assistance Center (CTAC) was founded in
2011 and funded by the New York State Office of Mental Health (NYS OMH) to provide
training, consultation, and education for business and clinical skills. In 2011-2014, when data for
the current study were collected, the CTAC offered training in three modalities (webinars,
in-person seminars, and learning collaboratives) that represent escalating levels of intensity and clinic
commitment. Hour-long webinars represent the least intensive training modality; all-day in-
person seminars represent mid-level intensity; and 6-18 month learning collaboratives are the
most intensive. The CTAC has since expanded its offerings and modified its focus, but at the
time of data collection in the present study, the CTAC offered 33 trainings. Eighteen of these
trainings targeted evidence-based clinical practices, twelve trainings targeted improvement of
business practices, and three hybrid trainings targeted improvement of both clinical and business
practices. More detail about these trainings may be found elsewhere (Chor et al., 2014). CTAC
provides technical assistance on a voluntary, no-charge basis to all outpatient mental health
clinics licensed by New York State (www.ctacny.org).
Sending staff to be trained in an innovation represents an early clinic adoption behavior.
Therefore, training participation may be considered a proxy measure for adoption (Olin et al.,
2015). Chor and colleagues (2014) examined the naturalistic participation in CTAC trainings of
all 346 clinics licensed to treat children, adolescents, and their families in New York State
between 2011 and 2013. They categorized clinics into non-adopters (did not access any
trainings); low adopters (accessed webinars only); medium adopters (accessed at least one in-
person seminar but no learning collaboratives); high adopters (accessed one learning
collaborative); and super-high adopters (accessed at least two learning collaboratives). The
present study uses data from 34 clinics that were randomly selected from within Chor et al.’s
(2014) adoption categories to represent a 10% stratified sample of the 346 clinics in the larger
study. Table 2.1 contains clinic characteristics.
Qualitative interview data were collected in 2013-2014 from 66 clinic leaders. Within
most clinics, two interviews were conducted: one with an upper level administrator (e.g. CEO or
Vice President) and one with a middle level manager (e.g. Program Director). Within three
clinics, only one interview was conducted due to simpler organizational structures, and within
one clinic a third interview was conducted with an additional key decision-maker. Clinic leaders’
demographic information was not collected. Informed consent was obtained from all individual
participants included in the study, and study procedures were approved by the Institutional
Review Boards of New York University and University of Southern California.
The hour-long, semi-structured clinic leader interviews were audio recorded and
transcribed verbatim (Palinkas et al., 2017). Participants answered questions about their initial
exposure and access to clinical and business innovations, including how they first learn about
innovations, who provides information about innovation adoption, and what type of information
is provided.” Although data were collected in the context of a study examining adoption of CTAC
innovations, interview questions pertained to innovations generally, rather than CTAC
innovations specifically. Interview questions included: “How does your agency typically first
hear about an innovation?” “Who generally informs your agency about participating in
innovations?” and “What information is generally provided about an innovation the first time
your agency hears about it?” Interviews were sufficiently open-ended to allow participants to
elaborate on issues they deemed particularly relevant.
Data Analysis
Data were coded in Dedoose Version 8.0.35 (2018) using the thematic content analysis
method of “Coding Consensus, Co-occurrence, and Comparison” (Willms et al., 1990). This
method is rooted in grounded theory, in which theory is derived from data and then illustrated
using characteristic examples (Glaser and Strauss, 1967). The first author coded all transcripts,
and a graduate level researcher coded 50% of transcripts selected at random. The coders were
blinded to the clinic levels of adoption during coding to minimize bias.
First, open coding was used to identify broad themes and patterns, and then axial coding
was used to explore these themes more deeply (Strauss & Corbin, 1998). Codes captured both
information about a priori themes (i.e. direct responses to interview questions about the types of
sources leaders accessed), as well as themes that spontaneously emerged from the data (i.e. the
processes through which organizations seek and use information). The co-coders created detailed
memos to describe codes and document decision-making (Charmaz, 2014; Strauss & Corbin,
1998). Discrepancies in coding were discussed until consensus was reached. Interrater reliability,
calculated from 10 randomly selected transcripts, was 92% (range 89-95%), indicating good
agreement (Padgett, 2012; Viera & Garrett, 2005).
Given that leaders at different levels of a single organization may have access to different
external information sources and observe different organizational processes, interviews
conducted in the same organization were treated as independent. Preliminary comparisons
between upper-level and middle-level leader interviews did not indicate systematic differences in
themes, so leader level was not examined further in the analysis.
After coding was complete, a template organizing style was used to compare codes from
interviews with clinic leaders in super/high-adopting clinics with those from low/non-adopting
clinics (Crabtree and Miller, 1999). The first author developed a matrix of themes and compared
the content and organization of the matrix to identify themes that were common to both groups
of clinics as well as themes that were specific to only one group (Crabtree & Miller, 1999).
Medium-adopting clinics were then compared with both super/high and low/non-adopting
clinics. On some themes, medium-adopting clinics appeared most similar to super/high-adopting
clinics, on other themes they were most similar to low/non-adopting clinics, and on others they
were mixed. Therefore, they were not incorporated into either the super/high or the low/non-
adopting groups.
Results
Results related to external sources of information about innovations are presented first,
followed by themes related to organizational absorptive capacity for identifying and assimilating
external information. Differences in findings between super/high and low/non-adopting agencies
are discussed where applicable, mainly in relation to themes related to absorptive capacity.
Given that this is a qualitative study, no inferential statistical tests were performed in making
comparisons, and terms such as ‘more likely’ or ‘less likely’ should be interpreted as qualitative
assessments by the investigators.
Findings related to external sources of information may be accurately dichotomized: either a
leader mentioned a particular source in response to the relevant prompt or they did not.
Therefore, descriptive statistics are presented for external sources of information in Table 2.2.
However, themes related to organizational absorptive capacity were not ‘quantitized’ for two
reasons. First, responses related to these process themes are less easily divided into mutually
exclusive categories. Second, given that these themes emerged a posteriori from the data rather than
as direct responses to interview questions, they are not present in all interviews. Therefore, the
proportion of interviews that contained these themes may not represent a meaningful measure
(Creswell & Plano Clark, 2011).
External Information Sources
The New York State Office of Mental Health (OMH) was the most frequently identified
source for information about innovations. In some cases, leaders identified OMH-funded
resources such as the Center for Practice Innovations and the CTAC. Three leaders explicitly
stated that communications from OMH are the most common way their organization first hears
about an innovation.
“…they [OMH] have been really helpful in trying to help the mental health agencies in
the State to be more aware of the evidence based practices out there, and quality
initiatives as well. So OMH is certainly a very important source.”
The majority of leaders at all adoption levels— including non-adopting clinics that did
not participate in the free CTAC trainings offered through OMH—reported that they received
or accessed information about innovations through OMH. However, two leaders in low/non-
adopting agencies reported that they did not view OMH as an important source of information.
As one stated, “When it’s mandated we get things from OMH. But we don’t consider OMH the
leading innovator.”
Beyond OMH, clinic leaders reported accessing a wide range of government sources of
information about innovations. These included federal (e.g. Substance Abuse and Mental Health
Services Administration; National Institute of Mental Health), state (e.g. Office of Alcoholism
and Substance Abuse Services; Administration for Children’s Services; Department of
Education) and local (e.g. New York City Department of Health and Mental Hygiene) sources.
These sources were often linked to the clinic’s funding, as explained by one leader,
“And then the other way is that we are very active in terms of fundraising. And so we
also scan the funding opportunities. And very often as part of that, there will be potential
evidence based practices identified. So that’s also another vehicle by which we may hear
about something.”
Non-government sources of information included peer networks, professional
organizations, research literature, and universities. The Coalition for Behavioral Health, which
represents approximately 150 community-based behavioral health agencies in New York City
and the surrounding counties, was the most frequently endorsed peer network source of
information. One leader stated, “A lot of them [ideas for innovative practices] come from the
Coalition for Behavioral Health Agencies.” Professional associations, such as the American
Psychological Association and the National Council for Behavioral Health, were another
important source of information, as illustrated by one leader, who said, “We view that [National
Council for Behavioral Health] conference as an essential tool to expose us to what's happening
not only throughout the company in terms of other providers, but at the federal level in terms of
policy.”
The leaders who endorsed research literature as a source of information often described
active search processes. These leaders were mainly from super/high or medium-adopting
agencies, as will be described in the next section. One such leader reported, “I personally scan
the Medline. I do various literature searches, various things.” Universities were the least
frequently mentioned source of information. Leaders who identified this source generally
described receiving the information through university mailing lists, though two leaders
mentioned having deeper affiliations with local universities, such as participating on projects
related to innovations.
Leaders reported that emails were the most common modality through which they
received information about innovations from outside sources. They typically received these
emails via mailing lists or impersonal email ‘blasts’. One leader described the numerous mailing
lists through which they accessed information.
“It's obviously being on, every one's on every list serve it seems. So I get 150 emails a
day and half of those are from, whether it's advocacy groups or associations that are
encouraging things, whether it's from the federal government talking about a promising
practice. So I'd say we get them from a lot of different places, as well as from the
leadership in our own county who are saying, you know, we’re interested in this model or
that model.”
Less frequently, leaders reported accessing information through extraorganizational
meetings, such as conferences or committees. Leaders from super/high organizations were more
likely to elaborate on these meetings or describe them as important sources for learning about
innovations, as will be discussed in the next section.
Regarding information content, approximately half of the leaders interviewed described
the types of information about innovations they most frequently accessed or received. They
reported that this information typically consisted of a ‘basic overview’ about an innovation, or
‘broad strokes’ information about trainings, initiatives, or mandates. Leaders also described the
types of information about innovations that they personally wanted or sought out. They looked
for information about the innovation’s evidence base and relevance to their client population.
They also wanted information to help them evaluate whether or not the innovation is a good fit
for their clinic, including details about feasibility, cost, and what the implementation process
may entail. One leader said,
“…I generally am interested in reading about programs, you know, or sites that have
implemented. And sort of the details of implementation and sort of how it’s being applied
kind of in the real world, and what the benefits and challenges of that are. That’s kind of
my specific interest when I kind of hear about an innovation or a best practice.”
Organizational Absorptive Capacity
Data revealed three interrelated themes pertaining to organizational absorptive capacity to
acquire external information. These themes included 1) leader capacity; 2) organizational
knowledge-sharing mechanisms; and 3) organizational culture.
Leader capacity. Leaders illustrated their organization’s absorptive capacity through
their proclivity for actively seeking information from external sources and their connections with
external individuals and organizations. When describing how they first learn about an
innovation, leaders in super/high-adopting clinics were more likely to indicate taking an active
approach to seeking out and looking up information. These leaders were also more likely to
describe accessing the research literature. One leader said, “We’ve been pretty proactive, and so
we have done some literature reviews…so some combination of proactive searching and seeing
opportunities that come up from public funders.” Leaders in low/non-adopting clinics were less
likely to describe proactive searches, instead suggesting a more passive approach to receiving
information. One leader in a non-adopting agency explicitly stated that they did not search out
information on their own: “Honestly, no. I just don’t have…No. I don’t have the time.”
In addition to describing actively searching for external information, leaders in
super/high-adopting clinics were more likely to endorse in-person conversations with peers and
colleagues during conferences and meetings as important sources of information about
innovations. One leader explained,
“There’s a Mental Health Director’s meeting. And then there’s the school-based mental
health support team meetings. Often conversations about…practice innovations come up
at those meetings.”
Finally, leaders in super/high-adopting clinics were more likely to describe collaborative
relationships with OMH, as opposed to simply receiving OMH communication indirectly, such as
through a listserv mailing. These leaders described mutually beneficial partnerships with OMH
and active roles in shaping OMH initiatives related to innovations, as reported by one leader,
“Well, certainly, [OMH] also discusses with us what it is they’re looking at in terms of
initiatives.”
Leaders in low/non-adopting clinics were less likely to describe active searches for
information or rely on collaborative relationships and in-person communication with peers for
information about innovations. In some cases, these leaders expressed feeling overwhelmed with
the task of processing current incoming information and likely lacked the capacity to seek out
further information or build connections. These leaders described ‘sifting through’ the material
they receive as a burden, rather than welcoming the new information. As one leader put it,
“There’s just too much information being sent every day and too many trainings and too many
things.”
Knowledge sharing mechanisms. Another theme that illustrated organizational
absorptive capacity related to internal processes for sharing or assimilating knowledge once it
entered the organization. Leaders in super/high-adopting clinics often described bottom-up
mechanisms for communicating and processing information about innovations. Sometimes these
mechanisms were in addition to top-down mechanisms. Leaders in super/high-adopting clinics
endorsed a vision of ‘learning together’ and democratic knowledge exchange, and shared
instances where staff clinicians distributed information about innovations up the organizational
hierarchy. For example, one leader reported,
“I mean, our clinicians, from time to time will also, either through their licensing bodies
or just through colleagues, they’ll bring potential training opportunities. Our psychiatrists
will, from time to time, bring training opportunities. So really, any of the clinic staff
could potentially run across different things that they feel would useful in part of their
practice and sort of bring those either to the treatment team as a whole or to me
specifically as a…you know, in an effort to see if that’s something we might be able to
support.”
In contrast, leaders of low/non-adopting clinics often focused on top-down approaches to
disseminating information throughout the organization. In some cases, the clinics had their own
training institutes that managed the dissemination of innovations within the agency. In others,
information was sent down from top administration, as in this leader’s example.
“And our health system actually tells us about new innovations that they’re working on.
And that’s the primary way that’s passed down to us… Usually, the Assistant Vice
President will call me and will call other clinic directors to let me know what we’re
working on. They’ll have a presentation for our clinic.”
Leaders in these organizations were less likely to endorse learning about innovations from their
staff, and a few explicitly reported that this did not happen.
“Interviewer: But is there someone, say a staff or a person who like brings up training
opportunities to your attention, and you will consider? Does that happen?
Agency Director: No.”
Organizational culture. Finally, leaders of super/high-adopting clinics described how
organizational values, norms, and goals that support evidence-based practice, quality
improvement, and innovation influence the acquisition and assimilation of information about
innovations. These leaders shared organizational and personal commitments to quality
improvement, as in this example, “You know, we have a commitment to, and I have a
commitment to integrate into our practices the best science that we can utilize within our
program.” Some leaders of super/high-adopting clinics stated that support for innovations is
ingrained into their clinic’s history, mission, and identity. One leader explained, “It was a huge
initiative for us about six years ago when we started kind of incorporating those [evidence-based]
principles into the way that we practice. So I would say that it’s part of a culture.” Leaders of
low/non-adopting clinics were less likely to incorporate references to their organization’s values,
goals, or culture into discussions of how they access and use information about innovations.
These leaders rarely spoke explicitly about unsupportive cultures, although one leader in a low-
adopting clinic offered a specific illustration of a culture that is resistant to certain clinical
innovations:
“Just so you know, I mean, at baseline this clinic is a pretty psychodynamic....That's sort
of our treatment of choice here. So most of the therapists here have been to psychoanalytic,
psychodynamic institute. So the innovations, as you'd say, usually come as
mandates from OMH. And then we are told that this is best practice and this what we
should be doing for our clients, though many people feel here that they are very well
trained in other ways. It's often seen as conflict.”
Discussion
This qualitative study examined mental health clinic leaders’ descriptions of how their
organizations first learn about clinical and business innovations. It explored which types of
external sources of information clinics access, and how intraorganizational processes influence
access to and use of information. Using training participation data from a recent statewide
innovation rollout, it also compared higher- versus lower- adopting clinics in terms of their
information acquisition.
Clinic leaders endorsed accessing information about innovations from a variety of
external sources, including the New York State Office of Mental Health, other government
agencies, professional associations, peer organizations, and research literature and universities.
As in other studies, OMH was the most frequently mentioned source, and universities were
mentioned least (Horwitz et al., 2014). There were no clear differences in the types of external
information sources endorsed by clinics with higher levels of adoption compared to those with
lower levels of adoption. However, there were differences in terms of organizational processes
that influenced how clinics access and use this external information.
Higher-adopting clinics were more likely than lower-adopting clinics to report that they
obtained information during in-person conversations with peers and colleagues at meetings and
conferences. They were also more likely to report current or previous collaborative relationships
with the state Office of Mental Health, the most salient external source of information in this
study. These findings are in line with previous empirical studies demonstrating that interpersonal
contacts and partnerships with external organizations are positively associated with innovation
adoption (Palinkas et al., 2011; Valente et al., 1996). In addition to being more collaborative and
relationally focused, leaders in higher-adopting clinics were more likely to report accessing
information as a result of active searching, rather than passive receiving, and were less likely to
report being overwhelmed by incoming information, corroborating research indicating that
information scanning is associated with absorptive capacity and innovation adoption (Knudsen &
Roman, 2004).
Higher-adopting clinics were more likely to have both top-down and bottom-up internal
processes for communicating information about innovations, in congruence with research noting
that two-way, mutual information exchange supports adoption (Rogers, 2003). Although the
current study focused on external sources of information, this finding aligns with others
indicating internal staff play important roles in providing information about innovations
(Horwitz et al., 2014). Finally, this study found that higher-adopting clinics have norms and
values that prioritize innovation, contributing to a large body of evidence indicating that
organizational culture and climate are critical for the success of innovation adoption and
implementation (Aarons & Sommerfeld, 2012; Ehrhart, Aarons, & Farahnak, 2014; Glisson et
al., 2008).
Practical Implications
Several findings have practical implications for improving adoption of innovations in
public mental health services. Taken together with other research, study results suggest that
interventions to develop organizational capacity may strengthen an organization’s ability to use
and apply information about innovations, and subsequently lead to higher rates of adoption
(Aarons, Ehrhart, Farahnak, & Sklar, 2014; Greenhalgh et al., 2004; Knudsen & Roman, 2004;
Wisdom et al., 2014). In public mental health clinics, strategies for building absorptive capacity
may include developing leadership skills, organizational channels for spreading knowledge, and
cultures that value capturing and sharing new knowledge (Greenhalgh et al., 2004). Given that
leadership has emerged as one of the most important influences on the organizational context,
investing in leadership training may be an effective way to increase organizational absorptive
capacity. Leaders can be trained in how to effectively search for, evaluate, and process
information about innovations. They can be encouraged to engage in networking opportunities at
coalition meetings, administrative meetings, and conferences in order to increase their exposure
to innovations. Training in implementation leadership skills or styles such as transformational
leadership may also subsequently improve organizational knowledge sharing and culture
(Aarons et al., 2014; Aarons et al., 2017; Richter et al., 2016).
Towards developing two-way communication channels, leaders should encourage
organizational staff to communicate information about innovations across hierarchical
boundaries rather than waiting for information or mandates related to innovations to trickle down
from top administrators. Leaders and supervisors can also encourage staff to share information
about innovations during clinical supervision, treatment team meetings, or staff meetings.
Internal trainings should include dedicated time for open communication about ideas and
innovations to facilitate knowledge transfer. To promote a culture that is open to adoption of
innovations, leaders may model the prioritization of innovations by allocating resources for
innovation adoption, sharing research evidence, and rewarding staff who promote use of
innovations (Aarons et al., 2014).
Findings also have practical implications for improving dissemination of information
about innovations in public mental health services. As previously mentioned, the New York
State Office of Mental Health was the most frequently accessed source of external information
about innovations, even among clinics that did not participate in its recent large-scale, low-cost
innovation rollout. Further, leaders reported accessing only two different external sources on
average. These findings offer preliminary evidence to suggest that disseminating information
about innovations through oversight agencies may offer the greatest return on investment in
terms of reaching the largest number of clinics. Study results also suggest that oversight agencies
and technical assistance centers aiming to increase adoption should work to develop
collaborative relationships with clinic leaders and support them in developing new networks and
partnerships. However, the configuration of each service system will dictate which oversight
agency is most salient (e.g., state or county), and further research is needed to determine if these
findings generalize to other service systems. Therefore, innovation developers should continue to
distribute information about their interventions and products through provider coalitions and at
national, state, and local professional associations. To assist overwhelmed clinic leaders in
quickly locating the information they are most interested in, individuals promoting innovations
may consider including information about innovation cost and implementation requirements,
perhaps via links or contact information for consultants.
Theoretical Implications
Study findings are in line with conceptual frameworks of factors influencing adoption
and implementation of innovations in human service organizations. Aarons et al.’s (2011)
Exploration, Preparation, Implementation, and Sustainment (EPIS) framework and Wisdom et
al.’s (2014) framework of adoption both include absorptive capacity, culture, leadership, and
social networks in their lists of factors that impact exploration and adoption of innovations. This
study corroborates and contributes to these frameworks by providing concrete examples of what
these factors look like in real-world settings from the perspectives of those providing services.
The authors of both frameworks as well as other researchers note that adoption has received
relatively little focus, so the current study’s findings help to increase understanding of this
critical first phase of the implementation process (Aarons et al., 2011; Wisdom et al., 2014; Chor
et al., 2015). In addition, they illustrate how factors in the internal organizational context (i.e.,
absorptive capacity) and external organizational context (i.e., external sources of information)
interact during adoption, in line with recent work indicating the importance of exploring
connections between the internal and external contexts for implementation science (Moullin,
Dickson, Stadnick, Rabin, & Aarons, 2019).
The organization and management literature offers theories, concepts, and lessons that are
useful for understanding how organizations interact with their environment during adoption
(Birken et al., 2017). However, adoption of innovations in human service organizations may
differ from innovation adoption in business due to the complex nature of the innovations, the
variability of the clients, and differences in missions, norms, and histories (Aarons et al., 2011;
Damanpour, 1991). This study suggests that the concept of absorptive capacity may be applied to
understanding the ability of human service organizations to access and use external information
about innovations. It extends previous work applying absorptive capacity to this context, some of
which relied on more restricted definitions of the concept, such as an organization’s level of
professionalism or previous experience with evidence-based practices and skills (Greenhalgh et
al., 2004; Knudsen & Roman, 2004; Schoenwald et al., 2008).
Limitations and Future Directions
There are several limitations to this study. The reliance on retrospective accounts of clinic
leaders introduces potential for social desirability bias, and restricting the sample to a single
service system limits generalizability of findings to other states and contexts. Using clinic
training participation data as a proxy for adoption offers advantages over simple yes-no adoption
measures but captures only one aspect of the adoption process. The relatively small sample size
prohibits use of inferential statistics or inclusion of variables to control for important differences
in operating contexts, such as county size, urbanicity, organizational structure, or available
resources (Wang, Saldana, Brown, & Chamberlain, 2010). Further, the lack of available
demographic information about clinic leaders limits comparison of differences between leaders
in higher- versus lower-adopting clinics. Finally, the cross-sectional design of this study means
that the directionality of the association between how a clinic accesses and uses external
information and its level of innovation adoption cannot be established. It may be that clinics with
greater absorptive capacities are more likely to adopt innovations, or that clinics with higher
levels of adoption develop greater absorptive capacities as a result of that adoption experience, or
some combination of both possibilities.
The limitations of this study suggest several areas for future investigation, such as
longitudinal designs, replication in other states and service systems, multi-indicator measures for
adoption, and mixed methods. Future studies are also needed to examine several gaps in this
study’s findings. This study focused on external sources of information and organizational
processes that impact acquisition and use of this information. However, it did not explicitly
examine how information is used in making adoption decisions, whether different sources
provide different types of information, or whether accessing information improves adoption.
Insight into these questions can support strategies to increase dissemination and adoption of
innovations.
Negative case analysis may offer a deeper understanding of study results. Approximately
one quarter of leaders did not name the state Office of Mental Health as a source of information,
and several stated that they did not perceive the Office of Mental Health as an important source.
Collecting further information about such cases may provide insights into how state or local
oversight agencies can improve their outreach and increase clinic buy-in. Similarly, given that
leaders rarely described negative organizational contexts such as unsupportive cultures, further
exploration may be needed to better inform strategies for increasing absorptive capacity. Finally,
application of social network theory and analysis can be used to build on this study’s results by
examining how social networks influence access to information that facilitates selection and
adoption of innovations (Palinkas et al., 2011; Valente et al., 2007).
Conclusion
Mental health service systems are constantly changing, and new clinical interventions,
quality improvement initiatives, and business approaches are continuously emerging (Hoagwood
et al., 2014). To facilitate implementation of these innovations and close the research-practice
gap, it is necessary to understand how clinics first learn about innovations for potential adoption.
Despite these limitations, this study offers a preliminary understanding of the sources
mental health organizations rely on for information about innovations, and how differences in
organizational absorptive capacity may influence how they use this information. Findings also
suggest opportunities for the development of implementation strategies to improve dissemination
of information about innovations and increase organizational absorptive capacity to acquire and
apply external information.
References
Aarons, G. A., Ehrhart, M. G., Farahnak, L. R., & Sklar, M. (2014). Aligning leadership across
systems and organizations to develop a strategic climate for evidence-based practice
implementation. Annual Review of Public Health, 35, 255-274.
Aarons, G. A., Ehrhart, M. G., Moullin, J. C., Torres, E. M., & Green, A. E. (2017). Testing the
leadership and organizational change for implementation (LOCI) intervention in
substance abuse treatment: a cluster randomized trial study protocol. Implementation
Science, 12, 29-39.
Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of
evidence-based practice implementation in public service sectors. Administration and
Policy in Mental Health and Mental Health Services Research, 38, 4-23.
Aarons, G. A., & Sommerfeld, D. H. (2012). Leadership, innovation climate, and attitudes
toward evidence-based practice during a statewide implementation. Journal of the
American Academy of Child & Adolescent Psychiatry, 51, 423-431.
Balas, E. A., & Boren, S. A. (2000). Managing clinical knowledge for health care
improvement. In Bemmel J., & McCray, A. T. (Eds.), Yearbook of medical informatics
2000: Patient-centered systems (pp. 65-70). Stuttgart, Germany: Schattauer.
Bickman, L., Lyon, A. R., & Wolpert, M. (2016). Achieving precision mental health through
effective assessment, monitoring, and feedback processes: Introduction to the special
issue. Administration and Policy in Mental Health and Mental Health Services Research,
43, 271-276.
Birken, S. A., Bunger, A. C., Powell, B. J., Turner, K., Clary, A. S., Klaman, S. L., …, &
Weiner, B. J. (2017). Organization theory for dissemination and implementation research.
Implementation Science, 12, 62-74.
Bradley, E. H., Webster, T. R., Baker, D., Schlesinger, M., Inouye, S. K., Barth, M. C., ... &
Koren, M. J. (2004). Translating research into practice: Speeding the adoption of
innovative health care programs. Commonwealth Fund Issue Briefs, 724, 1-13.
Bruns, E. J., & Hoagwood, K. E. (2008). State implementation of evidence-based practice for
youths, part I. Responses to the state of the evidence. Journal of the American Academy
of Child and Adolescent Psychiatry, 47, 369–373.
Charmaz, K. (2014). Constructing grounded theory: A practical guide through qualitative
analysis (2nd ed.). Thousand Oaks, CA: Sage Publications, Inc.
Chor, K. H. B., Olin, S. C. S., Weaver, J., Cleek, A. F., McKay, M. M., Hoagwood, K. E., &
Horwitz, S. M. (2014). Adoption of clinical and business trainings by child mental health
clinics in New York State. Psychiatric Services, 65(12), 1439-1444.
Chor, K. H. B., Wisdom, J. P., Olin, S. C. S., Hoagwood, K. E., & Horwitz, S. M. (2015).
Measures for predictors of innovation adoption. Administration and Policy in Mental
Health and Mental Health Services Research, 42(5), 545-573.
Chorpita, B. F., Daleiden, E. L., Ebesutani, C., Young, J., Becker, K. D., Nakamura, B. J., …
Starace, N. (2011). Evidence-based treatments for children and adolescents: An updated
review of indicators of efficacy and effectiveness. Clinical Psychology Science and
Practice, 18(2), 154–172.
Cohen, W. M., & Levinthal, D. A. (1990). Absorptive capacity: A new perspective on learning
and innovation. Administrative Science Quarterly, 35(1), 128-152.
Crabtree, B. F., & Miller, W. L. (Eds.). (1999). Doing qualitative research. Thousand Oaks, CA:
Sage Publications, Inc.
Damanpour, F. (1991). Organizational innovation: A meta-analysis of effects of determinants
and moderators. Academy of Management Journal, 34(3), 555-590.
Ehrhart, M. G., Aarons, G. A., & Farahnak, L. R. (2014). Assessing the organizational context
for EBP implementation: the development and validity testing of the Implementation
Climate Scale (ICS). Implementation Science, 9, 157-167.
Feldstein, A. C., & Glasgow, R. E. (2008). A Practical, Robust, Implementation and
Sustainability Model (PRISM) for integrating research findings into practice. The Joint
Commission Journal on Quality and Patient Safety, 34(4), 228-243.
Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for
qualitative research. New York: Aldine de Gruyter.
Glisson, C., Landsverk, J., Schoenwald, S., Kelleher, K., Hoagwood, K. E., Mayberg, S., ... &
Research Network on Youth Mental Health. (2008). Assessing the organizational social
context (OSC) of mental health services: Implications for research and
practice. Administration and Policy in Mental Health and Mental Health Services
Research, 35, 98-113.
Gleacher, A. A., Olin, S. S., Nadeem, E., Pollock, M., Ringle, V., Bickman, L., ... & Hoagwood,
K. (2016). Implementing a measurement feedback system in community mental health
clinics: A case study of multilevel barriers and facilitators. Administration and Policy in
Mental Health and Mental Health Services Research, 43(3), 426-440.
Glisson, C., & Schoenwald, S. K. (2005). The ARC organizational and community intervention
strategy for implementing evidence-based children's mental health treatments. Mental
Health Services Research, 7(4), 243-259.
Green, L. W. (2008). Making research relevant: If it is an evidence-based practice, where's the
practice-based evidence? Family Practice, 25(supp. 1), i20-i24.
Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of
innovations in service organizations: Systematic review and recommendations. Milbank
Quarterly, 82(4), 581–629.
Hoagwood, K. E., Olin, S. S., Horwitz, S., McKay, M., Cleek, A., Gleacher, A., ... & Kuppinger,
A. (2014). Scaling up evidence-based practices for children and families in New York
State: toward evidence-based policies on implementation for state mental health
systems. Journal of Clinical Child & Adolescent Psychology, 43(2), 145-157.
Horwitz, S. M., Hurlburt, M. S., Goldhaber-Fiebert, J. D., Palinkas, L. A., Rolls-Reutz, J.,
Zhang, J., ... & Landsverk, J. (2014). Exploration and adoption of evidence-based
practice by U.S. child welfare agencies. Children and Youth Services Review, 39, 147-
152.
Kazdin, A. E., & Weisz, J. R. (Eds.). (2003). Evidence-based psychotherapies for children and
adolescents. New York, NY: Guilford Press.
Kendall, P. C., Hudson, J. L., Gosch, E., Flannery-Schroeder, E., & Suveg, C. (2008). Cognitive-
behavioral therapy for anxiety disordered youth: A randomized clinical trial evaluating
child and family modalities. Journal of Consulting and Clinical Psychology, 76(2), 282-
297.
Knudsen, H. K., & Roman, P. M. (2004). Modeling the use of innovations in private treatment
organizations: The role of absorptive capacity. Journal of Substance Abuse
Treatment, 26(1), 51-59.
Mendel, P., Meredith, L. S., Schoenbaum, M., Sherbourne, C. D., & Wells, K. B. (2008).
Interventions in organizational and community context: A framework for building
evidence on dissemination and implementation in health services
research. Administration and Policy in Mental Health and Mental Health Services
Research, 35, 21-37.
Moullin, J. C., Dickson, K. S., Stadnick, N. A., Rabin, B., & Aarons, G. A. (2019). Systematic
review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework.
Implementation Science, 14, 1-16.
Padgett, D. K. (2012). Qualitative and mixed methods in public health. Thousand Oaks, CA:
Sage Publications, Inc.
Palinkas, L. A., Holloway, I. W., Rice, E., Fuentes, D., Wu, Q., & Chamberlain, P. (2011).
Social networks and implementation of evidence-based practices in public youth-serving
systems: A mixed-methods study. Implementation Science, 6, 113-123.
Palinkas, L. A., Saldana, L., Chou, C. P., & Chamberlain, P. (2017). Use of research evidence
and implementation of evidence-based practices in youth-serving systems. Children and
Youth Services Review, 83, 242-247.
Palinkas, L. A., Wu, Q., Fuentes, D., Finno-Velasquez, M., Holloway, I. W., Garcia, A., &
Chamberlain, P. (2015). Innovation and the use of research evidence in youth-serving
systems: A mixed-methods study. Child Welfare, 94(2), 57-85.
Panzano, P. C., & Roth, D. (2006). The decision to adopt evidence-based and other innovative
mental health practices: Risky business? Psychiatric Services, 57(8), 1153-1161.
Richter, A., von Thiele Schwarz, U., Lornudd, C., Lundmark, R., Mosson, R., & Hasson, H.
(2016). iLead—a transformational leadership intervention to train healthcare managers’
implementation leadership. Implementation Science, 11, 108-124.
Rogers, E. M. (2003). Diffusion of innovations. (5th ed.) New York, NY: Free Press.
Schoenwald, S. K., Chapman, J. E., Kelleher, K., Hoagwood, K. E., Landsverk, J., Stevens, J., ...
& Research Network on Youth Mental Health. (2008). A survey of the infrastructure for
children’s mental health services: Implications for the implementation of empirically
supported treatments (ESTs). Administration and Policy in Mental Health and Mental
Health Services Research, 35(1-2), 84-97.
Silverman, W. K., & Hinshaw, S. P. (2008). The second special issue on evidence-based
psychosocial treatments for children and adolescents: A 10-year update. Journal of
Clinical Child & Adolescent Psychology, 37, 1-7.
Sofronoff, K., & Farbotko, M. (2002). The effectiveness of parent management training to
increase self-efficacy in parents of children with Asperger syndrome. Autism, 6(3), 271-
286.
Solomons, N. M., & Spross, J. A. (2011). Evidence‐based practice barriers and facilitators from a
continuous quality improvement perspective: an integrative review. Journal of Nursing
Management, 19, 109-120.
Strauss, A. L., & Corbin, J. (1998). Basics of qualitative research (2nd ed.). Thousand Oaks, CA:
Sage Publications, Inc.
Valente, T. W., Chou, C. P., & Pentz, M. A. (2007). Community coalitions as a system: Effects
of network change on adoption of evidence-based substance abuse prevention. American
Journal of Public Health, 97(5), 880-886.
Valente, T. W., & Davis, R. L. (1999). Accelerating the diffusion of innovations using opinion
leaders. The Annals of the American Academy of Political and Social Science, 566(1), 55-
67.
Viera, A. J., & Garrett, J. M. (2005). Understanding interobserver agreement: The kappa
statistic. Family Medicine, 37(5), 360-363.
Wang, W., Saldana, L., Brown, C. H., & Chamberlain, P. (2010). Factors that influenced county
system leaders to implement an evidence-based program: A baseline survey within a
randomized controlled trial. Implementation Science, 5, 72-79.
Willms, D. G., Best, J. A., Taylor, D. W., Gilbert, J. R., Wilson, D., Lindsay, E. A., & Singer, J.
(1990). A systematic approach for using qualitative methods in primary prevention
research. Medical Anthropology Quarterly, 4(4), 391-409.
Wisdom, J. P., Chor, K. H. B., Hoagwood, K. E., & Horwitz, S. M. (2014). Innovation adoption:
a review of theories and constructs. Administration and Policy in Mental Health and
Mental Health Services Research, 41(4), 480-502.
Zahra, S. A., & George, G. (2002). Absorptive capacity: A review, reconceptualization, and
extension. Academy of Management Review, 27(2), 185-203.
Table 2.1
Sample Clinic Characteristics
Characteristic Percent of clinics
Level of adoption
Super 9
High 29
Medium 17
Low 14
None 31
Region
Central 11
Hudson Valley 14
Long Island 3
NYC 60
Western 11
Annual expenses*
< $1 million 12
$1 - $5 million 52
$5 - $10 million 18
> $10 million 18
Gain or loss per service unit*
< -$50 21
-$50 - $0 48
> $0 30
Percentage of youth clients
< 25 46
25 – 50 31
50 – 75 6
75 – 100 17
*Data for these variables were unavailable for two clinics.
Table 2.2
Percent of Clinic Leaders Endorsing External Sources of Information about Innovations
Source (example) | Overall | Super/high | Medium | Low/none
Office of Mental Health (Center for Practice Innovations) | 75 | 72 | 63 | 81
Peer organization/network (New York City Coalition for behavioral health agencies) | 34 | 24 | 45 | 39
Other government (Office of Alcoholism and Substance Abuse Services) | 28 | 28 | 54 | 19
Professional association (National Council for Behavioral Health) | 25 | 20 | 45 | 23
Literature (journals, databases) | 28 | 40 | 36 | 16
University (New York University) | 10 | 4 | 18 | 13
Email (email blasts, listservs) | 54 | 48 | 55 | 54
Meetings (extraorganizational, in-person meetings) | 17 | 24 | 27 | 13
Mean number of sources | 2.0 | 1.9 | 2.6 | 1.8
Note. Super/high, Medium, and Low/none refer to clinic level of adoption.
Chapter Three: Study Two
Configurations of Clinic Characteristics and Leader-Reported Drivers Associated with
Innovation Adoption in Children’s Mental Health Services:
A Qualitative Comparative Analysis
Abstract
Introduction
Understanding the factors that contribute to clinic adoption or non-adoption of
innovations during large scale rollouts can inform future efforts to disseminate clinical and
business innovations in the children’s public mental health system. Current frameworks
characterize adoption as grounded in complex causality and equifinality; however, few studies
have examined adoption using methods that align with these assumptions. This study examined
combinations of clinic leader-reported drivers of adoption decisions and clinic characteristics
associated with adoption of clinical and business innovations following the founding of the
Community Technical Assistance Center in New York State.
Methods
This study used a mixed-methods design and qualitative comparative analysis (QCA), a
method that represents a blend of qualitative, case oriented research and quantitative, variable
oriented research. Qualitative interviews with 63 leaders of 32 clinics were analyzed to identify
drivers of CTAC training adoption or non-adoption. The top three drivers (organizational culture,
innovation relevance, lack of organizational capacity) were coded as binary variables and
merged with clinic administrative data for QCA. Clinic characteristics included size, fiscal
efficiency, proportion of youth clients, and region. Clinic adoption was measured using training
participation data. QCA analyses were conducted for 1) adoption of clinical innovations; 2)
adoption of business innovations; and 3) non-adoption.
Results
There were no necessary conditions for adoption of clinical or business innovations. The
solution for adoption of clinical innovations included four configurations based on combinations
of receptive organizational culture, perceived innovation relevance, high proportion of youth
clients, and small organizational size. The solution for adoption of business innovations included
three configurations based on combinations of receptive organizational culture, perceived
innovation relevance, fiscal efficiency, and small organizational size. The solution for non-
adoption consisted of lack of organizational capacity and downstate region.
Discussion
Results indicate that there are multiple configurations associated with adoption, rather
than a single path. Drivers related to leader perceptions of organizational culture and innovation
relevance were involved in all of them, suggesting that targeting provider values, attitudes, and
beliefs related to innovations may improve adoption. This study demonstrates the feasibility and
benefits of using configurational approaches such as QCA to examine adoption in mental health
clinics.
Configurations of Clinic Characteristics and Leader-Reported Drivers Associated with
Innovation Adoption in Children’s Mental Health Services:
A Qualitative Comparative Analysis
Evidence-based clinical and business innovations are underutilized in children’s mental
health services (Aos et al., 2006; Chorpita et al., 2011; Hoagwood et al., 2014). The delay in
moving innovations from research to practice has especially serious implications in public sector
mental health organizations (Balas & Boren, 2000; Green, 2008). Children, adolescents, and
families served by these organizations have lower access to services and greater risk for adverse
outcomes if effective treatments are not available (Chow, Jaffee, & Snowden, 2003; Kataoka,
Zhang, & Wells, 2002).
New York is one of the leading states in efforts to disseminate and implement evidence-
based practices within the children’s public mental health system (Gleacher et al., 2011; Olin et
al., 2015). In 2011, the State Office of Mental Health initiated a rollout of innovative clinical and
business practices through the Community Technical Assistance Center (CTAC). The CTAC
provides free training and technical assistance on a voluntary basis to all publicly funded mental
health clinics serving children, adolescents, and their families. Although the CTAC’s offerings
have since evolved, at the time of the rollout its clinical trainings included topics such as
cognitive-behavior therapy, strengthening families, and motivational interviewing. Business
trainings included topics such as collaborative documentation, quality assurance and risk
management, and financial self-assessment (Chor et al., 2014). Evaluations of this rollout
revealed variation in clinic adoption behavior, finding that over one third of eligible clinics did
not participate in any CTAC trainings (Chor et al., 2014).
Understanding the factors that contribute to clinic adoption or non-adoption of
innovations during rollouts can inform future efforts to disseminate innovations. Wisdom et al.’s
(2014) theory of innovation adoption illustrates that adoption is a complex, dynamic process
involving multiple factors at different levels of the organizational context. Some of these factors
include structural clinic characteristics such as operational size, fiscal efficiency, client
population, and urbanicity. Others are related to provider perceptions, norms and values, such as
organizational culture and innovation-organization fit. Although empirical research focusing on
the adoption phase of implementation is limited, studies have begun to examine how these
factors are associated with innovation adoption in real-world behavioral health settings (Chor,
Wisdom, Olin, Hoagwood, & Horwitz, 2015).
Olin and colleagues (2015) examined the association between structural clinic
characteristics and clinic participation in CTAC trainings following the New York State rollout.
They found that clinics with higher proportions of clinical staff and youth clients were more
likely to participate in clinical trainings, while clinics with larger operational size and higher
levels of fiscal efficiency and outsourcing were less likely to participate in business trainings.
These findings suggest that clinics may make adoption decisions based on how well an
innovation fits with clinic, provider, and client characteristics. Qualitative data from service
providers support this view, indicating that perceived costs and benefits, organizational capacity,
organizational culture, and acceptability of new practices influence clinic adoption decisions
(Palinkas et al., 2017; Seffrin et al., 2009; Zazzali et al., 2008). However, additional research that
combines both provider interviews and clinic characteristics is needed in order to empirically
explore how configurations of provider-reported drivers of adoption and clinic characteristics
predict adoption. Methods grounded in causal complexity and equifinality that can assess how
various conditions lead to adoption align with Wisdom et al.’s (2014) multifactorial framework
and can offer insights that complement findings from regression-based and qualitative studies
(Kane, Lewis, Williams, & Kahwati, 2014).
The current study examined combinations of clinic leader-reported drivers of adoption
decisions and clinic characteristics associated with adoption of clinical and business innovations
in 32 clinics following the founding of the CTAC in New York State. This study used a mixed-
methods design and qualitative comparative analysis (QCA), a method that represents a blend of
qualitative, case oriented research and quantitative, variable oriented research. Results
suggesting clinic-level configurations associated with adoption can inform development of
implementation strategies and identification of organizations for targeted adoption support
(Schneider & Wagemann, 2010).
Methods
This study used a mixed methods design in which both qualitative and quantitative data
were used to answer the research questions (Palinkas et al., 2011). Qualitative interviews with
clinic leaders were analyzed to identify drivers of CTAC training adoption or non-adoption. The
top three drivers were ‘quantitized’ (i.e. coded as binary variables) and merged with quantitative
clinic characteristics from administrative data for QCA (Sandelowski, Voils, & Knafl, 2009).
QCA analyses were conducted for adoption of 1) clinical innovations and 2) business
innovations.
Study Sample and Context
The study sample consisted of 32 clinics licensed by the New York State Office of
Mental Health to treat children, adolescents, and their families. Clinics were selected based on
their level of adoption of CTAC trainings during 2011-2013 to represent a stratified sample of all
346 clinics in the state children’s mental health system. At the time of data collection in the
present study, the CTAC offered 33 trainings via webinars, in-person seminars, and learning
collaboratives. Eighteen of these trainings targeted evidence-based clinical practices, twelve
trainings targeted improvement of business practices, and three hybrid trainings targeted
improvement of both clinical and business practices. Chor and colleagues (2014) provide more
detail about trainings and classification of clinic adoption levels. For the purposes of this study,
two clinics were dropped from the original sample of 34 because administrative data for them
was not available.
Qualitative Data
Participants. Participants included clinic CEOs or Vice Presidents (n = 33) and Program
Directors (n = 30). In most clinics (29 out of 32), two interviews were conducted: one with an
upper level administrator and one with a middle level manager. Within two clinics, only one
interview was conducted due to simpler organizational structures, and within one clinic a third
interview was conducted with an additional key decision-maker. Informed consent was obtained
from all participants, and study procedures were approved by the Institutional Review Boards of
the University of Southern California and New York University.
Interviews and coding. Qualitative interviews were conducted with clinic leaders in
2013-2014 via hour-long, semi-structured interviews that were digitally recorded and transcribed
(Palinkas et al., 2017). Leaders were asked, “Has your agency used the technical assistance
provided by the CTAC? Why, or why not?"
Interview transcripts were coded in Dedoose Version 8.0.35 (2018) using the thematic
content analysis method of “Coding Consensus, Co-occurrence, and Comparison” (Willms et al.,
1990). The first author developed an initial codebook using open coding to identify broad themes
and patterns, and then used focused/axial coding to explore these themes more deeply (Strauss &
Corbin, 1998). When excerpts aligned with multiple themes, all relevant codes were assigned. A
graduate-level researcher co-coded 50% of transcripts selected at random. The co-coders created
detailed memos describing codes and linking them to excerpts in the data, and discussed
discrepancies until consensus was reached.
Selecting causal conditions. After qualitative coding was complete, transcripts were
assigned binary values indicating whether the leader endorsed each driver of adoption in the
codebook (1 = leader endorsed the driver; 0 = leader did not endorse the driver). These binary
values were then aggregated to the clinic level (1 = at least one leader in the clinic endorsed the
driver; 0 = none of the leaders in the clinic endorsed the driver). Three of the drivers met criteria
for inclusion as causal conditions in QCA (i.e. they were endorsed by at least one third of clinics)
(Rihoux & Ragin, 2009): limited organizational capacity (lack of time, staff, or resources),
organizational culture (openness and receptivity to innovations), and innovation relevance (fit with
the topical educational needs of the clinic).
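The coding and aggregation steps described above can be sketched as follows. This is an illustrative sketch only: the clinic IDs, driver labels, and binary values are hypothetical, not drawn from the study data.

```python
# Hypothetical sketch of aggregating leader-level driver endorsements (0/1)
# to the clinic level: a clinic scores 1 if ANY of its leaders endorsed the
# driver (a logical OR across interviews within a clinic).
from collections import defaultdict

# One record per leader interview: (clinic_id, {driver: 0 or 1})
leader_codes = [
    ("clinic_01", {"capacity": 1, "culture": 0, "relevance": 1}),
    ("clinic_01", {"capacity": 0, "culture": 1, "relevance": 0}),
    ("clinic_02", {"capacity": 0, "culture": 0, "relevance": 1}),
]

clinic_codes = defaultdict(lambda: defaultdict(int))
for clinic, codes in leader_codes:
    for driver, endorsed in codes.items():
        # max of binary values implements the logical OR across leaders
        clinic_codes[clinic][driver] = max(clinic_codes[clinic][driver], endorsed)

print(dict(clinic_codes["clinic_01"]))  # {'capacity': 1, 'culture': 1, 'relevance': 1}
```

In this sketch, clinic_01 is coded 1 on all three drivers because at least one of its two leaders endorsed each, even though no single leader endorsed all of them.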
Quantitative Data
Outcomes. Sending staff for training in an innovation represents an early clinic adoption
behavior. Therefore, training participation may be considered a proxy measure for adoption
(Olin et al., 2015). For the current study, clinic adoption was operationalized as any participation
in CTAC trainings between September 2011 and August 2013. Binary codes were created to
indicate adoption of 1) clinical innovations; and 2) business innovations.
Causal conditions. Additional causal conditions were selected from administrative data
based on Wisdom et al.’s (2014) conceptual framework of adoption, results of Olin et al.’s
(2015) analysis, and iterative examination of truth tables during analysis. These conditions were
measured at the clinic level and included annual expenses (proxy for clinic size); financial gain
or loss per service unit/client encounter (proxy for clinic efficiency); proportion of youth clients;
and region.
Information about annual expenses and gain or loss per service unit came from the 2011
New York State Office of Mental Health directory of licensed clinics, and proportion of youth
clients was extracted from the 2011 New York State Office of Mental Health Patient
Characteristics Survey. Since these conditions were measured on interval scales, they were
calibrated into fuzzy sets using the calibration function of fsQCA as recommended by Rihoux &
Ragin (2009). Benchmarks were specified for full membership, non-membership, and the cross-
over point in each fuzzy set condition as shown in Table 3.1. Clinic regions were extracted from
the U.S. Department of Health and Human Services Area Health Resources Files. Clinics were
coded as downstate (1= New York City or Long Island) or upstate (0 = Hudson, Western, or
Central). The set coincidence for each pairing of causal conditions was examined to determine if
any conditions should be merged. All values were less than 0.5, so individual conditions were
retained. See Appendix B for a list of cases, outcomes, and conditions.
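As a rough illustration of the direct calibration procedure, the following sketch applies the log-odds transformation underlying fsQCA's calibrate function (Ragin, 2008). The benchmark values shown are hypothetical, chosen only for illustration; the study's actual anchors appear in Table 3.1.

```python
import math

def calibrate(x, full_non, crossover, full_mem):
    """Transform an interval-scale value into a fuzzy-set membership score.
    full_non maps to ~0.05, crossover to 0.5, and full_mem to ~0.95,
    following the log-odds metric used in fsQCA's direct calibration."""
    if x >= crossover:
        log_odds = 3.0 * (x - crossover) / (full_mem - crossover)
    else:
        log_odds = 3.0 * (x - crossover) / (crossover - full_non)
    return 1.0 / (1.0 + math.exp(-log_odds))

# Illustrative anchors for a hypothetical "large clinic" set based on annual
# expenses in $ millions: non-membership at 1, crossover at 5, membership at 10.
print(round(calibrate(5.0, 1, 5, 10), 2))   # 0.5  (cross-over point)
print(round(calibrate(10.0, 1, 5, 10), 2))  # 0.95 (full membership threshold)
```

Values above the cross-over point are thus "more in than out" of the set, and scores flatten toward 0 and 1 beyond the two outer benchmarks.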
Qualitative Comparative Analysis. QCA involves both qualitative, iterative dialogue
with the data and quantitative calculations to assess cross-case patterns using set theoretic
methods. It is ideal for research designs involving small and intermediate numbers of cases (n =
10-50) (Ragin, 1987, 1999, 2008). In this study, each clinic represents a case (n = 32), the causal
conditions are those derived from qualitative analysis and administrative data as described
above, and the outcomes of interest are 1) adoption of clinical innovations; 2) adoption of
business innovations.
QCA uses Boolean algebra to identify different logical combinations of conditions that
might be necessary or sufficient to produce the outcome. Necessary conditions are always (or
almost always) present when the outcome occurs, and the outcome cannot occur without them.
For sufficient conditions, the outcome always (or almost always) occurs when the condition is
present, but the outcome may result from other conditions as well (Rihoux & Ragin, 2009).
Analyses were conducted using fs/QCA software (Ragin & Davey, 2016). First, the
potential necessary conditions for each outcome were assessed. Second, truth tables were
constructed to examine sufficient conditions. Truth tables list all logically possible
configurations of conditions, with each case assigned to the row in which it has highest
membership (see Appendices C and D). Contradictory configurations, or configurations in which
some cases have an outcome value of 1 (adoption) and others have an outcome of 0 (non-adoption), were resolved where possible as outlined by Rihoux & Ragin (2009). For example,
driver codes from qualitative interviews were dropped in three cases because leaders appeared to
provide unreliable information regarding whether or not their clinic had accessed CTAC
trainings. Two other contradictions were resolved by comparing the co-coded transcripts.
Third, the truth tables were analyzed and results were evaluated using goodness of fit
measures of consistency, or the degree to which cases agree on displaying a certain outcome, and
coverage, or the degree to which membership in the causal recipes accounts for the outcome
(Ragin, 2006). A consistency cutoff of .9 was used for identifying necessary conditions, and .75
to retain configurations of sufficient conditions. fsQCA produces three solutions: complex (based
only on observed configurations), intermediate (additionally incorporating the most plausible
unobserved configurations, or easy counterfactuals), and parsimonious (incorporating any
unobserved configuration, easy or difficult, that yields the solution with the fewest conditions).
Intermediate solutions were selected for interpretation in this paper, in line with Rihoux &
Ragin’s (2009) recommendations. Solutions were assessed for their connections to individual
cases and to theory on innovation adoption. Finally, QCA was repeated for the negation of each
outcome (i.e. non-adoption) since set theoretical analysis is not symmetrical and the conditions
that lead to adoption may not be the same ones that are associated with non-adoption. Further
detail about QCA methods and procedures may be found in Ragin & Davey (2016), Rihoux &
Ragin (2009), and Schneider & Wagemann (2012).
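The consistency and coverage measures described above reduce to simple set-theoretic ratios. A minimal sketch, using hypothetical fuzzy membership scores rather than the study's data:

```python
def consistency(x, y):
    """Sufficiency consistency: extent to which membership in recipe X is a
    subset of membership in outcome Y (Ragin, 2006)."""
    return sum(min(a, b) for a, b in zip(x, y)) / sum(x)

def coverage(x, y):
    """Coverage: extent to which recipe X accounts for instances of Y.
    (For necessity, the same ratios are computed with X and Y swapped.)"""
    return sum(min(a, b) for a, b in zip(x, y)) / sum(y)

# Hypothetical memberships for five clinics in a candidate recipe X
# and in the outcome set Y (adoption).
X = [0.9, 0.7, 0.5, 0.8, 0.1]
Y = [1.0, 0.8, 0.4, 0.9, 0.3]

print(round(consistency(X, Y), 2))  # 0.97
print(round(coverage(X, Y), 2))     # 0.85
```

In this hypothetical example, the recipe would clear the .75 sufficiency cutoff used in the study, and its coverage indicates that it accounts for most, but not all, membership in the outcome.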
Results
Drivers of Adoption Decisions
Leader-reported drivers of clinic adoption decisions aligned with external system,
organization, and innovation levels. External system factors included financial incentives and
policy mandates. Leaders of two clinics reported that a lack of external incentives or mandates
influenced their clinic’s decisions not to adopt CTAC innovations. Limited organizational
capacity, which included available time, staff, and/or money, was the most common driver
reported by leaders of non-adopting clinics.
Organizational culture, or a clinic’s openness and receptivity to innovations, was
endorsed by over half of the leaders in adopting clinics. The most common driver reported by
leaders in adopting clinics was innovation relevance, or fit with the topical educational needs of
the clinic. Finally, the feasibility of an innovation, or its fit with clinic needs in terms of cost
efficacy or ease of training participation, was a driver in adopting clinics. Table 3.2 presents
illustrative quotations for each driver.
For inclusion in QCA, conditions must vary across cases, here operationalized as endorsement by at least one third of clinics (Rihoux & Ragin, 2009).
After calculating the proportion of clinics with leaders that endorsed each driver as described in
the methods section and depicted in Table 3.2, three drivers—organizational capacity,
organizational culture, and innovation relevance—met this criterion. Culture and relevance were
selected as conditions for adoption, and limited capacity was selected as a condition for non-
adoption.
Qualitative Comparative Analysis
Adoption of clinical innovations. Analyses did not reveal any necessary conditions for
adoption of clinical innovations. Sufficient conditions are presented in Table 3.3, with each line
representing a different configuration of leader-reported drivers and clinic characteristics that
shares membership with adoption. These configurations are presented as logical statements in
which the operator ‘*’ represents the Boolean ‘AND’; ‘~’ represents ‘NOT’; and ‘+’ represents
‘OR’.
The solution coverage value indicates that approximately 73% of clinics that adopted
clinical innovations are included in at least one of the four configurations. The solution
consistency value of .97 indicates that cases in these configurations had a high level of
agreement in displaying the outcome of adoption. The raw coverage for each configuration
indicates the percentage of adopting cases that follow that configuration, and the unique
coverage represents the percentage of adopting cases that follow the configuration exclusively.
For example, the first configuration includes 47% of adopting clinics, and uniquely represents
22% of adopting clinics.
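Under fuzzy-set logic, membership in a configuration such as culture * (relevance + youth) is computed with the minimum for AND, the maximum for OR, and 1 - x for NOT. A small sketch with hypothetical membership scores (not the study's data):

```python
# Fuzzy-set operators: AND (*) = minimum, OR (+) = maximum, NOT (~) = 1 - x.
def f_and(*scores):
    return min(scores)

def f_or(*scores):
    return max(scores)

def f_not(score):
    return 1.0 - score

# One clinic's hypothetical memberships in each condition set.
culture, relevance, youth, small = 1.0, 0.0, 0.8, 0.6

# Membership in the recipe culture * (relevance + youth):
print(f_and(culture, f_or(relevance, youth)))  # 0.8

# Membership in a recipe with a negated condition, e.g. culture * ~small:
print(f_and(culture, f_not(small)))
```

A clinic's solution membership is then the maximum of its memberships across the configurations joined by '+' in the solution.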
Clinics that endorsed culture as a driver of the adoption decision AND 1) perceived the
innovation as relevant OR 2) had a high proportion of youth clients adopted clinical innovations.
Clinics that were smaller AND 1) endorsed culture OR 2) perceived the innovation as relevant
AND had a high proportion of youth clients were also adopters. Table 3.3 includes one example
quotation per configuration in which a clinic leader explains why their clinic adopted CTAC
innovations.
Adoption of business innovations. Similar to adoption of clinical innovations, there
were no necessary conditions for adoption of business innovations. Configurations of sufficient
conditions and example quotations are presented in Table 3.4. Together, the three configurations
covered 81% of clinics, with a high consistency value of .95. Clinics that perceived innovations
as relevant AND were 1) not inefficient OR 2) smaller clinics adopted business innovations.
Clinics that were inefficient AND endorsed culture as a driver were also adopters.
Non-adoption. Consistent with the principle of non-symmetry, the causal conditions for
non-adoption were not simply the converse of those for adoption (Rihoux & Ragin, 2009).
Instead, the solution for non-adoption of both clinical and business innovations consisted of the
combination of downstate location AND leader-endorsed capacity issues as a driver of the
decision not to adopt. For non-adoption of clinical innovations, solution coverage was 69% and
consistency was 1. For non-adoption of business innovations, coverage was 70% and consistency
was 0.78. Downstate region was a necessary condition for non-adoption of business innovations,
with 41% coverage and 0.90 consistency.
Discussion
This study explored how combinations of factors from administrative data and leader
interviews were associated with clinic-level adoption behavior during a statewide innovation
rollout. QCA revealed configurations of clinic characteristics and leader-endorsed drivers of
adoption decisions associated with clinical and business innovation adoption. Results offer
preliminary support for the use of configurational approaches in implementation science, and
practical implications for future rollouts.
Sufficient conditions for adoption of both clinical and business innovations included
cultures that supported innovations, topical relevance of the innovation, and small organizational
size. In addition, high proportion of youth clients was a sufficient condition for adoption of
clinical innovations, while fiscal inefficiency was a sufficient condition for adoption of business
innovations. Although individual conditions may be discussed in conjunction with certain
interpretations or previous literature, it is important to note that all QCA results pertain to
relevant combinations of conditions, rather than isolated impacts of individual conditions.
Culture was a sufficient condition in three of the four configurations for adoption of
clinical innovations (culture * (relevance + youth + small)), with culture * relevance having the
highest membership. In qualitative interviews, leaders of these clinics reported that a
combination of their agency’s openness to innovations and interest in the training content drove
the adoption decision. As one leader explained, “[CTAC is] headed in the same direction we are,
but it isn't necessarily that we need them to give us a jump start on this because we're interested
in this. We do it even if they weren't.”
Leaders did not explicitly speak about the characteristics of their client population as a
driver of their clinic’s adoption decisions. However, given that most of the CTAC’s clinical
innovations targeted children and families, it seems possible that the culture * youth
configuration was similarly driven by the fit between training topics and clinic needs. In other
words, adoption in these clinics may have been the product of a combination of openness to
innovations and a clinical population that aligned with CTAC’s clinical training content.
Smaller operational size was a sufficient condition in two configurations for adoption of
clinical innovations (culture * small; relevance * youth * small) and in one configuration for
business innovations (relevance * small). Although leaders did not generally mention their
clinic’s size when discussing their adoption decisions, it may be that smaller clinics have more
flexibility to adopt innovations if they perceived them to be relevant and/or fit with their client
population, or because their culture generally supported their use. Taken together, the
configurational findings discussed thus far align with and contribute to large bodies of work
illustrating that organizational cultures and innovation-organization fit are critical for adoption
and implementation (Aarons & Sommerfeld, 2012; Ehrhart, Aarons, & Farahnak, 2014; Glisson
et al., 2008; Knudsen & Roman, 2015; Palinkas et al., 2017).
Although supported by Olin et al.’s (2015) systemwide study, the finding that larger
operational size was not a necessary or sufficient condition for adoption is contrary to other
research indicating a positive relationship between operational resources and adoption (Aarons et
al., 2011; Regan et al., 2017; Wisdom et al., 2014). Consistent with Olin et al.’s (2015)
interpretation, leaders of two larger clinics in this study reported that they did not adopt CTAC
innovations because they had their own internal training resources. As one leader explained,
“We're such a large agency and we have so many outpatient services that there's so much
training going on already and most of it's centralized.” This explanation aligns with frameworks
indicating that formalized, centralized structures may be negatively associated with adoption
(Frambach & Schillewaert, 2002; Greenhalgh, Robert, Macfarlane, Bate, & Kyriakidou, 2004).
However, size was not a sufficient condition for non-adoption, and additional exploration is
needed to confirm whether centralization or other conditions not captured by this study
systematically explain the adoption decisions of large organizations.
Both fiscal inefficiency and fiscal non-inefficiency (referred to as ‘efficiency’ going
forward for ease of interpretation) were sufficient conditions in configurations for adoption of
business innovations (inefficient * culture; efficient * relevance). Fiscally inefficient clinics with
cultures that supported use of innovations adopted business innovations. It may be that support
for innovation drove adoption of innovations despite a lack of financial resources, perhaps out of
a desire to improve the clinic’s financial profile. This explanation is partially supported in the
qualitative data, including the example quote in Table 3.4 and this clinic leader’s explanation,
“Integrating practice with fiscal and budget, obviously is the way you're going to stay, or try to
stay as best you can afloat. And I think that it's seen as a very worthy practice to be a part of.”
Fiscally efficient clinics with leaders that endorsed innovation relevance as a driver were also
adopters of business innovations. Even though these clinics were relatively financially stable,
they may have adopted business innovations in order to improve quality or navigate system
changes. As one clinic leader stated, “It felt like if CTAC could give us a boost, that would be
great.” Unexpectedly, two clinics in the efficient * relevance configuration explicitly reported
that they adopted business innovations due to financial deficits, even though administrative data
indicated that their clinics were not losing money per service unit. For example, one leader
explained that they adopted CTAC innovations to address “a huge deficit, which we’re still
fighting to overcome.” It may be that the clinics’ fiscal efficiency had declined since
administrative data was collected, or that they had other debts despite a relatively stable fiscal
profile.
The conditions that predicted adoption did not predict non-adoption within recommended
consistency and/or coverage thresholds. Instead, the solution for non-adoption of both clinical
and business innovations was a combination of downstate/urban location and limited capacity as
an adoption decision driver (region * limited capacity). Leaders of clinics with membership in
this configuration reported a lack of time for training and financial strain due to lost billing. It is
unclear why capacity issues were associated with non-adoption for downstate clinics specifically,
although this finding is supported by Olin et al.'s (2015) systemwide analysis, which found that upstate
clinics were more likely than downstate clinics to adopt business innovations. It may be that
downstate clinics have access to a wider variety of sources for training and technical assistance,
and clinics with limited capacity were stretched too thin to participate in the CTAC rollout.
However, this interpretation is conjectural, and further research is needed to explore the
conditions related to non-adoption in this study’s context.
Practical Implications
Study results suggest that there are several different possible pathways to adoption, rather
than one correct recipe. Supportive cultures and innovation-organization fit drove adoption of
CTAC innovations even in smaller and/or fiscally inefficient clinics. These findings suggest that
relatively static clinic characteristics such as operational size do not unilaterally determine
adoption behavior, and that targeting provider values, attitudes, and perceptions related to
innovations may improve adoption. Prior to innovation rollouts, funders and oversight agencies
may train leaders in how to embed cultural support for innovations within their clinics (Aarons,
Farahnak, Ehrhart, & Sklar, 2014). They may also conduct needs assessments to guide
innovation selection, and determine how best to align messaging about innovation rollouts with
clinic interests.
Although lack of resources and financial strain did not preclude innovation adoption,
capacity issues still played a critical role in this study. CTAC trainings were free and easily
accessible, but the majority of non-adopting clinics endorsed lost billing time and revenue as an
important barrier to participation. Therefore, funders and oversight agencies wishing to increase
innovation adoption in their systems may consider offering financial incentives to offset some of
the losses associated with training participation (Beidas et al., 2016).
Implications for Implementation Research
This study also has implications for implementation science and mental health services
research. Widely-used implementation frameworks suggest, even if indirectly, that
implementation is best explained using complex causality (combinations of conditions lead to
implementation) and equifinality (different combinations of conditions can lead to
implementation) (Aarons et al., 2011; Damschroder et al., 2009; Feldstein & Glasgow, 2008;
Greenhalgh et al., 2004; Kane et al., 2014; Wisdom et al., 2014). Yet, few studies have examined
implementation using methodological approaches grounded in such assumptions, and most
exceptions are restricted to health services research (e.g., Ford, Duncan, & Ginter, 2005).
Although exploratory, this study demonstrates the feasibility of using configurational methods
such as QCA to examine implementation in mental health service contexts. As evidenced by this
study’s results, which offer nuance to larger, quantitative analyses of New York State’s
innovation rollout, configurational methods may be used to complement findings from
regression-based approaches. They may also be used to contextualize qualitative findings and
link them to organizational-level implementation outcomes.
Limitations and Future Directions
There are limitations to this study that should be considered when interpreting results.
The first set of limitations relates to qualitative interview data. Given that the interview prompt
asked about adoption of CTAC innovations in general and that the majority of clinics
participated in both clinical and business trainings, it was not possible to distinguish between
drivers of adoption of clinical versus business innovations. In addition, there were discrepancies
between leader responses and administrative training participation data in three clinics that
illustrate the potential for error in qualitative interviews due to leader recall, confusion of CTAC
trainings with other trainings, or differences in CTAC training participation across programs in
the same clinic. Further, it is possible that clinic characteristics or adoption levels changed
between quantitative data collection (2011-2013) and qualitative leader interviews (2013-2014),
resulting in inaccurate set memberships. Finally, it was not feasible to assess agreement across
upper-level and middle-level leaders in the same clinic because the majority (approximately two
thirds) of clinics did not contain codes for both interviews. Codes were absent if the interviewer
did not ask the study question, or, more frequently, if the participant’s response did not contain
sufficient detail and/or did not align with any code.
Other limitations pertain to the QCA. Since conditions and variables must vary by one
third in order to be suitable for QCA, inclusion of additional drivers from qualitative interviews
and calibration options for conditions and outcomes were limited. For example, it was not
possible to compare super/high adopting clinics with low/medium adopting clinics. In addition,
given that all but three adopting clinics adopted both clinical and business innovations, attempts
to differentiate between causal recipes for adopting one type of innovation versus the other were
limited. Further, the use of clinic training participation data as a proxy for adoption offers
advantages over simple yes-no adoption measures, but captures only one aspect of the adoption
process and does not provide information about implementation or sustainment. Finally, QCA is
a case-based method, and its findings are not intended to be generalizable to a larger population.
Therefore, this study’s results are exploratory and their applicability to other populations and
contexts is limited.
Future studies in other service systems using multi-indicator measures for adoption and
targeted interview questions can address some of these limitations. Future studies may also
examine whether aspects of innovation-organization fit that were not included in this study, such
as cost efficacy and feasibility, may be necessary or sufficient for adoption. Research should also
investigate how perceptions of other stakeholders, such as clients and counselors, may be
involved in conditions that predict adoption. In addition, further exploration of the conditions
that lead to non-adoption may assist stakeholders in targeting clinics for additional adoption
support.
Conclusions
To increase dissemination of evidence-based innovations in the public sector, states have
encouraged the use of innovations through rollouts that involve substantial investments of
funding, time, and expertise (McHugh & Barlow, 2010). This study demonstrates how
configurational methods can be used to explore patterns of leader perspectives and clinic
characteristics that are associated with clinic-level adoption of innovations. Efforts to enhance
provider perceptions about innovations and tailor rollouts to provider needs may increase rates of
adoption.
References
Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of
evidence-based practice implementation in public service sectors. Administration and
Policy in Mental Health and Mental Health Services Research, 38, 4-23.
Aarons, G. A., & Sommerfeld, D. H. (2012). Leadership, innovation climate, and attitudes
toward evidence-based practice during a statewide implementation. Journal of the
American Academy of Child & Adolescent Psychiatry 51, 423-431.
Aos, S., Mayfield, J., Miller, M., & Yen, W. (2006). Washington state evidence-based treatment
of alcohol, drug, and mental health disorders: Potential benefits, costs and fiscal impacts
on Washington state. Olympia, Washington: Washington State Institute for Public Policy.
Balas, E. A., & Boren, S. A. (2000). Managing clinical knowledge for health care
improvement. In Bemmel J., & McCray, A. T. (Eds.), Yearbook of Medical Informatics
2000: Patient-centered systems (65-70). Stuttgart, Germany: Schattauer.
Beidas, R. S., Stewart, R. E., Adams, D. R., Fernandez, T., Lustbader, S., Powell, B. J., ... &
Rubin, R. (2016). A multi-level examination of stakeholder perspectives of
implementation of evidence-based practices in a large urban publicly-funded mental
health system. Administration and Policy in Mental Health and Mental Health Services
Research, 43(6), 893-908.
Chor, K. H. B., Olin, S. C. S., Weaver, J., Cleek, A. F., McKay, M. M., Hoagwood, K. E., &
Horwitz, S. M. (2014). Adoption of clinical and business trainings by child mental health
clinics in New York State. Psychiatric Services, 65(12), 1439-1444.
Chor, K. H. B., Wisdom, J. P., Olin, S. C. S., Hoagwood, K. E., & Horwitz, S. M. (2015).
Measures for predictors of innovation adoption. Administration and Policy in Mental
Health and Mental Health Services Research, 42(5), 545-573.
Chorpita, B. F., Daleiden, E. L., Ebesutani, C., Young, J., Becker, K. D., Nakamura, B. J., …
Starace, N. (2011). Evidence-based treatments for children and adolescents: An updated
review of indicators of efficacy and effectiveness. Clinical Psychology Science and
Practice, 18(2), 154–172.
Chow, J. C. C., Jaffee, K., & Snowden, L. (2003). Racial/ethnic disparities in the use of mental
health services in poverty areas. American Journal of Public Health, 93(5), 792-797.
Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C.
(2009). Fostering implementation of health services research findings into practice: A
consolidated framework for advancing implementation science. Implementation Science,
4, 50-65.
Ehrhart, M. G., Aarons, G. A., & Farahnak, L. R. (2014). Assessing the organizational context
for EBP implementation: The development and validity testing of the Implementation
Climate Scale (ICS). Implementation Science, 9, 157-167.
Feldstein, A. C., & Glasgow, R. E. (2008). A Practical, Robust, Implementation and
Sustainability Model (PRISM) for integrating research findings into practice. The Joint
Commission Journal on Quality and Patient Safety, 34(4), 228-243.
Ford, E. W., Duncan, W. J., & Ginter, P. M. (2005). Health departments' implementation of
public health's core functions: an assessment of health impacts. Public Health, 119(1),
11-21.
Frambach, R. T., & Schillewaert, N. (2002). Organizational innovation adoption: A multi-level
framework of determinants and opportunities for future research. Journal of Business
Research, 55(2), 163–176.
Gleacher, A. A., Nadeem, E., Moy, A. J., Whited, A. L., Albano, A. M., Radigan, M., ... &
Hoagwood, K. (2011). Statewide CBT training for clinicians and supervisors treating
youth: The New York State evidence-based treatment dissemination center. Journal of
Emotional and Behavioral Disorders, 19(3), 182-192.
Glisson, C., Landsverk, J., Schoenwald, S., Kelleher, K., Hoagwood, K. E., Mayberg, S., ... &
Research Network on Youth Mental Health. (2008). Assessing the organizational social
context (OSC) of mental health services: Implications for research and
practice. Administration and Policy in Mental Health and Mental Health Services
Research, 35, 98-113.
Green, L. W. (2008). Making research relevant: If it is an evidence-based practice, where's the
practice-based evidence? Family Practice, 25(supp. 1), i20-i24.
Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of
innovations in service organizations: Systematic review and recommendations. Milbank
Quarterly, 82(4), 581–629.
Hoagwood, K. E., Olin, S. S., Horwitz, S., McKay, M., Cleek, A., Gleacher, A., ... & Kuppinger,
A. (2014). Scaling up evidence-based practices for children and families in New York
State: Toward evidence-based policies on implementation for state mental health
systems. Journal of Clinical Child & Adolescent Psychology, 43(2), 145-157.
Kane, H., Lewis, M. A., Williams, P. A., & Kahwati, L. C. (2014). Using qualitative comparative
analysis to understand and quantify translation and implementation. Translational
Behavioral Medicine, 4(2), 201-208.
Kataoka, S. H., Zhang, L., & Wells, K. B. (2002). Unmet need for mental health care among
U.S. children: Variation by ethnicity and insurance status. American Journal of
Psychiatry, 159(9), 1548-1555.
Knudsen, H. K., & Roman, P. M. (2015). Innovation attributes and adoption decisions:
perspectives from leaders of a national sample of addiction treatment
organizations. Journal of Substance Abuse Treatment, 49, 1-7.
McHugh, R., & Barlow, D. (2010). The dissemination and implementation of evidence-based
psychological treatments: A review of current efforts. American Psychologist, 65(2), 73-
84.
Olin, S. C. S., Chor, K. H. B., Weaver, J., Duan, N., Kerker, B. D., Clark, L. J., ... & Horwitz, S.
M. (2015). Multilevel predictors of clinic adoption of state-supported trainings in
children’s services. Psychiatric Services, 66(5), 484-490.
Palinkas, L. A., Aarons, G. A., Horwitz, S., Chamberlain, P., Hurlburt, M., & Landsverk, J.
(2011). Mixed method designs in implementation research. Administration and Policy in
Mental Health and Mental Health Services Research, 38(1), 44-53.
Palinkas, L. A., Um, M. Y., Jeong, C. H., Chor, K. H. B., Olin, S., Horwitz, S. M., & Hoagwood,
K. E. (2017). Adoption of innovative and evidence-based practices for children and
adolescents in state-supported mental health clinics: A qualitative study. Health Research
Policy and Systems, 15, 27-35.
Ragin, C. C. (1987). The comparative method: Moving beyond qualitative and quantitative
strategies. Berkeley, CA: University of California Press.
Ragin, C. C. (1999). Using qualitative comparative analysis to study causal complexity. Health
Services Research, 34(5 Pt 2), 1225-1239.
Ragin, C. C. (2006). Set relations in social research: Evaluating their consistency and
coverage. Political Analysis, 14(3), 291-310.
Ragin, C. C. (2008). Redesigning social inquiry: Fuzzy-sets and beyond. Chicago, IL: University
of Chicago Press.
Ragin, C. C. & Davey, S. (2016). Fuzzy-set/Qualitative Comparative Analysis 3.0. Irvine,
California: Department of Sociology, University of California.
Regan, J., Lau, A. S., Barnett, M., Stadnick, N., Hamilton, A., Pesanti, K., …, & Brookman-
Frazee, L. (2017). Agency responses to a system-driven implementation of multiple
evidence-based practices in children’s mental health services. BMC Health Services
Research, 17, 671-784.
Rihoux, B., & Ragin, C. C. (2009). Configurational comparative methods: Qualitative
comparative analysis (QCA) and related techniques. Thousand Oaks, CA: Sage
Publications.
Sandelowski, M., Voils, C. I., & Knafl, G. (2009). On quantitizing. Journal of Mixed Methods
Research, 3(3), 208-222.
Schneider, C. Q., & Wagemann, C. (2010). Standards of good practice in qualitative comparative
analysis (QCA) and fuzzy-sets. Comparative Sociology, 9(3), 397-418.
Schneider, C. Q., & Wagemann, C. (2012). Set theoretic methods for the social sciences: A guide
to qualitative comparative analysis. New York, NY: Cambridge University Press.
Seffrin, B., Panzano, P., & Roth, D. (2009). What gets noticed: How barrier and facilitator
perceptions relate to the adoption and implementation of innovative mental health
practices. Community Mental Health Journal, 45(4), 260-269.
Strauss, A. L., & Corbin, J. (1998). Basics of qualitative research (2nd ed.). Thousand Oaks, CA:
Sage.
Willms, D. G., Best, J. A., Taylor, D. W., Gilbert, J. R., Wilson, D., Lindsay, E. A., & Singer, J.
(1990). A systematic approach for using qualitative methods in primary prevention
research. Medical Anthropology Quarterly, 4(4), 391-409.
Wisdom, J. P., Chor, K. H. B., Hoagwood, K. E., & Horwitz, S. M. (2014). Innovation adoption:
a review of theories and constructs. Administration and Policy in Mental Health and
Mental Health Services Research, 41(4), 480-502.
Zazzali, J. L., Sherbourne, C., Hoagwood, K. E., Greene, D., Bigley, M. F., & Sexton, T. L.
(2008). The adoption and implementation of an evidence-based practice in child and
family mental health services organizations: A pilot study of functional family therapy in
New York State. Administration and Policy in Mental Health and Mental Health Services
Research, 35(1-2), 38-49.
Table 3.1
Descriptive Statistics for Clinic-level Conditions and Outcomes from Administrative Data (n = 32)

Characteristic                                       Percent of clinics
Adopters
  Clinical innovations                               59
  Business innovations                               69
Region
  Downstate                                          69
  Upstate                                            31
Small clinic size (annual expenses)
  < $1 million (full)*                               13
  $1 - $5 million (crossover)                        41
  $5 - $10 million (non)                             13
  > $10 million                                      34
Clinic inefficiency (gain/loss per service unit)
  < -$30 (full)*                                     31
  -$30 - -$10 (crossover)                            22
  -$10 - $0 (non)                                    16
  > $0                                               31
High proportion of youth clients
  > 60% (full)*                                      34
  60% - 40% (crossover)                              34
  40% - 20% (non)                                    13
  < 20%                                              19

* Fuzzy set membership thresholds indicated in parentheses.
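The full, crossover, and non-membership thresholds marked in Table 3.1 correspond to the anchors used in Ragin’s (2008) direct calibration method. The sketch below is an illustrative implementation of that method; the specific anchor values for small clinic size are an assumed reading of the table, not the study’s actual calibration parameters.

```python
import math

def calibrate(value, full, crossover, non):
    """Direct calibration (Ragin, 2008): map a raw value to a fuzzy
    membership score in [0, 1] via log odds, anchored so that the
    'full' threshold maps to ~0.95, 'crossover' to 0.50, and the
    'non' (full non-membership) threshold to ~0.05."""
    # Pick the scaling factor for the side of the crossover the value falls on,
    # handling both increasing sets (full > non) and decreasing sets (full < non).
    toward_full = (value >= crossover) if full > non else (value <= crossover)
    scalar = 3.0 / (full - crossover) if toward_full else 3.0 / (crossover - non)
    return 1.0 / (1.0 + math.exp(-(value - crossover) * scalar))

# Hypothetical anchors for 'small clinic size', in $ millions of annual
# expenses: full membership at 1, crossover at 5, non-membership at 10.
print(round(calibrate(1, full=1, crossover=5, non=10), 2))   # 0.95
print(round(calibrate(5, full=1, crossover=5, non=10), 2))   # 0.5
print(round(calibrate(10, full=1, crossover=5, non=10), 2))  # 0.05
```

The same function handles sets where larger raw values indicate fuller membership (e.g., proportion of youth clients) by passing anchors with full > non.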
Table 3.2
Percent of Clinics Endorsing Qualitative Drivers of Innovation Adoption

Adoption decision driver    Adopters (n = 22)    Non-Adopters (n = 10)    Total (n = 32)
External system                      0                    20                      6
Organization
  Capacity                          18                    80                     38
  Culture                           55                     0                     38
Innovation
  Relevance                         77                    20                     59
  Feasibility                       36                     0                     25

Example quotes:
External system: “I think that there’s so many changes happening right now with behavioral health, so many different requirements and everybody’s getting stricter it seems like with the requirements that, at this point…the [trainings] that I select are the ones that are most urgent.” (non-adopting clinic)
Capacity: “There’s not a lot of what I would call extra time to do things that would even be better for the agency.” (non-adopting clinic)
Culture: “So I think that the agency sees it as a valuable opportunity for us to collaborate with others, learn from others, and do better.” (adopting clinic)
Relevance: “Because there are many inefficiencies at the clinic, and there’s a way to improve it. And there’s a clinical need to improve our services.” (adopting clinic)
Feasibility: “Usually the webinars are an hour, an hour and a half, and they're more manageable in people's schedules.” (adopting clinic)
Table 3.3
Solution for Adoption of Clinical Innovations

Configuration                Raw / unique coverage    Consistency    Number of cases
culture * relevance               0.47 / 0.22              1                9
culture * youth                   0.22 / 0.05              1                4
small * culture                   0.28 / 0.06              1                6
small * relevance * youth         0.21 / 0.13              0.91             4

Solution coverage: 0.73
Solution consistency: 0.97

Example quotes:
culture * relevance: “We're part of the [clinical learning collaborative] project because it made sense. We’re interested in using evidence based practice. And providing the best possible care we can.”
culture * youth: “I mean, you have the same mission about good clinical work and keeping your business open, but it's just so much harder now. So when you get somebody like CTAC that is really embracing the clinical and the fiscal, it's a great thing.”
small * culture: “We needed that kind of technical assistance to really understand what other clinics do to share evidence based practices. We had our own kind of set of things that we wanted to accomplish in terms of being a different kind of clinic than the other clinics in our community.”
small * relevance * youth: “And I also think the need for training. We had a huge staff turnover for a long period of time. And just the quality of care, we could see in the progress notes that the quality wasn’t what we wanted it to be.”
Table 3.4
Solution for Adoption of Business Innovations

Configuration               Raw / unique coverage    Consistency    Number of cases
relevance * small                0.38 / 0.14              0.96             9
relevance * ~inefficient         0.41 / 0.11              0.95            10
culture * inefficient            0.37 / 0.24              1                8

Solution coverage: 0.81
Solution consistency: 0.95

Example quotes:
relevance * small: “We are a fairly small mental health clinic. We don’t have years and years and years of structural experience. And so the CTAC has really helped us to understand the financing changes.”
relevance * ~inefficient: “Because there are many inefficiencies at the clinic, and there’s a way to improve it. And there’s a clinical need to improve our services.”
culture * inefficient: “At the time when [CTAC] initiated, clinics routinely were losing money throughout the state. In fact, I don't know of any that were breaking even. And so we decided to get involved to see if we could improve our financial stability as well as learn new ways of providing services.”
Chapter Four: Study Three
State Executive and Clinic Leader Perspectives on Implementation Success and Failure in
Children’s Mental Health Services
Abstract
Introduction
Scaling up evidence-based innovations in mental health services is complex, involving
stakeholders at multiple levels of the organizational context. Within the inner organizational
context, treatment organization leaders make key decisions about adopting or discontinuing
innovations. Within the external system context, system executives develop regulations that
facilitate and/or mandate innovation use. The beliefs, perceptions, and priorities of these
stakeholders drive their behavior and subsequently shape the effectiveness of innovation rollouts.
Most research on stakeholder perspectives focuses on barriers and facilitators related to
implementation success, but it is also important to examine how stakeholders define
implementation success and failure in the first place. This study examines qualitative clinic
leader and state executive perspectives around the following research questions: 1) How do
stakeholders define implementation success? 2) How do they decide when to stop implementing
or promoting a failed innovation?
Methods
Study participants included 68 leaders of 34 clinics licensed by New York State to treat
children, adolescents, and their families, and nine executives in the state Office of Mental Health.
Clinics were selected based on their adoption behavior during a statewide rollout of clinical and
business innovations to represent a 10% stratified sample of all 346 clinics in the state children’s
mental health system. Semi-structured interviews were conducted with participants in 2013-
2014. The thematic content analysis method of “Coding Consensus, Co-occurrence, and
Comparison”, rooted in grounded theory, was used to code transcripts, and a template organizing
style was used to compare theme application between clinic leader and state executive
interviews.
Results
The most common stakeholder definitions of implementation success focused on client
outcomes. Other definitions of success centered around an innovation’s widespread use within an
organization and its acceptability to the direct service providers using it, with few participants
defining success in terms of fidelity or sustainment. In response to prompts about
implementation failure, over one third of participants reported that they rarely, if ever, made
conscious decisions to stop using or promoting an innovation. Some clinic leaders expressed that
they viewed implementation failure as a personal failing that must be prevented, whereas state
executives stated that innovations may be discontinued due to changing political priorities, even
if they produce positive outcomes. Stakeholders reported that the most common indicators of
implementation failure were poor client outcomes and provider feedback.
Discussion
Stakeholder definitions of implementation success aligned with Proctor et al.’s (2011)
implementation outcomes, offering support for the taxonomy’s external validity in mental health
services. Implementation strategies aimed at increasing motivation and buy-in for innovations at
both the system and organizational level should highlight client outcomes and provider testimony
given the salience of these indicators among stakeholders. Stakeholder perceptions about
implementation failure reflect pressure for continuous and persistent implementation in mental
health services and a need for greater education and support related to exnovation at both the
organizational and system levels. Findings suggest several future research directions, including
examining the role of fidelity in stakeholder theories of client change and identifying strategies
to align researcher, policy maker, and clinic leader goals during implementation.
State Executive and Clinic Leader Perspectives on Implementation Success and Failure in
Children’s Mental Health Services
There is a research-to-practice gap of 17 years or longer before innovations (practices,
sets of behaviors, or routines that are new to an organization) are adopted into community
mental health organizations, where they can in turn influence patient quality of life (Balas &
Boren, 2000; Boren & Balas, 1999; Green, 2008). To close this gap, many states have
encouraged the use of innovations through initiatives, legislation, and/or mandates (Bruns &
Hoagwood, 2008; Dorsey et al., 2016; Hoagwood et al., 2014; Lau & Brookman-Frazee, 2015;
McHugh & Barlow, 2010). State approaches to scaling up innovations include offering financial
incentives for innovation adoption, providing trainings, and establishing technical assistance
centers (McHugh & Barlow, 2010).
Scaling up innovations is complex and involves stakeholders at multiple levels of the
organizational context. Commonly used implementation frameworks such as the EPIS
(Exploration, Adoption/Preparation, Implementation, and Sustainment), PRISM (Practical,
Robust Implementation and Sustainability Model), and CFIR (Consolidated Framework for
Implementation Research) distinguish between factors in the inner organizational context and
external system context (Aarons, Hurlburt, & Horwitz, 2011; Damschroder et al., 2009; Feldstein
& Glasgow, 2008). Within the inner organizational context, administrators and managers make
key adoption and implementation decisions (Rodriguez et al., 2018; Wisdom et al., 2014).
Within the external system context, policy makers and system executives develop regulations
that enable and/or mandate provider innovation use (Raghavan, Bright, & Shadoin, 2008).
Together, treatment organization leaders and system executives negotiate service and
performance contracts that may influence both current implementation efforts and future policies
(Brodkin, 2010; Gronbjerg, 2010; Moullin, Dickson, Stadnick, Rabin, & Aarons, 2019).
Therefore, the beliefs, perceptions, and priorities of these two stakeholder groups shape the
process, effectiveness, and sustainment of innovation rollouts (Aarons et al., 2016; Green &
Aarons 2011).
Most research on stakeholder perspectives about implementation focuses on barriers and
facilitators related to implementation success (Aarons, Wells, Zagursky, Fettes, & Palinkas,
2009; Beidas et al., 2016; Carstens, Panzano, Massatti, Roth, & Sweeney, 2009; Palinkas et al.,
2017; Rodriguez, Southam-Gerow, O’Connor, & Allin, 2014). These studies provide valuable
information to guide implementation strategies, conceptual frameworks, and policy decisions.
However, it is also important to examine how different stakeholders define implementation
success in the first place. Stakeholders’ definitions of success provide insight into the factors
they perceive as most meaningful and relevant within their sphere of practice (Aarons et al.,
2009; Aarons et al., 2014). Treatment organization leaders’ perceptions about implementation
success determine their priorities and guide their attention during adoption and implementation
(Palinkas et al., 2017; Proctor et al., 2007; Rodriguez et al., 2018). System executives’
perceptions of implementation success may shape the design, incentive structure, and evaluation
of innovation rollouts (Raghavan et al., 2008; Rapp et al., 2005; Willging et al., 2015).
Implementation success may be inferred from improved client outcomes such as
symptoms, functioning, or satisfaction (Proctor et al., 2011). Additionally, it may be defined
through implementation outcomes, as described by Proctor and colleagues (2011). Examples of
implementation outcomes include fidelity, or the extent to which an innovation is delivered as its
developers intended; acceptability, or provider satisfaction with the innovation; and penetration,
or the spread of an innovation across an organization. However, it is unknown whether mental
health providers rely on these outcomes, or how reliance on different outcomes may influence
implementation success. More research is needed to understand mental health provider
conceptualizations of implementation success, as well as the salience of implementation
outcomes among different stakeholders (Proctor et al., 2011).
Although implementation success and implementation failure are opposite poles of the
same dimension, implementation failure has received much less research attention. Conceptual
frameworks offer a blueprint that mental health providers can use to guide their next steps after
‘successful’ adoption and implementation, but there is little guidance on how to proceed if an
implementation effort is failing (Greenhalgh, Robert, Macfarlane, Bate, & Kyriakidou, 2004;
Massatti, Sweeney, Panzano, & Roth, 2008). For example, when should organizations
troubleshoot or invest more resources versus decide to stop implementing the innovation?
Further, there is an implicit assumption underlying much mental health research and policy that
continuous and persistent implementation leads to quality improvement. This bias may be
discouraging exnovation—the strategic removal of an innovation that does not improve
outcomes, is too disruptive, or does not fit with the organization—that could free up resources
and improve organizational performance (Rodriguez, Henke, Bibi, Ramsay, & Shortell, 2016;
Ubel & Asch, 2015). Examining stakeholder perceptions about implementation failure and
exnovation decisions can inform emerging research in this area and offer further insight into
motivators of stakeholder behavior.
This study examines clinic leader and state executive perspectives about implementation
success and failure in the context of a statewide initiative to scale up innovations in children’s
mental health services. The questions guiding this study are: 1) How do these two stakeholder
groups define implementation success? and 2) How do they decide when to stop implementing or
promoting a failed innovation? Given that this study focuses on stakeholder attitudes, beliefs and
perceptions about relatively unstudied issues, qualitative methods are used to address the
research questions.
Methods
Participants and Procedure
Study participants included 68 leaders of clinics licensed by the State of New York to
treat children, adolescents, and their families, and nine executives with top leadership positions
in the state Office of Mental Health. State executives were current or former government
appointees who were involved in developing and/or overseeing children’s mental health services.
All state executives held graduate degrees and 44% were clinically licensed. They had a mean
tenure of 4.5 years in their appointed positions (range 6 months to 10 years).
Clinic leaders were 35 upper level administrators (e.g. CEOs or Vice Presidents) and 33
middle level managers (e.g. Program Directors) recruited from 34 clinics. Clinics were selected
based on their adoption behavior during a statewide rollout of clinical and business innovations
to represent a 10% stratified sample of all 346 clinics in the state children’s mental health
system. More detail about the rollout and clinic adoption behavior can be found in Chor et al.
(2014). In most clinics (31 out of 34), two interviews were conducted: one with an upper level
administrator and one with a middle level manager. Within two clinics, only one interview was
conducted due to simpler organizational structures, and within two clinics a third interview was
conducted with an additional key decision-maker. Demographic data for clinic leaders was not
collected. Informed consent was obtained from all participants, and study procedures were
approved by the Institutional Review Boards of the University of Southern California and New
York University.
Qualitative data were collected from participants in 2013-2014 via hour-long, semi-
structured interviews that were digitally recorded and transcribed (Palinkas et al., 2017). Clinic
leaders responded to the following questions: “How would you define/determine successful
adoption of an innovation? How do you decide when to stop implementing an innovation? How
can you tell if an innovation is going to fail?” State executives responded to similar questions
tailored to their role: “How do you define/determine a successful innovation rollout? How do you
decide when to stop promoting an innovation? How can you tell if an innovation is going to fail?”
Interviewers used follow-up prompts to encourage participants to elaborate on relevant issues. It
should be noted that although some clinic leader question prompts asked about ‘adoption’,
participant responses pertained to the broader process of implementation, suggesting that many
clinic leaders use the terms ‘adoption’ and ‘implementation’ interchangeably.
Analysis
Interview transcripts were coded in Dedoose Version 8.0.35 (2018) using the thematic
content analysis method of “Coding Consensus, Co-occurrence, and Comparison” (Willms et al.,
1990). This method is rooted in grounded theory, in which theory is derived from the data and
then illustrated using characteristic examples (Glaser and Strauss, 1967). The first author
developed an initial codebook using open coding to identify broad themes and patterns, and then
used focused/axial coding to explore these themes more deeply (Strauss & Corbin, 1998). Codes
captured both a priori themes (e.g., direct responses to interview questions about definitions of
implementation success) and themes that spontaneously emerged from
the data. Codes included in the final codebook were endorsed by at least 10% of participants or
had particular conceptual salience (e.g. fidelity).
A graduate-level researcher co-coded 50% of transcripts selected at random. The co-
coders created detailed memos describing codes and linking them to passages in the data and
discussed discrepancies until consensus was reached. When an excerpt aligned with multiple
themes, all relevant codes were assigned. Interrater reliability calculated from ten randomly
selected transcripts was 92%, indicating good reliability (Padgett, 2012; Viera & Garrett, 2005).
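As a minimal illustrative sketch only (not part of the original analysis), percent agreement of the kind reported above can be computed by comparing the code each coder assigned to the same set of excerpts. The excerpt identifiers and code labels below are hypothetical:

```python
# Illustrative sketch: percent agreement between two coders.
# Excerpt IDs and code labels are hypothetical, not study data.

def percent_agreement(coder_a, coder_b):
    """Share of excerpts to which both coders assigned the same code."""
    assert coder_a.keys() == coder_b.keys(), "coders must rate the same excerpts"
    matches = sum(coder_a[k] == coder_b[k] for k in coder_a)
    return matches / len(coder_a)

coder_a = {"excerpt_1": "penetration", "excerpt_2": "acceptability",
           "excerpt_3": "fidelity", "excerpt_4": "sustainment"}
coder_b = {"excerpt_1": "penetration", "excerpt_2": "acceptability",
           "excerpt_3": "acceptability", "excerpt_4": "sustainment"}

agreement = percent_agreement(coder_a, coder_b)  # 3 of 4 codes match
```

Note that simple percent agreement does not correct for chance agreement; the kappa statistic cited above (Viera & Garrett, 2005) does.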
A template organizing style of interpretation was used to compare codes from interviews
with clinic leaders and those with state executives (Crabtree & Miller, 1999). The first author
developed a matrix of codes and compared the content and organization of the matrix to identify
themes that were common to both groups of participants as well as themes that were specific to
only one group (Crabtree & Miller, 1999). Differences in code application between upper level
administrators and middle level managers were similarly examined. Within-organization
comparisons to assess agreement between upper level administrators and middle level managers
in the same clinic were attempted but were limited given that only 11-18 out of 34 clinics
(depending on the code) contained codes for both upper and middle level leader interviews.
Codes were absent if the interviewer did not ask the study question, or, more frequently, if the
participant’s response did not contain sufficient detail and/or did not align with any code.
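The matrix comparison described above can be sketched, under hypothetical group names and code labels, as partitioning codes into those shared across stakeholder groups and those unique to one group:

```python
# Hypothetical sketch of the template-organizing comparison:
# themes common to both stakeholder groups vs. unique to one.
# Group names and code labels are illustrative, not study data.

codes_by_group = {
    "clinic_leaders": {"client_outcomes", "penetration",
                       "acceptability", "fidelity"},
    "state_executives": {"client_outcomes", "penetration",
                         "political_priorities"},
}

# Codes applied in every group's interviews
shared = set.intersection(*codes_by_group.values())

# Codes applied only within a single group
unique = {group: codes - shared for group, codes in codes_by_group.items()}
```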
Results
Results related to definitions of success are presented first, followed by results related to
implementation failure. Differences in themes between clinic leaders and state executives are
presented where relevant. There was only one notable difference in code application between
upper level administrators and middle level managers, which is presented under definitions of
success.
Descriptive statistics indicating the proportions of stakeholders endorsing each theme are
presented in Table 4.1. Quotations exemplifying participant perspectives are provided for each
theme in the table and throughout the results section. To improve readability, some quotations
were edited to remove unnecessary utterances and redundant wording. Given the small sample of
state executives and the qualitative nature of the study, no statistical comparisons between state
executives and clinic leaders were made. Therefore, terms such as ‘more likely’ or ‘less often’ in
the results section should be interpreted as qualitative assessments only.
Definitions of Implementation Success
The majority of both state executives and clinic leaders defined implementation success
in terms of client outcomes such as symptomology and functioning. Clinic leaders explained that
client outcomes were central to the organization’s mission, and two explicitly stated that they
prioritized them above other measures of success:
[clinic leader] “So as long as we’re getting client outcomes, that’s why we exist. Being a
non-profit organization, the outcome is efficacy of treatment, or efficacy of intervention
and therefore outcome.”
Some state executives and clinic leaders emphasized that these client outcomes should be
measurable, and that success should be defined by meeting a predetermined, operationalized
goal. However, a minority of clinic leaders reported that qualitative indicators of success, such as
client reports of satisfaction or feedback about their services or functioning, were more important, as
explained by this leader:
[clinic leader] “Well, I guess ideally you use criteria that are measurable. But I think to
us, a better criteria is that the client reports feeling better, being able to cope or manage.
And it’s not necessarily all that measurable. It sometimes is just self-report.”
Initial transcript review and codebook development suggested that, after accounting for
client-focused definitions of success, definitions aligned with Proctor et al.’s (2011) taxonomy of
implementation outcomes. Therefore, this taxonomy was used to label themes in the final
codebook. The first such theme centered on penetration, or the integration of an innovation
within a service setting (Proctor et al., 2011). According to participants who endorsed this theme,
implementation is successful when an innovation becomes visibly incorporated into routine
practice and institutionalized by the clinic(s) using it.
[state executive] “I’ll see them using it. And one of the ways we would monitor is we
would use our licensing people as our eyes and ears. And we would basically make that
one of the questions of, you know, are you using such and such? Could I see some
information on it? And we would see how much it’s been institutionalized by the
agency.”
Three clinic leaders noted that penetration of an innovation is difficult to
achieve. As one stated, “I [would define success] by [the innovation] becoming a routine part of
services that somebody would get. And that is not so easy to make happen.” Upper level
administrators were more likely than middle level managers to include penetration in their
definitions of success.
Another definition of success centered around acceptability, or the perception among
implementation stakeholders that an innovation is agreeable, palatable, or satisfactory (Proctor
et al., 2011). Participants who endorsed this outcome spoke about individual provider buy-in and
attitudes, as illustrated by this clinic leader, “…at the heart, is it well understood and valued by
the clinicians and providers who are providing the service.” Participants typically described
acceptability in positive terms, but three described how the absence of negative provider attitudes
may also indicate success:
[state executive] “I went to monthly meetings with all of my counties. And I’d listen to
what they were bitching about, you know. And sometimes if they were in touch with the
problems of their providers, they would either complain about a new thing or not mention
it, which meant it was okay.”
More than other definitions of success, acceptability had affective significance to three
participants, who described a sense of personal satisfaction when providers embrace an
innovation:
[clinic leader] “The other is really qualitative, and if the therapists buy into it and they
say, wow, it's good, it's working, then I know it's successful. But that really means a lot,
because the rest is just kind of going through the motions and making sure you’re
checking this off and let's see if... And it's a little bit more stilted and I think I'm much
more like, if I hear that it's working and they're doing it and they're excited about it then
that's really telling.”
Fidelity, or the degree to which an innovation is implemented as intended by program
developers, was one of the least common definitions of success. As this clinic leader
summarized, fidelity may be important, but implementation success in terms of staff and client
outcomes is more critical to the overall functioning of the agency:
[clinic leader] “Well, I don't know if I would define it by a hundred percent fidelity. I
think fidelity is important. But we need our staffing, and it's a unionized staff. We need
the staff to get the buy-in and to do the training, acquire the skills, and use it. And
patients to be interested in it.”
None of the state executives mentioned fidelity in their definitions of implementation success.
Sustainment, or the extent to which an implemented treatment is maintained within a
clinic’s operations over time, was also a relatively uncommon theme, with participants tending to
focus on more proximal indicators of success. However, this clinic leader explains how
sustainment is integral to implementation success, “To me, if I’ve implemented something and I
go back in six, eight, twelve, two years later and nobody knows about it, nobody’s doing it or it’s
half done, then it wasn’t a successful implementation.”
Exnovation of Failed Innovations
Beliefs about exnovation. Beliefs about exnovation emerged a posteriori from the data
in response to prompts asking participants to describe when they stop using or promoting an
innovation. One of the most common beliefs was that exnovation is atypical. Participants
explained that conscious decisions to stop implementing or promoting an innovation were rarely,
if ever, made— or even considered— in their realm of practice:
[clinic leader] “And, I never hear the question asked, well, should we drop group? And
nobody’s coming upstairs knocking on my door saying, hey, I think we’ve got to get rid
of group because it’s just not working.”
[state executive] “I think the state has a difficult time in doing that. You know, I think it's
much easier to start things than it is to stop them.”
Related to this theme, some participants suggested a need for greater research attention to close
this “gap in the literature” regarding when to stop using an innovation.
Other participants explained that, rather than an active process involving a conscious
exnovation decision, removal of innovations happens passively when the innovation gradually
falls out of use or is “replaced by something better”. One clinic leader stated:
[clinic leader] “I don't think that there's a conscious, okay, we're not going to do this
anymore. I think it's more kind of it peters out or fizzles out. Which happened with a few
evidence based practices that we tried.”
This theme was more prevalent among state executives, some of whom described a process akin to
natural selection, as in this example:
[state executive] “And I think probably a lot of the ones I’ve seen peter out probably
weren’t that beneficial to begin with, or weren’t that practical to begin with. And so I
think probably, that’s probably what will happen is there will be just a natural culling
process…”
A third theme focused around participant perceptions that exnovation is not necessary
because implementation failure is preventable. These participants expressed that they personally
would not allow an implementation effort to fail, as described by this state executive:
[state executive] “That's interesting, because I would say we wouldn't adopt an
innovation that's going to fail. And if it's going to fail, we fix it. So I wouldn't
acknowledge that I would let it fail.”
This theme was more prevalent among clinic leaders, who stated that implementation failure can
be avoided through measures such as pilot testing innovations, adequate pre-adoption
preparation, persevering in the face of problems, and troubleshooting during implementation.
Four of these clinic leaders took personal responsibility for the success or failure of an
implementation effort, as stated by this clinic leader:
[clinic leader] “If I were going to try an innovation I would think that it was going to be
successful because I would set it up to be. And if I think something is not feasible then
why would I try to make it work.”
In contrast to the belief that preventing the need for exnovation is within one’s personal
locus of control, approximately half of the state executives reported that exnovation occurs at the
state level due to changing political priorities. These executives stated that, in some cases,
innovations lose funding or support because policy makers want to shift focus to another need. In
such cases, exnovation may not be data driven, as this state executive explained, “So we'll
decide, for example, whether we're going to continue early recognition and screening not just
based on the performance of the program as a whole, but should we be funding something else
instead even if that's a successful program, because there's some other priority that's emerged.”
Indicators of implementation failure. The top themes related to indicators of
implementation failure or the need for exnovation were similar to themes related to definitions of
success. Provider feedback about an innovation was the most frequently mentioned indicator of
implementation failure for both clinic leaders and state executives. Clinic leaders typically
described receiving provider feedback via an organic, bottom-up process: “I don’t have a shy
staff, so they will come find me and say this is a waste of time.” State executives described the
need to seek feedback about innovations from service organizations, as in this example:
[state executive] “I think it's all about watching across, looking at the uptake, you know,
going to get back to the leadership at the agency level or the program level. You know,
are people promoting it? Are they excited about it?”
Other participants stated that a lack of improvement or inability to meet predetermined
benchmarks related to client outcomes would indicate failure. One clinic leader summarized,
“The primary thing has always got to be patient care, you know quality of care.” A state
executive similarly explained, “And if it’s not producing the kind of result that people were
expecting, let’s say shorter length of stays or reduction in unnecessary readmissions to whatever
service, then I think the process would kind of stop.” Some clinic leaders reported that client
feedback, communicated either directly through complaints or indirectly through decreased
attendance, would signal a need to reconsider an innovation, as this clinic leader expressed, “Or if
people started...the clients actually started complaining and actually started either giving them
written or verbal complaints or actually dropping out of treatment because of the innovations,
then we would stop it immediately.”
Finally, some clinic leaders reported that a lack of financial resources to support
innovation signals failure and a reason for exnovation. This clinic leader elaborated on the types
of resource deficits that may contribute to implementation failure:
[clinic leader] “I mean, I think if we don't have the staff. The staff and the infrastructure
to be able to support its model. You know, just the concretes that go into actually, if we
don't have the space, if we don't have the funds to be able to provide the incentives if
those are not provided for us, if childcare is needed to attend when something...we don't
have sitters who can sit, we don't have the staff to be able to provide coverage for those
kids.”
Discussion
This qualitative study examined clinic leader and state executive perceptions about
implementation of clinical and business innovations. It explored how these stakeholders define
implementation success and decide to stop implementing or promoting a failed innovation. In
line with calls for further research on the connections between internal and external
organizational contexts, it also examined similarities and differences between perceptions of
clinic leaders in the internal context and state executives in the external context (Moullin et al.,
2019).
In their seminal paper on implementation outcomes, Proctor et al. (2011) noted, “the
success of efforts to implement evidence-based treatment may rest on their congruence with the
preferences and priorities of those who shape, deliver, and participate in care. Implementation
outcomes may be differentially salient to various stakeholders, just as the salience of clinical
outcomes varies across stakeholders” (p. 72). The results of this study revealed many areas of
congruence between state executive and clinic leader perceptions. Within both stakeholder
groups, client outcomes were a primary consideration in assessing success and failure, followed
by feedback from providers. In some cases, provider feedback likely represented a proxy for
client outcomes (i.e. provider feedback will be positive if clients are improving), further
underscoring the role of client outcomes as a key driver in human service organizations and
systems (Hasenfeld, 2010). Implementation strategies aimed at increasing motivation and buy-in
for innovations at both the system and organizational level should highlight client outcomes and
provider testimony given the salience of these indicators among stakeholders.
Other outcomes were notably absent from stakeholder definitions of implementation
success. In particular, fidelity was endorsed so infrequently that it would not have been included
as a theme in this study if not for its important conceptual role. This lack of emphasis on fidelity
makes sense given that the New York State Office of Mental Health does not monitor fidelity,
instead focusing on client and process outcomes (K. Hoagwood, personal communication, March
3, 2019). It is also possible that participants viewed fidelity as a proximal step on the path to a
more salient goal (i.e. client outcomes), and future research should examine whether fidelity to
the model is inferred within stakeholder theories of client change. In addition, fidelity may have
been implied in definitions of success involving penetration, or the widespread incorporation of
an innovation within a clinic. However, the lack of explicit focus on fidelity at both the system
and provider levels is in line with other qualitative findings suggesting that fidelity is not a
meaningful outcome for some providers and that more work is needed to bridge provider and
researcher conceptualizations of implementation success (Cutbush, Gibbs, Krieger, Clinton-
Sherrod, & Miller, 2017; Lengnick-Hall, Fenwick, & Henwood, 2018).
In contrast to implementation success, participants had more difficulty discussing
implementation failure. Approximately one third of participants in both stakeholder groups
reported that they rarely or never made conscious decisions to stop using innovations that had
failed on some measure. Some reported that exnovation is not necessary because they would not
allow an implementation effort to fail, or that innovations that are no longer useful simply “fizzle
out” or are replaced with another. These perspectives reflect the pressure for continuous and
persistent implementation in mental health services. They also reflect the gap in research
regarding when and how to exnovate innovations that are no longer useful or do not fit with the
organization or system’s priorities. Given that strategic exnovation can be beneficial, freeing up
resources and time for adopting and implementing alternative innovations, results indicate the
need for greater education and support related to exnovation at both the organizational and
system levels (Rodriguez et al., 2016).
Although this study found that clinic leaders and system executives share many of the
same perceptions, it also highlighted opportunities for improving alignment between stakeholder
groups. For example, half of the state executives reported that innovation rollouts, even those
that show signs of success, may be stopped at the state level due to changing political priorities.
In contrast, clinic leaders rarely included the political context in their discussions of exnovation,
and some believed that preventing the need for exnovation is within their locus of control. These
findings are consistent with previous research suggesting that stakeholders focus on the
processes and factors that are most proximal to them (Rodriguez et al., 2014; Rodriguez et al.,
2018). However, a greater understanding of the connection between political priorities and
innovation rollouts may assist clinic leaders in building multilevel conceptualizations of
implementation that better reflect system realities and enable them to lead implementation efforts
more skillfully. Similarly, understanding how clinic leaders view their roles during
implementation may assist state executives in developing supports to facilitate innovation
rollouts.
Overall, study findings indicate the need for measurable benchmarks defining
implementation success and procedures to address implementation failure that are salient across
researchers, system executives, and providers. Towards achieving these goals, technical
assistance centers—such as the CTAC in the current study—may offer trainings and consultation
for clinic leaders around assessing implementation success, taking steps to address
implementation failure, and systematically exnovating innovations that are no longer useful. In
addition to serving an educational purpose, these interventions could reduce stigma around
discussing implementation failure and exnovation, and provide a forum for addressing issues that
prevent sustainment. With support and guidance from entities in the external context, clinic
leaders should establish organizational protocols for routinely assessing the success of their
implementation efforts, communicating feedback about innovations to state executives, and
developing procedures for exnovating innovations when needed.
Limitations and Future Research Directions
There are several limitations to this study that should be considered when interpreting
results. Given that data were restricted to New York State, participant perspectives may not
generalize to other systems with different priorities, structures, or resources. Further, the lack of
available demographic information about clinic leaders limits assessments of sample
representativeness. In addition, it is possible that differences in terminology used across
interviews (e.g. adoption vs. implementation) may have influenced findings, although there was
consensus among the research team that participants appeared to view such terms
interchangeably. The study questions were asked in the context of a broader interview about
innovation adoption, and additional targeted questions about implementation success and failure
may have yielded more nuanced findings. Finally, the relatively small number of participants in
the state executive interviews limited statistical comparison between stakeholder groups.
Studies using mixed methods, triangulation of outcomes data, and different samples and
settings could address some of these limitations. Future research may address questions raised by
this study, including how stakeholder definitions of success influence their implementation
behavior, whether stakeholder perceptions vary depending on the type of innovation, and
whether stakeholder perceptions about success and failure are associated with differences in
implementation outcomes such as penetration, fidelity, and sustainment. Future studies may also
examine whether stakeholders perceive implementation failure as the opposite of success or as a
qualitatively different phenomenon, and how principles of behavioral economics may be used to
understand exnovation decisions (Bickel, Green, & Vuchinich, 1995; Rice, 2013). This study
offers preliminary evidence for the external validity of Proctor et al.’s (2011) implementation
outcomes, which can be used as a taxonomy in future studies examining the salience of
implementation outcomes with other important stakeholder groups, such as clients and direct
service providers. Finally, in line with other research indicating a need for greater connection
between organizational levels during implementation, future work could identify strategies to
align researcher, system executive, and clinic leader goals during implementation (Moullin et al.,
2019).
Conclusion
The beliefs and priorities of system executives and clinic leaders shape decisions that
impact innovation adoption, implementation, and sustainment in the children’s mental health
system (Aarons et al., 2016). This study explores themes present in stakeholder perceptions
about implementation success and failure that can inform future implementation research and
innovation rollouts. System and organizational leaders can benefit from training and coaching to
refine how they assess, measure, and address implementation success and failure and improve
alignment between stakeholder groups.
References
Aarons, G. A., Green, A. E., Trott, E., Willging, C. E., Torres, E. M., Ehrhart, M. G., & Roesch,
S. C. (2016). The roles of system and organizational leadership in system-wide evidence-
based intervention sustainment: A mixed-method study. Administration and Policy in
Mental Health and Mental Health Services Research, 43(6), 991-1008.
Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of
evidence-based practice implementation in public service sectors. Administration and
Policy in Mental Health and Mental Health Services Research, 38, 4-23.
Aarons, G. A., Wells, R. S., Zagursky, K., Fettes, D. L., & Palinkas, L. A. (2009). Implementing
evidence-based practice in community mental health agencies: A multiple stakeholder
analysis. American Journal of Public Health, 99(11), 2087-2095.
Balas, E. A., & Boren, S. A. (2000). Managing clinical knowledge for health care
improvement. In Bemmel J., & McCray, A. T. (Eds.), Yearbook of medical informatics
2000: Patient-centered systems (pp. 65-70). Stuttgart, Germany: Schattauer.
Beidas, R. S., Stewart, R. E., Adams, D. R., Fernandez, T., Lustbader, S., Powell, B. J., ... &
Rubin, R. (2016). A multi-level examination of stakeholder perspectives of
implementation of evidence-based practices in a large urban publicly-funded mental
health system. Administration and Policy in Mental Health and Mental Health Services
Research, 43(6), 893-908.
Bickel, W. K., Green, L., & Vuchinich, R. E. (1995). Behavioral economics. Journal of the
Experimental Analysis of Behavior, 64(3), 257-262.
Boren, S. A., & Balas, E. A. (1999). Evidence-based quality measurement. Journal of
Ambulatory Care Management, 22(3), 17–23.
Bruns, E. J., & Hoagwood, K. E. (2008). State implementation of evidence-based practice for
youths, part I. Responses to the state of the evidence. Journal of the American Academy
of Child and Adolescent Psychiatry, 47, 369–373.
Brodkin, E. Z. (2010). Human service organizations and the politics of practices. In Y. Hasenfeld
(Ed.), Human services as complex organizations (2nd ed., pp. 61-78). Thousand Oaks, CA: Sage.
Carstens, C. A., Panzano, P. C., Massatti, R., Roth, D., & Sweeney, H. A. (2009). A naturalistic
study of MST dissemination in 13 Ohio communities. The Journal of Behavioral Health
Services & Research, 36(3), 344-360.
Crabtree, B. F., & Miller, W. L. (Eds.). (1999). Doing qualitative research. Thousand Oaks, CA:
Sage Publications, Inc.
Chor, K. H. B., Olin, S. C. S., Weaver, J., Cleek, A. F., McKay, M. M., Hoagwood, K. E., &
Horwitz, S. M. (2014). Adoption of clinical and business trainings by child mental health
clinics in New York State. Psychiatric Services, 65(12), 1439-1444.
Cutbush, S., Gibbs, D., Krieger, K., Clinton-Sherrod, M., & Miller, S. (2017). Implementers’
perspectives on fidelity of implementation: “Teach every single part” or “be right with
the curriculum”? Health Promotion Practice, 18(2), 275–282.
Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C.
(2009). Fostering implementation of health services research findings into practice: A
consolidated framework for advancing implementation science. Implementation Science,
4, 50-65.
Dorsey, S., Berliner, L., Lyon, A. R., Pullmann, M. D., & Murray, L. K. (2016). A statewide
common elements initiative for children’s mental health. The Journal of Behavioral
Health Services & Research, 43(2), 246-261.
Feldstein, A. C., & Glasgow, R. E. (2008). A Practical, Robust, Implementation and
Sustainability Model (PRISM) for integrating research findings into practice. The Joint
Commission Journal on Quality and Patient Safety, 34(4), 228-243.
Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for
qualitative research. New York: Aldine de Gruyter.
Green, L. W. (2008). Making research relevant: If it is an evidence-based practice, where's the
practice-based evidence? Family Practice, 25(supp. 1), i20-i24.
Green, A. E., & Aarons, G. A. (2011). A comparison of policy and direct practice stakeholder
perceptions of factors affecting evidence-based practice implementation using concept
mapping. Implementation Science, 6, 104-115.
Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of
innovations in service organizations: Systematic review and recommendations. Milbank
Quarterly, 82(4), 581–629.
Gronbjerg, K. A. (2010). The U.S. nonprofit human service sector: A creeping revolution. In Y.
Hasenfeld (Ed.), Human services as complex organizations (2nd ed., pp. 79-98). Thousand Oaks, CA: Sage.
Hasenfeld, Y. (2010). The attributes of human service organizations. In Y. Hasenfeld (Ed.),
Human services as complex organizations (2nd ed., pp. 9-32). Thousand Oaks, CA: Sage.
Hoagwood, K. E., Olin, S. S., Horwitz, S., McKay, M., Cleek, A., Gleacher, A., ... & Kuppinger,
A. (2014). Scaling up evidence-based practices for children and families in New York
State: Toward evidence-based policies on implementation for state mental health
systems. Journal of Clinical Child & Adolescent Psychology, 43(2), 145-157.
Lau, A. S., & Brookman-Frazee, L. (2016). The 4KEEPS study: identifying predictors of
sustainment of multiple practices fiscally mandated in children’s mental health
services. Implementation Science, 11, 31-38.
Lengnick-Hall, R., Fenwick, K., & Henwood, B. (2018). “It’s like you do it without knowing
that you’re doing it”: Practitioner experiences with ACT implementation. Community
Mental Health Journal, 55(3), 448-453.
Massatti, R. R., Sweeney, H. A., Panzano, P. C., & Roth, D. (2008). The de-adoption of
innovative mental health practices (IMHP): Why organizations choose not to sustain an
IMHP. Administration and Policy in Mental Health and Mental Health Services
Research, 35(1-2), 50-65.
McHugh, R., & Barlow, D. (2010). The dissemination and implementation of evidence-based
psychological treatments: A review of current efforts. American Psychologist, 65(2), 73-
84.
Moullin, J. C., Dickson, K. S., Stadnick, N. A., Rabin, B., & Aarons, G. A. (2019). Systematic
review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework.
Implementation Science, 14, 1-16.
Padgett, D. K. (2012). Qualitative and mixed methods in public health. Thousand Oaks, CA:
Sage Publications, Inc.
Palinkas, L. A., Um, M. Y., Jeong, C. H., Chor, K. H. B., Olin, S., Horwitz, S. M., & Hoagwood,
K. E. (2017). Adoption of innovative and evidence-based practices for children and
adolescents in state-supported mental health clinics: A qualitative study. Health Research
Policy and Systems, 15, 27-35.
Proctor, E. K., Knudsen, K. J., Fedoravicius, N., Hovmand, P., Rosen, A., & Perron, B. (2007).
Implementation of evidence-based practice in community behavioral health: Agency
director perspectives. Administration and Policy in Mental Health and Mental Health
Services Research, 34(5), 479-488.
Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., ... & Hensley, M.
(2011). Outcomes for implementation research: conceptual distinctions, measurement
challenges, and research agenda. Administration and Policy in Mental Health and Mental
Health Services Research, 38(2), 65-76.
Raghavan, R., Bright, C. L., & Shadoin, A. L. (2008). Toward a policy ecology of
implementation of evidence-based practices in public mental health
settings. Implementation Science, 3, 26-34.
Rapp, C. A., Bond, G. R., Becker, D. R., Carpinello, S. E., Nikkel, R. E., & Gintoli, G. (2005).
The role of state mental health authorities in promoting improved client outcomes
through evidence-based practice. Community Mental Health Journal, 41(3), 347-363.
Rice, T. (2013). The behavioral economics of health and health care. Annual Review of Public
Health, 34, 431-447.
Rodriguez, A., Lau, A. S., Wright, B., Regan, J., & Brookman-Frazee, L. (2018). Mixed-method
analysis of program leader perspectives on the sustainment of multiple child evidence-
based practices in a system-driven implementation. Implementation Science, 13, 44-57.
Rodríguez, A., Southam-Gerow, M. A., O'Connor, M. K., & Allin Jr, R. B. (2014). An analysis
of stakeholder views on children's mental health services. Journal of Clinical Child &
Adolescent Psychology, 43(6), 862-876.
Rodriguez, H. P., Henke, R. M., Bibi, S., Ramsay, P. P., & Shortell, S. M. (2016). The
exnovation of chronic care management processes by physician organizations. The
Milbank Quarterly, 94(3), 626-653.
Strauss, A. L., & Corbin, J. (1998). Basics of qualitative research (2nd ed.). Thousand Oaks, CA:
Sage Publications, Inc.
Ubel, P. A., & Asch, D. A. (2015). Creating value in health by understanding and overcoming
resistance to de-innovation. Health Affairs, 34(2), 239-244.
Viera, A. J., & Garrett, J. M. (2005). Understanding interobserver agreement: the kappa
statistic. Family Medicine, 37(5), 360-363.
Willging, C. E., Green, A. E., Gunderson, L., Chaffin, M., & Aarons, G. A. (2015). From a
“perfect storm” to “smooth sailing”: Policymaker perspectives on implementation and
sustainment of an evidence-based practice in two states. Child Maltreatment, 20(1), 24-
36.
Willms, D. G., Best, J. A., Taylor, D. W., Gilbert, J. R., Wilson, D., Lindsay, E. A., & Singer, J.
(1990). A systematic approach for using qualitative methods in primary prevention
research. Medical Anthropology Quarterly, 4(4), 391-409.
Wisdom, J. P., Chor, K. H. B., Hoagwood, K. E., & Horwitz, S. M. (2014). Innovation adoption:
a review of theories and constructs. Administration and Policy in Mental Health and
Mental Health Services Research, 41(4), 480-502.
Table 4.1
Percent of Clinic Leaders (n = 68) and State Executives (n = 9) Endorsing Study Themes
Definitions of success
  Client outcomes: clinic leaders 36%, state executives 57%
    “I mean, the finances will always be in the room as an issue for us. But as a clinical service, the only way to define it, as far as I’m concerned, is by patient outcomes.” [clinic leader]
  Client satisfaction: clinic leaders 15%, state executives 0%
    “I think it would be a program that clients would give positive feedback about. So something that the client feels was really worth their time.” [clinic leader]
  Penetration: clinic leaders 35%, state executives 14%
    “…it’s being used regularly by everybody that’s been trained in an effective way for the right population.” [clinic leader]
  Acceptability: clinic leaders 22%, state executives 29%
    “…I think for me, a successful innovation will empower people to get excited about what they do.” [state executive]
  Fidelity: clinic leaders 7%, state executives 0%
    “…monitoring our fidelity, in terms of once we train people in a practice, how successful are we in being sure that we're providing our supervision from that framework.” [clinic leader]
  Sustainment: clinic leaders 5%, state executives 14%
    “Well, I guess whether it hangs on, whether it stays after the pilot.” [state executive]

Exnovation beliefs
  Atypical: clinic leaders 31%, state executives 38%
    “…I don't know if I've ever really stopped.” [state executive]
  Passive: clinic leaders 13%, state executives 13%
    “…the first thing that occurred to me is that it would be replaced by something better.” [state executive]
  Preventable: clinic leaders 29%, state executives 25%
    “…if you've done enough research on it in other places, and feel like you've been thorough in it and you feel like you've designed a good plan and if you're on target, trying it that's going to be a good effort.” [clinic leader]
  Political: clinic leaders 3%, state executives 50%
    “I think also that sometimes what the innovation is promoting is something that's not as important to us anymore.” [state executive]

Indicators of failure
  Provider feedback: clinic leaders 37%, state executives 38%
    “…the first signs are people are rolling their eyes and saying, this is…this is way too much time and effort for the amount of benefit we’re getting from it.” [clinic leader]
  Client outcomes: clinic leaders 20%, state executives 25%
    “And what does it look like for the consumers? What are the factors that we set out to change, and is it having an impact?” [state executive]
  Client feedback: clinic leaders 27%, state executives 0%
    “…feedback from clients through consumer satisfaction surveys. So we’d maybe be able to get feedback that way, things of that nature.” [clinic leader]
  Resources: clinic leaders 17%, state executives 0%
    “…usually it has always been a financial reason, because the issues about whether or not we should be doing it for a population has never really come up.” [clinic leader]
Chapter Five: Conclusions, Implications, and Future Directions
Introduction
Improving access to high-quality, evidence-based care is one of the most pressing issues
in child and adolescent mental health services today (Novins, Green, Legha, & Aarons, 2013).
Although researchers have developed numerous clinical and business innovations that increase
quality of care, low rates of uptake by mental health service organizations have led to the need
for targeted efforts to increase innovation dissemination, adoption, and implementation (Beidas
& Kendall, 2014). State oversight agencies are in optimal positions to influence the use of
innovations within their systems through large-scale rollouts. These rollouts offer unique
opportunities for studying adoption and implementation within real-world service organizations.
This dissertation examined adoption and implementation within the context of a rollout
in New York State’s child and adolescent mental health system. The state Office of Mental
Health founded the Community Technical Assistance Center to offer training in clinical and
business innovation via webinars, in-person sessions, and learning collaboratives (Chor et al.,
2014). The three studies in this dissertation explored clinic leader perspectives related to
adoption and implementation using qualitative interviews with leaders of 34 clinics selected to
represent a 10% stratified sample of all clinics in the children’s public mental health system.
Training participation was used as a proxy of clinic adoption, and administrative data related to
clinic characteristics and interviews with policy makers were used to incorporate factors in the
organizational and external contexts. Each study is summarized below, along with two of its main
practical, theoretical, or methodological contributions.
Major Findings
Study One
Study One examined the external sources that clinics rely on for information about
innovations, and how differences in organizational absorptive capacity were associated with
clinic adoption of CTAC innovations. Results offer empirical evidence of the most commonly
accessed sources of information about innovations (e.g., government sources, peer organizations,
professional associations) and information about factors (culture, knowledge-sharing
mechanisms, and leader capacity) that impact information seeking and processing. These
findings increase understanding of key tasks in the pre-adoption/exploration phase of
implementation, which has received relatively little research attention despite its role in setting
the tone for the larger implementation process (Aarons, Hurlburt, & Horwitz, 2011; Wisdom,
Chor, Hoagwood, & Horwitz, 2014; Chor, Wisdom, Olin, Hoagwood, & Horwitz, 2015).
A second contribution is the finding that clinics with higher absorptive capacity were
higher adopters of CTAC innovations. Although this study did not statistically assess the
relationship between organizational processes and innovation adoption, results indicated that
higher-adopting clinics were more likely to obtain information during conversations with peers
and colleagues; were active seekers rather than passive recipients of information; and were less likely to
report information ‘overload’. They also had cultures that valued evidence-based practice and
quality improvement, and shared information about innovations via both top-down and bottom-
up mechanisms. These results suggest that building leader and organizational absorptive capacity
may improve implementation even at its earliest stage.
Study Two
Study Two explored how combinations of clinic characteristics from administrative data
and leader-reported drivers of adoption decisions were associated with clinic adoption of CTAC
innovations. Study Two’s first contribution is its identification of different pathways to adoption
of CTAC innovations in this study’s context. Rather than a single correct recipe, there were
multiple configurations associated with adoption. Further, malleable factors related to leader
perceptions of culture and innovation-organization fit were involved in all of them, suggesting
that targeting provider values, attitudes, and perceptions related to innovations may improve
adoption even in smaller and/or less fiscally efficient clinics.
Study Two’s second contribution is its demonstration of the benefits of using
configurational approaches such as QCA to examine implementation behaviors. Few studies
have explored implementation in mental health services using such methodologies, despite
current frameworks characterizing implementation as grounded in complex causality and
equifinality (Aarons et al., 2011; Wisdom et al., 2014). Results from this study illustrate how
such approaches offer nuance to regression-based quantitative findings, and contextualize
qualitative findings by linking them to organizational-level characteristics and outcomes.
Study Three
Study Three examined leader and state executive attitudes, beliefs, and perspectives
around defining implementation success and deciding to stop implementing a failed innovation.
One of Study Three’s main contributions is its cataloging of stakeholder definitions of
implementation success. Stakeholders prioritized client outcomes and provider feedback,
offering insight into the values that drive their implementation decision-making. In general,
stakeholder definitions aligned with Proctor et al.’s (2011) taxonomy of implementation
outcomes, supporting the external validity of this taxonomy in mental health services.
This study’s findings also point to an important gap in implementation science regarding
exnovation, or the removal of an innovation that is no longer useful or has failed on some
measure. Stakeholders had difficulty discussing implementation failure, and reported that they
rarely, if ever, made conscious exnovation decisions, likely reflecting the bias for continuous and
persistent implementation in mental health services. These findings highlight the need for
additional research, education, and support related to exnovation within academic, policy, and
service realms.
Practical Implications
Findings from these three studies, taken together with other research, have practical
implications for funders and policy makers who develop and oversee innovation rollouts. They
also have implications for directors and managers of service organizations seeking to implement
new clinical or quality improvement practices. These practical implications and
recommendations are discussed below.
Policy Makers and Oversight Agencies
This dissertation is set in one of the leading states in efforts to implement clinical and
business innovations within child and adolescent mental health services (Gleacher et al., 2011).
The CTAC model offers an example for other funders and policy makers looking to increase
adoption and implementation of innovations within their systems. CTAC provides trainings free
of charge, using a variety of modalities so that clinics can select trainings based on their desired
level of intensity. In addition to clinical practice trainings, it offers business-related trainings that
assist clinics in understanding and adapting to the constantly changing fiscal and policy demands
of their environment.
Qualitative interviews revealed that many clinic leaders in this study perceived CTAC
trainings as convenient, relevant, and beneficial. These findings complement adoption outcomes
data, which indicated that the majority of clinics accessed CTAC trainings in the years following
the rollout (Chor et al., 2014). Based on its success, the CTAC has since expanded its scope
through the addition of the Managed Care Technical Assistance Center (MCTAC), which
prepares clinics to transition to Medicaid managed care. Further, it has expanded its reach to
include all publicly funded mental health clinics in the State (not just those that serve children)
(www.ctacny.org).
Through its exploration of clinic leader perspectives related to adoption and
implementation in the New York State children’s mental health system, this dissertation offers
several implications for service systems. First, results emphasize the need for holistic approaches
to quality improvement at the system level that go beyond a singular focus on disseminating
innovations. Organizational context factors such as culture, leadership, attitudes towards
innovations, and knowledge-sharing mechanisms had critical influence on clinic adoption of
innovations in this study. Therefore, resources aimed at enhancing the general organizational
context of clinics may enhance clinics’ capacity and motivation to adopt and implement
innovations that improve quality of care (Novins et al., 2013; Leeman et al., 2017). Interventions
to improve organizational contexts may include training, technical assistance, and consultation
around creating clinic cultures that support innovations, developing bidirectional
communication channels, and building leadership skills and styles (Aarons, Ehrhart, Farahnak, & Sklar,
2014; Aarons, Ehrhart, Moullin, Torres, & Green, 2017; Greenhalgh, Robert, Macfarlane, Bate,
& Kyriakidou, 2004; Richter et al., 2016). Other interventions may involve creating
opportunities for peer networking or learning collaboratives around cultivating organizational
contexts that support quality improvement (Leeman et al., 2017).
Related to the need for more holistic approaches to quality improvement, results from this
dissertation suggest that clinics may benefit from targeted implementation support to augment
trainings in clinical and business innovations. To date, CTAC has not offered trainings focusing
on issues related to adoption or implementation; however, these issues may be addressed through
individual consultation (K. Hoagwood, personal communication, March 5, 2019).
Implementation-related trainings, technical support, and/or consultations may guide clinic
leaders in how to identify innovations that fit their needs, weigh adoption decisions, evaluate
implementation success, and exnovate innovations that are no longer useful. Implementation
strategies that focus on interagency collaboration, such as building networks of peer adopters
who problem-solve implementation barriers, or forming partnerships between stakeholders from different
systems, may also be useful (Hurlburt et al., 2014). However, these strategies have shown mixed
results, and more research is needed to weigh their costs and benefits (Brown et al., 2014).
Public mental health services have long been underfunded and under-resourced, creating
challenges for adoption and implementation. The New York State system is no exception, and
clinic leaders in this study endorsed capacity issues related to time, finances, and staffing as
barriers to accessing information, participating in trainings, and implementation success. New
York State has taken important steps towards assisting clinics in navigating financial and billing
challenges via the CTAC, and later the MCTAC. However, even though CTAC trainings are free
of charge, clinic leaders explained that taking time away from billing for staff training results in
lost revenue. In some clinics, the perceived benefits of the trainings outweighed the costs, but in
others, capacity issues prevented adoption. Funders and oversight agencies wishing to increase
innovation adoption in their systems may consider offering financial incentives to offset some of
the losses associated with training participation (Beidas et al., 2016).
This dissertation’s findings illustrate the importance of communication between system
administrators and clinic leaders. Many clinic leaders reported that they relied on the state Office
of Mental Health for information about clinical and business innovations. Further, adoption
decisions were frequently driven by clinic leader perceptions of innovation relevance. Therefore,
consensus between system administrators and clinic leaders regarding training needs and
priorities can help to further the agendas of both stakeholder groups. The Office of Mental
Health and analogous oversight agencies in other systems should create opportunities for
communication between state executives and providers by conducting surveys, seeking feedback
around specific initiatives, and dialoguing with providers who attend their trainings.
Results also suggest opportunities to increase the success of future rollouts through
engagement between system administrators and clinics that do not access CTAC trainings. As
previously mentioned, results from this dissertation align with others indicating that clinics that
are more supportive of and open to innovations are more likely to adopt innovations (Aarons &
Sawitzky, 2006; Aarons & Sommerfeld, 2012; Ehrhart, Aarons, & Farahnak, 2014; Glisson et
al., 2008). Conversely, clinics with the highest need for quality improvement (i.e., those with
leadership and/or cultures that are not open to innovations) are less likely to adopt innovations,
creating what Saldana and Chamberlain (2012) refer to as a “needs-innovation paradox”.
Therefore, system administrators looking to increase the quality of mental health services should
conduct needs assessments with non-adopting clinics to identify their barriers, facilitators, and
drivers of adoption decisions. They should target non-adopting clinics for additional support
around implementation and capacity building, using a collaborative approach, rather than relying
on these clinics to engage with system resources.
Clinic Leaders
This dissertation’s findings also have practical implications for clinic leaders. Most
importantly, leaders should strive to create organizational contexts that are open to adoption and
implementation of innovations. They may do this using strategies grounded in Schein’s (2010)
embedding mechanisms and applied to health care organizations by Aarons and colleagues
(2014). These include allocating resources for innovation adoption, sharing research evidence,
rewarding staff who promote use of innovations, and role modeling positive attitudes towards
innovation and implementation (Aarons et al., 2014). Leaders should also seek to improve their
personal capacity to guide their organizations in adopting and implementing quality
improvement initiatives via trainings in areas such as transformational leadership or
implementation leadership (Aarons et al., 2017; Richter et al., 2016). Further, given the
importance of communication between the internal and external organizational contexts for the
success of innovation rollouts, clinic leaders should proactively communicate problems and
needs to system administrators rather than waiting for system administrators to seek their
feedback. For example, if trainings related to leadership and climate are not readily available,
leaders should advocate for them with system administrators.
Theoretical Implications
Results have theoretical implications for implementation science and mental health
services research. First, they contribute to large bodies of work underscoring the importance of
inner organizational contextual factors such as leadership, culture, and provider attitudes for
implementation (Aarons & Sawitzky, 2006; Aarons & Sommerfeld, 2012; Ehrhart, Aarons, &
Farahnak, 2014; Glisson et al., 2008). This dissertation’s findings extend this work by offering
empirical evidence of the relationship between these inner contextual factors and adoption
outcomes such as training participation. Further, this dissertation contributes to conceptual work
on adoption, the understudied but critical first stage in implementation (Chor et al., 2015). Most
empirical studies focus on implementation after the adoption decision is already made, with far
fewer delving into the processes that define the pre-adoption and adoption phases. This
dissertation offers examples of factors and processes related to adoption from a real-world
service context that corroborate and contribute to implementation frameworks (Aarons et al.,
2011; Wisdom et al., 2014). In addition, this dissertation improves understanding of bidirectional
relationships between factors in the internal organization and external system contexts,
contributing to an emerging area of implementation science focusing on the interconnections
between organization and system levels (Moullin et al., 2019). Finally, this study’s findings
related to innovation model fidelity suggest future directions for implementation research.
Fidelity was not a salient outcome to stakeholders in this study, likely because it was not
monitored at the New York State system level (K. Hoagwood, personal communication, March
5, 2019). This finding is in line with others indicating that implementation researchers should
further investigate the role of fidelity in stakeholder theories of change, and tailor evaluations to
include outcomes that are salient within the system under study (Cutbush, Gibbs, Krieger,
Clinton-Sherrod, & Miller, 2017; Lengnick-Hall, Fenwick, & Henwood, 2018).
Methodological Implications
This dissertation has several methodological implications for future studies evaluating
efforts to scale up innovations. It illustrates how qualitative interviews with stakeholders may be
used to augment findings from quantitative adoption outcomes, adding context and giving
service providers a voice (Chor et al., 2014; Olin et al., 2015). As such, it emphasizes the
importance of using mixed methods designs to evaluate innovation rollouts (Palinkas et al.,
2011). In addition, this dissertation argues for the use of configurational approaches to studying
implementation, which align with current conceptualizations of adoption and implementation as
complex, multifactorial, equifinal processes (Aarons et al., 2011; Kane, Lewis, Williams, &
Kahwati, 2014; Wisdom et al., 2014). Further, findings illustrate the importance of connections
between the inner organization and external system contexts, indicating that future
implementation studies should examine entire service systems in order to attend to these
interconnections. Finally, future work using longitudinal designs is necessary in order to assess
the temporal relationship between stakeholder perspectives and adoption and implementation
behaviors.
Limitations and Future Directions
Limitations related to each study’s methodology are presented in detail under that study’s
Discussion section. However, there are several limitations that are applicable across this
dissertation. These should be kept in mind when interpreting results.
First, this dissertation relies on participation in CTAC training as a proxy for adoption.
Although this measure offers advantages over simple yes/no adoption outcomes, it does not
assess quality of implementation or sustainment. Further, this dissertation relies on qualitative
interview data with a relatively small sample of participants, and its methodologies are not
designed to assess causality. Therefore, findings should be viewed as exploratory and
conclusions should be interpreted as qualitative assessments by investigators rather than
statistical inferences. Finally, this dissertation’s data comes from a single service system,
potentially confounding findings with this study’s system context.
These limitations suggest several areas for future research. Longitudinal designs and
replication in other states and/or service systems can assess the generalizability of this study’s
results. Future studies can examine how clinic leader perceptions relate to adoption using multi-
indicator outcomes that incorporate independent verification of adoption and implementation
behaviors. Future studies may also test strategies targeted at building system capacity using
rigorous, controlled designs, such as that applied in a recent California/Ohio innovation rollout
(Saldana & Chamberlain, 2012).
Conclusion
This dissertation examined adoption and implementation of clinical and business
innovations in the context of a large-scale rollout in the public children’s mental health service
system. Results support a holistic approach to improving quality of services that combines
training, technical support, and consultation in innovative practices with targeted implementation
support and organizational capacity-building strategies. Findings also point to the mutual
benefits of intentional, regular, and sustained communication between clinic leaders and
system administrators regarding implementation and training needs. These findings can inform
future innovation rollouts and subsequently improve quality of mental health care for children,
adolescents, and their families.
References
Aarons, G. A., Ehrhart, M. G., Farahnak, L. R., & Sklar, M. (2014). Aligning leadership across
systems and organizations to develop a strategic climate for evidence-based practice
implementation. Annual Review of Public Health, 35, 255-274.
Aarons, G. A., Ehrhart, M. G., Moullin, J. C., Torres, E. M., & Green, A. E. (2017). Testing the
leadership and organizational change for implementation (LOCI) intervention in
substance abuse treatment: a cluster randomized trial study protocol. Implementation
Science, 12, 29-39.
Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of
evidence-based practice implementation in public service sectors. Administration and
Policy in Mental Health and Mental Health Services Research, 38, 4-23.
Aarons, G. A., & Sawitzky, A. C. (2006). Organizational culture and climate and mental health
provider attitudes toward evidence-based practice. Psychological Services, 3(1), 61-72.
Aarons, G. A., & Sommerfeld, D. H. (2012). Leadership, innovation climate, and attitudes
toward evidence-based practice during a statewide implementation. Journal of the
American Academy of Child & Adolescent Psychiatry 51, 423-431.
Beidas, R. S., & Kendall, P. C. (Eds.). (2014). Dissemination and implementation of evidence-
based practices in child and adolescent mental health. New York, NY: Oxford
University Press.
Beidas, R. S., Stewart, R. E., Adams, D. R., Fernandez, T., Lustbader, S., Powell, B. J., ... &
Rubin, R. (2016). A multi-level examination of stakeholder perspectives of
implementation of evidence-based practices in a large urban publicly-funded mental
health system. Administration and Policy in Mental Health and Mental Health Services
Research, 43(6), 893-908.
Brown, C. H., Chamberlain, P., Saldana, L., Padgett, C., Wang, W., & Cruden, G. (2014).
Evaluation of two implementation strategies in 51 child county public service systems in
two states: Results of a cluster randomized head-to-head implementation trial.
Implementation Science, 9, 134-148.
Chor, K. H. B., Olin, S. C. S., Weaver, J., Cleek, A. F., McKay, M. M., Hoagwood, K. E., &
Horwitz, S. M. (2014). Adoption of clinical and business trainings by child mental health
clinics in New York State. Psychiatric Services, 65(12), 1439-1444.
Chor, K. H. B., Wisdom, J. P., Olin, S. C. S., Hoagwood, K. E., & Horwitz, S. M. (2015).
Measures for predictors of innovation adoption. Administration and Policy in Mental
Health and Mental Health Services Research, 42(5), 545-573.
Cutbush, S., Gibbs, D., Krieger, K., Clinton-Sherrod, M., & Miller, S. (2017). Implementers’
perspectives on fidelity of implementation: “Teach every single part” or “be right with
the curriculum”? Health Promotion Practice, 18(2), 275–282.
Ehrhart, M. G., Aarons, G. A., & Farahnak, L. R. (2014). Assessing the organizational context
for EBP implementation: the development and validity testing of the Implementation
Climate Scale (ICS). Implementation Science, 9, 157-167.
Gleacher, A. A., Nadeem, E., Moy, A. J., Whited, A. L., Albano, A. M., Radigan, M., ... &
Hoagwood, K. (2011). Statewide CBT training for clinicians and supervisors treating
youth: The New York State evidence-based treatment dissemination center. Journal of
Emotional and Behavioral Disorders, 19(3), 182-192.
Gleacher, A. A., Olin, S. S., Nadeem, E., Pollock, M., Ringle, V., Bickman, L., ... & Hoagwood,
K. (2016). Implementing a measurement feedback system in community mental health
clinics: A case study of multilevel barriers and facilitators. Administration and Policy in
Mental Health and Mental Health Services Research, 43(3), 426-440.
Glisson, C., Landsverk, J., Schoenwald, S., Kelleher, K., Hoagwood, K. E., Mayberg, S., ... &
Research Network on Youth Mental Health. (2008). Assessing the organizational social
context (OSC) of mental health services: Implications for research and practice.
Administration and Policy in Mental Health and Mental Health Services Research, 35,
98-113.
Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of
innovations in service organizations: Systematic review and recommendations. Milbank
Quarterly, 82(4), 581–629.
Hurlburt, M., Aarons, G. A., Fettes, D., Willging, C., Gunderson, L., & Chaffin, M. J. (2014).
Interagency collaborative team model for capacity building to scale-up evidence-based
practice. Children and Youth Services Review, 39, 160-168.
Kane, H., Lewis, M. A., Williams, P. A., & Kahwati, L. C. (2014). Using qualitative comparative
analysis to understand and quantify translation and implementation. Translational
Behavioral Medicine, 4(2), 201-208.
Leeman, J., Birken, S. A., Powell, B. J., Rohweder, C., & Shea, C. M. (2017). Beyond
“implementation strategies”: Classifying the full range of strategies used in
implementation science and practice. Implementation Science, 12, 125-133.
Lengnick-Hall, R., Fenwick, K., & Henwood, B. (2018). “It’s like you do it without knowing
that you’re doing it”: Practitioner experiences with ACT implementation. Community
Mental Health Journal, 55(3), 448-453.
Moullin, J. C., Dickson, K. S., Stadnick, N., Rabin, B., & Aarons, G. A. (2019). Systematic
review of the exploration, preparation, implementation, sustainment (EPIS) framework.
Implementation Science, 14, 1-15.
Novins, D. K., Green, A. E., Legha, R. K., & Aarons, G. A. (2013). Dissemination and
implementation of evidence-based practices for child and adolescent mental health: A
systematic review. Journal of the American Academy of Child & Adolescent
Psychiatry, 52(10), 1009-1025.
Olin, S. C. S., Chor, K. H. B., Weaver, J., Duan, N., Kerker, B. D., Clark, L. J., ... & Horwitz, S.
M. (2015). Multilevel predictors of clinic adoption of state-supported trainings in
children’s services. Psychiatric Services, 66(5), 484-490.
Palinkas, L. A., Aarons, G. A., Horwitz, S., Chamberlain, P., Hurlburt, M., & Landsverk, J.
(2011). Mixed method designs in implementation research. Administration and Policy in
Mental Health and Mental Health Services Research, 38(1), 44-53.
Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., ... & Hensley, M.
(2011). Outcomes for implementation research: conceptual distinctions, measurement
challenges, and research agenda. Administration and Policy in Mental Health and Mental
Health Services Research, 38(2), 65-76.
Richter, A., von Thiele Schwarz, U., Lornudd, C., Lundmark, R., Mosson, R., & Hasson, H.
(2016). iLead—A transformational leadership intervention to train healthcare managers’
implementation leadership. Implementation Science, 11, 108-124.
Saldana, L., & Chamberlain, P. (2012). Supporting implementation: The role of community
development teams to build infrastructure. American Journal of Community
Psychology, 50(3-4), 334-346.
Schein, E. (2010). Organizational culture and leadership. San Francisco: Wiley.
Wisdom, J. P., Chor, K. H. B., Hoagwood, K. E., & Horwitz, S. M. (2014). Innovation adoption:
A review of theories and constructs. Administration and Policy in Mental Health and
Mental Health Services Research, 41(4), 480-502.
Appendix A
List of CTAC Trainings from September 2011-August 2013
Each training is listed with its intensity and duration in parentheses.

Clinical Evidence-Informed Practices (18 trainings)
- Working with Children Suffering from Trauma – Trauma Assessment (Webinar, 1 hour)
- Motivational Interviewing in Children’s Services (Webinar, 1 hour)
- Working with Children Suffering from Trauma – Trauma Treatment (Webinar, 1 hour)
- Motivational Interviewing in Children’s Services, Part II (Webinar, 1 hour)
- Autism Spectrum Disorders and Mental Health Outpatient Settings (Webinar, 1 hour)
- Disaster Trauma in Outpatient Settings (Webinar, 1 hour)
- Introduction to Cognitive Behavior Therapy (CBT) (Webinar, 1 hour)
- Secondary Trauma and Compassion Fatigue (Webinar, 1 hour)
- Family Focused Engagement in Child Mental Health Services (Webinar, 1 hour)
- Parent Partnership (Webinar, 1 hour)
- CPT Coding Changes, in Partnership with the Coalition of Behavioral Health Agencies (Webinar, 1 hour)
- Child Development and the Brain: Promoting Resilience and Joy (Webinar, 1 hour)
- Peers in Clinic: Improving Quality and Outcomes (Webinar, 1 hour)
- Suicide Prevention: Recognize the Signs – Take Actions to Save a Life (Webinar, 1 hour)
- 4Rs and 2Ss (Group Model) Open Training [Strengthening families] (In-person, 1 day)
- 4Rs and 2Ss (Group Model) – Rules, Roles and Responsibilities, Respectful Communication, Relationships, Stress, and Social Support [Strengthening families] (Learning collaborative, 12 months)
- 4Rs and 2Ss (Individual Model) – Rules, Roles and Responsibilities, Respectful Communication, Relationships, Stress, and Social Support [Strengthening families] (Learning collaborative, 6 months)
- Practitioner Education and Decision Support (PEDS) (Learning collaborative, 12 months)

Business/Quality Improvement (12 trainings)
- Basic Tools to Get on the Road to Financial Planning (Webinar, 1 hour)
- Financial Modeling Tools: Setting Benchmark for Fiscal Viability (Webinar, 1 hour)
- Integrating Services Delivery with your Financial Model: Understanding the Impact (Webinar, 1 hour)
- Managing your Workload: A Therapist Self-Management Tool for Productivity Demands (Webinar, 1 hour)
- Uncovering the Elements: Quality Assurance, Corporate Compliance, and Risk Management (Webinar, 1 hour)
- Staff Performance Reporting & Monitoring: Coaching for Success (Webinar, 1 hour)
- Business Webinar: Open Access (Webinar, 1 hour)
- Business Webinar: Collaborative Documentation (Webinar, 1 hour)
- Business Webinar: Centralized Scheduling (Webinar, 1 hour)
- Successfully Meeting the Challenges of a Changing Behavioral Health System (In-person, 1 day)
- Business Efficiencies and Effectiveness Project (BEEP) (Learning collaborative, 18 months)
- Business Effectiveness Assessment Module (BEAM) (Learning collaborative, 6 months)

Hybrid (3 trainings)
- Outcome Measurement: Clinical Support Tools (Webinar, 1 hour)
- Outcome Measurement: Clinical Support Tools, Part II (Webinar, 1 hour)
- Training Intervention for the Engagement of Families (TIES) (In-person, 1 day)
Adapted from Chor, K. H. B., Olin, S. C. S., Weaver, J., Cleek, A. F., McKay, M. M.,
Hoagwood, K. E., & Horwitz, S. M. (2014). Adoption of clinical and business trainings by child
mental health clinics in New York State. Psychiatric Services, 65(12), 1439-1444.
Appendix B
Outcomes and Conditions by Case
Columns: Case, Clinical adoption, Business adoption, Culture, Relevance, Capacity, Small, Inefficiency, High youth, Region
1 1 1 1 1 0 0.64 0 0.65 1
2 1 1 1 1 0 0.98 1 0 0
3 1 1 0 1 0 0.26 0.23 0.01 0
4 1 1 1 1 0 0 0.2 0.23 0
5 0 0 0 0 1 0.98 0.02 0 1
6 0 1 0 1 1 0.89 0 0.39 1
7 0 1 0 0 1 0 0 0.23 1
8 1 1 0 1 0 0.83 0 0.08 1
9 0 0 0 0 1 0.95 0.01 0.65 1
10 1 1 1 1 0 0 0.85 0.65 1
11 0 0 0 0 1 0.03 0.03 0.21 1
12 1 1 0 1 0 0.63 0 0.07 1
13 0 0 0 0 1 0.97 0.07 0.01 1
14 1 1 0 1 0 0.92 0.94 0.96 1
15 1 1 0 1 0 0.95 1 1 1
16 1 1 1 1 0 0.96 0.37 1 1
17 0 0 0 1 0 0 0.67 1 1
18 0 1 0 1 0 0.27 0.93 0.01 0
19 0 0 0 0 1 0 0 0.04 1
20 1 1 0 1 0 0.58 0 0.35 1
21 1 1 1 1 0 0.22 0.23 0.05 0
22 1 1 1 1 0 0.09 0.92 0.18 0
23 1 1 1 0 1 0.73 0.61 0.35 0
24 1 1 1 0 0 0.84 1 0.02 0
25 1 1 0 0 1 0.01 1 0.71 0
26 1 1 1 0 0 0 1 0.96 1
27 0 0 0 0 1 0 0.96 0.02 1
28 0 0 0 1 0 0.51 1 0 1
29 1 1 1 1 0 0 1 0.07 1
30 0 0 0 0 1 0 1 1 1
31 0 0 0 0 1 0.85 1 0 0
32 1 1 1 1 0 0.78 1 0.04 1
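The case-level scores above feed the truth tables in Appendices C and D: fuzzy condition scores are dichotomized at a crossover point, and cases sharing the same crisp condition profile are grouped into one row, with rows whose cases disagree on the outcome flagged as contradictions. The following Python sketch illustrates those mechanics only; the case values and the 0.5 crossover are illustrative assumptions, not the dissertation's actual calibration.

```python
# Illustrative sketch: assembling crisp truth-table rows from case-level
# condition scores, as in a csQCA analysis. All data below are hypothetical.
from collections import defaultdict

# Each case: (culture, relevance, small, youth, clinical_adoption)
cases = [
    (1, 1, 0.98, 0.00, 1),
    (1, 1, 0.64, 0.65, 1),
    (0, 1, 0.89, 0.39, 1),
    (0, 1, 0.83, 0.08, 0),   # same profile as the row above, different outcome
    (0, 0, 0.98, 0.00, 0),
]

def dichotomize(score, crossover=0.5):
    """Convert a fuzzy membership score to crisp 0/1 at the crossover."""
    return 1 if score > crossover else 0

# Group cases by their crisp condition profile.
rows = defaultdict(list)
for culture, relevance, small, youth, outcome in cases:
    key = (culture, relevance, dichotomize(small), dichotomize(youth))
    rows[key].append(outcome)

# Print one truth-table row per profile; mixed outcomes are contradictions.
for key, outcomes in sorted(rows.items(), key=lambda kv: -len(kv[1])):
    label = "0 (c)" if len(set(outcomes)) > 1 else str(outcomes[0])
    print(key, label, len(outcomes))
```

Rows containing cases with both outcomes (here, the two cases with profile 0-1-1-0) would appear in the truth table as unresolved contradictions, marked "(c)".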
Appendix C
Truth Table for Clinical Innovation Adoption (n = 32)
Culture Relevance Small Youth Adoption* Cases
0 1 1 0 0 (c) 5
0 0 0 0 0 4
1 1 0 0 1 4
0 0 1 0 0 3
0 1 0 0 0 (c) 2
1 0 1 0 1 2
1 1 1 0 1 2
0 0 0 1 0 2
0 1 1 1 1 2
1 1 1 1 1 2
1 0 0 1 1 1
0 1 0 1 0 1
1 1 0 1 1 1
0 0 1 1 - 0
1 0 0 0 - 0
*(c) = unresolved contradiction
Appendix D
Truth Table for Business Innovation Adoption (n = 32)
Culture Relevance Small Inefficiency Adoption* Cases
0 1 1 0 1 4
0 0 0 0 0 (c) 3
0 0 1 0 0 3
0 0 0 1 0 (c) 3
1 1 0 1 1 3
0 1 1 1 1 3
1 1 0 0 1 2
1 1 1 0 1 2
0 1 0 0 0 (c) 1
1 0 0 1 1 1
0 0 1 1 0 1
1 0 0 0 - 0
1 0 1 0 - 0
*(c) = unresolved contradiction
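After truth tables like those above are built, QCA reduces the rows linked to the outcome via Boolean minimization: two rows with the same outcome that differ in exactly one condition merge into a simpler expression, with the differing condition marked as irrelevant ("-"). The sketch below shows only this core pairwise-reduction rule under assumed row values; a full QCA minimization iterates the step and then selects prime implicants.

```python
# Illustrative sketch of one pass of QCA-style Boolean minimization.
# Condition values below are hypothetical, not the dissertation's results.
def combinable(a, b):
    """Return the index of the single differing position, or None."""
    diffs = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
    return diffs[0] if len(diffs) == 1 else None

def reduce_once(rows):
    """Merge every pair of rows that differ in exactly one condition."""
    merged, used = [], set()
    for i, a in enumerate(rows):
        for j in range(i + 1, len(rows)):
            k = combinable(a, rows[j])
            if k is not None:
                merged.append(a[:k] + ('-',) + a[k + 1:])
                used.update((i, j))
    survivors = [r for i, r in enumerate(rows) if i not in used]
    return survivors + merged

# Hypothetical rows with adoption = 1
# (conditions: culture, relevance, small, inefficiency).
positive = [(1, 1, 0, 1), (1, 1, 0, 0), (1, 1, 1, 0)]
print(reduce_once(positive))  # → [(1, 1, 0, '-'), (1, 1, '-', 0)]
```

The two merged expressions read as "culture AND relevance AND not-small" and "culture AND relevance AND not-inefficiency", i.e. the fourth and third conditions, respectively, drop out as irrelevant for those pairs.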
Abstract
Evidence-based clinical and business innovations are underutilized in the U.S. children’s mental health system. The delay in moving innovations from research to practice has especially serious implications in public sector clinics, where clients have less access to services and greater risk for adverse outcomes if they do not receive high-quality care. To mitigate this delay, many states have encouraged the use of innovations through initiatives, legislation, and/or mandates. However, evaluations of these efforts show mixed results, indicating the need for further research on how and why clinics adopt and implement innovations.

This three-study dissertation investigates three aspects of innovation adoption and implementation in the context of a large-scale innovation rollout in New York State. Clinic leader perspectives were explored using qualitative interviews conducted in a 10% stratified sample of all clinics in the state children’s public mental health system. Training participation was used as a proxy for clinic adoption, and administrative data related to clinic characteristics and interviews with policy makers were used to incorporate factors in the organizational and external contexts.

Chapter One synthesizes the literature on adoption and implementation of innovations in children’s mental health services, summarizes recent system-level efforts to roll out innovations, and presents an overview of the three studies. Chapter Two (Study One) investigates the external sources that clinics rely on for information about innovations, and how differences in organizational absorptive capacity are associated with clinic level of adoption. Chapter Three (Study Two) uses Qualitative Comparative Analysis to examine combinations of clinic leader-reported drivers of adoption decisions and clinic characteristics associated with adoption and non-adoption of clinical and business innovations. Chapter Four (Study Three) qualitatively explores clinic leader and state executive attitudes, beliefs, and perspectives around defining implementation success and deciding to stop implementing a failed innovation. Finally, Chapter Five presents practical implications of study findings for mental health system administrators and clinic leaders, as well as theoretical and practical implications for implementation science.
Asset Metadata
Creator: Fenwick, Karissa Marie (author)
Core Title: Multilevel influences on organizational adoption of innovative practices in children's mental health services
School: Suzanne Dworak-Peck School of Social Work
Degree: Doctor of Philosophy
Degree Program: Social Work
Publication Date: 07/23/2019
Defense Date: 06/06/2019
Publisher: University of Southern California (original), University of Southern California. Libraries (digital)
Tags: dissemination, evidence-based practice, implementation, Mental Health, OAI-PMH Harvest
Format: application/pdf (imt)
Language: English
Contributor: Electronically uploaded by the author (provenance)
Advisors: Hurlburt, Michael (committee chair); Palinkas, Lawrence (committee chair); Fiss, Peer (committee member)
Creator Email: kfenwick@usc.edu, kmfenwick@gmail.com
Permanent Link (DOI): https://doi.org/10.25549/usctheses-c89-189510
Unique Identifier: UC11660654
Identifier: etd-FenwickKar-7579.pdf (filename), usctheses-c89-189510 (legacy record id)
Legacy Identifier: etd-FenwickKar-7579.pdf
Dmrecord: 189510
Document Type: Dissertation
Rights: Fenwick, Karissa Marie
Type: texts
Source: University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection)
Access Conditions: The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the a...
Repository Name: University of Southern California Digital Library
Repository Location: USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA